The Case Against Financial Algorithms
In a recent survey by JPMorgan Chase & Co. asking “institutional and professional traders” what they believed would have the biggest impact on the financial markets in the coming years, AI won by a large margin. But is this a good thing? I don’t think so.
There is an inherent problem with financial algorithms that can’t be remedied even as they improve. These algorithms are used en masse by multiple investment firms, and I think this will inevitably be taken to an extreme: once the same algorithm drives everyone’s decisions, those decisions stop merely predicting the market and start shaping it, rendering them irrelevant as predictions.
It’s not an issue with algorithms; it’s an issue with humans. If a source (the algorithm) says to buy TSLA because it’s going to go up, why wouldn’t we execute that trade? And why wouldn’t the ninety other firms that hold 35% of the world’s stock market do the same?
So everybody buys, because they also trust the algorithm. Then TSLA goes up; the algorithm was right! Now we’ve justified the use of the algorithm and place even more trust in it. If it fails once in a while, so what? What’s a 1% loss against a 700% gain? If its hit rate is still good, why stop using it?

And why diversify? There’s no reason for a diversity of programs when the superiority of one is quantifiable, and all of the algorithms are drawing on the same data anyway. Archimedes, for example, is one of the best trading algorithms available. It weighs news, quarterly reports, expected income, trends, and trading patterns, among other variables. It even checks for laws that may affect individual stocks. The algorithm is effective, but if everyone uses it, we’d all just be frontrunning each other.
Well, if ChatGPT is anything to go by, AI will only get cleverer. Won’t we eventually be able to design algorithms that account for being used en masse? No. It’s a paradox: the program identifies an action as the most effective, then realizes it won’t be, because every other algorithm will recommend the same action and the market response will erase the advantage. So it abandons the action. But every other algorithm reasons identically and abandons it too, which makes the action effective again, and so on, without end.
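To see why that loop never settles, here’s a deliberately minimal sketch. Everything in it is invented for illustration: the “capacity” threshold, the number of algorithms, all of it. The only point is what happens when identical programs try to account for each other.

```python
# Toy model of the paradox: identical algorithms that try to account
# for their own mass adoption. All numbers are illustrative.

def decide(expected_buyers: int, capacity: int = 50) -> bool:
    """Buy only if the trade would still be profitable after the crowd piles in."""
    return expected_buyers < capacity

def simulate(n_algos: int = 100, rounds: int = 6):
    expected = 0  # each algorithm's forecast of how many others will buy
    history = []
    for _ in range(rounds):
        choice = decide(expected)
        # Every algorithm is identical, so all of them make the same choice,
        # and next round each one expects the whole crowd to do what it just did.
        expected = n_algos if choice else 0
        history.append(choice)
    return history

print(simulate())  # alternates forever: [True, False, True, False, True, False]
```

The decision flips every round: the action looks good precisely when no one is taking it, and bad precisely when everyone is. Making the algorithms smarter doesn’t break the cycle; it is the cycle.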
I’m not just arguing that financial algorithms are imperfect in practice; I’m arguing that human imperfection is superior. The human bug is actually a feature: we can’t process data as fast as an algorithm can. We think, “Am I going to lose my job if this goes bad? Let me think this through before I spend ten percent of our entire venture fund, because my job depends on it, and so does my marriage (probably).” Humans make different decisions from one another, creating complexity in the market, with many winners and losers of varying degrees. Algorithms, by contrast, don’t get married. They don’t second-guess themselves. They create monoliths. There are only winners and losers, and only a select few can ever be winners.
The long-term implication of mass adoption of algorithms is the creation of an oligopoly of companies, created by and invested in by a monopoly of firms, in a market that strangles innovation. The firms win: iPhones are $4,000 because Apple is now a seven-trillion-dollar company, so sayeth the algorithm. Why sell Apple stock when it’s performing so well? Why invest in new ideas? Why threaten shareholder profits?
Then the market collapses. Why? Because the monopoly of firms is investing in an oligopoly of companies, effectively creating “God” chips. This drives the value of the oligopoly companies through the roof, even though those companies don’t earn enough to support it.
Meanwhile, everyone is putting their money into this monopoly of investment firms because of the 7000% returns they’re getting. Then, one day, an oligopoly company fails, or the algorithm says it’s going to fail and is no longer a good investment. All the firms rush to be the first to pull out before it’s too late, decimating the company’s value, and some firms lose, badly. Thus are sown the first seeds of mistrust and uncertainty. What happens when that 7000% return in year one becomes a 25% return in year two? Fear. Everyone starts pulling out of the investment firms. One firm collapses, then several. The market collapses, and now we’re all losers. Thanks, AI.
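The whole boom-and-bust above can be caricatured in a few lines of code. Every number here is made up, and the model is as crude as possible on purpose: one shared signal, identical firms, no fundamentals at all.

```python
# Toy herding model of the boom and bust: every firm follows the same
# signal, so buying compounds round after round until the signal flips,
# at which point everyone rushes for the exit at once. All numbers are
# illustrative, not calibrated to anything.

def run_market(firms: int = 10, boom_rounds: int = 8) -> list[float]:
    price = 100.0
    prices = [price]
    # Boom: the shared algorithm says "buy", so all firms buy every round;
    # each identical buyer adds the same 5% of price pressure.
    for _ in range(boom_rounds):
        price *= 1.0 + 0.05 * firms
        prices.append(price)
    # Bust: the shared algorithm flips to "sell"; the same crowd that
    # inflated the price now dumps it in a single round.
    price *= 1.0 - 0.08 * firms
    prices.append(price)
    return prices

prices = run_market()
print(f"start: {prices[0]:.0f}, peak: {prices[-2]:.0f}, after the exit: {prices[-1]:.0f}")
```

Nothing about the company changed between the peak and the crash; the only thing that moved was the crowd, all reading the same signal.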
Don’t get me wrong: I completely agree that financial algorithms are the future. Specifically, this future. Is there something we can do about it? Probably not. We are imperfect, and we have one major bug in our design that is most definitely not a feature: chasing money off a cliff.