The August interruption in Nasdaq service is merely the latest example of the increasing instability of the U.S. equity market.
Other recent examples include the botched Facebook initial public offering on Nasdaq and the software error that cost market maker Knight Capital hundreds of millions of dollars.
So far, investors have avoided major losses. But the potential for disaster looms if regulators don't address the source of the instability — a market that has become too complex.
Currently, U.S. stocks trade in more than 60 separate venues. These include exchanges, electronic networks, systematic internalizers and alternative trading systems. The proliferation of trading execution outlets has created several sources of instability.
One source is the fragmented nature of the marketplace. In fact, the Aug. 22 Nasdaq disruption hinged on the inability of rival computer networks to communicate with each other. The large number of trading venues makes it difficult for buyers and sellers to find each other. To compensate, traders use complex routing algorithms to forecast, or guess, where opposing orders might be posted. But in many cases, orders are unnecessarily intermediated by high-frequency trading algorithms designed to profit from minute deviations in stock prices. Fragmentation also causes inefficiencies that raise execution costs and leave larger orders vulnerable to abuse from predatory algorithms.
A second source of instability is the length of time required to route orders between execution venues, a concept known as latency. Increased latency creates opportunities for high-frequency traders and risks for the market. In a high-latency environment, orders bounce among many venues, seeking an execution opportunity. All the while, they leak information about their size and trading pattern. Like U-boat captains stalking cargo ships, high-frequency traders use predatory algorithms to detect large orders moving through the market. Once such an algorithm spots one, it front-runs the order, moves the price and offers the shares back to the original order at a higher price.
Market complexity also introduces a higher cost structure. The cost of the systems needed to manage a routing network to each venue and maintain competitive levels of speed on those networks has become burdensome for broker-dealers. Some now spend more on technology than they do on people. These increased costs will eventually be passed on to investors in the form of higher commissions.
In addition to raising the cost for investors, the current technological arms race also poses risks to the market. Each time a new venue opens or an existing one is modified, each broker-dealer must update its software. The more frequent the changes, the greater the risk of potential error. Knight Capital, one of the country's largest market makers, experienced this risk last year when a coding mistake caused it to lose $389 million in less than 45 minutes. Knight was blamed for releasing bad code into the market, but it was a technologically savvy firm. If this kind of error could happen there, it can happen anywhere.
Complex markets and heightened risks of system failure raise daunting liability issues, too.
Who will be held responsible for losses? When Nasdaq's technology failed during the Facebook IPO, its clients sustained significant losses. The exchange is attempting to limit its liability by filing a motion to dismiss some claims against it. Fortunately for investors, the broker-dealers are standing between Nasdaq and the losses. But what if a broker-dealer owned the venue where the error occurred? And what if that broker-dealer had to choose between forcing the loss onto its clients or bankruptcy?
Regulators are taking steps to deal with the symptoms of an unstable market. The Securities and Exchange Commission has launched programs to monitor trading activity, control liquidity gaps with circuit breakers and implement best practices for software deployment. But these solutions don't necessarily address the underlying problems. Instead, they might add more complexity to markets that already have too much of it.
In our opinion, simplifying the markets is a better approach to reducing risks. Two changes could produce dramatic improvements.
First, requiring all execution venues to be hosted in the same geographic location would all but eliminate routing latency: with servers housed close to one another and connected by standardized technology, orders bouncing from one venue to another in search of liquidity would travel at the same speed as every other order.
A second change would be to increase capital requirements for trading venues. Raising minimum regulated capital to a more significant level, say, $500 million, would help insure investors against losses from a system failure. This new minimum would increase the cost structure of the 60-plus trading venues, and some of them would be forced out of the market. However, fewer venues would mean less pressure on broker-dealers' expense lines to maintain their routing networks, which in turn would relieve pressure on commissions.
Any moves to alter the market's structure must be considered carefully. Changes made by the SEC in the past 20 years have lowered transaction costs and increased efficiency. But the system failures over the past few years cannot be ignored. As market leaders seek solutions, they should consider the advice of Leonardo da Vinci, one of history's greatest inventors, who said: “Simplicity is the ultimate sophistication.”
This article originally appeared in the October 14, 2013, print issue as "Easing trading risks from market fragmentation and tech arms race."