Does speed kill? Or more specifically, does the quest for ever-greater trading speeds kill resources that could be better dedicated—from the perspective of investors and listed companies—to other activities? Six years after the publication of Michael Lewis’s Flash Boys, the question is still generating more heat than light.
A new academic paper published by the UK Financial Conduct Authority has set off yet another firestorm about high-frequency trading. Is it a gift to investors, lowering their trading costs and increasing their investment returns, as HFT firms and their lobbies argue, or a “tax” on their transactions—as the authors assert? The conclusion legislators and regulators draw could have consequential public-policy implications, as opponents of HFT have called for transaction taxes and rules to discourage or limit it.
The FCA study is an interesting one methodologically, as it makes use of a unique LSE dataset comprising all messages sent to the exchange over a given period, not just orders. It thus captures aspects of speed races that would be missed if messages that fail to change the order book (because they arrive too late) were excluded.
One of the study’s key, controversial findings is that eliminating latency arbitrage, defined as “sniping” stale quotes, would reduce investors’ “cost of liquidity”—as proxied by a lower bid-ask spread or greater market depth—by about 17 percent. This effect owes to behavior undertaken by liquidity providers to avoid being picked off—behavior they would not undertake absent incessant speed races.
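For readers who want the mechanics spelled out, here is a minimal sketch of the race the paper describes, with purely hypothetical prices and latencies (none of these numbers come from the study): a public signal makes a resting quote stale, and the liquidity provider’s cancel races the sniper’s aggressive order to the exchange.

```python
# Stylized latency race: a liquidity provider's cancel races a sniper's
# order after a public signal makes the resting quote stale.
# All prices and latencies are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str        # "maker" or "sniper"
    action: str        # "cancel" or "take"
    sent_at_us: float  # microseconds after the public signal
    latency_us: float  # one-way latency to the exchange

    @property
    def arrival_us(self) -> float:
        return self.sent_at_us + self.latency_us

stale_offer = 100.00      # maker's resting offer, now stale
new_fair_value = 100.05   # fair value after the public signal at t = 0

messages = [
    Message("maker", "cancel", sent_at_us=1.0, latency_us=5.0),
    Message("sniper", "take", sent_at_us=1.0, latency_us=4.0),
]

# Under price-time priority the exchange processes whichever message lands first.
winner = min(messages, key=lambda m: m.arrival_us)
if winner.sender == "sniper":
    profit = new_fair_value - stale_offer
    print(f"Sniper buys at {stale_offer:.2f}; maker is picked off for {profit:.2f}/share.")
else:
    print("Maker's cancel wins the race; the sniper's order finds nothing to hit.")
```

Losing that race often enough is what produces the defensive quoting (wider spreads, thinner depth) that the paper counts as a cost of liquidity.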
I will not try to vet the study’s methodology—which is complex, as with most counterfactual-based econometrics exercises. But I will offer the following observations.
First, the ritual denunciations that I see coming in—along the lines of “HFT has massively reduced spreads, you idiots!”—ignore a very real problem considered by the paper: avoidance behavior is logical, real, and costly to investors. The costs of “orders not placed” in order to avoid latency-related losses must obviously be set against the benefits of HFT (which the paper tries to do), but we cannot ignore their existence.
Second, even if the paper overestimates the profitability of latency arbitrage, the resources dedicated to it are an important measure of its social cost, which must be weighed against its social benefits. What are the benefits? I’ll go out on a limb here—the benefits to society of bringing information to the market microseconds faster are zero. That is, they do not make the market more efficient in any meaningful way. Thus any resources devoted to shaving microseconds are wasted resources—from society’s perspective.
Third, some of the inducements for latency arbitrage are artificial—that is, not market-based. If they were removed, there would be less of the activity—and no one would miss it (except the handful of ritual winners). I am thinking in particular of latency arbitrage based on the Securities Information Processor (SIP) in the United States. Since the SEC and the market have accorded “official” status to prices produced and disseminated by a mechanism that is inherently slower than private data feeds, anyone relying on the SIP for midpoint matches and the like is inviting those with access to more timely prices to exploit them. It is perhaps therefore time for the SIP to go the way of LIBOR and pass into the realm of flawed ideas of antiquity.
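To see why the SIP’s “official” status invites this, consider a stylized example. The prices and the lag below are hypothetical, and the venue is a generic one assumed to peg midpoint crosses to the SIP: the direct feeds already show that the market has moved, while the SIP midpoint has not yet caught up.

```python
# Stylized SIP latency arbitrage: a venue prices midpoint crosses off the
# SIP NBBO, which lags the direct feeds. A fast trader who sees the move
# on the direct feed first trades at the stale midpoint.
# Prices and the lag are hypothetical.

sip_nbbo = (10.00, 10.02)     # (bid, ask) as the SIP still reports it
direct_nbbo = (10.04, 10.06)  # (bid, ask) already visible on direct feeds

stale_midpoint = sum(sip_nbbo) / 2     # 10.01: where the venue will cross
true_midpoint = sum(direct_nbbo) / 2   # 10.05: where fair value actually sits

# The fast trader buys at the stale SIP midpoint from a resting
# midpoint-pegged seller, then marks the position at the true midpoint.
edge_per_share = true_midpoint - stale_midpoint
print(f"Buy at {stale_midpoint:.2f}, fair value {true_midpoint:.2f}: "
      f"edge of {edge_per_share:.2f} per share at the resting order's expense.")
```

The resting midpoint order is not doing anything foolish; it is simply relying on the reference price the rules have blessed.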
Fourth, even among those of my industry colleagues who are HFT-agnostic I sense a certain weary “so what?” reaction to papers like this. That is, there is an implicit belief that speed races are simply an unavoidable aspect of trading. From one perspective, this is clearly true. If a market operates on the basis of price-time priority, there will inevitably be speed races. On the old trading floors, the “high frequency traders” were big guys with loud mouths—that is, guys who could trade before others because of their visibility and audibility. But price-time priority is a market-structure choice, and was never written into the fabric of the cosmos. Options floors were typically price-size priority, not price-time, and many electronic versions retain this feature. A decade ago, Nasdaq experimented with a price-size equity segment—it failed. But that does not mean we should not continue to look for alternatives—particularly given the rapid advances in computer power and AI.
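As a concrete illustration of why that choice matters, compare how an incoming order is allocated under the two regimes. The resting orders below are hypothetical, and the pro-rata rule shown is just one common form of price-size priority.

```python
# Price-time vs. price-size (pro-rata) allocation at a single price level.
# Hypothetical resting offers, all at the same price; an incoming buy for
# 300 shares arrives. Under price-time, the fastest quoter is filled first;
# under pro-rata, allocation follows displayed size, not arrival order.

resting = [  # (quoter, size, arrival_order) at the same price
    ("fast_small", 100, 1),
    ("slow_big",   900, 2),
]
incoming_qty = 300

# Price-time priority: fill strictly in arrival order.
time_fills, remaining = {}, incoming_qty
for quoter, size, _ in sorted(resting, key=lambda r: r[2]):
    fill = min(size, remaining)
    time_fills[quoter] = fill
    remaining -= fill

# Price-size (pro-rata): fill in proportion to displayed size.
total_size = sum(size for _, size, _ in resting)
prorata_fills = {quoter: round(incoming_qty * size / total_size)
                 for quoter, size, _ in resting}

print("price-time:", time_fills)     # {'fast_small': 100, 'slow_big': 200}
print("pro-rata:  ", prorata_fills)  # {'fast_small': 30,  'slow_big': 270}
```

Under the pro-rata rule, being first to the queue buys very little; being willing to display size buys a lot, which changes what firms invest in.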
Finally, and related to my point on market structure above, the spread-fetish of those who insist that all types of HFT are “for the best in the best of all possible markets” is indefensible in a market dominated by institutions. Who cares what the spread is for 100 shares if you need to buy 50,000? The fact that HFT reduces spreads does not mean that it minimizes trading costs. Buying 50,000 shares in 500 trades is bound to result in information leakage and market impact costs.
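A back-of-the-envelope illustration, with entirely made-up numbers and a deliberately naive linear impact assumption, shows why the quoted spread can be a rounding error next to the cost of working a large order in small pieces.

```python
# Why the 100-share spread understates the cost of a 50,000-share buy.
# Purely illustrative: assume a 2-cent quoted spread and a naive linear
# drift of 0.1 cent per 100-share child order as information leaks out.

order_qty = 50_000
child_qty = 100
half_spread = 0.01        # cost per share of crossing a 2-cent spread once
impact_per_child = 0.001  # assumed price drift per child order, per share

n_children = order_qty // child_qty    # 500 child orders
spread_cost = order_qty * half_spread  # cost if the spread were the whole story

# Each successive child trades at a slightly worse price: 0, 1, 2, ... increments of drift.
impact_cost = sum(i * impact_per_child * child_qty for i in range(n_children))

print(f"Half-spread cost: ${spread_cost:,.0f}")
print(f"Impact/leakage cost under this assumption: ${impact_cost:,.0f}")
```

The point is not the particular numbers, which are invented, but the shape of the comparison: spread costs scale with the order, while impact and leakage costs compound as the order is sliced.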
I come, then, to the main point of my piece, which is that continuous price-time trading has conspicuous flaws for which there exist market-based solutions—solutions that work. No, I’m not talking about “speed bumps.” Speed bumps just create new opportunities for arbitrage. Auction-based market structures, in contrast, don’t seek to slow anyone down, but rather to take speed out of the strategic equation entirely. Instead of processing trades sequentially and bilaterally, which is an archaic practice inherited from the pre-computer days, auctions can process them simultaneously and multilaterally. And, contrary to the exchanges’ thinking, they don’t have to be run at the open or close—or at any fixed time. They can be run on demand. Traders can initiate them by revealing to the market nothing more than the fact that they wish to transact in a given security—without indicating side or size. The result is transactions dozens of times larger than typical sequential-bilateral trades on exchange and ATS platforms—with no discernible market impact.
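For the avoidance of doubt about what “simultaneously and multilaterally” means in practice, here is a minimal sketch of a generic uniform-price call auction, with hypothetical orders; it is not a description of any particular venue’s matching logic, including our own. Orders are collected during a brief call, a single clearing price is chosen to maximize matched volume, and every eligible order trades at that price at once.

```python
# Minimal uniform-price call auction: collect buy and sell limit orders,
# pick the single price that maximizes matched volume, and cross all
# eligible orders at that price simultaneously and multilaterally.
# A generic sketch with hypothetical orders.

def clearing_price(buys, sells):
    """buys/sells: lists of (limit_price, qty). Returns (price, matched_volume)."""
    prices = sorted({p for p, _ in buys + sells})
    best = (None, 0)
    for p in prices:
        demand = sum(q for limit, q in buys if limit >= p)   # buyers willing at p
        supply = sum(q for limit, q in sells if limit <= p)  # sellers willing at p
        volume = min(demand, supply)
        if volume > best[1]:
            best = (p, volume)
    return best

# Hypothetical orders gathered during an on-demand call.
buys = [(10.05, 20_000), (10.03, 30_000), (10.01, 10_000)]
sells = [(10.00, 25_000), (10.02, 20_000), (10.06, 15_000)]

price, volume = clearing_price(buys, sells)
print(f"Cross {volume:,} shares at {price:.2f} in a single multilateral print.")
```

In a structure like this there is no queue to win and no stale quote to snipe; arriving a microsecond earlier changes nothing about the outcome.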
Now, some of you are obviously asking, “well, hey, if this is such a great way to trade, why don’t all institutions do it all the time?” A great question. And I have three answers. First, institutions are still using flawed transaction-cost analysis (TCA) measures like VWAP—measures that were created for an old-fashioned price-time market with sequential, bilateral trading. These measures fail to capture the chief benefit of trading simultaneously and multilaterally, namely the elimination of information leakage and market impact. Second, brokers are only just beginning to adapt their routing algorithms to accommodate this new market structure. And third, despite the obvious benefits of the structure, regulators have not incorporated it into their best-execution regimes.
The SEC’s Order Protection Rule (OPR), for example, does nothing to encourage auction markets, since auction bids and offers are not limit orders (and therefore not protected). More importantly, treating “best execution” as trading at or within the best buy and sell limit orders represents a fundamental misunderstanding of what good trading is—from the client’s perspective. A client who wishes to buy 50,000 shares of a thinly traded security should know that the inside spread is a poor approximation of the range of prices at which he or she can complete the purchase, as each hundred-share constituent trade serves to push that spread upward. A multilateral block trade of 50,000 shares is almost sure to be better priced, even if it prints outside the spread that preceded it.
This brings me, finally, to the important issue of competition. The OPR exists not just to mitigate spatial fragmentation, but to encourage competition with incumbents by guaranteeing that a challenger venue’s best bids and offers cannot be ignored. But this has only promoted competition among identical market structures, not competition among differentiated ones. As we should expect, then, the benefits are limited, at best. (At worst, they are net costs, owing to the spending required to access the spaghetti bowl of protected quotes across venues.) New structures are left outside the panoply of rules designed, ostensibly, to support capital-raising and investing.
So what should be done? The Commission could better adhere to its mission by bringing auctions explicitly within its best-execution framework. SEC rules currently require broker-dealers to provide quarterly reports on the routing of customer orders and require market centers to supply monthly reports on execution quality (Rules 606 and 605 of Regulation NMS, respectively). If such reporting required explicit consideration of auction mechanisms, including measures of their performance, investors and listed companies would be manifestly better served.
Coming back to the question of speed in trading, then, I say it’s time we broadened our horizons and stopped debating it as if it were a fight over the costs and benefits of gun rights. In the context of continuous price-time markets, traders have always pursued speed and will always do so, irrespective of its declining (and perhaps negative) marginal contribution to market performance. The best way to stop over-investment in speed is to encourage and pursue experiments with market structures that don’t reward it.
Don Ross is the CEO of PDQ Enterprises, LLC, the parent company of CODA Markets, Inc., a FINRA-regulated broker-dealer that operates an electronic alternative trading system under SEC Regulation ATS.