Bridging the Capital Markets Digital Divide

In the first quarter of this year, quantitative hedge funds received $4.6 billion in net new investments. By comparison, the overall hedge fund industry had withdrawals of $5.5 billion during the quarter. Those figures were reported in a Wall Street Journal article ominously titled "The Quants Run Wall Street Now."

The rise of quants is not just a hedge fund issue. For several years now, there have been debates and discussions about the digital divide emerging in capital markets: on one side, an elite group of technology-focused firms that have invested in the data and computing resources needed to spot alpha in highly automated markets; on the other, a larger group of market participants still wrestling with how far they should shift from traditional trading models.

What's more daunting is that despite the impact big data and algorithmic trading have already had on markets, the biggest changes may be yet to come. Industry veteran Michael Spencer, who founded electronic markets and post-trade specialist ICAP, which became NEX Group after selling its hybrid voice broking business to Tullett Prebon last year, predicts a tectonic change in the finance industry over the next few years.

"Technology will bring us a new breed of trader, and those trading firms that embrace this change will end up looking more like Silicon Valley than Wall Street," Spencer wrote in a recent column in Financial News.

For market players who have not yet crossed the digital divide, the investment required to compete with statistical arbitrage firms that successfully leverage highly automated quantitative strategies may seem overwhelming. But costs are challenging even for players who have already built high-tech trading infrastructure. Guggenheim Partners LLC, which built a supercomputing cluster at Lawrence Berkeley National Laboratory in California, spends $1 million a year just on the electricity to run it, The Wall Street Journal reported. Anecdotally, some firms have told us that the cost of training algorithms to leverage data for a trading advantage can be ten times the already pricey data acquisition and management expenses.

A new infrastructure paradigm

Joel Steinmetz, a Fluent Trade LLC managing director and a veteran of Citadel and Citigroup, pointed out in an article in Tabb Forum a few years ago that trading in milliseconds means that there are 23,400,000 possible trade times in each trading day. Microsecond trading means there are 23,400,000,000 potential execution points in a day.
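The arithmetic behind those figures can be sketched quickly (assuming a standard 6.5-hour U.S. equities session, 9:30 a.m. to 4:00 p.m.):

```python
# Number of distinct execution points in one 6.5-hour trading day
# at different clock resolutions.

TRADING_SECONDS = int(6.5 * 60 * 60)  # 23,400 seconds per session

def execution_points(ticks_per_second: int) -> int:
    """Possible trade times per day at the given clock resolution."""
    return TRADING_SECONDS * ticks_per_second

milliseconds = execution_points(1_000)      # 23,400,000
microseconds = execution_points(1_000_000)  # 23,400,000,000
print(f"{milliseconds:,} millisecond-resolution points")
print(f"{microseconds:,} microsecond-resolution points")
```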

Competing in markets moving at that pace may require a new paradigm for infrastructure investment. We are now at a point where new developments in enterprise technology are beginning to enable that paradigm. Breakthroughs in cloud technology allow firms to quickly access enormous amounts of computing power in a more cost-effective, pay-per-use way. By assembling a hybrid cloud/on-premises architecture, firms can access commoditized data and technology in an agile and efficient way, while maintaining their most proprietary tools in house. Third-party processing power can support real-time applications. Cloud-based data and analytics can add richness and depth to instantaneous algorithms. Critically for the finance industry, software-as-a-service, coupled with emerging data-as-a-service models, can give firms the flexibility to scale up and down based on intraday needs.
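One way to picture the hybrid split described above is a simple routing rule: proprietary workloads stay on premises, commoditized ones burst to pay-per-use cloud capacity. The sketch below is illustrative only; the workload names and figures are hypothetical.

```python
# Illustrative sketch of a hybrid cloud/on-premises routing rule:
# proprietary tools stay in house, commoditized workloads run on
# pay-per-use cloud capacity. All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    proprietary: bool  # e.g. alpha models vs. commodity backtests
    core_hours: float  # compute demand for this job

def route(workload: Workload) -> str:
    """Keep proprietary tools in house; burst the rest to the cloud."""
    return "on-premises" if workload.proprietary else "cloud (pay-per-use)"

jobs = [
    Workload("signal-model-training", proprietary=True, core_hours=5_000),
    Workload("historical-tick-replay", proprietary=False, core_hours=20_000),
]
for job in jobs:
    print(f"{job.name}: {route(job)}")
```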

Building momentum

No longer just a theoretical idea, the hybrid cloud model is already attracting a groundswell of support. Cloud infrastructure providers Amazon and Google are achieving critical mass in their levels of adoption. Markets have always worked on the concept of strength in numbers, and, the growing use of large cloud infrastructures by the finance industry has made the cloud a more valuable data storage and computing power resource.

Technologies that improve the capturing, sharing and management of data across networks are also beginning to proliferate. High-precision time stamping, Layer 1 switching, packet capture and other technologies provide cost-effective ways to capture much richer data sets instantly, and consequently improve the viability of the cloud for real-time support. Layer 1 switching works like a mirror, replicating all the data it sees in a handful of nanoseconds. Packet capture efficiently collects all data crossing an entry point to a network. Time stamping, critically for high-speed markets, can synchronize data collection across disparate systems and networks. These technologies serve as crucial plumbing for high-speed markets. Coupled with emerging technologies such as artificial intelligence, they let firms leverage the cloud to access vast pools of data for testing and fine-tuning predictive market analytics.
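As a rough illustration of why synchronized time stamps matter, the toy sketch below merges capture records from two venues into one chronological event stream on a common clock. The venue names, offsets and events are invented for the example; real deployments rely on hardware time stamping (e.g. PTP-disciplined clocks) rather than this simplification.

```python
# Toy illustration of synchronized time stamping: merge packet-capture
# records from two venues into one chronological stream using a common
# reference clock. Venues, offsets and events are hypothetical.

import heapq

# (local_timestamp_ns, event) records as each capture device saw them
venue_a = [(1_000, "A: quote"), (3_500, "A: trade")]
venue_b = [(2_200, "B: quote"), (2_900, "B: cancel")]

# Per-device clock offsets (ns) relative to the common reference clock
offsets = {"A": 120, "B": -80}

def normalize(records, venue):
    """Convert local capture times to the shared reference clock."""
    return [(ts + offsets[venue], ev) for ts, ev in records]

# heapq.merge interleaves the (already sorted) normalized feeds
merged = list(heapq.merge(normalize(venue_a, "A"), normalize(venue_b, "B")))
for ts, ev in merged:
    print(ts, ev)
```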

Global research firm MarketsandMarkets predicts that the finance cloud market will grow from $9.89 billion in 2016 to $29.47 billion in 2021, a compound annual growth rate of 24.4%. Those figures are more than a prediction about growing momentum – they are also a wake-up call. In an era of smaller spreads and harder-to-spot arbitrage opportunities, the challenge an individual firm would face in trying to grow its own infrastructure at that pace would likely be insurmountable. Emerging cloud-based technologies have the potential to level the playing field, so that financial markets leaders will not have to have the largest technology infrastructures, just the smartest.
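As a quick sanity check on those numbers: compounding the $9.89 billion base at 24.4% annually for five years reaches roughly $29.47 billion (assuming simple annual compounding):

```python
# Verify the projection: $9.89B growing at a 24.4% compound annual
# growth rate for five years lands close to the reported $29.47B.

base = 9.89   # $B, 2016 market size
cagr = 0.244  # 24.4% compound annual growth rate
years = 5

projection = base * (1 + cagr) ** years
print(f"${projection:.2f}B")  # prints a value close to $29.47B
```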

Patrick Flannery is the CEO of MayStreet