Data.
It has, according to some pundits, replaced research as the main driver of trading these days. And as traders and others increasingly depend on data, and consequently on the flow of data, exchanges have entered the data delivery business and engaged in a quiet battle for the top spot.
As exchanges have grown revenue over the last several years, market data has become a larger piece of their revenue pie. Over the last decade, global spending on market data has risen steadily, reaching a record $32 billion in 2019. As data becomes more essential to the business, are exchanges prepared to scale up market data services, and how can they distribute data to external clients more efficiently? Crux Informatics CEO & Founder Philip Brittan spoke with Traders Magazine Editor John D’Antona Jr. about how exchanges are managing the current flow of data.
TRADERS MAGAZINE: Exchanges had delivered market data files through one specific delivery pipeline until recently – what has changed exactly?
Philip Brittan: The simple answer is that data is essential and customers want more value from it than ever before. They want more data, they want it faster, they want to use more advanced tools to wring signals from it. They also want to be more cost-efficient in how they go about all this.
As a result, consumer requirements and expectations around data consumption have changed drastically. The explosion in types and sources of data has overwhelmed consumers’ ability to ingest, prepare, and analyze it all, resulting in a proliferation of data warehousing, processing, and analytics platforms where customers want their data delivered.
Also, many applications have been developed to let data users access (query) data faster in different environments. Multiply this across the industry, with every financial institution setting up its own architectures and processes, and it makes for a complicated supply chain for exchanges and market data providers to navigate. To clarify, most exchanges currently deliver their data files through a single delivery method; they are bottlenecked in this new environment and are actively working with partners like Crux to scale their delivery.
TM: Big exchanges are now burdened with a multiplying number of data files due to the number of endpoints and delivery methods, as well as the number of custom client needs – please explain this further. Is this because of an increase in the number of traders in the market? More trading venues requiring this data?
Brittan: This burden is not particularly associated with the number of data files, but rather with the infrastructure required to deliver to the various endpoints. Building, operating, and maintaining a data pipeline to each endpoint requires a tremendous amount of work and resources, and that type of work is not particularly differentiating for the exchanges, which makes it a hard investment to justify. To add to the complexity, it is not clear which endpoints to prioritize.
TM: Please comment on the recent SEC decision about NMS II – how NYSE, Nasdaq and Cboe Global Markets Inc. will be forced to revamp the management of public data feeds. How does this affect end users? Does it affect your business?
Brittan: The sentiment behind the decision echoes what we have been sharing with the market for years. By centralizing these public data feeds, the industry will gain transparency for regulatory purposes. It also highlights the need for a central utility to manage the flow of data between suppliers and consumers. The current architecture is unsustainable given the growth of data and the variety of use-case and endpoint requirements.