Future Proofing Network Latency in Stock Trading
Traders Magazine Online News, November 7, 2017
In an InformationWeek article published a decade ago, a large global investment bank stated that every millisecond lost results in $100m per annum in lost opportunity.[1] That article is still quoted today and is more relevant than ever.
Ultra-low latency is central to securing a trade at the desired price before it fluctuates. This applies to high-frequency traders (HFT), market makers, and statistical arbitrage traders alike. Traders must be able to access a range of applications in real time. Applications that handle tasks such as trade placement, analysis and modeling, and settlement are the lifeblood of most firms, and even the slightest added latency can affect the price of a security.
A Wall Street and Technology magazine webcast polled attendees with the following question: "What is the greatest limitation or challenge in your current infrastructure and organization around processing and analyzing real-time market data?" 43.1% of respondents stated that latency was their biggest concern.[2]
In networking, latency is the time it takes to move data from one network device to another; on the Internet it is typically measured as round-trip time. Several factors contribute to latency, including packet processing speed, optical digital signal processing (DSP) time, routers rewriting packet headers, traffic delays, distance between sites, switching bottlenecks, and overall network performance and speed.
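As a rough illustration of round-trip time, application code can estimate RTT by timing a TCP handshake against a host. This is a minimal sketch for intuition only, not how trading firms instrument their networks; the host and port arguments are placeholders, and taking the minimum of several samples is a common way to filter out transient queuing delay:

```python
import socket
import time

def measure_rtt_ms(host: str, port: int, samples: int = 5) -> float:
    """Estimate round-trip latency by timing TCP connection setup.

    Each connect involves one SYN/SYN-ACK round trip, so the elapsed
    time approximates network RTT. Returns the minimum observed value
    in milliseconds across `samples` attempts.
    """
    best = float("inf")
    for _ in range(samples):
        start = time.perf_counter()
        # Establishing the connection completes one round trip.
        with socket.create_connection((host, port), timeout=2.0):
            pass
        best = min(best, (time.perf_counter() - start) * 1000.0)
    return best
```

A real deployment would measure latency with hardware timestamping at the NIC rather than application-level timers, since the latter include OS scheduling noise.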
The need to consistently reduce communications network latency for capital markets is growing, and the ubiquity of algorithmic trading requires firms to act on market events faster than their competitors to drive profitability.
Speed is vital to traders because of the fundamental volatility of many financial instruments. Being faster than other traders also matters greatly because it creates opportunities to profit through a quicker response to news and other market-moving events. This reality is fueling a race to deploy cutting-edge technology that gains an edge by reducing latency, and it has significantly advanced the "millisecond environment," where algorithms respond to each other 100 times faster than the blink of an eye.
The need for latency reduction is accelerating to the point where industry competition is defined by how quickly transactions can happen rather than by the transactions themselves.
In finance, business results are measured by the bottom line, and a key contributor to the bottom line is the number of transactions processed per minute (TPM) or per second (TPS). Every financial transaction, whether a bank card authorization or a stock trade, must process in real time, with virtually zero latency and error free. Global investment banks, stock exchanges, traders and hedge funds stand to gain the most from real-time market data and the processing of that information.
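To make the TPS metric concrete, here is a minimal sliding-window counter sketch (the class name and API are illustrative, not from any trading system): each completed transaction is recorded with a timestamp, and the rate is the number of events that fall within the last window.

```python
import time
from collections import deque

class ThroughputMeter:
    """Sliding-window counter for transactions per second (TPS).

    Call record() once per completed transaction; rate() reports how
    many transactions landed inside the most recent `window` seconds,
    normalized to a per-second figure.
    """
    def __init__(self, window=1.0):
        self.window = window
        self.events = deque()  # timestamps of recorded transactions

    def record(self, now=None):
        self.events.append(time.monotonic() if now is None else now)

    def rate(self, now=None):
        now = time.monotonic() if now is None else now
        # Evict timestamps that have aged out of the window.
        while self.events and self.events[0] < now - self.window:
            self.events.popleft()
        return len(self.events) / self.window
```

Passing explicit `now` values makes the meter deterministic to test; production code would rely on the monotonic clock default.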
The hurdles to real-time (virtually zero latency) data include:
• Traditional store-then-process systems
• Processing a combination of live and historical data in tandem
• The inability to accurately predict and rely on bandwidth in the wide area
• Managing imperfections within a stream
• Load distribution / load balancing and hardware performance
• Decimalization in real time
• Non-deterministic networks (such as Ethernet)
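To make one of these hurdles concrete, "managing imperfections within a stream" often means tolerating market data that arrives out of order. The sketch below, with an illustrative name and API rather than any real feed handler's, holds early-arriving messages in a buffer and releases them only once every earlier sequence number has been seen:

```python
import heapq

class ReorderBuffer:
    """Delivers stream messages in sequence order despite out-of-order arrival.

    Messages carry a monotonically increasing sequence number; any that
    arrive ahead of a gap are parked in a min-heap until the gap fills.
    """
    def __init__(self, first_seq=0):
        self.next_seq = first_seq
        self.pending = []  # min-heap of (seq, payload)

    def push(self, seq, payload):
        """Accept one message; return all messages now deliverable in order."""
        heapq.heappush(self.pending, (seq, payload))
        released = []
        # Drain the heap while its smallest entry is the next expected one.
        while self.pending and self.pending[0][0] == self.next_seq:
            released.append(heapq.heappop(self.pending)[1])
            self.next_seq += 1
        return released
```

A production feed handler would also bound the buffer and request retransmission when a gap persists too long, since waiting indefinitely trades one latency problem for another.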