Low latency is the new normal. What was once the sole domain of the tech-savvy is now standard, and the gap between the low-latency haves and have-nots continues to narrow as technology makes its inevitable progress.
There is talk of powerful computers being used by high-speed traders, but what does that mean? Google has a powerful computer if you consider the sum of its parts, which at an individual level are relatively low-cost machines.
A teenager's gaming PC will have a liquid-cooled, overclocked CPU, loads of memory and a high-performance graphics processor that would challenge most trading servers, all for a few thousand dollars. To state the obvious, computers are fast. There is little exclusive about high-performance computing these days.
The low-latency barrier to entry is not the price of hardware. Rather, it is the will and expertise required to put it all together. And while co-location and networks are costly overheads, they are probably no different from the costs of proprietary market data and news feeds that businesses have accepted for decades.
Expertise can be acquired and the knowledge is increasingly available. The hard part is having the will: the will to measure and to optimize processes for repeatability and determinism. Often these steps are skipped because they are considered too hard. Sometimes we measure parts of the system (the easy bits), but unless we examine the system as a whole, we never really know what is going on.
Taking a rigorous approach to trading begins with accurate recording and timestamping of market data and order-line events in the trader's network. This establishes a baseline for modeling strategies and understanding the microstructure of the market, and it builds a valuable ongoing repository of market knowledge.
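As one illustration of what such timestamping can look like in software, on Linux the kernel can attach a nanosecond receive timestamp to every incoming datagram. The sketch below is a minimal, generic example, not a production capture system: the UDP port is a hypothetical placeholder, and SO_TIMESTAMPNS simply delivers the kernel's receive time as ancillary data alongside each packet, so every recorded event shares one clock source.

```cpp
// Minimal sketch: kernel receive timestamps for a UDP feed (Linux).
// Port 9000 is a hypothetical placeholder for a market data feed.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <cstdio>
#include <cstring>
#include <ctime>

int main() {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    int on = 1;
    // Ask the kernel to record a struct timespec per received datagram.
    setsockopt(fd, SOL_SOCKET, SO_TIMESTAMPNS, &on, sizeof(on));

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9000);             // hypothetical feed port
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    char payload[2048];
    char ctrl[CMSG_SPACE(sizeof(timespec))];
    iovec iov{payload, sizeof(payload)};
    msghdr msg{};
    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = ctrl;

    for (;;) {
        msg.msg_controllen = sizeof(ctrl);   // reset before each receive
        ssize_t n = recvmsg(fd, &msg, 0);
        if (n < 0) break;
        // Walk the ancillary data for the kernel's receive timestamp.
        for (cmsghdr* c = CMSG_FIRSTHDR(&msg); c; c = CMSG_NXTHDR(&msg, c)) {
            if (c->cmsg_level == SOL_SOCKET && c->cmsg_type == SCM_TIMESTAMPNS) {
                timespec ts;
                std::memcpy(&ts, CMSG_DATA(c), sizeof(ts));
                std::printf("%ld.%09ld  %zd bytes\n",
                            static_cast<long>(ts.tv_sec), ts.tv_nsec, n);
            }
        }
    }
}
```

Serious installations push this further with hardware timestamping at the NIC or switch, but even kernel timestamps beat application-level clock reads taken after the data has queued through the stack.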
But in order to make accurate observations you need a well-tuned system with as little unnecessary latency and variability as possible. The way to achieve this is through deterministic hardware and software. Many systems lack determinism and therefore lack repeatability, making it difficult to achieve correctness and performance. Often the problem with latency is not just being slow but being unpredictable.
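Measuring that unpredictability is straightforward in principle: time the critical path many times and study the tail, not the average. The sketch below uses a placeholder workload purely for illustration; the quantity that matters is the gap between the median and the 99.9th percentile, which is exactly what determinism narrows.

```cpp
// Minimal sketch: jitter measurement via latency percentiles.
// The inner loop is a stand-in for a real code path under test.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    using clk = std::chrono::steady_clock;
    std::vector<long> ns;
    ns.reserve(100000);

    volatile long sink = 0;                         // defeat optimization
    for (int i = 0; i < 100000; ++i) {
        auto t0 = clk::now();
        for (int j = 0; j < 1000; ++j) sink += j;   // placeholder workload
        auto t1 = clk::now();
        ns.push_back(
            std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0)
                .count());
    }

    std::sort(ns.begin(), ns.end());
    auto pct = [&](double p) {
        return ns[static_cast<size_t>(p * (ns.size() - 1))];
    };
    // A deterministic system shows a tight p50-to-p99.9 gap;
    // a wide gap is the unpredictability described above.
    std::printf("p50 %ldns  p99 %ldns  p99.9 %ldns  max %ldns\n",
                pct(0.50), pct(0.99), pct(0.999), ns.back());
}
```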
The major source of latency and unpredictability is the network rather than the CPU. This includes the interconnections from the exchange as well as your own network connections, interfaces and switches. Optimizing these components, along with the network stack within the server, can yield the biggest improvements.
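Two small examples of that stack tuning on Linux, offered purely as a sketch: disabling Nagle's algorithm so small order messages leave immediately rather than waiting to coalesce, and enabling busy polling so the receive path spins briefly instead of sleeping. The numeric value is an illustrative assumption, not a recommendation; real tuning goes much deeper, into kernel bypass, interrupt affinity and switch selection.

```cpp
// Minimal sketch: two common Linux socket-level latency tunings.
#include <netinet/in.h>
#include <netinet/tcp.h>
#include <sys/socket.h>

int make_low_latency_socket() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);

    int one = 1;
    // Send small writes (e.g. order messages) immediately; do not
    // wait for Nagle's algorithm to coalesce them.
    setsockopt(fd, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one));

    int busy_us = 50;  // illustrative busy-poll budget, microseconds
    // Spin in the kernel for up to busy_us on receive instead of
    // sleeping: trades CPU for lower, more predictable latency.
    // Requires Linux 3.11+ and typically CAP_NET_ADMIN to set.
    setsockopt(fd, SOL_SOCKET, SO_BUSY_POLL, &busy_us, sizeof(busy_us));

    return fd;
}
```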
Low-latency trading systems are not just about high-speed trading. They also provide a foundation for a rigorous approach to accurate, predictable behavior. They should be viewed as a normal part of business, because low latency is normal, it's achievable, and it doesn't cost the earth.
Scott Newham is co-founder of Metamako, a low-latency provider based in Sydney, Australia.