There has been very little innovation in algorithmic trading over the past 15 years, according to Stephen Ponzio, Head of Electronic Trading, BTIG.
“There have been developments in liquidity sources – single-dealer platforms, for example – but almost nothing in terms of algorithmic functionality,” he told Traders Magazine.
According to Ponzio, the most interesting recent development is trajectory crossing, where executions are matched at the parent level, but only over a short horizon, such as five minutes.
“Both sides get the average market price for the five-minute interval, which is an improvement over executing in the market. This kind of crossing is now offered by several ATSs,” he said.
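As a rough illustration of the mechanics (a generic sketch, not any particular ATS’s matching logic), the snippet below prices a trajectory cross at the volume-weighted average of the market trades observed during a hypothetical five-minute matching window; both counterparties would fill at that price. The trade data and field names are made up.

```julia
# Minimal sketch of interval-average pricing for a trajectory cross.
# Trade data and field names are hypothetical.

struct MarketTrade
    price::Float64
    size::Int
end

# Volume-weighted average price of the market trades seen in the window.
interval_vwap(trades) = sum(t.price * t.size for t in trades) / sum(t.size for t in trades)

# Market trades printed during one five-minute matching window.
window = [MarketTrade(50.01, 300), MarketTrade(50.03, 500), MarketTrade(49.99, 200)]

cross_price = interval_vwap(window)   # both sides of the cross fill at this price
println(round(cross_price, digits = 4))
```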
Developing better-performing variants of traditional algorithms like VWAP and POV is top of mind for BTIG.
Ponzio said that a lot of firms use these algorithms but do not need strict adherence to the standard schedule or rate.
“They would be very eager to give the algorithm a bit more discretion in exchange for better performance. This is not an avenue the sell-side has explored very much,” he said.
“Some brokers have offered VWAP with a ‘dark overlay,’ which is very simplistic and does not perform well because of adverse selection,” he added.
BTIG has taken a scientific approach to the problem, modeling volume carefully to understand the difference between baseline volume and stochastic volume fluctuations, which the firm calls “opportunistic volume.”
“This is particularly important for illiquid stocks,” Ponzio said.
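A minimal sketch of that idea, using assumed numbers rather than BTIG’s model: take a historical intraday volume profile as the baseline and treat volume above it as the opportunistic component.

```julia
# Illustrative decomposition of interval volume into baseline and
# "opportunistic" components; the bins and share counts are made up.

baseline = [12_000, 9_500, 8_000, 7_200]   # historical average shares per 5-minute bin
observed = [11_500, 14_000, 8_300, 15_900] # shares actually traded today

# Opportunistic volume: the positive surprise above the baseline profile.
opportunistic = max.(observed .- baseline, 0)

println(opportunistic)   # [0, 4500, 300, 8700]
```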
“We have also developed about a dozen signals that the algorithm uses to recognize conditions when a price move is likely to revert or to be sustained,” he stressed.
“These approaches are packaged into our ‘Opportunistic VWAP’ and ‘Opportunistic POV,’ which may get a bit ahead of or behind the schedule/rate in order to trade at better times and get better prices,” he added.
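A highly simplified sketch of bounded discretion around a schedule (the signal, band and scaling below are assumptions, not BTIG’s logic): the algorithm follows the standard VWAP completion target but is allowed to run slightly ahead or behind it when conditions look favorable or unfavorable.

```julia
# Sketch: a schedule-following algorithm with limited discretion.
# `signal` in [-1, 1] is a hypothetical composite of reversion/momentum signals;
# `band` caps how far the algorithm may drift from the standard schedule.

function target_completion(schedule_pct, signal; band = 0.05)
    clamp(schedule_pct + band * signal, 0.0, 1.0)
end

# 40% through the standard VWAP schedule, with a strongly favorable signal:
println(target_completion(0.40, 0.8))   # 0.44 -> trade a bit ahead of schedule

# Same point in the schedule, unfavorable conditions:
println(target_completion(0.40, -0.6))  # 0.37 -> hold back slightly
```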
Performance is the key driver for several components of BTIG’s system: “We have a proprietary design for accessing the markets, featuring semi-autonomous modules that are co-located at each of the exchanges.”
Ponzio said that the real-time analytics are calculated using Julia, a high-performance programming language developed at MIT for use in data science and machine learning.
According to Ponzio, buy-side adoption of algorithms varies widely.
Some firms use algorithms, but only manually, and don’t really measure performance; instead they rely on poor benchmarks like venue analysis, or focus only on commissions and fees, Ponzio said.
Others are more sophisticated, using algorithms extensively and systematically; they add more automation and introduce randomization (“wheels”), he said.
This is really a prerequisite for measuring performance, he noted.
“You’ll have a very difficult time measuring performance if part of your process is manual. It’s impossible to compare brokers A and B if the utilities trader always sends to A but the healthcare trader sends to B, and one sends larger orders but tends to cancel when they go badly,” he stressed.
“To make a valid comparison (‘apples to apples’), you need to remove the manual part of the process. By being more systematic, you develop an approach that is repeatable and predictable, and best of all, measurable. Now you have a data-driven process that will lead to real savings,” he said.
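To illustrate the “wheel” idea (a generic sketch, not any firm’s router): eligible orders are assigned to brokers at random, optionally with weights, so that order characteristics are spread evenly across brokers and their performance can be compared apples to apples. Broker names and weights below are hypothetical.

```julia
# Minimal sketch of weighted random broker assignment for a wheel.
# Broker names and weights are hypothetical.

using Random

const BROKERS = ["Broker A", "Broker B", "Broker C"]
const WEIGHTS = [0.4, 0.4, 0.2]   # target share of flow for each broker

function assign_broker(rng = Random.default_rng())
    r = rand(rng)
    cum = 0.0
    for (broker, w) in zip(BROKERS, WEIGHTS)
        cum += w
        r <= cum && return broker
    end
    return last(BROKERS)          # guard against floating-point rounding
end

# Route ten hypothetical orders through the wheel.
println([assign_broker() for _ in 1:10])
```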
Ponzio said that while some people mistakenly believe algorithms have been commoditized, the more sophisticated firms are realizing there can be an enormous difference in performance, easily 5-10 times the commission.
Obviously, Ponzio said, computers can do things that humans are simply incapable of: “A trader cannot check 50+ venues for liquidity or detect subtle patterns in the market. A computer does these things very well, so an algorithm is the perfect solution once an objective has been determined.”
He added that an algorithm can execute basic strategies very well, but it’s not going to predict 6-hour returns.
“That’s where the buy-side can add value to the execution process—by determining when to trade and how aggressively to trade. These are very difficult decisions to optimize, and again, should be approached systematically,” he commented.
For a firm that already uses algorithms widely, there is tremendous opportunity for further savings, Ponzio said.
“Algorithms are not as commoditized as some people think. While behavior across brokers may be fairly standard, performance is not,” he said.
The difference in performance between brokers can be 5-10 basis points, which is multiples of the commission, he said.
“One basis point is $0.0050 (50 mils) on a $50 stock, so we’re talking several pennies per share,” he added.
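The arithmetic, spelled out: one basis point is 0.01% of the share price, so the quoted performance gap translates directly into cents per share.

```julia
# One basis point = 0.0001 of price, in dollars per share.
bp_value(price) = 0.0001 * price

println(bp_value(50.0))        # 0.005  -> $0.0050 per share (50 mils) on a $50 stock
println(5 * bp_value(50.0))    # 0.025  -> a 5 bp gap is 2.5 cents per share
println(10 * bp_value(50.0))   # 0.05   -> a 10 bp gap is 5 cents per share
```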
When asked about the future of the electronic trading space, Ponzio said: “I think that in this age of AI, people will finally start looking to algorithms to optimize executions, rather than trying to micromanage them.”
Over the past ten years, algorithms have added bells and whistles such as minimum fill quantity, max display size, venue blacklists and restrictions on execution rate, but with very little guidance on how to use them effectively, he said.
Traders often set values for these parameters without any real evidence for whether they will help the execution, Ponzio said.
In reality, he added, different values are needed at different times and places over the course of the execution.
“It may be better to use different minimum fill quantities for different venues. It may be better to use different venues for different purposes. It may be better to trade faster at some times and slower at others. These decisions are best left to a computer, which can optimize based on data,” Ponzio said.
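One toy example of what “optimize based on data” could mean here (assumed data, not BTIG’s method): derive each venue’s minimum fill quantity from that venue’s own historical fill-size distribution instead of applying a single hand-set value everywhere.

```julia
# Sketch: per-venue minimum fill quantity chosen from historical fill sizes.
# Venues and fill data are hypothetical.

using Statistics

historical_fills = Dict(
    "VENUE_X" => [100, 100, 200, 300, 500, 900],
    "VENUE_Y" => [100, 100, 100, 100, 200, 200],
)

# Use the 25th percentile of each venue's past fill sizes as its minimum fill.
min_fill = Dict(venue => quantile(sizes, 0.25) for (venue, sizes) in historical_fills)

println(min_fill)
```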
“This will be a welcome development for everyone – the buy-side will benefit from more efficient execution and the sell-side will be free to innovate and compete,” he concluded.