By Arjun Jayaram, CEO, Baton Systems
In the increasingly intricate and fast-moving capital markets landscape, there has never been greater pressure on investment banks to streamline and automate their post-trade and operations systems. Beyond drastically reducing costs, which is often a prerequisite for improving speed, scalability, and future-proofing the business, automation offers a compelling opportunity to more effectively identify, assess, and mitigate risk in an ever more complex market environment.
However, the widespread notion that this requires a radical technological overhaul is both daunting and misguided, with heavy disruption the only sure outcome. To solve today’s problems, a more practical approach is required, one that works within the binding constraints of a large organisation while also providing a modern technology and workflow stack that future-proofs the business.
This is a high hurdle, but not an insurmountable one. The banking sector’s infrastructure – especially in the post-trade environment – is often a bewildering web of siloed, albeit critical, IT systems. Many of these will have been in place for decades, evolving constantly as software engineers look to optimise processes and as mergers and acquisitions (M&A) bring other businesses’ legacy systems on board.
In addition, peripheral systems and processes may be hardwired to depend on these legacy technologies for their data and reports. Given that many of these systems run batch-based processes without clear boundaries, standard APIs, or standard message formats, replacing, upgrading, or migrating legacy software becomes all the more challenging and risky.
In this context, most organisations simply “make do” with suboptimal functionality and prolong the life of legacy systems, swallowing IT budgets to keep the lights on and endlessly patching up the cracks. However, this approach is becoming increasingly untenable due to recent market developments, not least the shift to T+1 settlement, the move of RTGS systems to 24×7 operation, and the migration to ISO 20022 messaging.
Interoperability – the ability of different systems, devices, or applications to work together seamlessly and exchange information effectively – offers a far more practical pathway. It liberates users from the constraints of legacy infrastructure, allows firms to innovate at the speed required to ensure strong client retention, market viability, and growth, and does not require the costly and disruptive “rip and replace” of existing legacy IT systems.
Achieving interoperability is no quick and easy fix, though. Firms must ensure several critical elements are in place before making meaningful progress. Firstly, backward compatibility with current systems is essential. This means being minimally invasive to existing legacy systems: from how data is ingested and exported to messaging and events, every element must work with the current setup.
System changes should also be limited to configuration, avoiding significant code changes. Likewise, the external venues that current systems connect to should see minimal amendment, beyond requests arriving from a new, secure IP address. By ensuring backward compatibility in this manner, financial institutions can transform their business capabilities without the risk and cost of overhauling existing infrastructures.
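By way of illustration, the minimal sketch below shows one way this principle can look in practice: a legacy end-of-day extract is ingested exactly as the legacy system already produces it, and republished as events, with feed locations and field mappings held in configuration so that onboarding or amending a feed is a config change rather than a code change. The file names, fields, and feed names are hypothetical, not any specific vendor’s implementation.

```python
# Hypothetical sketch: ingest an unchanged legacy batch export and republish it
# as events. Feed locations and field mappings live in configuration, so adding
# or amending a feed is a configuration change rather than a code change.
import csv
import json
from pathlib import Path

# Stands in for what would normally be an external configuration file.
CONFIG = {
    "feeds": [
        {
            "name": "legacy_settlements",            # illustrative feed name
            "path": "exports/settlements_eod.csv",   # legacy system keeps writing here
            "key_fields": ["trade_id", "currency", "amount", "value_date"],
        }
    ]
}

def publish_event(topic: str, payload: dict) -> None:
    """Placeholder for a real messaging client (e.g. a queue or stream producer)."""
    print(f"{topic}: {json.dumps(payload)}")

def ingest(feed: dict) -> None:
    """Read the legacy batch file as-is and emit one event per record."""
    with Path(feed["path"]).open(newline="") as f:
        for row in csv.DictReader(f):
            event = {k: row.get(k) for k in feed["key_fields"]}
            publish_event(feed["name"], event)

if __name__ == "__main__":
    for feed in CONFIG["feeds"]:
        ingest(feed)
```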
Secondly, real-time SaaS capabilities are imperative. The system must operate in real time, utilising modern technology stacks and data-processing techniques for ingestion, processing, workflows, messaging, and actions. Combining cloud connectivity with modern technologies is an essential step in delivering this: it ensures low latency, data encryption, enhanced security and monitoring, and resilience, among other benefits.
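To make the contrast with batch processing concrete, the sketch below handles each trade message as soon as it arrives, rather than waiting for an overnight file. The message shape and handler names are illustrative assumptions only; a production deployment would read from a durable stream and apply real enrichment, matching, and risk checks.

```python
# Minimal sketch of event-driven (rather than batch) processing: each trade
# message is handled as soon as it arrives on the queue.
import asyncio
import json
from datetime import datetime, timezone

async def consume(queue: asyncio.Queue) -> None:
    """Process messages one by one, in real time, instead of in an end-of-day batch."""
    while True:
        raw = await queue.get()
        if raw is None:  # sentinel used only to stop this sketch
            break
        trade = json.loads(raw)
        trade["processed_at"] = datetime.now(timezone.utc).isoformat()
        # In practice this is where enrichment, matching, or risk checks would run.
        print("processed", trade)
        queue.task_done()

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    consumer = asyncio.create_task(consume(queue))
    # Simulated inbound messages; a real deployment would subscribe to a stream.
    for i in range(3):
        await queue.put(json.dumps({"trade_id": f"T{i}", "amount": 1_000_000}))
    await queue.put(None)
    await consumer

if __name__ == "__main__":
    asyncio.run(main())
```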
Furthermore, it should be prepared for a multi-vendor stack, allowing best-of-breed technologies to integrate seamlessly with industry standards and APIs. This approach enables financial institutions to innovate and leverage cutting-edge technologies while still maintaining robust, secure, real-time operations.
Thirdly, and perhaps most importantly, future-proofing the system is vital, as post-trade processes will change significantly over the coming decade. New asset classes – such as digital and hybrid assets – faster settlement cycles, new automated post-trade workflows involving multiple distributed ledger technology (DLT) stacks, and new messaging standards, domain models, and trading venues are all on the horizon. Each innovation will have a different maturity cycle, so the software needs to be modular, delivering new functionality incrementally and frequently.
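One common way to achieve this kind of modularity, sketched below with hypothetical class and asset-class names rather than any particular product’s design, is a simple adapter registry: support for a new asset class, settlement rail, or DLT venue is added as a self-contained module and registered alongside the existing ones, leaving the core workflow untouched.

```python
# Hypothetical sketch of a modular adapter registry: new asset classes or
# settlement rails are added as plug-ins rather than core code changes.
from abc import ABC, abstractmethod

class SettlementAdapter(ABC):
    """Contract every venue or asset-class module must fulfil."""

    @abstractmethod
    def settle(self, trade: dict) -> str:
        ...

class FiatRtgsAdapter(SettlementAdapter):
    def settle(self, trade: dict) -> str:
        return f"RTGS payment instructed for {trade['trade_id']}"

class DltTokenAdapter(SettlementAdapter):
    def settle(self, trade: dict) -> str:
        return f"On-ledger transfer submitted for {trade['trade_id']}"

# Registering a new adapter is the only step needed to support a new rail.
REGISTRY: dict[str, SettlementAdapter] = {
    "fiat": FiatRtgsAdapter(),
    "digital": DltTokenAdapter(),
}

def settle(trade: dict) -> str:
    """The core workflow stays unchanged; it simply dispatches to the right module."""
    return REGISTRY[trade["asset_class"]].settle(trade)

if __name__ == "__main__":
    print(settle({"trade_id": "T1", "asset_class": "fiat"}))
    print(settle({"trade_id": "T2", "asset_class": "digital"}))
```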
Additionally, frameworks must accommodate further innovations in artificial intelligence, real-time analytics, and cybersecurity from multiple vendors, without compromising data security and privacy. This modular and adaptive approach ensures financial institutions can evolve with, and remain at the forefront of, market developments and technological advancements.
Ultimately, the path to modernisation in investment banking does not lie in the disruptive “rip and replace” of legacy systems. Embracing interoperability provides a more pragmatic route to innovation, enabling banks to enhance their capabilities and stay competitive without the upheaval of a complete overhaul.
As financial institutions navigate the complexities of today’s markets, interoperability will be the cornerstone of their strategic evolution. This approach not only ensures continuity and stability but also positions banks to capitalise on future opportunities and technological advancements as and when they emerge. In the increasingly competitive capital markets arena, this could be the difference between success and failure.