To date, the financial services industry has focused on standardizing data exchange across organizations but has paid little attention to standardizing how data is managed within them, according to the Depository Trust & Clearing Corporation (DTCC).
In its latest whitepaper, “Data Strategy & Management in Financial Markets”, DTCC said this has created significant problems. Due to heterogeneous formats and disparate systems, most financial institutions hold only a minority of the data they possess on a modern data platform where it can be used to generate insight.
According to the paper, several aspects of post-trade processing remain manual, which means potentially valuable data never gets analyzed, or even stored on a modern technology platform.
DTCC believes that today’s data infrastructure is inefficient: “These inefficiencies in data management cause operational risk at various points in the trade lifecycle, increase processing time and often require costly reconciliations.”
According to DTCC, three factors are driving change in how data is exchanged and managed: the accelerating adoption of new technologies; a cultural shift toward more collaborative approaches; and growing financial innovation that requires modern approaches to data.
Kapil Bansal, Managing Director, Head of Business Architecture, Data Strategy & Analytics at DTCC, said as new technological advancements, including broad adoption of cloud technology, spark an evolution of global markets, the financial services industry has an opportunity to reimagine how data is exchanged and managed across financial markets and firms.
“For many years, companies have collected massive stores of data, but the siloed nature of data and the need for better data quality limited the ability of market participants to extract strategic insights to support more effective decision-making,” he said.
“We’re now at a unique moment where we can make this vision a reality, but long-term success hinges on market participants taking action to ensure their data strategy can meet the demands of a highly digitalized and interconnected marketplace,” he added.
DTCC’s whitepaper details four hypotheses that will drive how data is used in financial markets in the future:
Data will be more accessible and secure: Data users will have increased flexibility in determining how and what data is received at desired times. To enable this, data governance, privacy and security will need to be prioritized.
Interconnected data ecosystems as a new infrastructure layer for the financial industry: Industry participants will free their own data from legacy systems and be able to pool it into data ecosystems and connect those ecosystems to others. This will reduce duplication of data across the industry and allow for the co-development of innovative data insights.
Increased capacity to focus on data insights: More efficient data management, cloud-enabled capabilities, and further automation of routine data management tasks will free up capacity and accelerate time to market for new product development, allowing specialized data analysts and data operations teams to focus on deriving insights from vast stores of data.
Ubiquity of “open source” data standards: It is anticipated that the industry will continue to adopt more standards around data models, with the most viable use cases being reference and transaction reporting data. This will result in increased operational efficiency and better data quality.
To enable these changes, the whitepaper suggests that institutions producing and consuming significant amounts of data embed key principles into their data operating models. These include: establishing robust foundational data management capabilities, such as maintaining a thorough understanding and catalog of data, breaking down data silos and implementing robust data quality practices; supporting strong data governance, including the right set of data privacy and security standards to enable data collaboration with partners; and exploring where there is mutual benefit from collaborative data environments across firms and the industry to advance interoperability.
Applying these principles will help market participants gain access to data that is trapped or underutilized today and allow for new and faster insights, Bansal said.
“Building the future of data exchange and management will require close consultation and coordination among industry participants and service providers, including standardization in how data is managed and shared,” he said.
“At DTCC, we’ve been engaging with our clients and partners to identify and prioritize next steps, and we look forward to continuing this dialogue to maximize the value and potential of data across the industry,” Bansal added.