Conquering the ESG Data Challenge

Insights in this article first featured in FINBOURNE’s latest report, ‘ESG Data Resiliency: From fire-fighting regulations to commercializing new opportunities’, with contributions from KPMG.

With ESG and net-zero targets rising to the top of the global agenda, investment managers face new operational challenges, as regulatory overhauls and a new generation of investors push for greater transparency and accountability.

Many investors now expect managers to generate consistent returns across multi-asset portfolios while not negatively affecting communities and the environment. In doing so, however, these managers face challenges on several operational fronts: aggregating complex and diverse data sets, interpreting non-standardized reporting frameworks, and working around a lack of entity-level granularity.

Challenges remain for asset managers

It’s clear that the ESG data challenge is not a problem in itself; at its heart is the fundamental problem of understanding and deriving value from data. As firms face growing data consistency and granularity challenges, ESG has become the epitome of what is wrong with capital markets infrastructure today: legacy mainframe technology imposes severe limitations, and this particular data dilemma happens to have planet-saving consequences.


“The ESG data challenge is not entirely unique. Many of the challenges apply to other types of data, but the increasing focus on ESG and its relevance as a ‘new data set’ have exposed these existing challenges, such as rigid data models. The key is to interoperate with existing systems and data models to be able to identify, surface, join and interpret key data sets to support evidence-based, data-driven decision making.”

Chris Farrell, Head of US and Global Head of Strategy, FINBOURNE Technology
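
To make the ‘join and interpret’ step Farrell describes concrete, here is a minimal, hypothetical Python sketch: an existing holdings extract is joined to a third-party ESG score feed on a shared instrument identifier, coverage gaps are surfaced, and a weighted portfolio-level score is derived. All data, column names and the pandas-based approach are illustrative assumptions, not FINBOURNE’s implementation.

```python
import pandas as pd

# Hypothetical holdings extract from an existing portfolio system.
holdings = pd.DataFrame({
    "portfolio_id": ["P1", "P1", "P2"],
    "isin": ["US0378331005", "GB0002634946", "DE0007164600"],
    "market_value": [1_000_000, 500_000, 750_000],
})

# Hypothetical third-party ESG score feed; note it does not cover every holding.
esg_scores = pd.DataFrame({
    "isin": ["US0378331005", "DE0007164600"],
    "esg_score": [71.0, 64.0],
})

# Join the data sets on a shared instrument identifier (ISIN here; entity-level
# identifiers such as LEIs would give finer, issuer-level granularity).
merged = holdings.merge(esg_scores, on="isin", how="left")

# Surface coverage gaps before interpreting anything.
uncovered = merged[merged["esg_score"].isna()]
print(f"{len(uncovered)} of {len(merged)} holdings lack an ESG score")

# Interpret: a market-value-weighted ESG score per portfolio, covered holdings only.
covered = merged.dropna(subset=["esg_score"])
weighted = covered.groupby("portfolio_id").apply(
    lambda g: (g["esg_score"] * g["market_value"]).sum() / g["market_value"].sum()
)
print(weighted)
```

Even this toy example shows why entity-level granularity matters: without a reliable shared identifier, the join itself fails, and every downstream metric inherits the gap.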

From ‘best endeavors’ to commercializing new opportunities

Asset managers are under increasing pressure from ESG regulatory forces, and many are still in ‘firefighting’ mode when it comes to managing ESG data and meeting regulatory requirements.

The reality today is that firms are struggling to aggregate and translate ESG investment data into a timely, reliable, firm-wide view, with a detrimental impact on both portfolio operations and reporting. As a result, they have yet to tap into its commercial opportunities, such as generating proprietary insights, launching new products, attracting a diverse investor base and driving sustainable growth.

At the root of these struggles are legacy systems responsible for data fragmentation and organizational silos, and consequently a host of labor-intensive manual workarounds. This, together with the addition of niche ESG bolt-on systems, has created a tangle of systems and interfaces and deepened the ESG data struggle.

Achieving consistent data quality requires a flexible data model that can derive value from multiple, disparate ESG data sets while also navigating the evolving ESG regulatory framework, as the sketch below illustrates.
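
As a hedged illustration of what such a flexible model could look like in practice, the Python sketch below normalizes two hypothetical vendor feeds, with different field names and score scales, into one common schema before aggregation. The vendor names, fields and 0–100 target scale are assumptions made for illustration, not a prescribed standard.

```python
import pandas as pd

# Two hypothetical vendor feeds with inconsistent schemas and scales.
vendor_a = pd.DataFrame({
    "ISIN": ["US0378331005", "GB0002634946"],
    "esg_total": [71.0, 55.0],            # already on a 0-100 scale
})
vendor_b = pd.DataFrame({
    "isin_code": ["US0378331005", "FR0000120271"],
    "score": [8.2, 6.4],                  # on a 0-10 scale
})

# A common schema keeps the model flexible: each feed is mapped to the same
# identifier, a score rescaled to 0-100, and a provenance column.
def normalize(df, id_col, score_col, scale, source):
    return pd.DataFrame({
        "isin": df[id_col],
        "esg_score": df[score_col] * (100.0 / scale),
        "source": source,
    })

unified = pd.concat([
    normalize(vendor_a, "ISIN", "esg_total", 100.0, "vendor_a"),
    normalize(vendor_b, "isin_code", "score", 10.0, "vendor_b"),
], ignore_index=True)

# Where vendors disagree, take the mean; keeping 'source' preserves lineage,
# so the consensus rule can change as regulatory frameworks evolve.
consensus = unified.groupby("isin")["esg_score"].mean().reset_index()
print(consensus)
```

The design point is the separation of concerns: each vendor mapping is isolated in one normalization step, so adding a new feed or a new regulatory field means extending the schema, not rebuilding the pipeline.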

Achieving data resiliency without compromising on cost

Data consistency and granularity challenges are not limited to ESG, but the key to overcoming them is establishing a data operating model that works not only now but also in the future. COOs should therefore be asking: Is our operating model evolving at the right pace? Can it cope with new and changing data sets and reporting requirements while still achieving the accuracy needed? And can it deliver the analytics and insights required to create new and appealing products, while keeping total cost of ownership low? Having the right data stack and operating model will be critical to achieving all of this.

With cost-saving high on the agenda, firms want to avoid an expensive, ‘big bang’ change to their data and operations. A flexible model will enable them to adapt to changes in ESG regulation further down the line and to seek out opportunities that deliver meaningful change for the planet and returns to stakeholders.