Heather McKenzie explains why robust frameworks are key to unlocking the value of data.
Although buyside firms recognise that market and reference data are a significant source of alpha, challenges remain in their market and reference data strategies.
These were the findings of a recent Refinitiv report – Connected data: challenges remain for buyside market and reference data strategies – which canvassed 800 executives at buyside firms in 16 countries. It said that to fully unlock this value, firms must evolve their market and reference data strategies, including ensuring the entire organisation has access to the data.
Eighty-five per cent of the buyside executives surveyed agreed that better use of market and reference data would help their firms achieve greater efficiency, allowing resources to be reallocated to revenue-earning activities. More than two-thirds (67%) of respondents noted that the importance of market and reference data had increased at their firm during the 12 months to April 2021, compared with 64% for the sellside.
Against this backdrop, Refinitiv believes buyside firms see robust data governance as a source of strategic advantage. It improves data quality, enhances collaboration around data across silos, and enables the business to engage with data faster and with greater trust.
One quarter of the buyside firms surveyed rated their organisations' data governance programmes as market-leading in addressing the critical challenges around market and reference data, compared with just 16% on the sellside. Refinitiv says buyside firms are prioritising data governance in a way that would enable some of them to claim market-leader status.
From a technical perspective, the survey found that more than one quarter (27%) of respondents identified difficulty in finding new data sets as their main challenge (see Fig 1). Data consistency across different channels or suppliers was identified as a hurdle by 26% of respondents, a reflection of the fact that buyside firms are working with an increasing number of data sources. Other obstacles include remote working and mobile usage, incomplete data, and slow delivery of new data sets.
Thirty-five per cent of respondents said the inconsistency of data used by front, middle and back offices was a significant challenge, with one third citing inconsistency of data sources and vendors between front, middle and back offices.
To overcome the difficulties of disparate data sets of variable quality, Roy Kirby, Head of Core Products, SIX Financial Information, says buyside firms must understand the strength and expertise of each individual data provider they are using. “Firms with genuinely strong data management frameworks in place are the ones with a rich understanding of their data and knowledge of the top providers in each asset class, content set, geography, and jurisdiction in which they operate,” he says.
However, when it comes to the actual source of the data, firms must be aware of the structure that has been applied to the data they are receiving from their providers. "These rule sets should link to available market standards and be adaptable, so they can be updated quickly as and when new rules need to be applied," he says.
Data needs to be fast and accurate, so the rule sets must be well understood, backed by market standards and linked to regulatory requirements (see Fig 2), so that any change flows straight through data systems, he adds.
Linda Coffman, EVP SmartStream RDU, says firms need to ensure that they don’t let in bad data. “It comes down to procurement and quality assessment. Firms pay a lot for market and reference data and thus they should feel confident the data they purchased and integrated can do the job. You have to make sure that when you open the front door, you are only letting in high quality data fit for purpose.”
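Coffman's "front door" idea can be illustrated with a minimal sketch in Python. The example below is hypothetical, not SmartStream's implementation: it assumes each incoming record must carry an ISIN with a valid check digit, a positive price, and a currency code before being admitted.

```python
# Hypothetical quality gate: admit only complete records with a valid ISIN.
REQUIRED_FIELDS = {"isin", "price", "currency"}

def isin_checksum_ok(isin: str) -> bool:
    """Validate the ISIN check digit: expand letters to two digits
    (A=10 ... Z=35), then apply the Luhn algorithm to the result."""
    if len(isin) != 12 or not isin[:2].isalpha():
        return False
    digits = "".join(str(int(c, 36)) for c in isin)
    total = 0
    for i, ch in enumerate(reversed(digits)):
        n = int(ch)
        if i % 2 == 1:            # double every second digit from the right
            n = n * 2
            if n > 9:
                n -= 9
        total += n
    return total % 10 == 0

def admit(record: dict) -> bool:
    """'Front door' check: reject incomplete or inconsistent records."""
    if not REQUIRED_FIELDS <= record.keys():
        return False
    return isin_checksum_ok(record["isin"]) and record["price"] > 0

print(admit({"isin": "US0378331005", "price": 172.5, "currency": "USD"}))  # True
print(admit({"isin": "US0378331006", "price": 172.5, "currency": "USD"}))  # False
```

A real gate would add many more rules (date sanity, stale-price detection, vendor cross-checks), but the principle is the same: validation happens at the point of ingestion, not downstream.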
Evolving data management practices is about “connecting the dots”, she adds, saying firms should implement strong tools that can cross-reference exchange symbology and market data symbology, along with support for their own internal symbology. “If you don’t have the tools and services in place that enable mapping between data sets, you will end up with a large number of exceptions, increased cost and multiple data silos instead of a cohesive, clean set of data.”
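The cross-referencing Coffman describes can be sketched as a simple symbology lookup table. All identifiers and internal IDs below are invented for illustration; real mapping services handle millions of instruments and corporate-action-driven symbol changes.

```python
# Hypothetical cross-reference: each internal ID is linked to the symbols
# used by an exchange and two data vendors (all identifiers invented).
SYMBOL_XREF = {
    "INT-000123": {"exchange": "VOD.L", "vendor_a": "VOD LN",
                   "vendor_b": "GB00BH4HKS39"},
    "INT-000456": {"exchange": "AAPL.O", "vendor_a": "AAPL US",
                   "vendor_b": "US0378331005"},
}

def to_internal_id(symbol: str):
    """Resolve any known external symbol to the firm's internal ID."""
    for internal_id, aliases in SYMBOL_XREF.items():
        if symbol in aliases.values():
            return internal_id
    return None  # unmapped symbols land in an exception queue for review

print(to_internal_id("VOD LN"))  # INT-000123
print(to_internal_id("XXX"))     # None
```

Without such a mapping layer, each unmatched symbol becomes a manual exception, which is exactly the cost and silo problem Coffman warns about.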
Building bridges
Any framework for data platforms should have three components, says Coffman: where the data lives, how it is accessed and how it is managed. Aligning the three is “the trick”. “A firm can build a very complex framework with all the bells and whistles, but if you cannot effectively access the data or the data is of poor quality, the effort and cost has been wasted.” Defining and planning around the three components will ensure that the framework is fit for purpose.
“The experience will be different for every institution based on internal requirements and legacy systems – for instance, some have put data enterprise management on the cloud, for others that doesn’t make sense at this point in time. Even the best practice of having a single security master file may not work at every organisation. The industry has realised there is no one magic formula – data can look different, as long as you keep the three core features at the forefront of your framework design and resource management.”
Kirby says that in building frameworks, larger buyside firms should always start with two questions: why is all this data required, and what is it going to be used for? "Focusing on the why and the what helps firms to break down high volumes and rationalise workflows, which in turn helps with efficiency and cost," he says.
From this perspective, firms must ensure they are thinking about the trusted data suppliers with whom they choose to work. Market participants need to know the source of the data and have trust in its quality.
“Firms must build data frameworks in a way that is adaptable and works across common standards linking data points together,” he adds. “For example, wherever there are regulatory standards in place, such as ISINs, firms must ensure they are using these reference points because it is then far easier to pull in other pieces of data at a later date. This is where the idea of connected data comes into play, and gives an additional value to the data.”
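Kirby's point about standard identifiers can be shown in a short sketch: because the two hypothetical feeds below both key their records on the same ISIN, connecting them is a trivial join (all feed contents are invented for illustration).

```python
# Two hypothetical vendor feeds, both keyed on ISIN (values invented).
pricing_feed = {"US0378331005": {"close": 172.50, "ccy": "USD"}}
reference_feed = {"US0378331005": {"name": "Apple Inc",
                                   "sector": "Technology"}}

def connect(isin: str):
    """Join pricing and reference data on the shared ISIN key."""
    price = pricing_feed.get(isin)
    ref = reference_feed.get(isin)
    if price is None or ref is None:
        return None  # one feed lacks the instrument: flag, don't guess
    return {"isin": isin, **price, **ref}

print(connect("US0378331005"))
```

Had either feed used a proprietary symbol instead of the ISIN, the join would first require the kind of cross-reference mapping discussed earlier, which is precisely the extra cost that common standards avoid.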
As with other areas of financial markets, firms are also driving towards greater automation of workflows. Kirby says it is important to minimise the amount of manual manipulation required in a firm’s data flow as this can lead to errors in reporting and consistency. “Stronger frameworks enable firms to achieve high quality outcomes and invest more effectively.”
An important trend in data management is virtualisation, says Coffman. This enables firms to react quickly to downstream users’ requirements while adhering to due diligence around data management. “How to deal with downstream users’ requirements in a fast-changing environment is one of the biggest challenges for buyside firms. By using a virtual platform, users can play around with the data and solidify their requirements before they are ingested into the data management ecosystem.”
Alongside the virtualisation trend, some SmartStream clients are also using visualisation tools such as dashboards to help users understand the data sets available. "So many more people within a firm are now using data. Cloud computing, APIs, virtualisation and visualisation, data governance and data quality programmes are all part of best practice at firms now. How these elements are implemented will be based on the unique characteristics of each firm."
©Markets Media Europe 2022