A MOST PRECIOUS ASSET.
Data is no longer just disparate pieces of a puzzle but a key plank in regulatory compliance. Heather McKenzie reports on how firms are now looking at the bigger picture.
While data is a crucial element in ensuring best execution, the way it has been maintained and managed over the past few decades has often been poor. Cost pressures and new regulatory obligations, though, are changing the way buyside firms handle their data. In the past 15 years, they have moved from focusing solely on data entry to a more holistic approach that requires strong data governance frameworks.
The ultimate goal is to transform data from a process into an asset. This is not without its challenges, however. Disparate systems and siloed architectures mean that while data may enter a firm through a single point, its distribution throughout the organisation can compromise its quality. In other words, data loses its integrity as different systems treat it in different ways.
What has become apparent is that senior management involvement is required if a firm is to make this transformation. A buyside survey conducted by US-based data specialists Rimes found that the firm’s clients were adopting a more strategic approach to data management, with greater senior management involvement. The Rimes 2015 Buy-Side Survey found that regulatory compliance is a fast-growing driver of moves to achieve data governance best practice. In the 2014 survey, 24% of firms cited regulatory compliance as the main benefit of good data governance; in 2015 that figure had climbed to 42%.
Another growing factor is reputation. In 2014 only 16% of respondents said reputation was a main benefit of data governance best practice, but in 2015 that figure had almost doubled to 30%. Meanwhile, cost as a driver of best-practice data governance fell from 57% in 2014 to 45% in 2015.
Moving up the agenda
Firms are “much more aware”, says Rimes, of the need to improve transparency and control through robust data governance. There is more emphasis on aligning spend with business success rather than on cost reduction as an end in itself. All firms must adapt to new industry dynamics, and this is reflected in emerging data management priorities.
The establishment of data governance frameworks is one of the highest priorities for firms, says Paul McInnis, head of enterprise data management at US-based data technology company Eagle Investment Systems. A survey the firm conducted among its customers found that while 80% felt their organisations valued data as an asset, more than 60% said they had no formal governance frameworks in place.
However, McInnis believes a change of attitude is taking place as the market adjusts to conditions that emerged after the global credit crisis of 2008. “Before that time, organisations had separate lines of business and no one was really forced to be efficient because everyone was making money,” he says. Silos within firms had their own IT budgets and responsibility for managing data. It was only after 2008, when budgets were cut, that the inability of organisations to look holistically across portfolios and identify exposures became an issue.
Regulators – through initiatives such as the Basel Committee on Banking Supervision’s BCBS 239 rule – are mandating that financial organisations know their liquidity and risk exposures. The only way to meet such obligations, says McInnis, is to aggregate and consolidate data in one place. But this is not solely a technical challenge for the IT department; it also requires the involvement of the business side. “The business owns the data, while IT is responsible for storing it. IT is not responsible for the business decisions regarding how data is remediated or treated,” he says.
An important element of the regulatory initiatives is the granularity required by regulators, says Ed Royan, COO EMEA at AxiomSL, a US-based provider of regulatory reporting technology. “There are new slices and dices of information that regulators want in order to judge the risk level of particular organisations,” he says. Previous ways of dealing with data will no longer be adequate. Like McInnis, he says the business side of organisations needs to be more closely involved, and that data governance policies must be clear about who is responsible for data and who signs it off.
Regulatory reporting requirements can be used as a catalyst to improve data management and to inform data governance frameworks, says Royan. “The regulatory reporting aspect is a good place to start. BCBS 239, for example, asks firms about their control processes and how they know that information is reported correctly and consistently. We believe regulatory reporting requirements can drive changes in business behaviour when it comes to data.”
Speaking at a recent data conference in London, a global data services executive from a large buyside firm said his company’s strategy was to ensure that all data management professionals within the firm understood the importance of what they were doing. “Internal users of data expect consistency, reliability and simplicity.” To deliver this, data should not be regarded as merely a process.
Another speaker at the event joked that data is a “career-damaging opportunity”, an observation that resonated with the audience. He also stressed that the business side owns the data, and that chief data officers will be expected to demonstrate value if data is to be treated as an asset. The data services executive agreed, saying that at his firm the chief data officer was now much more engaged with business-level executives, which had been “extremely positive” for the data business.
The head of information management at another large asset manager said data governance frameworks should focus on collaboration. Companies such as Amazon get the most out of their data by bringing together finance, sales and marketing teams. Doing this in buyside firms would give the organisation as a whole a better understanding of assets under management, outflows, performance and so on. “We need to get heads of department around the table because there is much more demand now for data analytics. We have to make data work harder and generate more from it.”
A shared response
There is growing interest in utility models for data. The argument is that by letting a central utility cleanse and aggregate data, firms can focus on monetising their data assets. “Utilities are not a market data cost play,” says Joseph Turso, VP of product management at SmartStream. “Utilities are about efficiency and operational benefits. By embracing a unified set of information, financial institutions can focus on the elements that are more specific to their businesses.”
A utility approach is also offered by Avox, a wholly owned subsidiary of the US post-trade market infrastructure group, the Depository Trust & Clearing Corp (DTCC). Avox focuses on legal entity reference data, but the philosophy is the same: by taking on the burden of maintaining this data, it allows firms to focus on their core business. “One of the biggest challenges firms face is finding consistency in data. Many different systems are doing different things and trading desks within firms have inconsistent views of the same information,” says Mark Davies, general manager at Avox and managing director of DTCC Europe. Being wrong but consistent, with a single view of data, is preferable to being inconsistent, he claims.
The concept of data as an asset has been around for a while, but it truly resonates when it comes to reconciliations, says Davies. “If a firm can get to the point where it has one view of the truth from one source of information, which is distributed and not overwritten locally, this can have far-reaching benefits for the firm.” In practice, this would mean that rather than different departments and trading desks each spending 10% of their time on data analysis and repair, that work is all pushed back to the centre. Removing this inefficiency can establish a firm business case for utilities, he adds.
©BestExecution 2016