By Kuhan Tharmananthar, Etrading Software
Data, data everywhere… and not a standardised byte in sight! Financial markets have been caught in a double helix of technological advancement and financial innovation for the last 40 years. Every time technology changes, financial products, markets or processes alter to maximise the value from the technology and, of course, the reverse is also true. Whilst this has delivered huge benefits (faster execution, automated settlement, new financial products, international access, reduced explicit transaction costs and improved price discovery), those benefits have been distributed unevenly across market participants, leaving the market as a whole with a much higher cost base.
Some of this imbalance is caused by the fragmentation and misinterpretation of data. As markets have dematerialised, the importance of data has increased. It is data that defines the instrument being traded; it is data that contains the potential prices from different traders; it is data that tells market participants what the actual trade prices are and where the market has closed.
Unsurprisingly, this data is now valuable and expensive. What’s worse is that much of it is essential: without it, it isn’t possible to obtain a good price… let alone the ‘best’ one. As a result, there is now an entire ecosystem surrounding the generation, maintenance and distribution of data.
In OTC markets, financial securities and the data surrounding them are often originated by banks and their corporate clients when they create a cash security, whether equity or debt. Various data vendors then digitise, clean and enhance this data before selling it to any and all market participants, including those who generated it in the first place. The data is now covered by strict licensing restrictions and is presented and stored in proprietary formats. Market participants embed these proprietary formats into their own systems, building complex mapping and verification/scrubbing tools of their own to validate and clean data from different sources so they can build a single picture of the markets they follow.
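To illustrate the kind of mapping and scrubbing work this creates, here is a minimal sketch in which two hypothetical vendor feeds describe the same bond in different formats and a firm normalises and reconciles them into a single trusted record. The vendor field names, their conventions and the target schema are all assumptions for illustration, not any real vendor's format.

```python
# Sketch of an in-house mapping/scrubbing layer: two hypothetical vendor feeds
# describe the same bond with different field names and conventions, and the
# firm normalises them into one record before trusting it.

from dataclasses import dataclass

@dataclass
class Instrument:
    isin: str
    currency: str
    coupon_pct: float   # coupon as a percentage, e.g. 3.25
    maturity: str       # ISO date, e.g. "2030-06-15"

def from_vendor_a(rec: dict) -> Instrument:
    # Vendor A (hypothetical): coupon quoted in percent, maturity already ISO-formatted.
    return Instrument(rec["ISIN"], rec["Ccy"], float(rec["Cpn"]), rec["Maturity"])

def from_vendor_b(rec: dict) -> Instrument:
    # Vendor B (hypothetical): coupon quoted in basis points, maturity as DD/MM/YYYY.
    d, m, y = rec["maturity_date"].split("/")
    return Instrument(rec["isin"], rec["ccy"].upper(),
                      float(rec["coupon_bp"]) / 100.0, f"{y}-{m}-{d}")

def reconcile(a: Instrument, b: Instrument) -> Instrument:
    # 'Scrubbing': the two sources must agree before the record is accepted.
    if (a.isin, a.currency) != (b.isin, b.currency):
        raise ValueError(f"Vendor mismatch for {a.isin}")
    if abs(a.coupon_pct - b.coupon_pct) > 1e-9 or a.maturity != b.maturity:
        raise ValueError(f"Conflicting terms for {a.isin}")
    return a

golden = reconcile(
    from_vendor_a({"ISIN": "XS0123456789", "Ccy": "EUR",
                   "Cpn": "3.25", "Maturity": "2030-06-15"}),
    from_vendor_b({"isin": "XS0123456789", "ccy": "eur",
                   "coupon_bp": "325", "maturity_date": "15/06/2030"}),
)
print(golden)
```

Every firm builds some version of this layer for itself, for every vendor and every asset class, which is precisely the duplicated cost a common standard would remove.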
So far, so bad. Locked into proprietary structures, exposed to changes made by those who own them, and facing charges based on value rather than cost, participants bear increasingly high, non-trade-specific costs simply to stay in the market. These data charges now generate the majority of revenue for some exchanges, evidence of how vital the data has become to a functioning marketplace. Whether by accident or design, regulators are about to affect this business model through MiFID II, which requires market participants of all stripes to publish reference data, pre-trade and post-trade data and transaction reports, among other obligations. Some of this data is currently subject to ownership and distribution restrictions under the terms of its licences. It is not clear how these licensing conflicts will be resolved once the data is made public by a regulator or under regulatory rules.
Examining pre-trade transparency
Taking pre-trade transparency as an example: previously, in the credit market, the closest parallel was the pre-trade indications sent via spreadsheets, instant messaging and sometimes direct transfer into a client’s systems. However, this was proprietary and uncontrolled, and did not necessarily reflect the inventory or intent of the sell-side distributing the data. In addition, meeting a regulatory requirement to ‘publish’ pre-trade data in this way would have been incredibly expensive. The Neptune utility has taken some steps towards addressing this issue.
In the first instance, the network uses the FIX open standard as the protocol through which structured pre-trade indications are distributed. By providing the basic network, or ‘pipes’, Neptune deliberately avoids offering ‘value-add’ services. All it aims to do is provide the commoditised technology layer upon which other companies can offer richer services. By providing this base layer, Neptune helps to reduce costs for the whole market. This model has the potential to be expanded or duplicated to fit regulators’ specific requirements around pre-trade transparency data. The expectation is that new rules surrounding data will continue to be issued as regulators take steps to ‘improve’ market data. Neptune is a powerful example of how an open-standard utility model can provide the basis on which market participants can efficiently meet these new requirements as they arrive.
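To make the idea of a structured pre-trade indication more concrete, the sketch below assembles a FIX Indication of Interest (IOI, MsgType 6), the sort of message a network like Neptune could carry. The specific field choices, identifiers and values are illustrative assumptions rather than Neptune’s actual message schema, and session-level header fields are omitted for brevity.

```python
# Minimal sketch: building a FIX 4.4 Indication of Interest (IOI, MsgType=6).
# Field choices and values are illustrative, not Neptune's actual schema.
# Session-level header fields (SenderCompID, MsgSeqNum, SendingTime, ...) are omitted.

SOH = "\x01"  # FIX field delimiter

def fix_message(fields: list[tuple[int, str]]) -> str:
    """Assemble a FIX message, computing BodyLength (tag 9) and CheckSum (tag 10)."""
    body = SOH.join(f"{tag}={value}" for tag, value in fields) + SOH
    head = f"8=FIX.4.4{SOH}9={len(body)}{SOH}"
    checksum = sum((head + body).encode()) % 256  # sum of bytes before tag 10, mod 256
    return f"{head}{body}10={checksum:03d}{SOH}"

# A hypothetical dealer axe in a corporate bond, identified by ISIN.
ioi = fix_message([
    (35, "6"),             # MsgType = IOI
    (23, "IOI-0001"),      # IOIID (illustrative identifier)
    (28, "N"),             # IOITransType = New
    (55, "[N/A]"),         # Symbol unused; instrument given by ISIN below
    (48, "XS0123456789"),  # SecurityID (hypothetical ISIN)
    (22, "4"),             # SecurityIDType = ISIN
    (54, "2"),             # Side = Sell (dealer offers bonds)
    (27, "M"),             # IOIQty = Medium
    (44, "99.75"),         # Price (indicative)
])
print(ioi.replace(SOH, "|"))  # print with '|' in place of SOH for readability
```

Because every participant reads the same tags in the same way, a receiver needs no bilateral, vendor-specific mapping layer to interpret the indication, which is exactly where the cost saving of an open standard comes from.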
In our next article, we will focus this discussion further on the role that standards can play in this changing environment and how they can improve the efficiency and effectiveness of capital markets.
We’d love to hear your feedback on this article.