By Sassan Danesh, Managing Partner, and Kuhan Tharmananthar, Product Development, eTrading Software
Recent years have witnessed an inexorable rise in the electronification of trading, driven by a trinity of regulatory, technological and cost pressures. The benefits to investors have been significant, including faster execution, reduced explicit transaction costs and improved price discovery. However, alongside these benefits has come the ever-growing cost of connecting to, and consuming, the data required to operate efficiently in the new electronic world.
Much of this data is essential: it is data that defines the instrument being traded; data, and connectivity to it, that reveals the potential prices on offer from different brokers; and data that captures actual trade prices and market closing levels. It is not surprising that this data is now valuable, or that an entire ecosystem has sprung up around its generation, maintenance, distribution and consumption.
The challenges associated with the high cost of data are not new to the equity markets, which have been wrestling with this problem for many years. However, the increased electronification now occurring in fixed income markets threatens to create the same challenges for OTC markets. The timing is therefore ideal for the community to start discussions on creating a fixed income market data ecosystem with a more efficient cost structure and data governance model for all market participants.
Reducing market data costs through collaboration
Part of the reason for today’s high market data costs lies in the structure of the market data ecosystem: raw data is provided by brokers in a variety of formats; data vendors then clean, reformulate and enhance this data before selling it on to market participants, including the very firms that generated it. This enriched data is delivered via proprietary channels and in custom formats. Market participants embed these proprietary formats into their own systems, building complex mapping and verification/scrubbing tools of their own to validate and clean the data from different sources so they can assemble a single picture of the market.

This duplication of effort is an opportunity for the market as a whole to collaborate on a standardised distribution mechanism that simplifies the ecosystem and drastically reduces market data costs to end-users. However, any initiative that seeks to simplify the ecosystem must address the needs and sensitivities of the key stakeholders: the sell-side (typically the data originators); the buy-side (typically the data consumers); and ideally also the data distributors (typically the data vendors who provide connectivity and value-added services to end-users).
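To make this duplication concrete, the sketch below shows the kind of per-vendor mapping layer each consumer currently has to write and maintain. The vendor formats, field names and prices are invented for illustration only; the point is that a standardised distribution format would remove the need for this layer altogether.

```python
from dataclasses import dataclass

@dataclass
class BondQuote:
    """A firm's own internal, normalised view of a dealer quote."""
    isin: str
    bid: float
    offer: float
    source: str

def from_vendor_a(raw: dict) -> BondQuote:
    # Hypothetical Vendor A delivers JSON-like records with its own field names.
    return BondQuote(isin=raw["isin"], bid=float(raw["bidPrice"]),
                     offer=float(raw["askPrice"]), source="VendorA")

def from_vendor_b(raw: str) -> BondQuote:
    # Hypothetical Vendor B delivers pipe-delimited records: ISIN|BID|OFFER
    isin, bid, offer = raw.split("|")
    return BondQuote(isin=isin, bid=float(bid), offer=float(offer), source="VendorB")

# The same bond, quoted via two proprietary channels, reconciled by hand-written mappers.
quotes = [
    from_vendor_a({"isin": "XS0123456789", "bidPrice": "99.45", "askPrice": "99.65"}),
    from_vendor_b("XS0123456789|99.47|99.63"),
]
print(quotes)
```

Every additional data source means another mapper, another scrubbing rule set and another point of failure for each consuming firm.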
Taking the needs and sensitivities of each stakeholder in turn:
Buy-side: The fragmentation of fixed income markets is driving the buy-side’s need to access market data from as wide a set of sources as possible, and to be able to sift through that data systematically. The buy-side therefore has a built-in incentive to promote such cost reduction. However, given the disparate sources of market data, a key challenge for the buy-side is to receive the data in a standardised format based on open standards, allowing aggregation across sources and avoiding the vendor lock-in that can drive costs back up.
Sell-side: Fixed income market data is typically generated by the market-making desks of individual brokers for their client base in a particular product. The broker does not normally charge for providing this data, as it is a pre-requisite for client trading. Indeed, the broker has an incentive to supply the data in the most cost-effective manner possible to ensure the widest dissemination across its specific client base. However, given the risk of information leakage and of data being passed to market actors with whom the dealer has no relationship, a key sensitivity for the sell-side is ensuring an appropriate data governance wrapper around any distribution mechanism.
Data vendors: The cost of establishing connectivity to the sell-side (data originators) and the buy-side (data consumers) is a major challenge for vendors. Reduced connectivity costs would allow vendors to focus their investment on value-added services, such as TCA and other pre- and post-trade analytics, and to market those services to the larger client base that simplified connectivity makes possible. However, a key sensitivity for vendors is ensuring that their business models can be made consistent with an ecosystem in which connectivity has become commoditised.
The right model of industry collaboration
Crafting a collaborative model that meets the aforementioned needs and sensitivities is a challenge. Luckily, the industry already has an example that it can emulate: Neptune, an initiative by a group of market participants to distribute pre-trade axe/inventory, provides a model that addresses many of the critical requirements discussed above:
- A standardised, open-source data format to simplify the consumption of data from multiple sources
- A data governance model to address information leakage concerns
- Reduced connectivity costs and interoperability with EMS and OMS vendors through the use of FIX, allowing vendors to provide value-added services such as reference data or additional analytics (see the sketch after this list)
- Low fees through a utility model that ensures the benefits of scale and the value of the data are captured by end-users
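As an illustration of the standardised consumption described above, here is a minimal sketch of parsing a FIX-style tag=value quote message into a single common record, regardless of which dealer or distributor sent it. The tags used (35=MsgType, 49=SenderCompID, 48=SecurityID, 22=SecurityIDSource, 132=BidPx, 133=OfferPx) are standard FIX 4.4 fields, but the sample message and readable delimiter are illustrative only and do not represent the Neptune specification.

```python
FIX_DELIM = "|"  # shown as '|' for readability; real FIX uses the SOH character (\x01)

def parse_fix_quote(msg: str) -> dict:
    """Parse an illustrative FIX-style Quote (35=S) into a common quote record."""
    fields = dict(f.split("=", 1) for f in msg.strip(FIX_DELIM).split(FIX_DELIM))
    assert fields.get("35") == "S", "expected a Quote message (35=S)"
    return {
        "isin": fields["48"],            # SecurityID (22=4 indicates an ISIN)
        "bid": float(fields["132"]),     # BidPx
        "offer": float(fields["133"]),   # OfferPx
        "source": fields.get("49", "unknown"),  # SenderCompID identifies the contributing dealer
    }

sample = "8=FIX.4.4|35=S|49=DEALER1|48=XS0123456789|22=4|132=99.45|133=99.65|"
print(parse_fix_quote(sample))
```

Because the format is open and shared, the same parser works for every contributor, replacing the per-vendor mapping layer shown earlier.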
Of course, market data has specific needs of its own, such as industry governance over the creation of the composite price. Such a composite can accelerate the adoption of buy-side TCA and best-execution analytics, since these are predicated on the availability of a clearly defined and reliable industry ‘mid’. A successful collaborative model must therefore also provide an appropriately balanced governance structure for an industry composite.
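To illustrate why governance over the composite matters, the sketch below computes one possible composite ‘mid’ as the median of contributing dealers’ mids and uses it for a simple TCA slippage measure. The quote values are invented, and the aggregation rule itself (median versus weighted mean, staleness filters, minimum contributor counts) is precisely the kind of decision an industry governance structure would need to own.

```python
from statistics import median

def composite_mid(quotes: list) -> float:
    # One possible aggregation rule: the median of each contributor's mid price.
    return median((q["bid"] + q["offer"]) / 2 for q in quotes)

def slippage_bps(traded_price: float, mid: float) -> float:
    # Simple TCA metric: execution slippage versus the composite mid, in basis points.
    return (traded_price - mid) / mid * 10_000

quotes = [
    {"bid": 99.45, "offer": 99.65},
    {"bid": 99.47, "offer": 99.63},
    {"bid": 99.40, "offer": 99.70},
]
mid = composite_mid(quotes)
print(f"composite mid: {mid:.3f}, slippage on a 99.60 buy: {slippage_bps(99.60, mid):.1f} bps")
```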
And finally
The key question, given the theoretical benefits of such a collaborative model, is whether the industry is ready to launch the initiative.
The debate continues, but what is clear is the industry’s increased receptiveness to establishing open standards and industry utilities that make life easier for all participants, whilst still providing scope for each firm to compete on top of the richer market infrastructure that such standards and utilities create.
We’d love to hear your feedback on this article.