With John Cosenza, Co-head Electronic Trading, The Cowen Group
One of our bigger value-adds to customers is the fact that we are venue neutral. We do not operate a dark pool. We have no electronic market making. We have no prop trading within the organisation. As a result, it is imperative to our business model that we maintain proficiency in analysing liquidity venues and understanding which venues we should route to in various situations; how, when, and where. Data and analytics on liquidity venues serve as both a driver of and a feedback loop for this core competency.
The venue and routing component of the execution equation is extremely important given the complexity associated with fragmentation and the lack of transparency in the marketplace. With in-house expertise in optimising routing and analysing liquidity venue data, our hope, and our belief, is that this focus will translate into better execution quality for our customers.
The push for standardisation
We don’t have a dark pool, so it’s very important for us to have a consistent framework where we can look at all the places to which we route orders, normalise the data and then draw meaningful conclusions about our routing practices to these third-party liquidity venues. The firms that are venue-neutral and approach it this way are driving for more transparency and standardisation.
For the camp that provides electronic products but also manages its own dark pool, the data could be a little more daunting, in that their electronic products route to liquidity venues and their own venue is one of them.
For this type of provider, if the data shows an extremely disproportionate amount of volume going to their internal pool, a detailed explanation is in order.
In such a scenario, it may be possible to show tangible benefits to internalisation, in which case the data reflects a positive outcome. Where no tangible execution benefit of internalisation to the end customer can be demonstrated, the routing probably requires some adjustment. Either way, this increased transparency is only going to be good for the marketplace.
In order to be meaningful, firms will of course need to accumulate a significant sample size, but the initial steps that Tabb is putting in place are important. If data can be normalised through an independent source not involved in the execution process, a level playing field is created from which meaningful comparative conclusions can be drawn. As with traditional TCA, an important part of the evolution of venue and routing data begins when there are no longer questions about the source of the data, which has historically been a concern when brokers calculated and provided their own numbers.
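To make the idea of normalising venue data concrete, here is a minimal sketch in Python of one way per-venue fill records could be reduced to comparable metrics such as fill rate and price improvement versus the arrival midpoint. The venue names, field names and metrics are illustrative assumptions, not a description of Cowen's or Tabb's actual methodology.

```python
# Illustrative sketch only: fields, venues and metrics are assumptions for
# demonstration, not an actual broker or Tabb methodology.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Fill:
    venue: str          # liquidity venue that executed the child order
    side: str           # "buy" or "sell"
    routed_qty: int     # shares routed to the venue
    filled_qty: int     # shares actually executed
    exec_price: float   # average execution price
    arrival_mid: float  # midpoint at the time the child order was routed

def venue_scorecard(fills):
    """Normalise raw fill records into per-venue fill rate and
    share-weighted price improvement (basis points vs. arrival midpoint)."""
    stats = defaultdict(lambda: {"routed": 0, "filled": 0, "pi_weighted": 0.0})
    for f in fills:
        s = stats[f.venue]
        s["routed"] += f.routed_qty
        s["filled"] += f.filled_qty
        # Positive = improvement: buys filled below mid, sells above mid.
        sign = 1.0 if f.side == "buy" else -1.0
        pi_bps = sign * (f.arrival_mid - f.exec_price) / f.arrival_mid * 1e4
        s["pi_weighted"] += pi_bps * f.filled_qty
    return {
        venue: {
            "fill_rate": s["filled"] / s["routed"] if s["routed"] else 0.0,
            "avg_pi_bps": s["pi_weighted"] / s["filled"] if s["filled"] else 0.0,
        }
        for venue, s in stats.items()
    }

if __name__ == "__main__":
    sample = [
        Fill("VENUE_A", "buy", 1000, 800, 10.001, 10.002),
        Fill("VENUE_B", "buy", 1000, 400, 10.003, 10.002),
        Fill("VENUE_A", "sell", 500, 500, 10.006, 10.004),
    ]
    for venue, metrics in venue_scorecard(sample).items():
        print(venue, metrics)
```

The point of the normalisation step is simply that every venue is measured against the same reference prices and the same definitions, so the comparison does not depend on who produced the underlying numbers.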
Hopefully, down the road, if there is a consistent standard and a streamlined process, everyone in the ecosystem will benefit. While some of the larger institutions have dedicated resources and teams to do this themselves, the overwhelming majority do not. For the customers that don’t have the team or resources for big data, this is a great way to receive data in a comprehensible form and make good decisions for their firm about their counterparties.
There still has to be a tie-back into how this improves execution quality. Optimising routing to liquidity venues is an important piece of the overall puzzle. With an agreed-upon standard among customers, brokers and venues, data on routing and venue analysis can help optimise overall execution performance, which translates into minimising risk-adjusted execution costs.
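As a rough illustration of what a risk-adjusted execution cost can mean in practice, the sketch below adds a penalty for cost variability to the expected cost of an order. The weighting, the inputs and the example figures are assumptions for illustration only; the article does not prescribe a specific formula.

```python
# Hypothetical illustration of a risk-adjusted execution cost; the risk
# weighting and the example numbers are assumptions, not prescribed values.
def risk_adjusted_cost(expected_cost_bps: float,
                       cost_stddev_bps: float,
                       risk_aversion: float = 1.0) -> float:
    """Expected execution cost plus a penalty for its variability,
    both expressed in basis points of the order's value."""
    return expected_cost_bps + risk_aversion * cost_stddev_bps

# Example: comparing two routing configurations for the same parent order.
aggressive = risk_adjusted_cost(expected_cost_bps=6.0, cost_stddev_bps=2.0)
passive = risk_adjusted_cost(expected_cost_bps=4.0, cost_stddev_bps=5.0)
print(aggressive, passive)  # lower is better
```

Under this kind of framing, a routing choice that looks cheaper on average can still be worse once the uncertainty around that average is taken into account.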
Fundamentally, how do we navigate a marketplace that is complex, extremely fragmented and lacks transparency in some critical areas? Transparency is moving in the right direction but is not ideal. It is our job to use data and context to combat these gaps in transparency with the end goal of getting better execution quality for the end customer. It’s not going to happen overnight but I think we’re going down the right path.