Mark Goodman, Head of Quantitative Electronic Services, Europe, Societe Generale Corporate & Investment Banking, writes on venue and algorithm analysis in a low-volume environment.
The initial stages of the financial crisis were characterised by high volatility and high volumes. The post-Lehman environment was difficult to navigate for many reasons, but low market volume was not one of them. To some extent this continued up until early 2011, but since then we have seen a steady decline in trading activity, albeit interspersed with moments of fear-driven volume peaks such as last August.
However, since the focus of the financial crisis moved from institutional to sovereign risk, European markets have been characterised primarily by a lack of activity punctuated by moments of significant volatility. The markets have settled into a directionless drift, with no strong conviction to stimulate more activity; as a result, a low-volume environment is becoming the norm.
Whilst there are few signs that volumes will increase, there are some indications of further downward pressure. Some of the core proposals of MiFID II could have an additional negative impact on market activity. There is often heated debate regarding the positive or negative impact of high-frequency trading on more traditional market participants, but the fact is this activity is responsible for around 35% of European exchange volumes. MiFID II does not seek an outright ban on these counterparties, but efforts to regulate them better are likely to result in a reduction in their activity, and thus in market volumes, at a time when markets are already struggling. Specifically, the removal of credits/debits and the imposition of market-making obligations will reduce the profitability of some strategies. There is clearly a distinction to be made between volume and liquidity when trying to understand the impact of this legislation on market activity, but the initial outcome of lower market turnover is indisputable.
Faced with this environment, the buy-side trader has to work harder to get trades done – even trades which previously may have been relatively simple. Actively managing access to liquidity and an in-depth understanding of the micro-structure of the market have become necessities. In addition, algorithms which worked in an environment of higher liquidity need to be reassessed and, in some cases, rebuilt.
More Venues, or the Right Venues?
In terms of managing access to liquidity, one aspect is ensuring you can reach the maximum number of venues. Whilst this has been a core topic of discussion with our clients since MiFID I kick-started the process of fragmentation, the discussion has evolved from maximising the number of venues to a more cautious approach focused on using the right venues. There is an inherent contradiction in a trader restricting venue selection in a low-liquidity environment, but where the perception is that the shortfall is being made up by predatory, short-term alpha-seeking strategies, the concern is justified.
This change of approach can be seen in the widespread adoption of the FPL execution venue reporting guidelines released last May. Whilst these guidelines have been published as recommendations, we feel this level of transparency should be mandatory for brokers; clearly showing where an order was executed, including ensuring the end venue is correctly identified when orders are “onward-routed”, is not an unreasonable demand from the buy-side whose order is being worked. Buy-side traders are analysing the minutiae of venues and the associated impact on their trading, execution by execution, in order to exclude those venues where they are vulnerable to adverse selection.
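In practice this comes down to populating the venue fields on each execution report. As a simplified illustration only – the FIX tags shown, LastMkt (30) and LastLiquidityInd (851), are those commonly used for venue reporting, and the full FPL guidelines cover more than is shown here:

```python
# Simplified view of the venue fields a broker might return per fill.
fill_report = {
    "LastMkt(30)": "CHIX",        # MIC of the venue where the fill actually
                                  # occurred, even if the order was onward-routed
    "LastLiquidityInd(851)": 1,   # 1 = added liquidity (passive fill)
}
```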
However, as highlighted, excluding a venue means excluding liquidity, and this can be costly when trying to capture short-term alpha. A more nuanced approach is using the right venue at the right time, treating each order according to its specific objective. This means that an order with a longer-term alpha horizon, where there is no need to signal the market, should only include high-quality, low adverse selection venues; an urgent order, where a significant market movement is expected, should focus primarily on the probability of execution when selecting venues.
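As an illustration only, such objective-driven selection might be encoded along the following lines; the venue statistics and thresholds here are invented for the example, not measured values:

```python
# Illustrative venue profiles: average adverse-selection markout and
# historical fill probability. All numbers are invented for the example.
VENUES = {
    "A": {"markout_bps": 0.5, "fill_prob": 0.40},
    "B": {"markout_bps": 3.0, "fill_prob": 0.75},
    "C": {"markout_bps": 1.0, "fill_prob": 0.55},
}

def eligible_venues(urgent: bool, max_markout_bps: float = 1.5) -> list[str]:
    """Urgent orders rank every venue by probability of execution;
    patient orders exclude venues with high adverse selection."""
    if urgent:
        return sorted(VENUES, key=lambda v: VENUES[v]["fill_prob"], reverse=True)
    return [v for v in VENUES if VENUES[v]["markout_bps"] <= max_markout_bps]
```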
This requires a systematic method for profiling each venue, in a way that enables the algorithm to differentiate between the different sources of liquidity. SocGen has developed a proprietary methodology called “Quality of Venue Measure” (QVM), which captures both the cost of trading (adverse selection) on each venue and the signalling cost that each execution imposes on the residual of the order. An example of the signalling cost to the residual can be seen in the diagram below; this is an analysis of the behaviour of a stock following an execution on the respective venue. All orders are normalised to buys and an observation is made of the price direction following the execution.
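The kind of post-trade markout analysis just described can be sketched in a few lines. To be clear, this is not the QVM implementation itself; the data layout and column names are assumptions for the illustration:

```python
import pandas as pd

def venue_markouts(executions: pd.DataFrame) -> pd.Series:
    """Average signed post-trade price move per venue, in basis points.

    Assumed columns: 'venue', 'side' (+1 buy, -1 sell), 'exec_price',
    and 'mid_after' (the mid-price a fixed interval after the fill).
    """
    df = executions.copy()
    # Normalise all executions to buys: a rise after a buy (or a fall
    # after a sell) signals the market and costs the residual order.
    df["markout_bps"] = (
        df["side"] * (df["mid_after"] - df["exec_price"])
        / df["exec_price"] * 1e4
    )
    return df.groupby("venue")["markout_bps"].mean()
```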
This analysis is one way to show a distinct profile for each of the venues. If we try to capture a half-spread saving on venue B but continue to trade our order afterwards, the potential performance impact on the residual execution is up to 3bps. Whilst this does not mean that venue B should be excluded from all trading, it should be used as an input in the routing decision when trading a larger, over-the-day order where that impact may be meaningful, or when completing the residual of an order when any post-trade impact will not affect performance.
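To make the trade-off concrete with some hypothetical numbers: suppose capturing the half-spread on venue B saves 2bps on the executed slice, while the 3bps drift observed above is paid on whatever fraction of the order remains. A minimal sketch:

```python
def net_benefit_bps(half_spread_bps: float, impact_bps: float,
                    slice_frac: float, resid_frac: float) -> float:
    """Half-spread saved on the executed slice, net of the post-trade
    drift paid on the residual; both in bps of total order value.
    Fractions are of total order size."""
    return slice_frac * half_spread_bps - resid_frac * impact_bps

# Early in a large over-the-day order the capture is a net cost:
net_benefit_bps(2.0, 3.0, slice_frac=0.1, resid_frac=0.9)   # -2.5 bps
# On the final slice there is no residual left to damage:
net_benefit_bps(2.0, 3.0, slice_frac=0.1, resid_frac=0.0)   # +0.2 bps
```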
For the broker, this systematic analysis of the differentiation between venues should not be a one-off event. Venue operators are under the same commercial pressures as brokers due to the low volumes. As a result there is continual experimentation with pricing structures, crossing rules and other elements in order to attract new participants, and this can have a fundamental and ongoing impact on the nature of the venue itself. Whilst the Pipeline situation in the US earlier this year is an extreme example, failure to analyse each venue continually means that relative changes between them can be missed; the result is that use of a venue has an unexpected, negative impact on performance.
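One simple way to keep the analysis continuous rather than one-off is to recompute each venue's statistics on a rolling window and flag material drift. A minimal sketch, with a placeholder threshold:

```python
def flag_venue_drift(prev: dict[str, float], curr: dict[str, float],
                     threshold_bps: float = 1.0) -> list[str]:
    """Venues whose average markout moved by more than threshold_bps
    between two observation windows, e.g. after a fee or rule change."""
    return [v for v in curr
            if v in prev and abs(curr[v] - prev[v]) > threshold_bps]
```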
Equally, there are instances where changes can attract more benign liquidity, which can improve the execution of the order by absorbing some of the market impact; examples generally relate to changes in matching rules, such as prioritising size over time or implementing auction models. In this situation a lack of analysis means you could be missing good-quality liquidity, which is a scarce resource in today’s market.
Yesterday’s Market
The other significant, and related, impact of this environment is that algorithms previously used to good effect by the buy-side have seen their performance decline. Effectively, these were designed and optimised for yesterday’s market; the change in the market environment has not been reflected in the product offering of most brokers.
Whilst one reaction could be to release a new suite of algorithms, our experience has shown that a longer list of options to choose from is certainly not what clients are demanding. As the focus on market micro-structure has deepened the buy-side trader’s understanding, the same scrutiny applied to the logic of the algorithms has resulted in an increase in individual requirements. Customisation is an over-used response to this demand; it is not about the algorithm, it is about defining the objective the client is trying to achieve. This is a very in-depth, iterative approach to algorithmic development which, combined with legacy technology, can result in a costly solution; many brokers are reluctant to go down this path when market volumes mean this cost may not be covered.
However, this is not a commercial issue but an architectural one; not only were most algorithms designed for yesterday’s market, but most technology was designed for yesterday’s client. Brokers need not only to consider their product offering, but to break it down and offer it in a way that can be deployed quickly, bringing the component parts together for each individual client. This tends to be a learning process for both counterparties, so each delivery should be seen as a step towards a final objective rather than an end point. Attempting to achieve this with technology built to deliver a uniform product suite for all clients – and then to commercialise the effort when faced with historically low volumes – is challenging.
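As a loose illustration of that component-based approach – the names and interfaces below are hypothetical, not any broker’s actual stack – an algorithm can be assembled per client from interchangeable parts rather than shipped as a monolith:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ClientAlgo:
    """An execution algorithm assembled per client from reusable parts
    rather than shipped as a fixed, uniform product."""
    schedule: Callable[[float], float]   # target fraction done by time t in [0, 1]
    venue_filter: Callable[[str], bool]  # which venues this client permits
    urgency: float                       # drives passive/aggressive posture

# One client's patient, venue-restricted build:
patient_build = ClientAlgo(
    schedule=lambda t: t,                # participate evenly over the day
    venue_filter=lambda v: v != "B",     # exclude the high-markout venue
    urgency=0.2,
)
```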
There is general consensus in the market that the current situation will not change in the medium term. This means that both buy- and sell-side need to adapt their approach to trading to the new norm: for the buy-side this means a focus on how to maximise access to liquidity venues whilst minimising slippage; for the sell-side it means a more pragmatic and flexible approach to providing access to venues and algorithm solutions. Whether the sell-side can adapt to this new reality at a time when investment is scarce will dictate who the leaders in this field will be for the next few years.