Using analytics to empower decision making and improve profitability
Cat Turley, CEO, ExeQution Analytics
It is undeniable that market data is expensive. Exchanges are well aware of the value of their data and are charging ever higher non-display fees. Volumes continue to grow, requiring more storage, more RAM, and better technology to manage this data, all equating to higher costs. But there is enormous value hiding in this data, and it only appears expensive when firms are not working to extract the maximum value from it.
That value may be in alpha, left on the table because trade timings aren’t optimised; or in the 1-2 bps that can be shaved off implicit costs by optimising trading strategies. On the execution side, a better understanding of the strengths and weaknesses of proprietary models and algorithms can enable sales teams to engage with clients more meaningfully and improve rankings on broker panels. When applied to annual turnover, all of these generate serious return on investment.
So what steps can firms take to make sure that they are maximising the value available within the market data they’re already paying for?
- Ensure the data is clean, high quality, available and consistent across all trading functions
Data lakes, or data warehouses, often contain valuable data that is accumulating and gathering dust. There is significant effort spent on infrastructure and pipelines to store vast amounts of data, but the same level of effort is rarely applied to ensuring systematic and efficient access to this data. The real value-add is in transforming the raw data in these warehouses into analytics that can drive change, empower decision making and optimise trading performance.
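As a minimal sketch of that transformation, the function below aggregates raw trade prints from a warehouse into a per-symbol VWAP, the kind of reusable analytic the text describes. The record layout and field names are hypothetical illustrations, not any particular vendor's schema.

```python
# Sketch: turning raw warehouse data into a reusable analytic.
# Trade records are hypothetical (symbol, price, quantity) tuples.
from collections import defaultdict

def vwap_by_symbol(trades):
    """Aggregate raw trade prints into a volume-weighted average price per symbol."""
    notional = defaultdict(float)
    volume = defaultdict(float)
    for symbol, price, qty in trades:
        notional[symbol] += price * qty
        volume[symbol] += qty
    return {s: notional[s] / volume[s] for s in volume}

prints = [("ABC", 10.00, 100), ("ABC", 10.10, 300), ("XYZ", 55.00, 50)]
vwaps = vwap_by_symbol(prints)  # ABC blends the two prints by traded size
```

The point of wrapping this in a function rather than re-deriving it ad hoc is exactly the consistency argument above: every desk that calls it gets the same number.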
- Use AI appropriately
In the last 18 months the conversation around AI has been dominated by advances in generative AI, but AI has been hard at work in trading for the best part of a decade. This typically takes the form of supervised or unsupervised machine learning models used for price prediction to improve trading performance. These models can take many forms – decision trees, random forests, deep neural networks – but they all have one thing in common: they’re only as good as the features they’re trained on. That’s where good analytics can make a difference: ensuring that these features are accessible, accurate, consistent and efficient to compute.
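To make the feature point concrete, here is a small sketch of the kind of top-of-book features a price-prediction model might be trained on. The function name and the toy quote values are hypothetical; the features themselves (spread, mid, size imbalance) are standard microstructure inputs.

```python
# Illustrative sketch, not a production feature pipeline: a few common
# top-of-book features that a price-prediction model could consume.
def order_book_features(bid_px, bid_sz, ask_px, ask_sz):
    """Compute simple quote-derived features from the best bid and ask."""
    mid = (bid_px + ask_px) / 2.0
    spread_bps = (ask_px - bid_px) / mid * 10_000   # spread in basis points
    imbalance = (bid_sz - ask_sz) / (bid_sz + ask_sz)  # -1..1 buy/sell pressure
    return {"mid": mid, "spread_bps": spread_bps, "imbalance": imbalance}

feats = order_book_features(bid_px=100.00, bid_sz=300, ask_px=100.02, ask_sz=100)
```

If these features are computed once, centrally, and served consistently, every model downstream trains on the same definitions – which is the accuracy and consistency claim above.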
- Ensure access is efficient and appropriate to each use case.
There are six key use cases where data can be used to create more value.
- Quant Research: this wide-ranging term covers many areas, including portfolio optimisation, alpha generation, and volume or price prediction in the execution space, but the challenges are often similar. Quant research requires a lot of data, and one of the most common complaints from quants is that they spend 80% of their time on data acquisition. It doesn’t have to be this way. Providing efficient and flexible analytics via R and Python APIs allows the quant team to focus on developing proprietary models, eliminating time-consuming data acquisition and cleansing tasks and enabling them to create better products and results for the firm.
- Intra-Trade Monitoring: as the buy-side continues to invest in its own analytics capability, there is increasing pressure on the sell-side to provide high-touch service at low-touch commissions. Trading desks typically monitor thousands of algo orders at once, so it is crucial to make it as easy as possible to identify where algos are working or behaving unexpectedly; where they might have residuals; what’s out of limit; what’s POV-restricted; where the market is moving and what’s causing those movements. With the right data and the right technology, it’s relatively straightforward to surface these metrics – and more – as alerts, as real-time charts and even as bots that can be queried for answers to questions like “what is the algo thinking right now?”
- Trading Analytics: gone are the days when it was sufficient to generate a volume profile and let the VWAP algo do its thing. Now, an algo strategy needs to be initiated with trade metrics at the start of the day, and then regularly updated with predictions driven by models that react to the market in real time. Trading strategies cannot rely on purely historical metrics; they need more dynamic views of what’s happening in the market in real time, and must respond immediately to vary execution appropriately.
- Best Execution: 20 years ago, algo strategies were sold to clients as sophisticated, unknowable black boxes with complex models that could magically improve execution performance. Those days are behind us and the key now is ensuring transparency and granular communication around the contributing factors that drive the outperformance of a particular strategy. Analytics are critical to translating these factors in ways that sales teams, and ultimately clients, can quickly and easily understand in the context of each firm’s unique trading structure and goals.
- Backtesting: one of the biggest challenges our clients face is ensuring that quantitative models remain effective and accurate as markets change. In many organisations, there is a divide between the quant and trading functions. The quant team puts a lot of effort into developing models, but once those models go into production, their impact on each trading strategy is too rarely analysed explicitly and regularly. By better connecting these two functions and helping them communicate effectively, firms can ensure that models remain up to date as the dynamics of the markets move.
- Surveillance and compliance: it goes without saying that a complete understanding of market data, trading and analytics is necessary to identify anomalous or non-compliant behaviour when it occurs.
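The intra-trade monitoring described above can be sketched as a simple rule-based scan over live order snapshots. The field names, thresholds and sample orders here are hypothetical; a real desk would wire such rules into its alerting and charting layer.

```python
# Sketch: rule-based intra-trade monitoring over algo order snapshots.
# Flags unfilled residuals near the close and POV running above its cap.
def check_orders(orders, mins_to_close, residual_warn_mins=15, pov_tolerance=0.02):
    """Return (order_id, message) alerts for orders needing trader attention."""
    alerts = []
    for o in orders:
        filled_pct = o["filled_qty"] / o["order_qty"]
        if mins_to_close <= residual_warn_mins and filled_pct < 1.0:
            alerts.append((o["id"], f"residual risk: {(1 - filled_pct):.0%} unfilled"))
        if o["pov_cap"] is not None and o["pov_actual"] > o["pov_cap"] + pov_tolerance:
            alerts.append((o["id"], "POV above cap"))
    return alerts

live = [
    {"id": "A1", "order_qty": 1000, "filled_qty": 600, "pov_cap": 0.10, "pov_actual": 0.15},
    {"id": "A2", "order_qty": 500, "filled_qty": 500, "pov_cap": None, "pov_actual": 0.0},
]
alerts = check_orders(live, mins_to_close=10)  # only A1 needs attention
```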
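The shift from static to dynamic trading analytics can be illustrated with a volume schedule: instead of following the historical profile all day, blend it with the volume actually observed so far, weighting toward the live picture as the day progresses. The linear weighting rule and toy profiles below are hypothetical simplifications.

```python
# Sketch: blending a historical intraday volume profile with today's
# observed volume, so the schedule reacts to the market in real time.
def blended_profile(historical, observed, elapsed_bins):
    """historical/observed: per-bin volume fractions, each summing to 1.0.
    Weight shifts linearly from history (at the open) to today (at the close)."""
    n = len(historical)
    w = elapsed_bins / n  # 0.0 at the open, 1.0 at the close
    blended = [(1 - w) * h + w * o for h, o in zip(historical, observed)]
    total = sum(blended)
    return [b / total for b in blended]  # renormalise to a valid schedule

hist = [0.4, 0.2, 0.4]   # typical U-shaped day
live = [0.6, 0.2, 0.2]   # today is front-loaded
sched = blended_profile(hist, live, elapsed_bins=1)  # tilts toward the open
```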
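The backtesting point above – explicitly and regularly checking production models against realised outcomes – can be sketched as a recurring drift check. The metric (mean absolute error), the threshold and the sample figures are all hypothetical placeholders for whatever a firm actually tracks.

```python
# Sketch: a recurring check that a production model still tracks reality,
# flagging drift when prediction error exceeds an agreed threshold.
def model_drift(predicted, realised, threshold):
    """Return (mean absolute error, drift flag) for a batch of predictions."""
    mae = sum(abs(p - r) for p, r in zip(predicted, realised)) / len(predicted)
    return mae, mae > threshold

preds = [0.10, 0.05, -0.02, 0.07]   # model outputs over recent orders
reals = [0.30, 0.25, 0.18, 0.27]    # realised values after a regime change
mae, drifted = model_drift(preds, reals, threshold=0.10)
```

Running a check like this on a schedule, and routing the flag to both quant and trading, is one concrete way to bridge the divide the text describes.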
- Use analytics as a common language to enable improved communication between trading, tech, quant and management.
Despite the rapid increase in technical understanding across the trading world, these four functions still often fail to speak the same language, which can lead to a wide range of sub-optimal outcomes. Analytics, done well, is a powerful connector: a common firm-wide language that drives better communication and better outcomes.
Turning data into value is easier with a structured and efficient analytics framework, based on a library of APIs, that manipulates and aggregates raw data into metrics that are flexible, consistent, easy to access and appropriate summaries of market and trading behaviour. Critically, it must give all users the flexibility to access the metrics they need, without sacrificing performance. Giving everyone at the firm detailed, consistent analytics through simple API calls, flexible enough to satisfy requirements from varied perspectives, empowers far better decision making and can be a fundamental profit driver.