Technologies for analysing real-time data can unlock insights and added value for the buy-side that have so far been missing from broker offerings, according to Serdar Armutcu, Head of APAC for OneMarketData
Take TCA, for example. Traditionally, brokers have provided TCA reporting on a post-trade basis, but in an environment of dynamic algorithms, fragmentation, dark pools and SORs this is no longer sufficient to meet the challenge of analysing order performance in real time. Significant informational asymmetry remains between the buy-side and brokers on an intraday basis. Democratising access to intraday execution information is one area where brokers can differentiate themselves, and helping clients act on insights as they are revealed in real time will allow a broker to be seen as far more proactive than the competition.
TCA Gets Real
In real-time TCA, the main difference is that the data (market trading data, orders, executions, news and any other proprietary sources) arrives as a continuous stream, which makes the data set much more complex in structure and far richer in the colour and diagnostics it can offer the trader. The traditional metrics used to date provide a way to cut through the data and reveal longer-term trends in execution costs, but they give only a rear-view mirror picture; they lack the specificity to guide the buy-side trader on how to adjust the next order, or how to set algo parameters during trading when market conditions can change very quickly. Another weakness of such TCA products is that they tend to focus on costs and generally do not consider risks. It is also difficult to incorporate exogenous factors into the analysis, such as the trading environment the trader is working in or the instructions and constraints attached to the order by clients.
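To make the contrast concrete, the sketch below shows the kind of per-fill, intraday metric a real-time approach enables: a running slippage figure against the arrival price, updated as each execution arrives rather than in an end-of-day report. It is a minimal illustration only; the event fields and the arrival-price convention are assumptions, not a description of any particular product.

```python
# Minimal sketch (not a production TCA engine): update slippage versus the
# arrival price on every fill, instead of waiting for an end-of-day report.
# Field names (price, qty, side) and the arrival-price convention are assumptions.
from dataclasses import dataclass

@dataclass
class Fill:
    price: float
    qty: float

class IntradaySlippage:
    """Running implementation-shortfall-style slippage, in basis points."""
    def __init__(self, arrival_price: float, side: str):
        self.arrival = arrival_price
        self.sign = 1.0 if side == "BUY" else -1.0
        self.filled_qty = 0.0
        self.filled_notional = 0.0

    def on_fill(self, fill: Fill) -> float:
        # Update the running average execution price as each fill arrives.
        self.filled_qty += fill.qty
        self.filled_notional += fill.price * fill.qty
        avg_px = self.filled_notional / self.filled_qty
        # Positive = cost: paid up (buy) or sold down (sell) versus arrival.
        return self.sign * (avg_px - self.arrival) / self.arrival * 1e4

# Usage: the trader sees the cost drift fill by fill, not after the close.
tracker = IntradaySlippage(arrival_price=100.00, side="BUY")
for f in [Fill(100.02, 500), Fill(100.05, 300), Fill(100.10, 200)]:
    print(f"slippage so far: {tracker.on_fill(f):.1f} bps")
```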
In the real-time setting, by contrast, there is tremendous opportunity beyond these standard metrics to provide additional market colour through decision-support tools: metrics and diagnostics covering areas that were not relevant a few years ago but which now have a direct impact on execution quality. Examples include real-time liquidity analysis by venue and dark pool, toxicity levels, latency statistics for different exchanges and venues, and prediction of order completion probabilities.
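As one illustration of the liquidity-by-venue idea, a rolling per-venue view of traded volume and fill ratio might look something like the sketch below. The window length, event fields and metrics are purely illustrative assumptions.

```python
# Rough sketch of one decision-support metric: real-time liquidity by venue,
# here a sliding one-minute window of traded volume and fill ratio per venue.
import time
from collections import defaultdict, deque

class VenueLiquidity:
    def __init__(self, window_secs: float = 60.0):
        self.window = window_secs
        self.events = defaultdict(deque)   # venue -> deque of (ts, filled, sent)

    def on_execution(self, venue, filled, sent, ts=None):
        self.events[venue].append((time.time() if ts is None else ts, filled, sent))

    def snapshot(self, now=None) -> dict:
        now = time.time() if now is None else now
        out = {}
        for venue, dq in self.events.items():
            while dq and now - dq[0][0] > self.window:   # expire old events
                dq.popleft()
            filled = sum(f for _, f, _ in dq)
            sent = sum(s for _, _, s in dq)
            out[venue] = {"volume": filled,
                          "fill_ratio": filled / sent if sent else 0.0}
        return out

book = VenueLiquidity()
book.on_execution("VENUE_X", filled=3_000, sent=5_000)
book.on_execution("DARK_Y", filled=1_200, sent=1_200)
print(book.snapshot())
```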
This also makes it much more challenging to design interfaces and visualisations that do not overwhelm end users. One trend in this space has been the entry of visualisation companies seeking to fill the gap between the large data sets produced by the execution process and the end users whose decisions depend on insights from such continuous streams. While these tools can help with building quick dashboards, useful visualisation is as much an art as a science, requiring a special sensitivity to colour, shape and form that cannot be gained simply by purchasing a software system. It is a bit like handing somebody brushes and paint and expecting them to produce the Mona Lisa.
The other main challenge for brokers is justifying the investment in technology and people for such real-time analytics products at a time of shrinking commission pools. These products have tended to be more costly and labour-intensive to implement than, say, pre- or post-trade analytics because they require expensive components such as data connections and market data feeds. But these costs have been coming down, thanks in large part to the commoditisation of the fundamental technology needed to implement such systems. Companies can now purchase off-the-shelf products to help them move up the analytics curve quickly in building their solutions. Amongst the technologies that have made the most impact on analysing real-time streaming data are Complex Event Processing (CEP) platforms.
Complex Event Processing (CEP)
The CEP model of computing has its roots in a range of fields; its core concepts were developed independently in areas such as Discrete Event Simulation, Network Management, Active Databases and Middleware.
A CEP application is, at its core, about finding patterns in a stream of events. The word "complex" is meant to imply an abstraction or compound of a set of basic patterns rather than something that is complicated to calculate. While a full discussion of CEP technologies is beyond the scope of this short article, it is worth stating that although it is possible to implement a CEP-style application in a general-purpose programming language such as Java or C++, the field has evolved to the point where there are significant advantages to using specialised CEP platforms. Queries expressed in the CEP model of computing can be captured far more efficiently and clearly than in a lower-level language such as Java. CEP languages generally employ filtering, aggregation and event stream processing constructs such as joins and merges, allowing users to express temporal relationships amongst events in a highly efficient manner. Such queries would be cumbersome, if not impractical, to implement in a traditional database language such as SQL.
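To give a flavour of the "patterns in a stream of events" idea, the toy sketch below detects a hypothetical pattern (a large print followed by a spread widening within two seconds) in plain Python. A real CEP platform would express this declaratively and far more efficiently; the event fields and thresholds here are invented for illustration.

```python
# Toy illustration of pattern detection over an event stream: filter large
# trades, join them against subsequent wide quotes within a time window.
from dataclasses import dataclass

@dataclass
class Event:
    ts: float        # seconds
    kind: str        # "trade" or "quote"
    size: float = 0.0
    spread: float = 0.0

def detect_toxic_pattern(events, big_size=10_000, wide_spread=0.05, within=2.0):
    """Yield (trade, quote) pairs where a large trade precedes a wide quote."""
    pending = []                                   # recent large trades
    for ev in events:                              # events assumed time-ordered
        if ev.kind == "trade" and ev.size >= big_size:
            pending.append(ev)
        elif ev.kind == "quote" and ev.spread >= wide_spread:
            # join: match this quote to large trades seen within the window
            for tr in [t for t in pending if ev.ts - t.ts <= within]:
                yield tr, ev
        # expire trades that can no longer be matched
        pending = [t for t in pending if ev.ts - t.ts <= within]

stream = [Event(0.0, "trade", size=12_000), Event(1.2, "quote", spread=0.08),
          Event(5.0, "quote", spread=0.09)]
for trade, quote in detect_toxic_pattern(stream):
    print(f"large print at t={trade.ts} followed by wide quote at t={quote.ts}")
```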
So what are some of the important points to think about when considering CEP technology? Here is a shortlist of questions to put to your CEP vendor the next time you are in the market for a CEP platform.
What core design assumptions underlie your architecture, and who designs and builds the system?
It is important to ascertain whether there are any inbuilt limitations that might make it difficult to apply the CEP technology to your particular problem or domain. How easy is it to scale up to handle an increased number of input streams or users? Is the design flexible enough to accommodate users distributed across different geographic regions?
Does the system provide the ability to extend functionality?
Can the system meet not only your current requirements with out-of-the-box functionality but also any future requirements that might pop up unexpectedly? If you can extend functionality through configuration rather than coding, the result will be less costly, less error-prone and faster to turn around.
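As a hedged illustration of extension through configuration, the sketch below adds new alert rules as data rather than code; the rule schema and metric names are hypothetical, not taken from any particular platform.

```python
# Sketch of extension-by-configuration: alert rules are plain data, so a new
# rule is a config change rather than a code change.
ALERT_RULES = [
    {"metric": "slippage_bps", "op": "gt", "threshold": 15.0,
     "message": "slippage above 15 bps"},
    {"metric": "fill_ratio", "op": "lt", "threshold": 0.30,
     "message": "fill ratio below 30%"},
]

OPS = {"gt": lambda a, b: a > b, "lt": lambda a, b: a < b}

def evaluate(metrics: dict) -> list:
    """Return the messages of every configured rule the current metrics breach."""
    return [r["message"] for r in ALERT_RULES
            if r["metric"] in metrics
            and OPS[r["op"]](metrics[r["metric"]], r["threshold"])]

print(evaluate({"slippage_bps": 22.4, "fill_ratio": 0.45}))  # ['slippage above 15 bps']
```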
What type of support do you provide for third-party market data vendor feeds and internal proprietary data sources, and can the system process and load non-structured data?
It will save you a lot of time if connecting to market data vendors, subscribing to symbols and capturing the raw message traffic are part of the out-of-the-box functionality, so that time does not need to be spent on what most people would consider low-value work, or "plumbing". Handling non-structured data is also important, for example when backfilling gaps in your datasets or incorporating non-traditional sources into your quant analysis.
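The sketch below shows the flavour of that plumbing: normalising two hypothetical vendor message formats into one internal tick schema, so the analytics layer never sees vendor-specific field names. Both vendor formats are invented for illustration.

```python
# Sketch of feed normalisation: map vendor-specific message fields onto a
# single internal tick schema before the data reaches the analytics layer.
def normalise(vendor: str, raw: dict) -> dict:
    if vendor == "vendor_a":      # e.g. {"sym": ..., "px": ..., "sz": ...}
        return {"symbol": raw["sym"], "price": raw["px"], "size": raw["sz"]}
    if vendor == "vendor_b":      # e.g. {"ticker": ..., "last": ..., "qty": ...}
        return {"symbol": raw["ticker"], "price": raw["last"], "size": raw["qty"]}
    raise ValueError(f"unknown vendor: {vendor}")

print(normalise("vendor_a", {"sym": "0005.HK", "px": 62.4, "sz": 4000}))
```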
What type of storage mechanisms do you provide to store the results of the CEP calculations and market data? Is there an in-memory database to capture the streaming data and store results of calculations? How much data can be stored?
Most CEP platforms have some type of temporary storage area that their event processing engines can write to; however, this area is usually insufficient to store both the results of queries and the streaming market data itself. It is important, then, to understand any storage limitations of the system for both intraday data and historical data.
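One way to picture the question is the split sketched below: a capped in-memory buffer for intraday queries that spills the oldest ticks to disk for history. The capacity, file format and spill policy are assumptions for illustration, not a description of any particular platform.

```python
# Sketch of a two-tier tick store: hot data stays in memory for intraday
# queries; the oldest rows spill to a CSV file once a capacity cap is hit.
import csv
from collections import deque

class TickStore:
    def __init__(self, path: str, max_in_memory: int = 1_000_000):
        self.hot = deque()                 # most recent ticks, queried intraday
        self.max = max_in_memory
        self.path = path

    def append(self, tick: dict):
        self.hot.append(tick)
        if len(self.hot) > self.max:       # spill the oldest tick to disk
            old = self.hot.popleft()
            with open(self.path, "a", newline="") as f:
                csv.DictWriter(f, fieldnames=list(old.keys())).writerow(old)

    def recent(self, n: int):
        return list(self.hot)[-n:]

store = TickStore("ticks_history.csv", max_in_memory=2)
for px in (100.0, 100.1, 100.2):
    store.append({"symbol": "0005.HK", "price": px})
print(store.recent(2))
```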
What type of logging, troubleshooting and diagnostic tools are available to monitor, measure and ensure the integrity of the data and of the system?
It is important to have a complete audit trail of individual user queries, to monitor system performance statistics such as throughput and latency, and to detect outages, periods when the system is coming under strain, user entitlement violations and so on. Logging should be easily customisable to suit the various needs of different user groups, and monitoring and troubleshooting tools should come as part of the system rather than depend on proprietary third-party technologies.
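As a simple illustration of per-query auditing and latency measurement, the sketch below wraps each query with a timer and an audit log entry recording who ran what, whether it succeeded and how long it took; the function and field names are invented.

```python
# Sketch of a per-query audit wrapper: every query run is logged with user,
# query name, status and elapsed time for later monitoring and troubleshooting.
import logging, time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
audit = logging.getLogger("audit")

def run_audited(user: str, query_name: str, fn, *args, **kwargs):
    """Run fn, recording who ran what and how long it took."""
    start = time.perf_counter()
    try:
        result = fn(*args, **kwargs)
        status = "ok"
        return result
    except Exception:
        status = "error"
        raise
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000
        audit.info("user=%s query=%s status=%s latency_ms=%.2f",
                   user, query_name, status, elapsed_ms)

run_audited("trader1", "vwap_snapshot", lambda: sum(range(1_000_000)))
```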
Trading in Transition
Technology is reshaping the trading landscape as algorithms and low-latency analytics continue to dominate. Yet despite this proliferation of tools and execution services, there are still many buy-side traders in Asia who are not comfortable with the process of selecting, managing and evaluating broker algorithms. This discomfort can be attributed to factors such as a lack of training, limited technological support and the opaqueness of broker tools. We believe the new services now appearing to address these issues, such as real-time TCA, will help supply what has been missing from the trading landscape: simple, intuitive interfaces that enable automation of complicated strategies and give users real-time, holistic feedback on changing market conditions, market impact and order executions. Uncovering the performance of trading behaviour through customised, personalised real-time transaction cost analysis is a critical component of any investor's profitability.