It’s 2013 already and we’re still talking about regulating high frequency trading (HFT) in the absence of data quality and standards conversations. That needs to change.
2012 seemed like the year in which regulators took a prolonged look at computer trading – defining what it might be, assessing its potential effects, and asking why it may be problematic. Looking at where we are now, it is still far from clear that we have answers to these fundamental questions.
Regardless, HFT has become a national obsession, resulting in a complex and divergent landscape for those trying to move the invisible hand.
Those aiming to come up with a singular view will find it difficult. The Germans are pushing ahead with licensing requirements and minimum order resting times ahead of an already delayed MiFID II, which will apparently include circuit breakers, organised trading facilities and minimum tick sizes. The CFTC is still musing over a “concept release” on HFT regulation with only a fledgling definition, and FINRA has announced that in 2013 it will use “examinations and targeted investigations” to ensure firms have adequate “testing and controls” related to HFT and algos. How these controls will work without a robust set of trading workflows and identification requirements is a question that should be keeping ESMA awake at night.
The UK’s long-awaited set of answers didn’t appear last year. Highlighting the controversy that follows HFT wherever it goes, the Foresight Commission’s weighty computer trading study offered fewer practical suggestions than expected. Perhaps worse, some quarters have cast aspersions on its conflicts of interest, its highly academic nature, and its methodology and data set.
This confused regulatory space is ignoring one indisputable fact: HFT is part of the financial system and it is not going to go away. Politicians, in their drive to make markets safer without concentrating on real underlying issues, are in danger of introducing poorly conceived controls that don’t get the job – whatever that is meant to be – done. The bottom line is that trading is a technical arms race and regulators will therefore always be one step behind.
In light of this, regulators and the industry must concentrate not only on defining what exactly HFT is, but also on refining data collection, aggregation and analysis to balance political demands for market safety without stifling capital allocation. This will require examining the market infrastructures that facilitate computer trading and the trade information they produce. With many other regulatory initiatives also struggling with issues of data collection, aggregation and analysis, regulators should ensure that HFT features in the larger conversations on standards and data quality.
Shining a light on the issue, the FSA has provided the industry with valuable insight into the depth of this problem. Its January paper, entitled ‘High Frequency Trading and the Execution Costs of Institutional Investors’, takes a more practical approach to examining HFT, analysing 30 days of trading data. The diagnosis at the centre of the report is that regulators do not have the data required to regulate, nor the standards to make this data truly usable.
The report begins on a well-known theme: the lack of a common definition of HFT makes it difficult to be sure that the scope of the trading activity is adequately captured. This trading data was then juxtaposed with HFT data from exchanges held by the FSA. Huge discrepancies over the course of a year are apparent, with the FSA data detailing 70-80% of HFT at the beginning of 2010 and only 40% by end-2010. The reason? Unregulated HFTs are “not observed”: firms whose HFT operations are not regulated under MiFID do not need to report. The report concludes that this is “not a fair representation of true HFT activity”.
The report does, however, identify another key roadblock to an accurate representation of HFT activity: poor data quality. Examples include trade time inconsistencies between FSA and exchange clocks; misreported counterparty codes, instrument data and venue data; and the fact that only trades reported with BIC codes could be identified at all.
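To make the reconciliation gap concrete, the sketch below shows the kind of basic cross-check between regulator-held and exchange-held trade records that such data should, but currently cannot reliably, support. It is illustrative only: the field names (trade_id, exec_time, bic, venue) and the one-second clock tolerance are assumptions made for the example, not taken from any real reporting schema.

```python
from datetime import timedelta

# Assumed tolerance for clock skew between regulator and exchange timestamps.
MAX_CLOCK_SKEW = timedelta(seconds=1)

def reconcile(regulator_trades, exchange_trades):
    """Pair regulator-reported trades with exchange records and flag the
    data quality issues the FSA paper describes. Both inputs are lists of
    dicts with hypothetical keys: trade_id, exec_time (datetime), bic, venue.
    """
    exchange_by_id = {t["trade_id"]: t for t in exchange_trades}
    issues = []
    for rt in regulator_trades:
        et = exchange_by_id.get(rt["trade_id"])
        if et is None:
            # Mirrors the "not observed" problem: activity missing entirely.
            issues.append((rt["trade_id"], "not observed on exchange feed"))
            continue
        if abs(rt["exec_time"] - et["exec_time"]) > MAX_CLOCK_SKEW:
            issues.append((rt["trade_id"], "trade time outside clock tolerance"))
        if rt["bic"] != et["bic"]:
            issues.append((rt["trade_id"], "counterparty code mismatch"))
        if rt["venue"] != et["venue"]:
            issues.append((rt["trade_id"], "venue mismatch"))
    return issues
```

Even a check this simple presupposes a shared trade identifier and synchronised clocks; without standards for either, as the FSA found, the join fails before any analysis can begin.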
Without a robust understanding of HFT, regulators and politicians will be unable to regulate effectively in both their own interests and the interests of the industry. They must begin by collecting and aggregating a robust data set, based on common units of measurement, such as tick sizes and minimum resting times. While there is potential for this to be mandated in MiFID II, and it is supported by the Foresight Commission, the lack of global agreement on these requirements will result in traders taking advantage of regulatory inconsistencies. Therefore, a more practice-based, unified definition of HFT must be crafted, one that will finally allow regulators to focus on the practices they wish to control.
As the FSA’s research supports, HFT must also be included in discussions over identifiers. Today’s mix of identifiers is creating a convoluted landscape, and market conduct specialists would benefit greatly from uniform standards such as the legal entity identifier (LEI).
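One advantage of a uniform identifier such as the LEI is that its well-formedness can be verified mechanically: ISO 17442 gives the LEI 20 alphanumeric characters, the last two being check digits computed under ISO 7064 MOD 97-10. The sketch below is a minimal, illustrative validator, not an official tool.

```python
def is_valid_lei(lei: str) -> bool:
    """Check an LEI's length, character set and ISO 7064 MOD 97-10
    check digits (per ISO 17442)."""
    lei = lei.upper()
    if len(lei) != 20 or not all(c.isdigit() or "A" <= c <= "Z" for c in lei):
        return False
    # Map letters to two-digit numbers (A=10 ... Z=35); digits stay as-is.
    numeric = "".join(str(int(c, 36)) for c in lei)
    # The full string, check digits included, must be congruent to 1 mod 97.
    return int(numeric) % 97 == 1
```

A market conduct team receiving reports keyed to LEIs could reject malformed identifiers at the point of submission, rather than discovering mismatched counterparty codes months later, as happened with the BIC-coded data above.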
The ultimate key to success for these efforts will be getting the right people around the table, including exchanges, traders, regulators and academics, to strike the right balance between theory and practice. Through this cross-industry collaboration, the confusion over what HFT is, and the problems in obtaining quality, standardised data, can finally be addressed. To get this right, all major markets need to be sitting at this table, which is still waiting to be set for the discussion.
This article is provided for information purposes only. Nothing herein should be construed as legal or other professional advice or be relied upon as such. Text and artwork remain the copyright of JWG.