Franklin Templeton Investments’ Director of Global Trading Strategy Bill Stephenson looks at the four main challenges facing the buy-side today.
What are some of the challenges your global traders are facing?
Well, there are four major themes and problems we have been trying to solve; most of them involve a technology solution.
The first issue is liquidity, which is hard to measure. Contrary to some beliefs, it isn't always correlated with trade volume, bid/ask spreads, depth of book, or the speed of execution. While all of these things can make the market appear more liquid or easier to trade, we haven't necessarily found that to be the case.
The second challenge is defining and reducing our market impact and other implicit costs. In a more complex trading environment, we need to be smarter about reducing information leakage and leveraging real contra-side liquidity to minimise our footprint.

The third theme concerns transparency, which boils down to understanding how and where our orders are routed. That includes knowing where they may rest, and the methodology behind how they interact with various liquidity destinations.
Finally, the fourth area of focus is data management: how we take the intelligence we gather about our stocks and execution performance and combine it with our fundamental internal information and analysis. This is probably the most proprietary effort we are undertaking right now, and the one that could yield the greatest benefit to our process.
Can you expand on these a little and describe how you are helping your firm solve these challenges?
Well, as I said, technology plays a part in this; quite honestly, in most cases it is the key to our success. A considerable amount of my time, and that of others within our group, is spent working with our traders and technology teams to figure out the best way of addressing these issues.

Take our first challenge, liquidity. One of our senior traders, Ben Batory, led an effort to author an internal white paper that specifically addressed some of the challenges we are facing. It was essentially a thought piece to help us synthesise our views on why trading stocks is more difficult now than it was in previous periods, and something we could share with our portfolio managers to help them understand why their orders may not be executing the way they have come to expect. While the lack of volume was part of our theory, the quality of that volume was potentially even more important. Of course, there are many reasons why both are occurring right now. One is the macro environment and lack of conviction in the marketplace. Another may be the market structure rules that have evolved over time, leading to unparalleled execution speed.
This increased speed has helped the markets in some ways and made them more challenging in others, or as some would describe it, more nefarious. For instance, speed allows algorithmic traders to cancel orders very quickly, so apparent liquidity can prove fleeting. We know we can't simply create liquidity, or even fully optimise the type of liquidity we want to interact with, but we do believe it is a challenge we should continually strive to overcome.

While we all want to transact with a natural counterparty at a price with minimal market impact, we completely understand the necessity of liquidity provisioning and the resultant cost to access it. We do, however, believe there are times of unnecessary provisioning that disintermediates a buyer and seller, and that comes at an additional cost. We need to better understand the types of liquidity pools, such as exchanges, dark pools, and crossing networks, and target those destinations we deem to contain less toxicity, i.e. less intermediation and information leakage. We can customise our experience in these venues by only interacting with certain tiers of liquidity providers, using minimum fill sizes, disabling outbound messaging, and so on.

We have always said that complexity isn't necessarily a bad thing in the marketplace, because it does create opportunities. However, overly arcane rules, the continuous introduction of new order types, and ever-changing pricing schemes can actually cause participants to cease trading, which in turn affects liquidity and trading costs. For example, in the last two years there have been well over a hundred exchange pricing changes, and hundreds of different exchange order types with perhaps thousands of permutations; and this is just in the US.
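To make that concrete, the kind of per-venue constraints described here could be represented along the following lines. This is a hypothetical sketch: the venue names, tier labels, and field names are illustrative, not an actual broker or EMS configuration schema.

```python
# Hypothetical sketch of per-venue interaction constraints; venues, tier
# labels, and fields are illustrative, not a real broker/EMS schema.
from dataclasses import dataclass, field

@dataclass
class VenueProfile:
    venue: str                                        # liquidity destination
    min_fill_size: int = 0                            # ignore fills below this size
    allowed_tiers: set = field(default_factory=set)   # counterparty tiers we accept
    outbound_messaging: bool = True                   # may the pool signal/route outward?

profiles = [
    # Interact with a dark pool only against vetted counterparty tiers,
    # in size, and with outbound messaging disabled to limit leakage.
    VenueProfile("DARK_POOL_A", min_fill_size=500,
                 allowed_tiers={"natural", "agency"}, outbound_messaging=False),
    # A lit exchange with only a round-lot minimum.
    VenueProfile("EXCHANGE_B", min_fill_size=100),
]

for p in profiles:
    print(p)
```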
Understanding the types of liquidity pools leads to my second point, relating to transparency. We have seen enough to know that you can trust, but you must verify. Just asking how something works isn't always going to be best practice. We have required that our dark pools and ATSs open the kimono on how their systems work: at what prices do they cross, do they allow pinging, and what data feeds are they using?

Led by the co-head of Americas Trading, Dave Lewis, we have moved on to the algorithmic providers as well, by insisting on two pieces of information. First, we require real-time information on specific FIX tags surrounding the executing destination, including whether they added or removed liquidity, the associated rebates and fees, and the quote at the time of trade. Secondly, we ask for an end-of-day file with similar information but more detail on all the order routes, fill and non-fill destinations, millisecond-precision timestamps, and other related data. We completely understand that this can be highly proprietary information, but we also know it is necessary for us to understand how our orders are being represented, and it is our obligation to understand.
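For illustration, the standard FIX fields involved here can be pulled off an execution report in a few lines. In the sketch below, tags 30 (LastMkt), 31 (LastPx), 32 (LastQty), 60 (TransactTime), and 851 (LastLiquidityInd) are standard FIX tags; the rebate/fee and quote-at-trade fields typically arrive in broker-specific custom tags, so 9001 and 9002 are placeholders, not real tags.

```python
# Minimal sketch of extracting execution-destination detail from a FIX
# execution report (35=8). Tags 30/31/32/60/851 are standard; 9001 and
# 9002 are placeholder custom tags for fee/rebate and quote-at-trade.
SOH = "\x01"
LIQUIDITY = {"1": "added", "2": "removed", "3": "routed out"}

def parse_fix(raw: str) -> dict:
    """Split a raw tag=value FIX message into a dict keyed by tag."""
    return dict(f.split("=", 1) for f in raw.strip(SOH).split(SOH))

def fill_detail(msg: dict) -> dict:
    """Pull out the per-fill fields discussed above."""
    return {
        "venue": msg.get("30"),                              # LastMkt
        "px": float(msg["31"]),                              # LastPx
        "qty": int(msg["32"]),                               # LastQty
        "time": msg.get("60"),                               # TransactTime
        "liquidity": LIQUIDITY.get(msg.get("851"), "unknown"),  # LastLiquidityInd
        "fee_or_rebate": msg.get("9001"),                    # placeholder custom tag
        "quote_at_trade": msg.get("9002"),                   # placeholder custom tag
    }

raw = SOH.join(["8=FIX.4.4", "35=8", "30=XNAS", "31=25.47", "32=300",
                "60=20130314-14:30:05.123", "851=1",
                "9001=-0.0021", "9002=25.46x25.48"]) + SOH
print(fill_detail(parse_fix(raw)))
```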
Additionally, this allows us to conduct further analysis of other measures, such as the velocity of trading and quote changes before and after our trade, price reversion, and the potential impact of odd-lot fills. We have millions of fills per month, so building a platform to concatenate, analyse, and interpret that data is complex, but essential. We are building up our technology and people resources in this area. It is important for us to understand this data, since it will only continue to get more complicated.
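As a sketch of one such measure, a per-fill price-reversion ("markout") check compares each fill price with the quote midpoint some interval later; a positive markout after a buy means the mid fell back through the fill price, a sign of fleeting liquidity and transient impact. The data and column names below are illustrative, and in practice this would run over a range of horizons across millions of fills.

```python
import pandas as pd

# Illustrative fills and quote midpoints; columns are assumptions, not a
# real feed schema. side: +1 buy, -1 sell.
fills = pd.DataFrame({
    "time":   pd.to_datetime(["2013-03-14 14:30:05.123", "2013-03-14 14:30:07.456"]),
    "symbol": ["XYZ", "XYZ"],
    "side":   [1, -1],
    "px":     [25.47, 25.45],
})
quotes = pd.DataFrame({
    "time":   pd.to_datetime(["2013-03-14 14:30:06.000", "2013-03-14 14:30:08.000",
                              "2013-03-14 14:30:09.000", "2013-03-14 14:30:10.000"]),
    "symbol": ["XYZ"] * 4,
    "mid":    [25.455, 25.440, 25.445, 25.430],
})

HORIZON = pd.Timedelta("2s")  # markout horizon; vary this in practice

# Shift each fill forward by the horizon, then grab the last quote at or
# before that shifted time (an as-of join per symbol).
marked = fills.assign(time=fills["time"] + HORIZON).sort_values("time")
marked = pd.merge_asof(marked, quotes.sort_values("time"),
                       on="time", by="symbol", direction="backward")

# Positive bps => the mid moved back through our fill price (reversion).
marked["reversion_bps"] = marked["side"] * (marked["px"] - marked["mid"]) / marked["px"] * 1e4
print(marked[["symbol", "side", "px", "mid", "reversion_bps"]])
```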
As I just mentioned, the number of records being generated is huge, which leads me to the challenge of data management. We now have a variety of data, beyond just execution data, that we believe is important to managing our business better. Managing it has been a herculean challenge, but also one that presents immense opportunities. If we can better collect, manage, visualise, and interpret our data, both proprietary and externally generated, we can create a value-added overlay in our investment process.

For example, we are building a proprietary system conceptualised by Mat Gulley, the global head of trading, called Investment Dashboard. Without getting into specifics, the Investment Dashboard takes data from different departments in our firm and organises it in a way that becomes insightful to users. This capability surfaces investment opportunities that were previously harder, if not impossible, to identify, and does it in a very timely manner. We think this will be a game changer for our organisation, because it really gets to the core of our white paper's message: liquidity is challenging and the market is more complex than ever. But with greater communication between traders and portfolio managers, supported by better tool sets, long-term investing can still be short-term opportunistic; in fact, it needs to be in order to outperform in today's environment. We view this symbiosis between trading and portfolio management, enabled by technology, as core to our investment implementation process.
Finally, there is the issue of market impact. At the end of the day, we hope all of this ties together: through transparency, analysis, collaboration, and smarter liquidity-seeking strategies, we aim to reduce our impact on the market. We are not naïve; we realise that we will have market impact, given the potential size of our orders. We know we can't always beat the higher-frequency traders to the punch, but our low-frequency process, trader discretion, and investment horizon allow us to think long-term while also acting opportunistically in the market. Since our traders have discretion over their orders, they are less predictable to other market participants, and we believe this creates expected value for our clients. Quite frankly, we don't need to get from one exchange server in New Jersey to another in 161 microseconds to compete effectively. But we do need to know how and why that is happening, so we can think more strategically about our choices.
A few years ago, you spoke with us about TCA. Has anything changed in how you measure market impact, or the value your traders add over your investment or trading horizon?
Sure, we have done a few things to tie the data together better. For instance, we believed that real-time analysis should flow seamlessly into our next-day post-trade analysis. With that in mind, we re-engineered and outsourced our existing point-to-point FIX infrastructure to our TCA provider. We felt this would create more continuity for the traders, simplify their monitoring processes, and ultimately allow for finer strategy adjustments. We are in the midst of this migration, which involves helping the vendor conceptualise, design, and prioritise specific enhancements to the real-time platform. The innovation will give us more insight into our activity, not only for the trader but also from a management and risk-oversight standpoint. We have also revamped our post-trade analysis to a more granular level and enhanced the data set we deliver to the vendor.

Again, we think TCA is not just about measuring the past, but about using it to manage risk, plan strategy, and ultimately help reduce costs or even add alpha going forward. This analysis is definitely part of our culture. We believe it helps us increase the predictability of performance, and the experience for the portfolio manager over time. As we like to say, best execution is a predictable execution, and that comes from a foundationally sound best execution process.
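For reference, the arithmetic underneath most post-trade TCA of this kind is the classic implementation-shortfall measure, which combines execution cost on filled shares with opportunity cost on the unfilled remainder. The function and inputs below are an illustrative sketch, not the vendor's or the firm's actual methodology.

```python
# Sketch of a Perold-style implementation-shortfall calculation in basis
# points; names and example numbers are illustrative.
def implementation_shortfall_bps(side, decision_px, fills, ordered_qty, final_px):
    """side: +1 buy / -1 sell; fills: list of (price, qty) executions;
    final_px: price used to cost the unfilled remainder (e.g. the close)."""
    filled_qty = sum(q for _, q in fills)
    # Cost of the shares we did trade, versus the decision price.
    exec_cost = sum(side * (px - decision_px) * q for px, q in fills)
    # Opportunity cost of the shares we never traded.
    oppty_cost = side * (final_px - decision_px) * (ordered_qty - filled_qty)
    notional = decision_px * ordered_qty
    return 1e4 * (exec_cost + oppty_cost) / notional

# Example: buy 10,000 at a 25.40 decision price; 8,000 filled, 2,000 missed,
# stock closes at 25.60 -> roughly 41 bps of shortfall.
print(implementation_shortfall_bps(+1, 25.40, [(25.47, 5000), (25.50, 3000)],
                                   10_000, 25.60))
```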
It sounds like you have been working on a lot of different initiatives. Do you find that all this work is beginning to pay off for your investors?
We think it is. Of course, we'd like to be able to measure these improvements precisely, but not everything is easily quantifiable. We feel strongly about how we measure our execution performance, which includes both our market impact and our opportunity costs, because we designed our system and measurement methodology to match the investment process and to align the goals of the trader and the portfolio over the expected duration of trade implementation. However, while we spend a lot of time trying to optimise performance, there will always be impediments we can't control but must learn to adapt to or navigate through: market volatility, the types of stocks we are trading, price constraints, or simply the intra-day timing of orders from the portfolio managers.

When it comes to the quantitative intelligence we are gaining from the FIX data we now require, or the qualitative measures we use to gauge potential toxicity, the impact is by no means easy to quantify. That said, we believe we have created a roadmap for navigating the global markets that takes in everything we have learned over many years of engagement. We do know that this intelligence creeps into performance for our clients, potentially fractions of a penny at a time, but we'll take that in volumes. Getting back to the internal white paper, we can't necessarily determine the impact of more collaboration, the increased speed of opportunity realisation, or even better expectation management; but we know that a tighter connection between portfolio managers, traders, and analysts is the main driver in allowing a process to be opportunistic, rather than always a step behind in the alpha generation process.