The film The Theory of Everything, which recently won the Best Actor Oscar, shows its central character, Stephen Hawking, laying out his vision of a single equation that explains all physical aspects of the universe.
The scientist explains in lay terms the two broad areas of theoretical physics that have emerged over the last century – general relativity (as famously developed by Einstein) and quantum field theory (analysing the properties and effects of sub-atomic particles) – and the challenge of integrating both approaches into one over-arching set of theories. One approach addresses the broadest aspects of the universe, space and time, while the other focuses on infinitesimally small objects as the basis for broader theories and interpretation.
In a way this rarefied scientific debate has echoes in the more prosaic world of Transaction Cost Analysis (TCA) in financial markets, where the availability of more granular data coupled with pressure from regulators has combined to drive a whole new wave of research and analysis.
Typically the analysis of trading costs has focused on the big picture, identifying the implicit costs incurred in the investment process. But now a much more granular level of analysis is both possible and required. There is a risk that some will assume these latest tools can answer every question on trading costs and best execution. This is clearly not the case, and a combination of methods of analysis is vital.
Traditionally TCA was conducted at a relatively high level, focusing on the outcome of orders and looking at the implicit costs incurred by price movements caused by market impact or by delays in the execution process (as distinct from explicit costs such as commissions). This “implementation shortfall” can be calculated and analysed to determine where and when inefficiencies occur in the investment process. Fine tuning can lead to significantly improved investment performance within the context of an underlying process.
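The implementation shortfall described above can be sketched in a few lines of code. The function below is a minimal illustration with hypothetical names and numbers, not any firm's actual methodology: it compares the volume-weighted execution price of an order's fills against the price at the moment of the investment decision, adds explicit costs, and expresses the total in basis points of the decision-price notional.

```python
# Minimal implementation shortfall sketch (hypothetical example data).

def implementation_shortfall(decision_price, fills, side, explicit_costs=0.0):
    """Shortfall in basis points versus the decision-time price.

    decision_price: price when the investment decision was made
    fills:          list of (price, quantity) execution records
    side:           +1 for a buy, -1 for a sell
    explicit_costs: commissions and fees, in currency terms
    """
    total_qty = sum(q for _, q in fills)
    avg_price = sum(p * q for p, q in fills) / total_qty
    # Implicit cost: adverse price movement relative to the decision price,
    # signed so that a buy filled above the decision price is a cost.
    implicit = side * (avg_price - decision_price) * total_qty
    total_cost = implicit + explicit_costs
    notional = decision_price * total_qty
    return 10_000 * total_cost / notional  # basis points

# A buy order decided at 100.00, executed in three slices at rising prices:
bps = implementation_shortfall(
    decision_price=100.00,
    fills=[(100.05, 4_000), (100.10, 3_000), (100.20, 3_000)],
    side=+1,
    explicit_costs=500.0,  # commissions, same currency as prices * quantities
)
```

Splitting the result into its implicit and explicit components, as real TCA systems do, shows where in the process the slippage arose.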
Most leading institutions continually monitor their TCA data for trends, and aim to identify opportunities to make improvements. If left unaddressed, such hidden costs of trading can and do have a major impact on investment returns and rankings in the performance tables.
‘Investment DNA’ should be reflected in TCA methodology
Every institution has an investment process, which forms a sort of investment DNA for everything it does. It is reflected in activities such as portfolio construction, stock selection, decision timing and trading strategies. Some firms are value-oriented and incur relatively low transaction costs, as they are typically trading against the consensus. Others are more event-driven and momentum-oriented; inherently they need to trade more quickly than others, incurring higher impact costs in order to capture as much alpha as possible before others do so. Similarly some portfolios are made up of many small positions which can be easily and cheaply traded, while others consist of fewer positions which may be highly illiquid, and cannot be readily and quickly traded without severe loss of value.
All of this should be reflected in the approach to TCA which a firm employs, and in the metrics used to monitor efficiency in achieving optimal outcomes. There is no one-size-fits-all in this respect. There have been calls in some quarters for a standardised approach to TCA. Such thinking should be firmly resisted, given the wide range of needs and types of analysis. The high-level analysis must take into account many aspects of the underlying process, since the costs will be closely linked to factors beyond the control of the trader.
Greater market complexity requires forensic analysis
But then a whole new level of complexity was introduced to European financial markets, reflecting a number of developments over the last decade, starting with the fragmentation of trading that resulted from the first Mifid set of regulations in 2007.
This led to several new trading venues emerging in Europe, reducing the market share of the traditional exchanges and making the trading landscape considerably more complex. At the same time, new generations of trading systems allowed asset managers to record and analyse details of every single fill generated by their orders. With algorithms often slicing a large order into very small pieces, this can mean thousands of separate executions for just one order. The final element of complexity – albeit a welcome response to the need for better information – has been the increased use of data tags to track and report information on these individual fills.
Together these factors have driven a rapid evolution in analytical approaches, which has recently taken on added urgency with the publication of the UK Financial Conduct Authority’s Thematic Review on Best Execution and the final draft of the proposed Mifid II regulations. These stipulate that investors must not only monitor the venues on which their trades are executed, but also describe the steps they take in choosing those venues and the execution strategies used to achieve best execution. While more traditional approaches to TCA tended to look at the context of the investment process and at high-level trading data, the new requirements entail much more precise, forensic analysis of the tactics used at the most granular levels in terms of the sizes, timing and venues of trades.
Analysis of venues must be linked to execution strategies
The linking of venues to execution strategies is more than coincidental, and indeed is crucial. The way in which an algo is designed to route an order is inextricably linked to the execution strategy selected. This may for instance be a fixed participation strategy, or liquidity-seeking, or aimed at trading only in the so-called dark pools or crossing networks. Each strategy will tend to execute in different venues, or in different sequences, or in different volumes at different times. Hence it is essential for the latest applications of TCA to link the analysis of venues to that of execution strategies as it drills down into these details.
With this new granularity of data, new metrics also come into play. Looking at simple average price or implementation shortfall calculations is not necessarily as relevant in determining the efficacy of one venue versus another. Shorter term statistics on reversion or spread capture may be more revealing. Similarly the number and sequence of venues used can be analysed (basically the more venues used, the higher the overall cost), as can the costs or benefits of trading in lit or dark venues (with dark in general achieving better outcomes, particularly in larger sized trades). Traders now regularly use such data to monitor the ways in which their brokers execute their orders, for instance in the differing patterns of behaviour of smart order routers or algo strategies. And using this data it is also possible to predict what is likely to be the most efficient way to execute a given type of order. As with more traditional approaches to TCA, the post-trade data can become a vital input to pre-trade decision making.
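The shorter-term statistics mentioned above can be computed directly from tagged fill-level data. The sketch below is illustrative only, with hypothetical field names (real venue tags and mid-price snapshots vary by broker and order management system): it aggregates per-venue spread capture (how far inside the quoted mid each fill executed) and short-horizon reversion (how far the mid moved back after the fill, a sign of paying for transient impact), both quantity-weighted in basis points.

```python
# Per-venue spread capture and reversion from fill-level data
# (hypothetical field names; a sketch, not a production TCA engine).
from collections import defaultdict

def venue_stats(fills):
    """Quantity-weighted spread capture and reversion, in bps, by venue.

    Each fill is a dict with keys:
      venue, side (+1 buy / -1 sell), qty,
      price, mid_at_fill, mid_after (mid a short interval after the fill)
    """
    acc = defaultdict(lambda: {"qty": 0, "spread": 0.0, "reversion": 0.0})
    for f in fills:
        a = acc[f["venue"]]
        q = f["qty"]
        # Spread capture: mid minus execution price, signed by side,
        # so a buy below the mid scores positively.
        spread = f["side"] * (f["mid_at_fill"] - f["price"]) / f["mid_at_fill"]
        # Reversion: mid moving back against the trade direction after
        # the fill suggests the fill paid for short-lived impact.
        rev = f["side"] * (f["mid_at_fill"] - f["mid_after"]) / f["mid_at_fill"]
        a["spread"] += 10_000 * spread * q
        a["reversion"] += 10_000 * rev * q
        a["qty"] += q
    return {v: {"spread_bps": a["spread"] / a["qty"],
                "reversion_bps": a["reversion"] / a["qty"]}
            for v, a in acc.items()}

# A single buy fill in a hypothetical dark venue, 1 bp inside the mid,
# with the mid falling back 2 bps shortly afterwards:
stats = venue_stats([{
    "venue": "DARK1", "side": +1, "qty": 1_000,
    "price": 99.99, "mid_at_fill": 100.00, "mid_after": 99.98,
}])
```

Comparing these per-venue figures across brokers and algo strategies is exactly the kind of granular, forensic monitoring the new requirements call for.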
Michael Sparkes is director of Analytical Products and Research at ITG.