The results of a recent panel discussion organised by Interactive Data to examine the latest developments in algorithmic trading…
The ongoing growth in the adoption of algorithmic trading models is boosting market data rates, particularly in equities markets, and forcing firms to invest in their market data infrastructures, although the upsurge in volumes may ultimately turn out to be cyclical with the market returning to more manageable data rates in the future.
These and other conclusions surrounding the impact of algorithmic trading on the market data community were heard at a panel discussion organised by Interactive Data's ComStock business last November: 'The Growth of Algorithmic Trading - Impact on Market Data'.
Introducing the session, Mark Hepsworth, president of ComStock, said the company had seen a great deal of change in the past few years, including exponential increases in market data volumes, and that ComStock had made extensive changes to its infrastructure to handle them. He said that ComStock had seen increased demand for low latency access to data, and had worked with clients on different pricing models that enable them to do more with the data they receive in support of algorithmic trading. There has also been increasing demand for more sophisticated APIs to access the data.
Chaired by Octavio Marenzi, founder and CEO of Celent, the panel featured speakers from a broad array of market participants. Joining Octavio Marenzi were Gabriella Stern, senior editor of Dow Jones Newswires in Europe, Middle East and Africa; Wendy Morgan, head of Real-Time Data for the London Stock Exchange (LSE); Eli Lederman, managing director and the head of the Electronic Trading Services Group at Morgan Stanley's Institutional Equities Division; and Thomas Colucci, head of Business Development at Pali International Ltd. Some 130 representatives of financial institutions, consultants and market commentators attended the event.
Market data is a key component of algorithmic trading. Firms offering their algorithmic trading strategies to clients must expose the market price information underlying their algorithmic models, making it essential to maintain access to timely and accurate data at all times.
The growth of algorithmic trading appears to have outstripped expectations over the past two years. An audience poll found that most attendees expected algorithmic trading to represent more than 25 per cent of total equity trading volume in two years' time. In fact, according to Morgan Stanley's Lederman, volumes for US program trading are already in the 50 to 60 per cent range, with statistical arbitrage at around 33 per cent of total volumes.
These kinds of figures illustrate why information providers like the LSE have bolstered their data processing infrastructures. According to Morgan, the LSE has seen year-on-year volume increases of about 30 per cent and, as a result, has upgraded its technical capabilities to support the new low latency market data services needed for the rapid execution of electronic trades. She said the exchange has managed to reduce the latency in its information systems from 30 milliseconds to less than 3 milliseconds.
How long this trend will continue is open to question. Lederman told the audience that he expected the recent high data volumes would turn out to be cyclical, as the advantage of arbitrage strategies is worn away by the broad market uptake of algorithmic models. At some point, he said, algorithmic trading becomes so prevalent that you arbitrage away inefficiencies and the value diminishes. When this happens, he said, the market will return to value investing rather than merely identifying and exploiting inefficiencies.
Colucci reckoned that sophisticated market participants are already able to identify from market data when firms are using algorithms to exploit market inefficiencies. There is a lot of manual intervention in the last hour of trading, he said, and if one looks behind the volume, one will often find a large VWAP order driving it. Lederman said it was easy to pick off badly designed algorithmic trading, and that one of the challenges is to produce well-written models that hide their footprint.
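For readers unfamiliar with the VWAP orders Colucci mentions, the benchmark they track is the volume-weighted average price. A minimal sketch, using hypothetical fill data (the prices and sizes below are invented for illustration, not drawn from the panel):

```python
# Illustrative sketch: the volume-weighted average price (VWAP) benchmark
# that a VWAP execution algorithm tries to match over the trading day.

def vwap(trades):
    """Return the volume-weighted average price for (price, volume) pairs."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume

# Hypothetical fills at different prices and sizes.
fills = [(100.0, 500), (100.5, 300), (99.8, 200)]
print(round(vwap(fills), 2))  # 100.11
```

A large parent order sliced to track this average leaves a characteristic volume pattern, which is the 'footprint' the panelists said sophisticated participants can spot.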
The cyclical nature of the trading markets notwithstanding, the marketplace is expected to require lower latency access to market data. Polling of the audience showed that data throughput is seen as rising to more than 42 megabits per second by the end of the first quarter of 2006, from the 18 megabits per second that ComStock's feed handled a year earlier.
Morgan said that the LSE, in line with a trend among exchanges, is looking at ways in which it could balance out the costs incurred in providing more and faster data with the revenues associated with providing information services to the marketplace.
Lederman said that firms themselves needed to conduct a cost-benefit analysis when considering how much investment to make in implementing low latency information systems.
The panel agreed that algorithmic trading was an important element in a firm's overall order flow segmentation. Lederman suggested that while today most algorithmic trading is price-driven, it could become more news-driven in the future. Colucci said his company, a boutique trading firm, was receiving a growing number of calls from market participants - including many hedge funds - seeking to interpret events in the marketplace rather than merely react to price movements.
Morgan said she was aware of computer programs that analysed news information through the trading day and used key words to drive trading strategies. Dow Jones Newswires' Stern said her company was working to expand beyond that kind of use of news, speaking to customers to find out new and creative ways to use news in their trading strategies.
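The keyword-driven programs Morgan describes can be sketched very simply: scan each headline for trigger words and map them to a trading signal. The word lists and headlines below are hypothetical illustrations, not details given by the panel:

```python
# Illustrative sketch of keyword-driven news analysis: headlines are
# scanned for trigger words and mapped to a simple trading signal.
# The keyword sets and example headlines are hypothetical.

BULLISH = {"upgrade", "beats", "raises"}
BEARISH = {"downgrade", "misses", "cuts"}

def news_signal(headline: str) -> str:
    """Return 'buy', 'sell' or 'hold' based on keyword matches."""
    words = set(headline.lower().split())
    if words & BULLISH and not words & BEARISH:
        return "buy"
    if words & BEARISH and not words & BULLISH:
        return "sell"
    return "hold"

print(news_signal("Broker upgrade lifts XYZ shares"))    # buy
print(news_signal("XYZ misses quarterly forecast"))      # sell
print(news_signal("XYZ holds annual general meeting"))   # hold
```

Real systems are far more sophisticated, which is presumably what Stern meant by expanding beyond "that kind of use of news".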
Marenzi suggested that non real-time data, such as corporate actions and historic tick data, would also have a role to play within the trading environment, while Morgan said the challenge will be to work with industry participants to create standards for this kind of data so that it can be integrated into their systems.
The onset of the EU's Markets in Financial Instruments Directive (MiFID) in 2007 may also affect how data is distributed to the marketplace. Lederman pointed out that algorithms are not currently able to take advantage of off-exchange trades, where transactions are 'internalised' by institutions. This practice, according to Morgan, could represent as much as one-third of all trades in the London marketplace. MiFID, by increasing the transparency of those trades, could open up access to them for algorithmic models, although its true impact will not be understood until more issues around the directive are decided. What is clear, though, is that MiFID will be sweeping in its impact.
While the market's uptake of direct market data feeds is growing, Morgan acknowledged the importance of aggregated feeds like ComStock in the overall mix. She said that implementing direct feeds is an expensive process, not only in terms of paying for the data but also in the requirement to maintain one's own systems for handling it. However, she said the buy-side is growing more sophisticated in its use of directly sourced data, allowing firms to manage their relationships with sell-side brokers more effectively.
This Panel Discussion was organised by Interactive Data Corporation (NYSE: IDC), a leading global provider of securities pricing, financial information and analytic tools to institutional and individual investors. ComStock, part of Interactive Data, is a leading provider of real-time global financial information to financial institutions, financial information re-distributors and online media portals worldwide.