Digital Assets Report

Electronic trading in volatile markets – it’s not about better algos but better usage of algos

Within the buy-side community there’s a tendency right now for traders to focus too much on the algorithms they use and not enough on the trading itself, according to Nick Nielsen, head of trading at Marshall Wace Asset Management. Speaking at the FIX Protocol 2012 EMEA Trading Conference, held at London’s Old Billingsgate on 13 March 2012, Nielsen said that, generally speaking, there has been a lot of focus on infrastructure and technology implementation “and less on actual intelligence”.

“I think people fail to focus on the bigger picture sometimes,” said Nielsen, who was joined by fellow panellists Brian Gallagher, head of European electronic trading at Morgan Stanley, and Salvador Rodriguez, EMEA head of electronic sales trading at Citi.

Referring to recent TABB Group research, Rebecca Healey, senior analyst at TABB and moderator of the debate entitled Electronic Trading in Volatile Markets, said that 57 per cent of people interviewed no longer had the same provider as they did last year. Traders, it seems, are starting to question the algos they’re getting from their brokers and whether they’re actually achieving the desired results.

As the trading environment becomes more fragmented and more heavily regulated, brokerages are under added pressure to meet buy-side needs and to focus on intelligence and venue analysis to get their clients not necessarily the fastest liquidity, but the right kind of liquidity. Gallagher commented that the issue is not about better algorithms but better usage of algorithms: “In different periods of volatility, knowing when to use a specific algo is going to change and evolve.” Rodriguez agreed, adding: “It’s really about being able to customise quickly and deliver solutions that people can use effectively.”

This was echoed back in January 2012 when Bob Santella, head of SunGard’s capital markets US trading software and brokerage business, said that because algo trading had become commoditised, “brokerage firms will buy more sophisticated algorithms – or purchase the tools to build them – to help maintain the profitability of their algorithmic trading operations”.

Market volatility like that seen in the second half of 2011 makes it even more important for brokerages to demonstrate a clear value-add in electronic trading, particularly for hedge funds, which often juggle multiple prime broker relationships. What clients need is data, analysis, insight and guidance that will, according to Laurie Berke, principal at TABB Group, “result in quantifiable reductions in trading costs and positive performance”.

Gallagher said that how one defines market volatility depends on what is being referenced as the benchmark but admitted conditions in the second half of 2011 had created a lack of overall liquidity and made things difficult. “Between August and January you had tremendous macro events like the US credit downgrade, eurozone problems, short selling bans. The market became macro-driven, everything was correlated which made it difficult to be a stock picker. One of the things we saw was a decrease in equity volumes by 30-40 per cent as many institutions were trading less.”

Gallagher added that differentiated algos have been created but that by and large “the majority that are used by clients remain schedule-based strategies like VWAP and Participate, but even a VWAP order at 5 per cent of the volume can leave a footprint. We have seen and encouraged a shift into opportunistic strategies to focus on liquidity and price capture.” 
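For readers less familiar with the benchmark Gallagher mentions, the sketch below shows how a VWAP figure is computed and what a 5 per cent participation cap means in practice. All numbers and function names here are illustrative, not taken from any broker’s production strategy.

```python
# Minimal sketch: volume-weighted average price (VWAP) and a
# participation-capped child-order schedule. All figures are
# hypothetical examples, not real market data.

def vwap(prices, volumes):
    """Classic VWAP: sum(price * volume) / sum(volume)."""
    total_volume = sum(volumes)
    return sum(p * v for p, v in zip(prices, volumes)) / total_volume

def participation_schedule(market_volumes, participation_rate=0.05):
    """Child-order sizes for a strategy capped at a fixed share of
    traded volume in each interval (5 per cent, as in the article)."""
    return [v * participation_rate for v in market_volumes]

# Example intervals: price and market volume traded in each
prices = [100.0, 100.5, 99.8]
volumes = [20_000, 30_000, 50_000]

print(vwap(prices, volumes))            # the benchmark price
print(participation_schedule(volumes))  # our slice per interval
```

Even at this capped rate, the child orders in the example simply track the market’s volume profile, which is exactly the predictable “footprint” Gallagher warns that a schedule-based strategy can leave.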

In Nielsen’s view, markets are less volatile but people are choosing not to engage with them in a way that reflects that. He argued that the biggest hurdle for traders is deciding how aggressive to be: “As spreads have come down and trade sizes have come down, it effectively means that you need to lengthen your execution window or decrease your aggression so as not to show market participants what you’re doing.”

That’s not to say that traders are necessarily using dark strategies more frequently. “People sometimes believe that if you trade dark it has no influence on any other markets which couldn’t be further from the truth,” added Nielsen.

For Citi’s Rodriguez, it’s a case of adjusting to trading trends and going back to basics: “There are occasions where you discuss strategy with clients. It’s a case of doing the basics well and understanding what they’re trying to achieve. The client needs to be aware that in times of volatility there’s going to be a big fluctuation in price.”

Dark venues have evolved and offer a number of different types of dark liquidity in the market place so it’s no longer the sole preserve of block trading – indeed, trade execution firms like Liquidnet have seen significant buy-side demand in Asia, where dark venues are less well evolved than in the US or Europe. “Clients want to understand the different order types in dark and whether they can customise them to opt in or opt out of certain venues and liquidity types,” said Gallagher.

One of the challenges of volatile markets, particularly during periods like 2008, is how algos are deployed to seek out liquidity and whether they are able to avoid toxic liquidity. One of the more important questions Morgan Stanley is being asked by clients is not where they went to trade but where they tried to trade.

Avoiding venue toxicity is not exactly the be-all and end-all in Nielsen’s view. He commented: “I think people spend too much time talking about venue toxicity. When you look at the types of trades people do in the market place the venue toxicity-type numbers that people are talking about aren’t that significant, maybe a basis point. Sometimes it’s better to pay for this toxicity because you’re decreasing the time to execution.”

Today’s traders aren’t looking for a cookie-cutter solution to electronic trading. They want choice, and the ability to use customised algos that best optimise their trading book. Consequently, different brokerages are being used for different algos – who has the best technology sector solution? The best small-cap US equities solution? On top of this, changing regulation – in particular MiFID II – means that more OTC derivatives will soon trade electronically and require execution support, putting brokerages under even more pressure to deliver.

“I think one of the biggest challenges we’re facing right now is regulation. What are the new rules on OTFs going to mean? We need to be able to change and deliver in terms of what a) the client wants and b) the regulator wants. That’s a main focus for us. It’s about delivering a menu of choice to traders. We need to be able to say to clients ‘we have a menu that you can pick and choose from’,” explained Rodriguez.

This suggests that buy-side traders still rely heavily on their brokers for customisation. Gallagher said that Morgan Stanley has the expertise and technology to offer customisation, as most tier-one providers do: “We know what is going to work for a long-only manager versus a quant-driven hedge fund manager.” The ability to eke out one basis point on a USD1 billion portfolio can make a real difference to a fund’s performance and ultimately determine whether it’s a second quartile or first quartile fund. “We look at clients’ trading styles and where they could be doing things differently, trying to help them optimise the cost side of things,” added Gallagher.
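To put that one-basis-point figure in concrete terms, here is a quick back-of-the-envelope calculation; only the USD1 billion portfolio size and the one-basis-point saving come from the article, the rest is arithmetic.

```python
# One basis point is one hundredth of one per cent, i.e. 1/10,000.
portfolio_value = 1_000_000_000  # USD 1 billion
one_bp = 1 / 10_000

saving = portfolio_value * one_bp
print(f"One basis point on USD 1bn: USD {saving:,.0f}")  # USD 100,000
```

Savings of this order, repeated across a year of trading, are the kind of margin that can separate a first-quartile fund from a second-quartile one, as Gallagher suggests.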

When it comes to internal customisation there is, said Nielsen, a limit to what can be achieved in relation to marginal speed benefit implementations. Even though upgrading hardware is still relatively cheap and will continue to happen, a point has been reached where the alpha forecasts being encoded aren’t granular enough to take advantage of the speed with which computers can execute strategies.  


“The costs of marginal speed are high relative to revenue. That’s not to say people won’t maintain and try and improve things where it makes sense, but the money they previously spent on hardware will, I believe, over time, be spent on research,” stated Nielsen.
