Digital Assets Report



Contextualising alternative data is key to garnering true insights


The world has witnessed an unprecedented explosion of data over the last few years. Most of us will be familiar with the term Terabyte, which represents 10^12, or 1 trillion, bytes of data. But such is the data-drenched world in which we live today that Caltech estimates 463 Exabytes of data will be created every day by 2025. One Exabyte is 10^18, equivalent to one quintillion bytes!


The numbers are mind-boggling and too much for the human brain to comprehend. For the asset management industry, finding ways to harness technology so that it can bring a kernel of insight to investment portfolios is likely to be the next significant phase of evolution, where data management will separate the winners from the losers.

In its latest white paper, entitled “The exponential pull of innovation”, SEI calls this the “Googlisation” of financial services. More than just a placeholder for the idea of big data, “Google plays the role of a reliable means of deriving utilitarian knowledge from data. It is emblematic of data abundance and our strides in using that data effectively,” the white paper suggests.

Data evangelism

How asset managers approach big data is far from straightforward. Many are traditional, conservative businesses that find it hard to justify spending significant budgets on innovation and moving away from a bedrock of stability that has served them well over the years, or decades.

For somebody to come along and suggest they might want to re-think how they do business, and that the way they do things today could be defunct in 10 years’ time… that’s a hard pill to swallow.

“It’s incredibly difficult to institute change within any asset management firm,” comments Qaisar Hasan, Portfolio Manager for the 1798 Q Strategy at Lombard Odier Investment Managers.

1798 Q is an equity market neutral strategy that combines rigorous data science with fundamental analysis. Before joining LOIM in May 2018, Hasan was a portfolio manager at Point72 Asset Management, where he launched the firm’s alternative data-driven long/short investment strategy.

Going down the path of greater innovation requires more resources (human and financial) and greater access to data and systems, and it can disrupt existing personnel and processes, so one can understand why there is some reluctance among asset managers.

Some will navigate that path more successfully than others.

“We run a dedicated strategy that was custom-built around the strengths and weaknesses of alternative data, and we source our datasets through a fundamental lens. LOIM is bullish about big data and has fostered a culture of collaboration across teams when it comes to data.

“You have to be an evangelist. You have to spread the word and help people understand the benefits. If they see you put it into practice they are more likely to follow you, rather than think of big data as being a largely academic exercise,” says Hasan.

Big Data is a big deal

As SEI’s white paper points out, analysts the world over are toiling over spreadsheets as they scrub data sets to glean insights before the data becomes obsolete. In response, firms are looking to automate the analysis of streaming data.

To do this, some asset managers are busy transitioning from the old-fashioned world of descriptive analytics to more cutting-edge predictive analytics, underpinned by sophisticated machine learning tools that consume raw data with the same rapacious appetite as a black hole.

Not that this necessarily means traditional asset managers have to become fully quantitative.

With the 1798 Q Strategy, LOIM is trying to blend two distinct legacy worlds of quantitative and fundamental investing.

“Historically, these two styles have been like oil and water,” says Hasan, “they’ve never really mixed well together. Five or so years ago, we determined that alternative data could potentially act as a bridge between the two, because with alternative data you get the same insights on companies – e.g. pricing and revenue growth – that any fundamental investor would care about. But at the same time, you need statistical modelling and computer science expertise – the preserve of quantitative investing – to clean the data, visualise it and interpret it correctly.”

In a paper that LOIM authored last year, “Big Data is a big deal”, Hasan wrote that as the types of data, and the technology for processing them, evolve, so must our investment approach. His analogy: if data is gasoline and machine learning is the combustion engine, this new form of alternative data is crude oil.

In other words, to make sense of alternative data, asset managers will need to utilise technology in a way that removes the impurities, so that whatever data sets are used for portfolio management are additive, rather than introducing false positives or erroneous signals.

Hasan says: “We view ourselves as the next generation of fundamental investors, where we try to understand company dynamics like any fundamental investor would, but in addition we go out and find data sets from external vendors. That way we can see – in almost real time – what is happening to companies, in more granular detail.”

To do this, Hasan and his team built a massive database that combines three different types of data sets: 1) fundamental company data (balance sheets, cash flow information etc.), 2) market data (how is a company valued, what is the investor positioning), and 3) alternative data, to better understand drivers of demand, revenue, costs, margins etc.
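The join described above can be sketched as combining records from each source on a shared (ticker, period) key. The schema, field names and figures below are purely illustrative assumptions, not LOIM's actual data or systems:

```python
from collections import defaultdict

# Hypothetical mini-records for one ticker and one period; all names and
# numbers are made up for illustration.
fundamental = {("ACME", "2023-Q4"): {"revenue": 120.0, "free_cash_flow": 18.0}}
market = {("ACME", "2023-Q4"): {"ev_to_sales": 4.2, "short_interest": 0.03}}
alternative = {("ACME", "2023-Q4"): {"web_traffic_growth": 0.15, "card_spend_growth": 0.11}}

def merge_datasets(*sources):
    """Join records from several datasets on a shared (ticker, period) key."""
    merged = defaultdict(dict)
    for source in sources:
        for key, fields in source.items():
            merged[key].update(fields)
    return dict(merged)

combined = merge_datasets(fundamental, market, alternative)
# One row now carries fundamental, market and alternative fields side by side,
# which is what lets the data sets "talk to each other".
print(combined[("ACME", "2023-Q4")])
```

In practice this join is the easy part; the hard part, as Hasan notes, is cleaning each source first so that the combined fields are actually comparable.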

“It’s only when you are able to make all of these data sets talk to each other, once they’ve been cleaned, that real insights start to emerge,” says Hasan.

In his view, how one contextualises the data is key.

“That’s where I think a lot of the pure quants struggle; alternative data throws up so many possible permutations that, when you back test, you always get some that look incredible until you apply them to the real world and they blow up.

“The real challenge is determining when an alternative data set can really matter – even if it doesn’t necessarily back test well – because we know that it fits our fundamental model of a particular company.

“It is a marriage of good data sourcing skills, good engineering skills and having an overlay of fundamental analysis.”
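The overfitting risk Hasan describes (many permutations, some bound to back test well by chance) can be demonstrated with pure noise: test enough random "signals" against a short return series and the best of them will still look impressive in-sample. A minimal sketch, with made-up numbers:

```python
import random
import statistics

def corr(x, y):
    """Pearson correlation of two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n_periods = 12  # a short back test window, as is common with young datasets
returns = [random.gauss(0, 1) for _ in range(n_periods)]

# 500 candidate signals of pure noise, independent of returns by construction.
best = max(
    abs(corr([random.gauss(0, 1) for _ in range(n_periods)], returns))
    for _ in range(500)
)
print(f"best in-sample |correlation| among 500 noise signals: {best:.2f}")
```

With only 12 periods and 500 candidates, the strongest spurious correlation comfortably exceeds 0.5 even though every signal is noise by construction, which is why a back test alone cannot separate a real signal from a lucky one, and why a fundamental overlay matters.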

The challenge of data obsolescence

How firms integrate different data sets together to create meta sets will go some way towards addressing one of the main challenges, which is obsolescence. Most alternative data sets have a short shelf life. In his previous role, Hasan used to cover communication equipment companies. One day, he came across a data set (from US distributors) that tracked US sales for Cisco really well. A year later, he came across another data set that was the mirror image, provided by international distributors.

“Neither data set on its own did a good job, they both had limitations, but when I was able to combine them together, it gave me a much more robust picture of global sales for the company.

“Every alternative data set is like this. It may represent 1 per cent of a company’s business and you’re trying to extrapolate from that a broader company performance. But such a small sample size brings with it geographic bias, consumer bias and so on. When you can combine different data sets together to overcome those biases, you can generate far stronger signals on how a company is performing,” Hasan remarks.
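The Cisco anecdote above can be illustrated with a coverage-weighted blend: each panel's growth signal is weighted by the share of the business it actually observes, so neither geographic bias dominates. The coverage shares and growth figures below are hypothetical, not real Cisco data:

```python
# Hypothetical panels: each observes only a slice of global sales, so each
# growth estimate carries its own geographic bias.
us_panel = {"coverage": 0.40, "observed_growth": 0.08}    # US distributors
intl_panel = {"coverage": 0.60, "observed_growth": 0.02}  # international distributors

def blended_growth(panels):
    """Coverage-weighted average: each panel's signal is weighted by the
    estimated share of the business it observes."""
    total = sum(p["coverage"] for p in panels)
    return sum(p["coverage"] * p["observed_growth"] for p in panels) / total

estimate = blended_growth([us_panel, intl_panel])
print(round(estimate, 4))  # 0.044, i.e. a blended global growth estimate of 4.4%
```

Either panel alone would have over- or under-stated global growth; the blend is what gives, in Hasan's words, "a much more robust picture of global sales".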

As SEI is keen to point out in its paper, privacy concerns aside, “it is clear that data and analytics will continue to be viewed as key sources of competitive advantage for the foreseeable future”.

Hasan concludes by adding that people, ultimately, are going to be the final determinant of who succeeds in this data challenge and who fails.

“We have five data engineers and computer scientists and three fundamental analysts on our investment team.

“Going forward, I think we’ll see investment professionals having to wear both hats but it’s going to be a challenge as graduate programmes haven’t been designed that way. You might find that computer science graduates study for an MBA or CFA, while the more traditional financial analysts study Python coding. Maybe we will start to have blended degrees, combining both skill sets, emerging over the next 10 to 15 years.”

To read SEI’s latest white paper – No2 in a five-part series – please click here – US version or UK version.

