Running a successful hedge fund trading strategy requires high levels of skill, research, idea generation and a clear understanding of market dynamics. Such perspicacity, however, is becoming just as important in how a fund’s operations run.

Regardless of how complex the operational infrastructure, quality data is the lifeblood on which a fund’s day-to-day success depends. Indeed, in the opinion of Bennett Egeth, President of Broadridge Investment Management Solutions, data is the buy-side community’s new KPI. “It impacts every aspect of a firm’s alpha generation, operational efficiency, regulatory compliance and risk management.”

Fund managers are becoming mindful of the fact that data impacts the entire enterprise. Solutions that can manage and organize reliable data sets not only help to improve operational efficiency and reduce costs, they bring much needed peace of mind as hedge funds adjust to the forensic needs of institutional investors and global regulators.

“Firms that figure out how to manage and improve data quality will be better able to deploy new strategies, reduce costs and improve performance,” says Egeth.

Superior data supports superior decisions

Having good quality data can enable traders to react more quickly and confidently to market dynamics, potentially gaining an edge on their competitors. Most people think about data quality in terms of reducing their operational costs, says Egeth, who adds: “One of my favourite phrases is ‘superior data supports superior decisions’. The point being, it’s not only about having the data accessible to the front, middle and back office functions, it’s also about the quality of data. Any error or incompleteness in the data diminishes the quality of decision making and increases the operational expense of managing and remediating exceptions.”
 
Much has been written recently about operational alpha and the benefits it can bring to managers, with some estimating that outsourced solutions for things like middle office services can help to reduce operating costs by 10 to 20 per cent. Certainly, it can help managers reduce the size of their operations teams, and the time taken up with reporting and compliance tasks can be redeployed to areas such as marketing or capital raising.
 
But whilst operational alpha can help managers squeeze a bit extra out of their trading decisions, the quality of data has a significant impact on the quality of the trading decision. Egeth believes hedge funds should be concentrating as much on data quality as the technology used to acquire it: “One of the biggest reasons for failure in risk monitoring and the implementation of risk systems is poor data quality.”
 
Broadridge Investment Management Solutions counts 18 fund administrators and several hundred fund managers as its clients, not to mention broker-dealers and custodians. Given that 50 per cent of its managers are based in the US, with 25 per cent in Europe and 25 per cent in Asia Pacific, the firm is plugged into the data challenges that hedge funds face.
 
Many of the global regulatory demands being placed on managers require consistent and high quality reference data. Regulators are now concerned with the ability of a fund to track the lineage and provenance of its data, and to make sure it is used consistently across a fund’s infrastructure. Data management is becoming a standard topic in investor operational due diligence questionnaires.
 
Regulators want to understand the extent to which hedge funds represent a systemic risk; the bottom line being to protect end investors, many of whom are mom-and-pop investors, as pension funds continue to increase their allocations.
 
Unfortunately, many hedge funds remain operationally inefficient. 
 
“We estimate that 70 per cent of time spent by operations teams is on maintaining data quality; that is, fixing trade breaks or dealing with exceptions that result from poor data quality,” says Egeth.
 
“Broadridge is very conscious of that from our sell-side business. The banks and broker/dealers are further ahead in their understanding of the causality of where their operations teams spend most of their time. I don’t think the majority of fund managers have yet developed a data maturity model that gives them the ability to understand the relationship between data quality and the day-to-day tasks of their operational teams.”
 
Some hedge funds will take data natively from market data providers without testing it or making sure that it is valid and conforms to their expectations. Part of the problem stems from the fact that managers often lack system integration. Many are still using legacy systems that were built to handle a more limited set of requirements for a specific asset class, before regulation reared its head.
 
“That creates a much larger burden on their data infrastructure that nobody anticipated when they first built it,” says Egeth. “Very few firms built their systems before OTC derivative markets took off. Risk reporting requirements have grown substantially.”
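The feed validation Egeth alludes to can be illustrated with a short sketch. This is a minimal, hypothetical example, not any vendor’s actual schema or Broadridge’s implementation; the field names and rules are assumptions for illustration only.

```python
# Minimal sketch of validating a reference-data feed before loading it.
# Field names and validation rules are illustrative, not a real vendor schema.

from datetime import date

REQUIRED_FIELDS = {"isin", "asset_class", "price", "maturity"}

def validate_record(rec):
    """Return a list of data-quality exceptions for one security record."""
    errors = []
    missing = REQUIRED_FIELDS - {k for k, v in rec.items() if v is not None}
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    price = rec.get("price")
    if price is not None and price <= 0:
        errors.append(f"non-positive price: {price}")
    maturity = rec.get("maturity")
    if maturity is not None and maturity < date(2015, 1, 1):
        errors.append(f"matured instrument still on feed: {maturity}")
    return errors

feed = [
    {"isin": "XS0000000001", "asset_class": "bond",
     "price": 101.2, "maturity": date(2030, 1, 15)},
    {"isin": "XS0000000002", "asset_class": "bond",
     "price": 99.8, "maturity": None},  # missing maturity date
]

# Only records that fail validation become operational exceptions.
exceptions = {rec["isin"]: errs
              for rec in feed if (errs := validate_record(rec))}
print(exceptions)
```

The point of running checks like these at the point of ingestion is that a missing maturity date is caught once, before it fans out into trade breaks across every downstream system.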
 
The rise of managed services
 
More than ever, managers need clean, consistent data.
 
“That’s why we have over 18 administrators as clients. Early on, we were providing them with the tools they needed to manage data. When we talk about the data needs of a fund we are talking, generally, about:

  • Security terms and conditions
  • Pricing
  • Data warehousing (being able to action data from a single place so that it is accessible to fund managers)
 
“Managers need the data in a normalized place. They need it to be programmatically extensible, and they need the tools to reconcile it on an automatic basis with their prime brokers, fund administrators and any other counterparties. In my view, the future is going to be data delivered as a managed service, as opposed to just a technology solution,” comments Egeth.
 
For example, say a data content provider is missing maturity data on one of its feeds. Every bank, broker/dealer, custodian, administrator and fund manager needs to detect that and fix it. The question is, should they be doing that themselves, or should they be using someone – via a managed service arrangement – who works to address these issues on their behalf and provides them with clean data and operational scale?
 
The latter is becoming an increasingly attractive option.
 
“We offer a technology solution for the data, but we also now offer a managed service. A fund’s operations team can come in first thing in the morning and see a summary of all the trade breaks and/or reconciliations that have been resolved overnight,” says Egeth.
 
Integration enhances data aggregation
 
Increasingly, fund managers are outsourcing middle and back office tasks to their administrator, giving the administrator sole responsibility for the Investment Book of Record (IBOR). Some start-up managers, says Egeth, are not even bothering to install a portfolio system because of the potentially large total cost of ownership involved.
 
Most managers, however, do use front-end platforms. This may involve using two or three systems for different asset classes, with an additional system between the fund and the administrator to aggregate the data and reconcile everything in one place. Given how important data quality is today, such a fragmented set-up is far from ideal.
 
“Because we are able to handle virtually every asset class, we have found that the trading platform becomes the source of the data, which then gets reconciled to the administrator without the need for another system in the middle. Our Portfolio Master is basically a combined order management and portfolio system. We usually deliver that system integrated with our reference data product, called Security Master, and our data warehouse, called Analytics Master, as a fully integrated solution,” explains Egeth.
 
This automatic reconciliation that Egeth refers to is a key point. Whether it’s security terms and conditions, pricing data, risk data or performance and attribution data, an integrated solution gives managers the ability to check every piece of data that goes into calculating a fund’s NAV and reconcile it against the NAV being produced by their administrator. This ensures that operational due diligence is robust, even when key tasks are outsourced.
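The reconciliation Egeth describes can be sketched in a few lines. This is a simplified illustration, assuming the fund’s internal valuations and the administrator’s figures are available keyed by position; the position names, values and tolerance are all hypothetical.

```python
# Sketch of shadowing an administrator's NAV: compare the fund's internal
# valuation of each position against the administrator's figures and flag
# any breaks above a tolerance. All names and numbers are illustrative.

TOLERANCE = 0.01  # flag differences above one cent per position

internal = {"AAPL": 1_250_000.00, "IRS-2027": 312_450.75, "BUND-FUT": -84_210.10}
admin =    {"AAPL": 1_250_000.00, "IRS-2027": 312_580.75, "BUND-FUT": -84_210.10}

def reconcile(internal, admin, tolerance=TOLERANCE):
    """Return {position: (internal_value, admin_value, difference)} for each break."""
    breaks = {}
    for pos in internal.keys() | admin.keys():  # union catches one-sided positions
        ours = internal.get(pos, 0.0)
        theirs = admin.get(pos, 0.0)
        if abs(ours - theirs) > tolerance:
            breaks[pos] = (ours, theirs, ours - theirs)
    return breaks

breaks = reconcile(internal, admin)
for pos, (ours, theirs, diff) in breaks.items():
    print(f"{pos}: internal {ours:,.2f} vs admin {theirs:,.2f} (diff {diff:,.2f})")
```

Run nightly against the administrator’s file, a check like this is what lets an operations team arrive in the morning to a summary of breaks rather than hunting for them by hand.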
 
This is something that the UK financial regulator, the FCA, has focused on. Last November it produced a thematic report (Outsourcing in the Asset Management Industry) in which it highlighted two key areas of risk that managers need to address when outsourcing: ‘resilience risk’, meaning what contingency plans are in place should a service provider fail, and ‘oversight risk’, meaning what internal controls and processes are in place to monitor what the service provider is doing.
 
Broadridge’s technology provides not only data quality but the ability to effectively shadow what the counterparty is doing, addressing this ‘oversight risk’ issue identified by the FCA.
 
“Managers can’t abdicate responsibility for their business and at the same time tell investors or regulators ‘Well, my administrator is supposed to be doing that’,” says Egeth.
 
In Egeth’s view, the price point for institutional-quality infrastructure has come down considerably; today, an emerging manager can license affordable, high quality infrastructure off the shelf. This infrastructure is not only crucial to the investor operational due diligence process needed to raise capital, but also signals to prime brokers and administrators that the manager is committed to automation and willing to invest in the business.
 

