Digital Assets Report


How to monitor, measure and manage liquidity risk

What’s the situation? Asset managers today face a plethora of liquidity regulations. Most liquidity-related requirements – the Comprehensive Capital Analysis Review (CCAR) prescribed by the Federal Reserve Board, Solvency II, MiFID II (due to go live on 3rd January 2018) and the liquidity coverage ratio under Basel III – are essentially rules from the regulator to adhere to. 

This has required financial institutions to improve their trade compliance frameworks and enhance their pre-trade analytics. But the goalposts are changing. 

Regulations such as the Alternative Investment Fund Managers Directive (AIFMD) on the buy-side, prudent valuation regulations introduced by the European Banking Authority for the sell-side, and the upcoming Investment Company Liquidity Risk Management Programs rule – known as SEC Rule 22e-4 – place an actual requirement on the end user to estimate their own liquidity. 

“AIFMD requires fund managers to time-bucket their liquidity and report it to the national regulators, but in my view SEC Rule 22e-4 is probably the most challenging rule to date. It is the first mandated regulation around measuring, categorising, and reporting on liquidity,” says Naz Quadri, Head of Liquidity Analytics for Bloomberg’s Enterprise Solutions business. 

Four time buckets under SEC rule 22e-4 

SEC Rule 22e-4 was passed unanimously on 13th October 2016 and impacts US open-ended mutual funds and ETFs. 

The objective of this rule is for registered investment advisers to put in place a liquidity risk management process to define how they are managing their liquidity – reporting on order execution, reporting on breaches, etc. 

Secondly, the SEC wants firms, on a monthly basis, to categorise their holdings on a time basis, within four time buckets. And thirdly, alongside that time bucketing, to set thresholds. 
This will require fund managers to have a well-constructed liquidity risk management framework. 
Briefly, the four time buckets are defined as follows: 

1. Highly liquid

Positions that can be sold and settled within three business days.

2. Moderately liquid

Positions that can be sold and settled in more than three but less than seven calendar days. The SEC also asks funds to consider “disposal”, i.e. to classify the remaining positions that can be sold but not necessarily settled in current market conditions.

3. Less liquid

Positions that can be sold within seven calendar days but probably will need more than seven calendar days to settle.

4. Illiquid positions

Positions that will take more than seven calendar days to sell.
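As an illustration, the four buckets above reduce to a simple classification rule over two estimates a manager would have to supply for each position: days to sell at an acceptable price impact, and days from sale to settlement. This is only a sketch – the field names are invented, and it simplifies by treating business and calendar days interchangeably:

```python
from dataclasses import dataclass

@dataclass
class Position:
    name: str
    days_to_sell: int    # estimated days to sell without significant price impact
    days_to_settle: int  # estimated days from sale to settlement

def classify(p: Position) -> str:
    """Map a position to one of the four Rule 22e-4 time buckets."""
    total = p.days_to_sell + p.days_to_settle
    if total <= 3:
        return "highly liquid"       # sold and settled within three days
    if total <= 7:
        return "moderately liquid"   # sold and settled within seven days
    if p.days_to_sell <= 7:
        return "less liquid"         # saleable in seven days, slower to settle
    return "illiquid"                # more than seven days just to sell
```

For example, a Treasury that sells and settles in a day each classifies as highly liquid, while a thinly traded loan that takes five days to sell but ten to settle falls into the less liquid bucket.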

When determining which liquidity bucket a position should fall into, the rule states that the classification should be based on the amount of time it would take to liquidate “without significantly changing the market value of the investment”. 

It is, therefore, up to the individual firm to determine how much cost it is willing to incur to liquidate a series of positions. 

The second piece of wording is that they need to be able to do this under “foreseeably stressed market conditions”. 

“SEC Rule 22e-4 calls for several monitoring components such as the Highly Liquid Investment Minimum (HLIM) and the 15% Illiquid Investment threshold,” says Kevin Fay, who leads Bank of New York Mellon’s Liquidity Risk Oversight Program as a Senior Risk Analyst. “Should either of these breach, action items would include notification to your fund board and a confidential filing to the SEC via Form N-LIQUID. 

“I don’t see how anything other than daily monitoring of liquidity can be avoided given the language within the final rule along with a supporting audit trail.” 
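The two monitoring components Fay mentions can be sketched as a daily threshold check over the fund’s bucket weights. The function name, data layout and breach messages below are illustrative, not drawn from any official specification:

```python
def check_thresholds(weights: dict, hlim: float) -> list:
    """Flag breaches of the fund's HLIM and the 15% illiquid cap.

    `weights` maps bucket name -> fraction of net assets;
    `hlim` is the fund's self-set Highly Liquid Investment Minimum.
    """
    breaches = []
    if weights.get("highly liquid", 0.0) < hlim:
        breaches.append("HLIM breach: notify fund board, file Form N-LIQUID")
    if weights.get("illiquid", 0.0) > 0.15:
        breaches.append("15% illiquid breach: notify fund board, file Form N-LIQUID")
    return breaches
```

A fund holding 8 per cent in highly liquid assets against a 10 per cent HLIM, and 20 per cent in illiquid assets, would trip both checks on the same day.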

More sophisticated liquidity risk management 

The trend in regulation, and its intent, is to lead to a change in business practice whereby things aren’t just reported, they are integrated with the rest of one’s risk management procedures. 

“That’s really where we are seeing liquidity as a topic,” opines Chris Casey, Global Head of Regulatory Products and Reference Data, Bloomberg. “In a couple of years’ time, we are probably just going to be talking about liquidity risk in the same context as market risk, operations risk, counterparty risk and credit risk.” 

It is not the SEC’s intention necessarily to increase regulatory reporting. What they want to achieve with the rule is more consistency in the level of sophistication and quality of liquidity risk management processes within open-ended funds. 

One of the mechanisms they use to raise that standard is to impose a reporting requirement, “but my expectation is that the liquidity risk management programme a fund has is now at a level of satisfaction and consistency across the industry to meet daily redemptions. 

“The first principle of a lot of these regulations – UCITS, AIFMD, MiFID II – is that the regulators want to bring a level of consistency across the industry in terms of how institutions operate,” says Casey. 

Creating best practices 

Geoff Craddock is Chief Risk Officer at OppenheimerFunds. Given that OppenheimerFunds, like many other institutional fund managers, is to an extent in the liquidity transformation business, it is extremely important for it to focus on liquidity. That has always been the case, says Craddock, long before regulations like Rule 22e-4 were created.

He thinks that the prescriptive nature of the rule is unfortunate on some level, although the goal is a worthy one – “I accept there may be marginal players in the industry who do not manage liquidity the way they are supposed to” – but the fact that every fund, even those that are highly liquid, has to have a liquidity programme, “is a bit of an overreach”. 

“It’s sweeping up hundreds of liquid funds in a programme for which it is all but unnecessary, simply to catch a few outliers,” says Craddock. “Some of the detail for liquid funds, in my view, becomes a bit arduous.” 

Given that OppenheimerFunds manages multiple fund products across the liquidity spectrum, the way the group focuses on liquidity management is tailored accordingly. 

“We run a significant bank loan fund product, for example, where liquidity is extremely important to manage, in particular because of delayed settlement times. Similarly, we have a high yield municipal bond fund, where liquidity can sometimes be patchy, and we run a large developing markets equity fund, where we need to take care with how certain positions transact. 

“For many of our large-cap equity products, which are highly liquid and can be turned over easily, I’ve not needed to spend as much time, other than to simply establish a set of protocols. Indeed, for every strategy, we have a defined maximum capacity. We look at a fund and scale up to what we think would be a maximum size for that fund, realistically, given market liquidity. And in most cases that is a significant multiple of the assets we have. 

“But in some instances that isn’t the case and we’ve felt it prudent to implement programmes to ensure we can maintain liquidity under reasonably stressed conditions. 

“In this sense, the rule is really reflecting best practices that we have in place already,” explains Craddock. 

Liquidity adjusted pricing 

The question of valuation and how it intersects with liquidity has been quite an interesting area of discussion. The requirement for fair valuation has existed for a long time, allowing US mutual fund managers to develop a well-established infrastructure for their daily liquidity products, using independent pricing services to value positions. That typically does not include a liquidity assessment. 

Liquidity adjusted pricing is typically determined using a traditional capital asset pricing model (CAPM) to try to model how liquidity could change the return of a given security under different market conditions. 
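As a rough illustration of that idea, a liquidity factor can be bolted onto the classic CAPM expected-return formula. The parameter names and the single additive liquidity term below are simplifying assumptions, not a production pricing model:

```python
def liquidity_adjusted_return(rf: float, beta: float, market_return: float,
                              liq_beta: float, liq_premium: float) -> float:
    """Classic CAPM expected return plus one liquidity term.

    rf           -- risk-free rate
    beta         -- market beta of the security
    liq_beta     -- sensitivity to the liquidity factor (illustrative)
    liq_premium  -- compensation per unit of liquidity risk (illustrative)
    """
    return rf + beta * (market_return - rf) + liq_beta * liq_premium
```

Under stress, widening the assumed `liq_premium` raises the return investors demand, which is equivalent to marking the security's price down for its illiquidity.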

The concept of the trade-off between liquidity versus price versus time is one that is very much an important part of the liquidity analysis. 

One could tie the two together but given the slightly different purposes of a liquidity assessment and a valuation assessment, they are still discrete. Even though on some level they intersect, the fact that for a mutual fund the manager has to strike a single NAV that works for inflows and outflows precludes a more proactive approach to liquidity, from a valuation perspective. 

“I can assess the discount we would have to transact to achieve certain levels of liquidity, but given the need for a single NAV we don’t have a mechanism to build that into pricing in any way,” adds Craddock. “Buried within the liquidity rule is the concept of being able to realise cash without significantly changing the market value of the investment. 

“We’ve had a lot of internal debate on this. Does it vary by asset class? The reality is yes, it probably does. Realising a large position in a US Treasury versus a high yield bond is likely to produce a very different price impact,” says Craddock. 

How does one monitor & measure liquidity?

Solving market impact by size of position

The starting point, from the logic of the rule, is an assessment of the need for liquidity. One of the things that risk teams might choose to focus on is the nature and make-up of the shareholder base in each of their funds.

This requires having reasonably strong data on historical flows in those funds. If one advisor controls investors that make up 10 per cent of the fund’s total AUM, that is one decision node that could have a big impact on the fund’s liquidity. 

“We’ve done quite a lot of work on this,” Craddock confirms. “That aspect informs us on reasonably likely flows under different circumstances. It establishes a building block for the whole context of what liquidity is required for the fund. 

“Once we’ve established a view of reasonably expected flows, and stressed that assumption, we then apply it to the securities in the fund and build a profile of sales. Then we look at the price versus time component and, for each fund and ultimately each security, we arrive at an acceptable discount; for example, a 25 basis point impact to a particular security, and how much of that do we sell over one, two, three days.” 

Then you ultimately end up with a liquidity characterisation for the portfolio. Can the fund be liquidated to realise cash quickly enough to meet the stressed redemption that we think is possible? If yes, that’s fine. If no, then it would drive the portfolio manager’s requirement to hold more liquid securities. The Highly Liquid Minimum is the plug that makes that whole equation work. 
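Craddock’s price-versus-time trade-off can be sketched with a participation-cap model: assume that selling no more than a fixed share of average daily volume (ADV) per day keeps the price impact inside the acceptable discount (the 25 basis point figure above). The participation rate and the three-day horizon are illustrative assumptions:

```python
import math

def days_to_liquidate(position_size: float, adv: float,
                      participation: float = 0.25) -> int:
    """Days needed to exit a position at an acceptable price impact,
    assuming we sell at most `participation` of ADV per day (the cap
    stands in for the acceptable-discount constraint)."""
    return math.ceil(position_size / (adv * participation))

def can_meet_redemption(positions: list, redemption: float,
                        horizon_days: int = 3) -> bool:
    """Can we raise `redemption` cash within `horizon_days` by selling
    only what fits under each position's daily participation cap?
    `positions` holds (size, adv, participation) tuples."""
    raised = 0.0
    for size, adv, participation in positions:
        raised += min(size, adv * participation * horizon_days)
    return raised >= redemption
```

If the check fails for the stressed redemption assumption, the shortfall is what drives the portfolio manager to hold more highly liquid securities – the Highly Liquid Minimum acting, as the text puts it, as the plug that makes the equation work.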

Quality of data 

BNY Mellon’s Fay agrees that measuring liquidity and formulating an approach to liquidity adjusted pricing depends on the quality of the data. 

Measuring liquidity, he says, can be very challenging, especially in fixed income markets where trading is infrequent and data is scarce. If a given municipal bond issue has only traded a handful of times in the past year, does that mean it’s illiquid? No, not necessarily. Something that might appear illiquid could actually be highly liquid. 

“In fact, it can be quite the opposite if the issue in question is a high quality bond with a high coupon geared towards buy-and-hold retirement investors that have every intention of holding that bond to maturity,” says Fay, noting that for equity markets, the task is somewhat easier given that securities are traded much more frequently and as such the data is more readily available. 

As such, measuring and assessing liquidity within particular markets is likely to require a quantitative approach based on sound and understandable assumptions. 

Asked when managers should choose to buy off-the-shelf models versus build their own proprietary models, Fay comments: “The more complex and diverse the instruments are that you hold, the more likely you might want to consider an external third party model that has the data behind it and uses a quantitative methodology to determine the liquidity of a position. If you have a simple equities portfolio where the data is readily available, then a simple proprietary model might suffice. But as you look at larger firms who trade a wider range of instruments and asset classes, one would probably want to consider more of a buy than build approach for measuring liquidity.” 

Unintended consequences 

One of the potential dangers of regulation such as Rule 22e-4 is that if a security genuinely becomes less liquid over time, there may be strong incentives for people not to hold it in aggregate. 

If everyone in the industry is using similar tools to estimate liquidity, and a security becomes evidently less liquid because of a market event, this standardised view across the buy-side could lead to a mass incentive to sell the security, simply because of its liquidity rating and liquidity adjusted price. That might have negative consequences on fund performance and hurt investors – the very ones the regulator is trying to protect. 

“It is a theoretical problem but one that concerns me slightly,” says Craddock. “Ending up with multiple holders with the same liquidity score for the same security… I’m not sure that’s a good thing.” 

Monitoring non-equities – how big a challenge?

To some extent, the SEC allows financial institutions a degree of flexibility in terms of determining how they measure and monitor liquidity risk. It’s ultimately a judgment call. What they will want to see evidence of, however, is a rigorous liquidity risk programme. 

“If the regulator asks how you measure liquidity risk and you respond, ‘I ask the portfolio manager to bucket things on his own book’, no matter what the asset class, that approach will no longer be acceptable,” asserts Casey. “The regulator will say you’ve got no control over the process, no quantitative measurement to check the classifications, etc. They would view that as a liquidity risk management programme with significant deficiencies.” 

If, however, one can articulate that, for more esoteric asset classes such as OTC instruments, there is a process adhered to for classification, that is the sort of benchmark they will be looking for. 

Casey says that one initiative Bloomberg is developing is a proxy system to help managers navigate different regulatory environments; this is especially true of MiFID II, where OTC instruments will suddenly need to be traded on venue, on so-called Organised Trading Facilities. 

“A derivative that is off-venue and truly OTC has very similar characteristics to the same derivative that is on venue. Therefore the liquidity profile of both instruments should, by proxy, be comparable. That kind of approach, using new technology, new systems, is something the regulators seem quite open about,” says Casey. 
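The proxy idea can be sketched as a lookup: an off-venue derivative inherits the observed liquidity bucket of the on-venue instrument that shares its economic characteristics. The keys, profiles and conservative fallback bucket here are all illustrative:

```python
# Liquidity profiles observed for on-venue instruments, keyed by
# (asset_class, underlying, tenor) -- keys and data are illustrative.
ON_VENUE_PROFILES = {
    ("irs", "EUR", "5Y"): "highly liquid",
    ("cds", "iTraxx Europe", "5Y"): "moderately liquid",
}

def proxy_bucket(asset_class: str, underlying: str, tenor: str) -> str:
    """Classify an off-venue OTC derivative by proxy: reuse the bucket of
    the on-venue instrument with the same economic characteristics,
    falling back to a conservative bucket when no proxy exists."""
    return ON_VENUE_PROFILES.get((asset_class, underlying, tenor), "less liquid")
```

The key design choice is the fallback: an instrument with no on-venue twin defaults to a conservative classification rather than inheriting an optimistic one.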

Regardless of the asset class, the daily assessment of liquidity could be helpful in terms of allowing the manager to build up a framework of what their liquidity looks like over time. 

Asset managers with a strong liquidity risk management framework in place who look beyond this as an exercise in satisfying the regulator are those who see it as an opportunity to build a comprehensive business that also satisfies shareholders, fund boards and senior management. 

Knowing how the liquidity for each fund has changed over time due to changes in the market is a powerful position to be in. 

“This is about building infrastructure to go a step further than merely meeting regulatory requirements and monitoring purposes. It’s also for taking appropriate action when breaches occur,” says Fay. 

Liquidity stress testing 

This overlaps with how one should best measure liquidity to cope with future market events. Measuring liquidity in any market is difficult, but substantially so during a stress event. It is, ultimately, only as good as the assumptions used behind it. 

Despite the current challenges and lack of clarity and guidance on how to stress a portfolio for liquidity, investment companies should conduct ongoing liquidity stress testing across different scenarios in order to properly assess and validate that a fund has sufficient liquidity now, as well as in times of a liquidity crunch. 

As referred to above, monitoring non-equities is quite complex given the scarcity of data. When these are included in a diverse portfolio containing multiple securities – and liquidity profiles – being able to stress the portfolio’s liquidity becomes that much harder, compared to a portfolio containing pure equities, for example. 

Fay says that a generic, cookie-cutter approach is likely destined for failure. 

“Using multiple scenarios with fund specific circumstances applied when applicable, is something that I would personally advocate for to get a more holistic view of liquidity in both normal and stressed market conditions,” he says. 

Don’t replicate the past 

Fay offers two suggestions on how best to stress test liquidity. Firstly, don’t try to replicate historical events. Trying to model something like the Taper Tantrum is difficult, as instruments may behave differently in the future than they did in a past event. 

“The other reason I’m against mimicking past events is that I don’t think it really fits the spirit of what the SEC means by ‘reasonably foreseeable stressed market conditions’.

“The Taper Tantrum is not a reasonably foreseeable stressed market condition to build out. History doesn’t so much repeat as rhyme. Will there be another event just like the Taper Tantrum, with the same conditions? I doubt it.

“We can learn from the past, but building out scenarios that replicate those same conditions exactly would be very challenging and of very little benefit, since the drivers behind the next liquidity strain are likely to play out differently across various asset classes.

“The dangers of a market-wide liquidity crunch are that it can happen very quickly and at a time when you need liquidity most. Liquidity risk stress testing is very different from market risk stress testing, as the latter tends to be tied to systematic drivers over time, such as declining GDP or overvalued markets. Many of the liquidity stressed events we’ve directly observed in recent years have their roots in a single day and the event that triggered them. The Flash Crash (US Treasuries) and, of course, the failure of the Third Avenue Focused Credit Fund (high yield) are just two examples,” says Fay. 

Redemption behaviour 

Secondly, look at what the redemption behaviour of a fund was during a historical stress event. Were you more or less sensitive to the market compared to other like-for-like funds in the industry? 

Ideally, based on the available redemption data, one should assume a realistic redemption rate and, on top of this, shock the portfolio using certain factors. These might be a widening of bid-ask spreads, an increase in volatility, a decrease in market depth, or a decrease in expected daily volume. 

“That should give you a forecast for what a given redemption level will do to the fund’s liquidity. In my view, a combination of the two – redemption rate and factors – to model liquidity in stressed markets, would be the best option,” concludes Fay. 
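Combining the two suggestions – a stressed redemption rate plus factor shocks – could look something like the sketch below, where a single haircut parameter stands in for the factor shocks (wider spreads, thinner depth reclassifying part of the liquid sleeve) and all numbers are illustrative:

```python
def post_redemption_liquid_weight(nav: float, highly_liquid: float,
                                  redemption_rate: float, haircut: float) -> float:
    """Highly-liquid weight after meeting a stressed redemption from the
    liquid sleeve.

    nav             -- fund net assets before the shock
    highly_liquid   -- value of the highly liquid sleeve
    redemption_rate -- stressed outflow as a fraction of NAV
    haircut         -- share of the liquid sleeve reclassified by the
                       factor shocks (illustrative stand-in)
    """
    outflow = nav * redemption_rate
    liquid_after_shock = highly_liquid * (1.0 - haircut)
    remaining_liquid = max(0.0, liquid_after_shock - outflow)
    return remaining_liquid / (nav - outflow)
```

For a USD100m fund with USD30m highly liquid, a 10 per cent stressed redemption and a 20 per cent haircut leave a post-shock highly liquid weight of roughly 15.6 per cent, which the manager can then compare against the fund’s HLIM.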

