Retail Banking

Liquidity Coverage Ratio: How analytics can help banks steer through financial compliance

The current regulatory environment in financial services is complex and evolving quickly. Firms are devoting ever more resources to governance, risk and regulatory controls. This shifting landscape, created by a steady stream of legislation, affects both banks and their customers in various ways, from altered products and services to increased scrutiny.

At the same time, the notion of “too big to fail” has occupied a central place in regulators’ minds since the events of 2007-08, with conduct risk issues generating a raft of enforcement actions and heavy fines from regulators around the world.

This is where analytics comes in, helping banks steer through this difficult regulatory environment. The increased adoption of technology, together with advanced simulation and predictive models, makes it possible to assess the impact of decisions even before the relevant data is available.

Increased regulatory supervision

As part of a wider effort to strengthen the supervision of liquidity risk, the Basel Committee has developed the Liquidity Coverage Ratio (LCR), the first of a series of measures aimed at promoting the short-term resilience of banks. The goal of the metric is to ensure that banks can withstand severe liquidity stresses – such as “bank runs” – over a period of 30 days. It is calculated as follows:

LCR = Stock of High Quality Liquid Assets / Total net cash outflows over the next 30 calendar days ≥ 100%

What this means is that, under the LCR, banks are required to hold at all times a stock of “high quality” liquid assets large enough to cover at least their net cash outflows over a 30-day period.
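
To make the arithmetic concrete, here is a minimal sketch of the calculation in Python. The figures are purely illustrative assumptions, and the function naming is my own, not part of any regulatory toolkit.

# A minimal sketch of the LCR calculation described above.
# The figures below are illustrative assumptions, not actual
# Basel III haircuts or outflow rates.

def liquidity_coverage_ratio(hqla: float, net_outflows_30d: float) -> float:
    """Return the LCR as a percentage: HQLA divided by total net
    cash outflows over the next 30 calendar days."""
    if net_outflows_30d <= 0:
        raise ValueError("Net cash outflows must be positive")
    return 100.0 * hqla / net_outflows_30d

# Illustrative figures (in millions)
hqla = 1_250.0            # stock of high quality liquid assets
net_outflows = 1_000.0    # expected net cash outflows over 30 days

lcr = liquidity_coverage_ratio(hqla, net_outflows)
print(f"LCR = {lcr:.0f}%")                         # LCR = 125%
print("Compliant" if lcr >= 100 else "Shortfall")  # Compliant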

And while this regulatory metric has been in effect since 2011, the 100% minimum has only been fully enforced since 1 October 2015. The savviest banks have prepared thoroughly, adjusting their strategic goals and aligning their planning with the changing environment. The wider consensus amongst treasurers, however, is that the targets will be easily achieved and the requirements met within the mandated deadlines. This has bred a complacency that might prove harmful to banks’ long-term profitability. So is it prudent to rest on our laurels?

Change is coming: how will banks adapt?

Six years of quantitative easing (QE) have certainly shaped how the market adapts and reacts. Central banks around the world have injected trillions into the market, expanding liquidity with the aim of stimulating growth and stability.

Added to this “regulatory noise” is the uncertainty around central banks having to start de-leveraging, which may happen either abruptly, in the form of a crash, or gradually, through what has been dubbed “tapering”. In either case, and even more so in the former, the main after-effect will be the erosion of the benign conditions created by QE. Liquidity will become much harder to access, and meeting LCR requirements, for example, could become an unforeseen barrier.

It therefore becomes evident that this recent spate of financial reform, set against a far less optimistic economic backdrop, could pose a confluence of perils for banks’ business. Established sources of revenue may be narrowed or even eliminated, while balance sheets and underlying assets will be revalued as the underlying calculations change.

Banks’ customers will be affected as well: core products offered to institutional, commercial and retail clients may have to be altered to fit the new liquidity requirements, while banks’ propensity to lend, and the collateral they demand for doing so, may tighten to an uneconomical degree. So how could predictive analytics help banks navigate this constantly changing regulatory environment?

The role of analytics in regulatory compliance

Firstly, when I talk about predictive analytics in this context I am referring to much more than its traditional applications. Segmenting customers, scoring their individual risk and predicting their behaviour are certainly a salient part of banks’ operations, but what I am suggesting here is making analytics an integral part of strategic decision making: using data to break down the problem at hand, evaluate decision levers and potential actions, and articulate end goals.

This would allow the creation of innovative products and pricing that can revamp revenue flows. Decision models can help the bank simulate how a new product would perform in the market under a whole range of scenarios, and whether current product lines would be affected or cannibalised in any way. The best part is that all of this can be done without waiting for the right data to come in, as the sketch below illustrates.
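
Here is a hedged sketch of such a decision model: a Monte Carlo simulation of a hypothetical new deposit product under randomly drawn stress scenarios, run before any real market data exists. Every parameter and distribution below (the run-off rates, the outflow shock, the share of new funding assumed to be investable as liquid assets) is illustrative, not calibrated to a real balance sheet.

# A sketch of scenario simulation for a hypothetical new deposit
# product: how often does the bank stay LCR-compliant after launch?
# All figures and distributions are illustrative assumptions.
import random

random.seed(42)  # reproducible scenarios

HQLA = 1_250.0           # current stock of HQLA (millions)
BASE_OUTFLOWS = 1_000.0  # current 30-day net cash outflows (millions)

def compliance_probability(new_deposits: float,
                           n_scenarios: int = 10_000) -> float:
    """Share of stress scenarios in which the LCR stays at or above
    100% after launching a product that adds `new_deposits` of funding."""
    compliant = 0
    for _ in range(n_scenarios):
        # Draw a stressed multiplier on existing outflows and a
        # run-off rate for the new deposits from assumed ranges
        shock = random.uniform(0.9, 1.3)
        runoff = random.uniform(0.03, 0.40)
        outflows = BASE_OUTFLOWS * shock + new_deposits * runoff
        # Assume a quarter of the new funding can be held as HQLA
        lcr = 100.0 * (HQLA + 0.25 * new_deposits) / outflows
        if lcr >= 100.0:
            compliant += 1
    return compliant / n_scenarios

print(f"Compliance probability: {compliance_probability(400.0):.1%}")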

The second implication is a complete reversal of the usual decision flow. Instead of determining which customers have the highest propensity to buy an existing product, banks could develop products tailored to specific segments of their customer base, as the illustration below shows.
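
As a toy illustration of this segment-first approach, the sketch below clusters a handful of hypothetical customers by balance, transaction frequency and tenure using scikit-learn’s KMeans; every feature and figure is invented for the example. The resulting segments would then drive product design, rather than products being pushed at pre-scored customers.

# Segment-first product design: cluster customers, then read off
# what each segment implies for product features.
# All customer data here is hypothetical.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical features: [avg balance, monthly txns, tenure in years]
customers = np.array([
    [12_000, 4, 10],
    [800, 45, 2],
    [50_000, 2, 15],
    [1_200, 38, 3],
    [9_500, 6, 8],
    [47_000, 3, 12],
])

segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(customers)
for seg in range(3):
    members = customers[segments == seg]
    print(f"Segment {seg}: {len(members)} customers, "
          f"mean balance {members[:, 0].mean():,.0f}")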

Regulatory scrutiny is certainly taking a toll on banks’ profitability, and complying with the new measures is becoming increasingly difficult. Differentiation and adroitness therefore become prerequisites for keeping up. Analytics can support both!

 

About the author

Thanos Terzidis
Thanos is an associate consultant in the Customer Experience & Analytics practice of Capgemini Consulting UK. His previous experience revolves around the financial services sector, with a specific interest in analytics and modelling.
