

IFRS 9 and Model Validation

Key Project Challenges - View from the Field

You may be looking at IFRS 9 with concern. There is a great deal to do, and many of the techniques being used are at best unproven. Model validation itself is a fairly new discipline for many firms, which has exacerbated the challenges companies are likely to face. Add to this the recurrent and emerging data challenges, and the scarcity of the multidisciplinary teams needed, and the difficulty of the project becomes clear. Here is a brief perspective from those in the field, offered to help as you approach your own project.

 

I. The IFRS9 Challenges

The key challenge is the move from the IAS 39 incurred loss model to the IFRS 9 expected loss model for provisioning purposes. This new expected loss assessment is intended to be forward-looking and is accordingly both subjective and increasingly reliant upon models.

Data is at the heart of the issue, and the challenges faced by firms can be major. How good is your historic data? To what extent have you captured the attributes that you now need to consider? How will you apply data modelling to low-default portfolios, including new business activities? How will you then factor in the lifetime loss calculation to arrive at the forward-looking provision? Are there solutions out there to help you? Let us explore some of these themes.

 

II.  The Link to the ICAAP

Another challenge is the extent to which banks can build upon the internal ratings-based (IRB) approach set out by the Bank for International Settlements (BIS) to assist with this process. While few firms use IRB for capital calculation, relying instead upon the standardised approach, many incorporate measures from the IRB process into both their Internal Capital Adequacy Assessment Process (ICAAP) and their internally used risk measures, including the calculation of risk appetite.

However, the IRB calculation under Basel 2 combined expected Exposure at Default (EAD), Loss Given Default (LGD) and Probability of Default (PD) into a capital calculation designed to achieve the soundness standard (a 99.9% confidence level over a one-year horizon). While firms could vary the capital calculation to take account of changing circumstances, there was no incentive for them to do so; accordingly, in practice few firms varied it.
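The three IRB inputs just listed combine multiplicatively into an expected loss figure. A minimal sketch, using invented example values rather than any firm's data:

```python
# Illustrative sketch of the IRB expected-loss building blocks.
# All figures below are hypothetical examples, not real portfolio data.

def expected_loss(pd_12m: float, lgd: float, ead: float) -> float:
    """12-month expected loss as the product of the three IRB inputs."""
    return pd_12m * lgd * ead

# A 2% one-year default probability and 45% loss given default on an
# exposure of 1,000,000 give an expected loss of 9,000.
el = expected_loss(pd_12m=0.02, lgd=0.45, ead=1_000_000)
print(round(el, 2))  # 9000.0
```

The unexpected-loss (capital) component sits on top of this expected loss; it is the expected-loss piece that IFRS 9 provisioning builds upon.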

Further, while the initial calculation of EAD, PD and LGD was fairly robust, management were concerned that bringing in softer and less certain factors, such as lifetime economic inputs, might undermine the quality of the data.

Here is a thought: if you are providing a loan with a tenor of perhaps 25 years, the pricing would always have needed to consider the lifetime loss experience rather than a shorter period. That many firms failed to include such matters in their pricing calculations and models was always surprising.
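To see why the lifetime view matters so much at a 25-year tenor, consider one deliberately simple illustration (an expository assumption, not the standard's prescribed method): compounding a constant annual PD over the term, ignoring rating migration and prepayment.

```python
# Sketch: extending a one-year PD to a lifetime view under the simplifying
# assumption of a constant annual default rate (no migration, no prepayment).

def cumulative_pd(annual_pd: float, years: int) -> float:
    """Probability of defaulting at least once over `years`."""
    return 1.0 - (1.0 - annual_pd) ** years

# A 2% annual PD compounds to roughly 40% over a 25-year tenor, which is
# why a lifetime provision can dwarf the 12-month figure.
print(round(cumulative_pd(0.02, 25), 3))  # 0.397
```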

If a firm has a robust IRB model, then this can be the starting point for the IFRS 9 forward-looking loss assessment, although the IRB model itself will need to be subject to independent validation.

 

III.  The IFRS 9 Project

The IFRS 9 project is really a major data project. Starting with the 12-month expected credit loss calculation, and then using that data to predict the expected credit losses resulting from all possible defaults over the lifetime of the financial instrument, requires you to know your data in detail. Identifying the attributes which meaningfully differentiate risk, so as to define the pools on which analysis is conducted, is clearly the first stage. The problem for firms is that they are unlikely to have designed their systems to enable such information to be easily obtained. This data cleansing and scaling exercise comes first, and recognise that it will need to be assessed by both internal audit and the validation team.

The question that then arises is whether credit risk has increased significantly since initial recognition. If it has, the firm needs to calculate lifetime expected credit losses, and this would include any assets that contain a significant financing component. Indeed, for simplicity many firms are electing to adopt an accounting policy of recognising lifetime expected losses on all relevant assets regardless of the financing requirements.
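The staging decision just described, a 12-month expected credit loss unless credit risk has increased significantly since initial recognition, can be sketched as follows. The PD-ratio trigger used here is a hypothetical policy choice, and all figures are invented.

```python
# Hypothetical IFRS 9 staging sketch: lifetime ECL applies once credit
# risk has increased significantly since initial recognition, here proxied
# by the current 12-month PD reaching a multiple of the origination PD
# (the multiple is an assumed policy choice, not a rule in the standard).

def ecl(pd_12m, pd_lifetime, pd_at_origination, lgd, ead, sicr_ratio=2.0):
    significant_increase = pd_12m >= sicr_ratio * pd_at_origination
    pd_used = pd_lifetime if significant_increase else pd_12m
    return pd_used * lgd * ead

# Stage 1: current PD close to origination PD, so 12-month ECL applies.
print(round(ecl(0.01, 0.05, 0.009, 0.4, 100_000), 2))  # 400.0
# Stage 2: PD has trebled since origination, so lifetime ECL applies.
print(round(ecl(0.03, 0.12, 0.009, 0.4, 100_000), 2))  # 4800.0
```

The jump between the two stages is what makes the staging boundary, and the data supporting it, so sensitive a part of the project.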

Obtaining and analysing this data to identify the attributes, and to appreciate how foreseeable events impact the analysis over the asset's life cycle, is a major challenge for modellers, and consequently for data validators. Justifying the analysis to senior management is a further challenge that many teams need to face. If the methodology adopted relies too much upon unproven modelling approaches, then the level of certainty required by governance teams and external auditors is unlikely to be achieved.

In practice a combination of techniques is likely to be utilised, identifying the key attributes of greatest significance and the events likely to distort the outcomes of the credit appraisal analysis.

 

IV.  The Validation Exercise

None of the regulations specifies what model validation is, although they do say what it is not. Clearly two bad models do not make a good model, so validation needs to be planned carefully to ensure that all relevant matters are considered.

Our experience tends to show that firms do not start by considering all the models available and then justifying the one being used, although that would be the sounder approach. Most firms start from the model they have already developed and try to prove that it achieves its objectives. In the case of an IFRS 9 credit risk model, this review would seek to establish that the model outputs at least appear to be both realistic and sensible. Indeed, there is always a need for a human review of model outputs to ensure that the information resulting from use of the model is not misleading or unhelpful.

Having decided upon the model style to be used, the next stage is to identify the pools of assets to be modelled.

This requires assessment of the attributes and the extent to which they lead to changes in default data sets. The team that built the model will have needed to undertake this work before the model was used, so there should be analysis available to review. However, much of that analysis will have been directed towards the previous rules and focused on loss within the one-year view. There is a major challenge to be addressed in taking this data to life cycle credit losses.
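One common way to extend one-year default data towards a life cycle view is to compound a one-year rating transition matrix over the horizon. This is a simplification (it assumes transitions are Markovian and the same every year), and the matrix below is entirely hypothetical.

```python
# Hypothetical sketch: a one-year transition matrix compounded to a
# multi-year horizon. States are Good, Watch and Default (absorbing);
# each row sums to 1. Figures are invented for illustration only.

def mat_mul(a, b):
    """Multiply two square matrices given as lists of lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

one_year = [
    [0.90, 0.08, 0.02],   # Good
    [0.20, 0.70, 0.10],   # Watch
    [0.00, 0.00, 1.00],   # Default (absorbing)
]

horizon = one_year
for _ in range(4):                     # compound to a 5-year horizon
    horizon = mat_mul(horizon, one_year)

# Cumulative 5-year PD for an exposure starting in "Good":
print(round(horizon[0][2], 3))  # 0.138
```

The validation team would want to test the Markov and time-homogeneity assumptions against the observed migration history, not merely the arithmetic.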

Reviewing previous pre- and post-implementation validation exercises is always informative, although it cannot provide substantial assurance to the validation team. That is because the world changes so quickly, and previous validation exercises will have been based on historic economic environments.

Equally, just because a previous validation exercise failed to address the issues adequately does not mean that your subsequent exercise will also be flawed.

 

V.   Reviewing Detailed Modelling

The next challenge is the quality of the data analysis conducted. Too frequently firms fail to address adequately both scaling and the changing economic environment, and at its core this is a major issue. Models are at their best when you are in the middle of a trend, and that is not the case at present. Not only do we have market uncertainties, as demonstrated by the yield curve and the market volatilities we are seeing, but the way people work and buy is also changing. Whether it is the increased focus on the environment and carbon emissions, or the change in purchasing patterns driven by on-line vendors, major changes are under way. These can be identified in data sets, but doing so requires different types of analysis. The effect is that the data needs to be amended to enable it to support the life cycle analysis required by IFRS 9.

The validation team will need to consider the economic inputs to the models and first assess their adequacy. We find that the inputs are often too simplistic and fail to capture properly the way that risks move together globally, whether positively or negatively. It is important to establish how reliable the economic inputs to the model are. This typically means assessing the inputs to the economic model and whether the model has been adequately backtested, recognising the inherent limitations of such analysis as set out earlier.
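In its simplest form, the backtest referred to above compares the model's predicted default rates with what was subsequently realised and flags periods where the gap breaches a tolerance. The data and tolerance below are invented for illustration.

```python
# Illustrative back-test sketch with hypothetical data: flag the periods
# where the model's predicted default rate misses the realised rate by
# more than a tolerance the validation team has set in advance.

predicted = [0.020, 0.022, 0.025, 0.030]   # model PDs, one per year
realised  = [0.019, 0.028, 0.024, 0.041]   # observed default rates

def backtest(pred, real, tolerance=0.005):
    """Return the indices of periods breaching the tolerance."""
    return [i for i, (p, r) in enumerate(zip(pred, real))
            if abs(p - r) > tolerance]

print(backtest(predicted, realised))   # years 1 and 3 breach: [1, 3]
```

A real exercise would also ask whether the breaches cluster in particular economic conditions, which is where the global co-movement of risks mentioned above becomes visible.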

Validation exercises essentially use attribute sampling techniques to meet the requirement for substantive tests of detail. In practice this stage is normally preceded by a walkthrough test, which ensures that the process is properly understood and that flaws can be promptly identified. The substantive test of detail then follows, to achieve the positive assurance necessary for reporting. Recognise that this is also where out-of-sample testing needs to be addressed, together with the impact of any human intervention.
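The out-of-sample idea just mentioned is simply to hold back part of the history when fitting and to measure error only on the held-back part. The sketch below uses a deliberately trivial "model" (the in-sample mean default rate) and invented data purely to show the split; real models are far richer.

```python
# Out-of-sample sketch with hypothetical data: fit on the early history,
# assess on the held-back later history only. The "model" here is just
# the in-sample mean default rate, used purely to illustrate the split.

default_rates = [0.021, 0.019, 0.024, 0.022, 0.035, 0.031]

split = 4
train, test_set = default_rates[:split], default_rates[split:]

fitted = sum(train) / len(train)              # fit on the first four years
errors = [abs(r - fitted) for r in test_set]  # judge on the last two only

print(round(fitted, 4))                  # 0.0215
print([round(e, 4) for e in errors])     # [0.0135, 0.0095]
```

The large held-back errors relative to the in-sample fit are exactly the kind of signal that out-of-sample testing exists to surface.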

Frequently there is an exercise to ensure that stress testing and scenario modelling have been properly factored into the analysis. This is needed to consider the range of potential outcomes and their likely impact on the IFRS 9 analysis.
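A common way to reflect that range of potential outcomes is to probability-weight ECLs computed under base, upside and downside economic scenarios. The weights and scenario ECLs below are invented for illustration.

```python
# Probability-weighted scenario ECL sketch. Scenario weights and the
# per-scenario ECL figures are hypothetical examples only.

scenarios = [
    ("base",     0.60, 12_000),   # (name, weight, scenario ECL)
    ("upside",   0.15,  7_000),
    ("downside", 0.25, 30_000),
]

weighted_ecl = sum(weight * scenario_ecl
                   for _, weight, scenario_ecl in scenarios)
print(round(weighted_ecl, 2))   # 15750.0
```

Note how the downside scenario, despite a 25% weight, dominates the result; the validation team should test the sensitivity of the provision to both the weights and the scenario definitions.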

Reconciliations to data included within other reports including ICAAP analysis will need to be conducted.

Many banks appear to think that validation is purely a check that the calculation has been properly conducted. It is much more than that, and consequently can take much longer than expected.

Something as simple as Loss Given Default can be a problem. The analysis will need to consider whether, over the life cycle, the costs of disposal are likely to change, as well as the market in which the asset would be realised. This could take the economic modelling into an entirely different phase. It is never easy to assess the property market in general, and local considerations can materially change the way the market operates. The model will need sufficient granularity to pick this up.
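The sensitivity of LGD to realisation conditions can be sketched with a simple collateral-based recovery calculation. The haircut and disposal-cost figures below are invented; a real model would derive them from local market data over the asset's life.

```python
# Hypothetical LGD sketch: recovery depends on collateral value at the
# time of realisation, a market-condition haircut and costs of disposal,
# all of which can drift over the asset's life. Figures are invented.

def lgd(ead, collateral_value, haircut, disposal_cost_rate):
    """Loss given default after haircut and disposal costs, floored at 0."""
    recovery = collateral_value * (1 - haircut)
    recovery -= recovery * disposal_cost_rate      # costs of disposal
    return max(0.0, 1.0 - min(recovery, ead) / ead)

# Early in the life: strong market, cheap disposal, fully covered.
print(round(lgd(100_000, 120_000, 0.10, 0.05), 3))  # 0.0
# Late in the life: weaker local market, dearer disposal.
print(round(lgd(100_000, 120_000, 0.30, 0.12), 3))  # 0.261
```

The same collateral covers the exposure entirely in one set of conditions and leaves a material loss in the other, which is the granularity point made above.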

 

VI.  Use of Software

Should there be a requirement for data validation software? Generally not. However, if the firm does not currently own a credit portfolio modelling solution, then one may need to be acquired. Often the firm does have a suitable solution available, but it is currently being used for another purpose; a review to identify such existing tools should be undertaken before any acquisition. Software of this kind has already been acquired, tested, populated and implemented, so its use can speed up the development of modelling solutions. In the absence of such a solution, our concern would be that senior management may not have been able to model their portfolio properly for years, and the difficulties now faced by the validation team only mirror those that management may have faced.

Expensive solutions cannot in themselves solve, or mask, a data problem. If the issue is data, then it is data that needs to be addressed. Issues such as low-default portfolios and new products, together with the use and development of synthetic data, will always be a challenge.

 

VII. Conclusion

Model validation is a fairly new discipline, challenged by the availability and quality of data and by a paucity of independent, skilled multidisciplinary teams.

The need for combined knowledge of banking, modelling, finance, regulation and economics leaves the available talent in the industry particularly threadbare. Mathematicians and modellers generally prefer building models to validating them, which further reduces the talent pool. That model validation has become such a major challenge so quickly increases the need to properly train new generations of model validation specialists. The problem is that firms hold their data in different ways, and a solution that works effectively for one firm might be completely inappropriate for another. Each of our recent and on-going projects, whether for a bank, an auditor or a regulator, affords a unique challenge, approach and perspective.

Let us know if our experience can help you.

 

Dennis Cox

Chief Executive

riskupdate