Quantitative Risk Modeling and Management—New Regulatory Challenges

A special issue of Risks (ISSN 2227-9091).

Deadline for manuscript submissions: closed (20 November 2021) | Viewed by 10729

Special Issue Editor


Prof. Dr. Jiří Witzany
Guest Editor
Department of Banking and Insurance, Prague University of Economics and Business, Winstona Churchill Sq. 4, 130 67 Prague 3, Czech Republic
Interests: credit risk management and modeling; financial derivatives; volatility estimation; Bayesian methods (MCMC, particle filters)

Special Issue Information

Dear Colleagues,

A number of new regulatory documents have brought new challenges to the theory and practice of quantitative risk modeling and management. For example, according to IFRS 9, banks need to estimate not only the traditional one-year probabilities of default, but also lifetime expected credit losses conditional on various future macroeconomic scenarios. According to IFRS 13, institutions dealing with financial derivatives must account for counterparty credit risk using the concept of credit valuation adjustment (CVA), and, in addition, according to Basel III, the CVA probability distribution and CVA VaR should be modelled as well. The new Basel minimum capital requirements for market risk, coming into effect in 2022, introduce Expected Shortfall and liquidity modeling into the market risk capital calculations, among other changes. For this issue, we cordially invite researchers to share their results on quantitative modeling and risk management, in particular results that respond to these regulatory challenges with new methodology and empirical research. (A schematic numerical sketch of a scenario-weighted lifetime expected credit loss calculation, one of the requirements mentioned above, follows this letter.)

Prof. Dr. Jiří Witzany
Guest Editor
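
For readers new to the IFRS 9 requirement mentioned in the letter, the following minimal sketch illustrates how a lifetime expected credit loss can be probability-weighted across macroeconomic scenarios. All numbers (marginal PDs, LGDs, scenario weights, discount rate) are illustrative assumptions and do not come from any of the papers in this issue.

```python
# A minimal sketch (not from any paper in this issue): lifetime expected credit
# loss (ECL) probability-weighted over macroeconomic scenarios, in the spirit of
# IFRS 9. All inputs below are illustrative assumptions.
import numpy as np

def lifetime_ecl(marginal_pd, lgd, ead, discount_rate):
    """Discounted lifetime ECL for a single scenario.

    marginal_pd   -- per-period (marginal) default probabilities
    lgd           -- loss given default, as a fraction of EAD
    ead           -- exposure at default
    discount_rate -- per-period discount rate
    """
    t = np.arange(1, len(marginal_pd) + 1)
    discount = (1.0 + discount_rate) ** -t
    return float(np.sum(marginal_pd * lgd * ead * discount))

# Three illustrative scenarios with probability weights summing to one.
scenarios = {
    "baseline": {"pd": np.array([0.020, 0.018, 0.015]), "lgd": 0.45, "weight": 0.6},
    "upside":   {"pd": np.array([0.015, 0.012, 0.010]), "lgd": 0.40, "weight": 0.2},
    "downside": {"pd": np.array([0.035, 0.030, 0.028]), "lgd": 0.55, "weight": 0.2},
}

ead, rate = 1_000_000.0, 0.03
ecl = sum(s["weight"] * lifetime_ecl(s["pd"], s["lgd"], ead, rate)
          for s in scenarios.values())
print(f"Probability-weighted lifetime ECL: {ecl:,.0f}")
```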

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Risks is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Basel regulation
  • Expected credit loss (ECL)
  • Credit risk stress testing
  • Credit valuation adjustment
  • Derivative valuation
  • Volatility modeling
  • Quantitative trading

Published Papers (3 papers)


Research

22 pages, 5695 KiB  
Article
Development of an Impairment Point in Time Probability of Default Model for Revolving Retail Credit Products: South African Case Study
by Douw Gerbrand Breed, Niel van Jaarsveld, Carsten Gerken, Tanja Verster and Helgard Raubenheimer
Risks 2021, 9(11), 208; https://doi.org/10.3390/risks9110208 - 15 Nov 2021
Cited by 1 | Viewed by 3853
Abstract
A new methodology to derive IFRS 9 PiT PDs is proposed. The methodology first derives a PiT term structure with accompanying segmented term structures. Secondly, the calibration of credit scores using the Lorenz curve approach is used to create account-specific PD term structures. The PiT term structures are derived by using empirical information based on the most recent default information and account risk characteristics prior to default. Different PiT PD term structures are developed to capture the structurally different default risk patterns for different pools of accounts using segmentation. To quantify what a materially different term structure constitutes, three tests are proposed. Account-specific PiT PDs are derived through the Lorenz curve calibration using the latest default experience and credit scores. The proposed methodology is illustrated on an actual dataset, using a revolving retail credit portfolio from a South African bank. The main advantages of the proposed methodology include the use of well-understood methods (e.g., Lorenz curve calibration, scorecards, term structure modelling) in the banking industry. Further, the inclusion of re-default events in the proposed IFRS 9 PD methodology will simplify the development of the accompanying IFRS 9 LGD model due to the reduced complexity for the modelling of cure cases. Moreover, attrition effects are naturally included in the PD term structures and no longer require a separate model. Lastly, the PD term structure is based on months since observation, and therefore the arrears cycle could be investigated as a possible segmentation. Full article
(This article belongs to the Special Issue Quantitative Risk Modeling and Management—New Regulatory Challenges)
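
As a rough illustration of the construction described in the abstract above, the sketch below derives an empirical point-in-time PD term structure per segment and scales it to account level. The column names, toy data, and the simple proportional scaling used here in place of the authors' Lorenz-curve calibration are assumptions for illustration only, not the authors' implementation.

```python
# A simplified sketch: an empirical point-in-time (PiT) PD term structure per
# segment, scaled to account level. Column names, toy data, and the proportional
# scaling (standing in for the Lorenz-curve calibration) are assumptions.
import pandas as pd

def pit_term_structure(obs: pd.DataFrame) -> pd.Series:
    """Marginal default rate by months since observation, per segment.

    Assumed columns: 'segment', 'months_since_obs', 'defaulted' (0/1).
    """
    return (obs.groupby(["segment", "months_since_obs"])["defaulted"]
               .mean()
               .rename("marginal_pd"))

def account_pd_curve(term: pd.Series, segment: str, pd_12m: float) -> pd.Series:
    """Scale a segment curve so its first 12 months sum to the account's
    calibrated 12-month PD -- a crude proxy for score-based calibration."""
    seg_curve = term.loc[segment]
    return seg_curve * (pd_12m / seg_curve.loc[:12].sum())

# Toy observation panel for two segments of a revolving portfolio.
obs = pd.DataFrame({
    "segment": ["low_risk"] * 6 + ["high_risk"] * 6,
    "months_since_obs": [1, 2, 3, 1, 2, 3] * 2,
    "defaulted": [0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0],
})
term = pit_term_structure(obs)
print(account_pd_curve(term, "low_risk", pd_12m=0.05))
```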

26 pages, 1912 KiB  
Article
Using Model Performance to Assess the Representativeness of Data for Model Development and Calibration in Financial Institutions
by Chamay Kruger, Willem Daniel Schutte and Tanja Verster
Risks 2021, 9(11), 204; https://doi.org/10.3390/risks9110204 - 10 Nov 2021
Cited by 3 | Viewed by 2975
Abstract
This paper proposes a methodology that utilises model performance as a metric to assess the representativeness of external or pooled data when it is used by banks in regulatory model development and calibration. There is currently no formal methodology to assess representativeness. The paper provides a review of existing regulatory literature on the requirements of assessing representativeness and emphasises that both qualitative and quantitative aspects need to be considered. We present a novel methodology, apply it to two case studies, and compare it with the Multivariate Prediction Accuracy Index. The first case study investigates whether a pooled data source from Global Credit Data (GCD) is representative when considering the enrichment of internal data with pooled data in the development of a regulatory loss given default (LGD) model. The second case study differs from the first by illustrating which other countries in the pooled data set could be representative when enriching internal data during the development of an LGD model. Using these case studies as examples, our proposed methodology provides users with a generalised framework to identify subsets of the external data that are representative of their country's or bank's data, making the results general and universally applicable. Full article
(This article belongs to the Special Issue Quantitative Risk Modeling and Management—New Regulatory Challenges)
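
A minimal sketch of the general idea follows, under the assumption that representativeness is judged by comparing the discrimination (AUC) of a model fitted on external data with that of a model fitted on internal data, both evaluated on the internal data. The model choice, synthetic data, and interpretation of the gap are illustrative assumptions, not the authors' methodology.

```python
# Hedged sketch: assessing whether external (pooled) data are representative of
# internal data by comparing model performance. All specifics are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def performance_gap(X_ext, y_ext, X_int, y_int) -> float:
    """AUC of an internally fitted model minus AUC of an externally fitted
    model, both evaluated on the internal data."""
    ext_model = LogisticRegression(max_iter=1000).fit(X_ext, y_ext)
    int_model = LogisticRegression(max_iter=1000).fit(X_int, y_int)
    auc_ext = roc_auc_score(y_int, ext_model.predict_proba(X_int)[:, 1])
    auc_int = roc_auc_score(y_int, int_model.predict_proba(X_int)[:, 1])
    return auc_int - auc_ext

# Toy example with synthetic data: a small gap suggests the external pool is
# usable for development/calibration, a large gap suggests it is not.
rng = np.random.default_rng(0)
X_int = rng.normal(size=(500, 3))
y_int = (X_int[:, 0] + rng.normal(size=500) > 0).astype(int)
X_ext = rng.normal(size=(500, 3))
y_ext = (X_ext[:, 0] + rng.normal(size=500) > 0).astype(int)
print(f"AUC gap (internal - external): {performance_gap(X_ext, y_ext, X_int, y_int):.3f}")
```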

17 pages, 571 KiB  
Article
Adapting the Default Weighted Survival Analysis Modelling Approach to Model IFRS 9 LGD
by Morne Joubert, Tanja Verster, Helgard Raubenheimer and Willem D. Schutte
Risks 2021, 9(6), 103; https://doi.org/10.3390/risks9060103 - 1 Jun 2021
Cited by 4 | Viewed by 3232
Abstract
Survival analysis is one of the techniques that could be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When using survival analysis to model LGD, a proposed methodology is the default weighted survival analysis (DWSA) method. This paper is aimed at adapting the DWSA method (used to model Basel LGD) to estimate the LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting and negative cashflows. For IFRS 9, this methodology should be adapted, as the estimated LGD is used in the calculation of the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology makes use of survival analysis to estimate the LGD. The Cox proportional hazards model allows for a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios and the ECL is calculated for each scenario. These ECL values are probability weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank. Full article
(This article belongs to the Special Issue Quantitative Risk Modeling and Management—New Regulatory Challenges)
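
The sketch below shows the survival-analysis building block referred to in the abstract above: a Cox proportional hazards model whose fitted baseline survival curve is shifted per segment. The lifelines library, the column names, and the toy workout data are assumptions for illustration; the paper's mapping from survival curves to LGD and scenario-weighted ECL is not reproduced here.

```python
# Hedged sketch: segment-specific survival curves from a Cox proportional
# hazards fit. Library choice, column names, and data are illustrative only.
import pandas as pd
from lifelines import CoxPHFitter

# Toy workout data: months until resolution (cure/write-off), an event flag,
# and one segmentation covariate.
df = pd.DataFrame({
    "months_to_resolution": [6, 12, 18, 24, 9, 15, 30, 7, 20, 11],
    "resolved":             [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],
    "secured":              [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_resolution", event_col="resolved")

# Survival curves per segment, derived from the fitted baseline; in the paper
# these feed forward-looking LGDs, which are then ECL-weighted per scenario.
segments = pd.DataFrame({"secured": [0, 1]})
survival = cph.predict_survival_function(segments)
print(survival.head())
```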
