Risks, Volume 6, Issue 2 (June 2018) – 39 articles

Cover Story (view full-size image): We have designed a stochastic simulation machine that allows the user to generate a synthetic insurance portfolio of individual claims histories. This simulation machine is based on neural networks to incorporate individual claims feature information and is calibrated using real non-life insurance data. The simulated individual claims histories enable the user to back-test classical aggregate claims reserving methods—such as the chain-ladder method—as well as to develop new claims reserving methods which are based on individual claims histories. This simulation machine may provide a common ground and publicly available (synthetic) data for research in the field of claims reserving. View this paper.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF format, click the "PDF Full-text" link and use the free Adobe Reader to open it.
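As an illustration of the chain-ladder method mentioned in the cover story, the following minimal Python sketch completes a cumulative run-off triangle with volume-weighted development factors. The 3×3 triangle and all figures are invented for illustration only:

```python
# Minimal chain-ladder sketch on an invented 3x3 cumulative run-off triangle.
# Rows = accident years, columns = development years; None marks future cells.
triangle = [
    [100.0, 150.0, 165.0],
    [110.0, 160.0, None],
    [120.0, None,  None],
]

def development_factors(tri):
    """Volume-weighted link ratios f_j = sum_i C_{i,j+1} / sum_i C_{i,j}."""
    n = len(tri)
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in tri if row[j + 1] is not None)
        den = sum(row[j] for row in tri if row[j + 1] is not None)
        factors.append(num / den)
    return factors

def complete_triangle(tri):
    """Project the future cumulative claims with the chain-ladder factors."""
    f = development_factors(tri)
    full = [row[:] for row in tri]
    for row in full:
        for j in range(len(row) - 1):
            if row[j + 1] is None:
                row[j + 1] = row[j] * f[j]
    return full

full = complete_triangle(triangle)
```

Back-testing on simulated individual claims data then amounts to comparing such projections against the known simulated ultimates.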
26 pages, 1086 KiB  
Article
A Least-Squares Monte Carlo Framework in Proxy Modeling of Life Insurance Companies
by Anne-Sophie Krah, Zoran Nikolić and Ralf Korn
Risks 2018, 6(2), 62; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020062 - 11 Jun 2018
Cited by 21 | Viewed by 7586
Abstract
The Solvency II directive asks insurance companies to derive their solvency capital requirement from the full loss distribution over the coming year. While this is in general computationally infeasible in the life insurance business, an application of the Least-Squares Monte Carlo (LSMC) method offers a possibility to overcome this computational challenge. We outline in detail the challenges a life insurer faces, the theoretical basis of the LSMC method and the necessary steps on the way to a reliable proxy modeling in the life insurance business. Further, we illustrate the advantages of the LSMC approach via presenting (slightly disguised) real-world applications. Full article
(This article belongs to the Special Issue Capital Requirement Evaluation under Solvency II framework)
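The core of the LSMC idea, replacing costly nested inner valuations with a least-squares polynomial proxy fitted across outer scenarios, can be sketched as follows. This is a toy version only: the single risk factor, the quadratic "true" loss function, and the noise level standing in for inner Monte Carlo error are all invented:

```python
import random

def polyfit_lsmc(xs, ys, degree):
    """Least-squares polynomial proxy: solve the normal equations X^T X b = X^T y."""
    m = degree + 1
    # Build the normal-equation matrix and right-hand side.
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
    return coef  # coef[k] multiplies x**k

random.seed(1)
# Invented example: the true loss is quadratic in the risk factor, observed
# with Monte Carlo noise at each outer scenario (the "inner" valuation).
xs = [i / 50.0 - 1.0 for i in range(101)]
ys = [2.0 + 0.5 * x + 3.0 * x ** 2 + random.gauss(0.0, 0.1) for x in xs]
proxy = polyfit_lsmc(xs, ys, degree=2)
```

The fitted proxy can then be evaluated cheaply at any scenario, which is what makes the full loss distribution tractable for a life insurer.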
28 pages, 2351 KiB  
Article
On Exactitude in Financial Regulation: Value-at-Risk, Expected Shortfall, and Expectiles
by James Ming Chen
Risks 2018, 6(2), 61; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020061 - 01 Jun 2018
Cited by 42 | Viewed by 10709
Abstract
This article reviews two leading measures of financial risk and an emerging alternative. Embraced by the Basel accords, value-at-risk and expected shortfall are the leading measures of financial risk. Expectiles offset the weaknesses of value-at-risk (VaR) and expected shortfall. Indeed, expectiles are the only elicitable law-invariant coherent risk measures. After reviewing practical concerns involving backtesting and robustness, this article more closely examines regulatory applications of expectiles. Expectiles are most readily evaluated as a special class of quantiles. For ease of regulatory implementation, expectiles can be defined exclusively in terms of VaR, expected shortfall, and the thresholds at which those competing risk measures are enforced. Moreover, expectiles are in harmony with gain/loss ratios in financial risk management. Expectiles may address some of the flaws in VaR and expected shortfall—subject to the reservation that no risk measure can achieve exactitude in regulation. Full article
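For readers unfamiliar with expectiles: the τ-expectile of a sample minimizes an asymmetrically weighted squared loss and can be computed by a simple fixed-point iteration (an asymmetrically weighted mean). A minimal sketch on invented data:

```python
def expectile(sample, tau, tol=1e-10):
    """Sample tau-expectile via the weighted-mean fixed point:
    e = sum(w_i x_i) / sum(w_i), with w_i = tau if x_i > e else 1 - tau.
    At tau = 0.5 this is just the sample mean."""
    e = sum(sample) / len(sample)  # start at the mean (the 0.5-expectile)
    while True:
        w = [tau if x > e else 1.0 - tau for x in sample]
        e_new = sum(wi * xi for wi, xi in zip(w, sample)) / sum(w)
        if abs(e_new - e) < tol:
            return e_new
        e = e_new

data = [-3.0, -1.0, 0.0, 0.5, 1.0, 2.0, 4.0]  # invented loss/return sample
```

Unlike the quantile behind VaR, the expectile depends on the magnitude of all observations, which is the intuition behind its coherence and elicitability properties discussed in the article.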
19 pages, 930 KiB  
Article
Risk Aversion, Loss Aversion, and the Demand for Insurance
by Louis Eeckhoudt, Anna Maria Fiori and Emanuela Rosazza Gianin
Risks 2018, 6(2), 60; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020060 - 25 May 2018
Cited by 12 | Viewed by 5862
Abstract
In this paper we analyze insurance demand when the utility function depends both upon final wealth and the level of losses or gains relative to a reference point. Besides some comparative statics results, we discuss the links with first-order risk aversion, with the Omega measure, and with a tendency to over-insure modest risks that has been extensively documented in real insurance markets. Full article
(This article belongs to the Special Issue New Perspectives in Actuarial Risk Management)
16 pages, 440 KiB  
Article
On the Moments and the Distribution of Aggregate Discounted Claims in a Markovian Environment
by Shuanming Li and Yi Lu
Risks 2018, 6(2), 59; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020059 - 23 May 2018
Cited by 2 | Viewed by 3091
Abstract
This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is also investigated. Numerical results are also presented for a two-state Markov-modulated model case. Full article
(This article belongs to the Special Issue Risk, Ruin and Survival: Decision Making in Insurance and Finance)
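A one-state special case of this setting (ordinary compound Poisson claims with a constant force of interest) can be checked by simulation against the known first moment E[Z(t)] = λμ(1 − e^{−δt})/δ. All parameters below are invented:

```python
import math
import random

def simulate_adc(lam, mean_claim, delta, t, n_paths, rng):
    """Monte Carlo mean of Z(t) = sum_k e^{-delta*T_k} X_k for a compound
    Poisson process: Poisson(lam) arrivals, exponential claim sizes.
    This is a one-state toy version of the Markov-modulated model."""
    total = 0.0
    for _ in range(n_paths):
        s, z = 0.0, 0.0
        while True:
            s += rng.expovariate(lam)          # next claim arrival time
            if s > t:
                break
            z += math.exp(-delta * s) * rng.expovariate(1.0 / mean_claim)
        total += z
    return total / n_paths

rng = random.Random(42)
lam, mu, delta, t = 2.0, 1.5, 0.05, 10.0
mc = simulate_adc(lam, mu, delta, t, 20000, rng)
exact = lam * mu * (1.0 - math.exp(-delta * t)) / delta  # analytic E[Z(t)]
```

The paper's matrix-form PDE generalizes exactly this kind of moment computation to several environment states and Markovian arrivals.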
25 pages, 475 KiB  
Article
A Credit-Risk Valuation under the Variance-Gamma Asset Return
by Roman V. Ivanov
Risks 2018, 6(2), 58; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020058 - 17 May 2018
Cited by 5 | Viewed by 2699
Abstract
This paper considers risks of an investment portfolio, which consists of distributed mortgages and sold European call options. It is assumed that the stream of the credit payments could fall by a jump. The time of the jump is modeled by the exponential distribution. We suggest that the returns on stock are variance-gamma distributed. The value at risk, the expected shortfall and the entropic risk measure for this portfolio are calculated in closed forms. The obtained formulas exploit the values of generalized hypergeometric functions. Full article
18 pages, 373 KiB  
Article
On Two Mixture-Based Clustering Approaches Used in Modeling an Insurance Portfolio
by Tatjana Miljkovic and Daniel Fernández
Risks 2018, 6(2), 57; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020057 - 17 May 2018
Cited by 9 | Viewed by 3649
Abstract
We review two complementary mixture-based clustering approaches for modeling unobserved heterogeneity in an insurance portfolio: the generalized linear mixed cluster-weighted model (CWM) and mixture-based clustering for an ordered stereotype model (OSM). The latter is for modeling ordinal variables, and the former is for modeling losses as a function of mixed-type covariates. The article extends the idea of mixture modeling to a multivariate classification for the purpose of testing unobserved heterogeneity in an insurance portfolio. The application of both methods is illustrated on a well-known French automobile portfolio, in which the model fitting is performed using the expectation-maximization (EM) algorithm. Our findings show that these mixture-based clustering methods can be used to further test unobserved heterogeneity in an insurance portfolio and as such may be considered in insurance pricing, underwriting, and risk management. Full article
(This article belongs to the Special Issue New Perspectives in Actuarial Risk Management)
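The EM algorithm used for model fitting can be illustrated on a toy univariate two-component Gaussian mixture, a far simpler model than the CWM or OSM of the article; the data below are simulated, not the French portfolio:

```python
import math
import random

def em_two_gaussians(data, iters=200):
    """EM for a two-component univariate Gaussian mixture.
    Returns (weights, means, standard deviations)."""
    mu = [min(data), max(data)]      # crude but effective initialization
    sigma = [1.0, 1.0]
    pi = [0.5, 0.5]

    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: posterior responsibilities of each component for each point.
        resp = []
        for x in data:
            num = [pi[k] * pdf(x, mu[k], sigma[k]) for k in range(2)]
            tot = sum(num)
            resp.append([v / tot for v in num])
        # M-step: re-estimate weights, means, and standard deviations.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(max(var, 1e-12))
    return pi, mu, sigma

rng = random.Random(0)
data = ([rng.gauss(0.0, 1.0) for _ in range(300)]
        + [rng.gauss(5.0, 1.0) for _ in range(300)])
pi, mu, sigma = em_two_gaussians(data)
```

The same E-step/M-step alternation drives the fitting of the far richer mixture models in the article.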
21 pages, 543 KiB  
Article
Stochastic Modeling of Wind Derivatives in Energy Markets
by Fred Espen Benth, Luca Di Persio and Silvia Lavagnini
Risks 2018, 6(2), 56; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020056 - 16 May 2018
Cited by 34 | Viewed by 5249
Abstract
We model the logarithm of the spot price of electricity with a normal inverse Gaussian (NIG) process and the wind speed and wind power production with two Ornstein–Uhlenbeck processes. In order to reproduce the correlation between the spot price and the wind power production, namely, between a pure jump process and a continuous-path process, we replace the small jumps of the NIG process by a Brownian term. We then apply our models to two different problems: first, to study from the stochastic point of view the income from a wind power plant, as the expected value of the product between the electricity spot price and the amount of energy produced; then, to construct and price a European put-type quanto option in the wind energy markets that allows the buyer to hedge against low prices and low wind power production in the plant. Calibration of the proposed models and related price formulas is also provided, according to specific datasets. Full article
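An Ornstein–Uhlenbeck process of the kind used here for wind speed can be sampled exactly in distribution, since its transition law is Gaussian. A minimal sketch with invented parameters (not the paper's calibration):

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, n_steps, rng):
    """Exact-in-distribution sampling of dX = theta*(mu - X) dt + sigma dW:
    X_{t+dt} = e^{-theta dt} X_t + mu (1 - e^{-theta dt}) + Gaussian noise
    with the exact conditional standard deviation."""
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
    path = [x0]
    for _ in range(n_steps):
        path.append(a * path[-1] + mu * (1.0 - a) + sd * rng.gauss(0.0, 1.0))
    return path

rng = random.Random(7)
# Invented parameters: mean-reversion speed 1.5, long-run level 8.0 (e.g. m/s).
path = simulate_ou(x0=0.0, theta=1.5, mu=8.0, sigma=1.0,
                   dt=0.01, n_steps=5000, rng=rng)
```

Because the scheme uses the exact transition law, it is unbiased for any step size, which matters when long wind-derivative horizons are simulated.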
13 pages, 604 KiB  
Article
Using Cutting-Edge Tree-Based Stochastic Models to Predict Credit Risk
by Khaled Halteh, Kuldeep Kumar and Adrian Gepp
Risks 2018, 6(2), 55; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020055 - 16 May 2018
Cited by 11 | Viewed by 4273
Abstract
Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help the lender and borrower. This is achieved by reducing the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses, and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests, and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly-available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that ‘Property, Plant, & Equipment (PPE) turnover’, ‘Invested Capital Turnover’, and ‘Price over Earnings Ratio (PER)’ were the variables with the best explanatory power pertaining to predicting credit risk in the Australian mining sector. Full article
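The mechanics of gradient boosting can be sketched in a few lines of pure Python: each regression stump is fitted to the current residuals and added to the ensemble with a learning rate. This is a toy squared-loss version on an invented one-feature dataset, not the authors' model or data:

```python
def fit_stump(x, residual):
    """Best single-split regression stump on one feature under squared loss."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for cut in range(1, len(x)):
        thr = (x[order[cut - 1]] + x[order[cut]]) / 2.0
        left = [residual[i] for i in order[:cut]]
        right = [residual[i] for i in order[cut:]]
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lmean, rmean)
    _, thr, lmean, rmean = best
    return lambda v: lmean if v <= thr else rmean

def gradient_boost(x, y, n_trees=50, lr=0.3):
    """Squared-loss gradient boosting: every stump fits the current residuals."""
    base = sum(y) / len(y)
    pred = [base] * len(x)
    stumps = []
    for _ in range(n_trees):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, resid)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda v: base + lr * sum(s(v) for s in stumps)

# Invented data: a step-shaped default indicator as a function of one ratio.
x = [i / 10.0 for i in range(21)]           # hypothetical financial ratio
y = [0.0 if xi < 1.0 else 1.0 for xi in x]  # 0 = healthy, 1 = risky
model = gradient_boost(x, y)
```

Production implementations add stochastic subsampling (the "stochastic" in stochastic gradient boosting), deeper trees, and many features, but the residual-fitting loop is the same.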
11 pages, 590 KiB  
Article
Diversification and Systemic Risk: A Financial Network Perspective
by Rüdiger Frey and Juraj Hledik
Risks 2018, 6(2), 54; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020054 - 15 May 2018
Cited by 3 | Viewed by 6137
Abstract
In this paper, we study the implications of diversification in the asset portfolios of banks for financial stability and systemic risk. Adding to the existing literature, we analyse this issue in a network model of the interbank market. We carry out a simulation study that determines the probability of a systemic crisis in the banking network as a function of both the level of diversification, and the connectivity and structure of the financial network. In contrast to earlier studies we find that diversification at the level of individual banks may be beneficial for financial stability even if it does lead to a higher asset return correlation across banks. Full article
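A default cascade on an interbank network, of the kind simulated in the paper, can be sketched as a fixed-point iteration: a bank fails once its losses from already-failed counterparties exceed its capital buffer. The 4-bank ring, exposures, and buffers below are invented:

```python
def cascade(exposure, capital, initial_failures):
    """Iterate default contagion to its fixed point.
    exposure[j][i] = amount bank j loses if bank i fails;
    bank j fails once its total loss exceeds capital[j]."""
    n = len(capital)
    failed = set(initial_failures)
    while True:
        new = set()
        for j in range(n):
            if j in failed:
                continue
            loss = sum(exposure[j][i] for i in failed)
            if loss > capital[j]:
                new.add(j)
        if not new:
            return failed
        failed |= new

# Invented 4-bank ring: each bank has lent 2.0 to its clockwise neighbour.
exposure = [[0.0] * 4 for _ in range(4)]
for j in range(4):
    exposure[j][(j + 1) % 4] = 2.0
capital = [1.0, 1.0, 3.0, 1.0]   # bank 2 holds a larger buffer
failed = cascade(exposure, capital, {1})
```

In this toy run the failure of bank 1 propagates around the ring until it hits the well-capitalized bank 2; repeating such runs over random shocks and network configurations yields the systemic-crisis probabilities studied in the article.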
35 pages, 500 KiB  
Article
A General Framework for Portfolio Theory—Part I: Theory and Various Models
by Stanislaus Maier-Paape and Qiji Jim Zhu
Risks 2018, 6(2), 53; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020053 - 08 May 2018
Cited by 11 | Viewed by 5107
Abstract
Utility and risk are two often competing measures of investment success. We show that the efficient trade-off between these two measures for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital market pricing model of Sharpe (1964) are special cases of our general framework when the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results in Rockafellar et al. (2006), which were already an extension of the capital market pricing model allowing for the use of more general deviation measures. This generalized capital asset pricing model also applies, for example, when an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows going beyond the “additive” performance measure to a “multiplicative” one of cumulative returns by using the log utility. As a result, the growth optimal portfolio theory of Lintner (1965) and the leverage space portfolio theory of Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework allows a unification of several important existing portfolio theories and goes far beyond them. For simplicity of presentation, we phrase everything for a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward. Full article
(This article belongs to the Special Issue Computational Methods for Risk Management in Economics and Finance)
22 pages, 587 KiB  
Article
Modelling and Forecasting Stock Price Movements with Serially Dependent Determinants
by Rasika Yatigammana, Shelton Peiris, Richard Gerlach and David Edmund Allen
Risks 2018, 6(2), 52; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020052 - 07 May 2018
Cited by 2 | Viewed by 5067
Abstract
The direction of price movements is analysed under an ordered probit framework, recognising the importance of accounting for discreteness in price changes. Extending the work of Hausman et al. (1972) and Yang and Parwada (2012), this paper focuses on improving the forecast performance of the model while infusing a more practical perspective by enhancing flexibility. This is achieved by extending the existing framework to generate short-term multi-period-ahead forecasts for better decision making, whilst considering the serial dependence structure. This approach enhances the flexibility and adaptability of the model to future price changes, particularly targeting risk minimisation. Empirical evidence is provided, based on seven stocks listed on the Australian Securities Exchange (ASX). The prediction success varies between 78 and 91 per cent for in-sample and out-of-sample forecasts for both the short term and long term. Full article
(This article belongs to the Special Issue Computational Methods for Risk Management in Economics and Finance)
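The ordered probit machinery underlying such models reduces to differences of normal CDFs evaluated between estimated cutpoints. A minimal sketch for three movement categories (down/flat/up); the latent index and cutpoints below are invented, not estimates from the paper:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xbeta, cutpoints):
    """Category probabilities P(y = k) = Phi(c_{k+1} - x'b) - Phi(c_k - x'b),
    with c_0 = -inf and c_K = +inf. Here k = 0, 1, 2 could represent
    down / flat / up price moves."""
    cs = [-math.inf] + list(cutpoints) + [math.inf]
    return [norm_cdf(cs[k + 1] - xbeta) - norm_cdf(cs[k] - xbeta)
            for k in range(len(cs) - 1)]

# Invented latent index (x'b = 0.3, i.e. mildly bullish) and cutpoints.
probs = ordered_probit_probs(xbeta=0.3, cutpoints=[-0.5, 0.5])
```

Multi-period-ahead forecasting then chains such one-step probabilities while letting the latent index depend on lagged outcomes.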
20 pages, 5816 KiB  
Article
Real-Option Valuation in a Finite-Time, Incomplete Market with Jump Diffusion and Investor-Utility Inflation
by Timothy Hillman, Nan Zhang and Zhuo Jin
Risks 2018, 6(2), 51; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020051 - 04 May 2018
Cited by 1 | Viewed by 2777
Abstract
We extend an existing numerical model (Grasselli (2011)) for valuing a real option to invest in a capital project in an incomplete market with a finite time horizon. In doing so, we include two separate effects: the possibility that the project value is partly describable according to a jump-diffusion process, and incorporation of a time-dependent investor utility function, taking into account the effect of inflation. We adopt a discrete approximation to the jump process, whose parameters are restricted in order to preserve the drift and the volatility of the project-value process that it modifies. By controlling for these low-order effects, the higher-order effects may be considered in isolation. Our simulated results demonstrate that the inclusion of the jump process tends to decrease the value of the option, and expand the circumstances under which it should be exercised. Our results also demonstrate that an appropriate selection of the time-dependent investor utility function yields more reasonable investor-behaviour predictions regarding the decision to exercise the option, than would occur otherwise. Full article
13 pages, 878 KiB  
Article
The Effect of Non-Proportional Reinsurance: A Revision of Solvency II Standard Formula
by Gian Paolo Clemente
Risks 2018, 6(2), 50; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020050 - 02 May 2018
Cited by 2 | Viewed by 4380
Abstract
The Solvency II Standard Formula provides a methodology to recognise the risk-mitigating impact of excess-of-loss reinsurance treaties in premium risk modelling. We analyse the proposals of both Quantitative Impact Study 5 and the Commission Delegated Regulation, highlighting some inconsistencies. This paper tries to bridge the main pitfalls of both versions. To this aim, we propose a revision of the non-proportional adjustment factor in order to measure the effect of excess-of-loss treaties on premium risk volatility. In this way, the capital requirement can be easily assessed. As numerical results show, this proposal appears to be a feasible and much more consistent approach to describe the effect of non-proportional reinsurance on premium risk. Full article
(This article belongs to the Special Issue Capital Requirement Evaluation under Solvency II framework)
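The treaty mechanics whose risk-mitigating effect the adjustment factor is meant to capture amount to a per-claim layer split: the reinsurer pays the part of each claim between the retention and the retention plus the layer limit. The retention, limit, and claim amounts below are invented:

```python
def split_xl(claim, retention, limit):
    """Per-claim excess-of-loss split. The cedent keeps claims up to the
    retention (plus anything above retention + limit); the reinsurer pays
    the layer in between. Returns (retained, ceded)."""
    ceded = min(max(claim - retention, 0.0), limit)
    return claim - ceded, ceded

# Invented gross claims against a "20 xs 5" layer (retention 5, limit 20).
claims = [0.5, 2.0, 7.0, 30.0]
retained = [split_xl(c, 5.0, 20.0)[0] for c in claims]
ceded = [split_xl(c, 5.0, 20.0)[1] for c in claims]
```

Capping each retained claim at the retention is what truncates the right tail of the retained severity distribution and hence lowers the premium risk volatility that the adjustment factor tries to quantify.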
12 pages, 268 KiB  
Article
Properties of Stochastic Arrangement Increasing and Their Applications in Allocation Problems
by Wei Wei
Risks 2018, 6(2), 49; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020049 - 30 Apr 2018
Viewed by 2740
Abstract
There are extensive studies on the allocation problems in the field of insurance and finance. We observe that these studies, although involving different methodologies, share some inherent commonalities. In this paper, we develop a new framework for these studies with the tool of arrangement increasing functions. This framework unifies many existing studies and provides shortcuts to developing new results. Full article
(This article belongs to the Special Issue Recent Development in Actuarial Science and Related Fields)
20 pages, 2507 KiB  
Article
The Italian Pension Gap: A Stochastic Optimal Control Approach
by Alessandro Milazzo and Elena Vigna
Risks 2018, 6(2), 48; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020048 - 28 Apr 2018
Viewed by 4303
Abstract
We study the gap between the state pension provided by the Italian pension system pre-Dini reform and post-Dini reform. The goal is to fill the gap between the old and the new pension by joining a defined contribution pension scheme and adopting an optimal investment strategy that is target-based. We find that it is possible to cover, at least partially, this gap with the additional income of the pension scheme, especially in the presence of late retirement and in the presence of stagnant careers. Workers with dynamic careers and workers who retire early are those who are most penalised by the reform. Results are intuitive and in line with previous studies on the subject. Full article
(This article belongs to the Special Issue Recent Development in Actuarial Science and Related Fields)
17 pages, 587 KiB  
Article
The Cascade Bayesian Approach: Prior Transformation for a Controlled Integration of Internal Data, External Data and Scenarios
by Bertrand K. Hassani and Alexis Renaudin
Risks 2018, 6(2), 47; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020047 - 27 Apr 2018
Cited by 1 | Viewed by 3029
Abstract
According to the last proposals of the Basel Committee on Banking Supervision, banks or insurance companies under the advanced measurement approach (AMA) must use four different sources of information to assess their operational risk capital requirement. The fourth includes ’business environment and internal control factors’, i.e., qualitative criteria, whereas the three main quantitative sources available to banks for building the loss distribution are internal loss data, external loss data and scenario analysis. This paper proposes an innovative methodology to bring together these three different sources in the loss distribution approach (LDA) framework through a Bayesian strategy. The integration of the different elements is performed in two different steps to ensure an internal data-driven model is obtained. In the first step, scenarios are used to inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior function. This latter posterior function enables the estimation of the parameters of the severity distribution that are selected to represent the operational risk event types. Full article
(This article belongs to the Special Issue Capital Requirement Evaluation under Solvency II framework)
16 pages, 1154 KiB  
Article
Volatility Is Log-Normal—But Not for the Reason You Think
by Martin Tegnér and Rolf Poulsen
Risks 2018, 6(2), 46; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020046 - 24 Apr 2018
Cited by 8 | Viewed by 4115
Abstract
It is impossible to discriminate between the commonly used stochastic volatility models of Heston, log-normal, and 3-over-2 on the basis of exponentially weighted averages of daily returns—even though it appears so at first sight. However, with a 5-min sampling frequency, the models can be differentiated and empirical evidence overwhelmingly favours a fast mean-reverting log-normal model. Full article
16 pages, 390 KiB  
Article
Estimating and Forecasting Conditional Risk Measures with Extreme Value Theory: A Review
by Marco Bee and Luca Trapin
Risks 2018, 6(2), 45; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020045 - 24 Apr 2018
Cited by 8 | Viewed by 4306
Abstract
One of the key components of financial risk management is risk measurement. This typically requires modeling, estimating and forecasting tail-related quantities of the asset returns’ conditional distribution. Recent advances in the financial econometrics literature have developed several models based on Extreme Value Theory (EVT) to carry out these tasks. The purpose of this paper is to review these methods. Full article
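The two tail quantities being estimated and forecast, VaR and expected shortfall, have simple empirical (non-EVT) estimators that serve as a baseline for the EVT-based methods reviewed. The loss sample and the order-statistic convention below are invented/one choice among several:

```python
def var_es(losses, alpha):
    """Empirical VaR_alpha (an upper order statistic of the loss sample)
    and expected shortfall (mean loss at or beyond that VaR).
    Uses one simple order-statistic convention; others exist."""
    xs = sorted(losses)
    k = int(alpha * len(xs))   # index of the alpha-level order statistic
    var = xs[k]
    tail = xs[k:]
    return var, sum(tail) / len(tail)

losses = list(range(1, 101))   # invented loss sample: 1, 2, ..., 100
var, es = var_es(losses, 0.95)
```

EVT refines exactly this step: instead of averaging a handful of extreme order statistics, it fits a generalized Pareto tail beyond a threshold, which stabilizes estimates at levels where empirical tails run out of data.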
34 pages, 7958 KiB  
Article
An Empirical Study on Stochastic Mortality Modelling under the Age-Period-Cohort Framework: The Case of Greece with Applications to Insurance Pricing
by Apostolos Bozikas and Georgios Pitselis
Risks 2018, 6(2), 44; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020044 - 23 Apr 2018
Cited by 14 | Viewed by 4511
Abstract
During the last decades, life expectancy has risen significantly in the most developed countries all over the world. Greece is a case in point; consequently, higher governmental financial responsibilities arise, and serious concerns are raised owing to population ageing. To address this issue, an efficient forecasting method is required. Therefore, the most important stochastic models were comparatively applied to Greek data for the first time. An analysis of their fitting behaviour by gender was conducted and the corresponding forecasting results were evaluated. In particular, we incorporated the Greek population data into seven stochastic mortality models under a common age-period-cohort framework. The fitting performance of each model was thoroughly evaluated based on information criteria values as well as the likelihood ratio test, and their robustness to period changes was investigated. In addition, parameter risk in forecasts was assessed by employing bootstrapping techniques. For completeness, projection results for both genders were also illustrated in pricing insurance-related products. Full article
10 pages, 1809 KiB  
Article
Life Insurance and Annuity Demand under Hyperbolic Discounting
by Siqi Tang, Sachi Purcal and Jinhui Zhang
Risks 2018, 6(2), 43; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020043 - 23 Apr 2018
Cited by 7 | Viewed by 4312
Abstract
In this paper, we analyse and construct a lifetime utility maximisation model with hyperbolic discounting. Within the model, a number of assumptions are made: complete markets, actuarially fair life insurance/annuity is available, and investors have time-dependent preferences. Time dependent preferences are in contrast to the usual case of constant preferences (exponential discounting). We find: (1) investors (realistically) demand more life insurance after retirement (in contrast to the standard model, which showed strong demand for life annuities), and annuities are rarely purchased; (2) optimal consumption paths exhibit a humped shape (which is usually only found in incomplete markets under the assumptions of the standard model). Full article
(This article belongs to the Special Issue Optimal Demands for Life Insurance and Annuities)
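The contrast between exponential and hyperbolic discounting is easy to see numerically: with the common one-parameter hyperbolic form 1/(1 + kt), the instantaneous discount rate k/(1 + kt) declines with the horizon, whereas the exponential rate ρ stays constant. The parameter values below are invented:

```python
import math

def exponential_discount(t, rho):
    """Constant-rate discount factor e^{-rho t}."""
    return math.exp(-rho * t)

def hyperbolic_discount(t, k):
    """One-parameter hyperbolic discount factor 1/(1 + k t);
    its implied rate k/(1 + k t) falls as the horizon grows."""
    return 1.0 / (1.0 + k * t)

# Invented parameters chosen so both rates agree at t = 0.
rho, k = 0.05, 0.05
facs = [(t, exponential_discount(t, rho), hyperbolic_discount(t, k))
        for t in (0, 10, 40)]
```

That declining rate is what makes hyperbolic preferences time-inconsistent and drives the altered insurance and annuity demand found in the paper.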
25 pages, 2834 KiB  
Review
Credit Risk Meets Random Matrices: Coping with Non-Stationary Asset Correlations
by Andreas Mühlbacher and Thomas Guhr
Risks 2018, 6(2), 42; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020042 - 23 Apr 2018
Cited by 2 | Viewed by 4385
Abstract
We review recent progress in modeling credit risk for correlated assets. We employ a new interpretation of the Wishart model for random correlation matrices to model non-stationary effects. We then use the Merton model in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, the stock prices are used, the correlations of which have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results, which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of grasping different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model. Full article
(This article belongs to the Special Issue Computational Methods for Risk Management in Economics and Finance)
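The Wishart construction used above can be sketched in a few lines: draw Gaussian returns with a fixed average correlation matrix C0, form their sample correlation, and treat the number of observations as the parameter controlling fluctuation strength. This is a minimal numpy-only illustration, not the authors' calibrated model; the matrix size and correlation level are assumptions.

```python
import numpy as np

def sample_wishart_correlation(c0_chol, n_obs, rng):
    """Draw one random correlation matrix from a Wishart-type ensemble.

    Simulates n_obs Gaussian return vectors with 'true' correlation
    C0 = L L^T, then takes their sample correlation.  Smaller n_obs means
    stronger fluctuations around C0 (a proxy for non-stationarity)."""
    K = c0_chol.shape[0]
    data = c0_chol @ rng.standard_normal((K, n_obs))
    w = data @ data.T / n_obs                      # Wishart-distributed matrix
    d = np.sqrt(np.diag(w))
    return w / np.outer(d, d)                      # normalise to unit diagonal

rng = np.random.default_rng(2)
K = 4
c0 = 0.3 * np.ones((K, K)) + 0.7 * np.eye(K)       # uniform 0.3 correlation
L = np.linalg.cholesky(c0)
ensemble = np.array([sample_wishart_correlation(L, 2000, rng) for _ in range(200)])
avg = ensemble.mean(axis=0)
```

Averaging a loss distribution over such an ensemble, rather than over a single fixed correlation matrix, is what produces the heavy tails discussed in the abstract.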
16 pages, 2020 KiB  
Article
Active Management of Operational Risk in the Regimes of the “Unknown”: What Can Machine Learning or Heuristics Deliver?
by Udo Milkau and Jürgen Bott
Risks 2018, 6(2), 41; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020041 - 23 Apr 2018
Cited by 4 | Viewed by 5546
Abstract
Advanced machine learning has achieved extraordinary success in recent years. For “active” operational risk management beyond ex post analysis of measured data, machine learning could provide help beyond the regime of traditional statistical analysis when it comes to the “known unknown” or even the “unknown unknown.” While machine learning has been tested successfully in the regime of the “known,” heuristics typically provide better results for active operational risk management (in the sense of forecasting). However, precursors in existing data can open a chance for machine learning to provide early warnings even for the regime of the “unknown unknown.” Full article
13 pages, 1195 KiB  
Article
An Intersection–Union Test for the Sharpe Ratio
by Gabriel Frahm
Risks 2018, 6(2), 40; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020040 - 19 Apr 2018
Cited by 1 | Viewed by 4305
Abstract
An intersection–union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection–union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G–7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance. Full article
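The intersection-union logic is simple to sketch: the benchmark is declared optimal only if its Sharpe ratio significantly exceeds that of every alternative, i.e. iff the largest pairwise p-value falls below the significance level. The sketch below uses a simplified i.i.d. asymptotic variance 2(1 − ρ)/T for the Sharpe difference; the paper's test, by contrast, accounts for serial dependence and non-normality. All return series and parameters are synthetic assumptions.

```python
import numpy as np
from math import erf, sqrt

def sharpe(x):
    return x.mean() / x.std(ddof=1)

def iut_sharpe_test(benchmark, alternatives, alpha=0.05):
    """Intersection-union test sketch for Sharpe-ratio optimality.

    Reject 'benchmark is not optimal' only if EVERY pairwise one-sided
    test rejects, i.e. iff the maximum pairwise p-value is below alpha."""
    T = len(benchmark)
    p_values = []
    for alt in alternatives:
        rho = np.corrcoef(benchmark, alt)[0, 1]
        # simplified i.i.d. variance for the Sharpe difference
        z = (sharpe(benchmark) - sharpe(alt)) * sqrt(T / (2.0 * (1.0 - rho)))
        p_values.append(1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))  # one-sided p
    p_max = max(p_values)
    return p_max, p_max < alpha

rng = np.random.default_rng(5)
T = 5000
benchmark = 0.15 + rng.standard_normal(T)            # Sharpe ~ 0.15 per period
weak = [rng.standard_normal(T) for _ in range(3)]    # Sharpe ~ 0
p_reject, reject = iut_sharpe_test(benchmark, weak)
strong = weak + [0.40 + rng.standard_normal(T)]      # one dominant alternative
p_keep, keep = iut_sharpe_test(benchmark, strong)
```

A single dominant alternative drives the maximum p-value towards one, so optimality is never declared, which mirrors the test's conservative, hard-to-reject character noted in the abstract.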
16 pages, 1345 KiB  
Article
How Does Distress Acquisition Incentivized by Government Purchases of Distressed Loans Affect Bank Default Risk?
by Jyh-Jiuan Lin, Chuen-Ping Chang and Shi Chen
Risks 2018, 6(2), 39; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020039 - 19 Apr 2018
Cited by 2 | Viewed by 3360
Abstract
The topic of bank default risk in connection with government bailouts has recently attracted a great deal of attention. In this paper, the question of how a bank’s default risk is affected by a distress acquisition is investigated. Specifically, the government provides a bailout program of distressed loan purchases for a strong bank to acquire a bank in distress. The acquirer bank is likely to refuse the acquisition, even with a bailout, when the amount of distressed loan purchases is large or the knock-out value of the acquired bank is high. When the acquirer bank realizes acquisition gains, the default risk in the consolidated bank’s equity return is negatively related to loan purchases, but positively related to the knock-out value of the acquired bank. The government bailout, as such, in large part contributes to banking stability. Full article
(This article belongs to the Special Issue Risks in Financial and Real Estate Markets)
20 pages, 620 KiB  
Article
Credit Risk Analysis Using Machine and Deep Learning Models
by Peter Martey Addo, Dominique Guegan and Bertrand Hassani
Risks 2018, 6(2), 38; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020038 - 16 Apr 2018
Cited by 140 | Viewed by 32508
Abstract
Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 important features from these models are selected and then used in the modeling process to test the stability of binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises. Full article
(This article belongs to the Special Issue Computational Methods for Risk Management in Economics and Finance)
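The select-top-features-then-retest workflow described above can be sketched end to end. This numpy-only illustration is not the authors' pipeline (they benchmark tree-based and deep models on real loan data); it uses synthetic data, a simple correlation-based importance proxy, and a hand-rolled logistic regression, all of which are assumptions for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p, k_top = 4000, 20, 10

# Synthetic loan data: only the first 5 features actually drive default risk.
X = rng.standard_normal((n, p))
logits = X[:, :5] @ np.array([1.5, -1.2, 1.0, 0.8, -0.6])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

# Step 1: rank features by a cheap importance proxy (|correlation with label|).
importance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
top = np.argsort(importance)[::-1][:k_top]

# Step 2: refit on the top features only, then compare performance on
# separate data (train vs holdout) as a stability check.
def fit_logistic(X, y, lr=0.1, steps=300):
    """Plain gradient-ascent logistic regression (illustrative only)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p_hat = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p_hat) / len(y)
    return w

split = n // 2
w = fit_logistic(X[:split][:, top], y[:split])

def accuracy(Xs, ys):
    return np.mean(((Xs @ w) > 0) == (ys > 0.5))

acc_train = accuracy(X[:split][:, top], y[:split])
acc_test = accuracy(X[split:][:, top], y[split:])
```

A stable classifier, in the sense used by the abstract, is one whose train and holdout performance stay close after the feature-selection step.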
21 pages, 3439 KiB  
Article
Estimating the Potential Risks of Sea Level Rise for Public and Private Property Ownership, Occupation and Management
by Georgia Warren-Myers, Gideon Aschwanden, Franz Fuerst and Andy Krause
Risks 2018, 6(2), 37; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020037 - 14 Apr 2018
Cited by 18 | Viewed by 6836
Abstract
The estimation of future sea level rise (SLR) is a major concern for cities near coastlines and river systems. Despite this, current modelling underestimates the future risks of SLR to property. Direct risks posed to property include inundation, loss of physical property and associated economic and social costs. It is also crucial to consider the risks that emerge from scenarios after SLR. These may produce one-off or periodic events that will inflict physical, economic and social implications, and direct, indirect and consequential losses. Using a case study approach, this paper combines various forms of data to examine the implications of future SLR to further understand the potential risks. The research indicates that the financial implications for local government will be loss of rates associated with total property loss and declines in value. The challenges identified are not specific to this research. Other municipalities worldwide experience similar barriers (i.e., financial implications, coastal planning predicaments, data paucity, knowledge and capacity, and legal and political challenges). This research highlights the need for private and public stakeholders to co-develop and implement strategies to mitigate and adapt property to withstand the future challenges of climate change and SLR. Full article
23 pages, 901 KiB  
Article
Operational Choices for Risk Aggregation in Insurance: PSDization and SCR Sensitivity
by Xavier Milhaud, Victorien Poncelet and Clement Saillard
Risks 2018, 6(2), 36; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020036 - 13 Apr 2018
Viewed by 3251
Abstract
This work addresses crucial questions about the robustness of the PSDization process for applications in insurance. PSDization refers to the process that forces a matrix to become positive semidefinite. For companies using copulas to aggregate risks in their internal model, PSDization occurs when working with correlation matrices to compute the Solvency Capital Requirement (SCR). We examine how classical operational choices concerning the modelling of risk dependence impact the SCR during PSDization. These operations refer to the permutations of risks (or business lines) in the correlation matrix, the addition of a new risk, and the introduction of confidence weights given to the correlation coefficients. The use of genetic algorithms shows that theoretically neutral transformations of the correlation matrix can surprisingly lead to significant sensitivities of the SCR (up to 6%). This highlights the need for a very strong internal control around the PSDization step. Full article
(This article belongs to the Special Issue Capital Requirement Evaluation under Solvency II framework)
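A common PSDization recipe, sketched below, clips negative eigenvalues and rescales the diagonal back to one; the resulting matrix then feeds the standard-formula-style aggregation SCR = sqrt(sᵀ C s). This is a generic eigenvalue-clipping sketch, not necessarily the method any particular insurer uses, and the correlation entries and stand-alone capital amounts are made-up numbers.

```python
import numpy as np

def psdize(corr):
    """Minimal PSDization sketch: clip negative eigenvalues to zero, then
    rescale so the output is again a valid correlation matrix (unit
    diagonal, positive semidefinite)."""
    vals, vecs = np.linalg.eigh(corr)
    m = (vecs * np.clip(vals, 0.0, None)) @ vecs.T
    d = np.sqrt(np.diag(m))
    return m / np.outer(d, d)

# An internally inconsistent expert-judgement matrix (not PSD):
raw = np.array([[ 1.0,  0.9, -0.9],
                [ 0.9,  1.0,  0.9],
                [-0.9,  0.9,  1.0]])
fixed = psdize(raw)

# SCR-style aggregation over stand-alone capital charges s (assumed values):
s = np.array([100.0, 200.0, 150.0])
scr = np.sqrt(s @ fixed @ s)
```

Permuting risks or adding a row to `raw` before calling `psdize` and recomputing `scr` reproduces, in miniature, the sensitivity experiments the paper runs at full scale.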
18 pages, 969 KiB  
Article
On Central Branch/Reinsurance Risk Networks: Exact Results and Heuristics
by Florin Avram and Sooie-Hoe Loke
Risks 2018, 6(2), 35; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020035 - 12 Apr 2018
Cited by 2 | Viewed by 2745
Abstract
Modeling the interactions between a reinsurer and several insurers, between a central management branch (CB) and several subsidiary business branches, or between a coalition and its members is a fascinating problem, which suggests many interesting questions. Beyond two dimensions, one cannot expect exact answers. Occasionally, reductions to one dimension or heuristic simplifications yield explicit approximations, which may be useful for getting qualitative insights. In this paper, we study two such problems: the ruin problem for a two-dimensional CB network under a new mathematical model, and the problem of valuation of two-dimensional CB networks by optimal dividends. A common thread between these two problems is that the one-dimensional reduction exploits the concept of invariant cones. Perhaps the most important contribution of the paper is the questions it raises; for that reason, we have found it useful to complement the particular examples solved by providing one possible formalization of the concept of a multi-dimensional risk network, which seems to us an appropriate umbrella for the kind of questions raised here. Full article
(This article belongs to the Special Issue Capital Requirement Evaluation under Solvency II framework)
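The simplest member of the family of models described above can be explored by Monte Carlo: a subsidiary runs a Cramér-Lundberg surplus, the central branch injects the deficit whenever the subsidiary would go negative, and ruin means the CB reserve itself is exhausted. This simulation is a heuristic sketch under assumed exponential claims and made-up parameters, not the paper's exact model.

```python
import numpy as np

def ruin_probability(u_cb, u_sub, premium, lam, mean_claim,
                     horizon, n_sims=2000, seed=1):
    """Monte Carlo sketch of a two-node central-branch (CB) network.

    The subsidiary earns premium continuously and pays exponential claims
    arriving at Poisson rate lam; any deficit is covered by the CB.
    Ruin = the CB reserve drops below zero before the horizon."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_sims):
        cb, sub, t = float(u_cb), float(u_sub), 0.0
        while True:
            w = rng.exponential(1.0 / lam)         # waiting time to next claim
            t += w
            if t >= horizon:
                break
            sub += premium * w                     # premium earned since last claim
            sub -= rng.exponential(mean_claim)     # claim payment
            if sub < 0.0:                          # CB injects the deficit
                cb += sub
                sub = 0.0
        if cb < 0.0:
            ruined += 1
    return ruined / n_sims

# Loss-making branch (premium below expected claims) with no buffers:
p_bare = ruin_probability(u_cb=0.0, u_sub=0.0, premium=0.8,
                          lam=1.0, mean_claim=1.0, horizon=50.0)
# Same claim paths (same seed), but the CB holds a large reserve:
p_backed = ruin_probability(u_cb=200.0, u_sub=0.0, premium=0.8,
                            lam=1.0, mean_claim=1.0, horizon=50.0)
```

Because the same seed reproduces identical claim paths, the estimated ruin probability is pathwise monotone in the CB reserve, a property the exact two-dimensional analysis makes precise.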
11 pages, 319 KiB  
Article
Multivariate Credibility in Bonus-Malus Systems Distinguishing between Different Types of Claims
by Emilio Gómez-Déniz and Enrique Calderín-Ojeda
Risks 2018, 6(2), 34; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020034 - 11 Apr 2018
Cited by 12 | Viewed by 2864
Abstract
In the classical bonus-malus system, the premium assigned to each policyholder is based only on the number of claims made, without taking the claim sizes into account. Thus, a policyholder who has declared a claim resulting in a relatively small loss is penalised to the same extent as one who has declared a more expensive claim. Of course, this is seen as unfair by many policyholders. In this paper, we study the factors that affect the number of claims in car insurance by using a trivariate discrete distribution. This approach allows us to distinguish between three types of claims, depending on whether the claim amounts are above, between or below certain thresholds. Therefore, this model incorporates the two fundamental random variables in this scenario: the number of claims as well as the amounts associated with them. In addition, we introduce a trivariate prior distribution conjugate to this discrete distribution, which produces credibility bonus-malus premiums that satisfy appropriate traditional transition rules. A practical example based on real data is shown to examine the differences with respect to the premiums obtained under the traditional tariffication system. Full article
(This article belongs to the Special Issue Credibility Theory: New Developments and Applications)
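The type-splitting idea can be sketched with a much simpler stand-in model: classify claim amounts into three types by thresholds, then update a per-type Poisson-gamma (Bühlmann-style) credibility frequency. This is not the authors' trivariate conjugate prior; the thresholds, prior parameters and unit costs below are all hypothetical.

```python
import numpy as np

def classify_claims(amounts, low=1000.0, high=5000.0):
    """Split claim amounts into three types: below, between, above thresholds."""
    a = np.asarray(amounts, dtype=float)
    return (np.sum(a < low), np.sum((a >= low) & (a <= high)), np.sum(a > high))

def credibility_premium(counts, years, prior_mean, prior_var, unit_costs):
    """Per-type Poisson-gamma credibility premium sketch.

    posterior frequency = z * observed rate + (1 - z) * prior mean,
    with credibility factor z = years / (years + prior_mean / prior_var)."""
    premium = 0.0
    for n_j, m_j, v_j, c_j in zip(counts, prior_mean, prior_var, unit_costs):
        z = years / (years + m_j / v_j)
        post_rate = z * (n_j / years) + (1.0 - z) * m_j
        premium += c_j * post_rate
    return premium

prior_mean = (0.10, 0.05, 0.01)        # assumed prior claim frequencies per type
prior_var = (0.05, 0.02, 0.005)        # assumed prior uncertainty
unit_costs = (500.0, 2500.0, 8000.0)   # assumed average severity per type
counts = classify_claims([500.0, 2000.0, 10000.0])
premium_bad = credibility_premium(counts, 5.0, prior_mean, prior_var, unit_costs)
premium_clean = credibility_premium((0, 0, 0), 5.0, prior_mean, prior_var, unit_costs)
prior_premium = sum(c * m for c, m in zip(unit_costs, prior_mean))
```

A claim-free history is rewarded with a premium below the prior, while a history with claims of every type is penalised above it, and the size of the penalty now depends on which type each claim fell into.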
39 pages, 584 KiB  
Article
Mixed Periodic-Classical Barrier Strategies for Lévy Risk Processes
by José-Luis Pérez and Kazutoshi Yamazaki
Risks 2018, 6(2), 33; https://0-doi-org.brum.beds.ac.uk/10.3390/risks6020033 - 05 Apr 2018
Cited by 13 | Viewed by 2576
Abstract
Given a spectrally-negative Lévy process and independent Poisson observation times, we consider a periodic barrier strategy that pushes the process down to a certain level whenever the observed value is above it. We also consider the versions with additional classical reflection above and/or below. Using scale functions and excursion theory, various fluctuation identities are computed in terms of the scale functions. Applications in de Finetti’s dividend problems are also discussed. Full article
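The strategy studied above can be illustrated by simulation when the spectrally negative Lévy process is specialised to a Cramér-Lundberg surplus: the process is only inspected at independent Poisson observation times, and any observed excess above the barrier is paid out as a dividend. This is a heuristic sketch, not the paper's scale-function analysis, and all parameter values are assumptions.

```python
import numpy as np

def simulate_periodic_barrier(u0, b, premium, lam_claim, mean_claim,
                              lam_obs, horizon, seed=3):
    """One path of a periodic barrier dividend strategy.

    Surplus: Cramer-Lundberg (premium income minus exponential claims at
    Poisson rate lam_claim).  At Poisson(lam_obs) observation times, any
    excess over the barrier b is paid as a dividend.  Stops at ruin or at
    the horizon; returns (total dividends, final surplus)."""
    rng = np.random.default_rng(seed)
    t, x, dividends = 0.0, float(u0), 0.0
    total = lam_claim + lam_obs
    while True:
        w = rng.exponential(1.0 / total)       # superposed Poisson clock
        t += w
        if t >= horizon:
            break
        x += premium * w                       # continuous premium income
        if rng.random() < lam_claim / total:
            x -= rng.exponential(mean_claim)   # a claim arrives
        elif x > b:
            dividends += x - b                 # observation time: skim excess
            x = b
        if x < 0.0:
            break                              # ruin: no further dividends
    return dividends, x

# Never observed -> never pays (the barrier is only enforced at observations):
div_no_obs, _ = simulate_periodic_barrier(u0=5.0, b=2.0, premium=1.2,
                                          lam_claim=1.0, mean_claim=1.0,
                                          lam_obs=0.0, horizon=100.0)
# No claims, frequent observations -> steadily skims the premium income:
div_obs, _ = simulate_periodic_barrier(u0=0.0, b=0.0, premium=1.0,
                                       lam_claim=0.0, mean_claim=1.0,
                                       lam_obs=1.0, horizon=100.0)
```

Adding classical reflection, as in the paper's extensions, would amount to also resetting the surplus at a lower (or upper) level continuously rather than only at observation times.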