Financial institutions need to take volatility clustering into account:
I. To avoid taking on an undesirable level of risk
II. To know the right level of capital they need to hold
III. To meet regulatory requirements
IV. To account for mean reversion in returns
B
Explanation:
Volatility clustering leads to levels of current volatility that can be significantly different from long
run averages. When volatility is running high, institutions need to shed risk, and when it is running
low, they can afford to increase returns by taking on more risk for a given amount of capital. An
institution's response to changes in volatility can be either to adjust risk, or capital, or both.
Accounting for volatility clustering helps institutions manage their risk and capital and therefore
statements I and II are correct.
Regulators do not (at least not yet) require volatility clustering to be taken into account.
Therefore statement III is not correct, and neither is statement IV, which is completely unrelated to
volatility clustering.
As the persistence parameter under EWMA is lowered, which of the following would be true:
B
Explanation:
The persistence parameter, λ, is the coefficient of the prior day's variance in EWMA calculations. A
higher value of the persistence parameter tends to 'persist' the prior value of variance for longer.
Consider an extreme example - if the persistence parameter is equal to 1, the variance under EWMA
will never change in response to returns.
1 - λ is the coefficient of recent market returns. As λ is lowered, 1 - λ increases, giving a greater
weight to recent market returns or shocks. Therefore, as λ is lowered, the model will react faster to
market shocks and give higher weights to recent returns, and at the same time reduce the weight on
prior variance, which will tend to persist for a shorter period.
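The update rule described above can be sketched in Python; the function name and sample numbers here are illustrative assumptions, not part of the question:

```python
# EWMA variance update: var_t = lam * var_{t-1} + (1 - lam) * r_t^2
# (lam is the persistence parameter; names here are illustrative)
def ewma_variance(returns, lam=0.94, initial_var=0.0001):
    var = initial_var
    for r in returns:
        var = lam * var + (1 - lam) * r ** 2
    return var

# A quiet market followed by a 5% shock: the lower-lambda model
# reacts faster, producing the higher post-shock variance estimate.
shock = [0.0] * 10 + [0.05]
print(ewma_variance(shock, lam=0.90) > ewma_variance(shock, lam=0.97))  # True
```

Note that with lam = 1 the variance never moves in response to returns, matching the extreme example above.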
Which of the following is not a limitation of the univariate Gaussian model to capture the
codependence structure between risk factors used for VaR calculations?
C
Explanation:
In the univariate Gaussian model, each risk factor is modeled separately, independently of the others,
and the dependence between the risk factors is captured by the covariance matrix (or its equivalent
combination of the correlation matrix and the variance matrix). Risk factors could include interest
rates of different tenors, different equity market levels, etc.
While this is a simple enough model, it has a number of limitations.
First, it fails to fit to the empirical distributions of risk factors, notably their fat tails and skewness.
Second, a single covariance matrix is insufficient to describe the fine codependence structure among
risk factors as non-linear dependencies or tail correlations are not captured. Third, determining the
covariance matrix becomes an extremely difficult task as the number of risk factors increases: the
number of covariances grows with the square of the number of variables.
But an inability to capture linear relationships between the factors is not one of the limitations of the
univariate Gaussian approach - in fact it is able to do that quite nicely with covariances.
A way to address these limitations is to consider joint distributions of the risk factors that capture
their dynamic relationships. Under such models, correlation is not a static number across the entire
range of outcomes: the risk factors can behave differently with each other in different parts of the
distribution, for example becoming more strongly correlated in the tails.
Which of the following are considered properties of a 'coherent' risk measure:
I. Monotonicity
II. Homogeneity
III. Translation Invariance
IV. Sub-additivity
B
Explanation:
All of the properties described are the properties of a 'coherent' risk measure.
Monotonicity means that if a portfolio's future value is at least as great as that of another
portfolio in every state of the world, its risk should be no greater than that of the other portfolio.
For example, between two otherwise identical call options, the one with the lower strike price
always pays off at least as much as the other, and so can never carry greater risk. VaR satisfies
this property.
Homogeneity is easiest explained by an example: if you double the size of a portfolio, the risk
doubles. The linear scaling property of a risk measure is called homogeneity. VaR satisfies this
property.
Translation invariance means that adding a riskless asset to a portfolio reduces its measured risk
by the amount added. So if cash (which has zero standard deviation and zero correlation with other
assets) is added to a portfolio, the risk goes down by exactly the amount of cash added. A risk
measure should satisfy this property, and VaR does.
Sub-additivity means that the total risk for a portfolio should be less than the sum of its parts. This is
a property that VaR satisfies most of the time, but not always. As an example, VaR may not be sub-
additive for portfolios that have assets with discontinuous payoffs close to the VaR cutoff quantile.
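The failure of sub-additivity can be made concrete with a small numeric sketch; the bond probabilities and losses below are my own illustrative assumptions. Two independent bonds each default with 4% probability for a loss of 100: each bond alone has a 95% VaR of 0, yet the two-bond portfolio has a 95% VaR of 100, which exceeds the sum of the individual VaRs.

```python
from itertools import product

p_default, loss = 0.04, 100  # illustrative assumptions

def var_95(outcomes):
    """outcomes: list of (probability, loss); returns the 95th
    percentile of the loss distribution, i.e. the 95% VaR."""
    total = 0.0
    for prob, l in sorted(outcomes, key=lambda o: o[1]):
        total += prob
        if total >= 0.95:
            return l
    return max(l for _, l in outcomes)

single = [(1 - p_default, 0), (p_default, loss)]
# Portfolio of two independent bonds: combine the outcome grids.
portfolio = [(pa * pb, la + lb)
             for (pa, la), (pb, lb) in product(single, single)]

print(var_95(single))     # 0
print(var_95(portfolio))  # 100
```

The jump happens because the 7.84% chance of at least one default pushes the loss of 100 inside the 95% quantile for the portfolio, even though each bond's 4% default probability sits outside it individually.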
The largest 10 losses over a 250 day observation period are as follows. Calculate the expected
shortfall at a 98% confidence level:
20m
19m
19m
17m
16m
13m
11m
10m
9m
9m
C
Explanation:
For a dataset with 250 observations, the top 2% of the losses will be the top 2% × 250 = 5
observations. Expected shortfall is the average of the losses beyond the VaR threshold. Therefore the
correct answer is (20 + 19 + 19 + 17 + 16)/5 = 18.2m.
Note that Expected Shortfall is also called conditional VaR (CVaR), Expected Tail Loss and tail average.
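The arithmetic can be sketched as follows (variable names are my own):

```python
# Expected shortfall: the average of the losses beyond the VaR quantile.
losses = [20, 19, 19, 17, 16, 13, 11, 10, 9, 9]  # largest 10 losses, in $mm

n_obs = 250
confidence = 0.98
tail_count = int(round((1 - confidence) * n_obs))  # 2% of 250 = 5

worst = sorted(losses, reverse=True)[:tail_count]
es = sum(worst) / tail_count
print(es)  # 18.2
```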
A risk analyst attempting to model the tail of a loss distribution using EVT divides the available
dataset into blocks of data, and picks the maximum of each block as a data point to consider.
Which approach is the risk analyst using?
A
Explanation:
The risk analyst is using the block maxima approach. The data points that result will then be used to
fit a GEV distribution.
Expected shortfall refers to the expected losses beyond a specified threshold. The peaks-over-
threshold approach is an alternative approach to the block maxima approach, and involves
considering exceedances above a threshold. Fourier transformation is not relevant in this context
and is a nonsensical option.
Which of the following are valid approaches for extreme value analysis given a dataset:
I. The Block Maxima approach
II. Least squares approach
III. Maximum likelihood approach
IV. Peak-over-thresholds approach
C
Explanation:
For EVT, we use the block maxima or the peaks-over-threshold methods. These provide us the data
points that can be fitted to a GEV distribution.
Least squares and maximum likelihood are methods that are used for curve fitting, and they have a
variety of applications across risk management.
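The two EVT data-selection approaches can be sketched side by side; the block size, threshold, and loss series below are arbitrary illustrative choices:

```python
# Block maxima: split the series into blocks, keep each block's maximum.
def block_maxima(losses, block_size):
    return [max(losses[i:i + block_size])
            for i in range(0, len(losses), block_size)]

# Peaks-over-threshold: keep the excess of every loss above a threshold.
def peaks_over_threshold(losses, threshold):
    return [x - threshold for x in losses if x > threshold]

losses = [1, 4, 2, 9, 3, 3, 7, 2, 6, 5, 1, 8]
print(block_maxima(losses, 4))          # [9, 7, 8]
print(peaks_over_threshold(losses, 5))  # [4, 2, 1, 3]
```

The block maxima are the points fitted to a GEV distribution; the threshold exceedances are instead typically fitted to a generalized Pareto distribution.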
Which of the following belong to the family of generalized extreme value distributions:
I. Frechet
II. Gumbel
III. Weibull
IV. Exponential
B
Explanation:
Extreme value theory focuses on the extreme and rare events, and in the case of VaR calculations, it
is focused on the right tail of the loss distribution. In very simple and non-technical terms, EVT says
the following:
1. Pull a number of large iid random samples from the population,
2. For each sample, find the maximum,
3. Then the distribution of these maximum values will follow a Generalized Extreme Value
distribution.
(In some ways, it is parallel to the central limit theorem, which says that the mean of a large
number of random samples pulled from any population follows a normal distribution, regardless of
the distribution of the underlying population.)
Generalized Extreme Value (GEV) distributions have three parameters: ξ (shape parameter),
μ (location parameter) and σ (scale parameter). Based upon the value of ξ, a GEV distribution may
either be a Frechet, Weibull or a Gumbel. These are the only three types of extreme value
distributions.
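Under a common sign convention for the shape parameter ξ (an assumption worth checking against whichever library or text you use, since some, e.g. scipy, flip the sign), the family is determined as follows:

```python
# Classify a GEV distribution by its shape parameter xi, using the
# convention: xi > 0 -> Frechet, xi < 0 -> Weibull, xi = 0 -> Gumbel.
def gev_family(xi):
    if xi > 0:
        return "Frechet"   # heavy (power-law) right tail
    if xi < 0:
        return "Weibull"   # bounded right tail
    return "Gumbel"        # light (exponential-type) tail

print(gev_family(0.3), gev_family(-0.2), gev_family(0.0))
# Frechet Weibull Gumbel
```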
The 99% 10-day VaR for a bank is $200mm. The average VaR for the past 60 days is $250mm, and the
bank specific regulatory multiplier is 3. What is the bank's basic VaR based market risk capital
charge?
C
Explanation:
The current Basel rules for the basic VaR based charge for market risk capital set market risk capital
requirements as the maximum of the following two amounts:
1. 99%/10-day VaR,
2. Regulatory Multiplier x Average 99%/10-day VaR of the past 60 days
The 'regulatory multiplier' is a number between 3 and 4 (inclusive) calculated based on the number
of 1% VaR exceedances in the previous 250 days, as determined by backtesting.
- If the number of exceedances is <= 4, then the regulatory multiplier is 3.
- If the number of exceedances is between 5 and 9, then the multiplier = 3 + 0.2*(N-4), where N is
the number of exceedances.
- If the number of exceedances is >=10, then the multiplier is 4.
So you can see that in most normal situations the risk capital requirement will be dictated by the
multiplier and the prior 60-day average VaR, because the product of these two will almost always be
greater than the current 99% VaR.
The correct answer therefore is = max(200mm, 3*250mm) = $750mm.
Interestingly, also note that a 99% VaR should statistically be exceeded 1% × 250 days = 2.5 times,
which means that even if the bank's VaR model is performing exactly as it should, it will still need
to use a regulatory multiplier of 3.
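The rule above can be sketched as a small helper (the function names are mine; the thresholds follow the exceedance bands quoted in the explanation):

```python
# Regulatory multiplier from the number of 1% VaR exceedances
# observed over the previous 250 days of backtesting.
def regulatory_multiplier(n_exceedances):
    if n_exceedances <= 4:
        return 3.0
    if n_exceedances <= 9:
        return 3.0 + 0.2 * (n_exceedances - 4)
    return 4.0

# Basic VaR-based charge: max of current VaR and multiplier x 60-day average.
def market_risk_capital(current_var, avg_var_60d, n_exceedances):
    k = regulatory_multiplier(n_exceedances)
    return max(current_var, k * avg_var_60d)

# The question's numbers, in $mm, with a multiplier of 3 (<= 4 exceedances):
print(market_risk_capital(200, 250, 2))  # 750.0
```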
Which of the following was not a policy response introduced by Basel 2.5 in response to the global
financial crisis:
B
Explanation:
The CCAR is a supervisory mechanism adopted by the US Federal Reserve Bank to assess capital
adequacy for bank holding companies it supervises. It was not a concept introduced by the
international Basel framework.
The other three were indeed rules introduced by Basel 2.5, which was ultimately subsumed into
Basel III.
Stressed VaR is just the standard 99%/10 day VaR, calculated with the assumption that relevant
market factors are under stress.
The Incremental Risk Charge (IRC) is an estimate of default and migration risk of unsecuritized credit
products in the trading book. (Though this may sound like a credit risk term, it relates to market
risk - for example, a
bond rated A being downgraded to BBB. In the old days, the banking book where loans to customers
are held was the primary source of credit risk, but with OTC trading and complex products the trading
book also now holds a good deal of credit risk. Both IRC and CRM account for these.)
While IRC considers only non-securitized products, the CRM (Comprehensive Risk Model) considers
securitized products such as tranches, CDOs, and correlation based instruments.
The IRC, SVaR and CRM complement standard VaR by covering risks that are not included in a
standard VaR model. Their results are therefore added to the VaR for capital adequacy
determination.