
Podcast: Fundamental Review of the Trading Book - Part 2 - Internal Model vs Standardized Approach

Speaker: Dr Simon Acomb


A course on this topic is available in London Time Zone, New York Time Zone and Singapore Time Zone

Podcast Overview

In the second part of this podcast, Dr Simon Acomb gives us additional insights into the Fundamental Review of the Trading Book (FRTB) - the recent piece of regulation by the Basel Committee affecting capital standards for market risk. In particular, Simon addresses the differences between the standardized approach and the internal model approach for financial institutions of different sizes. He also describes new concepts such as vega risk and the residual risk add-on, which were introduced by the new standardized approach.

Podcast Transcript

1. In our previous conversation you mentioned that large banks were using internal risk models whereas mid-size and smaller banks were leaning towards the standard approach. Could you tell me a bit more about how the FRTB addresses these two different possible approaches?
A. The first thing to say is that the FRTB requires all banks, whatever their size, to calculate what the capital would be according to the standardized approach. This means that even if a bank has an internal model for capital calculation, the regulators will still insist on seeing the results from the standardized approach, so that they can compare results. The standardized approach is designed to be risk-based and takes standard risk measures such as delta and vega and combines them by a series of calculations to come up with a capital requirement. The idea here is to have a universal standard, and so nearly all the fine detail is specified, even down to how you should calculate a risk measure such as delta.
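The combination of risk measures Simon describes can be sketched in a few lines. This is a simplified illustration of the sensitivities-based aggregation idea, not the full Basel calculation; the risk weights and correlation below are hypothetical, since the real figures are prescribed per risk class and bucket in the rules.

```python
import math

def bucket_capital(sensitivities, risk_weights, corr):
    """Combine delta sensitivities within one bucket into a capital figure,
    in the style of the FRTB sensitivities-based method (simplified sketch).

    sensitivities: net delta per risk factor in the bucket
    risk_weights:  risk weight per risk factor (hypothetical values here)
    corr:          correlation between factors (hypothetical value here)
    """
    # Weighted sensitivities: WS_k = RW_k * s_k
    ws = [s * rw for s, rw in zip(sensitivities, risk_weights)]
    # K_b = sqrt( sum_k WS_k^2 + sum_{k != l} corr * WS_k * WS_l )
    total = sum(w * w for w in ws)
    total += sum(corr * ws[k] * ws[l]
                 for k in range(len(ws))
                 for l in range(len(ws)) if k != l)
    return math.sqrt(max(total, 0.0))

# A partially hedged book: long and short delta offset through the correlation,
# so the combined charge is less than the sum of the standalone charges.
print(bucket_capital([100.0, -80.0], [0.05, 0.05], 0.5))
```

The square-root-of-correlated-sums shape is what lets hedged positions attract less capital than the simple sum of their standalone charges, which is the "risk-based" property Simon refers to.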

On the other hand, the internal model approach allows banks to develop their own models of risk, and then use these for calculating capital. The use of internal models has become a great deal more prescriptive under the FRTB and is far more tightly controlled and monitored. The first major change is that the methodology implemented has to be a measure of expected shortfall, whereas previous regulations used VaR. On top of this major change, the FRTB also imposes rules which control how this expected shortfall calculation gets modified to take into account things like the liquidity of assets, the uncertainty in correlations and difficulties in observing market data.

Previous capital regimes worked by banks applying to use an internal model, and once this was approved it was used for market risk capital across the entire bank. This is going to be dramatically different under the FRTB. Approval is given to use an internal model at the level of a trading desk. Ongoing backtesting and performance monitoring have to be carried out by the bank, and this can easily result in an individual trading desk having to return to reporting capital according to the less favourable standardized approach.

So what is the impact of all this? Well, the first thing to say is that if you just trade vanilla products and keep them well hedged, then the standardized approach won't be all that bad. It is designed to be risk-based, so if you don't run market risk the capital impact should be small. However, as soon as you step away from this business model, by trading more complex products or not being perfectly hedged, the capital impact could be large. There are reports of banks seeing an increase in capital of 2 to 6 times when they use the standardized approach.

What's the alternative to this? There is a big benefit from using an internal model, as this captures a greater degree of offsetting between the diverse risk factors. Users of internal models are reporting capital increases in the range of 1 to 1.5 times when compared with previous VaR calculations. There is no doubt that an internal model will give lower capital. The problem is that the barriers to entry to using an internal model have gone through the roof in this new regime.

2. This new standardized approach also introduces some new concepts such as vega risk capital. Isn’t that adding too much complexity? What are your thoughts on this?
A. Yes, vega risk capital is a new introduction in capital calculations. Although there has been some resistance to this approach, I think it should be welcomed as it brings risk capital into line with best practices in the risk management of derivative products. In general, I strongly believe that capital should reflect the risk run by banks, and should be reduced when banks hedge efficiently. Using vega as a sensitivity for calculating capital is a step toward this goal. However, there are parts of the new vega capital calculations which have not been thought through very well. For example, it is unclear from the current rules how the vega on an index option should be treated. The rules suggest that the vega should be separated out into the individual constituents of the index, but don't give any guidance on how this should be done. I would expect that either the Basel Committee will come back with some further guidance on vega calculation as a response to a "Frequently Asked Question", or the local regulators will be left to specify what is required for banks under their supervision. Whatever happens, this issue and many others like it represent a significant degree of uncertainty for banks looking to implement the new rules.

3. So, even though we already have deadlines for the FRTB, not everything has been completely defined yet?
A. That’s right. And in fact, the vega risk capital is not the only new thing that has been introduced within the standardized approach. There are also new capital calculations required for curvature, which are particularly important for derivative products. This is designed to capture any non-linear risk in options. There is also a requirement to calculate a default risk charge, which is also new. This puts capital against the trading of any product where the underlying is at risk of default. This includes products such as government bonds and equities, as well as corporate bonds and credit default swaps.
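The intent of the curvature charge - capturing the non-linear P&L that delta misses - can be illustrated with a small sketch. This is not the exact FRTB curvature formula; the shock size and payoff below are hypothetical, and the real rules prescribe shocks per risk class.

```python
def curvature_charge(price_fn, spot, delta, shock):
    """Illustrative curvature charge: shock the underlying up and down,
    strip out the delta-explained P&L, and take the worst residual loss.
    (A simplified sketch of the idea, not the Basel formula.)"""
    base = price_fn(spot)
    # Residual P&L after removing the linear (delta) component of the move
    up = price_fn(spot * (1 + shock)) - base - delta * spot * shock
    down = price_fn(spot * (1 - shock)) - base + delta * spot * shock
    # A positive charge arises only when the non-linear residual is a loss
    return max(-min(up, down), 0.0)

# A short-gamma position (e.g. a sold option) loses on both shocks even when
# delta-hedged, so it attracts a curvature charge; a purely linear position
# would not.
short_gamma = lambda s: -s * s  # toy payoff with negative convexity
print(curvature_charge(short_gamma, 10.0, -20.0, 0.1))
```

The point of the example is that delta alone explains none of the short-gamma loss, which is exactly the risk the curvature calculation is there to capture.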

Finally, another thing that is not all that well specified is this feature called the residual risk add-on. This is an additional charge for products which don't have a simple payoff, or a simple underlying.

4. Can you tell us a bit more about this residual risk add-on and why it was included by the Basel Committee?
A. The residual risk add-on was introduced as an acknowledgement that the standardized approach has its own limitations. The standardized approach is designed to work well for basic products and simple derivatives such as swaps and vanilla options. Beyond this there is an acknowledgement that all the complexities of a product may not be properly captured by the new calculations. To compensate for this banks have to apply the residual risk add-on for each instrument designated as having residual risk. These add-ons are then just summed up and added to the total capital figure. This means that there is no diversification or offsetting between products of this residual risk. The calculation itself is actually very simple - it's just a flat percentage of notional. The scope of the residual risk, however, is very broad. It applies for any derivative product which is not a simple sum of vanilla options. So for example it would apply to Bermudan options, spread options, or barrier options. In addition a residual risk add-on applies for any underlying which is not covered by the standardized approach. This would include things like mortality risk included in hedges for annuity products in a bank’s trading book, but also variance as an underlying seen in variance swaps for example.
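Since the calculation is, as described, just a flat percentage of notional summed across instruments with no offsetting, it can be sketched directly. The two weights below (1.0% of notional for exotic underlyings, 0.1% for other residual risks) are the figures commonly cited for the standardized approach, but treat them as illustrative here rather than authoritative.

```python
# Residual risk add-on weights (illustrative figures, not authoritative):
RRAO_WEIGHTS = {
    "exotic_underlying": 0.010,  # e.g. mortality risk, variance as underlying
    "other_residual":    0.001,  # e.g. Bermudan, spread, barrier options
}

def residual_risk_add_on(positions):
    """Sum a flat percentage of notional per designated instrument.
    No diversification or offsetting is allowed between positions.

    positions: iterable of (notional, category) pairs
    """
    return sum(abs(notional) * RRAO_WEIGHTS[category]
               for notional, category in positions)

# A variance swap and a barrier option: the add-ons simply accumulate,
# regardless of how well the two positions might hedge each other.
print(residual_risk_add_on([(1_000_000, "exotic_underlying"),
                            (500_000, "other_residual")]))
```

Note the `abs(notional)`: because the add-ons are summed with no offsetting, a long and a short position both add to the charge rather than cancelling.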

For any business which trades in these sorts of products there will be a significant capital hit if it uses the standardized approach. Such businesses are faced with a limited number of options: close down these businesses, ensure that they are always back-to-back on these products with a larger bank that uses an internal model, or take the decision to invest in building an internal model.

5. On the other hand, there are also campaigns for a simpler, more basic approach. What's really happening here?
A. Recently, the Basel Committee published a consultative paper which outlined a pared-down version of the standardized approach, called the "Reduced Sensitivity Method". As you say, there has been significant pushback from banks and local regulators due to the complexity of the standardized approach. The European Banking Authority was already considering a materiality criterion on the size of a trading book before applying the new rules. So, the consultative paper from the Basel Committee outlines how this drive for a yet more basic approach could be incorporated into the FRTB. The approach that they put forward would be restricted to the smallest institutions with trading books of a limited size and a strictly limited number of derivatives. Even those institutions would not be able to use the proposed Reduced Sensitivity Method if they wrote options, or held correlation products such as CDOs.

For the limited number of banks which meet these criteria, the proposal is that they do not have to calculate the vega risk or the curvature risk (this should have limited impact as, by definition, these institutions have a small number of non-linear option positions). However, they will be required to calculate delta risk in the same sort of way as the standardized approach, but with a reduced set of risk factors and more conservative risk weights and correlations.

In my opinion this proposal will only simplify capital calculations for the smallest institutions. They will still be required to implement new calculations for the default risk charge and the residual risk add-on. So, even for these smallest institutions, it is only a partial operational saving. The more conservative risk weighting and correlation also mean that for these smallest institutions, if they adopt this method, it is likely to result in an increase in capital.

6. Simon, when you say “the smallest institutions”, what sort of size are we talking about? How do we measure this?
A. The requirement is that the total size of their trading book has to be within a certain limit. And this will only affect those smallest institutions; it certainly won’t apply to anything which is considered to be domestically significant.

So, in a way it’s in proportion to the local market and also in proportion to the size of their trading book and the types of products that they have in there…

Exactly. The thing to remember as well is that at the moment this basic approach is only a proposal and is not yet incorporated into the rules. What I think is most significant is that the Basel Committee have published this document in the first place. In my opinion it reflects a long overdue recognition that the FRTB places a large operational burden on banks as well as having a potential capital hit.

LFS offers the 2-day 'Fundamental Review of the Trading Book' programme with Dr Simon Acomb in London, New York, Toronto, Sydney, Hong Kong and Singapore.
