Singapore

An over-engineered finance industry has gone too far

By Moorad Choudhry

The other day I conducted a simple statistical experiment. I calculated the “value-at-risk” exposure for a hypothetical investment portfolio consisting of just one bond holding.

The bond I selected for the exercise was a vanilla fixed-coupon, fixed-term security issued by Aston Martin (no doubt where my aspirations on car ownership lie!). My fictitious sample portfolio held £1 million notional of this bond, with a market value of £1.003 million at the time.

Depending on which calculation methodology I selected, the VaR for this portfolio ranged from around £1,000 to just over £11,000, a variation of more than tenfold. This spread is so wide as to render the result almost unusable.
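For readers who want to see where such divergence comes from, here is a minimal Python sketch of two standard VaR methodologies, the parametric (variance-covariance) approach and historical simulation, applied to a single holding of roughly the same market value. It uses synthetic return data rather than the Aston Martin bond itself, so the numbers it prints are purely illustrative.

import numpy as np

np.random.seed(0)
position_value = 1_003_000  # market value of the single bond holding, in GBP
# Hypothetical fat-tailed daily returns standing in for observed market data
daily_returns = np.random.standard_t(df=4, size=500) * 0.002

# Methodology 1: parametric (variance-covariance) 1-day 99% VaR,
# which assumes returns are normally distributed
z_99 = 2.326  # 99th percentile of the standard normal distribution
parametric_var = position_value * z_99 * daily_returns.std()

# Methodology 2: historical simulation VaR,
# which takes the 1st percentile of the simulated P&L outcomes
historical_var = -position_value * np.percentile(daily_returns, 1)

print(f"Parametric VaR:  GBP {parametric_var:,.0f}")
print(f"Historical VaR:  GBP {historical_var:,.0f}")

Because the parametric method leans on a normality assumption while historical simulation is driven by whatever the worst observed days happened to be, the two figures can differ materially even for this one-position portfolio.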

And that’s for a portfolio we would be hard-pressed to surpass in terms of simplicity – just think what variation of result must exist for complex portfolios of the kind held by the larger financial institutions.

And yet VaR results are an input to the level of regulatory capital a bank is required to hold. What a scary thought! If a bank’s capital is based on VaR model output, which result do you think it would use when calculating its capital base requirement: £1,000 or £11,000? The original regulatory capital accord, Basel I, attracted heavy criticism partly because of its “scattergun” approach, with risk-weight categories that were far too broad.

Basel II was an attempt to address this, but because internal models are so varied in scope and output, one can’t really make meaningful comparisons of results across banks. We see the same problem everywhere in finance.

Complex mathematical models pile assumptions on top of assumptions to come up with valuations, risk exposures or loss-given-default estimates that could be out by 50% or more. These models have crossed over into the funding and liquidity space as well, traditionally the preserve of the less quantitatively minded types in a bank, where some banks are now calculating “liquidity-at-risk”. You’re kidding me?!

When the method becomes more important than the result, it’s time to roll back the clock. The only people who benefit from over-engineering finance are software vendors and the consultants who advise banks on implementing this software.

Betraying something of a Pauline conversion in this regard, I see in hindsight how the valuation approach for certain structured finance securities – relying on a metric known as default correlation, which can’t actually be observed in the market – was not really tenable. Yet it became accepted practice, adopted by rating agencies and approved by regulators.

This column always looks to extract some silver linings from the cloud of the crash. Here could be one of them: roll back the influence of the “quants” and re-adopt simpler approaches.

The search for precision in risk estimation is chimerical; we should accept that banking is as much art as science, and roll back the over-engineered approach that has governed so much of finance over the last 20 years. It hasn’t made the art of banking easier or clearer; it has just made the mistakes, when they do occur, that much larger each time we discover them.
