Mindsets, Decisions & Actions
Almost without exception, risk registers reflect the potential impact and probability of occurrence of risk events external to the organization; for example, the potential impact on enterprise (organizational) performance of a supply chain failure, whether of goods or services, or the impact of an increased cost of finance, whatever the cause. Very rare indeed is it for risks consequential to current organizational philosophy-mindset (e.g. "we've been in the business a long time – we know what we are doing") or governance-management decisions and actions (e.g. cuts to capital and maintenance budgets, freezing of wages and salaries, sinking-lid and no-replacement staff employment policies, slashing training budgets, etc.) to find their way onto the enterprise (organizational) risk register, let alone the risk dashboard.
My context here is overall Enterprise (Organizational) Performance Risk Management, not siloed subsets of risk such as financial risk, health & safety risk, operational risk, credit risk, supply chain risk, project risk, etc.; that is, the impact of today's mindsets, decisions and actions upon future organizational performance, however it is measured – in terms of profit, return on investment, program cost or outcomes delivered.
Before rambling on, a quote from G. K. Chesterton, who over a century ago wrote with great insight:
The real trouble with this world of ours is not that it is an unreasonable world, or even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.
What is hidden by the presumptions that mindset often engenders or the unknown consequences of today’s management decisions and actions?
What lies in wait with a metaphorical bite?
There is more going on here than meets the eye…
RAF Nimrod XV230
On 2 September 2006, RAF Nimrod XV230 was on a routine mission in Afghanistan when a catastrophic mid-air fire resulted in the total loss of the aircraft and the death of all those on board. Charles Haddon-Cave QC in his report (An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft in Afghanistan in 2006 – printed 28th October, 2009) concluded that organizational causes played a major part in the loss of XV230 and in particular a period of sustained organizational trauma between 1998 and 2006, which began with a strategic review in 1998. The following is an excerpt from Haddon-Cave’s report.
Financial pressures and cuts drove a cascade of multifarious organizational changes, which led to a dilution of the airworthiness regime and culture within the MOD (Ministry of Defence), and distraction from safety and airworthiness issues as top priority. There was a shift in culture and priorities in MOD towards "business" and financial targets, at the expense of functional values such as safety and airworthiness…. Airworthiness was a victim of the process started by the 1998 Strategic Defence Review.
In 1998, the same year as the strategic review, a Nimrod report warned of the conflict between ever-reducing resources and…increasing demands; whether they be operational, financial, legislative, or merely those symptomatic of keeping an old aircraft flying.
This is neither the time nor the place to discuss Haddon-Cave's report further; suffice it to say that a governance directive, implemented by management decisions and actions and underpinned by a mindset that the Nimrod was "safe anyway" because it had successfully flown for 30 years, had dire consequences that had lain in wait, so to speak, for 8 years.
Financial Services Authority (FSA)
In the United Kingdom, the FSA is mandated in law (Financial Services and Markets Act 2000) to, amongst other things:
- Maintain confidence in the financial system;
- Promote public understanding of the financial system;
- Secure appropriate degrees of protection for consumers.
It is a well-funded organization with massive intellectual capability at its disposal, both internally and externally.
Question: Why did the FSA fail so lamentably to fulfil its mandate in law?
Lord Adair Turner, now chairman of the FSA, was asked in October 2008 to review the causes of the financial crisis that engulfed not only the United Kingdom, but many other countries. His report (The Turner Review – A regulatory response to the global banking crisis, March 2009) provides some pointers to the reasons.
Before the 2008-2009 financial crisis the philosophy underpinning the FSA’s regulatory and supervisory approach was:
- Markets are in general self-correcting;
- The primary responsibility for managing risk lies with senior management and boards of individual firms;
- Customer protection is best ensured not by product regulation or direct intervention in markets, but by ensuring that wholesale markets are as unfettered and transparent as possible.
The outworking of this philosophy in practice was a supervisory approach which involved:
- A focus on the supervision of individual institutions rather than the whole system;
- A focus on ensuring that systems and processes were correctly defined, rather than on challenging business models and strategies;
- A balance between conduct of business regulation and prudential regulation, which hindsight has shown was biased towards the former.
The FSA, a risk-based regulator, embedded its firm-focused approach in ARROW (Advanced Risk Response Operating frameWork), the framework it uses to assess risks to the FSA's objectives, not shareholders'. But this most sophisticated of frameworks, constructed by highly intelligent and capable people, failed to meet the intent of its designers.
Why – what lesson do we have to learn?
The short answer is that they were refereeing the game with no line umpires, so to speak. They were not refereeing offside play because they did not know the players were offside – their (FSA) rules (no umpires) did not allow them to see the offside play – they were blinded by their own rules. Chesterton again: its 'exactitude is obvious, but its inexactitude is hidden'.
The ARROW framework revealed exactly what it was designed to reveal – no more and no less. But as with all risk frameworks, it is what is not included – what is hidden – that is of concern.
Charles R. Morris in a wonderful book published in 2008 (The Trillion Dollar Meltdown – Easy Money, High Rollers and the Great Credit Crash) makes the following observation:
All three of those trends – the shift of financial transactions to unregulated markets, the steady worsening of the Agency problem, and the pretense that all of finance can be mathematized – flowed together to create the great credit bubble of the 2000s.
In fairness to the FSA, ARROW has now been updated to ARROW II to embrace all that was learnt from the 2008-09 crisis.
The VAR (Value at Risk) Metric
The VAR metric in its many forms is designed to measure market risk and to guide trading strategies. The assumption underpinning the metric is that analysis of past price-movement patterns can deliver statistically robust inferences about the probability of price movements in the future.
Lord Turner concluded (March 2009 report) that the financial crisis had revealed severe problems with VAR techniques, in particular:
- Short observation periods and the potential for significant procyclicality;
- Models based on normal distributions systematically underestimating the chances of small-probability, high-impact events;
- Systemic versus idiosyncratic risk – assumption that movement in markets will not induce similar and simultaneous behaviour by numerous players.
At the nub of the VAR debate is the growing understanding that financial market movements, together with many other phenomena such as earthquakes, forest fires and power blackouts, conform to low-probability, high-impact power-law (fat-tail) distributions. But, and this is the issue, the rarity of these low-probability, high-impact events means that for much of the time the data appear to conform to a normal distribution, with its statistical-inference characteristics, which for obvious reasons is very attractive to the finance industry.
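The point can be made numerically with a toy sketch (mine, not from the Turner Review – the distribution choice, confidence level and all names below are illustrative assumptions). Returns drawn from a fat-tailed distribution look roughly normal in the bulk, yet a VaR figure computed under a normality assumption understates the loss actually observed in the tail:

```python
# Toy illustration (hypothetical parameters): a normal-based 99.9% VaR
# applied to fat-tailed returns understates the empirical tail loss.
import random
import statistics

random.seed(42)

# Simulate 10,000 daily returns from a Student-t distribution with
# 3 degrees of freedom -- a common stand-in for fat-tailed market returns.
df = 3
returns = []
for _ in range(10_000):
    # Student-t sample: standard normal divided by sqrt(chi-square / df)
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    returns.append(z / (chi2 / df) ** 0.5)

# 99.9% VaR under a normality assumption: mean minus 3.090 standard
# deviations (3.090 is the 99.9th percentile of the standard normal).
mu = statistics.fmean(returns)
sigma = statistics.stdev(returns)
var_normal = -(mu - 3.090 * sigma)

# Empirical 99.9% VaR: the actual 0.1st percentile of the simulated sample.
var_empirical = -sorted(returns)[len(returns) // 1000]

print(f"normal-assumption VaR: {var_normal:.2f}")
print(f"empirical VaR:         {var_empirical:.2f}")
```

Run repeatedly with different seeds, the empirical figure comes out noticeably larger than the normal-based one: the "exactitude" of the normal model is obvious, but its inexactitude is hidden in the tail it cannot see.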
The problem is that the VAR metric works more or less to expectations most of the time, but not all the time, and determining those times when it is not conforming is very difficult, if not impossible.
We have another classic instance of: It looks just a little more mathematical than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.
The Nimrod XV230 tragedy illustrates how mindsets, decisions and actions can have unintended consequences. No matter how good you believe your framework (FSA’s ARROW) to be, if you are refereeing only part of the game – the game will rapidly get beyond your control. Just because a Risk Metric (VAR) works most of the time does not mean that it is correct, nor that it will work all the time; the problem is that you do not know when it is working properly and when it is not.
That risk is so much more than a register, a framework or a mechanistic process was highlighted in a recent Ernst & Young report (Navigating the crisis – A survey of the World’s largest banks, 2008), their respondents noting that prior to the 2008-09 financial crisis, companies had underestimated the vital importance of the human factor in managing risk; that human judgement, insight and experience should be valued more highly and leveraged more fully throughout the organization.
This is the sophistication that we must aspire to if we are ever to tame the Wicked Problem that is Risk Management. We have to move our thinking beyond the mechanistic and simplistic to embrace the complexity of reality.
If we believe that by simply appointing a Risk Manager or creating a Risk Framework and a Risk Register that we are managing risk, then we are deluding ourselves. There is so much more to it than that.
Embedded in our mindsets, our decisions and our actions are the seeds of risk, the consequences of which may be immediate or not apparent until years or even decades later, if ever.
The tragedy of Nimrod XV230 is that, but for delays in its replacement programme, it would probably no longer have been flying when the disaster occurred. Years of major delays and cost overruns, however, meant that Nimrod XV230 continued to be required for operational duties.
This is the real complexity of risk: often a litany of events, none of which is of itself the direct cause, but all of which over time contribute to the ultimate fruition of the risk event.
A Risk Framework cannot capture the essence of such complexity, but a Risk-Aware Culture can and this is what both communities and organizations must strive to embed.
We need to understand that the ever-increasing connectedness of the world in which we live has created a level of complexity that simple frameworks cannot capture.
A Risk Dashboard can tell us where we are, our current state of health, but it cannot tell us, for example, that the bridge 500 metres down the road is about to be swept away by flood waters, the result of heavy rain in a catchment 20 kilometres away. A mindset that is risk-aware, however, could alert us to the possibility of impending danger.
Dr John S. Bircham
Papamoa Beach, New Zealand
December 18th, 2009