Seeing the future as merely an extension of the past is a fundamental error. Complexity’s immense, volatile and unpredictable character means tomorrow will resemble today even less than today resembled yesterday. So navigating a journey through the rear-view mirror won’t help you avoid the brick wall up ahead.
Following the stock market crash of October 1987, value-at-risk (VAR) models became all the rage in financial services. JP Morgan was an early convert and pioneer, as VAR provided the Chairman, promptly at 4:15pm each day, with a single number that defined the extent of the bank’s risk exposure. The allure of sophisticated mathematics to simplify risk management fanned VAR’s popularity, with over 200 books published about it. The only problem was that it was nothing more than a “beautiful lie.”
VAR is driven by historical data and dependent on one huge assumption: that the future will follow the same pattern as the past (from which the data was taken). In other words, if you haven’t driven into a wall yet, you won’t. But as Nassim Taleb has pointed out, the highly improbable nature of rare events (black swans) must be considered together with the outsize nature of their potential consequences. Driving into a wall may be unlikely but, as it’ll probably kill you and everyone else in the car, you’re probably best advised to remain aware enough not to do it.
The culture of financial services firms particularly predisposes them to VAR-type errors. Miles Kennedy, a PwC partner, described the culture as a “tendency to place greater confidence in risk information that is data-driven, in the belief that this confers objectivity and truth.” But whilst objectivity is a noble aim in business - freeing us from cognitive bias and politicking - objectivity “doesn’t equate to truth” where the future is concerned, for “risk is about the future, and there are no facts about the future.” Substituting pleasing yet flawed forecasts for genuine foresight about possible futures and consequences is, as the world has discovered over the last six years, dangerous.
The economics professor and writer John Kay describes this cultural bias as a “belief that a number based on the flimsiest of data is better than a qualitative, and necessarily subjective, judgement.” Although this bias is rampant in the ‘numbers professions’ - bankers, accountants, economists - it’s spreading to other organisations through the allure of big data. And where certainty is claimed about the future - a place where no certainty can exist - that certainty will prove a treacherous guide to navigating an increasingly complex world.