Six things I have learnt from thinking about alien invasion

Hopefully that grabs your attention. But I am not actually going to talk about alien invasion per se. Rather, the topic of this article is extreme risks – potential events that are unlikely to occur but that could have a significant impact on economic growth and asset returns, should they happen.

Extreme risks have always been of special interest to us in the Thinking Ahead Group. Our belief is that, in a complex world, extreme risks are more likely than most financial models imply. Moreover, we live only once, facing problems in series, not in parallel. So when we are confronted with an extreme event, there is no going back in time to dilute the impact with other, less negative ones; we must deal with its consequences. Plus, there’s a nerdy appeal to having the intellectual freedom to debate what could happen if a hostile extra-terrestrial invasion were to occur…

We have just published our fourth report on extreme risks. The top three extreme risks identified in this latest update are global temperature change, global trade collapse and cyber warfare. It has been ten years since we published the first report in 2009, so I thought now was a good opportunity to reflect on my personal learning journey. Here are my six lessons learnt:

  1. Cognitive biases are powerful

At a conference I recently attended, Richard Thaler, the behavioural economist who won the 2017 Nobel Prize in Economics, observed that people always refer to biases as “what other people do”. We all think we are above average at avoiding those biases. And that itself is a bias. Back to extreme risks: in our 2009 report, we called out economic depression, hyperinflation and excess leverage as the top three risks. It is hardly surprising that this was the view in 2009. But were we over-weighting recent experience then? And are we doing so now? Could current concerns and headlines around climate change, trade wars and cyber warfare be drawing attention away from lower-profile but greater existential threats?

  2. When it comes to extreme risks, physics envy is particularly harmful

We knew this from the get-go: by definition, extreme risks are infrequent, so a quantitative approach is unlikely to be very informative. In 2009 we identified five risks (excessive leverage, depression, currency crisis, political crisis and protectionism) that we believed each had a one-in-10-years likelihood. How long a historical record would we need to have confidence, in a statistical sense, in such a claim? Much longer than 10 years, and probably a lot longer than anyone’s career. And even if you successfully build a long enough history, by the time you have it, the underlying driving forces will have evolved so much that the historical distribution may be irrelevant to future outcomes.
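To give a rough sense of scale, here is a minimal back-of-the-envelope sketch (my own illustration, not from the report) using the normal approximation to the binomial: how many years of annual observations would we need before a 95% confidence interval pins down a “one-in-10-years” probability to a useful precision?

```python
import math

def years_needed(p, half_width, z=1.96):
    """Years of annual observations needed so that a 95% confidence
    interval for an annual event probability p has the given half-width,
    using the normal approximation to the binomial.

    half_width = z * sqrt(p * (1 - p) / n)  =>  solve for n.
    """
    return math.ceil(z**2 * p * (1 - p) / half_width**2)

# A "one-in-10-years" event: p = 0.1.
# To pin the estimate down to 0.1 +/- 0.05 we would need:
print(years_needed(0.1, 0.05))  # prints 139 -- far longer than any career
```

Even with this generous tolerance (the interval barely excludes “one-in-20-years”), the required record is well over a century, which is the point of the paragraph above.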

  3. Understanding cause and effect is the way to go

However, that doesn’t mean we should give up on understanding these risks. Human intelligence is not limited to learning from observing the past (inductive reasoning); we are also capable of applying generalised truths to circumstances that have not yet occurred (deductive reasoning). Human civilisation has never experienced warming of 2°C or beyond. But that shouldn’t stop us from trying to understand the potential impact of such scenarios. For example, we have knowledge of the ice-albedo feedback and other linear and non-linear climate feedback loops. We understand well enough the effect of rising temperatures on sea level rise, on the frequency of heat waves, on the risk of rainfall extremes over land, on the global population exposed to severe drought and on crop yields. An event without historical precedent can still be studied and understood.

  4. Turn your “unknowns” into “knowns”

The more time I have spent thinking about extreme risks, the more I am reminded of what I do NOT know. Over the years I have found it useful to distinguish between the knowable parts of the “unknowns” and the unknowable parts, because the ways to address them are very different. Dealing with the knowable parts requires intellectual curiosity and diligence. We can turn “unknowns” into “knowns” by collecting more information, building more sophisticated models and/or stronger theories and, of course, learning from others. A list of risk events you have not thought about before is one such opportunity to turn your “unknowns” into “knowns”. It allows you, eventually, to construct hedging strategies to protect against the risks you are unwilling to take.

  5. Addressing “unknowables” is about making a portfolio resilient

On the other hand, “unknowables” are knowledge that is simply out of reach at any point in time. There is no data or theory about them. They are unpredictable. They are the “black swans” of Taleb’s terminology. Alien invasion is very much in that territory. But we shouldn’t let this knowledge vacuum paralyse our decision-making. It is simply a reminder that our understanding of the world is always incomplete. The existence of “unknowables” means that resilience in an investment portfolio is at least as valuable as efficiency. Take diversification as an example. An investment portfolio with genuine diversity offers protection not only against unrewarded idiosyncratic risks, but also against our own ignorance.

  6. A mind-expanding exercise

At the end of the day, I see extreme-risks thinking as an exercise for the mind. These scenarios remind us that it is naïve and dangerous to cling to a single vision of the future. Yes, we do not know what the future holds. But our brains are more than capable of imagining multiple versions of the future. And that, ultimately, is the game investing is in. As investors, we are trying to navigate a highly volatile, uncertain, complex and ambiguous world. In my view, the extreme risk scenarios described in our report(s) can be turned into useful material to facilitate a collective learning experience for your organisation. The scenarios are most effective when they are used, in a deliberately created interactive environment, to make explicit – and to challenge – the assumptions that underpin your investment portfolios or your business strategy[i].

When I worked on our first extreme risks report, never in a million years did I expect one day to be accused of “alien-washing”[ii]. Seriously or not, it happened. It certainly wasn’t an extreme risk – despite the very low probability, the impact was nothing worse than a good laugh. I do hope, however, that our analysis will be of some value in helping both to prepare for and to respond to extreme risks – whatever form they take.


[i] To truly harness the power of scenario learning, we hope this Thinking Ahead Institute paper – It's story time: The why, how and what of scenario learning – can help you.

[ii] Quoting directly from this report – “Many of the initiatives that were identified seemed to resemble ‘alien-washing’. For example, despite the fact that Towers Watson communicated on alien invasion as one of the top 5 extreme environmental risks, there is no evidence that this risk is considered in the context of investment consulting services offered by Towers Watson.”