I’ve recently finished reading the book ‘How Spies Think’ by David Omand, the former chief executive of GCHQ, and it’s had me ruminating over what makes for a good decision. I’ve since pondered over questions such as how many of the decisions we make are truly rational. And how many of our decisions do we subconsciously pretend are rational, when in fact they are based on underlying biases, emotions, and faulty memories.
Being the boss of one of the world’s largest and most renowned intelligence agencies, David Omand had first-hand insight into the strategic and tactical decisions made by the UK Government to combat the threat of organised crime, hostile foreign nations, and the rise of global terrorism.
Whilst I imagine very few of the people reading this are involved in making decisions that could affect the health and safety of millions of people, there are plenty of lessons to be learned from the step-by-step methodology that decision-makers within intelligence agencies use, which Omand somewhat appropriately dubbed ‘The SEES Model.’
In this insights article, I thought I’d share a few takeaway points from The SEES Model, which I hope will help you make better-informed decisions in both your business and personal life.
What exactly is the SEES model? The SEES Model is an acronym devised by David Omand during his work as an intelligence analyst. In an unpredictable world, where everyone has underlying objectives, intelligence analysts are required to keep a cool head and to be as rational as possible when making decisions that could have huge geopolitical consequences, potentially affecting the lives of millions of people.
The SEES Model aims to help analysts make rational decisions amidst the ever-changing circumstances and pressures brought on by each situation. The model has four components: situational awareness, explanation, estimation and modelling, and finally strategic notice.
Despite being originally intended for use within the intelligence community, the model is gaining growing popularity among decision-makers from all walks of life.
The first step in the SEES model is situational awareness. Decision-makers must be aware of the environment they operate in, and constantly be on the lookout for information that fills the gaps in their knowledge. A crucial step towards better situational awareness is identifying what those gaps might be, though in a complex world this is often easier said than done.
The key to having great situational awareness, according to Omand, is to remain highly alert to new sources of information, even if they contradict what we originally believed.
A common human trait is to form a hypothesis in our mind, and then ignore or explain away any evidence that contradicts our original assumption. Being human, we often form biased hypotheses about a situation without even realising we are doing so! Furthermore, studies have shown that we actively seek out information that confirms our beliefs, and subconsciously dismiss information that contradicts them.
Omand offers two solutions for combating these inherent tendencies. The first is to keep a log of your hypotheses, so that you have a clear audit trail of your thought processes surrounding a situation and can refer back to it if required.
The second solution involves a little maths. Enter: conditional probability and Bayesian inference, mathematical techniques that allow us to rationally adjust our decision-making based on the emergence of new information.
Conditional probability is the mathematical measure of how likely something is to happen on the condition that something else has already happened. For example, what’s the probability that you’ll arrive at work by 9am if you leave the house at 8:15am?
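To make the commute example concrete, here’s a minimal sketch in Python. The dataset and field names are entirely hypothetical; the point is simply that a conditional probability is a ratio of counts: of the days you left by 8:15, what fraction did you arrive by 9?

```python
# A toy, made-up commute log: each record notes whether we left the
# house by 8:15 and whether we arrived at work by 9:00.
records = [
    {"left_by_815": True,  "arrived_by_9": True},
    {"left_by_815": True,  "arrived_by_9": True},
    {"left_by_815": True,  "arrived_by_9": False},
    {"left_by_815": True,  "arrived_by_9": True},
    {"left_by_815": False, "arrived_by_9": False},
    {"left_by_815": False, "arrived_by_9": True},
]

def conditional_probability(records, event, condition):
    """P(event | condition) = count(event AND condition) / count(condition)."""
    conditioned = [r for r in records if r[condition]]
    if not conditioned:
        return None  # the condition never occurred, so P is undefined
    return sum(1 for r in conditioned if r[event]) / len(conditioned)

p = conditional_probability(records, "arrived_by_9", "left_by_815")
# With this toy data: 3 on-time arrivals out of 4 early departures, i.e. 0.75
```

Notice that the two records where you left late are simply excluded: conditioning means restricting attention to the cases where the condition held.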
Powerful algorithms interact with you every day without your even realising it, and many are based on conditional probability. Take Spotify as an example: if you listened to Eminem yesterday, what’s the conditional probability that you’ll want to listen to Snoop Dogg today? Such an example might sound trivial in nature, but when you’re making serious decisions, having a model based on conditional probability is crucial.
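Bayesian inference builds on conditional probability to do exactly what Omand recommends: adjust an existing belief as new evidence arrives. The sketch below applies Bayes’ theorem with made-up numbers; the 30%/80%/20% figures are purely illustrative, not anything from the book.

```python
def bayes_update(prior, likelihood, likelihood_if_not):
    """Return P(H | E) via Bayes' theorem, given:
    prior             = P(H), our belief before seeing the evidence
    likelihood        = P(E | H), how likely the evidence is if H is true
    likelihood_if_not = P(E | not H), how likely it is if H is false
    """
    evidence = likelihood * prior + likelihood_if_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: we start 30% confident in a hypothesis, then
# observe evidence that is 80% likely if the hypothesis is true but
# only 20% likely otherwise.
posterior = bayes_update(prior=0.30, likelihood=0.80, likelihood_if_not=0.20)
# The evidence raises our confidence from 0.30 to roughly 0.63
```

The appeal for decision-makers is that the update is mechanical: you commit to the likelihoods in advance, so the new information moves your belief by a defensible amount rather than by however much it happens to feel persuasive.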
The second step in the SEES model is an explanation for what we are seeing. After we have gathered (and continue to gather) information about our environment, we need to explain why we believe that certain events are unfolding in the way that they are. This stage often leads to plenty of biases coming to the forefront of our decision making, hence why it’s crucial we minimise the cognitive biases outlined previously.
Once you’ve gathered your information, you need to formulate several hypotheses that could explain why you are seeing what you are seeing. It’s important not to dismiss a hypothesis too quickly, since, being human, we are likely to be swayed by subconscious beliefs and prejudices. It’s equally important not to jump to conclusions.
A great exercise at this stage is to list out each hypothesis, and then list each item of evidence that supports or contradicts it. The idea is that the hypothesis with the least amount of evidence contradicting it is most likely to be the truth.
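The listing exercise above can be sketched as a simple evidence matrix. The hypotheses and markings below are invented for illustration; the mechanism is just what the text describes, ranking hypotheses by how little evidence contradicts each one.

```python
# Hypothetical evidence matrix: for each hypothesis, "C" means an item
# of evidence is consistent with it, "I" means inconsistent.
matrix = {
    "H1: the outage was an attack":      ["C", "I", "I"],
    "H2: the outage was a bad deploy":   ["C", "C", "I"],
    "H3: the outage was a hardware fault": ["C", "C", "C"],
}

def rank_hypotheses(matrix):
    """Sort hypotheses by the count of inconsistent ('I') items, fewest first."""
    return sorted(matrix, key=lambda h: matrix[h].count("I"))

ranking = rank_hypotheses(matrix)
# ranking[0] is the least-contradicted hypothesis, H3 in this toy matrix
```

Ranking by contradictions rather than confirmations is deliberate: almost any hypothesis can accumulate supporting evidence, but inconsistent evidence is much harder to explain away.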
The third step in the SEES model is estimation and modelling. In this stage, we forecast ahead to predict what the outcome would be if a certain scenario unfolds, based on the information we have already gathered. Remember that your estimates can be thrown off by unforeseen events, and it’s important to have contingency plans in place to help you respond.
The fourth step in the SEES model is strategic notice. By this point, we’ve already gathered information, formulated hypotheses, tried to predict how events might unfold, and prepared contingency plans for unforeseen events and circumstances. However, we’re not quite finished yet.
In the fourth stage of the SEES model, we need to raise our periscope and attempt to foresee far-flung strategic risks which could knock our decision-making models off course.
Strategic risks tend to be remote scenarios, each with a varying degree of probability and harm, which could cause serious disruption to your business if they took place. The timeliest example of a strategic risk would be the recent Covid pandemic, which caught many unprepared businesses off-guard.
Omand suggests that decision-makers should have a thorough grasp of the risks that could knock their plans off track, regardless of how remote they may seem.
Omand’s book is far more thorough and detailed than this synopsis of my takeaways could ever do justice to. I’d encourage you to add it to your reading list if you haven’t done so already.