To make good decisions, we first need to understand what's going on. This is "Situational Awareness." This module explains the three levels of awareness and explores the common mental traps - tunnel vision and bias - that can catch any of us out, especially when we're under pressure.


Learning Outcomes

  • Recall the three levels of situational awareness (Endsley's model).
  • Define 'fixation error' (tunnel vision) and 'cognitive bias'.
  • Give examples of common cognitive biases in healthcare.

What is Situational Awareness?

Situational Awareness is a term that originated in aviation. In simple terms, it's:

  • Knowing what's going on around you.
  • Understanding what it means.
  • Thinking ahead about what might happen next.

The most common model for situational awareness breaks it down into three ascending levels:

1. Perception (Observation)

First, you must perceive the environment. This is purely about taking in data: seeing, hearing, and gathering facts.

You look at the patient. You see the monitor reads a heart rate (HR) of 110 and a respiratory rate (RR) of 26. You see their skin is pale.

2. Comprehension (Analysis)

Next, you apply your training to understand the significance of that data.

You connect the dots. You know that an HR of 110 and an RR of 26 are not baseline numbers. You recognise that High HR + High RR + Paleness = Unstable. This isn't just data anymore; it's the recognition of a problem.

3. Projection (Prediction)

Finally, you stay ahead of the curve by projecting future risks.

You simulate the next hour in your mind: "If I walk away, this patient crashes." Because you can predict septic shock, you intervene before it happens.

Losing Situational Awareness - failing at any of these levels - is a major cause of error.

Advanced SA Simulation: Triage
Scenario: Saturday Night A&E

Challenge: Triage Protocol

The Context: You are the lead nurse. You have 1 Resus bed available.

The Task: Three patients have just arrived. You have 15 seconds to Scan (Perceive) them. You must filter the distractions and Predict (Project) who is going to crash.



Pitfall 1: Fixation Error (Tunnel Vision)

When we are under pressure, stressed, or fatigued, our brains use 'mental shortcuts' (known as heuristics) to make decisions faster. These are often useful, but they can also lead us into traps.

The first and most dangerous trap is Fixation Error, also known as "Tunnel Vision".

💡
Definition: Fixation is when "a healthcare professional's attention is so highly focused on a specific goal... that warning signs that should normally prompt a change... are entirely missed."

Your critical thinking disappears. You become fixated on a single task and fail to see the bigger picture.

A tragic and well-known example is the case of Elaine Bromiley.

  • Elaine was a healthy patient admitted for routine sinus surgery.
  • During anaesthesia, the team ran into breathing problems. It became a "can't intubate, can't ventilate" emergency.
  • For over 15 minutes, three highly experienced consultants made repeated attempts to secure the airway. They became fixated on this one task.
  • They missed warning signs (dangerously low oxygen levels) and did not listen to nurses who had brought emergency equipment to the room.
  • By the time they stopped, Elaine had suffered a severe brain injury from lack of oxygen, and she later died.

This tragedy was caused by fixation error. The team's attention tunnelled onto the task of intubation, and they lost situational awareness of the patient's overall, deteriorating condition.


Pitfall 2: Cognitive Biases

Biases are mental shortcuts (heuristics). While they help our brains process information quickly, they often lead us to the wrong conclusion. These occur unconsciously; nobody is immune to them.

Here are the three most common traps in healthcare:

⚓ Anchoring Bias

The "First Impression" Trap
This happens when we fixate on the first piece of information we receive. We "anchor" to that initial idea and find it difficult to let go, even when new evidence appears.

A patient arrives at A&E with chest pain. You immediately decide it is likely muscular. When the ECG returns with subtle changes, you dismiss them because you are already anchored to your initial - and incorrect - diagnosis.

🍒 Confirmation Bias

The "Cherry-Picking" Trap
This is the tendency to hunt for evidence that supports our current belief while ignoring anything that contradicts it.

You suspect a patient has a simple chest infection. You focus entirely on signs that confirm this (a cough, a mild fever) but unconsciously overlook the new, severe leg pain that actually points to a blood clot (a DVT, which can lead to a PE).

🧠 Availability Heuristic

The "Recency" Trap
We tend to overestimate the likelihood of events that are recent, dramatic, or memorable. If something is fresh in your mind, you assume it is more common than it really is.

You recently treated a patient with a very rare, serious condition. When the next patient walks in with vague symptoms, you immediately suspect that same rare disease, ignoring the fact that a common virus is far more likely.
Bias Detective: Cognitive Traps in Healthcare
🕵️ Bias Detective

Identify the Cognitive Trap

You will see three clinical cases where a doctor has made a quick judgment. Your job is to identify which cognitive bias is influencing their decision.


The Suspects:
⚓ Anchoring: Stuck on the first impression.
🍒 Confirmation: Only seeing what you want to see.
🧠 Availability: Influenced by recent, dramatic events.

Investigation Complete
You have reviewed all cases. Being aware of these traps is the first step to avoiding them in your own practice.

Key Takeaways

  • Situational Awareness is a key skill. It has three levels: Perception (noticing), Comprehension (understanding), and Projection (thinking ahead).
  • When we are under pressure, we are vulnerable to mental traps.
  • Fixation Error (Tunnel Vision) is when we focus so intently on one task that we miss the bigger picture and ignore warning signs.
  • Cognitive Biases are mental shortcuts that lead to errors. Common examples are Anchoring Bias (sticking to your first idea), Confirmation Bias (cherry-picking data to fit your idea), and the Availability Heuristic (over-weighting recent, memorable cases).