Dutch study uncovers cognitive biases undermining cyber security board decisions

by Wire Tech

Dutch research reveals how cognitive biases can lead to catastrophic security decisions

The traditional traffic light system used by chief information security officers (CISOs) to report cyber risks to boards is showing signs of strain. After interviewing more than 10 CISOs across Europe, PhD researcher Gulet Barre from the Open University of the Netherlands has uncovered concerning evidence: the cognitive biases inherent in how boards interpret amber risks appear to be creating a dangerous gulf between what CISOs report and what boards understand.

“Think of it like driving,” said Barre, whose research focuses on communication between CISOs and boards. “We all know what to do with green and red traffic lights, but amber? Some drivers accelerate through, others brake. That ambiguity in amber is exactly what’s happening in boardrooms when CISOs present cyber security risks.”

The consequences of this misinterpretation are severe. Decisions made under the influence of cognitive distortions can lead to catastrophic outcomes, warned Barre. His research has identified seven cognitive biases that systematically undermine cyber security decision-making at the board level, with potentially devastating results for organisations.

Reactive approach to funding

Perhaps the most damaging finding from Barre’s research is what one CISO candidly admitted: “Bad news is good news. When a critical attack is executed on a competitor, then board members stand ready with a bag of money. Then you get what you want as a CISO – the red team exercises you desperately want to get funded. Unfortunately, that’s the reality.”

This reactive approach to cyber security funding creates a vicious cycle. CISOs struggle to secure adequate resources for preventive measures, whilst boards remain convinced their organisations are adequately protected until a major incident strikes. By then, the damage is done.

The problem stems from how amber-rated risks are communicated and interpreted. In traditional traffic light reporting, green signifies low risk and red demands immediate action, but amber sits in a grey area that different stakeholders read in entirely different ways.

“A CISO might lean towards amber being closer to green – they’re relatively optimistic about managing the risk,” Barre said. “But a board member might be more pessimistic, viewing that same amber risk as dangerously close to red. This fundamental misalignment in risk perception – what we call ambiguity bias – creates a massive gap in how cyber threats are understood and addressed.”
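
To make the gap concrete, here is a minimal sketch, not taken from Barre's study: the 0-10 scale, threshold values and function names are assumptions chosen purely for illustration. It shows how the same assessed risk can land in different colour bands depending on where each stakeholder draws the amber boundaries.

```python
# Illustrative sketch only: hypothetical thresholds showing how one risk
# score can map to different traffic-light colours for different stakeholders.

def classify(score: float, amber_low: float, amber_high: float) -> str:
    """Map a 0-10 risk score to a traffic-light colour using the
    stakeholder's own (hypothetical) amber band."""
    if score < amber_low:
        return "green"
    if score <= amber_high:
        return "amber"
    return "red"

risk_score = 6.5  # the same assessed risk, presented to both parties

# An optimistic CISO draws the amber band high: 6.5 still reads as amber.
ciso_view = classify(risk_score, amber_low=5.0, amber_high=8.0)

# A more pessimistic board member draws it lower: the same 6.5 reads as red.
board_view = classify(risk_score, amber_low=3.0, amber_high=6.0)

print(ciso_view, board_view)  # amber red - the misalignment Barre calls ambiguity bias
```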

Seven biases undermining cyber decisions

Barre’s research has identified seven specific cognitive biases affecting cyber security governance: optimism bias, pessimism bias, herding bias, confirmation bias, ambiguity bias, overconfidence bias and endowment bias.

Optimism bias leads CISOs to underestimate the probability of adverse outcomes, while pessimism bias can cause board members to view situations as worse than they are. The combination creates what Barre calls “catastrophic decision-making scenarios”.

Herding bias proves particularly dangerous in boardroom settings. “If there’s an ex-CISO sitting as a board member, other directors might think: ‘This person believes we should go that direction, so they must be right, let’s follow’,” Barre added. “People defer to perceived expertise without critical evaluation.”

This behaviour extends beyond individual meetings. Boards often make cyber security decisions based on what competitor organisations are doing, rather than their own specific risk profiles.

“They see a competitor moving right, so they move right too, without considering whether that’s appropriate for their organisation,” Barre said.

Confirmation bias compounds these problems. Board members who’ve seen ransomware attacks reported in the media become fixated on those specific threats, pushing CISOs to address risks that align with their preconceptions rather than the most pressing actual vulnerabilities.

Meanwhile, overconfidence bias leads board members to overestimate their understanding of cyber risks, particularly when they receive new information such as security audit reports, creating a false sense of security. And endowment bias manifests when boards resist changing existing systems.

“A CISO might recommend replacing an outdated tool, but board members, especially senior ones, will say, ‘That system is good enough’ simply because they’re used to it,” Barre said. “It’s a form of change resistance that leaves organisations vulnerable.”

The illusion of amber

The core issue, as Barre puts it, is that amber is fundamentally “an illusion”. Unlike hospitals, where patients are either admitted or sent home with no middle ground, cyber security reporting has retained this problematic middle category, which serves neither CISOs nor boards effectively.

“Amber essentially isn’t an action point,” said Barre. “It becomes a parking space for both boards and CISOs. When you have limited time in board meetings and multiple risks to discuss, there’s a tendency to push red risks towards amber to create more time for discussion. But this creates dangerous ambiguity.”

The research reveals an enormous gap between how risks are established and how they are interpreted. When the same cyber security scenario is presented to different stakeholders, CISOs might classify it as red while board members see it as green, or vice versa. This fundamental disconnect in risk assessment undermines the entire cyber security governance process.

According to Barre, it would be valuable to investigate whether temporal distance affects risk perception. “If you’re discussing a cyber security threat that could impact operations next week, board members could tend to be more pessimistic, more cautious,” he said. “But if the same threat is projected for months ahead, they may become more optimistic, less concerned.”

Reporting update needed

So, what’s the solution? Barre’s ongoing research is exploring whether traditional traffic light reporting can be salvaged or needs complete replacement.

“You can live in a house for 20 years without moving, but you can renovate,” he said. “The question is whether amber needs a ‘patient information leaflet’ like medications have, explaining exactly what it means and what actions are required, or whether we need to eliminate amber entirely.”

The pharmaceutical analogy is deliberate. When doctors prescribe medication, they provide detailed guidance on dosage, timing and potential side effects. Cyber security reporting currently lacks this specificity, leaving boards to interpret risks based on their own biases and limited understanding.

Some high-reliability organisations are already moving beyond simple colour coding, implementing sophisticated dashboards and AI-driven risk assessment tools. However, the grey area of amber persists regardless of the complexity of the reporting system.

With new European legislation, such as NIS2, making board members personally liable for cyber security failures, the stakes have never been higher. The research suggests that simply recognising these cognitive biases can be the first step towards better decision-making.

“If board members and CISOs can identify when optimism bias, herding behaviour or confirmation bias are influencing their discussions, they can hold each other accountable and make more rational decisions,” Barre said.

His experimental research, currently underway, will test whether different presentation methods can reduce bias-driven misinterpretations of cyber security risks. The goal is to develop practical tools that improve communication between CISOs and boards.

Barre’s message to current boards is stark: “I hope board members realise that green, amber, red doesn’t cover the full scope of risk. They can’t just park issues in amber and assume they’re handled.”

For CISOs, the research suggests that recognising and actively addressing board biases is as important as technical security measures. “Psychological distortions are present in every decision-making process,” Barre concluded. “The key is keeping each other sharp and recognising when these biases are influencing critical cyber security decisions.”

As cyber threats continue to evolve and regulatory pressure intensifies, organisations can no longer afford the luxury of ambiguous risk reporting. The traffic light system, which has served as a cornerstone of cyber security governance for years, may itself need a security update.

Originally published at ECT News
