Andrada Fiscutean
Freelance writer

Why are people so bad at risk assessment? Blame the brain

Feature
Nov 09, 2021 · 10 mins
CSO and CISO · Risk Management

Stakeholders and CISOs tend to have different perspectives on estimating the risk of a potential cybersecurity incident. Understanding the psychological aspects can help bridge the gap.


More than three decades have passed since Brain, one of the first computer viruses to spread around the world, was released in 1986. Since then, we’ve witnessed a wide range of attacks: Stuxnet destroyed almost a fifth of Iran’s nuclear centrifuges, WannaCry infected computers in 150 countries, ransomware gangs have stolen millions of US dollars, and thousands of companies have been affected by data breaches. Yet many organizations still underestimate the risk of a potential cybersecurity incident.

As humans, we’ve developed a danger detection system to protect against concrete threats, such as wolves, bears, and other predators. When it comes to the relatively new field of cybersecurity, it’s different. “The risk of a cyberattack seems a pretty abstract concept,” says Ralf Schmälzle, assistant professor at Michigan State University. “We cannot sense the danger.”

Schmälzle, who studies the brain mechanisms of successful risk communication, says that danger means different things to different people. Risk is an “uncertain judgment,” meaning that it’s highly subjective. “The problem with computer safety and viruses is that you don’t feel any risk,” he says.

In many organizations, there appear to be two sides: stakeholders and CISOs. “Stakeholders, who have never looked at firewall logs to see how many attacks are continuously happening, tend to be overly relaxed and believe that CISOs are overly paranoid,” says Stefan Tanase, cyberintelligence expert at CSIS Group. “The two sides, stakeholders and CISOs, have different opinions and it’s difficult for them to understand the other perspective.”

Academics specializing in risk assessment and management say several barriers prevent us from understanding risk, including our biases, our experience (or lack of it) with similar incidents, and even our perceived performance at work, which can influence our willingness to take chances.

What prevents us from assessing risk correctly?

When estimating potential risks, we often rely on our intuitive sense of danger. We tend to be too optimistic or overconfident. We might also be subject to confirmation bias or have a false sense of control that could skew our perspective.

“People are all over the map when it comes to [assessing risk], but they lean toward being excessively optimistic about most things, including cybersecurity,” says Hersh Shefrin, Mario L. Belotti Professor of Finance at Santa Clara University’s Leavey School of Business and author of Behavioral Risk Management: Managing the Psychology that Drives Decisions and Influences Operational Risk.

Schmälzle also notes that unrealistic optimism tends to influence our judgment. “[It’s] the assumption that we will be better off than others: less likely to have cancer, more likely to have smart kids, less likely to become a victim of a cyberattack,” he says. While this tendency has been documented around the world, it seems to be accentuated in developed societies.

Closely related to that is the illusion of control, a bias that can have terrible consequences in a business setting. When we feel we are in charge, as many C-level executives do, we tend to downplay risks. For example, many people feel safer driving their cars than boarding planes, although flying is statistically far safer.

Schmälzle, who has “a tiny bit of flying anxiety,” understands this well. He adds, though, that our perception of risk is also shaped by how far in the future the dangerous event lies. When he books tickets a few weeks before a flight, he feels no anxiety. “However, once we are accelerating down the runway, then there’s a bit of a sweaty-palm feeling creeping in.”

He feels this example illustrates a broader point: “In my case, it is only the immediate situation (perceived danger) that elicits the feeling—imagining it from a distance doesn’t cut it,” he adds. “If things are too remote, too far ahead, then we don’t ‘feel’ the risk—regardless of how much experts want to warn us about this or that.”

Even when experts warn us, we might discard that advice, as we tend to favor the information we already know and want to hear, ignoring the rest. Researchers say confirmation bias shapes how we search for information, how we interpret it, and how we recall it. Realistically, we can’t fully avoid it, but critical thinking can keep it in check.

Assessing risk accurately is a difficult task, and even the smallest things matter, including how we feel about our job performance. “Fear of failure, being below aspiration, induces people to be aggressive risk takers,” Shefrin says. “That is a key reason underlying breaches at companies like Yahoo and Equifax.”

What baffles Schmälzle the most when it comes to cybersecurity is that organizations get hit multiple times in a row—Acer and Olympus are recent examples. “Normally, once you get burned, you become very cautious,” he says. “For instance, putting your hand on a stove is normally an example of single-trial avoidance learning.”

Schmälzle says that one explanation for getting attacked repeatedly without correcting errors is that companies that have only suffered small incidents find it hard “to vividly imagine the worst case.”

CSIS’s Tanase says that the business world might suffer from alarm fatigue. “When reading threat reports, a lot of them are unfortunately overhyped for marketing and PR purposes. How can the reader correctly assess the risk when every cyberattack is ‘the most complex’ and every threat actor is ‘the most sophisticated’? When everything is important, nothing is important.”

Getting better at understanding risk

Although our brains are not necessarily optimized to assess the risk of cybersecurity incidents, we can do a couple of things to improve our chances. First, cybersecurity could learn from its older sister, physical security, as Matt Blaze suggested in an iconic paper published in 2004. The fundamental idea of the paper is that almost all systems can be broken given enough time.

“Perhaps owing to its long history and relatively stable technological base, the physical security community—and especially the safe and vault community—generally seeks remarkable precision in defining the expected capabilities of the adversary and the resources required for a successful attack to occur,” he wrote. “Far more than in computers or networks, security here is recognized to be a tradeoff, and a quantifiable one at that. The essence of the compromise is time.”

Talking about risks should be more common in the cybersecurity world, says Tom Sammel, a retired US Marine Corps officer who has worked as a CISO and is now an incident response commander at Secureworks. Over the past few decades, he has noticed that companies tend to postpone uncomfortable conversations. “We live in a world of pain that most executives fear to go to,” he says. “They struggle with equating an incident like data theft or ransomware to how that may impact business reputation or operations. Since this calculus is hard, they tend to simply avoid it.”

Some tools based on the NIST Cybersecurity Framework can help organizations identify and reduce cybersecurity risk with relatively little effort, says Michael Benz, partner and fractional CIO at Fortium Partners. These are particularly relevant to small and mid-sized organizations, which often lack the resources of large enterprises. While these tools are not perfect, they can make a difference in organizations that aren’t doing enough to protect against digital threats. “I’ve found that most C-level executives rely on hope, prayer, and luck to drive their cybersecurity risk management strategy,” Benz says.

Organizations shouldn’t focus only on having the right tools. They should also hire the best people for the job. This means including diverse employees and giving them a voice, says cybersecurity entrepreneur Jane Frankland, author of the book IN Security: Why a Failure to Attract and Retain Women in Cybersecurity is Making Us All Less Safe. “Countless studies have shown that women and men do gauge risk differently,” she says. “Women are typically more risk averse and their natural detailed exploration makes them more highly attuned to changing pattern behaviors—a skill that’s needed for correctly identifying threat actors and protecting environments.”

Women score highly when it comes to intuition and emotional and social intelligence, Frankland adds, and diverse groups are typically smarter and better at reacting in an emergency. “Any time you have uniformity of thought, you miss out on the most creative solutions or tactics that can help us beat the threat actors,” she says.

With more diverse teams, better tools, and better processes, organizations can get better at assessing risk. Things have changed in the past decade, says Sammel. Risk is experiential, and many organizations that have gone through incidents have learned from the experience. Still, there’s a gap between what stakeholders think might happen and what CISOs see in the logs. That gap can be narrowed if security experts find better ways to talk about risk.

Communicating risk effectively

After months or years of living in risky situations, people become desensitized to changes in the numbers, as the COVID-19 crisis has shown. “What does it really mean to say that a risk is 60% or 80%?” Schmälzle asks. “Practically, I cannot use this information. Rather, when it comes down to assessing my personal risk, what I do is break it down into two categories: ‘could happen’ versus ‘probably won’t happen.'”

This is why CISOs must spend time explaining to stakeholders that risk is not abstract but concrete: worst-case scenarios will happen given enough time, as Blaze’s paper argued a decade and a half earlier.

“The CISO has to spend time with their teams, work through different scenarios. Then, they have to talk through how different incidents unfold,” Sammel says. Once that analysis is done, CISOs “need to take the results of that analysis and go find the most appropriate stakeholders to help quantify the effects of the incidents.”

These conversations should include specific numbers. Researchers recommend expressing risk as a combination of the probability of the unwanted event and the severity of its consequences. CISOs should also warn stakeholders that purchasing insurance, checking all the compliance boxes, and buying security products all help, but none of it is enough on its own.
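To make the probability-times-severity recommendation concrete, here is a minimal sketch in Python. It is not from the article; the scenario names, probabilities, and dollar figures are hypothetical placeholders that a security team would replace with its own estimates.

```python
# Hypothetical sketch: risk as probability x severity (annualized loss expectancy).
# All scenarios, probabilities, and impacts below are made-up illustrations.

scenarios = [
    # (scenario, estimated annual probability, estimated cost in USD if it happens)
    ("phishing-led data breach", 0.30, 2_000_000),
    ("ransomware outage",        0.10, 8_000_000),
    ("insider data theft",       0.05, 1_500_000),
]

for name, probability, impact in scenarios:
    expected_loss = probability * impact  # probability x severity
    print(f"{name}: expected annual loss of about ${expected_loss:,.0f}")
```

Even a rough table like this gives stakeholders a ranked, dollar-denominated view of risk instead of an abstract percentage.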

“In the US Marine Corps, we had fun saying, ‘An inspection-ready Marine is never ready for war, and a combat-ready Marine is never ready for inspection,’ because there’s being functionally ready for something, and then there’s being ready for an inspection,” Sammel says. “It takes a good security team to go in and look at the questions and then dive beyond just what the question is asking.”

CISOs can also translate the risk of potential cybersecurity incidents into money. For example, during one pre-ransomware incident Sammel worked on, he advised the victim company to shut down operations for a few days to fix the issues. “I broke it down numerically for them,” Sammel explains. “I said: if you take this outage, you’re going to cost yourself about five to eight million. If you don’t take the outage and the attacker is able to come back in, gain a foothold and detonate and put you out of work for two to four weeks minimum, you’re looking at a $500 million bill. It takes that level of honesty, of brutal understanding of a situation.”
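Sammel’s reasoning can be framed as a simple expected-cost comparison. The sketch below uses the dollar figures he quotes; the probability that the attacker returns is an assumed illustration, not a figure from the actual incident.

```python
# Framing the outage decision as an expected-cost comparison.
# Dollar figures come from Sammel's account; the reattack probability is assumed.

planned_outage_cost = 8_000_000    # upper end of the quoted $5M-$8M outage
ransomware_bill     = 500_000_000  # quoted cost if the attacker detonates ransomware
p_reattack          = 0.10         # assumed chance the attacker regains a foothold

expected_cost_of_inaction = p_reattack * ransomware_bill
print(f"Planned outage: ${planned_outage_cost:,}")
print(f"Expected cost of doing nothing: ${expected_cost_of_inaction:,.0f}")
# Even at an assumed 10% chance of reattack, inaction is expected to cost
# far more than the planned outage.
```

Under these assumptions, doing nothing carries an expected cost more than six times that of the planned outage, which is exactly the kind of brutal arithmetic Sammel describes.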

Still, even when CISOs translate the risk into money, they should keep in mind that “facts alone don’t move people,” as Shefrin puts it. Instead, hard facts should be complemented by vivid images and emotions that prompt stakeholders to take action. “Emotion refers to ‘motion,’ meaning the brain’s system for moving, taking action,” he says.