The overlooked component of risk management

Michael Gips, Principal of Global Insights in Professional Security, and Dr Gavriel Schneider, CEO of the Risk 2 Solution group of companies, discuss the importance of psychology in risk management.

Effective risk management

Good risk managers assess their organization’s risk profile and apply appropriate strategies and measures commensurate with its risk appetite and tolerance.

Very good risk managers adapt their approach to stay flexible in a fast-moving world, heightening organizational resilience and the ability to recover from a crisis.

Excellent risk managers combine strategic and tactical thinking.

They adopt a future focus and an opportunistic mindset that enable their organization to anticipate crises and leverage the opportunities that arise from them, emerging stronger afterwards.

The best risk managers also incorporate an extra element: they understand the psychology of risk.

They are students and shapers of the organization’s psychological approach to risk management, as well as of the risk psychology of leadership, asset managers, rank-and-file employees, and other stakeholders.

They also understand their personal psychological approach to risk – a whole-of-person approach – which they apply to staff throughout the organization.

After all, people are the most important component of effective risk management – knowing individuals’ risk appetite and tolerance, understanding their motivations and biases, and comprehending how organizational culture and structure influence decision-making and judgment.

Often unrecognized, misunderstood or ignored, personal risk psychology is a crucial underpinning to risk management.

What is risk psychology?

Understanding and leveraging psychology enables you to grasp why people – most importantly, yourself – think, feel and act the way they do, especially in trying or stressful situations.

That’s critical for the following reasons:

  • It predicts reactions: risk isn’t just about things that might go wrong; it’s also about how people respond to these situations. Knowing psychology helps us anticipate what people might do in different scenarios
  • It leads to smarter choices: biases and mental shortcuts can twist thinking. By understanding these biases, we make better, clearer decisions
  • It enhances communication: whether it’s giving instructions or sharing information, knowing how people think and what motivates them can make our communication more effective
  • It builds mental toughness: psychology shows us how to be mentally and emotionally resilient, so we can recover quickly from difficult situations
  • It encourages healthy practices: understanding psychology helps in getting people to take action before problems happen, like following safety rules or looking after their mental health
  • It motivates staff: good leaders need to know how to motivate and guide their teams, especially when things get stressful. Psychology can offer tips on how to do this well
  • It helps handle crises: in a crisis, knowing about psychology can help manage stress and fear, both for yourself and others. It’s about keeping a cool head and helping others do the same

Cognitive bias and heuristics

Everyone is biased, in the sense that they tend to respond in a certain way due to flaws or distortions in their decision-making.

Cognitive biases and heuristics are mental shortcuts that our brains use to simplify decision-making in everyday life.

While they can be helpful in quickly processing information and making decisions, they can also lead to errors in judgment.

Identifying, acknowledging and overcoming biases is critical in risk management.

Cognitive biases are patterns of thinking that can lead to irrational or inaccurate decisions.

They’re often affected by our experiences, emotions or social influences.

For example, “confirmation bias” leads us to favor information that validates our existing beliefs and ignore information that contradicts them.

If you believe an alarm provider is the best, you might only notice positive reviews about it and ignore the negative ones.

Heuristics are simple, efficient rules (either learned or hardwired into our brains) that help us make quick decisions.

One common heuristic is the “availability heuristic,” where we judge the likelihood of an event based on how easily examples come to mind.

If you’ve recently read about attacks in hospital emergency rooms, you might overestimate the danger of entering a hospital, even though it’s statistically very safe.
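To see how large that gap can get, here is a minimal Python sketch of ours (the incident rate and the vividness weighting are invented for illustration, not drawn from the article or any real data):

```python
import random

# Hypothetical illustration of the availability heuristic in miniature.
# TRUE_INCIDENT_RATE and VIVIDNESS_BOOST are invented numbers.
random.seed(42)

TRUE_INCIDENT_RATE = 0.001   # assumed share of hospital visits involving violence
VIVIDNESS_BOOST = 50         # vivid incidents come to mind far more easily

# Simulate 10,000 hospital visits; only a handful involve an incident.
visits = [random.random() < TRUE_INCIDENT_RATE for _ in range(10_000)]

# "Recall" weights each visit by how easily it comes to mind, so every
# incident is as retrievable as 50 uneventful visits.
weights = [VIVIDNESS_BOOST if incident else 1 for incident in visits]
recalled = random.choices(visits, weights=weights, k=1000)

print(f"actual incident rate:    {sum(visits) / len(visits):.4f}")
print(f"perceived incident rate: {sum(recalled) / len(recalled):.4f}")
```

Because vivid incidents dominate recall, the perceived rate comes out dozens of times higher than the true rate – the same gap between felt risk and statistical risk described above.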

Another important heuristic in risk management is the affect heuristic.

This refers to the process of relying on emotions to make a quick decision.

In crisis scenarios, this can have tragic consequences.

Consider an active shooter situation in a building that houses a daycare center.

An employee gets word of an assailant in the building and, overriding the organization’s clear policies and procedures and their own scenario training, disregards their own safety to reach their child.

Mental shortcuts can be both beneficial and problematic.

On the plus side, they speed decision-making and help us ignore extraneous information in a complex world.

For instance, if you always drive the same route home because you are familiar with the traffic patterns, you’re using a heuristic to make a quick and effortless decision for your commute.

On the other hand, biases and heuristics can lead to flawed decisions.

For example, taking the same route home every day may make you an easy target for kidnappers or armed robbers.

Understanding cognitive biases and heuristics helps us recognize when our decision-making might be going astray.

By being aware of these mental shortcuts, we can make more reasoned and objective decisions.

Cognitive dissonance

Cognitive dissonance describes the discomfort that results when a person holds two or more contradictory beliefs, values or ideas simultaneously, or when these beliefs are challenged by new information or new interpretations.

This state of tension drives individuals to reduce the dissonance, either by changing their beliefs or rationalizing the contradictions.

In the context of risk management and intelligence, understanding and managing cognitive dissonance is crucial for several reasons, including improved decision-making, adaptability and learning, enhanced risk evaluation, and innovation and creativity.

Strategies to handle cognitive dissonance effectively include:

  1. Recognize and acknowledge cognitive dissonance. Understanding that it’s a common psychological response can make it easier to address
  2. Increase your knowledge of the subject causing the dissonance. Educating yourself might help reconcile conflicting beliefs or justify a change in attitude
  3. Be flexible and open to change, which can help you adapt based on new information or understanding, reducing the psychological discomfort associated with dissonance
  4. When you notice conflicting beliefs, consider which belief aligns more closely with your values and overall goals, and adjust accordingly
  5. Engage in critical thinking to evaluate the validity and reliability of the information that is contributing to the dissonance. This can help prioritize which beliefs might need reassessment
  6. Seek advice. Discuss your thoughts and feelings with others. Sometimes, external perspectives can provide clarity or alternative ways of thinking about the issue
  7. Make small gradual changes instead of drastic ones. Gradually adjusting your beliefs can reduce the impact of dissonance
  8. Reflect on why you hold your current beliefs and how they align with your actions. Reflection can lead to deeper understanding and easier reconciliation of conflicting thoughts

By actively managing cognitive dissonance, you enhance your ability to make decisions that are more aligned with your true self and values, which is essential for developing risk management and intelligence capabilities and for fostering resilience in many aspects of life.

Whole of person model

No one can completely cordon off aspects of their lives.

Work life, social life, family life and virtual life all spill over into each other.

Consequently, understanding anyone’s risk psychology requires an understanding of how they perceive risk within these separate spheres and holistically – what we call the Whole of Person Model.

This approach ensures comprehensive, adaptive and proactive risk management, enhancing overall performance and resilience in a complex and interconnected environment.

An organization can take on the collective risk psychology of its personnel or populations.

In fact, risk psychology and, accordingly, risk tolerance and appetite, can differ by team, floor, department, facility, campus or geography.

It’s critical to recognize that these risk profiles can be at odds with each other.

For example, software companies are under constant pressure to get new products, versions and updates into the hands of consumers, making tough decisions along the way on dealing with software vulnerabilities, privacy issues and other potential risks of moving quickly.

Sales and marketing teams tend to be risk-tolerant, while departments such as compliance and legal are risk-averse.

However, even within the software development team, there may well exist substantive differences in risk psychology.

While it may not be feasible for a risk management professional to calculate a risk psychology profile for every staff member, risk managers from various departments, facilities and geographies can collectively create risk profiles of key staff and determine the risk profiles of everything from small teams to the entire enterprise.
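As a rough illustration of what that rollup could look like in practice – a minimal sketch assuming a simple 1-to-10 tolerance score, with hypothetical names and groupings rather than the authors’ actual method:

```python
from collections import defaultdict
from statistics import mean

# Minimal sketch: roll individual risk-tolerance scores up into group-level
# profiles. The 1 (risk-averse) to 10 (risk-tolerant) scale, the names and
# the groupings are all hypothetical.
staff = [
    {"name": "A", "dept": "sales",      "team": "emea",  "tolerance": 8},
    {"name": "B", "dept": "sales",      "team": "emea",  "tolerance": 7},
    {"name": "C", "dept": "legal",      "team": "priv",  "tolerance": 2},
    {"name": "D", "dept": "compliance", "team": "audit", "tolerance": 3},
]

def profile_by(key: str) -> dict:
    """Average tolerance score per group (dept, team, etc.)."""
    groups = defaultdict(list)
    for person in staff:
        groups[person[key]].append(person["tolerance"])
    return {group: mean(scores) for group, scores in groups.items()}

print("by department:", profile_by("dept"))  # sales vs legal mismatch is visible
print("by team:      ", profile_by("team"))
print("enterprise:   ", mean(p["tolerance"] for p in staff))
```

Grouping by a different key – team, facility or geography – surfaces exactly the kind of mismatched profiles described above, such as a risk-tolerant sales team sitting alongside a risk-averse legal department.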

This article only touches on the many intricacies of probing and identifying the psychology of risk among employees and other stakeholders.

Just becoming aware of the importance of personal perceptions of risk – including the biases and heuristics that undergird those perceptions – constitutes a key step in building an organization that is adaptive, resilient and empathetic.

Cognitive biases and heuristics in security

Countless cognitive biases and heuristics exist.

The following appear frequently, or have significant consequences, in the world of risk management:

Automation bias: the tendency to defer to information produced by automation or technology when making a decision.

The authors frequently see this occurring with ChatGPT, when individuals rely on false information generated by that platform.

Decision fatigue: people will often ignore alarms, security warnings and other alerts if they are bombarded by them.

This is problematic when it afflicts patrol staff and SOC personnel, among others.
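One common operational countermeasure – sketched here as a hypothetical illustration, not a technique the article prescribes – is to deduplicate repeated alerts so operators see a given alert type at most once per cooldown window:

```python
import time

# Hypothetical sketch of alert deduplication for a SOC pipeline, a common
# mitigation for alarm fatigue.
COOLDOWN_SECONDS = 300.0
_last_seen: dict[str, float] = {}

def should_notify(alert_key: str, now: float | None = None) -> bool:
    """Notify only if this alert type hasn't fired within the cooldown.

    Every occurrence (even a suppressed one) refreshes the window, so a
    continuous alert storm stays quiet until it genuinely pauses.
    """
    now = time.time() if now is None else now
    last = _last_seen.get(alert_key)
    _last_seen[alert_key] = now
    return last is None or (now - last) >= COOLDOWN_SECONDS

# Three identical door-forced alerts in quick succession:
print(should_notify("door-forced:lobby", now=0.0))    # True  -> operator notified
print(should_notify("door-forced:lobby", now=60.0))   # False -> suppressed
print(should_notify("door-forced:lobby", now=400.0))  # True  -> cooldown elapsed
```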

Dunning-Kruger effect: overestimating one’s own knowledge, ability, competence or judgment.

“No corporate spy on LinkedIn will be able to coax any valuable information from me.”

Herd mentality: going along with the crowd. “No one else is leaving the building during this fire alarm, so I won’t either.”

The authors are aware of individuals who try to hide in their office during an alarm, assuming it will be a false alarm.

Hindsight bias: the mistaken assumption that we understand the past and will be able to apply it to future scenarios.

Much was learned from the 2020 Covid-19 pandemic and ensuing shutdown, but the next global crisis will reveal lessons that weren’t learned from 2020, as well as a host of new issues.

Outcome bias: judging a decision based on the result rather than the quality of the decision making.

This is a common trap for security professionals.

For example, a security manager might install a comprehensive security system in a facility, only for staff to routinely bypass it by bringing outsiders into the building without incident.

Because nothing bad has happened, outcome bias suggests that developing and installing the new system was sufficient, even without policies restricting after-hours guest access.

Pareidolia: finding meaning or patterns in random data where none exist.

Finding correlations that aren’t there can derail investigations, for example.

In the short run, AI might exacerbate that tendency as well, especially if queried about possible links among disparate data.

Plan-continuation bias: failure to recognize that a given plan, such as a business continuity plan, doesn’t apply or no longer applies.

Many companies failed to realize that, a mere few weeks into the Covid-19 pandemic, their business continuity plans had reached the end of their useful lives.

Sunk cost fallacy: continuing to invest resources in a losing cause, even when it’s clear you would be better off cutting ties, simply because you have already invested significant resources.

The authors frequently see this phenomenon in security managers who have invested time, manpower and money in initiatives and equipment, and continue to do so even when they don’t get the results they desire.

About the authors

Dr Gavriel Schneider is CEO of the Risk 2 Solution group of companies, which focus on delivering innovative and cutting-edge solutions in the risk, intelligence, safety, security, medical and emergency response sectors.

He is also the author of the book Can I See Your Hands: A Guide to Situational Awareness, Personal Risk Management, Resilience and Security (Universal Publishers, 2017), as well as a forthcoming title on Presilience, which sets forth a new proactive and agile approach to risk management.

Michael Gips, JD, is a security consultant, attorney, business executive and writer.

He has published more than 1,000 articles on security topics including risk management, business continuity, technological advances, executive protection, loss prevention, negligent security and intelligence, among dozens of others.

He is the Principal of Global Insights in Professional Security, where he advises corporate security departments, security providers and startups.

This article was originally published in the September edition of Security Journal Americas.
