When a criminal case is closed or an unexpected incident enters public discussion, one sentence is often repeated:
“We never thought they were capable of this.”
This response is usually labelled as shock. Psychologically, however, it points to something more precise: a failure of risk perception. In many situations, danger was neither invisible nor absent; people were simply looking for it in the wrong place.
Human cognition is not built for constant, detailed risk analysis. To function under uncertainty, the mind relies on cognitive shortcuts—known as heuristics—that allow for rapid judgement (Tversky & Kahneman, 1974; Kahneman, 2011). In everyday life, this system is highly efficient. It reduces cognitive load, supports routine decision-making, and enables people to navigate complex environments without becoming mentally overwhelmed.
The problem arises when these same shortcuts are applied to situations involving potential harm.
When Efficiency Distorts Judgement
Heuristics simplify information. Rather than evaluating each situation from the beginning, the brain draws on familiarity, prior experience and emotional cues. While adaptive in most circumstances, this process can quietly distort risk assessment—particularly in social and relational contexts.
These distortions are known as cognitive biases. They are not conscious errors, but predictable patterns of thinking shaped by the brain’s need for speed, coherence and psychological comfort (APA, 2022).
One of the most consistently identified biases in crime psychology is familiarity bias.
When Knowing Feels Like Safety
Familiarity bias refers to the tendency to associate closeness with harmlessness. The belief that “someone I know wouldn’t hurt me” feels reassuring, yet research repeatedly challenges it. International crime statistics show that a substantial proportion of both violent and non-violent offences are committed by individuals known to the victim—partners, colleagues, relatives or acquaintances (UNODC, 2023).
In everyday life, familiarity bias often operates quietly:
- intrusive behaviour at work is excused because the person is “long-standing”;
- discomfort in close relationships is dismissed because “they mean well”;
- boundary violations are tolerated because they occur in trusted environments.
Familiarity lowers perceived threat, not actual risk.
“It Won’t Happen To Me”
Alongside familiarity bias operates comparative optimism bias—the tendency to believe that negative events are less likely to affect oneself than others (Sharot, 2011).
Psychologically, this bias reduces anxiety and supports emotional stability. At the same time, it leads individuals to underestimate personal vulnerability.
Optimism bias frequently appears in responses to fraud warnings, manipulation or ethical misconduct. Risks are acknowledged in theory, yet mentally assigned to other people, other situations, or unlikely cases.
In practice, this bias narrows attention precisely when broader awareness is required.
When Irregular Becomes Ordinary
A third mechanism, normality bias, plays a central role in how risk fades from awareness over time. It refers to the tendency to interpret gradually developing irregularities as acceptable—closely related to what Vaughan (1996) described as the normalisation of deviance.
Subtle warning signs—mild intimidation, repeated discomfort, ethical shortcuts or minor boundary crossings—rarely appear dramatic. When they do not lead to immediate consequences, the mind adapts. What once felt wrong becomes familiar.
Instead of reassessing the situation, expectations are quietly adjusted.
Across familiarity bias, optimism bias and normality bias, the common function is not denial but psychological accommodation. Risk is not removed; it is made tolerable.
Noticing The Signal, Losing Its Meaning
In many real-world situations, early warning signs are noticed.
A conversation feels off.
A request crosses a boundary.
A behaviour creates unease without a clear explanation.
The breakdown occurs not in perception, but in interpretation.
The mind quickly generates explanations:
“I’m overthinking.”
“There was no bad intention.”
“It’s not serious enough to matter.”
These narratives reduce discomfort, but they do not improve safety. Research in risk psychology shows that danger rarely reveals itself through isolated incidents; it becomes visible through patterns over time (Vaughan, 1996).
Awareness, therefore, is not about constant suspicion, but about resisting premature closure.
Practical Psychological Skills For Everyday Risk Awareness
Psychological approaches to awareness do not encourage hypervigilance. Instead, they focus on small cognitive adjustments that interrupt automatic dismissal.
Allowing discomfort to remain briefly—rather than immediately explaining it away—preserves its informational value. A more useful question than “Is this dangerous?” is often:
“Is this repeating?”
Shifting attention from identity to behaviour further sharpens judgement. Psychologically, trust is not grounded in who someone claims to be, but in how their actions consistently affect boundaries.
A short reflective pause after an initial interpretation can also reduce bias. Asking, “Would I see this differently if it were happening to someone else?” frequently reveals overlooked inconsistencies.
Finally, it is worth questioning the function of explanations themselves. If an explanation reduces anxiety without increasing understanding, it may be serving comfort rather than clarity.
Why Danger Often Feels Elsewhere
People tend to associate danger with the dramatic: strangers, extremes and visible threats. This belief offers a sense of control. Yet psychological and forensic research consistently shows that risk most often develops within familiar environments, ordinary relationships and reasonable-sounding narratives.
Danger rarely announces itself.
It blends in.
The mistake is not failing to notice risk, but assuming it would look different if it were real.
Awareness Is Not Instinct
Risk awareness is often described as intuition. Psychological evidence suggests otherwise. It is a learnable cognitive skill, strengthened through reflection, pattern recognition and tolerance for short-term uncertainty.
The aim is not fear, but accuracy.
“Danger is not hidden.
We keep looking elsewhere.”
References
American Psychological Association. (2022). APA Dictionary of Psychology.
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
Sharot, T. (2011). The optimism bias. Current Biology, 21(23), R941–R945.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
United Nations Office on Drugs and Crime (UNODC). (2023). Global Study on Homicide. United Nations.
Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press.


