Survival as Predictive Excellence
Our survival, as individuals and as a species, hinges profoundly on the quality of our risk judgment and decision-making - at least for those threats and hazards within human control. Our capacity for prediction - the ability to evaluate what is likely to harm or kill us - has evolved gradually, in small, slow-paced increments since our divergence from our primate ancestors.
Over generations, evolutionary pressures led to the development of sophisticated neuro-cognitive threat response systems. Our limbic system and “fight, flight, or freeze” reactions emerged to detect and respond to imminent dangers. While these adaptations served us well as hunter-gatherers, dodging predators and storms, they have limitations in the modern world.
A pivotal distinction between the threats inherent in our modern world and those faced by our ancestors lies in the timeline of the risk. Our forerunners were primarily engaged with 'present self' risks - imminent threats and hazards that demanded immediate attention and action for survival. These included predatory attacks, severe weather conditions, and acute food shortages. By contrast, our modern, post-industrial risk landscape is dominated by 'future self' killers - often invisible and distant threats such as cancer, diabetes, climate change, and pandemics.
These 'future self' risks require skills, methods, and tools that overcome the limitations of our inherited risk Operating System (OS), augmenting our predictive capabilities and improving our ability to lead long, healthy lives.
Technology and Uncertainty Reduction
The modern era of survival and success is inextricably bound to technologies that helped us overcome the shortcomings of our cognitive skills and physical senses, and improve the quality of our forecasts. How far and long will we have to travel? What will the weather be, and what will that far-away terrain look like?
The compass and nautical charts fueled exploration by helping 15th-century sailors navigate uncertain seas. The barometer, together with the telegraph, dramatically improved storm prediction and weather-related decision-making. Each breakthrough allowed us to better anticipate environmental challenges and expand our capabilities.
Today, most of us can’t imagine driving without Google Maps or Waze, going a day without consulting our favorite weather app, or a month without checking our credit score (a prediction of the risk we present to financial institutions). We navigate our daily lives by forecasting, and we frequently rely on predictive technology to see beyond the horizon, literally and figuratively.
This technology allows us to improve the accuracy of our estimations, assumptions, and expectations and enables us to identify errors in real time. Humans have become the dominant species on Earth partly due to our superior ability to interpret and predict the world around us, especially risks.
Yet, on matters of personal risk perception and decision-making, our faculty for discerning threats remains comparable to 16th-century coastal navigators. When estimating hazards, most still depend on limited individual experiences, privileging anecdote over data and instinct over objectivity. We defer to emotions and tribal allegiances, ignoring how often these have proven faulty guides. In short, we still embrace all the same strategies known to end in misfortune when sailing off the map’s edge - and with similar results.
The Modern Risk Map and Compass
So what is the equivalent of maps, compasses and barometers to augment our inherited OS for navigating the modern risk landscape? Interestingly, this predictive technology is also rooted in the Age of Exploration, though not in magnetism or atmospheric pressure. Rather, it lies in mathematics and statistics. Its name is the base rate.
In the early 1660s, John Graunt pioneered demographic statistics with a meticulous analysis of London's weekly records of deaths, known as the Bills of Mortality. He was among the first to categorize causes of death and identify patterns and regularities. His work led to life tables showing survival probabilities at different ages - instrumental for understanding mortality, and the basis of modern life insurance and public health. Graunt recognized seasonal variations in mortality and correlations between diseases and environmental factors. His analysis also enabled early estimates of London's population size and growth, and it is now seen as a seminal moment in statistics and social science.
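The life-table logic itself is simple enough to sketch in a few lines of Python. The counts below are invented for illustration - they are not Graunt's actual figures - but the mechanics are the same: subtract the deaths observed in each age band from the cohort to get the probability of surviving to each age.

```python
# A minimal life-table sketch in the spirit of Graunt's Bills of Mortality.
# All counts are invented for illustration, not Graunt's actual data.

cohort_size = 100  # hypothetical starting cohort

# (start_age, end_age) -> deaths observed within that band
deaths_by_age_band = {
    (0, 6): 36,
    (6, 16): 24,
    (16, 26): 15,
    (26, 36): 9,
    (36, 46): 6,
    (46, 56): 4,
    (56, 66): 3,
    (66, 76): 2,
    (76, 86): 1,
}

def survival_table(cohort, deaths):
    """Return [(age, survivors, survival_probability)] at each band boundary."""
    table = []
    alive = cohort
    for (start, end), d in sorted(deaths.items()):
        alive -= d
        table.append((end, alive, alive / cohort))
    return table

for age, alive, p in survival_table(cohort_size, deaths_by_age_band):
    print(f"P(surviving to age {age:2d}) = {p:.2f}  ({alive} of {cohort_size})")
```

The output is a base rate in its purest form: a probability of survival read directly off population counts rather than off anyone's intuition.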
The true power of these insights for personal risk forecasting, however, emerged nearly two centuries later, during one of the deadliest threats to growing urban populations: the cholera outbreaks. John Snow, an English physician, plotted cases of cholera on a map, marking each near the victim's residence. This allowed him to correlate, visually and statistically, the disease's spread with proximity to the contaminated Broad Street water pump.
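At its core, Snow's method amounts to attributing each case to its nearest water source and counting. The toy reconstruction below uses made-up pump names (other than Broad Street) and invented coordinates, not Snow's actual map data, but it shows how a cluster around one pump falls out of the tally:

```python
import math

# Toy reconstruction of Snow's analysis: assign each cholera case to the
# nearest pump, then count cases per pump. Coordinates and the non-Broad
# Street pump names are invented for illustration.
pumps = {
    "Broad Street": (0.0, 0.0),
    "Pump B": (5.0, 4.0),
    "Pump C": (-4.0, 6.0),
}

cases = [
    (0.4, -0.2), (0.9, 0.5), (-0.3, 0.8), (4.6, 3.9),
    (0.2, 0.1), (-0.5, -0.4), (1.1, -0.9), (-3.8, 5.5),
]

def nearest_pump(case, pumps):
    """Name of the pump closest to a case's residence."""
    return min(pumps, key=lambda name: math.dist(case, pumps[name]))

counts = {name: 0 for name in pumps}
for case in cases:
    counts[nearest_pump(case, pumps)] += 1

for name, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {n} cases")
```

With these invented points, the tally concentrates around the Broad Street pump - the same kind of spatial signal that led Snow to the contaminated source.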
Snow's work pioneered epidemiology through surveys, methodical data collection, and analysis, establishing techniques that still save lives today. The effects of this statistical technology on the quality of risk judgment - and on our ability to thwart hazards - are difficult to overstate.
As mortality statistics for London evolved into national actuarial tables, they allowed generations to plan their futures based on probability rather than superstition. Incident and morbidity data informed pivotal public health policies, like sanitation infrastructure regulation, that decreased infectious diseases, workplace injuries, and much more. This enabled cities to flourish during the Industrial Revolution. Base rates also laid the foundation for evidence-based medicine, and clinical protocols that reduced surgical complications.
Overcoming Risk (Statistical) Innumeracy
I previously discussed the common yet misguided reliance on intuition and gut feelings regarding risk assessment and decision-making.
A common belief is that modernity has diminished our natural survival instincts, and that restoring this "primitive wisdom" would keep us safer. But the data reveals a different story. In the pre-modern world, life expectancy plateaued around 30 years everywhere. As the Enlightenment dawned and industrialization and modernization spread to countries worldwide, lifespans increased at an unprecedented rate.
Measured in improved longevity and reduced mortality (including violent death), statistical base rates - and the prioritizing of data-driven risk discernment over intuition or anecdote - have driven greater advances in human survival in just over 150 years than eons of evolution did. Human mortality has decreased so substantially that the difference between hunter-gatherers and today’s lowest-mortality populations is greater than the difference between hunter-gatherers and wild chimpanzees.
Like pioneer sailors, we face challenges adopting new technologies that upend conventional thinking. In subsequent articles, we'll explore the barriers to embracing probabilistic risk thinking - from judgment biases to lack of useful data. We'll also provide guidance on recognizing valuable risk data and intelligence, and how to improve our safe-esteem and personal resilience in the face of extreme uncertainty.
We stand at the dawn of a new era in which individual and group risks are increasingly complex and distinct from those that shaped our survival instincts, and technology can radically enhance safety and survival if we dare update our perspective. While the journey requires letting go of comfortable illusions, the rewards beckon - safer families, effective public policies, and extended longevity. I hope you'll join us on this voyage of enhanced risk understanding.