As I write this, much remains uncertain about Hurricane Florence’s landfall in the Carolinas — the exact place the storm will strike, the strength of the winds, how much flooding will occur both along the coast and inland.

But one thing is certain: When the storm finally does arrive, thousands of residents will find themselves unprepared. Many will find their homes flooded, only to discover they don’t carry flood insurance. More will be facing days (or weeks) without electrical power — and then realize that they failed to gather sufficient supplies to endure the post-storm recovery period.

Some will have chosen not to evacuate despite warnings to do so, only to find themselves trapped in their houses, praying that the structures will survive the wind and storm surge. Some may needlessly and tragically lose their lives because of these mistakes.

Lack of preparation helps to explain why the material losses we have experienced after recent disasters have been as severe as they were, even when people have been forewarned. And lack of preparation, the research shows, is caused by cognitive biases that lead people to underplay warnings and make poor decisions, even when they have the information they need.

The pattern occurs again and again. When Hurricane Sandy hit New York and the Mid-Atlantic states in 2012, for example, 40 people drowned because they failed to heed warnings to evacuate from flood-prone coastal areas. Yet the storm had been accurately forecast, and, what's more, people believed the forecasts.

A survey conducted in advance of the storm found that not only were residents in the area acutely aware of the storm threat, many believed it would be even worse than it was. One day before the storm arrived, for example, New Jersey residents believed there was an 80 percent chance they would experience hurricane-force winds from the storm — odds far higher than the actual risk they faced, according to estimates at the time provided by the National Hurricane Center.

Yet preparations for the storm were comparatively limited: Only 20 percent of residents surveyed indicated that they had a preparedness plan in place. What went wrong? In this case the cognitive bias of excessive optimism kicked in: Residents knew all too well that a storm was at their doorstep and that many people would be affected — they just thought it wouldn’t affect them.

The bias of herd thinking compounded the problem. Looking around and seeing that few others were making preparations, residents felt no social pressure to do more.

In addition to over-optimism and a herd mentality, several other psychological biases undermine preparation for dangerous natural events. Consider myopia. Sound preparation for disasters requires us to make short-term costly investments (buying insurance or evacuating, for instance) to stave off a potential future loss. But most of us tend to be shortsighted, focusing on the immediate cost or inconvenience of preemptive action rather than the more distant, abstract penalty for failing to act. That leads us to conclude that preparedness is something that can be put off.

Amnesia is also evident in people’s reactions to news of a storm heading their way. Even when we have been through disaster before, we tend to forget what it felt like the last time — the discomfort of being without power for days, the challenges of repairs. While we may remember the bare facts of the event, emotions are what tends to drive action, and those memories fade the fastest. Examples of this type of forgetting are evident in many areas of life. After the financial meltdown of 2008-2009, as after other crises, one heard calls to curb excessive risk taking on Wall Street, to minimize the possibility of a recurrence. But after the recovery investors were right back at it; they had a hard time fully reimagining the downturn.

Simple reminders do little to help. Many cities that have experienced deadly disasters — including Galveston, Texas, site of the deadliest natural disaster in U.S. history — have monuments to remind residents of these events. But they evidently do little to instill the horror of living through such an event and therefore do little to inspire preparation.

Inertia and simplification are also enemies of sound decision-making. When we are unsure of what to do in the face of an incoming storm, we tend to stick to the status quo: doing nothing. If we are unsure just when to evacuate, we tend not to evacuate at all. Additionally, we tend to simplify our courses of action, selectively attending to a subset of relevant factors when making choices involving risk. When preparing for a hurricane, many things may need doing: arranging for lodging in the event an evacuation is ordered, securing water and supplies for 72 hours, filling cars with gas, locating alternative power supplies. In the face of such complexity we may undertake one or two actions and consider the job done.

In Hurricane Sandy, for example, 90 percent of residents secured supplies, but typically only enough to get them through a single day without power.

It may be discouraging to hear how our minds work to defeat us. (To be sure, there are many reasons beyond psychology that people fail to act. They can lack the financial means to do so, or be limited by age or disability; that means that the definition of preparedness ought to include checking on one's neighbors.) But there is a silver lining: knowing why we under-prepare is the first step to knowing how to avoid these mistakes. The key is to accept the fact that the biases I've mentioned are a part of our cognitive DNA.

The key to better preparedness is thus not to eliminate these biases, a hopeless task, but rather to design preparedness measures that anticipate them. Consider the bias toward simplification: the tendency for people to consider themselves prepared after taking one or two actions. The fix? Officials shouldn't distribute long, generic checklists of preparedness measures, which, the research suggests, will lead people to pick a couple (often the easiest rather than the most important). Rather, ordered lists should be issued: Tell people, "If you are going to do only one thing to prepare for a storm, it should be this. If you are going to do three, you ought to . . ." To fight inertia, work hard to persuade people to develop precise preparedness plans that include a shopping list of supplies and exact plans for when and where to evacuate, should that be necessary.

Recent years have seen tremendous advances in our ability to predict natural disasters such as hurricanes, floods and heat waves — extreme events that may become increasingly common as the climate changes. But these advances have done little to reduce the damaging cost of these events. Reducing those costs will require advances of a different kind: a better understanding of the psychological biases that shape how people make decisions, and better preparedness systems that anticipate and work around these biases.

Robert Meyer

Robert J. Meyer is the Ecker/MetLife professor of marketing and co-director of the Wharton Center for Risk Management and Decision Processes, at the University of Pennsylvania. He is co-author of “The Ostrich Paradox: Why We Underprepare for Disasters,” with Howard Kunreuther.