Predicting Wildfires Could Save Lives. So Why Are We So Bad At It?

Last year in Arizona, 19 firefighters got trapped in an unpredictably fast-moving wildfire. All of them died. It was the highest firefighter death toll for a single fire since 1933. The same year, the Sierra Nevada saw its largest fire on record, which ravaged 402 square miles, and Colorado suffered the most destructive wildfire in its history, destroying nearly 500 homes.

According to the Union of Concerned Scientists, western U.S. states saw an average of about 140 large wildfires a year in the 1980s, compared with roughly 250 a year between 2000 and 2012. In those states, fire season has also grown, extending from five months in the 1970s to seven or more months today (California's drought has been so bad this year that some experts say its fire season never actually ended).

Why are there more fires in more places? The simple answer is us. First, until the 1980s, we didn’t know that fire can be good for ecosystems, so firefighters operated under a policy of fire suppression. By preventing fires, though, they let underbrush build up in forests, fueling bigger fires later on. Then, there’s climate change: The planet’s increasing average temperatures are responsible for drier, longer, and significantly more extreme fire seasons than ever before. Finally, our cities are spreading, pushing their edges (and suburbs) into fire-prone areas.

If it seems like mega-fires are occurring more frequently, it’s because they are.

If we are to deal with these changes and avoid tragedy, experts first need to fully understand wildfires, events we actually know little about. Starting this year, engineers from the National Institute of Standards and Technology (NIST) and the U.S. Forest Service have been carrying out controlled burns and using the data to build computer models that could improve fire predictions. NIST engineers are also testing new building materials for roof tiles and house frames that can withstand wildfire conditions and make for safer homes.

Even if scientists can figure out how to better predict and mitigate fires, people will still be in harm's way. A team at the University of California, Berkeley, recently proposed creating fire-detecting and -tracking satellites to provide advance warning. But they would likely be of little help in the most dangerous situations: Tiny blazes can turn into big burns too fast for the process to work.

Instead, we need a system that alerts people to danger before a fire ever starts. Scientists at NIST recently developed a scale, called the Wildland Urban Interface Hazard Scale, for labeling the areas most at risk of wildfires. It would predict the severity and destructiveness of a fire in a particular area, and it could also inform building codes for new construction in fire zones, as well as insurance costs. You pay premiums for deciding to live in the path of quakes and hurricanes, after all, so why not fires?

The method is tried and true. When general warnings won’t do the job, why not use the market to discourage people from doing stupid things like building in fire zones? Without a hazard scale in place, we’re just crossing our fingers and moving into regions that are more and more likely to burn.

This article originally appeared in the May 2014 issue of Popular Science.