This has been a big year for wildfires so far and an especially impactful year for some communities that have been devastated by fire. Several of us were keeping an eye on the Rough Fire as it looked like it was going to cross the north fork of the Kings River and head to the Teakettle Experimental Forest. Alongside media coverage of big wildfire events, there has been heightened attention to the rising costs of fire suppression and the ever-larger fraction of US Forest Service budgets that suppression is consuming.
The news reports, the Rough Fire, and the focus on suppression costs caused me to take a look at some fire figures for the past decade posted on the National Interagency Fire Center (NIFC) website. So far, 2015 is second only to 2006 in number of acres burned, and the 2015 fire season isn’t over yet. From 2005 to 2015, total acres burned ranged from 2.9 million acres in 2010 to 9.03 million acres in 2006. During this 10-year period we averaged 58,195 fires per year, and annual suppression spending ranged between $984 million and $1.92 billion. In the table below I have pulled the yearly fire data from the NIFC website and made a couple of calculations.
You might think that there is a relationship between the number of fires and the number of acres burned in a given year. However, the number of fires only accounts for about 12% of the variability in number of acres burned each year. As it turns out, the number of fires 100,000 acres or larger in a given year explains much more of the variability in acres burned each year.
The number of fires 100,000 acres or larger accounts for 81% of the variability in acres burned in a given year from 2005 to 2014. These larger fires account for between 16% and 45% of the acres burned over this time period and only 0.009 to 0.03% of the number of fires. These so-called megafires have accounted for a large fraction of the area burned this year.
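For readers curious what "accounts for X% of the variability" means in practice, it is the R² of a simple linear regression of acres burned on the predictor (number of fires, or number of megafires). Here is a minimal sketch of that calculation; the yearly values below are made-up placeholders for illustration, not the actual NIFC data.

```python
# Sketch: R^2 of a simple least-squares regression of y on x.
# The example arrays are HYPOTHETICAL placeholders, not real NIFC figures.

def r_squared(x, y):
    """Fraction of variability in y explained by a linear fit on x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    syy = sum((yi - mean_y) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # Residual sum of squares around the fitted line
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    return 1 - ss_res / syy

# Hypothetical yearly values: total fires, megafire counts, acres burned (millions)
total_fires = [66552, 96385, 85705, 78979, 78792, 71971, 74126, 67774, 47579, 63312]
megafires   = [6, 12, 9, 5, 4, 2, 8, 14, 7, 10]
acres_mm    = [8.7, 9.9, 9.3, 5.3, 5.9, 3.4, 8.7, 9.3, 4.3, 3.6]

print(f"R^2 (total fires): {r_squared(total_fires, acres_mm):.2f}")
print(f"R^2 (megafires):   {r_squared(megafires, acres_mm):.2f}")
```

An R² of 0.12 means the regression line through the yearly points leaves 88% of the year-to-year variation in acres burned unexplained, while an R² of 0.81 leaves only 19% unexplained, which is why megafire counts are the far better predictor.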
The number of large wildfires in the western US has been increasing since the mid-1980s and is correlated with warmer temperatures and earlier spring snowmelt. The National Research Council estimated that each degree Celsius of additional warming will increase the area burned by 200-400% in parts of the western US. Recent work has argued that we need to reevaluate the way we manage fire and fund wildfire suppression efforts. These ideas are especially salient considering that as the climate warms further, we can expect more large wildfires, which will not only impact communities where wildfires burn, but also degrade air quality in communities that are further away.
With over 23,000 posters and presentations at the 2014 Fall Meeting of the American Geophysical Union, there was a ton of interesting research to learn about. Danelle Laflower and Shuang Liang gave posters in the session entitled Forests under a changing climate: uncertainties, carbon management, and adaptation.
Danelle presented some findings from her investigation into the influence of projected changes in climate on carbon dynamics at Joint Base Lewis McChord in Washington. Her work is part of the SERDP-funded project to investigate the effects of forest management on carbon dynamics.
Danelle used the LANDIS-II model and downscaled climate projections under a business-as-usual emission scenario and a moderate emission scenario to examine how fluxes of carbon would change over the 21st century. She found that the moderate emission scenario and simulations run using climate data from the latter half of the 20th century produced similar carbon fluxes. The study area remained a carbon sink throughout the simulation period, and the gradual decline in uptake over time was driven primarily by forest succession and maturation. However, under the business-as-usual scenario, the amount of carbon taken up by the ecosystem declined much more rapidly, especially toward late-century. This result was primarily driven by increased temperature and decreased precipitation during the summer months causing increased water demand by the trees and decreased water availability for growth.
Net Ecosystem Carbon Balance (NECB) captures carbon gains by the system from photosynthesis and carbon losses from the system from respiration and disturbance. This figure shows the mean and standard error of NECB under the three different climate scenarios through late-century.
Shuang also used the LANDIS-II model and downscaled climate projections under the business-as-usual scenario. She simulated wildfire using recent distributions of wildfire size and frequency. Two of the common, widely distributed species in the Sierra Nevada are ponderosa pine and white fir. Ponderosa pine is more drought- and fire-tolerant than white fir. She found that relative to simulations with late-19th century climate, the amount of ponderosa pine biomass increased under all three projected climate and wildfire scenarios, while the amount of white fir biomass showed a small initial increase and then fell below the baseline scenario. Across the mountain range, this change is indicative of a range expansion for ponderosa pine and a range contraction for white fir. The changes in the distribution of the two species are a result of the combined effects of changing climate and wildfire.
Both Shuang and Danelle are pushing forward on their projects, so stay tuned for more results as their research progresses.
In scientific inquiry we start with a model (conceptual or otherwise) about how we think some aspect of the world works. We use that model to develop a hypothesis. We design an experiment to test the hypothesis and we either find support for it (fail to reject the hypothesis) or we don't (reject the hypothesis). One of the hypotheses that we have been testing in the lab is that when we restore dry, fire-prone forests by restoring fire as a natural process, the remaining carbon (trees are 50% carbon) is less likely to burn up in a wildfire (i.e. more stable) than in an unrestored forest.
The conceptual model that we are working with is that since we have been putting out natural fires in these dry forests for the better part of a century, the structure of the forest has changed (we now have more trees) and the amount of fuel (dead plant material on the forest floor) has increased. Many different studies in different locations have demonstrated that this change in structure and build-up of fuel make these forests more prone to wildfires that kill a large fraction of the trees. Agee and Skinner provide a review of the basic principles of these treatments in their 2005 paper.
When we restore forest structure by thinning trees and reduce fuel build-up through prescribed burning, carbon is removed from the forest (thinning) and emitted to the atmosphere (burning). A number of studies have documented this fact in Arizona, California, and Oregon.
To test the hypothesis about carbon stability we have evaluated field data and measures of fire risk and we have run simulation model experiments in Arizona and California dry forests. We continue to get consistent results - thinning and burning reduce the amount of carbon stored in the forest and reduce the risk of high-severity fires that kill many of the trees - and have failed to reject the hypothesis. That does not mean that we have found a universal truth - it is likely that we would reject this hypothesis if we tested it in a wet forest (see Steve Mitchell's results from coastal Oregon forests for an example).
I bring this up because I gave a presentation on this topic the other day. At the end of my talk I got a number of great questions. Some I could answer with certainty and some required the caveat that it is possible given what we know currently. The questions I couldn't or wouldn't answer were the ones that crossed the line I have personally set as the boundary between data-driven conclusion and opinion. These are typically questions that include some form of "What should we do?".
In his book The Honest Broker, Roger Pielke, Jr writes extensively about the different roles that scientists can play in the policy discussion. Pielke argues that we should aim to be "honest brokers of policy alternatives", meaning that scientists should not seek to limit the range of available options. I teach a graduate course in which we read and discuss Pielke's book, among others, and evaluate the effects that taking any one of these roles can have on an individual scientist's credibility and the outcome of the decision-making process. After numerous readings and discussions, my understanding of the concept continues to evolve. Currently I evaluate whether or not I'll respond to a question by asking myself if my scientific understanding of the system or topic can provide a piece of information useful to the discussion. As an example, following my presentation I received one question regarding treatment prioritization in a time of limited budgets and whether we should prioritize treatment in undeveloped areas or in areas where forests and communities meet. Do I have an opinion on the topic? Sure. Does my standing as a scientist who studies carbon dynamics in fire-prone forests give my opinion more weight than a person who lives in a community with high fire risk? Absolutely not.
There are many things that I enjoy about speaking to groups outside of the scientific community. I like the fact that it pushes me to reduce the field-specific jargon. I like the challenge of assembling my research for a non-specialist audience. I like the questions that push me to think about the science in a new way. But, my favorite part is that it pushes me to evaluate where the line between data-driven conclusions and opinion lies. Much the way that my conceptual model of the forests I work in evolves with more information, my perception of my role as a scientist communicating outside of the scientific community evolves with more experience.