The area burned by wildfire in the Sierra Nevada has increased by 274% over the last 40 years, and the area impacted by stand-replacing fire has also grown. Sierra Nevada forests are important for the provision of clean water and are also part of the state’s climate action plan. As a result, figuring out how to reduce the chances of large, hot fires presents a major challenge.
We know that the current pace and scale of forest treatments to reduce the risk of large, hot fires is inadequate given the scale of the problem, and the area burned by wildfire is projected to increase with ongoing climate change. In a recent study led by Shuang Liang, we set out to determine how the pace of large-scale treatment implementation would alter carbon storage across the Sierra Nevada. We ran simulations under projected climate and wildfire and two management scenarios. Both management scenarios applied thinning and prescribed burning treatments to low- and mid-elevation forests, the forests that have been most affected by fire suppression. In the distributed scenario, we simulated an equal portion of the area treated at each time-step, with full treatment implementation by the end of this century. In the accelerated scenario, we simulated the same treatments over the same area, but scheduled them so they were complete by 2050. We also included a control scenario that assumed no active management for comparison.
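The difference between the two pacing scenarios can be sketched with a minimal calculation of cumulative area treated per time-step. This is purely illustrative, not the study's actual model: the normalized total area, 2020 start year, and 5-year time-step are hypothetical placeholders.

```python
# Illustrative sketch (not the study's simulation model): cumulative fraction
# of the eligible area treated under the two pacing scenarios.
TOTAL_AREA = 1.0       # full eligible area, normalized (hypothetical)
START, STEP = 2020, 5  # hypothetical start year and time-step length (years)

def schedule(end_year):
    """Treat an equal portion of the area each time-step until end_year."""
    years = list(range(START, end_year + 1, STEP))
    per_step = TOTAL_AREA / len(years)
    # Map each year to the cumulative fraction treated by that year.
    return {y: per_step * (i + 1) for i, y in enumerate(years)}

distributed = schedule(2100)   # full implementation by end of century
accelerated = schedule(2050)   # same total area, completed by 2050

# By 2050 the accelerated scenario has treated everything, while the
# distributed scenario has treated only a time-proportional fraction.
print(round(accelerated[2050], 2))  # 1.0
print(round(distributed[2050], 2))  # 0.41
```

The point the sketch makes is simply that the same total treatment footprint, front-loaded, puts far more of the landscape into a treated condition during the decades when wildfire is occurring.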
The area burned was fairly consistent across all three scenarios because we used the same fire size distributions in our simulations (black line in Figure 1). However, under the treatment scenarios the proportion of burned area experiencing stand-replacing fire (severity 4 and 5) decreased substantially. The faster pace of treatment under the accelerated scenario increased the proportion of area burned by surface fire (severity 1 and 2) and decreased the area burned by stand-replacing fire at a much faster rate than the distributed scenario.
Both the accelerated and distributed treatments ended up storing more carbon than the control by 2100 (shown by the difference in Figure 2).
However, what was most striking was how these treatments influenced the carbon balance of Sierra Nevada forests as a percentage of California’s 2020 emissions limit from the Governor’s Climate Action Plan. Initially, total carbon losses are higher in the treatment scenarios, with the accelerated treatment having the largest loss (Figure 3). However, by 2030 carbon loss is similar among all three scenarios, and by 2050 the accelerated scenario has lower emissions than the wildfire emissions under the control.
As we demonstrated in a previous study, changing climate and the increase in area burned have the potential to increase wildfire emissions by 19–101% later this century. The results from this study demonstrate that restoring surface fire to the low- and mid-elevation forests of the Sierra Nevada can reduce the magnitude of future emissions and maintain more carbon stored in these forests.
Over a series of studies we have evaluated the carbon tradeoffs associated with treatments to reduce the risk of large, hot wildfires. In a 2008 study we posited that because large, hot wildfires kill trees and combust more plant material, releasing a considerable amount of carbon to the atmosphere, thinning smaller trees and restoring surface fire would result in a net carbon benefit. The basis for this argument is that while thinning and prescribed burning both remove and release carbon from the ecosystem, the total carbon loss will be lower over time because when wildfire does occur, its effects on the treated forest will be less severe. One of the arguments against this line of thought was presented by John Campbell and colleagues in a 2012 paper. They argued that since we cannot predict exactly where fires will occur, more area will be treated than will burn in a wildfire, and the net effect will be lower overall carbon storage.
Building on our previous work examining the effects of increasingly severe fire weather, we sought to determine if we could use our understanding of where fires are likely to be hottest and kill the most trees to inform the placement of thinning treatments. In a recent study led by Dan Krofcheck, we ran simulations of the same Dinkey Creek watershed in the Sierra Nevada using projected climate data from four different climate models and projected fire weather. We used the simulations from the first Dinkey Creek study to identify the locations on the landscape with the greatest chance of being burned by hot, tree-killing fire.
We then ran simulations where we thinned every location on the landscape that was legally and operationally available, meaning areas that were not protected by law and where the ground was not too steep for thinning equipment to operate. We called this the naïve scenario because it assumes no prior information exists about where best to locate treatments. In reality, that is not how treatments are located; forest managers use many kinds of information to site their treatments. In the optimized scenario, we thinned only areas of the landscape with a higher chance of stand-replacing fire. Importantly, we simulated prescribed fire in all forests where it is ecologically appropriate in both scenarios.
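The two placement rules can be sketched as simple filters over landscape cells. This is a hypothetical illustration, not the study's code: the cell attributes, probability values, and risk threshold are all invented for the example.

```python
# Hypothetical landscape cells: each has an operability flag (not protected,
# not too steep) and a modeled probability of stand-replacing fire.
cells = [
    {"id": 0, "operable": True,  "p_stand_replacing": 0.05},
    {"id": 1, "operable": True,  "p_stand_replacing": 0.40},
    {"id": 2, "operable": False, "p_stand_replacing": 0.55},  # protected/steep
    {"id": 3, "operable": True,  "p_stand_replacing": 0.62},
]

RISK_THRESHOLD = 0.30  # hypothetical cutoff for "high chance" of severe fire

# Naive rule: thin every legally and operationally available cell.
naive = [c["id"] for c in cells if c["operable"]]

# Optimized rule: thin only operable cells above the risk threshold.
optimized = [c["id"] for c in cells if c["operable"]
             and c["p_stand_replacing"] >= RISK_THRESHOLD]

print(naive)      # [0, 1, 3]
print(optimized)  # [1, 3]
```

The optimized list is always a subset of the naïve list, which is why it can achieve a similar severity reduction in the high-risk areas while treating substantially less total area.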
The results show that in terms of reducing the chance of stand-replacing fire, both scenarios performed nearly identically. In areas where the chance of high-severity fire is high, both the naïve and optimized scenarios achieved the same reduction in fire severity compared to the no-management scenario.
Both management scenarios also reduced the variability in carbon loss from the system because there were fewer stand-replacing fires. However, because we only thinned the highest risk places in the optimized scenario, we ended up thinning approximately 60% less area than in the naïve scenario. This resulted in much lower carbon loss from the system at the beginning of the simulations and an overall lower total carbon loss than both the no-management and naïve scenarios.
This research demonstrates that if we inform our treatment locations based on where we have the highest risk of stand-replacing fire, we can treat much less of the landscape with thinning and reduce both carbon loss from treatment and carbon loss from wildfire.
Warming and drying climate, bark beetle outbreaks, and wildfire all pose challenges to western US conifer forests. Widespread bark beetle outbreaks have been impacting large swaths of western North America. In the Lake Tahoe Basin, there is a strong link between severe drought and beetle outbreaks. Beetle outbreaks can also be affected by the density of their hosts because many beetle species attack specific tree genera; mountain pine beetle, for example, uses only pines as hosts.
In drier, more fire-prone forests, fire-exclusion has increased tree density and in some cases host density. This change in structure has also increased the risk of stand-replacing fire. Since management activities to reduce the risk of stand-replacing fire typically involve reducing tree density and restoring surface fire, we sought to determine if these treatments might also reduce beetle outbreak potential.
In a study of Lake Tahoe Basin forests, led by Rob Scheller, we used the LANDIS-II simulation model to determine if beetle outbreaks would increase with climate change and if management activities to reduce wildfire hazard would reduce the impacts of beetle outbreaks. We hypothesized that climate change would increase beetle outbreaks and reduce carbon uptake by the forest and that management activities would reduce beetle-caused mortality and increase carbon uptake by the forest.
We found that climate change without beetles caused a reduction in carbon stored in trees and that beetles without climate change caused a reduction in carbon stored in trees. However, the combined effects of climate change and bark beetles caused a large reduction in the amount of carbon stored in trees across the study area (see Figure 1).
White fir and Jeffrey pine are two of the more common species in these forests. Previous research has shown that prior to fire exclusion, Jeffrey pine was more common than white fir, and with fire exclusion, white fir has become more common. In areas around communities, managers often focus harvesting efforts on white fir to reduce fire hazard. When we looked at these individual species in the areas treated around communities, we found that without management, bark beetles caused both species to decline (see Figure 2). When we simulated management without bark beetles, white fir decreased and Jeffrey pine increased. The combined effects of beetles and management greatly reduced the amount of white fir and resulted in a similar amount of Jeffrey pine as the simulation that included only beetles.
Our management simulations did not include treating all forests within the Basin; treatments focused on communities and roadways. Because climate projections for the Basin get warmer and drier later in the 21st century, we looked at the chance that these forests become a source of carbon to the atmosphere with continued climate change and bark beetles. We found that the chance of these forests becoming a carbon source increases later this century with both beetles alone and beetles and management combined. Reducing the impacts of beetles may require more thinning to further reduce host density. These results show that we have to consider the full suite of disturbance agents when determining the best path forward for managing forests under climate change.