As technology and insight advance in energy production and use, remote sensing, and climate monitoring, numerous large, high-resolution, high-frequency datasets are being produced. As a world leader in agricultural and environmental sciences, UC Davis is uniquely positioned to significantly enhance understanding of environmental and climate systems and to develop improvements to them.
Deeper understanding in these areas propels the development of solutions to problems of great importance to the people of California and provides evidence to evaluate and inform environmental policy.
Rapid Quantification of Forest Fuels for Wildfire Risk Mitigation Using Computer Vision
Principal Investigator: Andrew Latimer, professor, Plant Sciences
Wildfires in northern California and throughout the western U.S. are burning at high severity across large contiguous areas. To reduce the risk of future high-severity wildfires, forest managers often perform “fuel reduction treatments,” including mechanical thinning of encroaching vegetation and application of low-severity prescribed fire. Accurate, high-resolution maps of fuel loads are needed to help prioritize the areas that receive these treatments and to ensure effective resource allocation.
Latimer and his team, working with other UC Davis labs and the U.S. Forest Service, are leveraging nascent computer vision technology, extensive forest vegetation datasets, and unmanned aerial vehicle (UAV) imagery to enable high-resolution mapping of forest fuels across broad landscapes. This collaboration will ensure the findings are directly applicable both to basic wildfire research and to land management aimed at mitigating wildfire risk.
A Model-Data Fusion Approach to Quantify and Predict the Fate of Terrestrial Carbon in California
Principal Investigator: Troy Magney, assistant professor, Plant Sciences
The state of California aims to sequester 20 million metric tons of CO2 equivalent in soil and vegetation sinks by 2030. Achieving this will require statistically robust assessment tools to verify best carbon management practices and to analyze the vulnerabilities of different carbon stocks. Unfortunately, the terrestrial carbon cycle is the least constrained component of the global carbon budget, with uncertainties stemming from a lack of ground-based observations and the scale mismatch of available satellite products.
Magney and his team are working to address this challenge using model-data fusion and machine learning approaches to quantify plant carbon allocation, stocks, residence times, and carbon-use efficiencies. Ultimately, the tool they develop will be used to predict the vulnerability and resilience of California’s carbon cycle under future climate change scenarios.
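To give a flavor of what model-data fusion means in this setting, the sketch below fits the residence time of a single, highly simplified carbon pool to noisy observations by grid search. This is purely an illustration under assumed toy dynamics (dC/dt = NPP − C/τ), not the team's actual model or data; all function names, parameter values, and "observations" here are hypothetical.

```python
import numpy as np

def simulate_pool(npp, tau, c0, dt=1.0):
    """Forward-simulate a one-pool carbon model: dC/dt = NPP - C/tau."""
    c = c0
    stocks = []
    for u in npp:
        c = c + dt * (u - c / tau)  # explicit Euler step
        stocks.append(c)
    return np.array(stocks)

def fit_residence_time(npp, observed, c0, taus):
    """Grid-search the residence time that best matches observed stocks."""
    errors = [np.mean((simulate_pool(npp, t, c0) - observed) ** 2) for t in taus]
    return taus[int(np.argmin(errors))]

# Synthetic example: generate "observations" with tau = 8 years, then recover it.
rng = np.random.default_rng(0)
npp = np.full(50, 5.0)                           # constant carbon input
truth = simulate_pool(npp, tau=8.0, c0=20.0)
obs = truth + rng.normal(0, 0.1, truth.shape)    # add observation noise
taus = np.linspace(2.0, 20.0, 181)               # candidate residence times
best = fit_residence_time(npp, obs, c0=20.0, taus=taus)
print(best)  # close to the true value of 8.0
```

Real model-data fusion systems replace the grid search with formal Bayesian or variational estimation and fuse many data streams, but the core idea is the same: constrain uncertain carbon-cycle parameters, such as residence times, with observations.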
Bright Spots and Blind Spots: Can Big Data Improve Water Research in Latin America?
Principal Investigator: Samuel Sandoval Solis, associate professor, Land, Air and Water Resources
The exponential growth of scientific literature is beyond what any one researcher can absorb, but it offers the opportunity to turn the analytical tools of science on science itself in an emerging discipline, the science of science, which studies the mechanisms underlying how science happens. Like many fields, the science of science benefits from the increasing availability of large-scale datasets coupled with breakthroughs in data science. Yet significant data-driven findings remain inaccessible to stakeholders engaging with urgent societal issues.
Sandoval Solis and his team are addressing the discontinuity between science of science findings and the societal need for data-driven decision-making. Their research showcases how interpretable data science can significantly enhance our understanding of water resources research to tackle the tremendous challenge of guaranteeing safe and accessible water in a changing world.
Flood Risk and the Insurance Coverage Gap
Principal Investigator: Michael Springborn, associate professor, Environmental Science and Policy
Floods are among the costliest natural disasters, with damages exceeding $40 billion per year globally. California is one of the most flood-prone states, with an estimated $484 billion in building assets at risk. Despite this substantial risk, demand for flood insurance is alarmingly low, with only $66 billion insured in California.
Springborn and ARE Ph.D. student Joakim Weill are exploring this worrisome insurance coverage gap, working to quantify two mechanisms that might explain it: low willingness to pay for insurance due to limited financial resources, and lack of awareness of flood risk. This research will assess the impact of FEMA’s previous flood map updates on insurance purchase decisions, and will provide background information for an updated set of USGS ARkStorm flooding scenarios to inform policymakers and residents.
Development of a Machine Learning Algorithm to Improve Computational Efficiency of Atmospheric Chemistry
Principal Investigator: Anthony Wexler, distinguished professor, Mechanical and Aerospace Engineering, Civil and Environmental Engineering, Land, Air and Water Resources
Air quality and climate models are structured into modules, where each module simulates one or a few physical or chemical processes. Usually, one or two of these modules consume the vast majority of the computer time used by the model. Efforts to use machine learning approaches to speed up these modules have not been sufficiently accurate to replace the original modules. One source of this error is a failure to conserve physical and chemical properties that must be conserved.
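The conservation problem can be made concrete with a toy example, offered here purely as an illustration and not as the team's method: an unconstrained ML surrogate for a chemistry module may slightly create or destroy mass, and one simple repair is to rescale its outputs so the total matches the input. All names and numbers below are hypothetical.

```python
import numpy as np

def enforce_mass_conservation(inputs, raw_predictions):
    """Rescale surrogate-model outputs so total mass matches the input.

    A chemistry module transforms species concentrations but must not
    create or destroy mass. An unconstrained ML surrogate's outputs can
    violate this, so we renormalize them to the input's total.
    """
    total_in = inputs.sum()
    total_out = raw_predictions.sum()
    if total_out <= 0:
        raise ValueError("surrogate produced non-positive total mass")
    return raw_predictions * (total_in / total_out)

# Hypothetical case: the input species concentrations sum to 10.0, but the
# surrogate's outputs sum to 9.8, silently losing 2% of the mass.
inputs = np.array([4.0, 3.0, 2.0, 1.0])   # concentrations entering the module
raw = np.array([3.5, 3.1, 2.2, 1.0])      # surrogate output (sums to 9.8)
corrected = enforce_mass_conservation(inputs, raw)
print(corrected.sum())  # matches the input total of 10.0
```

Post-hoc rescaling is only the crudest fix; building conservation constraints into the mathematical framework of the surrogate itself, as this project proposes, avoids distorting the individual species in the first place.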
Wexler and his team are using a mathematical framework and machine learning approaches to dramatically improve the computational performance of modules within CMAQ, the model the EPA uses to simulate air quality in U.S. cities. If this work is successful, it will be applied to similar modules in climate models such as CESM and E3SM.