February 5, 2020
https://www.sciencedaily.com/releases/2020/02/200205132345.htm
Extreme weather conditions can tax urban drainage systems to the max
During a typical Canadian winter, snow accumulation and melt -- combined with sudden rainfalls -- can lead to bottlenecks in storm drains that can cause flooding.
With that in mind, researchers at UBC's Okanagan campus have been examining urban stormwater drainage systems, and they have concerns about the resilience of many of them. A recently published paper from the School of Engineering says existing design methods for urban drainage systems don't go far enough to withstand possible catastrophic storms, or even unpredictable failures during a moderate storm.
"As engineers, we run simulations of possible catastrophic events, and current systems often do not fare well," says doctoral student Saeed Mohammadiun. "We are seeing sources of overloading such as structural failures, severe rainfalls or abrupt snowmelt stressing these systems."
Add any extreme situation, including quick snowmelt or a heavy and sudden rainfall, and Mohammadiun says many systems aren't built to handle these worst-case scenarios. He has conducted several case studies of drainage systems in major urban areas around the world and has determined that many current urban standards, designed for a 10-, 50- or even 100-year storm scenario, are not meeting the increasing demands of climate change or the intrinsic failure risk of network elements.
"Conventional, reliability-based design methods only provide acceptable performance under expected conditions of loading," he says. "Depending on the system, if something breaks down or there is a blockage, it can result in a failure and possible flooding."
According to Mohammadiun, the resiliency of a system depends not just on the load it can handle, but also on its design and build. Many designs do not take into account the effects of climate change or unexpected weather conditions. To establish an efficient, resilient system, Mohammadiun says it is important to consider various sources of uncertainty, such as rainfall characteristics, heavy snowfalls followed by a quick melt, and different possible malfunction scenarios, along with budget constraints.
"Building or improving the resilience of urban stormwater drainage systems is crucial to ensuring these systems are protected against failure as much as possible, or can quickly recover from a potential failure," he adds. "This resilient capacity will provide urban drainage systems with the desired adaptability to a wide range of unexpected failures during their service life."
The research points to several measures municipalities can take to address the issue proactively, such as building bypass lines and applying an appropriate combination of relief tunnels, storage units and other distributed hydraulic structures to augment drainage system capacity in a resilient manner.
With the recent heavy snowfalls across Canada, Mohammadiun says the silver lining when it comes to drainage is that snow takes time to melt, whereas heavy rainfall puts immediate stress on these systems. From an engineering point of view, though, it is necessary to consider both acute and chronic conditions. Not surprisingly, the research shows that urban drainage and stormwater systems built or modified to be more resilient handle extreme weather events more effectively and efficiently than conventional designs.
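The reliability-versus-resilience point can be illustrated with a toy Monte Carlo sketch. All capacities, loads and failure rates below are invented for illustration and do not come from the study; the only idea taken from the article is that a component blockage can cause flooding even in a system sized for a design storm, and that a bypass line adds headroom.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 100_000

# Storm inflow (arbitrary units): lognormal, occasionally far above typical.
load = rng.lognormal(mean=2.0, sigma=0.5, size=n_trials)

main_capacity = 12.0    # hypothetical design capacity of the main line
bypass_capacity = 4.0   # hypothetical extra capacity from a bypass line

# In 5% of storms the main line is partly blocked, halving its capacity.
blocked = rng.random(n_trials) < 0.05
cap_plain = np.where(blocked, 0.5 * main_capacity, main_capacity)
cap_bypass = cap_plain + bypass_capacity

# Flood probability: fraction of simulated storms whose load exceeds capacity.
p_flood_plain = float(np.mean(load > cap_plain))
p_flood_bypass = float(np.mean(load > cap_bypass))
```

Under these made-up numbers, the bypass line lowers the simulated flood probability, which is the kind of resilience gain the researchers describe.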
Weather
2020
February 4, 2020
https://www.sciencedaily.com/releases/2020/02/200204112518.htm
Deep learning accurately forecasts heat waves, cold spells
Rice University engineers have created a deep learning computer system that taught itself to accurately predict extreme weather events, like heat waves, up to five days in advance using minimal information about current weather conditions.
Ironically, Rice's self-learning "capsule neural network" uses an analog method of weather forecasting that computers made obsolete in the 1950s. During training, it examines hundreds of pairs of maps. Each map shows surface temperatures and air pressures at an altitude of five kilometers, and each pair shows those conditions several days apart. The training includes scenarios that produced extreme weather: extended hot and cold spells that can lead to deadly heat waves and winter storms. Once trained, the system was able to examine maps it had not previously seen and make five-day forecasts of extreme weather with 85% accuracy.
With further development, the system could serve as an early warning system for weather forecasters, and as a tool for learning more about the atmospheric conditions that lead to extreme weather, said Rice's Pedram Hassanzadeh, co-author of a study about the system published online this week in a journal of the American Geophysical Union.
The accuracy of day-to-day weather forecasts has improved steadily since the advent of computer-based numerical weather prediction (NWP) in the 1950s. But even with improved numerical models of the atmosphere and more powerful computers, NWP cannot reliably predict extreme events like the deadly heat waves in France in 2003 and in Russia in 2010.
"It may be that we need faster supercomputers to solve the governing equations of the numerical weather prediction models at higher resolutions," said Hassanzadeh, an assistant professor of mechanical engineering and of Earth, environmental and planetary sciences at Rice. "But because we don't fully understand the physics and precursor conditions of extreme-causing weather patterns, it's also possible that the equations aren't fully accurate, and they won't produce better forecasts, no matter how much computing power we put in."
In late 2017, Hassanzadeh and study co-authors Ashesh Chattopadhyay and Ebrahim Nabizadeh, both graduate students, decided to take a different approach.
"When you get these heat waves or cold spells, if you look at the weather map, you are often going to see some weird behavior in the jet stream, abnormal things like large waves or a big high-pressure system that is not moving at all," Hassanzadeh said. "It seemed like this was a pattern recognition problem. So we decided to try to reformulate extreme weather forecasting as a pattern-recognition problem rather than a numerical problem."
Deep learning is a form of artificial intelligence in which computers are "trained" to make humanlike decisions without being explicitly programmed for them. The mainstay of deep learning, the convolutional neural network, excels at pattern recognition and is the key technology behind self-driving cars, facial recognition, speech transcription and dozens of other advances.
"We decided to train our model by showing it a lot of pressure patterns in the five kilometers above the Earth, and telling it, for each one, 'This one didn't cause extreme weather. This one caused a heat wave in California. This one didn't cause anything. This one caused a cold spell in the Northeast,'" Hassanzadeh said. "Not anything specific like Houston versus Dallas, but more of a sense of the regional area."
At the time, Hassanzadeh, Chattopadhyay and Nabizadeh were barely aware that analog forecasting had once been a mainstay of weather prediction and even had a storied role in the D-Day landings in World War II.
"One way prediction was done before computers is they would look at the pressure system pattern today, and then go to a catalog of previous patterns and compare and try to find an analog, a closely similar pattern," Hassanzadeh said. "If that one led to rain over France after three days, the forecast would be for rain in France."
He said one of the advantages of using deep learning is that the neural network didn't need to be told what to look for.
"It didn't matter that we don't fully understand the precursors, because the neural network learned to find those connections itself," Hassanzadeh said. "It learned which patterns were critical for extreme weather, and it used those to find the best analog."
As a proof of concept, the team used model data taken from realistic computer simulations. The team had reported early results with a convolutional neural network when Chattopadhyay, the lead author of the new study, heard about capsule neural networks, a new form of deep learning that debuted with fanfare in late 2017, in part because it was the brainchild of Geoffrey Hinton, the founding father of convolutional neural network-based deep learning.
Unlike convolutional neural networks, capsule neural networks can recognize relative spatial relationships, which are important in the evolution of weather patterns. "The relative positions of pressure patterns, the highs and lows you see on weather maps, are the key factor in determining how weather evolves," Hassanzadeh said.
Another significant advantage of capsule neural networks is that they don't require as much training data as convolutional neural networks. There's only about 40 years of high-quality weather data from the satellite era, and Hassanzadeh's team is working to train its capsule neural network on observational data and compare its forecasts with those of state-of-the-art NWP models.
"Our immediate goal is to extend our forecast lead time to beyond 10 days, where NWP models have weaknesses," he said.
Though much more work is needed before Rice's system can be incorporated into operational forecasting, Hassanzadeh hopes it might eventually improve forecasts for heat waves and other extreme weather.
"We are not suggesting that at the end of the day this is going to replace NWP," he said. "But this might be a useful guide for NWP. Computationally, this could be a super cheap way to provide some guidance, an early warning, that allows you to focus NWP resources specifically where extreme weather is likely."
Hassanzadeh said his team is also interested in finding out what patterns the capsule neural network uses to make its predictions.
"We want to leverage ideas from explainable AI (artificial intelligence) to interpret what the neural network is doing," he said. "This might help us identify the precursors to extreme-causing weather patterns and improve our understanding of their physics."
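The analog lookup Hassanzadeh describes (compare today's pressure pattern against a catalog of past patterns and reuse the outcome of the closest match) can be sketched in a few lines. Everything below, the synthetic maps and the outcome labels, is illustrative and is not the team's data or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Catalog of historical "pressure maps" (flattened 8x8 grids) with the
# weather outcome that followed each one, all synthetic for this sketch.
catalog_maps = rng.normal(size=(500, 64))
catalog_outcomes = rng.choice(["normal", "heat wave", "cold spell"], size=500)

def analog_forecast(today_map, maps, outcomes):
    """Return the outcome of the closest historical analog (Euclidean distance)."""
    dists = np.linalg.norm(maps - today_map, axis=1)
    return outcomes[np.argmin(dists)]

# A map identical to catalog entry 42 recovers that entry's outcome.
prediction = analog_forecast(catalog_maps[42], catalog_maps, catalog_outcomes)
```

The deep-learning version replaces the raw Euclidean distance with a learned notion of which map features matter, but the catalog-lookup structure is the same.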
Weather
2020
February 3, 2020
https://www.sciencedaily.com/releases/2020/02/200203114325.htm
Losing coastal plant communities to climate change will weaken sea defenses
Coastal plant communities are a crucial element of global sea defences but are increasingly threatened by the human-induced effects of climate change, according to new research.
Rising sea levels and the increased frequency and intensity of extreme storm events are having a visible, global impact on beaches, cliff faces and coastal infrastructure. But a new report suggests the impact on coastal plants, an integral part of shoreline defences, needs to be placed in greater focus.
The research was led by the University of Plymouth, in conjunction with scientists at Utrecht University and Manchester Metropolitan University, and is published in a special edition of a scientific journal. It follows a recent assessment by the Intergovernmental Panel on Climate Change (IPCC 2019), which asserted that anthropogenically-driven climate change poses a severe environmental threat to estuarine and coastal ecosystems.
The report not only reviews how the flood and erosion threats posed by a combination of sea level rise and storms can affect coastal sub-, inter- and supra-tidal plant communities, but also highlights the contribution that habitats like saltmarshes, mangrove forests, sand dunes and kelp beds make to coastal protection.
Dr Mick Hanley, Associate Professor (Reader) in the School of Biological and Marine Sciences at the University of Plymouth, led the research. He said: "It has been suggested that by 2050, it could cost well over $50 billion to protect the world's largest cities from coastal flooding. In contrast, coastal vegetation can offer natural protection against erosion and flooding for a fraction of the costs associated with constructing so-called hard defences like concrete walls. Society is only just beginning to appreciate this, but estuarine and coastal ecosystems can be integrated into a dynamic, low-cost flood defence strategy to meet the ever-increasing challenges posed by rising sea levels and storms."
As well as highlighting that the threats posed by extreme weather to coastal plant communities are undoubtedly severe, the study calls for biologists and ecologists to work alongside coastal scientists, environment agencies and land managers to identify the key species and habitats for coastal defence and how they can be both promoted and protected in the future. Central to that objective, the authors argue, is the need to develop and combine long-term monitoring with flood risk models to better predict where and how storms and other climate change-driven phenomena influence coastal ecosystems and services.
Weather
2020
January 30, 2020
https://www.sciencedaily.com/releases/2020/01/200130131006.htm
Rapid weather swings increase flu risk
New research from a team of Florida State University scientists shows that rapid weather variability as a result of climate change could increase the risk of a flu epidemic in some highly populated regions in the late 21st century. The research was published today in a scientific journal.
Zhaohua Wu, an associate professor in the Department of Earth, Ocean and Atmospheric Science and a scientist with the Center for Ocean-Atmospheric Prediction Studies, and an international team looked at historical data to see how significant weather swings in the autumn months affect flu season in highly populated regions of the northern mid-latitudes. They specifically looked at the United States, mainland China, Italy and France.
Using surface air temperatures from Jan. 1, 1997 to Feb. 28, 2018, the researchers analyzed weather patterns and average temperatures over 7,729 days. Simultaneously, they conducted statistical analysis on influenza data sets from the four countries over the same time period.
Previous research suggested low temperatures and humidity in the winter create a favorable environment for transmitting the flu virus. However, the 2017-2018 flu season was one of the warmest on record and yet also one of the deadliest: the Centers for Disease Control reported 186 children's deaths that season, surpassing the previous high of 171 during the 2012-2013 season.
During the 2017-2018 flu season, the scientists found that extreme fluctuations in weather during the autumn months essentially kick-started the flu, building a patient population early in the season that snowballed in densely populated areas given the contagious nature of the virus.
"The historical flu data from different parts of the world showed that the spread of flu epidemic has been more closely tied to rapid weather variability, implying that the lapsed human immune system in winter caused by rapidly changing weather makes a person more susceptible to flu virus," Wu said.
The issue going forward, the scientists noted, is that rapid weather variability is common in warming climates. A better understanding of those weather patterns may be key to determining the severity of any future flu season threat. If the climate models are correct, highly populated areas can anticipate increased flu risk; under this scenario, Europe could see a 50 percent increase in deaths tied to flu.
"The autumn rapid weather variability and its characteristic change in a warming climate may serve not only as a skillful predictor for spread of flu in the following season but also a good estimator of future flu risk," Wu said. "Including this factor in flu spread models may lead to significantly improved predictions of flu epidemic."
Wu said he and his team are continuing to pursue this line of research, with the ultimate goal of creating a model that incorporates both traditional flu indicators on the health and medicine side and environmental factors such as weather patterns.
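The article does not spell out how the team quantified "rapid weather variability," but one plausible metric is the mean absolute day-to-day temperature swing over a season. The sketch below computes it on two synthetic autumn temperature series, one calm and one volatile; all numbers are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic autumn daily temperatures (deg C): a slow cooling trend plus
# day-to-day noise of different sizes.
days = np.arange(91)
stable_autumn = 15.0 - 0.1 * days + rng.normal(0.0, 0.5, size=91)
volatile_autumn = 15.0 - 0.1 * days + rng.normal(0.0, 3.0, size=91)

def variability(temps):
    """Mean absolute day-to-day temperature swing over the series."""
    return float(np.mean(np.abs(np.diff(temps))))

v_stable = variability(stable_autumn)
v_volatile = variability(volatile_autumn)
```

A season like `volatile_autumn` scores far higher on this metric, which is the kind of signal the study links to an earlier, larger flu patient population.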
Weather
2020
January 29, 2020
https://www.sciencedaily.com/releases/2020/01/200129174450.htm
Meteorites reveal high carbon dioxide levels on early Earth
Tiny meteorites no larger than grains of sand hold new clues about the atmosphere on ancient Earth, according to scientists.
Iron micrometeorites found in ancient soils suggest carbon dioxide made up 25 to 50 percent of Earth's atmosphere 2.7 billion years ago, and that pressure at sea level may have been lower than today, Penn State researchers said.
The meteorites melted as they streaked through the atmosphere and oxidized as they encountered atmospheric gases. Evidence of the oxidation remains on the tiny fragments that landed on Earth, and the samples serve as a unique proxy for conditions in the upper atmosphere, the scientists said.
"This is a promising new tool for figuring out the composition of the upper atmosphere billions of years in the past," said Rebecca Payne, a doctoral candidate in geosciences and astrobiology at Penn State and lead author of the recently published study.
The work builds on previous studies of the micrometeorites which suggested free oxygen molecules in the upper atmosphere oxidized the meteorites. Those findings would require oxygen levels on ancient Earth to be near modern-day levels, a surprising conclusion that contradicts conditions expected on the young planet, Payne said.
The researchers conducted a new analysis using photochemical and climate models and determined that carbon dioxide, not oxygen, likely served as the main oxidant. For this to be possible, they found, carbon dioxide had to comprise at least 25 percent of the atmosphere.
Such carbon dioxide levels would suggest a warm planet, but other climate evidence finds Earth was cool at the time and partly covered by glaciers. Lower nitrogen levels, resulting in lower pressure, would allow for both high carbon dioxide levels and cool conditions.
"There are data, referenced in our paper, that support lower nitrogen concentrations during this time," said Jim Kasting, Evan Pugh University Professor in the Department of Geosciences at Penn State and Payne's adviser. "Our study of micrometeorite oxidation falls in line with that interpretation. The possibility that our major atmospheric gas, nitrogen, was less abundant in the distant past is really intriguing."
The findings may help reconcile disagreements between previous studies of carbon dioxide in the deep past and climate model estimates, according to the researchers. Previous estimates of carbon dioxide levels from billions of years ago rely on paleosols, or ancient soils, which may better reflect conditions in the lower atmosphere. Regional differences like weather or ground cover can also affect paleosol samples, and the findings from these studies often contradict each other and climate models, the scientists said.
"It was getting difficult to figure out where the agreement should have been between different paleosol studies and climate models," Payne said. "This is interesting, because it's a new point of comparison. It may help us find the right answer about atmospheric carbon dioxide in the deep past."
Don Brownlee, a professor at the University of Washington, also contributed to this research.
Weather
2020
January 29, 2020
https://www.sciencedaily.com/releases/2020/01/200129104745.htm
Space super-storm likelihood estimated from longest period of magnetic field observations
A 'great' space weather super-storm large enough to cause significant disruption to our electronic and networked systems occurred, on average, once every 25 years over the period studied, according to a new joint study by the University of Warwick and the British Antarctic Survey.
By analysing magnetic field records at opposite ends of the Earth (UK and Australia), scientists have been able to detect super-storms going back over the last 150 years. The result was made possible by a new way of analysing historical data, pioneered by the University of Warwick, covering the last 14 solar cycles, well before the space age began in 1957, instead of the five solar cycles currently used.
The analysis shows that 'severe' magnetic storms occurred in 42 of the last 150 years, and 'great' super-storms in 6 years out of 150. Typically a storm may last only a few days, but it can be hugely disruptive to modern technology. Super-storms can cause power blackouts, take out satellites, disrupt aviation, and cause temporary loss of GPS signals and radio communications.
Lead author Professor Sandra Chapman, from the University of Warwick's Centre for Fusion, Space and Astrophysics, said: "These super-storms are rare events, but estimating their chance of occurrence is an important part of planning the level of mitigation needed to protect critical national infrastructure. This research proposes a new method to approach historical data, to provide a better picture of the chance of occurrence of super-storms and what super-storm activity we are likely to see in the future."
The Carrington storm of 1859 is widely recognised as the largest super-storm on record, but it predates even the data used in this study. The analysis led by Professor Chapman estimates what amplitude it would need to have had to be in the same class as the other super-storms, and hence a chance of occurrence that can be estimated.
Professor Richard Horne, who leads space weather research at the British Antarctic Survey, said: "Our research shows that a super-storm can happen more often than we thought. Don't be misled by the stats; it can happen any time. We simply don't know when, and right now we can't predict when."
Space weather is driven by activity from the sun. Smaller-scale storms are common, but occasionally larger storms occur that can have a significant impact. One way to monitor this space weather is by observing changes in the magnetic field at the Earth's surface. High-quality observations at multiple stations have been available since the beginning of the space age (1957). The sun has an approximately 11-year cycle of activity which varies in intensity, and this data, which has been extensively studied, covers only five cycles of solar activity.
A better estimate of the chance of occurrence of the largest space storms over many solar cycles requires going back further in time. The aa geomagnetic index is derived from two stations at opposite ends of the Earth (in the UK and Australia) to cancel out the Earth's own background field. It goes back over 14 solar cycles, or 150 years, but has poor resolution.
Using annual averages of the top few percent of the aa index, the researchers found that a 'severe' super-storm occurred in 42 years out of 150 (28%), while a 'great' super-storm occurred in 6 years out of 150 (4%), or once every 25 years. As an example, the 1989 storm that caused a major power blackout of Quebec was a great storm.
In 2012 the Earth narrowly avoided trouble when a coronal mass ejection from the Sun missed the Earth and went off in another direction; according to satellite measurements, if it had hit the Earth it would have caused a super-storm.
Space weather was included in the UK National Risk Register in 2012 and updated in 2017 with a recommendation for more investment in forecasting. In September 2019 the Prime Minister announced a major new investment of £20 million into space weather, with the object of forecasting magnetic storms and developing better mitigation strategies.
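The occurrence figures quoted above reduce to simple counting; the sketch below just reproduces that arithmetic from the counts reported in the study:

```python
# Counts reported in the study: 150 years of the aa geomagnetic index.
years_total = 150
severe_years = 42   # years containing a 'severe' magnetic storm
great_years = 6     # years containing a 'great' super-storm

severe_annual_chance = severe_years / years_total   # fraction of years, 28%
great_annual_chance = great_years / years_total     # fraction of years, 4%
great_return_period = years_total / great_years     # mean years between great storms
```

As Professor Horne stresses, a 25-year mean return period is an average rate, not a schedule: a great storm can occur in any year with roughly a 4% chance.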
Weather
2020
January 27, 2020
https://www.sciencedaily.com/releases/2020/01/200127164341.htm
Airborne measurements point to low EPA methane estimates in south central US
Approximately twice as much methane is seeping into the atmosphere from oil and gas facilities in the south central U.S. as the Environmental Protection Agency estimates, according to a series of measurements taken by meteorologists using NASA aircraft.
In six flights through the region, researchers used onboard instruments on two planes to collect data roughly 1,000 feet above the ground. They flew through massive methane plumes concentrated by regional weather patterns and used sample points and weather models to determine the actual methane concentrations of the plumes. These concentrated plumes were discovered during the Atmospheric Carbon and Transport-America (ACT-America) campaign, a much broader Penn State-led effort to understand greenhouse gas sources and sinks.
Researchers found methane emissions from oil and gas facilities to be 1.1 to 2.5 times EPA estimates for the region that includes Arkansas, Texas, Louisiana and Oklahoma. In another key finding, the scientists showed how frontal systems in the atmosphere can be used to track methane from much larger areas at the surface, because large plumes of methane concentrations come together along the frontal boundary.
"When we flew across cold fronts, one thing we noticed was that warm air was being pulled up and funneling the region's greenhouse gases into large plumes," said Zach Barkley, researcher in meteorology and atmospheric science, Penn State. "We fed data from these plumes into our weather models and, when we compared the data with the EPA inventory, we saw there was a discrepancy."
Methane comes from many sources, including wetlands, animal agriculture and the oil and natural gas industry, so the researchers used ethane measurements to determine the source. Ethane is primarily found in methane produced by the natural gas industry, so it allowed the researchers to rule out methane produced by animal agriculture and other natural sources. The findings are reported in a recent journal issue.
The EPA uses a bottom-up approach to estimate methane emissions from industry, applying a value to each well and transport component. Penn State researchers used a top-down approach, meaning the emissions were measured at their endpoint, the atmosphere.
"The one issue with the bottom-up approach is if you can't sample enough sources to get an accurate representation of the average," Barkley said. "When you multiply by all of the different devices and components across the U.S., you could potentially come up with a number that's not accurate."
The region is important to combating greenhouse gas emissions at large, Barkley said, because it accounts for nearly 40 percent of human-made methane emissions in the U.S.; it is a hotspot for both natural gas extraction and animal agriculture. Methane is an important greenhouse gas, with 34 times the warming potential of carbon dioxide over a 100-year period, according to the Intergovernmental Panel on Climate Change.
Barkley said there are also problems with the top-down approach: it is more expensive and does not identify which sources are emitting methane. The approach is more of a check on the accuracy of the existing inventory, but it does point to areas to target for greenhouse gas reduction.
"If oil and gas emissions are off by a factor of two, that means that oil and gas are very significantly the highest human-made source of methane emissions in the U.S. and would be a prime area to target for reducing methane emissions, particularly if we find relatively few sources contributing significantly to the bulk of the emissions," Barkley said. "If we can figure out how to target those sources and fix them, that could be a significant reduction of greenhouse gas emissions coming from the oil and gas industry."
In prior research, Barkley determined EPA estimates for methane from natural gas facilities were also low for a portion of Pennsylvania's Marcellus Shale region.
NASA funded this research.
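The bottom-up versus top-down contrast can be made concrete with a toy inventory. The source counts and per-source emission factors below are invented for the sketch; the only number taken from the study is the 1.1 to 2.5 range by which airborne measurements exceeded the inventory:

```python
def bottom_up_estimate(source_counts, emission_factors):
    """Inventory total: sum over source types of count * per-source factor."""
    return sum(source_counts[s] * emission_factors[s] for s in source_counts)

# Hypothetical regional inventory (units arbitrary, e.g. kg CH4 per hour).
counts = {"well": 1000, "compressor": 50}
factors = {"well": 1.2, "compressor": 40.0}

inventory = bottom_up_estimate(counts, factors)

# Top-down check: airborne measurements found actual emissions to be
# 1.1 to 2.5 times the inventory in the study region.
measured_low = 1.1 * inventory
measured_high = 2.5 * inventory
```

The bottom-up number is only as good as the per-source factors and counts, which is exactly the sampling weakness Barkley describes; the top-down measurement brackets what the atmosphere actually received.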
Weather
2020
January 27, 2020
https://www.sciencedaily.com/releases/2020/01/200127145455.htm
Driven by Earth's orbit, climate changes in Africa may have aided human migration
In 1961, John Kutzbach, then a recent college graduate, was stationed in France as an aviation weather forecaster for the U.S. Air Force. There, he found himself exploring the storied caves of Dordogne, including the prehistoric painted caves at Lascaux.
Thinking about the ancient people and animals who would have gathered in these caves for warmth and shelter, he took up an interest in glaciology. "It was interesting to me, as a weather person, that people would live so close to an ice sheet," says Kutzbach, emeritus University of Wisconsin-Madison professor of atmospheric and oceanic sciences and in the Nelson Institute for Environmental Studies.
Kutzbach went on to a career studying how changes in Earth's movements through space (the shape of its orbit, its tilt on its axis, its wobble) and other factors, including ice cover and greenhouse gases, affect its climate. Many years after reveling at Ice Age cave art, today he's trying to better understand how changes in Earth's climate may have influenced human migration out of Africa.
A recently published study describes a dynamic climate and vegetation model that explains when regions across Africa, areas of the Middle East, and the Mediterranean were wetter and drier, and how the plant composition changed in tandem, possibly providing migration corridors throughout time.
"We don't really know why people move, but if the presence of more vegetation is helpful, these are the times that would have been advantageous to them," Kutzbach says.
The model also illuminates relationships between Earth's climate and its orbit, greenhouse gas concentrations, and its ice sheets. For instance, the model shows that around 125,000 years ago, northern Africa and the Arabian Peninsula experienced increased and more northerly-reaching summer monsoon rainfall that led to narrowing of the Saharan and Arabian deserts due to increased grassland. At the same time, in the Mediterranean and the Levant (an area that includes Syria, Lebanon, Jordan, Israel and Palestine), winter storm-track rainfall also increased.
These changes were driven by Earth's position relative to the sun. The Northern Hemisphere at the time was as close as possible to the sun during the summer and as far away as possible during the winter, resulting in warm, wet summers and cold winters.
"It's like two hands meeting," says Kutzbach. "There were stronger summer rains in the Sahara and stronger winter rains in the Mediterranean."
Given the nature of Earth's orbital movements, collectively called Milankovitch cycles, the region should be positioned this way roughly every 21,000 years. Every 10,000 years or so, the Northern Hemisphere would instead be at its furthest point from the sun during the summer and closest during winter. Indeed, the model showed large increases in rainfall and vegetation at 125,000, 105,000, and 83,000 years ago, with corresponding decreases at 115,000, 95,000 and 73,000 years ago, when summer monsoons decreased in magnitude and stayed further south.
Between roughly 70,000 and 15,000 years ago, Earth was in a glacial period, and the model showed that the presence of ice sheets and reduced greenhouse gases increased winter Mediterranean storms but limited the southern retreat of the summer monsoon. The reduced greenhouse gases also caused cooling near the equator, leading to a drier climate there and reduced forest cover.
These changing regional patterns of climate and vegetation could have created resource gradients for humans living in Africa, driving migration outward to areas with more water and plant life.
For the study, the researchers, including Kutzbach's UW-Madison colleagues Ian Orland and Feng He, along with researchers at Peking University and the University of Arizona, used the Community Climate System Model version 3 from the National Center for Atmospheric Research. They ran simulations that accounted for orbital changes alone, for combined orbital and greenhouse gas changes, and for those influences plus the influence of ice sheets.
It was Kutzbach who, in the 1970s and 1980s, confirmed that changes in Earth's orbit can drive the strength of summer monsoons around the globe by influencing how much sunlight, and therefore how much warming, reaches a given part of the planet. Forty years ago, there was evidence for periodic strong monsoons in Africa, but no one knew why, Kutzbach says. He showed that orbital changes on Earth could lead to warmer summers and thus stronger monsoons. He also read about periods of "greening" in the Sahara, often used to explain early human migration into the typically arid Middle East.
"My early work prepared me to think about this," he says.
His current modeling work mostly agrees with data collected from each region, including observed evidence from old lake beds, pollen records, cave features, and marine sediments. A recent study led by Orland used cave records in the Levant to show that summer monsoons reached into the region around 125,000 years ago.
"We get some things wrong (in the model)," says Kutzbach, so the team continues to refine it. For instance, the model doesn't get cold enough in southern Europe during the glacial period, and not all vegetation changes match observed data. Computing power has also improved since they ran the model. "This is by no means the last word," Kutzbach says. "The results should be looked at again with an even higher-resolution model."
The study was supported by the National Science Foundation (grants AGS1602771, 1502990, 1603065, 1338553, 1603065 and 1702407) and Smithsonian Institute Contract 33330218CT0010211. It was also supported by high-performance computing resources from Yellowstone and Cheyenne, provided by the National Center for Atmospheric Research.
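The roughly 21,000-year precession pacing can be illustrated with a simple cosine forcing phased so that a monsoon maximum falls 125,000 years ago. Real precession is quasi-periodic (the modeled extremes at 105,000 and 83,000 years ago are not exactly one and two periods later), so this is only a spacing illustration, not the study's model:

```python
import numpy as np

period_kyr = 21.0   # approximate precession period, thousands of years
peak_kyr = 125.0    # modeled wet maximum, thousands of years before present

def monsoon_index(t_kyr):
    """Relative monsoon strength at t_kyr before present: +1 wet, -1 dry."""
    return float(np.cos(2.0 * np.pi * (t_kyr - peak_kyr) / period_kyr))

wet_two_cycles_later = monsoon_index(83.0)   # exactly 2 periods after the peak
dry_half_cycle_later = monsoon_index(114.5)  # half a period after the peak
```

In this idealized picture, wet maxima recur every ~21 kyr and dry minima fall halfway between them, matching the alternation of wetter and drier intervals the model produced.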
Weather
2020
January 22, 2020
https://www.sciencedaily.com/releases/2020/01/200122122121.htm
Mapping the path of climate change
Since 1880, the Earth's temperature has risen by 1.9 degrees Fahrenheit and is predicted to continue rising, according to the NASA Global Climate Change website. Scientists are actively seeking to understand this change and its effect on Earth's ecosystems and residents.
The researchers developed a climate change model, based on a probabilistic framework, to explore the most likely path of climate change for an energy balance system under the influence of the greenhouse effect and Lévy fluctuations. These fluctuations, which can present themselves as volcanic eruptions or huge solar outbreaks, for example, are suggested to be one factor that can trigger an abrupt climatic transition.

Examples of such noise-driven transitions include the rapid climate changes that occurred 25 times during the last glacial period, a series of pauses in geophysical turbulence, and protein production in gene regulation, which occurs in bursts.

"Although the climate changes may not easily be accurately predicted, we offer insights about the most likely trend in such changes for the surface temperature," said Duan. "In the present paper, we have uncovered that the maximum likelihood path, under an enhanced greenhouse effect, is a step-like growth process when transferring from the current temperature state to the high temperature one."

By understanding the step-like growth process of the greenhouse effect, the authors can map out the path that climate changes may take. The researchers found larger influences of noise fluctuations can result in abrupt shifts from a cold climate state to a warmer one.

"The maximum likelihood path will be expected to be an efficient research tool, in order to better understand the climate changes under the greenhouse effect combined with non-Gaussian fluctuations in the environment," said Duan.
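To make the idea concrete, here is an illustrative sketch, not the authors' model: a toy bistable energy-balance-style system driven by heavy-tailed (Cauchy) increments standing in for Lévy noise. The double-well potential, time step, noise scale and clamping bounds are all invented for demonstration.

```python
import math
import random

def drift(T):
    # Double-well "climate potential": stable states at T = -1 (cold)
    # and T = +1 (warm), in arbitrary units.
    return -T * (T * T - 1.0)

def simulate(steps=50_000, dt=1e-3, noise_scale=0.02, seed=1):
    rng = random.Random(seed)
    T = -1.0                        # start in the cold state
    path = [T]
    for _ in range(steps):
        # Cauchy increments: mostly tiny, occasionally huge. Rare large
        # jumps are what can flip the system between states abruptly.
        jump = noise_scale * dt * math.tan(math.pi * (rng.random() - 0.5))
        T += drift(T) * dt + jump
        T = max(-3.0, min(3.0, T))  # clamp to keep the toy example tame
        path.append(T)
    return path

path = simulate()
```

Under a larger `noise_scale`, jumps between the wells become more frequent, mirroring the abrupt cold-to-warm shifts the article attributes to stronger non-Gaussian fluctuations.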
Weather
2020
January 15, 2020
https://www.sciencedaily.com/releases/2020/01/200115130446.htm
NASA, NOAA analyses reveal 2019 second warmest year on record
According to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA), Earth's global surface temperatures in 2019 were the second warmest since modern recordkeeping began in 1880.
Globally, 2019 temperatures were second only to those of 2016 and continued the planet's long-term warming trend: the past five years have been the warmest of the last 140 years.

This past year, temperatures were 1.8 degrees Fahrenheit (0.98 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's Goddard Institute for Space Studies (GISS) in New York.

"The decade that just ended is clearly the warmest decade on record," said GISS Director Gavin Schmidt. "Every decade since the 1960s clearly has been warmer than the one before."

Since the 1880s, the average global surface temperature has risen and the average temperature is now more than 2 degrees Fahrenheit (a bit more than 1 degree Celsius) above that of the late 19th century. For reference, the last Ice Age was about 10 degrees Fahrenheit colder than pre-industrial temperatures.

Using climate models and statistical analysis of global temperature data, scientists have concluded that this increase mostly has been driven by increased emissions into the atmosphere of carbon dioxide and other greenhouse gases produced by human activities.

"We crossed over into more than 2 degrees Fahrenheit warming territory in 2015 and we are unlikely to go back. This shows that what's happening is persistent, not a fluke due to some weather phenomenon: we know that the long-term trends are being driven by the increasing levels of greenhouse gases in the atmosphere," Schmidt said.

Because weather station locations and measurement practices change over time, the interpretation of specific year-to-year global mean temperature differences has some uncertainties. Taking this into account, NASA estimates that 2019's global mean change is accurate to within 0.1 degrees Fahrenheit, with a 95% certainty level.

Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming.
NOAA found the 2019 annual mean temperature for the contiguous 48 United States was the 34th warmest on record, giving it a "warmer than average" classification. The Arctic region has warmed slightly more than three times faster than the rest of the world since 1970.

Rising temperatures in the atmosphere and ocean are contributing to the continued mass loss from Greenland and Antarctica and to increases in some extreme events, such as heat waves, wildfires and intense precipitation.

NASA's temperature analyses incorporate surface temperature measurements from more than 20,000 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.

These in situ measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heat island effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.

NOAA scientists used much of the same raw temperature data, but with a different interpolation into the Earth's polar and other data-poor regions. NOAA's analysis found 2019 global temperatures were 1.7 degrees Fahrenheit (0.95 degrees Celsius) above the 20th century average.

NASA's full 2019 surface temperature data set, along with the complete methodology used for the temperature calculation and its uncertainties, is publicly available.
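The anomaly convention behind these figures is simple: each year's temperature is reported as a deviation from the mean of a fixed baseline period (1951 to 1980 in NASA's case). A minimal sketch with invented temperatures:

```python
# Each year's anomaly is its deviation from the baseline-period mean.
# The toy global mean temperatures below (deg C) are invented.
def anomalies(temps_by_year, base_start=1951, base_end=1980):
    base = [t for y, t in temps_by_year.items() if base_start <= y <= base_end]
    baseline = sum(base) / len(base)
    return {y: round(t - baseline, 2) for y, t in temps_by_year.items()}

toy = {1951: 13.9, 1965: 14.0, 1980: 14.1, 2016: 15.0, 2019: 14.98}
print(anomalies(toy)[2019])  # 0.98 -- an anomaly of the kind quoted above
```

Changing the baseline window shifts every anomaly by a constant, which is why NASA's (1951-1980 baseline) and NOAA's (20th-century baseline) headline numbers differ slightly while describing the same warming.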
Weather
2020
January 15, 2020
https://www.sciencedaily.com/releases/2020/01/200115093437.htm
Global warming to increase violent crime in the United States
People in the United States could see tens of thousands of extra violent crimes every year -- because of climate change alone.
"Depending on how quickly temperatures rise, we could see two to three million more violent crimes between now and the end of the century than there would be in a non-warming world," said Ryan Harp, researcher at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder and lead author of a new study published today.

In 2018, Harp and his coauthor, Kris Karnauskas, CIRES Fellow and associate professor in the Department of Atmospheric and Oceanic Sciences at CU Boulder, mined an FBI crime database and NOAA climate data to identify a set of compelling regional connections between warming and crime rates, especially in winter. Warmer winters appeared to be setting the stage for more violent crimes like assault and robbery, likely because less nasty weather created more opportunities for interactions between people.

Now, the team has projected additional future violent crimes in the United States, by combining the mathematical relationships they uncovered in previous work with output from 42 state-of-the-art global climate models. The team accounted for key factors that previous studies have overlooked, including variations in crime rates across seasons and for different regions of the country.

"We are just beginning to scratch the surface on the myriad ways climate change is impacting people, especially through social systems and health," Karnauskas said. "We could see a future where results like this impact planning and resource allocation among health, law enforcement and criminal justice communities."
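The projection logic described above, an empirically fitted warming-vs-crime relationship applied to an ensemble of climate model projections, can be sketched as follows. The sensitivity figure and the per-model warming values are invented for illustration and are not from the study.

```python
# Hypothetical-numbers sketch: apply a fitted crimes-per-degree sensitivity
# to each climate model's projected warming, then take the ensemble mean.
def projected_extra_crimes(warming_by_model_c, crimes_per_degree_c):
    per_model = [w * crimes_per_degree_c for w in warming_by_model_c]
    return sum(per_model) / len(per_model)   # ensemble-mean estimate

# Three made-up end-of-century warming projections (deg C) and a made-up
# sensitivity of 15,000 extra violent crimes per year per degree.
estimate = projected_extra_crimes([2.0, 3.0, 4.0], 15_000)
print(estimate)  # 45000.0 extra crimes per year at 3 deg C mean warming
```

The actual study uses seasonally and regionally resolved relationships across 42 models; this sketch only shows the shape of the calculation.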
Weather
2020
January 14, 2020
https://www.sciencedaily.com/releases/2020/01/200114101715.htm
Historical housing disparities linked with dangerous climate impacts
Extreme heat kills more people in the United States than any other type of hazardous weather and will likely become even deadlier due to climate change. However, extreme heat does not affect all people equally. Surface temperatures in different neighborhoods within a single city can vary by a whopping 20 degrees (F), making some people more at risk of experiencing dangerous temperatures.
A new study by researchers at the Science Museum of Virginia and Portland State University, with assistance from a student at Virginia Commonwealth University, is one of the first to link historical housing policies across the United States to inequitable heat exposure.

"We found that those urban neighborhoods that were denied municipal services and support for home ownership during the mid-20th century now contain the hottest areas in almost every one of the 108 cities we studied," said Vivek Shandas, professor of urban studies and planning at Portland State University. "Our concern is that this systemic pattern suggests a woefully negligent planning system that hyper-privileged richer and whiter communities. As climate change brings hotter, more frequent and longer heat waves, the same historically underserved neighborhoods -- often where lower-income households and communities of color still live -- will, as a result, face the greatest impact."

Jeremy Hoffman of the Science Museum of Virginia, and Nicholas Pendleton, a former student at Virginia Commonwealth University, also contributed to the study.

Neighborhoods with less green space and more concrete and pavement are hotter on average, creating 'heat islands.' In an earlier study of Portland, Oregon, Shandas and colleagues found that lower-income households and communities of color tend to live in heat islands. They found similar effects in other cities, and they wanted to know why.

To explore this question, they looked at the relationship between 'redlining' and surface heat. Beginning in the 1930s, discriminatory housing policies categorized some neighborhoods -- designated with red lines -- as too hazardous for investment. Thus, residents in 'redlined' neighborhoods were denied home loans and insurance. These areas continue to be predominantly home to lower-income communities and communities of color.
While the practice of redlining was banned in 1968, this study aimed to assess the legacy effects of such policies within the context of rising temperatures.

The study found formerly redlined neighborhoods are hotter than all other neighborhoods in 94% of the 108 cities studied. In particular, the researchers found that redlined neighborhoods across the country are about 5 degrees Fahrenheit warmer, on average, than non-redlined neighborhoods. However, in some cities the differences are much more stark. For example, the cities of Portland, OR, Denver, CO and Minneapolis, MN showed the largest heat differences between redlined and non-redlined areas -- as much as 12.6 degrees Fahrenheit.

"The patterns of the lowest temperatures in specific neighborhoods of a city do not occur because of circumstance or coincidence. They are a result of decades of intentional investment in parks, green spaces, trees, transportation and housing policies that provided 'cooling services,' which also coincide with being wealthier and whiter across the country," said Shandas. "We are now seeing how those policies are literally killing those most vulnerable to acute heat."

"I think anyone living in these neighborhoods today will tell you that it's hot during a heat wave," said Hoffman. "But that's not really the point. They are not only experiencing hotter heat waves with their associated health risks but also potentially suffering from higher energy bills, limited access to green spaces that alleviate stress and limited economic mobility at the same time. Our study is just the first step in identifying a roadmap toward equitable climate resilience by addressing these systemic patterns in our cities."

There are ways to mitigate the effects of extreme heat on potentially vulnerable populations through urban planning, and the researchers want this study to lead to changes in the way we design our cities and neighborhoods.

"Having worked with dozens of cities to support the creation of heat mitigation plans, we want to recognize that all neighborhoods are not made equal," Shandas said. "Nevertheless, by recognizing and centering the historical blunders of the planning profession over the past century, such as the exclusionary housing policies of 'redlining,' we stand a better chance for reducing the public health and infrastructure impacts from a warming planet."
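The comparison behind the roughly 5-degree figure reduces to a grouped average: mean surface temperature by historical lending grade, then the difference between grades. A sketch with invented grades and temperatures (not data from the study):

```python
from statistics import mean

# Invented neighborhood records: (historical lending grade, surface temp in deg F).
neighborhoods = [
    ("A", 88.0), ("A", 90.0),                 # higher-graded areas
    ("D", 94.0), ("D", 95.0), ("D", 93.0),    # formerly redlined areas
]

def grade_gap(data, hot_grade="D", cool_grade="A"):
    by_grade = {}
    for grade, temp in data:
        by_grade.setdefault(grade, []).append(temp)
    return mean(by_grade[hot_grade]) - mean(by_grade[cool_grade])

print(grade_gap(neighborhoods))  # 5.0 -> redlined areas about 5 deg F hotter
```

The study performs this kind of comparison across 108 cities using satellite-derived surface temperatures; the toy numbers here merely echo its nationwide average gap.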
Weather
2020
January 14, 2020
https://www.sciencedaily.com/releases/2020/01/200114074046.htm
Climate change increases the risk of wildfires confirms new review
Human-induced climate change promotes the conditions on which wildfires depend, increasing their likelihood -- according to a review of research on global climate change and wildfire risk published today.
In light of the Australian fires, scientists from the University of East Anglia (UEA), Met Office Hadley Centre, University of Exeter and Imperial College London have conducted a Rapid Response Review of 57 peer-reviewed papers published since the IPCC's Fifth Assessment Report in 2013.

All the studies show links between climate change and increased frequency or severity of fire weather -- periods with a high fire risk due to a combination of high temperatures, low humidity, low rainfall and often high winds -- though some note anomalies in a few regions.

Rising global temperatures, more frequent heatwaves and associated droughts in some regions increase the likelihood of wildfires by stimulating hot and dry conditions, promoting fire weather, which can be used as an overall measure of the impact of climate change on the risk of fires occurring.

Observational data shows that fire weather seasons have lengthened across approximately 25 per cent of the Earth's vegetated surface, resulting in about a 20 per cent increase in global mean length of the fire weather season.

The literature review was carried out using the new ScienceBrief.org online platform, set up by UEA and the Tyndall Centre for Climate Change Research. ScienceBrief is written by scientists and aims to share scientific insights with the world and keep up with science, by making sense of peer-reviewed publications in a rapid and transparent way.

Dr Matthew Jones, Senior Research Associate at UEA's Tyndall Centre and lead author of the review, said: "Overall, the 57 papers reviewed clearly show human-induced warming has already led to a global increase in the frequency and severity of fire weather, increasing the risks of wildfire. This has been seen in many regions, including the western US and Canada, southern Europe, Scandinavia and Amazonia. Human-induced warming is also increasing fire risks in other regions, including Siberia and Australia. However, there is also evidence that humans have significant potential to control how this fire risk translates into fire activity, in particular through land management decisions and ignition sources."

At the global scale, burned area has decreased in recent decades, largely due to clearing of savannahs for agriculture and increased fire suppression. In contrast, burned area has increased in closed-canopy forests, likely in response to the dual pressures of climate change and forest degradation.

Co-author Professor Richard Betts, Head of Climate Impacts Research at the Met Office Hadley Centre and University of Exeter, said: "Fire weather does occur naturally but is becoming more severe and widespread due to climate change. Limiting global warming to well below 2°C would help avoid further increases in the risk of extreme fire weather."

Professor Iain Colin Prentice, Chair of Biosphere and Climate Impacts and Director of the Leverhulme Centre for Wildfires, Environment and Society, Imperial College London, added: "Wildfires can't be prevented, and the risks are increasing because of climate change. This makes it urgent to consider ways of reducing the risks to people. Land planning should take the increasing risk in fire weather into account."
Weather
2020
January 8, 2020
https://www.sciencedaily.com/releases/2020/01/200108131731.htm
Evolving landscape added fuel to Gobi Desert's high-speed winds
On February 28, 2007, harsh winds blew 10 train cars off a track running near China's Hami basin, killing three passengers and seriously injuring two others. Hurricane-force gusts of 75 mph or more scour this basin every 15-20 days or so, on average, and can reach maximum speeds of more than 120 mph. A study published last week asked whether the basin's unusual surface helps drive these extreme winds.
"It's an odd-looking environment because it's covered by these dark-colored gravels," explained lead author Jordan Abell. "It's really hot, and can be extremely windy. Our team wondered if the surface plays any role in these extreme conditions." Abell is a graduate student at Columbia University's Lamont-Doherty Earth Observatory and the Department of Earth and Environmental Sciences. His advisor is Lamont-Doherty geochemist Gisela Winckler, also a co-author on the paper.

The Hami basin may once have been covered in a fine, light-colored sediment, similar to California's Death Valley. Within the past 3 million years, however, strong winds carried away those fine sediments, leaving behind a sea of gray and black rocks.

Using a weather and forecasting model, Abell and his colleagues studied how this change from light to dark landscape affected wind speeds in the basin. By absorbing more sunlight, the darker stones exposed by wind erosion heated up the air within the depression. The team found that the resulting differences in temperature between the depression and the surrounding mountains increased wind speeds by up to 25 percent. In addition, the amount of time the area experiences high wind speeds increased by 30 to 40 percent.

Thus, by changing how much sunlight the ground absorbs, wind erosion appears to have exacerbated wind speeds in this region. It's the first time this positive feedback loop has been described and quantified, said Abell. But it's probably not the only example of its kind. The researchers think this interaction may have helped to shape other stony deserts in Australia, Iran, and perhaps even on Mars.

Understanding this relationship between landscape changes, albedo, and wind erosion may help to make climate simulations more accurate for both the past and future. Climate models typically do not account for changes in the reflectance of landscapes other than those caused by ice and vegetation. They also tend to assume arid landscapes remain unchanged over time. That could be problematic in some cases, said Abell.

"If you wanted to calculate the wind or atmospheric circulation in this area 100,000 years ago, you would need to consider the change in the surface geology, or else you could be incorrect by 20 or 30 percent," he said.

He added that the newly discovered relationship could also help to accurately model how other landscape changes, such as urbanization and desertification, influence atmospheric patterns by changing the reflectance of the Earth's surface.

Other authors on the paper include: Lucas Gloege from Columbia University's Lamont-Doherty Earth Observatory and the Department of Earth and Environmental Sciences; Alex Pullen and Andrew Metcalf of Clemson University; Zachary Lebo of the University of Wyoming; Paul Kapp of the University of Arizona; and Junsheng Nie of Lanzhou University.
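The feedback described above starts from a simple relation: a surface absorbs a fraction (1 - albedo) of the incoming sunlight. The albedo and insolation values in this sketch are illustrative guesses, not measurements from the study:

```python
# Absorbed shortwave radiation as a function of surface reflectance (albedo).
def absorbed_wm2(insolation_wm2, albedo):
    return insolation_wm2 * (1.0 - albedo)

insolation = 1000.0  # W/m^2, a clear-sky ballpark (assumed value)
light_surface = absorbed_wm2(insolation, albedo=0.5)    # pale, fine sediment
dark_surface = absorbed_wm2(insolation, albedo=0.25)    # dark, wind-lagged gravel

print(dark_surface - light_surface)  # 250.0 extra W/m^2 warming the basin air
```

That extra absorbed energy heats the air in the depression relative to the surrounding mountains, steepening the temperature gradient that the study links to the stronger winds.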
Weather
2020
January 6, 2020
https://www.sciencedaily.com/releases/2020/01/200106122003.htm
Antarctic waters: Warmer with more acidity and less oxygen
The increased freshwater from melting Antarctic ice sheets plus increased wind has reduced the amount of oxygen in the Southern Ocean and made it more acidic and warmer, according to new research led by University of Arizona geoscientists.
The researchers found Southern Ocean waters had changed by comparing shipboard measurements taken from 1990 to 2004 with measurements taken by a fleet of microsensor-equipped robot floats from 2012 to 2019. The observed oxygen loss and warming around the Antarctic coast is much larger than predicted by a climate model, which could have implications for predictions of ice melt.

The discovery drove the research team to improve current climate change computer models to better reflect the environmental changes around Antarctica.

"It's the first time we've been able to reproduce the new changes in the Southern Ocean with an Earth system model," said co-author Joellen Russell, a professor of geosciences.

The research is the first to incorporate the Southern Ocean's increased freshwater plus additional wind into a climate change model, she said. The team used the National Oceanic and Atmospheric Administration's ESM2M model.

Previously, global climate change models did not predict the current physical and chemical changes in the Southern Ocean, said Russell, who holds the Thomas R. Brown Distinguished Chair in Integrative Science.

"We underestimated how much influence that added freshwater and wind would have. When we add these two components to the model, we can directly and beautifully reproduce what has happened over the last 30 years," she said.

Now, models will be able to do a better job of predicting future environmental changes in and around Antarctica, she said, adding that the Southern Ocean takes up most of the heat produced by anthropogenic global warming.

"One out of every eight carbon molecules that comes out of your tailpipe goes into the Southern Ocean," Russell said. "Our model says that in the future, we may not have as big of a carbon sink as we were hoping."

First author Ben Bronselaer led the effort to improve the climate models when he was a postdoctoral research associate in Russell's lab.
He is now a meteorological and oceanographic engineer at the British multinational oil and gas company BP in London. The team's paper, "Importance of wind and meltwater for observed chemical and physical changes in the Southern Ocean," is scheduled for publication.

To develop a better understanding of the Earth's climate system, scientists constantly refine their global climate change models. As part of that effort, the Southern Ocean Carbon and Climate Observations and Modeling Project, or SOCCOM, studies the Southern Ocean and its influence on climate.

The National Science Foundation funds SOCCOM, with additional support provided by the National Oceanic and Atmospheric Administration, or NOAA, and NASA.

Russell leads the SOCCOM group that improves how the Southern Ocean is represented in computer models of global climate. She's been studying the ocean around Antarctica for 25 years.

"My first research cruise in the Southern Ocean was in 1994. It was in the winter in the deep South Pacific. I had grown up in Alaska, and I knew what a blizzard felt like -- and I had never felt winds like that before," she said.

She's been "obsessed" by the extreme Antarctic winter winds ever since, she said.

Russell and other scientists have been taking shipboard measurements in the waters around Antarctica for decades, but winter conditions make that extremely difficult. Moreover, the extent of the winter sea ice makes taking nearshore measurements from ships impossible, she said.

The robot floats SOCCOM began deploying in 2014 have solved that problem.

"The robot floats can go under the winter ice and work all winter long collecting data. The robot floats are the revolution in how we can even imagine looking at the evolution of the ice and the ocean," she said.
"We had never seen the winter-time chemistry under the ice."

The floats revealed how much Antarctic waters had changed in the last several decades -- a development global climate models had not predicted.

Bronselaer, Russell and their colleagues had previously added additional freshwater from melting ice sheets to climate models, but that revision did not reproduce the recent changes in the Southern Ocean's chemistry. Increasing the freshwater and the amount of Antarctic wind in the model solved the problem -- now the model correctly represents the current state of Antarctic waters.

The team also used the improved model to forecast conditions in the Southern Ocean. The forecast suggests that in the future, the Southern Ocean may not take up as much carbon dioxide from the atmosphere as previously predicted.

Russell plans to continue pursuing the Antarctic's winter winds.

"We didn't observe it -- but the model says we need it," she said. "I'm proposing to NASA a satellite to go hunt for the missing wind."
Weather
2020
January 2, 2020
https://www.sciencedaily.com/releases/2020/01/200102143429.htm
Climate signals detected in global weather
In October this year, weather researchers in Utah measured the lowest temperature ever recorded in the month of October in the US (excluding Alaska): -37.1°C. The previous low-temperature record for October was -35°C, and people wondered what had happened to climate change.
Until now, climate researchers have responded that climate is not the same thing as weather. Climate is what we expect in the long term, whereas weather is what we get in the short term -- and since local weather conditions are highly variable, it can be very cold in one location for a short time despite long-term global warming. In short, the variability of local weather masks long-term trends in global climate.

Now, however, a group led by ETH professor Reto Knutti has conducted a new analysis of temperature measurements and models. The scientists concluded that the weather-is-not-climate paradigm is no longer applicable in that form. According to the researchers, the climate signal -- that is, the long-term warming trend -- can actually be discerned in daily weather data, such as surface air temperature and humidity, provided that global spatial patterns are taken into account.

In plain English, this means that -- despite global warming -- there may well be a record low temperature in October in the US. If it is simultaneously warmer than average in other regions, however, this deviation is almost completely eliminated. "Uncovering the climate change signal in daily weather conditions calls for a global perspective, not a regional one," says Sebastian Sippel, a postdoc working in Knutti's research group and lead author of the study.

In order to detect the climate signal in daily weather records, Sippel and his colleagues used statistical learning techniques to combine simulations with climate models and data from measuring stations. Statistical learning techniques can extract a "fingerprint" of climate change from the combination of temperatures of various regions and the ratio of expected warming and variability.
By systematically evaluating the model simulations, they can identify the climate fingerprint in the global measurement data on any single day since spring 2012.

A comparison of the variability of local and global daily mean temperatures shows why the global perspective is important. Whereas locally measured daily mean temperatures can fluctuate widely (even after the seasonal cycle is removed), global daily mean values show a very narrow range.

If the distribution of global daily mean values from 1951 to 1980 is then compared with those from 2009 to 2018, the two distributions (bell curves) barely overlap. The climate signal is thus prominent in the global values but obscured in the local values, since the distribution of daily mean values overlaps quite considerably in the two periods.

The findings could have broad implications for climate science. "Weather at the global level carries important information about climate," says Knutti. "This information could, for example, be used for further studies that quantify changes in the probability of extreme weather events, such as regional cold spells. These studies are based on model calculations, and our approach could then provide a global context of the climate change fingerprint in observations made during regional cold spells of this kind. This gives rise to new opportunities for the communication of regional weather events against the backdrop of global warming."

The study stems from a collaboration between ETH researchers and the Swiss Data Science Center (SDSC), which ETH Zurich operates jointly with its sister university EPFL. "The current study underlines how useful data science methods are in clarifying environmental questions, and the SDSC is of great use in this," says Knutti.

Data science methods not only allow researchers to demonstrate the strength of the human "fingerprint," they also show where in the world climate change is particularly clear and recognisable at an early stage.
This is very important in the hydrological cycle, where there are very large natural fluctuations from day to day and year to year. "In future, we should therefore be able to pick out human-induced patterns and trends in other more complex measurement parameters, such as precipitation, that are hard to detect using traditional statistics," says the ETH professor.
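The statistical point above, that a small trend invisible in any one noisy local record stands out in a global mean, can be illustrated with synthetic data. All numbers here are invented; averaging N independent regions shrinks the day-to-day noise by roughly 1/sqrt(N):

```python
import random
import statistics

def daily_mean_series(days, trend_per_day, noise_sd, n_regions, seed=0):
    """Synthetic daily temperature anomalies: slow trend plus regional noise."""
    rng = random.Random(seed)
    series = []
    for day in range(days):
        signal = trend_per_day * day   # slow warming trend
        regions = [signal + rng.gauss(0.0, noise_sd) for _ in range(n_regions)]
        series.append(sum(regions) / n_regions)
    return series

# One noisy "local" station vs. a mean over 500 regions with identical noise.
local = daily_mean_series(365, trend_per_day=0.001, noise_sd=3.0, n_regions=1)
global_mean = daily_mean_series(365, trend_per_day=0.001, noise_sd=3.0, n_regions=500)

# The global mean varies far less, so the accumulating trend is detectable.
print(statistics.stdev(local) > 5 * statistics.stdev(global_mean))  # True
```

The study's actual method is more sophisticated (statistical learning on spatial patterns from model simulations), but this is the variance-reduction intuition behind why the global perspective reveals the fingerprint.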
Weather
2020
December 30, 2019
https://www.sciencedaily.com/releases/2019/12/191230104759.htm
Scientists link La Niña climate cycle to increased diarrhea
A study in Botswana by Columbia University Mailman School of Public Health scientists finds that spikes in cases of life-threatening diarrhea in young children are associated with La Niña climate conditions.
In low- and middle-income countries, diarrhea is the second leading cause of death in children younger than five years of age, with 72 percent of deaths occurring in the first two years of life. Rates of under-5 diarrhea in Africa are particularly high, with an estimated incidence of 3.3 episodes of diarrhea per child each year and one-quarter of all child deaths caused by diarrhea.

The El Niño-Southern Oscillation (ENSO) is a coupled ocean-atmosphere system spanning the equatorial Pacific Ocean that oscillates in a 3-to-7-year cycle between two extremes, El Niño (warmer ocean temperatures) and La Niña (cooler ocean temperatures). The ENSO cycle affects local weather patterns around the world, including temperatures, winds, and precipitation.

Researchers analyzed associations between ENSO and climate conditions and cases of under-5 diarrhea in the Chobe region in northeastern Botswana. They found that La Niña is associated with cooler temperatures, increased rainfall, and higher flooding during the rainy season. In turn, La Niña conditions lagged 0-7 months are associated with about a 30-percent increase in incidence of under-5 diarrhea in the early rainy season from December through February.

"These findings demonstrate the potential use of the El Niño-Southern Oscillation as a long-lead prediction tool for childhood diarrhea in southern Africa," says first author Alexandra K. Heaney, a former doctoral student in environmental health sciences at Columbia Mailman and now a postdoc at University of California, Berkeley.
"Advanced stockpiling of medical supplies, preparation of hospital beds, and organization of healthcare workers could dramatically improve the ability of health facilities to manage high diarrheal disease incidence."

Previously, El Niño events have been linked to diarrhea outbreaks in Peru, Bangladesh, China, and Japan, but until now studies of the effects of ENSO on diarrheal disease in Africa have been limited to cholera -- a pathogen responsible for only a small fraction of diarrheal cases in Africa.

Infectious diarrhea is caused by many different pathogens (viruses, bacteria, and protozoa) and meteorological conditions can have a critical influence on pathogen exposures, in particular, those associated with waterborne transmission. For example, extreme rainfall events may contaminate drinking water by flushing diarrhea-causing pathogens from pastures and dwellings into drinking water supplies, and drought conditions can concentrate animal activity increasing the movement of diarrhea-causing pathogens into surface water resources.

The researchers speculate that centralized water disinfection processes currently used in the Chobe region may be insufficient to deal with changes in water quality brought on by extremes of wet and dry weather, although they caution that further confirmatory studies are needed.

Earlier research by Columbia Mailman researchers in the Chobe region found that cases of diarrhea in young children spiked during extreme climate conditions, in both the wet and dry seasons. A second study reported on a method to forecast childhood diarrheal disease there. Because climate conditions vary from region to region, forecasts for infectious diseases must be region-specific. In other studies, the scientists have created forecasts for influenza, Ebola, and West Nile Virus.
During the influenza season in the United States, they publish weekly regional forecasts with predictions on whether cases are expected to rise or fall and by how much. Research into links between climate systems and infectious disease in Botswana also provides insights into long-term changes in weather patterns coming as a result of climate change. "In Southern Africa, precipitation is projected to decrease," says Jeffrey Shaman, PhD, co-author and professor of environmental health sciences at the Columbia Mailman School. "This change, in a hydrologically dynamic region where both wildlife and humans exploit the same surface water resources, may amplify the public health threat of waterborne illness. For this reason, there is an urgent need to develop the water sector in ways that can withstand the extremes of climate change."
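The lagged ENSO-diarrhea association described above can be illustrated with a small sketch: correlate a monthly climate index with monthly case counts at lags of 0 to 7 months and report the strongest lag. Everything below is synthetic and illustrative; the function names and data are assumptions, not the study's code or data.

```python
def pearson(x, y):
    # Plain Pearson correlation, written out to keep the sketch dependency-free.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx ** 0.5 * vy ** 0.5)

def best_lag(index, cases, max_lag=7):
    # For each lag, pair month t of the climate index with month t+lag of the
    # case series, and return the lag with the strongest correlation.
    scores = {}
    for lag in range(max_lag + 1):
        x = index[: len(index) - lag] if lag else list(index)
        y = cases[lag:]
        scores[lag] = pearson(x, y)
    lag = max(scores, key=lambda k: abs(scores[k]))
    return lag, scores[lag]

# Synthetic example: case counts echo the climate index three months later.
idx = [0.1, 0.5, 0.9, 0.4, -0.2, -0.6, -0.1, 0.3, 0.8, 0.2, -0.4, -0.7]
cases = [10.0, 10.0, 10.0] + [10 + 5 * v for v in idx[:-3]]
lag, r = best_lag(idx, cases)
print(lag, round(r, 3))
```

A real analysis would of course use many years of data and a proper regression model with seasonal controls; the sketch only shows the lag-scanning idea.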
Weather
2019
December 18, 2019
https://www.sciencedaily.com/releases/2019/12/191218090204.htm
Preparing for extreme sea levels depends on location, time
Sometimes to understand the present, it takes looking to the past. That's the approach University of Central Florida coastal researchers are taking to pinpoint the causes of extreme sea level changes.
Using historical data from tide gauges that line U.S. coasts, the researchers created an extreme sea level indicator that identifies how much of a role different major weather and ocean forces have played in affecting extreme sea levels in coastal areas around the country. They published their latest findings today. "What this indicator does, which other indicators do not show, is how weather and climate forces interact with predictable tides to make up high sea levels that can be potentially dangerous," says Thomas Wahl, an assistant professor in UCF's Department of Civil, Environmental and Construction Engineering, a member of UCF's National Center for Integrated Coastal Research and study co-author. To conduct their study, the researchers examined key contributing factors behind extreme sea level change -- mean annual sea level, low-frequency tides, and storm surges. Extreme sea level change, in the context of this study, is when the likelihood for water level thresholds to be exceeded is higher or lower than under normal conditions. This can have a devastating effect on coastlines, where 40 percent of the U.S. population lives, or more than 126 million people. Low-frequency tides are higher-than-average tides that roll in only every so often, such as every 18.6 or every 4.4 years. Average sea level changes each year and can be affected by forces like the El Niño-Southern Oscillation. Storm surges occur from hurricanes and nor'easters. For Florida's Gulf Coast, such as at Cedar Key, mean sea level and storm surges have been the most responsible for creating periods where extreme sea level events tended to be higher or more frequent, the indicator shows. 
For the South Atlantic coast, including Jacksonville, Florida, mean sea level and low-frequency tides have had a greater effect than variations in storm surges. This is in stark contrast to the North Atlantic coast, such as Portland, Maine, where low-frequency tides have had a greater influence on extreme sea levels than variations in mean sea level or storm surges, especially in the summer. Knowing these differences can aid policy makers in devising coastal resiliency strategies, says Mamunur Rashid, a postdoctoral research associate in UCF's Department of Civil, Environmental and Construction Engineering, a member of UCF's National Center for Integrated Coastal Research and the study's lead author. "Hopefully, at some point such information can help guide the development of sustainable coastal adaptation plans," Rashid says.
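The component breakdown described above (mean sea level, low-frequency tides, storm surge) can be sketched as a simple attribution rule: when the summed water level exceeds a threshold, record which component contributed most. The numbers and component names below are synthetic stand-ins, not the indicator's actual method.

```python
def attribute_extremes(msl, tide, surge, threshold):
    # For each time step, sum the three components; whenever the total exceeds
    # the threshold, record the component with the largest contribution.
    dominant = []
    for m, t, s in zip(msl, tide, surge):
        if m + t + s > threshold:
            parts = {"mean_sea_level": m, "low_freq_tide": t, "storm_surge": s}
            dominant.append(max(parts, key=parts.get))
    return dominant

# Synthetic monthly anomalies (meters above a local datum):
msl = [0.10, 0.12, 0.15, 0.11]
tide = [0.05, 0.30, 0.02, 0.06]
surge = [0.02, 0.01, 0.40, 0.05]
print(attribute_extremes(msl, tide, surge, threshold=0.4))
```

In the toy data, two months exceed the threshold: one driven mostly by the low-frequency tide, one by storm surge, mirroring the regional contrasts the article describes.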
Weather
2019
December 17, 2019
https://www.sciencedaily.com/releases/2019/12/191217141543.htm
In ancient Scottish tree rings, a cautionary tale on climate, politics and survival
Using old tree rings and archival documents, historians and climate scientists have detailed an extreme cold period in Scotland in the 1690s that caused immense suffering. It decimated agriculture, killed as much as 15 percent of the population and sparked a fatal attempt to establish a Scottish colony in southern Panama. The researchers say the episode -- shown in their study to have been during the coldest decade of the past 750 years -- was probably caused by faraway volcanic eruptions. But it was not just bad weather that brought disaster. Among other things, Scotland was politically isolated from England, its bigger, more prosperous neighbor that might have otherwise helped. Propelled in part by the catastrophe, the two nations merged in 1707 to become part of what is now the United Kingdom. Such a famine-related tragedy was never repeated, despite later climate swings.
With Brexit now threatening to isolate the UK from the European Union, the researchers think politicians should take this as a cautionary tale. "By joining England, Scotland became more resilient," said lead author Rosanne D'Arrigo, a tree-ring scientist at Columbia University's Lamont-Doherty Earth Observatory. "The bigger message for today is arguably that as the climate changes, nations will be stronger if they stick together and not try to go it alone." The study appears in the journal's early online edition. The "Scottish Ills" have long been noted in history books. In some years, snow from the winter persisted on the ground well into summer, and frosts struck every summer night for weeks. The planting season was cut short, and crops were struck down before they could be harvested. Livestock had nothing to eat. The study quotes Mary Caithness, Countess of Breadalbane, describing "cold misty weather such as the oldest person alive hath not seen." Other regions including France, England and the Netherlands also suffered unusually cold weather, but generally with less drastic results. In Scandinavia, however, tens of thousands died. It was "likely the worst era of crop failure, food shortage and mortality ever documented in Scottish history," the researchers write. Based on the width and density of tree rings the researchers collected, they showed that 1695-1704 was Scotland's coldest decade in 750 years. This came on top of the fact that much of the northern hemisphere was already in the grip of the so-called Little Ice Age, when cold temperatures were the norm for centuries, until the 1800s. "Before this, we knew it was cold. Now we have an understanding of exactly how cold," said coauthor Rob Wilson of Scotland's University of St. Andrews, and an adjunct researcher at Lamont-Doherty. 
"The whole 17th century must have been a horrible time to live in Scotland, but this was the worst part." The researchers say that the Ills coincided closely with multiple large volcanic eruptions. Previous researchers have identified particles in ice cores that traveled long distances from eruptions that probably took place somewhere in the tropics in 1693 and 1695. And Iceland's Mount Hekla darkened the skies for seven months in 1693. Scientists already know that large-scale volcanism throws sulfate particles into the atmosphere; these deflect sunlight and can lower temperatures far from the eruption itself for years. Thus, the researchers believe the eruptions would explain the chilly weather that hit Scotland and other northern hemisphere nations all at the same time. (Unsurprisingly, the tree rings also show that the warmest century of the record was 1911-2010, almost certainly due to human greenhouse-gas emissions.) The findings are an outgrowth of the Scottish Pine Project, in which Wilson and his colleagues have been gathering tree-ring samples for the past 10 years in northern Scotland. In the desolate Highlands region of Cairngorms, they have drilled out cores from living trees going back to the 1400s. To extend the record back further, they have snorkeled along the nearshore bottoms of icy lochs, searching for long-dead trees that have fallen in and been preserved over the centuries in cold, oxygen-poor mud. Once they find specimens, they winch them out by hand and take out cross sections with chain saws. The team has also studied buildings whose timbers went back to the 1100s, though these were not included in the climate reconstruction. Their initial chronology was published in 2017. In the course of their work, the team has found trees in the lochs as old as 8,000 years. 
They are still collecting samples and working to construct a continuous climate record predating the Middle Ages. In the new study, the researchers say that climate was not the only factor in the Scottish Ills. "The connection seems simple -- volcanic cooling triggered famine -- but the drivers toward famine are far more complex," they write. They cite Scotland's economic circumstances and political isolation from England as major factors. England had more good farmland and, at the time, better agricultural technology and organization for delivering relief to the poor. While also hit with cool weather, England did not suffer a famine, and probably would have come to the aid of Scotland had the nations been united. Scotland also unwisely encouraged the export of crops at a time when they were needed at home. At the height of the Ills, the Scots developed an intricate venture to send colonists to the Darien region of Panama. Driven in part by the desperation of the famine, the idea caught on as a national mania, and people of all social and economic classes invested much of their assets -- in all, as much as half the nation's entire liquid capital. Starting in 1698, a total of 2,500 colonists began sailing to this malarial jungle coast. They were quickly cut down by disease, malnutrition (Scotland could ill afford to resupply the colony) and conflicts with Spanish forces, which already controlled much of South and Central America. The colony was abandoned after just 16 months; only a few hundred colonists survived; and Scotland was financially ruined. The inhospitable Darien region remains barely inhabited even today. "At the time, the Scots saw the colony as a kind of Exodus, where they would start over somewhere new," said D'Arrigo. "In the end, they couldn't escape." Repeated proposals to unite England and Scotland had come up during the 1600s, but the Scots had resisted. 
As the famine came to a close, they finally gave in; apparently, many of the gentry making the decision figured that hitching themselves to a greater power would buffer them from further misfortunes. The Acts of Union, passed by the parliaments of Scotland and England, took effect in 1707. Scotland suffered other climate extremes in succeeding centuries, but never again collapsed in this way. In 2014, more than 300 years after the union, the Scots took a referendum on whether to once again become an independent state; 55 percent voted to stay with the UK. Then came the 2016 UK-wide referendum that set Brexit in motion -- deeply unpopular in Scotland, where 62 percent voted to remain in the EU. In last week's UK parliamentary elections, pro-Brexit forces won overall, but lost resoundingly in Scotland. Many Scots now seem to be reconsidering independence -- not because they want to stand alone again, but because independence might allow them to rejoin the larger community of the EU, and leave the isolationist English to fend for themselves. Calls for another independence referendum are already circulating. "Scotland became more resilient when it became part of a union," said Wilson. "It's a cautionary tale from history." The study's other authors are historians Patrick Klinger of the University of Kansas and Timothy Newfield of Georgetown University, and climate scientist Milos Rydval of the Czech University of Life Sciences.
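The "coldest decade in 750 years" finding reduces, computationally, to a rolling 10-year mean over a reconstructed temperature series. A toy version of that operation, on synthetic anomaly data with a pronounced 1695-1704 cold dip (the series below is invented, not the study's reconstruction):

```python
def coldest_decade(years, temps, window=10):
    # Slide a `window`-year mean over the series and return the start year and
    # mean of the coldest stretch (first one wins on ties).
    best = None
    for i in range(len(temps) - window + 1):
        m = sum(temps[i : i + window]) / window
        if best is None or m < best[1]:
            best = (years[i], m)
    return best

years = list(range(1690, 1710))
# Synthetic temperature anomalies, deg C relative to a long-term mean:
temps = [0.0] * 5 + [-1.2] * 10 + [0.1] * 5
print(coldest_decade(years, temps))
```

The real analysis additionally has to reconstruct the temperature series itself from ring width and density, which is where the tree-ring fieldwork comes in.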
Weather
2019
December 16, 2019
https://www.sciencedaily.com/releases/2019/12/191216173647.htm
Birds' seasonal migrations shift earlier as climate changes
In what the authors believe is one of the first studies to examine climate change impact on the timing of bird migration on a continental scale, researchers report that spring migrants were likely to pass certain stops earlier now than they would have 20 years ago. Also, temperature and migration timing were closely aligned, with the greatest changes in migration timing occurring in the regions warming most rapidly. Timing shifts were less apparent in fall, they add.
Horton describes the breadth of the research, which observed nighttime migratory behaviors of hundreds of species representing billions of birds, as "critically important" to understanding shifting migration patterns. "To see changes in timing at continental scales is truly impressive, especially considering the diversity of behaviors and strategies used by the many species the radars capture," he says, adding that the observed shifts do not necessarily mean that migrants are keeping pace with climate change. Farnsworth says the team's research answered, for the first time, key questions on birds and climate change. "Bird migration evolved largely as a response to changing climate," he points out. "It's a global phenomenon involving billions of birds annually. And it's not a surprise that birds' movements track changing climates. But how assemblages of bird populations respond in an era of such rapid and extreme changes in climate has been a black box. Capturing scales and magnitudes of migration in space and time has been impossible until recently." Horton says that this access to the data and cloud computing greatly enhanced the team's ability to synthesize the findings. "To process all of these data, without cloud computing, it would have taken over a year of continuous computing," he notes. Instead, the team crunched the numbers in about 48 hours. As Sheldon at UMass Amherst points out, these bird flights have been recorded for decades by the National Weather Service's network of constantly scanning weather radars, but until recently these data have been mostly out of reach for bird researchers, partly because the sheer magnitude of information and lack of tools to analyze it made only limited studies possible. For this study, Amazon Web Services provided access to the data. 
Also, a new tool, "MistNet," developed by Sheldon and colleagues at UMass Amherst with others at the Cornell Lab, uses machine learning to extract bird data from the radar record and to take advantage of the decades-long radar data archives. The name refers to the fine, almost invisible "mist nets" that ornithologists use to capture migratory songbirds. As Sheldon explains, MistNet automates the processing of a massive data set that has measured bird migration over the continental U.S. for over two decades, with excellent results when compared to humans working by hand. It uses computer vision techniques to differentiate birds from rain on the images, a major hurdle that had challenged biologists for decades. "Historically, a person had to look at each radar image to determine whether it contained rain or birds," he notes. "We developed 'MistNet,' an artificial intelligence system to detect patterns in radar images and remove rain automatically." Sheldon's group made earlier maps of where and when migration occurred over the past 24 years and animated these to illustrate, for example, the most intensive migration areas in the continental United States in a corridor roughly along and just west of the Mississippi River. MistNet also allows researchers to estimate flying velocity and traffic rates of migrating birds. Horton at CSU says that the lack of change in fall migration patterns was a little surprising, though migration also tends to be a "little bit messier" during those months. "In the spring, we see bursts of migrants, moving at a fairly rapid pace, ultimately to reach the breeding grounds," he explained. "However, during the fall, there's not as much pressure to reach the wintering grounds, and migration tends to move at a slower, more punctuated pace." A combination of factors makes fall migration more challenging to study, he adds. In the fall, birds are not competing for mates and the pace to reach their destination is more relaxed. 
There's also a wider age range of birds migrating, as the young eventually realize they need to migrate, too. Horton said the findings have implications for understanding future patterns of bird migration, because the birds rely on food and other resources as they travel. Under climate change, the timing of blooming vegetation or emergence of insects may be out of sync with the passage of migratory birds. They say even subtle shifts could have negative consequences for the health of migratory birds. In the future, the researchers plan to expand their data analysis to include Alaska, where climate change is having more serious impacts than in the lower 48 states in the U.S.
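MistNet itself is a convolutional neural network trained on labeled radar scans; as a deliberately crude stand-in, the sketch below screens radar pixels with two hand-picked features. The feature choices and thresholds are illustrative assumptions only, not MistNet's architecture or values; they exist to show the rain-versus-biology screening step in miniature.

```python
def classify_pixel(reflectivity_dbz, vel_stddev):
    # Toy rule: precipitation can reach high reflectivity and tends to have
    # smooth radial velocities; biological scatterers are weaker and noisier.
    if reflectivity_dbz > 35 or vel_stddev < 0.5:
        return "rain"
    return "biology"

def screen_scan(pixels):
    # Keep only pixels classified as biology (the migration signal).
    return [p for p in pixels if classify_pixel(*p) == "biology"]

# Each pixel: (reflectivity in dBZ, local std. dev. of radial velocity in m/s)
scan = [(40.0, 0.3), (12.0, 2.1), (8.0, 1.7), (38.0, 0.2)]
print(screen_scan(scan))
```

The reason a learned model is needed in practice is exactly that such fixed thresholds fail on real scans (light rain, insects, mixed pixels); the CNN learns the spatial patterns a rule like this cannot express.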
Weather
2019
December 13, 2019
https://www.sciencedaily.com/releases/2019/12/191213115410.htm
Finding a killer electron hot spot in Earth's Van Allen radiation belts
A collaboration between researchers in Japan, the USA, and Russia has found a hot spot in Earth's radiation belt where killer electrons, which can cause serious anomalies in satellites, form. The finding was published in a scientific journal.
Professor Yoshizumi Miyoshi of the Institute for Space-Earth Environmental Research at Nagoya University and colleagues compared data from two satellites situated on opposite sides of the Earth: the Arase satellite, developed by the Japanese Aerospace Exploration Agency (JAXA), and NASA's Van Allen Probes. Both satellites gather data from the Van Allen radiation belts, zones of energetic particles originating largely from solar wind. Energetic particles in the belts are trapped by Earth's magnetic field. Scientists have known that electrons in Van Allen radiation belts that interact with ultralow frequency plasma waves accelerate to nearly the speed of light. However, it has not been clear when or where these killer electrons start to accelerate. To gain more insight about the electrons, Professor Miyoshi and his colleagues analyzed data generated on March 30, 2017, by the Arase satellite and Van Allen Probe. On one side of the Earth, the Van Allen Probe identified characteristic signs of an interaction between ultralow frequency waves and energetic electrons. On the opposite side, at the same point in time, the Arase satellite identified high-energy electron signatures, but no ultralow frequency waves. The measurements indicate that the interaction region between electrons and waves is limited, but that the killer electrons then continue to travel on an eastward path around the Earth's magnetosphere. "An important topic in space weather science is understanding the dynamics of killer electrons in the Van Allen radiation belt," says Miyoshi. "The results of this study will improve the modelling and lead to more accurate forecasting of killer electrons in Van Allen radiation belts."
Weather
2019
December 12, 2019
https://www.sciencedaily.com/releases/2019/12/191212153300.htm
Climate cycles and insect pests drive migration timing of reindeer's North American cousin
Caribou, the North American cousin of reindeer, migrate farther than any terrestrial animal. They can cover thousands of miles as they move between winter feeding grounds and summer calving grounds. But many caribou herds are in decline as the warming climate changes much of the landscape they depend on. Inedible shrubs are rapidly encroaching on the tundra, and more frequent forest fires and disease are destroying the trees that provide caribou with lichen for food. The role of climate on their migration patterns has never been well understood, but knowing what drives caribou movements is crucial to predicting the future for the iconic species that plays a key role in the ecological and economic stability of the Arctic region.
A new study led by a University of Maryland biologist discovered two unexpected drivers for migration timing that dispute long-held assumptions and provide insight into potential future effects of climate change on caribou. First, the study found that caribou herds all across North America are triggered to start spring migration at roughly the same time by large-scale, ocean-driven climate cycles. Second, despite a synchronized start, arrival at their respective calving grounds depends on the previous summer's weather conditions. Warm, windless summers that favored insect pests lead to poorer maternal health and delayed arrivals at the calving grounds the following spring. The study, which accounted for approximately 80% of all North American migratory caribou, is the largest caribou migration study to date. It was published on December 12, 2019. "This was completely unexpected," said Eliezer Gurarie, an associate research scientist in UMD's Department of Biology and lead author of the study. "There was no reason to think that herds that calve near the Hudson Bay in the East would begin migration at the same time as the herds along coastal Western Alaska, or that summer conditions would play an important role in the following spring migration. Prior to this, it had been assumed that migration timing depends on some combination of snowmelt and availability of useful vegetation at the endpoint of the migration. Neither of those held up." The findings could point to difficult times for caribou in the future if Arctic summers continue to warm as predicted. "The summers are definitely getting warmer and more insect friendly," Gurarie said. "That is going to have consequences for these iconic animals down the road." To understand what drives caribou migration, Gurarie analyzed tracking data from 1,048 individual caribou from 1995 to 2017. The data represent seven major herds that account for 80% of all North American caribou. 
The data was collected by the study's coauthors from the National Park Service, Alaska Department of Fish and Game, Yukon Government and Government of Northwest Territories. Gurarie correlated the tracking data to global climate indices, local weather conditions, snowmelt timing and vegetation. The results revealed a clear pattern of synchronized departure. Later migration starts were associated with positive North Atlantic Oscillations in spring and summer, which tend to create cooler, dryer and less windy conditions. Earlier migration starts corresponded with two cycles: a positive Pacific Decadal Oscillation in the spring and previous summer, which tends to bring more snow and wind to the North; and a positive Arctic Oscillation, which strengthens the polar circulation and forces cold air and storms to remain farther north. The relationship between these oscillations and weather on the ground is complex, but the study draws links that explain how regionwide seasonal conditions might influence migration. For example, the scientists found that across the Arctic, warm temperatures after a snowy winter delayed the start of migration, and Gurarie suggested this is likely because snowmelt softened the ground and made travel difficult. "The whole departure story is linked to conditions that are most suitable for movement," Gurarie said. "It's less about whether or not there is snow on the ground and more about the conditions of the snow: Is it slushy and soupy and hard to travel through, or is it hardpacked and easier to walk on?" Regardless of travel conditions at the start of migration, the study revealed that caribou arrive at their calving grounds when they are ready to give birth. (If they start migration early in the season, they tend not to rush. If they start late, they move nearly continuously to get to the calving grounds.) 
But caribou have variable gestation periods, so not all herds were ready to give birth at the same time, which meant some herds arrived at their respective calving grounds later than others. When the researchers compared arrival times with climate and weather data, they consistently found that caribou herds arrived at the calving grounds earlier when they experienced cool, windy conditions the previous summer. "This connection to the previous summer's weather, particularly temperature and wind, kept coming up," Gurarie said. "When we looked closer at the biology and ecology of these animals, it became clear that this was tied to insect harassment and the physical condition of the caribou." Insects are relentless pests during summer in the Arctic. Caribou spend a great deal of energy searching for relief from the mosquitos, botflies and warble flies that plague them. To avoid the insects, caribou move en masse to higher ground or closer to the coast, or they wander in search of snow patches where there are fewer insects. All of that avoidance means much less eating. During cooler, windier summers, caribou remain healthier because they spend less time avoiding insects and more time eating. Healthier caribou cows have more nutrient stores and shorter pregnancies, whereas caribou weakened by a tough summer of insect harassment have longer pregnancies and arrive at the calving grounds later. Unfortunately, herds that arrive late continue to be at a disadvantage, because their calves have less time on the summer feeding range to fatten up before migrating back to the winter range in the fall. Previous studies of gestation and arrival at the calving grounds have focused on the availability of vegetation for forage, but Gurarie and his colleagues found that vegetation at the calving grounds had no impact on migration timing for the herds in their study. "Ultimately, we analyzed satellite-tracked movement data and remotely sensed climate and weather data," Gurarie said. 
"Using these large-scale observational data, we pieced together a story about individual physical condition, gestation times, reproduction and calving, which is really a story about behavior and physiology on a very large scale. In making these links, we ended up leaning on decades-old studies of caribou -- papers from the '50s, '60s and '70s. We looked, for example, at relationships between diet and calving timing, at insects and feeding behavior, and at snow quality and movement." Gurarie's excitement over the power of the data to help reveal the story of caribou migration is tempered by a looming question about what could happen to caribou as the climate continues to warm. The team's next steps are to analyze data on caribou birth and death rates to gain a clearer picture of how migration timing and environmental conditions affect caribou populations. "Researchers have been studying caribou movement and ecology for decades, but the data are fragmentary," said William Fagan, professor and chair of the UMD Department of Biology and a co-author of the study. "This study shows that by synthesizing these data -- from climate data, and remote tracking data to historical studies and biological data collected in the field by the various agencies that manage caribou across the continent -- there are great opportunities to gain a more integrated portrait of this economically, culturally and ecologically important species."
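The kind of relationship tested above (departure timing against a climate oscillation) can be sketched as an ordinary least-squares fit of departure day-of-year on an index value. The data below are synthetic and the direction of the slope merely mimics the reported NAO association; none of it comes from the study.

```python
def ols(x, y):
    # Least-squares line y = a*x + b, written out to stay dependency-free.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return a, my - a * mx

# Synthetic: more positive NAO-like index -> later departure day-of-year.
index = [-1.0, -0.5, 0.0, 0.5, 1.0]
departure_doy = [100, 103, 105, 107, 110]
slope, intercept = ols(index, departure_doy)
print(round(slope, 2), round(intercept, 2))
```

A positive slope here would read as "herds depart later in positive-oscillation years"; the actual study works with hundreds of animals, multiple herds and several indices at once, so a single regression like this is only the core idea.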
Weather
2019
December 11, 2019
https://www.sciencedaily.com/releases/2019/12/191211145647.htm
Mountain goats' air conditioning is failing, study says
A new study reports that mountain goats in Glacier National Park rely on dwindling snow patches to shed heat in summer, a natural air conditioning that is failing as the park's snow and ice disappear.
Researchers from the University of Montana, Glacier National Park, and Wildlife Conservation Society (WCS) found that mountain goats (Oreamnos americanus) in Glacier National Park seek out patches of snow in the summertime to reduce heat stress. When they did, breathing rates went down, a behavioral strategy that results in less energy expended. The trouble is Glacier has already lost some 75% of its glaciers and many snow patches are rapidly dwindling. The park had over 100 glaciers when it was established in 1910. In 2015, only a couple dozen met the size criteria to be considered active glaciers. The study's authors, Wesley Sarmento of the University of Montana, Mark Biel of Glacier National Park, and WCS Senior Scientist Joel Berger, have studied mountain goats in the field since 2013 to better understand changing thermal environments and their effects on this cold-adapted species in Glacier. To understand stressors to goats and ways in which they combat heat, the scientists performed observations of animals on and off ice patches on hot summer afternoons, and days with and without wind. To avoid higher temperatures, the goats sought snow patches for resting, and when they found them, breathing rates were reduced by as much as 15 percent. The authors note that while people seek shade or air conditioning to stabilize their metabolic rates and animals like coyotes or marmots seek dens, mountain goats in the shade-less environs above treeline have less opportunity to reduce exposure to rising temperatures. Goats that were observed resting in shade did not have significant reductions in respiration. "10,000 years ago when the North American climate was cooler there were mountain goats in Grand Canyon, but certainly increasing temperatures and drier weather ultimately contributed to their extinction in that area," says Sarmento. Says Biel: "This work is important to shed light on the impacts of a changing climate on these iconic animals and their habitat. 
How certain species may adapt as the changes continue is critical in understanding their persistence on the landscape into the future." For people from Europe to the Americas and far beyond, high temperatures cause stress and death. In 2019, more than 1,000 people died from heat exposure in France and Spain. Berger, also the Barbara Cox Chair of Wildlife Conservation at Colorado State University, draws analogies beyond alpine animals. "Just as people are feeling the heat of a warming planet with thousands and thousands struggling during summer without natural cooling systems, we're seeing very clearly that what happens to people is also happening to animals; we're all in this together."
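The 15 percent figure above boils down to a comparison of mean respiration rates on versus off snow. A minimal sketch with invented breath counts (the numbers are illustrative, not the study's field data):

```python
def pct_reduction(off_snow, on_snow):
    # Percent drop in mean respiration rate when goats rest on snow patches.
    m_off = sum(off_snow) / len(off_snow)
    m_on = sum(on_snow) / len(on_snow)
    return 100 * (m_off - m_on) / m_off

# Hypothetical observations, breaths per minute:
off = [60, 64, 62, 58, 61]
on = [52, 54, 50, 53, 51]
print(round(pct_reduction(off, on), 1))
```

With these made-up counts the reduction comes out near 15 percent, the scale of the effect the study reports; the real analysis would also test whether such a difference is statistically significant.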
Weather
2019
December 9, 2019
https://www.sciencedaily.com/releases/2019/12/191209132003.htm
Community characteristics shape climate change discussions after extreme weather
Political affiliations, the presence of local environmental organizations and prior local media coverage of climate change play a role in how a community reacts to an extreme weather event, a newly published article finds.
"Extreme weather events such as a catastrophic wildfire, a 500-year flood or a record-breaking heatwave may result in some local discussion and action around climate change, but not as much as might be expected and not in every community," said Hilary Boudet, the paper's lead author and an associate professor of public policy in Oregon State University's School of Public Policy in the College of Liberal Arts. "In terms of making links to climate change, local reactions to an extreme weather event depend both on aspects of the event itself, but also on political leanings and resources within the community before the event took place." Boudet's work is part of a growing body of research exploring the links between personal experience with an extreme weather event and social mobilization around climate change. The study was part of a project examining community reactions to extreme weather in the U.S. The researchers sought to better understand how extreme weather events might influence local attitudes and actions related to climate change. Researchers conducted 164 interviews with local residents and community leaders in 15 communities across the United States that had experienced extreme weather events that caused at least four fatalities between 2012 and 2015. They also analyzed media coverage to better understand what kinds of public discussions, actions and policies around climate change occurred in those communities. Of the 15 communities, nine showed evidence of public discussion about the event's connection to climate change in the wake of the disaster. "Although many of the extreme events we studied spurred significant emergency response from volunteers and donations for rescue and recovery efforts, we found these events sparked little mobilization around climate change," Boudet said. 
"Yet there was also a distinct difference between cases where community climate change discussion occurred and where it did not, allowing us to trace pathways to that discussion."

When there was some scientific certainty that the weather event was related to climate change, discussion about the connection was more likely to occur, particularly in communities that leaned Democratic or where residents were highly educated, Boudet said.

However, even in communities where climate change was discussed in relation to the weather event, it was often a marginal issue. Other more immediate concerns, such as emergency response management and economic recovery, generated far more discussion and subsequent action. Some of those interviewed suggested that broaching the topic of climate change amid disaster recovery efforts could be interpreted as using a tragedy to advance a political agenda, Boudet said.

"Recent shifts in U.S. opinions on climate change suggest that it may become a more acceptable topic of conversation following an extreme weather event," Boudet said. "Yet our results indicate that it may take time for such discussions to take place, particularly in Republican-leaning communities."

While the work challenges the notion that a single extreme weather event will yield rapid local social mobilization around climate change, the researchers found that communities may still make important changes post-event to ensure more effective responses to future events. Boudet and her team are currently examining local policy action post-event to understand how such action relates to local climate change discussion.
Weather
2019
December 4, 2019
https://www.sciencedaily.com/releases/2019/12/191204152842.htm
Atmospheric river storms create $1 billion-a-year flood damage
Atmospheric rivers pose a $1 billion-a-year flood risk in the West, according to a study released today.
Researchers at Scripps Institution of Oceanography at the University of California San Diego and the U.S. Army Corps of Engineers analyzed the economic impact of the winter storms that deliver an increasingly large share of rain and snow to California and the West. The study appears Dec. 4.

The team led by Scripps postdoctoral researcher Tom Corringham found that flooding has caused nearly $51 billion in damages to western states in the last 40 years. More than 84 percent of these damages were caused by atmospheric rivers (ARs), which are long narrow corridors of water vapor in the atmosphere capable of carrying more than twice the volume of the Amazon River through the sky.

In some coastal areas of Oregon and Northern California, ARs were responsible for over 99 percent of all flood damages, the study said. The researchers also noted that much of the economic damage was caused by only a handful of AR storms. An estimated $23 billion in damage, nearly half of all flood damage in the West over 40 years, was caused by just ten ARs. Overall, the total flood damage from ARs averaged $1.1 billion annually throughout the West.

Last year, Scripps researcher F. Martin Ralph and colleagues created a scale for the intensity and impacts of ARs to help distinguish between weak and strong ARs. The scale runs from 1 to 5 and is analogous to those used for rating hurricanes and tornadoes. It includes ARs that are mostly beneficial (AR 1 and AR 2) and those that are mostly hazardous (AR 4 and AR 5).
Corringham's analysis confirmed that flood damages from AR 1 and AR 2 events are generally low, and that each category increase in AR intensity corresponds to a tenfold increase in flood damage, with AR 4 and AR 5 events causing damages in the tens and hundreds of millions of dollars, respectively.

"A small number of extreme ARs cause most of the flood damages in the West," said Corringham, "and even modest increases in intensity could significantly increase their impacts."

The study, funded by NOAA through the California Nevada Climate Applications Program, the federal Bureau of Reclamation, and the University of California Office of the President, is the latest of several led by researchers affiliated with Scripps' Center for Western Weather and Water Extremes (CW3E). In recent research, ARs have been shown to be major drivers of weather and hydrology in the western United States, often making the difference between drought and flood years over the span of just a few intense storms. Scientists expect that ARs will become even more significant as global warming trends increase their intensity.

Because of this, scientists and emergency officials have called for more research to improve the ability to forecast the paths of ARs as they reach land and their potential to deliver rain and snow. Those improved forecasts could guide management of dams and reservoirs and give emergency officials additional time to prepare for floods.
CW3E researchers are currently working on improving reservoir management at two locations through the Forecast Informed Reservoir Operations (FIRO) program and are considering expanding to other locations.

"Improved AR prediction will provide significant opportunity to improve management of multi-purpose water resource projects without costly structural modifications through Forecast-Informed Reservoir Operations," said study co-author Cary Talbot, chief of the Flood & Storm Protection Division at the Coastal & Hydraulics Laboratory of the US Army Engineer Research & Development Center. "Additionally, improved understanding of the potential economic damage from AR storms will greatly enhance the flood risk management mission of the Corps of Engineers and their state and local partner agencies."

Corringham said the study contributes to evidence suggesting that changes in flood mitigation policy and disaster response are necessary. Such changes include discouraging new development in flood-prone areas, restoring natural floodplains, and developing green infrastructure.

"This is a reminder that weather and climate matter," he said. "Every step we take now to stabilize the global climate system stands to reduce future adverse impacts on our economy."
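The tenfold-per-category relationship reported in Corringham's analysis amounts to a power law in the AR category, roughly damage = base * 10^(category - 1). A minimal sketch in Python; the base damage figure here is an arbitrary illustration, not a value from the study:

```python
def estimated_damage(category, base_damage=1e4):
    """Illustrative power-law scaling: each step up the AR 1-5
    scale multiplies expected flood damage by ten."""
    if not 1 <= category <= 5:
        raise ValueError("AR categories run from 1 to 5")
    return base_damage * 10 ** (category - 1)

# Each category's damage is ten times the previous category's.
ratios = [estimated_damage(c + 1) / estimated_damage(c) for c in range(1, 5)]
```

On this toy scale an AR 5 causes 10,000 times the damage of an AR 1, which is why a handful of extreme events can dominate four decades of losses.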
Weather
2019
December 4, 2019
https://www.sciencedaily.com/releases/2019/12/191204144914.htm
Outlook for the polar regions in a 2-degrees-warmer world
With 2019 on pace as one of the warmest years on record, a major new study from the University of California, Davis, reveals how rapidly the Arctic is warming and examines global consequences of continued polar warming.
"Many of the changes over the past decade are so dramatic they make you wonder what the next decade of warming will bring," said lead author Eric Post, a UC Davis professor of climate change ecology. "If we haven't already entered a new Arctic, we are certainly on the threshold."

The comprehensive report, published today, represents the efforts of an international team of 15 authors specializing in an array of disciplines, including the life, Earth, social, and political sciences. They documented widespread effects of warming in the Arctic and Antarctic on wildlife, traditional human livelihoods, tundra vegetation, methane release, and loss of sea- and land ice. They also examined consequences for the polar regions as the Earth inches toward 2 degrees C warming, a commonly discussed milestone.

"Under a business-as-usual scenario, the Earth as a whole may reach that milestone in about 40 years," said Post. "But the Arctic is already there during some months of the year, and it could reach 2 degrees C warming on an annual mean basis as soon as 25 years before the rest of the planet."

The study illustrates what 2 degrees C of global warming could mean for the high latitudes: up to 7 degrees C warming for the Arctic and 3 degrees C warming for the Antarctic during some months of the year. The authors say that active, near-term measures to reduce carbon emissions are crucial to slowing high-latitude warming, especially in the Arctic.

Post emphasizes that major consequences of projected warming in the absence of carbon mitigation are expected to reach beyond the polar regions. Among these are sea level rise resulting from rapid melting of land ice in the Arctic and Antarctic, as well as increased risk of extreme weather, deadly heat waves, and wildfire in parts of the Northern Hemisphere.

"What happens in the Arctic doesn't stay in the Arctic," said co-author Michael Mann, a distinguished professor of atmospheric sciences at Penn State.
"The dramatic warming and melting of Arctic ice is impacting the jet stream in a way that gives us more persistent and damaging weather extremes."
Weather
2019
November 25, 2019
https://www.sciencedaily.com/releases/2019/11/191125120939.htm
Meeting the challenges facing fisheries climate risk insurance
Insurance schemes with the potential to improve the resilience of global fisheries face a host of future challenges, researchers say.
The world's first "Fisheries Index Insurance" scheme, launched by an international consortium in July, is a sovereign-level instrument designed to protect Caribbean fishing communities from extreme weather events which may become more frequent and intense due to climate change. The team of scientists from the University of Exeter and the Centre for Fisheries, Environment and Aquaculture Science (Cefas) publish a letter on the scheme today.

The lead author, Nigel Sainsbury, from the University of Exeter, said: "Climate risk insurance can help people and businesses involved in fisheries bounce back faster from extreme weather events, but it is important that this doesn't lead to less sustainable fishing outcomes and that more marginalised groups, particularly unregistered fishers and women involved in fisheries, don't miss out on payments."

The new insurance system may enable fishers to make decisions to postpone fishing in extreme weather, rather than risk a dangerous trip. It can also help them to replace and repair fishing boats, gear, tools and infrastructure destroyed or damaged by storms much more rapidly.

Sainsbury added: "Extreme weather poses a direct threat to the lives of fishers and daily production, so the design of climate risk insurance needs to reflect this. Policymakers cannot rely solely on climate risk insurance in their climate adaptation plans. It must be complemented by adaptation actions in coastal ecosystems, such as the protection of mangroves, establishing pre-storm preparation plans and investment in less vulnerable fishing boats and gear."

The Caribbean Ocean and Aquaculture Sustainability faciliTy (COAST) unlocks insurance pay-outs to a pre-determined list of people and organisations involved in the fishing industry if an extreme weather event occurs and causes a set of environmental indicators -- wave height, rainfall, wind speed and storm surge -- to exceed pre-set thresholds. COAST has been launched in St Lucia and Grenada.
It is funded by the US State Department and relies on the specialist capabilities of the Caribbean Catastrophe Risk Insurance Facility (CCRIF SPC) and The World Bank.
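COAST is a parametric (index) policy: pay-outs are triggered by measured indicators crossing pre-set thresholds rather than by assessed losses. A minimal sketch of that trigger logic in Python; the threshold values and the choice to trigger on any single exceedance are hypothetical illustrations, not COAST's actual parameters:

```python
# Hypothetical trigger thresholds for a parametric fisheries policy.
THRESHOLDS = {
    "wave_height_m": 4.0,
    "rainfall_mm": 150.0,
    "wind_speed_kmh": 120.0,
    "storm_surge_m": 1.5,
}

def payout_triggered(observations):
    """Return True if any observed indicator exceeds its threshold.

    Missing indicators are treated as zero (no exceedance).
    """
    return any(observations.get(name, 0.0) > limit
               for name, limit in THRESHOLDS.items())

calm_day = {"wave_height_m": 1.2, "rainfall_mm": 20.0}
storm = {"wave_height_m": 5.5, "rainfall_mm": 210.0, "wind_speed_kmh": 140.0}
```

Because the trigger depends only on objective measurements, pay-outs can be released quickly after an event, without waiting for loss adjusters.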
Weather
2019
November 20, 2019
https://www.sciencedaily.com/releases/2019/11/191120175602.htm
Dead-zone report card reflects improving water quality in Chesapeake Bay
An annual model-based report on "dead-zone" conditions in the Chesapeake Bay during 2019 indicates the total volume of low-oxygen, "hypoxic" water was on the high end of the normal range for 1985 to 2018, a finding that scientists consider relatively good news.
Dr. Marjy Friedrichs, a Virginia Institute of Marine Science professor and report card co-author, says, "Even with environmental conditions that favor severe hypoxia, including record-high river input and light winds, our analysis shows that the total amount of hypoxia this year was within the normal range seen over the past 35 years." This suggests that nutrient reductions since the 1980s have been successfully improving water quality in the nation's largest estuary.

"If the record-breaking river flows we saw last year had occurred back in the 1980s," says Friedrichs, "we would more likely have seen a record-breaking dead zone in 2019. The fact that the 2019 dead zone was within the normal range is a positive sign for Bay restoration."

The summer dead zone is one of the major water quality concerns facing the Chesapeake. The dead zone forms when rivers supply excess nitrogen from fertilizers, wastewater, and other sources, fueling short-lived blooms of algae. Bacteria then eat the dead, sinking algae, consuming from bottom waters the dissolved oxygen that fish, shellfish, crabs, and other animals need to survive. Bay dead zones peak during summer, when hot weather encourages algal growth and bacterial decay, and reduces how much oxygen the water can hold. At the same time, calm winds typically preclude the mixing of oxygen-rich surface waters into the depths.

Friedrichs developed the Annual Chesapeake Bay Dead Zone Report Card with Dr. Aaron Bever of Anchor QEA, an environmental engineering and consulting firm. Bever earned his Ph.D. from William & Mary's School of Marine Science at VIMS in 2010.

The team's report card summarizes oxygen conditions in the Bay each year as estimated by their 3-D, real-time hypoxia forecast model, originally developed with funding from NOAA. The model is based on 35 years of water-quality data collected by the Chesapeake Bay Program, and is forced daily by wind data provided by NOAA and river-input data provided by the U.S.
Geological Survey. The modeling team, which includes Research Scientist Dr. Pierre St. Laurent at VIMS, also generates dissolved oxygen statistics for previous years for comparative purposes.

Because springtime inflows from the Susquehanna River were high in 2019, scientists from the University of Michigan predicted that summer 2019 would see the 4th largest July hypoxic volume in the last 20 years. That forecast held true through mid-August, with at least a 6-year peak in hypoxic volume occurring on July 25th at 13.1 cubic kilometers -- 17% of the Bay's volume.

However, hypoxia decreased as winds picked up in late August and early September, falling almost to zero following the passage of Hurricane Dorian on September 7th. Hypoxia returned with the high temperatures in late September and early October until strong winds mixed Bay waters and ended hypoxia in the mainstem of the Bay for the year on October 10th. September 2019 was the second warmest in the contiguous U.S. for the 125-year period of record.

Overall, Friedrichs and Bever estimate the total amount of hypoxia in 2019 was on the high end of the normal range for 1985 to 2018, and higher than it has been in the past five years. As in 2018, hypoxia also lasted longer than in other recent years.

The findings of the VIMS hypoxia model and Dead Zone Report generally match the monitoring-based report provided by the Maryland Department of Natural Resources for the Maryland portion of the Bay. Slight variations in results are due to different reporting periods and regions, as the Virginia report includes results for the full Bay from the onset of hypoxia in spring to its cessation in the autumn, while Maryland's DNR focuses on Maryland conditions from June through September.
Weather
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113153112.htm
When reporting climate-driven human migration, place matters
A quick Google search for "What is driving migration from Central America?" reveals that nearly all of the top hits claim climate change as a major catalyst for the mass movement of people out of their home countries. University of Arizona climate researchers, however, have shown that the reality is much more nuanced.
"We were seeing articles in big-name media saying migration from Central America is being driven by climate change and yet, we were looking at these and asking, where is the evidence?" said Kevin Anchukaitis, professor in the School of Geography and Development.

To nail down the reality, a team led by Anchukaitis analyzed 40 years of daily weather records from El Salvador, Guatemala, Nicaragua and Honduras -- especially the Central American region known as the "Dry Corridor." They focused on changes in the timing and intensity of the Central American Midsummer Drought, an annual dip in rainfall totals during the summer months. Millions of families in the region plant crops in rhythm with the annual peaks and troughs in rainfall.

"If rainfall comes at a different time, or if it's less than normal, it could result in a crop failure and food insecurity," said geography graduate student Talia Anderson, lead author on the recently published paper.

Anderson analyzed a combination of daily satellite and rain gauge estimates, allowing her to look at weather patterns continuously across space over the past four decades. The findings revealed a complex pattern across the region. In most places, the researchers found insignificant changes in rainfall patterns over the past 40 years. Some local areas, however, changed significantly: Some have gotten drier, while others got wetter. In some places the Midsummer Drought starts earlier or ends later, but elsewhere the researchers found no changes.

"If you average across the entire region you wouldn't see a trend going either way," Anderson said. "The most important conclusion is that scale matters."

"The news media don't have this local connection that this study is providing," Anderson added.
"We can now say, 'In one part of Guatemala or this part of Nicaragua, you do see a change in these important features of the Midsummer Drought, but in a lot of other regions where we know a lot of migration is originating, we don't see significant rainfall trends.'"

According to Anchukaitis' paleoclimatology research, there's simply a lot of natural variability in rainfall in Central America. "Even though climate change is very real and is projected to make the region significantly drier later in this century, with this study, we can't yet claim that any trends we see are a result of human-caused climate change," Anderson said.

Migration is complicated and there are many reasons that people migrate, Anchukaitis added. That doesn't mean that an individual climate event can't have an impact, but there are other driving forces, he said, such as limited land and resources, violence and corruption.

In addition to Anchukaitis, Anderson's co-authors include Diego Pons from Columbia University and Matthew Taylor from the University of Denver. Funding was provided by the National Science Foundation (0852652, 1263609, 1623727 and 1243125).
Weather
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113075107.htm
Stalled weather patterns will get bigger due to climate change
Climate change will increase the size of stalled high-pressure weather systems called "blocking events" that have already produced some of the 21st century's deadliest heat waves, according to a Rice University study.
Atmospheric blocking events are middle-latitude, high-pressure systems that stay in place for days or even weeks. Depending upon when and where they develop, blocking events can cause droughts or downpours and heat waves or cold spells. Blocking events caused deadly heat waves in France in 2003 and in Russia in 2010.

Using data from two sets of comprehensive climate model simulations, Rice fluid dynamicists Ebrahim Nabizadeh and Pedram Hassanzadeh, and colleagues found that the area of blocking events in the northern hemisphere will increase by as much as 17% due to anthropogenic climate change. The study is available online.

Hassanzadeh, an assistant professor of mechanical engineering and of Earth, environmental and planetary sciences, uses computational, mathematical and statistical models to study atmospheric flows related to a broad range of problems from extreme weather events to wind energy. He said researchers have increasingly been interested in learning how climate change might affect blocking events, but most studies have focused on whether blocking events will become more frequent as the atmosphere warms because of greenhouse gas emissions.

"Studies in the past have looked at whether you get more or less blocking events with climate change," he said. "The question nobody had asked is whether the size of these events will change or not. And the size is very important because the blocking events are more impactful when they are larger. For example, if the high-pressure system becomes bigger, you are going to get bigger heat waves that affect more people, and you are likely going to get stronger heat waves."

Nabizadeh, a mechanical engineering graduate student in Rice's Brown School of Engineering, set out to answer the question two years ago.
Using a hierarchical modeling approach, he began with experiments on a model of atmospheric turbulence that's far simpler than the real atmosphere. The simple model, which captures the fundamental dynamics of blocking events, allowed Nabizadeh to do a great deal of exploration. Making slight changes in one parameter or another, he ran thousands of simulations. Then the data was analyzed using a powerful dimensional analysis technique called the Buckingham-Pi theorem, which is often used in designing large and complex engineering systems that involve fluid flows.

The goal was finding a scaling law, a mathematical formula that describes the size of a blocking event using variables that climate scientists already study and understand. Nabizadeh started with scaling laws that have been developed to predict the size of day-to-day weather patterns, but he found that none of the variables were predictive for blocking events. His persistence eventually paid off with a simple formula that relates the area of blocking events to the width, latitude and strength of the jet stream, all of which are well-studied and measured.

"I gave a talk about this recently, and one of the people came up after and said, 'This is magical, that these powers add up and suddenly you get the right answer.' But it took a lot of work by Ebrahim to get this elegantly simple result," he said.

At one point, Nabizadeh had analyzed the data from many simulations and produced a comparison that included page upon page of figures, and Hassanzadeh said the scaling law discovery was encouraged by an unlikely agency: the Texas Department of Motor Vehicles (DMV).

"Ebrahim went to the DMV one weekend, and I went to the DMV the week after, and at the DMV you have to sit and you don't have anything to do," he said.
"So after staring at these numbers for hours, we realized this is the right scaling."

They also compared the simple-model results with the output of increasingly complex models of the Earth's weather and climate. Nabizadeh said the scaling law predicted changes in the size of future winter blocking events in comprehensive climate model simulations with remarkable accuracy.

"It performs better for winter events than summer events for reasons we don't yet understand," Nabizadeh said. "Our results suggest future studies should focus on better understanding summer blocks and also how larger blocking events might affect the size, magnitude and persistence of extreme-weather events like heat waves."

The research was supported by NASA (80NSSC17K0266), the National Academies' Gulf Research Program, the Department of Energy (DE-AC02-05CH11231) and the National Science Foundation (NSF) (AGS-1545675). Computing resources were provided by the NSF-supported XSEDE project (ATM170020) and Rice's Center for Research Computing in partnership with Rice's Ken Kennedy Institute for Information Technology.
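The scaling-law search described above boils down to finding exponents that make a power law fit simulation data, and exponents of a power law are slopes in log-log space. A minimal sketch in Python on synthetic data; the formula A = c * width^2 * jet_speed, the variable choices and all numbers are invented for illustration and are not the paper's actual scaling law:

```python
import math

def power_law_area(width, jet_speed, c=2.0, a=2.0, b=1.0):
    """Synthetic 'blocking area' obeying A = c * width**a * jet_speed**b."""
    return c * width ** a * jet_speed ** b

def fitted_exponent(f, x1, x2, fixed):
    """Recover one exponent by varying a single variable while holding
    the other fixed, then taking the slope in log-log space."""
    y1, y2 = f(x1, fixed), f(x2, fixed)
    return math.log(y2 / y1) / math.log(x2 / x1)

# Vary width with jet speed fixed, then jet speed with width fixed.
a_hat = fitted_exponent(lambda w, u: power_law_area(w, u), 10.0, 20.0, 5.0)
b_hat = fitted_exponent(lambda u, w: power_law_area(w, u), 3.0, 6.0, 10.0)
```

With noiseless synthetic data the recovered exponents are exact; on real simulation output the same idea becomes a least-squares fit over many runs, which is the kind of collapse the Buckingham-Pi analysis guides.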
Weather
2019
November 12, 2019
https://www.sciencedaily.com/releases/2019/11/191112164947.htm
Deep neural networks speed up weather and climate models
When you check the weather forecast in the morning, the results you see are more than likely determined by the Weather Research and Forecasting (WRF) model, a comprehensive model that simulates the evolution of many aspects of the physical world around us.
"It describes everything you see outside of your window," said Jiali Wang, an environmental scientist at the U.S. Department of Energy's (DOE) Argonne National Laboratory, "from the clouds, to the sun's radiation, to snow, to vegetation -- even the way skyscrapers disrupt the wind."

The myriad characteristics and causes of weather and climate are coupled together, communicating with one another. Scientists have yet to fully describe these complex relationships with simple, unified equations. Instead, they approximate the equations using a method called parameterization in which they model the relationships at a scale greater than that of the actual phenomena.

Although parameterizations simplify the physics in a way that allows the models to produce relatively accurate results in a reasonable time, they are still computationally expensive. Environmental scientists and computational scientists from Argonne are collaborating to use deep neural networks, a type of machine learning, to replace the parameterizations of certain physical schemes in the WRF model, significantly reducing simulation time.

"With less-expensive models, we can achieve higher-resolution simulations to predict how short-term and long-term changes in weather patterns affect the local scale," said Wang, "even down to neighborhoods or specific critical infrastructure."

In a recent study, the scientists focused on the planetary boundary layer (PBL), or lowest part of the atmosphere. The PBL is the atmospheric layer that human activity affects the most, and it extends only a few hundred meters above Earth's surface. The dynamics in this layer, such as wind velocity, temperature and humidity profiles, are critical in determining many of the physical processes in the rest of the atmosphere and on Earth.

The PBL is a crucial component in the WRF model, but it is also one of the least computationally expensive.
This makes it an excellent testbed for studying how more complicated components might be enhanced by deep learning neural networks in the same way.

"We used 20 years of computer-generated data from the WRF model to train the neural networks and two years of data to evaluate whether they could provide an accurate alternative to the physics-based parameterizations," said Prasanna Balaprakash, a computer scientist and DOE Early Career Award recipient in Argonne's Mathematics and Computer Science division and the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.

Balaprakash developed the neural network and trained it to learn an abstract relationship between the inputs and outputs by feeding it more than 10,000 data points (8 per day) from two locations, one in Kansas and one in Alaska. The result was an algorithm that the scientists are confident could replace the PBL parameterization in the WRF model.

The scientists demonstrated that a deep neural network that considers some of the underlying structure of the relationship between the input and output variables can successfully simulate wind velocities, temperature and water vapor over time. The results also show that a trained neural network from one location can predict behavior across nearby locations with correlations higher than 90 percent compared with the test data.

"Collaboration between the climate scientists and the computer scientists was crucial for the results we achieved," said Rao Kotamarthi, chief scientist and department head of atmospheric science and climate research in Argonne's Environmental Science division.
"Incorporating our domain knowledge makes the algorithm much more predictive."

The algorithms -- called domain-aware neural networks -- that consider known relationships not only can predict environmental data more accurately, but they also require significantly less training data than do algorithms that do not consider domain expertise.

Any machine learning project requires a large amount of high-quality data, and there was no shortage of data for this study. Supercomputing resources at the ALCF and the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility at Lawrence Berkeley National Laboratory, contributed to the production of more than 300 years (700 terabytes) of data describing past, present and future weather and climate in North America.

"This database is unique to climate science at Argonne," said Wang, "and we are using it to conduct further studies in deep learning and determine how it can apply to climate models."

The scientists' ultimate goal is to replace all of the expensive parameterizations in the WRF model with deep learning neural networks to enable faster and higher-resolution simulation. Currently, the team is working to emulate long-wave and short-wave solar radiation parameterization -- two portions of the WRF model that together take up almost 40% of the calculation time of the physics in the simulations.
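Surrogate models like the one described above are commonly scored by correlating their predictions against held-out test data, as in the 90-percent figure quoted. A minimal Pearson-correlation sketch in Python; the sample numbers are illustrative, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative comparison of emulator output against held-out "truth".
truth = [2.0, 3.1, 4.2, 5.0, 6.3, 7.1]
predicted = [2.1, 3.0, 4.4, 4.9, 6.2, 7.3]
```

A correlation above 0.9 on unseen locations is the kind of evidence the team used to argue the emulator generalizes beyond its training sites.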
Weather
2019
November 12, 2019
https://www.sciencedaily.com/releases/2019/11/191112110232.htm
This is what the monsoon might look like in a warmer world
In the last interglacial period on Earth about 125,000 years ago, the Indian monsoon was longer, more extreme and less reliable than it is today. This is the conclusion drawn by geoscientists from Ruhr-Universität Bochum (RUB) and the University of Oxford, together with other colleagues from the UK, New Zealand, China and the USA. The team analysed a dripstone from a cave in north-eastern India, combining various methods that provide information about supra-regional and local weather phenomena and the climate dynamics of the past.
The team headed by Matthias Magiera, Dr. Franziska Lechleitner, Professor Ola Kwiecien and Dr. Sebastian Breitenbach describe the results in a recently published paper.

"The last interglacial period is often considered an analogy to the expected climate changes," says Ola Kwiecien from the RUB Institute of Geology, Mineralogy and Geophysics. "Even though the factors that led to the warming were different then than they are today, of course." Findings about weather and climate phenomena from the last interglacial period provide researchers with clues as to how the climate might change as the earth warms up.

The team analysed a dripstone from the Mawmluh Cave in north-eastern India. For one, the researchers determined so-called delta-18-O values, which are a measure of the strength of the Indian monsoon. In the process, they compared the ratio of heavy and light oxygen in the dripstone; this depends on the one hand on the source area of the monsoon, but also on the seasonal distribution of rainfall, temperature and intensity of precipitation. These factors play an important role for the strength of the monsoon weather phenomenon.

"The delta-18-O value tells us something about the strength of the monsoon, but not how much precipitation falls and how the rain spreads over time," explains Sebastian Breitenbach from the RUB Institute of Geology, Mineralogy and Geophysics. "But that is in fact the crucial information," adds Ola Kwiecien. "For a farmer, it makes a big difference whether precipitation falls constantly and reliably over a certain period of time, or whether surprising and extreme rainfall alternates with longer dry periods."

In order to gather clues on the seasonal distribution of rainfall, the researchers defined additional measured values.
While the delta-18-O value is a supra-regional parameter that tells them something about the distant sources of monsoon rainfall, other parameters record local phenomena, including the ratio of different elements such as strontium or magnesium to calcium, or the ratio of different calcium isotopes in the dripstone. This isotope ratio, known as the delta-44-Ca value, has so far rarely been applied to cave samples.

During dry winters and longer dry periods, a phenomenon occurs in the karst rock above the cave that affects the elemental composition of the dripstone. If rain falls over the Mawmluh Cave, it seeps through the soil, dissolves calcium from the rock and transports it into the cave. The calcium is stored in a dripstone formed by the water; a dripstone that grows during a moist phase thus has a high calcium content compared to other elements.

However, during the dry period between November and May, some of the calcium can get lost on the way if there are any air-filled cavities in the rock. These cause calcium to precipitate before it reaches the cave, while elements such as strontium and magnesium remain in the water, are transported to the dripstones and integrated into them. The ratio of magnesium or strontium to calcium in the dripstone thus indicates whether there was much or little rain in the immediate vicinity of the cave.
The delta-44-Ca value also provides clues on the precipitation near the cave and, moreover, allows researchers to gain more information on the intensity of the dry phase. The combination of these different parameters enabled the researchers to reconstruct changes in precipitation during the monsoon and non-monsoon periods and, consequently, to gain insight into the distribution of precipitation before, during and after the last interglacial period. "On the whole, our data show that the Indian monsoon was less reliable in the last interglacial period than it is today, which suggests that global warming today might be having the same effect," concludes Ola Kwiecien. "This tallies with the tendency for weather extremes to become more frequent." According to the researchers, human impact on the climate in the Indian summer monsoon has not yet fully manifested itself. If the assumptions underlying the current study are correct, however, this could change in the next 20 to 30 years.
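The delta values discussed above follow standard stable-isotope notation, which can be written out in a few lines. This is a hedged illustration only: the conversion formula is the conventional one, the VSMOW reference ratio is the commonly cited value, and the sample ratio below is invented, not a measurement from this study.

```python
# Standard delta notation for stable-isotope ratios, reported in per mil:
#   delta = (R_sample / R_standard - 1) * 1000
# where R is the ratio of heavy to light isotope (e.g. 18O/16O).

def delta_permil(r_sample: float, r_standard: float) -> float:
    """Convert an isotope ratio to delta notation (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative values only (not data from the study):
# the VSMOW standard 18O/16O ratio is about 2005.2e-6.
R_VSMOW = 2005.2e-6
sample_ratio = 1990.0e-6          # hypothetical dripstone measurement
d18O = delta_permil(sample_ratio, R_VSMOW)
print(round(d18O, 2))             # negative: isotopically lighter than the standard
```

A sample identical to the standard has a delta of exactly zero; lighter samples (typical for monsoon rainfall) give negative values.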
Weather
2019
November 8, 2019
https://www.sciencedaily.com/releases/2019/11/191108074854.htm
Using AI to predict where and when lightning will strike
Lightning is one of the most unpredictable phenomena in nature. It regularly kills people and animals and sets fire to homes and forests. It keeps aircraft grounded and damages power lines, wind turbines and solar-panel installations. However, little is known about what triggers lightning, and there is no simple technology for predicting when and where lightning will strike the ground.
At EPFL's School of Engineering, researchers in the Electromagnetic Compatibility Laboratory, led by Farhad Rachidi, have developed a simple and inexpensive system that can predict when lightning will strike to the nearest 10 to 30 minutes, within a 30-kilometer radius. The system uses a combination of standard meteorological data and artificial intelligence. The research paper has been published. "Current systems are slow and very complex, and they require expensive external data acquired by radar or satellite," explains Amirhossein Mostajabi, the PhD student who came up with the technique. "Our method uses data that can be obtained from any weather station. That means we can cover remote regions that are out of radar and satellite range and where communication networks are unavailable." What's more, because the data can be acquired easily and in real time, predictions can be made very quickly -- and alerts can be issued even before a storm has formed. The EPFL researchers' method uses a machine-learning algorithm that has been trained to recognize conditions that lead to lightning. To carry out the training, the researchers used data collected over a ten-year period from 12 Swiss weather stations, located in both urban and mountainous areas. Four parameters were taken into account: atmospheric pressure, air temperature, relative humidity and wind speed. Those parameters were correlated with recordings from lightning detection and location systems. Using that method, the algorithm was able to learn the conditions under which lightning occurs. Once trained, the system made predictions that proved correct almost 80% of the time. This is the first time that a system based on simple meteorological data has been able to predict lightning strikes through real-time calculations. The method offers a simple way of predicting a complex phenomenon.
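The overall shape of the approach -- train a classifier on the four weather-station variables to predict lightning occurrence -- can be sketched as follows. This is a hedged, minimal illustration: the synthetic data, its labelling rule, and the plain logistic-regression model are assumptions for demonstration, not the EPFL team's actual algorithm or dataset.

```python
# Sketch: a classifier trained on four weather-station variables
# (pressure, temperature, relative humidity, wind speed) to predict
# whether lightning will occur. Data and model are illustrative only.
import math
import random

random.seed(0)

def make_sample():
    pressure = random.uniform(950, 1050)   # hPa
    temp = random.uniform(-5, 35)          # deg C
    humidity = random.uniform(10, 100)     # %
    wind = random.uniform(0, 25)           # m/s
    # Hypothetical ground truth: storms favour low pressure + high humidity.
    lightning = 1 if (1050 - pressure) + humidity > 120 else 0
    return [pressure, temp, humidity, wind], lightning

data = [make_sample() for _ in range(400)]

# Standardise features so gradient descent behaves well.
means = [sum(x[i] for x, _ in data) / len(data) for i in range(4)]
stds = [max(1e-9, (sum((x[i] - means[i]) ** 2 for x, _ in data) / len(data)) ** 0.5)
        for i in range(4)]
X = [[(x[i] - means[i]) / stds[i] for i in range(4)] for x, _ in data]
y = [label for _, label in data]

# Plain logistic regression, batch gradient descent.
w, b = [0.0] * 4, 0.0
lr = 0.5
for _ in range(300):
    grad_w, grad_b = [0.0] * 4, 0.0
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(sum(wi * v for wi, v in zip(w, xi)) + b)))
        err = p - yi
        for i in range(4):
            grad_w[i] += err * xi[i]
        grad_b += err
    for i in range(4):
        w[i] -= lr * grad_w[i] / len(X)
    b -= lr * grad_b / len(X)

pred = [1 if (sum(wi * v for wi, v in zip(w, xi)) + b) > 0 else 0 for xi in X]
accuracy = sum(p == t for p, t in zip(pred, y)) / len(y)
print(f"train accuracy: {accuracy:.2f}")
```

In practice one would evaluate on held-out data, as the researchers did against ten years of lightning-detection records; the point here is only that a simple model over these four variables can separate storm-prone from quiet conditions.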
Weather
2019
November 5, 2019
https://www.sciencedaily.com/releases/2019/11/191105104420.htm
Satellite tracking shows how ships affect clouds and climate
By matching the movement of ships to the changes in clouds caused by their emissions, researchers have shown how strongly the two are connected.
When ships burn fossil fuels, they release airborne particles containing various naturally occurring chemicals, including sulphur. These particles are known to modify certain types of clouds, which can affect climate. Better knowledge of how these particles, and particularly the sulphur components, affect clouds could help scientists create more accurate climate models. In the latest study, satellite tracking was also used to show the impact of restrictions on sulphur in fuels, revealing that the impact of ships on clouds largely disappears in restricted zones. This information can be used to build a relationship between cloud properties and the sulphur content of shipping fuels. Importantly, this could help shipping companies monitor compliance with sulphur regulations that come into force on 1 January 2020. The study was published today. Emissions from ships contain several chemicals, including sulphate aerosols -- small particles of sulphur and oxygen. The aerosols can act as 'seeds' around which water droplets accumulate, causing changes in cloud properties that are visible to satellites. This means that ships can change clouds, leaving lines -- known as ship tracks -- in the clouds behind them as they sail. However, exactly how these aerosols impact the properties of the clouds is not precisely known.
This knowledge is important because the kinds of clouds that the emissions affect can influence climate warming, and it is therefore important to capture them in climate models. Aerosols are emitted from many sources, such as factories and cars, but it has been difficult to match these outputs with their influence on clouds, as there are many other factors at play. With ship tracks, however, the relationship is more straightforward, enabling researchers to tease out the links between aerosols and clouds more easily. Lead researcher Dr Edward Gryspeerdt, from the Department of Physics at Imperial, said: "Ship tracks act like an experiment that would be impossible for us to do otherwise -- we cannot inject sulphate aerosols into the atmosphere at such scale to see what happens. "Instead, restrictions on the amount of sulphur that ship fuel can contain provide us with a perfect experiment for determining just how important the aerosols are in cloud formation. By analysing a huge dataset of ship tracks observed from satellites, we can see that they largely disappear when restrictions are introduced, demonstrating the strong impact of aerosols." The team studied more than 17,000 ship tracks from satellite observations and matched them to the movements of individual ships using their onboard GPS. The study period covered the introduction of emission control areas around the coast of North America, the North Sea, the Baltic Sea and the English Channel, which restricted sulphur in ship fuel to 0.5 percent, leading to fewer sulphate aerosol emissions. The researchers found that in these areas, ship tracks nearly completely disappeared compared to before the restrictions, under similar weather conditions. This shows that sulphate aerosols have the most significant impact on cloud formation, as opposed to other components of the ship exhaust, such as black carbon. The result also means that a ship not in compliance with the regulations, by burning the current high-sulphur fuels without exhaust
treatment, could be detected because it would create a measurable difference in the satellite-observed cloud properties. Co-author Dr Tristan Smith, from UCL's Energy Institute, said: "Currently, it is hard for regulators to know what ships are doing in the middle of the ocean. The potential for undetected non-compliance with the 2020 sulphur regulations is a real risk for shipping companies because it can create a commercial advantage for those companies who do not comply. "This study shows that science and technology are producing significant advancements in the transparency of shipping, and helping to reduce risks and unfairness for responsible operators." As well as exploring how the method could be used to identify ships that may not be in compliance with the 0.5 percent limit, the team now want to relate known ship fuel compositions more precisely to ship tracks, allowing them to more accurately predict the influence of sulphur aerosols on cloud formation at a larger scale, ready to feed into climate models.
Weather
2019
November 4, 2019
https://www.sciencedaily.com/releases/2019/11/191104083320.htm
Satellites are key to monitoring ocean carbon
Satellites now play a key role in monitoring carbon levels in the oceans, but we are only just beginning to understand their full potential.
Our ability to predict future climate relies upon being able to monitor where our carbon emissions go. So we need to know how much stays in the atmosphere, or becomes stored in the oceans or on land. The oceans in particular have helped to slow climate change, as they absorb and then store carbon for thousands of years. The IPCC Special Report on the Ocean and Cryosphere in a Changing Climate, published in September, identified this critical role that the oceans play in regulating our climate, along with the need to increase our monitoring and understanding of ocean health. But the vast nature of the oceans, covering over 70% of the Earth's surface, illustrates why satellites are an important component of any monitoring. The new study, led by the University of Exeter, says that increased exploitation of existing satellites will enable us to fill "critical knowledge gaps" for monitoring our climate. The work reports that satellites originally launched to study the wind also have the capability to observe how rain, wind, waves, foam and temperature all combine to control the movement of heat and carbon dioxide between the ocean and the atmosphere. Additionally, satellites launched to monitor gas emissions over land are also able to measure carbon dioxide emissions as they disperse over the ocean. Future satellite missions offer even greater potential for new knowledge, including the ability to study the internal circulation of the oceans.
New constellations of commercial satellites, designed to monitor the weather and life on land, are also capable of helping to monitor ocean health. "Monitoring carbon uptake by the oceans is now critical to understand our climate and for ensuring the future health of the animals that live there," said lead author Dr Jamie Shutler, of the Centre for Geography and Environmental Science on Exeter's Penryn Campus in Cornwall. "By monitoring the oceans we can gather the necessary information to help protect ecosystems at risk and motivate societal shifts towards cutting carbon emissions." The research team included multiple European research institutes and universities, the US National Oceanic and Atmospheric Administration, the Japan Aerospace Exploration Agency and the European Space Agency. The researchers call for a "robust network" that can routinely observe the oceans. This network would need to combine data from many different satellites with information from automated instruments on ships, autonomous vehicles and floats that can routinely measure surface water carbon dioxide. Recent computing advancements, such as Google Earth Engine, which provides free access and computing for scientific analysis of satellite datasets, could also be used. The study suggests that an international charter that makes satellite data freely available during major disasters should be expanded to include the "long-term human-made climate disaster," enabling commercial satellite operators to easily contribute. The research was supported by the International Space Science Institute (ISSI) in Bern, Switzerland, and initiated by Dr Shutler at the University of Exeter and Dr Craig Donlon at the European Space Agency.
Weather
2019
October 22, 2019
https://www.sciencedaily.com/releases/2019/10/191022121128.htm
Satellite data used to calculate snow depth in mountain ranges
Bioscience engineers at KU Leuven (Belgium) have developed a method to measure the snow depth in all mountain ranges in the Northern Hemisphere using satellites. This technique makes it possible to study areas that cannot be accessed for local measurements, such as the Himalayas. The findings have been published.
"In Western Europe, we tend to associate snow with ski trips, outdoor fun, or traffic jams, which goes to show that the importance of snow is often underestimated," says postdoctoral researcher Hans Lievens from the Department of Earth and Environmental Sciences at KU Leuven, who is the lead author of this study. "Each year, a fifth of the Northern Hemisphere gets covered in snow. More than one billion people rely on this snow for drinking water. Melting water is also very important for agriculture and the production of electricity. Furthermore, snow has a cooling effect on our climate by reflecting sunlight." As part of an international team, Lievens studied the snow depth in more than 700 mountain ranges in the Northern Hemisphere. The team used radar measurements provided by Sentinel-1, a satellite mission of the European Space Agency (ESA). The researchers analysed the data for the period from the winter of 2016 through the summer of 2018. "The Sentinel-1 mission specifically aims to observe the surface of the Earth," says Lievens. "The satellite emits radar waves and, based on the reflection of these waves, we can calculate the snow depth. The ice crystals rotate the signal: the more rotated the waves, the more snow there is." Existing calculations of snow depth are often based on local measurements, but in many cases, these offer an inaccurate or incomplete picture. In the Himalayas, for instance, in-situ measurements are almost impossible due to the extreme circumstances. Thanks to the satellite data, it is now possible to observe mountain areas that are difficult or impossible to access. The absolute peak in the measurements pertains to the west of Canada: the Coast Mountains have a snow volume of 380 cubic kilometres. That is over 100 cubic kilometres more than local measurements indicate. Also standing out are the snowy areas in eastern Russia, especially in Siberia and on the Kamchatka Peninsula.
In Europe, the Scandinavian mountains and the Alps are the areas with the largest volumes of snow. "Based on these first measurements, we cannot estimate the impact of climate change yet, but this should become possible in the long run," says Lievens. "We will be able to monitor more accurately how the volume of snow evolves and when the melting season takes place. Our method may also help to improve water distribution management and to assess the flood risk in certain areas." This winter, Hans Lievens and doctoral student Isis Brangers are travelling to the Rocky Mountains in Idaho to further study the technique. "We don't fully understand yet what physically happens when the radar waves reflect in the snowpack. Various elements may influence the signal: the shape and size of the ice crystals, humidity, the different layers of snow, and so on. By continuing to measure and study snow locally, we should be able to refine the method." "In January and February, we'll also take part in the NASA SnowEx campaign. An international team of scientists is examining the snow conditions at Grand Mesa, a large plateau in Colorado at an altitude of 3,500 metres. We'll be testing various new techniques and sensors there to calculate the snow mass. It promises to be a very intensive but especially informative time."
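The physical idea Lievens describes -- ice crystals rotating the radar signal, so more snow means a stronger cross-polarised return -- can be sketched as a toy retrieval. Everything numeric here is a hypothetical assumption for illustration: the real Sentinel-1 algorithm is empirically calibrated per region, and the linear sensitivity used below is invented.

```python
# Toy sketch of the retrieval idea: snow rotates part of the radar signal
# into the cross-polarised channel, so the cross- over co-polarised
# backscatter ratio rises with snow depth. Coefficients are invented.

def cross_pol_ratio_db(sigma_vh_db: float, sigma_vv_db: float) -> float:
    """Cross-polarised minus co-polarised backscatter, in dB."""
    return sigma_vh_db - sigma_vv_db

def snow_depth_m(ratio_db: float, ratio_bare_db: float,
                 sensitivity_db_per_m: float = 2.0) -> float:
    """Hypothetical linear mapping from the ratio increase to snow depth."""
    return max(0.0, (ratio_db - ratio_bare_db) / sensitivity_db_per_m)

bare = cross_pol_ratio_db(-18.0, -8.0)     # snow-free reference: -10 dB
winter = cross_pol_ratio_db(-14.0, -8.0)   # snow present: -6 dB
print(snow_depth_m(winter, bare))          # (-6 - -10) / 2.0 = 2.0 m
```

The snow-free reference ratio is what the crystal shape, layering and humidity questions mentioned above complicate in practice; field campaigns like SnowEx exist precisely to pin such coefficients down.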
Weather
2019
October 21, 2019
https://www.sciencedaily.com/releases/2019/10/191021153346.htm
Climate warming promises more frequent extreme El Niño events
El Niño events cause serious shifts in weather patterns across the globe, and an important question that scientists have sought to answer is: how will climate change affect the generation of strong El Niño events? A newly published study addresses this question.
The team examined details of 33 El Niño events from 1901 to 2017, evaluating for each event the onset location of the warming, its evolution, and its ultimate strength. By grouping the common developmental features of the events, the team was able to identify four types of El Niño, each with distinct onset and strengthening patterns. Looking across time, they found a decided shift in behavior since the late 1970s: all events beginning in the eastern Pacific occurred prior to that time, while all events originating in the western-central Pacific happened since then. They also found that four of five identified extreme El Niño events formed after 1970. Wang and his co-authors focused on the factors that seemed to be controlling these shifts, including increased sea surface temperatures in the western Pacific warm pool and the easterly winds in the central Pacific, and found that with continued global warming, those factors may lead to a continued increase in the frequency of extreme El Niño events. "Simulations with global climate models suggest that if the observed background changes continue under future anthropogenic forcing, more frequent extreme El Niño events will induce profound socioeconomic consequences," reports Wang. Past strong El Niño events have caused severe droughts in the western Pacific Islands and Australia, leading to extensive wildfires and famine, while dangerous flooding from excessive rainfall has plagued the northern coasts of South America. Warm ocean temperatures associated with the events have also had strongly negative effects on fisheries and coral reefs globally. The classification system derived in this study provides an important tool for improvement in climate modeling of El Niño and La Niña events. Wang's research group plans to explore further how this work may help improve predictions of future El Niño events.
A better understanding of how these events may change over time will help adaptation efforts to mitigate their economic, environmental, and societal impacts.
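The grouping step described above -- sort events by where the initial warming appears, then tally behaviour over time -- can be sketched in a few lines. This is a heavily hedged illustration: the event list, longitudes, anomaly values and the 200°E regional split below are all invented for demonstration, not the study's data or classification scheme.

```python
# Sketch of classifying El Nino events by onset region and counting
# extreme events over time. All values below are invented.

# (year, onset longitude in degrees E, peak sea-surface anomaly in deg C)
events = [
    (1902, 250, 1.1), (1925, 245, 1.8), (1940, 255, 1.4),
    (1972, 180, 2.1), (1982, 175, 2.8), (1997, 170, 2.9),
    (2015, 165, 2.6),
]

def onset_region(lon_e: float) -> str:
    # Hypothetical split: warming first appearing east of 200E counts as
    # an eastern-Pacific onset, otherwise western-central Pacific.
    return "eastern" if lon_e >= 200 else "western-central"

by_region = {}
for year, lon, peak in events:
    by_region.setdefault(onset_region(lon), []).append(year)

# Treat anomalies of 2 degrees C or more as "extreme" (an assumption).
extreme_after_1970 = sum(1 for y, _, p in events if p >= 2.0 and y > 1970)
print(by_region)
print(extreme_after_1970)
```

With this invented data the tally mirrors the qualitative pattern the study reports: eastern-Pacific onsets cluster early in the record, western-central onsets and extreme events cluster after 1970.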
Weather
2019
October 15, 2019
https://www.sciencedaily.com/releases/2019/10/191015171550.htm
Last year's extreme snowfall wiped out breeding of Arctic animals and plants
In 2018, vast amounts of snow were spread across most of the Arctic region and did not melt fully until late summer, if at all. The findings were published on October 15 in an open-access journal.
The Arctic is home to a diverse and specialized group of organisms, highly adapted to life under the severe climatic conditions. But now the Arctic is changing, and the region is experiencing both long-term warming and retreating snow cover. At the same time, climatic variability and the risk of extreme events are increasing. While the consequences of longer-term change are well documented, we know almost nothing about the impacts of climatic variability and extreme events on Arctic ecosystems. The 2018 snow conditions resulted in the most complete reproductive failure ever encountered at Zackenberg, and only a few plants and animals were able to reproduce due to abundant and late-melting snow. While poor reproduction had been observed in individual species before, such poor reproduction across all levels of the ecosystem had never been seen. "One non-breeding year is hardly that bad for high-arctic species," says Niels Martin Schmidt (Aarhus University, Denmark), lead author of the study. "The worrying perspective is that 2018 may offer a peep into the future, where increased climatic variability may push the arctic species to -- and potentially beyond -- their limits. Our study shows that climate change is more than 'just' warming, and that ecosystems may be hard hit by currently still rare but extreme events. What it also brings out is the unparalleled value of long-term observations of the Arctic. Only by keeping an eye on full arctic ecosystems can we understand the havoc brought by the changing climate."
Weather
2019
October 15, 2019
https://www.sciencedaily.com/releases/2019/10/191015164654.htm
Artificial intelligence and farmer knowledge boost smallholder maize yields
Farmers in Colombia's maize-growing region of Córdoba had seen it all: too much rain one year, a searing drought the next. Yields were down and their livelihoods hung in the balance.
The situation called for a new approach. They needed information services that would help them decide what varieties to plant, when they should sow and how they should manage their crops. A consortium was formed with the government, Colombia's National Cereals and Legumes Federation (FENALCE), and big-data scientists at the International Center for Tropical Agriculture (CIAT). The researchers used big-data tools, based on the data farmers helped collect, and yields increased substantially. The study was published in September. "Today we can collect massive amounts of data, but you can't just bulk it, process it in a machine and make a decision," said Daniel Jimenez, a data scientist at CIAT and the study's lead author. "With institutions, experts and farmers working together, we overcame difficulties and reached our goals." During the four-year study, Jimenez and colleagues analyzed the data and developed guidelines for increased production. Some farmers immediately followed the guidelines, while others waited until they were verified in field trials. Farmers who adopted the full suite of machine-generated guidelines saw their yields increase from an average of 3.5 tons per hectare to more than 6 tons per hectare -- an excellent yield for rainfed maize in the region. The guidelines also substantially reduced fertilizer costs and provided advice on how to reduce risks related to variation in weather patterns, with an emphasis on reducing the negative impacts of heavy rainfall. Researchers from FENALCE co-authored the study, which is part of a Colombian government program aimed at providing farmers with options to manage both weather variability and climate change. "If one farmer provides data to a researcher, it is almost impossible to gain many insights into how to improve management," said James Cock, a co-author and emeritus CIAT scientist.
"On the other hand, if many farmers, each with distinct experiences, growing conditions, and management practices, provide information, with the help of machine learning it is possible to deduce where and when specific management practices will work." Year-on-year, maize yields in the study region vary by as much as 39 percent due to the weather. Small farmers in the past had to rely on their own knowledge of their crops and accept blanket recommendations often developed by researchers far removed from their own milieu. The study shows that by combining farmers' knowledge with data on weather, soils and crop response to variables, farmers can, at least partially, shield their crops against climate variability and stabilize their yields at a higher level. In Córdoba, FENALCE, which compiles information on maize plantations, harvests, yields and costs, set up a web-based platform to collect and maintain data from individual farms. Local experts uploaded information on soils after visiting farms at various stages of crop development, while IDEAM, Colombia's weather agency, supplied weather information from six stations in the region. This allowed researchers to match daily weather station information with individual fields and the various stages of the growing season. The researchers used machine learning algorithms and expert analysis to measure the impact of different weather, soil conditions and farming practices on yields. For example, they noticed that improving soil drainage to reduce run-off likely reduces yields when rainfall is lower, whereas doing the same in areas with a lot of rain boosts yields. This shows that advice on crops needs to be site-specific. The study demonstrated that the amount of phosphorus applied, the seed rate, and field run-off capacity had a major impact on yield levels.
Understanding the effects of the inputs on the crops allowed experts to guide small farmers towards the best practices to use in order to produce high, stable yields. The upshot for farmers is that most of the management practices the study recommends do not require major investments, showing that food security and livelihoods can be improved -- at least in this case -- without major expenditures. Initially, CIAT and FENALCE designed a smartphone application for farmers to record soil and other data in the field, but corn growers did not adopt the app. Although the web-based platform was used to compile the information, researchers and technical assistants had to visit the farms to help the farmers collect the data. This presents challenges for scaling up this type of exercise. Nevertheless, researchers see opportunities for increased data collection by smallholders, both by directly working with farmers and through technology. Future projects could incorporate apps already developed and used by farmers. Furthermore, data collection by a whole array of technologies ranging from satellites, drones and low-cost sensors deployed in fields, coupled with combine harvesters that accurately record grain yield at a micro-scale, is becoming a reality in the developing world. "Much of the hardware and software for the future collection of data may well come when the private sector becomes involved in developing sustainable systems for capturing, analyzing and distributing information," said Jimenez. "In the future we can envisage every field being carefully characterized and monitored, turning the landscape into a whole series of experiments that provide data which machine learning can interpret to help farmers manage their crops better."
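The drainage example above is an interaction effect: the same practice raises yield under heavy rain and lowers it under low rain, which is exactly what makes blanket recommendations fail. A minimal sketch of surfacing such an interaction from pooled farm records (all records below are invented for illustration; the study used real farm, soil and weather data and more sophisticated models):

```python
# Sketch: group pooled farm records by (rainfall regime, drainage
# practice) and compare mean yields. Records are invented.
from statistics import mean

# (rainfall regime, drainage improved?, yield in t/ha)
records = [
    ("wet", True, 6.2), ("wet", True, 5.9), ("wet", False, 4.1),
    ("wet", False, 4.4), ("dry", True, 3.0), ("dry", True, 3.2),
    ("dry", False, 3.8), ("dry", False, 3.6),
]

def mean_yield(rain: str, drained: bool) -> float:
    return mean(y for r, d, y in records if r == rain and d == drained)

# Drainage helps under heavy rain but hurts when rainfall is low:
wet_gain = mean_yield("wet", True) - mean_yield("wet", False)
dry_gain = mean_yield("dry", True) - mean_yield("dry", False)
print(f"wet: {wet_gain:+.2f} t/ha, dry: {dry_gain:+.2f} t/ha")
```

No single "drainage effect" exists in such data; only the per-regime comparison is meaningful, which is why the recommendations had to be site-specific.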
Weather
2019
October 7, 2019
https://www.sciencedaily.com/releases/2019/10/191007103611.htm
Another casualty of climate change? Recreational fishing
Another casualty of climate change will likely be shoreline recreational fishing, according to new research from North Carolina State University and Oregon State University. The study finds some regions of the U.S. may benefit from increasing temperatures, but those benefits will be more than offset by declines in fishing elsewhere.
"If there are not significant efforts to curtail climate change, we're looking at declines in recreational fishing participation of around 15% by 2080," says Roger von Haefen, co-author of the study and a professor of agricultural and resource economics at NC State. "We also want to stress that this study looks solely at how changes in temperature and precipitation are likely to affect people's willingness to go fishing from the shore," von Haefen says. "This work doesn't get at shifts in fish populations, water quality impacts, or other climate-related changes that could affect recreational fishing demand." To examine this issue, the researchers looked at shoreline recreational fishing data from 2004 through 2009, encompassing all Atlantic coast states, as well as Alabama, Mississippi and Louisiana. Specifically, the researchers examined how different temperature and precipitation conditions impacted decisions to participate in recreational fishing. They found that temperature did affect people's willingness to go fishing, but that the relationship wasn't linear. In other words, temperature extremes (hot or cold) tended to reduce participation relative to an "ideal" 75°F day. "Going from chilly to balmy weather can stimulate more recreation, and our data and models bear that out," says Steven Dundas, corresponding author of the study and an assistant professor of applied economics at Oregon State. "But increasing temperature when it's already hot can curtail fishing participation. For example, we estimate participation declines once daily high temperatures reach the mid-90s Fahrenheit." The researchers incorporated this data into a simulation model of recreational behavior.
They then coupled their estimates with forecasts from 132 general circulation models, each of which predicts future weather under different greenhouse gas reduction scenarios. "If the world adopts stringent climate change mitigation efforts, we predict a 2.6% decline in fishing participation by 2080," Dundas says. "That's the overall best-case scenario." "Worst-case scenario, we see participation drop 15% by 2080. It could drop by 3.4% in the next 30 years, and by 9.9% as early as 2050." "It's important to note that this decline won't be evenly spread across states," von Haefen says. "Cooler areas, such as New England, may see increases in fishing, especially during the 'shoulder' seasons -- early spring and late autumn. But hotter states, like those in the Southeast and Gulf regions, will experience significant summertime declines that will likely offset those gains." "In addition, some people who still fish on hot days may shift the time of day when they fish. For example, our results suggest that people fish more in the early mornings and at night to avoid extreme heat."
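The nonlinear temperature response described above -- participation peaking near an "ideal" day around 75°F and falling toward either extreme -- can be illustrated with a simple hump-shaped function. The quadratic form and its width are assumptions for illustration only, not the study's estimated model.

```python
# Sketch of a hump-shaped temperature response: relative participation is
# highest at an "ideal" temperature and declines toward both extremes.
# The quadratic shape and 40 F half-width are invented for illustration.

def participation_index(temp_f: float, ideal_f: float = 75.0,
                        width_f: float = 40.0) -> float:
    """Relative participation: 1.0 at the ideal temperature, 0 at extremes."""
    return max(0.0, 1.0 - ((temp_f - ideal_f) / width_f) ** 2)

for t in (35, 55, 75, 95):
    print(t, round(participation_index(t), 3))
```

The key property, matching the quotes above, is that warming helps on cool days (55°F sits below the peak) but hurts on already-hot days (95°F sits past it).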
Weather
2019
October 7, 2019
https://www.sciencedaily.com/releases/2019/10/191007081750.htm
The last mammoths died on a remote island
The last woolly mammoths lived on Wrangel Island in the Arctic Ocean; they died out 4,000 years ago within a very short time. An international research team from the Universities of Helsinki and Tübingen and the Russian Academy of Sciences has now reconstructed the scenario that could have led to the mammoths' extinction. The researchers believe a combination of isolated habitat and extreme weather events, and even the spread of prehistoric man, may have sealed the ancient giants' fate. The study has recently been published.
During the last ice age -- some 100,000 to 15,000 years ago -- mammoths were widespread in the northern hemisphere from Spain to Alaska. Due to the global warming that began 15,000 years ago, their habitat in Northern Siberia and Alaska shrank. On Wrangel Island, some mammoths were cut off from the mainland by rising sea levels; that population survived another 7,000 years. The team of researchers from Finland, Germany and Russia examined the isotope compositions of carbon, nitrogen, sulfur and strontium from a large set of mammoth bones and teeth from Northern Siberia, Alaska, the Yukon, and Wrangel Island, ranging from 40,000 to 4,000 years in age. The aim was to document possible changes in the diet of the mammoths and their habitat and find evidence of a disturbance in their environment. The results showed that Wrangel Island mammoths' collagen carbon and nitrogen isotope compositions did not shift as the climate warmed up some 10,000 years ago. The values remained unchanged until the mammoths disappeared, seemingly from the midst of stable, favorable living conditions. This result contrasts with the findings on woolly mammoths from the Ukrainian-Russian plains, which died out 15,000 years ago, and on the mammoths of St. Paul Island in Alaska, which disappeared 5,600 years ago. In both cases, the last representatives of these populations showed significant changes in their isotopic composition, indicating changes in their environment shortly before they became locally extinct. Earlier aDNA studies indicate that the Wrangel Island mammoths suffered mutations affecting their fat metabolism. In this study, the team found an intriguing difference between the Wrangel Island mammoths and their ice age Siberian predecessors: the carbonate carbon isotope values indicated a difference in the fats and carbohydrates in the populations' diets.
"We think this reflects the tendency of Siberian mammoths to rely on their reserves of fat to survive through the extremely harsh ice age winters, while Wrangel mammoths, living in milder conditions, simply didn't need to," says Dr. Laura Arppe from the Finnish Museum of Natural History Luomus, University of Helsinki, who led the team of researchers. The bones also contained levels of sulfur and strontium that suggested the weathering of bedrock intensified toward the end of the mammoth population's existence. This may have affected the quality of the mammoths' drinking water. Why then did the last woolly mammoths disappear so suddenly? The researchers suspect that they died out due to short-term events. Extreme weather, such as a rain-on-snow (icing) event, could have covered the ground in a thick layer of ice, preventing the animals from finding enough food. That could have led to a dramatic population decline and eventually to extinction. "It's easy to imagine that the population, perhaps already weakened by genetic deterioration and drinking water quality issues, could have succumbed after something like an extreme weather event," says professor Hervé Bocherens from the Senckenberg Center for Human Evolution and Palaeoenvironment at the University of Tübingen, a co-author of the study. Another possible factor could have been the spread of humans. The earliest archaeological evidence of humans on Wrangel Island dates to just a few hundred years after the most recent mammoth bone. The chance of finding evidence that humans hunted Wrangel Island mammoths is very small. Yet a human contribution to the extinction cannot be ruled out. The study shows how isolated small populations of large mammals are particularly at risk of extinction due to extreme environmental influences and human behavior. An important takeaway from this is that we can help preserve species by protecting populations that are not isolated from one another.
Weather
2019
October 3, 2019
https://www.sciencedaily.com/releases/2019/10/191003114009.htm
Northern forests have lost crucial cold, snowy conditions
Winter conditions are changing more rapidly than any other season and researchers have found clear signs of a decline in frost days, snow covered days and other indicators of winter that could have lasting impacts on ecosystems, water supplies, the economy, tourism and human health.
As the popular saying goes, "winter is coming," but is it? Researchers at the University of New Hampshire have found clear signs of a decline in frost days, snow-covered days and other indicators of winter that could have lasting impacts on ecosystems, water supplies, the economy, tourism and human health. "Winter conditions are changing more rapidly than any other season and it could have serious implications," said Alexandra Contosta, research assistant professor at UNH's Earth Systems Research Center. "Whether precipitation falls as snow or rain makes a big difference, whether you're talking about a forest stream, a snowshoe hare or even a skier." In their recently published study, the researchers note that people tend to view cold and snowy weather as burdensome. Yet winter is important for many ecosystems that influence water, wildlife, forests and people. For instance, cold temperatures help prevent the spread of diseases like Lyme disease and West Nile virus through insects like ticks and mosquitoes, and help manage insects that are detrimental to trees, like the hemlock woolly adelgid and eastern pine beetle. A deep and long-lasting snowpack also insulates soils from frigid air temperatures, which prevents roots from freezing, promotes soil nutrient cycling and provides wildlife habitat for burrowing animals. Snow cover is as important to the economy and culture of the northern forest as it is to its ecology, especially for timber harvest, maple sugaring, winter recreation activities like skiing and ice skating, and the hunting and fishing essential for indigenous peoples. "What makes our work unique is that we considered the human effect of climate as well as the ecological or meteorological aspects," said Contosta.
"For example, we looked at 'mud days,' when temperatures are above freezing and no snow cover is present, which can impact not only forest soil nutrients but also loggers who are not able to reach certain areas that can only be harvested with deep snow." Researchers say much of what is understood about the effects of climate change on ecosystems is based on research conducted during the growing season; it is more common to hear about summer climate metrics like a drought index or heating degree days. They argue that more research needs to be done during the so-called 'dormant' season to fill in key gaps in how forest ecosystems respond to climate change. This study was funded by the Northeastern States Research Cooperative (NSRC), with additional support to the Hubbard Brook Research Foundation from the Canaday Family Charitable Trust, the Lintilhac Foundation and the Davis Conservation Foundation.
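The winter indicators discussed above (frost days, snow-covered days, and "mud days") are straightforward to compute from daily station data. A minimal sketch in Python, using simple illustrative thresholds (0 °C for frost and an assumed 2.5 cm snow depth for "snow-covered"; the study's exact definitions may differ):

```python
import numpy as np

def winter_indicators(tmin_c, tmean_c, snow_depth_cm):
    """Count simple winter indicators from daily series.

    tmin_c, tmean_c : daily minimum / mean air temperature (deg C)
    snow_depth_cm   : daily snow depth (cm)
    Returns counts of frost days, snow-covered days, and 'mud days'
    (above-freezing days with effectively bare ground).
    """
    tmin = np.asarray(tmin_c, dtype=float)
    tmean = np.asarray(tmean_c, dtype=float)
    snow = np.asarray(snow_depth_cm, dtype=float)

    frost_days = int(np.sum(tmin < 0.0))                   # minimum below freezing
    snow_days = int(np.sum(snow >= 2.5))                   # assumed snow-cover threshold
    mud_days = int(np.sum((tmean > 0.0) & (snow < 2.5)))   # thawed, bare ground
    return {"frost_days": frost_days,
            "snow_covered_days": snow_days,
            "mud_days": mud_days}

# Tiny synthetic week of winter weather (illustrative values only)
ind = winter_indicators(
    tmin_c=[-5, -3, 1, 2, -1, -6, 0.5],
    tmean_c=[-2, -1, 3, 4, 1, -3, 2],
    snow_depth_cm=[10, 8, 0, 0, 5, 12, 0],
)
print(ind)
```

The same counting approach extends naturally to other indicators the article mentions, such as days with rain falling on existing snowpack.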
Weather
2019
September 30, 2019
https://www.sciencedaily.com/releases/2019/09/190930131549.htm
Brave new world: Simple changes in intensity of weather events 'could be lethal'
Hurricane Dorian is the latest example of a frightening trend. Extreme weather events are becoming more frequent, more severe and more widespread as a consequence of climate change. New research from Washington University in St. Louis provides important new insights into how different species may fare under this new normal.
Faced with unprecedented change, animals and plants are scrambling to catch up -- with mixed results. A new model developed by Carlos Botero, assistant professor of biology in Arts & Sciences, and Thomas Haaland, formerly a graduate student at the Norwegian University of Science and Technology, helps to predict the types of changes that could drive a given species to extinction. The study was published Sept. 27. "It is difficult to predict how organisms will respond to changes in extreme events because these events tend to be, by definition, quite rare," Botero said. "But we can have a pretty good idea of how any given species may respond to current changes in this aspect of climate -- if we pay attention to its natural history, and have some idea of the climatic regime it has experienced in the past." Researchers in the Botero laboratory use a variety of tools from ecology and evolutionary biology to explore how life -- from bacteria to humans -- copes with and adapts to repeated environmental change. For the new study, Botero worked with his former student Haaland, now a postdoctoral fellow at the University of Zurich in Switzerland, to develop an evolutionary model of how populations respond to rare environmental extremes. (Think: 500-year floods.) These rare events can be tricky for evolution because it is difficult to adapt to hazards that are almost never encountered. Through computer simulations, Haaland and Botero found that certain traits and experiences emerged as key indicators of vulnerability. The key insight of the new model is that species falling into a "conservative" category can easily adapt to more frequent or widespread extremes but have trouble adjusting when those extremes become more intense.
The opposite is true of species in the "care-free" category. Haaland and Botero also found that factors speeding up trait evolution are generally likely to hinder -- rather than favor -- adaptation to rare selection events. Part of the reason: high mutation rates tend to facilitate the process of adaptation to normal conditions during the long intervals in between environmental extremes. "Our results challenge the idea that species that have been historically exposed to more variable environments are better suited to cope with climate change," Botero said. "We see that simple changes in the pattern and intensity of environmental extremes could be lethal even for populations that have experienced similar events in the past. This model simply helps us better understand when and where we may have a problem." The simple framework that Haaland and Botero describe can be applied to any kind of environmental extreme, including flooding, wildfires, heat waves, droughts, cold spells, tornadoes and hurricanes -- any and all of which might be considered part of the "new normal" under climate change. Take extreme heat as an example. The model can be used to predict what will happen to animal or plant species when there are more heat waves, when heat waves last longer, or when typical heat waves affect larger areas. "Regions in which heat waves used to be rare and patchy are likely to host primarily species that do not exhibit conspicuous adaptations to extreme heat," Botero said. "Our model indicates that the biggest threats of extinction in these particular locations will therefore be more frequent or widespread heat waves, and that the species of highest concern in these places will be endemics and species with small geographic distribution." "Conversely, areas in which heat waves were historically common and widespread can be expected to host species that already exhibit adaptations for extreme heat," Botero added.
"In this case, our model suggests that the typical inhabitants of these places are likely to be more vulnerable to hotter temperatures than to longer or more widespread heat waves." The new model gives wildlife managers and conservation organizations insight into the potential vulnerabilities of different species based on relatively simple assessments of their natural histories and historical environments. For example, a 2018 study by Colin Donihue, visiting postdoctoral fellow at Washington University, found that Anolis lizards in the Caribbean tend to evolve larger toepads and shorter limb lengths in response to hurricanes because these traits help them cling better to branches during strong winds. The new model suggests that while these lizards are unlikely to be affected by more frequent hurricanes, their populations may nevertheless face a significant threat of extinction if future hurricanes become more intense. A possible solution to this problem might be to provide wind refuges across the island to allow parts of the population to escape winds of very high intensity, Botero suggested. "While this simple conservation action is unlikely to completely shift the balance from a 'conservative' to a 'care-free' evolutionary response to extreme events, it may nevertheless reduce the strongest vulnerability of these 'conservative' lizard populations," Botero said. "It might just buy them enough time to accumulate sufficient evolutionary changes in their toes and limbs to meet the new demands of their altered habitat."
Weather
2019
September 25, 2019
https://www.sciencedaily.com/releases/2019/09/190925154033.htm
Climate change could cause drought in wheat-growing areas
In a new study, researchers found that unless steps are taken to mitigate climate change, up to 60 percent of current wheat-growing areas worldwide could see simultaneous, severe and prolonged droughts by the end of the century. Wheat is the world's largest rain-fed crop in terms of harvested area and supplies about 20 percent of all calories consumed by humans.
The risk of widespread drought in wheat production areas is four times the level scientists see today, said Song Feng, associate professor of geosciences and the second author on the study. Given present-day weather patterns, severe drought could affect up to 15 percent of current wheat-growing areas, the study states. Researchers found that even if global warming is held to 2 degrees Celsius above pre-industrial levels, the target of the Paris Agreement, up to 30 percent of global wheat production areas could see simultaneous drought. "This clearly suggests that global warming will affect food production," said Feng. For the study, Feng and colleagues analyzed 27 climate models, each of which had three different scenarios. "It was terabytes of information, and it took a couple of months and multiple computers to run," he said. Feng and Miroslav Trnka, a professor at the Global Change Research Institute in the Czech Republic and first author of the study, came up with the idea for the study over pizza at a conference in Nebraska. They sketched out the initial ideas on the back of a napkin. The study found that historically, the total area affected by severe drought worldwide and food prices are closely related: more widespread drought has meant higher food prices. "If only one country or region sees a drought there is less impact," Feng said. "But if multiple regions are affected simultaneously, it can affect global production and food prices, and lead to food insecurity."
Weather
2019
September 24, 2019
https://www.sciencedaily.com/releases/2019/09/190924112110.htm
New standard of reference for assessing solar forecast proposed
Being able to accurately forecast how much solar energy reaches the surface of the Earth is key to guiding decisions for running solar power plants.
While day-ahead forecasts have become more accurate in recent years, the solar community lacks a unified verification procedure, and assessing how one forecast compares to another is difficult. In new work, researcher Dazhi Yang proposed an improved way to assess day-ahead solar forecasting. The proposed method combines two popular reference methods for weather forecasting, namely persistence and climatology. Using a weighted linear combination of both methods, his approach provides a new way to gauge the skill of a forecaster. "There is a large collection of solar forecasting works in the literature. However, all papers claim superiority, which is clearly not possible," Yang said. "Without a standardized reference method to gauge forecast accuracy, we cannot compare methods reported in different papers, using different data from different locations and timescales." Persistence reference methods assume weather does not change from day to day, so tomorrow's forecast can be drawn from today's observations. Climatology examines long-term averaged observations to generate a forecast. "It's generally unclear which type of forecast is more accurate, making it difficult to determine the best forecasters," Yang said. "A standard of reference is much needed in the community, so that forecasters can calculate the skill score and thus perform a direct, 'apples-to-apples' comparison." To demonstrate the universality of the proposed method, Yang applied the framework to a dataset from the Baseline Surface Radiation Network. The network's 66 stations can be found on all seven continents and on an island in every ocean, and the network has been operating for 27 years. For some applications, Yang's combined approach proved better than either climatology or persistence alone. If the goal is to assess the reliability of a forecast, he found the combined reference is optimal.
If the goal is to maximize how well a forecast differentiates between different forecasting situations, then forecasters should use a form of the persistence approach that considers the hourly changes of the weather. Such conditional findings might explain why there isn't a consensus in the first place, he said. Yang hopes that, with the help of journal editors, who might bolster the profile of the new reference method, this framework will become the standard of reference for day-ahead solar forecasting.
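The idea of scoring a forecast against a combined persistence-climatology reference can be sketched as follows. This is an illustration of the general approach, not Yang's exact weighting scheme: the fixed 50/50 weight `alpha` and the toy irradiance data are assumptions.

```python
import numpy as np

def combined_reference(obs, clim_mean, alpha=0.5):
    """Convex combination of the two classic reference forecasts:
    persistence (yesterday's observation) and climatology (long-term mean).
    alpha weights persistence; (1 - alpha) weights climatology."""
    obs = np.asarray(obs, dtype=float)
    persistence = np.roll(obs, 1)   # forecast for day t = observation at t-1
    persistence[0] = clim_mean      # no prior observation for the first day
    return alpha * persistence + (1.0 - alpha) * clim_mean

def skill_score(forecast, obs, reference):
    """MSE-based skill score: 1 = perfect, 0 = no better than the reference,
    negative = worse than the reference."""
    forecast, obs, reference = (np.asarray(a, dtype=float)
                                for a in (forecast, obs, reference))
    mse_f = np.mean((forecast - obs) ** 2)
    mse_r = np.mean((reference - obs) ** 2)
    return 1.0 - mse_f / mse_r

# Toy daily irradiance observations (kWh/m^2) and a hypothetical forecast
obs = np.array([5.0, 5.5, 4.0, 6.0, 5.2])
ref = combined_reference(obs, clim_mean=obs.mean(), alpha=0.5)
fcst = np.array([5.1, 5.3, 4.4, 5.8, 5.1])
print(f"skill vs combined reference: {skill_score(fcst, obs, ref):.3f}")
```

Because every forecaster divides by the error of the same standardized reference, skill scores computed this way are comparable across papers, sites and timescales, which is the "apples-to-apples" comparison Yang argues for.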
Weather
2019
September 20, 2019
https://www.sciencedaily.com/releases/2019/09/190920183112.htm
Daily rainfall over Sumatra linked to larger atmospheric phenomenon
Around the globe, communities are concerned with rain and storms. An area known as the "Maritime Continent," which includes major islands such as Sumatra, Java, Borneo and Papua New Guinea, along with a galaxy of smaller islands, experiences significant rainfall, including periodic monsoonal rain and flash flooding.
In a new study led by atmospheric scientist Giuseppe Torri at the University of Hawai'i (UH) at Mānoa School of Ocean and Earth Science and Technology (SOEST), researchers revealed details of the connection between a larger atmospheric phenomenon, termed the Madden-Julian Oscillation (MJO), and the daily patterns of rainfall in the Maritime Continent. The MJO circles the globe around the tropics and can affect weather on weekly to monthly time scales, alternately bringing cloudy, rainy periods and sunny, drier periods. Torri and co-authors found that the impact of the MJO on the daily rainfall patterns of Sumatra was quite significant. When the MJO was active near the Maritime Continent, there was more water vapor -- and therefore greater potential for significant rain events -- and more variation in water vapor throughout the day, as compared to the suppressed phase. Also, clouds and rain seemed to move offshore at night faster during the active phase of the MJO. The team relied on data from a network of GPS stations that were installed on Sumatra and on the neighboring islands by a team of scientists interested in monitoring tectonic activity along the western coast of Sumatra. As it turns out, the GPS signal is distorted by the amount of water vapor in the atmosphere. This distortion is bad news for people interested in location information -- which is what GPS technology was invented for. However, scientists, including UH Mānoa atmospheric sciences professor Steven Businger, realized that the distortion can tell us something about the state of the atmosphere, and pioneered its use as a source of data. With the extensive coverage of the GPS stations on the island of Sumatra, the team had a dataset that provided a highly detailed picture of the daily atmospheric changes. "Given the existing scientific literature, we had a sense that the MJO had an impact on the local convection in the Maritime Continent," said Torri.
"One thing that was surprising to me was just how well we could see the convection propagate offshore in the late evening. This is thanks to the density of stations of the GPS network we considered." The MJO is arguably one of the most important phenomena on the planet, and can influence the weather and the climate of regions thousands of miles away from the Maritime Continent. A better understanding of the MJO, and a good way to simulate it, are key to better understanding our current and future climate. While the current study furthers understanding of the impacts of the MJO on clouds and rain over Sumatra, Torri will team up with SOEST atmospheric scientist Alison Nugent to investigate the causes of these impacts and the mechanisms that control the offshore propagation of rainfall.
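The water-vapor "distortion" described above is usually quantified as a zenith wet delay (ZWD) in the GPS signal, which is roughly proportional to precipitable water vapor (PWV). A minimal sketch of the standard conversion, using commonly quoted refractivity constants; the exact constants and the mean atmospheric temperature `tm_kelvin` vary by site and study, so treat the values below as illustrative:

```python
def pwv_from_zwd(zwd_m, tm_kelvin=273.0):
    """Convert GPS zenith wet delay (m) to precipitable water vapor (m).

    Uses the widely cited proportionality PWV = Pi * ZWD, where Pi depends
    on the water-vapor-weighted mean atmospheric temperature Tm. Constants
    are the commonly quoted refractivity values in SI (per-Pa) units.
    """
    k2_prime = 0.221   # K / Pa
    k3 = 3.739e3       # K^2 / Pa
    r_v = 461.5        # J / (kg K), specific gas constant of water vapor
    rho_w = 1000.0     # kg / m^3, density of liquid water
    pi_factor = 1.0e6 / (rho_w * r_v * (k3 / tm_kelvin + k2_prime))
    return pi_factor * zwd_m

# A 20 cm wet delay corresponds to roughly 31 mm of precipitable water
print(f"{pwv_from_zwd(0.20) * 1000:.1f} mm")
```

The proportionality factor comes out near 0.15, so each centimeter of wet delay corresponds to about a millimeter and a half of column water vapor, which is why dense GPS networks can double as water-vapor sensors.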
Weather
2019
September 20, 2019
https://www.sciencedaily.com/releases/2019/09/190920124643.htm
Water may be scarce for new power plants in Asia
Climate change and over-tapped waterways could leave developing parts of Asia without enough water to cool power plants in the near future, new research indicates.
The study found that existing and planned power plants that burn coal for energy could be vulnerable. The work was published today. "One of the impacts of climate change is that the weather is changing, which leads to more extreme events -- more torrential downpours and more droughts," said Jeffrey Bielicki, a co-author of the study and an associate professor with a joint appointment in the Department of Civil, Environmental, and Geodetic Engineering and the John Glenn College of Public Affairs at The Ohio State University. "The power plants -- coal, nuclear and natural gas power plants -- require water for cooling, so when you don't have the rain, you don't have the stream flow, and you can't cool the power plant." That is already a problem for some power plants in the United States, Bielicki said, where extreme weather patterns, which are increasingly frequent especially in hotter months, have reduced water supplies. But, this study suggests, it is likely to be an even greater problem in developing parts of Asia -- Mongolia, Southeast Asia and parts of India and China -- where more than 400 gigawatts of new coal-fired power plant capacity are planned for operation by 2030. (By comparison: the largest coal-fired power plant in Ohio has the capacity to produce about 2,600 megawatts of electricity; the new plants planned for developing Asia are the equivalent of more than 150 similar facilities.) That increasing power production will itself be part of the problem, the researchers found, creating greater demand for water at the same time that climate change significantly limits the supply. "Capacity expansion and climate change combined are going to reduce the water available to cool power plants," said Yaoping Wang, lead author of the study and a former doctoral student at Ohio State.
Wang, now a research assistant professor at The University of Tennessee, did some of this research while on a fellowship at the International Institute for Applied Systems Analysis in Austria. Cooling is critical to a plant's ability to operate -- without it, machinery can overheat, causing a shutdown that could disrupt the flow of electricity to homes and businesses and creating the potential for additional pollution. The researchers analyzed databases of existing and planned coal-fired power plants, and combined that information with high-resolution hydrological maps to evaluate the possible strain on water supplies throughout the region. Then they applied different climate scenarios -- increases in global temperature of 1.5, 2 and 3 degrees Celsius (2.7 to 5.4 degrees Fahrenheit) above pre-industrial levels, increases set out as milestones in the Paris Agreement, a 2016 international accord to address climate change. The researchers then considered different cooling systems and the potential use of post-combustion CO2 capture. The numbers showed that there simply would not be enough water to cool all the power plants, but there is also a lot of local variability, Wang said. The takeaway for agencies that plan and permit plants across developing Asia, she said, is that they must evaluate the renewable water available near each power plant, taking into account water use by other plants. Bielicki said this may require difficult decisions like reducing the number of planned power plants. "There's often a perceived tension between developing your economy and protecting the environment," he said. "Some of the results of this study are saying, 'Hey, we expect you're going to run into problems, so you should selectively change your plans, but also thin out your existing power plants, because as you're adding new power plants, you're creating more competition for the water. Your economy needs water, but your ecosystems and people need water, too.'"
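The kind of screening the researchers describe (comparing a plant's cooling-water needs with the renewable water available to it) can be illustrated with a back-of-the-envelope check. The withdrawal intensity and basin figures below are rough, assumed numbers for a recirculating-cooled coal plant, not values from the study:

```python
def cooling_water_gap(capacity_mw, capacity_factor, withdrawal_m3_per_mwh,
                      available_m3_per_year):
    """Compare annual cooling-water withdrawal of a plant with the renewable
    water available to it. Returns (annual demand, surplus or deficit) in m^3."""
    generation_mwh = capacity_mw * capacity_factor * 8760  # hours per year
    demand_m3 = generation_mwh * withdrawal_m3_per_mwh
    return demand_m3, available_m3_per_year - demand_m3

# Hypothetical 1,000 MW plant at a 60% capacity factor with recirculating
# cooling (~2.5 m^3/MWh withdrawal, an assumed figure), in a basin with
# 10 million m^3/year of allocatable water.
demand, gap = cooling_water_gap(1000, 0.6, 2.5, 10e6)
print(f"demand: {demand:.2e} m^3/yr, gap: {gap:.2e} m^3/yr")
```

A negative gap, as in this toy case, flags a plant that would compete with other users for scarce water; the study's point is that such gaps widen as both capacity and warming increase.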
Weather
2019
September 19, 2019
https://www.sciencedaily.com/releases/2019/09/190919142215.htm
Investments to address climate change are good business
An internationally respected group of scientists have urgently called on world leaders to accelerate efforts to tackle climate change. Almost every aspect of the planet's environment and ecology is undergoing changes in response to climate change, some of which will be profound if not catastrophic in the future.
"Acting on climate change," said lead author Prof Ove Hoegh-Guldberg from the ARC Centre of Excellence for Coral Reef Studies at the University of Queensland in Australia, "has a good return on investment when one considers the damages avoided by acting." The investment is even more compelling given the wealth of evidence that the impacts of climate change are happening faster and more extensively than projected, even just a few years ago. This makes the case for rapidly reducing greenhouse gas emissions even more compelling and urgent. Prof Hoegh-Guldberg explained the mismatch: "First, we have underestimated the sensitivity of natural and human systems to climate change, and the speed at which these changes are happening. Second, we have underappreciated the synergistic nature of climate threats -- with the outcomes tending to be worse than the sum of the parts. This is resulting in rapid and comprehensive climate impacts, with growing damage to people, ecosystems, and livelihoods." For example, sea-level rise can lead to higher water levels during storm events. This can create more damage. For deprived areas, this may exacerbate poverty, creating further disadvantage. Each risk may be small on its own, but a small change in a number of risks can lead to large impacts. Prof Daniela Jacob, co-author and Director of the Climate Services Centre (GERICS) in Germany, is concerned about these rapid changes -- especially about unprecedented weather extremes. "We are already in new territory," said Prof Jacob. "The 'novelty' of the weather is making our ability to forecast and respond to weather-related phenomena very difficult." These changes are having major consequences.
The paper updates a database of climate-related changes and finds that there are significant benefits from avoiding 2°C and aiming to restrict the increase to 1.5°C above pre-industrial global temperatures. Prof Rachel Warren from the Tyndall Centre at the University of East Anglia in the UK assessed projections of risk for forests, biodiversity, food, crops and other critical systems, and found very significant benefits of limiting global warming to 1.5°C rather than 2°C. "The scientific community has quantified these risks in order to inform policy makers about the benefits of avoiding them," Prof Warren stated. Since the Paris Agreement came into force, there has been a race to quantify the benefits of limiting warming to 1.5°C so that policy makers have the best possible information for developing the policy required to do so. Prof Warren continued: "If such policy is not implemented, we will continue on the current upward trajectory of burning fossil fuels and continuing deforestation, which will expand the already large-scale degradation of ecosystems. To be honest, the overall picture is very grim unless we act." A recent report from the United Nations projected that as many as a million species may be at risk of extinction over the coming decades and centuries. Climate change is not the only factor but is one of the most important ones. The urgency of responding to climate change is front of mind for Prof Michael Taylor, co-author and Dean of Science at the University of the West Indies. "This is not an academic issue; it is a matter of life and death for people everywhere.
That said, people from small island States and low-lying countries are in the immediate cross-hairs of climate change." "I am very concerned about the future for these people," said Professor Taylor. This urgency to act is further emphasized by the vulnerability of developing countries to climate change impacts, as pointed out by Francois Engelbrecht, co-author and Professor of Climatology at the Global Change Institute of the University of the Witwatersrand in South Africa. "The developing African countries are amongst those to be affected most in terms of impacts on economic growth in the absence of strong climate change mitigation," Prof Engelbrecht explains. Prof Hoegh-Guldberg reiterated the importance of the coming year (2020) in terms of climate action and the opportunity to strengthen emission-reduction pledges in line with the Paris Agreement of 2015. "Current emission reduction commitments are inadequate and risk throwing many nations into chaos and harm, with a particular vulnerability of poor peoples. To avoid this, we must accelerate action and tighten emission reduction targets so that they fall in line with the Paris Agreement. As we show, this is much less costly than suffering the impacts of 2°C or more of climate change." "Tackling climate change is a tall order. However, there is no alternative from the perspective of human well-being -- and too much is at stake not to act urgently on this issue."
Weather
2019
September 18, 2019
https://www.sciencedaily.com/releases/2019/09/190918122506.htm
Scientists forecasted late May tornado outbreak nearly 4 weeks in advance
A team of scientists reports that they accurately predicted the nation's extensive tornado outbreak of late May 2019 nearly four weeks before it began.
The team's study, detailing factors that went into the forecast, was published recently. "This is the first documented successful long-range forecast for an extended period of tornado activity in the U.S.," said lead author Victor Gensini, a professor of meteorology at Northern Illinois University. Gensini said extended-range predictions are the "new frontier of forecasting." "In our field, there's a big push to accurately predict all kinds of extreme weather events well in advance," Gensini said. "If we can better anticipate when and where these extreme events may be occurring, it gives us a better chance to mitigate their impacts. We think any additional lead time could be extremely valuable to emergency response teams, insurance companies, and numerous other sectors of industry." May 17 through May 29 proved to be an unusually active period of severe weather in the United States -- even for a time of the year known to produce violent storms. During the 13-day stretch, 374 tornadoes occurred, more than tripling the 1986-2018 average of 107 for this period. In total, 757 tornado warnings were issued by NOAA's National Weather Service, and seven fatalities were reported. The outbreak contributed significantly to the second-highest monthly (E)F1+ tornado count (220) on record for May since reliable tornado counts began in the early 1950s. The central and southern Great Plains, along with the lower Great Lakes region, including Pennsylvania and Ohio, were particularly hard hit by the tornadic storms. Five years ago, Gensini and colleagues formed an Extended Range Tornado Activity Forecast (ERTAF) team to conduct research on sub-seasonal, or extended-range, forecasting. Its current members include Paul Sirvatka of the College of DuPage and current study co-authors David Gold of IBM Global Business Services, John T. Allen of Central Michigan University and Bradford S.
Barrett of the United States Naval Academy. Studies in recent years by the team and other scientists used historical weather-pattern records to develop methodologies for predicting the likelihood of severe weather across the continental United States weeks in advance. From April 28 on, the ERTAF team highlighted the likelihood of an active period of severe weather three to four weeks into the future. The prediction was especially notable given the pre-season expectation of below-average frequencies of U.S. tornadoes due to the presence of weak El Niño conditions in the tropical Pacific Ocean. "It's important to note that this was a single successful extended-range forecast -- we're not going to get every one of these correct," Gensini said. "But our work does create a pathway to forecasting severe weather with these extended lead times. These are usually forecasts of opportunity, meaning that they are not always possible." Gensini said the ERTAF team, which posts forecasts on its website every Sunday evening during tornado season, has had many other successful forecasts that were two to three weeks in advance. They chose to publish this example because of the magnitude of the storms and the textbook nature of the chain of events. "This is the first extended-range forecast that has been fully scientifically dissected," Gensini said. "We wanted to make sure it's documented." The forecast process is complex.
It looks for signals in two atmospheric indices -- the Madden-Julian Oscillation, an eastward-moving disturbance of winds, rain and pressure, and the Global Wind Oscillation, a collection of climate and weather information that measures atmospheric angular momentum, or the degree of waviness in the jet stream. Recurring modes within both oscillations occasionally provide enhanced predictability of future potential for severe weather frequency, the researchers said. The conditions that resulted in the tornado outbreak began thousands of miles away as thunderstorms over the Indian Ocean and Maritime Continent. The storms progressed into the equatorial Pacific, leading to an enhancement of the jet stream -- a key signal the scientists were looking for. The jet stream then crashed like a wave, breaking over western North America into a wavy pattern. "This process often leads to a thermal trough over the western U.S. that connects downstream to a thermal ridge, creating a rollercoaster-like jet stream pattern," Gensini said. "Those types of weather patterns have long been known to be most favorable for tornado outbreaks." From beginning to end, the pattern progressed as the researchers expected. "It doesn't always happen that way, and we have a lot of work to do to make this methodology robust, but every year we learn something new," Gensini said.
Weather
2019
September 17, 2019
https://www.sciencedaily.com/releases/2019/09/190917133054.htm
Peatlands trap CO2, even during droughts
Although peatlands make up only 3% of the Earth's surface, they store one third of the carbon trapped in soils globally. Preserving peatlands is therefore of paramount importance for mitigating climate change, provided that these vulnerable environments are not themselves threatened by global warming.
To better determine this risk, two French scientists, including Vincent Jassey, a CNRS researcher at the Laboratoire Ecologie Fonctionnelle et Environnement (CNRS/Université Toulouse III -- Paul Sabatier/INP Toulouse), studied carbon uptake by the two main species of moss that make up the Le Forbonnet peatland in Frasne (Jura). They discovered that when temperatures were high, and also during droughts, the two Sphagnum species behaved in opposite ways: Sphagnum medium resists drought, whereas the photosynthesis of Sphagnum fallax is negatively impacted; conversely, in very hot but humid weather, photosynthesis, and thus carbon uptake, increases in Sphagnum fallax, whereas there is a negligible effect on photosynthesis in Sphagnum medium. In both cases, then, the peatland survives. These results show that peatlands can withstand future climate change, provided they are not disturbed. Making peatland conservation a priority would therefore help to limit the impacts of climate change in the future. The study was published on September 9, 2019.
Weather
2019
September 11, 2019
https://www.sciencedaily.com/releases/2019/09/190911101606.htm
'Planting water' is possible -- against aridity and droughts
The water regime of a landscape swings more and more between the extremes of drought and flooding. The type of vegetation and land use plays an important role in water retention and runoff. Together with scientists from the UK and the US, researchers from the Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB) have developed a mathematical model that can reflect the complex interplay between vegetation, soil and water regimes. They show, for example, that in beech forests water is increasingly cycled between soil and vegetation to increase evaporation to the atmosphere, while grass cover promotes groundwater recharge.
With the developed model EcH2o-iso the researchers can quantify where, how and for how long water is stored and released in the landscape. The model helps to better predict the effects of land-use changes on the water balance under changing climatic conditions. In drought-prone areas in particular, this knowledge can help to develop land use strategies that increase the landscape's resistance to climate change and protect water resources. "So far, the type of vegetation has been considered primarily with a view to preventing soil erosion. In view of more frequent extreme weather events such as droughts and floods, however, it is increasingly a question of which plants can be cultivated to control the retention or loss of water in the landscape," says Prof. Doerthe Tetzlaff, head of the study, leader of the research group "Landscape Ecohydrology" at IGB and Professor in Ecohydrology at the Humboldt Universitaet zu Berlin. Previous forecasting models often capture vegetation as a static element. Thus, the complex interactions between evapotranspiration -- the evaporation of water by plants and of soil and water surfaces -- and the physiological processes of plants could only be insufficiently understood. In this study, however, long-term data of direct vegetation measurements were also used (e.g. biomass production and transpiration). This improves the reliability of the models and their transferability. In the field, the models were tested with so-called conservative tracers. These are markers that can be used to determine the age and origin of the water. This is a novel approach to assess the effects of climate change on the water balance. In a region around Lake Stechlin in northern Germany, the researchers validated the model using field studies. They compared land areas with deciduous forest and grass cover.
The results of the field study show that grassland use leads to more groundwater recharge and that in beech forests more water is returned to the atmosphere by evapotranspiration. However, the effects are site-specific and depend on the respective hydroclimate, biogeography and landscape ecology. With the help of the EcH2o-iso model, however, these differences can be taken into account in the future and local as well as large scale forecast models can be created.
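The forest-versus-grass partitioning described above (more evapotranspiration under beech, more groundwater recharge under grass) can be illustrated with a toy two-parameter bucket model. This is a sketch only: the ET fraction and storage capacity below are invented illustrative values, not parameters of the EcH2o-iso model.

```python
# Toy daily water-balance "bucket" model: rainfall enters soil storage, a
# vegetation-dependent fraction evaporates, and any storage above capacity
# percolates to groundwater. The parameter values are invented.

def simulate(precip_mm, et_fraction, capacity_mm=50.0):
    """Return total evapotranspiration and groundwater recharge (mm)."""
    storage = et_total = recharge = 0.0
    for p in precip_mm:
        storage += p
        et = et_fraction * storage          # vegetation evaporates a share
        storage -= et
        et_total += et
        if storage > capacity_mm:           # overflow percolates downward
            recharge += storage - capacity_mm
            storage = capacity_mm
    return et_total, recharge

# Same rainfall, different vegetation: forest (high ET) vs grass (low ET).
rain = [20.0] * 30
forest_et, forest_gw = simulate(rain, et_fraction=0.6)
grass_et, grass_gw = simulate(rain, et_fraction=0.2)
```

With identical rainfall, the high-ET "forest" returns more water to the atmosphere while the low-ET "grass" bucket overflows to recharge, mirroring the qualitative result of the field study.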
Weather
2019
September 11, 2019
https://www.sciencedaily.com/releases/2019/09/190911121953.htm
The danger of heat and cold across Australia
Cold temperatures are not nearly as deadly as heat, with around 2% of all deaths in Australia related to heat, according to new research from the University of Technology Sydney.
The study, published today, found that cold weather had a much smaller impact (-0.4% nationwide) except in the coldest climate zone, where 3.6% of deaths could be linked to cold temperatures. "Accurately measuring temperature-related mortality is an important step towards understanding the impacts of climate change, particularly across different climate zones," says study author Dr Thomas Longden, from the UTS Centre for Health Economics Research and Evaluation. The study is the first to use a national data set of mortality records to calculate the number of deaths linked to heat and cold in Australia. A key part of the analysis was estimating temperature-related deaths across six climate zones. The climate zones range from areas with hot, humid summers in Northern Australia, to areas with mild summers and cold winters in Tasmania, ACT and parts of NSW and Victoria. Regions with warm, humid summers, including Brisbane, Coffs Harbour and the Gold Coast, had the highest proportion of deaths linked to heat (9.1%). The coldest climate zone, which encompasses Tasmania and the NSW and Victorian alpine regions, saw 3.6% of deaths attributed to cold temperatures and a 3.3% reduction in deaths during warmer months. The study also revealed that in some regions, particularly those with warm, humid summers, colder temperatures actually reduced deaths in comparison to the median temperature. "While the cold is more dangerous in the colder climate zones, in four of the six regions, there was a decrease in deaths during colder weather.
This is because most of the cold days in warmer climate zones are quite moderate," says Dr Longden. Previous studies that used data for Sydney, Melbourne and Brisbane have suggested that despite increasing temperatures due to climate change, there would be a net reduction in temperature-related deaths due to the reduction in cold-related deaths. However, this study reveals that nationwide there would be a net cost from climate change, as increased heat-related deaths would not be offset by a reduction in cold-related deaths in most climate zones. "Whether an increase in heat-related mortality is offset by a reduction in cold-related mortality is crucial to finding a net benefit or cost from climate change when using temperature-mortality relationships," Dr Longden says. "The main differences between the earlier studies and this one is the use of a national mortality data set, which allows for the analysis of differences between climate zones, and the reference temperature used to measure the relative risk of mortality," says Dr Longden.
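Attributable fractions like the "2% of deaths related to heat" figure above are typically derived from relative risks in a temperature-mortality model. A minimal sketch of that arithmetic follows; the relative risk, exposure counts and death totals are invented for illustration and are not values from the UTS study.

```python
# Sketch: converting a relative risk (RR) of death on hot days into a share
# of all deaths attributable to heat. All numbers below are invented.

def attributable_fraction(rr: float) -> float:
    """Population attributable fraction among the exposed: (RR - 1) / RR."""
    return (rr - 1.0) / rr

deaths_on_hot_days = 5000      # deaths occurring on heat-exposed days
total_deaths = 100000          # all deaths in the period
rr_heat = 1.7                  # assumed relative risk of death on hot days

# Deaths on hot days that would not have occurred at the reference
# temperature, as a share of all deaths in the period.
af = attributable_fraction(rr_heat)
heat_related_share = af * deaths_on_hot_days / total_deaths
```

With these invented inputs the heat-related share comes out near 2%, the same order as the nationwide figure reported in the study.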
Weather
2019
September 10, 2019
https://www.sciencedaily.com/releases/2019/09/190910154657.htm
Satellite data record shows climate change's impact on fires
"Hot and dry" are the watchwords for large fires. In just seconds, a spark in hot and dry conditions can set off an inferno consuming thick, dried-out vegetation and almost everything else in its path. While every fire needs a spark to ignite and fuel to burn, hot and dry conditions in the atmosphere play a significant role in determining the likelihood of a fire starting, its intensity and the speed at which it spreads. Over the past several decades, as the world has increasingly warmed, so has its potential to burn.
Since 1880, the world has warmed by 1.9 degrees Fahrenheit, with the five warmest years on record occurring in the last five years. Since the 1980s, the wildfire season has lengthened across a quarter of the world's vegetated surface, and in some places like California, fire has become nearly a year-round risk. 2018 was California's worst wildfire season on record, on the heels of a devastating 2017 fire season. In 2019, wildfires have already burned 2.5 million acres in Alaska in an extreme fire season driven by high temperatures, which have also led to massive fires in Siberia. Whether started naturally or by people, fires worldwide and the resulting smoke emissions and burned areas have been observed by NASA satellites from space for two decades. Combined with data collected and analyzed by scientists and forest managers on the ground, researchers at NASA, other U.S. agencies and universities are beginning to draw into focus the interplay between fires, climate and humans. "Our ability to track fires in a concerted way over the last 20 years with satellite data has captured large-scale trends, such as increased fire activity, consistent with a warming climate in places like the western U.S., Canada and other parts of Northern Hemisphere forests where fuels are abundant," said Doug Morton, chief of the Biospheric Sciences Laboratory at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Where warming and drying climate has increased the risk of fires, we've seen an increase in burning." High temperatures and low humidity are two essential factors behind the rise in fire risk and activity, affecting fire behavior from its ignition to its spread.
Even before a fire starts, they set the stage, said Jim Randerson, an Earth system scientist at the University of California, Irvine who studies fires both in the field and with satellite data. He and his colleagues studied the abundance of lightning strikes in the 2015 Alaskan fire season that burned a record 5.1 million acres. Lightning strikes are the main natural cause of fires. The researchers found an unusually high number of lightning strikes occurred, generated by the warmer temperatures that cause the atmosphere to create more convective systems -- thunderstorms -- which ultimately contributed to more burned area that year. Hotter and drier conditions also set the stage for human-ignited fires. "In the Western U.S., people are accidentally igniting fires all the time," Randerson said. "But when we have a period of extreme weather, high temperatures, low humidity, then it's more likely that typical outdoor activity might lead to an accidental fire that quickly gets out of control and becomes a large wildfire." For example, in 2018 sparks flying from hammering a concrete stake into the ground in 100-degree Fahrenheit heat and sparks from a car's tire rim scraping against the asphalt after a flat tire were the causes of California's devastatingly destructive Ranch and Carr Fires, respectively. These sparks quickly ignited the vegetation that was dried out and made extremely flammable by the same extreme heat and low humidity, which research also shows can contribute to a fire's rapid and uncontrollable spread, said Randerson.
The same conditions make it more likely for agricultural fires to get out of control. A warming world also has another consequence that may be contributing to fire conditions persisting over multiple days where they otherwise might not have in the past: higher nighttime temperatures. "Warmer nighttime temperatures allow fires to burn through the night and burn more intensely, and that allows fires to spread over multiple days where previously, cooler nighttime temperatures might have weakened or extinguished the fire after only one day," Morton said. Hot and dry conditions that precede fires can be tempered by rain and moisture circulating in the atmosphere. On time scales of months to years, broader climate patterns move moisture and heat around the planet. Monitoring these systems with satellite observations allows researchers to begin to develop computer models for predicting whether an upcoming fire season in a given region will be light, average or extreme. The most important of these indicators are sea surface temperatures in the Pacific Ocean that govern the El Niño Southern Oscillation (ENSO). "ENSO is a major driver of fire activity across multiple continents," said Randerson, who along with Morton and other researchers has studied the relationship between El Niño events and fire seasons in South America, Central America, parts of North America, Indonesia, Southeast Asia and equatorial Asia. "The precipitation both before the fire season and during the fire season can be predicted using sea surface temperatures that are measured by NASA and NOAA satellites." An ongoing project, said Randerson, is to now extend that prediction capability globally to regions that are affected by other ocean-climate temperature changes and indicators. In studying the long-term trends of fires, human land management is as important to consider as any other factor.
Globally, someplace on Earth is always on fire -- and most of those fires are set by people, either accidentally in wildlands, or on purpose, for example, to clear land or burn agricultural fields after the harvest to remove crop residues. But not all fires behave the same way. Their behavior depends on the fuel type and how people are changing the landscape. While fire activity has gotten worse in northern latitude forests, research conducted by Randerson and Morton has shown that despite climate conditions that favor fires, the number of fires in grassland and savanna ecosystems worldwide is declining, contributing to an overall decline in global burned area. The decline is due to an increased human presence creating new cropland and roads that serve as fire breaks and motivate the local population to fight these smaller fires, said Morton. "Humans and climate together are really the dual factors that are shaping the fires around the world. It's not one or the other," Randerson said. Fires impact humans and climate in return. For people, beyond the immediate loss of life and property, smoke is a serious health hazard when small soot particles enter the lungs. Long-term exposure has been linked to higher rates of respiratory and heart problems. Smoke plumes can travel for thousands of miles, affecting air quality for people far downwind of the original fire. Fires also pose a threat to local water quality, and the loss of vegetation can lead to erosion and mudslides afterwards, which have been particularly bad in California, Randerson said. For the climate, fires can directly and indirectly increase carbon emissions to the atmosphere. While they burn, fires release carbon stored in trees or in the soil. In some places like California or Alaska, additional carbon may be released as the dead trees decompose, a process that may take decades because dead trees will stand like ghosts in the forest, decaying slowly, said Morton.
In addition to releasing carbon as they decompose, the dead trees no longer act as a carbon sink by pulling carbon dioxide out of the atmosphere. In some areas like Indonesia, Randerson and his colleagues have found that the radiocarbon age of carbon emissions from peat fires is about 800 years, which is then added to the greenhouse gases in the atmosphere that drive global warming. In Arctic and boreal forest ecosystems, fires burn organic carbon stored in the soils and hasten the melting of permafrost, which releases methane, another greenhouse gas, when thawed. Another area of active research is the mixed effect of particulates, or aerosols, in the atmosphere on regional climates due to fires, Randerson said. Aerosols can be dark like soot, often called black carbon, absorbing heat from sunlight while in the air, and when landing on and darkening snow on the ground, accelerating its melt, which affects both local temperatures -- raising them since snow reflects sunlight away -- and the water cycle. But other aerosol particles can be light colored, reflecting sunlight and potentially having a cooling effect while they remain in the atmosphere. Whether dark or light, according to Randerson, aerosols from fires may also have an effect on clouds that makes it harder for water droplets to form in the tropics, and thus reduce rainfall -- and increase drying. Fires of all types reshape the landscape and the atmosphere in ways that can resonate for decades. Understanding both their immediate and long-term effects requires long-term global data sets that follow fires from their detection to mapping the scale of their burned area, to tracing smoke through the atmosphere and monitoring changes to rainfall patterns. "As climate warms, we have an increasing frequency of extreme events. It's critical to monitor and understand extreme fires using satellite data so that we have the tools to successfully manage them in a warmer world," Randerson said.
Weather
2019
September 5, 2019
https://www.sciencedaily.com/releases/2019/09/190905080114.htm
Extreme weather events linked to poor mental health
People whose homes are damaged by storms or flooding are significantly more likely to experience mental health issues such as depression and anxiety, according to new research.
The study, led by the University of York and the National Centre for Social Research, found that the risk to mental health associated with experiencing weather damage to your home is similar to the risk to mental health associated with living in a disadvantaged area. People with weather-damaged homes are more likely to experience poor mental health even when the damage is relatively minor and does not force them to leave their homes, the study suggests. With scientists saying climate change is likely to increase the frequency and intensity of storms and floods in the UK, emergency planning for extreme weather needs to include mental health support for people affected, the researchers conclude. The researchers analysed data from a large national mental health survey called the Adult Psychiatric Morbidity Survey (APMS). The APMS is the primary source of information on the mental health of people living in England and assesses mental disorders using diagnostic criteria. Survey fieldwork took place throughout 2014 and included a question which asked participants if their home had been damaged by wind, rain, snow or flood in the six months prior to interview -- this period included December 2013 to March 2014, which saw severe winter storms and extensive flooding in the UK. Over 4.2 million flood warnings were issued and over 10,000 residential properties were flooded over these months. Taking other factors known to increase the risk of poor mental health into account -- such as social disadvantage, debt and poor physical health -- the researchers found that people who had experienced storm and flood damage to their homes were about 50% more likely to experience poorer mental health. Lead author of the study, Professor Hilary Graham, from the Department of Health Sciences at the University of York, said: "This study shows that exposure to extreme or even moderate weather events may result in 'psychological casualties' with significant impacts on mental health. "This is
reflective of the huge impact storms and flooding have on people's lives as alongside the physical damage to homes and businesses, there is the emotional damage to the sense of security that many people derive from their home." The number of properties in the UK exposed to at least a 1 in 75-year flood risk is predicted to increase by 41% under a 2°C temperature rise and by 98% under a 4°C temperature rise. Professor Graham added: "With extreme weather events on the rise due to climate change, environmental and health policies need to be brought much more closely together. This means recognising that flood protection policies are also health protection policies and that better protecting communities from floods is also an investment in protecting their mental health." Julie Foley, Director of Flood Risk Strategy & National Adaptation at the Environment Agency, said: "The impact of flooding on people is devastating, and can last long after the flood waters have gone away. People can be out of their homes for months or even years, and the impacts are even wider if businesses, schools and transport routes are affected. This research highlights how the consequences of flooding can have a significant impact on mental health wellbeing. "Our flood defences increase protection to thousands of homes around the country but we can never entirely eliminate the risk of flooding, which is why it's crucial to know how to protect yourself when it hits."
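A "1 in 75-year" flood risk means roughly a 1/75 probability in any single year, and those odds compound over time. A quick sketch of that arithmetic (assuming, as a simplification, that years are independent):

```python
# Probability of at least one flood over a multi-year horizon, given an
# annual exceedance probability. The 30-year horizon is an arbitrary
# illustrative choice (roughly the length of a mortgage).

def prob_at_least_one(annual_prob: float, years: int) -> float:
    """1 - P(no flood in any year), assuming independent years."""
    return 1.0 - (1.0 - annual_prob) ** years

p30 = prob_at_least_one(1.0 / 75.0, 30)  # roughly a one-in-three chance
```

So a property at the 1-in-75 level faces about a one-in-three chance of at least one flood over 30 years, which helps explain why even "moderate" risk levels matter for the mental-health burden described above.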
Weather
2019
September 4, 2019
https://www.sciencedaily.com/releases/2019/09/190904153958.htm
Underwater soundscapes reveal differences in marine environments
Storms, boat traffic, animal noises and more contribute to the underwater sound environment in the ocean, even in areas considered protected, a new study from Oregon State University shows.
Using underwater acoustic monitors, researchers listened in on Stellwagen Bank National Marine Sanctuary off the coast of Boston; Glacier Bay National Park and Preserve in Alaska; National Park of American Samoa; and Buck Island Reef National Monument in the Virgin Islands. They found that the ambient sounds varied widely across the sites and were driven by differences in animal vocalization rates, human activity and weather. The findings demonstrate that sound monitoring is an effective tool for assessing conditions and monitoring changes, said Samara Haver, a doctoral candidate in the College of Agricultural Sciences at OSU and the study's lead author. "This is a relatively economical way for us to get a ton of information about the environment," said Haver, who studies marine acoustics and works out of the Cooperative Institute for Marine Resources Studies, a partnership between OSU and the National Oceanic and Atmospheric Administration at the Hatfield Marine Science Center in Newport. "Documenting current and potentially changing conditions in the ocean soundscape can provide important information for managing the ocean environment." The findings were published recently. Passive acoustic monitoring is seen as a cost-effective and low-impact method for monitoring the marine environment. The researchers' goal was to test how effective acoustic monitoring would be for long-term assessment of underwater conditions. "Ocean noise levels have been identified as a potential measure for effectiveness of conservation efforts, but until now comparing sound across different locations has been challenging," Haver said. "Using equipment that was calibrated across all of the sites, we were able to compare the sound environments of these diverse areas in the ocean." The researchers collected low frequency, passive acoustic recordings from each of the locations between 2014 and 2018.
They compared ambient sounds as well as sounds of humpback whales, a species commonly found in all four locations. The inclusion of the humpback whale sounds -- mostly songs associated with mating in the southern waters, and feeding or social calls in the northern waters -- gives researchers a way to compare the sounds of biological resources across all the soundscapes, Haver said. The researchers found that ambient sound levels varied across all four study sites and sound levels were driven by differences in animal vocalization rates, human activity and weather. The highest sound levels were found in Stellwagen Bank during the winter/spring, driven by higher animal sound rates, vessel activity and high wind speeds. The lowest sound levels were found in Glacier Bay in the summer. "Generally, the Atlantic areas were louder, especially around Stellwagen, than the Pacific sites," Haver said. "That makes sense, as there is generally more human-made sound activity in the Atlantic. There also was a lot of vessel noise in the Caribbean." The researchers also were able to hear how sound in the ocean changes before, during and after hurricanes and other severe storms; the monitoring equipment captured Hurricanes Maria and Irma in the Virgin Islands and Tropical Cyclone Winston in American Samoa. Ultimately, the study provides a baseline for these four regions and can be used for comparison over time. Documenting current and potentially changing conditions in the ocean soundscape can provide important information for managing the ocean environment, particularly in and around areas that have been designated as protected, Haver said.
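Comparing calibrated recordings across sites, as described above, ultimately comes down to computing sound pressure levels in decibels from the recorded pressure signal. A minimal sketch of that calculation follows; the reference pressure is the conventional underwater standard (1 µPa), while the sample pressure values are invented, not the study's data.

```python
import math

# Sketch: root-mean-square sound pressure level (SPL) in dB re 1 microPa.
# The sample pressures below are invented; real values would come from
# calibrated hydrophone recordings.

def spl_db(pressures_upa, reference_upa=1.0):
    """SPL in dB: 20 * log10(p_rms / p_ref)."""
    rms = math.sqrt(sum(p * p for p in pressures_upa) / len(pressures_upa))
    return 20.0 * math.log10(rms / reference_upa)

quiet_site = [100.0, -120.0, 90.0, -110.0]      # lower-amplitude signal
noisy_site = [1000.0, -1200.0, 900.0, -1100.0]  # 10x amplitude: +20 dB

difference = spl_db(noisy_site) - spl_db(quiet_site)
```

A tenfold increase in pressure amplitude corresponds to a 20 dB higher level, which is why calibrated, like-for-like equipment across sites matters so much for these comparisons.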
Weather
2019
August 30, 2019
https://www.sciencedaily.com/releases/2019/08/190830112821.htm
Deep snow cover in the Arctic region intensifies heat waves in Eurasia
Persistent abnormally hot weather can cause negative impacts on human health, agriculture, and natural environments. A heat wave -- a spell of hot days with the mercury rising much higher than the average temperature -- has been reported more frequently in Europe and Northeast Asia in recent years.
"Internal atmosphere-land interactions in Eurasia are believed to be an important factor in triggering abnormal summer temperatures. However, the exact reasons for such interactions causing heat waves remain largely unclear," says Associate Professor Tomonori Sato of the research team. In the present study, the researchers analyzed 6,000 patterns in the spatial distribution of summer temperatures in Eurasia, and succeeded in dividing past summer temperature variations into two groups -- one attributable to global warming and the other attributable to natural changes. The former exhibited the rising temperatures in Eurasia since around 1990, while the latter showed the spatial distribution of low and high temperatures that correspond to the meandering of the westerlies. The distribution shows a wave train-like structure -- which demonstrates that when some regions experienced abnormally high temperatures, the surrounding areas were hit by abnormally low temperatures. The researchers then discovered that when Western Russia had a deeper-than-usual snow cover in late winter and spring, the wave train-like distribution of temperatures appeared. When deeper snow accumulation occurs, more moisture is retained in the soil after snowmelt. The soil moisture then prevents the summer temperature from rising, which is a likely cause for making the westerlies meander, thus causing the surrounding regions to experience high temperatures.
Weather
2019
August 28, 2019
https://www.sciencedaily.com/releases/2019/08/190828100544.htm
Europe warming faster than expected due to climate change
Climate change is increasing the number of days of extreme heat and decreasing the number of days of extreme cold in Europe, posing a risk for residents in the coming decades, according to a new study.
Temperatures in Europe have hit record highs this summer, passing 46.0 degrees Celsius (114.8 degrees Fahrenheit) in southern France. New research in an AGU journal finds parts of Europe are warming faster than climate models project. "Even at this regional scale over Europe, we can see that these trends are much larger than what we would expect from natural variability. That's really a signal from climate change," said Ruth Lorenz, a climate scientist at the Swiss Federal Institute of Technology in Zurich, Switzerland, and lead author of the new study. Extreme heat is dangerous because it stresses the human body, potentially leading to heat exhaustion or heat stroke. Scientists knew climate change was warming Europe, but they mostly studied long-term changes in extreme temperatures. The new study looked at observational data to evaluate whether the climate models used for regional projections can reproduce observed trends. In the new study, Lorenz and her colleagues used observational data taken by European weather stations from 1950-2018 and then analyzed the top 1% of the hottest heat extremes and highest humidity extremes, and the top 1% coldest days during that period. "We looked further at the hottest day or coldest night per year, so for each year we looked for the maximum/minimum value and how these changed over time," Lorenz said. They found the number of extreme heat days in Europe has tripled since 1950, while the number of extreme cold days decreased by factors of two or three depending on the region. Extremely hot days have become hotter by an average of 2.30 degrees Celsius (4.14 degrees Fahrenheit), while extremely cold days have warmed by 3.0 degrees Celsius (5.4 degrees Fahrenheit) on average.
The hottest days and coldest nights warmed significantly more than their corresponding summer and winter mean temperatures. Individual regions throughout Europe experienced drastically different temperature trends, which makes it difficult to compare the average European temperatures to specific stations' extremes, according to the authors. In Central Europe, the extremes warmed by 0.14 degrees Celsius (0.25 degrees Fahrenheit) per decade more than the summer mean, equivalent to an almost 1.0 degree Celsius (1.8 degree Fahrenheit) increase more than the average over the whole study period, according to Lorenz. More than 90% of the weather stations studied showed the climate was warming, a percentage too high to purely be from natural climate variability, according to the researchers. The results also showed that the region was warming faster than climate models projected. Some regions experienced higher extremes than expected and some had lower extremes than expected. "In the Netherlands, Belgium, France, the model trends are about two times lower than the observed trends," said Geert Jan van Oldenborgh, a climate analyst at the Royal Netherlands Meteorological Institute in De Bilt, Netherlands, who was not connected to the new study. "We're reaching new records faster than you'd expect." European summers and winters will only grow hotter in the coming years as climate change accelerates, impacting cities and people unprepared for rising temperatures, according to the study authors. "Lots of people don't have air conditioning for instance and it makes this really important," Lorenz said. "We expected results based on modeling studies but it's the first time we see it in what we've observed so far."
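The per-year extreme analysis Lorenz describes (take the hottest day of each year, then look at how those maxima change over time) can be sketched with synthetic data. The station readings below are invented and warm by 0.5 degrees Celsius per year by construction, purely to show the mechanics.

```python
# Sketch of the "hottest day per year" analysis: take the annual maximum
# temperature, then fit a least-squares linear trend across years. The
# synthetic data below warms by 0.5 C per year by design.

def annual_max(daily_temps_by_year):
    return {year: max(temps) for year, temps in daily_temps_by_year.items()}

def linear_trend(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Invented station data: each year's hottest day is 0.5 C hotter than last.
data = {2000 + i: [20.0, 25.0, 30.0 + 0.5 * i] for i in range(10)}
maxima = annual_max(data)
years = sorted(maxima)
slope = linear_trend(years, [maxima[y] for y in years])
```

The real study works with daily observations from hundreds of stations over 1950-2018, but the per-station computation reduces to this: annual extremes, then a trend.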
Weather
2019
August 28, 2019
https://www.sciencedaily.com/releases/2019/08/190828080540.htm
Scientists call for infiltration to be better incorporated into land surface models
Soil scientists can't possibly be everywhere at once to study every bit of soil across the planet. Plus, soils are constantly changing.
Conditions like weather and land use have a major impact on soil over time. So, to understand everything about soil, we would need to be continuously studying soil around the world. Since this isn't possible, soil scientists are turning to math to predict what happens at the soil's surface. Soil models -- just like economic models -- are helpful to predict trends and make suggestions. An example might be the impact of climate change on water processes in the soil. Models help fill in the gaps of measured data. Since soil is a complex environment, a soil model consists of many pieces that represent different processes. One important aspect of soil models -- how water interacts with soil at the land surface -- was recently discussed by a group of almost 30 scientists. Their work was recently published. Water infiltration at the land surface is a crucial area of study. Infiltration refers to what fraction of the water is getting absorbed by the soil. Being able to predict if precipitation will run over the soil surface or soak in is crucial in land management decisions. It affects aspects of land management like erosion control. It is also important in making sure we have a safe and clean water supply. Land surface models can help scientists predict and simulate the water and energy cycles from the soil's surface into the atmosphere. Each piece of a land surface model is important. The study team found that information about infiltration warrants more attention in land surface models. In order to be truly useful, land surface models need to include loads of information. This includes soil structure, soil moisture and temperature, precipitation, terrain, plants, and more. Scientists use the information to calculate the Earth's climate or see how land use changes may affect it. Harry Vereecken, Forschungszentrum Jülich in Germany, was the lead author of this effort.
"The review found important gaps in the current treatment of infiltration processes in land surface models," says Vereecken. "Current models don't account for the effect of structural properties on soil water dynamics. Also, we saw the lack of a consistent framework to upscale infiltration processes from different scales and the large diversity in approaches to describing them." The group is calling on scientists to work together and lend their skills to better include this information in land surface models. This is so the models better reflect the reality of what's happening at the Earth's surface. Their review was a way to compile scientific research from over a long period of time and give suggestions about where soil scientists should focus their efforts next. In looking over lots of research, they found there's no consistent way to predict infiltration. They also found that some aspects of soil that affect infiltration are often ignored. "The climate and Earth sciences community typically operate at a larger scale than the soil science community," Vereecken explains. "Soil scientists have mostly worked at smaller scales, such as plot to field scale to study processes and often did not include atmospheric processes in their studies. We wanted to write about the importance of these communities coming together. This is the first review ever that addressed the handling of infiltration processes in these models." He adds that they hope their work provides a common understanding about how infiltration processes are dealt with in land surface models. While it can be difficult to quantify these complex processes and combine them into larger models, it's important in studying the state of the planet. Both groups need each other. Without soil scientists working on a smaller scale, others won't have data for their models. "Because soil exerts a key control on climate-related processes, it can add relevance to the research we are doing as soil scientists," Vereecken says.
"We hope this can serve as a kind of reference paper for other scientists and connect those that work on different aspects of land surface models."
Weather
2019
August 28, 2019
https://www.sciencedaily.com/releases/2019/08/190828080536.htm
Using artificial intelligence to track birds' dark-of-night migrations
On many evenings during spring and fall migration, tens of millions of birds take flight at sunset and pass over our heads, unseen in the night sky. Though these flights have been recorded for decades by the National Weather Service's network of constantly scanning weather radars, until recently these data have been mostly out of reach for bird researchers.
That's because the sheer magnitude of information and lack of tools to analyze it made only limited studies possible, says artificial intelligence (AI) researcher Dan Sheldon at the University of Massachusetts Amherst. Ornithologists and ecologists with the time and expertise to analyze individual radar images could clearly see patterns that allowed them to discriminate precipitation from birds and study migration, he adds. But the massive amount of information -- over 200 million images and hundreds of terabytes of data -- significantly limited their ability to sample enough nights, over enough years and in enough locations to be useful in characterizing, let alone tracking, seasonal, continent-wide migrations, he explains. Clearly, a machine learning system was needed, Sheldon notes, "to remove the rain and keep the birds." Now, with colleagues from the Cornell Lab of Ornithology and others, senior authors Sheldon and Subhransu Maji and lead author Tsung-Yu Lin at UMass's College of Information and Computer Sciences unveil their new tool "MistNet." In Sheldon's words, it's the "latest and greatest in machine learning" to extract bird data from the radar record and to take advantage of the treasure trove of bird migration information in the decades-long radar data archives. The tool's name refers to the fine, almost invisible "mist nets" that ornithologists use to capture migratory songbirds. MistNet can "automate the processing of a massive data set that has measured bird migration over the continental U.S. for over two decades," Sheldon says. "This is a really important advance. Our results are excellent compared with humans working by hand. It allows us to go from limited 20th-century insights to 21st-century knowledge and conservation action."
He and co-authors point out, "Deep learning has revolutionized the ability of computers to mimic humans in solving similar recognition tasks for images, video and audio." For this work, supported in part by a National Science Foundation grant to Sheldon to design and test new mathematical approaches and algorithms for such applications, the team conducted a large-scale validation of MistNet and competing approaches using two evaluation data sets. Their new paper, which appears in a current journal issue, also presents several case studies to illustrate MistNet's strengths and flexibility. MistNet is based on neural networks for images and includes several architecture components tailored to the unique characteristics of radar data, the authors point out. Radar ornithology is advancing rapidly and leading to significant discoveries about continent-scale patterns of bird movements, they add. The team made maps of where and when migration occurred over the past 24 years and animated these to illustrate, for example, "the most intensive migration areas in the continental United States," Sheldon explains -- a corridor roughly along and just west of the Mississippi River. MistNet also allows researchers to estimate flying velocity and traffic rates of migrating birds. MistNet, designed to address one of the "long-standing challenges in radar aero-ecology," the authors note, comes just in time to help scientists better use not only existing weather radar data, but the "explosion" of large new data sets generated by citizen science projects such as eBird, animal tracking devices and earth observation instruments, say Sheldon and colleagues. "We hope MistNet will enable a range of science and conservation applications. For example, we see in many places that a large amount of migration is concentrated on a few nights of the season," Sheldon says. "Knowing this, maybe we could aid birds by turning off skyscraper lights on those nights."
Another question the ornithologists are interested in is the historical timing, or phenology, of bird migration and whether it, and timely access to food, have shifted with climate change.
Weather
2019
August 27, 2019
https://www.sciencedaily.com/releases/2019/08/190827123525.htm
Positives of climate change? Agricultural, economic possibilities for West Virginia
Depending on your side of the aisle, climate change either elicits doomsday anxiety or unabashed skepticism.
Jason Hubbart, director of the Institute of Water Security and Science at West Virginia University, takes a more centered approach. He's studied the indisputable changing patterns in West Virginia's climate. And, believe it or not, there is at least one silver lining stemming from changing climate, he insists: the growing season is getting longer. "Our future climates in West Virginia are likely to be more conducive to agricultural production," said Hubbart, a professor of hydrology and water quality in the Davis College of Agriculture, Natural Resources and Design. "We should plan for that now." His recently published research documents those patterns. In other words, West Virginians are now, on average, seeing cooler summers, warmer winters and wetter weather. Corresponding with those trends, big changes have occurred in agriculture. Yields for hay and corn, which have historically been bread-and-butter resources for the state, have increased, yet 23 percent slower than the national average; however, other crops, including winter wheat and soybeans, have increased yields 15 percent faster than the national average. Based on his findings, "it's time to rethink farming in West Virginia," said Hubbart, who grew up on a 2,000-acre dairy farm near Spokane, Washington. Hubbart breaks down why traditional West Virginia crops are floundering while others, previously not prominent, have gained potential. "Some areas of West Virginia are too drenched or flooded all the time," he said. "Because it's wetter, we've seen a decline in crops like hay and corn." An uptick in humidity -- a result of climate change in many regions -- plays a part in the dwindling performance of traditional West Virginia crops. More humidity lowers vapor-pressure deficit, which is the difference between the amount of moisture in the air and how much moisture the air can hold (i.e., saturation). When the air is saturated, water can condense out to form clouds, turn into precipitation and create dew or films of water over a plant's leaves.
More importantly, when the air is saturated (approaching 100 percent humidity), plants have a much more difficult time transpiring (moving water from the leaf to the atmosphere). Therefore, the plants also have difficulty staying cool, transporting nutrients and photosynthesizing. For many historic agricultural crops, future climates may result in lower productivity. Corn and alfalfa, for instance, need a lot of water. Those plants use energy to create sugars and biomass. If it gets too hot, productivity can slow because they cannot move water up the plant and out the stomates, Hubbart explained. Ultimately, though West Virginia is seeing more precipitation, the increased humidity slows the movement of water from the plant to the atmosphere. Yet that doesn't mean death to West Virginia agriculture. Crops that don't require as much water (through transpiration), or that thrive in short winters, long summers or moderate temperatures, could help turn the state around, Hubbart believes. The winter season has shrunk by as much as 20 days, according to Hubbart's research, and the minimum (and winter) temperatures have become warmer. The growing season itself has increased by approximately 13 days. "Winter wheat and soybean crops are just a couple of examples of future agricultural investment," Hubbart said. "Those crops, and many broadleafs, do well in short winters. Basil, specialty teas, specialty vegetables, those are plants that have had trouble growing here historically, but now, and in the future, they may fare better. We can diversify our crops more.
West Virginia should be thinking strategically about which crops to grow in what locations." Outcomes in his research also suggest the possibility of double-cropping, meaning that the growing seasons are extending long enough to raise one crop and harvest it and then raise another crop and harvest it, too, within the same year. "Doing that, obviously, increases economic revenue and provides local food supplies that could greatly improve access to fresh vegetables for our citizens," Hubbart said. "That's more than just a bit of good news." Hubbart's findings come from more than 90 years' worth of observed weather data from climate stations on the ground throughout West Virginia and Appalachia. Whereas some research relies on climate models built from information for more distant locations -- and on predictions from those models that often aren't accurate -- these findings are based on actual observed long-term West Virginia data, he said. While other climate research predicts drier climates and the emergence of food deserts, Hubbart's research indicates quite the opposite. "West Virginia is a beautiful state with so much to look forward to," he said. "Our great scientists are making incredible progress in agriculture, food deserts, agricultural economics, etc. We need to celebrate our current successes and how we can use those successes in what I view as a very bright agricultural future for our state." "My results indicate that future climates will facilitate higher productivity and new crops, both of which could create an economic boom for West Virginia, reduce food desert issues and broadly improve the human condition in our state."
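The vapor-pressure deficit Hubbart describes can be made concrete with a short calculation. The sketch below is a minimal Python illustration using the standard Tetens approximation for saturation vapor pressure; it is generic meteorology, not code or parameters from Hubbart's study.

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Tetens approximation: saturation vapor pressure (kPa) over water."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit_kpa(temp_c: float, rel_humidity_pct: float) -> float:
    """VPD = saturation vapor pressure minus actual vapor pressure.
    As humidity rises toward saturation, VPD falls toward zero and
    transpiration slows -- the effect described in the article."""
    es = saturation_vapor_pressure_kpa(temp_c)
    ea = es * rel_humidity_pct / 100.0
    return es - ea
```

For example, air at 25 degrees C holds far more "drying power" at 50 percent humidity than at 95 percent, and at full saturation the deficit drops to zero, which is why transpiration stalls.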
Weather
2019
August 27, 2019
https://www.sciencedaily.com/releases/2019/08/190827084721.htm
Chipping away at how ice forms could keep windshields, power lines ice-free
How does ice form? Surprisingly, science hasn't fully answered that question. Differences in ice formation on various surfaces still aren't well understood, but researchers today will explain their finding that the arrangements that surface atoms impose on water molecules are the key. The work has implications for preventing ice formation where it isn't wanted (windshields, power lines) and for promoting ice formation where it is (food or organ preservation). The results could also help improve weather prediction.
The researchers will present their findings today at the American Chemical Society (ACS) Fall 2019 National Meeting & Exposition. "We discovered that if we look at the liquid water structure where it contacts the surface, we can start to understand and predict whether a given surface will promote or inhibit ice formation," says Sapna Sarupria, Ph.D., the project's principal investigator. "We're working with collaborators to use this information to better understand the role of ice in weather and to design surfaces that are good or bad for ice formation. Wouldn't it be great to have a windshield that doesn't let ice stick to it in winter?" Sarupria's team uses computers to study molecular simulations of surfaces and ice formation. Unlike the messier real world, this controlled setting gives her the ability to examine the impact of a change in just one surface parameter -- or even just one atom -- at a time. The researchers then correlate the findings with those of experimentalists who work with real-world materials, including silver iodide or minerals such as mica and kaolinite. Silver iodide is so effective at promoting ice formation that it's used for cloud seeding to stimulate rainfall during droughts. Ice formation, or nucleation, occurs when liquid water undergoes a phase transition to solid water. Water can also undergo other phase transitions, such as changing from ice back to a liquid, or to vapor. If these transitions take place in clouds, they can form raindrops and snow. "When you want to predict the weather, you need to know how these phase transitions happen, and that's essentially an open question," says Sarupria, who is at Clemson University. Often these changes occur in the presence of particles such as mineral dust in the atmosphere. The type and amount of dust determine the type of precipitation that occurs.
"We're trying to understand how different dust particle surfaces affect the transition of water from the liquid to the solid phase in clouds," she says. The researchers are now collaborating with experimentalists who study atmospheric phenomena to help them explain their results. "If we can model these phenomena, we may be able to better understand the role of ice in weather," she explains. Sarupria is also applying her understanding of water structure to design surfaces that can promote or inhibit ice formation. For example, to prevent damage during food storage or cryopreservation of organs, someone in the future could use the new knowledge to form ice at temperatures closer to 32 F, the freezing point of water, rather than at lower temperatures. This could be done by modifying the surface of the packaging or adding molecules to the solution for cryopreservation. "In other cases, such as windshields and power lines, you may not want ice to form," Sarupria says. "So we're trying to figure out how to make coatings or surfaces that won't let ice form, or if it forms, that won't let it stick." Her team is also trying to understand how natural antifreeze proteins help fish and other organisms survive in frigid conditions. "Ultimately, whether it's these proteins or dust particles, it all boils down to how they affect the water structure," she says. "We want to use this information to create a parameter that could help us quickly screen surfaces for their ice nucleation ability."
Weather
2019
August 22, 2019
https://www.sciencedaily.com/releases/2019/08/190822103834.htm
Rising summer heat could soon endanger travelers on annual Muslim pilgrimage
Over two million Muslim travelers just finished the annual religious pilgrimage to Mecca, Saudi Arabia, traveling during some of the country's hottest weather. New research finds pilgrims in future summers may have to endure heat and humidity extreme enough to endanger their health. The results can help inform policies that would make the trip safer for the several million people who make the pilgrimage each year, according to the study's authors.
Hajj, or Muslim pilgrimage, is one of the five pillars of the Muslim faith. It is an annual pilgrimage to Mecca, Saudi Arabia, that involves living in the hot weather conditions of Saudi Arabia. Muslims are expected to make the pilgrimage at least once in their lifetimes. Islam follows a lunar calendar, so the dates for Hajj change every year. But for five to seven years at a time, the trip falls over summer. A new study projecting future summer temperatures in the region around Mecca finds that as soon as 2020, summer days in Saudi Arabia could surpass the United States National Weather Service's extreme danger heat-stress threshold, at a wet-bulb temperature of 29.1 degrees C (84.3 degrees Fahrenheit). Wet-bulb temperature is a measurement combining temperature with the amount of moisture in the air. At the extreme danger threshold defined by the National Weather Service, sweat no longer evaporates efficiently, so the human body cannot cool itself and overheats. Exposure to these conditions for long periods of time, such as during Hajj, could cause heat stroke and possibly death. "When the Hajj happens in summer, you can imagine with climate change and increasing heat-stress levels conditions could be unfavorable for outdoor activity," said Elfatih Eltahir, a civil and environmental engineer at the Massachusetts Institute of Technology and co-author of the new study in an AGU journal. "Hajj may be the largest religious tourism event," Eltahir said. "We are trying to bring in the perspective of what climate change could do to such large-scale outdoor activity." Middle Eastern temperatures are rising because of climate change, and scientists project them to keep rising in the coming decades. In the new study, Eltahir and his colleagues wanted to know how soon and how frequently temperatures during summer Hajj would pass the extreme danger threshold.
The researchers examined historical climate models and used past data to create a projection for the future. In the past 30 years, they found that wet-bulb temperature surpassed the danger threshold 58 percent of the time, but never the extreme danger threshold. At the danger threshold, heat exhaustion is likely, and heat stroke is a potential threat from extended exposure. Passing the extreme danger threshold for extended periods of time means heat stroke is highly likely. The researchers then calculated how climate change is likely to impact wet-bulb temperature in Saudi Arabia in the future. They found that in the coming decades, pilgrims will have to endure extremely dangerous heat and humidity levels in years when Hajj falls over summer. Their projections estimate heat and humidity levels during Hajj will exceed the extreme danger threshold six percent of the time by 2020, 20 percent of the time between 2045 and 2053, and 42 percent of the time between 2079 and 2086. Climate change mitigation initiatives would make passing the threshold during these years less frequent: one percent by 2020, 15 percent of the time between 2045 and 2053, and 19 percent of the time between 2079 and 2086, according to the study. The study authors stress that their projections are meant not to cause anxiety among pilgrims but instead to help them adapt, and to help authorities plan for a safe Hajj. "These results are not meant to spread any fears, but they are meant to inform policies about climate change, in relation to both mitigation and adaptation," Eltahir said.
"There are ways people could adapt, including structural changes by providing larger facilities to help people perform Hajj as well as nonstructural changes by controlling the number of people who go." "They've provided a very compelling example of an iconic way that 2 to 3 million people per year can be really vulnerable to what to me is the biggest underrated climate hazard -- this combination of high temp and high humidity," said Radley Horton, a climate scientist at Columbia University's Lamont-Doherty Earth Observatory who was not involved with the study. "I believe as the century progresses, if we don't reduce our greenhouse gases, [this] could become every bit as much an existential threat as sea level rise and coastal flooding."
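For readers who want a feel for the wet-bulb measurement the study uses, here is a minimal Python sketch based on the Stull (2011) empirical approximation, which estimates wet-bulb temperature from air temperature and relative humidity near sea-level pressure. It is a standard textbook formula, not the method the study authors used.

```python
import math

def wet_bulb_stull(temp_c: float, rh_pct: float) -> float:
    """Stull (2011) empirical wet-bulb temperature (deg C) from dry-bulb
    temperature (deg C) and relative humidity (%), valid near sea-level
    pressure."""
    return (tem_c := temp_c) * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659)) \
        + math.atan(temp_c + rh_pct) \
        - math.atan(rh_pct - 1.676331) \
        + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct) \
        - 4.686035

# The NWS extreme-danger threshold cited in the study: 29.1 C wet-bulb.
EXTREME_DANGER_C = 29.1
```

For example, air at 40 degrees C and 60 percent relative humidity already yields a wet-bulb temperature of roughly 33 degrees C, well past the 29.1-degree extreme danger threshold, which is why hot, humid days are the concern rather than dry heat alone.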
Weather
2019
August 21, 2019
https://www.sciencedaily.com/releases/2019/08/190821142729.htm
Ocean temperatures turbocharge April tornadoes over Great Plains region
New research, published in a peer-reviewed journal, finds that ocean temperatures help drive April tornado activity over the U.S. Great Plains.
2019 has seen the second-highest number of January-to-May tornadoes in the United States since 2000, with several deadly outbreaks claiming more than 38 fatalities. Why some years are very active, whereas others are relatively calm, has remained an unresolved mystery for scientists and weather forecasters. Climate researchers from the IBS Center for Climate Physics (ICCP) in South Korea have found new evidence implicating a role for ocean temperatures in US tornado activity, particularly in April. Analyzing a large volume of atmospheric data and climate computer model experiments, the scientists discovered that a cold tropical Pacific and/or a warm Gulf of Mexico are likely to generate large-scale atmospheric conditions that enhance thunderstorms and a tornado-favorable environment over the Southern Great Plains. This particular atmospheric situation, with alternating high- and low-pressure centers located in the central Pacific, Eastern United States and over the Gulf of Mexico, is known as the negative Pacific North America (PNA) pattern. According to the new research, ocean temperatures can boost this weather pattern in April. The corresponding high pressure over the Gulf of Mexico then funnels quickly rotating moist air into the Great Plains region, which in turn fuels thunderstorms and tornadoes. "Previous studies have overlooked the temporal evolution of ocean-tornado linkages. We found a clear relationship in April, but not in May," says Dr. Jung-Eun Chu, lead author of the study and research fellow at the ICCP. "Extreme tornado occurrences in the past, such as those in April 2011, were consistent with this blueprint.
Cooler than normal conditions in the tropical Pacific and a warm Gulf of Mexico intensify the negative PNA, which then turbocharges the atmosphere with humid air and more storm systems," explains Axel Timmermann, Director of the ICCP and Professor at Pusan National University. "Seasonal ocean temperature forecasts for April, which many climate modeling centers issue regularly, may further help in predicting the severity of extreme weather conditions over the United States," says June-Yi Lee, Professor at Pusan National University and Coordinating Lead Author of the 6th Assessment Report of the Intergovernmental Panel on Climate Change. "How global warming will influence extreme weather over North America, including tornadoes, still remains unknown," says Dr. Chu. To address this question, the researchers are currently conducting ultra-high-resolution supercomputer simulations on the institute's new supercomputer, Aleph.
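At its simplest, the kind of ocean-tornado linkage the ICCP team reports is a correlation between an April ocean-temperature index and tornado activity. The Python sketch below illustrates that calculation with a plain Pearson correlation; the two short series are invented for illustration and are not data from the study.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical April series: tropical Pacific SST anomaly (deg C; negative
# means cooler than normal) vs. monthly US tornado count. A strongly negative
# correlation would be consistent with "cold Pacific -> more tornadoes."
sst_anomaly = [-0.8, 0.5, -0.3, 1.1, -1.2, 0.2]
tornado_count = [230, 140, 190, 110, 260, 170]
r = pearson_r(sst_anomaly, tornado_count)
```

The real analysis is far more involved (large-scale pressure patterns, model experiments, month-by-month evolution), but a seasonal index-versus-activity correlation is the intuition behind using April ocean forecasts to anticipate tornado seasons.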
Weather
2019
August 21, 2019
https://www.sciencedaily.com/releases/2019/08/190821115034.htm
Forecasting dusty conditions months in advance
Southwestern Kansas in the 1930s saw some of the worst dust storms ever recorded in the U.S., when apocalyptic clouds of heavy dust terrified and even killed people, livestock and wildlife.
Long ago, farmers phased out the kinds of practices that brought about the Dust Bowl, but dust still can harm health, agriculture and transportation while exacerbating environmental problems. Indeed, dust storms may increase as climate change causes drier conditions. (The National Oceanic and Atmospheric Administration asserts windblown dust storms increased 240% from 1990 to 2011 in the southwestern United States.) Today, a researcher at the University of Kansas has developed an advanced technique for forecasting dusty conditions months before they occur, promising transportation managers, climatologists and people suffering health issues much more time to prepare for dusty conditions. By contrast, common methods of predicting dust in the air only give a few days of advance warning. Bing Pu, assistant professor of geography & atmospheric science at KU, is lead author of a new paper describing the technique. "We use a statistical model constrained by observational data and the output of a state-of-the-art dynamic seasonal prediction model driven by observational information on Dec. 1," Pu said. "We found using our method, we actually can give a skillful prediction for the dustiness in springtime, one of the dustiest seasons in the U.S., over the Southwestern and Great Plains regions -- two of the dustiest areas in the U.S." Pu and her colleagues, Paul Ginoux and Sarah Kapnick of the NOAA Geophysical Fluid Dynamics Laboratory, and Xiaosong Yang of NOAA and the University Corporation for Atmospheric Research, were able to predict "variance," or days when there was more or less dust in the air than average. "Over the southwestern U.S., our model captured the variance of the dustiness over the time period from 2004 to 2016 by about 63%," Pu said. "Over the Great Plains, about 71% of the variance is explained." Pu said factors influencing amounts of dust in the air can include surface winds, precipitation and the bareness of the landscape.
These kinds of data were incorporated as key variables into the prediction model. According to Pu and her collaborators, high levels of airborne dust can affect individual people, transport systems and agricultural production. "Small dust particles are very easily taken into your breathing system and then could cause lung diseases like asthma -- and some studies suggest there might be some connection with lung cancers," Pu said. "There's a study finding dust storms are related to valley fever in Arizona, as fungi can attach to dust particles. And when there's a severe dust storm, visibility is reduced, so it can increase car accidents on the highways. In 2013, there were severe dust storms in western Kansas that reduced visibility and caused problems for local traffic. In Arizona, when there's a strong dust storm, usually called a 'haboob,' the dust wall goes up to a few kilometers high, and this can affect airports -- airports have to close due to the dust storms. Fortunately, these storms are moving quickly and dissipate after a few hours." Beyond safety for people, Pu's team detail in their study how high dust levels can sway the environment as a whole. "Dust particles absorb and scatter both solar and terrestrial radiation, thus affecting the local radiative budget and regional hydroclimate," they wrote. "For instance, dust is found to amplify severe droughts in the United States by increasing atmospheric stability, to modulate the North American monsoon by heating the lower troposphere, and to accelerate snow melting and perturb runoff over the Upper Colorado River Basin by its deposition on snow." Pu said she hopes someday an organization or government agency could run the model she's developed and issue seasonal dust predictions months in advance, especially if the potential for high levels of dust causes concern. "Traffic systems and human health would benefit most from this long-term prediction ability about dust and air quality," she said.
"I think it would be great if an institute would try to give regular predictions of dustiness variations that could be helpful for airports or road traffic or transportation managers. Facilities could plan for times when there could be a lot of dust in the local area. It could even affect the plans of local farmers." For the time being, Pu aims to continue to refine the dust-prediction model to include atypical weather influences and human activity that could contribute to dust patterns. "We want to continue to understand what other factors haven't been explored in the seasonal variation of the dust," she said. "For instance, those large-scale factors such as the El Niño-Southern Oscillation, and also anthropogenic factors -- how people's influence through agriculture or construction projects might affect dust emission in the future. Of course, we want to also keep collaborating with people at NOAA GFDL to give dust predictions."
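The "variance captured" figures Pu quotes (about 63% and 71%) correspond to the standard coefficient of determination between predicted and observed dustiness. A minimal Python sketch of that calculation, using toy numbers rather than data from the study:

```python
def variance_explained(observed, predicted):
    """Coefficient of determination (R^2): the fraction of variance in
    the observations that the predictions account for."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_total = sum((o - mean_obs) ** 2 for o in observed)
    ss_residual = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    return 1.0 - ss_residual / ss_total

# Hypothetical springtime dustiness indices: observations vs. model output.
observed = [1.0, 1.4, 0.8, 1.9, 1.1, 0.6]
predicted = [1.1, 1.3, 0.9, 1.7, 1.2, 0.7]
score = variance_explained(observed, predicted)
```

For the toy series above, the "model" explains roughly 92 percent of the variance; a perfect forecast scores 1.0, and a forecast no better than the long-term mean scores 0.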
Weather
2019
August 19, 2019
https://www.sciencedaily.com/releases/2019/08/190819082448.htm
Facial recognition technique could improve hail forecasts
The same artificial intelligence technique typically used in facial recognition systems could help improve prediction of hailstorms and their severity, according to a new study from the National Center for Atmospheric Research (NCAR).
Instead of zeroing in on the features of an individual face, scientists trained a deep learning model called a convolutional neural network to recognize features of individual storms that affect the formation of hail and how large the hailstones will be, both of which are notoriously difficult to predict. The promising results are published in an American Meteorological Society journal. "We know that the structure of a storm affects whether the storm can produce hail," said NCAR scientist David John Gagne, who led the research team. "A supercell is more likely to produce hail than a squall line, for example. But most hail forecasting methods just look at a small slice of the storm and can't distinguish the broader form and structure." The research was supported by the National Science Foundation, which is NCAR's sponsor. "Hail -- particularly large hail -- can have significant economic impacts on agriculture and property," said Nick Anderson, an NSF program officer. "Using these deep learning tools in unique ways will provide additional insight into the conditions that favor large hail, improving model predictions. This is a creative, and very useful, merger of scientific disciplines." Whether or not a storm produces hail hinges on myriad meteorological factors. The air needs to be humid close to the land surface, but dry higher up. The freezing level within the cloud needs to be relatively low to the ground. Strong updrafts that keep the hail aloft long enough to grow larger are essential. Changes in wind direction and speed at different heights within the storm also seem to play a role. But even when all these criteria are met, the size of the hailstones produced can vary remarkably, depending on the path the hailstones travel through the storm and the conditions along that path. That's where storm structure comes into play. "The shape of the storm is really important," Gagne said.
"In the past we have tended to focus on single points in a storm or vertical profiles, but the horizontal structure is also really important." Current computer models are limited in what they can look at because of the mathematical complexity it takes to represent the physical properties of an entire storm. Machine learning offers a possible solution because it bypasses the need for a model that actually solves all the complicated storm physics. Instead, the machine learning neural network is able to ingest large amounts of data, search for patterns, and teach itself which storm features are crucial to key off of to accurately predict hail. For the new study, Gagne turned to a type of machine learning model designed to analyze visual images. He trained the model using images of simulated storms, along with information about temperature, pressure, wind speed and direction as inputs, and simulations of hail resulting from those conditions as outputs. The weather simulations were created using the NCAR-based Weather Research and Forecasting model (WRF). The machine learning model then figured out which features of the storm are correlated with whether or not it hails and how big the hailstones are. After the model was trained and had demonstrated that it could make successful predictions, Gagne took a look to see which aspects of the storm the model's neural network thought were the most important. He used a technique that essentially ran the model backwards to pinpoint the combination of storm characteristics that would need to come together to give the highest probability of severe hail. In general, the model confirmed those storm features that have previously been linked to hail, Gagne said. For example, storms that have lower-than-average pressure near the surface and higher-than-average pressure near the storm top (a combination that creates strong updrafts) are more likely to produce severe hail.
So too are storms with winds blowing from the southeast near the surface and from the west at the top. Storms with a more circular shape are also most likely to produce hail. This research builds on Gagne's previous work using a different kind of machine learning model -- known as a random forest -- to improve hail prediction. Instead of analyzing images, random forest models ask a series of questions, much like a flowchart, which are designed to determine the probability of hail. These questions might include whether the dew point, temperatures, or winds are above or below a certain threshold. Each "tree" in the model asks slight variants on the questions to come to an independent answer. Those answers are then averaged over the entire "forest," giving a prediction that's more reliable than any individual tree. For that research, published in 2017, Gagne used actual storm observations as the inputs and radar-estimated hail sizes as the outputs to train the model. He found that the model could improve hail prediction by as much as 10%. The machine learning model has now been run operationally during the last several springs to give on-the-ground forecasters access to more information when making hail predictions. Gagne is in the process of verifying how the model did over those few seasons. The next step for the newer machine learning model is to also begin testing it using storm observations and radar-estimated hail, with the goal of transitioning this model into operational use as well. Gagne is collaborating with researchers at the University of Oklahoma on this project. "I think this new method has a lot of promise to help forecasters better predict a weather phenomenon capable of causing severe damage," Gagne said. "We are excited to continue testing and refining the model with observations of real storms."
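The flowchart-of-questions idea behind the random forest can be illustrated with a toy sketch in plain Python. The feature names and thresholds below are invented for illustration only; a real random forest learns its questions automatically from storm observations rather than having them written by hand.

```python
# Each "tree" is a list of (feature, threshold) questions; a storm is a dict
# of observed variables. A tree votes for hail only if the storm exceeds
# every threshold it asks about; the forest averages the votes into a
# probability, which is more stable than any single tree's answer.
TREES = [
    [("updraft_speed_ms", 20.0), ("cape_jkg", 1500.0)],
    [("updraft_speed_ms", 18.0), ("dewpoint_c", 15.0)],
    [("cape_jkg", 2000.0), ("wind_shear_ms", 12.0)],
]

def tree_votes_hail(tree, storm) -> bool:
    """A single flowchart of threshold questions, answered yes/no."""
    return all(storm[feature] > threshold for feature, threshold in tree)

def hail_probability(storm) -> float:
    """Average the independent votes over the whole forest."""
    votes = [tree_votes_hail(tree, storm) for tree in TREES]
    return sum(votes) / len(TREES)
```

A storm with a strong updraft, high instability and ample moisture gets a vote from every tree (probability 1.0), while a weak storm gets none, mirroring how the averaged ensemble produces a graded probability rather than a single yes/no.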
Weather
2019
August 15, 2019
https://www.sciencedaily.com/releases/2019/08/190815120650.htm
Warmer winters are changing the makeup of water in Black Sea
Warmer winters are starting to alter the structure of the Black Sea, which could foreshadow how ocean compositions might shift from future climate change, according to new research.
A new study in an AGU journal finds this intermediate layer is changing. The layer has fluctuated in the past, but in the last 14 years its core temperature has warmed 0.7 degrees Celsius (1.26 degrees Fahrenheit). The blending of the cold intermediate layer with the other layers of water could enable the water masses from the deeper layers of the sea to eventually infiltrate the top layer, which would have unknown impacts on the sea's marine life. The new study suggests climate change is causing the intermediate layer to warm and change, but natural fluctuations could also be playing a role, according to the study's authors. Studying changes in smaller water bodies like the Black Sea shows scientists how larger bodies of water might evolve in the future. The new study suggests what might happen to Earth's oceans as the climate continues to warm, according to the researchers. Water masses, which exist in bodies of water around the globe, influence Earth's climate and move nutrients around the world. Changes in oceanic masses' composition could reshape global currents, affecting the planet's climate and ecosystems. It is difficult to study massive water masses in the oceans, so scientists use regional water masses like those in the Black Sea to determine how climate change could be affecting oceanic masses. "We want to at least know what could happen under different global climate change scenarios," said Emil Stanev, a physical oceanographer at the Helmholtz-Zentrum Geesthacht Center for Materials and Coastal Research in Geesthacht, Germany, and lead author of the new study. The Black Sea lies between the Balkans and Eastern Europe and receives water from many major European rivers. It also receives and loses water through the Bosporus Strait, which links the Black Sea to the Mediterranean.
The Black Sea's water stratification comes from the mixing of different water sources, creating water masses within the sea. Water masses have distinct temperatures, salinities, and densities, usually identified by their horizontal and vertical positions in bodies of water. The Black Sea's cold intermediate water mass's depth varies depending on its distance from the shore. The mass separates low-saline surface water from the high-saline bottom water. Each of the sea's water layers hosts specific organisms suited to its oceanographic conditions. Scientists previously studied the Black Sea's cold intermediate layer, but they had not analyzed how it has changed over time. "They didn't pay too much attention to the evolution of the water masses," Stanev said. In the new study, Stanev and his colleagues charted the evolution of the Black Sea's cold intermediate water mass for 14 years, comparing its progression with the region's climate trends. They used battery-powered floats to measure the temperature, density and salinity from the sea surface down to 1000 meters (3281 feet) at various points throughout the seasons. They then compared the float data to surface air temperatures to see if there was a correlation between warmer winters and changes in the cold intermediate water mass's temperature and salinity. They found winter weather fluctuations changed the temperature and salinity of the cold intermediate layer, but the density of the water mass remained almost the same. The Black Sea's cold intermediate layer became warmer, allowing its edges to blend with the top and bottom layers of the sea. If this trend continues, it could potentially change the stratification of the sea, according to the study's authors.
Restructuring the layers could bring sulfides, corrosive and noxious chemicals at the bottom of the sea, up to the surface, impacting marine wildlife and tourism. Climate change might be warming the sea, but natural variability could also be responsible, according to James Murray, a chemical oceanographer at the University of Washington, who was not connected with the new study. Past research both by Murray and other scientists has shown the Black Sea's water layers have cycled through warm and cool periods since the 1950s. However, the Black Sea's cold intermediate layer has never been this warm, Murray added. Both Stanev and Murray agree that more research on the evolution of the Black Sea's layers is necessary. Continuing to study the Black Sea's cold intermediate layer and its fluctuations will indicate whether climate change is behind the layer's gradual disappearance.
Weather
2019
August 12, 2019
https://www.sciencedaily.com/releases/2019/08/190812130831.htm
Arctic sea-ice loss has 'minimal influence' on severe cold winter weather
The dramatic loss of Arctic sea ice through climate change has only a "minimal influence" on severe cold winter weather across Asia and North America, new research has shown.
The possible connection between Arctic sea-ice loss and extreme cold weather -- such as the deep freezes that can grip the USA in the winter months -- has long been studied by scientists. Observations show that when the regional sea-ice cover is reduced, swathes of Asia and North America often experience unusually cold and hazardous winter conditions. However, previous climate modelling studies have suggested that reduced sea ice cannot fully explain the cold winters. Now, a new study by experts from the University of Exeter, the Royal Netherlands Meteorological Institute and the Energy and Sustainability Research Institute in Groningen has shed new light on the link between sea-ice loss and cold winters. For the research, the international team combined observations over the past 40 years with results from sophisticated climate modelling experiments. They found that the observations and models agreed that reduced regional sea ice and cold winters often coincide with each other. They found that the correlation between reduced sea ice and extreme winters across the mid-latitudes occurs because both are simultaneously driven by the same large-scale atmospheric circulation patterns. Crucially, it shows that reduced sea ice only has a minimal influence on whether a harsh and severe winter will occur. The study is published in a leading science journal. Dr Russell Blackport, a Mathematics Research Fellow at the University of Exeter and lead author of the paper, said: "The correlation between reduced sea ice and cold winters does not mean one is causing the other.
We show that the real cause is changes in atmospheric circulation, which moves warm air into the Arctic and cold air into the mid-latitudes." Over recent decades, the Arctic region has experienced warming temperatures through climate change, which has led to a large decline in sea-ice cover. This reduction in sea-ice cover means that areas of open water increase, which in turn allows the ocean to lose more heat to the atmosphere in winter -- this can potentially alter the weather and climate, even well outside the Arctic. Recent studies have suggested that the reduced sea ice or Arctic warming has contributed to recent cold winters experienced in the mid-latitude region -- and that as the sea ice reduces further through climate change, cold winters will become more frequent and severe. Now, this new study suggests that reduced sea ice is not the main cause of the cold winters. Instead, the cold winters are likely caused by random fluctuations in the atmospheric circulation. Professor James Screen, an Associate Professor in Climate Science at the University of Exeter, said: "There are many reasons to be concerned about the dramatic loss of Arctic sea ice, but an increased risk of severe winters in North America and Asia is not one of them." Dr John Fyfe, a Research Scientist at the Canadian Centre for Climate Modelling and Analysis, who was not involved in the research, writes in Nature Climate Change: "Blackport and colleagues put to rest the notion that Arctic sea-ice loss caused the cold mid-latitude winters, showing instead that atmospheric circulation changes preceded, and then simultaneously drove, sea-ice loss and mid-latitude cooling."
Weather
2019
August 8, 2019
https://www.sciencedaily.com/releases/2019/08/190808152506.htm
Persistent impacts of smoke plumes aloft
Thunderstorms generated by a group of giant wildfires in 2017 injected a small volcano's worth of aerosol into the stratosphere, creating a smoke plume that lasted for almost nine months. CIRES and NOAA researchers studying the plume found that black carbon or soot in the smoke was key to the plume's rapid rise: the soot absorbed solar radiation, heating the surrounding air and allowing the plume to quickly rise.
The billowing smoke clouds provided researchers with an ideal opportunity to test climate models that estimate how long the particulate cloud would persist -- after achieving a maximum altitude of 23 km, the smoke plume remained in the stratosphere for many months. These models are also important in understanding the climate effects of nuclear war or geoengineering. "We compared observations with model calculations of the smoke plume. That helped us understand why the smoke plume rose so high and persisted so long, which can be applied to other stratospheric aerosol injections, such as from volcanoes or nuclear explosions," said NOAA scientist Karen Rosenlof, a member of the author team that also included scientists from CU Boulder, the Naval Research Laboratory, Rutgers and other institutions. The findings were published today in a scientific journal. During the summer of 2017, wildfires raged across the Pacific Northwest. On August 12 in British Columbia, a group of fires and ideal weather conditions produced five near-simultaneous towering clouds of smoke, or pyrocumulonimbus clouds, that lofted smoke high into the stratosphere. Within two months, the plume rose from its initial height of about 12 km up to 23 km and persisted in the atmosphere for much longer -- satellites could spot it even after eight months. "The forest fire smoke was an ideal case study for us because it was so well observed by satellites," said lead author Pengfei Yu, a former CIRES scientist at NOAA, now at the Institute for Environment and Climate Research at Jinan University in Guangzhou, China. Instruments on two satellites -- the International Space Station and NASA's CALIPSO -- and on NOAA's balloon-borne Printed Optical Particle Spectrometer, or POPS, provided the aerosol measurements the researchers needed. Yu and his colleagues compared those observations with results from a global climate and chemistry model to get a match for how high up the smoke rose and how long it lasted in the atmosphere.
With measurements of the rise rate and evolution of the smoke plume, the researchers could estimate the amount of black carbon in the smoke and how quickly the organic particulate material was destroyed in the stratosphere. They found that the plume's rapid rise could only be explained by the presence of black carbon, or soot, which comprised about 2 percent of the total mass of the smoke. The soot absorbed solar radiation, heated the surrounding air and forced the plume high into the atmosphere. Next, the team modeled the degradation of the smoke plume in the atmosphere. They found that to mimic the smoke's observed rate of decay over the multi-month plume, there had to be a relatively slow loss of organic carbon (through photochemical processes) that previous nuclear winter studies had assumed to be very rapid. "We have a better understanding of how our models represent smoke. And because we can model this process, we know we can model other aerosol-related processes in the atmosphere," said Ru-Shan Gao, a NOAA scientist and one of the paper's co-authors. CU Boulder's Brian Toon and Rutgers University's Alan Robock, also co-authors of the new paper, are particularly interested in what the findings mean for the climate impacts of nuclear explosions, which include a severe cooling impact dubbed "nuclear winter." In modeling the climate impacts of nuclear war, Toon, Robock and others have long expected that massive fires would create smoke plumes that could also be lofted well up into the stratosphere. "While the rise of the smoke was predicted in the 1980s, the 2017 fire in British Columbia is the first time it has been observed," Toon said. "It was exciting to get confirmation," Robock added. Moreover, the detailed observations made during the 2017 fire -- such as the somewhat longer-than-expected persistence of organic matter -- are fueling more modeling, the two noted.
It's possible that the cooling impacts of a nuclear winter could last somewhat less long than models have predicted to date, Toon said, but work is ongoing.
Weather
2019
August 8, 2019
https://www.sciencedaily.com/releases/2019/08/190808133524.htm
Over a century of Arctic sea ice volume reconstructed with help from historic ships' logs
Our knowledge of sea ice in the Arctic Ocean comes mostly through satellites, which since 1979 have imaged the dwindling extent of sea ice from above. The University of Washington's Pan-Arctic Ice-Ocean Modeling and Assimilation System, or PIOMAS, is a leading tool for gauging the thickness of that ice. Until now that system has gone back only as far as 1979.
A new paper now extends the estimate of Arctic sea ice volume back more than a century, to 1901. To do so, it used both modern-day computer simulations and historic observations, some written by hand in the early 1900s aboard precursors to today's U.S. Coast Guard ships. "This extends the record of sea ice thickness variability from 40 years to 110 years, which allows us to put more recent variability and ice loss in perspective," said Axel Schweiger, a sea ice scientist at the UW's Applied Physics Laboratory and first author of the study, published in August. "The volume of sea ice in the Arctic Ocean today and the current rate of loss are unprecedented in the 110-year record," he added. PIOMAS provides a daily reconstruction of what's happening to the total volume of sea ice across the Arctic Ocean. It combines weather records and satellite images of ice coverage to compute ice volume. It then verifies its results against any existing thickness observations. For years after 1950, that might be fixed instruments, direct measurements or submarines that cruise below the ice. During the early 20th century, the rare direct observations of sea ice were done by U.S. Revenue cutters, the precursor to the Coast Guard, and Navy ships that have cruised through the Arctic each year since 1879. In the Old Weather project, the UW, the National Oceanic and Atmospheric Administration and the National Archives have been working with citizen scientists to transcribe the weather entries in digitized historic U.S. ships' logbooks to recover unique climate records for science. The new study is the first to use the logbooks' observations of sea ice. "In the logbooks, officers always describe the operating conditions that they were in, providing hourly observations of the sea ice at that time and place," said co-author Kevin Wood, a researcher at the Joint Institute for the Study of the Atmosphere and Ocean.
If the ship was in open water, the logbook might read "steaming full ahead" or "underway." When the ship encountered ice, officers might write "steering various courses and speeds," meaning the ship was sailing through a field of ice floes. When they found themselves trapped in the ice pack, the log might read "beset." These logbooks until recently could only be viewed at the National Archives in Washington, D.C., but through digital imaging and transcription by Old Weather citizen scientists these rare observations of weather and sea ice conditions in the Arctic in the late 1800s and early 1900s have been made available to scientists and the public. "These are unique historic observations that can help us to understand the rapid changes that are taking place in the Arctic today," Wood said. Wood leads the U.S. portion of the Old Weather project, which originated in 2010 in the U.K. The weather observations from historic logbooks transcribed by Old Weather citizen scientists have already been added to international databases of climate data and were used in the model of the atmosphere that produced the new results. Officers recorded the ship's position at noon each day using a sextant. They would also note when they passed recognizable features, allowing researchers today to fully reconstruct the ship's route to locate it in space and time. While the historic sea ice observations have not yet been incorporated directly into the ice model, spot checks between the model and the early observations confirm the validity of the tool. "This is independent verification that the model is doing the right thing," Schweiger said. The new, longer record provides more context for big storms or other unusual events and a new way to study the Arctic Ocean sea ice system. "The observations that we have for sea ice thickness and variability are so limited," Schweiger said. "I think people will start analyzing this record.
There's a host of questions that people can ask to help understand Arctic sea ice and predict its future." The PIOMAS tool is widely used by scientists to monitor the current state of Arctic sea ice. The area of Arctic sea ice over the month of June 2019, and the PIOMAS-calculated volume, were the second-lowest for that time of year since the satellite record began. The lowest-ever recorded Arctic sea ice area and volume occurred in September 2012. And while Schweiger believes the long-term trend will be downward, he's not placing bets on this year setting a new record. "The state of the sea ice right now is set up for new lows, but whether it will happen or not depends on the weather over the next two months," Schweiger said. The other co-author is Jinlun Zhang at the UW Applied Physics Laboratory. The research was funded by the National Science Foundation, NASA, and the North Pacific Research Board.
Weather
2019
August 6, 2019
https://www.sciencedaily.com/releases/2019/08/190806142329.htm
How the Pacific Ocean influences long-term drought in the Southwestern US
The Southwest has always faced periods of drought. Most recently, from late 2011 to 2017, California experienced years of lower-than-normal rainfall. El Niño is known to influence rain in the Southwest, but it's not a perfect match. New research from the University of Washington and the Woods Hole Oceanographic Institution explores what conditions in the ocean and in the atmosphere prolong droughts in the Southwestern U.S.
The answer is complex, according to a study published Aug. 6. "What causes droughts that last for decades in some parts of the world, and why does that happen? Can we predict it?" said first author Luke Parsons, a UW postdoctoral researcher in atmospheric sciences. "Our study shows that when you have a large El Niño event, and a La Niña event is coming next, that could potentially start a multiyear drought in the Southwestern U.S." The general rule of thumb had been that El Niño years -- when the sea surface in a region off the coast of Peru is at least 1 degree Celsius warmer than average -- tend to have more rainfall, and La Niña years, when that region is 1 degree Celsius cooler than average, tend to have less rain. But that simple rule of thumb doesn't always hold true. "People often think that El Niño years are wet in the Southwest, but research over the years shows that's not always the case," Parsons said. "An El Niño sometimes brings rain, or can help cause it, but frequently that's not what makes any given year wet." The recent 2015 winter was a case in point, and Parsons said that event helped inspire the new study. As 2015 shaped up to be an El Niño year, there was hope that it would end California's drought. But the rain didn't start to arrive until the following year. The new study uses climate models to explore the relationship between the world's largest ocean and long-term droughts in the Southwestern U.S., which includes California, Nevada, Utah, Arizona and western Colorado and New Mexico. "When it's dry one year after another, that's hard on people, and it can be hard on ecosystems," Parsons said. Weather observations for the Southwest date back only about 150 years, and in that time, only 10 to 15 multiyear droughts have occurred. So the authors used climate models that simulate thousands of years of weather, including over 1,200 long-term droughts in the Southwest.
The authors defined a drought as multiple years with lower-than-average rainfall. The drought ended when the region had two consecutive wetter-than-normal years. "A lot of people have looked at what's going on over the ocean during a drought, but we're trying to take a step back, and look at the whole life cycle -- what happens before a drought starts, what maintains a drought, and then what ends it," Parsons said. Parsons and co-author Sloan Coats at the Woods Hole Oceanographic Institution separated the system into pre-drought, during-drought and post-drought periods. They found that before a long-term drought starts, there is often an El Niño year. Then the first year of a drought is often colder than normal in that region of the ocean, though it might not be enough to qualify as a La Niña year. "Where that warm pool of water sits ends up disturbing, or changing, the jet stream, and that shifts where the winter rains come in off the ocean in the Northern Hemisphere winter," Parsons said. "La Niña can kick off a drought, but you don't have to have multiple La Niña events to continue the drought and keep the Southwest dry." An El Niño that's slightly farther offshore than normal, in the central tropical Pacific, often ends the drought. But the study shows that's not always true: About 1 in 20 drought years could see an El Niño that doesn't deliver rain. Better understanding of long-term droughts could help managers make decisions like whether to release water from the Colorado River, or whether to save some in anticipation of another low year. The study was funded by the Washington Research Foundation. Weather data and climate model results came from the National Science Foundation, the National Oceanic and Atmospheric Administration and the U.S. Department of Energy.
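The study's drought definition, as described above, is simple enough to state in code: a drought is a run of below-average years, and it ends once two consecutive wetter-than-average years occur. This sketch uses our own function and variable names; the study itself applied the definition to thousands of simulated years of climate model output, not a toy series like this one.

```python
# A minimal sketch of the drought bookkeeping described in the article.

def find_droughts(rainfall):
    """Return (start, end) index pairs of multiyear droughts in a yearly series."""
    mean = sum(rainfall) / len(rainfall)
    droughts, start, wet_run = [], None, 0
    for i, r in enumerate(rainfall):
        if r < mean:
            start = i if start is None else start  # drought begins (or continues)
            wet_run = 0
        else:
            wet_run += 1
            if start is not None and wet_run == 2:
                # Two consecutive wet years end the drought; it spans up to
                # the last dry year before the wet pair.
                if i - 2 > start:          # keep only multiyear droughts
                    droughts.append((start, i - 2))
                start, wet_run = None, 0
    return droughts

# Mean rainfall here is 10; years 2-5 are dry, then two wet years end the drought.
years = [12, 11, 8, 7, 9, 8, 12, 13, 11, 9]
print(find_droughts(years))  # → [(2, 5)]
```

Note that a single wet year does not end a drought under this definition, which is exactly why the life-cycle framing in the article matters: a drought's end is a property of a sequence of years, not of any one year.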
Weather
2019
August 5, 2019
https://www.sciencedaily.com/releases/2019/08/190805181627.htm
Is it safe to use an electric fan for cooling?
The safety and effectiveness of electric fans in heatwaves depend on the climate and basing public health advice on common weather metrics could be misleading, according to a new study from the University of Sydney.
The research calls into question current guidelines from most public health authorities, including the World Health Organization, that suggest fans may not be beneficial when the temperature rises above 35 degrees Celsius (95°F), as well as recommendations based on heat index caps. Researchers from the University's Thermal Ergonomics Laboratory simulated heatwave conditions to examine the effect of electric fan use on an individual's core temperature, cardiovascular strain, risk of dehydration and comfort levels. The results, published today, showed that fans were beneficial in hot, humid conditions. However, fans were detrimental for all measures in very hot, dry conditions despite a lower heat index of 46 °C (115°F). Heat index is a commonly used weather metric that expresses both air temperature and relative humidity. It was designed to help convey how hot weather conditions feel to the average person. The United States Environmental Protection Agency (USEPA) states that fan use above a heat index of 37.2°C (99°F) "actually increases the heat stress the body must respond to." Senior author Associate Professor Ollie Jay of the Faculty of Health Sciences and Charles Perkins Centre said recent conditions in Europe and the United States reinforce the urgent need for evidence-based health advice to help protect people against heat-related illness. "Our results suggest that under environmental conditions that represent the vast majority of peak heatwaves in the United States and Europe, fans should be recommended, and the guidelines issued by most public health authorities are unnecessarily conservative," said Associate Professor Jay. "It is only when the air temperature is very high and humidity is very low that fans are detrimental, which can be seen in arid conditions such as Phoenix or Las Vegas in the US, or Adelaide in South Australia." Twelve healthy male volunteers were monitored for thermal strain (rectal temperature), cardiovascular strain (heart rate and blood pressure), risk for dehydration (whole body sweat rate), and
thermal comfort (assessed using a 120-mm visual analogue scale) over a two-hour exposure to simulated peak conditions of two types of heat waves. One was very hot and dry, replicating the peak conditions of the California heatwave in July 2018; the other was cooler but more humid, with a higher heat index, representing the peak conditions during the Chicago heatwave in July 1995 and the Shanghai heatwave in July 2017. Associate Professor Jay said while larger studies are needed, the current research and earlier work published in the Journal of the American Medical Association suggest that neither temperature nor heat index caps are the best basis for public health advice on the use of fans. His team is currently examining the effectiveness of a range of different low-resource cooling strategies that can be easily implemented in different heatwave conditions by the elderly and people with medical conditions like coronary artery disease. They are also assessing the impact of different prescription medications on the type of advice that should be issued to the general public in advance of extreme heat events. Declaration: Associate Professor Jay reports grants from the NSW Office of Environment & Heritage Human Health and Social Impacts Node during the conduct of the study, and grants from the National Health and Medical Research Council outside the submitted work.
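As background to the heat index figures quoted above: one standard way to compute heat index is the US National Weather Service's Rothfusz regression. The sketch below omits the NWS's additional low-humidity and low-temperature adjustments, so values should be treated as approximate.

```python
# Simplified heat index via the NWS Rothfusz regression (°F inputs/outputs).
# The full NWS procedure applies extra adjustments at low humidity and
# low temperature, which are omitted here for brevity.

def heat_index_f(t_f, rh):
    """Approximate heat index (°F) from air temperature (°F) and relative humidity (%)."""
    return (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
            - 0.22475541 * t_f * rh - 6.83783e-3 * t_f ** 2
            - 5.481717e-2 * rh ** 2 + 1.22874e-3 * t_f ** 2 * rh
            + 8.5282e-4 * t_f * rh ** 2 - 1.99e-6 * t_f ** 2 * rh ** 2)

# 90 °F at 70% humidity "feels like" roughly 106 °F -- far hotter than the
# same 90 °F on a dry day, which is why a single temperature cap is a poor
# guide to whether a fan will help.
print(round(heat_index_f(90, 70)))  # → 106
```

The study's point fits this arithmetic: two heatwaves can share a heat index while one is humid (where fans help sweat evaporate) and the other is hot and dry (where fans mostly blow hotter air at the skin), so heat index alone cannot decide fan safety.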
Weather
2019
August 5, 2019
https://www.sciencedaily.com/releases/2019/08/190805112206.htm
Twelve centuries of European summer droughts
An international team of researchers has published a study exploring the association between summer temperature and drought across Europe, placing recent droughts in the context of the past 12 centuries. The study reveals that, throughout history, northern Europe has tended to get wetter and southern Europe to get drier during warmer periods. The researchers also observe that recent changes in drought patterns are not yet unprecedented, and they emphasise that continuing to improve understanding of the relationship between summer heat and drought is critical to projecting flood and drought risks.
The new study compared climate model simulations with reconstructions based on tree-ring data and weather records. This comparison revealed that the climate model simulations show too strong a relationship between warm and dry summers, and do not capture that a large part of Europe has received more precipitation, not less, when it has been warm in the past 12 centuries. Project leader Dr. Fredrik Charpentier Ljungqvist, Associate Professor at Stockholm University, said these new findings are important as we are able to see for the first time that the relationship between summer temperature and drought in modern weather measurements has persisted for at least 12 centuries. "We can also see that the wetting trend in northern Europe, and drying trend in southern Europe, during the 20th century is not unprecedented over this time perspective," he said. Going on to discuss the climate model results, Dr. Ljungqvist said: "Crucially, our study shows that the very strong link between warm and dry periods being simulated in the climate models could be too simple. It's not a picture backed up by the weather records and tree-ring data. The climate model simulations seem to underestimate how large a part of Europe actually experiences wetter summers when the climate is warmer." "Our study implies a possible exaggeration in the climate models of temperature-driven drought risk in parts of northern Europe under global warming. But this also means that the models may well underestimate future excessive precipitation, with associated flood risks, in northern Europe," continues Dr. Ljungqvist.
Weather
2019
August 2, 2019
https://www.sciencedaily.com/releases/2019/08/190802104541.htm
Machine learning helps predict if storms will cause power outages
Thunderstorms are common all over the world in summer. As well as spoiling afternoons in the park, lightning, rain and strong winds can damage power grids and cause electricity blackouts. It's easy to tell when a storm is coming, but electricity companies want to be able to predict which ones have the potential to damage their infrastructure.
Machine learning -- when computers find patterns in existing data which enable them to make predictions for new data -- is ideal for predicting which storms might cause blackouts. Roope Tervo, a software architect at the Finnish Meteorological Institute (FMI) and a PhD researcher at Aalto University in Professor Alex Jung's research group, has developed a machine learning approach to predict the severity of storms. The first step in teaching the computer to categorise storms was providing it with data on power outages. Three Finnish energy companies, Järvi-Suomen Energia, Loiste Sähkoverkko, and Imatra Seudun Sähkönsiirto, who have power grids through storm-prone central Finland, provided data about the amount of power disruptions to their network. Storms were sorted into 4 classes. A class 0 storm didn't knock out electricity to any power transformers. A class 1 storm cut off up to 10% of transformers, a class 2 up to 50%, and a class 3 storm cut power to over 50% of the transformers. The next step was taking the data from the storms that FMI had, and making it easy for the computer to understand. "We used a new object-based approach to preparing the data, which is what makes this work exciting," said Roope. "Storms are made up of many elements that can indicate how damaging they can be: surface area, wind speed, temperature and pressure, to name a few. By grouping 16 different features of each storm, we were able to train the computer to recognize when storms will be damaging." The results were promising: the algorithm was very good at predicting which storms would be a class 0 and cause no damage, and which storms would be at least a class 3 and cause lots of damage.
The researchers are adding data from more storms into the model to help improve its ability to tell class 1 and class 2 storms apart, to make the prediction tools even more useful to the energy companies. "Our next step is to try and refine the model so it works for more types of weather than just summer storms," said Roope. "As we all know, there can be big storms in winter in Finland, but they work differently to summer storms, so we need different methods to predict their potential damage."
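The four damage classes described above amount to simple thresholds on the fraction of transformers knocked out. A minimal sketch (the class boundaries come from the article; the function name is ours):

```python
# Storm damage classes as described in the article:
# class 0: no transformers lose power; class 1: up to 10%;
# class 2: up to 50%; class 3: over 50%.

def storm_class(frac_transformers_out):
    if frac_transformers_out == 0:
        return 0
    if frac_transformers_out <= 0.10:
        return 1
    if frac_transformers_out <= 0.50:
        return 2
    return 3

print([storm_class(f) for f in (0.0, 0.05, 0.30, 0.80)])  # → [0, 1, 2, 3]
```

Framing the problem as a handful of ordered classes, rather than predicting an exact outage percentage, is what makes the machine learning task tractable with a limited number of observed storms.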
Weather
2019
August 1, 2019
https://www.sciencedaily.com/releases/2019/08/190801120209.htm
US infrastructure unprepared for increasing frequency of extreme storms
Current design standards for United States hydrologic infrastructure are unprepared for the increasing frequency and severity of extreme rainstorms, meaning structures like retention ponds and dams will face more frequent and severe flooding, according to a new study.
Extreme weather events are on the rise, but U.S. water management systems use outdated design guidelines. New research, published in an AGU journal, finds that current design standards no longer reflect how often extreme rainstorms actually occur. The new study is particularly timely in light of recent storms and flash floods along the East Coast. "The take-home message is that infrastructure in most parts of the country is no longer performing at the level that it's supposed to, because of the big changes that we've seen in extreme rainfall," said Daniel Wright, a hydrologist at the University of Wisconsin-Madison and lead author of the new study. Engineers often use statistical estimates called IDF curves to describe the intensity, duration, and frequency of rainfall in each area. The curves, published by the National Oceanic and Atmospheric Administration (NOAA), are created using statistical methods that assume weather patterns remain static over time. "Design engineers at cities, consulting companies, and counties use this for different purposes, like infrastructure design management, infrastructure risk assessment and so forth. It has a lot of engineering applications," said Amir Aghakouchak, a hydrologist at the University of California, Irvine, who was not involved with the new study. But climate change is causing extreme rainfall events to occur more often in many regions of the world, something IDF curves don't take into account. One measure of extreme rainfall is the 100-year storm, a storm that has a one percent chance of happening in a given year, or a statistical likelihood of happening once in 100 years on average. Wright and his colleagues wanted to know how existing IDF curves compare with recent changes in extreme rainfall. They analyzed records from more than 900 weather stations across the U.S. from 1950 to 2017 and recorded the number of times extreme storms, like 100-year storms, exceeded design standards. For example, in the eastern United States, extreme rainstorm events happened 85 percent more often in 2017 than they did in 1950.
In the western U.S., these storms are appearing 51 percent more often now than they once did. The scientists found that in most of the country the growing number of extreme rainstorms can be linked to warming temperatures from climate change, although natural events, such as El Niño, also occasionally affect the Southeast's climate. By comparing the number of storms that actually happened against the number predicted by IDF curves, the researchers also showed the potential consequences for U.S. infrastructure. In some regions, for example, infrastructure designed to withstand extreme rainstorms could face these storms every 40 years instead of every 100 years. "Infrastructure that has been designed to these commonly-used standards is likely to be overwhelmed more often than it is supposed to be," Wright said. The researchers hope the findings will encourage climate scientists, hydrologists, and engineers to collaborate and improve U.S. hydrologic infrastructure guidelines. "We really need to get the word out about just how far behind our design standards are from where they should be," Wright said.
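The "100-year storm" arithmetic above compounds over a structure's design life. A minimal sketch (assuming independent years, which is the standard simplification behind IDF-style return periods) shows why a shift from a 100-year to a 40-year effective return period matters:

```python
# Probability of seeing at least one "N-year" storm during a design life,
# assuming each year is independent: P = 1 - (1 - 1/N) ** years.
def exceedance_prob(return_period_years: float, design_life_years: int) -> float:
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** design_life_years

# A structure designed for the 100-year storm, over a 30-year service life:
p_design = exceedance_prob(100, 30)  # roughly a 26% chance of exceedance
# If climate change shrinks the effective return period to 40 years:
p_actual = exceedance_prob(40, 30)   # roughly a 53% chance -- better than a coin flip
```

The 30-year service life is an illustrative assumption; the underlying point is that even modest changes in storm frequency roughly double the lifetime risk of overwhelming the structure.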
Weather
2019
August 1, 2019
https://www.sciencedaily.com/releases/2019/08/190801120156.htm
Ancient plankton help researchers predict near-future climate
The Mauna Loa Observatory in Hawai'i recently recorded the highest concentration of carbon dioxide, or CO2, in its recorded history.
"The Pliocene wasn't a world that humans and our ancestors were a part of," said University of Arizona associate professor of geosciences Jessica Tierney. "We just started to evolve at the end of it." Now that we've reached 415 parts per million CO2, the Pliocene offers a preview of the climate we may be headed toward. A new study, published today, revisits sea surface temperatures from that epoch. Before the industrial revolution, CO2 levels had not been this high since the Pliocene, roughly three million years ago. Past proxy measurements of Pliocene sea surface temperatures led scientists to conclude that a warmer Earth caused the tropical Pacific Ocean to be stuck in an equatorial weather pattern called El Niño. Normally, as the trade winds sweep across the warm surface waters of the Pacific Ocean from east to west, warm water piles up in the western Pacific, leaving the eastern side of the ocean about 7 to 9 degrees Fahrenheit cooler. But during an El Niño, the temperature difference between the east and west drops to just under 2 degrees, influencing weather patterns around the world, including Southern Arizona. El Niños typically occur about every three to seven years, Tierney said. The problem is, climate models of the Pliocene, which included CO2 levels similar to today's, could not reproduce a permanent El Niño. "This paper was designed to revisit that concept of the permanent El Niño and see if it really holds up against a reanalysis of the data," she said. "We find it doesn't hold up." About 20 years ago, scientists found they could deduce past temperatures based on chemical analysis of a specific kind of fossilized shell of a type of plankton called foraminifera. "We don't have thermometers that can go to the Pliocene, so we have to use proxy data instead," Tierney said. Since then, scientists have learned that foraminifera measurements can be skewed by ocean chemistry, so Tierney and her team instead used a different proxy measurement -- the fat produced by another plankton called coccolithophores.
When the environment is warm, coccolithophores produce a slightly different kind of fat than when it's cold, and paleoclimatologists like Tierney can read the changes in the fat, preserved in ocean sediments, to deduce sea-surface temperatures. "This is a really commonly used and reliable way to look at past temperatures, so a lot of people have made these measurements in the Pliocene. We have data from all over the world," she said. "Now we use this fat thermometer that we know doesn't have complications, and we're sure we can get a cleaner result." Tierney and her team found that the temperature difference between the eastern and western sides of the Pacific did decrease, but it was not pronounced enough to qualify as a full-fledged permanent El Niño. "We didn't have a permanent El Niño, so that was a bit of an extreme interpretation of what happened," she said. "But there is a reduction in the east-west difference -- that's still true." The eastern Pacific warmed more than the western, which caused the trade winds to slacken and changed precipitation patterns. Dry places like Peru and Arizona might have been wetter. These results from the Pliocene agree with what future climate models have predicted as a result of rising CO2. This is promising because now the proxy data matches the Pliocene climate models. "It all checks out," Tierney said. The Pliocene, however, was during a time in Earth's history when the climate was slowly cooling. Today, the climate is getting hotter very quickly. Can we really expect a similar climate? "The reason today sea levels and ice sheets don't quite match the climate of the Pliocene is because it takes time for ice sheets to melt," Tierney said. "However, the changes in the atmosphere that happen in response to CO2 can happen much faster."
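The "fat thermometer" described above is the alkenone unsaturation index. A minimal sketch of the conversion, using the widely cited Müller et al. (1998) linear core-top calibration (the index values below are illustrative, not the study's data):

```python
# Coccolithophore alkenone paleothermometry: the unsaturation index (UK'37)
# shifts with growth temperature. Using the Muller et al. (1998) calibration
# UK'37 = 0.033 * SST + 0.044, inverted to recover sea-surface temperature.
def sst_from_uk37(uk37: float) -> float:
    """Convert an alkenone unsaturation index (0..1) to SST in deg C."""
    return (uk37 - 0.044) / 0.033

warm = sst_from_uk37(0.95)  # a warm-pool-like value, ~27.5 deg C
cool = sst_from_uk37(0.70)  # a cooler upwelling-like value, ~19.9 deg C
gradient = warm - cool      # an east-west style temperature contrast
```

This is the kind of calculation that lets sediment measurements from opposite sides of the Pacific be compared as a temperature gradient.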
Weather
2019
July 31, 2019
https://www.sciencedaily.com/releases/2019/07/190731125446.htm
Krypton reveals ancient water beneath the Israeli desert
Getting reliable precipitation data from the past has proven difficult, as is predicting regional changes for climate models in the present. A combination of isotope techniques developed by researchers at Argonne and UChicago may help resolve both.
The Negev desert, which covers half of Israel's land mass, is so dry that parts of it get less than three inches of water a year. So dry, its climatological term is "hyperarid." But like most places on Earth, the region evolved to its present condition after eons of changes in climate and geology. Today, despite its parched exterior, there is still water under the Negev. Understanding where it came from, how much is there, and what's happening to it is critical to the security and allocation of that crucial resource. Researchers at Ben-Gurion University of the Negev in Israel are collaborating with colleagues at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago to better understand the Nubian Sandstone Aquifer system, which lies beneath a large portion of the Negev and other parts of Israel. By combining Argonne's pioneering radiokrypton dating technique with other isotopic fingerprints of the water's composition, the researchers are not only able to tell when that water was deposited, but where it came from and the climate conditions that produced it nearly 400,000 years ago.
The result marks the first time that scientists have been able to use groundwater to build a picture of ancient hydro-climates dating back that far. An article describing the research, "Radiokrypton unveils dual moisture sources of a deep desert aquifer," was published July 29 in Proceedings of the National Academy of Sciences (PNAS) online. "The aquifers beneath the Negev don't get replenished today, so apparently there were times when there was much more rain in the region that collected underground," says Peter Mueller, principal physicist at Argonne's Trace Radioisotope Analysis Center (TRACER). To determine when and how that might have occurred, the team collected water from more than 20 wells in the area, then separated out the krypton gas and analyzed it using a technology called Atom Trap Trace Analysis (ATTA), a technique first developed at Argonne to support nuclear physics measurements. ATTA measures water for traces of the rare krypton (Kr) isotope krypton-81. The ATTA analysis suggested that the water in the wells accumulated by means of two major "recharging" events that occurred less than 40,000 and near 360,000 years ago. Both periods coincided with generally cooler climates. These "regional humid periods" were ripe for the development of storms that could provide rainfall adequate to replenish the Negev aquifers. While the krypton measurements revealed when the water arrived, stable isotopes of the water itself pointed to where it came from. "We were looking for the delta deuterium, which is a measure of the difference in the ratio of heavy hydrogen to regular hydrogen," says Jake Zappala, postdoctoral appointee at the TRACER Center. "That number is going to vary for different bodies of water depending on where the water came from and what the weather conditions were, which is important." Because deuterium has a heavier mass than hydrogen, it behaves differently, evaporating and condensing at different temperatures. For example, when evaporation happens quickly, as over the Mediterranean Sea, it exhibits a peculiar signature compared to global precipitation trends.
Even though it is very rare relative to hydrogen -- only one in ten thousand water molecules contains one deuterium atom instead of hydrogen -- it can be measured very precisely. Thus scientists can "fingerprint" such bodies of water based on the particular signature of their stable isotopes. Every climate pattern places its own imprint in that signature, according to the researchers. "This project shows us these tools could be really transformative, tracing water movement much further than we've previously been able to," said Reika Yokochi, research associate professor in the Department of Geophysical Sciences at the University of Chicago, and the first author of the new study. Yokochi, who has been collaborating with the ATTA team since 2012, has been key in developing some of the extraction techniques the team currently uses. She hit upon the idea of combining the two data sets to find a correlation between the krypton ages and the deuterium signatures. From the covariation and the spatial distribution of the data, the team determined that water from the two recharge events came from two distinct sources. About 400,000 years ago, the region was cooler than the present, and moisture is believed to have been delivered from the Atlantic Ocean in the form of tropical plumes. The more recent recharge, less than 40,000 years ago, may have been the result of Mediterranean cyclones during the most recent major glacial event, or Last Glacial Maximum. "To our knowledge, this was the first time that groundwater could directly be used as a climate archive on these long timescales," says Zappala. "Using the radiokrypton dating, we are able to say when it rained, and the heavy-to-light water ratio directly tells us something about the weather pattern.
So we have a direct correlation between time and regional weather patterns." Another interesting point is that the water came from near an earthquake fault zone, notes Yokochi, suggesting that faults can serve as a "wall" that preserves relatively fresh water over hundreds of thousands of years. "It's possible that similar water repositories may exist along other fault zones all over the world," she says. To date, getting reliable precipitation data from the past has proven difficult, as is predicting regional changes for climate models in the present. The combination of isotope tools used by the team may be part of the answer to resolving both. As the tools continue to deliver a more reliable picture of past climate events, like the regional water cycles of the Negev, the researchers believe that this data can serve to calibrate present-day models of similar climate phenomena. "Does your climate model predict the right precipitation pattern 400,000 years ago?" asks Mueller. "Using our data, modelers can calculate back in time to see if their model is right. That is one of the key things that we can provide."
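Radiokrypton dating rests on simple radioactive-decay arithmetic: ATTA measures how much krypton-81 remains in a sample relative to the modern atmosphere, and the accepted half-life of about 229,000 years converts that ratio to an age. A minimal sketch (the sample ratio below is illustrative):

```python
import math

# Radiokrypton (81Kr) dating: from the measured sample/atmosphere ratio R,
# age = (half-life / ln 2) * ln(1 / R).
KR81_HALF_LIFE_YR = 229_000  # accepted half-life of krypton-81, in years

def kr81_age(ratio_to_atmosphere: float) -> float:
    """Age in years from the 81Kr sample/atmosphere ratio (0 < r <= 1)."""
    return KR81_HALF_LIFE_YR / math.log(2) * math.log(1.0 / ratio_to_atmosphere)

# A sample retaining ~34% of the atmospheric 81Kr level dates to roughly
# the older Negev recharge event described above (~360,000 years ago):
age = kr81_age(0.34)
```

By construction, a sample with exactly half the atmospheric ratio dates to one half-life, 229,000 years.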
Weather
2019
July 31, 2019
https://www.sciencedaily.com/releases/2019/07/190731125435.htm
Animal friendships 'change with the weather' in the Masai Mara
When it comes to choosing which other species to hang out with, wild animals quite literally change their minds with the weather, a new University of Liverpool study reveals.
"In the wild, a species always exists as part of a community of other species, which affect its survival. These interactions are crucial when it comes to predicting extinction risk: if we focus only on single species in isolation we may get it very wrong," explains the leader of the research team, Dr Jakob Bro-Jørgensen. In this study the researchers aimed to uncover if species alter their preference for different social partners when their environment changes -- a central question to forecast how current environmental changes caused by humans are likely to affect animal populations and communities. Over a year, they followed the distribution in space and time of a dozen species inhabiting the Masai Mara plains in East Africa, including buffaloes, giraffes, zebras, antelopes, ostriches and warthogs, to see how the strength of social attraction within individual species pairs changed between the wet and dry season. All of the savannah herbivore species underwent seasonal changes when it came to their social groupings, with rainfall affecting half of all the possible species pairs. The researchers suggest that this could be due to a number of reasons, including species migration, climate adaptation and feeding preferences. For example, the presence of migrating wildebeest during the dry season may provide a welcome social partner for some, like zebra, but be avoided by others, like buffalo, while arid-adapted species, such as gazelles, ostrich and warthog, may group together during the dry season but separate during the wet season. "Our study shows that the dramatic changes that humans are causing to the environment at present, be it through climate change, overhunting or habitat fragmentation, will likely create indirect consequences by changing the dynamics of ecological communities," says Dr Bro-Jørgensen. "This can cause unexpected declines in species if critical bonds with other species are broken.
A particular concern is when animals find themselves in novel conditions outside the range which they have been shaped by evolution to cope with," he adds. Following on from this study, the researchers now plan to investigate how predation and feeding strategies interact to drive the formation of mixed-species groups.
Weather
2019
July 29, 2019
https://www.sciencedaily.com/releases/2019/07/190729111232.htm
Green infrastructure to manage more intense stormwater with climate change
UMD researchers are connecting climate change to urban and suburban stormwater management, with the ultimate goal of increasing resiliency to major storm events. With models predicting not only more rain but also an increased frequency of particularly intense and destructive storms, flooding is a major concern in communities that are becoming more settled with more asphalt. Flooding doesn't just cause property damage; it also impacts the health of the Chesapeake Bay through increased nutrient runoff and pollution. A new case study examines how well current stormwater management approaches will hold up under future climate conditions.
"What we design now is in place for 20 or 30 years, so we should design it with future climate conditions in mind as opposed to what the past rain has looked like," explains Mitchell Pavao-Zuckerman, assistant professor in Environmental Science & Technology. "This work puts emphasis on what's happening in local upland spaces that has immediate implications for the people who are living in these watersheds for future flood mitigation, but connects this to the broader issues of how increased runoff links to the health of the Chesapeake Bay." With this study, Pavao-Zuckerman and graduate student Emma Giese take a practical look at what suburban areas are currently doing to manage their stormwater, and provide some evidence on how and why to implement green infrastructure based on how these systems will hold up in the future. Pavao-Zuckerman and Giese leveraged data available from the United States Geological Survey (USGS) for two watersheds in Clarksburg, Maryland, a suburban town in Montgomery County that is only growing and continuing to develop. These two watersheds each have a distinct development history - one has several larger-scale detention ponds or stormwater basins for a more traditional approach to stormwater management, while the other has a heavy presence of smaller-scale green infrastructure like rain gardens, dry detention ponds, and sand filters. Both watersheds were monitored before and after development to see the impacts of green infrastructure, and both are near a weather monitoring station with climate data that is readily accessible. "Green infrastructure consists of things with a much smaller footprint than a stormwater basin, but there are more of them in the watershed, so it comes down to measuring the aggregated effect of a lot of small things in one watershed rather than one or two large things in another watershed," says Pavao-Zuckerman.
"Partnering with the USGS to have a good data source at the watershed scale and finding the right model for the question was key." To model future climate change scenarios for these two watersheds, Pavao-Zuckerman and Giese enlisted the help of Adel Shirmohammadi, professor and associate dean in the College of Agriculture & Natural Resources. "Together, we were able to use the USGS data to train the Soil and Water Assessment Tool or SWAT model, taking into account the geography of the watersheds, slope, soil type, impervious surface, built versus open space, and other parameters to determine how much rainfall actually becomes runoff or flooding risk," says Pavao-Zuckerman. Using this model, Pavao-Zuckerman and Giese were then able to take climate change projection data for increased storm frequency and rainfall to run a variety of future scenarios and see how these different watersheds would manage. "We've already seen a significant increase in rainfall in the present day, so we were surprised to see that our baseline present day measure was already seeing the effects of increased rain," says Pavao-Zuckerman. Ultimately, Pavao-Zuckerman and Giese found that the watershed with more green infrastructure was able to buffer and absorb more of the increased rainfall than the more traditionally designed watershed with larger stormwater basins. However, with larger or more intense rain events, both systems failed to handle the amount of rain successfully. "We are seeing more large storm events so either the systems are overwhelmed or are still saturated by the time the next storm event comes," says Pavao-Zuckerman. "So it is really the bigger rain events where we are seeing things not work as well, and that's concerning partly because we know that with climate change these more intense events are going to become more common.
This points to the need to plan for these more intense weather events in stormwater management infrastructure." To combat this issue, Pavao-Zuckerman and Giese did find that increasing the capacity for some of the existing systems or increasing the presence of green infrastructure in the watersheds made them more resilient to future extreme rain events. With that in mind, Pavao-Zuckerman and Giese worked with Amanda Rockler, watershed restoration specialist and senior agent with UMD Extension and the Maryland Sea Grant Program, to provide insight into what was feasible to implement. "Our work allows us to see what the added return on investment in these different climate and stormwater management scenarios might be," says Pavao-Zuckerman. "It's more concrete than just saying more green infrastructure is better, which isn't practical and might have a cost-benefit trade off."
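The SWAT model quoted above estimates "how much rainfall actually becomes runoff" using, among other options, the SCS curve-number method. A minimal sketch with illustrative curve numbers (the CN values are assumptions for a paved versus a greener watershed, not the study's calibrated parameters):

```python
# SCS curve-number runoff: S = 1000/CN - 10 is potential retention (inches),
# initial abstraction Ia = 0.2*S, and runoff Q = (P - Ia)^2 / (P + 0.8*S)
# whenever rainfall P exceeds Ia. Higher CN = more impervious surface.
def scs_runoff(rain_in: float, curve_number: float) -> float:
    """Runoff depth in inches for a storm dropping rain_in inches."""
    s = 1000.0 / curve_number - 10.0  # potential maximum retention
    ia = 0.2 * s                      # initial abstraction before runoff starts
    if rain_in <= ia:
        return 0.0
    return (rain_in - ia) ** 2 / (rain_in + 0.8 * s)

storm = 3.0  # inches of rain in one event
runoff_developed = scs_runoff(storm, 90)  # mostly paved watershed
runoff_green = scs_runoff(storm, 70)      # more green infrastructure
```

With these illustrative numbers the greener watershed sheds well under half the runoff of the paved one for the same storm, which is the qualitative effect the study measured.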
Weather
2019
July 24, 2019
https://www.sciencedaily.com/releases/2019/07/190724144152.htm
Climate change could revive medieval megadroughts in US Southwest
About a dozen megadroughts struck the American Southwest during the 9th through the 15th centuries, but then they mysteriously ceased around the year 1600. What caused this clustering of megadroughts -- that is, severe droughts that last for decades -- and why do they happen at all?
If scientists can understand why megadroughts happened in the past, it can help us better predict whether, how, and where they might happen in the future. Previously, scientists have studied the individual factors that contribute to megadroughts. In the new study, a team of scientists at Columbia University's Lamont-Doherty Earth Observatory has looked at how multiple factors from the global climate system work together, and projected that warming climate may bring a new round of megadroughts. By reconstructing aquatic climate data and sea-surface temperatures from the last 2,000 years, the team found three key factors that led to megadroughts in the American Southwest: radiative forcing, severe and frequent La Niña events -- cool tropical Pacific sea surface temperatures that cause changes to global weather events -- and warm conditions in the Atlantic. High radiative forcing appears to have dried out the American Southwest, likely due to an increase in solar activity (which would send more radiation toward us) and a decrease in volcanic activity (which would admit more of it) at the time. The resulting increase in heat would lead to greater evaporation. At the same time, warmer than usual Atlantic sea-surface temperatures combined with very strong and frequent La Niñas decreased precipitation in the already dried-out area. Of these three factors, La Niña conditions were estimated to be more than twice as important in causing the megadroughts. While the Lamont scientists say they were able to pinpoint the causes of megadroughts in a more complete way than has been done before, they say such events will remain difficult for scientists to predict. There are predictions about future trends in temperatures, aridity, and sea surface temperatures, but future El Niño and La Niña activity remains difficult to simulate.
Nevertheless, the researchers conclude that human-driven climate change is stacking the deck towards more megadroughts in the future. "Because you increase the baseline aridity, in the future when you have a big La Niña, or several of them in a row, it could lead to megadroughts in the American West," explained lead author Nathan Steiger, a Lamont-Doherty Earth Observatory hydroclimatologist. During the time of the medieval megadroughts, increased radiative forcing was caused by natural climate variability. But today we are experiencing increased dryness in many locations around the globe due to human-made forces. Climate change is setting the stage for an increased possibility of megadroughts in the future through greater aridity, say the researchers.
Weather
2019
July 19, 2019
https://www.sciencedaily.com/releases/2019/07/190719135530.htm
Smart irrigation model predicts rainfall to conserve water
Fresh water isn't unlimited. Rainfall isn't predictable. And plants aren't always thirsty.
Just 3 percent of the world's water is drinkable, and more than 70 percent of that fresh water is used for agriculture. Unnecessary irrigation wastes huge amounts of water -- some crops are watered twice as much as they need -- and contributes to the pollution of aquifers, lakes and oceans. A predictive model combining information about plant physiology, real-time soil conditions and weather forecasts can help make more informed decisions about when and how much to irrigate. This could save 40 percent of the water consumed by more traditional methods, according to new Cornell University research. "If you have a framework to connect all these excellent sources of big data and machine learning, we can make agriculture smart," said Fengqi You, energy systems engineering professor. You is the senior author of "Robust Model Predictive Control of Irrigation Systems With Active Uncertainty Learning and Data Analytics." "These crops, when grown in the semiarid, semidesert environment of California's Central Valley, are huge consumers of water -- one gallon of water per almond," Stroock said. "So there's a real opportunity to improve the way we manage water in these contexts." Controlling plant moisture precisely could also improve the quality of sensitive specialty crops such as wine grapes, he said. The researchers' method uses historical weather data and machine learning to assess the uncertainty of the real-time weather forecast, as well as the uncertainty of how much water will be lost to the atmosphere from leaves and soil. This is combined with a physical model describing variations in the soil moisture. Integrating these approaches, they found, makes watering decisions much more precise. Part of the challenge of the research is identifying the best method for each crop, and determining the costs and benefits of switching to an automated system from a human-operated one.
Because apple trees are relatively small and respond quickly to changes in precipitation, they may not require weeks or months of weather data. Almond trees, which tend to be larger and slower to adapt, benefit from longer-term predictions. "We need to assess the right level of complexity for a control strategy, and the fanciest might not make the most sense," Stroock said. "The experts with their hands on the valves are pretty good. We have to make sure that if we're going to propose that somebody invest in new technology, we've got to be better than those experts."
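The core idea of forecast-aware irrigation can be sketched very simply: only apply the water that rain is unlikely to supply. This toy decision rule is an illustration of the concept, not the Cornell controller; the target moisture, forecast, and reliability figure are all assumed values:

```python
# Toy forecast-aware irrigation: discount the forecast rain by its estimated
# reliability, then top up only the remaining soil-moisture deficit.
def irrigation_mm(soil_moisture_mm: float, target_mm: float,
                  forecast_rain_mm: float, forecast_reliability: float) -> float:
    expected_rain = forecast_rain_mm * forecast_reliability  # hedge the forecast
    deficit = target_mm - soil_moisture_mm - expected_rain
    return max(0.0, deficit)  # never irrigate a negative amount

# Dry soil (35 of a 60 mm target), but 20 mm of rain forecast at 75% reliability:
amount = irrigation_mm(soil_moisture_mm=35, target_mm=60,
                       forecast_rain_mm=20, forecast_reliability=0.75)
# -> 10.0 mm, versus the naive 25 mm top-up that ignores the forecast
```

The real system replaces the single reliability number with machine-learned uncertainty estimates and a soil-physics model, but the water savings come from the same logic.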
Weather
2019
July 17, 2019
https://www.sciencedaily.com/releases/2019/07/190717142639.htm
Correcting historic sea surface temperature measurements
Something odd happened in the oceans in the early 20th century. The North Atlantic and Northeast Pacific appeared to warm twice as much as the global average while the Northwest Pacific cooled over several decades.
Atmospheric and oceanic models have had trouble accounting for these differences in temperature changes, leading to a mystery in climate science: why did the oceans warm and cool at such different rates in the early 20th century? Now, research from Harvard University and the UK's National Oceanography Centre points to an answer both as mundane as a decimal point truncation and as complicated as global politics. Part history, part climate science, this research corrects decades of data and suggests that ocean warming occurred in a much more homogeneous way. Humans have been measuring and recording the sea surface temperature for centuries. Sea surface temperatures helped sailors verify their course, find their bearings, and predict stormy weather. Until the 1960s, most sea surface temperature measurements were taken by dropping a bucket into the ocean and measuring the temperature of the water inside. The National Oceanic and Atmospheric Administration (NOAA) and the National Science Foundation's National Center for Atmospheric Research (NCAR) maintain a collection of sea surface temperature readings dating back to the early 19th Century. The database contains more than 155 million observations from fishing, merchant, research and navy ships from all over the world. These observations are vital to understanding changes in ocean surface temperature over time, both natural and anthropogenic. They are also a statistical nightmare. How do you compare, for example, the measurements of a British Man-of-War from 1820 to a Japanese fishing vessel from 1920 to a U.S. Navy ship from 1950? How do you know what kind of buckets were used, and how much they were warmed by sunshine or cooled by evaporation while being sampled? For example, a canvas bucket left on a deck for three minutes under typical weather conditions can cool by 0.5 degrees Celsius more than a wooden bucket measured under the same conditions.
Given that global warming during the 20th Century was about 1 degree Celsius, the biases associated with different measurement protocols require careful accounting. "There are gigabytes of data in this database and every piece has a quirky story," said Peter Huybers, Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and senior author of the paper. "The data is rife with peculiarities." A lot of research has been done to identify and adjust for these peculiarities. In 2008, for example, researchers found that a 0.3-degree Celsius jump in sea surface temperatures in 1945 was the result of measurements taken from engine room intakes. Even with these corrections, however, the data is far from perfect and there are still unexplained changes in sea surface temperature. In this research, Huybers and his colleagues proposed a comprehensive approach to correcting the data, using a new statistical technique that compares measurements taken by nearby ships. "Our approach looks at the differences in sea surface temperature measurements from distinct groups of ships when they pass nearby, within 300 kilometers and two days of one another," said Duo Chan, a graduate student in the Harvard Graduate School of Arts and Sciences and first author of the paper. "Using this approach, we found 17.8 million near crossings and identified some big biases in some groups." The researchers focused on data from 1908 to 1941, broken down by the country of origin of the ship and the "decks," a term stemming from the fact that marine observations were stored using decks of punch cards.
One deck includes observations from both Robert Falcon Scott's and Ernest Shackleton's voyages to the Antarctic. "These data have made a long journey from the original logbooks to the modern archive and difficult choices were made to fit the available information onto punch cards or a manageable number of magnetic tape reels," said Elizabeth Kent, a co-author from the UK National Oceanography Centre. "We now have both the methods and the computer power to reveal how those choices have affected the data, and also pick out biases due to variations in observing practice by different nations, bringing us closer to the real historical temperatures." The researchers found two new key causes of the warming discrepancies in the North Pacific and North Atlantic. The first had to do with changes in Japanese records. Prior to 1932, most records of sea surface temperature from Japanese vessels in the North Pacific came from fishing vessels. This data, spread across several different decks, was originally recorded in whole-degrees Fahrenheit, then converted to Celsius, and finally rounded to tenths-of-a-degree. However, in the lead-up to World War II, more and more Japanese readings came from naval ships. These data were stored in a different deck and when the U.S. Air Force digitized the collection, they truncated the data, chopping off the tenths-of-a-degree digits and recording the information in whole-degree Celsius. Unrecognized effects of truncation largely explain the rapid cooling apparent in previous estimates of Pacific sea surface temperatures between 1935 and 1941, said Huybers.
After correcting for the bias introduced by truncation, the warming in the Pacific is much more uniform. While Japanese data holds the key to warming in the Pacific in the early 20th century, it's German data that plays the most important role in understanding sea surface temperatures in the North Atlantic during the same time. In the late 1920s, German ships began providing a majority of data in the North Atlantic. Most of these measurements are collected in one deck, which, when compared to nearby measurements, is significantly warmer. When adjusted, the warming in the North Atlantic becomes more gradual. With these adjustments, the researchers found that rates of warming across the North Pacific and North Atlantic become much more similar and have a warming pattern closer to what would be expected from rising greenhouse gas concentrations. However, discrepancies still remain and the overall rate of warming found in the measurements is still faster than predicted by model simulations. "Remaining mismatches highlight the importance of continuing to explore how the climate has been radiatively forced, the sensitivity of the climate, and its intrinsic variability. At the same time, we need to continue combing through the data -- through data science, historical sleuthing, and a good physical understanding of the problem, I bet that additional interesting features will be uncovered," said Huybers. This research was co-authored by David I. Berry from the UK National Oceanography Centre. The research was supported by the Harvard Global Institute, the National Science Foundation, and the Natural Environment Research Council.
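The "near crossings" idea described above (pairs of observations within 300 kilometers and two days of each other) can be sketched as a pairwise offset estimate between two groups of ships. This is a simplified illustration of the comparison step, with made-up observations, not the paper's full statistical model:

```python
import math
from itertools import product

# Great-circle distance between two lat/lon points, in kilometers.
def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Mean SST offset (group A minus group B) over all "near crossings":
# observation pairs within max_km and max_days of each other.
def group_offset(obs_a, obs_b, max_km=300.0, max_days=2.0):
    """obs_*: lists of (day, lat, lon, sst). Returns None if no crossings."""
    diffs = [a[3] - b[3] for a, b in product(obs_a, obs_b)
             if abs(a[0] - b[0]) <= max_days
             and haversine_km(a[1], a[2], b[1], b[2]) <= max_km]
    return sum(diffs) / len(diffs) if diffs else None

# Two ships one day and ~56 km apart: a systematic +0.4 C offset emerges.
offset = group_offset([(0, 10.0, 20.0, 25.4)], [(1, 10.5, 20.0, 25.0)])
```

Scaled up to 17.8 million crossings and grouped by nation and deck, this kind of averaging is what exposes a whole group of ships reading systematically warm or cold.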
Weather
2019
July 16, 2019
https://www.sciencedaily.com/releases/2019/07/190716095504.htm
Tracking down climate change with radar eyes
Over the past 22 years, sea levels in the Arctic have risen an average of 2.2 millimeters per year. This is the conclusion of a Danish-German research team after evaluating 1.5 billion radar measurements of various satellites using specially developed algorithms.
"The Arctic is a hotspot of climate change," explains Prof. Florian Seitz of the German Geodetic Research Institute at the Technical University of Munich (TUM). "Due to rising temperatures, the glaciers of Greenland are receding. At the same time sea ice is melting. Every year, billions of liters of meltwater are released into the ocean." The enormous volumes of fresh water released in the Arctic not only raise the sea level, they also have the potential to change the system of global ocean currents -- and thus, our climate.But how fast do sea levels rise? And precisely what effect does this have? To answer these questions, climatologists and oceanographers require specific measurements over as long a period as possible.In a collaborative effort, researchers from the Technical University of Denmark (DTU) and from the TUM have now documented sea-level changes in the Arctic over more than two decades. "This study is based on radar measurements from space via so-called altimetry satellites and covers the period from 1991 to 2018. Thus, we have obtained the most complete and precise overview of the sea level changes in the Arctic Ocean to date. This information is important in terms of being able to estimate future sea levels associated with climate change," says Stine Kildegaard Rose, Ph.D., researcher at Space DTU."The challenge lies in finding the water signals in the measured data: Radar satellites measure only the distance to the surface: Albeit, vast areas of the Arctic are covered with ice, which obscures the seawater," explains Dr. Marcello Passaro. The TUM researcher has developed algorithms to evaluate radar echoes reflected from the water where it reaches the surface through cracks in the ice.Using these algorithms, Passaro processed and homogenized 1.5 billion radar measurements from the ERS-2 and Envisat satellites. 
On the basis of the signals tracked at the TUM, the DTU team worked on the post-processing of these data and added the measurements collected by the current CryoSat radar mission. From monthly averages to a climate trend: the researchers created a map with lattice points to represent the monthly sea level elevations for the period between 1996 and 2018. The sum of the monthly maps reveals the long-term trend: the Arctic sea level rose by an average of 2.2 millimeters per year. There are, however, significant regional differences. Within the Beaufort Gyre, north of Greenland, Canada and Alaska, sea levels rose twice as fast as on average -- more than 10 centimeters in 22 years. The reason: the low-salinity meltwater collects here, while a steady east wind produces currents that prevent the meltwater from mixing with other ocean currents. Along the coast of Greenland, on the other hand, the sea level is falling -- on the west coast by more than 5 mm per year, because the melting glaciers weaken the attractive force of gravity there. "The homogenized and processed measurements will allow climate researchers and oceanographers to review and improve their models in the future," concludes Passaro.
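The step from monthly maps to a single millimeters-per-year figure is, at heart, a least-squares fit of a line to a monthly time series. The sketch below is a minimal illustration, not the study's processing chain: the synthetic series simply assumes the paper's Arctic-average rate of 2.2 mm/yr plus an invented seasonal cycle, and the fit recovers a rate close to it.

```python
# Minimal sketch: extract a long-term trend from monthly means by
# ordinary least squares. The synthetic data assume a 2.2 mm/yr rise
# plus a 15 mm seasonal cycle; amplitudes are illustrative only.
import math

def linear_trend(values):
    """Least-squares slope of values against their index (per step)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

months = 22 * 12  # roughly the 1996-2018 span
true_rate_mm_per_year = 2.2
series = [
    true_rate_mm_per_year * (m / 12.0)          # long-term rise
    + 15.0 * math.sin(2 * math.pi * m / 12.0)   # seasonal cycle
    for m in range(months)
]

rate = linear_trend(series) * 12.0  # per-month slope -> mm per year
print(f"estimated trend: {rate:.2f} mm/yr")
```

The seasonal cycle averages out over complete years, so the fitted slope lands near the underlying rate even without explicitly removing the annual signal.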
Weather
2019
July 16, 2019
https://www.sciencedaily.com/releases/2019/07/190716073719.htm
Joshua trees facing extinction
They outlived mammoths and saber-toothed tigers. But without dramatic action to reduce climate change, new research shows Joshua trees won't survive much past this century.
UC Riverside scientists wanted to verify earlier studies predicting global warming's deadly effect on the namesake trees that millions flock to see every year in Joshua Tree National Park. They also wanted to learn whether the trees are already in trouble. Using multiple methods, the study arrived at several possible outcomes. In the best-case scenario, major efforts to reduce heat-trapping gases in the atmosphere would save 19 percent of the tree habitat after the year 2070. In the worst case, with no reduction in carbon emissions, the park would retain a mere 0.02 percent of its Joshua tree habitat. The team's findings were published recently. To answer their questions about whether climate change is already having an effect, a large group of volunteers helped the team gather data about more than 4,000 trees. They found that Joshua trees have been migrating to higher-elevation parts of the park with cooler weather and more moisture in the ground. In hotter, drier areas, the adult trees aren't producing as many younger plants, and the ones they do produce aren't surviving. Joshua trees as a species have existed since the Pleistocene era, about 2.5 million years ago, and individual trees can live up to 300 years. One of the ways adult trees survive so long is by storing large reserves of water to weather droughts. Younger trees and seedlings aren't capable of holding reserves in this way though, and the most recent, 376-week-long drought in California left the ground in some places without enough water to support new young plants. As the climate changes, long periods of drought are likely to occur with more frequency, leading to issues with the trees like those already observed. An additional finding of this study is that in the cooler, wetter parts of the park the biggest threat other than climate change is fire. Fewer than 10 percent of Joshua trees survive wildfires, which have been exacerbated in recent years by smog from car and industrial exhaust.
The smog deposits nitrogen on the ground, which in turn feeds non-native grasses that act as kindling for wildfires. As a partner on this project, the U.S. Park Service is using this information to mitigate fire risk by removing the invasive plants. "Fires are just as much a threat to the trees as climate change, and removing grasses is a way park rangers are helping to protect the area today," Sweet said. "By protecting the trees, they're protecting a host of other native insects and animals that depend on them as well." UCR animal ecologist and paper co-author Cameron Barrows conducted a similar research project in 2012, which also found Joshua tree populations would decline, based on models assuming a temperature rise of three degrees. However, this newer study considered a climate change scenario using twice as many variables, including soil-water estimates, rainfall, soil types, and more. In addition, Barrows said on-the-ground observations were essential to verifying the climate models this newer team had constructed. Quoting the statistician George Box, Barrows said, "All models are wrong, but some are useful." Barrows went on to say, "Here, the data we collected outdoors showed us where our models gave us the most informative glimpse into the future of the park." For this study, the UC Riverside Center for Conservation Biology partnered with Earthwatch Institute to recruit the volunteer scientists. Barrows and Sweet both recommend joining such organizations as a way to help find solutions to the park's problems. "I hope members of the public read this and think, 'Someone like me could volunteer to help scientists get the kind of data that might lend itself to concrete, protective actions,'" Barrows said.
Weather
2019
July 15, 2019
https://www.sciencedaily.com/releases/2019/07/190715094606.htm
How much water do snowpacks hold? A better way to answer the question
Oregon State University researchers have developed a new computer model for calculating the water content of snowpacks, providing an important tool for water resource managers and avalanche forecasters as well as scientists.
"In many places around the world, snow is a critical component of the hydrological cycle," said OSU civil engineering professor David Hill. "Directly measuring snow-water equivalent is difficult and expensive and can't be done everywhere. But information about snow depth is much easier to get, so our model, which more accurately estimates snow-water equivalent from snow depth than earlier models, is a big step forward."The findings, published in The project is called Community Snow Observations and is part of NASA's Citizen Science for Earth Systems program. Snowshoers, backcountry skiers and snow-machine users are gathering data to use in computer modeling of snow-water equivalent, or SWE.The Community Snow Observations research team kicked off in February 2017. Led by Hill, Gabe Wolken of the University of Alaska Fairbanks and Anthony Arendt of the University of Washington, the project originally focused on Alaskan snowpacks. Researchers then started recruiting citizen scientists in the Pacific Northwest. Currently, the project has more than 2,000 participants.The University of Alaska Fairbanks has spearheaded the public involvement aspect of the project, while the University of Washington's chief role is managing the data. 
Hill and Crumley are responsible for the modeling. In addition to snow depth information collected and uploaded by recreationists using avalanche probes, vast amounts of data are also available thanks to LIDAR, a remote sensing method that uses a pulsed laser to map the Earth's topography. The new model developed by the Community Snow Observations team and collaborators at the University of New Hampshire calculates snow-water equivalent by factoring in snow depth, time of year, 30-year averages (normals) of winter precipitation, and seasonal differences between warm and cold temperatures. "Using those climate normals rather than daily weather data allows our model to provide SWE estimates for areas far from any weather station," Hill said. Researchers validated the model against a database of snow pillow measurements -- a snow pillow measures snow-water equivalents via the pressure exerted by the snow atop it -- as well as a pair of large independent data sets, one from western North America, the other from the northeastern United States. "We also compared the model against three other models of varying degrees of complexity built in a variety of geographic regions," Hill said. "The results show our model performed better than all of them against the validation data sets. It's an effective, easy-to-use means of estimation very useful for vast areas lacking weather instrumentation -- areas for which snow depth data are readily available and daily weather data aren't."
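The published model is a fitted regression; as a toy stand-in for the core depth-to-SWE idea, one can multiply snow depth by a bulk density that increases as the snowpack ages and compacts through the season. Every coefficient below is an illustrative assumption, not the team's model:

```python
# Toy sketch of estimating snow-water equivalent (SWE) from depth plus
# time of year (NOT the published Community Snow Observations model):
# assume bulk snow density densifies linearly through the season.
# rho_start, rho_end and the 210-day season length are invented values.

def estimate_swe_mm(depth_cm, day_of_water_year, rho_start=0.20, rho_end=0.40):
    """SWE in mm of water from snow depth in cm, assuming bulk density
    (g/cm^3) increases linearly from rho_start to rho_end over 210 days."""
    frac = min(max(day_of_water_year / 210.0, 0.0), 1.0)
    density = rho_start + (rho_end - rho_start) * frac
    return depth_cm * density * 10.0  # cm of water -> mm

# The same 100 cm snowpack holds more water late in the season,
# because the older snow is denser.
early = estimate_swe_mm(100, day_of_water_year=60)
late = estimate_swe_mm(100, day_of_water_year=180)
print(f"early-season SWE: {early:.0f} mm, late-season SWE: {late:.0f} mm")
```

This captures why depth alone is ambiguous -- the model's other inputs (precipitation normals, seasonal temperature differences) serve to pin down that density term for a given location.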
Weather
2019
July 9, 2019
https://www.sciencedaily.com/releases/2019/07/190709153609.htm
A drier future sets the stage for more wildfires
November 8, 2018 was a dry day in Butte County, California. The state was in its sixth consecutive year of drought, and the county had not had a rainfall event producing more than a half inch of rain for seven months. The dry summer had parched the spring vegetation, and the strong northeasterly winds of autumn were gusting at 35 miles per hour and rising, creating red flag conditions: Any planned or unplanned fires could quickly get out of control.
Sure enough, just before daybreak, strong winds whipped a stray spark from a power line into an inferno. The Camp Fire became the most destructive fire in California's history, scorching approximately 240 square miles, destroying nearly 14,000 buildings, causing billions of dollars in damage and killing 88 people. Later the same day, the Woolsey Fire broke out in Los Angeles County, burning 150 square miles and killing three. Droughts can create ideal conditions for wildfires. Lack of rain and low humidity dry out trees and vegetation, providing fuel. In these conditions, a spark from lightning, electrical failures, human error or planned fires can quickly get out of control. Global climate change is predicted to change precipitation and evaporation patterns around the world, leading to wetter climate in some areas and drier in others. Areas that face increasingly severe droughts will also be at risk for more and larger fires. Several NASA missions collect valuable data to help scientists and emergency responders monitor droughts and fires. Some instruments monitor water in and below the soil, helping to assess whether areas are moving toward dangerous droughts. Others watch for heat and smoke from fires, supporting both research and active disaster recovery. Understanding how fires behave in dry conditions can help firefighters, first responders and others prepare for a hotter, drier future. Earth's warming climate is forecast to make global precipitation patterns more extreme: wet areas will become wetter, and dry areas will become drier. Areas such as the American Southwest could see both reduced rainfall and increased soil moisture evaporation due to more intense heat, and in some cases, the resulting droughts could be more intense than any drought of the past millennium. Ben Cook of NASA's Goddard Institute for Space Studies (GISS) in New York City researches "megadroughts" -- droughts lasting more than three decades.
Megadroughts have occurred in the past, like the decades-long North American droughts between 1100 and 1300, and the team used tree ring records to compare these droughts with future projections. He and his team examined soil moisture data sets and drought severity indices from 17 different future climate models, and they all predicted that if greenhouse gas emissions continue to increase at their present rate, the risk of a megadrought in the American Southwest could hit 80 percent by the end of the century. Additionally, these droughts will likely be even more severe than those seen in the last millennium. Such severe droughts will affect the amount and dryness of fuel such as trees and grass, Cook said. "Fire depends on two things: having enough fuel and drying that fuel out so it can catch fire. So in the short term, more droughts probably mean more fire as the vegetation dries out," said Cook. "If those droughts continue for a long period, like a megadrought, however, it can actually mean less fire, because the vegetation will not grow back as vigorously, and you may run out of fuel to burn. It's definitely complicated." Current and future NASA measurements of soil moisture and precipitation will help to evaluate climate models' predictions, making them even more accurate and useful for understanding Earth's changing climate. Cook and his GISS colleague Kate Marvel were the first to provide evidence that human-generated greenhouse gas emissions were influencing observed drought patterns as long ago as the early 1900s.
By showing that human activities have already affected drought in the past, their research provides evidence that climate change from human-generated greenhouse gas emissions will likely influence drought in the future. If the future does hold megadroughts for the southwestern United States, what might this mean for its fire seasons? "Once we change the climatology and get drier and drier fuels, we should expect more intense fires and higher fire severity," said Adam Kochanski, an atmospheric scientist at the University of Utah, referring to the size and impact of the fires. If fuels are moist, the fire is more likely to stay close to the ground and be less destructive, he said. Dry trees and plants make it more likely that flames will reach the forest canopy, making the fire more destructive and harder to control. Kochanski and Jan Mandel of the University of Colorado Denver used data from NASA and other sources to simulate the interactions between wildfires, soil moisture and local weather. They built on previous work by the National Center for Atmospheric Research (NCAR) and others to develop the SFIRE module for the widely used Weather Research and Forecasting model (WRF). This module uses data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) aboard its Aqua and Terra satellites, and the Visible Infrared Imaging Radiometer Suite (VIIRS) aboard the Suomi National Polar-Orbiting Partnership (Suomi NPP) spacecraft. Weather influences fires, but fires also influence local weather by producing heat, water vapor and smoke, Kochanski said. The winds from large fires can alter local weather patterns, and in extreme conditions, generate firestorms and fire tornadoes. "It's not uncommon for people involved in wildland fires to report that although the wind is not very strong, the fires propagate very fast," Kochanski said. "If it isn't that windy, but your fire is intense and releases a lot of heat, it has the potential to generate its own winds.
Even if the ambient winds are weak, this fire will start moving as if it were really windy." Better modeling of these interactions not only helps firefighters better predict where and how a wildfire might spread, but also helps forest managers know whether a planned burn is safe. Fires' effects persist long after they are extinguished, and the availability or lack of fresh water plays an important role in vegetation regrowth and recovery. Dry conditions may prevent new seeds from germinating in the burned areas. Vegetation loss can lead to erosion and sediment blocking waterways, and firefighting chemicals may contaminate water sources. Forest fires can have impacts on future winter snowpacks as well, said Kelly Gleason, a snow hydrologist and assistant professor at Portland State University. "Snowpack" refers to the snow that accumulates over an entire winter, rather than a single snowfall. Here too, NASA data are key to understanding the processes involved. Gleason and her team used 16 years of data from NASA's MODIS instrument to investigate wildfires' effects on snow melt in forests in the American West. They discovered that soot and debris from fire makes snow darker and less reflective for up to 15 years after a fire. "It's like wearing a black T-shirt on a sunny day," Gleason said. "It primes the snowpack to absorb more sunlight energy. And there's more energy anyway, because the forest canopy was burned, so more sun comes through." Their survey of roughly 850 fires between 2000 and 2016 showed that snow in burned forests melted, on average, five days earlier than snow in unburned forests. In some areas the snow melted weeks or months earlier than normal, Gleason said. "Every year we experience earlier snow melt, there are strong relationships with big, hot, long-lasting fires the following summer," she said.
"It creates this vicious cycle where snow melts earlier due to climate change, which extends the summer drought period where the soil dries out, and when the fuels dry out, you get these big fires. This further accelerates snowmelt, further extending the summer drought period and fire potential."Mandel and Kochanski's fire-atmosphere model is already in operational use in Israel and Greece. While the software requires computing expertise to use, it is available for free, consistent with NASA's mission to freely provide its data and other products to the public.Branko Kosović, program manager for Renewable Energy for the Research Applications Laboratory and director of the Weather Systems and Assessment Program at NCAR, also used WRF to develop the fire prediction system for the state of Colorado's Division of Fire Prevention and Control. This model uses a related module called FIRE and produces a fire, weather and smoke forecast useful for both wildfires and planned fires.Kosović is also using the WRF system for his research, which uses NASA remote sensing data and machine learning to estimate fuel moisture daily over the contiguous Unites States."Measuring live fuel moisture [currently] has to be done manually," Kosović said. "People have to go out, take the live fuel, and essentially cure it in ovens to see how much moisture there is. It's very labor intensive. And you can imagine that, because of that, the data is sparse, both in space and in frequency and time."Kosović, Mandel and Kochanski hope to build systems that will give forest managers better information to plan controlled fires and help improve resource allocation during wildfires, leading to better risk assessment and recovery.NASA scientists monitor both freshwater and fires constantly, from space, the air and the ground, collecting short- and long-term data as Earth's climate continues to change. 
Programs such as the NASA Earth Science Disasters Program use satellite data to track active fires, monitor their effects on air quality and perform research that helps communities be more prepared before disasters strike. And looking to the future, modeling plays a key role in preparing for changing drought and fire seasons around the world.
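Gleason's burned-versus-unburned result described above boils down to a paired comparison: for each fire, difference the snow-disappearance date inside the burn scar against a nearby unburned control, then average. A minimal sketch, with hypothetical day-of-year values standing in for the satellite-derived dates:

```python
# Sketch of the paired comparison behind the "five days earlier" figure.
# The day-of-year observations below are invented for illustration,
# not values from the MODIS-based survey.

def mean_melt_shift(burned_doy, unburned_doy):
    """Mean (burned - unburned) snow-disappearance day of year.
    Negative means snow in burned forests melted out earlier."""
    assert len(burned_doy) == len(unburned_doy)
    diffs = [b - u for b, u in zip(burned_doy, unburned_doy)]
    return sum(diffs) / len(diffs)

# Hypothetical fire/control pairs (day of year snow disappeared).
burned = [115, 98, 130, 121, 104]
unburned = [121, 101, 136, 128, 107]

shift = mean_melt_shift(burned, unburned)
print(f"mean shift: {shift:+.1f} days")  # prints -5.0 for these made-up pairs
```

Pairing each burned site with a nearby control is what isolates the fire effect from regional year-to-year weather, which shifts both sites together.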
Weather
2019
July 9, 2019
https://www.sciencedaily.com/releases/2019/07/190709091131.htm
A clearer picture of global ice sheet mass
Fluctuations in the masses of the world's largest ice sheets carry important consequences for future sea level rise, but understanding the complicated interplay of atmospheric conditions, snowfall input and melting processes has never been easy to measure due to the sheer size and remoteness inherent to glacial landscapes.
Much has changed for the better in the past decade, according to a new review paper co-authored by researchers at the University of Colorado Boulder, NASA, Utrecht University and Delft University of Technology, and published recently. The study outlines improvements in satellite imaging and remote sensing equipment that have allowed scientists to measure ice mass in greater detail than ever before. "We've come a long way in the last 10 years from an observational perspective," said Jan Lenaerts, lead author of the research and an assistant professor in CU Boulder's Department of Atmospheric and Oceanic Sciences (ATOC). "Knowing what happens to ice sheets in terms of mass in, mass out allows us to better connect climate variations to ice mass and how much the mass has changed over time." Ice sheets primarily gain mass from precipitation and lose it due to solid ice discharge and runoff of melt water. Precipitation and runoff, along with other surface processes, collectively determine the surface mass balance. The Antarctic Ice Sheet, the world's largest, is cold year-round with only marginal summer melting. A small increase or decrease in yearly snowfall, then, can make a considerable difference in surface mass because the addition or subtraction is compounded over a massive area. "Snowfall is dominant over Antarctica and will stay that way for the next few decades," Lenaerts said. "And we've seen that as the atmosphere warms due to climate change, that leads to more snowfall, which somewhat mitigates the loss of ice sheet mass there. Greenland, by contrast, experiences abundant summer melt, which controls much of its present and future ice loss." In years past, climate models would have been unable to render the subtleties of snowfall in such a remote area. Now, thanks to automated weather stations, airborne sensors and Earth-orbit satellites such as NASA's Gravity Recovery and Climate Experiment (GRACE) mission, these models have been improved considerably.
They produce realistic ice sheet surface mass balance, allow for greater spatial precision and account for regional variation as well as wind-driven snow redistribution -- a degree of detail that would have been unheard of as recently as the early 2000s. "If you don't have the input variable right, you start off on the wrong foot," Lenaerts said. "We've focused on snowfall because it heavily influences the ice sheet's fate. Airborne observations and satellites have been instrumental in giving a better view of all these processes." Ground-based radar systems and ice core samples provide a useful historical archive, allowing scientists to go back in time and observe changes in the ice sheet over long periods of time. But while current technologies allow for greater spatial monitoring, they lack the ability to measure snow density, which is a crucial variable to translate these measurements into mass changes. The biggest opportunity may lie in cosmic ray counters, which measure surface mass balance directly by measuring neutrons produced by cosmic ray collisions in Earth's atmosphere, which linger in water and can be read by a sensor. Over long periods of time, an array of these devices could theoretically provide even greater detail still. Overall, Lenaerts said, the field of ice sheet observation has come of age in recent years, but still stands to benefit from additional resources. "The community of researchers studying these issues is still relatively small, but it's already a global community and interest is growing," he said. "We'd like to get to a point where ice sheet mass processes are factored into global climate and Earth system models, to really show that bigger picture."
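The mass bookkeeping described above (precipitation in; runoff and other surface losses out; solid ice discharge subtracted separately) can be written as two one-line functions. The gigatonne figures below are invented for illustration, not measurements from any ice sheet:

```python
# Minimal bookkeeping sketch of ice-sheet mass balance terms.
# All numbers are hypothetical Gt/yr, chosen only to show the arithmetic.

def surface_mass_balance(precip_gt, runoff_gt, sublimation_gt=0.0):
    """Surface mass balance (SMB) in Gt/yr: snowfall input minus
    meltwater runoff and sublimation at the surface."""
    return precip_gt - runoff_gt - sublimation_gt

def total_mass_change(smb_gt, discharge_gt):
    """Total ice-sheet mass change: SMB minus solid ice discharge
    across the grounding line."""
    return smb_gt - discharge_gt

smb = surface_mass_balance(precip_gt=700.0, runoff_gt=350.0, sublimation_gt=50.0)
change = total_mass_change(smb, discharge_gt=500.0)
print(f"SMB: {smb:.0f} Gt/yr, net mass change: {change:.0f} Gt/yr")
```

The structure makes the article's point concrete: with a positive SMB the sheet can still lose mass overall if discharge exceeds it, and a small shift in the snowfall term moves the whole budget.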
Weather
2019
July 9, 2019
https://www.sciencedaily.com/releases/2019/07/190709091125.htm
How much do climate fluctuations matter for global crop yields?
The El Niño-Southern Oscillation has been responsible for widespread, simultaneous crop failures in recent history, according to a new study from researchers at Columbia University's International Research Institute for Climate and Society, the International Food Policy Research Institute (IFPRI) and other partners. This finding runs counter to a central pillar of the global agriculture system, which assumes that crop failures in geographically distant breadbasket regions such as the United States, China and Argentina are unrelated. The results also underscore the potential opportunity to manage such climate risks, which can be predicted using seasonal climate forecasts.
"Global agriculture counts on the strong likelihood that poor production in one part of the world will be made up for by good production elsewhere," said Weston Anderson, a postdoctoral research scientist at the International Research Institute for Climate and Society and lead author on the study. Of course, there's always a chance -- however small -- that it won't. The assumption until now has been that widespread crop failures would come from a set of random, adverse weather events, Anderson said. He and his co-authors decided to test this idea by looking at the impact that the El Niño-Southern Oscillation, the Indian Ocean Dipole, and other well-understood climate patterns have had on global production of corn, soybeans and wheat. They analyzed how these modes of climate variability influenced drought and heat in major growing regions. "We found that ENSO can, and has, forced multiple breadbasket failures, including a significant one in 1983," said Anderson. "The problem with pooling our risk as a mitigating strategy is that it assumes failures are random. But we know that strong El Niño or La Niña events in effect organize which regions experience drought and extreme temperatures. For some crops, that reorganization forces poor yields in multiple major production regions simultaneously." How important is the influence of climate variability? The authors found that, on a global level, corn is the most susceptible to such crop failures. They found that 18% of the year-to-year changes in corn production were the result of climate variability. Soybeans and wheat were found to be less at risk for simultaneous failures, with climate variability accounting for 7% and 6% of the changes in global production, respectively. "The bigger the uncertainty around climate drivers, the bigger the risk for those involved in the food systems," said co-author Liangzhi You, a senior research fellow at the International Food Policy Research Institute.
"The worst affected are poor farmers in developing countries whose livelihoods depend upon crop yields as they do not have an appetite for risks in absence of formal insurance products or other coping mechanisms." The risk is further exacerbated by challenges posed by lack of infrastructure and resources in developing countries."ENSO may not be important in all years, but it is the only thing we know of that has forced simultaneous global-scale crop failures" said Anderson.Within specific regions, the risk to agriculture by climate variability can be much higher. For example, across much of Africa and in Northeast Brazil, ENSO and other recurring climate phenomena accounted for 40-65% of the ups and downs of food production. In other regions, the number was as low as 10%.While on the surface this may appear to mean that those areas more affected by ENSO and other climate patterns are more at risk to extreme events, the numbers actually reflect a link to climate patterns that can be monitored and predicted."What excites me about this work is that it shows how predictable modes of climate variability impact crop production in multiple regions and can scale up to influence global production, said co-author Richard Seager of Columbia's Lamont Doherty Earth Observatory. "This should allow anticipation of shocks to global food prices and supplies and, hence, improve efforts to avoid food insecurity and provide emergency food assistance when needed."
Weather
2019
July 8, 2019
https://www.sciencedaily.com/releases/2019/07/190708112439.htm
Indian Ocean causes drought and heatwaves in South America
New research has found the record-breaking South American drought of 2013/14 with its succession of heatwaves and long lasting marine heatwave had its origins in a climate event half a world away -- over the Indian Ocean.
The findings show it all started with strong atmospheric convection over the Indian Ocean that generated a powerful planetary wave that travelled across the South Pacific to the South Atlantic, where it displaced the normal atmospheric circulation over South America. You can think of these atmospheric waves as being similar to an ocean swell generated by strong winds that travel thousands of kilometres from where they were generated. Large-scale atmospheric planetary waves form when the atmosphere is disturbed and this disturbance generates waves that travel around the planet. "The atmospheric wave produced a large area of high pressure, known as a blocking high, that stalled off the east coast of Brazil," said lead author Dr Regina Rodrigues. "The impacts of the drought that followed were immense and prolonged, leading to a tripling of dengue fever cases, water shortages in São Paulo, and reduced coffee production that led to global shortages and worldwide price increases." That impact wasn't just felt on land, as the high-pressure system stalled over the ocean. "Highs are associated with good weather. This means clear skies -- so more solar energy going into the ocean -- and low winds -- so less ocean cooling from evaporation." "The result of this blocking high was an unprecedented marine heatwave that amplified the unusual atmospheric conditions and likely had an impact on local fisheries in the region." The researchers found this atmospheric wave was not an isolated event and that strong convection far away in the Indian Ocean had previously led to drought impacts in South America. "Using observations from 1982 to 2016, we noticed an increase not only in frequency but also in duration, intensity and area of these marine heatwave events. For instance, on average these events have become 18 days longer, 0.05°C warmer and 7% larger per decade,"
said CLEX co-author Dr Andrea Taschetto. The 2013/14 South American drought and marine heatwave is the latest climate case study to show how distant events in one region can have major climate impacts on the other side of the world. "Researchers found that Australia's 2011 Ningaloo Niño in the Indian Ocean, which completely decimated coastal ecosystems and impacted fisheries, was caused by a La Niña event in the tropical Pacific," said Australian co-author Dr Alex Sen Gupta. "Here we have yet another example of how interconnected our world is. Ultimately, our goal is to understand and use these complex remote connections to provide some forewarning of high impact extreme events around the world."
Weather
2019
July 8, 2019
https://www.sciencedaily.com/releases/2019/07/190708122343.htm
Cave secrets unlocked to show past drought and rainfall patterns
A first-ever global analysis of cave drip waters has shown where stalagmites can provide vital clues towards understanding past rainfall patterns.
In a recently published study, they found that in climates with a mean temperature of less than 10°C, the oxygen isotopes in cave drip water had a composition similar to that measured in rainwater. As UNSW's Dr Andy Baker explains, this is what you would expect in colder climates with less evaporation of rainfall.

"This oxygen in the water drips from the stalactites and onto the stalagmites," says Dr Baker, from UNSW's School of Biological, Earth and Environmental Sciences. "The drip water originally comes from rainfall, providing a direct link to the surface climate. Understanding the extent to which the oxygen isotopic composition of drip water is related to rainfall is a fundamental research question which will unlock the full climate potential of stalagmites and stalactites."

But when the researchers examined the oxygen isotopes in drip waters in warmer areas, the isotopes corresponded to just some of the rain events, as revealed in the stalagmites. Dr Baker says that in such climates, evaporation not only reduces the amount of rainwater that eventually makes its way to the groundwater (a process known as rainfall recharge), but the oxygen isotopes themselves are changed by this process.

"In hotter climates, recharge to the subsurface doesn't occur from all rain events; rather, it likely only occurs after very heavy rain, or seasonally. This study identifies this for the first time and also provides a range of temperature constraints -- this was never known before," he says.

In effect, he says, oxygen isotopes in stalagmites in warmer climates record the balance between wet weather events and prolonged periods of drying. "For stalagmites in warm regions it suggests that the oxygen isotope composition will tell us about when recharge occurred -- in other words, when, and how often," Dr Baker says. "And that is as valuable as it is unique. In regions like mainland Australia, with extreme weather events like drought and flooding rains, it's a tool to see how often both occurred in the past."

Dr Baker says this knowledge will help us understand how important rainfall is in the replenishment of our groundwater resource. "This knowledge will improve our understanding of how sustainable our use of groundwater is, especially in regions where groundwater is only recharged by rain," he says.
Weather
2019
July 2, 2019
https://www.sciencedaily.com/releases/2019/07/190702160115.htm
Using artificial intelligence to better predict severe weather
When forecasting weather, meteorologists use a number of models and data sources to track shapes and movements of clouds that could indicate severe storms. However, with ever-expanding weather data sets and looming deadlines, it is nearly impossible for them to monitor all storm formations -- especially smaller-scale ones -- in real time.
Now, there is a computer model that can help forecasters recognize potential severe storms more quickly and accurately, thanks to a team of researchers at Penn State, AccuWeather, Inc., and the University of Almería in Spain. They have developed a framework based on machine learning linear classifiers -- a kind of artificial intelligence -- that detects rotational movements in clouds from satellite images that might otherwise have gone unnoticed. This AI solution ran on the Bridges supercomputer at the Pittsburgh Supercomputing Center.

Steve Wistar, senior forensic meteorologist at AccuWeather, said that having this tool to point his eye toward potentially threatening formations could help him to make a better forecast. "The very best forecasting incorporates as much data as possible," he said. "There's so much to take in, as the atmosphere is infinitely complex. By using the models and the data we have [in front of us], we're taking a snapshot of the most complete look of the atmosphere."

In their study, the researchers worked with Wistar and other AccuWeather meteorologists to analyze more than 50,000 historical U.S. weather satellite images. In them, experts identified and labeled the shape and motion of "comma-shaped" clouds. These cloud patterns are strongly associated with cyclone formations, which can lead to severe weather events including hail, thunderstorms, high winds and blizzards.

Then, using computer vision and machine learning techniques, the researchers taught computers to automatically recognize and detect comma-shaped clouds in satellite images. The computers can then assist experts by pointing out in real time where, in an ocean of data, they could focus their attention in order to detect the onset of severe weather. "Because the comma-shaped cloud is a visual indicator of severe weather events, our scheme can help meteorologists forecast such events," said Rachel Zheng, a doctoral student in the College of Information Sciences and Technology at Penn State and the main researcher on the project.

The researchers found that their method can effectively detect comma-shaped clouds with 99 percent accuracy, at an average of 40 seconds per prediction. It was also able to predict 64 percent of severe weather events, outperforming other existing severe-weather detection methods. "Our method can capture most human-labeled, comma-shaped clouds," said Zheng. "Moreover, our method can detect some comma-shaped clouds before they are fully formed, and our detections are sometimes earlier than human eye recognition."

"The calling of our business is to save lives and protect property," added Wistar. "The more advanced notice to people that would be affected by a storm, the better we're providing that service. We're trying to get the best information out as early as possible."

This project builds on earlier work between AccuWeather and a College of IST research group led by professor James Wang, who is Zheng's dissertation adviser. "We recognized when our collaboration began [with AccuWeather in 2010] that a significant challenge facing meteorologists and climatologists was in making sense of the vast and continually increasing amount of data generated by Earth observation satellites, radars and sensor networks," said Wang. "It is essential to have computerized systems analyze and learn from the data so we can provide timely and proper interpretation of the data in time-sensitive applications such as severe-weather forecasting."

He added, "This research is an early attempt to show the feasibility of artificial intelligence-based interpretation of weather-related visual information to the research community. More research to integrate this approach with existing numerical weather-prediction models and other simulation models will likely make the weather forecast more accurate and useful to people."

Concluded Wistar, "The benefit [of this research] is calling the attention of a very busy forecaster to something that may have otherwise been overlooked."
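The article describes a framework built on machine learning linear classifiers trained on expert-labeled satellite image patches. The study's actual pipeline is far more elaborate, but the core idea -- a linear decision rule learned from labeled examples -- can be sketched in pure Python. Everything below is illustrative: the 5x5 "comma" template, the 10% label-noise flip rate, and the perceptron training loop are assumptions for the sketch, not the authors' method.

```python
import random

random.seed(0)

# Hypothetical 5x5 silhouette standing in for a "comma-shaped" cloud patch.
TEMPLATE = [1, 1, 0, 0, 0,
            0, 1, 1, 0, 0,
            0, 0, 1, 1, 0,
            0, 0, 0, 1, 0,
            0, 0, 1, 1, 0]

def sample(label):
    """Generate a labeled training patch (1 = comma-like, 0 = clutter)."""
    if label == 1:
        # Template with ~10% of pixels flipped, mimicking noisy observations.
        return [p ^ (random.random() < 0.1) for p in TEMPLATE]
    return [random.randint(0, 1) for _ in range(25)]  # unstructured clutter

def predict(w, b, x):
    """Linear decision rule: sign of w.x + b."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(data, epochs=20, lr=0.1):
    """Classic perceptron updates: nudge weights toward misclassified samples."""
    w, b = [0.0] * 25, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(w, b, x)
            if err:
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
    return w, b

data = [(sample(y), y) for y in [0, 1] * 100]  # 200 labeled patches
w, b = train(data)
accuracy = sum(predict(w, b, x) == y for x, y in data) / len(data)
```

On real satellite imagery the inputs would be engineered image descriptors rather than raw pixels, and a library such as scikit-learn would replace the hand-rolled training loop, but the separating-hyperplane idea is the same.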
Weather
2019
July 2, 2019
https://www.sciencedaily.com/releases/2019/07/190702152820.htm
Irrigated farming in Wisconsin's central sands cools the region's climate
New research finds that irrigated farms within Wisconsin's vegetable-growing Central Sands region significantly cool the local climate compared to nearby rain-fed farms or forests.
Irrigation dropped maximum temperatures by one to three degrees Fahrenheit on average while increasing minimum temperatures up to four degrees compared to unirrigated farms or forests. In all, irrigated farms experienced a three- to seven-degree smaller range in daily temperatures compared to other land uses. These effects persisted throughout the year.

The results show that the conversion of land to irrigated agriculture can have a significant effect on the regional climate, which in turn can affect plant growth, pest pressure and human health in ways that could be overlooked unless land uses are accounted for in forecasts and planning. Such a cooling effect mitigates -- and obscures -- a global warming trend induced by the accumulation of greenhouse gases in the atmosphere. Irrigated farming, like all agriculture, also generates greenhouse gases.

The work was led by Mallika Nocco, who recently completed her doctorate in the Nelson Institute for Environmental Studies at the University of Wisconsin-Madison. Nocco worked with Christopher Kucharik of the Nelson Institute and the UW-Madison agronomy department and Robert Smail from the Wisconsin Department of Natural Resources. The team published their findings on July 2.

Irrigation, and agriculture generally, cools the air due to the evaporation of water through crop leaves, much like how evaporating sweat cools people. This evaporation also increases the water content of the air. The scientists wanted to determine if the naturally humid Wisconsin climate would respond as strongly to irrigation as drier regions, such as California, do.

To find out, Nocco worked with private landowners to install 28 temperature and humidity sensors in a line that crossed through the Central Sands. The 37-mile transect extended from pine plantations in the west, over irrigated farms, toward forests in the east. The researchers collected data across 32 months from the beginning of 2014 through the summer of 2016. Each of the 28 sensors was matched to nearby irrigation levels through a regional well-withdrawal database managed by Smail of the Department of Natural Resources.

Nocco's team found that irrigation lowered the maximum daily temperature about three and a half degrees compared to nearby rain-fed farms. Adjacent forests were slightly warmer than either rain-fed or irrigated farms. Somewhat surprisingly, the lower maximum temperatures on irrigated farms were accompanied by higher minimum temperatures. Saturated soils can hold more heat than dry soils, and when that heat is released at night, it keeps nighttime minimum temperatures somewhat higher. Wet soils may also be darker, helping them absorb more sunlight during the day.

The researchers found that if all land in the study area were converted to irrigated agriculture, the daily range in temperatures would shrink nearly five degrees Fahrenheit on average, and up to eight degrees at the high end. This smaller difference between daily maximum and minimum temperatures can significantly affect plant growth or insect pest lifecycles, both of which are sensitive to daily temperatures. "If you're adjusting the range of temperatures, you're changing who or what can live in an area," says Nocco.

The temperature differences between irrigated fields and rain-fed fields or forests were pronounced during the growing season, when fields were being irrigated, but extended throughout the year. Open fields of snow reflect more winter sunlight than forests do, keeping the air above cooler, but it's not entirely clear what drives winter temperature differences between irrigated and non-irrigated farms.

While the cooling effect of irrigation mitigates global climate change on the regional scale, climate models suggest that regional warming attributed to the global trend will eventually overcome the magnitude of mitigation offered by irrigated agriculture. Farmers, who are partially buffered for now from more extreme heat, would quickly face increasing stress in that scenario. "Farmers in irrigated regions may experience more abrupt temperature increases that will cause them to have to adapt more quickly than other groups who are already coping with a warming climate," says Kucharik. "It's that timeframe in which people have time to adapt that concerns me."

The current study is the first to definitively link irrigation in the Midwest U.S. to an altered regional climate. These results could improve weather and climate forecasts, help farmers plan better, and, the researchers hope, better prepare agricultural areas to deal with a warming climate when the irrigation effect is washed out. "Irrigation is a land use with effects on climate in the Midwest, and we need to account for this in our climate models," says Nocco.

This work was supported in part by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture Sustainable Agriculture Research and Education program and the Wisconsin Department of Natural Resources.
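The reported numbers can be combined into a quick back-of-the-envelope check of the diurnal temperature range (DTR) effect. The daily temperatures below are invented for illustration; only the offsets (maxima about 3.5°F cooler, minima assumed 1.5°F warmer, within the reported up-to-4°F rise) come from the findings described above.

```python
def mean_diurnal_range(t_max, t_min):
    """Mean daily temperature range (degrees F) from paired max/min series."""
    return sum(hi - lo for hi, lo in zip(t_max, t_min)) / len(t_max)

# Hypothetical July days for a rain-fed field (degrees F); values are invented.
rainfed_max = [88.0, 90.0, 86.0, 91.0]
rainfed_min = [60.0, 62.0, 59.0, 63.0]

# Apply the study's average effects: maxima ~3.5 F cooler,
# minima somewhat warmer (up to 4 F reported; 1.5 F assumed here).
irrigated_max = [t - 3.5 for t in rainfed_max]
irrigated_min = [t + 1.5 for t in rainfed_min]

shrink = (mean_diurnal_range(rainfed_max, rainfed_min)
          - mean_diurnal_range(irrigated_max, irrigated_min))
# shrink == 5.0, inside the 3-7 degree narrowing reported for irrigated farms
```

Lowering every maximum and raising every minimum shrinks the range by the sum of the two offsets, which is why the reported 3-7 degree narrowing follows directly from the individual max/min effects.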
Weather
2019
July 1, 2019
https://www.sciencedaily.com/releases/2019/07/190701144304.htm
Researchers create worldwide solar energy model
Solar cells are currently the world's most talked-about renewable energy source, and for any future sustainable energy system, it is crucial to know about the performance of photovoltaic systems at local, regional and global levels. Danish researchers have just set up an historically accurate model, and all the data has been made available for anyone who wants to use it.
Solar energy is advancing in earnest throughout the whole world. Over the past three years, more photovoltaic (PV) capacity has been installed globally than any other energy source, and the annual growth rate between 2010 and 2017 was as high as 24%. In global terms, it has been predicted that solar energy will play a similar role to wind energy in the sustainable energy systems of the future, but this requires precise models for how much energy PV systems produce.

Danish researchers have now developed these models in a major research project at the Department of Engineering, Aarhus University, and the results have been published.

"We've collected 38 years of global solar radiation, weather and temperature data with a spatial resolution of 40 km x 40 km for the entire globe, and compared this with historical data for photovoltaic installations in Europe. Based on this, we've made a very accurate model that, at global, regional and local levels, can tell you about the performance of PV installations in a given geography, depending on the type of facility being used. This means we can look at not only a single installation, but energy production in entire countries or continents from PV installations. This is extremely important for the way in which the energy systems of the future can be combined to function optimally," says Assistant Professor Marta Victoria, who has been responsible for the project.

She continues: "Generating cheap green energy is no longer a challenge. The price of PV installations has tumbled over the last 10-20 years, so we're now seeing huge investments in this particular energy source. The challenge is to link energy production from myriads of small installations across the landscape with a country's total energy demand and energy production from other sources, some of which is also linked across national borders."

The problem is also that the green energy system of the future depends on renewable energy sources, which in turn depend on the weather. This is why, according to Marta Victoria, we need very accurate and detailed knowledge about energy production. "PV installations will have a huge impact on the energy systems of the future, and planning systems based on models that do not take deviations from the norm into account simply won't work. Therefore, this project has gathered very detailed data over time for the last 38 years for the entire globe, so that the model can be used anywhere," she says.

All the data in the model has been made readily available to everyone via an open licence. The project is part of the RE-Invest project, which is being funded by Innovation Fund Denmark, and which brings together a large number of Danish and international universities and companies to create the energy system of the future.
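The Aarhus model is built from 38 years of gridded weather data, but the basic physics it has to capture at every grid cell can be sketched with a textbook single-point PV performance estimate. The NOCT value (45°C) and the -0.4%/°C temperature coefficient below are typical assumed values for crystalline-silicon modules, not figures from the paper.

```python
def cell_temperature(t_air, irradiance, noct=45.0):
    """Standard NOCT estimate of cell temperature (C); irradiance in W/m^2.
    NOCT is defined at 800 W/m^2 irradiance and 20 C air temperature."""
    return t_air + irradiance * (noct - 20.0) / 800.0

def pv_power(irradiance, t_air, p_rated_kw=1.0, gamma=-0.004):
    """DC output per kW installed, scaled by irradiance and derated by a
    -0.4%/C temperature coefficient relative to the 25 C test condition."""
    t_cell = cell_temperature(t_air, irradiance)
    return p_rated_kw * (irradiance / 1000.0) * (1.0 + gamma * (t_cell - 25.0))

# One clear summer hour: 800 W/m^2 and 25 C air temperature.
p = pv_power(800.0, 25.0)   # -> 0.72 kW per kW installed
```

Summing such hourly estimates over weather time series at every location is, in spirit, how gridded weather data turns into the national and continental PV yield figures the article describes; the published model of course uses far more detailed formulations.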
Weather
2019
June 27, 2019
https://www.sciencedaily.com/releases/2019/06/190627121246.htm
Natural biodiversity protects rural farmers' incomes from tropical weather shocks
A big data study covering more than 7,500 households across 23 tropical countries shows that natural biodiversity could be effective insurance for rural farmers against drought and other weather-related shocks.
Frederick Noack, assistant professor of food and resource economics in UBC's faculty of land and food systems, worked with colleagues from ETH Zurich and the University of Geneva to study whether natural biodiversity helps buffer farmers' incomes against weather shocks. They found that farmers in areas with greater biodiversity took less of an income hit from droughts than their peers who farmed amid less biodiversity. Their calculations also indicated that a loss of half the species within a region would double the impact of weather extremes on income.

"We should conserve biodiversity, not just because we like to see tigers and lions, but because it's also an important input for production," said Noack. "It's especially important for people who live in areas where it's hard to get insurance or loans to compensate for environmental shocks. A certain level of biodiversity conservation could be beneficial for people in agriculture, forestry and these sorts of industries."

Access to huge datasets allowed the researchers to compare farmers' actual incomes -- using data gathered every three months -- with geocoded data indicating the number of plant species in the local environment. They cross-referenced this with weather data through the growing, planting and harvesting seasons.

While the results clearly link natural biodiversity to income stabilization during adverse weather, the ways in which biodiversity accomplishes this were beyond the scope of the study. Noack pointed to a variety of processes occurring naturally within local ecosystems that could contribute. For example, an environment that supports several bee species should allow pollination to happen at a broader range of temperatures. The same environment might also support the natural enemies of pests, which would reduce farmers' dependence on pesticides to stabilize their yield.

The research is the first to relate biodiversity directly to incomes at such a scale. Earlier studies have shown that biodiversity can stabilize the production of biomass such as leaves in a field or trees in a forest -- but not how that translates into real income for farmers. "The difference between studying biomass and studying income is that income assigns value to different types of biomass," said Noack. "Price signals our value for specific things, so looking at income converts something that happens in the ecosystem to something that we actually value."

The data came from tropical countries in Latin America, Asia and Africa, where weather extremes are expected to increase as the earth's atmosphere warms. The analysis shows that conservation of natural biodiversity could play an important role in alleviating poverty for rural households with little access to insurance or loans. The findings also inform the ongoing debate about where conservation efforts should be directed. Conserving large swaths of parkland far from agricultural land may be less effective than conserving smaller pockets in close proximity to farms.
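The headline relationship reported above -- losing half the species in a region doubles the income impact of a weather extreme -- corresponds to an impact that scales inversely with species richness. The function below is a stylized illustration of that scaling only; the baseline 20% loss, the severity scale, and the reference richness of 100 species are all invented for the sketch and are not estimates from the study.

```python
def drought_income_loss(drought_severity, species_richness,
                        ref_richness=100.0, beta=0.2):
    """Stylized buffering model: the income impact of a weather shock scales
    inversely with local plant species richness, so halving the species
    count doubles the loss (the paper's headline relationship).
    beta is the assumed fractional income loss of a severity-1 drought
    at the reference richness."""
    return beta * drought_severity * (ref_richness / species_richness)

full_diversity_loss = drought_income_loss(1.0, 100.0)  # 0.2 -> 20% income hit
half_diversity_loss = drought_income_loss(1.0, 50.0)   # 0.4 -> doubled hit
```

The study itself estimates this buffering econometrically from panel data rather than imposing a functional form; the sketch only encodes the qualitative finding.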
Weather
2019
June 26, 2019
https://www.sciencedaily.com/releases/2019/06/190626133720.htm
The water future of Earth's 'third pole'
Himalaya. Karakoram. Hindu Kush. The names of Asia's high mountain ranges conjure up adventure for those living far away, but for more than a billion people, these are the names of their most reliable water source.
Snow and glaciers in these mountains contain the largest volume of freshwater outside of Earth's polar ice sheets, leading hydrologists to nickname this region the Third Pole. One-seventh of the world's population depends on rivers flowing from these mountains for water to drink and to irrigate crops.

Rapid changes in the region's climate, however, are affecting glacier melt and snowmelt. People in the region are already modifying their land-use practices in response to the changing water supply, and the region's ecology is transforming. Future changes are likely to influence food and water security in India, Pakistan, China and other nations.

NASA is keeping a space-based eye on changes like these worldwide to better understand the future of our planet's water cycle. In this region, where there are extreme challenges in collecting observations on the ground, NASA's satellite and other resources can produce substantial benefits to climate science and local decision makers tasked with managing an already-scarce resource.

The most comprehensive survey ever made of snow, ice and water in these mountains and how they are changing is now underway. NASA's High Mountain Asia Team (HiMAT), led by Anthony Arendt of the University of Washington in Seattle, is in its third year. The project consists of 13 coordinated research groups studying three decades of data on this region in three broad areas: weather and climate; ice and snow; and downstream hazards and impacts.

All three of these subject areas are changing, starting with climate. Warming air and alterations in monsoon patterns affect the regional water cycle -- how much snow and rain falls, and how and when the snowpack and glaciers melt. Changes in the water cycle raise or lower the risk of local hazards such as landslides and flooding, and have broad impacts on water allocation and crops that can be grown.

For most of human history, a detailed scientific study of these mountains was impossible. The mountains are too high and steep, and the weather too dangerous. The satellite era has given us the first opportunity to observe and measure snow and ice cover safely in places where no human has ever set foot.

"The explosive growth of satellite technology has been incredible for this region," said Jeffrey Kargel, a senior scientist at the Planetary Science Institute in Tucson, Arizona, and leader of a HiMAT team studying glacial lakes. "We can do things now that we couldn't do ten years ago -- and ten years ago we did things we couldn't do before that." Kargel also credited advances in computer technology that have enabled far more researchers to undertake large data-processing efforts, which are required to improve weather forecasting over such complex topography.

Arendt's HiMAT team is charged with integrating the many, varied types of satellite observations and existing numerical models to create an authoritative estimate of the water budget of this region and a set of products local policy makers can use in planning for a changing water supply. A number of data sets by HiMAT teams have already been uploaded to NASA's Distributed Active Archive Center at the National Snow and Ice Data Center. Collectively, the suite of new products is called the Glacier and Snow Melt (GMELT) Toolbox.

There's some urgency in completing the toolbox, because changes in melt patterns appear to be increasing the region's hazards -- some of which are found only in this kind of terrain, such as debris-dam failures on glacial lakes and surging glaciers blocking access to mountain villages and pastures. In the last few decades, towns and infrastructure such as roads and bridges have been wiped out by these events.

Kargel's team is studying catastrophic flooding from glacial lakes. These lakes start as melt pools on the surfaces of glaciers, but under the right conditions they may continue to melt all the way to ground level, pooling behind a precarious pile of ice and debris that was originally the front end of the glacier. An earthquake, rockfall or simply the increasing weight of water may breach the debris dam and create a flash flood.

Lakes like this were almost unknown 50 or 60 years ago, but as most high mountain Asian glaciers have been shrinking and retreating, glacial lakes have been proliferating and growing. The largest one Kargel has measured, Lower Barun in Nepal, is 673 feet (205 meters) deep with a volume of almost 30 billion gallons (112 million cubic meters), or about 45,000 Olympic-sized swimming pools full. The HiMAT team has mapped every glacial lake larger than about 1,100 feet (330 meters) in diameter for three different time periods -- about 1985, 2001 and 2015 -- to study how the lakes have evolved.

As the size and number of glacial lakes increase, so does the threat they pose to the local population and infrastructure. Dalia Kirschbaum of NASA's Goddard Space Flight Center in Greenbelt, Maryland, leads a group that is using satellite data to predict what areas are most susceptible to landslides in high mountain Asia, which can then inform the placement of new infrastructure in the region.

One critical factor in future rates of snow and ice melt is the role of dust, soot and pollution that settle on the frozen surfaces. Pristine white snow reflects more than 90% of incoming solar radiation back into the atmosphere. But when snow is blanketed by darker-colored particles of soot or dust, this coating absorbs more heat and the snow melts faster. Research has suggested that the Little Ice Age in Europe ended as soot deposited by the Industrial Revolution coated the Alps. In Asia, the last 35 years have seen significant increases in the amount of soot settling on mountain snow. Whether these Asian ranges will react the same way the Alps did centuries ago is an important question.

Several HiMAT teams are focused on this issue. Si-Chee Tsay of NASA Goddard is using satellite data to gain a better understanding of the properties of snow, ice, and dust and soot particles in this region. His group is also working in collaboration with regional researchers in Nepal to install sensors at ground level on glaciers located on Mt. Everest, Annapurna and Dhaulagiri, among other sites. These sensors will allow researchers to check the accuracy of satellite readings obtained over the same sites.

Tom Painter of the University of California, Los Angeles, is leading a team using satellite data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) in the community Weather Research and Forecasting model to quantify past and possible future variations in snow cover and other factors as soot and dust change. Another team, led by Sarah Kapnick of NOAA, is accounting for dust and soot within global climate models to improve understanding of both historical and predicted future regional changes.

The tallest mountains in the world make for unique challenges in weather forecasting. A team led by Summer Rupper of the University of Utah in Salt Lake City has addressed one of these challenges by developing a model that differentiates between ice and snow deposited on the region during the monsoon season and those from winter storms, so that scientists can study where and when snow is likely to fall throughout the year.

In the HiMAT survey's final year, Arendt said, the research is coming together and the teams' scientific papers are heading for publication. One of the more alarming conclusions, reported in a paper published on June 19, is that the glaciers will be 35 to 75% smaller in volume by 2100 due to rapid melting.

Whether rain and snowfall will also change, and whether changes would compound or mitigate the effects of ice loss, is not yet clear. Precipitation already varies considerably from one range to another in this region, depending on the monsoon and the flow of winter storms into the area. For example, precipitation is currently increasing in the Karakoram Range, where glaciers are either stable or advancing, but in every other range in this region, nearly all glaciers are retreating. Whether that anomaly will continue, grow stronger, or reverse as the climate continues to change is not yet clear. "Global climate dynamics will dictate where storms end up and how they intercept the mountains," Arendt said. "Even small changes in the tracking of the storms can create significant variability."

Findings like these are why the HiMAT teams are eager to complete their GMELT toolbox, Arendt noted. The new products will offer decision-makers the best compilation of knowledge that can currently be made of how high mountain Asia has been changing in recent decades, along with a new set of resources to help them plan how best to prepare for the future of this hard-to-predict region.
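The Lower Barun figures quoted earlier -- almost 30 billion gallons, 112 million cubic meters, about 45,000 Olympic-sized pools -- are mutually consistent, as a quick unit conversion shows. The 2,500 m³ pool volume assumed below is the common 50 m x 25 m x 2 m minimum-depth figure used for such comparisons.

```python
M3_PER_OLYMPIC_POOL = 2500.0   # 50 m x 25 m x 2 m minimum-depth pool
GALLONS_PER_M3 = 264.172       # US gallons per cubic meter

volume_m3 = 112e6              # Lower Barun glacial lake, Nepal

pools = volume_m3 / M3_PER_OLYMPIC_POOL   # 44,800 -> "about 45,000"
gallons = volume_m3 * GALLONS_PER_M3      # ~2.96e10 -> "almost 30 billion"
```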
Weather
2019
June 21, 2019
https://www.sciencedaily.com/releases/2019/06/190621140337.htm
Northern lights' 'social networking' reveals true scale of magnetic storms
Magnetic disturbances caused by phenomena like the northern lights can be tracked by a 'social network' of ground-based instruments, according to a new study from the University of Warwick.
The researchers, led by Professor Sandra Chapman from the University's Department of Physics, have for the first time characterised the observations from over 100 ground-based magnetometers in terms of a time-varying directed network of connections. They monitored the development of geomagnetic substorms using the same mathematics used to study social networks: the magnetometers 'befriend' one another when they see the same signal from a propagating disturbance.

The northern lights, or Aurora Borealis, occur when charged particles from our Sun bombard the Earth's magnetic field. This stores up energy like a battery, which it then releases, creating large-scale electrical currents in the ionosphere that generate disturbances of magnetic fields on the ground. Small versions of these substorms are common, but occasionally larger storms will occur that can have a larger impact.

Using over 100 magnetometers that form the SuperMAG Initiative, led by Dr Jesper Gjerloev, the researchers applied mathematical concepts from network science to monitor the development of substorms in the arctic auroral region. As a substorm develops and the electrical current in the ionosphere grows, individual magnetometers register a change in the magnetic field. Pairs of magnetometers become linked when their measurements correlate with each other, expanding their network of 'friends' and allowing the researchers to monitor how the auroral disturbance from the substorm forms and propagates, and how quickly.

Substorms from the Aurora Borealis create an electrical current in the atmosphere that is echoed at ground level. Localised changes in the Earth's magnetic field can disrupt power lines, electronic and communications systems and technologies such as GPS. They are just one form of space weather that affects our planet on a constant basis.

Professor Sandra Chapman from the University of Warwick Department of Physics said: "When talking about space weather, it is useful to provide a single number or rating that indicates how severe it is. To do this, we need to capture the full behaviour of how intense the event is, how widespread spatially, and how rapidly it is changing. Our aim is to use network science to develop useful parameters that do this, encapsulating all the information from 100+ observations.

"SuperMAG is a great example of how essential international co-operation is to solve problems like space weather that are on a planetary scale, using data from stations located in all the countries that abut the Arctic Circle."
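The 'befriending' rule described above -- link two stations when their measurements correlate -- is the essence of a correlation network. The sketch below shows the idea on invented toy data; the actual study uses 100+ SuperMAG stations and a time-varying *directed* network rather than this simple thresholded Pearson graph, so the station names, series and threshold here are all illustrative.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def build_network(stations, threshold=0.7):
    """'Befriend' station pairs whose field measurements correlate strongly."""
    names = list(stations)
    return {(u, v)
            for i, u in enumerate(names)
            for v in names[i + 1:]
            if pearson(stations[u], stations[v]) >= threshold}

# Toy magnetic-field series (nT) for three invented stations: A and B see
# the same substorm bump; C records unrelated fluctuations.
stations = {
    "A": [0, 1, 4, 9, 7, 3, 1, 0],
    "B": [0, 2, 5, 10, 8, 4, 1, 0],
    "C": [5, 1, 6, 0, 7, 2, 6, 1],
}
edges = build_network(stations)   # {("A", "B")}
```

Tracking how the edge set grows and shrinks over successive time windows is what lets network measures summarise how widespread and how fast-changing a substorm is.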
Weather
2,019
June 20, 2019
https://www.sciencedaily.com/releases/2019/06/190620121408.htm
Perovskite solar cells tested for real-world performance -- in the lab
It was only ten years ago that metal-halide perovskites were discovered to be photovoltaic materials. Today, perovskite solar cells are almost as efficient as the best conventional silicon ones, and there is much hope that they will become a highly efficient and low-cost alternative, as they can be manufactured by rather simple and fast methods like printing.
The major obstacle for commercialization is the stability of perovskite devices. Operational stability is commonly assessed either by continuous illumination in the lab or by outdoor testing. The first approach has the disadvantage of not accounting for the real-world variations in irradiance and temperature caused by day-night and seasonal changes. These are especially important for perovskite solar cells because of their slow response times. On the other hand, outdoor tests require that the devices are encapsulated to protect them against exposure to harsh weather conditions. But encapsulation mainly addresses parasitic failure mechanisms that are not necessarily related to the perovskite material itself.

To escape from this dilemma, Wolfgang Tress, a scientist with the lab of Anders Hagfeldt at EPFL, working with colleagues at the lab of Michael Grätzel, brought real-world conditions into the controlled environment of the lab. Using data from a weather station near Lausanne (Switzerland), they reproduced the real-world temperature and irradiance profiles of specific days during the course of the year. With this approach, the scientists were able to quantify the energy yield of the devices under realistic conditions. "This is what ultimately counts for the real-world application of solar cells," says Tress.

The study found that temperature and irradiance variations do not affect the performance of perovskite solar cells in any dramatic way, and although the efficiency of the cells decreases slightly during the course of a day, it recovers during the night. "The study provides a further step towards the assessment of the performance and reliability of perovskite solar cells under realistic operation conditions," says Tress.
Weather
2019
June 19, 2019
https://www.sciencedaily.com/releases/2019/06/190619142532.htm
U.S. beekeepers lost over 40 percent of colonies last year, highest winter losses ever recorded
Beekeepers across the United States lost 40.7% of their honey bee colonies from April 2018 to April 2019, according to preliminary results of the latest annual nationwide survey conducted by the University of Maryland-led nonprofit Bee Informed Partnership. Honey bees pollinate $15 billion worth of food crops in the United States each year.
The survey results show that the annual loss of 40.7% this past year represents a slight increase over the annual average of 38.7%. However, winter losses of 37.7% were the highest reported since the survey began 13 years ago, 8.9 percentage points higher than the survey average. "These results are very concerning, as high winter losses hit an industry already suffering from a decade of high winter losses," said Dennis vanEngelsdorp, associate professor of entomology at the University of Maryland and president of the Bee Informed Partnership. During the 2018 summer season, beekeepers lost 20.5% of their colonies, slightly above the previous year's summer loss rate of 17.1% but about equal to the average loss rate since the summer of 2011. "Just looking at the overall picture and the 10-year trends, it's disconcerting that we're still seeing elevated losses after over a decade of survey and quite intense work to try to understand and reduce colony loss," adds Geoffrey Williams, assistant professor of entomology at Auburn University and co-author of the survey. "We don't seem to be making particularly great progress to reduce overall losses." Since beekeepers began noticing dramatic losses in their colonies, state and federal agricultural agencies, university researchers, and the beekeeping industry have been working together to understand the cause and develop best management practices to reduce losses. The annual colony loss survey, which has been conducted since 2006, has been an integral part of that effort. The survey asks commercial and backyard beekeeping operations to track the survival rates of their honey bee colonies.
Nearly 4,700 beekeepers managing 319,787 colonies from all 50 states and the District of Columbia responded to this year's survey, representing about 12% of the nation's estimated 2.69 million managed colonies. The Bee Informed Partnership team said multiple factors are likely responsible for persistently high annual loss rates and this year's jump in winter losses. They say a multi-pronged approach -- research, extension services and education, and best management practices -- is needed to combat the problem. The number one concern among beekeepers and a leading contributor to winter colony losses is varroa mites, lethal parasites that can readily spread from colony to colony. These mites have been decimating colonies for years, and institutions like the University of Maryland are actively researching ways to combat them. "We are increasingly concerned about varroa mites and the viruses they spread," said vanEngelsdorp. "Last year, many beekeepers reported poor treatment efficacy, and limited field tests showed that products that once removed 90% of mites or more are now removing far fewer. Since these products are no longer working as well, the mite problem seems to be getting worse." "But mites are not the only problem," continues vanEngelsdorp. "Land use changes have led to a lack of nutrition-rich pollen sources for bees, causing poor nutrition. Pesticide exposures, environmental factors, and beekeeping practices all play some role as well." Karen Rennich, executive director of the Bee Informed Partnership and senior faculty specialist at the University of Maryland, elaborated on land use and environmental factors that may be significant in bee colony loss, including increases in extreme weather. "The tools that used to work for beekeepers seem to be failing, and that may be evident in this year's high losses.
A persistent worry among beekeepers nationwide is that there are fewer and fewer favorable places for bees to land, and that is putting increased pressure on beekeepers who are already stretched to their limits to keep their bees alive," said Rennich. "We also think that the extreme weather conditions we have seen this past year demand investigation, such as wildfires that ravage the landscape and remove already limited forage, and floods that destroy crops, causing losses for the farmer, for the beekeeper, and for the public." According to Rennich and Williams, more research is needed to understand what role climate change and variable weather patterns play in honey bee colony losses.
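The loss rates the survey reports can be illustrated with a toy calculation; the colony counts below are hypothetical, not survey data.

```python
# Toy illustration with made-up numbers: a colony loss rate over a period
# is the share of starting colonies that did not survive to the end of it.

def loss_rate_pct(starting_colonies, surviving_colonies):
    """Percentage of colonies lost between two survey dates."""
    lost = starting_colonies - surviving_colonies
    return 100.0 * lost / starting_colonies

# Hypothetical operation: 200 colonies entering winter, 120 surviving
print(round(loss_rate_pct(200, 120), 1))  # 40.0
```

The survey aggregates such per-operation figures across thousands of respondents to produce the national winter, summer, and annual loss rates.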
Weather
2019
June 19, 2019
https://www.sciencedaily.com/releases/2019/06/190619094856.htm
New evidence shows rapid response in the West Greenland landscape to Arctic climate shifts
New evidence shows that Arctic ecosystems undergo rapid, strong and pervasive environmental changes in response to climate shifts, even those of moderate magnitude, according to an international research team led by the University of Maine.
Links between abrupt climate change and environmental response have long been considered delayed or dampened by internal ecosystem dynamics, or strong only after large-magnitude climate shifts. The research team, led by Jasmine Saros, associate director of the UMaine Climate Change Institute, found evidence of a "surprisingly tight coupling" of environmental responses in an Arctic ecosystem experiencing rapid climate change. Using more than 40 years of weather data and paleoecological reconstructions, the 20-member team quantified rapid environmental responses to recent abrupt climate change in West Greenland. They found that after 1994, mean June air temperatures were 2.2 degrees C higher and mean winter precipitation doubled to 40 millimeters. Since 2006, mean July air temperatures shifted 1.1 degrees C higher. The "nearly synchronous" environmental response to those high-latitude abrupt climate shifts included increased ice sheet discharge and dust, and advanced plant phenology. In lakes, there was earlier ice-out and greater diversity of algal functional traits. The new evidence underscores the highly responsive nature of Arctic ecosystems to abrupt transitions -- and the strength of climate forcing, according to the team, which published its findings in a scientific journal. Understanding how ecosystems respond to abrupt climate change is central to predicting and managing potentially disruptive environmental shifts, says Saros, one of seven UMaine professors who have been conducting research in the Arctic in recent years. "We present evidence that climate shifts of even moderate magnitude can rapidly force strong, pervasive environmental changes across a high-latitude system," says Saros. "Prior research on ecological response to abrupt climate change suggested delayed or dampened ecosystem responses. In the Arctic, however, we found that nonlinear environmental responses occurred with or shortly after documented climate shifts in 1994 and 2006."
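The kind of before/after comparison used to quantify such shifts can be sketched as follows. The temperature series here is synthetic, chosen only so that the example reproduces a 2.2-degree jump like the one reported for June temperatures after 1994.

```python
# Sketch of quantifying an abrupt climate shift: difference in mean
# conditions after vs. before a candidate breakpoint year.
# The data below are synthetic, not the West Greenland record.

def shift_magnitude(years, values, breakpoint):
    """Mean of values from the breakpoint year onward minus the mean before it."""
    before = [v for y, v in zip(years, values) if y < breakpoint]
    after = [v for y, v in zip(years, values) if y >= breakpoint]
    return sum(after) / len(after) - sum(before) / len(before)

# Hypothetical mean June air temperatures (degrees C)
years = list(range(1990, 1998))
temps = [4.0, 4.2, 3.9, 4.1, 6.2, 6.4, 6.1, 6.3]
print(round(shift_magnitude(years, temps, 1994), 2))  # 2.2
```

In practice the breakpoint years (1994, 2006) were identified from the long weather record itself, and the same comparison was then made for precipitation and the ecological response variables.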
Weather
2019
June 17, 2019
https://www.sciencedaily.com/releases/2019/06/190617164712.htm
How climate change affects crops in India
Kyle Davis is an environmental data scientist whose research seeks to increase food supplies in developing countries. He combines techniques from environmental science and data science to understand patterns in the global food system and develop strategies that make food-supply chains more nutritious and sustainable.
Since joining the Data Science Institute as a postdoctoral fellow in September 2018, Davis has co-authored four papers, all of which detail how developing countries can sustainably improve their crop production. For his latest study, he focuses on India, home to 1.3 billion people, where he led a team that studied the effects of climate on five major crops: finger millet, maize, pearl millet, sorghum and rice. These crops make up the vast majority of grain production during the June-to-September monsoon season -- India's main growing period -- with rice contributing three-quarters of the grain supply for the season. Taken together, the five grains are essential for meeting India's nutritional needs. "Expanding the area planted with these four alternative grains can reduce variations in Indian grain production caused by extreme climate, especially in the many places where their yields are comparable to rice," Davis added in the newly published paper. "Doing so will mean that the food supply for the country's massive and growing population is less in jeopardy during times of drought or extreme weather." Temperatures and rainfall amounts in India vary from year to year and influence the amount of crops that farmers can produce. And with episodes of extreme climate such as droughts and storms becoming more frequent, it's essential to find ways to protect India's crop production from these shocks, according to Davis. The authors combined historical data on crop yields, temperature, and rainfall. Data on the yields of each crop came from state agricultural ministries across India and covered 46 years (1966-2011) and 593 of India's 707 districts. The authors also used modelled data on temperature (from the University of East Anglia's Climatic Research Unit) and precipitation (derived from a network of rain gauges maintained by the Indian Meteorological Department).
Using these climate variables as predictors of yield, they then employed a linear mixed-effects modelling approach -- similar to a multiple regression -- to estimate whether there was a significant relationship between year-to-year variations in climate and crop yields. "This study shows that diversifying the crops that a country grows can be an effective way to adapt its food-production systems to the growing influence of climate change," said Davis. "And it adds to the evidence that increasing the production of alternative grains in India can offer benefits for improving nutrition, for saving water, and for reducing energy demand and greenhouse gas emissions from agriculture."
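The statistical step can be approximated in code. The authors fit a linear mixed-effects model; the sketch below substitutes a plain least-squares regression (which the article itself compares the method to) and uses entirely synthetic data with known effects, so it is an illustration of the idea rather than the paper's analysis.

```python
# Simplified stand-in for the paper's mixed-effects model: ordinary
# least squares relating yields to temperature and rainfall.
# All data below are synthetic, with known coefficients baked in.
import numpy as np

def fit_climate_yield(temp, rain, yields):
    """Least-squares fit of yield = b0 + b1*temp + b2*rain."""
    X = np.column_stack([np.ones_like(temp), temp, rain])
    coef, *_ = np.linalg.lstsq(X, yields, rcond=None)
    return coef  # [intercept, temperature effect, rainfall effect]

# Hypothetical district-year observations
temp = np.array([26.0, 27.5, 28.0, 26.5, 29.0, 27.0])       # degrees C
rain = np.array([800, 650, 600, 780, 500, 700], dtype=float)  # mm
y = 3.0 - 0.1 * temp + 0.002 * rain  # synthetic yields (t/ha), known effects
b0, b_temp, b_rain = fit_climate_yield(temp, rain, y)
print(round(b_temp, 3), round(b_rain, 4))
```

The real analysis adds random effects for districts (the "mixed" part), which lets the climate coefficients be estimated while accounting for persistent differences between districts.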
Weather
2019
June 17, 2019
https://www.sciencedaily.com/releases/2019/06/190617125138.htm
'Self-healing' polymer brings perovskite solar tech closer to market
A protective layer of epoxy resin helps prevent the leakage of pollutants from perovskite solar cells (PSCs), report scientists from the Okinawa Institute of Science and Technology Graduate University (OIST). Adding a "self-healing" polymer to the top of a PSC can radically reduce how much lead it discharges into the environment. This gives a strong boost to prospects for commercializing the technology.
With atmospheric carbon dioxide levels reaching their highest recorded levels in history, and extreme weather events continuing to rise in number, the world is moving away from legacy energy systems relying on fossil fuels towards renewables such as solar. Perovskite solar technology is promising, but one key challenge to commercialization is that it may release pollutants such as lead into the environment -- especially under extreme weather conditions. "Although PSCs are efficient at converting sunlight into electricity at an affordable cost, the fact that they contain lead raises considerable environmental concern," explains Professor Yabing Qi, head of the Energy Materials and Surface Sciences Unit, who led the newly published study. "While so-called 'lead-free' technology is worth exploring, it has not yet achieved efficiency and stability comparable to lead-based approaches. Finding ways of using lead in PSCs while keeping it from leaking into the environment, therefore, is a crucial step for commercialization." Qi's team, supported by the OIST Technology Development and Innovation Center's Proof-of-Concept Program, first explored encapsulation methods for adding protective layers to PSCs to understand which materials might best prevent the leakage of lead. They exposed cells encapsulated with different materials to many conditions designed to simulate the sorts of weather to which the cells would be exposed in reality. They wanted to test the solar cells in a worst-case weather scenario, to understand the maximum lead leakage that could occur. First, they smashed the solar cells with a large ball, mimicking extreme hail that could break down their structure and allow lead to leak. Next, they doused the cells with acidic water, to simulate the rainwater that would transport leaked lead into the environment. Using mass spectrometry, the team then analyzed the acidic water to determine how much lead had leaked from the cells.
They found that an epoxy resin layer allowed minimal lead leakage -- orders of magnitude lower than the other materials. Epoxy resin also performed best under a range of weather conditions in which sunlight, rainwater and temperature were varied to simulate the environments in which PSCs must operate. In all scenarios, including extreme rain, epoxy resin outperformed rival encapsulation materials. Epoxy resin worked so well because of its "self-healing" properties: after its structure is damaged by hail, for example, the polymer partially reforms its original shape when heated by sunlight. This limits the amount of lead that leaks from inside the cell, and this self-healing property could make epoxy resin the encapsulation layer of choice for future photovoltaic products. "Epoxy resin is certainly a strong candidate, yet other self-healing polymers may be even better," explains Qi. "At this stage, we are pleased to be promoting photovoltaic industry standards, and bringing the safety of this technology into the discussion. Next, we can build on these data to confirm which is truly the best polymer." Beyond lead leakage, another challenge will be to scale up perovskite solar cells into perovskite solar panels. While cells are just a few centimeters long, panels can span a few meters and will be more relevant to potential consumers. The team will also direct their attention to the long-standing challenge of renewable energy storage.
Weather
2019