Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---|
June 1, 2017
|
https://www.sciencedaily.com/releases/2017/06/170601124203.htm
|
Volcanoes: Referees for the life on Earth
|
At the Triassic-Jurassic boundary, 200 million years ago, some 60% of species living on Earth disappeared. Scientists suspected that magmatic activity and the release of CO2
|
Scientists have often linked the annihilation of life at the Triassic-Jurassic boundary with the emission of gas during the volcanic activity of the Central Atlantic Magmatic Province, a huge volcanic province that erupted around the same time. Geological studies, however, have questioned this hypothesis since the flood basalt eruptions from the igneous province are too young to be responsible for the mass extinction. The scientists, among them a team from UNIGE, therefore went looking for traces of magmatic activity that may be older, to establish the role of magmatic activity in the mass extinctions that punctuated Earth's history during this period. The geologists identified large areas covered by flood basalts assigned to the Central Atlantic Magmatic Province (CAMP), which extends over several million km2 from North America to South America, and from Europe to Africa. They also discovered vertical fissures that extend over hundreds of kilometres, as well as large intrusions. "We therefore erected the hypothesis that these fissures and intrusions are older or coeval to the mass extinction at the Triassic-Jurassic boundary, and we have verified this applying our high-precision dating techniques," explains Joshua Davies, research fellow at the Department of Earth Sciences of the Faculty of Science at the University of Geneva (UNIGE). The basalts enclose tiny quantities of the mineral zircon, which itself contains uranium. Uranium decays over time into lead at a known rate. "It's because of this, by measuring relative concentrations of uranium and lead, we can determine the age of crystallization of minerals in a rock to about 30,000 years, which is extremely precise for a period of time 200 million years ago," adds Urs Schaltegger, professor at the Department of Earth Sciences of the Faculty of Science at the University of Geneva (UNIGE). Carrying out such precise age determinations is a complicated exercise; only around four laboratories are capable of this level of precision, among them the laboratory at UNIGE. The geologists were particularly interested in dating basalts found in the Amazonian sedimentary basin, a huge reservoir of coal and oil. And indeed, the results of their age determinations confirm that the age of these basalts correlates with the mass extinction at the Triassic-Jurassic boundary. This result allows the scientists to link this magmatic activity with the thermally induced release of immense volumes of CO2.
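The uranium-lead clock described above follows the standard radioactive decay law. As a minimal sketch (not the UNIGE laboratory's actual data-reduction workflow), the snippet below converts a measured 206Pb/238U atomic ratio into a crystallization age, assuming a closed system with no initial lead; the input ratio is an illustrative value, not a measured datum.

```python
import math

# Decay constant of 238U in 1/year (half-life of 238U is about 4.468 billion years).
LAMBDA_238U = 1.55125e-10

def u_pb_age_years(pb206_u238: float) -> float:
    """206Pb/238U age in years, assuming no initial lead and a closed system."""
    return math.log(1.0 + pb206_u238) / LAMBDA_238U

# A ratio near 0.0315 gives roughly 200 Ma, close to the Triassic-Jurassic
# boundary age discussed in the article.
print(f"{u_pb_age_years(0.0315) / 1e6:.1f} Ma")
```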
|
Geology
| 2017 |
June 1, 2017
|
https://www.sciencedaily.com/releases/2017/06/170601082254.htm
|
A new twist on uranium's origin story
|
Uranium, the radioactive element that fuels nuclear power plants and occurs naturally in the Earth's crust, is typically mined from large sandstone deposits deep underground. The uranium in these deposits, which are called roll fronts, has long been thought to form over millions of years via chemical reactions of sulfur and other non-biological compounds.
|
This widely accepted textbook geology is being challenged by Colorado State University biogeochemists in a new study published June 1. "You know you might have a big story when you discover something that will result in people having to rewrite textbooks," Borch said. "Our results may introduce a paradigm shift in the way we think about ore genesis and mining -- from implications for human health, to restoration practices, to how mining companies calculate how much they can earn from a given site." Conventional wisdom has told us that uranium within ore deposits is mostly found in the form of uraninite, a crystalline mineral. In recent years, scientists had uncovered new evidence that bacteria -- living microorganisms -- could generate a different kind of reduced uranium that is non-crystalline and has very different physical and chemical properties. Borch, working on an unrelated experiment studying the composition of uranium at mined and unmined sites in Wyoming, surmised that this biogenic (of biological origin), non-crystalline uranium might occur naturally within ore deposits. To find out, Borch's team analyzed samples from the Wyoming roll front, using new techniques including synchrotron radiation-based spectroscopy and isotope fingerprinting. They found that up to 89 percent of the uranium from their 650-foot-deep samples wasn't crystalline uraninite at all, but rather a non-crystalline uranium that was bound to organic matter or inorganic carbonate. Most of the uranium they found in that unmined site is estimated to be 3 million years old, and formed via reduction by microorganisms -- microbes that respire not on oxygen, but on uranium. To verify their results, the team partnered with experts from the U.S. Geological Survey, the Institute for Mineralogy at Leibniz University in Germany, and the Swiss Federal Institute of Technology in Lausanne, all of whom became paper co-authors. The abundance of this biogenic non-crystalline uranium has implications for environmental remediation of mining sites, and for mining practices in general. For instance, biogenic non-crystalline uranium is much more likely to oxidize into a water-soluble form than its crystalline counterparts. This could impact the compound's environmental mobility and its likelihood of contaminating a drinking water aquifer, Borch said. Borch says that most states require spent mines to be restored to pre-mining conditions. "In order to get back to pre-mining conditions, we had better understand those pre-mining conditions," Borch said. "The baseline may not be what we thought it was." Though there is now strong evidence for microbial origins of roll-front uranium, what's less clear is whether the microbes making uranium today are the same as those that formed it in the Earth's crust 3 million years ago. "But we do know through isotopic fingerprinting that the uranium formed via microbial reduction," Borch said. Borch's co-authors include Rizlan Bernier-Latmani, a scientist in Switzerland who developed the isotopic fingerprinting techniques to differentiate between uranium formed via microbial or chemical means. Borch and colleagues hope to explore the origins of roll-front uranium deposits at other sites, in order to evaluate the global significance of their findings.
|
Geology
| 2017 |
May 30, 2017
|
https://www.sciencedaily.com/releases/2017/05/170530082345.htm
|
Death by volcano?
|
Anyone concerned by the idea that people might try to combat global warming by injecting tons of sulfate aerosols into Earth's atmosphere may want to read an article in the May 1, 2017 issue of the journal
|
In the article, a Washington University scientist and his colleagues describe what happened when pulses of atmospheric carbon dioxide and sulfate aerosols were intermixed at the end of the Ordovician geological period more than 440 million years ago. The counterpart of the tumult in the skies was death in the seas. At a time when most of the planet north of the tropics was covered by an ocean and most complex multicellular organisms lived in the sea, 85 percent of marine animal species disappeared forever. The end-Ordovician extinction, as this event was called, was one of the five largest mass extinctions in Earth's history. Although the gases were injected into the atmosphere by massive volcanism rather than prodigious burning of fossil fuels, and under circumstances that will never be exactly repeated, they provide a worrying case history that reveals the potential instability of planetary-scale climate dynamics. Figuring out what caused the end-Ordovician extinction or any of the other mass extinctions in Earth's history is notoriously difficult, said David Fike, associate professor of earth and planetary sciences in Arts & Sciences and a co-author on the paper. Because the ancient atmospheres and oceans have long since been altered beyond recognition, scientists have to work from proxies, such as variations in oxygen isotopes in ancient rock, to learn about climates long past. The trouble with most proxies, said Fike, who specializes in interpreting the chemical signatures of biological and geological activity in the rock record, is that most elements in rock participate in so many chemical reactions that a signal can often be interpreted in more than one way. But a team led by David Jones, an earth scientist at Amherst College, was able to bypass this problem by measuring the abundance of mercury. Today, the primary sources of mercury are coal-burning power plants and other anthropogenic activities; during the Ordovician, however, the main source was volcanism. Volcanism coincides with mass extinctions with suspicious frequency, Fike said. He is speaking not about an isolated volcano but rather about massive eruptions that covered thousands of square kilometers with thick lava flows, creating large igneous provinces (LIPs). The most famous U.S. example of a LIP is the Columbia River Basalt province, which covers most of the southeastern part of the state of Washington and extends to the Pacific and into Oregon. Volcanoes are plausible climate forcers, or change agents, because they release both carbon dioxide, which can produce long-term greenhouse warming, and sulfur dioxide, which can cause short-term reflective cooling. In addition, the weathering of vast plains of newly exposed rock can draw down atmospheric carbon dioxide and bury it as limestone minerals in the oceans, also causing cooling. When Jones analyzed samples of rock of Ordovician age from south China and the Monitor Range in Nevada, he found anomalously high mercury concentrations. Some samples held 500 times more mercury than the background concentration. The mercury arrived in three pulses, before and during the mass extinction. But what happened? It had to have been an unusual sequence of events, because the extinction (atypically) coincided with glaciation and also happened in two pulses. As the scientists began to piece together the story, they began to wonder if the first wave of eruptions didn't push Earth's climate into a particularly vulnerable state, setting it up for a climate catastrophe triggered by later eruptions. The first wave of eruptions laid down a LIP whose weathering then drew down atmospheric carbon dioxide. The climate cooled and glaciers formed on the supercontinent of Gondwana, which was then located in the southern hemisphere. The cooling might have lowered the tropopause, the boundary between two layers of the atmosphere with different temperature gradients. The second wave of volcanic eruptions then injected prodigious amounts of sulfur dioxide above the tropopause, abruptly increasing Earth's albedo, or the amount of sunlight it reflected. This led to the first and largest pulse of extinctions. As ice sheets grew, sea level dropped and the seas became colder, causing many species to perish. During the second wave of volcanism, the greenhouse warming from carbon dioxide overtook the cooling caused by sulfur dioxide and the climate warmed, the ice melted and sea levels rose. Many of the survivors of the first pulse of extinctions died in the ensuing flooding of habitat with warmer, oxygen-poor waters. The take-home, said Fike, is that the different factors that affect Earth's climate can interact in unanticipated ways, and it is possible that events that might not seem extreme in themselves can put the climate system into a precarious state where additional perturbations have catastrophic consequences. "It's something to keep in mind when we contemplate geoengineering schemes to mitigate global warming," said Fike, who teaches a course where students examine such schemes and then evaluate their willingness to deploy them.
|
Geology
| 2017 |
May 29, 2017
|
https://www.sciencedaily.com/releases/2017/05/170529133705.htm
|
Hotspots show that vegetation alters climate by up to 30 percent
|
A new Columbia Engineering study, led by Pierre Gentine, associate professor of earth and environmental engineering, analyzes global satellite observations and shows that vegetation alters climate and weather patterns by as much as 30 percent. Using a new approach, the researchers found that feedbacks between the atmosphere and vegetation (terrestrial biosphere) can be quite strong, explaining up to 30 percent of variability in precipitation and surface radiation. The paper (DOI 10.1038/ngeo2957) was published May 29 in Nature Geoscience.
|
"While we can currently make fairly reliable weather predictions, as, for example, five-day forecasts, we do not have good predictive power on sub-seasonal to seasonal time scale, which is essential for food security," Gentine says. "By more accurately observing and modeling the feedbacks between photosynthesis and the atmosphere, as we did in our paper, we should be able to improve climate forecasts on longer timescales."Vegetation can affect climate and weather patterns due to the release of water vapor during photosynthesis. The release of vapor into the air alters the surface energy fluxes and leads to potential cloud formation. Clouds alter the amount of sunlight, or radiation, that can reach Earth, affecting Earth's energy balance, and in some areas can lead to precipitation. "But, until our study, researchers have not been able to exactly quantify in observations how much photosynthesis, and the biosphere more generally, can affect weather and climate," says Julia Green, Gentine's PhD student and the paper's lead author.Recent advancements in satellite observations of solar-induced fluorescence, a proxy for photosynthesis, enabled the team to infer vegetation activity. They used remote sensing data for precipitation, radiation, and temperature to represent the atmosphere. They then applied a statistical technique to understand the cause and feedback loop between the biosphere and the atmosphere. Theirs is the first study investigating land-atmosphere interactions to determine both the strength of the predictive mechanism between variables and the time scale over which these links occur.The researchers found that substantial vegetation-precipitation feedback loops often occur in semi-arid or monsoonal regions, in effect hotspots that are transitional between energy and water limitation. In addition, strong biosphere-radiation feedbacks are often present in several moderately wet regions, for instance in the Eastern U.S. and in the Mediterranean, where precipitation and radiation increase vegetation growth. Vegetation growth enhances heat transfer and increases the height of Earth's boundary layer, the lowest part of the atmosphere that is highly responsive to surface radiation. This increase in turn affects cloudiness and surface radiation."Current Earth system models underestimate these precipitation and radiation feedbacks mainly because they underestimate the biosphere response to radiation and water stress response," Green says. "We found that biosphere-atmosphere feedbacks cluster in hotspots, in specific climatic regions that also coincide with areas that are major continental CO2 sources and sinks. Our research demonstrates that those feedbacks are also essential for the global carbon cycle -- they help determine the net CO2 balance of the biosphere and have implications for improving critical management decisions in agriculture, security, climate change, and so much more."Gentine and his team are now exploring ways to model how biosphere-atmosphere interactions may change with a shifting climate, as well as learning more about the drivers of photosynthesis, in order to better understand atmospheric variability.Paul Dirmeyer, a professor in the department of atmospheric, oceanic and earth sciences at George Mason University who was not involved in the study, notes: "Green et al. 
put forward an intriguing and exciting new idea, expanding our measures of land-atmospheric feedbacks from mainly a phenomenon of the water and energy cycles to include the biosphere, both as a response to climate forcing and a forcing to climate response."
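The "statistical technique" is described only loosely here. As a rough illustration of the general idea of linking vegetation activity to later atmospheric anomalies, the sketch below computes a lagged cross-correlation between a synthetic solar-induced fluorescence (SIF) series and precipitation. Both the data and the method are stand-ins for illustration, not the paper's actual causal framework.

```python
import numpy as np

rng = np.random.default_rng(0)
n_months = 240
sif = rng.standard_normal(n_months)                              # vegetation-activity anomaly
precip = 0.4 * np.roll(sif, 2) + rng.standard_normal(n_months)   # responds ~2 months later (synthetic)

def lagged_corr(x: np.ndarray, y: np.ndarray, lag: int) -> float:
    """Correlation of x(t) with y(t + lag); positive lag means x leads y."""
    if lag == 0:
        return float(np.corrcoef(x, y)[0, 1])
    return float(np.corrcoef(x[:-lag], y[lag:])[0, 1])

for lag in range(4):
    print(f"lag {lag} months: r = {lagged_corr(sif, precip, lag):+.2f}")
```

With this construction the correlation peaks at a two-month lag, which is the kind of lead-lag signal a feedback analysis looks for before attributing any causal direction.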
|
Geology
| 2017 |
May 23, 2017
|
https://www.sciencedaily.com/releases/2017/05/170523144110.htm
|
How X-rays helped to solve mystery of floating rocks in the ocean
|
It's true -- some rocks can float on water for years at a time. And now scientists know how they do it, and what causes them to eventually sink.
|
X-ray studies at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have helped scientists to solve this mystery by scanning inside samples of lightweight, glassy, and porous volcanic rocks known as pumice stones. The X-ray experiments were performed at Berkeley Lab's Advanced Light Source, an X-ray source known as a synchrotron. The surprisingly long-lived buoyancy of these rocks -- which can form miles-long debris patches on the ocean known as pumice rafts that can travel for thousands of miles -- can help scientists discover underwater volcano eruptions. And, beyond that, learning about pumice flotation can help us understand how it spreads species around the planet; pumice is nutrient rich and readily serves as a seafaring carrier of plant life and other organisms. Floating pumice can also be a hazard for boats, as the ashy mixture of ground-up pumice can clog engines. "The question of floating pumice has been around the literature for a long time, and it hadn't been resolved," said Kristen E. Fauria, a UC Berkeley graduate student who led the study. While scientists have known that pumice can float because of pockets of gas in its pores, it was unknown how those gases remain trapped inside the pumice for prolonged periods. If you soak up enough water in a sponge, for example, it will sink. "It was originally thought that the pumice's porosity is essentially sealed," Fauria said, like a corked bottle floating in the sea. But pumice's pores are actually largely open and connected -- more like an uncorked bottle. "If you leave the cap off and it still floats ... what's going on?" Some pumice stones have even been observed to "bob" in the laboratory -- sinking during the evening and surfacing during the day. To understand what's at work in these rocks, the team used wax to coat bits of water-exposed pumice sampled from Medicine Lake Volcano near Mount Shasta in Northern California and Santa María Volcano in Guatemala. They then used an X-ray imaging technique at the ALS known as microtomography to study concentrations of water and gas -- in detail measured in microns, or thousandths of a millimeter -- within preheated and room-temperature pumice samples. The detailed 3-D images produced by the technique are very data-intensive, which posed a challenge in quickly identifying the concentrations of gas and water present in the pumice samples' pores. To tackle this problem, Zihan Wei, a visiting undergraduate researcher from Peking University, used a data-analysis software tool that incorporates machine learning to automatically identify the gas and water components in the images. Researchers found that the gas-trapping processes at play in the pumice stones relate to "surface tension," a chemical interaction between the water's surface and the air above it that acts like a thin skin -- this allows some creatures, including insects and lizards, to actually walk on water. "The process that's controlling this floating happens on the scale of human hair," Fauria said. "Many of the pores are really, really small, like thin straws all wound up together. So surface tension really dominates." The team also found that a mathematical formulation known as percolation theory, which helps to understand how a liquid enters a porous material, provides a good fit for the gas-trapping process in pumice. And gas diffusion -- which describes how gas molecules seek areas of lower concentration -- explains the eventual loss of these gases that causes the stones to sink. Michael Manga, a staff scientist in Berkeley Lab's Energy Geosciences Division and a professor in the Department of Earth and Planetary Science at UC Berkeley who participated in the study, said, "There are two different processes: one that lets pumice float and one that makes it sink," and the X-ray studies helped to quantify these processes for the first time. The study showed that previous estimates for flotation time were in some cases off by several orders of magnitude. "Kristen had the idea that in hindsight is obvious," Manga said, "that water is filling up only some of the pore space." The water surrounds and traps gases in the pumice, forming bubbles that make the stones buoyant. Surface tension serves to keep these bubbles locked inside for prolonged periods. The bobbing observed in laboratory experiments of pumice flotation is explained by trapped gas expanding during the heat of the day, which causes the stones to temporarily float until the temperature drops. The X-ray work at the ALS, coupled with studies of small pieces of pumice floating in water in Manga's UC Berkeley lab, helped researchers to develop a formula for predicting how long a pumice stone will typically float based on its size. Manga has also used an X-ray technique at the ALS called microdiffraction, which is useful for studying the origins of crystals in volcanic rocks. Dula Parkinson, a research scientist at Berkeley Lab's ALS who assisted with the team's microtomography experiments, said, "I'm always amazed at how much information Michael Manga and his collaborators are able to extract from the images they collect at ALS, and how they're able to join that information with other pieces to solve really complicated puzzles." The recent study triggered more questions about floating pumice, Fauria said, such as how pumice, ejected from deep underwater volcanoes, finds its way to the surface. Her research team has also conducted X-ray experiments at the ALS to study samples from so-called "giant" pumice that measured more than a meter long. That stone was recovered from the sea floor in the area of an active underwater volcano by a 2015 research expedition that Fauria and Manga participated in. The expedition, to a site hundreds of miles north of New Zealand, was co-led by Rebecca Carey, a scientist formerly affiliated with the Lab's ALS. Underwater volcano eruptions are not as easy to track down as eruptions on land, and floating pumice spotted by a passenger on a commercial aircraft actually helped researchers track down the source of a major underwater eruption that occurred in 2012 and motivated the research expedition. Pumice stones spewed from underwater volcano eruptions vary widely in size but can typically be about the size of an apple, while pumice stones from volcanoes on land tend to be smaller than a golf ball. "We're trying to understand how this giant pumice rock was made," Manga said. "We don't understand well how submarine eruptions work. This volcano erupted completely different than we hypothesized. Our hope is that we can use this one example to understand the process." Fauria agreed that there is much to learn from underwater volcano studies, and she noted that X-ray studies at the ALS will play an ongoing role in her team's work.
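The diffusion-controlled sinking mentioned above implies a characteristic timescale that grows with the square of the clast size. The toy estimate below is a back-of-envelope sketch under strong simplifying assumptions; it is not the formula developed in the study, which also accounts for surface tension, pore geometry and partial saturation, all of which lengthen real flotation times. It only illustrates the size-squared scaling.

```python
# Rough diffusion-timescale estimate: t ~ L^2 / D for trapped gas escaping a
# water-saturated clast. Purely illustrative; ignores tortuosity, partial
# saturation, and surface tension.
D_GAS_IN_WATER = 2e-9  # m^2/s, typical molecular diffusivity of dissolved gas in water (assumed)

def sink_timescale_days(radius_m: float) -> float:
    """Order-of-magnitude time for trapped gas to diffuse out of a clast."""
    return radius_m ** 2 / D_GAS_IN_WATER / 86400.0

for r in (0.01, 0.05):  # 1 cm and 5 cm clasts
    print(f"radius {r * 100:.0f} cm -> ~{sink_timescale_days(r):.1f} days")
```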
|
Geology
| 2017 |
May 23, 2017
|
https://www.sciencedaily.com/releases/2017/05/170523083354.htm
|
Smoke from wildfires can have lasting climate impact
|
The wildfire that has raged across more than 150,000 acres of the Okefenokee Swamp in Georgia and Florida has sent smoke billowing into the sky as far as the eye can see. Now, new research published by the Georgia Institute of Technology shows how that smoke could impact the atmosphere and climate much more than previously thought.
|
Researchers have found that carbon particles released into the air from burning trees and other organic matter are much more likely than previously thought to travel to the upper levels of the atmosphere, where they can interfere with rays from the sun -- sometimes cooling the air and at other times warming it. "Most of the brown carbon released into the air stays in the lower atmosphere, but a fraction of it does get up into the upper atmosphere, where it has a disproportionately large effect on the planetary radiation balance -- much stronger than if it was all at the surface," said Rodney Weber, a professor in Georgia Tech's School of Earth & Atmospheric Sciences. The study was published May 22. The researchers analyzed air samples collected in 2012 and 2013 by NASA aircraft from the upper troposphere -- about seven miles above the earth's surface -- at locations across the United States. They found surprising levels of brown carbon in the samples but much less black carbon. While black carbon can be seen in the dark smoke plumes rising above burning fossil or biomass fuels at high temperature, brown carbon is produced from the incomplete combustion that occurs when grasses, wood or other biological matter smolders, as is typical for wildfires. As particulate matter in the atmosphere, both can interfere with solar radiation by absorbing and scattering the sun's rays. The climate is more sensitive to those particulates as their altitude increases. The researchers found that brown carbon appears much more likely than black carbon to travel through the air to the higher levels of the atmosphere, where it can have a greater impact on climate. "People have always assumed that when you emit this brown carbon, over time it goes away," said Athanasios Nenes, a professor and Georgia Power Scholar in the School of Earth & Atmospheric Sciences and the School of Chemical & Biomolecular Engineering. After the brown carbon is carried by smoke plumes into the lower atmosphere, it mixes with clouds. Then it hitches a ride on the deep convection forces that exist in clouds to travel to the upper atmosphere. Although the researchers couldn't explain how, they also found that during the journey through the clouds, the brown carbon became more concentrated relative to black carbon. "The surprise here is that the brown carbon gets promoted when you go through the cloud, compared to black carbon," Nenes said. "This suggests that there may be in-cloud production of brown carbon that we were not aware of before."
|
Geology
| 2017 |
May 23, 2017
|
https://www.sciencedaily.com/releases/2017/05/170523083216.htm
|
Hottest lavas that erupted in past 2.5 billion years revealed
|
An international team of researchers led by geoscientists with the Virginia Tech College of Science recently discovered that deep portions of Earth's mantle might be as hot as they were more than 2.5 billion years ago.
|
The study, led by Esteban Gazel, an assistant professor with Virginia Tech's Department of Geosciences, and his doctoral student Jarek Trela of Deer Park, Illinois, is published in the latest issue of the journal. The Archean Eon -- covering 2.5 to 4 billion years ago -- is one of the most enigmatic times in the evolution of our planet, Gazel said. During this time period, the temperature of Earth's mantle -- the silicate region between the crust and the outer core -- was hotter than it is today, owing to a higher amount of radioactive heat produced from the decay of elements such as potassium, thorium, and uranium. Because Earth was hotter during this period, this interval of geologic time is marked by the widespread occurrence of a unique rock known as komatiite. "Komatiites are basically superhot versions of Hawaiian style lava flows," Gazel said. "You can imagine a Hawaiian lava flow, only komatiites were so hot that they glowed white instead of red, and they flowed on a planetary surface with very different atmospheric conditions, more similar to Venus than the planet we live on today." Earth essentially stopped producing abundant hot komatiites after the Archean era because the mantle has cooled during the past 4.5 billion years due to convective cooling and a decrease in radioactive heat production, Gazel said. However, Gazel and a team made what they call an astonishing discovery while studying the chemistry of ancient Galapagos-related lava flows, preserved today in Central America: a suite of lavas that shows conditions of melting and crystallization similar to the mysterious Archean komatiites. Gazel and collaborators studied a set of rocks from the 90-million-year-old Tortugal Suite in Costa Rica and found that they had magnesium concentrations as high as Archean komatiites, as well as textural evidence for extremely hot lava flow temperatures. "Experimental studies tell us that the magnesium concentration of basalts and komatiites is related to the initial temperature of the melt," Gazel said. "The higher the temperature, the higher the magnesium content of a basalt." The team also studied the composition of olivine, the first mineral that crystallized from these lavas. Olivine -- a light green mineral that Gazel has obsessively explored many volcanoes and magmatic regions to search for -- is an extremely useful tool to study a number of conditions related to the origin of a lava flow because it is the first mineral phase that crystallizes when a mantle melt cools. Olivines also carry inclusions of glass -- that once was melt -- and other smaller minerals that are helpful to decipher the secrets of the deep Earth. "We used the composition of olivine as another thermometer to corroborate how hot these lavas were when they began to cool," Gazel said. "You can determine the temperature that basaltic lava began crystallizing by analyzing the composition of olivine and inclusions of another mineral called spinel. At higher temperatures, olivine will incorporate more aluminum into its structure and spinel will incorporate more chromium. If you know how much of these elements are present in each mineral, then you know the temperature at which they crystallized." The team found that Tortugal olivines crystallized at temperatures nearing 2,900 degrees Fahrenheit (1,600 degrees Celsius) -- as high as temperatures recorded by olivines from komatiites -- making this a new record for lava temperatures in the past 2.5 billion years. Gazel and collaborators suggest in their study that Earth may still be capable of producing komatiite-like melts. Their results suggest that Tortugal lavas most likely originated from the hot core of the Galapagos mantle plume that started producing melts nearly 90 million years ago and has remained active ever since. A mantle plume is a deep-earth structure that likely originates at the core-mantle boundary of the planet. When it nears the surface of the planet it begins to melt, forming features known as hotspots such as those found in Hawaii or Galapagos. Geologists can then study these hotspot lava flows and use their geochemical information as a window into the deep Earth. "What is really fascinating about this study is that we show that the planet is still capable of producing lavas as hot as during the Archean time period," Gazel said. "Based on our results from Tortugal lavas, we think that mantle plumes are 'tapping' a deep, hot region of the mantle that hasn't cooled very much since the Archean. We think that this region is probably being sustained by heat from the crystallizing core of the planet." "This is a really interesting discovery and we are going to keep investigating Tortugal," said Trela, a doctoral student and the first author of the paper. "Although the Tortugal Suite was first discovered and documented more than 20 years ago, it wasn't until now that we have the technology and experimental support to better understand the global implications of this location." Trela added, "Our new data suggest that this suite of rocks offers tremendous opportunity to answer key questions regarding the accretion of Earth, its thermal evolution, and the geochemical messages that mantle plumes bring to the surface of the planet."
|
Geology
| 2017 |
May 23, 2017
|
https://www.sciencedaily.com/releases/2017/05/170523082009.htm
|
Weathering of rocks a poor regulator of global temperatures
|
A new University of Washington study shows that the textbook understanding of global chemical weathering -- in which rocks are dissolved, washed down rivers and eventually end up on the ocean floor to begin the process again -- does not depend on Earth's temperature in the way that geologists had believed.
|
The study was published May 22 in an open-access journal. "Understanding how the Earth transitioned from a hothouse climate in the age of the dinosaurs to today could help us better understand long-term consequences of future climate change," said corresponding author Joshua Krissansen-Totton, a UW doctoral student in Earth and space sciences. The current understanding is that Earth's climate is controlled over periods of millions of years by a natural thermostat related to the weathering of rocks. Carbon dioxide is released into the air by volcanoes, and this gas may then dissolve into rainwater and react with silicon-rich continental rocks, causing chemical weathering of the rocks. This dissolved carbon then flows down rivers into the ocean, where it ultimately gets locked up in carbon-containing limestone on the seafloor. As a potent greenhouse gas, atmospheric carbon dioxide also traps heat from the sun. And a warmer Earth increases the rate of chemical weathering both by causing more rainfall and by speeding up the chemical reactions between rainwater and rock. Over time, reducing the amount of carbon dioxide in the air by this method cools the planet, eventually returning the climate to more moderate temperatures -- or so goes the textbook picture. "The general idea has been that if more carbon dioxide is released, the rate of weathering increases, and carbon dioxide levels and temperature are moderated," said co-author David Catling, a UW professor of Earth and space sciences. "It's a sort of long-term thermostat that protects the Earth from getting too warm or too cold." The new study began when researchers set out to determine conditions during the earliest life on Earth, some 3.5 billion to 4 billion years ago. They first tested their ideas on what they believed to be a fairly well-understood time period: the past 100 million years, when rock and fossil records of temperatures, carbon dioxide levels and other environmental variables exist. Earth's climate 100 million years ago was very different from today. During the mid-Cretaceous, the poles were 20 to 40 degrees Celsius warmer than the present. Carbon dioxide in the air was more than double today's concentrations. Seas were 100 meters (330 feet) higher, and dinosaurs roamed near the ice-free poles. The researchers created a computer simulation of the flows of carbon required to match all the geologic records, thus reproducing the dramatic transition from the warm mid-Cretaceous times to today. "We found that to be able to explain all the data -- temperature, CO2 ..." Geologists had previously estimated that a temperature increase of 7 C would double the rate of chemical weathering. But the new results show that more than three times that temperature jump, or 24 C, is required to double the rate at which rock is washed away. "It's just a much less efficient thermostat," Krissansen-Totton said. The authors suggest that another mechanism controlling the rate of weathering may be how much land is exposed above sea level and the steepness of Earth's surface. When the Tibetan Plateau was formed some 50 million years ago, the steeper surfaces may have increased the global rate of chemical weathering, drawing down more CO2. "In retrospect, our results make a lot of sense," Catling said. "Rocks tell us that Earth has had large swings in temperature over geological history, so Earth's natural thermostat can't be a very tight one." Their calculations also indicate a stronger relationship between atmospheric CO2 ... Though not the final word, researchers said, these numbers are bad news for today's climate shifts. "What all this means is that in the very long term, our distant descendants can expect more warming for far longer if carbon dioxide levels and temperatures continue to rise," Catling said. The researchers will now apply their calculations to other periods of the geologic past. "This is going to have implications for the carbon cycles for other times in Earth's history and into its future, and potentially for other rocky planets beyond the solar system," Krissansen-Totton said.
|
Geology
| 2017 |
May 19, 2017
|
https://www.sciencedaily.com/releases/2017/05/170519083631.htm
|
Sea level as a metronome of Earth's history
|
Sedimentary layers record the history of Earth. They contain stratigraphic cycles and patterns that precisely reveal the succession of climatic and tectonic conditions that have occurred over millennia, thereby enhancing our ability to understand and predict the evolution of our planet. Researchers at the University of Geneva (UNIGE), Switzerland -- together with colleagues at the University of Lausanne (UNIL) and American and Spanish scientists -- have been working on an analytical method that combines observing deep-water sedimentary strata and measuring in them the isotopic ratio between heavy and light carbon. They have discovered that the cycles that punctuate these sedimentary successions are not, as one might think, due solely to the erosion of mountains that surround the basin, but are more ascribable to sea level changes. This research, which you can read in the journal
|
The area south of the Pyrenees is particularly suitable for studying sedimentary layers. Rocks are exposed over large distances, allowing researchers to undertake direct observation. Turbidites can be seen here: large sediment deposits formed in the past by underwater avalanches consisting of sand and gravel. "We noticed that these turbidites returned periodically, about every million years. We then wondered what the reasons for this cyclicity were," explains Sébastien Castelltort, professor in the department of earth sciences in UNIGE's faculty of sciences. The geologists focused their attention on Eocene sedimentary rocks (about 50 million years old), deposited during a particularly hot period, and undertook the isotopic profiling of the sedimentary layers. "We took a sample every 10 metres," says Louis Honegger, a researcher at UNIGE, "measuring the ratio between 13C (heavy carbon stable isotope) and 12C (light carbon stable isotope). The ratio between the two tells us about the amount of organic matter, the main consumer of 12C, which is greater when the sea level is high. The variations in the ratio helped us explore the possible link with the sea level." The research team found that the turbidite-rich intervals were associated with high 12C levels, and almost always corresponded to periods when the sea level was low. It seems that sedimentary cycles are mainly caused by the rise and fall of the sea level and not by the episodic growth of mountains. When the sea level is high, continental margins are flooded under a layer of shallow water. Since the rivers are no longer able to flow further, they begin to deposit the sediments they carry there. This is why so little material reaches the deep basins downstream. When the sea level is low, however, rivers erode their beds to lower the elevation of their mouth; they transfer their sediment directly to the continental slopes of the deep basins, creating an avalanche of sand and gravel. Consequently, if the variations of the sea level are known, it is possible to predict the presence of large sedimentary accumulations created by turbidites, which often contain large volumes of hydrocarbons, one of the holy grails of exploration geology. The research provides a new role for the use of carbon isotopes. "From now on," continues Castelltort, "we know that by calculating the ratio between 13C and 12C sampled in similar slope deposits close to continents, we can have an indication of the sea level, which means it's possible to better predict the distribution of sedimentary rocks in our subsurface." In addition, this measurement is relatively simple to perform and it provides accurate data -- a real asset for science and mining companies. The study also highlights the importance of sea levels, which are a real metronome for Earth's sedimentary history. "Of course," concludes Honegger, "tectonic deformation and erosion are important factors in the formation of sedimentary layers; but they play a secondary role in the formation of turbidite accumulations, which are mainly linked to changes in the sea level."
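Carbon isotope ratios of the kind measured here are conventionally reported in delta notation relative to the VPDB standard. The helper below is a generic illustration of that conversion; the standard ratio is approximate and the sample value is invented, not data from this study.

```python
# delta-13C in per mil relative to the VPDB reference standard.
R_VPDB = 0.0112372  # approximate 13C/12C ratio of the VPDB standard

def delta13c_permil(r_sample: float) -> float:
    """Convert a measured 13C/12C ratio into delta-13C (per mil vs. VPDB)."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A slightly 13C-depleted (organic-rich) sample, illustrative value only.
print(f"{delta13c_permil(0.011210):+.2f} per mil")
```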
|
Geology
| 2017 |
May 18, 2017
|
https://www.sciencedaily.com/releases/2017/05/170518083039.htm
|
Land height could help explain why Antarctica is warming slower than the Arctic
|
Temperatures in the Arctic are increasing twice as fast as in the rest of the globe, while the Antarctic is warming at a much slower rate. A new study published in
|
Climate models and past-climate studies show that, as the Earth warms in response to an increase in greenhouse gases in the atmosphere, temperatures rise faster at the poles than in other parts of the planet. This is known as polar amplification. But this amplified warming is not the same at both poles. "On average, warming for the entire Antarctic continent has been much slower than Arctic warming so far. Moreover, climate models suggest that, by the end of this century, Antarctica will have warmed less compared to the Arctic," says Marc Salzmann, a researcher at the Institute for Meteorology, University of Leipzig in Germany. A possible cause for the accelerated Arctic warming is the melting of the region's sea ice, which reduces the icy, bright area that can reflect sunlight back out into space, resulting in more solar radiation being absorbed by the dark Arctic waters. Scientists believe this is an important contribution to warming in the region, but it's not the only one. Changes to the transport of heat by the Earth's atmosphere and oceans to the poles have also been suggested as a possible contributor to the steep rise in Arctic temperatures. In addition, the cold temperatures and the way air is mixed close to the surface at the poles mean that the surface has to warm more to radiate additional heat back to space. These effects may not only lead to stronger warming at the north of our planet, but also at the south polar region. "I wondered why some of the reasons to explain Arctic warming have not yet caused strongly amplified warming in all of Antarctica as well," says Salzmann, the author of the study. "I thought that land height could be a game changer that might help explain why the Arctic has thus far warmed faster than Antarctica," he says. With an average elevation of about 2,500 m, Antarctica is the highest continent on Earth, largely due to a thick layer of ice covering the bedrock. The continent also has high mountains, such as Mount Vinson, which rises almost 4,900 m above sea level. To test his idea, Salzmann used a computer model of the Earth system to find out how the climate would react to a doubling of the atmospheric carbon-dioxide concentration. He also ran the same experiment in a flat-Antarctica world, where he artificially decreased the land height over the entire southern continent to one metre, a value similar to the surface height in the Arctic. This allowed him to compare how differently the Earth would react to an increase in greenhouse-gas concentrations in the atmosphere if Antarctica was assumed flat. The experiments showed that, if Antarctica's land height is reduced, temperatures in the region respond more strongly to a rise in the concentration of greenhouse gases over the continent. This contributes to an increase in Antarctic warming, which reduces the difference in polar amplification between the Arctic and the Antarctic. The most significant factor, however, was a change in the way heat is transported in the atmosphere from the equator to the poles in the flat-Antarctica world compared to the reference model. "Assuming a flat Antarctica allows for more transport of warm air from lower latitudes," Salzmann explains. "This is consistent with the existing view that when the altitude of the ice is lowered, it becomes more prone to melting," he adds. In the long term, this could contribute to accelerated Antarctic warming in the real world. As the region warms due to increased greenhouse-gas emissions, ice melts, reducing Antarctica's elevation over centuries or thousands of years. This, in turn, would contribute to even more warming.
|
Geology
| 2017 |
May 15, 2017
|
https://www.sciencedaily.com/releases/2017/05/170515150708.htm
|
Rare Earth element mineral potential in the southeastern US coastal plain
|
Rare earth elements have become increasingly important for advanced technologies, from cell phones to renewable energy to defense systems. Mineral resources hosted in heavy mineral sand deposits are especially attractive because they can be recovered using well-established mechanical methods, making extraction, processing, and remediation relatively simple.
|
In their study just published online in the Geological Society of America Bulletin, A.K. Shah and colleagues examine rare earth mineral resource potential within heavy mineral sands in the southeastern United States.Using geophysical and geochemical data that cover this very wide region, the team mapped the areas most likely to host accumulations of these minerals. Additionally, their analyses of co-minerals provide constraints on broad sedimentary provenance. These constraints suggest that a large percentage of the heavy mineral sands are derived from a relatively small part of the Piedmont province via coastal processes during Atlantic opening, and that a much smaller amount of heavy mineral sands are delivered via rivers and streams.
|
Geology
| 2017 |
April 9, 2020
|
https://www.sciencedaily.com/releases/2020/04/200409141550.htm
|
Long-living tropical trees play outsized role in carbon storage
|
A group of trees that grow fast, live long lives and reproduce slowly account for the bulk of the biomass -- and carbon storage -- in some tropical rainforests, a team of scientists says in a paper published this week in the journal
|
"People have been arguing about whether these long-lived pioneers contribute much to carbon storage over the long term," said Caroline Farrior, an assistant professor of integrative biology at The University of Texas at Austin and a primary investigator on the study. "We were surprised to find that they do."It is unclear the extent to which tropical rainforests can help soak up excess carbon dioxide in the atmosphere produced by burning fossil fuels. Nonetheless, the new study provides insights about the role of different species of trees in carbon storage.Using more than 30 years' worth of data collected from a tropical rainforest in Panama, the team has uncovered some key traits of trees that, when integrated into computer models related to climate change, will improve the models' accuracy. With the team's improved model, the scientists plan to begin answering questions about what drives forest composition over time and what factors affect carbon storage.Most existing Earth system models used to forecast global climate decades from now, including those used by the Intergovernmental Panel on Climate Change, represent the trees in a forest as all basically the same."This analysis shows that that is not good enough for tropical forests and provides a way forward," Farrior said. "We show that the variation in tropical forest species's growth, survival and reproduction is important for predicting forest carbon storage."The project was led by Nadja Rüger, research fellow at the German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig.In addition to the finding about long-lived pioneers, the team found the composition of a tropical forest over time depends on how each tree species balances two different sets of trade-offs: growth versus survival (for example, one type of tree might grow fast but die young) and stature versus reproduction (another might grow tall but reproduce leisurely). Plotting every species as a point on a graph based on where they fall along these two different axes allowed the scientists to have a more sophisticated and accurate model than prior ones, which usually focused exclusively on the first of these two trade-offs or parametrized the groups by different means."To really appreciate that there is this second trade-off between stature and reproduction, and that it's important in old-growth forests, is a big deal biologically," Farrior said.The team also discovered that the nearly 300 unique tree species that live on Barro Colorado Island, which sits in the middle of the Panama Canal, can be represented in their computer model by just five functional groups and still produce accurate forecasts of tree composition and forest biomass over time.It's not possible to directly verify the forecasts of a forest model in future decades. So the researchers did the next best thing: They seeded their model with forest composition data collected at their site in Panama during the 1980s and then ran the model forward to show that it accurately represents the changes that occurred from then until now. This is called "hindcasting."Next, they plan to explore how a warming world might benefit trees with certain traits over others, shifting forest composition and the potential of forests to store carbon."One of the biggest unknowns in climate forecasting is: What are trees going to do?" said Farrior. "We really need to get a handle on that if we're going to accurately predict how climate will change and manage forests. 
Right now, they're absorbing some of the excess carbon we're producing and delaying climate change, but will they keep doing it?"The other coauthors on the paper are Richard Condit at the Field Museum of Natural History in Chicago and the Morton Arboretum; Daisy H. Dent at the Smithsonian Tropical Research Institute (STRI) in Panama and the University of Stirling in the U.K.; Saara J. DeWalt at Clemson University; Stephen P. Hubbell at STRI and the University of California, Los Angeles; Jeremy W. Lichstein at the University of Florida, Gainesville; Omar R. Lopez at STRI and Instituto de Investigaciones Científicas y Servicios de Alta Tecnología in Panama; and Christian Wirth at iDiv, University of Leipzig and Max Planck Institute for Biogeochemistry in Germany.Funding was provided by the U.S. National Science Foundation, Deutsche Forschungsgemeinschaft and Secretaría Nacional de Ciencia, Tecnología e Innovación.
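The hindcasting procedure described above can be sketched generically: initialize a model with the historical state, integrate it forward, and score it against present-day observations. The function and the toy one-percent-growth step below are placeholders for illustration, not the demographic model used in the study.

```python
from typing import Callable, Dict

def hindcast(initial_1980s: Dict[str, float],
             observed_now: Dict[str, float],
             step_one_year: Callable[[Dict[str, float]], Dict[str, float]],
             n_years: int) -> float:
    """Run a forest model forward n_years from a historical state and return
    the mean absolute error in per-group biomass against present-day data."""
    state = dict(initial_1980s)
    for _ in range(n_years):
        state = step_one_year(state)
    return sum(abs(state[g] - observed_now[g]) for g in observed_now) / len(observed_now)

def toy_step(state: Dict[str, float]) -> Dict[str, float]:
    """Toy stand-in for the demographic model: every group gains 1% biomass a year."""
    return {group: biomass * 1.01 for group, biomass in state.items()}

# Hypothetical biomass values (t/ha) for two of the functional groups.
print(hindcast({"long-lived pioneers": 120.0, "shade-tolerant": 200.0},
               {"long-lived pioneers": 165.0, "shade-tolerant": 270.0},
               toy_step, n_years=35))
```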
|
Global Warming
| 2020 |
April 9, 2020
|
https://www.sciencedaily.com/releases/2020/04/200409100330.htm
|
Greenland ice sheet meltwater can flow in winter, too
|
Liquid meltwater can sometimes flow deep below the Greenland Ice Sheet in winter, not just in the summer, according to CIRES-led work published in the AGU journal
|
"This observation raises questions for the Greenland research community, and motivates the need for future work on wintertime hydrology in Greenland," said lead author Lincoln Pitcher, a Visiting Fellow at CIRES, part of the University of Colorado Boulder. Pitcher began this work while he was a graduate student at the University of California Los Angeles, and his co-authors are from seven different states and Denmark.When evidence suggested that some of Greenland's glaciers were storing meltwater through the winter, Pitcher set out for southwest Greenland to see if any of this meltwater was also leaving the ice sheet during winter. In February 2015, he and his colleague Colin Gleason of the University of Massachusetts at Amherst dragged a ground-penetrating radar across frozen rivers downstream of the edge of the ice sheet and drilled boreholes to see if any water was leaving the ice sheet and flowing beneath river ice. They surveyed rivers draining five Greenland Ice Sheet outlet glaciers and discovered meltwater flowing at just one site, the Isortoq River. In summertime, the Isotoq drains meltwater from the terminus of the Isunguata Sermia outlet glacier. In winter, the river appears frozen, but Pitcher and Gleason found slowly flowing liquid water there.It was "a trickle, not a torrent," Pitcher said, and the water was flowing below half a meter of ice while temperatures were well below zero. Pitcher and Gleason collected water samples and geochemical analysis indicated that it had come from under the ice sheet itself.The team concluded that it is possible the bed of the Greenland Ice Sheet can stay wet and drain small amounts of water year-round. This finding is important for understanding how meltwater from the ice surface moves through the ice sheet, is retained, refreezes and/or ultimately drains into rivers and/or the global ocean.It is often assumed that Greenland's drainage system lies dormant during winter. Pitcher's team's findings highlight a growing need for year-round Arctic hydrologic investigations, not just in summer.
|
Global Warming
| 2020 |
April 8, 2020
|
https://www.sciencedaily.com/releases/2020/04/200408113300.htm
|
Don't look to mature forests to soak up carbon dioxide emissions
|
Research published today in
|
Dr. John Drake, assistant professor in ESF's Department of Sustainable Resources Management, is a co-author of the paper in collaboration with researchers at Western Sydney University. The experiment was conducted at Western Sydney University's EucFACE (Eucalyptus Free Air CO2 Enrichment) facility. Carbon dioxide (CO2) ... However, scientists have long wondered whether mature native forests would be able to take advantage of the extra photosynthesis, given that the trees also need nutrients from the soil to grow. Drake joined in the first experiment of its kind applied to a mature native forest to expose a 90-year-old eucalypt woodland on Western Sydney's Cumberland Plain to elevated carbon dioxide levels. The researchers combined their measurements into a carbon budget that accounts for all the pathways of carbon into and out of the EucFACE forest ecosystem, through the trees, grasses, insects, soils and leaf litter. This carbon-tracking analysis showed that the extra carbon absorbed by the trees was quickly cycled through the soil and returned to the atmosphere, with around half the carbon being returned by the trees themselves, and half by fungi and bacteria in the soil. "The trees convert the absorbed carbon into sugars, but they can't use those sugars to grow more, because they don't have access to additional nutrients from the soil. Instead, they send the sugars below-ground where they 'feed' soil microbes," said Dr. Belinda Medlyn, distinguished professor at the Hawkesbury Institute for the Environment. These findings have global implications: models used to project future climate change, and impacts of climate change on plants and ecosystems, currently assume that mature forests will continue to absorb carbon over and above their current levels, acting as carbon sinks. The findings from EucFACE suggest that those sinks may be weaker or absent for mature forests. "While we can't say what we found in this one Australian forest directly translates to northeastern forests in the United States," Drake said, the information has implications for forests in New York state. "Forests of the northeastern United States for the last 100 years have been regrowing and providing an important carbon sink. As those forests transition to a more mature state, there are some uncertainties whether that will continue," said Drake. The results may also impact New York's first statewide forest carbon assessment led by Dr. Colin Beier of ESF's Climate and Applied Forest Research Institute (CAFRI). "Forests are increasingly seen in policy circles as a critical part of the solution to climate change, and that's certainly the case for New York, where the carbon absorbed by our forests and stored in trees, soils and harvested wood products will be essential for reaching our state's legislated goal of net carbon neutrality by 2050," said Beier, associate professor of ecology and CAFRI director. "As we develop forest carbon accounting for New York, one of our biggest questions is how forest ecosystems and their many benefits to society, including reducing climate risk, will respond to a rapidly changing environment," said Beier. "This groundbreaking study fills a major gap and reduces this uncertainty, allowing us to make more reliable predictions and provide better guidance to policymakers, landowners, and forest managers." "Forest carbon storage is vitally important in a climate change context," said Drake. Looking to restoration ecology to encourage forests to grow in some particular areas would be useful, said Drake. "There are also possibilities for managing existing forests to increase their carbon storage." Drake is working with colleagues Dr. Julia Burton, Dr. René Germain and Beier to develop and field test alternative forest management strategies that mitigate climate change by increasing the capacity of forests to adapt to changes in climate conditions as well as remove carbon from the atmosphere. "We are not looking for a silver bullet," said Burton. "Climate-smart forest management will likely involve a variety of approaches." "The limited capacity of mature trees to respond suggests the need for a diversity of age classes of trees (younger trees sequester, older trees store carbon) and species, including species that may be better adapted to future climate conditions," said Drake.
|
Global Warming
| 2,020 |
April 8, 2020
|
https://www.sciencedaily.com/releases/2020/04/200408110333.htm
|
Climate change could cause sudden biodiversity losses worldwide
|
A warming global climate could cause sudden, potentially catastrophic losses of biodiversity in regions across the globe throughout the 21st century, finds a new UCL-led study.
|
The findings were published today. The study's lead author, Dr Alex Pigot (UCL Centre for Biodiversity & Environment Research), said: "We found that climate change risks to biodiversity don't increase gradually. Instead, as the climate warms, within a certain area most species will be able to cope for a while, before crossing a temperature threshold, when a large proportion of the species will suddenly face conditions they've never experienced before." "It's not a slippery slope, but a series of cliff edges, hitting different areas at different times." Dr Pigot and colleagues from the USA and South Africa were seeking to predict threats to biodiversity over the course of the 21st century, rather than a single-year snapshot. They used climate model data from 1850 to 2005, and cross-referenced it with the geographic ranges of 30,652 species of birds, mammals, reptiles, amphibians, fish, and other animals and plants. The data was available for areas across the globe, divided up into 100 by 100 km square grid cells. They used climate model projections for each year up to 2100 to predict when species in each grid cell will begin experiencing temperatures that are consistently higher than the organism has previously experienced across its geographic range, for a period of at least five years. The study's first author, Dr Christopher Trisos (African Climate and Development Initiative, University of Cape Town, and National Socio-Environmental Synthesis Center -- SESYNC, Maryland, USA), said: "The historic temperature models, combined with species ranges, showed us the range of conditions that each organism can survive under, as far as we know." "Once temperatures in a given area rise to levels that the species have never experienced, we would expect there to be extinctions, but not necessarily -- we simply have no evidence of the ability of these species to persist after this point," he said. The researchers found that in most ecological communities across the globe, a large proportion of the organisms will find themselves outside of their niche (comfort zone) within the same decade. Across all of the communities, on average 73% of the species facing unprecedented temperatures before 2100 will cross that threshold simultaneously. The researchers predict that if global temperatures rise by 4°C by 2100, under a "high emissions" scenario which the researchers say is plausible, at least 15% of communities across the globe, and potentially many more, will undergo an abrupt exposure event where more than one in five of their constituent species crosses the threshold beyond their niche limit within the same decade. Such an event could cause irreversible damage to the functioning of the ecosystem. If warming is kept to 2°C or less, potentially fewer than 2% of communities will face such exposure events, although the researchers caution that that 2% includes some of the most biodiverse communities on the planet, such as coral reefs. The researchers predict that such unprecedented temperature regimes will begin before 2030 in tropical oceans, and recent events such as mass bleaching of corals on the Great Barrier Reef suggest this is happening already. Higher latitudes and tropical forests are predicted to be at risk by 2050. Dr Pigot said: "Our findings highlight the urgent need for climate change mitigation, by immediately and drastically reducing emissions, which could help save thousands of species from extinction. 
Keeping global warming below 2°C effectively 'flattens the curve' of how this risk to biodiversity will accumulate over the century, providing more time for species and ecosystems to adapt to the changing climate -- whether that's by finding new habitats, changing their behaviour, or with the help of human-led conservation efforts." Co-author Dr Cory Merow (University of Connecticut) said: "We hope that our findings could serve as an early warning system, predicting which areas will be most at risk and when, that could help target conservation efforts and improve future model projections. It may be valuable to develop a ten-year monitoring programme -- similar to what climate scientists do, but for biodiversity -- which could be updated regularly based on what actually occurs." The study was funded by the Royal Society, the National Science Foundation (USA) and the African Academy of Sciences.
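A minimal sketch, with invented data, of the exposure logic described above: for each species in a grid cell, find the first year in which projected temperatures exceed the species' historical maximum for at least five consecutive years, then ask how many species cross that threshold in the same decade. The temperature series, niche limits and helper function are hypothetical, not the study's data or code.

```python
# Toy version of the exposure-timing calculation described above (invented data).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2006, 2101)
n_species = 20

# Hypothetical inputs: one projected temperature series for the grid cell and one
# historical niche limit (1850-2005 maximum) per species.
cell_temps = 26 + 0.03 * (years - 2006) + rng.normal(0, 0.3, years.size)
niche_max = rng.uniform(27.0, 29.0, n_species)

def first_exposure_year(temps, limit, run=5):
    """First year starting a run of `run` consecutive years above the limit."""
    above = temps > limit
    for i in range(len(above) - run + 1):
        if above[i:i + run].all():
            return years[i]
    return None  # never exposed this century

exposure_years = [first_exposure_year(cell_temps, lim) for lim in niche_max]
exposed = [y for y in exposure_years if y is not None]
if exposed:
    decades = np.array(exposed) // 10 * 10
    peak_decade = np.bincount(decades).argmax()
    share = (decades == peak_decade).mean() * 100
    print(f"{share:.0f}% of exposed species cross their limit in the {peak_decade}s")
```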
|
Global Warming
| 2,020 |
April 8, 2020
|
https://www.sciencedaily.com/releases/2020/04/200408104954.htm
|
Carbon emission scheme 'succeeding despite low prices'
|
A European Union (EU) programme aimed at reducing carbon dioxide (CO2) emissions is succeeding despite low carbon prices, according to a new study.
|
Under the EU's Emissions Trading System (ETS), introduced in 2005 in response to the Kyoto Protocol, governments set a cap on an allowable total amount of emissions over a certain period. They also issue tradable emission permits, which allow for one ton of CO2 to be emitted. It is widely considered that carbon markets require high prices to reduce emissions but many observers believe they often set prices which are considered too low. However, the study by Strathclyde and Pittsburgh has found that the EU ETS saved around 1.2 billion tons of CO2 between 2008 and 2016. Dr Patrick Bayer, a Chancellor's Fellow in Strathclyde's School of Government & Public Policy and lead author of the study, said: "The ETS was set up to cover some of the most polluting industries." "It has focused on very carbon-intensive energy production and manufacturing but there is evidence in other research suggesting that these industries have started to diversify their business models and to look into adopting carbon-neutral technologies or, at least, are interested in thinking about how to change their operations." "Firms got an initial endowment of permits free but if they had emissions in excess of what they were allowed, they needed to buy more. If firms are to change their behaviour in the long run, prices of permits should be as high as possible to incentivise them to change away from carbon-intensive production." "It turned out prices in carbon markets were fairly low, which then caused major concerns for environmentalists and policy-makers, because they felt they might not provide sufficient incentives." "It depends on the sector or size of firm but we argue that, if firms think of carbon regulation as a long-term project, then they do need to start to change their behaviour." The study used emissions in sectors not covered by the EU ETS to estimate what emissions would have been in those sectors the system does cover. It found that emissions in covered sectors decreased by between 8.1% and 11.5%, compared to expected emission levels without the EU ETS. This translates to a decrease of around 3.8%, compared with the EU's total emissions during 2008 to 2016. Dr Bayer said: "In the energy and electricity markets, we have seen even big players thinking about how they can run their operations when becoming less dependent on fossil fuels. But there can be a threat that, whenever prices in those markets go up, an industry or business becomes exposed to high costs." "The appeal of carbon markets is that, once they are established with the right rules, you can connect them to other markets. Climate is not concerned about whether emissions are reduced in the UK or Germany or China; so long as they are reduced, that helps to address the problem. If you have carbon markets scattered across the world, you might be able to trade across those markets." "The UK's future place in the ETS is still up for discussion but all options are on the table. Whether any UK carbon market would be connected to the European market isn't clear and would probably depend on negotiations with the EU and how trade will be regulated in future relations. Assuming there were agreement on this and some strong economic integration between the two countries, it would probably make a lot of sense to connect those markets. 
The UK has been successful in decarbonising its economy in the past decade or so and has a strong role to play in continuing to advocate for future decarbonisation." "The period our study covered, from 2008 to 2016, included the financial crisis and economic downturn, when demand for the permits reduced. We used a statistical model to account for the effect of the crisis. The emission reductions that we measure are in addition to lower demand for permits due to the economic crisis, energy efficiency targets and climate policies that try to address carbon emissions."
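The counterfactual logic described above can be illustrated with a short, hedged sketch: use emissions in uncovered sectors to build a "no ETS" baseline for covered sectors and take the difference. All numbers below are invented, and this is not the authors' statistical model.

```python
# Illustrative sketch (not the authors' model): estimate avoided emissions by
# using uncovered sectors to build a counterfactual trend for covered sectors.
import numpy as np

years = np.arange(2005, 2017)
pre_policy = years < 2008  # the ETS phase studied starts in 2008

# Hypothetical emissions indices (2005 = 100) for covered and uncovered sectors.
uncovered = 100 - 0.8 * (years - 2005) + np.random.default_rng(1).normal(0, 0.5, years.size)
covered = 100 - 0.8 * (years - 2005) - np.where(pre_policy, 0.0, 4.0)

# Scale the uncovered-sector trend to match covered sectors before 2008, then
# treat it as the "no ETS" counterfactual afterwards.
scale = covered[pre_policy].mean() / uncovered[pre_policy].mean()
counterfactual = uncovered * scale

avoided = (counterfactual - covered)[~pre_policy].sum()
print(f"Estimated avoided emissions, 2008-2016: {avoided:.1f} index-point-years")
```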
|
Global Warming
| 2,020 |
April 8, 2020
|
https://www.sciencedaily.com/releases/2020/04/200408102135.htm
|
A rapidly changing Arctic
|
A new study by researchers at Woods Hole Oceanographic Institution (WHOI) and their international colleagues found that freshwater runoff from rivers and continental shelf sediments are bringing significant quantities of carbon and trace elements into parts of the Arctic Ocean via the Transpolar Drift -- a major surface current that moves water from Siberia across the North Pole to the North Atlantic Ocean.
|
In 2015, oceanographers conducting research in the Arctic Ocean as part of the International GEOTRACES program found much higher concentrations of trace elements in surface waters near the North Pole than in regions on either side of the current. Their results were published this week. "Many important trace elements that enter the ocean from rivers and shelf sediments are quickly removed from the water column," explains WHOI marine chemist Matthew Charette, lead author of the study. "But in the Arctic they are bound with abundant organic matter from rivers, which allows the mixture to be transported into the central Arctic, over 1,000 kilometers from their source." Trace elements, like iron, form essential building blocks for ocean life. As the Arctic warms and larger swaths of the ocean become ice-free for longer periods of time, marine algae are becoming more productive. A greater abundance of trace elements coming from rivers and shelf sediments can lead to increases in nutrients reaching the central Arctic Ocean, further fueling algal production. "It's difficult to say exactly what changes this might bring," says Charette, "but we do know that the structure of marine ecosystems is set by nutrient availability." Nutrients fuel the growth of phytoplankton, a microscopic algae that forms the base of the marine food web. Generally speaking, more phytoplankton brings more zooplankton -- small fish and crustaceans, which can then be eaten by top ocean predators like seals and whales. Higher concentrations of trace elements and nutrients previously locked up in frozen soils (permafrost) are expected to increase as more river runoff reaches the Arctic, which is warming at a much faster rate than most anywhere else on Earth. While an increase in nutrients may boost Arctic marine productivity, Charette cautions that the continued loss of sea ice will further exacerbate climate warming, which will impact ecosystems more broadly. "The Arctic plays an important role in regulating Earth's climate, with the ice cover reflecting sunlight back to space, helping to mitigate rising global temperatures due to greenhouse gas emissions," he adds. "Once the ice is gone, the Arctic Ocean will absorb more heat from the atmosphere, which will only make our climate predicament worse."
|
Global Warming
| 2,020 |
April 7, 2020
|
https://www.sciencedaily.com/releases/2020/04/200407131507.htm
|
Protecting the high seas: Identify biodiversity hotspots
|
Often considered desolate, remote, unalterable places, the high seas are, in fact, hotbeds of activity for both people and wildlife. Technology has enabled more human activity in areas once difficult to reach, and that in turn has brought a growing presence of industries such as fishing, mining and transportation in international waters -- the ocean beyond 200 nautical miles from any coast.
|
This increase is cause for concern to people like UC Santa Barbara researchers Douglas McCauley, Morgan Visalli and Benjamin Best, who are interested in the health and biodiversity of the oceans. That no nation has jurisdiction over international waters has, at least historically, made regulation very difficult and puts sensitive and essential ocean habitats and resources at risk. "The high seas are the planet's last global commons," said Visalli, a marine scientist at the Benioff Ocean Initiative at UC Santa Barbara. "Yet marine life and resources on the high seas are at risk of being overexploited and degraded under the current fragmented framework of management. The world needs and deserves a comprehensive legal mechanism to protect high seas biodiversity now and into the future." So when the United Nations turned its efforts toward negotiating the first global high seas treaty for "the conservation and sustainable use of marine biological diversity of areas beyond national jurisdiction," the scientists leapt at the chance to put their expertise to work. To kickstart this research, ocean scientists and high seas experts from 13 universities and institutions gathered in a series of workshops held at UC Santa Barbara. Together the team developed a standardized, data-driven strategy to identify hotspots of biodiversity potentially deserving of protection in the high seas. "One of the goals of these United Nations negotiations is to develop a pathway for the establishment of marine protected areas in the high seas," said Visalli. "This creates an incredible opportunity to leverage new global data assets and data-driven planning tools to identify areas of the high seas that have outstanding conservation value and could be considered high priority areas for spatial protection." The researchers' results are published in a new paper. Marine protected areas -- designated parks in the sea where special measures are taken to protect biodiversity -- are among the most powerful and effective tools marine scientists and managers have at their disposal to look after marine biodiversity, maintain ocean resiliency and enhance the productivity of fishery resources that operate just outside of these parks. But to get the most out of marine protected areas, they need to be put in the right places. Researchers in this collaboration used big data and an optimization algorithm to try to balance the benefits of protecting certain locations with high biodiversity against costs, such as the loss of fishing in that area. Their aim was to find win-win solutions for the possible placement of these high seas protected areas. "It is a historic moment for our ocean," said McCauley, a professor of ecology at UC Santa Barbara and director of the Benioff Ocean Initiative. "Places like New York City, that famously included parks for nature and people in their zoning plans before things got busy, have benefited immensely from that foresight. This is our Central Park moment for the high seas." The researchers took more than 22 billion data points organized into 55 layers that included information on conservation-related factors such as species diversity, ocean productivity, threatened species and fishing in locations across the high seas, which cover about two-thirds of the global ocean. 
They also future-proofed their analysis by including data layers describing the predicted diversity of species in a future ocean altered by climate change. "This is important because climate change is rapidly altering our oceans," McCauley said. "Our approach illustrates one way to protect the biodiversity oases of both today and tomorrow." Each hotspot identified in this analysis was special for its own unique reasons. The research highlighted, for example, the Costa Rica Dome, a dynamic nutrient-rich region that attracts endangered blue whales and leatherback sea turtles; the Emperor Seamount Chain, a string of extinct underwater volcanoes that are home to some of the oldest living corals; and the Mascarene Plateau, an area in the Indian Ocean that has the largest contiguous seagrass meadow in the world and provides habitat for many globally unique species. These and other notable biodiversity hotspots across the globe could constitute the critical mass needed to achieve long-term marine sustainability goals, according to the study, and are worthy of consideration as the first generation of high seas marine protected areas. Decades in the making and close to completion, the high seas treaty negotiations were set to embark on their fourth round this month, but have been postponed due to the COVID-19 pandemic. Preliminary results from this exercise were presented by UC Santa Barbara scientists at the United Nations during the third negotiation session for the treaty last August. This analysis, the researchers say, disproves the misconception that there is not enough good data about biodiversity in the high seas to strategically plan for high seas protected areas. "We have high hopes," McCauley said. "We hope that the United Nations will indeed deliver a strong treaty later this year that includes measures to set up these new international ocean parks. And that science-based analyses, such as these, give them confidence that researchers and experts stand ready to help them strategically put these parks in smart places that will maximize the benefits that these parks will yield for people and nature."
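To make the idea of balancing biodiversity benefits against fishing costs concrete, here is a toy prioritization sketch. The study's actual optimization used 55 data layers and a dedicated algorithm; the scores, costs and weighting below are invented for illustration, although the three named hotspots are taken from the article.

```python
# Toy cost-benefit site prioritization in the spirit of the planning exercise
# described above. Scores and costs are invented placeholders.
sites = {
    # site name: (biodiversity score, fishing cost)
    "Costa Rica Dome":        (9.2, 3.1),
    "Emperor Seamount Chain": (8.7, 1.4),
    "Mascarene Plateau":      (8.1, 2.0),
    "Mid-gyre site A":        (3.5, 0.2),
    "Shelf-edge site B":      (6.0, 5.5),
}

def priority(benefit, cost, cost_weight=0.5):
    """Simple trade-off score: benefit penalized by a weighted cost."""
    return benefit - cost_weight * cost

ranked = sorted(sites, key=lambda s: priority(*sites[s]), reverse=True)
budget = 3  # pretend we can protect three areas
print("Candidate protected areas:", ranked[:budget])
```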
|
Global Warming
| 2,020 |
April 7, 2020
|
https://www.sciencedaily.com/releases/2020/04/200407131442.htm
|
Adding a pinch of salt to El Niño models
|
When modeling the El Niño-Southern Oscillation (ENSO) ocean-climate cycle, adding satellite sea surface salinity -- or saltiness -- data significantly improves model accuracy, according to a new NASA study.
|
ENSO is an irregular cycle of warm and cold climate events called El Niño and La Niña. In normal years, strong easterly trade winds blow from the Americas toward southeast Asia, but in an El Niño year, those winds are reduced and sometimes even reversed. Warm water that was "piled up" in the western Pacific flows back toward the Americas, changing atmospheric pressure and moisture to produce droughts in Asia and more frequent storms and floods in the Americas. The reverse pattern is called a La Niña, in which the ocean in the eastern Pacific is cooler than normal. The team used NASA's Global Modeling and Assimilation Office (GMAO) Sub-seasonal-To-Seasonal (S2S) coupled ocean/atmosphere forecasting system (GEOS-S2S-2) to model three past ENSO events: The strong 2015 El Niño, the 2017 La Niña and the weak 2018 El Niño. Pulling from NASA's Soil Moisture Active Passive (SMAP) mission, the past NASA-CONAE (Argentinian Space Agency) Aquarius mission and the European Space Agency's Soil Moisture Ocean Salinity (SMOS) mission, they compared the forecast model's accuracy for each of the three events with and without assimilating SSS data into the models' initialization. In other words: One model run's initial conditions included SSS data, and the other did not. Adding assimilation of SSS data to the GEOS model helped it to depict the depth and density of the ocean's top layer more accurately, which led to better representations of large-scale circulation in response to ENSO. As a result, the models' predictions for the three case studies more closely reflected actual observations, compared to what forecasting models predicted at the time. "In our three case studies, we examined different phases of ENSO," said Eric Hackert, a research scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland and the study's lead author. "For the big El Niño in 2015, assimilating the salinity data damped the signal -- our original model was overestimating the amplitude of the event. For the other two ENSO events, the forecasts originally predicted the wrong sign: For example, in 2017, the model without salinity data forecasted an El Niño, while the real ocean produced a La Niña. However, for each case we examined, adding satellite salinity to the initialization improved the forecasts." The study is one of the first to incorporate SSS data into forecast initialization for a global coupled model of interactions between the ocean, atmosphere, land, aerosols and sea ice. GEOS and other models used to help predict ENSO events do not typically include SSS. However, ocean surface salinity plays an important role in ocean currents, evaporation and interaction with the atmosphere, and heat transfer from the tropics to the poles. Colder, saltier water is denser and heavier than warmer, fresher water, and the large-scale temperature and precipitation shifts of ENSO events change ocean circulation and interactions between the water and atmosphere. Both phases of the ENSO cycle affect ecosystems, economies, human health, and wildfire risk -- making ENSO forecasts vital for many people around the world, Hackert said. "For example, forecasts and observations gave a strong indication that there would be a big El Niño in 1997, which would lead to drought in northeast Brazil," he said. "This allowed the government of Brazil to issue a statement to subsistence farmers, encouraging them to plant drought-resistant corn instead of high-yield varieties. 
In this case, good ENSO forecasts along with government action may have saved many lives. This is just one example of many socio-economic benefits for extending useful El Niño predictions." Including satellite SSS data also makes models useful for longer periods -- accurate ENSO forecasts without salinity data only extend out 4 months, while those with SSS data cover 7 months, Hackert said. "Rather than having one season of confidence in your forecast, you have two seasons," Hackert said. "If your growing season is six months down the line, a longer quality forecast gives you an improved understanding of whether you need to plant high-yield or drought-resistant varieties. Another example would be that you have plenty of time to fix your roof if you live in Southern California (since El Niño typically brings rainy conditions to the southern US)." Having access to an ongoing record of satellite SSS data is essential for making forecasts accurate and reliable, Hackert said. "In current forecast systems, satellite and ocean observations are optimally combined using models and data assimilation techniques to help define the state of the ocean," he said. "This study shows that adding satellite SSS to the suite of current observations helps to characterize the near-surface ocean state, leading to improved seasonal forecasts. We recommend that other forecast model systems around the world adopt SSS into their systems."
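A hedged sketch of the kind of comparison described above: score two forecast runs of a Niño-index-like series against observations, one initialized with sea surface salinity and one without. The numbers are invented and are not GEOS-S2S-2 output.

```python
# Toy comparison of forecast skill with and without SSS-informed initial
# conditions. All values are invented for illustration.
import numpy as np

observed        = np.array([0.2, 0.5, 0.9, 1.4, 1.8, 2.0, 2.1])   # e.g. a Niño3.4-like anomaly (°C)
forecast_no_sss = np.array([0.3, 0.8, 1.5, 2.2, 2.8, 3.1, 3.3])   # overshoots the event
forecast_sss    = np.array([0.2, 0.6, 1.0, 1.5, 1.9, 2.1, 2.2])   # damped, closer to observations

def rmse(forecast, obs):
    """Root-mean-square error of a forecast against observations."""
    return float(np.sqrt(np.mean((forecast - obs) ** 2)))

print("RMSE without SSS:", round(rmse(forecast_no_sss, observed), 2))
print("RMSE with SSS:   ", round(rmse(forecast_sss, observed), 2))
# A lower RMSE for the SSS-initialized run illustrates the kind of improvement
# the study reports when salinity is assimilated into the initial conditions.
```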
|
Global Warming
| 2,020 |
April 7, 2020
|
https://www.sciencedaily.com/releases/2020/04/200407101801.htm
|
Climate change triggers Great Barrier Reef bleaching
|
Australia's iconic Great Barrier Reef is experiencing its third coral bleaching event in just five years. The 2020 bleaching is severe, and more widespread than earlier events.
|
"We surveyed 1,036 reefs from the air during the last two weeks in March, to measure the extent and severity of coral bleaching throughout the Barrier Reef region," said Professor Terry Hughes, Director of the ARC Centre of Excellence for Coral Reef Studies at James Cook University."For the first time, severe bleaching has struck all three regions of the Great Barrier Reef -- the northern, central and now large parts of the southern sectors," Prof Hughes said.Coral bleaching at regional scales is caused by thermal stress due to spikes in sea temperatures during unusually hot summers. The first recorded mass bleaching event along the Great Barrier Reef occurred in 1998, then the hottest year on record. Four more mass bleaching events have occurred since -- as more temperature records were broken -- in 2002, 2016, 2017, and now in 2020.This year, February had the highest monthly temperatures ever recorded on the Great Barrier Reef since the Bureau of Meteorology's sea surface temperature records began in 1900."Bleaching isn't necessarily fatal, and it affects some species more than others," said Professor Morgan Pratchett, also from Coral CoE at JCU, who led underwater surveys to assess the bleaching."A pale or lightly bleached coral typically regains its colour within a few weeks or months and survives," he said.However, many corals die when bleaching is severe. In 2016, more than half of the shallow-water corals died on the northern region of the Great Barrier Reef."We will go back underwater later this year to assess the losses of corals from this most recent event," Prof Pratchett said."The north was the worst affected region in 2016, followed by the central region in 2017. In 2020, the cumulative footprint of bleaching has expanded further to include the south."The distinctive footprint of each bleaching event closely matches the location of hotter and cooler conditions in different years."As summers grow hotter and hotter, we no longer need an El Niño event to trigger mass bleaching at the scale of the Great Barrier Reef," Prof Hughes said."Of the five events we have seen so far, only 1998 and 2016 occurred during El Niño conditions."The gap between recurrent bleaching events is shrinking, hindering a full recovery."We have already seen the first example of back-to-back bleaching -- in the consecutive summers of 2016 and 2017," Prof Hughes said.After five bleaching events, the number of reefs that have so far escaped severe bleaching continues to dwindle. Those reefs are located offshore, in the far north, and in remote parts of the south.
|
Global Warming
| 2,020 |
April 6, 2020
|
https://www.sciencedaily.com/releases/2020/04/200406190412.htm
|
Indigenous knowledge could reveal ways to weather climate change on islands
|
Some islands have such low elevation that mere inches of sea-level rise will flood them, but higher, larger islands will also be affected by changes in climate, and an understanding of ancient practices in times of climate change might help populations survive, according to researchers.
|
"I'm working in a place (Madagascar) where communities around me are sensing, in the span of a few years, that they are seeing climate change," said Kristina Douglass, assistant professor of anthropology, Penn State. "They have seen climate events take out entire reefs."Douglass is interested in how the archaeological record can weigh in on climate change. She wants to understand how communities adapted in the past and how historical events have increased vulnerability. She and Jago Cooper, curator of the Americas, British Museum, investigated the Caribbean Islands and the islands in the Southwestern Indian Ocean off the east coast of Africa from Kenya to Mozambique."If we look back we see that all the communities have been displaced into marginal land," said Douglass, who is also an associate of Penn State Institutes of Energy and the Environment. "If they don't see this, they won't be able to find a solution. They have to consider that around the Caribbean and off of Africa there are historical factors that contribute to the problem."Both sets of islands have different histories. Indigenous Native American groups originally settled the Caribbean islands around 6,000 years ago, while continental Africans settled most of the Southwestern Indian Ocean islands (SWIO) only 2,000 years ago. Both groups of islands became the target of colonization in the last 1,000 years and both originating populations suffered marginalization. In the Caribbean, introduced diseases decimated the native population which was replaced by colonists and African slaves. Slavery played an important role in both locations.One of the many problems of colonization was the push in both locations to move from nomadic to stationary lives. The ideal living situation was considered a permanent location with set fields, pastures or fishing areas. Neither group of islanders were stationary before colonization.According to the researchers, in the Caribbean, in the past, when sea level was rising, the population would notice their coastal sources of fresh water becoming salty and they would then leave coastal areas and move to more inland, higher ground. This prevented storm surge from sweeping away anyone because the people were no longer living in the flood zone."For some islands, archaeological and paleoecological research offer an important record of pre-colonial climate change and its interplay with human lives and landscapes," the researchers report today (Apr. 6) in the The SWIO islands are in the tropics and rainfall varies depending on ocean warming and the El Niño/Southern Oscillation. Coupled with the legacies of colonialism, varying precipitation regimes can bring on food insecurity in southern Madagascar. As recently as 2016, insufficient rainfall caused a catastrophic famine due to crop failures."Being nomadic is a way to deal with highly unreliable climate," said Douglass. "But encouraging sedentary lifestyles made it easier to manage local people."In the past, the prickly pear cactus, introduced from the Americas, served as cattle fodder; a source of water for cattle, people and other plants; and as a defensive barrier for intruders. The Malagasy pastoralists took the non-native plant and adapted it to protect against the vagaries of climate. However, according to the researchers, in the 1930s, French colonists, in an effort to civilize the south, released parasitic cochineal larvae that destroyed the cactus barriers and their water reservoir. 
This effort to force the people to cultivate cash crops, use irrigation and improve grasslands led to widespread famine during ensuing droughts. Although 1930s farming practices might not be considered modern today, the push for modernization does not always come from outsiders. "There is a globalizing influence shaping people to the ideal of what seems to be modern," said Douglass. Pollution, consumption and waste are real problems on all the islands. For example, islanders resistant to "old fashioned" ideas choose disposable diapers rather than cloth ones, even though there is little space for diaper disposal on an island, said Douglass. Tourism, a major source of income on many islands, also brings increased waste disposal pressures and environmental degradation. According to Douglass, while traditional housing was usually quickly and cheaply built and rebuilt after storms, modern housing forms are far more expensive and labor-intensive to replace. "The desire to be modern, the elite status connected to things from overseas is real," said Douglass. "We need ways to shape views on what is a good house." Housing, agricultural, grazing and fishing practices that are adaptable to the changing climate can be informed by both the archaeological and historic past, but much of that knowledge disappears when people and languages disappear, she added.
|
Global Warming
| 2,020 |
April 6, 2020
|
https://www.sciencedaily.com/releases/2020/04/200406112530.htm
|
Upper ocean water masses shrinking in changing climate: Less efficient CO2 sink
|
We're familiar with how climate change is impacting the ocean's biology, from bleaching events that cause coral die-offs to algae blooms that choke coastal marine ecosystems, but it's becoming clear that a warming planet is also impacting the physics of ocean circulation.
|
A team of scientists from the University of British Columbia, the Bermuda Institute of Ocean Sciences (BIOS), the French Institute for Ocean Science at the University of Brest, and the University of Southampton recently published the results of an analysis of North Atlantic Ocean water masses. "The oceans play a vital role in buffering the Earth from climate change by absorbing carbon dioxide and heat at the surface and transporting it in the deep ocean, where it is trapped for long periods," said Sam Stevens, doctoral candidate at the University of British Columbia and lead author on the study. "Studying changes in the structure of the world's oceans can provide us with vital insight into this process and how the ocean is responding to climate change." One particular layer in the North Atlantic Ocean, a water mass called the North Atlantic Subtropical Mode Water (or STMW), is very efficient at drawing carbon dioxide out of the atmosphere. It represents around 20% of the entire carbon dioxide uptake in the mid-latitude North Atlantic and is an important reservoir of nutrients for phytoplankton -- the base of the marine food chain -- at the surface of the ocean. Using data from two of the world's longest-running open-ocean research programs -- the Bermuda Atlantic Time-series Study (BATS) Program and Hydrostation 'S' -- the team found that as much as 93% of STMW has been lost in the past decade. This loss is coupled with a significant warming of the STMW (0.5 to 0.71 degrees Celsius or 0.9 to 1.3 degrees Fahrenheit), culminating in the weakest, warmest STMW layer ever recorded. "Although some STMW loss is expected due to the prevailing atmospheric conditions of the past decade, these conditions do not explain the magnitude of loss that we have recorded," said Professor Nick Bates, BIOS senior scientist and principal investigator of the BATS Program. "We find that the loss is correlated with different climate change indicators, such as increased surface ocean heat content, suggesting that ocean warming may have played a role in the reduced STMW formation of the past decade." These findings outline a worrying relationship where ocean warming is restricting STMW formation and changing the anatomy of the North Atlantic, making it a less efficient sink for heat and carbon dioxide. "This is a good example of how human activities are impacting natural cycles in the ocean," said Stevens, who was previously a BATS research technician from 2014 through 2017 before beginning his doctoral work, which leverages the work he did with BATS/BIOS.
|
Global Warming
| 2,020 |
April 6, 2020
|
https://www.sciencedaily.com/releases/2020/04/200406112528.htm
|
Climate change to affect fish sizes and complex food webs
|
Global climate change will affect fish sizes in unpredictable ways and, consequently, impact complex food webs in our oceans, a new IMAS-led study has shown.
|
The study was led by IMAS and Centre for Marine Socioecology scientist Dr Asta Audzijonyte. Dr Audzijonyte said the study confirmed that changes in water temperature were responsible for driving changes in average sizes of fish species across time and spatial scales. "Cold-blooded animals, especially fish, have long been noted to grow to a smaller size when raised in warmer temperatures in an aquarium," Dr Audzijonyte said. "If fish grow to smaller sizes in warmer aquaria, it is only natural to expect that global warming will also lead to shrinkage of adult fish size." "However, average fish body size in wild populations is affected by growth, mortality, recruitment as well as interactions with other organisms and their environment simultaneously and it is unclear how all of these factors are affected by temperature." The researchers were surprised to find that while temperature has a significant impact, it caused different fish species to react differently. In some the average fish body size got smaller as predicted (around 55% of species) but in others it increased (around 45%). In general -- but not universally -- larger species tended to get even bigger in warmer waters, while smaller species tended to get smaller. Tropical species were more likely to be smaller at the warm end of their distribution ranges. Most importantly, the species that were smaller at the warmer edges of their habitat ranges were also more likely to get on average smaller with global warming. "At Tasmanian survey locations, where some of the fastest rates of warming were observed, up to 66% of species showed clear changes in body size." "As well as happening quite quickly, some of the size changes can also be surprisingly large." "For example, the change in a median-length temperate fish corresponds to around 12% of its body mass for each 1°C of warming." "At the current rate of warming, in 40 years this would result in around a 40% change in body length, either increasing or decreasing depending on the species," she said. Dr Audzijonyte said the varying responses of species to warming would have implications for food webs and ecosystems, including their stability and resilience to other external stressors, such as fishing, coastal pollution and a range of different climate change impacts. The study was made possible through collaboration between University of Tasmania scientists and government managers across Australia, and by the efforts of over 100 volunteer Reef Life Survey divers, who have undertaken regular surveys at over 1000 sites around the continent.
|
Global Warming
| 2,020 |
April 3, 2020
|
https://www.sciencedaily.com/releases/2020/04/200403162713.htm
|
Deep-sea worms and bacteria team up to harvest methane
|
Scientists at Caltech and Occidental College have discovered a methane-fueled symbiosis between worms and bacteria at the bottom of the sea, shedding new light on the ecology of deep-sea environments.
|
They found that bacteria belonging to the Methylococcaceae family have been hitching a ride on the feathery plumes that act as the respiratory organs of Laminatubus and Bispira worms. Methylococcaceae are methanotrophs, meaning that they harvest carbon and energy from methane, a molecule composed of carbon and hydrogen. The worms, which are a few inches long, have been found in great numbers near deep-sea methane seeps, vents in the ocean floor where hydrocarbon-rich fluids ooze out into the ocean, although it was unclear why the worms favored the vents. As it turns out, the worms slowly digest the hitchhiking bacteria and thus absorb the carbon and energy that the bacteria harvest from the methane. That is to say, with a little help and some extra steps, the worms have become methanotrophs themselves. "These worms have long been associated with seeps, but everyone just assumed they were filter-feeding on bacteria. Instead, we find that they are teaming up with a microbe to use chemical energy to feed in a way we hadn't considered," says Victoria Orphan, James Irvine Professor of Environmental Science and Geobiology and co-corresponding author of a paper on the worms. Orphan and her colleagues made the discovery during research cruises to study methane vents off the coast of Southern California and Costa Rica. "We had a colleague on board who was an expert on these worms and noticed that the morphology was unusual. The respiratory plumes were much frillier than anyone had ever seen before, which was the first clue. It was enough to make us say, 'That's interesting. We should investigate,'" says Shana Goffredi, visitor in geobiology at Caltech and lead author of the paper. To probe the nature of the relationship between the worms and the bacteria, the scientists had to first use robotic submarines to take samples from deep-sea methane vents, which, in this case, lie 1,800 meters below the ocean surface. Once the worms were brought topside, the scientists analyzed their tissues, cataloging the carbon isotopes that they had consumed. Carbon exists in two stable isotopic forms -- different "flavors" of carbon, so to speak. Around 99 percent of all carbon is carbon-12, which has six neutrons and six protons in each atomic nucleus, and about 1 percent is carbon-13 (six protons and seven neutrons). Carbon-14, a radioactive isotope, exists in trace amounts. All organisms require carbon -- in some form -- to survive, and they absorb it through metabolic processes. Studying the ratio of carbon-13 to carbon-12 in an organism's tissues can give clues to where that carbon came from and the conditions under which it formed. In the case of the deep-sea worms, their tissues had an unusually low ratio of carbon-13 to carbon-12, meaning that the carbon in the worm's body probably came from methane. Orphan and her collaborators reasoned that because the worms are incapable of processing methane directly, they must be getting their carbon from methanotrophic bacteria. "The fact that we found this specific isotope of carbon throughout the worms' bodies and not just in their respiratory plumes indicates that they are consuming methane carbon from these bacteria," Orphan says. 
The research team followed up on this hypothesis by using molecular techniques and microscopy as well as experiments to test the ability of these worms to incorporate a modified, traceable version of methane. Their research findings change our understanding of seep ecosystems and have implications for deep-sea stewardship, as methane seeps and hydrothermal vents are sure to experience increasing pressure because of human exploitation of energy and minerals.
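The isotope reasoning above is usually expressed as a delta-13C value, the per-mil deviation of a sample's carbon-13/carbon-12 ratio from a reference standard. The sketch below shows that standard calculation; the tissue ratios are invented for illustration and are not measurements from this study.

```python
# Standard delta-13C calculation; the sample ratios below are invented examples.
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample, r_standard=R_VPDB):
    """delta-13C in per mil: more negative means less carbon-13 relative to the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical tissue ratios: methane-derived carbon is strongly depleted in
# carbon-13, so tissue built from it shows a much more negative delta-13C.
typical_marine_tissue = 0.01101   # roughly -20 per mil
methane_fed_tissue    = 0.01056   # roughly -60 per mil

for name, r in [("typical marine tissue", typical_marine_tissue),
                ("methane-fed tissue", methane_fed_tissue)]:
    print(f"{name}: delta-13C = {delta13C(r):.1f} per mil")
```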
|
Global Warming
| 2,020 |
April 3, 2020
|
https://www.sciencedaily.com/releases/2020/04/200403115119.htm
|
Changes to drylands with future climate change
|
A research team led by Washington State University has found that while drylands around the world will expand at an accelerated rate because of future climate change, their average productivity will likely be reduced.
|
"Our results highlight the vulnerability of drylands to more frequent and severe climate extremes," said Jingyu Yao, a research assistant in WSU's Department of Civil and Environmental Engineering and lead author on the paper. Using satellite data of vegetation productivity, measurements of carbon cycling from 13 sites and datasets from global models of future climate change, the researchers found that productivity of drylands will increase overall by about 12% by 2100 compared to a baseline from about 10 years ago. However, as drylands replace more productive ecosystems, overall global productivity may not increase. Furthermore, due to expected changes in precipitation and temperatures, the amount of productivity in any one dryland area will decrease. In addition, the researchers found that expansion among different types of drylands will lead to large changes in regional and subtype contributions to global dryland productivity. Drylands will experience substantial expansion and degradation in the future due to climate change, wildfire and human activities, including changes to their ecosystem structures as well as to their productivity, said Heping Liu, professor in the Department of Civil and Environmental Engineering and corresponding author on the paper. Because these regions are already water stressed, they are particularly sensitive to temperature or precipitation changes. Warming temperatures from climate change and more frequent and severe droughts threaten their biodiversity as well as their ability to take in and hold carbon. Especially in developing countries, the degradation of dryland ecosystems could have strong societal and economic impacts, said Yao. These changes have already started happening in the last few decades. In the U.S. Southwest, the introduction of invasive species has changed dryland regions from green to brown. Precipitation changes in Australia, which is composed almost entirely of drylands, have meant a drier continent with dramatic impacts, and Mongolia's grasslands have deteriorated because of warmer temperatures, less rainfall and overgrazing. While the drylands' productivity is important for supporting people, these areas also play a critically important role in annual carbon cycling. They help the planet breathe, absorbing carbon dioxide every spring as plants grow and then breathing it out in the fall as they become dormant. Because the growth of dryland ecosystems is very sensitive to changes in rainfall and temperature, drylands show the most impact of any ecosystem in year-to-year changes in the carbon cycle. Understanding their role in future carbon cycling can help researchers determine how to best preserve areas of high carbon uptake. "In our society, we are not paying much attention to what's going on with dryland regions," Liu said. "Given their importance in global carbon cycling and ecosystem services, a global action plan involving stringent management and sustainable utilization of drylands is urgently needed to protect the fragile ecosystems and prevent further desertification for climate change mitigation." The work was funded by the U.S. Department of Energy Office of Biological and Environmental Research as well as the National Natural Science Foundation of China.
|
Global Warming
| 2,020 |
April 3, 2020
|
https://www.sciencedaily.com/releases/2020/04/200403115109.htm
|
Northern peatlands will lose some of their CO2 sink capacity under a warmer climate
|
A Nordic study sheds new light on the role of northern peatlands in regulating the regional climate. According to the researchers, peatlands will remain carbon sinks until the end of this century, but their sink capacity will be substantially reduced after 2050, if the climate warms significantly.
|
Peatlands develop in waterlogged conditions which slow down plant decomposition rates, so that layers of dead plant material accumulate over many years as peat. They are a huge store of carbon drawn from the atmosphere. Despite only covering around 3% of the Earth's surface, peatlands contain roughly a fifth of its soil carbon. In Europe, these ecosystems store five times more CO2 than forests. A Nordic team of researchers used novel arctic modelling tools and previously published data on peatland carbon accumulation rates, vegetation and permafrost characteristics to study the role of northern peatlands in regulating the regional climate. A major concern is whether these ecosystems will continue to remain carbon sinks and help in mitigating climate change under changing climatic conditions. The model (LPJ-GUESS Peatland) used in this study captured the broad patterns of long-term peatland carbon dynamics at different spatial and temporal scales. The model successfully simulated reasonable vegetation patterns and permafrost extent across the pan-Arctic. Under contrasting warming scenarios (mild and severe), the study showed that peatlands on average continue to remain carbon sinks until the end of this century. However, their sink capacity would be substantially reduced after 2050 under the high-warming scenario due to an increase in soil mineralization rates. This modelling approach contributes to a better understanding of peatland dynamics and its role in the global climate system at different spatiotemporal scales. A major uncertainty of future predictions is the impact of formation of new peatlands with potential change in the peatland sink capacity owing to permafrost thawing and possible landscape changes. "With this study our aim is to highlight the importance of peatlands in the global carbon cycle. We adopted an advanced peatland modelling tool to address the issues pertaining to peatland carbon balance in the past and future climate conditions. Now, our plan is to take forward our current research on the role of peatlands in regulating the regional climate by coupling our state-of-the-art peatland model with global and regional climate models in order to quantify the peatland-mediated feedbacks," says Postdoctoral Fellow Nitin Chaudhary, the lead author of the study, from the University of Oslo. "Arctic carbon balance modelling studies working with coarse spatial resolution (half-grid scale) have often ignored the role of peatlands. This study emphasises the role of natural peatlands in the Arctic carbon balance and regional climate regulation. Such studies are needed so that their role is well defined in the global carbon models," University Researcher Narasinha Shurpali from the University of Eastern Finland says.
|
Global Warming
| 2,020 |
April 2, 2020
|
https://www.sciencedaily.com/releases/2020/04/200402144517.htm
|
Experiments lead to slip law for better forecasts of glacier speed, sea-level rise
|
Backed by experimental data from a laboratory machine that simulates the huge forces involved in glacier flow, glaciologists have written an equation that accounts for the motion of ice that rests on the soft, deformable ground underneath unusually fast-moving parts of ice sheets.
|
That equation -- or "slip law" -- is a tool that scientists can include in computer models of glacier movement over the deformable beds of mud, sand, pebbles, rocks and boulders under glaciers such as the West Antarctic Ice Sheet, said Neal Iverson, the project leader and a professor of geological and atmospheric sciences at Iowa State University. Models using the new slip law could better predict how quickly glaciers are sliding, how much ice they're sending to oceans and how that would affect sea-level rise.A paper published online today by the journal "The potential collapse of the West Antarctic Ice Sheet is the single largest source of uncertainty in estimations of future sea-level rise, and this uncertainty results, in part, from imperfectly modeled ice-sheet processes," Zoet and Iverson wrote in their paper.Iverson started experiments with the 9-foot-tall ring-shear device inside his laboratory's walk-in freezer in 2009. At the center of the device is a ring of ice about three feet across and eight inches thick. Below the ring is a hydraulic press that can put as much as 100 tons of force on the ice and simulate the weight of a glacier 800 feet thick. Above the ring are motors that can rotate the ice at speeds of 1 to 10,000 feet per year.The ice is surrounded by a tub of temperature-controlled, circulating fluid that keeps the ice ring right at its melting temperature so it slides on a thin film of water -- just like all fast-flowing glaciers.A $530,000 grant from the National Science Foundation supported development of the device. Iverson worked with three engineers from the U.S. Department of Energy's Ames Laboratory -- Terry Herrman, Dan Jones and Jerry Musselman -- to turn his ideas into a working machine.And it has worked for about a decade, providing data on how glaciers move over rigid rock and deformable sediment.For the experiments that led to the new slip law, Zoet drove from Ames to Madison to fill six, 5-gallon buckets with real, glacially deposited sediment called till that had the right mix of mud, sand and larger rock particles.He'd scoop that into the ring-shear device to make the till bed. He'd then construct an ice ring above it by freezing layers of water seeded with ice crystals. He'd apply force on the ice, heat it until it was melting and turn on the machine."We were after the mathematical relationship between the drag holding the ice back at the bottom of the glacier and how fast the glacier would slide," Iverson said. "That included studying the effect of the difference between ice pressure on the bed and water pressure in the pores of the till -- a variable called the effective pressure that controls friction."The data indicated the relationship between "drag, slip velocity and effective pressure that is needed to model glacier flow," Iverson said."Glacier ice is a highly viscous fluid that slips over a substrate -- in this case a deformable till bed -- and friction at the bed provides the drag that holds the ice back," Iverson said. "In the absence of friction, the weight of the ice would cause it to accelerate catastrophically like some landslides."But it's nearly impossible to get drag data in the field. Zoet said the act of drilling through the ice would change the interface between the glacier and bed, making measurements and data less accurate.So Iverson built his laboratory device to collect that data, and Zoet has built a slightly smaller version for his Wisconsin laboratory. 
Zoet's machine features a transparent sample chamber so researchers can see more of what's happening during an experiment. The resulting experimentally based slip law for glaciers moving over soft beds should make a difference in predictions of glacier movement and sea-level rise: "Ice sheet models using our new slip relationship," Iverson said, "would tend to predict higher ice discharges to the ocean -- and higher rates of sea-level rise -- than slip laws currently being used in most ice sheet models."
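For illustration only, a generic regularized-Coulomb-style slip relation of the kind described -- basal drag depending on slip velocity and effective pressure -- can be sketched as below. This is an assumed functional form with invented parameter values, not the fitted law from the Zoet-Iverson experiments.

```python
# Illustrative sketch: an assumed regularized-Coulomb-style slip relation.
# Not the fitted law from the experiments; form and parameters are assumptions.

def basal_drag(slip_velocity, effective_pressure, c=0.5, u_t=100.0, p=5.0):
    """
    Basal drag (same units as effective_pressure).
    slip_velocity      -- glacier sliding speed, m/yr
    effective_pressure -- ice pressure minus pore-water pressure, kPa
    c                  -- dimensionless friction-like coefficient (assumed)
    u_t                -- threshold velocity controlling the transition, m/yr (assumed)
    p                  -- exponent controlling how quickly drag saturates (assumed)
    """
    return c * effective_pressure * (slip_velocity / (slip_velocity + u_t)) ** (1.0 / p)

# Drag rises with slip velocity but saturates toward a limit set by the effective
# pressure -- so higher pore-water pressure (lower effective pressure) means less
# drag holding the ice back.
for u in (10, 100, 1000):
    print(f"u = {u:5d} m/yr -> drag = {basal_drag(u, effective_pressure=200.0):.1f} kPa")
```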
|
Global Warming
| 2,020 |
April 2, 2020
|
https://www.sciencedaily.com/releases/2020/04/200402144422.htm
|
Smaller scale solutions needed for rapid progress towards emissions targets
|
Low-carbon technologies that are smaller scale, more affordable, and can be mass deployed are more likely to enable a faster transition to net-zero emissions, according to a new study by an international team of researchers.
|
Innovations ranging from solar panels to electric bikes also have lower investment risks, greater potential for improvement in both cost and performance, and more scope for reducing energy demand -- key attributes that will help accelerate progress on decarbonisation. In order to meet international climate targets, emissions of greenhouse gases need to halve within the next decade and reach net-zero around mid-century. To do this will require an unprecedented and rapid transformation in the way energy is supplied, distributed and used. Researchers from the Tyndall Centre for Climate Change Research at the University of East Anglia (UEA), the International Institute for Applied Systems Analysis (IIASA) in Austria, and the University Institute of Lisbon, collected data on a wide variety of energy technologies at different scales and then tested how well they performed against nine characteristics of accelerated low-carbon transformation, such as cost, innovation and accessibility. They then asked: is it better to prioritise large-scale, costly, non-divisible or "lumpy" technologies, such as nuclear power, carbon capture and storage, high-speed transit systems, and whole-building retrofits? Or is it better to focus on more "granular" options, which are smaller in size, lower in cost, and more modular so they scale not by becoming bigger but by replicating? Examples of these more granular technologies include solar panels, electricity storage batteries, heat pumps, smart thermostats, electric bikes, and shared taxis or 'taxi-buses'. Publishing their findings, lead researcher Dr Charlie Wilson, at UEA, said: "A rapid proliferation of low-carbon innovations distributed throughout our energy system, cities, and homes can help drive faster and fairer progress towards climate targets." "We find that big new infrastructure costing billions is not the best way to accelerate decarbonisation. Governments, firms, investors, and citizens should instead prioritise smaller-scale solutions which deploy faster. This means directing funding, policies, incentives, and opportunities for experimentation away from the few big and towards the many small." As well as being quick to deploy, smaller scale technologies have shorter lifespans and are less complex so innovations and improvements can be brought to market more rapidly. They are also more widely accessible and help create more jobs, giving governments a sound basis for strengthening climate policies. Co-author Prof Arnulf Grubler, at IIASA, said: "Large 'silver bullet' technologies like nuclear power or carbon capture and storage are politically seductive. But larger scale technologies and infrastructures absorb large shares of available public resources without delivering the rapid decarbonisation we need." The researchers emphasise that smaller scale technologies are not a universal solution. 
In some situations, there are no like-for-like alternatives to large-scale technologies and infrastructure such as aircraft flying long-haul or industrial plants producing iron, steel, and cement. In other situations, large numbers of smaller scale technologies need to integrate within existing infrastructure: widespread deployment of heat pumps and solar panels needs electricity networks, electric vehicles need charging stations, and insulation products need buildings. "Smaller scale innovations are not a panacea," added co-author Dr Nuno Bento, of the University Institute of Lisbon, "but in many different contexts they outperform larger-scale alternatives as a means of accelerating low-carbon transformation to meet global climate targets."
|
Global Warming
| 2,020 |
April 2, 2020
|
https://www.sciencedaily.com/releases/2020/04/200402100834.htm
|
Climate disasters increase risks of armed conflicts: New evidence
|
The risk for violent clashes increases after weather extremes such as droughts or floods hit people in vulnerable countries, an international team of scientists finds. Vulnerable countries are characterized by a large population, political exclusion of particular ethnic groups, and low development. The study combines global statistical analysis, observation data and regional case study assessments to yield new evidence for policy-makers.
|
"Climate disasters can fuel some smoldering conflicts -- this is a worrying insight since such disasters are on the rise," says Jonathan Donges at the Potsdam Institute for Climate Impact Research, Germany, co-author of the paper now published in The numbers are quite staggering. "We find that almost one third of all conflict onsets in vulnerable countries over the recent decade have been preceded by a climate-related disaster within 7 days," says co-author Carl-Friedrich Schleussner from Climate Analytics in Berlin, Germany. "This does, however, not mean that disasters cause conflicts, but rather that disaster occurrence increases the risks of a conflict outbreak." After all, conflict is human-made. The analysis of concrete cases of disaster-conflict co-occurences shows that most such cases are not mere coincidences, but likely linked by causal mechanisms -- this is one of the key new findings.In Mali for instance a severe drought occurred in 2009 after which the militant Al-Qaeda in the Islamic Maghreb exploited the resulting state weakness and desperation of local people to recruit fighters and expand its area of operation. Other examples analyzed include China, the Philippines, Nigeria, and Turkey. India turns out to be the country with the by far highest number of disaster-conflict coincidences. The most surprising result of the study, says co-author Michael Brzoska from the University of Hamburg, was the prevalence of opportunities for armed violence over those related to grievances in post-disaster situations."Climate-related disasters may act like a 'threat multiplier' for violent conflicts," explains Tobias Ide from the University of Melbourne. A most important finding of the study is that only countries with large populations, the political exclusion of ethnic groups and relatively low levels of economic development are susceptible to disaster-conflict links. Optimistically, Ide concludes: "Measures to make societies more inclusive and wealthier are, therefore, no-regrets options to increase security in a warming world."
|
Global Warming
| 2,020 |
April 2, 2020
|
https://www.sciencedaily.com/releases/2020/04/200402100837.htm
|
Six decades of change in plankton communities
|
The UK's plankton population -- microscopic algae and animals which support the entire marine food web -- has undergone sweeping changes in the past six decades, according to new research published in
|
Involving leading marine scientists from across the UK, led by the University of Plymouth, the research for the first time combines the findings of UK offshore surveys such as the Continuous Plankton Recorder (CPR) and UK inshore long-term time-series.It then maps those observations against recorded changes in sea surface temperature, to demonstrate the effect of our changing climate on these highly sensitive marine communities.The study's authors say their findings provide further evidence that increasing direct human pressures on the marine environment -- coupled with climate-driven changes -- are perturbing marine ecosystems globally.They also say it is crucial to helping understand broader changes across UK waters, since any shifts in plankton communities have the potential for negative consequences for the marine ecosystem and the services it provides.Since plankton are the very base of the marine food web, changes in the plankton are likely to result in changes to commercial fish stocks, sea birds, and even the ocean's ability to provide the oxygen we breathe.The analyses of plankton functional groups showed profound long-term changes, which were coherent across large geographical areas right around the UK coastline.For example, the 1998-2017 decadal average abundance of meroplankton, a group of animal plankton, which includes lobsters and crabs and which spend their adult lives on the seafloor, was 2.3 times that for 1958-1967 when comparing CPR samples in the North Sea, at a time of increasing sea surface temperatures.This contrasted with a general decrease in plankton which spend their whole lives in the water column, while other offshore species noticed population decreases of around 75%.The study was led by former postdoctoral researcher Dr Jacob Bedford and Dr Abigail McQuatters-Gollop, from the University of Plymouth's Marine Conservation Research Group. It also involved scientists from The Marine Biological Association, Plymouth Marine Laboratory, The Environment Agency, Marine Scotland Science, Centre for Environment Fisheries and Aquaculture Science (Cefas), Agri-Food & Biosciences Institute of Northern Ireland, and the Scottish Association for Marine Science.Dr McQuatters-Gollop, the lead scientist for pelagic habitats policy for the UK, said: "Plankton are the base of the entire marine food web. But our work is showing that climate change has caused plankton around UK waters to experience a significant reorganisation. These changes in the plankton suggest alterations to the entire marine ecosystem and have consequences for marine biodiversity, climate change (carbon cycling) and food webs including commercial fisheries."Dr Clare Ostle, of the Marine Biological Association's Continuous Plankton Recorder (CPR) Survey, said: "Changes in plankton communities not only affect many levels of marine ecosystems but also the people that depend on them, notably through the effects on commercial fish stocks. This research is a great example of how different datasets -- including CPR data -- can be brought together to investigate long-term changes in important plankton groups with increasing temperature. These kind of collaborative studies are important for guiding policy and assessments of our changing environment."Report co-author Professor Paul Tett, from the Scottish Association for Marine Science (SAMS) in Oban, added: "In this paper, we have tried to turn decades of speculation into evidence. 
It has long been thought that warming seas impact on plankton, the most important organisms in the marine food web. By bringing together such a large, long-term dataset from around the UK for the first time, we have discovered that the picture is a complex one. We therefore need to build on the success of this collaboration by further supporting the Continuous Plankton Recorder and the inshore plankton observatories."
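For readers who want to reproduce the flavour of the period comparisons quoted above (for example, recent meroplankton abundance being 2.3 times the 1958-1967 level), the sketch below computes a period-average abundance ratio from a monthly time series. The data here are synthetic, not CPR observations.

```python
# Illustrative only: ratio of average abundance in a recent period vs. an early one.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.date_range("1958-01-01", "2017-12-01", freq="MS")
# synthetic abundance with a slow upward trend plus noise
abundance = 10 + 0.002 * np.arange(len(dates)) + rng.normal(0, 2, len(dates))
series = pd.Series(abundance, index=dates)

early = series["1958":"1967"].mean()
recent = series["1998":"2017"].mean()
print(f"recent/early abundance ratio: {recent / early:.2f}")
```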
|
Global Warming
| 2,020 |
April 2, 2020
|
https://www.sciencedaily.com/releases/2020/04/200402091451.htm
|
Researchers forecast longer, more extreme wildfire seasons
|
In California, a changing climate has made autumn feel more like summer, with hotter, drier weather that increases the risk of longer, more dangerous wildfire seasons, according to a new Stanford-led study.
|
The paper, published in "Many factors influence wildfire risk, but this study shows that long-term warming, coupled with decreasing autumn precipitation, is already increasing the odds of the kinds of extreme fire weather conditions that have proved so destructive in both northern and southern California in recent years," said study senior author Noah Diffenbaugh, the Kara J Foundation professor at Stanford's School of Earth, Energy & Environmental Sciences.Since the early 1980s, the frequency of autumn days with extreme fire weather conditions has more than doubled in California. Rainfall during the season has dropped off by about 30 percent, while average temperatures have increased by more than 2 degrees Fahrenheit or more than 1 degree Celsius. The most pronounced warming has occurred in the late summer and early autumn, resulting in tinder-dry conditions in forests and grasslands to coincide with the strong, dry "Diablo" and "Santa Ana" winds that typically occur during the autumn in northern and southern California.These conditions have fed large, fast-spreading wildfires across California in recent years. The region's single deadliest wildfire, two largest wildfires, and two most destructive wildfires all occurred during 2017 and 2018, killing more than 150 people and causing more than $50 billion in damage.The paper includes analysis of the conditions surrounding the November 2018 Camp Fire in the Northern Sierra Nevada foothills and the Woolsey Fire around the same time near Los Angeles. In both cases, seasonal strong winds conspired with landscapes dried out following the state's hottest summer on record, stretching limited emergency response resources across the state.Historical weather observations from thermometers and rain gauges showed that the risk of extreme wildfire conditions during autumn has more than doubled across California over the past four decades. Using a large suite of climate model simulations archived by government research centers around the world, the authors revealed evidence that human-caused global warming has made the observed increases in these meteorological preconditions more likely."Autumn is of particular concern since warmer, drier conditions may coincide with the strong offshore wind events which tend to occur in the September to November period," said Michael Goss, the study's lead author and a postdoctoral scholar in Diffenbaugh's Climate and Earth System Dynamics Group.The authors emphasize that there are a number of opportunities for managing the intensifying risk of wildfires in California and other regions. They show that the reduced emissions target identified in the United Nations' Paris agreement would likely slow the increase in wildfire risk. However, even with those reductions, much of California is still likely to experience rising risk of extreme wildfire weather in the future."It's striking just how strong of an influence climate change has already had on extreme fire weather conditions throughout the state," said study coauthor Daniel Swain, a research fellow at UCLA, the National Center for Atmospheric Research and The Nature Conservancy, and a former PhD student with Diffenbaugh at Stanford. "It represents yet another piece of evidence that climate change is already having a discernable influence on day-to-day life in California."The findings come at a time when California's firefighters are facing significant pressures. 
Because firefighting resources and funding have been traditionally concentrated during the peak summertime fire season, the recent spate of autumn fires burning in both northern and southern California has put particular strain on the response. The ongoing COVID-19 pandemic could further strain emergency resources, including impeding efforts to prepare for the upcoming summer and autumn seasons that are likely to be intensified by low spring snowpack and a dry winter in northern California.The consequences are not restricted to California. In particular, fire-prone regions have historically shared wildfire-fighting resources throughout the year, including movement of people and equipment between the northern and southern hemispheres between the respective summer seasons. Recent autumn wildfires in California have coincided with the onset of wildfires in Australia, creating strain on limited global resources.The authors emphasize that there are many steps California and other regions can take to increase resilience to the rising risks of wildfire. In addition to curbing the trajectory of global warming, risk management options include prescribed burning to reduce fuel loads and improve ecosystem health, upgrades to emergency communications and response systems, community-level development of protective fire breaks and defensible space, and the adoption of new zoning rules and building codes to promote fire resilient construction, according to the researchers.
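The "extreme fire weather day" counts cited above come from a purpose-built index; the sketch below shows only the general idea of flagging hot, dry autumn days against fixed thresholds and comparing decades. The thresholds and weather series are hypothetical, not the study's definition.

```python
# Toy threshold counting of hot, dry autumn days; thresholds and data are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("1980-01-01", "2019-12-31", freq="D")
temp_c = (20 + 10 * np.sin(2 * np.pi * (days.dayofyear - 150) / 365)
          + 0.03 * (days.year - 1980) + rng.normal(0, 3, len(days)))
rel_hum = np.clip(50 - 0.1 * (days.year - 1980) + rng.normal(0, 15, len(days)), 5, 100)

df = pd.DataFrame({"temp_c": temp_c, "rel_hum": rel_hum}, index=days)
# autumn (Sep-Nov) days that are both hot and dry, by hypothetical thresholds
is_extreme = (df.temp_c > 30) & (df.rel_hum < 20) & df.index.month.isin([9, 10, 11])
per_year = is_extreme.groupby(df.index.year).sum()

print("1980s mean extreme autumn days/yr:", per_year.loc[1980:1989].mean().round(1))
print("2010s mean extreme autumn days/yr:", per_year.loc[2010:2019].mean().round(1))
```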
|
Global Warming
| 2,020 |
April 1, 2020
|
https://www.sciencedaily.com/releases/2020/04/200401150817.htm
|
Almond orchard recycling a climate-smart strategy
|
Recycling trees onsite can sequester carbon, save water and increase crop yields, making it a climate-smart practice for California's irrigated almond orchards, finds a study from the University of California, Davis.
|
Whole orchard recycling is when old orchard trees are ground, chipped and turned back into the soil before new almond trees are planted. "To me what was really impressive was the water piece," said corresponding author Amélie Gaudin, an associate professor of agroecology in the UC Davis Department of Plant Sciences. "Water is central to how we think about agriculture in California. This is a clear example of capitalizing on soil health. Here we see some real benefits for water conservation and for growers." Drought and high almond prices have encouraged higher rates of orchard turnover in recent years. The previous practice of burning trees that are no longer productive is now restricted under air quality regulations, so whole orchard recycling presents an alternative. But how sustainable and effective is it for the environment and for farmers? For the study, scientists measured soil health and tree productivity of an almond orchard that turned previous Prunus woody biomass back into the soil through whole orchard recycling and compared it with an orchard that burned its old trees nine years prior. They also experimentally reduced an orchard's irrigation by 20 percent to quantify its water resilience. Their results found that, compared with burn treatments, whole orchard recycling can sequester carbon, save water and increase crop yields. "This seems to be a practice that can mitigate climate change by building the soil's potential to be a carbon sink, while also building nutrients and water retention," said Gaudin. "That can be especially important as water becomes more limited."
|
Global Warming
| 2,020 |
March 31, 2021
|
https://www.sciencedaily.com/releases/2021/03/210331085648.htm
|
Floating gardens as a way to keep farming despite climate change
|
Bangladesh's floating gardens, built to grow food during flood seasons, could offer a sustainable solution for parts of the world prone to flooding because of climate change, a new study has found.
|
The study, published recently in the "We are focused here on adaptive change for people who are victims of climate change, but who did not cause climate change," said Craig Jenkins, a co-author of the study and academy professor emeritus of sociology at The Ohio State University. "There's no ambiguity about it: Bangladesh didn't cause the carbon problem, and yet it is already experiencing the effects of climate change."Bangladesh's floating gardens began hundreds of years ago. The gardens are made from native plants that float in the rivers -- traditionally, water hyacinths -- and operate almost like rafts, rising and falling with the waters. Historically, they were used to continue growing food during rainy seasons when rivers filled with water.The farmers -- or their families -- layer the plants about three feet deep, creating a version of raised-bed gardens that float in the water. Then, they plant vegetables inside those rafts. As the raft-plants decompose, they release nutrients, which help feed the vegetable plants. Those vegetable plants typically include okra, some gourds, spinach and eggplant. Sometimes, they also include spices like turmeric and ginger.Floating gardens are also in use in parts of Myanmar, Cambodia and India. The United Nations Food and Agricultural Organization has named Bangladesh's floating gardens a Globally Important Agricultural Heritage System.But as climate change has affected the volume of water in those rivers -- creating extreme highs and floods, along with extreme lows and droughts -- floating gardens have become a way for rural farmers to keep producing food during unpredictable weather. Climate change increases weather extremes and the severity of flooding, and droughts as well.The researchers wanted to understand whether Bangladesh's floating gardens could be a sustainable farming practice as climate change continues to cause floods and droughts, and to see whether the gardens bring better food security to individual households."They've got to be able to grow specific crops that can survive with minimal soil," said Jenkins, who is also a research scientist and former director of the Ohio State Mershon Center for International Security Studies. "And in Bangladesh, a lot of small farmers that had typically relied on rice crops are moving away from those because of the effects of climate change and better returns from alternative crops."For this study, the researchers interviewed farming families who use floating gardens, and found strong evidence that floating gardens provide stability, both in the amount of food available to feed rural populations and in a farming family's income, despite the instability created by a changing climate.They found that farmers typically use hybrid seeds, which must be repurchased each year, to grow a diverse range of vegetables in the floating gardens. The gardens are also susceptible to pests, so farmers end up spending some money on both pesticides and fertilizers. But even with those expenses, they found, benefits outweighed costs.Generally, entire families work on the gardens, the researchers found: Women, children and the elderly prepare seedlings and collect aquatic plants to build gardens. Men cultivate the gardens and protect them from raiders. Some families also farm fish in the waters around their floating gardens.One farmer told the research team that he earns up to four times as much money from the gardens as from traditional rice paddies.Still, the system could use improvements, the researchers found. 
Farmers often take out high-interest loans to cover the investment costs of building the beds and stocking them with plants. Lower-interest loans from responsible government or non-governmental organizations could alleviate that burden, they found.
|
Weather
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325115302.htm
|
The case of the cloudy filters: Solving the mystery of the degrading sunlight detectors
|
More than 150 years ago, the Sun blasted Earth with a massive cloud of hot charged particles. This plasma blob generated a magnetic storm on Earth that caused sparks to leap out of telegraph equipment and even started a few fires. Now called the Carrington Event, after one of the astronomers who observed it, a magnetic storm like this could happen again anytime, only now it would affect more than telegraphs: It could damage or cause outages in wireless phone networks, GPS systems, electrical grids powering life-saving medical equipment and more.
|
Sun-facing satellites monitor the Sun's ultraviolet (UV) light to give us advance warning of solar storms, both big ones that could cause a Carrington-like event as well as the smaller, more common disturbances that can temporarily disrupt communications. One key piece of equipment used in these detectors is a tiny metal filter that blocks out everything except the UV signal researchers need to see.But for decades, there has been a major problem: Over the course of just a year or two, these filters mysteriously lose their ability to transmit UV light, "clouding up" and forcing astronomers to launch expensive annual recalibration missions. These missions involve sending a freshly calibrated instrument into space to make its own independent observations of the sunlight for comparison.A leading theory has been that the filters were developing a layer of carbon, whose source is contaminants on the spacecraft, that blocked incoming UV light. Now, NIST scientists and collaborators from the Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado, have found the first evidence indicating that carbonization is not the problem, and it must be something else, such as another possible stowaway from Earth. The researchers describe their work in "To my knowledge, it's the first quantitative, really solid argument against carbonization as the cause of the filter degradation," said NIST physicist Charles Tarrio.Most of the light produced by the Sun is visible and includes the rainbow of colors from red (with a wavelength of around 750 nanometers) to violet (with a wavelength of about 400 nm). But the Sun also produces light with wavelengths too long or short for the human eye to see. One of these ranges is extreme ultraviolet (EUV), extending from 100 nm down to just 10 nm.Only about a tenth of a percent of sunlight is in the EUV range. That tiny EUV signal is extremely useful because it spikes in tandem with solar flares. These eruptions on the surface of the Sun can cause changes to Earth's upper atmosphere that disrupt communications or interfere with GPS readings, causing your phone to suddenly think you are 40 feet away from your true location.Satellites that measure EUV signals help scientists monitor these solar flares. But the EUV signals also give scientists a heads-up of hours or even days before more destructive phenomena such coronal mass ejections (CMEs), the phenomenon responsible for the Carrington Event. Future CMEs could potentially overload our power lines or increase radiation exposure for airline crew and passengers traveling in certain locations.And nowadays, the satellites do more than merely give us warnings, said LASP senior research scientist Frank Eparvier, a collaborator on the current work."In the past few decades we've gone from just sending out alerts that flares have happened to being able to correct for solar variability due to flares and CMEs," Eparvier said. "Knowing in real time how much the solar EUV is varying allows for the running of computer models of the atmosphere, which can then produce corrections for the GPS units to minimize the impacts of that variability."Two metals are particularly useful for filtering out the massive amounts of visible light to let through that small but important EUV signal. Aluminum filters transmit EUV light between 17 nm and 80 nm. 
Zirconium filters transmit EUV light between 6 nm and 20 nm.While these filters begin their lives transmitting a lot of EUV light in their respective ranges, the aluminum filters, in particular, quickly lose their transmission abilities. A filter might start by allowing 50% of 30-nm EUV light through to the detector. But within just a year, it only transmits 25% of this light. Within five years, that number is down to 10%."It's a significant issue," Tarrio said. Less light transmitted means less data available -- a little like trying to read in a dimly lit room with dark sunglasses.Scientists have long known that carbon deposits can build up on instruments when they are subjected to UV light. Sources of carbon on satellites can be everything from fingerprints to the materials used in the construction of the spacecraft itself. In the case of the mysteriously cloudy UV filters, researchers thought carbon might have been deposited on them, absorbing EUV light that would otherwise have passed through.However, since the 1980s, astronomers have been carefully designing spacecraft to be as carbon-free as possible. And that work has helped them with other carbonization problems. But it didn't help with the aluminum EUV filter issue. Nevertheless, the community still suspected carbonization was at least partially responsible for the degradation.To test this in a controlled setting, NIST researchers and collaborators used a machine that effectively lets them create their own space weather.The instrument is NIST's Synchrotron Ultraviolet Radiation Facility (SURF), a room-sized particle accelerator that uses powerful magnets to move electrons in a circle. The motion generates EUV light, which can be diverted via specialized mirrors to impact targets -- in this case, the aluminum and zirconium satellite filters.Each filter was 6 millimeters by 18 mm, smaller than a postage stamp, and only 250 nm thick, about 400 times thinner than a human hair. The sample filters were actually slightly thicker than real satellite filters, with other small changes designed to prevent the SURF beam from literally burning holes into the metals. During a run, the back side of each filter was exposed to a controlled source of carbon.To speed up the testing process, the team blasted the filters with the equivalent of five years' worth of space weather in a mere hour or two. Incidentally, getting that kind of beam power was no sweat for SURF."We turn SURF down to about half a percent of its normal power in order to expose the filters to a reasonable amount of light," Tarrio said. "The satellites are 92 million miles away from the Sun, and the Sun's not putting out an awful lot of EUV to begin with."Finally, after exposure, researchers tested each filter to see how much EUV light in the correct wavelength range was able to pass through.The team found that transmission was not significantly different after exposure versus before exposure, for either the aluminum or the zirconium. In fact, the difference in transmission was just a fraction of a percent, not nearly enough to explain the kind of clouding that happens in real space satellites."We were looking for a 30% decrease in transmission," Tarrio said. "And we just didn't see it."As an extra test, the scientists gave the filters even larger doses of light -- the equivalent of 50 years' worth of ultraviolet radiation. 
And even that didn't produce much of a light transmission problem, growing just 3 nm of carbon on the filters -- 10 times less than researchers would have expected if carbon was responsible.The real culprit hasn't yet been identified, but researchers already have a different suspect in mind: water.Like most metals, aluminum naturally has a thin layer on its surface of a material called an oxide, which forms when aluminum binds with oxygen. Everything from aluminum foil to soda cans has this oxide layer, which is chemically identical to sapphire.In the proposed mechanism, the EUV light would pull atoms of aluminum out of the filter and deposit them on the filter's exterior, which already has that thin oxide layer. The exposed atoms would then react with the oxygen in water from Earth that has hitched a ride on the spacecraft. Together, the exposed aluminum and water would react to form a much thicker oxide layer, which could theoretically be absorbing the light.Further SURF experiments scheduled for later this year should answer the question of whether the problem really is water, or something else. "This would be the first time that people have looked at the deposition of aluminum oxide in this context," Tarrio said. "We're looking into it as a serious possibility."
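A back-of-envelope way to see why a few nanometres of carbon falls short of explaining the degradation is simple Beer-Lambert attenuation, T = T0 * exp(-d / L), where d is the carbon thickness and L its attenuation length. The attenuation length used below is a placeholder, not a value from the paper; only the 50% starting transmission comes from the article.

```python
# Hypothetical Beer-Lambert sketch: transmission after growing d nm of carbon.
import numpy as np

T0 = 0.50          # fresh-filter transmission near 30 nm, from the article
L_CARBON = 25.0    # placeholder attenuation length of carbon, in nm (assumed)

for d_nm in (3.0, 30.0):   # observed carbon growth vs. what carbonization would need
    T = T0 * np.exp(-d_nm / L_CARBON)
    print(f"{d_nm:4.1f} nm of carbon -> transmission {T:.1%}")
```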
|
Weather
| 2,021 |
March 18, 2021
|
https://www.sciencedaily.com/releases/2021/03/210318142513.htm
|
Effective interventions may prevent disease transmission in changing climate
|
Aedes aegypti
|
Temperature and rainfall have significant impacts on Aedes aegypti abundance. Flood seasons contributed to significantly higher Aedes aegypti abundance. According to the authors, "Dengue is the fastest-growing mosquito-borne disease in the world, and as climate change accelerates, many vulnerable populations will continue to be disproportionately impacted by this virus. Having shown floods to result in significantly increased dengue vector abundance, we hope to encourage actionable interventions to limit infection risk in light of these extreme climate events."
|
Weather
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312155433.htm
|
Farm-level study shows rising temperatures hurt rice yields
|
A study of the relationship between temperature and yields of various rice varieties, based on 50 years of weather and rice-yield data from farms in the Philippines, suggests that warming temperatures negatively affect rice yields.
|
Recent varieties of rice, bred for environmental stresses like heat, showed better yields than both traditional rice varieties and modern varieties of rice that were not specifically bred to withstand warmer temperatures. But the study found that warming adversely affected crop yields even for those varieties best suited to the heat. Overall, the advantage of varieties bred to withstand increased heat was too small to be statistically significant.One of the top 10 countries globally in rice production, the Philippines is also a top-10 rice importer, as domestic supply cannot meet demand.Roderick Rejesus, a professor and extension specialist of agricultural and resource economics at North Carolina State University and the corresponding author of a paper that describes the study, says that teasing out the effects of temperature on rice yields is important to understand whether rice-breeding efforts have helped address the environmental challenges faced by modern society, such as global warming.The study examined rice yields and atmospheric conditions from 1966 to 2016 in Central Luzon, the major rice-growing region of the Philippines. Rejesus and study colleagues were able to utilize farm-level data of rice yields and area weather conditions in four-to-five-year increments over the 50-year period, a rare data trove that allowed the researchers to painstakingly examine the relationship between rice yield and temperature in actual farm environments."This rich data set allowed us to see what was actually happening at the farm level, rather than only observing behavior at higher levels of aggregation like in provinces or districts," Rejesus said.The study examined three general rice varieties planted during those 50 years: traditional rice varieties; "early modern varieties" planted after the onset of the Green Revolution, which were bred for higher yields; and "recent modern varieties" bred for particular characteristics, like heat or pest resistance, for example.Perhaps as expected, the study showed that, in the presence of warming, recent modern varieties had the best yields when compared with the early modern and traditional varieties, and that early modern varieties outperformed traditional varieties. Interestingly, some of the early modern varieties may have also mitigated heat challenges given their smaller "semi-dwarf" plant architecture, even though they were not bred to specifically resist heat."Taken all together, there are two main implications here," Rejesus said. "The first is that, at the farm level, there appears to be a 'yield gap' between how rice performs in breeding trials and on farms, with farm performance of recent varieties bred to be more tolerant to environmental stresses not being statistically different relative to the older varieties."The second is that rice breeding efforts may not have reached their full potential such that it may be possible to produce new varieties that will statistically perform better than older varieties in a farm setting."Rejesus also acknowledged that the study's modest sample size may have contributed to the inability to find statistical significance in the differences in warming impacts between rice varietal yields."This paper has implications for other rice-breeding countries, like Vietnam, because the timing of the release of various rice varieties is somewhat similar to that of the Philippines," Rejesus said. "Plant-breeding institutions can learn from this type of analysis, too. 
It provides guidance as to where research funding may be allocated by policymakers to further improve the high temperature tolerance of rice varieties available to farmers." Rejesus plans to further study other agricultural practices and innovations that affect crop yields, including an examination of cover crops, or plants grown on cropland in the off season that aim to keep soils healthy, to gauge whether they can mitigate the adverse impacts of a changing climate.
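A stylized version of the kind of farm-level regression described here is sketched below: yield regressed on growing-season temperature with variety-specific slopes. The data are simulated and the specification is deliberately minimal; the actual analysis draws on decades of survey rounds and richer controls.

```python
# Simulated illustration of yield vs. temperature by rice variety group.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 600
variety = rng.choice(["traditional", "early_modern", "recent_modern"], size=n)
temp = rng.normal(28, 1.5, size=n)            # seasonal mean temperature, deg C
base = {"traditional": 2.5, "early_modern": 3.5, "recent_modern": 4.0}
slope = {"traditional": -0.30, "early_modern": -0.25, "recent_modern": -0.20}
rice_yield = np.array([base[v] + slope[v] * (t - 28) for v, t in zip(variety, temp)])
rice_yield += rng.normal(0, 0.4, size=n)      # farm-level noise

df = pd.DataFrame({"rice_yield": rice_yield, "temp": temp, "variety": variety})
model = smf.ols("rice_yield ~ temp * C(variety)", data=df).fit()
print(model.params.round(3))
```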
|
Weather
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310204226.htm
|
Fishers at risk in 'perfect storm'
|
Stormier weather will increasingly force fishers to choose between their safety and income, researchers say.
|
Climate change is causing more extreme weather in many locations. Storms will likely increase around the UK in the future, while many fishers in the UK also face economic insecurity.The new study -- led by the University of Exeter -- worked with fishers in Cornwall to understand how they balance the risks and rewards of fishing in varying conditions.Factors that made skippers more likely to risk fishing in high wind or waves included: being the main earner in their household, poor recent fishing success, and having a crew to support."Climate change and economic insecurity create a 'perfect storm', putting ever-increasing pressure on skippers," said lead author Dr Nigel Sainsbury."Fishing is already the most dangerous peacetime profession in the UK, and the combination of more extreme weather and financial challenges will only make this worse."Solving this problem is difficult."Our suggestions include policies that improve the safety of boats and support less vulnerable fishing methods, and the creation of insurance products that pay fishers to stay in port in dangerous conditions."The research team included scientists from the Centre for Environment, Fisheries and Aquaculture Science (Cefas) and the universities of East Anglia, Bristol and North Carolina Wilmington.Researchers presented skippers with various scenarios with differing factors including wave height, wind speed, likely catch and price -- and asked them which trip they would prefer, or whether they would stay ashore."Skippers working with a crew were more likely to 'push the weather'," Dr Sainsbury said."This could partly be explained by the fact it's safer to fish with a crew than alone, but might also be because skippers feel responsible for providing incomes for their crew -- even if the conditions are risky."Fishers were asked to score their fishing success over the previous month on a scale of one to five -- and catch levels were more important to those with low scores, which would lead them to take greater risks if they expected a good catch.Boat size and fishing method also affected decision making. Unsurprisingly, skippers of larger boats and those whose method was less risky in high winds or waves were more willing to go out in such conditions.The study included 80 skippers at seven Cornish ports, and fishing methods included otter board trawl, purse seines, gillnets, tangle nets, trammel nets, hand lines and pots."By taking a human behavioural perspective, this study provides new understanding of how changing storminess can impact fisheries," Dr Sainsbury said."We have shown that fishers' trade-offs of physical risk and fishing rewards are influenced by technical, social and economic factors."This study provides insights that could be very helpful in trying to predict levels of disruption to the fishing industry in the future as a result of changing storminess and climate change."Funding for the study came from the Natural Environment Research Council (NERC), Cefas and Willis Research Network.The paper, published in the journal
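The stated-choice scenarios described above lend themselves to a discrete-choice model; the sketch below fits a simple logit of "go to sea" against wave height, wind speed and expected catch value. All numbers are invented and the study's own specification is more detailed.

```python
# Toy logit of a skipper's go/stay decision; coefficients and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
wave_m = rng.uniform(0.5, 5.0, n)       # significant wave height, metres
wind_kt = rng.uniform(5, 40, n)         # wind speed, knots
catch_gbp = rng.uniform(200, 2000, n)   # expected catch value, GBP
utility = 1.5 - 0.8 * wave_m - 0.05 * wind_kt + 0.002 * catch_gbp
go = (utility + rng.logistic(0, 1, n) > 0).astype(int)

df = pd.DataFrame({"go": go, "wave_m": wave_m, "wind_kt": wind_kt, "catch_gbp": catch_gbp})
fit = smf.logit("go ~ wave_m + wind_kt + catch_gbp", data=df).fit(disp=False)
print(fit.params.round(4))
```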
|
Weather
| 2,021 |
March 9, 2021
|
https://www.sciencedaily.com/releases/2021/03/210309091251.htm
|
Tropical cyclone exposure linked to rise in hospitalizations from many causes for older adults
|
An increase in overall hospitalizations was reported for older adults in the week following exposure to a tropical cyclone, according to a new study by researchers at Columbia University Mailman School of Public Health, Columbia University's Earth Institute and colleagues at Colorado State University and Harvard T. H. Chan School of Public Health.
|
The researchers used data over 16 years on 70 million Medicare hospitalizations and a comprehensive database of county-level local winds associated with tropical cyclones to examine how tropical cyclone wind exposures affect hospitalizations from 13 mutually exclusive, clinically meaningful causes, along with over 100 sub-causes. This study is the first comprehensive investigation of the impact of hurricanes and other tropical cyclones on all major causes and sub-causes of hospitalizations. The findings are published in Over 16,000 additional hospitalizations were associated with tropical cyclones over a ten-year average exposure. Analyses showed a 14 percent average rise in respiratory diseases in the week after exposure. The day after tropical cyclones with hurricane-force winds respiratory disease hospitalizations doubled. Also reported was an average 4 percent rise in infectious and parasitic diseases and 9 percent uptick in injuries. Hospitalizations from chronic obstructive pulmonary disease (COPD) surged 45 percent the week following tropical cyclone exposure compared to weeks without exposure.This rise in hospitalizations was driven primarily by increases in emergency hospitalizations. The researchers point out that there may have been cases where exposure to the cyclones prevented normal medical care, compelling people to go to the hospital to access services that they might otherwise get outside a hospital setting without the storm. For example, if those with respiratory issues experienced loss of power -- often a result from tropical cyclone winds -- they may have turned to hospitals if they needed power for medical equipment that a hospital could furnish.However, for certain causes, such as certain cancers, the authors also reported decreases in hospitalizations. These decreases were driven by non-emergency hospitalizations, indicating that people possibly cancelled scheduled hospitalizations because of the storm, which may have longer-term impacts on health."We know that hurricanes and other tropical cyclones have devastating effects on society, particularly on the poorest and most vulnerable" said Robbie M. Parks, PhD, Earth Institute post-doctoral fellow at the Columbia University Mailman School of Public Health and first author. "But until now only limited previous studies have calculated their impacts on health outcomes. Current weather trends also indicate that we can expect tropical cyclone exposure to remain a danger to human health and wellbeing, and could cause devastation to many more communities, now and into the future. There is no doubt that extreme weather events, such as tropical cyclones, are a great threat to human health in the U.S. and many other places in the world -- now and with climate change in the future. Our study is a major first step in understanding how tropical cyclone exposure impacts many different adverse health outcomes."The researchers anticipate that adequate forecasting of tropical cyclones might help, for example, in the planning of setting up shelters to provide electricity and common medications and creating easy ways for vulnerable people with certain chronic conditions to find and use those resources outside of the hospital.One of the main impediments for research in this field has been the difficulty in readily accessing data for exposure assessment. This research was greatly facilitated by the work of G. 
Brooke Anderson, PhD, associate professor at Colorado State University, who curated an open-source dataset to easily assess exposure to tropical cyclones for epidemiologic studies. The authors coupled the exposure data with comprehensive hospitalization data among Medicare enrollees. "The development of environmental health data research platforms that provide a one-point access to data, like the one we used for this study, can be a very powerful tool allowing research in directions that were not possible before," said Francesca Dominici, PhD, professor of biostatistics at the Harvard Chan School and co-author."While serious gaps in knowledge remain, we gained valuable insights into the timing of hospitalizations relative to exposure and how cause-specific hospitalizations can be impacted by tropical cyclones," said Marianthi-Anna Kioumourtzoglou, ScD, assistant professor of environmental health sciences at Columbia Mailman School, and senior author. "These important discoveries will be key for preparedness planning, including hospital and physician preparedness. Our study is just a first step in this process."
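The headline percentages above are, in spirit, comparisons of admissions in exposed versus unexposed weeks; a toy version of that calculation is sketched below with made-up counts.

```python
# Made-up daily admission counts; percent change in exposed vs. unexposed weeks.
import pandas as pd

counts = pd.DataFrame({
    "cause": ["respiratory", "injuries", "infectious", "COPD"],
    "exposed_week_mean": [228, 109, 52, 29],     # hypothetical daily admissions
    "unexposed_week_mean": [200, 100, 50, 20],
})
counts["pct_change"] = 100 * (counts.exposed_week_mean / counts.unexposed_week_mean - 1)
print(counts)
```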
|
Weather
| 2,021 |
March 8, 2021
|
https://www.sciencedaily.com/releases/2021/03/210308165237.htm
|
Northern Hemisphere summers may last nearly half the year by 2100
|
Without efforts to mitigate climate change, summers spanning nearly six months may become the new normal by 2100 in the Northern Hemisphere, according to a new study. The change would likely have far-reaching impacts on agriculture, human health and the environment, according to the study authors.
|
In the 1950s in the Northern Hemisphere, the four seasons arrived in a predictable and fairly even pattern. But climate change is now driving dramatic and irregular changes to the length and start dates of the seasons, which may become more extreme in the future under a business-as-usual climate scenario."Summers are getting longer and hotter while winters shorter and warmer due to global warming," said Yuping Guan, a physical oceanographer at the State Key Laboratory of Tropical Oceanography, South China Sea Institute of Oceanology, Chinese Academy of Sciences, and lead author of the new study in Geophysical Research Letters, AGU's journal for high-impact, short-format reports with immediate implications spanning all Earth and space sciences.Guan was inspired to investigate changes to the seasonal cycle while mentoring an undergraduate student, co-author Jiamin Wang. "More often, I read some unseasonable weather reports, for example, false spring, or May snow, and the like," Guan said.The researchers used historical daily climate data from 1952 to 2011 to measure changes in the four seasons' length and onset in the Northern Hemisphere. They defined the start of summer as the onset of temperatures in the hottest 25% during that time period, while winter began with temperatures in the coldest 25%. Next, the team used established climate change models to predict how seasons will shift in the future.The new study found that, on average, summer grew from 78 to 95 days between 1952 to 2011, while winter shrank from 76 to 73 days. Spring and autumn also contracted from 124 to 115 days, and 87 to 82 days, respectively. Accordingly, spring and summer began earlier, while autumn and winter started later. The Mediterranean region and the Tibetan Plateau experienced the greatest changes to their seasonal cycles.If these trends continue without any effort to mitigate climate change, the researchers predict that by 2100, winter will last less than two months, and the transitional spring and autumn seasons will shrink further as well."Numerous studies have already shown that the changing seasons cause significant environmental and health risks," Guan said. For example, birds are shifting their migration patterns and plants are emerging and flowering at different times. These phenological changes can create mismatches between animals and their food sources, disrupting ecological communities.Seasonal changes can also wreak havoc on agriculture, especially when false springs or late snowstorms damage budding plants. And with longer growing seasons, humans will breathe in more allergy-causing pollen, and disease-carrying mosquitoes can expand their range northward.This shift in the seasons may result in more severe weather events, said Congwen Zhu, a monsoon researcher at the State Key Laboratory of Severe Weather and Institute of Climate System, Chinese Academy of Meteorological Sciences, Beijing, who was not involved in the new study."A hotter and longer summer will suffer more frequent and intensified high-temperature events -- heatwaves and wildfires," Zhu said. 
Additionally, warmer, shorter winters may cause instability that leads to cold surges and winter storms, much like the recent snowstorms in Texas and Israel, he said. "This is a good overarching starting point for understanding the implications of seasonal change," said Scott Sheridan, a climate scientist at Kent State University who was not part of the new study. It is difficult to conceptualize a 2- or 5-degree average temperature increase, he said, but "I think realizing that these changes will force potentially dramatic shifts in seasons probably has a much greater impact on how you perceive what climate change is doing."
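The percentile-based season definition described above is straightforward to express in code. The sketch below applies 75th/25th-percentile thresholds to a synthetic daily temperature series and counts summer and winter days per year; it is illustrative only, not the authors' processing chain.

```python
# Percentile-threshold season lengths from a synthetic daily temperature series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
days = pd.date_range("1952-01-01", "2011-12-31", freq="D")
temp = (10 + 12 * np.sin(2 * np.pi * (days.dayofyear - 100) / 365.25)
        + 0.02 * (days.year - 1952) + rng.normal(0, 2, len(days)))
series = pd.Series(temp, index=days)

hot = series.quantile(0.75)    # summer threshold: hottest 25% of days
cold = series.quantile(0.25)   # winter threshold: coldest 25% of days
summer_days = (series >= hot).groupby(days.year).sum()
winter_days = (series <= cold).groupby(days.year).sum()

print("summer days/yr:", summer_days.loc[1952:1961].mean(), "->", summer_days.loc[2002:2011].mean())
print("winter days/yr:", winter_days.loc[1952:1961].mean(), "->", winter_days.loc[2002:2011].mean())
```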
|
Weather
| 2,021 |
February 2, 2021
|
https://www.sciencedaily.com/releases/2021/02/210202113747.htm
|
How do electrons close to Earth reach almost the speed of light?
|
A new study found that electrons can reach ultra-relativistic energies under very special conditions in the magnetosphere, when space is nearly devoid of plasma.
|
Recent measurements from NASA's Van Allen Probes spacecraft showed that electrons can reach ultra-relativistic energies flying at almost the speed of light. Hayley Allison, Yuri Shprits and collaborators from the German Research Centre for Geosciences have revealed under which conditions such strong accelerations occur. They had already demonstrated in 2020 that during solar storm plasma waves play a crucial role for that. However, it was previously unclear why such high electron energies are not achieved in all solar storms. In the journal At ultra-relativistic energies, electrons move at almost the speed of light. Then the laws of relativity become most important. The mass of the particles increases by a factor ten, time is slowing down, and distance decreases. With such high energies, charged particles become most dangerous to even the best protected satellites. As almost no shielding can stop them, their charge can destroy sensitive electronics. Predicting their occurrence -- for example, as part of the observations of space weather practised at the GFZ -- is therefore very important for modern infrastructure.To investigate the conditions for the enormous accelerations of the electrons, Allison and Shprits used data from a twin mission, the "Van Allen Probes," which the US space agency NASA had launched in 2012. The aim was to make detailed measurements in the radiation belt, the so-called Van Allen belt, which surrounds the Earth in a donut shape in terrestrial space. Here -- as in the rest of space -- a mixture of positively and negatively charged particles forms a so-called plasma. Plasma waves can be understood as fluctuations of the electric and magnetic field, excited by solar storms. They are an important driving force for the acceleration of electrons.During the mission, both solar storms that produced ultra-relativistic electrons and storms without this effect were observed. The density of the background plasma turned out to be a decisive factor for the strong acceleration: electrons with the ultra-relativistic energies were only observed to increase when the plasma density dropped to very low values of only about ten particles per cubic centimetre, while normally such density is five to ten times higher.Using a numerical model that incorporated such extreme plasma depletion, the authors showed that periods of low density create preferential conditions for the acceleration of electrons -- from an initial few hundred thousand to more than seven million electron volts. To analyse the data from the Van Allen probes, the researchers used machine learning methods, the development of which was funded by the GEO.X network. They enabled the authors to infer the total plasma density from the measured fluctuations of electric and magnetic field."This study shows that electrons in the Earth's radiation belt can be promptly accelerated locally to ultra-relativistic energies, if the conditions of the plasma environment -- plasma waves and temporarily low plasma density -- are right. The particles can be regarded as surfing on plasma waves. In regions of extremely low plasma density they can just take a lot of energy from plasma waves. 
Similar mechanisms may be at work in the magnetospheres of the outer planets such as Jupiter or Saturn and in other astrophysical objects," says Yuri Shprits, head of the GFZ section Space physics and space weather and professor at the University of Potsdam. "Thus, to reach such extreme energies, a two-stage acceleration process is not needed, as long assumed -- first from the outer region of the magnetosphere into the belt and then inside. This also supports our research results from last year," adds Hayley Allison, postdoc in the Section Space physics and space weather.
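For a sense of scale of the "ultra-relativistic" label, the Lorentz factor follows directly from the kinetic energy: gamma = 1 + E_k / (m_e c^2), with the electron rest energy m_e c^2 ≈ 0.511 MeV. The short calculation below works through the energies quoted above; it is a standard textbook relation, not code from the study.

```python
# Lorentz factor and speed for the electron energies mentioned in the article.
import math

M_E_C2_MEV = 0.511   # electron rest energy in MeV

def gamma(kinetic_mev: float) -> float:
    return 1.0 + kinetic_mev / M_E_C2_MEV

for e_k in (0.3, 7.0):   # "a few hundred thousand" eV vs. "more than seven million" eV
    g = gamma(e_k)
    beta = math.sqrt(1.0 - 1.0 / g**2)   # speed as a fraction of c
    print(f"E_k = {e_k:4.1f} MeV -> gamma = {g:5.1f}, v = {beta:.4f} c")
```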
|
Weather
| 2,021 |
January 27, 2021
|
https://www.sciencedaily.com/releases/2021/01/210127122418.htm
|
Researchers use car collisions with deer to study mysterious animal-population phenomena
|
For at least a century, ecologists have wondered at the tendency for populations of different species to cycle up and down in steady, rhythmic patterns.
|
"These cycles can be really exaggerated -- really huge booms and huge busts -- and quite regular," said Daniel Reuman, professor of ecology & evolutionary biology at the University of Kansas and senior scientist at the Kansas Biological Survey. "It attracted people's attention because it was kind of mysterious. Why would such a big thing be happening?"A second observation in animal populations might be even harder to fathom: Far-flung communities of species, sometimes separated by hundreds of miles, often fluctuate in synchrony with one another -- an effect known as "spatial synchrony."Now, Reuman and colleagues have written a new study in the peer-reviewed journal Reuman compared the linked population phenomena to a famous physics experiment where two grandfather clocks are placed next to each other against a wall."Over time, the pendulums become synchronized," he said. "The reason is because both produce tiny vibrations in the wall. And the vibrations from one of them in the wall influences the other one just a little bit -- enough to get the pendulums to eventually become synchronous. One reason people think these cycling populations are easy to synchronize is if a few individuals can get from one to the other, like vibrations that go through the wall for the grandfather clocks. It's enough to bring these cycling populations into synchrony. That's how people thought about things before we started our work with this paper."But Reuman and his co-authors describe this process can actually go the other way around. The researchers found weather patterns driven by El Nino influenced predictable fluctuations in deer populations across the state as well as synchrony between different deer populations.Looking at datasets on local temperature and snowfall variations across the state, the team averaged them out, finding "buried underneath all of that randomness a hardly noticeable, but synchronous fluctuation," Reuman said.The three-to-seven-year weather fluctuation directly influenced synchronous population cycles in the state's deer."All that local variation would cancel out because it might be a little bit warmer in one place, a little bit colder and another place -- but that overall synchronous component, which is related to El Nino in this case, reinforces all the local variation," Reuman said. "And it's the same years with deer. So, the reason why the synchrony is causing the cycling is because the synchrony is occurring only on the relevant timescales of the fluctuation. It's only that component of the three-to-seven-year oscillations that synchronize. All the faster and slower oscillations are all local variation that cancels out when you average across the whole state."Moreover, the researchers found these deer population fluctuations predicted the numbers of car collisions with deer statewide more than traffic volume or other factors."It was a surprise to us when we figured out that that's what was going on," Reuman said. "What it amounts to is a new mechanism for these major population cycles and a new way that they can come about. That's fundamentally different from the old way that people were thinking about it."Lead author Tom Anderson, assistant professor at Southern Illinois University Edwardsville, said the work shows it's "still possible to discover new information about well-studied scientific phenomena.""Researchers have been examining population cycles for more than 100 years, yet our study still uncovered new information," Anderson said. 
"That is partly what makes science, and this project in particular, exciting, to be able to uncover new ways of thinking about something that others have thought about extensively. Our work also has important implications in a variety of other areas, including how fluctuations in populations of plants or animals will respond to climate change and that organisms that are economically and socially important to humans, like white-tailed deer, can undergo periods of high and low abundance due to naturally occurring processes across large spatial scales, which might have implications for their subsequent management."According to co-author Lawrence Sheppard, postdoctoral researcher with the KU Department of Ecology & Evolutionary Biology and the Kansas Biological Survey, the unexpected relationship between spatial synchrony and population cycles was revealed by "new methods to study the different timescales of change in an ecosystem.""We trace how particular timescales of change arise in the data and are communicated from one part of the system to another using 'wavelets,' which I first learned to apply to biomedical data during my Ph.D.," Sheppard said. "In particular, here we find that spatial synchrony on a particular timescale arises from an association with winter climate on that timescale, and the spatial synchrony in the deer population has a substantial statewide impact on human interactions with the deer."Additional authors were Jonathan Walter of KU and the University of Virginia and Robert Rolley of the Wisconsin Department of Natural Resources.Reuman said the findings could transfer to a wide range of other species and ecological systems, with ramifications for agriculture, fisheries, transportation managers and the insurance industry."We started out trying to understand the nature of synchrony in these things and trying to figure out what was causing it, and what its consequences are," Reuman said. "It's turned out to be related to these overall climatic indices. Now for deer, basically it's bad winter weather that we're talking about that synchronizes things. For another particular species, the nature of their relationship with the weather in a location is going to make the difference."
|
Weather
| 2,021 |
December 16, 2020
|
https://www.sciencedaily.com/releases/2020/12/201216155211.htm
|
COVID-19 spread increases when UV levels decrease
|
Natural variations in ultraviolet radiation influence the spread of COVID-19, but the influence is modest compared to preventive measures such as physical distancing, mask wearing, and quarantine, according to new research from Harvard University.
|
"Understanding the potential seasonality of COVID-19 transmission could help inform our response to the pandemic in the coming months," said Jonathan Proctor, a postdoctoral fellow at the Harvard Data Science Initiative and the Harvard Center for the Environment. "These findings suggest that the incidence of COVID-19 may have a seasonal pattern, spreading faster in the winter when it's darker than in the summer."Analyzing daily COVID-19 and weather data from over 3,000 administrative regions in more than 170 countries, Proctor, together with co-authors Peter Huybers, also at Harvard University, Tamma Carleton and Kyle Meng from the University of California Santa Barbara and Jules Cornetet at France's École Normale Supérieure Paris-Saclay, found that the spread of COVID-19 through a population tended to be lower in the weeks following higher UV exposure. Findings were published in the The seasonality of COVID-19 has been a mystery since the disease first emerged one year ago, though there have been some clues that UV could play a role. Related species of coronaviruses such as SARS and MERS were found to be sensitive to UV radiation and recent laboratory studies show that UV inactivates SARS-CoV-2, the virus that causes COVID-19, on surfaces.Attempts to understand the influence of UV in the real world, however, have been limited by scarce data and the difficulty of isolating climate variables from other drivers of transmission. To test for an environmental signal within the noise of the pandemic, the team compiled and cleaned data from statistical agencies around the world. To avoid potentially confounding factors that differ across regions, such as healthcare infrastructure or population density, the team examined how transmission within a particular population changed according to variations in sunlight, temperature, precipitation and humidity experienced by that same population."We basically ask whether daily fluctuations in environmental conditions experienced by a population affect new COVID-19 cases up to two weeks later," Meng explained.The researchers diagnosed the relationship between UV and COVID-19 using data from the beginning of the pandemic and then used that relationship to simulate how seasonal changes might influence the spread of COVID-19. They found that changes in UV between winter and summer led to a 7-percentage point decrease in the COVID-19 growth rate on average across the Northern Hemisphere, which is about half the average daily growth rate at the beginning of the pandemic. While this research shows that COVID-19 exhibits a seasonal pattern due to changes in UV, the full seasonality of COVID-19 remains unclear because of uncertain influences from other environmental factors such as temperature and humidity."We are confident of the UV effect, but this is only one piece of the full seasonality picture," Carleton said.The team noted that environmental influences are just one of many determinants of COVID-19 transmission, and that the estimated effects of UV seasonality in the Northern and Southern Hemispheres are a fraction of the size of previously estimated effects of anti-contagion policies including quarantines and travel bans."As we saw in the U.S. this summer, UV exposure alone is unlikely to stop the spread of the virus without strong social distancing policies," said Proctor. 
"Regardless of the weather, additional measures appear to be necessary to substantially slow the spread."The team analyzed the data in multiple ways and consistently found that the higher the UV, the lower the spread of COVID-19, but it remains unclear what mechanism is driving that effect. It may be that UV destroys the virus on surfaces or in aerosols, or that on sunny days people go outside more where there is less transmission. It is even possible that UV reduces susceptibility to COVID-19 by stimulating production of vitamin D and boosting the immune system."There's still so much that we don't know about how environmental factors both directly and indirectly, though human behavior, influence the spread of the virus," said Huybers. "But a better understanding of the environmental influences on COVID-19 could allow for seasonal adjustment of containment policies and may help inform vaccination strategies.
|
Weather
| 2,020 |
October 13, 2020
|
https://www.sciencedaily.com/releases/2020/10/201013111322.htm
|
Hurricanes, heavy rains are critical for Hawai'i's groundwater supply
|
Located within the most isolated archipelago in the world, Hawai'i is critically dependent on a clean, ample supply of fresh water. New research led by University of Hawai'i at Mānoa scientists indicates that rain brought to the islands by hurricanes and Kona storms can often be the most important precipitation for re-supplying groundwater in many regions of the island of O'ahu.
|
"The majority of Hawai'i's freshwater comes from groundwater," said Daniel Dores, lead author and groundwater and geothermal researcher in the UH M?noa School of Ocean and Earth Science and Technology. "In this study, we investigated the relationship between trade wind showers, major rainfall events like Kona storms, and groundwater."Dores and a team of scientists from SOEST and the Hawai'i Department of Health collected rainfall around the island of Oahu and analyzed the stable isotopes of rainwater, chemical signatures in the water molecules. They compared the chemical signatures in rainwater to those of groundwater to determine the source of water in the aquifers -- event-based rainfall or trade wind-related rain."Because windward and mauka showers are so common, it is easy to assume that is the main source of our drinking water," said Dores. "Also, large rainfall events such as Kona storms result in significant runoff into the oceans. However, our research found that a lot of the rain from Kona storms makes it into our groundwater aquifers and is an important source of our drinking water."Hawai'i is experiencing substantial changes in trade wind weather patterns, and precipitation events could become more extreme. Some of the study co-authors will continue research to understand more about local and regional groundwater recharge and water quality."By better understanding how our groundwater is impacted by these extreme precipitation events, we can better protect the resource itself," said Dores.
|
Weather
| 2,020 |
September 18, 2020
|
https://www.sciencedaily.com/releases/2020/09/200918113350.htm
|
Solar storm forecasts for Earth improved with help from the public
|
Solar storm analysis carried out by an army of citizen scientists has helped researchers devise a new and more accurate way of forecasting when Earth will be hit by harmful space weather. Scientists at the University of Reading added analysis carried out by members of the public to computer models designed to predict when coronal mass ejections (CMEs) -- huge solar eruptions that are harmful to satellites and astronauts -- will arrive at Earth.
|
The team found forecasts were 20% more accurate, and uncertainty was reduced by 15%, when incorporating information about the size and shape of the CMEs in the volunteer analysis. The data was captured by thousands of members of the public during the latest activity in the Solar Stormwatch citizen science project, which was devised by Reading researchers and has been running since 2010. The findings support the inclusion of wide-field CME imaging cameras on board space weather monitoring missions currently being planned by agencies like NASA and ESA. Dr Luke Barnard, space weather researcher at the University of Reading's Department of Meteorology, who led the study, said: "CMEs are sausage-shaped blobs made up of billions of tonnes of magnetised plasma that erupt from the Sun's atmosphere at a million miles an hour. They are capable of damaging satellites, overloading power grids and exposing astronauts to harmful radiation. "Predicting when they are on a collision course with Earth is therefore extremely important, but is made difficult by the fact the speed and direction of CMEs vary wildly and are affected by solar wind, and they constantly change shape as they travel through space." Solar storm forecasts are currently based on observations of CMEs as soon as they leave the Sun's surface, meaning they come with a large degree of uncertainty. The volunteer data offered a second stage of observations at a point when the CME was more established, which gave a better idea of its shape and trajectory. "The value of additional CME observations demonstrates how useful it would be to include cameras on board spacecraft in future space weather monitoring missions. More accurate predictions could help prevent catastrophic damage to our infrastructure and could even save lives." In the study, the team used a simplified solar wind model that is able to run up to 200 simulations -- compared to around 20 currently used by more complex models -- to provide improved estimates of the solar wind speed and its impact on the movement of CMEs, the most harmful of which can reach Earth in 15-18 hours. Adding the public CME observations to the model's predictions helped provide a clearer picture of the likely path the CME would take through space, reducing the uncertainty in the forecast. The new method could also be applied to other solar wind models. The Solar Stormwatch project was led by Reading co-author Professor Chris Scott. It asked volunteers to trace the outline of thousands of past CMEs captured by Heliospheric Imagers -- specialist, wide-angle cameras -- on board two NASA STEREO spacecraft, which orbit the Sun and monitor the space between it and Earth. The scientists retrospectively applied their new forecasting method to the same CMEs the volunteers had analysed to test how much more accurate their forecasts were with the additional observations. Using the new method for future solar storm forecasts would require swift real-time analysis of the images captured by the spacecraft camera, which would provide warning of a CME being on course for Earth several hours or even days in advance of its arrival.
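As an illustration of how an extra, mid-course observation can sharpen an ensemble forecast, the toy sketch below reweights simulated CME arrival times by their agreement with an observed front position. This is not the Reading team's model; all numbers, the ballistic travel-time assumption and the Gaussian weighting are invented for illustration.

```python
# Toy sketch: constrain an ensemble of CME arrival-time predictions with one
# volunteer-style observation of the CME front distance part-way to Earth.
import numpy as np

rng = np.random.default_rng(1)
n_members = 200
speed = rng.normal(700, 120, n_members)        # km/s, assumed launch-speed spread
arrival_hours = 1.5e8 / speed / 3600           # naive ballistic Sun-Earth travel time

# Pretend a traced image gives the front distance (km) 24 hours after launch
observed_front = 0.45 * 1.5e8
predicted_front = speed * 24 * 3600
weights = np.exp(-0.5 * ((predicted_front - observed_front) / 1.5e7) ** 2)
weights /= weights.sum()

prior_mean = arrival_hours.mean()
posterior_mean = np.sum(weights * arrival_hours)
posterior_spread = np.sqrt(np.sum(weights * (arrival_hours - posterior_mean) ** 2))
print(prior_mean, posterior_mean, posterior_spread)  # spread shrinks vs. raw ensemble
```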
|
Weather
| 2,020 |
August 21, 2020
|
https://www.sciencedaily.com/releases/2020/08/200821094832.htm
|
First physics-based method for predicting large solar flares
|
Solar flares emit sudden, strong bursts of electromagnetic radiation from the Sun's surface and its atmosphere, and eject plasma and energetic particles into interplanetary space. Since large solar flares can cause severe space weather disturbances affecting Earth, their occurrence needs to be predicted so that their impact can be mitigated. However, as the onset mechanism of solar flares is unclear, most flare prediction methods so far have been empirical.
|
The research team led by Professor Kanya Kusano (Director of the Institute for Space-Earth Environmental Research, Nagoya University) recently succeeded in developing the first physics-based model that can accurately predict imminent large solar flares. The work has been published in a peer-reviewed journal. The new method of flare prediction, called the kappa scheme, is based on the theory of "double-arc instability," a magnetohydrodynamic (MHD) instability triggered by magnetic reconnection. The researchers assumed that a small-scale reconnection of magnetic field lines can form a double-arc (m-shape) magnetic field and trigger the onset of a solar flare. The kappa scheme can predict how a small magnetic reconnection triggers a large flare and how a large solar flare can occur. The predictive model was tested on about 200 active regions during solar cycle 24 from 2008 to 2019 using data obtained by NASA's Solar Dynamics Observatory (SDO) satellite. It was demonstrated that, with few exceptions, the kappa scheme predicts most imminent solar flares, as well as the precise location they will emerge from. The researchers also discovered that a new parameter -- the "magnetic twist flux density" close to a magnetic polarity inversion line on the solar surface -- determines when and where solar flares probably occur and how large they are likely to be. Previous flare prediction methods have relied on empirical relations in which the predictions of the previous day tend to continue into the next day even if flare activity changes. In contrast, the kappa scheme predicts large solar flares through a physics-based approach regardless of previous flare activity. While it takes a lot more work to implement the scheme in real-time operational forecasting, this study shows that the physics-based approach may open a new direction for flare prediction research.
|
Weather
| 2,020 |
August 18, 2020
|
https://www.sciencedaily.com/releases/2020/08/200818142149.htm
|
Cold-weather accounts for almost all temperature-related deaths
|
With the number of extreme weather days rising around the globe in recent years due to global warming, it is no surprise that there has been an upward trend in hospital visits and admissions for injuries caused by high heat over the last several years. But cold temperatures are responsible for almost all temperature-related deaths, according to a newly published study.
|
According to the new study by researchers at the University of Illinois Chicago, cold temperatures accounted for 94% of temperature-related deaths, even though hypothermia was responsible for only 27% of temperature-related hospital visits. "With the decrease in the number of cold weather days over the last several decades, we still see more deaths due to cold weather as opposed to hot weather," said Lee Friedman, associate professor of environmental and occupational health sciences in the UIC School of Public Health and corresponding author on the paper. "This is in part due to the body's poorer ability to thermoregulate once hypothermia sets in, as well as since there are fewer cold weather days overall, people don't have time to acclimate to cold when those rarer cold days do occur." Hypothermia, or a drop in the body's core temperature, doesn't require sub-arctic temperatures. Even mildly cool temperatures can initiate hypothermia, defined as a drop in body temperature from the normal 98.6 degrees to 95 degrees Fahrenheit. When this occurs, organs and systems begin to shut down in an effort to preserve the brain. The process, once started, can be very difficult to get under control; however, people who are more regularly exposed to lower temperatures are better able to resist hypothermia. "People who were experiencing homelessness in the records we looked at were less likely to die from temperature-related injury," Friedman said. "Because they have greater outdoor exposure, they acclimate better to both heat and cold." Heat-related issues are more likely to self-resolve by getting to a cooler place or by hydrating, Friedman said. The researchers looked at inpatient and outpatient heat- and cold-related injuries that required a hospital visit in Illinois between 2011 and 2018. They identified 23,834 cold-related cases and 24,233 heat-related cases. Among these patients, there were 1,935 cold-related deaths and 70 heat-related deaths. Friedman said government data systems that track temperature-related deaths significantly undercount these deaths. "We found five to 10 times more temperature-related deaths by linking the hospital data to data from the National Weather Service and medical examiner's data," he said. "There are a lot more people dying from temperature-related injuries than is generally reported." Friedman and his colleagues also found that cumulative costs associated with temperature-related hospital visits were approximately $1 billion between 2011 and 2018 in Illinois. Adults older than age 65 and Black people were almost twice as likely to be hospitalized due to temperature-related injuries. Individuals who visited a hospital due to cold temperatures also commonly had multiple health issues, including electrolyte disorders, cardiovascular disease and kidney failure. "Currently, the public health community focuses almost exclusively on heat injury. Our data demonstrate that improved awareness and education are needed around the risk for cold injuries, especially since there are fewer but more severe cold weather days -- leaving less chance for acclimation, which can be protective against hypothermia," Friedman said. Chibuzor Abasilim, Rosalinda Fitts and Michelle Wueste from UIC are co-authors on the paper.
|
Weather
| 2,020 |
August 18, 2020
|
https://www.sciencedaily.com/releases/2020/08/200818114936.htm
|
Climate change impact on green energy production
|
As the climate of the planet is changing, as evidenced by record-setting hot summers and extreme weather events, many researchers are looking to more renewable energy sources, such as solar and wind farms.
|
In the new paper, the researchers focused on Australia, since it is an ideal case study with extreme weather events, such as bush fires and windstorms. The sites selected were near Adelaide in South Australia and in southern New South Wales, where variable renewable generators are located or are likely to be located in the future based on the Australian Energy Market Operator's system plan. The researchers analyzed key weather variables, such as temperature, surface solar irradiance and wind speed, in 30-minute intervals for the years 1980 to 2060. "We found that the general temporal trends in annual solar and wind power generation due to climate change are small, being at the order of 0.1% of its average production per decade," said Jing Huang, one of the authors. During the five hottest days of every year, however, the effect of climate change on renewable energy production was more severe. During these peak temperature days that coincided with peak energy demand and peak prices, solar power production was down 0.5%-1.1%, and wind farm production decreased between 1.6%-3% per decade. The researchers' findings are geared to inform the electricity sector about the reliability of interconnected power networks in different temperature conditions. Central power operators, for instance, need to plan for all contingencies to avoid power blackouts. While this study looked at two sites in Australia, the approach of quantifying temperature impacts on renewable energy generation can be generalized to other areas beyond Australia. "For different regions, there are different effects of climate change," said Huang. He said it will be important to conduct more case studies in other areas and climate regions to examine spatial climate variability and couple the findings with other aspects of energy systems, such as load and transmission infrastructure.
|
Weather
| 2,020 |
August 17, 2020
|
https://www.sciencedaily.com/releases/2020/08/200817144119.htm
|
Equatorial winds ripple down to Antarctica
|
A CIRES-led team has uncovered a critical connection between winds at Earth's equator and atmospheric waves 6,000 miles away at the South Pole. The team has found, for the first time, evidence of a Quasi-Biennial Oscillation (QBO) -- an atmospheric circulation pattern that originates at the equator -- at McMurdo, Antarctica.
|
The discovery highlights how winds in the deep tropics affect the remote South Pole, in particular the polar vortex, which can trigger outbreaks of cold weather patterns in mid-latitudes. Scientists will be able to use this information to better understand the planet's weather and climate patterns and fuel more accurate atmospheric models, the authors say. "We have now seen how this atmospheric pattern propagates from the equator all the way to the high latitudes of Antarctica, showing how these far-away regions can be linked in ways we didn't know about before," said Zimu Li, a former CIRES research assistant who did this work at the University of Colorado Boulder, and lead author of the study, which is out today. "This can better our understanding of how large-scale atmospheric circulation works, and how patterns in one area of the world can ripple across the entire globe," said Xinzhao Chu, CIRES Fellow, professor in the Ann & H.J. Smead Department of Aerospace Engineering Sciences at the University of Colorado Boulder, and corresponding author on the new work. Every two years or so, the QBO causes the stratospheric winds at Earth's equator to switch direction, alternating between easterly and westerly. Lynn Harvey, a researcher at CU's Laboratory for Atmospheric and Space Physics (LASP) and a coauthor on the study, helped the team study the polar vortices, the massive swirls of cold air that spiral over each of Earth's poles. The study reports that the Antarctic vortex expands during the QBO easterly phase and contracts during the westerly phase. The team suspects that when the QBO changes the polar vortex's behavior, this in turn affects the behavior of atmospheric waves called gravity waves, which travel across different layers of the atmosphere. They identified specific kinds of changes in those gravity waves: The waves are stronger during the easterly period of the QBO and weaker when the QBO is westerly. For the last nine years, members of Chu's lidar team have spent long seasons at McMurdo Station, Antarctica, braving 24-hour darkness and frigid temperatures to operate custom lasers and measure patterns in Earth's atmosphere. These long-term measurements, along with 21 years of NASA MERRA-2 atmospheric records, were critical to the new findings. Each QBO cycle takes years to complete, so long-term data streams are the only way to identify interannual connections and patterns. "Atmospheric scientists can use this information to improve their models -- before this nobody really knew how QBO impacts gravity waves in this polar region," said Xian Lu, researcher at Clemson University and a coauthor on the study. "Researchers can use this information to better model and predict climate, including the variability of atmosphere and space and long-term change."
|
Weather
| 2,020 |
August 10, 2020
|
https://www.sciencedaily.com/releases/2020/08/200810141004.htm
|
Stronger rains in warmer climate could lessen heat damage to crops, says study
|
Intensified rainstorms predicted for many parts of the United States as a result of a warming climate may have a modest silver lining: they could more efficiently water some major crops, and this would at least partially offset the far larger projected yield declines caused by the rising heat itself. The conclusion, which goes against some accepted wisdom, is contained in a new study published this week.
|
Numerous studies have projected that rising growing-season temperatures will drastically decrease yields of some major U.S. crops, absent adaptive measures. The damage will come from both steadily heightened evaporation of soil moisture due to higher background temperatures, and sudden desiccation of crops during heat waves. Some studies say that corn, which currently yields about 13 billion bushels a year and plays a major role in the U.S. economy, could nosedive 10 to 30 percent by the mid- to late century. Soy (the United States is the world's leading producer) could decline as much as 15 percent. Since warmer air can hold more moisture, it is also projected that rainfall will in the future come more often in big bursts rather than gentle showers, a phenomenon that is already being observed in many areas. Many scientists have assumed that more extreme rains might further batter crops, but the new study found that this will probably not be the case. The reason: most of the projected heavier downpours will fall within a range that benefits crops, rather than passing the threshold at which they hurt them. "People have been talking about how more extreme rain will damage crops," said lead author Corey Lesk, a Ph.D. student at Columbia University's Lamont-Doherty Earth Observatory. "The striking thing we found was, the overall effect of heavier rains is not negative. It turns out to be good for crops." That said, the effects will probably be modest, according to the study. It estimates that corn yields could be driven back up 1 or 2 percent by the heavier rains, and soy by 1.3 to 2.5 percent. These increases are dwarfed by the potential losses due to heat, but even a few percent adds up when dealing with such huge quantities of crops. And, the researchers say, "Our findings may help identify new opportunities for climate-adaptive crop management and improved modeling." The team reached their conclusions by studying hour-by-hour rainfall patterns recorded by hundreds of weather stations in the agricultural regions of the U.S. West, South and Northeast each year from 2002 to 2017. They then compared the rainfall patterns to crop yields. They found that years with rains of up to about 20 millimeters an hour (roughly the heaviest downpour of the year on average) resulted in higher yields. It was only when rains reached an extreme 50 millimeters an hour or more that crops suffered damage. (20 millimeters an hour is about three-quarters of an inch; 50 is about 2 inches.) Moreover, years in which rain came mainly as mere drizzle actually damaged yields. The researchers outlined several possible reasons for the differences. For one, drizzle may be too inefficient to do much good. In hot weather, it can mostly evaporate back into the air before reaching subsurface root zones where it is needed; in cooler weather, it might remain on leaves long enough to encourage the growth of damaging fungi. "There are only a fixed number of hours of rain you can get in a season," said Lesk. "If too much of them are taken up by useless drizzle, it's wasted." Heavier storms, on the other hand, are better, at least up to a point. These allow water to soak thoroughly into the soil, carrying in both moisture and artificial fertilizer spread on the surface.
It is only the most extreme events that hurt crops, say the researchers: these can batter plants directly, wash fertilizer off fields, and saturate soils so thoroughly that roots cannot get enough oxygen. To study the effects of future potential rainfall patterns, the researchers used basic physical models to estimate how much heavier rains might become under different levels of warming. They found that in most cases, more rain would, as expected, come in bigger downpours, but these heavier rains would fall within the fairly wide range where they are beneficial. The most extreme, damaging rains would also increase, but would still be rare enough that the greater number of beneficial rainfalls would outweigh their effects. Because the study averaged out statistics over vast areas, and many other factors can affect crop yields, it would be hard to say exactly what the effects of future rainfall will be in any one area, said Lesk. "No single farmer would use a study like this to make decisions on what to plant or how," he said. But, as the paper concludes, the results "suggest that beyond extreme events, the crop yield response to more common rainfall intensities merits further attention." The study was coauthored by Ethan Coffel of Dartmouth College and Radley Horton of Lamont-Doherty. Funding came from the U.S. National Science Foundation, U.S. Department of Interior, and U.S. Geological Survey.
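To make the intensity ranges concrete, the short sketch below classifies hourly rainfall rates using the figures quoted in the article (roughly 20 mm/h as the heaviest typical, still-beneficial downpour and 50 mm/h as the damage threshold); the 0.5 mm/h drizzle cut-off and the label wording are illustrative assumptions, not values from the paper.

```python
# Toy classifier for hourly rainfall intensity, using thresholds quoted above.
def classify_hourly_rain(rate_mm_per_h: float) -> str:
    if rate_mm_per_h <= 0:
        return "dry"
    if rate_mm_per_h < 0.5:  # assumed drizzle cut-off, not from the study
        return "drizzle (largely wasted, may even hurt yields)"
    if rate_mm_per_h <= 50:
        return "beneficial (soaks in, carries fertilizer to roots)"
    return "extreme (batters plants, washes off fertilizer, waterlogs soil)"

for rate in [0.2, 5, 20, 60]:
    print(f"{rate:5.1f} mm/h -> {classify_hourly_rain(rate)}")
```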
|
Weather
| 2,020 |
August 6, 2020
|
https://www.sciencedaily.com/releases/2020/08/200806133513.htm
|
Analysis of renewable energy points toward more affordable carbon-free electricity
|
As more states in the U.S. push for increased reliance on variable renewable energy in the form of wind or solar power, long-term energy storage may play an important role in assuring reliability and reducing electricity costs, according to a new paper published by Caltech researchers.
|
Graduate student Jackie Dowling, who works in the lab of Nathan Lewis (BS '77), the George L. Argyros Professor and professor of chemistry, has collaborated with Ken Caldeira at the Carnegie Institution for Science and others to examine energy-storage options and multiple decades of data about wind and solar availability. Dowling and her collaborators determined that currently available battery technology is prohibitively expensive for long-term energy storage services for the power grid and that alternative technologies that can store a few weeks' to a month's worth of energy for entire seasons or even multiple years may be the key to building affordable, reliable renewable electricity systems.Energy storage is needed with renewable energy because wind and solar energy are not as reliably available as fossil fuels. For example, wind power is often at its lowest during the summer in the United States, which is when the electrical grid is strained the most by the demand for air conditioning in homes and businesses."This research is motivated by the fact that laws in several states have mandated 100 percent carbon-free electricity systems by midcentury," says Dowling, lead author of a paper about the work. "Within these mandates, a lot of states include requirements for wind and solar power. Both wind and solar are variable from day to day, or even year to year, yet high reliability is mandatory for a viable electricity system. Energy storage can fill in for the gaps between supply and demand."Dowling looked at short-duration storage systems, such as lithium-ion batteries, and long-duration storage methods, such as hydrogen storage, compressed-air storage, and pumped-storage hydroelectricity.To see how to optimize the use of those storage technologies at the lowest energy cost, Dowling built a mathematical simulation of each and incorporated historical electricity-demand data and four decades of hourly resolved historical weather data across the contiguous U.S. The Macro Energy Model, as she calls it, reveals that adding long-duration storage to a wind-solar-battery system lowers energy costs. In contrast, using batteries alone for storage makes renewable energy more expensive.Dowling says that the extra expense associated with batteries occurs because they cannot cost-effectively store enough energy for an entire season during which electricity is generated in lower amounts. That means an electrical grid would require many costlier solar panels or wind turbines to compensate and would result in wasteful idling of electricity-generation equipment for much of the year.Currently available battery technology is not even close to being cost effective for seasonal storage, Dowling says."The huge dip in wind power in the summer in the U.S. is problematic, and batteries are not suitable for filling that gap. So, if you only have batteries, you have to overbuild wind or solar capacity," she says. "Long-duration storage helps avoid the need to overbuild power generation infrastructure and provides electricity when people need it rather than only when nature provides it. 
At current technology costs, storage in underground caverns of green hydrogen generated by water electrolysis would provide a cost-effective approach for long-duration grid storage." Other researchers have built renewable energy models, but the team's data-driven approach is the first to incorporate four decades of historical wind and solar variability data, thus factoring in variability from year to year and periodic episodes of rare weather events that affect power generation, such as wind and solar droughts. "The more years of data we use in our models, the more we find a compelling need for long-term storage to get the reliability that we expect from an electricity system," she says. Dowling suggests her findings may be helpful to policy makers in states with 100 percent carbon-free electricity laws and high wind/solar mandates and to other U.S. states considering the adoption of similar laws. In the future, she plans to extend her research to take an in-depth look at the roles that specific types of energy storage, such as hydrogen or redox flow batteries, can play in renewable energy systems. For instance, some types of batteries might effectively serve as medium-duration energy storage, she says.
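To illustrate why cost per kilowatt-hour of storage capacity dominates the seasonal-storage question, here is a back-of-envelope sketch; the shortfall size and the capacity costs are rough illustrative assumptions, not figures from the Caltech study or the Macro Energy Model.

```python
# Back-of-envelope sketch: capacity cost of covering a hypothetical seasonal gap.
# All numbers below are assumptions chosen only to show the order-of-magnitude gap
# between battery and cavern-style storage costs.
seasonal_gap_kwh = 10e9          # hypothetical 10 TWh summer wind shortfall
li_ion_cost_per_kwh = 200.0      # $/kWh of storage capacity (assumed)
h2_cavern_cost_per_kwh = 2.0     # $/kWh of storage capacity (assumed)

for name, cost in [("lithium-ion batteries", li_ion_cost_per_kwh),
                   ("hydrogen caverns", h2_cavern_cost_per_kwh)]:
    print(f"{name}: ${seasonal_gap_kwh * cost / 1e9:,.0f} billion for capacity alone")
```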
|
Weather
| 2,020 |
August 5, 2020
|
https://www.sciencedaily.com/releases/2020/08/200805110106.htm
|
Climate change may melt the 'freezers' of pygmy owls and reduce their overwinter survival
|
Ecologists at the University of Turku, Finland, have discovered that the food hoards pygmy owls collect in nest-boxes ("freezers") for winter can rot due to high precipitation from heavy autumn rains, particularly if hoarding was initiated early in the autumn. The results of the study show that climate change may impair predators' foraging and thus decrease local overwinter survival. The study has been published in an internationally esteemed journal.
|
Doctoral Candidate Giulia Masoero together with co-authors from the Department of Biology at the University of Turku analysed the unique long-term data set collected by Professor Erkki Korpimäki and his research group in 2003-2018. The aim was to study how the changing weather conditions in late autumn and winter affect the initiation of pygmy owls' food hoarding as well as the accumulation, use and preservability of the hoarded food. The data set was collected from the Kauhava region in South Bothnia, with over 500 food hoards and a research area covering 1,000 square kilometres. Pygmy owls are small predators that feed on small mammals, especially voles, which are their main prey, and on birds. Pygmy owls start hoarding for winter usually in late October, when the temperature drops below 0°C. They hoard a large amount of prey in tree cavities or nest boxes. The food stores may be located in multiple nest boxes some kilometres apart. Female owls, which are larger than males, as well as young owls accumulate larger food stores than males and adult owls. According to Erkki Korpimäki, the hoarded food is important for pygmy owls during winter, when small mammal prey are under snow and birds are scarce. "This hoarding behaviour is highly susceptible to global warming because the weather during autumn and winter can affect the condition and usability of the food stores. In several northern areas, autumns have already become warmer and winters milder and rainy. Predictions show that climate change will likely continue along this path and the length of winter will strongly decrease." According to the study, the more rainy days there are between mid-October and mid-December, the more likely the food hoards of pygmy owls are to rot. The owls use rotten food hoards particularly during poor vole years. However, the study showed that having rotten food hoards reduced the recapture probability of female owls in the study area, meaning female owls either die or are forced to leave the area. "This result indicates that either the use of the rotten low-quality food and/or the energy waste linked with collecting a large food store that will not be used can lead to lower survival or dispersal from the study area," notes Doctoral Candidate Giulia Masoero. Pygmy owls might be partly able to adapt to climate change by delaying food hoarding but will more likely suffer due to the changes caused by the warming climate. The results of the study together with global climate predictions thus suggest that climate change has the potential to strongly impair the foraging behaviour and food intake of wintering predators, likely having negative impacts on the boreal forest food web as a whole.
|
Weather
| 2,020 |
August 3, 2020
|
https://www.sciencedaily.com/releases/2020/08/200803120146.htm
|
Germany-wide rainfall measurements by utilizing the mobile network
|
Whether in flood early-warning systems or in agriculture -- rainfall measurements are of great importance. However, there is a lack of accurate data for many regions in the world due to the fact that comprehensive measurements have so far been too expensive. This could change with a new method that has just passed its practical test. Researchers at KIT (Karlsruhe Institute of Technology) and the University of Augsburg have succeeded in utilizing the commercial microwave link network (CML) operated by mobile network providers for Germany-wide rainfall measurements. This new technology is now planned to be used in West Africa. The team published their results in peer-reviewed scientific journals.
|
Rain can significantly impair the performance of a mobile network. But a phenomenon that can cause headaches for telecommunications companies is a stroke of luck for meteorological research: "We have developed a completely new method for rain measurement from this interaction between weather events and human technology," says Professor Harald Kunstmann from the Institute of Meteorology and Climate Research -- Atmospheric Environmental Research (IMK-IFU), the so-called Campus Alpin of KIT. "If a commercial microwave link network (CML) is in place, we neither need a new infrastructure nor additional ground staff." Together with scientists from the University of Augsburg, his KIT team now succeeded in performing the first Germany-wide rainfall measurement with the new method: They were able to derive rainfall maps with high temporal resolution based on the precipitation-caused attenuation of the CMLs between several thousand mobile phone masts. "A comparison with the measurements of the German Meteorological Service shows that we have achieved a high degree of correlation," explains Maximilian Graf, member of the research team. Precipitation could be determined thanks to the CML antennas installed in mobile phone masts for signal transmission over long distances. "A frequency of 15 to 40 gigahertz is used here. Its wavelength corresponds to the typical size of raindrops," explains Dr. Christian Chwala, who coordinates this research work at the University of Augsburg. "Increasing precipitation weakens the signal that radio masts use to exchange information. Over one year, we measured the current attenuation obtained from 4,000 CMLs with a temporal resolution of 1 minute. The resulting data set is unique in its resolution and enormous size." Besides the classical methods of data analysis, the researchers used artificial intelligence (AI) to filter the rain signal from the noisy measurement results. "Other factors, such as wind or the sun, can also cause a slight attenuation of the signal. With the help of our AI, we were able to identify the signal attenuation that was due to rainfall," says Julius Polz, another scientist from the research group. "We have now trained our AI in such a way that we no longer need to calibrate the system using traditional methods of rain measurement." Thus, it is suitable for application in regions without significant rainfall measurements that could be considered for AI training, such as West Africa. For Germany, however, the method works mainly in spring, summer, and fall. "This is because sleet and freezing rain cause a higher attenuation than liquid precipitation, and snow cannot be measured with the CML network at all," explains Harald Kunstmann. Several projects are currently underway where the researchers will measure rainfall using CMLs, with one particular focus on Germany, in cooperation with the German Meteorological Service and the Office for the Environment of the state of Saxony. In the course of the summer, further projects will start in the Czech Republic and in Burkina Faso, where a nationwide collection of CML data is to be established for the first time in Africa.
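The retrieval idea can be sketched as follows: rain-induced attenuation along a microwave link approximately follows a power law in the rain rate, which can be inverted once the dry-weather baseline of the link is known. The coefficients and example numbers below are rough illustrative assumptions for frequencies in the tens of gigahertz, not the values used in the KIT/Augsburg processing chain.

```python
# Sketch of the core retrieval: invert the approximate power law A = k * R**alpha
# (specific attenuation in dB/km as a function of rain rate R in mm/h).
def rain_rate_from_attenuation(total_loss_db, baseline_db, length_km,
                               k=0.1, alpha=1.0):
    """Path-averaged rain rate (mm/h) from excess link attenuation (assumed k, alpha)."""
    excess = max(total_loss_db - baseline_db, 0.0)   # attenuation caused by rain
    specific = excess / length_km                    # dB per km along the link
    return (specific / k) ** (1.0 / alpha)

# Example: a 10 km link that normally loses 40 dB is currently losing 55 dB.
print(rain_rate_from_attenuation(55.0, 40.0, 10.0))  # ~15 mm/h under these assumptions
```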
|
Weather
| 2,020 |
July 30, 2020
|
https://www.sciencedaily.com/releases/2020/07/200729114858.htm
|
U.S. should consider 'stay-at-home' cooling options during pandemic
|
A new study from Australian scientists at the forefront of climate and health modelling suggests electric fans and water dousing could be a viable stay-at-home cooling strategy as the United States (US) anticipates extreme heat.
|
The risk of two major public health threats converging -- a heatwave and the COVID-19 pandemic -- is quickly becoming a reality as the US approaches its hottest month of the year with COVID-19 cases continuing to rise. "Authorities have acknowledged that the usual strategies recommended to protect individuals from heat-related illness such as seeking refuge in air-conditioned places including dedicated cooling centers or shopping malls risk further transmission of the virus," said senior author Associate Professor Ollie Jay from the University of Sydney (Australia). "We also know that many of those who are most at-risk of COVID-19 are those also at-risk of heat-related illness, such as the elderly and those with cardiovascular or respiratory conditions." "There is an urgent need for low-cost, accessible cooling strategies to protect the most vulnerable from heat-related illness and the spread of SARS-CoV-2. Our study challenges outdated public health advice suggesting that fans are not beneficial in extreme heat." In the new study, the researchers used modelling based on clinical trials and historic weather data. They conclude that this solution could potentially be recommended by health authorities as a safe and effective stay-at-home cooling strategy across vast swathes of the Northeast, Southeast and Midwest regions of the United States, as well as the West Coast. The researchers acknowledge that while the modelling is conservative (based on the physiology of an older adult), it can only show when fans would or would not be effective based on climate. As with many public health interventions, an individual's response to the cooling strategy may be influenced by their health status. The modelling draws on the team's previously published trials, conducted on participants in a climate-controlled chamber, which show that in hot, humid conditions fans lower core body temperature and cardiovascular strain and improve thermal comfort. However, fans can be detrimental in very hot, dry conditions. These data were analysed alongside historic weather data (temperature and associated relative humidity) recorded at any point over the last 20 years (2000 to 2019 inclusive) for 105 of the most populated metropolis areas in the United States. First author and PhD student Lily Hospers said: "Importantly, this research proposes a potential cooling strategy that can be used at home during the current pandemic, therefore circumventing the need for potentially risky excursions into public spaces and expensive home-based air-conditioning."
|
Weather
| 2,020 |
July 29, 2020
|
https://www.sciencedaily.com/releases/2020/07/200729205009.htm
|
Report provides new framework for understanding climate risks, impacts to US agriculture
|
Agricultural production is highly sensitive to weather and climate, which affect when farmers and land managers plant seeds or harvest crops. These conditions also factor into decision-making, when people decide to make capital investments or plant trees in an agroforestry system.
|
A new report from the U.S. Department of Agriculture focuses on how agricultural systems are impacted by climate change and offers a list of 20 indicators that provide a broad look at what's happening across the country.The report, "Climate Indicators for Agriculture," is co-authored by Colorado State University's Peter Backlund, associate director of the School of Global Environmental Sustainability.Backlund said the research team started with the scientific fact that climate change is underway."We looked at the U.S. agricultural system and examined the climate stresses," he said. "This report outlines data that farmers and land managers can use to understand how climate change is affecting their operations, and, hopefully, guide the development of effective adaptation."In the report, the authors outline how the changes taking place in agriculture affect the system that many people make their livelihoods from."We want to help farmers, ranchers and land managers adapt better under climate change, which requires understanding what is actually happening on the ground. These indicators offer ways to measure the impacts of change," said Backlund.The climate indicators described in the report are arranged in five categories, including physical (extreme precipitation and nighttime air temperature), crop and livestock (animal heat stress and leaf wetness duration), biological (insect infestation in crops, crop pathogens), phenological (timing of budbreak in fruit trees, disease vectors in livestock) and socioeconomic (crop insurance payments, heat-related mortality of agricultural workers).Backlund said the research team chose these indicators based on the strength of their connection to climate change and availability of long-term data, which is needed to identify how impacts are changing over time and whether adaptive actions are having the desired effect."There had to be a measurement of a variable strongly coupled with climate," he said. "As we go forward, we will better understand the impact of climate change by using these indicators."Researchers opted to include nighttime air temperatures as opposed to general temperature because nighttime temperatures have a big effect on the way plants develop.Some of the indicators have national data, while others are more regional. Heat stress on livestock, a huge issue for feedlot operators, will be of interest to farmers and ranchers in states including Colorado."Heat interferes with the rate of reproduction and rate of weight gain," Backlund said. "This presses on the whole operation; it's not just that a few more animals will die from getting too hot."The crop insurance payment indicator offers insight on the repercussions of climate events."You can see if you have a big climate event, like drought, one region will be much more affected than another," he said. "If farmers have good irrigation, they'll be much more capable in dealing with periods of low rainfall."Backlund said the indicator covering weed range and intensity was also notable. As carbon dioxide concentrations increase, researchers are seeing extreme northern migrations and expanded ranges for weeds.
|
Weather
| 2,020 |
July 29, 2020
|
https://www.sciencedaily.com/releases/2020/07/200729124404.htm
|
Breakthrough method for predicting solar storms
|
Extensive power outages and satellite blackouts that affect air travel and the internet are some of the potential consequences of massive solar storms. These storms are believed to be caused by the release of enormous amounts of stored magnetic energy due to changes in the magnetic field of the sun's outer atmosphere -- something that until now has eluded scientists' direct measurement. Researchers believe this recent discovery could lead to better "space weather" forecasts in the future.
|
"We are becoming increasingly dependent on space-based systems that are sensitive to space weather. Earth-based networks and the electrical grid can be severely damaged if there is a large eruption," says Tomas Brage, Professor of Mathematical Physics at Lund University in Sweden.Solar flares are bursts of radiation and charged particles, and can cause geomagnetic storms on Earth if they are large enough. Currently, researchers focus on sunspots on the surface of the sun to predict possible eruptions. Another and more direct indication of increased solar activity would be changes in the much weaker magnetic field of the outer solar atmosphere -- the so-called Corona.However, no direct measurement of the actual magnetic fields of the Corona has been possible so far."If we are able to continuously monitor these fields, we will be able to develop a method that can be likened to meteorology for space weather. This would provide vital information for our society which is so dependent on high-tech systems in our everyday lives," says Dr Ran Si, post-doc in this joint effort by Lund and Fudan Universities.The method involves what could be labelled a quantum-mechanical interference. Since basically all information about the sun reaches us through "light" sent out by ions in its atmosphere, the magnetic fields must be detected by measuring their influence on these ions. But the internal magnetic fields of ions are enormous -- hundreds or thousands of times stronger than the fields humans can generate even in their most advanced labs. Therefore, the weak coronal fields will leave basically no trace, unless we can rely on this very delicate effect -- the interference between two "constellations" of the electrons in the ion that are close -- very close -- in energy.The breakthrough for the research team was to predict and analyze this "needle in the haystack" in an ion (nine times ionized iron) that is very common in the corona.The work is based on state-of-the art calculations performed in the Mathematical Physics division of Lund University and combined with experiments using a device that could be thought of as being able to produce and capture small parts of the solar corona -- the Electron Beam Ion Trap, EBIT, in Professor Roger Hutton's group in Fudan University in Shanghai."That we managed to find a way of measuring the relatively weak magnetic fields found in the outer layer of the sun is a fantastic breakthrough," concludes Tomas Brage.
|
Weather
| 2,020 |
July 24, 2020
|
https://www.sciencedaily.com/releases/2020/07/200724161452.htm
|
Alaska is getting wetter: That's bad news for permafrost and the climate
|
Alaska is getting wetter. A new study spells out what that means for the permafrost that underlies about 85% of the state, and the consequences for Earth's global climate.
|
The study was published today in a Nature Publishing Group journal. As Siberia remains in the headlines for record-setting heat waves and wildfires, Alaska is experiencing the rainiest five years in its century-long meteorological record. Extreme weather on both ends of the spectrum -- hot and dry versus cool and wet -- is driven by an aspect of climate change called Arctic amplification. As the Earth warms, temperatures in the Arctic rise faster than the global average. While the physical basis of Arctic amplification is well understood, it is less known how it will affect the permafrost that underlies about a quarter of the Northern Hemisphere, including most of Alaska. Permafrost locks about twice the carbon that is currently in the atmosphere into long-term storage and supports Northern infrastructure like roads and buildings, so understanding how a changing climate will affect it is crucial for both people living in the Arctic and those in lower latitudes. "In our research area the winter has lost almost three weeks to summer," says study lead author and Fairbanks resident Thomas A. Douglas, who is a scientist with the U.S. Army Cold Regions Research and Engineering Laboratory. "This, along with more rainstorms, means far more wet precipitation is falling every summer." Over the course of five years, the research team took 2,750 measurements of how far below the land's surface permafrost had thawed by the end of summer across a wide range of environments near Fairbanks, Alaska. The five-year period included two summers with average precipitation, one that was a little drier than usual, and the wettest and third-wettest summers on record. Differences in annual rainfall were clearly imprinted in the amount of permafrost thaw. More rainfall led to deeper thaw across all sites. After the wettest summer in 2014, permafrost didn't freeze back to previous levels even after subsequent summers were drier. Wetlands and disturbed sites, like trail crossings and clearings, showed the most thaw. Tussock tundra, with its deep soils and covering of tufted grasses, has been found to provide the most ecosystem protection of permafrost. While permafrost was frozen closest to the surface in tussock tundra, it experienced the greatest relative increase in the depth of thaw in response to rainfall, possibly because water could pool on the flat surface. Forests, especially spruce forests with thick sphagnum moss layers, were the most resistant to permafrost thaw. Charlie Koven, an Earth system modeler with the Lawrence Berkeley National Laboratory, used the field measurements to build a heat balance model that allowed the team to better understand how rain was driving heat down into the permafrost ground. The study demonstrates how land cover types govern relationships between summer rainfall and permafrost thaw. As Alaska becomes warmer and wetter, vegetation cover is projected to change and wildfires will disturb larger swathes of the landscape. Those conditions may lead to a feedback loop between more permafrost thaw and wetter summers. In the meantime, rainfall -- and the research -- continue. Douglas says, "I was just at one of our field sites and you need hip waders to get to areas that used to be dry or only ankle deep with water. It is extremely wet out there. 
So far this year we have almost double the precipitation of a typical year." "This study adds to the growing body of knowledge about how extreme weather -- ranging from heat spells to intense summer rains -- can disrupt foundational aspects of Arctic ecosystems," says Merritt Turetsky, Director of the University of Colorado Boulder's Institute of Arctic and Alpine Research (INSTAAR) and a coauthor of the study. "These changes are not occurring gradually over decades or lifetimes; we are watching them occur over mere months to years."
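As a rough illustration of how extra summer rain can deepen thaw, the sketch below combines a textbook Stefan-type thaw-depth estimate with a crude term for heat carried into the ground by rain. All parameter values are generic assumptions for illustration; this is not the study's heat balance model.

```python
# Illustrative only: Stefan-type end-of-summer thaw depth plus a crude rain-heat term.
import numpy as np

def thaw_depth_m(thawing_degree_days, rain_mm,
                 k_thawed=1.0,          # W/(m K), assumed thermal conductivity
                 latent_heat=3.34e8,    # J/m^3, volumetric latent heat of ice
                 ice_content=0.3,       # assumed volumetric ice fraction of soil
                 rain_temp_c=8.0):      # assumed temperature of summer rain
    """Very rough end-of-season thaw depth (m) under stated assumptions."""
    seconds = thawing_degree_days * 86400.0
    stefan = np.sqrt(2.0 * k_thawed * seconds / (latent_heat * ice_content))
    # Heat advected by rain (J/m^2) melts a little extra ice at the thaw front
    rain_heat = (rain_mm / 1000.0) * 1000.0 * 4186.0 * rain_temp_c
    extra = rain_heat / (latent_heat * ice_content)
    return stefan + extra

print(thaw_depth_m(1500.0, 200.0), thaw_depth_m(1500.0, 400.0))  # wetter -> deeper
```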
|
Weather
| 2,020 |
July 17, 2020
|
https://www.sciencedaily.com/releases/2020/07/200717101026.htm
|
Reduction in commercial flights due to COVID-19 leading to less accurate weather forecasts
|
Weather forecasts have become less accurate during the COVID-19 pandemic due to the reduction in commercial flights, according to new research.
|
The new study appears in an AGU journal. Aircraft typically inform weather forecasts by recording information about air temperature, relative humidity, air pressure and wind along their flight path. With significantly fewer planes in the sky this spring, forecasts of these meteorological conditions have become less accurate and the impact is more pronounced as forecasts extend further out in time, according to the study, which is part of an ongoing special collection of research in AGU journals related to the current pandemic. Weather forecasts are an essential part of daily life, but inaccurate forecasts can also impact the economy, according to Ying Chen, a senior research associate at the Lancaster Environment Centre in Lancaster, United Kingdom, and lead author of the new study. The accuracy of weather forecasts can impact agriculture as well as the energy sector and stability of the electrical grid. Wind turbines rely on accurate forecasts of windspeed, and energy companies depend on temperature forecasts to predict what the energy load will be each day as people crank up their air conditioning. "If this uncertainty goes over a threshold, it will introduce unstable voltage for the electrical grid," Chen said. "That could lead to a blackout, and I think this is the last thing we want to see in this pandemic." The regions where forecasts have been most affected are those with normally heavy air traffic, like the United States, southeast China and Australia, as well as isolated regions like the Sahara Desert, Greenland and Antarctica. Western Europe is a notable exception: its weather forecasts have been relatively unaffected despite the number of aircraft over the region dropping by 80-90%. This was surprising, Chen said. Chen suspects the region has been able to avoid inaccuracies because it has a densely packed network of ground-based weather stations and balloon measurements to compensate for the lack of aircraft. "It's a good lesson which tells us we should introduce more observation sites, especially in the regions with sparse data observations," Chen said. "This will help us to buffer the impacts of this kind of global emergency in the future." Chen also found precipitation forecasts around the world have not been significantly affected, because rainfall forecasts have been able to rely on satellite observations. But March, April and May have been relatively dry this year in most of the world, so Chen cautions that precipitation forecasts could potentially suffer as the hurricane and monsoon seasons arrive. Forecast models are more accurate when a greater number of meteorological observations are taken into account, and the number of observations is greatly diminished when fewer planes are in the air, as was the case in March-May of this year. The Aircraft Meteorological Data Relay program includes over 3,500 aircraft and 40 commercial airlines, which typically provide over 700,000 meteorological reports a day. When Chen compared the accuracy of weather forecasts from March-May 2020 to the same periods in 2017, 2018 and 2019, he found the 2020 forecasts were less accurate for temperature, relative humidity, windspeed and air pressure. This is despite the fact that in February, before flights were significantly impacted, weather forecasts were more accurate than in previous years. He found surface pressure and wind speed forecasts were unaffected in the short term (1-3 days) but were less accurate for the longer-term (4-8 days) forecasts included in the study. 
In February, before the number of flights dropped off, forecast accuracy in several regions that rely on aircraft observations had actually improved by up to 1.5 degrees Celsius (about 2.7 degrees Fahrenheit) over previous years. But in March-May 2020, when flights were reduced by 50-75% compared to February, that improvement in accuracy vanished. Chen found western Europe was the only region with normally high flight traffic that did not suffer remarkably reduced accuracy in temperature forecasts. He attributed this to over 1,500 meteorological stations that form a dense data collection network in the area. However, European weather was particularly unvarying over the March-May 2020 time period, making it easier to forecast with less data, according to Jim Haywood, a professor of atmospheric science at the University of Exeter, United Kingdom, who was not involved with the new study. Haywood suspects this played a role in the persisting accuracy of western European forecasts in addition to the network of ground observation points. The longer forecasters lack aircraft data, the more weather forecasts will be impacted, according to the study. While precipitation forecasts have so far been unaffected, scientists' ability to catch early warning signs of extreme weather events this summer could suffer. In the long term, the study results suggest sources of weather data should be diversified, especially in observation-sparse areas and areas that rely heavily on commercial flights, according to Chen.
|
Weather
| 2,020 |
July 16, 2020
|
https://www.sciencedaily.com/releases/2020/07/200716123000.htm
|
Heat stress: The climate is putting European forests under sustained pressure
|
No year since weather records began was as hot and dry as 2018. A first comprehensive analysis of the consequences of this drought and heat event shows that central European forests sustained long-term damage. Even tree species considered drought-resistant, such as beech, pine and silver fir, suffered. The international study was directed by the University of Basel, which is conducting a forest experiment unique in Europe.
|
Until now, 2003 was the driest and hottest year since regular weather records began. That record has now been broken. A comparison of climate data from Germany, Austria and Switzerland shows that 2018 was significantly warmer. The average temperature during the vegetation period was 1.2°C above the 2003 value and as high as 3.3°C above the average of the years from 1961 to 1990. Part of the analysis, which has now been published, includes measurements taken at the Swiss Canopy Crane II research site in Basel, where extensive physiological investigations were carried out in tree canopies. The goal of these investigations is to better understand how and when trees are affected by a lack of water in order to counter the consequences of climate change through targeted management measures. Trees lose a lot of water through their surfaces. If the soil also dries out, the tree cannot replace this water, which is shown by the negative suction tension in the wood's vascular tissue. It's true that trees can reduce their water consumption, but if the soil water reservoir is used up, it's ultimately only a matter of time until cell dehydration causes the death of a tree. Physiological measurements at the Basel research site have shown the researchers that the negative suction tension and water shortage in trees occurred earlier than usual. In particular, this shortage was more severe throughout all of Germany, Austria and Switzerland than ever measured before. Over the course of the summer, severe drought-related stress symptoms therefore appeared in many tree species important to forestry. Leaves wilted, aged and were shed prematurely. The true extent of the summer heatwave became evident in 2019: many trees no longer formed new shoots -- they were partially or wholly dead. Others had survived the stress of the drought and heat of the previous year, but were increasingly vulnerable to bark beetle infestation or fungus. Trees with partially dead canopies, which reduced the ability to recover from the damage, were particularly affected. "Spruce was most heavily affected. But it was a surprise for us that beech, silver fir and pine were also damaged to this extent," says lead researcher Professor Ansgar Kahmen. Beech in particular had until then been classified as the "tree of the future," although its supposed drought resistance has been subject to contentious discussion since the 2003 heatwave. According to the latest projections, precipitation in Europe will decline by up to a fifth by 2085, and drought and heat events will become more frequent. Redesigning forests is therefore essential. "Mixed woodland is often propagated," explains plant ecologist Kahmen, "and it certainly has many ecological and economic advantages. But whether mixed woodland is also more drought-resistant has not yet been clearly proven. We still need to study which tree species are good in which combinations, including from a forestry perspective. That will take a long time." Another finding of the study is that it is only possible to record the impacts of extreme climate events on European forests to a limited extent using conventional methods, and thus new analytical approaches are needed. "The damage is obvious. More difficult is precisely quantifying it and drawing the right conclusions for the future," says Kahmen. Earth observation data from satellites could help track tree mortality on a smaller scale. 
Spatial patterns that contain important ecological and forestry-related information can be derived from such data: which tree species were heavily impacted, when and at which locations, and which survived without damage? "A system like this already exists in some regions in the US, but central Europe still lacks one."
|
Weather
| 2,020 |
July 16, 2020
|
https://www.sciencedaily.com/releases/2020/07/200716101603.htm
|
Scientists predict dramatic increase in flooding, drought in California
|
California may see a 54 percent increase in rainfall variability by the end of this century, according to new research from the lab of Assistant Professor Da Yang, a 2019 Packard Fellow and atmospheric scientist with the University of California, Davis.
|
The newly published study explores the Madden-Julian oscillation (MJO), an atmospheric phenomenon that influences rainfall in the tropics and can trigger everything from cyclones over the Indian Ocean to heatwaves, droughts and flooding in the United States. Yang, Zhou and their team show that as the Earth's climate warms, the dynamics controlling the MJO will expand eastward and cause a huge uptick in extreme weather in California. "I was surprised by the magnitude of the effect," said Yang, an assistant professor with the UC Davis Department of Land, Air and Water Resources. "A 54 percent increase in rainfall variability will have very significant impacts on agriculture, flood control and water management." In 2019, Yang was among 22 early-career scientists and engineers nationwide to receive the Packard Fellowship. He is the first recipient of the award for the UC Davis College of Agricultural and Environmental Sciences. This study was supported by the David and Lucille Packard Foundation and by the project "Toward Accurately Predicting California Hydroclimate by Cracking the Tropical Storm King," which is funded by the U.S. Department of Energy. Yang and his team use satellite observations and computer models to study the physics of rainstorms and atmospheric circulation in a changing climate. They are working to understand what environmental factors control the size and duration of rainstorms and how the collective effects of rainstorms, in turn, shape Earth's climate.
|
Weather
| 2,020 |
July 7, 2020
|
https://www.sciencedaily.com/releases/2020/07/200707134204.htm
|
Breakthrough machine learning approach quickly produces higher-resolution climate data
|
Researchers at the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL) have developed a novel machine learning approach to quickly enhance the resolution of wind velocity data by 50 times and solar irradiance data by 25 times -- an enhancement that has never been achieved before with climate data.
|
The researchers took an alternative approach by using adversarial training, in which the model produces physically realistic details by observing entire fields at a time, providing high-resolution climate data at a much faster rate. This approach will enable scientists to complete renewable energy studies in future climate scenarios faster and with more accuracy."To be able to enhance the spatial and temporal resolution of climate forecasts hugely impacts not only energy planning, but agriculture, transportation, and so much more," said Ryan King, a senior computational scientist at NREL who specializes in physics-informed deep learning.King and NREL colleagues Karen Stengel, Andrew Glaws, and Dylan Hettinger authored a new article detailing their approach, titled "Adversarial super-resolution of climatological wind and solar data," which appears in the journal Proceedings of the National Academy of Sciences of the United States of America.Accurate, high-resolution climate forecasts are important for predicting variations in wind, clouds, rain, and sea currents that fuel renewable energies. Short-term forecasts drive operational decision-making; medium-term weather forecasts guide scheduling and resource allocations; and long-term climate forecasts inform infrastructure planning and policymaking.However, it is very difficult to preserve temporal and spatial quality in climate forecasts, according to King. The lack of high-resolution data for different scenarios has been a major challenge in energy resilience planning. Various machine learning techniques have emerged to enhance the coarse data through super resolution -- the classic imaging process of sharpening a fuzzy image by adding pixels. But until now, no one had used adversarial training to super-resolve climate data."Adversarial training is the key to this breakthrough," said Glaws, an NREL postdoc who specializes in machine learning.Adversarial training is a way of improving the performance of neural networks by having them compete with one another to generate new, more realistic data. The NREL researchers trained two types of neural networks in the model -- one to recognize physical characteristics of high-resolution solar irradiance and wind velocity data and another to insert those characteristics into the coarse data. Over time, the networks produce more realistic data and improve at distinguishing between real and fake inputs. The NREL researchers were able to add 2,500 pixels for every original pixel."By using adversarial training -- as opposed to the traditional numerical approach to climate forecasts, which can involve solving many physics equations -- it saves computing time, data storage costs, and makes high-resolution climate data more accessible," said Stengel, an NREL graduate intern who specializes in machine learning.This approach can be applied to a wide range of climate scenarios from regional to global scales, changing the paradigm for climate model forecasting.NREL is the U.S. Department of Energy's primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for the Energy Department by the Alliance for Sustainable Energy, LLC.
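For readers curious what "adversarial training" looks like in practice, the sketch below shows the general pattern in Python/PyTorch. It is an illustrative toy, not NREL's published model: the tiny networks, the 4x-per-axis upsampling factor, the loss weighting and the random stand-in wind fields are all assumptions made for brevity. What carries over is the idea the article describes: a generator that super-resolves coarse fields competes with a discriminator that judges whole fields, alongside an ordinary content loss.

import torch
import torch.nn as nn

UP = 4  # illustrative upsampling factor per axis (not the paper's 50x/25x setup)

gen = nn.Sequential(                       # generator: coarse (u, v) wind field -> fine field
    nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=UP, mode="nearest"),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 2, 3, padding=1),
)
disc = nn.Sequential(                      # discriminator: fine field -> real/fake score
    nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(1),
)

bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()
g_opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)

coarse = torch.randn(8, 2, 10, 10)          # stand-in coarse wind fields (u, v channels)
fine = torch.randn(8, 2, 10 * UP, 10 * UP)  # matching stand-in high-resolution fields

for step in range(200):
    # discriminator: learn to tell real high-resolution fields from generated ones
    fake = gen(coarse).detach()
    d_loss = bce(disc(fine), torch.ones(8, 1)) + bce(disc(fake), torch.zeros(8, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # generator: stay close to the truth (content loss) while fooling the discriminator
    fake = gen(coarse)
    g_loss = mse(fake, fine) + 1e-3 * bce(disc(fake), torch.ones(8, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()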
|
Weather
| 2,020 |
July 1, 2020
|
https://www.sciencedaily.com/releases/2020/07/200701125413.htm
|
Knowledge of severe storm patterns may improve tornado warnings
|
A radar signature may help distinguish which severe storms are likely to produce dangerous tornadoes, potentially leading to more accurate warnings, according to scientists.
|
"Identifying which storms are going to produce tornadoes and which are not has been a problem meteorologists have been trying to tackle for decades," said Scott Loeffler, a graduate student in the Department of Meteorology and Atmospheric Science at Penn State. "This new research may give forecasters another tool in their toolbox to do just that."Scientists analyzed radar data from more than a hundred supercell thunderstorms, the most prolific producers of violent tornadoes, and found a statistically significant difference in the structure of storms that produced a tornado and those that did not.Weather radar constantly monitors storms across the country, and data similar to that used in the study are readily available to operational forecasters who issue warnings, the scientists note."These findings have potentially large implications for the accuracy and confidence of tornado warnings and public safety during severe storms," said Matthew Kumjian, associate professor of meteorology at Penn State and Loeffler's adviser. "We look forward to getting this information in the hands of operational meteorologists to assess the impact it has."Tornado warning times have improved over the last several decades, thanks in part to numerical modeling research and intensive field campaigns, but decision-makers often must rely on readily available information like radar data when issuing storm warnings, the scientists said. Previous efforts using conventional radar have struggled to distinguish between tornadic and nontornadic supercells.According to the researchers, in 2013, the U.S. upgraded its radar network to include polarimetric capabilities, which provide additional information about storms, including revealing the shape and size of raindrops.Using this information, the scientists compared areas with large, sparse raindrops and regions dense with smaller drops within supercell storms. The orientation of these two areas was significantly different in tornadic and nontornadic supercells, the researchers reported in the journal "We found for nontornadic supercells, the orientation of the separation between these two areas tended to be more parallel to the direction of the storm's motion," Loeffler said. "And for tornadic supercells, the separation tended to be more perpendicular. So we saw this shift in the angles, and we saw this as a consistent trend."Loeffler said the algorithm from the study can easily be adapted so operational forecasters could use the program in real time with the latest radar data available."Many factors go into issuing a tornado warning, but perhaps knowing the orientation in real time could help them make a decision to pull the trigger or to hold off," he said.The scientists said while the signatures are promising, further numerical modeling studies are needed to understand better the relationship between the orientations and tornado formation.Michael Jurewicz, a meteorologist with the National Weather Service and Michael French, assistant professor at Stony Brook University, contributed to the study.The National Oceanic and Atmospheric Administration and the National Science Foundation supported this research.
|
Weather
| 2,020 |
June 20, 2020
|
https://www.sciencedaily.com/releases/2020/06/200620141955.htm
|
How a historic drought led to higher power costs and emissions
|
Drought can mean restrictions for watering the lawn, crop losses for farmers and an increased risk of wildfires. But it can also hit you and your power company in the wallet.
|
In communities that rely on water for power generation, a drought can mean higher electricity costs and pollution linked to the loss of hydropower supplies.In a recent study, a team led by a researcher from North Carolina State University analyzed the downstream effects of a drought in California that took place in 2012-2016, and was considered one of the worst in the state's history.They found that drought led to significant increases in power costs for three major investor-owned utilities in the state, but other weather-related events were also likely the main culprit behind those increases.They also found that increased harmful emissions of greenhouse gases could be linked to hydropower losses during drought in the future, even as more sources of renewable energy are added to the grid."There is an expectation that droughts like this will happen again in the future, so there's a lot of attention on the way it impacted the state as a whole and their power system," said Jordan Kern, corresponding author of the study and an assistant professor in NC State's Department of Forestry & Environmental Resources. "We felt there was a need to understand what happens to the grid during drought, especially from a financial, economic, and environmental perspective, and we wanted to provide more clarity."In the study, Kern and scientists at the University of North Carolina at Chapel Hill developed a new software tool to model the economic and environmental impact of the drought in California, a state that relies on hydropower to supply a significant portion of its power.On average, the state uses hydropower to supply 13 percent of its energy needs, Kern and his colleagues reported. During the drought, there were lower levels of precipitation, melted snow and stream flow.As a result, hydropower accounted for just 6 percent of the state's electricity needs during the worst year of the drought. At the same time, the researchers reported that increased temperatures led to a greater demand for power for cooling.They found the drought had a "moderate" impact on the market price of electricity. But it was actually another weather event that had a larger impact on costs: a 2014 extreme cold spell known as a "polar vortex" in the eastern United States that led to increased prices for natural gas across the entire country.Researchers estimated that the loss of hydropower generation cost three main investor-owned utilities in California $1.9 billion. However, increased demand for cooling due to higher temperatures in the period probably had a larger economic cost than the lost hydropower, at $3.8 billion. Both are costs that can be passed on to consumers."We tried to figure out exactly how the drought contributed to the increased price for electricity in the market," Kern said. "We found a reasonable impact, but overall, the increase in price during the drought in California was actually due to higher natural gas prices."Researchers also evaluated whether increases in renewable energy resources, like wind and solar energy, could help prevent increases in emissions of carbon dioxide from power generation during drought in the future."Usually what happens in California during a drought is they have to turn on natural gas power plants, and there is a spike in emissions," Kern said. "It was a coincidence that the state was building more renewable energy during the drought. 
As a result of that, when they lost hydropower, they didn't have to turn on quite as many natural gas plants." While other previous research has suggested an increased dependence on power generated through wind and solar could offset drought-caused increases in carbon dioxide emissions, the researchers said that's not what they saw. They reported that even when renewable energy capacity doubled, their model showed the same increase in fossil fuel generation and carbon dioxide emissions during drought years. Kern said that while renewable energy sources result in reduced emissions overall, their analysis points to drought years causing higher emissions, even in systems with more renewable sources. "During a drought when you don't have hydropower, the grid needs other types of flexible generation," he said. "They still have to have sources of generation that can turn on when they lose hydropower, and we think it's still going to cause periodic increases in emissions, even with increased renewable energy generation." The study is part of an ongoing effort to understand the impacts of major weather events such as drought, flooding or high winds on electric power systems in order to potentially mitigate power reliability, environmental and financial risks for utilities and their customers. "My group spends a lot of time figuring out how extreme weather impacts power systems," Kern said. "What we want to know is about extreme weather that impairs functionality of the grid -- does it cause blackouts or increase costs for consumers." The study was published online.
|
Weather
| 2,020 |
June 10, 2020
|
https://www.sciencedaily.com/releases/2020/06/200610112047.htm
|
New 'sun clock' quantifies extreme space weather switch on-off
|
Extreme space weather events can significantly impact systems such as satellites, communications systems, power distribution and aviation. They are driven by solar activity which is known to have an irregular but roughly 11 year cycle. By devising a new, regular 'sun clock', researchers have found that the switch on and off of periods of high solar activity is quite sharp, and are able to determine the switch on/off times. Their analysis shows that whilst extreme events can happen at any time, they are much less likely to occur in the quiet interval.
|
The clock will help scientists to determine more precisely when the risk for solar storms is highest and help to plan for the impacts of space weather on our space infrastructure -- important since the next switch-on of activity may be imminent as solar activity moves from its current minimum. The analysis, now published, revealed that the transitions between quiet and active periods in solar activity are sharp. Once the clock is constructed from sunspot observations, it can be used to order observations of solar activity and space weather. These include the occurrence of solar flares seen in X-rays by the GOES satellites and the F10.7 solar radio flux that tracks solar coronal activity. These are all drivers of space weather on the Earth, for which the longest record is the aa index, based on magnetic field measurements in the UK and Australia going back over 150 years. All these observations show the same sharp switch-on and switch-off times of activity. Once past switch-on/off times are obtained from the clock, the occurrence rate of extreme events when the sun is active or quiet can be calculated. Lead author Professor Sandra Chapman, of the University of Warwick's Centre for Fusion, Space and Astrophysics, said: "Scientists spend their lives trying to read the book of nature. Sometimes we create a new way to transform the data and what appeared to be messy and complicated is suddenly beautifully simple. In this instance, our sun clock method showed clear 'switch on' and 'switch off' times demarcating quiet and active intervals for space weather for the first time. "Large events can happen at any time, but are much more likely around solar maximum. By cleanly ordering the observations we find that in 150 years of geomagnetic activity at Earth, only a few percent occur during these quiet conditions. "The ability to estimate the risk of a future solar superstorm occurring is vital for space and ground-based technologies that are particularly sensitive to space weather, such as satellites, communications systems, power distribution and aviation. "If you have a system sensitive to space weather you need to know how likely a big event is, and it is useful to know when we are in a quiet period as it allows maintenance and other activities that make systems temporarily more fragile." The research was co-authored by Scott Mcintosh of the National Center for Atmospheric Research, Robert Leamon of the University of Maryland and NASA's Goddard Space Flight Center, and Nick Watkins of the University of Warwick and the London School of Economics and Political Science. Robert Leamon said: "The Hilbert transform is a really powerful technique across all of science. Sandra suggested it to me on a completely different project -- this really is a serendipitous chain of events -- because of her work in lab fusion plasmas, and when we applied it to sunspots, we saw it tied to the sharp switch-on of activity that we'd seen elsewhere. Sandra then saw the switch-off looking at the aa index." Scott Mcintosh said: "We foresee that the door that this innovative work opens will lead to the development of meaningful climatologies for solar activity and improved predictability that will result from that."
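For readers curious how a Hilbert-transform "clock" can be built in practice, the sketch below maps a sunspot-like series onto a uniform cycle phase. It uses a synthetic stand-in series and a bare-bones pipeline (mean removal, analytic signal, unwrapped phase); the published clock is constructed more carefully from the real sunspot record, so treat this only as an illustration of the idea.

import numpy as np
from scipy.signal import hilbert

# Stand-in record: a synthetic, slightly irregular ~11-year cycle in monthly
# sunspot-like numbers (the real clock is built from the observed sunspot record).
months = np.arange(12 * 140)                       # 140 years of monthly values
years = months / 12.0
ssn = 80 * (1 + np.sin(2 * np.pi * years / 11 + 0.3 * np.sin(2 * np.pi * years / 90)))
ssn += 5 * np.random.default_rng(0).standard_normal(months.size)

anomaly = ssn - ssn.mean()                         # remove the mean before transforming
phase = np.unwrap(np.angle(hilbert(anomaly)))      # analytic-signal phase, made monotonic

# "Clock time": every 2*pi of phase is one solar cycle, however long it took in
# calendar years, so quiet and active intervals line up cycle after cycle.
cycle_fraction = ((phase - phase[0]) / (2 * np.pi)) % 1.0
print(round(float(cycle_fraction[-1]), 2))         # where in its cycle the series ends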
|
Weather
| 2,020 |
June 9, 2020
|
https://www.sciencedaily.com/releases/2020/06/200609122922.htm
|
Heat and humidity battle sunshine for influence over the spread of COVID-19, researchers find
|
An international team of researchers led by McMaster University has found that while higher heat and humidity can slow the spread of COVID-19, longer hours of sunlight are associated with a higher incidence of the disease, in a sign that sunny days can tempt more people out even if this means a higher risk of infection.
|
The findings were published online. While research has shown that pathogens such as influenza and SARS thrive in lower temperatures and humidity, little is known about SARS-CoV-2, the agent that causes COVID-19. "There is a lot of pressure to reopen the economy, and many people want to know if it will be safer to do so in the summer months," says Antonio Páez, a professor and researcher in McMaster's School of Geography & Earth Sciences who is lead author of the study. "Restrictions in movement, which have begun to ease around the world, hinge in part on how SARS-CoV-2 will be affected by a change in season," he says. Páez and colleagues from Spain's Universidad Politécnica de Cartagena and Brazil's Universidade Federal de Pernambuco investigated climate factors in the spread of COVID-19 in several provinces in Spain, one of the countries hardest hit by the pandemic, with more than 270,000 cases. They combined and analyzed data on reported cases of the disease and meteorological information over a period of 30 days that began immediately before a state of emergency was declared. At higher levels of heat and humidity, researchers found that for every percentage increase, there was a 3 per cent decline in the incidence of COVID-19, possibly because warmer temperatures curtail the viability of the virus. The opposite was true for hours of sunshine: more sun meant greater spread. The researchers speculate the increase may be related to human behaviour, since compliance with lockdown measures breaks down on sunnier days. They were also surprised to find rates of transmission dropped among denser populations and in areas with more older adults, suggesting those populations regard themselves as being at greater risk, and so are more likely to adhere to lockdown guidance. While older adults are more vulnerable to the disease, researchers believe they are less likely overall to contribute to the spread of the disease because they are more apt to be isolated from others because of health or mobility issues. Páez stresses that models such as the one he helped develop show that contagion of COVID-19 declines as a lockdown progresses, possibly to the vanishing point -- an argument for maintaining discipline despite the approach of pleasant weather. "We will likely see a decrease in the incidence of COVID-19 as the weather warms up, which is an argument for relaxing social distancing to take advantage of the lower incidence associated with higher temperatures," he says. "But a more conservative approach would be to use the months of summer to continue to follow strict orders to remain in place and to crush this pandemic."
|
Weather
| 2,020 |
June 9, 2020
|
https://www.sciencedaily.com/releases/2020/06/200609122916.htm
|
Study tracks decades of life cycle changes in nonwoody plants
|
For 25 years, Carol Augspurger visited a patch of ancient woods near Urbana, Illinois to look at the same 25 one-square-meter plots of earth she first demarcated for study in 1993. She surveyed the plots once a week in spring and summer, tracking the major life events of each of the herbaceous plants that grew there. In fall, she visited every other week. In winter, once a month.
|
Over the course of her study, Augspurger made nearly 600,000 observations of 43 plant species in Trelease Woods, a 60.5-acre remnant of old-growth forest in central Illinois. She noted 10 distinct developmental stages in the plants' lives, including when they emerged in spring, how long it took them to mature, when the flowers opened and died, when the leaves began to lose their greenness and when the plants went dormant. Augspurger is a professor emerita of plant biology at the University of Illinois at Urbana-Champaign. By tracking these events and their relationship to average daily temperature and precipitation records, Augspurger and her colleague, Illinois Natural History Survey statistician and plant ecologist David Zaya, found that some shifts in the timing of plant seasonal life cycle events correlated with temperature trends. The findings have been published in a scientific journal. "We marked every major life cycle event in our plants from emergence until they went dormant," Augspurger said. "And we did it for an intact community, a natural forest community." The analysis revealed that by the end of the study in 2017, the first spring plants were emerging almost four days later in March than they had in the early 1990s. But their growing seasons were getting shorter: Dormancy was occurring six days earlier. March average temperatures got slightly cooler over the same time period, but April temperatures were rising. Plants that emerged in late spring -- typically after April 1 -- were undergoing even more dramatic shifts. Their growing seasons were lengthening: The period from emergence to dormancy lasted more than 40 days longer for these plants at the end of the study than at the beginning. "For the early species, the growing season was a little bit shorter," Zaya said. "But for the late species, the growing season was 20 days longer per decade." The duration of leaf expansion and flowering was shorter for the late-spring plants, while senescence, their period of aging, got longer. During senescence, plants gradually decline in making sugar and transfer their energy stores underground, but they do not produce new leaves, flowers or fruit. The increasing duration of senescence corresponded with higher average temperatures in the fall. "It may be that the late-spring species are benefiting from changing temperature trends by being able to grow and get carbon for a longer period of time in the fall," Zaya said. "This suggests that there may be winners and losers in the plant community as a result of climate change, where some plants can respond more, some can respond less." Many of the changes seen in the plants paralleled the temperature trends, but the researchers caution that the study does not prove that changing temperatures are driving the seasonal life cycle shifts in plants. Tying any specific trend in plants to climate change is tricky, Augspurger said. "If you look at these 25 years of Illinois weather, climate change is not happening uniformly every month of the year," she said. "The minimum temperatures are changing more than the maximums, for example, and March is not changing as much as May. It's not only that the different plant species may respond in different ways to the changes, but the weather itself is changing differently." Regardless of the cause, the shifting patterns among plant species likely influence their interactions with herbivores, pollinators and each other, Augspurger said. "What they do developmentally ultimately affects their ability to survive, grow and reproduce," she said.
"If they're changing their period of activity, they're either going to have a shorter or longer time to gain carbon, to absorb nutrients and to put it together into making flowers, seeds and offspring. This can affect species' abundance and survival, and the health of the overall ecosystem."
|
Weather
| 2,020 |
June 9, 2020
|
https://www.sciencedaily.com/releases/2020/06/200609111058.htm
|
Survival of coronavirus in different cities, on different surfaces
|
One of the many questions researchers have about COVID-19 is how long the coronavirus causing the disease remains alive after someone infected with it coughs or sneezes. Once the droplets carrying the virus evaporate, the residual virus dies quickly, so the survival and transmission of COVID-19 are directly impacted by how long the droplets remain intact.
|
In a new paper, the researchers used a mathematical model well established in the field of interface science; the drying-time calculations showed that ambient temperature, type of surface and relative humidity play critical roles. For example, higher ambient temperature helped to dry out the droplet faster and drastically reduced the chances of virus survival. In places with greater humidity, the droplet stayed on surfaces longer, and the virus survival chances improved. The researchers determined the droplet drying time in different outdoor weather conditions and examined whether these data connected to the growth rate of the COVID-19 pandemic. Researchers selected New York, Chicago, Los Angeles, Miami, Sydney and Singapore and plotted the growth rate of COVID-19 patients in these cities against the drying time of a typical droplet. In the cities with a larger growth rate of the pandemic, the drying time was longer. "In a way, that could explain a slow or fast growth of the infection in a particular city. This may not be the sole factor, but definitely, the outdoor weather matters in the growth rate of the infection," said Rajneesh Bhardwaj, one of the authors. "Understanding virus survival in a drying droplet could be helpful for other transmissible diseases that spread through respiratory droplets, such as influenza A," said Amit Agrawal, another author. The study suggests that surfaces such as smartphone screens, cotton and wood should be cleaned more often than glass and steel surfaces, because the latter surfaces are relatively hydrophilic and the droplets evaporate faster on them.
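To make the temperature and humidity dependence concrete, here is a simplified, diffusion-limited estimate of sessile-droplet drying time in the small-contact-angle limit. This is a generic interface-science scaling, not necessarily the exact model used in the paper, and the diffusion coefficient, droplet size and contact angle below are illustrative assumptions.

import numpy as np

def drying_time_s(radius_m, contact_angle_rad, temp_c, rel_humidity,
                  diff_coeff=2.5e-5, rho_water=1000.0):
    """Diffusion-limited drying time of a thin sessile droplet (small contact
    angle limit): t ~ rho * R^2 * theta / (8 * D * (1 - RH) * c_sat(T))."""
    e_sat = 610.94 * np.exp(17.625 * temp_c / (temp_c + 243.04))   # Pa, Magnus formula
    c_sat = e_sat * 0.018 / (8.314 * (temp_c + 273.15))            # kg/m^3 saturated vapor
    return rho_water * radius_m ** 2 * contact_angle_rad / (
        8.0 * diff_coeff * (1.0 - rel_humidity) * c_sat)

# hotter and drier air dries the droplet faster; humid air lets it linger
print(round(drying_time_s(1e-3, 0.2, 25.0, 0.5)))   # mild, humid: roughly a minute and a half
print(round(drying_time_s(1e-3, 0.2, 35.0, 0.3)))   # hot, dry: a fraction of that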
|
Weather
| 2,020 |
June 4, 2020
|
https://www.sciencedaily.com/releases/2020/06/200604152034.htm
|
Environmental damage from fog reduction is observable from outer space, find hydrologists
|
A new study led by ecohydrologists at IUPUI has shown for the first time that it's possible to use satellite data to measure the threat of climate change to ecological systems that depend on water from fog.
|
The findings are reported in a newly published paper. "It's never been shown before that you can observe the effect of fog on vegetation from outer space," said Lixin Wang, an associate professor in the School of Science at IUPUI, who is the senior author on the study. "The ability to use the satellite data for this purpose is a major technological advance." The need to understand the relationship between fog and vegetation is urgent since environmental change is reducing fog levels across the globe. The shift most strongly affects regions that depend upon fog as a major source of water, including the redwood forests in California; the Atacama desert in Chile; and the Namib desert in Namibia, with the latter two currently recognized as World Heritage sites under the United Nations due to their ecological rarity. "The loss of fog endangers plant and insect species in these regions, many of which don't exist elsewhere in the world," said Na Qiao, a visiting student at IUPUI, who is the study's first author. "The impact of fog loss on vegetation is already very clear. If we can couple this data with large-scale impact assessments based on satellite data, it could potentially influence environmental protection policies related to these regions." The IUPUI-led study is based upon optical and microwave satellite data, along with information on fog levels from weather stations at two locations operated by the Gobabeb Namib Research Institute in the Namib desert. The satellite data were obtained from NASA and the U.S. Geological Survey. The fog readings were taken between 2015 and 2017. Wang's work with the Gobabeb facility is supported under a National Science Foundation CAREER grant. At least once a year, he and student researchers, including both graduate and undergraduate students from IUPUI, travel to the remote facility -- a two-hour drive on a dirt road from the nearest city -- to conduct field research. The study found a significant correlation between fog levels and vegetation status near both weather stations during the entire time of the study. Among other findings, the optical data from the site near the research facility revealed obvious signs of plant greening following fog, and up to 15 percent higher measures during periods of fog versus periods without fog. Similar patterns were seen at the second site, located near a local rock formation. The microwave data also showed a significant correlation between fog and plant growth near the research facility, and up to 60 percent higher measures during periods of fog versus periods without fog. The study's conclusions are based upon three methods of remotely measuring vegetation: two based upon optical data, which is sensitive to the vibrance of greens in plants, and a third based upon microwave data, which is sensitive to overall plant mass, including the amount of water in stems and leaves. Although observable by machines, the changes in vegetation color are faint enough to go undetected by the human eye. Next, the team will build upon their current work to measure the effect of fog on vegetation over longer periods of time, which will assist with future predictions. Wang also aims to study the relationship in other regions, including the redwood forests in California. "We didn't even know you could use satellite data to measure the impact of fog on vegetation until this study," he said. "If we can extend the period under investigation, that will show an even more robust relationship. 
If we have 10 years of data, for example, we can make future predictions about the strength of this relationship, and how this relationship has been changing over time due to climate change."
|
Weather
| 2,020 |
June 3, 2020
|
https://www.sciencedaily.com/releases/2020/06/200603132543.htm
|
Double-sided solar panels that follow the sun prove most cost effective
|
Solar power systems with double-sided (bifacial) solar panels -- which collect sunlight from two sides instead of one -- and single-axis tracking technology that tilts the panels so they can follow the sun are the most cost effective to date, researchers report in a study published June 3rd.
|
"The results are stable, even when accounting for changes in the weather conditions and in the costs from the solar panels and the other components of the photovoltaic system, over a fairly wide range," says first author Carlos Rodríguez-Gallegos, a research fellow at the Solar Energy Research Institute of Singapore, sponsored by the National University of Singapore. "This means that investing in bifacial and tracking systems should be a safe bet for the foreseeable future."Research efforts tend to focus on further boosting energy output from solar power systems by improving solar cell efficiency, but the energy yield per panel can also be increased in other ways. Double-sided solar panels, for example, produce more energy per unit area than their standard counterparts and can function in similar locations, including rooftops. This style of solar panel, as well as tracking technology that allows each panel to capture more light by tilting in line with the sun throughout the day, could significantly improve the energy yield of solar cells even without further advancements in the capabilities of the cells themselves. However, the combined contributions of these recent technologies have not been fully explored.To identify the global economic advantages associated with the use of a variety of paired photovoltaic technologies, Rodríguez-Gallegos and colleagues first used data from NASA's Clouds and the Earth's Radiant Energy System (CERES) to measure the total radiation that reaches the ground each day. The researchers further tailored this data to account for the influence of the sun's position on the amount of radiation a solar panel can receive based on its orientation, and then calculated the average net cost of generating electricity through a photovoltaic system throughout its lifetime. They focused on large photovoltaic farms composed of thousands of modules rather than smaller photovoltaic systems, which generally include higher associated costs per module. The team validated their model using measured values from experimental setups provided by three institutes and incorporated additional weather parameters to perform a worldwide analysis.The model suggests that double-sided solar panels combined with single-axis tracking technology is most cost effective almost anywhere on the planet, although dual-axis trackers -- which follow the sun's path even more accurately but are more expensive than single-axis trackers -- are a more favorable substitute in latitudes near the poles. But despite this technology's clear benefits, Rodríguez-Gallegos does not expect this style of photovoltaic system to become the new standard overnight."The photovoltaics market is traditionally conservative," he says. "More and more evidence points toward bifacial and tracking technology to be reliable, and we see more and more of it adopted in the field. 
Still, transitions take time, and time will have to show whether the advantages we see are attractive enough for installers to make the switch."While this work considers standard silicon-based solar cells, Rodríguez-Gallegos and colleagues next plan to analyze the potential of tracking systems combined with pricey, top-of-the-line solar materials with higher efficiencies (called tandem technologies), which are currently limited to heavy-duty concentrator photovoltaics and space applications."As long as research continues to take place, the manufacturing costs of these materials are expected to keep on decreasing, and a point in time might be reached when they become economically competitive and you might see them on your roof," says Rodríguez-Gallegos. "We then aim to be a step ahead of this potential future so that our research can be used as a guide for scientists, manufacturers, installers, and investors."
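Comparisons like these generally rest on the levelized cost of electricity (LCOE): discounted lifetime cost divided by discounted lifetime energy. The sketch below shows that arithmetic only; the capex, yield, lifetime, degradation and discount figures are invented for illustration and are not the study's numbers.

def lcoe_usd_per_kwh(capex, opex_per_year, energy_kwh_year,
                     years=25, discount=0.05, degradation=0.005):
    """Levelized cost of electricity: discounted lifetime cost divided by
    discounted lifetime energy."""
    cost, energy = float(capex), 0.0
    for t in range(1, years + 1):
        cost += opex_per_year / (1 + discount) ** t
        energy += energy_kwh_year * (1 - degradation) ** t / (1 + discount) ** t
    return cost / energy

# hypothetical 1 kW systems: the bifacial + single-axis tracker option costs more up
# front but harvests more energy each year, so its levelized cost can still be lower
print(round(lcoe_usd_per_kwh(900.0, 10.0, 1700.0), 4))    # fixed-tilt monofacial
print(round(lcoe_usd_per_kwh(1050.0, 14.0, 2100.0), 4))   # bifacial + single-axis tracking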
|
Weather
| 2,020 |
June 2, 2020
|
https://www.sciencedaily.com/releases/2020/06/200602183417.htm
|
New research deepens understanding of Earth's interaction with the solar wind
|
As the Earth orbits the sun, it plows through a stream of fast-moving particles that can interfere with satellites and global positioning systems. Now, a team of scientists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University has reproduced a process that occurs in space to deepen understanding of what happens when the Earth encounters this solar wind.
|
The team used computer simulations to model the movement of a jet of plasma, the charged state of matter composed of electrons and atomic nuclei that makes up all the stars in the sky, including our sun. Many cosmic events can produce plasma jets, from relatively small star burps to gigantic stellar explosions known as supernovae. When fast-moving plasma jets pass through the slower plasma that exists in the void of space, it creates what is known as a collision-less shock wave. These shocks also occur as Earth moves through the solar wind and can influence how the wind swirls into and around Earth's magnetosphere, the protective magnetic shield that extends into space. Understanding plasma shock waves could help scientists to forecast the space weather that develops when the solar wind swirls into the magnetosphere and enable the researchers to protect satellites that allow people to communicate across the globe. The simulations revealed several telltale signs indicating when a shock is forming, including the shock's features, the three stages of the shock's formation, and phenomena that could be mistaken for a shock. "By being able to distinguish a shock from other phenomena, scientists can feel confident that what they are seeing in an experiment is what they want to study in space," said Derek Schaeffer, an associate research scholar in the Princeton University Department of Astrophysics who led the PPPL research team. The findings were reported in a published paper. The plasma shocks that occur in space, like those created by Earth traveling against the solar wind, resemble the shock waves created in Earth's atmosphere by supersonic jet aircraft. In both occurrences, fast-moving material encounters slow or stationary material and must swiftly change its speed, creating an area of swirls and eddies and turbulence. But in space, the interactions between fast and slow plasma particles occur without the particles touching one another. "Something else must be driving this shock formation, like the plasma particles electrically attracting or repelling each other," Schaeffer said. "In any case, the mechanism is not fully understood." To increase their understanding, physicists conduct plasma experiments in laboratories to monitor conditions closely and measure them precisely. In contrast, measurements taken by spacecraft cannot be easily repeated and sample only a small region of plasma. Computer simulations then help the physicists interpret their laboratory data. Today, most laboratory plasma shocks are formed using a mechanism known as a plasma piston. To create the piston, scientists shine a laser on a small target. The laser causes small amounts of the target's surface to heat up, become a plasma, and move outward through a surrounding, slower-moving plasma. Schaeffer and colleagues produced their simulation by modeling this process. "Think of a boulder in the middle of a fast-moving stream," Schaeffer said. "The water will come right up to the front of the boulder, but not quite reach it. The transition area between quick motion and zero [standing] motion is the shock." The simulated results will help physicists distinguish an astrophysical plasma shock wave from other conditions that arise in laboratory experiments. "During laser plasma experiments, you might observe lots of heating and compression and think they are signs of a shock," Schaeffer said. "But we don't know enough about the beginning stages of a shock to know from theory alone. 
For these kinds of laser experiments, we have to figure out how to tell the difference between a shock and just the expansion of the laser-driven plasma."In the future, the researchers aim to make the simulations more realistic by adding more detail and making the plasma density and temperature less uniform. They would also like to run experiments to determine whether the phenomena predicted by the simulations can in fact occur in a physical apparatus. "We'd like to put the ideas we talk about in the paper to the test," says Schaeffer.
|
Weather
| 2,020 |
May 30, 2020
|
https://www.sciencedaily.com/releases/2020/05/200530163107.htm
|
New sunspots potentially herald increased solar activity
|
On May 29, 2020, a family of sunspots -- dark spots that freckle the face of the Sun, representing areas of complex magnetic fields -- sported the biggest solar flare since October 2017. Although the sunspots are not yet visible (they will soon rotate into view over the left limb of the Sun), NASA spacecraft spotted the flares high above them.
|
The flares were too weak to pass the threshold at which NOAA's Space Weather Prediction Center (which is the U.S. government's official source for space weather forecasts, watches, warnings and alerts) provides alerts. But after several months of very few sunspots and little solar activity, scientists and space weather forecasters are keeping their eye on this new cluster to see whether they grow or quickly disappear. The sunspots may well be harbingers of the Sun's solar cycle ramping up and becoming more active.Or, they may not. It will be a few more months before we know for sure.As the Sun moves through its natural 11-year cycle, in which its activity rises and falls, sunspots rise and fall in number, too. NASA and NOAA track sunspots in order to determine, and predict, the progress of the solar cycle -- and ultimately, solar activity. Currently, scientists are paying close attention to the sunspot number as it's key to determining the dates of solar minimum, which is the official start of Solar Cycle 25. This new sunspot activity could be a sign that the Sun is possibly revving up to the new cycle and has passed through minimum.However, it takes at least six months of solar observations and sunspot-counting after a minimum to know when it's occurred. Because that minimum is defined by the lowest number of sunspots in a cycle, scientists need to see the numbers consistently rising before they can determine when exactly they were at the bottom. That means solar minimum is an instance only recognizable in hindsight: It could take six to 12 months after the fact to confirm when minimum has actually passed.This is partly because our star is extremely variable. Just because the sunspot numbers go up or down in a given month doesn't mean it won't reverse course the next month, only to go back again the month after that. So, scientists need long-term data to build a picture of the Sun's overall trends through the solar cycle. Commonly, that means the number we use to compare any given month is the average sunspot number from six months both backward and forward in time -- meaning that right now, we can confidently characterize what October 2019 looks like compared to the months before it (there were definitely fewer sunspots!), but not yet what November looks like compared to that.On May 29, at 3:24 a.m. EST, a relatively small M-class solar flare blazed from these sunspots. Solar flares are powerful bursts of radiation. Harmful radiation from a flare cannot pass through Earth's atmosphere to physically affect humans on the ground, however -- when intense enough -- they can disturb the atmosphere in the layer where GPS and communications signals travel. The intensity of this flare was below the threshold that could affect geomagnetic space and below the threshold for NOAA to create an alert.Nonetheless, it was the first M-class flare since October 2017 -- and scientists will be watching to see if the Sun is indeed beginning to wake up.
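The smoothing described above -- comparing each month to a window that reaches six months back and six months forward -- is conventionally computed as a 13-month running mean with half weight on the two end months. A minimal sketch, using a synthetic toy record rather than real observations:

import numpy as np

def smoothed_sunspot_number(monthly_ssn):
    """Centered 13-month running mean with half weight on the two end months --
    the conventional smoothing used to date solar minima and maxima."""
    ssn = np.asarray(monthly_ssn, dtype=float)
    out = np.full(ssn.size, np.nan)
    for i in range(6, ssn.size - 6):
        window = ssn[i - 6:i + 7]
        out[i] = (window[1:-1].sum() + 0.5 * (window[0] + window[-1])) / 12.0
    return out

# toy record: because the window reaches 6 months forward, the smoothed minimum
# can only be pinned down about half a year after it has already happened
rng = np.random.default_rng(1)
toy = np.clip(50 + 45 * np.cos(np.linspace(0, 2 * np.pi, 132)) + 8 * rng.standard_normal(132), 0, None)
print("month of smoothed minimum:", int(np.nanargmin(smoothed_sunspot_number(toy))))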
|
Weather
| 2,020 |
May 29, 2020
|
https://www.sciencedaily.com/releases/2020/05/200529093123.htm
|
Climate could cause abrupt British vegetation changes
|
Climate change could cause abrupt shifts in the amount of vegetation growing in parts of Great Britain, new research shows.
|
The University of Exeter studied the country in high resolution, using models to examine the local impacts of two climate change scenarios at a 1.5 x 1.5 km scale. It found that even "smooth" climate change could lead to sudden changes in the amount of vegetation in some places. Most such changes were increases, caused by factors such as warmer, wetter conditions and more CO2. Elsewhere, warmer conditions could cause soil to dry out, reducing plant productivity and decreasing vegetation rapidly. "The general expected trend towards warmer, wetter weather is likely to cause an overall increase in vegetation in temperate places like Great Britain," said Dr Chris Boulton, of the Global Systems Institute at the University of Exeter. "However, we wanted to find out whether even 'smooth' climate change might lead to abrupt shifts in vegetation. "A lot of research has focussed on 'tipping points' in large systems like rainforests and oceans. "Our study doesn't predict abrupt shifts across the whole of Great Britain -- 0.5-1.5% of the land area depending on the climate scenario -- but it shows numerous shifts can happen on a localised level." The researchers used a new method for identifying "abrupt" shifts, looking for sudden changes in the total amount of carbon stored in vegetation over a short period of time. "We also find early warning signals before some of the abrupt shifts. This is good news as it shows the potential for being able to predict them in the real world," said Dr Boulton. GSI Director Professor Tim Lenton said: "We didn't expect to see hundreds of localised abrupt shifts in the projections. "Up to now, climate-driven abrupt shifts in vegetation have been rare in Great Britain. "Our results should not be taken as specific predictions of where abrupt ecosystem change will happen. "But they serve to illustrate that it could happen across Great Britain in a changing climate." The study, funded by the Valuing Nature Programme, used two climate scenarios developed by the Met Office.
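A bare-bones version of the two ingredients mentioned -- flagging abrupt shifts in vegetation carbon and looking for early warning signals -- might look like the sketch below. The 10-step window, the 25% threshold and the lag-1 autocorrelation indicator are common generic choices, not the study's actual definitions.

import numpy as np

def abrupt_shifts(veg_carbon, window=10, threshold=0.25):
    """Flag indices where vegetation carbon changes by more than `threshold`
    (as a fraction of the series mean) within `window` time steps."""
    x = np.asarray(veg_carbon, dtype=float)
    return [i for i in range(x.size - window)
            if abs(x[i + window] - x[i]) > threshold * x.mean()]

def lag1_autocorr(x):
    """Lag-1 autocorrelation; a rise in a rolling window is a classic early
    warning signal that a shift may be approaching."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# toy series with a sudden loss of vegetation carbon around step 60
rng = np.random.default_rng(0)
series = np.concatenate([np.full(60, 10.0), np.full(40, 6.0)]) + 0.2 * rng.standard_normal(100)
print("shift flagged from index:", abrupt_shifts(series)[0])
print("lag-1 autocorrelation in the stable early window:", round(lag1_autocorr(series[:60]), 2))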
|
Weather
| 2,020 |
May 13, 2020
|
https://www.sciencedaily.com/releases/2020/05/200513111417.htm
|
New study could help better predict rainfall during El Niño
|
Researchers at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science have uncovered a new connection between tropical weather events and U.S. rainfall during El Niño years. The results can help explain why California received significantly less rainfall than predicted during the 2015 El Niño event while massive flooding occurred in the Mississippi River basin.
|
UM Rosenstiel School graduate student Marybeth Arcodia analyzed 39-years of weather data from the National Centers for Environmental Prediction-National Center for Atmospheric Research Reanalysis Project to understand how the Madden-Julian Oscillation (MJO), a phenomenon of sub-seasonal atmospheric variability in the tropical Indo-Pacific Oceans, leads to pressure and rainfall anomalies over the North Pacific and North America.Results from the study show that when both an El Niño Southern Oscillation (ENSO) and MJO event are occurring simultaneously, the rainfall pattern typically seen from ENSO can be considerably altered for a few days to weeks due to interference from the MJO.The researchers found that ENSO modifies the teleconnection signals associated with the Madden-Julian Oscillation in the United States. While at the same time, the El Niño Southern Oscillation acts to interfere with the MJO signals, resulting in significantly enhanced or masked rainfall anomalies in the U.S."Although the source of changes in rainfall patterns is coming from thousands of miles away in the tropical Indian and Pacific oceans, our study shows just how connected the tropics and the United States can be," said Arcodia, the study's lead author and UM Rosenstiel School Ph.D student. "If we have a better grasp on how the climate system works and how different aspects of our atmosphere work together, we can have more confidence in our weather and climate forecasts."The results from this study offer potential to help better understand Earth's weather and climate system.
|
Weather
| 2,020 |
May 11, 2020
|
https://www.sciencedaily.com/releases/2020/05/200511142120.htm
|
Street smarts required in heat mitigation
|
One day last July, Ariane Middel and two other Arizona State University researchers headed west on Interstate 10. Squeezed inside their van were MaRTy 1 and MaRTy 2, mobile biometeorological instrument platforms that can tell you exactly what you feel in the summer heat. All five were destined for Los Angeles.
|
The researchers and their colleagues were headed to L.A. to start investigating how solar reflective coatings on select city streets affected radiant heat and, in turn, pedestrians' comfort on a typical summer day.The Los Angeles Bureau of Street Surfaces has pioneered the use of solar reflective coatings in a quest to cool city streets.The idea is, if you coat a street with a lighter color than traditional pavement black, it will actually lower the surrounding temperatures.But Middel and her collaborators now wanted to see what effect reflective coating had on pedestrians."If you're in a hot, dry and sunny climate like Phoenix or L.A., the mean radiant temperature has the biggest impact on how a person experiences the heat," explains Middel, assistant professor in the ASU School of Arts, Media and Engineering and a senior sustainability scientist in the Julie Ann Wrigley Global Institute of Sustainability. "The mean radiant temperature is essentially the heat that hits the human body. It includes the radiation from the sun, so if you are standing in direct sunlight you will feel much hotter than in the shade."Thanks to remote-sensing satellites, decades of data exist on the Earth's land surface temperature; that is, how hot a single point on the Earth's surface would feel to the touch. But that data should not be confused with near-surface ambient and radiant temperature, the heat that humans and animals "experience," said Middel, lead author of the study and director of ASU's SHaDE Lab, which stands for Sensable Heatscapes and Digital Environments.The researchers' study is the first to measure the thermal performance of solar reflective coatings using instruments that sense meteorological variables relevant to a pedestrian's experience: radiant heat, ambient temperature, wind and humidity.The researchers focused on two variables, surface temperature and radiant temperature over highly reflective surfaces. They took MaRTy 1 and 2 on hourly strolls through a Los Angeles neighborhood to measure a pedestrian's heat exposure over regular asphalt roads, reflective coated roads and sidewalks next to the roads.MaRTy, which stands for mean radiant temperature, looks like a weather station in a wagon. The station measures the total radiation that hits the body, including sunlight and the heat emitted from surfaces like asphalt.The study showed that the surface temperature of the coated asphalt road was up to 6 degrees Celsius cooler than the regular road in the afternoon. However, the radiant heat over coated asphalt was 4 degrees Celsius higher than non-coated areas, basically negating any heat-limiting factor."So, if you're a pedestrian walking over the surface, you get hit by the shortwave radiation reflected back at you," Middel said.The study also found that the coating didn't have a big impact on air temperature, only half a degree in the afternoon and 0.1 degrees Celsius at night.The upshot, said V. Kelly Turner, assistant professor of urban planning at UCLA and the study's co-author, is that to cool off cities, urban climatologists and city planners need to focus on different solutions or combinations of solutions depending on a desired goal."The solutions are context dependent and depend on what you want to achieve," Turner explained.A solution that addresses surface temperature is not necessarily suited to the reduction of building energy use. 
For example, if you want cooler surface temperatures on a playground because children are running across its surface, a reflective coating would be best. But if you want to reduce the thermal load on people, planting trees or providing shade would be more effective. But what happens if you combine trees with cool pavement? Does the cool pavement lose its ability to reduce surface temperature? Or perhaps the cool pavement is costly to maintain when the trees drop their leaves? "So, reflective coating is not a panacea," Turner said. "It's one tool." It should also be noted that temperature is a multifaceted measurement of heat. Surface temperature, ambient temperature and mean radiant temperature are distinct from one another and require distinct solutions when it comes to mitigating heat. "We need more of these experiments," Middel said. "There have been a lot of large-scale modeling studies on this. So, we don't know in real life if we get the same effects. The urban environment is so complex, and models always have to simplify. So, we don't know what really happens on the ground unless we measure, and there haven't been these types of measurements in the past." The researchers report their findings of the Los Angeles study in "Solar reflective pavements -- A policy panacea to heat mitigation?," which was published on April 8, 2020.
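Mean radiant temperature of the kind MaRTy reports is commonly derived by integrating shortwave and longwave fluxes from six directions with angular weights for a standing person. The sketch below uses that standard six-directional formula with typical absorption coefficients; the weights, coefficients and example flux readings are textbook-style assumptions, not MaRTy's actual calibration or data.

import numpy as np

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
A_SW = 0.7           # shortwave absorption coefficient of the human body
E_LW = 0.97          # longwave emissivity/absorptivity of the human body
WEIGHTS = np.array([0.22, 0.22, 0.22, 0.22, 0.06, 0.06])   # E, W, N, S, up, down (standing person)

def mean_radiant_temperature_c(shortwave_wm2, longwave_wm2):
    """Six-directional mean radiant temperature (deg C) from shortwave and
    longwave flux densities measured facing east, west, north, south, up, down."""
    s_str = np.sum(WEIGHTS * (A_SW * np.asarray(shortwave_wm2, dtype=float)
                              + E_LW * np.asarray(longwave_wm2, dtype=float)))
    return (s_str / (E_LW * SIGMA)) ** 0.25 - 273.15

# hypothetical midday readings over sunlit pavement: strong sun from above and one side,
# plus a hot ground surface radiating longwave from below
sw = [120, 110, 100, 650, 900, 250]
lw = [480, 470, 465, 470, 420, 560]
print(round(mean_radiant_temperature_c(sw, lw), 1))   # roughly 58 deg C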
|
Weather
| 2,020 |
May 11, 2020
|
https://www.sciencedaily.com/releases/2020/05/200511142116.htm
|
El Niño-linked decreases in soil moisture could trigger massive tropical-plant die-offs
|
New research has found that El Niño events are often associated with droughts in some of the world's more vulnerable tropical regions. Associated with warmer than average ocean temperatures in the eastern Pacific, El Niños can in turn influence global weather patterns and tropical precipitation, and these changes can lead to massive plant die-offs if other extreme factors are also at play.
|
"We know a lot about El Niño in terms of its impact on weather and surface water resources," said Kurt Solander, a research hydrologist in the Computational Earth Science group at Los Alamos National Laboratory and lead author of the paper. "This new study drills down to reveal how El Niño can affect the moisture content of soil, which controls the growth of plants, the food we eat, and how much water from land gets fed back into the atmosphere through evaporation."In the paper, Solander and Los Alamos colleagues Brent Newman and Chonggang Xu analyzed changes in soil moisture content in the humid tropics after three "Super El Niño" events from the past -- 1982-83, 1997-98, and 2015-16. They found that during these years the most severe and consistent decreases in soil moisture occurred in regions like the Amazon basin and maritime Southeast Asia, with some of the changes potentially being significant enough to become a factor responsible for large-scale plant die off. In contrast, some other tropical areas, such as tropical East Africa, will likely see an increase in soil moisture during major El Niño events.The team used a global dataset based on computer models and historic satellite observations of near-surface soil moisture. By extracting data from the rooting zone from the humid tropics, predicted soil moisture changes during the super El Niños could be examined at local scales. The team combined these data with on-site measurements, collected across the tropics, to verify the accuracy of the satellite and computer models. They were then able to identify ways to improve the estimates of soil moisture changes during El Niño events, and showed that El Niño induced responses varied from significant increases or decreases to minimal change relative to what occurs during non-El Niño years and spatial location.Super El Niño events typically happen every 15 to 20 years, with mild to moderate events coming every three to five years. The most immediate impact of this new information is that it can help governments or farmers in these areas prepare for the consequences of decreased soil moisture, or understand that crops will need more water during these events."Scientists can predict these events with a moderate degree of confidence three to six months in advance," Solander said. "With this new information, water managers in these areas can, for example, regulate how much water they retain in a reservoir to compensate for the expected decreases in available moisture for local agricultural crops."The work is part of ongoing research at Los Alamos studying spatial patterns of precipitation recycling, which effectively determines how much moisture plants return to the atmosphere. In plant-dense regions like the Amazon basin, researchers at Los Alamos hope to provide insight on atmospheric moisture feedbacks from vegetation as plants adjust to climatic warming, which in turn helps researchers understand how precipitation will change on a global scale.
|
Weather
| 2,020 |
May 8, 2020
|
https://www.sciencedaily.com/releases/2020/05/200508145333.htm
|
Potentially fatal combinations of humidity and heat are emerging across the globe
|
Most everyone knows that humid heat is harder to handle than the "dry" kind. And recently, some scientists have projected that later in the century, in parts of the tropics and subtropics, warming climate could cause combined heat and humidity to reach levels rarely if ever experienced before by humans. Such conditions would ravage economies, and possibly even surpass the physiological limits of human survival.
|
According to a new study, the projections are wrong: such conditions are already appearing. The study identifies thousands of previously rare or unprecedented bouts of extreme heat and humidity in Asia, Africa, Australia, South America and North America, including in the U.S. Gulf Coast region. Along the Persian Gulf, researchers spotted more than a dozen recent brief outbreaks surpassing the theoretical human survivability limit. The outbreaks have so far been confined to localized areas and lasted just hours, but they are increasing in frequency and intensity, say the authors. The study appears this week in the journal "Previous studies projected that this would happen several decades from now, but this shows it's happening right now," said lead author Colin Raymond, who did the research as a PhD. student at Columbia University's Lamont-Doherty Earth Observatory. "The times these events last will increase, and the areas they affect will grow in direct correlation with global warming."Analyzing data from weather stations from 1979 to 2017, the authors found that extreme heat/humidity combinations doubled over the study period. Repeated incidents appeared in much of India, Bangladesh and Pakistan; northwestern Australia; and along the coasts of the Red Sea and Mexico's Gulf of California. The highest, potentially fatal, readings, were spotted 14 times in the cities of Dhahran/Damman, Saudi Arabia; Doha, Qatar; and Ras Al Khaimah, United Arab Emirates, which have combined populations of over 3 million. Parts of southeast Asia, southern China, subtropical Africa and the Caribbean were also hit.The southeastern United States saw extreme conditions dozens of times, mainly near the Gulf Coast in east Texas, Louisiana, Mississippi, Alabama and the Florida Panhandle. The worst spots: New Orleans and Biloxi, Miss. Such conditions also reached inland into Arkansas and along the southeastern coastal plain.Not surprisingly, incidents tended to cluster on coastlines along confined seas, gulfs and straits, where evaporating seawater provides abundant moisture to be sucked up by hot air. In some areas further inland, moisture-laden monsoon winds or wide areas of crop irrigation appear to play the same role.Prior climate studies failed to recognize most past incidents because climate researchers usually look at averages of heat and humidity measured over large areas and over several hours at a time. Raymond and his colleagues instead drilled directly into hourly data from 7,877 individual weather stations, allowing them to pinpoint shorter-lived bouts affecting smaller areas.Humidity worsens the effects of heat because humans cool their bodies by sweating; water expelled through the skin removes excess body heat, and when it evaporates, it carries that heat away. The process works nicely in deserts, but less well in humid regions, where the air is already too laden with moisture to take on much more. Evaporation of sweat slows. In the most extreme instances, it could stop. In that case, unless one can retreat to an air-conditioned room, the body's core heats beyond its narrow survivable range, and organs begin to fail. Even a strong, physically fit person resting in the shade with no clothes and unlimited access to drinking water would die within hours.Meteorologists measure the heat/humidity effect on the so-called "wet bulb" Centigrade scale; in the United States, these readings are often translated into "heat index" or "real-feel" Fahrenheit readings. 
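The wet-bulb numbers discussed here can be approximated directly from routine weather observations. As a rough illustration only (this is not the study's methodology), the sketch below uses the Stull (2011) empirical fit to estimate wet-bulb temperature from air temperature and relative humidity; the example input values are hypothetical.

```python
import math

def wet_bulb_stull(t_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (deg C) from air temperature (deg C) and
    relative humidity (%), using the Stull (2011) empirical fit. Valid roughly
    for RH between 5 and 99% and temperatures of -20 to 50 deg C at sea level."""
    return (
        t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
        + math.atan(t_c + rh_pct)
        - math.atan(rh_pct - 1.676331)
        + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
        - 4.686035
    )

# Hypothetical readings: very hot and very humid air approaches the 35 C limit.
print(round(wet_bulb_stull(40.0, 70.0), 1))  # about 35 C -- near the survivability limit
print(round(wet_bulb_stull(35.0, 50.0), 1))  # about 27 C -- oppressive but tolerable
```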
Prior studies suggest that even the strongest, best-adapted people cannot carry out normal outdoor activities when the wet bulb hits 32 C, equivalent to a heat index of 132 F. Most others would crumble well before that. A reading of 35 -- the peak briefly reached in the Persian Gulf cities -- is considered the theoretical survivability limit. That translates roughly to a heat index of 160 F. (The heat index actually ends at 127 F, so these readings are literally off the charts.) "It's hard to exaggerate the effects of anything that gets into the 30s," said Raymond.The study found that worldwide, wet-bulb readings approaching or exceeding 30C on the wet bulb have doubled since 1979. The number of readings of 31 -- previously believed to occur only rarely -- totaled around 1,000. Readings of 33 -- previously thought to be almost nonexistent -- totaled around 80.A heat wave that struck much of the United States last July maxed out at about 30C on the wet bulb, translating into heat indexes approaching 115 F in places; the highest was 122 F, in Baltimore, Md., and a similar wave hit in August. The waves paralyzed communities and led to at least a half-dozen deaths, including those of an air-conditioning technician in Phoenix, Az., and former National Football League lineman Mitch Petrus, who died in Arkansas while working outside.It was a modest toll; heat-related illnesses already kill more U.S. residents than any other weather-related hazard including cold, hurricanes or floods. An investigation last year by the website InsideClimate News revealed that cases of heat stroke or heat exhaustion among U.S. troops on domestic bases grew 60 percent from 2008 to 2018. Seventeen soldiers died, almost all in the muggy U.S. Southeast. High-humidity heat waves in Russia and Europe, where far fewer people have air conditioning, have killed tens of thousands."We may be closer to a real tipping point on this than we think," said Radley Horton, a Lamont-Doherty research scientist and coauthor of the paper. Horton coauthored a 2017 paper projecting that such conditions would not take hold until later in the century.While air conditioning may blunt the effects in the United States and some other wealthy countries, there are limits. Before the new study, one of the previously highest heat/humidity events ever reported was in the Iranian city of Bandar Mahshahr, which almost reached a 35C wet-bulb reading on July 31, 2015. There were no known deaths; residents reported staying inside air-conditioned vehicles and buildings, and showering after brief sojourns outside. But Horton points out that if people are increasingly forced indoors for longer periods, farming, commerce and other activities could potentially grind to a halt, even in rich nations-a lesson already brought home by the collapse of economies in the face of the novel coronavirus.In any case, many people in the poorer countries most at risk do not have electricity, never mind air conditioning. There, many rely on subsistence farming requiring daily outdoor heavy labor. These facts could make some of the most affected areas basically uninhabitable, says Horton.Kristina Dahl, a climatologist at the Union of Concerned Scientists who led a study last year warning of increasing future heat and humidity in the United States, said the new paper shows "how close communities around the world are to the limits." 
She added that some localities may already be seeing conditions worse than the study suggests, because weather stations do not necessarily pick up hot spots in dense city neighborhoods built with heat-trapping concrete and pavement. Steven Sherwood, a climatologist at Australia's University of New South Wales, said, "These measurements imply that some areas of Earth are much closer than expected to attaining sustained intolerable heat. It was previously believed we had a much larger margin of safety." The study was coauthored by Tom Matthews, a lecturer in climate science at Loughborough University in the United Kingdom. Colin Raymond is now a postdoctoral researcher at NASA's Jet Propulsion Laboratory.
|
Weather
| 2,020 |
May 6, 2020
|
https://www.sciencedaily.com/releases/2020/05/200506162213.htm
|
Climate change could reawaken Indian Ocean El Niño
|
Global warming is approaching a tipping point that during this century could reawaken an ancient climate pattern similar to El Niño in the Indian Ocean, new research led by scientists from The University of Texas at Austin has found.
|
If it comes to pass, floods, storms and drought are likely to worsen and become more regular, disproportionately affecting populations most vulnerable to climate change. Computer simulations of climate change during the second half of the century show that global warming could disturb the Indian Ocean's surface temperatures, causing them to rise and fall year to year much more steeply than they do today. The seesaw pattern is strikingly similar to El Niño, a climate phenomenon that occurs in the Pacific Ocean and affects weather globally. "Our research shows that raising or lowering the average global temperature just a few degrees triggers the Indian Ocean to operate exactly the same as the other tropical oceans, with less uniform surface temperatures across the equator, more variable climate, and with its own El Niño," said lead author Pedro DiNezio, a climate scientist at the University of Texas Institute for Geophysics, a research unit of the UT Jackson School of Geosciences. According to the research, if current warming trends continue, an Indian Ocean El Niño could emerge as early as 2050. The results were published May 6. To show whether an Indian Ocean El Niño can occur in a warming world, the scientists analyzed climate simulations, grouping them according to how well they matched present-day observations. When global warming trends were included, the most accurate simulations were those showing an Indian Ocean El Niño emerging by 2100. "Greenhouse warming is creating a planet that will be completely different from what we know today, or what we have known in the 20th century," DiNezio said. The latest findings add to a growing body of evidence that the Indian Ocean has potential to drive much stronger climate swings than it does today. Co-author Kaustubh Thirumalai, who led the study that discovered evidence of an ice-age Indian Ocean El Niño, said that the way glacial conditions affected wind and ocean currents in the Indian Ocean in the past is similar to the way global warming affects them in the simulations. "This means the present-day Indian Ocean might in fact be unusual," said Thirumalai, who is an assistant professor at the University of Arizona. The Indian Ocean today experiences very slight year-to-year climate swings because the prevailing winds blow gently from west to east, keeping ocean conditions stable. According to the simulations, global warming could reverse the direction of these winds, destabilizing the ocean and tipping the climate into swings of warming and cooling akin to the El Niño and La Niña climate phenomena in the Pacific Ocean. The result is new climate extremes across the region, including disruption of the monsoons over East Africa and Asia. Thirumalai said that a break in the monsoons would be a significant concern for populations dependent on the regular annual rains to grow their food. For Michael McPhaden, a physical oceanographer at the National Oceanic and Atmospheric Administration who pioneered research into tropical climate variability, the paper highlights the potential for human-driven climate change to unevenly affect vulnerable populations. "If greenhouse gas emissions continue on their current trends, by the end of the century, extreme climate events will hit countries surrounding the Indian Ocean, such as Indonesia, Australia and East Africa with increasing intensity," said McPhaden, who was not involved in the study.
"Many developing countries in this region are at heightened risk to these kinds of extreme events even in the modern climate."The research was funded by a National Science Foundation (NSF) grant under the NSF Paleoclimate Program.
|
Weather
| 2,020 |
May 6, 2020
|
https://www.sciencedaily.com/releases/2020/05/200506123732.htm
|
Winter warm spells see an increase in duration and frequency in UK temperature records
|
Warm winter spells have increased in frequency and duration two- to three times over since 1878, according to scientists led by the University of Warwick.
|
The new analysis of historical daily temperature data does not rely on identifying and counting winter warm spells directly but instead uses observations of daily temperatures to show how the likelihood of different temperatures has changed. By applying a method called crossing theory to these probabilities, the scientists have provided information on the changing relationship between frequency, duration and intensity of these warm spells. The researchers focused on the maximum daily temperatures in December, January and February in observations from 1878. Week-long warm intervals that return on average every 5 years now consistently exceed 13°C. In the 1850s, a winter warm spell lasting more than 5 days with a daily maximum temperature above 12-13°C would typically take at least 5 years to reoccur. Nowadays they occur more often, typically every 4 years or less. Climate variability is expected to increase as the global climate warms, and the increase of extended warm spells during winter can have an important impact on agriculture and the sustainability of ecosystems. However, ecosystems are not uniformly sensitive to changes at different temperatures. They are instead vulnerable to changes around critical temperature thresholds, and these thresholds may be far from the distribution mean. Lead author Professor Sandra Chapman of the University of Warwick Department of Physics said: "Our results show that it is possible to focus on warm spells above specific temperature thresholds that are critical for individual species and ecosystem functioning. It thus can be of direct value in supporting our understanding and assessment of climate change impacts." Professor Stainforth from the Grantham Research Institute at the London School of Economics and Political Science said: "Sustained periods of warm weather can have a significant impact on agriculture and ecosystems even when they don't involve record-breaking extremes. The changing frequency and characteristics of such events may have substantial impacts and this new work demonstrates a novel and flexible method for deducing how they are changing. It provides a valuable new approach for studying the less obvious consequences of climate change." Professor Eugene Murphy, Science Leader of the Ecosystems Team at British Antarctic Survey, said: "Unusually extended periods of warm weather in winter can disrupt biological processes causing changes in the development of populations of plants and animals during the following spring. These changes can affect the biological balance that sustains ecosystems and the diverse biological communities they support, potentially reducing their resilience and capacity to cope with future change."
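The crossing-theory analysis described above links how often warm spells of a given duration and intensity recur. As a much simpler stand-in -- a hedged sketch, not the method used in the study -- the code below counts winters containing at least one spell of a given length above a temperature threshold and reports the average number of winters between such spells; the synthetic data, threshold and spell length are hypothetical.

```python
import numpy as np

def warm_spell_return_period(tmax_by_winter, threshold_c=13.0, min_days=7):
    """Average number of winters between warm spells of at least `min_days`
    consecutive days with daily maxima above `threshold_c`. A simple empirical
    count, not the crossing-theory estimate used in the paper."""
    winters_with_spell = 0
    for tmax in tmax_by_winter:
        run = longest = 0
        for hot in np.asarray(tmax) > threshold_c:
            run = run + 1 if hot else 0
            longest = max(longest, run)
        if longest >= min_days:
            winters_with_spell += 1
    if winters_with_spell == 0:
        return float("inf")  # no qualifying spell in the record
    return len(tmax_by_winter) / winters_with_spell

# Hypothetical example: 140 winters of synthetic daily maxima with day-to-day persistence.
rng = np.random.default_rng(0)
winters = []
for _ in range(140):
    t = np.empty(90)
    t[0] = rng.normal(7.0, 3.0)
    for d in range(1, 90):
        t[d] = 7.0 + 0.8 * (t[d - 1] - 7.0) + rng.normal(0.0, 2.0)
    winters.append(t)
print(warm_spell_return_period(winters, threshold_c=12.0, min_days=5))
```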
|
Weather
| 2,020 |
May 5, 2020
|
https://www.sciencedaily.com/releases/2020/05/200505190556.htm
|
Decoding the skies: The impact of water vapor on afternoon rainfall
|
The role that atmospheric water vapor plays in weather is complex and not clearly understood. However, University of Arizona researchers have started to tease out the relationship between morning soil moisture and afternoon rainfall under different atmospheric conditions in a new study.
|
"The prevailing wisdom on the relationship between soil moisture and rainfall is that if you have wetter soil in the morning, you'll have a greater occurrence of rainfall in afternoon, but it's more complicated than that," said lead author Josh Welty, a UArizona doctoral student in the Department of Hydrology and Atmospheric Sciences. "On a global scale, we see evidence that you can have greater chances of afternoon rainfall over both wet and dry soil conditions, depending on atmospheric moisture."The team, which also included researchers from the Desert Research Institute in Nevada and NASA's Goddard Space Flight Center, used satellite-based observations of soil moisture and afternoon rainfall in the northern hemisphere from the last five years. The work was supported by NASA and is based on NASA satellite data from the Global Precipitation Measurement mission and the Soil Moisture Active Passive satellite, as well as atmospheric moisture and movement data from the Modern-Era Retrospective analysis for Research and Applications Version 2, or MERRA-2, model, which incorporates satellite observations.The researchers found that on days when wind blows in little atmospheric moisture, afternoon rainfall is more likely to occur over wetter soils or higher relative humidity. On days when wind introduces lots of atmospheric moisture, afternoon rainfall is more likely over drier soils or lower relative humidity. The team also found that for both conditions, afternoon rainfall occurrence is more likely with warmer morning soil or air temperature.The researchers focused on days in which afternoon rainfall occurred and noted the difference between the number of rainfall days that occurred over wetter-than-average soil and the number of rainfall days that occurred over drier-than-average soil. They then grouped their results into three categories: high, mid and low atmospheric moisture transport by wind.This research builds on a 2018 study that identified soil moisture's role in afternoon rainfall amount in the Southern Great Plains of Oklahoma. The new findings show that the relationship between soil moisture, afternoon rainfall and atmospheric moisture in Oklahoma doesn't apply across the entire northern hemisphere."Over the Southern Great Plains, we found that when the wind brings less moisture, dry soils are associated with increases in rainfall amount; and when the wind brings greater moisture, wet soils are associated with increases in rainfall amount. In the current study, we find that, actually, in many regions, the opposite is true for the likelihood of afternoon rainfall," Welty said.Understanding the role of water vapor in weather is important because its effects are felt everywhere, according to Welty's thesis adviser and paper co-author Xubin Zeng, Agnese N. Haury Endowed Chair in Environment and director of the Climate Dynamics and Hydrometeorology Center and Land-Atmosphere-Ocean Interaction Group."For example, for the Southern Great Plains there are many tornado activities because there is water vapor moving in from the Gulf of Mexico. Also, on the California coast you talk about severe flooding from atmospheric rivers," which are corridors of concentrated water vapor that can quickly precipitate once hitting a mountain range, causing mass flooding," Zeng said."Water vapor brought in by the winds is an important source to understand. In the past, people didn't pay enough attention to it in studying how land conditions affect rainfall, potentially making their results misleading. 
Once we consider the wind's movement of water vapor, the results become more robust," Zeng said.Understanding this relationship is even more important as global warming changes patterns of atmospheric moisture, soil moisture and more. Such changes will not only have effects on weather and natural disasters, but also on agriculture, Zeng said."The results really show the complexity of the land's influence on weather and climate," said physical scientist and paper co-author Joe Santanello from NASA's Goddard Space Flight Center, who chairs the NASA-supported Local Land-Atmosphere Coupling working group to improve weather and climate models. "When you add in the human factor of irrigation or land use that changes the dryness or wetness of the soils, which we currently don't represent well in the models, we potentially have additional downstream effects on weather and climate that we haven't foreseen."The next step is to assess how these relationships play out in global climate and weather forecasting models."Our findings are observational, but now, we want to use computer modeling to help us understand why drier or wetter soil could enhance rainfall likelihood," Zeng said. "We know it's true, but we don't quantitatively know why."
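The grouping described above -- afternoon-rain days split by wetter- versus drier-than-average morning soil within low, mid and high moisture-transport regimes -- can be sketched in a few lines. Everything below (column names, values, bin choices) is invented for illustration; it is not the authors' code or data.

```python
import pandas as pd

# Hypothetical daily records: morning soil-moisture anomaly, wind-driven
# atmospheric moisture transport, and whether afternoon rain occurred.
df = pd.DataFrame({
    "soil_anom": [0.12, -0.05, 0.30, -0.22, 0.08, -0.15],
    "moisture_transport": [5.0, 42.0, 18.0, 60.0, 9.0, 33.0],
    "afternoon_rain": [True, True, False, True, True, False],
})

rain_days = df[df["afternoon_rain"]].copy()
rain_days["transport_bin"] = pd.qcut(rain_days["moisture_transport"], 3,
                                     labels=["low", "mid", "high"])
rain_days["soil_state"] = rain_days["soil_anom"].apply(
    lambda a: "wetter" if a > 0 else "drier")

# For each transport regime, count rain days over wetter vs. drier soil and
# take the difference, mirroring the comparison described in the article.
counts = (rain_days.groupby(["transport_bin", "soil_state"], observed=False)
          .size().unstack(fill_value=0))
print(counts.get("wetter", 0) - counts.get("drier", 0))
```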
|
Weather
| 2,020 |
May 4, 2020
|
https://www.sciencedaily.com/releases/2020/05/200504114054.htm
|
Arctic 'shorefast' sea ice threatened by climate change
|
For people who live in the Arctic, sea ice that forms along shorelines is a vital resource that connects isolated communities and provides access to hunting and fishing grounds. A new study by Brown University researchers found that climate change could significantly reduce this "shorefast ice" in communities across Northern Canada and Western Greenland.
|
The study found that by 2100, communities could see shorefast ice seasons reduced by anywhere from five to 44 days, with the coldest communities in the study seeing the largest reductions. The wide range of potential outcomes was a surprise, the researchers say, and underscores the need to take local factors into account when making policy to prepare for future climate change. "One of the key takeaways for me is that even though the whole Arctic is going to warm and lose ice, we see very different outcomes from one community to another," said Sarah Cooley, lead author of the study and a Ph.D. student in the Institute at Brown for Environment and Society (IBES). "When you combine that wide range of outcomes with the fact that different communities have lots of social, cultural and economic differences, it means that some communities may experience much larger impacts than others." For example, the northern Canadian communities of Clyde River and Taloyoak, which are particularly dependent upon shorefast ice for subsistence hunting and fishing, will see some of the most substantial declines in sea ice. On average, these two communities can expect ice to break up 23 to 44 days earlier, respectively, by 2100. That could mean "economically and culturally significant activities on the ice will be harder to maintain in the future," the researchers write. That the coldest regions in the study could see the largest reductions in ice is cause for concern, says study co-author Johnny Ryan, a postdoctoral researcher at IBES. "Some of these places are considered to be the last remnants of truly polar ecosystems and people talk a lot about preserving these areas in particular," Ryan said. "Yet these are the areas that we find will lose the most ice." The research is part of a larger effort aimed at better understanding how climate change in the Arctic will impact the people who live there. In addition to gathering satellite and scientific data, the research team conducted fieldwork in the community of Uummannaq in western Greenland to learn more about how the local population utilizes the ice. "Shorefast ice is something that's most important from the standpoint of the people who use it," Cooley said. "It has some implications in terms of global climate, but those are fairly small. This is really all about how it affects the people who actually live in the Arctic, and that's why we're studying it." The fieldwork also provided a first-hand perspective of how things have been changing over the years. "One of the most powerful things that came out of the field study for me was listening to a hunter talk about how the ice is breaking up earlier than it ever has in his lifetime," Ryan said. "We're only observing this 20-year satellite record. But to be able to learn from locals about what things were like 50 or 60 years ago, it really emphasized how climate change has already impacted the community." Moving forward, the research team is hopeful that mapping the local effects of regional and global climate patterns will be useful for policy-makers. "Because shorefast ice is one of many environmental assets important to Arctic communities," the researchers write, "future research combining broad-scale analysis tools with community-level characteristics may help provide more actionable information for Arctic populations facing substantial climatic and social change."
|
Weather
| 2,020 |
May 1, 2020
|
https://www.sciencedaily.com/releases/2020/05/200501092916.htm
|
Ocean acidification prediction now possible years in advance
|
CU Boulder researchers have developed a method that could enable scientists to accurately forecast ocean acidity up to five years in advance. This would enable fisheries and communities that depend on seafood negatively affected by ocean acidification to adapt to changing conditions in real time, improving economic and food security in the next few decades.
|
Previous studies have shown the ability to predict ocean acidity a few months out, but this is the first study to prove it is possible to predict variability in ocean acidity multiple years in advance. "We've taken a climate model and run it like you would have a weather forecast, essentially -- and the model included ocean chemistry, which is extremely novel," said Riley Brady, lead author of the study, and a doctoral candidate in the Department of Atmospheric and Oceanic Sciences. For this study the researchers focused on the California Current System, one of four major coastal upwelling systems in the world, which runs from the tip of Baja California in Mexico all the way up into parts of Canada. The system supports a billion-dollar fisheries industry crucial to the U.S. economy. "Here, you've got physics, chemistry, and biology all connecting to create extremely profitable fisheries, from crabs all the way up to big fish," said Brady, who is also a graduate student at the Institute of Arctic and Alpine Research (INSTAAR). "Making predictions of future environmental conditions one, two, or even three years out is remarkable, because this is the kind of information that fisheries managers could utilize." The California Current System is particularly vulnerable to ocean acidification due to the upwelling of naturally acidic waters to the surface. "The ocean has been doing us a huge favor," said study co-author Nicole Lovenduski, associate professor in atmospheric and oceanic sciences and head of the Ocean Biogeochemistry Research Group at INSTAAR. The ocean absorbs a large fraction of the excess carbon dioxide in the Earth's atmosphere derived from human activity. Unfortunately, as a result of absorbing this extra human-made carbon dioxide -- 24 million tons every single day -- the oceans have become more acidic. "Ocean acidification is proceeding at a rate 10 times faster today than any time in the last 55 million years," said Lovenduski. Within decades, scientists expect parts of the ocean to become completely corrosive for certain organisms, which means they cannot form or maintain their shells. "We expect people in communities who rely on the ocean ecosystem for fisheries, for tourism and for food security to be affected by ocean acidification," said Lovenduski. This spells trouble for the California Current System, with its naturally corrosive waters. This extra acidification could push its fragile ecosystems over the edge. People can easily confirm the accuracy of a weather forecast within a few days. The forecast says rain in your city? You can look out the window. But it's a lot more difficult to get real-time measurements of ocean acidity and figure out if your predictions were correct. This time, CU Boulder researchers were able to capitalize on historical forecasts from a climate model developed at the National Center for Atmospheric Research. Instead of looking to the future, they generated forecasts of the past using the climate model to see how well their forecast system performed. They found that the climate model forecasts did an excellent job at making predictions of ocean acidity in the real world. However, these types of climate model forecasts require an enormous amount of computational power, manpower, and time.
The potential is there, but the forecasts are not yet ready to be fully operational like weather forecasts.And while the study focuses on acidification in one region of the global ocean, it has much larger implications.States and smaller regions often do their own forecasts of ocean chemistry on a finer scale, with higher resolution, focused on the coastline where fisheries operate. But while these more local forecasts cannot factor in global climate variables like El Niño, this new global prediction model can.This means that this larger model can help inform the boundaries of the smaller models, which will significantly improve their accuracy and extend their forecasts. This would allow fisheries and communities to better plan for where and when to harvest seafood, and to predict potential losses in advance."In the last decade, people have already found evidence of ocean acidification in the California current," said Brady. "It's here right now, and it's going to be here and ever present in the next couple of decades."
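Hindcasting -- forecasting years that have already happened and scoring the result against observations -- is the verification idea described above. The sketch below computes a simple correlation-based skill score for synthetic pH hindcasts at several lead times; the numbers are made up and this is not the study's actual metric or data.

```python
import numpy as np

def forecast_skill(hindcasts: np.ndarray, observations: np.ndarray) -> float:
    """Correlation between retrospective forecasts and observed values -- one
    simple way to score a hindcast. Illustrative only."""
    h = hindcasts - hindcasts.mean()
    o = observations - observations.mean()
    return float(np.sum(h * o) / np.sqrt(np.sum(h ** 2) * np.sum(o ** 2)))

# Hypothetical: skill of surface-pH hindcasts at lead times of 1-5 years,
# with error that grows as the lead time lengthens.
rng = np.random.default_rng(1)
obs = rng.normal(8.05, 0.02, size=30)                 # made-up "observed" pH
for lead in range(1, 6):
    noise = rng.normal(0.0, 0.004 * lead, size=30)
    print(lead, round(forecast_skill(obs + noise, obs), 2))
```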
|
Weather
| 2,020 |
April 21, 2020
|
https://www.sciencedaily.com/releases/2020/04/200421150232.htm
|
Ocean: How the blob came back
|
Weakened wind patterns likely spurred the wave of extreme ocean heat that swept the North Pacific last summer, according to new research led by the University of Colorado Boulder and the Scripps Institution of Oceanography at the University of California San Diego. The marine heat wave, named the "Blob 2.0" after 2013's "Blob," likely damaged marine ecosystems and hurt coastal fisheries. Waters off the U.S. West Coast were a record-breaking 4.5 degrees F (2.5 degrees C) above normal, the authors found.
|
"Most large marine heat waves have historically occurred in the winter," said Dillon Amaya, a postdoctoral Visiting Fellow at CIRES and lead author on the new study out this week in And that wasn't the only record: 2019 also saw the weakest North Pacific atmospheric circulation patterns in at least the last 40 years. "This was truly a 99th-percentile type of event, with impacts like slow winds felt around the North Pacific," Amaya said.To search for physical processes that might have influenced the formation of Blob 2.0 in summertime, the team paired real world sea surface temperature data with an atmospheric model, and tested the impacts of various possible drivers.The most likely culprit: weaker winds. In short, when circulation patterns weaken, so does the wind. With less wind blowing over the ocean's surface, there's less evaporation and less cooling: The process is similar to wind cooling off human skin by evaporating sweat. In 2019, it was as if the ocean was stuck outside on a hot summer day with no wind to cool it down.A thinning of the ocean's mixed layer, the depth where surface ocean properties are evenly distributed, also fueled the Blob 2.0, the researchers found. The thinner the mixed layer, the faster it warms from incoming sunlight and weakened winds. And the impacts can keep accumulating in a vicious cycle: the lower atmosphere above the ocean responds to warmer water by burning off low clouds, which leaves the ocean more exposed to sunlight, which warms the ocean more, and burns off even more clouds.Warm ocean temperatures have the potential to devastate marine ecosystems along the U.S. West Coast During warmer months, marine plants and animals with a low heat tolerance are at higher risk than during the winter -- warm on top of warm can be more harmful than warm on top of cold.And with our changing climate, we may see more harmful impacts like this in years to come, the authors reported. As global warming continues, heat extremes like last summer's Blob 2.0 are becoming increasingly likely."It's the same argument that can be made for heat waves on land," said Amaya. "Global warming shifts the entire range of possibilities towards warmer events. The Blob 2.0 is just the beginning. In fact, events like this may not even be considered 'extreme' in the future."The researchers hope these results will help scientists and decision makers better predict and prepare for future marine heat waves. "If we understand the mechanisms that drove this summertime event and how it influenced marine systems," Amaya said, "we can better recognize the early warning signs in the future and better predict how heat waves interact with the coast, how long they last and how destructive they might be."
|
Weather
| 2,020 |
April 20, 2020
|
https://www.sciencedaily.com/releases/2020/04/200420165724.htm
|
Arctic research expedition likely faces extreme conditions in fast-changing Arctic
|
In October 2019, scientists trapped a ship filled with equipment in Arctic sea ice with the intention of drifting around the Arctic Ocean for a full year, gathering data on the polar regions and sea ice floes. However, a new study indicates there is a chance the expedition may melt out months before the year-end goal.
|
The MOSAiC (Multidisciplinary drifting Observatory for the Study of Arctic Climate) research team went through extensive preparation and training for the expedition, including analyzing historic conditions. The new research shows, however, that Arctic conditions have been changing so rapidly that the past may no longer be a guide to today. Scientists at the National Center for Atmospheric Research (NCAR) have used an ensemble of multiple climate model runs to simulate conditions along potential routes for the polar expedition, using today's conditions in the "new Arctic." The results suggest that thinner sea ice may carry the ship farther than would be expected compared to historical conditions, and the sea ice around the ship may melt earlier than the 12-month goal. Of the 30 model runs analyzed in the new study, five (17%) showed melt-out in less than a year. The ensemble of 30 model runs used current climate conditions and reflected the breadth of ways sea ice could form, drift, and melt in a 2020 climate. The study did not incorporate 2019 ice conditions and is not a forecast of the track the ship will take over its year-long expedition. "The whole point of MOSAiC is to understand the new Arctic and how things have changed over the last 10 years," said Alice DuVivier, an NCAR climate scientist and lead author of the new study. "This model gives us an understanding of the range of drifting possibilities the expedition could face in the new ice regime." Scientists have been gathering data on Arctic sea ice extent, which can cover millions of square miles in winter, since 1979 when satellites began capturing annual changes in ice cover. "The changes in the Arctic system are so incredibly rapid that even our satellite observations from 15 years ago are unlike the Arctic today," said Marika Holland, NCAR scientist and co-author of the study. "Now there is thinner ice, which moves more quickly, and there is less snow cover. It is a totally different ice regime." To compare the differences between the "old Arctic" and "new Arctic," the scientists created ice floe tracks for the expedition's ship, a German research icebreaker called the Polarstern, using the sea ice model in the NCAR-based Community Earth System Model (CESM). First, the scientists ran 28 model tracks based on historic satellite data of sea ice conditions. Then they compared the results to 30 model tracks under the young, thin, and more seasonal Arctic ice conditions that are more reflective of recent conditions. In these "new Arctic" conditions, the ice floes moved more quickly, so their paths extended farther and varied more from each other compared to "old Arctic" paths, which include thicker sea ice and shorter, slower tracks. Most notably, the study finds that in the seasonal Arctic scenario, 17% of the simulated tracks show the Polarstern melting out of the ice altogether, months before the October 2020 finish line. The model runs estimate July 29, 2020, as the earliest potential melt date, highlighting that the present-day Arctic has an increased risk of extreme events, such as melt-out. In contrast, none of the tracks run under the historic satellite data showed a melt-out scenario. While the results provide additional insight into the potential outcome of MOSAiC's path, the model runs are not a forecast of the expedition's track, said DuVivier, who presented the study's results to the MOSAiC science team as they prepared for the campaign.
Rather, the results are a way to explore the many scenarios a ship could potentially face over the course of the journey in the current climate. "Modeling is a way to explore many worlds," said DuVivier. "Previous experience isn't always indicative of what is going to happen."By the first phase of the expedition, in fall 2019, the researchers had already encountered thin ice conditions, having initially struggled to find a thick enough ice floe to anchor the Polarstern, and experiencing storms that have challenged the expedition. According to the National Snow & Ice Data Center, the Arctic sea ice in 2019 reached the second-lowest minimum in the satellite record, meaning the expedition began under extremely low ice conditions.Since then, the ship's current track has been drifting farther than expected. "The model experiments from this study have comparable tracks to what the Polarstern has been experiencing in the last few months," said DuVivier. "We are not making a prediction for the expedition, but those types of tracks melt out early in our climate model."Time will tell what the Polarstern's ultimate track and destination will be, but the expedition will still provide scientists with a wealth of data. The information will ultimately be hugely beneficial for improving climate models like CESM and helping scientists understand the changes in the Arctic that future polar expeditions may go on to observe."This is why we need MOSAiC. Models can inform these kinds of campaigns and these campaigns are going to inform our models," said Holland, who has her own project on the MOSAiC expedition, collecting data about snow on sea ice and upper-ocean heating that affects sea ice thickness."We don't have a lot of new observations taken in this new regime, and this will be fundamental to our future understanding of the Arctic," said Holland.
|
Weather
| 2,020 |
April 16, 2020
|
https://www.sciencedaily.com/releases/2020/04/200416151750.htm
|
Climate-driven megadrought is emerging in western US, says study
|
With the western United States and northern Mexico suffering an ever-lengthening string of dry years starting in 2000, scientists have been warning for some time that climate change may be pushing the region toward an extreme long-term drought worse than any in recorded history. A new study says the time has arrived: a megadrought as bad as or worse than anything even from known prehistory is very likely in progress, and warming climate is playing a key role. The study, based on modern weather observations, 1,200 years of tree-ring data and dozens of climate models, appears this week in a leading journal.
|
"Earlier studies were largely model projections of the future," said lead author Park Williams, a bioclimatologist at Columbia University's Lamont-Doherty Earth Observatory. "We're no longer looking at projections, but at where we are now. We now have enough observations of current drought and tree-ring records of past drought to say that we're on the same trajectory as the worst prehistoric droughts."Reliable modern observations date only to about 1900, but tree rings have allowed scientists to infer yearly soil moisture for centuries before humans began influencing climate. Among other things, previous research has tied catastrophic naturally driven droughts recorded in tree rings to upheavals among indigenous Medieval-era civilizations in the Southwest. The new study is the most up-to-date and comprehensive long-term analysis. It covers an area stretching across nine U.S. states from Oregon and Montana down through California and New Mexico, and part of northern Mexico.Using rings from many thousands of trees, the researchers charted dozens of droughts across the region, starting in 800 AD. Four stand out as so-called megadroughts, with extreme aridity lasting decades: the late 800s, mid-1100s, the 1200s, and the late 1500s. After 1600, there were other droughts, but none on this scale.The team then compared the ancient megadroughts to soil moisture records calculated from observed weather in the 19 years from 2000 to 2018. Their conclusion: as measured against the worst 19-year increments within the previous episodes, the current drought is already outdoing the three earliest ones. The fourth, which spanned 1575 to 1603, may have been the worst of all -- but the difference is slight enough to be within the range of uncertainty. Furthermore, the current drought is affecting wider areas more consistently than any of the earlier ones -- a fingerprint of global warming, say the researchers. All of the ancient droughts lasted longer than 19 years -- the one that started in the 1200s ran nearly a century -- but all began on a similar path to to what is showing up now, they say.Nature drove the ancient droughts, and still plays a strong role today. A study last year led by Lamont's Nathan Steiger showed that among other things, unusually cool periodic conditions over the tropical Pacific Ocean (commonly called La Niña) during the previous megadroughts pushed storm tracks further north, and starved the region of precipitation. Such conditions, and possibly other natural factors, appear to have also cut precipitation in recent years. However, with global warming proceeding, the authors say that average temperatures since 2000 have been pushed 1.2 degrees C (2.2 F) above what they would have been otherwise. Because hotter air tends to hold more moisture, that moisture is being pulled from the ground. This has intensified drying of soils already starved of precipitation.All told, the researchers say that rising temperatures are responsible for about half the pace and severity of the current drought. If this overall warming were subtracted from the equation, the current drought would rank as the 11th worst detected -- bad, but nowhere near what it has developed into."It doesn't matter if this is exactly the worst drought ever," said coauthor Benjamin Cook, who is affiliated with Lamont and the Goddard Institute for Space Studies. "What matters is that it has been made much worse than it would have been because of climate change." 
Since temperatures are projected to keep rising, it is likely the drought will continue for the foreseeable future; or fade briefly only to return, say the researchers."Because the background is getting warmer, the dice are increasingly loaded toward longer and more severe droughts," said Williams. "We may get lucky, and natural variability will bring more precipitation for a while. But going forward, we'll need more and more good luck to break out of drought, and less and less bad luck to go back into drought." Williams said it is conceivable the region could stay arid for centuries. "That's not my prediction right now, but it's possible," he said.Lamont climatologist Richard Seager was one of the first to predict, in a 2007 paper, that climate change might eventually push the region into a more arid climate during the 21st century; he speculated at the time that the process might already be underway. By 2015, when 11 of the past 14 years had seen drought, Benjamin Cook led a followup study projecting that warming climate would cause the catastrophic natural droughts of prehistory to be repeated by the latter 21st century. A 2016 study coauthored by several Lamont scientist reinforced those findings. Now, says Cook, it looks like they may have underestimated. "It's already happening," he said.The effects are palpable. The mighty reservoirs of Lake Mead and Lake Powell along the Colorado River, which supply agriculture around the region, have shrunk dramatically. Insect outbreaks are ravaging dried-out forests. Wildfires in California and across wider areas of the U.S. West are growing in area. While 2019 was a relatively wet year, leading to hope that things might be easing up, early indications show that 2020 is already on a track for resumed aridity."There is no reason to believe that the sort of natural variability documented in the paleoclimatic record will not continue into the future, but the difference is that droughts will occur under warmer temperatures," said Connie Woodhouse, a climate scientist at the University of Arizona who was not involved in the study. "These warmer conditions will exacerbate droughts, making them more severe, longer, and more widespread than they would have been otherwise."Angeline Pendergrass, a staff scientist at the U.S. National Center for Atmospheric Research, said that she thinks it is too early to say whether the region is at the cusp of a true megadrought, because the study confirms that natural weather swings are still playing a strong role. That said, "even though natural variability will always play a large role in drought, climate change makes it worse," she said.Tucked into the researchers' data: the 20th century was the wettest century in the entire 1200-year record. It was during that time that population boomed, and that has continued. "The 20th century gave us an overly optimistic view of how much water is potentially available," said Cook. "It goes to show that studies like this are not just about ancient history. They're about problems that are already here."The study was also coauthored by Edward Cook, Jason Smerdon, Kasey Bolles and Seung Baek, all of Lamont-Doherty Earth Observatory; John Abatzaglou of the University of Idaho; and Andrew Badger and Ben Livneh of the University of Colorado, Boulder.
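The comparison described above -- the 2000-2018 period measured against the worst 19-year stretches of past megadroughts -- amounts to ranking 19-year running means of a soil-moisture series. The sketch below does that for a synthetic series; the data are random placeholders, not the tree-ring reconstruction used in the study.

```python
import numpy as np

def worst_19yr_windows(years: np.ndarray, soil_moisture_anom: np.ndarray, n: int = 5):
    """Return the n driest 19-year windows (start year, mean anomaly) of a
    soil-moisture anomaly series. A from-scratch sketch of the comparison
    described above, not the study's code."""
    window = 19
    means = np.convolve(soil_moisture_anom, np.ones(window) / window, mode="valid")
    start_years = years[: len(means)]
    order = np.argsort(means)[:n]            # most negative = driest
    return [(int(start_years[i]), float(means[i])) for i in order]

# Hypothetical reconstruction covering 800-2018 CE
rng = np.random.default_rng(2)
yrs = np.arange(800, 2019)
anom = rng.normal(0.0, 1.0, size=yrs.size)
print(worst_19yr_windows(yrs, anom))
```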
|
Weather
| 2,020 |
April 15, 2020
|
https://www.sciencedaily.com/releases/2020/04/200415133440.htm
|
New textile could keep you cool in the heat, warm in the cold
|
Imagine a single garment that could adapt to changing weather conditions, keeping its wearer cool in the heat of midday but warm when an evening storm blows in. In addition to wearing it outdoors, such clothing could also be worn indoors, drastically reducing the need for air conditioning or heat. Now, researchers have developed a textile that can do both.
|
"Smart textiles" that can warm or cool the wearer are nothing new, but typically, the same fabric cannot perform both functions. These textiles have other drawbacks, as well -- they can be bulky, heavy, fragile and expensive. Many need an external power source. Guangming Tao and colleagues wanted to develop a more practical textile for personal thermal management that could overcome all of these limitations.The researchers freeze-spun silk and chitosan, a material from the hard outer skeleton of shellfish, into colored fibers with porous microstructures. They filled the pores with polyethylene glycol (PEG), a phase-changing polymer that absorbs and releases thermal energy. Then, they coated the threads with polydimethylsiloxane to keep the liquid PEG from leaking out. The resulting fibers were strong, flexible and water-repellent. To test the fibers, the researchers wove them into a patch of fabric that they put into a polyester glove. When a person wearing the glove placed their hand in a hot chamber (122 F), the solid PEG absorbed heat from the environment, melting into a liquid and cooling the skin under the patch. Then, when the gloved hand moved to a cold (50 F) chamber, the PEG solidified, releasing heat and warming the skin. The process for making the fabric is compatible with the existing textile industry and could be scaled up for mass production, the researchers say.
|
Weather
| 2,020 |
April 14, 2020
|
https://www.sciencedaily.com/releases/2020/04/200414122754.htm
|
Solar power plants get help from satellites to predict cloud cover
|
The output of solar energy systems is highly dependent on cloud cover. While weather forecasting can be used to predict the amount of sunlight reaching ground-based solar collectors, cloud cover is often characterized in simple terms, such as cloudy, partly cloudy or clear. This does not provide accurate information for estimating the amount of sunlight available for solar power plants.
|
In a study published this week, researchers describe a satellite-based method, called SCOPE, that addresses this gap. In 2016, NASA began launching a new generation of Geostationary Operational Environmental Satellites, the GOES-R series. These satellites occupy fixed positions above the Earth's surface. Each is equipped with several sophisticated instruments, including the Advanced Baseline Imager, or ABI, which can detect radiation upwelling from the Earth at specific wavelengths. The SCOPE method estimates three properties of clouds that determine the amount of sunlight reaching the Earth's surface. The first, cloud top height, is the altitude corresponding to the top of each cloud. The second, cloud thickness, is simply the difference in altitude between a cloud's top and bottom. The third property is the cloud optical depth, a measure of how a cloud modifies light passing through it. Clouds are, essentially, floating masses of condensed water. The water takes multiple forms as liquid droplets or ice crystals of varying sizes. These different forms of water absorb light in different amounts, affecting a cloud's optical depth. The amount of light absorbed also depends on the light's wavelength. Absorption is especially variable for light in the wider infrared range of the spectrum but not so much for light in the narrower visible range. The SCOPE method simultaneously estimates cloud thickness, top height and optical depth by coupling ABI sensor data from GOES-R satellites to an atmospheric model. Two other inputs to the model come from ground-based weather stations: ambient temperature and relative humidity at the ground. These are used to adjust temperature and gas concentration vertical profiles in the model. The accuracy of the estimated cloud optical properties was evaluated using one year of data from 2018, for measurements taken at seven ground-based locations in the U.S. during both night and day, in all sorts of weather, and with wide spatial coverage at 5-minute intervals. "SCOPE can be used during both day and night with reliable accuracy," said co-author Carlos F.M. Coimbra. "Due to its high-frequency output during daytime, SCOPE is especially suitable for providing accurate real-time estimates of cloud optical properties for solar forecasting applications."
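Optical depth can be made concrete with the Beer-Lambert relation for the direct solar beam: transmittance falls off exponentially with optical depth along the slant path through the cloud. The sketch below is a textbook simplification (real clouds also scatter light downward, so total surface irradiance exceeds this direct component) and is not part of SCOPE itself; the optical depths and sun angle are example values.

```python
import math

def direct_beam_transmittance(optical_depth: float, solar_zenith_deg: float) -> float:
    """Fraction of the direct solar beam that survives passage through a cloud
    layer of the given optical depth (Beer-Lambert law along the slant path)."""
    mu = math.cos(math.radians(solar_zenith_deg))
    return math.exp(-optical_depth / mu)

# Thin cirrus, a moderate cloud deck, and a thick storm cloud, sun 30 deg from zenith:
for tau in (0.5, 5.0, 20.0):
    print(tau, round(direct_beam_transmittance(tau, solar_zenith_deg=30.0), 4))
```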
|
Weather
| 2,020 |
April 6, 2020
|
https://www.sciencedaily.com/releases/2020/04/200406100828.htm
|
Scientists discover legacy of past weather in stories of prairie plant restoration
|
Before there were farms in southwest Michigan, there were prairies. For thousands of years, tall grass prairies stood undisturbed until European settlers turned the rich, highly productive soils to agriculture.
|
Today, tall grass prairies east of the Mississippi are virtually extinct. But some landowners want to return land throughout the Midwest to its incredibly deep roots, converting abandoned, depleted and fallow agricultural fields to native prairie -- with varying degrees of success. Michigan State University's Lars Brudvig, associate professor in the Department of Plant Biology, and former MSU graduate student Anna Funk investigated fields of data going back 20 years to find out why some replanted prairies are healthier than others. Their research has now been published. "Native prairie plants are rare on the landscape, so land managers need to intentionally spread seeds on the ground for them to come back," said Brudvig, whose lab partnered closely with farmers, land managers and various nature conservancies across Illinois, Indiana and Michigan for the study. Each of the 83 sites Brudvig and Funk studied started from roughly the same point and had similar processes of ecological management. Controlled burning, targeted herbicides and regular mowing were common strategies, but every site had widely different outcomes. The big question was, why? "There was a bit of tantalizing evidence from just a couple of studies that suggested the weather you get during a prairie restoration can actually have a long-lasting effect on the success of the project," said Funk, who revisited northern Illinois prairies where she got her start as a plant ecologist with her mentor, Tom Simpson, at the McHenry County Conservation District. "It was an idea that land managers were familiar with anecdotally, but it hadn't been carefully studied." The researchers looked at a wide swath of restored prairies -- the best and the worst -- to see if they could identify any patterns in fields planted in a particularly rainy or hot year. They interviewed the people who had planted each of the prairies in order to pinpoint planting dates, the number of prairie species planted and other details about ongoing management at each of the sites. Then, they went out to each prairie, surveying the abundance and diversity of both weeds and native prairie plants and taking samples of the soil to test how productive each site was. "When I did a big analysis of all of our data, I hoped to find some effect of the planting year, but I still assumed other factors would be most important, like how often the site had been managed with prescribed fire," Funk said. "I was very surprised to find not only an effect of weather, but that sometimes planting-year weather conditions had the biggest effect of all." "We expected that in years where it rained more when the prairie was initially planted, it should turn out better because plants need water," Brudvig added. "But it's exactly the opposite because there are not only prairie plants at these sites, but weedy plant species that really respond to precipitation." Even more surprising was how long the first year's weather left its mark on restored prairie systems. "We thought rainfall would matter at first, but that we should see that signature in the data become less and less important as the sites got older and older," Brudvig said.
"Instead, we saw the first-year weather conditions had a signature that persisted for decades."The findings may seem disheartening as climate change brings wetter springs more often to the Midwest but recognizing weather signatures may be an important tool for restoration practitioners who are fighting to re-establish these extremely imperiled ecosystems."We suggest they use long-term weather forecasts to help predict if it will be a rainy or dry year so that it might be possible for land managers to focus more effort on starting prairies in what is expected to be a drier year," Brudvig explained. "They may also be able to invest more in weed control during rainy planting years by mowing the weeds above the prairie plants.""I really hope this knowledge will be helpful to anyone planting prairies, as well as to ecologists more broadly who are doing experiments that could be affected by the weather," Funk said. "The next step will be to figure out how to best mitigate the weed-bomb that extra rain causes. I'm sure prairie managers already have lots of ideas on what to explore next."
|
Weather
| 2,020 |
April 1, 2020
|
https://www.sciencedaily.com/releases/2020/04/200401111653.htm
|
Uncertain climate future could disrupt energy systems
|
Extreme weather events -- such as severe drought, storms, and heat waves -- have been forecast to become more commonplace and are already starting to occur. What has been less studied is the impact on energy systems and how communities can avoid costly disruptions, such as partial or total blackouts.
|
Now an international team of scientists has published a new study proposing an optimization methodology for designing climate-resilient energy systems and helping ensure that communities will be able to meet future energy needs given weather and climate variability. Their findings were recently published. "On one side is energy demand -- there are different types of building needs, such as heating, cooling, and lighting. Because of long-term climate change and short-term extreme weather events, the outdoor environment changes, which leads to changes in building energy demand," said Tianzhen Hong, a Berkeley Lab scientist who helped design the study. "On the other side, climate can also influence energy supply, such as power generation from hydro, solar and wind turbines. Those could also change because of weather conditions." Working with collaborators from Switzerland, Sweden, and Australia, and led by a scientist at the Ecole Polytechnique Fédérale de Lausanne (EPFL), the team developed a stochastic-robust optimization method to quantify impacts and then use the data to design climate-resilient energy systems. Stochastic optimization methods are often used when variables are random or uncertain. "Energy systems are built to operate for 30 or more years. Current practice is just to assume typical weather conditions today; urban planners and designers don't commonly factor in future uncertainties," said Hong, a computational scientist leading multi-scale energy modeling and simulation at Berkeley Lab. "There is a lot of uncertainty around future climate and weather." "Energy systems," as defined in the study, provide energy needs, and sometimes energy storage, to a group of buildings. The energy supplied could include gas or electricity from conventional or renewable sources. Such community energy systems are not as common in the U.S. but may be found on some university campuses or in business parks. The researchers investigated a wide range of scenarios for 30 Swedish cities. They found that under some scenarios the energy systems in some cities would not be able to generate enough energy. Notably, climate variability could create a 34% gap between total energy generation and demand, and a 16% drop in power supply reliability -- a situation that could lead to blackouts. "We observed that current energy systems are designed in a way that makes them highly susceptible to extreme weather events such as storms and heat waves," said Dasun Perera, a scientist at EPFL's Solar Energy and Building Physics Laboratory and lead author of the study. "We also found that climate and weather variability will result in significant fluctuations in renewable power being fed into electric grids as well as energy demand. This will make it difficult to match the energy demand and power generation. Dealing with the effects of climate change is going to prove harder than we previously thought." The authors note that 3.5 billion people live in urban areas, consuming two-thirds of global energy, and by 2050 urban areas are expected to hold more than two-thirds of the world's population. "Distributed energy systems that support the integration of renewable energy technologies will support the energy transition in the urban context and play a vital role in climate change adaptation and mitigation," they wrote. Hong leads an urban science research group at Berkeley Lab that studies energy and environmental issues at the city scale.
The group is part of Berkeley Lab's Building Technology and Urban Systems Division, which for decades has been at the forefront of research into advancing energy efficiency in the built environment.
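The paper's full stochastic-robust formulation is not reproduced in this summary, but the underlying idea -- choosing a design that performs acceptably across many possible weather years rather than a single typical year -- can be sketched briefly. Everything below (the toy energy-balance model, the synthetic weather scenarios, and the candidate PV/battery sizes) is an illustrative assumption, not data or code from the study.

```python
# Minimal sketch of scenario-based ("stochastic-robust") system sizing.
# All numbers, scenario shapes, and the energy-balance model are invented
# for illustration; they are not values from the study.
import random

random.seed(0)

def unmet_fraction(pv_kw, battery_kwh, scenario):
    """Fraction of annual demand that cannot be served in one weather scenario."""
    soc = battery_kwh          # state of charge; start with a full battery
    unmet = total = 0.0
    for solar_cf, demand_kwh in scenario:          # hourly capacity factor, demand
        balance = pv_kw * solar_cf - demand_kwh
        if balance >= 0:
            soc = min(battery_kwh, soc + balance)  # charge with the surplus
        else:
            draw = min(soc, -balance)              # discharge to cover the deficit
            soc -= draw
            unmet += -balance - draw
        total += demand_kwh
    return unmet / total

def synthetic_year(hours=8760, heat_wave=False):
    """Synthetic weather year; a 'heat wave' raises demand in the second half."""
    year = []
    for h in range(hours):
        solar_cf = max(0.0, random.gauss(0.20, 0.10))
        demand = max(0.1, random.gauss(1.0, 0.2)) * (1.5 if heat_wave and h > 4380 else 1.0)
        year.append((solar_cf, demand))
    return year

scenarios = [synthetic_year(heat_wave=(i % 3 == 0)) for i in range(9)]
candidates = [(pv, bat) for pv in (4, 6, 8) for bat in (5, 10, 20)]

# Robust choice: minimize the *worst-case* unmet demand across scenarios.
best = min(candidates,
           key=lambda c: max(unmet_fraction(c[0], c[1], s) for s in scenarios))
print("robust design (pv_kw, battery_kwh):", best)
```

Swapping the inner `max` for an average over scenarios turns the same search from a robust (worst-case) criterion into a stochastic (expected-value) one -- the two perspectives such methods combine.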
|
Weather
| 2,020 |
March 31, 2020
|
https://www.sciencedaily.com/releases/2020/03/200331130110.htm
|
New global groundwater maps
|
NASA researchers have developed new satellite-based, weekly global maps of soil moisture and groundwater wetness conditions, along with one- to three-month U.S. forecasts of each product. While maps of current dry/wet conditions for the United States have been available since 2012, this is the first time they have been available globally.
|
"The global products are important because there are so few worldwide drought maps out there," said hydrologist and project lead Matt Rodell of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Droughts are usually well known when they happen in developed nations. But when there's a drought in central Africa, for example, it may not be noticed until it causes a humanitarian crisis. So it's valuable to have a product like this where people can say, wow, it's really dry there and no one's reporting it."These maps are distributed online by the National Drought Mitigation Center at the University of Nebraska-Lincoln (UNL) to support U.S. and global drought monitoring."Being able to see a weekly snapshot of both soil moisture and groundwater is important to get a complete picture of drought," said professor Brian Wardlow, director for the Center for Advanced Land Management Information Technologies at UNL, who works closely with Rodell on developing remote sensing tools for operational drought monitoring.Monitoring the wetness of the soil is essential for managing agricultural crops and predicting their yields, because soil moisture is the water available to plant roots. Groundwater is often the source of water for crop irrigation. It also sustains streams during dry periods and is a useful indicator of extended drought. But ground-based observations are too sparse to capture the full picture of wetness and dryness across the landscape like the combination of satellites and models can.Both the global maps and the U.S. forecasts use data from NASA and German Research Center for Geosciences's Gravity Recovery and Climate Experiment Follow On (GRACE-FO) satellites, a pair of spacecraft that detect the movement of water on Earth based on variations of Earth's gravity field. GRACE-FO succeeds the highly successful GRACE satellites, which ended their mission in 2017 after 15 years of operation. With the global expansion of the product, and the addition of U.S. forecasts, the GRACE-FO data are filling in key gaps for understanding the full picture of wet and dry conditions that can lead to drought.The satellite-based observations of changes in water distribution are integrated with other data within a computer model that simulates the water and energy cycles. The model then produces, among other outputs, time-varying maps of the distribution of water at three depths: surface soil moisture, root zone soil moisture (roughly the top three feet of soil), and shallow groundwater. The maps have a resolution of 1/8th degree of latitude, or about 8.5 miles, providing continuous data on moisture and groundwater conditions across the landscape.The GRACE and GRACE-FO satellite-based maps are among the essential data sets used by the authors of the U.S. Drought Monitor, the premier weekly map of drought conditions for the United States that is used by the U.S. Department of Agriculture and the Federal Emergency Management Agency, among others, to evaluate which areas may need financial assistance due to losses from drought."GRACE [provided and GRACE-FO now provides] a national scope of groundwater," said climatologist and Drought Monitor author Brian Fuchs, at the drought center. He and the other authors use multiple data sets to see where the evidence shows conditions have gotten drier or wetter. For groundwater, that used to mean going to individual states' groundwater well data to update the weekly map. 
"It's saved a lot of time having that groundwater layer along with the soil moisture layers, all in one spot," Fuchs said. "The high-resolution data that we're able to bring in allows us to draw those contours of dryness or wetness right to the data itself."One of the goals of the new global maps is to make the same consistent product available in all parts of the world -- especially in countries that do not have any groundwater-monitoring infrastructure."Drought is really a key [topic]... with a lot of the projections of climate and climate change," Wardlow said. "The emphasis is on getting more relevant, more accurate and more timely drought information, whether it be soil moisture, crop health, groundwater, streamflow -- [the GRACE missions are] central to this," he said. "These types of tools are absolutely critical to helping us address and offset some of the impacts anticipated, whether it be from population growth, climate change or just increased water consumption in general."Both the Center for Advanced Land Management and the National Drought Mitigation Center are based in UNL's School of Natural Resources, and they are working with international partners, including the U.S. Agency for International Development and the World Bank, to develop and support drought monitoring using the GRACE-FO global maps and other tools in the Middle East, North Africa, South Africa, South East Asia, and India.Droughts can be complex, both in timing and extent. At the surface, soil moisture changes rapidly with weather conditions. The moisture in the root zone changes a little slower but is still very responsive to weather. Lagging behind both is groundwater, since it is insulated from changes in the weather. But for longer-term outlooks on drought severity -- or, conversely, flood risk in low-lying areas -- groundwater is the metric to watch, said Rodell."The groundwater maps are like a slowed down, smoothed version of what you see at the surface," Rodell said. "They represent the accumulation of months or years of weather events." That smoothing provides a more complete picture of the overall drying or wetting trend going on in an area. Having an accurate accounting of groundwater levels is essential for accurately forecasting near-future conditions.The new forecast product that projects dry and wet conditions 30, 60, and 90 days out for the lower 48 United States uses GRACE-FO data to help set the current conditions. Then the model runs forward in time using the Goddard Earth Observing System, Version 5 seasonal weather forecast model as input. The researchers found that including the GRACE-FO data made the resulting soil moisture and groundwater forecasts more accurate.Since the product has just been rolled out, the user community is only just beginning to work with the forecasts, but Wardlow sees a huge potential."I think you'll see the GRACE-FO monitoring products used in combination with the forecasts," Wardlow said. "For example, the current U.S. product may show moderate drought conditions, and if you look at the forecast and the forecast shows next month that there's a continued drying trend, then that may change the decision versus if it was a wet trend."The U.S. forecast and global maps are freely available to users through the drought center's data portal.To download the maps, visit:
|
Weather
| 2,020 |
March 25, 2020
|
https://www.sciencedaily.com/releases/2020/03/200325143802.htm
|
Heat takes its toll on mental health
|
Hot days increase the probability that an average adult in the U.S. will report bad mental health, according to a study published March 25, 2020 in an open-access journal.
|
As one of the most important factors affected by climate change, human health is now recognized as a global research priority. In particular, mental health has been gaining attention among world leaders in recent years. The promotion of mental health has, for the first time, been included in the United Nations Sustainable Development Agenda under goal number three ("Good Health and Well-being") to be reached by 2030. In a rapidly warming world, temperature increases pose a challenge to achieving that goal. In the new study, Li and colleagues set out to gauge the magnitude of that challenge by quantifying the effect of temperature on self-reported mental health.The researchers examined the relationship between mental health data and historical daily weather information for more than three million Americans between 1993 and 2010. Compared to the temperature range of 60-70°F, cooler days in the past month reduce the probability of reporting days of bad mental health, whereas hotter days increase this probability. In addition, cooler days have an immediate beneficial effect, whereas hotter days tend to matter most after about 10 consecutive days. The willingness to pay to avoid an additional hot day in the past month ranges from $2.6 to $4.6 per day. According to the authors, future studies should examine how community-level factors mediate the effects of climate change on individual mental health to guide the design of appropriate policies.The authors add: "We found a positive relationship between hotter temperatures and self-reported mental distress in the United States."
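Estimates of this kind typically come from regressing reported bad-mental-health days on counts of days falling in different temperature bins, with the 60-70°F bin as the omitted reference. The example below simulates such a regression to show its shape; the data and coefficients are invented, not the study's estimates.

```python
# Simulated binned-temperature regression (not the study's data or results).
import numpy as np

rng = np.random.default_rng(0)
n = 5000
bins = ["<50F", "50-60F", "70-80F", ">80F"]        # 60-70F is the reference bin

# Counts of days (out of the past 30) each respondent spent in each bin.
X = rng.integers(0, 10, size=(n, len(bins))).astype(float)
assumed_effect = np.array([-0.02, -0.01, 0.02, 0.05])   # invented, per day in bin
y = 3.0 + X @ assumed_effect + rng.normal(0, 1.0, n)    # reported bad days

A = np.column_stack([np.ones(n), X])                    # add an intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, b in zip(bins, coef[1:]):
    print(f"{name}: {b:+.3f} bad-mental-health days per extra day in bin")
```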
|
Weather
| 2,020 |
March 18, 2020
|
https://www.sciencedaily.com/releases/2020/03/200318143722.htm
|
Global warming influence on extreme weather events has been frequently underestimated
|
A new Stanford study reveals that a common scientific approach of predicting the likelihood of future extreme weather events by analyzing how frequently they occurred in the past can lead to significant underestimates -- with potentially significant consequences for people's lives.
|
Stanford climate scientist Noah Diffenbaugh found that predictions that relied only on historical observations underestimated by about half the actual number of extremely hot days in Europe and East Asia, and the number of extremely wet days in the U.S., Europe and East Asia.The paper, published March 18 in "We are seeing year after year how the rising incidence of extreme events is causing significant impacts on people and ecosystems," Diffenbaugh said. "One of the main challenges in becoming more resilient to these extremes is accurately predicting how the global warming that's already happened has changed the odds of events that fall outside of our historical experience."For decades, engineers, land-use planners and risk managers have used historical weather observations from thermometers, rain gauges and satellites to calculate the probability of extreme events. Those calculations -- meant to inform projects ranging from housing developments to highways -- have traditionally relied on the assumption that the risk of extremes could be assessed using only historical observations. However, a warming world has made many extreme weather events more frequent, intense and widespread, a trend that is likely to intensify, according to the U.S. government.Scientists trying to isolate the influence of human-caused climate change on the probability and/or severity of individual weather events have faced two major obstacles. There are relatively few such events in the historical record, making verification difficult, and global warming is changing the atmosphere and ocean in ways that may have already affected the odds of extreme weather conditions.In the new study, Diffenbaugh, the Kara J. Foundation professor at Stanford's School of Earth, Energy & Environmental Sciences, revisited previous extreme event papers he and his colleagues had published in recent years. Diffenbaugh wondered if he could use the frequency of record-setting weather events from 2006 to 2017 to evaluate the predictions his group had made using data from 1961 to 2005. He found in some cases the actual increase in extreme events was much larger than what had been predicted."When I first looked at the results, I had this sinking feeling that our method for analyzing these extreme events could be all wrong," said Diffenbaugh, who is also the Kimmelman Family senior fellow in the Stanford Woods Institute for the Environment. "As it turned out, the method actually worked very well for the period that we had originally analyzed -- it's just that global warming has had a really strong effect over the last decade."Interestingly, Diffenbaugh also found that climate models were able to more accurately predict the future occurrence of record-setting events. While acknowledging that climate models still contain important uncertainties, Diffenbaugh says the study identifies the potential for new techniques that incorporate both historical observations and climate models to create more accurate, robust risk management tools."The good news," Diffenbaugh said, "is that these new results identify some real potential to help policymakers, engineers and others who manage risk to integrate the effects of global warming into their decisions."
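The evaluation idea -- count record-setting events in a recent window and compare that count with what a climate resembling the earlier observed period would have produced -- can be illustrated on synthetic data. The warming trend, noise level, and resampling baseline below are assumptions chosen for illustration, not the observations or methods of the paper.

```python
# Records observed under a warming trend vs. records expected from a
# stationary baseline built only from the earlier period (synthetic data).
import random

random.seed(2)

years = list(range(1961, 2018))
temps = [0.02 * (y - 1961) + random.gauss(0, 0.5) for y in years]   # trend + noise

def new_records(values, start):
    """Count values at or after index `start` that exceed everything before them."""
    count, running_max = 0, max(values[:start])
    for v in values[start:]:
        if v > running_max:
            count += 1
            running_max = v
    return count

split = years.index(2006)
observed = new_records(temps, split)

# Stationary baseline: rebuild the series from pre-2006 values only and ask
# how many records such a climate would set in the same 12-year window.
trials = []
for _ in range(2000):
    fake = random.sample(temps[:split], split) + \
           [random.choice(temps[:split]) for _ in range(len(years) - split)]
    trials.append(new_records(fake, split))
expected = sum(trials) / len(trials)

print(f"records observed 2006-2017: {observed}")
print(f"records expected if the climate had not changed: {expected:.1f}")
```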
|
Weather
| 2,020 |
March 11, 2020
|
https://www.sciencedaily.com/releases/2020/03/200311100855.htm
|
Coral reefs 'weathering' the pressure of globalization
|
More information about the effects human activities have on Southeast Asian coral reefs has been revealed, with researchers looking at how large-scale global pressures, combined with the El Niño Southern Oscillation (ENSO) climate pattern, can detrimentally impact these delicate marine ecosystems.
|
The research, published in the Lead Australian researcher Dr Nicola Browne, a coral ecologist from the School of Molecular and Life Sciences at Curtin University, said that over the past 40 years, nearly all of Southeast Asia's marine coastal ecosystems have experienced intense pressures, due to large-scale economic development, urbanisation and deforestation."The ways that humans use the land can severely impact soil contents and soil erosion patterns, which then discharge sediments into freshwater systems and nearby marine environments, ultimately altering the water quality on coral reefs," Dr Browne said."Weather patterns that bring heavy rainfalls, such as those seen with the ENSO climate pattern, can exacerbate these erosion patterns even more, bringing more sediments into the local marine environments, which then ultimately end up affecting the coral reef ecosystems."Through analysing coral core samples from the Miri-Sibuti Coral Reefs National Park in Sarawak, Malaysia, the research team revealed decade-long, synchronous climatic impacts on the reef systems in northern Borneo, linking the El Niño Southern Oscillation climate pattern to effects shown on the area's coral reefs.Lead researcher PhD candidate Ms Hedwig Krawczyk, from the University of Leicester in the United Kingdom, explained coral cores act as a type of 'record keeper' of the local marine environments, creating fossils which researchers are able to read, interpret and then use to predict future ecosystem impacts."Corals incorporate geochemical tracers from the surrounding water into their skeleton, leaving behind a type of record of the marine environment at specific instances in time," Ms Krawczyk said."In coastal regions where there is limited, long-term environmental data, such as in Borneo, coral cores provide a critical record of local changes in river runoff and rainfall. These records help us to understand the types of pressures these reefs have been exposed to over the last 30 years, and their level of resilience to future environmental changes."Our study testified that both marine and terrestrial environments in Borneo are massively affected by changes in the hydroclimate associated with ENSO, and longer term cycles in regional rainfall and temperature."
|
Weather
| 2,020 |
March 10, 2020
|
https://www.sciencedaily.com/releases/2020/03/200310101550.htm
|
Climate shifts prompt shrubs and trees to take root in open areas
|
Wild, treeless landscapes are becoming more wooded as climate change leads to warming temperatures and wetter weather, research suggests.
|
Trees and shrubs are spreading across the tundra and the savanna, transforming these vast, open areas that contain unique biodiversity, researchers say.The dramatic changes to these regions -- which account for some 40 per cent of the world's land -- could alter the global carbon balance and climate system, scientists say. This is because woody plants store carbon, provide fuel for fires and influence how much of the sun's heat is reflected back into space.As well as affecting the climate, increasing woody plant cover could alter the unique biodiversity of areas home to diverse species including caribou in the tundra and elephants in the savanna, researchers say.Rapid warming in the Arctic tundra -- spanning northern parts of Canada, the US, Greenland, northern Europe and Russia -- has increased shrub plant cover there by 20 per cent over the past 50 years, the study found.Expanding shrub cover could raise soil temperatures in the tundra, leading to thawing of the permafrost -- frozen ground that contains nearly half of the world's soil carbon.Scientists found that shrub and tree cover in savannas -- which include Africa's plains, Australia's outback and drylands of South America -- rose by 30 per cent during the same period, as rainfall increased.A team led by University of Edinburgh researchers carried out the largest global woody cover change study of its kind to date. They compared temperature and rainfall data with more than 1000 records of plant cover change from almost 900 sites across six continents.They also found that other factors -- including wild fires and animal grazing patterns -- affect shrub and tree cover, revealing that variables shaping the future of the tundra and the savanna are more complex than previously thought.The study, published in the journal Mariana García Criado, of the University of Edinburgh's School of GeoSciences, who led the study, said: "This research indicates the far-reaching effects of climate change across the planet. Uncovering the ways in which different landscapes are responding requires collaboration among scientists, and cooperation with local peoples to better understand the changes we're seeing and their impacts from different perspectives."
|
Weather
| 2,020 |
March 9, 2020
|
https://www.sciencedaily.com/releases/2020/03/200309130018.htm
|
How new data can make ecological forecasts as good as weather forecasts
|
When El Nino approaches, driven by warm Pacific Ocean waters, we've come to expect both drenching seasonal rains in the southern U.S. and drought in the Amazon. Those opposite extremes have huge effects on society and are increasingly predictable thanks to decades of weather data.
|
Soon, University of Wisconsin-Madison ecologist Ben Zuckerberg thinks we'll be able to pull off the same forecasting feat for bird migrations and wildlife populations. That's because just as those recurring changes in climate have predictable consequences for humans, they also have predictable effects on plants and animals.For instance, ecological predictions could help us prepare for diseases in crops or population crashes in endangered species. Good forecasting could tell us where conservation measures are needed most in the coming year or decade.With a team of scientists, Zuckerberg published a paper March 5 in the journal "Plant and animal populations respond to climate at continental scales," says Zuckerberg, who is leading an interdisciplinary team looking to unearth evidence of this global climate-ecology link. "Going forward, we want to know how do we observe this connection? How do we measure it? How do we track how these dynamics are changing?"He and his team believe that a recent revolution in ecological data makes this possible. With the rise of citizen science, hundreds of thousands of global volunteers have been collecting quality data about the world around them. And the National Science Foundation has begun setting up ecological stations nationwide that mirror the ubiquitous weather stations we rely on for constant data collection."We are beginning that revolution right now in ecology where we are able to collect data at a scale that matches what climatologists have been able to use," says Zuckerberg. "Having data that's been collected over continental scales, in real time, and that spans decades is really what you need to analyze the regularity and changes in both climate and ecological dipoles."The idea that climate affects ecosystems across big expanses is not entirely new. It's been clear for decades that plant and animal behavior can be synchronized across a region. One classic example is acorn production. In certain years, all the oak trees in an area will produce huge amounts of acorns, which in turn leads to population booms in squirrels and other animals. Most likely, climate helps organize this collective response. Better data will make it easier to spot these kinds of patterns across the globe.Understanding this climate-ecology connection is more urgent than ever as Earth rapidly warms and its climate changes, says Zuckerberg. It's not clear how climate change will affect patterns like El Nino or the plants and animals that respond to those patterns. Getting a handle on how predictable climate extremes affect ecosystems will help researchers respond to changes as they arise.Now with their theory laid out, Zuckerberg's team is beginning the first project to formally test the ecological dipole idea. They will use citizen science data to track deviations in normal bird migrations and the boom-and-bust cycle of seed production to try to identify a link back to climate across the entire continent.For Zuckerberg, the fun comes from wrapping his head around this modern-day butterfly effect."Shifts in the climate system that can influence these ecological processes originate halfway across the world," he says. "And I love thinking about how these connections are going to change over time. It's really fascinating."
|
Weather
| 2,020 |
March 5, 2020
|
https://www.sciencedaily.com/releases/2020/03/200305132223.htm
|
What we don't know (about lakes) could hurt us
|
As the power of extreme weather events increase with climate change, a team of scientists warn that lakes around the world may dramatically change, threatening ecosystem health and water quality.
|
And the international team reports that our limited understanding of how lakes -- especially algae at the base of food webs -- may respond to more-extreme storms represents a knowledge gap that increases the risk.The team of 39 scientists from 20 countries on four continents investigated what is currently known about how lake ecosystems respond to extreme storm events. The scientists found they cannot confidently predict how lakes will respond to the more frequent and intense storms that are expected in a warming world."If extreme weather events significantly change carbon, nutrient, or energy cycling in lakes, we better figure it out quickly," said Jason Stockwell, an aquatic ecologist at the University of Vermont who led the new research, "because lakes can flip, like a lightbulb, from one healthy state to an unhealthy one -- and it can be hard or impossible to flip them back again."The new study focused on phytoplankton -- microscopic plants commonly known as algae. "Phytoplankton are of particular concern because they are the base of the food web," said Stockwell, "and a critical driver of water quality."The new study, "Storm Impacts on Phytoplankton Community Dynamics in Lakes," was published in the journal It is well known that extreme weather events damage property, infrastructure, and the environment, including freshwater resources that are critical to human health. However, lakes are especially sensitive to storm events because they experience storms directly and receive storm runoff from throughout their watersheds. Runoff includes sediments, nutrients, microplastics, and much more."We have a good idea of how lakes physically respond to storms: the water column mixes, water temperature changes, and sediments can be churned up from the bottom or delivered by rivers and streams to make the lake more turbid," Stockwell said. "But the physical response of the lake is just a part of the story. The biological impact of storms on phytoplankton and other plants and animals is fundamental to how lakes behave -- and, as our study reveals, poorly understood."In a search of thousands of scientific articles from around the world, the scientists found just 31 studies on 18 lakes that connected storms to freshwater lake conditions, and then to phytoplankton. Not only was the information sparse, but the few available findings were inconsistent. It became clear that the scientific community has a poor understanding of how phytoplankton respond to storms, or how their responses may differ by storm types, across different lakes, or even at different times of year.The scientists call for a collaborative, multi-disciplinary effort by modelers, limnologists, watershed experts and other scientists, through research coordination networks -- such as the Global Lake Ecological Observatory Network (GLEON) -- to develop and advance a research framework of storm impacts on phytoplankton.The team of scientists suggest several research directions including integrating watershed and lake physical models with biological models to better predict phytoplankton responses to storm-induced changes to lake conditions. The scientists also recommend continued and expanded long-term lake monitoring programs, coupled with networks of electronic high-frequency sensors, to evaluate short-term changes, emergent patterns, and long-term responses of lakes and water quality to storm events.Similar research is also required for zooplankton, tiny grazers a little smaller than a rice grain that are essential food for fish. 
The goal is to better understand the pathways by which storms impact watershed-scale processes and plants and animals in lakes."We must quickly learn more -- so we can better respond to the very real and pressing threat of climate change on lakes around the world," said Stockwell, director of UVM's Rubenstein Ecosystem Science Laboratory. "Without healthy lakes, we are sunk," he said.
|
Weather
| 2,020 |
March 3, 2020
|
https://www.sciencedaily.com/releases/2020/03/200303113321.htm
|
New version of Earth model captures climate dynamics
|
Earth supports a breathtaking range of geographies, ecosystems and environments, each of which harbors an equally impressive array of weather patterns and events. Climate is an aggregate of all these events averaged over a specific span of time for a particular region. Looking at the big picture, Earth's climate just ended the decade on a high note -- although not the type one might celebrate.
|
In January, several leading U.S. and European science agencies reported 2019 as the second-hottest year on record, closing out the hottest decade. July went down as the hottest month ever recorded.Using new high-resolution models developed through the U.S. Department of Energy's (DOE) Office of Science, researchers are trying to predict these kinds of trends for the near future and into the next century; hoping to provide the scientific basis to help mitigate the effects of extreme climate on energy, infrastructure and agriculture, among other essential services required to keep civilization moving forward.Seven DOE national laboratories, including Argonne National Laboratory, are among a larger collaboration working to advance a high-resolution version of the Energy Exascale Earth System Model (E3SM). The simulations they developed can capture the most detailed dynamics of climate-generating behavior, from the transport of heat through ocean eddies -- advection -- to the formation of storms in the atmosphere."E3SM is an Earth system model designed to simulate how the combinations of temperature, winds, precipitation patterns, ocean currents and land surface type can influence regional climate and built infrastructure on local, regional and global scales," explains Robert Jacob, Argonne's E3SM lead and climate scientist in its Environmental Science division. "More importantly, being able to predict how changes in climate and water cycling respond to increasing carbon dioxide (CO"Climate change can also have big impacts on our need and ability to produce energy, manage water supplies and anticipate impacts on agriculture" he adds, "so DOE wants a prediction model that can describe climate changes with enough detail to help decision-makers."Facilities along our coasts are vulnerable to sea level rise caused, in part, by rapid glacier melts, and many energy outages are the result of extreme weather and the precarious conditions it can create. For example, 2019's historically heavy rainfalls caused damaging floods in the central and southern states, and hot, dry conditions in Alaska and California resulted in massive wild fires.And then there is Australia.To understand how all of Earth's components work in tandem to create these wild and varied conditions, E3SM divides the world into thousands of interdependent grid cells -- 86,400 for the atmosphere to be exact. These account for most major terrestrial features from "the bottom of the ocean to nearly the top of the atmosphere," collaboration members wrote in a recent article published in the Journal of Advances in Modeling Earth Systems."The globe is modeled as a group of cells with 25 kilometers between grid centers horizontally or a quarter of a degree of latitude resolution," says Azamat Mametjanov, an application performance engineer in Argonne's Mathematics and Computer Science division. "Historically, spatial resolution has been much coarser, at one degree or about 100 kilometers. So we've increased the resolution by a factor of four in each direction. We are starting to better resolve the phenomena that energy industries worry about most -- extreme weather."Researchers believe that E3SM's higher-resolution capabilities will allow researchers to resolve geophysical features like hurricanes and mountain snowpack that prove less clear in other models. 
One of the biggest improvements to the E3SM model was sea surface temperature and sea ice in the North Atlantic Ocean, specifically, the Labrador Sea, which required an accurate accounting of air and water flow."This is an important oceanic region in which lower-resolution models tend to represent too much sea ice coverage," Jacob explains. "This additional sea ice cools the atmosphere above it and degrades our predictions in that area and also downstream."Increasing the resolution also helped resolve the ocean currents more accurately, which helped make the Labrador Sea conditions correspond with observations from satellites and ships, as well as making better predictions of the Gulf Stream.Another distinguishing characteristic of the model, says Mametjanov, is its ability to run over multiple decades. While many models can run at even higher resolution, they can run only from five to 10 years at most. Because it uses the ultra-fast DOE supercomputers, the 25-km E3SM model ran a course of 50 years.Eventually, the team wants to run 100 years at a time, interested mainly in the climate around 2100, which is a standard end date used for simulations of future climate.Higher resolution and longer time sequences aside, running such a model is not without its difficulties. It is a highly complex process.For each of the 86,400 cells related to the atmosphere, researchers run dozens of algebraic operations that correspond to some meteorological processes, such as calculating wind speed, atmospheric pressure, temperature, moisture or the amount of localized heating contributed by sunlight and condensation, to name just a few."And then we have to do it thousands of times a day," says Jacob. "Adding more resolution makes the computation slower; it makes it harder to find the computer time to run it and check the results. The 50-year simulation that we looked at in this paper took about a year in real time to run."Another dynamic for which researchers must adjust their model is called forcing, which refers mainly to the natural and anthropogenic drivers that can either stabilize or push the climate into different directions. The main forcing on the climate system is the sun, which stays relatively constant, notes Jacob. But throughout the 20th century, there have been increases in other external factors, such as COFor this first simulation, the team was not so much probing a specific stretch of time as working on the model's stability, so they chose a forcing that represents conditions during the 1950s. The date was a compromise between preindustrial conditions used in low-resolution simulations and the onset of the more dramatic anthropogenic greenhouse gas emissions and warming that would come to a head in this century.Eventually, the model will integrate current forcing values to help scientists further understand how the global climate system will change as those values increase, says Jacob."While we have some understanding, we really need more information -- as do the public and energy producers -- so we can see what's going to happen at regional scales," he adds. "And to answer that, you need models that have more resolution."One of the overall goals of the project has been to improve performance of the E3SM on DOE supercomputers like the Argonne Leadership Computing Facility's Theta, which proved the primary workhorse for the project. 
But as computer architectures change with an eye toward exascale computing, next steps for the project include porting the models to GPUs."As the resolution increases using exascale machines, it will become possible to use E3SM to resolve droughts and hurricane trends, which develop over multiple years," says Mametjanov."Weather models can resolve some of these, but at most for about 10 days. So there is still a gap between weather models and climate models and, using E3SM, we are trying to close that gap."
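The figures quoted above give a feel for the arithmetic scale of one part of the model and can simply be multiplied out. In the sketch below, the per-cell operation count and the number of steps per simulated day are assumed midpoints of the article's round numbers; the real model does far more work (ocean, sea ice, land, many vertical levels and tracers), so the result is a loose floor rather than a performance figure.

```python
# Back-of-the-envelope scale of the atmosphere computation described above.
# Inputs are the article's round numbers plus assumed midpoints, not
# measured E3SM performance.
atmosphere_cells = 86_400          # grid cells quoted in the article
ops_per_cell_per_step = 50         # "dozens of algebraic operations" (assumed)
steps_per_simulated_day = 2_000    # "thousands of times a day" (assumed)

ops_per_day = atmosphere_cells * ops_per_cell_per_step * steps_per_simulated_day
ops_50_years = ops_per_day * 365 * 50
print(f"~{ops_per_day:.1e} operations per simulated day (atmosphere only)")
print(f"~{ops_50_years:.1e} operations for a 50-year run")
```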
|
Weather
| 2,020 |
February 27, 2020
|
https://www.sciencedaily.com/releases/2020/02/200227144235.htm
|
Big data helps farmers adapt to climate variability
|
A new Michigan State University study shines a light on how big data and digital technologies can help farmers better adapt to threats -- both present and future -- from a changing climate.
|
The study, published in Between 2007 and 2016, the U.S. economy took an estimated $536 million economic hit because of yield variation in unstable farmland caused by climate variability across the Midwest. More than one-quarter of corn and soybean cropland in the region is unstable. Yields fluctuate between over-performing and underperforming on an annual basis.Bruno Basso, MSU Foundation professor of earth and environmental sciences, and his postdoctoral research fellow, Rafael Martinez-Feria, set out to address the key pillars of the National Institute for Food and Agriculture's Coordinated Agricultural Project that Basso has led since 2015."First, we wanted to know why -- and where -- crop yields varied from year to year in the corn and soybean belt of the U.S.," Basso said. "Next, we wanted to find out if it was possible to use big data to develop and deploy climate-smart agriculture solutions to help farmers reduce cost, increase yields and limit environmental impact."Basso and Martinez-Feria first examined soil and discovered that alone, it could not sufficiently explain such drastic yield variations."The same soil would have low yield one year and high yield the next," Basso said. "So, what is causing this temporal instability?"Using an enormous amount of data obtained from satellites, research aircraft, drones and remote sensors, and from farmers via advanced geospatial sensor suites present in many modern combine harvesters, Basso and Martinez-Feria wove big data and digital expertise together.What they found is that the interaction between topography, weather and soil has an immense impact on how crop fields respond to extreme weather in unstable areas. Terrain variations, such as depressions, summits and slopes, create localized areas where water stands or runs off. Roughly two-thirds of unstable zones occur in these summits and depressions and the terrain controls water stress experienced by crops.With comprehensive data and the technology, the team quantified the percentage of every single corn or soybean field in the Midwest that is prone to water excess or water deficit. Yields in water-deficient areas can be 23 to 33% below the field average for seasons with low rainfall but are comparable to the average in very wet years. Areas prone to water excess experienced yields 26 to 33% below field average during wet years.Basso believes their work will help determine the future of climate-smart agriculture technologies."We are primarily concerned with helping farmers see their fields in a new manner, helping them make better decisions to improve yield, reduce cost and improve environmental impact," Basso said. "Knowing that you have an area shown to be water deficient, you will plan your fertilizer applications differently. The amount of fertilizer for this area should be significantly lower than what you would apply in areas of the same field with more water available to the plants."The study was funded by USDA NIFA (awards 2015-68007-23133, 2018- 67003-27406) and AgBioResearch.
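The stability classification rests on a simple idea: normalize each zone's yield by the field average for that season, then flag zones whose normalized yield swings widely from year to year as unstable. The example below uses invented yields and an arbitrary threshold, not the study's data or its classification rules.

```python
# Toy yield-stability classification for a few field zones (invented data).
import statistics

# yields[zone] = yield (t/ha) for the same zone over five seasons
yields = {
    "summit_A":     [7.1, 9.8, 6.5, 10.2, 7.0],
    "depression_B": [10.5, 6.2, 11.0, 5.9, 10.8],
    "slope_C":      [9.0, 9.3, 8.8, 9.1, 9.2],
}
n_years = 5
field_mean_by_year = [statistics.mean(z[i] for z in yields.values())
                      for i in range(n_years)]

for zone, series in yields.items():
    relative = [y / m for y, m in zip(series, field_mean_by_year)]
    swing = statistics.pstdev(relative)
    label = "unstable" if swing > 0.10 else "stable"     # assumed threshold
    print(f"{zone:13s} relative-yield stdev {swing:.2f} -> {label}")
```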
|
Weather
| 2,020 |
February 20, 2020
|
https://www.sciencedaily.com/releases/2020/02/200220130507.htm
|
The climate and increased extreme weather affect our energy systems
|
Climate change, with more and more storms and heat waves, also has consequences for our energy supply. An international research team has now developed a new method for calculating how extreme weather affects energy systems.
|
Climate change is often described in terms of average temperature changes. But it is mainly extreme weather events, like cold snaps, autumn storms and summer heat waves, that have the greatest impact on the economy and society.And our energy systems -- especially systems that include renewable energy sources -- are highly dependent on the weather and climate. But to date, there have been no suitable methods for calculating how future extreme weather events will affect these energy systems.Renewable energy, such as solar and wind power, play a crucial role for reducing climate change by partially replacing fossil-fuel-based energy sources."But their capacity is highly dependent on weather conditions, which makes their share in the existing energy system something of a challenge when it comes to reliability and stability," says Deliang Chen, professor of physical meteorology at the University of Gothenburg and one of the five researchers on the international research team.The researchers have now developed a new method for predicting how energy systems that contain renewable energy technology may be affected in the future by a changing climate with a focus on extreme weather. Researchers have used the method to analyse 30 Swedish cities, including Gothenburg, Stockholm and Malmö.The results show that future extreme climate events can have a considerable impact on energy systems."When we used the method in 30 Swedish cities and considered 13 scenarios for climate change, we saw uncertainty in the potential for renewable energy and energy demand."The gap between access to and energy demand may, with certain future climate variations, be as much as 34 per cent."That means a reduction in power supply reliability of up to 16 per cent due to extreme weather events."Faced with upcoming climate changes, shaping and optimising an energy system that can coordinate renewable energy sources in the energy supply requires close collaboration between energy system experts and climate researchers, according to Deliang Chen."We need this kind of collaboration to handle the complexity of climate and energy systems and to be able to predict the multidimensional effects that await."
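The two headline numbers -- a gap between renewable generation and demand, and a drop in power supply reliability -- are straightforward to compute once hourly generation and demand are known. The sketch below shows one simple way to define them; the hourly figures are invented and the definitions are illustrative, not necessarily those used in the study.

```python
# Illustrative generation-demand gap and hourly reliability index
# (invented hourly data, not the Swedish city results).
hours = [
    # (renewable_generation_kWh, demand_kWh) for a handful of example hours
    (120, 100), (80, 110), (60, 130), (140, 90), (50, 120), (100, 100),
]

generation = sum(g for g, _ in hours)
demand = sum(d for _, d in hours)
gap = max(0.0, demand - generation) / demand
served_hours = sum(1 for g, d in hours if g >= d)
reliability = served_hours / len(hours)

print(f"generation shortfall: {gap:.0%} of total demand")
print(f"hours fully covered by renewables: {reliability:.0%}")
```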
|
Weather
| 2,020 |
February 19, 2020
|
https://www.sciencedaily.com/releases/2020/02/200219152855.htm
|
Jet stream not getting 'wavier' despite Arctic warming
|
Rapid Arctic warming has not led to a "wavier" jet stream around the mid-latitudes in recent decades, pioneering new research has shown.
|
Scientists from the University of Exeter have studied the extent to which Arctic amplification -- the faster rate of warming in the Arctic compared to places farther south -- has affected the fluctuation of the jet stream's winding course over the North Hemisphere.Recent studies have suggested the warming Arctic region has led to a "wavier" jet stream -- which can lead to extreme weather conditions striking the US and Europe.However, the new study by Dr Russell Blackport and Professor James Screen, shows that Arctic warming does not drive a more meandering jet stream.Instead, they believe any link is more likely to be a result of random fluctuations in the jet stream influencing Arctic temperatures, rather than the other way around.The study is published in leading journal Dr Blackport, a Research Fellow in Mathematics and lead author of the study, said: "While there does appear to be a link between a wavier jet stream and Arctic warming in year-to-year and decade-to-decade variability, there has not been a long-term increase in waviness in response to the rapidly warming Arctic."Scientists have studied whether the jet stream's meandering course across the Northern Hemisphere is amplified by climate change in recent years.For about two decades, the jet stream -- a powerful band of westerly winds across the mid-latitudes -- was observed to have a "wavier" flow, which coincided with greater Arctic warming through climate change.These waves have caused extreme weather conditions to strike mainland Europe and the US, bringing intense cold air that leads to extreme cold weather.In this new study, Dr Blackport and Professor Screen studied not only climate model simulations but also the observed conditions going back 40 years.They found that the previously reported trend toward a wavier circulation during autumn and winter has reversed in recent years, despite continued Arctic amplification.This reversal has resulted in no long-term trends in waviness, in agreement with climate model simulations, which also suggest little change in "waviness" in response to strong Arctic warming.The results, the scientists say, strongly suggest that the observed and simulated link between jet stream "waviness" and Arctic temperatures do not represent a causal effect of Arctic amplification on the jet stream.Professor Screen, an Associate Professor in Climate Science at Exeter added: "The well-publicised idea that Arctic warming is leading to a wavier jet stream just does not hold up to scrutiny."With the benefit of ten more years of data and model experiments, we find no evidence of long-term changes in waviness despite on-going Arctic warming."Insignificant effect of Arctic amplification on the amplitude of mid-latitude atmospheric waves is published in
|
Weather
| 2,020 |
February 18, 2020
|
https://www.sciencedaily.com/releases/2020/02/200218124411.htm
|
Warming oceans are getting louder
|
One of the ocean's loudest creatures is smaller than you'd expect -- and will get even louder and more troublesome to humans and sea life as the ocean warms, according to new research presented here at the Ocean Sciences Meeting 2020.
|
Snapping shrimp create a pervasive background crackling noise in the marine environment. Scientists suspect the sound helps the shrimp communicate, defend territories and hunt for food. When enough shrimp snap at once, the noise can dominate the soundscape of coastal oceans, sometimes confusing sonar instruments.Researchers will present new results Friday at the Ocean Sciences Meeting 2020 suggesting that with increased ocean temperatures, snapping shrimp will snap more often and louder than before. This could amplify the background noise, or soundscape, of the global ocean, with implications for marine life and humans."It's a really cool little animal," said Aran Mooney, a marine biologist at Woods Hole Oceanographic Institution who will present the work. "They're a crustacean, kind of like a little shrimp or lobster. They make a sound by like closing a claw so fast it makes this bubble and when that bubble implodes, it makes that snapping sound."Mooney and his colleague Ashlee Lillis detected a strong relationship between warmer waters and louder, more frequent snapping shrimp sounds by experimenting with shrimp in tanks in their lab and by listening to shrimp in the ocean at different water temperatures."As you increase that temperature, snap rates increase," Mooney said.This makes sense because shrimp are essentially cold-blooded animals, meaning their body temperature and activity levels are largely controlled by their environment, in the same way ants can move faster in warmer weather than in cool weather."We can actually show in the field that not only does snap rate increase, but the sound levels increase as well," Mooney said. "So the seas are actually getting louder as water, warmer temperatures."Louder snapping shrimp could potentially have harmful effects on fish and even sonar used by submarines and ships."We know that fish use sound to communicate," Mooney said. "Fish call each other, and they make sounds to attract mates and for territorial defense. If the seas get louder, it has the potential to influence those communications. We don't really know that yet. That's something we have to follow up on."Human use of sound in the oceans might also be impaired by very loud snapping shrimp. Common instruments like sonar fish finders might be affected, Mooney said. There is also the possibility louder seas could affect instruments the Navy uses to detect mines, which could have implications for national defense, he said.Listen to snapping shrimp sounds here:
|
Weather
| 2,020 |
February 13, 2020
|
https://www.sciencedaily.com/releases/2020/02/200213124214.htm
|
Computer-based weather forecast: New algorithm outperforms mainframe computer systems
|
The exponential growth in computer processing power seen over the past 60 years may soon come to a halt. Complex systems such as those used in weather forecast, for example, require high computing capacities, but the costs for running supercomputers to process large quantities of data can become a limiting factor. Researchers at Johannes Gutenberg University Mainz (JGU) in Germany and Università della Svizzera italiana (USI) in Lugano in Switzerland have recently unveiled an algorithm that can solve complex problems with remarkable facility -- even on a personal computer.
|
In the past, we have seen a constant rate of acceleration in information processing power as predicted by Moore's Law, but it now looks as if this exponential rate of growth is limited. New developments rely on artificial intelligence and machine learning, but the related processes are largely not well-known and understood. "Many machine learning methods, such as the very popular deep learning, are very successful, but work like a black box, which means that we don't know exactly what is going on. We wanted to understand how artificial intelligence works and gain a better understanding of the connections involved," said Professor Susanne Gerber, a specialist in bioinformatics at Mainz University. Together with Professor Illia Horenko, a computer expert at Università della Svizzera italiana and a Mercator Fellow of Freie Universität Berlin, she has developed a technique for carrying out incredibly complex calculations at low cost and with high reliability. Gerber and Horenko, along with their co-authors, have summarized their concept in an article entitled "Low-cost scalable discretization, prediction, and feature selection for complex systems" recently published in The paper presented is the result of many years of work on the development of this new approach. According to Gerber and Horenko, the process is based on the Lego principle, according to which complex systems are broken down into discrete states or patterns. With only a few patterns or components, i.e., three or four dozen, large volumes of data can be analyzed and their future behavior can be predicted. "For example, using the SPA algorithm we could make a data-based forecast of surface temperatures in Europe for the day ahead and have a prediction error of only 0.75 degrees Celsius," said Gerber. It all works on an ordinary PC and has an error rate that is 40 percent better than the computer systems usually used by weather services, whilst also being much cheaper.SPA or Scalable Probabilistic Approximation is a mathematically-based concept. The method could be useful in various situations that require large volumes of data to be processed automatically, such as in biology, for example, when a large number of cells need to be classified and grouped. "What is particularly useful about the result is that we can then get an understanding of what characteristics were used to sort the cells," added Gerber. Another potential area of application is neuroscience. Automated analysis of EEG signals could form the basis for assessments of cerebral status. It could even be used in breast cancer diagnosis, as mammography images could be analyzed to predict the results of a possible biopsy."The SPA algorithm can be applied in a number of fields, from the Lorenz model to the molecular dynamics of amino acids in water," concluded Horenko. "The process is easier and cheaper and the results are also better compared to those produced by the current state-of-the-art supercomputers."
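The article describes the general recipe -- compress a system into a small number of discrete patterns, then predict from pattern to pattern -- without giving the algorithm itself. The sketch below is a generic discretize-and-predict toy in that spirit, using equal-width bins and a transition table on synthetic temperatures; it is not the authors' SPA method and makes no claim to its accuracy or cost.

```python
# Generic discretize-then-predict toy (NOT the SPA algorithm).
import math
import random

random.seed(3)

# Synthetic daily temperature series: seasonal cycle plus noise.
series = [10 + 8 * math.sin(2 * math.pi * t / 365) + random.gauss(0, 2)
          for t in range(4 * 365)]

K = 8                                            # small number of patterns
lo, hi = min(series), max(series)
width = (hi - lo) / K

def pattern(x):
    """Assign a value to one of K equal-width patterns."""
    return min(K - 1, int((x - lo) / width))

# Learn transition counts between consecutive patterns and each pattern's mean.
transitions = [[0] * K for _ in range(K)]
sums, counts = [0.0] * K, [0] * K
for today, tomorrow in zip(series, series[1:]):
    transitions[pattern(today)][pattern(tomorrow)] += 1
for x in series:
    p = pattern(x)
    sums[p] += x
    counts[p] += 1
means = [sums[i] / counts[i] if counts[i] else 0.0 for i in range(K)]

def predict_next(x):
    """Expected next-day value given today's pattern."""
    row = transitions[pattern(x)]
    total = sum(row)
    return sum(n / total * means[j] for j, n in enumerate(row)) if total else x

today = series[-1]
print(f"today {today:.1f} C -> predicted tomorrow {predict_next(today):.1f} C")
```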
|
Weather
| 2,020 |