Columns: Date, Link, Title, Summary, Body, Category, Year
March 20, 2018
https://www.sciencedaily.com/releases/2018/03/180320123450.htm
Wind, sea ice patterns point to climate change in western Arctic
A major shift in western Arctic wind patterns occurred throughout the winter of 2017 and the resulting changes in sea ice movement are possible indicators of a changing climate, says Kent Moore, a professor of physics at the University of Toronto Mississauga.
Thanks to data collected by buoys dropped from aircraft onto the Arctic Ocean's sea ice, Moore and colleagues at the University of Washington, where he spent the year as the Fulbright Visiting Chair in Arctic Studies, were able to observe this marked, anomalous shift in Arctic wind patterns and sea ice movement during the winter of 2017. Their study is published in

Usually, the western Arctic has relatively stable weather during the winter; it is home to a quasi-stationary region of high pressure known as the Beaufort High, which promotes "anti-cyclonic" winds that travel in a clockwise direction and move sea ice along with them. By contrast, the eastern Arctic has a more dynamic climate, where cyclones are a common winter phenomenon, with storms moving from Greenland towards Norway and the Barents Sea.

"Last year, we looked at the buoy tracks in the western Arctic and saw that the sea ice was moving in a counter-clockwise pattern instead and wondered why," Moore says. "We discovered that storms were moving in an unexpected direction from the Barents Sea along the Siberian coast and into the western Arctic, bringing with them low-pressure systems that caused the collapse of the Beaufort High."

Moore and colleagues believe that the low-pressure systems were able to make inroads into the western Arctic because an unusually warm fall in 2016 resulted in thinner and less extensive sea ice. During the winter, this allowed more oceanic heat to be transferred to the atmosphere, providing an additional energy source for these storms.

"As a result of this additional energy source, the storms did not dissipate over the Barents Sea, as is usual, and were able to reach into the western Arctic," Moore says. "We reviewed more than 60 years of weather data from the Arctic and it appears that this collapse has never happened before."

Generally, the Beaufort High drives sea ice motion throughout the Arctic and also influences ocean circulation over the North Atlantic Ocean.
Any shift in movement patterns has the potential to affect the climate in these regions, as well as the Arctic ecosystem that depends on predictable areas of open water and ice. For example, as a result of this collapse, sea ice was thinner along the coast of the Canadian Arctic Archipelago, as well as in the southern Beaufort Sea, last winter. Such changes can disturb Arctic food webs, stressing marine mammals and polar bears, especially if they are ongoing.

"If this becomes part of the normal pattern -- even if it happens every few years -- it will mean that the climate is changing," Moore says. "We are still exploring all of the specific impacts."
Weather
2018
March 19, 2018
https://www.sciencedaily.com/releases/2018/03/180319124302.htm
Human influence on climate change will fuel more extreme heat waves in US
Human-caused climate change will drive more extreme summer heat waves in the western U.S., including in California and the Southwest as early as 2020, new research shows.
The new analysis of heat wave patterns across the U.S., led by scientists at the University of Miami Rosenstiel School of Marine and Atmospheric Science (UM)-based Cooperative Institute for Marine and Atmospheric Studies (CIMAS) and colleagues, also found that human-made climate change will be a dominant driver of heat wave occurrences in the Great Lakes region by 2030, and in the Northern and Southern Plains by 2050 and 2070, respectively. Human-made climate change is the result of increased carbon dioxide and other human-made emissions into the atmosphere.

"These are the years that the human contributions to climate change will become as important as natural variability in causing heat waves," said lead author Hosmay Lopez, a CIMAS meteorologist based at NOAA's Atlantic Oceanographic and Meteorological Laboratory. "Without human influence, half of the extreme heat waves projected to occur during this century wouldn't happen."

The study was published in the March 19, 2018 online issue of the journal

Lopez and colleagues used climate models along with historical climate data from 1900-2010 to project future heat wave patterns across the U.S. during the summer months of June through August. The climate change scenarios identified four regions where human-made climate change would be the dominant cause of heat extremes, surpassing natural climate variability. The researchers defined extreme heat wave events as three or more days of record high temperatures.

"Population growth, coupled with the fact that extreme heat is the leading weather-related cause of death in the United States, calls for the need to identify the relative roles of internal variability and human-caused climate change in these extremes," said Lopez.
"This work provides a significant advancement in the scientific understanding of future projections of heat waves."

The researchers say that regional climate variability, such as differences in atmospheric circulation, precipitation, and the existence of green spaces, affects when human-caused climate change will become the primary driver of extreme heat events. For example, the researchers found that a pattern of low-lying and fast-moving circulation of air over the Great Plains, a type of natural variability, will delay the onset of when human-caused climate change would become the main cause of heat waves in this region.

Understanding the driving forces behind the projected increase in occurrence and severity of heat waves is crucial for public health security and necessary for communities to develop extreme heat mitigation strategies, the authors said.
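The study's working definition of an extreme heat wave event -- three or more consecutive days of record high temperatures -- maps directly onto a run-length scan over a daily record-high flag. A minimal Python sketch (the function name and boolean-flag input are illustrative assumptions, not code from the study):

```python
from typing import List, Tuple

def heat_wave_events(is_record_high: List[bool], min_run: int = 3) -> List[Tuple[int, int]]:
    """Return (start_index, length) for each run of at least `min_run`
    consecutive record-high days."""
    events = []
    run_start = None
    # Append a False sentinel so a run ending on the last day is flushed.
    for i, hot in enumerate(is_record_high + [False]):
        if hot and run_start is None:
            run_start = i
        elif not hot and run_start is not None:
            length = i - run_start
            if length >= min_run:
                events.append((run_start, length))
            run_start = None
    return events

# Days 2-5 form a four-day event; the isolated record day at index 8 does not qualify.
flags = [False, False, True, True, True, True, False, False, True, False]
print(heat_wave_events(flags))  # [(2, 4)]
```

The same scan generalizes to any threshold-exceedance definition by changing how the boolean flags are computed.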
Weather
2018
March 14, 2018
https://www.sciencedaily.com/releases/2018/03/180314092708.htm
Exceptionally large amount of winter snow in Northern Hemisphere this year
The new Arctic Now product developed by the Finnish Meteorological Institute shows with one picture the extent of the area in the Northern Hemisphere currently covered by ice and snow. This kind of information, which shows the accurate state of the Arctic, becomes increasingly important due to climate change. The Arctic region will be discussed at the Arctic Meteorological Week which begins in Levi next week.
In the Northern Hemisphere, the maximum seasonal snow cover occurs in March. "This year has been a year with an exceptionally large amount of snow, when examining the entire Northern Hemisphere. The variation from one year to another has been considerable, and especially in the most recent years the differences between winters have been very great," says Kari Luojus, Senior Research Scientist at the Finnish Meteorological Institute.

The information has been gleaned from the Arctic Now service of the Finnish Meteorological Institute, which is unique even on a global scale. The greatest difference compared with other comparable services is that they traditionally report only the extent of the ice or snow cover.

"Here at the Finnish Meteorological Institute we have managed to combine data to form a single image. In this way we can get a better situational picture of the cryosphere -- that is, the cold areas of the Northern Hemisphere," Research Professor Jouni Pulliainen observes. In addition to the coverage, the picture includes the water value of the snow, which indicates the amount of water contained in the snow. This is important information for drafting hydrological forecasts on the flood situation and for monitoring the state of the climate and environment in general.

Information on the amount of snow is also sent to the Global Cryosphere Watch service of the World Meteorological Organisation (WMO), where the information is combined with trends and statistics from past years. Lengthy observation series show that the total amount of snow in the Northern Hemisphere has declined in the spring period and that the melting of the snow has started earlier in that period. Examination over a longer period (1980-2017) shows that the total amount of snow in all winter periods has decreased on average.

The ice cover on the Arctic Ocean has also grown thinner, and the amount and expanse of perennial ice has decreased.
Before 2000, the smallest expanse of sea ice varied between 6.2 and 7.9 million square kilometres. In the past ten years, the expanse of ice has varied from 5.4 down to 3.6 million square kilometres. Extreme weather phenomena -- winters in which snowfall is sometimes quite heavy, and others with little snow -- will increase in the future.

The Arctic area is warming at twice the speed of the rest of the world, and the impact of climate change can already be seen in the Arctic regions. On the other hand, the changes are affecting the rest of the earth.

"What happens in the Arctic regions does not stay in the Arctic regions. It also affects a wider area. The exceptional strengthening of a high-pressure area in Siberia, which brought freezing temperatures to Finland in late February and early March, may be partly the result of atmospheric warming over the Arctic Ocean. When it is exceptionally cold somewhere in the world, it is often exceptionally warm somewhere else. This is what happened at the end of February and in early March, when temperatures at the North Pole were around zero degrees Celsius and it was exceptionally cold in Europe," explains Ari Laaksonen, Scientific Director at the Finnish Meteorological Institute.

The weather fluctuates from one year to another, and individual cold snaps in the Arctic area are not, as such, proof of the progression of climate change. "However, they are a reminder of how climate uncertainty has increased and that we'll have to get used to variations in the weather as climate change proceeds," Laaksonen observes.

Next week an Arctic Meteorology Week will begin at Levi. Meteorological themes of the Arctic region will be a focus at the Arctic Meteorology Summit on 20 March.
The entire week will bring together a wide spectrum of influential figures in the field of meteorology. Cooperation is important because research and security in the Arctic region require comprehensive and long-term weather, ice, sea, and atmospheric observations and modelling.

"Meteorology is a new theme in the Arctic Council, even though the Arctic countries are already engaging in plenty of cooperation on the subject. The Levi meeting will include discussions on how meteorological know-how will be seen in the future work of the Arctic Council," says Arctic Ambassador Aleksi Härkönen.
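The "water value" mentioned above is usually called snow water equivalent (SWE): the depth of liquid water the snowpack would yield if melted, which is simply snow depth weighted by snow density. A minimal illustrative calculation (the function name and sample density are assumptions for the sketch, not Finnish Meteorological Institute code):

```python
def snow_water_equivalent_mm(depth_m: float, density_kg_m3: float) -> float:
    """SWE in millimetres of water.

    depth_m * density_kg_m3 gives the snow mass per square metre (kg/m^2);
    since water has a density of 1000 kg/m^3, 1 kg/m^2 equals 1 mm of water.
    """
    return depth_m * density_kg_m3

# 0.8 m of settled snow at ~300 kg/m^3 holds about 240 mm of water.
print(snow_water_equivalent_mm(0.8, 300.0))  # 240.0
```

This is why SWE, rather than raw depth, is the quantity that matters for flood forecasting: a metre of dense spring snow can hold several times the water of a metre of fresh powder.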
Weather
2018
March 14, 2018
https://www.sciencedaily.com/releases/2018/03/180314092305.htm
Chain reaction of fast-draining lakes poses new risk for Greenland ice sheet
A growing network of lakes on the Greenland ice sheet has been found to drain in a chain reaction that speeds up the flow of the ice sheet, threatening its stability.
Researchers from the UK, Norway, US and Sweden have used a combination of 3D computer modelling and real-world observations to show the previously unknown, yet profound, dynamic consequences tied to a growing number of lakes forming on the Greenland ice sheet.

Lakes form on the surface of the Greenland ice sheet each summer as the weather warms. Many exist for weeks or months, but drain in just a few hours through more than a kilometre of ice, transferring huge quantities of water and heat to the base of the ice sheet. The affected areas include sensitive regions of the ice sheet interior where the impact on ice flow is potentially large.

Previously, it had been thought that these 'drainage events' were isolated incidents, but the new research, led by the University of Cambridge, shows that the lakes form a massive network and become increasingly interconnected as the weather warms. When one lake drains, the water quickly spreads under the ice sheet, which responds by flowing faster. The faster flow opens new fractures on the surface, and these fractures act as conduits for the drainage of other lakes. This starts a chain reaction that can drain many other lakes, some as far as 80 kilometres away.

These cascading events -- including one case where 124 lakes drained in just five days -- can temporarily accelerate ice flow by as much as 400%, which makes the ice sheet less stable and increases the rate of associated sea level rise. The results are reported in the journal

The study demonstrates how forces within the ice sheet can change abruptly from one day to the next, causing solid ice to fracture suddenly.
The model developed by the international team shows that lakes forming in stable areas of the ice sheet drain when fractures open in response to a high tensile shock force acting along drainage paths of water flowing beneath the ice sheet when other lakes drain far away.

"This growing network of melt lakes, which currently extends more than 100 kilometres inland and reaches elevations as high as 2,000 metres above sea level, poses a threat for the long-term stability of the Greenland ice sheet," said lead author Dr Poul Christoffersen, from Cambridge's Scott Polar Research Institute. "This ice sheet, which covers 1.7 million square kilometres, was relatively stable 25 years ago, but now loses one billion tonnes of ice every day. This causes one millimetre of global sea level rise per year, a rate which is much faster than what was predicted only a few years ago."

The study departs from the current consensus that lakes forming at high elevations on the Greenland ice sheet have only a limited potential to influence the flow of the ice sheet as the climate warms. Whereas the latest report by the Intergovernmental Panel on Climate Change concluded that surface meltwater, although abundant, does not impact the flow of the ice sheet, the study suggests that meltwater delivered to the base of the ice sheet through draining lakes in fact drives episodes of sustained acceleration extending much farther into the interior of the ice sheet than previously thought.

"Transfer of water and heat from surface to bed can escalate extremely rapidly due to a chain reaction," said Christoffersen. "In one case we found that all but one of 59 observed lakes drained in a single cascading event.
Most of the melt lakes drain in this dynamic way."

Although the delivery of small amounts of meltwater to the base of the ice sheet only increases the ice sheet's flow locally, the study shows that the response of the ice sheet can intensify through knock-on effects. When a single lake drains, the ice flow temporarily accelerates along the path taken by water flowing along the bottom of the ice sheet. Lakes situated in stable basins along this path drain when the loss of friction along the bed temporarily transfers forces to the surface of the ice sheet, causing fractures to open up beneath other lakes, which then also drain.

"The transformation of forces within the ice sheet when lakes drain is sudden and dramatic," said co-author Dr Marion Bougamont, also from the Scott Polar Research Institute. "Lakes that drain in one area produce fractures that cause more lakes to drain somewhere else. It all adds up when you look at the pathways of water underneath the ice."

The study used high-resolution satellite images to confirm that fractures on the surface of the ice sheet open up when cascading lake drainage occurs. "This aspect of our work is quite worrying," said Christoffersen. "We found clear evidence of these crevasses at 1,800 metres above sea level and as far as 135 kilometres inland from the ice margin. This is much farther inland than previously considered possible."

While complete loss of all ice in Greenland remains extremely unlikely this century, the highly dynamic manner in which the ice sheet responds to Earth's changing climate clearly underscores the urgent need for a global agreement that will reduce the emission of greenhouse gases.

The work was funded by the Natural Environment Research Council (NERC) and the European Research Council (ERC).
Weather
2018
March 13, 2018
https://www.sciencedaily.com/releases/2018/03/180313130629.htm
Warm Arctic means colder, snowier winters in northeastern US, study says
Scientists from Rutgers University-New Brunswick and Atmospheric and Environmental Research (AER) have linked the frequency of extreme winter weather in the United States to Arctic temperatures.
Their research was published today in

"Basically, this confirms the story I've been telling for a couple of years now," said study co-author Jennifer Francis, research professor of marine and coastal sciences in Rutgers' School of Environmental and Biological Sciences. "Warm temperatures in the Arctic cause the jet stream to take these wild swings, and when it swings farther south, that causes cold air to reach farther south. These swings tend to hang around for a while, so the weather we have in the eastern United States, whether it's cold or warm, tends to stay with us longer."

The research is timely given the extreme winter of 2017-2018, including a record-warm Arctic and low sea ice, record-breaking polar vortex disruption, record-breaking cold and disruptive snowfalls in the United States and Europe, severe "bomb cyclones" and costly nor'easters, said Judah Cohen, director of seasonal forecasting at AER and lead author of the study.

In their study, Cohen, Francis and AER's Karl Pfeiffer found that severe winter weather is two to four times more likely in the eastern United States when the Arctic is abnormally warm than when the Arctic is abnormally cold. Their findings also show that winters are colder in the northern latitudes of Europe and Asia when the Arctic is warm. Paradoxically, the study shows that severe winter weather in the western United States is more likely when the Arctic is colder than normal.

The researchers found that when Arctic warming occurred near the surface, the connection to severe winter weather was weak. When the warming extended into the stratosphere, however, disruptions of the stratospheric polar vortex were likely. These disruptions usually cause severe winter weather in mid- to late winter and affect large metropolitan centers of the northeastern United States.

"Five of the past six winters have brought persistent cold to the eastern U.S. and warm, dry conditions to the West, while the Arctic has been off-the-charts warm," Francis said.
"Our study suggests that this is no coincidence. Exactly how much the Arctic contributed to the severity or persistence of the pattern is still hard to pin down, but it's becoming very difficult to believe they are unrelated."
Weather
2018
March 8, 2018
https://www.sciencedaily.com/releases/2018/03/180308133316.htm
New 3-D measurements improve understanding of geomagnetic storm hazards
Measurements of the three-dimensional structure of Earth, as opposed to the one-dimensional models typically used, can help scientists more accurately determine which areas of the United States are most vulnerable to blackouts during hazardous geomagnetic storms.
Space weather events such as geomagnetic storms can disturb Earth's magnetic field, interfering with electric power grids, radio communication, GPS systems, satellite operations, oil and gas drilling and air travel. Scientists use models of Earth's structure and measurements of Earth's magnetic field taken at USGS observatories.

In a new U.S. Geological Survey study, scientists calculated voltages along power lines in the mid-Atlantic region of the U.S. using 3D data of Earth. These data, taken at Earth's surface, reflect the complex structure of Earth below the measurement sites and were collected during the National Science Foundation EarthScope USArray project. The scientists found that for many locations, the voltages they calculated were significantly different from those based on previous 1D calculations, with the 3D data producing the most precise results.

"Using the most accurate data available to determine vulnerable areas of the power grid can help maintain life-saving communications and protect national security during severe geomagnetic storms," said Greg Lucas, a USGS scientist and the lead author of the study. "Our study suggests that 3D data of the earth should be used whenever they are available."

Electric currents from a March 1989 geomagnetic storm caused a blackout in Quebec and numerous glitches in the U.S. power grid. In past studies, scientists using simple 1D models of Earth would have found that 16 high-voltage electrical transmission lines were disturbed in the mid-Atlantic region during the storm, resulting in the blackout.
However, by using realistic 3D data to calculate the 1989 scenario, the new study found that there might actually have been 62 vulnerable lines.

"This discrepancy between 1D- and 3D-based calculations of the 1989 storm demonstrates the importance of realistic data, rather than relying on previous 1D models, to determine the impact that a geomagnetic storm has on power grids," Lucas said.

The new study is published in the journal

For more information about the effects of geomagnetic storms, please visit the USGS Geomagnetism Program website.
Weather
2018
March 8, 2018
https://www.sciencedaily.com/releases/2018/03/180308105139.htm
Scientists accurately model the action of aerosols on clouds
Global climate is a tremendously complex phenomenon, and researchers are making painstaking progress, year by year, to try to develop ever more accurate models. Now, an international group including researchers from the Advanced Institute for Computational Science (AICS) in Japan, using the powerful K computer, have for the first time accurately calculated the effects of aerosols on clouds in a climate model.
Aerosols play a key role in cloud formation, as they provide the "seeds" -- called cloud condensation nuclei -- that allow clouds to form and affect their life cycle. The water in the air condenses onto the tiny particles, which gradually grow into droplets and finally into raindrops that precipitate. The action of aerosols is an important element of research on climate change, as they partially counteract the heating action of greenhouse gases.

It was previously believed that increasing aerosol density would always lead to more clouds, but recent satellite observations showed that this is not necessarily true. It is now understood that, due to temperature differences between the top and bottom layers of clouds, there is a delicate balance of evaporation and condensation, with aerosols in the lower parts of the clouds promoting cloud formation, but those in the upper parts allowing the water to evaporate.

Previously, climate models were unable to model the response of these micro-processes within the clouds to aerosol variation. Using the K computer, however, the RIKEN-led group combined a model that simulates the entire global weather over a year, at a horizontal resolution of just 14 kilometers, with a simulation of how the aerosols behave within clouds. Unlike conventional models, which show a uniform increase in clouds over the earth when there is an increase in aerosols, the high-resolution model, which takes into account the vertical processes inside clouds, accurately depicted how large areas experience a drop in cloud cover.

According to Yosuke Sato from the Computational Climate Science Research Team at RIKEN AICS and Nagoya University, "It was very gratifying to see that we could use a powerful supercomputer to accurately model the microphysics of clouds, giving a more accurate picture of how clouds and aerosols behave in the real world. In the future, we hope to use even more powerful computers to allow climate models to have more certainty in climate prediction."
Weather
2018
March 7, 2018
https://www.sciencedaily.com/releases/2018/03/180307115511.htm
Weather satellites aid search and rescue capabilities
The same satellites that identify severe weather can help save you from it.
The National Oceanic and Atmospheric Administration's (NOAA's) Geostationary Operational Environmental Satellite (GOES) constellation monitors Earth's environment, helping meteorologists observe and predict the weather. GOES observations have tracked thunderstorms, tornadoes, hurricanes and flash floods. They've even proven useful in monitoring dust storms, forest fires and volcanic activity.

The recently launched GOES-S (planned to replace the current GOES-West later this year) and other GOES series satellites carry a payload supported by NASA's Search and Rescue (SAR) office, which researches and develops technologies to help first responders locate people in distress worldwide, whether from a plane crash, a boating accident or other emergencies.

Over its history, the SAR office at NASA's Goddard Space Flight Center in Greenbelt, Maryland, has developed emergency beacons for personal, nautical and aeronautical use, along with ground station receivers that detect beacon activation. Space segment SAR instruments fly on many spacecraft in various orbits around the Earth. The GOES SAR transponders are geostationary, meaning that they appear "fixed" relative to a user on the surface due to their location over the equator and orbital period of 24 hours.

"The SAR space segment isn't just one instrument in one orbit," said Tony Foster, SAR's deputy mission manager. "Rather it's a series of instruments aboard diverse satellites in various orbits, each working together to provide first responders with highly accurate locations."

The GOES search and rescue transponders, unlike SAR instruments in other orbits, are only able to detect the beacon signals, not help to determine location. This detection rapidly alerts the global SAR network, Cospas-Sarsat, of a distress beacon's activation.
This gives the system valuable time to prepare before the signal's origin can be determined by SAR instruments on low-Earth-orbiting satellites. Additionally, beacons with integrated GPS technology can send their location data through GOES to the SAR network. The network can then alert local first responders to the location of the emergency without the aid of the low-Earth-orbiting constellation of search and rescue instruments.

NASA's SAR team provides on-orbit testing, support and maintenance of the search and rescue instrument on GOES. The GOES satellites and SAR instruments are funded by NOAA.

"We are proud to support the Cospas-Sarsat program by hosting a search and rescue transponder aboard our satellites," said Tim Walsh, GOES-R series program acting system program director. "SAR is one of the many NOAA-NASA collaborations that translate into life-saving technology."

In the future, first responders will rely on a new constellation of instruments on GPS and other Global Navigation Satellite System satellites currently in medium-Earth orbit, an orbit that views larger swathes of the Earth than low-Earth orbit due to its higher altitude. These new instruments will enable the SAR network to locate a distress signal more quickly than the current system and calculate its position with accuracy an order of magnitude better, from one kilometer (0.6 miles) to approximately 100 meters (328 feet).

In the meantime, the SAR transponders aboard GOES cover the time between the activation of a distress signal and its detection by SAR instruments in low-Earth orbit.

"NASA's SAR office dedicates itself to speed and accuracy," said Lisa Mazzuca, SAR mission manager. "The instruments and technologies we develop endeavor to alert first responders to a beacon's activation as soon as possible. The GOES search and rescue transponders are crucial to this goal, providing near-instantaneous detection in their fields of view of the Earth."
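The "fixed" appearance of the GOES transponders follows directly from their orbital period matching Earth's rotation: Kepler's third law pins the orbit to a single altitude. This short Python sketch (constant and variable names are our own; it uses the sidereal day of about 86,164 s rather than the article's rounded 24 hours) recovers the familiar geostationary altitude:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's standard gravitational parameter, m^3/s^2
R_EARTH = 6_378_137.0      # Earth's equatorial radius, m
T_SIDEREAL = 86_164.1      # sidereal day (one Earth rotation), s

# Kepler's third law: T^2 = 4*pi^2 * a^3 / mu  =>  a = (mu * T^2 / (4*pi^2))^(1/3)
a = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000.0

print(f"Geostationary altitude: {altitude_km:,.0f} km")  # about 35,786 km
```

That altitude is what lets a single GOES transponder stare at nearly a full hemisphere continuously, at the cost of being too far away to help triangulate a beacon's position.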
Weather
2018
March 7, 2018
https://www.sciencedaily.com/releases/2018/03/180307100722.htm
Wildfires set to increase: could we be sitting on a tinderbox in Europe?
2017 was one of the worst years on record for fires in Europe, with over 800,000 hectares of land burnt in Portugal, Italy and Spain alone.
As the world gets warmer and Europe's land gets drier, fires are set to get even worse -- and not just for the hottest countries around the Mediterranean. Even relatively safe Alpine mountain regions will see a rapid increase in fire danger unless action is taken to limit climate change and reduce the main causes of wildfires starting and spreading.

These warnings come from a JRC study of factors affecting fire danger across the continent. Scientists modelled fire danger for several weather and climate scenarios. The study compares a high greenhouse gas emissions scenario to one in which global warming is limited to 2°C above pre-industrial levels, run across several models to assess patterns of danger up to the end of the century.

The effect of climate change on rainfall, temperature, wind and humidity will cause countries around the Mediterranean to become drier. By drawing moisture from deep layers of wood, leaves, soil and other organic matter, extreme weather events such as droughts could leave kindling waiting to ignite. The dryness is set to push northwards into central Europe, putting a larger area at risk. Land surrounding the Alps and other European mountain systems, where wildfires are currently much less common, shows a rapid increase in danger.

Successful action to limit the level of global warming can prevent the situation from spiralling out of control, although the danger is set to increase even in the most optimistic climate scenario. This means that effective adaptation strategies will be crucial to minimising the severity and occurrence of forest fires -- and the havoc that they can wreak on European communities.

The study identifies three key factors that could help reduce the risk:

· Human activity: data from the European Forest Fire Information System (EFFIS) confirms that nearly all forest fires in Europe start due to human activity.
Minimising this factor involves exploring the causes that lead people to start fires, as well as increasing awareness of fire danger, encouraging good behaviour and sanctioning offenders.

· Vegetation management: reducing the fire risk in dry forests and minimising the likelihood of severe fires. These measures need to be carefully adapted to the different forest ecosystems.

· Sustainable forest composition and structure: managed forests often have lower tree density, which makes them less prone to fire than unmanaged forests. Some old-age forests are associated with less severe fires than densely-packed, young trees. Particular tree species and ecosystems are more resistant to drought and better adapted to post-fire recovery. However, climate change will have differing effects on habitat suitability for these trees. Therefore, careful landscape design is key to a well climate-adapted forest composition that ensures sustainable, less fire-prone forests.

Europe's increasingly urbanised population further highlights the importance of minimising fire risk. As the world becomes more urbanised, abandoned agricultural land could leave behind more fuel to feed wildfires. This exposes new settlements built in transitional areas between cities and wildland to increased fire risk. Hence, careful landscape design must also frame strategic agricultural land-use planning, with the aim of avoiding the abandonment of agricultural land.

The report argues that, as well as taking account of these factors, policymakers should look at wildfires as a natural hazard and consider designing defensible space from a social and policy perspective.

In response to the 2017 forest fires, the Commission has taken a number of initiatives to further strengthen EU civil protection capacity and to enhance disaster prevention, preparedness and response across Europe.
For instance, the Commission proposal to amend the Union Civil Protection Mechanism, published last November, aims to reinforce overall collective capacities to respond to disasters and to strengthen the focus on prevention.
Weather
2018
March 2, 2018
https://www.sciencedaily.com/releases/2018/03/180302124830.htm
Snowpack levels show dramatic decline in western states, U.S.
A new study of long-term snow monitoring sites in the western United States found declines in snowpack at more than 90 percent of those sites -- and one-third of the declines were deemed significant.
Since 1915, the average snowpack in western states has declined by between 15 and 30 percent, the researchers say, and the amount of water lost from that snowpack reduction is comparable in volume to Lake Mead, the West's largest manmade reservoir. The loss of water storage can have an impact on municipal, industrial and agricultural usage, as well as on fish and other animals. Results of the study are being published this week.

"It is a bigger decline than we had expected," said Philip Mote, director of the Oregon Climate Change Research Institute at Oregon State University and lead author on the study. "In many lower-elevation sites, what used to fall as snow is now rain. Upper elevations have not been affected nearly as much, but most states don't have that much area at 7,000-plus feet.

"The solution isn't in infrastructure. New reservoirs could not be built fast enough to offset the loss of snow storage -- and we don't have a lot of capacity left for that kind of storage. It comes down to managing what we have in the best possible ways."

The researchers attribute the snowpack decline to warmer temperatures, not a lack of precipitation. But the consequences are still significant, they point out. Earlier spring-like weather means more of the precipitation will not be stored as long in the mountains, which can result in lower river and reservoir levels during late summer and early fall.

The study considered data from 1,766 sites in the western U.S., mostly from the U.S. Department of Agriculture's Natural Resources Conservation Service and the California Department of Water Resources. The researchers focused on measurements taken on April 1, which historically has been the high point for snowpack in most areas, though they also looked at measurements for Jan. 1, Feb. 1, March 1 and May 1 -- which led to the range of decline of 15 to 30 percent. They also used a physically based computer model of the hydrologic cycle, which takes daily weather observations and computes snow accumulation, melting and runoff to estimate the total snowpack in the western U.S.

"We found declining trends in all months, states and climates," Mote said, "but the impacts are the largest in the spring, in Pacific states, and in locations with mild winter climates."

The Pacific states -- California, Oregon and Washington -- receive more precipitation because of the Pacific Ocean's influence, and more of their snow falls at temperatures near freezing. Because the Cascade Mountains, which transect the region, are not as steep as the Rocky Mountains, they have more area that is affected by changes in temperature. "When you raise the snow zone level 300 feet, it covers a much broader swath than it would in the inland states," Mote said.

Mote was one of 12 lead authors on a chapter of the fifth Intergovernmental Panel on Climate Change report looking at the cryosphere, which comprises snow, river and lake ice, sea ice, glaciers, ice sheets and frozen ground. Also an author on the fourth IPCC report, he had led a 2005 study on western snowpack levels that documented declines less dramatic than those in this new study.

"The amount of water in the snowpack of the western United States is roughly equivalent to all of the stored water in the largest reservoirs of those states," Mote said. "We've pretty much spent a century building up those water supplies at the same time the natural supply of snowpack is dwindling. On smaller reservoirs, the water supply can be replenished after one bad year. But a reservoir like Lake Mead takes four years of normal flows to fill; it still hasn't recovered from the drought of the early 2000s."

Mote said snowpack levels in most of the western U.S.
for 2017-18 thus far are lower than average -- a function of continued warming temperatures and the presence of a La Niña event, which typically results in warmer and drier conditions in most southwestern states.
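The headline figures here are trend estimates over a century-long record, not a simple difference between two years. A minimal sketch of how such a percent-decline number can be derived from a long April-1 snowpack series (the data below are synthetic stand-ins, not the study's measurements):

```python
import numpy as np

def percent_decline(years, swe):
    """Linear-trend percent change in snow water equivalent (SWE)
    over the record, relative to the trend line's starting value."""
    slope, intercept = np.polyfit(years, swe, 1)
    start = intercept + slope * years[0]
    end = intercept + slope * years[-1]
    return 100.0 * (end - start) / start

# Synthetic April-1 SWE record (cm), 1915-2014: a gentle decline plus noise.
rng = np.random.default_rng(0)
years = np.arange(1915, 2015)
swe = 60.0 - 0.12 * (years - 1915) + rng.normal(0, 5, years.size)

print(round(percent_decline(years, swe), 1))  # roughly -20% for this series
```

Fitting a trend rather than differencing endpoints keeps a single unusually wet or dry year from dominating the estimate.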
Weather
2018
March 1, 2018
https://www.sciencedaily.com/releases/2018/03/180301151514.htm
No laughing matter, yet humor inspires climate change activism
Melting icecaps, mass flooding, megadroughts and erratic weather are no laughing matter. However, a new study shows that humor can be an effective means to inspire young people to pursue climate change activism. At the same time, fear proves to be an equally effective motivator and has the added advantage of increasing people's awareness of climate change's risks.
"Young people have a huge stake in global climate change. They are going to bear the brunt of it, more so than old guys like me," said Jeff Niederdeppe, associate professor of communication at Cornell University, who oversaw the study. "Young people buy green products, they believe in climate change, they're worried about it, but they're not as politically active on the issue as older generations are. And if you look at where millennials get news information, it's from John Oliver and Trevor Noah, these satirical news programs. We wanted to test if this humorous approach could be used to engage young people in climate change activism."

The study, "Pathways of Influence in Emotional Appeals: Benefits and Tradeoffs of Using Fear or Humor to Promote Climate Change-Related Intentions and Risk Perceptions," was recently published.

Niederdeppe readily admits that academics don't make the best comedians. So the researchers partnered with Second City Works, a marketing offshoot of the legendary improvisational theater troupe in Chicago that launched the careers of Bill Murray, Tina Fey, Amy Poehler and other Saturday Night Live alums.

Second City Works created a series of online videos that feature a weatherman providing forecasts about extreme weather patterns caused by climate change in the United States, each with a drastically different tone. A humorous video emphasized the weatherman's cluelessness as he struggled to understand the signs of climate change. A more ominous version highlighted the severity of climate change and its devastating impacts. A third video used a neutral tone and language to present an informational view of climate change. Each video concluded with a recommendation to "Find out what your local officials and the presidential candidates think about climate change. Have your voice heard on Nov. 8." A fourth video, about income inequality, was used as a control.

"The humor video made people laugh more, and people who found it funny were more likely to want to plan to partake in activism, recycle more and believe climate change is risky," said Christofer Skurka, a third-year doctoral student in communication and the paper's lead author.

While the study focused on adults between the ages of 18 and 30, the researchers found that college-aged adults between 18 and 24 were most inspired to activism by the humorous video. Fear, meanwhile, proved to be equally effective across the entire age range, both in raising awareness of climate change's risks and in motivating viewers to intend to engage in direct action, although respondents did not perceive the ominous video to be as informative as the neutral, informational one.

"I don't think this study, in and of itself, says we should use fear over humor," Niederdeppe said. "This was a particular type of humor. It was very silly. The clueless weatherman was the butt of the jokes. But if you look at the kind of satirical commentary John Oliver does, there is a bite and a target: industry or the hypocrisy of politicians, for instance. Our next project is looking at whether we can combine humor with this biting, anger-inducing satire, and whether that can promote even greater motivation to take action."
Weather
2018
February 27, 2018
https://www.sciencedaily.com/releases/2018/02/180227111639.htm
Wind and solar power could meet four-fifths of US electricity demand, study finds
The United States could reliably meet about 80 percent of its electricity demand with solar and wind power generation, according to scientists at the University of California, Irvine; the California Institute of Technology; and the Carnegie Institution for Science.
However, meeting 100 percent of electricity demand with only solar and wind energy would require storing several weeks' worth of electricity to compensate for the natural variability of these two resources, the researchers said.

"The sun sets, and the wind doesn't always blow," noted Steven Davis, UCI associate professor of Earth system science and co-author of a renewable energy study published today.

The team analyzed 36 years of hourly U.S. weather data (1980 to 2015) to understand the fundamental geophysical barriers to supplying electricity with only solar and wind energy.

"We looked at the variability of solar and wind energy over both time and space and compared that to U.S. electricity demand," Davis said. "What we found is that we could reliably get around 80 percent of our electricity from these sources by building either a continental-scale transmission network or facilities that could store 12 hours' worth of the nation's electricity demand."

The researchers said that such expansion of transmission or storage capabilities would mean very substantial -- but not inconceivable -- investments. They estimated that the cost of the new transmission lines required, for example, could be hundreds of billions of dollars. In comparison, storing that much electricity with today's cheapest batteries would likely cost more than a trillion dollars, although prices are falling.

Other forms of energy stockpiling, such as pumping water uphill to later flow back down through hydropower generators, are attractive but limited in scope. The U.S. has a lot of water in the East but not much elevation, with the opposite arrangement in the West.

Fossil fuel-based electricity production is responsible for about 38 percent of U.S. carbon dioxide emissions.

"The fact that we could get 80 percent of our power from wind and solar alone is really encouraging," Davis said. "Five years ago, many people doubted that these resources could account for more than 20 or 30 percent."

But beyond the 80 percent mark, the amount of energy storage required to overcome seasonal and weather variability increases rapidly. "Our work indicates that low-carbon-emission power sources will be needed to complement what we can harvest from the wind and sun until storage and transmission capabilities are up to the job," said co-author Ken Caldeira of the Carnegie Institution for Science. "Options could include nuclear and hydroelectric power generation, as well as managing demand."
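The reliability question above -- what fraction of demand can variable supply plus a fixed amount of storage actually serve -- can be illustrated with a toy hourly dispatch loop. This is not the study's model; the supply and demand series, the battery logic, and the sizing are all simplified assumptions:

```python
import numpy as np

def fraction_served(supply, demand, storage_hours):
    """Walk through hourly supply/demand; charge a battery with surplus,
    discharge it to cover deficits; return the fraction of demand met."""
    cap = storage_hours * demand.mean()
    level, served = 0.0, 0.0
    for s, d in zip(supply, demand):
        if s >= d:
            served += d
            level = min(cap, level + (s - d))
        else:
            draw = min(level, d - s)
            level -= draw
            served += s + draw
    return served / demand.sum()

# Synthetic year: flat demand, solar as a clipped diurnal cycle, noisy wind;
# generation is sized to 110 percent of average demand.
rng = np.random.default_rng(1)
hours = np.arange(365 * 24)
demand = np.ones(hours.size)
solar = np.clip(np.sin(2 * np.pi * (hours % 24) / 24), 0, None)
wind = np.clip(rng.normal(0.5, 0.3, hours.size), 0, None)
supply = 1.1 * (solar + wind) / (solar + wind).mean()

print(round(fraction_served(supply, demand, storage_hours=12), 2))
```

Even in this crude sketch, adding storage raises the served fraction well above what generation alone achieves, while the last few percent of demand remain the hardest to cover.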
Weather
2018
February 26, 2018
https://www.sciencedaily.com/releases/2018/02/180226152657.htm
New understanding of ocean turbulence could improve climate models
Brown University researchers have made a key insight into how high-resolution ocean models simulate the dissipation of turbulence in the global ocean. Their research was recently published.
The study was focused on a form of turbulence known as mesoscale eddies, ocean swirls on the scale of tens to hundreds of kilometers across that last anywhere from a month to a year. These kinds of eddies can pinch off from strong boundary currents like the Gulf Stream, or form where water flows of different temperatures and densities come into contact.

"You can think of these as the weather of the ocean," said Baylor Fox-Kemper, co-author of the study and an associate professor in Brown's Department of Earth, Environmental and Planetary Sciences. "Like storms in the atmosphere, these eddies help to distribute energy, warmth, salinity and other things around the ocean. So understanding how they dissipate their energy gives us a more accurate picture of ocean circulation."

The traditional theory for how small-scale turbulence dissipates energy states that as an eddy dies out, it transmits its energy to smaller and smaller scales. In other words, large eddies decay into smaller and smaller eddies until all the energy is dissipated. It's a well-established theory that makes useful predictions and is widely used in fluid dynamics. The problem is that it doesn't apply to mesoscale eddies.

"That theory only applies to eddies in three-dimensional systems," Fox-Kemper said. "Mesoscale eddies are on the scale of hundreds of kilometers across, yet the ocean is only four kilometers deep, which makes them essentially two-dimensional. And we know that dissipation works differently in two dimensions than it does in three."

Rather than breaking up into smaller and smaller eddies, Fox-Kemper says, two-dimensional eddies tend to merge into larger and larger ones.

"You can see it if you drag your finger very gently across a soap bubble," he said. "You leave behind this swirly streak that gets bigger and bigger over time. Mesoscale eddies in the global ocean work the same way."

This upscale energy transfer is not as well understood mathematically as the downscale dissipation. That's what Fox-Kemper and Brodie Pearson, a research scientist at Brown, wanted to address with this study.

They used a high-resolution ocean model that has been shown to do a good job of matching direct satellite observations of the global ocean system. The model's high resolution means it is able to simulate eddies on the order of 100 kilometers across. Pearson and Fox-Kemper wanted to look in detail at how the model dealt with eddy dissipation in statistical terms.

"We ran five years of ocean circulation in the model, and we measured the damping of energy at every grid point to see what the statistics are," Fox-Kemper said. They found that dissipation followed what's known as a lognormal distribution -- one in which one tail of the distribution dominates the average.

"There's the old joke that if you have 10 regular people in a room and Bill Gates walks in, everybody gets a billion dollars richer on average -- that's a lognormal distribution," Fox-Kemper said. "What it tells us in terms of turbulence is that 90 percent of the dissipation takes place in 10 percent of the ocean."

Fox-Kemper noted that the downscale dissipation of 3-D eddies follows a lognormal distribution as well. So despite the inverse dynamics, "there's an equivalent transformation that lets you predict lognormality in both 2-D and 3-D systems."

The researchers say this new statistical insight will be helpful in developing coarser-grained ocean simulations that aren't as computationally expensive as the one used in this study. Using this model, it took the researchers two months on 1,000 processors to simulate just five years of ocean circulation.

"If you want to simulate hundreds or thousands of years, or if you want something you can incorporate within a climate model that combines ocean and atmospheric dynamics, you need a coarser-grained model or it's just computationally intractable," Fox-Kemper said.
"If we understand the statistics of how mesoscale eddies dissipate, we might be able to bake those into our coarser-grained models. In other words, we can capture the effects of mesoscale eddies without actually simulating them directly."

The results could also provide a check on future high-resolution models. "Knowing this makes us much more capable of figuring out if our models are doing the right thing and how to make them better," Fox-Kemper said. "If a model isn't producing this lognormality, then it's probably doing something wrong."

The research was supported by the National Science Foundation (OCE-1350795), the Office of Naval Research (N00014-17-1-2963) and the National Key Research Program of China (2017YFA0604100).
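The "Bill Gates walks into the room" property of a lognormal distribution is easy to demonstrate numerically: a heavily skewed lognormal concentrates most of its total in a small fraction of samples. The sigma below is an arbitrary assumption chosen to be strongly skewed, not a value fitted to the study's model output:

```python
import numpy as np

# Sample a skewed lognormal "dissipation" field and check how much of the
# total comes from the top 10 percent of grid points.
rng = np.random.default_rng(42)
dissipation = rng.lognormal(mean=0.0, sigma=2.5, size=100_000)

top_decile = np.sort(dissipation)[-len(dissipation) // 10:]
share = top_decile.sum() / dissipation.sum()
print(round(share, 2))  # most of the total sits in the top decile
```

The larger the sigma, the more the tail dominates -- which is the statistical shape behind "90 percent of the dissipation in 10 percent of the ocean."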
Weather
2018
February 22, 2018
https://www.sciencedaily.com/releases/2018/02/180222133400.htm
Weather should remain predictable despite climate change
According to the Intergovernmental Panel on Climate Change, temperatures are expected to rise between 2.5 and 10 degrees Fahrenheit over the next century. This warming is expected to contribute to rising sea levels and the melting of glaciers and permafrost, as well as other climate-related effects. Now, research from the University of Missouri suggests that even as rising carbon dioxide levels in the atmosphere drive the climate toward warmer temperatures, the weather will remain predictable.
"The jet stream changes character every 10 to 12 days, and we use this pattern to predict the weather," said Anthony Lupo, professor of atmospheric science in MU's School of Natural Resources, which is located in the College of Agriculture, Food and Natural Resources. "We were curious about how this would change in a world with higher carbon dioxide levels. We found that in that warmer world, the variability of the jet stream remained the same."

Lupo and Andrew Jensen, who earned his doctorate at MU, used an existing climate model to simulate jet stream flow in the Northern Hemisphere. The simulation monitored a variable that responds to changes in jet stream flow and can indicate global-scale weather instability. The researchers used this variable to determine when the jet stream altered its flow. Since meteorologists can only accurately predict weather within the 10 to 12 days between jet stream flow changes, a shift in this time frame would directly affect weather predictability.

Over the course of a simulated 31 years, their observations indicated the jet stream would change its character about 30 to 35 times per year, a number consistent with current jet stream patterns. As the time frame used to predict weather did not change, the researchers concluded that weather would likely remain as predictable in a warmer world as it is today. The results do not address the effects of climate change on the nature or frequency of weather events but instead focus on the range of predictability afforded by the jet stream. In addition, the researchers did not extend the simulation past mid-century, to keep their data as accurate as possible.

"Climate change will continue to create a lot of ripple effects, but this experiment provides evidence that the range of forecasting will remain the same," Lupo said.
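A crude way to picture counting "changes of character" per year is to track sign changes of a circulation index. Everything in this sketch -- the synthetic index, the threshold, the smoothing window -- is an illustrative stand-in, not the study's actual diagnostic:

```python
import numpy as np

def count_regime_changes(index, threshold=0.0):
    """Count sign changes of a circulation index about a threshold --
    a crude stand-in for the jet stream 'changing character'."""
    sign = np.sign(index - threshold)
    return int(np.sum(sign[1:] != sign[:-1]))

# Synthetic daily index for one year: a sign change roughly every 11 days
# (a 22-day oscillation) plus noise, lightly smoothed so brief wiggles
# don't register as regime changes.
rng = np.random.default_rng(7)
days = np.arange(365)
index = np.sin(2 * np.pi * days / 22.0) + rng.normal(0, 0.3, days.size)
smooth = np.convolve(index, np.ones(5) / 5, mode="same")

print(count_regime_changes(smooth))  # a few dozen changes per year
```

With a change roughly every 11 days, the count naturally lands near the 30-35-per-year scale reported for the simulations.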
Weather
2018
February 20, 2018
https://www.sciencedaily.com/releases/2018/02/180220170354.htm
Distant tropical storms have ripple effects on weather close to home
The famously intense tropical rainstorms along Earth's equator occur thousands of miles from the United States. But atmospheric scientists know that, like ripples in a pond, tropical weather creates powerful waves in the atmosphere that travel all the way to North America and have major impacts on weather in the U.S.
These far-flung, interconnected weather processes are crucial to making better, longer-term weather predictions than are currently possible. Colorado State University atmospheric scientists, led by professors Libby Barnes and Eric Maloney, are hard at work to address these longer-term forecasting challenges.

According to a new study, led by former graduate researcher Bryan Mundhenk, a model using both these phenomena allows skillful prediction of the behavior of major rain storms, called atmospheric rivers, three and up to five weeks in advance.

"It's impressive, considering that current state-of-the-art numerical weather models, such as NOAA's Global Forecast System, or the European Centre for Medium-Range Weather Forecasts' operational model, are only skillful up to one to two weeks in advance," says paper co-author Cory Baggett, a postdoctoral researcher in the Barnes and Maloney labs.

The researchers' chief aim is improving forecast capabilities within the tricky no-man's land of "subseasonal to seasonal" timescales: roughly three weeks to three months out. Predictive capabilities that far in advance could save lives and livelihoods, from sounding alarms for floods and mudslides to preparing farmers for long dry seasons. Barnes also leads a federal NOAA task force for improving subseasonal to seasonal forecasting, with the goal of sharpening predictions for hurricanes, heat waves, the polar vortex and more.

Atmospheric rivers aren't actual waterways, but "rivers in the sky," according to researchers. They're intense plumes of water vapor that cause extreme precipitation -- plumes so large they resemble rivers in satellite pictures. These "rivers" are responsible for more than half the rainfall in the western U.S.

The Madden-Julian Oscillation is a cluster of rainstorms that moves east along the Equator over 30 to 60 days. The location of the oscillation determines where atmospheric waves will form, and their eventual impact on, say, California.
In previous work, the researchers have uncovered key stages of the Madden-Julian Oscillation that affect far-off weather, including atmospheric rivers.

Sitting above the Madden-Julian Oscillation is a very predictable wind pattern called the quasi-biennial oscillation. Over two- to three-year periods, the winds shift east, west and back east again, and almost never deviate. This pattern directly affects the Madden-Julian Oscillation, and thus indirectly affects weather all the way to California and beyond.

The CSU researchers created a model that can accurately predict atmospheric river activity in the western U.S. three weeks from now. Its inputs include the current state of the Madden-Julian Oscillation and the quasi-biennial oscillation. Using information on how atmospheric rivers have previously behaved in response to these oscillations, they found that the quasi-biennial oscillation matters -- a lot.

Armed with their model, the researchers want to identify and understand deficiencies in state-of-the-art numerical weather models that prevent them from predicting weather on these subseasonal time scales.

"It would be worthwhile to develop a good understanding of the physical relationship between the Madden-Julian Oscillation and the quasi-biennial oscillation, and see what can be done to improve models' simulation of this relationship," Mundhenk said.

Another logical extension of their work would be to test how well their model can forecast actual rainfall and wind, or other severe weather such as tornadoes and hail.
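One simple form an empirical forecast like this can take is a conditional-probability lookup keyed on the states of the two oscillations. The sketch below is a guess at the general approach under stated assumptions -- the phase encoding, the synthetic data, and the lookup itself are all invented for illustration; the study's actual method is more sophisticated:

```python
import numpy as np

def fit_lookup(mjo_phase, qbo_phase, ar_active):
    """Empirical forecast table: P(atmospheric-river activity) conditioned
    on the (MJO phase, QBO phase) pair observed three weeks earlier."""
    table = {}
    for m, q, a in zip(mjo_phase, qbo_phase, ar_active):
        hits, total = table.get((m, q), (0, 0))
        table[(m, q)] = (hits + a, total + 1)
    return {key: h / t for key, (h, t) in table.items()}

# Synthetic training data: 8 MJO phases, 2 QBO states (easterly/westerly);
# AR activity is made more likely for some phase combinations than others.
rng = np.random.default_rng(3)
n = 5000
mjo = rng.integers(1, 9, n)   # MJO phase 1-8
qbo = rng.integers(0, 2, n)   # 0 = easterly, 1 = westerly
p_true = 0.2 + 0.4 * ((mjo >= 5) & (qbo == 0))
ar = (rng.random(n) < p_true).astype(int)

probs = fit_lookup(mjo, qbo, ar)
print(round(probs[(6, 0)], 2), round(probs[(2, 1)], 2))
```

The lookup recovers the planted signal: phase pairs that historically preceded atmospheric-river activity get higher forecast probabilities, which is the essence of conditioning a forecast on oscillation state.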
Weather
2018
February 20, 2018
https://www.sciencedaily.com/releases/2018/02/180220095003.htm
Warmer future for the Pacific Northwest if carbon dioxide levels rise, climate projections show
In the midst of an unseasonably warm winter in the Pacific Northwest, a comparison of four publicly available climate projections has shown broad agreement that the region will become considerably warmer in the next century if greenhouse gas concentrations in the atmosphere rise to the highest levels projected in the Intergovernmental Panel on Climate Change (IPCC) "business-as-usual" scenario.
In this scenario, carbon dioxide concentrations are projected to continue rising and to exceed 900 parts per million, more than double today's level of just over 400 parts per million. Annual average global temperatures are projected to rise between 1.5 and 7 degrees Celsius (2.7 to 12.6 degrees Fahrenheit), and precipitation is expected to increase during the winter and decrease in the summer.

To examine projections of future climates in the Northwest, researchers in the College of Forestry at Oregon State University and the U.S. Forest Service obtained outputs from more than 30 climate models, known as general circulation models. These models simulate the Earth's climate at scales that are generally too large to be applied with confidence to local areas, such as the watersheds of small rivers and streams.

The scientists examined four different versions of the model outputs, each one translated for the region with data from weather stations in the Northwest through a process called "downscaling." While the resulting fine-resolution climate projections vary for parts of the Northwest, such as coastal watersheds and mountain crests, the general agreement among them gives scientists increasing confidence in using fine-resolution climate projections for exploring future climate change impacts. The differences among them were no more than 0.3 degrees Celsius (about 0.5 degrees Fahrenheit) for the region. The results were published this week.

"From a regional perspective, the differences in projected future changes are minor when you look at how much each projection says climate will change for the business-as-usual scenario," said Yueyang Jiang, lead author and a postdoctoral scientist at OSU. "The climate projections were created using different downscaling methods, but the projected changes in climate among them are similar at the regional scale."

The researchers chose to analyze projections for the recent past as well as for three 29-year periods from 2011 to 2100. Their goal was to characterize the differences to inform and guide scientists and land managers who are evaluating the projected impacts of climate change on local resources.

The fine-resolution climate projections vary in downscaling techniques and in the choice of historically observed weather datasets used to calibrate their calculations. Jiang and his team confirmed that the methods used to downscale each of the models had little to no effect on the data. They showed instead that differences arose from the choice of historical observation datasets used, which vary due to highly variable weather patterns or a lack of data in areas where weather stations are far apart.

"These differences become enhanced in areas with strong geographic features, such as the coastline and at the crest of mountain ranges," said John Kim, co-author on the paper and a scientist with the Pacific Northwest Research Station of the U.S. Forest Service.

Nevertheless, Kim added, the analysis reveals "a fairly consistent high-resolution picture of climate change" under the highest greenhouse gas concentration scenario projected by the IPCC. "So, individuals and organizations that are interested in how much climate may change for most parts of the region can use any of the datasets we examined."

However, the researchers also caution against using only one projection to explore the effects of climate change at specific thresholds, such as how plants and animals might respond to a decrease in days with temperatures below freezing. Scientists interested in such climate effects should use several models, they added.

The project was supported by Oregon State University and the Pacific Northwest Research Station.
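The 0.3-degree agreement figure is a statement about the largest pairwise disagreement among the four datasets' regional means. A minimal sketch of that comparison, with hypothetical numbers standing in for the downscaled projections:

```python
import numpy as np

def max_pairwise_diff(regional_means):
    """Largest disagreement (deg C) among projections' regional means."""
    vals = np.asarray(regional_means, dtype=float)
    return float(np.max(np.abs(vals[:, None] - vals[None, :])))

# Hypothetical end-of-century regional warming (deg C) from four downscaled
# datasets -- values invented to mimic close regional-scale agreement.
projections = [4.82, 4.95, 5.05, 4.90]
print(round(max_pairwise_diff(projections), 2))  # 0.23
```

When the maximum pairwise spread stays under a few tenths of a degree, any one of the datasets gives a similar regional-scale answer, which is the study's practical takeaway.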
Weather
2018
February 14, 2018
https://www.sciencedaily.com/releases/2018/02/180214150205.htm
Risk of extreme weather events higher if Paris Agreement goals aren't met
The individual commitments made by parties of the United Nations Paris Agreement are not enough to fulfill the agreement's overall goal of limiting global temperature rise to less than 2 degrees Celsius above pre-industrial levels. The difference between the U.N. goal and the actual country commitments is a mere 1 degree Celsius, which may seem negligible. But a study from Stanford University, published Feb. 14, finds that this difference substantially raises the risk of extreme weather events.
In this study, Noah Diffenbaugh, the Kara J Foundation Professor of Earth System Science at Stanford's School of Earth, Energy & Environmental Sciences, and fellow researchers from Columbia University and Dartmouth College expanded on previous work analyzing historical climate data, which demonstrated how greenhouse gas emissions have increased the probability of record-breaking hot, wet and dry events in the present climate. Now, the group analyzed similar models to estimate the probability of extreme weather events in the future under two scenarios of the Paris Agreement: increases of 1.5 to 2 degrees if countries live up to their aspirations, or 2 to 3 degrees if they meet the commitments that they have made.

"The really big increases in record-setting event probability are reduced if the world achieves the aspirational targets rather than the actual commitments," said Diffenbaugh, who is also the Kimmelman Family Senior Fellow in the Stanford Woods Institute for the Environment. "At the same time, even if those aspirational targets are reached, we still will be living in a climate that has substantially greater probability of unprecedented events than the one we're in now."

The new study is the latest application of an extreme event framework that Diffenbaugh and other researchers at Stanford have been developing for years. They have applied this framework to individual events, such as the 2012-2017 California drought and the catastrophic flooding in northern India in June 2013. In their 2017 paper on severe events, they found that global warming from human emissions of greenhouse gases has increased the odds of the hottest events across more than 80 percent of the globe for which reliable observations were available, while also increasing the likelihood of both wet and dry extremes.

The framework relies on a combination of historical climate observations and climate models that are able to simulate the global circulation of the atmosphere and ocean.
The group uses output from these models run under two conditions: one that includes only natural climate influences, like sunspot or volcano activity, and another that also includes human influences like rising carbon dioxide concentrations. The researchers compare the simulations to historical extreme event data to test whether the condition with natural or human influences best represents reality.

For the new study, the researchers expanded the number of climate models beyond their previous paper, which had investigated the 1 degree of global warming that has already occurred, strengthening their earlier conclusions. They then used their findings to predict the probabilities of severe events under the two Paris Agreement scenarios.

Although the researchers knew that increases in temperature would very likely lead to increases in severe events, the stark difference in the outcomes of the two scenarios surprised them.

The researchers found that emissions consistent with the commitments countries have made are likely to result in a more than fivefold increase in the probability of record-breaking warm nights over approximately 50 percent of Europe and more than 25 percent of East Asia. This 2 to 3 degrees of global warming would also likely result in a greater than threefold increase in record-breaking wet days over more than 35 percent of North America, Europe and East Asia. The authors found that this level of warming is also likely to lead to increases in hot days, along with milder cold nights and shorter freezes.

Meeting the Paris Agreement's goal of keeping global-scale warming to less than 2 degrees is likely to reduce the area of the globe that experiences greater than threefold increases in the probability of record-setting events.
However, even at this reduced level of global warming, the world is still likely to see increases in record-setting events compared to the present.

When people build a dam, plan the management of a river or build on a floodplain, it is common practice to base decisions on historical data. This study provides more evidence that these historical probabilities no longer apply in many parts of the world. The new analysis helps clarify what the climate is likely to look like in the future and could help decision-makers plan accordingly.

"Damages from extreme weather and climate events have been increasing, and 2017 was the costliest year on record," Diffenbaugh said. "These rising costs are one of many signs that we are not prepared for today's climate, let alone for another degree of global warming."

"But the good news is that we don't have to wait and play catch-up," Diffenbaugh added. "Instead, we can use this kind of research to make decisions that both build resilience now and help us be prepared for the climate that we will face in the future."
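The "fold increase in record-breaking probability" comparisons can be illustrated by shifting a distribution of seasonal extremes and counting exceedances of a fixed historical record. The distributions, the record threshold, and the shifts below are invented for illustration, not taken from the study:

```python
import numpy as np

def record_prob(samples, record):
    """Probability that a season exceeds the historical record."""
    return float(np.mean(samples > record))

# Toy comparison: seasonal-maximum temperature anomalies under a baseline
# climate and under +2 C and +3 C mean shifts (variability unchanged).
rng = np.random.default_rng(11)
record = 3.0                                  # historical record anomaly (C)
base = rng.normal(0.0, 1.2, 100_000)
warm2 = base + 2.0
warm3 = base + 3.0

p0, p2, p3 = (record_prob(s, record) for s in (base, warm2, warm3))
print(round(p2 / p0, 1), round(p3 / p0, 1))   # fold increases over baseline
```

Because records sit in the tail of the distribution, even a modest mean shift multiplies the exceedance probability many times over -- which is why the 1-degree gap between aspirations and commitments matters so much.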
Weather
2018
February 13, 2018
https://www.sciencedaily.com/releases/2018/02/180213183555.htm
Polar vortex defies climate change in the Southeast U.S.
Overwhelming scientific evidence has demonstrated that our planet is getting warmer due to climate change, yet parts of the eastern U.S. are actually getting cooler, according to a Dartmouth-led study.
During the winter and spring, the U.S. warming hole sits over the Southeast, as the polar vortex allows arctic air to plunge into the region. This has resulted in persistently cooler temperatures throughout the Southeast. After spring, the U.S. warming hole moves north and is located in the Midwest. The study found that winter temperatures in the U.S. warming hole are associated with a wavier jet stream, which is linked to natural climate cycles over the Atlantic and Pacific Oceans, and potentially to climate change. Previous research has illustrated that warming temperatures and melting Arctic sea ice set up conditions for a wavier jet stream. The study revealed that the jet stream over the U.S. became wavier in the late 1950s, coincident with the start of the warming hole. As such, since the late 1950s, the polar vortex has been cooling the southeastern U.S. during the winter. "By discovering that the U.S. warming hole's location depends on the season, we've found a new way to help understand this phenomenon," says Jonathan M. Winter, an assistant professor of geography at Dartmouth and principal investigator for the research. "For example, the recent extreme cold snaps in the Southeast, which seem counterintuitive to global warming, may be related to the U.S. warming hole," added Trevor F. Partridge, a graduate student in earth sciences at Dartmouth and the study's lead author. While the wintertime U.S. warming hole was found to be associated with the wavier jet stream, this was not the case for summertime temperatures. This conclusion supports previous studies that find connections between the summer warming hole in the Midwest and intensified farming, increased irrigation and air pollution, which primarily impact climate in summer and autumn. The study provides new insight on when the U.S. warming hole occurred and where it is located spatially.
Using National Oceanic and Atmospheric Administration data from 1,407 temperature stations and 1,722 precipitation stations from throughout the contiguous U.S. from 1901 to 2015, the researchers examined temperature and precipitation data over time for all stations, and identified stations that were persistently cooler than average from 1960 to 2015. Daily temperatures in the warming hole cooled by an average of 1.2 degrees Fahrenheit since 1958, compared to a global average warming of about 1 degree Fahrenheit over the same period. The findings provide greater context on the cause of the U.S. warming hole, a phenomenon that has large implications for both the U.S. agricultural sector, and Midwest and Southeast weather now and potentially into the future.
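The station-screening step described above can be sketched as computing each station's temperature anomaly relative to an early-century baseline and flagging stations that stayed persistently below it after 1960. A minimal sketch with synthetic data (station names, values, and the flagging rule are hypothetical, not the study's method):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1901, 2016)

# Synthetic annual-mean temperatures (deg F) for three hypothetical stations:
# two that warm with the global trend, one "warming hole" station that cools
# after 1958.
trend = 0.01 * (years - 1901)
stations = {
    "station_a": 55 + trend + rng.normal(0, 0.3, years.size),
    "station_b": 60 + trend + rng.normal(0, 0.3, years.size),
    "hole_station": 58 - 0.02 * np.clip(years - 1958, 0, None)
                    + rng.normal(0, 0.3, years.size),
}

persistently_cool = []
for name, temps in stations.items():
    baseline = temps[years < 1960].mean()
    recent = temps[years >= 1960]
    # Flag stations whose post-1960 mean anomaly is negative.
    if (recent - baseline).mean() < 0:
        persistently_cool.append(name)

print(persistently_cool)
```

Averaging the flagged stations' anomalies, as the researchers did across 1,407 real stations, is what yields the roughly 1.2 degree Fahrenheit cooling figure quoted above.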
Weather
2018
February 13, 2018
https://www.sciencedaily.com/releases/2018/02/180213123251.htm
Intensive agriculture influences US regional summer climate, study finds
Scientists agree that changes in land use such as deforestation, and not just greenhouse gas emissions, can play a significant role in altering the world's climate systems. Now, a new study by researchers at MIT and Dartmouth College reveals how another type of land use, intensive agriculture, can impact regional climate.
The researchers show that in the last half of the 20th century, the midwestern U.S. went through an intensification of agricultural practices that led to dramatic increases in production of corn and soybeans. And, over the same period in that region, summers were significantly cooler and had greater rainfall than during the previous half-century. This effect, with regional cooling in a time of overall global warming, may have masked part of the warming effect that would have occurred over that period, and the new finding could help to refine global climate models by incorporating such regional effects. The findings are being published this week. The team showed that there was a strong correlation, in both space and time, between the intensification of agriculture in the Midwest, the decrease in observed average daytime temperatures in the summer, and an increase in the observed local rainfall. In addition to this circumstantial evidence, they identified a mechanism that explains the association, suggesting that there was indeed a cause-and-effect link between the changes in vegetation and the climatic effects. Eltahir explains that plants "breathe" in the carbon dioxide they require for photosynthesis by opening tiny pores, called stomata, but each time they do this they also lose moisture to the atmosphere. With the combination of improved seeds, fertilizers, and other practices, between 1950 and 2009 the annual yield of corn in the Midwest increased about fourfold and that of soybeans doubled. These changes were associated with denser plants with more leaf mass, which thus increased the amount of moisture released into the atmosphere. That extra moisture served to both cool the air and increase the amount of rainfall, the researchers suggest. "For some time, we've been interested in how changes in land use can influence climate," Eltahir says.
"It's an independent problem from carbon dioxide emissions," which have been more intensively studied. Eltahir, Alter, and their co-authors noticed that records showed that over the course of the 20th century, "there were substantial changes in regional patterns of temperature and rainfall. A region in the Midwest got colder, which was a surprise," Eltahir says. Because weather records in the U.S. are quite extensive, there is "a robust dataset that shows significant changes in temperature and precipitation" in the region. Over the last half of the century, average summertime rainfall increased by about 15 percent compared to the previous half-century, and average summer temperatures decreased by about half a degree Celsius. The effects are "significant, but small," Eltahir says. By introducing into a regional U.S. climate model a factor to account for the more intensive agriculture that has made the Midwest one of the world's most productive agricultural areas, the researchers found, "the models show a small increase in precipitation, a drop in temperature, and an increase in atmospheric humidity," Eltahir says -- exactly what the climate records actually show. That distinctive "fingerprint," he says, strongly suggests a causative association. "During the 20th century, the midwestern U.S. experienced regional climate change that's more consistent with what we'd expect from land-use changes as opposed to other forcings," he says. This finding in no way contradicts the overall pattern of global warming, Eltahir stresses.
But in order to refine the models and improve the accuracy of climate predictions, "we need to understand some of these regional and local processes taking place in the background." Unlike land-use changes such as deforestation, which can reduce the absorption of carbon dioxide by trees that can help to ameliorate emissions of the gas, the changes in this case did not reflect any significant increase in the area under cultivation, but rather a dramatic increase in yields from existing farmland. "The area of crops did not expand by a whole lot over that time, but crop production increased substantially, leading to large increases in crop yield," Alter explains. The findings suggest the possibility that at least on a small-scale regional or local level, intensification of agriculture on existing farmland could be a way of doing some local geoengineering to at least slightly lessen the impacts of global warming, Eltahir says. A recent paper from another group in Switzerland suggests just that. But the findings could also portend some negative impacts because the kind of intensification of agricultural yields achieved in the Midwest is unlikely to be repeated, and some of global warming's effects may "have been masked by these regional or local effects. But this was a 20th-century phenomenon, and we don't expect anything similar in the 21st century," Eltahir says. So warming in that region in the future "will not have the benefit of these regional moderators."
Weather
2018
February 12, 2018
https://www.sciencedaily.com/releases/2018/02/180212190935.htm
Bats as barometer of climate change
Historical radar data from weather monitoring archives have provided unprecedented access to the behaviours of the world's largest colony of migratory bats and revealed changes in the animals' seasonal habits with implications for pest management and agricultural production.
The work, which focuses on the Bracken Cave colony in southern Texas, is the first long-term study of animal migration using radar, say Phillip Stepanian and Charlotte Wainwright, meteorologists from Rothamsted Research. The pair's findings are published today. "These bats spend every night hard at work for local farmers, consuming over half of their own weight in insects, many of which are harmful agricultural pests, such as the noctuid moths, corn earworm and fall armyworm," says Wainwright. "Our initial goal was just to show that the populations could be monitored remotely without disturbing the colony. We weren't expecting to see anything particularly noteworthy. The results were surprising," says Stepanian. Millions of bats regularly migrate north from Mexico to Bracken Cave, which is managed by Bat Conservation International in the suburbs of San Antonio. Using the radar data, the pair measured the population exiting the cave every night for 22 years, from 1995 to 2017, enabling them to record seasonal and longer-term changes. "We found that the bats are migrating to Texas roughly two weeks earlier than they were 22 years ago. They now arrive, on average, in mid-March rather than late March," says Wainwright. While most bats tend to have left by the end of November, the pair discovered that about 3.5% of the summer population are now staying for the winter, compared with less than 1% 22 years ago and, from written cave surveys, no overwintering bats at all in the mid-1950s. "We can't tell if the overwintering bats are bats that arrived in March and have not returned south, or if they migrated to Bracken Cave from farther north," says Stepanian. "However, the behavioural patterns indicate a response to some environmental change, and to the presence of insect prey earlier in the year." This bat study "presents a new perspective on adaptation to global change, answering some longstanding questions while raising many more," conclude the pair.
They also note that "weather radar networks are key infrastructure around much of the world...and hold the promise of providing continental surveillance of bat populations, as well as their ongoing responses to global change."
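The long-term shift in arrival timing described above can be sketched as a linear trend fit over annual arrival dates derived from the radar counts. A minimal sketch with synthetic data (the dates and noise level are invented; only the roughly two-weeks-over-22-years shift is taken from the article):

```python
import numpy as np

rng = np.random.default_rng(5)

years = np.arange(1995, 2017)
# Synthetic arrival day-of-year: drifting ~14 days earlier over 22 years.
true_slope = -14 / 22  # days per year
arrival_doy = 85 + true_slope * (years - years[0]) + rng.normal(0, 2, years.size)

# Least-squares linear trend in arrival date.
slope, intercept = np.polyfit(years, arrival_doy, 1)
shift = slope * (years[-1] - years[0])
print(f"trend: {slope:.2f} days/yr, total shift: {shift:.1f} days")
```

A negative slope of this magnitude is what "arriving two weeks earlier than 22 years ago" looks like in trend form.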
Weather
2018
February 8, 2018
https://www.sciencedaily.com/releases/2018/02/180208120839.htm
Micro to macro mapping -- Observing past landscapes via remote-sensing
Remotely detecting changes in landforms has long relied upon the interpretation of aerial and satellite images. Effective interpretation of these images, however, can be hindered by the environmental conditions at the time the photo was taken, the quality of the image and the lack of topographical information.
More recently, data produced by photogrammetry and Light Detection and Ranging (LiDAR) models have become commonplace for those involved in geographical analysis -- engineers, hydrologists, landscape architects and archaeologists. In general, these techniques were designed to highlight small-scale 'micro-topographies' such as the expansive Mayan settlement network recently revealed in the dense jungles of Guatemala. But how to connect the dots on a larger scale? In new research published this week, Dr Hector Orengo, researcher at the McDonald Institute for Archaeological Research and lead author of the study, said, "We originally developed this algorithm to complement multitemporal remote-sensing using multispectral satellite images that are currently being applied to the reconstruction of the prehistoric river network in northwest India as part of the TwoRains project." The TwoRains multitemporal remote sensing approach has had an important impact as it was able to find and accurately trace more than 8,000km of relict water courses, an image of which has been selected as the cover image of this year's Cambridge Science Festival. However, the authors were conscious that many ancient rivers were not being found.
Dr Orengo said, "It soon became clear that detecting and mapping topographic features such as levees, riverbeds, bluff lines and dune fields could help provide insight into how palaeorivers behaved and eventually disappeared." "The new MSRM algorithm has addressed this need and its application has significantly extended our knowledge of the palaeoriver network of north-western India with more than 10,000 new rivers detected." Understanding how the Indus civilisation accessed and managed their water resources is at the heart of the TwoRains Project. Dr Cameron Petrie, director of the ERC-funded project and co-author on the study, commented, "We are investigating the nature of human adaptation to the ecological conditions created by the winter and summer rainfall systems of India. These systems are important for understanding the past and planning for the future due to their potential for direct impact on very current issues such as food security and the sustainability of human settlement in particular areas." "Humans can adapt their behaviour to a wide range of climatic and environmental conditions, so it is essential that we understand the degree to which human choices in the past, present and future are resilient and sustainable in the face of variable weather conditions, and when confronted with abrupt events of climate change. Reconstructing the prehistoric hydrographical network of the Sutlej-Yamuna interfluve in northwest India helps us to understand these adaptations more fully." Dr Orengo believes that the new method has many uses outside the realm of archaeology. He commented, "The application of MSRM can also be beneficial to all other research fields aiming to interpret small terrain differences. We have made the code open access in the paper with the hope that others will be able to use it for their own interests, and also evaluate and improve it."
Weather
2018
February 7, 2018
https://www.sciencedaily.com/releases/2018/02/180207140354.htm
Towards a better prediction of solar eruptions
Just one phenomenon may underlie all solar eruptions, according to researchers from the CNRS, École Polytechnique, CEA and INRIA in an article featured on the cover of the February 8 issue.
Just as on Earth, storms and hurricanes sweep through the atmosphere of the Sun. These phenomena are caused by a sudden, violent reconfiguration of the solar magnetic field, and are characterized by an intense release of energy in the form of light and particle emissions and, sometimes, by the ejection of a bubble of plasma. Studying these phenomena, which take place in the corona (the outermost region of the Sun), will enable scientists to develop forecasting models, just as they do for the Earth's weather. This should limit our technological vulnerability to solar eruptions, which can impact a number of sectors such as electricity distribution, GPS and communications systems. In 2014, researchers showed that a characteristic structure, an entanglement of magnetic force lines twisted together like a hemp rope, gradually appears in the days preceding a solar flare. However, until recently they had only observed this rope in eruptions that ejected bubbles of plasma. In this new study, the researchers studied other types of flare, the models of which are still being debated, by undertaking a more thorough analysis of the solar corona, a region where the Sun's atmosphere is so thin and hot that it is difficult to measure the solar magnetic field there. They did this by measuring the stronger magnetic field at the surface of the Sun, and then using these data to reconstruct what was happening in the solar corona. They applied this method to a major flare that developed over a few hours on October 24, 2014. They showed that, in the hours before the eruption, the evolving rope was confined within a multilayer magnetic 'cage'. Using evolutionary models running on a supercomputer, they showed that the rope had insufficient energy to break through all the layers of the cage, making the ejection of a magnetic bubble impossible.
Despite this, the high twist of the rope triggered an instability and the partial destruction of the cage, causing a powerful emission of radiation that led to disruptions on Earth. Thanks to their method, which makes it possible to monitor the processes taking place in the last few hours leading up to a flare, the researchers have developed a model able to predict the maximum energy that can be released from the region of the Sun concerned. The model showed that for the 2014 eruption, a huge ejection of plasma would have occurred if the cage had been less resistant. This work demonstrates the crucial role played by the magnetic 'cage-rope' duo in controlling solar eruptions, as well as being a new step towards early prediction of such eruptions, which will have potentially significant societal impacts.
Weather
2018
January 30, 2018
https://www.sciencedaily.com/releases/2018/01/180130123725.htm
Pathway to give advanced notice for hailstorms
A new study led by
Hail is easily the most economically destructive hazard posed by severe thunderstorms, producing on average billions of dollars in U.S. losses each year, including damage to roofs, homes and especially crops. "We found a really strong relationship between jet stream patterns over the Pacific Ocean and U.S. hail frequency," Gensini said. "In simple terms, when the jet stream is really wavy, the likelihood of experiencing hail greatly increases." The study by Gensini and co-author John Allen of Central Michigan University was accepted for publication. Two years ago, Gensini led research on a method to predict the likelihood of U.S. tornado activity weeks in advance. Last year, of 26 long-range (two to three weeks) forecasts for increased, average or below average U.S. tornado activity, more than half were "spot on," Gensini said. Most of the other predictions were only slightly off. The new study is an extension of the tornado research, suggesting a similar method can be used in sub-seasonal forecasts of hailstorms. "There's a high degree of correlation between environments that produce hail and tornadoes, but not all storms produce both hazards," said Gensini, a professor in the NIU Department of Geographic and Atmospheric Sciences. While the method would be used to forecast hail activity for the country in general, portions of Texas, Oklahoma, Arkansas, Kentucky, Missouri, Mississippi, Tennessee, Illinois and Indiana are most vulnerable to the phenomenon. Gensini and Allen examined hail observations from national storm data for the period of 1979 to 2016. They compared those events with changes in the Global Wind Oscillation (GWO) index, a collection of climate and weather information that measures atmospheric angular momentum, or the degree of waviness in the jet stream. The GWO index has eight distinct phases.
Four of those phases were reliable predictors of increased inland hail activity during peak storm seasons, according to the study. "There is a strong connection between the GWO and U.S. hail frequency," Allen said. "This relationship helps to understand what is driving hail variability, and explains to a large degree when we are likely to experience active and inactive periods during the spring and fall." During the summer, however, the GWO is unreliable as other smaller-scale meteorological processes tend to dominate local weather conditions. "We will be testing the relationships this spring when severe weather season ramps up," Gensini said. "We're starting to demonstrate more clearly a pathway to increase the lead time for severe weather forecasts, now with both hailstorms and tornadoes," he added. "We keep adding cinderblocks to the methodology, and it's slowly becoming robust."
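The phase-based relationship described above can be illustrated by grouping hail-day observations by GWO phase and checking which phases show elevated frequency. A minimal sketch with synthetic data (the eight phases are a real feature of the GWO index, but which phases are "active" and all counts here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily record: GWO phase (1-8) and whether hail was observed.
n_days = 5000
phases = rng.integers(1, 9, size=n_days)

# Hypothetical: phases 2-5 carry double the daily hail probability.
active = np.isin(phases, [2, 3, 4, 5])
hail = rng.random(n_days) < np.where(active, 0.20, 0.10)

# Hail frequency conditioned on each GWO phase.
freq_by_phase = {p: hail[phases == p].mean() for p in range(1, 9)}
for p, f in sorted(freq_by_phase.items()):
    print(f"phase {p}: hail frequency {f:.3f}")
```

Conditioning observed hail counts on the concurrent GWO phase in this way is the kind of relationship that would let a wavy-jet-stream phase signal elevated hail risk weeks in advance.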
Weather
2018
January 30, 2018
https://www.sciencedaily.com/releases/2018/01/180130094720.htm
UK regional weather forecasts could be improved using jet stream data
Weather forecasters could be able to better predict regional rainfall and temperatures across the UK by using North Atlantic jet stream data, according to new research.
Climate scientists examined the relationship between changes in North Atlantic atmospheric circulation -- or jet stream -- and UK regional weather variations during summer and winter months over the past 65 years, and found that the jet stream changes were significantly associated with variations in regional rainfall and temperatures in the UK. The jet stream is a giant current of air that flows eastwards over mid-latitude regions around the globe, and influences weather in the UK. It is partly controlled by changes in the atmosphere, ocean and sea-ice conditions in various key regions. These findings show for the first time that meteorologists could use predictions of jet stream changes on a seasonal basis -- or potentially even years ahead -- to produce more accurate seasonal weather forecasts on a smaller, regional scale than is currently possible. The research team from the University of Lincoln, UK, say that a reliable seasonal forecast of the jet stream conditions will indicate the weather patterns expected in the UK during upcoming months, and crucially provide regional information on likely general weather conditions.
The information would be invaluable to planners and decision makers in agriculture, energy supply, transport and insurance industries, as well as members of the public. Dr Richard Hall and Edward Hanna, Professor of Climate Science and Meteorology, from the University of Lincoln's School of Geography carried out the study. Professor Hanna said: "Understanding how jet stream changes affect the nature and severity of seasonal weather on a UK regional basis will make a huge difference to the general public and planners in the transport, energy, construction and leisure sectors -- particularly over the winter months. "If we can predict jet stream changes on a seasonal basis, or even potentially a number of years ahead, we should then be able to better predict seasonal weather changes on a smaller, regional scale than we currently can." Using a statistical technique called empirical orthogonal function analysis, the scientists explored where the jet stream was located and how variable this was throughout the year, to identify the dominant North Atlantic atmospheric circulation patterns that influence UK weather on a seasonal scale. They then used a method called correlation analysis, where the association between two different factors, such as changes recorded in the jet stream and subsequent temperatures in the UK, or the volume of rainfall, was used to determine their results. Finally, the team also examined temperature, rainfall or wind data from years which had been noted to have unusual jet stream conditions over 65 years to establish if there were related patterns. Dr Hall added: "Seasonal forecasts often focus on the North Atlantic Oscillation (NAO) index which indicates the position of the jet stream, but they ignore the contribution of the other factors which influence temperature and rainfall anomalies, and more importantly, do not distinguish between UK regional variations, limiting the usefulness of the forecast." The findings have been published.
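Empirical orthogonal function (EOF) analysis, mentioned above, extracts the dominant spatial patterns of variability from a space-time field and is typically computed via a singular value decomposition of the anomaly matrix. A minimal sketch on a synthetic field (the grid size, planted pattern, and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic field: 65 years x 40 grid points of circulation-like anomalies.
n_years, n_points = 65, 40
pattern = np.sin(np.linspace(0, np.pi, n_points))   # a planted dominant mode
amplitude = rng.normal(size=n_years)                # its year-to-year strength
field = np.outer(amplitude, pattern) + 0.1 * rng.normal(size=(n_years, n_points))

# Remove the time mean at each grid point, then decompose.
anomalies = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

eof1 = vt[0]                 # leading spatial pattern (EOF 1)
pc1 = u[:, 0] * s[0]         # its time series (principal component 1)
var_explained = s[0]**2 / np.sum(s**2)
print(f"EOF 1 explains {var_explained:.0%} of the variance")
```

The leading principal component `pc1` could then be correlated against a regional temperature or rainfall series, which is the correlation-analysis step the article describes.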
Weather
2018
January 29, 2018
https://www.sciencedaily.com/releases/2018/01/180129145831.htm
Mammals and birds could have best shot at surviving climate change 
New research that analyzed more than 270 million years of data on animals shows that mammals and birds -- both warm-blooded animals -- may have a better chance of evolving and adapting to the Earth's rapidly changing climate than their cold-blooded peers, reptiles and amphibians.
"We see that mammals and birds are better able to stretch out and extend their habitats, meaning they adapt and shift much easier," said Jonathan Rolland, a Banting postdoctoral fellow at the biodiversity research centre at UBC and lead author of the study. "This could have a deep impact on extinction rates and what our world looks like in the future." By combining data from the current distribution of animals, fossil records and phylogenetic information for 11,465 species, the researchers were able to reconstruct where animals have lived over the past 270 million years and what temperatures they needed to survive in these regions. The planet's climate has changed significantly throughout history and the researchers found that these changes have shaped where animals live. For example, the planet was fairly warm and tropical until 40 million years ago, making it an ideal place for many species to live. As the planet cooled, birds and mammals were able to adapt to the colder temperatures so they were able to move into habitats in more northern and southern regions. "It might explain why we see so few reptiles and amphibians in the Antarctic or even temperate habitats," said Rolland. "It's possible that they will eventually adapt and could move into these regions but it takes longer for them to change." Rolland explained that animals that can regulate their body temperatures, known as endotherms, might be better able to survive in these places because they can keep their embryos warm, take care of their offspring and they can migrate or hibernate. "These strategies help them adapt to cold weather but we rarely see them in the ectotherms or cold-blooded animals," he said. Rolland and colleagues argue that studying the past evolution and adaptations of species might provide important clues to understand how current, rapid changes in temperature impact biodiversity on the planet.
Weather
2018
January 25, 2018
https://www.sciencedaily.com/releases/2018/01/180125140920.htm
Tiny particles have outsize impact on storm clouds, precipitation
Tiny particles fuel powerful storms and influence weather much more than has been appreciated, according to a study published Jan. 26.
The research focuses on the power of minute airborne particles known as aerosols, which can come from urban and industrial air pollution, wildfires and other sources. While scientists have known that aerosols may play an important role in shaping weather and climate, the new study shows that the smallest of particles have an outsize effect: Particles smaller than one-thousandth the width of a human hair can cause storms to intensify, clouds to grow and more rain to fall. The tiny pollutants -- long considered too small to have much impact on droplet formation -- are, in effect, diminutive downpour-makers. "We showed that the presence of these particles is one reason why some storms become so strong and produce so much rain. In a warm and humid area where atmospheric conditions are otherwise very clean, the intrusion of very small particles can make quite an impact," said Jiwen Fan of the Department of Energy's Pacific Northwest National Laboratory, who is lead author of the paper. The findings are based largely on unique data made possible by the GoAmazon research campaign, where scientists made ground-based and airborne measurements related to climate during 2014-2015. The campaign was run by the Atmospheric Radiation Measurement (ARM) Climate Research Facility, a U.S. Department of Energy Office of Science user facility. The study capitalized on data from an area of the Amazon that is pristine except for the region around Manaus, the largest city in the Amazon, with a population of more than 2 million people. The setting gave scientists the rare opportunity to look at the impact of pollution on atmospheric processes in a largely pre-industrial environment and pinpoint the effects of the particles apart from other factors such as temperature and humidity. In this study, scientists studied the role of ultrafine particles less than 50 nanometers wide in the development of thunderstorms.
Similar but larger particles are known to play a role in feeding powerful, fast-moving updrafts of air from the land surface to the atmosphere, creating the clouds that play a central role in the formation of water droplets that fall as rain. But scientists had not observed -- until now -- that smaller particles below 50 nanometers, such as particles produced by vehicles and industrial processes, could do the same. Not only that: the new study revealed that these particles, whose effects on clouds have been mostly neglected until now, can invigorate clouds in a much more powerful way than their larger counterparts. Through detailed computer simulations, the scientists showed how the smaller particles have a powerful impact on storm clouds. It turns out that when larger particles aren't present high in a warm and humid environment, it spells opportunity for the smaller particles to act and form cloud droplets. The low concentration of large particles contributes to high levels of excess water vapor, with relative humidity that can go well beyond 100 percent. That's a key condition spurring ultrafine particles to transform into cloud droplets. While the particles are small in size, they are large in number, and they can form many small droplets on which the excess water vapor condenses. That enhanced condensation releases more heat, and that heat makes the updrafts much more powerful: More warm air is pulled into the clouds, pulling more droplets aloft and producing more ice and snow pellets, lightning, and rain. The result: "Invigorated convection," as Fan says -- and stronger storms. "We've shown that under clean and humid conditions, like those that exist over the ocean and some land in the tropics, tiny aerosols have a big impact on weather and climate and can intensify storms a great deal," said Fan, an expert on the effects of pollution on storms and weather.
"More broadly, the results suggest that from pre-industrial to the present day, human activity possibly may have changed storms in these regions in powerful ways."
Weather
2018
January 24, 2018
https://www.sciencedaily.com/releases/2018/01/180124172428.htm
Century of data shows sea-level rise shifting tides in Delaware, Chesapeake bays
The warming climate is expected to affect coastal regions worldwide as glaciers and ice sheets melt, raising sea level globally. For the first time, an international team has found evidence of how sea-level rise already is affecting high and low tides in both the Chesapeake and Delaware bays, two large estuaries of the eastern United States.
The team combined a computer model with 100 years of observations to tease out the fact that global sea-level rise is increasing the tidal range, or the distance between the high and low tides, in many areas throughout each bay. Tides, or the rising and falling of the ocean's surface, occur on regular intervals and result from numerous factors, the biggest of which is the gravitational pull from the sun and moon. For centuries, people have documented and predicted the daily high and low tides because they can impact ocean navigation. Tides can also affect ocean life, flooding risk, fishing, the weather, and some energy sources such as hydroelectric power. In 2015, Andrew Ross, meteorology doctoral student, Penn State, noticed an odd pattern emerging while testing a numerical computer model for tidal research. Adding one meter of sea-level rise to the model resulted in a distinct pattern of changes to the high and low tides throughout the Chesapeake Bay. "We weren't sure why it was there, but it was unique enough that we thought it should show up in observations, too, if it was actually real," said Ross, now a postdoctoral research associate, Princeton University. "So we started looking at the observations, doing more comparisons." Ross began working with a team that included his adviser, Raymond Najjar, professor of oceanography, Penn State, to pinpoint the precise effects of sea-level rise by subtracting other forces that affect changes to the tidal range. Some of these forces are predictable, including the 18.61 years it takes for the moon's orbit around the Earth to oscillate. Other forces are less predictable, like the effect from dredging a bay to create a wider, deeper space for container ships. The team analyzed tidal gauge records from 15 locations in the Delaware and Chesapeake bays, the oldest of which dates back to 1901.
They also studied nearby cities outside of each bay to control for larger changes affecting the ocean more broadly. Once they isolated sea-level rise's influence on the tides using the tide gauge records, they compared this information with a computer model in which they could adjust the overall sea level, simulating the effects of the past or future. They reported their findings in a peer-reviewed journal article. "In the Delaware Bay, as you go upstream toward Philadelphia, the shorelines are converging in a sort of funnel shape, and so we see that amplifies sea-level rise's effects on the tides," said Ross. "That amplification gets magnified the farther you go upstream." By contrast, the shape of the Chesapeake Bay resulted in a different pattern of effects, in some cases slightly decreasing the tidal range. Compared to the Delaware Bay, the Chesapeake Bay is longer and has less of a funnel shape. After a wave enters the Chesapeake Bay from the ocean, it eventually hits the end of the bay and reflects back towards the ocean, either adding to or subtracting from the heights of waves it contacts. Because waves travel faster in deeper water, a higher sea level results in a faster wave speed. This changes where incoming and outgoing waves interact with each other, which has a cascading effect on tide patterns throughout the bay. "When people think about flooding associated with sea-level rise, they often think everything will just go up, including the tides, but the reality is that in some cases, the tidal range may stay the same or decrease," said Najjar. "It all depends on the geometry of the bay and the speed of waves, which we know are affected by rising sea levels."
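The separation described above, removing a predictable periodic force such as the 18.61-year lunar nodal cycle before reading off a long-term change in tidal range, can be sketched with synthetic data. All values below (trend, amplitude, record length) are invented for illustration and are not the study's figures:

```python
import numpy as np

# Hypothetical sketch: fit an annual tidal-range record as a constant,
# a linear trend, and the 18.61-year lunar nodal oscillation, then
# recover the trend with the nodal signal removed.
NODAL_PERIOD = 18.61  # years, oscillation of the moon's orbital plane

years = np.arange(1901, 2001)
true_trend = 0.0004            # m/yr change in tidal range (assumed)
nodal_amp = 0.03               # m, assumed nodal modulation amplitude
phase = 2 * np.pi * years / NODAL_PERIOD
tidal_range = 1.5 + true_trend * (years - years[0]) + nodal_amp * np.sin(phase)

# Least-squares design matrix: constant + trend + nodal sine/cosine pair.
A = np.column_stack([
    np.ones_like(years, dtype=float),
    (years - years[0]).astype(float),
    np.sin(phase),
    np.cos(phase),
])
coef, *_ = np.linalg.lstsq(A, tidal_range, rcond=None)
print(f"recovered trend: {coef[1]:.6f} m/yr")  # matches true_trend
```

The same least-squares idea extends to the less predictable forces the team had to handle, such as dredging, which is typically treated as a step change rather than a sinusoid.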
Weather
2018
January 24, 2018
https://www.sciencedaily.com/releases/2018/01/180124123225.htm
Record jump in 2014-2016 global temperatures largest since 1900
Global surface temperatures surged by a record amount from 2014 to 2016, boosting the total amount of warming since the start of the last century by more than 25 percent in just three years, according to new University of Arizona-led research.
"Our paper is the first one to quantify this jump and identify the fundamental reason for this jump," said lead author Jianjun Yin, a UA associate professor of geosciences. The Earth's average surface temperature climbed about 1.6 degrees F (0.9 C) from 1900 to 2013. By analyzing global temperature records, Yin and his colleagues found that by the end of 2016, the global surface temperature had climbed an additional 0.43 degrees F (0.24 C). Co-author Jonathan Overpeck said, "As a climate scientist, it was just remarkable to think that the atmosphere of the planet could warm that much that fast." The spike in warming from 2014 to 2016 coincided with extreme weather events worldwide, including heat waves, droughts, floods, extensive melting of polar ice and global coral bleaching. The new research shows that natural variability in the climate system is not sufficient to explain the 2014-2016 temperature increase, said co-author Cheryl Peyser, a UA doctoral candidate in geosciences. In the current paper, the researchers also projected how frequent such big temperature spikes would be under four different greenhouse emission scenarios. Record-breaking temperature jumps and the accompanying extreme weather events will become more frequent unless greenhouse gas emissions decline, the team found. Figuring out the mechanism for the temperature spike built on previous work by Peyser, Yin and others. The earlier work showed that although the Earth's surface warming had slowed from 1998 to 2013, heat from additional atmospheric greenhouse gases was being sequestered in the Pacific Ocean. 
The strong 2014-2015 El Niño roiled the ocean and released all the stored heat, causing a big jump in the Earth's surface temperatures. "Our research shows global warming is accelerating," Yin said. The research paper, "Big Jump of Record Warm Global Mean Surface Temperature in 2014-2016 Related to Unusually Large Oceanic Heat Releases," by Yin, Overpeck, Peyser and Ronald Stouffer, a UA adjunct instructor in geosciences, is available online. The Visiting Scientist Program of Princeton University, the National Oceanic and Atmospheric Administration and the National Science Foundation funded the research. In early 2017, Yin and Overpeck were having lunch at Wilko, a restaurant near the University of Arizona campus, and Yin mentioned how fast the globe was warming. Overpeck said, "I knew it was warming a lot, but I was surprised at how much it warmed and surprised at his insight into the probable mechanism." The two scientists began brainstorming about expanding on Peyser and Yin's previous work. The researchers analyzed observations of global mean surface temperatures from 1850 to 2016, ocean heat content from 1955 to 2016, sea level records from 1948 to 2016 and records of the El Niño climate cycle and a longer climate cycle called the Pacific Decadal Oscillation -- 15 different datasets in all. The analysis showed the 0.43 F (0.24 C) global temperature increase from 2014 to 2016 was unprecedented in the 20th and 21st centuries. Although some release of heat from the Pacific Ocean is normal during an El Niño, the researchers found much of the heat released in 2014-2015 was due to additional warming from increases in the amount of greenhouse gases in the atmosphere. Yin said, "The result indicates the fundamental cause of the large record-breaking events of global temperature was greenhouse-gas forcing rather than internal climate variability alone." The researchers also projected how often a 0.43 F (0.24 C) global temperature increase might occur in the 21st century 
depending on the amount of greenhouse gases emitted worldwide between now and 2100. The team used four representative concentration pathway, or RCP, models that project future climate change between 2006 and 2100. For the low-emission RCP scenario in which greenhouse gas emissions peak by 2020 and decline thereafter, temperature jumps of at least 0.43 F (0.24 C) might occur from zero to one time in the 21st century, the team found. For the highest-emission RCP scenario in which greenhouse gas emissions rise unabated throughout the 21st century, spikes of record warm temperatures would occur three to nine times by 2100. Under this scenario, such events would likely be warmer and longer than the 2014-2016 spike and have more severe impacts. The world is on track for one of the higher emission scenarios, Peyser said. Adapting to the increases in the frequency, magnitude and duration of rapid warming events projected by the higher emission scenario will be difficult, the scientists write. Yin said, "If we can reduce greenhouse gas emissions, we can reduce the number of large record-breaking events in the 21st century -- and also we can reduce the risk."
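The headline figure in this article, that the 2014-2016 jump added more than 25 percent to the warming accrued since 1900, follows directly from the two temperature values quoted above and can be checked with one line of arithmetic:

```python
# Back-of-envelope check using the article's own numbers:
# 0.90 C of warming accrued from 1900 to 2013, plus a 0.24 C jump
# during 2014-2016.
warming_1900_2013 = 0.90   # degrees C
jump_2014_2016 = 0.24      # degrees C

fraction_added = jump_2014_2016 / warming_1900_2013
print(f"increase over prior total: {fraction_added:.1%}")  # about 27%
```

The ratio comes out near 27 percent, consistent with the article's "more than 25 percent" claim.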
Weather
2018
January 24, 2018
https://www.sciencedaily.com/releases/2018/01/180124123113.htm
Weather patterns, farm income, other factors, may be influencing opioid crisis
The overprescribing of opioid-based painkillers may be the main driver of the increased abuse of opioids in rural America, but economists say that other factors, including declining farm income, extreme weather and other natural disasters, may affect a crisis that is killing thousands of citizens and costing the country billions of dollars.
In a study of relationships between socioeconomic variables and opioid-related drug overdoses, researchers found several correlations that are often not discussed in the current conversation about the nation's deaths of despair, which includes opioid overdoses, said Stephan Goetz, professor of agricultural and regional economics, Penn State and director of the Northeast Regional Center for Rural Development. For example, a higher number of natural disasters experienced historically in a county is correlated with an increase in opioid overdoses, according to the researchers. They used presidentially declared disasters by county from FEMA -- the Federal Emergency Management Agency -- to determine the effect of natural disasters on opioid deaths. These disasters primarily include weather-related events, such as hurricanes, droughts and floods. If climatologists' warnings are correct, a changing climate could produce more extreme weather patterns, which could then have an effect on opioid overdoses and deaths, said Goetz, who worked with Meri Davlasheridze, assistant professor in marine sciences, Texas A&M at Galveston. Income also matters, according to the researchers. For each $10,000 reduction in net income per farm, opioid overdoses rose by 10 percent from a national average of 10.2 deaths per 100,000 people to 11.2 deaths per 100,000 people. Opioid-related deaths are also increasing in rural counties, they added. "Our results confirm that economic factors, including income especially and unemployment, as well as population density -- or rurality -- are important," said Goetz. "As we are controlling for economic factors, population density appears to play an independent role in accounting for the disparate death rates." Goetz added that it is important to use caution when interpreting this data. "We are giving each county the same weight in our statistical analysis and the farming population is not that big -- it's about one or two percent of the U.S. 
population," said Goetz. "But, there could be a spillover effect -- if the farm income declines, the rest of the rural economy suffers." Estimates indicate that opioid-related drug overdoses cost the country $432 billion in 2015, according to the researchers, who presented their findings at a recent Allied Social Sciences Association (ASSA) annual meeting in Philadelphia. "To give some sense of this, the opioid crisis is a problem that is orders of magnitude larger than the costs associated with weather-related disasters in 2017," said Goetz. "This is a far-reaching problem -- and it cuts across social, economic and political lines." There are some glimmers of hope in the research, Goetz said. For example, overdoses among younger people seem to be declining; the highest rates of overdose are among people in the 45- to 64-year-old range. Because the self-employed have lower rates of overdose, the researchers suggest that self-employment also seems to be a deterrent against the opioid crisis. "Sometimes we think of the self-employed, or entrepreneurs, as more stressed and as people who might be looking for an escape from those pressures, but that doesn't appear to be the case in opioid use," said Goetz. The researchers theorize that one reason this wave of opioid deaths may be higher in rural counties is because of the low number of mental-health treatment facilities in those areas and, perhaps, the stigma attached to seeking help in those facilities. "There are far fewer mental-health treatment facilities, so if you have a problem, you might not know where to go for help," Goetz said. "We're thinking that one of the things we need to investigate in the future is whether awareness is the problem or is there a stigma? These are all important issues to consider and they could be addressed through educational or other programs." The United States Department of Agriculture and the National Institute of Food and Agriculture supported this work.
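The income relationship reported above is linear: each $10,000 drop in net income per farm raises the overdose rate from 10.2 to 11.2 deaths per 100,000, a 10 percent increase. A small sketch makes the implied slope explicit (the slope is inferred from those two published figures, not a fitted model):

```python
# Illustrative linear relationship from the reported findings:
# a $10,000 decline in net income per farm corresponds to one
# additional overdose death per 100,000 people.
BASELINE_RATE = 10.2          # deaths per 100,000 (national average)
RATE_PER_10K_DECLINE = 1.0    # additional deaths per 100,000

def projected_rate(income_decline_dollars: float) -> float:
    """Project the overdose rate for a given farm-income decline."""
    return BASELINE_RATE + RATE_PER_10K_DECLINE * (income_decline_dollars / 10_000)

print(projected_rate(10_000))   # 11.2, matching the reported figure
```

As the researchers caution, this is a correlation across equally weighted counties, so extrapolating the line far beyond the observed range would overstate what the data support.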
Weather
2018
January 23, 2018
https://www.sciencedaily.com/releases/2018/01/180123102012.htm
Better predicting mountains' flora and fauna in a changing world
Climbing a mountain is challenging. So, too, is providing the best possible information to plan for climate change's impact on mountain vegetation and wildlife. Understanding how plant and animal species in mountainous areas will be affected by climate change is complicated and difficult.
Mountain ranges take up about a quarter of the world's land area, are rich in biodiversity, and are home to many endangered or threatened wildlife, including the iconic giant panda. Mountains also have notoriously complex climates. Their landscapes harbor microclimates, varied air circulation patterns and elevations and usually are too remote to have many weather and climate observing stations. Understanding how climate change may affect wildlife habitats is important to conservation managers. Climate change could render today's wildlife refuges less hospitable and unable to support wildlife populations. The study "Uncertainty of future projections of species distributions in mountainous regions" notes that the majority of researchers working to create models predicting changes in species distributions over time have used climate datasets derived from conventional observing stations. The problem, notes Ying Tang, a research associate in MSU's geography department and the Center for Systems Integration and Sustainability, is that the resolution of the station network in remote mountain areas may not capture the complex climates of mountain ranges, leading to uncertainty in future projections of species distributions. To get a better read on the climate patterns of mountain regions, Tang and her colleagues also examined a newly compiled dataset of remotely-sensed measurements of temperature and precipitation gathered from satellite sensors. These measurements have a finer resolution and more continuous spatial coverage than conventional climate observing networks. They modeled the future distributions of bamboo species in the mountains of southwestern China that are essential for giant panda conservation efforts. The combination of the two types of datasets, Tang said, allows a better understanding of habitat suitability in mountainous areas. It's also much more difficult to process. 
Tang and the group, under the direction of geography professor Julie Winkler, spent some two years running several million simulations to re-examine earlier projections based on conventional climate datasets only, burning through 20 terabytes of data. The use of the two very different climate datasets allows for more confidence in the future projections for those bamboo species for which the projected changes were similar for the two climate datasets, and provides an estimate of the level of uncertainty for those species for which the projections differed. "This information is invaluable for conservation planning, allowing for nuanced and flexible decision making," Tang said. "The use of multiple climate datasets in species distribution modeling helps to ensure that conservation planners in mountainous regions have the best possible information available to them."
Weather
2018
January 22, 2018
https://www.sciencedaily.com/releases/2018/01/180122164718.htm
Predicting snowpack even before the snow falls
As farmers in the American West decide what, when and where to plant, and urban water managers plan for water needs in the next year, they want to know how much water their community will get from melting snow in the mountains.
This melting snow comes from snowpack, the high elevation reservoir of snow which melts in the spring and summer. Agriculture depends on snowpack for a majority of its water. Meltwater also contributes to municipal water supply; feeds rivers and streams, boosting fisheries and tourism; and conditions the landscape, helping lessen the effects of drought and wildfires. Now, new NOAA research is showing we can predict snow levels in the mountains of the West in March some eight months in advance. This prediction can be down to the scale of a mountain range, which will improve regional water forecasts. "In summer when people are thinking about 4th of July fireworks and barbeques, long before the first snow has fallen, our experimental prediction system tells us what the following March will be like," said Sarah Kapnick, a physical scientist at NOAA's Geophysical Fluid Dynamics Laboratory, who led the research that appears online today. While we have long range climate predictions that show a decline of snowpack by the end of the century and short-range rain and snow forecasts, until now there has been little information on what to expect in the next two months to two years. In its early stages and not yet ready to deliver operational forecasts, the research has the potential to improve a range of water-related decisions affecting warm weather water supply, wildfire risk, ecology and industries like agriculture that depend on water from melting snowpack. NOAA's experimental predictions were accurate for much of the West except in the mountains of the southern Sierra Nevada. The infrequent and chaotic nature of precipitation-producing storms in the mountains stretching from California to Washington has long been a challenge. "Having seasonal snow forecasts would be a tremendous boon to water managers," said Frank Gehrke, chief of the California Cooperative Snow Survey Program in the state's Department of Water Resources. 
"I'm not surprised prediction is running into difficulty in the Sierra Nevada, but I'm hopeful the work we're doing now to improve data from this terrain will help improve prediction here." While better prediction of water resources has always been a priority in the West, the recent prolonged drought from 2012 to 2015 and the devastating 2017 wildfires have raised the stakes. The Weather Research and Forecasting Innovation Act of 2017 passed by Congress and signed by the President also identifies improved snowpack prediction as a national priority.
Weather
2018
January 18, 2018
https://www.sciencedaily.com/releases/2018/01/180118173711.htm
Long-term warming trend continued in 2017: NASA, NOAA
Earth's global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA.
Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62 degrees Fahrenheit (0.90 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's Goddard Institute for Space Studies (GISS) in New York. That is second only to global temperatures in 2016. In a separate, independent analysis, scientists at the National Oceanic and Atmospheric Administration (NOAA) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies' records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010. Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017's global mean change is accurate to within 0.1 degree Fahrenheit, with a 95 percent certainty level. "Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we've seen over the last 40 years," said GISS Director Gavin Schmidt. The planet's average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels. Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. 
A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event -- and with a La Niña starting in the later months of 2017 -- last year's temperatures ranked between 2015 and 2016 in NASA's records. In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record. Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record. Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice. NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980. NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are publicly available. GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
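The anomaly convention described above, reporting each year as a deviation from the 1951-1980 mean rather than as an absolute temperature, can be sketched in a few lines. The temperature series here is invented purely to show the arithmetic; it is not GISS data:

```python
import numpy as np

# Minimal sketch of the baseline-anomaly convention: subtract the
# 1951-1980 mean from every year's global mean temperature.
years = np.arange(1951, 2018)
temps = 14.0 + 0.01 * (years - 1951)   # hypothetical global means, deg C

baseline_mask = (years >= 1951) & (years <= 1980)
baseline = temps[baseline_mask].mean()
anomalies = temps - baseline

print(f"baseline (1951-1980 mean): {baseline:.3f} C")
print(f"2017 anomaly: {anomalies[-1]:.3f} C")
```

Because both agencies subtract a baseline, their anomaly series can agree closely even when their absolute estimates and baseline periods differ, which is why the article stresses rankings rather than raw temperatures.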
Weather
2018
January 18, 2018
https://www.sciencedaily.com/releases/2018/01/180118100806.htm
Warming Arctic climate constrains life in cold-adapted mammals
Despite the growth in knowledge about the effects of a warming Arctic on its cold-adapted species, how these changes affect animal populations is poorly understood. Research efforts have been hindered by the area's remoteness and complex logistics required to gain access.
A new study led by Joel Berger, professor in the Department of Fish, Wildlife and Conservation Biology at Colorado State University, has uncovered previously unknown effects of rain-on-snow events, winter precipitation and ice tidal surges on the Arctic's largest land mammal, the muskox. The warmer climate is stressing mothers and young muskoxen, said Berger, also a senior scientist with the Wildlife Conservation Society. Rain-on-snow events occurring in the winter -- when muskoxen gestate -- and unusually dry winter conditions have also led to underdeveloped skeletal growth in juvenile muskoxen. This effect can be traced back to their pregnant mothers. "When rain-on-snow events occur in the Arctic, due to warming temperatures, and the snow freezes again, this leads to mothers not being able to access food for adequate nutrition," said Berger. "The babies then, unfortunately, pay the price." The smaller size observed in juvenile and young adult muskoxen is associated with poorer health and fitness, due to delayed puberty, and increased mortality, according to the research team. In addition, scientists documented a mass mortality event due to a one-time extreme ice event caused by a tidal surge. In February 2011, a historically high tidal surge resulted in at least 52 muskoxen being submerged at the northern coast of the Bering Land Bridge peninsula. Researchers also found historical records documenting deaths due to rapid freezing and thawing during winter at single sites of whales -- 170 beluga and 150 narwhals -- and sea otters along the Aleutian Islands. "Unlike polar bears, which are on the world's stage, no one really knows about muskoxen or cares," said Berger. "They roamed with wooly mammoths but still survive. Muskoxen are feeling the heat, just as we humans are feeling the extremes of climate. These wild weather swings have massive impacts on us. 
Solutions are clear, but we fail to respond by changing our consumptive ways." The research team analyzed head size of juvenile muskoxen using digital photo data over the span of seven years and at three sites in Alaska and Russia. They also compiled winter weather data for the Alaskan sites from the closest weather stations maintained by the National Oceanic and Atmospheric Administration. Data used to calculate rain-on-snow events on Wrangel Island, located in the Arctic Ocean between the Chukchi Sea and East Siberian Sea, were from the Federal Hydro-meteorological Service of Russia, whose records date back to 1926. Berger acknowledged the role his research plays "in a challenging political era." His collaborators include Russian scientists in Asia and the Alaskan Arctic. Berger is not only a researcher, but he also serves as a diplomat of sorts, working closely with Russian research counterparts and the government. He praised them for their active engagement and willingness to share data. "To better understand how species like the muskoxen are responding to a changing Arctic, we must collaborate, so that we can document rare and often tragic events, and gain additional insight from local people," said Berger.
Weather
2018
January 17, 2018
https://www.sciencedaily.com/releases/2018/01/180117125525.htm
Himawari-8 data assimilated simulation enables 10-minute updates of rain and flood predictions
Using the power of Japan's K computer, scientists from the RIKEN Advanced Institute for Computational Science and collaborators have shown that incorporating satellite data at frequent intervals -- ten minutes in the case of this study -- into weather prediction models can significantly improve the rainfall predictions of the models and allow more precise predictions of the rapid development of a typhoon.
Weather prediction models attempt to predict future weather by running simulations based on current conditions taken from various sources of data. However, the inherently complex nature of the systems, coupled with the lack of precision and timeliness of the data, makes it difficult to conduct accurate predictions, especially with weather systems such as sudden precipitation. As a means to improve models, scientists are using powerful supercomputers to run simulations based on more frequently updated and accurate data. The team led by Takemasa Miyoshi of AICS decided to work with data from Himawari-8, a geostationary satellite that began operating in 2015. Its instruments can scan the entire area it covers every ten minutes in both visible and infrared light, at a resolution of up to 500 meters, and the data is provided to meteorological agencies. Infrared measurements are useful for indirectly gauging rainfall, as they make it possible to see where clouds are located and at what altitude. For one study, they looked at the behavior of Typhoon Soudelor (known in the Philippines as Hanna), a category 5 storm that wreaked damage in the Pacific region in late July and early August 2015. In a second study, they investigated the use of the improved data on predictions of heavy rainfall that occurred in the Kanto region of Japan in September 2015. The first of these articles was published in Monthly Weather Review. For the study on Typhoon Soudelor, the researchers adopted a recently developed weather model called SCALE-LETKF -- running an ensemble of 50 simulations -- and incorporated infrared measurements from the satellite every ten minutes, comparing the performance of the model against the actual data from the 2015 tropical storm. They found that compared to models not using the assimilated data, the new simulation more accurately forecast the rapid development of the storm. 
They tried assimilating data at a slower speed, updating the model every 30 minutes rather than every ten minutes, and the model did not perform as well, indicating that the frequency of the assimilation is an important element of the improvement. To perform the research on disastrous precipitation, the group examined data from heavy rainfall that occurred in the Kanto region in 2015. Compared to models without data assimilation from the Himawari-8 satellite, the simulations more accurately predicted the heavy, concentrated rain that took place, and came closer to predicting the situation where an overflowing river led to severe flooding. According to Miyoshi, "It is gratifying to see that supercomputers, along with new satellite data, will allow us to create simulations that will be better at predicting sudden precipitation and other dangerous weather phenomena, which cause enormous damage and may become more frequent due to climate change. We plan to apply this new method to other weather events to make sure that the results are truly robust."
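The core idea behind ensemble assimilation, which the 50-member SCALE-LETKF system applies at scale, is that each new observation pulls the ensemble toward the measurement by a weight set by the ensemble spread relative to the observation error. The toy scalar update below illustrates only that principle; it is a drastic simplification, not the LETKF algorithm itself, and all values are invented:

```python
import random

# Toy sketch of one ensemble assimilation step for a single scalar
# state variable, with 50 members as in the article's ensemble.
random.seed(0)
N_MEMBERS = 50
obs, obs_var = 25.0, 1.0            # hypothetical observation and its error

ensemble = [20.0 + random.gauss(0, 2) for _ in range(N_MEMBERS)]
mean = sum(ensemble) / N_MEMBERS
var = sum((x - mean) ** 2 for x in ensemble) / (N_MEMBERS - 1)

gain = var / (var + obs_var)        # Kalman gain for a scalar state
analysis = [x + gain * (obs - x) for x in ensemble]
analysis_mean = sum(analysis) / N_MEMBERS

print(f"forecast mean {mean:.2f} -> analysis mean {analysis_mean:.2f}")
```

Repeating such updates every ten minutes, rather than every 30, gives the model less time to drift from reality between corrections, which is consistent with the frequency effect the team reports.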
Weather
2018
January 16, 2018
https://www.sciencedaily.com/releases/2018/01/180116222526.htm
Weather anomalies accelerate the melting of sea ice
In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures rose above zero degrees Celsius for several days in parts of the Arctic. Temperatures of up to eight degrees were registered north of Svalbard. Temperatures this high have not been recorded in the winter half of the year since the beginning of systematic measurements at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt.
"We heard about this from the media," says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In November 2017, they published their analysis of this exceptional event in a journal article. In it, the researchers show how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a "highway." One air current originated in the Sahara and brought near-surface warm air with it. To begin with, the temperature of this air was about 20 degrees Celsius. While it cooled off on its way to the Arctic, it was still above zero when it arrived. "It's extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic," says Binder. The second air current originated in the Arctic itself, a fact that astonished the scientists. To begin with, this air was very cold. However, the air mass -- which also lay close to the ground -- moved towards the south along a curved path and, while above the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current. The third warm air current started as a cold air mass in the upper troposphere, from an altitude above 5 kilometres. These air masses were carried from west to east and descended in a stationary high-pressure area over Scandinavia. Compression thereby warmed the originally cold air before it entered the "highway to the Arctic." This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. 
This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic. This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm period sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 centimetres -- during a period in which ice usually becomes thicker and more widespread. "These weather conditions and their effect on the sea ice were really exceptional," says Binder. The researchers were not able to identify a direct link to global warming. "We only carried out an analysis of a single event; we didn't research the long-term climate aspects," emphasises Binder. However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer has been shrinking continually since the end of the 1970s. Sea ice melted particularly severely in 2007 and 2012 -- a fact which climate researchers have thus far been unable to fully explain. Along with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers. Their study has just been published. According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight -- the sun shines 24 hours a day at this time of year -- particularly intensified the melting of the sea ice. These high-pressure systems developed through an influx of air from temperate latitudes. Low-pressure systems in the North Atlantic and North Pacific areas, for example, "inject" air masses into the Arctic at a height of about eight kilometres. 
This raised the height of the tropopause, the boundary between the troposphere and the stratosphere, in the region of the "injections." As a result, surface air pressure below rose and a high-pressure system was established. While it dissipated again around ten days later, an unusually high amount of sea ice melted in the interim, and the remaining ice thinned.

The climate scientists' investigation demonstrated that in the summers of 2007 and 2012, during which these high-pressure situations occurred particularly frequently, they led to cloud-free conditions every third day. The high level of solar radiation intensified and accelerated the melting of the sea ice. "The level of solar radiation is the main factor in the melting of the ice in summer. Unlike with the winter anomaly, the 'injected' air from the south at about 8 kilometres altitude is not warm -- at minus 60 degrees it's ice-cold," says Wernli. "The air temperature therefore has very little effect on the ice." Furthermore, the northward transport of warm, humid air masses at the edge of the high-pressure systems reduces outgoing heat radiation, which further intensifies melting.

Their analysis has allowed the researchers to understand, for the first time, the meteorological processes leading to significant variations in summertime ice melt. "Our results underline the fundamental role that weather systems in temperate latitudes play in episodes of particularly intense ice melt in the Arctic," says the ETH professor.
Weather
2018
January 16, 2018
https://www.sciencedaily.com/releases/2018/01/180116144209.htm
In sweet corn, workhorses win
When deciding which sweet corn hybrids to plant, vegetable processors need to consider whether they want their contract growers using a workhorse or a racehorse. Is it better to choose a hybrid with exceptional yields under ideal growing conditions (i.e., the racehorse) or one that performs consistently well across ideal and less-than-ideal conditions (i.e., the workhorse)? New research from the University of Illinois suggests the workhorse is the winner in processing sweet corn.
"Experts say the ideal cultivar would have exceptional yield regardless of the weather, and across a large area, but it's unknown if such cultivars are commercially available," says Marty Williams, an ecologist with the Department of Crop Sciences at U of I and USDA-ARS.

Williams says a number of crops have been studied for yield stability, a cultivar's ability to produce consistent yields across inconsistent environments. The work has resulted in several recommendations about where to grow specific cultivars for the best results.

"Stability analysis is valuable, particularly given the increased weather variability we're facing. However, previous studies always stopped with recommendations. No one appears to have quantified if such recommendations are followed. Our work is about how yield stability of individual hybrids actually relates to hybrid adoption in sweet corn," he says. Although the focus is on sweet corn, the study is the first to link a cultivar's yield stability with adoption in any crop.

Williams obtained data from an anonymous vegetable processing company, representing more than a decade of sweet corn hybrid assessment trials across the upper Midwest and the Pacific Northwest. He pulled the number of cases produced per acre -- a yield metric important to processors that he calls "case production" -- from each trial, and then incorporated environmental data to calculate yield stability for 12 of the most commonly planted hybrids grown for processing.

Performance of each hybrid was related to all other hybrids across a wide range of growing conditions. This enabled Williams to assign each hybrid to categories of high, average, and low stability and high, average, and low yield. He found 10 hybrids were average for both stability and yield.
A few hybrids had above-average yield or above-average stability, but none had both, suggesting the "ideal" sweet corn hybrid does not yet exist.

Williams then analyzed another dataset representing nearly 15,000 processing sweet corn fields over a period of 20 years. He was able to calculate the acreage planted in each of the 12 hybrids from the hybrid assessment trial. Those 12 hybrids accounted for most of the acreage planted to sweet corn over the 20-year period for the processor.

Most hybrids accounted for 1 to 4 percent of the planted acreage. However, he found a single hybrid was planted on disproportionately more acres: 31.2 percent, to be exact. That hybrid was the only one exhibiting above-average stability across variable growing conditions.

In processing sweet corn, vegetable processors -- not growers -- choose the hybrid for each field. Processors need hybrids that lend themselves to machine harvest, ears that hold up to processing, and kernels that maintain quality as a finished product. Williams says vegetable processors also consider the capacity of their processing facilities.

"When sweet corn is ripe, it must be harvested. Moreover, unlike grain corn which can be stored prior to use, sweet corn must be processed and preserved immediately after harvest," Williams explains. "Midwest processors want to have their plants running at capacity throughout the approximately three-month harvest window. A plant running significantly above or below that capacity is costly. I suspect a racehorse hybrid is problematic because it's difficult to predict its performance when the weather deviates from ideal growing conditions, which is common in the Midwest."

Evidence that vegetable processors prioritize stability could inform future sweet corn breeding programs, and, according to Williams, it could provide a sense of security for growers. "Growers are more likely tasked with growing a workhorse over a racehorse.
That decision buffers them, as well as the processor, from less-than-ideal growing conditions," he says.
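The workhorse/racehorse distinction can be illustrated with a Finlay-Wilkinson-style stability regression, a standard method in yield-stability work; the study's exact procedure is not described here, and all numbers below are invented.

```python
# Finlay-Wilkinson-style stability sketch: regress each hybrid's yield on an
# environmental index (the trial-mean yield of all hybrids). Slopes below 1
# indicate a stable "workhorse"; slopes well above 1 a responsive "racehorse."
# Synthetic data -- not the study's numbers.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_hybrids = 30, 4
env_effect = rng.normal(0, 2, n_trials)          # good vs. poor environments
true_slopes = np.array([0.7, 1.0, 1.0, 1.4])     # hybrid 0 stable, hybrid 3 responsive
yields = (10 + true_slopes[None, :] * env_effect[:, None]
          + rng.normal(0, 0.5, (n_trials, n_hybrids)))

env_index = yields.mean(axis=1) - yields.mean()  # centred environmental index
slopes = [np.polyfit(env_index, yields[:, h], 1)[0] for h in range(n_hybrids)]
for h, s in enumerate(slopes):
    kind = "workhorse" if s < 0.9 else ("racehorse" if s > 1.1 else "average")
    print(f"hybrid {h}: stability slope = {s:.2f} ({kind})")
```

The fitted slope measures how strongly a hybrid's yield swings with growing conditions, which is one common way to make "consistent across environments" quantitative.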
Weather
2018
January 13, 2018
https://www.sciencedaily.com/releases/2018/01/180113093733.htm
Sanchi oil spill contamination could take three months to reach mainland
Water contaminated by the oil currently leaking into the ocean from the Sanchi tanker collision is likely to take at least three months to reach land, and if it does the Korean coast is the most likely location. However, the oil's fate is highly uncertain, as it may burn, evaporate, or mix into the surface ocean and contaminate the environment for an extended duration.
This is according to emergency ocean model simulations run by scientists at the National Oceanography Centre (NOC) and The University of Southampton to assess the potential impact of local ocean circulation on the spread of pollutants. These simulations were run using the leading-edge, high-resolution global ocean circulation model, NEMO.

The Sanchi tanker collision occurred on the border between the Yellow and East China seas, an area with complex, strong and highly variable surface currents.

Leading this research, Dr Katya Popova, from the National Oceanography Centre, said: "Oil spills can have a devastating effect on the marine environment and on coastal communities. Strong ocean currents mean that, once released into the ocean, an oil spill can relatively rapidly spread over large distances. So understanding ocean currents and the timescale on which they transport ocean pollutants is critical during any maritime accidents, especially ones involving oil leaks."

The team of scientists involved in this study 'dropped' virtual oil particles into the NEMO ocean model and tracked where they ended up over a three-month period. Simulations were run for a series of scenarios of ocean circulation typical for the area the oil spill is thought to have occurred in, and for this time of year. This allowed the scientists to produce a map of the potential extent of the oil spill, showing the risk of oil pollutants reaching a particular part of the ocean.

However, Stephen Kelly, the University of Southampton PhD student who ran the model simulations, said: "There was a high level of variation between different scenarios, depending on a number of factors.
Primarily the location of the original oil spill and the way in which atmospheric conditions were affecting ocean circulation at that time."

NOC scientist Dr Andrew Yool, who collaborated in this study, discussed how the approach used during these model simulations could help optimise future search and recovery operations at sea by rapidly modelling oil spills in real time. "By using pre-existing ocean model output we can estimate which areas could potentially be affected over weekly to monthly timescales, and quickly at low computing cost. This approach complements traditional forecast simulations, which are very accurate for a short period of time but lose their reliability on timescales that are required to understand the fate of the spill on the scale from days to weeks."

The NEMO ocean model is supported by UK National Capability funding from the Natural Environment Research Council (NERC). This model is widely used by both UK and international groups for research into ocean circulation, climate and marine ecosystems, and operationally as part of the UK Met Office's weather forecasting.
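The virtual-particle idea can be sketched in a few lines: release particles at the spill site and advect them with surface currents. The velocity field below is an idealised stand-in for NEMO output, and every number is illustrative rather than taken from the study.

```python
# Minimal Lagrangian particle-tracking sketch: forward-Euler advection of
# virtual "oil particles" through a prescribed 2-D surface-current field.
# The currents are an invented stand-in for real ocean-model output.
import numpy as np

def surface_currents(x, y, t_days):
    """Idealised currents (m/s): mean eastward drift plus a weak eddy field."""
    u = 0.10 + 0.05 * np.sin(0.5 * y + 0.1 * t_days)
    v = 0.05 * np.cos(0.5 * x)
    return u, v

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100)        # 100 particles near the release point (km)
y = rng.normal(0.0, 1.0, 100)

dt = 3600.0                          # one-hour time step (s)
for step in range(24 * 90):          # three months, as in the simulations
    u, v = surface_currents(x, y, step / 24.0)
    x += u * dt / 1000.0             # m/s -> km per hourly step
    y += v * dt / 1000.0

print(f"mean drift after 90 days: {x.mean():.0f} km east, {y.mean():.0f} km north")
```

Running many such releases under different circulation scenarios, and mapping where the particle cloud ends up, is the essence of the risk map described above.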
Weather
2018
January 12, 2018
https://www.sciencedaily.com/releases/2018/01/180112091209.htm
Jet stream changes since 1960s linked to more extreme weather
Increased fluctuations in the path of the North Atlantic jet stream since the 1960s coincide with more extreme weather events in Europe such as heat waves, droughts, wildfires and flooding, reports a University of Arizona-led team.
The research is the first reconstruction of historical changes in the North Atlantic jet stream prior to the 20th century. By studying tree rings from trees in the British Isles and the northeastern Mediterranean, the team teased out those regions' late summer weather going back almost 300 years -- to 1725.

"We find that the position of the North Atlantic Jet in summer has been a strong driver of climate extremes in Europe for the last 300 years," Trouet said.

Having a 290-year record of the position of the jet stream let Trouet and her colleagues determine that swings between northern and southern positions of the jet became more frequent in the second half of the 20th century, she said. "Since 1960 we get more years when the jet is in an extreme position," Trouet said, adding that the increase is unprecedented.

When the North Atlantic Jet is in the extreme northern position, the British Isles and western Europe have a summer heat wave while southeastern Europe has heavy rains and flooding, she said. When the jet is in the extreme southern position, the situation flips: western Europe has heavy rains and flooding while southeastern Europe has extreme high temperatures, drought and wildfires.

"Heat waves, droughts and floods affect people," Trouet said. "The heat waves and drought that are related to such jet stream extremes happen on top of already increasing temperatures and global warming -- it's a double whammy."

Extreme summer weather events in the American Midwest are also associated with extreme northward or southward movements of the jet stream, the authors write. "We studied the summer position of the North Atlantic jet.
What we're experiencing now in North America is part of the same jet stream system," Trouet said.

This winter's extreme cold and snow in the North American Northeast and extreme warmth and dryness in California and the American Southwest are related to the winter position of the North Pacific Jet, she said.

The paper, "Recent enhanced high-summer North Atlantic Jet variability emerges from three-century context," by Trouet and her co-authors Flurin Babst of the Swiss Federal Research Institute WSL in Birmensdorf and Matthew Meko of the UA, is scheduled for publication.

"I remember quite vividly when I got the idea," Trouet said. "I was sitting in my mom's house in Belgium." While visiting her family in Belgium during the very rainy summer of 2012, Trouet looked at the newspaper weather map that showed heavy rain in northwestern Europe and extreme heat and drought in the northeastern Mediterranean.

"I had seen the exact same map in my tree-ring data," she said. The tree rings showed that hot temperatures in the Mediterranean occurred the same years that it was cool in the British Isles -- and vice versa.

The part of an annual tree ring that forms in the latter part of the growing season is called latewood. The density of the latewood in a particular tree ring reflects the August temperature that year. Other investigators had measured the annual latewood density for trees from the British Isles and the northeastern Mediterranean for rings formed from 1978 back to 1725.

Because August temperatures in those two regions reflect the summer position of the North Atlantic jet stream, Trouet and her colleagues used those tree-ring readings to determine the historical position of the jet stream from 1725 to 1978.
For the position of the jet stream from 1979 to 2015, the researchers relied on data from meteorological observations.

"There's a debate about whether the increased variability of the jet stream is linked to man-made global warming and the faster warming of the Arctic compared to the tropics," Trouet said. "Part of the reason for the debate is that the data sets used to study this are quite short -- 1979 to present. If you want to see if this variability is unprecedented, you need to go farther back in time -- and that's where our study comes in."

With the discovery of much older trees in the Balkans and in the British Isles, Trouet hopes to reconstruct the path of the North Atlantic jet stream as much as 1,000 years into the past. She is also interested in reconstructing the path of the North Pacific jet stream, which influences the climate and weather over North America.
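The calibration logic behind such reconstructions (fit a linear model between the proxy signal and the jet index over the instrumental overlap period, then apply it backwards in time) can be sketched with synthetic numbers; the study's actual data and statistics are more involved.

```python
# Calibration sketch: the British Isles minus NE Mediterranean latewood-density
# contrast is assumed to track the summer jet position, so a regression fit on
# the overlap period can hindcast the jet index. Entirely synthetic data.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 1979)
jet_index = rng.normal(0, 1, years.size)       # "true" jet position (stand-in)
# jet north -> British Isles warm / Mediterranean cool, and vice versa
density_diff = 0.8 * jet_index + rng.normal(0, 0.3, years.size)

slope, intercept = np.polyfit(density_diff, jet_index, 1)
reconstructed = slope * density_diff + intercept
r = np.corrcoef(reconstructed, jet_index)[0, 1]
print(f"calibration correlation r = {r:.2f}")
```

A high calibration correlation on the overlap years is what justifies extending the regression to rings formed before instrumental records begin.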
Weather
2018
January 11, 2018
https://www.sciencedaily.com/releases/2018/01/180111101403.htm
Why the Republican Party may have an advantage when it rains: Voters change their minds
Bad weather affects U.S. voter turnout and election outcomes with past research demonstrating that the Republican Party has the advantage. A new study by researchers at Dartmouth College and The Australian National University finds that the Republican Party's advantage when it rains may be due in part to voters changing their partisan preference that day.
"Our study suggests that weather conditions may affect people's decisions on not only whether to vote but also who they vote for," says co-author Yusaku Horiuchi, professor of government at Dartmouth.

The findings revealed that at least 1 percent of voting-age adults in the U.S. who would have voted for a Democrat had the weather been good voted instead for a Republican on rainy election days.

The change in party preference may be attributed to a psychological behavior, where voters may be more averse to risk during poor weather conditions. Earlier studies have identified a correlation between ideological and political orientations in which conservatives or Republicans tend to be more averse to risk than liberals or Democrats.

The study was based on a statistical analysis that drew on compositional electoral data: the voter share for the Democratic candidate, the voter share for the Republican candidate and the abstention rate, the sum of which should be 100 percent. When this compositional nature of election outcomes was taken into account, the research team discovered a more nuanced effect of rainfall -- how voters' preferences may change with bad weather.

The Dartmouth-ANU research team points out that past studies looking at how rain affects people's decisions to go to the polls or abstain from voting have focused on how voter turnout tends to be higher among Republicans than among Democrats; however, the team argues that this only partially explains the alleged Republican advantage.
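A standard way to respect the compositional constraint the authors describe (the three shares summing to 100 percent) is a log-ratio transform before regression. Whether the study used this exact transform is an assumption; the county shares below are invented, with abstention as the reference part.

```python
# Additive log-ratio (alr) transform for compositional electoral data:
# shares are constrained to sum to 1, so they are mapped to unconstrained
# coordinates before ordinary regression. Toy numbers only.
import numpy as np

# rows: hypothetical counties; columns: [Dem, Rep, abstain], each row sums to 1
shares = np.array([
    [0.35, 0.30, 0.35],
    [0.30, 0.34, 0.36],
    [0.28, 0.30, 0.42],
])
assert np.allclose(shares.sum(axis=1), 1.0)

# alr transform: log of each share relative to the abstention share
alr = np.log(shares[:, :2] / shares[:, 2:3])
print(alr)

# the back-transform recovers the original composition exactly
back = np.hstack([np.exp(alr), np.ones((3, 1))])
back /= back.sum(axis=1, keepdims=True)
```

Modelling rainfall effects in alr space, then back-transforming, is what lets an analysis distinguish "fewer Democrats turned out" from "some voters switched party," since both movements change the composition differently.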
Weather
2018
January 10, 2018
https://www.sciencedaily.com/releases/2018/01/180110101013.htm
Human-perceived temperature is rising faster than actual air temperature
Each of the three years from 2014 to 2016 broke the global air temperature record, and 2017 will also turn out to be one of the hottest years ever. To predict how humans will be affected by climate change, geographers and climatologists led by Professor David Chen Yongqin from the Department of Geography and Resource Management at The Chinese University of Hong Kong (CUHK) and Dr Li Jianfeng from the Department of Geography at Hong Kong Baptist University (HKBU) studied the apparent temperature (AP), the temperature equivalent perceived by humans. They found that AP increased faster than air temperature (AT) over land in the past few decades, especially in the low latitude areas, and the rise is expected to continue in the future. This finding was recently published.
Scientists have developed and used Global Climate Models (GCMs) to simulate the global climate and make projections of future AT and other climatic variables under different carbon emission scenarios in the 21st century. However, GCMs do not directly project how the changes of other climatic factors, such as humidity and wind, affect human perception.

Professor Chen remarked, "Among the extensive and far-reaching impacts of global warming, human health and labour productivity are most directly affected by thermal discomfort and heat-related morbidity and mortality. Our study of the faster increases in apparent temperature has produced important findings for this kind of climate change impact assessment, providing a strong scientific support for more stringent and effective climate change mitigation efforts to combat global warming."

Dr Li said the latest research findings give a better understanding of changes in human-perceived equivalent temperature, and indicate global warming has stronger long-term impacts on human beings under both extreme and non-extreme weather conditions, suggesting that climate change adaptation cannot just focus on heat wave events, but should be extended to the whole range of effects of temperature increases. The team will continue to explore the related issues to enhance the scientific knowledge.

The research team used four reanalysis datasets of the past climate and outputs from seven GCMs to estimate the human-perceived equivalent temperature AP from AT, humidity and wind. Findings showed that the global land average AP increased 0.04°C per decade faster than AT before 2005, because of the concurrent increases in AT and humidity. This trend was projected to increase to 0.06°C per decade and 0.17°C per decade under the Representative Concentration Pathway 4.5 scenario (RCP4.5) and RCP8.5, respectively, and reduce to 0.02°C per decade under RCP2.6.
The faster increases in AP are more significant in low-latitude areas (tropical and sub-tropical regions) than in the middle- and high-latitude areas. The study also indicated that the number of days with extreme apparent temperatures will substantially increase in 2081 to 2100 compared to the period between 1981 and 2000, mainly due to the remarkable increase in the frequency of extremely hot days in summer.

Taken together, a key conclusion is that the world, as perceived by human beings, will become hotter than that indicated by air temperature alone under global warming. This conclusion clearly implies that cities and communities, especially those located in tropical and sub-tropical regions like Hong Kong, will face bigger threats from hot weather, and therefore greater efforts for climate change mitigation and adaptation are vital and urgent.
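For concreteness, one widely used apparent-temperature formulation is Steadman's AT, the variant published by the Australian Bureau of Meteorology; whether the study used exactly this formula is an assumption. It shows the mechanism described above: perceived temperature rises with humidity and falls with wind.

```python
# Steadman's apparent temperature (the non-radiation variant used by the
# Australian BOM): AT = Ta + 0.33*e - 0.70*ws - 4.00, where e is the water
# vapour pressure in hPa, Ta in degrees C, and ws the wind speed in m/s.
import math

def apparent_temperature(t_c, rh_pct, wind_ms):
    """Human-perceived equivalent temperature in degrees C."""
    # vapour pressure from relative humidity via a standard saturation formula
    e = rh_pct / 100.0 * 6.105 * math.exp(17.27 * t_c / (237.7 + t_c))
    return t_c + 0.33 * e - 0.70 * wind_ms - 4.00

# the same 30 C air feels much hotter when humid and still than dry and breezy
print(apparent_temperature(30.0, 80.0, 1.0))   # humid tropics, light wind
print(apparent_temperature(30.0, 30.0, 5.0))   # dry, breezy conditions
```

Because the humidity term adds to AT, any warming that also raises humidity pushes AP up faster than AT, which is the pattern the study reports over land.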
Weather
2018
January 8, 2018
https://www.sciencedaily.com/releases/2018/01/180108133239.htm
Water-based, eco-friendly and energy-saving air-conditioner
A team of researchers from the National University of Singapore (NUS) has pioneered a new water-based air-conditioning system that cools air to as low as 18 degrees Celsius without the use of energy-intensive compressors and environmentally harmful chemical refrigerants. This game-changing technology could potentially replace the century-old air-cooling principle that is still being used in our modern-day air-conditioners. Suitable for both indoor and outdoor use, the novel system is portable and it can also be customised for all types of weather conditions.
Led by Associate Professor Ernest Chua from the Department of Mechanical Engineering at NUS Faculty of Engineering, the team's novel air-conditioning system is cost-effective to produce, and it is also more eco-friendly and sustainable. The system consumes about 40 per cent less electricity than current compressor-based air-conditioners used in homes and commercial buildings. This translates into more than 40 per cent reduction in carbon emissions. In addition, it adopts a water-based cooling technology instead of using chemical refrigerants such as chlorofluorocarbon and hydrochlorofluorocarbon for cooling, thus making it safer and more environmentally-friendly.

To add another feather to its eco-friendliness cap, the novel system generates potable drinking water while it cools ambient air.

Assoc Prof Chua said, "For buildings located in the tropics, more than 40 per cent of the building's energy consumption is attributed to air-conditioning. We expect this rate to increase dramatically, adding an extra punch to global warming. First invented by Willis Carrier in 1902, vapour compression air-conditioning is the most widely used air-conditioning technology today. This approach is very energy-intensive and environmentally harmful. In contrast, our novel membrane and water-based cooling technology is very eco-friendly -- it can provide cool and dry air without using a compressor and chemical refrigerants. This is a new starting point for the next generation of air-conditioners, and our technology has immense potential to disrupt how air-conditioning has traditionally been provided."

Current air-conditioning systems require a large amount of energy to remove moisture and to cool the dehumidified air.
By developing two systems to perform these two processes separately, the NUS Engineering team can better control each process and hence achieve greater energy efficiency.

The novel air-conditioning system first uses an innovative membrane technology -- a paper-like material -- to remove moisture from humid outdoor air. The dehumidified air is then cooled via a dew-point cooling system that uses water as the cooling medium instead of harmful chemical refrigerants. Unlike vapour compression air-conditioners, the novel system does not release hot air to the environment. Instead, a cool air stream that is comparatively less humid than environmental humidity is discharged -- negating the effect of micro-climate. About 12 to 15 litres of potable drinking water can also be harvested after operating the air-conditioning system for a day.

"Our cooling technology can be easily tailored for all types of weather conditions, from humid climate in the tropics to arid climate in the deserts. While it can be used for indoor living and commercial spaces, it can also be easily scaled up to provide air-conditioning for clusters of buildings in an energy-efficient manner. This novel technology is also highly suitable for confined spaces such as bomb shelters or bunkers, where removing moisture from the air is critical for human comfort, as well as for sustainable operation of delicate equipment in areas such as field hospitals, armoured personnel carriers, and operation decks of navy ships as well as aircraft," explained Assoc Prof Chua.

The research team is currently refining the design of the air-conditioning system to further improve its user-friendliness. The NUS researchers are also working to incorporate smart features such as pre-programmed thermal settings based on human occupancy and real-time tracking of its energy efficiency.
The team hopes to work with industry partners to commercialise the technology.

This project is supported by the Building and Construction Authority and National Research Foundation Singapore.
Weather
2018
January 6, 2018
https://www.sciencedaily.com/releases/2018/01/180106103436.htm
In Antarctic dry valleys, early signs of climate change-induced shifts in soil
In a study spanning two decades, a team of researchers led by Colorado State University found declining numbers of soil fauna, nematodes and other animal species in the McMurdo Dry Valleys, one of the world's driest and coldest deserts. This discovery is attributed to climate change, which has triggered melting and thawing of ice in this desert since an uncharacteristically warm weather event in 2001.
There are no plants, birds or mammals in the McMurdo Dry Valleys, located in the largest region of the Antarctic continent. But microbes and microscopic soil invertebrates live in the harsh ecosystem, where the mean average temperature is below -15 degrees Celsius, or 5 degrees Fahrenheit.

The findings offer insight and an alarm bell on how ecosystems respond to climate change and to unusual climate events, scientists said.

"Until 2001, the region was not experiencing a warming trend," said Walter Andriuzzi, lead author of the study and a postdoctoral researcher in the Department of Biology and School of Global Environmental Sustainability. "On the contrary, it was getting colder," he continued. "But in 2001, the cooling trend stopped abruptly with an extremely warm weather event. Since then, the average temperatures are either stable or are increasing slightly. But most importantly, there have been more frequent intense weather events."

The research team sampled soil invertebrates and measured soil properties, including water content, in three hydrological basins and at three different elevations in the region. In Taylor Valley, the field study was launched in 1993; in Miers and Garwood valleys, scientists started their work in 2011.

Andriuzzi said what the team found in this long-term study cannot be observed by looking at average or monthly temperatures. "It's a few hours, or days of unusually warm weather," he said. "There are even peaks of high solar radiation that trigger ice thawing without high temperatures. That's how climate change is happening there, and it's already starting to impact the biological community there."

Higher temperatures mean more melting and thawing of ice from glaciers and permafrost, which has led to the decline of the most common species, the nematode Scottnema lindsayae. Other species are becoming more abundant and are spreading uphill.
As a result, at higher elevations, the microbes and animals in the soil are becoming more diverse, with unknown consequences for the ecosystem.

"This is happening worldwide, and not just in Antarctica," said Andriuzzi, who is a researcher in the lab of University Distinguished Professor Diana Wall. In the Rocky Mountains, for instance, scientists have observed insects moving uphill on a year-to-year basis, due to warming temperatures.

Andriuzzi, who led field work in the McMurdo Dry Valleys, called Antarctic nematodes "remarkable creatures." "It's amazing that they survive in these conditions," he said. The growing season only lasts a few weeks, but in the field, this microscopic animal may live 10 years.

Given what the team found, Andriuzzi said it will take time for the nematode community to recover from these disturbances. "With climate change, some species are winners, some are losers," he said. "In the Dry Valleys it's all about how they respond to warming and, most importantly, water."

Andriuzzi said shifts in communities are often very hard to predict because there are so many species. "It's easier in places like the dry valleys to isolate the effects of climate change, or to isolate how one species responds to climate change in one way," he said. "It's a natural laboratory, where some of the mechanisms that operate elsewhere can be unveiled."

This research is part of the McMurdo Dry Valleys Long-Term Ecological Research program, which was established in Antarctica by the National Science Foundation in 1992. CSU's Diana Wall is one of the founders of this field study.

Co-authors of the new research include Byron Adams from Brigham Young University, Ross Virginia from Dartmouth College, and Jeb Barrett from Virginia Tech.
Weather
2018
January 4, 2018
https://www.sciencedaily.com/releases/2018/01/180104120311.htm
Is Arctic warming influencing the UK's extreme weather?
Severe snowy weather in winter or extreme rains in summer in the UK might be influenced by warming trends in the Arctic, according to new findings.
Climate scientists from the UK and the US examined historic data of extreme weather events in the UK over the past decade and compared them with the position of the North Atlantic polar atmospheric jet stream using a measure called the North Atlantic Oscillation (NAO) index.

The NAO indicates the position of the jet stream -- which is a giant current of air that broadly flows eastwards over mid-latitude regions around the globe -- through a diagram which shows 'negative' and 'positive' spikes, similar to how a heart monitor looks.

The researchers highlight that the exceptionally wet UK summers of 2007 and 2012 had notably negative readings of the NAO, as did the cold, snowy winters of 2009/2010 and 2010/2011, while the exceptionally mild, wet, stormy winters experienced in 2013/2014 and 2015/2016 showed pronounced positive spikes.

The scientists also highlighted a correlation between the jet stream's altered path over the past decade -- so-called jet stream 'waviness' -- and an increase during summer months in a phenomenon called Greenland high-pressure blocking, which represents areas of high pressure that remain nearly stationary over the Greenland region and distort the usual progression of storms across the North Atlantic.

Increased jet waviness is associated with a weakening of the jet stream, and the accompanying 'blocking' is linked to some of the most extreme UK seasonal weather events experienced over the past decade.
The strength and path of the North Atlantic jet stream and the Greenland blocking phenomenon appear to be influenced by increasing temperatures in the Arctic, which have averaged at least twice the global warming rate over the past two decades, suggesting that those marked changes may be a key factor affecting extreme weather conditions over the UK, although an Arctic connection may not occur each year.

Edward Hanna, Professor of Climate Science and Meteorology at the University of Lincoln's School of Geography, carried out the study with Dr Richard Hall, also from the University of Lincoln, and Professor James E Overland from the US National Oceanographic & Atmospheric Administration Pacific Marine Environmental Laboratory.

Professor Hanna said: "Arctic warming may be driving recent North Atlantic atmospheric circulation changes that are linked to some of the most extreme weather events in the UK over the last decade.

"In winter, a positive North Atlantic Oscillation (NAO) is linked with a more northward, vigorous jet and mild, wet, stormy weather over the UK, while a negative NAO tends to be associated with a more southerly-positioned jet and relatively cold and dry but sometimes snowy conditions. In summer the jet stream is displaced further north, so a positive NAO is typically associated with warm dry weather, while a negative NAO often corresponds to wetter, cooler UK weather conditions.

"While part of the uneven seasonal North Atlantic Oscillation changes might be due to natural random fluctuations in atmospheric circulation, the statistically highly unusual clustering of extreme NAO values in early winter, as well as extreme high summer Greenland Blocking Index values since 2000, suggest a more sustained, systematic change in the North Atlantic atmospheric circulation that may be influenced by longer-term external factors.
This includes possible influences from the tropical oceans and solar energy changes as well as the extreme warming that has recently occurred in the Arctic. "Of course, weather is naturally chaotic, and extremes are a normal part of our highly variable UK climate, but globally there has recently been an increase in the incidence of high temperature and heavy precipitation extremes. The cold UK winter episodes we noted are not so intuitively linked to global climate change but reflect part of a long-term trend towards more variable North Atlantic atmospheric circulation from year to year during winter months, especially early winter. "This trend has culminated in the last decade having several record negative and positive December values of the North Atlantic Oscillation, with lots of resulting disruption from extreme weather over the UK. On the other hand there has been no really notably dry, hot, sunny summer in the UK since 2006; summers overall have either been around average or exceptionally wet, and this appears to be linked with strong warming and more frequent high pressure over Greenland in the last decade."
Weather
2018
January 5, 2018
https://www.sciencedaily.com/releases/2018/01/180105082330.htm
Making solar energy more efficient
With global warming an ever-present worry, renewable energy -- particularly solar power -- is a burgeoning field. Now, two doctoral students in the School of Architecture & Design (Arc/D) have demonstrated methods of optimizing the capture of sunlight in experiments at the Center for Design Research.
Mohammed Alshayeb started by asking himself what might be done to boost the performance of solar panels. "The efficiency of a photovoltaic panel is measured under standard testing conditions -- at 77 degrees Fahrenheit," he said. "Every degree that the temperature increases decreases performance." Alshayeb wondered if there was a way to "extract the heat out of the panels" when the temperature rises above 77. Because most solar panels are installed on building roofs, Alshayeb decided to compare the effects of three different types of roof materials -- highly reflective (i.e., white), conventional (black) and vegetated (green) -- on the panels' performance. The CDR roof is mostly covered with sedum, planted in trays. So Alshayeb established his test bed there, installing a solar panel monitoring system over the green roof, as well as nearby white and black portions. He also installed temperature, humidity and light sensors and a weather station to record conditions like wind speed. The sensors made recordings every five minutes for a year, and Alshayeb then analyzed the data. What he found was that, contrary to industry practice, which favors white roofs over black, white roofs actually slightly decreased the efficiency of the solar panels due to the heat they reflected up toward the panels. The high-reflective and conventional roof materials, however, were not significantly different from one another. Panels installed over the green roof performed best, generating an average of 1.4 percent more energy compared with those over the white and black roofs. "There is a lot of research in this area, but nothing as comprehensive as he has done," said Alshayeb's faculty adviser, Associate Professor of Architecture Jae D. Chang. "The next step is to see the effect of increasing the height of the panel over the roof." Another of Chang's students, Afnan Barri, wanted to see whether she could improve the performance of light shelves. 
A traditional light shelf is a fixed, horizontally mounted plane that can be placed either outside, inside or on both sides of a window in order to reflect and redirect sunlight inside a building. Light shelves can thus reduce the use of artificial lighting and electricity. Traditional, fixed light-shelf systems have limited effectiveness, as they are only capable of functioning while the angle of the sun to the earth is just right. Previous experiments have shown that movable light shelves and ones with curved surfaces can diffuse sunlight with greater efficiency than traditional fixed, flat systems. This is where Barri's idea of a Dynamic Thermal-Adaptive Curved Lightshelf (DTACL) came about. She thought: "What if there were a system that could combine all these methods to enhance the delivery of natural light into buildings throughout the day without the use of mechanical and electrical controls, and unlike existing movable systems?" Her project includes computer simulations and a field experiment to collect a year's worth of data on the performance of the DTACL system through different weather conditions on the KU campus. She created and placed on the lawn of the CDR four experimental rooms the size of refrigerators fitted with sensors and light shelves. Three of the rooms have fixed light shelves in various configurations, while one, the DTACL, uses an adaptive, composite material called Thermadapt, invented by Ronald P. Barrett and commercialized by a company he runs with his son, KU Professor of Engineering Ron Barrett-Gonzalez. Thermadapt changes shape in response to heat and sunlight, curving upward. When it cools, it flattens back out. Barri theorized that the DTACL system would transfer light inside a building more efficiently than the fixed systems, and her initial results have proven that to be the case. "I am still in the process of collecting, comparing and analyzing these data," she said. 
"However, based on a two-month pilot study and computer simulations, the indoor light intensity of the DTCAL system is twice as great as the intensity of a fixed, traditional light shelf."I'd like to take it overseas and perform an experiment like this in more extreme temperatures," said the native of Saudi Arabia.
Weather
2018
January 3, 2018
https://www.sciencedaily.com/releases/2018/01/180103123052.htm
Researchers use 'global thermometer' to track temperature extremes, droughts
Large areas of the Earth's surface are experiencing rising maximum temperatures, which affect virtually every ecosystem on the planet, including ice sheets and tropical forests that play major roles in regulating the biosphere, scientists have reported.
An analysis of records from NASA's Aqua satellite between 2003 and 2014 shows that spikes in maximum surface temperatures occurred in the tropical forests of Africa and South America and across much of Europe and Asia in 2010 and in Greenland in 2012. The higher temperature extremes coincided with disruptions that affected millions of people: severe droughts in the tropics and heat waves across much of the northern hemisphere. Maximum temperature extremes were also associated with widespread melting of the Greenland ice sheet. The satellite-based record of land surface maximum temperatures, scientists have found, provides a sensitive global thermometer that links bulk shifts in maximum temperatures with ecosystem change and human well-being. Those are among the conclusions reported in the new study. Land surface temperature measures the heat radiated by land and vegetation. While weather stations typically measure air temperatures just above the surface, satellites record the thermal energy emitted by soil, rock, pavement, grass, trees and other features of the landscape. Over forests, for example, the satellite measures the temperature of the leaves and branches of the tree canopy. "Imagine the difference between the temperature of the sand and the air at the beach on a hot, summer day," said David Mildrexler, the lead author, who received his Ph.D. from the College of Forestry at Oregon State last June. "The air might be warm, but if you walk barefoot across the sand, it's the searing hot surface temperature that's burning your feet. That's what the satellites are measuring." The researchers looked at annual maximum land surface temperatures averaged across 8-day periods throughout the year for every 1-square-kilometer (247 acres) pixel on Earth. NASA collects surface temperature measurements with an instrument known as MODIS (Moderate Resolution Imaging Spectroradiometer) on two satellites (Aqua and Terra), which orbit the Earth from north to south every day. 
Mildrexler and his team focused on the annual maximum for each year as recorded by the Aqua satellite, which crosses the equator in the early afternoon as temperatures approach their daily peak. Aqua began recording temperature data in the summer of 2002. "As anyone who pays attention to the weather knows, the Earth's temperature has incredible variability," said Mildrexler. But across the globe and over time, the planet's profile of high temperatures tends to be fairly stable from year to year. In fact, he said, the Earth has a maximum temperature profile that is unique, since it is strongly influenced by the presence of life and the overall frequency and distribution of the world's biomes. It was the discovery of a consistent year-to-year profile that allowed the researchers to move beyond a previous analysis, in which they identified the hottest spots on Earth, to the development of a new global-change indicator that uses the entire planet's maximum land surface temperatures. In their analysis, the scientists mapped major changes in 8-day maximum land surface temperatures over the course of the year and examined the ability of such changes to detect heat waves and droughts, melting ice sheets and tropical forest disturbance. In each case, they found significant temperature deviations during years in which disturbances occurred. For example, heat waves were particularly severe, droughts were extensive in tropical forests, and melting of the Greenland ice sheet accelerated in association with shifts in the 8-day maximum temperature. In 2010, for example, one-fifth of the global land area experienced extreme maximum temperature anomalies that coincided with heat waves and droughts in Canada, the United States, Northern Europe, Russia, Kazakhstan, Mongolia and China and unprecedented droughts in tropical rainforests. 
These events were accompanied by reductions in ecosystem productivity, the researchers wrote, in addition to wildfires, air pollution and agricultural losses. "The maximum surface temperature profile is a fundamental characteristic of the Earth system, and these temperatures can tell us a lot about changes to the globe," said Mildrexler. "It's clear that the bulk shifts we're seeing in these maximum temperatures are correlated with major changes to the biosphere. With global temperatures projected to continue rising, tracking shifts in maximum temperature patterns and the consequences to Earth's ecosystems every year globally is potentially an important new means of monitoring biospheric change." The researchers focused on satellite records for land surfaces in daylight. NASA also produces satellite-based temperature records for the oceans and for nighttime portions of the globe. The research was supported by the Pacific Northwest Research Station of the U.S. Forest Service and by Oregon State University.
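The statistic at the heart of the analysis, the annual maximum taken over a year of 8-day composites for each pixel, can be sketched in plain Python. The values below are made up for illustration; the real analysis runs over gridded MODIS rasters, and `None` here stands in for composites lost to cloud cover:

```python
def annual_max_lst(composites):
    """Given one pixel's 8-day land-surface-temperature composites
    for a year (up to 46 values, degrees C; None = cloud-gapped period),
    return the annual maximum -- the per-pixel statistic tracked
    in Mildrexler's analysis."""
    valid = [t for t in composites if t is not None]  # drop missing periods
    return max(valid)

# 46 eight-day periods for a hypothetical mid-latitude pixel.
year = [10, 14, None, 22, 31, 38, 41, 39, 33, 25, 18, 12] + [None] * 34
print(annual_max_lst(year))  # 41
```

Repeating this per pixel and per year, then comparing each year's map against the long-term profile, is what turns the satellite record into the "global thermometer" the article describes.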
Weather
2018
January 3, 2018
https://www.sciencedaily.com/releases/2018/01/180103111410.htm
New report calls for research to better understand, predict Gulf of Mexico's loop current system
A new report from the National Academies of Sciences, Engineering, and Medicine calls for an international, multi-institutional comprehensive campaign of research, observation, and analysis activities that would help improve understanding and prediction of the Gulf of Mexico's Loop Current System (LCS). The position, strength, and structure of the LCS -- the dominant ocean circulation feature in the Gulf -- has major implications for oil and gas operations, hurricane intensity, coastal ecosystems, oil spill response, the fishing industry, tourism, and the region's economy.
The report identifies a suite of complementary research efforts that would provide critical information about the LCS to help promote safer offshore operations, better understand the Gulf's complex oceanographic systems, facilitate disaster response, help protect coastal communities, protect and manage ecological resources, and predict and forecast weather and climate impacts. Estimated to take about 10 years and cost between $100 million and $125 million, the recommended research campaign is critical for more accurate predictions of the Loop Current's path, the report says. "Improving our predictive skills and understanding of the Loop Current System is critical to operational safety and a variety of human activities in the Gulf," said Paul G. Gaffney II, chair of the committee that wrote the report, a retired Navy vice admiral, and president emeritus of Monmouth University. "Moreover, improving ocean modeling in the Gulf will also inform prediction efforts in other ocean basins. Our report identifies gaps in knowledge and recommends comprehensive measurements and research efforts that could be undertaken to fill these gaps." The LCS flows northward through the Yucatán Channel up into the Gulf of Mexico, where it eventually turns eastward and then southward before exiting out through the Florida Straits and feeding into the Gulf Stream. The position of the current varies greatly depending on whether it is in a retracted state or a more northerly, extended state. In addition, circular currents known as eddies -- which can be 100-200 miles in diameter -- occasionally separate from the main flow of the LCS and slowly migrate into the western Gulf. Advancing understanding of the LCS could provide many benefits, the report says. For example, the lack of real-time, in situ observations in the deep ocean after the Deepwater Horizon oil spill made it difficult for first responders to track oil under the ocean's surface. 
Better information could have improved spill response and recovery operations. In addition, when the LCS is in its extended state, its strong currents pose significant operational safety concerns for oil and gas operations, causing costly slowdowns or shutdowns. Knowledge of the factors and dynamics that cause the LCS extended state could help the industry be better prepared. However, despite decades of research, important questions about LCS dynamics remain unanswered, such as the factors that influence the Loop Current extension into the Gulf and eddy shedding from the Loop Current. Most scientific observations of the LCS have been limited to ocean surface features and satellite data, and although there have been a number of field studies of the full water column from ocean surface to seafloor, they were of limited geographic scope and over short time periods. While this research has advanced knowledge of the LCS, significant gaps remain in understanding the formation, variability, and structure of the LCS and its interaction with other dynamic processes in the Gulf.
Weather
2018
January 1, 2018
https://www.sciencedaily.com/releases/2018/01/180101144749.htm
Curbing climate change
Humans may be the dominant cause of global temperature rise, but they may also be a crucial factor in helping to reduce it, according to a new study that for the first time builds a novel model to measure the effects of behavior on climate.
Drawing from both social psychology and climate science, the new model investigates how human behavioral changes evolve in response to extreme climate events and affect global temperature change. The model accounts for the dynamic feedbacks that occur naturally in the Earth's climate system -- temperature projections determine the likelihood of extreme weather events, which in turn influence human behavior. Human behavioral changes, such as installing solar panels or investing in public transportation, alter greenhouse gas emissions, which change the global temperature and thus the frequency of extreme events, leading to new behaviors, and the cycle continues. Combining climate projections and social processes, the model predicts global temperature change ranging from 3.4 to 6.2°C by 2100, compared to 4.9°C from the climate model alone. Due to the complexity of physical processes, climate models have uncertainties in global temperature prediction. The new model found that temperature uncertainty associated with the social component was of a similar magnitude to that of the physical processes, which implies that a better understanding of the human social component is important but often overlooked. The model found that long-term, less easily reversed behavioral changes, such as insulating homes or purchasing hybrid cars, had by far the most impact in mitigating greenhouse gas emissions and thus reducing climate change, versus more short-term adjustments, such as adjusting thermostats or driving fewer miles. The results were published today. "A better understanding of the human perception of risk from climate change and the behavioral responses are key to curbing future climate change," said lead author Brian Beckage, a professor of plant biology and computer science at the University of Vermont. The paper was a result of combined efforts of the joint Working Group on Human Risk Perception and Climate Change at the National Institute for Mathematical and 
Biological Synthesis (NIMBioS) at the University of Tennessee, Knoxville, and the National Socio-Environmental Synthesis Center (SESYNC) at the University of Maryland. Both institutes are supported by the National Science Foundation. The Working Group of about a dozen scientists from a variety of disciplines, including biology, psychology, geography, and mathematics, has been researching the questions surrounding human risk perception and climate change since 2013. "It is easy to lose confidence in the capacity for societies to make sufficient changes to reduce future temperatures. When we started this project, we simply wanted to address the question as to whether there was any rational basis for 'hope' -- that is, a rational basis to expect that human behavioral changes can sufficiently impact climate to significantly reduce future global temperatures," said NIMBioS Director Louis J. Gross, who co-authored the paper and co-organized the Working Group. "Climate models can easily make assumptions about reductions in future greenhouse gas emissions and project the implications, but they do this with no rational basis for human responses," Gross said. "The key result from this paper is that there is indeed some rational basis for hope." That basis for hope can be the foundation which communities can build on in adopting policies to reduce emissions, said co-author Katherine Lacasse, an assistant professor of psychology at Rhode Island College. "We may notice more hurricanes and heat waves than usual and become concerned about climate change, but we don't always know the best ways to reduce our emissions," Lacasse said. "Programs or policies that help reduce the cost and difficulty of making long-term changes or that bring in whole communities to make long-term changes together can help support people to take big steps that have a meaningful impact on the climate."
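The feedback cycle described above (warming raises extreme-event frequency, events shift behavior, behavior trims emissions, emissions drive further warming) can be caricatured as a coupled iteration. Every parameter below is an invented illustration of the loop's structure, not the published Beckage et al. model:

```python
def feedback_sim(years=80, sensitivity=0.5, behavior_gain=0.02):
    """Toy caricature of a behavior-climate feedback loop.
    temp_anomaly: deg C above baseline; behavior: 0..1 mitigation level.
    All coefficients are illustrative stand-ins, not the study's values."""
    temp_anomaly, behavior = 1.0, 0.0
    for _ in range(years):
        event_freq = 0.1 + 0.2 * temp_anomaly          # warmer -> more extremes
        behavior = min(1.0, behavior + behavior_gain * event_freq)
        emissions = 1.0 - 0.6 * behavior               # lasting changes cut emissions
        temp_anomaly += sensitivity * emissions / 10.0  # emissions drive warming
    return temp_anomaly, behavior

t_end, b_end = feedback_sim()
print(round(t_end, 2), round(b_end, 2))
```

Running the loop with `behavior_gain=0.0` (no behavioral response) yields more warming than the coupled run, which is the qualitative point the Working Group formalizes: the social component materially changes the temperature trajectory.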
Weather
2018
December 22, 2017
https://www.sciencedaily.com/releases/2017/12/171222090302.htm
Humidity may prove breaking point for some areas as temperatures rise, says study
Climate scientists say that killer heat waves will become increasingly prevalent in many regions as climate warms. However, most projections leave out a major factor that could worsen things: humidity, which can greatly magnify the effects of heat alone. Now, a new global study projects that in coming decades the effects of high humidity in many areas will dramatically increase. At times, they may surpass humans' ability to work or, in some cases, even survive. Health and economies would suffer, especially in regions where people work outside and have little access to air conditioning. Potentially affected regions include large swaths of the already muggy southeastern United States, the Amazon, western and central Africa, southern areas of the Mideast and Arabian peninsula, northern India and eastern China.
"The conditions we're talking about basically never occur now -- people in most places have never experienced them," said lead author Ethan Coffel, a graduate student at Columbia University's Lamont-Doherty Earth Observatory. "But they're projected to occur close to the end of the century." The study will appears this week in the journal Warming climate is projected to make many now-dry areas dryer, in part by changing precipitation patterns. But by the same token, as global temperatures rise, the atmosphere can hold more water vapor. That means chronically humid areas located along coasts or otherwise hooked into humid-weather patterns may only get more so. And, as many people know, muggy heat is more oppressive than the "dry" kind. That is because humans and other mammals cool their bodies by sweating; sweat evaporates off the skin into the air, taking the excess heat with it. It works nicely in the desert. But when the air is already crowded with moisture -- think muggiest days of summer in the city -- evaporation off the skin slows down, and eventually becomes impossible. When this cooling process halts, one's core body temperature rises beyond the narrow tolerable range. Absent air conditioning, organs strain and then start to fail. The results are lethargy, sickness and, in the worst conditions, death.Using global climate models, the researchers in the new study mapped current and projected future "wet bulb" temperatures, which reflect the combined effects of heat and humidity. (The measurement is made by draping a water-saturated cloth over the bulb of a conventional thermometer; it does not correspond directly to air temperature alone.) The study found that by the 2070s, high wet-bulb readings that now occur maybe only once a year could prevail 100 to 250 days of the year in some parts of the tropics. 
In the southeast United States, wet-bulb temperatures now sometimes reach an already oppressive 29 or 30 degrees Celsius; by the 2070s or 2080s, such weather could occur 25 to 40 days each year, say the researchers. Lab experiments have shown wet-bulb readings of 32 degrees Celsius are the threshold beyond which many people would have trouble carrying out normal activities outside. This level is rarely reached anywhere today. But the study projects that by the 2070s or 2080s the mark could be reached one or two days a year in the U.S. southeast, and three to five days in parts of South America, Africa, India and China. Worldwide, hundreds of millions of people would suffer. The hardest-hit area in terms of human impact, the researchers say, will probably be densely populated northeastern India. "Lots of people would crumble well before you reach wet-bulb temperatures of 32 C, or anything close," said coauthor Radley Horton, a climate scientist at Lamont-Doherty. "They'd run into terrible problems." Horton said the results could be "transformative" for all areas of human endeavor -- "economy, agriculture, military, recreation." The study projects that some parts of the southern Mideast and northern India may even sometimes hit 35 wet-bulb degrees Celsius by late century -- equal to the human skin temperature, and the theoretical limit at which people will die within hours without artificial cooling. Using a related combined heat/humidity measure, the so-called heat index, this would be the equivalent of nearly 170 degrees Fahrenheit of "dry" heat. But the heat index, invented in the 1970s to measure the "real feel" of moist summer weather, actually ends at 136; anything above that is literally off the chart. On the bright side, the paper says that if nations can substantially cut greenhouse-gas emissions in the next few decades, the worst effects could be avoided. Only a few weather events like those projected have ever been recorded. 
Most recent was in Iran's Bandar Mahshahr, on July 31, 2015. The city of more than 100,000 sits along the Persian Gulf, where seawater can warm into the 90s Fahrenheit, and offshore winds blow moisture onto land. On that day, the "dry" air temperature alone was 115 degrees Fahrenheit; saturated with moisture, the air's wet-bulb reading neared the 35 C fatal limit, translating to a heat index of 165 Fahrenheit. Bandar Mahshahr's infrastructure is good and electricity cheap, so residents reported adapting by staying in air-conditioned buildings and vehicles, and showering after brief ventures outside. But this may not be an option in other vulnerable places, where many people don't have middle-class luxuries. "It's not just about the heat, or the number of people. It's about how many people are poor, how many are old, who has to go outside to work, who has air conditioning," said study coauthor Alex deSherbinin of Columbia's Center for International Earth Science Information Network. DeSherbinin said that even if the weather does not kill people outright or stop all activity, the necessity of working on farms or in other outdoor pursuits in such conditions can bring chronic kidney problems and other damaging health effects. "Obviously, the tropics will suffer the greatest," he said. Questions of how human infrastructure or natural ecosystems might be affected are almost completely unexplored, he said. Only a handful of previous studies have looked at the humidity issue in relation to climate change. It was in 2010 that a paper in the Proceedings of the National Academy of Sciences proposed the 35-degree survivability limit. In 2015, researchers published a paper in the journal Nature Climate Change that mapped areas in the southern Mideast and Persian Gulf regions as vulnerable to extreme conditions. There was another this year in the journal Science Advances, zeroing in on the densely populated, low-lying Ganges and Indus river basins. 
The new study builds on this earlier research, extending the projections globally using a variety of climate models and taking into account future population growth. Elfatih Eltahir, a professor of hydrology and climate at the Massachusetts Institute of Technology who has studied the issue in the Mideast and Asia, said the new study "is an important paper which emphasizes the need to consider both temperature and humidity in defining heat stress." Climate scientist Steven Sherwood of the University of New South Wales, who proposed the 35-degree survivability limit, said he was skeptical that this threshold could be reached as soon as the researchers say. Regardless, he said, "the basic point stands." Unless greenhouse emissions are cut, "we move toward a world where heat stress is a vastly greater problem than it has been in the rest of human history. The effects will fall hardest on hot and humid regions."
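The combined heat-humidity measure discussed above can be approximated from ordinary weather-station readings. A common choice is Stull's (2011) empirical wet-bulb formula, which is accurate to roughly 0.3 °C for typical conditions at sea-level pressure; this is a sketch of that published approximation, not the study's own method:

```python
import math

def wet_bulb_stull(temp_c, rh_pct):
    """Stull (2011) empirical wet-bulb temperature (deg C) from air
    temperature (deg C) and relative humidity (%), valid roughly for
    RH above about 5% at sea-level pressure."""
    T, rh = temp_c, rh_pct
    return (T * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(T + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# Muggy 35 C air at 75% relative humidity sits near a 31 C wet-bulb
# reading, approaching the 32 C outdoor-activity threshold in the study.
print(round(wet_bulb_stull(35.0, 75.0), 1))
```

The example makes the article's point concrete: a dry-bulb temperature well below record heat can still push the wet-bulb reading toward the dangerous range once humidity is high.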
Weather
2017
December 21, 2017
https://www.sciencedaily.com/releases/2017/12/171221101332.htm
Climate change: Soil animals cannot explain self-reinforcing effect
When the soil warms up, it releases more carbon dioxide (CO2).
The fact that the world's climate is changing is mainly due to the burning of fossil fuel. As a consequence, large quantities of carbon dioxide (CO2) are released into the atmosphere. Scientists had previously assumed that this effect was mainly due to the presence of small animals and microorganisms in the soil, which feed on dead organic matter (for example, fallen leaves), because when they 'burn' their food, CO2 is released. Dr Madhav P. Thakur, first author of the study, explains why these results are of great relevance: "The feedback effect of climate warming via the greater release of CO2 ..." The study was conducted as part of a long-term climate change experiment in Minnesota, USA. In the 'B4WarmED' (Boreal Forest Warming at an Ecotone in Danger) experiment, scientists are heating various plots of boreal forest land artificially by 3.4°C. In addition, they also reduce rainfall by 40% in some places by setting up tents in rainy weather. The scientists measured how much the soil animals ate using 'bait lamina strips': small sticks with holes, which the researchers filled with substrate that resembled the organic matter in the soil. These sticks were stuck deep into the ground. Every two weeks the scientists checked how much of the substrate was eaten. The researchers carried out more than 40 such measurements over a period of four years. It is the first study of this scale to investigate the effects of global warming and drought on decomposer soil animals. In addition, the researchers checked the respiration of soil microorganisms by excluding plant roots with a metal ring in small soil areas and then measuring how much CO2 was released.
Weather
2017
December 20, 2017
https://www.sciencedaily.com/releases/2017/12/171220122040.htm
California cliffs at risk of collapse identified
Danger -- Unstable Cliffs -- Stay Back: The yellow warning signs that pepper coastal cliffs from northern California to the US-Mexico border may seem overly dramatic to the casual observer. But actively eroding cliffs make up the majority of the California coastline, and sudden landslides and collapses have caused injuries and several fatalities in recent years. In addition, eroding cliffs currently threaten highways, houses, businesses, military bases, parks, power plants, and other critical facilities -- all in all, billions of dollars of development.
Research suggests that erosion rates will increase as sea level rises, further exacerbating these problems. "It is critical we study current and historical cliff retreat so we can better plan for the future," says Adam Young, a researcher at Scripps Institution of Oceanography at the University of California San Diego who recently published a unique large-scale analysis of coastal cliff erosion in California. Existing cliff erosion studies are often small scale, use a variety of techniques, and often rely on lower-quality data sources, providing a patchwork across the state. "What's unique about this study is that it applies a consistent methodology across a very large area using accurate high-resolution laser data," says Young. While some of the basic causes of coastal cliff erosion -- such as rainfall and waves -- are clear, this has not translated into a simple way to predict future erosion rates or identify areas at risk. Variation in cliff geology, beach protection, exposure to weather, and other factors also complicates predicting erosion rates. Previous research has identified clear correlations between rainfall and coastal erosion in southern California, but the impact of storm waves has been more elusive. "It's difficult to measure," explains Young. "We lack field observations because with powerful waves crashing against the cliff, it is not an easy place to make measurements." To create a consistent analysis of recent cliff changes, Young compared two massive LiDAR data sets, three-dimensional maps of the California coastline, recorded eleven years apart. The highest cliff erosion rates were found in San Onofre, Portuguese Bend, Palos Verdes, Big Sur, Martins Beach, Daly City, Double Point, and Point Reyes (see map). 
Young then compared the recent cliff erosion maps to historical records from 1932 and 1934. By comparing the different maps, he built an analysis for the majority of the state's coastline, showing both recent and historical erosion rates. The study shows that the historical cliff erosion rate does not always provide a good prediction of future rates. "The results show that if a cliff experienced a large amount of erosion during one time period, it was followed by a time period with very little erosion, and the cliff could be relatively stabilized for a time," explains Young. "It will mobilize again, but we don't know when, and more research is needed to better understand the time cycles involved." Young also found that cliffs with high erosion rates in recent times were often preceded by time periods with very little erosion. These are key findings, because models predicting future cliff retreat are often based on projecting the historical rates. Young also introduced a new experimental measure to identify the riskiest precipices. Previous research had suggested that the difference between erosion rates of the cliff face compared to the cliff top could indicate instability -- in short, the cliff steepness. When he applied this hazard index, Young identified worrisome spots along the California coast, including San Onofre State Beach, Big Sur, Martin's Beach, and Daly City. Young is currently working on a set of maps to be made available to the public, and he has presented the work at scientific conferences. 
He says, "I hope that this study will help improve models that predict erosion, help identify hazardous areas, and assist policymakers who are working to protect our coast."The research has already caught the attention of planners at the California Coastal Commission, a state agency charged with preserving and protecting the coastline for current and future generations."The study could be particularly useful for local governments looking to update their local coastal programs in light of climate change and sea level rise," says Lesley Ewing, a senior coastal engineer for the commission. While the study does not provide projections for future erosion rates, researchers expect that sea-level rise will contribute to faster erosion rates and greater risk to public and private coastal property, and governments are working to plan for the impacts."The coast of California is stacked with very expensive real estate -- not to mention power plants, wastewater treatment facilities, and highways," she says. Some of this is already at risk -- over 100 miles of shoreline armoring has been built to protect it, and more will be at risk in the future."There's so much opportunity to use this research -- this could serve as a reality check for planners who often focus on specific regions and smaller scales," adds Ewing.
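The retreat-rate and hazard-index ideas described above can be sketched in a few lines. This is a minimal illustration, not the study's actual methodology: the function names, the simple face-minus-top hazard index, and all example numbers are assumptions for illustration.

```python
def retreat_rate(pos_early_m, pos_late_m, years):
    """Mean annual cliff-edge retreat (m/yr) between two surveys.

    Positions are distances seaward from a fixed inland baseline, so a
    smaller later position means the edge moved landward (retreated).
    """
    return (pos_early_m - pos_late_m) / years  # positive = landward retreat

def hazard_index(face_rate_m_yr, top_rate_m_yr):
    """Crude instability proxy: a cliff face eroding faster than the cliff
    top steepens the profile, which prior work links to elevated risk."""
    return face_rate_m_yr - top_rate_m_yr

# Example: the cliff edge moved 5.5 m landward over an 11-year interval,
# roughly matching the eleven-year gap between the two LiDAR surveys.
rate = retreat_rate(100.0, 94.5, 11)
print(round(rate, 2))                 # 0.5 (m/yr)
print(hazard_index(0.5, 0.1) > 0)     # True: face outpacing top
```

A positive hazard index flags a steepening cliff; a full analysis would compute this per coastline segment from the gridded LiDAR differences.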
Weather
2017
December 18, 2017
https://www.sciencedaily.com/releases/2017/12/171218131225.htm
Warmer, wetter climate could mean stronger, more intense storms
How would today's weather patterns look in a warmer, wetter atmosphere -- an expected shift portended by climate change?
Colorado State University researcher Kristen Rasmussen offers new insight into this question -- specifically, how thunderstorms would be different in a warmer world.The assistant professor of atmospheric science works at the interface of weather and climate. She is lead author on a new paper in For the study, Rasmussen employed a powerful new dataset developed by the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, where Rasmussen completed postdoctoral work before joining the CSU faculty in 2016.The scientists generated the enormous dataset by running NCAR's Weather Research and Forecasting model at an extremely high resolution of about 4 kilometers (about 2.5 miles), across the entire contiguous U.S. Typical climate models only resolve to about 100 kilometers (about 62 miles) -- not nearly the detail available in the new dataset. Included in the new data are finer-scale cloud processes than have been available in previous climate models.Using the dataset and collaborating with NCAR researchers, Rasmussen led analysis of detailed climate simulations. The first control simulation included weather patterns from 2000-2013. The second simulation overlaid that same weather data with a "pseudo global warming" technique using an accepted scenario that assumes a 2- to 3-degree increase in average temperature, and a doubling of atmospheric carbon dioxide."When we compared the current convective population to the future, we found that weak to moderate storms decrease in frequency, whereas the most intense storms increase in frequency," Rasmussen said. "This is an indication of a shift in the convective population, and it gives us a picture of how changes in climate may affect the occurrence of thunderstorms."To explain this finding, the study also showed that while the amount of energy available for convection increases in a warmer and moister climate, the energy inhibiting convection also increases. 
The relationships of these shifts provide a thermodynamic explanation for increasing or decreasing numbers of storms.Current climate models do not properly account for cloud processes and have made assumptions about their behavior. In fact, cloud and mesoscale, or medium-scale, processes in the atmosphere are among the biggest uncertainties in today's climate models, Rasmussen said."Now that global climate models are being run at higher resolution, they need more information about the physical processes of clouds, in order to better understand all the ramifications of climate change," she said. "This was one of the motivations behind the study."In Rasmussen's study, cloud behavior was more realistically defined using data resolved in 4-kilometer blocks. That meant she could resolve topographical features like the Rocky Mountains and allow the thunderstorms to develop naturally in their environment. Her study accounted for propagation of organized storms, and also included correct daily precipitation cycles across the U.S., neither of which are accurately represented in current climate models.NCAR plans more climate simulations that include even finer-scale detail of weather processes. Rasmussen hopes to conduct follow-up studies that account for shifts in the storm track, which was not reflected in her most recent study.
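The comparison of "convective populations" between the control run and the pseudo-global-warming run amounts to binning simulated storms by intensity in each run and comparing the counts. The sketch below is a toy version of that idea; the intensity thresholds, units, and sample values are invented, not taken from the study.

```python
from collections import Counter

def classify(intensity_mm_per_hr):
    """Bin a storm's rain rate into an invented three-class scheme."""
    if intensity_mm_per_hr < 10:
        return "weak"
    elif intensity_mm_per_hr < 30:
        return "moderate"
    return "intense"

def convective_population(storm_intensities):
    """Count storms in each intensity class."""
    return Counter(classify(x) for x in storm_intensities)

# Invented storm intensities from a control run and a pseudo-warmed run
control = [5, 8, 12, 15, 22, 28, 31, 6, 18, 9]
warmed  = [7, 14, 25, 33, 38, 41, 29, 35, 11, 44]

c, w = convective_population(control), convective_population(warmed)
# The study's qualitative finding: fewer weak storms, more intense ones.
print(w["intense"] > c["intense"], w["weak"] < c["weak"])   # True True
```

In the real analysis the "storms" come from convection-permitting 4-km model output rather than a hand-written list, but the population shift is read off the same way.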
Weather
2017
December 14, 2017
https://www.sciencedaily.com/releases/2017/12/171214144522.htm
Effects of climate change could accelerate by mid-century
Nature lovers beware, environmental models used by researchers at the University of New Hampshire are showing that the effects of climate change could be much stronger by the middle of the 21st century, and a number of ecosystem and weather conditions could consistently decline even more in the future. If carbon dioxide emissions continue at the current rate, they report that scenarios of future conditions could not only lead to a significant decrease in snow days, but also an increase in the number of summer days over 90 degrees and a drastic decline in stream habitat with 40 percent not suitable for cold water fish.
"While this research was applied to New Hampshire, the approach can be generally applied, and a number of things that people care about will worsen due to climate change," said Wilfred Wollheim, associate professor in the department of natural resources and the environment and one of the study's authors. "For example, right now the average number of snow days is 60 per year, but in 20 to 30 years the models show that the number of snow days could be as low as 18 days per year."The research, published recently in the journal "Land use and population growth interacting with climate change are also important drivers," said Wollheim. "These models can help guide efforts to make plans to adapt to the changing climate. Alterations in land use policy could reduce these impacts. In particular, prevention of sprawl and investment in storm and waste water infrastructure would further maintain more ecosystem services. Implementing policies that reduce greenhouse gas emissions are essential to limit even further changes."The researchers say this study is the first time a model like this has been applied to New England watersheds that consistently account for climate change, land use change, forest ecosystem processes and aquatic ecosystem processes, including variability in weather that occurs within years (seasonal and storm) and across years, to assess a whole suite of changes at the same time.
Weather
2017
December 14, 2017
https://www.sciencedaily.com/releases/2017/12/171214101654.htm
That Feeling in Your Bones
Rainy weather has long been blamed for achy joints. Unjustly so, according to new research from Harvard Medical School. The analysis, published Dec. 13 in BMJ, found no relationship between rainfall and joint or back pain.
The notion that certain symptoms and weather go hand in hand has persisted since antiquity. Hippocrates, writing in On Airs, Waters, and Places, exhorted those who wish to understand medicine to look at the changing seasons of the year and study the prevailing winds to see how the weather they bring affects health. The belief has endured over the centuries and well into the present, likely fueled by a combination of folklore and small studies that have repeatedly yielded mixed results.The newly published analysis, led by Anupam Jena of Harvard Medical School's Department of Health Care Policy, used a "big data" approach, linking insurance claims from millions of doctor's visits with daily rainfall totals from thousands of National Oceanic and Atmospheric Administration weather stations."No matter how we looked at the data, we didn't see any correlation between rainfall and physician visits for joint pain or back pain," said Jena, who is the Ruth L. Newhouse Associate Professor of Health Care Policy at Harvard Medical School and an internist at Massachusetts General Hospital. "The bottom line is: Painful joints and sore backs may very well be unreliable forecasters."The study examined Medicare records of more than 11 million primary care office visits by older Americans between 2008 and 2012. The research team asked a variety of questions: Did more patients seek care for back pain or joint pain when it rained or following periods of rainy weather? Were patients who went to the doctor for other reasons more likely to also report aching knees or backs around rainy days? What if there were several rainy days in a row? Even in the absence of a "rain effect" in the overall group, did patients with a prior diagnosis of rheumatoid arthritis report more pain? The answers to all of these questions showed no meaningful link between joint pain and rainy weather. 
Overall, 6.35 percent of the office visits included reports of pain on rainy days, compared with 6.39 percent on dry days.So, are patients who believe there's a connection all wet?"It's hard to prove a negative," Jena said, "but in this flood of data, if there was a clinically significant increase in pain, we would have expected to find at least some small, but significant, sign of the effect. We didn't."The human brain is good at finding patterns, Jena noted, and these beliefs are often self-fulfilling. If you expect your knee to hurt when it rains and it doesn't, you forget about it, he said, but if it hurts and you blame it on the rain, it tends to stick in your mind."As physicians, we should be sensitive to the things our patients are telling us. Pain is pain, with or without rain," Jena said. "But it's important to know that, at the clinical level, joint pain does not appear to ebb and flow with the weather."Andrew Olenski, graduate student in the Department of Economics at Columbia University, David Molitor, assistant professor of finance at the University of Illinois at Urbana-Champaign and Nolan Miller, professor of finance and Julian Simon Faculty Fellow at the University of Illinois at Urbana-Champaign, were co-authors of the study. This study was supported by grants from the Office of the Director, National Institutes of Health (Jena, NIH Early Independence Award, Grant 1DP5OD017897) and the National Institute on Aging (Miller and Molitor, Grant R01AG053350).
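A difference like the reported 6.35 percent (rainy days) versus 6.39 percent (dry days) can be checked with a standard two-proportion z-test. The sketch below uses only the Python standard library; the visit counts are hypothetical, chosen merely to reproduce the reported rates, and this is not the paper's actual statistical model.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2 (pooled two-proportion test)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Hypothetical counts consistent with the reported 6.35% vs 6.39% rates
rainy_pain, rainy_total = 6350, 100_000
dry_pain, dry_total = 6390, 100_000

z = two_proportion_z(rainy_pain, rainy_total, dry_pain, dry_total)
print(abs(z) < 1.96)   # True: difference not significant at the 5% level
```

Even at these sample sizes the tiny 0.04-point gap falls far inside ordinary sampling noise, which is the paper's point.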
Weather
2017
December 13, 2017
https://www.sciencedaily.com/releases/2017/12/171213143608.htm
High-resolution climate models present alarming new projections for US
Approaching the second half of the century, the United States is likely to experience increases in the number of days with extreme heat, the frequency and duration of heat waves, and the length of the growing season. In response, it is anticipated that societal, agricultural and ecological needs will increase the demand on already-strained natural resources like water and energy. University of Illinois researchers have developed new, high-resolution climate models that may help policymakers mitigate these effects at a local level.
In a paper published in the journal Many climate models use a spatial resolution of hundreds of kilometers. This approach is suitable for global-scale models that run for centuries into the future, but they fail to capture small-scale land and weather features that influence local atmospheric events, the researchers said."Our new models work at a spatial resolution of 12 km, allowing us to examine localized changes in the climate system across the continental U.S.," Wuebbles said. "It is the difference between being able to resolve something as small as Champaign County versus the entire state of Illinois -- it's a big improvement."The study looked at two different future greenhouse gas output projections -- one "business as usual" scenario where fossil fuel consumption remains on its current trajectory and one that implies a significant reduction in consumption by the end of the century. The group generated data for two decade-long projections (2045-54 and 2085-94) and compared them with historical data (1995-2004) for context."One of the most alarming findings in our business-as-usual projection shows that by late-century the southeastern U.S. will experience maximum summer temperatures every other day that used to occur only once every 20 days," Zobel said.Although not as severe, other regions of the country are also expected to experience significant changes in temperature."The Midwest could see large unusual heat events, like the 1995 Chicago heat wave, which killed more than 800 people, become more common and perhaps even occur as many as five times per year by the end of the century," Wuebbles said. "Heat waves increase the mortality rate within the Midwest and the Northeast because people in these densely populated regions are not accustomed to coping with that kind of heat that frequently."The extreme temperatures and extended duration of the warmer season will likely take a significant toll on crops and the ecosystem, the researchers said. 
Areas like the American West, which is already grappling with limited water resources, could witness much shorter frost seasons at high elevations, leading to a smaller surge in spring meltwater than what is needed for the early growing season."The high resolution of our models can capture regional climate variables caused by local landforms like mountains, valleys and bodies of water," Zobel said. "That will allow policymakers to tailor response actions in a very localized way."The new models concentrate on temperature and do not factor in the effect that regional precipitation patterns will have on the impact of the anticipated climate changes. The researchers plan to extend their study to account for these additional variables."The concept of global climate change can be somewhat abstract, and people want to know how these projected changes are going to affect them, in their community," Wuebbles said. "Our models are helping answer those questions, and that is what separates our work from the larger, global-scale studies."
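Metrics like "days with extreme heat" and "frequency and duration of heat waves" can be extracted from a daily maximum-temperature series as below. The 90 °F threshold and the three-consecutive-day heat-wave definition are illustrative assumptions; the study's exact definitions may differ.

```python
def days_over(temps_f, threshold=90.0):
    """Count days whose maximum temperature exceeds the threshold."""
    return sum(t > threshold for t in temps_f)

def heat_waves(temps_f, threshold=90.0, min_len=3):
    """Lengths of runs of at least min_len consecutive days above threshold."""
    runs, run = [], 0
    for t in temps_f:
        if t > threshold:
            run += 1
        else:
            if run >= min_len:
                runs.append(run)
            run = 0
    if run >= min_len:          # close out a run that reaches the series end
        runs.append(run)
    return runs

week = [88, 92, 95, 97, 91, 89, 93]   # invented daily maxima, °F
print(days_over(week))    # 5
print(heat_waves(week))   # [4]
```

Run over a decade of 12-km gridded model output per grid cell, counts like these are what let the projections be stated at county scale.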
Weather
2017
December 11, 2017
https://www.sciencedaily.com/releases/2017/12/171211120436.htm
A global north-to-south shift in wind power by end of century
CIRES and RASEI researchers suggest that wind resources in the next century may decrease in many regions in the Northern Hemisphere -- and could sharply increase in several hotspot regions down south. The first-of-its-kind study predicting how global wind power may shift with climate change appears today in
"There's been a lot of research looking at the potential climate impact of energy production transformations -- like shifting away from fossil fuels toward renewables," said lead author Kris Karnauskas, CIRES Fellow and Assistant Professor in Atmospheric and Oceanic Sciences (ATOC) at CU Boulder. "But not as much focuses on the impact of climate change on energy production by weather-dependent renewables, like wind energy."Wind powers only about 3.7 percent of worldwide energy consumption today, but global wind power capacity is increasing rapidly -- about 20 percent a year. Karnauskas and colleagues Julie Lundquist and Lei Zhang, also in ATOC, wanted to better understand likely shifts in production, so they turned to an international set of climate model outputs to assess changes in wind energy resources across the globe. The team then used a "power curve" from the wind energy industry to convert predictions of global winds, density and temperature into an estimate of wind energy production potential.While not all of the climate models agreed on what the future will bring, substantial changes may be in store, especially a prominent asymmetry in wind power potential across the globe. If carbon dioxide emissions continue at high levels, wind power resources may decrease in the Northern Hemisphere's mid-latitudes, and increase in the Southern Hemisphere and tropics by 2100.Strangely, the team also found that if emission levels are mitigated, dropping lower in coming decades, they see only a reduction of wind power in the north -- it may not be countered with an increase of power in the south.Renewable energy decision makers typically plan and install wind farms in areas with consistently strong winds today. For example, the prairies of the American Midwest -- persistently windy today and in recent decades -- are dotted with tens of thousands of turbines. 
While the new assessment finds wind power production in these regions over the next twenty years will be similar to that of today, it could drop significantly by the end of the century.By contrast, potential wind energy production in northeastern Australia could see dramatic increases.There were different reasons for the Northern decline and the Southern increase in wind power potential in the high-emissions scenario, Karnauskas and his co-authors found in their modeling results. In the Northern Hemisphere, warmer temperatures at the North Pole weaken the temperature difference between this cold region and the warm equator. A smaller temperature gradient means slower winds in the northern mid-latitudes."These decreases in North America occur primarily during the winter season, when those temperature gradients should be strong and drive strong winds," said Associate professor Lundquist, who is also a RASEI Fellow. In addition to North America, the team identified possible wind power reductions in Japan, Mongolia and the Mediterranean.This may be bad news for the Japanese, who are rapidly accelerating their wind power development.In the Southern Hemisphere, where there is more ocean than land, a different kind of gradient increases: land warms faster than the surrounding, much-larger oceans. That intensified gradient increases the winds. Hotspots for likely wind power increases include: Brazil, West Africa, South Africa and Australia."Europe is a big question mark," added Karnauskas. "We have no idea what we'll see there. That's almost scary, given that Europe is producing a lot of wind energy already." The trend in this region (and in others, like the southeastern United States) is just too uncertain: some models forecast wind power increase, and others, a decrease.In a warming world, harnessing more wind power in coming decades could be critical for countries trying to meet emission reduction standards set by the Paris Climate Agreement. 
The team's results may help inform decision-makers across the globe determining where to invest this technology."The climate models are too uncertain about what will happen in highly productive wind energy regions, like Europe, the Central United States, and Inner Mongolia," said Lundquist. "We need to use different tools to try to forecast the future -- this global study gives us a roadmap for where we should focus next with higher-resolution tools."
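The "power curve" mentioned above maps wind speed to turbine output: zero below a cut-in speed, a roughly cubic ramp up to rated power, constant output until a cut-out speed, then zero. The function and parameter values below are a generic idealization, not the industry curve the study actually used.

```python
def power_output(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_kw=2000.0):
    """Idealized turbine power (kW) at hub-height wind speed v (m/s)."""
    if v < cut_in or v >= cut_out:
        return 0.0                  # too slow to turn, or shut down for safety
    if v >= rated_v:
        return rated_kw             # pitch control holds output at rated power
    # Cubic ramp between cut-in and rated speed (kinetic energy flux ~ v^3)
    frac = (v**3 - cut_in**3) / (rated_v**3 - cut_in**3)
    return rated_kw * frac

print(power_output(2.0))    # 0.0  (below cut-in)
print(power_output(12.0))   # 2000.0 (at rated speed)
print(power_output(26.0))   # 0.0  (beyond cut-out)
```

Because output scales with the cube of wind speed below rated power, even modest climate-driven shifts in mean wind speed translate into large changes in energy yield, which is why the projected gradient changes matter so much.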
Weather
2017
December 8, 2017
https://www.sciencedaily.com/releases/2017/12/171208113545.htm
Marine mammal beachings not likely due to space weather
The age-old mystery of why otherwise healthy dolphins, whales and porpoises get stranded along coasts worldwide deepens: After a collaboration between NASA scientists and marine biologists, new research suggests space weather is not the primary cause of animal beachings -- but the research continues. The collaboration is now seeking others to join their search for the factors that send ocean mammals off course, in the hopes of perhaps one day predicting strandings before they happen.
Scientists have long sought the answer to why these animals beach, and one recent collaboration hoped to find a clear-cut solution: Researchers from a cross-section of fields pooled massive data sets to see if disturbances to the magnetic field around Earth could be what confuses these sea creatures, known as cetaceans. Cetaceans are thought to use Earth's magnetic field to navigate. Since intense solar storms can disturb the magnetic field, the scientists wanted to determine whether they could, by extension, actually interfere with animals' internal compasses and lead them astray.During their first investigation, the scientists -- from NASA's Goddard Space Flight Center in Greenbelt, Maryland; the International Fund for Animal Welfare, or IFAW; and the Bureau of Ocean Energy Management, or BOEM -- were not able to hammer down a causal connection."We've learned so far there is no smoking gun indicating space weather is the primary driver," said Goddard space weather scientist Antti Pulkkinen. "But there is a sense that geomagnetic conditions may be part of a cocktail of contributing factors."Now, the team is opening their study up much wider: They're asking other scientists to participate in their work and contribute data to the search for the complex set of causes for such strandings.Mass strandings occur around the world and can affect anywhere from three to several hundred animals during any given event. Although they are a global phenomenon, scientists have identified certain hot spots: New Zealand, Australia, and Cape Cod, Massachusetts, all of which share key geographic characteristics like sloping beaches and fine-grained sediment -- factors thought to play a role in strandings.In strandings involving multiple deaths, autopsies reveal that the vast majority of the deceased animals were healthy before they beached. 
Some researchers hypothesize groups strand when their strong social bonds compel them to follow a distressed individual into shallow waters."Whales and dolphins have always been mythical emblems for us," said Desray Reeb, a marine biologist at BOEM's headquarters in Sterling, Virginia. "They're intelligent, social and mystical, and present an intriguing challenge for us to understand because they're so like us, and yet so different."This particular investigation was Reeb's brainchild; she approached Pulkkinen about launching the research effort after hearing his presentation about space weather in June 2015. The team initially focused on Cape Cod -- the biggest hot spot in the United States -- and sifted through nearly two decades of IFAW stranding observations alongside both ground- and space-based NASA space weather data.Just as weather varies on Earth, occasionally bringing thunderstorms and gusty winds, the ever-changing Sun sometimes hurls massive clouds of solar material and magnetic fields into space, called coronal mass ejections, or CMEs. The effects of these eruptions on near-Earth space are collectively known as space weather. CMEs can spark powerful geomagnetic storms if they slam into Earth's magnetic field. If solar storms and strandings were indeed connected, the scientists thought they might detect patterns in Earth's geomagnetic activity in the time surrounding a stranding event."If we can determine what conditions promote strandings and develop an alert system that recognizes when those factors are coming together, then stranding networks in different areas can prepare for the event and get rescue efforts on the ground sooner," said project collaborator Katie Moore, the IFAW Deputy Vice President of Conservation and Animal Welfare.Headquartered in Yarmouth Port, Massachusetts, IFAW operates in 40 countries, rescuing animals and promoting conservation to secure a safe habitat for wildlife. 
In Cape Cod, IFAW has developed a robust emergency response program that has increased the stranding survival rate from 14 to 75 percent in almost 30 years. Shifting from reactive to predictive capabilities, however, would represent an entirely new approach to animal rescue. With funding from BOEM's Environmental Studies Program and NASA's Science Innovation Fund, the team undertook a major data-mining effort to take the initial steps toward developing predictions.First, they looked for correlations between each stranding event and the space weather outlook the day of that event. Then, they shifted the space weather data by different time periods -- one day, two days, 10 days, and so on -- to explore whether there is a delay in the effects of solar activity on strandings.After analyzing all the data, the scientists found that no matter the shift in time, space weather had the same statistical relationship with each stranding -- indicating no clear causal connection between geomagnetic activity and the Cape Cod strandings.While the scientists had been hoping for a eureka moment, the results of their analysis still led them to consider that while space weather isn't a primary driver of strandings, it could be one factor among several. Unraveling interactions and events in biological scenarios typically requires ecological perspectives; perhaps space weather, they thought, was one necessary component of the grander ecological conditions that lead to mass stranding events."Although our analyses indicated that geomagnetic storms are likely not a major cause, it is very difficult, if not impossible, to completely exclude any possible factor from the mix," Pulkkinen said. 
"Our view is that strandings are likely caused by a complex combination of multiple environmental factors, so we want to include the widest possible range of possible parameters in the follow-up study."Diving deeper into the complex puzzle of mass strandings, the team decided to expand their analysis and include additional oceanographic and atmospheric data sets from NASA's Earth science missions, including Terra, the Sea-viewing Wide Field-of-view Sensor -- or SeaWIFS, for short -- and Global Precipitation Measurement, as well as the National Oceanic and Atmospheric Administration's Geostationary Operational Environmental Satellite, or GOES, mission. In turn, the team itself also expanded to include more collaborators with expertise in the increasingly complex statistical analysis the project demanded.The additional data may shed light on the interacting conditions that affect cetaceans' behavior. For example, tides, winds and sea surface temperature could disrupt their migration habits, and ocean color -- referring to the water's chemical and particle content -- could reflect changes in the food chain."NASA has access to large-scale oceanographic data sets ranging from primary productivity to ocean temperature, currents and wind," Moore said. "For the first time, we're layering huge data sets to study this problem. Maybe we'll find there's a 'perfect storm' of conditions that lead to a stranding."To determine whether they've found a plausible explanation for a stranding, the team statisticians build models that attempt to make predictions within their data sets. They remove a small subset of the data, and if their model can accurately replicate the missing pieces, the scientists may be on the right track."These environmental and animal observations are noisy data, so whatever we find, we have to take with a grain of salt," said Erdem Karaköylü, a Goddard Earth science data analyst and oceanographer who joined the team during its expansion. 
"But it's also a rich data set. When you have a lot of data, it's easier to discard what's not useful."While the team's initial attention is turned to Cape Cod, their research has implications for preventing strandings across the globe. According to Reeb, each stranding hot spot requires individualized study, but the factors affecting strandings may be the same globally -- albeit to varying degrees of importance. Additionally, the team's current priority is laying the groundwork for future studies by developing methods for storing and analyzing multiple data sets. They envision building an open-source tool that would enable scientists across the world to collaborate and study strandings in their area in a similar fashion.Moore is still hopeful that her team will one day have a predictive model to support their rescues, ultimately enabling them to save more animals. In the meantime, the team will continue to inspect the layers of data for interactions and patterns, deepening their understanding of mass strandings and setting a precedent for future interdisciplinary studies."In past decades, we scientists often have worked in isolation, everyone sticking to their own specialty and answering questions from their perspective," Reeb said. "This exciting study brings amazing people with diverse expertise together to answer a question that has ramifications across the board."
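The lag analysis described earlier -- shifting the space-weather series by one day, two days, and so on, then re-testing its association with strandings -- can be sketched as a set of lagged correlations. The toy series below are invented, and only short lags are shown so the example stays self-contained; the actual study shifted by up to ten days and used more elaborate statistics.

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_correlations(index, events, lags=(0, 1, 2, 3)):
    """Correlate daily events with the index shifted earlier by each lag."""
    out = {}
    for lag in lags:
        if lag == 0:
            out[lag] = pearson(index, events)
        else:
            out[lag] = pearson(index[:-lag], events[lag:])
    return out

kp_index = [2, 3, 5, 7, 4, 3, 2, 6, 5, 4, 3, 2]     # toy daily geomagnetic index
strandings = [0, 1, 0, 0, 2, 0, 1, 0, 0, 1, 0, 0]   # toy daily stranding counts

corrs = lagged_correlations(kp_index, strandings)
print(sorted(corrs))   # [0, 1, 2, 3]
```

Finding roughly the same (weak) association at every lag, as the team did, is what argues against geomagnetic activity being a primary driver.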
Weather
2017
December 8, 2017
https://www.sciencedaily.com/releases/2017/12/171208095519.htm
Controlled burns limited severity of Rim Fire
Controlled burning of forestland helped limit the severity of one of California's largest wildfires, according to Penn State geographers.
The researchers studying the Rim Fire, which in 2013 burned nearly 400 square miles of forest in the Sierra Nevada, found the blaze was less severe in areas recently treated with controlled burns.Forest managers use controlled or prescribed fires under favorable weather conditions in an effort to reduce underbrush and fuel in forests, which can build up over time and cause more intense wildfires."We found prescribed burns really reduced the severity of the Rim Fire," said Alan Taylor, professor of geography and associate in the Earth and Environmental Systems Institute at Penn State."It points to the potential use of prescribed fires to reduce severe fire effects across landscapes," he said. "You can fight fire with fire. You can fight severe fires using these more controlled fires under conditions that are suitable."Scientists examined 21 previous fires within the Rim Fire's perimeter, which burned in and around Yosemite National Park. They found areas that had burned within the preceding 15 years fared better in the 2013 blaze.The best predictor of fire severity was how severely the area last burned, according to the findings published in the journal "Low severity burning seems to be very effective at limiting the severity of subsequent fires," said Lucas Harris, a graduate student in geography and lead author on the paper.The researchers examined factors like topography, weather conditions and fire history and used statistical models to determine what influenced fire severity. They found topography and weather conditions were the most important factors in the initial fire. However, the severity of that initial fire was the best predictor of how severe the next fire would be."The best predictor of what's going to happen in a reburn is what happened in the initial burn," Taylor said. "The long-term implication of that is that anything you can do to reduce the severity of an initial burn is going to play out into the future."The U.S. 
Forest Service has for decades practiced a policy of fire suppression, or fire exclusion, in areas it manages. This prevents smaller fires from burning underbrush and forest litter.When these fuels accumulate over a number of years, they can lead to unusually intense fires when the area does burn. This can destroy even the tallest trees, which a less intense fire might have spared."Fire severity has been increasing for about the past three decades," Taylor said. "There are real questions about whether we are beginning to see a shift in vegetation types driven by fire activity fueled by fire suppression and climate change."The researchers said severe fires leave behind a new legacy on the landscape. Less frequent, more severe fires caused by human intervention can change the composition of the forest and make future severe fires more likely to occur. For example, shrubs, which grow quickly after a fire, can take over forestland and then burn again before trees are able to re-establish."If you have a high severity initial fire, that's a real lost opportunity," Harris said. "You are probably getting a vegetation change due to that first fire that's going to cause more high-severity fires in the future and potentially the emergence of non-forest that could last for a long time."
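The central statistical point -- that reburn severity tracks the severity of the initial burn -- can be illustrated by grouping reburn severity by initial-burn class and comparing group means. All values, class labels, and the 0-to-1 severity scale below are invented for illustration; the study used proper statistical models over mapped burn-severity data.

```python
from collections import defaultdict
from statistics import mean

# (initial_severity_class, reburn_severity) pairs for hypothetical plots
observations = [
    ("low", 0.2), ("low", 0.3), ("low", 0.1),
    ("high", 0.7), ("high", 0.8), ("high", 0.6),
]

def mean_reburn_by_initial(obs):
    """Mean reburn severity conditioned on the initial burn's class."""
    groups = defaultdict(list)
    for cls, severity in obs:
        groups[cls].append(severity)
    return {cls: mean(vals) for cls, vals in groups.items()}

means = mean_reburn_by_initial(observations)
# Consistent with the finding: low-severity first burns -> milder reburns
print(means["low"] < means["high"])   # True
```

In this framing, a prescribed burn is an intervention that moves plots into the "low" initial-severity class, lowering expected severity the next time fire arrives.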
Weather
2017
December 7, 2017
https://www.sciencedaily.com/releases/2017/12/171207182531.htm
New mapping technique can help fight extreme poverty
For years, policymakers have relied upon surveys and census data to track and respond to extreme poverty.
While effective, assembling this information is costly and time-consuming, and it often lacks detail that aid organizations and governments need in order to best deploy their resources. That could soon change. A new mapping technique, described in the Nov. 14 issue of the Proceedings of the National Academy of Sciences, shows how researchers are developing computational tools that combine cellphone records with data from satellites and geographic information systems to create timely and incredibly detailed poverty maps.

"Despite much progress in recent decades, there are still more than 1 billion people worldwide lacking food, shelter and other basic human necessities," says Neeti Pokhriyal, one of the study's co-lead authors and a PhD candidate in the Department of Computer Science and Engineering at the University at Buffalo. The study is titled "Combining Disparate Data Sources for Improved Poverty Prediction and Mapping."

Some organizations define extreme poverty as a severe lack of food, health care, education and other basic needs. Others relate it to income; for example, the World Bank says people living on less than $1.25 per day (2005 prices) are extremely impoverished. While declining in most areas of the world, roughly 1.2 billion people still live in extreme poverty. Most are in Asia, sub-Saharan Africa and the Caribbean. Aid organizations and governmental agencies say that timely and accurate data are vital to ending extreme poverty.

The study focuses on Senegal, a sub-Saharan country with a high poverty rate. The first data set consists of 11 billion calls and texts from more than 9 million Senegalese mobile phone users. All information is anonymous, and it captures how, when, where and with whom people communicate. The second data set comes from satellite imagery, geographic information systems and weather stations. It offers insight into food security, economic activity, accessibility to services and other indicators of poverty. This can be gleaned from the presence of electricity, paved roads, agriculture and other signs of development.

The two data sets are combined using a machine learning-based framework. Using the framework, the researchers created maps detailing the poverty levels of 552 communities in Senegal. Current poverty maps divide the nation into four regions. The framework also can help predict certain dimensions of poverty, such as deprivations in education, standard of living and health. Unlike surveys or censuses, which can take years and cost millions of dollars, these maps can be generated quickly and cost-efficiently, and they can be updated as often as the data sources are. Their diagnostic nature can also help policymakers design better interventions to fight poverty.

Pokhriyal, who began work on the project in 2015 and has travelled to Senegal, says the goal is not to replace censuses and surveys but to supplement these sources of information in between cycles. The approach could also prove useful in areas of war and conflict, as well as remote regions.
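The data-combination step can be illustrated with a toy sketch. Everything here -- the region names, feature values, known poverty rates and the nearest-neighbour predictor -- is invented for illustration; the study's actual framework is a more sophisticated machine-learning model trained on real call-record and satellite-derived features.

```python
# Sketch: combine per-region phone features with satellite features,
# then estimate poverty for an unsurveyed region from its nearest
# labelled neighbour in the combined feature space.
import math

# Hypothetical features: [calls per capita, share of outgoing contacts]
phone_features = {
    "region_a": [3.1, 0.42],
    "region_b": [1.2, 0.18],
    "region_c": [2.9, 0.40],
}
# Hypothetical features: [night-light intensity, paved-road density]
satellite_features = {
    "region_a": [0.8, 0.6],
    "region_b": [0.1, 0.2],
    "region_c": [0.7, 0.5],
}
# Poverty rates known from survey data (maps supplement, not replace, surveys)
poverty_rate = {"region_a": 0.20, "region_b": 0.65}

def combined(region):
    """Concatenate the two disparate feature sets into one vector."""
    return phone_features[region] + satellite_features[region]

def predict(region):
    """Estimate poverty as the rate of the most similar surveyed region."""
    x = combined(region)
    nearest = min(poverty_rate, key=lambda r: math.dist(x, combined(r)))
    return poverty_rate[nearest]

print(predict("region_c"))  # region_c resembles region_a -> 0.2
```

A real pipeline would of course use many more features and a trained regression model, but the core idea -- fusing disparate data sources into one feature space before prediction -- is the same.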
Weather
2017
December 7, 2017
https://www.sciencedaily.com/releases/2017/12/171207125950.htm
One wet winter can shake up San Francisco Bay's invasive species
For many Californians, last year's wet winter triggered a case of whiplash. After five years of drought, rain from October 2016 to February 2017 broke more than a century of records. In San Francisco Bay, Smithsonian Environmental Research Center biologists discovered a hidden side effect: All that freshwater rain can turn the tables on some of the bay's invasive species.
"As you get wetter and wetter, there are fewer and fewer [marine] species that can tolerate those conditions," said Andrew Chang, lead author of the new study published Dec. 7. Chang, a marine biologist at the Smithsonian Environmental Research Center's branch in Tiburon, Calif., has been watching San Francisco's invaders since 2000. He is especially interested in the fouling community, underwater creatures like tunicates and bryozoans that grow on boats, docks and fishing and aquaculture equipment. Though some of them can look stunningly beautiful underwater, they are less attractive when they clog fishing gear or ruin nets.

Some species, Chang has noticed, need a salty bay to survive. When a wet winter sends massive surges of freshwater into the bay, those organisms start to suffer. And last winter may not have been a fluke: those kinds of extremes -- years of deluge and years of drought -- are already becoming more common as climate change accelerates.

In the past, scientists have generally tracked how shifting weather patterns impact a single species. In the new study, Chang and the center's Marine Invasions Lab examined how San Francisco's fouling community as a whole changed over 13 years of wet, dry and moderate weather. Starting in 2001, the team tracked the growth of these species in Richmond Marina, a mostly saltwater marina in northeastern San Francisco Bay. They suspended square PVC panels from docks, where they remained underwater for one month, three months or, starting in 2004, three to five years, collecting all kinds of colorful marine life.

During dry years, when bay waters remained salty, one invader dominated above all others: the invasive tunicate Ciona. But when the wetter winters of 2006 and 2011 hit, that dominance broke down and other species took over.

To confirm that wet weather was indeed behind the species switch-ups, Chang's team ran an experiment in addition to the 13-year field surveys, taking panels from the marina and exposing them in the lab to near-freshwater (wet year), near-oceanic saltwater (dry year) or medium-saltwater (moderate year) conditions. Marine life on panels exposed to the fresher "wet year" water suffered near-total mortality. When the researchers put the panels back in the marina and pulled them up eight weeks later, an entirely new suite of species had colonized. By contrast, species on the medium- and high-saltwater panels barely suffered at all.

As Chang sees it, freshwater years reset the system -- a situation that could work to the advantage of some of the invaders. "If you're a new invader arriving to San Francisco Bay, for example, what better time to come in than right after a wet winter has killed off most of your potential competitors?" he said.

Many of the new species, like colonial tunicates and encrusting bryozoans, are non-native. However, Chang's team noticed a couple of native species did better in wet years too. This suggests that, with the right strategy, managers could use the situation to help native species instead. "When you have a wet winter and it kills off a huge number of species...we're really knocking back the non-native population," Chang said. "Perhaps that would be an opportune time to take some aggressive management action."

According to Chang, that action could involve addressing boat traffic -- one of the key ways invasive species arrive in San Francisco Bay -- or other tactics to pump up native species or ensure invasive ones stay low. San Francisco almost certainly has not seen its last rainy winter or its last drought. What extreme wet years offer, biologists suggest, is a window of opportunity. When a dominant invader like Ciona gets cut down, it could be a godsend for another invader, or a new shot at life for some of the bay's struggling natives.
Weather
2017
December 6, 2017
https://www.sciencedaily.com/releases/2017/12/171206121952.htm
'Stressed out' cocoa trees could produce more flavorful chocolate
Most people agree that chocolate tastes great, but is there a way to make it taste even better? Perhaps, according to scientists reporting in an ACS journal, who looked at different conditions that can put a strain on cocoa trees.
Cocoa trees grow in hot and humid climates near the equator. Traditionally, these trees are raised together in mixed groves with other types of trees and plants that can cool the air and provide vital shade. The system, called agroforestry, provides a low-stress environment, increases nutrients in the soil and helps maintain ground water levels. But to gain higher yields, growers sometimes plant cocoa trees in solitary, "monocultural" groves, in which the trees are exposed to stressful conditions. In response to the stress, trees produce antioxidants that can potentially counteract the damage, but these compounds also could change the quality characteristics of the beans. Wiebke Niether, Gerhard Gerold and colleagues from FiBL (Switzerland) wanted to find out whether differing growing methods can influence the chemical composition, and potentially the flavor, of cocoa beans.

The researchers harvested beans from five cocoa tree farms in Bolivia at the beginning and end of the dry season, which runs from April to September. The trees were raised in full-sun monocultural groves or in agroforest settings. The beans were fermented and dried, then analyzed. The research team detected only minor differences in the chemical composition among the beans harvested from the farms during the same weather conditions. Slightly more phenols and other antioxidant compounds were detected in beans from monoculturally grown trees than in those from trees grown with agroforest methods, but the differences were not significant, according to the researchers. The larger contribution to chemical composition came from the weather: overall, the antioxidant content increased and the fat content of the beans decreased during the dry season as temperatures rose and soil moisture dropped. The researchers say these differences could contribute to variability in cocoa bean flavor.
Weather
2017
November 30, 2017
https://www.sciencedaily.com/releases/2017/11/171130093954.htm
New UK map of air pollution provides insights into nitrogen dioxide levels across the country and within towns and cities
EarthSense Systems -- a joint venture between the University of Leicester and aerial mapping company BlueSky -- has published MappAir® -- the first ever high resolution nationwide map of air pollution.
Combining data from satellites and its own air quality monitoring sensors together with open source data, EarthSense has used complex modelling techniques to create the highly accurate map. Initially available for the whole of the UK at 100 metre resolution, MappAir® shows how air pollution, specifically nitrogen dioxide, changes across the country and within towns and cities, highlighting likely sources and potential clean-air refuge areas.

"Air pollution is making headlines across the world for all the wrong reasons," commented James Eddy, Managing Director of EarthSense Systems. "However, there simply isn't enough data available for those charged with tackling the issue to make informed decisions. MappAir® can provide a street-view to city-wide visualisation of air pollution, and is the first in a series of nationwide products that are coming to market in the next year."

Using the British National Grid, EarthSense has divided the UK into 100 metre squares -- about twice the size of an average football pitch. Air pollution readings from satellites and its own Zephyr air quality monitoring sensors were combined with open data, including traffic emissions and weather conditions, to produce an annual average for each cell.

As additional sensors come online and more historical data is made available, EarthSense plans to produce a range of MappAir® products, including an ultra-high resolution 1 m dataset for detailed study areas, a 10 m map for urban areas, an historic time series of maps showing how air pollution changes over the course of a day and on different days, and forecast maps giving an indication of fluctuations up to three days ahead. EarthSense will also be releasing a map of PM2.5 (ultrafine pollution particles smaller than 2.5 micrometres) later in 2017.

"Air pollution is not a constant threat," continued Eddy. "Not only does it differ from location to location, as MappAir® clearly shows, but it also changes from morning rush hour to afternoon school run, and from weekday commutes to weekend leisure pursuits. This is why we are already working on the next products in the MappAir® range, including near real-time alerting maps and forecast maps."

With the Government recently outlining its plans to tackle climate change while driving economic growth, accurate map-based data such as the MappAir® products are designed for a wide range of applications. These include local planning, enforcement and mitigation strategies, as well as commercial applications such as conveyancing and health diagnostics. It is hoped the MappAir® products will also help with public engagement and behavioural change initiatives.

EarthSense Systems is a joint venture between aerial mapping company Bluesky and the University of Leicester. The MappAir® data is available to view and purchase online.
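The gridding step described above -- snapping each reading to a 100 metre cell and averaging -- can be sketched as follows. The coordinates and NO2 values are invented, and the real MappAir® processing also folds in satellite observations, traffic emissions and weather through modelling rather than simple averaging.

```python
# Sketch: bin pollution readings onto a 100 m easting/northing grid
# (in the style of the British National Grid) and average per cell.
from collections import defaultdict

CELL = 100  # grid resolution in metres

# (easting_m, northing_m, NO2 reading) -- invented sample values
readings = [
    (459_730, 303_210, 38.0),
    (459_790, 303_260, 42.0),  # same 100 m cell as the first reading
    (460_150, 303_210, 21.0),  # a different cell
]

def cell_of(easting, northing):
    """Snap a coordinate to the south-west corner of its 100 m cell."""
    return (easting // CELL * CELL, northing // CELL * CELL)

sums = defaultdict(lambda: [0.0, 0])
for e, n, value in readings:
    acc = sums[cell_of(e, n)]
    acc[0] += value
    acc[1] += 1

cell_average = {cell: total / count for cell, (total, count) in sums.items()}
print(cell_average[(459_700, 303_200)])  # mean of 38.0 and 42.0 -> 40.0
```

Annual averages per cell would come from accumulating a year of readings the same way before dividing.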
Weather
2017
November 29, 2017
https://www.sciencedaily.com/releases/2017/11/171129143405.htm
Sea-level rise predicted to threaten more than 13,000 archaeological sites in southeastern US
Sea-level rise may impact vast numbers of archaeological and historic sites, cemeteries, and landscapes on the Atlantic and Gulf coasts of the southeastern United States, according to a study published November 29, 2017 in an open-access journal.
To estimate the impact of sea-level rise on archaeological sites, the authors of the present study analyzed data from the Digital Index of North American Archaeology (DINAA). DINAA aggregates archaeological and historical data sets developed over the past century from numerous sources, providing the public and research communities with a uniquely comprehensive window into human settlement.

Just in the remainder of this century, if projected trends in sea-level rise continue, the researchers predict that over 13,000 recorded archaeological sites in the southeast alone may be submerged by a 1 m rise in sea level, including over 1,000 listed on the National Register of Historic Places as important cultural properties. Many more sites and structures that have not yet been recorded will also be lost.

Large linked data sets such as DINAA, which show what may be impacted and what could be lost across entire regions, are essential to developing procedures for sampling, triage, and mitigation efforts. Such research is also essential to making accurate forecasts and public policy decisions about the consequences of rapid climate change, extreme weather events, and displaced populations -- factors that could shape our civilization profoundly in the years to come.

Anderson notes: "Sea-level rise in the coming years will destroy vast numbers of archaeological sites, buildings, cemeteries, and cultural landscapes. Developing informatics capabilities at regional and continental scales like DINAA (Digital Index of North American Archaeology) is essential if we are to effectively plan for, and help mitigate, this loss of human history."
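The kind of regional query an aggregated index makes possible can be sketched with invented site records. DINAA's real holdings are far richer, and a proper analysis would compare site locations against detailed inundation models rather than a single elevation threshold.

```python
# Sketch: flag recorded sites at or below a projected sea-level rise.
# Site IDs, elevations and listing flags are invented examples in the
# Smithsonian trinomial style, not real DINAA records.
sites = [
    {"id": "8SJ17",  "elevation_m": 0.6, "nrhp_listed": True},
    {"id": "38CH42", "elevation_m": 0.9, "nrhp_listed": False},
    {"id": "44VB7",  "elevation_m": 3.2, "nrhp_listed": False},
]

def at_risk(sites, rise_m):
    """Return the sites submerged by a given sea-level rise in metres."""
    return [s for s in sites if s["elevation_m"] <= rise_m]

threatened = at_risk(sites, 1.0)
print(len(threatened))                            # -> 2 sites threatened
print(sum(s["nrhp_listed"] for s in threatened))  # -> 1 NRHP-listed site
```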
Weather
2017
November 28, 2017
https://www.sciencedaily.com/releases/2017/11/171128230031.htm
Higher plant species richness may not be enough to protect ecosystems from the worst impacts of climate extremes
Studies on mild fluctuations in weather have provided support for the idea that higher biodiversity results in more stable functioning of ecosystems, but critical appraisal of the evidence from extreme event studies is lacking.
Higher plant species richness is not always sufficient to reduce ecosystem vulnerability to climate extremes, as shown in a comprehensive literature analysis. While biodiversity is under threat around the globe, the number of extreme weather events is on the rise as a direct consequence of climate change. Researchers from several institutes around Europe have now looked into the scientific literature that addresses these global changes to examine the interactions between biodiversity and extreme weather events.

They sought to find out whether and how increased biodiversity may help to uphold the functioning of ecosystems in the face of climate extremes. In other words, can biodiversity help to avert the worst effects of droughts, heat waves and extremely wet weather? The answer, it turns out, is not cut-and-dried. Available evidence from herbaceous systems indicates mixed effects of species richness on biomass stability under extremely wet and dry events.

Why doesn't plant species richness play a consistently beneficial role in ensuring that the functioning of ecosystems is better maintained under climate extremes? The authors provide several explanations for this unexpected finding. First of all, it seems that biodiversity may not offer as much protection if the event in question is very extreme: buffering mechanisms which drive ecosystem resistance, such as compensation by better-adapted species or species taking over the functional role of others (functional redundancy), may simply be overwhelmed in such cases. However, as lead author Hans De Boeck from the University of Antwerp points out: "Biodiversity may still be important, as it has been shown to speed up recovery of plant productivity after an extreme event."

Secondly, the cause of biodiversity decline may confound biodiversity-stability effects. Unlike in artificially assembled experimental systems, widely observed eutrophication (nutrient enrichment) caused by intensive agriculture, traffic and industry often leads to impoverished ecosystems with few but fast-growing species that are less able to cope with adverse climatic conditions such as drought. Reducing eutrophication and/or maintaining a greater diversity of species with different growth rates within ecosystems could lead to more stable systems that are better able to face extremes.

Finally, species richness may not be the most relevant indicator of 'biodiversity' when studying biodiversity-stability relationships. General patterns of biodiversity effects may be more apparent if scientists consider the diversity of plant traits rather than simply species numbers. "Diversity metrics can include a variety of properties of trait distributions, but studies have only just scratched the surface on the value of these different metrics for extreme event science," says De Boeck.

In order to better harness the benefits of biodiversity for sustained ecosystem function, the authors suggest that future research should focus on understanding the underlying mechanisms of diversity-stability relationships in the face of extreme events. The study presented here highlights current knowledge gaps and provides research recommendations so that ecologists can gain a deeper understanding of the linkages between biodiversity and ecosystem stability in a changing world.
Weather
2017
November 27, 2017
https://www.sciencedaily.com/releases/2017/11/171127135821.htm
Fighting plant disease at warm temperatures keeps food on the table
An issue of global concern is the anticipated shortage of agricultural output to meet the steady rise in human population. Michigan State University scientists understand that overcoming crop loss due to disease and adverse weather will be key in achieving this goal.
One of the best historical examples of this is the Irish Potato Famine. Beginning in 1845, Ireland experienced the "perfect storm" of unusually cool, damp weather that provided prime growing conditions for an exotic pathogen that destroyed the potato crop. With their primary food source ravaged by disease, a million Irish people died from the ensuing famine. On the other end of the thermometer, warmer temperatures also can cause extensive crop loss. This critical correlation between changing weather and plants' ability to fend off diseases is featured in the current issue of the journal.

In this scenario, Bethany Huot, MSU cell and molecular biology graduate program alumna and the study's lead author, wanted to find out whether the plants' defense system was compromised, or the pathogens' virulence enhanced. The answer: it's both.

"Just like people, plants are more likely to get sick when they are growing in stressful environments," said Huot, who published the paper with Sheng Yang He, University Distinguished Professor of plant biology and a Howard Hughes Medical Institute Investigator, and Beronda Montgomery, MSU Foundation Professor. "While individual stresses are damaging to plants, they can have catastrophic effects when combined."

The researchers showed on the genetic level how high temperature weakens plant defenses while, separately, strengthening bacterial attacks. When people get a fever, they take a form of salicylic acid, or SA, commonly known as aspirin. Plants don't have to go to a medicine cabinet because they're able to make their own SA. At 73 degrees Fahrenheit, plants can produce plenty of SA to fight off a pathogenic infection. However, when the heat rose above 86 degrees, no SA was produced, leaving plants vulnerable. The authors also found that the pathogen became stronger at the elevated temperature. However, the increased vulnerability of the plants occurred regardless of whether the pathogen was present.

"Since the plants could no longer make SA at elevated temperature, we sprayed them with a chemical that acts like SA," Huot said. "This treatment effectively protected the plants from infection; even though the bacteria are more virulent at high temperatures, plants can fight them off if we give them the SA they can no longer make."

Even if global climate issues are resolved, local fluctuations in environment will always occur and greatly impact crop growth and yield, Huot added. "Increasing our understanding of how specific environmental factors affect the host and the pathogen, as well as their interactions, can inform strategies for developing robust crop resistance," she said. "This is important for keeping food on the table."
Weather
2017
November 27, 2017
https://www.sciencedaily.com/releases/2017/11/171127124747.htm
Breakthrough in tornado short-term forecasting could mean earlier, more accurate warnings
When mere seconds of storm warning could mean the difference between harm or safety, two researchers with ties to Western University in London, Canada, have developed a tornado-prediction method they say could buy as much as 20 minutes additional warning time.
These radar-based calculations can forecast a tornado with 90 per cent accuracy within a 100-kilometre radius, say Anna Hocking, PhD and a Western alumna, and Prof. Wayne Hocking, who leads the Atmospheric Dynamics Group based at Western's Department of Physics and Astronomy in the Faculty of Science. They have authored a paper entitled "Tornado Identification and Forewarning with VHF Windprofiler Radars," published today.

"Typically, meteorologists look for specific signatures that include wind speeds plus an overshoot, a dome-like knob that forms atop a thundercloud," said Anna Hocking. "What we've been able to do, for the first time, is add in and quantify a third factor: turbulence."

The pair use a unique Ontario-Quebec network of purpose-built radars that measure wind and turbulence through the upper atmosphere, troposphere and lower stratosphere. The O-QNet of 10 radar arrays (including one just north of London) was built in part with funding from the Canada Foundation for Innovation, as well as NSERC and Environment and Climate Change Canada. The network also involves collaboration with researchers at York University and McGill University.

"Because this network is so large and is designed to measure turbulence as well as winds, we've been able to see patterns, predictors, that haven't been evident before," said Wayne Hocking, who is also a Fellow of the Royal Society of Canada.

The Hockings collected and analyzed 16 years of tornado data, including reports to Environment Canada from citizen observers, and correlated that with real-time and archived data from these radar arrays. Of the 31 documented tornadoes, specific profiles of cloud overshoot into the stratosphere, wind velocity and turbulence were uniquely present 90 per cent of the time -- with all three signature features evident 10 to 20 minutes before the tornado formed. There was less than a 15 per cent likelihood of false detection, Wayne Hocking said.

That's a significant improvement over existing predictions, whose warnings often cover huge regions and can produce false alarms. Accurate, timely and geographically precise tornado warnings have long been a holy grail among meteorologists. Sometimes, proprietary technologies and differing methodologies among forecasters have proven a barrier to deciphering weather patterns.

"There's still a lot to be done and we're not going to say this is going to solve the whole forecasting problem -- but this is a large step in the right direction," Wayne Hocking said. "The data suggest we now have a more reliable tool for forecasting than has been possible before this."
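The paper's central idea -- that a forewarning requires all three radar signatures at once -- can be sketched as a simple rule. The threshold values below are placeholders for illustration, not the authors' calibrated criteria.

```python
# Sketch: issue a tornado forewarning only when all three signatures
# are present together -- cloud-top overshoot into the stratosphere,
# high wind velocity, and strong turbulence. Thresholds are invented.
def tornado_warning(overshoot_km, wind_ms, turbulence_index,
                    min_overshoot=1.0, min_wind=25.0, min_turbulence=0.5):
    """Return True only when all three signature features exceed thresholds."""
    return (overshoot_km >= min_overshoot
            and wind_ms >= min_wind
            and turbulence_index >= min_turbulence)

print(tornado_warning(1.8, 32.0, 0.7))  # all three present -> True
print(tornado_warning(1.8, 32.0, 0.2))  # turbulence signature absent -> False
```

Requiring the conjunction of all three features, rather than any one alone, is what the study credits with keeping false detections below 15 per cent.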
Weather
2017
November 22, 2017
https://www.sciencedaily.com/releases/2017/11/171122093117.htm
Rainfall can indicate that mosquito-borne epidemics will occur weeks later
A new study demonstrates that outbreaks of mosquito-borne viruses Zika and Chikungunya generally occur about three weeks after heavy rainfall. Researchers also found that Chikungunya will predominate over Zika when both circulate at the same time, because Chikungunya has a shorter incubation period -- just two days, versus 10 days for Zika. The latter finding explains why a late-2015 Zika epidemic in Rio de Janeiro ended while the number of Chikungunya cases increased in February 2016.
Viruses transmitted by insects can lead to serious health repercussions. Zika is linked to birth defects, and up to 1 percent of Zika infections result in Guillain-Barre syndrome, a form of paralysis. Chikungunya can cause arthritis. The researchers aimed to identify the environmental drivers of these epidemics to create a framework for predicting where and when future outbreaks could occur.

The researchers screened 10,459 blood and urine samples for Chikungunya, dengue and Zika from residents of 48 municipalities in the state of Rio de Janeiro. They tracked dates of major rainfalls, assessed the geographic distribution of mosquito-borne virus incidence in cities and neighborhoods, and examined the timing of epidemics. They confirmed 1,717 cases of Zika infection, 2,170 cases of Chikungunya and 29 cases of dengue. Zika occurred more commonly in neighborhoods with little access to municipal water infrastructure; the incidence of Chikungunya was weakly correlated with urbanization, such as the density of buildings. Rains began in October 2015 and were followed one month later by the largest wave of Zika. Zika cases markedly declined in February 2016, which coincided with the start of a Chikungunya outbreak.

The findings could enable municipalities to enact measures to limit the spread of the diseases and prepare vaccinations. Health officials can use the occurrence of heavy rainfall to prepare for epidemics roughly one month in advance. A weather-based early warning system could give public health officials sufficient lead time to obtain supplies of intravenous immunoglobulin, the standard treatment for Guillain-Barre. The analysis also pinpointed potential risk factors for clustering of Zika cases in Rio de Janeiro's neighborhoods. The study was published in a peer-reviewed journal.
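The lag-finding idea behind such a warning system can be sketched with a small cross-correlation over weekly series. The rainfall and case numbers below are invented, constructed so that cases peak three weeks after heavy rain as the study describes; the real analysis used confirmed case data from 48 municipalities.

```python
# Sketch: find the lag (in weeks) at which rainfall best correlates
# with later case counts, using a plain Pearson correlation.
def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rain  = [90, 10, 5, 8, 80, 12, 6, 7, 85, 9, 4, 6]  # weekly rainfall (mm), invented
cases = [2, 3, 2, 60, 3, 2, 2, 55, 2, 3, 2, 58]    # weekly case counts, invented

def best_lag(rain, cases, max_lag=6):
    """Lag maximizing correlation of rainfall with cases `lag` weeks later."""
    return max(range(1, max_lag + 1),
               key=lambda lag: correlation(rain[:-lag], cases[lag:]))

print(best_lag(rain, cases))  # cases here track rain three weeks later -> 3
```

An operational system would apply the fitted lag prospectively: after a week of heavy rain, officials would have roughly that many weeks to pre-position treatment supplies.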
Weather
2017
November 22, 2017
https://www.sciencedaily.com/releases/2017/11/171122093109.htm
Climate change models of bird impacts pass the test
A major study looking at changes in where UK birds have been found over the past 40 years has validated the latest climate change models being used to forecast impacts on birds and other animals.
Led by the University of Adelaide, in collaboration with an international team of researchers, the scientists compared forecasts from ecological models with observed changes to the bird populations -- and found the latest models were working well.

"Models have been developed in recent years to predict how the area where a bird species lives -- known as its range -- will change as the climate does," says lead author Dr Damien Fordham from the University of Adelaide's Environment Institute. "The results show that the enormous effort being invested into improving tools for forecasting the effect of climate change on species range movement and extinctions is working. We are now a lot more confident in what models should be used, and when, to provide a more accurate picture of biodiversity loss from climate change. While this study was on UK birds, we expect these results will also hold for many other birds and animals."

Dr Fordham, who heads Global Ecology research at the University of Adelaide, directed a team of scientists who tested how accurately different types of ecological models predicted the contraction and expansion of the ranges of 20 UK bird species over the last 40 years. They found that the latest generation of models, which directly account for important ecological responses to climate change, do much better at forecasting recent range shifts. For example, the Sparrowhawk has colonised the eastern UK since 1970, and this was captured by sophisticated models that included population growth rates and how far birds travel from where they are born.

"Our findings are a real win for bird conservation in the UK and beyond," said Dr Regan Early of the Centre for Ecology and Conservation at the University of Exeter. "This is because we now have tools that not only better forecast climate-driven range movements, but can be used to target conservation management resources more effectively."

The results also have direct consequences for efforts to protect biodiversity. The research team will now use the models to rank the cost-effectiveness of different regional conservation alternatives for birds in the UK this century. The study is published in a scientific journal.
Weather
2017
November 21, 2017
https://www.sciencedaily.com/releases/2017/11/171121095201.htm
Climate changes triggered immigration to America in the 19th century, study finds
In the 19th century, over 5 million Germans moved to North America. It was not only a century of poverty, war and revolutions in what is now Germany, but also of variable climate. Starting at the tail end of the cold period known as the Little Ice Age, the century saw glacier advances in the Alps, and a number of chilly winters and cool summers, as well as other extreme weather events such as droughts and floods.
"Overall, we found that climate indirectly explains up to 20-30% of migration from Southwest Germany to North America in the 19th century," says Rüdiger Glaser, a professor at the University of Freiburg, Germany, and lead-author of the The researchers could see a climate signature in most major migration waves from Southwest Germany during the 19th century. "The chain of effects is clearly visible: poor climate conditions lead to low crop yields, rising cereal prices and finally emigration," says Glaser. "But it is only one piece of the puzzle.""Our results show that the influence of climate was marked differently during the different migration waves," adds Iso Himmelsbach, another of the researchers at the University of Freiburg who took part in the study.The team studied official migration statistics and population data from the 19th century, as well as weather data, harvest figures and cereal-price records. They focused on the region that is now the Baden-Württemberg state, where many of the migrants -- such as Charles Pfizer of pharmaceutical fame -- originated from. They started by identifying the major migration waves and then investigated to what extent climate played a role in driving people to North America during each of them.The first wave followed the eruption of the Tambora volcano in Indonesia in 1815. The volcanic ash and gases spewed into the atmosphere caused temperatures to drop around the world for a few years after the eruption. The 'year without summer', 1816, was wet and cold causing widespread crop failures, famine and emigration."Another peak-migration year, 1846, had an extremely hot and dry summer leading to bad harvests and high food prices," says Annette Bösmeier, a researcher at the University of Freiburg who also involved in the study. 
"These two years of high migration numbers appear to be quite strongly influenced by climate changes, while for other migration waves other circumstances appeared to be more important," she adds. Climate was a less significant factor in driving the largest emigration wave, from 1850 to 1855, the researchers found. While unfavourable weather affected crops, resulting in low harvests during this time, other factors also drove up food prices. During the Crimean War (1853-1856), for example, France banned food exports, putting pressure on the German grain markets. At the time, the authorities of Baden also paid the poorest people to leave the country in an attempt to prevent uprisings and save on welfare. This, too, drove up emigration numbers. "Migration in the 19th century was a complex process influenced by multiple factors. Lack of economic prospects, social pressure, population development, religious and political disputes, warfare, family ties and the promotion of emigration from different sides influenced people's decision to leave their home country," concludes Glaser. "Nevertheless, we see clearly that climate was a major factor." In the past few years, climate has taken centre stage in migration discussions, since future climate change is expected to lead to mass migration ('climate refugees') as sea levels rise and extreme weather events, such as floods, droughts and hurricanes, become more frequent. The team hope their study can shed some light on the various factors influencing migration and how important climate can be in triggering mass movements of people.
Weather
2017
November 17, 2017
https://www.sciencedaily.com/releases/2017/11/171117103802.htm
Warmer water signals change for Scotland's shags
An increasingly diverse diet among European Shags at one of Scotland's best-studied breeding colonies has been linked to long-term climate change and may have important implications for Scotland's seabirds, according to research led by the Centre for Ecology & Hydrology, published on 16 November 2017.
Three decades of data from the Isle of May, off Scotland's east coast, showed that the proportion of sandeels -- the bird's usual fare -- declined by 48% between 1985 and 2014. Over the same period, the number of other fish prey in the diet increased, from an average of just 1 species per year in 1985 to 11 in 2014. Crucially, the results link this dietary shift to long-term warming of the surrounding seas. The North Sea is one of the most rapidly warming marine ecosystems on the planet, having warmed by 0.037 degrees Celsius per year between 1982 and 2012. Lead author Richard Howells explains that the study "ties in with many observations of changes in the abundance, distribution and phenology of many species in the North Sea, and a decline in the availability and size of sandeels." "Climate models predict further increases in sea surface temperature and weather variability in the region, with generalisation in shag diet one way in which this species appears to be responding to this change." Short-term weather conditions also impacted on the birds' ability to feed, with "windier conditions on a daily basis linked to fewer sandeel in the diet. This may affect the ability of parents to successfully feed their chicks." "Changes in the prey types consumed by this population suggest that adults may now be hunting across a broader range of habitats than they did in the past, such as rocky habitats where they can find the Rock Butterfish. Such changes may alter interactions with potential threats, such as small-scale offshore renewable developments." Finally, Howells adds, "By identifying effects of both long-term temperature trends and short-term weather variability, this study helps us gain a better understanding of how our seabird species are affected by ongoing climate change, and may help us direct conservation efforts to preserve these populations now and in the future."
Weather
2017
November 17, 2017
https://www.sciencedaily.com/releases/2017/11/171117085130.htm
New theory rewrites opening moments of Chernobyl disaster
A brand-new theory of the opening moments of the Chernobyl disaster, the most severe nuclear accident in history, based on additional analysis, has been presented for the first time.
The new theory, presented by researchers from the Swedish Defence Research Agency, the Swedish Meteorological and Hydrological Institute, and Stockholm University, suggests the first of the two explosions reported by eyewitnesses was a nuclear explosion, not a steam explosion as is currently widely thought. They hypothesize that the first explosive event was a jet of debris ejected to very high altitudes by a series of nuclear explosions within the reactor. This was followed, within three seconds, by a steam explosion which ruptured the reactor and sent further debris into the atmosphere at lower altitudes. The theory is based on new analysis of xenon isotopes detected by scientists from the V.G. Khlopin Radium Institute in Leningrad, four days after the accident, at Cherepovets, a city north of Moscow, far from the major track of Chernobyl debris. These isotopes were the product of recent nuclear fission, suggesting they could be the result of a recent nuclear explosion. In contrast, the main Chernobyl debris which tracked northwest to Scandinavia contained equilibrium xenon isotopes from the reactor's core. By assessing the weather conditions across the region at the time, the authors also established that the fresh xenon isotopes at Cherepovets were the result of debris injected into far higher altitudes than the debris from the reactor rupture which drifted towards Scandinavia. Observations of the destroyed reactor tank indicated that the first explosion caused temperatures high enough to melt a two-meter thick bottom plate in part of the core. Such damage is consistent with a nuclear explosion. In the rest of the core, the bottom plate was relatively intact, though it had dropped by nearly four meters.
This suggests a steam explosion which did not create temperatures high enough to melt the plate but generated sufficient pressure to push it down. Lead author and retired nuclear physicist from the Swedish Defence Research Agency, Lars-Erik De Geer commented, "We believe that thermal neutron mediated nuclear explosions at the bottom of a number of fuel channels in the reactor caused a jet of debris to shoot upwards through the refuelling tubes. This jet then rammed the tubes' 350kg plugs, continued through the roof and travelled into the atmosphere to altitudes of 2.5-3km where the weather conditions provided a route to Cherepovets. The steam explosion which ruptured the reactor vessel occurred some 2.7 seconds later." Seismic measurements and an eye-witness report of a blue flash above the reactor a few seconds after the first explosion also support the new hypothesis of a nuclear explosion followed by a steam explosion. This new analysis brings insight into the disaster, and may potentially prove useful in preventing similar incidents in the future.
Weather
2017
November 15, 2017
https://www.sciencedaily.com/releases/2017/11/171115175313.htm
Shape of Lake Ontario generates white-out blizzards, study shows
A 6-foot-wide snow blower mounted on a tractor makes a lot of sense when you live on the Tug Hill Plateau. Tug Hill, in upstate New York, is one of the snowiest places in the Eastern U.S. and experiences some of the most intense snowstorms in the world. This largely rural region, just east of Lake Ontario, gets an average of 20 feet of snow a year.
Hence the tractor-mounted snow blower. The region's massive snow totals are due to lake-effect snowstorms and, it turns out, to the shape of Lake Ontario. Lake-effect storms begin when a cold mass of air moves over relatively warm water. The heat and moisture from the water destabilize the air mass and cause intense, long-lasting storms. Lake-effect snow is common in the Great Lakes region and in areas downwind of large bodies of water, including the Great Salt Lake. Researchers, including the University of Utah's Jim Steenburgh and University of Wyoming's Bart Geerts, now report that these intense snowstorms are fueled by air circulation driven by the heat released by the lake, and that the shoreline geography of Lake Ontario affects the formation and location of this circulation. The result? Very heavy snowfall. The findings, published in three papers, show how the shorelines of lakes may help forecasters determine the impacts of lake-effect storms. "Lake Ontario's east-west orientation allows intense bands of snow to form," said Ed Bensman, a program director in the National Science Foundation's (NSF) Division of Atmospheric and Geospace Sciences, which funded the research. "This study found that the shape of the lake's shoreline can have an important influence on the low-level winds that lead to bands of snow for long periods of time -- and to heavy snow totals. The research team analyzed the strength of these snow bands, and their formation and persistence. Snow bands were often active for several days." When land breezes move offshore from places where the coastline bulges out into a lake, unstable air masses form and drive a narrow band of moisture that falls as snow on a strip of land downwind of the lake. Steenburgh said it's long been known that breezes coming from the shore onto a lake help initiate and direct the formation of snow bands.
Steenburgh and Geerts, and colleagues from universities in Illinois, Pennsylvania and upstate New York, traveled to Lake Ontario as part of an NSF-funded project called Ontario Winter Lake-effect Systems (OWLeS). The scientists investigated several questions about lake-effect systems. To answer them, Geerts' team flew a Wyoming King Air research plane through winter storms, and Steenburgh's group set up weather monitoring equipment, including profiling radars and snow-measurement stations, to monitor the arrival of lake-effect storms near Tug Hill. The researchers witnessed the region's intense snowfall, including one storm that dropped 40 inches in 24 hours. Snowfall rates often exceeded 4 inches per hour. "That's an amazing rate," Steenburgh said. "It's just an explosion of snow." Wyoming Cloud Radar aboard the King Air plane detected an intense secondary air circulation across the main snow band. "This circulation had a narrow updraft, creating and lifting snow like a fountain in a narrow strip that dumped heavy snow where it made landfall," Geerts said. Using a weather model, Steenburgh's team found that this circulation's origin was a land breeze generated by the lake's uneven shoreline geography. In some cases, another land breeze generated a second snow band that merged with the first. "The intense secondary circulation, with updrafts up to 20 miles per hour, had never been observed before," Geerts said. One particular shoreline feature played a large role: a gentle, broad bulge along Lake Ontario's southern shore that extends from about Niagara Falls in the west to Rochester, New York, in the east. "This bulge was important in determining where the lake-effect snow bands developed," Steenburgh said. "A bulge near Oswego, New York, on the southeast shore, also contributed to an increase in the precipitation downstream of Lake Ontario over Tug Hill." Steenburgh says the residents of the region take the heavy snowfall in stride.
Roads are kept plowed, and the team found that on many days, the biggest challenge was just getting out of the driveway of the house they stayed in. Once the tractor-snow blower was fired up, however, the researchers had a clear shot. "We're a bunch of snow geeks," Steenburgh said. "We love to see it snowing like that. It's really pretty incredible. And our friends on Tug Hill made sure we could do our research." Incorporating considerations of shoreline geography into weather forecast models can help predict which communities might be most affected by snowstorms, Steenburgh said. Understanding the effect of breezes that arise from the shore's shape is the key. "If we want to pinpoint where the lake-effect is going to be, we're going to have to do a very good job of simulating what's happening along these coastal areas," he said.
Weather
2,017
November 15, 2017
https://www.sciencedaily.com/releases/2017/11/171115124926.htm
Off track: How storms will veer in a warmer world
Under global climate change, Earth's climatic zones will shift toward the poles. This is not just a future prediction; it is a trend that has already been observed in the past decades. The dry, semi-arid regions are expanding into higher latitudes, and temperate, rainy regions are migrating poleward. A recently published paper examines how the tracks of storms will shift in a warmer world.
Prof. Yohai Kaspi of the Institute's Earth and Planetary Sciences Department explains that Earth's climatic zones roughly follow latitudinal bands. Storms mostly move around the globe in preferred regions called "storm tracks," forming over the ocean and generally traveling eastward and somewhat poleward along these paths. Thus, a storm that forms in the Atlantic off the East Coast of the US at a latitude of around 40N will reach Europe in the region of latitude 50N. Until recently, however, this inclination to move in the direction of the nearest pole was not really understood. Dr. Talia Tamarin in Kaspi's group solved this fundamental question in her doctoral research. Kaspi: "From the existing climate models, one can observe the average storm tracks, but it is hard to prove cause and effect from these. They only show us where there are relatively more or fewer storms. Another approach is following individual storms; however, we must deal with chaotic, noisy systems that are heavily dependent on the initial conditions, meaning no storm is exactly like another. Talia developed a method that combines these two approaches. She applied a storm-tracking algorithm to simplified atmospheric circulation models in which thousands of storms are generated, thus eliminating the dependence on initial conditions. This allowed her to understand how such storms develop over time and space, and what controls their movement." Even such simplified models involve calculations that require several days of computation in one of the Weizmann Institute's powerful computer clusters. In the present study, to understand how the movement of storms may change in a warmer world, Tamarin and Kaspi applied the same method to full-complexity simulations of climate change predictions. Their analysis showed that the tendency of storm tracks to veer in the direction of the poles intensifies in warmer conditions. They discovered that two processes are responsible for this phenomenon.
One is connected to the vertical structure and circulation near the tops of these weather systems. A certain type of flow that is necessary for them to grow also steers the storms toward the pole, and these flows are expected to become stronger when average temperatures rise. The second process is connected to the energy tied up in the water vapor in such storms. In global warming, the hotter air will contain more water vapor, and thus more energy will be released when the vapor condenses to drops. "The hottest, wettest air is circulating up the eastern flank of the storm -- to the northern side -- and releasing energy there," says Tamarin. "This process pushes the storm northward (or southward in the southern hemisphere), and this effect will also be stronger in a warmer climate." The models of climate change predict that if average global temperatures rise by four degrees over the next 100 years, storms will deviate poleward from their present tracks by two degrees of latitude. The research performed at the Weizmann Institute of Science shows that part of this will be due to the mechanism they demonstrated, and the other part is tied to the fact that storms are born at a higher latitude in a warmer world. "The model Talia developed gives us both qualitative information on the mechanisms that steer storms toward the poles and quantitative means to predict how these will change in the future," says Kaspi. "Although two degrees may not sound like a lot, the resulting deviation in temperature and rain patterns will have a significant effect on climate zones," he adds.
Weather
2,017
November 13, 2017
https://www.sciencedaily.com/releases/2017/11/171113195019.htm
Study of impact of climate change on temperatures suggests more deaths unless action taken
The largest study to date of the potential temperature-related health impacts of climate change has shown that as global temperatures rise, the surge in death rates during hot weather outweighs any decrease in deaths in cold weather, with many regions facing sharp net increases in mortality rates.
Encouragingly, the research, led by the London School of Hygiene & Tropical Medicine, also showed these deaths could largely be avoided under scenarios that include mitigation strategies to reduce greenhouse gas emissions and further warming of the planet. Antonio Gasparrini, Associate Professor of Biostatistics and Epidemiology at the London School of Hygiene & Tropical Medicine and lead author of the paper, said: "Climate change is now widely recognised as the biggest global threat of the 21st century. Although previous studies have shown a potential rise in heat-related mortality, little was known about the extent to which this increase would be balanced by a reduction in cold-related deaths. In addition, effects tend to vary across regions, depending on local climate and other characteristics, making global comparisons very difficult." "This study demonstrates the negative impact of climate change, which may be more dramatic among the warmer and more populated areas of the planet, and in some cases disproportionately affect poorer regions of the world. The good news is that if we take action to reduce global warming, for instance by complying with the thresholds set by the Paris Agreement, this impact will be much lower." The research, funded by the Medical Research Council, involved creating the first global model of how mortality rates change with hot or cold weather.
It used real data from 85 million deaths between 1984 and 2015, specific to a wide range of locations that took into account different climates, socioeconomics and demographics. This enabled the team to estimate how temperature-related mortality rates will change under alternative scenarios of climate change, defined by the four Representative Concentration Pathways (RCPs) established by the Intergovernmental Panel on Climate Change for climate modelling and research in 2014. Under the worst-case scenario (RCP 8.5), which assumes that greenhouse gas emissions continue to rise throughout the 21st century, the authors show the potential for extremely large net increases in temperature-related mortality in the warmer regions of the world. In cooler areas, the less intense warming and large decrease in cold-related deaths may mean no net change or a marginal reduction in temperature-related deaths. Under the strictest pathway (RCP 2.6), which assumes an early peak of greenhouse gas emissions which then decline substantially, the potential net increases in mortality rates at the end of the century would be minimal (between -0.4% and +0.6%) in all the regions included in this study, highlighting the benefits of the implementation of mitigation policies. Sir Andy Haines, Professor of Public Health & Primary Care at the London School of Hygiene & Tropical Medicine, and study co-author, said: "This paper shows how heat related deaths will escalate in the absence of decisive action to reduce the emissions of carbon dioxide and short-lived climate pollutants such as methane and black carbon.
Such action could also result in major health benefits in the near term by reducing deaths from air pollution." "It is imperative that actions are taken to build on the achievements of the Paris Treaty as the commitments made there are insufficient to prevent warming above 2 degrees C compared with pre-industrial temperatures." Antonio Gasparrini said: "The findings of this study will be crucial for the development of coordinated and evidence-based climate and public health policies, and for informing the ongoing international discussion on the health impacts of climate change that is vital for the future health of humanity." The authors acknowledge limitations in the study, including the lack of data for some regions of the world, and the fact that adaptation mechanisms and potential changes to demographics have not been accounted for.
Weather
2017
November 8, 2017
https://www.sciencedaily.com/releases/2017/11/171108092411.htm
Improving climate observations offers major return on investment
A well-designed climate observing system could help scientists answer knotty questions about climate while delivering trillions of dollars in benefits by providing decision makers information they need to protect public health and the economy in the coming decades, according to a new paper published today.
The flip side is also true, said lead author Elizabeth Weatherhead, a scientist with CIRES at the University of Colorado Boulder. The cost of failing to invest in improving our ability to predict and plan for droughts, floods, extreme heat events, famine, sea level rise and changes in freshwater availability could reach hundreds of billions of dollars each year, she and her colleagues wrote. "Improving our understanding of climate not only offers large societal benefits but also significant economic returns," Weatherhead said. "We're not specifying which measurement (or observing) systems to target, we're simply saying it's a smart investment to address the most pressing societal needs." Data generated by the current assemblage of observing systems, including NOAA's satellite and ground-based observing systems, have yielded significant insights into important climate questions. However, coordinated development and expansion of climate observing systems are required to advance weather and climate prediction to address the scale of risks likely in the future. For instance, the current observing system cannot monitor precipitation extremes throughout much of the world, and cannot forecast the likelihood of extreme flooding well enough to sufficiently guide rebuilding efforts. "The current decline of our Earth observing systems is likely to continue into the foreseeable future," said Liz Moyer, a climate researcher at the University of Chicago who was not involved in the new assessment.
"Unless action is taken -- such as suggested in this paper -- our ability to plan for and respond to some of the most important aspects of climate, including extreme events and water availability, will be significantly limited." Weatherhead and a team that included four NOAA laboratory directors and many other prominent climate scientists urge that investments focus on tackling seven "grand challenges," such as predicting extreme weather and climate shifts, the role of clouds and circulation in regulating climate, regional sea level change and coastal impacts, understanding the consequences of melting ice, and feedback loops involving carbon cycling. In each category, observations are needed to inform process studies, to build long-term datasets against which to evaluate changing conditions, and ultimately to improve modeling and forecasting capabilities. "We are on the threshold of a new era in prediction, drawing on our knowledge of the entire Earth system to strengthen societal resilience to potential climate and weather disasters," said co-author Antonio Busalacchi, president of the University Corporation for Atmospheric Research. "Strategic investments in observing technologies will pay for themselves many times over by protecting life and property, promoting economic growth, and providing needed intelligence to decision makers." "Well planned observations are important to more than just understanding climate," agreed Deon Terblanche, director of research at the World Meteorological Organization.
"Predicting the weather and extreme events, and managing water availability and energy demand will all benefit." "Developing observation systems focused on the major scientific questions with a rigorous evaluation process to ensure the measurement quality is fit-for-purpose -- as the authors propose -- will more than pay off in the long run," said Tom Gardiner, a principal research scientist at the UK's National Physical Laboratory. Objective evaluations of proposed observing systems, including satellites, ground-based or in-situ observations as well as new, currently unidentified observational approaches, will be needed to prioritize investments and maximize societal benefits, the authors propose. "We need to take a critical look at what's needed to address the most important climate questions," said NASA scientist and co-author Bruce Wielicki. Not all new observing strategies would necessarily require expensive new systems like satellites, the authors pointed out. For example, after a devastating flood hit Fort Collins, Colo. in 1997, the state climatologist developed a network of trained volunteers to supplement official National Weather Service precipitation measurements using low-cost measuring tools and a dedicated web portal. The Community Collaborative Rain, Hail and Snow Network now counts thousands of volunteers nationwide who provide the data directly to the National Weather Service. Using a rigorous evaluation process to develop a robust network of observation systems focused on the major scientific questions will more than pay off in the long run, the authors concluded. "The economic risks from climate change are measured in trillions of dollars," agreed Rich Sorkin, CEO of Jupiter, a Silicon Valley-based company that provides intelligence on weather and climate risks around the globe. "So an improved, properly designed observing system, with commensurate investments in science and understanding, has the potential to be of tremendous value to society."
Weather
2,017
November 6, 2017
https://www.sciencedaily.com/releases/2017/11/171106091055.htm
Animation meets biology: Shedding new light on animal behavior
Many animals rely on movement to find prey and avoid predators. Movement is also an essential component of the territorial displays of lizards, comprising tail, limb, head and whole-body movements.
For the first time, digital animation has been used as a research tool to examine how the effectiveness of a lizard's territorial display varies across ecological environments and conditions. A team from La Trobe University's School of Life Sciences, led by Dr Richard Peters, worked with academics from Monash University's Faculty of IT to create, using 3D animation, a series of varied environmental settings and weather conditions, comprising different plant environments and wind conditions, to quantify how lizard displays are affected by this variation. The use of movement to communicate is common among lizards, but it has been impossible to observe lizard signalling behaviour in every type of ecological setting using traditional methods such as using multiple cameras in the wild, Dr Peters said. Our research team therefore devised an innovative way of combining evolutionary biology with digital arts to create a 3D animation tool that simulates three spatial dimensions plus movement through time. La Trobe University PhD student and lead author, Xue ("Snow") Bian, explained how the territorial display of a real Jacky dragon was filmed from two camera views. The lizard's signalling was reconstructed by digitising the position of multiple body parts through the sequence, subsequently combining the data from the two camera views to reconstruct the signalling motion in 3D, Ms Bian said. Then four scenarios were created using the same lizard signal in different plant environments and weather conditions to explore how these different ecological contexts affected signal effectiveness, explained Dr Tom Chandler from Monash University. Dr Peters said that using animation as a research tool will allow scientists to measure much more accurately the behavioural signals of lizards. This exciting development in evolutionary biology opens up all sorts of other possibilities for studying animal behaviour in a range of settings, including in environments affected by climate change and habitat
modification, Dr Peters said.Under such circumstances, lizard signals might be more noticeable, therefore making the lizard more vulnerable to predators.The research -- Integrating evolutionary biology with digital arts to quantify ecological constraints on vision-based behaviour -- is published on 6 November in the journal
Weather
2,017
November 1, 2017
https://www.sciencedaily.com/releases/2017/11/171101130322.htm
Improving public safety in face of extreme weather and disasters
Our ability to observe and predict severe weather events and other disasters has improved markedly over recent decades, yet this progress does not always translate into similar advances in the systems used in such circumstances to protect lives. A more cohesive alert and warning system that integrates public and private communications mechanisms and adopts new technologies quickly is needed to deliver critical information during emergency situations. At the same time, better understanding of social and behavioral factors would improve the ways we communicate about hazards, inform response decisions such as evacuations, develop more resilient urban infrastructure, and take other steps to improve weather readiness.
Two reports by the National Academies of Sciences, Engineering, and Medicine propose steps to improve public safety and resilience in the face of extreme weather and other disasters. Emergency Alert and Warning Systems: Current Knowledge and Future Research Directions examines how government systems such as Wireless Emergency Alerts (WEA) and the Integrated Public Alert and Warning System (IPAWS) will need to evolve as technology advances. The transformation of these alert systems should be informed by both technological and social and behavioral sciences research, the report says. Integrating Social and Behavioral Sciences Within the Weather Enterprise emphasizes the need for government agencies, industry, and academic institutions involved in the weather enterprise to work together to more actively engage social and behavioral scientists, in order to make greater progress in protecting life and enhancing prosperity. While efforts to improve physical weather prediction should continue, the report says, realizing the greatest return on investment from such efforts requires understanding how people's contexts, experiences, knowledge, perceptions, and attitudes shape their responses to weather risks. Emergency alerts and warnings are sent out by government agencies through broadcast media and WEA. But the report notes that the information ecosystem has broadened to also include a wider variety of delivery mechanisms, including first-person reports on social media platforms.
Private companies like Google and Facebook are also collecting information from emergency management agencies to issue notifications. The committee that conducted the study and wrote the report said government-designed systems need to fit into this larger structure of communication. The committee envisioned an integrated alerts system that continually takes advantage of new technologies and knowledge emerging from events and research. Emergency managers should increase the use of WEA and incorporate current knowledge of how the public responds to emergency notifications to craft more effective alert messages in the near future. Those agencies and private companies responsible for evolving and implementing IPAWS and WEA should adopt newer delivery and geotargeting technologies in the next several years. The report outlines key research questions and other areas of study. One example is to improve geotargeting by performing more research to determine the best ways to graphically display the location of an individual in a risky situation and how visualizations can be used to best illustrate the location of the message receiver relative to the area of impact.
The committee also recommended exploring message characteristics like length and content when communicating about an emergency situation, how best to transmit information in multiple languages, and how to make public education campaigns regarding disaster alerts more effective. There are also several challenges in building an effective alerts system, the report notes, such as slow adoption of new systems because of gaps in funding or expertise, the challenge of adapting to ever-changing technology, and limited opportunities for engineers, social science researchers, and emergency managers to frequently interact to apply current knowledge or fill gaps in understanding.

Improving the Weather Enterprise with Social and Behavioral Sciences

Weather forecasts and warnings are being made with greater accuracy, geographic specificity, and lead time, which allow people and communities to take appropriate protective measures. Yet, as recent hazardous weather events have illustrated, social and behavioral factors -- including people's contexts, experiences, knowledge, perceptions, and attitudes -- shape responses to weather risks, says Integrating Social and Behavioral Sciences Within the Weather Enterprise, the second report released today. The committee that conducted the study and wrote this report noted that as efforts to advance meteorological research continue, it is essential for government agencies, industry, and academic institutions, all part of the weather enterprise, to integrate social and behavioral sciences into their work. 
This report suggests strategies to better engage researchers and practitioners from multiple social science fields to advance those fields, to more effectively apply relevant research findings, and to foster more cooperation on this endeavor among public, private, and academic sectors. A better understanding of social and behavioral aspects of weather readiness will help us not only to design more effective forecasts and warnings but also to reduce vulnerability and mitigate risks of hazardous weather well before an event strikes and to better support emergency management and response efforts. The report includes a special focus on social science research related to road safety, given that road weather hazards are by far the largest cause of weather-related deaths and injuries in the United States. An estimated 445,000 people are injured and 6,000 killed annually due to weather-related vehicle accidents. Understanding why people choose to drive during hazardous weather can help in developing better strategies to discourage risky behavior. Better understanding how drivers get weather-related information can help better inform people who encounter dangerous conditions such as icy roads or low visibility while already in transit. Many innovative social science research activities to date have made demonstrable contributions to the weather enterprise. But new insights are often not routinely applied in practice, and building a solid base of knowledge has been hampered by small-scale and inconsistent investments in these efforts. The report finds that limited support for research in this area has made it difficult to sustain a critical mass of robust studies, let alone expand research capacity. 
Making greater progress in advancing interdisciplinary work among physical and social science researchers also requires that meteorologists and other weather professionals have a more realistic understanding of the many disciplines and research methodologies within social and behavioral sciences; of the time and resources needed for robust research; and of the inherent limitations in providing simple, universally applicable answers to complex social questions. NOAA will need to play a central role in driving this research forward in order to achieve the agency's goals of improving the nation's weather readiness, the report says. The committee detailed several possible mechanisms for the agency to advance its capacity to support social and behavioral science research, including innovative public-private partnerships for interdisciplinary weather research and creating social science-focused research programs within NOAA's Cooperative Institutes. Other federal agencies that are needed as key partners in this work are the National Science Foundation, the Federal Highway Administration, and the Department of Homeland Security/FEMA. Some examples of critical research needs highlighted in this report include: understanding how forecasters, broadcast media, emergency and transportation managers, and private weather companies interact and create and disseminate information; understanding how to better reach and inform populations that are particularly vulnerable to hazardous weather; and understanding how new communication technologies affect message design and are changing people's weather information access, interpretations, preparedness, and response.
Weather
2017
October 31, 2017
https://www.sciencedaily.com/releases/2017/10/171031111446.htm
Dinosaur-killing asteroid impact cooled Earth's climate more than previously thought
The Chicxulub asteroid impact that wiped out the dinosaurs likely released far more climate-altering sulfur gas into the atmosphere than originally thought, according to new research.
A new study makes a more refined estimate of how much sulfur and carbon dioxide gas were ejected into Earth's atmosphere from vaporized rocks immediately after the Chicxulub event. The study's authors estimate more than three times as much sulfur may have entered the air compared to what previous models assumed, implying the ensuing period of cool weather may have been colder than previously thought. The new study lends support to the hypothesis that the impact played a significant role in the Cretaceous-Paleogene extinction event that eradicated nearly three-quarters of Earth's plant and animal species, according to Joanna Morgan, a geophysicist at Imperial College London in the United Kingdom and co-author of the new study published in […]. "Many climate models can't currently capture all of the consequences of the Chicxulub impact due to uncertainty in how much gas was initially released," Morgan said. "We wanted to revisit this significant event and refine our collision model to better capture its immediate effects on the atmosphere." The new findings could ultimately help scientists better understand how Earth's climate radically changed in the aftermath of the asteroid collision, according to Georg Feulner, a climate scientist at the Potsdam Institute for Climate Impact Research in Potsdam, Germany, who was not involved with the new research. The research could help give new insights into how Earth's climate and ecosystem can significantly change due to impact events, he said. "The key finding of the study is that they get a larger amount of sulfur and a smaller amount of carbon dioxide ejected than in other studies," he said. "These improved estimates have big implications for the climatic consequences of the impact, which could have been even more dramatic than what previous studies have found." The Chicxulub impact occurred 66 million years ago when an asteroid approximately 12 kilometers (7 miles) wide slammed into Earth. 
The collision took place near what is now the Yucatán peninsula in the Gulf of Mexico. The asteroid is often cited as a potential cause of the Cretaceous-Paleogene extinction event, a mass extinction that erased up to 75 percent of all plant and animal species, including the dinosaurs. The asteroid collision had global consequences because it threw massive amounts of dust, sulfur and carbon dioxide into the atmosphere. The dust and sulfur formed a cloud that reflected sunlight and dramatically reduced Earth's temperature. Based on earlier estimates of the amount of sulfur and carbon dioxide released by the impact, a recent study published in Geophysical Research Letters showed Earth's average surface air temperature may have dropped by as much as 26 degrees Celsius (47 degrees Fahrenheit) and that sub-freezing temperatures persisted for at least three years after the impact. In the new research, the authors used a computer code that simulates the pressure of the shock waves created by the impact to estimate the amounts of gases released in different impact scenarios. They changed variables such as the angle of the impact and the composition of the vaporized rocks to reduce the uncertainty of their calculations. The new results show the impact likely released approximately 325 gigatons of sulfur and 425 gigatons of carbon dioxide into the atmosphere, more than 10 times global human emissions of carbon dioxide in 2014. In contrast, the previous study in Geophysical Research Letters that modeled Earth's climate after the collision had assumed 100 gigatons of sulfur and 1,400 gigatons of carbon dioxide were ejected as a result of the impact. The new study's methods stand out because they ensured only gases that were ejected upwards with a minimum velocity of 1 kilometer per second (2,200 miles per hour) were included in the calculations. 
Gases ejected at slower speeds didn't reach a high enough altitude to stay in the atmosphere and influence the climate, according to Natalia Artemieva, a senior scientist at the Planetary Science Institute in Tucson, Arizona, and co-author of the new study. Older models of the impact didn't have as much computing power and were forced to assume all the ejected gas entered the atmosphere, limiting their accuracy, Artemieva said. The study authors also based their model on updated estimates of the impact's angle. An older study assumed the asteroid hit the surface at an angle of 90 degrees, but newer research shows the asteroid hit at an angle of approximately 60 degrees. Using this revised angle of impact led to a larger amount of sulfur being ejected into the atmosphere, Morgan said. The study's authors did not model how much cooler Earth would have been as a result of their revised estimates of how much gas was ejected. Judging from the cooling seen in the previous study, which assumed a smaller amount of sulfur was released by the impact, the release of so much sulfur gas likely played a key role in the extinction event. The sulfur gas would have blocked out a significant amount of sunlight, likely leading to years of extremely cold weather, potentially colder than the previous study found. The lack of sunlight and changes in ocean circulation would have devastated Earth's plant life and marine biosphere, according to Feulner. The release of carbon dioxide likely led to some long-term climate warming, but its influence was minor compared to the cooling effect of the sulfur cloud, Feulner said. Along with gaining a better understanding of the Chicxulub impact, researchers can also use the new study's methods to estimate the amount of gas released during other large impacts in Earth's history. For example, the authors calculated the Ries crater located in Bavaria, Germany, was formed by an impact that ejected 1.3 gigatons of carbon dioxide into the atmosphere. 
This amount of gas likely had little effect on Earth's climate, but the idea could be applied to help understand the climatic effects of larger impacts.
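The study's velocity cutoff -- counting only gas ejected upwards at 1 kilometer per second or faster -- amounts to a simple filter over simulated ejecta. The sketch below illustrates that filtering step only; the parcel masses and speeds are invented for illustration and are not outputs of the authors' impact code:

```python
# Toy illustration of the study's velocity cutoff: only gas parcels
# ejected at >= 1 km/s are assumed to reach climate-relevant altitudes.
# Parcel data below are invented, not from the impact simulations.

THRESHOLD_M_S = 1_000  # 1 km/s, the minimum ejection velocity used in the study

def climate_relevant_mass(parcels, threshold=THRESHOLD_M_S):
    """Sum the mass of parcels ejected at or above the velocity threshold."""
    return sum(mass for mass, velocity in parcels if velocity >= threshold)

# (mass in gigatons, ejection velocity in m/s) -- hypothetical values
parcels = [(50, 1500), (120, 900), (80, 2200), (40, 400), (60, 1000)]

print(climate_relevant_mass(parcels))  # 190: only the 1500, 2200 and 1000 m/s parcels count
```

The design point is that the filter discards mass, never rescales it, which is why the new estimates can differ so sharply from older models that assumed all ejected gas stayed aloft.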
Weather
2017
October 30, 2017
https://www.sciencedaily.com/releases/2017/10/171030141900.htm
Greenhouse gas concentrations surge to new record
Concentrations of carbon dioxide in the atmosphere surged at a record-breaking speed in 2016 to the highest level in 800,000 years, according to the World Meteorological Organization's Greenhouse Gas Bulletin. The abrupt changes in the atmosphere witnessed in the past 70 years are without precedent.
Globally averaged concentrations of CO2 […]. Rapidly increasing atmospheric levels of CO2 […]. The annual bulletin is based on observations from the WMO Global Atmosphere Watch Programme. These observations help to track the changing levels of greenhouse gases and serve as an early warning system for changes in these key atmospheric drivers of climate change. Population growth, intensified agricultural practices, increases in land use and deforestation, industrialization and associated energy use from fossil fuel sources have all contributed to increases in concentrations of greenhouse gases in the atmosphere since the industrial era, beginning in 1750. Since 1990, there has been a 40% increase in total radiative forcing -- the warming effect on our climate -- by all long-lived greenhouse gases, and a 2.5% increase from 2015 to 2016 alone, according to figures from the US National Oceanic and Atmospheric Administration quoted in the bulletin. "Without rapid cuts in CO2 […]." "CO2 […]." "The last time the Earth experienced a comparable concentration of CO2 […]." The WMO Greenhouse Gas Bulletin reports on atmospheric concentrations of greenhouse gases. Emissions represent what goes into the atmosphere. Concentrations represent what remains in the atmosphere after the complex system of interactions between the atmosphere, biosphere, cryosphere and the oceans. About a quarter of the total emissions is taken up by the oceans and another quarter by the biosphere, reducing in this way the amount of CO2 in the atmosphere. A separate Emissions Gap Report by UN Environment, to be released on 31 October, tracks the policy commitments made by countries to reduce greenhouse gas emissions and analyses how these policies will translate into emissions reductions through 2030, clearly outlining the emissions gap and what it would take to bridge it. "The numbers don't lie. We are still emitting far too much and this needs to be reversed. 
The last few years have seen enormous uptake of renewable energy, but we must now redouble our efforts to ensure these new low-carbon technologies are able to thrive. We have many of the solutions already to address this challenge. What we need now is global political will and a new sense of urgency," said Erik Solheim, head of UN Environment. Together, the Greenhouse Gas Bulletin and Emissions Gap Report provide a scientific base for decision-making at the UN climate change negotiations, which will be held from 7-17 November in Bonn, Germany. WMO, UN Environment and other partners are working towards an Integrated Global Greenhouse Gas Information System to provide information that can help nations to track the progress toward implementation of their national emission pledges, improve national emission reporting and inform additional mitigation actions. This system builds on the long-term experience of WMO in greenhouse gas instrumental measurements and atmospheric modelling. WMO is also striving to improve weather and climate services for the renewable energy sector and to support the Green Economy and sustainable development. To optimize the use of solar, wind and hydropower production, new types of weather, climate and hydrological services are needed.

CO2: The rate of increase of atmospheric CO2 […]. Over the last 800,000 years, pre-industrial atmospheric CO2 […]. From the most-recent high-resolution reconstructions from ice cores, it is possible to observe that changes in CO2 […].

Methane (CH4): Atmospheric methane reached a new high of about 1,853 parts per billion (ppb) in 2016 and is now 257% of the pre-industrial level.

Nitrous oxide (N2O): Its atmospheric concentration in 2016 was 328.9 parts per billion. This is 122% of pre-industrial levels. It also plays an important role in the destruction of the stratospheric ozone layer which protects us from the harmful ultraviolet rays of the sun. It accounts for about 6% of radiative forcing by long-lived greenhouse gases.
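The bulletin's emissions-versus-concentrations distinction reduces to simple bookkeeping: about a quarter of emitted CO2 is taken up by the oceans and another quarter by the biosphere, leaving roughly half airborne. A minimal sketch of that arithmetic (the 40-gigaton emission figure is a round illustrative number, not a WMO value):

```python
# Emissions-to-atmosphere bookkeeping as described in the bulletin:
# roughly 1/4 of emitted CO2 is absorbed by the oceans, ~1/4 by the
# biosphere, and the remainder stays in the atmosphere.
OCEAN_FRACTION = 0.25
BIOSPHERE_FRACTION = 0.25

def airborne(emitted_gt):
    """Return the portion of emitted CO2 (gigatons) that remains airborne."""
    return emitted_gt * (1 - OCEAN_FRACTION - BIOSPHERE_FRACTION)

print(airborne(40.0))  # 20.0: half of a hypothetical 40 Gt emission stays in the air
```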
Weather
2017
October 24, 2017
https://www.sciencedaily.com/releases/2017/10/171024144013.htm
New method for monitoring Indian Summer Monsoon
Researchers from Florida State University have created a tool for objectively defining the onset and demise of the Indian Summer Monsoon -- a colossal weather system that affects billions of people annually.
In a study published in the journal […]. For generations, scientists have struggled to produce a model for reliably defining the duration of the monsoon. No existing system has allowed researchers to reliably define the parameters of the season at this fine a scale. "Current weather forecasting and monitoring protocols focus attention on monsoon onset at one location -- specifically the state of Kerala in the southwest corner of the country -- and extrapolate for the rest of the region," said FSU Associate Professor of Earth, Ocean and Atmospheric Science Vasu Misra, the study's lead investigator. "We have gone down to specific locations, we've covered the whole country, and we've objectively defined the onset and demise dates for any given year." The lack of a clear, granular and objective benchmark for ISM onset and demise for all areas of the country has been a longtime source of consternation for the Indian people. In some parts of the country, the torrents of rain that characterize monsoon season account for more than 90 percent of the total annual precipitation. Consequently, many rhythms of Indian political and agricultural life can be destabilized by dubious or false claims of monsoon onset. "That leads to tremendous amounts of frustration and confusion for the general public and for the people who are trying to monitor the monsoon because nobody has really gotten down to do it at a granular scale," Misra said. This new system, which ties the onset of the monsoon to location-specific rainfall thresholds, can work to allay that frustration. Up until now, regional meteorological departments have relied on their own ad hoc criteria for determining ISM onset, which can often lead to contradicting claims. 
A more inclusive method will allow officials and researchers throughout the country to define the monsoon season using a standardized system that, through rigorous testing, has been shown to capture ISM evolution comprehensively. Anchoring the definition of onset and demise solely in local rain rates eliminates the need to rely on less accessible atmospheric variables. This streamlined approach makes it considerably easier to monitor monsoon evolution. "Our research enables quite easy real-time monitoring of the onset and demise of the Indian monsoon," Misra said. "We've tested this for 105 years of available data, and this criterion hasn't failed once for any location over India." By orienting this novel framework around rates of rainfall in a given area, Misra and his team have effectively removed the necessity for broad extrapolation. With this methodology, a question that has baffled meteorologists for decades finally has a simple, actionable answer. "You don't need complicated definitions," Misra said. "Now we completely base the definition on rainfall, and it hasn't failed."
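The idea of anchoring onset in local rain rates can be sketched in a deliberately simplified form: declare onset on the first day a location's cumulative rainfall reaches a location-specific threshold. The threshold and the daily series below are invented, and the study's actual criterion is more carefully constructed than this cumulative-total version:

```python
def onset_day(daily_rain_mm, threshold_mm):
    """Return the first day index on which cumulative rainfall reaches a
    location-specific threshold, or None if it is never reached.
    A simplified stand-in for the study's rainfall-based onset criterion."""
    total = 0.0
    for day, rain in enumerate(daily_rain_mm):
        total += rain
        if total >= threshold_mm:
            return day
    return None

# Hypothetical pre-monsoon drizzle followed by monsoon rains (mm/day)
rain = [0, 1, 0, 2, 0, 5, 12, 30, 45, 60]
print(onset_day(rain, threshold_mm=50))  # 7: cumulative rain first reaches 50 mm on day 7
```

Because the criterion uses only rainfall observed at the location itself, it can be evaluated in real time without extrapolating from a distant reference site such as Kerala.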
Weather
2017
October 20, 2017
https://www.sciencedaily.com/releases/2017/10/171020125758.htm
US ocean observation critical to understanding climate change, but lacks long-term national planning
The ocean plays a critical role in climate and weather, serving as a massive reservoir of heat and water that influences tropical storms, El Niño, and climate change. In addition, the ocean has absorbed 30 percent of the carbon dioxide associated with human activities, lessening the climate effects of fossil fuel combustion.
Ocean observing systems are important as they provide information essential for monitoring and forecasting changes in Earth's climate on timescales ranging from days to centuries. A new report by the National Academies of Sciences, Engineering, and Medicine finds that continuity of ocean observations is vital to gain an accurate understanding of the climate, and calls for a decadal, national plan that is adequately resourced and implemented to ensure critical ocean information is available to understand and predict future changes. The report notes that federal activities provide an opportunity for sustained and coordinated ocean-observing in the U.S., but require coordinated and high-level leadership to be effective. Additional benefits of this observational system include improvements in weather forecasting, marine resource management, and maritime navigation. The United States' contributions to the international network of ocean-observing activities are substantial today, and have advanced our understanding of global climate. Particularly, the U.S. is a leader in the efforts of the Global Ocean Observing System, an international organization that identifies priority ocean variables for understanding climate and technical requirements for their measurements. But issues related to flat or declining funding are jeopardizing the country's leadership and creating challenges in maintaining long-term ocean-related climate observations, the report says. 
Funding mechanisms that rely on annual budget approval or short-term grants may result in discontinuity of ocean-climate measurements, reducing the value of the observations made to date and in the future. The report also identifies other challenges that impact sustained observations, such as the declining investment in new technological development, increasing difficulty in retaining and replenishing the human resources associated with sustained ocean observing, and a decreasing number of global and ocean-class research vessels. The vast ocean area and harsh environment present a challenge for observing systems, but new sensors, materials, battery technology, and more efficient electronics could increase the effectiveness, efficiency, and longevity of ocean-observing instruments. Ships will continue to be required to deploy and maintain ocean-observing platforms. The report says maintenance of a capable fleet of global and ocean-class research vessels is an essential component of the U.S. effort to sustain ocean observing. At the same time, researchers and technicians in key government and academic laboratories are integral to success in the U.S. at sustained ocean observing and are a resource that requires support. Given that ocean observations for climate provide a wide range of benefits to the agricultural, shipping, fishing, insurance, and energy-supply industries, the committee that wrote the report suggested that efforts could be made to draw support for ocean observing from the commercial sector. In addition, philanthropic organizations have provided support for technology and capacity building initiatives that benefit ocean observing. The committee concluded that establishing an organization to enhance partnerships across sectors with an interest in ocean-observing, particularly nonprofits, philanthropic organizations, academia, U.S. federal agencies, and the commercial sector, would be an effective mechanism to increase engagement and coordination.
Weather
2017
October 19, 2017
https://www.sciencedaily.com/releases/2017/10/171019121842.htm
Three-quarters of the total insect population lost in protected nature reserves
Since 1989, the total biomass of flying insects in 63 German nature reserves has decreased by more than 75 percent. This decrease has long been suspected but has turned out to be more severe than previously thought. Ecologists from Radboud University, together with German and English colleagues, published these findings in the scientific journal
In recent years, it became clear that the numbers of many types of insects such as butterflies and bees were declining in Western Europe and North America. 'However, the fact that flying insects are decreasing at such a high rate in such a large area is an even more alarming discovery,' states Hans de Kroon, project leader at Radboud University. Entomologists (insect researchers) in Krefeld, Germany, led by Martin Sorg and Heinz Schwan, collected data over the past 27 years in 63 different places within nature reserves across Germany. Flying insects were trapped in Malaise traps and the total biomass was then weighed and compared. The researchers from Nijmegen, Germany and England have now been able to analyse this treasure trove of data for the first time. The researchers discovered an average decline of 76 percent in the total insect mass. In the middle of summer, when insect numbers peak, the decline was even more severe at 82 percent. According to Caspar Hallmann from Radboud University who performed the statistical analyses, 'All these areas are protected and most of them are managed nature reserves. Yet, this dramatic decline has occurred.' The exact causes of the decline are still unclear. Changes in the weather, landscape and plant variety in these areas are unable to explain this. The weather might explain many of the fluctuations within the season and between the years, but it doesn't explain the rapid downward trend. Researchers can only speculate about the possible causes. 'The research areas are mostly small and enclosed by agricultural areas. These surrounding areas attract flying insects and they cannot survive there. It is possible that these areas act as an 'ecological trap' and jeopardize the populations in the nature reserves,' explains Hallmann. 
It is likely that the results are representative of large parts of Europe and other parts of the world where nature reserves are enclosed by a mostly intensively used agricultural landscape. 'As entire ecosystems are dependent on insects for food and as pollinators, it places the decline of insect-eating birds and mammals in a new context,' states Hans de Kroon. 'We can barely imagine what would happen if this downward trend continues unabated.' Because the causes of the decline are not yet known, it is difficult to take any concrete measures. The researchers hope that these findings will be seen as a wake-up call and prompt more research into the causes and support for long-term monitoring. De Kroon: 'The only thing we can do right now is to maintain the utmost caution. We need to do less of the things that we know have a negative impact, such as the use of pesticides, and prevent the disappearance of farmland borders full of flowers. But we also have to work hard at extending our nature reserves and decreasing the ratio of reserves that border agricultural areas.'
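The headline 76 percent figure is a relative change in trapped biomass over the study period. A toy version of the arithmetic, with invented start and end values rather than the Krefeld trap data:

```python
def percent_decline(start, end):
    """Relative decline between a starting and ending biomass estimate."""
    return 100.0 * (start - end) / start

# Hypothetical mean biomass per trap (grams) at the start and end of a series
print(percent_decline(8.0, 1.92))  # a 76 percent decline, matching the reported average
```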
Weather
2017
October 12, 2017
https://www.sciencedaily.com/releases/2017/10/171012123049.htm
Rainfall trends in arid regions buck commonly held climate change theories
The recent intense hurricanes in the Atlantic have sharply focused attention on how climate change can exacerbate extreme weather events.
Scientific research suggests that global warming causes heavier rainfall because a hotter atmosphere can hold more moisture and warmer oceans evaporate faster, feeding the atmosphere with more moisture. However, this link between climate warming and heavy rainfall has only been examined in particular regions where moisture availability is relatively high. Until now, no research has been undertaken that examines this relationship in dryland regions where short, sharp rainstorms are the dominant source of precipitation and where moisture availability on land is extremely limited. To explore the links between climatic warming and rainfall in drylands, scientists from the Universities of Cardiff and Bristol analysed more than 50 years of detailed rainfall data (measured every minute) from a semi-arid drainage basin in southeast Arizona exhibiting an upward trend in temperatures during that period. The analysis demonstrated a decline in rainfall intensity, despite an increase in total rainfall over the years. 
Interestingly, the study shows that there is a long-term decline in heavy rainfall events (greater than 25 mm/h) and an associated increase in the number of smaller storms each delivering less rainfall. This result is contrary to commonly held assumptions about rainfall trends under climate change. Lead author Dr Michael Singer, from the School of Earth and Ocean Sciences at Cardiff University, said: "In drylands, convective (or short, intense) rainfall controls water supply, flood risk and soil moisture but we have had little information on how atmospheric warming will affect the characteristics of such rainstorms, given the limited moisture in these areas." Co-author Dr Katerina Michaelides, from the School of Geographical Sciences and Cabot Institute at the University of Bristol, said: "Our findings are consistent with previous research in the Colorado Basin which has revealed a decline in runoff in the upper part of the Basin. "Our work demonstrates that there is a more regional decline in water resources in this dryland region, which may be found in other dryland regions of the world." Since trends in convective rainfall are not easily detected in daily rainfall records, or well-simulated by global or regional climate models, the researchers created a new tool to assess the effects of climate change on rainfall patterns and trends in dryland areas. Their new model, STORM, simulates individual rainstorms and their expression over a river basin, and it can represent different classes of climate change over many decades. Drs Singer and Michaelides employ STORM to show that the historical rainfall trends likely resulted in less runoff from this dryland basin, an effect they expect to have occurred at many similar basins in the region. Dr Singer added: "We see this model as a useful tool to simulate climate change in regions and cases where traditional models and methods don't capture the trends."
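The trend the authors describe -- a long-term decline in heavy (greater than 25 mm/h) events -- can be illustrated by counting such events per year and fitting an ordinary least-squares slope. This sketch is not the STORM model, and the yearly counts below are fabricated for illustration:

```python
HEAVY_MM_PER_H = 25  # the study's heavy-rainfall cutoff

def slope(years, counts):
    """Ordinary least-squares slope of counts against years."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, counts))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Fabricated counts of events exceeding 25 mm/h in four sample years
years = [1960, 1975, 1990, 2005]
counts = [14, 12, 9, 8]

print(slope(years, counts))  # negative slope: heavy events declining over time
```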
Weather
2017
October 12, 2017
https://www.sciencedaily.com/releases/2017/10/171012114839.htm
Geologic evidence is the forerunner of ominous prospects for a warming Earth
While strong seasonal hurricanes have devastated many of the Caribbean and Bahamian islands this year, geologic studies on several of these islands illustrate that more extreme conditions existed in the past. A new analysis published in
In Bermuda and the Bahamas, the geology of the last interglacial (LIG; approximately 120,000 years ago) is exquisitely preserved in nearly pure carbonate sedimentary rocks. A record of superstorms and changing sea levels is exposed in subtidal, beach, storm, and dune deposits on multiple islands. Extensive studies by the authors over the past decades on these islands have documented stratigraphic, sedimentologic, and geomorphic evidence of major oceanic and climatic disruptions at the close of the last interglacial. Dr. Paul J. Hearty, a retired Associate Professor at the University of North Carolina at Wilmington, and Dr. Blair R. Tormey, a Coastal Research Scientist at Western Carolina University, conducted an invited review of published findings. It demonstrates that during a global climate transition in the late last interglacial, also known as marine isotope substage 5e (MIS 5e), abrupt multi-meter sea-level changes occurred. Concurrently, coastlines of the Bahamas and Bermuda were impacted by massive storms generated in the North Atlantic Ocean, resulting in a unique trilogy of wave-transported deposits: megaboulders, chevron-shaped, storm-beach ridges, and runup deposits on high dune ridges. While perhaps more mundane than the megaboulders (found only locally on Eleuthera), the sedimentological structures found within chevron ridge and runup deposits across islands throughout the Bahamas and Bermuda point to frequent and repeated inundation by powerful storm waves, in some locations leaving storm deposits tens of meters above sea level. During the last interglacial, sea levels were about 3-9 meters higher than they are now. The geologic evidence indicates that the higher sea-levels were accompanied by intense "superstorms," which deposited giant wave-transported boulders at the top of cliffed coastlines, formed chevron-shaped, storm beach ridges in lowland areas, and left wave runup deposits on older dunes more than 30 meters above sea level. 
These events occurred at a time when global climate was only slightly warmer and CO2 (about 275 ppm) was much lower than today. The authors emphasize "the LIG record reveals that strong climate forcing is not required to yield major impacts on the ocean and ice caps." In our industrial world, rapidly increasing atmospheric CO2 has surpassed 400 ppm, levels not achieved since the Pliocene epoch about 3 million years ago, while global temperature has increased nearly 1 °C since the 1870s. Today, ice sheets are melting, sea level is rising, oceans are warming, and weather events are becoming more extreme. Drs. Hearty and Tormey conclude that with the greatly increased anthropogenic CO2 forcing at rates unmatched in nature, except perhaps during global extinction events, dramatic change is certain. They caution that, "Our global society is producing a climate system that is racing forward out of humanity's control into an uncertain future. If we seek to understand the non-anthropogenic events of the last interglaciation, some of the consequences of our unchecked forward speed may come more clearly into focus...a message from the past; a glimpse into the future."
Weather
2017
October 12, 2017
https://www.sciencedaily.com/releases/2017/10/171012091550.htm
Study reveals need for better modeling of weather systems for climate prediction
Computer-generated models are essential for scientists to predict the nature and magnitude of weather systems, including their changes and patterns. Using 19 climate models, a team of researchers led by Professor Minghua Zhang of the School of Marine and Atmospheric Sciences at Stony Brook University discovered persistent dry and warm biases of simulated climate over the region of the Southern Great Plain in the central U.S. that were caused by poor modeling of atmospheric convective systems -- the vertical transport of heat and moisture in the atmosphere. Their findings, to be published in
The climate models analyzed in the paper "Causes of model dry and warm bias over central U.S. and impact on climate projections" included a precipitation deficit that is associated with widespread failure of the models in capturing actual strong rainfall events in summer over the region. By correcting for the biases, the authors revised the projected changes of precipitation over the US Southern Great Plain through the end of the 21st century. "Current climate models are limited by available computing power even when cutting-edge supercomputers are used," said Professor Zhang. "As a result, some atmospheric circulation systems cannot be resolved by these models, and this clearly impacts the accuracy of climate change predictions as shown in our study." Professor Zhang and colleagues believe climate models will become more accurate in the coming years with the use of exascale supercomputing, now in development worldwide.
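The bias-correction idea described above can be sketched in a few lines. This is a hypothetical illustration of a simple mean-bias correction, not the study's actual method; the function name and all numbers are invented.

```python
# Hypothetical sketch of a simple mean-bias correction for climate-model
# projections: each model's bias is estimated against observations over a
# historical period and assumed to persist into the future projection.
def mean_bias_correct(model_hist, observed, model_future):
    corrected = []
    for hist, future in zip(model_hist, model_future):
        bias = hist - observed            # model minus observation, same units
        corrected.append(future - bias)   # remove the bias from the projection
    return corrected

# Invented illustrative numbers (mm/day of summer precipitation over a region):
obs = 3.0                  # observed historical mean
hist = [2.1, 2.5, 1.8]     # three dry-biased historical simulations
future = [1.9, 2.2, 1.5]   # the same models' raw end-of-century projections
print(mean_bias_correct(hist, obs, future))
```

Real bias-correction schemes (e.g. quantile mapping) are more elaborate, but the principle is the same: estimate the systematic error where observations exist and remove it where they do not.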
Weather
2017
October 11, 2017
https://www.sciencedaily.com/releases/2017/10/171011180239.htm
Herbivores help protect ecosystems from climate change
Plant-eating critters are the key ingredient to helping ecosystems survive global warming, finds new UBC research that offers some hope for a defence strategy against climate change.
"The herbivores created space for other plants and animals to move in and we saw much more diversity and variety in these ecosystems," said Rebecca Kordas, the lead author of the study who completed this research as a PhD student in zoology at UBC. "We want variety because we found it helps protect the ecosystem when you add a stressor like heat." For this study, Kordas, who is now a research fellow at Imperial College London, and her colleagues created mini-marine ecosystems on the shore of Ruckle Park on British Columbia's Salt Spring Island. The mini ecosystems were built on hard plastic plates that allowed researchers to control the temperatures. Some of the plates allowed voracious herbivores called limpets in, and some kept them out. Limpets are like snails, but with a cone-shaped shell. The researchers were studying life in the intertidal zone, the area of the shore between the low tide and high tide. This area is home to a community of starfish, anemones, mussels, barnacles and seaweed. As the tide moves in and out, the plants and animals must cope with huge variation in temperature every day, sometimes as much as 20 to 25 degrees Celsius. "These creatures are already living at their physiological limits, so a two-degree change -- a conservative prediction of the warming expected over the next 80 years or so -- can make a big difference," said Kordas. "When heat waves come through B.C. and the Pacific Northwest, we see mass mortality of numerous intertidal species." The researchers found that in the summer, when temperatures were at their warmest, communities could fare well even if they were heated, but only if limpets were present. "When limpets were part of the community, the effects of warming were less harsh," she said. Christopher Harley, a professor of zoology at UBC and senior author on the study, says consumers like limpets, sea otters or starfish are very important to maintaining biodiversity, especially in aquatic ecosystems.
Losing these species can destabilize ecosystems, but by the same token, protecting these species can make ecosystems more resilient. "We should be thinking of ways to reduce our negative effects on the natural environment and these results show that if we do basic conservation and management, it can make a big difference in terms of how ecosystems will weather climate change," Harley said.
Weather
2017
October 11, 2017
https://www.sciencedaily.com/releases/2017/10/171011091718.htm
Scientists develop tool which can predict coastal erosion and recovery in extreme storms
The damage caused to beaches by extreme storms on exposed energetic coastlines and the rate at which they recover can now be accurately predicted thanks to new research led by the University of Plymouth.
Working with the University of New South Wales, scientists have developed a computer model which uses past wave observations and beach assessments to forecast the erosion and/or accretion of beach sediments over the coming year. They believe it could be a sea change for coastal managers, giving them the opportunity to make decisions that could protect communities from severe wave damage. In a newly published study, seeking to address that, they have developed a traffic light system based on the severity of approaching storms, which will highlight the level of action required to protect particular beaches. Dr Mark Davidson, Reader in Coastal Processes at the University of Plymouth, led the research. He said: "In the past, coastal managers have always tended to be responsive. They have been unable to fully predict how their areas might respond over periods of up to a year, and to assess any pre-emptive measures they could take. This research goes some way to changing that, enabling us to warn people in advance about how beaches will respond and helping officials take the steps they need to protect themselves and their communities." The new tool was tested on two beaches -- Perranporth in North Cornwall and Narrabeen, just north of Sydney -- which experience very differing wave and climatic conditions. Measured and/or modelled wave data are used to generate around a thousand potential shoreline predictions and, based on a statistical analysis of these, potential shoreline positions are displayed in a traffic-light-like system, whereby green signifies normal displacement ranges, amber high, and red extreme. The period tested included the Pasha Bulker storm sequence recorded at Narrabeen in 2007, and the extreme storms of 2013/14, known to be the most energetic storms to hit Europe's Atlantic coastline in more than six decades. In both cases, the methodology was able to predict both storm erosion and subsequent recovery, giving a clear indication of the intensity
of storms in terms of their impact on the coast. Dr Davidson added: "Beaches play a crucial role in the lives of coastal communities, acting as a defence but also in creating leisure opportunities. Gaining a greater knowledge of how they might be affected by weather is therefore essential, both in the short and long term. We have never been able to forecast over a longer period of time before, and are now looking at ways to expand this tool so that its accuracy and benefits can be increased."
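The traffic-light idea lends itself to a short sketch. The following is a hypothetical illustration, assuming percentile thresholds over the ensemble of shoreline predictions; the thresholds, function name and Gaussian toy ensemble are invented, not taken from the published model.

```python
import random

# Classify a predicted shoreline displacement against an ensemble of
# simulated outcomes (hypothetical thresholds: 90th and 99th percentiles).
def classify(displacement, ensemble, amber_pct=90, red_pct=99):
    ranked = sorted(ensemble)
    amber = ranked[int(len(ranked) * amber_pct / 100) - 1]
    red = ranked[int(len(ranked) * red_pct / 100) - 1]
    if displacement >= red:
        return "red"      # extreme erosion signal
    if displacement >= amber:
        return "amber"    # high
    return "green"        # within normal displacement range

# Toy stand-in for "around a thousand potential shoreline predictions":
random.seed(1)
ensemble = [random.gauss(0, 10) for _ in range(1000)]
print(classify(0.0, ensemble), classify(40.0, ensemble))
```

In practice the ensemble would come from wave-driven shoreline simulations rather than random draws, but the classification step reduces to exactly this kind of percentile comparison.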
Weather
2017
October 9, 2017
https://www.sciencedaily.com/releases/2017/10/171009123151.htm
Droughts and wildfires: How global warming is drying up the North American monsoon
Researchers have struggled to accurately model the changes to the abundant summer rains that sweep across the southwestern United States and northwestern Mexico, known to scientists as the "North American monsoon."
The authors of the report, published Oct. 9, include Salvatore Pascale, an associate research scholar in atmospheric and oceanic sciences (AOS); Tom Delworth, a lecturer in geosciences and AOS and research scientist at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL); Sarah Kapnick, a 2004 Princeton alumna and former AOS postdoc who is currently a research physical scientist at GFDL; AOS associate research scholar Hiroyuki Murakami; and Gabriel Vecchi, a professor of geosciences and the Princeton Environmental Institute. When they corrected for persistent sea surface temperature (SST) biases and used higher-resolution data for the regional geography, the researchers created a model that accurately reflects current rainfall conditions and suggests that future changes could have significant consequences for regional water resources and hazards. "This study represents fundamental science relating to the physics of the North American monsoon, but feeds back onto weather-to-climate predictions and building resiliency for our water supply and responses to hazards," said Kapnick. "I am excited about this leap forward to improve our models and for the potential applications that they will provide in the future to society." Their results highlight the possibility of a strong precipitation reduction in the northern edge of the monsoon in response to warming, with consequences for regional water resources, agriculture and ecosystems. "Monsoon rains are critical for the southwest U.S. and northwest Mexico, yet the fate of the North American monsoon is quite uncertain," said Pascale, the lead author on the paper. "The future of the monsoon will have direct impacts on agriculture, on livelihoods." Previous general circulation models have suggested that the monsoons were simply shifting later, with decreased rains through July but increased precipitation in September and October. "The consensus had been that global warming was delaying the monsoon ...
which is also what we found with the simulation if you didn't correct the SST biases," Pascale said. "Uncontrolled, the SST biases can considerably change the response. They can trick us, introducing artefacts that are not real." Once those biases were corrected for, the researchers discovered that the monsoon is not simply delayed, but that the total precipitation is facing a dramatic reduction. That has significant implications for regional policymakers, explained Kapnick. "Water infrastructure projects take years to a decade to plan and build and can last decades. They require knowledge of future climate ... to ensure water supply in dry years. We had known previously that other broadly used global models didn't have a proper North American monsoon. This study addresses this need and highlights what we need to do to improve models for the North American monsoon and understanding water in the southwest." The new model also suggests that the region's famous thunderstorms may become less common, as the decreased rain is associated with increased stability in the lower-to-middle troposphere and weakened atmospheric convection. "The North American monsoon is also related to extreme precipitation events that can cause flash floods and loss of life," Kapnick said. "Knowing when the monsoon will start and predicting when major events will happen can be used for early warnings and planning to avoid loss of life and property damage. This paper represents the first major step towards building better systems for predicting the monsoon rains." The researchers chose to tackle the region in part because previous, coarser-resolution models had shown that this area would be drying out, a prediction that has been borne out in the droughts and wildfires of recent years.
But most of those droughts are attributed to the change in winter storms, said Pascale. "The storm track is projected to shift northward, so these regions might get less rain in winter, but it was very uncertain what happens to the monsoon, which is the other contributor to the rains of the region. We didn't know, and it's crucial to know," he said. In their model, the researchers were able to tease out the impacts of one factor at a time, which allowed them to investigate and quantify the monsoon response to the doubling of atmospheric carbon dioxide, increased temperatures and other individual changes. Pascale stressed the limits of this or any other climate model. "They need to be used with an understanding of their shortcomings and utilized to their expected potential but no further. They can give us quite reliable information about the large scale atmospheric circulation, but if you want to look at the regional, small-scale effects, you have to be very careful," he said. "Models are critical but they are not perfect, and small imperfections can lead to big misunderstandings." He continued: "We are not saying, 'We are sure that this is what will be,' but we wanted to point out some mechanisms which are key, and have to be taken into account in future research on the North American monsoon. This is a difficult region, so future research will point out if we were right, and to what extent."
Weather
2017
October 5, 2017
https://www.sciencedaily.com/releases/2017/10/171005103803.htm
Tracking debris in the Earth's orbit with centimeter precision using efficient laser technology
Uncontrollable flying objects in orbit are a massive risk for modern space travel and, due to our dependence on satellites today, also a risk to the global economy. A research team at the Fraunhofer Institute for Applied Optics and Precision Engineering IOF in Jena, Germany, has now developed a special fiber laser that reliably determines the position and direction of movement of space debris, helping to mitigate these risks.
Space debris is a massive problem in low Earth orbit space flight. Decommissioned or damaged satellites, fragments of space stations and other remnants of space missions pose a potential threat of collisions with active satellites and spacecraft every day. In addition to their destructive force, collisions also create additional risk by generating thousands of new pieces of debris, which in turn could collide with other objects -- a dangerous snowball effect. Today, the global economy depends to a substantial degree on satellites and their functions -- these applications are, for example, used in telecommunications, the transmission of TV signals, navigation, weather forecasting and climate research. The damage or destruction of such satellites through a collision with orbiting satellites or remains of rockets can cause immense and lasting damage. Therefore, the hazardous space debris needs to be reliably tracked and recorded before any salvaging or other counter-measures can be considered. Experts from Fraunhofer IOF in Jena have developed a laser system that is perfectly suited for this task. "With our robust and efficient system we can reliably and accurately determine the objects' exact position and direction of movement in orbit," explains Dr. Thomas Schreiber from the fiber lasers group at Fraunhofer IOF. "Laser systems like ours must be exceptionally powerful in order to withstand the extreme conditions in space -- in particular, the high physical strain on the carrier rocket during the launch, where the technology is subjected to very strong vibrations." In low Earth orbit, the high level of exposure to radiation, the extreme temperature fluctuations and the low energy supply are just as great obstacles to overcome. This necessitated the new development by the Jena research team, since common laser technologies are not able to cope with these challenges. Moreover, it is also necessary to analyze space debris over comparatively long distances.
For this purpose, the laser pulse propagates through a glass fiber-based amplifier and is sent on its kilometers-long journey. "Very short laser pulses, which last only a few billionths of a second, are shot at different positions in space to determine the speed, direction of motion and the rotational motion of the objects," explains Dr. Oliver de Vries. "With our laser system it is possible to shoot up to thousands of pulses per second. If an object is actually at one of the positions examined, part of the radiation is reflected back to a special scanner, which is directly integrated into the system. Even though the laser beam is very fast, it takes some time for the emitted light to get to the object and back again. This so-called 'time of flight' can then be converted into a distance and a real 3D coordinate accordingly." The system's sophisticated sensors, which collect the reflected light, can detect even a few billionths of the emitted light. The principle -- originally developed by the two researchers of Fraunhofer IOF for Jena-Optronik and the German Aerospace Centre (Deutsches Zentrum für Luft- und Raumfahrt, DLR) -- has already been successfully tested during a space transporter's docking maneuver at the International Space Station ISS. Previously, the laser system had been installed in a sensor of the Thuringian aerospace company Jena-Optronik GmbH and was launched in 2016 with the autonomous supply transporter ATV-5. Jena-Optronik's system also excels in energy efficiency: the fiber laser operates at a total power of less than 10 watts -- significantly less than a commercial laptop, for instance.
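The time-of-flight conversion mentioned in the quote is plain arithmetic: the pulse covers the range twice, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the example time value is invented for illustration):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance_m(round_trip_seconds):
    # The pulse travels out and back, so halve the total path length.
    return C * round_trip_seconds / 2.0

# A pulse returning after about 6.67 microseconds corresponds to roughly 1 km:
print(tof_to_distance_m(6.67e-6))
```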
Weather
2017
September 26, 2017
https://www.sciencedaily.com/releases/2017/09/170926125154.htm
Energy harvested from evaporation could power much of US
In the first evaluation of evaporation as a renewable energy source, researchers at Columbia University find that U.S. lakes and reservoirs could generate 325 gigawatts of power, nearly 70 percent of what the United States currently produces.
Though still limited to experiments in the lab, evaporation-harvested power could in principle be made on demand, day or night, overcoming the intermittency problems plaguing solar and wind energy. The researchers' calculations were outlined in September. "We have the technology to harness energy from wind, water and the sun, but evaporation is just as powerful," says the study's senior author Ozgur Sahin, a biophysicist at Columbia. "We can now put a number on its potential." Evaporation is nature's way of cycling water between land and air. Sahin has previously shown how this basic process can be exploited to do work. One machine developed in his lab, the so-called Evaporation Engine, controls humidity with a shutter that opens and closes, prompting bacterial spores to expand and contract. The spores' contractions are transferred to a generator that makes electricity. The current study was designed to test how much power this process could theoretically produce. One benefit of evaporation-harvested power is that it can be generated only when needed. Solar and wind power, by contrast, require batteries to supply power when the sun isn't shining and wind isn't blowing. Batteries are also expensive and require toxic materials to manufacture. "Evaporation comes with a natural battery," said study lead author Ahmet-Hamdi Cavusoglu, a graduate student at Columbia. "You can make it your main source of power and draw on solar and wind when they're available." Evaporation technology can also save water. In the study, researchers estimate that half of the water that evaporates naturally from lakes and reservoirs into the atmosphere could be saved during the energy-harvesting process.
In their model, that came to 25 trillion gallons a year, or about a fifth of the water Americans consume. States with growing populations and sunnier weather can best capitalize on evaporation's capacity to generate power and reduce water waste, in part because evaporation packs more energy in warm and dry conditions, the researchers say. Drought-prone California, Nevada and Arizona could benefit most. The researchers simplified their model in several ways to test evaporation's potential. They limited their calculations to the United States, where weather station data are readily accessible, and excluded prime locations such as farmland, rivers, the Great Lakes, and coastlines, to limit errors associated with modeling more complex interactions. They also made the assumption that technology to harvest energy from evaporation efficiently is fully developed. Klaus Lackner, a physicist at Arizona State University who was not involved in the study, expressed support for the team's findings. Lackner is developing artificial trees that draw carbon dioxide from the air, in part, by harnessing the power of evaporation. "Evaporation has the potential to do a lot of work," he said. "It's nice to see that drying and wetting cycles can also be used to collect mechanical energy." The researchers are working to improve the energy efficiency of their spore-studded materials and hope to eventually test their concept on a lake, reservoir, or even a greenhouse, where the technology could be used to simultaneously make power and limit water loss.
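As a back-of-envelope check on the figures quoted above (325 GW as roughly 70 percent of US generation; 25 trillion gallons as about a fifth of US water consumption), the implied national totals can be back-calculated. These derived totals are our arithmetic, not numbers from the study:

```python
# Back-calculate the totals implied by the article's figures.
potential_gw = 325       # estimated evaporation-harvested power, GW
share_of_us = 0.70       # stated fraction of current US production
implied_us_generation_gw = potential_gw / share_of_us
print(round(implied_us_generation_gw))   # ~464 GW

water_saved_gal = 25e12  # gallons per year saved in the model
share_of_use = 0.20      # "about a fifth" of US consumption
implied_us_use_gal = water_saved_gal / share_of_use
print(implied_us_use_gal / 1e12)         # ~125 trillion gallons per year
```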
Weather
2017
September 26, 2017
https://www.sciencedaily.com/releases/2017/09/170926091423.htm
Higher risk of heart failure in cold weather
Could decreases in temperature cause heart failure and death?
An increase in hospitalization and death in elderly patients with heart failure could be associated with changes in temperature and atmospheric pressure, according to a new study in Environment International. The authors of the study say the elderly with heart failure should avoid fog and low cloud in the winter as a preventive measure. Previous research has shown that changes in the weather can affect the health of vulnerable people -- for example, heat waves and cold spells have been shown to increase disease and even lead to death in people from low-income neighborhoods. The new study, led by researchers at Université Laval and Université de Sherbrooke in Quebec, Canada, reveals the impact of changes in temperature and air pressure on heart failure patients. "We know that doctors rarely take the weather forecast into account when treating or making recommendations to heart failure patients," said Prof. Pierre Gosselin, lead author of the study from Université Laval in Canada. "So with the extreme differences in temperature due to climate change, we wanted to show how the weather is becoming a more relevant factor. Our study shows that exposure to cold or high-pressure weather could trigger events leading to hospitalization or death in heart failure patients." Treating heart failure patients is expensive: according to the Institut Canadien d'Information sur la Santé, people over 65 accounted for 78 percent of patients with the most expensive hospitalization costs per diagnosis between 2011 and 2012 in Canada. Of these, the cost of heart failure ranked third and was estimated at CAN$276 million. In the new study, the team assessed 112,793 people aged 65 years and older that had been diagnosed with heart failure in Quebec between 2001 and 2011.
Patients with heart failure were identified in the Quebec Integrated Chronic Disease Surveillance System (QICDSS) database using the International Classification of Diseases (ICD). The participants were followed for an average of 635 days. During this time, the researchers measured the mean temperature, relative humidity, atmospheric pressure and air pollutants in the surrounding environment and studied the data to see if there was an association. The results showed a higher risk of hospitalization or death in the winter period of the year (October to April) compared to the summer period (May to September). The researchers noticed that the risk of hospitalization or death from heart failure increased by 0.7 percent for every 1°C decrease in the mean temperature of the previous seven days. They also found that the risk of a heart failure incident increased by 4.5 percent for each increase of 1 kPa in atmospheric pressure. In other words, a drop of 10°C in the average temperature over seven days, which is common in several countries because of seasonal variations, is associated with an increased risk of about 7 percent of being hospitalized or dying of heart failure in people aged over 65 diagnosed with the disease. During the follow-up period, 21,157 heart failure events occurred, representing 18.7 percent of the people studied. In total, 18,309 people were hospitalized and 4,297 died. In some cases, hospitalization and death occurred the same day. The researchers calculated this at 0.03 percent of patients experiencing an incident per day, which amounts to about 1,500 hospitalizations or deaths over a 10-year period, or 150 events per year. Prof. Gosselin and the team suggest that the elderly with heart failure should be given support and access to preventive measures, especially since managing heart failure is expensive for society.
He commented: "Our study suggests that exposure to cold or high-pressure weather could trigger events leading to hospitalization or death in heart failure patients. This means that they should avoid exposure to fog and low cloud weather in winter as they often accompany high pressure systems."
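The quoted rates scale as the article says: 0.7 percent per 1°C drop gives roughly 7 percent for a 10°C drop. Whether the underlying model compounds the relative risk multiplicatively or adds it linearly is an assumption in this sketch; at rates this small the two agree closely:

```python
per_degree = 0.007   # 0.7% higher risk per 1 °C drop in the 7-day mean temperature
degrees = 10         # the 10 °C drop discussed in the article

linear = per_degree * degrees                  # simple additive scaling
compounded = (1 + per_degree) ** degrees - 1   # multiplicative scaling
print(round(linear * 100, 1), round(compounded * 100, 1))  # 7.0 7.2
```

Both values land at the "about 7 percent" figure quoted for the 10°C drop, so the article's arithmetic is internally consistent either way.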
Weather
2017
September 25, 2017
https://www.sciencedaily.com/releases/2017/09/170925151434.htm
Antarctica: The wind sublimates snowflakes
Researchers have observed and characterized a weather process that was not previously known to occur in Antarctica's coastal regions. It turns out that the katabatic winds that blow from the interior to the margins of the continent reduce the amount of precipitation (mainly snowfall) -- which is a key factor in the formation of the ice cap. By forming a very dry layer of air in the first kilometer or so of atmosphere, the winds turn falling snowflakes directly from their solid state into water vapor in a process known as sublimation. The authors of this study used new data collected at the coast of Adélie Land over a yearlong period, together with simulations carried out using atmospheric models. They estimated that, across the continent, cumulative precipitation near the ground was 17% lower than its maximum higher in the atmosphere. Their measurements indicate that precipitation may be as much as 35% lower in the region around East Antarctica. The researchers believe that this phenomenon could be further aggravated by climate change. Their study has been published in
"Until now, the extent of this important process, which is largely undetectable by satellite, was not fully appreciated," explains Alexis Berne, corresponding author of the study and head of EPFL's Environmental Remote Sensing Laboratory (LTE). Berne worked with a team of Swiss, French and British researchers in 2015 and 2016, using a new combination of instruments to take measurements at the Dumont d'Urville French research station on the coast of East Antarctica. The team used three instruments: a Doppler dual-polarization weather radar, a weighing precipitation gauge and a radar profiler. The polarization radar collected information on the type and intensity of precipitation, while the precipitation gauge weighed the accumulated snowfall every minute and helped calibrate the two radars' estimates. These two instruments were used to collect data from November 2015 to January 2016. The third instrument -- the radar profiler -- has been continuously collecting vertical profiles of the intensity of precipitation up to three kilometers in altitude since November 2015 at Dumont d'Urville. At first, the researchers were surprised by the results they obtained. The sharp decline in precipitation recorded near the ground was not consistent with their usual observations. "We therefore worked off the hypothesis that the reduction in precipitation in the lower atmospheric levels was caused by snow crystals sublimating as a result of the katabatic winds," explains Christophe Genthon, CNRS Senior Scientist at the Institute of Geosciences of the Environment, based in Grenoble. These frequent, strong winds come from the continent's high plateaus. The Antarctic ice sheet is quite flat, so the winds can gain in strength and reach as far as the coast. This creates a thin bottom layer of air (up to 300 m) that is saturated with uplifted snow crystals. Above this, there is a second layer of air that is much drier.
Snowflakes formed in the cloud layer aloft sublimate when they pass through this second, drier layer, turning straight into water vapor. Over time, this reduces the contribution from precipitation to the ice sheet's mass balance. "This layer is in a blind zone for satellites because of the echoes from the surface, which explains why this phenomenon had not been detected by satellites," says Berne. The researchers then found evidence of katabatic winds capable of triggering sublimation in most of the data collected by radiosonde at permanent research stations across East Antarctica. Using a series of numerical atmospheric models and comparing the results with the measurements taken in Adélie Land, they were able to quantify the impact over the entire continent. And they discovered that the process of sublimation has a huge influence on the accumulation of precipitation. Data on the ice sheet's mass balance is essential for predicting how sea levels will rise or fall. Researchers generally expect global warming to result in higher levels of precipitation in Antarctica. But the impact of the katabatic winds on precipitation could challenge these forecasts and make them far more complicated. The team therefore plans to continue analyzing the continent. "We'd like to keep collecting data on coastal areas and look more closely at areas where the terrain is more complex. We also plan to use different types of atmospheric models for comparison purposes. Broadly speaking, we hope our work will help increase our understanding of how climate change will affect precipitation in Antarctica," explains Alexis Berne.
Weather
2017
September 22, 2017
https://www.sciencedaily.com/releases/2017/09/170922094027.htm
Winter cold extremes linked to high-altitude polar vortex weakening
When the strong winds that circle the Arctic slacken, cold polar air can escape and cause extreme winter chills in parts of the Northern Hemisphere. A new study finds that these weak states have become more persistent over the past four decades and can be linked to cold winters in Russia and Europe. It is the first to show that changes in winds high up in the stratosphere substantially contributed to the observed winter cooling trend in northern Eurasia. While how the Arctic under climate change impacts the rest of the world is still a subject of research, this study lends further support to the idea that a changing Arctic impacts the weather across large swaths of the Northern Hemisphere's population centers.
"In winter, the freezing Arctic air is normally 'locked' by strong circumpolar winds several tens of kilometers high in the atmosphere, known as the stratospheric polar vortex, so that the cold air is confined near the pole," says Marlene Kretschmer from PIK, lead author of the study. Despite global warming, recent winters in the Northeastern US, Europe and especially Asia were anomalously cold -- some regions like Western Siberia even show a downward temperature trend in winter. In stark contrast, the Arctic has been warming rapidly. Paradoxically, both phenomena are likely linked: when sea ice north of Scandinavia and Russia melts, the uncovered ocean releases more warmth into the atmosphere, and this can impact the atmosphere up to about 30 kilometers high in the stratosphere, disturbing the polar vortex. Weak states of the high-altitude winds circling the Arctic then favor the occurrence of cold spells in the mid-latitudes. Previous work by Kretschmer and colleagues identified this causal pathway in observational data, and it is further supported by several climate computer simulation studies. "Our latest findings not only confirm the link between a weak polar vortex and severe winter weather, but also calculated how much of the observed cooling in regions like Russia and Scandinavia is linked to the weakening vortex. It turns out to be most," says co-author Judah Cohen from Atmospheric and Environmental Research/Massachusetts Institute of Technology (US). "Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions." The effect is stronger over Asia and Europe than over the US. "It is very important to understand how global warming affects circulation patterns in the atmosphere," says co-author Dim Coumou from Vrije Universiteit Amsterdam, Netherlands.
"Jet Stream changes can lead to more abrupt and surprising disturbances to which society has to adapt. The uncertainties are quite large, but global warming provides a clear risk given its potential to disturb circulation patterns driving our weather -- including potentially disastrous extremes."
Weather
2017
September 20, 2017
https://www.sciencedaily.com/releases/2017/09/170920144658.htm
Wave Glider surfs across stormy Drake Passage in Antarctica
The Southern Ocean is key to Earth's climate, but the same gusting winds, big waves and strong currents that are important to ocean physics make it perilous for oceanographers.
Instead, their job is increasingly being given to ocean drones, the autonomous floating vehicles that collect data from the world's oceans. With an urgent need to better understand climate to predict how it will shift with more heat-trapping gases, scientists are developing new tools to measure waters below where satellites can penetrate, and in places that are too dangerous or expensive to reach regularly by research ship. They are also sending those instruments on increasingly ambitious missions.

Many of these new tools look like robotic fish, but the University of Washington sent a robotic surfboard to ride the waves collecting data from Antarctica to South America. The Wave Glider, a long-duration ocean robot designed to operate in stormy conditions and high latitudes, can stay at sea for months patrolling for illegal fishing, listening for seismic events, collecting weather or ocean data and monitoring the environment. Last December, UW researchers sent it out on a first-ever attempt to cross the terrifically turbulent waters of Drake Passage.

The currents circling Antarctica that pose a challenge to mariners also mix significant heat energy from all the world's oceans. Most of that mixing happens in the top few hundred feet, where winds and waves basically put the surface layer on a spin cycle.

"The Southern Ocean, and the Drake Passage in particular, are key locations that are historically under-sampled," said first author Jim Thomson, an oceanographer at the UW's Applied Physics Laboratory. "Using an autonomous platform allowed us to have persistence in the region, as well as track or target the fronts and gradients that make the place so interesting."

The recent paper in

The UW oceanographers used a commercial Wave Glider made by Liquid Robotics, a California-based subsidiary of the Boeing Co., to surf along the water's surface gathering observations.
The researchers added extra sensors for temperature, salinity, air pressure, humidity and wind to the commercial model. After a test run in summer 2016 off Washington's coast, the instrument was deployed off the Antarctic Peninsula in December. It spent about three months zigzagging its way across the fabled Drake Passage, while the researchers occasionally piloted the instrument remotely from shore.

As the study's authors wrote, this is where the strong Antarctic current becomes "a mess of swirling eddies" and meanders around its central path. "The zig-zag pattern in the middle of Drake Passage was designed to survey the strong fronts and meanders of the Antarctic Circumpolar Current common to that region," wrote Thomson and co-author James Girton, also with the UW's Applied Physics Laboratory.

A Wave Glider harnesses energy from the waves, using the shape of the water motion below the surface to drive the vehicle forward with minimal power. With wave energy for motion and solar panels charging batteries to power its sensors, the board can operate for months without maintenance. Even so, the late-summer sun so far south did not provide enough energy to recharge batteries late into the expedition, and a research ship retrieved the instrument and its data near Argentina in late March. Though the board didn't reach South America, the real goal was the data it collected.

"The mission just completed would have cost many millions of dollars to complete with a ship," Thomson said. "An autonomous approach allowed us to collect data that has never -- and would never have -- been collected in this remote region."

The authors are still processing the observations collected during the voyage, which was funded by the National Science Foundation, to understand mixing on different spatial scales.
They hope that future funding will allow another chance to collect more data and transition this program into regular annual monitoring of the Drake Passage. "It's not just about having done this successfully once, it's about learning how to make this routine. We do that, and we change the game on data collection in this important region," Thomson said.
Weather
2017
September 20, 2017
https://www.sciencedaily.com/releases/2017/09/170920100043.htm
Gravity waves influence weather and climate
Gravity waves form in the atmosphere as a result of destabilizing processes, for example at weather fronts, during storms or when air masses pass over mountain ranges. They can occasionally be seen in the sky as bands of cloud. For weather forecasting and climate models, however, they are mostly "invisible" due to their short wavelength. The effects of gravity waves can only be taken into consideration by including additional special components in the models. The "MS-GWaves" research unit funded by the German Research Foundation and led by Goethe University Frankfurt has meanwhile further developed such parameterizations and will test them in the second funding period.
Although gravity waves have comparatively short wavelengths of between just a few hundred metres and several hundred kilometres, at times they influence the transport of water vapour as well as large-scale winds and temperature distributions to a considerable degree. This effect is strongest in the upper layers of the atmosphere. These, in turn, have such a strong effect on the lower layers too that a realistic modelling of weather and climate in the atmosphere is impossible without giving due consideration to gravity waves. Gravity waves also play a significant role for air traffic in predicting turbulence and are an important factor in weather extremes, such as heavy rain or storms.In the first funding period, the ten research institutes participating in the project documented in detail the formation of gravity waves in one of the largest measuring campaigns ever undertaken, using radar, high-performance lasers, rockets and research planes as well as through laboratory tests. They also refined the hypothesis on the formation and dispersion of gravity waves to such an extent that their development can now be reproduced much more reliably in high-resolution numerical models too.In a further step, the research unit led by Professor Ulrich Achatz of the Department of Atmospheric and Environmental Sciences at Goethe University Frankfurt has used these findings to improve parameterizations, which serve to describe the influence of gravity waves, in weather and climate models with typically coarser resolution. They have refined the weather and climate model ICON used by Germany's National Meteorological Service (DWD) and the Max Planck Institute for Meteorology. The new model, UA-ICON, allows more precise predictions for the upper atmosphere and can be operated with different resolutions, so that gravity waves can either be simulated in it for test purposes or must be parameterized in the operational mode. 
The advanced parameterizations are now being integrated into this model and tested in the second funding period. The project will also focus on impacts on weather prediction and climate modelling. An important aspect in this context is a better description of the interaction between gravity waves and ice clouds (cirrus), undertaken in cooperation with the University of Mainz. It could well be that this plays an important role for the climate.
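As background for why waves of such short wavelengths still matter dynamically, the textbook dispersion relation for internal gravity waves in a stably stratified atmosphere (ignoring rotation and compressibility) can be evaluated directly. This is standard theory, not a formula from the article, and the buoyancy frequency and wavelengths below are illustrative values only:

```python
import math

# Textbook internal gravity wave dispersion relation (non-rotating,
# incompressible): omega^2 = N^2 * k^2 / (k^2 + m^2), with N the buoyancy
# frequency, k the horizontal and m the vertical wavenumber.
# Illustrative values, not data from the MS-GWaves campaigns.

def intrinsic_frequency(N, wavelength_h, wavelength_v):
    """Intrinsic frequency (rad/s) of a gravity wave of given wavelengths (m)."""
    k = 2 * math.pi / wavelength_h   # horizontal wavenumber
    m = 2 * math.pi / wavelength_v   # vertical wavenumber
    return N * k / math.sqrt(k**2 + m**2)

N = 0.02  # typical buoyancy frequency in the stratosphere, 1/s
f = intrinsic_frequency(N, wavelength_h=100e3, wavelength_v=5e3)
print(f)  # always below N: gravity waves oscillate slower than buoyancy
```

The frequency always stays below N, and shorter horizontal wavelengths oscillate faster, which is why resolving or parameterizing these scales changes the simulated momentum transport.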
Weather
2017
September 19, 2017
https://www.sciencedaily.com/releases/2017/09/170919160152.htm
End-of-summer Arctic sea ice extent is eighth lowest on record
Arctic sea ice appeared to have reached its yearly lowest extent on Sept. 13, NASA and the NASA-supported National Snow and Ice Data Center (NSIDC) at the University of Colorado Boulder have reported. Analysis of satellite data by NSIDC and NASA showed that at 1.79 million square miles (4.64 million square kilometers), this year's Arctic sea ice minimum extent is the eighth lowest in the consistent long-term satellite record, which began in 1978.
Arctic sea ice, the layer of frozen seawater covering much of the Arctic Ocean and neighboring seas, is often referred to as the planet's air conditioner: its white surface bounces solar energy back to space, cooling the globe. The sea ice cap changes with the season, growing in the autumn and winter and shrinking in the spring and summer. Its minimum summertime extent, which typically occurs in September, has been decreasing, overall, at a rapid pace since the late 1970s due to warming temperatures.

This year, temperatures in the Arctic have been relatively mild for such high latitudes, even cooler than average in some regions. Still, the 2017 minimum sea ice extent is 610,000 square miles (1.58 million square kilometers) below the 1981-2010 average minimum extent.

"How much ice is left at the end of summer in any given year depends on both the state of the ice cover earlier in the year and the weather conditions affecting the ice," said Claire Parkinson, senior climate scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "The weather conditions have not been particularly noteworthy this summer. The fact that we still ended up with low sea ice extents is because the baseline ice conditions today are worse than the baseline 38 years ago."

The three years with the lowest Arctic ice extents on record -- 2012, 2016 and 2007 -- experienced strong summer storms that hammered the ice cover and sped up its melt. "In all of those cases, the weather conditions contributed to the reduced ice coverage. But if the exact same weather system had occurred three decades ago, it is very unlikely that it would have caused as much damage to the sea ice cover, because back then the ice was thicker and it more completely covered the region, hence making it more able to withstand storms," Parkinson said.

On the other side of the planet, Antarctica is heading to its maximum yearly sea ice extent, which typically occurs in September or early October.
This year's maximum extent is likely to be among the eight lowest in the satellite record -- a dramatic turn of events considering that 2012, 2013 and 2014 all saw consecutive record high maximum extents, followed by a sudden large drop in 2015 and a further although smaller decrease in 2016. So far, the September Antarctic ice extents this year are comparable to those of a year ago.

"What had been most surprising about the changing sea ice coverage in the past three decades was the fact that the Antarctic sea ice was increasing instead of decreasing," Parkinson said. "The fact of Arctic sea ice decreases was not as shocking because this was expected with a warming climate, although the overall rate of the decreases was greater than most models had forecast."

Parkinson said that although it is still too early to talk about a long-term reversal in the behavior of Antarctic sea ice, the decreases witnessed in the past two years provide important data to test the various hypotheses that scientists have put forward to explain why Antarctic sea ice coverage had been increasing, overall, between 1979 and 2015.

Adding the Antarctic and Arctic sea ice extents month by month through the satellite record shows that globally the Earth has been losing sea ice since the late 1970s in each portion of the annual cycle of ice growth and decay. "In fact, this year, every single month from January through August experienced a new monthly record low in global sea ice extents," Parkinson said.
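The figures quoted above can be sanity-checked with a little arithmetic. This is a sketch of a unit check on the reported numbers, not NASA's analysis:

```python
# Unit check on the figures quoted above: the 2017 Arctic minimum extent and
# its departure from the 1981-2010 average minimum.

SQ_MILES_TO_SQ_KM = 2.58999

min_2017_mi = 1.79e6   # reported minimum, square miles
min_2017_km = 4.64e6   # reported minimum, square kilometers
deficit_km = 1.58e6    # below the 1981-2010 average, square kilometers

# The two reported minimum figures should agree after unit conversion.
converted = min_2017_mi * SQ_MILES_TO_SQ_KM
print(round(converted / 1e6, 2))   # ~4.64 million km^2

# Implied 1981-2010 average minimum extent:
avg_km = min_2017_km + deficit_km
print(avg_km / 1e6)                # 6.22 million km^2
```

The conversion confirms the square-mile and square-kilometer figures are consistent, and implies an average 1981-2010 minimum of about 6.22 million square kilometers.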
Weather
2017
September 19, 2017
https://www.sciencedaily.com/releases/2017/09/170919123123.htm
Changes in non-extreme precipitation may have not-so-subtle consequences
Extreme floods and droughts receive a lot of attention. But what happens when precipitation -- or lack thereof -- occurs in a more measured way?
Researchers have analyzed more than five decades of data from across North America to find that changes in non-extreme precipitation are more significant than previously realized. And the changes are greater than those that have occurred with extreme precipitation.

Non-extreme precipitation can have a strong effect on ecosystems, agriculture, infrastructure design and resource management, the scientists say, pointing to a need to examine precipitation in a more nuanced, multifaceted way.

"This study shows that everyday precipitation events -- not just the extremes that have been the focus of most studies -- are changing," said University of Illinois scientist Praveen Kumar, principal investigator of the National Science Foundation's (NSF) Intensively Managed Landscapes Critical Zone Observatory (CZO), one of nine such NSF CZOs.

"It's not just the amount of rainfall that's important," said Kumar, "it's the duration of that rainfall and the amount of time between rainfall and dry periods."

The study, published in

"We used data from more than 3,000 weather stations," said Roque-Malo. "There are a few other studies that use a similar methodology, but they have focused on smaller sections of the continent or parts of Europe."

The researchers identified several regions where the microclimate -- local climate determined by elevation and ecosystem -- appears to have a significant effect on precipitation trends.

"This study confirms that there is more to climate than the number and size of extreme events," said Richard Yuretich, CZO program director at NSF, which funded the research through its Division of Earth Sciences. "Shifts in the daily patterns of rainfall, sometimes subtle, also occur.
These can be very hard to document, but the existence of long-term monitoring sites provides the information needed to recognize trends and plan for the future."

In areas such as Oregon's Willamette Valley, the researchers observed decreases in the total annual precipitation, the number of days per year with precipitation, and the number of consecutive days with precipitation. The areas immediately surrounding the valley, however, had increases in those measures.

"Examples like this indicate that it may not be the best practice to make broad assumptions like 'all wet areas are becoming wetter and all dry areas are becoming drier,'" said Roque-Malo.

The observations have important implications for the resilience of ecosystems and for agriculture and water resource planning, the researchers say.

"Successive generations of ecosystems evolve through adaptation to these kinds of changes," said Kumar. "If the rate of change, however small, exceeds the adaptive capacity, these environments will be susceptible to collapse."

Added Roque-Malo, "Hydroelectric plants, storm water drainage systems -- any structure that relies on an assumption of expected precipitation -- could be vulnerable as we look toward becoming more climate-resilient."

Although current models may not be able to resolve the small but steady changes observed in this study, the researchers hope their work will inform and provide validation criteria for future models and assessments.
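The metrics discussed above (annual total, days with precipitation, consecutive wet days) are straightforward to compute from a daily station record. A minimal sketch, with invented data and a 1 mm wet-day threshold as an assumption (the study's exact definitions may differ):

```python
# Sketch of daily-station precipitation metrics of the kind the study
# examines. The data and the 1 mm wet-day threshold are illustrative
# assumptions, not values from the 3,000+ stations the authors used.

def precipitation_metrics(daily_mm, wet_threshold=1.0):
    """Annual total, wet-day count, and longest wet spell from daily totals."""
    wet = [p >= wet_threshold for p in daily_mm]
    best = run = 0
    for w in wet:                 # longest run of consecutive wet days
        run = run + 1 if w else 0
        best = max(best, run)
    return {"total_mm": sum(daily_mm),
            "wet_days": sum(wet),
            "max_consecutive_wet": best}

year = [0.0, 5.2, 3.1, 0.0, 0.2, 12.0, 8.5, 0.0, 1.0, 0.0]  # made-up days
print(precipitation_metrics(year))
```

Trends in each of these quantities, station by station, are what distinguish "non-extreme" precipitation change from a simple shift in annual totals.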
Weather
2017
September 18, 2017
https://www.sciencedaily.com/releases/2017/09/170918092852.htm
Devilish source of dust in atmosphere of Earth and Mars
Swirling columns of sand and dust, known as dust devils, are a feature of desert areas on Mars and on Earth. Now, a study of terrestrial dust devils has shown that around two thirds of the fine particles lifted by these vortices can remain suspended in the atmosphere and be transported around the globe. The findings have implications for the climate and weather of both planets and, potentially, human health here on Earth. Results will be presented by Dr Jan Raack of the Open University at the European Planetary Science Congress (EPSC) 2017 in Riga, Latvia on Monday, 18th September 2017.
The study by Raack and an international team of collaborators gives important insights into the contribution of dust devils to mineral aerosols in planetary atmospheres. About half of the dust lifted into the martian atmosphere each year is thought to come from dust devils. However, to date, the structure of these vortices has not been well understood. As terrestrial dust devils act very similarly to those on Mars, Raack and colleagues have carried out multiple field campaigns over the past five years to study dust devils in three different deserts on Earth, in China, Morocco and the USA. The researchers took samples of grains lifted by dust devils at different heights, studied tracks left by dust devils on the surface and measured physical and meteorological properties of dust devils.

Raack explains: "The method for sampling is simple -- although not actually that pleasant to carry out as it involves getting sandblasted. Essentially, we cover a 5-metre aluminium pipe with double sided sticky tape and run into an active dust devil. We hold the boom upright in the path of a dust devil and wait until the dust devil passes over the boom. Numerous grains are collected on the sticky tape, which are preserved on-site by pressing sections of the tape from different heights onto glass slides."

Back in the lab, the glass slides are analysed under an optical microscope and all grains measured and counted to gain detailed relative grain size distributions of the sampled dust devils. The results presented at EPSC 2017 focus on samples taken during field campaigns in the south and southwest of Morocco, funded by Europlanet and supported by the Ibn Battuta Centre in Marrakesh.

"We found that the dust devils we measured have a very similar structure, despite different strengths and dimensions. The size distribution of particles within the dust devils seems to correspond to the distribution of grain sizes in the surface they passed over.
We have been able to confirm the presence of a sand-skirt -- the bottom part of the dust devil with a high concentration of larger sand grains -- and most particles were only lifted within the first metre. However, the decrease in grain diameter with height is nearly exponential," says Raack.

In the terrestrial dust devils, the team found that around 60-70% of all the fine dust particles (with diameters up to three hundredths of a millimetre) appear to stay in suspension. These small mineral aerosols can be transported over long distances on Earth and have an influence on the climate and weather. They can also reach populated areas, affecting air quality and human health. On arid Mars, where most of the surface is desert-like and the dust content is much higher, the impact is even larger.

Further analysis of the datasets will include meteorological measurements of the dust devils that will be used to interpret data obtained by landers and rovers on Mars, including the Curiosity rover and the upcoming ExoMars and InSight lander missions.
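The "nearly exponential" decrease in grain diameter with height can be sketched with a simple decay model. The initial diameter and scale height below are invented for illustration; they are not the authors' fitted values:

```python
import math

# Illustrative model (an assumption, not the authors' fit): mean grain
# diameter decreasing exponentially with height inside a dust devil,
# d(h) = d0 * exp(-h / H), with scale height H.

def grain_diameter(h_m, d0_mm=0.5, scale_height_m=1.0):
    """Mean grain diameter (mm) at height h_m metres above the surface."""
    return d0_mm * math.exp(-h_m / scale_height_m)

# Most sand-sized grains stay within the first metre (the "sand skirt"):
for h in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(h, round(grain_diameter(h), 4))
```

With any exponential profile of this form, diameters drop off fastest near the surface, which is consistent with the sand skirt the team observed in the first metre.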
Weather
2017
September 15, 2017
https://www.sciencedaily.com/releases/2017/09/170915162954.htm
Black Sea water temperatures may buck global trend
Using a model developed at the JRC, scientists have successfully simulated the Black Sea's long term currents, salt water content and temperature for the first time.
Average surface temperatures of the Black Sea may not have risen, according to the surprising results of a new study from the JRC. The study used a model to simulate possible temperature changes and predict long-term trends in the Black Sea's hydrodynamics. While the surface showed no long-term warming trend, the same simulations also indicated that average temperatures at 50 metres below the surface may be rising.

The Black Sea has unique natural conditions like a positive net freshwater balance and very specific local currents. Observational data on temperature change is varied and scarce. As such it is not clear what the impacts of climate change have been on Black Sea water temperatures.

The Sea has undergone significant ecological degradation since the 1970s, due largely to pollution, overfishing and natural climatic variations. Mapping trends in its ecosystem and simulating future scenarios is vital to understand how the Sea's properties may develop in the future as a result of climate change and policy decisions.

The simulations in this study, covering five decades, show no significant long-term trend in the Black Sea's average surface water temperature. This lack of a trend is an entirely new result based on a long-term simulation that had not previously been successfully conducted. The simulation was run for the full period from 1960 to 2015 and the results were checked against known data, both from satellite information available over the last twenty years and less complete data from earlier decades.

Prior to the completion of this study, scientists had relied on sparse surface temperature data from ship cruises to understand the Sea's properties in the earlier decades. However, the few data points that do exist for this period have not been enough to prove a decisive trend.
In fact, in the decade between 1966 and 1975 there is practically no observational data available at all.

The results of the simulation, while filling in the gaps, also came as a surprise to scientists who were expecting to see at least some warming trend between 1960 and 2015. The results also come in direct contrast to previous simulations of the nearby Mediterranean Sea, which is getting warmer.

Scientists were also surprised to find a significant decreasing trend in surface salt content of 0.02% per year, again in direct contrast to the increasing surface salinity found in the Mediterranean. The simulations found no individual correlation between salinity and wind speed/direction, or indeed with an increase in fresh water input from the many rivers running into the Black Sea. This suggests that combinations of weather conditions are responsible for the trend.

Furthermore, the study identifies three distinct periods in which there was a significant shift in the salt water and temperature properties of the Black Sea -- 1960-1970, 1970-1995 and 1995-2015. This may be related to changes in the Sea's currents, as the periods were also characterised by significant changes from a weak and disintegrated current circulation in the first period, to a strong main 'Rim Current' circulation in the second and third periods. Over the full simulation period, the strengthening of this circulation can be seen, accompanied by intensified formation of small, localised eddies running against the current.

The Earth is getting warmer, but this isn't happening uniformly across the planet and some regions are heating faster than others.
So whilst the Black Sea might not be strongly affected, this is likely compensated by other regions which are warming at a faster rate than the globe as a whole. For example, the sea-surface waters near Texas when Hurricane Harvey roared toward Houston were among the Earth's warmest.

While it might sound like good news that there has been no long-term increase in the surface water temperatures of the Black Sea, this does not mean that it is unaffected by global warming. These effects may be hidden or mitigated by the fact that air temperature in the region is warming. Indeed, the study also looked at average temperature trends at specific depths and found a positive trend at 50 metres below the surface, which suggests a warming of the deeper waters prior to the surface layer.

Several unique natural conditions of the Black Sea are well known: the so-called 'Rim Current' of strong water circulation around the perimeter of the Sea; the Cold Intermediate Layer (CIL) of waters below surface level; and the high level of anoxic water, which makes up over 90% of the basin's deep water volume. However, current knowledge about the spatial and temporal dynamics of the Black Sea's salt content and temperature characteristics is far more limited, due to a lack of data and sparse distribution of existing measurements.

With this knowledge gap in mind, researchers at the JRC sought to develop a model able to reproduce the Black Sea's temperature and salt-water content over the long term. During the study, carried out as part of the EU's Scenario simulations of the changing Black Sea ecosystem (SIMSEA) project, continuous simulations covering the period 1960-2015 were performed. The researchers used a high-resolution General Estuarine Transport Model (GETM) with specific equations designed to accurately reproduce the main physical features of the Black Sea without relying on scarce observational and climatological data.

The study is the first successful Black Sea model based on GETM and has
been published in the Journal of Geophysical Research. The successful validation and long-term application of the model used will be incorporated into future forecasts and simulations.
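Trend figures like the reported 0.02% per year decline in surface salinity come from fitting a straight line to a simulated time series. A minimal least-squares sketch on synthetic data (the series below is invented; only the idea of the fit is taken from the study):

```python
# Sketch of an ordinary least-squares trend fit, as one would apply to a
# simulated surface-salinity series. The salinity values are synthetic;
# the study itself reports a decline of about 0.02% per year.

def linear_trend(years, values):
    """OLS slope of values against years (units of value per year)."""
    n = len(years)
    ym = sum(years) / n
    vm = sum(values) / n
    num = sum((y - ym) * (v - vm) for y, v in zip(years, values))
    den = sum((y - ym) ** 2 for y in years)
    return num / den

years = list(range(1960, 1970))
salinity = [18.0 - 0.004 * (y - 1960) for y in years]  # made-up values
print(round(linear_trend(years, salinity), 4))
```

The same fit applied decade by decade would also expose the three distinct regime periods (1960-1970, 1970-1995, 1995-2015) the study identifies.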
Weather
2017
September 11, 2017
https://www.sciencedaily.com/releases/2017/09/170911150943.htm
Clouds like honeycomb
Polygons are widespread in nature: Drying mud may crack into many-sided blocks, and bees shape honeycomb into regular, six-sided cells. Hexagons also appear in broad sheets of clouds across parts of Earth's oceans, and now a team of researchers has used a network approach to analyze why. Their work promises to help scientists to find more accurate descriptions of clouds in computer models of weather and climate change.
Large decks of stratocumulus clouds self-organize into honeycomb-like patterns. "These types of clouds cool the planet by reflecting solar radiation but their description in climate models is still rather crude," said lead author Franziska Glassmeier. She found that she could use a relatively simple mathematical model to re-create the cloud patterns, which are shaped in nature by a complex interplay of physical processes. The new paper, co-authored by NOAA scientist and CIRES Fellow Graham Feingold, is published in the journal

Since the first satellite images in the early 60s, scientists have recognized that stratocumulus clouds -- carpet-like, low clouds often draped across large sections of subtropical oceans -- look like imperfect honeycombs. Sometimes the cells are "closed," with cloudy areas in the cells surrounded by cloud-free rings; and sometimes they are "open," with cloud-free cells surrounded by cloudy rings. The pattern constantly changes as cells emerge, disappear, and re-arrange.

The researchers ran highly detailed simulations of clouds to capture the precise air movements that create these patterns: in general, where air moves up it creates cloudy regions, and where it descends, cloud-free regions form. They then applied a mathematical technique called Voronoi tessellation to translate air movements into a network of polygonal tiles. The simple mathematical model developed by Glassmeier and Feingold re-creates this pattern. "It is like creating a dynamic mosaic with specific rules that allow one to replace different patches with a new set of tiles over and over again," Glassmeier explains. Their model offers a fundamental explanation for the structure and dynamics of stunning stratocumulus cloud decks.

And perhaps more importantly, the network analysis technique can help to produce more accurate clouds in computer models. "Clouds still represent a significant uncertainty in our climate projections," said Feingold.
"Our hope is that this novel cellular network approach will lead to new ways of looking at the cloud parameterization problem."
Weather
2017
September 11, 2017
https://www.sciencedaily.com/releases/2017/09/170911150930.htm
Urban climate change
Southern cities such as Houston and Tampa -- which faced the wrath of hurricanes Harvey and Irma, respectively -- may not be the only urban environments vulnerable to extreme weather. Northern cities also face the potential for flooding as global temperatures continue to warm.
In fact, higher temperatures have been found to disproportionately affect northern land areas, particularly the Arctic, which has already experienced fallout from climate change.

A new study by a group of international researchers, including UC Santa Barbara's Joe McFadden, combines observations and modeling to assess the impact of climate and urbanization on the hydrological cycle across the distinct seasons in four cold climate cities in Europe and North America. Their findings appear in the journal

"In general, the amount of precipitation is increasing but also the kind of precipitation is changing," said McFadden, an associate professor in UCSB's Department of Geography. "While more precipitation may fall in a year, it arrives as rain rather than snow because temperatures are rising. A shorter period covered by snow, more spring rain and faster snow melt can combine to release large amounts of runoff that have the potential to stress urban hydrologic systems and cause flooding in urban areas."

The scientists used measurements taken in Minneapolis-St. Paul, Minnesota; Montreal, Canada; Basel, Switzerland; and Helsinki, Finland. Lead author Leena Järvi of the University of Helsinki coupled those with an urban hydrological model -- the Surface Urban Energy and Water balance Scheme (SUEWS) -- to perform a multiyear analysis. The investigators found that after snow melt, urban runoff returns to being strongly controlled by the proportion of built-up versus vegetated surfaces, which can absorb water.
However, in winter, the presence of snow masks this influence. Basel had more than 80 percent impermeable surfaces, whereas the American site -- a first-ring suburb in Minneapolis-St. Paul -- had the lowest amount of impermeable surface, about 10 percent.

"Combining measurements and modeling in this way is very valuable because it gives us a starting point to compare different cities, gradations between the city and the suburbs or changes in the city as it grows over time," McFadden said. "Once we understand how that works, that knowledge is portable and can be used to understand other problems."

According to McFadden, not only does this analysis demonstrate that wintertime climate can be important for northern cities, it also shows the effects in terms of flood risks. However, he noted, how this plays out within each city is a complex interaction.

"We showed that the model accurately represents what we measured in cities, so now we can use it to conduct sensitivity studies, where only a single variable -- the percentage of the city covered by impervious versus pervious materials -- changes," he said. "Then we can examine how the melt of the snow and the runoff changes in light of the percentage of each city's impervious surface. This is really important because it helps us understand how the built environment of the city modifies the effects of global climate factors."
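The impervious-fraction sensitivity McFadden describes can be sketched with a simple mixed-surface runoff model. This is a bucket-style illustration only, far simpler than the SUEWS model the study actually used, and all coefficients are invented:

```python
# Bucket-style sketch (an assumption, not SUEWS): event runoff over a city
# as a mix of impervious and vegetated responses. Runoff coefficients are
# illustrative values, not from the study.

def event_runoff(precip_mm, impervious_frac,
                 runoff_coef_imp=0.9, runoff_coef_veg=0.2):
    """Runoff (mm) for one precipitation event over a mixed surface."""
    return precip_mm * (impervious_frac * runoff_coef_imp +
                        (1 - impervious_frac) * runoff_coef_veg)

# A Basel-like site (~80% impervious) vs a Minneapolis-St. Paul-like
# first-ring suburb (~10% impervious), for a 20 mm event:
print(round(event_runoff(20.0, 0.8), 1))  # mm of runoff
print(round(event_runoff(20.0, 0.1), 1))  # mm of runoff
```

Varying only `impervious_frac` while holding the event fixed mirrors the single-variable sensitivity studies described in the article, though in the real model snow cover suppresses this contrast in winter.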
Weather
2017