Dataset columns: Date (string, 11-18 chars), Link (string, 62 chars), Title (string, 16-148 chars), Summary (string, 1-2.68k chars), Body (string, 22-13k chars), Category (string, 20 classes), Year (int64, 2000-2020).
March 13, 2017
https://www.sciencedaily.com/releases/2017/03/170313160839.htm
Looking for 'fingerprints' at the intersection of weather and climate
Scientists have found the seasonal "fingerprints" of Arctic sea ice, El Nino, and other climate phenomena in a new study that probes the global interactions between weather and climate.
Although the terms are often used interchangeably, weather and climate are different. Weather can be reasonably well predicted up to a week in advance, and is characterized by its daily dynamics. Climate changes more slowly, which is why you wouldn't plan a ski trip to Wyoming for August, or a beach vacation in Cape Cod in February. Yet there is variability within climate. A particularly snowy winter in New England one year might be followed by almost no snow the next year.

Certain climate processes, such as the El Nino Southern Oscillation or the annual cycle of Arctic sea ice cover, are intertwined with both weather and climate. They operate on seasonal time scales, but their variability and strength reside in the cumulative effect of fast-moving daily weather. They also have the capacity to create big problems for public safety and regional economies.

Using concepts from the statistical mechanics of the microscopic world, researchers from Yale University and the Centre for Mathematical Sciences at the University of Cambridge have created a mathematical framework to better explain such phenomena. It is a departure from previous theories in that it uses formulas that explicitly embrace a wide range of time variation.

"Treating the fastest and slowest time scales in a system that is itself temporal by its very nature allowed us to see how daily weather fluctuations accumulate to impact the predictability on seasonal time scales. In other words, the seasons 'remember' the weather," said John Wettlaufer, the A.M. Bateman Professor of Geophysics, Mathematics, and Physics at Yale, and co-author of the study, published March 13 in the journal
Weather
2017
March 13, 2017
https://www.sciencedaily.com/releases/2017/03/170313160827.htm
Rapid decline of Arctic sea ice a combination of climate change and natural variability
Arctic sea ice in recent decades has declined even faster than predicted by most models of climate change. Many scientists have suspected that the trend now underway is a combination of global warming and natural climate variability.
A new study finds that a substantial chunk of summer sea ice loss in recent decades was due to natural variability in the atmosphere over the Arctic Ocean. The study, from the University of Washington, the University of California Santa Barbara and federal scientists, was published March 13.

"Anthropogenic forcing is still dominant -- it's still the key player," said first author Qinghua Ding, a climate scientist at the University of California Santa Barbara who holds an affiliate position at the UW, where he began the work as a research scientist in the UW's Applied Physics Laboratory. "But we found that natural variability has helped to accelerate this melting, especially over the past 20 years."

The paper builds on previous work by Ding and other UW scientists that found changes in the tropical Pacific Ocean have in recent decades created a "hot spot" over Greenland and the Canadian Arctic that has boosted warming in that region. The hot spot is a large region of higher pressure where air is squeezed together so it becomes warmer and can hold more moisture, both of which bring more heat to the sea ice below. The new paper focuses specifically on what this atmospheric circulation means for Arctic sea ice in September, when the ocean reaches its maximum area of open water.

"The idea that natural or internal variability has contributed substantially to the Arctic sea ice loss is not entirely new," said second author Axel Schweiger, a University of Washington polar scientist who tracks Arctic sea ice. "This study provides the mechanism and uses a new approach to illuminate the processes that are responsible for these changes."

Ding designed a new sea ice model experiment that combines forcing due to climate change with observed weather in recent decades. The model shows that a shift in wind patterns is responsible for about 60 percent of sea ice loss in the Arctic Ocean since 1979. Some of this shift is related to climate change, but the study finds that 30-50 percent of the observed sea ice loss since 1979 is due to natural variations in this large-scale atmospheric pattern.

"What we've found is that a good fraction of the decrease in September sea ice melt in the past several decades is most likely natural variability. That's not really a surprise," said co-author David Battisti, a UW professor of atmospheric sciences. "The method is really innovative, and it nails down how much of the observed sea ice trend we've seen in recent decades in the Arctic is due to natural variability and how much is due to greenhouse gases."

The long-term natural variability is ultimately thought to be driven by the tropical Pacific Ocean. Conditions in the tropical Pacific set off ripple effects, and atmospheric waves snake around the globe to create areas of higher and lower air pressure. Teasing apart the natural and human-caused parts of sea ice decline will help to predict future sea ice conditions in Arctic summer. Forecasting sea ice conditions is relevant for shipping, climate science, Arctic biology and even tourism. It also helps to understand why sea ice declines may be faster in some decades than others.

"In the long term, say 50 to 100 years, the natural internal variability will be overwhelmed by increasing greenhouse gases," Ding said. "But to predict what will happen in the next few decades, we need to understand both parts."

What will happen next is unknown. The tropical Pacific Ocean could stay in its current phase or it could enter an opposite phase, causing a low-pressure center to develop over Arctic seas that would temporarily slow the long-term loss of sea ice due to increased greenhouse gases.

"We are a long way from having skill in predicting natural variability on decadal time scales," Ding said.
Weather
2017
March 13, 2017
https://www.sciencedaily.com/releases/2017/03/170313135011.htm
Democrats and Republicans draw different conclusions when seasons are too hot or too cold
When the weather is unseasonably hot or cold, Americans across the political spectrum have even stronger views about whether climate change caused by human activity is a reality or not. Republicans are then less likely to conform to the scientific consensus on global warming, while Democrats are much more likely to do so. This is according to the findings of Jeremiah Bohr of the University of Wisconsin Oshkosh in the US, published in Springer's journal
It is well known that people with conservative leanings are more prone than others to deny the existence or severity of human-induced global warming. This divide exists among political elites as well as the American public. In the current analysis, Bohr wanted to find out if people's particular political orientations and beliefs about global warming changed at all during periods of so-called temperature anomalies, when temperatures above or below the normal are experienced.

Bohr used data from two sources in his study. Data concerning people's beliefs about global warming and the social setting they find themselves in came from four nationally representative CBS/New York Times surveys of American adults, collected in February 2013, March 2013, February 2014 and May 2014. These months represent moments when different regions in the US experienced temperatures either five degrees Fahrenheit above or below the average temperature for the previous three decades. Bohr then merged the survey data with state-specific monthly temperature averages collected by the National Oceanic and Atmospheric Administration's National Center for Environmental Information.

His models indicate that temperature anomalies exacerbate existing political polarization over what causes global warming. This is especially so when unseasonal temperatures at least five degrees Fahrenheit above or below the established five-year baseline are experienced. Democrats are more likely in such cases to attribute global warming to human activity. Republicans, on the other hand, are less likely to conform to the scientific consensus on global warming during very cold or very warm periods.

When breaking down Republican identity between those who do and do not support the Tea Party movement, Bohr further found that both kinds of Republicans converge in their global warming beliefs during extreme temperature anomalies, but diverge during more seasonable temperature conditions.

"This would be consistent with the elite cues hypothesis, in that we would expect political leaders who deny anthropogenic global warming to claim victory during unseasonably cold periods or amplify their denial during unseasonably warm periods that invite challenge to their worldview," says Bohr. He is not surprised that political polarization over global warming beliefs increases during unseasonable temperature anomalies. "These are precisely the locations and moments when partisan differences over the role of human activity in global warming may resonate most," adds Bohr.
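The merge step described above -- joining survey responses to state-month temperature records and flagging anomaly months -- can be sketched in a few lines. The states, months, temperatures and survey rows below are invented for illustration; only the five-degrees-Fahrenheit threshold comes from the article.

```python
# Hypothetical sketch: flag each survey row as falling in an "anomaly"
# month when the monthly mean ran at least 5 degrees F above or below
# the long-term baseline for that state. All values here are invented.
ANOMALY_THRESHOLD_F = 5.0

# (state, "YYYY-MM") -> (monthly mean F, 30-year baseline F)
temps = {
    ("WI", "2014-02"): (12.0, 21.5),
    ("TX", "2014-05"): (78.0, 75.0),
}

# Illustrative survey rows, not real CBS/New York Times data
surveys = [
    {"state": "WI", "month": "2014-02", "believes_anthropogenic": False},
    {"state": "TX", "month": "2014-05", "believes_anthropogenic": True},
]

for row in surveys:
    mean_f, baseline_f = temps[(row["state"], row["month"])]
    row["anomaly"] = abs(mean_f - baseline_f) >= ANOMALY_THRESHOLD_F

print([row["anomaly"] for row in surveys])  # -> [True, False]
```

With the anomaly flag attached, belief rates can then be compared across anomaly and non-anomaly months by party, which is the comparison Bohr's models formalize.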
Weather
2017
March 10, 2017
https://www.sciencedaily.com/releases/2017/03/170310091950.htm
Floods and hurricanes predicted with social media
Social media can warn us about extreme weather events before they happen -- such as hurricanes, storms and floods -- according to new research by the University of Warwick.
Nataliya Tkachenko, with her supervisors in the Department of Computer Science, has found that photographs and key words posted online can signal weather risks developing in specific locations and times -- for example, posts about water levels rising can alert the authorities to a potential flood.

Tracking certain words used in social media posts around the time of an extreme weather event -- such as water and river when there is a flood risk -- allows information to be collated to accurately predict which areas will be affected, and how big the impact will be to infrastructure and human life. The researchers tracked photos and videos with tags such as river, water and landscape on the social media platform Flickr between 2004 and 2014.

Whilst these words can be used to generally describe natural scenery, researchers found that in certain time periods before the peak of extreme weather events -- and in the locations where they occurred -- these words took on a distinct meaning of forecast and warning, showing the weather worsening. These risk-signalling words can act as 'social sensors', which when used alongside physical meteorological sensors can help to improve the prediction and monitoring of the behaviour and severity of an evolving weather event in multiple areas.

Physical sensors -- such as flood monitors -- have been used traditionally to detect extreme weather events, but their scope is limited, and they cannot accurately cover each specific area which may be affected in the same way that social media can. Social media is currently used as an effective tool for 'now-casting' -- providing eye-witness accounts of ongoing events -- but has not yet been harnessed for predicting large-scale events which are still developing. Using social media and physical meteorological sensors together would create an early warning system for extreme weather events of unprecedented accuracy and efficacy.

Nataliya Tkachenko, from the Warwick Institute for the Science of Cities, comments: "Our analysis demonstrates that metadata in social media image postings enables them to be used as 'social sensors', which can serve as a valuable supplement to instrument-based systems for predicting and monitoring floods, and other kinds of natural hazards. The opportunities represented by these new data sources are truly exciting as they can help to protect homes, save lives and design more resilient cities!"

The research, 'Predicting floods with Flickr tags', has been published. Nataliya is a PhD student in the EPSRC-funded Urban Science Centre for Doctoral Training.
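The 'social sensor' idea above can be illustrated with a toy spike detector over tagged posts. The tag set (river, water, flood) comes from the article, but the post records and the threefold spike rule are invented for illustration and are not the paper's actual method.

```python
from collections import Counter

RISK_TAGS = {"river", "water", "flood"}

def daily_tag_counts(posts):
    """Count risk-signalling tags per day from (day, tags) post records."""
    counts = Counter()
    for day, tags in posts:
        counts[day] += sum(1 for t in tags if t in RISK_TAGS)
    return counts

def flag_spikes(counts, days, factor=3.0):
    """Flag a day whose risk-tag count exceeds `factor` times the prior
    day's count -- a crude stand-in for a rising 'social sensor' signal."""
    flagged = []
    for prev, cur in zip(days, days[1:]):
        if counts[cur] > factor * max(counts[prev], 1):
            flagged.append(cur)
    return flagged

posts = [  # toy Flickr-style records: (day, tags)
    ("d1", ["landscape", "river"]),
    ("d2", ["water", "river", "flood", "water"]),
    ("d2", ["river", "water"]),
]
counts = daily_tag_counts(posts)          # d1 -> 1, d2 -> 6
print(flag_spikes(counts, ["d1", "d2"]))  # -> ['d2']
```

In a real system such a signal would be fused with physical sensor readings rather than used alone, as the article emphasizes.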
Weather
2017
March 9, 2017
https://www.sciencedaily.com/releases/2017/03/170309084827.htm
Additional Arctic weather data raises forecast accuracy of cold snaps in Japan
A research team consisting of members from Japan's National Institute of Polar Research, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), and other organizations conducted forecasting simulations of the cold waves that hit Japan and the North American East Coast in February 2015. Results showed that additional data collected that year through more frequent observation of meteorological conditions in the Arctic's upper atmosphere from both land-based research stations and the research vessel
In recent years, extreme winter weather events such as heavy snowfalls and severe winters have been occurring frequently in regions such as East Asia, North America and Europe. For example, Japan was experiencing a mild 2014/2015 winter when in February the winter pressure pattern strengthened. With it came record level snowfalls, to the Hokuriku region in particular. Memories are still fresh of the extreme record-setting -15°C cold wave that hit the North American East Coast a week later, which brought with it significant impacts on the region's people, transportation systems and economy.

To minimize the destructive effects of these extreme winter weather events, accurate forecasting of cold waves flowing in from the Arctic as early as possible is imperative. To that end, it is vastly preferable to have as much meteorological observation data as possible. However, as acquiring such data involves significant personnel and economic cost, its effectiveness needs to be ascertained.

An international research group led by Dr. Kazutoshi Sato and Dr. Jun Inoue of NIPR, and Dr. Akira Yamazaki of JAMSTEC, conducted experimental simulated forecasts of the 2015 cold waves that hit Japan on February 9 and the North American East Coast on February 16. Larger than usual amounts of Arctic winter weather data were collected in 2015, in part through observations from a Norwegian research vessel. Results of the simulations clearly showed that the additional data collected in 2015 allowed for a significantly more accurate understanding of the dynamics occurring in the center of the cold winter air masses that develop in the upper atmosphere above the Arctic Ocean (the polar vortex), and of the initial conditions in the atmosphere from which extreme winter weather events arise. More precise assessment of the initial conditions preceding extreme weather events is indispensable in making accurate forecast calculations.

Associate Professor Inoue of the research team states: "This indicates that meteorological observation in the Arctic can help reduce the impact of extreme winter weather events in mid-latitude areas with concentrated populations. It is anticipated that Japan will continue to contribute actively to meteorological observation in the Arctic."
Weather
2017
March 7, 2017
https://www.sciencedaily.com/releases/2017/03/170307100337.htm
Climate study: More intense and frequent severe rainstorms likely
A University of Connecticut climate scientist confirms that more intense and more frequent severe rainstorms will likely continue as temperatures rise due to global warming, despite some observations that seem to suggest otherwise.
The findings appear in a research paper published this week.

"We hope this information puts things in better perspective and clarifies the confusion around this issue," says Wang, who led an international team of climate experts in conducting the study. "We also hope this will lead to a more accurate way of analyzing and describing climate change."

Climate scientists and policymakers closely monitor severe and prolonged rainstorms as they can have a devastating impact on local environments and economies. These damaging storms can cause catastrophic flooding; overwhelm sewage treatment plants; increase the risk of waterborne disease; and wipe out valuable crops.

Current climate models show most of the world will experience more intense and more frequent severe rainstorms for the remainder of the 21st century, due to hotter temperatures caused by global warming. But whether this increase in extreme precipitation will continue beyond the end of the century, and how it will be sustained, is less clear.

Meteorological observations from weather stations around the globe show the intensity of severe rainstorms relative to temperature follows a curve -- steadily going up as low to medium surface temperatures increase, peaking when temperatures hit a certain high point, then dropping off as temperatures continue rising. Those observations raise the prospect that damaging rainstorms could eventually ease once surface temperatures reach a certain threshold.

However, Wang says the peaks seen in the observational data and climate models simply reflect the natural variability of the climate. As Earth warms, her team found, the entire curve representing the relationship between extreme precipitation and rising temperatures is moving to the right. This is because the threshold temperature at which rain intensity peaks also goes up as temperature rises. Therefore, extreme rainfall will continue to increase, she says.

The relationship between precipitation and temperature is founded in science. Simply put, warmer air holds more moisture. Scientists can even tell you how much. A widely used theorem in climate science called the Clausius-Clapeyron equation dictates that for every degree the temperature goes up, there is an approximately 7 percent increase in the amount of moisture the atmosphere can hold. The intensity of extreme precipitation, which is proportional to atmospheric moisture, also increases at a scaling rate of approximately 7 percent, in the absence of moisture limitations.

The problem is that when scientists ran computer models predicting the likelihood of extreme precipitation in the future, and compared those results with both present day observations and the temperature scaling dictated by the so-called "C-C equation," the numbers were off. In many cases, the increase in extreme precipitation relative to surface temperature over land was closer to 2 to 5 percent, rather than 7 percent.

In their analysis, Wang's team discovered that average local surface temperatures increase much faster than the threshold temperatures for extreme precipitation, and attributed the lower scaling rate to the fact that earlier studies compared extreme precipitation with average local temperatures rather than the temperature at the time the rainstorms occurred.

"There are a lot of studies where people are trying to determine why the scaling rate is lower than 7 percent," says Wang. "Our study suggests that this is a wrong question to ask. If you want to relate rain intensity to temperature using the C-C relationship as a reference, you have to relate to the temperature at which the rain event occurs, not the mean temperature, which is the long term average."

Kevin Trenberth, an expert on global warming and the lead author of several reports prepared by the Intergovernmental Panel on Climate Change, joined Wang in the current study.
Trenberth is currently a Distinguished Senior Scientist in the Climate Analysis Section at the National Center for Atmospheric Research. He shared the 2007 Nobel Peace Prize with former Vice President Al Gore as a member of the IPCC. Trenberth explains the findings this way: "In general, extreme precipitation increases with higher temperatures because the air can hold more moisture -- although that depends on moisture availability. But beyond a certain point, it is the other way round: the temperature responds to the precipitation, or more strictly speaking, the conditions leading to the precipitation, [such as extensive cloud cover or surface moisture]. The most obvious example of this is in a drought where there is no precipitation. Another example is in cloudy, stormy conditions, when it is wet and cool. By relating the changes in precipitation to the temperature where the relationship reverses -- instead of the mean temperature as in previous studies -- we can make sense of the differences and the changes. Moreover, it means there is no limit to the changes that can occur, as otherwise might be suspected if there were a fixed relationship."
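The roughly 7 percent per degree figure can be reproduced from the August-Roche-Magnus approximation to the Clausius-Clapeyron relation. This is a standard textbook formula, not taken from the paper; a minimal sketch:

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Saturation vapour pressure over liquid water (hPa), via the
    August-Roche-Magnus approximation to Clausius-Clapeyron."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

# Fractional gain in moisture-holding capacity per 1 C of warming:
# roughly 6-8 percent, falling slowly as the base temperature rises.
for t in (0, 10, 20, 30):
    rate = saturation_vapor_pressure(t + 1) / saturation_vapor_pressure(t) - 1
    print(f"{t:2d} C -> {rate:.1%} per degree")
```

This temperature dependence of the scaling rate is also why the authors stress relating rain intensity to the temperature at which the event occurs, rather than to a long-term mean.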
Weather
2017
March 6, 2017
https://www.sciencedaily.com/releases/2017/03/170306122124.htm
Flashy first images arrive from NOAA's GOES-16 lightning mapper
Detecting and predicting lightning just got a lot easier. The first images from a new instrument onboard NOAA's GOES-16 satellite are giving NOAA National Weather Service forecasters richer information about lightning that will help them alert the public to dangerous weather.
The first lightning detector in a geostationary orbit, the Geostationary Lightning Mapper (GLM), is transmitting data never before available to forecasters. The mapper continually looks for lightning flashes in the Western Hemisphere, so forecasters know when a storm is forming, intensifying and becoming more dangerous. Rapid increases of lightning are a signal that a storm is strengthening quickly and could produce severe weather.

During heavy rain, GLM data will show when thunderstorms are stalled or if they are gathering strength. When combined with radar and other satellite data, GLM data may help forecasters anticipate severe weather and issue flood and flash flood warnings sooner. In dry areas, especially in the western United States, information from the instrument will help forecasters, and ultimately firefighters, identify areas prone to wildfires sparked by lightning.

Accurate tracking of lightning and thunderstorms over the oceans, too distant for land-based radar and sometimes difficult to see with satellites, will support safe navigation for aviators and mariners. The new mapper also detects in-cloud lightning, which often occurs five to 10 minutes or more before potentially deadly cloud-to-ground strikes. This means more precious time for forecasters to alert those involved in outdoor activities of the developing threat.

NASA successfully launched GOES-R at 6:42 p.m. EST on November 19, 2016 from Cape Canaveral Air Force Station in Florida, and it was renamed GOES-16 when it achieved orbit. GOES-16 is now observing the planet from an equatorial view approximately 22,300 miles above the surface of the Earth.

NOAA's satellites are the backbone of its life-saving weather forecasts. GOES-16 will build upon and extend the more than 40-year legacy of satellite observations from NOAA that the American public has come to rely upon.
Weather
2017
March 3, 2017
https://www.sciencedaily.com/releases/2017/03/170303091341.htm
NASA study improves forecasts of summer Arctic sea ice
The Arctic has been losing sea ice over the past several decades as Earth warms. However, each year, as the sea ice starts to melt in the spring following its maximum wintertime extent, scientists still struggle to estimate exactly how much ice they expect will disappear through the melt season. Now, a new NASA forecasting model based on satellite measurements is allowing researchers to make better estimates.
Forecasts of how much Arctic sea ice will shrink from spring into fall are valuable information for such communities as shipping companies and native people that depend on sea ice for hunting. Many animal and plant species are impacted directly by changes in the coverage of sea ice across the Arctic. Uncertain weather conditions through spring and summer make the forecasting of Arctic sea ice for a given year extremely challenging.

With data from satellites, which have been measuring sea ice in the Arctic since 1979, scientists can easily calculate the downward trend in Arctic sea ice. To make forecasts of how the Arctic sea ice cover might behave in the upcoming year, researchers have several options. The simplest approach is to assume a continuation of the long-term trend into the current year. The problem with this approach is that it will miss outliers -- years when the sea ice cover will be a lot higher or lower than expected. Another option is to analyze the physical characteristics of the sea ice cover as the melt season develops, to try to more precisely estimate if the amount of sea ice come September will be more or less than expected from the long-term trend.

"What we have shown is that we can use information collected in the spring and onwards to determine if we should see more or less ice come the end of summer than expected from the long-term decline," said Alek Petty, lead author of the new paper, which was published on February 27.

The study used satellite measurements of sea ice coverage and melt onset. Petty's team found that the forecasts based on melt onset -- the time at which sea ice starts to melt and open water appears in the Arctic Ocean -- were most reliable in early spring, while sea ice coverage-based predictions were more reliable from June onwards. The forecasts focus specifically on regions that historically corresponded with how much sea ice remains come the September minimum extent. The predictions become more accurate with each passing month, as the model integrates more near-real-time information about sea ice melt and the distribution of open water areas across the Arctic Ocean and surrounding seas.

To test whether their model produced reliable forecasts, Petty's team went back in time and made predictions for each year of the satellite record, using historical data of the Arctic sea ice conditions. They then evaluated the results against both the actual minimum extent for that year and what the long-term trend would have predicted.

"We found that our forecast model does much better than the linear trend at capturing what actually happened to the sea ice in any specific year," Petty said. "Our model is very good at catching the highs and the lows. The absolute values? Not exactly, but it tends to do very well at seeing when the sea ice extent is going to go up and when it's going to go down compared to what we might be expecting for that year."

Petty's research also showed that models can produce reliable forecasts of sea ice not only for the whole Arctic, but for specific regions, notably the Beaufort and Chukchi seas north of Alaska.

"The state of sea ice has a large impact on the Alaskan hunting communities," Petty said. "If they know ahead of time what the sea ice cover is going to be like that year, they might be able to infer the availability of the species they hunt."

Future research will explore synthesizing different sea ice measurements into the same model to improve the reliability of the forecasts, Petty said.
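The "linear trend" baseline that Petty's model is judged against can be sketched in a few lines: fit a straight line to past September extents and extrapolate. The extents below are made-up illustrative numbers, not real satellite values.

```python
def linear_trend_forecast(years, extents, target_year):
    """Ordinary least-squares fit of extent on year, extrapolated forward --
    the simple baseline approach described in the article."""
    n = len(years)
    my = sum(years) / n
    me = sum(extents) / n
    slope = (sum((y - my) * (e - me) for y, e in zip(years, extents))
             / sum((y - my) ** 2 for y in years))
    return me + slope * (target_year - my)

# Illustrative September extents (million sq km), not real observations
years = [2010, 2011, 2012, 2013, 2014]
extents = [4.9, 4.6, 3.6, 5.1, 5.0]
print(round(linear_trend_forecast(years, extents, 2015), 2))  # -> 4.85
```

As the article notes, this baseline misses outlier years by construction, which is exactly where a forecast informed by in-season melt-onset and coverage data can do better.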
Weather
2017
March 2, 2017
https://www.sciencedaily.com/releases/2017/03/170302090827.htm
A new way of assessing winter driving conditions and associated risks
A new study, published today in the
In countries like Canada that have severe winter seasons, transportation agencies often face challenges in meeting the safety and mobility needs of people on the road. To address these challenges, most agencies have a comprehensive winter maintenance program in place that includes policies, best practices, and guidelines for monitoring and reporting of road surface conditions. Typically, road surface condition information is broadcast through a traveler information portal known as a 511 system or the website of the road agency. However, there is a lack of consistency in defining and determining the winter driving conditions of a highway across different transportation agencies and jurisdictions. Additionally, different terms may represent different levels of travel risk depending on the agency and location.

"The main goal of our study is to develop and propose a new approach to road surface condition classification that provides consistency in the communication of the driving risk that a motorist may experience," says Dr. Lalita Thakali, Research Associate at the University of Waterloo.

In this study, researchers from the Department of Civil & Environmental Engineering at the University of Waterloo propose a risk-based approach for classifying road surface conditions that could be used for monitoring winter driving conditions and directing winter road maintenance operations. The researchers propose a relative risk index on the basis of the risk estimated using a collision model calibrated with detailed hourly data of weather, road surface conditions, traffic and accidents on a large number of highway sections in Ontario over six winter seasons.

The study proposed two alternative approaches to address the challenge of determining the overall condition of a highway section or route with non-uniform driving conditions. The first approach applies a risk model to estimate the relative increase in risk under specific winter weather and road surface conditions as compared to normal conditions. The second approach involves converting the different classes of road conditions observed on any given route into a single dominant class based on the relative risk between individual classes of road conditions. This could help drivers assess the road conditions of their entire trip or route.

"An ideal classification system for the public should be one that is simple, intuitive, and consistent," continues Dr. Thakali. The risk-based approach to road condition classification introduced in this research represents one step closer towards such an ideal classification system. Further research could look into the feasibility of developing a universal risk index that is applicable across different regions in Canada.

The paper, "A risk-based approach to winter road surface condition classification" by Liping Fu, Lalita Thakali, Tae J. Kwon and Taimur Usman, was published today in the
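The second approach -- collapsing a route with mixed conditions into one dominant class using relative risk -- might look like the following sketch. The class names and risk weights here are hypothetical, since the article does not give the paper's calibrated values.

```python
# Hypothetical relative-risk weights per road surface class, with bare
# pavement as the 1.0 baseline (illustrative numbers only).
RELATIVE_RISK = {"bare": 1.0, "partly snow covered": 2.5, "snow covered": 4.0}

def dominant_class(segments):
    """Collapse a route's mixed conditions into a single dominant class:
    the class contributing the most length-weighted relative risk."""
    risk_by_class = {}
    for length_km, cond in segments:
        risk_by_class[cond] = (risk_by_class.get(cond, 0.0)
                               + length_km * RELATIVE_RISK[cond])
    return max(risk_by_class, key=risk_by_class.get)

# A 45 km route: mostly bare, but the snowy stretch dominates the risk
route = [(30.0, "bare"), (10.0, "snow covered"), (5.0, "partly snow covered")]
print(dominant_class(route))  # prints "snow covered"
```

Weighting by risk rather than by length is what lets a short but hazardous stretch drive the label reported to motorists.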
Weather
2017
March 1, 2017
https://www.sciencedaily.com/releases/2017/03/170301084933.htm
Highest temperatures recorded for Antarctic region
The World Meteorological Organization has announced new verified record high temperatures in Antarctica, an area once described as "the last place on Earth." The temperatures range from the high 60s (in Fahrenheit) to the high teens, depending on where in Antarctica they were recorded.
Knowledge and verification of such extremes are important in the study of weather patterns, naturally occurring climate variability and human induced change at global and regional scales, said Randy Cerveny, an Arizona State University professor of geographical science and urban planning and the Rapporteur of Climate and Weather Extremes for the WMO."The temperatures we announced today are the absolute limit to what we have measured in Antarctica," Cerveny said. "Comparing them to other places around the world and seeing how other places have changed in relation to Antarctica gives us a much better understanding of how climate interacts, and how changes in one part of the world can impact other places."Because Antarctica is so vast (it is roughly the size of the United States) and varied the WMO committee of experts, convened by Cerveny, provided three temperature measurements for the Antarctic.The highest temperature for the "Antarctic region" (defined by the WMO and the United Nations as all land and ice south of 60-deg S) of 19.8 C (67.6 F), which was observed on Jan. 30, 1982 at Signy Research Station, Borge Bay on Signy Island.The highest temperature for the Antarctic Continent, defined as the main continental landmass and adjoining islands, is the temperature extreme of 17.5 C (63.5 F) recorded on Mar. 24, 2015 at the Argentine Research Base Esperanza located near the northern tip of the Antarctic Peninsula.The highest temperature for the Antarctic Plateau (at or above 2,500 meters, or 8,200 feet) was -7 C (19.4 F) made on Dec. 28, 1989 at an automatic weather station site D-80 located inland of the Adelie Coast.The Antarctic is cold, windy and dry. The average annual temperature ranges from -10 C on its coasts to -60 C (14 F to -76 F) at the highest points in the interior. 
Its immense ice sheet is about 4.8 km (3 miles) thick and contains 90 percent of the world's fresh water, enough to raise sea levels by around 60 meters (200 feet) if it were all to melt.

Cerveny said that observing the extremes the polar regions are experiencing can provide a better picture of the planet's interlinked weather system. "The polar regions of our planet have been termed the 'canary' in our global environment," Cerveny said. "Because of their sensitivity to climate changes, sometimes the first influences of changes in our global environment can be seen in the north and south polar regions. Knowledge of the weather extremes in these locations therefore becomes particularly important to the entire world. The more we know of this critically important area to our environment, the more we can understand how all of our global environments are interlinked."

Cerveny said an additional benefit is understanding how those extremes were achieved. "In the case of the Antarctic extremes, two of them were the result of what are called 'foehn' winds -- what we call Chinook winds -- very warm downslope winds that can very rapidly heat up a place. These winds are found even here in the United States, particularly along the front range of the Rockies. The more we learn about how they vary around the world, the better we can understand them even here in the United States."

Full details of the Antarctic high temperatures and their assessment are given in the on-line issue of
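As a quick sanity check, the three record values quoted above convert between Celsius and Fahrenheit exactly as reported (F = C * 9/5 + 32). A minimal Python check:

```python
# Verify the Celsius-to-Fahrenheit conversions quoted for the three
# Antarctic records (F = C * 9/5 + 32).
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

records = {
    "Antarctic region (Signy Island, 1982)": (19.8, 67.6),
    "Antarctic continent (Esperanza, 2015)": (17.5, 63.5),
    "Antarctic plateau (site D-80, 1989)": (-7.0, 19.4),
}

for name, (celsius, fahrenheit) in records.items():
    converted = round(c_to_f(celsius), 1)
    assert converted == fahrenheit  # matches the value quoted in the article
    print(f"{name}: {celsius} C = {converted} F")
```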
Weather
2017
February 28, 2017
https://www.sciencedaily.com/releases/2017/02/170228131007.htm
Declining Arctic sea ice influences European weather, but isn't a cause of colder winters
The dramatic loss of Arctic sea ice through climate change is unlikely to lead to more severe winter weather across Northern Europe, new research has shown.
A pioneering new study has explored how Arctic sea-ice loss influences the North Atlantic Oscillation (NAO) weather phenomenon, which affects winter weather conditions in Northern Europe, in places such as the UK, Scandinavia and the Baltic states.

Previous studies have suggested that Arctic sea-ice loss causes the NAO to spend longer in its 'negative phase' -- generating more easterly winds that bring colder air from Scandinavia and Siberia to the UK. This might be expected to cause more frequent cold winters, such as the deep freeze experienced in the UK in the winter of 2009/2010. However, the new study, carried out by Dr James Screen from the University of Exeter, crucially suggests that Arctic sea-ice loss does not cause colder European winters. Dr Screen suggests this surprising result is due to a 'missing' cooling response -- meaning that the expected cooling brought about by more easterly winds is offset by the widespread warming effects of Arctic sea-ice loss. The study is published in a leading science journal.

Dr Screen, a Senior Lecturer in Mathematics at the University of Exeter, said: "We know that the NAO is an important factor in controlling winter weather over Northern Europe. The negative phase of the NAO is typically associated with colder winters. Because of this it has been reasonable to think that we would experience more severe winter weather if Arctic sea-ice loss intensifies the negative phase of the NAO. This research indicates that although sea-ice loss does intensify the negative NAO, bringing more days of cold easterly winds, it also causes those same winds to be warmer than they used to be. These two competing effects cancel each other out, meaning little change in the average temperature of European winters as a consequence of sea-ice loss."

The NAO phenomenon describes large-scale changes in atmospheric wind patterns over the North Atlantic.
Importantly, the NAO relates to changes in the strength and position of the North Atlantic jet stream -- a band of very fast winds high in the atmosphere. The position of the jet stream has a substantial impact on weather in Northern Europe. Using the sophisticated UK Met Office climate model, Dr Screen conducted computer experiments to study the effects of Arctic sea-ice loss on the NAO and on Northern European winter temperatures.

Dr Screen added: "Scientists are eager to understand the far-flung effects of Arctic sea-ice loss. On the one hand this study shows that sea-ice loss does influence European wind patterns. But on the other hand, Arctic sea-ice loss does not appear to be a cause of European temperature change, as some scientists have argued."

The research was funded through a grant from the Natural Environment Research Council (NERC).
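A station-based NAO index of the kind referred to above is commonly computed as the difference of standardized sea-level pressure anomalies between a southern station (Azores region) and a northern station (Iceland). The sketch below illustrates that calculation with invented pressure values -- it is not data from the study:

```python
import numpy as np

# Illustrative only: a station-based NAO index is often the difference of
# standardized sea-level pressure (SLP) anomalies between a southern station
# (e.g. the Azores) and a northern one (Iceland). Values below are invented.
azores_slp = np.array([1022.1, 1018.4, 1025.0, 1019.7, 1023.3])   # hPa
iceland_slp = np.array([998.2, 1006.5, 994.1, 1004.8, 996.9])     # hPa

def standardize(x):
    """Anomalies divided by the standard deviation (zero mean, unit spread)."""
    return (x - x.mean()) / x.std()

nao_index = standardize(azores_slp) - standardize(iceland_slp)

# Positive values -> stronger westerlies (mild, stormy N. European winters);
# negative values -> weaker westerlies and more cold easterly outbreaks.
for winter, value in enumerate(nao_index, start=1):
    phase = "positive" if value > 0 else "negative"
    print(f"winter {winter}: NAO = {value:+.2f} ({phase})")
```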
Weather
2017
February 22, 2017
https://www.sciencedaily.com/releases/2017/02/170222150256.htm
'Atmospheric rivers' associated with California flooding also common in the southeast
Much of the flood-inducing rainfall that has pummeled California over the last month flowed into the region via a river in the sky. But these so-called atmospheric rivers, which transport large quantities of water vapor poleward from the tropics, can wreak havoc in the Southeast as well.
University of Georgia geography and atmospheric sciences researchers provide the first detailed climatological analysis of Southeastern atmospheric rivers in a new study published in the International Journal of Climatology.

"Drought-busting rainfall this month, which also compromised the U.S.'s tallest dam, has been associated with these atmospheric rivers or what some call the Pineapple Express," said J. Marshall Shepherd, Georgia Athletic Association Distinguished Professor and director of the atmospheric sciences program at UGA. "Our study shows that they are more common than we thought in the Southeast, and it is important to properly understand their contributions to rainfall given our dependence on agriculture and the hazards excessive rainfall can pose."

While atmospheric rivers vary in size and shape, those that contain large amounts of water vapor and strong winds, and that stall over watersheds vulnerable to flooding, can create extreme rainfall and floods. Such events can disrupt travel, lead to mudslides and cause catastrophic damage to life and property.

The study was constructed from climatological data on atmospheric river events from 1979 to 2014. The research emerged from a graduate seminar on atmospheric rivers developed by UGA Distinguished Research Professor Tom Mote, one of the authors of the study.

"A large portion of atmospheric river research has focused on the western coast of the United States, so the goal of this study was to improve our understanding of atmospheric river events throughout the Southeast," said Neil Debbage, doctoral candidate in the UGA department of geography and first author on the study. "Overall, we found that atmospheric rivers in the Southeast were fairly common and most prevalent during the winter," Debbage said.
"However, the specific number and nature of atmospheric rivers differed between various sub-regions of the Southeast."

Major flooding in Nashville in 2010 was associated with atmospheric river activity, but not all atmospheric rivers cause damage, as many are weak and can provide much-needed precipitation.

"A better understanding of when atmospheric river events occur in the Southeast, their basic characteristics, and the weather patterns conducive for their development will lead to improved forecasting and community awareness of atmospheric rivers in the region," Debbage said.
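Atmospheric rivers are typically identified from vertically integrated water vapor transport (IVT); a threshold of roughly 250 kg m^-1 s^-1 is common in the literature, though the article does not state which criterion the UGA study used. A sketch with invented profile values:

```python
import numpy as np

# Sketch of a common atmospheric-river criterion: vertically integrated
# vapor transport (IVT) exceeding ~250 kg m^-1 s^-1. Profile values are
# invented; real studies use reanalysis winds (u, v) and specific
# humidity (q) on pressure levels.
g = 9.81                                           # gravity, m s^-2
p = np.array([1000, 925, 850, 700, 500]) * 100.0   # pressure levels, Pa
q = np.array([0.012, 0.010, 0.008, 0.004, 0.001])  # specific humidity, kg/kg
u = np.array([15.0, 18.0, 20.0, 22.0, 25.0])       # zonal wind, m/s
v = np.array([10.0, 12.0, 14.0, 15.0, 16.0])       # meridional wind, m/s

def trapezoid(y, x):
    """Trapezoidal integral of y over x (x may be decreasing)."""
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

# IVT components: -(1/g) * integral of q*u (and q*v) with respect to pressure
ivt_u = -trapezoid(q * u, p) / g
ivt_v = -trapezoid(q * v, p) / g
ivt = float(np.hypot(ivt_u, ivt_v))

print(f"IVT = {ivt:.0f} kg m^-1 s^-1")
print("atmospheric-river conditions" if ivt >= 250 else "below AR threshold")
```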
Weather
2017
February 22, 2017
https://www.sciencedaily.com/releases/2017/02/170222102524.htm
How do migratory birds respond to balmier autumns?
Around the world, no matter where we are, we can usually expect the weather to change from one season to the next. In North America, the warm days of summer eventually turn into the cooler days of autumn, and these changes are vital to many of the animals that inhabit the region because they trigger the urge to prepare for winter. Migratory animals, like songbirds, use these predictable weather changes as environmental cues to tell them when it's time to migrate south. But with the planet warming year after year, birds can no longer rely on a once-predictable climate. As autumns become milder, ornithologists have been pondering how this could affect birds' migratory decisions. Now, a new paper published this week in an online journal
The study, led by Adrienne Berchtold from the Advanced Facility for Avian Research at the University of Western Ontario, focused on one songbird species that is known to rely on weather for its migratory journey: the white-throated sparrow. The bird migrates from Canada to the southern United States each autumn, and it tends to migrate later than other migrants, basing its journeys on when the weather provides opportunities for flight.

To figure out the underlying pressures that drive the birds to migrate, the researchers captured white-throated sparrows during one autumn migration and placed them in specially designed bird cages equipped with high-tech monitoring gear that kept track of how active the birds were by day and night. The scientists then changed the room temperatures throughout the experiment to see how the birds would react. When the temperature dropped to a chilly 4°C, mimicking typical fall conditions in the northern part of the flyway, the birds all became restless at night, signifying they were in a migratory state. When, in turn, the temperature was raised to a warm 24°C, none of the birds showed signs of migratory restlessness, indicating they were under no pressure to depart in such balmy conditions.

These results have considerable implications for the future of migration, as this and other bird species rely on predictable weather changes to leave home for the season. In North America, the continuing trend of rising autumn temperatures could delay the birds' migration. Another, more drastic, possibility is that the birds would stay put and not migrate at all.
In fact, a recent paper in this same journal found this very pattern is happening in the population of American Robins of North America, who are increasingly deciding not to migrate.

According to Andrew Farnsworth, a Research Associate at the Cornell Lab of Ornithology who studies bird migration, "This type of research gives us more of the clues that scientists need to understand how birds respond, and might respond in the future, to changes in environmental conditions they experience. Considering these findings in light of previous research on nocturnal migratory restlessness from the mid to late 20th century, and more importantly, recent research on fuel accumulation and photoperiodicity, these results add to our growing understanding of how birds migrate and even how their migration evolved. Furthermore, given the predicted changes in global temperatures from human activities, these findings highlight the potential for dramatic changes to movements for many migratory species."
Weather
2017
February 17, 2017
https://www.sciencedaily.com/releases/2017/02/170217100056.htm
Local weather impacts melting of one of Antarctica's fastest-retreating glaciers
Local weather plays an important part in the retreat of the ice shelves in West Antarctica, according to new research published in the journal
The study, led by scientists at the University of East Anglia (UEA), used a unique five-year record from Pine Island Glacier (PIG) to examine how the interactions between the ocean and the atmosphere, as well as changing currents, control how heat is transported to, and beneath, the Pine Island Ice Shelf. Pine Island Glacier is one of the fastest-melting glaciers in Antarctica, with some studies suggesting that its eventual collapse is almost inevitable.

Previous research suggested more warm water was circulating under the ice shelf and melting it more rapidly, leading to an increasing contribution to sea level rise. However, relatively little was known about what drives changes in ocean conditions in this remote part of Antarctica due to its inaccessibility. Some studies suggested that the ocean conditions close to Pine Island Glacier are influenced most strongly by winds at the edge of the continental shelf, some 400 km to the north, which in turn respond to changes in tropical ocean temperatures. The new study found the impact of these shelf-edge winds to be less direct than previously thought: local atmospheric conditions and ocean circulation were the main drivers of ocean temperature changes in the critical 350-700m depth range over the period of observation.

Dr Ben Webber, oceanographer at UEA's School of Environmental Sciences, said: "The ice shelves of the Amundsen Sea -- an area of the Southern Ocean -- protect much of the West Antarctic Ice Sheet from collapse. These ice shelves are rapidly losing mass, and understanding the mechanisms which control ocean conditions and drive melting of these glaciers is hugely important."

"We found a strong annual cycle in the exchange of heat between the ocean and the atmosphere, which drives changes in ocean temperature.
While these changes are less evident in deeper waters, through convection and mixing the heat can penetrate deeply enough to have a major impact on melting and influence the temperature of the water entering the cavity under the glacier."

"There was a colder weather period from 2012-13; however, a separate study has shown that this only led to a partial slowdown of the glacier's retreat, and many glaciers in the region have been retreating for decades and aren't slowing down."

Changes in the direction of the ocean currents also cause changes in temperature close to Pine Island Glacier. The colder period was associated with a reversal in the currents that transport heat into and around the bay.

Co-author Dr Povl Abrahamsen, oceanographer at British Antarctic Survey, said: "Most of the ocean data around Antarctica are snapshots of conditions -- and many areas are only visited once every one or two years, if that. A continuous five-year time series near Pine Island Glacier, one of the fastest-melting glaciers in Antarctica, lets us see what is happening between these snapshots, giving us insights into the processes driving the melting of Pine Island Glacier."

Dr Webber continued: "It is likely that other ice shelves around Antarctica that are melting due to warm ocean conditions will also be strongly influenced by local atmospheric conditions. This would underline the importance of atmospheric and ocean monitoring close to the Antarctic coasts to give early warning of future changes in ice shelf melting and glacial retreat."

The research was carried out as part of the Natural Environment Research Council (NERC)-funded iSTAR Programme, in collaboration with US and Korean partners, using data from ship-based and atmospheric observations, including ship-deployed oceanographic moorings.
Weather
2017
February 15, 2017
https://www.sciencedaily.com/releases/2017/02/170215121115.htm
Scientists report ocean data from under Greenland's Petermann Glacier
In August 2015, University of Delaware oceanographer Andreas Muenchow and colleagues deployed the first UD ocean sensors underneath Petermann Glacier in North Greenland, which connects the great Greenland ice sheet directly with the ocean.
Petermann Glacier is the second largest floating ice shelf in the northern hemisphere. Located approximately 16 to 2,300 feet below the glacier, the five ocean sensors are connected to a weather station at the surface, creating the first cabled observatory on a floating, moving, and rapidly melting Greenland glacier.

The researchers recently reported their results in a journal article. Specifically, the paper found that the same water that has been measured in the fjord is under the glacier, lending credence to the idea that the continuity of the glacier depends on the conditions outside the glacier in the fjord. This water is warming an average of 0.03 degrees Celsius per year, with temperatures at the deepest ocean sensors sometimes exceeding 0.3 degrees Celsius (about 32.5 degrees Fahrenheit), Muenchow said. These temperature values are consistent at various water depths, and match data from a 2003-09 study in adjacent Nares Strait, which connects to both the Arctic and Atlantic Oceans.

"This correlation tells us this is the same water and that this is what's causing the melting of the glacier, which could influence sea level rise," said Muenchow, an associate professor of oceanography in UD's School of Marine Science and Policy, which is housed in the College of Earth, Ocean, and Environment (CEOE). The scientists theorize that warmer Atlantic water will continue to arrive inside Petermann Fjord and below the ice shelf from Nares Strait in the next one to two years.
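The ~0.03 degrees Celsius-per-year figure is the kind of number obtained by fitting a least-squares linear trend to a mooring temperature record. The sketch below uses synthetic data, not the Petermann observations:

```python
import numpy as np

# Illustrative trend fit: a warming rate like the ~0.03 C/year reported for
# the water under Petermann Glacier is the slope of a least-squares line
# through a temperature time series. The series below is synthetic.
rng = np.random.default_rng(0)
years = np.arange(2002, 2010, 0.25)   # quarterly samples over 8 years
temps = (0.1                          # baseline temperature, deg C
         + 0.03 * (years - years[0])  # imposed warming trend, C/year
         + rng.normal(0, 0.01, years.size))  # measurement noise

slope, intercept = np.polyfit(years, temps, 1)
print(f"fitted warming trend: {slope:.3f} C/year")
```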
Weather
2017
February 13, 2017
https://www.sciencedaily.com/releases/2017/02/170213151250.htm
Using high-resolution satellites to measure African farm yields
Stanford researchers have developed a new way to estimate crop yields from space, using high-res photos snapped by a new wave of compact satellites.
The approach is detailed in a study published February 13.

"Improving agricultural productivity is going to be one of the main ways to reduce hunger and improve livelihoods in poor parts of the world," said study co-author Marshall Burke, an assistant professor in the Department of Earth System Science at Stanford's School of Earth, Energy & Environmental Sciences. "But to improve agricultural productivity, we first have to measure it, and unfortunately this isn't done on most farms around the world."

Earth-observing satellites have been around for over three decades, but most of the imagery they capture has not been high-enough resolution to visualize the very small agricultural fields typical in developing countries. Recently, however, satellites have shrunk in both size and cost while simultaneously improving in resolution, and today there are several companies competing to launch refrigerator- and shoebox-sized satellites into space that take high-resolution images of Earth.

"You can get lots of them up there, all capturing very small parts of the land surface at very high resolution," said study co-author David Lobell, an associate professor in the Department of Earth System Science. "Any one satellite doesn't give you very much information, but the constellation of them actually means that you're covering most of the world at very high resolution and at very low cost. That's something we never really had even a few years ago."

In the new study, Burke and Lobell set out to test whether the images from this new wave of satellites are good enough to reliably estimate crop yields. The pair focused on an area in Western Kenya where many smallholder farmers grow maize, or corn, on small, half-acre or one-acre lots. "This was an area where there was already a lot of existing field work," Lobell said.
"It was an ideal site to test our approach."

The scientists compared two different methods for estimating agricultural yields using satellite imagery. The first approach involved "ground truthing," or conducting ground surveys to check the accuracy of yield estimates calculated using the satellite data, which was donated by the company Terra Bella. For this part of the study, Burke and his field team spent weeks conducting house-to-house surveys, talking to farmers and gathering information about individual farms.

"We get a lot of great data, but it's incredibly time consuming and fairly expensive, meaning we can only survey at most a thousand or so farmers during one campaign," Burke said. "If you want to scale up our operation, you don't want to have to re-collect ground survey data everywhere in the world."

For this reason, the team also tested an alternative "uncalibrated" approach that did not depend on ground survey data to make predictions. Instead, it uses a computer model of how crops grow, along with information on local weather conditions, to help interpret the satellite imagery and predict yields. "Just combining the imagery with computer-based crop models allows us to make surprisingly accurate predictions, just based on the imagery alone, of actual productivity on the field," Burke said.

The researchers plan to scale up their project and test their approach across more of Africa. "Our aspiration is to make accurate seasonal predictions of agricultural productivity for every corner of Sub-Saharan Africa," Burke said. "Our hope is that this approach we've developed using satellites could allow a huge leap in our ability to understand and improve agricultural productivity in poor parts of the world."

Lobell is also the deputy director of Stanford's Center on Food Security and the Environment and a Senior Fellow at the Stanford Woods Institute for the Environment.
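The "ground-truthed" calibration step amounts to fitting a simple statistical relationship between a satellite-derived vegetation index and surveyed yields, then applying it to unsurveyed fields. All numbers below are invented for illustration; the study's actual model is richer than this:

```python
import numpy as np

# Sketch of a calibrated yield model: regress surveyed maize yields on a
# satellite vegetation index (e.g. peak-season NDVI), then predict yields
# for fields that were never surveyed. All values are invented.
ndvi = np.array([0.45, 0.52, 0.60, 0.68, 0.74, 0.81])      # surveyed fields
survey_yield = np.array([1.1, 1.5, 1.9, 2.4, 2.7, 3.2])    # tonnes/hectare

slope, intercept = np.polyfit(ndvi, survey_yield, 1)       # calibration

new_fields_ndvi = np.array([0.50, 0.70])                   # unsurveyed fields
predicted = slope * new_fields_ndvi + intercept
for n, y in zip(new_fields_ndvi, predicted):
    print(f"NDVI {n:.2f} -> predicted yield {y:.1f} t/ha")
```

The "uncalibrated" alternative described in the article replaces the survey-based regression with a crop-growth model driven by local weather, so no ground data are needed.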
Weather
2017
February 13, 2017
https://www.sciencedaily.com/releases/2017/02/170213131422.htm
Impact of climate change on mammals and birds 'greatly underestimated'
An international study published today involving University of Queensland research has found large numbers of threatened species have already been impacted by climate change.
Associate Professor James Watson of UQ's School of Earth and Environmental Sciences and the Wildlife Conservation Society said that, alarmingly, the team of international researchers found evidence of observed responses to recent climate changes in almost 700 bird and mammal species.

"There has been a massive under-reporting of these impacts," he said. "Only seven per cent of mammals and four per cent of birds that showed a negative response to climate change are currently considered 'threatened by climate change and severe weather' by the International Union for the Conservation of Nature Red List of Threatened Species."

Associate Professor Watson said the study reviewed the observed impacts of climate change on birds and mammals using a total of 130 studies, making it the most comprehensive assessment to date of how climate change has affected our most well-studied species. "The results suggested it is likely that around half the threatened mammals (out of 873 species) and 23 per cent of threatened birds (out of 1272 species) have already responded negatively to climate change," he said.

Lead author Michela Pacifici of the Global Mammal Assessment Program at Sapienza University of Rome said this implied that, in the presence of adverse environmental conditions, populations of these species had a high probability of also being negatively impacted by future climatic changes.

Associate Professor Watson said the study clearly showed that the impact of climate change on mammals and birds to date has been greatly underestimated and under-reported. "This under-reporting is also very likely in less studied species groups.
We need to greatly improve assessments of the impacts of climate change on all species right now," he said. "We need to communicate the impacts of climate change to the wider public and we need to ensure key decision makers know significant change needs to happen now to stop species going extinct. Climate change is not a future threat anymore."

The paper was published in the journal
Weather
2017
February 13, 2017
https://www.sciencedaily.com/releases/2017/02/170213083755.htm
New data from NOAA GOES-16's Space Environment In-Situ Suite (SEISS) instrument
The new Space Environment In-Situ Suite (SEISS) instrument onboard NOAA's GOES-16 is working and successfully sending data back to Earth.
A plot from SEISS data showed how fluxes of charged particles increased over a few minutes around the satellite on January 19, 2017. These particles are often associated with brilliant displays of the aurora borealis at northern latitudes and the aurora australis at southern latitudes; however, they can pose a radiation hazard to astronauts and other satellites, and threaten radio communications. Information from SEISS will help NOAA's Space Weather Prediction Center provide early warning of these high-flux events, so astronauts, satellite operators and others can take action to protect lives and equipment.

SEISS is composed of five energetic particle sensor units. The SEISS sensors have been collecting data continuously since January 8, 2017, with an amplitude, energy and time resolution greater than earlier generations of NOAA's geostationary satellites. SEISS was built by Assurance Technology Corporation and its subcontractor, the University of New Hampshire.

NASA successfully launched GOES-R at 6:42 p.m. EST on November 19, 2016 from Cape Canaveral Air Force Station in Florida, and the satellite was renamed GOES-16 when it achieved orbit. GOES-16 is now observing the planet from an equatorial view approximately 22,300 miles above the surface of the Earth. NOAA's satellites are the backbone of its life-saving weather forecasts, and GOES-16 will build upon and extend the more than 40-year legacy of satellite observations from NOAA that the American public has come to rely upon.
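The kind of early warning SEISS enables can be sketched as a simple threshold alarm on measured particle flux. The threshold and flux values below are invented for illustration and are not NOAA's operational criteria:

```python
import numpy as np

# Hypothetical sketch of flux-based alerting: flag minutes where measured
# charged-particle flux exceeds a warning threshold. All numbers invented.
THRESHOLD = 1e3  # particles / (cm^2 s sr), hypothetical warning level

minutes = np.arange(10)
flux = np.array([120, 150, 300, 900, 2500, 4100, 3800, 1200, 600, 200])

alarms = minutes[flux > THRESHOLD]
print(f"warning issued for minutes: {alarms.tolist()}")  # [4, 5, 6, 7]
```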
Weather
2017
February 9, 2017
https://www.sciencedaily.com/releases/2017/02/170209090950.htm
Increase in the number of extremely strong fronts over Europe?
New research finds an increase in strong and extremely strong fronts over Europe in summertime and autumn. Whether this is a trend or caused by climate change remains to be seen, according to lead author Sebastian Schemm.
Weather fronts are, for good reason, key elements on our daily weather charts. When they traverse from west to east across Europe they can bring vigorous weather changes, often in connection with high wind speeds, gusts, heavy precipitation or hail. The latter affects mostly continental Europe during the summer season and much less the Nordic countries.

In a recent study in Geophysical Research Letters, Sebastian Schemm from the University of Bergen and the Bjerknes Centre for Climate Research, and co-authors from the University of Bern and ETH Zürich, revealed an increase in the number of strong and extremely strong fronts over Europe, mainly during summer and autumn. The study was also selected as a research highlight in Nature Climate Change. In their study, the authors investigated gridded data based on observations and satellite retrievals for the period 1979-2014.

No comparable trend is identified over continental North America. Because frontal precipitation increases with the strength of a front, this finding may help to better understand the high spatio-temporal variability of precipitation trends across Europe.

"Surprisingly, meteorologists have yet to settle on a single front definition," argues Sebastian Schemm, but "in our study we relied on a very common method that helps meteorologists to draw surface fronts." In this definition, information about temperature and moisture contrasts is combined into one single variable. Accordingly, the authors are able to pinpoint increasing atmospheric humidity as the underlying cause of the observed trend in extreme fronts.

According to the IPCC AR5 report, humidity trends are significant over Europe but mixed or close to zero over parts of North America. This is in agreement with the trend in weather fronts.
However, according to the lead author, it remains to be seen whether the increase in the number of extremely strong fronts is tied to anthropogenic climate change or is simply natural variability, part of a multi-decadal climate fluctuation.
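One common single-variable front diagnostic that combines temperature and moisture contrasts -- assumed here for illustration, since the article does not give the exact formula -- is the magnitude of the horizontal gradient of equivalent potential temperature (theta-e). A sketch on an invented grid:

```python
import numpy as np

# Front-strength sketch: magnitude of the horizontal gradient of equivalent
# potential temperature (theta-e), which folds temperature and moisture
# contrasts into one field. Grid values are invented; spacing is 100 km,
# so gradients come out in K per 100 km. A sharp front sits between the
# second and third columns.
theta_e = np.array([
    [330.0, 329.0, 321.0, 313.0],
    [331.0, 330.0, 322.0, 314.0],
    [332.0, 331.0, 323.0, 315.0],
])  # K

dy, dx = np.gradient(theta_e)        # K per grid step (first output: axis 0)
front_strength = np.hypot(dx, dy)    # |grad theta_e|, K / 100 km

print(np.round(front_strength, 1))
print(f"max frontal gradient: {front_strength.max():.1f} K / 100 km")
```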
Weather
2017
February 7, 2017
https://www.sciencedaily.com/releases/2017/02/170207191842.htm
Drought identified as key to severity of West Nile virus epidemics
A study led by UC Santa Cruz researchers has found that drought dramatically increases the severity of West Nile virus epidemics in the United States, although populations affected by large outbreaks acquire immunity that limits the size of subsequent epidemics.
The study was published February 8.

"We found that drought was the dominant weather variable correlated with the size of West Nile virus epidemics," said first author Sara Paull, who led the study as a post-doctoral researcher at UC Santa Cruz and is now at the National Center for Atmospheric Research.

West Nile virus was introduced into North America in 1999 and has caused yearly epidemics each summer since. The intensity of these epidemics, however, has varied enormously. In some years, there were only a few hundred severe human cases nationally, whereas in each of three years (2002, 2003, and 2012), approximately 3,000 people suffered brain-damaging meningitis or encephalitis, and almost 300 died. The variation at the state level has been even higher, with yearly case numbers varying 50-fold from year to year, on average. The causes of this enormous variation were unknown and had led scientists at the Centers for Disease Control and Prevention to suggest that predicting the size of future epidemics was difficult or impossible.

In the new study, Paull and Marm Kilpatrick, an associate professor of ecology and evolutionary biology at UC Santa Cruz, analyzed patterns in the number of severe West Nile virus infections each year in each state and nationally. They examined a number of weather variables, including summer temperature, precipitation, winter severity, and drought. They also tested a long-standing hypothesis that the disease shows a wave-like pattern, causing large outbreaks in the first year and few cases subsequently due to a build-up of immunity in bird populations, which are the main hosts for the virus.

"We found strong evidence that in some regions the spread of West Nile virus was indeed wave-like, with large outbreaks followed by fewer cases," Paull said.
"However, our analyses indicated that human immunity -- not just bird immunity -- played a large part in the decrease in human cases by reducing the number of people susceptible to the disease."

Kilpatrick said the links with drought were unexpected. In collaboration with Dr. Laura Kramer from the New York State Department of Health, his lab had developed a very careful method of mapping the influence of temperature on the biology of both the virus and the three different mosquitoes that are most important in transmitting it. "We thought epidemics would coincide with the most ideal temperatures for transmission," Kilpatrick said. "Instead, we found that the severity of drought was far more important nationally, and drought appeared to be a key driver in the majority of individual states as well."

It's not yet clear how drought increases transmission of the virus, he said. Data from Colorado indicate that drought increases the fraction of mosquitoes infected with West Nile virus, but not the abundance of mosquitoes. Drought might affect transmission between mosquitoes and birds by stressing birds or changing where they congregate.

With the help of climatologists Dan Horton and Noah Diffenbaugh at Stanford University, Paull used the links between drought, immunity, and West Nile virus to project the impacts of climate change on future epidemics. Over the next three decades, drought is projected to increase in many regions across the United States due to increased temperatures, despite increases in precipitation in some of the same areas. Model projections indicated that increased drought could double the size of future West Nile virus epidemics, but that outbreaks would be limited to regions that have yet to sustain large numbers of cases. These findings provide a tool to help guide public health efforts to the regions most likely to experience future epidemics.
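The core statistical step described -- relating a drought index to epidemic size -- can be sketched as a simple correlation between two annual series. All values below are invented; the study's actual analysis was far more elaborate:

```python
import numpy as np

# Sketch of the kind of association tested: correlate an annual drought
# severity index with West Nile virus case counts. Data are invented.
drought_index = np.array([0.2, 1.5, 0.4, 2.1, 0.1, 1.8, 2.5, 0.6])
cases = np.array([180, 950, 260, 1400, 120, 1100, 1700, 400])

r = float(np.corrcoef(drought_index, cases)[0, 1])
print(f"Pearson correlation between drought severity and cases: r = {r:.2f}")
```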
Weather
2017
February 7, 2017
https://www.sciencedaily.com/releases/2017/02/170207104238.htm
Malaria control efforts can benefit from forecasting using satellites
Umeå University researcher Maquins Sewe has established links between patterns of malaria in Kenya and environmental factors (temperature, rainfall and land cover) measurable by satellite imagery. In his doctoral dissertation, the researcher shows that conducive environmental conditions occur before increases in hospital admissions and mortality due to malaria, indicating that the satellite information is useful for the development of disease forecasting models and early warning systems.
"When integrated with 'on-the-ground' malaria surveillance and control strategies, forecasting models that use satellite images can help policy makers to choose the most cost-effective responses to reduce malaria burden," says Maquins Sewe, researcher at the Department of Public Health and Clinical Medicine, Epidemiology and Global Health Unit. "Since prevailing weather conditions regulate the abundance of malaria-transmitting mosquitos, assessing environmental risks can help predict a rise in malaria infections geographically and allow proactive control efforts to focus on hot areas."According to Maquins Sewe, satellite-based malaria forecasting might be especially useful in low resource settings where data on weather conditions are limited or nonexistent. The idea is that an early warning system, based on the use of weather monitoring and malaria surveillance, can provide sufficient lead time to launch geographically focused cost-effective proactive interventions.The control and prevention of malaria is very important in countries in Sub-Saharan Africa, where malaria contributes to high levels of morbidity and mortality. Children under the age of five are the most vulnerable group and have the highest mortality from malaria. The elimination of malaria by 2030 was one of the Sustainable Development Goals set forth by the World Health Assembly in 2015. Improved malaria surveillance technologies is recommended as one of the key strategies to achieve the SDG's elimination goal.In his research, Maquins Sewe used data from the Health and Demographic Surveillance System (HDSS) run by Kenya Medical Research Institute and United States Center for Disease Control (KEMRI/CDC). The data covered the Asembo, Gem and Karemo regions of Western Kenya, which have a combined population of over 240,000. 
In this region, malaria accounts for 28 percent of all deaths in children under 5 years.The study provides evidence that a progression of changes in environmental conditions and the subsequent occurrence of malaria mortality follow the expected biologic mechanism. Temperature determines both the development of the malaria parasite in the mosquito and the rate at which the mosquito develops from larva to adult, while precipitation provides the necessary breeding places. The study identified a risk pattern that together with a delay form the basis for the forecast model. However, the model developed in the study showed that longer-term forecasts decreased in accuracy.In order to optimize response strategies, Maquins Sewe developed an economic assessment framework to quantify the tradeoffs between the cost-effectiveness of proactive interventions and the fact that forecast accuracy diminishes with increasing lead-time."This study contributes to malaria early warning systems with several important components, including risk assessment, a model for integrating surveillance and satellite data for prediction, and a method for identifying the most cost-effective response strategies given the uncertainty with predictive data in the long-term," concludes Maquins Sewe.Find the thesis online at:
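A forecasting model of the kind described, where environmental conditions observed ahead of time predict later admissions, can be sketched as a lagged regression. The two-week lag and the synthetic rainfall and case numbers are assumptions for illustration, not values from the thesis.

```python
# Minimal lagged-forecast sketch: predict malaria admissions from rainfall
# observed `lag` periods earlier. The lag, the data and the least-squares
# fit are illustrative assumptions, not the thesis's model.
import numpy as np

lag = 2  # assumed delay (in weeks) between conducive rainfall and admissions
rain = np.array([10., 80., 60., 5., 90., 70., 20., 85.])
cases = np.array([5., 8., 12., 78., 63., 7., 88., 72.])  # trails rain by `lag`

# Fit cases[t] ~ slope * rain[t - lag] + intercept
X = np.vstack([rain[:-lag], np.ones(len(rain) - lag)]).T
slope, intercept = np.linalg.lstsq(X, cases[lag:], rcond=None)[0]

def forecast(rain_now):
    """Expected admissions `lag` weeks after observing `rain_now`."""
    return slope * rain_now + intercept
```

This also makes the thesis's accuracy trade-off concrete: the longer the lead time, the weaker the statistical link between the environmental predictor and the outcome.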
Weather
2017
February 6, 2017
https://www.sciencedaily.com/releases/2017/02/170206111908.htm
Extreme fires will increasingly be part of our global landscape, researchers predict
Increasingly dangerous fire weather is forecast for Australia and the Mediterranean as the global footprint of extreme fires expands, according to the latest research.
University of Tasmania Professor of Environmental Change Biology David Bowman led an international collaboration -- including researchers from the University of Idaho and South Dakota State University -- to compile a global satellite database of the intensity of 23 million landscape fires used to identify 478 of the most extreme wildfire events."Extreme fire events are a global and natural phenomenon, particularly in forested areas that have pronounced dry seasons," Professor Bowman said."With the exception of land clearance, the research found that extremely intense fires are associated with anomalous weather -- such as droughts, winds, or in desert regions, following particularly wet seasons."Of the top 478 events, we identified 144 economically and socially disastrous extreme fire events that were concentrated in regions where humans have built into flammable forested landscapes, such as areas surrounding cities in southern Australia and western North America."Using climate change model projections to investigate the likely consequences of climate change, the research found more extreme fires are predicted in the future for Australia's east coast, including Brisbane, and the whole of the Mediterranean region -- Portugal, Spain, France, Greece and Turkey."The projections suggest an increase in the days conducive to extreme wildfire events by 20 to 50 per cent in these disaster-prone landscapes, with sharper increases in the subtropical Southern Hemisphere, and the European Mediterranean Basin," Professor Bowman said.The research has been published in the scientific journal The research is released on the day the State remembers the impact of the 1967 bushfires in the city of Hobart and across the South, which claimed the lives of 62 people, left 900 injured and more than 7,000 homeless.
Weather
2017
February 6, 2017
https://www.sciencedaily.com/releases/2017/02/170206083827.htm
Scientist studies whether solar storms cause animal beachings
A long-standing mystery among marine biologists is why otherwise healthy whales, dolphins, and porpoises -- collectively known as cetaceans -- end up getting stranded along coastal areas worldwide. Could severe solar storms, which affect Earth's magnetic fields, be confusing their internal compasses and causing them to lose their way?
Although some have postulated this and other theories, no one has ever initiated a thorough study to determine whether a relationship exists -- until now. NASA heliophysicist Antti Pulkkinen, who works at the agency's Goddard Space Flight Center in Greenbelt, Maryland, has teamed with the federal Bureau of Ocean Energy Management, or BOEM, and the International Fund for Animal Welfare, or IFAW, to determine whether a link exists.Strandings occur around the world, involving as few as three to as many as several hundred animals per event. Although a global phenomenon, such strandings tend to happen more often in New Zealand, Australia, and Cape Cod, Massachusetts, said project collaborator Katie Moore, the director of IFAW's global Animal Rescue Program. Headquartered in Yarmouth Port, Massachusetts, IFAW operates in 40 countries, rescuing animals and promoting conservation to secure a safe habitat for wildlife."These locations share some key characteristics, such as the geography, gently sloping beaches, and fine-grained sediment, which we think all play some role in these events," she said.Another possibility is that these animals' internal compasses are somehow skewed by humans' use of multi-beam echo sounders and other sonar-type equipment used to map the seafloor or locate potential fishing sites, to name just a few applications."However, these human-made influences do not explain most of the strandings," said Pulkkinen, an expert in space weather and its effect on Earth. "Theories as to the cause include magnetic anomalies and meteorological events, such as extreme tides during a new moon and coastal storms, which are thought to disorient the animals. 
It has been speculated that due to the possible magnetic-field sensing used by these animals to navigate, magnetic anomalies could be at least partially responsible."Indeed, magnetic anomalies caused when the sun's corona ejects gigantic bubbles of charged particles out into the solar system can cause problems for Earth-orbiting satellites and power grids when they slam into Earth's protective magnetosphere. It's possible they could affect animals, as well, Pulkkinen said."The type of data that Antti has accumulated, together with the extensive stranding data at our disposal, will allow us to undertake the first rigorous analysis to test possible links between cetacean mass strandings and space-weather phenomena," said Desray Reeb, a marine biologist at BOEM's headquarters in Sterling, Virginia. Reeb approached Pulkkinen about launching a research effort after hearing his presentation about space weather in June 2015.With funding from BOEM and NASA's Science Innovation Fund, Pulkkinen and his collaborators are carrying out a massive data-mining operation. The team will analyze NASA's large space-weather databases, including field recordings and space observations, and stranding data gathered by BOEM and IFAW."We estimate that records on the order of hundreds of cetacean mass strandings will be available for study, thus making our analyses statistically significant," Pulkkinen said. "We therefore expect that we will be able to reliably test the hypothesis. So far, there has been very little quantitative research, just a lot of speculation," Pulkkinen continued. "What we're going to do is throw cold, hard data at this. It's a long-standing mystery and it's important that we figure out what's going on."The team expects to complete the study by the end of September and publish its findings in a scientific, peer-reviewed journal. Should the study reveal a statistical correlation, team members said the results won't necessarily imply a causal link. 
However, it would provide the first thorough research into this hypothesis and offer the first step toward determining if it's correct."The results of this study will be informative for researchers, stranding network organizers, resource agencies and regulatory agencies," Reeb said. "If we understand the relationship between the two, we may be able to use observations of solar storms as an early warning for potential strandings to occur," added Moore, who said she "was immediately keen" to get involved in the study. "This would allow stranding responders in global hotspots, and really around the world, to be better prepared to respond, thus having the opportunity to save more animals."
Weather
2017
February 1, 2017
https://www.sciencedaily.com/releases/2017/02/170201093251.htm
A future for skiing in a warmer world
As the world struggles to make progress to limit climate change, researchers are finding ways to adapt to warmer winter temperatures -- by developing environmentally friendly ways of producing artificial snow.
Chances are if you know anything about Norway, you know it's a place where skiing was born.Norse mythology describes gods and goddesses hunting on skis, and 4000-year-old petroglyphs from northern Norway include some of the earliest known drawings of people on skis. One of the most recognizable Norwegian paintings worldwide depicts two skiers in 1206 fleeing to safety with the country's two-year-old prince, Håkon Håkonsson.Over the centuries, skiing in Norway has evolved from a practical mode of winter transport to a sport that is deeply ingrained in Norwegian culture. Norwegians themselves like to say they enter the world uniquely prepared for their northern home -- because they are "born with skis on their feet."But warmer weather due to climate change has made for less-than-stellar ski conditions in Norway and across Europe. Advances in snowmaking, where water is "seeded" with a protein from a bacterium that allows snow to be made at temperatures right around freezing, simply aren't enough to keep up with the changing climate.In response, a team of Norwegian researchers has been awarded a NOK 2.3 million grant from the Norwegian Ministry of Culture to develop a new approach to snowmaking -- one that would allow snow to be made in an energy-efficient way, even at warmer temperatures. The project has been named, appropriately enough, "Snow for the Future."Traditional snowmaking makes up for a lack of snow by spraying water into cold air, and letting physics do the rest. But if temperatures are above freezing, this simply won't work, for obvious reasons.Researchers at SINTEF, Scandinavia's largest independent research institute, and the Norwegian University of Science and Technology (NTNU) have worked extensively with a type of technology called a heat pump. They think that heat pumps could be key to producing snow in an environmentally friendly way, even at higher temperatures. 
Your refrigerator and freezer are examples of appliances that use heat pumps to regulate temperatures.
"One of the main aims of the project will be to find out how we can produce snow regardless of the outdoor temperature, and to develop energy-efficient ways of doing it," says Petter Nekså, an energy research scientist at SINTEF.
Nekså thinks that one feasible approach is to develop heat pumps where the cold side can be used to produce snow, while the warm side is used for heating.
"If the air outside is cold, traditional snow cannons work very well. But these are temperature dependent," says Nekså. "At higher temperatures, you need a refrigeration plant to make snow. The advantage is that the process is independent of air temperatures."
What can make the process energy efficient is heating a building with the heat generated by the heat pump as it cools water to be made into snow, Nekså says.
"In this way, we can heat indoor facilities while also making artificial snow for ski slopes outside -- virtually cost-free," he says.
The approach involves adapting current heat pump technology, says Jacob Stang, one of Nekså's colleagues at SINTEF.
"A traditional snow production facility that makes snow at zero degrees outdoors has no 'hot side'," Stang says. "That means we need a heat pump that has the properties of a refrigeration plant. We have to adapt components, such as an evaporator and condenser, to get them to work together."
The project will be conducted in collaboration with the city of Trondheim, where SINTEF and NTNU are based, and the Norwegian Ski Federation (NSF).
The researchers are also hoping to develop better ways of storing snow, which is an approach many ski areas use as a hedge against warmer temperatures. Currently, many ski areas use sawdust to store artificial snow that can be spread on slopes and trails when the weather doesn't deliver the white stuff on its own. 
While this is a proven approach, over time the sawdust loses its insulating properties and has to be replaced.The project will also identify new ways of making sure that ski areas get as much benefit as they can out of manufactured snow. The researchers will look at everything from the design and drainage of ski runs, to protection from sun and rain, salting and snow preparation.Researchers will conduct lab experiments, use computer models and simulations, create prototypes and undertake field tests."Norway has a long tradition and expertise in this field," says Trygve M. Eikevik, a professor in NTNU's Department of Energy and Process Engineering. "The fishery sector produces around 300 thousand tonnes of ice each year for fish export. This is enough to cover an 8-metre-wide, 150-kilometre-long ski trail with a layer of ice that is 0.5 metres thick. It is more than possible to manufacture snow for skiing."The NSF hopes the project will increase the chances that Norway will be able to host World Championships in skiing in the future, but officials are most concerned about maintaining skiing as a pastime in Norway. Communities across the country promote skiing by maintaining easily accessible, lighted and groomed ski trails and encouraging ski clubs. This strong system recruits young people to skiing, which has led to Norway's prominence in both alpine and cross-country ski competitions. It also helps keep people healthy, by encouraging them to get outside to exercise in the winter."The challenges posed by climate change represent perhaps the greatest threat to ski sports. This is why we're very pleased that this project is taking off," says Marit Gjerland, who is a ski run consultant for the NSF. 
"Good results from the project will mean a lot for the future of ski sports."She says the technology could also expand the popularity of skiing, by making snow available in places where it previously wasn't."Just like we have artificial football pitches, we could also create future snow parks," she says.One of the aims of the project is to establish a snow technology research centre based in Trondheim, where both Norwegian and international projects could be carried out."We envisage the development of more efficient refrigeration plants and snow production concepts, facilities designed for combined snow and heat production, and a total concept that integrates data models with meteorological data," says Eikevik."We hope this will help promote innovation and business development related to future snow production facilities," he says.
Weather
2017
January 30, 2017
https://www.sciencedaily.com/releases/2017/01/170130224728.htm
New ocean observations improve understanding of motion
Oceanographers commonly calculate large scale surface ocean circulation from satellite sea level information using a concept called "geostrophy," which describes the relationship between oceanic surface flows and sea level gradient. Conversely, researchers rely on data from in-water current meters to measure smaller scale motion. New research led by University of Hawai'i at Mānoa (UHM) oceanographer Bo Qiu has determined from observational data the length scale at which using sea level height no longer offers a reliable calculation of circulation.
Upper-ocean processes dissipate heat, transport nutrients and impact the uptake of carbon dioxide -- making circulation a critical driver of biological activity in the ocean. The movement of water in the ocean is determined by many factors including tides; winds; surface waves; internal waves, those that propagate within the layers of the ocean; and differences in temperature, salinity or sea level height. Additionally, like high and low pressure systems seen on TV weather maps, the ocean is full of eddies, slowly swirling masses of water."As length scales become smaller from several hundred miles to a few tens of miles, we discovered the point at which geostrophic balance becomes no longer valid -- meaning that sea level is no longer useful for calculating ocean circulation," said Qiu, professor at the UHM School of Ocean and Earth Science and Technology (SOEST). "That is due to the presence of oceanic internal wave motions which essentially disrupts the motion that would be caused by geostrophy."Scientists use sea level as a means to calculate ocean circulation because satellites circle Earth daily, acquiring sea level data frequently and accurately. Prior to this study, published in Further, in areas of the ocean with persistent or frequent eddies, Qiu and co-authors from the Japan Meteorological Agency, Caltech and NASA Jet Propulsion Laboratory determined that sea level can reliably be used to calculate circulation at a fairly high resolution, that is, at fairly small length scales (resolution of 10 miles). However, in areas where motion is dominated by internal waves, satellite sea level can only be used to infer motion on a very large scale (resolution of 125 miles)."This aspect of the study was a bit of a surprise," said Qiu. 
"I didn't anticipate that the transition point would vary by an order of magnitude within the western North Pacific."In the future, Qiu and colleagues hope to develop a mathematical approach to creating more detailed pictures of circulation based on sea level in more locations throughout the Pacific.
Weather
2017
January 30, 2017
https://www.sciencedaily.com/releases/2017/01/170130111058.htm
First-ever GPS data release to boost space-weather science
Today, more than 16 years of space-weather data is publicly available for the first time in history. The data comes from space-weather sensors developed by Los Alamos National Laboratory on board the nation's Global Positioning System (GPS) satellites. The newly available data gives researchers a treasure trove of measurements they can use to better understand how space weather works and how best to protect critical infrastructure, such as the nation's satellites, aircraft, communications networks, navigation systems, and electric power grid.
"Space-weather monitoring instruments developed at Los Alamos have been fielded on GPS satellites for decades," said Marc Kippen, the Los Alamos program manager. "Today, 23 of the nation's more than 30 on-orbit GPS satellites carry these instruments. When you multiply the number of satellites collecting data with the number of years they've been doing it, it totals more than 167 satellite years. It's really an unprecedented amount of information."Extreme space-weather events have the potential to significantly threaten safety and property on Earth, in the air, and in space. For example, the hazard of increased radiation exposure from charged particles released during a large solar flare could require that flights be diverted away from a polar route. Similarly, sudden bursts of plasma and magnetic field structures (coronal mass ejections, or CMEs) from the sun's atmosphere and high-speed solar wind could significantly disable large portions of the electric power grid. The resulting cascading failures could disturb air traffic control, disrupt the water supply, and interfere with life-saving medical devices.In space, the charged particles measured by the Los Alamos-GPS sensors are the primary limit on how long a satellite can operate in space before succumbing to the damaging effects of radiation. In extreme events those particles can cause malfunction of satellites or even catastrophic failure of entire satellite systems. For example, in April 2010, a large magnetic disturbance resulted in a communications failure, causing a satellite to uncontrollably drift in space and presenting a hazard to nearby satellites. Currently, scientists are unable to predict when these extreme events will occur, how strong they will be, or how severe the effects will be. 
The release of Los Alamos-GPS data enables new studies that will help answer these questions.The Los Alamos-GPS sensors continuously measure the energy and intensity of charged particles, mainly electrons and protons, energized and trapped in Earth's magnetic field. These trapped particles form the Van Allen radiation belts, which are highly dynamic -- varying on time scales from minutes to decades. From GPS orbit (roughly 12,600 miles above Earth), satellite-borne sensors probe the largest radiation belt -- consisting mainly of energetic electrons. Each of the 23 sensors in the current GPS constellation makes detailed measurements of the belts every six hours. Together the sensors provide 92 complete measurements of the belts every day. The newly released measurements constitute a nearly continuous global record of the variability in this radiation belt for the past 16 years, including how it responds to solar storms. The data provides an invaluable record for understanding radiation-belt variability that is key to developing effective space-weather forecasting models.Los Alamos has been anticipating greater awareness of the nation's vulnerability to space weather since the 1990s, when it began aligning its space-weather research activities with its critical-infrastructure program. "This led to an awareness that we could expand the utility of our space-weather data to programs beyond the specific requirements they were designed for," said Kippen.The public release of GPS energetic-particle data was conducted under the terms of an October 2016 White House Executive Order. It culminates years of work between the Office of Science and Technology Policy and the National Security Council to coordinate interagency efforts aimed at improved understanding, prediction and preparedness for potentially devastating space-weather events. 
The specific goal of releasing space-weather data from national-security assets such as GPS satellites is to enable broad scientific community engagement in enhancing space-weather model validation and improvements in space-weather forecasting and situational awareness.
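The constellation arithmetic quoted above checks out directly: 23 sensors, each completing a belt measurement every six hours, give 92 measurements per day.

```python
# Coverage arithmetic from the article: 23 GPS sensors, each making one
# complete radiation-belt measurement every 6 hours, give 92 per day.
sensors = 23
samples_per_sensor_per_day = 24 // 6    # one measurement every six hours
samples_per_day = sensors * samples_per_sensor_per_day
```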
Weather
2017
January 25, 2017
https://www.sciencedaily.com/releases/2017/01/170125145750.htm
A crab's eye view of rising tides in a changing world
Coastal ecosystems and aquifers will be greatly affected by climate change, not only from rising temperatures and more volatile weather, including changes in precipitation patterns, but also from sea level rise.
In the search for methods to analyze these effects, researchers at NJIT have identified powerful statistical tools that should help coastal scientists both measure and anticipate changes in conditions such as subsurface water temperature and salinity. Results from the study, funded by the National Oceanic and Atmospheric Administration, have been published in The tools, known as spectral and co-spectral techniques, required a large data set for validation, which the authors obtained from two beaches in Prince William Sound in Alaska. In particular, they looked at the impact of sea level variability on these beaches, collecting a year's worth of data on water pressure, water temperature and salinity at thirty-minute intervals from sensors embedded in the sand.Using these techniques to identify trends within the data, they found that at most locations, the effects of sea level rise were first seen in the water pressure in aquifers, followed by changes in salinity and then temperature. Following in the wake of the other two leading indicators, or warning signals, it appears that changes in temperature would take longer to be felt inland than rising salinity."Changes in the temperature were sometimes seen a week later than both rising water pressure and salinity. We attribute this to soil grains acting as a thermal buffer, absorbing a non-negligible fraction of the heat carried by seawater and fresh groundwater. Thus, even if the sea temperature should change suddenly, it would take a week or more for its effects on coastal aquifers to subside," said Michel Boufadel, director of NJIT's Center for Natural Resources Development and Protection (NRDP), a professor of environmental engineering and an author of the study.Water pressure was the first to change as water is incompressible and thus transmits pressure rapidly. 
Rising salinity, by comparison, is mitigated by the spreading and dilution of seawater when it interacts with fresh groundwater coming from the upland behind the beach.
"All of these changes, including the off-kilter sequence of salinity and temperature fronts at a particular locale, could have ramifications for animal and plant life in the aquifer ecosystem," said Xiaolong Geng, a postdoctoral fellow at NJIT and an author of the study. "We believe that regular data-keeping at short intervals will allow us to also monitor the pace and dynamics of change in these vulnerable regions."
In particular, the researchers say, in conditions of low soil permeability where changes in water pressure and temperature are typically distantly connected, current predictive models that extend out beyond a week would likely decline in accuracy. Such approaches would need to be "anchored" to regular collection of data on water pressure, salinity, and temperature in the way meteorological models are corrected by measurements at given times and locations, such as at rain stations at airports. The researchers conclude that measurements of coastal ecosystem features should take place at least once per week.
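The week-long lead of water pressure over temperature could in principle be read off a cross-correlation of the 30-minute sensor series. This sketch uses synthetic data with a known one-week shift, not the Prince William Sound measurements.

```python
# Sketch: recover the lag at which one beach-sensor series trails another
# from their cross-correlation. The series are synthetic, shifted by a known
# one-week lag (336 thirty-minute samples), not the Alaska field data.
import numpy as np

rng = np.random.default_rng(0)
n = 2000                        # 30-minute samples (~42 days)
lag_true = 336                  # one week = 7 days x 48 samples per day
pressure = rng.standard_normal(n)
temperature = np.empty(n)
temperature[lag_true:] = pressure[:-lag_true]       # temperature trails pressure
temperature[:lag_true] = rng.standard_normal(lag_true)
temperature += 0.1 * rng.standard_normal(n)         # measurement noise

def best_lag(lead, follow, max_lag=400):
    """Lag (in samples) at which `follow` best correlates with `lead`."""
    scores = [np.corrcoef(lead[:-k], follow[k:])[0, 1] for k in range(1, max_lag)]
    return 1 + int(np.argmax(scores))

recovered = best_lag(pressure, temperature)   # recovers the one-week shift
```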
Weather
2017
January 25, 2017
https://www.sciencedaily.com/releases/2017/01/170125120742.htm
100% renewable energy sources require overcapacity
Germany decided to go nuclear-free by 2022. A CO
Intermittent sources are, by definition, unsteady. A back-up system capable of providing power at a level of 89% of peak load would therefore be needed. This requires creating an oversized power system that produces large amounts of surplus energy. Day storage to absorb the surplus is ineffective because surplus power in winter is correlated between day and night. A seasonal storage system loses its purpose once transformation losses are taken into account; it contributes to the power supply only after periods of excessive surplus production.
The option of an oversized, intermittent renewable-energy system dedicated to feeding the storage is also ineffective: in that case, energy can be taken directly from the large intermittent supply, making the storage superfluous. In addition, the impact on land use and the transformation of the landscape by an unprecedented density of wind converters and transmission lines needs to be taken into consideration. The author also warns of the risk that this will intensify social resistance.
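The 89%-of-peak back-up figure follows from assuming intermittent output can be counted on for only a small "firm" share of capacity during lulls. The load numbers here are illustrative, not the paper's German data.

```python
# Illustration of the back-up requirement: if intermittent sources can be
# relied on for only a small "firm" share of capacity, back-up must cover
# nearly the whole peak load. Numbers are illustrative, not the paper's.

peak_load_gw = 80.0
firm_renewable_gw = 8.8                    # output available even in lulls
backup_needed_gw = peak_load_gw - firm_renewable_gw
backup_fraction = backup_needed_gw / peak_load_gw    # ~0.89 of peak load
```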
Weather
2017
January 25, 2017
https://www.sciencedaily.com/releases/2017/01/170125093737.htm
Early onset of winter triggers evolution towards smaller snow voles in Graubünden
Researchers from the University of Zurich have succeeded in documenting an extremely rare case of evolutionary adaptation "in action" among wild snow voles near Chur. The selective pressure triggered by several consecutive winters with early snowfall resulted in a genetic decrease in body weight. The reason: Smaller voles are fully grown by the time the weather conditions deteriorate.
Adaptive evolution, i.e. genetic change via natural selection, plays a central role in how plant and animal populations guarantee their long-term survival. Although this process is well understood in breeding conditions and in the lab, it is still largely unclear how often and how rapidly it takes place under natural conditions. Examples of contemporary adaptive evolution remain extremely rare.This is precisely what the team headed by Erik Postma, a research group leader at the Department of Evolutionary Biology and Environmental Studies at UZH, has managed to do. The scientists have been studying a population of snow voles (Chionomys nivalis) in their alpine habitat above Churwalden (Graubünden, Switzerland) at an altitude of around 2,000 meters since 2006. They were able to demonstrate that the voles changed genetically over a few generations only. "But contrary to our expectations, they didn't get bigger," says Postma. "Instead, adaptive evolution pushed the voles to become smaller and lighter."In principle, larger snow voles are fitter: They have better capabilities to survive and reproduce. Despite this positive correlation at the phenotypic level, however, a converse causal relationship was evident on the genotypic level. "The voles whose genetic make-up led to a lower body weight were the fittest, especially in years when the first winter snow fell earlier than usual," explains the biologist. This may be because lighter young are more likely to reach their final size before the weather deteriorates and winter comes.If the scientists had restricted their observations solely to phenotypic traits, such as body size and weight, this rare example of "evolution in action" in the wild would have remained hidden. After all, over the same decade the number of snow voles in the population shrank simultaneously, so more food was available to each animal. In turn, this ecological change compensated for the genetic change and caused the average weight to remain constant. 
Only by separating the role of genes and the environment were the biologists able to see through this phenotypic "masking."Based on DNA samples, they began by reconstructing the genealogy of the vole population and thus how the animals were related. They then used statistical models and quantitative genetics to determine how the genes responsible for body weight changed over time. This approach unveiled an evolutionary pressure in favor of young animals, which reach their maximum body size earlier. "We assume," adds Postma, "that a climate fluctuation -- several consecutive years with early snowfall -- is the selection pressure behind this evolutionary adaptation."Neither the selective pressure nor the evolutionary response could have been identified with the methods used by the majority of studies that examine how wild populations respond to environmental changes, which predominantly concentrate on changes in the phenotype. In wild populations, the lack of a genetic perspective may provide a flawed understanding of how the causes and consequences of natural selection are connected. As a consequence, the importance of adaptive evolution in plant and animal populations may have been underestimated thus far, especially when it comes to adaptations in response to rapid, anthropogenic, environmental change.
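The quantitative-genetics step, separating the genetic component of body weight from environmental masking, can be caricatured with the classic mid-parent/offspring regression, whose slope estimates heritability. The data and the assumed true heritability of 0.5 are synthetic; the study itself used pedigree-based statistical models, which generalize this idea.

```python
# Quantitative-genetics caricature: regress offspring body weight on
# mid-parent weight; the slope estimates heritability. Synthetic data with
# an assumed true heritability of 0.5 -- not the snow vole measurements.
import numpy as np

rng = np.random.default_rng(42)
n = 400
h2_true = 0.5                              # assumed heritability
midparent = rng.normal(40.0, 4.0, n)       # mid-parent weight (grams)
offspring = 40.0 + h2_true * (midparent - 40.0) + rng.normal(0.0, 3.0, n)

X = np.vstack([midparent, np.ones(n)]).T
slope, intercept = np.linalg.lstsq(X, offspring, rcond=None)[0]  # slope ~ h2
```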
Weather
2017
January 25, 2017
https://www.sciencedaily.com/releases/2017/01/170125091717.htm
Florida corals tell of cold spells and dust bowls past, foretell weather to come
Scientists seeking an oceanic counterpart to the tree rings that document past weather patterns on land have found one in the subtropical waters of Dry Tortugas National Park near the Florida Keys, where long-lived boulder corals contain the chemical signals of past water temperatures. By analyzing coral samples, USGS researchers and their colleagues have found evidence that an important 60- to 85-year-long cycle of ocean warming and cooling has been taking place in the region as far back as the 1730s.
The cycle called the Atlantic Multidecadal Oscillation, or AMO, is linked to rainfall over most of the US, Midwestern droughts, hurricane intensification and landfalls, and the transfer of ocean heat from the tropical Caribbean Sea to the North Atlantic Ocean by way of the Gulf Stream. It interacts with ongoing climate change in poorly understood ways, and it is very hard to spot in pre-20th century records. "The AMO has a huge impact on human populations and the economy, mainly through its influence on rainfall patterns," said geochemist Jennifer Flannery of the USGS Coastal and Marine Science Center in St. Petersburg, Florida, who led the study. "Climatologists suspect the AMO is a natural climate cycle that has existed for more than 1,000 years. But until recently most of the evidence came from ships at sea, and only went back 150 years or so." "The record we obtained from the Dry Tortugas coral cores captures several complete AMO cycles stretching back 278 years. That gives climate modelers a lot of new evidence to work with as they try to understand past AMOs and predict future ones." The Dry Tortugas samples precisely track major climate phenomena like the Little Ice Age that ended in the early 1800s, and the lethal Dust Bowl drought of the 1930s. A research paper about the study appeared January 15 in the journal. Dry Tortugas National Park is a cluster of small, isolated islands at an important marine crossroads: the Florida Straits, where the Gulf of Mexico and the Caribbean Sea flow into the Atlantic Ocean. The islands are within a large zone of seawater called the Atlantic Warm Pool, which typically heats up in spring to 83 degrees Fahrenheit (28.5 degrees Celsius) or more.
The heat stored in the Atlantic Warm Pool appears to influence rainfall in the Caribbean and parts of North America, and the formation and intensity of hurricanes. The Dry Tortugas also lie near the origin of the Gulf Stream, the current that carries warm seawater north to Greenland, where it chills, plunges deeper into the sea, and heads back towards the equator. Together, the northbound warm flow at the surface and the deep, cold southbound flow are known as the Atlantic Meridional Overturning Circulation or AMOC, which affects weather in the entire North Atlantic, including the US Atlantic seaboard and much of Europe. Some parts of this circulation system have been known for centuries, but others, like the AMO, are relatively recent discoveries. Climatologists are eager to learn more about the AMO from a longer record of sea surface temperatures in this region where ocean-wide patterns take shape. That's where the Dry Tortugas coral cores come in. Coral skeletons, like tree rings, have growth rings that preserve evidence of past weather conditions. While they are alive, corals take up strontium and calcium from seawater, depositing the two minerals in their skeletons in a ratio that varies with water temperature. By measuring the strontium-to-calcium ratio in corals, scientists can reconstruct past sea surface temperatures. Working with two boulder corals cored by divers in 2008 and 2012, Flannery's team used a dentist's drill to collect and analyze samples at intervals as short as one month, going back as far as 1837. Combining these two corals' records with three other Dry Tortugas coral cores that stretch back to 1733, the team was able to track 278 years' worth of sea surface temperatures. The Dry Tortugas corals show that after a cold spell during the 1960s, sea surface temperatures in the region warmed by about 1.5 degrees Fahrenheit (0.8 degrees Celsius) between 1970 and 2012.
They also show two sets of oscillations in sea surface temperatures: a shorter cycle lasting 28 to 30 years, and a longer cycle of 80 to 90 years, consistent with the Atlantic Multidecadal Oscillation. The coral cores reliably track these longer cycles of warming and cooling, providing confirmation that the Atlantic Multidecadal Oscillation has existed for the past three centuries, Flannery said. This suggests that there is a close connection between sea temperatures in the area around the Dry Tortugas and the larger AMO. "By looking at sea surface temperatures in the Dry Tortugas, climatologists may be able to predict imminent changes that will affect the entire North Atlantic basin," Flannery said.
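Published Sr/Ca paleothermometer calibrations are approximately linear, with the ratio decreasing as water warms. A minimal sketch of the inversion, using hypothetical calibration coefficients (real values are fitted per coral species and site, so these numbers are for illustration only):

```python
def sst_from_sr_ca(sr_ca_mmol_mol, a=10.5, b=-0.06):
    """Invert a linear Sr/Ca paleothermometer, Sr/Ca = a + b * SST.

    a (mmol/mol) and b (mmol/mol per deg C) are HYPOTHETICAL calibration
    coefficients, not the values used in the study; b is negative because
    corals incorporate less strontium in warmer water.
    """
    return (sr_ca_mmol_mol - a) / b

# Under these made-up coefficients, a lower ratio implies warmer water:
warm = sst_from_sr_ca(8.9)   # about 26.7 C
cool = sst_from_sr_ca(9.1)   # about 23.3 C
```

Sampling the skeleton at roughly monthly intervals, as the team did with a dentist's drill, turns such ratio measurements into a sub-annual temperature time series.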
Weather
2017
January 24, 2017
https://www.sciencedaily.com/releases/2017/01/170124111330.htm
Why storms are becoming more dangerous as the climate warms
Researchers know that more, and more dangerous, storms have begun to occur as the climate warms. A team of scientists has reported an underlying explanation, using meteorological satellite data gathered over a 35-year period.
The examination of the movement and interaction of mechanical energies across the atmosphere was published Jan. 24 in the journal. "It is a new way to look at and explain what people have observed," said Liming Li, assistant professor of physics at the University of Houston and corresponding author of the paper. "We found that the efficiency of Earth's global atmosphere as a heat engine is increasing during the past four decades in response to climate change." In this case, increased efficiency isn't a good thing. It suggests more potential energy is being converted to kinetic energy -- energy that is driving atmospheric movement -- resulting in a greater potential for destructive storms in regions where the conversion takes place. "Our analyses suggest that most energy components in the Lorenz energy cycle have positive trends," the researchers wrote. "As a result, the efficiency of Earth's global atmosphere as a heat engine increased during the past 35 years." In addition to Li, researchers involved in the work include Yefeng Pan, first author and a former doctoral student at UH; Xun Jiang, associate professor of earth and atmospheric sciences at UH; Gan Li, Wentao Zhang and Xinyue Wang, all of Guilin University of Electronic Technology; and Andrew P. Ingersoll of the California Institute of Technology. The researchers used three independent meteorological datasets to track variables including three-dimensional wind field, geopotential-height field and temperature field at points across the globe from 1979 to 2013. They then used the data to compute the Lorenz energy cycle of the global atmosphere. Such an energy cycle in the atmosphere significantly influences weather and climate. Previous studies have covered only five-year and 10-year periods before 1973, Li said.
"Now we can investigate the Lorenz energy cycle of the global atmosphere during the past 35 years, using satellite-based observations," he said. While the researchers reported that the total mechanical energy of the global atmosphere remains constant over time, there has been a significant increase in what they describe as "eddy energies," or the energies associated with storms, eddies and turbulence. Li said the positive trends for eddy energies were especially pronounced in the southern hemisphere and over parts of Asia, and the researchers point out that intensifying storm activity over the southern oceans and increasing drought in Central Asia contribute to the positive trends. "This is a new perspective to explain global warming from an energy standpoint," he said.
Weather
2017
January 19, 2017
https://www.sciencedaily.com/releases/2017/01/170119143356.htm
New England's 1816 'Mackerel Year' and climate change today
Hundreds of articles have been written about the largest volcanic eruption in recorded history, at Indonesia's Mt. Tambora just over 200 years ago. But for a small group of New England-based researchers, one more Tambora story needed to be told, one related to its catastrophic effects in the Gulf of Maine that may carry lessons for intertwined human-natural systems facing climate change around the world today.
In the latest issue of the journal, Alexander says, "We approached our study as a forensic examination. We knew that Tambora's extreme cold had afflicted New England, Europe, China and other places for as long as 17 months. But no one we knew of had investigated coastal ecosystems and fisheries. So, we looked for evidence close to home." In work that integrates the social and natural sciences, they used historical fish export data, weather readings, dam construction and town growth chronologies and other sources to discover Tambora's effects on the Gulf of Maine's complex human and natural system. The 1815 eruption caused a long-lasting, extreme climate event in 1816 known as the "year without a summer." As volcanic winter settled on much of the Northern Hemisphere, crops failed, livestock died and famine swept over many lands. In New England, crop yields may have fallen by 90 percent. The researchers found that 1816 was also called "the mackerel year," a clue to what they would find regarding fisheries. Besides Tambora's climate effects, the authors examined other system-wide influences to explain observed trends. These included historical events such as the War of 1812, human population growth, fish habitat obstruction due to dam building and changes in fishing gear that might have affected fisheries at the time. Employing historical methods in a Complex Adaptive Systems approach allowed them to group and order data at different scales of organization and to identify statistically significant processes that corresponded to known outcomes, Alexander says. For instance, temperature fluctuations influenced the entire Gulf of Maine for short periods of time, while dam construction affected individual watersheds through the life of the dams. Space and time scales differ in each case, but both temperature fluctuations and habitat obstructions affect fish, and thus fisheries, at the same time.
Such interactions are characteristic of complex systems, she notes. Establishing timing was key to solving the mystery, Alexander adds. Major export species, including freshwater-spawning alewives and shad and marine-spawning mackerel and herring, have different temperature tolerances and seasonal migration patterns and timing, or phenology. Alewives and mackerel arrived earlier, when water was colder; shad and herring arrived later, after the water had warmed up. Because of their phenology and vulnerability in rivers and streams during spawning, alewives suffered the most from the extreme climate event. In Massachusetts, where streams had been dammed for a long time, the event's effects were compounded, the researchers found. In the early 1800s alewives were a "utility fish," an important commercial export but also used as chicken feed, garden fertilizer and human food in winter. The winter of 1816 was so cold, Alexander says, that "Penobscot Bay froze solid from Belfast to Castine." When alewives arrived at their seasonal spawning time, adverse conditions likely disrupted spawning runs, increased natural mortality and, critically for the people depending on them, decreased catch. She adds, "During this climate crisis, people couldn't catch enough alewives to meet their needs, so they quickly turned to mackerel, the next abundant species to arrive along the coast. Pursuing mackerel and rapidly distributing it to communities with no other sources of food fundamentally altered the infrastructure of coastal fisheries." Although records suggest that alewife populations apparently recovered within 25 years, "people responded rapidly and effectively to Tambora in only five years and never looked back when the crisis passed." Rates of human and alewife response became uncoupled, and the quick fixes, having become permanent, later achieved an air of inevitability, the authors suggest. They add that "complex solutions elude simple explanations."
They point out the "many and obvious" parallels between that sudden extreme event and current occurrences of drought, flood, storm devastation, food disruption and famine attributed to climate change. "The past can be a laboratory," Alexander and colleagues write. Employing historical methods within a Complex Adaptive Systems approach may offer a simple way to examine complex systems where scale, rate and phenology interconnect human and natural processes, and help to "advance human resilience by strengthening resilience in the natural world." UMass Amherst fisheries ecologist Adrian Jordaan adds, "When the resources are available locally, they can help societies cope with change. Also, during extreme climate events, unthinkable changes including large societal shifts can occur. These are things that we must be prepared for in the world of today, where extreme climatic events are becoming more frequent and severe." Michelle Staudinger, an ecologist with the Northeast Climate Science Center at UMass Amherst, says, "Alewives and other fishes that inhabit both rivers and oceans are highly vulnerable to climate change. The lessons learned from this study will help us better anticipate, prepare and cope for additional future impacts on their populations as well as the human communities that depend on them." Alex Bryan, a U.S. Geological Survey climate scientist and co-author, says studying a 200-year-old event was a challenge. "Long-term temperature records don't begin until the turn of the 20th century. Fortunately, we found the weather journal of a physician residing in Salem, Mass., who recorded the air temperature four times a day from the 1780s to the 1820s. Without his devotion to monitoring the weather, this study would not have been possible."
Weather
2017
January 18, 2017
https://www.sciencedaily.com/releases/2017/01/170118112554.htm
2016 warmest year on record globally, NASA and NOAA data show
Earth's 2016 surface temperatures were the warmest since modern recordkeeping began in 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA).
Globally-averaged temperatures in 2016 were 1.78 degrees Fahrenheit (0.99 degrees Celsius) warmer than the mid-20th century mean. This makes 2016 the third year in a row to set a new record for global average surface temperatures. The 2016 temperatures continue a long-term warming trend, according to analyses by scientists at NASA's Goddard Institute for Space Studies (GISS) in New York. NOAA scientists concur with the finding that 2016 was the warmest year on record based on separate, independent analyses of the data. Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. However, even taking this into account, NASA estimates 2016 was the warmest year with greater than 95 percent certainty. "2016 is remarkably the third record year in a row in this series," said GISS Director Gavin Schmidt. "We don't expect record years every year, but the ongoing long-term warming trend is clear." The planet's average surface temperature has risen about 2.0 degrees Fahrenheit (1.1 degrees Celsius) since the late 19th century, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Most of the warming occurred in the past 35 years, with 16 of the 17 warmest years on record occurring since 2001. Not only was 2016 the warmest year on record, but eight of the 12 months that make up the year -- from January through September, with the exception of June -- were the warmest on record for those respective months. October, November, and December of 2016 were the second warmest of those months on record -- in all three cases, behind records set in 2015. Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature.
A warming El Niño event was in effect for most of 2015 and the first third of 2016. Researchers estimate the direct impact of the natural El Niño warming in the tropical Pacific increased the annual global temperature anomaly for 2016 by 0.2 degrees Fahrenheit (0.12 degrees Celsius). Weather dynamics often affect regional temperatures, so not every region on Earth experienced record average temperatures last year. For example, both NASA and NOAA found the 2016 annual mean temperature for the contiguous 48 United States was the second warmest on record. In contrast, the Arctic experienced its warmest year ever, consistent with record low sea ice found in that region for most of the year. NASA's analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. The result of these calculations is an estimate of the global average temperature difference from a baseline period of 1951 to 1980. NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York. NASA monitors Earth's vital signs from land, air and space with a fleet of satellites, as well as airborne and ground-based observation campaigns. The agency develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing.
NASA shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet. The full 2016 surface temperature data set and the complete methodology used to make the temperature calculation are available at:
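The figures above are anomalies, i.e. departures from the mean of a reference period (1951-1980 in NASA's analysis). The arithmetic is simple; the values below are illustrative stand-ins, not real station data:

```python
def anomaly(value, baseline_values):
    """Temperature anomaly: the departure of a measurement from the mean
    of a reference (baseline) period. NASA's 2016 figure is the departure
    from a 1951-1980 baseline."""
    baseline_mean = sum(baseline_values) / len(baseline_values)
    return value - baseline_mean

# Illustrative, made-up numbers: a three-value stand-in for the thirty
# annual means of the baseline period, whose mean is 14.0 C.
baseline = [13.9, 14.0, 14.1]
a = anomaly(14.99, baseline)   # 0.99 C, the size of the reported 2016 anomaly
```

Reporting anomalies rather than absolute temperatures is what lets analyses with different station networks (NASA's and NOAA's) be compared directly.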
Weather
2017
January 18, 2017
https://www.sciencedaily.com/releases/2017/01/170118103834.htm
Extreme space weather-induced blackouts could cost US more than $40 billion daily
The daily U.S. economic cost from solar storm-induced electricity blackouts could be in the tens of billions of dollars, with more than half the loss from indirect costs outside the blackout zone, according to a new study.
Previous studies have focused on direct economic costs within the blackout zone, failing to take into account indirect domestic and international supply chain loss from extreme space weather. "On average the direct economic cost incurred from disruption to electricity represents only 49 percent of the total potential macroeconomic cost," says the paper. Under the study's most extreme blackout scenario, affecting 66 percent of the U.S. population, the daily domestic economic loss could total $41.5 billion plus an additional $7 billion loss through the international supply chain. Electrical engineering experts are divided on the possible severity of blackouts caused by "Coronal Mass Ejections," or magnetic solar fields ejected during solar flares and other eruptions. Some believe that outages would last only hours or a few days because electrical collapse of the transmission system would protect electricity generating facilities, while others fear blackouts could last weeks or months because those transmission networks could in fact be knocked out and need replacement. Extreme space weather events occur often, but only sometimes affect Earth. The best-known geomagnetic storm affected Quebec in 1989, sparking the electrical collapse of the Hydro-Quebec power grid and causing a widespread blackout for about nine hours. There was a very severe solar storm in 1859 known as the "Carrington event" (after the name of a British astronomer). A widely cited 2012 paper by Pete Riley of Predictive Sciences Inc. said that the probability of another Carrington event occurring within the next decade is around 12 percent; a 2013 report by insurer Lloyd's, produced in collaboration with Atmospheric and Environmental Research, said that while the probability of an extreme solar storm is "relatively low at any given time, it is almost inevitable that one will occur eventually." "We felt it was important to look at how extreme space weather may affect domestic U.S.
production in various economic sectors, including manufacturing, government and finance, as well as the potential economic loss in other nations owing to supply chain linkages," says study co-author Edward Oughton of the Cambridge Centre for Risk Studies at Cambridge Judge Business School. "It was surprising that there had been a lack of transparent research into these direct and indirect costs, given the uncertainty surrounding the vulnerability of electrical infrastructure to solar incidents." The study's scope was guided by a July 2015 conference held at Cambridge Judge. The study looks at three geographical scenarios for blackouts caused by extreme space weather, depending on the latitudes affected by different types of incidents. If only extreme northern states are affected, with 8 percent of the U.S. population, the economic loss per day could reach $6.2 billion supplemented by an international supply chain loss of $0.8 billion. A scenario affecting 23 percent of the population could have a daily cost of $16.5 billion plus $2.2 billion internationally, while a scenario affecting 44 percent of the population could have a daily cost of $37.7 billion in the US plus $4.8 billion globally. (The study is calculated using 2011 U.S. dollars.) Manufacturing is the U.S. economic sector most affected by those solar-induced blackouts, followed by government, finance and insurance, and property. Outside of the U.S., China would be most affected by the indirect cost of such U.S. blackouts, followed by Canada and Mexico -- as "these countries provide a greater proportion of raw materials, and intermediate goods and services, used in production by U.S. firms."
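The scenario figures reported above can be tabulated and summed directly. The scenario labels below are our own shorthand, not the study's:

```python
# Daily loss estimates from the study, in billions of 2011 U.S. dollars:
# (share of U.S. population affected, domestic loss, supply-chain loss abroad)
scenarios = {
    "extreme north": (0.08, 6.2, 0.8),
    "mid-latitude":  (0.23, 16.5, 2.2),
    "broad":         (0.44, 37.7, 4.8),
    "most extreme":  (0.66, 41.5, 7.0),
}

def total_daily_loss(name):
    """Domestic plus international daily loss for a named scenario."""
    _, domestic, international = scenarios[name]
    return domestic + international

# The most extreme scenario totals 41.5 + 7.0 = 48.5 billion USD per day.
```

Note that the 49 percent figure quoted from the paper is an average across scenarios, so individual rows need not split exactly that way.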
Weather
2017
January 18, 2017
https://www.sciencedaily.com/releases/2017/01/170118083415.htm
Climate change to shift global pattern of mild weather
As scientists work to predict how climate change may affect hurricanes, droughts, floods, blizzards and other severe weather, there's one area that's been overlooked: mild weather. But no more.
NOAA and Princeton University scientists have produced the first global analysis of how climate change may affect the frequency and location of mild weather -- days that are perfect for an outdoor wedding, baseball, fishing, boating, hiking or a picnic. Scientists defined "mild" weather as temperatures between 64 and 86 degrees F, with less than a half inch of rain and dew points below 68 degrees F, indicative of low humidity. Knowing the general pattern for mild weather over the next decades is also economically valuable to a wide range of businesses and industries. Travel, tourism, construction, transportation, agriculture, and outdoor recreation all benefit from factoring weather patterns into their plans. The new research was published in the journal. "Extreme weather is difficult to relate to because it may happen only once in your lifetime," said first author Karin van der Wiel, a Princeton postdoctoral researcher at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL) located on the university's Forrestal Campus. "We took a different approach here and studied a positive meteorological concept, weather that occurs regularly, and that's easier to relate to." Scientists predict the largest decreases in mild weather will happen in tropical regions because of rising heat and humidity. The hardest-hit areas are expected to be in Africa, Asia and Latin America, where some regions could see 15 to 50 fewer days of mild weather a year by the end of the century. These are also areas where NOAA and partner research shows economic damages due to climate change. The loss of mild weather days, especially during summer, when they can serve to break up extended heatwaves, also could significantly affect public health. People living in the mid-latitudes, which include much of the United States, as well as many mountainous areas around the world, will gain mild weather days on average, the new study found.
The biggest winners will include communities along the border with Canada in the Northeast, Midwest and Northwest, as well as many parts of Canada. Other areas projected to gain as much as 10 to 15 days more annually of mild weather by the end of the 21st century include parts of England and northern Europe, and Patagonia in extreme southern South America. In some of these areas, mild weather will drop during increasingly hot and humid summers but become more plentiful in fall, winter and spring as winters warm and the shoulder seasons last longer. "We believe improving the public understanding of how climate change will affect something as important as mild weather is an area ripe for more research and more focused studies," said Sarah Kapnick, a physical scientist at NOAA's GFDL and co-author. "Predicting changes in mild weather is not only important to business and industry, but can also contribute to research on the future of physical and mental health, leisure and urban planning." Scientists used high-resolution climate models to investigate the changing patterns of mild weather globally by examining the effect over time of increased warming from the buildup of greenhouse gas emissions in the atmosphere. The work was made possible by decades of Earth system and model development at NOAA's GFDL and by improvements made to NOAA's research supercomputing capability, including access to two high performance supercomputers, Gaea and Theia, named after figures in Greek mythology.
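The study's "mild weather" definition given above is a simple set of thresholds, and can be expressed as a one-line test (the function and argument names are our own shorthand):

```python
def is_mild_day(temp_f, rain_inches, dew_point_f):
    """The study's 'mild weather' criteria as summarized above:
    temperature between 64 and 86 F, less than half an inch of rain,
    and a dew point below 68 F (indicative of low humidity)."""
    return 64 <= temp_f <= 86 and rain_inches < 0.5 and dew_point_f < 68
```

Counting the days of model output that pass this test, per grid cell and per year, is in essence how a map of gained or lost mild-weather days can be built.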
Weather
2017
January 16, 2017
https://www.sciencedaily.com/releases/2017/01/170116121807.htm
Study tracks 'memory' of soil moisture
The top two inches of topsoil on all of Earth's landmasses contains an infinitesimal fraction of the planet's water -- less than one-thousandth of a percent. Yet because of its position at the interface between the land and the atmosphere, that tiny amount plays a crucial role in everything from agriculture to weather and climate, and even the spread of disease.
The behavior and dynamics of this reservoir of moisture have been very hard to quantify and analyze, however, because measurements have been slow and laborious to make. That situation changed with the launch in 2015 of a NASA satellite called SMAP (Soil Moisture Active Passive), designed to provide globally comprehensive and frequent measurements of the moisture in that top layer of soil. SMAP's first year of observational data has now been analyzed and is providing some significant surprises that will help in the modeling of climate, forecasting of weather, and monitoring of agriculture around the world. These new results are reported in the journal. The SMAP observations are providing an unprecedented level of detailed, worldwide information on the amount of water in those top 2 inches (5 centimeters) of soil, collected globally every two to three days. Entekhabi says this is important because this thin layer is a key part of the global water cycle over the continents, and also a key factor in the global energy and carbon cycles. Precipitation on land, and the evaporation of that moisture from the land, "transfers large amounts of energy" between the continents and the atmosphere, Entekhabi says, and Earth's climate would be drastically different without this element. The oceans, containing 97 percent of Earth's water, play a major role in storing and releasing heat, but over land that role is provided by the moisture in the topmost layer of the soil, albeit through different mechanisms. That moisture "is a tiny, tiny fraction of the water budget, but it's sitting at a very critical zone at the surface of the land, and plays a disproportionately critical role in the cycling of water," he says. "It plays a significant role in moderating climate, on seasonal and annual timescales." Understanding these cycles better, thanks to the new data, could help make weather predictions more accurate over longer timescales, which could be an important boon for agriculture.
Several federal agencies have already begun using the SMAP data, Entekhabi says, for example, to help make forecasting of drought and flood conditions more accurate. "The satellite is providing an extraordinary quality of surface soil moisture information that makes this analysis possible," he says. The satellite's primary mission of three years is about halfway over, he says, but the team is working on applying for an extended mission that could last as much as a decade. One of the big surprises from the new data is that this top level of soil preserves a "memory" for weather anomalies, more so than had been predicted from theory and earlier, sparser measurements. Memory refers to the persistence of effects from unusually high or low amounts of rainfall. Contrary to most researchers' expectations, it turns out that these effects persist for a matter of days, rather than just a few hours. On average, about one-seventh of the amount of rain that falls is still present in that topmost layer of soil three days after it falls -- and this persistence is greatest in the driest regions. The data also show a significant feedback effect that can amplify the effects of both droughts and floods, Entekhabi says. When moisture evaporates from wet soil, it cools the soil in the process, but when the soil gets too dry that cooling diminishes, which can lead to hotter weather and heat waves that extend and deepen drought conditions. Such effects "had been speculated," he says, "but hadn't been observed directly." The ongoing SMAP mission also provides educational opportunities that help to verify and calibrate the satellite data. With minimal equipment, students can participate in hands-on lessons in data collection, using measurement methods that are considered the gold standard. For example, they can gather a sample of soil in a fixed volume such as a tuna can, and weigh it before and after drying it out.
The difference between the two weights gives a precise measure of the soil's moisture content in that volume, which can be compared with the satellite's moisture measurement. Even young students "can carry out 'gold standard' measurements, and all it takes is a kitchen scale and an oven," Entekhabi says. "But it's very labor-intensive. So we have engaged with schools around the world to do these measurements."
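The weigh-dry-weigh exercise described above reduces to one division, and the reported three-day "memory" figure invites a similarly small calculation. The sample masses below are invented, and the exponential-decay reading of the memory statistic is our assumption, not a claim of the study:

```python
import math

def gravimetric_moisture(wet_mass_g, dry_mass_g):
    """Gravimetric soil moisture from the tuna-can exercise: the mass of
    water lost on oven drying, per unit mass of dry soil."""
    return (wet_mass_g - dry_mass_g) / dry_mass_g

# Hypothetical sample: a 120 g field sample that weighs 100 g after
# drying held 0.2 g of water per gram of dry soil.
theta = gravimetric_moisture(120.0, 100.0)

# If the reported memory (about 1/7 of rainfall still present after
# three days) decayed exponentially, the e-folding time would be
# 3 / ln(7), roughly 1.5 days.
tau_days = 3 / math.log(7)
```

Comparing many such ground samples against the satellite's coincident retrievals is what makes the student measurements useful for calibration.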
Weather
2017
January 13, 2017
https://www.sciencedaily.com/releases/2017/01/170113155423.htm
Giant Middle East dust storm caused by a changing climate, not human conflict
In August 2015, a dust storm blanketed large areas of seven Middle East nations in a haze of dust and sand thick enough to obscure them from satellite view. The storm led to several deaths, thousands of cases of respiratory ailments and injuries, and canceled airline flights and closed ports.
At the time, the storm's unusual severity was attributed to the ongoing civil war in Syria by media outlets in the Middle East, Europe and the United States. Reports blamed the conflict for changes in land use and cover -- and for activities like increased military traffic over unpaved surfaces and farmers reducing irrigation or abandoning agricultural land -- that created extreme amounts of dust to fuel the storm. Now, a team of researchers including Elie Bou-Zeid, an associate professor of civil and environmental engineering at Princeton who experienced the storm while in Lebanon, has found a more likely cause for the unprecedented storm -- it was not human conflict, but a combination of climatic factors and unusual weather. While reduced vegetation cover and soil disturbance can make more sediment available for emission in dust storms, the researchers say, the widely reported link between the storm and the fighting in Syria was untested and lacked empirical support. "The reports suggesting that this was related to the conflict in Syria were not supported by any research," Bou-Zeid said. "It was just hypotheticals thrown into the air." As the storm got more attention, Shmuel Assouline of Israel's Agricultural Research Organization and Bou-Zeid emailed several colleagues and suggested they look into the cause of the storm. For their study, published Nov.
8, 2016, in After gathering data on surface air temperature, humidity and wind speed, and running meteorological simulations for the region before, during and after the storm using the Weather Research and Forecasting (WRF) model, the researchers say that climate, not conflict, led to the conditions that made the storm possible."The simulations showed that what was very unique about this storm is that first, it was preceded by a very hot period, and so the land that was not covered with vegetation would be drier and it would be easier to entrain sand grains from it," Bou-Zeid said.Summer 2015 was unusually hot and dry relative to the last 20 years, and extreme high temperatures and low humidity were more frequent in August and September than during the region's long-term drought, which lasted from 2007 to 2010, Bou-Zeid said. The extremely arid conditions increased the amount of dust available and lowered its threshold for erosion, making it more likely that the dust would dislodge into the atmosphere.The other factor that helped generate the storm, the researchers say, was an unusual wind pattern. "Usually these dust storms are created somewhere between Syria and Iraq and are transported south, but during this period the wind pattern was going east to west," Bou-Zeid said. The wind reversal added friction on the ground -- which dislodges more dust -- and transported it westward over long distances before depositing it in high concentrations on the densely populated east Mediterranean coast.The storm ultimately enveloped major parts of Syria, Lebanon, Turkey, Israel, Egypt, Jordan and the Palestinian territories.Bou-Zeid said it was important to answer the question of the storm's origins directly. "If the cause of the storm was human conflict, then when the conflict ends, the causes go away, and that's all good," he said. 
"But if the cause was not conflict and is more climate, and this is due to climatic conditions that are going to become more frequent in the future, then this is something that will reoccur." The team warns that if the Middle East becomes more arid in the long term due to climate change, extreme dust storms may become more common, and their impact unavoidable. Bou-Zeid and Assouline's co-authors on the paper, "Climate, not conflict, explains extreme Middle East dust storm," were Anthony Parolari from Marquette University, Dan Li from Boston University and Gabriel Katul from Duke University.
Weather
2017
January 13, 2017
https://www.sciencedaily.com/releases/2017/01/170113155611.htm
Changing atmospheric conditions may contribute to stronger ocean waves in Antarctica
Over the past few years, a large fracture has grown across a large floating ice shelf on the Antarctic Peninsula. The world is watching the ice shelf, now poised to break off an iceberg the size of Delaware into the ocean.
It's not a new phenomenon; this "thumb" of Antarctica, which juts out into the stormy Southern Ocean, has lost more than 28,000 square kilometers of floating ice -- almost as large as Massachusetts -- over the past half-century. This has included the complete disintegration of four ice shelves, the floating extensions of glaciers. Now, a new study led by Colorado State University provides important details on the extent of sea ice, which can protect ice shelves from the impacts of ocean storms, in the Antarctic Peninsula. Scientists have long thought that a shift in the Southern Annular Mode, which describes a large-scale pattern of atmospheric variability for the Southern Hemisphere similar to El Nino in the tropics, may produce conditions that can lead to the collapse of ice shelves. The CSU-led research team offers important details on how the Southern Annular Mode affects storm activity and the extent of sea ice surrounding the Antarctic Peninsula. Sea ice may protect ice shelves from the impacts of ocean storms by weakening wave intensity before it reaches the coastline. The researchers utilized a novel approach of studying long-term variations in seismic signals, called microseisms, generated by ocean waves in the region. The findings have implications for the wave environment of the Southern Ocean and, potentially, for factors driving the collapse of ice shelves, which can lead to an accelerated increase in global sea level. Robert Anthony, who recently received a Ph.D. from CSU's Department of Geosciences and is now a Mendenhall Research Fellow at the U.S. Geological Survey's Albuquerque Seismological Laboratory, said that the team looked at 23 years of seismic data from Palmer Station on the Antarctic Peninsula and East Falkland Island near South America.
They looked specifically at seismic signals generated by ocean waves. "We were able to show that storm and ocean wave activity in the Drake Passage, the ocean basin between the Antarctic Peninsula and South America, increases during positive phases of the Southern Annular Mode," he explained. "We were also able to verify that sea ice cover does indeed impede ocean swell from reaching the coastline by showing which regions of sea ice impact the intensity of microseisms. This type of analysis may be useful for future applications of using seismic records to track the strength of sea ice over large regions, which has been difficult to determine from satellite observations." Anthony, lead author of the study, said that based on the findings, the positive phase of the Southern Annular Mode may contribute to ice shelf weakening and potential collapse events in several ways. Researchers had previously speculated on a link between ice shelf collapse and the Southern Annular Mode, based mainly on elevated air temperatures. But the CSU team now suspects that the reduction of sea ice and strong wave events in the Drake Passage could also play a role in rapid collapse events, such as the dramatic collapse of the Larsen A ice shelf in 1995 and, perhaps, the ongoing fracturing of the Larsen C ice shelf. The team's next steps include looking more closely at specific ocean swell events and sea ice conditions during known ice shelf collapses and large iceberg calving events.
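Sea ice damps incoming swell roughly exponentially with the distance the waves travel through the ice, which is the physical reason it shields ice-shelf fronts. A toy illustration of that attenuation (the decay coefficient here is invented for the example, not taken from the study):

```python
import math

# Exponential decay of ocean-swell amplitude across a sea-ice field.
def swell_amplitude(a0_m: float, distance_km: float,
                    alpha_per_km: float = 0.02) -> float:
    """Swell amplitude (m) after crossing `distance_km` of sea ice."""
    return a0_m * math.exp(-alpha_per_km * distance_km)

print(swell_amplitude(3.0, 0.0))              # open water: full 3 m swell
print(round(swell_amplitude(3.0, 100.0), 2))  # ~0.41 m after 100 km of ice
```

With less sea ice during positive Southern Annular Mode phases, the effective `distance_km` shrinks and more wave energy reaches the shelf front.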
Weather
2017
January 10, 2017
https://www.sciencedaily.com/releases/2017/01/170110101618.htm
Changing rainfall patterns linked to water security in India
Changing rainfall is the key factor driving changes in groundwater storage in India, according to a new study led by the Indian Institute of Technology (IIT) Gandhinagar.
Agriculture in India relies heavily on groundwater for irrigation, particularly in the dry northern regions where precipitation is scarce. Groundwater withdrawals in the country have increased over tenfold since the 1950s, from 10-20 cubic kilometers per year in 1950, to 240-260 cubic kilometers per year in 2009. And satellite measurements have shown major declines in groundwater storage in some parts of the country, particularly in northern India. "Groundwater plays a vital role in food and water security in India. Sustainable use of groundwater resources for irrigation is the key for future food grain production," says study leader Vimal Mishra, of the IIT Gandhinagar. "And with a fast-growing population, managing groundwater sustainably is going to become even more important. The linkage between monsoon rainfall and groundwater can suggest ways to enhance groundwater recharge in India and especially in the regions where rainfall has been declining, such as the Indo-Gangetic Plain." Groundwater acts like a bank for water storage, receiving deposits from surface water and precipitation, and withdrawals as people pump out water for drinking, industry, and irrigating fields. If withdrawals add up to more than the deposits, eventually the accounts could run dry, which could have disastrous consequences. "This study adds another dimension to the existing water management framework. We need to consider not just the withdrawals, but also the deposits in the system," says Yoshihide Wada, a study coauthor and the deputy director of the Water program at the International Institute for Applied Systems Analysis (IIASA) in Austria. The issue of groundwater depletion has been a topic of much discussion in India, but most planning has focused on pumping, or the demand side, rather than the deposit side. By looking at water levels in wells around the country, the researchers could track groundwater replenishment following the monsoons.
They found that, in fact, variability in the monsoons is the key factor driving the changing groundwater storage levels across the country, even as withdrawals increase. In addition, the researchers found that the monsoon precipitation is correlated with Indian Ocean temperature, a finding which could potentially help to improve precipitation forecasts and aid in water resource planning. "Weather is uncertain by nature, and the impacts of climate change are extremely difficult to predict at a regional level," says Wada. "But our research suggests that we must focus more attention on this side of the equation if we want to sustainably manage water resources for the future."
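The "bank" analogy in the passage above is just a running balance: each year's storage equals last year's storage plus monsoon recharge minus pumping. A sketch with made-up numbers (the ~250 cubic kilometers per year withdrawal echoes the article's figures; the recharge series and starting balance are invented):

```python
# Groundwater storage as a running bank balance of deposits (recharge)
# and withdrawals (pumping), all in cubic kilometers.
def storage_series(initial_km3, recharge, withdrawals):
    levels = [initial_km3]
    for dep, wd in zip(recharge, withdrawals):
        levels.append(levels[-1] + dep - wd)
    return levels

# A weak monsoon year (180 km3) drains the account; a strong one (260 km3)
# partially refills it, even with pumping held constant at 250 km3/yr.
print(storage_series(1000.0, [240.0, 180.0, 260.0], [250.0, 250.0, 250.0]))
# [1000.0, 990.0, 920.0, 930.0]
```

The point of the study is that the deposit side of this ledger, monsoon recharge, varies enough to dominate the trend.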
Weather
2017
January 9, 2017
https://www.sciencedaily.com/releases/2017/01/170109190328.htm
Newly proposed reference datasets improve weather satellite data quality
"Traffic and weather, together on the hour!" blasts your local radio station, while your smartphone knows the weather halfway across the world. A network of satellites whizzing around Earth collecting mountains of data makes such constant and wide-ranging access to accurate weather forecasts possible. Just one satellite, such as the National Oceanic and Atmospheric Administration's (NOAA) Geostationary Operational Environmental Satellite-R that launched in 2016, can collect 3.5 terabytes of weather data per day.
But how do scientists ensure satellite-measured weather data is good? They can compare live data against high-quality reference data from in-orbit satellites. Making such resources available is a goal of the Global Space-based Inter-Calibration System (GSICS), an international consortium of 15 satellite agencies that collaborate on monitoring satellites and developing methods to ensure the quality of their weather data. "The quality of the satellite data drives how prepared nations -- and the world -- can be when it comes to weather-related events," said GSICS Deputy Director Manik Bali, a faculty research assistant in the Earth System Science Interdisciplinary Center (ESSIC), a joint center of the University of Maryland and NASA's Goddard Space Flight Center. Bali is also a NOAA affiliate. The World Meteorological Organization (WMO), a United Nations specialized agency, and the Coordination Group for Meteorological Satellites (CGMS) launched GSICS in 2005. ESSIC contributes manpower and infrastructure support to GSICS, including the servers needed to share data between GSICS collaborators worldwide, enabling the monitoring of weather satellites among member agencies and the correction of measurement anomalies in real time. One GSICS breakthrough came in 2011, with a paper demonstrating that a GSICS-developed algorithm corrected a temperature difference of approximately 3 degrees Celsius between two satellites. The results were published in the Bulletin of the American Meteorological Society. While that temperature difference may sound small, the world's nations recently negotiated the Paris Climate Agreement, which seeks to limit global warming to a maximum of 2 degrees Celsius above pre-industrial temperatures. Also in 2011, Cheng-Zhi Zou, a NOAA research scientist and former chair of the GSICS Microwave Subgroup, intercalibrated 38 years of climate data -- starting in 1979 -- to generate what NOAA calls a fundamental climate data record (FCDR).
The FCDR was published in the Journal of Geophysical Research: Atmospheres. At the American Geophysical Union's (AGU) fall meeting in December 2016, Bali demonstrated that Zou's FCDR was suitable for monitoring microwave satellites, including the Advanced Technology Microwave Sounder onboard NOAA/NASA's Joint Polar Satellite System (JPSS). When launched, JPSS will replace the aging National Polar-orbiting Operational Environmental Satellite System and provide full global monitoring coverage twice a day. Bali expects the FCDR will help monitor and adjust data gathered during JPSS missions. At the recent AGU meeting, Bali also showed that the European Organisation for the Exploitation of Meteorological Satellites' Infrared Atmospheric Sounding Interferometer (IASI) and NASA's Atmospheric Infrared Sounder exhibit sufficiently stable behavior to serve as in-orbit references. Calibrating against these satellites can reduce errors from 2 degrees Celsius to below 0.1 degrees Celsius. "This has given tremendous confidence to the GSICS calibration community that uses IASI-A as an in-orbit reference to monitor its geostationary satellites," said Bali. Moving forward, Bali's colleagues at ESSIC will continue to support the science goals of the JPSS satellite mission through the Cooperative Institute for Climate and Satellites (CICS), which is managed by ESSIC and was created in 2009 through a $93 million agreement with NOAA. "ESSIC's leadership in supporting these global initiatives is very important," said Bali. "Looking ahead, I see a far greater interaction between NOAA and ESSIC/CICS, which will help NOAA lead the global satellite calibration efforts."
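Inter-calibration of the kind GSICS performs boils down to estimating an instrument's bias against collocated measurements from a stable in-orbit reference, then removing it. A deliberately simplified constant-offset sketch (real GSICS corrections are scene-, channel- and time-dependent; all numbers here are invented):

```python
import statistics

def mean_bias(monitored, reference):
    """Average difference between a monitored instrument and a reference."""
    return statistics.mean(m - r for m, r in zip(monitored, reference))

def correct(monitored, bias):
    """Remove a constant bias from the monitored measurements."""
    return [m - bias for m in monitored]

ref = [220.1, 250.4, 271.0, 290.2]  # reference brightness temperatures (K)
mon = [223.2, 253.3, 274.1, 293.1]  # monitored satellite, running ~3 K warm
bias = mean_bias(mon, ref)
print(round(bias, 2))  # 3.0
# after correction, each value lands within 0.1 K of the reference
print([round(x, 1) for x in correct(mon, bias)])
```

The 3-degree figure mirrors the magnitude of the 2011 correction described above, where an offset of that size between two satellites was identified and removed.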
Weather
2017
January 6, 2017
https://www.sciencedaily.com/releases/2017/01/170106162952.htm
Large-scale tornado outbreaks increasing in frequency
The frequency of large-scale tornado outbreaks is increasing in the United States, particularly when it comes to the most extreme events, according to recently published research.
The study by researchers including Joel E. Cohen, a visiting scholar at the University of Chicago, finds the increase in tornado outbreaks does not appear to be the result of a warming climate as earlier models suggested. Instead, their findings tie the growth in frequency to trends in the vertical wind shear found in certain supercells -- a change not so far associated with a warmer climate. "What's pushing this rise in extreme outbreaks, during which the vast majority of tornado-related fatalities occur, is far from obvious in the present state of climate science," said Cohen, the Abby Rockefeller Mauzé Professor at Rockefeller University and Professor of Earth and Environmental Sciences at Columbia University, who conducted the research while a visiting scholar in UChicago's Department of Statistics. Tornado outbreaks are large-scale weather events that last one to three days, featuring several thunderstorms and six or more tornadoes in close succession. In the study, published Dec. 16, the researchers estimated that the number of tornadoes in the most extreme outbreak in a five-year interval doubled over the last half-century.
This means that in 1965 the worst outbreak expected over five years would have had about 40 tornadoes, while in 2015 the worst outbreak expected over five years would have had about 80 tornadoes. "Viewing the data on thousands of tornadoes that have been reliably recorded in the United States over the past half-century as a population has permitted us to ask new questions and discover new, important changes in outbreaks of these tornadoes," Cohen said. To understand the increased frequency in tornado outbreaks, the researchers looked at two factors: convective available potential energy, or CAPE, and storm relative helicity, which is a measure of vertical wind shear. Earlier studies had projected a warming climate would increase CAPE, creating conditions favorable to a rise in severe thunderstorms -- and potentially tornado outbreaks. But Cohen and his colleagues found the increases in outbreaks were driven instead by storm relative helicity, which has not been projected to increase under a warming climate. "Our study raises new questions about what climate change will do to severe thunderstorms and what is responsible for recent trends," said co-author Michael K. Tippett, an associate professor at Columbia University's Fu Foundation School of Engineering and Applied Science. "The fact that we didn't see the presently understood meteorological signature of global warming in changing outbreak statistics for tornadoes leaves two possibilities: Either the recent increases are not due to a warming climate, or a warming climate has implications for tornado activity that we don't understand."
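The reported change, roughly 40 tornadoes in the worst expected five-year outbreak in 1965 versus roughly 80 in 2015, corresponds to a doubling time of about 50 years. A back-of-the-envelope sketch of that trend (the constant exponential growth rate is our assumption for illustration; the paper's extreme-value analysis is more sophisticated):

```python
# Expected tornado count in the worst five-year outbreak, assuming a
# steady exponential trend that doubles every 50 years (an illustrative
# assumption; the study itself fits extreme-value statistics).
def expected_worst_outbreak(year, n0=40.0, base_year=1965, doubling_years=50.0):
    return n0 * 2.0 ** ((year - base_year) / doubling_years)

print(round(expected_worst_outbreak(1965)))  # 40
print(round(expected_worst_outbreak(2015)))  # 80
print(round(expected_worst_outbreak(1990)))  # ~57, the log-scale midpoint
```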
Weather
2017
January 5, 2017
https://www.sciencedaily.com/releases/2017/01/170105101314.htm
Hot weather not to blame for salmonella on egg farms
New research conducted by the University of Adelaide shows there is no greater risk of Salmonella contamination of eggs during hot weather.
Despite a higher number of cases of Salmonella food poisoning reported in warmer months, the findings are further evidence that the hygiene around egg handling in the supply chain and in household and restaurant kitchens is critical to reducing food poisoning from eggs. Researchers conducted a study of four Australian commercial free range egg farms, with the results now published online ahead of print. "Eggs and egg products have been associated with an increased risk of Salmonella food poisoning," the researchers note. "Birds raised in the free range production system could potentially be exposed to weather extremes, and the free range environment is not as easily controlled as in cage egg production. Therefore, it has been assumed that hot weather has a role to play in the potential contamination of eggs at the site of free range egg production. Our results show the types and levels of Salmonella present on farms. However, we found that there was no direct association between hot weather and increased prevalence of Salmonella at the production stage, even when data was collected in the hottest month of February," Associate Professor Chousalkar says. "This helps to reinforce a simple health safety message: that it's important for people to wash their hands before and after handling eggs, whether at home, in a restaurant, or while working in the supply chain." As well as renewing calls for people to practice good hand hygiene when using eggs, Associate Professor Chousalkar says there is a need for nationwide standards and uniform practices on the surveillance of egg contamination and safety. "Currently, each of the states has their own food safety and surveillance programs. Because of its implications for public health, we believe the incidence of Salmonella contamination should be monitored at a national level."
Weather
2017
January 4, 2017
https://www.sciencedaily.com/releases/2017/01/170104154359.htm
Potential instability in Atlantic Ocean water circulation system
One of the world's largest ocean circulation systems may not be as stable as today's weather models predict, according to a new study.
In fact, changes in the Atlantic Meridional Overturning Circulation (AMOC) -- the same deep-water ocean current featured in the movie "The Day After Tomorrow" -- could occur quite abruptly, in geologic terms, the study says. The research appeared online Jan. 4. "We show that the possibility of a collapsed AMOC under global warming is hugely underestimated," said Wei Liu, a postdoctoral associate in the Department of Geology and Geophysics at Yale University and lead author of the study. Liu began the research when he was a graduate student at the University of Wisconsin-Madison, and continued it at the Scripps Institution of Oceanography, prior to coming to Yale. AMOC is responsible for carrying oceanic heat northward in the Atlantic Ocean. It consists of a lower limb of denser, colder water that flows south, and an upper limb of warm, salty water that flows north. The system is a major factor for regional climate change, affecting the Atlantic rim countries, especially those in Europe. "In current models, AMOC is systematically biased to be in a stable regime," Liu said. "A bias-corrected model predicts a future AMOC collapse with prominent cooling over the northern North Atlantic and neighboring areas. This has enormous implications for regional and global climate change." A collapse of the AMOC system, in Liu's model, would cool the Northern Atlantic Ocean, cause a spreading of Arctic sea ice, and move tropical Atlantic rain belts farther south. While a calamity on the order of the fictional plot of "The Day After Tomorrow" is not indicated, the researchers said a significant weather change could happen quickly in the next few centuries. "It's a very provocative idea," said study co-author Zhengyu Liu, professor of atmospheric and oceanic sciences, and of environmental studies at the University of Wisconsin-Madison Center for Climatic Research in the Nelson Institute.
"For me it's a 180-degree turn because I had been thinking like everyone else." The researchers stressed that their new model may require additional refinement, as well. They said detailed information about water salinity, ocean temperature, and melting ice -- over a period of decades -- is essential to the accuracy of AMOC models. The researchers also noted the major impact that climate change itself has on AMOC patterns. Additional carbon dioxide, for example, warms the cold water of the North Atlantic. Such developments would have an impact on AMOC behavior, the researchers said. Other co-authors of the study are Shang-Ping Xie of the Scripps Institution of Oceanography and Jiang Zhu of the University of Wisconsin-Madison. The National Science Foundation, the U.S. Department of Energy, and the Ministry of Science and Technology of the People's Republic of China funded the research.
Weather
2017
January 4, 2017
https://www.sciencedaily.com/releases/2017/01/170104103918.htm
Students film breathtaking curvature of Earth using high altitude weather balloon
Physics students from the University of Leicester have captured breathtaking images of Earth's stratosphere using a high altitude weather balloon.
The unmanned balloon and sensor payload reached an altitude of 23.6 km, putting it at 1.7 times the altitude ceiling of a 747 airliner. In conditions close to a vacuum, with ambient temperatures around -56°C, the payload then descended quickly to Earth, reaching a maximum speed of over 100 mph. The launch took place in December near Tewkesbury, Gloucestershire, and the payload was recovered in Warwickshire. As well as producing photographs and video, this flight tested electronic control systems for future pollution monitoring flights and advanced navigational systems. It also allowed the students, assisted by amateur radio enthusiasts, to test tracking techniques which will be used again on future flights. Student Robert Peck, from the University of Leicester Department of Physics and Astronomy, said: "We've proven the reliability of the payload electronics and tracking methods, the payload returned in perfect condition, that's a lot to say for something that's been to 23.6km and plunged back to earth at over 44.7m/s. The tracking also worked perfectly, we are indebted to the amateur radio community for helping us to set up the tracking equipment." The flight was conducted by student members of the University of Leicester's Astronomy and Rocketry society, with Ryan Bradley-Evans as team leader of the project, Oli Thomas operating the tracking equipment, Robert Peck responsible for flight control electronics and Aleisha Hogan responsible for public relations. Several other team members also assisted with the project. The team is planning future launches aiming to test the full sensor and advanced navigation systems which time constraints prevented them from launching on the first flight. With the control electronics proven, they consider their chances of success to be high. A video of the high altitude weather balloon on its flight to the stratosphere is available, with the raw file available on request.
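The article's figures are internally consistent, which a couple of unit conversions confirm: 44.7 m/s is almost exactly 100 mph, and 23.6 km is about 1.7 times a 747's service ceiling (we assume roughly 13.7 km for a typical 747). A quick check:

```python
# Convert the quoted descent speed to mph and compare the altitudes.
METERS_PER_MILE = 1609.344
SECONDS_PER_HOUR = 3600.0

def ms_to_mph(v_ms: float) -> float:
    return v_ms * SECONDS_PER_HOUR / METERS_PER_MILE

print(round(ms_to_mph(44.7), 1))  # 100.0 mph
print(round(23.6 / 13.7, 2))      # 1.72, matching the quoted 1.7x
```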
Weather
2017
January 3, 2017
https://www.sciencedaily.com/releases/2017/01/170103122313.htm
More extreme storms ahead for California
On December 11, 2014, a freight train of a storm steamed through much of California, deluging the San Francisco Bay Area with three inches of rain in just one hour. The storm was fueled by what meteorologists refer to as the "Pineapple Express" -- an atmospheric river of moisture that is whipped up over the Pacific's tropical waters and swept north with the jet stream.
By evening, record rainfall had set off mudslides, floods, and power outages across the state. The storm, which has been called California's "storm of the decade," is among the state's most extreme precipitation events in recent history. Now MIT scientists have found that such extreme precipitation events in California should become more frequent as the Earth's climate warms over this century. The researchers developed a new technique that predicts the frequency of local, extreme rainfall events by identifying telltale large-scale patterns in atmospheric data. For California, they calculated that, if the world's average temperatures rise by 4 degrees Celsius by the year 2100, the state will experience three more extreme precipitation events than the current average, per year. The researchers have published their results. "One of the struggles is, coarse climate models produce a wide range of outcomes. [Rainfall] can increase or decrease," says Adam Schlosser, senior research scientist in MIT's Joint Program on the Science and Policy of Global Change. "What our method tells you is, for California, we're very confident that [heavy precipitation] will increase by the end of the century." The research was led by Xiang Gao, a research scientist in the Joint Program on the Science and Policy of Global Change. The paper's co-authors include Paul O'Gorman, associate professor of earth, atmospheric, and planetary sciences; Erwan Monier, principal research scientist in the Joint Program; and Dara Entekhabi, the Bacardi Stockholm Water Foundations Professor of Civil and Environmental Engineering. Currently, researchers estimate the frequency of local heavy precipitation events mainly by using precipitation information simulated from global climate models. But such models typically carry out complex computations to simulate climate processes across hundreds and even thousands of kilometers.
At such coarse resolution, it's extremely difficult for such models to adequately represent small-scale features such as moisture convection and topography, which are essential to making accurate predictions of precipitation. To get a better picture of how future precipitation events might change region by region, Gao decided to focus not on simulated precipitation but on large-scale atmospheric patterns, which climate models are able to simulate much more reliably. "We've actually found there's a connection between what climate models do really well, which is to simulate large-scale motions of the atmosphere, and local, heavy precipitation events," Schlosser says. "We can use this association to tell how frequently these events are occurring now, and how they will change locally, like in New England, or the West Coast." While definitions vary for what is considered an extreme precipitation event, in this case the researchers defined such an event as being within the top 5 percent of a region's precipitation amounts in a particular season, over periods of almost three decades. They focused their analysis on two areas: California and the Midwest, regions which generally experience relatively high amounts of precipitation in the winter and summer, respectively. For both regions, the team analyzed large-scale atmospheric features such as wind currents and moisture content, from 1979 to 2005, and noted their patterns each day that extreme precipitation occurred. Using statistical analysis, the researchers identified telltale patterns in the atmospheric data that were associated with heavy storms. "We essentially take snapshots of all the relevant weather information, and we find a common picture, which is used as our red flag," Schlosser explains.
"When we examine historical simulations from a suite of state-of-the-art climate models, we peg every time we see that pattern." Using the new scheme, the team was able to reproduce collectively the frequency of extreme events that were observed over the 27-year period. More importantly, the results are much more accurate than those based on simulated precipitation from the same climate models. "None of the models are even close to the observations," Gao says. "And regardless of the combination of atmospheric variables we used, the new schemes were much closer to observations." Bolstered by their results, the team applied their technique to large-scale atmospheric patterns from climate models to predict how the frequency of heavy storms may change in a warming climate in California and the Midwest over the next century. They analyzed each region under two climate scenarios: a "business as usual" case, in which the world is projected to warm by 4 degrees Celsius by 2100, and a policy-driven case, in which global environmental policies regulating greenhouse gases keep the temperature increase to 2 degrees Celsius. For each scenario, the team flagged those modeled large-scale atmospheric patterns that they had determined to be associated with heavy storms. In the Midwest, yearly instances of summer extreme precipitation decreased slightly under both warming scenarios, although the researchers say the results are not without uncertainty. For California, the picture is much clearer: Under the more intense scenario of global warming, the state will experience three more extreme precipitation events per year, on the order of the December 2014 storm. Under the policy-driven scenario, Schlosser says, "that trend is cut in half." The team is now applying its technique to predict changes in heat waves from a globally warming climate. The researchers are looking for patterns in atmospheric data that correlate with past heat waves.
If they can more reliably predict the frequency of heat waves in the future, Schlosser says that can be extremely helpful for the long-term maintenance of power grids and transformers. "That is actionable information," Schlosser says.
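The study's working definition of an extreme event, a precipitation total in the top 5 percent for the region, is easy to state as a thresholding rule. A minimal sketch on synthetic daily totals (the data and function are illustrative only, not from the paper):

```python
# Flag days whose precipitation falls in the top `top_fraction` of the record.
def extreme_events(daily_mm, top_fraction=0.05):
    ranked = sorted(daily_mm)
    cutoff = ranked[int(len(ranked) * (1.0 - top_fraction))]
    return [(day, mm) for day, mm in enumerate(daily_mm) if mm >= cutoff]

data = [0.0, 2.1, 0.0, 5.3, 12.7, 0.4, 76.2, 3.3, 0.0, 1.2,
        0.0, 8.8, 0.1, 0.0, 4.6, 30.5, 0.0, 2.9, 0.7, 6.4]
print(extreme_events(data))  # [(6, 76.2)]: only the wettest of 20 days qualifies
```

The paper's contribution is predicting how often such flagged days occur from large-scale atmospheric patterns rather than from the simulated precipitation itself.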
Weather
2017
December 29, 2016
https://www.sciencedaily.com/releases/2016/12/161229113435.htm
Flood threats changing across US
The risk of flooding in the United States is changing regionally, and the reasons could be shifting rainfall patterns and the amount of water in the ground.
In a new study, University of Iowa engineers determined that, in general, the threat of flooding is growing in the northern half of the U.S. and declining in the southern half. The American Southwest and West, meanwhile, are experiencing decreasing flood risk. UI engineers Gabriele Villarini and Louise Slater compiled water-height information between 1985 and 2015 from 2,042 stream gauges operated by the U.S. Geological Survey. They then compared the data to satellite information gathered over more than a dozen years by NASA's Gravity Recovery and Climate Experiment (GRACE) mission showing "basin wetness," or the amount of water stored in the ground. What they found was the northern sections of the country, generally, have an increased amount of water stored in the ground, and thus are at greater risk for minor and moderate flooding, two flood categories used by the National Weather Service. Meanwhile, minor to moderate flood risk was decreasing in the southern portions of the U.S., where stored water has declined. Not surprisingly, the NASA data showed decreased stored water -- and reduced flood risk -- in the Southwest and western U.S., in large part due to the prolonged drought gripping those regions. "It's almost like a separation where generally flood risk is increasing in the upper half of the U.S. and decreasing in the lower half," says Villarini, associate professor in civil and environmental engineering and an author on the paper. Some of the regional variation can be attributed to changes in rainfall; a study led by Villarini published last year showed the Midwest and Plains states have experienced more frequent heavy rains in the past half-century. More rainfall leads to more groundwater, a "higher water base line," Villarini explains. "The river basins have a memory," adds Slater, a post-doctoral researcher and the paper's corresponding author.
"So, if a river basin is getting wetter, in the Midwest for example, your flood risk is also probably increasing because there's more water in the system."Why some sections of the nation are getting more, or less, rainfall is not entirely clear. The researchers say some causes could be the rains are being redistributed as regional climate changes.The researchers hope that their findings could revise how changing flood patterns are communicated. In the past, flood risk trends have typically been discussed using stream flow, or the amount of water flowing per unit time. The UI study views flood risk through the lens of how it may affect people and property and aligns the results with National Weather Service terminology understood by the general public."The concept is simple," says Villarini, whose primary appointment is in IIHR-Hydroscience, a branch of the College of Engineering. "We're measuring what people really care about."
Weather
2016
December 22, 2016
https://www.sciencedaily.com/releases/2016/12/161222130524.htm
For critical marine low clouds, a research and observation plan
Marine low clouds hover in the lowest couple of kilometers above the world's oceans. They produce little but drizzle, and could never match their deeper mid-continent cousin clouds for dramatic weather and severe storms. But marine low clouds are vastly important to the world's climate and energy balance. They are a major determinant of the Earth's albedo, a measure of the amount of solar energy reflected from the Earth back into space. And they also have a big impact on global energy and hydrologic cycles.
Still, there remain gaps in our understanding of the processes that regulate marine low clouds, such that they are a challenge to represent properly in climate models and remain a source of the greatest uncertainties in simulations of future climate states. They are subject to complex processes that control their coverage and condensate loading and that determine their microphysical and radiative properties. Most marine clouds are also remote, which adds significant observational challenges.

All this importance and all these challenges add up to why marine low clouds received major scientific attention in 2016: a January workshop at Brookhaven National Laboratory, sponsored by the U.S. Department of Energy's Atmospheric System Research (ASR) Program; an associated June workshop report; and a meeting summary in the September issue of the Bulletin of the American Meteorological Society.

In all three cases, in varying degrees of detail, the same imperative was laid out: develop an action plan for the next five to 15 years to close the gaps in our current knowledge and improve simulation capabilities. That means improving our understanding of the key processes that make and alter marine low clouds and improving ways of quantifying them.

Four research themes emerged: the interactions of aerosols and cloud droplets, precipitation rates, entrainment (how cloudy and clear air mix), and cloud organization on scales of 5 to 100 kilometers. These research themes emphasized the importance of better representing these clouds in climate models. Workshop participants and report authors pointed to the contributions that will be made by ASR scientists and the Atmospheric Radiation Measurement (ARM) Climate Research Facility.

Among the research needs, one recurrent theme emerged: the need for measurements both at long-term sites, such as ARM's Eastern North Atlantic observation facility, and from aircraft.
It is thought that airborne instrument platforms could fill in data gaps related to aerosol composition, particle size, and variations in cloud fields and trace gases.

To meet this need, the ARM Facility will conduct the Aerosol and Cloud Experiments in the Eastern North Atlantic (ACE-ENA) campaign in June 2017, led by ASR researcher Jian Wang of Brookhaven National Laboratory.
Weather
2016
December 20, 2016
https://www.sciencedaily.com/releases/2016/12/161220175534.htm
Computer models find ancient solutions to modern climate problems
Washington State University archaeologists are at the helm of new research using sophisticated computer technology to learn how past societies responded to climate change.
Their work, which links ancient climate and archaeological data, could help modern communities identify new crops and other adaptive strategies when threatened by drought, extreme weather and other environmental challenges.

In a new paper in the

"For every environmental calamity you can think of, there was very likely some society in human history that had to deal with it," said Kohler, emeritus professor of anthropology at WSU. "Computational modeling gives us an unprecedented ability to identify what worked for these people and what didn't."

Kohler is a pioneer in the field of model-based archaeology. He developed sophisticated computer simulations, called agent-based models, of the interactions between ancestral peoples in the American Southwest and their environment. He launched the Village Ecodynamics Project in 2001 to simulate how virtual Pueblo Indian families, living on computer-generated and geographically accurate landscapes, likely would have responded to changes in specific variables like precipitation, population size and resource depletion.

By comparing the results of agent-based models against real archaeological evidence, anthropologists can identify past conditions and circumstances that led different civilizations around the world into periods of growth and decline. Agent-based modeling is also used to explore the impact humans can have on their environment during periods of climate change. One study mentioned in the WSU review demonstrates how drought, hunting and habitat competition among growing populations in Egypt led to the extinction of many large-bodied mammals around 3,000 B.C.
In addition, d'Alpoim Guedes and Bocinsky, an adjunct faculty member in anthropology, are investigating how settlement patterns in Tibet are affecting erosion.

"Agent-based modeling is like a video game in the sense that you program certain parameters and rules into your simulation and then let your virtual agents play things out to the logical conclusion," said Crabtree, who completed her Ph.D. in anthropology at WSU earlier this year. "It enables us to not only predict the effectiveness of growing different crops and other adaptations but also how human societies can evolve and impact their environment."

Species distribution or crop-niche modeling is another sophisticated technology that archaeologists use to predict where plants and other organisms grew well in the past and where they might be useful today. Bocinsky and d'Alpoim Guedes are using the modeling technique to identify little-used or in some cases completely forgotten crops that could be useful in areas where warmer weather, drought and disease impact the food supply. One of the crops they identified is a strain of drought-tolerant corn the Hopi Indians of Arizona adapted over the centuries to prosper in poor soil.

"Our models showed Hopi corn could grow well in the Ethiopian highlands where one of their staple foods, the Ethiopian banana, has been afflicted by emerging pests, disease and blasts of intense heat," Bocinsky said. "Cultivating Hopi corn and other traditional, drought-resistant crops could become crucial for human survival in other places impacted by climate change."

WSU researchers also used crop-niche modeling to identify a viable alternative food source on the Tibetan Plateau.
Rapidly rising temperatures make it difficult for the region's inhabitants to grow cold-weather crops and to raise and breed yaks, a staple form of subsistence.

In a paper published in 2015, d'Alpoim Guedes and Bocinsky found that foxtail and proso millet, which fell out of cultivation on the Plateau 4,000 years ago as the climate got colder, could soon be grown there again as the climate warms up.

"These millets are on the verge of becoming forgotten crops," d'Alpoim Guedes said. "But due to their heat tolerance, high nutritional value and very low rainfall requirements, they may once again be useful resources for a warmer future."

With hundreds of years of anthropological data from sites around the world yet to be digitized, scientists are just beginning to tap the potential of archaeology-based modeling.

"The field is in the midst of a renaissance toward more computational approaches," Kohler said. "Our hope is that combining traditional archaeology fieldwork with data-driven modeling techniques will help us more knowledgeably manage our numbers and our ecosystem interactions, and avoid past errors regarding climate change."
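The "program rules, let virtual agents play things out" idea Crabtree describes can be made concrete with a toy model: hypothetical farming households harvest according to rainfall, draw down grain stores in lean years, and drop out when stores are exhausted. This is a minimal illustration of agent-based modeling in general, not the Village Ecodynamics Project's code; every parameter below is invented:

```python
import random

class Household:
    """One farming household with a small grain store."""
    def __init__(self):
        self.store = 2.0  # years of food on hand

    def step(self, rainfall):
        # Harvest scales with rainfall; one unit of food is eaten per year.
        harvest = min(rainfall, 1.5)
        self.store = min(self.store + harvest - 1.0, 3.0)
        return self.store > 0  # False -> household fails/emigrates

def simulate(rainfall_by_year, n_households=100, seed=42):
    random.seed(seed)
    households = [Household() for _ in range(n_households)]
    counts = []
    for rain in rainfall_by_year:
        # Each household sees slightly different local rainfall.
        households = [h for h in households
                      if h.step(rain + random.uniform(-0.1, 0.1))]
        counts.append(len(households))
    return counts

# Ten good years followed by ten drought years:
trajectory = simulate([1.2] * 10 + [0.4] * 10)
```

Running the rules forward shows the population holding steady through the good years and collapsing a few years into the drought -- the kind of emergent outcome researchers then compare against the archaeological record.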
Weather
2016
December 19, 2016
https://www.sciencedaily.com/releases/2016/12/161219151744.htm
Freezing in record lows? You may doubt global warming
If you're shivering from unusually teeth-rattling cold this holiday season, global warming is probably the last thing on your mind.
"The local weather conditions people experience likely play a role in what they think about the broader climate," says Utah State University researcher Peter Howe. "Climate change is causing record-breaking heat around the world, but the variability of the climate means that some places are still reaching record-breaking cold. If you're living in a place where there's been more record cold weather than record heat lately, you may doubt reports of climate change."Howe says people's beliefs about climate change are driven by many factors, but a new study in which he participated suggests weather events in your own backyard may be an important influence.With colleagues Robert Kaufmann, Sucharita Gopal, Jackie Liederman, Xiaojing Tang and Michelle Gilmore of Boston University; Michael Mann of The George Washington University and Felix Pretis of the University of Oxford, Howe published findings in the Dec. 19, 2016, Early Edition of the Howe, assistant professor of human-environment geography in USU's Department of Environment and Society and the USU Ecology Center, generated the public opinion dataset used in the analysis. The collected information is based on a statistical model of more than 12,000 survey respondents across the nation from 2008 to 2013 collected by the Yale Project on Climate Change Communication and George Mason Center for Climate Change Communication."We found that places with more record high temperatures than lows have more residents who believe the planet is warming," he says. 
"Conversely, in places with more record low temperatures, more people tend to doubt global warming."The study notes part of this dichotomy may be because early terminology used to describe climate change suggested the earth was simply warming, rather than changing in innumerable but measurable ways."One of the greatest challenges to communicating scientific findings about climate change is the cognitive disconnect between local and global events," says Mann, one of Howe's partners in the study. "It's easy to assume that what you experience at home must be happening elsewhere."The scientists note the importance of differentiating between weather, the temperatures of a relatively short period of time, such as a season, and climate, the average temperature over a period of 25 or 30 years. Emphasizing the different between weather and climate may help the scientific community more effectively explain climate change, they say."Our work highlights some of the challenges of communicating about climate change, and the importance of situating people's experiences at the local level within the larger global context," Howe says.
Weather
2016
December 19, 2016
https://www.sciencedaily.com/releases/2016/12/161219151735.htm
El Niño fueled Zika outbreak, new study suggests
Scientists at the University of Liverpool have shown that a change in weather patterns, brought on by the 'Godzilla' El Niño of 2015, fuelled the Zika outbreak in South America.
The findings were revealed using a new epidemiological model that looked at how climate affects the spread of Zika virus by both of its major vectors, the yellow fever mosquito (Aedes aegypti) and the Asian tiger mosquito (Aedes albopictus).

The model can also be used to predict the risk of future outbreaks, and help public health officials tailor mosquito control measures and travel advice.

The model used the worldwide distribution of both vectors as well as temperature-dependent factors, such as mosquito biting rates, mortality rates and viral development rates within mosquitoes, to predict the effect of climate on virus transmission. It found that in 2015, when the Zika outbreak occurred, the risk of transmission was greatest in South America.

The researchers believe that this was likely due to a combination of El Niño -- a naturally occurring phenomenon that sees above-normal temperatures in the Pacific Ocean and causes extreme weather around the world -- and climate change, creating conducive conditions for the mosquito vectors. El Niños occur every three to seven years in varying intensity, with the 2015 El Niño, nicknamed the 'Godzilla', one of the strongest on record.
Effects can include severe drought, heavy rains and temperature rises on a global scale.

Dr Cyril Caminade, a population and epidemiology researcher who led the work, said: "It's thought that the Zika virus probably arrived in Brazil from Southeast Asia or the Pacific islands in 2013. However, our model suggests that it was temperature conditions related to the 2015 El Niño that played a key role in igniting the outbreak -- almost two years after the virus was believed to have been introduced on the continent."

"In addition to El Niño, other critical factors might have played a role in the amplification of the outbreak, such as the non-exposed South American population, the risk posed by travel and trade, the virulence of the Zika virus strain and co-infections with other viruses such as dengue."

The World Health Organisation recently declared that Zika, which has been linked to birth defects and neurological complications, will no longer be treated as an international emergency, but as a "significant enduring public health challenge."

Professor Matthew Baylis, from the University's Institute of Infection and Global Health, added: "Zika is not going away, and so the development of tools that could help predict potential future outbreaks and spread is extremely important. Our model predicts a potential seasonal transmission risk for Zika virus in the south-eastern United States, southern China and, to a lesser extent, southern Europe during summer."

The researchers now plan to adapt the model to other important mosquito-borne viruses, such as chikungunya and dengue fever, with the aim of developing disease early warning systems that could help public health officials prepare for, or even prevent, future outbreaks. The research was funded by the National Institute for Health Research (NIHR) Health Protection Research Unit (HPRU) in Emerging Infections and Zoonoses, a collaboration between the University of Liverpool, Liverpool School of Tropical Medicine and Public Health England.

The paper 'Global risk model for vector-borne transmission of Zika virus reveals the role of El Niño 2015' is published in the
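The temperature-dependent factors the model uses (biting rates, mortality rates, viral development rates) enter transmission models through quantities like the classic Ross-Macdonald vectorial capacity. A generic sketch with invented parameter curves (not the Liverpool model or its fitted values) shows why a warm anomaly raises transmission potential:

```python
import math

def vectorial_capacity(m, a, p, n):
    """Ross-Macdonald vectorial capacity:
    m: mosquitoes per person, a: bites per mosquito per day,
    p: daily survival probability, n: extrinsic incubation period (days)."""
    return m * a ** 2 * p ** n / -math.log(p)

def biting_rate(temp_c):
    # Hypothetical curve: biting rises with temperature.
    return 0.1 + 0.0125 * (temp_c - 22)

def incubation_days(temp_c):
    # Hypothetical curve: warmer temperatures shorten viral incubation.
    return max(4.0, 25.0 - 0.6 * temp_c)

def capacity_at(temp_c, m=2.0, p=0.9):
    return vectorial_capacity(m, biting_rate(temp_c), p, incubation_days(temp_c))

# An El Nino-like warm anomaly raises transmission potential:
normal, warm = capacity_at(26.0), capacity_at(28.0)
```

Because biting rate enters squared and a shorter incubation period lets more mosquitoes survive long enough to become infectious, even a couple of degrees of warming compounds into a substantially larger transmission potential -- the mechanism by which the 2015 El Niño could have "ignited" the outbreak.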
Weather
2016
December 15, 2016
https://www.sciencedaily.com/releases/2016/12/161215152126.htm
Supercomputer simulations confirm observations of 2015 India/Pakistan heat waves
A paper published this week during the American Geophysical Union (AGU) fall meeting in San Francisco points to new evidence of human influence on extreme weather events.
Three researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) are among the co-authors on the paper, which is included in "Explaining Extreme Events of 2015 from a Climate Perspective," a special edition of the Bulletin of the American Meteorological Society (BAMS) released December 15 at the AGU meeting.

The paper, "The Deadly Combination of Heat and Humidity in India and Pakistan in Summer 2015," examined observational and simulated temperatures and heat indexes, concluding that the heat waves in the two countries "were exacerbated by anthropogenic climate change." While these countries typically experience severe heat in the summer, the 2015 heat waves -- which occurred in late May/early June in India and in late June/early July in Pakistan -- have been linked to the deaths of nearly 2,500 people in India and 2,000 in Pakistan.

"I was deeply moved by television coverage of the human tragedy, particularly parents who lost young children," said Michael Wehner, a climate researcher at Berkeley Lab and lead author on the paper, who has studied extreme weather events and anthropogenic climate change extensively. This prompted him and collaborators from Berkeley Lab, the Indian Institute of Technology Delhi and UC Berkeley to investigate the cause of the 2015 heat waves and determine whether the two separate meteorological events were somehow linked.

They used simulations from the Community Atmospheric Model version 5 (CAM5), the atmospheric component of the National Center for Atmospheric Research's Community Earth System Model, performed by Berkeley Lab for the C20C+ Detection and Attribution Project.
Current climate model-based products are not optimized for research on the attribution of human influence on extreme weather in the context of long-term climate change; the C20C+ Detection and Attribution Project fills this gap by providing large ensembles of simulation data from climate models running at relatively high spatial resolution.

The experimental design described in the BAMS paper used "factual" simulations of the world and compared them to "counterfactual" simulations of the world that might have been had humans not changed the composition of the atmosphere by emitting large amounts of carbon dioxide, explained Dáithí Stone, a research scientist in Berkeley Lab's Computational Research Division and second author on the BAMS paper.

"It is relatively common to run one or a few simulations of a climate model within a certain set of conditions, with each simulation differing just in the precise weather on the first day of the simulation; this difference in the first day propagates through time, providing different realizations of what the weather 'could have been,'" Stone said. "The special thing about the simulations used here is that we ran a rather large number of them. This was important for studying a rare event; if it is rare, then you need a large amount of data in order to have it occurring frequently enough that you can understand it."

The researchers examined both observational and simulated temperature alone as well as the heat index, a measure incorporating both temperature and humidity effects. From a quality-controlled weather station observational dataset, they found the potential for a very large, human-induced increase in the likelihood of the magnitudes of the two heat waves.
They then examined the factual and counterfactual simulations to further investigate the presence of a human influence. "Observations suggested the human influence; simulations confirmed it," Wehner said.

The research team also found that, despite being close in location and time, the two heat waves were "meteorologically independent." Even so, Wehner emphasized, "the India/Pakistan paper confirms that the chances of deadly heat waves have been substantially increased by human-induced climate change, and these chances will certainly increase as the planet continues to warm."

Data from Berkeley Lab's simulations were also analyzed as part of another study included in the special edition of BAMS released at the AGU meeting. That study, "The Late Onset of the 2015 Wet Season in Nigeria," which was led by the Nigerian Meteorological Agency, explores the role of greenhouse gas emissions in changing the chance of a late wet season, as occurred over Nigeria in 2015.

"The C20C+ D&A Project is continuing to build its collection of climate model data with the intention of supporting research like this around the world," Stone said.

The C20C+ D&A portal is hosted and supported by Berkeley Lab's National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility. The simulations for the two papers were run on NERSC's Hopper supercomputer, while the data analysis was done on NERSC's Edison and Cori systems. The simulations were conducted as part of a program dedicated to advancing our understanding of climate extremes and enhancing our ability to attribute and project changes in their risk because of anthropogenic climate change. The research was supported by the DOE Office of Science and the National Science Foundation.

"Explaining Extreme Events of 2015 from a Climate Perspective" is a special edition of the Bulletin of the American Meteorological Society.
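The heat index the researchers examined folds temperature and humidity into one apparent temperature. The standard NWS Rothfusz regression computes it for hot, humid conditions (a general-purpose formula, not necessarily the exact index formulation used in the paper):

```python
def heat_index_f(temp_f, rel_humidity):
    """Rothfusz regression for the NWS heat index (degrees Fahrenheit).
    Valid roughly for temp_f >= 80 and rel_humidity >= 40."""
    t, rh = temp_f, rel_humidity
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t ** 2
            - 5.481717e-2 * rh ** 2 + 1.22874e-3 * t ** 2 * rh
            + 8.5282e-4 * t * rh ** 2 - 1.99e-6 * t ** 2 * rh ** 2)

# 96 F at 65% relative humidity "feels like" roughly 121 F:
hi = heat_index_f(96, 65)
```

The cross terms are why humid heat waves are so deadly: at high humidity, each additional degree of air temperature adds far more than one degree of apparent temperature.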
Weather
2016
December 14, 2016
https://www.sciencedaily.com/releases/2016/12/161214151652.htm
Revolutions in understanding the ionosphere, Earth's interface to space
Scientists from NASA and three universities have presented new discoveries about the way heat and energy move and manifest in the ionosphere, a region of Earth's atmosphere that reacts to changes from both space above and Earth below.
Far above Earth's surface, within the tenuous upper atmosphere, is a sea of particles that have been split into positive and negative ions by the sun's harsh ultraviolet radiation. Called the ionosphere, this is Earth's interface to space, the area where Earth's neutral atmosphere and terrestrial weather give way to the space environment that dominates most of the rest of the universe -- an environment that hosts charged particles and a complex system of electric and magnetic fields. The ionosphere is both shaped by waves from the atmosphere below and uniquely responsive to the changing conditions in space, conveying such space weather into observable, Earth-effective phenomena -- creating the aurora, disrupting communications signals, and sometimes causing satellite problems.

Many of these effects are not well understood, leaving the ionosphere, for the most part, a region of mystery. Scientists from NASA's Goddard Space Flight Center in Greenbelt, Maryland, the Catholic University of America in Washington, D.C., the University of Colorado Boulder, and the University of California, Berkeley, presented new results on the ionosphere at the fall meeting of the American Geophysical Union on Dec. 14, 2016, in San Francisco.

One researcher explained how the interaction between the ionosphere and another layer in the atmosphere, the thermosphere, counteracts heating in the thermosphere -- heating that leads to expansion of the upper atmosphere, which can cause premature orbital decay. Another researcher described how energy outside the ionosphere accumulates until it discharges -- not unlike lightning -- offering an explanation for how energy from space weather crosses over into the ionosphere. A third scientist discussed two upcoming NASA missions that will provide key observations of this region, helping us better understand how the ionosphere reacts both to space weather and to terrestrial weather.

Changes in the ionosphere are primarily driven by the sun's activity.
Though it may appear unchanging to us on the ground, our sun is, in fact, a very dynamic, active star. Watching the sun in ultraviolet wavelengths of light from space -- above our UV light-blocking atmosphere -- reveals constant activity, including bursts of light, particles, and magnetic fields. Occasionally, the sun releases huge clouds of particles and magnetic fields that explode out from the sun at more than a million miles per hour. These are called coronal mass ejections, or CMEs. When a CME reaches Earth, its embedded magnetic fields can interact with Earth's natural magnetic field -- called the magnetosphere -- sometimes compressing it or even causing parts of it to realign.

It is this realignment that transfers energy into Earth's atmospheric system, by setting off a chain reaction of shifting electric and magnetic fields that can send the particles already trapped near Earth skittering in all directions. These particles can then create one of the most recognizable and awe-inspiring space weather events -- the aurora, otherwise known as the Northern Lights.

But the transfer of energy into the atmosphere isn't always so innocuous. It can also heat the upper atmosphere -- where low-Earth satellites orbit -- causing it to expand like a hot-air balloon.

"This swelling means there's more stuff at higher altitudes than we would otherwise expect," said Delores Knipp, a space scientist at the University of Colorado Boulder. "That extra stuff can drag on satellites, disrupting their orbits and making them harder to track."

This phenomenon is called satellite drag. New research shows that this understanding of the upper atmosphere's response to solar storms -- and the resulting satellite drag -- may not always hold true.

"Our basic understanding has been that geomagnetic storms put energy into the Earth system, which leads to swelling of the thermosphere, which can pull satellites down into lower orbits," said Knipp, lead researcher on these new results.
"But that isn't always the case."Sometimes, the energy from solar storms can trigger a chemical reaction that produces a compound called nitric oxide in the upper atmosphere. Nitric oxide acts as a cooling agent at very high altitudes, promoting energy loss to space, so a significant increase in this compound can cause a phenomenon called overcooling."Overcooling causes the atmosphere to quickly shed energy from the geomagnetic storm much quicker than anticipated," said Knipp. "It's like the thermostat for the upper atmosphere got stuck on the 'cool' setting."That quick loss of energy counteracts the previous expansion, causing the upper atmosphere to collapse back down -- sometimes to an even smaller state than it started in, leaving satellites traveling through lower-density regions than anticipated.A new analysis by Knipp and her team classifies the types of storms that are likely to lead to this overcooling and rapid upper atmosphere collapse. By comparing over a decade of measurements from Department of Defense satellites and NASA's Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics, or TIMED, mission, the researchers were able to spot patterns in energy moving throughout the upper atmosphere."Overcooling is most likely to happen when very fast and magnetically-organized ejecta from the sun rattle Earth's magnetic field," said Knipp. "Slow clouds or poorly-organized clouds just don't have the same effect."This means that, counterintuitively, the most energetic solar storms are likely to provide a net cooling and shrinking effect on the upper atmosphere, rather than heating and expanding it as had been previously understood.Competing with this cooling process is the heating that caused by solar storm energy making its way into Earth's atmosphere. Though scientists have known that solar wind energy eventually reaches the ionosphere, they have understood little about where, when and how this transfer takes place. 
New observations show that the process is localized and impulsive, and partly dependent on the state of the ionosphere itself.

Traditionally, scientists have thought that the way energy moves throughout Earth's magnetosphere and atmosphere is determined by the characteristics of the incoming particles and magnetic fields of the solar wind -- for instance, a long, steady stream of solar particles would produce different effects than a faster, less consistent stream. However, new data shows that the way energy moves is much more closely tied to the mechanisms by which the magnetosphere and ionosphere are linked.

"The energy transfer process turns out to be very similar to the way lightning forms during a thunderstorm," said Bob Robinson, a space scientist at NASA Goddard and the Catholic University of America.

During a thunderstorm, a buildup of electric potential difference -- called voltage -- between a cloud and the ground leads to a sudden, violent discharge of that electric energy in the form of lightning. This discharge can only happen if there's an electrically conducting pathway between the cloud and the ground, called a leader.

Similarly, the solar wind striking the magnetosphere can build up a voltage difference between different regions of the ionosphere and the magnetosphere. Electric currents can form between these regions, creating the conducting pathway needed for that built-up electric energy to discharge into the ionosphere as a kind of lightning.

"Terrestrial lightning takes several milliseconds to occur, while this magnetosphere-ionosphere 'lightning' lasts for several hours -- and the amount of energy transferred is hundreds to thousands of times greater," said Robinson, lead researcher on these new results.
These results are based on data from the global Iridium satellite communications constellation.

Because solar storms enhance the electric currents that let this magnetosphere-ionosphere lightning take place, this type of energy transfer is much more likely when Earth's magnetic field is jostled by a solar event. The huge energy transfer from this magnetosphere-ionosphere lightning is associated with heating of the ionosphere and upper atmosphere, as well as increased aurora.

Though scientists are making progress in understanding the key processes that drive changes in the ionosphere and, in turn, on Earth, there is still much to be understood. In 2017, NASA is launching two missions to investigate this dynamic region: the Ionospheric Connection Explorer, or ICON, and Global Observations of the Limb and Disk, or GOLD.

"The ionosphere doesn't only react to energy input by solar storms," said Scott England, a space scientist at the University of California, Berkeley, who works on both the ICON and GOLD missions. "Terrestrial weather, like hurricanes and wind patterns, can shape the atmosphere and ionosphere, changing how they react to space weather."

ICON will simultaneously measure the characteristics of charged particles in the ionosphere and neutral particles in the atmosphere -- including those shaped by terrestrial weather -- to understand how they interact. GOLD will take many of the same measurements, but from geostationary orbit, which gives a global view of how the ionosphere changes. Both ICON and GOLD will take advantage of a phenomenon called airglow -- the light emitted by gas that is excited or ionized by solar radiation -- to study the ionosphere.
By measuring the light from airglow, scientists can track the changing composition, density, and even temperature of particles in the ionosphere and neutral atmosphere.

ICON's position 350 miles above Earth will enable it to study the atmosphere in profile, giving scientists an unprecedented look at the state of the ionosphere at a range of altitudes. Meanwhile, GOLD's position 22,000 miles above Earth will give it the chance to track changes in the ionosphere as they move across the globe, similar to how a weather satellite tracks a storm.

"We will be using these two missions together to understand how dynamic weather systems are reflected in the upper atmosphere, and how these changes impact the ionosphere," said England.
Weather
2016
December 14, 2016
https://www.sciencedaily.com/releases/2016/12/161214151555.htm
Researchers dial in to 'thermostat' in Earth's upper atmosphere
A team led by the University of Colorado Boulder has found the mechanism behind the sudden onset of a "natural thermostat" in Earth's upper atmosphere that dramatically cools the air after it has been heated by violent solar activity.
Scientists have known that solar flares and coronal mass ejections (CMEs) -- which release electrically charged plasma from the sun -- can damage satellites, cause power outages on Earth and disrupt GPS service. CMEs are powerful enough to send billions of tons of solar particles screaming toward Earth at more than 1 million miles per hour, said CU Boulder Professor Delores Knipp of the Department of Aerospace Engineering Sciences.

Now, Knipp and her team have determined that when such powerful CMEs come off the sun and speed toward Earth, they create shock waves much like supersonic aircraft create sonic booms. While the shock waves from CMEs pour energy into Earth's upper atmosphere, puffing it up and heating it, they also cause the formation of the trace chemical nitric oxide, which then rapidly cools and shrinks the atmosphere, she said.

"What's new is that we have determined the circumstances under which the upper atmosphere goes into this almost overcooling mode following significant heating," said Knipp, also a member of CU Boulder's Colorado Center for Astrodynamics Research. "It's a bit like having a stuck thermostat -- it's really a case of nature reining itself in."

Knipp gave a presentation at the 2016 fall meeting of the American Geophysical Union, held in San Francisco Dec. 12 through Dec. 16. The presentation was tied to an upcoming paper slated to be published in the journal

Solar storms can cause dramatic changes in the temperatures of the upper atmosphere, including the ionosphere, which ranges from about 30 miles in altitude to about 600 miles high -- the edge of space. While CME material slamming into Earth's atmosphere can cause temperature spikes of up to 750 degrees Fahrenheit, the nitric oxide created by the energy infusion can subsequently cool it by about 930 F, said Knipp.

The key to solving the mystery came when Knipp was reviewing satellite data from a severe solar storm that pounded Earth in 1967.
"I found a graphic buried deep in a long forgotten manuscript," she said. "It finally suggested to me what was really happening."Because the upper atmosphere expands during CMEs, satellites in low-Earth orbit are forced to move through additional gaseous particles, causing them to experience more drag. Satellite drag -- a huge concern of government and aerospace companies -- causes decays in the orbits of spacecraft, which subsequently burn up in the atmosphere.As part of the new study, Knipp and her colleagues compared two 15-year-long satellite datasets. One was from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument riding on NASA's TIMED satellite. The other was from data collected by U.S. Department of Defense satellites."We found that the fastest material streaming off the sun was triggering these shockwaves, causing the atmosphere to heave up and heat up," she said. "But it became very clear that these shock waves were at the root of creating the nitric oxide, which caused the atmosphere to shed energy and cool."SABER has been collecting data on nitric oxide in the atmosphere since its launch in 2001, following on the heels of another nitric oxide-measuring satellite known as the Student Nitric Oxide Explorer (SNOE). Launched in 1998, SNOE involved more than 100 CU Boulder students, primarily undergraduates, in its design and construction. Once in orbit, SNOE was controlled by students on campus 24 hours a day for nearly six years.Geomagnetic storms have had severe impacts on Earth. A 1989 storm caused by a CME resulted in the collapse of the Hydro-Quebec's electricity transmission system, causing six million Canadians to lose power. In 1859 a solar storm called the Carrington Event produced auroras from the North Pole to Central America and disrupted telegraph communications, even sparking fires at telegraph offices that caused several deaths.
Weather
2016
December 12, 2016
https://www.sciencedaily.com/releases/2016/12/161212084912.htm
Mountain glaciers are showing some of the strongest responses to climate change
Mountain glaciers have long been a favorite poster child of climate change. The near-global retreat of glaciers of the last century provides some of the most iconic imagery for communicating the reality of human-driven climate change.
But the scientific basis for their retreat has been less clear. Glaciers respond slowly to changes in climate, they are susceptible to year-to-year variations in mountain weather, and some of the largest are still catching up after the end of the Little Ice Age. Scientists can connect climate change to the overall retreat of glaciers worldwide, but linking an individual glacier's retreat to climate change has remained a subject of debate.

The last report from the Intergovernmental Panel on Climate Change concluded only that it was "likely" that a "substantial" part of mountain glacier retreat is due to human-induced climate change -- a much weaker conclusion than for other indicators, like temperature.

Now, using statistical techniques to analyze 37 mountain glaciers around the world, a University of Washington study finds that for most of them the observed retreat is more than 99 percent likely due to climate change. In the climate report's wording, it is "virtually certain" that the retreat of these mountain glaciers is due to climate change over the past century.

"Because of their decades-long response times, we found that glaciers are actually among the purest signals of climate change," said Gerard Roe, a UW professor of Earth and space sciences. He is corresponding author of the study published Dec. 12 in

The new study analyzes specific glaciers with a history of length observations and nearby weather records of temperature and precipitation. The authors also sought out different glacier locations, focusing on roughly seven glaciers in each of five geographic regions: North America, Europe, Asia, Scandinavia and the Southern Hemisphere.

"We evaluate glaciers which are hanging on at high altitudes in the deserts of Asia as well as glaciers that are being beaten up by midlatitude storms in maritime climate settings," Roe said.
"The thickness, slope and area of the glaciers are different, and all of those things affect the size of the glacier length fluctuations."The authors used statistical tools to compare the natural, weather-induced variations in a glacier's length with its observed changes over the last 130 years, and establish a signal-to-noise ratio. They then use that to calculate the probability that observed retreats would have happened without any background change in the climate. Likewise, for the well-known Franz Josef Glacier in New Zealand, even though the glacier has experienced re-advances of up to 1 kilometer (0.6 miles) in a given decade, there is a less than 1 percent chance that natural variations could explain the overall 3.2 kilometers (2 miles) retreat in the last 130 years.The least significant retreats among the glaciers studied were for Rabots Glacier in northern Sweden, and South Cascade Glacier in Washington state, with probabilities of 11 and 6 percent, respectively, that their retreats might be natural variability."South Cascade is at the end of the Pacific storm track, and it experiences a high degree of wintertime variability. Average wintertime snowfall generates about 3 meters (10 feet) of ice per year, whereas for glaciers in desert Asia, ice accumulation might be as low as 10 centimeters (4 inches) per year,” Roe said. "So they're experiencing very different climate settings. As a result, their variability, and also their sensitivity to climate change, varies from place to place."The method uses a signal-to-noise ratio that relies on observational records for glacier length, local weather, and the basic size and shape of the glacier, but does not require detailed computer modeling. The technique could be used on any glacier that had enough observations.Overall, the results show that changes in the 37 glaciers' lengths are between two and 15 standard deviations away from their statistical means. 
That represents some of the highest signal-to-noise ratios yet documented in natural systems' response to climate change.

"Even though the scientific analysis arguably hasn't always been there, it now turns out that it really is true -- we can look at these glaciers all around us that we see retreating, and see definitive evidence that the climate is changing," Roe said. "That's why people have noticed it. These glaciers are stunningly far away from where they would have been in a preindustrial climate."
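The signal-to-noise logic described above can be sketched in a few lines: treat weather-driven length fluctuations as Gaussian noise and ask how far the observed retreat sits out in that distribution. This is a minimal illustration, not the study's actual method; the Gaussian assumption is mine, the 3,200 m retreat is loosely inspired by the Franz Josef figure, and the 800 m noise level is invented.

```python
import math

def retreat_significance(observed_retreat_m, noise_std_m):
    """Signal-to-noise ratio of a glacier's retreat, and the one-sided
    probability that natural variability alone produced it, assuming
    the weather-driven fluctuations are Gaussian."""
    snr = observed_retreat_m / noise_std_m
    # One-sided Gaussian tail probability via the complementary error function.
    p_natural = 0.5 * math.erfc(snr / math.sqrt(2.0))
    return snr, p_natural

# Hypothetical glacier: 3,200 m of retreat against ~800 m of weather noise.
snr, p = retreat_significance(3200.0, 800.0)
```

A retreat four standard deviations beyond the noise has well under a 1 percent chance of being natural variability, which is the flavor of the probabilities quoted for most of the 37 glaciers.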
Weather
2016
December 7, 2016
https://www.sciencedaily.com/releases/2016/12/161207140721.htm
Despite evolutionary inexperience, northern sockeye manage heat stress
Sockeye salmon that evolved in the generally colder waters of the far north still know how to cool off if necessary, an important factor in the species' potential for dealing with global climate change.
Sockeyes, which spawn in fresh water and spend two to three years in the Pacific Ocean, range from southern Alaska south to the Columbia River. Research by Oregon State University revealed that sockeyes at the northern edge of that range, despite lacking their southern counterparts' evolutionary history of dealing with heat stress, nevertheless have an innate ability to "thermoregulate."

Thermoregulation means that when their surroundings warm up too much, the fish will seek cooler water that precisely meets their physiological needs. A study conducted by an OSU researcher at an Alaska lake during a heat wave shed light on sockeyes' ability to find the water temperatures they need.

Multiple earlier studies had demonstrated thermoregulation behavior among sockeye salmon at lower latitudes, but northern populations' behavioral response to heat stress had largely gone unexamined. While it may seem obvious that any fish would move around to find the water temperature it needed, prior research has shown thermoregulation is far from automatic -- even among populations living where heat stress is a regular occurrence.

"Often what's happened has been counterintuitive, so we had no idea what to expect," said Jonathan B. Armstrong, assistant professor in the College of Agricultural Sciences' Department of Fish and Wildlife, the lead author on the study. "About 40 million sockeye return to Bristol Bay every year. These huge salmon runs are a big part of the regional culture and economy, so how these fish respond to climate change will have very real effects on people's lives. It's encouraging that the sockeyes showed this innate capacity to respond."

Results of the research were recently published in

Armstrong and his collaborators at the University of Washington worked in 2013 at Little Togiak Lake -- one of five major lakes in the Wood River watershed that drain into Bristol Bay, a fishery that produces nearly 70 percent of all the sockeye salmon caught in the United States.
Bristol Bay is close to the 60-degree north latitude line that marks the northern boundary of the sockeyes' primary range. Adult sockeye salmon return to the Wood River system from the Bering Sea in early summer, then mature and develop secondary sexual traits before spawning later in the summer or at the beginning of fall.

During the time between entering fresh water and spawning, the fish group together in their lake's epilimnion -- the upper, warmer level of water in a thermally stratified lake. Usually the fish congregate, or stage, near tributary inlets and along shorelines.

During a staging period of unusually warm weather -- maximum daily air temperatures hovered around 80 degrees Fahrenheit for a week, the second-warmest heat wave on record -- researchers used a seine to capture fish and outfitted 95 of them with devices that logged water temperatures at 20-minute intervals.

What they learned from the 40 recovered temperature loggers was that when the epilimnion temperature rose above about 12 degrees Celsius (about 53 degrees Fahrenheit), the fish thermoregulated by moving to tributary plumes or to deeper water. By swimming away from the rising temperatures, the fish expended 50 percent less energy during the warmest conditions -- 64 to 68 degrees Fahrenheit -- than they would have had they stayed put.

"The hotter it is, the more energy they burn, but these fish don't just want the coldest water possible," Armstrong said.
"If they were cars looking for maximum fuel efficiency, they'd just find the coldest water, but instead it's a Goldilocks sort of thing -- they're looking for not too warm, not too cold."They want their system to go fast enough for them to go through maturation before they spawn, where they go from these silver torpedoes to these crazy, exaggerated beasts of sexual selection with a red body and green jaws."Armstrong noted the broader message of the study is what it says about the ability of animals to exploit the kinds of diversity of temperature and diversity of habitat found in ecosystems that are intact and not heavily developed."There's all this diversity and connectivity up there," Armstrong said. "Fish have lots of options for coping with warming or environmental change in general."When we develop watersheds, we often simplify habitats and take away these options. In our research we are constantly stumbling across new and interesting ways that fish and wildlife thrive by exploiting diversity in temperatures, often at small spatial scales that would be very easy to overlook. This study is one more example of how all the little details matter, and they could be what save animals from climate change, or at least reduce the impacts."
Weather
2016
December 6, 2016
https://www.sciencedaily.com/releases/2016/12/161206110224.htm
Longest-living animal gives up ocean climate secrets
A study of the longest-living animal on Earth, the quahog clam, has provided researchers with an unprecedented insight into the history of the oceans.
By studying the chemistry of growth rings in the shells of the quahog clam, an international team led by experts from Cardiff University and Bangor University have pieced together the history of the North Atlantic Ocean over the past 1000 years and discovered how its role in driving the atmospheric climate has drastically changed.

The research team showed that prior to the industrial period (pre-AD 1800), changes in the North Atlantic Ocean, brought about by variations in the Sun's activity and volcanic eruptions, were driving our climate and led to changes in the atmosphere, which subsequently impacted our weather. However, this has switched during the industrial period (1800-2000), and changes in the North Atlantic are now synchronous with, or lag behind, changes in the atmosphere, which the researchers believe could be due to the influence of greenhouse gases.

The results are extremely important in terms of discerning how changes in the North Atlantic Ocean may impact the climate and the weather across the Northern Hemisphere in the future. The findings have been published in the journal

The quahog clam, also known as a hard clam or chowder clam, is an edible mollusc native to the continental shelf seas of North America and Europe that can live for over 500 years. The chemistry in the growth rings in the shells of the clam -- which occur much like the annual growth rings in the centre of trees -- can act as a proxy for the chemical make-up of the oceans, enabling researchers to reconstruct a history of how the oceans have changed over the past 1000 years with unprecedented dating precision.

By comparing this record with records of solar variability, volcanic eruptions and atmospheric air temperatures, the researchers have been able to construct a bigger picture and investigate how each of these things has been linked to the others over time.

Lead author of the study Dr David Reynolds, from the School of Earth and Ocean Sciences, said: "Our results show that solar
variability and volcanic eruptions play a significant role in driving variability in the oceans over the past 1000 years. Results also showed that marine variability has played an active role in driving changes to Northern Hemisphere air temperatures in the pre-industrial era. This trend is not seen during the industrial period, where Northern Hemisphere temperature changes, driven by human-made forcings, precede variability in the marine environment."

Up until now, instrumental observations of the oceans have only spanned the last 100 years or so, whilst reconstructions using marine sediment cores come with significant age uncertainties. This has limited the ability of researchers to look further back in time and examine the role the ocean plays in the wider climate system using such detailed statistical analyses.

Co-author of the study Professor Ian Hall, from the School of Earth and Ocean Sciences, said: "Our results highlight the challenge of basing our understanding of the climate system on generally short observational records.

"Whilst they likely capture an element of natural variability, the strong anthropogenic trends observed over recent decades likely mask the true natural rhythms of the climate system. These data therefore provide an invaluable archive of the natural state of the ocean system and the expression of anthropogenic change over the last 1000 years.

"If we are to continue to develop the most robust near-term predictions of future climate change, we must continue to develop robust reconstructions of past ocean variability."
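The claim that the ocean led the atmosphere before 1800 but is synchronous with, or lags, it now rests on lead-lag analysis of annually resolved records. A minimal sketch of that idea -- find the lag at which two series correlate most strongly -- is below, on toy data rather than the shell-derived series; the simple Pearson approach here is an assumption, not the study's actual statistical machinery.

```python
def lagged_correlation(x, y, lag):
    """Pearson correlation of x against y shifted by `lag` steps
    (positive lag means x leads y). Pure-Python sketch for short
    annual series; no significance testing."""
    if lag >= 0:
        a, b = x[:len(x) - lag], y[lag:]
    else:
        a, b = x[-lag:], y[:len(y) + lag]
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((u - mean_a) * (v - mean_b) for u, v in zip(a, b))
    sd_a = sum((u - mean_a) ** 2 for u in a) ** 0.5
    sd_b = sum((v - mean_b) ** 2 for v in b) ** 0.5
    return cov / (sd_a * sd_b)

# Toy series in which y simply repeats x two "years" later:
x = [0, 1, 0, -1, 0, 2, 1, -1, 0, 1, 2, 0]
y = [0, 0] + x[:-2]
best_lag = max(range(-3, 4), key=lambda k: lagged_correlation(x, y, k))
```

Here the maximum correlation falls at a lag of two, recovering the built-in "ocean leads atmosphere by two years" relationship of the toy data.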
Weather
2016
December 6, 2016
https://www.sciencedaily.com/releases/2016/12/161206111455.htm
Snow data from satellites improves temperature predictions, researchers show
Researchers with The University of Texas at Austin have found that incorporating snow data collected from space into computer climate models can significantly improve seasonal temperature predictions.
The findings were published in November in

"We're interested in providing more accurate climate forecasts because the seasonal timescale is quite important for water resource management and people who are interested in next season's weather," said Peirong Lin, the lead author of the study and a graduate student at the UT Jackson School of Geosciences.

Seasonal forecasts are influenced by factors that are significantly more difficult to account for than the variables for daily to weekly weather forecasts or long-term climate change, said Zong-Liang Yang, a co-author of the study and a professor at the Jackson School of Geosciences Department of Geological Sciences.

"Between the short and very long time scales there's a seasonal time scale that's a very chaotic system," Yang said. "But there is some evidence that slowly varying surface conditions, like snow cover, will have a signature in the seasonal timescale."

The researchers found that incorporating snow data collected by NASA satellites into climate models improved regional temperature predictions by 5 to 25 percent. These findings are the first to go beyond general associations and break down how much snow can impact the temperature of a region months into the future. Improving temperature predictions is a key element of improving the computer models that provide climate predictions months in advance.

The researchers analyzed how data on snow cover and depth collected from two NASA satellites -- MODIS and GRACE -- affected temperature predictions of the Northern Hemisphere in a climate model. The study examined seasonal data from 2003 through 2009, so the researchers could compare the model's predictions to recorded temperatures.
The model ran predictions in three-month intervals, with January, February and March each used as starting months. The computer model's temperature improvement changed depending on the region and time, with the biggest improvements happening in regions where ground-based measurements are sparse, such as Siberia and the Tibetan Plateau. Climatic conditions of both these areas can influence the Indian Monsoon -- seasonal rains that are vital to agriculture in India -- a fact that shows the far-reaching applicability of seasonal climate prediction, Yang said.

"This correlation between snow and future monsoon has been established for several decades, but here we are developing a predictive framework where you can run the model forward and get a quantity, not just a correlation," Yang said.

In the future the researchers plan to expand their research to predict other climatic factors, such as snowfall and rainfall. For the time being, they hope that their findings can be useful to national organizations that make climate predictions, such as the U.S. National Oceanic and Atmospheric Administration and the European Forecasting Center.

Randal Koster, a scientist at NASA's Goddard Space Flight Center who studies land-atmosphere interactions using computer models, said that the study is an example of how satellites can improve climate forecasts by providing more accurate data to inform the starting conditions of the model.

"In the future such use of satellite data will be standard," said Koster, who was not involved with the study. "Pioneering studies like this are absolutely critical to seeing this happen."
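One simple way to quantify a "5 to 25 percent" prediction improvement is as a reduction in forecast error relative to a baseline run without the snow data. Whether the study used exactly this metric is an assumption; the RMSE values below are invented purely for illustration.

```python
def skill_improvement(rmse_baseline, rmse_with_snow):
    """Percent improvement in a seasonal temperature forecast when extra
    data are assimilated, expressed as the reduction in root-mean-square
    error relative to the baseline forecast. Inputs are hypothetical."""
    return 100.0 * (1.0 - rmse_with_snow / rmse_baseline)

# Illustrative seasonal-forecast errors, in degrees C:
improvement = skill_improvement(2.0, 1.7)
```

Cutting a hypothetical 2.0 C error to 1.7 C is a 15 percent improvement, squarely inside the 5-to-25-percent range the researchers report for snow assimilation.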
Weather
2016
December 5, 2016
https://www.sciencedaily.com/releases/2016/12/161205113434.htm
Extreme downpours could increase fivefold across parts of the US
At century's end, the number of summertime storms that produce extreme downpours could increase by more than 400 percent across parts of the United States -- including sections of the Gulf Coast, Atlantic Coast, and the Southwest -- according to a new study by scientists at the National Center for Atmospheric Research (NCAR).
The study was published in the journal

"These are huge increases," said NCAR scientist Andreas Prein, lead author of the study. "Imagine the most intense thunderstorm you typically experience in a single season. Our study finds that, in the future, parts of the U.S. could expect to experience five of those storms in a season, each with an intensity as strong or stronger than current storms."

The study was funded by the National Science Foundation (NSF), NCAR's sponsor, and the Research Partnership to Secure Energy for America.

"Extreme precipitation events affect our infrastructure through flooding, landslides and debris flows," said Anjuli Bamzai, program director in NSF's Directorate for Geosciences, which funded the research. "We need to better understand how these extreme events are changing. By supporting this research, NSF is working to foster a safer environment for all of us."

An increase in extreme precipitation is one of the expected impacts of climate change because scientists know that as the atmosphere warms, it can hold more water, and a wetter atmosphere can produce heavier rain. In fact, an increase in precipitation intensity has already been measured across all regions of the U.S. However, climate models are generally not able to simulate these downpours because of their coarse resolution, which has made it difficult for researchers to assess future changes in storm frequency and intensity.

For the new study, the research team used a new dataset that was created when NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda ran the NCAR-based Weather Research and Forecasting (WRF) model at a resolution of 4 kilometers, fine enough to simulate individual storms. The simulations, which required a year to run, were performed on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

Prein and his co-authors used the new dataset to investigate changes in downpours over North America in detail.
The researchers looked at how storms that occurred between 2000 and 2013 might change if they occurred instead in a climate that was 5 degrees Celsius (9 degrees Fahrenheit) warmer -- the temperature increase expected by the end of the century if greenhouse gas emissions continue unabated.

Prein cautioned that this approach is a simplified way of comparing present and future climate. It doesn't reflect possible changes to storm tracks or weather systems associated with climate change. The advantage, however, is that scientists can more easily isolate the impact of additional heat and associated moisture on future storm formation.

"The ability to simulate realistic downpours is a quantum leap in climate modeling. This enables us to investigate changes in hourly rainfall extremes that are related to flash flooding for the very first time," Prein said. "To do this took a tremendous amount of computational resources."

The study found that the number of summertime storms producing extreme precipitation is expected to increase across the entire country, though the amount varies by region. The Midwest, for example, sees an increase of zero to about 100 percent across swaths of Nebraska, the Dakotas, Minnesota, and Iowa. But the Gulf Coast, Alabama, Louisiana, Texas, New Mexico, Arizona, and Mexico all see increases ranging from 200 percent to more than 400 percent.

The study also found that the intensity of extreme rainfall events in the summer could increase across nearly the entire country, with some regions, including the Northeast and parts of the Southwest, seeing particularly large increases -- in some cases of more than 70 percent.

A surprising result of the study is that extreme downpours will also increase in areas that are getting drier on average, especially in the Midwest.
This is because moderate rainfall events, which are the major source of moisture in this region during the summertime, are expected to decrease significantly while extreme events increase in frequency and intensity. This shift from moderate to intense rainfall increases the potential for flash floods and mudslides, and can have negative impacts on agriculture.

The study also investigated how the environmental conditions that produce the most severe downpours might change in the future. In today's climate, the storms with the highest hourly rainfall intensities form when the daily average temperature is somewhere between 20 and 25 degrees C (68 to 77 degrees F) and atmospheric moisture is high. When the temperature gets too hot, rainstorms become weaker or don't occur at all because the increase in atmospheric moisture cannot keep pace with the increase in temperature. This relative drying of the air robs the atmosphere of one of the essential ingredients needed to form a storm.

In the new study, the NCAR scientists found that storms may continue to intensify up to temperatures of 30 degrees C (86 degrees F) because of a more humid atmosphere. The result would be much more intense storms.

"Understanding how climate change may affect the environments that produce the most intense storms is essential because of the significant impacts that these kinds of storms have on society," Prein said.
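The physical driver in the passage above -- a warmer atmosphere holds roughly 6 to 7 percent more water vapor per degree Celsius, per the Clausius-Clapeyron relation -- can be turned into a rough scaling of downpour intensity. This sketch captures the thermodynamic argument only; it ignores the storm-dynamics changes that the 4-kilometer simulations resolve, and the 6 percent rate and 50 mm/hr example are assumed values, not numbers from the study.

```python
def cc_scaled_intensity(intensity_now, warming_c, rate_per_c=0.06):
    """Scale a rainfall intensity by a Clausius-Clapeyron-like rate
    (~6% more atmospheric moisture per degree C), compounded per degree
    of warming. Thermodynamics only; no change in storm dynamics."""
    return intensity_now * (1.0 + rate_per_c) ** warming_c

# A hypothetical 50 mm/hr downpour under the study's 5 C pseudo-warming:
future_intensity = cc_scaled_intensity(50.0, 5.0)
```

Under these assumptions a 50 mm/hr downpour scales to roughly 67 mm/hr, a reminder that moisture scaling alone already produces a substantial intensification before any change in storm frequency is considered.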
Weather
2016
December 1, 2016
https://www.sciencedaily.com/releases/2016/12/161201172318.htm
Climate change will drive stronger, smaller storms in U.S., new modeling approach forecasts
The effects of climate change will likely cause smaller but stronger storms in the United States, according to a new framework for modeling storm behavior developed at the University of Chicago and Argonne National Laboratory. Though storm intensity is expected to increase over today's levels, the predicted reduction in storm size may alleviate some fears of widespread severe flooding in the future.
The new approach was published today in

"Climate models all predict that storms will grow significantly more intense in the future, but that total precipitation will increase more mildly over what we see today," said senior author Elisabeth Moyer, associate professor of geophysical sciences at the University of Chicago and co-PI of the Center for Robust Decision-Making on Climate and Energy Policy (RDCEP). "By developing new statistical methods that study the properties of individual rainstorms, we were able to detect changes in storm frequency, size, and duration that explain this mismatch."

While many concerns about the global impact of climate change focus on increased temperatures, shifts in precipitation patterns could also incur severe social, economic, and human costs. Increased droughts in some regions and increased flooding in others would dramatically affect world food and water supplies, as well as place extreme strain on infrastructure and government services.

Most climate models agree that high levels of atmospheric carbon will increase precipitation intensity, by an average of approximately 6 percent per degree of temperature rise. These models also predict an increase in total precipitation; however, this growth is smaller, only 1 to 2 percent per degree of temperature rise. The changes in storm behavior that might explain this gap have remained elusive. In the past, climate simulations were too coarse in resolution (hundreds of kilometers) to accurately capture individual rainstorms. More recently, high-resolution simulations have begun to approach weather scale, but analytic approaches had not yet evolved to make use of that information and evaluated only aggregate shifts in precipitation patterns instead of individual storms.

To address this discrepancy, postdoctoral scholar Won Chang (now an assistant professor at the University of Cincinnati) and co-authors Michael Stein, Jiali Wang, V.
Rao Kotamarthi, and Moyer developed new methods to analyze rainstorms in observational data or high-resolution model projections. First, the team adapted morphological approaches from computational image analysis to develop new statistical algorithms for detecting and analyzing individual rainstorms over space and time. The researchers then analyzed results of new ultra-high-resolution (12 km) simulations of U.S. climate performed with the Weather Research and Forecasting (WRF) model at Argonne National Laboratory.

Analyzing simulations of precipitation in the present (2002-2011) and future (2085-2094), the researchers detected changes in storm features that explained why the stronger storms predicted didn't increase overall rainfall as much as expected. Individual storms become smaller in terms of the land area covered, especially in the summer. (In winter, storms become smaller as well, but also less frequent and shorter.)

"It's an exciting time when climate models are starting to look more like weather models," Chang said. "We hope that these new methods become the standard for model evaluation going forward."

The team also found several important differences between model output and present-day weather. The model tended to predict storms that were both weaker and larger than those actually observed, and in winter, model-forecast storms were also fewer and longer than those observed. Assessing these model "biases" is critical for making reliable forecasts of future storms.

"While our results apply to only one model simulation," Moyer said, "we do know that the amount-intensity discrepancy is driven by pretty basic physics. Rainstorms in every model, and in the real world, will adjust in some way to let intensity grow by more than total rainfall does. Most people would have guessed that storms would change in frequency, not in size.
We now have the tools at hand to evaluate these results across models and to check them against real-world changes, as well as to evaluate the performance of the models themselves."

New precipitation forecasts that include these changes in storm characteristics will add important details that help assess future flood risk under climate change. These results suggest that concerns about higher-intensity storms causing severe floods may be tempered by reductions in storm size, and that the tools developed at UChicago and Argonne can help further clarify future risk.

The paper, "Changes in spatio-temporal precipitation patterns in changing climate conditions," will appear in the Dec. 1 edition of the Journal of Climate.

This work was conducted as part of the Research Network for Statistical Methods for Atmospheric and Oceanic Sciences (STATMOS), supported by NSF awards 1106862, 1106974, and 1107046, and the Center for Robust Decision-making on Climate and Energy Policy (RDCEP), supported by the NSF "Decision Making under Uncertainty" program award 0951576.
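The amount-intensity discrepancy Moyer describes can be made concrete with simple arithmetic: if per-storm intensity grows faster than total precipitation, then the combined product of storm size, frequency, and duration must shrink to reconcile the two. The sketch below uses the roughly 6 percent and 1 to 2 percent per-degree rates quoted earlier in the article; the 5-degree warming figure is an assumption chosen for illustration, not a number from this study.

```python
def implied_storm_shrinkage(intensity_rate, total_rate, warming_c):
    """If per-storm intensity grows at intensity_rate per degree C while
    total precipitation grows only at total_rate, the combined storm
    size/frequency/duration must shrink by the ratio of the two factors.
    Returns the implied fractional reduction."""
    intensity_factor = (1.0 + intensity_rate) ** warming_c
    total_factor = (1.0 + total_rate) ** warming_c
    return 1.0 - total_factor / intensity_factor

# ~6%/C intensity growth vs. ~1.5%/C total growth, over an assumed 5 C:
shrinkage = implied_storm_shrinkage(0.06, 0.015, 5.0)
```

With these rates, roughly a fifth of the combined storm extent has to disappear, which is the same qualitative conclusion the morphological storm analysis reached: smaller (and, in winter, fewer and shorter) storms.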
Weather
2016
December 1, 2016
https://www.sciencedaily.com/releases/2016/12/161201164803.htm
More frequent, more intense and longer-lasting storms cause heavier spring rain in central US
Intense storms have become more frequent and longer-lasting in the Great Plains and Midwest in the last 35 years. What has fueled these storms? The temperature difference between the Southern Great Plains and the Atlantic Ocean produces winds that carry moisture from the Gulf of Mexico to the Great Plains, according to a recent study in
"These storms are impressive," said atmospheric scientist Zhe Feng at the Department of Energy's Pacific Northwest National Laboratory. "A storm can span the entire state of Oklahoma and last 24 hours as it propagates eastward from the Rocky Mountain foothills across the Great Plains, producing heavy rain along the way."Understanding how storms changed in the past is an important step towards projecting future changes. The largest storms, especially, have been challenging to simulate."These storms bring well over half of the rain received in the central U.S. in the spring and summer," said atmospheric scientist Ruby Leung, a coauthor with Feng and others at PNNL. "But almost no climate model can simulate these storms. Even though these storms are big enough for the models to capture, they are more complicated than the smaller isolated thunderstorms or the larger frontal rainstorms that models are wired to produce."Previous research had found more heavy springtime rain falling in the central United States in recent decades, but scientists did not know what types of storms were causing the increase. Different storm types might respond in their own unique ways as the climate warms, so the PNNL researchers set out to find out.To do so, the team worked out a way to identify storms called mesoscale convective systems. This type of storm develops from smaller convective storms that aggregate to form the largest type of convective storms on Earth. They are best detected using satellites with a bird's eye view from space. Feng transformed well-established satellite detection methods into a new technique that he then applied to rainfall measured by radars and rain gauges for the past 35 years. 
This allowed the researchers to identify thousands of the large convective storms and their rainfall east of the Rocky Mountains.

The results showed the frequency of very long-lasting storms increased by about 4 percent per decade, most notably in the northern half of the central region -- just below the Great Lakes. The researchers rated the storms that produced the top five percent of rainfall as extreme events and saw that extreme events have become more frequent in the last 35 years.

But what contributes to the changes in the frequency and characteristics of mesoscale convective systems? To find out, the researchers analyzed the region's meteorological environment. They found that the Southern Great Plains warms more than the ocean does. This difference in temperature creates a pressure gradient between the Rocky Mountains and the Atlantic Ocean that induces stronger winds that push moisture up from the Gulf of Mexico. The warmer, moister air converges in the Northern Great Plains, where the moisture falls in massive storms.

Although these storms are occurring more often and producing heavier rainfall, whether they turn into floods depends on the details.

"Flooding depends not only on precipitation intensity and duration, but also how much water the ground can hold," said Leung. "Teasing out whether the observed changes in rain have led to increased flooding is complicated by reservoirs and land use. Both have the ability to modulate soil moisture and streamflow, hence flooding."
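The "top five percent" rating step lends itself to a small sketch. The catalog below is synthetic (invented years and rainfall totals, not the radar and rain-gauge records used in the study); it only illustrates the mechanics of thresholding at the 95th percentile and tallying extreme events by decade:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical storm catalog: year and storm-total rainfall (mm) for
# 5,000 storms over 35 years; all values are invented for illustration.
years = rng.integers(1980, 2015, size=5000)
rainfall = rng.gamma(shape=2.0, scale=30.0, size=5000)

# Rate storms in the top five percent of rainfall as "extreme events".
threshold = np.percentile(rainfall, 95)
extreme = rainfall >= threshold

# Tally extreme events per decade to see whether they become more frequent.
decades = (years // 10) * 10
counts = {int(d): int(np.sum(extreme & (decades == d))) for d in np.unique(decades)}
print(round(float(threshold), 1), counts)
```

With real data, the per-decade counts would then feed a trend estimate such as the roughly 4-percent-per-decade figure quoted above.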
Weather
2016
December 1, 2016
https://www.sciencedaily.com/releases/2016/12/161201161715.htm
Increasing tornado outbreaks: Is climate change responsible?
Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion. The largest U.S. impacts of tornadoes result from tornado outbreaks, sequences of tornadoes that occur in close succession. Last spring a research team led by Michael Tippett, associate professor of applied physics and applied mathematics at Columbia Engineering, published a study showing that the average number of tornadoes during outbreaks -- large-scale weather events that can last one to three days and span huge regions -- has risen since 1954. But they were not sure why.
In a new paper, published December 1 in Science via First Release, the researchers looked at increasing trends in the severity of tornado outbreaks, where they measured severity by the number of tornadoes per outbreak. They found that these trends are increasing fastest for the most extreme outbreaks. While they saw changes in meteorological quantities that are consistent with these upward trends, the meteorological trends were not the ones expected under climate change.

"This study raises new questions about what climate change will do to severe thunderstorms and what is responsible for recent trends," says Tippett, who is also a member of the Data Science Institute and the Columbia Initiative on Extreme Weather and Climate. "The fact that we don't see the presently understood meteorological signature of global warming in changing outbreak statistics leaves two possibilities: either the recent increases are not due to a warming climate, or a warming climate has implications for tornado activity that we don't understand. This is an unexpected finding."

The researchers used two NOAA datasets, one containing tornado reports and the other observation-based estimates of meteorological quantities associated with tornado outbreaks. "Other researchers have focused on tornado reports without considering the meteorological environments," notes Chiara Lepore, associate research scientist at the Lamont-Doherty Earth Observatory, who is a coauthor of the paper. "The meteorological data provide an independent check on the tornado reports and let us check for what would be expected under climate change."

U.S. tornado activity in recent decades has been drawing the attention of scientists. While no significant trends have been found in either the annual number of reliably reported tornadoes or of outbreaks, recent studies indicate increased variability in large normalized economic and insured losses from U.S.
thunderstorms, increases in the annual number of days on which many tornadoes occur, and increases in the annual mean and variance of the number of tornadoes per outbreak. In the current study, the researchers used extreme value analysis and found that the frequency of U.S. outbreaks with many tornadoes is increasing, and is increasing faster for more extreme outbreaks. They modeled this behavior using extreme value distributions with parameters that vary to match the trends in the data.

Extreme meteorological environments associated with severe thunderstorms showed consistent upward trends, but the trends did not resemble those currently expected to result from global warming. They looked at two factors: convective available potential energy (CAPE) and a measure of vertical wind shear, storm relative helicity. Modeling studies have projected that CAPE will increase in a warmer climate leading to more frequent environments favorable to severe thunderstorms in the U.S. However, they found that the meteorological trends were not due to increasing CAPE but instead due to trends in storm relative helicity, which has not been projected to increase under climate change.

"Tornadoes blow people away, and their houses and cars and a lot else," says Joel Cohen, coauthor of the paper and director of the Laboratory of Populations, which is based jointly at Rockefeller University and Columbia's Earth Institute. "We've used new statistical tools that haven't been used before to put tornadoes under the microscope. The findings are surprising. We found that, over the last half century or so, the more extreme the tornado outbreaks, the faster the numbers of such extreme outbreaks have been increasing. What's pushing this rise in extreme outbreaks is far from obvious in the present state of climate science. Viewing the thousands of tornadoes that have been reliably recorded in the U.S.
over the past half century or so as a population has permitted us to ask new questions and discover new, important changes in outbreaks of these tornadoes."

Adds Harold Brooks, senior scientist at NOAA's National Severe Storms Laboratory, who was not involved with this project, "The study is important because it addresses one of the hypotheses that has been raised to explain the observed change in number of tornadoes in outbreaks. Changes in CAPE can't explain the change. It seems that changes in shear are more important, but we don't yet understand why those have happened and if they're related to global warming."

Better understanding of how climate affects tornado activity can help to predict tornado activity in the short-term, a month, or even a year in advance, and would be a major aid to insurance and reinsurance companies in assessing the risks posed by outbreaks. "An assessment of changing tornado outbreak size is highly relevant to the insurance industry," notes Kelly Hererid, AVP, Senior Research Scientist, Chubb Tempest Re R&D. "Common insurance risk management tools like reinsurance and catastrophe bonds are often structured around storm outbreaks rather than individual tornadoes, so an increasing concentration of tornadoes into larger outbreaks provides a mechanism to change loss potential without necessarily altering the underlying tornado count. This approach provides an expanded view of disaster potential beyond simple changes in event frequency."

Tippett notes that more studies are needed to attribute the observed changes to either global warming or another component of climate variability.
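As a toy illustration of the extreme value approach described above, the sketch below fits a generalized extreme value (GEV) distribution to synthetic annual maxima of tornadoes-per-outbreak. The numbers and the upward drift are invented stand-ins for the NOAA records, and the fit here is stationary, unlike the study's trend-varying parameters:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Hypothetical annual maxima of tornadoes-per-outbreak for 1954-2013,
# with a mild upward drift standing in for the observed trend.
years = np.arange(1954, 2014)
annual_max = rng.gumbel(loc=20 + 0.15 * (years - 1954), scale=8.0)

# Fit a generalized extreme value (GEV) distribution to the sample.
shape, loc, scale = genextreme.fit(annual_max)

# Compare how often "many-tornado" years (here, maxima above 30) occur
# in the early and late halves of the record.
early = float(np.mean(annual_max[:30] > 30))
late = float(np.mean(annual_max[30:] > 30))
print(round(shape, 2), round(loc, 1), round(scale, 1), early, late)
```

The study's actual method lets the GEV location and scale vary with time, which is what reveals the faster increase in the most extreme outbreaks.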
The research group plans next to study other aspects of severe thunderstorms such as hail, which causes less intense damage but is important for business (especially insurance and reinsurance) because it affects larger areas and is responsible for substantial losses every year.

The study was partially funded by a Columbia University Research Initiatives for Science and Engineering (RISE) award; the Office of Naval Research; NOAA's Climate Program Office's Modeling, Analysis, Predictions and Projections; Willis Research Network; and the National Science Foundation.
Weather
2016
November 30, 2016
https://www.sciencedaily.com/releases/2016/11/161130141053.htm
6,000 years ago the Sahara Desert was tropical, so what happened?
As little as 6,000 years ago, the vast Sahara Desert was covered in grassland that received plenty of rainfall, but shifts in the world's weather patterns abruptly transformed the vegetated region into some of the driest land on Earth. A Texas A&M University researcher is trying to uncover the clues responsible for this enormous climate transformation -- and the findings could lead to better rainfall predictions worldwide.
Robert Korty, associate professor in the Department of Atmospheric Sciences, along with colleague William Boos of Yale University, have had their work published in the current issue of Nature Geoscience.

The two researchers have looked into precipitation patterns of the Holocene era and compared them with present-day movements of the intertropical convergence zone, a large region of intense tropical rainfall. Using computer models and other data, the researchers found links to rainfall patterns thousands of years ago.

"The framework we developed helps us understand why the heaviest tropical rain belts set up where they do," Korty explains. "Tropical rain belts are tied to what happens elsewhere in the world through the Hadley circulation, but it won't predict changes elsewhere directly, as the chain of events is very complex. But it is a step toward that goal."

The Hadley circulation is a tropical atmospheric circulation that rises near the equator. It is linked to the subtropical trade winds, tropical rainbelts, and affects the position of severe storms, hurricanes, and the jet stream. Where it descends in the subtropics, it can create desert-like conditions. The majority of Earth's arid regions are located in areas beneath the descending parts of the Hadley circulation.

"We know that 6,000 years ago, what is now the Sahara Desert was a rainy place," Korty adds. "It has been something of a mystery to understand how the tropical rain belt moved so far north of the equator.
Our findings show that large migrations in rainfall can occur in one part of the globe even while the belt doesn't move much elsewhere."

"This framework may also be useful in predicting the details of how tropical rain bands tend to shift during modern-day El Niño and La Niña events (the cooling or warming of waters in the central Pacific Ocean which tend to influence weather patterns around the world)."

The findings could lead to better ways to predict future rainfall patterns in parts of the world, Korty believes.

"One of the implications of this is that we can deduce how the position of the rainfall will change in response to individual forces," he says. "We were able to conclude that the variations in Earth's orbit that shifted rainfall north in Africa 6,000 years ago were by themselves insufficient to sustain the amount of rain that geologic evidence shows fell over what is now the Sahara Desert. Feedbacks between the shifts in rain and the vegetation that could exist with it are needed to get heavy rains into the Sahara."
Weather
2016
November 29, 2016
https://www.sciencedaily.com/releases/2016/11/161129103936.htm
NASA's ISS-RapidScat Earth science mission ends
NASA's International Space Station Rapid Scatterometer (ISS-RapidScat) Earth science instrument has ended operations following a successful two-year mission aboard the space station. The mission launched Sept. 21, 2014, and had recently passed its original decommissioning date.
ISS-RapidScat used the unique vantage point of the space station to provide near-real-time monitoring of ocean winds, which are critical in determining regional weather patterns. Its measurements of wind speed and direction over the ocean surface have been used by agencies worldwide for weather and marine forecasting and tropical cyclone monitoring. Its location on the space station made it the first spaceborne scatterometer that could observe how winds evolve throughout the course of a day.

"As a first-of-its-kind mission, ISS-RapidScat proved successful in providing researchers and forecasters with a low-cost eye on winds over remote areas of Earth's oceans," said Michael Freilich, director of NASA's Earth Science Division. "The data from ISS-RapidScat will help researchers contribute to an improved understanding of fundamental weather and climate processes, such as how tropical weather systems form and evolve."

The agencies that routinely used ISS-RapidScat's data for forecasting and monitoring operations include the National Oceanic and Atmospheric Administration (NOAA) and the U.S. Navy, along with European and Indian weather agencies. It provided more complete coverage of wind patterns far out to sea that could build into dangerous storms. Even if these storms never reach land, they can bring devastating wave impacts to coastal areas far away.

"The unique coverage of ISS-RapidScat allowed us to see the rate of change or evolution in key wind features along mid-latitude storm tracks, which happen to intersect major shipping routes," said Paul Chang, Ocean Surface Winds Science team lead at NOAA's Center for Satellite Applications and Research.
"ISS-RapidScat observations improved situational awareness of marine weather conditions, which aid optimal ship routing and hazard avoidance, and marine forecasts and warnings."During its mission, ISS-RapidScat also provided new insights into research questions such as how changing winds over the Pacific drove changes in sea surface temperature during the 2015-2016 El Niño event. Due to its unique ability to sample winds at different times of day, its data will be useful to scientists for years to come.ISS-RapidScat was born out of ingenuity, expertise and a need for speed. It was constructed in less than two years to replace its widely valued predecessor, NASA's decade-old QuikScat scatterometer satellite, at a fraction of the cost of the original -- largely by adapting spare parts from QuikScat.On Aug. 19, a power distribution unit for the space station's Columbus module failed, resulting in a power loss to ISS-RapidScat. Later that day, as the mission operations team from NASA's Jet Propulsion Laboratory in Pasadena, California, attempted to reactivate the instrument, one of the outlets on the power distribution unit experienced an electrical overload. In the following weeks, multiple attempts to restore ISS-RapidScat to normal operations were not successful, including a final attempt on Oct. 17.NASA currently does not plan to launch another scatterometer mission. However, the loss of ISS-RapidScat data will be partially mitigated by the newly launched ScatSat ocean wind sensor, a mission of the Indian Space Research Organization.ISS-RapidScat was the first continuous Earth-observing instrument specifically designed and developed to operate on the International Space Station exterior, but it's no longer the only one. The Cloud-Aerosol Transport System (CATS) joined the space station in January 2015 to provide cost-effective measurements of atmospheric aerosols and clouds in Earth's atmosphere. 
Two more instruments are scheduled to launch to the space station in 2017 -- one that will allow scientists to monitor the ozone layer's gradually improving health, and another to observe lightning over Earth's tropics and mid-latitudes. Following that, two additional Earth science instruments are scheduled for launch in 2018 and 2019.

ISS-RapidScat was a partnership between JPL and the International Space Station Program Office at NASA's Johnson Space Center in Houston, with support from the Earth Science Division of NASA's Science Mission Directorate in Washington. Other mission partners include the agency's Kennedy Space Center in Florida and its Marshall Space Flight Center in Huntsville, Alabama; the European Space Agency; and SpaceX.

NASA collects data from space, air, land and sea to increase our understanding of our home planet, improve lives and safeguard our future. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.

To access ISS-RapidScat data, or for more information, visit:
Weather
2016
November 29, 2016
https://www.sciencedaily.com/releases/2016/11/161129084235.htm
Climate change affects Swedish reindeer herding and increases tularemia infection
In northern Sweden, data from certain weather stations have shown that the snow season has been shortened by over two months in the last 30 years, which has huge effects on reindeer herding. The climate-sensitive human infection tularemia has also increased tenfold over the same period and is much more common now than before, according to a dissertation at Umeå University in Sweden.
In some places in the north of Sweden, the snow season has been shortened by more than two months between 1978 and 2008, which has dire consequences for life in the North. Data from ten weather stations in reindeer herding areas, from Frösön in mid-Swedish Jämtland to the very north of Sweden, shows that the coldest days have dwindled the most during the period and that long periods of really cold weather are today much less common than previously.

"Our research shows that climate change in northern Sweden is more extensive than anticipated and that reindeer herding is very vulnerable," says Maria Furberg, doctoral student at the Department of Public Health and Clinical Medicine and the Department of Clinical Microbiology.

Climate change in northern Sweden, indicated by shorter periods of snow for instance, has had negative effects on reindeer herders' livelihood. Reindeer herders' ability to handle the consequences is weakened further by other circumstances that also affect reindeer herding, such as increased competition from other businesses, continuously shrinking grazing lands, predator policies and poor financial conditions.

In her dissertation, Maria Furberg shows that the Swedish national incidence of tularemia, also known as rabbit fever (read more below), has increased significantly both geographically and in number between 1984 and 2012. The cases also seem to be related to watercourses and lakes. A survey of 1,500 randomly selected inhabitants in the two northernmost counties of Sweden, Norrbotten and Västerbotten, completed in 2014, showed that just under three per cent showed signs of having had a tularemia infection. That corresponds to a 16-fold increase in comparison to reported cases.

"The massive increase in numbers of reported tularemia cases is startling and the disease seems to be much more common than previously anticipated.
This means that our health care needs to improve tularemia diagnostics so that all patients receive the correct treatment. Also, the reasons behind the increase in tularemia need to be investigated further with continued research," says Maria Furberg.

Tularemia is a zoonotic disease, which means that it is transmitted from animals to humans. In Sweden, the disease is often mosquito-borne. The symptoms of the disease are high temperature, ulceration, swelling of lymph nodes and sometimes even severe pneumonia.
Weather
2016
November 24, 2016
https://www.sciencedaily.com/releases/2016/11/161124081751.htm
Gulf Stream may strengthen with more precipitation in the far north
Using a new theory, Erwin Lambert shows that more freshwater in the Arctic may strengthen the Gulf Stream's extension into the polar regions -- the opposite of what has generally been anticipated with future climate change.
A new study from researchers at the Bjerknes Centre for Climate Research gives less reason to fear a weakening of the Gulf Stream due to climate change. One of the suggested 'tipping points' in the climate system is a substantial slow-down or even collapse of the Gulf Stream due to increased freshwater input in the northern seas. In a warmer climate, the hydrological cycle of precipitation and evaporation will strengthen, including more rainfall, river runoff and ice melt in the north. In its most extreme form, one can imagine this literally closing the large-scale ocean circulation between the Arctic and the lower latitudes.

In the article 'How northern freshwater input can stabilise thermohaline circulation', Erwin Lambert, PhD student at the University of Bergen and the Bjerknes Centre, studies how ocean circulation is affected by increased freshwater input. Lambert and colleagues show how increased freshwater input in the north can in some cases even strengthen the Gulf Stream extension into the Arctic -- just like a river in a typical Norwegian fjord is a driver for the fjord's exchange with the surrounding ocean.

In 1961, the American oceanographer Henry Stommel reduced the ocean to a few equations. With Stommel's model, the North Atlantic can be split into a warm part in the south and the cold Nordic Seas in the north -- a thought experiment of two boxes, without coastlines, islands or underwater ridges. In the North Atlantic, water flows northward at the surface, before sinking to the bottom in the Nordic Seas and flowing back southward in the deep ocean. The surface current flowing north is what we think of as the Gulf Stream. Stommel's description of how water circulates between warm and cold regions like this was entirely theoretical, and it was the Finnish oceanographer Clas Rooth who applied it to the Atlantic in 1982.

Stommel's model is simple. It does not represent all factors in the real world, but still made it possible to answer a big question.
You can neglect the wind, and there will be circulation in the North Atlantic. As long as water sinks in the north, the Gulf Stream will continue to flow north.

"The beauty of such a model is that we can understand the full behavior of its circulation," says Erwin Lambert.

Lambert is a PhD candidate at the Geophysical Institute at the University of Bergen and the Bjerknes Centre, and works with a box model that builds on Stommel's model. He remarks that both theoretical models, like his, and the large and more detailed circulation models used for weather forecasting and climate projections, only represent the real world to a limited extent.

"The benefit of a theoretical model is that we know, and actually choose, what these limitations will be."

Simple models make it easier to pin-point the effect of changes. Like Stommel, Lambert can choose to let the water in the north be less salty and calculate how the ocean current will react to more freshwater in the Nordic Seas. A fresher north is exactly what is expected with global warming.

In a warmer world, there will be more rain and snow in the northern regions, and meltwater from glaciers and sea ice will pour into the ocean; together this will make the water in the Nordic Seas less salty. The salty Atlantic water that flows in from the south will mix with water that is fresher than it used to be, and the mix will be less dense. As a result, water entering the Nordic Seas will not sink as efficiently as it has done in the past. According to Stommel's model, this would reduce the circulation in the Atlantic Ocean. This is the background for theories that global warming may weaken the Gulf Stream.

Two thirds of the water that enters the Nordic Seas flows back south in the deep ocean. The remaining one third continues on the north-bound route and enters the Arctic Ocean. This water is not included in Stommel's model, and when calculating the effect of climate change, it must be.
The old model consists of one box for the southern part of the North Atlantic and one for the Nordic Seas. By adding a third box, the Arctic Ocean, the circulation in the Atlantic Ocean is stabilized. When you include the effect of more freshwater in the Arctic Ocean, the current will be less reduced than in Stommel's model with only two boxes. This makes Erwin Lambert think that increasing precipitation in the north may be less important for the circulation in the Atlantic than previously believed.

He admits that it is still an open question how well such simple box models represent reality. For example, wind -- which the Stommel model does not consider -- is a vital driver of the Gulf Stream near the surface. But Lambert maintains that simple models still make it possible to study major processes in the ocean.

"It's amazing how much knowledge can be gained from a model that consists of merely five equations."
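A nondimensional version of Stommel's two-box balance (a common textbook reduction, not the exact equations of the 1961 paper, and without Lambert's third Arctic box) can be integrated in a few lines. Here x is the scaled salinity difference between the boxes, q = 1 - x the flow strength, and F the freshwater forcing:

```python
# Nondimensional Stommel-style two-box model (illustrative only).

def step(x, F, dt=0.01):
    q = 1.0 - x                       # thermally driven minus haline part
    return x + dt * (F - abs(q) * x)  # freshwater input vs. salt transport

def equilibrium(F, x0=0.0, steps=20000):
    """Integrate to a steady state with simple forward-Euler steps."""
    x = x0
    for _ in range(steps):
        x = step(x, F)
    return x

# Weak forcing: strong thermally driven flow (q near 1).
# Strong forcing: the solution jumps to the reversed, haline-driven branch.
for F in (0.05, 0.2, 0.4):
    x = equilibrium(F)
    print(F, round(x, 3), "flow q =", round(1.0 - x, 3))
```

For the strongest forcing the steady state has x > 1, i.e. a reversed flow: the salinity contrast has won, which is the "collapse" scenario the article discusses. Lambert's point is that adding an Arctic box changes how much forcing such a transition takes.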
Weather
2016
November 22, 2016
https://www.sciencedaily.com/releases/2016/11/161122122811.htm
Deep sea coral in North Atlantic faces threat from climate change
North Atlantic coral populations -- key to supporting a variety of sea life -- are under threat from climate change, a study suggests.
Changes to winter weather conditions could threaten the long-term survival of coral in the region, upsetting fragile ecosystems that support an array of marine species, researchers say.

Corals allow diverse forms of marine life to thrive by building reef structures that provide protection from predators and safe spaces to reproduce.

The team focused on a reef-building species of cold-water coral. Researchers at the University of Edinburgh used computer models to simulate the migration of larvae across vast stretches of ocean. They did so to predict the effect weather changes could have on the coral's long-term survival.

They found that a shift in average winter conditions in western Europe -- one of the predicted impacts of climate change -- could threaten coral populations. Ocean currents -- affected by changing wind patterns -- could drive larvae away from key sites in a new network of marine areas established to help safeguard coral populations, researchers say.

The team found Scotland's network of Marine Protected Areas -- or MPAs -- appears to be weakly connected, making it vulnerable to the effects of climate change. A coral population on Rosemary Bank seamount, an undersea mountain off Scotland's west coast, is key to maintaining the network.

Corals also thrive on oil and gas platforms in the North Sea and west of Shetland, which may help to bridge a gap in the MPA network between populations in the Atlantic and along the coast of Norway, the team says. The study has been published in a scientific journal.

Dr Alan Fox, of the University of Edinburgh's School of GeoSciences, who conducted the analysis, said: "We can't track larvae in the ocean, but what we know about their behaviour allows us to simulate their epic journeys, predicting which populations are connected and which are isolated.
In less well connected coral networks, populations become isolated and cannot support each other, making survival and recovery from damage more difficult."

Professor Murray Roberts, of the University of Edinburgh's School of GeoSciences and co-ordinator of the ATLAS project, said: "Scotland's seabed plays a unique role as a stepping stone for deep-sea Atlantic species. By teaming up with researchers in Canada and the US, we will expand this work right across the Atlantic Ocean."
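Larval-drift simulations of the kind described above can be caricatured as advection by a mean current plus a random walk for eddy dispersal. Every number below (site positions, current, dispersal strength, settlement radius) is invented for illustration and is not taken from the Edinburgh model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical sites (km) and an invented mean current.
source = np.array([0.0, 0.0])
target = np.array([100.0, 0.0])
current = np.array([2.0, 0.0])  # mean drift, km per day
diffusion = 15.0                # random eddy dispersal, km per sqrt(day)

def simulate(n_larvae=5000, days=60, dt=1.0):
    """Advect larvae with the mean current plus a random walk, then count
    the fraction 'settling' within 20 km of the target site."""
    pos = np.tile(source, (n_larvae, 1))
    for _ in range(int(days / dt)):
        pos = pos + current * dt
        pos = pos + diffusion * np.sqrt(dt) * rng.standard_normal(pos.shape)
    dist = np.linalg.norm(pos - target, axis=1)
    return float(np.mean(dist < 20.0))

print(simulate())
```

In a connectivity study, this arrival fraction would be computed for every pair of protected sites under different wind-driven current fields; sites with near-zero arrivals from anywhere are the "isolated" populations the article warns about.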
Weather
2016
November 21, 2016
https://www.sciencedaily.com/releases/2016/11/161121094115.htm
El Niño conditions in Pacific precede dengue fever epidemics in South Asia
Researchers have found a strong association between El Niño-Southern Oscillation conditions in the Pacific and observed weather and dengue epidemics in Sri Lanka. According to a study published in the
"Dengue is the major public health burden in Sri Lanka and the Kalutara district is one of the most affected areas. So understanding how reoccurring weather patterns drive dengue is vital in controlling and preventing the disease spread," says Joacim Rocklöv, researcher at the Unit for Epidemiology and Global Health at Umeå University in Sweden and co-author of the article."These new findings allow disease early warning systems to provide warnings for upcoming epidemics with much longer lead time than before," says Prasad Liyanage, doctoral student at Umeå University and Medical Officer for dengue control in Kalutara district at the Sri Lankan Ministry of Health.In the study, researchers used the Oceanic Niño Index, which is a measure indicating el Niño activity by sea surface temperature in the Pacific Ocean, along with local weather and epidemiological data to quantify data associations in 10 healthcare divisions of Kalutara in southwestern Sri Lanka. Weekly weather variables and data on dengue notifications, gathered by Prasad Liyanage for the Ministry of Health between 2009 and 2013, were analysed to estimate locally specific and overall relationships between weather and dengue.The results showed an increasing relative risk of dengue with increasing rainfall starting at above 50 mm per week. The strongest association between rainfall and dengue was found around 6 to 10 weeks following rainfalls of more than 300 mm per week, which amounts to very wet conditions and floods. With increasing temperatures of 30 degrees Celsius or higher, the overall relative risk of dengue increased steadily starting from a lag of 4 weeks."Looking at weather and dengue incidents over longer periods, we found a similar strong link between how increased rainfall and warmer temperatures resulting from the reoccurring el Niño phenomenon are associated with elevated risks of dengue epidemics. 
In the longer perspective, our data further confirm this association and suggest that dengue fever thrives whenever El Niño visits our island," says Prasad Liyanage.

Improving epidemic warning lead times

According to the researchers, the findings can be used to improve predictive surveillance models with lead times of up to six months. This would give health officials more time to increase preparedness and mount control efforts prior to the epidemics. Today, such control efforts usually have limited effects as they start when signs of an epidemic can be seen within the hospital and primary care surveillance system.
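The lagged rainfall-dengue association can be illustrated with a toy lag-correlation scan. The weekly series below are synthetic, with cases built to respond to rainfall eight weeks earlier; the real study used more careful distributed-lag models on surveillance data, not this simple correlation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic weekly series: cases respond to rainfall eight weeks earlier
# (all numbers invented for illustration).
weeks = 260
rain = rng.gamma(2.0, 40.0, size=weeks)  # mm per week
true_lag = 8
cases = 20 + 0.3 * np.roll(rain, true_lag) + rng.normal(0, 5, weeks)
cases[:true_lag] = 20  # drop the weeks contaminated by the wrap-around

def lag_correlation(rain, cases, max_lag=12):
    """Correlation between rainfall and case counts `lag` weeks later."""
    n = len(rain)
    out = {}
    for lag in range(max_lag + 1):
        r = np.corrcoef(rain[:n - lag], cases[lag:])[0, 1]
        out[lag] = round(float(r), 3)
    return out

corrs = lag_correlation(rain, cases)
best = max(corrs, key=corrs.get)
print(best, corrs[best])  # the scan should peak near the built-in lag
```

A scan like this is the intuition behind the 6-to-10-week lag the study reports: the lag that maximizes the rainfall-case association sets how far ahead an early warning system can usefully look.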
Weather
2016
November 11, 2016
https://www.sciencedaily.com/releases/2016/11/161111094820.htm
How lightning strikes can improve storm forecasts
Humans have always been frightened and fascinated by lightning. This month, NASA is scheduled to launch a new satellite that will provide the first nonstop, high-tech eye on lightning over North America.
University of Washington researchers have been tracking global lightning from the ground for more than a decade. Lightning is not only about public safety -- lightning strike data have recently been introduced into weather prediction, and a new UW study shows ways to apply them in storm forecasts.

"When you see lots of lightning you know where the convection, or heat-driven upward motion, is the strongest, and that's where the storm is the most intense," said co-author Robert Holzworth, a UW professor of Earth and space sciences. "Almost all lightning occurs in clouds that have ice, and where there's a strong updraft."

The recent paper was published in an American Meteorological Society journal. The study used data from the UW-based WorldWide Lightning Location Network, which has a global record of lightning strikes going back to 2004. Director Holzworth is a plasma physicist who is interested in what happens in the outer edges of the atmosphere. But the network also sells its data to commercial and government agencies, and works with scientists at the UW and elsewhere.

A few years ago Holzworth joined forces with colleagues in the UW Department of Atmospheric Sciences to use lightning to improve forecasts for convective storms, the big storms that produce thunderstorms and tornadoes.

Apart from ground stations, weather forecasts are heavily dependent on weather satellites for information to start or "initialize" the numerical weather prediction models that are the foundation of modern forecasting. What's missing is accurate, real-time information about air moisture content, temperature and wind speed in places where there are no ground stations.

"We have less skill for thunderstorms than for almost any other meteorological phenomenon," said co-author Cliff Mass, a UW professor of atmospheric sciences. "This paper shows the promise of lightning information.
The results show that lightning data has potential to improve high-resolution forecasts of thunderstorms and convection." The new method could be helpful in forecasting storms over the ocean, where no ground instruments exist. Better knowledge of lightning-heavy tropical ocean storms could improve weather forecasts far from the equator, Mass said, since many global weather systems originate in the tropics. The study was funded by NASA and the National Oceanic and Atmospheric Administration. Greg Hakim, a UW professor of atmospheric sciences, is the other co-author. The Worldwide Lightning Location Network began in 2003 with 25 detection sites. It now includes some 80 host sites at universities or government institutions around the world, from Finland to Antarctica. The latest thinking on how lightning occurs is that ice particles within clouds separate into lighter and heavier pieces, and this creates charged regions within the cloud. If strong updrafts of wind make that altitude separation big enough, an electric current flows to cancel out the difference in charge. A bolt of lightning creates an electromagnetic pulse that can travel a quarter of the way around the planet in a fraction of a second. Each lightning network site hosts an 8- to 12-foot antenna that registers frequencies in the 10 kilohertz band, and sends that information to a sound card on an Internet-connected laptop. When at least five stations record a pulse, computers at the UW register a lightning strike, and then triangulate the arrival times at different stations to pinpoint the location. The network's online map shows lightning strikes for the most recent 30 minutes in Google Earth. An alternate display shows the last 40 minutes of lightning in different parts of the world on top of NASA cloud maps, which are updated from satellites every 30 minutes. 
The program is the longest-running real-time global lightning location network, and it is operated by the research community as a global collaboration. Lightning already kills hundreds of people every year. That threat may be growing -- a recent study projected that lightning will become more frequent with climate change. "The jury's still out on any long-term changes until we have more data," Holzworth said. "But there is anecdotal evidence that we're seeing lightning strikes in places where people are not expecting it, which makes it more deadly." On Nov. 19, NASA is scheduled to launch the new GOES-R satellite that will be the first geostationary satellite to include an instrument to continuously watch for lightning pulses. Holzworth will help calibrate the new instrument, which uses brightness to identify lightning, against network data. NASA also funded the recent research as one of the potential applications for lightning observations. "GOES-R will offer more precise, complete lightning observations over North and South America, which will supplement our global data," Holzworth said. "This launch has been long anticipated in the lightning research community. It has the potential to improve our understanding of lightning, both as a hazard and as a forecasting tool."
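The multi-station timing method described above -- at least five stations record the pulse, and the arrival times are compared to pinpoint the strike -- can be sketched numerically. In this illustrative sketch (not the network's actual algorithm), a grid search finds the candidate location at which every station's arrival time implies the same emission time; the station coordinates, flat-plane geometry and search parameters are all assumptions made up for the example.

```python
import math

C = 299_792.458  # assumed propagation speed of the VLF pulse, km/s (~speed of light)

# Hypothetical station coordinates (km on a local flat plane) -- illustration only.
STATIONS = [(0, 0), (400, 50), (100, 500), (-300, 250), (250, -350)]

def arrival_times(src, t0=0.0):
    """Time at which each station records the pulse emitted at `src` at time t0."""
    return [t0 + math.dist(src, s) / C for s in STATIONS]

def locate(times, span=600, step=5):
    """Grid-search the source: at the true location, the implied emission times
    (arrival minus travel time) agree across all five stations."""
    best, best_spread = None, float("inf")
    for ix in range(-span, span + 1, step):
        for iy in range(-span, span + 1, step):
            cand = (float(ix), float(iy))
            t0s = [t - math.dist(cand, s) / C for t, s in zip(times, STATIONS)]
            spread = max(t0s) - min(t0s)  # zero only at the emission point
            if spread < best_spread:
                best, best_spread = cand, spread
    return best

true_src = (120.0, -80.0)
est = locate(arrival_times(true_src))
print(est)
```

Real networks solve the same consistency problem with far better numerics (and on a spherical Earth), but the principle -- minimizing disagreement among implied emission times -- is the one the article describes.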
Weather
2016
November 9, 2016
https://www.sciencedaily.com/releases/2016/11/161109114055.htm
Ensemble forecast of a major flooding event in Beijing
An extreme rainfall event occurred in Beijing, China, on 21 July 2012. The average 24-hr accumulated rainfall across rain gauge stations in the city was 190 mm, which is the highest in Beijing's recorded history since 1951 (Chen et al., 2012). The event resulted in major urban flooding in Beijing, with 79 people losing their lives. However, most operational model forecasts failed to capture the precipitation associated with the convection in the warm sector ahead of the approaching cold front (Zhang et al., 2013), which was a key factor contributing to the total amount of extreme precipitation (Tao and Zheng, 2013). Therefore, the intensity of the rainfall was significantly underpredicted by models.
In a recently published paper, the authors describe an ensemble forecast of this event. Some members of the ensemble forecasting system captured the extreme event reasonably well (certainly better than the operational forecasts available at the time), including the timing and location of extreme precipitation. The authors then used those data to analyze why some members were able to predict the convection in the warm sector while others could not. Comparison between good and bad members showed that orographic lift of very moist low-level flows with a significant southeasterly component played an important role in producing the initial convection. They also found that the extreme rainfall was more sensitive to the mesoscale environmental conditions than to the model physics. Compared to single deterministic forecasts, ensemble forecasts can better capture extreme events and, at the same time, provide information on the reliability or uncertainty of the forecasts. "The ensemble mean forecast is often more accurate than individual forecasts," says Ming Xue, a professor of meteorology and the director of the Center for Analysis and Prediction of Storms at the University of Oklahoma. "Rapidly updated ensemble probabilistic prediction at convection-resolving resolutions is the future for the forecasting and warning of severe convective weather, and the effective assimilation of very high-resolution weather observations into the forecasting models is also essential."
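Why the ensemble mean is "often more accurate than individual forecasts" can be seen in a toy experiment: if member errors are roughly independent, averaging many members cancels much of the error. The 190 mm figure is the observed rainfall from the article; the error magnitude (40 mm) and member count are arbitrary illustrative assumptions, not values from the study.

```python
import random

random.seed(1)

truth = 190.0  # mm, the observed 24-hr rainfall from the article
n_members, n_trials = 20, 2000

member_err_total, mean_err_total = 0.0, 0.0
for _ in range(n_trials):
    # Each member forecast = truth + an independent model/initial-condition error.
    members = [truth + random.gauss(0, 40) for _ in range(n_members)]
    ens_mean = sum(members) / n_members
    member_err_total += abs(members[0] - truth)  # error of one deterministic run
    mean_err_total += abs(ens_mean - truth)      # error of the ensemble mean

member_mae = member_err_total / n_trials
mean_mae = mean_err_total / n_trials
print(member_mae, mean_mae)
```

With independent errors the mean's error shrinks roughly as 1/sqrt(n_members); real forecast errors are correlated, so the gain in practice is smaller but still substantial.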
Weather
2016
November 8, 2016
https://www.sciencedaily.com/releases/2016/11/161108122748.htm
The global climate 2011-2015: Hottest five-year period on record
The World Meteorological Organization has published a detailed analysis of the global climate 2011-2015 -- the hottest five-year period on record -- and the increasingly visible human footprint on extreme weather and climate events with dangerous and costly impacts.
The record temperatures were accompanied by rising sea levels and declines in Arctic sea-ice extent, continental glaciers and northern hemisphere snow cover. All these climate change indicators confirmed the long-term warming trend caused by greenhouse gases. Carbon dioxide reached the significant milestone of 400 parts per million in the atmosphere for the first time in 2015, according to the WMO report, which was submitted to the U.N. climate change conference. The Global Climate 2011-2015 report also examines whether human-induced climate change was directly linked to individual extreme events. Of 79 studies published by the Bulletin of the American Meteorological Society between 2011 and 2014, more than half found that human-induced climate change contributed to the extreme event in question. Some studies found that the probability of extreme heat increased by 10 times or more. "The Paris Agreement aims at limiting the global temperature increase to well below 2 °C and pursuing efforts towards 1.5 °C above pre-industrial levels. This report confirms that the average temperature in 2015 had already reached the 1 °C mark. We just had the hottest five-year period on record, with 2015 claiming the title of hottest individual year. Even that record is likely to be beaten in 2016," said WMO Secretary-General Petteri Taalas. "The effects of climate change have been consistently visible on the global scale since the 1980s: rising global temperature, both over land and in the ocean; sea-level rise; and the widespread melting of ice. It has increased the risks of extreme events such as heatwaves, drought, record rainfall and damaging floods," said Mr Taalas. The report highlighted some of the high-impact events. 
These included the East African drought in 2010-2012, which caused an estimated 258,000 excess deaths, and the 2013-2015 southern African drought; flooding in South-East Asia in 2011, which killed 800 people and caused more than US$40 billion in economic losses; heatwaves in India and Pakistan in 2015, which claimed more than 4,100 lives; Hurricane Sandy in 2012, which caused US$67 billion in economic losses in the United States of America; and Typhoon Haiyan, which killed 7,800 people in the Philippines in 2013. The report was submitted to the Conference of the Parties of the United Nations Framework Convention on Climate Change. The five-year timescale allows a better understanding of multi-year warming trends and extreme events such as prolonged droughts and recurrent heatwaves than an annual report. WMO will release its provisional assessment of the state of the climate in 2016 on 14 November to inform the climate change negotiations in Marrakech, Morocco. 2011-2015 was the warmest five-year period on record globally and for all continents apart from Africa (second warmest). Temperatures for the period were 0.57 °C (1.03 °F) above the average for the standard 1961-1990 reference period. The warmest year on record to date was 2015, during which temperatures were 0.76 °C (1.37 °F) above the 1961-1990 average, followed by 2014. The year 2015 was also the first year in which global temperatures were more than 1 °C above the pre-industrial era. Global ocean temperatures were also at unprecedented levels. Globally averaged sea-surface temperatures for 2015 were the highest on record, with 2014 in second place. 
Sea-surface temperatures for the period were above average in most of the world, although they were below average in parts of the Southern Ocean and the eastern South Pacific. A strong La Niña event (2011) and a powerful El Niño (2015/2016) influenced the temperatures of individual years without changing the underlying warming trend. Arctic sea ice continued its decline. Averaged over 2011-2015, the mean Arctic sea-ice extent in September was 4.70 million km2, 28% below the 1981-2010 average. The minimum summer sea-ice extent of 3.39 million km2 in 2012 was the lowest on record. By contrast, for much of the period 2011-2015, the Antarctic sea-ice extent was above the 1981-2010 mean value, particularly for the winter maximum. Summer surface melting of the Greenland ice sheet continued at above-average levels, with the summer melt extent exceeding the 1981-2010 average in all five years from 2011 to 2015. Mountain glaciers also continued their decline. Northern hemisphere snow cover extent was well below average in all five years and in all months from May to August, continuing a strong downward trend. As the oceans warm, they expand, resulting in both global and regional sea-level rise. Increased ocean heat content accounts for about 40% of the observed global sea-level increase over the past 60 years. A number of studies have concluded that the contribution of continental ice sheets, particularly Greenland and west Antarctica, to sea-level rise is accelerating. During the satellite record from 1993 to the present, sea levels have risen approximately 3 mm per year, compared with the average 1900-2010 trend (based on tide gauges) of 1.7 mm per year. Many individual extreme weather and climate events recorded during 2011-2015 were made more likely as a result of human-induced (anthropogenic) climate change. 
In the case of some extreme high temperatures, the probability increased by a factor of ten or more. Examples include the record high seasonal and annual temperatures in the United States in 2012 and in Australia in 2013, hot summers in eastern Asia and western Europe in 2013, heatwaves in spring and autumn 2014 in Australia, record annual warmth in Europe in 2014, and a heatwave in Argentina in December 2013. The direct signals were not as strong for precipitation extremes (both high and low). In numerous cases, including the 2011 flooding in South-East Asia, the 2013-2015 drought in southern Brazil, and the very wet winter of 2013-2014 in the United Kingdom, no clear evidence was found of an influence from anthropogenic climate change. However, in the case of the extreme rainfall in the United Kingdom in December 2015, it was found that climate change had made such an event about 40% more likely. Some impacts were linked to increased vulnerability. A study of the 2014 drought in south-east Brazil found that similar rainfall deficits had occurred on three other occasions since 1940, but that the impacts were exacerbated by a substantial increase in the demand for water, due to population growth. Some longer-term events, which have not yet been the subject of formal attribution studies, are consistent with projections of near- and long-term climate change. These include an increased incidence of multi-year drought in the subtropics, as manifested in the 2011-2015 period in the southern United States, parts of southern Australia and, towards the end of the period, southern Africa. There have also been events, such as the unusually prolonged, intense and hot dry seasons in the Amazon basin of Brazil in both 2014 and 2015, which are of concern as potential "tipping points" in the climate system.
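As a small consistency check on the report's paired figures, a temperature *anomaly* converts between °C and °F by the factor 9/5 alone; the +32 offset applies only to absolute temperatures, and cancels when taking a difference. The sketch below reproduces the report's 1.03 °F and 1.37 °F from its 0.57 °C and 0.76 °C anomalies.

```python
def c_to_f_anomaly(dc):
    """Convert a temperature anomaly (a difference, not an absolute
    temperature) from degrees C to degrees F: scale by 9/5, no +32 offset."""
    return dc * 9 / 5

# Anomalies vs the 1961-1990 reference period, from the WMO report.
f_period = round(c_to_f_anomaly(0.57), 2)  # 2011-2015 five-year anomaly
f_2015 = round(c_to_f_anomaly(0.76), 2)    # 2015 annual anomaly
print(f_period, f_2015)
```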
Weather
2016
November 4, 2016
https://www.sciencedaily.com/releases/2016/11/161104102535.htm
The destructive effects of supercooled liquid water on airplane safety and climate models
Supercooled water sounds smooth enough to be served at espresso bars, but instead it hangs out in Earth's atmosphere, unpredictably freezing on airplane wings and hampering the simulations of climate theorists.
To learn more about this unusual state of matter, Sandia National Laboratories atmospheric scientist Darielle Dexheimer and colleagues have organized an expedition to fly huge tethered balloons in Alaska this coming winter, where temperatures descend to 40 degrees below zero and it's dark as a dungeon for all but a few hours of the day. "We'll start in November and see how it goes," she said. Supercooled liquid water is pure water that remains a liquid below its normal freezing point because it has nothing to nucleate around. The idea is to gain a large dataset about it, uncollected elsewhere, to fine-tune the accuracy of climate models and reduce the number of ice-delayed flights and crashes. The team collected data from tethered balloons in Alaska last year, but didn't operate later than October. [Photo caption: Ice pops from a balloon's tether line as Sandia National Laboratories researcher Darielle Dexheimer gathers in an instrumented balloon at the Atmospheric Radiation Measurement research station at Oliktok Point, Alaska. The balloon is about 25 feet above Dexheimer's head and the lines are completely iced over.] The team will wrest more data about the presence and behavior of supercooled liquid water where it is most plentiful and at a location most crucial to climate modelers: Oliktok Point at the tip of the oilfields of Prudhoe Bay, one of the northernmost points of the United States. "Supercooled liquid water freezes on impact with aircraft and presents a hazard to aviation," said Dexheimer. "The potential for aircraft icing is difficult to model because its extent is typically reported in PIREPs [pilot reports] that are subjective and vary in space and time, and from aircraft that are typically attempting to avoid icing. Acquiring in situ icing data in clouds, through use of tethered balloons, eliminates many risks associated with manned research aircraft acquiring that data. 
These datasets will allow us to better characterize supercooled liquid water in the Arctic for climate modeling and icing research." For climate models, there's not much data about how much liquid and ice are in the Arctic clouds, when and where clouds form, the altitudes at which they hover or how long they last. Clouds containing a lot of liquid in the Arctic are important, as they can act as blankets to warm the surface. Data about the vertical location and concentration of supercooled liquid water in clouds will help atmospheric scientists better understand how these clouds persist for days even though they contain liquid, ice and supercooled liquid together. The balloon itself presents another challenge. "These 13-foot-tall balloons can easily lift a person, so we need to stay focused," said Dexheimer. Ice is expected to form on the balloon and its tether; up to 50 pounds of ice has accumulated on a balloon after a flight through the clouds. The balloon can also be difficult to control in high winds, and the team won't launch in wind speeds above 25 mph (approximately 22 knots). The problem isn't theoretical. In the relatively balmy weather in late July, an unexpected gust of wind broke a helium-filled balloon free of its tether. It drifted north across the Beaufort Sea, dropping into the water roughly 60 miles north of the research base at Oliktok Point. This didn't dampen Dexheimer's ambition to find out more about climate. "With data from our instrument-laden balloons and their tethers, we can sample in situ for long periods of time -- maybe the entire life cycle of a cloud layer, and hopefully use that data to parameterize climate models," Dexheimer said. 
"The models currently represent the arctic surface as colder and more stable than it should be in the winter, which results in fewer and shorter duration low clouds than are really happening."A large winch spools out the tether at a steady pace, letting the helium-filled balloon rise slowly through the bottom, middle and top of the clouds, before the team slowly takes it back down to Earth.Sandia National Laboratories researcher Darielle Dexheimer, right, and Sandia colleagues Erika Roesler and Joe Hardesty are using balloon-borne instruments to learn more about supercooled liquid water in the Arctic atmosphere. The data has implications for climate models, aviation and planetary science. (Photo by Sandia National Laboratories) Click on the photo for a high-resolution image."Because our sensor data resolution takes place every meter along the tether and we take temperature readings every 30 seconds, we can measure an almost continuous temperature profile along the entire length of the tether during flight," she said. "We'll have a large dataset that isn't being collected elsewhere that will improve our understanding of cloud processes and hopefully improve the accuracy of climate model output."Said Sandia climate modeler Erika Roesler, "High enough concentrations of liquid water in clouds can warm the surface of the Arctic through emission of thermal radiation. As the Arctic continues to warm, models will most likely -- but not necessarily -- predict that 30 years from now there will be less ice and more liquid in clouds when compared to today's Arctic atmosphere. Data that Dari is taking will give us a reference point of today for the future and help inform regional climate model predictions."Sandia science liaison Joe Hardesty added, "One of the biggest gaps in understanding cloud formation and stability is how the various processes result in persistent seasonal clouds in the Arctic. 
Supercooled liquid water that switches state to ice, ice nucleation, temperature inversions where clouds become warmer at the top and colder at bottom, and turbulent air flow are all important drivers of cloud formation and behavior."
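One thing a meter-resolution tether profile makes easy to spot is the temperature inversion Hardesty mentions, where the air gets *warmer* with altitude. A minimal sketch, using entirely hypothetical heights and temperatures (coarsened to 100 m spacing for brevity), flags the height ranges where temperature rises with altitude:

```python
# Hypothetical tethersonde profile: (height_m, temp_C) pairs -- illustration only.
profile = [(0, -18.0), (100, -19.5), (200, -21.0), (300, -20.2),
           (400, -18.9), (500, -19.8), (600, -21.5)]

def inversion_layers(profile):
    """Return (bottom, top) height ranges where temperature increases with
    altitude -- the 'warmer at the top' structure of an inversion layer."""
    layers, start = [], None
    for (h0, t0), (h1, t1) in zip(profile, profile[1:]):
        if t1 > t0:                       # warming with height: inside an inversion
            start = h0 if start is None else start
        elif start is not None:           # inversion just ended
            layers.append((start, h0))
            start = None
    if start is not None:                 # inversion reaches the top of the profile
        layers.append((start, profile[-1][0]))
    return layers

layers = inversion_layers(profile)
print(layers)
```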
Weather
2016
November 3, 2016
https://www.sciencedaily.com/releases/2016/11/161103115201.htm
Frog, toad larvae become vegetarian when it is hot
Climate change is currently one of the greatest threats to biodiversity, and one of the groups of animals most affected by the increase in temperature is amphibians. A team of scientists with Spanish participants studied how heat waves affect the dietary choices of three species of amphibian found on the Iberian Peninsula: the European tree frog, the Mediterranean tree frog and the Iberian painted frog.
Global warming is causing not only a general increase in temperatures, but also an increase in the frequency and intensity of extreme weather events, such as flooding, heat waves and droughts. These environmental changes pose a challenge for many organisms, among them amphibians, who have to change their behaviour, physiology and life strategies in order to survive. Researchers at the Universities of Lisbon (Portugal) and Uppsala (Sweden) studied the behaviour of three kinds of amphibians that inhabit the Iberian Peninsula: the European tree frog, the Mediterranean tree frog and the Iberian painted frog. Amphibians are a group that is highly sensitive to global warming due to the permeability of their skin and their complex lifecycle, which combines an aquatic stage as larvae and a terrestrial stage when young and as adults. "In fact, they are already experiencing sharp declines in population and extinction on a global scale, and they have become the focus of several research and conservation programmes in recent decades," explains Germán Orizaola, co-author of the study. The researchers conducted a laboratory experiment in which they exposed the larvae of these three species to various kinds of heat waves, which varied in duration and intensity, by increasing the temperature of the water where they were growing. "The larvae were kept in three different sets of conditions: with a solely vegetable-based diet, solely animal-based or a mixed diet. This third situation allowed us to assess whether they modified their diets towards a greater or lower percentage of vegetable matter," Orizaola adds. They also examined the relationship between various carbon and nitrogen isotopes in the tissue of larvae with a mixed diet and compared them with those of exclusively vegetable-based or animal-based 'menus'. 
This enabled them to reconstruct the type of diet that larvae exposed to a combined diet selected. "Our results indicated first that larvae of various species have a diet adapted to the conditions under which they reproduce. The painted frog, which reproduces when it is cold, has a carnivorous diet, while the Mediterranean tree frog, which reproduces during the hottest season of the year, maintains a vegetarian diet," the investigator notes. The most important result is that these larvae have very flexible dietary habits. All three species increased the percentage of vegetables consumed during heat waves. By analysing the larvae's rates of survival, growth and development, the researchers found that in hot conditions the carnivorous diet became less effective, favouring the vegetarian diet. "This phenomenon could be common to many species living in continental, aquatic environments. If so, the increased frequency and intensity of heat waves forecast by climate change models could bring about considerable changes to these environments," Orizaola concludes.
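The isotope step above rests on a standard idea: tissue from a mixed diet carries an isotopic signature between the plant and animal "endmember" signatures, so the plant fraction can be back-calculated. The sketch below shows the simplest two-endmember linear mixing calculation; the delta-13C values are hypothetical, and real analyses (unlike this sketch) correct for trophic fractionation and use both carbon and nitrogen isotopes.

```python
def plant_fraction(d_mix, d_plant, d_animal):
    """Two-endmember linear mixing: fraction of the mixed-diet tissue signal
    attributable to the plant endmember. Ignores trophic fractionation,
    a deliberate simplification of real mixing models."""
    return (d_mix - d_animal) / (d_plant - d_animal)

# Hypothetical delta-13C endmembers (per mil), for illustration only.
d_plant, d_animal = -28.0, -20.0
f_plant = plant_fraction(-26.0, d_plant, d_animal)  # tissue nearer the plant signal
print(f_plant)
```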
Weather
2016
November 1, 2016
https://www.sciencedaily.com/releases/2016/11/161101111659.htm
Losing its cool: Will ice melt heat up naval operations in Arctic Ocean?
As diminishing sea ice in the Arctic Ocean expands navigable waters, scientists sponsored by the Office of Naval Research (ONR) have traveled to the region to study the changing environment -- and provide new tools to help the U.S. Navy operate in a once-inaccessible area.
"This changing environment is opening the Arctic for expanded maritime and naval activity," said Rear Adm. Mat Winter, chief of naval research. "Developing a deeper understanding and knowledge of this environment is essential for reliable weather and ice predictions to ensure the safety of future scientific and operational activities in the region."A recent announcement from the National Snow and Ice Data Center revealed that 2016's sea ice minimum -- the annual measurement of when sea ice hits its lowest point -- tied with 2007 for the second-lowest ice minimum since satellite monitoring began in the 1970s. The lowest minimum ever occurred in 2012.ONR sponsored its scientific research through two initiatives within its Arctic and Global Prediction Program -- Marginal Ice Zone, and Waves and Sea State. Additional research involved the program's CANada Basin Acoustic Propagation Experiment (CANAPE) initiative.Scientists measured the strength and intensity of waves and swells moving through the weakened Arctic sea ice. The accumulated data will be used to develop more accurate computer models and prediction methods to forecast ice, ocean and weather conditions.CANAPE researchers used sophisticated oceanographic and acoustic sensors to gauge temperature, salinity, ice and ambient noise conditions under the surface of the ice and water -- factors that can dramatically impact the effectiveness of sonar operations and antisubmarine warfare."Abundant sea ice reduces waves and swells and keeps the Arctic Ocean very quiet," said Dr. Robert Headrick, an ONR program officer overseeing the CANAPE research. "With increased sea ice melt, however, comes more waves and wind, which create more noise and makes it harder to track undersea vessels. 
The goal of CANAPE is to gain a better and more comprehensive understanding of these changing oceanographic conditions." Because of its thick shield of sea ice, the Arctic historically has had limited naval strategic relevance beyond submarine operations. But as this frozen cover changes, it is opening new commercial shipping lanes; increasing oil and natural gas exploration, fishing and tourism; and raising potential new security concerns. It also may create new requirements for the Navy's surface fleet. "Having accurate forecasting models will help the Navy determine what types of surface vessels it will need to build in the near future and 30 years from now, to withstand the climate conditions," said Dr. Scott Harper, an ONR program officer overseeing the Marginal Ice Zone and Waves and Sea State research. "That way, the Navy can operate as safely and effectively in the Arctic as it does throughout the rest of the world."
Weather
2016
November 1, 2016
https://www.sciencedaily.com/releases/2016/11/161101103624.htm
West Coast record low snowpack in 2015 influenced by high temperatures
The westernmost region of the continental United States set records for low snowpack levels in 2015, and scientists, through a new study, point the finger at high temperatures, not the low precipitation characteristic of past "snow drought" years.
The study suggests greenhouse gases were a major contributor to the high temperatures, which doesn't bode well for the future, according to the authors. In 2015, more than 80 percent of the snow measurement sites in the region -- comprised of California, Oregon, Washington, western Nevada and western Idaho -- experienced record low snowpack levels that were a result of much warmer-than-average temperatures. Most of the previous records were set in 1977, when there just wasn't enough moisture to generate snow, according to Philip Mote, director of the Oregon Climate Change Research Institute at Oregon State University and lead author on the study. "The 2015 snowpack season was an extreme year," Mote said. "But because of the increasing influence of greenhouse gases, years like this may become commonplace over the next few decades." Impacts of the snow drought in California, Oregon and Washington led the governors of those states to order reductions in water use and saw many ski areas, particularly those in lower elevations, struggle. California has been in a drought since 2011, and this multi-year period of low precipitation, by some measures, is the state's most severe in 500 years. In 2015, higher temperatures combined with low precipitation, leading to one of its lowest snowpack levels on record. Oregon and Washington experienced much higher-than-average temperatures during the 2014-15 winter but were not as dry overall as California. Oregon, in fact, was 6.5 degrees (Fahrenheit) warmer than average during that period. "The story of 2015 was really the exceptional warmth," said Dennis Lettenmaier, distinguished professor of geography at the University of California, Los Angeles and co-author of the study. "Historically, droughts in the West have mostly been associated with dry winters, and only secondarily with warmth. But 2015 was different. 
The primary driver of the record low snowpacks was the warm winter, especially in California, but in Oregon and Washington as well." The 2015 year was an eye-opener for the scope of the snow drought. To determine the impact of greenhouse gases, the researchers used tens of thousands of citizen computers, each running a regional climate simulation in a sort of crowd-sourced supercomputer. The researchers ran one set of simulations using actual sea surface temperatures and greenhouse gas emissions from December 2014 to September 2015. Then they ran a series of simulations with lower greenhouse gas levels corresponding to the pre-industrial era, and teased out the impacts. A third set of simulations used modern greenhouse gases but removed the unusual pattern of sea surface temperatures in 2014-15. "The data showed that both greenhouse gases and sea surface temperature anomalies contributed strongly to the risk of snow drought in Oregon and Washington," said Mote, a professor in OSU's College of Earth, Ocean, and Atmospheric Sciences. "The contribution of sea surface temperatures was about twice that of human influence for Oregon and Washington." Higher sea surface temperatures led to a huge patch of warm water, dubbed "The Blob," that appeared in the northern Pacific Ocean more than two years ago. Scientists aren't sure why the blob formed, though many blame a ridge of high pressure that brought sunnier weather and less mixing of surface water with colder, deeper water. "Some recent studies suggest that a high pressure ridge that caused warmer temperatures over land also created the blob, but our results suggest that the blob itself may also have contributed to the warm winter here," Mote said.
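The comparison of factual and counterfactual simulation ensembles described above is the standard way attribution studies quantify "risk of snow drought": count how often the event occurs in each ensemble, then form a risk ratio. The counts below are made up for illustration and are not the study's numbers.

```python
# Hypothetical exceedance counts -- NOT the study's numbers: how many runs in
# each large crowd-sourced ensemble produced a record-low snowpack year.
n_runs = 10_000
hits_actual = 400         # ensemble with observed GHG levels and SSTs
hits_preindustrial = 100  # counterfactual ensemble with pre-industrial GHG levels

p1 = hits_actual / n_runs
p0 = hits_preindustrial / n_runs
risk_ratio = p1 / p0   # how much more likely the event became
far = 1 - p0 / p1      # fraction of attributable risk

print(risk_ratio, far)
```

A risk ratio of 4 would mean the event became four times as likely; FAR = 0.75 would mean three quarters of the event's probability is attributable to the factor being tested.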
Weather
2016
October 31, 2016
https://www.sciencedaily.com/releases/2016/10/161031090032.htm
Research into extreme weather effects may explain recent butterfly decline
Increasingly frequent extreme weather events could threaten butterfly populations in the UK and could be the cause of recently reported butterfly population crashes, according to research from the University of East Anglia (UEA).
Researchers investigated the impact of Extreme Climatic Events (ECEs) on butterfly populations. The study shows that the impact can be significantly positive or negative, but questions remain as to whether the benefits outweigh the negative effects. While it is well known that changes to the mean climate can affect ecosystems, little is known about the impact of short-term extreme climatic events (ECEs) such as heatwaves, heavy rainfall or droughts. Osgur McDermott-Long, PhD student and lead author from the School of Environmental Sciences at UEA, said: "This is the first study to examine the effects of extreme climate events across all life stages of the UK butterflies from egg to adult butterfly. We wanted to identify sensitive life stages and unravel the role that life history traits play in species sensitivity to ECEs." The researchers used data from the UK Butterfly Monitoring Scheme (UKBMS), a high-quality long-term dataset of UK butterfly abundances collected from over 1,800 sites across the UK, spanning 37 years, to examine the effects of weather data and extreme events (drought, extremes of rain, heat and cold) on population change. The team looked at resident species of butterflies: those which only breed once in a year, and those having more than one brood annually. Multi-brood species were found to be more vulnerable than single-brood species, and in general extremes of temperature rather than precipitation were found to influence changes in butterfly populations. Dr Aldina Franco, a co-author, said: "A novel finding of this study was that precipitation during the pupal (cocoon) life-stage was detrimental to over one quarter of the species. This study also found that extreme heat during the 'overwintering' life stage was the most detrimental extreme weather event, affecting over half of UK species. 
This may be due to increased incidence of disease, or potentially to extreme hot temperatures acting as a cue for butterflies or their larvae to come out from overwintering too early, only to be killed off by temperatures returning to colder conditions." In addition to the negative impacts, the authors found that some life stages may benefit from climatic extreme weather, with extreme heat in the adult stage causing a positive population change in over one third of the UK species. Dr Franco added: "This is not an unexpected finding given that butterflies are warm loving creatures. Years with extreme warm summers and winters may have mixed effects. For example, this year was terrible for butterflies: although the summer was warm, the number of butterflies counted during the Big Butterfly Count was particularly low. Our study indicates that this could have resulted from the detrimental effects of the warm winter. For example, the recent low counts of Gatekeeper, Common Blue, Comma, Peacock and Small Tortoiseshell butterflies could be explained by our results, due to their negative response to the warm winter just experienced." Mr McDermott Long said: "The study has demonstrated previously unknown sensitivities of our UK butterflies to extreme climatic events, which are becoming more frequent with climate change. 
Some of these effects are undoubtedly putting future populations at risk, such as extremely warm winters, however we've seen that warm and even climatically extreme hot summers may actually benefit butterflies.Further research is needed regarding the balance of the importance that these variables could have, to see if the benefits of warmer summers will be outweighed by the detrimental winter effects."Dr Tom Brereton from Butterfly Conservation and a co-author of the study, said: "If we are to mitigate against extreme events as part of conservation efforts, in particular, we need a better understanding of the habitat conditions which can lead to successful survival of adult, pupal and overwintering life stages of UK butterflies in these situations."This work is part of Osgur McDermott-Long's PhD project funded by the University of East Anglia and with Butterfly Conservation and the Biological Records Centre as project partners. Prof Rachel Warren, Dr Aldina Franco and Dr Jeff Price and are the PhD supervisors. External co-authors include Dr Tom Brereton from Butterfly Conservation and Dr Marc Botham from Centre for Ecology and Hydrology.'Sensitivity of UK butterflies to local climatic extremes: which life stages are most at risk?' is published in
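The kind of analysis described above, relating year-on-year abundance changes to years with extreme climatic events, can be illustrated with a minimal sketch. The abundance index, event flags and function names below are invented for illustration and are not the UKBMS methodology:

```python
import statistics

def growth_rates(counts):
    """Year-over-year proportional population change."""
    return [counts[i + 1] / counts[i] - 1 for i in range(len(counts) - 1)]

def extreme_event_effect(counts, event_flags):
    """Mean growth in transitions following an extreme-event year,
    minus mean growth in the remaining transitions."""
    rates = growth_rates(counts)
    hit = [r for r, e in zip(rates, event_flags) if e]
    miss = [r for r, e in zip(rates, event_flags) if not e]
    return statistics.mean(hit) - statistics.mean(miss)

# Invented abundance index for one species over seven years, with an
# extreme event (e.g. a warm winter) flagged in alternating years.
counts = [100, 50, 100, 50, 100, 50, 100]
events = [True, False, True, False, True, False]
print(extreme_event_effect(counts, events))  # -> -1.5
```

A negative value means populations tend to fall after flagged years; the real study fits such effects separately per life stage and per weather variable.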
Weather
2016
October 28, 2016
https://www.sciencedaily.com/releases/2016/10/161028161902.htm
See how Arctic sea ice is losing its bulwark against warming summers
Arctic sea ice, the vast sheath of frozen seawater floating on the Arctic Ocean and its neighboring seas, has been hit with a double whammy over the past decades: as its extent shrunk, the oldest and thickest ice has either thinned or melted away, leaving the sea ice cap more vulnerable to the warming ocean and atmosphere.
"What we've seen over the years is that the older ice is disappearing," said Walt Meier, a sea ice researcher at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "This older, thicker ice is like the bulwark of sea ice: a warm summer will melt all the young, thin ice away but it can't completely get rid of the older ice. But this older ice is becoming weaker because there's less of it and the remaining old ice is more broken up and thinner, so that bulwark is not as good as it used to be."Direct measurements of sea ice thickness are sporadic and incomplete across the Arctic, so scientists have developed estimates of sea ice age and tracked their evolution from 1984 to the present. Now, a new NASA visualization of the age of Arctic sea ice shows how sea ice has been growing and shrinking, spinning, melting in place and drifting out of the Arctic for the past three decades."Ice age is a good analog for ice thickness because basically, as ice gets older it gets thicker," Meier said. "This is due to the ice generally growing more in the winter than it melts in the summer."In the early 2000s, scientists at the University of Colorado developed a way to monitor Arctic sea ice movement and the evolution of its age by using data from a variety of sources, but primarily satellite passive microwave instruments. These instruments gauge brightness temperature: a measure of the microwave energy emitted by sea ice that is influenced by the ice's temperature, salinity, surface texture and the layer of snow on top of the sea ice. Each floe of sea ice has a characteristic brightness temperature, so the researchers developed an approach that would identify and track ice floes in successive passive microwave images as they moved across the Arctic. 
The system also uses information from drifting buoys as well as weather data."It's like bookkeeping; we're keeping track of sea ice as it moves around, up until it melts in place or leaves the Arctic," said Meier, who is a collaborator of the group at the University of Colorado and the National Snow and Ice Data Center in Boulder, Colorado, the center that currently maintains the Arctic sea ice age data.Every year, sea ice forms in the winter and melts in the summer. The sea ice that survives the melt season thickens with each passing year: newly formed ice grows to about 3 to 7 feet of thickness during its first year, while multi-year ice (sea ice that has survived several melt seasons) is about 10 to 13 feet thick. The older and thicker ice is more resistant to melt and less likely to get pushed around by winds or broken up by waves or storms.The motion of sea ice is not limited to its seasonal expansion and shrinkage: Except for coastal regions where sea ice is attached to the shore, the sea ice cap is in almost constant movement. The primary driver of sea ice movement in the Arctic is wind and there are two major features in the Arctic circulation: the Beaufort Gyre, a clockwise ice circulation that makes ice spin like a wheel in the Beaufort Sea, north of Alaska, and the Transpolar Drift Stream, which transports ice from Siberia's coast toward the Fram Strait east of Greenland, where the ice exits the Arctic basin and melts in the warmer waters of the Atlantic Ocean."On a week-to-week basis, there are weather systems that come through, so the ice isn't moving at a constant rate: sometimes the Beaufort Gyre reverses or breaks down for a couple weeks or so, the Transpolar Drift Stream shifts in its direction ... but the overall pattern is this one," Meier said. 
"Then the spring melt starts and the ice shrinks back, disappearing from the peripheral seas."The new animation shows two main bursts of thick ice loss: the first one, starting in 1989 and lasting a few years, was due to a switch in the Arctic Oscillation, an atmospheric circulation pattern, which shrunk the Beaufort Gyre and enhanced the Transpolar Drift Stream, flushing more sea ice than usual out of the Arctic. The second peak in ice loss started in the mid-2000s."Unlike in the 1980s, it's not so much as ice being flushed out -though that's still going on too," Meier said. "What's happening now more is that the old ice is melting within the Arctic Ocean during the summertime. One of the reasons is that the multiyear ice used to be a pretty consolidated ice pack and now we're seeing relatively smaller chunks of old ice interspersed with younger ice. These isolated floes of thicker ice are much easier to melt.""We've lost most of the older ice: In the 1980s, multiyear ice made up 20 percent of the sea ice cover. Now it's only about 3 percent," Meier said. "The older ice was like the insurance policy of the Arctic sea ice pack: as we lose it, the likelihood for a largely ice-free summer in the Arctic increases."
Weather
2016
October 27, 2016
https://www.sciencedaily.com/releases/2016/10/161027094137.htm
Discovery of 'lost' early satellite data to aid understanding of the Earth's climate
Serendipity, expertise, foresight and the equivalent of an Earth observation data archaeological dig have led to recovery of almost-40-year-old satellite imagery -- thought lost forever -- which will significantly add to understanding of our planet's climate.
The data, from the European Space Agency's prototype Meteosat-1 geostationary meteorological satellite, was found at the University of Wisconsin-Madison's Space Science and Engineering Center (SSEC) in the United States. It has now been provided to EUMETSAT, which operates and disseminates data from Meteosat-1's "descendants" and, crucially, has an uninterrupted record of climate data from these satellites stretching back more than 30 years. That record, albeit with a small gap, now extends even further back in time. To say that the discovery of this lost data was greeted with enthusiasm would be an understatement, with climate scientists describing it as "like finding a lost child" -- "the first born"! Meteosat-1 was launched on 23 November 1977, and was positioned in a geostationary orbit at 0° longitude, with a constant view of most of Europe, all of Africa, the Middle East and part of South America. From that position, this view of the "full-disk" was scanned every 30 minutes, with the data being provided in near-real time to users. The satellite's mission lasted until 25 November 1979. Meteosat-1 represented cutting-edge technology for its time, introducing the concept of a global system of geostationary platforms capable of observing the atmospheric circulation and weather around the equator in near-real time. It was also the first geostationary meteorological satellite to have a water vapour channel, tracking the motion of moisture in the air. The data found in America comprises 20,790 images, from 1 December 1978 to 24 November 1979. On 27 June 2016, EUMETSAT held an event to celebrate its 30th anniversary, in Darmstadt, Germany. 
Among the guests was Dr Paul Menzel, Senior Scientist with the Cooperative Institute for Meteorological Satellite Studies, part of the University of Wisconsin-Madison's Space Science and Engineering Center.A memento guests at the event received was a memory stick with links to EUMETSAT's climate data record, from 1 January 1984 up until the anniversary in 2016 -- more than 32 years."It was pointed out that the data was all there, except for two days, which were missing," Dr Menzel said."That prompted me to have a look whether we had the data for those two days. When I went back, we started looking for the data but I was told we didn't have any Meteosat data from before 1992. I knew that couldn't be right."The SSEC Data Center didn't have the data for the missing two days but did find something even more valuable.In 1978-79, the First GARP (Global Atmospheric Research Programme) Global Experiment (FGGE) was undertaken -- a project reported by New Scientist at the time as the biggest cooperative international venture ever undertaken. Its aim was to find out which gaps in global weather monitoring could be filled to improve weather forecasting seven to 10 days in advance.Meteosat-1 data was provided to the SSEC for this project. The centre's founder, Verner Suomi, often referred to as the "Father of Satellite Meteorology," had the foresight to recognise the importance of preserving Earth observation data."I thought we must have the FGGE data," Dr Menzel said. "Vern's mentality was, I don't want to lose any of the data."Dr Menzel's colleague, CIMSS Programme Manager Dr David Santek said teams of experts had worked in three shifts around the clock tracking cloud features in the images from the 1,200 nine-track tapes of Meteosat-1 data that was shipped to them for the FGGE project in 1978-79."Then those tapes sat around for 20 years," Dr Santek said."In 1997, we started converting data from old tape media on to more modern media. 
We could not dispose of those old tapes."From 2001-2004, new nine-track tape drives were acquired to extract most of the data from the tapes and, over the past 15 years, the original data were stored on disk, although, without any attempt to use it.That's why the old data were able to be found.But finding the data was not the end of the story. The files were stored on disk in the original tape format and needed to be decoded.Dan Forrest, SSEC's Senior Systems Engineer, spent several weeks piecing the files together, dug up old documentation, wrote a decoder and was able to retrieve the data, but it was not quite usable.In another serendipitous twist, Dr Santek was the person who wrote some of the original code and he provided modules for navigating and calibrating the data.The data from Meteosat-1 will help scientists better understand the climate and how it has changed.EUMETSAT Climate Services and Product Manager Dr Jörg Schulz, said the discovery would not only provide a longer time series of climate data but would be reanalysed and reprocessed using the latest methodology."It gives us information about the state of Earth's atmosphere from a time when there was less interference from human activity," Dr Menzel added.Dr Schulz said this would help further improve understanding of Earth's climate system."One of the grand challenges in climate science is to better understand atmospheric circulation in general," Dr Schulz said. "Where is the tropical, warm, moist air going? Where is the polar, cool, dry air going? 
And how does this change over time?"This data will be very important to support the analysis of position, strength and variability of storm tracks as well as circulation-cloud interactions."The three scientists were keen to stress not only the scientific and historical importance of the data but also how this demonstrates the value of strong collaboration and cooperation."It's another example of the strong collaboration between SSEC and EUMETSAT and I'm very happy to have found those tapes," Dr Menzel said."A lot of people were involved," Dr Santek added. "It's history and we are able to make it useful, even though it hasn't been looked at for 30 years.""We are excited about the work done at SSEC and look forward to analysing and improving the data in collaboration with SSEC in the coming years," Dr Schulz concluded.You can see an animation made from one day's worth of imagery from Meteosat-1 on the EUMETSAT YouTube channel:
Weather
2016
October 27, 2016
https://www.sciencedaily.com/releases/2016/10/161027093613.htm
Strong link between atmospheric forcing, deep convection, ocean ventilation and anthropogenic carbon sequestration
Based on a unique dataset collected during a research cruise to the Irminger Sea in April 2015, a new paper reveals a strong link between atmospheric forcing, deep convection, ocean ventilation and anthropogenic carbon sequestration.
The Irminger Sea, a small ocean basin between Greenland and Iceland, is known for its harsh and extreme weather conditions during winter. Research cruises that take measurements in the subpolar North Atlantic almost exclusively do so in summer, although the area is particularly interesting in the convectively active winter season.Wintertime on-board ship measurements in the Irminger Sea were collected in April 2015 by scientists from the Bjerknes Centre for Climate research, as part of the SNACS project funded by the Norwegian Research Council. The results are now published in Nature Communications by Friederike Fröb, a PhD student at the Geophysical Institute of the University of Bergen and the Bjerknes Centre for Climate Research, with colleagues from the University of Bergen, Uni Research Bergen, the University of Toronto and the Bedford Institute of Oceanography, both in Canada.Compared to the far more famous Labrador Sea where deep convection is observed almost every year, convection in the Irminger Sea is more rare, and more variable in extent and strength. The 2015 data show record winter mixed layers of 1,400m depth -- usually observed are 400m. The last time winter mixing had been that deep was probably in the mid-1990s, however, there is only indirect evidence for that; no direct measurements are available from that time. In the late 2000s, during the winters 2007/08 and 2011/12, convection down to between 800m and 1,000m was observed by ARGO floats.With the newly collected data in 2015, oxygen and carbon concentrations during active convection have been determined as well.These data show that oxygen and anthropogenic CO2 concentrations were both almost saturated with respect to the atmosphere in the upper water column. This resulted in a replenishment of depleted oxygen levels at mid-depth as well as a sequestration of large amounts of anthropogenic carbon to the deep ocean. 
Compared to historic cruise data from 1997 and 2003 covering the same transect as the 2015 cruise, the anthropogenic carbon storage rate almost tripled in response to the large variability in the physical climate system. The main driver of that extreme convective event in 2015 was the strong heat flux from the water column, a consequence of exceptionally strong winds that developed that winter around the southern tip of Greenland. The winter of 2014-2015 was also the coldest on record in the North Atlantic, a phenomenon known as the 'cold blob'. This cold blob has been tied to a reduced Atlantic Meridional Overturning Circulation as a consequence of increased freshwater runoff from the melting Greenland Ice Sheet and the Arctic, which increases ocean stratification. Although observations of a single extreme winter event cannot be used to reject a hypothesis based on long-term trends, they do challenge global climate model predictions. Whether models can resolve small-scale atmospheric phenomena like those in the Irminger Sea may matter more for simulating convective processes in the North Atlantic than anticipated. Overall, the cruise observations reveal the strong, direct link between atmospheric forcing, oceanic heat loss, ventilation, and anthropogenic carbon storage in the Irminger Sea. Further, the cruise data show the necessity of ongoing, continuous data collection in remote areas, even during harsh seasons, allowing scientists to study highly variable natural processes as well as the impact of anthropogenic climate change on ocean biogeochemistry.
Weather
2016
October 26, 2016
https://www.sciencedaily.com/releases/2016/10/161026081551.htm
Extreme cold winters fuelled by jet stream and climate change
Scientists have agreed for the first time that recent severe cold winter weather in the UK and US may have been influenced by climate change in the Arctic, according to a new study.
The research, carried out by an international team of scientists including the University of Sheffield, has found that warming in the Arctic may be intensifying the effects of the jet stream's position, which in the winter can cause extreme cold weather, such as the winter of 2014/15 which saw record snowfall levels in New York. Scientists previously had two schools of thought. One group believes that natural variability in the jet stream's position has caused the recent severe cold winter weather seen in places such as the Eastern United States and the UK. The other camp includes scientists who are finding possible connections between the warming of the Arctic -- such as melting sea ice, warming air temperatures, and rising sea surface temperatures -- and the emerging pattern of severe cold winter weather. Now, Professor Edward Hanna and Dr Richard Hall from the University's Department of Geography, together with Professor James E. Overland from the US National Oceanic and Atmospheric Administration (NOAA), have brought together a diverse group of researchers from both sides of the debate. The researchers have found that the recent pattern of cold winters is primarily caused by natural changes to the jet stream's position; however, the warming Arctic appears to be exerting an influence on cold spells, although their location can vary from year to year. Previous studies have shown that when the jet stream is wavy there are more episodes of severe cold weather plunging south from the Arctic into the mid-latitudes, which persist for weeks at a time. But when the jet stream is flowing strongly from west to east and not very wavy, we tend to see more normal winter weather in countries within the mid-latitudes. "We've always had years with wavy and not so wavy jet stream winds, but in the last one to two decades the warming Arctic could well have been amplifying the effects of the wavy patterns," Professor Hanna said. 
He added: "This may have contributed to some recent extreme cold winter spells along the eastern seaboard of the United States, in eastern Asia, and at times over the UK (e.g. 2009/10 and 2010/11)."Improving our ability to predict how climate change is affecting the jet stream will help to improve our long-term prediction of winter weather in some of the most highly populated regions of the world."This would be hugely beneficial for communities, businesses, and entire economies in the northern hemisphere. The public could better prepare for severe winter weather and have access to extra crucial information that could help make live-saving and cost-saving decisions."The study, "Nonlinear response of mid-latitude weather to the changing Arctic" is published in the journal It further cements the University's position at the forefront of climate change research and gives geography students at Sheffield access to the latest innovations in environmental science.
Weather
2016
October 24, 2016
https://www.sciencedaily.com/releases/2016/10/161024171132.htm
Weather forecasts for the past
In the new study, the annual rainfall and average temperatures in the national park were inferred from the teeth of herbivorous mammals. Such reverse engineering opens up new opportunities for interpreting fossil records.
What is interesting about this research is that it shows that features in animal teeth are particularly good at detecting where the weather has been unfavourable for the species in question. Such weather conditions include long dry periods, heavy rains or exceptionally low temperatures -- anything that could result in the animal's primary food source becoming unavailable, forcing the animals to turn to less preferred plants to survive. The researchers were particularly interested in why animals were absent from a particular geographical area. "African national parks frequently endure poor years, which seem to prevent the establishment of permanent populations of certain animals. Animals live where the conditions allow them to live and reproduce over the span of decades or centuries," says Mikael Fortelius, professor of evolutionary palaeontology at the University of Helsinki. This study was not about where elks or zebras live or what they eat, nor about what their teeth are like. All of this information can be found in previous research. Mikael Fortelius has discovered how dental traits are connected to the environment. For example, how efficiently the tooth cuts, how well it can withstand wear or whether it can crush hard, woody plants. These characteristics can tell us which conditions specific animals need to survive. Exact data on the number and geographical spread of the animals in Kenya's national parks have been collected over the course of the past 60 years. "We calculated the average dental traits for each area and modelled them in relation to environmental factors. The model's generalisability was tested, and the predictability of individual environmental factors was compared," explains researcher Indrė Žliobaitė, who was responsible for the modelling.
Weather
2016
October 23, 2016
https://www.sciencedaily.com/releases/2016/10/161023155113.htm
U.S. Winter outlook predicts warmer, drier South and cooler, wetter North
Forecasters at NOAA's Climate Prediction Center issued the U.S. Winter Outlook today, saying that La Nina is expected to influence winter conditions this year. The Climate Prediction Center issued a La Nina watch this month, predicting the climate phenomenon is likely to develop in late fall or early winter. La Nina favors drier, warmer winters in the southern U.S and wetter, cooler conditions in the northern U.S. If La Nina conditions materialize, forecasters say it should be weak and potentially short-lived.
"This climate outlook provides the most likely outcome for the upcoming winter season, but it also provides the public with a good reminder that winter is just up ahead and it's a good time to prepare for typical winter hazards, such as extreme cold and snowstorms," said Mike Halpert, deputy director, NOAA's Climate Prediction Center. "Regardless of the outlook, there is always some chance for extreme winter weather, so prepare now for what might come later this winter."Other factors that often play a role in the winter weather include the Arctic Oscillation, which influences the number of arctic air masses that penetrate into the South and create nor'easters on the East Coast, and the Madden-Julian Oscillation, which can affect the number of heavy rain events in the Pacific Northwest.Wetter than normal conditions are most likely in the northern Rockies, around the Great Lakes, in Hawaii and in western AlaskaDrier than normal conditions are most likely across the entire southern U.S. and southern Alaska.Warmer than normal conditions are most likely across the southern U.S., extending northward through the central Rockies, in Hawaii, in western and northern Alaska and in northern New England.Cooler conditions are most likely across the northern tier from Montana to western Michigan.Drought will likely persist through the winter in many regions currently experiencing drought, including much of California and the SouthwestDrought is expected to persist and spread in the southeastern U.S. and develop in the southern Plains.New England will see a mixed bag, with improvement in the western parts and persistence to the east.Drought improvement is anticipated in northern California, the northern Rockies, the northern Plains and parts of the Ohio Valley.This seasonal outlook does not project where and when snowstorms may hit or provide total seasonal snowfall accumulations. 
Snow forecasts are dependent upon the strength and track of winter storms, which are generally not predictable more than a week in advance. However, La Nina winters tend to favor above average snowfall around the Great Lakes and in the northern Rockies and below average snowfall in the mid-Atlantic.NOAA produces seasonal outlooks to help communities prepare for what's likely to come in the next few months and minimize weather's impacts on lives and livelihoods. Empowering people with actionable forecasts and winter weather tips is key to NOAA's effort to build a Weather-Ready Nation.A video of NOAA's 2016 winter outlook is available here:
Weather
2016
October 19, 2016
https://www.sciencedaily.com/releases/2016/10/161019132804.htm
Biomass heating could get a 'green' boost with the help of fungi
In colder weather, people have long been warming up around campfires and woodstoves. Lately, this idea of burning wood or other biomass for heat has surged in popularity as an alternative to using fossil fuels. Now, in the journal
The benefit of biomass, which consists of plant material and animal waste, is that there is no shortage. It is produced continuously in enormous quantities as a waste product from paper and agricultural industries. But burning it emits fine particles and volatile organic compounds, or VOCs, linked to health and environmental problems. So scientists have been trying to figure out how to use biomass with minimal emissions. One approach involves adding microorganisms that can degrade the materials. In this process, heat is released without giving off fine particles or VOCs. So far, most investigations into this method have involved room-temperature conditions. But for sustained use, these reactions would need to take place at temperatures above ambient conditions as heat is produced. Leire Caizán Juanarena and colleagues wanted to warm things up to see how much heat they could coax out of the process.The researchers incubated two fungi species that do well in hot climates -- lignin-degrading
Weather
2016
October 18, 2016
https://www.sciencedaily.com/releases/2016/10/161018193555.htm
Soil moisture, snowpack data could help predict 'flash droughts'
New research suggests that "flash droughts" -- like the one that unexpectedly gripped the Southern Rockies and Midwest in the summer of 2012 -- could be predicted months in advance using soil moisture and snowpack data.
Researchers at the National Center for Atmospheric Research (NCAR) analyzed the conditions leading up to the 2012 drought, which ultimately caused $30 billion in economic losses, looking for any warning signs that a drought was on the way. In a study funded by the National Science Foundation and published in the "The 2012 drought over the Midwest was one of the most severe and extensive U.S. droughts since the 1930s Dust Bowl, but it was also extremely challenging to predict," said Debasish PaiMazumder, lead author of the study. "This study demonstrated the potential to improve seasonal drought outlooks in the future, giving farmers, water planners, and others more time to prepare."Seasonal drought forecasts issued in May 2012 for the upcoming summer did not foresee a drought forming in the country's midsection. But by the end of August, a drought that had started in the Southern Rockies had spread across the Midwest, parching Oklahoma, Kansas, Nebraska, and Missouri.These flash droughts -- which form and intensify rapidly -- can catch forecasters off guard because they are not preceded by any large-scale climate patterns that could act as a warning signal. For example, one contributor to the recent California drought was a persistent high-pressure system parked off the west coast of Canada that deflected storms away from the state. Because forecasters could identify the high-pressure system, they could also accurately predict fewer storms and a worsening of the drought.Previous research has shown that looking at soil moisture alone could improve the lead-time of drought predictions by one to two months. 
PaiMazumder and NCAR colleague James Done were interested in whether they could extend this further by adding snowpack into the equation."Advance knowledge of a drought even a month or two ahead of time can greatly minimize the effects on society," said Anjuli Bamzai, program director in NSF's Division of Atmospheric and Geospace Sciences, which funded the research. "This study highlights the role of snowpack and soil moisture conditions in predicting the sudden onset of drought."To explore the physical connections among snowpack, soil moisture, and drought, the researchers analyzed data collected between 1980-2012. To supplement those observations, they also explored the physical connections in a new NCAR-based community Weather Research and Forecasting (WRF) model dataset comprising 24 simulations of the period 1990-2000 and 2012. Because each simulation was run with small tweaks to the way the model represents atmospheric physics, the result was a broad look at different climate scenarios that could have plausibly unfolded during the study period."The model helped us get a handle on how robust the relationships between snowpack, soil moisture, and drought are," Done said. "The stronger the relationship, the better a predictor is."While observations of snowpack and soil moisture could have helped predict the 2012 drought, the method does not replace other drought prediction measures that identify large-scale phenomena that frequently lead to drought conditions."This is another ingredient that could be used when making seasonal drought forecasts," Done said. "But it's not the only ingredient, and for many droughts that are tied to large-scale precursors, it may not be the most important one."
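The idea of folding snowpack into a soil-moisture-based early warning can be sketched as a simple standardized-anomaly check. The function names, climatology values and the -1.0 threshold below are illustrative assumptions, not the NCAR method:

```python
import statistics

def z_score(history, value):
    """Standardized anomaly: departure from the climatological mean,
    in units of the climatological standard deviation."""
    return (value - statistics.mean(history)) / statistics.stdev(history)

def flash_drought_watch(soil_hist, snow_hist, soil_now, snow_now,
                        threshold=-1.0):
    """Raise a watch when both spring soil moisture and snowpack sit
    well below their climatological norms."""
    return (z_score(soil_hist, soil_now) < threshold
            and z_score(snow_hist, snow_now) < threshold)

# Invented climatology: percent soil moisture and mm snow-water equivalent.
soil_hist = [30, 32, 28, 31, 29]
snow_hist = [100, 110, 90, 105, 95]
print(flash_drought_watch(soil_hist, snow_hist, soil_now=25, snow_now=60))   # -> True
print(flash_drought_watch(soil_hist, snow_hist, soil_now=30, snow_now=100))  # -> False
```

Requiring both precursors to be anomalous mirrors the study's finding that snowpack adds predictive skill beyond soil moisture alone.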
Weather
2016
October 13, 2016
https://www.sciencedaily.com/releases/2016/10/161013220633.htm
Wind patterns in lowest layers of supercell storms key to predicting tornadoes
New research from North Carolina State University has found that wind patterns in the lowest 500 meters of the atmosphere near supercell thunderstorms can help predict whether that storm will generate a tornado. The work may help better predict tornado formation and reduce the number of false alarms during tornado season.
Supercells are a special type of thunderstorm. They last much longer than normal thunderstorms and produce the vast majority of tornadoes and other severe weather. Seventy-five percent of supercell thunderstorms are nontornadic, or don't cause tornadoes. Difficulty in predicting which storms may produce tornadoes has resulted in a false alarm ratio for tornado warnings that also hovers around 75 percent. When using traditional weather sampling methods, there are no clearly observable differences between tornadic and nontornadic supercells in terms of precipitation echoes, rotating updrafts or surface air circulations.To address this knowledge gap, researchers involved in the second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2) collected data in close proximity to supercell storms. Using data from the 12 best-sampled storms -- seven of which produced tornadoes -- Brice Coffer, a graduate student in marine, earth, and atmospheric sciences at NC State and lead author of a paper describing the work, ran simulations of supercell storms to determine which factors made tornadogenesis more likely."We noticed that the biggest difference between tornadic and nontornadic storms was the wind in the lowest 500 meters near the storm," Coffer says. "Specifically, it was the difference in the way the air rotated into the storm in the updraft."All storms have an updraft, in which air is drawn upward into the storm, feeding it. In supercells, the rising air also rotates due to wind shear, which is how much the wind changes in speed and direction as you go higher in the atmosphere. Coffer's simulations demonstrated that if wind shear conditions are right in the lowest 500 meters, then the air entering the updraft spirals like a perfectly thrown football. 
This leads to a supercell that is configured to be particularly favorable for producing a tornado, as broad rotation at the ground is stretched by the updraft's lift, increasing the speed of the spin and resulting in a tornado.

On the other hand, if the wind shear conditions in the lowest part of the atmosphere are wrong, then the air tumbles into the storm like a football rotating end over end after a kickoff. This results in a disorganized storm that doesn't produce tornadoes due to a lack of stretching near the ground. Coffer hopes that his results may lead to fewer tornado false alarms.

"This work points to the need for better observational techniques of the low-level winds being drawn into the storms' updraft," Coffer says. "Improving this aspect of storm monitoring will improve our predictive abilities when it comes to tornadoes."

The research appears in
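The article describes low-level wind shear only qualitatively. A common first-order measure of it is the bulk wind difference over a layer; the sketch below (with invented wind values, not data from the study) computes that quantity for the lowest 500 meters:

```python
import math

def bulk_shear(sfc_wind, wind_500m):
    """Magnitude of the bulk wind-difference (shear) vector between the
    surface and 500 m, given (u, v) wind components in m/s."""
    du = wind_500m[0] - sfc_wind[0]
    dv = wind_500m[1] - sfc_wind[1]
    return math.hypot(du, dv)

# Hypothetical profile: southerly 5 m/s at the surface, strengthening
# and veering by 500 m.
sfc = (0.0, 5.0)
at_500m = (3.0, 9.0)
print(bulk_shear(sfc, at_500m))  # 5.0 m/s of 0-500 m bulk shear
```

A larger bulk shear over this shallow layer is one ingredient of the "spiraling football" inflow the study describes; operational forecasters use related layer-shear quantities when assessing tornado potential.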
Weather
2016
October 13, 2016
https://www.sciencedaily.com/releases/2016/10/161013095756.htm
Ocean rogue waves: A mystery unveiled?
Rogue waves are extremely high ocean waves that exceed the significant wave height by more than a factor of 2. Extreme waves are also very rare; fewer than one in 100,000 waves exceeds the rogue wave criterion. While their existence was long disputed throughout the 1990s, thousands of rogue waves have been recorded on oil rigs in the past 20 years. Nevertheless, the origin of rogue waves is still disputed, with a multitude of competing theories that fall into two basic categories. Linear theories consider incidental random interference to be the origin of rogue waves; in that case it is simply bad luck when your ship is hit by one, and nothing can be done to foresee such an event. More recently, nonlinear theories have gained popularity, as they promise that certain characteristic wave patterns may precede a rogue event. While this appears very appealing, neither class of theory can sufficiently explain the measured probabilities of rogue waves in the ocean.
In a collaborative effort, the group of Günter Steinmeyer at the Max-Born-Institut in Berlin together with colleagues from the Leibniz-University in Hannover and the Technical University in Dortmund now report a new approach to shed more light on the rogue wave mystery. To this end, they suggest a new metric for the complexity of the wave motion, namely, the so-called phase space dimension. This metric measures the effective number of waves that interfere at one given location on the ocean surface. More importantly, they also propose a way to measure the dimension, and this measurement could readily be implemented on ships, possibly providing an early warning of rogue waves.

In fact, it seems that the capability of the ocean to form rogue waves is variable. The study suggests that the ocean surface movement is fairly simply structured most of the time. Even in heavy storms, conditions mostly prevail that do not enable rogue wave formation. However, the complexity of the wave patterns may suddenly increase when crossing seas are generated, resulting in rogue-wave prone situations. Using the suggested dimensional analysis, it is exactly these rogue-wave prone situations that can be detected. Nevertheless, the individual rogue wave event remains unforeseeable. Moreover, the study suggests that the ocean dynamics are ruled by linear yet still very complex dynamics.

The study therefore opens a new perspective for a better understanding of ocean rogue waves. Much research went into ocean nonlinearities, but it appears that the latter play a minor role in rogue wave formation. In contrast, winds have received very little attention in the rogue wave discussion so far. As winds are ultimately the drivers behind ocean wave formation in general, it seems perfectly possible to identify rogue-wave prone situations from meteorological analysis, identifying situations that may give rise to crossing seas early on.
The appearance of an individual rogue wave may remain a mystery, but at least, we may soon be able to predict the "rogueness" of ocean weather hours or days in advance.
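The rogue-wave criterion quoted above is simple enough to state in code. The sketch below uses the common definition of significant wave height as the mean of the highest one-third of measured waves; the wave record here is invented for illustration:

```python
def significant_wave_height(heights):
    """Hs: mean of the highest one-third of measured wave heights."""
    top = sorted(heights, reverse=True)[: max(1, len(heights) // 3)]
    return sum(top) / len(top)

def find_rogues(heights):
    """Waves exceeding twice the significant wave height (the rogue
    criterion described in the article)."""
    hs = significant_wave_height(heights)
    return [h for h in heights if h > 2 * hs]

# A calm record of ~1 m waves with one 5 m outlier.
record = [1.0] * 99 + [5.0]
print(find_rogues(record))  # [5.0]
```

Note that significant wave height is itself pulled up by the outlier, which is why genuinely rogue waves are so rare relative to merely large ones.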
Weather
2016
October 12, 2016
https://www.sciencedaily.com/releases/2016/10/161012132150.htm
Climate change may help Ethiopia, increase the country's access to water
Despite the many disastrous impacts of climate change, there are some regions of the globe that might benefit from hotter temperatures.
A team of researchers from Virginia Tech has predicted that water availability in the Blue Nile Basin of Ethiopia may increase in coming decades due to global climate change. This could also lead to increased crop production, spur massive hydroelectric power projects, and foster irrigation development in the region.

"For all the catastrophic impacts of climate change, there are some silver linings," said Zach Easton, associate professor of biological systems engineering. "The sad irony is that climate change may be the catalyst Ethiopia needs to become a food-exporting country."

The research team used a suite of climate and hydrologic models to predict the impact of climate change on water availability and sediment transport in the Blue Nile. Most previous Nile Basin climate impact studies have focused only on water availability, but the Virginia Tech study was the first of its kind to also assess sediment transport, a big problem in the basin, where some of the highest erosion rates in the world have been measured. The findings of the study were recently published in the journal

"Ethiopia could experience increased water accessibility making growing seasons longer and potentially allowing for two crops to be grown per year," said Moges Wagena, from Assosa, Ethiopia. Wagena is first author on the paper and also associated with the Abay Basin Authority, a water resource management entity for one of Ethiopia's 12 water basins. Wagena is one of Easton's doctoral candidates in the Department of Biological Systems Engineering, housed in both the College of Agriculture and Life Sciences and the College of Engineering. The team also included Andrew Sommerlot, another of Easton's doctoral candidates; Daniel Fuka, a postdoctoral researcher working with Easton; researchers from the University of Maryland; and the International Water Management Institute, Nile Basin Office.
The work was funded by the World Bank and the International Water Management Institute.

For the project, the team coupled hydrologic models with bias-corrected and downscaled climate models from the Intergovernmental Panel on Climate Change's Coupled Model Intercomparison Project 5, known as CMIP5. Previous studies that looked only at temperature and precipitation from the climate models found an increase in water availability of just 10 percent, whereas Easton and Wagena found potentially 20 to 30 percent more streamflow available in the region in the coming decades.

One potential problem that the analysis identified was increased sediment transport in the rivers due to increased water flow. The increased sediment has the potential to reduce the capacity of reservoirs and dams, making massive hydroelectric projects like Ethiopia's largest dam currently under construction, the Grand Renaissance Dam, less efficient in storing the 65 billion cubic meters of water that could potentially turn its turbines.

"Greater water availability is certainly a positive outcome, but this is countered by more sediment. One way to combat that is through installing conservation practices on farms, for instance using cover crops and low- and no-till planting methods to make the soil healthier, more stable, and reduce erosion," said Easton.

While climate change is causing and will continue to cause untold problems, nuances in climate-induced weather events could benefit the Blue Nile Basin with increased rainfall in the area.

"It's interesting, because much of the Blue Nile Basin is well above 5,000 feet in elevation, giving it pretty much an ideal climate for agriculture with low humidity, low disease and pest pressure, and potentially great water availability, which could spur development," said Easton.
Weather
2016
October 6, 2016
https://www.sciencedaily.com/releases/2016/10/161006092009.htm
Spring starting earlier in U.S. national parks, study finds
Spring is coming earlier in more than 200 U.S. national parks at locations from Alaska to Florida, according to a new study that employed a model created by a UWM climatologist.
The study correlated temperature records with climate change indicators developed by Mark D. Schwartz, UWM distinguished professor of geography. In Schwartz's model, the start of spring is pegged to the timing of a particular seasonal event -- the first leaf of certain plants.

A team of researchers from the National Park Service, the U.S. Geological Survey and three universities, including UWM, found that spring is coming earlier than its historical average in three-quarters of the 276 examined parks. Results also showed that more than half the parks included in the analysis are experiencing extreme early onsets of spring.

Changes in the timing of seasonal events are ideal indicators of the impact of local and global temperature changes, said Schwartz. "An example would be when plants pump more water into the atmosphere at first-leaf, the temperatures may be quite different than when the plants were dormant."

The researchers dated the onset of spring in each park, year by year, and then analyzed those trends over the 112-year period in which temperature data were available. "My model provides the key to knowing when plants are responding to the growing season," said Schwartz. "Coupling this with temperature data offers a standard way of looking at the park. It tells you what the trends are going to be."

Results of the study were published Oct. 6 in an article in the journal

"The bottom line is not just that parks are susceptible to change. In fact, they have already changed," said Jake Weltzin, an ecologist with USGS and a co-author on the study. "Many park managers are already managing in an extreme environment."

Spring's early and sometimes unpredictable onset can lead to costly management issues. Earlier warm weather often gives a head start to invasive plants, can contribute to wildfires and appears to be disrupting natural relationships such as the peak bloom of wildflowers and the arrival of birds, bees and butterflies.
Weather
2016
September 29, 2016
https://www.sciencedaily.com/releases/2016/09/160929095425.htm
Sandy's surge topped by 'rogue' 1950 storm in some areas
In November 1950, a freak storm spawned a record storm surge in Atlantic City and a near-record surge at Sandy Hook.
Damaging winds gusted to a record 108 mph in Newark and 94 mph in New York City, while southern states endured record low temperatures and crop damage. The storm dumped heavy rain in New Jersey and up to 57 inches of snow in the central Appalachians, killing nearly 400 people across the region, according to federal reports.

Luckily, the largely forgotten storm arrived around low tide in most of the New York-New Jersey coastal area, averting catastrophic flooding akin to what happened during Superstorm Sandy on Oct. 29, 2012.

"That was a very severe storm," said Arielle Catalano, a doctoral student studying atmospheric science at Rutgers University. "For the most part, the public these days doesn't know how serious that storm was. They just look at Sandy as being an anomaly."

Although it has largely slipped from memory, researchers at Rutgers believe there is a lot to learn from the 1950 storm and others like it. Catalano and Anthony J. Broccoli, professor and chair of Rutgers' Department of Environmental Sciences, are studying weather systems known as extratropical cyclones, or nontropical storms, and the storm surges they have generated along the northern East Coast. The researchers are looking at several hundred events since the early 1900s by examining tide gauge records from The Battery -- the shoreline at the tip of Manhattan -- Boston, Massachusetts, and Norfolk, Virginia. They're trying to understand the atmospheric circulation patterns during such storms, which include nor'easters.
And they're trying to determine whether state-of-the-art climate models simulate them well enough to get an idea of what kind of unusual events may develop in the future so we can better prepare for them, Broccoli said.

The scientists also want to look into whether nor'easters and other storms that form outside of tropical areas will change in a warming climate, and that's a complex issue. "Some studies suggest that extratropical cyclones will become more intense in a warming climate," Broccoli said. "Other studies suggest that storm tracks will shift, possibly increasing the intensity of storms or the frequency of intense storms in some areas and vice versa in other areas."

He added: "The baseline for coastal flooding will change because of sea-level rise. So even if nothing about storms changed meteorologically, their impacts will be greater. Their meteorological characteristics may also change."

After it ended, the freak Thanksgiving weekend winter storm in 1950 was called "one of the most destructive storms ever recorded in (the) northeast United States," according to an article in the federal

The storm -- now known as the Great Appalachian Storm of 1950 -- arose from a small area of low pressure over North Carolina and western Virginia. It evolved into strong low pressure over Ohio and, combined with strong high pressure over Canada's Labrador region, helped generate high winds in the northeastern U.S., the article says. "The strong onshore winds caused excessively high tides and flooding in some cities in Connecticut and New Jersey," the article says. Some areas reported more damage than during the unnamed 1938 and 1944 hurricanes.

According to Broccoli, the storm surge generated by the 1950 storm is the second-highest on record at Sandy Hook and The Battery, and ranks first at Atlantic City. The surge does not include the astronomical tide, which ranges from high to low daily.
High tide can be at least several feet higher than low tide, and Sandy arrived around high tide during a full moon, when tides are higher than normal.

In Atlantic City, the storm surge hit 5.7 feet during the 1950 storm -- a foot higher than Sandy's surge, according to Broccoli. At Sandy Hook, the surge reached 7.9 feet, only a foot shy of Sandy's. At The Battery, the surge hit 7.5 feet versus 9.4 feet during Sandy.

"It occurred at low tide, but what if it had occurred at high tide?" Catalano said. "Then what? It's a very interesting question."

David A. Robinson, professor of geography at Rutgers and the New Jersey state climatologist, said the remarkable rogue storm in many respects was "forgotten because it wasn't a classic nor'easter. It wasn't a storm that brought massive river flooding. It didn't hit the Atlantic coastal beaches in New Jersey as much as it hit Delaware Bay and New York Harbor. But it shows how vulnerable the Jersey Shore is to a period of persistent onshore winds."
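Catalano's "what if it had occurred at high tide?" question is a matter of simple addition, since the reported surge excludes the astronomical tide. The sketch below combines the 1950 Atlantic City surge from the article with hypothetical tidal stages (the ±2 ft values are illustrative, not measurements):

```python
def total_water_level(surge_ft, tide_ft):
    """Still-water level: storm surge plus the astronomical tide stage
    (both in feet relative to mean sea level)."""
    return surge_ft + tide_ft

# The 5.7 ft Atlantic City surge of 1950 arrived near low tide; the
# tidal stages below are illustrative, not measured values.
low_tide, high_tide = -2.0, 2.0
print(f"{total_water_level(5.7, low_tide):.1f} ft")   # 3.7 ft
print(f"{total_water_level(5.7, high_tide):.1f} ft")  # 7.7 ft
```

The same surge arriving at high tide would have produced a water level several feet higher, which is exactly the scenario that made Sandy so destructive.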
Weather
2016
September 28, 2016
https://www.sciencedaily.com/releases/2016/09/160928083214.htm
Climate change jigsaw puzzle: Antarctic pieces missing
A shortage of data on the weather in Antarctica is hampering efforts to understand climate change in the region, according to new research.
The study, led by Dr Julie Jones from the University of Sheffield's Department of Geography, has revealed that limited data on Antarctica's climate is making it difficult for researchers to disentangle changes caused by human activity from natural climate fluctuations.

Scientists can confidently say that Earth is warming due to greenhouse gas emissions caused by humans, but data on climate trends over the Antarctic and the surrounding Southern Ocean only go back to 1979, when regular satellite observations began. This makes it difficult for researchers to see how longer term climate trends have changed in the region. The inhospitable nature of the continent means that it has never had permanent inhabitants to take regular weather observations, unlike most other places on the planet.

The first routine observations only started in 1957 with the establishment of the network of Antarctic research stations. These stations provide important weather data; however, they are mostly located around the coast, leaving vast areas of the continent and the surrounding oceans uncovered. It is only with the advent of regular satellite observations in 1979 that measurement of surface climate over the Antarctic and the surrounding Southern Ocean became possible.

To gain a longer view of recent changes, Dr Jones and her international team of scientists used a compilation of climate records from natural archives, such as ice cores from the Antarctic ice sheet, which give indications of how the region's climate has changed over the last 200 years.
They also studied how these recent changes compared to those in experiments with climate models. They confirmed that human-induced changes have caused the belt of prevailing westerly winds over the Southern Ocean to shift towards Antarctica.

The research concludes that for other changes, such as regional warming and sea ice changes, the observations over the satellite era since 1979 are not yet long enough for the signal of human-induced climate change to be clearly separated from the strong natural variability in the region.

Understanding Antarctic climate change is important not only because of the potential sea level rise locked up in the vast Antarctic ice sheet, but also because the shift in the westerly winds has moved rainfall away from southern Australia. In combination with increasing temperatures (this year is set to be the country's hottest on record), this has had profound impacts on Australian communities and ecosystems.

Dr Julie Jones said: "The Antarctic climate is like a giant jigsaw puzzle with most of the pieces still missing. There are some parts of the picture which are clear, particularly the way that climate change is causing westerly winds to shift southwards, but there are still huge gaps that we need to fill in order to fully understand how much human activity is changing weather in the region.

"At face value, many of the climate trends in Antarctica seem counter-intuitive for the warming world.
Scientists have good theories for why, but these are difficult to prove with the short records we are working with."

Co-author Nerilie Abram, from the Australian National University, said: "In order to better understand climate change in Antarctica, we need continued climate measurements in the Antarctic and Southern Ocean, and extension of these short observational records with past climate reconstructions and climate modelling."

Co-researcher Professor Matthew England from the University of New South Wales added that scientists still had much to learn about the climate system of Antarctica and the region, and how it would affect people and the environment.

The Sheffield-led research team included scientists from 19 institutions from around the world. Their results will help to shape the way the University contributes to the global effort of understanding climate change and gives geography students at Sheffield access to the latest innovations in climate change research. The research is published in
Weather
2016
September 20, 2016
https://www.sciencedaily.com/releases/2016/09/160920165610.htm
WMO rules on longest distance and longest duration lightning flashes
A World Meteorological Organization committee of experts has established two new world records for the longest reported distance and the longest reported duration for a single lightning flash in, respectively, Oklahoma (United States of America) and southern France.
The lightning flash over Oklahoma in 2007 covered a horizontal distance of 321 kilometers (199.5 miles). The lightning event over southern France in 2012 lasted continuously for 7.74 seconds, the WMO evaluation committee found.

"Lightning is a major weather hazard that claims many lives each year," said WMO Secretary-General Petteri Taalas. "Improvements in detecting and monitoring these extreme events will help us improve public safety."

It is the first time that lightning has been included in the official WMO Archive of Weather and Climate Extremes, which is maintained by the WMO Commission for Climatology and documents details of records for heat, cold, wind speed, rainfall and other events. Full details of the assessment are given in an Early Online Release posting of the article published on 15 September; the article will be formally published in an upcoming issue of the

Dramatic improvements in lightning remote sensing techniques have allowed the detection of previously unobserved extremes in lightning occurrence and so enabled the WMO committee to conduct a critical evaluation. The committee judged that the world's longest detected distance for a single lightning flash occurred over a horizontal distance of 321 km (199.5 miles), using a maximum great circle distance between individual detected VHF lightning sources.
The event occurred on 20 June 2007 across the state of Oklahoma. The committee also accepted the world's longest detected duration for a single lightning flash as a single event that lasted continuously for 7.74 seconds on 30 August 2012 over Provence-Alpes-Côte d'Azur, France.

"This investigation highlights the fact that, because of continued improvements in meteorology and climatology technology and analysis, climate experts can now monitor and detect weather events such as specific lightning flashes in much greater detail than ever before," said Randall Cerveny, chief Rapporteur of Climate and Weather Extremes for WMO. "The end result reinforces critical safety information regarding lightning, specifically that lightning flashes can travel huge distances from their parent thunderstorms. Our experts' best advice: when thunder roars, go indoors," he said.

The investigating committee was composed of lightning and climate experts from the United States, France, Australia, Spain, China, Morocco, Argentina, and the United Kingdom. As part of their evaluation, the committee also unanimously agreed that the existing formal definition of "lightning discharge" should be amended to a "series of electrical processes taking place continuously" rather than the previously specified time interval of one second.
This is because technology and analysis have improved to the point that lightning experts can now detect and monitor individual lightning flashes with lifetimes much longer than a single second.

Validation of these new world lightning extremes (a) demonstrates the recent and ongoing dramatic improvements to regional lightning detection and measurement networks, (b) reinforces lightning safety concerns, since lightning can travel large distances and so lightning dangers can exist even far from the parent thunderstorm, and (c) informs lightning engineering concerns.

A full list of weather and climate extremes is available at the WMO Archive of Weather and Climate Extremes.
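The record distance was measured as a maximum great-circle distance between detected VHF sources. The standard way to compute a great-circle distance from two latitude/longitude pairs is the haversine formula, sketched below; the two Oklahoma coordinates are illustrative stand-ins, not the actual record endpoints:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Two hypothetical VHF source locations in Oklahoma, roughly as far
# apart as the record flash (coordinates are illustrative only).
print(round(great_circle_km(35.0, -99.5, 35.0, -96.0), 1))  # ~319 km
```

Taking the maximum of this distance over all pairs of detected sources in a flash gives the horizontal extent the committee evaluated.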
Weather
2016
September 20, 2016
https://www.sciencedaily.com/releases/2016/09/160920130704.htm
Soil management may help stabilize maize yield in the face of climate change
How will we feed our growing population in the face of an increasingly extreme climate? Many experts suggest the answer lies in breeding novel crop varieties that can withstand the increases in drought, heat, and extreme rainfall events predicted in the not-too-distant future. But breeding is only part of the equation, according to new research from the University of Illinois and several collaborating institutions across the Midwestern U.S.
"It might not be necessary to put all the stress of climate adaptation and mitigation on new varieties. Instead, if we can manage agroecosystems more appropriately, we can buffer some of the effects of climate instability," says U of I and USDA Agricultural Research Service ecologist Adam Davis.

To find the management tool that could ameliorate the effects of climate instability, Davis and his collaborators had to go beyond the traditional field-scale experiment. "We had to think at a much broader spatial scale," he notes.

The team obtained weather, soil, and yield data from every county in four states -- Illinois, Michigan, Minnesota, and Pennsylvania -- across a span of 15 years. They then used a new analytical approach, which borrowed from economic concepts, to determine the effects of weather and soil properties on maize yield.

"The things that were most effective at buffering against the different forms of yield instability were soil organic matter and water holding capacity," Davis says. This pattern was true across all years and all study locations. Greater water holding capacity, which increases with more soil organic matter, gives crops an advantage in hot, dry climates. They can continue to take up water from the soil, which means continued growth and strong yields even in adverse climates.

The good news for farmers is that they may be able to manage for improvements in water holding capacity, giving them a potential tool to support novel maize varieties. "In locations with coarse soils, you can see really quick and gratifying responses to soil organic matter amendments," Davis says.

Davis suggests a number of practices to increase soil organic matter, including using cover crops, avoiding excessive soil disturbance, increasing crop rotation length, and adding composted manures.
He points out that cover crops might be the best choice for some farmers. "Cover crops are a great way for improving soil organic matter; even small amounts of cover crop biomass seem to have soil organic matter benefits," Davis explains. "They also can have weed suppressive benefits, so cover crops may represent a win-win scenario."

No matter which amendment practice farmers choose, he says, "soil organic matter amendments are an important place to start building a cropping system resilient to climate change."

The study, "Soil water holding capacity mitigates downside risk and volatility in US rainfed maize: Time to invest in soil organic matter?" is published in the journal
Weather
2016
September 19, 2016
https://www.sciencedaily.com/releases/2016/09/160919162849.htm
Smoke from 2015 Indonesian fires may have caused 100,000 premature deaths
In the fall of 2015, hazardous levels of smoke from agricultural fires blanketed much of Equatorial Asia. Schools and businesses closed, planes were grounded and tens of thousands sought medical treatment for respiratory illness.
In a new study, Harvard University researchers and their colleagues estimate that the 2015 smoke event caused upwards of 100,000 deaths across Indonesia, Malaysia, and Singapore. To mitigate the impact of future smoke events, the team developed a model framework which could help governments and policymakers in Southeast Asia identify, in almost real time, the fires with the highest potential to cause damage to human health. The research is described in

"Although the regions experienced several major haze events over the past twenty years, the 2015 event was one of the worst," said Shannon N. Koplitz, first author and Harvard graduate student. "We understand many of the underlying conditions that lead to these extreme events, and we can often predict when smoke pollution will be severe based on particular meteorological indicators, but regional efforts to mitigate the effects on public health have not been successful."

"Our hope is that this framework can inform early-response efforts to identify areas where effective fire and land use management would yield the greatest benefits to human health, even as the haze event is still unfolding," said Loretta J. Mickley, Senior Research Fellow at SEAS and coauthor.

The research is part of a larger effort by Harvard and Columbia to provide local stakeholders with an effective tool to assess the public health costs from fires and guide policy decisions. The framework uses a mathematical model to quickly identify the fires that will have the biggest impact on human health downwind.

"Decisions on how to manage the lands, which geographies to protect, how industries are regulated and where fires are allowed to burn are decisions of life and death," said Samuel Myers, Senior Research Scientist at the T.H. Chan School of Public Health, Director of the Planetary Health Alliance and coauthor of the paper.
"We want to support local policy makers to make those decisions with clear data."

Fires started by farmers in Indonesia, particularly those producing palm oil and timber for wood pulp and paper, are the main culprits of haze events in this region. The fires, largely in coastal peatlands, burn at relatively low temperatures and can smolder for weeks or even months before extinguishing, resulting in lots of smoke. During periods of extreme dry weather caused by El Niño and a phenomenon called the positive Indian Ocean Dipole, smoke emissions are considerably higher -- either because farmers are taking advantage of the dry weather to burn more land or because, once burning, the fires are more difficult to control. Although many fires burn in remote areas of Indonesia, prevailing winds can carry the smoke hundreds of miles to densely populated cities like Palembang in Sumatra, and Singapore and Kuala Lumpur.

The region experienced similar smoke conditions caused by El Niño in 2006, but the Harvard-led team found that deaths from air pollution more than doubled between the 2006 and 2015 events, from about 38,000 to about 100,000. This is largely because of where the fires burned in relation to population centers, and their intensity. Fires in southern Sumatra and nearby Jambi province turn out to be particularly deadly.

"Based on years of epidemiological research, we understand very well the relationship between pollution and mortality," said Jonathan Buonocore, coauthor and research associate at the T.H. Chan School of Public Health. "We know for each incremental increase in air pollution, you get a certain incremental increase in mortality risk."

Being able to identify the most dangerous fires could help save lives in the future, Buonocore said.
"For the first time in Indonesia, we have a rapid assessment modeling tool that can quickly estimate the cost to human health of these haze events, as they are happening," he said.

In ongoing work, the researchers are using the model to diagnose the health impacts of different land-use scenarios over the next 20-30 years. This effort could promote more rational land-use decisions and management that could save thousands of lives.

"If regional policy makers understand fully the health dimension of these biomass fires, we believe they will be in a better position to manage them more effectively and improve human health and ecosystems at the same time," said Ruth DeFries, of Columbia University and coauthor of the paper.

This research was funded by the Rockefeller Foundation.
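Buonocore's "incremental increase in pollution, incremental increase in mortality risk" relationship is typically formalized as a log-linear concentration-response function. The sketch below shows the shape of that calculation; the coefficient and inputs are placeholders, not values from the Harvard study:

```python
import math

def attributable_deaths(baseline_deaths, delta_pm25, beta=0.0008):
    """Excess deaths from a pollution increase under a log-linear
    concentration-response function, RR = exp(beta * delta_PM2.5).
    `beta` is a placeholder coefficient, not a value from the study."""
    relative_risk = math.exp(beta * delta_pm25)
    attributable_fraction = 1.0 - 1.0 / relative_risk
    return baseline_deaths * attributable_fraction

# Each incremental increase in exposure raises the mortality burden:
print(attributable_deaths(100000, 0))  # 0.0 -- no increase, no excess deaths
print(attributable_deaths(100000, 50) < attributable_deaths(100000, 100))  # True
```

Frameworks like the one described in the article combine a function of this form with modeled smoke exposure maps and population data to rank fires by their downwind health impact.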
Weather
2016
September 19, 2016
https://www.sciencedaily.com/releases/2016/09/160919131958.htm
Heatwaves in the ocean: Risk to ecosystems?
Did you know that heatwaves not only occur on land, but also in the sea? We all remember the record-breaking European heatwave in the summer of 2003: forests burned, rivers dried up and more than ten thousand people in Europe died as a result of the extremely high temperatures. The marine environment -- and in particular its organisms -- also suffers from heat stress. Two exceptional heatwaves in the ocean during the past few years have alarmed us scientists. Humans will also feel their consequences in the long term.
An unusually long-lasting warm water bubble -- nicknamed 'The Blob' -- spread across the surface of the Northeast Pacific from winter 2013/2014 to the end of 2015. The warm water bubble at times measured up to 1,600 kilometres in diameter, with water temperatures more than 3 degrees Celsius above the long-term average. Because warm surface water has a lower density than the cold deep water, the exchange of nutrient-rich deep water with warm surface water was reduced, especially along the west coast of North America. This had far-reaching consequences for marine organisms and ecosystems: the growth of phytoplankton decreased due to the reduced supply of nutrients, and some zooplankton and fish species migrated from the warm and nutrient-poor water to cooler regions. By contrast, researchers found pygmy killer whales in the North Pacific for much longer than usual: this tropical whale species is usually observed 2,500 kilometres further south.

A stronger but shorter heatwave hit Australia's west coast at the turn of the year 2010/2011, with sea temperatures of up to 6 degrees Celsius above normal levels for that time of year. The seabed along the coast of Western Australia is known for its high concentration of brown algae. These marine 'kelp forests' have similar functions to terrestrial forests: they provide habitat and food resources to numerous species, in particular a large number of fish. Australian researchers demonstrated that most of the kelp forest stocks rapidly disappeared during this heatwave. In total, an area of 1,000 square kilometres of kelp forest was lost -- this corresponds to twice the size of Lake Constance. To this day, the algae stocks haven't recovered.
Instead, a new ecosystem with tropical fish and seaweeds has developed.

We have known for some time that extreme weather and climate events on land, such as heatwaves, fundamentally shape the structure of biological systems and affect their biogeochemical functions and the services they provide to society. It is also known that heatwaves affect a number of biological systems, including humans, more strongly than slower changes in the average temperature. This is because such extreme events push organisms and ecosystems to the limits of their resilience and beyond, potentially causing dramatic and irreversible changes.

The two extreme events in the North Pacific and along the west coast of Australia showed us for the first time that marine heatwaves can also lead to a number of unprecedented ecological and socioeconomic consequences. In both cases, for example, large numbers of fish moved to colder northern waters. Escaping to cooler ocean depths is often not an option, because greater depths lack sunlight, oxygen and plants for food. This may ultimately lead to losses for both the fishing and tourism sectors.

As the world's oceans continue to warm, marine heatwaves are likely to become more frequent and intense. Observations and model simulations also demonstrate that other factors, such as ocean acidification and deoxygenation, are putting additional stress on marine organisms and ecosystems.

Until recently, climate models were unable to represent the relevant physical and biogeochemical processes accurately enough to simulate extreme events in the ocean and predict future changes; the uncertainties in future projections, particularly at the regional scale, were simply too large. New model simulations linking the global carbon and oxygen cycles with high-resolution physical processes now enable us, for the first time, to make quantitative predictions about the frequency, strength and spatial distribution of future extreme events in the ocean.
And this is precisely what my scientific research focuses on. But in order to better understand the impact of these extreme events on individual organisms or entire ecosystems and their socioeconomic services, interdisciplinary collaborations are urgently needed. Research on understanding such events is only just beginning.
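Marine heatwaves like the two described here are typically identified in data as sustained runs of sea-surface temperatures far above the long-term average. As a minimal illustration -- not the formal definition used in the research literature -- the toy Python sketch below flags any run of at least five consecutive days whose temperature anomaly exceeds a fixed threshold; both the five-day minimum and the 3.0 °C threshold are illustrative assumptions, not values taken from the events above.

```python
# Toy marine-heatwave detector: flag runs of consecutive days where the
# sea-surface temperature anomaly stays above a threshold. The threshold
# (3.0 degC) and minimum duration (5 days) are illustrative choices only.

def heatwave_spans(anomalies, threshold=3.0, min_days=5):
    """Return (start, end) index pairs of runs where anomaly > threshold."""
    spans, start = [], None
    for i, a in enumerate(anomalies):
        if a > threshold and start is None:
            start = i                      # a warm run begins
        elif a <= threshold and start is not None:
            if i - start >= min_days:      # run was long enough to count
                spans.append((start, i - 1))
            start = None
    if start is not None and len(anomalies) - start >= min_days:
        spans.append((start, len(anomalies) - 1))   # run reaches series end
    return spans

# Ten cool days, a week of +3.5 degC anomalies, then ten cool days.
series = [0.2] * 10 + [3.5] * 7 + [0.1] * 10
print(heatwave_spans(series))  # -> [(10, 16)]
```

Real detection schemes use a day-of-year climatological percentile rather than a fixed threshold, but the run-length logic is the same.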
Weather
2016
September 16, 2016
https://www.sciencedaily.com/releases/2016/09/160916132142.htm
Mystery of colorful giant plants of the subantarctic solved
The mystery of why so many plants on New Zealand's otherwise bleak subantarctic islands have very large deeply coloured flowers and giant leaves has been solved by new University of Otago research.
These insect-pollinated "megaherbs" stand out like sore thumbs amongst the islands' other flora, which are small, wind-pollinated plants that mainly reproduce by self-pollination or asexual reproduction.

Department of Botany researchers thermally imaged six species of Campbell Island megaherbs -- whose mainland relatives are small and pale flowered -- and discovered that their flowers and leaves heat up rapidly to make the most of rare moments of sunshine and calm weather. The researchers found that leaf and flower temperatures of all six species were considerably higher than simultaneously measured surrounding temperatures, with the greatest heating seen in Campbell Island daisies.

Study co-author Dr Janice Lord says these daisies and other megaherbs appear to have evolved deeply pigmented flowers and often large, thick, hairy leaves to cope with some of the most relentlessly cloudy and cool conditions in the world. "Their dark floral pigments are able to more efficiently harvest the unpredictable, intermittent sunshine to speed up metabolism and attract insects seeking warmth, and their large rosette leaves can provide mini-glasshouse effects," Dr Lord says.

Their adaptations mirror those of giant tropical alpine plants, she says. "Plants in those climates face similar challenges in terms of cloudiness and the cold, especially at night."

The findings appear in the journal
Weather
2016
September 15, 2016
https://www.sciencedaily.com/releases/2016/09/160915153338.htm
2016 ties with 2007 for second lowest Arctic sea ice minimum
The Arctic's ice cover appears to have reached its minimum extent on September 10, 2016, according to scientists at the National Snow and Ice Data Center (NSIDC). Arctic sea ice extent on that day stood at 4.14 million square kilometers (1.60 million square miles), statistically tied at second lowest in the satellite record with the 2007 minimum. The 2007 minimum occurred on September 18 of that year, when Arctic sea ice extent stood at 4.15 million square kilometers (1.60 million square miles).
"It was a stormy, cloudy, and fairly cool summer," said NSIDC director Mark Serreze. "Historically, such weather conditions slow down the summer ice loss, but we still got down to essentially a tie for second lowest in the satellite record."

"It really suggests that in the next few years, with more typical warmer conditions, we will see some very dramatic further losses," said Ted Scambos, NSIDC lead scientist.

Arctic sea ice cover grows each autumn and winter, and shrinks each spring and summer. Each year, the Arctic sea ice reaches its minimum extent in September. The record lowest extent in the 37-year satellite record occurred on September 17, 2012, when sea ice extent fell to 3.39 million square kilometers (1.31 million square miles).

During the first ten days of September this year, the Arctic lost ice at a faster than average rate: 34,100 square kilometers (13,200 square miles) per day, compared to the 1981 to 2010 long-term average of 21,000 square kilometers (8,100 square miles) per day. The early September rate of decline also greatly exceeded the rate observed for the same period during the record low year of 2012 (19,000 square kilometers, or 7,340 square miles, per day). By September, the air is cooling and there is little surface melt, which suggests that the fairly rapid early September ice loss was due to extra heat in the upper ocean. Recent ice loss was most pronounced in the Chukchi Sea, northwest of Alaska. NSIDC scientists said the ice loss may also relate to the impact of two strong storms that passed through the region during August.

"This has been an exciting year with several record low extents reached during winter and early summer, but thanks to a colder than average summer, more ice remained than at the end of 2012," said Julienne Stroeve, NSIDC senior scientist.
NSIDC scientists said there was also a lot of thin ice at the beginning of the melt season; because thinner ice takes less energy to melt away, this may have contributed to this year's low minimum extent as well.

Please note that the Arctic sea ice extent number for 2016 is preliminary -- changing winds could still push the ice extent lower. NSIDC will issue a formal announcement at the beginning of October with a full analysis of the possible causes behind this year's ice conditions, particularly interesting aspects of the melt season, the set-up going into the winter growth season ahead, and graphics comparing this year to the long-term record. See the full analysis at NSIDC's Arctic Sea Ice News and Analysis page:
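The extent and loss-rate figures quoted in this announcement can be replayed as a small sanity check. The snippet below ranks the three September minima mentioned above and compares the early-September 2016 loss rate against the 1981-2010 average; all numbers come directly from the NSIDC figures in the text.

```python
# September minimum extents, in millions of square kilometres (NSIDC values).
minima = {2012: 3.39, 2007: 4.15, 2016: 4.14}

# Rank years from lowest to highest minimum: 2016 edges out 2007 for second.
ranked = sorted(minima, key=minima.get)
print(ranked)  # -> [2012, 2016, 2007]

# Early-September 2016 daily ice loss versus the 1981-2010 average (km2/day):
# the 2016 rate was about 1.6 times the long-term average.
rate_2016, rate_avg = 34_100, 21_000
print(round(rate_2016 / rate_avg, 2))  # -> 1.62
```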
Weather
2016
September 15, 2016
https://www.sciencedaily.com/releases/2016/09/160915131524.htm
Pacific Ocean’s response to greenhouse gases could extend California drought for centuries
Clues from prehistoric droughts and arid periods in California show that today's increasing greenhouse gas levels could lock the state into drought for centuries, according to a study led by UCLA professor Glen MacDonald.
The study was published today in a Nature journal. As long as warming forces like greenhouse gases are present, the resulting radiative forcing can extend drought-like conditions more or less indefinitely, said MacDonald, a distinguished professor of geography and of ecology and evolutionary biology.

"Radiative forcing in the past appears to have had catastrophic effects in extending droughts," said MacDonald, an international authority on drought and climate change. "When you have arid periods that persist for 60 years, as we did in the 12th century, or for millennia, as we did from 6,000 to 1,000 B.C., that's not really a 'drought.' That aridity is the new normal."

Researchers tracked California's historic and prehistoric climate and water conditions by taking a sediment core in the Sierra Nevada mountains. They pulled a 2-inch-wide, 10-foot-deep cylinder of sediment from the bottom of Kirman Lake and analyzed it in third-of-an-inch sections, creating the most detailed and continuous paleoenvironmental record of California. The team correlated their findings with other studies of California climate history and, for the first time, united all the studies and cross-referenced them with histories of the Pacific Ocean's temperature taken from marine sediment cores and other sources.

What they found was not only that periods of increased radiative forcing could produce drought-like conditions that extended indefinitely, but that these conditions were closely tied to prolonged changes in Pacific Ocean surface temperatures. Changes in ocean temperatures are linked to El Niño and La Niña conditions, which increase and decrease precipitation in California.
Until now, no one had the long, detailed record of California's dry periods needed to show that this aridity went hand-in-hand with changes in the prehistoric climate records of the Pacific Ocean, MacDonald said. "Climate models today have a challenging time predicting what will happen with Pacific sea-surface temperatures in the face of climate change, and we hope that our research can improve that," he added.

The researchers chose Kirman Lake in central-eastern California for its sensitivity to climate changes and its stable geologic history. Today, it's a small, freshwater lake full of trout and about 16 feet deep, with a small marsh at one edge. The team found evidence, though, that through the millennia Kirman Lake has grown more and less salty, dried until it was exclusively marshland, and refilled again. All the while, sediment accumulated on the lake's bottom, forming a record of lake conditions, the changing climate and the surrounding environment.

The research team spent years analyzing the core sample, which revealed California's history layer by layer:

From 6,000 to 1,000 B.C., during a time geologists refer to as the mid-Holocene, the core sample captures a 5,000-year dry period in California that has been seen in less detail through other paleoenvironmental records. This arid period is linked to a slight variation in Earth's orbit that increased the amount of solar energy received by the Northern Hemisphere in the summer months. California was warm and dry, while marine sediment records show the Pacific was in a La Niña-like state, likely reducing precipitation.

A similar dry period was seen from about A.D. 950 to 1250, a time known as the medieval climate anomaly. Increased radiative forcing and warming at this time is connected to decreased volcanic activity and increased sunspots.
Again, La Niña appears to have reigned in the Pacific Ocean.

"We suspected we would see the millennia of aridity during the mid-Holocene at Kirman Lake, but we were surprised to see a very clear record of the medieval climate anomaly as well," MacDonald said. "It was very cool to see the lake was sensitive on the scale of not just thousands of years, but also something that lasted just a few centuries."

Even more exciting to the researchers was a brief shift in the record toward moister conditions around 2,200 B.C. In the middle of thousands of years of mid-Holocene dryness, Kirman Lake suddenly became moister again, MacDonald said, while simultaneously the Pacific Ocean record switched to more El Niño-like conditions. "This change at 2,200 B.C. was a global phenomenon," MacDonald said. "It's associated with the collapse of the Old Kingdom in Egypt. It's linked to the decline of the Akkadian Empire in Mesopotamia and similar Bronze Age societal disruptions in India and China. It was amazing to find evidence of it in our own backyard."

That blip in the record was a reminder that El Niño and La Niña weather patterns have global repercussions. It also confirmed the accuracy and sensitivity of Kirman Lake's record, and the strong link between the ocean and California's weather.

All this has consequences for California, the researchers said. Drought-like conditions can last indefinitely as long as increased warming, or radiative forcing, is present. And greenhouse gases are currently expected to increase. "In a century or so, we might see a retreat of forest lands, and an expansion of sagebrush, grasslands and deserts," MacDonald said. "We would expect temperatures to get higher, and rainfall and snowfall would decrease.
Fire activity could increase, and lakes would get shallower, with some becoming marshy or drying up."California might remain an agricultural state, thanks to irrigation and engineering, though productivity might decrease and crops might change, said MacDonald, who emphasized that while the past is no guarantee of the future, in this case it does provide cause for concern."I think we would find a way to keep our cities going through prolonged drought, but we're not going to engineer a way to conserve or preserve the ecosystems of the state," MacDonald said. "We can't save our huge expanses of oak woodlands, or our pine and fir forests, or high-elevation alpine ecosystems with irrigation projects like we might our orchards and gardens. I worry that we will see very different wildlands by the end of this century."
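The resolution of the Kirman Lake record comes from the core geometry described above: a 10-foot core analyzed in third-of-an-inch sections yields several hundred samples. The back-of-the-envelope sketch below works that out, and then -- under a purely illustrative assumption of constant deposition over an 8,000-year span, which is not a figure from the study -- estimates the average number of years each slice represents.

```python
# Back-of-the-envelope on the Kirman Lake core: 10 feet of sediment,
# analyzed in third-of-an-inch slices.
core_inches = 10 * 12          # 120 inches of sediment
slice_inches = 1 / 3           # thickness of each analyzed section
n_slices = core_inches / slice_inches
print(int(n_slices))           # -> 360 sections analyzed

# ASSUMPTION for illustration only: constant deposition over 8,000 years.
# Real cores are dated (e.g. radiocarbon) and deposition rates vary.
years_spanned = 8_000
print(round(years_spanned / n_slices, 1))  # -> 22.2 years per slice
```

Each slice averaging a couple of decades is what lets the record resolve centuries-long events like the medieval climate anomaly.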
Weather
2016
September 14, 2016
https://www.sciencedaily.com/releases/2016/09/160914090454.htm
El Niño, global warming combine to cause extreme drought in Amazon rainforest
A study led by researchers at the Global Change Unit at the Universitat de València (UV) shows the impact the current 2015/2016 El Niño is having in Amazonia. Areas of extreme drought and changes to their typical distribution in the region are among the most evident consequences.
The El Niño effect is part of a cycle of global heating and cooling associated with the changing temperatures of a band of ocean water in the central and east-central equatorial Pacific Ocean. Repeating every three to five years, it is one of the main drivers of climate variability. Although its consequences are felt at the global level, its impact on tropical forests -- particularly the Amazon rainforests -- is considered particularly significant, since this ecosystem is one of the planet's main carbon sinks.

Some El Niño events, like those of 1982/1983 and, especially, 1997/1998, are stronger than average. In 2014, alarm bells started ringing at the possibility of another such 'Mega Niño', as they are known, though ultimately not all of the necessary conditions converged. However, in 2015 they all fell into place, leading to the current 2015/2016 event, which, coupled with the trend of global warming, is proving more extreme than any on record.

The study, by researchers at the Universitat de València and published in Scientific Reports, shows how the current El Niño event is associated with an unprecedented heating of Amazonia, reaching the highest temperature in the last forty years and, probably, the last century. Additionally, extreme drought has hit a much larger area of this region than usual and is distributed atypically, with extremely dry conditions in the northeast and unusual wetting in the southeast (something which occurred in 2009/2010, though to a lesser extent). According to the UV scientists, this fact, not observed in the 1982/1983 and 1997/1998 events, implies that the more the central equatorial Pacific is heated, the more marked the difference between and distribution of the wet zones and areas of extreme drought in the Amazon rainforest.

Some studies associate the current context of global warming with a greater frequency of these stronger El Niño events, although no clear consensus exists among scientists.
The severity of the impact of these extreme drought events on tropical forests has to do with the lower absorption rate of atmospheric CO2, as well as an increased risk of fires and the consequent loss of biomass. Currently the temperature of the Pacific Ocean is neutral, with odds at slightly over 50% of entering La Niña, the cold phase of this natural global climate cycle. However, the drought is expected to continue over the coming months.

The research behind this study was carried out using climate data and temperature and rainfall records generated by the European Centre for Medium-Range Weather Forecasts, the Met Office Hadley Centre and the University of East Anglia's Climatic Research Unit, as well as satellite imagery. Some of these data are available on the Termal Amazoni@ website, developed by the UV's Global Change Unit.

Taking part in this research were Juan Carlos Jiménez Muñoz and José Antonio Sobrino, from the Global Change Unit, based at the Image Processing Lab at the University's Science Park. They were joined by scientists from the University of Chile, University of Leeds, University of Maryland, Geophysical Institute of Peru, University of Oxford and Royal Netherlands Meteorological Institute (KNMI).
Weather
2016
September 13, 2016
https://www.sciencedaily.com/releases/2016/09/160913141512.htm
After a strong El Nino winter, NASA model sees return to normal
Not too hot, not too cold -- instead, water temperatures in the equatorial Pacific Ocean should be just around normal for the rest of 2016, according to forecasts from the Global Modeling and Assimilation Office, or GMAO. With these neutral conditions, scientists with the modeling center at NASA's Goddard Space Flight Center say there is unlikely to be a La Niña event in late 2016.
Last winter saw an extremely strong El Niño event, in which warmer-than-normal water sloshed toward the eastern Pacific Ocean. Historically, some of the larger El Niño events are followed by a La Niña event, in which deep, colder-than-normal water surfaces in the eastern Pacific Ocean, off the coast of South America.

"We are consistently predicting a more neutral state, with no La Niña or El Niño later this year," said Steven Pawson, chief of the GMAO. "Our September forecast continues to show the neutral conditions that have been predicted since the spring."

As part of a research and development project, GMAO contributes experimental seasonal forecasts each month to the North American Multi-Model Ensemble (NMME) and other centers. The NMME produces a forecast by combining the individual forecasts of a number of participating institutions, which helps to reduce the uncertainty involved in forecasting events nine to twelve months in advance. The NMME prediction system delivers forecasts based on the National Oceanic and Atmospheric Administration (NOAA) operational schedule and is used by many operational forecasters in predicting El Niño and La Niña events.

For GMAO, the seasonal forecasts are one way to use NASA satellite data to improve near-term climate predictions of the Earth system. "We're really trying to bring as much NASA observational data as possible into these systems," Pawson said. The scientists with GMAO feed a range of NASA satellite data and other information into the seasonal forecast model to predict whether an El Niño or La Niña event will occur in the coming nine months -- information on aerosols and ozone in the atmosphere, sea ice, winds, sea surface heights and temperatures, and more. The models are run on supercomputers at the NASA Center for Climate Simulation, processing 9 terabytes of data each month.

For much of this spring and summer, however, the Goddard group's forecast of neutral conditions looked like an outlier.
Most other forecasts originally called for a La Niña event, but then shifted to more neutral outlooks in August. But the GMAO forecasts produced in January 2016, which look nine months ahead, saw the Pacific Ocean reverting to normal temperatures after last year's El Niño, and even getting a little colder than normal. Still, the water wouldn't get cold enough to be considered a La Niña, according to the GMAO forecasts.

It's not the first time in recent memory that GMAO was an outlier. "The big El Niño that peaked in November 2015, we actually began forecasting that back in March, and our forecast was in excellent agreement with the real event," said Robin Kovach, a research scientist at GMAO. While the strength of the 2015-2016 El Niño predicted by the model seemed at first to be excessive, it was borne out in subsequent observations.

The GMAO models aren't always right, though, Kovach said. In 2014 the group forecast a large El Niño that didn't materialize. "There's a fair degree of uncertainty when you start predicting for nine months ahead," Pawson said. But the group is constantly upgrading its systems, and is currently working to improve the resolution and bring in new types of satellite observations, such as soil moisture information from the Soil Moisture Active Passive mission, which launched in 2015.

GMAO scientists are also investigating how to incorporate observations of ocean color into the seasonal forecast model. Shades of green can tell researchers how much phytoplankton is in a region, which in turn can provide information about fish populations. "So if there's another big El Niño in five years or so, we could be able to do online predictions of phytoplankton," he said, "and help fishermen predict where fish might be."
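The multi-model ensemble idea behind the NMME -- combining forecasts from several institutions to reduce uncertainty -- can be sketched very simply. In the toy Python example below, each "model" submits a sea-surface temperature anomaly forecast; the ensemble reports the mean and spread, and a simple ±0.5 °C rule classifies the state as neutral. The model names, values, and classification threshold are all invented for illustration, not NMME specifics.

```python
# Minimal sketch of combining multi-model forecasts into an ensemble mean.
from statistics import mean, stdev

forecasts = {
    "model_a": -0.1,   # hypothetical SST anomaly forecasts, degC
    "model_b":  0.2,
    "model_c":  0.0,
    "model_d": -0.3,
}

values = list(forecasts.values())
ens_mean = mean(values)        # combined best estimate
ens_spread = stdev(values)     # disagreement among members = uncertainty
print(round(ens_mean, 2))      # -> -0.05

# Illustrative classification: anomalies within +/-0.5 degC count as neutral.
label = "neutral" if abs(ens_mean) < 0.5 else (
    "El Nino" if ens_mean > 0 else "La Nina")
print(label)                   # -> neutral
```

Averaging cancels the uncorrelated errors of individual members, which is why the combined forecast usually beats any single model.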
Weather
2016
September 13, 2016
https://www.sciencedaily.com/releases/2016/09/160913115639.htm
Westerly winds have blown across central Asia for at least 42 million years
The gusting westerly winds that dominate the climate in central Asia, setting the pattern of dryness and location of central Asian deserts, have blown mostly unchanged for 42 million years. A University of Washington geologist led a team that has discovered a surprising resilience in one of the world's dominant weather systems. The finding could help long-term climate forecasts, since it suggests these winds are likely to persist through radical climate shifts.
"So far, the most common way we had to reconstruct past wind patterns was using climate simulations, which are less accurate when you go far back in Earth's history," said Alexis Licht, a UW assistant professor of Earth and space sciences and lead author of the paper published in August.

Earlier studies of the Asian climate's history used rocks from the Loess Plateau in northwestern China to show that dust accumulation began 25 million to 22 million years ago and increased over time, especially over the past 3 million years. It had been believed that these rocks reflected the full history of central Asian deserts, linking them with the rise of the Tibetan Plateau and a planetwide cooling. But Licht led previous research at the University of Arizona using much older rocks, dating back more than 40 million years, from northeastern Tibet. Dust in those rocks confirmed the region was already parched during the Eocene epoch. This upended previous beliefs that the region's climate at that time was more subtropical, with regional wind patterns bringing more moisture from the tropics.

The new paper traces the origin of this central Asian dust using samples from the area around Xining, the largest city at the northeastern corner of the Tibetan Plateau. Chemical analyses show that the dust came from areas in western China and along the northern edge of the Tibetan Plateau, as it does today, and was carried by the same westerly winds. "The origin of the dust hasn't changed for the last 42 million years," Licht said.

During the Eocene, the Tibetan Plateau and Himalayan Mountains were much lower, temperatures were hot, new mammal species were rapidly emerging, and Earth's atmosphere contained three to four times more carbon dioxide than it does today. "Neither Tibetan uplift nor the decrease in atmospheric carbon dioxide concentration since the Eocene seems to have changed the atmospheric pattern in central Asia," Licht said.
"Wind patterns are influenced by changes in Earth's orbit over tens or hundreds of thousands of years, but over millions of years these wind patterns are very resilient."

The study could help predict how climates and ecosystems might shift in the future. "If we want to have an idea of Earth's climate in 100 or 200 years, the Eocene is one of the best analogs, because it's the last period when we had very high atmospheric carbon dioxide," Licht said.

Results of the new study show that the wind's strength and direction are fairly constant over central Asia, so the amount of rain in these dry zones depends mostly on the amount of moisture in the air, which varies with carbon dioxide levels and air temperature. The authors conclude that the winds will likely remain constant, but global warming could affect rainfall through changes in the air's moisture content.

"Understanding the mechanism of those winds is a first step to understand what controls rainfall and drought in this very wide area," Licht said. "It also provides clues to how Asian circulation may change, since it suggests these westerly winds are a fundamental feature that have persisted for far longer than previously believed."
Weather
2016
September 8, 2016
https://www.sciencedaily.com/releases/2016/09/160908151118.htm
Unprecedented atmospheric behavior disrupts one of Earth's most regular climate cycles
The normal flow of air high up in the atmosphere over the equator, known as the quasi-biennial oscillation, was seen to break down earlier this year. These stratospheric winds are found high above the tropics; their direction and strength change in a regular two- to three-year cycle, which provides forecasters with an indication of the weather to expect in Northern Europe. Westerly winds are known to increase the chance of warm and wet conditions, while easterlies bring drier and colder weather.
Scientists from NCAS at the University of Oxford and the Met Office were part of an international team that observed the unusual behaviour in February, noticing a reversal of the expected pattern in the winds. This same team then identified the reason why.

The quasi-biennial oscillation is a regular feature of the climate system. On average, these equatorial eastward and westward winds alternate every 28 to 29 months, making them very predictable in the long term. The team's findings have now been published.

Dr Scott Osprey, an NCAS scientist at the University of Oxford, said: "The recent disruption in the quasi-biennial oscillation was not predicted, not even one month ahead. If we can get to the bottom of why the normal pattern was affected in this way, we could develop more confidence in our future seasonal forecasts."

Prof Adam Scaife, Head of Long-range Forecasting at the Met Office and Honorary Visiting Professor at the University of Exeter, said: "This unexpected disruption to the climate system switches the cycling of the quasi-biennial oscillation forever. And this is important as it is one of the factors that will influence the coming winter."

A return to more typical behaviour within the next year is forecast, though scientists believe that the quasi-biennial oscillation could become more susceptible to similar disruptions as the climate warms. Later this month international research groups will meet in Oxford to discuss the origins and implications of this event.
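The regularity that makes the quasi-biennial oscillation so predictable can be seen in how its cycle length is estimated: count the sign reversals of the equatorial zonal wind and average the time between every second reversal. The Python sketch below does this on a synthetic wind series that flips direction every 14 months, giving the roughly 28-month full cycle mentioned above; the series and the estimator are illustrative, not the methods used by the forecasting centres.

```python
# Estimate the mean full-cycle length of an alternating wind index from
# its sign reversals (two reversals = one complete west+east cycle).

def mean_cycle_months(winds):
    """Average months per full cycle, or None if too few reversals."""
    flips = [i for i in range(1, len(winds)) if winds[i] * winds[i - 1] < 0]
    if len(flips) < 3:
        return None
    # Spacing between every second reversal is one complete cycle.
    cycles = [b - a for a, b in zip(flips, flips[2:])]
    return sum(cycles) / len(cycles)

# Synthetic monthly winds: 10 alternating 14-month phases (+west, -east, ...).
synthetic = []
for phase in range(10):
    sign = 1 if phase % 2 == 0 else -1
    synthetic += [sign * 10] * 14

print(mean_cycle_months(synthetic))  # -> 28.0
```

The 2016 disruption was striking precisely because a reversal appeared where this kind of simple phase bookkeeping said none was due.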
Weather
2016