Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---|
October 17, 2018 | https://www.sciencedaily.com/releases/2018/10/181017172832.htm | Eliminating emissions in India and China could add years to people's lives | The 2.7 billion people who live in China and India -- more than a third of the world's population -- regularly breathe some of the dirtiest air on the planet. Air pollution is one of the largest contributors to death in both countries, ranked 4th in China and 5th in India, and harmful emissions from coal-fired power plants are a major contributing factor. | In a recent study, researchers from Harvard University wanted to know how replacing coal-fired power plants in China and India with clean, renewable energy could benefit human health and save lives in the future. The researchers found that eliminating harmful emissions from power plants could save an estimated 15 million years of life annually in China and 11 million years of life annually in India. The research has been published. Previous research has explored mortality from exposure to fine particulate matter (known as PM2.5) in India and China, but few studies have quantified the impact of specific sources and regions of pollution or identified efficient mitigation strategies. Using state-of-the-art atmospheric chemistry modeling, the researchers calculated province-specific annual changes in mortality and life expectancy due to power generation. Using the province-specific approach, the researchers were able to narrow down the areas of highest priority, recommending upgrades to the existing power generating technologies in Shandong, Henan, and Sichuan provinces in China, and Uttar Pradesh state in India, due to their dominant contributions to the current health risks. "This study shows how modeling advances and expanding monitoring networks are strengthening the scientific basis for setting environmental priorities to protect the health of ordinary Chinese and Indian citizens," said Chris Nielsen, executive director of the Harvard-China Project and a co-author of the paper. "It also drives home just how much middle-income countries could benefit by transitioning to non-fossil electricity sources as they grow." | Pollution | 2018 |
October 17, 2018 | https://www.sciencedaily.com/releases/2018/10/181017141024.htm | Wind farms and reducing hurricane precipitation | With the United States having been pummeled over the last couple of years by several high-category, high-damage hurricanes, the University of Delaware's Cristina Archer recently published a paper that discovered an unexpected benefit of large-scale offshore wind farms: they lessen the precipitation caused by these devastating storms. | Archer said that while previous studies have shown that hypothetical offshore wind farms can harness the kinetic energy from hurricanes and lessen the effects of wind and storm surge, this study showed that offshore wind farms can also have an impact on precipitation. The paper demonstrates a clear decrease in precipitation for onshore locations that are downstream of a wind farm and an increase in precipitation in offshore areas that are upstream of or within the wind farms themselves. Archer, a professor in UD's College of Earth, Ocean and Environment and the Wind Power Associate Director of the Center for Carbon-free Power Integration (CCPI), worked on the recently published paper with Yang Pan and Chi Yan, both former doctoral students of Archer's at UD. The researchers used Hurricane Harvey as an example because it brought possibly the heaviest rain ever recorded in United States history to the Texas coast and caused unprecedented flooding. Unlike Hurricanes Katrina or Sandy, for which storm surge was one of the biggest problems, Hurricane Harvey flooded Houston because of the amount of rain that it dropped on the city. Archer explained that wind farms can help mitigate the precipitation by affecting two large factors that cause precipitation: wind convergence and divergence. The strong hurricane winds slow down when they hit wind turbines -- an effect known as convergence -- which enhances precipitation. "Think about convergence like when there's traffic on a freeway and everybody is going fast and then all of a sudden, there's an accident and everybody slows down. You get a convergence of cars that backs up because everybody slows down. That's the convergence upstream of the offshore wind farms," said Archer. This leads to increased precipitation because when the winds converge at a point on the surface, they have no other place to go except up, and that vertical motion brings more moisture into the atmosphere. Using the freeway-accident metaphor again, Archer said that divergence is similar to what happens when cars finally get past the accident: everybody speeds up. "Divergence is the opposite effect. It causes downward motion, attracting air coming down, which is drier and suppresses precipitation. I was wondering what if that would also happen when there is an offshore farm?" said Archer. In numerical simulations with a model domain set up to cover the coast of Texas and Louisiana, Archer found there would be regional convergence before the storm hit the hypothetical farms, which would "squeeze out" precipitation before getting close to the coast. Past the farms, there would be divergence, which suppressed the precipitation further. "By the time the air reaches the land, it's been squeezed out of a lot of moisture. We got a 30 percent reduction of the precipitation with the Harvey simulations," said Archer. "That means, potentially, if you have arrays of offshore turbines in an area where there are hurricanes, you will likely see a reduction in precipitation inland if the farm is there." The study used a number of hypothetical turbines ranging from a control case with none to a maximum of 74,619, a number that Archer stressed was out of the realm of possibility in the near future. The United States currently has just five offshore wind turbines, but in Europe, where the industry is more developed, there are offshore wind farms with more than 100 turbines, which Archer said she would consider a normal number for an offshore wind energy project. Still, with this study showing that offshore wind farms can benefit coastal communities not just by providing clean energy but also by reducing the effects of hurricanes, Archer said that she is hopeful the numbers will increase in the future. "The more wind farms you have, the more impact they will have on a hurricane," said Archer. "By the time a hurricane actually makes landfall, these arrays of turbines have been operating for days and days, extracting energy and moisture out of the storm. As a result, the storm will be weaker. Literally." | Pollution | 2018 |
October 17, 2018 | https://www.sciencedaily.com/releases/2018/10/181017140942.htm | Dandelion seeds reveal newly discovered form of natural flight | The extraordinary flying ability of dandelion seeds is possible thanks to a form of flight that has not been seen before in nature, research has revealed. | The discovery, which confirms the common plant as one of the natural world's best fliers, shows that movement of air around and within its parachute-shaped bundle of bristles enables seeds to travel great distances -- often a kilometre or more -- kept afloat entirely by wind power. Researchers from the University of Edinburgh carried out experiments to better understand why dandelion seeds fly so well, despite their parachute structure being largely made up of empty space. Their study revealed that a ring-shaped air bubble forms as air moves through the bristles, enhancing the drag that slows each seed's descent to the ground. This newly found form of air bubble -- which the scientists have named the separated vortex ring -- is physically detached from the bristles and is stabilised by air flowing through it. The amount of air flowing through, which is critical for keeping the bubble stable and directly above the seed in flight, is precisely controlled by the spacing of the bristles. This flight mechanism of the bristly parachute underpins the seeds' steady flight. It is four times more efficient than what is possible with conventional parachute design, according to the research. Researchers suggest that the dandelion's porous parachute might inspire the development of small-scale drones that require little or no power consumption. Such drones could be useful for remote sensing or air pollution monitoring. Dr Cathal Cummins, of the University of Edinburgh's Schools of Biological Sciences and Engineering, who led the recently published study, said: "Taking a closer look at the ingenious structures in nature -- like the dandelion's parachute -- can reveal novel insights. We found a natural solution for flight that minimises the material and energy costs, which can be applied to engineering of sustainable technology." | Pollution | 2018 |
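As standard textbook background (not taken from the study itself), the link between drag and descent speed can be made explicit: a seed of mass $m$ settles at the terminal velocity $v_t$ at which drag balances weight,

$$mg = \tfrac{1}{2}\,\rho\, v_t^{2}\, C_d A \quad\Longrightarrow\quad v_t = \sqrt{\frac{2mg}{\rho\, C_d A}},$$

where $\rho$ is air density and $C_d A$ is the effective drag area. Quadrupling the effective drag for the same mass halves the descent speed, which is roughly the sense in which a fourfold gain in drag efficiency keeps seeds aloft far longer.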
October 17, 2018 | https://www.sciencedaily.com/releases/2018/10/181017111027.htm | Moss rapidly detects, tracks air pollutants in real time | Moss, one of the world's oldest plants, is surprisingly in tune with the atmosphere around it. Now, in a study appearing in an ACS journal, researchers report that moss can serve as a rapid, real-time sensor of air pollution. | Plants have evolved the ability to sense light, touch, gravity and chemicals in the air and soil, allowing them to adapt and survive in changing environments. Thus, plants have been used in studies to assess the long-term damage caused by accumulated air pollution worldwide. However, this type of study requires skilled personnel and expensive instrumentation. Xingcai Qin, Nongjian Tao and colleagues wanted to develop an easier way to use moss, a particularly good indicator of sulfur dioxide pollution, as a rapid, real-time sensor. The researchers gathered wild moss and exposed it to various concentrations of sulfur dioxide in a chamber. Using a highly sensitive, inexpensive webcam, the research team found that moss leaves exposed to sulfur dioxide slightly shrank or curled and changed color from green to yellow. Some of these changes, analyzed with an imaging algorithm, began within 10 seconds of exposure to the pollutant. However, once the sulfur dioxide was removed from the chamber, the moss leaves gradually recovered. This result suggests that the plant, unlike traditional colorimetric sensors, can regenerate its chemical sensing capacity. The researchers conclude that combining remote webcams or drones with moss or other plant-based sensors could lead to cheaper, faster and more precise monitoring of the air for sulfur dioxide and other pollutants over vast regions. | Pollution | 2018 |
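The article does not describe the study's actual imaging algorithm, so the following is only a minimal sketch of the general idea -- tracking a green-to-yellow color shift in webcam frames -- with made-up thresholds and synthetic stand-in images:

```python
import numpy as np

def yellowing_index(frame: np.ndarray) -> float:
    """Mean green-to-red channel ratio over an RGB frame: healthy green moss
    scores well above 1; yellowing (red rising toward green) pulls it down."""
    red = frame[..., 0].astype(float) + 1e-6  # avoid division by zero
    green = frame[..., 1].astype(float)
    return float(np.mean(green / red))

def so2_response(baseline: float, current: float, drop: float = 0.15) -> bool:
    """Flag a possible SO2 response if the index falls by `drop` (made-up threshold)."""
    return current < baseline * (1.0 - drop)

# Synthetic 4x4 RGB stand-ins for webcam frames:
green_moss = np.zeros((4, 4, 3)); green_moss[..., 0] = 80;  green_moss[..., 1] = 200
yellow_moss = np.zeros((4, 4, 3)); yellow_moss[..., 0] = 170; yellow_moss[..., 1] = 180

baseline = yellowing_index(green_moss)
print(so2_response(baseline, yellowing_index(yellow_moss)))  # True
```

A real pipeline would also mask the moss region and smooth over several frames, but the ratio-plus-threshold core would look much like this.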
October 16, 2018 | https://www.sciencedaily.com/releases/2018/10/181016132032.htm | Climate models fail to simulate recent air-pressure changes over Greenland | Climatologists may be unable to accurately predict regional climate change over the North Atlantic because computer model simulations have failed to capture air pressure changes that have taken place in the Greenland region over the last three decades. | This deficiency may mean regional climate predictions for the UK and parts of Europe could be inaccurate, according to new research published today. Researchers compared real data with simulation data over a 30-year period and found that the simulations on average showed slightly decreasing air pressure in the Greenland region when, in fact, the real data showed a significant increase in high air pressure -- so-called 'Greenland blocking' -- during the summer months. These simulations are widely used by climate scientists worldwide as a basis for predicting future climate change. The findings raise serious questions about the accuracy of regional climate projections in the UK and neighbouring parts of Europe because meteorological conditions in those regions are closely linked to air-pressure changes over Greenland. Researchers warn that record wet summers in England and Wales, such as those experienced in 2007 and 2012, could become more frequent if Greenland air pressure continues to strengthen over the next few decades, but such a trend might not be predicted due to inaccurate regional climate simulations. The study, carried out by the University of Lincoln, UK, and the University of Liège in Belgium, also concluded that current models of melting on the Greenland Ice Sheet -- a vast body of ice which covers more than 80 per cent of the surface of Greenland -- may significantly underestimate the global sea-level rise expected by 2100. Professor Edward Hanna led the study with Dr Richard Hall, both from the University of Lincoln's School of Geography, and Dr Xavier Fettweis of the University of Liège. Professor Hanna said: "These differences between the estimates from the current climate models and observations suggest that the models cannot accurately represent recent conditions or predict future changes in Greenland climate. While there is natural variability in the climate system, we think that the recent rapid warming over Greenland since the early 1990s is not being fully simulated by the models, and that this misrepresentation could mean that future changes in atmospheric circulation and the jet stream over the wider North Atlantic region may not be properly simulated. Until now, no-one has systematically examined the projections to see how they represent the last few decades and future changes -- up to the year 2100 -- from a Greenland regional perspective. Previous work reported a tendency for global warming to result in a slightly more active jet stream in the atmosphere over the North Atlantic by 2100, but our results indicate we may actually see a somewhat weaker jet, at least in summer." The research is the first to systematically compare global climate model data and observational data of air pressure changes for the Greenland region. | Pollution | 2018 |
October 15, 2018 | https://www.sciencedaily.com/releases/2018/10/181015120907.htm | Applying auto industry's fuel-efficiency standards to agriculture could net billions | Adopting benchmarks similar to the fuel-efficiency standards used by the auto industry in the production of fertilizer could yield $5-8 billion in economic benefits for the U.S. corn sector alone, researchers have concluded in a new analysis. | The recently published work makes the case plainly: "A CAFE-style approach to reducing nitrogen pollution could provide powerful incentives for fertilizer manufacturers to learn where and how enhanced-efficiency fertilizers work best, and ultimately to develop more technically sophisticated nitrogen products tailored to specific crops, climates, and soil conditions," the authors write. Nitrogen pollution represents a significant environmental concern, scientists have concluded, and comes mainly from the inefficient use of fertilizer and manure on farms. Others have found policies to address this pollution source to be largely ineffective, mainly because of the challenges in changing farming practices and in monitoring and enforcement. "Moreover, the farm lobby is an extremely powerful political force in many countries," observe Kanter, a professor in NYU's Department of Environmental Studies, and Searchinger, a research scholar at Princeton's Woodrow Wilson School of Public and International Affairs. "Consequently, new policy options for addressing this environmental issue need to be explored." In their analysis, the researchers turned to U.S. fuel efficiency standards, which focus on the car industry instead of consumers, and evaluated whether the fertilizer industry could be a newfound regulatory focus of nitrogen policies. Specifically, they evaluated the potential impact of a requirement to steadily increase the proportion of enhanced-efficiency fertilizer sold alongside traditional fertilizer -- with the implicit aim of incentivizing technology development in this industry. India implemented such requirements in 2015. As with cars, enhanced-efficiency fertilizers (EEFs) could be more costly for growers; however, they could also potentially bolster profits because lower amounts are needed to grow crops -- in the same way fuel-efficient cars require less gasoline. EEFs, already produced by several major fertilizer companies, have been shown to reduce nitrogen losses and improve yields -- yet EEFs are currently used on only about 12 percent of U.S. corn cropland. In their analysis, the researchers adopted a case study -- the U.S. corn industry, which devotes approximately 33 million hectares of American cropland to corn production and has the highest nitrogen application rate of any major crop in the U.S. To estimate the impact of wider usage of EEFs, they examined, over a 10-year period and using different scenarios, how a CAFE-style standard mandating that EEFs compose higher percentages of nitrogen fertilizer sales -- e.g., 20 percent of sales by 2020 and 30 percent by 2030 -- would affect incomes from higher yields and increased fertilizer costs. Their results showed that higher efficiency standards, depending on the standard set, could produce net economic benefits of $5-8 billion per year by 2030; a rough version of this arithmetic is sketched below. These benefits include a combination of farmer and industry profits as well as environmental and human health gains from avoided nitrogen pollution, the researchers explain. Specifically, farm profits are due to slight boosts in yield, which offset the increased cost of EEF use, while industry profits arise from increased sales of EEFs, which have a higher profit margin. The researchers add that the impact of such standards for fertilizer could be felt more immediately than that for cars -- CAFE requirements apply only to newly sold cars, which have an average fleet turnover of nearly 16 years, while fertilizer is bought annually, so improved products will have an instantaneous effect. "A state could pioneer such an approach -- possibly California, which has already adopted ambitious climate goals across all sectors," Kanter and Searchinger propose. "Although the heterogeneity of agricultural, climatic, and political systems across the world requires a range of policy approaches to address the nitrogen pollution issue, industry-focused technology-forcing policies could be a promising option for reducing nitrogen losses, even as we push our planet to produce far more food." | Pollution | 2018 |
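As a toy illustration of the scale of the numbers involved: the study's actual economic model is not given in the article, so the per-hectare dollar values below are invented assumptions, chosen only so the output lands near the article's stated range.

```python
# Toy net-benefit arithmetic for a CAFE-style EEF standard.
# The 33 million ha corn area and the 30%-by-2030 EEF share come from the
# article; the per-hectare dollar values are assumptions for illustration.
corn_area_ha = 33e6            # US corn cropland (article)
eef_share_2030 = 0.30          # mandated EEF share of nitrogen sales (article scenario)

farm_and_industry_usd_per_ha = 100.0  # assumed: yield gains net of EEF premium, plus industry margin
health_env_usd_per_ha = 500.0         # assumed: avoided nitrogen-pollution damages

treated_ha = corn_area_ha * eef_share_2030
net_benefit_usd = treated_ha * (farm_and_industry_usd_per_ha + health_env_usd_per_ha)
print(f"~${net_benefit_usd / 1e9:.1f} billion per year")  # ~$5.9 billion with these toy numbers
```

The point of the sketch is only that modest per-hectare gains, applied across tens of millions of hectares, plausibly add up to billions per year.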
October 11, 2018 | https://www.sciencedaily.com/releases/2018/10/181011103659.htm | New technique for turning sunshine and water into hydrogen fuel | A research team led by DGIST Professor Jong-Sung Yu at the Department of Energy Science and Engineering has successfully developed a new catalyst synthesis method that can efficiently decompose water into oxygen and hydrogen using solar light. The method is expected to facilitate mass production of hydrogen because it is more efficient than existing photocatalytic approaches. | With environmental problems such as air pollution and global warming intensifying due to the increased use of fossil energy, hydrogen has recently drawn attention as an eco-friendly next-generation energy source. Accordingly, research is being conducted globally on how to produce hydrogen by decomposing water using solar light and photocatalysts. To overcome the limitation of photocatalysts that react only to ultraviolet light, researchers have doped photocatalysts with atoms such as nitrogen (N), sulfur (S), and phosphorus (P), or synthesized new photocatalysts, developing materials that react efficiently to visible light. With Professor Samuel Mao's team at UC Berkeley in the U.S., Professor Yu's research team developed a new hydrogen-doped photocatalyst by removing oxygen from the surface of a titanium dioxide photocatalyst and filling the vacancies with hydrogen through the decomposition of MgH2. MgH2 reduction can be applied not only to the titanium dioxide used in this research but also to oxides of other elements such as Zr, Zn, and Fe, and the method is applicable to various other fields such as photocatalysis and secondary batteries. The photocatalyst synthesized in this research has four times higher photoactivity than existing white titanium dioxide and is not difficult to manufacture, making it very advantageous for hydrogen mass production. Another characteristic of the photocatalyst developed by the research team is that it has a smaller band gap than the existing titanium dioxide photocatalysts used for hydrogen generation and maintains its fourfold-higher activity stably for over 70 days. Unlike existing photocatalysts, the new material also reacts to visible light, overcoming a key limitation of photocatalytic hydrogen production. With the new photocatalyst, the efficiency and stability of hydrogen production can both be dramatically improved, which will help popularize hydrogen energy in the near future. Professor Yu said, "The photocatalyst developed this time is a synthesis method with much better performance than the existing photocatalyst method used to produce hydrogen. It is a very simple method that will greatly help commercialize hydrogen energy. With follow-up research on improving the efficiency and economic feasibility of the photocatalyst, we will take the lead in creating stable, environmentally friendly hydrogen energy production that can replace fossil energy." | Pollution | 2018 |
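For reference, the overall reaction such a photocatalyst drives is standard water splitting (textbook chemistry, not specific to this paper), yielding two volumes of hydrogen per volume of oxygen:

$$2\,\mathrm{H_2O}\;\xrightarrow{\;h\nu,\ \mathrm{photocatalyst}\;}\;2\,\mathrm{H_2} + \mathrm{O_2}$$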
October 11, 2018 | https://www.sciencedaily.com/releases/2018/10/181011103646.htm | Questioning the link between pollution by magnetite particles and Alzheimer's disease | A 2016 study suggested that exposure to urban pollution involving magnetite particles played a role in the development of Alzheimer's disease. It was based on the hypothesis that magnetite particles would generate chemical reactions that could cause oxidative stress for neurons. CNRS researchers have now called this connection into question, showing that it is very unlikely that magnetite is involved in neuron degeneration. Their work has now been published. | Magnetite, which is one of the main iron ores, is very stable, even over the geological timescale. However, a scientific study that appeared in 2016 suggested that magnetite nanoparticles coming from atmospheric pollution could penetrate the brain by inhalation and, by binding to amyloid peptide, cause the neuron degeneration responsible for Alzheimer's disease. A 2007 study described how magnetite could generate harmful oxidation reactions, and the 2016 article presenting magnetite penetration into the brain and its binding to amyloid concluded that atmospheric pollution was therefore a probable cause of Alzheimer's disease. Researchers from the CNRS Laboratoire de Chimie de Coordination reproduced the experiments under the same temperature and pH conditions as physiological conditions and have shown that magnetite cannot bind to amyloid peptide or cause oxidation reactions. This result, which agrees with magnetite's very high stability, suggests that magnetite is inert in vivo and that it is therefore very unlikely to be involved in the neuron degeneration observed in Alzheimer's disease. This study should prompt a careful rereading of the experimental work on the hazardous nature of magnetite in the human brain. | Pollution | 2018 |
October 9, 2018 | https://www.sciencedaily.com/releases/2018/10/181009115102.htm | Clean Water Act dramatically cut pollution in US waterways | The 1972 Clean Water Act has driven significant improvements in U.S. water quality, according to the first comprehensive study of water pollution over the past several decades, by researchers at the University of California, Berkeley and Iowa State University. | The team analyzed data from 50 million water quality measurements collected at 240,000 monitoring sites throughout the U.S. between 1962 and 2001. Most of the 25 water pollution measures showed improvement, including an increase in dissolved oxygen concentrations and a decrease in fecal coliform bacteria. The share of rivers safe for fishing increased by twelve percent between 1972 and 2001. Despite clear improvements in water quality, almost all of 20 recent economic analyses estimate that the costs of the Clean Water Act consistently outweigh the benefits, the team found in work also coauthored with researchers from Cornell University. These numbers are at odds with those for other environmental regulations like the Clean Air Act, which show much higher benefits compared to costs. "Water pollution has declined dramatically, and the Clean Water Act contributed substantially to these declines," said Joseph Shapiro, an associate professor of agricultural and resource economics in the College of Natural Resources at UC Berkeley. "So we were shocked to find that the measured benefit numbers were so low compared to the costs." The researchers propose that these studies may be discounting certain benefits, including improvements to public health, or a reduction in industrial chemicals not included in current water quality testing. The analyses appear in a pair of studies, one published in the Quarterly Journal of Economics. Americans are worried about clean water. In Gallup polls, water pollution is consistently ranked as Americans' top environmental concern -- higher than air pollution and climate change. Since its inception, the Clean Water Act has imposed environmental regulations on individuals and industries that dump waste into waterways, and has led to $650 billion in expenditure due to grants the federal government provided municipalities to build sewage treatment plants or improve upon existing facilities. However, comprehensive analyses of water quality have been hindered by the sheer diversity of data sources, with many measurements coming from local agencies rather than national organizations. To perform their analysis, Shapiro and David Keiser, an assistant professor of economics at Iowa State University, had to compile data from three national water quality data repositories. They also tracked down the date and location of each municipal grant, an undertaking that required three Freedom of Information Act requests. "Air pollution and greenhouse gas measurements are typically automated and standard, while water pollution is more often a person going out in a boat and dipping something in the water," Shapiro said. "It was an incredibly data- and time-intensive project to get all of these water pollution measures together and then analyze them in a way that was comparable over time and space." In addition to the overall decrease in water pollution, the team found that water quality downstream of sewage treatment plants improved significantly after municipalities received grants to improve wastewater treatment. They also calculated that it costs approximately $1.5 million to make one mile of river fishable for one year. Adding up all the costs and benefits -- both monetary and non-monetary -- of a policy is one way to value its effectiveness. The costs of an environmental policy like the Clean Water Act can include direct expenditures, such as the $650 billion in spending due to grants to municipalities, and indirect investments, such as the costs to companies to improve wastewater treatment. Benefits can include increases in waterfront housing prices or decreases in the travel needed to find a good fishing or swimming spot. The researchers conducted their own cost-benefit analysis of the Clean Water Act municipal grants, and combined it with 19 other recent analyses carried out by hydrologists and the EPA. They found that, on average, the measured economic benefits of the legislation were less than half of the total costs. However, these numbers might not paint the whole picture, Shapiro said. "Many of these studies count little or no benefit of cleaning up rivers, lakes, and streams for human health because they assume that if we drink the water, it goes through a separate purification process, and no matter how dirty the water in the river is, it's not going to affect people's health," Shapiro said. "The recent controversy in Flint, MI seems contrary to that view. Similarly, drinking water treatment plants test for a few hundred different chemicals and U.S. industry produces closer to 70,000, and so it is possible there are chemicals that existing studies don't measure that have important consequences for well-being." Even if the costs outweigh the benefits, Shapiro stresses that Americans should not have to compromise their passion for clean water -- or give up on the Clean Water Act. "There are many ways to improve water quality, and it is quite plausible that some of them are excellent investments, and some of them are not great investments," Shapiro said. "So it is plausible both that it is important and valuable to improve water quality, and that some investments that the U.S. has made in recent years don't pass a benefit-cost test." | Pollution | 2018 |
October 9, 2018 | https://www.sciencedaily.com/releases/2018/10/181009102523.htm | Increase in plastics waste reaching remote South Atlantic islands | The amount of plastic washing up onto the shores of remote South Atlantic islands is 10 times greater than it was a decade ago, according to new research published today (8 October). | Scientists investigating plastics in seas surrounding the remote British Overseas Territories discovered they are invading these unique, biologically rich regions. This includes areas that are established or proposed Marine Protected Areas (MPAs). The study shows for the first time that plastic pollution on some remote South Atlantic beaches is approaching levels seen on industrialised North Atlantic coasts. During four research cruises on the BAS research ship RRS James Clark Ross between 2013 and 2018, a team of researchers from ten organisations sampled the water surface, water column and seabed, surveyed beaches and examined over 2000 animals across 26 different species. The amount of plastic reaching these remote regions has increased at all levels, from the shore to the seafloor. More than 90% of beached debris was plastic, and the volume of this debris is the highest recorded in the last decade. Lead author Dr David Barnes from British Antarctic Survey (BAS) explains: "Three decades ago these islands, which are some of the most remote on the planet, were near-pristine. Plastic waste has increased a hundred-fold in that time; it is now so common it reaches the seabed. We found it in plankton, throughout the food chain and up to top predators such as seabirds. The largest concentration of plastic was found on the beaches. In 2018 we recorded up to 300 items per metre of shoreline on East Falkland and St Helena -- this is ten times higher than recorded a decade ago. Understanding the scale of the problem is the first step towards helping business, industry and society tackle this global environmental issue." Plastic causes many problems including entanglement, poisoning and starving through ingestion. The arrival of non-indigenous species on floating plastic "rafts" has also been identified as a problem for these remote islands. This study highlights that the impacts of plastic pollution are affecting not only industrialised regions but also remote biodiverse areas, which are established or proposed MPAs. Andy Schofield, a biologist from the RSPB who was involved in this research, says: "These islands and the ocean around them are sentinels of our planet's health. It is heart-breaking watching Albatrosses trying to eat plastic thousands of miles from anywhere. This is a very big wake-up call. Inaction threatens not just endangered birds and whale sharks, but the ecosystems many islanders rely on for food supply and health." | Pollution | 2018 |
October 8, 2018 | https://www.sciencedaily.com/releases/2018/10/181008114616.htm | When yesterday's agriculture feeds today's water pollution | A study led by researchers at Université de Montréal quantifies for the first time the maximum amount of nutrients -- specifically, phosphorus -- that can accumulate in a watershed before additional pollution is discharged into downriver ecosystems. | That average threshold amount is 2.1 tonnes per square kilometre of land, the researchers estimate in their newly published study. This amount is shockingly low, the researchers say; given current nutrient application rates in most agricultural watersheds around the world, tipping points in some cases could be reached in less than a decade (a rough sketch of this arithmetic appears below). The study was led by Jean-Olivier Goyette, a doctoral student in biology at UdeM, and supervised by UdeM aquatic ecosystem ecologist Roxane Maranger in collaboration with sustainability scientist Elena Bennett at McGill University. Phosphorus, an element in fertilizer, is essential to the growth of plant food. But the mineral is also harmful when overused. When it gets into surface water, it can lead to excessive plant growth in lakes and rivers and proliferation of toxic algae, harmful to human and animal health. Focusing on 23 watersheds feeding the St. Lawrence River in Quebec, the researchers reconstructed historic land-use practices in order to calculate how much phosphorus has accumulated on the land over the past century. The two main sources of phosphorus to watersheds, the land adjacent to tributaries, are agriculture (fertilizers and animal manure) and the human population (through food needs and sewage). Using Quebec government data, the researchers matched the estimated accumulation with phosphorus concentrations measured in the water for the last 26 years. Since the watersheds they studied had different histories -- some had been used intensively for agriculture for decades whereas others were forested and pristine -- this method allowed the researchers to establish a gradient of different phosphorus accumulations among sites. In so doing, they were able to see at what point the watershed "tipped" or reached a threshold and began to leak considerably more phosphorus into the water. "Think of the land as a sponge," Maranger said. "After a while, sponges that absorb too much water will leak. In the case of phosphorus, the landscape absorbs it year after year after year, and after a while, its retention capacity is reduced. At that point historical phosphorus inputs contribute more to what reaches our water." Until now, no-one had been able to put a number to the amount of accumulated phosphorus at the watershed scale that's needed to reach a tipping point in terms of accelerating the amount of the mineral flowing into the aquatic ecosystem. "This is a very important finding," Bennett said. "It takes our farm-scale knowledge of fertilizers and pollution and scales it up to understand how whole watersheds respond within a historical context." Agriculture on a mass scale began in Quebec only in the 1950s, but some of the province's more historical agricultural watersheds had already passed the tipping point by the 1920s, the study found. Even if phosphorus inputs ceased immediately, eliminating the accumulated phosphorus in saturated Quebec watersheds would take between 100 and 2,000 years, the researchers estimate. In some countries, including China, Canada, and the US, phosphorus is so heavily used now that the saturation point is reached in as little as five years. "Nutrient management strategies developed using novel creative approaches ... are urgently required for the long-term sustainability of water resources," the researchers urge in their study. "One possible mitigating measure would be to do what is already being done in some European countries: instead of adding more and more to help plants grow, phosphorus already stored in soils can be accessed using new practices and approaches," Goyette said. "Furthermore, phosphorus can be recycled and reused as fertilizer rather than accessing more of the raw mined material." The dilemma is this: humans need to eat but also need clean water, yet growing food requires phosphorus that pollutes the water when too much leaves the watershed and contaminates adjacent aquatic ecosystems. "Are some of our more extreme (agricultural) watersheds impossible to repair?" Maranger asked. "I can't answer that. It's a societal issue and there are solutions. We should never despair, but it's a wicked problem." | Pollution | 2018 |
October 4, 2018 | https://www.sciencedaily.com/releases/2018/10/181004110021.htm | Gas stations vent far more toxic fumes than previously thought | A study led by environmental health scientists at Columbia University Mailman School of Public Health examined the release of vapors from gas station vent pipes, finding emissions were 10 times higher than estimates used in setback regulations that determine how close schools, playgrounds, and parks can be situated to the facilities. The findings have been published. | Gasoline vapors contain a number of toxic chemicals, notably benzene, a carcinogen. The researchers attached gas flow meters to venting pipes at two large gas stations in the Midwest and Northwest and took measurements over a three-week period. They report average daily evaporative losses of 7 and 3 gallons of liquid gasoline, respectively, or 1.4 pounds and 1.7 pounds per 1,000 gallons dispensed at the pump. By comparison, the California Air Pollution Control Officers Association (CAPCOA) used an estimate of 0.11 pounds per 1,000 gallons (the gap is quantified in the short calculation below). Based on CAPCOA emission estimates, the California Air Resources Board (CARB) determined their setback regulation of 300 feet (91 meters) from large gas stations. Similar laws exist in many, but not all, states and localities. In urban areas like New York City, some gas stations are located directly adjacent to apartment buildings. The study also simulated how the fuel vapor was carried in the air to assess the potential for short- and medium-term benzene exposures, comparing the measurements to three established thresholds. The California Office of Environmental Health Hazard Assessment one-hour Reference Exposure Level (REL) for benzene -- defined as a continuous hour of exposure to the chemical -- was exceeded at both gas stations at distances greater than 50 meters. At the Midwest gas station, the REL was exceeded on two different days at distances greater than 50 meters, and once as far as 160 meters. The Agency for Toxic Substances and Disease Registry's Minimal Risk Level (MRL) for benzene exposure over a period between two weeks and a year was exceeded within 7 or 8 meters of the two gas stations. A less stringent measure used for short-term exposures of first responders, the American Industrial Hygiene Association's Emergency Response Planning Guidelines (ERPG), was not exceeded. "We found evidence that much more benzene is released by gas stations than previously thought. In addition, even during a relatively short study period, we saw a number of instances in which people could be exposed to the chemical at locations beyond the setback distance of 300 feet," said first author Markus Hilpert, PhD, associate professor of Environmental Health Sciences at the Columbia Mailman School. "Officials should reconsider their regulations based on these data, with particular attention to the possibility of short spikes in emissions resulting from regular operations or improper procedures related to fuel deliveries and the use of pollution prevention technology." In previous work, Hilpert and colleagues documented the release of gasoline as fuel is stored and transferred between tanker trucks, storage tanks, and vehicle tanks, and how these spills can contaminate the surrounding environment. Next, the researchers will explore additional short-term measures of vapor spread to determine the bounds of safe setbacks. Co-authors of the new study include Ana Maria Rule at Johns Hopkins, Bernat Adria-Mora, formerly at Columbia, and Tedmund Tiberi at ARID Technologies, Inc. In a competing interest statement, the authors note that Tiberi directs a company that develops technologies for reducing fuel emissions from gasoline-handling operations. The research is supported by a grant from the National Institutes of Health (ES009089). | Pollution | 2018 |
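The size of the discrepancy follows directly from the figures quoted above; every number in this short calculation is taken from the article itself:

```python
# All figures below come from the article.
measured_lb_per_kgal = {"Midwest": 1.4, "Northwest": 1.7}  # per 1,000 gallons dispensed
capcoa_lb_per_kgal = 0.11  # CAPCOA emission factor behind the 300-ft setback

for site, rate in measured_lb_per_kgal.items():
    print(f"{site}: {rate / capcoa_lb_per_kgal:.0f}x the CAPCOA estimate")
# Midwest: 13x, Northwest: 15x -- the same order as the article's "10 times higher"
```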
October 4, 2018 | https://www.sciencedaily.com/releases/2018/10/181004085330.htm | What you can't see can hurt you | You can't see nasty microscopic air pollutants in your home, but what if you could? | Engineers from the University of Utah's School of Computing conducted a study to determine whether homeowners change the way they live if they can visualize the air quality in their house. It turns out, their behavior changes a lot. Their study was published this month. "The idea behind this study was to help people understand something about this invisible air quality in their home," says University of Utah School of Computing assistant professor Jason Wiese, who was a lead author of the paper along with U School of Computing doctoral student Jimmy Moore and School of Computing associate professor Miriah Meyer. During the day, the air pollution inside your home can be worse than outside due to activities such as vacuuming, cooking, dusting or running the clothes dryer. The results can cause health problems, especially for the young and elderly with asthma. University of Utah engineers from both the School of Computing and the Department of Electrical and Computer Engineering built a series of portable air quality monitors with Wi-Fi and connected them to a university server. Three sensors were placed in each of six homes in Salt Lake and Utah counties for four to 11 months in 2017 and 2018. Two were placed in different high-traffic areas of the house, such as the kitchen or a bedroom, and one outside on or near the porch. Each minute, each sensor measured the air for PM2.5 (a measurement of tiny particles or droplets in the air that are 2.5 microns or less in width) and sent the data to the server (this sampling-and-upload pattern is sketched in the code below). The data could then be viewed by the homeowner on an Amazon tablet that displayed the air pollution measurements in each room as a line graph over a 24-hour period. Participants in the study could see up to 30 days of air pollution data. To help identify when there might be spikes in the air pollution, homeowners were given a voice-activated Google Home speaker so they could tell the server to label a particular moment in time when the air quality was being measured, such as when a person was cooking or vacuuming. Participants also were sent an SMS text message warning them whenever the indoor air quality changed rapidly. During the study, researchers discovered some interesting trends from their system of sensors, which they called MAAV (Measure Air quality, Annotate data streams, and Visualize real-time PM2.5 levels). One homeowner discovered that the air pollution in her home spiked when she cooked with olive oil. That motivated her to find other oils that produced less smoke at the same cooking temperature. Another homeowner would vacuum and clean the house just before a friend with allergies dropped by, to try to clear the air of dust. But what she found out through the MAAV system is that she actually made the air much worse, because she kicked up more pollutants with her vacuuming and dusting. Realizing this, she started cleaning the house much earlier before the friend would visit. Participants would open windows more when the air was bad, or compare measurements between rooms and avoid the rooms with more pollution. "Without this kind of system, you have no idea about how bad the air is in your home," Wiese says. "There are a whole range of things you can't see and can't detect. That means you have to collect the data with the sensor and show it to the individual in an accessible, useful way." Researchers also learned that the circumstances that made the air pollution worse differed in each home. Vacuuming in the home, for example, would cause different reactions in air quality. They also learned that if homeowners could visualize the air quality in their home, they always stayed on top of labeling and looking at the data. Wiese says no known manufacturers make air quality systems for the home that allow residents to visualize and label the air quality in this way, but he hopes their research can spur more innovation. The study involves engineering in collaboration with other University of Utah scientists, including biomedical informatics and clinical asthma researchers. It was funded as part of a larger National Institutes of Health program known as Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS), launched in 2015 to develop sensor-based health monitoring systems for measuring environmental, physiological and behavioral factors in pediatric studies of asthma and other chronic diseases. Research reported in this publication was funded by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health under Award Number U54EB021973. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. | Pollution | 2018 |
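The MAAV server protocol is not described in the article, so the following is only a minimal sketch of the one-reading-per-sensor-per-minute pattern it reports; the endpoint URL, JSON fields, and sensor driver are all hypothetical stand-ins:

```python
import json
import time
import urllib.request

SERVER = "http://example.edu/maav/ingest"  # hypothetical endpoint; the real API is not published

def read_pm25() -> float:
    """Stand-in for the actual particle-sensor driver (returns micrograms/m3)."""
    return 12.0

def post_reading(room: str, pm25: float) -> None:
    """Upload one PM2.5 reading, mirroring MAAV's one-sample-per-minute design."""
    payload = json.dumps({"room": room, "pm25": pm25, "ts": time.time()}).encode()
    req = urllib.request.Request(SERVER, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)  # a real deployment would retry on failure

while True:
    post_reading("kitchen", read_pm25())
    time.sleep(60)  # MAAV sensors sampled once per minute
```

The voice-annotation and SMS-alert features described above would sit server-side, attaching labels and change-detection to this stream of timestamped readings.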
October 2, 2018 | https://www.sciencedaily.com/releases/2018/10/181002155235.htm | Wildfire aerosols remain longer in atmosphere than expected | Rising 2,225 meters into the air on an island in the Azores archipelago, Pico Mountain Observatory is an ideal place to study aerosols -- particles or liquids suspended in gases -- that have traveled great distances in the troposphere. | The troposphere is the portion of the atmosphere from the ground to about 10 kilometers in the air. Nearly all of the atmosphere's water vapor and aerosol exist in the troposphere, and this is also where weather occurs. The Pico Observatory rises above the first layer of clouds in the troposphere, known as the atmospheric marine boundary layer. At that boundary the temperature drops rapidly, and relatively high humidity decreases as cooling air forces water to condense into cloud droplets. Pico is often ringed in clouds, with its summit climbing above them. This feature allows scientists to study the aerosols above the boundary layer, including a set of three samples a research team at Michigan Technological University recently observed that challenges the way atmospheric scientists think about aerosol aging. In "Molecular and physical characteristics of aerosol at a remote free troposphere site: Implications for atmospheric aging," published Tuesday, October 2, the team reports its findings. "Previously, brown carbon was expected to become mostly depleted within approximately 24 hours, but our results suggested the presence of significant brown carbon roughly a week downwind of its initial wildfire source in northern Quebec," says Simeon Schum, a chemistry doctoral candidate at Michigan Tech and the paper's first author. "If these aerosols have a longer lifetime than expected, then they may contribute more to light absorption and warming than expected, which could have implications for climate predictions." This work builds on a previous paper published in the same journal, "Molecular characterization of free tropospheric aerosol collected at the Pico Mountain Observatory: a case study with a long-range transported biomass burning plume." In order to determine where the molecules in aerosols originate, the team, led by the article's corresponding author and associate professor of chemistry, Lynn Mazzoleni, used a Fourier Transform-Ion Cyclotron Resonance mass spectrometer, located at Woods Hole Oceanographic Institution, to analyze the chemical species of molecules within the samples. Aerosols, depending on their chemical and molecular composition, can have both direct and indirect effects on the climate. This is because some aerosols only scatter light, while others also absorb light, and others take up water vapor, changing cloud properties. Aerosols play a cooling role in the atmosphere, but there are great uncertainties about the extent of forcing and climate effects. Understanding how specific aerosols oxidize -- break down -- in the atmosphere is one piece in the puzzle of understanding how Earth's climate changes. Aerosols take on a variety of consistencies, called viscosities, depending on their composition and their surroundings. Some have a consistency similar to olive oil or honey, and these tend to oxidize more rapidly than more solidified aerosol particles, which can become pitch-like, or even marble-like. The three samples analyzed by the Michigan Tech team are named PMO-1, PMO-2 and PMO-3. PMO-1 and PMO-3 traveled to Pico in the free troposphere, while PMO-2 traveled to Pico in the boundary layer. Aerosols are less likely to occur in the free troposphere than in the boundary layer, but pyro-convection from wildfires can lift the particles higher up in the air. Though PMO-2 had been in the atmosphere for only two to three days, it had oxidized more than PMO-1 and PMO-3, which had been in the atmosphere roughly seven days and were estimated to be glassy in consistency. "We were puzzled by the substantial difference between PMO-2 compared to PMO-1 and PMO-3. So, we asked ourselves why we would see aerosols at the station which were not very oxidized after they had been in the atmosphere for a week," says Mazzoleni. "Typically, if you put something into the atmosphere, which is an oxidizing environment, for seven to 10 days, it should be very oxidized, but we weren't seeing that." Schum said the research team hypothesized that the first and third samples had oxidized more slowly because of the free-tropospheric transport path of the aerosol after being injected to that level by wildfires in Quebec. Such a path toward Pico meant lower average temperature and humidity, causing the particles to become more solid and therefore less susceptible to oxidative destruction processes in the atmosphere. That a particle would oxidize at a slower rate, despite more time in the atmosphere, because of its physical state provides new insight toward better understanding how particles affect the climate. "Wildfires are such a huge source of aerosol in the atmosphere, with a combination of cooling and warming properties, that understanding the delicate balance can have profound consequences on how accurately we can predict future changes," says Claudio Mazzoleni, professor of physics and one of the authors of the paper. As wildfires increase in size and frequency in the world's arid regions, more aerosol particles could be injected into the free troposphere, where they are slower to oxidize, contributing another important consideration to the study of atmospheric science and climate change. | Pollution | 2018 |
October 1, 2018 | https://www.sciencedaily.com/releases/2018/10/181001154050.htm | 130-year-old brain coral reveals encouraging news for open ocean | When nitrogen-based fertilizers flow into water bodies, the result can be deadly for marine life near shore, but what is the effect of nitrogen pollution far out in the open ocean? | A 130-year-old brain coral has provided the answer, at least for the North Atlantic Ocean off the East Coast of the United States. By measuring the nitrogen in the coral's skeleton, a team of researchers led by Princeton University found significantly less nitrogen pollution than previously estimated. The study was published online. "To our surprise, we did not see evidence of increased nitrogen pollution in the North Atlantic Ocean over the past several decades," said Xingchen (Tony) Wang, who conducted the work as part of his doctorate in geosciences at Princeton and is now a postdoctoral scholar at the California Institute of Technology. Earlier work by the Princeton-based team, however, did find elevated nitrogen pollution in another open-ocean site in the South China Sea, coinciding with the dramatic increase in coal production and fertilizer usage in China over the past two decades. In the new study, the researchers looked at coral skeleton samples collected in the open ocean about 620 miles east of the North American continent near the island of Bermuda, a region thought to be strongly influenced by airborne nitrogen released from U.S. mainland sources such as vehicle exhaust and power plants. Although the team found no evidence that human-made nitrogen was on the rise, the researchers noted variations in nitrogen that corresponded to levels expected from a natural climate phenomenon called the North Atlantic Oscillation, Wang said. The result is in contrast to previously published computer models that predicted a significant increase in human-made nitrogen pollution in the North Atlantic. The work may indicate that U.S. pollution control measures are successfully limiting the amount of human-generated nitrogen emissions that enter the ocean. "Our finding has important implications for the future of human nitrogen impact on the North Atlantic Ocean," said Wang. "Largely due to advances in pollution technology, human nitrogen emissions from the U.S. have held steady or even declined in recent decades. If emissions continue at this level, our results imply that the open North Atlantic will remain minimally affected by nitrogen pollution in coming decades." Nitrogen, when in its biologically available form and supplied in excess, can cause overgrowth of plants and algae and lead to severe ecosystem harm, including marine "dead zones" that form when microorganisms consume all the oxygen in the water, leaving none for fish. Fertilizer production and fossil fuel burning have greatly increased the production of biologically available, or "fixed," nitrogen since the early 20th century. When emitted to the atmosphere, fixed nitrogen can influence the ocean far from land. However, the impacts on the ocean are difficult to study because of the challenges involved in making long-term observations in the open ocean. Corals can help. Stony or "Scleractinian" corals are long-lived organisms that build a calcium carbonate skeleton as they grow. The corals soak up nitrogen from the surrounding water and deposit a small portion in their skeletons. The skeletons provide a natural record of nitrogen emissions. To distinguish human-made, or anthropogenic, nitrogen from the naturally occurring kind, the researchers take advantage of the fact that nitrogen comes in two weights. The heavier version, known as 15N, contains one more neutron than the lighter 14N. Anthropogenic nitrogen has a lower ratio of 15N to 14N than does the nitrogen in the ocean (this ratio is conventionally reported in the delta notation shown below). "It has long been my dream to use the nitrogen in coral skeletons to reconstruct past environmental changes; thanks to Tony, we are now doing it," said Daniel Sigman, the Dusenbury Professor of Geological and Geophysical Sciences at Princeton. While a graduate student at Princeton, Wang developed a sensitive and precise method to measure the 15N-to-14N ratio using a mass spectrometer, which is like a bathroom scale for weighing molecules. To collect coral samples in the North Atlantic Ocean, Wang and Anne Cohen, an associate scientist in geology and geophysics at Woods Hole Oceanographic Institution, led a team in 2014 to Bermuda, where the investigators removed a 23-inch-long core from a living brain coral. In addition to Wang, Cohen and Sigman, the research featured contributions from Princeton graduate student in geosciences Victoria Luu, Haojia Ren of National Taiwan University, Zhan Su of Caltech, and Gerald Haug of the Max Planck Institute for Chemistry. This work was supported by the National Science Foundation and Princeton University's Grand Challenges Program. The study, "Natural forcing of the North Atlantic nitrogen cycle in the Anthropocene," by Xingchen Tony Wang, Anne Cohen, Victoria Luu, Haojia Ren, Zhan Su, Gerald Haug and Daniel Sigman, was published online the week of October 1, 2018. | Pollution | 2018 |
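For reference, the standard isotope-geochemistry convention behind such measurements (textbook notation, not spelled out in the article) expresses the 15N-to-14N ratio relative to atmospheric N2 in per-mil units, so anthropogenic nitrogen's lower ratio shows up as a lower delta value:

$$\delta^{15}\mathrm{N} = \left(\frac{\left({}^{15}\mathrm{N}/{}^{14}\mathrm{N}\right)_{\mathrm{sample}}}{\left({}^{15}\mathrm{N}/{}^{14}\mathrm{N}\right)_{\mathrm{air}}} - 1\right)\times 1000\ \text{‰}$$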
September 28, 2018 | https://www.sciencedaily.com/releases/2018/09/180928104500.htm | Green mango peel: A slick solution for oil-contaminated soils | Nanoparticles derived from green mango peel could be the key to remediating oil sludge in contaminated soil, according to new research from the University of South Australia. | For the petroleum industry, remediating oil sludge is a costly and ongoing challenge, particularly when 3-7 per cent of processed oil is irreversibly lost as oily or sludge waste. Lead researcher Dr Biruck Desalegn of UniSA says that, without treatment, oil-contaminated soil presents a massive risk to ecosystems and the environment. "Last year, global oil production reached a new record of 92.6 million barrels per day, but despite improvements in control technologies, oil refineries unavoidably continue to generate large volumes of oil sludge," Dr Desalegn says. "Oil contamination can present cytotoxic, mutagenic and potentially carcinogenic conditions for all living things, including people. What's more, the toxicity and physical properties of oil change over time, which means the process of weathering can expose new, and evolved, toxins." The new nanoparticles, synthesized from green mango peel extract and iron chloride, provide a novel and effective treatment for oil-contaminated soil. They work by breaking down toxins in oil sludge through chemical oxidation, leaving behind only the decontaminated materials and dissolved iron. Dr Desalegn says the new plant-based nanoparticles can successfully decontaminate oil-polluted soil, removing more than 90 per cent of toxins. "Plant extracts are increasingly used to create nanomaterials," Dr Desalegn says. "In this study, we experimented with mango peel to create zerovalent iron nanoparticles, which have the ability to break down various organic contaminants. With mango peel being such a rich source of bioactive compounds, it made sense that zerovalent iron made from mango peel might be more potent in the oxidation process. As we discovered, the mango peel iron nanoparticles worked extremely well, even outperforming a chemically synthesized counterpart by removing more of the contaminants in the oil sludge." Dr Desalegn says this discovery presents a sustainable, green solution to address the significant pollution generated by the world's oil production. "Ever since the devastation of the 2010 Deepwater Horizon oil spill, the petroleum industry has been acutely aware of its responsibilities for safe and sustainable production processes," Dr Desalegn says. "Our research uses the waste part of the mango -- the peel -- to present an affordable, sustainable and environmentally friendly treatment solution for oil sludge. And while the world continues to be economically and politically reliant on oil industries as a source of energy, working to remediate the impact of oil pollution will remain a serious and persistent issue." This research was conducted as part of Dr Desalegn's PhD, with support from CRC CARE. | Pollution | 2018 |
September 27, 2018 | https://www.sciencedaily.com/releases/2018/09/180927145304.htm | Large stretches of coral reefs can be rehabilitated | Even after being severely damaged by blast fishing and coral mining, coral reefs can be rehabilitated over large scales using a relatively inexpensive technique, according to a study led by the University of California, Davis, in partnership with Mars Symbioscience. | For the study, published this week, researchers installed small hexagonal structures known as "spiders" across badly damaged reef areas in Indonesia. Between 2013 and 2015, researchers attached coral fragments to the structures, which also stabilized rubble and allowed water to flow through freely. Live coral cover on the structures increased from less than 10 percent to more than 60 percent, exceeding what has been reported for reefs in many other areas of the Coral Triangle, at a cost of about $25 per square meter. "Coral reef rehabilitation and restoration efforts are rapidly increasing around the world, but there are few large-scale examples of successful projects," said corresponding author Christine Sur, who was a UC Davis graduate student at the time of the study. "Our study demonstrates a cost-effective, scalable method that can inform other coral reef restoration efforts aimed at reducing the global decline of these valuable and unique ecosystems." Surprisingly, while massive coral bleaching decimated other parts of the world between 2014 and 2016, bleaching in the rehabilitation area was less than 5 percent, despite warm water conditions known to stress corals. The study is one of the last publications led by Susan Williams, a professor and marine biologist with the UC Davis Bodega Marine Laboratory and Department of Evolution and Ecology who passed away in April 2018. "Collaboration played a key role in the study's outcome," said corresponding author Jordan Hollarsmith, a doctoral candidate in the UC Davis Graduate Group in Ecology at Bodega Marine Laboratory. "Dr. Williams' dedication to building true partnerships between industry, local scientists and the people who live by the reefs she worked to rebuild was critical to this project's success." Mars Symbioscience, a business segment of Mars, Incorporated, initiated the project in 2013 in collaboration with local residents on the islands. "This research is an important step in demonstrating how coral can be restored to a higher degree of cover quite quickly and at a relatively low cost," said Frank Mars, vice president of Mars Sustainable Solutions. "Healthy coral reef ecosystems provide natural coastal protection and are the foundation for many local fisheries, as well as jobs for tourism. They also play a critical role in the Mars supply chain as they provide food security and work for the families and communities we rely on to grow the raw materials we use in our brands around the world. For Mars, helping restore coral reefs is not only a business issue, it's also the right thing to do to ensure the planet, people, and communities that rely on them are healthy and thriving." Coral reefs are declining worldwide. Reversing their decline will require fully addressing climate change and other human impacts, the study said. For instance, illegal fishing, lack of island sanitation systems, threatened seagrass communities and marine debris such as plastic pollution are common issues in Indonesia and many tropical environments. People living in small island communities in this region also have few alternatives to fishing livelihoods and often lack access to education about the ocean environment. In the meantime, the "spider" technique and restoration projects offer a way to rehabilitate large swaths of coral reefs and the communities that depend on them, giving the reefs a chance to adapt or acclimate to worsening ocean conditions. | Pollution | 2,018
September 26, 2018 | https://www.sciencedaily.com/releases/2018/09/180926082740.htm | Microplastics found deep in sand where turtles nest | Microplastics have been found deep in the sand on beaches where sea turtles lay their eggs. | University of Exeter scientists found an average of 5,300 particles of plastic per cubic metre at depths of 60cm (2ft) on beaches in Cyprus used by green turtles and loggerheads. At the surface, up to 130,000 fragments of plastic were found per cubic metre -- the second-worst level ever recorded on a beach (the worst was in Guangdong, South China). Researchers say that if conditions worsen, such pollution could eventually begin to affect hatching success and even the ratio of male and female turtle hatchlings. "We sampled 17 nesting sites for loggerhead and green turtles and found microplastics at all beaches and all depths," said Dr Emily Duncan, of the Centre for Ecology and Conservation on the University of Exeter's Penryn Campus in Cornwall. "Microplastics have different physical properties to natural sediments, so high levels could change the conditions at hatching sites, with possible effects on turtle breeding." "For example, the temperature at which the egg incubates affects the sex of the hatchling -- with females more likely in warmer conditions." "More research is needed to investigate what effects microplastics have on turtle eggs." Microplastics -- defined as less than 5mm in diameter -- come from numerous sources including discarded plastic items that have broken apart, microbeads from cosmetics and microfibres from clothes. Of the microplastics categorised in this research, nurdles (pellets used in the production of plastic products) and hard fragments broken from larger items were the most common. "Unlike the beaches in China where the highest levels of microplastics have been recorded, these beaches in Cyprus are located far from industrial practices and aren't visited by large numbers of people," said Professor Brendan Godley, leader of the University of Exeter's marine strategy. "Therefore it seems that microplastics are arriving on ocean currents. In this case, our analysis suggests most of it came from the eastern Mediterranean basin. This is also true of the large plastic items found on the beaches in Cyprus in large numbers." The findings support the theory that beaches act as a "sink" for marine microplastics, becoming key areas for contamination. The research is based on 1,209 sediment samples taken from 17 turtle nesting beaches in northern Cyprus. | Pollution | 2,018
September 25, 2018 | https://www.sciencedaily.com/releases/2018/09/180925140405.htm | Weathering rates for mined lands exponentially higher than unmined sites | Mountaintop removal, a coal-mining technique used in much of Central Appalachia, is an extreme form of surface mining that excavates ridges as deep as 600 feet -- twice the length of a football field -- and buries adjacent valleys and streams in bedrock and coal residue. This mining activity has long been known to have negative impacts on water quality downstream. | A new study led by watershed scientist Matthew Ross at Colorado State University found that many of these water quality impacts are caused by a dramatic increase in the chemical weathering rates of mined landscapes, which are melting away bedrock up to 45 times faster than unmined areas. In addition, the weathering has global consequences for the cycling of sulfur, which is a key nutrient for all life forms. The findings show that when people move large quantities of bedrock and soil to build cities or to extract resources, they can completely alter and accelerate the natural weathering processes on land, which can impact water quality downstream. Ross, an assistant professor in the Department of Ecosystem Science and Sustainability, described the chemical weathering rates as among the highest ever observed, compared with landscapes across the globe. The study is titled "Pyrite oxidation drives exceptionally high weathering rates and geologic CO2 release." This increased weathering -- like many mine-related impacts -- starts when iron sulfide, or pyrite, a mineral also known as fool's gold that is often found in coal, is exposed to air. This creates sulfuric acid, making water draining from the mine extremely acidic and caustic. To neutralize the acid, in much of Central Appalachia the pyrite-bearing rock is intentionally surrounded by and mixed with carbonate rocks. While this limits acid-mine drainage problems, these acid-producing and -neutralizing reactions create ideal conditions for rapid chemical weathering of bedrock, with surprising implications for the geologic carbon cycling of these landscapes. In most areas that experience chemical weathering, carbon dioxide dissolves in water to form carbonic acid, a weak weathering agent. When carbonic acid reacts with silicates or rock-forming minerals, carbon dioxide is permanently locked into the bedrock, balancing the carbon cycle over millions of years. In unmined landscapes, this process provides a slow but inevitable sink for atmospheric carbon dioxide, or CO2. In mined landscapes with abundant sulfuric acid, the weathering reactions no longer rely on carbonic acid, and the potential for geologic carbon sequestration is eliminated. Instead, the sulfuric acid weathers out acid-neutralizing carbonates, which releases carbon dioxide into the atmosphere. Researchers estimate that, long after mining in these areas has stopped, between 20 percent and 90 percent of the carbon absorbed by plants on the surface will be cancelled out by the release of rock carbon to the atmosphere. "Because this weathering is happening so fast and it is powered by sulfuric acid, it creates a landscape that is a source for carbon dioxide," Ross said. "You're rapidly dissolving away the landscape and releasing a bunch of rock carbon." This regional impact also has global consequences for the cycling of sulfur, an element that is important for all life forms. While mountaintop mining operations in Appalachia cover a small portion, 0.006 percent, of the land area on Earth, they may contribute as much as 7 percent of the total global delivery of sulfur from land to ocean. This research, funded by the National Science Foundation, is part of an ongoing project led by Ross, who recently joined the faculty at the Warner College of Natural Resources. | Pollution | 2,018
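The contrast between the two weathering pathways described above can be written out explicitly. These are standard geochemistry reactions used to illustrate the mechanism, not equations quoted from the paper:

```latex
% Carbonic-acid weathering of a calcium silicate: atmospheric CO2 is
% drawn down into dissolved bicarbonate (a long-term CO2 sink).
\mathrm{CaSiO_3 + 2\,CO_2 + 3\,H_2O \longrightarrow
        Ca^{2+} + 2\,HCO_3^{-} + H_4SiO_4}

% Sulfuric-acid weathering of a carbonate: the acid-neutralizing rock
% dissolves and its carbon is released as CO2 (a net CO2 source).
\mathrm{CaCO_3 + H_2SO_4 \longrightarrow
        Ca^{2+} + SO_4^{2-} + H_2O + CO_2}
```

In the first reaction the weathering agent is CO2 itself, so dissolution consumes it; in the second, pyrite-derived sulfuric acid does the dissolving and the carbonate's carbon leaves as CO2, which is why mined landscapes can flip from carbon sink to carbon source.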
September 25, 2018 | https://www.sciencedaily.com/releases/2018/09/180925115233.htm | Lifestyle intervention may mitigate weight gain due to ubiquitous contaminant | A new study finds that perfluoroalkyl and polyfluoroalkyl substances (PFASs) are associated with increases in weight, but exercise and diet may reduce the obesogenic effects of these environmental contaminants. The study, entitled Association of Perfluoroalkyl and Polyfluoroalkyl Substances with Adiposity, led by researchers from the Harvard Pilgrim Health Care Institute and the Harvard T.H. Chan School of Public Health (HSPH), was published on August 31. | PFASs are a group of synthetic chemicals that are detected in over 95% of the U.S. population. These substances have been used in nonstick cookware, oil- and water-resistant textiles, greaseproof food packaging, personal care products, floor polish, and firefighting foams, and as industrial surfactants, among other applications. Exposure to PFASs occurs through direct and indirect sources including contaminated drinking water and food, personal care products, soil, dust, and air. The study sought to determine the extent to which PFASs are associated with increases in weight and body size and to evaluate whether a lifestyle intervention of exercise and diet modifies this association. The prospective cohort study included 957 individuals who participated in the Diabetes Prevention Program Outcomes Study and were followed for approximately 15 years. Study participants who were randomized to a lifestyle intervention group received training in diet, physical activity, and behavior modification. Participants randomized to placebo were given standard information about diet and exercise. The investigators found that among adults at high risk for diabetes, higher plasma PFAS concentrations were associated with a prospective and long-term increase in weight and hip girth among individuals randomized to the placebo group, but not for those randomized to the lifestyle intervention. The results indicate that lifestyle changes of exercise and diet can reduce the obesogenic effects of environmental exposures. "Exercise and a balanced diet confer multiple benefits; our study results suggest that another added benefit is fighting the obesogenic action of environmental chemicals such as PFAS," said lead author Andres Cardenas, PhD, MPH, Research Fellow in the Department of Population Medicine at the Harvard Pilgrim Health Care Institute. | Pollution | 2,018
September 25, 2018 | https://www.sciencedaily.com/releases/2018/09/180925110030.htm | Indoor HEPA filters significantly reduce pollution indoors when outside air unhealthy, study finds | Outdoor air pollution is a major contributor to indoor air pollution -- but high-efficiency particulate air (HEPA) filters used in the home significantly reduce fine-particulate matter in the air compared with non-HEPA air filters, according to a new two-year study led by researchers at Intermountain Healthcare in Salt Lake City. | For the study, researchers monitored air quality for 12 weeks in the homes of enrolled patients who had respiratory problems. They found that the HEPA filters reduced fine particulate matter by 55 percent, and particulate pollution coming inside from outdoors was reduced by 23 percent. Fine particulate matter, also known as PM2.5, consists of small airborne particles often found in areas with heavy air pollution. The diameter of a PM2.5 particle is roughly 3 percent of the diameter of a human hair. When inhaled, these particles can lead to respiratory problems or heart attacks, or can exacerbate symptoms in people who already have respiratory problems like chronic obstructive pulmonary disease or asthma. In the winter in Utah, temperature inversions often trap cold, dirty air inside the Salt Lake Valley -- which gives Utah some of the worst air in the country. During these inversions, researchers tested the efficiency of HEPA filters inside the home. "One of the reasons we wanted to research the effectiveness of HEPA air filters in the home is because people often ask what they can do to protect their lungs during poor air quality days," said Denitza Blagev, MD, pulmonary researcher at Intermountain Medical Center in Salt Lake City. "We found that running a stand-alone in-home HEPA filter and having the windows in the home closed can provide cleaner air inside the home, especially when outdoor air is so poor." Results of the study were presented on Sept. 16 during the European Respiratory Society's International Congress in Paris. Data from 30 participants enrolled in the study showed that when HEPA filters were used during the winter inversion months, only five percent of outdoor PM2.5 contributed to indoor air quality, compared to 28 percent when HEPA filters weren't in use. During the 12-week study, which occurred during the winter inversions of 2016 and 2017, air filters were placed in 52 Utah homes. A HEPA filter was used for one six-week period and a low-efficiency air filter was used during the second six-week period. The study participants weren't aware of which filter was being used during each period. Each participant's home was also outfitted with two low-cost air quality monitors. One was placed just outside the home and the other was placed inside. Differences in the quality of indoor and outdoor air were compared during the 12-week study period. "Our next steps will be to look at whether the HEPA filter cleans the indoor air enough to help alleviate symptoms in patients with COPD or asthma during poor air quality days," said Dr. Blagev. "We often encourage our patients with respiratory illnesses to stay indoors on days when PM2.5 is high outdoors, but we hope to identify ways to help improve the indoor air quality and relieve symptoms in our patients, which will protect our lungs from dangerous air pollutants." Dr. Blagev stresses that while the HEPA filters help reduce PM2.5 inside the home, everyone should make efforts to improve the overall outdoor air quality in their communities. Other members of the Intermountain Medical Center team involved in the study include Daniel Bride, Danielle Babbel, and Benjamin Horne, PhD; and Daniel Mendoza, PhD, from the University of Utah. Intermountain Medical Center is the flagship hospital for the Intermountain Healthcare system based in Salt Lake City. | Pollution | 2,018
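The 5 percent versus 28 percent figures are infiltration estimates: the share of outdoor PM2.5 that ends up in indoor air. The article does not describe the model behind them, but a minimal steady-state mass-balance sketch shows how such a fraction translates into indoor concentrations. All names and numbers below are illustrative assumptions, not values from the study:

```python
def indoor_pm25(outdoor_pm25, infiltration_fraction, indoor_sources=0.0):
    """Steady-state indoor PM2.5 (ug/m^3) from an outdoor contribution.

    infiltration_fraction: share of outdoor PM2.5 reaching indoor air;
        the study reports roughly 0.05 with HEPA filtration vs 0.28 without.
    indoor_sources: PM2.5 from cooking, candles, etc. (assumed zero here).
    """
    return infiltration_fraction * outdoor_pm25 + indoor_sources


# Hypothetical inversion day with 55 ug/m^3 of PM2.5 outdoors:
outdoor = 55.0
print(indoor_pm25(outdoor, 0.28))  # no HEPA filter: ~15.4 ug/m^3 indoors
print(indoor_pm25(outdoor, 0.05))  # with HEPA filter: ~2.8 ug/m^3 indoors
```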
September 21, 2018 | https://www.sciencedaily.com/releases/2018/09/180921113456.htm | Light pollution makes fish more courageous | Artificial light at night also makes guppies more courageous during the day, according to a behavioural study led by researchers from the Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB) and the Max Planck Institute for Human Development. Exposing fish to artificial light at night not only made fish more active during the night, but also made them emerge more quickly from hiding places during the day, which could increase their exposure to predators. Nocturnal lighting, however, did not affect their swimming speed or social behaviour during the day. | Light pollution can have many influences on ecological processes. Previous research has shown that artificial light at night can have several direct consequences on night-time activity and movement patterns of animals. Many animal species, for example birds and insects, are attracted by artificial light sources at night and can, as a result, lose their orientation. But how artificial light at night impacts the behaviour of individuals during the day, when the source of light pollution is absent, is largely unknown. In this study, a team led by Ralf Kurvers of the MPI for Human Development, in collaboration with the IGB, tested how exposure to artificial light at night affected the behaviour of fish during the day. As study species, they used guppies, a tropical freshwater fish and one of the model organisms of animal behavioural science. The scientists studied three groups of animals. Each group was exposed to the same bright light conditions during the day, but to different illuminations during the night. The first group experienced complete darkness at night; the second group was kept at a low light level at night, comparable to nocturnal illuminance under a street lamp; and the third group experienced bright light at night. After ten weeks of exposure, the scientists conducted behavioural tests to study the consequences of nightly light exposure on daytime behaviours. The results: fish left their hiding places faster during the day and swam more often in the riskier, open areas of the aquarium when exposed to strong as well as weak artificial light at night. The light-exposed fish thus increased their willingness to take risks. "The consequences of this increased risk-taking behaviour are difficult to predict, but it is possible that they could be more at risk of predation by birds or other fish," says IGB researcher David Bierbach, co-author of the study. The light-exposed fish did not differ in swimming speed or sociality compared with the control fish. "We suspect that the nocturnal light causes a stress response in the fish, and fish generally increase their risk taking when experiencing stress," explains Ralf Kurvers, lead author of the study. In humans, too, a disrupted night can cause a stress response. For example, firefighters who slept fewer hours during the night had elevated levels of the stress hormone cortisol. | Pollution | 2,018
September 19, 2018 | https://www.sciencedaily.com/releases/2018/09/180919115853.htm | Mineral weathering from thawing permafrost can release substantial CO2 | The amount of carbon dioxide released from thawing permafrost might be greater than previously thought, according to a new study by University of Alberta ecologists. The research is the first to document the potential for substantial contributions of CO2 from mineral weathering in thawing permafrost. | Mineral weathering occurs when minerals previously locked up in permafrost are exposed and broken down into their chemical components by the sulfuric or carbonic acid that can exist naturally in water. Over long, geologic time scales, carbonic acid weathering is an important control on atmospheric CO2. "We found that rapidly thawing permafrost on the Peel Plateau in the Northwest Territories is greatly enhancing mineral weathering," explained Scott Zolkos, PhD candidate and lead author on the study. "Because weathering is largely driven by sulfuric acid in this region, intensifying permafrost thaw could be an additional source of CO2." To understand regional impacts of permafrost thaw and weathering, Zolkos and his supervisor Suzanne Tank, assistant professor of biology, worked with scientists from the Northwest Territories Geoscience Office to examine long-term records of river chemistry from the Peel River. Their results reveal that sulfuric-acid driven weathering has intensified with regional permafrost thaw in recent decades, and likely increased CO2 release. "Any additional warming in the Arctic, which is warming at twice the rate of the rest of the planet, promotes more permafrost thaw and thus poses substantial challenges to Arctic and global ecosystems," explained Zolkos. Yet the effects of this mineral weathering on climate warming remain largely unexplored -- a problem that Zolkos and Tank hope to address. "We'd like to thank the local Northern residents involved in our work," added Zolkos of his experience conducting the research project. The research partnership stems from ongoing collaboration and Northwest Territories community interest in the downstream effects of permafrost thaw. | Pollution | 2,018
September 19, 2018 | https://www.sciencedaily.com/releases/2018/09/180919111459.htm | Origami inspires highly efficient solar steam generator | Water covers most of the globe, yet many regions still suffer from a lack of clean drinking water. If scientists could efficiently and sustainably turn seawater into clean water, a looming global water crisis might be averted. Now, inspired by origami, the Japanese art of paper folding, researchers have devised a solar steam generator that approaches 100 percent efficiency for the production of clean water. | Solar steam generators produce clean water by converting energy from the sun into heat, which evaporates seawater, leaving salts and other impurities behind. Then, the steam is collected and condensed into clean water. Existing solar steam generators contain a flat photothermal material, which produces heat from absorbed light. Although these devices are fairly efficient, they still lose energy by heat dissipation from the material into the air. Peng Wang and colleagues wondered if they could improve energy efficiency by designing a three-dimensional photothermal material. They based their structure on the Miura fold of origami, which consists of interlocking parallelograms that form "mountains" and "valleys" within the 3D structure. The researchers made their solar steam generator by depositing a light-absorbing nanocarbon composite onto a cellulose membrane that was patterned with the Miura fold. They found that their 3D device had a 50 percent higher evaporation rate than a flat 2D device. In addition, the efficiency of the 3D structure approached 100 percent, compared with 71 percent for the 2D material. The researchers say that, compared with a flat surface, origami "valleys" capture the sunlight better, so that less is lost to reflection. Moreover, heat can flow from the valleys toward the cooler "mountains," evaporating water along the way instead of being lost to the air. | Pollution | 2,018
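The efficiency figure quoted above has a standard definition in the solar steam literature: the fraction of incident solar power that ends up as latent heat of evaporated water. The article does not give the formula; the conventional form, ignoring sensible heating, is:

```latex
% Conventional solar-to-vapor efficiency definition (standard in the
% field; the symbols below are the usual ones, not taken from the paper):
\eta = \frac{\dot{m}\, h_{LV}}{q_{\mathrm{solar}}}
% \dot{m}: evaporation mass flux (kg\,m^{-2}\,s^{-1})
% h_{LV} \approx 2.26 \times 10^{6} J\,kg^{-1}: latent heat of vaporization
% q_{\mathrm{solar}}: incident solar flux (about 1000 W\,m^{-2} at one sun)
```

At one-sun illumination, an efficiency of 100 percent corresponds to roughly 1.6 kg of water evaporated per square metre per hour, which gives a feel for what the 3D device's near-unity efficiency means in practice.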
September 18, 2018 | https://www.sciencedaily.com/releases/2018/09/180918195347.htm | Air pollution may be linked to heightened dementia risk | Air pollution is now an established risk factor for heart disease/stroke and respiratory disease, but its potential role in neurodegenerative diseases, such as dementia, isn't clear. | To explore this further, the researchers used carefully calculated estimates of air and noise pollution levels across Greater London to assess potential links with new dementia diagnoses. To do this, they drew on anonymised patient health records from the Clinical Practice Research Datalink (CPRD), which has been collecting data from participating general practices across the UK since 1987. For the purposes of this study, the researchers focused on just under 131,000 patients aged 50 to 79 in 2004, who had not been diagnosed with dementia, and who were registered at 75 general practices located within the London orbital M25 motorway. Based on the residential postcodes of these patients, the researchers estimated their yearly exposure to air pollutants, specifically nitrogen dioxide (NO2), among others. The health of these patients was then tracked for an average of 7 years, until a diagnosis of dementia, death, or deregistration from the practice, whichever came first. During the monitoring period, 2181 patients (1.7%) were diagnosed with dementia, including Alzheimer's disease. These diagnoses were associated with ambient levels of NO2: those living in areas in the top fifth of NO2 levels were at greater risk of a dementia diagnosis than those living in areas in the bottom fifth. These associations were consistent and unexplained by known influential factors, such as smoking and diabetes, although when restricted to specific types of dementia, they remained only for patients diagnosed with Alzheimer's disease. This is an observational study, and as such, can't establish cause, and the findings may be applicable only to London. Nor were the researchers able to glean long-term exposures, which may be relevant as Alzheimer's disease may take many years to develop. Many factors may be involved in the development of dementia, the exact cause of which is still not known, the researchers point out, and while there are several plausible pathways for air pollutants to reach the brain, how they might contribute to neurodegeneration isn't clear. But they suggest: "Traffic related air pollution has been linked to poorer cognitive development in young children, and continued significant exposure may produce neuroinflammation and altered brain innate immune responses in early adulthood." And they conclude that even if the impact of air pollution were relatively modest, the public health gains would be significant if it emerged that curbing exposure to it might delay progression of dementia. | Pollution | 2,018
September 18, 2018 | https://www.sciencedaily.com/releases/2018/09/180918180504.htm | Green space near home during childhood linked to fewer respiratory problems in adulthood | Children who have access to green spaces close to their homes have fewer respiratory problems, such as asthma and wheezing, in adulthood, according to new research presented today (Wednesday) at the European Respiratory Society International Congress [1]. In contrast, children who are exposed to air pollution are more likely to experience respiratory problems as young adults. | Until now, little has been known about the association between exposure to air pollution as a child and long-term respiratory problems in adulthood. RHINESSA [2] is a large international study that has been investigating lung health in children and adults in seven European countries, and that has information on residential "greenness" and air pollution exposures from birth onwards from several study centres. In a new analysis, Dr Ingrid Nordeide Kuiper (MD), from the Department of Occupational Medicine at Haukeland University Hospital, Norway, and colleagues analysed greenness data from 5415 participants aged between 18 and 52 years, contributed by RHINESSA centres in Tartu (Estonia), Reykjavik (Iceland), Uppsala, Gothenburg, Umea (Sweden) and Bergen (Norway); they also analysed air pollution data from 4414 participants, contributed from centres in Uppsala, Gothenburg, Umea and Bergen. They looked at how many people suffered from more than three respiratory symptoms, severe wheeze (in which the person experienced wheezing with breathlessness in the past year but did not have a cold), and late onset asthma (asthma that started after the age of 10 years). Respiratory symptoms included: chest wheezing or whistling; breathlessness when wheezing; wheezing or whistling without a cold; a tight chest on waking; being woken by an attack of shortness of breath; being woken by a cough; asthma attack; and taking asthma medicine. The researchers calculated average annual exposure, from a child's birth until age 18, to three air pollutants: two size classes of particulate matter (PM2.5 and PM10) and nitrogen dioxide (NO2) [3]. They also calculated annual average exposure to "greenness" in a 100-metre zone around the home address for the same period; "greenness" is assessed by means of a Normalised Difference Vegetation Index (NDVI), which uses satellite images to quantify the amount of vegetation in an area. A total of 608 participants (12%) had more than three respiratory symptoms, 384 (7.7%) had severe wheeze and 444 (9.4%) had late onset asthma. "These are preliminary results," said Dr Kuiper, "but we found that exposure to greenness during childhood was associated with fewer respiratory symptoms in adulthood, while exposure to air pollutants in childhood was associated with more respiratory symptoms in adulthood and with late onset asthma." Examples of findings from the different centres that contributed data for the analysis showed that, in Bergen, exposure to PM2.5 and NO2 increased the probability of late onset asthma by 6-22%; exposure to PM10 increased the probability of developing respiratory symptoms by 21% in Uppsala and by 23% in Bergen; exposure to greenness before the age of ten was associated with a 71% lower probability of wheeze in Tartu, and exposure to greenness between the ages of 11 and 18 was associated with a 29% lower probability of respiratory symptoms and a 39% lower probability of wheeze in Tartu. "We need to analyse these findings further before drawing any definite conclusions. However, it is likely that our findings will substantially expand our knowledge on the long-term effects of air pollution and greenness, enabling physicians, scientists and policy-makers to see the importance of exposure to pollution and access to green spaces, and helping to improve public health," said Dr Kuiper. "We will be conducting further analyses that include more centres that are taking part in the RHINESSA study, and we also want to expand analyses to look at the effects of exposure to air pollution and greenness across generations." She concluded: "We believe that our results, seen together with previous results, will be of particular value for city planners and policy-makers; with increasing population density in the years to come it will be vital to include a decrease in air pollution exposures and an increase in access to green spaces in future city plans and societal regulations." Professor Mina Gaga is President of the European Respiratory Society, and Medical Director and Head of the Respiratory Department of Athens Chest Hospital, Greece, and was not involved in the study. She said: "This is a fascinating study which underlines the importance for children's short- and long-term health of having plenty of green space in residential areas. The ongoing work of the RHINESSA study will, no doubt, produce more interesting and useful results to support these early indications. From a clinical point of view, access to green spaces is something that doctors may want to enquire about when they see patients with respiratory problems. They could, for instance, advise their patients about trying to avoid polluted areas and tell them about how green spaces might be able to counter some of the negative effects of pollution." Notes: [1] Abstract no: OA5185, "Lung health in adulthood after childhood exposure to air pollution and greenness," by I. Kuiper et al.; oral presentation in "Effect of environmental exposure on lung function outcomes" session, 08.30-10.30 hrs CEST, Wednesday 19 September, Room 7.3M. [2] RHINESSA stands for Respiratory Health in Northern Europe, Spain and Australia. [3] PM10 and PM2.5 are atmospheric particulate matter (PM) with a diameter of less than 10 or 2.5 micrometres respectively. PM2.5 is about 3% of the diameter of a human hair. Sources of PM10 include dust and other emissions from industry, vehicle emissions and rubber tyre abrasion. Sources of PM2.5 include power plants, motor vehicles (internal combustion engines), airplanes, residential wood burning, forest fires, agricultural burning, volcanic eruptions and dust storms. Sources of NO2 include internal combustion engines and burning fossil fuels such as coal. | Pollution | 2,018
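NDVI, used as the greenness measure above, has a simple standard definition based on two satellite reflectance bands. This is the conventional remote-sensing formula rather than anything specific to the RHINESSA analysis:

```latex
% Normalised Difference Vegetation Index (standard definition):
% NIR = near-infrared reflectance, RED = red reflectance.
\mathrm{NDVI} = \frac{\mathrm{NIR} - \mathrm{RED}}{\mathrm{NIR} + \mathrm{RED}}
```

Healthy vegetation reflects strongly in the near-infrared and absorbs red light, so NDVI runs from about -1 to +1, with values near +1 indicating dense vegetation and values near 0 indicating bare ground; averaging it over a 100-metre buffer around the home address gives the residential greenness exposure described above.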
September 16, 2018 | https://www.sciencedaily.com/releases/2018/09/180916152704.htm | First evidence that soot from polluted air is reaching placenta | Evidence of tiny particles of carbon, typically created by burning fossil fuels, has been found in placentas for the first time, in new research presented today (Sunday) at the European Respiratory Society International Congress. | Previous research has indicated links between pregnant mothers' exposure to air pollution and premature birth, low birth weight, infant mortality and childhood respiratory problems. The new study adds to existing evidence on the dangers of pollution for unborn babies and suggests that when pregnant women breathe polluted air, sooty particles are able to reach the placenta via the bloodstream. The work was presented by Dr Norrice Liu, a paediatrician and clinical research fellow, and Dr Lisa Miyashita, a post-doctoral researcher, both members of Professor Jonathan Grigg's research group at Queen Mary University of London, UK. Dr Miyashita said: "We've known for a while that air pollution affects fetal development and can continue to affect babies after birth and throughout their lives." "We were interested to see if these effects could be due to pollution particles moving from the mother's lungs to the placenta. Until now, there has been very little evidence that inhaled particles get into the blood from the lung." The researchers worked with five pregnant women who were all living in London and due to have planned caesarean section deliveries at the Royal London Hospital. They were all non-smokers with an uncomplicated pregnancy and each one gave birth to a healthy baby. The women all gave permission for researchers to study their placentas after delivery. The researchers were interested in particular cells called placental macrophages. Macrophages exist in many different parts of the body. They are part of the body's immune system and work by engulfing harmful particles, such as bacteria and pollution particles. In the placenta they also help to protect the fetus. The team studied a total of 3,500 placental macrophage cells from the five placentas and examined them under a high-powered microscope. They found 60 cells that between them contained 72 small black areas that researchers believe were carbon particles. On average, each placenta contained around five square micrometres of this black substance. They went on to study the placental macrophages from two placentas in greater detail using an electron microscope and again found material that they believe was made up of tiny carbon particles. In previous research, the team have used the same techniques to identify and measure these sooty particles in macrophages in people's airways. Dr Liu added: "We thought that looking at macrophages in other organs might provide direct evidence that inhaled particles move out of the lungs to other parts of the body." "We were not sure if we were going to find any particles and if we did find them, we were only expecting to find a small number of placental macrophages that contain these sooty particles. This is because most of them should be engulfed by macrophages within the airways, particularly the bigger particles, and only a minority of small-sized particles would move into the circulation." "Our results provide the first evidence that inhaled pollution particles can move from the lungs into the circulation and then to the placenta." "We do not know whether the particles we found could also move across into the fetus, but our evidence suggests that this is indeed possible. We also know that the particles do not need to get into the baby's body to have an adverse effect, because if they have an effect on the placenta, this will have a direct impact on the fetus." Professor Mina Gaga is President of the European Respiratory Society, and Medical Director and Head of the Respiratory Department of Athens Chest Hospital, Greece, and was not involved in the study. She said: "Previous research shows that pregnant women living in polluted cities are more prone to pregnancy issues such as restricted fetal growth, premature birth and low birth weight babies. The evidence suggests that an increased risk of low birthweight can happen even at levels of pollution that are lower than the European Union recommended annual limit." "This new research suggests a possible mechanism of how babies are affected by pollution while being theoretically protected in the womb. This should raise awareness amongst clinicians and the public regarding the harmful effects of air pollution in pregnant women." "We need stricter policies for cleaner air to reduce the impact of pollution on health worldwide because we are already seeing a new population of young adults with health issues." | Pollution | 2,018
September 17, 2018 | https://www.sciencedaily.com/releases/2018/09/180917101307.htm | Air pollution affects thyroid development in fetuses | Soot and dust alter thyroid development in fetuses in smoggy cities before they are born, raising concern about health impacts later in life, new USC research shows. | It means before a doctor cuts the umbilical cord or a parent hugs a baby or a sibling gazes at the newest member of the family, the caress of air pollution has already reached the womb's inner sanctum. The timing couldn't be worse, as the researchers found that no matter when they checked, thyroid impacts were evident until the final month of gestation. This is one of the few studies to monitor air pollution effects on a developing fetus and the first to track pollution changes month by month on thyroid hormones. The findings appear in a newly published research paper. "Air pollution is bad for adults and children and this study shows it may be bad for the fetus too, despite being protected in the womb," said Carrie V. Breton, corresponding author of the study and associate professor of preventive medicine at the Keck School of Medicine of USC. "Thyroid function is important for lots of elements of life and tweaking that in utero may have lifelong consequences." USC scientists have been studying the health impacts of urban air pollution for a generation under the Children's Health Study. It's one of the world's largest ongoing research efforts looking exclusively at how dirty air harms kids. USC is situated in the Los Angeles region, home to historically severe urban smog, an ideal laboratory to study air pollution health effects and environmental change across time. Since the effort began in 1992, various USC researchers have documented how air pollution contributes to school absences, asthma, bronchitis and lost lung function. Conversely, as air quality has improved due to regulations and technology innovations, scientists have been able to track improvements in children's health. In the new study, the research team focused on 2,050 newborns, people who had previously been enrolled in the Children's Health Study. They selected them using birth data from the mid-1990s, when the participants were elementary school students at 13 Southern California schools. About 60 percent were white, 30 percent Latino and the remainder black or other races. The participants were included only if they had blood tests taken right after birth and had complete monthly exposure measures for air pollution throughout pregnancy. The scientists checked blood levels for total thyroxine (TT4), a hormone secreted by the thyroid gland. The researchers found that when exposure to PM2.5 increased by 16 micrograms per cubic meter of air (roughly the volume of a dishwasher), TT4 levels in blood increased 7.5 percent above average levels in babies. When exposure to PM10 increased by 22 micrograms per cubic meter, TT4 levels increased by 9.3 percent, according to the study. They did not see the same increases associated with other air pollutants, such as ozone or nitrogen dioxide. Moreover, exposures during months three to seven of pregnancy were most significant for PM2.5, which consists of sooty particles typically 20 times smaller than the diameter of a human hair. PM10 exposure during months one to eight of pregnancy was associated with significantly higher newborn TT4 concentrations. PM10 refers to airborne particles of up to 10 microns in diameter, which often come from dirt, dust and pulverized road grit. The findings show that the fetal thyroid gland seems particularly susceptible to airborne particulate, especially during early- to mid-pregnancy, according to the study. It's consistent with previous studies by other researchers that show industrial chemicals, tobacco smoke and indoor air pollution impact the thyroid gland. However, the study did not assess the health effects of the air pollution exposures. Thyroid hormones are critical for regulating fetal growth and metabolism and play important roles in neurodevelopment. Even subtle changes in maternal thyroid function during pregnancy have been associated with reduced fetal growth and cognitive deficits in children, with detrimental effects observed for both low and high levels of thyroid hormones, the study found. Also, the study only looked at one hormonal pathway associated with the thyroid gland, which the authors acknowledge is a limitation. Nonetheless, the findings underscore that air pollution penetrates deeply within the human body to reach the most vulnerable people of all -- unborn babies. Breton said it's a wake-up call not only for smoggy places like California and the United States, but also for rapidly industrializing cities around the world. "There are several places around the world where air pollution is skyrocketing," Breton said. "This is another example of an environmental exposure that affects early development in subtle ways, and we don't know the health consequences." The study was supported by the National Institutes of Health (grants 1R21 ES025870, P30 ES007048, 1UG3OD023287 and T32 ES013678). | Pollution | 2,018
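The TT4 associations above are per-increment effect estimates (7.5 percent per 16 micrograms per cubic meter of PM2.5; 9.3 percent per 22 micrograms per cubic meter of PM10). Assuming a simple linear dose-response, which is an illustrative assumption here rather than a model reported by the study, such estimates can be rescaled to other exposure increments:

```python
def tt4_percent_change(delta_exposure, study_increment, effect_percent):
    """Rescale a per-increment effect estimate, assuming linearity.

    delta_exposure: exposure change of interest (ug/m^3)
    study_increment: increment the effect was reported for (ug/m^3)
    effect_percent: percent change in newborn TT4 per study increment
    """
    return effect_percent * delta_exposure / study_increment


# Reported estimates: +7.5% TT4 per 16 ug/m^3 PM2.5, +9.3% per 22 ug/m^3 PM10.
print(tt4_percent_change(10, 16, 7.5))  # ~4.7% for a 10 ug/m^3 rise in PM2.5
print(tt4_percent_change(10, 22, 9.3))  # ~4.2% for a 10 ug/m^3 rise in PM10
```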
September 13, 2018 | https://www.sciencedaily.com/releases/2018/09/180913160039.htm | Study shows toxic effects of oil dispersant on oysters following deepwater horizon spill | Oysters likely suffered toxic effects from the oil dispersant Corexit® 9500 when it was used to clean up the 2010 Deepwater Horizon oil spill, said Morris Animal Foundation-funded researchers at the University of Connecticut. The team determined this by comparing the toxicity of low levels of oil, the dispersant and a mixture of the two on Eastern oysters. The team recently published their findings. | After the Deepwater Horizon oil rig spilled more than 170 million gallons of oil into the Gulf of Mexico, nearly two million gallons of Corexit® 9500 were deployed into the Gulf to break the oil down. "There's an unfortunate trade-off to using dispersants like this," said Lindsay Jasperse, a member of the university's research team that published the study. "They may prevent giant oil spills from washing ashore and damaging wetlands, but they also cause negative effects for species below the ocean's surface that might have been spared if dispersants weren't used." Oysters are considered a keystone species due to their value to their ecosystem. Primarily, they serve as water purifiers, filtering out particles and nutrients from the water to improve the quality for surrounding species. Oyster reefs also prevent erosion and provide habitat and protection for many crabs and fish. Unfortunately, as they are immobile and so abundant, they are at significant risk of critical exposure to oil and oil dispersants following environmental disasters. Researchers compared, in a controlled environment, the toxicity of oil alone, the dispersant alone and a mixture of the two on oysters. The researchers tested both the oysters' feeding rates, or how well they could filter algae, and immune functions, or how well they could absorb and destroy bacteria, which indicates an oyster's ability to fight off infection. A reduction in an oyster's feeding rates could result in stunted growth or even death. If an oyster's immune system is compromised, it can be more likely to succumb to infection. For the oysters' immune function, the dispersant alone was the most toxic, followed by the dispersant and oil mixture. Oil alone did not impact the oysters' immune function at all. Researchers tested the oysters' feeding rates and found the mixture of the dispersant and oil had the most toxic effect, followed by oil alone and then the dispersant alone. "Knowing the effects dispersants and oil have on oysters can help us make better mitigation recommendations the next time an environmental and ecological crisis like this happens," said Dr. Kelly Diehl, Morris Animal Foundation Interim Vice President of Scientific Programs. "Species are interconnected, and what harms oysters will likely cascade through their ecosystem to the detriment of all." | Pollution | 2,018
September 13, 2018 | https://www.sciencedaily.com/releases/2018/09/180913134558.htm | Most fire in Florida goes undetected | A new study from Florida State University researchers indicates that common satellite imaging technologies have vastly underestimated the number of fires in Florida. | Their report, published in collaboration with researchers from the Tall Timbers Research Station and Land Conservancy, challenges well-established beliefs about the nature and frequency of fire in the Sunshine State. While there were more fires than expected, researchers said, strategic use of prescribed burns throughout the state is proving an effective force against the ravages of wildfire. For scientists studying fire, sophisticated satellites whizzing far above the Earth's surface have long represented the best tool for monitoring wildfires and prescribed burns -- carefully controlled and generally small fires intended to reduce the risk of unmanageable wildfires. But FSU researchers suggest that fire experts themselves have been getting burned by faulty data, and that broadly accepted estimates of fire area and fire-based air pollutants might be flawed. "There are well-known challenges in detecting fires from satellites," said lead investigator Holly Nowell, a postdoctoral researcher in the Department of Earth, Ocean and Atmospheric Science. "Here we show that only 25 percent of burned area in Florida is detected." Using comprehensive ground-based fire records from the Florida Forest Service -- which regulates and authorizes every request for a prescribed burn in the state -- researchers found dramatic discrepancies between fires detected by satellites and fires documented by state managers. The majority of fires in Florida come in the form of prescribed burns, but because these fires are designed to be brief and contained, they often fall under the radar of satellites soaring overhead. This is especially true in a state like Florida, where dense cloud cover is common and the warm, wet climate allows vegetation to regrow quickly after a blaze, disguising the scars that fires leave in their wake. "Like a detective, satellites can catch a fire 'in the act' or from the 'fingerprints' they leave behind," said study co-author Christopher Holmes, an assistant professor in EOAS. "In our area, catching an active fire in a thermal image can be hard because the prescribed fires are short, and we have frequent clouds that obscure the view from space." The state fire records also revealed a counterintuitive truth: unlike in western states such as California, where dry conditions frequently produce massive increases in destructive and often uncontrollable fires, Florida actually experiences a decrease in land consumed by fire during drought. When drought conditions emerge, researchers said, officials are less likely to authorize prescribed burns. And because prescribed burns account for the overwhelming majority of fires in the state, overall fire activity decreases. This also suggests that prescribed burning programs -- which aim to reduce the risk of wildfire in dry conditions -- are having a materially positive effect. "Although we still have occasional destructive wildfires, including the recent tragic Eastpoint fire, our results indicate that prescribed fire policy is helping to reduce wildfire risk," Holmes said, referencing the June 2018 wildfire that destroyed dozens of homes in Florida's Big Bend region. While the team's study reconfirms the utility of prescribed burning, it calls into question prevailing estimates for airborne pollution from fire. If, as the study suggests, only 25 percent of fires in Florida are detected by satellites, then there could be "a rather large bias and a significant potential underestimation of emissions," Nowell said. The study's findings are specific to Florida, but researchers suspect that similar satellite limitations may be skewing fire detection -- and, consequently, emission estimates -- in neighboring regions and geographically analogous areas like the savannas of Africa or the agricultural belts of Europe and Asia. "We believe this result easily extends to the rest of the Southeast United States -- which burns more area than the rest of the United States combined in a typical year -- and other similar regions throughout the world that use small prescribed burns as a land management technique," Nowell said. | Pollution | 2,018
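To see why a 25 percent detection rate implies "a rather large bias," note that emissions inventories typically scale with burned area, so underdetected area propagates directly into underestimated emissions. A minimal illustration of that scaling follows; the 0.25 fraction is from the article, while every other name and number is an assumed placeholder:

```python
def estimated_true_burned_area(detected_km2, detection_fraction=0.25):
    """Scale satellite-detected burned area up to an estimated true total,
    assuming a constant detection fraction (0.25 per the Florida study)."""
    return detected_km2 / detection_fraction


detected = 1000.0  # hypothetical satellite-detected burned area, km^2
print(estimated_true_burned_area(detected))  # 4000.0 km^2: a factor-of-4 gap
# Any emissions estimate proportional to burned area would be biased low
# by the same factor of ~4 under these assumptions.
```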
September 12, 2018 | https://www.sciencedaily.com/releases/2018/09/180912133513.htm | Air purifiers may benefit fetal growth | A new study led by SFU health sciences researchers Prabjit Barn and Ryan Allen reveals fetal growth may improve if pregnant women use portable air purifiers inside their homes. | The study, a first of its kind, was conducted in Ulaanbaatar, Mongolia, which is one of the most polluted cities in the world and has fine particulate matter (PM2.5) levels more than seven times higher than WHO guidelines. Fine particulate matter is the pollutant most consistently linked with human health effects. The researchers recruited more than 500 women early on in their pregnancies and placed high-efficiency particulate air (HEPA) purifiers in half of the women's homes. The air purifiers decreased fine particulate matter in the women's homes by 29 per cent. "We found that pregnant women who used HEPA air purifiers inside their homes gave birth to babies that weighed 85 grams more on average at term than women who did not use air cleaners during pregnancy," says Barn. The researchers say that these results provide further evidence that air pollution exposure during pregnancy has a negative impact on fetal growth and that reducing exposures can be beneficial. | Pollution | 2,018
September 11, 2018 | https://www.sciencedaily.com/releases/2018/09/180911114545.htm | Carbon nanodots do an ultrafine job with in vitro lung tissue | Epidemiological studies have established a strong correlation between inhaling ultrafine particles from incomplete combustion and respiratory and cardiovascular diseases. Still, relatively little is known about the mechanisms behind how air particulates affect human health. New work with carbon nanodots seeks to provide the first model of how ultrafine carbon-based particles interact with lung tissues. | An international group of researchers created a 3D lung cell model system to investigate how carbon-based combustion byproducts behave as they interact with human epithelial tissue. "Localization and quantification of inhaled carbon nanoparticles at the cellular level has been very difficult," said Barbara Rothen-Rutishauser, an author on the paper, which is part of a special focus issue. At less than 100 nanometers in diameter, ultrafine particles have the small size and large relative surface area to wreak havoc on cells and potentially enter the bloodstream. Other groups' research has shown that ultrafine particles induce adverse effects on the lungs and cardiovascular system by increasing oxidative stress in the body. Because of particle size, it is difficult for lab techniques to distinguish carbon in pollutants from carbon in tissues. Therefore, little is known about surface charge and states of agglomeration, two key physical and chemical features that affect how carbon particles interact with living tissues. To begin modeling ultrafine particles, Estelle Durantie, another author of the study, turned to fluorescent carbon nanodots of different sizes and charges, doped with nitrogen or with a combination of nitrogen and sulfur. The team then applied these nanodots to the top layer of a lab-grown epithelial tissue, where gas exchange typically happens in the lung. Since regular fluorescent microscopes lack the resolution to visualize such small particles, the group used spectroscopy and UV light to detect and quantify nanodots as they migrated from the luminal compartment past their lung model's immune cells. As the researchers expected, charged particles tended to stick together before penetrating the gas-exchange barrier. While most of the neutrally charged nanodots passed through the tissue after only an hour, only 20 percent of the agglomerated charged particles infiltrated the epithelium. Rothen-Rutishauser said she hopes to further improve nanodots so that they better mimic ultrafine particles. "What we're seeing is that translocation depends on aggregation state," Rothen-Rutishauser said. "We hope to continue trying out different sizes of nanodots, including other types of particles that get us closer to the real environment." | Pollution | 2,018
September 11, 2018 | https://www.sciencedaily.com/releases/2018/09/180911095855.htm | Evaluating the contribution of black carbon to climate change | Black carbon refers to tiny carbon particles that form during incomplete combustion of carbon-based fuels. Black carbon particles absorb sunlight, so they are considered to contribute to global warming. However, the contribution of black carbon to the heating of the Earth's atmosphere is currently uncertain. Models that can accurately assess the warming effect of black carbon on our atmosphere are needed so that we can understand the contribution of these tiny carbon particles to climate change. The mixing state of black carbon particles and their particle size strongly influence their ability to absorb sunlight, but current models have large uncertainties associated with both particle size and mixing state. | Researchers from Nagoya and Cornell Universities have combined their expertise to develop a model that can predict the direct radiative effect of black carbon with high accuracy. The team achieved such a model by considering various particle sizes and mixing states of black carbon particles in air. "Most aerosol models are using one or two black carbon mixing states, which are not sufficient to accurately describe the mixing state diversity of black carbon in air," says Hitoshi Matsui. "Our model considers that black carbon particles have multiple mixing states in air. As a result, we can model the ability of black carbon particles to heat air more accurately than in previous estimates." The researchers found that the direct radiative effect of black carbon predicted by their model was highly sensitive to the particle size distribution only when the complex mixing states of black carbon were suitably described. High sensitivity was obtained by the developed model because it realistically calculated factors like the lifetime of black carbon in the atmosphere, the ability of black carbon to absorb sunlight, and the effect of materials coating the black carbon particles on their ability to absorb sunlight. All of these factors are influenced by the particle size and mixing state of black carbon. The results show that properly describing the particle size and mixing state of black carbon is very important to understand the contribution of black carbon to climate change. The team's results suggest that the interactions of black carbon with atmospheric and rain patterns are likely to be more complex than previously considered. The developed model improves our ability to estimate the effectiveness of removing black carbon from the atmosphere to suppress future changes in temperature, which should help to direct research on strategies to mitigate climate change. | Pollution | 2,018
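As a simplified illustration of why mixing state matters for absorption: a common parameterization multiplies the mass absorption cross-section (MAC) of bare black carbon by a coating-dependent enhancement factor. The sketch below is generic and uses assumed values (a MAC of about 7.5 m2/g at 550 nm for fresh black carbon is a commonly cited literature figure; the enhancement factors are placeholders); it is not the Nagoya/Cornell model:

```python
def absorption_coefficient(bc_mass_ug_m3, mac_m2_g=7.5, enhancement=1.0):
    """Aerosol light-absorption coefficient (Mm^-1) for black carbon.

    bc_mass_ug_m3: black carbon mass concentration (ug/m^3)
    mac_m2_g: mass absorption cross-section of bare BC near 550 nm
    enhancement: "lensing" factor from coatings (1.0 = uncoated;
        thickly coated BC is often assigned roughly 1.5-2.0)
    """
    # Unit check: 1 ug/m^3 * 1 m^2/g = 1e-6 m^-1 = 1 inverse megametre (Mm^-1)
    return bc_mass_ug_m3 * mac_m2_g * enhancement


print(absorption_coefficient(1.0))                   # fresh BC: 7.5 Mm^-1
print(absorption_coefficient(1.0, enhancement=1.8))  # coated BC: 13.5 Mm^-1
```

Because both the MAC and the enhancement depend on particle size and coating state, a model that resolves many mixing states, as the article describes, can assign each particle population its own absorption rather than a single lumped value.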
September 10, 2018 | https://www.sciencedaily.com/releases/2018/09/180910160635.htm | Analyzing roadside dust to identify potential health concerns | Everyone knows that cars contribute to air pollution. And when most people consider the source, exhaust is usually what comes to mind. | However, new research led by the University of Pennsylvania's Reto Gieré, working with collaborators across the world, is helping to illuminate another significant culprit when it comes to traffic-related air pollution: tiny bits of tires, brake pads, and road materials that become suspended in the air when vehicles pass over. "More and more I've noticed that we don't know enough about what is on our roads," says Gieré, professor and chair of Penn's Department of Earth and Environmental Science in the School of Arts and Sciences. "If you have lots of traffic, cars, and trucks driving by, they re-suspend the dust on the roads into the atmosphere, and then it becomes breathable. To understand the potential health implications of these dust particles, it's really important to understand what's on the road." While regulatory efforts have helped make cars cleaner and more efficient, those restrictions do not address the pollution that arises from tire and brake wear. Increasing urban congestion stands to aggravate these sources of pollution and their possible adverse health effects. "About 4 million people die prematurely from air pollution each year," says Gieré. "From unsafe water the number is 2 million. Yet we have a United Nations Sustainable Development Goal about water pollution but not one about the air." To shed light on the contents of traffic-related dust and the conditions that make it more likely to accumulate, Gieré has teamed with German colleagues from the Federal Highway Research Institute, the German Meteorological Service, and the University of Freiburg to sample and analyze the air along roadsides. In 2017, they published the findings of a year-long sampling effort along two highly frequented motorways in Germany, one subject to more stop-and-go traffic and another in a more rural area bordered by agricultural fields. To passively collect the dust along the roadsides, they used customized cylindrical samplers with a transparent sticky foil at the bottom to trap particles that make their way in. The researchers checked the collection points and switched out the sticky "trap" weekly. Using optical microscopy to analyze the collected airborne particles, the team found that the site with busier traffic patterns had 30 percent more particles overall, with a greater fraction derived from tire wear. Weather factored significantly into the patterns they observed; dry and warm conditions were associated with a greater build-up of particles. "At higher temperatures we saw more tire abrasion, more pollution than at intermediate temperatures," Gieré says. 
"This was exactly analogous to what two laboratory studies found."With higher temperatures and more dry spells predicted under climate change, Gieré notes that this problem of tire abrasion may only get worse, "which is significant," he says, "because nearly 30 percent of the microplastics released globally to the oceans are from tires."In a more recent study, published last month in "The optical microscope gives us a first approximation," Gieré says, "while the scanning electron microscope allows us to distinguish between tire abrasion, brake abrasion, carbon, or find out if there are minerals in there."Taking a further step, the team also ran samples through an analysis that provides information about the elements that compose each specimen, called energy-dispersive X-ray spectroscopy.This study focused on the "super coarse" particles collected, those greater than 10 micrometers in size. (For comparison, a human hair is roughly 75 micrometers in diameter.) While still tiny, these particles pose less of a health threat than those even smaller, which are more easily inhaled. Still, these larger particles can wind up in waterways and soil, affecting wildlife or possibly even agricultural crops.Ninety percent of the dust particles collected from the three sites were traffic-related and the researchers again saw differences between the sites. The slower-moving traffic on the urban road generated fewer particles from brake wear but more from tires; they noted that the tire rubber became encrusted with minerals and other materials from the roads. The highway with more stop-and-go traffic generated more brake particles.Tire and brake pad manufacturers do not disclose all the contents of their products, but it's known that zinc, lead, antimony, silicates, cadmium, and asbestos are used by some. These are chemicals that can pose a health risk if they get into the environment or, if the tires are burned as they sometimes are by coal plants, the atmosphere."These coarse particles aren't going to be transported very far, so pollution is going to be restricted to the vicinity of these roads, especially during congestion," Gieré says. "But they do also collect on the road and then wash into rivers. Our team believes that's a major pathway of how microplastics get into waterways."One way to reduce this avenue of pollution would be traffic-calming measures, such as coordinated traffic lights, that reduce the amount of starting and stopping that drivers must perform. Gieré and colleagues, including Ph.D. student Michael O'Shea, are also performing similar experiments on the streets of Philadelphia and comparing the pollution levels between different neighborhoods to see what is happening a little closer to home.The work was financially supported in part by the Federal Highway Research Institute in cooperation with the German Meteorological Service. | Pollution | 2,018 |
September 10, 2018 | https://www.sciencedaily.com/releases/2018/09/180910142417.htm | US wildfire smoke deaths could double by 2100 | The number of deaths associated with the inhalation of wildfire smoke in the U.S. could double by the end of the century, according to new research. | A new study simulating the effects of wildfire smoke on human health finds continued increases in wildfire activity in the continental United States due to climate change could worsen air quality over the coming decades. The number of human deaths from chronic inhalation of wildfire smoke could increase to more than 40,000 per year by the end of the 21st century, up from around 15,000 per year today. Wildfire smoke is composed of a mixture of gases and microscopic particles from burned material known as particulate matter. Particulate matter from wildfire smoke often reaches nearby communities and can irritate human eyes, exacerbate respiratory conditions and worsen chronic heart and lung diseases, according to the federal Centers for Disease Control and Prevention. Exposure to particulate matter is associated with visibility degradation, premature death in people with heart or lung disease, heart attacks, irregular heartbeats, aggravated asthma, decreased lung function and increased respiratory symptoms, according to the U.S. Environmental Protection Agency. Older adults, children and those with heart or lung diseases are most at risk. Researchers used global climate model simulations to estimate particulate matter's impacts on air quality and human health in the contiguous United States in the early-, mid-, and late-21st century under different climate scenarios. Emissions of particulate matter from human activities -- such as burning fossil fuels -- are declining nationwide, but wildfires are increasing in frequency and intensity because of climate change, according to the study. From January to July 2018, NOAA recorded 37,718 fires that burned 4.8 million acres of land. In 2017, the U.S. Forest Service's wildfire suppression costs reached a historic high of $2.4 billion. The new study finds the number of deaths attributable to total particulate matter from all sources will decrease by the end of the 21st century, but the number of deaths attributable to fire-related particulate matter could double under the worst climate scenarios. This new finding highlights the need to prepare for future air quality changes caused by wildfires in the U.S., according to the study's authors. "We know from our own research and many, many other groups that smoke has negative impacts on human health," said Jeff Pierce, associate professor of atmospheric science at Colorado State University in Fort Collins and co-author of the new study. "With the knowledge that fires have been increasing in parts of the U.S., we wanted to look at how bad this might get." In the new study, Pierce and his team analyzed the potential effects of wildfire smoke on human health over the coming decades. They simulated the impacts of changing fire emissions on air quality, visibility, and premature deaths in the middle to late 21st century under different climate scenarios. They found that declines in particulate matter from human sources like car, industry and power plant emissions over the 21st century are offset by increases in smoke emissions from more intense wildfires, causing an increase in particulate matter in some regions. 
In the study, researchers used simulated concentrations of particulate matter generated by a model for early-, mid- and late-century time frames. The new study predicts that average visibility due to particulate matter will improve across the contiguous United States over the 21st century, but fire-related particulate matter will reduce visibility on the worst days in the western and southeastern U.S. Haze from wildfire smoke affects how people see colors, forms and textures of a given vista or skyline. The fine particles in the air absorb and scatter sunlight, making it difficult to see clearly, according to the National Park Service. From 2000 to 2010, approximately 140,000 deaths per year, or 5 percent of total deaths, were attributable to total particulate matter. Of those deaths, about 17,000, or 0.7 percent per year, were linked to particulate matter from wildfires. In the paper, the authors estimate uncertainties in these numbers. The new study estimates fire-related particulate matter deaths could more than double by the end of the century in the worst-case-scenario prediction model. "People could use this information as sort of a first estimate of what to prepare for in terms of future air quality," Pierce said. "We need more simulations to be able to assess the different probabilities of what the future might be." Although there are increased efforts in place to reduce wildfire risks in the U.S., wildfires have continued to increase in frequency and intensity, trends that are strongly linked to a changing climate, according to the study. To continue reducing the health burdens due to fire-related particulate matter, the study's authors call for more emphasis on reducing exposure through public health campaigns in conjunction with climate mitigation efforts. "I think that we need to act now," said Sheryl Magzamen, associate professor of epidemiology at Colorado State University, who was not involved in the new study. "Our exposure to wildfire smoke is only going to get worse going into the next century, so we need to plan and be prepared in terms of acting to protect population health." | Pollution | 2,018
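Mortality estimates of this kind are typically built from a concentration-response function that converts a change in PM2.5 exposure into an attributable fraction of baseline deaths. The sketch below illustrates that calculation; the coefficient, population, and exposure values are illustrative assumptions, not the study's inputs.

```python
import math

# Minimal sketch of a PM2.5 concentration-response calculation of the kind
# used to attribute premature deaths to smoke. The coefficient and inputs
# are illustrative assumptions, not the study's values.

def attributable_deaths(population, baseline_mortality, delta_pm25, beta=0.0058):
    """Annual deaths attributable to a PM2.5 increase of delta_pm25 (ug/m3).

    beta is an assumed log-linear concentration-response coefficient;
    published values vary by cohort and cause of death.
    """
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    return population * baseline_mortality * attributable_fraction

# Hypothetical region: 10 million people, 0.9% baseline annual mortality.
for label, dpm in [("present-day smoke, +2 ug/m3", 2.0),
                   ("late-century smoke, +5 ug/m3", 5.0)]:
    print(f"{label}: ~{attributable_deaths(10e6, 0.009, dpm):,.0f} deaths/yr")
```

Because the attributable fraction is nearly linear at low concentrations, roughly doubling the smoke exposure roughly doubles the estimated deaths, which is the shape of the result the study reports at national scale.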
September 10, 2018 | https://www.sciencedaily.com/releases/2018/09/180910111237.htm | Large trucks are biggest culprits of near-road air pollution | For the 30 per cent of Canadians who live within 500 metres of a major roadway, a new study reveals that the type of vehicles rolling past their homes can matter more than total traffic volume in determining the amount of air pollution they breathe. | A two-year U of T Engineering study has revealed large trucks to be the greatest contributors to black carbon emissions close to major roadways. Professor Greg Evans hopes these results get city planners and residents thinking more about the density of trucks, rather than the concentration of vehicle traffic, outside their homes, schools and daycares. The study was recently published. "I've been asked by people, 'We live near a high-traffic area, should we be worried?' My response is that it's not so much about how much traffic there is, it's more about the percentage of trucks, older trucks in particular." The comprehensive study -- led by Evans and collaborators at Environment and Climate Change Canada, and the Ontario Ministry of the Environment, Conservation and Parks, as well as the Metro Vancouver Regional District -- involved measuring vehicle emissions near roads in Vancouver and Toronto, including the 401, North America's busiest stretch of highway. The difference between emission levels across the sites was more strongly correlated with the number of large trucks on the road than with the number of cars. Researchers found that air pollution levels right beside a major trucking route within a city were close to levels seen beside Highway 401, despite the road carrying less than one-tenth of the vehicle traffic. "This was in part due to differences in wind and proximity to the road but, surprisingly, the number of vehicles didn't make that much of a difference," said Evans. The data also revealed a significant drop in emissions on the 401 on the weekends, when personal vehicle traffic is still very high, but the volume of large truck traffic is low. Research consistently links traffic emissions to negative effects on both the environment and human health. "Whether it be cancer, respiratory problems, cardiac problems or neurodegenerative problems, there are numerous adverse health effects associated with the chemicals in these emissions," said Evans. "If we were able to reduce emission of pollutants like black carbon, we would also see an immediate climate benefit." Black carbon -- commonly called soot -- is a marker for exposure to diesel exhaust which is known to have negative health effects. Evans points out that modern trucks have made large improvements in their emissions -- it's the older diesel trucks that are the real culprits. "Those big, 18-wheeler diesel trucks last for a long time. We need to push to retrofit these old trucks with better emission treatment systems. Simply retrofitting the worst-offending trucks, or getting them off the road, is a tremendous opportunity to improve air quality in our cities." The study will be part of a larger report in December that will stress the importance of implementing long-term monitoring of traffic-related air pollution in Canada, indicating that targeting high-emitting vehicles such as old trucks can provide a path towards improving near-road air quality. In the meantime, Evans hopes the study gets Canadians thinking about the effects of working, playing and living near truck-related air pollution. 
"When a cyclist is riding near a large truck and they see a large plume of soot coming out -- it's important for them to be aware. Although shipping freight and construction by truck are critical to our economy, people need to know about the negative effects. There are ways that we can achieve a better balance." | Pollution | 2,018 |
September 7, 2018 | https://www.sciencedaily.com/releases/2018/09/180907091523.htm | Cover the U.S. in 89 percent trees, or go solar | How many fields of switchgrass and forests of trees would be needed to offset the energy produced by burning coal? A lot, it turns out. | While demand for energy isn't dropping, alarms raised by burning fossil fuels in order to get that energy are getting louder. Solutions to cancel the effects of carbon dumped into our atmosphere include carbon capture and storage or bio-sequestration. This zero-emission energy uses technical means as well as plants to take in and store carbon emissions. Another route is to use solar photovoltaics to convert sunlight directly into electricity and only sequester the carbon emissions from the production of solar cells. Zero-emission energy has been offered as a way to offset the carbon dioxide production while still maintaining coal's electricity generation. That's done through carbon capture and storage in saline aquifers, or by using both enhanced oil recovery and bio-sequestration through planting trees and other plants to suck up and store carbon. In a new study, researchers have shown for the first time that there is no comparison. It's not even close. In fact, coal-fired power plants require 13 times more land to be carbon neutral than the manufacturing of solar panels. We'd have to use a minimum of 62 percent of U.S. land covered by optimal crops or cover 89 percent of the U.S. with average forests to do it. "We know that climate change is a reality, but we don't want to live like cavemen," says Joshua Pearce, professor of materials science and engineering and electrical engineering at Michigan Tech. "We need a method to make carbon neutral electricity. It just makes no sense whatsoever to use coal when you have solar available, especially with this data." Researchers drew these conclusions from over 100 different data sources to compare energy, greenhouse gas emissions and land transformation needed to carbon neutralize each type of energy technology. They claim a one-gigawatt coal-fired plant would require a new forest larger than the state of Maryland to neutralize all of its carbon emissions. Researchers also found that applying the best-case bio-sequestration for all the greenhouse gases produced by coal-fired power plants would mean using 62 percent of the nation's arable land for that process, or 89 percent of all U.S. land with average forest cover. In comparison, solar cells require 13 times less land to become carbon neutral and five times less than the best-case coal scenario. "If your goal is to make electricity without introducing any carbon into the atmosphere, you should absolutely not do a coal plant," Pearce says. Not only is it not realistic to capture all the carbon dioxide they release, but burning coal also puts sulfur dioxide and nitrous oxide and particulates in the air, which creates air pollution, already estimated to cause 52,000 premature deaths annually. Pearce says that he and his team were generous to coal-fired power plants in how they calculated the efficiency of carbon capture and storage when scaled up. 
They also did not consider new ways that solar farms are being used to make them even more efficient, like using higher-efficiency black silicon solar cells, putting mirrors in between rows of panels so light falling between them can also be absorbed, or planting crops between rows (agrivoltaics) to achieve greater land-use efficiency. Pearce says future research should focus on improving the efficiency of solar panels and solar farms, not on carbon capture at fossil fuel-powered plants in an attempt to make them zero-emission energy sources, not when these data show that isn't a realistic way to protect our changing climate. | Pollution | 2,018
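The flavor of the land-use accounting can be reproduced with back-of-envelope arithmetic. The sketch below compares the forest area needed to absorb a 1 GW coal plant's emissions with the footprint of a solar farm producing the same annual energy; all rates are assumptions chosen for illustration, and the paper's 13x figure comes from a more detailed accounting that neutralizes solar manufacturing emissions rather than just counting the farm's footprint.

```python
# Back-of-envelope land comparison for a 1 GW coal plant. Every rate below
# is an assumption chosen for illustration, not a value from the paper.

PLANT_MW = 1000
COAL_CF, SOLAR_CF = 0.6, 0.2            # assumed capacity factors
HOURS_PER_YEAR = 8760

annual_mwh = PLANT_MW * COAL_CF * HOURS_PER_YEAR
co2_tonnes = annual_mwh * 1.0            # assume ~1 t CO2 per coal MWh

FOREST_UPTAKE = 7.5                      # assumed t CO2 per hectare per year
forest_ha = co2_tonnes / FOREST_UPTAKE

SOLAR_HA_PER_MW = 2.5                    # assumed solar-farm land intensity
solar_mw_needed = annual_mwh / (SOLAR_CF * HOURS_PER_YEAR)
solar_ha = solar_mw_needed * SOLAR_HA_PER_MW

print(f"CO2 to offset each year: {co2_tonnes:,.0f} t")
print(f"Forest required: {forest_ha:,.0f} ha")
print(f"Solar farm with equal annual output: {solar_ha:,.0f} ha")
print(f"Forest-to-solar land ratio: {forest_ha / solar_ha:.0f}x")
```

Even with generous assumptions for the forest uptake rate, the offset area comes out orders of magnitude larger than a comparable solar farm, which is the qualitative point the study quantifies rigorously.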
September 6, 2018 | https://www.sciencedaily.com/releases/2018/09/180906123310.htm | New study shows ways to maximize temperature-lowering benefits of Chicago's green roofs | Extreme heat poses a unique challenge to cities in the United States. According to the National Weather Service, extreme heat accounts for 20 percent of deaths by natural hazard in the United States, taking an average of 130 lives per year. | With exploding urban populations and increasing migration, cities are struggling to keep up with increases in extreme heat-related climate impacts, threatening human health, straining energy resources and reducing economic productivity. Heavily populated cities like Chicago have made an effort to mitigate the effects of extreme heat, implementing green roofs designed to provide insulation and significantly lower temperatures. Now, a new study evaluates how to get the most out of those green roofs. "We wanted to look at the potential of these types of mitigation strategies through the eyes of the mayor, city manager or city planner," said Ashish Sharma, research assistant professor in the Department of Civil and Environmental Engineering and Earth Sciences at the University of Notre Dame, who led the study for Notre Dame's Environmental Change Initiative. "If you're considering factors like temperature and electricity consumption to improve quality of life, reduce energy loads and lower temperatures, you need a scientific and interdisciplinary approach. We examined temperatures based on current climate models, electricity consumption (air conditioning) loads from publicly available data, and socioeconomic vulnerability of census tracts to identify susceptible hotspots. The goal of this study is to help city officials make more-informed decisions when it comes to urban planning." Sharma said Chicago was an ideal choice for such a study, as extreme heat has been a particular challenge for the city. During a particularly brutal heat wave in the summer of 1995, more than 700 people in Chicago died due to extreme heat. Previous studies found the impact was highest among disadvantaged neighborhoods. According to the City of Chicago's website, green roof coverage accounts for an estimated 5.5 million square feet, a number that is expected to rise given the city's goal of seeing 6,000 green roofs within the city by 2020. Though green roofs have lowered temperatures and contributed to improved air quality, they are a response to hotter temperatures, not a fix. The reality of climate change shows little relief in sight when it comes to extreme heat. Temperatures are expected to rise, with comparable heat waves expected to occur at the rate of twice a decade, according to current models, rising to five times per decade in high-emission scenarios. Sharma and his team simulated temperature data and used publicly available electricity consumption for the entire Chicago region. They then calculated a social vulnerability assessment, collecting variable data from the Centers for Disease Control and the American Community Survey at the census-tract level. The results became the Heat Variability Index (HVI). 
The combination of these factors allowed researchers to take a closer look at optimal locations for green roof implementation. "It's critical not only to identify where green roofs can lower the temperatures most, but also to identify populations that are disproportionately affected by high temperatures," Sharma wrote in the study. Electricity consumption alone can be misleading: the areas where air conditioning is used most, for example, may simply reflect affluence. In certain neighborhoods, residents can afford the cost, which ultimately makes them less vulnerable. In lower-income neighborhoods, some residents can't afford to turn their air conditioning on, or don't have access to air conditioning at all. By layering data, the result of the study is a comprehensive look at the utility of green roofs to reduce temperatures, ease electricity consumption and help the populations most vulnerable to heat exposure. "What we've seen when it comes to urban planning is decisions are made without interdisciplinary input," Sharma said. "Now, we have a framework for answering the question, how do we improve urban resilience to extreme heat?" The next step, Sharma said, is to enhance the framework to account for additional variations, such as those that occur throughout the day or across seasons, so models can be tailored to other cities and their unique conditions. Co-authors of the study include Alan Hamlet, Milan Budhathoki and Harindra Fernando at Notre Dame; Sierra Woodruff, formerly of Notre Dame and currently at the Department of Landscape Architecture and Urban Planning at Texas A&M University; and Fei Chen with the Research Application Laboratory at the National Center for Atmospheric Research. The study was funded by the Notre Dame Environmental Change Initiative, the Notre Dame Center for Research Computing, the National Center for Atmospheric Research supercomputing resources, and the United States Department of Agriculture-National Institute of Food and Agriculture's Agriculture and Food Research Initiative. | Pollution | 2,018
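The layering approach described above, normalize each indicator and then combine them into a tract-level score, can be sketched in a few lines. The data, the equal weighting, and the tract count below are hypothetical, not the study's actual Heat Variability Index formula.

```python
import numpy as np

# Hypothetical sketch of layering indicators into a tract-level heat
# vulnerability score, in the spirit of the study's Heat Variability Index.
# Data are simulated and the equal weighting is an assumption.

rng = np.random.default_rng(0)
n_tracts = 8
temperature = rng.uniform(28, 38, n_tracts)    # simulated summer mean [C]
electricity = rng.uniform(200, 900, n_tracts)  # simulated kWh per household
social_vuln = rng.uniform(0, 1, n_tracts)      # simulated CDC-style SVI

def minmax(x):
    """Rescale an indicator to 0-1 so the layers are comparable."""
    return (x - x.min()) / (x.max() - x.min())

score = (minmax(temperature) + minmax(electricity) + minmax(social_vuln)) / 3
for rank, tract in enumerate(np.argsort(score)[::-1], start=1):
    print(f"priority {rank}: tract {tract}, composite score {score[tract]:.2f}")
```

The point of the composite is exactly the one Sharma makes: a tract can rank low on any single layer (say, electricity use) yet rank high overall once social vulnerability is folded in.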
September 6, 2018 | https://www.sciencedaily.com/releases/2018/09/180906101344.htm | Risk gene for Alzheimer's may aggravate neurological effects of air pollution in children | There is growing evidence that exposure to air pollution adversely affects cognitive and behavioural development in children. However, the mechanisms underlying this association are, as yet, unknown. Now, the findings of a new study from the Barcelona Institute for Global Health (ISGlobal), an institute supported by the "la Caixa" Banking Foundation, suggest that the ε4 variant of the APOE gene may play a significant role in this process. The study has now been published. | Previous studies carried out within the framework of the BREATHE project have linked childhood exposure to air pollution with diminished cognitive development, increased behavioural problems, and even structural differences in the brains of the children studied. In the new study, which analysed data from over 1,600 children attending 39 schools in Barcelona, scientists observed that the association between exposure to traffic-related pollution and adverse effects on neurodevelopment was more marked in the children who carried the ε4 allele of the APOE gene. Carriers of this genetic variant had higher behaviour problem scores and their attention capacity developed more slowly. Moreover, the volume of the caudate nucleus, an anatomical brain structure, tended to be smaller in that population. "These findings suggest that children who carry this allele could be more vulnerable to the detrimental effects that air pollution has on important aspects of their neurodevelopment," explained Silvia Alemany, ISGlobal researcher and lead author of the study. "Systemic inflammation and oxidative stress are two of the most well-established mechanisms underlying the adverse health effects of air pollution. Interestingly, both these mechanisms are also involved in the pathogenesis of dementia. In fact, research has demonstrated an association between exposure to air pollution and cognitive impairment in older people. All these considerations, and the fact that APOE ε4 is the most important known genetic risk factor for Alzheimer's disease, led us to wonder whether the allele might also have a relationship with the adverse effects air pollution has on brain function in children," says Silvia Alemany. Genetic data were available for all of the participants. Tests were carried out to evaluate cognitive functions, behavioural problems and possible symptoms of attention deficit hyperactivity disorder. Traffic-related air pollution levels were calculated on the basis of actual measurements. Magnetic resonance imaging data were available for 163 of the study participants. "More research will be needed in other populations to replicate these results and we need to establish whether this possible genetic vulnerability also applies to exposure to air pollution during earlier stages of development, for example, in the prenatal period," warns ISGlobal researcher Jordi Sunyer, director of the BREATHE project. "In any case, once again the findings are clear: it is essential to implement measures to reduce traffic in the urban environment and, particularly, in places where children are present, such as the areas around schools." | Pollution | 2,018
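Statistically, a finding like this usually rests on a gene-by-environment interaction term: the pollution slope is allowed to differ between allele carriers and non-carriers. The sketch below shows the idea on simulated data; it is not the study's model, which adjusted for many additional covariates.

```python
import numpy as np

# Sketch of a gene-by-environment interaction: does the pollution slope on a
# behavior score differ for APOE-e4 carriers? Data are simulated; the actual
# study's models adjusted for many covariates (age, sex, school, etc.).

rng = np.random.default_rng(1)
n = 500
pm = rng.uniform(10, 40, n)            # exposure level (arbitrary units)
apoe4 = rng.integers(0, 2, n)          # 1 = e4 carrier
# Simulate the hypothesized effect: a steeper pollution slope in carriers.
y = 0.10 * pm + 0.30 * apoe4 + 0.15 * pm * apoe4 + rng.normal(0, 2, n)

# Ordinary least squares with an interaction term pm * apoe4.
X = np.column_stack([np.ones(n), pm, apoe4, pm * apoe4])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"pollution slope, non-carriers: {beta[1]:.3f}")
print(f"additional slope in e4 carriers (interaction): {beta[3]:.3f}")
```

A positive and statistically significant interaction coefficient is what "the association was more marked in carriers" translates to in regression terms.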
September 4, 2018 | https://www.sciencedaily.com/releases/2018/09/180904114756.htm | Reducing nitrogen inputs prevents algal blooms in lakes | For decades, experts have debated whether reducing the amount of nitrogen flowing into lakes can improve water quality in the long-term, even though blue-green algae can bind nitrogen from the air. However, no lakes with decreased nitrogen inputs have been monitored for long enough to clarify this -- until now: scientists from the Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB) have analysed long-term data to prove that decreasing nitrogen in Berlin's Lake Müggelsee is the key to reducing algal blooms in summer. They showed that the amount of atmospheric nitrogen bound by blue-green algae is far too small to be used as an argument against the ecologically necessary reduction of nitrogen inputs. | In the 1970s, scientists discovered that nutrient inputs -- mainly phosphorus and nitrogen -- from agriculture and wastewater discharge were the main cause of excessive plant and algal growth in lakes and rivers. Since then, water management experts have concentrated on reducing phosphorus inputs. "Although this strategy often works, it is by no means always successful. In shallow lakes, the sediment releases large quantities of phosphorus in summer. In these cases, reducing nitrogen input may help to control algal blooms because algae need both phosphorus and nitrogen to grow. Until now, however, there has been no convincing evidence that decreasing nitrogen inputs, which is more complex and costly than decreasing phosphorus, works in the long term," stated IGB freshwater ecologist Dr. Tom Shatwell, explaining the starting point of the study. To conduct their investigation, the scientists statistically analysed 38 years of data (1979-2016). Since the 1970s, Lake Müggelsee (in Berlin, Germany) and its tributaries have been sampled on a weekly basis as part of a long-term programme to investigate phosphorus and nitrogen concentrations as well as the species composition in algal communities. Müggelsee is one of the few lakes in the world that have experienced a significant decrease in phosphorus and nitrogen pollution and that have been monitored for a sufficiently long time to draw conclusions on the effects of reducing nitrogen inputs. Every summer, there was an excess of phosphorus in the water of Lake Müggelsee. The scientists concluded it was the decrease in nitrogen that caused algae blooms to decrease -- and water clarity to increase. Contrary to common views, blue-green algae species did not replace the nitrogen missing from the tributaries with nitrogen from the atmosphere in the long term. In fact, blue-green algae did not increase in abundance and there was very little binding of atmospheric nitrogen. "It takes much more energy to fix atmospheric nitrogen than it does to use nitrogen compounds present in the water. Blue-green algae obviously only use this method when absolutely necessary and when there is sufficient solar energy," explained Dr. Jan Köhler, co-author and leader of the "Photosynthesis and Growth of Phytoplankton and Macrophytes" research group at IGB. | Pollution | 2,018
September 4, 2018 | https://www.sciencedaily.com/releases/2018/09/180904114656.htm | Alpine ecosystems struggle to recover from nitrogen deposition | What happens to high mountain ecosystems when you take away air pollution? Not much, not very quickly. A new CU research study finds that degraded alpine ecosystems showed limited recovery years after long-term inputs of human-caused nitrogen air pollution, with soil acidification and effects on biodiversity lingering even after a decade of much lower nitrogen input levels. | "The legacy of the impacts of nitrogen pollution is strong, and our results emphasize that sensitive standards are needed to minimize enduring environmental impacts," said William Bowman, lead author of the recently published study and a professor in CU Boulder's Department of Ecology and Evolutionary Biology (EBIO). Nitrogen is a key nutrient for life, but agricultural and industrial activities have increased global levels significantly over the last two centuries, with previous research indicating harmful effects on water quality, soil acidity and biodiversity. Nitrogen emission rates have slowed in most of the U.S. and Europe in recent years, but continue to increase in developing regions. The new study explores the extent to which alpine ecosystems can recover or reverse the effects of nitrogen deposition even after input levels have slowed. To test the difference, CU Boulder researchers used a long-running set of field plots first established in 1997 on Colorado's Niwot Ridge at an elevation of 11,400 feet. The plots had been artificially exposed to varying levels of additional nitrogen over the course of 12 years. Beginning in 2009, the researchers divided the plots in half, continuing to fertilize one half at the same rate while cutting off nitrogen to the other. Then, they followed the changes in the plots' biotic composition and ecosystem processes for nine more years, tracking changes in plant diversity, microbial abundance and soil acidity. Overall, the researchers found that vegetation recovery was more limited in the areas that had received the highest levels of nitrogen previously, even after gaining a reprieve in subsequent years. Bacteria and fungi abundances also remained lowered and soil remained acidic, indicating sustained impacts that cannot be easily reversed. "The altered chemistry and biology of the ecosystem stimulated the rates of nitrogen cycling in the soil, extending the negative impacts of a high nitrogen condition," Bowman said. Additionally, some recovery processes operate at geologic scales, relying on the breakdown of the rocks and soil particles that can take decades or longer. The findings indicate that many of the effects of human-caused nitrogen deposition may already be baked into ecosystems and hamper their recovery regardless of future decreases in emission rates, a crucial consideration for setting environmental regulations and pollution standards. Additional co-authors of the research include Cliff Bueno de Mesquita, Noah Fierer, Stefanie Sternagel and Teal Potter of CU Boulder and Asma Ayyad of University of California Riverside. The National Science Foundation provided funding for the study via the Niwot Ridge Long-Term Ecological Research Project and a Research Experiences for Undergraduates Grant. | Pollution | 2,018
August 30, 2018 | https://www.sciencedaily.com/releases/2018/08/180830084812.htm | Engineered sand zaps storm water pollutants | University of California, Berkeley, engineers have created a new way to remove contaminants from storm water, potentially addressing the needs of water-stressed communities that are searching for ways to tap the abundant and yet underused source of fresh drinking water. | Using a mineral-coated sand that reacts with and destroys organic pollutants, the researchers have discovered that the engineered sand could help purify storm water percolating into underground aquifers, creating a safe and local reservoir of drinking water for parched communities. "The way we treat storm water, especially in California, is broken. We think of it as a pollutant, but we should be thinking about it as a solution," said Joseph Charbonnet, a graduate student in civil and environmental engineering at UC Berkeley. "We have developed a technology that can remove contamination before we put it in our drinking water in a passive, low-cost, non-invasive way using naturally-occurring minerals." As rain water rushes over our roofs, lawns and streets, it can pick up a slew of nasty chemicals such as herbicides, pesticides, toxic metals, car oil and even dog poop. Excess storm water can also overwhelm sewer systems and flood streets and basements. Not surprisingly, cities often discharge this polluted water into neighboring rivers and streams as quickly as possible. Directing storm water through sand into underground aquifers may be an ideal solution for gathering water in cities with Mediterranean climates like Los Angeles, Charbonnet said. Like giant rain barrels, aquifers can be filled during periods of intense rainfall and then store water until it is needed in the dry season. Cities are already using storm water reclamation on smaller scales through constructs such as bioswales and rain gardens, which funnel storm water through sand or mulch to remove debris and prevent surface runoff. In the Sun Valley neighborhood of Los Angeles, Charbonnet and his adviser, David Sedlak, are working with the local community to transform a 46-acre gravel pit into a wetland and water infiltration system for storm water. "Before we built the buildings, roads and parking lots that comprise our cities, rainwater would percolate into the ground and recharge groundwater aquifers," said Sedlak, professor of civil and environmental engineering at UC Berkeley and co-director of the Berkeley Water Center. "As utilities in water stressed regions try to figure out how to get urban storm water back into the ground, the issue of water quality has become a major concern. Our coated sands represent an inexpensive, new approach that can remove many of the contaminants that pose risks to groundwater systems where storm water is being infiltrated." Although the coated sand doesn't remove all types of contaminants, it may be used in conjunction with other water purification systems to remove many of the contaminants that water picks up, Sedlak said. The team details the finding in a study published Aug. 30. To create the coated sand, Charbonnet mixed plain sand with two forms of manganese that react to form manganese oxide. 
This harmless mineral binds to organic chemicals such as herbicides, pesticides, and the endocrine-disrupting bisphenol-A (BPA) and breaks them down into smaller pieces that are usually less toxic and more biodegradable. "Manganese oxides are something that soil scientists identified 30 or 40 years ago as having these really interesting properties, but we are one of the first groups to use it in engineered ways to help unlock this water source," Charbonnet said. The manganese oxide-coated sand, which is a dull brown color, is safe and environmentally friendly. "I guarantee that you have some manganese oxide on your shoe right now because it is ubiquitous in the soil," Charbonnet said. Charbonnet tested the sand by percolating simulated storm water, which contained a low concentration of BPA, through columns of the material. The coated sand initially removed nearly all of the BPA, but lost its effectiveness over time. However, the manganese oxide could be "recharged" by bathing the sand in a solution containing a low concentration of chlorine. Recharging the sand restored all of the manganese oxide's initial reactivity. "If you have to come in every year or two and dig up this sand and replace it, that is incredibly labor intensive, so in order to make this useful for community stakeholders it's really important that this stuff can be regenerated in place," Charbonnet said. Charbonnet estimates that it would take about two days to recharge a half-meter-deep layer of sand using 25 parts per million of chlorine in water, the concentration used to treat wastewater. In the next phase of the experiment, the team is performing field tests in Sonoma County using storm water from a local creek. | Pollution | 2,018
August 29, 2018 | https://www.sciencedaily.com/releases/2018/08/180829133214.htm | Drought increases CO2 concentration in the air | Land ecosystems absorb on average 30% of anthropogenic CO2 emissions. | Plants are usually able to access water deep in the soil through their roots. However, conventional satellites only see what happens at the surface and cannot measure how much water is available underground. In the last few years, a new type of satellite mission has been used to measure extremely small changes in the Earth's gravity field. It was found that some small perturbations of the gravity field are caused by changes in water storage. When there is a major drought in a given region, there is less water mass and gravity is consequently slightly weaker over that region. Such variations are so small that they are imperceptible to humans. But by measuring them with satellites, scientists are able to estimate large-scale changes in water storage to an accuracy of about four centimetres everywhere on the planet. Using these new satellite observations of water storage, Vincent Humphrey and his colleagues were able to measure the overall impact of droughts on photosynthesis and ecosystem respiration. They compared year-to-year changes in total water mass over all continents against global measurements of CO2. During the last century, the concentration of CO2 in the atmosphere has risen continuously. | Pollution | 2,018
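The core comparison here is statistical: year-to-year anomalies in satellite-derived terrestrial water storage set against anomalies in the atmospheric CO2 record. The sketch below reproduces that idea on synthetic series; the numbers are made up, and the negative sign is the one the drought mechanism predicts.

```python
import numpy as np

# Synthetic sketch of the paper's core comparison: interannual anomalies in
# terrestrial water storage (TWS, from gravity satellites) versus anomalies
# in the atmospheric CO2 growth rate. All series here are made up.

rng = np.random.default_rng(2)
years = np.arange(2003, 2017)
tws = rng.normal(0.0, 2.0, years.size)   # toy water-storage anomaly
# Construct a CO2 growth rate that runs faster when land water is scarce.
co2_growth = 2.0 - 0.15 * tws + rng.normal(0.0, 0.2, years.size)  # ppm/yr

tws_anom = tws - tws.mean()
co2_anom = co2_growth - co2_growth.mean()
r = np.corrcoef(tws_anom, co2_anom)[0, 1]
# A negative correlation is what the drought mechanism predicts:
# dry years -> weaker land carbon uptake -> faster CO2 growth.
print(f"correlation(TWS anomaly, CO2 growth anomaly) = {r:.2f}")
```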
August 29, 2018 | https://www.sciencedaily.com/releases/2018/08/180829115514.htm | Air pollution can put a dent in solar power | Ian Marius Peters, now an MIT research scientist, was working on solar energy research in Singapore in 2013 when he encountered an extraordinary cloud of pollution. The city was suddenly engulfed in a foul-smelling cloud of haze so thick that from one side of a street you couldn't see the buildings on the other side, and the air had the acrid smell of burning. The event, triggered by forest fires in Indonesia and concentrated by unusual wind patterns, lasted two weeks, quickly causing stores to run out of face masks as citizens snapped them up to aid their breathing. | While others were addressing the public health issues of the thick air pollution, Peters' co-worker Andre Nobre from Cleantech Energy Corp., whose field is also solar energy, wondered about what impact such hazes might have on the output of solar panels in the area. That led to a years-long project to try to quantify just how urban-based solar installations are affected by hazes, which tend to be concentrated in dense cities. Now, the results of that research have just been published. After initially collecting data on both the amount of solar radiation reaching the ground, and the amount of particulate matter in the air as measured by other instruments, Peters worked with MIT associate professor of mechanical engineering Tonio Buonassisi and three others to find a way to calculate the amount of sunlight that was being absorbed or scattered by haze before reaching the solar panels. Finding the necessary data to determine that level of absorption proved to be surprisingly difficult. Eventually, they were able to collect data in Delhi, India, providing measures of insolation and of pollution over a two-year period -- and confirmed significant reductions in the solar-panel output. But unlike Singapore, what they found was that "in Delhi it's constant. There's never a day without pollution," Peters says. There, they found the annual average level of attenuation of the solar panel output was about 12 percent. While that might not sound like such a large amount, Peters points out that it is larger than the profit margins for some solar installations, and thus could literally be enough to make the difference between a successful project and one that fails -- not only impacting that project, but also potentially causing a ripple effect by deterring others from investing in solar projects. If the size of an installation is based on expected levels of sunlight reaching the ground in that area, without considering the effects of haze, it will instead fall short of meeting its intended output and its expected revenues. "When you're doing project planning, if you haven't considered air pollution, you're going to undersize, and get a wrong estimate of your return on investment," Peters says. After their detailed Delhi study, the team examined preliminary data from 16 other cities around the world, and found impacts ranging from 2 percent for Singapore to over 9 percent for Beijing, Dakha, Ulan Bator, and Kolkata. In addition, they looked at how the different types of solar cells -- gallium arsenide, cadmium telluride, and perovskite -- are affected by the hazes, because of their different spectral responses. 
All of them were affected even more strongly than the standard silicon panels they initially studied, with perovskite, a highly promising newer solar cell material, being affected the most (with over 17 percent attenuation in Delhi). Many countries around the world have been moving toward greater installation of urban solar panels, with India aiming for 40 gigawatts (GW) of rooftop solar installations, while China already has 22 GW of them. Most of these are in urban areas. So the impact of these reductions in output could be quite severe, the researchers say. In Delhi alone, the lost revenue from power generation could amount to as much as $20 million annually; for Kolkata about $16 million; and for Beijing and Shanghai it's about $10 million annually each, the team estimates. Planned installations in Los Angeles could lose between $6 million and $9 million. Overall, they project, the potential losses "could easily amount to hundreds of millions, if not billions of dollars annually." And if systems are under-designed because of a failure to take hazes into account, that could also affect overall system reliability, they say. Peters says that the major health benefits related to reducing levels of air pollution should be motivation enough for nations to take strong measures, but this study "hopefully is another small piece of showing that we really should improve air quality in cities, and showing that it really matters." | Pollution | 2,018
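The revenue figures quoted above follow from simple energy accounting: attenuated sunlight translates into lost generation, which translates into lost sales. The sketch below shows the arithmetic; capacities, capacity factors, tariffs, and attenuation levels are assumptions for illustration, not the paper's inputs.

```python
# Back-of-envelope revenue loss from haze attenuation of solar output.
# Capacities, capacity factors, attenuation levels, and tariffs below are
# assumptions for illustration, not figures from the paper.

def annual_loss_usd(capacity_mw, capacity_factor, attenuation, tariff_usd_mwh):
    """Revenue lost per year when haze removes `attenuation` of output."""
    unattenuated_mwh = capacity_mw * capacity_factor * 8760
    return unattenuated_mwh * attenuation * tariff_usd_mwh

scenarios = {
    "Delhi-like city (12% attenuation)": (2000, 0.18, 0.12, 60),
    "Singapore-like city (2% attenuation)": (500, 0.15, 0.02, 80),
}
for name, (mw, cf, att, tariff) in scenarios.items():
    loss = annual_loss_usd(mw, cf, att, tariff)
    print(f"{name}: ~${loss / 1e6:.1f} million per year")
```

With a couple of gigawatts of installed capacity, a 12 percent attenuation works out to tens of millions of dollars a year, which is the order of magnitude the team reports for Delhi.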
August 29, 2018 | https://www.sciencedaily.com/releases/2018/08/180829081342.htm | Biodegradable plastic blends offer new options for disposal | Imagine throwing your empty plastic water bottle into a household composting bin that breaks down the plastic and produces biogas to help power your home. Now, researchers have taken an early step toward this futuristic scenario by showing that certain blends of bioplastics can decompose under diverse conditions. They report their results in an ACS journal. | Plastic waste pollution is a global environmental problem, particularly in oceans, where plastic debris can harm or kill sea animals and birds who ingest or become entangled in it. Despite increased levels of recycling in many countries, most plastic waste still ends up in landfills or the environment. Scientists have developed biodegradable plastics, but they often lack the flexibility, strength or toughness of conventional plastics. Blends of different bioplastics can offer improved characteristics, but their environmental fate is uncertain. Tanja Narancic, Kevin O'Connor, Ramesh Babu Padamati and colleagues wanted to examine the degradation of individual bioplastics and their blends under various conditions. The researchers studied the fates of 15 different plastics or blends under managed conditions, such as composting and anaerobic digestion, as well as unmanaged environments, including soil and fresh or marine water. Polylactic acid (PLA) is one of the best-selling biodegradable plastics on the market, but it requires high temperatures for breakdown and is not home-compostable. Surprisingly, a blend of PLA and polycaprolactone (PCL) degraded completely to carbon dioxide, biomass and water under typical home-composting conditions. Many of the individual plastics and blends that were tested decomposed under conditions of anaerobic digestion, a process that can produce biogas, and all degraded with industrial composting. The researchers say that biodegradable plastic blends could create new possibilities for managing plastic waste. However, only two plastics, polyhydroxybutyrate (PHB) and thermoplastic starch (TPS), broke down completely under all soil and water conditions. Therefore, biodegradable plastics are not a panacea for plastic pollution, and they must be managed carefully after they leave the consumer, the researchers say. | Pollution | 2,018
August 28, 2018 | https://www.sciencedaily.com/releases/2018/08/180828085857.htm | Traffic noise may make birds age faster | Traffic noise may be associated with an increased rate of telomere loss in Zebra finches that have left the nest, according to a new study. | Researchers at the Max Planck Institute for Ornithology, Germany, and North Dakota State University, USA, investigated the effect of traffic noise on the telomere length of offspring Zebra finches. The researchers found that Zebra finches that were exposed to traffic noise after they had left the nest had shorter telomeres at 120 days of age than Zebra finches that were exposed to noise until 18 days post-hatch (before they had left the nest) and whose parents were exposed to traffic noise during courtship, egg-laying, and nesting. Finches exposed to noise after leaving the nest also had shorter telomeres than those which had not been exposed to traffic noise at all. Dr Adriana Dorado-Correa, corresponding author of the study, said: "Our study suggests that urban noise alone, independent from the many other aspects of city life, such as light pollution or chemical pollution, is associated with increased telomere loss and may contribute to aging in Zebra finches. Our study is a first step towards identifying the causal mechanisms that may account for differences in life span observed between birds living in urban or rural environments." Dr Sue Anne Zollinger, co-author of the study, added: "Cellular aging as a result of urban stressors is something that may not have a very visible impact, but our study indicates that although birds may seem to be adapting to life in noisy cities, they may actually be aging faster. It may be important to consider developmental stages in birds when studying the effects of urbanization, as the mechanisms by which these human-induced habitat changes impact individuals may change throughout their lifetime." As only Zebra finches exposed to noise after leaving the nest had shorter telomeres, the authors suggest that the time between 18 and 120 days after hatching is a critical period during which birds are more affected by noise. This period of time is also when Zebra finches begin song learning, which may make them more sensitive to noise. By contrast, Zebra finches may be less sensitive to noise while still in the nest, and parent birds may be able to make behavioural changes to protect offspring from the negative effects of noise exposure. The researchers evaluated the impact of traffic noise exposure on a total of 263 birds by comparing telomere lengths at 21 and 120 days post-hatch under three different conditions: birds that hatched to parents that were exposed to noise, with the offspring themselves exposed until 18 days after hatching; birds that hatched to non-noise exposed parents but which were themselves exposed to noise from day 18 to 120; and controls in which neither the parents nor the chicks were exposed to noise. The traffic noise used in the study consisted of recordings of street traffic which mimicked typical urban noise patterns. The researchers collected blood samples for each offspring bird at 21 and 120 days post hatch to measure telomere length and rate of telomere loss. | Pollution | 2,018
August 27, 2018 | https://www.sciencedaily.com/releases/2018/08/180827180752.htm | Many Arctic pollutants decrease after market removal and regulation | Levels of some persistent organic pollutants (POPs) regulated by the Stockholm Convention are decreasing in the Arctic, according to an international team of researchers who have been actively monitoring the northern regions of the globe. | POPs are a diverse group of long-lived chemicals that can travel long distances from their source of manufacture or use. Many POPs were used extensively in industry, consumer products or as pesticides in agriculture. Well-known POPs include chemicals such as DDT and PCBs (polychlorinated biphenyls), and some of the products they were used in included flame retardants and fabric coatings. Because POPs were found to cause health problems for people and wildlife, they were largely banned or phased out of production in many countries. Many have been linked to reproductive, developmental, neurological and immunological problems in mammals. The accumulation of DDT, a well-known and heavily used POP, was also linked to eggshell-thinning in fish-eating birds, such as eagles and pelicans, in the late 20th century, and caused catastrophic population declines for those animals. In 2001, 152 countries signed a United Nations treaty in Stockholm, Sweden, intended to eliminate, restrict or minimize unintentional production of 12 of the most widely used POPs. Later amendments added more chemicals to the initial list. Today, more than 33 POP chemicals or groups are covered by what is commonly called the "Stockholm Convention," which has been recognized by 182 countries. "This paper shows that following the treaty and earlier phase-outs have largely resulted in a decline of these contaminants in the Arctic," says John Kucklick, a biologist from the National Institute of Standards and Technology (NIST) and the senior U.S. author on the paper, published August 23. "In general, the contaminants that are being regulated are decreasing," says Frank Rigét from the Department of Bioscience, Aarhus University, Denmark, and lead author. POPs are particularly problematic in the Arctic because the ecosystem there is especially fragile, and pollution can come from both local sources and from thousands of miles away due to air and water currents. POPs also bioaccumulate. This means that they build up faster in animals and humans than they can be excreted, and that exposure can increase up the food chain. Plankton exposed to POPs in water are eaten by schools of fish, which are in turn eaten by seals or whales, and with each jump up the food chain the amount of POPs increases. The same is true for terrestrial animals. A large mammal's exposure, therefore, can be large and long-lasting. Indigenous people living in northern coastal areas such as Alaska often consume more fish and other animals that come from higher on the food chain than the average American. Such communities, therefore, are potentially exposed to larger amounts of these pollutants. For almost two decades beginning in 2000, Kucklick and Rigét worked in conjunction with scientists from Denmark, Sweden, Canada, Iceland and Norway to track POPs in the fat of several marine mammals and in the tissue of shellfish and seabirds. They also monitored air in the Arctic Circle for pollution. To gain a fuller picture of how the deposition of POPs might have changed over time, the study included specimens archived since the 1980s and '90s in special storage facilities around the globe. 
The U.S. specimens were provided by the NIST Biorepository, located in Charleston, South Carolina. Samples archived in that facility are part of the Alaska Marine Mammal Tissue Archival Project (AMMTAP) or the Seabird Tissue Archival and Monitoring Project (STAMP). Both collections are conducted in collaboration with other federal agencies. The study pooled more than 1,000 samples taken over the course of several decades from many different locations throughout the Arctic Circle. In general, the so-called legacy POPs -- those that have been eliminated or restricted from production -- were shown to be decreasing over the past two to three decades, although some had decreased more than others. The biggest decreases were in a byproduct of the pesticide lindane, α-HCH, with a mean annual decline of 9 percent in Arctic wildlife. The research team found PCBs had decreased as well. Most industrial countries banned PCBs in the 1970s and '80s, and their production was reduced under the Stockholm Convention in 2004. Previously, the compounds had been widely used in electrical systems. In this study, it was found that their presence had decreased by almost 4 percent per year across the Arctic region since being pulled from the market. Two of the legacy POPs listed under Stockholm, β-HCH and HCB, showed only small declines of less than 3 percent per year. β-HCH was part of a heavily-used pesticide mixture with the active ingredient lindane and HCB was used both in agriculture and industry. A small number of the legacy POPs had increased in a few locations, although some of those were at sites suspected to be influenced by strong, still-existing local pollution sources. Notably, the flame retardant hexabromocyclododecane (HBCDD) showed an annual increase of 7.6 percent. HBCDD was one of 16 additional POPs added to the Stockholm Convention as of 2017 and is recommended for elimination from use, with certain exemptions. Most of the research conducted for this paper was a direct result of the 2001 treaty stipulations, which included a requirement that sponsors participate in ongoing, long-term biological monitoring. Although the U.S. participated in the research, it has not ratified the treaty. It is expected that work on the treaty will continue as new POPs are identified. This recent research work highlights the usefulness of long-term data and international scientific collaboration, says Rigét. "You really need to gather more than 10 years of data before you can see the trend because in the short term there can be some small fluctuations," he notes. "Looking at this data also showed us how to be more economical and avoid over-sampling in the future." | Pollution | 2,018
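An "annual percent decline" like the 9 percent reported for α-HCH is conventionally estimated by fitting a straight line to log-transformed concentrations and converting the slope into a yearly change. The sketch below demonstrates the method on synthetic monitoring data; it is not the study's dataset or its exact statistical model.

```python
import numpy as np

# Sketch of how an annual percent decline is estimated from a monitoring
# series: regress log-concentration on year and convert the slope to a
# yearly change. The data are synthetic, not the study's measurements.

rng = np.random.default_rng(3)
years = np.arange(1990, 2016)
true_decline = 0.09  # 9%/yr, the magnitude reported for alpha-HCH
conc = 100.0 * (1.0 - true_decline) ** (years - years[0])
conc *= rng.lognormal(0.0, 0.1, years.size)  # multiplicative sampling noise

slope, _ = np.polyfit(years, np.log(conc), 1)
annual_change = np.exp(slope) - 1.0
print(f"estimated annual change: {annual_change:+.1%}")  # close to -9%
```

Fitting in log space makes the estimate a constant percentage change per year rather than a constant absolute change, which also explains Rigét's point: with only a few noisy years, the fitted slope is easily swamped by fluctuations.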
August 27, 2018 | https://www.sciencedaily.com/releases/2018/08/180827134431.htm | This bright blue dye is found in fabric: Could it also power batteries? | A sapphire-colored dye called methylene blue is a common ingredient in wastewater from textile mills. | But University at Buffalo scientists think it may be possible to give this industrial pollutant a second life. In a new study, they show that the dye, when dissolved in water, is good at storing and releasing energy on cue. This makes the compound a promising candidate material for redox flow batteries -- large, rechargeable liquid-based batteries that could enable future wind farms and solar homes to stockpile electricity for calm or rainy days. The research appeared online on Aug. 13. "Methylene blue is a widely used dye. It can be harmful to health, so it's not something you want to dump into the environment without treating it," says lead researcher Timothy Cook, PhD, assistant professor of chemistry in the UB College of Arts and Sciences. "There's been a lot of work done on ways to sequester methylene blue out of water, but the problem with a lot of these methods is that they're expensive and generate other kinds of waste products." "But what if instead of just cleaning the water up, we could find a new way to use it? That's what really motivated this project," says first author Anjula Kosswattaarachchi, a UB PhD student in chemistry. The study is just the first step in assessing how -- and whether -- methylene blue from industrial wastewater can be used in batteries. "For this to be practical, we would need to avoid the costly process of extracting the dye from the water," Cook says. "One of the things we're interested in is whether there might be a way to literally repurpose the wastewater itself. In textile-making, there are salts in the wastewater. Usually, to make a redox flow battery work, you have to add salt as a supporting electrolyte, so the salt in wastewater might be a built-in solution. This is all speculative right now: We don't know if it will work because we haven't tested it yet." What Cook and Kosswattaarachchi have shown -- so far -- is that methylene blue is good at important tasks associated with energy storage. In experiments, the scientists built two simple batteries that employed the dye -- dissolved in salt water -- to capture, store and release electrons (all crucial jobs in the life of a power cell). The first battery the researchers made operated with near-perfect efficiency when it was charged and drained 50 times: Any electrical energy the scientists put in, they also got out, for the most part. Over time, however, the battery's capacity for storing energy fell as molecules of methylene blue became trapped on a membrane critical to the device's proper function. Choosing a new membrane material solved this problem in the scientists' second battery. This device maintained the near-perfect efficiency of the first model, but had no notable drop in energy storage capacity over 12 cycles of charging and discharging. The results mean that methylene blue is a viable material for liquid batteries.
With this established, the team hopes to take the research one step further by obtaining real wastewater from a textile mill that uses the dye. "We'd like to evaporate the wastewater into a more concentrated solution containing the methylene blue and the salts, which can then be tested directly in a battery," Cook says. The project is important to Kosswattaarachchi from a personal standpoint: before coming to UB, she worked in Sri Lanka's textile industry, developing new fabric technologies for the Sri Lanka Institute of Nanotechnology (SLINTEC). Textiles are one of the country's most important economic sectors, and the industry creates many jobs. But pollution is a downside, with wastewater an environmental concern. "We believe that this work could set the stage for an alternative route for wastewater management, paving a path to a green-energy storage technology," Kosswattaarachchi says. | Pollution | 2,018
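Cycling performance of a flow cell is usually summarized by two numbers: coulombic efficiency (charge out versus charge in per cycle) and capacity retention across cycles. A minimal sketch with invented charge values, purely to illustrate the two metrics described above (not data from the UB experiments):

```python
# Hypothetical (charge in, charge out) per cycle, in coulombs -- illustrative only.
cycles = [(100.0, 99.5), (98.2, 97.7), (96.9, 96.4)]

first_discharge = cycles[0][1]
for i, (q_in, q_out) in enumerate(cycles, start=1):
    coulombic_eff = 100 * q_out / q_in          # near 100% = "near-perfect efficiency"
    retention = 100 * q_out / first_discharge   # falling values = capacity fade
    print(f"cycle {i}: efficiency {coulombic_eff:.1f}%, retention {retention:.1f}%")
```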
August 26, 2018 | https://www.sciencedaily.com/releases/2018/08/180826120735.htm | Long-term cooking with coal, wood, or charcoal associated with cardiovascular death | Long-term use of coal, wood, or charcoal for cooking is associated with an increased risk of death from cardiovascular disease, according to a study presented today at ESC Congress 2018. | Dr Derrick Bennett, study author, University of Oxford, UK, said: "Our study suggests that people who use solid fuels for cooking should switch to electricity or gas as soon as possible." It has been suggested that air pollution from cooking with solid fuels, such as coal, wood, or charcoal, may lead to premature death from cardiovascular disease, but there is limited evidence. This study assessed the association between solid fuel use for cooking and cardiovascular death, as well as the potential impact of switching from solid to clean fuel (electricity or gas). The study included 341,730 adults aged 30-79 years recruited from ten areas of China in 2004 to 2008. Participants were interviewed about how often they cooked and the main fuel used at their three most recent homes. The researchers then estimated the duration of exposure to solid fuels. The analysis was restricted to those who cooked at least weekly at their three most recent residences and did not have cardiovascular disease. Information on mortality up to 1 January 2017 was collected from death registries and hospital records. The average age of participants was 51.7 years and three-quarters were female. Nine out of ten had spent at least 20 years in their three most recent residences. Overall, 22.5% of participants had primarily used solid fuels for cooking for 30 years or more, 24.6% for 10-29 years, and 53.0% for less than ten years. Among the latter, 45.9% had never used solid fuels in their most recent three homes and 49.1% had switched from solid to clean fuels during this period. During 3.4 million person-years of follow-up, 8,304 participants died from cardiovascular disease. After adjusting for education, smoking and other cardiovascular risk factors, each decade of exposure to solid fuel was associated with a 3% higher risk of cardiovascular death (95% confidence interval [CI] 1-4%, p=0.0002). Participants who had used solid fuels for 30 years or longer had a 12% greater risk of cardiovascular death than those who had used them for less than ten years (95% CI 3-21%, p=0.0045). Compared to persistent long-term use of solid fuels, adopting clean fuels was associated with a lower risk of death from cardiovascular disease. Each decade earlier switch from solid to clean fuels was associated with a 5% lower risk of cardiovascular death (95% CI 1-8%, p=0.0067). Participants who had changed for ten years or longer had risks comparable to persistent clean fuel users. Professor Zhengming Chen, principal investigator, University of Oxford, UK, said: "We found that long-term use of solid fuels for cooking was associated with an excess risk of cardiovascular death, after accounting for established risk factors. Switching to electricity or gas weakened the impact of previous solid fuel use, suggesting that the negative association may be reversible." | Pollution | 2,018
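To see how a per-decade estimate of this kind scales over a cooking lifetime, the relative risks can be multiplied across decades. The sketch below is a simplified illustration of compounding hazard ratios (HRs), not the study's covariate-adjusted model:

```python
hr_per_decade = 1.03  # 3% higher cardiovascular mortality risk per decade of use

for decades in (1, 2, 3):
    hr = hr_per_decade ** decades
    print(f"{decades * 10} years of solid-fuel cooking: HR ~ {hr:.2f} "
          f"({(hr - 1) * 100:.0f}% excess risk)")
# Three compounded decades give ~9% excess risk, the same order of magnitude
# as the 12% reported for 30+ years versus <10 years of use.
```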
August 26, 2018 | https://www.sciencedaily.com/releases/2018/08/180826120729.htm | Red light at night: A potentially fatal attraction to migratory bats | Night-time light pollution is rapidly increasing across the world. Nocturnal animals are likely to be especially affected, but how they respond to artificial light is still largely unknown. In a new study, scientists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) in Berlin, Germany, tested the response of European bats to red and white light sources during their seasonal migration. | Soprano pipistrelles were attracted to red light sources, whereas white light had no such effect. The study has just been published in a scientific journal. Each year, light pollution increases by around six per cent worldwide. In particular, cheap, energy-efficient LEDs are used more and more. Light is an important cue for orientation used by many animals, and also influences their diurnal rhythms and behaviour. It is well established that bats are sensitive to light while hunting at night. While some species are attracted to artificial light sources because of the insects nearby, most bat species generally avoid artificial light. Most previous studies examined the response of bats to artificial light during non-migratory periods. It is already well known that artificial light causes disorientation in birds that migrate at night. Does the same apply to bats? Many bat species also travel for several hundred or even thousand kilometres during their annual migration, yet we know virtually nothing about their response to artificial light. During late summer, thousands of bats migrate along the coastline of the Baltic Sea in Latvia, through Pape Nature Reserve. Nights are starlit and largely devoid of light pollution as there are only a few human settlements in the area. Here, the scientists installed an eight-metre-high pole near the shoreline. A plastic board fixed to the pole was lit up in 10-minute intervals alternating with darkness. The LED lights illuminating the board switched between red and white light. By using ultrasonic microphones, the scientists recorded the echolocation calls of bats coming close in order to identify both the species and the number of bats passing by the unlit or lit experimental site. Soprano pipistrelles were recorded more frequently when the board was lit in red. "Bats are at a higher collision risk at wind power stations during their autumn migration. Our study indicates that the use of red light signals could have fatal consequences for them as this appears to attract them to operating wind turbines," explains Oliver Lindecke, co-author of the study. Technological solutions already available could help: "Existing light signals could easily be replaced by bat-friendly alternatives, or context-dependent illumination could be deployed which is only activated if planes or helicopters are approaching a wind power plant." Exactly why bats are attracted to red light sources is unclear. "Bats have excellent eyesight and can even detect wavelengths invisible to us. Some red light sources might potentially blind and disorient them. Whether they then respond by flying towards the source of light with the highest intensity requires further research. It is also absolutely crucial to understand the long-term impact of increasing light pollution on populations of nocturnal animals," explains Christian Voigt. "Many bat species already struggle in our current anthropogenic landscapes characterised by intensive agriculture and high densities of wind turbines. Light pollution is likely to increase pressure on them even further.
From a conservation perspective, it is highly advisable that we limit the use of artificial light sources at night to cover only the most pressing and essential human needs. And if there is such an essential need, then bat-suitable light sources should be used." | Pollution | 2,018
August 24, 2018 | https://www.sciencedaily.com/releases/2018/08/180824150536.htm | Why polluted air may be a threat to your kidneys | There is good evidence that polluted air increases the risk of respiratory problems such as asthma -- as well as organ inflammation, worsening of diabetes and other life-threatening conditions. But new research suggests air pollution can also fuel something else: chronic kidney disease, or CKD, which occurs when a person's kidneys become damaged or cannot filter blood properly. | The study was published recently. "Similar to smoking, air pollution contains harmful toxins that can directly affect the kidneys," says Jennifer Bragg-Gresham, M.S., Ph.D., a Michigan Medicine epidemiologist and the study's lead author. "Kidneys have a large volume of blood flowing through them, and if anything harms the circulatory system, the kidneys will be the first to sense those effects." People with diabetes, obesity, high blood pressure or heart disease are at increased risk of developing CKD, which is why high-risk patients who live in heavily populated or polluted areas should recognize the danger and take precautions, Bragg-Gresham says. Air pollution contains fine particulate matter, or PM2.5, which is a cocktail of microscopic particles. Because these particles are virtually weightless, they can stay in the air longer, causing humans to unavoidably inhale them on a regular basis without knowing it. PM2.5 can lead to serious health effects when inhaled often. By reviewing Medicare claims data and air-quality data from the Centers for Disease Control and Prevention, the study's authors found a positive association between CKD rates and PM2.5 concentration. Says study co-author Rajiv Saran, M.D., a Michigan Medicine nephrologist and director of the United States Renal Data System Coordinating Center at U-M: "If you look at areas that are heavily polluted versus areas that are less polluted, you will find more chronic kidney disease." According to figures cited in the new research, chronic kidney disease afflicts more than 27 million Americans. People with CKD have an eightfold increased risk of cardiovascular mortality. Unfortunately, PM2.5 is almost impossible to avoid. We encounter air pollution from many simple everyday activities, such as cooking and driving. Other contributors are smoking, burning wood, packaged spray products, household appliances and, perhaps the most obvious, industry and vehicle emissions. Air pollution also contains heavy metals such as lead, mercury and cadmium -- all of which are known to negatively affect the kidneys. The U-M research examined several prior studies on the issue, including an effort conducted in select coal-mining areas of Appalachia that found a 19 percent higher risk of CKD among men and a 13 percent higher risk in women compared with those who lived in counties with no mining. The good news: PM2.5 levels are much lower in the U.S. than in other industrialized countries such as China and India. "What this means for the countries with higher PM2.5 is significantly higher odds of CKD," says Bragg-Gresham, also an assistant research scientist at U-M.
"Our research was only able to examine a small range of PM2.5 values present in America but was able to find a significant association."However, it's still important to take precautions when exposed to air pollution, especially for people who have existing health conditions or who live in densely populated or polluted cities."In heavily polluted areas, consider wearing masks that cover your nose and mouth, limit hours outside and limit long hours commuting to work in high traffic as well," Saran says, adding that the risk should be taken seriously."Many people don't see the seriousness of air pollution because it isn't something visible, but that doesn't mean it's any less important for your health." | Pollution | 2,018 |
August 24, 2018 | https://www.sciencedaily.com/releases/2018/08/180824101145.htm | Smoked out: Researchers develop a new wildfire smoke emissions model | Chemical engineering researchers from Brigham Young University have developed an advanced model that can help predict pollution caused by wildfire smoke. | The research, sponsored by the USDA Forest Service and the Department of Energy, provides a physical model that can more reliably predict soot and smoke emissions from wildfires over a range of conditions. "The smoke that you see from wildfires is a combination of evolved gases and soot," said Alex Josephson, a Ph.D. student in BYU's chemical engineering program who also works on the project at Los Alamos National Laboratory. "When we look at smoke as far as health effects, typically we care about those soot particles; and that's what we're modeling." Recent wildfires in the West have caused air quality to tank in a number of major western cities for several days this August, reaching orange and even red levels for long stretches. Orange days are unhealthy for sensitive groups while red-level days are considered unhealthy to all people and can result in serious health effects for children or older individuals. The BYU/Los Alamos-developed model uses detailed physics-based formulas to predict the initial formation of soot particles emitted during wildfires. Experimental measurements of smoke content can involve fairly unsophisticated procedures, such as vacuum sampling of particles as they are produced from a flame. "Billions of dollars are spent on fighting wildfires and this summer it felt like the whole West was on fire," said David Lignell, professor of chemical engineering and senior author on the study, which was recently published in an academic journal. Current wildfire prediction models are too computationally expensive to run for large-scale wildfires. The BYU/Los Alamos-produced model, which looks like something scratched out on a chalkboard in A Beautiful Mind, provides foundational elements to validate more efficient models that can be applied on supercomputers at a reasonable computational cost. The research is aimed at helping the Forest Service and other wildfire management groups better know the impact of prescribed burns on the surrounding urban environments. (Prescribed burns are one method to help prevent wildfires.) According to Josephson, he's "providing the tools to give information to help the people that need to make those decisions." "When a natural wildfire occurs, no one is responsible for the emissions because it is an act of nature," he said. "But when the Forest Service wants to prescribe a fire, then suddenly you are responsible for the smoke and the emissions coming from it. You better understand the emissions before starting a fire that could have serious effects on surrounding communities." Funding for the research comes more specifically from the Rocky Mountain Research Station of the Forest Service and the Department of Energy's National Nuclear Security Administration, through the University of Utah's Carbon Capture Multidisciplinary Simulation Center. While Lignell said there is still a gap between their research and how it directly impacts the air people are breathing, he's personally invested in bridging that gap -- not just as a chemical engineer, but as someone with asthma. "When smoke fills the valley, I take that personally; it really affects people's lives," Lignell said.
"It certainly makes you pay attention to wildfire issues and makes you want to be a part of working on these issues." | Pollution | 2,018 |
August 23, 2018 | https://www.sciencedaily.com/releases/2018/08/180823122252.htm | Fires overwhelming British Columbia; smoke choking the skies | British Columbia is on fire. In this Canadian province, 56 wildfires "of note" are active and continuing to blow smoke into the skies overhead. | Current statistics (from the BC Wildfire Service) show that 629,074 total hectares (1,554,475.71 acres) have burned this year in British Columbia. Broken down by regional fire centre, Coastal has had 86,116 hectares burn; Northwest, 310,731 ha; Prince George, 118,233 ha; Kamloops, 38,019 ha; the Southeast, 35,639 ha; and Cariboo, 40,336 ha. The weather, as in the western U.S., has had a significant role in the 2018 wildfire situation. Hot, dry, and windy conditions create a breeding ground for wildfires to start and spread. With just a lightning strike or a poorly tended campfire, these weather conditions allow quick-start fires to spread rapidly and become out of control before they are even discovered. Besides the very obvious hazards of fire, there is the secondary hazard of smoke across the region. Smoke now blankets the sky above British Columbia and will be blown eastward by the jet stream. This smoke causes hazardous air quality wherever it travels. An air quality index map of the British Columbia region for August 22, 2018, illustrated the reach of the smoke. Residents who either smell the smoke or notice haze in the air should take precautions. | Pollution | 2,018
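British Columbia itself reports the Air Quality Health Index (AQHI), but color-coded indexes of this kind generally work the same way: a measured pollutant concentration is mapped onto an index scale by linear interpolation between fixed breakpoints. The sketch below uses the U.S. EPA's pre-2024 PM2.5 breakpoints, slightly simplified, purely as an illustration:

```python
# U.S. EPA AQI breakpoints for 24-hour PM2.5 in ug/m^3 (pre-2024 values,
# with the two "Hazardous" bands merged for brevity).
BREAKPOINTS = [
    (0.0, 12.0, 0, 50),        # Good (green)
    (12.1, 35.4, 51, 100),     # Moderate (yellow)
    (35.5, 55.4, 101, 150),    # Unhealthy for Sensitive Groups (orange)
    (55.5, 150.4, 151, 200),   # Unhealthy (red)
    (150.5, 250.4, 201, 300),  # Very Unhealthy (purple)
    (250.5, 500.4, 301, 500),  # Hazardous (maroon)
]

def pm25_to_aqi(conc):
    """Linear interpolation of a PM2.5 concentration onto the AQI scale."""
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    raise ValueError("concentration out of supported range")

print(pm25_to_aqi(160.0))  # heavy wildfire smoke -> ~210, "Very Unhealthy"
```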
August 23, 2018 | https://www.sciencedaily.com/releases/2018/08/180823113618.htm | Microbes hitch a ride inland on coastal fog | Fog can act as a vector for microbes, transferring them long distances and introducing them into new environments. So reports an analysis of the microbiology of coastal fog, recently published in a scientific journal. | Co-author Kathleen Weathers, a Senior Scientist at Cary Institute of Ecosystem Studies, explains, "Fog's role in transporting water and nutrients to coastal areas is well documented. Far less is known about the biology of fog, including the communities of microbes that live in fog droplets, and how they travel between marine and terrestrial ecosystems." The research team tracked fungal and bacterial communities in fog delivered to two fog-dominated sites: Southport Island, Maine, in the United States and the Namib Desert in Namibia. Their aim: to better understand how fog influences the transport of microbes from the Atlantic Ocean into these fog-fed terrestrial ecosystems. At both sites, samples of fog, clear air, and rain were analyzed to record the variety and abundance of microorganisms present. In Maine, data were collected within 30 meters of the ocean during two field campaigns. In the Namib, data were collected at two sites located 55 kilometers and 50 kilometers away from the coast. Air was sampled in Maine and the Namib before and after rain, fog, and high wind events to detect changes in airborne microbial composition due to weather conditions. Ocean water -- where coastal fog originates -- was also sampled. At both sites, bacterial and fungal DNA was extracted from filters; trends within and between sites were then analyzed. Co-lead author Sarah Evans of Michigan State's Kellogg Biological Station explains, "Fog droplets were found to be an effective medium for microbial sustenance and transport. At both sites, microbial diversity was higher during and after foggy conditions when compared to clear conditions." Marine influences on fog communities were greatest near the coast, but still evident 50 kilometers inland in the Namib Desert. Fog in both Maine and the Namib contained microbes from both soil and ocean sources. Moisture in fog allows microbes to persist longer than they would in dry aerosols. As a result, fog deposits a greater abundance and diversity of microbes onto the land than deposition by air alone. Co-lead author M. Elias Dueker of Bard College explains: "When fog rolls in, it can shift the composition of terrestrial airborne microbial communities. And in a fascinating twist, on the journey from the ocean to the land, microbes not only survive, but change during transport. Fog itself is a novel, living ecosystem." The authors note the possible health implications of the marine-terrestrial fog connection. Fog at both sites contained pathogenic microbes, including suspected plant pathogens and species known to cause respiratory infections in immune-compromised people. This raises concern about the role that fog could play in transporting harmful microbes. Dueker explains, "Bacterial and viral aerosols can originate from polluted waterways, such as those contaminated with sewage. When polluted water mixes with air, harmful substances become airborne and spread. These pathogens could also be incorporated in urban fog, increasing their threat to people, plants, and other animals." "We need a better understanding of fog's role as a vector for microbes, with special attention to pathogens that threaten health," Weathers explains.
"Warming sea surface temperatures and altered wind regimes are likely to affect fog distribution in many coastal regions."The team identified the need for future studies that help predict which microbes are most likely to be transported and deposited by fog. Using traits like spore size and behavior, models could be developed that help forecast harmful fog. | Pollution | 2,018 |
August 23, 2018 | https://www.sciencedaily.com/releases/2018/08/180823092030.htm | Fish lice could be early indicators of metal pollution in freshwater | Everyone needs safe and clean water to drink. Yet industry, agriculture and urban activities threaten fresh water. In particular, metal pollution can be very hard to detect early. Because of this, scientists are always searching for sensitive indicators of water quality. Now, a fish louse shows great promise as an early indicator for monitoring pollution in rivers and dams. | Water samples only tell the story of a river for a moment in time. So researchers studied fish, because fish accumulate pollutants such as metals over time. But it can be difficult to get a complete story from fish also, says Prof Annemariè Avenant-Oldewage. She heads the Department of Zoology at the University of Johannesburg. "Fish have mechanisms to protect themselves. They can reduce the toxic effects from metal pollution inside their bodies. They move the metals they accumulate to organs or other body parts where it is less harmful to them. Because of this, we cannot detect very low levels of metals by analysing fish." Also, if the fish have parasites, the parasites can accumulate the metals better than the fish. Tapeworms are an example of such internal parasites. "In a way, the parasites absorb the metals from the fish. The parasites can then end up with metals in much higher concentrations than those in the host. For some internal parasites, levels of metals have been found to be up to 2,500 times higher than in the host," says Prof Avenant-Oldewage. "This means we can measure metals in them, long before it is possible to do that in fish or in water samples. So parasites can give us early warnings of pollution." In follow-up research, Prof Avenant-Oldewage and her team studied tapeworms. Tapeworms live inside the intestines of fish, but they are not ideal: the host fish they live in has to be killed to analyse for accumulated pollutants. Added to that, the researchers found that tapeworms also have a way to get rid of metals. An egg-bearing tapeworm can move metal pollutants in its body into the egg shells it is about to release. As an alternative, the researchers then considered external fish parasites. If these work, no fish would need to be killed. Next, Prof Avenant-Oldewage's team studied an external parasite, a fish louse that lives on yellowfish. "Yellowfish is prized as a fighting fish for angling competitions. But they are physiologically sensitive creatures. They go into shock if someone removes parasites from their gills." This is where the fish louse comes in. In their latest study, the researchers analysed the metals accumulated in the lice. Dr Beric Gilbert caught mudfish and yellowfish in the Vaal Dam, close to Deneysville. Then he removed the lice from the fish. He froze the parasites, applied stains, and used a microscope with fluorescent functions. Then he could see areas in male and female lice that had higher concentrations of metals. "Most of the metals were in the hard outer layer of the lice, also called the exoskeleton. There wasn't much difference in the amount of metals absorbed by male and female lice." The more intense the fluorescent signal, or glow, produced by the microscope, the higher the amount of metals accumulated in those areas of the lice. "Male lice seemed to concentrate more metals in the exoskeleton covering the underside of their bodies.
This was visible as a brighter yellow signal, or intense glow, when studying the parasites with the microscope. But in egg-bearing females, a layer of jelly around the eggs produced a positive signal, indicating the presence of metals. The female uses the jelly to secure the eggs to surfaces in the environment when she lays them," he says. "Our next step is to find out what mechanisms the lice use to protect themselves from metals. We also need to find out how they absorb metals in the first place," she says. | Pollution | 2,018
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822164151.htm | Trace metals in the air make big splash on life under the sea | In the ocean, a little bit of metal can go a long way. | A new Cornell University-led study shows that trace metals, deposited by aerosols like dust and other particles in the atmosphere, have a hefty impact on marine life, affecting biological productivity and changing the ocean ecosystem. The sources of such aerosol particles range from volcanoes, wildfires and desert dust to anthropogenic causes, like the burning of fossil fuels. After being spewed up and undergoing chemical reactions in the atmosphere, these particles often make their way to remote ocean regions, where they are deposited via precipitation or dry deposition. "In a pollution event or a dust storm, and even in these faraway places, atmospheric deposition can be the most important source of new metals," said lead author Natalie Mahowald, the Irving Porter Church Professor of Engineering and Atkinson Center for a Sustainable Future faculty director for the environment. Some metals prove to be insoluble and drop to the ocean floor, while others are taken up by various biota -- "the little guys," in Mahowald's words -- like phytoplankton and bacteria, which make up 80 percent of marine life and act as circulators of oxygen and nutrients throughout the ecosystem. "If you change the ecosystem structure at this scale -- this is where all the productivity occurs -- it will cascade up and impact the fish and the animals we see more easily," Mahowald said. While previous research has focused on the pivotal role of iron in the oceans, Mahowald and her team examined the effects of iron and other metals, including aluminum, manganese, zinc, lead, copper, nickel, cobalt and cadmium. Many of these metals, such as copper, can be toxic pollutants, but the researchers found that the metals sometimes function as nutrients, depending on how, where and with what they are mixed. | Pollution | 2,018
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822122829.htm | Air pollution leads to cardiovascular diseases | Air pollution, and fine dust in particular, is responsible for more than four million deaths each year. Almost 60 per cent of these deaths occur as a result of cardiovascular diseases. Scientists led by Professor Thomas Münzel, Director of Cardiology I at the Department of Cardiology at the Medical Center of Johannes Gutenberg University Mainz (JGU), reviewed the mechanisms responsible for vascular damage from air pollution together with scientists from the UK and the USA. Their findings have recently been published. | The large percentage of deaths from cardiovascular disease has prompted an international group of experts from Germany, England, and the USA to analyze the negative effects of air pollution on vascular function in a review article. Key research questions focused on components of air pollution (particulate matter, ozone, nitrogen dioxide, carbon monoxide, and sulfur dioxide) that are particularly damaging to the cardiovascular system and mechanisms that damage the vessels. Other participants in the expert group include the particulate matter researcher Professor Sanjay Rajagopalan of the UH Cleveland Medical Center, the vascular researcher and cardiologist Professor John Deanfield of the Institute of Cardiovascular Science at University College London, Professor Andreas Daiber, Head of Molecular Cardiology at the Mainz University Medical Center, and Professor Jos Lelieveld from the Max Planck Institute for Chemistry (MPIC) in Mainz. "The fine dust particles are chemically formed mainly in the atmosphere from emissions from traffic, industry, and agriculture. In order to achieve low, harmless concentrations, emissions from all these sources need to be reduced," commented Professor Jos Lelieveld. "In the future, we will work intensively with the Max Planck Institute for Chemistry to investigate the causes of cardiovascular disease caused by air pollution, especially in combination with (aircraft) noise," added Professor Thomas Münzel. | Pollution | 2,018
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822112406.htm | Air pollution reduces global life expectancy by more than one year | Air pollution shortens human lives by more than a year, according to a new study from a team of leading environmental engineers and public health researchers. Better air quality could lead to a significant extension of lifespans around the world. | This is the first time that data on air pollution and lifespan have been studied together in order to examine the global variations in how they affect overall life expectancy. The researchers looked at outdoor air pollution from particulate matter (PM) smaller than 2.5 microns. These fine particles can enter deep into the lungs, and breathing PM2.5 is associated with increased risk of heart attacks, strokes, respiratory diseases and cancer. PM2.5 pollution comes from power plants, cars and trucks, fires, agriculture and industrial emissions. Led by Joshua Apte in the Cockrell School of Engineering at The University of Texas at Austin, the team used data from the Global Burden of Disease Study to measure PM2.5 air pollution exposure and its consequences in 185 countries. They then quantified the national impact on life expectancy for each individual country as well as on a global scale. The findings were published Aug. 22. "The fact that fine particle air pollution is a major global killer is already well known," said Apte, who is an assistant professor in the Cockrell School's Department of Civil, Architectural and Environmental Engineering and in the Dell Medical School's Department of Population Health. "And we all care about how long we live. Here, we were able to systematically identify how air pollution also substantially shortens lives around the world. What we found is that air pollution has a very large effect on survival -- on average about a year globally." In the context of other significant phenomena negatively affecting human survival rates, Apte said this is a big number. "For example, it's considerably larger than the benefit in survival we might see if we found cures for both lung and breast cancer combined," he said. "In countries like India and China, the benefit for elderly people of improving air quality would be especially large. For much of Asia, if air pollution were removed as a risk for death, 60-year-olds would have a 15 percent to 20 percent higher chance of living to age 85 or older." Apte believes this discovery is especially important for the context it provides. "A body count saying 90,000 Americans or 1.1 million Indians die per year from air pollution is large but faceless," he said. "Saying that, on average, a population lives a year less than they would have otherwise -- that is something relatable." | Pollution | 2,018
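The underlying computation is a standard life-table exercise: reduce age-specific mortality rates by the share attributable to PM2.5 and recompute survival probabilities. A toy sketch with invented mortality rates and an assumed 10% attributable fraction (the study used Global Burden of Disease inputs, not these numbers):

```python
# Toy cohort life table: probability of dying within each 5-year age band.
age_bands = list(range(60, 90, 5))        # 60-64, 65-69, ..., 85-89
q = [0.05, 0.08, 0.13, 0.20, 0.30, 0.45]  # baseline death probability per band (invented)
attrib_pm25 = 0.10                        # assumed share of deaths attributable to PM2.5

def survival_to_85(qs):
    """Probability a 60-year-old survives every band up to age 85."""
    p = 1.0
    for band_start, prob in zip(age_bands, qs):
        if band_start >= 85:
            break
        p *= (1 - prob)
    return p

base = survival_to_85(q)
no_pm = survival_to_85([x * (1 - attrib_pm25) for x in q])
print(f"P(60-year-old reaches 85): {base:.3f} -> {no_pm:.3f} "
      f"(+{100 * (no_pm / base - 1):.0f}%)")
# With these invented inputs the chance rises by ~10%; higher attributable
# fractions, as in parts of Asia, push the gain toward the 15-20% quoted above.
```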
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822091022.htm | Mixed report card for low-cost indoor air quality home monitors | Affordable indoor air quality monitors for the home can be worth the purchase, a recent product evaluation revealed, but all of the monitors tested by researchers were found to have either underreported or missed the presence of very small particles that can penetrate deeply into the lungs. | Indoor air researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) recently tested seven consumer-grade air quality monitors to see if they could detect fine particles emitted by common household activities, including cooking, burning candles, and smoking. They found that four of the monitors could reliably detect high levels of particulate matter in the air, enabling use of a ventilation system or an air cleaner to reduce pollutant exposure. The study was recently published in a peer-reviewed journal. Reliable monitors could help make ventilation systems smarter by turning them on when indoor air quality (IAQ) declines -- an important control feature as building envelopes become more efficient. "Not all of the monitors worked well, but the ones that did seemed to do a pretty good job, from a building control point of view," said Woody Delp, a mechanical engineer with the Indoor Environment Group in Berkeley Lab's Energy Technologies Area. "From a health perspective, where you really need high confidence in a monitor's numbers to assess an environment, we're not quite there yet." "The home IAQ monitors can be very helpful when your community is impacted by a big outdoor air pollution event like a wildfire," said Brett Singer, a staff scientist at Berkeley Lab who co-authored the paper with Delp. "After closing up the house, turning off ventilation fans, and operating any filtration that is available -- including the central forced air system if it has a good filter -- the IAQ monitor can give you an idea if these measures are effectively cutting particle levels in your home." In recent years, consumer-level air quality monitors embedded with low-cost optical particle sensors have come onto the market, costing less than $300 each. Similar to some expensive monitors used for research, they operate by sensing how light bounces off particles in the air -- the light scatters differently depending on the size of the particle it hits. The light-scattering method is well established, but its effectiveness can vary widely, depending not only on the components used within a given monitor but also on the type of pollution. For the study, Berkeley Lab researchers conducted common household activities that emit particles: stir-frying green beans on a gas range, making toast, burning incense, and shaking a dust mop, among others. The consumer monitors cost between $199 and $280. They also tested research monitors that cost thousands of dollars. The readings were compared with two high-end reference systems and also with data from previous studies. Four of the consumer monitors -- AirBeam, AirVisual, Foobot, and Purple Air II -- were able to reliably detect an increase in pollution from particles with a diameter of 2.5 micrometers or smaller, the inhalable fine particles known as PM2.5.
However, all of the monitors faltered for particles with diameters smaller than 0.3 micrometers, which have been associated with a variety of adverse respiratory and cardiovascular health effects. The study authors noted, however, that because most sources of indoor air pollution contain both larger and smaller particles, the monitors could still help reduce exposure to ultrafine particles by generally detecting when air quality worsens and triggering better ventilation during those times. The indoor air quality landscape for consumers changes fast, with new products being rolled out and existing ones being modified. "With such a dynamic market, what's really needed is a standard test method and certification process for the industry," said Singer. "Our role as researchers is to evaluate the potential of these devices generally. Industry groups need to take the lead to provide up-to-date information as the products change and new ones come online." The study was supported by DOE's Building America Program and the Environmental Protection Agency. | Pollution | 2,018
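Evaluations like this ultimately reduce to comparing paired time series from the consumer monitor and the reference instrument. A minimal sketch of the usual summary statistics, using made-up readings rather than the Berkeley Lab data:

```python
import statistics

# Paired 1-minute PM2.5 readings (ug/m^3) during a cooking test -- invented values.
reference = [8, 15, 40, 95, 120, 80, 35, 12]
monitor = [6, 12, 33, 78, 101, 66, 28, 10]

# Mean bias: does the monitor read systematically low or high?
bias = statistics.mean(m - r for m, r in zip(monitor, reference))
# Normalized RMSE: typical error relative to the average reference level.
nrmse = (statistics.mean((m - r) ** 2 for m, r in zip(monitor, reference)) ** 0.5
         / statistics.mean(reference))
# Correlation: does the monitor at least track the ups and downs?
corr = statistics.correlation(monitor, reference)  # requires Python 3.10+

print(f"mean bias: {bias:.1f} ug/m^3, normalized RMSE: {nrmse:.0%}, r = {corr:.3f}")
```

A monitor can underreport absolute levels (negative bias) yet still correlate strongly with the reference, which is why such a device can remain useful for triggering ventilation even if it is not health-grade.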
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822082521.htm | Murky lakes now surpass clear, blue lakes in US | New research reveals that many lakes in the continental United States are becoming "murkier," with potentially negative consequences for water quality and aquatic life. The findings are published in a peer-reviewed journal. | In the 5 years between 2007 and 2012, the dominant lake type in the United States shifted from clear, blue lakes to greenish-brown, murky lakes. Blue lakes declined by 18% while murky lakes increased by 12%. The investigators cannot definitively say what is causing this shift, but they suspect that land cover and land use patterns within a watershed, as well as changes in climate, may be important factors. "Blue lakes typically are those that do not show evidence of nutrient pollution or elevated organic matter while murky lakes have high levels of both," said lead author Dr. Dina Leech, of Longwood University in Farmville, Virginia. "A shift toward murkiness is a management concern because murky lakes tend to have more algae, including potentially harmful cyanobacteria. And with poor food quality at the base of the food web, over time murky lakes may not be able to support a healthy fishery." | Pollution | 2,018
August 21, 2018 | https://www.sciencedaily.com/releases/2018/08/180821145253.htm | Preparing for chemical attacks with improved computer models | On April 4, 2017, the town of Khan Sheikhoun in northwest Syria experienced one of the worst chemical attacks in recent history. A plume of sarin gas spread more than 10 kilometers (about six miles), carried by buoyant turbulence, killing more than 80 people and injuring hundreds. | Horrified by the attack, but also inspired to do something useful, Kiran Bhaganagar, professor of mechanical engineering at The University of Texas at San Antonio, and her team from the Laboratory of Turbulence Sensing and Intelligence Systems, used computer models to replicate the dispersal of the chemical gas. The results were recently published. "If there is a sudden chemical attack, questions that are important are: 'how far does it go' and 'what direction does it go,'" Bhaganagar said. "This is critical for evacuations." Bhaganagar's research is supported by the U.S. Army's Edgewood Chemical and Biological Center (ECBC), which hopes to adopt her models to assist in the case of an attack on American soil. Chemicals, whether toxic agents like sarin gas or exhaust from vehicles, travel differently from other particulates in the atmosphere. Like wildfires, which can move incredibly fast, chemicals create their own micro-conditions, depending on the density of the material and how it mixes with the atmosphere. This phenomenon is known as buoyant turbulence, and it leads to notable differences in how chemicals travel during the day or at night, and during different seasons. "In the nighttime and early morning, even when you have calm winds, the gradients are very sharp, which means chemicals travel faster," Bhaganagar explained. Even ordinary turbulence is difficult to mathematically model and predict. It functions on a range of scales, each interacting with the others, and disperses energy as it moves to the smallest levels. Modeling buoyant turbulence is even harder. To predict the effects of turbulence on the dispersal of chemical particles, Bhaganagar's team ran computer simulations on the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), the largest system at any U.S. university. "We go into the physics of it and try to understand what the vortices are and where the energy is," Bhaganagar said. "We decompose the problem and each processor solves for a small portion. Then we put everything back together to visualize and analyze the results." Bhaganagar used TACC's supercomputers through the University of Texas Research Cyberinfrastructure (UTRC) initiative, which, since 2007, has provided researchers at any of the University of Texas System's 14 institutions access to TACC's resources, expertise and training. The background atmosphere and time of day play a big role in the dispersal. In the case of the Syria attacks, Bhaganagar first had to determine the wind speeds, temperature, and the kinds of chemicals involved. With that information in hand, her high-resolution model was able to predict how far and in what direction chemical plumes travelled. "In Syria, it was very bad because the timing caused it to be ideal conditions to spread very fast," she said. "We ran the actual case of Syria on the TACC supercomputer, got all of the background information and added it to the models, and our models captured the boundaries of the plume and which cities it spread to. We saw it was very similar to what was reported in the news.
That gave us confidence that our system works and that we could use it as an evacuation tool." The research is targeted at short-term predictions: understanding in what direction chemicals will propagate within a four-hour window and working with first responders to deploy personnel appropriately. However, running the high-resolution model takes time. In the case of the Syria simulation, it required five full days of number crunching on Stampede2 to complete. During a real attack, such time wouldn't be available. Consequently, Bhaganagar also developed a coarser model that uses a database of seasonal conditions as background information to speed up the calculations. For this purpose, Bhaganagar's team has introduced a novel mobile sensing protocol in which they deploy low-cost mobile sensors, consisting of aerial drones and ground-based sensors, to gather the local wind data and use the coarser model to predict the plume transport. Using this method, the four-hour predictions can be computed in as little as 30 minutes. She is working to bring the time down even further, to 10 minutes. This would allow officials to rapidly issue accurate evacuation orders, or place personnel where they are needed to assist in protecting citizens. "There are hardly any models that can predict to this level of accuracy," Bhaganagar said. "The Army uses trucks with mobile sensors, which they send into a circle around the source. But it's very expensive and they have to send soldiers, which is a danger to them." In the future, the army hopes to combine computer simulations and live monitoring in the case of a chemical attack. Bhaganagar will conduct tests in the coming months at the U.S. Army experimental facility in Maryland to determine how well drones can predict wind conditions accurately. "The higher the accuracy of the data -- the wind speed, wind direction, local temperature -- the better the prediction," she explained. "We use drones to give us additional data. If you can feed this data into the model, the accuracy for the four-hour window is much higher." Most recently, she and her graduate student Sudheer Bhimireddy, a Ph.D. candidate, integrated their buoyant turbulence model with the high-resolution Advanced Research Weather Research and Forecasting model to understand the role of atmospheric stability on the short-term transport of chemical plumes. That research appeared in September 2018. In related work funded by the National Science Foundation, Bhaganagar has adapted her chemical plume model to do pollution tracking. She hopes her code can help communities predict local pollution conditions. According to Bhaganagar, low-cost wind and gas sensors purchased by a community could help produce daily forecasts so individuals can take the proper precautions when pollution levels are concentrated in an area. Recent efforts have tried to determine how many sensors are needed to allow accurate local predictions. "Can we detect zones of pollution and take effective measures to avoid pollution?" Bhaganagar asked. "If we had our own small-scale models that we could use in our communities, that would have a big impact on pollution." Though community pollution forecasts would ultimately run on consumer-grade computers, such predictions would not be possible without access to supercomputers to test the models and generate a database of background conditions. "TACC resources are so valuable," she said. "I wouldn't have even attempted these research projects if I wasn't able to access TACC supercomputers.
They're absolutely necessary for developing new turbulence models that can save lives in the future." | Pollution | 2,018 |
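The physics at stake can be illustrated with the classic steady-state Gaussian plume approximation, a far simpler tool than the turbulence-resolving simulations described above (which capture the buoyancy effects the Gaussian model ignores). The sketch below uses entirely made-up source and dispersion parameters:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state concentration (kg/m^3) downwind of a continuous point source.

    q: emission rate (kg/s); u: mean wind speed (m/s); h: release height (m);
    (y, z): crosswind and vertical receptor position (m); sigma_y, sigma_z:
    plume widths (m) evaluated at the downwind distance of interest.
    Includes the standard ground-reflection term.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative numbers only: a 1 kg/s release in a 3 m/s wind, receptor at
# ground level ~2 km downwind, with narrow plume widths typical of a stable
# night-time boundary layer (stable air keeps concentrations high, echoing
# the sharp nocturnal gradients described above).
c = gaussian_plume(q=1.0, u=3.0, y=0.0, z=0.0, h=10.0, sigma_y=60.0, sigma_z=20.0)
print(f"centerline ground concentration: {c * 1e6:.1f} mg/m^3")  # ~78 mg/m^3
```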
August 21, 2018 | https://www.sciencedaily.com/releases/2018/08/180821094308.htm | Living close to urban green spaces is associated with a decreased risk of breast cancer | An increasing number of studies are reporting health benefits of contact with urban green spaces. A new study from the Barcelona Institute for Global Health (ISGlobal), an institute supported by the "la Caixa" Banking Foundation, has examined, for the first time, the relationship between exposure to green spaces and breast cancer. The study, which analysed data from more than 3,600 women in Spain, concluded that the risk of breast cancer was lower in the women who lived closer to urban green spaces, like parks or gardens. | Previous research has identified an association between contact with green spaces and several health benefits, including better general and mental health and increased life expectancy. In the older population, contact with green spaces has recently been linked with slower cognitive decline. In children, exposure to greenness has been associated with improvements in attention capacity, behaviour, emotional development, and even beneficial structural changes in the brain. To date, few studies have focused on the relationship between exposure to natural green spaces and the risk of cancer, more specifically breast cancer, the most common malignant disease among women and the one that causes the most cancer deaths in the female population. In the new study, data on lifetime residential history, socio-economic level, lifestyle factors and levels of physical activity were obtained during interviews with each one of the participants. Information on proximity to urban green spaces or agricultural areas, air pollution levels, and population density was obtained by geocoding the residential address of each participant. The first author of the study, ISGlobal researcher Cristina O'Callaghan-Gordo, explains, "We found a reduced risk of breast cancer among women living closer to urban green spaces. By contrast, women living closer to agricultural areas had a higher risk. This finding suggests that the association between green space and the risk of breast cancer is dependent on the land use." Mark Nieuwenhuijsen, the study coordinator and Director of ISGlobal's Urban Planning, Environment and Health Initiative, goes on to explain that the researchers "found a linear correlation between distance from green spaces and breast cancer risk. In other words, the risk of breast cancer in the population declines the closer their residence is to an urban green space. These findings highlight the importance of natural spaces for our health and show why green spaces are an essential component of our urban environment, not just in the form of isolated areas but as a connective network linking the whole urban area and benefitting all its inhabitants." "We still don't know which characteristics of natural spaces are the most beneficial, nor do we understand the mechanisms underpinning these beneficial health impacts," explains ISGlobal researcher Manolis Kogevinas, coordinator of the MCC-Spain project. "Other studies have shown that the mechanisms that might explain the health benefits of green spaces include higher levels of physical activity in the population and a reduction in air pollution, an environmental hazard clearly linked to the onset of cancer. However, we did not observe these associations.
We believe that other mechanisms -- including lower levels of stress among people living close to green spaces -- could play a role, but more research is needed to confirm this hypothesis," he adds. The results of earlier studies have suggested that the association between a higher risk of breast cancer and residential proximity to agricultural land may be due to the use of pesticides. O'Callaghan-Gordo concludes: "We didn't analyse levels of exposure to agrochemicals in our study, but they should be taken into account in future research to provide more insight into the mechanism underlying this negative association." | Pollution | 2,018 |
August 21, 2018 | https://www.sciencedaily.com/releases/2018/08/180821094220.htm | Portable freshwater harvester could draw up to 10 gallons per hour from the air | For thousands of years, people in the Middle East and South America have extracted water from the air to help sustain their populations. Drawing inspiration from those examples, researchers are now developing a lightweight, battery-powered freshwater harvester that could someday take as much as 10 gallons per hour from the air, even in arid locations. They say their nanofiber-based method could help address modern water shortages due to climate change, industrial pollution, droughts and groundwater depletion. | The researchers will present their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). "I was visiting China, which has a freshwater scarcity problem. There's investment in wastewater treatment, but I thought that effort alone was inadequate," Shing-Chung (Josh) Wong, Ph.D., says. Instead of relying on treated wastewater, Wong thought it might be more prudent to develop a new type of water harvester that could take advantage of the abundant water particles in the atmosphere. Harvesting water from the air has a long history. Thousands of years ago, the Incas of the Andean region collected dew and channeled it into cisterns. More recently, some research groups have been developing massive mist and fog catchers in the Andean mountains and in Africa. To miniaturize water generation and improve the efficiency, Wong and his students at the University of Akron turned to electrospun polymers, a material they had already worked with for more than a decade. Electrospinning uses electrical forces to produce polymer fibers ranging from tens of nanometers up to 1 micrometer -- an ideal size to condense and squeeze water droplets out of the air. These nanoscale fiber polymers offer an incredibly high surface-area-to-volume ratio, much larger than that provided by the typical structures and membranes used in water distillers. By experimenting with different combinations of polymers that were hydrophilic -- which attracts water -- and hydrophobic -- which discharges water, the group concluded that a water harvesting system could indeed be fabricated using nanofiber technology. Wong's group determined that their polymer membrane could harvest 744 mg/cm² of water. Unlike existing methods, Wong's harvester could work in arid desert environments because of the membrane's high surface-area-to-volume ratio. It also would have a minimal energy requirement. "We could confidently say that, with recent advances in lithium-ion batteries, we could eventually develop a smaller, backpack-sized device," he says. What's more, Wong's nanofiber design simultaneously grabs water and filters it. The electrospun fiber network can act as an anti-fouling surface, sloughing off microbes that could collect on the harvester's surface. So the water would be "clear and free of pollutants" and immediately drinkable once it's collected, he says. Next, Wong hopes to obtain additional funding to build a prototype of the freshwater harvester. He anticipates that, once his team is able to produce the prototype, it should be inexpensive to manufacture. | Pollution | 2,018
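As a rough plausibility check on the 10-gallons-per-hour figure, the membrane yield can be converted into a required membrane area. This is our own back-of-envelope arithmetic, not the researchers' design math; the reported yield has no stated time basis, so an hourly harvest cycle is assumed:

```python
GALLON_ML = 3785.4        # one US gallon in milliliters (~ grams of water)
yield_mg_per_cm2 = 744    # reported membrane yield; hourly basis is an assumption

target_g = 10 * GALLON_ML                      # ~37,854 g of water per hour
area_cm2 = target_g / (yield_mg_per_cm2 / 1000)  # mg -> g per cm^2
print(f"membrane area needed: ~{area_cm2:,.0f} cm^2 (~{area_cm2 / 1e4:.1f} m^2)")
# -> roughly 5 m^2 of nanofiber membrane, i.e. a few square meters that could
#    plausibly be folded into a backpack-sized device.
```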
August 19, 2018 | https://www.sciencedaily.com/releases/2018/08/180819160710.htm | The environmental cost of contact lenses | Many people rely on contact lenses to improve their vision. But these sight-correcting devices don't last forever -- some are intended for a single day's use -- and they are eventually disposed of in various ways. Now, scientists are reporting that throwing these lenses down the drain at the end of their use could be contributing to microplastic pollution in waterways. | The researchers are presenting their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). The inspiration for this work came from personal experience. "I had worn glasses and contact lenses for most of my adult life," Rolf Halden, Ph.D., says. "But I started to wonder, has anyone done research on what happens to these plastic lenses?" His team had already been working on plastic pollution research, and it was a startling wake-up call when they couldn't find studies on what happens to contact lenses after use. "We began looking into the U.S. market and conducted a survey of contact lens wearers. We found that 15 to 20 percent of contact wearers are flushing the lenses down the sink or toilet," says Charlie Rolsky, a Ph.D. student who is presenting the work. Halden, Rolsky and a third member of the team, Varun Kelkar, are at the Biodesign Institute's Center for Environmental Health Engineering at Arizona State University (ASU). "This is a pretty large number, considering roughly 45 million people in the U.S. alone wear contact lenses." Lenses that are washed down the drain ultimately end up in wastewater treatment plants. The team estimates that anywhere from six to 10 metric tons of plastic lenses end up in wastewater in the U.S. alone each year. Contacts tend to be denser than water, which means they sink, and this could ultimately pose a threat to aquatic life, especially bottom feeders that may ingest the contacts, Halden says. Analyzing what happens to these lenses is a challenge for several reasons. First, contact lenses are transparent, which makes them difficult to observe in the complicated milieu of a wastewater treatment plant. Further, the plastics used in contact lenses are different from other plastic waste, such as polypropylene, which can be found in everything from car batteries to textiles. Contact lenses are instead frequently made with a combination of poly(methyl methacrylate), silicones and fluoropolymers to create a softer material that allows oxygen to pass through the lens to the eye. So, it's unclear how wastewater treatment affects contacts. These differences make processing contact lenses in wastewater plants a challenge. To help address their fate during treatment, the researchers exposed five polymers found in many manufacturers' contact lenses to anaerobic and aerobic microorganisms present at wastewater treatment plants for varying times and performed Raman spectroscopy to analyze them. "We found that there were noticeable changes in the bonds of the contact lenses after long-term treatment with the plant's microbes," says Kelkar. The team concluded that microbes in the wastewater treatment facility actually altered the surface of the contact lenses, weakening the bonds in the plastic polymers. "When the plastic loses some of its structural strength, it will break down physically. This leads to smaller plastic particles which would ultimately lead to the formation of microplastics," Kelkar says.
Aquatic organisms can mistake microplastics for food and since plastics are indigestible, this dramatically affects the marine animals' digestive system. These animals are part of a long food chain. Some eventually find their way to the human food supply, which could lead to unwanted human exposures to plastic contaminants and pollutants that stick to the surfaces of the plastics.By calling attention to this first-of-its-kind research, the team hopes that industry will take note and at minimum, provide a label on the packaging describing how to properly dispose of contact lenses, which is by placing them with other solid waste. Halden mentions, "Ultimately, we hope that manufacturers will conduct more research on how the lenses impact aquatic life and how fast the lenses degrade in a marine environment." | Pollution | 2,018 |
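A hedged order-of-magnitude check of the six-to-10-metric-ton estimate above; only the wearer count and flushing fraction come from the article, while the lenses discarded per year and the per-lens mass are assumptions for illustration.

```python
# Order-of-magnitude check of the 6-10 metric tons/year estimate.
# lenses_per_year and lens_mass_mg are assumed values, not study inputs.

wearers = 45e6            # U.S. contact lens wearers (from the article)
flush_fraction = 0.175    # midpoint of the 15-20% who flush (article)
lenses_per_year = 60      # assumed mix of daily and monthly schedules
lens_mass_mg = 20.0       # assumed mass of a single dried lens

tonnes = wearers * flush_fraction * lenses_per_year * lens_mass_mg / 1e9
print(f"~{tonnes:.1f} metric tons of lens plastic flushed per year")
# ~9.5 t with these inputs -- consistent with the published 6-10 t range.
```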
August 18, 2018 | https://www.sciencedaily.com/releases/2018/08/180818115645.htm | Making aquafeed more sustainable: Scientists develop feeds using a marine microalga co-product | Dartmouth scientists have created a more sustainable feed for aquaculture by using a marine microalga co-product as a feed ingredient. The study is the first of its kind to evaluate replacing fishmeal with a co-product in feed designed specifically for Nile tilapia. The results are published in the open access journal, | Aquaculture is the world's fastest growing food sector, surpassing global capture fisheries production in 2014. It provides more than 50 percent of the seafood supply consumed by humans; however, it poses several environmental concerns. Aquaculture feeds (aquafeeds) draw on 70 percent of the world's fishmeal and fish oil, which is obtained from small, ocean-caught fish such as anchovies, sardines, herring, menhaden, and mackerel, which are essential to the lower end of the marine food chain. Analysts project that by 2040, the demand for fishmeal and fish oil will exceed supply. Aquafeeds also draw on large amounts of soy and corn from industrial farms, which pose other environmental concerns due to the use of fertilizers and potential runoff into rivers, lakes and coastal waters. In addition, aquafeeds may trigger nutrient pollution in aquaculture effluent, as fish are unable to fully digest soy and corn, which are major feed ingredients. To address the environmental sustainability concerns regarding aquafeed, a Dartmouth team has been developing sustainable feeds for Nile tilapia, examining the effectiveness of replacing fishmeal and fish oil with different types of marine microalgae. Marine microalgae are excellent sources of essential amino acids, minerals, vitamins, and omega-3 fatty acids, and can therefore meet the nutrient requirements of fish. Omega-3 fatty acids are important for maintaining fish health; they also have neurological, cardiovascular and anti-cancer benefits for humans. The Dartmouth research team's latest work replaces fishmeal with a marine microalga co-product, Nannochloropsis oculata, which is rich in both protein and omega-3 fatty acids, including eicosapentaenoic acid (EPA), that are essential to fish growth and quality. The co-product is the leftover algae meal that remains after oils have been extracted from commercially grown algal biomass for nutraceutical, chemical and fuel applications. The co-product is available at commercial scale, and continued increases in supply are expected. The study's findings show promise for replacing conventional protein ingredients in tilapia feeds. The results demonstrated that the co-product had higher protein content than whole cells but lower digestibility. The co-product showed the highest digestibility of lysine, an essential amino acid that is often deficient in terrestrial crop-based aquafeed ingredients, as well as the highest EPA digestibility. The team also evaluated several feeds with varying percentages of co-product replacing fishmeal. When 33 percent of fishmeal was replaced with the co-product, the Nile tilapia showed growth, feed conversion ratio and survival rate similar to those of fish on the reference diet, in which fishmeal made up seven percent of the diet. The team hypothesizes that the co-product may need to be enhanced with enzyme(s) to maximize nutrient availability and counter the lower digestibility observed in the experiment. "The possibilities for developing a sustainable approach to aquaculture are exciting. Our society has an opportunity to shift aquafeed's reliance on fish-based ingredients to a fish-free product that is based on marine microalgae, and our findings provide new insight into how we can get there," says lead author Pallab Sarker, a research assistant professor at Dartmouth. The research builds on the team's earlier work developing a marine microalga feed for Nile tilapia made from Schizochytrium sp., which evaluated how the feed affected digestibility and growth. The results demonstrated that Schizochytrium sp. was a highly digestible source of lipid and DHA, an omega-3 fatty acid, for tilapia. The tilapia not only had higher weight gain but also better feed conversion compared to those on a control diet containing fish oil when Schizochytrium sp. fully replaced the fish oil. As part of the team's goal to eliminate aquafeed's reliance on marine fish and terrestrial crop inputs, they are combining the Nannochloropsis co-product with other marine microalgae to make aquaculture feeds more sustainable. | Pollution | 2,018
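Since the comparison above hinges on the feed conversion ratio, here is a minimal sketch of how that metric is computed; the input masses below are hypothetical and are not taken from the study.

```python
# Feed conversion ratio (FCR): mass of feed used per unit of fish weight
# gained; lower is better. All numbers here are hypothetical.

def feed_conversion_ratio(feed_fed_g, start_mass_g, end_mass_g):
    """Feed consumed divided by weight gained over the trial."""
    return feed_fed_g / (end_mass_g - start_mass_g)

reference = feed_conversion_ratio(1500.0, 50.0, 1050.0)    # reference diet
co_product = feed_conversion_ratio(1520.0, 50.0, 1040.0)   # 33% replacement
print(f"reference FCR: {reference:.2f}, co-product FCR: {co_product:.2f}")
```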
August 17, 2018 | https://www.sciencedaily.com/releases/2018/08/180817093800.htm | Particulate pollution's impact varies greatly depending on where it originated | When it comes to aerosol pollution, as the old real estate adage says, location is everything. | Aerosols are tiny particles that are spewed into the atmosphere by human activities, including burning coal and wood. They have negative effects on air quality -- damaging human health and agricultural productivity. While greenhouse gases cause warming by trapping heat in the atmosphere, some aerosols can have a cooling effect on the climate -- similar to how emissions from a major volcanic eruption can cause global temperatures to drop. This occurs because the aerosol particles cause more of the Sun's light to be reflected away from the planet. Estimates indicate that aerosols have offset about a third of greenhouse gas-driven warming since the 1950s. However, aerosols have a much shorter lifespan in the atmosphere than the gases responsible for global warming. This means that their atmospheric distribution varies by region, especially in comparison to carbon dioxide. "Conversations between scientists and policymakers often ignore the role of emission location when evaluating how aerosols affect the climate," explained Carnegie's Geeta Persad. Her new paper with Carnegie's Ken Caldeira finds that the impact these fine particles have on the climate varies greatly depending on where they were released. Their work is published in "Not all aerosol emissions are created equal," Caldeira said. "Aerosols emitted in the middle of a monsoon might get rained out right away, while emissions over a desert might stay in the atmosphere for many days. So far, policy discussions have not come to grips with this fact." For example, their models show that aerosol emissions from Western Europe have 14 times the global cooling effect that aerosol emissions from India do. Yet, aerosol emissions from Europe, the United States, and China are declining, while aerosol emissions from India and Africa are increasing. "This means that the degree to which aerosol particulates counteract the warming caused by greenhouse gases will likely decrease over time as new countries become major emitters," Persad explained. What's more, they found that there are dramatic regional differences when it comes to how strongly a country is affected by its own emissions. For example, aerosols from India cool their country of origin 20 times more than they cool the planet. In comparison, aerosols from Western Europe, the United States, and Indonesia cool their country of origin only about twice as much as they cool the planet -- a significant difference in how these emissions are experienced. In many cases, the strongest climate effects of aerosols are felt far from where the aerosols are emitted. Caldeira and Persad's work demonstrates that the climate effects of aerosol emissions from different countries are highly unequal, which they say means that policies must reflect this variation. This work is part of a larger $1.5 million National Science Foundation project with collaborators at UC San Diego and Stanford University that looks at the combined climate, health, and decision-making dimensions of greenhouse gases in comparison to shorter-lived pollutants like aerosols. "Just as aerosols' climate effects are strongly dependent on source region, we also expect their health and other air quality effects to be dependent on their origin," explained Persad. "Moving forward, we want to understand this air quality piece and the implications it could have for optimizing local air pollution mitigation." This work was supported by the Fund for Innovative Climate and Energy Research and the National Science Foundation. | Pollution | 2,018
August 15, 2018 | https://www.sciencedaily.com/releases/2018/08/180815130536.htm | When viruses infect phytoplankton, it can change the clouds | Microscopic plant-like organisms called phytoplankton are known to support the diversity of life in the ocean. Scientists in Israel now report that one species, | "Our aim is to better understand the effects that marine ecology can have on atmospheric properties like radiation and cloud formation," says first author Miri Trainic, an Earth scientist at the Weizmann Institute of Science. "This slim air-sea interface controls fluxes of energy, particles, and gases, so if we want to understand climate and climate change, we must understand how microscopic biological activity in the ocean alters this balance." When the virus EhV infects When observing a model system in the lab, the researchers found the volume of "Although The researchers were also surprised by the SSAs' complex structure and its effects on aerodynamics. "What we found was that we don't need to look at just the size of the SSA, but also its density," says Assaf Vardi (@vardilab), an environmental scientist at the Weizmann. "These ones are shaped like parachutes; they have an intricate structure of calcium carbonate with lots of space within it, which extends the particle's lifetime in the atmosphere." From here, the researchers will venture to places like Norway to observe these blooms and their SSA emissions in the natural world. "This study focuses on one species and its virus, but in a broader context it can show that the state of the atmosphere actually depends on the daily interactions in the seawater," Trainic says. "Now we must do our best to further understand that relationship." | Pollution | 2,018
August 13, 2018 | https://www.sciencedaily.com/releases/2018/08/180813125227.htm | Algorithm provides early warning system for tracking groundwater contamination | Groundwater contamination is increasingly recognized as a widespread environmental problem. The most important course of action often involves long-term monitoring. But what is the most cost-effective way to monitor when the contaminant plumes are large, complex, and long-term, or an unexpected event such as a storm could cause sudden changes in contaminant levels that may be missed by periodic sampling? | Scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and Savannah River National Laboratory have developed a low-cost method for real-time monitoring of pollutants using commonly available sensors. Their study, "In Situ Monitoring of Groundwater Contamination Using the Kalman Filter," was published recently in the journal, "Conventional methods of monitoring involve taking water samples every year or every quarter and analyzing them in the lab," said Haruko Wainwright, a Berkeley Lab researcher who led the study. "If there are anomalies or an extreme event, you could miss the changes that might increase contaminant concentrations or potential health risk. Our methodology allows continuous monitoring in situ using proxy measurements, so we can track plume movement in real time." "The autonomous in situ data can be rapidly analyzed remotely using machine learning methods," she added. "It can act as an early warning system -- we can detect sudden changes in contaminant levels. These changes may indicate a need for more or less intervention in terms of the remediation strategy, ideally leading to improved as well as more cost-effective cleanup." Environmental monitoring has become more important in recent years as remediation methods have been shifting away from intensive groundwater treatment and soil removal. "Intensive cleanup has a lot of negative environmental impacts, including air pollution, large energy-water use, and waste production," Wainwright said. "So experts have started thinking about a paradigm shift from this very intensive remediation to a more sustainable remediation, or 'green remediation,' so we don't just think at the contaminant level but we think about the net environmental impact." However, long-term monitoring could be costly over time for large contaminant plumes. What's more, current long-term monitoring strategies do not consider how abrupt or gradual changes in weather, such as heavy rain events, might influence plume behaviors. This aspect is particularly important when considering persistent plumes, such as those associated with metal or radionuclide contamination. The new approach starts with sensors to track water quality variables that have been determined to be reliable indicators of contaminant levels. For the purposes of this study, the researchers tracked levels of tritium and uranium-238 in the groundwater at the Savannah River Site, a former nuclear weapons production site in South Carolina managed by the DOE. For this site, they measured the acidity (or pH) levels and specific conductance (a measure of electrical conductance); these variables were determined to be reliable indicators for tritium and uranium-238 concentrations. The data from the multiple sensors were then fed into a Kalman filter to estimate contaminant concentrations. A Kalman filter is not a physical filter but rather a mathematical algorithm that can integrate mixed time-series data to make estimates. It is commonly used in various fields, such as traffic prediction and remote sensing. Using historical data from the Savannah River Site, the researchers found that their technique provided reliable information about plume behavior over the last 20 years, indicating that the new approach holds significant promise as a long-term monitoring strategy for rapidly assessing a contaminant's plume stability. Another advantage over conventional approaches is that it can reduce the frequency of manual groundwater sampling and lab analysis, and thus reduce the monitoring cost. Wainwright, who is an expert in groundwater contamination and environmental data analytics, said this methodology can be used for both surface and underground water. It can also potentially be used to track other metals, radionuclides, and organic compounds commonly found in groundwater, such as arsenic, chromium, and fuels. "There are so many different types of sensors available now, and sensor networking and rapid statistical analysis are straightforward," she said. "We can put together all types of in situ sensors and estimate the target contaminant concentration using this framework for data integration in real-time." She added: "Improved monitoring techniques are essential to protect public health and the ecology. People feel safe if it's properly monitored. Our technique is a way to monitor such sustainable remediation -- effectively and cheaply." | Pollution | 2,018
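To make the filtering step concrete, here is a minimal one-dimensional sketch in the spirit described above: a contaminant concentration modeled as a random walk and updated from a proxy sensor (such as specific conductance) through an assumed linear calibration. The calibration coefficients and noise variances are hypothetical, not values from the Berkeley Lab study.

```python
import numpy as np

# 1-D Kalman filter sketch: estimate a contaminant concentration from a
# calibrated proxy sensor. (a, b), Q, and R are assumed for illustration.

a, b = 0.8, 5.0     # assumed linear calibration: proxy = a*conc + b + noise
Q, R = 0.05, 4.0    # process / measurement noise variances (assumed)

def kalman_step(x, P, proxy_obs):
    # Predict: concentration treated as a random walk between readings.
    x_pred, P_pred = x, P + Q
    # Update: weight the sensor innovation by the Kalman gain.
    S = a * P_pred * a + R            # innovation variance
    K = P_pred * a / S                # Kalman gain
    x_new = x_pred + K * (proxy_obs - (a * x_pred + b))
    return x_new, (1 - K * a) * P_pred

rng = np.random.default_rng(0)
x, P, true_conc = 10.0, 25.0, 12.0    # initial guess, its variance, truth
for _ in range(50):
    obs = a * true_conc + b + rng.normal(0, R ** 0.5)
    x, P = kalman_step(x, P, obs)
print(f"estimated concentration: {x:.2f} (true value: {true_conc})")
```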
November 18, 2020 | https://www.sciencedaily.com/releases/2020/11/201118090755.htm | Antibiotic resistance surveillance tools in Puerto Rican watersheds after Hurricane Maria | When Hurricane Maria made landfall, devastating Dominica, St. Croix, and Puerto Rico in September 2017, flooding and power outages wreaked havoc on the debilitated land, resulting in the contamination of waterways with untreated human waste and pathogenic microorganisms. | Six months after the deadly Category 5 hurricane, Virginia Tech civil and environmental engineering Professor Amy Pruden led a team of Virginia Tech researchers, including Maria Virginia Riquelme and William Rhoads, then post-doctoral researchers, who packed their bags and lab supplies and headed to Puerto Rico. The island territory of the United States located in the northeast of the Caribbean Sea had been devastated, plunging its 3.4 million inhabitants into crisis. The mass destruction presented a critical opportunity for the researchers to study how wastewater infrastructure damage might contribute to the spread of antibiotic resistance -- a growing global public health threat. In a study published in American Chemical Society's "This study is a critical step toward establishing a unified and comprehensive surveillance approach for antibiotic resistance in watersheds," said Pruden, the W. Thomas Rice Professor of Civil and Environmental Engineering. "Ideally, it can be applied as a baseline to track disturbances and public health concerns associated with future storms." Over the past decade, Pruden, a microbiologist and environmental engineer, has worked with her students using next-generation DNA sequencing, a specialty of Pruden's, to examine Legionella strains as they operate before, during, after, and outside of Legionnaires' disease outbreaks in various towns and cities across the country, including Flint, Michigan. With RAPID funding from the National Science Foundation and collaborating with principal investigator Christina Bandoragoda, research scientist at the University of Washington with expertise in watershed modeling and geospatial analysis, Virginia Tech researchers teamed up with Graciela Ramirez Toro, professor and director of the Centro de Educación, Conservación e Interpretación Ambiental, and her research group at the local Interamerican University in San German, Puerto Rico. Together, they identified three sampling sites in watersheds with distinct land-use patterns and levels of wastewater input that were ideal for tracking down geospatial patterns in occurrence of bacterial genes that cause antibiotic resistance. Pruden's doctoral student and first author of the paper, Benjamin Davis, used a method called shotgun metagenomic DNA sequencing to detect antibiotic resistance genes in river water samples from three watersheds, including samples collected by hiking to far upstream pristine reaches of the watersheds and downstream of three wastewater treatment plants. Metagenomics is the study of genetic material recovered directly from environmental samples. Analysis of the data revealed that two anthropogenic antibiotic resistance markers -- DNA sequences associated with human impacts to the watershed -- correlated with a distinct set of antibiotic resistance genes, relative to those that correlated specifically with human fecal markers. A clear demarcation of wastewater treatment plant influence on the antibiotic resistance gene profiles was apparent and levels were elevated downstream of wastewater treatment plants, resulting in a high diversity of genes impacting resistance to clinically important antibiotics, such as beta lactams and aminoglycosides, in the watershed samples. Some of the beta lactam resistance genes detected were associated with deadly antibiotic-resistant infections in the region and showed evidence of being able to jump across bacterial strains. Beta lactam resistance genes were also noted to be more accurately predicted by anthropogenic antibiotic resistance markers than human fecal markers. Although baseline levels of antibiotic resistance genes in Puerto Rican watersheds prior to Hurricane Maria are unknown, surveillance methodologies like these could be used to assess future impacts of major storms on the spread of antibiotic resistance, the researchers said. Many international communities will likely not have access to sophisticated metagenomic-based monitoring tools in the near future, but the identification of single gene targets, such as the anthropogenic antibiotic resistance markers, makes watershed surveillance of antibiotic resistance much more accessible. And such genes can be quantified directly by quantitative polymerase chain reaction, yielding cost-effective, rapid results in less than a day. | Hurricanes Cyclones | 2,020
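The closing point -- that single gene targets can be quantified directly by quantitative polymerase chain reaction -- rests on standard-curve arithmetic. The sketch below shows that conversion; the slope and intercept are hypothetical values of the kind obtained from a dilution series, not the study's calibration.

```python
# Standard-curve math behind qPCR quantification of a gene target such as
# an antibiotic resistance marker. Curve parameters are hypothetical.

slope = -3.32       # Ct change per 10-fold dilution (~100% efficiency)
intercept = 38.0    # Ct corresponding to a single copy (assumed)

def copies_from_ct(ct):
    """Estimate gene copies per reaction from a cycle-threshold value."""
    return 10 ** ((ct - intercept) / slope)

for ct in (22.0, 28.0, 34.0):
    print(f"Ct {ct:.0f} -> ~{copies_from_ct(ct):,.0f} copies per reaction")
```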
November 16, 2020 | https://www.sciencedaily.com/releases/2020/11/201116112858.htm | Study reconstructs ancient storms to help predict changes in tropical cyclone hotspot | Intense tropical cyclones are expected to become more frequent as climate change increases temperatures in the Pacific Ocean. But not every area will experience storms of the same magnitude. New research from the Woods Hole Oceanographic Institution (WHOI) published in | This means that changes in atmospheric circulation, driven by differential ocean warming, heavily influence the location and intensity of tropical cyclones. In the first study of its kind so close to the equator, lead author James Bramante reconstructed 3,000 years of storm history on Jaluit Atoll in the southern Marshall Islands. This region is the birthplace of tropical cyclones in the western North Pacific -- the world's most active tropical cyclone zone. Using differences in sediment size as evidence of extreme weather events, Bramante found that tropical cyclones occurred in the region roughly once a century, but increased to a maximum of four per century from 1350 to 1700 CE, a period known as the Little Ice Age. Bramante, a recent graduate of the MIT-WHOI Joint Program in Oceanography/Applied Ocean Science and Engineering, says this finding sheds light on how climate change affects where cyclones are able to form. "Atmospheric circulation changes due to modern, human-induced climate warming are opposite of the circulation changes due to the Little Ice Age," notes Bramante. "So we can expect to see the opposite effect in the deep tropics -- a decrease in tropical cyclones close to the equator. It could be good news for the southern Marshall Islands, but other areas would be threatened as the average location of cyclone generation shifts north," he adds. During major storm events, coarse sediment is stirred up and deposited by currents and waves into "blue holes," ancient caves that collapsed and turned into sinkholes that filled with sea water over thousands of years. In a 2015 field study, Bramante and his colleagues took samples from a blue hole on Jaluit Atoll and found coarse sediment among the finer grains of sand. After sorting the grains by size and analyzing the data from Typhoon Ophelia, which devastated the atoll in 1958, the researchers had a template with which to identify other storm events that appear in the sediment record. They then used radiocarbon dating -- a method of determining age by the ratio of carbon isotopes in a sample -- to date the sediment in each layer. Armed with previously collected data about the ancient climate from tree rings, coral cores, and fossilized marine organisms, the researchers were able to piece together the conditions that existed at the time. By connecting this information with the record of storms preserved in sediment from Jaluit Atoll, the researchers demonstrated through computer modeling that the particular set of conditions responsible for equatorial trade winds heavily influenced the number and intensity of tropical cyclones and where they formed. Jeff Donnelly, a WHOI senior scientist and a co-author of the study, used similar methods to reconstruct the history of hurricanes in the North Atlantic and Caribbean. He plans to expand the Marshall Islands study westward to the Philippines to study where tropical cyclones have historically formed and how climate conditions influence a storm's track and intensity. Better understanding of how storms behaved under previous conditions will help scientists understand what causes changes in tropical cyclone activity and help people living in coastal communities prepare for extreme weather in the future, he said. "Through the geologic archive, we can get a baseline that tells us how at-risk we really are at any one location," Donnelly says. "It turns out the past provides some useful analogies for the climate change that we're currently undergoing. The earth has already run this experiment. Now we're trying to go back and determine the drivers of tropical cyclones." | Hurricanes Cyclones | 2,020
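The sediment-based identification described above boils down to flagging anomalously coarse layers against a fine-grained background. A minimal sketch with synthetic grain-size data (the core values and threshold are invented for illustration, not taken from the Jaluit cores):

```python
import numpy as np

# Flag anomalously coarse layers in a sediment core as candidate storm
# (tempestite) deposits. All core data here are synthetic.

rng = np.random.default_rng(1)
depth_cm = np.arange(0, 300, 1.0)
grain_um = 20.0 + rng.normal(0, 2.0, depth_cm.size)   # fine background mud
grain_um[[40, 120, 210]] = 150.0                      # coarse sand layers

threshold = np.median(grain_um) + 5 * np.std(grain_um)
events = depth_cm[grain_um > threshold]
print(f"candidate tempestite layers at depths (cm): {events}")
```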
November 11, 2020 | https://www.sciencedaily.com/releases/2020/11/201111122826.htm | Climate change causes landfalling hurricanes to stay stronger for longer | Climate change is causing hurricanes that make landfall to take more time to weaken, reports a study published 11th November 2020 in the journal | The researchers showed that hurricanes that develop over warmer oceans carry more moisture and therefore stay stronger for longer after hitting land. This means that in the future, as the world continues to warm, hurricanes are more likely to reach communities farther inland and be more destructive. "The implications are very important, especially when considering policies that are put in place to cope with global warming," said Professor Pinaki Chakraborty, senior author of the study and head of the Fluid Mechanics Unit at the Okinawa Institute of Science and Technology Graduate University (OIST). "We know that coastal areas need to ready themselves for more intense hurricanes, but inland communities, who may not have the know-how or infrastructure to cope with such intense winds or heavy rainfall, also need to be prepared." Many studies have shown that climate change can intensify hurricanes -- known as cyclones or typhoons in other regions of the world -- over the open ocean. But this is the first study to establish a clear link between a warming climate and the smaller subset of hurricanes that have made landfall. The scientists analyzed North Atlantic hurricanes that made landfall over the past half a century. They found that during the course of the first day after landfall, hurricanes now weaken almost twice as slowly as they did 50 years ago. "When we plotted the data, we could clearly see that the amount of time it took for a hurricane to weaken was increasing with the years. But it wasn't a straight line -- it was undulating -- and we found that these ups and downs matched the same ups and downs seen in sea surface temperature," said Lin Li, first author and PhD student in the OIST Fluid Mechanics Unit. The scientists tested the link between warmer sea surface temperature and slower weakening past landfall by creating computer simulations of four different hurricanes and setting different temperatures for the surface of the sea. Once each virtual hurricane reached category 4 strength, the scientists simulated landfall by cutting off the supply of moisture from beneath. Li explained: "Hurricanes are heat engines, just like engines in cars. In car engines, fuel is combusted, and that heat energy is converted into mechanical work. For hurricanes, the moisture taken up from the surface of the ocean is the "fuel" that intensifies and sustains a hurricane's destructive power, with heat energy from the moisture converted into powerful winds. Making landfall is equivalent to stopping the fuel supply to the engine of a car. Without fuel, the car will decelerate, and without its moisture source, the hurricane will decay." The researchers found that even though each simulated hurricane made landfall at the same intensity, the ones that developed over warmer waters took more time to weaken. "These simulations proved what our analysis of past hurricanes had suggested: warmer oceans significantly impact the rate that hurricanes decay, even when their connection with the ocean's surface is severed. The question is -- why?" said Prof. Chakraborty. Using additional simulations, the scientists found that "stored moisture" was the missing link. The researchers explained that when hurricanes make landfall, even though they can no longer access the ocean's supply of moisture, they still carry a stock of moisture that slowly depletes. When the scientists created virtual hurricanes that lacked this stored moisture after hitting land, they found that the sea surface temperature no longer had any impact on the rate of decay. "This shows that stored moisture is the key factor that gives each hurricane in the simulation its own unique identity," said Li. "Hurricanes that develop over warmer oceans can take up and store more moisture, which sustains them for longer and prevents them from weakening as quickly." The increased level of stored moisture also makes hurricanes "wetter" -- an outcome already being felt as recent hurricanes have unleashed devastatingly high volumes of rainfall on coastal and inland communities. This research highlights the importance for climate models to carefully account for stored moisture when predicting the impact of warmer oceans on hurricanes. The study also pinpoints issues with the simple theoretical models widely used to understand how hurricanes decay. "Current models of hurricane decay don't consider moisture -- they just view hurricanes that have made landfall as a dry vortex that rubs against the land and is slowed down by friction. Our work shows these models are incomplete, which is why this clear signature of climate change wasn't previously captured," said Li. The researchers now plan to study hurricane data from other regions of the world to determine whether the impact of a warming climate on hurricane decay is occurring across the globe. Prof. Chakraborty concluded: "Overall, the implications of this work are stark. If we don't curb global warming, landfalling hurricanes will continue to weaken more slowly. Their destruction will no longer be confined to coastal areas, causing higher levels of economic damage and costing more lives." | Hurricanes Cyclones | 2,020
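The weakening behavior described above is commonly summarized with an exponential model, V(t) = V0 * exp(-t / tau), where a larger decay timescale tau (warmer pre-landfall ocean, more stored moisture) keeps a storm stronger farther inland. The sketch below uses illustrative tau values, not the study's fitted numbers.

```python
import numpy as np

# Exponential-decay picture of post-landfall weakening:
#   V(t) = V0 * exp(-t / tau)
# tau values here are illustrative, chosen only to show the contrast.

V0 = 115.0                        # intensity at landfall, knots
hours = np.arange(0, 25, 6.0)     # first day after landfall
for tau, label in [(22.0, "cool-ocean storm"), (40.0, "warm-ocean storm")]:
    V = V0 * np.exp(-hours / tau)
    print(f"{label}: {np.round(V, 1)} kt")
# The warm-ocean case retains far more intensity after 24 h, mirroring
# the slower decay the study attributes to stored moisture.
```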
November 4, 2020 | https://www.sciencedaily.com/releases/2020/11/201104114737.htm | Effective government saves lives in cyclones, other disasters | Cyclone Nargis killed more than 138,000 people in Myanmar. It was a powerful category 3 or 4 storm at landfall, but tropical storms with similar wind speeds that year resulted in far fewer fatalities in other countries. | Elizabeth Tennant, postdoctoral associate in economics at Cornell University, wondered: What made the difference? To quantify the relationship between natural disaster outcomes and the effectiveness of governments and other institutions, Tennant and co-author Elisabeth Gilmore, associate professor in the Department of International Development, Community and Environment at Clark University, analyzed data from more than 1,000 tropical cyclones from 1979 to 2016. They found, in a paper published Nov. 3 in Moreover, storms concentrated in areas with weaker public services, as indicated by elevated infant mortality rates, are especially deadly, the researchers found. "These results suggest that policies and programs to enhance institutional capacity and governance can support risk reduction from extreme weather events," Tennant wrote. One of the original motivations of the study, Tennant said, was to better understand how effective institutions and governments can moderate the increasing risks of future extreme weather events due to climate change. This research contributes to the body of evidence that institutions are an important foundation for climate adaptation, Tennant said. There are many examples indicating that strong institutions -- including government -- play a critical role in protecting populations from adverse effects of natural disasters, Tennant said. But it is much more difficult to determine how universal this relationship is, she said, because there is so much variation in the frequency and severity of storms. Natural hazards such as cyclones, the researcher wrote, result in disasters only when vulnerable human systems are exposed to hazardous conditions. In their analysis, Tennant and Gilmore explicitly accounted for hazard exposure, connecting the analysis of governance and other indicators of well-being to estimates of the severity and exposure to the tropical cyclone hazard. They used several data sources to gather information about people, places, events and storms, including: the National Oceanic and Atmospheric Administration; the Centre for Research on the Epidemiology of Disasters Emergency Events Database; and World Governance Indicators. "We developed an approach where we carefully modeled the extent of the storms to match them to the measures of governance and living conditions in affected areas," Tennant said. "This helps us to identify what makes people vulnerable." Tennant first became interested in the intersections of disasters and development during her time as a Peace Corps volunteer in Honduras, where resources were constrained. "I saw how a decade after the devastating Hurricane Mitch [1998], the disaster still affected the local communities and their well-being," Tennant said. "So what does disaster preparedness look like in a country where many people are without secure access to nutritional food and clean drinking water now? To what extent can investing in health, education and the quality of governments and institutions also serve as a useful foundation for disaster risk reduction activities?" While the study does not suggest specific approaches to improving the quality and effectiveness of institutions, it does highlight their importance, Tennant said. "Ensuring that local institutions are involved and accountable for the delivery of public services may have multiple benefits," she said, "including reducing deaths from natural disasters." And while the researchers completed the study before the onset of the COVID-19 pandemic, the results are consistent with lessons emerging from the virus, Tennant said: "In our view, the pandemic has provided an immediate example of how government effectiveness plays an important role in shaping societal risks, regardless of a country's wealth." | Hurricanes Cyclones | 2,020
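Relating fatality counts to hazard exposure and governance, as described above, is typically done with a count-data regression. The following is a hedged sketch using synthetic data and a generic Poisson model; it is not the authors' exact specification.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic sketch of a count regression relating storm deaths to hazard
# exposure and governance. Data and coefficients are invented.

rng = np.random.default_rng(2)
n = 500
exposure = rng.normal(size=n)      # standardized storm/population exposure
governance = rng.normal(size=n)    # standardized effectiveness index
deaths = rng.poisson(np.exp(1.0 + 0.8 * exposure - 0.6 * governance))

X = sm.add_constant(np.column_stack([exposure, governance]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
print(fit.params)  # a negative governance coefficient means fewer deaths
```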
October 28, 2020 | https://www.sciencedaily.com/releases/2020/10/201028124517.htm | New dataset provides county-level exposure numbers for tropical cyclones, human health | Hurricanes and other tropical cyclones can severely impact human health in communities across the country, but data for these events is limited, especially in a format that is easy to link with human health outcomes. | Scientists have looked at death certificates to see if the cause could be linked clearly to a storm, but it is easy to miss something in this type of data review. A person could have a heart attack brought on by stress from clearing tree limbs in a yard, following a storm. An interdisciplinary team -- including epidemiologists, engineers, an atmospheric scientist and a software developer -- led by Colorado State University (CSU) has created an open source data set that can be used for epidemiological research on tropical cyclones. The new tool also provides insights that can guide the design and analysis of this type of research. The paper describing the new data set, "Assessing United States County-Level Exposure for Research on Tropical Cyclones and Human Health," is published Oct. 28 in Brooke Anderson, lead author of the paper and an associate professor in the Department of Environmental and Radiological Health Sciences at CSU, teamed up on this project with scientists including Andrea Schumacher, research associate with the Cooperative Institute for Research in the Atmosphere (CIRA) at CSU. "For heatwaves, there has already been a lot of research on what the risks are for human health," said Anderson. Scientists can estimate community-wide deaths and illnesses associated with several types of climate-related disasters, including heat waves, floods and wildfires, she added. Anderson said that with this new data set, scientists can now analyze multiple storms in different places and time periods, and drill down to see what happens in different health outcomes for people. Prior to the release of this new tool, most research had focused on data related to a single storm. Schumacher said she's always been interested in looking at how counties across the country are affected by hurricane winds on a broad scale. And she found the perfect research partner in Anderson. "Brooke and I found that we had a similar interest in characterizing hurricane winds," said Schumacher, who helps develop satellite-based products for hurricane forecasters. The data set is designed for others to build upon: the team used what's known as the R programming language to create it, which will allow scientists from around the world to add new facets and enrich what the team has started. Anderson said the concept is similar to when a child gets a toy train set, and it just comes with an oval. "Next, the grandparents come along, and you get a bridge or you get new roads that head off in different directions," she said. The new tool -- maintained by an international group of volunteers on the Comprehensive R Archive Network, CRAN -- also provides very precise data. A scientist can look at rainfall during Hurricane Floyd from several days before (or after) the storm came, Anderson said. CSU scientists are already using the new data set to look at Medicare hospitalizations following storms over a decade in the Eastern United States. Anderson said the team found that the number of people being hospitalized for respiratory conditions tended to increase when the storm hit, and a week after. Another team she's part of used the dataset to look at how tropical cyclones affect different birth outcomes, including pre-term birth and pregnancy outcomes. Schumacher said she sees lots of possibilities for where this research tool can go. "From my end, there's still plenty of work to do," she said. "What we've created is a really good start. I can see all kinds of ways that we can improve on it, especially as we get better and better observations." | Hurricanes Cyclones | 2,020
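At its core, a county-level exposure data set like this asks whether a storm's track passed close enough to a county. Below is a minimal sketch of that distance test; the coordinates and the 100-km cutoff are illustrative assumptions, and the published data set also layers wind and rain metrics on top of track proximity.

```python
import math

# Core distance test for county-level exposure: did the storm track pass
# within a cutoff of the county centroid? All values are illustrative.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

track = [(25.1, -80.4), (26.0, -81.2), (27.2, -81.9)]   # 6-hourly fixes
county = (26.1, -81.4)                                  # county centroid

closest = min(haversine_km(lat, lon, *county) for lat, lon in track)
print(f"closest approach: {closest:.0f} km; exposed: {closest <= 100.0}")
```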
October 27, 2020 | https://www.sciencedaily.com/releases/2020/10/201027192405.htm | Hurricanes pack a bigger punch for Florida's west coast | Hurricanes, the United States' deadliest and most destructive weather disasters, are notoriously difficult to predict. With the average storm intensity as well as the proportion of storms that reach category 4 or 5 likely to increase, more accurate predictions of future hurricane impacts could help emergency officials and coastal populations better prepare for such storms -- and ultimately, save lives. | Such predictions rely on historical records that reveal cyclic changes, such as the El Niño-Southern Oscillation, that can affect hurricane frequency. But the short observational records that exist for many regions, including Florida's East Coast, are inadequate for detecting climate patterns that fluctuate over longer timeframes. Now new research presented Wednesday at the annual meeting of The Geological Society of America is extending Florida's hurricane record thousands of years back in time -- and hinting at a surprise finding. "There has been little to no research done on the hurricane record for Florida's East Coast," explains Ilexxis Morales, a graduate student in the Environmental Science program at Florida Gulf Coast University and the study's lead author. "The national hurricane database for this area currently only extends back to the 1850s," she says. But what that record suggests, says Morales, is quite intriguing, especially with respect to intense (category 3-5) storms. "It shows that at least for the past 170 years, Florida's Atlantic Coast has been hit by fewer intense hurricanes than the state's Gulf Coast," she says. To better understand this discrepancy, Morales and her Florida Gulf Coast University co-authors, Joanne Muller and James Javaruski, collected sediment cores from a series of lagoons tucked behind narrow barrier islands along the state's eastern coast. Their analysis shows that in contrast to the dark organic matter that comprises most of the cores, hurricanes leave behind a coarser deposit distinctive enough to be called a "tempestite." "When a large storm comes through the area," says Morales, "it picks up light-colored sand from the beach and deposits it in the lagoon." Because the grains of sand deposited by large storms are coarser than the organic-rich muds, the researchers can detect ancient tempest deposits using simple grain-size analyses. After identifying the tempest deposits (called tempestites), the team used a variety of methods, including lead-210 dating with a germanium detector and radiocarbon dating, to determine their ages. While still preliminary, the results from the seven cores the researchers have analyzed to date suggest that there are fewer visible tempestites in the East Coast cores compared to those analyzed from the West Coast. The results hint that the pattern of more major hurricanes hitting Florida's Gulf Coast may extend thousands of years back in time. Morales speculates this difference could be due to the shifting position of the Bermuda High, a semi-permanent ridge of high pressure that can affect a hurricane's direction. "When the Bermuda High is in a more northeasterly position, hurricanes tend to track along Florida's East Coast and up to the Carolinas," says Morales. "When it shifts southwestward towards the U.S., the high tends to push storms into the Gulf of Mexico instead." Sea-surface temperatures can also help explain the difference, says Morales. "Normally the Atlantic is colder than the Gulf, and this colder water makes it harder for hurricanes to sustain their strength," she explains. Similar "paleotempestology" studies have been conducted in other locations that are also susceptible to hurricanes, including Texas, Louisiana, New England, and even Australia, and the results have a number of practical applications. "This data will go to the national hurricane database, which will then help meteorologists better predict storm paths," Morales says. The data will also help show which areas are more susceptible to hurricane damage, enabling insurance companies to better adjust hurricane-insurance rates and developers to select building sites less susceptible to storm surge. Once complete, says study co-author James Javaruski, the longer storm record could help researchers determine whether changes observed in it can be attributed to human-induced climate change. The findings can also offer insight into what could happen in the future. "If we see in other studies that sea surface temperatures were increasing over a certain time frame and find that hurricanes also increased over that same time frame," Javaruski says, "it can give us a good idea of what to expect as we artificially raise sea surface temperatures now." | Hurricanes Cyclones | 2,020
October 26, 2020 | https://www.sciencedaily.com/releases/2020/10/201026154004.htm | Greenhouse effect of clouds instrumental in origin of tropical storms | With the tropical storm season in the Atlantic Ocean underway and already well into the Greek alphabet for naming, better storm track prediction has allowed timely evacuations and preparations. However, the formation and intensification of these storms remains challenging to predict, according to an international team of researchers who are studying the origin of tropical cyclones. | "There are critical questions around the formation and intensification of hurricanes that make forecasting them extremely difficult," said James H. Ruppert Jr., assistant research professor of meteorology and atmospheric science, Penn State. "We don't yet have sufficient understanding of the processes that drive storm formation." Tropical depressions are the weak precursors to intense hurricanes, usually identifiable as a disorganized cluster of clouds in a weak low-pressure area, according to Ruppert. "The tropical depression stage is usually the first time that forecasters are able to identify and start tracking a storm," he said. Environmental conditions usually provide a narrow window in which these depressions can form into intense tropical cyclones. "Understanding the transition from this depression stage to an intensifying hurricane is what we are after," said Ruppert. To investigate tropical cyclone formation, the researchers looked at storms forming in the Atlantic and in the western Pacific oceans. They considered two storms, Super Typhoon Haiyan, which occurred in 2013, and Hurricane Maria, which occurred in 2017. The researchers found that infrared radiative feedback from clouds creates a localized greenhouse effect that traps heat in the area of the tropical depression. Deep clouds that are heavily laden with water droplets and ice crystals trap outgoing infrared radiation and warm the atmosphere. This local warming causes lifting motion in the storm, which helps fully saturate the atmosphere and increase inward flowing winds near the ocean's surface. As long as the storm is more than a few degrees above or below the equator, the Coriolis Effect causes these inward flowing winds to form a circulation near the surface. This circulation then intensifies with the help of surface evaporation and eventually forms a central eye, taking on the classic appearance of an intense tropical cyclone. The researchers found that the localized warming created by the cloud greenhouse effect helped accelerate the formation of both Haiyan and Maria. When they removed the effect in the model simulation, the storms either formed more slowly or not at all. The researchers argue that the cloud greenhouse effect is therefore likely instrumental in the formation of many tropical storm events. They report their results today (Oct. 26) in the "Our ultimate goal is to better forecast tropical cyclones, and it currently remains very hard to forecast storm formation," said Ruppert. "Storm track forecasting has improved immensely in recent decades. Large-scale winds primarily control storm tracks and our ability to both measure and predict these winds has improved greatly, allowing for major progress in storm track prediction. The small-scale processes that govern storm formation and intensification in the first place -- that is where our understanding and ability to observe are still really challenged." | Hurricanes Cyclones | 2,020
October 19, 2020 | https://www.sciencedaily.com/releases/2020/10/201019164947.htm | Tropical cyclones moving faster in recent decades | About 40% of the U.S. population lives in a coastal area, and in Hawai'i, nearly everyone is vulnerable to the effects of tropical storms and hurricanes. | Tropical cyclones, regionally known as hurricanes or typhoons, have been moving across ocean basins faster since 1982, according to a new study published in The study, led by Sung-Hun Kim, a post-doctoral researcher in the University of Hawai'i at Manoa School of Ocean and Earth Science and Technology (SOEST) at the time of the work, also determined the North Atlantic region has experienced an increase in the frequency of hurricanes and that tropical cyclone activity has shifted toward the poles in both the Pacific and Atlantic Oceans. The researchers, including Pao-Shin Chu, atmospheric sciences professor in SOEST, focused on tropical cyclones since 1982, when modern, reliable satellite data became available. They assessed the frequency and locations of storms and trends in tropical cyclone movement speed -- how quickly a storm moves forward -- globally and regionally in each ocean basin. "For people in Hawai'i, the threat of hurricanes is always there every year," said Chu. "If hurricanes move faster they would pose danger to coastal communities and emergency managers because they would have less time to prepare for evacuation and other measures." The recent study suggests the reason for the observed changes is a combination of natural variations and human-induced climate change. The researchers continue to study the trends in, and connections between, climate variability and tropical cyclone activity. | Hurricanes Cyclones | 2,020
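Translation speed -- "how quickly a storm moves forward" -- is derived from successive best-track center positions: great-circle distance divided by elapsed time. A minimal sketch with made-up 6-hourly fixes (the positions below are illustrative, not from the study's data):

```python
import math

# Translation speed from successive 6-hourly best-track center fixes.

def gc_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

fixes = [(15.0, 140.0), (15.4, 139.2), (15.9, 138.3)]   # fixes 6 h apart
for (la1, lo1), (la2, lo2) in zip(fixes, fixes[1:]):
    print(f"segment speed: {gc_km(la1, lo1, la2, lo2) / 6.0:.1f} km/h")
```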
October 13, 2020 | https://www.sciencedaily.com/releases/2020/10/201008104301.htm | Arctic weather observations can improve hurricane track forecast accuracy | In 2017, Category 5 Hurricane Irma devastated islands of the Lesser and Greater Antilles before turning northward and ultimately making landfall in southwestern Florida. Forecasting the timing and position of that northward turn was critical for Floridians to prepare for the storm's impact, but the uncertainty surrounding prediction of the upper-level trough that would steer the turn made this difficult. Collecting additional meteorological data, including measurements from locations as distant as the Arctic, could help meteorologists forecast the tracks of future tropical cyclones like Irma. | In a new study published in the journal Although hurricane track forecasting has significantly improved in recent decades, there are still significant errors in some cases, and the consequences can be severe. In particular, uncertainty regarding the paths of upper-level troughs with strong winds in the mid-latitudes can lead to greater uncertainty when they influence the tracks of tropical cyclones. The research team found that in cases of hurricanes steered by upper-level troughs, forecasting errors of the hurricanes' central positions were larger than those in cases not influenced by upper-level troughs. Lead author Kazutoshi Sato explains, "During the forecast period of Hurricane Irma in 2017, there was large meandering of the jet stream over the North Pacific and North Atlantic, which introduced large errors in the forecasts. When we included additional radiosonde observation data from the Research Vessel Mirai collected in the Arctic in the late summer of 2017, the error and ensemble spread of the upper-level trough at the initial time of forecast were improved, which increased the accuracy of the track forecast for Irma." The researchers also investigated the effect of including additional dropsonde data collected by the United States Air Force Reserve Command and the Aircraft Operations Center of the National Oceanic and Atmospheric Administration over the Atlantic Ocean near Hurricane Irma in 2017. Hurricane forecast accuracy was improved both by dropsonde measurements near the hurricanes and by radiosonde observations over the Arctic Ocean. According to co-author Jun Inoue, an associate professor at the National Institute of Polar Research, "Our findings show that developing a more efficient observation system over the higher latitudes will be greatly beneficial to tropical cyclone track forecasting over the mid-latitudes, which will help mitigate the human casualties and socioeconomic losses caused by these storms." | Hurricanes Cyclones | 2,020
October 9, 2020 | https://www.sciencedaily.com/releases/2020/10/201009102731.htm | Ice melt projections may underestimate Antarctic contribution to sea level rise | Fluctuations in the weather can have a significant impact on melting Antarctic ice, and models that do not include this factor can underestimate the global impact of sea level rise, according to Penn State scientists. | "We know ice sheets are melting as global temperatures increase, but uncertainties remain about how much and how fast that will happen," said Chris Forest, professor of climate dynamics at Penn State. "Our findings shed new light on one area of uncertainty, suggesting climate variability has a significant impact on melting ice sheets and sea level rise." While it is understood that continued warming may cause rapid ice loss, models that predict how Antarctica will respond to climate change have not included the potential impacts of internal climate variability, like yearly and decadal fluctuations in the climate, the team of scientists said. Accounting for climate variability caused models to predict an additional 2.7 to 4.3 inches -- 7 to 11 centimeters -- of sea level rise by 2100, the scientists recently reported in the journal "That increase alone is comparable to the amount of sea level rise we have seen over the last few decades," said Forest, who has appointments in the departments of meteorology and atmospheric science and geosciences. "Every bit adds on to the storm surge, which we expect to see during hurricanes and other severe weather events, and the results can be devastating." The Antarctic ice sheet is a complex system, and modeling how it will evolve under future climate conditions requires thousands of simulations and large amounts of computing power. Because of this, modelers test how the ice will respond using a mean temperature found by averaging the results of climate models. However, that process smooths out peaks caused by climate variability and reduces the average number of days above temperature thresholds that can impact ice sheet melt, creating a bias in the results, the scientists said. "If we include variability in the simulations, we are going to have more warm days and more sunshine, and therefore when the daily temperature gets above a certain threshold it will melt the ice," Forest said. "If we're just running with average conditions, we're not seeing these extremes happening on yearly or decadal timescales." To study the effects of internal climate variability, the researchers analyzed two large ensembles of climate simulations. Large ensembles are generated by starting each member with slightly different initial conditions. The chaotic nature of the climate system causes each member to yield slightly different responses, and this represents internally generated variability, the scientists said. Instead of averaging the results of each ensemble, the scientists fed the atmospheric and oceanic data representing this variability into a three-dimensional Antarctic ice sheet model. They found atmospheric variations had a larger and more immediate impact on the ice sheet, but ocean variability was also a significant factor. Extensive parts of the ice sheet are in contact with ocean water, and previous studies have suggested that warming oceans could cause large chunks to break away. The process may expose ice cliffs so tall that they collapse under their own weight, inducing a domino effect that further depletes the ice shelf. The scientists found that model simulations that did not include the effects of internal climate variability significantly delayed the retreat of the ice sheet by up to 20 years and underestimated future sea level rise. "This additional ice melt will impact the hurricane storm surges across the globe. Additionally, for years, the IPCC reports have been looking at sea level rise without considering this additional variability and have been underestimating what the impact may be," Forest said. "It's important to better understand these processes contributing to the additional ice loss because the ice sheets are melting much faster than we expected." The National Science Foundation and the Penn State Center for Climate Risk Management funded this research. | Hurricanes Cyclones | 2,020
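The averaging bias described above is easy to demonstrate: counting days above a melt threshold in individual ensemble members gives a very different answer than counting them in the ensemble-mean series. A synthetic illustration (the temperatures and ensemble size are invented, not from the study's ensembles):

```python
import numpy as np

# Days above a melt threshold, per member vs. in the ensemble mean.
# Members share a mean below the threshold, so averaging first hides
# nearly all threshold crossings.

rng = np.random.default_rng(3)
members, days = 20, 365
temps = -2.0 + rng.normal(0, 3.0, size=(members, days))   # deg C

per_member = (temps > 0.0).sum(axis=1).mean()   # average over members
from_mean = (temps.mean(axis=0) > 0.0).sum()    # from the averaged series
print(f"melt days per member: {per_member:.0f}; in mean series: {from_mean}")
# Typical output: dozens of melt days per member, ~0 in the mean series.
```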
September 30, 2020 | https://www.sciencedaily.com/releases/2020/09/200930144433.htm | Acropora spp. coral still thrives in the holdout refuge of Coral Gardens, Belize | Coral Gardens Reef in Belize remains a refuge for | Once a key coral species, Acropora provided the architectural framework for sprawling coral reef structures across the tropical western North Atlantic/Caribbean region. In order to test whether one of the largest populations of extant Acropora cervicornis in the western Caribbean was recently established (post-1980s) or a longer-lived, stable population, the authors collected 232 samples of premodern and recently dead A. cervicornis coral skeleton material across 3 sites at Coral Gardens Reef, Belize, using a subset of these samples for radiometric as well as high-precision uranium-thorium dating. Sample sites were chosen using a new genetic-aging technique to identify key sites and minimize damage to the living coral. The data revealed coral samples ranging in age from 1910 (based on carbon dating) or 1915 (based on thorium dating) to at least November 2019; Greer and colleagues were also able to determine that Coral Gardens Reef has been home to consistent and sustained living A. cervicornis coral throughout the 1980s and up to at least 2019. While the authors cannot exclude the possibility of short gaps in the presence of A. cervicornis prior to 1980, the radiometric ages and continuous coral growth patterns found at the sample sites strongly suggest that Acropora coral has been growing and thriving at Coral Gardens for over 100 years. Though the results from this study are not sufficient to determine exactly why this site seems to be a refuge for Acropora coral -- Coral Gardens has been subject to the same increases in temperature and tropical storm/hurricane activity as reefs in the region with devastated coral populations, and the genetic diversity of Acropora is not unusually high -- the findings here may be key to efforts to grow, preserve, conserve, and re-seed Caribbean reefs, as is identifying similar coral refuge sites in the area. The authors add: "Now that we have identified an exceptional refuge for Caribbean Staghorn corals, we hope to better determine, in collaboration with reef scientists with many additional interests and expertise, the sources for success at Coral Gardens." | Hurricanes Cyclones | 2,020
September 22, 2020 | https://www.sciencedaily.com/releases/2020/09/200922135747.htm | Compounding impact of severe weather events fuels marine heatwave in the coastal ocean | Several coastal communities are picking up the pieces after being ravaged by hurricanes in the past month. Hurricane Laura, a category 4, and Hurricane Sally, a category 2, seemed to meander their way across the Gulf of Mexico, constantly shifting forecasts and keeping meteorologists on their toes. In the hours before these storms struck land, they seemed to explode in intensity. | Researchers at the Dauphin Island Sea Lab, with support from the Jet Propulsion Laboratory, can offer insight into why these storms intensified quickly as they moved across the continental shelf. "Surprisingly, both Hurricane Laura and Hurricane Sally appeared to have similar setups to Hurricane Michael, with both storm events being preceded by smaller storms (i.e. Hurricane Hanna and Marco, respectively)," Dr. Brian Dzwonkowski explained. "This pre-storm setup of the oceanic environment likely contributed to the intensification prior to landfall. Importantly, this pre-landfall intensification was not well predicted by hurricane models or forecasts, which as you can imagine is critical information for evacuation and disaster preparation." Dzwonkowski and his team's publication, "Compounding impact of severe weather events fuels marine heatwave in the coastal ocean," outlines how one storm can affect the intensity of another by restructuring the thermal properties of the water column. The research focuses on Hurricane Michael, which devastated Mexico Beach, Florida, and the surrounding communities on October 10, 2018. The category 5 storm intensified hours before making landfall. Dzwonkowski, a physical oceanographer with the Dauphin Island Sea Lab and Associate Professor at the University of South Alabama in the Department of Marine Sciences, and his team tracked down the key events and processes that pushed the coastal waters in the Gulf of Mexico to an extremely warm state (i.e. a marine heatwave), likely contributing to the intensification of a storm so close to shore. Unlike the deep ocean, the continental shelf has a shallow bottom that limits how much cold water can be mixed up to the surface, cooling the sea surface temperature and weakening approaching storms. Dzwonkowski and his team focused on how a strong mixing event pushes surface heat downward and clears the bottom water of its cold water reserve. If this mixing is followed by a period of rewarming, such as an atmospheric heatwave, the shelf's oceanic environment can be primed for the generation of extreme storm events like Hurricane Michael. "This work shows that understanding the preceding weather conditions in a region where a storm is going to make landfall can improve interpretation of hurricane model forecasts and what the storm is likely to do prior to landfall," says Dr. Dzwonkowski. In mapping out heat flux and mixing, the team focused on the Mississippi Bight in late summer and early fall, with data gathered by a mooring site off Dauphin Island's coastline. The mooring site collects data throughout the water column, allowing the full heat content of the shelf to be determined.
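As a sketch of how a mooring's temperature-depth profile translates into shelf heat content, consider the standard depth integral below; the depths, temperatures and reference temperature are made-up values for illustration, not Mississippi Bight data:

```python
import numpy as np

rho, c_p = 1025.0, 3985.0   # seawater density (kg/m^3) and specific heat (J/(kg K))

z = np.array([0.0, 5.0, 10.0, 15.0, 20.0])    # measurement depths (m), shallow shelf
T = np.array([30.0, 29.5, 27.0, 24.0, 23.0])  # temperature profile (deg C), illustrative

T_ref = 20.0  # arbitrary reference temperature for the heat anomaly

# Trapezoidal integration of rho * c_p * (T - T_ref) over depth -> joules per m^2
anom = T - T_ref
heat_content = rho * c_p * np.sum(0.5 * (anom[1:] + anom[:-1]) * np.diff(z))

print(f"{heat_content:.3e} J/m^2")  # ~5.5e8 J/m^2 for these made-up values
```

A wind-driven mixing event redistributes T over z without immediately changing this column total, which is why a mixed-then-rewarmed shelf can end up holding far more heat than it started with.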
The period prior to the landfall of Hurricane Michael turned out to have the warmest ocean conditions in the 13-year record. "Turns out hurricanes and atmospheric heatwaves will be getting stronger in a warming world, which would indicate the identified sequence of events that generate these extreme conditions may become more frequent," Dzwonkowski said. "The occurrence of extreme heat content events, like marine heatwaves, has significant implications for a broad range of scientific management interests beyond hurricane intensity." Importantly, the mechanisms that generated this marine heatwave are expected to be more frequent and intense in the future due to climate change, increasing the likelihood of such extreme conditions. For example, coral reefs and hypoxia-prone shelves are already stressed by long-term warming trends. These temperature-sensitive benthic communities and habitats are typically of significant societal and economic value. As such, the newly identified sequence of compounding processes is expected to impact a range of coastal interests and should be considered in management and disaster response decisions. This research was funded by the NOAA RESTORE Science Program and the NOAA NGI NMFS Regional Collaboration network. | Hurricanes Cyclones | 2,020
September 1, 2020 | https://www.sciencedaily.com/releases/2020/09/200901112201.htm | Loggerhead turtles record a passing hurricane | In early June 2011, NOAA Fisheries researchers and colleagues placed satellite tags on 26 loggerhead sea turtles in the Mid-Atlantic Bight. The tagging was part of ongoing studies of loggerhead movements and behavior. The Mid-Atlantic Bight, off the U.S. East Coast, is the coastal region from Cape Hatteras, North Carolina to southern Massachusetts. A little more than two months later, on August 28, Hurricane Irene passed through the area, putting 18 of the tagged turtles in its direct path. The researchers were able to track changes in the turtles' behavior coinciding with the hurricane, and found that they reacted in various ways. | "Hurricanes are some of the most intense weather events loggerheads in the mid-Atlantic experience, and we thought it was worth investigating how turtles in our dataset may be influenced by these dramatic environmental changes," said Leah Crowe, a contract field biologist at the NOAA Northeast Fisheries Science Center's laboratory in Woods Hole, Massachusetts, and lead author of the recently published study. Satellite tags attached to a turtle's carapace, or shell, transmitted the turtles' location and dive behavior. They also recorded sea-surface temperatures and temperature-depth profiles for approximately 13 months. This enabled the researchers to investigate the movements of 18 juvenile and adult-sized loggerhead turtles and associated oceanographic conditions as the hurricane moved through the region. Most of the turtles moved northward during the hurricane, aligning themselves with the surface currents -- perhaps to conserve energy. Researchers observed longer dive durations after the hurricane for turtles that stayed in their pre-storm foraging areas. Some dives lasted an hour or more, compared with less than 30 minutes for a typical dive before the storm. The turtles that left their foraging areas after the hurricane passed moved south earlier than would be expected based on their normal seasonal movements. This change was also more than a month earlier than the typical seasonal cooling in the water column, which is when the foraging season for loggerhead turtles ends in the Mid-Atlantic Bight. "Loggerheads experience environmental changes in the entire water column from the surface to the bottom, including during extreme weather events," said Crowe. "This study was an opportunistic look at turtle behavior during a hurricane. Their behavior makes loggerheads good observers of oceanographic conditions where they forage." The study was conducted by researchers at the Northeast Fisheries Science Center and colleagues at the nearby Coonamessett Farm Foundation in East Falmouth, Massachusetts. The team has tagged more than 200 loggerheads in the Mid-Atlantic Bight since 2009. This work has created a continuous time-series of data on loggerhead sea turtles. With 10 years of data, researchers can now get a deeper understanding of how turtles behave and what environmental factors drive them. They can also look back at the data and ask new questions, as they did in this study. Waters in the Mid-Atlantic Bight are highly stratified, or layered, by temperature in the summer. At the surface, water is warm. A cold layer, also called a cold pool, forms beneath this warm layer and is present from May to October. The presence of the cold pool overlaps with the Atlantic hurricane season, which runs from June through November.
It also overlaps with the presence of foraging loggerheads that are in the area between May and September. Hurricane modeling is especially difficult in the Mid-Atlantic Bight because of the cold pool. In this study, it was unclear which aspect of the environmental changes prompted behavioral changes. Previous studies have found that loggerhead behavior appears to be sensitive to changes in water temperatures throughout the water column. Hurricanes cause the water layers to mix, which creates cooler surface temperatures. The mixing also disrupts the thermocline -- the boundary layer between warm surface waters and colder, deeper waters. Ocean temperature data recorded by the turtles' satellite tags are consistent with observations from weather buoys and autonomous gliders operating in the region. Depending on how many tags are deployed, data from tagged turtles can cover a more extensive area within a season than other oceanographic data sources. More measurements of water temperatures throughout the water column in the region could help improve oceanographic models. Researchers say data from the turtle tags are an underused resource with the potential to improve weather models, including hurricane models. Many of the natural and human-induced impacts on sea turtle behavior, or the environments that sea turtles live in, are still unknown. Previous studies indicate that sounds from dredge operations, seismic activity, offshore wind farm development, and marine recreation may also affect sea turtle distribution and dive behavior. Turtles might be impacted directly or through habitat alterations. While studies have looked at how tropical storms and hurricanes affect some marine species, there are few examples examining sea turtle interactions with large storms. In this study, turtle behavior did not return to pre-storm behavior within two weeks after the storm. "The long-term cumulative effects of a changing climate and the increase in intensity of hurricanes and other storms is something that needs to be looked at. Changes in sea turtle movements and behavior can affect abundance estimates and management decisions," Crowe said. "This study reminds us that turtles live in a dynamic environment, and we cannot assume their behavior will be consistent throughout space and time." The study was supported by funds from the Atlantic Marine Assessment Program for Protected Species and the New England/Greater Atlantic Region's Research Set-Aside Program. | Hurricanes Cyclones | 2,020
August 28, 2020 | https://www.sciencedaily.com/releases/2020/08/200828081023.htm | Amateur drone videos could aid in natural disaster damage assessment | It wasn't long after Hurricane Laura hit the Gulf Coast Thursday that people began flying drones to record the damage and posting videos on social media. Those videos are a precious resource, say researchers at Carnegie Mellon University, who are working on ways to use them for rapid damage assessment. | By using artificial intelligence, the researchers are developing a system that can automatically identify buildings and make an initial determination of whether they are damaged and how serious that damage might be. "Current damage assessments are mostly based on individuals detecting and documenting damage to a building," said Junwei Liang, a Ph.D. student in CMU's Language Technologies Institute (LTI). "That can be slow, expensive and labor-intensive work." Satellite imagery doesn't provide enough detail and shows damage from only a single viewpoint -- vertical. Drones, however, can gather close-up information from a number of angles and viewpoints. It's possible, of course, for first responders to fly drones for damage assessment, but drones are now widely available among residents and routinely flown after natural disasters. "The number of drone videos available on social media soon after a disaster means they can be a valuable resource for doing timely damage assessments," Liang said. Xiaoyu Zhu, a master's student in AI and Innovation in the LTI, said the initial system can overlay masks on parts of the buildings in the video that appear damaged and determine if the damage is slight or serious, or if the building has been destroyed. | Hurricanes Cyclones | 2,020
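The CMU system itself is not published in this article, but a minimal sketch of the kind of off-the-shelf instance-segmentation step such a pipeline could start from -- a pretrained Mask R-CNN from torchvision applied to a single video frame -- looks roughly like this; the model, score threshold and random frame are stand-ins, not the authors' actual system:

```python
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

# General-purpose instance-segmentation model pretrained on COCO,
# standing in for a purpose-trained building-damage model.
model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()

frame = torch.rand(3, 480, 640)  # placeholder for one RGB video frame in [0, 1]

with torch.no_grad():
    pred = model([frame])[0]     # dict with "boxes", "labels", "scores", "masks"

keep = pred["scores"] > 0.7      # keep only confident detections
masks = pred["masks"][keep]      # per-instance masks to overlay on the frame
print(masks.shape)               # (num_detections, 1, 480, 640)
```

A real damage-assessment model would replace the COCO classes with building-damage categories (slight, serious, destroyed) learned from labeled disaster footage.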
August 27, 2020 | https://www.sciencedaily.com/releases/2020/08/200827130612.htm | Hurricanes could be up to five times more likely in the Caribbean if tougher targets are missed | Global warming is dramatically increasing the risk of extreme hurricanes in the Caribbean, but meeting more ambitious climate change goals could up to halve the likelihood of such disasters in the region, according to new research. | The study, led by the University of Bristol, analysed future projections of hurricane rainfall in the Caribbean and found the region to be particularly vulnerable to climate change, with extreme hurricane rainfall events becoming as much as five times more likely in a warmer world. "Hurricane research has previously focused on the United States, so we wanted to look at the Caribbean region, which has fewer resources to recover. The findings are alarming and illustrate the urgent need to tackle global warming to reduce the likelihood of extreme rainfall events and their catastrophic consequences, particularly for poorer countries which take many years to recover," said lead author Emily Vosper, a research student at the School of Computer Science at the University of Bristol. The researchers generated thousands of synthetic hurricanes under three climate scenarios: present-day conditions compared to the Paris Agreement goals of 1.5 degrees Celsius and 2°C warming above pre-industrial levels. The main objective of the Paris Agreement, a global framework to tackle climate change, is to hold the global average temperature increase to well below 2°C above pre-industrial levels and endeavour to limit the temperature increase to 1.5°C. Focusing their analysis on the Caribbean region, the study generated rainfall statistics by applying a physics-based model to the synthetic hurricanes. The model takes into account several factors including the land features and large-scale winds, and has been shown to give realistic results compared to observations of real-life hurricanes. Hurricane Maria brought as much as a quarter of normal annual rainfall to some regions of Puerto Rico when it made landfall in 2017, and storms of this magnitude are roughly once-in-100-year events. The results show that in a 2°C warmer world, an event of similar size to Maria would be more than twice (2.3 times) as likely, occurring once every 43 years. Similarly, a 100-year storm affecting the Bahamas would be 4.5 times as likely under the 2°C Paris Agreement scenario compared to the present day. Under the more ambitious goal of 1.5°C warming, such extreme hurricane rainfall events affecting the Dominican Republic would occur roughly once every 57 years, which is half as likely compared to the 2°C warming scenario where they would occur once every 30 years. Emily said: "We expected extreme hurricanes to be more prevalent in the 2°C global warming scenario, but the scale of the projected increases was surprising and should serve as a stark warning to countries across the globe, underscoring the importance of keeping climate change under control." The projections reinforce the Intergovernmental Panel on Climate Change special report, which concludes that restricting global warming to 1.5°C would limit the risk of climate-related hazards, such as torrential rainfall, drought, and temperature extremes. Emily said: "Our findings show that the impacts of a 2°C warming above pre-industrial levels are set to disproportionately affect the Caribbean.
By focusing efforts to stabilise global warming to the more ambitious 1.5°C goal, we could dramatically reduce the likelihood of extreme hurricane rainfall events in the area, particularly in the Eastern Caribbean region." It takes at least six years for even the richest of the Caribbean countries to rebuild after a major hurricane hits, stalling economic growth. Building resilient infrastructure throughout the islands is not feasible due to financial and time constraints. The study recommends its findings be used to inform a multi-hazard, multi-scale approach that identifies the most at-risk areas so resilience funding and strategies can be more effectively targeted. Emily said: "Resources to mitigate damage are limited, so our findings could help highlight the hotspots in greatest danger and need. An integrated climate risk approach is needed to fully understand the threat of future hurricanes to Caribbean populations." Further studies could therefore incorporate factors that directly affect the health and well-being of local populations -- such as storm surge, flood and landslide modelling -- into the rainfall results to quantify such threats and feed into adaptation and resilience planning. "Reducing the likelihood of extreme hurricanes should be the overriding priority. Our research clearly illustrates how vital it is to keep striving to meet the lower global warming temperature target, and the collective responsibility all countries, cities, communities, governments and individuals share to make that happen." | Hurricanes Cyclones | 2,020
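The return-period arithmetic behind figures like those above is simple to check, treating a T-year event as one with an annual exceedance probability of 1/T; a minimal sketch using the likelihood ratios quoted in the article:

```python
def scaled_return_period(t_years, likelihood_ratio):
    """Return period after the annual exceedance probability 1/T
    is multiplied by a likelihood ratio from a warming scenario."""
    return 1.0 / (likelihood_ratio / t_years)

# Puerto Rico, 2 C scenario: a 100-year event becomes 2.3x as likely
print(round(scaled_return_period(100, 2.3)))  # ~43 years
# Bahamas, 2 C scenario: 4.5x as likely
print(round(scaled_return_period(100, 4.5)))  # ~22 years
```

The 43-year figure reproduces the number reported for a Maria-sized event in a 2°C world.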
August 24, 2020 | https://www.sciencedaily.com/releases/2020/08/200824144410.htm | Contagion model predicts flooding in urban areas | Inspired by the same modeling and mathematical laws used to predict the spread of pandemics, researchers at Texas A&M University have created a model to accurately forecast the spread and recession of floodwaters in urban road networks. The result is a simple yet powerful mathematical approach to a complex problem. | "We were inspired by the fact that the spread of epidemics and pandemics in communities has been studied by people in health sciences and epidemiology and other fields, and they have identified some principles and rules that govern the spread process in complex social networks," said Dr. Ali Mostafavi, associate professor in the Zachry Department of Civil and Environmental Engineering. "So we ask ourselves, are these spreading processes the same for the spread of flooding in cities? We tested that, and surprisingly, we found that the answer is yes." The findings of this study were recently published. The contagion model, Susceptible-Exposed-Infected-Recovered (SEIR), is used to mathematically model the spread of infectious diseases. In relation to flooding, Mostafavi and his team integrated the SEIR model with a network spread process in which the probability of flooding of a road segment depends on the degree to which the nearby road segments are flooded. In the context of flooding, susceptible is a road that can be flooded because it is in a flood plain; exposed is a road that has flooding due to rainwater or overflow from a nearby channel; infected is a road that is flooded and cannot be used; and recovered is a road where the floodwater has receded. The research team verified the model using high-resolution historical data of road flooding in Harris County during Hurricane Harvey in 2017. The results show that the model can monitor and predict the evolution of flooded roads over time. "The power of this approach is it offers a simple and powerful mathematical approach and provides great potential to support emergency managers, public officials, residents, first responders and other decision makers for flood forecasts in road networks," Mostafavi said. The proposed model achieves decent precision and recall for the spatial spread of the flooded roads. "If you look at the flood monitoring system of Harris County, it can show you if a channel is overflowing now, but they're not able to predict anything about the next four hours or next eight hours. Also, the existing flood monitoring systems provide limited information about the propagation of flooding in road networks and the impacts on urban mobility. But our models, and this specific model for the road networks, is robust at predicting the future spread of flooding," he said.
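To make the analogy concrete, here is a toy sketch of an SEIR-style spread process on a small road network; the graph, transition rates and update rule are invented for illustration and are far simpler than the team's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy road network: each node is a road segment, edges are adjacency.
adj = {0: [1], 1: [0, 2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}

S, E, I, R = "S", "E", "I", "R"        # susceptible/exposed/inundated/recovered
state = {n: S for n in adj}
state[0] = I                            # flooding starts at segment 0

beta, sigma, gamma = 0.9, 0.6, 0.2      # exposure, inundation, recession rates

for step in range(12):
    nxt = dict(state)
    for n, nbrs in adj.items():
        flooded_frac = sum(state[m] == I for m in nbrs) / len(nbrs)
        if state[n] == S and rng.random() < beta * flooded_frac:
            nxt[n] = E                  # exposure grows with flooded neighbours
        elif state[n] == E and rng.random() < sigma:
            nxt[n] = I                  # exposed segment becomes inundated
        elif state[n] == I and rng.random() < gamma:
            nxt[n] = R                  # floodwater recedes
    state = nxt
    print(step, "".join(state[n] for n in sorted(adj)))
```

The key SEIR ingredient, visible in the update rule, is that a segment's exposure probability depends on the fraction of its flooded neighbours rather than on the network as a whole.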
"In addition to flood prediction in urban networks, the findings of this study provide very important insights about the universality of the network spread processes across various social, natural, physical and engineered systems; this is significant for better modeling and managing cities, as complex systems."The only limitation to this flood prediction model is that it cannot identify where the initial flooding will begin, but Mostafavi said there are other mechanisms in place such as sensors on flood gauges that can address this."As soon as flooding is reported in these areas, we can use our model, which is very simple compared to hydraulic and hydrologic models, to predict the flood propagation in future hours. The forecast of road inundations and mobility disruptions is critical to inform residents to avoid high-risk roadways and to enable emergency managers and responders to optimize relief and rescue in impacted areas based on predicted information about road access and mobility. This forecast could be the difference between life and death during crisis response," he said.Civil engineering doctoral student and graduate research assistant Chao Fan led the analysis and modeling of the Hurricane Harvey data, along with Xiangqi (Alex) Jiang, a graduate student in computer science, who works in Mostafavi's UrbanResilience.AI Lab."By doing this research, I realize the power of mathematical models in addressing engineering problems and real-world challenges.This research expands my research capabilities and will have a long-term impact on my career," Fan said. "In addition, I am also very excited that my research can contribute to reducing the negative impacts of natural disasters on infrastructure services." | Hurricanes Cyclones | 2,020 |
August 17, 2020 | https://www.sciencedaily.com/releases/2020/08/200817104251.htm | New study reveals strength of the deep ocean circulation in the South Atlantic | A new study from oceanographers at NOAA and the University of Miami Rosenstiel School's Cooperative Institute for Marine and Atmospheric Studies (CIMAS) has for the first time described the daily variability of the circulation of key deep currents in the South Atlantic Ocean. The research by the lead scientists based at the University of Miami's Rosenstiel School of Marine and Atmospheric Science (UM) and NOAA's Atlantic Oceanographic and Meteorological Laboratory (AOML) demonstrates strong variations in these key currents, changes that are linked to climate and weather around the globe. | "A key finding from this study is that our data showed that the ocean currents in the deepest parts of the South Atlantic Ocean behave differently than we thought before we had this new long-term dataset, which may have large implications for the climate and weather forecasts made by ocean models in the future," said Marion Kersale, an oceanographer with the UM Rosenstiel School's Cooperative Institute for Marine and Atmospheric Studies and lead author on the study. The Meridional Overturning Circulation (MOC) is one of the main components of ocean circulation, which constantly moves heat, salt, carbon, and nutrients throughout the global oceans. Variations of the MOC have important impacts on many global-scale climate phenomena such as sea level changes, extreme weather, and precipitation patterns. The MOC consists of an upper cell of warmer, lighter waters that sits on top of colder, denser waters, known as the abyssal cell. These water masses travel around the global ocean, exchanging temperature, salinity, carbon and nutrients along the way. This study provided remarkable insights into the MOC at full-depth vertical, horizontal, and temporal resolution. A key new result from this study has been the estimation of the strength of the abyssal cell (from 3000 m to the seafloor), which had previously been available only as once-a-decade snapshot estimates from trans-basin ship sections. This study found that the upper layer circulation is more energetic than that in the very deep, or abyssal, layer at all time scales ranging from a few days to a year. The flows in the upper and deep layers of the ocean behave independently of one another, which can impact how the entire MOC system influences sea level rise and hurricane intensification in the Atlantic. Research such as the study led by Kersale is helping oceanographers to refine and improve our understanding of the complexities of the MOC system. These observations will allow scientists to validate Earth system models and will aid in the UM Rosenstiel School's and NOAA's goals to improve our understanding of the climate/weather system. | Hurricanes Cyclones | 2,020
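For context, the "strength" of an overturning cell such as the abyssal cell is conventionally reported as the extremum of the meridional overturning streamfunction, integrated across the basin and upward from the seafloor; this is a textbook definition, not a formula specific to this paper:

```latex
\Psi(z) = \int_{x_W}^{x_E} \int_{-H}^{z} v(x, z')\,\mathrm{d}z'\,\mathrm{d}x,
\qquad 1\ \mathrm{Sv} = 10^{6}\ \mathrm{m^{3}\,s^{-1}}
```

Here v is the meridional velocity, the integral runs from the western to the eastern boundary and from the bottom up to depth z, and the cell strength, the maximum of |Ψ(z)|, is usually expressed in sverdrups (Sv).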
August 7, 2020 | https://www.sciencedaily.com/releases/2020/08/200807111926.htm | Atlantic hurricanes linked to weather system in East Asia | With a new Atlantic hurricane season in full swing, scientists may have found a new influence on how tropical cyclones develop. | Researchers led by the University of Iowa have identified a connection between a climate system in East Asia and the frequency of tropical storms that develop in the Atlantic Ocean -- which can strengthen into hurricanes that threaten the United States. In a new study, the researchers say the East Asian Subtropical Jet Stream (EASJ), an upper-level river of wind that originates in East Asia and moves west to east across the globe, carries with it an atmospheric phenomenon called a Rossby wave. Rossby waves occur naturally within the Earth's oceans and atmosphere, forming because of the planet's rotation. The researchers say Rossby waves hitch a ride on the EASJ to the North Atlantic when tropical cyclones in the Atlantic are most likely to form. The waves affect wind shear, a key element in the formation of tropical storms. "When the EASJ is stronger, it can enhance this pattern, which leads to stronger teleconnections and stronger wind shear in the North Atlantic," explains Wei Zhang, a climate scientist at IIHR-Hydroscience & Engineering at Iowa. "That can suppress Atlantic tropical cyclone formation." The scientists observed nearly 40 years of Atlantic tropical cyclones during prime formation season, from August to November, and their connection during the same time period with EASJ activity between July and October. "What we found was there is a signal (Rossby waves) in terms of wind shear and that this signal is coming from the west, being Asia, over the Atlantic, via the East Asian Subtropical Jet Stream," says Zhang, who is corresponding author on the study. The researchers analyzed various data sets, as well as the database from the National Hurricane Center between 1980 and 2018, to seek associations between tropical cyclones generated in the Atlantic and the EASJ. They determined based on that information that a stronger EASJ is associated with fewer Atlantic tropical cyclones. The study comes as Hurricane Isaias became the fifth named storm to make landfall in the continental U.S. -- and already the second hurricane to swipe land -- when it swept across the U.S. East Coast last week. The researchers previously found a connection between the EASJ and storms affecting the western U.S. After that study, they looked for other associations. "We said, 'OK, let's see whether this subtropical jet can influence other weather systems,'" says Gabriele Villarini, IIHR's director and a co-author on the study. "We found a physical mechanism that can provide a basic understanding in the context of tropical cyclone formation," Villarini says. "Then the question becomes, 'OK, now that you know that, what are you going to do with it?'" He continues: "That's the part that is not there yet, in the sense of how predictable is the East Asian Subtropical Jet, and how far ahead can we predict it for an entire season, so that it can become a useful tool for predicting tropical cyclone formation in the North Atlantic." The researchers also aim to understand how climate change could affect the EASJ, which may contribute to tropical cyclones' frequency in the North Atlantic. | Hurricanes Cyclones | 2,020
August 6, 2020 | https://www.sciencedaily.com/releases/2020/08/200806131159.htm | 'Extremely active' hurricane season possible for Atlantic Basin | Atmospheric and oceanic conditions are primed to fuel storm development in the Atlantic, leading to what could be an "extremely active" season, according to forecasters with NOAA's Climate Prediction Center, a division of the National Weather Service. Today, the agency released its annual August update to the Atlantic Hurricane Season Outlook, initially issued in May. | The 2020 Atlantic hurricane season is off to a rapid pace, with a record-setting nine named storms so far, and has the potential to be one of the busiest on record. Historically, only two named storms form on average by early August, and the ninth named storm typically does not form until October 4. An average season produces 12 named storms, including six hurricanes, of which three become major hurricanes (Category 3, 4, or 5). "This is one of the most active seasonal forecasts that NOAA has produced in its 22-year history of hurricane outlooks. NOAA will continue to provide the best possible science and service to communities across the Nation for the remainder of hurricane season to ensure public readiness and safety," said U.S. Secretary of Commerce Wilbur Ross. "We encourage all Americans to do their part by getting prepared, remaining vigilant, and being ready to take action when necessary." The updated outlook calls for 19-25 named storms (winds of 39 mph or greater), of which 7-11 will become hurricanes (winds of 74 mph or greater), including 3-6 major hurricanes (winds of 111 mph or greater). This update covers the entire six-month hurricane season, which ends Nov. 30, and includes the nine named storms to date. A comprehensive measure of the overall hurricane season activity is the Accumulated Cyclone Energy (ACE) index, which measures the combined intensity and duration of all named storms during the season. Based on the ACE projection, combined with the above-average numbers of named storms and hurricanes, the likelihood of an above-normal Atlantic hurricane season has increased to 85%, with only a 10% chance of a near-normal season and a 5% chance of a below-normal season. "This year, we expect more, stronger, and longer-lived storms than average, and our predicted ACE range extends well above NOAA's threshold for an extremely active season," said Gerry Bell, Ph.D., lead seasonal hurricane forecaster at NOAA's Climate Prediction Center. Current oceanic and atmospheric conditions that make an "extremely active" hurricane season possible are warmer-than-average sea surface temperatures in the tropical Atlantic Ocean and Caribbean Sea, reduced vertical wind shear, weaker tropical Atlantic trade winds and an enhanced west African monsoon. These conditions are expected to continue for the next several months. A main climate factor behind these conditions is the ongoing warm phase of the Atlantic Multi-Decadal Oscillation, which reappeared in 1995 and has been favoring more active hurricane seasons since that time. Another contributing climate factor this year is the possibility of La Niña developing in the months ahead. Indicative of cooler-than-average sea surface temperatures in the equatorial regions of the eastern Pacific Ocean, La Niña can further weaken the wind shear over the Atlantic Basin, allowing storms to develop and intensify. NOAA's hurricane season outlook is for overall seasonal activity and is not a landfall forecast.
Landfalls are largely determined by short-term weather patterns, which are only predictable within about a week of a storm potentially reaching a coastline. NOAA's National Hurricane Center provides tropical weather outlooks out to five days in advance, provides track and intensity forecasts for individual storms, and issues watches and warnings for specific tropical storms, hurricanes and the associated storm surge. "NOAA has the most highly trained and dedicated forecasters that serve to protect American lives and property. With improved forecast skill, new storm surge products, and new observations, such as GPS Radio Occultation, we are better positioned than ever before to keep Americans out of harm's way," said Neil Jacobs, Ph.D., acting NOAA administrator. "It is now more important than ever to stay informed with our forecasts, have a preparedness plan, and heed guidance from local emergency management officials." This hurricane season, FEMA encourages residents in hurricane-prone regions to keep COVID-19 in mind when making preparations and during evacuations. | Hurricanes Cyclones | 2,020
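The ACE index mentioned in the outlook has a simple standard definition: the sum of the squares of each storm's 6-hourly maximum sustained winds (in knots, counted while the system is at tropical-storm strength or above), divided by 10,000. A small illustration with two hypothetical storm tracks:

```python
# 6-hourly maximum sustained winds in knots for two hypothetical storms
storms = [
    [35, 45, 60, 75, 70, 50, 35],  # a short-lived hurricane
    [35, 40, 50, 55, 45, 35],      # a tropical storm
]

# Sum v^2 over all 6-hour records at tropical-storm strength (>= 35 kt)
ace = sum(v * v for track in storms for v in track if v >= 35) / 1e4
print(round(ace, 2))  # season-to-date ACE contributed by these two storms
```

Because the winds are squared, a long-lived major hurricane can contribute more ACE than several short-lived tropical storms combined, which is why ACE captures both the intensity and the duration that the outlook refers to.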
July 21, 2020 | https://www.sciencedaily.com/releases/2020/07/200721102150.htm | How Hurricane Lane brought fire and rain to Hawaiian islands | Hurricane Lane was an impactful event for the Hawaiian Islands. In August 2018, over a four-day period, the island of Hawai'i received an average of 17 inches of rainfall, with a four-day single-station maximum of 57 inches, making Hurricane Lane the wettest tropical cyclone ever recorded in Hawai'i. A recently published study, led by University of Hawai'i at Manoa researchers, details the compounding hazards -- fire and rain -- produced by the storm. | "In this study we document what we believe to be the first instance of a hurricane causing both heavy rainfall and contributing to multiple instances of fire simultaneously," said Alison Nugent, lead author of the study and assistant professor of Atmospheric Sciences in the UH Manoa School of Ocean and Earth Science and Technology (SOEST). A team of UH Manoa and East-West Center scientists analyzed multiple aspects of the storm's meteorology and climatology, along with the environmental conditions leading up to the storm, and documented the associated societal impacts. They found that land-use characteristics and preceding moisture conditions exacerbated fire hazard, and both fire and rain severity were influenced by the hurricane environment and local topographic features. Conditions at the edge of the storm resulted in dry, windy weather conducive to fire, while closer to the storm center, the incredibly moist atmosphere lifted by Hawai'i's mountains brought intense, long-lasting rainfall. The simultaneous occurrence of rain-driven flooding and landslides, strong winds, and multiple fires complicated emergency response. The vulnerability of a population in any given location to the impacts of tropical cyclone hazards is determined by a multitude of interacting factors. Biophysical aspects include distance inland from the coast, terrain slope, coastal ecosystem integrity, and land surface cover. Socioeconomic factors include infrastructure quality, the availability of early warning systems, and capacity for evacuation and emergency response. "The surprising thing about Hurricane Lane was that, despite never making landfall, the storm caused considerable damage and disruptions across the state from two rather contradictory things: fire and rain," said Nugent. "Severe flooding on the windward island of Hawai'i Island built over several days, and multiple fires initiated on the lee sides of Maui and Oʻahu within hours of each other. Hurricane Lane is one of only three documented cases of hurricanes influencing wildland fire risk in real-time." In Hawai'i, landfall by hurricanes is relatively rare due to persistent vertical wind shear over the islands, which weakens hurricanes by essentially tipping them over. However, when hurricanes do occur near Hawai'i, the geography of the islands can exacerbate the hazards. The nearly 750 miles of coastline make much of the state susceptible to coastal flooding, and the mountainous topography can enhance high-intensity rainfall as well as intensify wind speeds. In addition, the steep mountainous terrain can enhance flash flooding and trigger landslide events. The study highlights Hawai'i's vulnerability to natural hazards and reveals that these events can place significant constraints on emergency responders.
This research also demonstrates UH Manoa's technical expertise across multiple disciplines -- climatology, meteorology, water resources, fire science -- to assess and predict the impacts of natural hazards and other climate-related events. In the future, the team plans to develop the analytical approaches and Hawai'i-focused climate products needed to assess and prepare for future impacts, especially in the context of a changing climate where the intensity and frequency of extreme events are likely to increase. | Hurricanes Cyclones | 2,020
July 16, 2020 | https://www.sciencedaily.com/releases/2020/07/200716101556.htm | Greater flood risks in coastal region of China | New research led by the Department of Geography at Hong Kong Baptist University (HKBU) has revealed that the observed average moving speed (or translation speed) of tropical cyclones making landfall over the coast of China dropped by 11% between 1961 and 2017. These slow-moving tropical cyclones brought about 20% more local total rainfall on average when compared with fast-moving ones, resulting in greater flood risks in the region. | The study also found that tropical cyclones with lower moving speeds and higher total rainfalls became more frequent after 1990 in the Pearl River Delta. The discovery offers invaluable insights that will enable the development of better flood management and adaptation strategies in the coastal region of China, which is under threat due to tropical cyclones. The research team led by Dr Li Jianfeng, Assistant Professor in the Department of Geography at HKBU, studied 406 tropical cyclones that made landfall and lasted for more than two days over the coast of China, and specifically the Pearl River Delta where Hong Kong is located, between 1961 and 2017. The study, which started in 2018, aimed to investigate the trend of tropical cyclones' moving speeds and its correlation with the volume of rainfall in the long run. The research team analysed track data of the 406 tropical cyclones from the International Best Track Archive for Climate Stewardship (IBTrACS) and numerical simulations of eight Global Climate Models (GCMs) developed by meteorological and modelling centres around the world. IBTrACS is one of the most commonly used datasets for tropical cyclone studies, while GCMs are important tools for scientific communities to investigate and project climate behaviour. The team found that the observed moving speed of the tropical cyclones underwent a significant drop of 11%, decreasing from 21 km per hour in 1961 to 18.6 km per hour in 2017. The simulated moving speed also showed a drop of 10%, decreasing from 21.2 km per hour to 19.1 km per hour during the same period. Data on the volume of local rainfall brought about by the 406 tropical cyclones was also examined. While the mean total volume of local rainfall increased by 8% between 1961 and 2017, the 90th percentile of the total volume of local rainfall increased even more significantly, by 18%, rising from 187 mm to 223 mm. As a result, the data indicated an increase in extreme rainfall caused by tropical cyclones over the 57-year period examined. Using statistical analysis, the team detected a negative correlation between the moving speeds of the tropical cyclones and their volume of local rainfall. The mean volume of local rainfall of slow-moving tropical cyclones with moving speeds of 15 km per hour or below was 99.1 mm, while that of fast-moving tropical cyclones with moving speeds of 25 km per hour or above was 80.5 mm. In other words, slow-moving tropical cyclones brought about 20% more rainfall on average when compared with fast-moving ones. "The total amount of rainfall over a specific region brought about by a tropical cyclone is directly proportional to rainfall intensity, and inversely proportional to moving speed. The slower a tropical cyclone moves, the longer it spends passing over the region.
As the region is affected for a longer duration, slower tropical cyclones bring about more rainfall," said Dr Li. The study further examined the correlation between the moving speeds of tropical cyclones and total rainfall in the Pearl River Delta. Among the 147 tropical cyclones that affected the Pearl River Delta between 1961 and 2017, 14 were slow-moving and had a rainfall intensity of 30 mm per day or more. Ten of them occurred after 1990, including three with a total volume of rainfall of more than 200 mm, indicating a substantial increase of flood risks caused by slow-moving tropical cyclones in recent years. Among the 406 tropical cyclones examined in this study, 82 affected Hong Kong and moved within 200 km of the city. Of these 82 tropical cyclones, 22 were slow-moving, and 14 of them (about 64%) occurred after 1990. They include Typhoon York in 1999 and Severe Tropical Storm Goni in 2009, which caused extensive damage in the region. "With analysis backed by long-term observations, we have provided evidence showing that slower tropical cyclone movement tends to elevate rainfall volume and thus it imposes greater flood risks at a regional scale. Therefore, more holistic and integrated flood risk management strategies, as well as flexible adaptation options, will be needed to deal with the growing threat of floods," said Dr Li. Apart from HKBU researchers, the research team also comprised researchers from Shenzhen University, China University of Geosciences, The Chinese University of Hong Kong (Shenzhen), University of Alberta, and Princeton University. | Hurricanes Cyclones | 2,020
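Dr Li's proportionality argument can be made concrete with a back-of-the-envelope calculation; the rain-field width and rain rate below are assumed round numbers, not values from the HKBU study:

```python
rain_field_km = 300.0   # assumed along-track extent of the storm's rain field
rain_rate_mm_hr = 4.0   # assumed mean rain rate while the storm passes overhead

for speed_kmh in (21.0, 18.6):  # observed mean translation speeds, 1961 vs 2017
    hours_overhead = rain_field_km / speed_kmh
    total_mm = rain_rate_mm_hr * hours_overhead
    print(f"{speed_kmh} km/h -> {total_mm:.0f} mm at a fixed location")
```

Under these assumptions, the observed ~11% slowdown alone yields roughly 13% more rain at a fixed location, before any change in rainfall intensity is considered.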
July 7, 2020 | https://www.sciencedaily.com/releases/2020/07/200707160155.htm | Future Texas hurricanes: Fast like Ike or slow like Harvey? | Climate change will intensify winds that steer hurricanes north over Texas in the final 25 years of this century, increasing the odds for fast-moving storms like 2008's Ike compared with slow-movers like 2017's Harvey, according to new research. | The study was published online July 3. The research began in Houston as Harvey deluged the city with 30-40 inches of rain over five days. Rice University researchers riding out the storm began collaborating with colleagues from Columbia University's Lamont-Doherty Earth Observatory (LDEO) and Harvard University to explore whether climate change would increase the likelihood of slow-moving rainmakers like Harvey. "We find that the probability of having strong northward steering winds will increase with climate change, meaning hurricanes over Texas will be more likely to move like Ike than Harvey," said study lead author Pedram Hassanzadeh of Rice. Harvey caused an estimated $125 billion in damage, matching 2005's Katrina as the costliest hurricane in U.S. history. Ike was marked by coastal flooding and high winds that caused $38 billion in damage across several states. It was the second-costliest U.S. hurricane at the time and has since moved to sixth. Ike struck Galveston around 2 a.m. Sept. 13, 2008, crossed Texas in less than one day and caused record power outages from Arkansas to Ohio on Sept. 14. Hassanzadeh, a fluid dynamicist, atmospheric modeler and assistant professor of both mechanical engineering and Earth, environmental and planetary sciences, said the findings don't suggest that slow-moving storms like Harvey won't happen in the late 21st century. Rather, they suggest that storms during the period will be more likely to be fast-moving than slow-moving. The study found the chances that a Texas hurricane will be fast-moving as opposed to slow-moving will rise by about 50% in the last quarter of the 21st century compared with the final quarter of the 20th century. "These results are very interesting, given that a previous study that considered the Atlantic basin as a whole noticed a trend for slower-moving storms in the past 30 years," said study co-author Suzana Camargo, LDEO's Marie Tharp Lamont Research Professor. "By contrast, our study focused on changes at the end of the 21st century and shows that we need to consider much smaller regional scales, as their trends might differ from the average across much larger regions." Hassanzadeh said the researchers used more than a dozen different computer models to produce several hundred simulations and found that "all of them agreed on an increase in northward steering winds over Texas." Steering winds are strong currents in the lower 10 kilometers of the atmosphere that move hurricanes. "It doesn't happen a lot, in studying the climate system, that you get such a robust regional signal in wind patterns," he said. Harvey was the first hurricane Hassanzadeh experienced. He'd moved to Houston the previous year and was stunned by the slow-motion destruction that played out as bayous, creeks and rivers in and around the city topped their banks. "I was sitting at home watching, just looking at the rain when (study co-author) Laurence (Yeung) emailed a bunch of us, asking 'What's going on? Why is this thing not moving?'" Hassanzadeh recalled. "That got things going. People started replying. That's the good thing about being surrounded by smart people.
Laurence got us started, and things took off." Yeung, an atmospheric chemist, Hassanzadeh, and the two other Rice professors on the original email -- atmospheric scientist Dan Cohan and flooding expert Phil Bedient -- won one of the first grants from Rice's Houston Engagement and Recovery Effort (HERE), a research fund Rice established in response to Harvey. "Without that, we couldn't have done this work," Hassanzadeh said. The HERE grant allowed Rice co-author Ebrahim Nabizadeh, a graduate student in mechanical engineering, to work for several months, analyzing the first of hundreds of computer simulations based on large-scale climate models. The day Harvey made landfall, Hassanzadeh also had reached out to Columbia's Chia-Ying Lee, an expert in both tropical storms and climate downscaling, procedures that use known information at large scales to make projections at local scales. Lee and Camargo used information from the large-scale simulations to make a regional model that simulated storms' tracks over Texas in a warming climate. "One challenge of studying the impact of climate change on hurricanes at a regional level is the lack of data," said Lee, a Lamont Assistant Research Professor at LDEO. "At Columbia University, we have developed a downscaling model that uses physics-based statistics to connect large-scale atmospheric conditions to the formation, movement and intensity of hurricanes. The model's physical basis allowed us to account for the impact of climate change, and its statistical features allowed us to simulate a sufficient number of Texas storms." Hassanzadeh said, "Once we found that robust signal, where all the models agreed, we thought, 'There should be a robust mechanism that's causing this.'" He reached out to tropical climate dynamicist Ding Ma of Harvard to get another perspective. "We were able to show that changes in two important processes were joining forces and resulting in the strong signal from the models," said Ma, a postdoctoral researcher in Earth and planetary sciences. One of the processes was the Atlantic subtropical high, or Bermuda high, a semipermanent area of high pressure that forms over the Atlantic Ocean during the summer, and the other was the North American monsoon, an uptick in rainfall and thunderstorms over the southwestern U.S. and northwestern Mexico that typically occurs between July and September. Hassanzadeh said recent studies have shown that each of these is projected to change as Earth's climate warms. "The subtropical high is a clockwise circulation to the east that is projected to intensify and shift westward, producing more northward winds over Texas," he said. "The North American monsoon, to the west, produces a clockwise circulation high in the troposphere. That circulation is expected to weaken, resulting in increased, high-level northward winds over Texas." Hassanzadeh said the increased northward winds from both east and west "gives you a strong reinforcing effect over the whole troposphere, up to about 10 kilometers, over Texas. This has important implications for the movement of future Texas hurricanes." Models showed that the effect extended into western Louisiana, but the picture became murkier as the researchers looked further east, he said. "You don't have the robust signal like you do over Texas," Hassanzadeh said. "If you look at Florida, for instance, there's a lot of variation in the models. This shows how important it is to conduct studies that focus on climate impacts in specific regions.
If we had looked at all of North America, for example, and tried to average over the whole region, we would have missed this localized mechanism over Texas." | Hurricanes Cyclones | 2,020 |
June 17, 2020 | https://www.sciencedaily.com/releases/2020/06/200616113930.htm | Hurricane season combined with COVID-19 pandemic could create perfect storm | When extreme climate conditions interact with stressors to social systems, such as the COVID-19 pandemic, the consequences could be severe unless experts from diverse backgrounds work together to develop comprehensive solutions to combat their negative impacts. | That's the recommendation of a new perspective article. Thomas Wahl, an assistant professor in UCF's Department of Civil, Environmental and Construction Engineering and a member of UCF's National Center for Integrated Coastal Research, is one of 14 experts with diverse backgrounds who authored the article. "In the perspective article, my input mainly focused on the impacts of connected extremes on the water sector," Wahl says. "With my research group at UCF, we have extensively worked on many different projects focused on compound flooding, when, for example, storm surges coincide with extreme rainfall or high river discharge." The article brought together scientists and stakeholder representatives with different backgrounds, ranging from the natural sciences to social sciences, public health and engineering. The authors focused on four main sectors -- food, water, health and infrastructure -- where connected extremes often lead to unforeseen impacts. Examples of connected extremes include the impact of Hurricane Maria in 2017 on Puerto Rico's under-maintained infrastructure, limited budget and aging population, and the spring 2011 Mississippi River floods, in which water was released to protect urban areas to the detriment of agricultural lands. A present example could be the COVID-19 pandemic and the current hurricane season, Wahl says. "The COVID-19 crisis will very likely increase the impacts associated with the climatic extreme events that will inevitably occur somewhere across the globe over the next weeks or months or already have occurred," Wahl says. "For example, shelters cannot operate at full capacity, health care systems are already under pressure, and emergency funds are depleted." The researcher says many of the most impactful natural hazards experienced over the past decade could be considered connected extremes, where either different factors in the physical climate system combined in unfortunate ways or the impacts were made worse by interactions between physical and societal systems. "It's important to recognize and treat connected extremes as such, and for scientists from different fields to engage directly with stakeholders and decision makers to develop new, robust and flexible policies to better combat their negative impacts," Wahl says. Article co-authors were Colin Raymond, lead author, with California Institute of Technology and Columbia University; Radley M. Horton with Columbia University; Jakob Zscheischler with the University of Bern; Olivia Martius with the University of Bern; Amir AghaKouchak with the University of California; Jennifer Balch with the University of Colorado-Boulder; Steven G. Bowen with Aon; Suzana J. Camargo with Columbia University; Jeremy Hess with the University of Washington; Kai Kornhuber with Columbia University; Michael Oppenheimer with Princeton University; Alex C. Ruane with the Goddard Institute for Space Studies; and Kathleen White with the U.S. Army Corps of Engineers. Wahl earned his doctorate in civil engineering from the University of Siegen, Germany, and joined UCF in 2017. | Hurricanes Cyclones | 2,020
June 4, 2020 | https://www.sciencedaily.com/releases/2020/06/200604113704.htm | Vital buffers against climate change are just offshore | A new study finds that about 31 million people worldwide live in coastal regions that are "highly vulnerable" to future tropical storms and sea-level rise driven by climate change. But in some of those regions, powerful defenses are located just offshore. | Of those 31 million people, about 8.5 million directly benefit from the severe-weather protection of mangroves and coral reefs, key buffers that could help cushion the blow against future tropical storms and rising waters, according to the study published May 29 in a peer-reviewed journal. Because the two "natural infrastructures" absorb wave energy, reduce wave heights and provide a host of other environmental benefits, the study findings underscore the need for worldwide conservation and restoration of these natural resources. A particular focus, the authors said, should be placed on the most vulnerable regions, which lack available resources for more expensive protective measures, such as construction of levees or sea walls. "Simply put, it's much cheaper to conserve a mangrove than to build a sea wall," said Northern Illinois University scientist Holly Jones, the study's lead author. A 100-meter-wide coastal strip of mangroves can reduce wave heights by as much as two-thirds, previous research has shown. Coral reefs, meanwhile, buffer wave energy by up to 97% in some contexts, significantly reducing erosion and cutting flood-damage costs in half annually. "Coral reefs and mangroves serve as cost-efficient buffers against the adverse impacts of climate change, and they already play important roles in protecting human lives and livelihoods, while providing a multitude of biodiversity benefits," said Jones, who holds a joint appointment at NIU in biological sciences and environmental studies. Her co-authors on the research are Barry Nickel and Erika Zavaleta of the University of California, Santa Cruz; Tanja Srebotnjak of Harvey Mudd College in Claremont, California; and Will Turner, Mariano Gonzalez-Roglich and David G. Hole of Conservation International in Arlington, Virginia. The study aimed to identify highly vulnerable coastal regions that would benefit most from "ecosystem-based adaptation," or using conservation, restoration and sustainable management of existing ecosystems to address climate impacts. Regions meeting study criteria for "highly vulnerable" were within two miles of coastline and scored in the top 10th percentile of the authors' vulnerability index for being highly exposed to the effects of tropical storms and/or sea-level rise, dense in population and low in "adaptive capacity." The authors developed the adaptive-capacity measure to take into account economic data, education levels and other factors that play into a region's ability to adjust to climate variability. The authors found that 30.9 million people globally live in regions that are most vulnerable to tropical storms and projected sea-level rise. "Our estimate is very conservative," Jones said. "This population lives in regions in the top 10th percentile for vulnerability. If we apply our model to coastal regions that scored in the top half for vulnerability, the population soars to over 700 million people." Highly vulnerable coastal regions that would benefit the most from the conservation of mangroves and coral reefs span Central America, the Caribbean, Eastern Africa, Southeast Asia and the South Pacific region.
And yet only 38% of mangroves and 11% of coral reefs located along the most vulnerable coastlines are protected, according to the study. "Protection of mangroves and coral reefs is critical," said Will Turner, study co-author and senior vice president of global strategies at Conservation International. "They have the potential to save lives, store carbon and support fisheries; the co-benefits they provide are great. At Conservation International, we're working with local communities, carbon finance, governments and the insurance industry to ensure that generations will benefit from the protection and restoration of these ecosystems." The authors noted that many of the world's coastal zones already bear the brunt of extreme weather. Events such as Hurricane Dorian in the Bahamas and hurricanes Maria, Harvey and Irma in the United States and the Caribbean claimed thousands of lives and generated financial costs running into hundreds of billions of dollars. Additionally, past research indicates that the loss of lives and assets in coastal zones is likely to increase significantly as a result of demographic and socio-economic trends alone, leading to a doubling or more of hurricane damages by 2100. Coastlines in Florida and the U.S. island territories of Puerto Rico and Guam also stand to benefit from mangrove and coral reef conservation and restoration. Globally, the densest populations receiving adaptation benefits (people protected per hectare) from mangroves are in India, the United States and Ghana, the study found. The greatest numbers of people protected per hectare of coral reefs are in South Africa, Singapore, China and the United States. More than a billion coastal dwellers worldwide face some degree of vulnerability to climate change. While most coastal regions are outside of tropical zones and aren't buffered by mangroves or coral reefs, other ecosystems such as wetlands, estuaries and seagrasses provide protective benefits, Jones said. "The United States is a wealthy country and has more ability to adapt than other countries, but it still could see significant benefits from conservation and restoration of existing natural infrastructure," Jones said. "Coastal ecosystems reduce the proportion of vulnerable people and infrastructure along exposed U.S. coastlines by around half through their absorption of wave energy." The study authors also pointed to other benefits of natural infrastructures, which provide habitats for a host of marine and terrestrial animals and create recreation and tourism opportunities. In the most vulnerable coastal regions alone, mangroves store at least 896 million metric tons of carbon. "Ensuring the resiliency of mangroves is a win-win-win for people, nature and the climate," said Dave Hole, co-author of the study and senior director within Conservation International's Betty and Gordon Moore Center for Science. "Mangroves store more carbon than any other forest ecosystem on Earth, drawing CO2 down from the atmosphere and storing it for decades, and so helping slow global warming," he said. "As interest in ecosystem-based adaptation continues to grow, it's vital that its multiple co-benefits are part of the conversation." The authors noted that their global study provides "a coarse approximation" of threats and of regions of the world that might benefit most from ecosystem-based adaptation. Despite mounting interest in ecosystem-based adaptation, implementation has mostly been in the form of site-specific projects.
What's needed next, the authors said, is systematic assessment of the broader potential, analyses of the ways in which ecosystem protection varies from site to site, and further investment in conservation and restoration. "It's a relatively new option that's gaining more traction," Jones said. "Beyond engineering solutions, there are these ecosystems that have been providing benefits to us for centuries. They're worthy of our attention and resources." | Hurricanes Cyclones | 2,020
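The top-decile screening described in the study above is straightforward to illustrate. Below is a minimal sketch, assuming equal weighting of normalized exposure, population density, and inverse adaptive capacity; the component values, the weighting, and the threshold logic are illustrative assumptions, not the paper's actual index construction.

```python
# Toy composite vulnerability index, loosely patterned on the study's
# screening: coastal regions scoring in the top 10th percentile are
# flagged as "highly vulnerable." All numbers here are made up.

def vulnerability(exposure, pop_density, adaptive_capacity):
    # Higher storm/sea-level exposure and population density raise the
    # score; higher adaptive capacity (wealth, education) lowers it.
    # Equal weighting is an assumption for illustration.
    return (exposure + pop_density + (1.0 - adaptive_capacity)) / 3.0

regions = {
    "region_A": (0.95, 0.80, 0.20),  # (exposure, density, capacity), all 0-1
    "region_B": (0.70, 0.40, 0.60),
    "region_C": (0.55, 0.90, 0.30),
    "region_D": (0.30, 0.20, 0.85),
    "region_E": (0.85, 0.65, 0.25),
}

scores = {name: vulnerability(*v) for name, v in regions.items()}
cutoff = sorted(scores.values())[int(0.9 * len(scores))]  # 90th-percentile cut
hotspots = sorted(name for name, s in scores.items() if s >= cutoff)
print("highly vulnerable:", hotspots)
```

With a real dataset, the same cutoff applied to the top half instead of the top decile would reproduce the study's contrast between the 30.9 million and 700-million-person estimates.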
May 27, 2020 | https://www.sciencedaily.com/releases/2020/05/200527123358.htm | Cyclones can damage even distant reefs | Big and strong cyclones can harm coral reefs as far as 1000 kilometres away from their paths, new research shows. | A study led by Dr Marji Puotinen from the Australian Institute of Marine Science (AIMS) sounds a warning about the way strong cyclone winds build extreme seas that affect coral reefs in Australia and around the world. Conventional modelling used to predict how a cyclone, hurricane or typhoon might impact corals assumes that wave damage occurs primarily within 100 kilometres of its track. To test this, Dr Puotinen and colleagues looked at Scott Reef, a well-studied atoll reef structure off the northwest of Western Australia, and how it fared as a result of Cyclone Lua -- a slow-moving weather event that developed off the coast in 2012. Although the area of the cyclone producing the most intense winds came no closer than 500 kilometres to the reef, the high seas it whipped up battered it with waves four to 20 metres high for three and a half days. The researchers found that at its most exposed sections, Scott Reef lost 50 per cent of its massive and robust Porites corals and virtually all its more fragile branching Acropora coral species. Similar damage was found on another reef, a further 300 kilometres distant, and models predicted damaging waves could be felt up to 1000 kilometres away. "This example demonstrates that if we assume damage from all cyclones occurs within a 100 kilometre radius of a cyclone's track, we will underestimate the spatial extent for big, strong cyclones by up to 10 times," Dr Puotinen said. "This could lead to making unfortunate choices when trying to prioritise conservation targets." She added that estimates of wave damage from cyclones involve highly complex calculations, because cyclones change constantly, varying in strength, size and speed over time. The largest waves occur from storms that move slowly and have the highest winds spread over the largest area. To test the consequences of using the standard distance-based model, she and colleagues -- from the AIMS node in Perth, the University of Western Australia and the Indian Ocean Marine Research Centre -- collected existing information on cyclone size and frequency, crunching data gathered between 1985 and 2015 for 150 coral reef ecoregions around the world. The position, strength and size of each cyclone were recorded every six hours, allowing variations to be plotted in detail. They found that more than 70 per cent of the ecoregions had experienced at least one impact by a cyclone at peak strength and size during the 30-year period. Some, however, experienced them roughly every five years, and others roughly every 10. "Coral reefs have been living with cyclones for millions of years," said Dr Puotinen. "But recovery after a big battering is a slow process, which can take a decade or more. This means that many coral reefs around the world will not have time to fully regrow before the next cyclone hits." Climate change models present a complex picture for cyclones.
The total number occurring in any given period may well not increase -- but that's not necessarily good news for vulnerable reefs. "Changes in the atmosphere mean it will be harder for cyclones to form in the first place, but warmer ocean water, which fuels their intensity, means it will be easier for them to strengthen once they do," Dr Puotinen explained. She added that her team's findings carry lessons for reef management and conservation strategies. "When deciding where on the Great Barrier Reef, for instance, to invest millions of dollars to repair or enhance reefs, you don't want to select a location likely to be regularly battered by cyclone waves," she said. "Our research should make it easier for reef managers to choose between candidate reefs." Dr James Gilmour, also from AIMS, a co-author on the paper, said the findings illustrated the complexity and severity of the threats facing reefs around the world. "Coral reef communities around the world are under increasing threat from a range of stressors, and we must understand which parts of the reef should be the focus of conservation efforts," he said. "In particular, it is the combination of cyclones with exposure to rising water temperatures that is the most significant emerging threat to reefs globally." Unravelling the specific effects of cyclones, the researchers conclude, will provide vital clues for the management of at-risk areas. | Hurricanes Cyclones | 2,020
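The paper's central point, that a fixed 100 km damage radius can understate the footprint of big, strong cyclones by up to 10 times, can be sketched as a comparison of two footprint rules. The scaling formula and the Lua-like numbers below are invented stand-ins for the study's wave modelling, not values taken from it.

```python
# Compare a fixed 100 km damage radius against a footprint that grows
# with storm strength and size. The quadratic wind scaling is a guess
# for illustration; the study used wave modelling, not this formula.

def damage_radius_km(max_wind_ms, gale_radius_km, fixed=False):
    if fixed:
        return 100.0  # the conventional assumption the study challenges
    # Damaging seas extend farther for stronger, larger storms; cap at
    # the ~1000 km the study's models predicted for Cyclone Lua.
    return min(1000.0, gale_radius_km * (max_wind_ms / 33.0) ** 2)

# Roughly Cyclone Lua-like inputs (illustrative, not observed values):
wind, size = 50.0, 300.0      # max winds (m/s), gale radius (km)
reef_distance_km = 500.0      # Scott Reef's distance from the peak winds

for fixed in (True, False):
    r = damage_radius_km(wind, size, fixed=fixed)
    label = "fixed 100 km rule " if fixed else "size/strength rule"
    print(f"{label}: radius {r:.0f} km -> reef damaged: {reef_distance_km <= r}")
```

Under the fixed rule the reef appears safe; under the size-aware rule it falls well inside the damage footprint, which is the mismatch the authors warn could misdirect conservation spending.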
May 18, 2020 | https://www.sciencedaily.com/releases/2020/05/200518154948.htm | Long-term data show hurricanes are getting stronger | In almost every region of the world where hurricanes form, their maximum sustained winds are getting stronger. That is according to a new study by scientists at the National Oceanic and Atmospheric Administration National Center for Environmental Information and University of Wisconsin-Madison Cooperative Institute for Meteorological Satellite Studies, who analyzed nearly 40 years of hurricane satellite imagery. | A warming planet may be fueling the increase. "Through modeling and our understanding of atmospheric physics, the study agrees with what we would expect to see in a warming climate like ours," says James Kossin, a NOAA scientist based at UW-Madison and lead author of the paper, which was published today (May 18, 2020) in a peer-reviewed journal. The research builds on Kossin's previous work, published in 2013, which identified trends in hurricane intensification across a 28-year data set. However, says Kossin, that timespan was less conclusive and required more hurricane case studies to demonstrate statistically significant results. To increase confidence in the results, the researchers extended the study to include global hurricane data from 1979-2017. Using analytical techniques, including the CIMSS Advanced Dvorak Technique that relies on infrared temperature measurements from geostationary satellites to estimate hurricane intensity, Kossin and his colleagues were able to create a more uniform data set with which to identify trends. "The main hurdle we have for finding trends is that the data are collected using the best technology at the time," says Kossin. "Every year the data are a bit different than last year, each new satellite has new tools and captures data in different ways, so in the end we have a patchwork quilt of all the satellite data that have been woven together." Kossin's previous research has shown other changes in hurricane behavior over the decades, such as where they travel and how fast they move. In 2014, he identified poleward migrations of hurricanes, where tropical cyclones are travelling farther north and south, exposing previously less-affected coastal populations to greater risk. In 2018, he demonstrated that hurricanes are moving more slowly across land due to changes in Earth's climate. This has resulted in greater flood risks as storms hover over cities and other areas, often for extended periods of time. "Our results show that these storms have become stronger on global and regional levels, which is consistent with expectations of how hurricanes respond to a warming world," says Kossin. "It's a good step forward and increases our confidence that global warming has made hurricanes stronger, but our results don't tell us precisely how much of the trends are caused by human activities and how much may be just natural variability." This work was supported by the NOAA Oceanic and Atmospheric Research Climate Program Office. | Hurricanes Cyclones | 2,020
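A minimal sketch of the kind of trend test behind such findings: fit a linear slope to the annual share of storm observations at major-hurricane strength. The yearly values below are fabricated purely to make the script run; the study's actual analysis used 39 years of satellite-derived intensity estimates and more careful statistics.

```python
# Least-squares trend in the annual fraction of hurricane fixes at
# major-hurricane strength. Data here are made up; only the method
# (a simple linear trend over years) mirrors the study's framing.
import statistics

years = list(range(1979, 2018))
# Fabricated fractions drifting slowly upward with small wiggles.
frac_major = [0.27 + 0.0008 * (y - 1979) + 0.01 * ((y * 7) % 5 - 2) / 2
              for y in years]

fit = statistics.linear_regression(years, frac_major)  # Python 3.10+
print(f"trend: {fit.slope * 10:+.4f} per decade")
```

A positive, statistically significant slope in real satellite-era data is what supports the claim that a larger share of storms now reach major-hurricane intensity.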
May 14, 2020 | https://www.sciencedaily.com/releases/2020/05/200514115849.htm | Even small disturbances can trigger catastrophic storms | You've probably seen the satellite images that show a hurricane developing: thick white clouds clumping together, arms spinning around a central eye as it heads for the coast. | After decades of research, meteorologists still have questions about how hurricanes develop. Now, Florida State University researchers have found that even the smallest changes in atmospheric conditions could trigger a hurricane, information that will help scientists understand the processes that lead to these devastating storms. "The whole motivation for this paper was that we still don't have that universal theoretical understanding of exactly how tropical cyclones form, and to really be able to forecast that storm-by-storm, it would help us to have that more solidly taken care of," said Jacob Carstens, a doctoral student in the Department of Earth, Ocean and Atmospheric Science. The research by Carstens and Assistant Professor Allison Wing has been published in a peer-reviewed journal. Current theories on the formation of hurricanes agree that some sort of disturbance must exist to start the process that leads to a hurricane. Carstens used numerical models that started with simple conditions to better understand exactly how those disturbances arise. "We're trying to go as bare bones as possible, looking at just how exactly clouds want to organize themselves without any of these external factors playing into it to form a tropical cyclone more efficiently," he said. "It's a way we can further round out our broader understanding and look more purely at the actual tropical cyclones themselves rather than the surrounding environment's impact on it." The simulations started with mostly uniform conditions spread across the imaginary box where the model played out. Then, researchers added a tiny amount of random temperature fluctuations to kickstart the model and observed how the simulated clouds evolved. Despite the random start to the simulation, the clouds didn't stay randomly arranged. They formed into clusters as the water vapor, thermal radiation and other factors interacted. As the clusters circulated through the simulated atmosphere, the researchers tracked when they formed hurricanes. They repeated the model at simulated latitudes between 0.1 degrees and 20 degrees north, representative of areas such as parts of western Africa, northern South America and the Caribbean. That range includes the latitudes where tropical cyclones typically form, along with latitudes very close to the equator where their formation is rare and less studied. The scientists found that every simulation in latitudes between 10 and 20 degrees produced a major hurricane, even from the stable conditions under which they began the simulation. These came a few days after a vortex first emerged well above the surface and affected its surrounding environment. They also showed the possibility of cloud interaction contributing to the development of a tropical cyclone very close to the equator, which rarely occurs in nature but has still been observed as close to the equator as 1.4 degrees north. Hurricanes are dangerous weather events. Forecasting can help prevent deaths, but a big storm can still cause billions of dollars in damage.
A better theoretical understanding of their formation will help meteorologists predict and prepare for these storms, both in short-term forecasts and long-term climate projections, and communicate their understanding to the public. "It's becoming ever more important in our field that we connect with emergency managers, the general population and other local officials to advise them on what they can expect, how they should prepare and what sorts of impacts are going to be heading their way," Carstens said. "A more robust understanding of how tropical cyclones form can help us to better forecast their location, their track and their intensity. It really goes down the line and helps us to communicate sooner as well as more efficiently and eloquently to the public that really needs it." This research was supported by the National Science Foundation. | Hurricanes Cyclones | 2,020
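The experimental design described above, near-uniform initial conditions plus tiny random temperature perturbations, repeated at latitudes from 0.1 to 20 degrees north, can be outlined in a few lines. Everything here (grid size, perturbation amplitude, the sampled latitudes) is an assumed placeholder; the authors ran a full-physics cloud-resolving model, which this does not attempt to reproduce.

```python
# Skeleton of the study's setup: seed a near-uniform domain with tiny
# random temperature noise at several latitudes and let the model decide
# whether organized vortices emerge. Only the initialization is shown.
import math
import random

OMEGA = 7.292e-5  # Earth's rotation rate, rad/s

def coriolis(lat_deg):
    # f = 2 * Omega * sin(latitude): the rotation available to spin up
    # a storm, which vanishes at the equator.
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

def initial_temperature_field(n=64, base_k=300.0, amp_k=0.05, seed=0):
    # Uniform temperature plus random fluctuations of a few hundredths
    # of a degree -- the "smallest changes" that kick off organization.
    rng = random.Random(seed)
    return [[base_k + rng.uniform(-amp_k, amp_k) for _ in range(n)]
            for _ in range(n)]

for lat in (0.1, 1.0, 5.0, 10.0, 15.0, 20.0):  # assumed sample latitudes
    field = initial_temperature_field(seed=int(lat * 10))
    print(f"lat {lat:5.1f}N  f = {coriolis(lat):.2e} s^-1  "
          f"grid {len(field)}x{len(field[0])} initialized")
```

The printout makes the physical contrast visible: at 0.1 degrees north the Coriolis parameter is two orders of magnitude smaller than at 20 degrees, which is why cyclogenesis so close to the equator is rare and noteworthy.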
May 4, 2020 | https://www.sciencedaily.com/releases/2020/05/200504114056.htm | When natural disasters strike locally, urban networks spread the damage globally | When cyclones and other natural disasters strike a city or town, the social and economic impacts locally can be devastating. But these events also have ripple effects that can be felt in distant cities and regions -- even globally -- due to the interconnectedness of the world's urban trade networks. | In fact, a new study by researchers at the Yale School of Forestry & Environmental Studies finds that local economic impacts -- such as damage to factories and production facilities -- can trigger secondary impacts across the city's production and trade network. For the largest storms, they report, these impacts can account for as much as three-fourths of the total damage. Their findings appear in a peer-reviewed journal. "Cities are strongly connected by flows of people, of energy, and ideas -- but also by the flows of trade and materials," said Chris Shughrue '18 Ph.D., lead author of the study, which is based on his dissertation work at Yale. He is now a data scientist at StreetCred Labs in New York. "These connections have implications for vulnerability, particularly as we anticipate cyclones and other natural hazards to become more intense and frequent as a result of climate change over the coming decades." The paper was co-authored by Karen Seto, a professor of geography and urbanization science at F&ES, and B.T. Werner, a professor from the Scripps Institution of Oceanography. "This study is especially important in the context of climate impacts on urban areas," Seto said. "Whereas we tend to consider a city's vulnerability to climate change as limited to local events, this study shows that we need to rethink this conceptualization. It shows that disasters have a domino effect through urban networks." Using a simulation coupled with a global urban trade network model -- which maps the interdependencies of cities worldwide -- the researchers show how simulated disasters in one location can trigger a catastrophic domino effect. The global spread of damage was particularly acute when cyclones occurred in cities of North America and East Asia, largely because of their outsize role in global trade networks -- as purchasers and suppliers, respectively -- and because these regions are particularly susceptible to cyclone events. Often, adverse impacts are primarily caused by a spike in material prices, followed by production losses to purchasers. These production losses eventually can cause industrial shortages, which can then induce additional cycles of price spikes and shortages throughout the production chain. Similar outcomes have been borne out following real-world disasters. For instance, when catastrophic flooding occurred in Queensland, Australia, the impact on coking coal production prompted a 25 percent spike in its global price.
And the economic impacts of Hurricane Katrina extended far beyond New Orleans for several years after the historic storm. While the example of cyclones can act as a proxy for other isolated disasters -- such as the 2011 tsunami in Japan, which caused global economic disruptions, particularly in the auto sector -- the researchers say the findings are particularly relevant in terms of climate-related natural events. "To be resilient to climate change is not only about building dikes and sea walls, but understanding a city's supply chains and how they are linked to other cities that may be vulnerable," Seto said. | Hurricanes Cyclones | 2,020
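The domino effect the authors describe, a local production shock propagating to purchasers through the trade network, can be caricatured in a few lines of graph code. The network topology, loss amounts, and damping factor below are invented; the study used a detailed global urban trade model, not this toy.

```python
# Toy shock propagation through a directed trade network: each city
# passes a damped fraction of its production loss on to its purchasers.
# Topology and the 0.5 damping factor are assumptions for illustration.

trade = {  # supplier -> purchasing cities
    "Houston": ["Chicago", "Shanghai"],
    "Shanghai": ["Los Angeles", "Rotterdam"],
    "Chicago": ["Rotterdam"],
    "Los Angeles": [],
    "Rotterdam": [],
}

def propagate(shock_city, initial_loss, damping=0.5, rounds=4):
    loss = {city: 0.0 for city in trade}
    loss[shock_city] = initial_loss
    frontier = {shock_city: initial_loss}
    for _ in range(rounds):
        nxt = {}
        for city, amount in frontier.items():
            for buyer in trade[city]:
                passed = damping * amount  # shortages/price spikes downstream
                loss[buyer] += passed
                nxt[buyer] = nxt.get(buyer, 0.0) + passed
        frontier = nxt
    return loss

losses = propagate("Houston", initial_loss=100.0)
indirect = sum(v for c, v in losses.items() if c != "Houston")
print(losses)
print(f"indirect share of total damage: {indirect / (indirect + 100.0):.0%}")
```

Even in this five-city cartoon the indirect losses approach two-thirds of the total, echoing the paper's finding that secondary impacts can dominate for the largest storms.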
April 22, 2020 | https://www.sciencedaily.com/releases/2020/04/200422151312.htm | Human-caused warming will cause more slow-moving hurricanes, warn climatologists | Hurricanes moving slowly over an area can cause more damage than faster-moving storms, because the longer a storm lingers, the more time it has to pound an area with storm winds and drop huge volumes of rain, leading to flooding. The extraordinary damage caused by storms like Dorian (2019), Florence (2018) and Harvey (2017) prompted Princeton's Gan Zhang to wonder whether global climate change will make these slow-moving storms more common. | Zhang, a postdoctoral research associate in atmospheric and oceanic sciences, decided to tackle the question by using a large ensemble of climate simulations. He worked with an international team of researchers from the Geophysical Fluid Dynamics Laboratory on Princeton University's Forrestal campus and the Meteorological Research Institute in Tsukuba, Japan. The results of this work appear in the April 22 issue of Science Advances. Zhang and his colleagues selected six potential warming patterns for the global climate, then ran 15 different possible initial conditions on each of the six patterns, resulting in an ensemble of 90 possible futures. In all 90 simulations, they told the computers to assume that global carbon dioxide levels have quadrupled and the planet's average temperature has risen by about 4 degrees Celsius -- a level of warming that experts predict could be reached before the turn of the century, if no action is taken to curb fossil fuel use. "Our simulations suggest that future anthropogenic warming could lead to a significant slowing of hurricane motion, particularly in some populated mid-latitude regions," Zhang said. His team found that the storms' forward motion would slow by about 2 miles per hour -- roughly 10 to 20% of current typical speeds -- at latitudes near Japan and New York City. "This is the first study we are aware of that combines physical interpretation and robust modeling evidence to show that future anthropogenic warming could lead to a significant slowing of hurricane motion," he said. "Since the occurrence of Hurricane Harvey, there has been a huge interest in the possibility that anthropogenic climate change has been contributing to a slowdown in the movement of hurricanes," said Suzana Camargo, the Marie Tharp Lamont Research Professor at Columbia University's Lamont-Doherty Earth Observatory, who was not involved in this research. "In a new paper, Gan Zhang and collaborators examined the occurrence of a slowdown of tropical cyclones in climate model simulations. They showed that in this model, there is a robust slowdown of tropical cyclone motion, but this occurs mainly in the mid-latitudes, not in the tropics." Why would the storms slow down? The researchers found that 4 degrees of warming would cause the westerlies -- strong currents blowing through the midlatitudes -- to push toward the poles. That shift is also accompanied by weaker mid-latitude weather perturbations. These changes could slow down storms near populated areas in Asia (where these storms are called typhoons or cyclones, not hurricanes) and on the U.S. eastern seaboard. Usually when people talk about hurricane speeds, they're referring to the winds whipping around the eye of the storm. Those wind speeds are what determine a storm's strength -- a Category 5 hurricane, for example, has sustained winds of more than 157 miles per hour.
By contrast, Zhang and his colleagues are looking at the "translational motion," sometimes called the "forward speed" of a storm, the speed at which a hurricane moves along its path. (The term comes from geometry, where a figure is "translated" when it slides from one part of a graph to another.) No matter how fast its winds are, a storm is considered "slow-moving" if its translational speed is low. Hurricane Dorian, which battered Grand Bahama Island from Sept. 1 to 3, 2019, was a Category 5 hurricane with wind gusts reaching 220 miles per hour, but it had a translational speed of just 1.3 mph, making it one of the slowest hurricanes ever documented. Some researchers have suggested that tropical storm translation speeds have slowed over land regions in the United States since 1900. Zhang and his colleagues used their climate models to see if human-caused warming was responsible for the observed slowdown, but they couldn't find a compelling link, at least based on trends since 1950 in their simulations. In addition, they noted that observed slowing translational speeds reported in recent studies could arise primarily from natural variability rather than human-caused climate changes. Zhang used the metaphor of dieting to explain the ambiguity of hurricane observations. "If I go to the gym and eat fewer sweets," he said, "I would expect to lose weight. But if I'm only using a bathroom scale to weigh myself, I'm not going to get convincing data very soon, for many reasons including that my bathroom scale isn't the most accurate," he continued. "Assume after two weeks, I see some weak trend," he said. "I still can't tell whether it's due to exercise, diet or just randomness." Similarly, the observed slowdown trend in hurricanes or tropical storms over the past century could be due to small-scale local changes or could just be random, he said. "In the debate between 'Everything is caused by climate change' and 'Nothing is caused by climate change' -- what we are doing here is trying to offer that maybe not everything can be immediately attributed to climate change, but the opposite is not right, either," Zhang said. "We do offer some evidence that there could be a slowdown of translational motion in response to a future warming on the order of 4 degrees Celsius. Our findings are backed by physics, as captured by our climate models, so that's a new perspective that offers more confidence than we had before." | Hurricanes Cyclones | 2,020
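Translational speed is easy to compute from storm-track fixes: great-circle distance between successive positions divided by elapsed time. The two Dorian-like positions below are illustrative inventions, not actual best-track data.

```python
# Forward ("translational") speed from consecutive 6-hourly track fixes:
# haversine distance between positions divided by elapsed time.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def forward_speed_mph(fix_a, fix_b, hours=6.0):
    km = haversine_km(*fix_a, *fix_b)
    return km / 1.609 / hours  # km -> miles, then per hour

# Illustrative fixes only (not best-track data): a storm crawling over
# Grand Bahama covers ~12 km in 6 hours, i.e. about 1.3 mph.
a, b = (26.60, -78.40), (26.65, -78.29)
print(f"forward speed: {forward_speed_mph(a, b):.1f} mph")
```

Applied to real 6-hourly best-track fixes, the same arithmetic separates a crawling Dorian from a typical storm moving 10 to 20 mph, which is the quantity the simulations project will shrink.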
March 27, 2020 | https://www.sciencedaily.com/releases/2020/03/200327104114.htm | Disasters can affect cervical cancer screening for years | Cervical cancer screening rates in Japan were significantly affected in the years following the devastating Great East Japan Earthquake of 2011, Tohoku University scientists report in a peer-reviewed journal. | "Conflicts and disasters, and the social isolation that often follows, have a major impact on healthcare and lead to delays in the diagnosis and treatment of cancers," says Tohoku University's Yasuhiro Miki, who specializes in disaster obstetrics and gynecology. On March 11, 2011, Miyagi Prefecture in eastern Japan experienced a 9.0 magnitude earthquake, followed by a destructive tsunami that affected its coastal areas. Miki and colleagues at Tohoku University, led by disaster scientist Kiyoshi Ito, examined how the earthquake affected cervical cancer screening rates in Miyagi Prefecture. Across Japan, approximately 15 women per 100,000 people are affected by cervical cancer. This rate is higher than that in countries such as the US (6.5) and South Korea (8.4), and similar to that in India (14.7) and the Philippines (14.9). Also, less than 1% of girls in Japan have received the human papillomavirus vaccine, which protects against cervical cancer. This means that cervical cancer screening is of particular importance for early detection and diagnosis. Even so, cervical cancer screening rates are lower in Japan (42.3% of women aged 20-69) compared to other countries (80% in the US and the UK, for example). In the five years after the 2011 disaster, cervical cancer screenings dropped by more than 3% in four areas of Miyagi Prefecture covered by mobile van testing. In the coastal city of Onagawa, for example, cervical cancer screening dropped 7% following the disaster. While rates improved slightly over the years, they were still 6.9% lower in 2016 compared to pre-earthquake levels. Similar trends, though less severe, were found in other areas of the prefecture, with rates significantly lower in coastal areas compared to non-coastal ones. "Cervical cancer screening is essential for maintaining good health, but in many affected areas, the rates markedly decreased in the year following the earthquake," says Miki. "More problematically, the decline in cervical cancer screening rates did not even recover in some areas five years after the earthquake." The issue is not specific to Japan. Researchers in the US had previously observed that fewer women were diagnosed with cervical cancer in areas affected by Hurricane Katrina in the five years following 2005, compared to the five years preceding it. Those diagnosed also had more advanced disease, suggesting that the cervical cancer screening services were not being fully utilized. "Long-term monitoring of women's health is needed after a disaster," Miki says. "Measures need to be taken to restore screening rates in all affected areas." The team recommends further studies to understand why screening rates were affected more in some areas compared to others. | Hurricanes Cyclones | 2,020
March 9, 2020 | https://www.sciencedaily.com/releases/2020/03/200309093027.htm | Rain, more than wind, led to massive toppling of trees in Hurricane Maria, says study | A new study says that hurricanes Irma and Maria combined in 2017 to knock down a quarter of the biomass contained in Puerto Rico's trees -- and that massive rainfall, more than wind, was a previously unsuspected key factor. The surprising finding suggests that future hurricanes stoked by warming climate may be even more destructive to forests than scientists have already projected. The study appears this week in a peer-reviewed journal. | "Up to now, the focus on damage to forests has been on catastrophic wind speeds. Here, the data show that rain tends to be the greatest risk factor," said Jazlynn Hall, a Columbia University Ph.D. student who led the study. Her team identified several ways in which extreme rain might topple trees, but they do not completely understand the phenomenon yet, she said. She said that adding climate-driven extreme rainfall to the various dangers threatening tropical and subtropical forests suggests that they may store less carbon in the future than previously thought. When Irma arrived off Puerto Rico on Sept. 6, 2017, it was then the most powerful Atlantic hurricane ever recorded. (Dorian, two years later, surpassed it.) But the main storm passed well off the coast; it dumped a foot of rain, but spared the island the heaviest winds. Forests suffered little damage. Then, two weeks later, on Sept. 20, Maria hit directly, with sustained winds of up to 130 miles per hour, and an astonishing 5 feet of rain over 48 hours in some areas. Extrapolating from a combination of satellite imagery and on-the-ground surveys made a year before the hurricanes, and repeated shortly after, the researchers say that in the wake of Maria, some 10.4 billion kilograms of Puerto Rico's tree biomass went down, with trunks snapped off, uprooted or stripped of leaves and branches -- 23 percent of the island's pre-hurricane forest. But the damage was not uniform, and the researchers sorted through various risk factors that might account for differences. Conventional wisdom has it that big trees high up on slopes directly exposed to high winds should suffer the most in storms. Indeed, the researchers did find that canopy height was an overarching factor; they confirmed earlier research showing that the island's biggest trees were prime victims. After that, conventional wisdom dissolved. Drilling down past tree height, the scientists found that the next most important factors were the amount of rain a specific locality got, plus the maximum local sustained wind speeds. Underlying those: the amount of antecedent rain from Irma, plus the amount of water that could be stored in the first five feet or so of soil from both storms. Adding it all up, the researchers concluded that rain, and its resulting storage in soil, dominated in determining which locales suffered the worst damage. Slope, elevation, topographic protection from wind and orientation toward the wind turned out to be the weakest factors. "It's surprising, in the sense that when you think about hurricane damage to forests, you think about wind," said Hall's advisor and paper co-author Maria Uriarte, a professor at Columbia's Earth Institute. "We're very aware of what flooding does to human infrastructure, but not so much to natural ecosystems."
Uriarte led a series of prior studies on the storms, including one last year suggesting that forests in the paths of increasingly powerful and frequent hurricanes may eventually go into permanent decline. The researchers say extreme rain potentially could affect trees in multiple ways. For one, in relatively flat areas where soils are porous and have a high capacity to store water for extended periods, Irma probably pre-loaded the dirt with liquid. When Maria came along, the ground around tree root zones became waterlogged. This theoretically would weaken the soil and make it easier for wind to uproot trees. In addition to uprooting, the researchers also found that many trees in high-damage areas instead suffered snapped trunks. This, Hall speculated, could happen because rain simultaneously increases the weight of the soil and a tree's canopy, exerting increased strain on the trunk in the face of high winds. A heavier canopy could also contribute to uprooting by simply making it easier for the tree to tip over in saturated soil, she said. Counterintuitively, trees growing on slopes might in many cases resist damage better, because soils there might drain more quickly than those in low-lying areas that are protected from wind, but which collect more rainfall. "The protective role of topography may be lessened in storms of Hurricane Maria's magnitude, which may foreshadow similar effects in future intense storms," says the paper. "Our study supports the idea that compounded disturbances can interact in ways that cannot be predicted." Hurricanes derive their strength from heated air, and previous studies have projected that, due to warming climate, wind speeds of North Atlantic hurricanes may increase by 6 to 15 percent by 2100. Perhaps more salient in light of the new study: Warmer air also pulls in more moisture, and current models project that rainfall will increase even more drastically -- 20-plus percent. Added to that, hurricanes may stall over land for longer periods, meaning that rainfall will not necessarily be more intense, but will last longer. This is what allowed 2017's Hurricane Harvey to devastate southeast Texas as the wettest tropical cyclone on record to hit the United States. A study last year by other researchers says that things may be heading this way already. It estimates that trends in sea-surface temperatures over the last 60 years have made Hurricane Maria-scale precipitation five times more likely. In addition, intervals between high-rain storms like Irma and Maria have already decreased by 50 percent, hiking up the possibility of the sequence that took place in 2017. Tropical forests are now absorbing a third less carbon from the air than they did in the 1990s, according to a study out last week. The main reasons right now are burning and logging of trees, higher temperatures and droughts. But if the new study holds up, in some places it may not be the fire next time, but water. | Hurricanes Cyclones | 2,020
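A simplified version of the "sorting through risk factors" described above is to rank candidate predictors by how strongly they correlate with observed damage. The toy plot-level data below are fabricated; the study used far richer surveys, satellite imagery, and statistical models than a bare correlation ranking.

```python
# Rank candidate risk factors by absolute correlation with damage.
# All plot values are fabricated; only the ranking idea is real.
import statistics

plots = {  # factor -> value per forest plot (same plot order everywhere)
    "canopy_height_m": [25, 18, 30, 12, 22, 28, 15, 20],
    "storm_rain_mm":   [900, 400, 1100, 300, 800, 1000, 350, 600],
    "max_wind_ms":     [50, 35, 55, 30, 45, 52, 33, 40],
    "slope_deg":       [5, 20, 8, 25, 10, 6, 22, 15],
}
damage_frac = [0.45, 0.15, 0.60, 0.08, 0.35, 0.52, 0.12, 0.25]

ranking = sorted(
    ((abs(statistics.correlation(vals, damage_frac)), name)
     for name, vals in plots.items()),
    reverse=True,
)
for r, name in ranking:
    print(f"{name:16s} |corr| = {r:.2f}")
```

In the study's real analysis, rain-related variables outranked the terrain variables, the reverse of what the conventional wind-centric view would predict.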
March 5, 2020 | https://www.sciencedaily.com/releases/2020/03/200305002851.htm | Deltas help to decrease impact of river flooding | Most coastal cities and ports face a double threat from storm surge and river flooding. Infrastructure development along waterways and sea-level rise increase vulnerability for these communities. In a recent publication, The Propagation of Fluvial Flood Waves Through a Backwater-Estuarine Environment, historical data are examined to determine how to reduce the risk of coastal river flooding to communities. | Usually, in rivers, large flooding events move from upstream to downstream faster than small events. This study identified a different pattern by tracking flooding events as they moved from the river to the coastal ocean. The river delta, which is common in many natural systems, turned out to be very important for understanding when and where flooding is likely to happen. Using years of observations (in some cases nine decades of data), this study found that the Tombigbee-Alabama Delta (also known as the Mobile-Tensaw Delta) delays and reduces flooding for cities along the delta and bay. Amazingly, this effect is largely caused by the vegetation that naturally occurs in the delta. Most of the delta is a densely packed tupelo-bald cypress swamp, supporting the most biodiverse location in temperate North America. For large events, the delta swamp acts like a sponge, quickly absorbing the initial floodwaters and then slowly releasing the water back to the main rivers. This gives communities more time to prepare and reduces the risk of river flooding overlapping with a storm surge during a hurricane. The slower release of water from the delta also slows the impact on the bay, delaying the initial flushing while also keeping the salinity low for a longer period of time. In contrast, smaller flooding events moved downstream faster. This occurs because smaller flooding events remain in the confines of the river channel, where they are not impacted by the swamps of the delta. These findings indicate that the intensity of coastal flooding can be decreased, and communities given more time to prepare, by allowing inland regions of rivers to flood and/or by managing vegetation type, both of which reduce the downstream height of water. | Hurricanes Cyclones | 2,020
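The "sponge" behavior described above, the delta quickly absorbing a flood pulse and releasing it slowly, is the classic linear-reservoir model from hydrology: outflow is proportional to storage. The inflow pulse shape and the 48-hour residence-time constant below are illustrative choices, not values from the publication.

```python
# Linear-reservoir caricature of the delta swamp: it soaks up a flood
# pulse, then releases it slowly (outflow = storage / k). The pulse
# shape and the k = 48 h residence time are illustrative assumptions.

def route_through_delta(inflow, k_hours=48.0, dt_hours=1.0):
    storage, outflow = 0.0, []
    for q_in in inflow:
        q_out = storage / k_hours             # slow, storage-driven release
        storage += (q_in - q_out) * dt_hours  # swamp absorbs the difference
        outflow.append(q_out)
    return outflow

# A sharp 24-hour flood wave entering the delta, then receding.
inflow = [0.0] * 12 + [1000.0] * 24 + [0.0] * 120
outflow = route_through_delta(inflow)

peak_in, peak_out = max(inflow), max(outflow)
lag = outflow.index(peak_out) - inflow.index(peak_in)
print(f"peak reduced {peak_in:.0f} -> {peak_out:.0f} units; delayed {lag} h")
```

The routed hydrograph peaks at well under half the inflow peak and about a day later, which is exactly the attenuation-plus-delay behavior that gives downstream communities extra preparation time and decouples the river crest from a coincident storm surge.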