Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---|
May 23, 2019 | https://www.sciencedaily.com/releases/2019/05/190523104925.htm | Exposure to air pollution before and after birth may affect fundamental cognitive abilities | A growing body of research suggests that exposure to air pollution in the earliest stages of life is associated with negative effects on cognitive abilities. A new study led by the Barcelona Institute for Global Health (ISGlobal), a centre supported by "la Caixa," has provided new data: exposure to particulate matter with a diameter of less than 2.5 μm (PM2.5) during pregnancy and the first years of life is associated with a reduction in fundamental cognitive abilities, such as working memory and executive attention. | The study, carried out as part of the BREATHE project, included 2,221 children between 7 and 10 years of age attending schools in the city of Barcelona. The children's cognitive abilities were assessed using various computerized tests. Exposure to air pollution at home during pregnancy and throughout childhood was estimated with a mathematical model using real measurements. The study found that greater PM2.5 exposure from pregnancy until age 7 years was associated with lower working memory scores on tests administered between the ages of 7 and 10 years. The results suggest that exposure to fine particulate matter throughout the study period had a cumulative effect, although the associations were stronger when the most recent years of exposure were taken into account. Working memory is a cognitive system responsible for temporarily holding information for subsequent manipulation. It plays a fundamental role in learning, reasoning, problem-solving and language comprehension. Sex-stratified analysis showed that the relationship between PM2.5 exposure and diminished working memory was found only in boys. "As yet, we don't understand what causes these differences, but there are various hormonal and genetic mechanisms that could lead to girls having a better response to inflammatory processes triggered by fine particulate matter and being less susceptible to the toxicity of these particles," commented Ioar Rivas, ISGlobal researcher and lead author of the study. The study also found that higher exposure to particulate matter was associated with a reduction in executive attention in both boys and girls. Executive attention is one of the three networks that make up a person's attention capacity. It is involved in high-level forms of attention, such as the detection and resolution of conflicts between options and responses, error detection, response inhibition, and the regulation of thoughts and feelings. Whereas previous studies in the BREATHE project analysed exposure to air pollution at schools over the course of a year, this study assessed exposures at the participants' homes over a much longer time: from the prenatal period to 7 years of age. "This study reinforces our previous findings and confirms that exposure to air pollution at the beginning of life and throughout childhood is a threat to neurodevelopment and an obstacle that prevents children from reaching their full potential," commented Jordi Sunyer, Childhood and Environment Programme Coordinator at ISGlobal and last author of the study. | Pollution | 2019
May 23, 2019 | https://www.sciencedaily.com/releases/2019/05/190523091301.htm | Widespread permafrost degradation seen in high Arctic terrain | Rapid changes in terrain are taking place in Canada's high Arctic polar deserts due to increases in summer air temperatures. | A McGill-led study documenting these changes was published recently. "Our study suggests that the warming climate in the high Arctic, and more specifically the increases in summer air temperatures that we have seen in recent years, are initiating widespread changes in the landscape," says Melissa Ward Jones, the study's lead author and a PhD candidate in McGill's Department of Geography. The research team noted that: "Despite the cold polar desert conditions that characterize much of the high Arctic, this research clearly demonstrates the complex nature of ice-rich permafrost systems and climate-permafrost interaction," adds Wayne Pollard, a professor in McGill's Department of Geography and co-author on the study. "Furthermore, it raises concerns about the oversimplification of some studies that generalize about the links between global warming and permafrost degradation." The research was funded by the Association of Canadian Universities for Northern Studies (ACUNS), the Natural Sciences and Engineering Research Council (NSERC), the Fonds de recherche du Québec -- Nature et technologies (FRQNT), David Erb Fellowship, Eben Hobson Fellowship and the Northern Scientific Training Program (NSTP). | Pollution | 2019
May 23, 2019 | https://www.sciencedaily.com/releases/2019/05/190523104936.htm | Seeing inside superfog | While prescribed fires are common tools in wildland management, a combination of smoke and fog, known as superfog, has in some cases crossed over major roadways, leading to multicar pileups and fatalities when visibility drops below 3 meters. | New research led by the University of California, Riverside, and sponsored by the USDI/USDA Joint Fire Sciences Program, has for the first time produced superfog in a laboratory. With a better understanding of how superfog forms, foresters may be able to add additional criteria in planning future prescribed burns. The team also identified the smoke particle size distribution and concentration, ambient liquid water content, ambient temperature, ambient relative humidity, fuel moisture content and wind speed that lead to superfog formation. The authors caution, however, that the science of predicting when some of these conditions will be met is still in its infancy. Fog forms when water molecules condense around microscopic solid particles suspended in the air, somewhat like dew forming around a blade of grass. Particles come from many sources, including dust, vehicle emissions, and smoke. In order for water to condense, the ambient air temperature must be cool enough to become saturated with water vapor introduced by processes like wind, evaporation, or plant respiration. When nearly saturated air blends with smoke and moisture released by smoldering organic materials, a dense, low-lying superfog can form. Because superfog is uncommon and hard to study naturally, the researchers designed a laboratory setup to explore conditions that create it. They burned wildland fuels, such as pine needles, in an air conditioned, custom-made fire wind tunnel under varying environmental conditions and fuel moisture content. The team found that when water content is low, particle size needs to be small enough to create droplets no larger than one micron, small enough to fit 50 droplets in the diameter of a human hair, to reduce visibility to superfog levels. If the droplets get much bigger than that, they don't absorb as much light and require more water. Droplet concentrations must be around 100,000 per cubic centimeter, or about 100,000 droplets packed into a volume smaller than two M&Ms. Burning vegetation usually exceeds this amount. Superfog also requires ambient temperatures less than 4 degrees Celsius or 39.2 degrees Fahrenheit, humidity over 80%, and high fuel moisture content. High fuel moisture content lets more water vapor into the smoke and produces the thickest superfog. When combined with modeling of other atmospheric conditions that influence the growth and spread of fog, the experiments replicated conditions matching superfog that caused multicar pileups in Florida in 2008 and 2012. The combination of high humidity and high plant moisture content required to make superfog explains why it occurs mostly in Southern states like Florida and Louisiana rather than California. However, it is still not possible to predict when or where it will occur. "Now we know what are the proper mixtures of various ingredients that form superfog, but we still do not know how to predict when those ingredients in the right mixture will form due to the combination of atmospheric, fuel, and ground conditions," said co-author Marko Princevac, a professor of mechanical engineering in the Marlan and Rosemary Bourns College of Engineering at UC Riverside who studies wildfire combustion and behavior. "I believe it is still early to claim that superfog can be predicted with any certainty." | Pollution | 2019
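The formation thresholds identified in this article amount to a conjunction of measurable conditions, which lends itself to a simple screening check. Below is a minimal sketch, not the study's model: the function name and interface are hypothetical, and the numeric cutoffs are just the approximate values quoted above (high fuel moisture, which the article does not quantify, is noted but omitted).

```python
# Minimal screening sketch based on the superfog thresholds quoted in
# this article. Illustrative only: not the published study's model, and
# high fuel moisture content is a further (unquantified) requirement.

def superfog_conditions_met(droplet_diameter_um: float,
                            droplet_conc_per_cm3: float,
                            air_temp_c: float,
                            relative_humidity_pct: float) -> bool:
    """Return True if all article-quoted superfog criteria hold."""
    small_droplets = droplet_diameter_um <= 1.0    # no larger than ~1 micron
    dense_droplets = droplet_conc_per_cm3 >= 1e5   # ~100,000 per cubic cm
    cold_air = air_temp_c < 4.0                    # below 4 C (39.2 F)
    humid_air = relative_humidity_pct > 80.0       # over 80% humidity
    return small_droplets and dense_droplets and cold_air and humid_air

# Example: conditions resembling a cold, humid smoke layer
print(superfog_conditions_met(0.8, 1.2e5, 2.5, 90.0))  # True
```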
May 22, 2019 | https://www.sciencedaily.com/releases/2019/05/190522120501.htm | Global temperature change attributable to external factors, new study confirms | Researchers at the University of Oxford have confirmed that human activity and other external factors are responsible for the rise in global temperature. While this has been the consensus of the scientific community for a long time, uncertainty remained around how natural ocean-cycles might be influencing global warming over the course of multiple decades. The answer we can now give is: very little to none. | In a new study, the researchers conclude that the role of natural ocean cycles is minimal. "We can now say with confidence that human factors like greenhouse gas emissions and particulate pollution, along with year-to-year changes brought on by natural phenomena like volcanic eruptions or the El Niño, are sufficient to explain virtually all of the long-term changes in temperature," says study lead author Dr Karsten Haustein. "The idea that oceans could have been driving the climate in a colder or warmer direction for multiple decades in the past, and therefore will do so in the future, is unlikely to be correct." The study showed that global warming that occurred during the 'early warming' period (1915 -- 1945) was in fact caused by external factors as well. Formerly, it had been largely attributed to natural ocean temperature changes, which is why there has been uncertainty over how much of global warming is influenced by unpredictable natural factors. "Our study showed that there are no hidden drivers of global mean temperature," says co-author Dr Friederike Otto. "The temperature change we observe is due to the drivers we know. This sounds boring, but sometimes boring results are really important. In this case, it means we will not see any surprises when these drivers -- such as gas emissions -- change. In good news, this means when greenhouse gas concentrations go down, temperatures will do so as predicted; the bad news is there is nothing that saves us from temperatures going up as forecasted if we fail to drastically cut greenhouse gas emissions." | Pollution | 2019
May 22, 2019 | https://www.sciencedaily.com/releases/2019/05/190520165012.htm | Giving rural Indians what they want increases demand for cookstoves | Global health efforts to design and deliver improved cookstoves don't always catch on. Experience has shown poor households in rural settings will rarely pay for or use these new stoves, which are intended to lower firewood demands and improve indoor and outdoor air quality. | However, adopting some common business practices, such as upgrading the supply chain, performing careful market analysis and offering price rebates, can increase purchase and adoption of improved cookstoves by as much as 50 percent in rural India, according to a new study led by Duke University researchers. Three billion people still rely on traditional cookstoves that use solid fuels such as wood or coal. These stoves contribute to climate change through carbon emissions, deforestation and toxic air pollution, which contributes to poor health among users and their communities. Improved cookstoves use either electricity or biomass as an energy source. Switching to them can deliver 'triple wins': better household health, better environmental health and reduced climate change emissions. The adoption of improved cookstoves has been slow, however, likely because of constraints imposed by differences in markets, culture and geography. "Previous studies have found low demand for these cookstoves, however, our study found that when barriers to adopting the stoves were addressed, the demand was high," said Subhrendu Pattanayak, Oak Professor of Environmental and Energy Policy at Duke's Sanford School of Public Policy and lead author of the study. "A big question for policy scientists has been: Can we figure out what technology and energy and environmental services people want, and use that understanding to get people to pay for them? Our study shows that we can," he said. The Duke researchers took a novel approach by implementing the study in three phases -- diagnose, design and test -- over a period of five years. In the first phase, the researchers analyzed existing research on improved cookstove adoption, and looked at sales across different potential study communities, which provided insight into both demand- and supply-side barriers to adoption. They found no common strategies for promoting changes in cooking behavior, but instead concluded the socio-economic case for adoption was influenced by local context. They then conducted focus groups in more than 100 households in 11 rural Indian communities, which allowed researchers to understand local cooking practices, perceptions of different stoves and preferences for stove features. In the design phase, researchers worked with local organizations to implement eight small pilot programs in three different settings. This included small-scale testing of various supply chain issues such as marketing and home delivery, rebates and financing, and offers of electric and/or biomass cookstoves. In the third phase, they conducted a field experiment to determine whether the combination of upgraded supply and demand promotion would lead to increased adoption of improved cookstoves. The field test included nearly 1,000 households in 97 geographically distinct villages in the Indian Himalayas. The experiment showed that more than half of the intervention households bought an improved cookstove compared with zero purchases in the control villages. The demand was very price-sensitive, and the largest rebate, 33 percent of retail price, led to the largest purchase rate, 74 percent. In the areas that only had an upgraded supply chain and promotion without rebates, there was a 28 percent increase in ownership of the improved cookstoves. Households overwhelmingly preferred the electric stove over the biomass stove, by a factor of two to one. Respondents liked the lack of smoke, speed of cooking and portability and attractiveness of the stove. However, this preference for electric stoves highlighted the lack of a steady source of electricity. In India, rural electrification rates have been rising rapidly, growing from 57 to 83 percent between 2005 and 2015. "Our work shows how energy access programs and projects can scale up and achieve success by understanding local demand and developing robust regional supply chains," said co-author Marc Jeuland, professor of public policy and global health at Sanford. The researchers also looked at whether stove ownership persisted over time, going back to the households three and 15 months after purchase. Most households still owned the stoves, although at the later date, 15 percent of households reported their stoves needed repair. The authors argue that barriers to improved cookstove adoption can be overcome and that households are willing to pay substantial prices for them, but that maintenance and sustainability require additional attention. The interventions that helped improve adoption rates are similar to common marketing and sales practices of private firms, Pattanayak said. While the interventions appear costly on paper, the costs are less than the social and environmental benefits gained. "Our findings suggest that market analysis, robust supply chains and price discounts are critical for improved cookstove adoption," said Pattanayak. The study appears the week of May 20. This research was partially funded by a grant from the United States Agency for International Development (USAID) (Cooperative Agreement GHS-A000-09-00015-00) and a grant from the Office of the Provost of Duke University. | Pollution | 2019
May 21, 2019 | https://www.sciencedaily.com/releases/2019/05/190521124534.htm | New lidar instruments peer skyward for clues on weather and climate | Researchers have developed a set of diode-based lidar instruments that could help fill important gaps in meteorological observations and fuel a leap in understanding, modeling and predicting weather and climate. The instruments are particularly well suited for insights on atmospheric dynamics at the mesoscale, a size range equivalent to the area of a small city up to that of a U.S. state. | Collaborators from Montana State University (MSU) in Bozeman and the National Center for Atmospheric Research (NCAR) in Boulder, Colo., will discuss the work during The Optical Society's Optical Sensors and Sensing Congress, which will take place from 25-27 June in San Jose, Calif., during Sensors Expo 2019. So far, the team has created five diode-based micro-pulse differential absorption lidar (DIAL) instruments -- MPD instruments, for short -- for profiling water vapor in the lower troposphere, the region of the atmosphere where most weather occurs. Diode-laser-based instruments operate in the range of wavelengths from 650 to 1,000 nanometers, mostly within the infrared spectrum. The instruments can be deployed both day and night, largely unattended, without risking eye damage to humans. "The network of five water vapor MPD instruments was deployed to the Atmospheric Radiation Measurement Southern Great Plains atmospheric observatory in mid-April," team member Catharine Bunn said. "From this three-month field experiment we will gain insight into how weather forecasting may be impacted by continuous MPD measurements of atmospheric water vapor." Several reports by the National Academies of Sciences, Engineering and Medicine and other expert groups over the past decade have identified a critical need for vertical measurement profiles of humidity, aerosols, and temperature in the lower troposphere. Experts also call for the creation of a "network of networks" for collecting and sharing this data. To provide needed coverage for improved weather and climate forecasting across the U.S., one report proposed deploying an array of sensors on the ground at about 400 sites nationwide spaced roughly 125 kilometers apart. However, there has been a gap in the instrumentation to meet this vision for research and monitoring without relying on aircraft-based devices, which are expensive to deploy. Building on prior work by other teams and collaborating with NCAR scientists, MSU instrument developers turned to diode-based MPD technology as an economical route to a profiler that could make accurate measurements and fulfill desired specifications for continuous, unattended operation and eye safety. The researchers have developed five different instruments based on a common architecture in which laser pulses are sent into the atmosphere and the return signal, which varies as the light interacts with water vapor, is measured with single photon counting modules. All five instruments are operational and two have been deployed in ground-based weather- and climate-research experiments. One instrument, developed collaboratively by MSU and NCAR scientists, was fielded as part of the Front Range Air Pollution and Photochemistry Experiment (FRAPPE). The instrument measured the vertical water vapor profile with less than 10 percent mean error over a range of atmospheric conditions, as compared to profiles collected by airborne devices. It also ran unattended for 50 continuous days during FRAPPE with no apparent performance decline, providing about 95 percent data coverage. The researchers have also advanced toward vertically profiling two other high-interest features of the lower troposphere: aerosols and temperature. Based on the MPD architecture, NCAR researchers built a novel high spectral resolution lidar (HSRL) capable of profiling aerosols. Complementing this work, an MSU physicist adapted mathematical techniques from quantum mechanics to solve an equation that opens the door to using measurements of properties of oxygen molecules and other atmospheric data to create a vertical temperature profile. Models and preliminary experiments suggest that in addition to measuring water vapor and other airborne particles, the HSRL can provide measurements needed for fine-grained, high frequency temperature profiling. During the June Congress, the researchers are planning to provide the latest on their temperature-profiling work and other updates on their instrumentation. For now, Bunn said, "We are beginning to retrieve temperature profiles of the lower troposphere with an accuracy of +/- 2 Kelvin and we are working to improve instrument and retrieval algorithm performance." | Pollution | 2019
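The measurement principle behind these DIAL instruments can be illustrated with the standard two-wavelength retrieval: the return at an "online" wavelength absorbed by water vapor is compared with a nearby "offline" wavelength that is not, and the ratio across adjacent range gates yields a number-density profile. The sketch below is a generic textbook implementation, not the MPD team's processing code; the cross section and signal values are synthetic placeholders.

```python
# Generic differential-absorption lidar (DIAL) retrieval sketch.
# Illustrative only: not the MPD instruments' actual processing chain.
import numpy as np

def dial_number_density(p_on, p_off, delta_r_m, delta_sigma_m2):
    """Water vapor number density (per m^3) from on/offline returns:
    n(r) = ln[P_off(r+dr) P_on(r) / (P_on(r+dr) P_off(r))] / (2 dsigma dr)
    """
    ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma_m2 * delta_r_m)

# Synthetic demo: uniform humidity layer, placeholder cross section
dr = 150.0           # range-gate spacing, meters
dsigma = 2.0e-27     # differential absorption cross section, m^2 (placeholder)
n_true = 2.0e23      # true number density, molecules per m^3 (placeholder)
r = np.arange(0.0, 3000.0, dr)
p_off = np.exp(-r / 5000.0)                        # geometric/aerosol losses only
p_on = p_off * np.exp(-2.0 * n_true * dsigma * r)  # extra water-vapor absorption
print(dial_number_density(p_on, p_off, dr, dsigma)[:3])  # ~2e23 each
```

Because the non-absorption losses cancel in the ratio, the retrieval needs no absolute calibration, which is part of what makes a low-power diode-based design practical.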
May 20, 2019 | https://www.sciencedaily.com/releases/2019/05/190520115740.htm | Counter-intuitive climate change solution | A relatively simple process could help turn the tide of climate change while also turning a healthy profit. That's one of the hopeful visions outlined in a new Stanford-led paper that highlights a seemingly counterintuitive solution: converting one greenhouse gas into another. | The study outlines the approach. "If perfected, this technology could return the atmosphere to pre-industrial concentrations of methane and other gases," said lead author Rob Jackson, the Michelle and Kevin Douglas Provostial Professor in Earth System Science in Stanford's School of Earth, Energy & Environmental Sciences. The basic idea is that some sources of methane emissions -- from rice cultivation or cattle, for example -- may be very difficult or expensive to eliminate. "An alternative is to offset these emissions via methane removal, so there is no net effect on warming the atmosphere," said study coauthor Chris Field, the Perry L. McCarty Director of the Stanford Woods Institute for the Environment. In 2018, methane -- about 60 percent of which is generated by humans -- reached atmospheric concentrations two and a half times greater than pre-industrial levels. Although the amount of carbon dioxide in the atmosphere is much greater, methane is 84 times more potent in terms of warming the climate system over the first 20 years after its release. Most scenarios for stabilizing average global temperatures at 2 degrees Celsius above pre-industrial levels depend on strategies for both reducing the overall amount of carbon dioxide entering the atmosphere and removing what's already in the atmosphere through approaches such as tree planting or underground sequestration. However, removing other greenhouse gases, particularly methane, could provide a complementary approach, according to the study's authors, who point to the gas's outsized influence on the climate. Most scenarios for removing carbon dioxide typically assume hundreds of billions of tons removed over decades and do not restore the atmosphere to pre-industrial levels. In contrast, methane concentrations could be restored to pre-industrial levels by removing about 3.2 billion tons of the gas from the atmosphere and converting it into an amount of carbon dioxide equivalent to a few months of global industrial emissions, according to the researchers. If successful, the approach would eliminate approximately one-sixth of all causes of global warming to date. Methane is challenging to capture from air because its concentration is so low. However, the authors point out that zeolite, a crystalline material that consists primarily of aluminum, silicon and oxygen, could act essentially as a sponge to soak up methane. "The porous molecular structure, relatively large surface area and ability to host copper and iron in zeolites make them promising catalysts for capturing methane and other gases," said Ed Solomon, the Monroe E. Spaght Professor of Chemistry in the School of Humanities and Sciences. The whole process might take the form of a giant contraption with electric fans forcing air through tumbling chambers or reactors full of powdered or pelletized zeolites and other catalysts. The trapped methane could then be heated to form and release carbon dioxide, the authors suggest. The process of converting methane to carbon dioxide could be profitable with a price on carbon emissions or an appropriate policy. If market prices for carbon offsets rise to $500 or more per ton this century, as predicted by most relevant assessment models, each ton of methane removed from the atmosphere could be worth more than $12,000. A zeolite array about the size of a football field could generate millions of dollars a year in income while removing harmful methane from the air. In principle, the researchers argue that the approach of converting a more harmful greenhouse gas to one that's less potent could also apply to other greenhouse gases. While reducing greenhouse gases in the atmosphere to pre-industrial levels may seem unlikely in the near future, the researchers argue that it could be possible with strategies like these. | Pollution | 2019
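The $12,000 figure is consistent with back-of-envelope arithmetic. A hedged sketch: oxidizing one ton of CH4 yields 44/16, or about 2.75, tons of CO2, and with a 100-year global warming potential of roughly 28 for methane (my assumption; the article itself quotes the 20-year value of 84), the net benefit is about 25 tons of CO2-equivalent per ton removed.

```python
# Back-of-envelope value of converting one ton of CH4 to CO2.
# GWP100 ~ 28 is an assumed 100-year value, not stated in the article;
# the $500/ton offset price is the article's scenario.
GWP100_CH4 = 28.0              # tons CO2-equivalent avoided per ton of CH4
CO2_PER_CH4 = 44.0 / 16.0      # tons CO2 produced per ton CH4 oxidized (~2.75)
PRICE_PER_TON_CO2E = 500.0     # dollars per ton, article scenario

net_co2e = GWP100_CH4 - CO2_PER_CH4                   # ~25.25 tons CO2e
value = PRICE_PER_TON_CO2E * net_co2e
print(f"~${value:,.0f} per ton of methane removed")   # ~$12,625
```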
May 20, 2019 | https://www.sciencedaily.com/releases/2019/05/190520115640.htm | New method to predict the vulnerability of ecosystems | Natural ecosystems are as vulnerable as they are diverse. Environmental changes such as climate change, pollution or the spread of alien species can easily throw an ecosystem off balance. Researchers are therefore investigating how susceptible ecosystems are to disruption. But in their search for answers they face the problem that the complex network of relationships includes innumerable interactions, which are virtually impossible to record comprehensively and convert into measurable data. | In an effort to overcome this obstacle, a team led by ecologist Prof. Ulrich Brose of Friedrich Schiller University in Jena (Germany) and of the German Centre for Integrative Biodiversity Research (iDiv) has developed a new approach. The special feature of the method is that only limited information is needed about the characteristics of 'predators' that hunt prey animals. These data enable researchers to determine the structure and stability of a habitat, without the need for a comprehensive examination of the relationships to other organisms. The scientists were able to confirm the value of their method using a large dataset of 220,000 interactions from 290 food webs. They had collected the data from research partners throughout the world over a period of more than 10 years. "The decisive characteristic of a predator is the relationship between its body mass and that of its prey," explains Brose, who was recently awarded the Thuringian Research Prize. "If there is a big difference, this has a positive effect on the equilibrium of the energy flows of the food web and, by extension, on the stability of the ecosystem." Large hunters with small prey, such as mouse-hunting martens, therefore have an important positive effect on the organisms' habitats. With the help of the data they had collected, Brose and his team were able to predict precisely which animals play a key role within a food web. The prediction is even more precise if, in addition to body mass, additional features such as the mode of locomotion or the metabolic type are considered. The analysis showed that, depending on the nature of the habitat, different species of predator maintain the equilibrium of an ecosystem. In three-dimensional biotopes (air, water), very large predators have a stabilising effect, whereas in two-dimensional spaces (land), this is done by smaller predators. Brose and his team now wish to study the reasons for these differences. For their next step, they want to supplement the existing data on food webs with additional physical factors, such as the gravitational force or the viscosity of the medium in which the organisms live. "Our aim is to uncover the fundamental laws of the architecture of biodiversity," says Brose, who makes his data available to other research teams through the iDiv database. His latest findings might also help to close the gap between food web theory and practical nature conservation. If we understand nature conservation to be a way of cushioning nature against disturbance from outside, we will have most success if we protect large hunters such as whales and sharks in water and large birds of prey in the air. In contrast, on land we should prioritise small mammals such as weasels or polecats. | Pollution | 2019
May 20, 2019 | https://www.sciencedaily.com/releases/2019/05/190520081911.htm | Epidemiology: Measures for cleaner air | Many measures have been introduced around the world with the aim of reducing outdoor air pollution and concomitantly improving public health. These efforts include, for example, the regulation of industrial emissions, the establishment of low emission zones and the subsidies for public transport, as well as restrictions on the use of wood and coal for heating in private households. The link between these actions and improved air quality and health seems obvious, but it is actually very difficult to quantify their effects. "It's quite a challenge to evaluate the introduction of a measure like the low emission zone," says Jacob Burns from the Institute for Medical Information Processing, Biometry and Epidemiology (IBE) at the LMU's Pettenkofer School of Public Health. | The negative effects of air pollution on public health linked to cardiovascular and respiratory diseases, among others, are well established. But whether measures designed to improve outdoor air quality actually reduce the concentration of pollutants present, and mitigate their effects on public health, is less clear. "It's important to remember how many factors influence both air quality and the relevant health conditions," says Burns. "Levels of energy consumption in industry, transport and domestic households all play a substantial role in air pollution levels, as does the weather," he points out. And with respect to health, the risk of developing cardiovascular disease, to cite one example, is influenced not only by particulate matter and other pollutants we breathe in, but also by numerous genetic, physiological and social risk factors. "This illustrates how difficult it can be to attribute changes in air pollutant concentrations, numbers of individuals admitted to hospitals, or mortality rates to any single measure." These difficulties are reflected in the new review published in the Cochrane Library. Cochrane is a network of more than 13,000 researchers, whose primary aim is to improve the quality of the scientific knowledge base upon which policy decisions relevant to human health are based. The authors of the new study, led by Professor Eva Rehfuess' research group from the IBE at the Pettenkofer School of Public Health, provide the first systematic review that identifies and critically appraises all studies evaluating the impact of measures intended to improve air quality. The study considers 38 specific measures, ranging from those to reduce traffic, to the regulation of industrial emissions and opportunities for cleaner and more efficient household heating systems. "For the most part, the studies that we reviewed show either positive or unclear effects. But these studies differ so much from one another that we could not, with confidence, draw general conclusions about what works and what does not work," Burns explains. The LMU epidemiologists emphasize, however, that this is not an argument against such interventions. Indeed, the authors explicitly note that "it is important to emphasize that lack of evidence of an association is not equivalent to evidence of no association." For them, the more important message is that "the methods of evaluation in this area must be improved, so that decision-makers have a reliable basis on which to base their policy choices," says Rehfuess. In this study, the LMU researchers make a number of specific recommendations -- in particular with respect to the design of future studies in this area, some of which are also directed at policymakers. "At the moment," says Rehfuess, "many studies are conducted retrospectively. Ideally, the evaluation could be incorporated into the planning and introduction of the measure." | Pollution | 2019
May 16, 2019 | https://www.sciencedaily.com/releases/2019/05/190516131752.htm | Electric car switch on for health benefits | Could the health benefits and reduced costs to healthcare systems be enough to justify subsidizing charging infrastructure to allow society to switch from the internal combustion engine to electric vehicles faster than current trends predict? | Writing in the International Journal of Electric and Hybrid Vehicles, Mitchell House and David Wright of the University of Ottawa, Canada, suggest that the migration from polluting vehicles that burn fossil fuels to electric vehicles, ideally using electricity generated sustainably, could significantly reduce the incidence of cardiopulmonary illness due to air pollution. This would not only mean less employee absence from work through illness but also lead to broad improvements in quality and length of life. The team's paper uses empirical data to compare the financial costs of building electric vehicle charging infrastructure with the health costs, to see if there is a net benefit. They have found that in the majority of plausible scenarios of balanced growth, when the number of vehicles rises and so does the number of charging stations, there is a positive net benefit to society. "Since health benefits accrue to governments, businesses, and individuals, these results justify the use of government incentives for charging station deployment and this paper quantifies the impact of different levels of incentive," the team concludes. The team explains that the Electric Vehicles Initiative (EVI) (an organization supported by 16 governments) has a target of 20 million electric vehicles by the year 2020. This was based on a notional growth rate of 75% per year defined in 2016. At that time, EV sales amounted to more than half a million (550,000) worldwide in 2015, which represented growth of 70% on 2014. Electric vehicle sales have continued to grow, with 2017 and 2018 experiencing 61% and 64% year-over-year growth respectively. Their results suggest that a 75% growth rate for electric vehicle uptake is not unrealistic. Moreover, in the face of anthropogenic climate change and the detrimental effects of pollution on health, some observers see the transition to electric vehicles as being a matter of serious urgency. This has to take into consideration the electricity generating mix from which the vehicles derive their power. If electricity is mostly supplied from power stations generating electricity by burning fossil fuels, including coal, gas, and oil, then many of the benefits are lost. This is particularly true in terms of climate impact at the global level but also in terms of sulfur oxide, nitrogen oxide, and particulate pollution. This has been witnessed in China, India, and Russia, as electricity demand has risen rapidly. This latest study points out that governments have not been keen to support charging infrastructure because a variety of industry players are involved and bear responsibility for some of the cost. This would include electric utility companies who would profit directly from charging vehicles, out-of-town shopping centers that could attract more customers with charging points in their car parks, the manufacturers of vehicles and a new generation of "gas station" operators. "The savings that can be achieved by 2021 are higher than the cost of installing charging station infrastructure over a wide range of scenarios," the team writes. "These net benefits apply both to balanced growth in charging stations (in which the number of charging stations is proportional to the number of EVs) and also to rapid build out (in which charging stations are built over 2-4 years in order to achieve government EV targets for 2020 and 2025)." Ultimately, it is the reduced financial burden of a healthier populace that offsets the costs. | Pollution | 2019
May 16, 2019 | https://www.sciencedaily.com/releases/2019/05/190516090838.htm | Australian islands home to 414 million pieces of plastic pollution | A survey of plastic pollution on Australia's Cocos (Keeling) Islands has revealed the territory's beaches are littered with an estimated 414 million pieces of plastic debris. | The study was led by IMAS researcher Dr Jennifer Lavers. Dr Lavers' research made headlines around the world when in May 2017 she revealed that beaches on remote Henderson Island in the South Pacific had the highest density of plastic debris reported anywhere on Earth. While the density of plastic debris on Cocos (Keeling) Islands beaches is lower than on Henderson Island, the total volume dwarfs the 38 million pieces weighing 17 tonnes found on the Pacific island. Dr Lavers said remote islands which don't have large human populations depositing rubbish nearby are an indicator of the amount of plastic debris circulating in the world's oceans. "Islands such as these are like canaries in a coal mine and it's increasingly urgent that we act on the warnings they are giving us," Dr Lavers said. "Plastic pollution is now ubiquitous in our oceans, and remote islands are an ideal place to get an objective view of the volume of plastic debris now circling the globe." "Our estimate of 414 million pieces weighing 238 tonnes on Cocos (Keeling) is conservative, as we only sampled down to a depth of 10 centimetres and couldn't access some beaches that are known debris 'hotspots'." "Unlike Henderson Island, where most identifiable debris was fishing-related, the plastic on Cocos (Keeling) was largely single-use consumer items such as bottle caps and straws, as well as a large number of shoes and thongs," Dr Lavers said. Co-author Dr Annett Finger from Victoria University said global production of plastic continues to increase, with almost half of the plastic produced over the past 60 years manufactured in the last 13 years. "An estimated 12.7 million tonnes of plastic entered our oceans in 2010 alone, with around 40 per cent of plastics entering the waste stream in the same year they're produced." "As a result of the growth in single-use consumer plastics, it's estimated there are now 5.25 trillion pieces of ocean plastic debris." "Plastic pollution is a well-documented threat to wildlife and its potential impact on humans is a growing area of medical research." "The scale of the problem means cleaning up our oceans is currently not possible, and cleaning beaches once they are polluted with plastic is time consuming, costly, and needs to be regularly repeated as thousands of new pieces of plastic wash up each day." "The only viable solution is to reduce plastic production and consumption while improving waste management to stop this material entering our oceans in the first place," Dr Finger said. The plastic survey on the beaches of Cocos (Keeling) Islands was assisted by Sea Shepherd's Marine Debris program, the Tangaroa Blue Foundation and the Two Hands Project. | Pollution | 2019
May 15, 2019 | https://www.sciencedaily.com/releases/2019/05/190515115831.htm | Iceland volcano eruption in 1783-84 did not spawn extreme heat wave | An enormous volcanic eruption on Iceland in 1783-84 did not cause an extreme summer heat wave in Europe. But, as Benjamin Franklin speculated, the eruption triggered an unusually cold winter, according to a Rutgers-led study. | The eight-month eruption of the Laki volcano, beginning in June 1783, was the largest high-latitude eruption in the last 1,000 years. It injected about six times as much sulfur dioxide into the upper atmosphere as the 1883 Krakatau or 1991 Pinatubo eruptions, according to co-author Alan Robock, a Distinguished Professor in the Department of Environmental Sciences at Rutgers University-New Brunswick. The eruption coincided with unusual weather across Europe. The summer was unusually warm with July temperatures more than 5 degrees Fahrenheit above the norm, leading to societal disruption and failed harvests. The 1783-84 European winter was up to 5 degrees colder than average. Franklin, the U.S. ambassador to France, speculated on the causes in a 1784 paper, the first publication in English on the potential impacts of a volcanic eruption on the climate. To determine whether Franklin and other researchers were right, the Rutgers-led team performed 80 simulations with a state-of-the-art climate model from the National Center for Atmospheric Research. The computer model included weather during the eruption and compared the ensuing climate with and without the effects of the eruption. "It turned out, to our surprise, that the warm summer was not caused by the eruption," Robock said. "Instead, it was just natural variability in the climate system. It would have been even warmer without the eruption. The cold winter would be expected after such an eruption." The warm 1783 summer stemmed from unusually high pressure over Northern Europe that caused cold polar air to bypass the region, the study says. After the eruption, precipitation in Africa and Asia dropped substantially, causing widespread drought and famine. The eruption also increased the chances of El Niño, featuring unusually warm water in the tropical Pacific Ocean, in the next winter. The eruption spawned a sulfuric aerosol cloud -- called the "Laki haze" -- that lingered over most of the Northern Hemisphere in 1783. Reports from across Europe included lower visibility and the smell of sulfur or hydrogen sulfide. The air pollution was linked to reports of headaches, respiratory issues and asthma attacks, along with acid rain damage to trees and crops, the study notes. More than 60 percent of Iceland's livestock died within a year, and about 20 percent of the people died in a famine. Reports of increased death rates and/or respiratory disorders crisscrossed Europe. "Understanding the causes of these climate anomalies is important not only for historical purposes, but also for understanding and predicting possible climate responses to future high-latitude volcanic eruptions," Robock said. "Our work tells us that even with a large eruption like Laki, it will be impossible to predict very local climate impacts because of the chaotic nature of the atmosphere." Scientists continue to work on the potential impacts of volcanic eruptions on people through the Volcanic Impacts on Climate and Society project. The Laki eruption will be included in their research. Volcanic eruptions can have global climate impacts lasting several years. The study's lead author is Brian Zambri, a former post-doctoral associate who earned his doctorate at Rutgers and is now at the Massachusetts Institute of Technology. Scientists at the National Center for Atmospheric Research and University of Cambridge contributed to the study. | Pollution | 2019
May 15, 2019 | https://www.sciencedaily.com/releases/2019/05/190515102137.htm | Unprecedented weakening of Asian summer monsoon | Rainfall from the Asian summer monsoon has been decreasing over the past 80 years, a decline unprecedented in the last 448 years, according to a new study. | The new research used tree ring records to reconstruct the Asian summer monsoon back to 1566. The study, published in an AGU journal, finds human-made atmospheric pollutants are likely the reason for the decline. The 80-year decline in the monsoon coincides with the ongoing boom in industrial development and aerosol emissions in China and the northern hemisphere that began around the end of World War II, according to the study's authors. Previous studies have looked at tree ring chronologies from this region but the new study "surpasses [previous dendrochronology studies] in terms of the timespan covered and the number of trees involved," said Steve Leavitt, a dendrochronologist at the University of Arizona in Tucson and a co-author of the new study. "We were able to gather nearly 450 years' worth of tree ring data with clear annual resolution from an area where tree ring growth correlates very strongly with rainfall." Nearly half of the world's population is affected by the Asian summer monsoon, which dumps the majority of the continent's rainfall in a few short, torrential months. Summer rainfall has been declining in recent decades, influencing water availability, ecosystems and agriculture from India to Siberia. Instrumental and observational records of monsoon strength and annual precipitation only go back about a hundred years. Long-term paleoclimate records are needed to help determine whether this decline is due to anthropogenic factors such as aerosol pollution or natural variation in the monsoon cycle. The new study uses an ensemble of 10 tree ring chronologies collected from the western Loess Plateau in north central China to track precipitation trends over the last 448 years. In wetter years, trees tend to grow thicker rings and precipitation records can be gleaned by measuring the thickness and density of the individual layers. "One of the primary advantages of using tree rings to study precipitation is the annual resolution and the exact dating," Leavitt said. The tree rings captured drought periods such as the one that struck in 1928 and 1929 that led to widespread famine where more than 500,000 people died in China alone. The findings were also cross-checked with Chinese historical records of locust plagues, which tend to occur in drought years, as well as previously published tree ring chronologies. The new study found that the 80-year declining rainfall trend is unprecedented in the last 450 years, with more thin growth rings in the last 80 years than in any other period. Prior to the 1940s, drought periods tended to be intermittent and shorter in duration, with no other decades-long declines since 1566. Several factors are thought to affect the strength of the Asian Summer Monsoon, including solar variability, volcanic eruptions and anthropogenic aerosols. The study's authors used climate models to show sulfate aerosols -- atmospheric pollutants that cause haze -- are likely the dominant forcing agent controlling the decline of the Asian Summer Monsoon over the past 80 years. The study is an important data point in the ongoing quest to better understand the past and future of the global monsoon systems that deliver much of the world's precipitation, says Liviu Giosan, a paleoclimatologist at Woods Hole Oceanographic Institution in Massachusetts, who was not involved in the new study. "Monsoons are notoriously difficult to model and predict due to the high degree of regional variability," he said. "To learn more about the future, we need to better understand the past," Giosan said. "More of these kinds of studies that show [monsoon activity] over entire regions will help us better understand how the Asian Summer Monsoon functions as a whole, synoptically, over the entire continent." | Pollution | 2019
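The calibration-and-reconstruction step described here, fitting ring growth against the overlapping instrumental rainfall record and then extending the relationship back in time, can be illustrated with a minimal regression sketch. This is a generic illustration on synthetic data, not the authors' chronology or code.

```python
# Minimal tree-ring calibration sketch: regress ring-width index on
# rainfall over the instrumental period, then reconstruct earlier years.
# All data below are synthetic; not the study's Loess Plateau chronology.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1566, 2015)                             # ~449 growth years
rain_mm = 500 + 80 * rng.standard_normal(years.size)      # "true" rainfall
ring_index = 0.002 * rain_mm + 0.05 * rng.standard_normal(years.size)

instrumental = years >= 1950                              # overlap with gauges
slope, intercept = np.polyfit(ring_index[instrumental],
                              rain_mm[instrumental], 1)

rain_reconstructed = slope * ring_index + intercept       # full-length estimate
error = np.abs(rain_reconstructed - rain_mm).mean()
print(f"Mean reconstruction error: {error:.0f} mm")
```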
May 14, 2019 | https://www.sciencedaily.com/releases/2019/05/190514143246.htm | Electrode's 'hot edges' convert carbon dioxide gas into fuels and chemicals | A team of scientists has created a bowl-shaped electrode with 'hot edges' which can efficiently convert CO2 into fuels and chemicals. | The research team, from the University of Bath, Fudan University, Shanghai, and the Shanghai Institute of Pollution Control and Ecological Security, hopes the catalyst design will eventually allow the use of renewable electricity to convert CO2. Using this reaction, known as the reduction of carbon dioxide, has exciting potential, but two major obstacles are poor conversion efficiency of the reaction and a lack of detailed knowledge about the exact reaction pathway. This new electrode addresses these challenges with higher conversion efficiency and sensitive detection of molecules created along the reaction's progress -- thanks to its innovative shape and construction. The bowl-shaped electrode works six times faster than standard planar -- or flat -- designs. The bowl-like shape of the design, technically known as an "inverse opal structure," concentrates electric fields on its hot edges -- the rim of the bowl -- which then concentrates positively charged potassium ions on the active sites of the reaction, reducing its energy requirements. The copper-indium alloy electrode can also be used to sensitively study the reaction process by measuring the Raman signal, which is stronger than that of a typical electrode. The team wants to continue research to develop the most efficient catalyst to perform carbon reduction. Professor Ventsislav Valev, from the University of Bath's Department of Physics, said: "There is no more pressing human need than breathing. Yet for hundreds of millions of people this most basic activity is a source of anxiety over lowering life expectancy, rising child mortality and climate change. There is evidence that CO2..." Professor Liwu Zhang, from Fudan University, said: "CO2..." "However, to improve the efficiency of transforming CO2..." "Just as plants transform CO2..." The study was funded by the Ministry of Science and Technology of the People's Republic of China, and National Natural Science Foundation of China, the Engineering and Physical Sciences Research Council (EPSRC) Centre for Doctoral Training in Condensed Matter Physics (CDT-CMP), and the Royal Society. | Pollution | 2019
May 14, 2019 | https://www.sciencedaily.com/releases/2019/05/190514081738.htm | Plastic pollution harms the bacteria that help produce the oxygen we breathe | Ten per cent of the oxygen we breathe comes from just one kind of bacteria in the ocean. Now laboratory tests have shown that these bacteria are susceptible to plastic pollution, according to a newly published study. | "We found that exposure to chemicals leaching from plastic pollution interfered with the growth, photosynthesis and oxygen production of Prochlorococcus," says lead author Dr Sasha Tetu. "Now we'd like to explore if plastic pollution is having the same impact on these microbes in the ocean." Plastic pollution has been estimated to cause more than US$13 billion in economic damage to marine ecosystems each year, and the problem is only getting worse with marine plastic pollution estimated to outweigh fish by 2050. "This pollution can leach a variety of chemical additives into marine environments, but unlike the threats posed by animals ingesting or getting entangled in plastic debris the threat these leachates pose to marine life has received relatively little attention," says Dr Lisa Moore, a co-author on the paper. In the first study of its kind, the researchers looked at the effects these chemicals have on the smallest life in our oceans, photosynthetic marine bacteria. "We looked at a group of tiny, green bacteria called Prochlorococcus." These microbes are heavy lifters when it comes to carbohydrate and oxygen production in the ocean via photosynthesis. "These tiny microorganisms are critical to the marine food web, contribute to carbon cycling and are thought to be responsible for up to 10 per cent of the total global oxygen production," says Lisa, explaining the fundamental importance of these microbes to ocean health. "So one in every ten breaths of oxygen you breathe in is thanks to these little guys, yet almost nothing is known about how marine bacteria, such as Prochlorococcus..." In the lab, the team exposed two strains of Prochlorococcus to chemicals leaching from plastic. They found that exposure to these chemicals impaired the growth and function of these microbes -- including the amount of oxygen they produce -- as well as altering the expression of a large number of their genes. "Our data shows that plastic pollution may have widespread ecosystem impacts beyond the known effects on macro-organisms, such as seabirds and turtles," says Sasha. "If we truly want to understand the full impact of plastic pollution in the marine environment and find ways to mitigate it, we need to consider its impact on key microbial groups, including photosynthetic microbes." | Pollution | 2019
May 8, 2019 | https://www.sciencedaily.com/releases/2019/05/190508163544.htm | Low oxygen levels could temporarily blind marine invertebrates | Scientists at Scripps Institution of Oceanography at the University of California San Diego have found that low oxygen levels in seawater could blind some marine invertebrates. | These results were published recently. Oxygen levels in the ocean are changing globally from natural and human-induced processes. Many marine invertebrates depend on vision to find food, shelter, and avoid predators, particularly in their early life stages when many are planktonic. This is especially true for crustaceans and cephalopods, which are common prey items for other animals and whose larvae are highly migratory in the water column. Research on terrestrial animals has shown that low oxygen levels can affect vision. In fact, humans can lose visual function in low oxygen conditions. Pilots flying at high altitude, for instance, have been shown to experience vision impairment if aircraft fail to supplement cockpits with additional oxygen. Additionally, health problems such as high blood pressure and strokes, both associated with oxygen loss, can damage vision. "With all of this knowledge about oxygen affecting vision in land animals, I wondered if marine animals would react in a similar manner," said Lillian McCormick, lead author of the National Science Foundation-funded study and PhD student at Scripps Oceanography. Her results shocked her. Studying four local California marine invertebrates -- market squid, two-spot octopus, tuna crab, and a brachyuran crab -- she found that vision was reduced by 60-100 percent under low-oxygen conditions. Using larvae collected in the waters off Scripps, McCormick tested the acute response -- the short-term reaction to exposure to reduced oxygen -- in the vision of the larvae. She worked with Nicholas Oesch, a researcher at the UC San Diego Department of Psychology, to develop a setup for such small specimens. "Most of the work in the lab is geared towards addressing biomedical questions in mammalian vision," said Oesch. "So it has been fun to step out to less traditional model systems and apply our techniques to a completely different field." Placed on a microscope stage with flowing seawater of gradually reduced oxygen levels, the larvae were exposed to light conditions that McCormick could use to elicit visual responses. She measured these responses using electrodes connected to the retina of the larvae. This technique is called an electroretinogram. "Imagine the device as an EKG machine for the eye," said McCormick. "Instead of measuring electrical activity in the heart, we're looking at the part of the eye called the retina." As soon as the oxygen availability began decreasing from well-oxygenated levels, such as are found at the surface of the ocean, McCormick saw an immediate response from the larvae. This was especially true in the brachyuran crab and squid, which lost almost all of their vision at the lowest oxygen conditions tested, about 20 percent of surface oxygen levels. Octopuses held out longer, with retinal responses only declining after oxygen was reduced to a certain level, while the tuna crabs were quite resilient. Adult tuna crabs are known to tolerate low-oxygen waters. "I was surprised to see that even within a few minutes of being exposed to low oxygen, some of these species became practically blind," said McCormick. Fortunately, when oxygen levels were restored, most of the specimens recovered some visual function, indicating that the damage may not be permanent for short-term periods of low oxygen. McCormick is interested in how this reduced vision could affect animal behavior, especially in those that experience the most dramatic vision loss. These animals rely on cues from light, and an inability to detect these cues could affect their survival. One example is migration. Larvae of these species migrate vertically, sinking to deeper depths during the day and ascending to the surface at night, and use changes in light intensity as their migration cue. Additionally, the larvae rely on vision for finding prey and avoiding predators. Squid larvae hunt fast-swimming prey, like copepods, and their vision is crucial for this. The response speed of the retina in squid larvae was slowed during exposure to low oxygen, indicating that this visual impairment may inhibit the larvae's ability to detect copepods and feed. Losing the ability to react to changes in light intensity -- such as a shadow of a predator -- or to see prey might decrease survival in these highly visual larvae. Market squid in San Diego may be particularly susceptible because they lay their eggs in areas prone to low oxygen, like the seafloor near the canyon off La Jolla. In the marine environment, oxygen levels change over daily, seasonal, and inter-annual time scales. There are also large fluctuations of oxygen with depth. However, these conditions are changing due to human-influenced climate change and even pollution. Atmospheric warming is changing temperatures in the ocean as well, which decreases the mixing of well-oxygenated surface waters to deeper waters. Additionally, nearshore environments are increasingly losing oxygen in a process called eutrophication, in which excessive nutrients in the water fuel a bloom of plankton that then depletes available oxygen dissolved in the water. This can lead to die-offs of fish and other marine animals. Eutrophication is often the result of coastal pollution, like runoff from agriculture or sewage. Oxygen losses are especially pronounced in areas of naturally occurring low oxygen and upwelling, such as off the coast of California. These results are based on acute responses, and McCormick is curious how long-term exposure to low oxygen could affect these animals in the wild. Her future work will test visual behaviors under different oxygen conditions, as well as compare the results from physiology and behavior studies to oxygen and light conditions in the ocean over time. | Pollution | 2019
May 3, 2019 | https://www.sciencedaily.com/releases/2019/05/190503100816.htm | Crowd oil: Fuels from air-conditioning systems | Researchers at the Karlsruhe Institute of Technology (KIT) and the University of Toronto have proposed a method enabling air conditioning and ventilation systems to produce synthetic fuels from carbon dioxide (CO2). | To prevent the disastrous effects of global climate change, human-made greenhouse gas emissions must be reduced to "zero" over the next three decades. This is clear from the current special report of the Intergovernmental Panel on Climate Change (IPCC). The necessary transformation poses a huge challenge to the global community: entire sectors such as power generation, mobility, or building management must be redesigned. In a future climate-friendly energy system, synthetic energy sources could represent an essential building block: "If we use renewable wind and solar power as well as carbon dioxide directly from the ambient air to produce fuels, large amounts of greenhouse gas emissions can be avoided," says Professor Roland Dittmeyer from the Institute for Micro Process Engineering (IMVT) at KIT. Due to the low CO2 concentration in ambient air, capturing useful quantities of the gas requires moving very large volumes of air -- something ventilation and air-conditioning systems already do. "We want to use the synergies between ventilation and air-conditioning technology on the one hand and energy and heating technology on the other to reduce the costs and energy losses in synthesis. In addition, 'crowd oil' could mobilize many new actors for the energy transition. Private photovoltaic systems have shown how well this can work." However, the conversion of CO2 into fuels requires a chain of demanding process steps. In a joint publication, the researchers outline the proposed process chain. The team can rely on preliminary investigations of the individual process steps and process simulations, among others from the Kopernikus project P2X of the Federal Ministry of Education and Research. On this basis, the scientists expect an energy efficiency -- i.e. the proportion of electrical energy used that can be converted into chemical energy -- of around 50 to 60 percent. In addition, they expect carbon efficiency -- i.e. the proportion of spent carbon atoms found in the fuel produced -- to range from around 90 to almost 100 percent. In order to confirm these simulation results, IMVT researchers and project partners are currently building up the fully integrated process at KIT, with a planned CO2 conversion demonstration. At the same time, however, the scientists have found that the proposed concept -- even if it were introduced all over Germany -- would not be able to fully meet today's demand for crude oil products. Reducing the demand for liquid fuels, for example through new mobility concepts and the expansion of local public transport, remains a necessity. Although the components of the proposed technology, such as the plants for CO2 capture, largely exist, the integrated system has yet to prove itself at scale. | Pollution | 2,019
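The scale involved is easy to sanity-check with back-of-envelope arithmetic. In the sketch below, only the 50-60 percent energy efficiency and roughly 90 percent carbon efficiency come from the article; the airflow, capture efficiency, and fuel stoichiometry are assumptions made purely for illustration.

```python
# Back-of-envelope yield estimate for fuel synthesis from CO2 captured out of
# building ventilation air. All building-specific inputs are assumptions.
CO2_PPM = 410e-6            # ambient CO2 mole fraction (assumed ~410 ppm)
AIR_DENSITY = 1.2           # kg of air per m^3 at room temperature
M_CO2, M_AIR = 44.0, 28.97  # molar masses, g/mol

# Mass of CO2 carried by each cubic meter of air (kg), roughly 7.5e-4
co2_per_m3 = CO2_PPM * (M_CO2 / M_AIR) * AIR_DENSITY

airflow = 10.0              # m^3/s moved by a large HVAC system (assumption)
capture_eff = 0.5           # fraction of CO2 actually captured (assumption)
carbon_eff = 0.9            # carbon efficiency, lower end of article's range
co2_per_kg_fuel = 44.0 / 14.0  # kg CO2 per kg of (CH2)n-type hydrocarbon

co2_per_day = co2_per_m3 * airflow * capture_eff * 86400   # kg CO2 per day
fuel_per_day = co2_per_day * carbon_eff / co2_per_kg_fuel  # kg fuel per day

print(f"CO2 captured: {co2_per_day:.0f} kg/day")
print(f"Synthetic fuel: {fuel_per_day:.0f} kg/day")
```

Even generous assumptions yield on the order of 100 kg of fuel per day per large building, consistent with the article's caution that the concept alone could not cover national demand for crude oil products.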
May 2, 2019 | https://www.sciencedaily.com/releases/2019/05/190502161652.htm | What happens when schools go solar? Overlooked benefits | Sunshine splashing onto school rooftops and campuses across the country is an undertapped resource that could help shrink electricity bills, new research suggests. | The study appears in the April issue of a peer-reviewed journal. Beyond shrinking electricity bills, solar panels could help schools unplug from grids fed by natural gas and coal power plants that produce particulate matter, sulfur dioxide and nitrogen oxides -- air pollutants that can contribute to smog and acid rain as well as serious health consequences including heart attacks and reduced lung function. "This is an action we can take that benefits the environment and human health in a real, meaningful way," said Stanford behavioral scientist Gabrielle Wong-Parodi, an author of the study. New solar projects may easily slip down the list of priorities in a time of widespread protests by teachers calling for increased school funding, smaller class sizes and higher wages. But the U.S. Department of Energy estimates K-12 schools spend more than $6 billion per year on energy, and energy costs in many districts are second only to salaries. In the higher education sector, yearly energy costs add up to more than $14 billion. The current paper suggests investments in the right solar projects -- with the right incentives from states -- could free up much-needed money in schools' budgets. "Schools are paying for electricity anyway," said Wong-Parodi, an assistant professor of Earth system science at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "This is a way, in some cases, that they can reduce their costs. If there's a rebate or a subsidy, it can happen more quickly." Educational institutions account for approximately 11 percent of energy consumption by U.S. buildings and 4 percent of the nation's carbon emissions. But while the potential for solar panels on homes and businesses has been widely studied, previous research has largely skipped over school buildings. The new estimates are based on data for 132,592 schools, including more than 99,700 public and 25,700 private K-12 schools, as well as nearly 7,100 colleges and universities. The researchers began by estimating the rooftop area available for solar panels at each institution, the hourly electricity output given the amount of sunshine at the site and the hourly electricity demand of each institution. Not surprisingly, the study finds three large, sunny states -- Texas, California and Florida -- have the greatest potential for generating electricity from solar panels on school rooftops, with nearly 90 percent of institutions having at least some roof space suitable for installations. Meanwhile, residents in midwestern states including Wisconsin and Ohio stand to see the biggest reductions in key air pollutants -- and costs associated with addressing related health effects -- if schools switch from the grid to solar power. Beyond measurable effects on air pollution and electricity bills, solar installations can also provide new learning opportunities for students. Some schools are already using data from their on-site solar energy systems to help students grapple with fractions, for example, or see firsthand how shifting panel angles can affect power production.
"It takes this abstract idea of renewables as something that can reduce greenhouse gas emissions and brings it home," Wong-Parodi said.According to the study, it's not economically viable for educational institutions to purchase rooftop solar systems outright in any state. Rather, the projects can make financial sense for schools if they contract a company to install, own and operate the system and sell electricity to the school at a set rate.Nationwide, the researchers project benefits stemming from an all-out push for solar installations on school buildings could be worth as much as $4 billion per year, if each ton of carbon released to the air is assumed to cost society $40 and the value of a statistical human life -- in the way that regulators and economists calculate it -- is pegged at $10 million. The estimated benefits capture the cost of premature deaths and other health impacts linked to air pollution from power plants.The group's estimates do not account for environmental and health impacts tied to international mining and transport of raw materials, or to manufacturing and disposal of solar panels. Such a holistic view, they write, "may yield quite different results."Zeroing in on likely impacts within the United States, the researchers conclude that nearly all states could reap value from school solar projects far greater than the amount they're spending on subsidies and rebates. The study shows that's true even when factoring in typical costs for installation, maintenance, operation and routine hardware replacements."There is an argument for increasing the level of incentives to increase adoption of solar panels by the educational sector," said study author Inês Azevedo, who co-directs Carnegie Mellon University's Center for Climate and Energy Decision Making and will be joining Stanford Earth's faculty in July 2019.California and New York, however, are exceptions. In those two states, the researchers concluded that currently available rebates exceed the financial, health, environmental and climate change benefits provided to society by rooftop solar systems on schools -- at least at today's prices for offsetting carbon emissions through other means."California and New York are doing a fantastic job of incentivizing solar, but we still don't see 100 percent penetration," Wong-Parodi said. "A good use of their time and resources may be to evaluate all the schools that don't have it yet, and try to understand why." | Pollution | 2,019 |
May 2, 2019 | https://www.sciencedaily.com/releases/2019/05/190502143516.htm | An evolutionary rescue in polluted waters | The combination of a big population, good genes and luck helps explain how a species of fish in Texas' Houston Ship Channel was able to adapt to what normally would be lethal levels of toxins for most other species, according to a study to be published May 3. | The exceptional survivor story of the Gulf killifish was one scientists at the University of California, Davis, Baylor University and their co-authoring colleagues wanted to unveil so they could learn more about what other species may need to adapt to drastically changed environments. The minnow-like Gulf killifish are an important part of the food web for a number of larger fish species in coastal marsh habitats. "Most species don't survive radically altered environments," said corresponding author Andrew Whitehead, a UC Davis professor of environmental toxicology. "By studying the survivors, we get insight into what it takes to be successful. In the case of the killifish, it came down to huge population sizes and luck." The researchers sequenced the genomes of hundreds of Gulf killifish living across a spectrum of toxicity -- from clean water to moderately polluted and very polluted water. They were searching for the footprints of natural selection that allowed the species to rapidly transition from a fish that is highly sensitive to pollution to one extremely resistant to it. They were surprised to find that the adaptive DNA that rescued this Gulf Coast species came from an Atlantic Coast species of killifish, which has also been known to rapidly evolve high levels of pollution resistance. But Atlantic Coast killifish live at least 1,500 miles from their Houston brethren, leaving researchers to think their transport to the Gulf was likely an accident initiated by humans. Nonnative species can wreak environmental havoc on native species and habitats. But in this case, their arrival in the 1970s -- right at a moment when Gulf killifish were likely beginning to decline -- amounted to an "evolutionary rescue" from pollution for the Gulf killifish. "While the vast majority of research on invasive species rightly focuses on the environmental damage they can cause, this research shows that under rare circumstances they can also contribute valuable genetic variation to a closely related native species, thus acting as a mechanism of evolutionary rescue," said co-corresponding author Cole Matson, an associate professor at Baylor University. Gulf killifish began with many advantages other species do not have. Species with large populations can harbor high levels of genetic diversity that can help them adapt to rapid change. Gulf killifish already had among the highest levels of genetic diversity of any species with a backbone. Then, at the moment its population was beginning to decline, a long-distant relative -- the Atlantic Coast killifish -- came to visit, was able to successfully mate, and injected the Gulf species with genetic resources that helped it develop resilience and resistance to toxins. Whitehead is quick to note that not all species are so lucky. "The adaptation of these killifish is a cautionary tale," Whitehead said. "It tells us what we need to do better for the vast majority of species that don't have access to the kind of genetic resources killifish have. If we care about preserving biodiversity, we can't expect evolution to be the solution.
We need to reduce how much and how quickly we're changing the environment so that species can keep up." Humans are not only radically changing the environment, we are also fragmenting it, making it harder for animals to move throughout their range. Whitehead said a key lesson from killifish is the importance of keeping the doors to genetic diversity open. This includes connecting and preserving landscapes to allow for genetic variation to move more freely and naturally. That could help set the stage for more evolutionary "rescues" in the rapidly changing future. Additional coauthoring institutions include the University of Connecticut and Indiana University. The work was supported by funding from a C. Gus Glasscock Endowed Research Fellowship, Baylor University, the Exxon-Valdez Oil Spill Trustee Council, National Science Foundation, National Institutes of Environmental Health Sciences and Indiana University. | Pollution | 2,019
May 2, 2019 | https://www.sciencedaily.com/releases/2019/05/190502143359.htm | Localized efforts to save coral reefs won't be enough, study suggests | A National Science Foundation study of factors that cause corals stress suggests that localized attempts to curb pollution on reefs won't save them without a worldwide effort to reduce global warming. | Findings by researchers at Oregon State University and the University of California, Santa Barbara were published today. Ocean habitats are increasingly under human-caused stress in the forms of pollution and global warming. Coral reefs are found in less than 1 percent of the ocean but are home to nearly one-quarter of all known marine species. Reefs also help regulate the sea's carbon dioxide levels and are a crucial hunting ground that scientists use in the search for new medicines. Corals are home to a complex composition of dinoflagellates, fungi, bacteria and archaea that together make up the coral microbiome. Shifts in microbiome composition are connected to changes in coral health. Rebecca Maher, a graduate research fellow in the OSU College of Science, led the study, which involved coral samples collected off the coast of Moorea, a South Pacific island that's part of French Polynesia. The corals examined in tank experiments by the scientists, who included Oregon State's Rebecca Vega Thurber and Ryan McMinds, were Pocillopora meandrina, commonly known as cauliflower corals. "We subjected the corals to three stressors: increased temperature, nutrient enrichment -- meaning pollution -- and manual scarring," Maher said. "We scarred the corals with pliers, which was meant to simulate fish biting the coral." The scientists then studied how these stressors can interact to negatively affect the coral microbiome and thus coral health. "We found that with every form of stress, the amount of 'friendly' bacteria decreases in the coral and the amount of 'unfriendly' or disease-related bacteria increases," Maher said. "Stressed corals had more unstable microbiomes, possibly leading to more disease and coral death." The researchers were surprised to learn that a pair of different stressors unleashed on the corals at the same time didn't necessarily result in twice the stress -- in fact, sometimes there was less effect from two stressors than one. But all three stressors at play together seemed to fuel each other. "Two stressors did not always compound each other's negative effects but instead interacted antagonistically to produce less-than-additive effects on changes in microbial community distinctness, instability and diversity," Maher said. "However, when three forms of stress were experienced by corals, the microbiomes dramatically changed, showing that stress can act synergistically to amplify the negative effects of single stressors." The simulated fish bites proved a significant environmental stressor, but "high temperature seemed to be the nail in the coffin." "There is no magical number of stressors, but multiple stressors may interact in ways that we would not expect and that can depend on the type of stressor -- human vs. environmental -- or the severity of the stress," Maher said. "Therefore, we should take care to understand these interactions before attempting to manage them with conservation actions.
Our work is an important step in informing those actions by providing insights into how the coral and its microbiome will change under increasing human impacts." In addition to the National Science Foundation, the Riverbanks Zoo & Garden Conservation Support Fund supported this research. | Pollution | 2,019
May 2, 2019 | https://www.sciencedaily.com/releases/2019/05/190502100943.htm | India could meet air quality standards by cutting household fuel use | India could make a major dent in air pollution by curbing emissions from dirty household fuels such as wood, dung, coal and kerosene, shows a new analysis led by researchers at the University of California, Berkeley and the Indian Institute of Technology. | Eliminating emissions from these sources -- without any changes to industrial or vehicle emissions -- would bring the average outdoor air pollution levels below the country's air quality standard, the study shows. Mitigating the use of household fuels could also reduce air pollution-related deaths in the country by approximately 13%, which is equivalent to saving about 270,000 lives a year. "Household fuels are the single biggest source of outdoor air pollution in India," said Kirk R. Smith, professor of global environmental health at UC Berkeley and director of the Collaborative Clean Air Policy Centre. "We looked at what would happen if they only cleaned up households, and we came to this counterintuitive result that the whole country would reach national air pollution standards if they did that." Smith is co-author of a paper describing the analysis that appeared this week. Americans usually associate air pollution with smokestacks and car exhaust pipes. But in many rural areas of the world where electricity and gas lines are scarce, the bulk of air pollution originates from burning biomass, such as wood, cow dung or crop residues to cook and heat the home, and from burning kerosene for lighting. As of early 2016, nearly half of the Indian population was reliant on biomass for household fuel. In addition to generating greenhouse gases like carbon dioxide and methane, these dirty fuels kick out chemicals and other fine particulate matter that can stick in the lungs and trigger a whole host of diseases, including pneumonia, heart disease, stroke, lung cancer and chronic obstructive pulmonary disease. "There are 3,000 chemicals that have been identified in wood smoke, and taken at a macro level, it is very similar to tobacco smoke," Smith said. In 2015, India's average annual air pollution level was 55 micrograms per cubic meter (ug m-3) of fine particulate matter. Levels in New Delhi -- by many estimates, the most polluted city in the world -- often soared beyond 300 ug m-3. By comparison, fine particulate matter in the San Francisco Bay Area peaked at around 200 ug m-3 during the 2018 Camp Fire. Complete mitigation of biomass as fuel -- which could be achieved through widespread electrification and distribution of clean-burning propane to rural areas -- would cut India's average annual air pollution to 38 ug m-3, just below the country's National Ambient Air Quality Standard of 40 ug m-3. While this is still far above the World Health Organization (WHO) standard of 10 ug m-3, it could still have dramatic impacts on the health of the country's residents, Smith said. "You can't have a clean environment when about half the houses in India are burning dirty fuels every day," Smith said. "India has got to do other things to fix air pollution -- they've got to stop garbage burning, they've got to control the power plants, they've got to control vehicles and so forth.
But they need to recognize the fact that households are very important contributors to outdoor air pollution, too." In 2016, India instituted a national program to distribute clean burning stoves and propane to 80 million impoverished households, or about 500 million people. The rationale behind this program was to prevent illness due to cooking and heating smoke trapped within the home. However, Smith hopes the study's findings will bolster support for reducing outdoor air pollution, as well. Similar programs have been successful in China, where air pollution is now on the decline in 80 cities. "We've realized that pollution may start in the kitchen, but it doesn't stay there -- it goes outside, it goes next door, it goes down the street and it becomes part of the general outdoor air pollution," Smith said. While curbing the use of dirty household fuels will reduce emissions of health-damaging fine particulate matter, it's not clear what effect the change will have on the emissions of greenhouse gases that cause climate change, Smith says. That's because both "dirty" fuels, like biomass, and "clean" fuels, like propane, emit carbon dioxide when burned. And though it may come as a surprise to many Americans, air pollution from wood burning is still a problem here, too. "Wood smoke is actually the chief cause of air pollution in the Bay Area, because we've cleaned up everything else, but we haven't done anything about fireplaces," Smith said. "And people don't even need fireplaces for heating, they just like them. I like them, too." | Pollution | 2,019
May 1, 2019 | https://www.sciencedaily.com/releases/2019/05/190501141112.htm | Human influence on global droughts goes back 100 years, NASA study finds | Human-generated greenhouse gases and atmospheric particles were affecting global drought risk as far back as the early 20th century, according to a study from NASA's Goddard Institute for Space Studies (GISS) in New York City. | The team said the study is the first to provide historical evidence connecting human-generated emissions and drought at near-global scales, lending credibility to forward-looking models that predict such a connection. According to the new research, the fingerprint is likely to grow stronger over the next few decades, potentially leading to severe human consequences. The study's key drought indicator was the Palmer Drought Severity Index, or PDSI. The PDSI averages soil moisture over the summer months using data such as precipitation, air temperature and runoff. While today NASA measures soil moisture from space, these measurements only date back to 1980. The PDSI provides researchers with average soil moisture over long periods of time, making it especially useful for research on climate change in the past. The team also used drought atlases: maps of where and when droughts happened throughout history, calculated from tree rings. Tree rings' thickness indicates wet and dry years across their lifespan, providing an ancient record to supplement written and recorded data. "These records go back centuries," said lead author Kate Marvel, an associate research scientist at GISS and Columbia University. "We have a comprehensive picture of global drought conditions that stretch back way into history, and they are amazingly high quality." Taken together, modern soil moisture measurements and tree ring-based records of the past create a data set that the team compared to the models. They also calibrated their data against climate models run with atmospheric conditions similar to those in 1850, before the Industrial Revolution brought increases in greenhouse gases and air pollution. "We were pretty surprised that you can see this human fingerprint, this human climate change signal, emerge in the first half of the 20th century," said Ben Cook, climate scientist at GISS and Columbia University's Lamont-Doherty Earth Observatory in New York City. Cook co-led the study with Marvel. The story changed briefly between 1950 and 1975, as the atmosphere became cooler and wetter. The team believes this was due to aerosols, or particles in the atmosphere. Before the passage of air quality legislation, industry expelled vast quantities of smoke, soot, sulfur dioxide and other particles that researchers believe blocked sunlight and counteracted greenhouse gases' warming effects during this period. Aerosols are harder to model than greenhouse gases, however, so while they are the most likely culprit, the team cautioned that further research is necessary to establish a definite link. After 1975, as pollution declined, global drought patterns began to trend back toward the fingerprint. It does not yet match closely enough for the team to say statistically that the signal has reappeared, but they agree that the data trends in that direction. What made this study innovative was seeing the big picture of global drought, Marvel said. Individual regions can have significant natural variability year to year, making it difficult to tell whether a drying trend is due to human activity.
Combining many regions into a global drought atlas meant there was a stronger signal if droughts happened in several places simultaneously. "If you look at the fingerprint, you can say, 'Is it getting dry in the areas it should be getting drier? Is it getting wetter in the areas it should be getting wetter?'" she said. "It's climate detective work, like an actual fingerprint at a crime scene is a unique pattern." Previous assessments from national and international climate organizations have not directly linked trends in global-scale drought patterns to human activities, Cook said, mainly due to lack of data supporting that link. He suggests that, by demonstrating a human fingerprint on droughts in the past, this study provides evidence that human activities could continue to influence droughts in the future. "Part of our motivation was to ask, with all these advances in our understanding of natural versus human caused climate changes, climate modeling and paleoclimate, have we advanced the science to where we can start to detect human impact on droughts?" Cook said. His answer: "Yes." Models predict that droughts will become more frequent and severe as temperatures rise, potentially causing food and water shortages, human health impacts, destructive wildfires and conflicts between peoples competing for resources. "Climate change is not just a future problem," said Cook. "This shows it's already affecting global patterns of drought, hydroclimate, trends, variability -- it's happening now. And we expect these trends to continue, as long as we keep warming the world." | Pollution | 2,019
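The fingerprint comparison described above can be illustrated with a toy calculation: each year's observed drought-index map is projected onto a model-predicted spatial pattern, and the trend in that projection is tested against what noise alone would produce. This is a generic sketch of the detection-and-attribution approach on synthetic data, not the authors' code or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_regions = 100, 50

# Hypothetical model-derived fingerprint: which regions dry (-) or wet (+)
fingerprint = rng.normal(size=n_regions)
fingerprint /= np.linalg.norm(fingerprint)

# Synthetic "observations": noise plus a slowly strengthening fingerprint
trend = np.linspace(0.0, 1.5, n_years)[:, None] * fingerprint[None, :]
obs = trend + rng.normal(scale=1.0, size=(n_years, n_regions))

# Signal = projection of each year's drought map onto the fingerprint
signal = obs @ fingerprint

# Compare the signal's trend to trends arising from noise alone
years = np.arange(n_years)
obs_trend = np.polyfit(years, signal, 1)[0]
null_trends = [np.polyfit(years,
                          rng.normal(size=(n_years, n_regions)) @ fingerprint,
                          1)[0]
               for _ in range(1000)]
p_value = np.mean(np.abs(null_trends) >= abs(obs_trend))
print(f"signal trend: {obs_trend:.4f} per year, p ~ {p_value:.3f}")
```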
May 1, 2019 | https://www.sciencedaily.com/releases/2019/05/190501114438.htm | Diagnosing urban air pollution exposure with new precision | A new review of studies on levels of urban exposure to airborne pollutants and their effects on human health suggests that advanced instrumentation and information technology will soon allow researchers and policymakers to gauge the health risks of air pollution on an individual level. | In New York City alone, the economic impact of premature death from causes related to air pollution, including asthma and other respiratory conditions and cardiovascular complications, exceeds $30.7 billion a year. Globally, 4.2 million deaths per year are attributable to airborne pollution, making it the fifth-ranking mortality risk factor according to a 2015 study published in the Lancet. An interdisciplinary research team from New York University, led by Masoud Ghandehari, an associate professor in NYU Tandon's Department of Civil and Urban Engineering and the Center for Urban Science and Progress (CUSP), published a comprehensive review of recent efforts to assess the impact of air pollution exposure in cities. Ghandehari's co-authors are Andrew Caplin, Silver Professor in the NYU Department of Economics; Paul Glimcher, Silver Professor and professor of neural science and psychology; George Thurston, NYU School of Medicine professor in the Departments of Environmental Medicine and Population Health; and Chris Lim, a recent Ph.D. graduate of the School of Medicine. In their paper, the authors argue that advanced sensing and information technologies can be used to even greater advantage, offering the potential for far more granular assessments -- at the level of the individual. "One of the questions we want to answer is how different people experience pollution, and why?" Ghandehari said. He explained that population-level assessments overlook factors such as personal mobility -- including commuting by car, bus, bicycle, or on foot -- and often do not consider indoor climate control conditions or life stage. For example, students and working adults are more mobile than older people and are therefore more exposed, while children experience lifelong adversities. Socioeconomic status is also a known factor for increased exposure to airborne pollutants as well as increased risk of asthma and cardiovascular disease. "People from all points on the economic spectrum live in polluted areas, yet they often have different health outcomes," Ghandehari said. "Using technology to study individual associations between air pollution and health outcomes -- rather than group associations -- will yield evidence-based arguments for change that would particularly impact individuals at higher risk of negative health impacts." | Pollution | 2,019
May 1, 2019 | https://www.sciencedaily.com/releases/2019/05/190501082004.htm | Environmental pollutants could impact cellular signs of aging | Researchers have linked some environmental pollutants with diseases, a decreased life span and signs of premature aging, such as wrinkles and age spots. But can accelerated aging be detected at the cellular level in healthy people exposed to pollutants? Now, researchers reporting in an ACS journal have examined that question. | Some environmental pollutants cause mitochondria -- the cell's powerhouses -- to release more reactive oxygen species, which can damage the DNA in these organelles and lead to inflammation. Telomeres, the DNA-protein caps on the ends of chromosomes that allow them to continue dividing, are also sensitive to environmental stress. Shorter telomeres are a hallmark of aging, whereas abnormally long telomeres are often seen in cancer cells. Michelle Plusquin of Hasselt University and colleagues wondered if individual pollutants, or combinations of them, could affect mitochondrial DNA content or telomere length in people. To find out, the researchers analyzed various pollutants in blood and urine samples from 175 adults (50 to 65 years old) enrolled in the Flemish Environment and Health Study. The team determined mitochondrial DNA content and telomere length from the participants' blood cells. The researchers used multipollutant models to study all pollutants simultaneously, a novel approach in environmental sciences. They found that people with higher levels of urinary copper and serum perfluorohexane sulfonic acid had decreased mitochondrial DNA content, while higher urinary copper and serum perfluorooctanoic acid levels were linked with shorter telomeres. But some pollutants were associated with either higher mitochondrial DNA content or longer telomeres. These findings suggest that pollutants could impact molecular hallmarks of aging, though more research is needed to determine the mechanism and biological effects, the researchers say. | Pollution | 2,019
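A multipollutant model of the kind mentioned regresses the outcome on several exposures at once, so each estimate is adjusted for the other pollutants. The sketch below runs on synthetic data; the pollutant names match those in the article, but the effect sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 175  # same sample size as the study; the data here are entirely synthetic

# Hypothetical standardized exposure levels for three pollutants
copper, pfhxs, pfoa = rng.normal(size=(3, n))

# Synthetic telomere-length outcome: copper and PFOA shorten it in this toy
telomere = 1.0 - 0.15 * copper - 0.10 * pfoa + rng.normal(scale=0.3, size=n)

# Fit all exposures simultaneously (ordinary least squares)
X = np.column_stack([np.ones(n), copper, pfhxs, pfoa])
beta, *_ = np.linalg.lstsq(X, telomere, rcond=None)

for name, b in zip(["intercept", "copper", "PFHxS", "PFOA"], beta):
    print(f"{name:>9}: {b:+.3f}")
```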
April 30, 2019 | https://www.sciencedaily.com/releases/2019/04/190430173212.htm | New forecasting system alerts residents of New Delhi about unhealthy air | Residents of New Delhi and nearby heavily polluted areas of northern India now have access to air quality forecasts that provide critical information for reducing their exposure to potentially unhealthy air. | The new forecasting system, developed by the National Center for Atmospheric Research (NCAR) in collaboration with the Indian Institute of Tropical Meteorology (IITM) in Pune, has begun providing 72-hour forecasts of fine particulate matter, known as PM2.5. These tiny airborne particles, 2.5 microns or less in diameter, are a major concern because they are small enough to penetrate deep into the lungs or even the bloodstream, potentially causing significant respiratory and cardiac problems. The air pollution can become so extreme under typical wintertime meteorological conditions that officials in New Delhi have closed schools and restricted traffic on highly polluted days. The government of India has also formulated a graded response action plan to impose temporary controls on industries, power generation, and construction activities to avert severe air pollution episodes. "By developing this forecasting system, we are working to provide timely and accurate information to the public about forthcoming episodes of poor air quality," said NCAR's Rajesh Kumar, the lead scientist on the project. "It's critical to inform people so they can plan in advance to reduce their exposure to air pollutants that can affect their health." The system uses measurements of pollutants, computer modeling, and statistical techniques. It updates the forecast every 24 hours. Preliminary results indicate that it is accurately predicting day-to-day variability in PM2.5, giving officials and residents advance warning of unusually poor air quality. It does not always capture the precise levels of the pollutant, but Kumar believes they can improve the forecasting system to better predict that. The technology, which scientists will refine during a two-year research project in India, may eventually be adapted to provide air quality forecasts in other polluted areas in developing countries, as well as in the United States. The Ministry of Earth Sciences in India is funding the project. New Delhi ranks among the world's most polluted cities, according to the World Health Organization. It suffers from particularly high levels of PM2.5, a major threat to human health and economic activity throughout much of India and many parts of the developing world. Fine particulates are emitted from numerous sources, including agricultural fires, motor vehicles, and smokestacks. On days when atmospheric concentrations of PM2.5 in New Delhi soar to many times the level that is considered unhealthy, prolonged exposure to the toxic haze is equivalent to smoking two packs of cigarettes a day. During a particularly severe pollution episode in 2017, the state of Delhi's chief minister tweeted: "Delhi has become a gas chamber." Indian officials have turned to air quality forecasts in the past that drew on computer modeling of basic atmospheric conditions.
But the forecasts were unreliable because they did not include detailed atmospheric measurements or accurate inventories of emissions, nor did they correctly capture some of the atmospheric processes that produce particulates. The new system attempts to address these limitations by incorporating satellite measurements of particles in the atmosphere and near-real time emissions from major fires associated with crop-residue burning upwind of Delhi. It also draws on inventories of emissions from transportation, industry, and other human activities. This information is fed into an advanced NCAR-based atmospheric chemistry model known as WRF-Chem (the chemistry component of the Weather Research and Forecasting model). NCAR scientists are developing a specialized statistical system to combine the observations and WRF-Chem output, further improving the accuracy of PM2.5 predictions and enabling scientists to reliably quantify the uncertainties in the forecast. In addition, IITM is conducting extensive field campaigns and monitoring air quality to better understand processes that influence the formation and movement of particulates in New Delhi's atmosphere. Kumar and his colleagues are also developing techniques to ensure that the system accurately predicts spikes in PM2.5 related to special events, such as the annual Diwali festival of lights that is celebrated with candles and fireworks. With air quality a pressing concern across much of the globe, the research team is studying similar forecasting approaches in the United States and in additional highly polluted developing countries. This work is taking place under the umbrella of a new international project called Monitoring, Analysis, and Prediction of Air Quality (MAP-AQ), led by Kumar and by Guy Brasseur, director of NCAR's Atmospheric Chemistry Observations and Modeling Laboratory. "The lessons we are learning in the United States are quite useful in India, and vice-versa," Kumar said. "This research can lead to accurate forecasting systems in many regions, enabling millions of vulnerable residents to take necessary steps to limit their exposure." | Pollution | 2,019
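Statistical post-processing of this kind can take many forms; one of the simplest is a regression-based correction trained on past forecast-observation pairs, sometimes called model output statistics. The sketch below is a generic illustration with synthetic numbers, not NCAR's actual system.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic history: raw model PM2.5 forecasts vs. what was later observed.
raw_past = rng.uniform(50, 350, size=200)                 # ug/m3, hypothetical
obs_past = 0.8 * raw_past + 20 + rng.normal(0, 25, 200)   # model biased high

# Fit a linear correction: observed ~ a * raw + b
a, b = np.polyfit(raw_past, obs_past, 1)

def corrected_forecast(raw_pm25):
    """Apply the learned bias correction to a new raw model forecast."""
    return a * raw_pm25 + b

print(f"correction: obs ~ {a:.2f} * raw + {b:.1f}")
print(f"raw 300 ug/m3 -> corrected {corrected_forecast(300.0):.0f} ug/m3")
```

A real system would add predictors such as wind speed and humidity and would quantify forecast uncertainty, but the training-then-correcting structure is the same.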
April 30, 2019 | https://www.sciencedaily.com/releases/2019/04/190430132306.htm | Field study finds pellet-fed stoves cut air pollutant emissions 90% | A study by North Carolina State University researchers finds that a new cookstove design, which makes use of compressed wood pellets, reduces air pollution by about 90% for a range of contaminants associated with health problems and climate change. The findings stem from a Rwanda field study designed to test the performance of the stoves in real-world conditions. | "We wanted to evaluate these new, pellet-fueled stoves, and secured funding from the Clean Cooking Alliance and the Climate and Clean Air Coalition based on our experience conducting field evaluations," says Andy Grieshop, an associate professor of environmental engineering at NC State and corresponding author of a paper on the work. "There have been numerous attempts to develop cleaner cookstoves for use in developing countries, but while they've often done well in lab testing, they've had disappointing performance when tested in real-world conditions," Grieshop says. "However, we found that the pellet-fed stoves performed well in the field. We saw drastic cuts in pollutant emissions." The pellet-fueled stoves rely on battery-powered fans to burn the pellets efficiently, reducing pollutant emissions. The stoves come with solar panels for recharging the batteries, making long-term use feasible in areas where plugging a battery charger into the wall isn't an option. The researchers worked with residents in Rwanda, monitoring air emissions when people prepared food using conventional wood fires, which are common in rural areas; when using conventional charcoal-burning cookstoves, which are common in urban areas; and when using the new pellet-fueled cookstoves. The researchers tested for a range of air pollutants, including fine particulate matter (PM) and carbon monoxide -- both of which pose health risks -- and black carbon, which is a major contributor to climate change. A large fraction of global black carbon emissions comes from heating and cooking in homes. "For the most part, we saw pollutant levels across the board cut by 90% or more using the pellet-burning stoves," Grieshop says. For example, the researchers found that mean reduction in PM emissions from pellet-fueled stoves was 97% compared to wood fires and 89% compared to charcoal cookstoves. The pellet-fueled stoves are also the first cookstoves that have met the International Organization for Standardization's (ISO) highest standard -- Tier 5 -- for carbon monoxide emissions. The stoves also met stringent emission rate targets for carbon monoxide set by the World Health Organization, though they fell slightly short of the PM emission target. "The ISO standard was developed with lab testing in mind, and we got these results in the field, which is remarkable," Grieshop says. "The only exceptions we found were when a pellet stove was used improperly," Grieshop says. "For example, when people used sticks or other materials to start the fire instead of kerosene, as suggested by the manufacturer, we found higher emissions in roughly 10% of the 59 pellet stove research tests. But even the worst performance from a pellet stove was comparable to the best performance from a wood or charcoal cooking test." While these results are promising, the real challenge will be whether the pellet-fueled stove concept can be scaled up to meet the demands of millions or billions of people in low-income regions.
Test results showing how effective the pellet-fueled stoves are at reducing air pollution are new, but the concept of introducing new cookstoves is not. "The stoves we tested are essentially provided for free by a company that makes money by selling the necessary wood fuel pellets," Grieshop says. "Anecdotally, we found in Rwanda that the cost of the pellets was comparable to the cost of the charcoal people buy for their cooking needs. However, right now, the fuel pellets are made using sawdust waste from timber operations. We just don't know to what extent that is scalable." The paper, "Pellet-fed gasifier stoves approach gas-stove like performance during in-home use in Rwanda," is published in a peer-reviewed journal. The work was done with support from the Clean Cooking Alliance and the Climate and Clean Air Coalition. | Pollution | 2,019
April 29, 2019 | https://www.sciencedaily.com/releases/2019/04/190429182821.htm | 'Right' cover-crop mix good for both Chesapeake and bottom lines | Planting and growing a strategic mix of cover crops not only reduces the loss of nitrogen from farm fields, protecting water quality in the Chesapeake Bay, but the practice also contributes nitrogen to subsequent cash crops, improving yields, according to researchers. | The economic benefits of taking cover-cropping to the next level are important because the benefits will convince more corn, wheat and soybean growers in the Northeast and Mid-Atlantic regions to adopt the practice, the researchers said. Those cash crops are grown over large acreages mainly to feed livestock such as dairy and beef cows, hogs, and poultry. "Cover crops are one of the main tools we have for reducing nutrients in the waters of the Chesapeake Bay -- they have the potential to be agricultural nitrogen regulators that reduce leaching through soils and then deliver nitrogen to subsequent cash crops," said Jason Kaye, professor of soil biogeochemistry. His research group in Penn State's College of Agricultural Sciences has been studying cover-crop species for nearly a decade. Yet regulating nitrogen in this way has proven difficult, because cover-crop species excel at either reducing nitrogen leaching or increasing nitrogen supply to cash crops, but they fail to excel at both simultaneously. Researchers tested mixed-species cover-crop stands to see if they could balance the nitrogen-fixing and nitrogen-scavenging capabilities of individual species. They tested six cover-crop monocultures and four mixtures for their effects on nitrogen cycling in an organically managed maize-soybean-wheat feed-grain rotation at Penn State's Russell E. Larson Agricultural Research Center. For three years, researchers used a suite of integrated approaches to quantify soil nitrogen dynamics and measure plant nitrogen uptake. In the study, all cover-crop species -- including legume monocultures -- reduced nitrogen leaching compared to fallow plots. Cereal rye monocultures reduced nitrogen leaching by 90 percent, relative to fallow plots. Notably, mixtures with a low-seeding rate of rye did almost as well. Austrian winter-pea monocultures increased nitrogen uptake in cash crops the most, relative to fallow. Conversely, rye monocultures decreased nitrogen uptake, relative to fallow. The research's results were published this month. "The mixtures were all very good at reducing pollution, but their impact on yields was limited," he said. "Mixtures that performed best are the ones that had a higher proportion of legumes and mixed in a little bit of the grasses such as rye. Mixtures that include cereal rye dramatically decreased pollution because grasses establish root biomass that holds nitrogen and provides microbial activity in the soil." The process in which cover crops keep nutrients out of streams, protecting the bay, and pass along nitrogen to cash crops is straightforward. Kaye noted that in the fall after a cash crop is harvested, there is nitrogen in the soil that could go to the Chesapeake. A cover crop can take up that nitrogen and then, when the cover crop is killed in the spring, the nitrogen in the tissues of the cover crop is decomposed by microorganisms in the soil and passed on to the next cash crop, increasing yields. The research findings may hold timely significance for Pennsylvania organic feed-grain producers, Kaye believes.
Cover crops are especially important in organic systems in which tillage is used to manage and kill weeds and cover crops. In those situations, reducing erosion in disturbed soil is critical. "Organic agriculture is growing very quickly in Pennsylvania," he said. "One example is that Bell & Evans recently expanded organic poultry operations in the state. So, farmers potentially have a new and expanding market for feed grains that are grown organically. Incorporating cover-crop mixtures into their rotations will help them be more productive without increasing nutrient pollution." | Pollution | 2,019
April 29, 2019 | https://www.sciencedaily.com/releases/2019/04/190429154518.htm | As oceans warm, microbes could pump more carbon dioxide back into air, study warns | The world's oceans soak up about a quarter of the carbon dioxide that humans pump into the air each year -- a powerful brake on the greenhouse effect. In addition to purely physical and chemical processes, a large part of this is taken up by photosynthetic plankton as they incorporate carbon into their bodies. When plankton die, they sink, taking the carbon with them. Some part of this organic rain will end up locked into the deep ocean, insulated from the atmosphere for centuries or more. But what the ocean takes, the ocean also gives back. Before many of the remains get very far, they are consumed by aerobic bacteria. And, just like us, those bacteria respire by taking in oxygen and expelling carbon dioxide. Much of that regenerated CO2 can eventually make its way back into the air. | A new study suggests that CO2 regeneration is happening at shallower depths than previously believed, and could intensify as the oceans warm. "The results are telling us that warming will cause faster recycling of carbon in many areas, and that means less carbon will reach the deep ocean and get stored there," said study coauthor Robert Anderson, an oceanographer at Columbia University's Lamont-Doherty Earth Observatory. Scientists believe that plankton produce about 40 billion to 50 billion tons of solid organic carbon each year. They estimate that, depending on the region and conditions, about 8 billion to 10 billion tons manage to sink out of the surface ocean into greater depths, past about 100 meters, without getting eaten by bacteria. However, scientists have had a poor understanding of the depths at which CO2 is regenerated. Using data from a 2013 research cruise from Peru to Tahiti, the scientists looked at two distinct regions: the nutrient-rich, highly productive waters off South America, and the largely infertile waters that circle slowly in the central ocean below the equator in a set of currents known as the South Pacific Gyre. To measure how deep organic particles sink, many oceanographic studies use relatively primitive devices that passively trap particles as they sink. However, these devices can collect only a limited amount of data over the vast distances and depths of the ocean. For the new study, the researchers instead pumped large amounts of seawater at different depths and sifted through it. From these, they isolated particles of organic carbon and isotopes of the element thorium, which together enabled them to calculate the amount of carbon sinking through each depth that they sampled. This procedure yields far more data than traditional methods do. In the fertile zone, oxygen gets used up quickly near the surface, as bacteria and other organisms gobble up organic matter. At a depth of about 150 meters, oxygen content reaches near zero, halting aerobic activity. Once organic material reaches this layer, called the oxygen minimum zone (OMZ), it can sink untouched to the deeper ocean. The OMZ thus forms a sort of protective cap over any organic matter that sinks past it. In the deeps, oxygen levels pick up again and aerobic bacteria can go back to work; however, any CO2 produced at those depths remains isolated from the atmosphere for long periods. Up to now, many scientists have thought much of the organic matter produced near the surface makes it through the OMZ, and thus most CO2 is regenerated in the deep ocean. "People did not think that much regeneration was taking place in the shallower zone," said the study's lead author, Frank Pavia, a graduate student at Lamont-Doherty.
"The fact that it's happening at all shows that the model totally doesn't work in the way we thought it did."This matters because researchers project that as the oceans warm, OMZs will both spread horizontally over wider areas, and vertically, toward the surface. Under the conventional paradigm, this would allow more organic matter to reach the deep ocean to get trapped there. However, the new study suggests that as OMZs spread, so will the vigorous COFurther out, in the South Pacific Gyre, the results were less ambiguous. There is less biologic activity here than above the OMZs because of lack of nutrients, and previous research using sediment traps has suggested that much of whatever organic matter does form on the surface sinks to the cold deeps. Some COThis matters because, like OMZs, the South Pacific Gyre and similar current systems in other parts of the oceans are projected to grow as the oceans warm. The gyres will divide these regions into stratified layer cakes of warmer waters on top and colder waters below. And because, according to the study, so much COThe researchers point out that the processes they studied are only part of the ocean carbon cycle. Physical and chemical reactions independent of biology are responsible for much of the exchange of carbon between atmosphere and oceans, and these processes could interact with the biology in complex and unpredictable ways. "This [the study] gives us information that we didn't have before, that we can plug into future models to make better estimates," said Pavia.The other authors of the study are Phoebe Lam of the University of California, Santa Cruz; B.B. Cael of the University of Hawaii, Manoa; Sebastian Vivancos and Martin Fleisher of Lamont-Doherty Earth Observatory; and Yanbin Lu, Hai Cheng and R. Lawrence Edwards of University of Minnesota. | Pollution | 2,019 |
April 29, 2019 | https://www.sciencedaily.com/releases/2019/04/190429104721.htm | Biodegradable bags can hold a full load of shopping three years after being discarded in the environment | Biodegradable plastic bags are still capable of carrying full loads of shopping after being exposed in the natural environment for three years, a new study shows. | Researchers from the University of Plymouth examined the degradation of five plastic bag materials widely available from high street retailers in the UK. They were then left exposed to air, soil and sea, environments which they could potentially encounter if discarded as litter. The bags were monitored at regular intervals, and deterioration was considered in terms of visible loss in surface area and disintegration as well as assessments of more subtle changes in tensile strength, surface texture and chemical structure. After nine months in the open air, all the materials had completely disintegrated into fragments. However, the biodegradable, oxo-biodegradable and conventional plastic formulations remained functional as carrier bags after being in the soil or the marine environment for over three years. The compostable bag completely disappeared from the experimental test rig in the marine environment within three months but, while showing some signs of deterioration, was still present in soil after 27 months. Writing in Environmental Science and Technology, researchers from the University's International Marine Litter Research Unit say the study poses a number of questions. The most pertinent is whether biodegradable formulations can be relied upon to offer a sufficiently advanced rate of degradation to offer any realistic solution to the problem of plastic litter. Research Fellow Imogen Napper, who led the study as part of her PhD, said: "After three years, I was really amazed that any of the bags could still hold a load of shopping. For a biodegradable bag to be able to do that was the most surprising. When you see something labelled in that way, I think you automatically assume it will degrade more quickly than conventional bags. But, after three years at least, our research shows that might not be the case." In the research, scientists quote a European Commission report in 2013 which suggested about 100 billion plastic bags were being issued every year, although various Governments (including the UK) have since introduced levies designed to address this. Many of these items are known to have entered the marine environment, with previous studies by the University having explored their impact on coastal sediments and shown they can be broken down into microplastics by marine creatures. Professor Richard Thompson OBE, Head of the International Marine Litter Research Unit, was involved in those studies and gave evidence to the Government inquiry which led to the introduction of the 5p levy. He added: "This research raises a number of questions about what the public might expect when they see something labelled as biodegradable. We demonstrate here that the materials tested did not present any consistent, reliable and relevant advantage in the context of marine litter. It concerns me that these novel materials also present challenges in recycling. Our study emphasises the need for standards relating to degradable materials, clearly outlining the appropriate disposal pathway and rates of degradation that can be expected." | Pollution | 2,019
April 26, 2019 | https://www.sciencedaily.com/releases/2019/04/190426130054.htm | Mapping industrial 'hum' in the US | Using a dense sensor network that scanned the United States between 2003 and 2014, researchers have identified areas within the country marked by a persistent seismic signal caused by industrial processes. | At the 2019 SSA Annual Meeting, Omar Marcillo of Los Alamos National Laboratory said that he and his colleagues are developing a map of this harmonic tonal noise or "industrial hum." The map can pinpoint where certain types of industrial activity are prominent. Marcillo and his colleagues first noted the industrial signals in acoustic data they were collecting from infrasound experiments to study the global propagation of noise from phenomena like ocean storms. They realized that part of the infrasound signal they were detecting was coming from wind farms tens of kilometers away from the detectors. They realized that the wind farms should send a seismic signal through the ground as well, and began searching for these signals in seismic data collected across the U.S. by the Transportable Array, a mobile network of 400 high-quality broadband seismographs that has been moved east to west across the U.S. The characteristic signals are produced by large industrial machinery like the wind farm turbines, turbines in places like the Hoover Dam, and large water pumping stations in Washington State and California, Marcillo explained. "We realized that the same processes that happen in a big wind farm, you would see in big pumps, anything that has rotating parts, hydroturbines or blowers, anything that moves very harmonically and creates these tonal signatures," he said. "We are looking at big machines, we are not talking about something like the air conditioner in a house." Marcillo said the map could help researchers learn more about where these industrial signals might interfere with other seismic data collection efforts. Scientists building massive and hypersensitive instrumentation such as particle colliders might also benefit from knowing about nearby sources of industrial hum. Marcillo noted that many researchers, including himself, have sought to erase instead of enhance these kinds of industrial signals in their work. "Some of the techniques we have to study earthquakes, for instance, try to get rid of all these features because they are noise," he said. "But one person's noise is another person's signal." | Pollution | 2,019
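Industrial hum of this kind appears as persistent narrowband peaks in a seismic power spectrum, so a minimal detector simply estimates the spectrum and flags sharp peaks standing well above the broadband background. The sketch below runs on synthetic data and is a generic illustration, not the Los Alamos workflow.

```python
import numpy as np
from scipy.signal import welch, find_peaks

rng = np.random.default_rng(3)
fs = 100.0                        # samples per second
t = np.arange(0, 3600, 1 / fs)    # one hour of synthetic ground motion

# Broadband noise plus a weak machine tone and its first harmonic
trace = rng.normal(size=t.size)
trace += 0.2 * np.sin(2 * np.pi * 2.5 * t)   # 2.5 Hz fundamental
trace += 0.1 * np.sin(2 * np.pi * 5.0 * t)   # 5.0 Hz harmonic

# Averaged power spectral density suppresses random fluctuations,
# leaving persistent tonal peaks standing out clearly.
freqs, psd = welch(trace, fs=fs, nperseg=8192)

# Flag narrowband peaks that rise far above the median background level
peaks, _ = find_peaks(psd, prominence=5 * np.median(psd))
print("tonal peaks (Hz):", np.round(freqs[peaks], 2))
```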
April 25, 2019 | https://www.sciencedaily.com/releases/2019/04/190425122349.htm | Human-caused climate change played limited role in Beijing's 2013 'airpocalypse' | In January 2013, a suffocating, poisonous haze hung over Beijing for four days. The record high levels of fine particulate matter in the air caused airports to close and thousands of coughing, choking citizens to seek hospital care. | Although the particulate matter that filled the winter skies resulted from both human and natural emissions, a new Northwestern University study concludes that human-caused climate change played only a minor role in the air's stagnation. The study, which used computational simulations of climate, is one of the first to tie an air quality episode to human-caused climate change. This type of research is part of the growing subfield of climate science called "detection and attribution of extreme events," which assesses how human emission of greenhouse gases may have contributed to the occurrence and/or severity of a particularly impactful event. "Typically, single event detection and attribution work is performed on 'charismatic' extreme weather events, such as hurricanes, heat waves and droughts. Here, we perform the analysis on something less glamorous -- still air over Beijing," said Northwestern's Daniel Horton, the study's senior author. "Our work applies detection and attribution methods to a less glamorous yet highly impactful -- particularly for public health -- meteorological phenomenon." The lingering haze -- the signature of Beijing's 2013 "airpocalypse" and other smog events -- requires two factors: the emission of pollutants and still air that allows the pollutants to build. Beijing's coal-burning power plants and 5 million motor vehicles are responsible for much of the city's pollution. Horton and first author Christopher Callahan aimed to discover if human-caused climate change played a role in the meteorological conditions that led to the still air. "Even though we found that climate change has not had a major influence on winter air quality over Beijing to date, this work adds some meteorological diversity to recent examinations of links between climate change and individual extreme events," said Callahan, a former undergraduate student in Horton's lab, who led the research. The study will publish April 30. To explore the meteorological conditions underlying the airpocalypse, the team compared climate simulations: one set of model experiments included the current trend of human-caused climate change and one set as if human-made climate change did not exist. The researchers found that, in both scenarios, meteorological conditions conducive to poor air quality in Beijing still occurred. The simulations did indicate, however, that if human-caused climate change continues along the same trend, it might lead to more extreme air quality events. Already, 4.2 million deaths worldwide are caused by air pollution, according to the World Health Organization. Many of these deaths result from pollution-related respiratory diseases, heart disease or stroke. "We found that climate change was not the most important factor in shaping air quality during Beijing's past winters. Natural atmospheric fluctuations were conducive to air quality degradation," Callahan said.
"However, we found a small human fingerprint that could increase into the future."Even though human emission of greenhouse gases only played a minor role in Beijing's past poor winter air quality events, Callahan contends that humans can greatly improve air quality by cutting air pollutant emissions. Such reductions are often coincident with greenhouse gas reductions. Thus, not only would reductions help curtail global climate change, they also would have the added benefits of clean air and fewer pollution-related illnesses."Climate change mitigation, at its core, requires greenhouse gas emission reductions," he said. "If we can do that while also reducing pollutant emissions, it's a win-win."The study, "Multi-index attribution of extreme winter air quality in Beijing, China," was supported by the National Science Foundation (award number CBET-1848683) and the Ubben Program for Climate and Carbon Science at Northwestern. | Pollution | 2,019 |
April 25, 2019 | https://www.sciencedaily.com/releases/2019/04/190425104139.htm | Unravelling the complexity of air pollution in the world's coldest capital city | Ulaanbaatar is often called the world's coldest capital city because the temperature can reach −40 °C on winter nights. The harsh climate causes each household in a ger (a traditional Mongolian house) to consume >5 t of raw coal and 3 m³ of wood over the winter. | The fine particulate matter (e.g., PM2.5) mass concentration is often used as an ambient air pollution index. PM2.5 concentrations have often been found to be even higher in Ulaanbaatar than in heavily polluted Asian megacities. However, the PM2.5 concentration is calculated from the mass of particles and does not take into account the toxic chemicals sorbed to or contained in the particles. Polycyclic aromatic hydrocarbons (PAHs) are mainly produced when organic matter is imperfectly combusted and pyrolyzed. PAHs are suspected to be largely responsible for various symptoms related to air pollution (e.g., allergy, asthma, cancer, and reproductive disorders). This study was the first of its kind to be performed in Ulaanbaatar. The aim was to characterize spatial and temporal variations in particulate-bound PAH pollution, identify the pollutant sources, and assess the health risks posed. Suspended particle samples were collected in five districts in Ulaanbaatar during the heating and non-heating periods of 2017. The samples were analyzed, and the concentrations of 15 PAHs with two to six benzene rings were determined. The highest total PAH concentration (773 ng m⁻³) was found during the heating season. Specific PAH markers indicated strong influences from coal and wood combustion, particularly in an area containing gers in the heating season. The results indicated that there is a direct link between high PAH concentrations in certain districts of Ulaanbaatar and the types of fuel used. The results also indicated the dilemma faced by the city: residents must choose between heating and improving air quality. Prolonged exposure to polluted air in winter gives a high lifetime cancer risk, indicating that there is an urgent need for dramatic mitigation measures to be implemented. The results provide evidence for developing effective, scientifically based, air pollution control strategies. | Pollution | 2,019
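Health-risk figures of the kind cited here are commonly built in two steps: each PAH congener is weighted by a toxic equivalency factor (TEF) relative to benzo[a]pyrene, and the resulting BaP-equivalent concentration is multiplied by a unit risk factor. The sketch below uses hypothetical concentrations, a small subset of standard TEFs, and the widely cited WHO unit risk for BaP, so the output is illustrative only, not a result from the Ulaanbaatar study.

```python
# Toxic equivalency factors (Nisbet & LaGoy-style values) and
# hypothetical stand-in concentrations for four congeners.
tef = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1,
       "chrysene": 0.01, "phenanthrene": 0.001}
conc_ng_m3 = {"benzo[a]pyrene": 12.0, "benz[a]anthracene": 30.0,
              "chrysene": 45.0, "phenanthrene": 120.0}

bap_eq = sum(tef[c] * conc_ng_m3[c] for c in tef)  # ng BaP-eq per m3
print(f"BaP-equivalent concentration: {bap_eq:.1f} ng/m3")

# Screening lifetime cancer risk using the WHO unit risk for BaP
# (about 8.7e-5 per ng/m3 of lifetime exposure).
unit_risk_per_ng_m3 = 8.7e-5
print(f"screening lifetime cancer risk: {bap_eq * unit_risk_per_ng_m3:.1e}")
```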
April 24, 2019 | https://www.sciencedaily.com/releases/2019/04/190424153636.htm | Air pollution poses risks for childhood cancer survivors | Poor air quality days significantly increase the risk of hospitalizations for respiratory issues in young survivors of cancer, according to a study conducted by researchers at Huntsman Cancer Institute (HCI) at the University of Utah (U of U) and published in the International Journal of Environmental Research and Public Health. | Better treatments -- developed through research -- have resulted in a dramatic increase in the rates of childhood cancer survival; today, nearly 80 percent of children diagnosed with cancer will survive their disease. However, these survivors may experience long-term detrimental health issues related to their cancer treatment. In this project, HCI researchers sought to better understand what a polluted environment means for the health of cancer survivors who may already be at a higher risk for illness because of the type of cancer treatment they received. The team examined the medical records of nearly 4,000 childhood, adolescent, and young adult cancer survivors diagnosed or treated at Primary Children's Hospital between 1986 and 2012. They tracked when and how often those survivors required emergency room treatment or were admitted to a hospital in Utah due to respiratory illness. The study population was divided into three groups: those who received chemotherapy as part of their cancer treatment, those who didn't receive chemotherapy, and a cancer-free group. The researchers found the risk for respiratory hospitalization was significantly higher among the survivors who received chemotherapy compared to the cancer-free group. HCI's researchers were specifically looking at what happened to survivors on unhealthy air days. The study found the risk for hospitalizations among cancer survivors was significant when air pollution (PM2.5) rose to levels considered unhealthy for sensitive groups. This is the first study to report a connection between PM2.5 levels and respiratory hospitalizations among survivors of childhood cancer. "This study has wide application to cancer survivors in Utah as well as nationwide," said Judy Ou, PhD, a cancer epidemiologist at HCI and lead author on the study. "There are approximately 17 million cancer survivors in the United States, and statistics show about 40 percent of the U.S. population lives in places that are considered polluted at certain times of the year. This study provides valuable information to the medical community about how air pollution affects young survivors of cancer. We can use this to inform strategies to address this risk." Anne Kirchhoff, PhD, HCI cancer researcher and associate professor of pediatrics at the U of U, said, "We really haven't thought about how environmental exposures may affect long-term healthcare needs and health outcomes. We may need to rethink guidelines, both on air pollution notifications from public health agencies as well as guidelines we're giving cancer patients." Kirchhoff and Ou are working to identify effective strategies for sharing this information with health advocates, air quality organizations, and families affected by childhood cancers. The researchers collaborated with scientists from other departments at the University of Utah and Brigham Young University on this work. The researchers say the Utah Population Database was vital to their research. They hope to use the resource on a follow-up project that would include a larger sample size to further evaluate the results of this study. They would also like to extend the study to adult cancer survivor populations.
"We would like to understand the effects of pollution on a large sample and be able to provide guidance to cancer survivors across the country," said Ou.This HCI-led research is supported by the National Cancer Institute grant P30 CA042014, St. Baldrick's Foundation Childhood Cancer Research Grant, Huntsman Cancer Foundation, and Intermountain Healthcare's Primary Children's Hospital. | Pollution | 2,019 |
April 23, 2019 | https://www.sciencedaily.com/releases/2019/04/190423145520.htm | Carbon dioxide from Silicon Valley affects the chemistry of Monterey Bay | MBARI researchers recently measured high concentrations of carbon dioxide in air blowing out to sea from cities and agricultural areas, including Silicon Valley. They describe the findings in a new paper. | Extending their calculations to coastal areas around the world, the researchers estimate that this process could add 25 million additional tons of carbon dioxide to the ocean each year, which would account for roughly one percent of the ocean's total annual carbon dioxide uptake. This effect is not currently included in calculations of how much carbon dioxide is entering the ocean because of the burning of fossil fuels. Less than half of the carbon dioxide that humans have released over the past 200 years has remained in the atmosphere. The remainder has been absorbed in almost equal proportions by the ocean and terrestrial ecosystems. How quickly carbon dioxide enters the ocean in any particular area depends on a number of factors, including the wind speed, the temperature of the water, and the relative concentrations of carbon dioxide in the surface waters and in the air just above the sea surface. MBARI has been measuring carbon dioxide concentrations in the air and seawater of Monterey Bay almost continuously since 1993. But it wasn't until 2017 that researchers began looking carefully at the atmospheric data collected from sea-surface robots. "One of our summer interns, Diego Sancho-Gallegos, analyzed the atmospheric carbon dioxide data from our research moorings and found much higher levels than expected," explained MBARI Biological Oceanographer Francisco Chavez. Chavez continued, "If these measurements had been taken on board a ship, researchers would have thought the extra carbon dioxide came from the ship's engine exhaust system and would have discounted them. But our moorings and surface robots do not release carbon dioxide to the atmosphere." In early 2018 MBARI Research Assistant Devon Northcott started working on the data set, analyzing hourly carbon dioxide concentrations in the air over Monterey Bay. He noticed another striking pattern -- carbon dioxide concentrations peaked in the early morning. Although atmospheric scientists had previously noticed early-morning peaks in carbon dioxide concentrations in some cities and agricultural areas, this was the first time such peaks had been measured over ocean waters. The finding also contradicted a common scientific assumption that concentrations of carbon dioxide over ocean areas do not vary much over time or space. Northcott was able to track down the sources of this extra carbon dioxide using measurements made from a robotic surface vessel called a Wave Glider, which travels back and forth across Monterey Bay making measurements of carbon dioxide in the air and ocean for weeks at a time. "Because we had measurements from the Wave Glider at many different locations around the bay," Northcott explained, "I could use the Wave Glider's position and the speed and direction of the wind to triangulate the direction the carbon dioxide was coming from." The data suggested two main sources for the morning peaks in carbon dioxide -- the Salinas and Santa Clara Valleys. The Salinas Valley is one of California's largest agricultural areas, and many plants release carbon dioxide at night, which may explain why there was more carbon dioxide in the air from this region.
Santa Clara Valley [aka Silicon Valley] is a dense urban area, where light winds and other atmospheric conditions in the early morning could concentrate carbon dioxide released from cars and factories. Typical morning breezes blow directly from the Salinas Valley out across Monterey Bay. Morning breezes also carry air from the Santa Clara Valley southward and then west through a gap in the mountains (Hecker Pass) and out across Monterey Bay. "We had this evidence that the carbon dioxide was coming from an urban area," explained Northcott. "But when we looked at the scientific literature, there was nothing about air from urban areas affecting the coastal ocean. People had thought about this, but no one had measured it systematically before." The researchers see this paper not as a last word, but as a "wake-up call" to other scientists. "This brings up a lot of questions that we hope other researchers will look into," said Chavez. "One of the first and most important things would be to make detailed measurements of carbon dioxide in the atmosphere and ocean in other coastal areas. We need to know if this is a global phenomenon. We would also like to get the atmospheric modelling community involved." "We've estimated that this could increase the amount of carbon dioxide entering coastal waters by roughly 20 percent," said Chavez. "This could have an effect on the acidity of seawater in these areas. Unfortunately, we don't have any good way to measure this increase in acidity because carbon dioxide takes time to enter the ocean and carbon dioxide concentrations vary dramatically in coastal waters." "There must be other pollutants in this urban air that are affecting the coastal ocean as well," he added. "This is yet another case where the data from MBARI's autonomous robots and sensors has led us to new and unexpected discoveries," said Chavez. "Hopefully other scientists will see these results and will want to know if this is happening in their own backyards." | Pollution | 2,019
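Northcott's triangulation idea -- combining the Wave Glider's position with the wind direction at the time of each elevated reading -- amounts to intersecting a set of bearing lines. The sketch below is a generic least-squares bearing intersection with invented coordinates; it is not the MBARI analysis itself.

```python
import numpy as np

# Each observation: glider position (km, local x/y) and the compass
# bearing (degrees) the CO2-laden air was coming from. Values invented.
obs = [((0.0, 0.0), 60.0),
       ((5.0, -2.0), 35.0),
       ((-3.0, 4.0), 80.0)]

# A bearing from point p defines a line p + s*d. The point minimizing
# the summed squared distance to all lines solves
#   [sum_i (I - d_i d_i^T)] x = sum_i (I - d_i d_i^T) p_i.
A = np.zeros((2, 2))
b = np.zeros(2)
for (px, py), bearing in obs:
    theta = np.deg2rad(90.0 - bearing)          # compass -> math angle
    d = np.array([np.cos(theta), np.sin(theta)])
    P = np.eye(2) - np.outer(d, d)              # projector onto line normal
    A += P
    b += P @ np.array([px, py])

source = np.linalg.solve(A, b)
print(f"estimated source location: x = {source[0]:.2f} km, y = {source[1]:.2f} km")
```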
April 23, 2019 | https://www.sciencedaily.com/releases/2019/04/190423114031.htm | Auroral 'speed bumps' are more complicated, scientists find | Researchers at the University of New Hampshire Space Science Center find that "speed bumps" in space, which can slow down satellites orbiting closer to Earth, are more complex than originally thought. | "We knew these satellites were hitting "speed bumps," or "upswellings," which cause them to slow down and drop in altitude," said Marc Lessard, a physicist at UNH. "But on this mission we were able to unlock some of the mystery around why this happens by discovering that the bumps are much more complicated and structured." The study was published in an AGU journal. Scientists had long suspected that the aurora may be instigating the upwelling events affecting the lower altitude satellites, because when the satellites were flying through the aurora they would encounter "space speed bumps" caused by the heating of the very high-altitude thermosphere. But since they occur at such high altitudes, these lower-energy auroras transfer more of their energy to the thin atmosphere at 250-400 kilometers (150-250 miles) above the ground, and produce more interesting effects than more familiar aurora, which sparkle at closer to 100 kilometers (60 miles) up. "You can think of the satellites traveling through air pockets or bubbles similar to those in a lava lamp as opposed to a smooth wave," said Lessard. When early space programs first put satellites into orbit, they noticed the degradation of the satellites' orbits when the sun was active. The problem is that when the extra drag slows down the satellites, they move closer to Earth. Without extra fuel to boost them back up, they will eventually fall back to Earth. These satellites, which orbit in this region closer to Earth, are important because they do everything from taking pictures of Earth to helping provide up-to-date information for climate monitoring, crop yields, urban planning, disaster response and even military intelligence. Funding for this research was provided by the National Aeronautics and Space Administration (NASA). | Pollution | 2,019
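The "speed bump" effect can be put in rough numbers with the standard drag formula F = ½ρv²C_dA. Every input below is an order-of-magnitude assumption for a small satellite near 300 km, not a value from the UNH study.

```python
rho = 5e-12    # kg/m^3, rough thermospheric density near 300 km
v = 7700.0     # m/s, orbital speed
cd = 2.2       # drag coefficient, typical assumption for satellites
area = 1.0     # m^2, cross-sectional area
mass = 100.0   # kg, small-satellite mass

a_drag = 0.5 * rho * cd * area * v**2 / mass
print(f"drag deceleration: {a_drag:.2e} m/s^2")

# A density "bump" of, say, 2x inside an auroral upwelling doubles the
# deceleration for as long as the satellite flies through it.
print(f"inside a 2x density bump: {2 * a_drag:.2e} m/s^2")
```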
April 22, 2019 | https://www.sciencedaily.com/releases/2019/04/190422112759.htm | From coal to gas: How the shift can help stabilize climate change | Led by Katsumasa Tanaka, a senior climate risk researcher at the National Institute for Environmental Studies in Japan, the study examined global scenarios for transitioning from coal to gas using a novel approach that applied metrics developed for climate impact assessments to the coal-gas debate for the first time. Focusing on the world's leading power generators -- China, Germany, India, and the United States -- it examined the impacts from a variety of direct and indirect emissions of such a shift on both shorter and longer timescales ranging from a few decades to a century. | "Many previous studies were somewhat ambivalent about the climate benefits of the coal-to-gas shift," said Tanaka. "Our study makes a stronger case for the climate benefits that would result from this energy transition, because we carefully chose metrics to evaluate the climate impacts in light of recent advances in understanding metrics." "Given the current political situation, we deliver a much-needed message to help facilitate the energy shift away from coal under the Paris Agreement," Tanaka said. "However, natural gas is not an end goal; we regard it as a bridge fuel toward more sustainable forms of energy in the long run as we move toward decarbonization." Concerns about methane leakage from natural gas have been intensely debated, particularly in the United States given the increasing use of fracking over the past decade. Recent scientific efforts have improved understanding of the extent of methane leakage in the United States, but the potential impacts of methane leakage remain highly uncertain in the rest of the world. "Our conclusion that the benefits of natural gas outweigh the possible risks is robust under a broad range of methane leakage, and under uncertainties in emissions data and metrics," Tanaka said. This research was partially supported by the Environment Research and Technology Development Fund (2-1702) of the Environmental Restoration and Conservation Agency in Japan, with additional support from the Institute for Advanced Sustainability Studies in Germany and the Research Council of Norway. Emissions metrics, or indicators to evaluate the impacts on climate from a variety of emission types, are useful tools to gain insights into climate impacts without the need for climate model runs. These metrics work like weighting factors when calculating CO2-equivalent emissions from the emissions of a variety of greenhouse gases. However, the resulting climate impacts observed through CO2-equivalent emissions are sensitive to the specific metrics chosen. "Because the outcome can strongly depend on which metrics are chosen and applied, there is a need for careful reflection about the meaning and implications of each specific choice," said Francesco Cherubini, a professor at the Norwegian University of Science and Technology. "Each emission type elicits a different climate system response. The diverging outcomes in previous studies may well stem from the type of metric that was chosen." The study combined multiple metrics to address both short- and long-term climate impacts in parallel.
The researchers found that natural gas power plants have both smaller short- and long-term impacts than coal power plants, even when high potential methane leakage rates, a full array of greenhouse gases and air pollutants, or uncertainty issues are considered. "Our study uses a set of metrics jointly, unlike many studies using just one, to consider climate impacts on different time scales -- one metric for a few decades and another one for approximately a century," said Otavio Cavalett, a colleague of Cherubini. "This allowed us to consider the host of pollutants that can affect the climate on different time scales." "In practice, we used the metrics available from the latest IPCC report and focused on those that are most consistent with the Paris Agreement temperature goals," Cherubini said. The authors' choice of metrics aligned with recent recommendations by the United Nations Environment Programme and the Society of Environmental Toxicology and Chemistry. It is the first application of such recommendations to the coal-to-gas debate. To ensure that possible regional differences were accounted for in the global study, the team compared global metrics with regional metrics to more precisely examine impacts. "We considered a suite of so-called short-lived climate pollutants (SLCPs), such as SOx, NOx, and black carbon, that can be emitted from these plants," said Bill Collins, a professor at the University of Reading, in the United Kingdom. "This required a regional analysis because climate impacts from SLCPs depend on where they are emitted, due to their short lifetimes in the atmosphere." The study by Tanaka and coauthors is part of a growing body of literature that reaffirms the need to phase out coal in order to mitigate rising global temperatures and slow or reverse negative impacts of climate change. Future related work could consider supply chains and trade within and across nations and other environmental factors, in addition to work on improving the consistency of metrics for evaluating climate impacts. "Air quality is not part of our analysis, but including it would likely strengthen our conclusion," said Tanaka. "Other environmental effects, such as drinking water contamination and induced seismic activities, could also add important dimensions to the debate." | Pollution | 2,019
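The metric sensitivity the authors describe is easy to demonstrate: the CO2-equivalent footprint of gas-fired power depends on which global warming potential (GWP) is applied to leaked methane. The sketch below uses rough literature-style emission factors and AR5 GWP values as assumptions; none of the numbers come from the study itself.

```python
# AR5 global warming potentials for methane (approximate).
GWP_CH4 = {"GWP20": 84.0, "GWP100": 28.0}

coal_co2 = 1000.0        # kg CO2 per MWh, typical coal plant (assumed)
gas_co2 = 450.0          # kg CO2 per MWh, combined-cycle gas (assumed)
gas_fuel_kg_mwh = 130.0  # kg CH4 burned per MWh of electricity (assumed)

for leak in (0.01, 0.03, 0.05):  # fraction of produced gas that leaks
    leaked_ch4 = gas_fuel_kg_mwh * leak / (1 - leak)
    for metric, gwp in GWP_CH4.items():
        gas_total = gas_co2 + leaked_ch4 * gwp
        print(f"leak {leak:4.0%}, {metric}: gas {gas_total:6.0f} "
              f"vs coal {coal_co2:.0f} kg CO2-eq/MWh")
```

With these placeholder inputs, gas stays well below coal on the century timescale, while high leakage narrows the gap considerably on the 20-year timescale -- which is why evaluating both timescales jointly, as the study does, matters.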
April 18, 2019 | https://www.sciencedaily.com/releases/2019/04/190418131345.htm | Using the physics of airflows to locate gaseous leaks more quickly in complex scenarios | Engineers at Duke University are developing a smart robotic system for sniffing out pollution hotspots and sources of toxic leaks. Their approach enables a robot to incorporate calculations made on the fly to account for the complex airflows of confined spaces rather than simply 'following its nose.' | "Many existing approaches that employ robots to locate sources of airborne particles rely on bio-inspired educated but simplistic guesses, or heuristic techniques, that drive the robots upwind or to follow increasing concentrations," said Michael M. Zavlanos, the Mary Milus Yoh and Harold L. Yoh, Jr. Associate Professor of Mechanical Engineering and Materials Science at Duke. "These methods can usually only localize a single source in open space, and they cannot estimate other equally important parameters such as release rates." But in complex environments, these simplistic methods can send the robots on wild goose chases into areas where concentrations are artificially increased by the physics of the airflows, not because they're the source of the leak. "If somebody is smoking outside, it doesn't take long to find them by just following your nose because there's nothing stopping the air currents from being predictable," said Wilkins Aquino, the Anderson-Rupp Professor of Mechanical Engineering and Materials Science at Duke. "But put the same cigarette inside an office and suddenly it becomes much more difficult because of the irregular air currents created by hallways, corners and offices." In a recent paper published online in the IEEE Transactions on Robotics, Zavlanos, Aquino and newly minted PhD graduate Reza Khodayi-mehr instead take advantage of the physics behind these airflows to trace the source of an emission more efficiently. Their approach combines physics-based models of the source identification problem with path planning algorithms for robotics in a feedback loop. The robots take measurements of contaminant concentrations in the environment and then use these measurements to incrementally calculate where the chemicals are actually coming from. "Creating these physics-based models requires the solution of partial differential equations, which is computationally demanding and makes their application onboard small, mobile robots very challenging," said Khodayi-mehr. "We've had to create simplified models to make the calculations more efficient, which also makes them less accurate. It's a challenging trade-off." Khodayi-mehr built a rectangular box with a wall nearly bisecting the space length-wise to create a miniature U-shaped hallway that mimics a simplified office space. A fan pumps air into the corridor at one end of the U and back out of the other, while gaseous ethanol is slowly leaked into one of the corners. Despite the simplicity of the setup, the air currents created within are turbulent and messy, creating a difficult source identification problem for any ethanol-sniffing robot to solve. But the robot solves the problem anyway. The robot takes a concentration measurement, fuses it with previous measurements, and solves a challenging optimization problem to estimate where the source is.
It then figures out the most useful location to take its next measurement and repeats the process until the source is found. "By combining physics-based models with optimal path planning, we can figure out where the source is with very few measurements," said Zavlanos. "This is because physics-based models provide correlations between measurements that are not accounted for in purely data-driven approaches, and optimal path planning allows the robot to select those few measurements with the most information content." "The physics-based models are not perfect but they still carry way more information than just the sensors alone," added Aquino. "They don't have to be exact, but they allow the robot to make inferences based on what is possible within the physics of the airflows. This results in a much more efficient approach." This complex problem-solving process isn't necessarily faster, but it's much more robust. It can handle situations with multiple sources, which is currently impossible for heuristic approaches, and can even measure the rate of contamination. The group is still working to create machine-learning algorithms to make their models even more efficient and accurate at the same time. They're also working to extend this idea to programming a fleet of robots to conduct a methodical search of a large area. While they haven't tried the group approach in practice yet, they have published simulations that demonstrate its potential. "Moving from a lab environment with controlled settings to a more practical scenario obviously requires addressing other challenges too," said Khodayi-mehr. "For example, in a real-world scenario we probably won't know the geometry of the domain going in. Those are some of the ongoing research directions we're currently working on." | Pollution | 2,019
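That measure/fit/re-plan loop can be sketched in a few lines. The code below substitutes a simple isotropic-decay surrogate for the paper's PDE-based transport model and uses a naive farthest-point rule to choose the next measurement location; both are simplifying assumptions for illustration, not the authors' algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

def surrogate(p, pts):
    # p = (xs, ys, q): source position and strength. The fixed decay
    # length stands in for the airflow physics the real model resolves.
    xs, ys, q = p
    r = np.hypot(pts[:, 0] - xs, pts[:, 1] - ys)
    return q * np.exp(-r / 0.8)

true_p = np.array([2.0, 1.5, 5.0])  # hidden source (x, y, strength)

grid = np.array([(x, y) for x in np.linspace(0, 4, 9)
                        for y in np.linspace(0, 3, 7)])
samples, values = [], []
pos = np.array([0.0, 0.0])
for step in range(8):
    # Measure (with sensor noise), then re-fit the source estimate.
    c = surrogate(true_p, pos[None, :])[0] + rng.normal(0, 0.02)
    samples.append(pos.copy()); values.append(c)
    pts, meas = np.array(samples), np.array(values)
    fit = least_squares(lambda p: surrogate(p, pts) - meas, x0=[1.0, 1.0, 1.0])
    # Naive next-measurement rule: go where sample coverage is worst.
    gaps = np.linalg.norm(grid[:, None, :] - pts[None, :, :], axis=2).min(axis=1)
    pos = grid[np.argmax(gaps)]

print("estimated source (x, y, strength):", np.round(fit.x, 2))
```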
April 18, 2019 | https://www.sciencedaily.com/releases/2019/04/190418080756.htm | Green material for refrigeration identified | Researchers from the UK and Spain have identified an eco-friendly solid that could replace the inefficient and polluting gases used in most refrigerators and air conditioners. | When put under pressure, plastic crystals of neopentylglycol yield huge cooling effects -- enough that they are competitive with conventional coolants. In addition, the material is inexpensive, widely available and functions at close to room temperature. Details are published in a scientific journal. The gases currently used in the vast majority of refrigerators and air conditioners -- hydrofluorocarbons and hydrocarbons (HFCs and HCs) -- are toxic and flammable. When they leak into the air, they also contribute to global warming. "Refrigerators and air conditioners based on HFCs and HCs are also relatively inefficient," said Dr Xavier Moya, from the University of Cambridge, who led the research with Professor Josep Lluís Tamarit, from the Universitat Politècnica de Catalunya. "That's important because refrigeration and air conditioning currently devour a fifth of the energy produced worldwide, and demand for cooling is only going up." To solve these problems, materials scientists around the world have sought alternative solid refrigerants. Moya, a Royal Society Research Fellow in Cambridge's Department of Materials Science and Metallurgy, is one of the leaders in this field. In their newly published research, Moya and collaborators from the Universitat Politècnica de Catalunya and the Universitat de Barcelona describe the enormous thermal changes under pressure achieved with plastic crystals. Conventional cooling technologies rely on the thermal changes that occur when a compressed fluid expands. Most cooling devices work by compressing and expanding fluids such as HFCs and HCs. As the fluid expands, it decreases in temperature, cooling its surroundings. With solids, cooling is achieved by changing the material's microscopic structure. This change can be achieved by applying a magnetic field, an electric field or mechanical force. For decades, these caloric effects have fallen behind the thermal changes available in fluids, but the discovery of colossal barocaloric effects in a plastic crystal of neopentylglycol (NPG) and other related organic compounds has levelled the playing field. Due to the nature of their chemical bonds, organic materials are easier to compress, and NPG is widely used in the synthesis of paints, polyesters, plasticisers and lubricants. It is not only widely available but also inexpensive. NPG's molecules, composed of carbon, hydrogen and oxygen, are nearly spherical and interact with each other only weakly. These loose bonds in its microscopic structure permit the molecules to rotate relatively freely. The word "plastic" in "plastic crystals" refers not to its chemical composition but rather to its malleability. Plastic crystals lie at the boundary between solids and liquids. Compressing NPG yields unprecedentedly large thermal changes due to molecular reconfiguration.
The temperature changes achieved are comparable with those exploited commercially in HFCs and HCs. The discovery of colossal barocaloric effects in a plastic crystal should bring barocaloric materials to the forefront of research and development to achieve safe, environmentally friendly cooling without compromising performance. Moya is now working with Cambridge Enterprise, the commercialisation arm of the University of Cambridge, to bring this technology to market. | Pollution | 2,019
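The size of a barocaloric effect is often estimated with the Clausius-Clapeyron relation, dT/dp = Δv/Δs, which ties the pressure-induced shift of the transition temperature to the transition entropy change. The numbers below are illustrative placeholders rather than measured NPG properties, but they show how a modest volume change combined with a moderate pressure sensitivity yields an entropy change of "colossal" magnitude.

```python
# Clausius-Clapeyron estimate for a pressure-driven solid transition.
dT_dp = 0.10   # K per MPa: transition-temperature shift (assumed)
dv = 4.0e-5    # m^3/kg: specific-volume change at the transition (assumed)

ds = dv / (dT_dp * 1e-6)  # J/(kg K); the 1e-6 converts K/MPa to K/Pa
print(f"transition entropy change: {ds:.0f} J/(kg K)")
```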
April 17, 2019 | https://www.sciencedaily.com/releases/2019/04/190417153800.htm | 'Induced' driving miles could overwhelm potential energy-saving benefits of self-driving | The benefits of self-driving cars will likely induce vehicle owners to drive more, and those extra miles could partially or completely offset the potential energy-saving benefits that automation may provide, according to a new University of Michigan study. | In the coming years, self-driving cars are expected to yield significant improvements in safety, traffic flow and energy efficiency. In addition, automation will allow vehicle occupants to make productive use of travel time. Previous studies have shown that greater fuel efficiency induces some people to travel extra miles, and those added miles can partially offset fuel savings. It's a behavioral change known as the rebound effect. In addition, the ability to use in-vehicle time productively in a self-driving car -- people can work, sleep, watch a movie, read a book -- will likely induce even more travel. Taken together, those two sources of added mileage could partially or completely offset the energy savings provided by autonomous vehicles, according to a team of researchers at the U-M School for Environment and Sustainability led by Dow Sustainability Doctoral Fellow Morteza Taiebat. Conceivably, the added miles could even result in a net increase in energy consumption, a phenomenon known as backfire, according to the U-M researchers. The study was published April 17. "The core message of the paper is that the induced travel of self-driving cars presents a stiff challenge to policy goals for reductions in energy use," said co-author Samuel Stolper, assistant professor of environment and sustainability at SEAS. "Thus, much higher energy efficiency targets are required for self-driving cars," said co-author Ming Xu, associate professor of environment and sustainability at SEAS and associate professor of civil and environmental engineering at the College of Engineering. In the paper, Taiebat and his colleagues used economic theory and U.S. travel survey data to model travel behavior and to forecast the effects of vehicle automation on travel decisions and energy use. Most previous studies of the energy impact of autonomous vehicles focused exclusively on the fuel-cost component of the price of travel, likely resulting in an overestimation of the environmental benefits of the technology, according to the U-M authors. In contrast, the study by Taiebat and colleagues looked at both fuel cost and time cost. Their approach adapts standard microeconomic modeling and statistical techniques to account for the value of time. Traditionally, time spent driving has been viewed as a cost to the driver.
But the ability to pursue other activities in an autonomous vehicle is expected to lower this "perceived travel time cost" considerably, which will likely spur additional travel. The U-M researchers estimated that the induced travel resulting from a 38% reduction in perceived travel time cost would completely eliminate the fuel savings associated with self-driving cars. "Backfire -- a net rise in energy consumption -- is a distinct possibility if we don't develop better efficiencies, policies and applications," Taiebat said. The possibility of backfire, in turn, implies the possibility of net increases in local and global air pollution, the study authors concluded. In addition, the researchers suggest there's an equity issue that needs to be addressed as autonomous vehicles become a reality. The study found that wealthier households are more likely than others to drive extra miles in autonomous vehicles "and thus stand to experience greater welfare gains." | Pollution | 2,019
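The paper's rebound argument follows from treating travel demand as a function of the generalized cost of driving (fuel cost plus time cost). The sketch below uses assumed cost components and a unit demand elasticity -- not the study's calibrated parameters -- and applies the 38% time-cost reduction quoted above to show how induced miles can overwhelm a per-mile efficiency gain.

```python
fuel_cost = 0.10   # $ per mile (assumed)
time_cost = 0.90   # $ per mile, monetized travel time (assumed)
elasticity = -1.0  # demand elasticity w.r.t. generalized cost (assumed)

base_cost = fuel_cost + time_cost
# Automation scenario: 20% fuel-efficiency gain, 38% lower time cost.
new_cost = fuel_cost * 0.80 + time_cost * (1 - 0.38)

# Constant-elasticity demand: miles scale as (cost ratio)**elasticity.
miles_ratio = (new_cost / base_cost) ** elasticity
energy_ratio = miles_ratio * 0.80  # per-mile energy falls 20%

print(f"generalized cost falls {1 - new_cost / base_cost:.0%}")
print(f"miles driven rise {miles_ratio - 1:.0%}")
print(f"net energy use changes {energy_ratio - 1:+.0%}")  # backfire if > 0
```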
April 15, 2019 | https://www.sciencedaily.com/releases/2019/04/190415200244.htm | New report examines the safety of using dispersants in oil spill clean ups | A multi-disciplinary team of scientists has issued a series of findings and recommendations on the safety of using dispersal agents in oil spill clean-up efforts in a report published this month by the National Academies of Sciences, Engineering, and Medicine. | By measuring the level of a leading dispersal agent, dioctyl sodium sulfosuccinate (DOSS), in sea life following the 2010 Deepwater Horizon spill in the Gulf of Mexico, the team was able to establish how long the chemical lingers and what health effects it has on various organisms. The scientists found that the risks associated with using DOSS were minimal; in areas where oil concentrations in the water exceeded 100 milligrams per liter, the dispersant did increase toxicity, though the team noted that oil concentrations are typically much lower than that in most spills. Terry Hazen, the University of Tennessee-Oak Ridge National Laboratory Governor's Chair for Environmental Biotechnology, who is known for his work on the Deepwater Horizon oil spill recovery efforts, co-authored the report. "One of the biggest concerns in cleanup efforts is the effect the spill has on people's health and livelihood," Hazen said. "It's not just that oil itself is harmful and potentially even flammable, but you have to be careful what kind of chemicals you expose crews to while trying to clean or contain the oil." The researchers also discovered that the testing methodology used after spills is typically so varied that it is hard to draw broad conclusions when comparing different spills. To combat this, the team recommends the creation of some form of standardized testing that would enable data to be correlated from one spill to the next. The work was sponsored by several agencies and organizations, including the US Environmental Protection Agency, the Bureau of Ocean Energy Management, and the Gulf of Mexico Research Initiative, among others, and can be read online. | Pollution | 2,019
April 12, 2019 | https://www.sciencedaily.com/releases/2019/04/190412085226.htm | Unique oil-eating bacteria found in world's deepest ocean trench, Mariana Trench | Scientists from the University of East Anglia have discovered unique oil-eating bacteria in the deepest part of the Earth's oceans -- the Mariana Trench. | Together with researchers from China and Russia, they undertook the most comprehensive analysis of microbial populations in the trench. The Mariana Trench is located in the Western Pacific Ocean and reaches a depth of approximately 11,000 metres. By comparison, Mount Everest is 8,848 metres high. "We know more about Mars than the deepest part of the ocean," said Prof Xiao-Hua Zhang of the Ocean University of China, who led the study. To date, only a few expeditions have investigated the organisms inhabiting this ecosystem. One of these expeditions was organized and led by noted marine explorer and Academy Award-winning film director James Cameron, who built a specialised submersible to collect samples in the trench. Dr Jonathan Todd, from UEA's School of Biological Sciences, said: "Our research team went down to collect samples of the microbial population at the deepest part of the Mariana Trench -- some 11,000 metres down. We studied the samples that were brought back and identified a new group of hydrocarbon degrading bacteria." Hydrocarbons are organic compounds that are made of only hydrogen and carbon atoms, and they are found in many places, including crude oil and natural gas. "So these types of microorganisms essentially eat compounds similar to those in oil and then use it for fuel. Similar microorganisms play a role in degrading oil spills in natural disasters such as BP's 2010 oil spill in the Gulf of Mexico." "We also found that this bacteria is really abundant at the bottom of the Mariana Trench." In fact, the team found that the proportion of hydrocarbon-degrading bacteria in the trench is the highest on Earth. The scientists isolated some of these microbes and demonstrated that they consume hydrocarbons in the laboratory under environmental conditions that simulate those in the Mariana Trench. In order to understand the source of the hydrocarbons 'feeding' these bacteria, the team analysed samples of sea water taken at the surface, and all the way down a column of water to the sediment at the bottom of the trench. Dr Nikolai Pedentchouk, from UEA's School of Environmental Sciences, said: "We found that hydrocarbons exist as deep as 6,000 meters below the surface of the ocean and probably even deeper. A significant proportion of them probably derived from ocean surface pollution. "To our surprise, we also identified biologically produced hydrocarbons in the ocean sediment at the bottom of the trench. This suggests that a unique microbial population is producing hydrocarbons in this environment." "These hydrocarbons, similar to the compounds that constitute diesel fuel, have been found in algae at the ocean surface but never in microbes at these depths." Dr David Lea-Smith, from UEA's School of Biological Sciences, said: "These hydrocarbons may help microbes survive the crushing pressure at the bottom of the Mariana Trench, which is equal to 1,091 kilograms pressed against a fingernail. "They may also be acting as a food source for other microbes, which may also consume any pollutant hydrocarbons that happen to sink to the ocean floor.
But more research is needed to fully understand this unique environment." "Identifying the microbes that produce these hydrocarbons is one of our top priorities, as is understanding the quantity of hydrocarbons released by human activity into this isolated environment," added Prof Xiao-Hua Zhang. | Pollution | 2,019
April 11, 2019 | https://www.sciencedaily.com/releases/2019/04/190411131543.htm | New research adds to work of Prandtl, father of modern aerodynamics | In 1942, Ludwig Prandtl -- considered the father of modern aerodynamics -- published "Führer durch die Strömungslehre," the first book of its time on fluid mechanics, translated to English in 1952 as "Essentials of Fluid Mechanics." The book was so successful that Prandtl's students continued to maintain and develop it with new findings after his death. Today, the work is available under the revised title "Prandtl -- Essentials of Fluid Mechanics," an expanded and revised version of the original book with contributions by leading researchers in the field of fluid mechanics. | Over the years, the last three pages of Prandtl's original book, focusing on mountain and valley winds, have received some attention from the meteorology research community, but the specific pages have been largely overlooked by the fluid mechanics community, to the point that the content and the exact mathematical solutions have disappeared from the current expanded version of the book. But today, in the age of supercomputers, Inanc Senocak, associate professor of mechanical engineering and materials science at the University of Pittsburgh Swanson School of Engineering, is finding new insights in Prandtl's original work, with important implications for nighttime weather prediction in mountainous terrain. Drs. Senocak and Cheng-Nian Xiao, a postdoctoral researcher in Dr. Senocak's lab, recently authored a paper titled "Stability of the Prandtl Model for Katabatic Slope Flows." Katabatic slope flows are gravity-driven winds common over large ice sheets or during nighttime on mountain slopes, where cool air flows downhill. Understanding those winds is vital for reliable weather predictions, which are important for air quality, aviation and agriculture. But the complexity of the terrain, the stratification of the atmosphere and fluid turbulence make computer modeling of winds around mountains difficult. Prandtl's model does not set the conditions for when a slope flow becomes turbulent, a deficiency that makes it difficult, for example, to predict weather for the area around Salt Lake City in Utah, where prolonged inversions create a challenging environment for air quality. "Now that we have more powerful supercomputers, we can improve upon the complexity of the terrain with better spatial resolutions in the mathematical model," says Dr. Senocak. "However, numerical weather prediction models still make use of simplified models that have originated during a time when computing power was insufficient." The researchers found that Prandtl's model is prone to unique fluid instabilities, which emerge as a function of the slope angle and a new dimensionless number they have named the stratification perturbation parameter, a measure of the disturbance to the background stratification of the atmosphere due to cooling at the surface.
The concept of dimensionless numbers, for example the Reynolds number, plays an important role in thermal and fluid sciences in general, as such numbers capture the essence of competing processes in a problem. An important implication of their finding is that, for a given fluid such as air, the dynamic stability of katabatic slope flows cannot be determined by a single dimensionless parameter alone, such as the Richardson number, as is practiced currently in the meteorology and fluid dynamics communities. The Richardson number expresses a ratio of buoyancy to wind shear and is commonly used in weather prediction, in investigating currents in oceans, lakes and reservoirs, and in measuring expected air turbulence in aviation. "An overarching concept was missing, and the Richardson number was the fallback," says Dr. Senocak. "We're not saying the Richardson number is irrelevant, but when a mountain or valley is shielded from larger scale weather motions, it doesn't enter into the picture. Now we have a better way of explaining the theory of these down-slope and down-valley flows." Not only will this discovery be important for agriculture, aviation and weather prediction, according to Dr. Senocak, but it will also be vital for climate change research and associated sea-level rise, as accurate prediction of katabatic surface wind profiles over large ice sheets and glaciers is critical to the energy balance of melting ice. He notes that even in the fluid dynamics community, the discovery of this surprising new type of instability is expected to attract a lot of research interest. Next, Dr. Senocak is advising and sponsoring a senior design team to see if researchers can actually observe these fluid instabilities in the lab at a scale much smaller than a mountain. The paper was published online in February and will appear in print April 25, 2019. | Pollution | 2,019
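For reference, the textbook form of Prandtl's laminar slope-flow solution and the gradient Richardson number discussed here can be written as follows (standard notation; the paper's own notation and its new stratification perturbation parameter may differ):

```latex
% n: slope-normal coordinate, alpha: slope angle, N: buoyancy frequency,
% nu: viscosity, kappa: thermal diffusivity, b_s: surface buoyancy anomaly.
\[
  b(n) = b_s\, e^{-n/\ell} \cos\!\left(\tfrac{n}{\ell}\right), \qquad
  u(n) = -\frac{b_s}{N}\sqrt{\frac{\kappa}{\nu}}\, e^{-n/\ell} \sin\!\left(\tfrac{n}{\ell}\right),
\]
\[
  \ell = \left(\frac{4\,\nu\,\kappa}{N^{2}\sin^{2}\alpha}\right)^{1/4},
  \qquad
  \mathrm{Ri} = \frac{N^{2}}{\left(\partial u/\partial z\right)^{2}}.
\]
```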
April 11, 2019 | https://www.sciencedaily.com/releases/2019/04/190411101829.htm | World's fastest hydrogen sensor could pave the way for clean hydrogen energy | Hydrogen is a clean and renewable energy carrier that can power vehicles, with water as the only emission. Unfortunately, hydrogen gas is highly flammable when mixed with air, so very efficient and effective sensors are needed. Now, researchers from Chalmers University of Technology, Sweden, present the first hydrogen sensors ever to meet the future performance targets for use in hydrogen-powered vehicles. | The researchers' ground-breaking results were recently published in a scientific journal. The plastic around the tiny sensor is not just for protection, but functions as a key component. It shortens the sensor's response time by accelerating the uptake of the hydrogen gas molecules into the metal particles where they can be detected. At the same time, the plastic acts as an effective barrier to the environment, preventing any other molecules from entering and deactivating the sensor. The sensor can therefore work both highly efficiently and undisturbed, enabling it to meet the rigorous demands of the automotive industry -- to be capable of detecting 0.1 percent hydrogen in the air in less than a second. "We have not only developed the world's fastest hydrogen sensor, but also a sensor that is stable over time and does not deactivate. Unlike today's hydrogen sensors, our solution does not need to be recalibrated as often, as it is protected by the plastic," says Ferry Nugroho, a researcher at the Department of Physics at Chalmers. It was during his time as a PhD student that Ferry Nugroho and his supervisor Christoph Langhammer realised that they were on to something big. After reading a scientific article stating that no one had yet succeeded in achieving the strict response time requirements imposed on hydrogen sensors for future hydrogen cars, they tested their own sensor. They realised that they were only one second from the target -- without even trying to optimise it. The plastic, originally intended primarily as a barrier, did the job better than they could have imagined, by also making the sensor faster. The discovery led to an intense period of experimental and theoretical work. "In that situation, there was no stopping us. We wanted to find the ultimate combination of nanoparticles and plastic, understand how they worked together and what made it so fast. Our hard work yielded results. Within just a few months, we achieved the required response time as well as the basic theoretical understanding of what facilitates it," says Ferry Nugroho. Detecting hydrogen is challenging in many ways. The gas is invisible and odourless, but volatile and extremely flammable. It takes only four percent hydrogen in the air to produce oxyhydrogen gas, sometimes known as knallgas, which ignites at the smallest spark. In order for hydrogen cars and the associated infrastructure of the future to be sufficiently safe, it must therefore be possible to detect extremely small amounts of hydrogen in the air. The sensors need to be quick enough that leaks can be rapidly detected before a fire occurs. "It feels great to be presenting a sensor that can hopefully be a part of a major breakthrough for hydrogen-powered vehicles.
The interest we see in the fuel cell industry is inspiring," says Christoph Langhammer, a professor at the Chalmers Department of Physics. Although the aim is primarily to use hydrogen as an energy carrier, the sensor also presents other possibilities. Highly efficient hydrogen sensors are needed in the electricity network industry and the chemical and nuclear power industries, and they can also help improve medical diagnostics. "The amount of hydrogen gas in our breath can provide answers to, for example, inflammations and food intolerances. We hope that our results can be used on a broad front. This is so much more than a scientific publication," says Christoph Langhammer. In the long run, the hope is that the sensor can be mass-produced efficiently, for example using 3D printer technology. | Pollution | 2,019
April 10, 2019 | https://www.sciencedaily.com/releases/2019/04/190410210003.htm | Millions of children worldwide develop asthma annually due to traffic-related pollution | About 4 million children worldwide develop asthma each year because of inhaling nitrogen dioxide air pollution, according to a study published today by researchers at the George Washington University Milken Institute School of Public Health (Milken Institute SPH). The study, based on data from 2010 to 2015, estimates that 64 percent of these new cases of asthma occur in urban areas. | The study is the first to quantify the worldwide burden of new pediatric asthma cases linked to traffic-related nitrogen dioxide by using a method that takes into account high exposures to this pollutant that occur near busy roads, said Susan C. Anenberg, PhD, the senior author of the study and an associate professor of environmental and occupational health at Milken Institute SPH. "Our findings suggest that millions of new cases of pediatric asthma could be prevented in cities around the world by reducing air pollution," said Anenberg. "Improving access to cleaner forms of transportation, like electrified public transport and active commuting by cycling and walking, would not only bring down NO2 levels, but would also bring broader benefits for children's health." The researchers linked global datasets of NO2 concentrations with population data and asthma incidence rates to estimate the number of new pediatric asthma cases attributable to the pollutant. Asthma is a chronic disease that makes it hard to breathe and results when the lung's airways are inflamed. An estimated 235 million people worldwide currently have asthma, which can cause wheezing as well as life-threatening attacks. The World Health Organization calls air pollution "a major environmental risk to health" and has established Air Quality Guidelines for NO2 and other air pollutants. The team found that most of the new pediatric asthma cases attributable to NO2 occurred in areas that already meet the WHO guideline. "That finding suggests that the WHO guideline for NO2 may need to be re-evaluated to make sure it is sufficiently protective of children's health," said the study's lead author, Pattanun Achakulwisut, PhD, a postdoctoral scientist at Milken Institute SPH. The researchers also found that, in general, cities with high NO2 concentrations tended to have high levels of other traffic-related pollutants as well. Additional research must be done to more conclusively identify the causative agent within complex traffic emissions, said the researchers. This effort, along with more air pollution monitoring and epidemiological studies conducted in data-limited countries, will help to refine the estimates of new asthma cases tied to traffic emissions, Anenberg and Achakulwisut added. | Pollution | 2,019
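Attributable-burden estimates like the 4 million figure are generally produced by applying a log-linear concentration-response function to gridded exposure and population data. The sketch below shows that arithmetic for a single hypothetical city; the relative risk of 1.26 per 10 ppb NO2 is a value used in this literature, and the remaining inputs are assumptions.

```python
import math

# Concentration-response: log-linear, RR of 1.26 per 10 ppb NO2.
beta = math.log(1.26) / 10.0   # per ppb
delta_no2 = 8.0                # ppb above a low-pollution counterfactual (assumed)

rr = math.exp(beta * delta_no2)
paf = (rr - 1.0) / rr          # population attributable fraction

children = 1_500_000           # hypothetical urban child population
base_incidence = 0.01          # new asthma cases per child per year (assumed)

attributable = children * base_incidence * paf
print(f"RR = {rr:.2f}, PAF = {paf:.1%}")
print(f"attributable new cases per year: {attributable:,.0f}")
```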
April 10, 2019 | https://www.sciencedaily.com/releases/2019/04/190410120609.htm | Unusual phenomenon in clouds triggers lightning flash | In a first-of-its-kind observation, researchers from the University of New Hampshire Space Science Center have documented a unique event that occurs in clouds before a lightning flash happens. Their observation, called "fast negative breakdown," documents a new possible way for lightning to form and is the opposite of the current scientific view of how air carries electricity in thunderstorms. | "This is the first time fast negative breakdown has ever been observed, so it's very exciting," said Ningyu Liu, professor of physics. "Despite over 250 years of research, how lightning begins is still a mystery. The process was totally unexpected and gives us more insight into how lightning starts and spreads." Their findings were published in a peer-reviewed journal. "These findings indicate that lightning creation within a cloud might be more bidirectional than we originally thought," said Julia Tilles, a doctoral candidate in the UNH Space Science Center. In collaboration with a lightning research team from New Mexico Institute of Mining and Technology, the researchers documented fast negative breakdown in a Florida lightning storm at Kennedy Space Center using radio waves originating deep inside the storm clouds. An array of ground-based antennas picked up the radio waves, which then allowed researchers to create a highly detailed image of the radio sources and identify this unusual phenomenon. Researchers continue to develop images from the data and hope to learn more about how often fast negative breakdown events occur and what fraction of them can initiate an actual lightning flash. | Pollution | 2,019
April 10, 2019 | https://www.sciencedaily.com/releases/2019/04/190410095934.htm | Falling levels of air pollution drove decline in California's tule fog | The Central Valley's heavy wintertime tule fog -- known for snarling traffic and closing schools -- has been on the decline over the past 30 years, and falling levels of air pollution are the cause, says a new study by scientists at the University of California, Berkeley. | Tule fog, named for a sedge grass that populates California's wetlands, is a thick ground fog that periodically blankets the Central Valley during the winter months. To find out why the fog is fading, the researchers analyzed meteorological and air pollution data from the Central Valley reaching back to 1930. They found that while yearly fluctuations in fog frequency could be explained by changes in annual weather patterns, the long-term trends matched those of pollutants in the air. The results help explain the puzzling decades-long rise and fall in the number of "fog days" affecting the region, which increased 85 percent between 1930 and 1970 and then decreased 76 percent between 1980 and 2016. This up-and-down pattern follows trends in air pollution in the valley, which rose during the first half of the century, when the region was increasingly farmed and industrialized, and then dropped off after the enactment of air pollution regulations in the 1970s. "That increase and then decrease in fog frequency can't be explained by the rising temperatures due to climate change that we've seen in recent decades, and that's what really motivated our interest in looking at trends in air pollution," said Ellyn Gray, a graduate student in environmental science, policy and management at UC Berkeley and first author on the paper, which appears online. The link between air pollution and fog also explains why southern parts of the valley -- where higher temperatures should suppress the formation of fog -- actually have a higher occurrence of fog than northern parts of the valley. "We have a lot more fog in the southern part of the valley, which is also where we have the highest air pollution concentrations," Gray said. And it makes sense, given what we know about how clouds and fog form, Gray says. Oxides of nitrogen (NOx) react with ammonia to form ammonium nitrate particles, which help trigger water vapor to condense into small fog droplets. Emissions of NOx have declined dramatically since the 1980s, resulting in a decrease in ammonium nitrate aerosols and fog. "In order to get fog to form, not only do you need the temperature to go down, but there has to be some sort of seed for water to condense around, similar to how you would have a cloud seed in the atmosphere," Gray said. "Ammonium nitrate happens to make very good fog seeds -- water is very attracted to it." As a next step, the team plans to take a close look at the association between air pollution, tule fog and traffic safety in the valley. "When I was growing up in California in the 1970s and early 1980s, tule fog was a major story that we would hear about on the nightly news," said Allen Goldstein, a professor in the Department of Environmental Science, Policy, and Management, and in the Department of Civil and Environmental Engineering at UC Berkeley and senior author on the paper. "These tule fogs were associated with very damaging multi-vehicle accidents on freeways in the Central Valley resulting from the low visibility. Today, those kinds of fog events and associated major accidents are comparatively rare."
| Pollution | 2,019 |
April 9, 2019 | https://www.sciencedaily.com/releases/2019/04/190409083220.htm | Tracking the sources of plastic pollution in world's oceans | Plastic pollution in the world's oceans is now widely recognised as a major global challenge -- but we still know very little about how these plastics are actually reaching the sea. A new global initiative, led by the University of Birmingham, shows how focussing on rivers and river mouths can yield vital clues about how we might manage this plastic crisis. | The 100 Plastic Rivers Project is engaging with scientists in more than 60 locations worldwide to sample water and sediment in rivers. The aim is to better understand how plastics are transported and transformed in rivers and how they accumulate in river sediments, where they create a long-lasting pollution legacy. First results of the project will be presented at the General Assembly of the European Geosciences Union (EGU), held in Vienna, Austria, from 7-12 April 2019. They show a complex picture, with a huge diversity in types and sources of plastic in selected river estuaries in the UK and France. Professor Stefan Krause, of the School of Geography, Earth and Environmental Sciences at the University of Birmingham, explains: "Even if we all stopped using plastic right now, there would still be decades, if not centuries-worth of plastics being washed down rivers and into our seas. We're getting more and more aware of the problems this is causing in our oceans, but we are now only starting to look at where these plastics are coming from, and how they're accumulating in our river systems. We need to understand this before we can really begin to understand the scale of the risk that we're facing." The 100 Plastic Rivers programme analyses both primary microplastics, such as micro-beads used in cosmetics, and secondary microplastics -- from larger plastic items that have broken down in the environment, or fibres from clothing. A key part of the project is to establish a standard method for the sampling and analysis of microplastics, which can be used to provide an assessment of the plastic pollution of our river networks. The Birmingham team has produced a toolkit with detailed instructions for sampling water and river sediments at locations where stream flow is known or measured, and has developed methods for automating, and thus standardising, the identification and analysis of microplastics. In a recently completed pilot study, the University of Birmingham team collaborated with the Clean Seas Odyssey citizen science project to test parts of the developed methodology, based on sampling water and sediments from river estuaries around the Channel coasts of the UK and France. By analysing the samples taken by interested members of the public, they were able to test the sampling protocol and develop a picture of the different types of polymers accumulated in river sediments at their confluence with the sea. The results of this initial survey showed a much wider variety of plastic types in the samples than had been anticipated. This shows that, even in relatively well-regulated countries like the UK and France, there are a variety of different sources contributing to a high concentration of microplastics in river systems. The research is funded by the Leverhulme Trust, the EU Horizon 2020 Framework, the Royal Society, and the Clean Seas Odyssey. | Pollution | 2,019
April 8, 2019 | https://www.sciencedaily.com/releases/2019/04/190408183729.htm | Engineers develop concept for hybrid heavy-duty trucks | Heavy-duty trucks, such as the 18-wheelers that transport many of the world's goods from farm or factory to market, are virtually all powered by diesel engines. They account for a significant portion of worldwide greenhouse gas emissions, but little has been done so far to curb their climate-change-inducing exhaust. | Now, researchers at MIT have devised a new way of powering these trucks that could drastically curb pollution, increase efficiency, and reduce or even eliminate their net greenhouse gas emissions. The concept involves using a plug-in hybrid engine system, in which the truck would be primarily powered by batteries, but with a spark ignition engine (instead of a diesel engine). That engine, which would allow the trucks to conveniently travel the same distances as today's conventional diesel trucks, would be a flex-fuel model that could run on pure gasoline, pure alcohol, or blends of these fuels. While the ultimate goal would be to power trucks entirely with batteries, the researchers say, this flex-fuel hybrid option could provide a way for such trucks to gain early entry into the marketplace by overcoming concerns about limited range, cost, or the need for excessive battery weight to achieve longer range. The new concept was developed by MIT Energy Initiative and Plasma Science and Fusion Center research scientist Daniel Cohn and principal research engineer Leslie Bromberg, who are presenting it at the annual SAE International conference on April 11. "We've been working for a number of years on ways to make engines for cars and trucks cleaner and more efficient, and we've been particularly interested in what you can do with spark ignition [as opposed to the compression ignition used in diesels], because it's intrinsically much cleaner," Cohn says. Compared to a diesel engine vehicle, a gasoline-powered vehicle produces only a tenth as much nitrogen oxide (NOx) pollution, a major component of air pollution. In addition, by using a flex-fuel configuration that allows it to run on gasoline, ethanol, methanol, or blends of these, such engines have the potential to emit far less greenhouse gas than pure gasoline engines do, and the incremental cost for the fuel flexibility is very small, Cohn and Bromberg say. If run on pure methanol or ethanol derived from renewable sources such as agricultural waste or municipal trash, the net greenhouse gas emissions could even be zero. "It's a way of making use of a low-greenhouse-gas fuel" when it's available, "but always having the option of running it with gasoline" to ensure maximum flexibility, Cohn says. While Tesla Motors has announced it will be producing an all-electric heavy-duty truck, Cohn says, "we think that's going to be very challenging, because of the cost and weight of the batteries" needed to provide sufficient range. Meeting the expected driving range of conventional diesel trucks, Cohn and Bromberg estimate, would require somewhere between 10 and 15 tons of batteries. "That's a significant fraction of the payload" such a truck could otherwise carry, Cohn says. To get around that, "we think that the way to enable the use of electricity in these vehicles is with a plug-in hybrid," he says. The engine they propose for such a hybrid is a version of one that the two researchers have been working on for years, developing a highly efficient, flexible-fuel gasoline engine that would weigh far less, be more fuel-efficient, and produce a tenth as much air pollution as the best of today's diesel-powered vehicles. Cohn and Bromberg did a detailed analysis of both the engineering and the economics of what would be needed to develop such an engine to meet the needs of existing truck operators. In order to match the efficiency of diesels, a mix of alcohol with the gasoline, or even pure alcohol, can be used, and this can be processed using renewable energy sources, they found. Detailed computer modeling of a whole range of desired engine characteristics, combined with screening of the results using an artificial intelligence system, yielded clear indications of the most promising pathways and showed that such substitutions are indeed practically and financially feasible. In both the present diesel and the proposed flex-fuel vehicles, the emissions are measured at the tailpipe, after a variety of emissions-control systems have done their work in both cases, so the comparison is a realistic measure of real-world emissions. The combination of a hybrid drive and flex-fuel engine is "a way to enable the introduction of electric drive into the heavy truck sector, by making it possible to meet range and cost requirements, and doing it in a way that's clean," Cohn says. Bromberg says that gasoline engines have become much more efficient and clean over the years, and the relative cost of diesel fuel has gone up, so that the cost advantages that led to the near-universal adoption of diesels for heavy trucking no longer prevail. "Over time, gas engines have become more and more efficient, and they have an inherent advantage in producing less air pollution," he says. And by using the engine in a hybrid system, it can always operate at its optimum speed, maximizing its efficiency. Methane is an extremely potent greenhouse gas, so if it can be diverted to produce a useful fuel by converting it to methanol through a simple chemical process, "that's one of the most attractive ways to make a clean fuel," Bromberg says. "I think the alcohol fuels overall have a lot of promise." Already, he points out, California has plans for new regulations on truck emissions that are very difficult to meet with diesel engine vehicles. "We think there's a significant rationale for trucking companies to go to gasoline or flexible fuel," Cohn says. "The engines are cheaper, exhaust treatment systems are cheaper, and it's a way to ensure that they can meet the expected regulations. And combining that with electric propulsion in a hybrid system, given an ever-cleaner electric grid, can further reduce emissions and pollution from the trucking sector." Pure electric propulsion for trucks is the ultimate goal, but today's batteries don't make that a realistic option yet, Cohn says: "Batteries are great, but let's be realistic about what they can provide." And the combination they propose can address two major challenges at once, they say. "We don't know which is going to be stronger, the desire to reduce greenhouse gases, or the desire to reduce air pollution." In the U.S., climate change may be the bigger push, while in India and China air pollution may be more urgent, but "this technology has value for both challenges," Cohn says. The research was supported by the MIT Arthur Samberg Energy Innovation Fund. | Pollution | 2,019 |
April 8, 2019 | https://www.sciencedaily.com/releases/2019/04/190408161646.htm | Researchers remove harmful hormones from Las Vegas wastewater using green algae | A common species of freshwater green algae is capable of removing certain endocrine disrupting chemicals (EDCs) from wastewater, according to new research from the Desert Research Institute (DRI) in Las Vegas. | EDCs are natural hormones and can also be found in many plastics and pharmaceuticals. They are known to be harmful to wildlife, and to humans in large concentrations, resulting in negative health effects such as lowered fertility and increased incidence of certain cancers. They have been found in trace amounts (parts per trillion to parts per billion) in treated wastewater, and also have been detected in water samples collected from Lake Mead. In a new study, DRI researchers Xuelian Bai and Kumud Acharya examined whether the common freshwater green alga Nannochloris could remove EDCs from treated wastewater. "This type of algae is very commonly found in any freshwater ecosystem around the world, but its potential for use in wastewater treatment hadn't been studied extensively," explained Bai, lead author and Assistant Research Professor of environmental sciences with the Division of Hydrologic Sciences at DRI. "We wanted to explore whether this species might be a good candidate for use in an algal pond or constructed wetland to help remove wastewater contaminants." During a seven-day laboratory experiment, the researchers grew Nannochloris algal cultures in two types of treated wastewater effluents collected from the Clark County Water Reclamation District in Las Vegas, and measured changes in the concentration of seven common EDCs. In wastewater samples that had been treated using an ultrafiltration technique, the researchers found that the algae grew rapidly and significantly improved the removal rate of three EDCs (17β-estradiol, 17α-ethinylestradiol and salicylic acid), with approximately 60 percent of each contaminant removed over the course of seven days. In wastewater that had been treated using ozonation, the algae did not grow as well and had no significant impact on EDC concentrations. One of the EDCs examined in the study, triclosan, disappeared completely from the ultrafiltration water after seven days, and only 38 percent remained in the ozonation water after seven days -- but this happened regardless of the presence of algae, and was attributed to breakdown by photolysis (exposure to light). "Use of algae for removing heavy metals and other inorganic contaminants has been extensively studied in the past, but research on removing organic pollutants has just started," said Acharya, Interim Vice President for Research and Executive Director of Hydrologic Sciences at DRI. "Our research shows both some of the potential and also some of the limitations for using Nannochloris to remove EDCs from wastewater." Although these tests took place under laboratory conditions, a previous study by Bai and Acharya, published in November 2018, found that these contaminants can be taken up by quagga mussels in the wild. "Algae sit at the base of the food web, thereby providing food for organisms in higher trophic levels such as quagga mussels and other zooplankton," Bai said. "Our study clearly shows that there is potential for these contaminants to biomagnify, or build up at higher levels of the food chain in the aquatic ecosystem." Bai is now working on a new study looking for antibiotic-resistance genes in samples collected from the Las Vegas Wash, as well as a study of microplastics in the Las Vegas Wash and Lake Mead. Although Las Vegas's treated wastewater meets Clean Water Act standards, Bai hopes that her research will draw public attention to the fact that treated wastewater is not 100 percent clean, and will also be helpful to utility managers as they develop new ways to remove untreated contaminants from wastewater prior to release. "Most wastewater treatment plants are not designed to remove these unregulated contaminants in lower concentrations, but we know they may cause health effects to aquatic species and even humans, in large concentrations," Bai said. "This is concerning in places where wastewater is recycled for use in agriculture or released back into drinking water sources." | Pollution | 2,019 |
April 8, 2019 | https://www.sciencedaily.com/releases/2019/04/190408161615.htm | Carbon-negative power generation for China | If we're going to limit global temperature increases to 2 degrees Celsius above pre-industrial levels, as laid out in the Paris Climate Agreement, it's going to take a lot more than a transition to carbon-neutral energy sources such as wind and solar. It's going to require carbon-negative technologies, including energy sources that actually reduce carbon dioxide levels in the atmosphere. | While most climate researchers and activists agree that carbon-negative solutions will be needed to meet the terms of the Paris Agreement goal, so far most of these solutions have been viewed as impractical in the near term, especially for large, coal-reliant countries like China. Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences and the Harvard-China Project on Energy, Economy and Environment, in collaboration with colleagues from Tsinghua University in Beijing and other institutions in China, Australia and the U.S., have analyzed the technical and economic viability for China to move towards carbon-negative electric power generation. The research is published in the Proceedings of the National Academy of Sciences. "This paper is making a bold suggestion that not only can China move towards negative carbon power but that it can do so in an economically competitive way," said Michael McElroy, the Gilbert Butler Professor of Environmental Studies at Harvard and a senior co-author of the paper. "The system we describe not only offers a carbon-negative alternative to generate electricity in the long run but also brings significant near-term co-benefit to reducing air pollution in China," said Xi Lu, Associate Professor in the School of Environment at Tsinghua University and first author of the paper. Lu is also a former SEAS graduate student and postdoctoral fellow. The strategy McElroy, Lu and their colleagues lay out involves the combination of two forms of green energy: coal-bioenergy gasification and carbon capture and storage. Bioenergy is one of the most important tools in the carbon-negative toolbox, because it comes from plants, which draw CO2 out of the atmosphere as they grow. The process of converting biomass into energy and then capturing and storing the waste CO2 can therefore result in net-negative emissions, but it has generally been viewed as inefficient and costly. But what if there was a way to make the process more practical and efficient? Lu, McElroy and their international team turned to an unlikely solution for green energy: coal. "If you try to do this with biofuel alone, it's not very effective," McElroy said. "The addition of coal provides an energy source that is really important. If you combine biofuel with coal and gasify the mixture, you can essentially develop a pure source of hydrogen in the process." By modeling different ratios of biofuel to coal, the researchers found that as long as at least 35 percent of the mixture is biomass and the waste carbon is captured, the power generated would actually reduce CO2 levels in the atmosphere. A key component of this strategy is the use of crop residue -- the remains of plants after fields have been harvested -- as biofuel. Seasonal agricultural fires, when farmers set fire to their fields to clear stubble after a harvest, are a major source of air pollution in China. Collecting that stubble and using it as biofuel would not only reduce CO2 levels in the atmosphere but also cut the air pollution produced by those fires. The researchers acknowledge that developing a system to collect the biomass and deliver it to power plants will take time, but they argue that the system doesn't need to be implemented all at once. "Because we've investigated the whole range of coal-to-biomass ratios, we've demonstrated how China could move incrementally towards an increasingly carbon-negative energy source," said Chris P. Nielsen, Executive Director of the Harvard-China Project and co-author of the study. "First, small amounts of biofuel could be used to reduce the net positive carbon emissions. Then, the system could grow toward carbon neutrality and eventually to a carbon-negative system. You don't have to accomplish everything from the get-go." "This study provides critical information for policy makers seeking to implement carbon-negative energy opportunities in China," said Lu. The research was co-authored by Liang Cao, Haikun Wang, Wei Peng, Jia Xing, Shuxiao Wang, Siyi Cai, Bo Shen, and Qing Yang; lead author Lu and three other China-based co-authors are alumni of the Harvard-China Project. It was supported in part by a grant from the Harvard Global Institute. | Pollution | 2,019 |
April 8, 2019 | https://www.sciencedaily.com/releases/2019/04/190408091254.htm | Current methods may inadequately measure health impacts from oil, natural gas extraction | An examination of peer-reviewed studies published over six years on hazardous air pollutants associated with the extraction of oil and natural gas finds that measurements of hazardous air pollutant concentrations near operational sites have generally failed to capture levels above standard health benchmarks; yet, the majority of studies continue to find poor health outcomes increasing as distance from these operations decreases. | While it is unclear why there is a gap in the evidence between environmental sampling and health-based studies, the current review provides insights into methodological shortcomings that may help explain this discrepancy. The authors state that current health benchmarks may not provide accurate risk estimates for the broad range of pollutants associated with oil and natural gas development, and fail to adequately address potential risks associated with long-term, chronic, lower levels of exposure or with mixtures of chemicals. Further, a failure of sampling methods to properly account for degradation and dispersion of pollutants, or inappropriate sampling timeframes that may not capture the peak emission periods characteristic of oil and natural gas extraction, may also contribute to the current gap in the literature. The authors call for additional investigations that measure emissions over appropriate timeframes and at appropriate proximity to oil and gas extraction, and for research on the health impacts of chronic, low-level ambient hazardous air pollutant exposures, among other priorities. Energy demands have increased over several decades as technical innovations have led to more extraction of oil and natural gas, making the United States one of the world's leading producers of petroleum and natural gas hydrocarbons. Several hazardous air pollutants -- such as benzene, toluene, and ethylbenzene -- that are listed by the Environmental Protection Agency as known or suspected carcinogens, or as carrying other health effects, have been measured at elevated concentrations around oil and natural gas extraction sites. The researchers reviewed 37 peer-reviewed journal articles published between Jan. 1, 2012 and Feb. 28, 2018. One focused on Poland and the rest on the U.S. This review will help guide future research on air quality near oil and natural gas development sites by highlighting future research priorities. It may also bring insights into possible exposures of communities near oil and natural gas development and storage sites such as Aliso Canyon in Los Angeles' Porter Ranch, where there was a major methane leak that affected the community. | Pollution | 2,019 |
April 8, 2019 | https://www.sciencedaily.com/releases/2019/04/190408080225.htm | Air temperatures in the Arctic are driving system change | A new paper shows that air temperature is the "smoking gun" behind climate change in the Arctic, according to John Walsh, chief scientist for the UAF International Arctic Research Center. | "The Arctic system is trending away from its 20th century state and into an unprecedented state, with implications not only within but beyond the Arctic," according to lead author Jason Box of the Geological Survey of Denmark and Greenland in Copenhagen. Several University of Alaska Fairbanks researchers are co-authors on the paper, which says that "increasing air temperatures and precipitation are drivers of major changes in various components of the Arctic system." The study is the first to combine observations of physical climate indicators, such as snow cover, with biological impacts, such as a mismatch in the timing of flowers blooming and pollinators working. Climate indicators are key pieces of information that capture the essence of a system, according to Walsh. An example would be September sea ice extent, which summarizes the effects of things like temperature, winds, ocean heat and other variables. "I didn't expect the tie-in with temperature to be as strong as it was," Walsh said. "All the variables are connected with temperature. All components of the Arctic system are involved in this change." "Never have so many Arctic indicators been brought together in a single paper," he said. The authors correlated records of observations from 1971 to 2017 of nine key indicators: air temperature, permafrost, hydroclimatology, snow cover, sea ice, land ice, wildfires, tundra and terrestrial ecosystems, and carbon cycling. All the indicators correlate with rising temperatures, pointing to a warming climate and a fundamental change in the Arctic. "Because the Arctic atmosphere is warming faster than the rest of the world, weather patterns across Europe, North America and Asia are becoming more persistent, leading to extreme weather conditions. Another example is the disruption of the ocean circulation that can further destabilize climate: for example, cooling across northwestern Europe and strengthening of storms," said Box. The paper is the flagship piece in a special issue on Arctic climate change indicators published by the journal Environmental Research Letters. The authors of the study hope that these indicator-based observations provide a foundation for a more integrated understanding of the Arctic and its role in the dynamics of the Earth's biogeophysical systems. | Pollution | 2,019 |
April 3, 2019 | https://www.sciencedaily.com/releases/2019/04/190403095516.htm | Coral study traces excess nitrogen to Maui wastewater treatment facility | A new method for reconstructing changes in nitrogen sources over time has enabled scientists to connect excess nutrients in the coastal waters of West Maui, Hawaii, to a sewage treatment facility that injects treated wastewater into the ground. | Algal blooms and degradation of coral reefs along Maui's coast have been attributed to nutrient pollution, and previous studies have suggested the Lahaina Wastewater Reclamation Facility is a major source of excess nutrients in coastal waters. Previous experiments using dye tracers showed a direct link between the facility's injection site and small submarine seeps that discharge freshwater near the coral reefs. But there are many potential sources of nitrogen, and it has been hard to show that the excess nitrogen in the water comes from the treatment plant. "They didn't have a smoking gun to say that the nitrogen came from the sewage," said Adina Paytan, a research professor in the Institute of Marine Sciences at UC Santa Cruz. Paytan and UCSC graduate student Joseph Murray worked with U.S. Geological Survey researchers Nancy Prouty and Sara Peek on the new study, published April 3. "While submarine groundwater discharge is a natural process, humans are altering the composition of the water, making the reefs vulnerable to activities we do on land," Prouty said. The researchers developed a procedure for analyzing nitrogen isotopes incorporated into the crystal structure of coral skeletons. Corals lay down new skeleton material in layers that can be seen in cross-section and dated like tree rings. This enabled the researchers to correlate changes in nitrogen isotopes in the corals with changes in the operations of the wastewater facility. In addition to the implications for addressing nutrient pollution in Maui, the new findings demonstrate the power of this technique for tracking historical changes in nutrients in seawater. "It's a very accurate and high-resolution way to record past nutrient pollution in seawater, which is a huge problem globally," Paytan said. "It opens up a lot of possibilities for looking at historic nitrogen pollution and other changes in nitrogen sources over time. And you don't have to use coral -- you could also use clam shells, for example." In Maui, the researchers collected cores from corals at sites near the submarine seeps. Their analysis provided a 40-year record of nitrogen isotopes incorporated into the coral skeletons. The results showed a dramatic change in the isotopes after 1995, corresponding with the implementation of a biological nutrient removal process at the Lahaina Wastewater Reclamation Facility. Paytan explained that the nutrient removal process preferentially removes lighter isotopes of nitrogen, so the remaining nitrogen is enriched in the heavier nitrogen-15 isotope. "There is no other process that would result in this signature, so it has to be from the sewage," she said. "Even though the treatment process removes some of the nitrogen, there is so much in the sewage that there is still a lot left, and when they inject it into the groundwater it ends up in coastal waters." In February, the U.S. Supreme Court agreed to hear a case involving the Lahaina Wastewater Reclamation Facility and the Clean Water Act, with potentially far-reaching implications for federal environmental protections. This research was supported by the USGS Coastal and Marine Geology Program and NOAA's Coral Reef Conservation Program. | Pollution | 2,019 |
April 2, 2019 | https://www.sciencedaily.com/releases/2019/04/190402113045.htm | Researchers tap rare pristine air to reveal pollution's impact | Five years ago, researchers spent three hours packed aboard a steamy Gulfstream-1 research aircraft as it zig-zagged between pristine air over the Amazon rainforest and polluted air nearby. It was like a trip back (and forth) through time, as scientists weaved between the two vastly different settings, snagging air samples characteristic of today's industrial environment as well as samples of unpolluted air, like that before the industrial age. | An international team of scientists led by Manish Shrivastava of the U.S. Department of Energy's Pacific Northwest National Laboratory has analyzed some of the data and found that human-caused pollution spurs the production of climate-changing particles known as secondary organic aerosols much more than previously thought. The team published its results in Nature Communications. The findings illustrate how pollution from cars, power plants and other sources combines with natural emissions from trees in the Amazon to spur a marked increase in tiny particles that can reflect or absorb sunlight, help create clouds, change rainfall patterns, and determine how carbon flows between the land and atmosphere -- all with dramatic effects on our planet. The result comes from a research campaign, known as GOAmazon, led by the Atmospheric Radiation Measurement Research Facility, a Department of Energy Office of Science user facility. The campaign focused on the region near and around Manaus, a Brazilian city of more than 2 million people that is surrounded by tropical forests for hundreds of miles. Scientists refer to the vast forest canopy around Manaus as a "Green Ocean," giving the campaign's name its first letters. The region offers a research environment hard to find anywhere else on earth. On one side of an undefined boundary is a straight-up tropical rainforest with crystal-clear air and levels of 300 aerosol particles per cubic centimeter; on the other side is the air over Manaus, with particle concentrations 100 times higher due to human activity. "To really understand how pollution has influenced the atmosphere, we need to compare today's atmosphere with times before the industrial age," said Shrivastava. "This is challenging; of course, we cannot go back in time. But the Amazon is one of the few places on earth where we can study atmospheric chemistry both past and present simultaneously." On that sunny day five years ago, the ARM aircraft ambled from one side of the boundary to the other, flying about the length of a football field every second, taking air samples from the pristine and then the polluted. "The region provides a wonderful natural laboratory to understand how anthropogenic emissions have an impact on atmospheric chemistry and climate," said Shrivastava. While only a tiny sliver of our planet provides the unique opportunity for this study, the findings apply to atmospheric chemistry that takes place everywhere on earth every moment. This chemistry is behind the refreshing scents of a forest meadow or fresh flowers. When those scents hit our olfactory nerve, we're actually sensing an array of gases, such as carbon-containing isoprene and terpenes, which are given off naturally by trees and other vegetation. The gases contribute to ozone and other forms of haze that affect the amount of sunlight reaching the earth. When these natural carbon emissions interact in sunlight with nitrogen oxide -- naturally emitted from soils but also a common product largely emitted by human activity -- one result is the formation of tiny particles known as secondary organic aerosols. Though aerosols are tiny, much smaller than the width of a human hair, they are no bit players when it comes to earth's climate. They're an important component in the planet's energy and carbon cycles, determining in part the fate of the planet's vast reservoir of carbon and how it flows between the land and the atmosphere. Shrivastava and colleagues sought to learn how anthropogenic emissions increase the production of these naturally occurring carbon particles -- just how extensive the effects of human activity are in spurring the transformation of gases ejected from vegetation to these powerful climate-changing particles. The team integrated these data with other laboratory measurements to develop an advanced computer model to simulate chemical reactions that form particles in the atmosphere. The model does double duty, reproducing both pre-industrial and present-day chemistry. Most models have largely been created based on present-day conditions; the Amazon measurements provide information about pre-industrial chemistry conditions that improved the model's predictive abilities. The team found that nitrogen oxide emitted from human activities like traffic and oil refineries promotes the creation of these particles from natural forest carbon much more than previously thought, causing an average increase of anywhere from 60 to 200 percent and even up to 400 percent in some cases. That's compared to the 20 percent previously estimated by scientists based on data in more polluted continental locations. The team also showed that most of these secondary carbon-containing particles were formed by this phenomenon. In their paper, Shrivastava and the other authors conclude: "Our results provide a clear picture of how anthropogenic emissions are likely to have greatly modified biogenic SOA [secondary organic aerosol] formation since preindustrial times over the Earth, and imply that rapid urbanization in future years might substantially enhance biogenic SOA formation in the pristine forested regions of the Amazon." Shrivastava is one of many scientists worldwide who are creating complex computer models to explain the atmosphere and the earth system. He describes the atmosphere as "a large chemical reactor that constantly processes both natural and human emissions, and in turn affects both climate and human health." | Pollution | 2,019 |
April 1, 2019 | https://www.sciencedaily.com/releases/2019/04/190401121811.htm | Air pollution caused by corn production increases mortality rate in US | A new study establishes that environmental damage caused by corn production results in 4,300 premature deaths annually in the United States, representing a monetized cost of $39 billion. | The paper was published in the peer-reviewed journal Nature Sustainability. The study also shows how the damage to human health of producing a bushel of corn differs from region to region and how, in some areas, the health damages of corn production are greater than its market price. "The deaths caused per bushel in western corn belt states such as Minnesota, Iowa, and Nebraska tend to be lower than in eastern corn belt states such as Illinois, Indiana, and Ohio," said lead researcher Jason Hill, associate professor at the University of Minnesota College of Food, Agricultural and Natural Resource Sciences. Researchers, including UMN professors Stephen Polasky, applied economics; Timothy Smith, bioproducts and biosystems engineering; David Tilman, College of Biological Sciences; teaching assistant professor Natalie Hunt and doctoral graduate student Sumil Thakrar, bioproducts and biosystems engineering; and scientists at other institutions, used county-level data on agricultural practices and productivity to develop a spatially explicit life-cycle-emissions inventory for corn. The data show that reduced air quality resulting from corn production is associated with 4,300 premature deaths annually in the United States, with estimated damages in monetary terms of $39 billion. This uses a value from the U.S. EPA of $9 million for each death avoided. Increased concentrations of fine particulate matter (PM2.5) are driven by emissions of ammonia -- a PM2.5 precursor -- that result from nitrogen fertilizer use. Average health damages from reduced air quality are $3.07 per bushel (56.5 lbs.) of corn, which is 62 percent of the $4.95 per bushel average corn market price of the last decade. This paper also estimates life-cycle greenhouse gas emissions of corn production, finding total climate change damages of $4.9 billion, or $0.38 per bushel of corn. "It's important for farmers to have this information so that they can implement practices that reduce the environmental impact of the crops they grow," Hill said. "Farmers can greatly improve the environmental profile of their corn by using precision agriculture tools and switching to fertilizers that have lower ammonia emissions." In addition to changing the fertilizer type and application method, the study's results suggest potential benefits from strategic interventions in corn production, including improved nitrogen use efficiency, switching to crops requiring less fertilizer, and changing the location where corn is grown. Aware that changes in practices can take time and planning, Hill suggests farmers could be offered incentives to switch to crops that demand less applied nitrogen while still offering market and nutritional benefits. "Not only are ammonia emissions from fertilizer damaging to human health, they are also a waste of money for farmers because they are not getting the benefit of the nitrogen that they're paying for," Hill said. "The number of deaths related to corn production could be reduced through these key tactics." Research was funded by the United States Department of Agriculture, the U.S. Environmental Protection Agency, the U.S. Department of Energy, and the Wellcome Trust. | Pollution | 2,019 |
April 1, 2019 | https://www.sciencedaily.com/releases/2019/04/190401115843.htm | London cyclists warned evening commute has the dirtiest air, so pick a clean route home | Cyclists in London should take a different route back home during evening peak-time hours to avoid breathing in harmful black carbon from vehicles, suggests a new collaborative air pollution study from the University of Surrey's Global Centre for Clean Air Research (GCARE), jointly with University of São Paulo (Brazil) and University of Twente (Netherlands). | According to the Department of Transport, London has just over three million licensed vehicles -- 2.7 million of which are cars. In the Greater London area, 35 per cent of all trips are made by car and 730,000 cycling trips are made every day -- a number that has grown by 154 per cent since 2000. In the study, researchers compared cyclists' exposure to black carbon on main and alternative routes in London, São Paulo and Rotterdam. Overall, the results showed that the main routes in London and São Paulo exposed cyclists to higher concentrations of black carbon compared with alternative routes. In Rotterdam, concentration levels on main and alternative routes were similar. The results also found that cyclists were exposed to twice the level of black carbon on main routes in São Paulo compared to London and Rotterdam. Interestingly, Londoners cycling home on the main route during the evening commute were exposed to more pollutants than those who took the same route in the morning, and twice as much black carbon as those who took the alternative route. Professor Prashant Kumar, Director of GCARE at the University of Surrey, said: "While it is common sense to conclude that cyclists are at risk of potentially harmful exposure levels of black carbon, our study provides further evidence that cyclists should plan alternative routes during specific times. A slower, cleaner route home could make a dramatic impact on your exposure to harmful black carbon. These findings should be considered when urban planners establish new cycle networks by increasing, as much as possible, the distance between the road and the cycle ways. This evidence should also direct decision makers to seriously invest in green infrastructure throughout our major cities, as there is mounting evidence that these could provide the best line of defence against road pollution in near-road environments." Professor Maria de Fatima from the University of São Paulo added: "As the use of vehicles continues to grow in Latin America, especially in São Paulo, it is important that we continue to gather evidence so we can understand what impact this use of mostly biofuel-blended diesel fuelled vehicles has on our local environment, our personal health and the wellbeing of our planet." | Pollution | 2,019 |
March 28, 2019 | https://www.sciencedaily.com/releases/2019/03/190328112522.htm | Sea anemones are ingesting plastic microfibers | Tiny fragments of plastic in the ocean are consumed by sea anemones along with their food, and bleached anemones retain these microfibers longer than healthy ones, according to new research from Carnegie's Manoela Romanó de Orte, Sophie Clowez, and Ken Caldeira. | One of the most common types of plastics in the ocean is microfibers, which come from washing synthetic clothing and from the breakdown of maritime equipment such as ropes and nets. Microfibers are found across all the world's oceans and are beginning to appear in fish and shellfish consumed by humans. "Plastic pollution is a serious and growing problem for our oceans and the animals that live in them," Romanó de Orte said. "We wanted to understand how these long-lived contaminants are affecting fragile coral reef ecosystems. Plastics could be confused by the organisms for food and could also be carriers of other harmful contaminants. Since sea anemones are closely related to corals, we decided to study sea anemones in the laboratory to better understand the effects of plastics on corals in the wild." Most laboratory research on plastic pollution uses tiny beads of plastic, not microfibers. So Romanó de Orte, Clowez, and Caldeira set out to determine whether microfibers are ingested by healthy sea anemones and by those that have lost the symbiotic algae that provide them with nutrients, a condition called bleaching. On coral reefs, bleaching is caused by increasing ocean temperatures due to global climate change. The research team introduced three different kinds of microfibers -- nylon, polyester, and polypropylene -- to both unbleached and bleached sea anemones, either alone or mixed with brine shrimp. They found that when introduced alone, nylon was consumed by about a quarter of the unbleached anemones and the other two microfibers were not taken up at all. But when the microfibers were mixed with brine shrimp, about 80 percent of the unbleached anemones ingested all three microfibers. For the bleached anemones, 60 percent consumed nylon and 20 percent consumed polyester with no food present, and 80 percent took up all three microfibers when mixed with brine shrimp. It took longer for the bleached anemones to expel the microfibers after ingesting them than it did for the healthy anemones, although all microfibers were gone by the third day. However, in a natural marine environment, anemones and coral would continually be reintroduced to new microfibers, making the contamination a chronic condition of their existence. "Our work suggests that plastic pollution and climate change are packing a one-two punch for coral reefs," Caldeira explained. "When the reefs are bleached by hot ocean temperatures, the organisms are more likely to eat and retain plastic microfibers. It looks like the effects of global warming and of ocean pollution don't just add together, they multiply." | Pollution | 2,019 |
March 27, 2019 | https://www.sciencedaily.com/releases/2019/03/190327080702.htm | How light from street lamps and trees influence the activity of urban bats | Artificial light is rightly considered a major social, cultural and economic achievement. Yet, artificial light at night is also said to pose a threat to biodiversity, especially affecting nocturnal species in metropolitan areas. It has become clear that the response by wildlife to artificial light at night might vary across species, seasons and lamp types. A study conducted by a team led by the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) sheds new light on how exactly ultraviolet (UV) emitting and non-UV emitting street lamps influence the activity of bats in the Berlin metropolitan area and whether tree cover might mitigate any effect of light pollution. | Natural sunlight sets the pace of day and night on our planet. Over millions of years, wildlife and people have adapted to the rhythm of the natural photoperiod. As creatures of daytime, humans have expanded their ecological niche into night-time by inventing and using artificial light. Yet nocturnal animals such as bats may suffer from the detrimental effects of artificial light generated by street lamps, a phenomenon now recognised as light pollution. As it turns out, bat responses to light pollution were complex. "We observed a higher activity of two pipistrelle bat species, the common pipistrelle and Nathusius' pipistrelle, in areas with high numbers of UV emitting street lamps," explains Tanja Straka, scientist at the IZW's Department of Evolutionary Ecology and first author of the study. These opportunistic species may feed on insects that are attracted to UV emitting lamps. "However, all other species were less active at and even repelled by the lamps, irrespective of whether the light they emitted did or did not contain UV light," adds Straka. The novelty of this study is that these effects were considered in relation to tree cover. Not only do trees provide bats with shelter during daytime, trees may also provide shade for bats in illuminated areas. "Our goal was to determine whether and how tree cover influences any responses of bats to artificial light," says Straka. The team found that the response of bats to artificial light was intensified in areas with high tree cover. For example, the attraction of Pipistrellus pipistrellus to UV light was more pronounced when many trees were present, probably because UV light attracted insects from the vegetation. On the other hand, mouse-eared bats (Myotis spp.) were less frequently recorded in areas with a high number of street lamps (irrespective of UV or no UV emission) and lots of trees. Mouse-eared bats seem to be particularly light-sensitive and avoid illuminated areas even when these include trees or shrubs. The team also found that high-flying insectivorous bats were more active in areas where the light emission from LED street lanterns was dampened by high tree cover than in areas with many LED lanterns and no trees. "LED lights do not attract large numbers of insects and therefore they are not attractive as foraging grounds for high-flying bats; they might even be repelled by light spillover from LED lamps. Tree cover seems to reduce light spillover, which enables high-flying bats to fly in the shadow of the tree canopy," Straka explains. These results are based on the analysis of more than 11,000 bat calls recorded during three months at 22 sites in the Berlin city area. Bat calls were identified by species and the activity of bats was calculated for each species and site. These data were compared with features of the landscape, such as tree cover and the intensity of light pollution as estimated by remote sensing (i.e., satellite-based data). In addition, the exact location of street lamps and information on UV light emission was used to estimate the level of light pollution in the study area. "The bottom line is that for bats the relation between artificial light and vegetation is complex and it varies between species, yet overall artificial light at night has negative consequences for bats," concludes Christian Voigt, head of the Department of Evolutionary Ecology. "Even those species that may hunt at street lamps opportunistically will suffer in the long run from the constant drain of insects dying at street lamps. Trees are important for urban bats, not only as a shelter but also as a source for prey insects. Hence artificial light should be avoided in habitats with many trees." Adding trees in highly lit areas or turning off lights when the area is not in use could substantially contribute to the conservation of bats and possibly also other nocturnal wildlife, because trees provide shade and refuge that bats urgently need. | Pollution | 2,019 |
March 25, 2019 | https://www.sciencedaily.com/releases/2019/03/190325080434.htm | Catalyst advance removes pollutants at low temperatures | Researchers at Washington State University, University of New Mexico, Eindhoven University of Technology, and Pacific Northwest National Laboratory have developed a catalyst that can both withstand high temperatures and convert pollutants at near room temperature -- an important advance for reducing pollution in modern cars. | Catalytic converters have been used in the U.S. since the 1970s as a way to clean up pollutants from vehicle exhaust. In the catalytic process, rare metals, such as platinum, are used in a chemical reaction to convert carbon monoxide and other pollutants to non-toxic carbon dioxide, nitrogen, and water. As cars have become more fuel-efficient, however, they use less energy and the temperature of the exhaust gases is lower, which makes it harder to clean up the pollutants. In fact, the U.S. Department of Energy has set a goal of removing 90 percent of harmful emissions at 150 degrees Celsius or lower. The catalysts have to perform at low temperatures but also must survive under the harsh conditions encountered during operation. "The catalyst problem has increased paradoxically as cars have become better and more efficient," said Emiel Hensen, catalysis professor at Eindhoven University of Technology. Meanwhile, the industry also struggles with the high cost of the precious metals required for catalysis. Platinum, for instance, facilitates chemical reactions for many commonly used products and processes but costs more than $800 per ounce. The catalyst the researchers developed is based on the activation of single atoms of platinum supported on cerium oxide. While their catalyst outperforms current technology, it also reduces the amount of platinum required, which would lower overall costs. "The industry wants to make use of every single atom of the precious metals, which is why single-atom catalysis has attracted increased attention," said Abhaya Datye, a distinguished professor at UNM's Department of Chemical and Biological Engineering. In their latest work, the researchers first ensured their catalysts were thermally stable, trapping platinum ions on a cerium oxide support at very high temperatures. Their synthesis method caused the platinum atoms to strongly bond to their support. They then activated the catalyst in carbon monoxide at about 275 degrees Celsius. "To our surprise, we discovered that the high temperature synthesis made the ceria more easily reducible, allowing it to provide a key ingredient -- oxygen -- to active sites," said Yong Wang, Voiland Distinguished Professor in the Gene and Linda Voiland School of Chemical Engineering and Bioengineering at WSU. The activated oxygen was then able to remove pollutants at near room temperature at the platinum sites. "This research directly addresses the 150-degree challenge identified by the U.S. Department of Energy and by automobile companies," said Wang. "The discovery of oxygen activation at near room temperature is extremely useful, and this finding could have a significant impact on the technology of exhaust emission control." The researchers now plan to study the performance of single-atom catalysts with other organic compounds and pollutants. The work was funded by the U.S. Department of Energy's Office of Basic Energy Sciences and the Netherlands Research Center for Multiscale Catalytic Energy Conversion. | Pollution | 2,019 |
March 25, 2019 | https://www.sciencedaily.com/releases/2019/03/190325080426.htm | Bacteria may travel thousands of miles through the air globally | Bacteria may travel thousands of miles through the air worldwide instead of hitching rides with people and animals, according to Rutgers and other scientists. Their "air bridge" hypothesis could shed light on how harmful bacteria share antibiotic resistance genes. | "Our research suggests that there must be a planet-wide mechanism that ensures the exchange of bacteria between faraway places," said senior author Konstantin Severinov, a principal investigator at the Waksman Institute of Microbiology and professor of molecular biology and biochemistry in the School of Arts and Sciences at Rutgers University-New Brunswick. "Because the bacteria we study live in very hot water -- about 160 degrees Fahrenheit -- in remote places, it is not feasible to imagine that animals, birds or humans transport them," Severinov said. "They must be transported by air and this movement must be very extensive so bacteria in isolated places share common characteristics." Severinov and other researchers studied the "molecular memories" of bacteria from their encounters with viruses, with the memories stored in bacterial DNA. Bacteriophages -- viruses of bacteria -- are the most abundant and ubiquitous forms of life on the planet, the study notes. The viruses have a profound influence on microbial populations, community structure and evolution. The scientists collected heat-loving bacteria from hot springs in remote locations thousands of miles apart. In bacterial cells infected by viruses, molecular memories are stored in special regions of bacterial DNA called CRISPR arrays. Cells that survive infections pass the memories -- small pieces of viral DNA -- to their offspring. The order of these memories allows scientists to follow the history of bacterial interaction with viruses over time. Initially, the scientists thought that bacteria of the same species living in hot springs thousands of miles apart -- and therefore isolated from each other -- would have very different memories of their encounters with viruses. That's because the bacteria all should have independent histories of viral infections. The scientists also thought that bacteria should be evolving very rapidly and becoming different, much like the famous finches Charles Darwin observed on the Galapagos Islands. "What we found, however, is that there were plenty of shared memories -- identical pieces of viral DNA stored in the same order in the DNA of bacteria from distant hot springs," Severinov said. "Our analysis may inform ecological and epidemiological studies of harmful bacteria that globally share antibiotic resistance genes and may also get dispersed by air instead of human travelers." The scientists want to test their air bridge hypothesis by sampling air at different altitudes and locations around the world and by identifying the bacteria there, he said. They would need access to planes, drones or research balloons. | Pollution | 2,019 |
March 25, 2019 | https://www.sciencedaily.com/releases/2019/03/190325080359.htm | Particulate air pollution linked with reduced sperm production in mice | Exposure to tiny air pollution particles may lead to reduced sperm production, suggests new research in mice to be presented Monday, March 25 at ENDO 2019, the Endocrine Society's annual meeting in New Orleans, La. | "Infertility rates are increasing around the world, and air pollution may be one of the main factors," said lead researcher Elaine Maria Frade Costa, M.D., Ph.D., of Sao Paulo University in Sao Paulo, Brazil. The World Health Organization (WHO) estimates that approximately 15 percent of the global population has difficulty with fertility, and male infertility accounts for about half of those problems. The study looked at the effect of particulate matter (PM) on sperm production. PM is a mixture of solid particles and liquid droplets found in the air. PM2.5 is a fine inhalable particle with a diameter of 2.5 micrometers or smaller. The average human hair is about 70 micrometers in diameter, making it roughly 30 times larger than the biggest fine particle. PM2.5 is known to disrupt the endocrine system in humans and animals. The endocrine system is involved in reproduction, including the production of sperm. The study included four groups of mice. One was exposed to PM2.5 from Sao Paulo both before and after birth -- during gestation and then from the day they were weaned from their mother's milk until adulthood. The second group was exposed only during gestation. The third group was exposed after birth, from weaning until adulthood; and the fourth group was exposed only to filtered air during gestation and from the time they were weaned until adulthood. The researchers analyzed the testes of the mice and their production of sperm. DNA tests were used to evaluate gene expression, the process by which genes in DNA provide instructions for proteins. The sperm-producing tubes in the testes of all the exposed mice showed signs of deterioration. In comparison with the mice not exposed to PM2.5, the sperm of the first group, which was exposed before and after birth, was of significantly worse quality. The exposure to PM2.5 led to changes in the levels of genes related to testicular cell function. Exposure to PM2.5 after birth seemed to be the most harmful to testicular function, the study found. Costa said these changes are epigenetic, which means they are not caused by changes in the DNA sequence. Epigenetic changes can switch genes on or off and determine which proteins a gene expresses. The research demonstrates for the first time that exposure to the air pollution of a large city impairs production of sperm through epigenetics, mainly through exposure after birth, Costa said. "These findings provide more evidence that governments need to implement public policies to control air pollution in big cities," she said. | Pollution | 2,019 |
March 23, 2019 | https://www.sciencedaily.com/releases/2019/03/190323113751.htm | Breakthrough in air purification with a catalyst that works at room temperature | Researchers from Tokyo Metropolitan University have shown that a newly engineered catalyst made of gold nanoparticles supported on a metal oxide framework drives the breakdown of ammonia impurities in air, with excellent selectivity for conversion to nitrogen gas. Importantly, it is effective at room temperature, making it suitable for everyday air purification systems. The team successfully identified the mechanism behind this behavior, paving the way towards the design of other novel catalytic materials. | The distinctive, sharp odor of ammonia is familiar to many. It is a common industrial chemical, primarily used as feedstock for fertilizers as well as disinfectants in both household and medical settings. It is also highly toxic when concentrated; the United States Occupational Safety and Health Administration has a strict upper limit of 50 parts per million in breathing air averaged over an eight-hour working day and forty-hour working week. Given its wide industrial use and presence in nature, it is paramount that effective measures be in place to remove unwanted ammonia from the atmosphere in our everyday working and living environments. Catalysts, like the ones found in the catalytic converters of cars, can help solve this problem. Unlike filters which simply trap harmful substances, catalytic filters can help break ammonia down into harmless products like nitrogen gas and water. Not only is this safer, preventing the buildup of toxic chemicals, it also makes it unnecessary to replace the filters regularly. However, common existing catalysts for ammonia only function at temperatures of over 200 degrees Celsius, making them inefficient as well as inapplicable for household settings. Now, a team led by Project Professor Toru Murayama from Tokyo Metropolitan University has designed a catalytic filter which can function at room temperature. Consisting of gold nanoparticles stuck onto a framework of niobium oxide, the newly designed filter is highly selective in what it converts ammonia to, with nearly all conversion to harmless nitrogen gas and water and no nitrogen oxide byproducts. This is known as selective catalytic oxidation (SCO). They collaborated with industrial partners from NBC Meshtec Inc. to produce a working prototype; the filter has already been used to reduce ammonia in contaminated gases to undetectable levels. Importantly, the team also successfully uncovered the mechanism by which the material works. They showed that gold nanoparticles play an important role, with increased loading leading to increased catalytic activity; they also found that the choice of framework was extremely important, showing experimentally that particular chemical sites on the niobium oxide surface are key to the conversion. This work was supported by a grant from the Platform for Technology and Industry of the Tokyo Metropolitan Government. | Pollution | 2,019 |
March 21, 2019 | https://www.sciencedaily.com/releases/2019/03/190321163617.htm | Energy stealthily hitches ride in global trade | Fulfilling the world's growing energy needs summons images of oil pipelines, electric wires and truckloads of coal. But Michigan State University scientists show a lot of energy moves nearly incognito, embedded in the products of a growing society. | And that energy leaves its environmental footprint at home. In this month's journal Applied Energy, the researchers show that the virtual energy transferred west to east was much greater than the physical energy that moves through China's massive infrastructure investment, the West-To-East Electricity Transmission Project. China is a powerful model of energy use, having surpassed the United States. In 2013, nearly 22 percent of global energy use occurred in China. "Conserving energy and managing its accompanying environmental impacts are a growing concern across the world, and it is crucial to take a holistic look at all the ways energy is used and moved," said Jianguo "Jack" Liu, Rachel Carson Chair in Sustainability of MSU's Center for Systems Integration and Sustainability (CSIS). "Only when we understand the full picture of who is producing energy and who is consuming it in all its forms can we make effective policy decisions." Virtual energy is considered critical to avoiding regional energy crises since commodities traded from one location to another include virtual energy. This means the importing area can avoid expending the energy to produce the imported commodities. The paper "Shift in a National Virtual Energy Network" examines how a region's energy haves and have-nots meet economic and energy needs by acknowledging energy is tightly wound around economic growth and demand. The researchers are the first to focus on energy use after the 2008 global financial crisis, seeing how economic desperation can have a big, not always obvious, impact on energy -- and the pollution and environmental degradation that can accompany its use. "China, like a lot of places across the globe, has an uneven distribution of energy resources, and China also is developing quickly," said the article's first author Zhenci Xu, an MSU-CSIS PhD student. "We wanted to understand the true paths of energy use when economic growth kicks into gear after a financial crisis. Eventually, all the costs of energy use, including environmental damage and pollution, have to be accounted for, and current policies focus primarily on physical energy, not virtual energy." The researchers found that a persistent flow of total virtual energy from energy-scarce to energy-abundant provinces increased from 43.2% in 2007 to 47.5% in 2012. Following the framework of metacoupling (socioeconomic-environmental interactions within as well as between adjacent and distant places), they also discovered that, after the financial crisis, trade was taking place between distant provinces -- trade that came with energy's environmental footprint. The authors note these types of analyses are needed across the globe to guide policies that hold areas shifting their energy consumption accountable for contributing appropriately to mitigating the true costs of energy. | Pollution | 2,019
March 18, 2019 | https://www.sciencedaily.com/releases/2019/03/190318132607.htm | Key discovery on how alpine streams work | An EPFL study has prompted scientists to rethink a standard approach used to calculate the velocity of gas exchange between mountain streams and the atmosphere. Research conducted in streams in Vaud and Valais indicates that equations used to predict gas exchange based on data from lowland streams undershoot the actual gas exchange velocity in mountain streams on average by a factor of 100. | This discovery -- appearing in Nature Geoscience -- comes from a study conducted at EPFL's Stream Biofilm and Ecosystem Research Laboratory (SBER), within the School of Architecture, Civil and Environmental Engineering (ENAC). In aquatic ecosystems, such as the world's oceans, streams and lakes, numerous aquatic organisms, ranging from bacteria to fish, respire oxygen and exhale CO2. In steep, turbulent mountain streams, air bubbles are entrained in the churning water. These bubbles accelerate the gas exchange. Strikingly, the same mechanism is at work when white-capped waves appear on the surface of rough seas. Until now, scientists have ignored the contribution from air bubbles and have used the same approach to calculate gas exchange velocities in mountain streams as in calm lowland streams. It is intuitive that the rugged terrain would influence gas exchange in mountain streams, but no evidence had been collected to test this hypothesis until 2016. That's when EPFL researchers installed more than 130 environmental sensors in mountain streams in Vaud and Valais to study these physical phenomena and related biogeochemical fluxes. To measure gas exchange velocity as accurately as possible, one of the SBER scientists and first author of the study -- Amber Ulseth -- along with others, added small amounts of argon as a tracer gas to the streams. Argon is a naturally occurring gas that is harmless to aquatic ecosystems. Using cutting-edge analytical methods in the laboratory, Amber Ulseth and colleagues were able to quantify loss of argon from the streamwater. Next, they modeled the gas exchange velocity from the downstream loss of the tracer gas in the streamwater. Their results reveal that the gas exchange velocity in mountain streams is on average 100 times higher than predicted from equations developed from similar tracer gas experiments in lowland streams. "Our findings have major implications. They suggest that we have been underestimating the effects of all the small but abundant mountain streams in our biogeochemical models. This opens up a new research avenue," says Tom Battin, Director of SBER and coauthor of the study. His lab is already looking into extensions of this research, such as developing a new model to predict CO2 fluxes from mountain streams. | Pollution | 2,019
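The tracer method described above lends itself to a compact calculation: if the argon concentration declines exponentially along a reach, the gas exchange velocity follows from the upstream-to-downstream concentration ratio. Below is a minimal Python sketch of that relationship; the first-order decay form and all stream dimensions and concentrations are illustrative assumptions, not values from the EPFL study.

```python
import numpy as np

def gas_exchange_velocity(c_upstream, c_downstream, reach_length_m,
                          depth_m, flow_velocity_m_s):
    """Estimate gas exchange velocity k (m/day) from downstream loss of a
    conservative tracer gas such as argon, assuming first-order decline:
        C_down = C_up * exp(-k * t / d),  t = reach travel time, d = depth
    """
    travel_time_s = reach_length_m / flow_velocity_m_s
    k_m_per_s = (depth_m / travel_time_s) * np.log(c_upstream / c_downstream)
    return k_m_per_s * 86400  # convert m/s to m/day

# Illustrative numbers only: a steep, shallow mountain reach loses far more
# tracer over 100 m than a deep, slow lowland reach does.
k_mountain = gas_exchange_velocity(1.00, 0.70, 100, 0.2, 0.8)
k_lowland = gas_exchange_velocity(1.00, 0.995, 100, 0.5, 0.3)
print(f"mountain stream: k ~ {k_mountain:.0f} m/day")
print(f"lowland stream:  k ~ {k_lowland:.2f} m/day")
```

With these made-up inputs the mountain reach comes out roughly two orders of magnitude higher than the lowland one, mirroring the factor-of-100 gap the study reports.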
March 15, 2019 | https://www.sciencedaily.com/releases/2019/03/190315161036.htm | Nitrogen pollution's path to streams weaves through more forests (and faster) than suspected | Nitrogen in rain and snow falls to the ground where, in theory, it is used by forest plants and microbes. New research by a scientific collaboration led by the USDA Forest Service shows that more nitrogen from rain and snow is making it to more streams than previously believed and flowing downstream in forests of the United States and Canada. The study, "Unprocessed atmospheric nitrate in waters of the Northern Forest Region in the USA and Canada," was published this week in the journal Environmental Science & Technology. | Scientists found that some nitrate, which is a form of nitrogen that plants and microbes can use, occasionally moves too fast for biological uptake, resulting in "unprocessed" nitrate bypassing the otherwise effective filter of forest biology. The study links pollutant emissions from various and sometimes distant sources including industry, energy production, the transportation sector and agriculture to forest health and stream water quality. "Nitrogen is critical to the biological productivity of the planet, but it becomes an ecological and aquatic pollutant when too much is present," said Stephen Sebestyen, a research hydrologist with the USDA Forest Service's Northern Research Station based in Grand Rapids, Minn., and the study's lead author. "From public land managers to woodlot owners, there is a great deal of interest in forest health and water quality. Our research identifies widespread pollutant effects, which undermines efforts to manage nitrogen pollution." Sebestyen and 29 co-authors completed one of the largest and longest examinations to trace unprocessed nitrate movement in forests. Scientists from several federal agencies and 12 academic institutions in the United States, Canada, and Japan collected water samples in 13 states and the province of Ontario, ultimately compiling more than 1,800 individual nitrate isotope analyses over the course of 21 years. "We generally assumed that nitrate pollution would not travel a great distance through a forest because the landscape would serve as an effective filter," Sebestyen said. "This study demonstrates that while we have not been wrong about that, we needed more information to be better informed." Forests overall use most nitrate, unless rainfall and snowmelt runoff during higher flow events lead to brief but important windows when unprocessed nitrate flows to streams, sometimes at levels that are unexpectedly high. Too much nitrogen contributes to forest decline and growth of nuisance vegetation in lakes and ponds. Tree species have varying levels of tolerance for nitrogen. Too much nitrogen can change forest composition and provide a foothold for non-native plants. "I'm concerned with how air pollution affects forests and watersheds," said Trent Wickman, an Air Resource Specialist with the USDA Forest Service's Eastern Region and a co-author of the study. "There are a number of federal and state programs that aim to reduce nitrogen air pollution from vehicles and industrial sources. 
Understanding the fate of nitrogen that originates in the air, but ends up on land, is important to gauge the effectiveness of those pollution reduction programs." Sebestyen and the study's co-authors suggest that because unprocessed nitrogen is not being filtered by natural vegetation to the extent previously believed, monitoring coupled with this baseline information is needed to give land managers a more nuanced view of forest health issues. | Pollution | 2,019
March 14, 2019 | https://www.sciencedaily.com/releases/2019/03/190314123149.htm | Solar-powered moisture harvester collects and cleans water from air | Access to clean water remains one of the biggest challenges facing humankind. A breakthrough by engineers at The University of Texas at Austin may offer a new solution through solar-powered technology that absorbs moisture from the air and returns it as clean, usable water. | The breakthrough is described in a recent issue of the journal Advanced Materials. A research team led by Guihua Yu in UT Austin's Cockrell School of Engineering combined hydrogels that are both highly water absorbent and can release water upon heating. This unique combination has been successfully proved to work in humid and dry weather conditions and is crucial to enabling the production of clean, safe drinking water from the air. With an estimated 50,000 cubic kilometers of water contained in the atmosphere, this new system could tap into those reserves and potentially lead to small, inexpensive and portable filtration systems. "We have developed a completely passive system where all you need to do is leave the hydrogel outside and it will collect water," said Fei Zhao, a postdoctoral researcher on Yu's team and co-author of the study. "The collected water will remain stored in the hydrogel until you expose it to sunlight. After about five minutes under natural sunlight, the water releases." This technology builds upon a 2018 breakthrough made by Yu and Zhao in which they developed a solar-powered water purification innovation using hydrogels that cleans water from any source using only solar energy. The team's new innovation takes that work a step further by using the water that already exists in the atmosphere. For both hydrogel-based technologies, Yu and his research team developed a way to combine materials that possess both hygroscopic (water-absorbing) qualities and thermal-responsive hydrophilicity (the ability to release water upon simple heating). "The new material is designed to both harvest moisture from the air and produce clean water under sunlight, avoiding intensive energy consumption," said Yu, an associate professor of materials science and mechanical engineering. Harvesting water from moisture is not exactly a new concept. Most refrigerators keep things cool through a vapor condensation process. However, the common fridge requires lots of energy to perform that action. The UT team's technology requires only solar power, is compact and can still produce enough water to meet the daily needs of an average household. Prototype tests showed daily water production of up to 50 liters per kilogram of hydrogel. Representing a novel strategy to improve upon atmospheric water harvesting techniques being used today, the technology could also replace core components in existing solar-powered water purification systems or other moisture-absorbing technologies. Yu and his team have filed a patent, and Yu is working with UT's Office of Technology Commercialization on the licensing and commercialization of this innovative class of hydrogels. The research was funded by the Alfred P. Sloan Foundation, the Camille & Henry Dreyfus Foundation and the National Science Foundation. | Pollution | 2,019
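The reported yield of up to 50 liters per kilogram per day invites a quick sizing estimate. The sketch below is back-of-the-envelope arithmetic only; the household demand figure is an assumption for illustration, not a number from the UT Austin team.

```python
# Back-of-the-envelope sizing for the hydrogel harvester.
YIELD_L_PER_KG_DAY = 50   # reported prototype yield (upper bound)
DEMAND_L_PER_DAY = 20     # assumed drinking/cooking water for a family of four

hydrogel_kg = DEMAND_L_PER_DAY / YIELD_L_PER_KG_DAY
print(f"hydrogel needed: {hydrogel_kg:.1f} kg")  # -> 0.4 kg under ideal yield
```

Under these assumed conditions, a few hundred grams of gel would cover basic daily drinking needs, consistent with the article's claim that the system is small and portable.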
March 13, 2019 | https://www.sciencedaily.com/releases/2019/03/190313080451.htm | What do gardens bring to urban ecosystems? | "A healthy community requires healthy soil." This idea spurred a consortium of researchers, farmers, and community garden practitioners to dive into the challenges -- and opportunities -- of urban agriculture. Their efforts, now in a second year, may highlight how urban soil can be a resource for human and environmental health. | "We can benefit from how we manage the environment," says researcher Jennifer Nicklay. "Clean water, clean air, and agriculture benefit us, our waterways, and wildlife. We put a value on crop yield, which is all well and good. But in urban ag, we're in such proximity to other humans. The other benefits become really important to think of as a whole." Nicklay is a doctoral student at the University of Minnesota. Along with researchers at the University of St. Thomas and Hamline University, all located in the Minneapolis/St. Paul region, Nicklay is working with four urban growers to understand the contributions of city soils. The growers have unique approaches to their urban plots. One group emphasizes community building and education, another culturally-relevant food. Another uses a community-supported agriculture model, while a final group emphasizes community reconciliation over yield. For all groups, land permanence in the urban environment is a challenge. A lease may expire, a city code may prevent perennial plantings, or a tax burden prove unmanageable. "When you don't know how long you'll be there, it's hard to invest in long-term solutions," Nicklay says. "All the growers value land tenure and land access." From the growers' perspective, "healthy soil" means it has enough organic matter and nutrients to encourage good plant growth. It's loose instead of compacted so water can move freely. From here, the concept of a "healthy community" moves upwards from the microbiome of helpful soil bacteria to insects, wildlife, and humans. There's often more than just soil in the soil, from copper wires to chemical contamination. This challenges the growers. Researchers hope to also determine if they can leave the urban plots better than they found them. The team is comparing the findings to another urban farm owned and monitored by the University of St. Thomas. They are also comparing the urban ag plots to urban green spaces such as parks. To do so, researchers gather soil and plant samples -- some weekly, some less often -- for 20 different lab tests. The results will provide information on urban ag's ecosystem services: changes to microbe and insect populations, water quality, soil fertility, and greenhouse gas emissions. Researchers also measure how much each urban plot will grow given different growing practices. The two distinct growing experiences build on each other. "The University of St. Thomas farm allows us to scaffold the data. We can control more variables, see patterns and put them into context. In the less-controlled scenarios of our four urban growers we see the range of possibilities in the real world," Nicklay explains. The team operates within a unique collaborative model. An annual "All Hands" meeting in the waning winter months unifies community and university participants with common goals. Weekly workdays and check-ins during the growing season maintain contact with grower sites to help share findings and address concerns. 
Community meetings and events throughout the year continue this close relationship. "These regular, repeated interactions -- in ways that are both related and not related to the project -- are really, really important," Nicklay emphasizes. "It allows us to honor grower and community knowledge in all aspects of our work, from generating questions to designing methods to analyzing data." Nicklay says the process is time-intensive but rewarding. "When something hasn't gone well, they tell me. We're able to work through it," she says. "We're getting so much from the farmers. We want to give back and answer community questions. We make sure people know we're here and invested in their success." This research project will conclude in 2020. Researchers hope their findings will help urban growers and policymakers make better land use decisions. "We need local, data-driven evaluation of these ecosystem services to complement our narratives and experiments in order to maximize land use strategies," Nicklay says. "Already, we're thinking to the future. We know that there are innumerable community and home gardens in Minneapolis and St. Paul, and we want to figure out how to capture the impacts they are having. We can help researchers, growers, communities, and policymakers understand the potential impacts of urban agriculture at this larger scale." Nicklay presented this project at the Soil Science Society of America International Soils Meeting, Jan. 6-9, in San Diego. Funding from the USDA North Central Region Sustainable Research and Education Grant and the University of Minnesota Consortium on Law and Values in Health, Environment, and Life Sciences supports this continuing research. | Pollution | 2,019
March 13, 2019 | https://www.sciencedaily.com/releases/2019/03/190313143307.htm | Review of noise impacts on marine mammals yields new policy recommendations | Marine mammals are particularly sensitive to noise pollution because they rely on sound for so many essential functions, including communication, navigation, finding food, and avoiding predators. An expert panel has now published a comprehensive assessment of the available science on how noise exposure affects hearing in marine mammals, providing scientific recommendations for noise exposure criteria that could have far-reaching regulatory implications. | Published March 12 in the journal Aquatic Mammals, the new paper updates noise exposure criteria the panel first proposed in 2007. "One of the things we did in 2007 was to identify major gaps in our knowledge, and we now have considerably more data. We thought there was enough new science to reconvene the panel and revisit these issues," said lead author Brandon Southall, who served as director of NOAA's Ocean Acoustics Program from 2004 to 2009. Concern about the potential for ocean noise to cause hearing damage or behavioral changes in marine mammals began to mount in the 1990s, focusing initially on activities related to the oil and gas industry. In the early 2000s, the association of sonar with mass strandings of deep-diving whales became another focus of concern. Shipping and construction activities are other important sources of ocean noise pollution. Loud noises can cause temporary or permanent hearing loss, can mask other sounds, and can disturb animals in various ways. The new paper focuses on direct effects of noise pollution on hearing in marine mammals. Separate papers addressing behavioral effects and the acoustics of different sound sources will be published later this year. "Noise-induced hearing loss occurs in animals the same way it does in humans. You can have a short-term change in response to exposure to loud noise, and you can also have long-term changes, usually as a result of repeated insults," said coauthor Colleen Reichmuth, a research scientist who leads the Pinniped Cognition and Sensory Systems Laboratory at UC Santa Cruz. Because animals vary in their sensitivities to different types and frequencies of sound, the panel categorized marine mammal species into groups based on what was known about their hearing. The new paper includes all living species of marine mammals. "The diversity of species is such that a one-size-fits-all approach isn't going to work," said coauthor Darlene Ketten, a neuro-anatomist with joint appointments at Woods Hole Oceanographic Institution and Boston University's Hearing Research Center. "We need to understand how to avoid harm, and the aim is to provide guidelines to say, if this or that species is in your area, here's what you need to avoid." Over the past decade, the number of scientific studies on hearing in marine mammals has grown rapidly, enabling the panel to refine and improve its groupings and assessments. Accompanying the paper is a set of appendices compiling all the relevant information for 129 species of marine mammals. "We did a comprehensive review, species by species, for all living marine mammals," said Reichmuth, who led the work on the appendices. "We pulled together the available knowledge covering all aspects of hearing, sound sensitivity, anatomy, and sound production. That's the scientific basis for the species groupings used in the noise exposure criteria." "The appendices are a really important resource that does not exist anywhere else," Southall said. 
"The 2007 paper was the most impactful single paper I've ever published -- it's been cited in the literature more times than all my other papers combined -- and I expect this new paper will have a similar impact."The 2007 paper covered only those species under the jurisdiction of the National Marine Fisheries Service (NOAA Fisheries). NOAA Fisheries issued U.S. regulatory guidance in 2016 and 2018 based on the 2007 paper and a 2016 Navy technical report by James Finneran, a researcher at the U.S. Navy Marine Mammal Program in San Diego and a coauthor of both papers.In addition to covering all marine mammals for the first time, the new paper also addresses the effects of both airborne and underwater noise on amphibious species in coastal environments, such as sea lions. According to Southall, publishing the new noise exposure criteria along with a comprehensive synthesis of current knowledge in a peer-reviewed journal is a major step forward."There are regulatory agencies around the world that are thirsting for this kind of guidance," Southall said. "There are still holes where we need more data, but we've made some big strides."Research on seals, sea lions, and sea otters at the UCSC Pinniped Lab now run by Reichmuth has provided much of the new data on hearing in amphibious marine mammals. Working with trained animals at UCSC's Long Marine Laboratory, Reichmuth's team is able to conduct controlled experiments and perform hearing tests similar to those used to study human hearing.Finneran's program in San Diego and coauthor Paul Nachtigall's program at the University of Hawaii have provided much of the data for dolphins and other cetaceans.But some marine mammals, such as baleen whales and other large whales, simply can't be held in a controlled environment where researchers could conduct hearing tests. That's where Ketten's research comes in. Ketten uses biomedical imaging techniques, including CT and MRI, to study the auditory systems of a wide range of species."Modeling an animal's hearing based on the anatomy of its auditory system is a very well-established technique that can be applied to baleen whales," Ketten explained. "We also do this modeling for the species that we can test in captivity, and that enables us to hone the models and make sure they're accurate. There has been a lot of resistance to modeling, but it's the only way to study hearing in some of the species with the greatest potential for harm from human sounds."Southall said he regularly hears from people around the world looking for guidance on regulating noise production by activities ranging from wind farm construction to seismic surveys. "This paper has significant international implications for regulation of noise in the ocean," he said. | Pollution | 2,019 |
March 12, 2019 | https://www.sciencedaily.com/releases/2019/03/190312075933.htm | Air pollution causes 8.8 million extra early deaths a year | Air pollution could be causing twice as many excess deaths a year in Europe as has been estimated previously, according to a study published in the European Heart Journal. | Using a new method of modelling the effects of various sources of outdoor air pollution on death rates, the researchers found that it caused an estimated 790,000 extra deaths in the whole of Europe in 2015 and 659,000 deaths in the 28 Member States of the European Union (EU-28). Of these deaths, between 40% and 80% were due to cardiovascular diseases (CVD), such as heart attacks and stroke. Air pollution caused twice as many deaths from CVD as from respiratory diseases. The researchers found that air pollution caused an estimated 8.8 million extra deaths globally rather than the previously estimated 4.5 million. Co-author of the study, Professor Thomas Münzel, of the Department of Cardiology of the University Medical Centre Mainz in Mainz, Germany, said: "To put this into perspective, this means that air pollution causes more extra deaths a year than tobacco smoking, which the World Health Organization estimates was responsible for an extra 7.2 million deaths in 2015. Smoking is avoidable but air pollution is not." "The number of deaths from cardiovascular disease that can be attributed to air pollution is much higher than expected. In Europe alone, the excess number of deaths is nearly 800,000 a year and each of these deaths represents an average reduction in life expectancy of more than two years." The researchers used exposure data from a model that simulates atmospheric chemical processes and the way they interact with land, sea and chemicals emitted from natural and human-made sources such as energy generation, industry, traffic and agriculture. They applied these to a new model of global exposure and death rates and to data from the WHO, which included information on population density, geographical locations, ages, risk factors for several diseases and causes of death. They focused particularly on levels of polluting fine particles known as 'particulate matter' that are less than or equal to 2.5 microns in diameter -- known as PM2.5 -- and ozone. Worldwide, they found that air pollution is responsible for 120 extra deaths per year per 100,000 of the population. In Europe and the EU-28, it was even higher, causing 133 and 129 extra deaths a year per 100,000 people, respectively. When they looked at individual countries, the researchers found that air pollution caused an excess death rate of 154 per 100,000 in Germany (a reduction of 2.4 years in life expectancy), 136 in Italy (reduction in life expectancy of 1.9 years), 150 in Poland (reduction in life expectancy of 2.8 years), 98 in the UK (reduction in life expectancy of 1.5 years), and 105 in France (reduction in life expectancy of 1.6 years). Excess death rates were particularly high in eastern European countries, such as Bulgaria, Croatia, Romania and Ukraine, with over 200 each year per 100,000 of the population. Co-author Professor Jos Lelieveld, of the Max Planck Institute for Chemistry in Mainz and the Cyprus Institute Nicosia, Cyprus, said: "The high number of extra deaths caused by air pollution in Europe is explained by the combination of poor air quality and dense population, which leads to exposure that is among the highest in the world. 
Although air pollution in eastern Europe is not much worse than in western Europe, the number of excess deaths it caused was higher. We think this may be explained by more advanced health care in western Europe, where life expectancy is generally higher." As a result of these findings, the researchers say that national governments and international agencies must take urgent action to reduce air pollution, including re-evaluating legislation on air quality and lowering the EU's current limits on the annual average levels of air pollution to match the WHO guidelines. Professors Münzel and Lelieveld emphasise that, in terms of air pollution, PM2.5 particles are the main cause of respiratory and cardiovascular disease. Currently, the average annual limit for PM2.5 in the EU is 25 μg/m3 (micrograms per cubic metre), which is 2.5 times higher than the WHO guideline of 10 μg/m3. Even at this level, several European countries regularly exceed the limit. Prof Münzel said: "The current limit of 25 μg/m3 should be adjusted downwards to the WHO guideline of 10 μg/m3. Many other countries, such as Canada, the USA and Australia, use the WHO guideline; the EU is lagging a long way behind in this respect. Indeed, new evidence may lead to a further lowering of the WHO guideline in the near future." "The link between air pollution and cardiovascular disease, as well as respiratory diseases, is well established. It causes damage to the blood vessels through increased oxidative stress, which then leads to increases in blood pressure, diabetes, stroke, heart attacks and heart failure." Prof Lelieveld said: "Since most of the particulate matter and other air pollutants in Europe come from the burning of fossil fuels, we need to switch to other sources for generating energy urgently. When we use clean, renewable energy, we are not just fulfilling the Paris Agreement to mitigate the effects of climate change, we could also reduce air pollution-related death rates in Europe by up to 55%." According to Prof Lelieveld, the fine dust content in the air could be reduced further by limiting agricultural emissions, which are responsible for a comparatively large amount of particulate matter pollution and for the associated extra number of deaths in Europe. He said: "In Germany, for instance, agriculture contributes up to 45% of PM2.5 to the atmosphere. When manure and fertiliser are used on agricultural land, ammonia is released into the atmosphere, which reacts with sulphur and nitrogen oxides and associated sulphuric and nitric acids, forming salts such as ammonium sulphate and nitrate. These substances contribute significantly to the formation and composition of fine particles, interacting further with soot and organic aerosol compounds." Limitations of the study include the fact that there is statistical uncertainty surrounding the estimates, so the size of the effect of air pollution on deaths could be larger or smaller. | Pollution | 2,019
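The per-capita rates quoted for individual countries can be cross-checked against national totals with simple arithmetic. In the sketch below, the population figures are rough 2015 values added for illustration; only the rates per 100,000 come from the study.

```python
rates_per_100k = {"Germany": 154, "Italy": 136, "Poland": 150,
                  "UK": 98, "France": 105}          # from the study
population_millions = {"Germany": 82, "Italy": 61, "Poland": 38,
                       "UK": 65, "France": 67}      # approximate, assumed

for country, rate in rates_per_100k.items():
    excess_deaths = rate * population_millions[country] * 1_000_000 / 100_000
    print(f"{country}: ~{excess_deaths:,.0f} excess deaths per year")
# Germany comes out near 126,000 a year with these inputs.
```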
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311152735.htm | US black and Hispanic minorities bear disproportionate burden from air pollution | Black and Hispanic Americans bear a disproportionate burden from air pollution caused mainly by non-Hispanic white Americans, according to a study set to be published in the Proceedings of the National Academy of Sciences. | The research, led by researchers from the University of Minnesota and the University of Washington, quantifies for the first time the racial gap between who causes air pollution and who breathes it. Poor air quality is the largest environmental health risk in the United States. Fine particulate matter (PM2.5) pollution is especially harmful and is responsible for more than 100,000 deaths each year from heart attacks, strokes, lung cancer and other diseases. But not everyone is equally exposed to poor air quality, nor are all people equally responsible for causing it. Researchers found that PM2.5 pollution is disproportionately caused by the consumption of goods and services by the non-Hispanic white majority, but disproportionately inhaled by black and Hispanic minorities. In the report, researchers linked air pollution exposure to the consumption activities that cause it in order to explore "pollution inequity" -- the difference between the environmental health damage caused by a racial-ethnic group and the damage that group experiences. "Our work is at the intersection of many important and timely topics such as race, inequality, affluence, environmental justice and environmental regulation," said Jason Hill, University of Minnesota bioproducts and biosystems engineering professor. The researchers found that the pollution inequity metric is driven by disparities in the amount of goods and services that groups consume, and in exposure to the resulting pollution. These disparities are influenced by longstanding societal trends, such as income inequality and political representation. Because white Americans consume greater amounts of pollution-intensive goods and services, they are responsible for the creation of more PM2.5 pollution than other racial groups. Further, American blacks and Hispanics tend to live in locations with higher PM2.5 concentrations than white Americans, increasing their average daily exposure to the pollution. In the U.S., racial patterns in where people live often reflect conditions from decades earlier, such as racial segregation. "Similar to previous studies, we show that racial-ethnic minorities are exposed to more pollution on average than non-Hispanic whites. What is new is that we find that those differences do not occur because minorities on average cause more pollution than whites -- in fact, the opposite is true," said lead author Christopher Tessum, civil and environmental engineering research scientist at the University of Washington and a recent University of Minnesota graduate. The United States has made strides to reduce air pollution across the country and for all ethnic groups. Average PM2.5 exposure declined approximately 50 percent between 2003 and 2015, but pollution inequity has remained high over that same period. 
This study is the first to quantify pollution inequity and to track it over time. Researchers say their pollution inequity metric can be applied to other types of environmental burdens, providing a simple and intuitive way for policymakers and the public to see the disparity between the pollution that population groups cause and the pollution to which they are exposed. "The approach we establish in this study could be extended to other pollutants, locations and groupings of people," said Julian Marshall, civil and environmental engineering professor at the University of Washington. "When it comes to determining who causes air pollution -- and who breathes that pollution -- this research is just the beginning." Financial support for the study came from the U.S. Environmental Protection Agency, the U.S. Department of Energy, the USDA, and the University of Minnesota Institute on the Environment. Researchers from the University of Texas at Austin, the University of New Mexico, Carnegie Mellon University, and Lumina Decision Systems also contributed to the study. | Pollution | 2,019
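The "pollution inequity" metric lends itself to a one-line formula: the gap between the pollution a group is exposed to and the pollution its consumption causes, relative to the pollution caused. A minimal sketch follows; the group names and numbers are placeholders chosen for illustration, not the study's published estimates.

```python
def pollution_inequity(exposure, caused):
    """Percent difference between pollution experienced and pollution caused.
    Positive values mean a group breathes more pollution than it causes."""
    return (exposure - caused) / caused * 100

# Placeholder values in arbitrary exposure units (not from the paper)
groups = {"group A": (0.85, 1.00), "group B": (1.20, 0.80)}
for name, (exposure, caused) in groups.items():
    inequity = pollution_inequity(exposure, caused)
    print(f"{name}: pollution inequity = {inequity:+.0f}%")
```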
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311125341.htm | No silver bullet for helping the Great Barrier Reef | Recent flooding and the mass outflows of dirty water onto the Great Barrier Reef are raising concerns about their impact on reef health. Across much of coastal Queensland, coastal rivers dump millions of litres of brown, polluted water out onto the Great Barrier Reef, but until now the relative effects of these annual inputs on reef corals and associated organisms have been difficult to understand. | Using a combination of advanced satellite imaging and over 20 years of coral monitoring across the Reef, a team of researchers from Dalhousie University, ARC Centre of Excellence for Coral Reef Studies at James Cook University (Coral CoE), the University of Adelaide and Lancaster University in the UK has found that chronic exposure to poor water quality is limiting the recovery rates of corals across wide swaths of the Great Barrier Reef. "What we have found is that the Great Barrier Reef is an ecosystem dominated by runoff pollution, which has greatly reduced the resilience of corals to multiple disturbances, particularly among inshore areas," said lead author Dr. Aaron MacNeil of Dalhousie University. "These effects far outweigh other chronic disturbances, such as fishing, and exacerbate the damage done by crown-of-thorns starfish and coral disease. Perhaps most critically, poor water quality reduced the rates at which coral cover recovers after disturbances by up to 25 percent. This shows that, by improving water quality, the rates of reef recovery can be enhanced." Yet the effects of water quality only go so far. Using a series of scenarios modelling future changes in climate and the likelihood of coral bleaching, the team found that no level of water quality improvement was able to maintain current levels of coral cover among the most scenic and valuable outer-shelf reefs that sustain much of the reef tourism industry. Dr. Camille Mellin of the University of Adelaide noted: "Coral reefs, including the Great Barrier Reef, are subject to an increasing frequency of major coral die-off events associated with climate change-driven coral bleaching. With these increasingly common disturbances becoming the new normal, the rate of coral recovery between disturbances has become incredibly important." Prof. Nick Graham of Lancaster University Environment Centre emphasised: "Although improved water quality leading to improved recovery rates of inshore reefs is encouraging, our analysis demonstrates the limits of what reducing sediments and nutrients in river runoff can do to improve the state of the outer Great Barrier Reef. We found that no level of water quality improvement will be able to buffer coral losses among the clear water reefs on the outer shelf, which are the very reefs that tourists in Australia want to see." "Clearly reducing river runoff can have beneficial effects on a wide range of reef corals and needs to continue. But for large areas of the reef that are unimpacted by water quality, we must reduce carbon emissions to slow down climate change. Without such action, the most striking and iconic parts of the reef will rapidly decline and become unrecognisable." "What these results emphasise is that there is no silver bullet for addressing the threats facing the Great Barrier Reef," said Dr. MacNeil. | Pollution | 2,019
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311125237.htm | A tale of two cities: Is air pollution improving in Paris and London? | For the first time, a joint air pollution study across two mega-cities -- London and Paris -- measures the impact of policies designed to reduce air pollution from urban traffic over the last 12 years. | In a paper published today, the researchers note that many policies are in place to combat air pollution at European-wide, city-wide and local scales. The study found that since 2010, these have led to improvements in nitrogen dioxide and particle concentrations across both cities. However, the rate of change has not been enough to achieve compliance with legal limits. They note that nitrogen dioxide pollution from traffic has worsened alongside some roads in London. The team also compared two time periods, 2005-2009 and 2010-2016. Dr Gary Fuller, air pollution scientist at King's College London, said, "The diesel emissions scandal had a serious impact on air pollution in Europe's two mega-cities. Even though new cars passed ever tighter exhaust tests, many emitted much more pollution when driven on our roads. This has led to chronic and widespread problems with limits for nitrogen dioxide." "A clear lesson here is the need for better feedback to make sure that our air pollution policies remain on track. There have been some successes in London and especially with the bus fleet. Although we are now heading in the right direction, we need stronger policies, such as London's forthcoming ultra-low emission zone, to improve air pollution quickly for everyone in our cities, and we need to check that they work." | Pollution | 2,019
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311125220.htm | Hot or cold, rural residents more vulnerable to extreme temperatures | Extreme temperatures, both cold and hot, bring greater mortality risk to people living in China's rural communities than in urban areas, according to a recent study published in the journal Environmental Health Perspectives. The disparity between urban and rural mortality risk was found across the entire population, but was greater for women than men, and for people over 65. | "These findings go against the general assumption that urban residents are at a higher risk due to the urban heat island effect, which raises temperatures in cities compared to surrounding areas," says IIASA researcher Stefan Hochrainer-Stigler, a coauthor on the study which was led by 2016 Young Scientists Summer Program (YSSP) participant Kejia Hu in collaboration with other researchers in China and at IIASA. "Risk is composed of three key elements -- hazard, exposure and vulnerability," says IIASA researcher Wei Liu, another study co-author. "While urban heat waves can mean a higher hazard level, urban populations often have less outdoor working time and better housing, perhaps with air conditioning, which reduces their exposure. They also have better access to public health support, which reduces vulnerability." It is well-known that extreme heat and cold cause deaths. Temperature extremes can cause direct mortality from exposure, as well as exacerbate other ailments including heart and respiratory conditions. However, most previous studies on this topic have focused on developed countries, and very few have differentiated between urban and rural populations either in terms of temperature data or population exposure. In the new analysis, the researchers used detailed weather, air pollution, population density, and mortality data from the province from 2009 to 2015 to estimate the numbers of urban and rural deaths attributable to hot and cold temperatures. The new findings suggest that by leaving out important differences between rural and urban areas and populations, previous studies may have underestimated the overall impact of extreme temperatures on population mortality. "This is the first study conducted in a developing country that finds an urban-rural disparity in both heat and cold mortality risks. Importantly, we found that mortality risks associated with both cold and hot temperatures were higher in rural areas than urban areas, for all types of diseases, people aged older than 65 years and both sex groups," says Hochrainer-Stigler. The researchers say their findings have important implications for policy, particularly in developing countries. Investments in healthcare in rural areas could help reduce vulnerability, and targeted measures to ensure people can heat and cool their homes could help reduce exposure. "While fast urbanization is taking place in the developing world, there is still a large percentage of the population living in rural areas," says Liu. "These people are more likely to be working long days outdoors and also to have poor public health system coverage. Both these factors lead to greater vulnerability." | Pollution | 2,019
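Liu's hazard/exposure/vulnerability framing can be illustrated with a toy model. Treating risk as the product of the three components is a common convention assumed here purely for illustration; the study itself estimates mortality risk from observed data, and the scores below are invented.

```python
def relative_risk(hazard, exposure, vulnerability):
    # Multiplicative decomposition, assumed for illustration only
    return hazard * exposure * vulnerability

# Urban heat islands raise the urban hazard score, but outdoor work and
# weaker health-care access raise rural exposure and vulnerability.
urban = relative_risk(hazard=1.2, exposure=0.8, vulnerability=0.9)
rural = relative_risk(hazard=1.0, exposure=1.2, vulnerability=1.3)
print(f"urban: {urban:.2f}  rural: {rural:.2f}")  # rural ends up higher
```

With these invented scores, the rural value exceeds the urban one even though the urban hazard is larger, which is the qualitative point the researchers make.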
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311125141.htm | Honey bees can help monitor pollution in cities | Honey from urban bees can tell us how clean a city is and help pinpoint the sources of environmental pollutants such as lead, new University of British Columbia research has found. | In a study published today in Nature Sustainability, researchers at UBC's Pacific Centre for Isotopic and Geochemical Research (PCIGR) analyzed honey from urban beehives across Metro Vancouver. "The good news is that the chemical composition of honey in Vancouver reflects its environment and is extremely clean," said Kate E. Smith, lead author of the study and PhD candidate at PCIGR. "We also found that the concentration of elements increased the closer you got to downtown Vancouver, and by fingerprinting the lead we can tell it largely comes from humanmade sources." Metro Vancouver honey is well below the worldwide average for heavy metals like lead, and an adult would have to consume more than 600 grams, or two cups, of honey every day to exceed tolerable levels. "The instruments at PCIGR are very sensitive and measure these elements in parts per billion, or the equivalent of one drop of water in an Olympic-sized swimming pool," said Dominique Weis, senior author and director of the institute. The researchers found the concentration of elements increased closer to areas with heavy traffic, higher urban density and industrial activity such as shipping ports. Places like the city of Delta showed elevated levels of manganese, which could be a result of agricultural activity and pesticide use in the area. In the first study of its kind in North America, the researchers also compared the lead fingerprints of the honey to those from other local environmental samples like lichen from around British Columbia, rock from the Garibaldi volcanic belt, sediment from the Fraser River and trees in Stanley Park. They discovered that the lead fingerprints of the honey did not match any local, naturally-occurring lead. However, the trees in Stanley Park and the honeys from downtown displayed some striking similarities that pointed to potential humanmade sources of lead. "We found they both had fingerprints similar to aerosols, ores and coals from large Asian cities," said Weis. "Given that more than 70 per cent of cargo ships entering the Port of Vancouver originate from Asian ports, it's possible they are one source contributing to elevated lead levels in downtown Vancouver." Honey is able to provide such localized "snapshots" of the environment because honey bees typically forage for pollen and nectar within a two- to three-kilometre radius of their hives. "We now have four years of consistent data from Metro Vancouver, which provides a present-day baseline that will allow us to monitor even tiny changes in our environment very efficiently," said Weis. The research was carried out in partnership with Hives for Humanity, a local non-profit that creates opportunities for people in Vancouver's Downtown Eastside to engage in urban beekeeping. "One of the exciting parts of this study is that it bridges science with community interests," said Smith. "Honey sampling can easily be performed by citizen scientists in other urban centres, even if they lack other environmental monitoring capabilities." The team will continue to study how honey analysis might complement traditional air and soil monitoring techniques and test the efficiency of honey as an environmental monitor in other cities. | Pollution | 2,019
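The "600 grams of honey a day" figure can be sanity-checked with unit arithmetic: parts per billion in honey are nanograms per gram. The concentration and tolerable-intake numbers below are round assumptions chosen to show the calculation, not measurements from the study.

```python
LEAD_PPB = 25              # assumed Pb concentration in honey (ng per g)
TOLERABLE_UG_PER_DAY = 15  # assumed tolerable daily lead intake (micrograms)

# ppb (ng/g) converts to micrograms per gram by dividing by 1000
grams_per_day = TOLERABLE_UG_PER_DAY / (LEAD_PPB / 1000)
print(f"honey needed to reach the limit: {grams_per_day:.0f} g/day")  # -> 600
```

With these assumed inputs the threshold lands at 600 g/day, the same order as the figure quoted in the article.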
March 5, 2019 | https://www.sciencedaily.com/releases/2019/03/190305135259.htm | Capturing bacteria that eat and breathe electricity | Last August, Abdelrhman Mohamed found himself hiking deep into the wilderness of Yellowstone National Park. | Unlike thousands of tourists who trek to admire the park's iconic geysers and hot springs every year, the WSU graduate student was traveling with a team of scientists to hunt for life within them. After a strenuous seven-mile walk through scenic, isolated paths in the Heart Lake Geyser Basin area, the team found four pristine pools of hot water. They carefully left a few electrodes inserted into the edge of the water, hoping to coax little-known creatures out of hiding -- bacteria that can eat and breathe electricity. After 32 days, the team returned to the hot springs to collect the submerged electrodes. Working under the supervision of Haluk Beyenal, Paul Hohenschuh Distinguished Professor in the Gene and Linda Voiland School of Chemical Engineering and Bioengineering, Mohamed and postdoctoral researcher Phuc Ha analyzed the electrodes. Voila! They had succeeded in capturing their prey -- heat-loving bacteria that "breathe" electricity through the solid carbon surface of the electrodes. The WSU team, in collaboration with colleagues from Montana State University, published their research detailing the multiple bacterial communities they found in the Journal of Power Sources. "This was the first time such bacteria were collected in situ in an extreme environment like an alkaline hot spring," said Mohamed, adding that temperatures in the springs ranged from about 110 to nearly 200 degrees Fahrenheit. These tiny creatures are not merely of academic interest. They may hold a key to solving some of the biggest challenges facing humanity -- environmental pollution and sustainable energy. Such bacteria can "eat" pollution by converting toxic pollutants into less harmful substances and generating electricity in the process. "As these bacteria pass their electrons into metals or other solid surfaces, they can produce a stream of electricity that can be used for low-power applications," said Beyenal. Most living organisms -- including humans -- use electrons, which are tiny negatively-charged particles, in a complex chain of chemical reactions to power their bodies. Every organism needs a source of electrons and a place to dump the electrons to live. While we humans get our electrons from sugars in the food we eat and pass them into the oxygen we breathe through our lungs, several types of bacteria dump their electrons to outside metals or minerals, using protruding hair-like wires. To collect bacteria in such an extreme environment over 32 days, Mohamed invented a cheap portable potentiostat, an electronic device that could control the electrodes submerged in the hot springs for long periods of time. "The natural conditions found in geothermal features such as hot springs are difficult to replicate in laboratory settings," said Beyenal. "So, we developed a new strategy to enrich heat-loving bacteria in their natural environment." | Pollution | 2,019
March 5, 2019 | https://www.sciencedaily.com/releases/2019/03/190305083633.htm | Rethinking old-growth forests using lichens as an indicator of conservation value | Two Canadian biologists are proposing a better way to assess the conservation value of old-growth forests in North America -- using lichens, sensitive bioindicators of environmental change. | Dr. Troy McMullin, lichenologist at the Canadian Museum of Nature, and Dr. Yolanda Wiersma, landscape ecologist at Memorial University of Newfoundland, propose their lichen-focussed system in a paper published today in the Ecological Society of America journal, Frontiers in Ecology and the Environment. | "We are presenting a paradigm shift for the way we assess forests and manage them," says McMullin. "How do we select the forests with highest conservation value? How do we decide what to protect and what to cut? Lichens are part of the answer." Old-growth forests, especially those in North America, are perceived to be rich in biodiversity, in addition to capturing aesthetic and spiritual values. These forests are usually defined by the age of the trees, with conservation and management practices developed accordingly. McMullin and Wiersma say this is an over-simplification, as it overlooks the importance of biodiversity in those habitats. "Forests with old trees in them are certainly awesome and important," says Wiersma. "However, forests change through time and something that is an older forest now may not always be a forest." The approach of the researchers lets them look at the presence of forests in the context of the broader landscape. "If we think of the landscape as a patchwork quilt of different types of forests of different ages, some of those patches of forest will stick around for a long time, while others might wink in and out over different time frames. Our approach lets us identify which patch has been a forest for the longest period of time, even if it's not the one with the oldest trees." McMullin and Wiersma argue that old trees are only a proxy for biodiversity in old-forest ecosystems and that biodiversity should be measured directly -- with lichens as the ideal candidates. "The advantage of lichens as indicators for this biodiversity is that they don't go anywhere, you can study them any time of the year, and they 'eat the air', which makes them one of the most sensitive organisms in the forest," explains McMullin. "Many old-growth forests have high sustained moisture and a high number of microhabitats suitable for certain species, which can't disperse easily. Having these forests in the landscape provides a refuge for the seeds and spores that helps with the continued preservation of this biodiversity." In their paper, McMullin and Wiersma propose that suites of lichens associated with known old-growth areas can be used to develop "an index of ecological continuity" for forests of interest. This scorecard of lichen species could then be used as a tool by conservation biologists and forest managers -- the more old-growth-dependent species present, the higher the forest's conservation value. The scientists note that lichens are already being used to assess old-growth value and forest continuity in parts of Europe, less so in North America. As examples, they cite the studies of British botanist Dr. Francis Rose, who developed an index of about 30 lichens associated with Britain's Royal Forests or medieval parklands, which have been relatively untouched going back hundreds or even thousands of years. McMullin has also built on the work of Dr. Steven Selva at the University of Maine.
Selva has documented that species of stubble lichens (the calicioids), which tend to be found predominantly in old-growth areas, can be good indicators of continuously forested lands in Acadian forests. So, what next? McMullin and Wiersma propose they can build on this knowledge to develop lists of appropriate lichen suites for forest types such as Carolinian, Boreal or Great Lakes-St. Lawrence. Next steps could include training those responsible for assessing the forests, offering access to the expertise of trained lichenologists, and taking advantage of new technologies such as DNA barcoding. There are an estimated 2,500 species of lichens in Canada. Lichens are found on every continent and can grow in frigid polar regions, harsh deserts, as well as in your backyard! A lichen is part fungus and part green alga or a cyanobacterium (sometimes both). Some lichens fix nitrogen for the soil and lichens play important roles in ecosystems: they are among the first colonizers of bare rock and prevent erosion by stabilizing soil, they provide food for animals, habitats for insects, and they can serve as monitors for air pollution. The Canadian Museum of Nature houses the National Herbarium of Canada, which includes about 150,000 specimens of lichens. | Pollution | 2,019
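The "index of ecological continuity" scorecard described above can be expressed as a small function: score a site by the fraction of a regional old-growth indicator list found there. The species set below is hypothetical; a working index would use a vetted regional list such as Rose's roughly 30 species for Britain.

```python
# Hypothetical indicator list; a real one would be region-specific and vetted
OLD_GROWTH_INDICATORS = {"Lobaria pulmonaria", "Sclerophora peronella",
                         "Chaenotheca brunneola"}

def continuity_score(observed_species):
    """Return (score, matches): fraction of indicator species present.
    Higher scores suggest longer unbroken forest continuity."""
    matches = OLD_GROWTH_INDICATORS & set(observed_species)
    return len(matches) / len(OLD_GROWTH_INDICATORS), sorted(matches)

score, found = continuity_score(["Lobaria pulmonaria", "Cladonia mitis",
                                 "Chaenotheca brunneola"])
print(f"continuity score: {score:.2f}, indicators found: {found}")
```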
March 4, 2019 | https://www.sciencedaily.com/releases/2019/03/190304154858.htm | A faster, more accurate way to monitor drought | More than 2 billion people worldwide are affected by water shortages, wildfires, crop losses, forest diebacks or other environmental or economic woes brought on by drought. | A new monitoring method developed at Duke University allows scientists to identify the onset of drought sooner -- meaning conservation or remediation measures might be put into place sooner to help limit the damage. "By combining surface and air temperature measurements from thousands of weather stations and satellite images, we can monitor current conditions across an entire region in near real time and identify the specific places where drought-induced thermal stress is occurring," said James S. Clark, Nicholas Professor of Environmental Sciences at Duke's Nicholas School of the Environment. "Other methods now in use are based on data that can take a month or longer to become available," Clark said. "That means scientists or managers may not know a region is in drought until well after the conditions actually begin." Clark and his colleagues have created a free public website, called Drought Eye. The thermal stress they've measured is the difference between the air temperature at a site and the surface temperature of the plant canopy there. Ordinarily, these canopies are cooled by water evaporating into the air through small pores, or stomata, in the plants' leaves. This explains why midday temperatures in a forest in summer are cooler than in a city. During prolonged periods without rain, however, the cooling mechanism breaks down. Ground moisture available to the trees becomes limited. To conserve their water supply, the trees close their stomata, allowing the canopy's surface to heat up. "This led us to speculate that the canopy-atmosphere differential could provide a simple but highly accurate indicator of drought-induced water stress on a continental scale during warm and dry seasons, when the threat of wildfires and other impacts is most severe and timely monitoring is essential," said Bijan Seyednasrollah, a 2017 graduate of the Nicholas School, who led the research as part of his doctoral dissertation. To test the hypothesis, he used measurements of thermal stress from thousands of sites to retroactively "predict" drought conditions across the contiguous U.S. over the past 15 years. He then ran similar tests using other widely employed drought indices to see which of the methods, new or old, produced results that most closely mirrored the historical record. "Among the drought metrics that we considered, thermal stress had the highest correlation values and most accurately 'predicted' the onset of drought in a wide range of atmospheric and climate conditions," said Seyednasrollah, who is now a postdoctoral environmental scientist at Harvard University and Northern Arizona University. The new index will enable local authorities to determine the risks of wildfires or identify areas where water use should be restricted in a more timely manner, Clark says. It can also reveal areas where forest dieback -- which affects forest health and can add to wildfire risks -- is occurring, because trees stop transpiring when they start to die. 
These diebacks are often linked to pest infestations or other environmental stresses, and are a huge problem in many parts of the West. Jean-Christophe Domec, a visiting professor at the Nicholas School, co-authored the new study with Seyednasrollah and Clark. They published their peer-reviewed findings Feb. 16 in the journal | Pollution | 2,019
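A minimal sketch of the thermal-stress index described above, on synthetic data (the effect sizes and the PDSI-like drought series are illustrative assumptions, not the study's measurements): the index is the canopy-atmosphere temperature differential, and its correlation with an independent drought indicator is what the authors benchmarked.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 365

# Hypothetical paired measurements for one site (all values synthetic).
air_temp = 25 + 5 * rng.standard_normal(n_days)  # station air temperature, deg C
# PDSI-like drought series: negative values mean drier conditions.
drought_index = np.clip(np.cumsum(rng.normal(0, 1, n_days)) / 20, -3, 3)

# Well-watered canopies are cooled by transpiration; under drought the
# stomata close and the canopy warms toward (or above) air temperature.
canopy_temp = air_temp - (1.5 + 0.8 * drought_index) + rng.normal(0, 0.5, n_days)

# Thermal stress, here taken as canopy minus air so larger values mean
# more stress (the article describes the same canopy-atmosphere differential).
thermal_stress = canopy_temp - air_temp

# Expect a strongly negative correlation: high thermal stress tracks dry spells.
r = np.corrcoef(thermal_stress, drought_index)[0, 1]
print(f"correlation between thermal stress and drought index: {r:.2f}")
```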
March 4, 2019 | https://www.sciencedaily.com/releases/2019/03/190304105424.htm | Scientists develop printable water sensor | A new, versatile plastic-composite sensor can detect tiny amounts of water. The 3D printable material, developed by a Spanish-Israeli team of scientists, is cheap, flexible and non-toxic and changes its colour from purple to blue in wet conditions. The researchers, led by Pilar Amo-Ochoa from the Autonomous University of Madrid (UAM), used DESY's X-ray light source PETRA III to understand the structural changes within the material that are triggered by water and lead to the observed colour change. The development opens the door to the generation of a family of new 3D printable functional materials, as the scientists write in the journal | In many fields, from health to food quality control, environmental monitoring and technical applications, there is a growing demand for responsive sensors which show fast and simple changes in the presence of specific molecules. Water is among the most common chemicals to be monitored. "Understanding how much water is present in a certain environment or material is important," explains DESY scientist Michael Wharmby, co-author of the paper and head of beamline P02.1 where the sensor-material was examined with X-rays. "For example, if there is too much water in oils they may not lubricate machines well, whilst with too much water in fuel, it may not burn properly." The functional part of the scientists' new sensor-material is a so-called copper-based coordination polymer, a compound with a water molecule bound to a central copper atom. "On heating the compound to 60 degrees Celsius, it changes colour from blue to purple," reports Pilar Amo-Ochoa. "This change can be reversed by leaving it in air, putting it in water, or putting it in a solvent with trace amounts of water in it." Using high-energy X-rays from DESY's research light source PETRA III at the experimental station P02.1, the scientists were able to see that in the sample heated to 60 degrees Celsius, the water molecule bound to the copper atoms had been removed. This leads to a reversible structural reorganisation of the material, which is the cause of the colour change. "Having understood this, we were able to model the physics of this change," explains co-author José Ignacio Martínez from the Institute for Materials Science in Madrid (ICMM-CSIC). The scientists were then able to mix the copper compound into a 3D printing ink and printed sensors in several different shapes which they tested in air and with solvents containing different amounts of water. These tests showed that the printed objects are even more sensitive to the presence of water than the compound by itself, thanks to their porous nature. In solvents, the printed sensors could already detect 0.3 to 4 per cent of water in less than two minutes. In air, they could detect a relative humidity of 7 per cent. If it is dried, either in a water-free solvent or by heating, the material turns back to purple. A detailed investigation showed that the material is stable even over many heating cycles, and the copper compounds are evenly distributed throughout the printed sensors. The material is also stable in air over at least one year and at biologically relevant pH values from 5 to 7. "Furthermore, the highly versatile nature of modern 3D printing means that these devices could be used in a huge range of different places," emphasises co-author Shlomo Magdassi from The Hebrew University of Jerusalem.
He adds that the concept could be used to develop other functional materials as well. "This work shows the first 3D printed composite objects created from a non-porous coordination polymer," says co-author Félix Zamora from the Autonomous University of Madrid. "It opens the door to the use of this large family of compounds that are easy to synthesize and exhibit interesting magnetic, conductive and optical properties, in the field of functional 3D printing." | Pollution | 2,019
March 4, 2019 | https://www.sciencedaily.com/releases/2019/03/190304095938.htm | India's stubble burning air pollution causes USD 30 billion economic losses, health risks | Living in districts with air pollution from intense crop residue burning (CRB) is a leading risk factor for acute respiratory infection (ARI), especially among children younger than five years, in northern India. Additionally, CRB also leads to an estimated economic loss of over USD 30 billion annually. These are the key findings of a new study from researchers at the International Food Policy Research Institute (IFPRI) and partner institutes. The study estimates -- for the first time -- the health and economic costs of CRB in northern India. | "Poor air quality is a recognized global public health epidemic, with levels of airborne particulate matter in Delhi spiking to 20 times the World Health Organization's safety threshold during certain days. Among other factors, smoke from the burning of agricultural crop residue by farmers in Haryana and Punjab especially contributes to Delhi's poor air, increasing the risk of ARI three-fold for those living in districts with intense crop burning," said IFPRI Research Fellow and co-author of the study, Samuel Scott. The study also estimated the economic cost of exposure to air pollution from crop residue burning at USD 30 billion, or nearly Rs. 2 lakh crore, annually for the three north Indian states of Punjab, Haryana and Delhi. The study, "Risk of acute respiratory infection from crop burning in India: estimating disease burden and economic welfare from satellite and national health survey data for 250,000 persons," co-authored by IFPRI's Samuel Scott and Avinash Kishore; CGIAR Research Program on Agriculture for Nutrition and Health's Devesh Roy; University of Washington's Suman Chakrabarti; and Oklahoma State University's Md. Tajuddin Khan, will be published in an upcoming journal edition. The researchers observed that as crop burning increased in the northern Indian state of Haryana, respiratory health worsened. Health was measured by the frequency of reported hospital visits for ARI symptoms. They also examined other factors that could contribute to poor respiratory health, such as firecracker burning during Diwali (it usually coincides with the time of CRB) and motor vehicle density. In fact, economic losses owing to exposure to air pollution from firecracker burning are estimated to be around USD 7 billion, or nearly Rs. 50 thousand crore, a year. Over five years, the economic loss due to burning of crop residue and firecrackers is estimated to be USD 190 billion, or nearly 1.7 per cent of India's GDP. "Severe air pollution during winter months in northern India has led to a public health emergency. Crop burning will add to pollution and increase healthcare costs over time if immediate steps are not taken to reverse the situation. The negative health effects of crop burning will also lower the productivity of residents and may lead to long-term adverse impacts on the economy and health," said Suman Chakrabarti. "Our study shows that it is not only the residents of Delhi, but also the women, children and men of rural Haryana who are the first victims of crop residue burning.
Much of the public discussion on the ill-effects of crop residue burning ignores this immediately affected vulnerable population," said Kishore. Even though air pollution has been linked to numerous health outcomes, and respiratory infections are a leading cause of death and disease in developing countries, none of the existing studies have directly linked crop burning to ARI. This study suggests that targeted government initiatives to improve crop disposal practices are worthy investments. "Programs and policies must simultaneously address indoor and outdoor pollution through a possible combination of bans and agricultural subsidies. Other important interventions for improving respiratory health are increasing household access to clean cooking fuels, electricity, and improved drainage systems," Kishore added. Crop burning is a widespread global practice; in India it is concentrated in the northwest, though it has spread to other regions of the country in the past decade as new crop harvesting technology is adopted. Farmers try to maximize their yields by planting the next crop as soon as possible after the previous crop has been harvested (generally wheat after rice). To quickly clear the field for the next crop, they burn the leftover stubble rather than using the traditional method of clearing it by hand. Despite efforts from the Indian government, farmers continue to burn crop residues due to lack of convenient and affordable alternatives. Eliminating crop burning will not only improve human health but will also contribute to soil and plant biodiversity and will reduce greenhouse gas emissions. | Pollution | 2,019
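A minimal sketch of the kind of exposure-response analysis described above, on synthetic data (the fire-intensity distribution, effect size and column names are illustrative assumptions, not the IFPRI dataset): district-level burning intensity is linked to the odds of reporting ARI.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical satellite-derived crop-fire intensity in each person's district.
df = pd.DataFrame({"fire_intensity": rng.gamma(shape=2.0, scale=10.0, size=n)})

# Assume (for illustration) that the log-odds of ARI rise with fire exposure.
logit = -2.0 + 0.04 * df["fire_intensity"]
df["ari"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# ARI prevalence by exposure tercile, then the high-vs-low odds ratio.
df["exposure"] = pd.qcut(df["fire_intensity"], 3, labels=["low", "mid", "high"])
prevalence = df.groupby("exposure", observed=True)["ari"].mean()
print(prevalence)

p_hi, p_lo = prevalence["high"], prevalence["low"]
odds_ratio = (p_hi / (1 - p_hi)) / (p_lo / (1 - p_lo))
print(f"odds ratio, high vs low exposure: {odds_ratio:.1f}")
```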
March 4, 2019 | https://www.sciencedaily.com/releases/2019/03/190304095907.htm | Potential new source of rare earth elements | Researchers have found a possible new source of rare earth elements -- phosphate rock waste -- and an environmentally friendly way to get them out, according to a newly published study. | The approach could benefit clean energy technology, according to researchers at Rutgers University-New Brunswick and other members of the Critical Materials Institute, a U.S. Department of Energy effort aimed at bolstering U.S. supply chains for materials important to clean energy. Rare earth elements like neodymium and dysprosium are essential for technologies such as solar and wind energy and advanced vehicles, along with modern electronics like smartphones. But a shortage of rare earth element production in the United States puts our energy security at risk. China produces roughly 90 percent of all such elements. Recovering them from phosphogypsum -- waste from phosphoric acid production -- is a potential solution. Each year, an estimated 250 million tons of phosphate rock are mined to produce phosphoric acid for fertilizers. The U.S. mined approximately 28 million metric tons in 2017. Rare earth elements generally amount to less than 0.1 percent in phosphate rock. But worldwide, about 100,000 tons of these elements per year end up in phosphogypsum waste. That's almost as much as the approximately 126,000 tons of rare earth oxides produced worldwide each year. Conventional methods to extract rare earth elements from ores generate millions of tons of toxic and acidic pollutants. But instead of using harsh chemicals to extract the elements, another method might use organic acids produced by bacteria, said Paul J. Antonick and Zhichao Hu, co-lead authors of the study. They are members of the thermodynamics team led by senior author Richard E. Riman, a Distinguished Professor in the Department of Materials Science and Engineering in Rutgers' School of Engineering. The research team explored using mineral and organic acids, including a bio-acid mixture, to extract six rare earth elements (yttrium, cerium, neodymium, samarium, europium and ytterbium) from synthetic phosphogypsum. Scientists led by David Reed at Idaho National Laboratory produced the bio-acid mixture -- consisting primarily of gluconic acid, found naturally in fruits and honey -- by growing the bacteria Gluconobacter oxydans on glucose. The results suggest that the bio-acid did a better job extracting rare earth elements than pure gluconic acid at the same pH (2.1), or degree of acidity. The mineral acids (sulfuric and phosphoric) failed to extract any rare earth elements in that scenario. When the four acids were tested at the same concentration, only sulfuric acid was more effective than the bio-acid. A next step would be to test bio-acid on industrial phosphogypsum and other wastes generated during phosphoric acid production that also contain rare earth elements. For their initial study, the researchers evaluated phosphogypsum made in the lab, so they could easily control its composition. Industrial samples are more complex. | Pollution | 2,019
March 1, 2019 | https://www.sciencedaily.com/releases/2019/03/190301160909.htm | How the humble marigold outsmarts a devastating tomato pest | Scientists have revealed for the first time the natural weapon used by marigolds to protect tomato plants against destructive whiteflies. | Researchers from Newcastle University's School of Natural and Environmental Sciences carried out a study to prove what gardeners around the world have known for generations -- marigolds repel tomato whiteflies. Their findings, published today (1 March), have the potential to pave the way to safer and cheaper alternatives to pesticides. Since limonene repels whiteflies without killing them, using the chemical shouldn't lead to resistance, and the study has shown that it doesn't affect the quality of the produce. All it takes to deter the whiteflies is interspersing marigolds in tomato plots, or hanging little pots of limonene in among the tomato plants so that the smell can disperse out into the tomato foliage. In fact, the research team, led by Dr Colin Tosh and Niall Conboy, has shown that it may be possible to develop a product, similar to an air freshener, containing pure limonene, that can be hung in glasshouses to confuse the whiteflies by exposing them to a blast of limonene. Newcastle University PhD student Niall said: "We spoke to many gardeners who knew marigolds were effective in protecting tomatoes against whiteflies, but it has never been tested scientifically. "We found that the chemical which was released in the highest abundance from marigolds was limonene. This is exciting because limonene is inexpensive, it's not harmful and it's a lot less risky to use than pesticides, particularly when you don't apply it to the crop and it is only a weak scent in the air. "Most pesticides are sprayed onto the crops. This doesn't only kill the pest that is targeted, it kills absolutely everything, including the natural enemies of the pest." Limonene makes up around 90% of the oil in citrus peel and is commonly found in household air fresheners and mosquito repellent. Dr Tosh said: "There is great potential to use limonene indoors and outdoors, either by planting marigolds near tomatoes, or by using pods of pure limonene. Another important benefit of using limonene is that it's not only safe to bees, but the marigolds provide nectar for the bees which are vital for pollination. "Any alternative methods of whitefly control that can reduce pesticide use and introduce greater plant and animal diversity into agricultural and horticultural systems should be welcomed." The researchers carried out two big glasshouse trials. Working with French marigolds in the first experiment, they established that the repellent effect works and that marigolds are an effective companion plant to keep whiteflies away from the tomato plants. For the second experiment, the team used a machine that allowed them to analyse the gaseous and volatile chemicals released by the plants. Through this they were able to pinpoint which chemical was released from the marigolds. They also determined that interspersing marigolds with other companion plants that whiteflies don't like doesn't increase or decrease the repellent effect. It means that non-host plants of the whiteflies can repel them, not just marigolds. Whitefly adults are tiny, moth-like insects that feed on plant sap.
They cause severe produce losses to an array of crops by transmitting a number of plant viruses and encouraging mould growth on the plant. Dr Tosh said: "Direct feeding from both adults and larvae results in honeydew secretion at a very high rate. Honeydew secretion that covers the leaves reduces the photosynthetic capacity of the plant and renders fruit unmarketable." Further studies will focus on developing a three-companion-plant mixture that will repel three major insect pests of tomato -- whiteflies, spider mites and thrips. Longer term, the researchers aim to publish a guide focussing on companion plants as an alternative to pesticides, which would be suitable across a range of horticultural problems. | Pollution | 2,019
February 28, 2019 | https://www.sciencedaily.com/releases/2019/02/190228141346.htm | Forests, carbon sinks cannot make up for delays in decarbonizing the economy, experts argue | To stabilize the Earth's climate for people and ecosystems, it is imperative to ramp up natural climate solutions and, at the same time, accelerate mitigation efforts across the energy and industrial sectors, according to a new policy perspective published today. | Natural climate solutions -- such as enhancing carbon sinks from forests, agriculture and other lands -- come with a host of benefits like improved forests, croplands, grazing lands, and wetlands. The paper, co-authored by scientists and climate experts at the Cary Institute of Ecosystem Studies, Columbia University, Kepos Capital, Princeton University, University of Aberdeen, Stanford University, and World Wildlife Fund (WWF), underscores that natural climate solutions alone are not enough to meet the Paris Agreement and must be paired with rapid efforts to decrease emissions from the energy and industrial sectors. "This is not an either-or situation. We need to be actively pursuing all possible solutions in all possible places," said co-author Christopher B. Field, a climate scientist who directs the Stanford Woods Institute for the Environment. "Even ambitious deployment of natural climate solutions leaves a big gap that needs to be filled through increased work on decreasing emissions from cars, factories, and powerplants." "By maximizing natural climate solutions and energy mitigation, we can improve forests and habitat, reduce the risk of wildfires, and decrease air and water pollution, thus improving human health and well-being as well as the health of our planet," said Christa M. Anderson, lead author and research fellow with the World Wildlife Fund (WWF). To reduce cumulative emissions and peak warming, the policy paper underscores that the solution will require policy mechanisms and incentives that support natural climate solutions and a major increase in mitigation efforts across the energy and industrial sectors. | Pollution | 2,019
February 28, 2019 | https://www.sciencedaily.com/releases/2019/02/190228134238.htm | PE, PP and PS: The most abundant type of microplastics in Mediterranean coastal waters | Polyethylene, polypropylene and polystyrene are the most abundant microplastics in the Mediterranean coastal waters, according to a new study published by the journal | This study describes the presence of different types of microplastics in the peninsular coastal Mediterranean, in particular at the coasts of Catalonia, the region of Murcia and Almeria in Spain. According to the results, other abundant types are nylon polymers, polyurethane (PUR), polyethylene terephthalate (PET), ethylene-vinyl acetate (EVA), polyvinyl chloride (PVC), acrylonitrile butadiene styrene (ABS) and fluorocarbon polymer. The experts also identified, for the first time, signs of plastic materials with marine origins -- in particular, particles of ship paint -- which had not been studied in the Mediterranean basin until now. Cylinders and small spheres, polyester foam, filaments from fishing gear and many pieces of plastic of varied chemical compositions are among the materials found on Mediterranean coasts. The study analysed about 2,500 samples of plastic materials taken from different oceanographic campaigns along a north-south axis in every area of study. In all studied areas, the most abundant materials are fragments of polyethylene (54.5 %), polypropylene (16.5 %) and polyester (9.7 %) -- the most produced thermoplastic polymer worldwide -- which float in marine waters and are likely to come from the continent. So far, no scientific study has been able to establish how long plastics persist in the sea before they deteriorate or are buried. According to the new study, the microplastics the researchers found on Mediterranean coasts "are round-shaped, and small -- about a millimetre -- and lightweight, which could suggest a state of advanced deterioration and therefore a long residence time in the marine environment," says expert William P. de Haan, member of the research group on Marine Geosciences and first author of the study. The study identified places on the peninsular coasts with maximum concentrations of up to 500,000 microplastics per square kilometre, well above the average of 100,000 mp/km2. "These results coincide with studies conducted in other regions of the Mediterranean, a marine ecosystem regarded as one of the biggest sinks of floating microplastic worldwide," notes William P. de Haan. On the Catalan coasts, the average concentration of microplastics is over 180,000 items per square kilometre. The most extreme levels have been found on the coast of Tordera (500,000 mp/km2). In these coastal areas, changes in the intensity of the North Current -- which flows from north to south along the coast -- and littoral currents are factors that may affect the distribution of microplastics in the sea. According to previous studies, the North Current could carry up to a thousand million plastic particles per day -- weighing up to 86 tons. In the coastal waters of Murcia and Almeria, the variety of polymers is even higher -- mainly nylon, polyurethane or polyethylene terephthalate -- than in Catalan waters, and the predominant ones are dense microplastics which sink easily. Regarding colours, the most abundant ones are matt white (46 % in Murcia and 54 % in Almeria) and dark colours (20 % and 12 % respectively).
On these coasts, marine dynamics -- with the arrival of surface waters from the Strait of Gibraltar -- could favour the presence of microplastics coming from the Atlantic Ocean. Moreover, there is extensive greenhouse cultivation -- such as in Campo de Dalias, in Almeria -- that generates uncontrolled dumping of plastic in areas near the Almeria coast, with a maximum value of 130,000 mp/km2. In Murcia, the highest concentration is in Cartagena -- 140,000 mp/km2. The diversity of microplastics in the sea regarding composition and colour, as well as differences in concentration, points to different origins and volumes depending on the analysed area of the coast, according to the authors. Plastics do not always behave the same way, and that is why their final destination in the marine environment is hard to predict in general terms. "Size and physical and chemical properties, as well as the conditions of the marine environment, determine the destination of microplastics in water," says researcher Anna Sànchez-Vidal. "Density of the plastic material is a determining factor for big fragments. When talking of a microplastic, dynamics are more complex. Also, the density of marine water varies due to several factors -- temperature, salinity, geographical position, depth -- and it directly affects the buoyancy of the microplastics." The study describes for the first time the potential of microplastics to integrate into marine aggregates, formed by organic and mineral-derived particles. This interaction -- described so far only in the laboratory -- is a phenomenon that occurs naturally in the marine environment, as stated in the new study. According to the study, 40 % of the microplastics -- in quantity -- and 25 % -- in mass -- can form these marine aggregates. This process could ease the sinking and accumulation of light microplastics on the sea floor, an environment far from the only agent able to degrade them: solar ultraviolet radiation. "Around 66 % of the microplastics we found in marine aggregates -- polyethylene, polypropylene and expanded polystyrene -- are low-density microplastics in the sea. This hypothesis could explain the presence of low-density microplastics at great depths worldwide, and why the abundance of plastics floating on the surface of the ocean is lower than expected," notes Sànchez-Vidal. Usually, plastics that float on the sea surface are eaten by marine organisms, which may mistake them for food. Even zooplankton is able to eat microplastics and excrete them in faecal pellets. This is a known, though understudied, phenomenon in marine ecosystems. Also, apart from the additives they contain per se, microplastics can bring toxic compounds into the trophic chain in marine waters (metals, organic pollutants, and others). Transported by marine currents, these plastic materials can become dispersal vehicles for invasive species and pathogenic organisms. Climate change, the fishing industry, maritime transport, hydrocarbon prospecting and exploitation, and industrial dumping are some of the big threats to the future of marine and coastal systems in the Mediterranean. Protecting and improving the environmental quality of the Mediterranean Sea is a priority on the European environmental, scientific and political agenda. Facing these scientific challenges, experts from the research group on Marine Geosciences of the UB have participated in different international studies that warn about the environmental impact of microparticles in the marine environment.
At the same time, they take part in scientific projects to improve the preservation of marine ecosystems in the Mediterranean basin, such as the projects Policy-oriented marine Environmental Research in the Southern European Seas (PERSEUS) and Implementation of the MSFD to the DEep Mediterranean Sea (IDEM). Regarding the Mediterranean, some of the most important instruments for environmental protection are the Marine Strategy Framework Directive (MSFD) and the Barcelona Convention for the Protection of the Mediterranean Sea, signed in 1976 and modified in 1995. According to Professor Miquel Canals, head of the research group on Marine Geosciences of the UB and director of the Department of Earth and Ocean Dynamics of the UB, "The Framework Directive considers a series of initiatives to protect and improve the environmental state of marine ecosystems in Europe. Along these lines, it defines a series of indicators related to research on marine waste, and in particular, a better knowledge of the environmental and biological impacts of microparticles in the marine environment." "Also, prevention of any kind of pollution is one of the main objectives of the Barcelona Convention. In this context, there are initiatives such as Fent front a les deixalles marines a la Mediterrània and Una Mediterrània sense plastics, related to the environmental impact of microplastics," concludes Professor Miquel Canals. | Pollution | 2,019
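As a side note on the units quoted above, concentrations in items per square kilometre are conventionally derived from surface net tows; a minimal sketch with hypothetical trawl numbers (net width, tow length and counts are illustrative assumptions, not the study's samples):

```python
# Converting surface-trawl counts into the concentrations quoted above.
net_width_m = 0.6        # manta-trawl mouth width (assumed)
tow_length_m = 2_000.0   # distance towed at the surface (assumed)
items_counted = 135      # microplastic particles picked from the sample (assumed)

swept_area_km2 = (net_width_m / 1_000) * (tow_length_m / 1_000)
concentration = items_counted / swept_area_km2
print(f"{concentration:,.0f} items/km^2")   # -> 112,500 items/km^2

# Composition shares, as reported for the study area:
composition = {"polyethylene": 0.545, "polypropylene": 0.165, "polyester": 0.097}
for polymer, share in composition.items():
    print(f"{polymer}: ~{share * concentration:,.0f} items/km^2")
```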
February 28, 2019 | https://www.sciencedaily.com/releases/2019/02/190228093553.htm | Pesticide exposure contributes to faster ALS progression | While exact causes of amyotrophic lateral sclerosis (ALS) remain unknown, new research shows pesticides and other environmental pollutants advance the progression of the neurodegenerative disease. | The latest study from the University of Michigan ALS Center of Excellence was published recently. "Our latest publication shows that other toxins like polychlorinated biphenyls, known as PCBs, are also elevated in ALS patients and correlate with poor survival," says Feldman, a Michigan Medicine neurologist. "Our research shows that environmental pollution is a public health risk that we believe must be addressed." ALS, also known as Lou Gehrig's disease, is a rapidly progressive disease that causes people to lose their ability to move their limbs and body. For the study, each of the 167 U-M patients had blood drawn shortly after being diagnosed with ALS. They were then divided into quartiles based on the concentration of pollutants in their bloodstream. The quartile with the highest amount of pollutants had a median survival time of 1 year and 11 months from the initial date of ALS diagnosis. Meanwhile, the quartile with the lowest concentration of pollutants had a months-longer median survival time of 2 years, 6 months. "Our concern is that not only are these factors influencing a person's likelihood to get ALS, but also speeding up disease once they have it," says Michigan Medicine neurologist Stephen Goutman, M.D., M.S., the study's primary author and the director of U-M's ALS clinic. Michigan Medicine researchers are uniquely poised to study the origins of ALS in the search for more effective treatments and, eventually, a cure. Feldman says Michigan has one of the highest rates of ALS in the country, according to the Centers for Disease Control and Prevention. "'Why us? Why Michigan?' We believe the answer may lie in the fact that Michigan is both an industrial and agricultural state," says Feldman, who founded Michigan Medicine's ALS Center of Excellence in 1998. Throughout Michigan's farming history, a variety of persistent environmental chemicals have been used in pesticides. These chemicals are absorbed into the ground and can potentially reach water supplies. While harmful pesticides have been identified and banned, such as DDT in 1972, their consequences persist, taking decades to degrade in some cases. These chemicals can accumulate in the sediments of rivers and the Great Lakes, as well as in the fish that populate them. Michigan's industrial activities have placed the state among the top five generators of hazardous waste in the U.S., with 69 designated Superfund sites. PCBs, which are non-flammable, human-made chemicals used in electrical and hydraulic equipment, were in use until 1979. Similar to pesticides, these industrial chemicals degrade slowly, can leach into the ground and may affect the environment for decades to come. "If these chemicals are getting into the water bodies, such as lakes and rivers, in Michigan, this could be a source of exposure for everybody," Goutman says. "These persistent environmental chemicals take a long time to break down, sometimes decades. Once you're exposed they may accumulate in your body. They get into the fat and can be released into the blood.
We're particularly concerned about ALS patients who have been exposed to higher amounts of these chemicals." "As pollution changes the environment, we are being exposed to more and more toxins. We don't yet know how this is going to contribute to human disease over time. As we look at more toxins, we want to identify those that are of greater significance in terms of disease onset or progression," Goutman adds. "If we can determine what these chemicals are doing to our organs, brains and motor neurons, then we can develop drugs to counteract those effects." ALS has no cure, but two FDA-approved medications, riluzole (also known as Rilutek) and edaravone (also known as Radicava), are shown to have modest effects in slowing disease. Goutman says using non-invasive ventilation -- a ventilator that's connected to a mask that covers the nose, nose and mouth, or entire face -- is an extremely effective therapy for ALS. It's reported to increase survival by 13 months on average. Also, it is important not to overlook other related symptoms in ALS and to treat these to improve quality of life and to address mobility and safety, which Goutman discussed in a recent Facebook Live presentation. Feldman says future research will continue to address the question her patients most frequently ask: "Why did I get this disease?" A clear understanding of the development of ALS will help researchers work toward a cure. Next, the team plans to study a new cohort of patients in U-M's clinic. Repeating similar results would further validate their findings, they say, establishing the framework for a national study. The scientific team has also received funding from the Centers for Disease Control and Prevention to understand the metabolism and interactions of pesticides and pollutants in ALS patients, and how specific metabolites correlate with disease onset, progression and survival. Feldman says understanding the metabolism of pesticides will lead to drug discovery. | Pollution | 2,019
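A minimal sketch of the quartile analysis described above, on synthetic data (the exposure distribution, units and survival model are illustrative assumptions, not the U-M cohort):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 167  # cohort size reported in the article

df = pd.DataFrame({
    # hypothetical blood pollutant concentration per patient (assumed units)
    "pollutant_ng_ml": rng.lognormal(mean=1.0, sigma=0.6, size=n),
})
# Assume (for illustration) that higher exposure shortens survival.
baseline_months = 30.0
df["survival_months"] = rng.exponential(
    baseline_months / (1 + 0.15 * df["pollutant_ng_ml"])
)

# Divide patients into exposure quartiles and compare median survival,
# mirroring the comparison described in the article.
df["quartile"] = pd.qcut(df["pollutant_ng_ml"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("quartile", observed=True)["survival_months"].median().round(1))
```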
February 27, 2019 | https://www.sciencedaily.com/releases/2019/02/190227111130.htm | Crop residue burning is a major contributor to air pollution in South Asia | While fossil fuel emissions in New Delhi account for 80 percent of the air pollution plume during the summer, emissions from biomass burning (such as crop residue burning) in neighboring regions rival those from fossil fuels during the fall and winter. | "Black carbon aerosols are damaging to human health and their levels are higher in New Delhi than in many other megacities. During fall and winter, the levels of polluting air particles in New Delhi can reach ten times the limit recommended by the World Health Organization. To determine the environmental effects of black carbon in this highly populated city, it is crucial to quantify the contributions from the key emissions sources," says August Andersson, Researcher at the Department of Environmental Science and Analytical Chemistry (ACES), Stockholm University, and co-author of the study. The researchers collected air samples from New Delhi during 2011 and analyzed their black carbon content by creating carbon isotope signature profiles of each of the samples to identify the source of the particles. Different sources of carbon give their own unique isotopic fingerprints. When they compared the relative amounts of the black carbon particles from the different sources over the course of a year, they discovered a strong seasonal variation. The relative contribution from fossil fuel combustion peaked during rainy summers, and the contribution from biomass burning (for example, wood and straw burning) peaked during the dry fall and winter months. In addition, the scientists found that the sources of the high biomass emissions were regional rather than local and urban -- for example, burning of residual crops by farmers living approximately 200 km away from New Delhi. In India, crop residue burning occurs after harvest, which typically takes place in October/November for rice and in April/May for wheat. "Our findings contradict the widespread notion that the emission flux between cities and the countryside is mainly one-way. No other study has conclusively reported such high amounts of black carbon from biomass burning in the middle of a megacity, where the main source is expected to be traffic. The wintertime regional influx of black carbon into New Delhi suggests that to efficiently combat severe air pollution, it is necessary to mitigate not only the urban emissions, but also regional-scale biomass emissions, including agricultural crop residue burning," says August Andersson. He continues: "Air pollution in the Indo-Gangetic region, which includes eastern Pakistan, northern India and Bangladesh, is a regional problem, potentially affecting nearly 1 billion people." | Pollution | 2,019
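The isotope fingerprinting mentioned above rests on a simple mass balance: fossil carbon is radiocarbon-dead, while contemporary biomass carbon is near-modern. A minimal sketch of the standard two-endmember mixing model (the endmember and sample values are illustrative assumptions, not the study's measurements):

```python
D14C_FOSSIL = -1000.0   # per mil; fossil fuels contain no radiocarbon
D14C_BIOMASS = 50.0     # per mil; approximate contemporary biomass value (assumed)

def biomass_fraction(d14c_sample: float) -> float:
    """Two-endmember isotopic mixing: linear split between fossil and biomass."""
    return (d14c_sample - D14C_FOSSIL) / (D14C_BIOMASS - D14C_FOSSIL)

# Hypothetical seasonal Delta14C measurements of black carbon in New Delhi:
for season, d14c in [("summer", -680.0), ("winter", -480.0)]:
    f_bio = biomass_fraction(d14c)
    print(f"{season}: {f_bio:.0%} biomass, {1 - f_bio:.0%} fossil")
```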
February 26, 2019 | https://www.sciencedaily.com/releases/2019/02/190226125011.htm | 'Dead zone' volume more important than area to fish, fisheries | Dubravko Justic, the Texaco Distinguished Professor in the LSU Department of Oceanography & Coastal Sciences, and Research Associate Lixia Wang recently co-authored a study suggesting that measuring the volume rather than the area of the Gulf of Mexico's dead zone is more appropriate for monitoring its effects on marine organisms. | The dead zone, a hypoxic zone, is a region of low oxygen that results from runoff of nutrients, such as nitrogen and phosphorus, often found in fertilizer, flowing from the Mississippi River into the coastal ocean. It is the largest recurring hypoxic zone in the U.S., occurring most summers, and is located off the coast of Louisiana. This nutrient pollution, coupled with other factors, is believed to have a negative impact on fisheries because it depletes the oxygen required to support most marine life in bottom and near-bottom waters. Since 2001, stakeholders have used hypoxic area measurements to set goals for limiting or reversing its size, but this new study shows that the hypoxic volume appears more responsive to reductions in nitrogen flowing into the northern Gulf of Mexico than the hypoxic area. The researchers' model simulations indicate that even with a 25 percent nitrogen load reduction, the thickness of the hypoxic layer in the northern Gulf of Mexico decreases markedly, and hypoxia remains localized to a relatively thin layer near the bottom that most fish and other mobile organisms can more effectively avoid. Justic believes this should be considered when reviewing and potentially setting new hypoxia management goals. "Understanding variability in hypoxic volume is relevant to assessing the effects of hypoxia on fish and fisheries, such as enhanced susceptibility to fishing due to an increased aggregation of fish avoiding hypoxic waters," Justic said. | Pollution | 2,019
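A minimal sketch of the area-versus-volume distinction, on a synthetic dissolved-oxygen grid (grid dimensions, cell sizes and the oxygen field are illustrative assumptions, not the authors' model output); the conventional hypoxia threshold of 2 mg/L is used:

```python
import numpy as np

rng = np.random.default_rng(3)
nx, ny, nz = 50, 40, 20
dx_km, dy_km, dz_m = 2.0, 2.0, 1.0   # hypothetical cell dimensions

# Synthetic dissolved-oxygen field (mg/L), lower near the bottom (high z index).
depth_effect = np.linspace(0, 4, nz)[None, None, :]
do_field = np.clip(6.0 - depth_effect + rng.normal(0, 0.8, (nx, ny, nz)), 0, None)

hypoxic = do_field < 2.0
# Area: sea-surface footprint of columns containing any hypoxic water.
hypoxic_area_km2 = hypoxic.any(axis=2).sum() * dx_km * dy_km
# Volume: all hypoxic cells -- this captures how *thick* the low-oxygen layer is,
# which is the quantity the study argues matters more to fish.
hypoxic_volume_km3 = hypoxic.sum() * dx_km * dy_km * (dz_m / 1_000)

print(f"hypoxic area:   {hypoxic_area_km2:,.0f} km^2")
print(f"hypoxic volume: {hypoxic_volume_km3:,.2f} km^3")
```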
February 26, 2019 | https://www.sciencedaily.com/releases/2019/02/190226112356.htm | Scientists simulate forest and fire dynamics to understand area burn of future wildfires | Climate change and wildfire -- it's a combustible mix with costly devastation and deadly consequences. With a goal of understanding the link between the two variables, researchers over the years have studied the effects of climate and wildfire interactions in the Sierra Nevada mountain range. That research has evolved into learning about the distribution of trees, the extent of forest cover and carbon dynamics. | However, most previous studies have looked at the relationship between climate change and area burned by wildfire as a result of climatic influences on wildfire. Fire is a self-limiting process influenced by productivity. Yet many fire projections assume sufficient vegetation to support fire, with substantial implications for carbon dynamics and emissions. Climate influences vegetation directly and through climate-mediated disturbance processes, such as wildfire. Typically, the amount of area burned and temperature are positively associated based on the availability of vegetation to burn. The interaction with vegetation, which provides the fuel for the fire to burn, hasn't been studied in depth. Now, scientists, including Matthew Hurteau in the Department of Biology at The University of New Mexico, are examining more data via simulations of wildfires in the Sierra Nevada to improve their understanding of the link between prior and future wildfires. They hypothesized that prior wildfires and their influence on vegetation, coupled with a changing climate and its influence on vegetation recovery after a wildfire, would likely restrict the size of wildfires in the future. The research, titled "Vegetation-fire feedback reduces projected area burned under climate change," was published today. Using a landscape model that simulates forest and fire dynamics, Hurteau and his colleagues conducted simulations where climate was the only influence on area burned (static) and where the interaction of climate and prior fire events on both fuel flammability and availability influenced area burned (dynamic). "Using an ecosystem model that included photosynthesis and respiration, we were able to capture how the vegetation would respond to climate and area burned during each decade," said Hurteau. "We used the ecosystem model output from each decade and a statistical model that accounted for the effects of prior area burned and climate to re-estimate fire size distributions for the subsequent decade and capture the effects of previous wildfires." What they found was a little surprising. They had expected that prior fire events and their effect on the amount of fuel available to burn in the forest would impose a large constraint on future area burned. Scientists call it a fuel limitation effect from prior fire events, but they found that it doesn't last very long and it's not nearly as large as they had thought. "We discovered that compared to the static scenarios, where climate is the only influence on area burned, accounting for the interaction of prior fires and climate on fuel availability and flammability only reduces the cumulative area burned in the Sierra Nevada by about 7.5 percent over the course of this century.
This is much lower than we anticipated, because enough vegetation was able to recover following a fire that a burned area could support a subsequent fire pretty quickly," said Hurteau. As part of the study, Christine Wiedinmyer, an atmospheric chemist and associate director for science at the University of Colorado's Cooperative Institute for Research in Environmental Sciences, built upon the work of Hurteau and others to take it to the level of air pollutant emissions by looking at the smoke released from wildfires in different climates with different types of vegetation. Wiedinmyer utilized the information from the other researchers to determine how the changes in the amount of vegetation burned impact the amount of air pollutants emitted in the wildfire smoke. The inclusion of the new vegetation dynamics changed the estimates of wildfire pollutant emissions -- which can ultimately impact air quality. More detailed predictions of vegetation, wildfire and emissions will enable air quality managers to better anticipate wildfire air quality impacts into the future. "This study revealed that wildfire activity is impacted not only by climate, but also by vegetation and prior fires," said Wiedinmyer. "When you include all of these components in predictions of wildfire, it can change our predictions of the air pollutants from wildfires. This is important because that information can help air quality managers know what to expect and how to plan for it." "The benefit of our findings is more in terms of improving our understanding of the potential challenges that communities will face in the air sheds that are impacted when wildfire burns in the Sierra Nevada," said Hurteau. "It's also beneficial for policy planning. From a management planning perspective, this work basically shows us that the effect of a prior fire event is only influential for about a decade, and that there's enough vegetation regrowth in the footprint of a wildfire within 10 years that it can carry fire again." Hurteau noted California's climate action plan is dependent on continued carbon uptake by natural vegetation systems in the state to help balance human-caused carbon dioxide emissions. "This helps constrain some of the uncertainty around future wildfire and how it'll impact carbon uptake by the forest," he said. The research was a collaborative effort involving scientists from across various disciplines, including biology, atmospheric chemistry and engineering, with contributors including Shuang Liang at the University of Illinois and LeRoy Westerling at the University of California-Merced. "We were able to look at wildfire at a new level of detail and work collaboratively to get a bigger picture of how wildfire behaves in different conditions," added Wiedinmyer. | Pollution | 2,019
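A toy sketch of the static-versus-dynamic comparison described above (not the authors' landscape simulator; the burn fractions and one-decade recovery assumption are illustrative): recently burned area is fuel-limited and cannot reburn until vegetation recovers.

```python
import numpy as np

rng = np.random.default_rng(11)
decades = 8
# Climate-driven demand for fire: fraction of the landscape per decade (assumed).
climate_driven_burn = rng.uniform(0.08, 0.15, decades)

# Static scenario: climate alone sets area burned.
static_total = climate_driven_burn.sum()

# Dynamic scenario: the previous decade's burn scars are fuel-limited.
dynamic_total = 0.0
recently_burned = 0.0
for burn in climate_driven_burn:
    available = 1.0 - recently_burned   # only recovered area can carry fire
    burned_now = burn * available
    dynamic_total += burned_now
    recently_burned = burned_now        # limitation lasts roughly one decade

reduction = 1 - dynamic_total / static_total
print(f"static: {static_total:.2f}, dynamic: {dynamic_total:.2f}, "
      f"reduction: {reduction:.1%}")
```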
February 26, 2019 | https://www.sciencedaily.com/releases/2019/02/190226112426.htm | Being surrounded by green space in childhood may improve mental health of adults | Children who grow up with greener surroundings have up to 55% less risk of developing various mental disorders later in life. This is shown by a new study from Aarhus University, Denmark, emphasizing the need for designing green and healthy cities for the future. | A larger and larger share of the world's population now lives in cities, and WHO estimates that more than 450 million people worldwide suffer from a mental disorder -- a number that is expected to increase. Now, based on satellite data from 1985 to 2013, researchers from Aarhus University have mapped the presence of green space around the childhood homes of almost one million Danes and compared this data with the risk of developing one of 16 different mental disorders later in life. The study is published today. Postdoc Kristine Engemann from the Department of Bioscience and the National Centre for Register-based Research at Aarhus University, who spearheaded the study, says: "Our data is unique. We have had the opportunity to use a massive amount of data from Danish registers of, among other things, residential location and disease diagnoses and compare it with satellite images revealing the extent of green space surrounding each individual when growing up." Researchers know that, for example, noise, air pollution, infections and poor socio-economic conditions increase the risk of developing a mental disorder. Conversely, other studies have shown that more green space in the local area creates greater social cohesion and increases people's physical activity level, and that it can improve children's cognitive development. These are all factors that may have an impact on people's mental health. "With our dataset, we show that the risk of developing a mental disorder decreases incrementally the longer you have been surrounded by green space from birth and up to the age of 10. Green space throughout childhood is therefore extremely important," Kristine Engemann explains. As the researchers adjusted for other known risk factors of developing a mental disorder, they see their findings as a robust indication of a close relationship between green space, urban life, and mental disorders. Kristine Engemann says: "There is increasing evidence that the natural environment plays a larger role for mental health than previously thought. Our study is important in giving us a better understanding of its importance across the broader population." This knowledge has important implications for sustainable urban planning -- not least because a larger and larger proportion of the world's population lives in cities. "The coupling between mental health and access to green space in your local area is something that should be considered even more in urban planning to ensure greener and healthier cities and improve the mental health of urban residents in the future," adds co-author Professor Jens-Christian Svenning from the Department of Bioscience, Aarhus University. | Pollution | 2,019
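A minimal sketch of the kind of exposure metric behind such register studies, on synthetic data (the greenness values, effect size and outcome model are illustrative assumptions, not the Danish registers): vegetation greenness around the childhood home is summarized per person and compared against later outcomes.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Mean satellite greenness (e.g. NDVI: 0 = barren, 1 = dense vegetation)
# around each childhood home, averaged over ages 0-10 (synthetic values).
childhood_ndvi = rng.beta(4, 3, n)

# Assume (for illustration) that lower greenness raises disorder risk.
base_risk = 0.05
risk = base_risk * (1 + 1.0 * (childhood_ndvi.mean() - childhood_ndvi))
outcome = rng.random(n) < np.clip(risk, 0, 1)

# Compare incidence in the least- and most-green deciles.
lo = outcome[childhood_ndvi < np.quantile(childhood_ndvi, 0.1)].mean()
hi = outcome[childhood_ndvi > np.quantile(childhood_ndvi, 0.9)].mean()
print(f"incidence, lowest-greenness decile:  {lo:.3f}")
print(f"incidence, highest-greenness decile: {hi:.3f}")
print(f"relative risk (low vs high greenness): {lo / hi:.2f}")
```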
February 21, 2019 | https://www.sciencedaily.com/releases/2019/02/190221110418.htm | Researchers explore an often ignored source of greenhouse gas | In a new study from UBC's Okanagan campus, researchers have discovered a surprising new source of carbon dioxide (CO2). | "We have been studying the carbon content of soil for some time," says Melanie Jones, professor of biology and study lead author. "This large natural carbon store is hugely important in combatting rising atmospheric CO2." During photosynthesis, plants remove CO2 from the atmosphere. Critically, some of this carbon ends up in the soil as organic matter. Hannam says greater organic matter in soil has the benefit of sequestering greater amounts of atmospheric CO2. As part of this research effort, Jones, Hannam and fellow UBC Okanagan soil scientist Andrew Midwood have been analyzing the chemical forms of CO2 released from the soil. Working in a drip-irrigated apple orchard, the study involved continuous measurement of air coming from dynamic soil respiration chambers placed in the orchard. This allowed for high-frequency monitoring of the soil surface and air. The tests were repeated with different water supplies, using irrigation water or de-ionized water, and the results were remarkably different: it turns out that some of the CO2 released from the soil comes from the irrigation water itself. Midwood is quick to point out that understanding the processes that drive the release of CO2 from soil is important for interpreting these measurements. "This is a natural process," says Hannam. "Our results have to be considered in a broader context. Irrigation is essential to fruit production in the Okanagan Valley." Along with causing the release of CO2, irrigation makes that production possible. Their research has practical applications for any agriculture-based community in any arid region, especially if the main source of irrigation is an alkaline lake. As irrigation expands across arid and semi-arid regions, CO2 released from irrigation water could become an increasingly significant source. Their work was funded by Agriculture and Agri-Food Canada's Agricultural Greenhouse Gases Program and was recently published. | Pollution | 2,019
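One standard way to identify the source of soil-emitted CO2 -- the article does not specify the authors' exact analysis, so this is offered only as a related, well-known technique -- is a Keeling plot, whose intercept recovers the isotopic signature of the added CO2; a minimal sketch on synthetic values:

```python
import numpy as np

rng = np.random.default_rng(2)

background_co2, background_d13c = 410.0, -8.5    # ppm, per mil (ambient air)
source_d13c = -4.0   # e.g. carbonate-derived CO2 (assumed endmember value)

added = rng.uniform(20, 300, 30)                 # CO2 added by the soil, ppm
total = background_co2 + added
# Mass balance: the chamber signal is a concentration-weighted mix of pools,
# which makes delta13C linear in 1/[CO2] with the source value as intercept.
d13c = (background_co2 * background_d13c + added * source_d13c) / total
d13c += rng.normal(0, 0.05, total.size)          # measurement noise

slope, intercept = np.polyfit(1 / total, d13c, 1)
print(f"estimated source delta13C: {intercept:.1f} per mil")  # ~ -4.0
```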
February 21, 2019 | https://www.sciencedaily.com/releases/2019/02/190221083430.htm | Squid could provide an eco-friendly alternative to plastics | The remarkable properties of a recently-discovered squid protein could revolutionize materials in a way that would be unattainable with conventional plastic, finds a recently published review. | "Squid proteins can be used to produce next generation materials for an array of fields including energy and biomedicine, as well as the security and defense sector," says lead author Melik Demirel, Lloyd and Dorothy Foehr Huck Endowed Chair in Biomimetic Materials, and Director of the Center for Research on Advanced Fiber Technologies (CRAFT) at Penn State University, USA. "We reviewed the current knowledge on squid ring teeth-based materials, which are an excellent alternative to plastics because they are eco-friendly and environmentally sustainable." As humanity awakens to the aftermath of a 100-year party of plastic production, we are beginning to heed nature's warnings -- and its solutions. "Nature produces a variety of smart materials capable of environmental sensing, self-healing and exceptional mechanical function. These materials, or biopolymers, have unique physical properties that are not readily found in synthetic polymers like plastic. Importantly, biopolymers are sustainable and can be engineered to enhance their physical properties," explains Demirel. The oceans, which have borne the brunt of plastic pollution, are at the center of the search for sustainable alternatives. A newly-discovered protein from squid ring teeth (SRT) -- circular predatory appendages located on the suction cups of squid, used to strongly grasp prey -- has gained interest because of its remarkable properties and sustainable production. The elasticity, flexibility and strength of SRT-based materials, as well as their self-healing, optical, and thermal and electrical conducting properties, can be explained by the variety of molecular arrangements they can adopt. SRT proteins are composed of building blocks arranged in such a way that micro-phase separation occurs. This is a similar situation to oil and water, but on a much smaller, nano-scale. The blocks cannot separate completely to produce two distinct layers, so instead molecular-level shapes are created, such as repeating cylindrical blocks, disordered tangles or ordered layers. The shapes formed dictate the properties of the material, and scientists have experimented with these to produce SRT-based products for a variety of uses. In the textiles industry, SRT protein could address one of the main sources of microplastic pollution by providing an abrasion-resistant coating that reduces microfiber erosion in washing machines. Similarly, a self-healing SRT protein coating could increase the longevity and safety of damage-prone biochemical implants, as well as garments tailored for protection against chemical and biological warfare agents. It is even possible to interleave multiple layers of SRT proteins with other compounds or technology, which could lead to the development of 'smart' clothes that can protect us from pollutants in the air while also keeping an eye on our health. The optical properties of SRT-based materials mean these clothes could also display information about our health or surroundings.
Flexible SRT-based photonic devices -- components that create, manipulate or detect light, such as LEDs and optical displays, which are typically manufactured with hard materials like glass and quartz -- are currently in development. "SRT photonics are biocompatible and biodegradable, so could be used to make not only wearable health monitors but also implantable devices for biosensing and biodetection," adds Demirel. One of the main advantages of SRT-based materials over synthetic materials and plastics made from fossil fuels is their eco-credentials. SRT proteins are cheaply and easily produced from renewable resources, and researchers have found a way of producing them without catching a squid. "We don't want to deplete natural squid resources and hence we produce these proteins in genetically modified bacteria. The process is based on fermentation and uses sugar, water, and oxygen to produce biopolymers," explains Demirel. It is hoped that the SRT-based prototypes will soon become available more widely, but more development is needed. Demirel explains, "Scaling up these materials requires additional work. We are now working on the processing technology of these materials so that we can make them available in industrial manufacturing processes." | Pollution | 2,019
February 21, 2019 | https://www.sciencedaily.com/releases/2019/02/190221083414.htm | Innovative nanocoating technology harnesses sunlight to degrade microplastics | Low-density polyethylene (LDPE) film microplastic fragments were successfully degraded in water using visible-light-excited heterogeneous ZnO photocatalysts. | The innovative nanocoating technology was developed by a research team from KTH Royal Institute of Technology, Sweden, and was further investigated together with PP Polymer, Sweden, as part of the EU Horizon 2020 funded project CLAIM: Cleaning Marine Litter by Developing and Applying Innovative Methods in European Seas (GA no. 774586). Microplastics are a global menace to the biosphere owing to their ubiquitous distribution, uncontrolled environmental occurrences, small sizes and long lifetimes. While currently applied remediation methods, including filtration, incineration and advanced oxidation processes like ozonation, all require high energy or generate unwanted byproducts, the team of CLAIM scientists proposes an innovative toxic-free methodology reliant solely on relatively inexpensive nanocoatings and visible light. In the study, the scientists tested the degradation of fragmented low-density polyethylene (LDPE) microplastic residues by visible-light-induced heterogeneous photocatalysis activated by zinc oxide nanorods. Results showed a 30% increase of the carbonyl index, a marker used to demonstrate the degradation of polymeric residues. Additionally, an increase in brittleness accompanied by a large number of wrinkles, cracks and cavities on the surface was recorded. "Our study demonstrates rather positive results towards the effectiveness of breaking down low-density polyethylene with the help of our nanocoating under artificial sunlight. In practice this means that once the coating is applied, microplastics will be degraded solely through the help of sunlight. The results provide new insights into the use of a clean technology for addressing the global microplastic pollution with reduced by-products," explains Prof. Joydeep Dutta, KTH Royal Institute of Technology. The photocatalytic device is one of five marine cleaning technologies developed within the CLAIM project. "A year and a half into the project we are already able to demonstrate positive results towards our ultimate goal to introduce new affordable and harmless technologies to aid us in tackling the uncontrollably growing problem of marine plastic pollution. We are positive that more results will come in the following months," concludes the CLAIM coordination team. | Pollution | 2,019
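The carbonyl index mentioned above is commonly computed from FTIR spectra as the ratio of carbonyl-band absorbance to a stable reference band; a minimal sketch with hypothetical absorbance values (band choices vary between labs, and these numbers are not from the study):

```python
def carbonyl_index(a_carbonyl: float, a_reference: float) -> float:
    """Ratio of carbonyl-band absorbance (~1714 cm^-1 for oxidized PE) to a
    stable reference band (e.g. ~1465 cm^-1 CH2 bending)."""
    return a_carbonyl / a_reference

# Hypothetical absorbances before and after photocatalytic treatment:
before = carbonyl_index(a_carbonyl=0.12, a_reference=0.60)
after = carbonyl_index(a_carbonyl=0.156, a_reference=0.60)

change = (after - before) / before
print(f"carbonyl index before: {before:.2f}, after: {after:.2f}")
print(f"increase: {change:.0%}")   # a ~30% rise indicates surface oxidation
```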
February 20, 2019 | https://www.sciencedaily.com/releases/2019/02/190220121935.htm | Fossil fuel combustion is the main contributor to black carbon around Arctic | Fossil fuel combustion is the main contributor to black carbon collected at five sites around the Arctic, which has implications for global warming, according to a study by an international group of scientists that included a United States team from Baylor University. | The five-year study to uncover sources of black carbon was done at five remote sites around the Arctic and is published in the journal. The Baylor team used radiocarbon to determine fossil and biomass burning contributions to black carbon in Barrow, Alaska, while their collaborators used the same technique for sites in Russia, Canada, Sweden and Norway. Baylor research was led by Rebecca Sheesley, Ph.D., associate professor of environmental science in the College of Arts & Sciences, and Tate Barrett, Ph.D., former student of Sheesley and now a postdoctoral scholar at the University of North Texas. Their interest in the research resulted from a desire to understand why the Arctic is changing and what pollutants should be controlled to mitigate that change. "The Arctic is warming at a much higher rate than the rest of the globe," Sheesley said. "This climate change is being driven by air pollutants such as greenhouse gases and particles in the atmosphere. One of the most important components of this atmospheric particulate matter is black carbon, or soot. Black carbon directly absorbs incoming sunlight and heats the atmosphere. In snowy places, it also can deposit on the surface, where it heats the surface and increases the rate of melting." Findings showed that fossil fuel combustion (coal, gasoline or diesel) is responsible for most of the black carbon in the Arctic (annually around 60 percent), but that biomass burning (including wildfires and residential woodsmoke) becomes more important in the summer. The site at Barrow, Alaska, which was the focus of the Baylor team, differed from other sites in that it had a higher fossil fuel contribution to black carbon and was more impacted by North American sources of black carbon than the rest of the Arctic. Since 2012, Sheesley has been expanding this work to investigate how combustion and non-combustion sources impact different types of atmospheric particulate matter across the Alaskan Arctic. This Baylor University research received support from the U.S. Department of Energy (Atmospheric Radiation Measurement Field Campaign no. 2010-05876) and Baylor's C. Gus Glasscock Jr. Endowed Fund for Excellence in Environmental Sciences. | Pollution | 2,019
February 19, 2019 | https://www.sciencedaily.com/releases/2019/02/190219155156.htm | Prenatal forest fire exposure stunts children's growth | Forest fires are more harmful than previously imagined, causing stunted growth in children who were exposed to smoke while in the womb, according to new research from Duke University and the National University of Singapore. | The authors found that prenatal exposure to haze from forest fires led to a statistically significant 1.3-inch decrease in expected height at age 17. "Because adult height is associated with income, this implies a loss of about 3 percent of average monthly wages for approximately one million Indonesian workers born during this period," the authors write. "While previous research has drawn attention to the deaths caused by the forest fires, we show that survivors also suffer large and irreversible losses," they wrote. "Human capital is lost along with natural capital because of haze exposure." "This disadvantage is impossible to reverse," said co-author Subhrendu Pattanayak of the Duke Sanford School of Public Policy. After conducting cost-benefit analyses, the authors concluded that the long-term human capital losses exceed the short-term financial benefits associated with using fire to clear land for the oil palm industry. "There are ways to eliminate these fires that are not that expensive, so this seems a very shortsighted way to develop and grow an economy," Pattanayak said. The study, "Seeking natural capital projects: Forest fires, haze, and early-life exposure in Indonesia," by Pattanayak and Jie-Sheng Tan-Soo of the Lee Kuan Yew School of Public Policy, appears this week. The study combined data on mothers' exposure to the widespread Indonesian forest fires of 1997 with longitudinal data on nutritional outcomes, genetic inheritance, climatic factors and various sociodemographic factors. In 1997, an abnormally dry year, fires set to clear land primarily for oil palm plantations spread and burned out of control. Between August and October, when the fires were most intense, they engulfed 11 million hectares (27.2 million acres), causing massive exposure to unhealthy levels of air pollution. That year, about 25 percent of global carbon emissions were generated by this single event. The study examined data for 560 affected children who were in utero or in the first six months of life at the time of the fires. 
Their health outcomes and household characteristics were drawn from the 1997, 2000, 2007, and 2014 rounds of the Indonesian Family Life Survey. The authors conducted a series of robustness checks and confirmed that their findings were not driven by high levels of pollution in later years, geographic factors, an indirect effect of severe air pollution on a family's ability to work and earn income, or overall reductions in food consumption during the months of the forest fires. After documenting the negative effects of the fires on health and well-being, the authors went on to conduct a series of cost-benefit analyses to determine whether spending to avoid such outcomes would be fiscally justifiable. Collectively, these analyses showed that the social net benefits of using fire to clear land for oil palm are lower than those of mechanical clearing, stronger enforcement of fire bans and better fire suppression efforts. Because oil palm growers would be unwilling to bear the higher costs of mechanical clearing, the authors recommend that Indonesia pursue more effective fire bans, fire suppression and moratoriums on oil palm to protect natural and human capital. | Pollution | 2,019
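The statistical core of a study like this is a regression of adult outcomes on early-life exposure with controls for confounders. The sketch below is purely illustrative: the variables, coefficients and noise are synthetic and do not reproduce the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 560  # same sample size as the study; every value below is synthetic
haze = rng.uniform(0.0, 1.0, n)           # in-utero exposure index
parent_height = rng.normal(63.0, 2.0, n)  # control variable, inches
height = 66.0 - 1.3 * haze + 0.4 * (parent_height - 63.0) + rng.normal(0.0, 2.0, n)

# Ordinary least squares: height ~ intercept + haze + parent_height
X = np.column_stack([np.ones(n), haze, parent_height])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
print(f"estimated height deficit per unit of exposure: {beta[1]:.2f} inches")
```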
February 19, 2019 | https://www.sciencedaily.com/releases/2019/02/190219111825.htm | Fishing and pollution regulations don't help corals cope with climate change | A new study from the University of North Carolina at Chapel Hill reports that protecting coral reefs from fishing and pollution does not help coral populations cope with climate change. The study also concludes that ocean warming is the primary cause of the global decline of reef-building corals and that the only effective solution is to immediately and drastically reduce greenhouse gas emissions. | Ocean warming is devastating reef-building corals around the world. About 75 percent of the living coral on the reefs of the Caribbean and south Florida has been killed off by warming seawater over the last 30 to 40 years. Australia's Great Barrier Reef was hit by extreme temperatures and mass bleaching in 2016 and 2017, wiping out roughly half of the remaining coral on the reef's remote northern section. Corals build up reefs over thousands of years via the slow accumulation of their skeletons, and coral reef habitats are occupied by millions of other species, including grouper, sharks, and sea turtles. In addition to supporting tourism and fisheries, reefs protect coastal communities from storms by buffering the shoreline from waves. When corals die, these valuable services are lost. The most common response to coral decline by policy makers and reef managers is to ban fishing, based on the belief that fishing indirectly exacerbates the effects of ocean warming by enabling seaweeds that overgrow corals. The approach, referred to as managed resilience, assumes that threats to species and ecosystems are cumulative and that by minimizing as many threats as possible, we can make ecosystems resilient to climate change, disease outbreaks, and other threats that cannot be addressed locally. The study's authors, led by John Bruno, a marine ecologist in the College of Arts and Sciences at the University of North Carolina at Chapel Hill, performed a quantitative review of 18 case studies that field-tested the effectiveness of the managed resilience approach. None found it to be effective: protecting reefs inside Marine Protected Areas from fishing and pollution did not reduce how much coral was killed by extreme temperatures or how quickly coral populations recovered from coral disease, bleaching, and large storms. "Managed resilience is the approach to saving reefs favored by many scientists, nongovernmental organizations, and government agencies, so it's surprising that it doesn't work. Yet the science is clear: fishery restrictions, while beneficial to overharvested species, do not help reef-building corals cope with human-caused ocean warming," said Bruno. The 18 individual studies measured the effectiveness of managed resilience by comparing the effects of large-scale disturbances, like mass bleaching events, major storms, and disease outbreaks, on coral cover inside Marine Protected Areas versus on unprotected reefs. Many also measured the rate of coral population recovery after storms. The decline in coral cover was measured directly, via scuba surveys of the reef, before and periodically after large-scale disturbances. Overall, the meta-analysis included data from 66 protected reefs and 89 unprotected reefs in 15 countries around the world. The study also assessed evidence for various assumed causes of coral decline. 
For many of these, including overfishing, seaweeds, and pollution, the evidence was minimal or uncertain. In contrast, the authors found that an overwhelming body of evidence indicates that ocean warming is the primary cause of the mass coral die-off that scientists have witnessed around the world. Bruno collaborated with Dr. Isabelle Côté of Simon Fraser University and Dr. Lauren Toth of the USGS St. Petersburg Coastal and Marine Science Center. The research was funded by the U.S. National Science Foundation, the Natural Sciences and Engineering Research Council of Canada, the U.S. Geological Survey's Coastal and Marine Geology program and the Climate and Land Use Research and Development program. | Pollution | 2,019
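Meta-analyses of this kind commonly reduce each case study to a shared effect size -- for example, the log ratio of coral cover after versus before a disturbance -- and then compare pooled effects for protected and unprotected reefs. A minimal sketch with invented cover values, not the study's data:

```python
import numpy as np

def log_response_ratio(cover_after, cover_before):
    """Per-reef effect size: ln(after / before).
    Zero means no change; negative values mean coral loss."""
    return np.log(np.asarray(cover_after, float) / np.asarray(cover_before, float))

# Invented coral cover values (%) before and after a mass bleaching event
protected = log_response_ratio([18, 22, 30], [35, 40, 52])
unprotected = log_response_ratio([20, 25, 33], [38, 45, 56])

# Similar mean declines inside and outside MPAs -> no detectable protection effect
print(f"protected reefs:   {protected.mean():.2f}")
print(f"unprotected reefs: {unprotected.mean():.2f}")
```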
February 19, 2019 | https://www.sciencedaily.com/releases/2019/02/190219111718.htm | The global impact of coal power | Coal-fired power plants produce more than just the carbon dioxide that contributes to global warming. When burning coal, they also release particulate matter, sulphur dioxide, nitrogen oxide and mercury -- damaging the health of many people around the world in various ways. To estimate where action is most urgently required, the research group led by Stefanie Hellweg from ETH Zurich's Institute of Environmental Engineering modelled and calculated the undesired side effects of coal power for each of the 7,861 power plant units in the world. | The results, which were recently published, show that many power plants remove only a fraction of these pollutants -- while also often burning coal of inferior quality. "More than half of the health effects can be traced back to just one tenth of the power plants. These power plants should be upgraded or shut down as quickly as possible," says Christopher Oberschelp, the lead author of the study. The global picture of coal power production shows that the gap between privileged and disadvantaged regions is widening, for two reasons. Firstly, wealthy countries -- such as those in Europe -- import high-quality coal with a high calorific value and low emissions of harmful sulphur dioxide, while the poorer coal-exporting countries (such as Indonesia, Colombia and South Africa) are left with low-quality coal, which they often burn in outdated power plants without modern flue gas treatment to remove the sulphur dioxide. Secondly, "In Europe, we contribute to global warming with our own power plants, which has a global impact. However, the local health damage caused by particulate matter, sulphur dioxide and nitrogen oxide occurs mainly in Asia, where coal power is used to manufacture a large proportion of our consumer products," says Oberschelp. Global coal resources will last for several hundred years, so the harmful emissions need to be limited politically. "It is particularly important to leave coal that is high in mercury and sulphur content in the ground," says Oberschelp. Reducing the negative health effects of coal power generation should be a global priority: "But further industrialisation, especially in China and India, poses the risk of aggravating the situation instead," write the researchers led by Hellweg in their article. The initial investment costs for the construction of a coal power plant are high, but the subsequent operating costs are low. Power plant operators thus have an economic interest in keeping their plants running for a long time. "The best option is therefore to not build any new coal power plants. From a health and environment perspective, we should move away from coal and towards natural gas -- and in the long term, towards renewable energy sources," says Oberschelp. | Pollution | 2,019
February 18, 2019 | https://www.sciencedaily.com/releases/2019/02/190218153218.htm | Climate change makes summer weather stormier yet more stagnant | Climate change is shifting the energy in the atmosphere that fuels summertime weather, which may lead to stronger thunderstorms and more stagnant conditions for midlatitude regions of the Northern Hemisphere, including North America, Europe, and Asia, a new MIT study finds. | Scientists report that rising global temperatures, particularly in the Arctic, are redistributing the energy in the atmosphere: more energy is available to fuel thunderstorms and other local, convective processes, while less energy is going toward summertime extratropical cyclones -- larger, milder weather systems that circulate across thousands of kilometers. These systems are normally associated with winds and fronts that generate rain. "Extratropical cyclones ventilate air and air pollution, so with weaker extratropical cyclones in the summer, you're looking at the potential for more poor air-quality days in urban areas," says study author Charles Gertler, a graduate student in MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS). "Moving beyond air quality in cities, you have the potential for more destructive thunderstorms and more stagnant days with perhaps longer-lasting heat waves." Gertler and his co-author, Associate Professor Paul O'Gorman of EAPS, have published their results. In contrast to more violent tropical cyclones such as hurricanes, extratropical cyclones are large weather systems that occur poleward of the Earth's tropical zone. These storm systems generate rapid changes in temperature and humidity along fronts that sweep across large swaths of the United States. In the winter, extratropical cyclones can whip up into Nor'easters; in the summer, they can bring everything from general cloudiness and light showers to heavy gusts and thunderstorms. Extratropical cyclones feed off the atmosphere's horizontal temperature gradient -- the difference in average temperatures between northern and southern latitudes. This temperature gradient, together with the moisture in the atmosphere, produces a certain amount of energy in the atmosphere that can fuel weather events: the greater the gradient between, say, the Arctic and the equator, the stronger an extratropical cyclone is likely to be. In recent decades, the Arctic has warmed faster than the rest of the Earth, in effect shrinking the atmosphere's horizontal temperature gradient. Gertler and O'Gorman wondered whether and how this warming trend has affected the energy available in the atmosphere for extratropical cyclones and other summertime weather phenomena. They began by looking at a global reanalysis of recorded climate observations, known as the ERA-Interim Reanalysis, a project that has been collecting available satellite and weather balloon measurements of temperature and humidity around the world since the 1970s. From these measurements, the project produces a fine-grained global grid of estimated temperature and humidity at various altitudes in the atmosphere. From this grid of estimates, the team focused on the Northern Hemisphere and regions between 20 and 80 degrees latitude, taking the average summertime (June, July and August) temperature and humidity in these regions for each year from 1979 to 2017. 
They then fed each yearly summertime average of temperature and humidity into an algorithm, developed at MIT, that estimates the amount of energy that would be available in the atmosphere given the corresponding temperature and humidity conditions. "We can see how this energy goes up and down over the years, and we can also separate how much energy is available for convection, which would manifest itself as thunderstorms for example, versus larger-scale circulations like extratropical cyclones," O'Gorman says. They found that since 1979, the energy available for large-scale extratropical cyclones has decreased by 6 percent, whereas the energy that could fuel smaller, more local thunderstorms has gone up by 13 percent. Their results mirror some recent evidence in the Northern Hemisphere suggesting that summer winds associated with extratropical cyclones have decreased with global warming. Observations from Europe and Asia have also shown a strengthening of convective rainfall, such as from thunderstorms. "Researchers are finding these trends in winds and rainfall that are probably related to climate change," Gertler says. "But this is the first time anyone has robustly connected the average change in the atmosphere to these subdaily timescale events. So we're presenting a unified framework that connects climate change to this changing weather that we're seeing." The researchers' results estimate the average impact of global warming on the summertime energy of the atmosphere over the Northern Hemisphere. Going forward, they hope to resolve this further, to see how climate change may affect weather in more specific regions of the world. "We'd like to work out what's happening to the available energy in the atmosphere, and put the trends on a map to see if it's, say, going up in North America, versus Asia and oceanic regions," O'Gorman says. "That's something that needs to be studied more." | Pollution | 2,019
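The MIT algorithm itself is not described in this article, but its first step -- turning summertime temperature and humidity into an energy-like quantity and tracking its trend -- can be illustrated with moist static energy, a standard if much cruder proxy. All data below are synthetic:

```python
import numpy as np

CP = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)
LV = 2.5e6   # latent heat of vaporization of water, J/kg

def moist_static_energy(temp_k, q_kg_per_kg):
    """Near-surface moist static energy (geopotential term omitted):
    a rough measure of the energy available to summertime weather."""
    return CP * temp_k + LV * q_kg_per_kg

# Synthetic June-August means for 1979-2017 over one hypothetical region
years = np.arange(1979, 2018)
rng = np.random.default_rng(1)
temp = 293.0 + 0.02 * (years - 1979) + rng.normal(0.0, 0.3, years.size)  # K
q = 0.010 + 1.0e-5 * (years - 1979)  # specific humidity, kg/kg

mse = moist_static_energy(temp, q)
trend = np.polyfit(years, mse, 1)[0]
print(f"available-energy proxy trend: {trend:.1f} J/kg per year")
```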
February 17, 2019 | https://www.sciencedaily.com/releases/2019/02/190217115857.htm | A hidden source of air pollution? Your daily household tasks | Cooking, cleaning and other routine household activities generate significant levels of volatile and particulate chemicals inside the average home, leading to indoor air quality levels on par with those of a polluted major city, University of Colorado Boulder researchers say. | What's more, airborne chemicals that originate inside a house don't stay there: volatile organic compounds (VOCs) from products such as shampoo, perfume and cleaning solutions eventually escape outside and contribute to ozone and fine particle formation, making up an even greater source of global atmospheric air pollution than cars and trucks do. The previously underexplored relationship between households and air quality drew focus today at the 2019 AAAS Annual Meeting in Washington, D.C., where researchers from CU Boulder's Cooperative Institute for Research in Environmental Sciences (CIRES) and the university's Department of Mechanical Engineering presented their recent findings during a panel discussion. "Homes have never been considered an important source of outdoor air pollution, and the moment is right to start exploring that," said Marina Vance, an assistant professor of mechanical engineering at CU Boulder. "We wanted to know: How do basic activities like cooking and cleaning change the chemistry of a house?" In 2018, Vance co-led the collaborative HOMEChem field campaign, which used advanced sensors and cameras to monitor the indoor air quality of a 1,200-square-foot manufactured home on the University of Texas at Austin campus. Over the course of a month, Vance and her colleagues conducted a variety of daily household activities, including cooking a full Thanksgiving dinner in the middle of the Texas summer. While the HOMEChem experiment's results are still pending, Vance said it is apparent that homes need to be well ventilated during cooking and cleaning, because even basic tasks like boiling water over a stovetop flame can contribute to high levels of gaseous air pollutants and suspended particulates, with negative health impacts. To her team's surprise, the measured indoor concentrations were high enough that their sensitive instruments needed to be recalibrated almost immediately. "Even the simple act of making toast raised particle levels far higher than expected," Vance said. "We had to go adjust many of the instruments." Indoor and outdoor experts are collaborating to paint a more complete picture of air quality, said Joost de Gouw, a CIRES visiting professor. Last year, de Gouw and his colleagues published results in the journal Science showing that regulations on automobiles had pushed transportation-derived emissions down in recent decades, while the relative importance of household chemical pollutants had only gone up. "Many traditional sources like fossil fuel-burning vehicles have become much cleaner than they used to be," said de Gouw. 
"Ozone and fine particulates are monitored by the EPA, but data for airborne toxins like formaldehyde and benzene and compounds like alcohols and ketones that originate from the home are very sparse."While de Gouw says that it is too early on in the research to make recommendations on policy or consumer behavior, he said that it's encouraging that the scientific community is now thinking about the "esosphere," derived from the Greek word 'eso,' which translates to 'inner.'"There was originally skepticism about whether or not these products actually contributed to air pollution in a meaningful way, but no longer," de Gouw said. "Moving forward, we need to re-focus research efforts on these sources and give them the same attention we have given to fossil fuels. The picture that we have in our heads about the atmosphere should now include a house." | Pollution | 2,019 |
February 17, 2019 | https://www.sciencedaily.com/releases/2019/02/190217115852.htm | Tiny fibers create unseen plastic pollution | While the polyester leisure suit was a 1970s mistake, polyester and other synthetic fibers like nylon are still around and are a major contributor to the microplastics load in the environment, according to a Penn State materials scientist, who suggests switching to biosynthetic fibers to solve this problem. | "These materials, during production, processing and after use, break down into and release microfibers that can now be found in everything and everyone," said Melik Demirel, Lloyd and Dorothy Foehr Huck Endowed Chair in Biomimetic Materials. Unlike natural fibers such as wool, cotton and silk, current synthetic fibers are petroleum-based products and are mostly not biodegradable. While natural fibers can be recycled and biodegrade, mixed fibers that contain natural and synthetic fibers are difficult or costly to recycle. Islands of floating plastic trash in the oceans are a visible problem, but the pollution produced by textiles is invisible and ubiquitous. In the oceans, these microscopic plastic pieces become incorporated into plants and animals. Harvested fish carry these particles to market, and when people eat them, they consume microplastic particles as well. Demirel suggested four possible approaches to solving this problem today (Feb. 16) at the 2019 annual meeting of the American Association for the Advancement of Science in Washington, D.C. The first is to minimize the use of synthetic fibers and switch back to natural fibers such as wool, cotton, silk and linen. However, synthetic fibers are less expensive, and natural fibers have other environmental costs, such as water and land-use issues. Second, because much of the microfiber load that ends up in water sources comes from laundering, he suggests aftermarket filters for washing-machine outflow hoses. Clothes dryers have filters that catch lint -- also microfiber waste -- but current front-loading washing machines usually do not. "Capturing the microplastics at the source is the best filtering option," said Demirel. Third, he notes that bacteria that consume plastics do exist, but they are currently at the academic research phase, which takes some time to gain industrial momentum. If used on a large scale, such bacteria could aid in biodegradation of the fibers or break the fibers down to be reused. While these three options are possible, they do not solve the problem of the tons of synthetic fibers currently used in clothing around the world. Biosynthetic fibers, the fourth option, are both recyclable and biodegradable and could directly substitute for synthetic fibers. They could also be blended with natural fibers to provide the durability of synthetic fibers while allowing the blends to be recycled. Derived from natural proteins, biosynthetic fibers can also be manipulated to have desirable characteristics. Demirel, who developed a biosynthetic fiber composed of proteins similar to silk but inspired by those found in squid ring teeth, suggests that by altering the number of tandem repeats in the protein sequences, the polymers can be tuned to achieve a variety of properties. For example, material manufactured from biosynthetic squid ring-teeth proteins, called Squitex, is self-healing: broken fibers or sections will reattach with water and a little pressure, and as a blend the material enhances the mechanical properties of recycled cotton. 
Because the fibers are organic, they are also completely biodegradable. The Army Research Office, the Air Force Office of Scientific Research and the Office of Naval Research supported the development of the squid-inspired biosynthetic material. Demirel is the co-founder of a company planning to commercialize Squitex. | Pollution | 2,019
February 14, 2019 | https://www.sciencedaily.com/releases/2019/02/190214084617.htm | Research forms complex picture of mercury pollution in a period of global change | Climate change and the loss of wetlands may contribute to increased mercury concentrations in coastal fish, according to a Dartmouth College study. | The finding implies that forces directly associated with global change -- including increased precipitation and land use modifications -- will raise the levels of the toxic metal that enter the marine food chain. Estuaries, including coastal wetlands, provide much of the seafood that is harvested for human consumption and also serve as important feeding grounds for larger marine fish. The study was published in late December. "Estuaries provide habitat for the fish that feed our families," said Celia Chen, director of the Dartmouth Toxic Metals Superfund Research Program. "It's important to understand how mercury acts within our environment, particularly under increasing climate and land use pressures." The Dartmouth study concludes that higher levels of mercury, and of its toxic form methylmercury, are associated with higher organic carbon in coastal waters, and that this results in higher levels of mercury in the fish that frequent these waters. While the amount of mercury deposited from the atmosphere has decreased in parts of the U.S., elevated levels of the pollutant already accumulated in soils from past deposition may be released into rivers and streams with the heavier rains associated with climate change. At the same time, levels of organic carbon reaching estuaries in the region are also predicted to rise with rainfall. Since organic carbon binds with mercury, these increases may bring higher levels of mercury to estuarine waters. The research is particularly relevant in the Northeast, given the high levels of existing mercury accumulated in the landscape and the increase in rainfall. "Taken as a whole, these ecosystem changes have unpredictable effects on methylmercury bioaccumulation," said Vivien Taylor, a research scientist in the Department of Earth Science at Dartmouth and lead author of the study. The study investigated methylmercury in intertidal zones at three different latitudes and temperature regions in the northeastern United States: Mount Desert Island, Maine; Long Island Sound, Connecticut; and the upper Chesapeake Bay, Maryland. The researchers studied mussels, periwinkles, shrimp, crabs and a variety of fish to confirm how carbon and methylmercury interact to enter the food chain in estuaries. In addition to chemical signatures, the team factored in parameters including fish size, tidal conditions and background levels of mercury contamination at the regional sites. The study helps untangle the competing processes that affect the levels of mercury that end up in fish. "The interactions between climate change and mercury are complex and require much more study. There is, however, increasing evidence that some factors related to climate change might aggravate mercury pollution," said Chen. The loss of coastal wetlands further complicates the picture. While wetlands can be a source of methylmercury, organic carbon from wetlands may help reduce the amount of mercury that passes into fish and other aquatic species. 
The conversion of wetlands to other uses means that nature loses an important buffer that keeps mercury out of the food supply. The study finds that organic carbon's ability to reduce the amount of methylmercury available to estuarine fish can be overwhelmed by the increased watershed inputs of dissolved organic carbon and methylmercury associated with climate change. The result is increased methylmercury levels in estuarine species, like silversides and mummichogs, which are food for larger fish like striped bass. | Pollution | 2,019
February 13, 2019 | https://www.sciencedaily.com/releases/2019/02/190213124422.htm | Upcycling plastic bags into battery parts | Plastic bag pollution has become a huge environmental problem, prompting some cities and countries to heavily tax or ban the sacks. But what if used plastic bags could be made into higher-value products? Now, researchers have reported a new method to convert plastic bags into carbon chips that could be used as anodes for lithium-ion batteries. | Many plastic bags are used only once and then discarded, ending up in landfills, oceans and elsewhere in the environment, where they can take hundreds of years to decompose. Scientists have long recognized that the polyethylene in plastic bags could be an inexpensive source of energy-storing carbon. However, previous methods to upcycle polyethylene into pure carbon have been inefficient or have required expensive, complex processes. Vilas Pol and colleagues wanted to develop a simpler yet efficient approach to convert plastic waste into useful carbon-containing materials. The researchers immersed polyethylene plastic bags in sulfuric acid and sealed them inside a solvothermal reactor, which heated the sample to just below polyethylene's melting temperature. This treatment added sulfonic acid groups to the polyethylene's carbon-carbon backbone so that the plastic could be heated to a much higher temperature without vaporizing into hazardous gases. They then removed the sulfonated polyethylene from the reactor and heated it in a furnace under an inert atmosphere to produce pure carbon. The team ground the carbon into a black powder and used it to make anodes for lithium-ion batteries. The resulting batteries performed comparably to commercial batteries. | Pollution | 2,019
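For context on the anode claim, the benchmark such carbons are judged against is graphite's theoretical capacity, a textbook Faraday's-law calculation for the fully lithiated phase LiC₆ -- general electrochemistry, not a figure from the paper:

```python
FARADAY = 96485.0   # C/mol
M_C6 = 6 * 12.011   # g/mol: six carbon atoms host one Li in LiC6

# One electron transferred per LiC6 formula unit; 1 mAh = 3.6 C
capacity_mah_per_g = FARADAY / (3.6 * M_C6)
print(f"graphite theoretical capacity: {capacity_mah_per_g:.0f} mAh/g")  # ~372
```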