January 27, 2020
https://www.sciencedaily.com/releases/2020/01/200127134736.htm
Children to bear the burden of negative health effects from climate change
The grim effects that climate change will have on pediatric health outcomes were the focus of a recently published "Viewpoint" article.
Pacheco, an associate professor of pediatrics at McGovern Medical School at UTHealth, along with professors from Johns Hopkins Medicine and the George Washington University, authored a series of articles that detail how increased temperatures due to climate change will negatively affect the health of humanity. In the article authored by Pacheco, she shines a light on the startling effects the crisis has on children's health before they are even born.

Pacheco points to research published by the Intergovernmental Panel on Climate Change, which highlights several ways humans will experience adverse health effects from climate change, such as increased mortality and morbidity due to heat waves and fires, increased risk of food- and water-borne illnesses, and malnutrition due to food scarcity. These negative experiences bring with them psychological trauma and mental health issues that can affect both children and their caretakers. Pacheco wrote that after Hurricane Maria in 2017, many adults in Puerto Rico experienced post-traumatic stress disorder, depression, and anxiety from living weeks and months without access to necessities such as clean water, electricity, and basic medical care. "Some were not capable of meeting the physical and emotional demands that such a disaster imposed on their children," Pacheco wrote.

The negative health effects inflicted by the climate crisis can begin while a child is still in utero, due to maternal stress, poor nutrition, exposure to air pollution, and exposure to extreme weather events brought on by climate change. Studies of women who experienced major flooding events while pregnant reported an association with outcomes such as preterm birth and low birth weight.

Pacheco wrote that pregnant women exposed to climate change experience stress, respiratory disease, poor nutrition, increased infections, heat-associated illnesses, and poverty. "We will continue to see an increase in heat-associated conditions in children, such as asthma and Lyme disease, as well as an increase in congenital heart defects," Pacheco said.

Pacheco wrote that the picture painted by research on climate change is daunting and that now is not the time for indifference. In the article's conclusion, she wrote that everyone in the medical community must reflect on a personal level about what can be done with the knowledge they have on climate change and its negative health effects. "We cannot act as if we are immune to these threats," she said. "We can jump to action or stand in complacent indifference."

The series of articles was authored by experts in their fields, including Rexford Ahima, MD, PhD, and Arturo Casadevall, MD, PhD, MS, of Johns Hopkins Medicine; and William Dietz, MD, PhD, of the George Washington University.
Pollution
2020
January 24, 2020
https://www.sciencedaily.com/releases/2020/01/200124155107.htm
High air pollution exposure in 1-year-olds linked to structural brain changes at age 12
A new study suggests that significant early childhood exposure to traffic-related air pollution (TRAP) is associated with structural changes in the brain at the age of 12.
The Cincinnati Children's Hospital Medical Center study found that children with higher levels of TRAP exposure at birth had reductions at age 12 in gray matter volume and cortical thickness as compared to children with lower levels of exposure.

"The results of this study, though exploratory, suggest that where you live and the air you breathe can affect how your brain develops," says Travis Beckwith, PhD, a research fellow at Cincinnati Children's and lead author of the study. "While the percentage of loss is far less than what might be seen in a degenerative disease state, this loss may be enough to influence the development of various physical and mental processes."

Gray matter includes regions of the brain involved in motor control as well as sensory perception, such as seeing and hearing. Cortical thickness reflects the outer gray matter depth. The study found that specific regions in the frontal and parietal lobes and the cerebellum were affected, with decreases on the order of 3 to 4 percent. "If early life TRAP exposure irreversibly harms brain development, structural consequences could persist regardless of the time point for a subsequent examination," says Dr. Beckwith.

The study, which is published online, drew on volunteers in the CCAAPS who had either high or low levels of TRAP exposure during their first year of life. The researchers estimated exposure using an air sampling network of 27 sites in the Cincinnati area, and 24/7 sampling was conducted simultaneously at four or five sites over different seasons. Participating children and their caregivers completed clinic visits at ages 1, 2, 3, 4, 7 and 12.

Previous studies of TRAP suggest that it contributes to neurodegenerative diseases and neurodevelopmental disorders. This work supports that TRAP changes brain structure early in life. Senior author of the study is Kim Cecil, PhD, a scientist at the Cincinnati Children's Imaging Research Center.
Pollution
2020
January 23, 2020
https://www.sciencedaily.com/releases/2020/01/200123152616.htm
Living near major roads linked to risk of dementia, Parkinson's, Alzheimer's and MS
Living near major roads or highways is linked to higher incidence of dementia, Parkinson's disease, Alzheimer's disease and multiple sclerosis (MS), suggests new research published this week.
Researchers from the University of British Columbia analyzed data for 678,000 adults in Metro Vancouver. They found that living less than 50 metres from a major road or less than 150 metres from a highway is associated with a higher risk of developing dementia, Parkinson's, Alzheimer's and MS -- likely due to increased exposure to air pollution. The researchers also found that living near green spaces, like parks, has protective effects against developing these neurological disorders.

"For the first time, we have confirmed a link between air pollution and traffic proximity with a higher risk of dementia, Parkinson's, Alzheimer's and MS at the population level," says Weiran Yuchi, the study's lead author and a PhD candidate in the UBC school of population and public health. "The good news is that green spaces appear to have some protective effects in reducing the risk of developing one or more of these disorders. More research is needed, but our findings do suggest that urban planning efforts to increase accessibility to green spaces and to reduce motor vehicle traffic would be beneficial for neurological health."

Neurological disorders -- a term that describes a range of disorders, including Alzheimer's disease and other dementias, Parkinson's disease, multiple sclerosis and motor neuron diseases -- are increasingly recognized as one of the leading causes of death and disability worldwide. Little is known about the risk factors associated with neurological disorders, the majority of which are incurable and typically worsen over time.

For the study, researchers analyzed data for 678,000 adults between the ages of 45 and 84 who lived in Metro Vancouver from 1994 to 1998 and during a follow-up period from 1999 to 2003. They estimated individual exposures to road proximity, air pollution, noise and greenness at each person's residence using postal code data.
During the follow-up period, the researchers identified 13,170 cases of non-Alzheimer's dementia, 4,201 cases of Parkinson's disease, 1,277 cases of Alzheimer's disease and 658 cases of MS. For non-Alzheimer's dementia and Parkinson's disease specifically, living near major roads or a highway was associated with a 14 per cent and a seven per cent increased risk, respectively.

Due to relatively low numbers of Alzheimer's and MS cases in Metro Vancouver compared to non-Alzheimer's dementia and Parkinson's disease, the researchers did not identify associations between air pollution and increased risk of these two disorders. However, they are now analyzing Canada-wide data and are hopeful the larger dataset will provide more information on the effects of air pollution on Alzheimer's disease and MS.

When the researchers accounted for green space, they found the effect of air pollution on the neurological disorders was mitigated. The researchers suggest that this protective effect could be due to several factors. "For people who are exposed to a higher level of green space, they are more likely to be physically active and may also have more social interactions," said Michael Brauer, the study's senior author and professor in the UBC school of population and public health. "There may even be benefits from just the visual aspects of vegetation."

Brauer added that the findings underscore the importance for city planners to ensure they incorporate greenery and parks when planning and developing residential neighbourhoods. The study was co-authored by Hind Sbihi, Hugh Davies, and Lillian Tamburic in the UBC school of population and public health.
Pollution
2020
January 23, 2020
https://www.sciencedaily.com/releases/2020/01/200123152606.htm
2018's Four Corners drought directly linked to human-caused climate change
The western United States has experienced such intense droughts over the past decade that technical descriptions are becoming inadequate. In many places, conditions are rocketing past "severe," through "extreme," all the way to "exceptional drought."
The 2018 Four Corners drought -- centered on the junction between Arizona, Utah, Colorado and New Mexico -- put the region deep in the red. An abnormally hot spring and summer indicated that climate change was clearly at work, but that was about as much as most people could say of the situation at the time. Climate scientists from UC Santa Barbara's geography department have now distilled just how strong an effect human-induced warming had on that event, and their findings have now been published.

"I was really stunned at how big an effect we found with just a 2-degree warming," said Chris Funk, director of the university's Climate Hazards Center, a U.S. Geological Survey scientist and one of the study's coauthors. "The results were much more pronounced than we had expected," added lead author Emily Williams, a doctoral student in the Department of Geography. Her work focuses on end-to-end attribution, which determines exactly how much a specific natural event was exacerbated by climate change, and then links those changes to both the sources of greenhouse gasses and the impacts of warming on people and ecosystems. It's a challenging task that requires sophisticated climate models, comprehensive databases and the development of exciting new science.

Williams wanted to determine to what extent the Four Corners drought was exacerbated by climate change. To make the task more manageable, she limited the scope of her investigation to rising temperatures. Williams ran two simulations on leading climate models; the first calculated temperatures under the climate regime predating the industrial revolution, while the second did so under current, human-induced climate change. Subtracting the averages from the two simulations gave her a temperature difference she could attribute to human-induced warming. "We found that pretty much all of what made this the hottest year on record in that region was due to climate change," she said.

With that knowledge, Williams then set out to determine how higher temperatures affected aridity. For this, she looked at the local hydrology -- mainly snowpack and surface runoff -- and agropastoral conditions -- circumstances related to growing crops and raising livestock. She used the vegetation's greenness to judge the agropastoral conditions of the region.

Warmer air can hold more moisture than cold air. So, as temperatures rise, the air becomes thirstier, Williams explained. And thirstier air can suck more moisture out of the ground. The difference between the amount of water the air can absorb and the amount the land can provide is what scientists call the vapor pressure deficit. When the land can supply more than the air can hold -- as when the air temperature drops to a predawn low -- you get condensation, like the early morning dew. On the other hand, when the air is thirstier than the amount of water the ground can provide, it pulls moisture from the earth, drying it out. Warmer air over dry soils will be thirstier, leading to more rapid browning of fields and fodder.

The scientists used a statistical model to relate green vegetation to the vapor pressure deficit. Intuitively, warmer temperatures cause dryer conditions, which turn the vegetation brown. And that it did. The results indicated that the landscape would have been about 20% greener that year in the absence of climate change. This browner vegetation translated into poor harvests and low-quality fodder; impacts associated with more than $3 billion in economic losses, and disruptions in the lives and livelihoods of hundreds of thousands of Native Americans who were settled in reservations in the area. 2018 was a dry year for the entire Western U.S., but the Four Corners region was hit particularly hard. The worst of the drought sat squarely on the Navajo Nation reservation.

The team also looked at how much the region's snowpack was influenced by the higher temperatures. A robust snowpack ensures water availability late into the summer as it slowly melts away. Higher temperatures affect snowpack in two ways: They cause more precipitation to fall as rain rather than snow, and they make snow melt faster and earlier in the year, explained Shraddhanand Shukla, an associate researcher in the geography department and another of the paper's coauthors. The scientists simulated snow conditions in the region with and without climate change while keeping the total precipitation constant. They found that the March snowpack would have been 20% greater in the absence of climate change, even though this was the lowest precipitation year on record for the area. They expect the effect would have been even more pronounced in a wetter year.

These findings are a conservative estimate of climate change's influence on the drought, according to Williams. For one, the study only considered the impact human-induced warming had on temperatures. Climate change may also have influenced the region's low rainfall. What's more, there are strong feedback cycles between the atmosphere and the land, which the study left out. When the air soaks up all the available moisture in the soil, evaporation can no longer cool the ground. The result is a dramatic spike in ground temperature, which exacerbates the situation.

However, by keeping the methodology simple, the team made the task more manageable, the insights more understandable and the technique more transferable. They plan to apply the approach to more typical years in the Four Corners region, as well as to regions in Eastern Africa experiencing similar distress. The researchers stressed the urgency of understanding these systems. "This isn't projected climate change, it's current," said Williams. "There are things that are absolutely certain to happen due to climate change," Funk said. "And one thing that's absolutely certain is that saturation vapor pressure is going to go up. And this actually increases the intensity of both droughts and floods," he added.

When a region like the Four Corners experiences a drought, a warmer, thirstier atmosphere can wick away limited soil moisture more quickly. On the other hand, when a humid region, like Houston, experiences an extreme rainfall event, the warm, wet winds feeding it can hold more water, potentially causing catastrophic floods. "The moment that everybody in the world understands this is the moment that we will probably start doing something about climate change," Funk said.
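The vapor pressure deficit described above can be sketched numerically. The sketch below is illustrative only: it uses the standard Tetens approximation for saturation vapor pressure (a common textbook formula, not one taken from this study), and the temperature and humidity values are made up to show the direction of the effect.

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Saturation vapor pressure in kPa via the Tetens approximation."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit_kpa(temp_c: float, rel_humidity: float) -> float:
    """VPD: how much more water the air could hold than it currently does."""
    return saturation_vapor_pressure_kpa(temp_c) * (1.0 - rel_humidity)

# Illustrative values: a ~2-degree warming at fixed relative humidity
# makes the air measurably "thirstier," pulling more moisture from soil.
baseline = vapor_pressure_deficit_kpa(30.0, 0.30)
warmed = vapor_pressure_deficit_kpa(32.0, 0.30)
print(f"VPD at 30 C: {baseline:.2f} kPa")
print(f"VPD at 32 C: {warmed:.2f} kPa")  # larger deficit at higher temperature
```

Because saturation vapor pressure grows roughly exponentially with temperature, even a small warming noticeably widens the deficit, which is the mechanism Funk points to.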
Pollution
2020
January 23, 2020
https://www.sciencedaily.com/releases/2020/01/200123152604.htm
Mapping the cumulative health effects of environmental exposures
Over the last two decades, the health sciences have been transformed by genomics, which has provided insights into genetic risk factors for human disease. While powerful, the genomics revolution has also revealed the limits of genetic determinants, which account for only a fraction of total disease risk. A new journal article reviews progress toward mapping these nongenetic drivers of disease.
The article by researchers at Columbia University Mailman School of Public Health; Utrecht University, the Netherlands; University of Luxembourg; and Northeastern University reviews progress in assessing the components of the exposome and its implications for human health.

"Our genes are not our destiny, nor do they provide a complete picture of our risk for disease," says senior author Gary Miller, PhD, Vice Dean for Research Strategy and Innovation and professor of environmental health sciences at the Columbia Mailman School. "Our health is also shaped by what we eat and do, our experiences, and where we live and work."

"Less than half of the nongenetic risk burden for disease is accounted for, suggesting the existence of environmental risk factors, exposure to which may largely be preventable," says first author Roel Vermeulen, Professor of Environmental Epidemiology and Exposome Science at Utrecht University. "With growing recognition of the important role nongenetic factors play in disease, we need a coordinated and international effort to characterize the exposome at a scale comparable to that of the human genome."

The exposome was conceived by the scientist Christopher Wild in 2005 as a way to represent the environmental, nongenetic, drivers of health and disease. These exposures are not restricted to the thousands of chemicals that enter our bodies through the air, water, or food, but also include our body's response to our environment -- including the built environment and social circumstances -- through inflammation, oxidative stress, infections, and gut flora, for example.

Traditionally, our understanding of the health effects of chemicals has come from epidemiological and toxicological studies that analyze one or a small number of pollutants at a time. "However, our exposures are not a simple sum of a handful of chemicals," the authors write.

To capture a fuller picture of environmental exposures, scientists are beginning to employ environment-wide association studies (EWAS), the exposome equivalent of genome-wide association studies (GWAS). Complementing GWAS, EWAS studies take advantage of high-resolution mass spectrometry (HRMS) to measure small molecules originating in the environment, such as air pollution, pesticides, plasticizers, and flame retardants, as well as nutrients and biological metabolites.

"A reductionist approach might isolate the role of a single variable, but it will inadequately capture the complexity of the exposome," the authors write. "The challenge in understanding the role of the exposome on our health lies not only in the large number of chemical exposures in our daily lives, but also in the complex ways that they interact with cells."

Among the challenges to exposome research is that enrollment in studies of nongenetic environmental exposures remains relatively low. Sample sizes in excess of 100,000 are needed to explain a substantial portion of the genomic heritability of common chronic diseases, and the authors posit that similar or even greater sample sizes are required for future environmental studies. As a step in that direction, efforts are underway to create a Human Exposome Project representing environmental and biological exposures for tens of thousands of people -- large enough to identify the most prevalent and strongest chemical risk factors, although larger studies are needed to understand the impact of many exposome factors in combination.

In addition to sample size, the authors call for improvements in screening technology and data resources to identify associations; network theory to elucidate the constellation of the chemical environment and its biological consequences; and replication in independent studies and the use of methods to establish causation.

Large-scale exposome studies will give regulatory bodies new information on those chemicals that have the largest adverse effects on health. "If systematic analysis reveals major adverse effects on human health from exposure to currently approved or potential replacement chemicals, then those compounds should be removed from the marketplace," the authors write. Moreover, data on the effects of classes of chemicals on specific biological pathways known to be perturbed could help in the design of new compounds with minimal impact on human health and the environment. "Current research approaches and regulatory policies fail to address the chemical complexity of our world," the authors write.

In the realm of medicine, more complete information on the impact of nongenetic factors and chemical exposures would also enable the creation of an exposome risk score (ERS), akin to the polygenic risk score (PRS), which could give individuals and their clinicians a better understanding of their likelihood of developing certain diseases. "Consolidating knowledge garnered from GWAS and EWAS," the authors conclude, "would allow us to map the gene and environment interface, which is where nature meets nurture and chemistry meets biology."
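To make the ERS/PRS analogy concrete, here is a minimal sketch of how such a score might be computed. Everything in it is hypothetical: the exposure names, effect weights, population statistics, and the simple weighted-sum form are illustrative assumptions, not the authors' method.

```python
# Hypothetical exposome risk score (ERS): a weighted sum of standardized
# exposure levels, by analogy with a polygenic risk score. All exposures,
# weights, and population statistics below are invented for illustration.

# Per-exposure effect weights (e.g., risk per standard deviation of exposure),
# as might be estimated from an environment-wide association study.
WEIGHTS = {"pm2_5": 0.30, "pesticide_x": 0.12, "plasticizer_y": 0.05}

# Assumed population mean and standard deviation for each exposure.
POP_STATS = {"pm2_5": (10.0, 4.0), "pesticide_x": (1.0, 0.5), "plasticizer_y": (3.0, 2.0)}

def exposome_risk_score(measurements: dict) -> float:
    """Sum of z-scored exposures, each multiplied by its effect weight."""
    score = 0.0
    for name, weight in WEIGHTS.items():
        mean, sd = POP_STATS[name]
        z = (measurements[name] - mean) / sd  # standardize the exposure
        score += weight * z
    return score

person = {"pm2_5": 18.0, "pesticide_x": 1.5, "plasticizer_y": 2.0}
print(f"ERS: {exposome_risk_score(person):+.3f}")  # positive = above-average burden
```

A real ERS would need causally estimated weights and would have to handle correlated exposures and interactions, which is exactly the complexity the authors say a reductionist sum cannot capture.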
Pollution
2020
January 23, 2020
https://www.sciencedaily.com/releases/2020/01/200123080716.htm
Exposure to diesel exhaust particles linked to pneumococcal disease susceptibility
A new study has linked exposure to diesel exhaust particles (DEPs) to increased susceptibility to pneumococcal disease.
To find out more about the conditions that allow this ordinarily harmless bacterium to progress to such severe invasive diseases, researchers from the University of Liverpool, Queen Mary University of London and Trinity College Dublin conducted a study examining the role of DEPs in the development of pneumococcal disease.

The World Health Organisation (WHO) estimates that air pollution is responsible for 7 million deaths per year, with 7% of these attributable to pneumonia. An estimated 37% of the world's population live in areas where levels of airborne pollution exceed WHO guideline limits. DEPs, a major component of air pollution worldwide, are the particulate component of diesel exhaust, which includes diesel soot and aerosols such as ash particulates, metallic abrasion particles, sulphates, and silicates.

The researchers, led by Professor Aras Kadioglu from the University of Liverpool's Institute of Infection & Global Health, used a combination of mouse models and lab-based assays in both mouse and human cells to provide insight into the link between DEP exposure and pneumococcal disease. They found that following exposure to DEPs, airway macrophages -- key immune cells for controlling bacterial infections and removing debris from the body -- become congested with DEPs, reducing their ability to kill the pneumococcus. This allows the bacteria to survive more easily in the airways, invade the lungs, and cause significant inflammation, which eventually leads to bacterial translocation into the blood, thereby causing severe disease.

Professor Kadioglu said: "We know that exposure to air pollution is harmful, responsible for millions of deaths every year, of which a significant proportion is due to pneumonia. What we did not know, however, was how pollution, such as diesel exhaust particles, actually causes airway disease. In this study, we have now discovered the cellular mechanisms behind this. Our study highlights an urgent need to tackle airway pollution if we are to reduce life-threatening respiratory diseases such as pneumonia."

Dr Rebecca Shears, who is first author, added: "Our study shows that exposure to DEPs, which is a major airborne particulate pollutant both here in the UK and abroad, may be one of the key factors involved in the switch from harmless pneumococcal colonisation of the nasal tissues to severe disease, such as pneumonia. Our data provides further insight to support previous observations of increased pneumonia hospital admissions in countries such as China, where airborne pollution levels are highest. The reduced ability of DEP-exposed airway macrophages to control the infection appears to be key in the increased number of cases of pneumococcal disease. This study adds further impetus to reduce global pollution levels."
Pollution
2020
January 22, 2020
https://www.sciencedaily.com/releases/2020/01/200122080607.htm
Air pollution in New York City linked to wildfires hundreds of miles away
A new study shows that air pollutants from the smoke of fires as far away as Canada and the southeastern U.S. traveled hundreds of miles over several days to reach Connecticut and New York City, where they caused significant increases in pollution concentrations.
The study was published 21 January in a European Geosciences Union (EGU) journal. Biomass burning, which occurs on a large scale during wildfires and some controlled burns, is a major source of air pollutants that impact air quality, human health, and climate. These events release numerous gases into the atmosphere and produce particulate matter (PM), including black carbon (BC) and other primary organic aerosols (POA) with a diameter of less than 2.5 micrometers. Known as PM2.5, this material has been shown to have particularly serious health effects when inhaled.

While more reactive components are often chemically transformed closer to their place of origin, PM2.5 tends to last longer. In the case of this study, that allowed much of it to travel from the fires to the monitoring sites -- a journey ranging from a few days to about a week. "Given the sensitivity of people to the health effects emerging from exposure to PM2.5, this is certainly something that needs to be considered as policy-makers put together long-term air quality management plans," Gentner said.

The impacts of wildfire smoke will likely become increasingly important in the coming years. "When people are making predictions about climate change, they're predicting increases in wildfires, so this sort of pollution is likely going to become more common," said lead author Haley Rogers, who was an undergraduate student when the study was conducted. "So when people are planning for air pollution and health impacts, you can't just address local sources."

Although the levels of PM2.5 decreased over time and distance, co-author Jenna Ditto, a graduate student in Gentner's lab, noted that awareness of its presence in the atmosphere is critical to public health. "Studies indicate that there are no safe levels of PM2.5, so typically any level of it is worth taking a look at," she said.
Pollution
2020
January 22, 2020
https://www.sciencedaily.com/releases/2020/01/200122080600.htm
Missing piece to urban air quality puzzle
Despite the prominent health threat posed by fine particulate pollution, fundamental aspects of its formation and evolution continue to elude scientists.
This is true especially for the organic fraction of fine particles (also called aerosol), much of which forms as organic gases are oxidized by the atmosphere. Computer models under-predict this so-called "secondary" organic aerosol (SOA) in comparison to field measurements, indicating that the models are either missing some important sources or failing to describe the physical processes that lead to SOA formation.

New research from Carnegie Mellon University in collaboration with the National Oceanic and Atmospheric Administration (NOAA) sheds light on an under-appreciated source of SOA that may help close this model-measurement gap. "Our experiment shows that, in areas where you have a lot of people, you can only explain about half of the SOA seen in the field with the traditional emissions from vehicles and trees," said Albert Presto, a professor in mechanical engineering and the study's corresponding author. "We attribute that other half to these non-traditional VOCs."

In 2018, researchers from NOAA made a splash with a journal article highlighting these non-traditional volatile organic compounds (VOCs). "It's a lot of everyday stuff that we use," said Presto. "Anything you use that is scented contains organic molecules, which can get out into the atmosphere and react" where it can form SOA.

The prevalence of these VOCs represents a paradigm shift in the urban SOA picture. The transportation sector had long been the dominant source of VOCs in urban air, but vehicle emissions in the U.S. have decreased drastically (up to 90%) due to tailpipe regulations in recent decades, even as fuel consumption has risen. As transportation-related VOCs have faded in prominence, non-traditional VOCs have begun to make up a greater relative contribution to the urban atmosphere.

While NOAA's research alerted the atmospheric science community to the magnitude of non-traditional VOCs in urban environments, the researchers could only hypothesize that these gases were likely important for SOA formation; the idea still needed to be tested. Testing how much SOA forms from these gases is not an easy task, however. SOA formation in the atmosphere plays out over the course of several days, making it difficult to track the journey of emitted gases as they are dispersed by winds and begin reacting with sunlight and other oxidants.

Rishabh Shah, a graduate student who studied with Presto and now works at NOAA, constructed a reactor to evaluate the full potential for SOA formation within a sample of air without having to track that air over time. "The reactor is kind of like an app on your smartphone for SOA formation," said Shah. "You take your picture and the app shows you what you would look like a decade from now."

The reactor accelerates the meandering journey a gas takes by bombarding it with oxidants at much higher concentrations than are found in the atmosphere. This physically simulates, in just a few seconds, all of the reactions a gas molecule is subject to in the atmosphere over the course of a week. In just a moment's time, Shah's reactor can evaluate the full potential of the air it samples to form SOA.

The team mounted their reactor in a van, creating a mobile platform from which they could access air from different settings containing varying levels of non-traditional VOCs. These locations included sites downwind from a large industrial facility, next to a construction site, within the deep 'street canyons' created by the skyscrapers of a city center, and among the low-rise buildings of an urban neighborhood.

In places with large amounts of non-traditional VOCs, the reactor formed large amounts of SOA. These locations included both the downtown street canyons and the urban low-rises, both places where evaporation of consumer products like deodorants and conditioners is high, especially in the morning. Advanced gas analyzers aboard the mobile platform allowed the team to detect the presence of many of these non-traditional VOCs.

Importantly, in these locations the standard state-of-the-art computer models could not predict the full amount of SOA the team observed in their reactor. However, in other environments with fewer non-traditional VOCs, the model was able to accurately predict how much SOA formed in the reactor. Together, these pieces of evidence form a compelling argument that non-traditional VOC emissions are responsible for a significant amount of urban SOA. Presto estimates that these non-traditional emissions have roughly the same contribution as transportation and biosphere emissions combined, in line with the hypothesis put forward by NOAA.

"Traditionally, we've focused a lot on power plants and vehicles for air quality, which have gotten way cleaner in the U.S.," said Presto. "What that means is that now, a substantial amount of the SOA is coming from this other 'everyday, everywhere' category that hasn't really been considered until recently."
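The reactor's "fast-forward" trick can be expressed as simple arithmetic: equivalent atmospheric aging is the reactor's oxidant exposure (concentration times residence time) divided by a typical ambient oxidant concentration. The sketch below uses generic illustrative numbers, not figures from this study; the ambient hydroxyl (OH) value is a commonly cited daytime average, and the reactor values are assumptions.

```python
# Oxidation-flow-reactor "photochemical age" arithmetic (illustrative).
# Exposure = OH concentration * contact time; equal exposures imply
# equivalent chemical aging, so seconds at very high OH concentration
# can stand in for days of ambient atmospheric processing.

AMBIENT_OH = 1.5e6      # molecules/cm^3, a commonly cited daytime average
reactor_oh = 5.0e9      # molecules/cm^3 inside the reactor (assumed value)
residence_s = 120.0     # seconds the sampled air spends in the reactor (assumed)

exposure = reactor_oh * residence_s            # molecules*s/cm^3
equivalent_seconds = exposure / AMBIENT_OH     # time at ambient OH giving same exposure
equivalent_days = equivalent_seconds / 86_400

print(f"OH exposure: {exposure:.2e} molecules*s/cm^3")
print(f"Equivalent ambient aging: {equivalent_days:.1f} days")
```

With these assumed numbers, two minutes in the reactor corresponds to several days of ambient oxidation, which is the scale of compression the article describes ("a week" of chemistry in seconds).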
Pollution
2020
January 22, 2020
https://www.sciencedaily.com/releases/2020/01/200122080519.htm
Mortality costs of air pollution in US
A team of University of Illinois researchers estimated the mortality costs associated with air pollution in the U.S. by developing and applying a novel machine learning-based method to estimate the life-years lost and cost associated with air pollution exposure.
Scholars from the Gies College of Business at Illinois studied the causal effects of acute fine particulate matter exposure on mortality, health care use and medical costs among older Americans through Medicare data and a unique way of measuring air pollution via changes in local wind direction. The researchers -- Tatyana Deryugina, Nolan Miller, David Molitor and Julian Reif -- calculated that the reduction in particulate matter experienced between 1999-2013 resulted in elderly mortality reductions worth $24 billion annually by the end of that period. Garth Heutel of Georgia State University and the National Bureau of Economic Research was a co-author of the paper. "Our goal with this paper was to quantify the costs of air pollution on mortality in a particularly vulnerable population: the elderly," said Deryugina, a professor of finance who studies the health effects and distributional impact of air pollution. "Understanding how air pollution affects mortality, health care use and medical costs is essential for crafting efficient environmental policies because outside factors such as a person's preexisting health conditions can make it challenging to accurately estimate the causal effects of pollution on health." About 25% of the elderly Medicare population was vulnerable to acute pollution shocks, according to the researchers. "Our analysis shows that the most vulnerable Medicare beneficiaries are those who suffer from chronic conditions and have high health care spending," said Reif, a professor of finance and a faculty member of the Institute of Government and Public Affairs. "We estimate that members of the most vulnerable group -- those with a life expectancy of less than one year -- are over 30 times more likely to die from pollution than the typical Medicare beneficiary." "Because we take a big data approach, we're able to see how air pollution affects the entire elderly population of the U.S.
over those 14 years," said Miller, the Daniel and Cynthia Mah Helle Professor of Finance. "Medicare data is great because it has every interaction with the health care system in our sample for virtually every elderly person." The typical air pollution research is more of a case study, Miller said. "There's a pollution event in a certain city, and there's a mortality count around this event, but it's hard to get an accurate general estimate of the overall impact," he said. "Pollution is produced as a package: You burn stuff and it produces particulate matter, but it also produces other pollutants. Our methodology is able to take a lot of data, people and pollution events into account. And that allows us to more accurately identify the overall impact of pollution, because wind patterns affect these different pollutants in different ways. So we can tease apart which of these pollutants we think is most important and driving these mortality effects." By exploiting the daily variation in acute fine particulate pollution exposure driven by changes in wind direction, the researchers found significant effects of exposure on mortality, hospitalizations and medical spending. "A key part of the study was harnessing 40 billion observations with machine learning techniques," said Molitor, a professor of finance. "We used machine learning to predict how long people would have lived in the absence of the pollution event and to illuminate who is most vulnerable to pollution. One takeaway is that an individual's life expectancy -- how much longer they can expect to live -- is a much better measure of vulnerability to pollution than their age." The scholars also found that increases in particulate matter lead to more emergency room visits, hospitalizations and higher patient spending. "Mortality is only one of many potential costs of air pollution," Molitor said.
"The elderly who aren't dying may engage in other costly activities such as going to the hospital for preventive or emergency care. Those steps may help them avoid death, but it doesn't mean that pollution has no cost to their health or finances." Notably, the researchers also found that the failure to adjust for the preexisting health of those who die from an acute pollution event tends to overstate the mortality-reduction benefits of decreasing air pollution. "An issue that arises when estimating mortality effects is whether those who die from pollution exposure would have passed away soon anyway without that external pollution shock," Molitor said. "If deaths caused by pollution occur disproportionately among the least healthy, then ignoring this factor could lead to an overstatement of the life years lost due to pollution." "But we found that the typical person who dies as a result of pollution exposure isn't someone you would expect to die in a week or a month," Miller said. "It's people who have 3.6 years on average to live, compared with about 11 years for the typical elderly Medicare enrollee. So, although they are less healthy than the average Medicare recipient, these are people who we expect to have three and a half reasonably healthy years of life, and this should definitely not be ignored." "Another way of thinking about our characterization of who dies from pollution is as an index of vulnerability," Molitor said. "We want to protect people from pollution, and we could do that by reducing pollution levels. But that can be costly and difficult for local governments to implement, especially if pollution is caused by something far away. By understanding who is most vulnerable to pollution, local policies and actions can be designed to better protect lives and to improve population resilience to pollution events."
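The life-years framing described above lends itself to back-of-envelope arithmetic. The sketch below is purely illustrative: the 3.6- and 11-year remaining-life figures come from the article, while the death count and the dollar value per life-year are hypothetical placeholders, not the study's numbers.

```python
# Illustrative sketch (not the paper's actual model): valuing pollution-related
# mortality by life-years lost rather than by death counts alone.

def mortality_cost(deaths, years_remaining_per_death, value_per_life_year):
    """Total cost = deaths x average remaining life-years x $/life-year."""
    return deaths * years_remaining_per_death * value_per_life_year

# Hypothetical inputs (assumed for illustration only)
deaths = 10_000                 # pollution-attributable deaths
value_per_life_year = 100_000   # dollars per life-year

# Ignoring preexisting health overstates life-years lost per death:
naive = mortality_cost(deaths, 11.0, value_per_life_year)    # typical enrollee
adjusted = mortality_cost(deaths, 3.6, value_per_life_year)  # those who actually die

print(naive, adjusted)  # the naive estimate is roughly 3x the adjusted one
```

The point of the comparison is the ratio, not the dollar figures: using the remaining life expectancy of the average enrollee instead of that of the people who actually die inflates the estimated cost by about a factor of three.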
Pollution
2020
January 16, 2020
https://www.sciencedaily.com/releases/2020/01/200116155436.htm
Mix of stress and air pollution may lead to cognitive difficulties in children
Children with elevated exposure to early life stress in the home and elevated prenatal exposure to air pollution exhibited heightened symptoms of attention and thought problems, according to researchers at Columbia University Mailman School of Public Health and Columbia Psychiatry. Early life stress is common in youth from disadvantaged backgrounds who also often live in areas with greater exposure to air pollution.
Air pollution has negative effects on physical health, and recent work has begun to also show negative effects on mental health. Life stress, particularly early in life, is one of the best-known contributors to mental health problems. The new study is one of the first to examine the combined effects of air pollution and early life stress on school-age children. Results appear in the journal. "Prenatal exposure to polycyclic aromatic hydrocarbons, a neurotoxicant common in air pollution, seems to magnify or sustain the effects of early life social and economic stress on mental health in children," says first author David Pagliaccio, PhD, assistant professor of clinical neurobiology in psychiatry at Columbia Psychiatry. "Air pollutants are common in our environment, particularly in cities, and given socioeconomic inequities and environmental injustice, children growing up in disadvantaged circumstances are more likely to experience both life stress and exposure to neurotoxic chemicals," says senior author Amy Margolis, PhD, assistant professor of medical psychology in psychiatry at Columbia Psychiatry. "These exposures have a combined effect on poor mental health outcomes and point to the importance of public health programs that try to lessen exposure to these critical risk factors, to improve not only physical, but psychological health," says Julie Herbstman, PhD, associate professor of environmental health science and director of the Columbia Center for Children's Environmental Health at Columbia Mailman School of Public Health. Data were from the CCCEH Mothers and Newborns longitudinal birth cohort study in Northern Manhattan and the Bronx, which includes many participants who self-identify as African American or Dominican. Mothers wore an air monitoring backpack during the third trimester of pregnancy to measure exposure to air pollutants in their daily lives.
When their children were 5 years old, mothers reported on stress in their lives, including neighborhood quality, material hardship, intimate partner violence, perceived stress, lack of social support, and general distress levels. Mothers then reported on their child's psychiatric symptoms at ages 5, 7, 9 and 11. The combined effect of air pollution and early life stress was seen across several measures of thought and attention problems/ADHD at age 11. (Thought problems included obsessive thoughts and behaviors or thoughts that others find strange.) The effects were also linked to PAH-DNA adducts -- a dose-sensitive marker of air pollution exposure. The researchers say PAH and early life stress may serve as a "double hit" on shared biological pathways connected to attention and thought problems. Stress likely leads to wide-ranging changes in, for example, epigenetic expression, cortisol, inflammation, and brain structure and function. The mechanism underlying the effects of PAH is still being interrogated; however, alterations in brain structure and function represent possible shared mechanistic pathways. Earlier studies making use of the same longitudinal cohort data found that prenatal exposure to air pollution combines with material hardship to significantly increase ADHD symptoms in children. A separate study found a combination of air pollution and poverty lowered child IQ.
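The "double hit" idea corresponds, in statistical terms, to an interaction effect: symptoms are modeled on pollution exposure, stress, and their product, and a positive product term means the combination does more damage than the two exposures separately. The sketch below uses entirely synthetic data, not the cohort's measures or the study's actual model; it only shows how such an interaction term is estimated.

```python
# Minimal synthetic sketch of an exposure-by-stress interaction model.
# All data are simulated; this is not the CCCEH cohort analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 500
pah = rng.normal(size=n)     # standardized prenatal PAH exposure (synthetic)
stress = rng.normal(size=n)  # standardized early-life stress (synthetic)

# Simulate a symptom score with a true interaction coefficient of 0.5
symptoms = 0.3 * pah + 0.4 * stress + 0.5 * pah * stress \
    + rng.normal(scale=0.1, size=n)

# Design matrix: intercept, main effects, and the PAH x stress product term
X = np.column_stack([np.ones(n), pah, stress, pah * stress])
coef, *_ = np.linalg.lstsq(X, symptoms, rcond=None)
print(coef.round(2))  # the last entry recovers the interaction near 0.5
```

A coefficient on the product term that is reliably above zero is the quantitative signature of a "double hit": the effect of pollution grows with the level of stress, rather than simply adding to it.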
Pollution
2020
January 16, 2020
https://www.sciencedaily.com/releases/2020/01/200116132247.htm
How anti-sprawl policies may be harming water quality
Urban growth boundaries are created by governments in an effort to concentrate urban development -- buildings, roads and the utilities that support them -- within a defined area. These boundaries are intended to decrease negative impacts on people and the environment. However, according to a Penn State researcher, policies that aim to reduce urban sprawl may be increasing water pollution.
"What we were interested in was whether the combination of sprawl -- or lack of sprawl -- along with simultaneous agriculture development in suburban and rural areas could lead to increased water-quality damages," said Douglas Wrenn, a co-funded faculty member in the Institutes of Energy and the Environment. These water quality damages were due to pollution from nitrogen, phosphorus and sediment, three ingredients that in high quantities can cause numerous environmental problems in streams, rivers and bays. As a part of the EPA's Clean Water Act (CWA), total maximum daily loads (TMDL) govern how much of these pollutants are allowed in a body of water while still meeting water-quality standards. According to Wrenn, an associate professor in Penn State's College of Agricultural Sciences, one of the reasons anti-sprawl policies can lead to more water pollution is because higher-density development has more impervious surfaces, such as concrete. These surfaces don't absorb water but cause runoff. The water then flows into bodies of water, bringing sediment, nitrogen and phosphorus with it. Secondly, agriculture creates considerably more water pollution than low-density residential areas. And when development outside of the boundaries that could replace agriculture is prevented, the amount of pollution that could be reduced is lost. "If you concentrate development inside an urban growth boundary and allow agriculture to continue business as usual," Wrenn said, "then you could actually end up with anti-sprawl policies that lead to an increase in overall water quality damages." Wrenn said it is important for land-use planners in urban areas and especially in urbanizing and urban-fringe counties to understand this. The EPA's water quality regulation is divided between point source and nonpoint source polluters. Point source polluters include wastewater treatment facilities, big factories, consolidated animal feeding operations and stormwater management systems.
Nonpoint sources are essentially everything else. And the CWA does not regulate nonpoint sources, which include agriculture. "When it comes to meeting TMDL regulations, point source polluters will always end up being responsible," he said. "They are legally bound to basically do it all." Wrenn said point source polluters are very interested in getting nonpoint source polluters, specifically agriculture, involved in reducing pollution because their cost of reduction is usually far lower and oftentimes more achievable. "What our research has shown is that where land-use planners have some ability to manage where and when land-use development takes place, land-use policy can be a helper or a hindrance to meeting these TMDL regulations," Wrenn said.
Pollution
2020
January 15, 2020
https://www.sciencedaily.com/releases/2020/01/200115130448.htm
Air pollution from oil and gas production sites visible from space
Oil and gas production has doubled in some parts of the United States in the last two years, and scientists can use satellites to see impacts of that trend: a significant increase in the release of the lung-irritating air pollutant nitrogen dioxide, for example, and a more-than-doubling of the amount of gas flared into the atmosphere.
"We see the industry's growing impact from space," said Barbara Dix, a scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder and lead author of the new assessment published in an AGU journal. Dix and a team of U.S. and Dutch researchers set out to see if a suite of satellite-based instruments could help scientists understand more about nitrogen oxides pollution (including nitrogen dioxide) coming from engines in U.S. oil and gas fields. Combustion engines produce nitrogen oxides, which are respiratory irritants and can lead to the formation of other types of harmful air pollutants, such as ground-level ozone. On oil and gas drilling and production sites, there may be several small and large combustion engines, drilling, compressing gas, separating liquids and gases, and moving gas and oil through pipes and storage containers, said co-author Joost de Gouw, a CIRES Fellow and chemistry professor at CU Boulder. The emissions of those engines are not controlled. "Cars have catalytic converters, big industrial stacks may have emissions reduction equipment..." de Gouw said. "Not so with these engines." Conventional "inventories" meant to account for nitrogen oxides pollution from oil and gas sites are often very uncertain, underestimating or overestimating the pollutants, de Gouw said. And there are few sustained measurements of nitrogen oxides in many of the rural areas where oil and gas development often takes place, Dix said. So she, de Gouw and their colleagues turned to nitrogen dioxide data from the Ozone Monitoring Instrument (OMI) on board a NASA satellite and the Tropospheric Monitoring Instrument (TropOMI) on a European Space Agency satellite.
They also looked at gas flaring data from an instrument on the NOAA/NASA Suomi satellite system. Between 2007 and 2019, across much of the United States, nitrogen dioxide pollution levels dropped because of cleaner cars and power plants, the team found, confirming findings reported previously. The clean air trend in satellite data was most obvious in urban areas of California, Washington and Oregon and in the eastern half of the continental United States. "We've cleaned up our act a lot," Dix said. However, several areas stuck out with increased emissions of nitrogen dioxide: the Permian, Bakken and Eagle Ford oil and gas basins, in Texas and New Mexico, North Dakota, and Texas, respectively. In those areas, the scientists used a type of time-series analysis to figure out where the pollutant was coming from: drilling of new wells vs. longer-term production. They could do this kind of analysis because drilling activity swings up and down quickly in response to market forces while production changes far more slowly (once a well is drilled, it may produce oil and natural gas for years or even decades). Before a downturn in drilling in 2015, drilling generated about 80 percent of nitrogen dioxide from oil and gas sites, the team reported. After 2015, drilling and production produced roughly equal amounts of the pollutant. Flaring is estimated to contribute up to 10 percent in both time frames. The researchers also developed a new oil and gas emissions inventory, using data on fuel use by the industry, the location of drilling rigs, and well-level production data. The inventory confirmed the satellite trends, said co-author Brian McDonald, a CIRES scientist working in NOAA's Chemical Sciences Division. "It is a promising development that what we observe from space can be explained by expected trends in emissions from the oil and gas industry." "Scientifically, this is especially important: we can do source attribution by satellite," de Gouw said.
"We need to know the important sources to address these emissions in the most cost-efficient manner."
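The attribution logic described above exploits the different time signatures of the two sources: drilling swings quickly with market forces while production ramps slowly. A toy regression can separate contributions with those signatures. All numbers below are synthetic illustrations; the study's actual analysis of the satellite record was considerably more involved.

```python
# Toy decomposition of an NO2 time series into drilling- and production-driven
# parts, relying on their different temporal behavior. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(120)

# Drilling: fast, market-driven swings. Production: slow, steady ramp-up.
drilling = 50 + 30 * np.sin(months / 6) + rng.normal(scale=3, size=120)
production = 100 + 0.8 * months

# Simulate NO2 with known (assumed) emission factors per unit of activity
no2 = 0.04 * drilling + 0.01 * production + rng.normal(scale=0.1, size=120)

# Least-squares fit recovers each source's contribution
X = np.column_stack([drilling, production])
(a, b), *_ = np.linalg.lstsq(X, no2, rcond=None)
print(round(a, 3), round(b, 3))  # close to the assumed 0.04 and 0.01
```

Because the fast oscillations in the series can only be explained by the drilling term and the slow trend only by the production term, the fit can tell the two apart even though both feed the same pollutant.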
Pollution
2020
January 14, 2020
https://www.sciencedaily.com/releases/2020/01/200114101719.htm
School indoor air quality cannot be reliably assessed based on pupils' symptoms
In school buildings with indoor air quality related problems, such as moisture damage, temperature problems or poor ventilation, pupils experience slightly more symptoms than in buildings in which conditions are stated to be good based on expert evaluation.
However, the association between indoor air quality of the school building and the pupils' symptoms was so weak that it is not possible to reliably assess the quality of the indoor air based on the amount of reported symptoms. This was found in a recent study by the Finnish Institute for Health and Welfare and the University of Helsinki, which was published in a scientific journal. The researchers found that inadequate indoor air quality was connected to the pupils' respiratory and general symptoms, such as colds, coughs and headaches. Some link to lower airway or skin symptoms was also found, but not to eye symptoms. Earlier research also suggests that symptoms are not a good marker of the indoor air quality of a building. "We need more research on how much indoor air impurities explain the differences in symptoms between schools, in comparison to other factors that influence symptoms. To date, these other factors have been remarkably little researched," said Professor Juha Pekkanen. Besides indoor air, other factors that influence symptoms include the age of the person, their health and stress levels, including how well they are doing and feeling at work and school, and beliefs and concerns about the risks of poor indoor air. The fact that many issues influence people's symptoms in indoor environments needs to be considered when interpreting the results of indoor air questionnaires. "Questionnaires aimed at building users provide valuable additional information for the management of indoor air quality problems. However, the decisions on renovations should be based primarily on data obtained by investigating the building," said Chief Physician Jussi Lampi. The study included 129 lower and upper primary school buildings in the Helsinki area. The building conditions were assessed based on the City of Helsinki's evaluations. The Finnish Institute of Health and Welfare's indoor air questionnaire was used to gather data about pupils' symptoms.
More than 12,000 pupils responded to the questionnaire. The response rate was 60%.
Pollution
2020
January 13, 2020
https://www.sciencedaily.com/releases/2020/01/200113124519.htm
Future subtropical warming accelerates tropical climate change
In response to future fossil fuel burning, climate computer models simulate a pronounced warming in the tropical oceans. This warming can influence the El Niño phenomenon and shift weather and rainfall patterns across the globe. Despite being robustly simulated in computer models of the climate system, the origin of this accelerated tropical warming has remained a mystery. A new study published this week in the journal
Earth's future warming will not be identical everywhere. Atmospheric and oceanic circulation changes, as well as cloud processes, will determine which regions will warm faster and which ones will experience a delayed warming relative to the global mean. Focusing on the tropical regions, a team of scientists from the IBS Center for Climate Physics (ICCP) at Pusan National University in South Korea, along with their international collaborators, has now developed a new method that separates the contributions from local and remote physical processes that cause warming in a given region. The team found that the expected future warming in the tropics (15°S-15°N) originates mostly from warming that occurs in subtropical regions (16°N-32°N and 16°S-32°S). "To understand this surprising phenomenon, one has to understand how different areas interact with each other climatically," says Dr. Malte Stuecker from the ICCP and lead author of the study. In the subtropical regions, air in the upper atmosphere is sinking, creating high pressure at the surface (see Figure). At an altitude of about 10-15 km, this sinking motion sucks in tropical air. The resulting large-scale atmospheric circulation is referred to as the Hadley cell, and its surface branch, known as the trade wind circulation, transports relatively dry subtropical air back to the tropics. Due to the effects of earth's rotation, trade winds also cause upwelling of cold subsurface waters in the tropical Pacific and Atlantic. "In response to increasing greenhouse gas emissions, future subtropical warming will slow down the atmospheric Hadley cell," adds Stuecker. This will lead to a weakening of the surface trade winds, less upwelling of cold ocean water, and a resulting warming of the sea surface. In addition, a weaker Hadley cell also means that less humid air is rising, and cloud coverage is reduced in most of the tropics, increasing the amount of sunlight reaching the surface.
"This can further exacerbate future warming in the tropics," says Axel Timmermann, Director of the ICCP and co-author of the study. To arrive at these conclusions, the authors used a computer model of the global climate system and ran it for present-day and future CO2 conditions. By imposing the extra energy related to the CO2 change, either in the tropics or subtropics, the team then found that human-induced subtropical warming causes about 40% more future tropical surface ocean temperature change than if the same amount of extra energy would enter Earth's atmosphere directly in the tropics. The study presents a new paradigm to understand the patterns of future global warming and the arising regional differences. "Warming in one area can affect the degree of warming in another place. We are starting to appreciate how strongly different areas are connected in the climate system," says co-author Prof. Fei-Fei Jin from the University of Hawai'i, USA. The new
Pollution
2020
January 10, 2020
https://www.sciencedaily.com/releases/2020/01/200110073740.htm
Cracks in Arctic sea ice turn low clouds on and off
In the wintertime Arctic, cracks in the ice called "leads" expose the warm ocean directly to the cold air, with some leads only a few meters wide and some kilometers wide. They play a critical role in the Arctic surface energy balance. If we want to know how much the ice is going to grow in winter, we need to understand the impacts of leads.
The extreme contrast in temperature between the warm ocean and the cold air creates a flow of heat and moisture from the ocean to the atmosphere. This flow provides a lead with its own weather system which creates low-level clouds. The prevailing view has been that more leads are associated with more low-level clouds during winter. But University of Utah atmospheric scientists noticed something strange in their study of these leads: when lead occurrence was greater, there were fewer, not more clouds. In a paper published in
Pollution
2020
January 8, 2020
https://www.sciencedaily.com/releases/2020/01/200108131642.htm
Debunking previous studies that say tropical fish are behaving oddly
Sometimes it helps to check the facts. You may be surprised what you find.
Over the last decade, several high-profile scientific studies have reported that tropical fish living in coral reefs are adversely affected by ocean acidification caused by climate change -- that is, they behave oddly and are attracted to predators as levels of carbon dioxide dissolved from air pollution increase. But now new research suggests that isn't the case. In fact, in the most exhaustive study yet of the impacts of ocean acidification on the behaviour of coral reef fish, headed up in Australia and co-authored by two Université de Montréal researchers, it turns out fish behaviour is not affected at all. "The past decade has seen many high-profile studies that have found alarming effects of ocean acidification on coral reef fish behaviour," with some reporting that "fish become attracted to the smell of predators in acidified waters," said lead author Timothy Clark, an associate professor at Deakin University's School of Life and Environmental Sciences in Geelong, a seaside city near Melbourne, Australia. But when they tried to re-do those earlier studies with many of the same species, and by crunching the data in a new analysis, Clark and his team of Canadian and Scandinavian scientists -- including UdeM biologists Sandra Binning and Dominique Roche -- arrived at very different results. It turns out the original results couldn't be replicated. "We expected previous results would be easy to repeat because of how clear and strong they appeared in the initial papers. Instead, we found consistently normal behaviours in fish that we acclimated to (predicted) end-of-(21st)-century levels of CO2." But "by using rigorous methods, measuring multiple behaviours in multiple species, and making our data and analysis code openly available, we have comprehensively and transparently shown that ...
ocean acidification has negligible direct impacts on the behaviour of fish on coral reefs," said Clark. "Specifically, elevated CO2 ..." The new study is bound to make a big impact in the marine biology world, the scientists believe. Not only does it contradict earlier studies, it shows that science doesn't always produce results to buttress things everyone agrees on, like climate change. Quite the opposite, in fact. "Some people may be surprised by these findings, but that's how science operates: it's a normal and healthy process to question published results. Sometimes they hold up, and sometimes they don't. Ultimately, it's the accumulation of evidence that matters and brings us closer to the truth," said Binning, an assistant professor at UdeM. "It's not because some researchers have found one thing that we should take it at face value. As scientists, we should always be critical of what we read and what we see. That's how science advances." "We're not saying that climate change is not a problem -- far from it," added Roche, her husband, a research associate at UdeM. "Our point is that replication studies are very important, just as are ocean acidification and global warming generally." Clark agreed. "The negative effects of CO2 ..." "So, despite our new results, coral reefs and their fish communities remain in grave danger because of increasing atmospheric CO2." Now, instead of concentrating on how fish behaviour is affected by ocean acidification, scientists would do better to focus their attention "on other aspects of climate change that are more in need of research," such as infectious disease risk, habitat destruction, and decreased oxygen levels in water, said Binning, holder of a Canada Research Chair on Eco-Evolution and Host-Parasite Interactions. "With so little time left to combat climate change, it's vitally important that research dollars are used in the best way possible to better help us understand and target systems and organisms at the greatest risk," added Roche.
Pollution
2020
January 7, 2020
https://www.sciencedaily.com/releases/2020/01/200107104913.htm
Air pollution in childhood linked to schizophrenia
Air pollution affects physical health, and research results now conclude that it also affects our psychological health. The study, which combines genetic data from iPSYCH with air pollution data from the Department of Environmental Science, shows that children who are exposed to a high level of air pollution while growing up, have an increased risk of developing schizophrenia.
"The study shows that the higher the level of air pollution, the higher the risk of schizophrenia. For each 10 µg/m3 (micrograms of pollutant per cubic metre of air) increase in the daily average, the risk of schizophrenia increases by approximately twenty per cent. Children who are exposed to an average daily level above 25 µg/m3 have an approx. sixty per cent greater risk of developing schizophrenia compared to those who are exposed to less than 10 µg/m3," explains Senior Researcher Henriette Thisted Horsdal, who is behind the study. To put these figures into perspective, the lifetime risk of developing schizophrenia is approximately two per cent, which equates to two out of a hundred people developing schizophrenia during their life. For people exposed to the lowest level of air pollution, the lifetime risk is just under two per cent, while the lifetime risk for those exposed to the highest level of air pollution is approx. three per cent. The results of the study have just been published in a scientific journal. "The risk of developing schizophrenia is also higher if you have a higher genetic liability for the disease. Our data shows that these associations are independent of each other. The association between air pollution and schizophrenia cannot be explained by a higher genetic liability in people who grow up in areas with high levels of air pollution," says Henriette Thisted Horsdal about the study, which is the first of its kind to combine air pollution and genetics in relation to the risk of developing schizophrenia. The study included 23,355 people in total, and of these, 3,531 developed schizophrenia. Though the results demonstrate an increased risk of schizophrenia when the level of air pollution during childhood increases, the researchers cannot comment on the cause. Instead they emphasise that further studies are needed before they can identify the cause of this association.
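The reported dose-response can be turned into rough arithmetic: about a 20% relative risk increase per 10 µg/m3 applied to a ~2% lifetime baseline. The sketch below assumes the per-increment risk compounds exponentially between exposure levels, which is an illustrative simplification, not the study's fitted model.

```python
# Back-of-envelope sketch of the reported dose-response. The ~2% baseline and
# ~20%-per-10-ug/m3 figures come from the article; exponential scaling between
# increments is an assumption made here for illustration.

BASELINE_LIFETIME_RISK = 0.02    # ~2 in 100 develop schizophrenia
RELATIVE_RISK_PER_10UG = 1.20    # ~20% relative increase per 10 ug/m3

def lifetime_risk(exposure_delta_ug):
    """Approximate lifetime risk after raising daily-average exposure
    by exposure_delta_ug micrograms per cubic metre above baseline."""
    increments = exposure_delta_ug / 10.0
    return BASELINE_LIFETIME_RISK * RELATIVE_RISK_PER_10UG ** increments

# +20 ug/m3 compounds two ~20% increments on the ~2% baseline
print(round(lifetime_risk(20), 4))
```

With a 20 µg/m3 increase this gives roughly 2.9%, in the neighborhood of the ~3% lifetime risk the article reports for the most exposed group.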
Pollution
2020
January 6, 2020
https://www.sciencedaily.com/releases/2020/01/200106222503.htm
Poplar genetically modified not to harm air quality grow as well as non-modified trees
While providing benefits to the environment, some trees also emit gases to the atmosphere that worsen air pollution and alter climate. Field trials in Oregon and Arizona show that poplar trees, which emit trace amounts of the gas isoprene, can be genetically modified not to harm air quality while leaving their growth potential unchanged.
The findings were published today in a scientific journal. Poplars and other trees used in plantation agroforestry, including palms and eucalyptus, produce isoprene in their leaves in response to climate stress such as high temperature and drought. The isoprene alleviates those stresses by signaling cellular processes to produce protective molecules; however, isoprene is so volatile that millions of metric tons leak into the atmosphere each year. The emitted isoprene reacts with gases produced by tailpipe pollution to produce ozone, which is a respiratory irritant. Isoprene also causes higher levels of atmospheric aerosol production, which reduces the amount of direct sunlight reaching the earth (a cooling effect), and it causes the global warming potential of methane in the atmosphere to increase (a warming effect). The warming effect is most likely greater than the cooling effect. The net effect of emitted isoprene is to worsen respiratory health and, most likely, warm the atmosphere. A research collaboration led by scientists at the University of Arizona, the Helmholtz Research Center in Munich, Portland State University and Oregon State University genetically modified poplars not to produce isoprene, then tested them in three- and four-year trials at plantations in Oregon and Arizona. The researchers found that trees whose isoprene production was genetically suppressed did not suffer ill effects in terms of photosynthesis or "biomass production." They were able to make cellulose, used in biofuel production, and grow as well as trees that were producing isoprene.
The discovery came as a surprise, given the protective role of isoprene in stressful climates, especially in the case of the Arizona plantation. "The suppression of isoprene production in the leaves has triggered alternative signaling pathways that appear to compensate for the loss of stress tolerance due to isoprene," said Russell Monson, a professor of ecology and evolutionary biology at the University of Arizona and lead author of the study. "The trees exhibited a clever response that allowed them to work around the loss of isoprene and arrive at the same outcome, effectively tolerating high temperature and drought stress." "Our findings suggest that isoprene emissions can be diminished without affecting biomass production in temperate forest plantations," said study co-author Steven Strauss, a distinguished professor of forest biotechnology at Oregon State University. "That's what we wanted to examine -- can you turn down isoprene production, and does it matter to biomass productivity and general plant health? It looks like it doesn't impair either significantly." The researchers used a genetic engineering tool known as RNA interference. RNA transmits protein-coding instructions from each cell's DNA, which holds the organism's genetic code. The genetic tools for modifying the trees, and the protein analyses that revealed changes in the use of biochemical pathways, were developed by scientists at the Institute of Biochemical Plant Pathology, Helmholtz Research Center in Munich, Germany, who collaborated on the study. "RNA interference is like a vaccination -- it triggers a natural and highly specific mechanism whereby specific targets are suppressed, be they the RNA of viruses or endogenous genes," Strauss said. "You could also do the same thing through conventional breeding.
It would be a lot less efficient and precise, and it might be a nightmare for a breeder who may need to reassess all of their germplasm and possibly exclude their most productive cultivars as a result, but it could be done. New technologies like CRISPR, short for clustered regularly interspaced short palindromic repeats, which allows for precise DNA editing at specific stretches of the genetic code, should work even better." In an additional discovery, the researchers found that trees were able to adjust to the loss of isoprene because most plantation growth takes place during cooler and wetter times of the year. "This means that, for this species, the natural seasonal cycle of growth works in favor of high biomass production when the beneficial effects of isoprene are needed least," Monson explained. This observation also clarified an adaptive role for isoprene in natural forests, where protection that enhances survival during mid-season climate stress is likely more important than processes that promote growth early in the season. "The fact that cultivars of poplar can be produced in a way that ameliorates atmospheric impacts without significantly reducing biomass production gives us a lot of optimism," Monson said. "We're striving toward greater environmental sustainability while developing plantation-scale biomass sources that can serve as fossil fuel alternatives."
Pollution
2020
January 3, 2020
https://www.sciencedaily.com/releases/2020/01/200103111726.htm
Air pollution can worsen bone health
Some of the effects of air pollution on health are well documented (lung cancer, stroke, respiratory diseases, and many others), but for others there is less scientific evidence. Such is the case of bone health: there are only a few studies, and their results are inconclusive. Now, a study in India led by the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa," has found an association between exposure to air pollution and poor bone health.
Osteoporosis is a disease in which the density and quality of the bone are reduced. Globally, it is responsible for a substantial burden of disease, and its prevalence is expected to increase with the aging of the population. The new study was performed by the CHAI Project, led by ISGlobal. The authors used a locally-developed model to estimate outdoor exposure at residence to air pollution by fine particulate matter (suspended particles with a diameter of 2.5 μm or less) and black carbon. The participants also filled in a questionnaire on the type of fuel used for cooking. The authors linked this information with bone health assessed using a special type of radiography that measures bone density, called dual-energy x-ray absorptiometry, and measured bone mass at the lumbar spine and the left hip. The results showed that exposure to ambient air pollution, particularly to fine particles, was associated with lower levels of bone mass. No correlation was found with use of biomass fuel for cooking. "This study contributes to the limited and inconclusive literature on air pollution and bone health," explains Otavio T. Ranzani, ISGlobal researcher and first author of the study. Regarding the possible mechanisms underlying this association, he says, "inhalation of polluting particles could lead to bone mass loss through the oxidative stress and inflammation caused by air pollution." Annual average exposure to ambient PM2.5 was 32.8 μg/m3, far above the maximum level recommended by the World Health Organisation (10 μg/m3). 58% of participants used biomass fuel for cooking. "Our findings add to a growing body of evidence that indicates that particulate air pollution is relevant for bone health across a wide range of air pollution levels, including levels found in high-income and low- and medium-income countries," says Cathryn Tonne, coordinator of the study and of the CHAI project.
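To put the exposure figures above in perspective, the study's annual average PM2.5 can be compared directly against the WHO guideline. A minimal sketch (the variable names are ours; only the two concentrations come from the article):

```python
# Compare the cohort's annual average PM2.5 exposure with the WHO guideline.
measured_pm25 = 32.8   # µg/m3, annual average reported in the study
who_guideline = 10.0   # µg/m3, WHO recommended maximum cited in the article

exceedance_ratio = measured_pm25 / who_guideline
print(f"Exposure is {exceedance_ratio:.1f}x the WHO guideline")  # → 3.3x
```

That is, the study population breathed air more than three times above the recommended maximum on average.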
Pollution
2020
December 24, 2019
https://www.sciencedaily.com/releases/2019/12/191224085713.htm
Untangling links between nitrogen oxides and airborne sulfates helps tackle hazy air pollution
Dense, hazy fog episodes characterized by relatively high humidity, low visibility and extremely high PM2.5 have been a headache to many megacities, including those in Mainland China. Among pollutants that are less than 2.5 microns in diameter (PM2.5), airborne sulfate is one of the most common components of hazy air pollution, formed atmospherically via the oxidation of sulphur dioxide (SO2).
While the reactant-product link between sulphur dioxide and airborne sulfate formation is common knowledge, the complex oxidants and mechanisms that enable this transformation are not. In particular, the role of nitrogen oxides in sulfate production is unclear. Managing sulfate pollution has dogged researchers and governments alike, as it is not produced directly from pollution sources, unlike nitrogen oxides, which are clearly emitted from vehicle exhaust and the combustion of fossil fuels like coal, diesel and natural gas. This is the first study to systematically examine the multiple roles of nitrogen oxides in affecting the oxidants that enable this set of chemical reactions. In collaboration with the California Institute of Technology, a research team led by Prof. YU Jianzhen, Professor at HKUST's Department of Chemistry and Division of Environment and Sustainability, identified three formation mechanism regimes, corresponding to the three distinct roles that nitrogen oxides play in sulfate production depending on the chemical surroundings. These findings indicate that in order to reduce sulfate levels in highly polluted haze-fog conditions, co-control of SO2 and nitrogen oxide emissions is needed. "Since sulfate is formed atmospherically and cannot be controlled directly, we must target its precursor components (such as sulphur dioxide and nitrogen oxides). Effective reduction of sulfate content in the air relies on knowledge of the quantitative relationship it has with its precursors. This work lays the conceptual framework to delineate the relationship between sulfate and one set of its controllable precursors, nitrogen oxides (NOx)." As sulfate is one of the major components that leads to haze formation and acid rain, this study laid the groundwork for formulating more effective measures of targeting this major pollutant involved in the aforementioned events -- which do not just block the views or make aquatic environments more acidic, but also compromise human health.
With greater understanding and better control, this will lead to improved air quality and better protection of public health and ecological systems as a whole. The team's findings were recently published in a scientific journal.
Pollution
2019
December 20, 2019
https://www.sciencedaily.com/releases/2019/12/191220150545.htm
Could every country have a Green New Deal? Report charts paths for 143 countries
Ten years after the publication of their first plan for powering the world with wind, water, and solar, researchers offer an updated vision of the steps that 143 countries around the world can take to attain 100% clean, renewable energy by the year 2050. The new roadmaps were published December 20.
In this update, Mark Z. Jacobson of Stanford University and his team find low-cost, stable grid solutions in 24 world regions encompassing the 143 countries. They project that transitioning to clean, renewable energy could reduce worldwide energy needs by 57%, create 28.6 million more jobs than are lost, and reduce energy, health, and climate costs by 91% compared with a business-as-usual analysis. The new paper makes use of updated data about how each country's energy use is changing, acknowledges lower costs and greater availability of renewable energy and storage technology, includes new countries in its analysis, and accounts for recently built clean, renewable infrastructure in some countries. "There are a lot of countries that have committed to doing something to counteract the growing impacts of global warming, but they still don't know exactly what to do," says Jacobson, a professor of civil and environmental engineering at Stanford and the co-founder of the Solutions Project, a U.S. non-profit educating the public and policymakers about a transition to 100% clean, renewable energy. "How would it work? How would it keep the lights on? To be honest, many of the policymakers and advocates supporting and promoting the Green New Deal don't have a good idea of the details of what the actual system looks like or what the impact of a transition is. It's more an abstract concept. So, we're trying to quantify it and to pin down what one possible system might look like. This work can help fill that void and give countries guidance." The roadmaps call for the electrification of all energy sectors, for increased energy efficiency leading to reduced energy use, and for the development of wind, water, and solar infrastructure that can supply 80% of all power by 2030 and 100% of all power by 2050. "All energy sectors" includes electricity; transportation; building heating and cooling; industry; agriculture, forestry, and fishing; and the military.
The researchers' modeling suggests that the efficiency of electric and hydrogen fuel cell vehicles over fossil fuel vehicles, of electrified industry over fossil industry, and of electric heat pumps over fossil heating and cooling, along with the elimination of energy needed for mining, transporting, and refining fossil fuels, could substantially decrease overall energy use. The transition to wind, water, and solar would require an initial investment of $73 trillion worldwide, but this would pay for itself over time through energy sales. In addition, clean, renewable energy is cheaper to generate over time than fossil fuels, so the investment reduces annual energy costs significantly. It also reduces air pollution and its health impacts, and requires only 0.17% of the 143 countries' total land area for new infrastructure and 0.48% of their total land area for spacing purposes, such as between wind turbines. "We find that by electrifying everything with clean, renewable energy, we reduce power demand by about 57%," Jacobson says. "So even if the cost per unit of energy is similar, the cost that people pay in the aggregate for energy is 61% less. And that's before we account for the social cost, which includes the costs we will save by mitigating health and climate damage. That's why the Green New Deal is such a good deal. You're reducing energy costs by 60% and social costs by 91%." In the U.S., this roadmap -- which corresponds to the energy portion of the Green New Deal, which will eliminate the use of all fossil fuels for energy in the U.S. -- requires an upfront investment of $7.8 trillion. It calls for the construction of 288,000 new large (5 megawatt) wind turbines and 16,000 large (100 megawatt) solar farms on just 1.08% of U.S. land, with over 85% of that land used for spacing between wind turbines. The spacing land can double, for instance, as farmland. The plan creates 3.1 million more U.S.
jobs than the business-as-usual case, and saves 63,000 lives from air pollution per year. It reduces energy, health, and climate costs by 1.3, 0.7, and 3.1 trillion dollars per year, respectively, compared with the current fossil fuel energy infrastructure. And the transition is already underway. "We have 11 states, in addition to the District of Columbia, Puerto Rico, and a number of major U.S. cities that have committed to 100% or effectively 100% renewable electricity," Jacobson says. "That means that every time they need new electricity because a coal plant or gas plant retires, they will only select among renewable sources to replace them." He believes that individuals, businesses, and lawmakers all have an important role to play in achieving this transition. "If I just wrote this paper and published it and it didn't have a support network of people who wanted to use this information," he says, "it would just get lost in the dusty literature. If you want a law passed, you really need the public to be supportive." Like any model, this one comes with uncertainties. There are inconsistencies between datasets on energy supply and demand, and the findings depend on the ability to model future energy consumption. The model also assumes the perfect transmission of energy from where it's plentiful to where it's needed, with no bottlenecking and no loss of energy along power lines. While this is never the case, many of the assessments were done on countries with grids small enough that the difference is negligible, and Jacobson argues that larger countries like the U.S. can be broken down into smaller grids to make perfect transmission less of a concern. The researchers addressed additional uncertainties by modeling scenarios with high, mean, and low costs of energy, air pollution damage, and climate damage. The work deliberately focuses only on wind, water, and solar power and excludes nuclear power, "clean coal," and biofuels.
Nuclear power is excluded because it requires 10-19 years between planning and operation and has high costs and acknowledged meltdown, weapons proliferation, mining, and waste risks. "Clean coal" and biofuels are not included because they both cause heavy air pollution and still emit over 50 times more carbon per unit of energy than wind, water, or solar power. One concern often discussed with wind and solar power is that they may not be able to reliably match energy supplies to the demands of the grid, as they are dependent on weather conditions and time of year. This issue is addressed squarely in the present study across 24 world regions. The study finds that demand can be met by intermittent supply and storage throughout the world. Jacobson and his team found that electrifying all energy sectors actually creates more flexible demand for energy. Flexible demand is demand that does not need to be met immediately. For example, an electric car battery can be charged any time of day or night, and an electric heat pump water heater can heat water any time of day or night. Because electrification of all energy sectors creates more flexible demand, matching demand with supply and storage becomes easier in a clean, renewable energy world. Jacobson also notes that the roadmaps this study offers are not the only possible ones and points to work done by 11 other groups that also found feasible paths to 100% clean, renewable energy. "We're just trying to lay out one scenario for 143 countries to give people in these and other countries the confidence that yes, this is possible. But there are many solutions and many scenarios that could work. You're probably not going to predict exactly what's going to happen, but it's not like you need to find the needle in the haystack. There are lots of needles in this haystack."
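Jacobson's "57% less demand" and "61% less aggregate cost" figures fit together only if the per-unit cost of wind-water-solar energy is slightly below the business-as-usual unit cost. A minimal arithmetic sketch (the ~9% unit-cost discount is an illustrative assumption chosen to reproduce the quoted numbers, not a figure from the study):

```python
# Aggregate energy cost = demand x unit cost.
# The study reports demand falling ~57% after electrification.
bau_demand = 1.0             # business-as-usual demand (normalized)
wws_demand = 1.0 - 0.57      # wind-water-solar demand after electrification

# Assumed: WWS unit cost ~9% below business-as-usual (illustrative only,
# chosen so the aggregate matches the article's "61% less" quote).
bau_unit_cost = 1.0
wws_unit_cost = 1.0 - 0.09

bau_cost = bau_demand * bau_unit_cost
wws_cost = wws_demand * wws_unit_cost
savings = 1.0 - wws_cost / bau_cost
print(f"Aggregate cost reduction: {savings:.0%}")  # → 61%
```

The point of the sketch is that most of the quoted savings come from the demand reduction itself; the unit-cost difference only nudges the total.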
Pollution
2019
December 20, 2019
https://www.sciencedaily.com/releases/2019/12/191220105623.htm
Ecological impacts of palm stearin spill to the coastal ecosystem
In August 2017, a marine accident occurred in the Pearl River Estuary where a cargo vessel accidentally released about 1,000 tonnes of palm stearin into the sea. Over 200 tonnes of palm stearin reached the southwest coasts of Hong Kong. The general public and green groups expressed concerns that such palm oil pollution could adversely affect the marine life and marine ecosystem, yet there was a lack of scientific information on the toxicity of the palm stearin toward marine organisms in the scientific literature, making it impossible to accurately evaluate its ecological risk.
Subsequently, Professor Kenneth Leung Mei Yee from the School of Biological Sciences and the Swire Institute of Marine Science, The University of Hong Kong (HKU), and his research team launched a comprehensive 18-month investigation on the degradation, bioaccumulation, and toxicity of the palm stearin through both field- and laboratory-based investigations. The study sites included Hung Shing Yeh Beach, Deep Water Bay, Repulse Bay, Chung Hom Kok, Outer Tai Tam and Inner Tai Tam. The results of the study were published in an international journal. The results of the field-based study showed that the palm stearin could dissolve in seawater and sediment under elevated temperature, and that marine organisms could be contaminated by the oil. In early August 2017, samples of the tissues of marine gastropod species, seawater and sediment were found to have high levels of fatty acids, especially the C16:0 fatty acid that is dominant in palm stearin. After the incident, the Hong Kong SAR Government and local citizens made concerted efforts to remove the palm stearin from the impacted shores. These important actions effectively stopped further contamination and helped to minimise the negative impacts brought by the palm stearin. In November 2017 (i.e., four months after the incident), the concentration of fatty acids in both seawater and sediment samples returned to natural levels. Nonetheless, the concentration of fatty acids remained high in the marine gastropods; this might be due to a natural cause: the animals intensified their uptake of food to store more energy for winter. The results of the laboratory experiment suggested that the rates of disintegration and degradation of the palm stearin were very slow.
After deploying the palm stearin in seawater for a year under laboratory conditions, only about 10% of it had disintegrated and degraded. After studying the toxicity of palm stearin on 10 different marine species, the research team also revealed that the palm stearin posed notable adverse effects on pelagic phytoplankton and zooplankton. It could inhibit the growth of microalgae (e.g. diatoms) and cause mortality to pelagic copepods, rotifers and brine shrimps. The benthic copepod and marine medaka fish were more tolerant to the palm stearin. Although the fish could eat the palm stearin, their growth was inhibited. Reproduction of the benthic copepod was greatly reduced after exposure to the palm stearin. Combining the above findings, the research team conducted a scientific ecological risk assessment. The results indicated that the ecological risk was very high right after the accidental spill in August 2017 (risk quotient (RQ) at all sites >> 1). Fortunately, the ecological risk was substantially reduced four months after the incident (RQ < 1 at four sites; RQ < 2 at the other two sites). The results of this study highlight the importance of immediate action to remove palm stearin from shores, because this can minimise its long-term impact on the marine environment. As there is an ever-increasing trend of marine trade, the number and frequency of marine accidents are expected to rise. This study therefore provides useful scientific information to authorities around the world to help them make informed decisions in risk assessment and management of similar crises in the future.
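The risk quotient (RQ) used above is conventionally computed as the measured environmental concentration divided by the predicted no-effect concentration, with RQ > 1 flagging a potential risk. A minimal sketch of that calculation (the concentrations below are hypothetical placeholders, not data from the study):

```python
# Ecological risk quotient, as conventionally defined:
#   RQ = MEC / PNEC
# MEC  = measured environmental concentration
# PNEC = predicted no-effect concentration (from toxicity tests)
# RQ > 1 indicates a potential ecological risk.

def risk_quotient(mec: float, pnec: float) -> float:
    """Return the risk quotient for one site."""
    return mec / pnec

pnec = 2.0  # hypothetical no-effect concentration (mg/L)

# Hypothetical site concentrations: right after the spill vs. four months later
august_mec, november_mec = 50.0, 1.2
print(risk_quotient(august_mec, pnec))    # 25.0 -> RQ >> 1, very high risk
print(risk_quotient(november_mec, pnec))  # 0.6  -> RQ < 1, acceptable
```

This mirrors the pattern reported in the study: RQ >> 1 at all sites in August 2017, falling below or near 1 by November.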
Pollution
2019
December 18, 2019
https://www.sciencedaily.com/releases/2019/12/191218173853.htm
Pregnancy hypertension risk increased by traffic-related air pollution
A new report from the National Toxicology Program (NTP) suggests that traffic-related air pollution increases a pregnant woman's risk for dangerous increases in blood pressure, known as hypertension.
NTP scientists evaluated published research on the link between traffic-related air pollution, or TRAP, and hypertensive disorders, broken down by pollutant measurements of TRAP, such as particulate matter (PM2.5). PM is the term for a mixture of solid particles and liquid droplets found in the air, and PM2.5 refers to fine inhalable particles with diameters that are generally 2.5 micrometers or smaller. The average human hair is about 70 micrometers in diameter, about 30 times larger than the largest fine particle. "What we found when we reviewed the literature is that exposure to PM2.5 from traffic emissions was associated with development of hypertensive disorders in pregnant women," said Brandy Beverly, Ph.D., lead scientist and researcher at the National Institute of Environmental Health Sciences, part of the National Institutes of Health. "When these women are exposed to PM2.5 during their entire pregnancy, the likelihood of developing preeclampsia increases by about 50%." Other components of TRAP that NTP evaluated included nitrogen oxides, carbon monoxide, black carbon, and elemental carbon, along with parameters like traffic density and mothers' proximity to main roads. For example, the literature suggests that women who live within a quarter of a mile of a major roadway or in high-traffic-density regions may be at an increased risk of developing hypertensive disorders of pregnancy. TRAP comes from the combustion of fossil fuels by motor vehicles. These vehicle emissions are mixtures of gases and particles that are easily inhaled and have adverse health effects. TRAP is known to be a major risk factor for cardiovascular disease, including hypertension. Hypertensive disorders of pregnancy complicate more than 10% of pregnancies worldwide and are a leading cause of maternal and fetal illness and death. According to the American College of Obstetricians and Gynecologists, mothers with hypertension during pregnancy are more likely to have a pre-term delivery.
Their infants are at greater risk for low birthweight and a range of long-term health problems associated with premature birth. "Hypertensive disorders of pregnancy refer to a range of clinical conditions, all of which include high blood pressure during pregnancy," said Beverly. "The disorders are classified into four distinct types, based on differences in the timing and onset of the symptoms." Using their standard four-tier scale to classify human hazards, NTP looked at the combined evidence from the individual components and concluded that TRAP is a presumed human hazard for hypertensive disorders during pregnancy, though they weren't able to distinguish between the four types of disorders. The scale ranges from the highest hazard rating of "known," followed by "presumed," then "suspected," and finally, "not classifiable." NTP conducted the systematic review of published research on hypertensive disorders in pregnant women and their link to TRAP after receiving a nomination from several pediatricians to evaluate the connection between emerging issues associated with air pollution and children's health. NTP scientists performed a comprehensive literature search and reviewed hundreds of studies with potentially relevant data. Overall, they evaluated 18 human observational studies and one animal study that specifically addressed hypertension during pregnancy and TRAP. Usually, experimental animal data add confidence to the conclusions; unfortunately, the limited number of animal studies assessing the impact of environmental exposures during pregnancy is a research gap. The evaluation underwent external peer review involving experts from academia and industry, who evaluated NTP's draft conclusions and agreed unanimously with NTP's final conclusion.
Pollution
2019
December 17, 2019
https://www.sciencedaily.com/releases/2019/12/191217141545.htm
Effects of natural gas assessed in study of shale gas boom in Appalachian basin
Natural gas has become the largest fuel source for generating electricity in the United States, accounting for a third of production and consumption of energy. However, the environmental and socioeconomic impacts of natural gas have not been considered comprehensively. A new study estimated the cumulative effects of the shale gas boom in the Appalachian basin in the early 2000s on air quality, climate change, and employment. The study found that effects on air quality and employment followed the boom-and-bust cycle, but effects on climate change will likely persist for generations to come. The study, which also considered how to compensate for these effects, provides insights for long-term decision making in this field.
The study, by researchers at Carnegie Mellon University (CMU), Princeton University, and Stanford University, appears in Nature Sustainability. "While gas development has boosted aspects of the regional economy, private firms have not faced the full costs of natural gas development," explains Nicholas Z. Muller, Associate Professor of Economics, Engineering, and Public Policy at CMU's Tepper School of Business, who coauthored the study. "In our work, we sought to evaluate the cumulative and disparate impacts of current energy systems to inform policymaking." Rapid increases in natural gas production in the United States, resulting from advancements in horizontal drilling and hydraulic fracturing (a well-stimulation technique that injects pressurized liquids into a bedrock formation), have dramatically altered world energy markets and the domestic energy outlook. The United States has been the largest natural gas consumer and producer over the past decade, comprising 20 percent of the world market, and the domestic shale gas market has contributed to price volatility and a shift in global flows of natural gas. In this study, researchers analyzed the shale gas boom and decline in the Appalachian basin, the largest natural gas basin in the United States in terms of reserves and production, from 2004 to 2016. Over this period, volumes of regional shale production increased annually, and drilling peaked in 2013. Regional natural gas consumption for electric power and processing volumes also rose. Shale gas production exceeded regional demand, leading to substantial exports to other parts of the country. Specifically, the researchers examined premature mortality from fine particulate matter and secondary particulate matter formed from the atmospheric oxidation of nitrogen oxides and volatile organic compound emissions.
They also estimated the change in global mean temperature from carbon dioxide and methane emissions, as well as the effects on employment associated with the development of natural gas. The study estimated that shale gas production degraded air quality, resulting in 1,200 to 4,600 premature deaths (costing $2.3 billion to $61 billion), while boosting employment by 469,000 job-years, or jobs lasting a year (yielding wage income valued at $8 billion to $33 billion). The authors surmised that these results are in line with the boom-and-bust cycle of shale gas development. But the study also found that natural gas production affected climate (at a cost of $12 billion to $94 billion, depending on assumptions regarding social costs) in ways that will persist for generations to come, well beyond the period of natural gas activity in the region. "Our study provides insight on the cumulative socioeconomic and environmental impacts of natural gas systems," says Jared L. Cohon, former President of CMU, another coauthor. "The argument that natural gas may serve as a bridge fuel is in part premised on its comparative climate advantage over coal and cost advantage over renewables and other energy technologies. However, this is unsupported if natural gas prices do not reflect the actual economics for producing firms or the costs of damages to climate and air quality." The study also found that employment effects were concentrated in the rural areas where natural gas production occurred, but 76 percent of the cumulative premature mortality due to air pollution occurred downwind, in urban areas of the United States.
The cumulative effects of methane and carbon dioxide emissions on global mean temperature over a 30-year time period were nearly equivalent, but over the long term the cumulative climate impact was largely due to carbon dioxide, according to the study. The authors estimated that a tax on the production of natural gas of $2 per thousand cubic feet (mcf) would compensate for the cumulative climate and air quality effects across the supply chain. This is considerably greater than the current fee of $0.08/mcf. The authors were not advocating for such a tax to be implemented absent similar policies on other energy fuels. "One of our goals was to assess tradeoffs between different impacts, and comparing physical impacts reveals the implied tradeoffs involved with decisions about natural gas development," according to Erin N. Mayfield, Postdoctoral Scholar at CMU, who led the study. "Based on our cumulative estimates of premature mortality from air pollution and employment, we concluded that the implied tradeoff is 217 job-years per premature mortality at a systems level, or equivalently, three job-years per year of life lost." Among the study's limitations, the authors note, are that estimates of the impact of air pollution do not include all impacts associated with this type of pollution, and that health impacts from natural gas extend beyond those associated with air pollution and were not considered in the study.
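The quoted tradeoff of 217 job-years per premature death can be cross-checked against the cumulative figures in the article. A rough consistency sketch (the back-calculation is ours, not the authors'):

```python
# Cross-check: how many premature deaths are implied by the article's
# cumulative employment figure and the quoted 217 job-years per death?
job_years = 469_000                      # cumulative employment boost
deaths_low, deaths_high = 1_200, 4_600   # premature-death range from air pollution

implied_deaths = job_years / 217         # deaths consistent with the quoted ratio
print(round(implied_deaths))             # ~2161, inside the 1,200-4,600 range
```

The implied figure lands comfortably within the study's reported mortality range, so the systems-level ratio is internally consistent with the cumulative totals.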
Pollution
2019
December 16, 2019
https://www.sciencedaily.com/releases/2019/12/191216122420.htm
Big step in producing carbon-neutral fuel: Silver diphosphide
A new chemical process converts carbon dioxide pollution into syngas, a precursor of carbon-neutral liquid fuel, using a novel silver diphosphide catalyst.
This new, carbon-neutral process, created by researchers at Wake Forest University, uses silver diphosphide (AgP2) as a novel catalyst that takes carbon dioxide pollution from manufacturing plants and converts it to a material called syngas, from which the liquid fuel used in manufacturing is made. The new catalyst allows the conversion of carbon dioxide into fuel with minimal energy loss compared to the current state-of-the-art process, according to the Wake Forest researchers. "This catalyst makes the process much more efficient," said Scott Geyer, corresponding author of "Colloidal Silver Diphosphide Nanocrystals as Low Overpotential Catalysts for CO2 Reduction to Tunable Syngas," published online Dec. 16. Silver has been considered the best catalyst for this process to date. Adding phosphorus removes electron density from the silver, making the process more controllable and reducing energy waste. In the future, Geyer sees being able to power this process with solar energy, directly converting sunlight into fuel. The more efficient the chemical conversion process becomes, the more likely solar energy -- instead of coal or other non-renewable energy sources -- can be used to make fuel. "People make syngas out of coal all the time," Geyer said. "But we're taking something you don't want, carbon dioxide pollution, and turning it into something you want, fuel for industry." Geyer, whose lab focuses on understanding the role phosphorus plays in chemical reactions, is an assistant professor of chemistry at Wake Forest. The team that produced this paper includes Hui Li, who led the work as a Ph.D. student in Geyer's lab, plus former Wake Forest undergraduate Zachary Hood, chemistry Ph.D. student Shiba Adhikari, and physics Ph.D. student Chaochao Dun, all of whom have stayed connected with the program through their professional posts. "The ability to collaborate with a network of outstanding Wake Forest University graduates who are now at top universities and national laboratories across the United States has been essential in preparing this work, as it allows us to access one-of-a-kind instrumentation facilities at their current institutions," Geyer said.
Pollution
2019
December 16, 2019
https://www.sciencedaily.com/releases/2019/12/191216094533.htm
Warming climate will impact dead zones in Chesapeake Bay
In recent years, scientists have projected increasingly large summer dead zones in the Chesapeake Bay, areas where there is little or no oxygen for living things like crabs and fish to thrive, even as long-term efforts to reduce nutrient pollution continue. Researchers warn that climate change may also have a significant impact that could change the equation for nutrient reduction goals.
Researchers including Ming Li and Wenfei Ni from University of Maryland Center for Environmental Science factored in local impacts of climate change to make projections of what the oxygen content of the Chesapeake Bay will look like in the future."We projected that the hypoxic and anoxic volumes in Chesapeake Bay would increase by 10-30% between the late 20th and mid-21st century," said study author Ming Li of the University of Maryland Center for Environmental Science.The bay's hypoxic (low oxygen) and anoxic (no oxygen) zones, also called "dead zones," are caused by excess nutrient pollution, primarily from agriculture and wastewater. The excess nutrients stimulate an overgrowth of algae, which then sinks and decomposes in the water, consuming oxygen. The resulting low oxygen levels are insufficient to support most marine life and habitats in near-bottom waters, threatening the bay's crabs, oysters and other fisheries.The Chesapeake Bay has been experiencing rapid warming and accelerating relative sea level rise. In coastal waters, the depletion of oxygen in bottom water has occurred at faster rates than the open ocean and has been traditionally attributed to nutrient pollution and organic matter from the surrounding watershed and rivers."Previous studies have suggested that the climate change impact on hypoxia in Chesapeake Bay would be modest," said Ming Li. "We are saying there might actually be a bigger increase in hypoxia, and we need to factor climate change into nutrient management strategies. Maybe we'll have to make a bigger reduction of nutrient loading to offset the impact of climate change."The researchers used several climate models to make hypoxia projections for 2050 and got similar results."This has really raised some questions," he said. "Chesapeake is vulnerable to climate change."
Pollution
2019
December 14, 2019
https://www.sciencedaily.com/releases/2019/12/191214122548.htm
Following the lizard lung labyrinth
Take a deep breath in. Slowly let it out.
You have just participated in one of the most profound evolutionary revolutions on Earth -- breathing air on land. It's unclear how the first vertebrates thrived after crawling out of the sea nearly 400 million years ago, but the lungs hold an important clue.Birds, reptiles and mammals have evolved diverse lung structures through which air flows in complicated ways. Birds and mammals are on extreme ends of the airflow spectrum. Mammals inhale oxygen-rich air that funnels into smaller branches, ending in tiny sacs where oxygen enters and carbon dioxide leaves the bloodstream. When mammals exhale, the depleted air follows the same route out of the body, exhibiting a so-called tidal flow pattern.In contrast, bird breath travels tidally through part of the respiratory system, but in a one-way loop throughout most of the lung. Thanks to a unique design with aerodynamic valves, air always moves toward the head through many tiny tubes in birds -- during both inhalation and exhalation. Scientists thought this pattern of flow was hyper-efficient and evolved to support flight until University of Utah biologist Colleen Farmer's research group discovered that alligators and iguanas also have a unidirectional air flow pattern.In their latest study, U biologists have discovered that Savannah monitor lizards have lung structures that are a kind of a hybrid system of bird and mammal lungs. The researchers took CT scans of the entire lung labyrinth and used two different supercomputers to simulate airflow patterns at the highest resolution. The software used computational fluid dynamics similar to those used to forecast weather, calculating millions of equations every tenth of a second.
The findings show that vertebrate lung evolution is complicated and we have yet to understand the full picture."We don't know why animals have different types of lung air flow," said lead author Robert Cieri, a postdoc at the University of the Sunshine Coast who did the research while a graduate student in Farmer's lab. "Why do humans have the lungs we have versus the lungs of a bird? That's not a simple question. By answering that, maybe we can find out more about our own history."The paper was published on Dec. 13. The Savannah monitor lizard has long fascinated scientists because it has one of the most complicated lung systems of any reptile. In 2014, Cieri and colleagues analyzed one section of the lung that had primarily one-way airflow. This new study uses more powerful techniques to paint a more complete and more complicated picture. Savannah monitor lizard lungs are structured around a long bronchial tube that runs through to the back of the lung and opens into a big sac. Many smaller tubes branch off from the main one and distribute air into tiny chambers. These chambers have holes in their walls, allowing air to flow also from chamber to chamber. This complicated layout results in an airflow pattern that changes over the course of a breath cycle. It's a unique pattern that is part bird, part mammal.When the animal exhales, nearly all of the air flows towards the front of the lung and out of the trachea in a net unidirectional flow. At the beginning of inhalation, air enters through the trachea and flows towards the back of the lung. As the inhale continues, the air begins to distribute throughout the different side chambers and starts to loop back around towards the front of the lung. As these loops become more dominant, the late stages of inhalation look similar to exhalation because most of the air is flowing unidirectionally back from the central chamber. The complicated structure has no flaps or valves to direct airflow, the way valves in the heart direct blood.
Pure aerodynamics guide the complicated physics."This study is important in demonstrating it is possible to numerically analyze patterns of airflow in these extremely complicated lungs. This quantitative ability opens up new avenues to study the basic mechanisms of how aerodynamic valves work, and gives us better tools to piece together the evolutionary history of these patterns of flow and the structures that underpin them," said senior author Farmer.The physics is so complicated that Cieri needed two supercomputers from the Center for High Performance Computing at the U and the National Science Foundation's Blue Waters to run the computational fluid dynamics simulation. After creating the CT scans, he modified existing software to predict the velocity and pressure based on the lung structure. He divided the structures into millions of tiny "boxes." Each box has the physical parameters of that section of the lung. The simulation uses equations to predict what the pressure and velocity will be in the next box, and so on."There are millions of these elements. Each one is influencing another one every ten-thousandth of a second in every direction. That's why we needed the computer power -- the simulation is brute force balancing two equations at each step to figure out the next piece," said Cieri.The evolution of lungs is one crucial clue to understanding the pressures that led to where we are now. Along with learning more about lung evolution, Cieri believes we can learn something from the physics of the structure."We have this amazing wealth of really cool fluid dynamics out there in the animal world that we want to know more about. Maybe we can apply that knowledge to engineering or for human health," he said.
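The box-by-box update Cieri describes can be illustrated with a toy sketch. The code below is not the study's lung solver; it is a minimal, hypothetical one-dimensional upwind-advection step showing the general idea of how each grid "box" is updated from its neighbor at every time step:

```python
# Toy illustration of a grid-based flow simulation: a 1D row of "boxes,"
# each holding a value (here, a pulse of air), advanced with an explicit
# upwind advection step. (A sketch of the general CFD idea only, not the
# actual lung-airflow solver used in the study.)
def advect(values, speed, dx, dt):
    """One explicit upwind step: each box is updated from its upstream neighbor."""
    new = values[:]
    for i in range(1, len(values)):
        new[i] = values[i] - speed * dt / dx * (values[i] - values[i - 1])
    return new

# A small pulse of air moving through ten boxes.
field = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
for _ in range(3):
    field = advect(field, speed=1.0, dx=1.0, dt=1.0)
print(field)  # the pulse has moved three boxes downstream
```

With speed * dt / dx = 1, each step shifts the pulse exactly one box downstream; the real simulation balances coupled pressure and velocity equations in millions of such boxes per time step.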
Pollution
2019
December 13, 2019
https://www.sciencedaily.com/releases/2019/12/191213092515.htm
Barrels of ancient Antarctic air aim to track history of rare gas
Ancient air samples from one of Antarctica's snowiest ice core sites may add a new molecule to the record of changes to Earth's atmosphere over the past century and a half, since the Industrial Revolution began burning fossil fuels on a massive scale.
While carbon dioxide and methane are well known, researchers at the University of Washington and the University of Rochester are part of a team working to trace a much rarer gas, present at less than one in a trillion molecules. Though rare, the atmospheric detergent known as hydroxyl can scrub the atmosphere and determine the fate of more plentiful gases that affect Earth's climate.An Antarctic field campaign last winter led by the U.S. and Australia has successfully extracted some of the largest samples of air dating from the 1870s until today. These samples are a first step to learning the changes in hydroxyl concentration over the past 150 years. Early results from the fieldwork were shared this week at the American Geophysical Union's annual fall meeting in San Francisco."It's probably the most extreme atmospheric chemistry you can do from ice core samples, and the logistics were also extreme," said Peter Neff, a postdoctoral researcher with dual appointments at the UW and at the University of Rochester.But the months the team spent camped on the ice at the snowy Law Dome site paid off."This is, to my knowledge, the largest air sample from the 1870s that anyone's ever gotten," Neff said. His 10 weeks camped on the ice included minus-20 degrees Fahrenheit temperatures and several snowstorms, some of which he shared from Antarctica via Twitter.Air from deeper ice cores drilled in Antarctica and Greenland has provided a record of carbon dioxide and methane, two greenhouse gases, going back thousands of years. While carbon dioxide has a lifetime of decades to centuries, an even more potent gas, methane, has a lifetime of just nine or 10 years.Pinpointing the exact lifetime of methane, and how it has changed over the years, depends on the concentration of hydroxyl. 
That number is important for the global climate models used to study past and future climate.To trace the history of hydroxyl, a fleeting molecule with a lifetime of less than a millionth of a second, a field campaign in late 2018 and early 2019 drilled ice to study this very reactive gas by examining its slightly more plentiful companion: a carbon-14 atom (the rare, heavy carbon isotope) bonded to an oxygen atom, or "carbon-14 monoxide," which is chemically destroyed by hydroxyl and so tracks hydroxyl's concentrations.Researchers get the carbon-14 monoxide gas from bubbles in the ice that form when snow is compressed."The special thing about glacier ice is that it always has these air bubbles," Neff said. "Any glacier in the world is going to have that bubbly texture, because it started as a pile of six-fingered snowflakes, and between those fingers is air."One or several decades after hitting the ground, bubbles become completely sealed off from their surroundings due to compression under layers of snow. The heavy snow accumulation at Law Dome means plenty of air bubbles per year, and provides a thick enough shield to protect the carbon-14 monoxide from solar radiation.The international team extracted about two dozen 3-foot-long sections of ice per day, then put the tubes of ice in a snow cave to protect them from cosmic rays that are stronger near the poles. Those rays can zap other molecules and distort the historic record."Once the samples are at the surface, they're hot potatoes," Neff said.The day after extracting a core, the team would clean the ice and place it in a device of Neff and his University of Rochester postdoctoral supervisor Vasilii Petrenko's design: a 335-liter vacuum chamber in a warm bath to melt the ice and process the samples at their source, to avoid contamination and collect the biggest air samples."A single sample size was about 400 or 500 kilograms of ice, about the same weight as a grand piano, to get enough of that carbon-14 monoxide molecule," Neff said.
"At the field camp we turned 500 kilos of ice into one 50-liter canister of air."The team retrieved 20 barrel-shaped canisters of air from various time periods.Analysis over the coming months will aim to produce a concentration curve for carbon-14 monoxide and hydroxyl over the decades, similar to the now-famous curves for carbon dioxide and methane. The curves show how gas concentrations have changed in the atmosphere since the industrial era.Throughout the effort, Neff has also explored more lighthearted combinations of ice and air. During a trip in early 2016 to prepare for this effort, Neff did an unofficial experiment that went viral on social media when he posted it in February 2018. The video captures the sound a piece of ice makes when dropped down the tunnel created by an ice core drill.He shared more photos and videos during this past winter's expedition to Antarctica, sometimes within hours of returning from a remote camp to an internet-connected research station."It's great to be able to share something about Antarctica, from Antarctica," Neff said. "It's a way that we as geoscientists can share with people the work that they help to support."
Pollution
2019
December 12, 2019
https://www.sciencedaily.com/releases/2019/12/191212163336.htm
Study highlights high cost of fossil fuel pollution on children's health
A new study by researchers at the Columbia Center for Children's Environmental Health (CCCEH) at Columbia Mailman School of Public Health is the first to compile the estimated per-case costs of six childhood health conditions linked to air pollution -- estimates that can be incorporated into benefits assessments of air pollution regulations and climate change mitigation policies. Results appear in the journal
The study reports case-specific monetary estimates for preterm birth, low birth weight, asthma, autism spectrum disorder, attention-deficit/hyperactivity disorder, and IQ reduction in children -- which scientific evidence shows are among the known or likely health consequences of prenatal and early childhood exposure to air pollution, 80 percent of which is attributable to burning of coal, oil, diesel and gas.The researchers conducted a systematic review of the scientific literature published between January 1, 2000 and June 30, 2018 to identify relevant economic costs for these six adverse health outcomes in children. In all, they reviewed 1,065 papers from the U.S. and U.K. and identified the 12 most relevant papers. They separately identified estimates of the lost lifetime earnings associated with the loss of a single IQ point.The study cites previously published estimates ranging from $23,573 for childhood asthma not persisting into adulthood to $3,109,096 for a case of autism with a concurrent intellectual disability. The researchers also provide an example of cumulative costs: they calculate a savings of $267 million if the number of pre-term births in the United States attributable to PM2.5 (a measure of particulate matter -- one of several harmful air pollutants) were reduced by just 1 percent.The study authors prioritized monetary estimates that factored in both immediate medical costs and longer-term and broad societal costs. However, they acknowledge that their figures are likely underestimates because they don't adequately capture the long-term health and societal impacts -- for example, effects over the full life-course or losses in economic productivity.Despite the high burden of childhood illness, benefit-cost assessments of policies and other interventions have not adequately considered impacts in children -- both in terms of avoided cases and avoided economic costs.
For example, the air pollution-related child health outcomes considered in the Benefits Mapping and Analysis Program Community Edition (BenMAP-CE) of the U.S. Environmental Protection Agency have been limited to acute bronchitis, lower and upper respiratory symptoms, school day loss, and asthma exacerbation in children. Whereas the BenMAP program estimates lifetime costs of a case of adult chronic asthma at $53,607, the total cost of each case of childhood asthma that persists into adulthood is estimated at $91,954.The World Health Organization estimates that more than 40 percent of the burden of environmentally related disease and more than 88 percent of the burden of climate change is borne by children younger than 5. In the U.S., disorders such as asthma and ADHD are prevalent in children and have been increasing over time, with asthma having a prevalence of about 8 percent and ADHD a prevalence of 10 percent. Even disorders with lower prevalence, such as autism, represent a growing public health concern, with about 1 in 60 U.S. children affected."Impacts on children's health are generally under-represented in benefits assessments related to environmental pollution," says study co-author Frederica Perera, PhD, professor of environmental health sciences and director of translational research at the Columbia Center for Children's Environmental Health. "Policies to clean our air and address the serious and escalating problem of climate change will yield numerous benefits for children's health and for the financial health of families and our nation."
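The cumulative-cost arithmetic described above multiplies a per-case cost by the number of avoided cases. The sketch below shows the shape of that calculation; both input figures are placeholder assumptions for illustration, not values from the study:

```python
# Hedged sketch of the cumulative-cost arithmetic. Both inputs are
# hypothetical placeholders, NOT figures reported by the study:
cost_per_case = 65_000        # assumed per-case societal cost of preterm birth (USD)
attributable_cases = 16_000   # assumed annual U.S. cases attributable to PM2.5

reduction = 0.01  # a 1 percent reduction in attributable cases
savings = cost_per_case * attributable_cases * reduction
print(f"${savings:,.0f}")  # about $10.4 million under these placeholder inputs
```

The study's own $267 million figure for preterm birth implies much larger inputs than these placeholders; the point of the sketch is only the structure of the benefit calculation.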
Pollution
2019
December 11, 2019
https://www.sciencedaily.com/releases/2019/12/191211171328.htm
Risk analysis powers air pollution solutions
Air pollution exposure threatens human health both outdoors and when polluted air infiltrates homes, offices, schools and vehicles. Exposure to certain particulate matter can cause respiratory, cardiovascular and nervous system issues, especially in vulnerable populations. Several presentations at the 2019 Society for Risk Analysis (SRA) Annual Meeting will explore new ways to measure and track air pollutants to reduce public health risk.
The first step in reducing exposure is to quantify current exposures to understand when and where people are at the highest risk. H. Christopher Frey, Ph.D., North Carolina State University, is part of a Hong Kong-based team that used portable sensor technology to measure concentrations of air pollutants at 200 locations. The team then used this data to quantify air pollutant exposure outdoors and find out how much ambient air pollution penetrates indoors.The information collected from the Hong Kong study, "Measurement and modeling of urban personal air pollution exposure in Hong Kong," will be incorporated into a publicly available cellphone app for the Personalized Real-Time Air Quality Informatics System for Exposure -- Hong Kong (PRAISE-HK). The app lets users identify air pollutants to which they are sensitive and identify ways to reduce exposure via the key features including route choice to a particular destination. Users will also be able to receive alerts about locations with high exposures and obtain exposure forecasts for up to 48 hours to aid in planning activities.In India, 'chalk and talk' is the most common and widespread method of teaching as it enhances interaction between students and teachers, but a great concern associated with this method is the generation of chalk dust. Students and teachers then inhale the dust with negative impacts on their health. Abinaya Sekar, National Institute of Technology Calicut, studied the exposure to chalk dust that teaching professionals face during class.Sekar's study, "Particulate matter exposure of teaching professionals during a typical chalk and talk class," analyzed chalk dust exposure for 40 teaching professionals in terms of particulate matter in three size ranges -- PM 10, PM 2.5 and PM 1. The study found that exposure varied with respect to the teacher's height. Those in the 5'-5'6" range had maximum exposure while writing in the middle portion of the board. 
Those who are in the 5'6"-5'9" range had maximum exposure while writing in the top portion."This research may provide awareness among the public to limit the usage of chalks by both students and teachers," states Sekar. "This may help policy makers to shift from traditional classrooms to smart classrooms."Over the past several years, air pollution has been linked to an increased risk of several respiratory diseases in children, especially respiratory tract infections. Elisa Gallo, University of Padova, conducted a study, "Increasing risk of emergency room admissions for bronchiolitis in infants exposed to air pollution," aimed at evaluating the association between air pollution and pediatric emergency room (ER) admissions for bronchiolitis.Gallo found that particulate matter and nitrogen dioxide were associated with ER admissions, although particulate matter was more likely to exacerbate the condition in children already showing mild symptoms. Gallo's research is intended to help inform parents of the risks of poor air quality, since children are very sensitive to lung disease while their immune system is developing, and early exposure can have lifelong consequences. In particular, bronchiolitis has been associated with a higher risk of lifetime wheezing and asthma.These studies will be presented during the Exposure Assessment of Air Pollutants: New Frontiers in the Assessment of Public Health Risks session on Wednesday, Dec. 11 at the 2019 SRA Annual Meeting at the Crystal Gateway Marriott in Arlington, Virginia.
Pollution
2019
December 11, 2019
https://www.sciencedaily.com/releases/2019/12/191211083539.htm
Why polar bears at sea have higher pollution levels than those staying on land
As the climate changes, myriad animal populations are being impacted. In particular, Arctic sea-ice is in decline, causing polar bears in the Barents Sea region to alter their feeding and hunting habits. Bears that follow sea-ice to offshore areas have higher pollutant levels than those staying on land -- but why? A new study in ACS'
Barents Sea polar bears fall into two categories: pelagic, which migrate annually to hunt at sea, and coastal, which stay on land to fast or hunt. Changes in sea-ice availability have forced both types of bears to adjust how they find food. In recent decades, pelagic bears have shifted northward as southern ice has receded. Pelagic bears now have farther to migrate, while longer periods without ice have led coastal bears to feed on land-based prey or rely on their fat reserves. Previous studies have shown that pelagic bears have higher levels of pollutants, such as persistent organic pollutants (POPs), in their bodies, but little is known about why that difference exists. To solve this mystery, Pierre Blévin at the Norwegian Polar Institute and colleagues collected an array of data that paints a clearer picture of how climate change affects polar bears.In their study, the researchers gathered data on feeding habits, migration patterns, energy expenditure and geography to determine how the two polar bear types differed. They also measured pollutant levels in the prey that polar bears typically consume. The results indicated that several factors cause pelagic bears to accumulate more pollutants than those that stay on land. Sea-based bears feed on a higher proportion of marine life, especially those that are higher up the food chain, leading to multiple layers of polluted food, compared to land hunters. In addition, sea hunters have higher energy requirements, which in turn causes them to consume more prey. Pelagic bears also feed on prey located closer to pollutant sources and transport pathways. These combined factors highlight the unique pollution exposure mechanisms that polar bears face in this region, and how increased sea hunting by polar bears could enhance pollutant accumulation in these animals as the ice recedes, the researchers say.
Pollution
2019
December 11, 2019
https://www.sciencedaily.com/releases/2019/12/191211100304.htm
How are Utah's dry lakes impacting air quality and human health?
The Great Salt Lake reached historic low levels in recent years and continues to dry as a result of drought and water diversions. As water levels decrease, the exposed area of dry lakebed increases, creating major sources of mineral dust. Declining water levels are a major concern for scientists and the general public alike, but air quality is often overlooked as one of the potentially harmful consequences of receding lakes.
New research from BYU's geological sciences department found that about 90 percent of dust in Utah's Wasatch Front comes from the west desert, an area that was once covered by the prehistoric Lake Bonneville but that is now a dried lakebed. More recently, shallow lakes like Sevier Dry Lake and the Great Salt Lake, which are remnants of Lake Bonneville, have been exposed as water inflows are diverted for consumptive use. Researchers predict this percentage is only going to increase as water levels decline and more dry lakebed is exposed."Lakebeds are muddy, but as they dry out, they become a dust pan," said study co-author and former BYU graduate student Michael Goodman. "Dry lake beds are becoming a significant dust threat to nearby communities, not only impacting air quality but also impacting soil and what can grow in it."Researchers collected and compared over 100 dust samples from three different sources: dried lakebeds from Sevier Lake and the Great Salt Lake in Utah's west desert, urban areas along the Wasatch Front and mountain snowpack from the Uintah Mountains. The team found salts common to dried lakebeds in the urban and mountain areas, suggesting that dust from dried lakebeds is transferred to these other locations.While most dust along the Wasatch Front comes from drying lakebeds, researchers said that the most dangerous contaminants are still coming from urban areas."The dust we sampled contained potentially toxic metals, and those come primarily from the urban and mining areas," said BYU geology professor and co-author Greg Carling. "Even though the urban and mining area contributes only a small fraction of the dust load, it contains the most contaminants, such as antimony and copper."Carling said whether it's the drying up of lakes or the emission of dangerous chemicals, our actions often have unintended consequences that may negatively affect the environment. 
The team also acknowledges that further knowledge of dust sources may be useful for understanding how water diversions, climate change and population growth affect the regional dust cycle in the future."Most people probably wouldn't think that something out in the west desert has a direct impact on the Wasatch Front," Carling said. "But we have to consider that there are consequences for our actions, many of which are indirect." Researchers also said dust should be considered an important factor in poor air quality and human health, and the amount of dust blown into urban areas could be lowered through preserving lakes.
Pollution
2019
December 9, 2019
https://www.sciencedaily.com/releases/2019/12/191209110906.htm
Tackling air pollution: Researchers present emissions inventory for Nepal
Data on emission amounts and sources have an important role to play in shaping policy on climate protection and air quality. Now, scientists from the Institute for Advanced Sustainability Studies (IASS) in Potsdam, Germany, have presented the first high-resolution inventory to record emissions of greenhouse gases and air pollutants in Nepal over an extended period of time. Their research reveals that the air pollution problem is growing at a much faster rate than the economy.
"Over eighty percent of Nepal's energy needs are met by biomass, in particular, wood. This gives rise to considerable amounts of particulate matter and ozone precursors, which adversely affect the climate, air quality, human health, crops and the cryosphere, those parts of the Earth's surface where water is in solid form. The emissions inventory helps us identify the main causes of emissions, the proportional contributions of individual sources or sectors, and critical regions," explains lead author Pankaj Sadavarte. The researchers estimated emissions from fuel consumption due to technology use in private households, industry, agriculture, the transport sector, and other economic sectors in the period from 2001 to 2016.In Nepal, private households account for a much larger share of fuel consumption and hence emissions, especially particulates, than in industrial nations like Germany. For example, in 2011 they were responsible for 58 per cent of emissions of soot, one of the main components of particulate matter. The traditional low-efficiency wood-burning stoves used in most Nepalese homes for cooking and heating are the main culprits here. As well as being detrimental to human health, soot is an important climate forcer, the second largest after carbon dioxide. Carbon dioxide emissions in the same year came mainly from industries (46%), notably cement factories, followed by private households (31%), and transport (14%).The significant increase in total emissions from industry and the transport sector is particularly striking. In the period from 2001 to 2016 industrial emissions tripled and emissions from transport more than quadrupled. By contrast, the increase in emissions from private households, though still the dominant source, was only marginal over the same period. "Fossil-based energy consumption increased manifold during the investigation period. For example, consumption of LPG, petrol and diesel rose by a factor of 7, 6 and 4 respectively. 
But the national gross domestic product increased by only 74% from around US$11.42 billion in 2001. This means that the pollution problem is growing at a much faster rate than the economy -- a trend that should ideally be reversed," says co-author Maheswar Rupakheti.The researchers are currently working on part 2 of the inventory, which will present data on emissions from open fires, like agricultural waste and municipal solid waste. According to Rupakheti, as well as advancing research, the data can make a valuable contribution to analysing the mitigation potential of various measures and designing evidence-based policies. "They are helpful when it comes to evaluating potential air quality solutions with co-benefits to climate and other associated issues. We have calculated, for example, that the most important air pollutants can be reduced by 30 per cent if so-called superemitters -- highly polluting vehicles -- are removed from the transport system. So that would be a good immediate policy target." With this new emissions inventory and the forthcoming second part, the researchers are working to develop further air quality and climate protection strategies together with stakeholders on the ground in Nepal.
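The gap Rupakheti describes between fuel-consumption growth and economic growth can be checked directly from the figures quoted above (a simple sketch; the growth factors come from the quote, not recomputed from the inventory itself):

```python
# Growth multiples over 2001-2016, as quoted in the article.
lpg_factor, petrol_factor, diesel_factor = 7, 6, 4  # fuel consumption multiples
gdp_growth = 0.74                                   # GDP grew by 74%
gdp_factor = 1 + gdp_growth

# How many times faster each fuel's consumption grew than the economy:
for name, factor in [("LPG", lpg_factor), ("petrol", petrol_factor), ("diesel", diesel_factor)]:
    print(name, round(factor / gdp_factor, 1))  # e.g. LPG outpaced GDP roughly fourfold
```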
Pollution
2019
December 6, 2019
https://www.sciencedaily.com/releases/2019/12/191206173634.htm
Dramatic health benefits following air pollution reduction
Reductions in air pollution yielded fast and dramatic impacts on health outcomes, as well as decreases in all-cause morbidity, according to findings in "Health Benefits of Air Pollution Reduction," new research published in the American Thoracic Society's journal,
The study by the Environmental Committee of the Forum of International Respiratory Societies (FIRS) reviewed interventions that have reduced air pollution at its source. It looked for outcomes and time to achieve those outcomes in several settings, finding that the improvements in health were striking. Starting at week one of a ban on smoking in Ireland, for example, there was a 13 percent drop in all-cause mortality, a 26 percent reduction in ischemic heart disease, a 32 percent reduction in stroke, and a 38 percent reduction in chronic obstructive pulmonary disease (COPD). Interestingly, the greatest benefits in that case occurred among non-smokers."We knew there were benefits from pollution control, but the magnitude and relatively short time duration to accomplish them were impressive," said lead author of the report, Dean Schraufnagel, MD, ATSF. "Our findings indicate almost immediate and substantial effects on health outcomes followed reduced exposure to air pollution. It's critical that governments adopt and enforce WHO guidelines for air pollution immediately."In the United States, a 13-month closure of a steel mill in Utah resulted in reducing hospitalizations for pneumonia, pleurisy, bronchitis and asthma by half. School absenteeism decreased by 40 percent, and daily mortality fell by 16 percent for every 100 µg/m3 decrease in PM10 (a pollutant). Women who were pregnant during the mill closing were less likely to have premature births.A 17-day "transportation strategy" in Atlanta, Georgia, during the 1996 Olympic Games involved closing parts of the city to help athletes make it to their events on time, but also greatly decreased air pollution. In the following four weeks, children's visits for asthma to clinics dropped by more than 40 percent and trips to emergency departments by 11 percent. Hospitalizations for asthma decreased by 19 percent.
Similarly, when China imposed factory and travel restrictions for the Beijing Olympics, lung function improved within two months, with fewer asthma-related physician visits and less cardiovascular mortality. In addition to city-wide policies, reducing air pollution within the home also led to health benefits. In Nigeria, families who had clean cook stoves that reduced indoor air pollution during a nine-month pregnancy term saw higher birthweights, greater gestational age at delivery, and less perinatal mortality. The report also examines the economic impact of environmental policies. It highlights that, 25 years after enactment of the Clean Air Act, the U.S. EPA estimated that the health benefits exceeded the cost by 32 to 1, saving $2 trillion, and the act has been heralded as one of the most effective public health policies of all time in the United States. Emissions of the major pollutants (particulate matter [PM], sulfur oxides, nitrogen oxides, carbon monoxide, volatile organic compounds, and lead) were reduced by 73 percent between 1990 and 2015 while the U.S. gross domestic product grew by more than 250 percent. Given these findings, Dr. Schraufnagel has hope. "Air pollution is largely an avoidable health risk that affects everyone. Urban growth, expanding industrialization, global warming, and new knowledge of the harm of air pollution raise the degree of urgency for pollution control and stress the consequences of inaction," he says. "Fortunately, reducing air pollution can result in prompt and substantial health gains. Sweeping policies affecting a whole country can reduce all-cause mortality within weeks. Local programs, such as reducing traffic, have also promptly improved many health measures."
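The Clean Air Act figures above invite a quick arithmetic check. The press release does not say whether the $2 trillion figure is the total benefit or the net savings, so both readings are computed below; this is an illustration of the 32:1 ratio, not EPA's actual accounting:

```python
# Hedged arithmetic on the quoted figures: benefits exceeded costs 32:1.
# Reading 1: $2 trillion is the total benefit  -> cost = benefit / 32.
# Reading 2: $2 trillion is the net savings    -> cost = savings / 31,
# since savings = benefit - cost = 32*cost - cost = 31*cost.

BENEFIT_RATIO = 32
TWO_TRILLION = 2e12

cost_if_total_benefit = TWO_TRILLION / BENEFIT_RATIO        # $62.5 billion
cost_if_net_savings = TWO_TRILLION / (BENEFIT_RATIO - 1)    # ~$64.5 billion

print(round(cost_if_total_benefit / 1e9, 1))  # 62.5
print(round(cost_if_net_savings / 1e9, 1))    # 64.5
```

Either way, the implied cost is on the order of tens of billions of dollars against trillions in benefits.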
Pollution
2019
December 5, 2019
https://www.sciencedaily.com/releases/2019/12/191205155319.htm
As China rapidly adopts clean energy, use of traditional stoves persists
Old habits are hard to break. A McGill-led study of the replacement of traditional wood- and coal-burning stoves with clean energy in China suggests that, without a better understanding of the reasons behind people's reluctance to give up traditional stoves, it will be difficult for policies in China and elsewhere in the world to succeed in encouraging this shift towards clean energy. The study was published recently.
China is ahead of most low- and middle-income countries in its energy transition: hundreds of millions of rural homes started using clean fuels such as electricity and gas in recent decades. Despite this, many Chinese homes continue using their traditional coal and wood-burning stoves -- a trend that is common in many countries. Air pollution from traditional stoves contributed to approximately 2.8 million premature deaths in 2017 and is a major contributor to regional climate change. Efforts made by governments, NGOs, and researchers to incentivize households to switch entirely to clean fuel stoves and give up their traditional stoves -- even in highly controlled randomized trials -- have largely failed. "Families have used their traditional stoves for generations. People know what foods taste best with those stoves and how to best use them so that all energy needs are met," said Jill Baumgartner, an Associate Professor in McGill's Department of Epidemiology, Biostatistics and Occupational Health and the senior author on the study by an international team of researchers. "Clean fuel stoves do not always meet all of the energy uses provided by traditional stoves, and may also pose additional costs. The desire to continue using traditional stoves is also seen in the U.S. and Canada, where many homes still use wood fireplaces for space heating despite being well-equipped to only use gas and electric heaters." The researchers gathered data from over 700 homes in three provinces in China (Beijing, Shanxi and Guangxi) using a photo-based questionnaire, which allowed participants to point to each type of stove that they had ever owned.
They then asked questions about the frequency and timing of use, as well as what fuel types they used with the household stoves. "We were surprised by the number of stoves currently used in different homes, sometimes up to 13 different cooking devices and 7 different heating stoves," said Ellison Carter, the first author on the paper, who is an Assistant Professor in the Department of Civil and Environmental Engineering at Colorado State University. "We were also surprised to find that most homes that had first started using clean fuels well over a decade ago were still using their solid fuel stoves, again highlighting the enormous challenge of achieving exclusive use of clean fuel stoves." The researchers found that the factors associated with adoption of clean fuels were different from those associated with the suspension of traditional coal and wood burning stoves. The common traits among those who stopped using traditional stoves were that they tended to be younger, more educated, and had poorer self-reported health. Those who adopted clean technology tended either to be younger or retired, lived in smaller households, and had a higher income. This study focused only on China, which is home to over 500 million solid fuel stove users. But an important question for the researchers is how generalizable these results from China are to other countries with different levels of economic development. They were recently awarded funding from the U.S. NIH to develop a framework on the household and community factors that can facilitate suspension of solid fuel stoves. For this work, they obtained a number of nationally representative datasets on energy use from India, Cambodia, and a number of countries in Sub-Saharan Africa to empirically assess this same question in other settings. "We have good evidence from many countries on the policies and programs that can promote the uptake of clean fuel stoves," said Baumgartner.
"We now need to better understand the individual or combinations of policies and programs that can accelerate the suspension of solid fuel stoves, particularly for the poorest and most vulnerable."
Pollution
2019
December 5, 2019
https://www.sciencedaily.com/releases/2019/12/191205130602.htm
A solution for cleaning up PFAS, one of the world's most intractable pollutants
A cluster of industrial chemicals known by the shorthand term "PFAS" has infiltrated the far reaches of our planet with significance that scientists are only beginning to understand.
PFAS -- per- and polyfluoroalkyl substances -- are human-made fluorine compounds that have given us nonstick coatings, polishes, waxes, cleaning products and firefighting foams used at airports and military bases. They are in consumer goods like carpets, wall paint, popcorn bags and water-repellant shoes, and they are essential in the aerospace, automotive, telecommunications, data storage, electronic and healthcare industries. The carbon-fluorine chemical bond, among nature's strongest, is the reason behind the wild success of these chemicals, as well as the immense environmental challenges they have caused since the 1940s. PFAS residues have been found in some of the most pristine water sources, and in the tissue of polar bears. Science and industry are called upon to clean up these persistent chemicals, a few of which, in certain quantities, have been linked to adverse health effects for humans and animals. Among those solving this enormously difficult problem are engineers in the Walter Scott, Jr. College of Engineering at Colorado State University. CSU is one of a limited number of institutions with the expertise and sophisticated instrumentation to study PFAS by teasing out their presence in unimaginably small amounts. Now, CSU engineers led by Jens Blotevogel, research assistant professor in the Department of Civil and Environmental Engineering, have published a new set of experiments tackling a particular PFAS compound called hexafluoropropylene oxide dimer acid, better known by its trade name, GenX. The chemical, and other polymerization processes that use similar chemistries, have been in use for about a decade.
They were developed as a replacement for legacy PFAS chemicals known as "C8" compounds that were -- and still are -- particularly persistent in water and soil, and very difficult to clean up (hence their nickname, "forever chemicals"). GenX has become a household name in the Cape Fear basin area of North Carolina, where it was discovered in the local drinking water a few years ago. The responsible company, Chemours, has committed to reducing fluorinated organic chemicals in local air emissions by 99.99%, and air and water emissions from its global operations by at least 99% by 2030. For the last several years, Chemours has also funded Blotevogel's team at CSU as they test innovative methods that would help the environment as well as assist the company's legacy cleanup obligations. One of the current practices for treating GenX-contaminated water is high-temperature incineration -- a process that is "excessively expensive," according to the researchers, and very wasteful in terms of water and energy recovery. "It works," Blotevogel said, "but it's not sustainable." The researchers are offering a better solution. Tong, a leading expert in membrane filtration and desalination methods for environmental hazards, employed a nanofiltration membrane with appropriate pore sizes to filter out 99.5% of dissolved GenX compounds. Once that concentrated waste stream is generated, the researchers showed that electrochemical oxidation, which Blotevogel considers one of the most viable technologies for destructive PFAS cleanup, can then break down the waste into harmless products. Currently, companies can also use several measures for removal of PFAS from water to acceptable levels: adsorption to activated carbon, ion exchange, and reverse osmosis.
While all three of these technologies can be highly effective, they do not result directly in destruction of PFAS compounds, Blotevogel said. The CSU researchers' alternative, electrochemical treatment, uses electrodes to chemically change the PFAS into more benign compounds. Blotevogel's lab has demonstrated several successful pilot-scale decontamination efforts, and is working to continue optimizing its methodologies. Combined with Tong's nanofiltration system, the waste stream would be directed and concentrated, saving companies money and lowering the entire process's carbon footprint. The researchers hope to continue working together to refine their process, for example, by testing different types of filtration membranes to determine the optimal materials and design.
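A hedged mass-balance sketch of the two-stage treatment train described above: nanofiltration retaining 99.5% of dissolved GenX, with the concentrated reject stream then sent to electrochemical oxidation. The feed concentration and the water-recovery fraction are illustrative values, not from the paper:

```python
# Illustrative single-pass nanofiltration mass balance, not the study's data.
# rejection: fraction of dissolved GenX retained by the membrane (99.5% above).
# recovery: fraction of feed volume exiting as clean permeate (assumed 90%).

def two_stage_treatment(feed_conc_ngL, rejection=0.995, recovery=0.9):
    """Return (permeate_conc, concentrate_conc) in the feed's units."""
    permeate_conc = feed_conc_ngL * (1 - rejection)
    # Mass balance: feed mass = permeate mass + concentrate mass
    concentrate_conc = (feed_conc_ngL - recovery * permeate_conc) / (1 - recovery)
    return permeate_conc, concentrate_conc

perm, conc = two_stage_treatment(1000.0)  # 1000 ng/L illustrative feed
print(perm)  # 5.0 ng/L leaves in the permeate
print(conc)  # ~9955 ng/L concentrated for electrochemical destruction
```

The point of the pairing: the membrane produces a small, highly concentrated stream, which is exactly what makes the destructive oxidation step economical.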
Pollution
2019
December 5, 2019
https://www.sciencedaily.com/releases/2019/12/191205130554.htm
Prenatal and early life exposure to multiple air pollutants increases odds of toddler allergies
A new article in
"Because most children are exposed to more than one pollutant or allergen, we examined the relationship between multiple exposures and allergic sensitizations at 2 years of age," says Mallory Gallant, MSc, lead author of the study. "We examined exposure to dogs, cats, air fresheners, candles, mold, environmental tobacco smoke (ETS) and carpet, all of which have been associated with childhood allergies. Of the exposures we measured, prenatal exposure to candles, 6-month exposure to cats and 2-year exposure to ETS significantly increased the chance of a positive skin prick test (SPT) at 2 years of age." 108 mother-child pairs were followed from birth to 2 years of age. Exposure data for air fresheners, candles, mold, cats, dogs, carpet and environmental tobacco smoke (ETS) were collected at the prenatal, 6-month, 1-year, and 2-year timepoints. An SPT was performed on both the mother and the 2-year-old child to measure allergic sensitivity. Allergic sensitization means that a person has had (or may have had, given the possibility of false positives) an allergic-type immune response to a substance. But it does not necessarily mean that the substance causes them problems. "The increase in the average amount of time indoors means there is an increased risk of harmful health outcomes related to exposure to indoor air pollutants," says allergist Anne K. Ellis, MD, study author, and member of the ACAAI Environmental Allergy Committee. "Additionally, children breathe more frequently per minute than adults, and mostly breathe through their mouths. These differences could allow for air pollutants to penetrate more deeply into the lungs and at higher concentrations, making children more vulnerable to air pollutants." Another goal of the study was to evaluate the effect of multiple exposures on allergic outcomes at 2 years of age.
The study found that children with a positive SPT at 2 years of age had significantly more exposures prenatally, at the 1-year and 2-year time points compared to children with a negative SPT. As the number of indoor air polluting exposures increased, the percentage of children with a positive SPT increased. Dr. Ellis says, "When considered together, the findings suggest that the effect of multiple exposures may contribute more to allergy development than one single exposure."
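A hedged sketch of the kind of tabulation described above: grouping children by their number of indoor air-polluting exposures and computing the share with a positive SPT in each group. The records below are invented placeholders, not the study's data:

```python
from collections import defaultdict

# (number_of_exposures, spt_positive) per child -- invented, illustrative only
records = [(0, False), (1, False), (1, True), (2, True),
           (3, True), (3, False), (4, True), (5, True)]

by_count = defaultdict(lambda: [0, 0])  # exposures -> [positives, total]
for n_exposures, positive in records:
    by_count[n_exposures][1] += 1
    if positive:
        by_count[n_exposures][0] += 1

# Percentage of children with a positive SPT at each exposure count
for n in sorted(by_count):
    pos, total = by_count[n]
    print(n, round(100.0 * pos / total, 1))
```

A real analysis would of course use the study's 108 pairs and appropriate statistics; this only illustrates the exposure-count grouping.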
Pollution
2019
December 3, 2019
https://www.sciencedaily.com/releases/2019/12/191203104759.htm
Smog-eating graphene composite reduces atmospheric pollution
Graphene Flagship partners the University of Bologna, Politecnico di Milano, CNR, NEST, Italcementi HeidelbergCement Group, the Israel Institute of Technology, Eindhoven University of Technology, and the University of Cambridge have developed a graphene-titania photocatalyst that degrades up to 70% more atmospheric nitrogen oxides (NOx) than standard titania nanoparticles in tests on real pollutants.
Atmospheric pollution is a growing problem, particularly in urban areas and in less developed countries. According to the World Health Organization, one out of every nine deaths can be attributed to diseases caused by air pollution. Pollutants such as nitrogen oxides and volatile organic compounds are the main cause of this, and they are mostly emitted by vehicle exhausts and industry. To address the problem, researchers are continually on the hunt for new ways to remove more pollutants from the atmosphere, and photocatalysts such as titania are a great way to do this. When titania is exposed to sunlight, it degrades nitrogen oxides -- which are very harmful to human health -- and volatile organic compounds present at the surface, oxidising them into inert or harmless products. Now, the Graphene Flagship team working on photocatalytic coatings, coordinated by Italcementi, HeidelbergCement Group, Italy, has developed a new graphene-titania composite with significantly more powerful photodegradation properties than bare titania. "We answered the Flagship's call and decided to couple graphene to the most-used photocatalyst, titania, to boost the photocatalytic action," comments Marco Goisis, the research coordinator at Italcementi. "Photocatalysis is one of the most powerful ways we have to depollute the environment, because the process does not consume the photocatalysts. It is a reaction activated by solar light," he continues. By performing liquid-phase exfoliation of graphite -- a process that creates graphene -- in the presence of titania nanoparticles, using only water and atmospheric pressure, they created a new graphene-titania nanocomposite that can be coated on the surface of materials to passively remove pollutants from the air.
If the coating is applied to concrete on the street or on the walls of buildings, the harmless photodegradation products could be washed away by rain or wind, or manually cleaned off. To measure the photodegradation effects, the team tested the new photocatalyst against NOx and recorded a marked improvement in photocatalytic degradation of nitrogen oxides compared to standard titania. They also used rhodamine B as a model for volatile organic pollutants, as its molecular structure closely resembles those of pollutants emitted by vehicles, industry and agriculture. They found that 40% more rhodamine B was degraded by the graphene-titania composite than by titania alone, in water under UV irradiation. "Coupling graphene to titania gave us excellent results in powder form -- and it could be applied to different materials, of which concrete is a good example for widespread use, helping us to achieve a healthier environment. It is low-maintenance and environmentally friendly, as it just requires the sun's energy and no other input," Goisis says. But there are challenges to be addressed before this can be used on a commercial scale. Cheaper methods to mass-produce graphene are needed. Understanding of the interactions between the catalyst and the host material needs to be deepened, as do studies of the long-term stability of the photocatalyst in the outdoor environment. Ultrafast transient absorption spectroscopy measurements revealed an electron transfer process from titania to the graphene flakes, decreasing the charge recombination rate and increasing the efficiency of reactive species photoproduction -- meaning more pollutant molecules could be degraded. Xinliang Feng, Graphene Flagship Work Package Leader for Functional Foams and Coatings, explains: "Photocatalysis in a cementitious matrix, applied to buildings, could have a large effect to decrease air pollution by reducing NOx and enabling self-cleaning of the surfaces -- the so-called "smog-eating" effect.
Graphene could help to improve the photocatalytic behaviour of catalysts like titania and enhance the mechanical properties of cement. In this publication, Graphene Flagship partners have prepared a graphene-titania composite via a one-step procedure to widen and improve the ground-breaking invention of "smog-eating" cement. The prepared composite showed enhanced photocatalytic activity, degrading up to 40% more pollutants than pristine titania in the model study, and up to 70% more NOx with a similar procedure. Moreover, the mechanism underlying this improvement was briefly studied using ultrafast transient absorption spectroscopy." Enrico Borgarello, Global Product Innovation Director at Italcementi, part of the HeidelbergCement Group, one of the world's largest producers of cement, comments: "Integrating graphene into titania to create a new nanocomposite was a success. The nanocomposite showed a strong improvement in the photocatalytic degradation of atmospheric NOx, boosting the action of titania. This is a very significant result, and we look forward to the implementation of the photocatalytic nanocomposite for a better quality of air in the near future." The reasons to incorporate graphene into concrete do not stop here. Italcementi is also working on another product -- an electrically conductive graphene concrete composite, which was showcased at Mobile World Congress in February this year. When included as a layer in flooring, it could release heat when an electrical current is passed through it. Goisis comments: "You could heat your room, or the pavement, without using water from a tank or boiler. This opens the door to innovation for the smart cities of the future -- particularly to self-sensing concrete," which could detect stress or strain in concrete structures and monitor for structural defects, providing warning signals if the structural integrity is close to failure. Andrea C.
Ferrari, Science and Technology Officer of the Graphene Flagship and Chair of its Management Panel, adds: "An ever-increasing number of companies are now partners, or associate members, of the Graphene Flagship, since they recognize the potential for new and improved technologies. In this work, Italcementi, a leader in Italy in the field of building materials, demonstrated a clear application of graphene for the degradation of environmental pollutants. This can not only have commercial benefits but, most importantly, benefit society by resulting in a cleaner and healthier environment."
Pollution
2019
December 3, 2019
https://www.sciencedaily.com/releases/2019/12/191203102036.htm
How to improve water quality in Europe
Toxic substances from agriculture, industry and households endanger water quality in Europe -- and by extension, ecosystems and human health. As part of the SOLUTIONS project, over 100 international scientists have developed methods and practical solutions for identifying pollutants and assessing the risks posed by chemical cocktails. This is intended to help reduce pollution in water resources. Researchers have described how politicians can implement these scientific results in 15 policy briefs.
The EU Water Framework Directive (WFD) adopted in 2000 aims to protect Europe's water resources. By 2027, EU Member States are required to bring all water bodies into a "good ecological" and "good chemical" state. There's still a long way to go. One reason is that a few existing substances, for which there are currently no suitable options for reducing pollution, cause environmental quality standards to be exceeded across the board in Germany and Europe -- and thus lead to poor water quality. "What's more, the complex mixtures of pesticides, medicines and industrial chemicals that are released daily and pose a considerable risk for humans and the environment are not taken into account when establishing the chemical status of our water bodies," says UFZ Environmental Chemist Dr Werner Brack, who coordinated the SOLUTIONS project that drew to a close last year. The current WFD indicator system does not differentiate between rivers with differing pollution, nor does it demonstrate any actual improvements in water quality as a result of any measures implemented. This is why it urgently needs to be developed further. Otherwise, according to Brack, the objectives of the WFD cannot be achieved. For the past five years, European scientists have carried out research as part of the SOLUTIONS project, which received twelve million euros from the EU. "It has been shown that the current practice of limiting the assessment of chemical pollution to a few substances defined as priorities throughout Europe and certain river-basin-specific pollutants is not sufficient for recording pollution as a whole," summarises Werner Brack. At present, the WFD only lists 45 priority pollutants that are not allowed to occur, or occur only to a limited extent, in water bodies classified as being of good quality. However, more than 100,000 chemical substances end up in the environment and water bodies.
The indicators currently used to assess water quality cannot be used to identify pollution hotspots or initiate appropriate management measures. The SOLUTIONS project has therefore developed new concepts and tools for monitoring and reducing exposure to complex mixtures. In a total of 15 policy briefs, SOLUTIONS researchers have set out how policy makers can implement these concepts and tools. For example, the scientists recommend that substances in toxic mixtures should also be taken into account when prioritising chemicals under the WFD. Until now, prioritising chemicals and defining EU-wide priority and river-basin-specific substances have only been based on individual chemicals. In another policy brief, they describe how users can use the RiBaTox toolbox developed as part of the SOLUTIONS project to solve problems related to the monitoring, modelling, impact assessment and management of chemical mixtures in surface waters. Monitoring should target the complex mixtures themselves, using effect-based methods in which representative aquatic organisms such as algae, small crustaceans and fish embryos, as well as suitable cell systems, demonstrate how toxic each chemical cocktail is. This would allow toxic loads to be determined, even if the underlying chemicals are unknown or below the detection limit for analysis. These methods should be complemented by chemical screening techniques using high-resolution mass spectrometry to see which substances the mixtures contain, to detect emerging chemicals and to monitor pollution trends in the aquatic environment. This way, valuable information can also be collected on the occurrence of substances that are now detectable but cannot yet be identified. To be able to use this extensive data on hundreds and thousands of substances in water to assess the risk of chemical cocktails, the authors also suggest establishing a European data infrastructure.
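One common way to express the combined toxic load of a mixture, in the spirit of the effect-based monitoring described above, is a toxic-unit sum: each chemical's concentration divided by its effect concentration, summed over the mixture. This is a generic illustration of mixture assessment, not necessarily the SOLUTIONS project's exact method, and the numbers are invented:

```python
# Toxic-unit (TU) sum: TU = sum(c_i / EC50_i), where c_i is the measured
# concentration of chemical i and EC50_i is the concentration affecting
# 50% of test organisms. Concentrations and EC50s below are illustrative.

def toxic_unit_sum(concentrations, ec50s):
    """Sum of toxic units for co-occurring chemicals (matching units)."""
    return sum(c / e for c, e in zip(concentrations, ec50s))

# Three chemicals, each individually well below its EC50
tu = toxic_unit_sum([0.5, 1.0, 2.0], [10.0, 5.0, 8.0])
print(round(tu, 2))  # 0.5 -- combined load equals half an "effect dose"
```

The example shows why single-substance limits can miss risk: no single chemical exceeds its threshold, yet the mixture carries a substantial combined load.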
This will help gather data and make it accessible to the world of science and the authorities so it can be evaluated and shared. "The policy briefs are intended to make it easier for decision-makers to access the scientific information needed to protect Europe's water resources," says Werner Brack. This is an important basis for people's health across Europe and for healthy ecosystems that provide the population with key services.
Pollution
2019
December 2, 2019
https://www.sciencedaily.com/releases/2019/12/191202190430.htm
Harbor porpoise calves exposed to neurotoxic PCBs in mothers' milk
Harbour porpoise calves around the UK are carrying a more neurotoxic cocktail of PCBs than their mothers, as females unknowingly detoxify themselves by transferring the chemicals while feeding their young, new research reveals today.
Critically, however, the most persistent toxins remain in a mother's body until they are transferred to infants during lactation -- exposing the young to dangerous doses of the chemical pollutants, which are particularly toxic during brain development. PCBs were once used in the likes of electrical equipment, surface coatings and paints, until being banned across Europe in the mid-1980s due to their toxic effects on both people and wildlife. However, the group of persistent toxic chemicals continues to enter the marine environment through terrestrial run-off, dredging and atmospheric transport, resulting in a complex mixture of the chemicals entering the food chain. The highest levels are often found in odontocetes (toothed whales) that are high up in the food chain, where they can cause suppression of the immune and reproductive systems and have contributed to population declines of several species in some regions. Rosie Williams, lead author and PhD researcher at ZSL's Institute of Zoology and Brunel University London, said: "It's a tragic irony that juvenile porpoises are being exposed to a toxic cocktail of chemicals during feeding -- when all they're supposed to be getting are the vital nutrients they need for the crucial developmental stage of their life. "Previously, scientists tended to monitor PCB concentrations by grouping them together and treating them as one chemical, but as we know, they're a group of chemicals with different toxicity levels, so it was a bit like trying to measure how much caffeine someone's had -- without knowing whether they drank three cans of Red Bull or three cups of tea.
Our study has highlighted the need to change our approach to monitoring PCBs, to look at the composition of individual chemicals, so that we can get a better understanding of the risk posed by these chemicals to our marine wildlife." "Studying PCB exposure in more abundant species like porpoises helps us to predict their effects in more vulnerable species already low in numbers, such as our native population of orcas in the UK that are facing extinction because of PCBs, with only eight remaining. As top predators, killer whales are exposed to some of the highest levels of PCBs, because there is an accumulative effect of PCBs as you go up the food chain. "It's obvious that marine mammals are still experiencing the lingering impacts of PCBs, so identifying the sources and pathways by which they're entering our oceans is a vital next step to preventing further pollution." Professor Susan Jobling, co-author at Brunel University London's Institute of Environment, Health and Societies, said: "This research helps further our understanding of these legacy industrial chemical pollutants and the effects that different levels of exposure, in complex mixtures, may have. Learning more about PCB exposure in juvenile animals is vital, so that we can try to mitigate the impact of these dangerous chemicals on populations and help protect the future status of marine mammals in UK waters." The team of scientists used the world's largest cetacean toxicology dataset, generated by the Centre for Environment, Fisheries and Aquaculture Science from samples collected by the CSIP from UK-stranded cetaceans, with a total of 696 harbour porpoises stranded in the UK between 1992 and 2015 identified for the study.
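A hedged sketch of the congener-specific accounting the lead author argues for above: weighting each PCB congener by a toxicity factor rather than summing raw concentrations (the caffeine analogy). The concentrations below are invented, and the TEF-style weights are illustrative, loosely modeled on WHO toxic equivalency factors for dioxin-like PCBs:

```python
# Toxicity-weighted PCB burden (TEQ-style): sum of concentration x weight.
# Sample concentrations are invented; weights are illustrative placeholders
# in the style of WHO TEFs (non-dioxin-like congeners get weight 0 here).

def toxic_equivalent(concentrations_by_congener, weights):
    """TEQ = sum over congeners of concentration * toxicity weight."""
    return sum(c * weights[name]
               for name, c in concentrations_by_congener.items())

sample = {"PCB-126": 0.2, "PCB-118": 5.0, "PCB-153": 40.0}
weights = {"PCB-126": 0.1, "PCB-118": 0.00003, "PCB-153": 0.0}

teq = toxic_equivalent(sample, weights)
print(round(teq, 5))  # 0.02015 -- dominated by the low-concentration PCB-126
```

Note how the raw sum (45.2) is dominated by the least toxic congener, while the weighted burden is driven by the most toxic one: exactly the distinction the study says bulk monitoring misses.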
Pollution
2019
November 25, 2019
https://www.sciencedaily.com/releases/2019/11/191125153008.htm
A missing link in haze formation
Air-quality alerts often include the levels of particulate matter, small clumps of molecules in the lower atmosphere that can range in size from microscopic to visible. These particles can contribute to haze, clouds, and fog and also can pose a health risk, especially those at the smaller end of the spectrum. Particles known as PM10 and PM2.5, referring to clumps smaller than 10 and 2.5 micrometers in size, respectively, can be inhaled, potentially harming the heart and lungs.
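The size classes above can be restated as a tiny helper. Conventionally, PM10 and PM2.5 denote particles with diameters below 10 and 2.5 micrometers, respectively, so every PM2.5 particle also counts as PM10; the function is illustrative:

```python
# Illustrative classifier for the conventional PM size classes.

def pm_classes(diameter_um):
    """Return the PM classes a particle of the given diameter falls into."""
    classes = []
    if diameter_um <= 2.5:
        classes.append("PM2.5")  # fine fraction, penetrates deepest
    if diameter_um <= 10:
        classes.append("PM10")   # inhalable fraction (includes PM2.5)
    return classes

print(pm_classes(1.0))   # ['PM2.5', 'PM10']
print(pm_classes(5.0))   # ['PM10']
print(pm_classes(20.0))  # [] -- too large to count as either
```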
This week, a group led by University of Pennsylvania scientists in collaboration with an international team report a new factor that affects particle formation in the atmosphere. "Right now, we're all concerned about PM2.5 and PM10 because these have some real air-quality and health consequences," says Joseph S. Francisco, a corresponding author on the paper and an atmospheric chemist in Penn's School of Arts and Sciences. "The question has been, How do you suppress the formation of these kinds of particles? This work actually gives some very important insight, for the first time, into how you can suppress particle growth." "We and others have been studying this process of how particles grow so we can better understand the weather and the health implications," says Jie Zhong, a postdoctoral fellow at Penn and co-lead author of the work. "Previously people thought that alcohols were not important because they interact weakly with other molecules. But alcohols attracted our attention because they're abundant in the atmosphere, and we found they do in fact play a significant role in reducing particle formation." Leading up to this work, Zhong and colleagues had been focused on various reactions involving SO3, which can arise from various types of pollution, such as burning fossil fuels. When combined with water molecules, SO3 forms sulfuric acid, a major component of acid rain but also one of the most important "seeds" for growing particles in the atmosphere. Chemists knew that alcohols are not very "sticky," forming only weak interactions with SO3, and had thus dismissed them as key contributors to particle formation. But when Zhong and colleagues took a closer look, using powerful computational chemistry models and molecular dynamics simulations, they realized that SO3 could indeed react with alcohols such as methanol when there is a lot of it in the atmosphere.
The resulting product, methyl hydrogen sulfate (MHS), is sticky enough to participate in the particle-formation process. "Because this reaction converts alcohols to more sticky compounds," says Zhong, "initially we thought it would promote the particle formation process. But it doesn't. That's the most interesting part. Alcohols consume or compete for SO3, so less of it is available to form sulfuric acid." Even though the reaction between methanol and SO3 requires more energy, the researchers found that MHS itself, in addition to sulfuric acid and water, could catalyze the methanol reaction. "That was an interesting part for us, to find that the MHS can catalyze its own formation," says Francisco. "And what was also unique about this work and what caught us by surprise was the impact of the effect." Francisco and Zhong note that in dry and polluted conditions, when alcohols and SO3 are abundant in the atmosphere but water molecules are less available, this reaction may play an especially significant role in driving down the rate of particle formation. Yet they also acknowledge that MHS, the product of the methanol-SO3 reaction, has been linked to negative health impacts. "It's a balance," says Zhong. "On the one hand this reaction reduces new particle formation, but on the other hand it produces another product that is not very healthy." What the new insight into particle formation does offer, however, is information that can power more accurate models for air pollution and even weather and climate, the researchers say. "These models haven't been very accurate, and now we know they were not incorporating this mechanism, which wasn't recognized previously," Zhong says. As a next step, the researchers are investigating how colder conditions, involving snow and ice, affect new particle formation. "That's very appropriate, because winter is coming," Francisco says.
Pollution
2019
November 22, 2019
https://www.sciencedaily.com/releases/2019/11/191122113301.htm
Clean air research converts toxic air pollutant into industrial chemical
A toxic pollutant produced by burning fossil fuels can be captured from the exhaust gas stream and converted into useful industrial chemicals using only water and air thanks to a new advanced material developed by an international team of scientists.
New research led by The University of Manchester has developed a metal-organic framework (MOF) material that provides a selective, fully reversible and repeatable capability to capture nitrogen dioxide (NO2), a toxic air pollutant produced particularly by diesel and bio-fuel use. The NO2 can then be easily converted into nitric acid, a multi-billion dollar industry with uses including agricultural fertilizer for crops, rocket propellant and nylon.MOFs are tiny three-dimensional structures which are porous and can trap gases inside, acting like cages. The internal empty spaces in MOFs can be vast for their size: just one gram of material can have a surface area equivalent to a football pitch.The highly efficient mechanism in this new MOF was characterised by researchers using neutron scattering and synchrotron X-ray diffraction at the Department of Energy's Oak Ridge National Laboratory and Berkeley National Laboratory, respectively. The team also used the National Service for Electron Paramagnetic Resonance Spectroscopy at Manchester to study the mechanism of adsorption of NO2 in MFM-520. The technology could lead to air pollution control and help remedy the negative impact nitrogen dioxide has on the environment."This is the first MOF to both capture and convert a toxic, gaseous air pollutant into a useful industrial commodity," said Dr Sihai Yang, a lead author and a senior lecturer at The University of Manchester's Department of Chemistry. 
"It is also interesting that the highest rate of NO2 uptake by this MOF occurs at around 45 degrees Centigrade, which is about the temperature of automobile exhausts."Martin Schröder, Vice-President and Dean of the Faculty of Science and Engineering at The University of Manchester and a lead author of the study, said: "The global market for nitric acid in 2016 was USD $2.5 billion, so there is a lot of potential for manufacturers of this MOF technology to recoup their costs and profit from the resulting nitric acid production. Especially since the only additives required are water and air."As part of the research, the scientists used neutron spectroscopy and computational techniques at ORNL to precisely characterize how MFM-520 captures nitrogen dioxide molecules."This project is an excellent example of using neutron science to study the structure and activity of molecules inside porous materials," said Timmy Ramirez-Cuesta, co-author and coordinator for the chemistry and catalysis initiative at ORNL's Neutron Sciences Directorate. "Thanks to the penetrating power of neutrons, we tracked how the nitrogen dioxide molecules arranged and moved inside the pores of the material, and studied the effects they had on the entire MOF structure.""The characterisation of the mechanism responsible for the high, rapid uptake of NO2 will inform future designs of improved materials to capture air pollutants," said Jiangnan Li, the first author and a PhD student at The University of Manchester.In the past, capturing greenhouse and toxic gases from the atmosphere was a challenge because of their relatively low concentrations and because water in the air competes with and can often negatively affect the separation of targeted gas molecules from other gases. Another issue was finding a practical way to filter out and convert captured gases into useful, value-added products. The MFM-520 material offers solutions to many of these challenges.
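The NO2-to-nitric-acid conversion invites a quick stoichiometric estimate. A minimal sketch, assuming the textbook net absorption reaction 4 NO2 + O2 + 2 H2O -> 4 HNO3; the paper's exact conversion route is not stated here:

```python
# Stoichiometric upper bound on nitric acid produced from captured NO2.
# Assumed net reaction: 4 NO2 + O2 + 2 H2O -> 4 HNO3 (1:1 molar ratio),
# which is standard absorption chemistry, not a detail from the MFM-520 paper.

M_NO2 = 46.01   # molar mass of NO2, g/mol
M_HNO3 = 63.01  # molar mass of HNO3, g/mol

def hno3_yield_grams(grams_no2):
    """Maximum HNO3 mass obtainable from a given mass of captured NO2."""
    moles_no2 = grams_no2 / M_NO2
    return moles_no2 * M_HNO3  # 1 mol NO2 -> 1 mol HNO3 under the assumed reaction
```

At full conversion, 1 kg of captured NO2 would give roughly 1.37 kg of nitric acid, which is the arithmetic behind framing the captured pollutant as a potential revenue stream.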
Pollution
2019
November 20, 2019
https://www.sciencedaily.com/releases/2019/11/191120131354.htm
Emissions from electricity generation lead to premature deaths for some racial groups
Air pollution doesn't just come from cars on the road: generating electricity from fossil fuels also releases fine particulate matter into the air.
In general, fine particulate matter can lead to heart attacks, strokes, lung cancer and other diseases, and is responsible for more than 100,000 deaths each year in the United States.Now University of Washington researchers have found that air pollution from electricity generation emissions in 2014 led to about 16,000 premature deaths in the continental U.S. In many states, the majority of the health impacts came from emissions originating in other states. The team also found that exposures were higher for black and white non-Latino Americans than for other groups, and that this disparity held even after accounting for differences in income.The researchers published their results Nov. 20."Our data show that even if states take measures to change their own electricity production methods, what happens across state lines could dramatically affect their population," said senior author Julian Marshall, a UW professor of civil and environmental engineering.The team first examined how emissions from electricity generation plants could move across the continental U.S."We looked at emissions from different types of power plants -- including coal, natural gas, diesel and oil power plants -- and modeled how the pollutants would travel based on things like wind patterns or rain. We also consider how emissions can react in the atmosphere to form fine particle air pollution," said lead author Maninder Thind, a UW civil and environmental engineering doctoral student. "That gave us a map of pollution concentrations across the country. Then we overlaid that map with data from the census to get an estimate of where people live and how this pollution results in health impacts."Then, using mortality data from the National Center for Health Statistics, the team estimated premature deaths due to electricity generation emissions. In 2014, there were about 16,000 premature deaths. 
The researchers estimate that 91% of premature deaths were the result of emissions from coal-fired power plants. The number of deaths in each state varied, with Pennsylvania having the highest number -- about 2,000 -- and Montana and Idaho having the lowest number, with fewer than 10 deaths each.Emissions from electricity generation don't stop at state lines -- many states "imported" or "exported" pollution. In 36 states, more than half of premature deaths were the result of emissions from other states.Overall, the team found that emissions affected black Americans the most, leading to about seven premature deaths per 100,000 people in that group. White non-Latino Americans were the second most affected group with about six premature deaths per 100,000 people. Other groups averaged about four premature deaths per 100,000 people."A lot of people may expect that the disparity we see for race or ethnicity comes from an underlying difference in income. But that's not what we see," said co-author Christopher Tessum, a research scientist in the UW's civil and environmental engineering department. "We find that differences by race or ethnicity tend to be larger than differences by income group."While the researchers found that overall, lower-income households experienced more exposure to emissions, the disparities they saw between race groups still held when they accounted for income.The amount of power plant pollution that people breathe can vary by where they live. These disparities may be influenced by societal trends. For example, where people live often reflects segregation or other conditions from decades earlier.These trends don't always hold when looking at individual areas. For example, exposures for Native Americans are lower than other groups overall, but in Kansas and Oklahoma, this group is the most exposed. 
The state with the largest disparities by race is Kentucky, where black people are the most exposed."We've seen in our previous research that our society is more segregated by race than by income, and now it's showing up again with air pollution from electricity generation emissions," Marshall said. "These results can help local, state or national governments make more informed decisions that will improve everyone's air quality and quality of life."Inês Azevedo, a professor of energy resources engineering at Stanford University, is also a co-author on this paper. This publication was developed as part of the Center for Air, Climate, and Energy Solutions, which was supported under an Assistance Agreement awarded by the U.S. Environmental Protection Agency.
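The per-100,000 disparity figures in this study come from simple rate arithmetic. A sketch, with hypothetical death counts and population sizes chosen only to land near the reported magnitudes (about seven and six premature deaths per 100,000):

```python
# Sketch of the rate arithmetic behind per-group exposure comparisons.
# The death counts and populations below are hypothetical illustrations,
# not figures from the study.

def rate_per_100k(deaths, population):
    """Premature deaths expressed per 100,000 people in a group."""
    return deaths / population * 100_000

groups = {
    "black_americans": (2_800, 40_000_000),
    "white_non_latino": (11_820, 197_000_000),
}

rates = {name: rate_per_100k(d, p) for name, (d, p) in groups.items()}
```

Expressing deaths as rates rather than raw counts is what makes groups of very different sizes comparable, which is how a smaller absolute death toll can still indicate higher exposure.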
Pollution
2019
November 20, 2019
https://www.sciencedaily.com/releases/2019/11/191120121149.htm
This humidity digester breathes in atmospheric water and exhales energy
Integrating a super moisture-absorbent gel with light-active materials, researchers in Singapore have developed a humidity digester to dry the ambient air while generating energy. The method was presented November 20.
Like plants, artificial photosynthetic devices, also known as photoelectrochemical (PEC) systems, feed on light and water to generate energy. This phenomenon inspired the researchers to integrate light-active materials and super-hygroscopic hydrogels. The hydrogels based on zinc and cobalt can harvest more than four times their weight of water from humid air. The humidity digester can reduce relative humidity by 12 percent and generate a low current under ambient light."A lot of people say Singapore is hot, but actually, it's not that hot at all. People feel hotter because of relative humidity, as it can affect how we perceive temperature," says senior author Swee Ching Tan of the Department of Materials Science and Engineering, National University of Singapore. "That got me thinking, what if I can invent something that harvests water from our ambient air and, at the same time, reduces relative humidity and provides water or energy?"The research team in Singapore came up with a humidity digester composed of a moisture-hungry hydrogel, cathode, photoanode, and a solar cell. Just like batteries, it generates power from atmospheric humidity instead of an electrolyte. The photoanodes, acting as a photo-electrocatalyst, oxidize the absorbed water in the presence of light to split water and produce energy. The hydrogel constantly replenishes the system with water that is pulled out from the air to sustain the energy generation process. The assembly generates electricity while dehumidifying the room."The second-generation cobalt hydrogel that we developed absorbs moisture faster than any commercially available drying agents in the market. We have done an experiment by placing the hydrogel in a box, and the relative humidity dropped to about 30-35 percent lower than the outside ambient," says Tan. "We put our hand in the dry box; it felt like a fridge. It's so cold inside the box because it's so dry." 
Tan believes that the humidity digester is a possible replacement for air conditioners when it's paired with a fan.Although one of the goals of the research team is to generate energy, the device puts out a photocurrent of about 0.4 mA/cm2, which is relatively low. However, compared to commercial air conditioning units, the humidity digester can improve thermal comfort with significantly less energy input. Even scaled up to commercial standards, the device would be easier to install and more portable than an air conditioner, and its operating cost would be only a fraction as much."It is a common belief that humidity affects only equatorial or tropical countries. But people from Europe are also equally affected by high humidity levels because of associated water condensation problems. High levels of humidity cause their homes to become moldy." Doubling as a dehumidifier, the humidity digester has ample application and outperforms commercial drying agents."The world population is increasing, and people generally spend a lot of money on air conditioners to maintain adequate thermal comfort. The increasing need for air conditioners to cool us down results in increased energy consumption as well. This device, when coupled with a fan, can help reduce relative humidity and thereby improve thermal comfort and reduce the reliance on air conditioners. This could lead to potential energy and monetary benefits."
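The reported photocurrent density invites a quick scaling estimate. A sketch only: the 0.5 V operating voltage is an assumption for illustration, since the paper's cell voltage is not given here.

```python
# Back-of-the-envelope scaling of the reported photocurrent density
# (about 0.4 mA per square centimetre) to a device of a given active area.

CURRENT_DENSITY_MA_PER_CM2 = 0.4  # reported photocurrent density

def device_output(area_cm2, voltage_v=0.5):
    """Return (current in mA, power in mW) for a given active area.

    voltage_v is a hypothetical operating voltage, assumed for illustration.
    """
    current_ma = CURRENT_DENSITY_MA_PER_CM2 * area_cm2
    power_mw = current_ma * voltage_v
    return current_ma, power_mw
```

A 100 cm2 panel would supply only about 40 mA, which illustrates why the authors describe the energy output as modest relative to the dehumidifying benefit.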
Pollution
2019
November 20, 2019
https://www.sciencedaily.com/releases/2019/11/191120121141.htm
Wind more effective than cold air at cooling rooms naturally
The effectiveness of non-mechanical, low-energy methods for moderating temperature and humidity has been evaluated in a series of experiments by researchers from the University of Cambridge.
The researchers found that a temperature difference between inside and outside has a remarkably small effect on how well a room is ventilated when ventilation is primarily driven by wind. In contrast, wind can increase ventilation rates by as much as 40% above that which is driven by a temperature difference between a room and the outdoors. The exact rate of ventilation will depend on the geometry of the room.The results of the study have been published.Heating and cooling account for a significant proportion of energy use in buildings: in the US, this is as high as 50 percent. In addition, as global temperatures continue to rise, demand for air conditioning -- which emits greenhouse gases -- rises as well, creating a damaging feedback loop.Natural ventilation, which controls indoor temperature without using any mechanical systems, is an alternative to traditional heating and cooling methods that reduces energy use and greenhouse gas emissions."Natural ventilation is a low-energy way to keep buildings at a comfortable temperature, but in order to increase its use, we need simple, accurate models that can respond quickly to changing conditions," said lead author Dr Megan Davies Wykes from Cambridge's Department of Engineering.There are two main types of natural cross-ventilation: wind-driven and buoyancy-driven. Cross-ventilation occurs in rooms that have windows on opposite sides of a room. Wind blowing on a building can result in a high pressure on the windward side and a low pressure at the leeward side, which drives flow across a room, bringing fresh air in from outside and ventilating a room. Ventilation can also be driven by temperature differences between the inside and outside of a room, as incoming air is heated by people or equipment, resulting in a buoyancy-driven flow at a window."We've all gotten used to having a well-controlled, narrow temperature range in our homes and offices," said Davies Wykes. 
"Controlling natural ventilation methods is much more challenging than switching on the heat or the air conditioning, as you need to account for all the variables in a room, like the number of people, the number of computers or other heat-generating equipment, or the strength of the wind."In the current study, the researchers used a miniature model room placed inside a flume to recreate the movements of air inside a room when windows are opened in different temperature and wind conditions.Using the results from lab-based experiments, Davies Wykes and her colleagues built mathematical models to predict how temperature difference between inside and outside affects how well a room is ventilated.The researchers found that the rate of ventilation depends less on temperature and more on wind. Anyone who has tried to cool down on a hot night by opening the window will no doubt be familiar with how ineffective this is when there is no wind.This is because in many rooms, windows are positioned halfway up the wall, and when they are opened, the warm air near the ceiling can't easily escape. Without the 'mixing' effect provided by the wind, the warm air will stay at the ceiling, unless there is another way for it to escape at the top of the room."It was surprising that although temperature differences do not have a strong effect on the flow of air through a window, even small temperature differences can matter when trying to ventilate a room," said Davies Wykes. "If there are no openings near the ceiling of a room, warm indoor air can become trapped near the ceiling and wind is not effective at removing the trapped air."The next steps will be to incorporate the results into building design, making it easier to create well ventilated, low energy buildings.
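The two mechanisms described here, wind-driven and buoyancy-driven flow, have standard textbook estimates. A hedged sketch using handbook formulas and typical discharge coefficients, not the paper's own model:

```python
import math

# Textbook single-opening estimates of cross-ventilation flow. The 0.6
# discharge coefficients and both formulas are common handbook
# approximations, assumed here for illustration.

def wind_driven_flow(area_m2, wind_speed_ms, cw=0.6):
    """Wind-driven volumetric flow (m^3/s) through an opening."""
    return cw * area_m2 * wind_speed_ms

def buoyancy_driven_flow(area_m2, height_m, delta_t_k, t_out_k=293.0, cd=0.6):
    """Stack-effect flow (m^3/s) from an indoor-outdoor temperature difference
    delta_t_k acting over an opening height height_m."""
    return cd * area_m2 * math.sqrt(2 * 9.81 * height_m * delta_t_k / t_out_k)
```

For a 1 m2 opening, a modest 2 m/s breeze (about 1.2 m3/s) moves several times more air than a 3 K temperature difference acting over a 1.5 m opening height (about 0.33 m3/s), consistent with the finding that wind dominates.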
Pollution
2019
November 19, 2019
https://www.sciencedaily.com/releases/2019/11/191119161457.htm
New danger for corals in warming oceans: Metal pollution
Copper from agricultural runoff and from marine paint leaching off boat hulls poses an emerging threat to soft coral sea fans in the waters around Puerto Rico.
"We know warming oceans pose an existential threat to coral reefs around the world," said ecologist Allison Tracy, who conducted this Cornell-led work with Drew Harvell, professor of marine biology. "Action to alleviate the impact of warming oceans is a priority, but understanding the role of pollutants in coral disease and mortality gives us more options for solutions."While plastics and microplastics are a well-known threat to the world's oceans, the effect of metal contamination is poorly understood, according to the researchers. Increased copper pollution can be a result of agricultural runoff and marine paint leaching from boat hulls.Over a one-year period, the researchers tracked 175 individual sea fan colonies with varying levels of copper concentrations found in the sediment at 15 coral reef sites around Puerto Rico. They found that reefs with higher copper concentrations in the sediment suffered a reduction in recovery from multifocal purple spots disease -- a disease that can plague the sea fans.In the laboratory, Tracy found that sea fans initially launched an immune response to a damaging infection at low levels of copper and temperature stress. But when copper concentrations were boosted, sea fans' immune response failed, which suggests that copper stressed the sea fans and eliminated their immune potential, she said."The patterns we saw in immune markers are important because they show a mechanism through which copper and warming oceans can impair the corals' health," Tracy said.This research supplied novel data on the role of environmental stressors in coral disease and may provide a toolkit for combatting coral disease on a local scale."We can't manage the climate damage to coral reefs until we better understand how pollution and disease magnify the impacts of heat stress," Harvell said. 
"Although healthy corals in thriving ecosystems also experience low levels of disease, the concern is that changing ocean conditions and increased pollution have led to increased disease outbreaks. As a result, corals may be losing the battle with their pathogens as ocean stressors tip the balance in favor of disease."
Pollution
2019
November 19, 2019
https://www.sciencedaily.com/releases/2019/11/191119075306.htm
Decarbonizing the power sector
Electricity supply is one of the biggest CO2 emitters.
"When looking at the big picture -- from the direct emissions of power installations, to the mining of minerals and fuels for their construction and operation, to the lands necessary for the energy supply infrastructure -- we found that the best bet for both people and environment is to rely mainly on wind and solar power," Gunnar Luderer explains. He is lead author and deputy chair of PIK's research domain on transformation pathways. "A main winner of decarbonisation is human health: switching to renewables-based electricity production could cut negative health impacts by up to 80 per cent. This is mainly due to a reduction of air pollution from combusting fuels. What is more, the supply chains for wind and solar energy are much cleaner than the extraction of fossil fuels or bioenergy production."For their study, the researchers used complex simulations sketching out the possible paths of decarbonising the electricity supply (Integrated Assessment Modelling) and combined their calculations with life cycle analyses. Anders Arvesen from the Norwegian University of Science and Technology (NTNU) says: "In combining two pairs of analytical spectacles, we were able to look at a wide range of environmental problems, from air pollution to toxicants, from finite mineral resources needed to manufacture wind turbines to the extent of lands transformed into bioenergy plantations if relying on negative emissions. This is a promising approach also to tackle other sectors, like buildings or the transport sector.""Our study delivers even more very good arguments for a rapid transition towards a renewable energy production. However, we need to be aware that this essentially means shifting from a fossil resource base to a power industry that requires more land and mineral resources," adds Luderer. 
"Smart choices are key to limiting the impact of these new demands on other societal objectives, such as nature conservancy, food security, or even geopolitics."Producing electricity in a climate-friendly way brings huge benefits for our health -- mainly due to a reduction of air pollution from combusting fuels.
Pollution
2019
November 18, 2019
https://www.sciencedaily.com/releases/2019/11/191118162938.htm
Four ways to curb light pollution, save bugs
Artificial light at night negatively impacts thousands of species: beetles, moths, wasps and other insects that have evolved to use light levels as cues for courtship, foraging and navigation.
Writing in a scientific journal, the researchers outline the problem and four ways to address it."Artificial light at night is human-caused lighting -- ranging from streetlights to gas flares from oil extraction," Seymoure said. "It can affect insects in pretty much every imaginable part of their lives."Insects and spiders have experienced global declines in abundance over the past few decades -- and it's only going to get worse. Some researchers have even coined a term for it: the insect apocalypse."Most of our crops -- and crops that feed the animals that we eat -- need to be pollinated, and most pollinators are insects," Seymoure said. "So as insects continue to decline, this should be a huge red flag. As a society of over 7 billion people, we are in trouble for our food supply."Unlike other drivers of insect declines, artificial light at night is relatively straightforward to reverse. To address this problem, here are four things that Seymoure recommends:The evidence on this one is clear."Light pollution is relatively easy to solve, as once you turn off a light, it is gone. You don't have to go and clean up light like you do with most pollutants," Seymoure said."Obviously, we aren't going to turn off all lights at night," he said. "However, we can and must have better lighting practices. Right now, our lighting policy is not managed in a way to reduce energy use and have minimal impacts on ecosystem and human health. This is not OK, and there are simple solutions that can remedy the problem."Four characteristics of electrical light matter the most for insects: intensity (or overall brightness); spectral composition (how colorful and what color it is); polarization; and flicker."Depending on the insect species, its sex, its behavior and the timing of its activity, all four of these light characteristics can be very important," Seymoure said."For example, overall intensity can be harmful for attracting insects to light. Or many insects rely upon polarization to find water bodies, as water polarizes light. 
So polarized light can indicate water, and many insects will crash into hoods of cars, plastic sheeting, etc., as they believe they are landing on water."Because it is impossible to narrow down one component that is most harmful, the best solution is often to just shut off lights when they are not needed, he said.This is related to the first recommendation: If a light is only necessary on occasion, then put it on a sensor instead of always keeping it on."A big contributor to attraction of light sources for most animals is seeing the actual bulb, as this could be mistaken as the moon or sun," Seymoure said. "We can use full cut-off filters that cover the actual bulb and direct light to where it is needed and nowhere else."When you see a lightbulb outside, that is problematic, as that means animals also see that light bulb," he said. "More importantly, that light bulb is illuminating in directions all over the place, including up toward the sky, where the atmosphere will scatter that light up to hundreds of miles away resulting in skyglow. So the easiest solution is to simply put fixtures on light to cover the light bulb and direct the light where it is needed -- such as on the sidewalk and not up toward the sky.""The general rule is that blue and white light are the most attractive to insects," Seymoure said. "However, there are hundreds of species that are attracted to yellows, oranges and reds."Seymoure has previously studied how different colors of light sources -- including the blue-white color of LEDs and the amber color of high pressure sodium lamps -- affect predation rates on moths in an urban setting."Right now, I suggest people stick with amber lights near their houses, as we know that blue lights can have greater health consequences for humans and ecosystems," Seymoure said. "We may learn more about the consequences of amber lights. And make sure these lights are properly enclosed in a full cut-off fixture."
Pollution
2019
November 18, 2019
https://www.sciencedaily.com/releases/2019/11/191118115358.htm
Pollution from Athabasca oil sands affects weather processes
Scientists have been looking at pollution affecting the air, land and water around the Athabasca Oil Sands for some time. After looking at contaminants in snow taken from up to 25 km away from the oil sands, a McGill-led scientific team now suggests that oil sands pollution is also affecting the weather patterns in the surrounding regions.
"The beauty of frozen precipitation such as snow is that it's like a snapshot of atmospheric processes. The snow absorbs the hard metal particles and embeds them, and this allows us to see things that we might not be able to see otherwise," says Professor Parisa Ariya, from McGill's Departments of Chemistry and Atmospheric and Oceanic Sciences. She led the team that recently published this research. More specifically, the researchers looked at the presence of nanosized particles of metal contaminants in order to gain insight into the larger weather patterns. Their findings are of concern since both the World Health Organization (WHO) and the Intergovernmental Panel on Climate Change (IPCC) have identified nanoparticle pollution as a major challenge in climate change. Further research is being done in Professor Ariya's lab to explore the effects of other major industrial pollutants.
Pollution
2019
November 18, 2019
https://www.sciencedaily.com/releases/2019/11/191118110816.htm
Nitrous oxide levels are on the rise
Nitrous oxide is a greenhouse gas and one of the main stratospheric ozone depleting substances on the planet. According to new research, we are releasing more of it into the atmosphere than previously thought.
Most of us know nitrous oxide (N2O) as laughing gas, but it is also a potent greenhouse gas and an ozone-depleting substance. In their paper, the authors report that global N2O emissions are rising faster than previously estimated, driven in large part by the growing use of nitrogen fertilizers."While the increased nitrogen availability has made it possible to produce a lot more food, the downside is of course the environmental problems associated with it, such as rising N2O emissions.""We will have to adjust our emission inventories in light of these results, including those in the GAINS model," says study coauthor Wilfried Winiwarter, a researcher in the IIASA Air Quality and Greenhouse Gases Program. "Future increments in fertilizer use may trigger much larger additional emissions than previously thought -- emission abatement, as is already reflected in GAINS results, will therefore become even more prominent and also cost efficient for such situations."From their inversion-based emissions, the researchers estimate a global emission factor of 2.3 ± 0.6%, which is significantly larger than the IPCC default for combined direct and indirect emissions of 1.375%. The larger emission factor, and the accelerating emission increase found from the inversions, suggest that N2O emissions will keep climbing. The outcome of the study implies that, to ultimately lower global N2O emissions, abatement of fertilizer-related emissions will have to be strengthened.
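The emission-factor comparison amounts to simple percentage arithmetic. A sketch with a hypothetical nitrogen input; the 2.3% and 1.375% figures are the ones quoted in the text:

```python
# An emission factor here is the percentage of nitrogen applied to soils
# that escapes as N2O-N. The 100 Tg applied-nitrogen figure below is a
# hypothetical round number for illustration.

def n2o_n_emitted_tg(n_applied_tg, emission_factor_pct):
    """N2O-N emitted (Tg) for a given nitrogen input and emission factor."""
    return n_applied_tg * emission_factor_pct / 100

n_applied = 100.0                                   # Tg N, hypothetical
study_estimate = n2o_n_emitted_tg(n_applied, 2.3)   # inversion-based factor
ipcc_default = n2o_n_emitted_tg(n_applied, 1.375)   # IPCC default factor
ratio = study_estimate / ipcc_default               # how much larger the estimate is
```

The ratio of the two factors, about 1.67, is why the same fertilizer input implies roughly two-thirds more emitted N2O-N under the study's estimate than under the IPCC default.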
Pollution
2019
November 14, 2019
https://www.sciencedaily.com/releases/2019/11/191114103108.htm
Micro-rubber in the environment
Everybody is talking about microplastics. But the amount of microplastics in air and water is small compared to another polymer that pollutes our air and water -- and therefore our bodies: micro-rubber. These are the finest particles from tire abrasion, which enter our soil and air via the road surface or are shed from artificial turf. Empa researchers have now calculated that over the last 30 years, from 1988 to 2018, around 200,000 tonnes of micro-rubber have accumulated in our environment in Switzerland. This is an impressive figure that has often been neglected in the discussions on microplastics.
Researchers around Bernd Nowack from Empa's "Technology and Society" lab identified car and truck tires as the main source of micro-rubber. "We quantified the abrasion of tires, but also the wear of artificial green areas such as artificial turf," says Nowack. However, this only plays a subordinate role, because only three percent of the rubber particles emitted come from rubber granulate from artificial green areas. Tire abrasion is responsible for the remaining 97 percent. Of the particles released into the environment, almost three-quarters remain on the left and right side of the road in the first five metres, 5% in the remaining soils and almost 20% in water bodies. The team based its calculations on data on the import and export of tires and then modelled the behaviour of rubber on roads and in road waste water. Since the year 2000, the guidelines for the recycling of water and the prevention of soil pollution have been significantly tightened. Through measures such as the construction of road wastewater treatment plants (SABA), part of the micro-rubber can now be removed from the water.A part of the micro-rubber is first transported by air into the first five meters left and right of the road, deposited and partly whirled up again. However, Christoph Hüglin from Empa's "Air Pollution / Environmental Technology" lab estimates the impact on humans to be low, as a study from 2009 shows. "The proportion of tire abrasion in inhaled fine dust is also in the low single-digit percentage range at locations close to traffic," says Hüglin.Researchers emphasize, however, that microplastics and micro-rubber are not the same. "These are different particles that can hardly be compared with each other," says Nowack. And there are also huge differences in quantity: According to Nowack's calculations, only 7% of the polymer-based microparticles released into the environment are made of plastic, while 93% come from tire abrasion. 
"The amount of micro-rubber in the environment is huge and therefore highly relevant," says Nowack.
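The tonnages in this article follow from simple shares of the 200,000-tonne total. Restated as arithmetic (the deposition shares are approximate and so do not sum exactly to 100%):

```python
# The article's micro-rubber mass balance for Switzerland, 1988-2018.

TOTAL_TONNES = 200_000  # accumulated micro-rubber over 30 years

from_tires = 0.97 * TOTAL_TONNES  # tire abrasion (97%)
from_turf = 0.03 * TOTAL_TONNES   # rubber granulate from artificial turf (3%)

# Where the released particles end up, per the reported shares:
fate = {
    "within_5m_of_road": 0.74,  # "almost three-quarters"
    "remaining_soils": 0.05,
    "water_bodies": 0.20,       # "almost 20%"
}
deposited_tonnes = {place: share * TOTAL_TONNES for place, share in fate.items()}
```

Roughly 194,000 tonnes from tires against 6,000 from turf is what makes tire abrasion the dominant source, and roughly 40,000 tonnes reaching water bodies is the figure most relevant to the comparison with microplastics.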
Pollution
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113201335.htm
Evolution can reconfigure gene networks to deal with environmental change
Scientists at the University of Birmingham have unravelled the genetic mechanisms behind tiny waterfleas' ability to adapt to increased levels of phosphorus pollution in lakes.
By mapping networks of genes to the physiological responses of ancient and modern waterfleas (Daphnia), the researchers, based in the University's School of Biosciences, were able to show that a cluster of over 800 genes, many of them involved in metabolic processes, evolved to become "plastic," or flexible.This allows the modern Daphnia to adjust its gene expression according to the amount of phosphorus present in the environment. This is particularly fascinating as their 700-year-old ancestors were incapable of such a plastic response.Understanding the adaptive capabilities will help scientists to better predict the capacity of these creatures to help us mitigate against the threat posed by phosphorus pollution.Strikingly, the team was only able to make these discoveries by comparing the responses of modern Daphnia with their 700-year-old ancestors. Both the modern and the ancient samples studied came from the same lake in Minnesota where eutrophication -- a process that causes devastating algal blooms with high phosphorus content -- first started at the beginning of the 20th century.Modern-day industrialised agriculture with its extensive use of phosphorus-based fertilizers is adding to the many stresses on wildlife. The phosphorus eventually ends up in our freshwater systems resulting in eutrophication. Daphnia can help to reduce these blooms, but must cope with the increased phosphorus levels which can cause problems to its health.Dr Dagmar Frisch, Dr Dörthe Becker and Dr Marcin Wojewodzic, all three of them awardees of EU Marie Skłodowska-Curie fellowships, joined their expertise to develop new concepts in evolutionary ecology that enabled this analysis to take place."We used existing data and state-of-the-art analytical methods to connect patterns of gene expression with the physiological responses that allow these animals to deal with increased environmental phosphorus," says Dr Dagmar Frisch, an expert in environmental paleogenomics. 
"This allowed us to identify which part of the gene network was accountable for the newly evolved response." While this work helps us to better understand how animals adapt to new environments in general, Dr Dörthe Becker, who is now at the University of Sheffield, points out: "Because Daphnia is such a central species in aquatic ecosystems, our study ultimately improves our understanding of how aquatic ecosystems can mitigate some of the effects of eutrophication, one of the major global threats to freshwater environments." By reviving eggs that lie dormant in the sediment of lakes, a method called resurrection ecology, the authors were able to compare the gene responses of centuries-old revived waterfleas with modern-day descendants in a novel way. "We used network analysis methods to find out which genes 'communicate' with others or form clusters (called modules), and how this gene communication has changed in a keystone species over the last 700 years. In addition, we were able to connect these modules with particular observed traits, which was achieved for the first time in resurrection ecology," says Dr Marcin Wojewodzic, now a researcher at the Cancer Registry of Norway. "Our study emphasizes that evolution is a result of molecular fine-tuning that happens on different layers, ranging from basic cellular responses to complex physiological traits," says Dr Becker. Dr Frisch adds: "Our approach allows a more holistic view of how animals can and do respond to environmental change, and by that improve our understanding of organisms as integrated units of biological organisation." "After applying the recently developed network analyses, the logical next step is to explore how other molecular mechanisms, including epigenetics, play a role in evolutionary processes.
We have already begun this investigation," says Dr Wojewodzic. This work, published today in Molecular Biology and Evolution, was funded in part by the European Union Marie Skłodowska-Curie Program, the Norwegian Research Council, the German Research Foundation and the National Science Foundation of the USA.
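The module-finding step the researchers describe, grouping genes whose expression rises and falls together into clusters ("modules"), can be sketched in miniature. The sketch below is illustrative only: it uses synthetic expression data and a simple threshold-plus-connected-components clustering, not the study's actual method (WGCNA-style analyses use weighted networks) or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic expression data: 50 samples x 6 genes. Genes 0-2 track one
# hidden signal (one "module"), genes 3-5 track another.
n = 50
sig_a = rng.normal(size=n)
sig_b = rng.normal(size=n)
expr = np.column_stack(
    [sig_a + 0.1 * rng.normal(size=n) for _ in range(3)]
    + [sig_b + 0.1 * rng.normal(size=n) for _ in range(3)]
)

# "Communication" between genes: absolute Pearson correlation of expression.
adj = np.abs(np.corrcoef(expr.T))

def find_modules(adj, threshold=0.8):
    """Modules = connected components of the thresholded co-expression graph."""
    unassigned = set(range(adj.shape[0]))
    modules = []
    while unassigned:
        seed = unassigned.pop()
        module, frontier = {seed}, [seed]
        while frontier:
            g = frontier.pop()
            linked = {h for h in unassigned if adj[g, h] >= threshold}
            unassigned -= linked
            module |= linked
            frontier.extend(linked)
        modules.append(sorted(module))
    return modules

modules = find_modules(adj)
print(modules)  # genes 0-2 and genes 3-5 fall into separate modules
```

A real analysis would involve thousands of genes and would then test each module for association with observed traits; the threshold of 0.8 here is arbitrary.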
Pollution
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113105719.htm
Urban development reduces flash flooding chances in arid Western US
Urban development in the eastern United States results in an increase in flash flooding in nearby streams, but in the arid West, urbanization has just the opposite effect, according to a Penn State researcher, who suggests there may be lessons to be learned from the sharp contrast.
Lauren McPhillips, assistant professor of civil and environmental engineering, who led a study of how urban development affects stream flows in the Phoenix, Arizona, metropolitan area, believes the research may yield clues for better stormwater management everywhere. "We found that 'flashiness' -- a measure of the rise and fall rates of water flow in streams -- actually decreased with the extent of imperviousness in arid, urban, Southwest watersheds," said McPhillips, who has appointments in the colleges of Agricultural Sciences and Engineering. "That is the opposite pattern to that observed in previous studies in wetter regions such as the East." Researchers analyzed 14 years of flow records from U.S. Geological Survey stream gauges and similar data from the Flood Control District of Maricopa County to determine how hydrologic characteristics varied with urban development. The study looked at 19 watersheds that drained areas ranging in size from less than a square mile to 175 square miles. Similar to wetter systems, researchers observed more high-flow events in the urban desert streams compared to nonurban desert streams, she explained. However, this was only at the lower flood threshold -- there was no increase in larger floods with urban development. "Overall, the urban stream syndrome manifests differently in this arid system -- urbanization increases water retention and leads to less variable flows in stream ecosystems," said McPhillips. McPhillips -- who started the research as a postdoctoral scholar at Arizona State University before joining the Penn State faculty to focus explicitly on urban hydrology and green infrastructure -- hopes to apply some of what she learned in the West to her new role.
In the arid West, she noted, water quantity is more of an issue, and flash flooding is a really big concern, along with water availability in aquifers and water scarcity. In the East -- especially in the Chesapeake Bay drainage -- stormwater management is equally linked to water-quality worries. Reducing pollution from urban runoff is one of the strategies for cleaning up the bay. "Arizona was an interesting case study to try to understand more about urbanization and the role of intentional engineered stormwater management structures because urban areas there have developed a lot more recently, and they still are growing rapidly," McPhillips said. "The Phoenix area has had pretty progressive stormwater management policies for a while, and we looked at some watersheds where there was an extremely high implementation of stormwater-control features. So we could start to detect whether or not they are influencing stream-flow patterns." The research findings were recently published. "It was kind of weird to see water coming into the stream and creating a minimum flow on a day when there normally would not be any flow at all," she said. "It's something that is different from anything we would see here, because in the East there are constant groundwater flows." The opportunity to detect the influence of stormwater management efforts in the West stands out as the biggest difference for McPhillips. In the East, cities are much older, and municipalities are retrofitting stormwater management features into existing storm sewer infrastructure. That makes it much more difficult to judge the success or failure of engineering features, since there may not be enough of them to detect their impact downstream. "Phoenix, in particular, is a very new city, and municipal leaders were pretty forward-thinking in that they built flood control facilities as the urban area grew," she said. "There are whole watersheds there that have pretty substantial quantities of water management features.
It was kind of exciting to find cases in the West where we could see clearly the result of stormwater management engineering."
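The "flashiness" measure quoted above can be made concrete. One widely used formulation is the Richards-Baker flashiness index: the sum of absolute changes in discharge between consecutive readings, divided by total discharge. The study does not state which metric it used, so the sketch below, with invented daily flows, is illustrative only.

```python
def rb_flashiness(flows):
    """Richards-Baker flashiness index: sum of absolute changes in
    discharge between consecutive readings, divided by total discharge.
    Higher values mean a flashier stream."""
    if len(flows) < 2 or sum(flows) == 0:
        raise ValueError("need at least two readings with nonzero total flow")
    changes = sum(abs(b - a) for a, b in zip(flows, flows[1:]))
    return changes / sum(flows)

# Hypothetical daily mean flows (cubic feet per second):
flashy = [1, 1, 40, 2, 1, 35, 1, 1]        # rapid rise and fall after storms
damped = [10, 11, 13, 12, 11, 10, 11, 10]  # retention smooths the response

print(rb_flashiness(flashy) > rb_flashiness(damped))  # True
```

The Phoenix-area result amounts to the second, damped pattern becoming more common as impervious cover (and engineered retention) increases, the opposite of the Eastern pattern.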
Pollution
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113075116.htm
Climate change expected to shift location of East Asian Monsoons
More than a billion people in Asia depend on seasonal monsoons for their water needs. The Asian monsoon is closely linked to a planetary-scale tropical air flow which, according to a new study by Lawrence Berkeley National Laboratory (Berkeley Lab), will most likely shift geographically as the climate continues to warm, resulting in less rainfall in certain regions.
Berkeley Lab researchers Wenyu Zhou and Da Yang, along with Shang-Ping Xie of Scripps Institution of Oceanography at UC San Diego, used global climate models to study the so-called Hadley cell, which is the name of this tropical atmospheric circulation pattern. Their model results suggest that the East Asian Monsoon will shift geographically as the climate continues to warm, and that enhanced warming at the equator will drive this shift. Their study was published recently. The Hadley cell consists of two components -- moist air that rises at the equator, or the deep tropics, causing heavy precipitation during monsoons, and dry air that descends in the subtropics on either side of the equator, resulting in dry conditions in the subtropics. Under anthropogenic warming, the dry subtropical part will expand towards the north and south poles, while the moist deep tropical part will get smaller, according to global climate models. For this study, the researchers used a worst-case climate change scenario, as defined by the U.N. Intergovernmental Panel on Climate Change, to model the climate in the last 30 years of the 21st century. By investigating changes in the Hadley cell during different seasons, the researchers found that the occurrence of monsoon rains will shift towards the equator. "Previous studies suggested that, on average, the Hadley cell will expand poleward in warmer climates. However, we show a different behavior in the summer months -- a contraction towards the equator in June-July, due to the effect of the enhanced warming at the equator," said Zhou. This unexpected contraction could have profound impacts on the subtropical regional climate. Rainfall in East Asia currently peaks in the summer months. "The monsoon is an important water resource to East Asia and large parts of China," said Yang.
"So how it moves or changes with climate will have a huge impact on water resource management and on the daily lives of people in these areas." Looking forward, this study opens the door for new research directions, the researchers say. "We are beginning to investigate the impact on other regional features, such as the North American monsoon and the hurricane tracks," said Zhou. And while this early-summer contraction was evident in their computer modeling, another important question is whether it can be seen in real-life observations. Their preliminary results suggest that over the past 30 years these patterns have been dominated by natural variability. The effect of global warming has not yet been apparent. "In other words, the consequences of climate change, as suggested in this study, are waiting to be seen," Yang said.
Pollution
2019
November 13, 2019
https://www.sciencedaily.com/releases/2019/11/191113092600.htm
Environmental cost of cryptocurrency mines
Bitcoin, Ethereum, Litecoin and Monero -- the names of digital-based 'cryptocurrencies' are being heard more and more frequently. But despite having no physical representation, could these new methods of exchange actually be negatively impacting our planet? It's a question being asked by researchers at The University of New Mexico, who are investigating the environmental impacts of mining cryptocurrencies.
"What is most striking about this research is that it shows that the health and environmental costs of cryptocurrency mining are substantial; larger perhaps than most people realized," said Benjamin Jones, UNM researcher and assistant professor of economics. Cryptocurrency is an internet-based form of exchange that exists solely in the digital world. Its allure comes from using a decentralized peer-to-peer network of exchange, produced and recorded by the entire cryptocurrency community. Independent "miners" compete to solve complex computing algorithms that then provide secure cryptographic validation of an exchange. Miners are rewarded in units of the currency. Digital public ledgers are kept for "blocks" of these transactions, which are combined to create what is called the blockchain. According to proponents, cryptocurrencies do not need a third party, or traditional bank, or centralized government control to provide secure validation for transactions. In addition, cryptocurrencies are typically designed to limit production after a point, meaning the total amount in circulation eventually hits a cap. These caps and ledgers are maintained through the systems of users. But the mechanisms that make these currencies so appealing are also using exorbitant amounts of energy. In a new paper titled 'Cryptodamages: Monetary value estimates of the air pollution and human health impacts of cryptocurrency mining', the researchers estimate those costs. "Our expertise is in estimating the monetary damages, due to health and environmental impacts, of different economic activities and sectors," Berrens explained. "For example, it is common for economists to study the impacts from energy use connected to production and consumption patterns in agriculture, or with automobile production and use.
In a world confronting climate change, economists can help us understand the impacts connected to different activities and technologies." The independent production, or 'mining', practices of cryptocurrencies are done using energy-consuming specialized computer hardware and can take place in any geographic location. Large-scale operations, called mining camps, are now congregating around the fastest internet connections and cheapest energy sources -- regardless of whether the energy is green or not. "With each cryptocurrency, the rising electricity requirements to produce a single coin can lead to an almost inevitable cliff of negative net social benefit," the paper states. The UNM researchers argue that although mining practices create financial value, the electricity consumption is generating "cryptodamages" -- a term coined to describe the human health and climate impacts of the digital exchange. "We looked at climate change from greenhouse gas emissions of electricity production and also the impacts local air pollutants have when they are carried downwind and across local communities," Goodkind said. The researchers estimate that in 2018, every $1 of Bitcoin value created was responsible for $0.49 in health and climate damages in the United States. Their data shows that at one point during 2018, the cost in damages that it took to create Bitcoin matched the value of the exchange itself. Those damages arise from increased pollutants generated from the burning of fossil fuels used to produce energy, such as carbon dioxide, fine particulate matter, nitrogen oxides and sulfur dioxide. Exposure to some of these pollutants has been linked to increased risk of premature death. "By using large amounts of electricity generated from burning fossil fuels," Jones said.
"Cryptocurrency mining is associated with worse air quality and increased CO2 emissions, which impacts communities and families all across the country, including here in New Mexico." In addition to the human health impacts from increased pollutants, the trio looked at the climate change implications and how the current system of mining encourages high energy use. "An important issue is the production process employed in the blockchain for securing new blocks of encrypted transactions," Berrens explained. "Along with supply rules for new units of a currency, some production processes, like the predominant Proof-of-Work (POW) scheme used in Bitcoin, require ever increasing computing power and energy use in the winner-take-all competition to solve complex algorithms, and secure new blocks in the chain." Although relatively limited in overall use currently, there are cryptocurrencies with alternative production schemes which require significantly less energy use. The researchers hope that by publicizing the health and climate impacts of such schemes, they will encourage alternative methods of mining. "The ability to locate cryptomining almost anywhere (i.e. following the cheapest, under-regulated electricity source) ... creates significant challenges to implementing regulation," the paper says. Goodkind says the specialized machines used for mining also have to be kept cool, so they won't overheat while computing such complex algorithms. That additional energy use was not part of this study, which means even more energy is being consumed than is currently being accounted for when looking solely at the usage of running the machines. Moving forward, the challenging public policy question is: "How can you make the people who are creating the damage pay for the cost, so that it is considered in the decision in how to mine cryptocurrencies?" Goodkind concluded.
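The study's headline figure, roughly $0.49 of health and climate damages per $1 of Bitcoin value created in 2018, is at bottom a damage-per-value ratio built from electricity use and per-unit damage costs. The sketch below shows the shape of such a calculation; every number in it is hypothetical, chosen only so the ratio lands near the reported figure, and none is taken from the paper.

```python
def cryptodamages(kwh_used, kg_co2_per_kwh, usd_per_kg_co2, usd_health_per_kwh):
    """Monetized climate plus human-health damages of mining electricity use.
    All parameter values passed in below are hypothetical."""
    climate = kwh_used * kg_co2_per_kwh * usd_per_kg_co2
    health = kwh_used * usd_health_per_kwh
    return climate + health

# Hypothetical mining period: 100 GWh of coal-heavy electricity used to
# create $10 million of coin value.
damages = cryptodamages(kwh_used=100e6, kg_co2_per_kwh=0.7,
                        usd_per_kg_co2=0.05, usd_health_per_kwh=0.014)
coin_value = 10_000_000
ratio = damages / coin_value
print(round(ratio, 2))  # damages per $1 of coin value created
```

When this ratio reaches 1.0, creating the coin destroys as much value in damages as the coin is worth, the crossover the researchers observed at one point in 2018.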
Pollution
2019
November 12, 2019
https://www.sciencedaily.com/releases/2019/11/191112090642.htm
Study shows where global renewable energy investments have greatest benefits
A new study finds that the amount of climate and health benefits achieved from renewable energy depends on the country where it is installed. Countries with higher carbon dioxide (CO2) emissions and more air pollution, such as India, China, and areas in Southeast Asia and Eastern Europe, achieve greater climate and health benefits per megawatt (MW) of renewable energy installed than those operating in areas such as North America, Brazil, and parts of Europe. The study in
Researchers measured two types of benefits -- climate benefits (reductions in carbon emissions) and health benefits (decreased mortality attributable to harmful air pollution) -- and developed a user-friendly model to compare how those benefits vary based on where renewable energy is operating. They found climate benefits are greatest in countries where the electricity grid is largely powered by coal with less-efficient plants, including Mongolia, Botswana, Estonia, Iraq, and Australia. Health benefits are greatest in countries with higher population densities where people are living downwind of emissions sources, including Myanmar, Bangladesh, Ethiopia, India, and large parts of Eastern Europe. "This new global model allows us to estimate benefits from renewables at the country level, both from reducing greenhouse gas emissions, and including the massive health benefits achieved from reductions in air pollution. That hasn't been done before in the sustainable investment world," said lead author Jonathan Buonocore, a research associate at Harvard C-CHANGE. "For example, the results show that a wind turbine or solar panel can save 30 times more lives if it is placed in India -- where air pollution is often a major public health issue -- than if that same turbine or panel is placed in the U.S., and climate benefits will be about twice as high." This framework can be used by policymakers and investors to reach the Sustainable Development Goals set by the UN in 2015. The goals include ensuring access to clean and affordable energy (SDG 7) and promoting good health and wellbeing (SDG 3) by 2030. "The private sector, and investors in particular, have a unique opportunity to influence how an estimated $2.5 trillion per year can be invested to help achieve the UN Sustainable Development Goals. Investors can use this data-driven, replicable model as a guide to make sustainable investments more effectively and efficiently," said Dr. Dinah A.
Koehler, Sc.D., Harvard School of Public Health (2003), and Head of Research at Net Purpose. She initiated this research collaboration while at UBS Asset Management. This latest study in Palgrave is part of a series meant to help investors take meaningful climate action through their investments. Its methodology draws from research the authors published in Science last year, which sought to build a broader framework with standardized metrics that can be used by investors looking to create portfolios that are climate- and health-friendly. "People are increasingly concerned with whether or not their investments contribute to a healthier, more sustainable world, but it's not always clear which companies are actually providing a benefit to society," said senior author Ramon Sanchez, a research associate in the Department of Environmental Health at the Harvard Chan School. "Our framework can discern the actual climate and health benefits that companies are having around the world through their products and services. Investors in these companies can have more confidence in the positive impacts they're making on health and the environment." A similar study from Harvard Chan that assesses the benefits of renewable energy in different locations across the United States was published recently in Environmental Research Letters. It affirmed that location is crucial for determining health and climate benefits of renewable energy deployment, and is largely driven by what fuel sources are displaced, what those emissions are, and how many people live downwind. To show how this model can be used, researchers compared five anonymized renewable energy companies that report country-level operating data. It showed there was significant variation in the amount of climate and health benefits achieved per MW of renewable energy installed depending on where the company was operating.
For example, Wind Company C, which operates mostly in India, saves about 250 lives per 1000 additional MW of wind energy installed per year, while Wind Companies A and B, which operate mostly in North America and Europe, save only 25 lives with the same amount of wind energy installed.
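The company comparison above amounts to applying country-specific benefit factors to installed capacity. A minimal sketch of that idea follows; the factors are hypothetical, loosely scaled to the figures quoted in the article (about 250 lives per 1000 MW per year in India, roughly 30 times the U.S. rate, and about twice the climate benefit), and are not the study's actual coefficients.

```python
# Illustrative, hypothetical benefit factors per country, loosely scaled
# to the numbers quoted in the article.
BENEFIT_FACTORS = {
    # country: (lives saved per 1000 MW installed, kt CO2 avoided per MW)
    "India": (250.0, 2.0),
    "United States": (8.0, 1.0),
}

def benefits(country, mw_installed):
    """Estimated annual health and climate benefits of new capacity."""
    lives_per_1000mw, kt_co2_per_mw = BENEFIT_FACTORS[country]
    return {
        "lives_saved": lives_per_1000mw * mw_installed / 1000.0,
        "kt_co2_avoided": kt_co2_per_mw * mw_installed,
    }

india = benefits("India", 1000)
us = benefits("United States", 1000)
print(india["lives_saved"] / us["lives_saved"])  # >30x more lives per MW
```

The same capacity investment thus buys very different social returns depending on what generation it displaces and who lives downwind.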
Pollution
2019
November 11, 2019
https://www.sciencedaily.com/releases/2019/11/191111180100.htm
Nature's backup plan for converting nitrogen into plant nutrients
Although nitrogen is essential for all living organisms -- it makes up 3% of the human body -- and comprises 78% of Earth's atmosphere, it's almost ironically difficult for plants and natural systems to access it.
Atmospheric nitrogen is not directly usable by most living things. In nature, specialized microbes in soils and bodies of water convert nitrogen into ammonia -- a crucial form of nitrogen that life can easily access -- through a process called nitrogen fixation. In agriculture, soybeans and other legumes that facilitate nitrogen fixation can be planted to restore soil fertility. An additional obstacle in the process of making nitrogen available to the plants and ecosystems that rely on it is that microbial nitrogen "fixers" incorporate a complex protein called nitrogenase that contains a metal-rich core. Existing research has focused on nitrogenases containing a specific metal, molybdenum. The extremely small amount of molybdenum found in soil, however, has raised concerns about the natural limits of nitrogen fixation on land. Scientists have wondered what restrictions the scarcity of molybdenum places on nature's capacity to restore ecosystem fertility in the wake of human-made disturbances, or as people increasingly search for arable land to feed a growing population. Princeton University researchers have found evidence that other metals can facilitate nitrogen fixation when molybdenum is scarce, which suggests that the process may be more resilient than previously thought, according to a recently published study. "This work prompts a major revision of our understanding of how micronutrients control ecosystem nitrogen status and fertility," said senior author Xinning Zhang, assistant professor of geosciences and the Princeton Environmental Institute. "We need to know more about how nitrogen fixation manifests in terms of nutrient budgets, cycling and biodiversity," she said. "One consequence of this finding is that current estimates of the amount of nitrogen input into boreal forests through fixation may be significantly underestimated.
This is a major issue for our understanding of nutrient requirements for forest ecosystems, which currently function as an important sink for anthropogenic carbon." First author Romain Darnajoux, a postdoctoral research associate in Zhang's research group, explained that the findings validate a long-held hypothesis in the scientific community that different metal variants of nitrogenase exist so that organisms can cope with changes in metal availability. The researchers found that vanadium-based nitrogen fixation was only substantive when environmental molybdenum levels were low. "It would seem that nature evolved backup methods to sustain ecosystem fertility when the environment is variable," Darnajoux said. "Every nitrogen-cycle step involves an enzyme that requires particular trace metals to work. Molybdenum and iron are typically the focus of scientific study because they're considered to be essential in the nitrogen-fixing enzyme nitrogenase. However, a vanadium-based nitrogenase also exists, but nitrogen input by this enzyme has been unfortunately largely ignored." Darnajoux and Zhang worked with Nicolas Magain and Francois Lutzoni at Duke University and Marie Renaudin and Jean-Philippe Bellenger at the University of Sherbrooke in Québec. The researchers' results suggest that the current estimates of nitrogen input into boreal forests through fixation are woefully low, which would underestimate the nitrogen demand for robust plant growth, Darnajoux said. Boreal forests help mitigate climate change by acting as a sink for anthropogenic carbon.
Though these northern forests do not see as many human visitors as even the most lightly populated metropolis, human activities can still have major impacts on forest fertility through the atmospheric transport of air pollution loaded with nitrogen and metals such as molybdenum and vanadium. "Human activities that substantially change air quality can have a far-reaching influence on how even remote ecosystems function," Zhang said. "The findings highlight the importance of air pollution in altering micronutrient and macronutrient dynamics. Because air is a global commons, the connection between metals and nitrogen cycling and air pollution has some interesting policy and management dimensions." The researchers' findings could help in the development of more accurate climate models, which do not explicitly contain information on molybdenum or vanadium in simulations of the global flow of nitrogen through the land, ocean and atmosphere. The importance of vanadium-driven nitrogen fixation extends to other high-latitude regions, and most likely to temperate and tropical systems, Darnajoux and Zhang said. The threshold for the amount of molybdenum an ecosystem needs to activate or deactivate vanadium nitrogen fixation that they found in their study was remarkably similar to the molybdenum requirements of nitrogen fixation found for samples spanning diverse biomes. The researchers will continue the search for vanadium-based nitrogen fixation in the northern latitudes. They've also turned their eyes toward areas closer to home, initiating studies of micro- and macronutrient dynamics in temperate forests in New Jersey, and they plan to expand their work to tropical systems. The paper, "Molybdenum threshold for ecosystem-scale alternative vanadium nitrogenase activity in boreal forests," was published by the
Pollution
2019
November 11, 2019
https://www.sciencedaily.com/releases/2019/11/191111154134.htm
New particle analysis technique paves way for better air pollution monitoring
A new technique for continuously monitoring both the size and optical properties of individual airborne particles could offer a better way to monitor air pollution. It is especially promising for analyzing fine particulate matter measuring less than 2.5 microns (PM2.5), which can reach deep into the lungs and cause health problems.
"Air pollution has become an essential problem in many countries," said research team leader Shangran Xie from the group of Prof. Philip Russell at Max Planck Institute for the Science of Light in Germany. "Since our setup is very simple and compact, it should be possible to turn it into a table-top device for continuously monitoring airborne PM2.5 in urban areas and industrial sites." In The Optical Society (OSA) journal, the researchers report the new analysis technique. "The most unique feature of our technique is that it can count the number of particles -- which is related to the level of pollution -- while simultaneously providing detailed real-time information on particle size distribution and chemical dispersion," said Xie. "This additional information could be useful for fast and continuous pollution monitoring in sensitive areas, for example." For the new analysis approach, airborne particles are trapped inside a laser beam by optical forces and propelled forwards by radiation pressure. The trapping force is strong enough to overcome the gravitational force acting on very small particles such as PM2.5 and automatically aligns the particles with a hollow-core photonic crystal fiber. These special fibers feature a central core that is hollow and surrounded by a glass microstructure that confines light inside the fiber. Once aligned, the laser light propels the particle into the fiber, causing the laser light inside the fiber to scatter and create a detectable reduction in the fiber transmission. The researchers developed a new signal processing algorithm to retrieve useful information from the particle-scattering data in real time. After detection, the particle is simply ejected from the fiber without degrading the device. "The transmission signal from the fiber also lets us measure time-of-flight, which is the time the particle takes to travel through the fiber," said Abhinav Sharma, the doctoral student working on this project.
"The drop in fiber transmission together with the time-of-flight information allows us to unambiguously calculate the particle size and refractive index. The refractive index can assist in identifying the particle material because this optical property is already known for most common pollutants." The researchers tested their technique using polystyrene and silica particles of several different sizes. They found that the system could precisely separate particle types and could measure the 0.99-micron silica particle with a resolution as small as 18 nanometers. The researchers plan to test the system's ability to analyze particles which are more commonly found in the atmosphere. They also want to demonstrate the technique's ability to perform measurements in liquid, which would be useful for water pollution monitoring. They have filed a patent on this technique and plan to continue to develop prototype devices, such as ones that could be used to monitor air pollution outside the lab.
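Once size and refractive index have been retrieved from the transmission drop and the time-of-flight, identifying the material reduces to a lookup against known optical properties. A hedged sketch of that final step follows; the retrieval itself (the hard part) is not shown, the tolerance is invented, and the "measured" values are hypothetical. The reference indices (about 1.59 for polystyrene, 1.45 for silica at visible wavelengths) are standard textbook values.

```python
# Reference refractive indices (visible wavelengths) for two common
# test materials; the measured values passed in below are invented.
MATERIALS = {"polystyrene": 1.59, "silica": 1.45}

def identify(measured_index, tolerance=0.05):
    """Match a retrieved refractive index to the nearest known material."""
    name, ref = min(MATERIALS.items(),
                    key=lambda item: abs(item[1] - measured_index))
    return name if abs(ref - measured_index) <= tolerance else "unknown"

print(identify(1.46))  # nearest to silica (1.45)
print(identify(1.60))  # nearest to polystyrene (1.59)
print(identify(1.33))  # nothing listed within tolerance: "unknown"
```

In a deployed monitor, the lookup table would cover common atmospheric pollutants, and the size measurement would constrain the match further.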
Pollution
2019
November 8, 2019
https://www.sciencedaily.com/releases/2019/11/191108102845.htm
Turbulence creates ice in clouds
Vertical air motions increase ice formation in mixed-phase clouds. This correlation was predicted theoretically for a long time, but could now be observed for the first time in nature. This result was published by a team from Leibniz Institute for Tropospheric Research (TROPOS) in Leipzig.
The formation of ice in clouds is a core element of the water cycle on Earth. It is usually difficult to isolate the ice formation process in order to study it individually, because the interaction of aerosol particles, air motion and microphysical processes in clouds is too complex. Nevertheless, it is necessary to understand these processes in detail in order to better map this mechanism in weather and climate models. The cloud researchers concentrated on a less spectacular and therefore less considered form of clouds in order to exclude processes other than primary ice formation. They investigated large cloud fields at an altitude of about 2 to 8 kilometres with a vertical extent of only 100 to 200 metres, which contained extremely little ice, in the range of micrograms per cubic meter. Such thin clouds allow both ice to be detected with a cloud radar and the vertical air movement with a Doppler lidar, as the laser beam can still penetrate the clouds. Both lidar and radar instruments were therefore necessary to investigate the turbulence and ice formation in these clouds above Leipzig from the ground. "The effect only became visible when we observed the ice directly below the clouds' top layer. Our findings enable for the first time quantitative and well-constrained insights into the relationship between turbulence and ice formation in the atmosphere. The stronger a cloud is 'shaken' by vertical air motions, the more ice falls out of it," reports Dr Johannes Bühl of TROPOS. This correlation was measured for clouds colder than -12 °C. Next, the remote sensing scientists want to explore the influence of aerosols by taking a closer look at the beginning (ice nucleation) and end (precipitation of ice particles) of the ice formation process. Ice formation in clouds is an important process in the atmosphere, because without this ice practically no precipitation would fall from clouds in the middle latitudes of the Earth.
As far-reaching as these processes may be, many details have not yet been sufficiently understood and are therefore not taken into account in the weather and climate models.
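The reported relationship, stronger vertical "shaking" going with more ice, is a correlation between two remote-sensing observables: the strength of vertical air motion from the Doppler lidar and the ice content retrieved from the cloud radar. A minimal sketch of that comparison, using invented paired measurements rather than the study's data:

```python
import math

# Hypothetical paired observations just below cloud top:
# Doppler-lidar vertical-velocity variability (m/s) and
# radar-derived ice water content (micrograms per cubic meter).
turbulence = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
ice_content = [2.0, 3.1, 4.5, 5.2, 7.8, 8.9]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(turbulence, ice_content)
print(round(r, 2))  # strong positive association in this synthetic sample
```

The actual study quantifies this relationship from colocated lidar and radar profiles for clouds colder than -12 °C; the synthetic values above only illustrate the direction of the effect.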
Pollution
2019
November 7, 2019
https://www.sciencedaily.com/releases/2019/11/191107202553.htm
Aviation emissions' impacts on air quality larger than on climate, study finds
New research from the Massachusetts Institute of Technology (MIT) has quantified the climate and air quality impacts of aviation, broken down by emission type, altitude and location.
The MIT team found that growth in aviation causes twice as much damage to air quality as to the climate. The lead researcher on the study, published today in an IOP Publishing journal, is Dr Sebastian Eastham of the Laboratory for Aviation and the Environment in MIT's Department of Aeronautics and Astronautics. He said: "Aviation emissions are an increasingly significant contributor to anthropogenic climate change. They cause five per cent of global climate forcing. When you consider the full flight, which includes emissions from takeoff, cruise and landing, aircraft emissions are also responsible for around 16,000 premature deaths a year from impaired air quality. This is small compared to other sectors, being only around 0.4% of the total deaths attributed annually to global air quality degradation, but it is often overlooked in policy analysis. The challenges for aviation-sector decision makers wanting to reduce these impacts are the trade-offs between different emission types, and their impacts in different locations." Historically, attempts to address the climate and air quality impacts of aviation have been made through changes in policy, technology and/or operations: improvements to fuel efficiency, more stringent emissions standards, and market-based measures to reduce CO2. But the study notes that reducing one type of emission can come at the cost of increasing another, either in absolute terms or by limiting the potential reductions offered by new technology. Dr Eastham explained: "We could decrease NOx emissions … We developed a set of metrics for comparing the climate and air quality impacts of aviation emissions at all flight stages, by estimating the social costs per unit of emitted pollutant.
The cost metrics are broken down by flight phase -- cruise, landing and take-off -- and by the geographical region of emission, both per kg of emission and per kg of fuel burn." The research team applied the metrics to evaluate the effects of a global expansion in aviation, consistent in magnitude with its current annual growth. They then used this as a benchmark for three scenarios. First, they considered a growth scenario with fuel efficiency increases and reductions in NOx emissions. Dr Eastham said: "Our results show three components are responsible for 97 per cent of climate and air quality damages per unit of aviation fuel burn: the air quality impacts of NOx …" "To reduce the climate impacts of aviation, measures aimed at reducing CO2 …" "Finally, we found the air quality impacts of aviation emissions significantly exceed the climate impacts, with air quality impacts being 1.7 to 4.4 times higher than the climate impact per unit of fuel burn. This must be contrasted with ground-based industries, where post-combustion emissions control and access to cleaner fuels are widespread. For example, the climate impacts of the US power sector are of similar magnitude to its air quality impacts, following significant declines in co-pollutant emissions over the past 15 years. This points towards potential political and technological opportunities for reducing the atmospheric impacts of the aviation sector."
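The cost-metric approach described above (a monetized damage figure per kilogram of pollutant, keyed by emission type and flight phase) can be sketched as a simple lookup and sum. All dollar figures below are hypothetical placeholders, not values from the study:

```python
# Sketch of a social-cost metric: damages per kg of pollutant, broken
# down by flight phase. The cost numbers are illustrative only.
SOCIAL_COST_PER_KG = {  # USD per kg emitted (hypothetical values)
    ("NOx", "cruise"): 20.0,
    ("NOx", "landing_takeoff"): 35.0,
    ("CO2", "cruise"): 0.04,
    ("CO2", "landing_takeoff"): 0.04,
}

def total_damages(emissions_kg):
    """Sum monetized damages over (pollutant, phase) emission totals."""
    return sum(SOCIAL_COST_PER_KG[key] * kg for key, kg in emissions_kg.items())

# One hypothetical flight's emission totals, in kg
flight = {("NOx", "cruise"): 12.0, ("NOx", "landing_takeoff"): 3.0,
          ("CO2", "cruise"): 25000.0, ("CO2", "landing_takeoff"): 4000.0}
print(round(total_damages(flight), 2))  # 12*20 + 3*35 + 29000*0.04 = 1505.0
```

The same structure extends naturally to a third key for geographical region, which is how the study reports its metrics.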
Pollution
2019
November 7, 2019
https://www.sciencedaily.com/releases/2019/11/191107122641.htm
Pesticide management is failing Australian and Great Barrier Reef waterways
Scientists say a failure of national management means excessive amounts of harmful chemicals -- many now banned in other jurisdictions such as the EU, USA and Canada -- are damaging the nation's waterways and the Great Barrier Reef.
The new study was led by Dr Jon Brodie from the ARC Centre of Excellence for Coral Reef Studies at James Cook University (Coral CoE at JCU). Dr Brodie says pesticides found at concentrations exceeding the nation's own water quality guidelines have the potential to seriously damage aquatic plants and animals. Insecticides affect prawns in freshwater streams, and herbicides affect marine species such as seagrass. "The notorious insecticide imidacloprid -- now banned for its effects on bees across Europe and the USA, and soon to be banned in Canada -- is found in many freshwater streams and estuaries in the Great Barrier Reef region and in Queensland more broadly," Dr Brodie said. "This can have a serious effect on aquatic life." The regulation and management of pesticides in Australia is a joint responsibility of the Australian and State governments. "There is no evidence at the moment that imidacloprid may be banned or regulated more closely in Australia," Dr Brodie said. "The processes of the Australian Government regulator, the Australian Pesticide and Veterinary Medicine Authority (APVMA), have serious deficiencies and in many cases are seriously flawed," he said. However, Dr Brodie notes that the Queensland Government is taking action to reduce pesticide pollution through research, monitoring, risk assessments and better pesticide application methods. Yet only so much can be done at a local level. "The APVMA are very slow to act on the copious evidence surrounding, for example, the continued use of a pesticide like imidacloprid." The highest concentrations of pesticides, often above Australian guidelines, are found in freshwater bodies adjacent to, and downstream of, areas of intensive cropping, mainly sugarcane cultivation and horticulture. Dr Brodie says Australia has the expertise and knowledge of pesticide management to take action and regulate. "Though pesticide regulation and management in the Great Barrier Reef region has been unsuccessful, there is some hope that pesticide levels and risks to species and ecosystems can be reduced," he said.
Pollution
2019
November 6, 2019
https://www.sciencedaily.com/releases/2019/11/191106154523.htm
Huge gaps in research on microplastics in North America
Amid increasing concern about the effects of plastic pollution on marine ecosystems, a new study led by Portland State University found that North America is lagging behind other continents in understanding the potential risks that microplastics and associated pollutants pose to both fisheries and the humans who consume the seafood.
Researchers from Portland State University (PSU), Oregon State University (OSU), and the University of North Carolina-Wilmington (UNC-W) reviewed microplastics studies on commercially important fishery species published before March 1, 2019, finding that most of the existing literature comes from Europe, Asia, and South America. "Because seafood -- both aquacultured and wild-caught -- is so important to the human diet and culture, it's really important to investigate microplastics specifically on our continent rather than rely on data from another part of the world, because environmental conditions can be very different," said Britta Baechler, a Ph.D. student in PSU's Earth, Environment and Society program. The study identifies a set of research priorities for North America. "We think of North America as a hotspot for scientific research, yet in terms of understanding microplastics -- both contamination in our commercial fishery species and understanding effects -- we're lagging far behind," said Elise Granek, a professor of environmental science and management in PSU's College of Liberal Arts and Sciences.
Pollution
2019
November 6, 2019
https://www.sciencedaily.com/releases/2019/11/191106085540.htm
Combatting air pollution with nature
Air pollution is composed of particles and gases that can have negative impacts on both the environment and human health. Technologies to mitigate pollution have become widespread in recent years, but scientists are now exploring a new, pared-down approach: using nature to restore ecological balance. They report their findings in an ACS journal.
In the decades since the Clean Air Act of 1970, air quality across the U.S. has improved dramatically. However, according to the American Lung Association, four in 10 people in the U.S. still live in areas with poor air quality, which can result in serious health effects such as asthma, lung cancer and cardiovascular disease. Technology to control and remove pollutants can be costly and often requires a great deal of energy. As an alternative, researchers are looking to nature-based solutions (NBS), a form of sustainable infrastructure that uses natural, rather than manufactured, elements. NBS are adaptable and cost-effective and can support native wildlife, making them a truly "green" option for combatting pollution and climate change. To better understand the feasibility of using NBS to reduce pollution, Bhavik Bakshi and colleagues performed a data-driven analysis. The researchers used publicly available data and calculated factors such as current vegetation cover, county-level emissions of air pollutants, and land area available for restoration to determine the potential benefits of NBS across the U.S. Next, they calculated the financial aspects of implementing NBS to mitigate various air pollutants. The team found that in 75% of the counties analyzed, it was more economical to use nature-based solutions to mitigate emissions than to implement technological interventions. Counties that were not strong candidates for NBS either did not have enough available land, or faced technological costs lower than those of restoration. Overall, the researchers found that both urban and rural populations could benefit from NBS, though many environmental factors should be considered before putting the approach into practice.
Pollution
2019
November 4, 2019
https://www.sciencedaily.com/releases/2019/11/191104112904.htm
City apartments or jungle huts: What chemicals and microbes lurk inside?
How does life in a walled urban apartment differ from life in a jungle hut that is open to nature?
Researchers at Rutgers and other universities found city homes to be rife with industrial chemicals, cleaning agents and fungi that favor warm, dark surfaces, while jungle huts had fresher air, more sunlight and the natural materials with which humans evolved. The differences may profoundly affect our health, according to the study. The researchers compared microscopic materials in homes and on people's bodies across the spectrum of urbanization in the Amazon basin. The locations included a remote Peruvian jungle village of thatched huts with no walls; a rural Peruvian town with wooden houses lacking indoor plumbing; a Peruvian city of 400,000 residents with more modern amenities; and the metropolis of Manaus, Brazil, which has a population of two million. "Urbanization represents a profound shift in human behavior. Modern living literally walls us off from the natural environment and shuts us in with industrial compounds, higher carbon dioxide levels and skin-loving fungi," said senior author Maria Gloria Dominguez-Bello, a professor in Rutgers University-New Brunswick's Department of Biochemistry and Microbiology and Department of Anthropology. "This study sheds light on how human-created environments affect our health and how we can think about improving them." The study found that the diversity of chemicals clinging to indoor surfaces increases dramatically with urbanization. Molecules derived from medications and cleaning agents were part of the interior environment of homes in the metropolis and the city, but not in the rural or jungle homes. Although the urban dwellers reported cleaning more frequently, surfaces in their homes had a greater diversity of fungal species associated with human skin. This may be because the fungi have become resistant to cleaning products, the study said. It may also reflect the urban homes' warmer temperatures, reduced air exchange, lower levels of natural light and higher loads of human skin flakes.
Samples from people in the different environments also revealed a greater diversity of foot fungi on the urban dwellers. In the rural and jungle homes, by contrast, the researchers found a greater variety of bacteria and fungi that live outdoors, and fewer species known for colonizing the human body. "We are just now starting to quantify the effect of cutting ourselves off from the natural environment with which we as humans co-evolved, and of replacing it with a synthetic environment," said co-corresponding author Rob Knight, a professor and director of the Center for Microbiome Innovation at the University of California-San Diego. "What's next is to identify the specific differences associated with urbanization that have a health impact, and to design interventions to reverse them. Those could be anything from knowing how many minutes a week should be spent outdoors in natural environments to air fresheners that are good for the microbiome." Dominguez-Bello said exposure to outdoor germs and natural materials may benefit the human microbiome. Her prior research found that people in urbanized societies have lost a substantial part of their microbiota diversity compared with hunter-gatherers in isolated Amazonian villages.
Pollution
2019
November 1, 2019
https://www.sciencedaily.com/releases/2019/11/191101111550.htm
Four decades of data sounds early warning on Lake George, NY
Although concentrations of chemicals and pollutants like salt and nutrients have increased in the deep waters of Lake George, they're still too low to harm the ecosystem at those depths, according to an analysis of nearly 40 years of data published Thursday.
The Offshore Chemistry Program (OCP), run by Rensselaer Polytechnic Institute's Darrin Fresh Water Institute, has been monitoring the deep waters of Lake George for 40 years. "There are very few lake studies in the world that have given us such a wealth of information over time as the Offshore Chemistry Program," said Rick Relyea, director of the Darrin Fresh Water Institute and the Jefferson Project. "Thanks to that long-term commitment, we have discovered that the deep waters of the lake are fairly resilient to human impacts. This insight has shifted our research focus from the deep water to the shallow water, streams and wetlands -- places that are probably a lot less resilient and where most of the human impacts are being felt." William Hintz, lead author of the study and a former Jefferson Project researcher, said the trends -- mostly defined by large percentage increases in still-scarce chemicals -- offer a useful heads-up. "Levels of salt, the nutrient orthophosphate, and chlorophyll -- an indirect measure of the floating algae that feed on orthophosphate -- have increased substantially, but as far as we can determine, none are at a level that will cause harm, and that's good news," said Hintz, now an assistant professor at the University of Toledo. "We're fortunate to have this early warning. We know how to bring those levels back down, and we have time to reverse these trends before they cause harm." "Updated findings of the lake's offshore chemistry program demonstrate the continuing resilience of Lake George to a growing array of stressors, while also underscoring the need to focus research closer to shore and on the human activities associated with increasing pressures on water quality -- from wastewater and stormwater to road salt impacts," said Eric Siy, executive director of The FUND for Lake George.
"The groundbreaking work of the Jefferson Project is responding to this need to gain a fuller understanding of lake and watershed conditions that can inform policy measures designed to maintain the lake's outstanding water quality," Siy said. "It is exceptionally gratifying to see what began as a couple of researchers monitoring the lake in the early 1980s grow into the internationally recognized Jefferson Project of today," said Lawrence Eichler, a Rensselaer research scientist and author of the study who has led the OCP effort for nearly all of its 40 years. "The incremental changes reported over the past 37 years are a testament to the lake's resilience. However, Lake George's resilience is not unlimited." Lake George is famed for the transparency of its water, and the study found that -- despite increases in chlorophyll -- transparency has not decreased over 37 years, and actually improved at one site. Transparency currently averages about nine meters across the lake. Orthophosphate increased by 70%, from 0.72 to 1.23 micrograms per liter, a change that might be attributable to natural causes but is more likely coming from improperly functioning septic systems, failing wastewater treatment systems, and stormwater runoff. This increase in orthophosphate appears to have fueled an increase in chlorophyll, which gives algae their green color. Chlorophyll readings rose by 32%, from 1.3 to 1.7 micrograms per liter.
Given that transparency has not declined in the lake, the increase in chlorophyll is likely caused by a greater density of chlorophyll in individual algal cells or a change in algal species, rather than an increase in total algal biomass. Levels of total phosphorus, which includes orthophosphate as well as forms of phosphorus bound in the bodies of living and dead organisms, have remained steady at about 4 micrograms per liter. Concentrations of calcium, which invasive species like Asian clams and zebra mussels need to build their shells, have risen by 14%, but this represents a small absolute increase, from 10.4 to 11.2 milligrams per liter. To sustain zebra mussels, which are currently isolated near structures like concrete piers that leach calcium into the lake, calcium levels would need to reach about 20 milligrams per liter. Salt concentrations -- which Jefferson Project research has established can have profound ecological effects at high levels -- have nearly tripled in the deep waters. While the percentage change is large, the actual concentrations are still considered low. Sodium concentrations have risen from 3 to 10 milligrams per liter, and chloride concentrations from 6 to 18 milligrams per liter. For comparison, the EPA has set guidelines of sodium levels below 20 milligrams per liter for drinking water, and 230 and 860 milligrams of chloride per liter for short-term and long-term exposures in fresh water, respectively. At present rates of increase, the deep waters of Lake George would reach the sodium guideline level within 50 years. Shallow areas are expected to have higher salt concentrations, especially where streams enter the lake. "We are seeing changes in the deep water, but the real concern is that, during some days in the spring, chloride levels in the streams are not at the 18 milligrams per liter we currently see in the deep waters.
Instead, they can spike up to 2,000 milligrams per liter, which should be lethal to many stream species," said Relyea. Discussions prompted by the previous 30-year report have already led to efforts to curb applications of road salt, the principal source of increased salt levels in the basin, demonstrating the potential rewards of heeding early warning signs. The researchers also saw improvements in the levels of sulfates and nitrates, the chemicals in "acid rain," which have decreased as a result of the Clean Air Act. Over 37 years, sulfate declined by 55% and nitrate by 54%, with a concurrent increase in pH and alkalinity, which are considered good signs of recovery from acid rain deposition. The Clean Air Act confronted acid rain, but no similar mechanism has been established to tackle climate change, which is showing its effects on Lake George. Mean temperatures in the upper waters of the lake rose by 1.8°C, or 0.05°C per year. "Taken together, the OCP tells us that in nearly 40 years of human activities, the lake has changed in a number of ways, often not in good directions. These changes have been relatively small, but they serve as a wake-up call. We have the ability to adjust our impacts on the lake and restore it to a more pristine condition," said Relyea.
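The "within 50 years" sodium figure can be checked by simple linear extrapolation of the reported deep-water trend; this sketch assumes the 37-year rate of increase continues unchanged:

```python
# Linear-extrapolation check of the sodium estimate, using the reported
# deep-water trend: 3 -> 10 mg/L over the 37-year monitoring period.
start, end, years = 3.0, 10.0, 37.0      # mg/L over the study period
rate = (end - start) / years             # ~0.19 mg/L per year
guideline = 20.0                         # EPA drinking-water sodium guideline, mg/L
years_to_guideline = (guideline - end) / rate
print(round(years_to_guideline, 1))      # ~52.9, broadly consistent with "within 50 years"
```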
Pollution
2019
October 31, 2019
https://www.sciencedaily.com/releases/2019/10/191031130547.htm
Oil and gas wastewater used for irrigation may suppress plant immune systems
Hydraulic fracturing, a method of extracting oil and gas from horizontally drilled wells, helps the United States produce close to 4 billion barrels of oil and natural gas per year, rocketing the U.S. to the top of the world's oil-producing nations.
The highly profitable practice comes with a steep price: for every barrel of oil, oil and gas extraction also produces about seven barrels of wastewater, consisting mainly of naturally occurring subsurface water extracted along with the fossil fuels. That's about 2 billion gallons of wastewater a day. Companies, policymakers and scientists are on the lookout for new strategies for dealing with that wastewater. Among the most tantalizing ideas, given water scarcity issues in the West, is recycling it to irrigate food crops. A new Colorado State University study gives pause to that idea. The team, led by Professor Thomas Borch of the Department of Soil and Crop Sciences, conducted a greenhouse study using produced water from oil and gas extraction to irrigate common wheat crops. "The big question is, is it safe?" said Borch, a biogeochemist who has joint academic appointments in the Department of Chemistry and the Department of Civil and Environmental Engineering. "Have we considered every single thing we need to consider before we do this?" Typically, oil and gas wastewater, also known as produced water, is trucked away from drilling sites and reinjected into the Earth via deep disposal wells. Such practices have been documented to induce earthquakes and may contaminate surface water and groundwater aquifers. The idea of using such water for irrigation has prompted studies testing factors like crop yield, soil health, and contaminant uptake by plants, especially since produced water is often high in salts and its chemistry varies greatly from region to region.
Borch, who has conducted numerous oil and gas-related studies, including of how soils fare during accidental spills, wondered whether anyone had tried to determine whether irrigation water quality affects crops' inherent ability to protect themselves from disease. The experiments were conducted in collaboration with plant microbiome expert Pankaj Trivedi, a CSU assistant professor in the Department of Bioagricultural Sciences and Pest Management, and researchers at the Colorado School of Mines. The team irrigated wheat plants with tap water, two dilutions of produced water, and a salt-water control. They exposed the plants to common bacterial and fungal pathogens and sampled the leaves after the pathogens were verified to have taken hold. Using state-of-the-art quantitative genetic sequencing, the scientists determined that the plants watered with the highest concentration of produced water showed significant changes in the expression of genes plants normally use to fight infections. The study didn't determine exactly which substances in the produced water correlated with suppressed immunity, but the researchers hypothesized that a combination of contaminants such as boron, petroleum hydrocarbons and salt caused the plants to reallocate metabolic resources to fight stress, making it more challenging for them to express disease-fighting genes. "Findings from this work suggest that plant immune response impacts must be assessed before reusing treated oil and gas wastewater for agricultural irrigation," the study authors wrote.
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030170556.htm
System provides cooling with no electricity
Imagine a device that can sit outside under blazing sunlight on a clear day and, without using any power, cool things down by more than 23 degrees Fahrenheit (13 degrees Celsius). It almost sounds like magic, but a new system designed by researchers at MIT and in Chile can do exactly that.
The device, which has no moving parts, works by a process called radiative cooling. It blocks incoming sunlight to keep itself from heating up, and at the same time efficiently radiates infrared light -- which is essentially heat -- straight out into the sky and into space, cooling the device significantly below the ambient air temperature. The key to this simple, inexpensive system is a special kind of insulation: a polyethylene foam called an aerogel. This lightweight material, which looks and feels a bit like marshmallow, blocks and reflects the visible rays of sunlight so that they don't penetrate through it. But it is highly transparent to the infrared rays that carry heat, allowing them to pass freely outward. The new system is described today in a journal paper. Such a system could be used, for example, to keep vegetables and fruit from spoiling, potentially doubling the time the produce could remain fresh in remote places where reliable power for refrigeration is not available, Leroy explains. Radiative cooling is simply the main process that most hot objects use to cool down: they emit midrange infrared radiation, which carries heat energy from the object straight off into space, because air is highly transparent to infrared light. The new device is based on a concept that Wang and others demonstrated a year ago, which also used radiative cooling but employed a physical barrier -- a narrow strip of metal -- to shade the device from direct sunlight and prevent it from heating up. That device worked, but it provided less than half the cooling power of the new system, because it lacked a highly efficient insulating layer. "The big problem was insulation," Leroy explains. The biggest input of heat preventing the earlier device from achieving deeper cooling was the heat of the surrounding air. "How do you keep the surface cold while still allowing it to radiate?" he wondered.
The problem is that almost all insulating materials are also very good at blocking infrared light, and so would interfere with the radiative cooling effect. There has been a lot of research on ways to minimize heat loss, says Wang, who is the Gail E. Kendall Professor of Mechanical Engineering. But this is a different issue that has received much less attention: how to minimize heat gain. "It's a very difficult problem," she says. The solution came through the development of a new kind of aerogel. Aerogels are lightweight materials that consist mostly of air and provide very good thermal insulation, with a structure made up of microscopic foam-like formations of some material. The team's new insight was to make an aerogel out of polyethylene, the material used in many plastic bags. The result is a soft, squishy, white material so lightweight that a given volume weighs just 1/50 as much as water. The key to its success is that while it blocks more than 90 percent of incoming sunlight, thus protecting the surface below from heating, it is very transparent to infrared light, allowing about 80 percent of the heat rays to pass freely outward. "We were very excited when we saw this material," Leroy says. The result is that it can dramatically cool down a plate, made of a material such as metal or ceramic, placed below the insulating layer; this plate is referred to as an emitter. The plate could then cool a container connected to it, or cool liquid passing through coils in contact with it, to provide cooling for produce, air or water. To test their predictions of its effectiveness, the team, along with their Chilean collaborators, set up a proof-of-concept device in Chile's Atacama desert, parts of which are the driest land on Earth. The region receives virtually no rainfall, yet blazing sunlight that could put the device to a real test. The device achieved a cooling of 13 degrees Celsius under full sunlight at solar noon.
Similar tests on MIT's campus in Cambridge, Massachusetts, achieved just under 10 degrees of cooling. That's enough cooling to make a significant difference in preserving produce in remote locations, the researchers say. In addition, it could be used to provide an initial cooling stage for electric refrigeration, minimizing the load on those systems and allowing them to operate more efficiently with less power. Theoretically, such a device could achieve a temperature reduction of as much as 50 C, the researchers say, so they are continuing to work on ways of further optimizing the system so that it could be expanded to other cooling applications, such as building air conditioning, without the need for any source of power. Radiative cooling has already been integrated with some existing air conditioning systems to improve their efficiency. Already, though, the team has achieved a greater amount of cooling under direct sunlight than any other passive radiative system except those that use a vacuum for insulation -- which is very effective but also heavy, expensive, and fragile. This approach could also be a low-cost add-on to any other kind of cooling system, providing additional cooling to supplement a more conventional system. "Whatever system you have," Leroy says, "put the aerogel on it, and you'll get much better performance."
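A rough energy-balance sketch shows why the cover's selective transparency matters: the emitter cools only if the infrared it radiates out through the aerogel exceeds the sunlight the aerogel lets in. The transmittances, effective sky temperature and emissivity below are illustrative assumptions consistent with the figures quoted in the article (blocks over 90 percent of sunlight, passes about 80 percent of infrared), not measured values from the study:

```python
# Toy radiative-cooling energy balance using the Stefan-Boltzmann law.
# All parameter values are illustrative assumptions.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def net_cooling_power(t_emitter_k, t_sky_k=260.0, solar=1000.0,
                      solar_transmit=0.07, ir_transmit=0.8, emissivity=0.95):
    """Net cooling flux (W/m^2) of the emitter; positive means net cooling."""
    radiated = ir_transmit * emissivity * SIGMA * (t_emitter_k**4 - t_sky_k**4)
    absorbed_solar = solar_transmit * solar
    return radiated - absorbed_solar

print(round(net_cooling_power(285.0), 1))  # about 17.4 W/m^2 of net cooling
```

With a less selective cover (say, 20 percent solar transmission), the same emitter would absorb more sunlight than it can radiate away, which is the trade-off the aerogel resolves.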
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030132714.htm
Cumulative environmental exposures increase diabetes risk in rural populations
University of Illinois at Chicago researchers are the first to show that cumulative environmental exposures affect rural and urban populations differently when it comes to diabetes risk. Multiple environmental factors were associated with a greater risk for diabetes in rural and sparsely populated counties compared with their urban counterparts.
Their findings, which are based on an evaluation of 3,134 counties nationwide, have been published. According to the U.S. Centers for Disease Control and Prevention, diabetes affects over 30 million people in the U.S., and 84 million people have prediabetes. While excess food consumption and a lack of exercise are known to influence diabetes risk, those factors alone fail to account for how fast the population is developing diabetes. And while researchers have been interested in how environmental factors affect diabetes risk, a majority of studies have been conducted in urban areas, leaving rural areas neglected in these analyses. "This is one of the few studies to look at environmental effects on diabetes risk nationally and to determine whether or not there is a difference between urban and rural drivers," said Dr. Jyotsna Jagai, first author and research assistant professor of environmental and occupational health sciences at the UIC School of Public Health. "The drivers of both environmental quality and diabetes risk may vary in urban and rural areas. Being able to look at the entire country across this urban/rural continuum was an advantage." To measure the cumulative environmental effect on diabetes risk, Jagai and her colleagues developed an Environmental Quality Index, or EQI. The EQI was derived from combined data across several environmental domains, including the quality of air, water and land, in addition to built and sociodemographic factors within a given area. The sociodemographic domain was based on data such as median household income, education and violent crime rates. Built-domain factors included, for example, the density of fast-food restaurants, the density of fatal accidents, and the percentage of highways versus roadways. Each domain was also assessed independently to determine the biggest drivers of environmental risk for diabetes in specific areas. "The EQI's cumulative assessment is unique," said Dr. Robert Sargis, co-author and UIC associate professor of endocrinology, diabetes and metabolism in the College of Medicine. "In most studies, we are not looking at the combination of factors. We look at single chemicals, or single classes of chemicals, and how they are associated with disease risk. This study pulls together all of the factors we think increase risk and puts them into a single measure to look at the cumulative environment." Overall, in rural and less populated areas, lower overall environmental quality was associated with a higher prevalence of diabetes. Specifically, diabetes risk was most closely associated with the air, built and sociodemographic domains. In urban areas, diabetes risk was associated with the air and sociodemographic domains only. "There might be something happening in rural areas that is different than in urban areas. Our findings suggest that environmental exposures may be a bigger factor in rural counties than in urban areas in the U.S.," Jagai said. "The environment that we are exposed to is broader than pollutants alone. Our health depends on these combined effects, such as sociodemographic or built stressors, that can impact our livelihoods." The researchers say the discrepancies in environmental risk factors for diabetes between urban and rural populations could inform how communities and policymakers approach structural problems that promote disease development. Jagai says that studies like this can specifically help overlooked populations. "Understanding local social and economic demographic factors can help communities develop environmental regulations and policies to improve the health outcomes of their residents," she said.
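A cumulative index of this kind can be illustrated by standardizing each domain's county scores and averaging the results into one number per county. The actual EQI construction is more statistically sophisticated than this, and the domain values below are made up for illustration:

```python
# Minimal sketch of combining several environmental domains into a single
# cumulative index: z-score each domain across counties, then average.
from statistics import mean, pstdev

def zscores(values):
    """Standardize a list of scores to mean 0, standard deviation 1."""
    mu, sd = mean(values), pstdev(values)
    return [(v - mu) / sd for v in values]

def cumulative_index(domains):
    """domains: dict of domain name -> per-county scores (higher = worse)."""
    cols = [zscores(v) for v in domains.values()]
    return [mean(county) for county in zip(*cols)]  # one value per county

# Hypothetical scores for three counties across three domains
domains = {"air": [1.0, 2.0, 3.0], "water": [2.0, 2.0, 5.0],
           "sociodemographic": [0.5, 1.5, 2.5]}
print(cumulative_index(domains))  # one combined score per county
```

Because each domain is standardized first, no single domain's measurement units dominate the combined score, which is the point of a cumulative index.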
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030110033.htm
Traffic exhaust at residential address increases the risk of stroke
High levels of traffic exhaust at one's residence increases the risk of stroke even in low-pollution environments, according to a study by researchers at Karolinska Institutet and other universities in Sweden. The study, published in the journal
Black carbon is the sooty black material emitted from gas and diesel engines, coal-fired power plants and other fuels. In city environments, the emissions come mainly from road traffic. These particles have previously been linked to negative health effects, especially in studies of heavily polluted environments. Now researchers at Karolinska Institutet, University of Gothenburg, Umeå University, the Swedish Meteorological and Hydrological Institute and SLB analysis-Environmental unit in Stockholm have shown that long-term exposure to traffic exhaust at the residential address increases the risk of stroke in Swedish towns.

"This study identifies local traffic exhaust as a risk factor for stroke, a common disease with great human suffering, high mortality and significant costs to society," says Petter Ljungman, researcher at the Institute of Environmental Medicine at Karolinska Institutet and the study's main author. "We see that these emissions have consequences even in low-pollution environments like Swedish cities."

The researchers followed almost 115,000 middle-aged healthy individuals living in Gothenburg, Stockholm and Umeå over a period of 20 years. During this time, some 3,100 of the people suffered a stroke. With the help of dispersion models and Swedish emission inventories, the researchers were able to estimate how much different local emission sources, including traffic exhaust, road wear and residential heating, contributed to particulate matter and black carbon at specific addresses in these cities.

The researchers found that for every 0.3 micrograms per cubic meter (μg/m3) of black carbon from traffic exhaust, the risk of stroke increased by 4 percent. Similar associations were not seen for black carbon emitted from residential heating or for particulate matter in general, neither from inhalable particles with a diameter of 10 micrometers or less (PM10) nor from particles with a diameter of 2.5 micrometers or less (PM2.5).

In the studied cities, the annual averages of PM2.5 ranged from 5.8 to 9.2 μg/m3, considerably lower than the current European Union standard of 25 μg/m3. There is currently no specific metric for black carbon in the EU, which includes it as part of its broader regulation of particulate matter.

"Black carbon from traffic exhaust could be an important measure to consider when assessing air quality and health consequences," says Petter Ljungman.
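The reported association (a 4 percent higher stroke risk per 0.3 μg/m3 of traffic-related black carbon) can be scaled to other exposure contrasts if one assumes a log-linear exposure-response, a common convention in air pollution epidemiology that the article itself does not state. A small illustrative sketch:

```python
def scaled_relative_risk(rr_per_unit, unit, exposure_diff):
    """Scale a relative risk reported per `unit` of exposure to an
    arbitrary exposure difference, assuming log-linearity."""
    return rr_per_unit ** (exposure_diff / unit)

# 4% higher stroke risk per 0.3 ug/m3 -> risk for a 0.9 ug/m3 contrast
rr = scaled_relative_risk(1.04, 0.3, 0.9)
print(round(rr, 3))  # 1.125, i.e. roughly 12.5% higher risk
```

The 0.9 μg/m3 contrast is a hypothetical example, chosen only to show that the increments compound multiplicatively rather than add linearly.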
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030101133.htm
Housing developers could be the secret weapon to improving air quality
In a paper published by a multinational and multidisciplinary team of researchers in the journal
The paper argues that it will take a global community of multi-disciplined researchers to get to the bottom of how air pollution, green infrastructure and human health are connected -- three areas that are typically studied in isolation. However, until the evidence becomes clearer, house developers, urban planners and politicians should be given the best possible information to make sure future built communities provide the maximum environmental and health benefits to residents.

Professor Prashant Kumar, Director of GCARE at the University of Surrey and the lead author of the study, said: "It's thought that the UK needs to build 300,000 homes a year to keep up with demand -- and this pressure for housing is the same in many places across the world. It's clear that we need to enlist housing developers and planners in the effort to reduce air pollution; for them to be effective allies, they need clear and easy to follow guidelines on what is the best way to deploy green infrastructure.

"A simple task such as planting a hedge near a roadside is more complicated than it sounds -- planners will have to make a decision on the species of the plant and its location to maximise benefits and, where possible, we need to make those decisions as easy as possible for them.

"We at the University of Surrey, together with colleagues across the globe, are working to build a better understanding of how air pollution, green infrastructure and human health are connected. It will take a true collaborative effort to overcome the air pollution and climate crisis, but together it is achievable."
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030100053.htm
Bacteria and fungi show a precise daily rhythm in tropical air
Scientists from the Singapore Centre for Environmental Life Sciences Engineering (SCELSE) at Nanyang Technological University, Singapore (NTU Singapore) have found that the air in the tropics is teeming with a rich and diverse range of at least 725 different microorganisms.
The NTU scientists used a new sampling and DNA sequencing protocol of microbial communities in Singapore's air. They found that the composition of the microbial community in the tropical air changes predictably, with bacteria dominating in the day and fungi at night.

This day-night pattern and the diversity of airborne microorganisms in the near-surface atmosphere was previously unknown, as the atmosphere is technologically challenging to study, making it the most underexplored ecosystem on a planetary scale, compared to land and sea ecosystems.

The new knowledge on tropical bioaerosols -- tiny airborne particles comprising fungi, bacteria and plant material -- is reported this week in the

In the research paper led by NTU genomics professor Stephan Schuster, the team reported that tropical air had a microbial diversity with similar complexity to other well-studied ecosystems, such as seawater, soil, and the human gut.

These bioaerosol communities (airborne particles of biological origin) have an unexpectedly high number of bacterial and fungal species, and they follow a diel cycle (a 24-hour day and night cycle) which scientists believe is driven by environmental conditions such as humidity, rain, solar irradiance and carbon dioxide levels.

They also demonstrate a greater variation in composition within a day than between days or even months. This robust community structure maintains long-term predictability, despite variations in airflow and monsoonal winds across the tropical seasons.

Professor Schuster, who is a research director at SCELSE, said having the ability to analyse bioaerosols provides insights into many aspects of urban life that affect our daily living and human well-being. It also provides an understanding of the interactions between atmospheric, terrestrial and aquatic planetary ecosystems, which is particularly valuable during climate change.

"Our current study suggests that temperature is also a global driver of microbial community dynamics for the atmosphere near the surface. Such understanding is fundamental to ensuring ecosystem sustainability in urban settings and beyond," Prof Schuster added.

"Locally, understanding bioaerosol dynamics will also inform the way we are managing our surrounding atmosphere, such as the effect of indoor air quality, mechanical ventilation and pollution on the air microbiomes that surround us. Further, our study provides important and much-needed insights into bioaerosols in a tropical setting, given that existing studies have exclusively involved temperate climates."

On average, humans breathe in 11 cubic metres of air daily (11 thousand litres), which could contain some 50 thousand organism cells in the tropics during the day time, but 30 to 100 times that amount at night.

The scientists say they are beginning to understand the effect of air microbial communities on the environment and human health, with the most immediate impact on patients with respiratory illnesses. Further, damage to agricultural crops could be avoided long term, as many of the detected organisms are plant pathogens or wood rotting fungi.

The findings have led the research team to initiate an ongoing interdisciplinary research project between SCELSE and NTU's Lee Kong Chian School of Medicine in the area of respiratory health.

Together with clinical experts led by respiratory physician Assistant Professor Sanjay Chotirmall, Provost's Chair in Molecular Medicine, the joint team is assessing how bioaerosols affect patients with bronchitis, chronic obstructive pulmonary disease and severe asthma.

The research team set up a series of air sample collectors on a rooftop at the NTU campus, which collected samples every two hours over the course of five days (180 samples per time series). This same experiment was repeated every three months for 13 months, which included Singapore's two monsoon seasons.

Airborne biomass was collected by pumping air through an electrostatic filter retaining particles in the 0.5 to 10 micrometre range (about 50 times to 10 times smaller than the average thickness of a human hair, respectively).
Previously, air samples had to be collected over weeks or months in order to amass sufficient material for analysis.

Through the new ultra-low biomass DNA sequencing protocol developed by the NTU research team, much smaller volumes of air could be analysed, enabling for the first time the study of hourly changes in the air microbiome composition.

All bioaerosols were classified according to their DNA similarities and differences, which allowed for species-level identification of airborne organisms after comparison with existing DNA sequence databases.

The researchers surpassed the level of identification of bioaerosols previously achieved by scientists, without the need to amplify DNA or use long-term sampling periods that do not provide sufficiently detailed information at short time intervals.

Previous studies by other research groups have reported bioaerosol communities based on either cultivation or gene amplification, both of which incur substantial biases and aggregate samples over time, without achieving the temporal resolution needed to observe the daily cycling of airborne microorganisms.

Team member Dr Elena Gusareva, a postdoctoral research fellow at NTU, said the unprecedented scale and depth of analysis and classification allowed the team to identify more than 1,200 bacterial and fungal species that make up a changing pattern of microbial community composition.

Ongoing research at NTU is now looking at using the same method to analyse bioaerosols globally at other sites, which could yield similar trends in terms of following the diel cycle.
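The article's figures (11 cubic metres of air breathed per day, roughly 50,000 organism cells in that volume by day, and 30 to 100 times that at night) allow a quick back-of-envelope calculation of airborne cell concentrations. A small sketch using only those reported numbers:

```python
AIR_PER_DAY_LITRES = 11_000   # ~11 cubic metres breathed per day
DAY_CELLS = 50_000            # organism cells in that volume, daytime (tropics)

# Implied daytime concentration per litre of air
cells_per_litre_day = DAY_CELLS / AIR_PER_DAY_LITRES

# Reported night-time range: 30 to 100 times the daytime amount
night_low, night_high = DAY_CELLS * 30, DAY_CELLS * 100

print(round(cells_per_litre_day, 2))  # ~4.55 cells per litre by day
print(night_low, night_high)          # 1500000 5000000 cells inhaled at night
```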
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030073340.htm
Harmful emissions from traffic, trucks, SUVs
Almost one third of Canadians live near a major road -- and this means they go about their everyday lives exposed to a complex mixture of vehicle air pollutants.
A new national study led by University of Toronto Engineering researchers reveals that emissions from nearby traffic can greatly increase concentrations of key air pollutants, with highly polluting trucks making a major contribution. Canada's cold winters can also increase emissions while particle emissions from brakes and tires are on the rise.

The report, released today, is the culmination of a two-year study monitoring traffic emissions in Toronto and Vancouver -- the two Canadian cities with the highest percentage of residents living near major roads.

"There's a whole 'soup' of pollutants within traffic emissions," says Professor Greg Evans, who led the study in collaboration with Environment and Climate Change Canada, the Ontario Ministry of the Environment, Conservation and Parks, and Metro Vancouver.

Evans says that this soup of pollutants includes nitrogen oxides, ultrafine particles, black carbon, metals, carbon monoxide and carbon dioxide. Exposure to these emissions has been associated with a wide range of health issues, including asthma, cancer and cardiovascular mortality.

"The areas of concern we identified raise important questions about the health of Canadians living near major roadways," says Evans.

The national report's findings complement a parallel report on air quality in the Vancouver region that will soon be released by Metro Vancouver. Both reports underscore the need to assess and enact new measures to mitigate exposure to air pollutants.

Busy roads are detracting from air quality nearby, especially during morning rush hour. The researchers measured concentrations of ultrafine particles -- the smallest airborne particles emitted by vehicles -- and found that average levels of ultrafine particles near highways were four times higher than sites far removed from traffic.

"These particles are less than 100 nanometres in size, much smaller than red blood cells. They can travel and trans-locate around the body," explains Evans. "We don't know yet what the health impacts of these particles are but we do know that near roads, they are a good indicator of exposure to traffic pollution."

The concentrations of most traffic pollutants varied by factors of two to five across the cities.

The report highlights the dangers of highly polluting diesel trucks, which represent a minority of the total trucks on roads and highways, but emit diesel exhaust at disproportionately high levels.

"If there's a high proportion of trucks, people who spend a lot of time near these roadways -- drivers, workers, residents -- are being more exposed to diesel exhaust, which is a recognized human carcinogen," says Evans.

Though there's currently no standard for public exposure to diesel exhaust in Canada, black carbon, more commonly called soot, is used to monitor exposure in workplaces. Based on black carbon, the concentrations of diesel exhaust beside the major roads exceeded the guidelines proposed in the Netherlands for workers, implying that they are too high for the public.

"If these highly polluting diesel trucks were repaired, retrofitted, removed or relocated, it would make a significant difference," says Evans. "You can't move your nearby schools or homes, but we can do something about these highly-polluting trucks that are a small proportion of the truck traffic, and yet causing a lot of the trouble."

Air quality is not just a concern during summer months: winter weather brings an increase in near-road concentrations of nitrogen oxides and ultrafine particles. The researchers' data suggest emission treatment systems on diesel vehicles become less effective under colder temperatures. "The systems appear to not be well designed for cold weather," says Evans. "It's concerning when you consider most of Canada faces cold temperatures and long months of winter; Toronto and Vancouver are nowhere near the coldest parts of Canada."

Wind conditions also affect pollutant levels: the researchers found that concentrations were up to six times higher when monitoring the downwind side of a major road.

As brake pads on cars and trucks are worn down, the materials they're made of turn to dust -- and that dust goes straight into the air.

"These non-tailpipe emissions, from brakes, tires and the road itself, are increasing and we believe that this is because our cars are getting larger and heavier," says Cheol Jeong, Senior Research Associate in Evans' lab, whose analysis revealed the growing issue with non-tailpipe emissions. "People are buying more trucks and SUVs than small cars and that trend has been growing in recent years. The heavier it is, the more energy it takes to stop, and the more brake dust gets emitted," he adds.

The report concludes with recommendations geared at all levels of government. Evans hopes the report will lead to establishing a nation-wide road-pollution research network that can advise policymakers, engage companies and the public, and lead to standards and laws that will ultimately protect the health of Canadians.

"We'd like to see this report, and future studies, help launch new monitoring stations across Canada so that all Canadians can get a better picture of the implications of our transportation choices and how these influence what we're breathing in," he says. "Our transportation will be changing very quickly in the coming decade and we'll need ongoing monitoring to help us stay on a path towards increased sustainability."

The findings of this report and its recommendations will be discussed at a national meeting in Toronto on November 4.
Pollution
2019
October 30, 2019
https://www.sciencedaily.com/releases/2019/10/191030073326.htm
Prenatal air pollution exposure linked to infants' decreased heart rate response to stress
A mother's exposure to particulate air pollution during pregnancy is associated with reduced cardiac response to stress in six-month-old infants, according to Mount Sinai research published in
Variability in how the heart rate responds to stressful experiences is essential for maintaining optimal functioning of the cardiovascular, respiratory, and digestive systems and also is central to emotional well-being and resilience to stress over one's lifetime. Decreased heart rate variability, as observed in this study, is a known risk factor for mental and physical health problems in later life. Air pollution's negative effect on heart rate variability has previously been found to lead to medical and psychological conditions such as heart disease, asthma, allergies, and mood or behavioral disorders in studies of older children, adolescents, and adults.

Mount Sinai researchers studied 237 Boston-based mothers and their infants and used satellite data and air pollution monitors to determine the level of particulate air pollution the mothers were exposed to during pregnancy. The air pollution levels in this study were similar to levels experienced by the general U.S. population.

By studying the babies' heart rate and respiration at age six months, researchers found that the higher the level of the mother's exposure to air pollution in pregnancy, the less variability in the infant's heart rate in response to a stress challenge.

"These findings, in combination with increasing worldwide exposure to particulate air pollution, highlight the importance of examining early-life exposure to air pollution in relation to negative medical, developmental, and psychological outcomes," said senior author Rosalind Wright, MD, MPH, Dean for Translational Biomedical Research, and Professor of Pediatrics, Environmental Medicine and Public Health, and Medicine (Pulmonary, Critical Care and Sleep Medicine), at the Icahn School of Medicine at Mount Sinai. "A critical step in identifying children at risk for costly chronic disorders is identifying exposures that lead to early vulnerability."

"Identifying exposures that disrupt key processes such as heart rate response will lead to prevention strategies early in life when they can have the greatest impact. Specifically, these findings support individual-level and policy-level action to reduce exposure to particulate air pollution during pregnancy," said the study's first author, Whitney Cowell, PhD, a postdoctoral fellow in Environmental Medicine and Public Health at the Icahn School of Medicine.
Pollution
2019
October 29, 2019
https://www.sciencedaily.com/releases/2019/10/191029103311.htm
Living in a noisy area increases the risk of suffering a more serious stroke
The high levels of environmental noise we are subjected to in large cities can increase both the severity and consequences of an ischaemic stroke. More precisely, researchers from the Hospital del Mar Medical Research Institute (IMIM) and doctors from Hospital del Mar, together with researchers from the Barcelona Institute for Global Health (ISGlobal), CIBER in Epidemiology and Public Health (CIBERESP), and Brown University, in the United States, put the increased risk at 30% for people living in noisier areas. In contrast, living close to green areas brings down this risk by up to 25%. This is the first time that these factors have been analysed in relation to stroke severity. The study has been published in the journal
The researchers looked at the influence of noise levels, air pollution (particularly suspended particles smaller than 2.5 microns; PM2.5), and exposure to green areas on nearly 3,000 ischaemic stroke patients treated at Hospital del Mar between 2005 and 2014. To do this, they used data from the Cartographic Institute of Catalonia, as well as models to analyse atmospheric pollutant levels, the noise map of Barcelona, and satellite images to define areas with vegetation. Also taken into account was the socioeconomic level of the place the patients lived.

Dr. Rosa María Vivanco, from the IMIM's Neurovascular Research Group and first author of the study, points out that the study gives us initial insight into how noise levels and exposure to green spaces influence the severity of ischaemic stroke. "We have observed a gradient: the more green spaces, the less serious the stroke. And the more noise, the more serious it is. This suggests that factors other than those traditionally associated with stroke may play an independent role in the condition," she explains.

At the same time, Dr. Xavier Basagaña, one of the authors of the study and a researcher at ISGlobal, a centre supported by "la Caixa," stresses that "exposure to green spaces can benefit human health through various mechanisms. For example, it can reduce stress, encourage social interaction, and increase levels of physical activity." However, in this study no link was seen with atmospheric pollution. The researchers warn that one of the limitations of the work was the lack of variability in pollutant concentrations to which the study population is exposed. This made it difficult to draw conclusions, and they point out that more studies are needed in this field.

"Previous studies have demonstrated that living in places with high levels of air pollution or noise, or with fewer green areas, exposes the population to a higher risk of suffering an ischaemic stroke. This work broadens our knowledge in this field, showing that the place where we live affects not only the risk of suffering a stroke, but also its severity if it occurs," explains Dr Gregory A. Wellenius, from the Epidemiology Department at Brown University and final author of the study. In this sense, the results indicate that patients living in noisier areas presented more severe strokes on arrival at hospital.

The researchers have analysed the effects of stroke on neurological deficits, such as speech impairment and mobility, using the NIHSS (National Institute of Health Stroke Scale). "The severity of a stroke depends on various factors, including the extent of the brain injury, the specific area of brain affected, the subtype of stroke, the existence of associated risk factors (diabetes, atrial fibrillation, atherosclerotic load), and so on. The fact that we have demonstrated, in addition to all these factors, that environmental aspects like green spaces and urban noise levels affect the severity of a stroke and therefore our health, shows that this information must be taken into account by political and health planners," emphasises Dr. Jaume Roquer, head of the Neurology Service at Hospital del Mar, coordinator of the IMIM's Neurovascular Research Group, and one of the main authors of the work.

The researchers did not aim to determine which noise levels lead to increased risk, but rather to detect a gradient by comparing patients living in noisier areas with those living in quieter areas. Indeed, the World Health Organisation (WHO) recommends traffic noise limits of a maximum of 53 decibels during the day and 45 decibels at night. "The average noise level to which patients have been exposed, as well as the general population of the study area, requires reflection, as it is considerably above the WHO recommendations," points out Carla Avellaneda, an IMIM researcher and author of the work.
The same researchers have already revealed that high levels of air pollution from diesel engines increase the risk of suffering atherothrombotic stroke by 20%.

In Spain, stroke is the leading cause of death in women and the third leading cause in men, and is estimated to affect 1 in 6 people throughout their lives (in 2012, it caused the death of 6.7 million people around the world, according to WHO data). In Catalonia there are 13,000 cases and 3,800 deaths from stroke each year.

The two main types of stroke are haemorrhagic and ischaemic. Ischaemic stroke is due to the obstruction of a blood vessel in the brain and accounts for 80-85% of all cases. This lack of blood flow in the affected area of the brain can lead to permanent damage. The risk of having a stroke is closely related to factors including age, smoking, high blood pressure, diabetes, obesity, a sedentary lifestyle and, as recently demonstrated, other factors like air pollution.
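The article cites the WHO traffic-noise recommendations of at most 53 decibels by day and 45 at night. A trivial helper sketch for flagging addresses above those limits (the function name and the sample levels are hypothetical, chosen only for illustration):

```python
WHO_DAY_LIMIT_DB = 53    # WHO recommended maximum, daytime traffic noise
WHO_NIGHT_LIMIT_DB = 45  # WHO recommended maximum, night-time traffic noise

def exceeds_who_traffic_noise(day_db, night_db):
    """Return True if either the day or night level exceeds WHO guidance."""
    return day_db > WHO_DAY_LIMIT_DB or night_db > WHO_NIGHT_LIMIT_DB

# Hypothetical measured levels at two residential addresses
print(exceeds_who_traffic_noise(65, 58))  # True: well above both limits
print(exceeds_who_traffic_noise(50, 42))  # False: within guidance
```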
Pollution
2019
October 29, 2019
https://www.sciencedaily.com/releases/2019/10/191029080721.htm
Where to install renewable energy in US to achieve greatest benefits
A new Harvard study shows that to achieve the biggest improvements in public health and the greatest benefits from renewable energy, wind turbines should be installed in the Upper Midwest and solar power should be installed in the Great Lakes and Mid-Atlantic regions. When adjusting for energy produced, the benefits ranged from $28 per MWh of energy produced from wind in California, to $113 per MWh of wind in the Upper Midwest and for utility-scale solar in the Great Lakes and Mid-Atlantic. The study in
The researchers developed a model of the 10 regions of the U.S. electrical grid. Using the social cost of carbon -- which assigns a dollar value to the negative consequences of climate change -- they calculated the benefits of carbon dioxide reduction for each region and energy type. Health benefits come from air quality improvements that reduce premature deaths and climate benefits come from reduced impacts of droughts, extreme weather events, sea-level rise, displacement of refugees, disruptions to farming, and climate-related diseases.

"Our results provide a strong argument for installing more renewable energy to reduce the health impacts of climate change, and the health burden of air pollution. By tackling the root causes of climate change, we can address our nation's most pressing health problems at the same time," said Jonathan Buonocore, the lead author and a research associate at Harvard C-CHANGE. "This tool can help state and national policymakers design better climate plans by understanding where to build wind and solar, while also helping private groups, like utilities, renewable energy developers, and even investors, decide where to deploy their resources to maximize the gains from renewable energy."

The study, funded by the Harvard University Climate Change Solutions Fund, shows that renewable energy is a cost-effective method to reduce carbon dioxide emissions and that the health benefits are an important component of assessing the full benefits of these projects. In many cases, the health and climate benefits are greater than the financial costs of installing wind or solar. For people living in the Upper Midwest, the climate and health benefits of renewable energy are about four times higher than in California. This is a reflection of where dirty energy, like coal, is produced, and the relationship between energy generation, air pollution, and populations living downwind from it.
As a result, the benefits are much higher from deploying renewable energy in places like the Great Lakes and Upper Midwest, where it tends to displace coal, than in California, where it tends to displace gas.

"To ensure that climate policies are cost-effective, the location where renewables are built is much more important than the specific technology," said Drew Michanowicz, a study author and a research fellow at Harvard C-CHANGE. "If you want to get the biggest bang for your buck in terms of the health and climate benefits of renewables, investing in the Upper Midwest and Great Lakes regions will keep populations downwind healthier while also taking important steps to decarbonize."

Fossil fuels used for electricity generate about a third of the greenhouse gas emissions that contribute greatly to climate change and negative health impacts. They are also a major source of air pollutants such as sulfur dioxide, nitrogen oxides (NOx), and fine particulate matter that lead to breathing problems, lung damage and increased premature deaths. Those who are most at risk are also the most vulnerable populations, including children, seniors, and people with heart and lung diseases. To reduce the worst impacts of the climate crisis, the Intergovernmental Panel on Climate Change says the world must rapidly decrease fossil fuel use, cutting human-caused carbon dioxide emissions by 45% by 2030.
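The study's per-MWh figures ($28/MWh for wind in California versus $113/MWh for wind in the Upper Midwest) can be scaled to a whole project by multiplying by annual generation. A sketch with a hypothetical 100 MW wind farm at a 35% capacity factor (the plant size and capacity factor are assumptions for illustration; only the $/MWh rates come from the article):

```python
HOURS_PER_YEAR = 8760

def annual_benefit_usd(capacity_mw, capacity_factor, benefit_per_mwh):
    """Yearly health-plus-climate benefit: MWh generated times $/MWh."""
    mwh_per_year = capacity_mw * capacity_factor * HOURS_PER_YEAR
    return mwh_per_year * benefit_per_mwh

upper_midwest = annual_benefit_usd(100, 0.35, 113)  # ~$34.6M per year
california = annual_benefit_usd(100, 0.35, 28)      # ~$8.6M per year
```

The roughly fourfold gap between the two totals is exactly the regional difference the study highlights, since the generation term is identical in both calls.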
Pollution
2019
October 25, 2019
https://www.sciencedaily.com/releases/2019/10/191025170810.htm
Study casts doubt on carbon capture
One proposed method for reducing carbon dioxide (CO
"All sorts of scenarios have been developed under the assumption that carbon capture actually reduces substantial amounts of carbon. However, this research finds that it reduces only a small fraction of carbon emissions, and it usually increases air pollution," said Jacobson, who is a professor of civil and environmental engineering. "Even if you have 100 percent capture from the capture equipment, it is still worse, from a social cost perspective, than replacing a coal or gas plant with a wind farm because carbon capture never reduces air pollution and always has a capture equipment cost. Wind replacing fossil fuels always reduces air pollution and never has a capture equipment cost."Jacobson, who is also a senior fellow at the Stanford Woods Institute for the Environment, examined public data from a coal with carbon capture electric power plant and a plant that removes carbon from the air directly. In both cases, electricity to run the carbon capture came from natural gas. He calculated the net COCommon estimates of carbon capture technologies -- which only look at the carbon captured from energy production at a fossil fuel plant itself and not upstream emissions -- say carbon capture can remediate 85-90 percent of carbon emissions. Once Jacobson calculated all the emissions associated with these plants that could contribute to global warming, he converted them to the equivalent amount of carbon dioxide in order to compare his data with the standard estimate. He found that in both cases the equipment captured the equivalent of only 10-11 percent of the emissions they produced, averaged over 20 years.This research also looked at the social cost of carbon capture -- including air pollution, potential health problems, economic costs and overall contributions to climate change -- and concluded that those are always similar to or higher than operating a fossil fuel plant without carbon capture and higher than not capturing carbon from the air at all. 
Even when the capture equipment is powered by renewable electricity, Jacobson concluded that it is always better to use the renewable electricity instead to replace coal or natural gas electricity or to do nothing, from a social cost perspective.Given this analysis, Jacobson argued that the best solution is to instead focus on renewable options, such as wind or solar, replacing fossil fuels.This research is based on data from two real carbon capture plants, which both run on natural gas. The first is a coal plant with carbon capture equipment. The second plant is not attached to any energy-producing counterpart. Instead, it pulls existing carbon dioxide from the air using a chemical process.Jacobson examined several scenarios to determine the actual and possible efficiencies of these two kinds of plants, including what would happen if the carbon capture technologies were run with renewable electricity rather than natural gas, and if the same amount of renewable electricity required to run the equipment were instead used to replace coal plant electricity.While the standard estimate for the efficiency of carbon capture technologies is 85-90 percent, neither of these plants met that expectation. Even without accounting for upstream emissions, the equipment associated with the coal plant was only 55.4 percent efficient over 6 months, on average. With the upstream emissions included, Jacobson found that, on average over 20 years, the equipment captured only 10-11 percent of the total carbon dioxide equivalent emissions that it and the coal plant contributed. 
The air capture plant was also only 10-11 percent efficient, on average over 20 years, once Jacobson took into consideration its upstream emissions and the uncaptured and upstream emissions that came from operating the plant on natural gas.

Due to the high energy needs of carbon capture equipment, Jacobson concluded that the social cost of coal with carbon capture powered by natural gas was about 24 percent higher, over 20 years, than the coal without carbon capture. If the natural gas at that same plant were replaced with wind power, the social cost would still exceed that of doing nothing. Only when wind replaced coal itself did social costs decrease.

For both types of plants this suggests that, even if carbon capture equipment is able to capture 100 percent of the carbon it is designed to offset, the cost of manufacturing and running the equipment plus the cost of the air pollution it continues to allow or increases makes it less efficient than using those same resources to create renewable energy plants replacing coal or gas directly.

"Not only does carbon capture hardly work at existing plants, but there's no way it can actually improve to be better than replacing coal or gas with wind or solar directly," said Jacobson. "The latter will always be better, no matter what, in terms of the social cost. You can't just ignore health costs or climate costs."

This study did not consider what happens to carbon dioxide after it is captured, but Jacobson suggests that most applications today, which are for industrial use, result in additional leakage of carbon dioxide back into the air.

People propose that carbon capture could be useful in the future, even after we have stopped burning fossil fuels, to lower atmospheric carbon levels.
Even assuming these technologies run on renewables, Jacobson maintains that the smarter investment is in options that are currently disconnected from the fossil fuel industry, such as reforestation -- a natural version of air capture -- and other forms of climate change solutions focused on eliminating other sources of emissions and pollution. These include reducing biomass burning, and reducing halogen, nitrous oxide and methane emissions.

"There is a lot of reliance on carbon capture in theoretical modeling, and by focusing on that as even a possibility, that diverts resources away from real solutions," said Jacobson. "It gives people hope that you can keep fossil fuel power plants alive. It delays action. In fact, carbon capture and direct air capture are always opportunity costs."
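The accounting Jacobson applies -- counting upstream emissions and the fuel burned to power the capture equipment, not just the stack CO2 the equipment touches -- can be sketched with a toy calculation. All numbers below are illustrative assumptions, not figures from the study:

```python
# Toy illustration of why a high nominal capture rate can shrink once
# upstream and equipment emissions are counted. All quantities are in
# arbitrary CO2-equivalent units and are invented for illustration.

def effective_capture_fraction(nominal_capture, stack_co2, upstream_co2, equipment_co2):
    """Fraction of total CO2-equivalent emissions actually removed.

    nominal_capture: fraction of stack CO2 the equipment captures
    stack_co2:       CO2 emitted at the plant's stack
    upstream_co2:    emissions from mining and transporting the fuel
    equipment_co2:   emissions from the gas powering the capture unit
    """
    captured = nominal_capture * stack_co2
    total = stack_co2 + upstream_co2 + equipment_co2
    return captured / total

# Even with 90% nominal capture, large uncaptured terms drag the
# effective fraction far below the headline number.
frac = effective_capture_fraction(0.9, stack_co2=100, upstream_co2=500, equipment_co2=200)
print(round(frac, 2))  # 0.11
```

The point of the sketch is structural rather than numerical: the nominal rate applies only to one term in the numerator, while every uncounted emission source inflates the denominator.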
Pollution
2019
October 24, 2019
https://www.sciencedaily.com/releases/2019/10/191024075009.htm
Catalysis that neutralizes air-polluting NOx from power plant emissions
We've known for decades that catalysts speed up the reaction that reduces harmful industrial emissions. And now, we know exactly how they do it.
A recent paper by Israel Wachs, the G. Whitney Snyder Professor of Chemical and Biomolecular Engineering at Lehigh University's P.C. Rossin College of Engineering and Applied Science, describes the mechanism, and was the inside back cover story of the September 2, 2019, issue of the journal.

Power plants are a major source of toxic emissions associated with climate change. When fossil fuels like coal and natural gas are burned, they produce dangerous contaminants, in particular, a group of harmful gases called nitrogen oxides (or NOx). The combustion process to generate energy requires very high temperatures that cause molecular nitrogen (N2) and oxygen in the air to react, forming NOx.

Back in the 1970s, the Japanese developed a technology to control NOx emissions. "It's a beautiful chemical reaction, converting something very harmful to something very benign," says Wachs, who directs Lehigh's Operando Molecular Spectroscopy and Catalysis Research Lab.

NOx emissions are now strongly regulated, and one common abatement strategy is the selective catalytic reduction (SCR) of nitrogen oxides by ammonia. Catalysts both speed up the SCR reaction and control the reaction products (such as forming harmless N2).

One SCR catalyst widely used by power plants is titania-supported vanadium oxide. The catalyst consists of vanadium oxide and tungsten oxide dispersed on the surface of a titania (TiO2) support.

The research community knew from experience that tungsten oxide thermally stabilizes the titania support, which is vital as these catalysts can spend years at high temperatures during operation. They also knew that adding tungsten oxide makes the vanadium oxide much more active, which is also important as the more active a catalyst is the less of it you need. But why did tungsten oxide have such an effect on the reactivity of vanadium oxide?

Three theories have dominated over the years, says Wachs. One claimed that tungsten oxide has an acidic character that enhances the chemical reaction.
The second said tungsten oxide was somehow sharing electrons with vanadium oxide, and the third stated that the tungsten oxide was changing the structure of the vanadium oxide.

Wachs and his collaborators used a cutting-edge instrument called a High Field (HF) Nuclear Magnetic Resonance (NMR) spectrometer in conjunction with reaction studies to test each theory. "There are only a few of these HF NMR spectrometers in the world, and their magnetic fields are so sensitive that it gives all the subtle molecular details of what was going on in the material," he says.

Those molecular details appear as signals that Wachs and his team then interpreted using theoretical calculations (Density Functional Theory).

"It turns out that the amount of vanadium oxide is very low in the catalyst making the vanadium oxide present as isolated species, or monomers," says Wachs. "When you add the tungsten oxide, vanadium oxide changes from monomers to oligomers or polymers, so now all the vanadium oxide is connected as a chain or an island on the titania support. We performed independent studies and found that these oligomers of vanadium oxide are 10 times more active than in the isolated vanadium oxide sites. So the tungsten oxide really does change the structure of vanadium oxide, from a less active form to a highly active form."

This fundamental understanding of how the catalyst works will help guide future designs of improved SCR catalysts, says Wachs, who was recently elected as a Fellow of the National Academy of Inventors and has been recognized internationally for his innovative contributions to fundamental catalysis that have been applied in the manufacture of chemicals and control of air pollution. "Now that we know what's going on, it won't be trial and error in terms of making it better since we take a scientific approach to the catalyst design."

And that will have huge ramifications for industry and air pollution control, he says. "A more active catalyst has significant benefits.
First of all, these systems are huge, almost the size of a small house, and a lot of these plants were built before this technology was mandated, so space at the plants is limited. So if you have a more active catalyst, you need a smaller footprint. They're also expensive, so if the catalyst is more active, you don't need as much. And finally, since we also think they'll last longer, it will limit the amount of time a plant has to shut down to install a new catalyst."

But for Wachs, the effect on public health is the most meaningful -- and gratifying -- result. "Easily, 40,000 to 50,000 people in the United States die annually due to complications from poor air quality. So catalysis, and the research around it, has tremendous societal impact. It's very satisfying when you're able to solve a problem that's been around for 40 years, that will improve the technology, and address these health issues."
Pollution
2019
October 21, 2019
https://www.sciencedaily.com/releases/2019/10/191021144740.htm
Lead pollution from Native Americans attributed to crushing galena for glitter paint
Native American use of galena at Kincaid Mounds, a settlement occupied during the Mississippian period (1150 to 1450 CE), resulted in more than 1.5 metric tons of lead pollution deposited in a small lake near the Ohio River. New data from IUPUI researchers found the lead did not originate from this Southern Illinois settlement, but instead was brought to the site from other Midwest sources.
Archaeologists have long known that Native Americans used galena for thousands of years, but this is the first time its use has been linked with clear indications of pollution, and how much pollution. The findings were published October 15.

IUPUI researchers extracted sediment cores from Avery Lake to look at Native American impacts on the landscape and detect signals on how they used the land and its resources. Avery Lake is a floodplain lake adjacent to the Kincaid Mounds archaeological site along the Ohio River near Brookport, Illinois. The mound-building civilizations thrived in the Midwest and at the Avery Lake site until approximately 1450 CE.

Sediment core samples indicated heightened lead levels during the pre-Columbian period (before European contact in 1492 CE), which was especially surprising to the researchers. To understand where this lead pollution came from, Bird and Wilson measured the abundances of different lead isotopes, which can be used to trace the lead to the ore deposits from which it was derived.

"For the Mississippian period, there was a huge change in lead isotopes in the sediments -- bigger than you would usually see," Bird said.

The sediment isotopes indicated the lead was not native to the Kincaid Mounds site, but instead came from galena deposits located in southeastern and central Missouri as well as the Upper Mississippi Valley. Galena, a lead sulfide mineral that is silvery and sparkles, was abundant in these regions and widely traded across the eastern U.S. It was often ground into a powder and mixed with other natural materials to use as paint for objects, buildings and personal adornment.

"We knew that Mississippians used this lead mineral called galena, which is a silvery mineral that they would crush to essentially create ancient glitter, but we didn't expect to see the levels of pollution that we do from its use,"
Bird said.

Researchers believe the lead pollution in Avery Lake is the result of galena powder blowing or being washed into the lake as the people living at Kincaid Mounds crushed and used the mineral.

"Lake cores are a natural archive of human utilization of the landscape, and in this case, the human inputs into those lakes," Wilson said. "This is a nice way to develop proxy measures for human activity without making any impacts on the archaeological record."

Researchers found lead concentration peaks in the Avery Lake data during each of the three significant occupations at the Kincaid Mounds site -- Baumer (300 BCE to 300 CE), Mississippian (1150 to 1450 CE) and the modern era (since 1800 CE).

For the Baumer settlement, the lead pollution was from a single, local source -- likely a result of people using fire to clear the landscape and for daily activities. Mississippian pollution was the result of galena processing and use; modern pollution is a result of leaded fuel and coal.

"Part of the significance of this research is increasing the understanding of how we utilize our natural resources, understanding what the environmental impacts can be and increasing the awareness of our pre-Columbian history," Bird said.

Study data gives archeologists like Wilson more perspective on the day-to-day lives of the Mississippian people. "This gives us some baseline understanding of the amounts of galena that were being exchanged by native peoples during the late pre-Columbian period," Wilson said.

During this period, mound-building was used to elevate and support structures that point to a social and religious movement that started in the largest pre-Columbian city, Cahokia, located in what today is East St. Louis, Illinois. This movement spread and radiated out to people in other settlements and regions, including Kincaid Mounds.

"People were living in new social and religious contexts that we had not previously seen in the Midwest and into the southeast," Wilson said.
"It's a re-envisioning of society that happens during this early Mississippian period. Along with that is the exchange of these raw materials and enhanced trade networks, where you're seeing larger volumes of material like stone and galena traveling up and down these major river valleys -- like the Ohio, where our study site was."
Pollution
2019
October 18, 2019
https://www.sciencedaily.com/releases/2019/10/191018181058.htm
Land management practices to reduce nitrogen load may be affected by climate changes
Nitrogen from agricultural production is a major cause of pollution in the Mississippi River Basin and contributes to large dead zones in the Gulf of Mexico.
Illinois and other Midwestern states have set goals to reduce nitrogen load through strategies that include different land management practices. A new study from University of Illinois researchers examines how those practices hold up under future climates.

"The goal was to test whether those land management practices are useful in reducing nitrogen load in the water under different climate scenarios," says Congyu Hou, doctoral student in the Department of Agricultural and Biological Engineering and lead author on the study.

Using field data on soil properties, land use, land management practices, and weather patterns from the Willow Creek Watershed in Oklahoma, the researchers estimated surface runoff and nitrogen load at the field-scale level. Their model included 12 land management practices and 32 climate projections for the years 2020 to 2070, yielding a total of 384 scenarios.

"While modeling is commonly used to estimate nitrogen load, most projections use average climate data. This study expands on that practice by including all possible climate predictions," says Maria Chu, assistant professor of agricultural and biological engineering in the College of Agricultural, Consumer and Environmental Sciences. Chu is Hou's advisor and a co-author of the study.

Hou says the study serves as a test case for the type of modeling simulation the researchers employed. "We use a field-edge model that divides the watershed into smaller cells, for a total of 5,911 cells. The model calculates the edge of the field, focusing on how much nitrogen is transported away from the field."

The research also tested a new index of how to measure the soil's ability to hold nitrogen. They found that even without a reduction in application, nitrogen load can be reduced simply by redistributing land uses, Hou explains. For example, their findings indicated that crop rotation practices do help reduce nitrogen loss.
They also found that splitting fertilizer application between spring and fall generally is more beneficial, and that fertilizer application rate is the most critical factor in determining both the amount and probability of high nitrogen load.

Hou cautions that the study's findings are preliminary. "With modeling, you're still far away from practical application," he says. "The model is the first step, then comes a small-scale field test, then small-scale tests at different locations. If those work, you can eventually expand to a larger region."

In addition to expanding the scale of modeling, Chu's research group looks at other aspects of nitrogen pollution. For example, while this study focuses on surface runoff, another study looks at nitrogen load in the ground water.

"We're also looking at the different components of nitrogen, that is, nitrate, nitrite and ammonia, to determine which one is most critical to address," Chu says.
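The scenario grid described above -- every land management practice crossed with every climate projection -- is a plain Cartesian product, which is why the counts multiply to 384. A sketch with invented placeholder names:

```python
# Enumerate the full scenario grid: 12 land management practices crossed
# with 32 climate projections, as in the study. The names themselves are
# invented placeholders, not the study's actual practice/projection labels.
from itertools import product

practices = [f"practice_{i}" for i in range(1, 13)]    # 12 land management practices
projections = [f"climate_{j}" for j in range(1, 33)]   # 32 climate projections (2020-2070)

scenarios = list(product(practices, projections))
print(len(scenarios))  # 384 scenarios, matching the study's total
```

Each (practice, projection) pair would then be fed to the field-edge model once, so the grid doubles as a work list for batch simulation.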
Pollution
2019
October 17, 2019
https://www.sciencedaily.com/releases/2019/10/191017141114.htm
Stranded whales detected from space
A new technique for analysing satellite images may help scientists detect and count stranded whales from space. Researchers tested a new detection method using Very High Resolution (VHR) satellite images from Maxar Technologies of the biggest mass stranding of baleen whales yet recorded. It is hoped that in the future the technique will lead to real-time information as stranding events happen.
The study, published this week in the journal PLoS ONE by scientists from British Antarctic Survey and four Chilean research institutes, could revolutionise how stranded whales, that are dead in the water or beached, are detected in remote places.

In 2015, over 340 whales, most of them sei whales, were involved in a mass-stranding in a remote region of Chilean Patagonia. The stranding was not discovered for several weeks owing to the remoteness of the region. Aerial and boat surveys assessed the extent of the mortality several months after discovery.

The researchers studied satellite images covering thousands of kilometres of coastline, which provided an early insight into the extent of the mortality. They could identify the shape, size and colour of the whales, especially after several weeks when the animals turned pink and orange as they decomposed. A greater number of whales were counted in the images captured soon after the stranding event than from the local surveys.

Many coastal nations have mammal stranding networks, recognising that this is a crucial means to monitor the health of the local environment, especially for providing first notice of potential marine contamination and harmful algal blooms.

Author and whale biologist Dr Jennifer Jackson at British Antarctic Survey says: "The causes of marine mammal strandings are poorly understood and therefore information gathered helps understand how these events may be influenced by overall health, diet, environmental pollution, regional oceanography, social structures and climate change.

"As this new technology develops, we hope it will become a useful tool for obtaining real-time information. This will allow local authorities to intervene earlier and possibly help with conservation efforts."

Lead author, remote sensing specialist Dr Peter Fretwell at British Antarctic Survey says: "This is an exciting development in monitoring whales from space.
Now that we have a higher resolution 'window' on our planet, satellite imagery may be a fast and cost-effective alternative to aerial surveys, allowing us to assess the extent of mass whale stranding events, especially in remote and inaccessible areas."
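The colour cue the researchers describe -- carcasses turning pink and orange as they decompose -- lends itself to a crude pixel-level screen as a first pass before shape and size checks. A toy sketch with invented RGB thresholds and sample pixels, not values from actual Maxar imagery:

```python
# Toy pixel screen reflecting the cue described above: decomposing whales
# read as pink/orange against sea and shore. Thresholds and pixels are
# invented for illustration; a real pipeline would also use shape and size.

def looks_like_decomposing_whale(r, g, b):
    """Crude pink/orange test on one RGB pixel (0-255 channels)."""
    return r > 180 and g < 160 and b < 160 and r - b > 60

pixels = [(220, 120, 110),   # pinkish -- flagged
          (90, 110, 130),    # sea blue -- ignored
          (230, 150, 100)]   # orange -- flagged

flagged = [p for p in pixels if looks_like_decomposing_whale(*p)]
print(len(flagged))  # 2
```

Flagged pixels would then be clustered and filtered by the expected whale dimensions, since rocks and sand can share the same hues.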
Pollution
2019
October 17, 2019
https://www.sciencedaily.com/releases/2019/10/191017131436.htm
Industrial melanism linked to same gene in 3 moth species
The rise of dark forms of many species of moth in heavily polluted areas of 19th and 20th century Britain, known as industrial melanism, was a highly visible response to environmental change. But did the different species rely on the same gene to adapt?
New research by the University of Liverpool reveals that three species of moth, including the famous peppered moth, indeed did. Interestingly, however, the mutations that gave rise to dark forms of the 'pale brindled beauty' and 'scalloped hazel' moths likely occurred much earlier than that of the 'peppered moth' in the early 1800s and may predate the industrial revolution by hundreds of years.

Principal Investigator Professor Ilik Saccheri explains: "Although many people have heard about industrial melanism in the British peppered moth, it is not widely appreciated that dark forms increased in over 100 other species of moths during the period of industrial pollution. This raises the question of whether they relied on the same or similar genetic mechanism to achieve this colour change. This was not a foregone conclusion because melanism in insects may be influenced by many different genes."

In the new study, genetic linkage mapping using parent-offspring families shows that the mutation for melanism occurs in the same genetic region (containing the cortex gene) in all three species.
Further analysis of wild samples, however, suggests that the genetic origins of melanism in the pale brindled beauty and scalloped hazel are much older than that in the British peppered moth.

Professor Saccheri said: "Our findings imply that these dark forms can persist at low frequencies in non-polluted environments and lend further support to the idea that adaptive evolution makes repeated use of the same genetic and developmental machinery across deep evolutionary time."

The researchers chose to focus on these three species because they were studied by University of Liverpool and Manchester scientists in the 1970s, in the days before this type of molecular genetic analysis was possible.

Professor Saccheri adds: "Back then, the researchers appreciated that they were dealing with similar phenotypes and selection pressures but wouldn't have considered that they might have been studying the same gene in the different species. I feel gratified that we have been able to collaborate through time with researchers working on these topics over 40 years ago."
Pollution
2019
October 16, 2019
https://www.sciencedaily.com/releases/2019/10/191016131216.htm
Tiny particles lead to brighter clouds in the tropics
When clouds loft tropical air masses higher in the atmosphere, that air can carry up gases that form into tiny particles, starting a process that may end up brightening lower-level clouds, according to a CIRES-led study published today.
"Understanding how these particles form and contribute to cloud properties in the tropics will help us better represent clouds in climate models and improve those models," said Christina Williamson, a CIRES scientist working in NOAA's Chemical Sciences Division and the paper's lead author.The research team mapped out how these particles form using measurements from one of the largest and longest airborne studies of the atmosphere, a field campaign that spanned the Arctic to the Antarctic over a three-year period.Williamson and her colleagues, from CIRES, CU Boulder, NOAA and other institutions, including CIRES scientist Jose Jimenez, took global measurements of aerosol particles as part of the NASA Atmospheric Tomography Mission, or ATom. During ATom, a fully instrumented NASA DC-8 aircraft flew four pole-to-pole deployments -- each one consisting of many flights over a 26-day period -- over the Pacific and Atlantic Oceans in every season. The plane flew from near sea level to an altitude of about 12 km, continuously measuring greenhouse gases, other trace gases and aerosols."ATom is a flying chemistry lab," Williamson said. "Our instruments allowed us to characterize aerosol particles and their distribution in the atmosphere." The researchers found that gases transported to high altitudes by deep, convective clouds in the tropics formed large numbers of very small aerosol particles, a process called gas-to-particle conversion.Outside the clouds, the air descended toward the surface and those particles grew as gases condensed onto some particles and others stuck together to form fewer, bigger particles. Eventually, some of the particles became large enough to influence cloud properties in the lower troposphere.In their study, the researchers showed that these particles brightened clouds in the tropics. 
"That's important since brighter clouds reflect more energy from the sun back to space," Williamson said.The team observed this particle formation in the tropics over both the Pacific and Atlantic Oceans, and their models suggest a global-scale band of new particle formation covering about 40 percent of the Earth's surface.In places with cleaner air where fewer particles exist from other sources, the effect of aerosol particle formation on clouds is larger. "And we measured in more remote, cleaner locations during the ATom field campaign," Williamson said.Exactly how aerosols and clouds affect radiation is a big source of uncertainty in climate models. "We want to properly represent clouds in climate models," said Williamson. "Observations like the ones in this study will help us better constrain aerosols and clouds in our models and can direct model improvements."
Pollution
2019
October 16, 2019
https://www.sciencedaily.com/releases/2019/10/191016094915.htm
Are we underestimating the benefits of investing in renewable energy?
As policymakers seek to reduce carbon dioxide and other pollutants through increases in renewable energy, improving energy efficiency or electrifying transportation, a key question arises: Which interventions provide the largest benefits to avoid the negative health effects of air pollution?
To address this question, it is important to understand how much pollution is released at different times by power plants on the electricity system. The amount of pollution that is produced per unit of energy on the electric grid is measured by what is known as emissions intensity. Traditionally, policymakers and energy modelers have used annual average emissions intensities -- averaged across all power plants over an entire year -- to estimate the emissions avoided by a power system intervention. However, doing so misses the fact that many interventions affect only a certain set of power plants, and that these effects may vary by time of day or year.

By using marginal emissions that are collected on an hourly basis and account for location, policymakers may be able to glean important information that would otherwise be missed, according to new research. This approach may help decision-makers more clearly understand the impacts of different policy and investment options.

Scientists tested the difference between average and marginal emissions by analyzing electricity from PJM, the largest wholesale electricity market in the United States. PJM produces about 800 terawatt hours of electricity per year -- enough to power a fifth of the U.S. -- and contributes roughly 20 percent of U.S. power sector emissions.

Their findings show that for certain interventions, using PJM average emissions intensities can underestimate the damages avoided by almost 50 percent compared to marginal intensities that account for which power plants are actually affected.
In other words, using average values may cause a policymaker to think an intervention is only half as effective as it really is, potentially compromising its implementation despite its large benefits.

While officials have historically used average emissions intensities to calculate pollution in the electricity sector, in certain cases this has led to incorrectly estimating impacts compared with a marginal emissions approach, said study co-author Inês Azevedo, an associate professor in the Department of Energy Resources Engineering at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

The researchers also highlight the importance of using up-to-date emissions intensity estimates. In their paper, they show that using estimates only one year out of date can overestimate the damages avoided by 25 to 35 percent.

"The electric grid is changing rapidly, but emissions intensity data is often released with a large lag," said Priya Donti, a PhD student at Carnegie Mellon University and study co-author. "Our study demonstrates the importance of frequently updating this data."

"Boston University used some of our prior work on marginal emissions to decide where to procure renewable energy, by modeling the extent to which different procurements would reduce emissions," said Azevedo, referring to the institution's Climate Action Plan. "It's interesting to think about whether other decision-makers could start using the same sorts of tools to inform climate action plans at the city and state levels."

These kinds of tools can help decision-makers understand the impacts of different policy and investment options, Donti said. "We want to help them design interventions that provide the biggest benefits when it comes to tackling climate change and improving human health."

Azevedo is also a senior fellow at the Stanford Woods Institute for the Environment. J. Zico Kolter of Carnegie Mellon University is a co-author on the study.
The research was supported by the Center for Climate and Energy Decision Making (CEDM) in an agreement between Carnegie Mellon University and the National Science Foundation. The study was also funded by the National Science Foundation Graduate Research Fellowship Program and the Department of Energy Computational Science Graduate Fellowship.
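The gap between average and marginal accounting can be seen in a toy calculation: if an intervention displaces generation mostly in hours when a dirty plant is on the margin, a flat annual-average intensity undercounts the avoided emissions. All numbers below are illustrative, not PJM data:

```python
# Compare avoided-emissions estimates using a flat annual-average
# intensity vs. hour-by-hour marginal intensities. Numbers are invented.

def avoided_emissions(saved_mwh_by_hour, intensity_by_hour):
    """Sum over hours: MWh displaced that hour x kg CO2/MWh that hour."""
    return sum(mwh * kg for mwh, kg in zip(saved_mwh_by_hour, intensity_by_hour))

# A hypothetical intervention (say, solar) displaces energy only in hours
# when an expensive, dirty plant happens to be on the margin.
saved = [0, 0, 10, 10]            # MWh displaced in four sample hours
marginal = [300, 300, 900, 900]   # kg CO2/MWh of the plant actually backed off
average = [600] * 4               # flat annual-average intensity

print(avoided_emissions(saved, marginal))  # 18000 kg
print(avoided_emissions(saved, average))   # 12000 kg -- a 33% underestimate
```

The same arithmetic flips the other way for an intervention concentrated in clean-margin hours, which is why the study argues the intensities must be both marginal and time-resolved.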
Pollution
2019
October 15, 2019
https://www.sciencedaily.com/releases/2019/10/191015171556.htm
Airborne chemicals instantly identified using new technology
Scientists at Nanyang Technological University, Singapore (NTU Singapore) have developed a device that can identify a wide range of airborne gases and chemicals instantly.
The new prototype device is portable and suitable for rapid deployment by agencies to identify airborne hazards, such as from tiny gas molecules like sulphur dioxide. It can also identify larger compound molecules such as benzene, known to be harmful to human health. It can provide real-time monitoring of air quality such as during haze outbreaks, and assist in the detection of gas leaks and industrial air pollution.

The new technology, developed by a research team led by Associate Professor Ling Xing Yi at the School of Physical and Mathematical Sciences, was reported last month in a science journal.

Current methods of identifying gases in the air use a laboratory technique called Gas Chromatography -- Mass Spectrometry (GC-MS), which is reliable but requires tedious sample collection and takes between a few hours and a few days to obtain results from air samples. Emergency scenarios require a fast and ongoing analysis of potential air contamination, such as following a natural disaster, chemical spill or illegal dumping of toxic waste, so that emergency responders can take appropriate action.

The new device uses a small patch made of a special porous and metallic nanomaterial to first trap gas molecules. When a laser is shone on it from a few metres away, the light interacts with the gas molecules, causing light of a lower energy to be emitted. When analysed, it gives a spectroscopic readout in the format of a graph chart. The spectroscopic readout acts like a "chemical fingerprint" corresponding to various chemicals present on the patch. The whole process takes about 10 seconds to complete. These chemical fingerprints from the sample are referenced against a digital library of fingerprints to quickly determine what chemicals have been detected.

Known as Raman spectroscopy, this is a long-established technique for identifying chemical substances.
Typically, it has been used only on solid and liquid samples, since gaseous chemicals are too dilute for the laser and detector to pick up. To overcome this limitation, Assoc Prof Ling and her PhD student Mr Phan Quang Gia Chuong developed a special nanostructure made from a highly porous synthetic material known as a metal-organic framework, which actively absorbs and traps molecules from the air into a 'cage'. This nanostructure also contains metal nanoparticles, which boost the intensity of the light surrounding the molecules. The result is a million-fold enhancement in the Raman spectroscopy signals, which allows for the identification of the trapped molecules.

Assoc Prof Ling said the genesis of the invention was sparked by an incident in Singapore, where there were reports of a strong gas-like odour over certain parts of the island in 2017. The cause was only determined a few days later, and was traced to volatile organic compounds released by factories outside of Singapore. Together with her husband, Dr Phang In-Yee, a project leader and scientist at the Institute of Materials Research and Engineering (IMRE), they conceptualised the idea of identifying gases instantly from a distance.

"Our device can work remotely, so the operation of the laser camera and analysis of chemicals can be done safely at a distance. This is especially useful when it is not known if the gases are hazardous to human health," explains Assoc Prof Ling, Head of the Division of Chemistry & Biological Chemistry at NTU.

The laser was tested in experiments to work up to 10 metres away and can be engineered to reach further distances.
Another possible method is to use the chip to capture gases, which are subsequently analysed with a laser. In experiments, the team showed that the device can identify airborne molecules such as polyaromatic hydrocarbons (PAH), including naphthalene and derivatives of benzene, a family of colourless industrial air pollutants known to be highly carcinogenic. It can detect PAHs at parts-per-billion (ppb) concentrations in the atmosphere, as well as continuously monitor the concentrations of gases such as carbon dioxide (CO2), which could be a useful application in many industrial settings. The laser used in the device has a power of 50 milliwatts, more than seven times lower than that used in other applications of Raman spectroscopy. This makes the system safer to operate and more energy efficient. Through NTUitive, NTU's innovation and enterprise company, the team has filed for a patent and is now commercialising the technology for use in pollution monitoring, chemical disaster response, as well as other industrial applications.
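The fingerprint-matching step described above can be sketched in a few lines: a measured readout is compared against a small digital library of reference spectra and the best match is returned. All spectra, peak positions and library entries below are invented for illustration; the device's actual library and matching algorithm are not described in the article.

```python
import math
import random

# Toy wavenumber axis, 500-2000 cm^-1 in 5 cm^-1 steps.
WAVENUMBERS = [500 + 5 * i for i in range(301)]

def synthetic_spectrum(peaks):
    """Toy spectrum: a sum of Gaussian peaks given as (center, height) pairs."""
    return [sum(h * math.exp(-((w - c) ** 2) / (2 * 20.0 ** 2)) for c, h in peaks)
            for w in WAVENUMBERS]

# Invented reference "fingerprints" standing in for the digital library.
LIBRARY = {
    "naphthalene": synthetic_spectrum([(760, 1.0), (1380, 0.8)]),
    "benzene derivative": synthetic_spectrum([(990, 1.0), (1600, 0.6)]),
    "sulphur dioxide": synthetic_spectrum([(1150, 1.0)]),
}

def cosine(a, b):
    """Cosine similarity between two spectra."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(sample):
    """Return the library compound whose reference spectrum matches best."""
    return max(LIBRARY, key=lambda name: cosine(sample, LIBRARY[name]))

# A noisy measurement of the naphthalene-like spectrum still matches its reference.
random.seed(0)
noisy = [v + random.gauss(0, 0.05)
         for v in synthetic_spectrum([(760, 1.0), (1380, 0.8)])]
print(identify(noisy))  # best match: naphthalene, by construction
```

Cosine similarity is one simple matching rule; real spectral-search software typically adds baseline correction and peak alignment first.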
Pollution
2,019
October 15, 2019
https://www.sciencedaily.com/releases/2019/10/191015131505.htm
Did early mammals turn to night life to protect their sperm?
Humans are diurnal -- we are active in the day and sleep at night. But diurnalism is by far the exception rather than the rule in mammals. About 250-230 million years ago, the mammalian ancestors, called the therapsids, became exclusively nocturnal, and stayed so until the demise of the dinosaurs 66 million years ago. All of our mammal ancestors lived in the dark for about 200 million years, and the majority still do to this day. Humans are, essentially, nocturnal animals that have reverted to living in the sun.
There has been much speculation about why the therapsids became nocturnal. The traditional argument is that the archosauriforms and the dinosaurs became ecologically dominant during the Triassic. To avoid being eaten by the multitude of new carnivorous reptiles, the archaic mammals, it is argued, fled into the dark, where reptiles had yet to dominate. In a new paper, "Obligatory nocturnalism in Triassic archaic mammals: Preservation of sperm quality?", Barry G. Lovegrove proposes a simple, new, alternative hypothesis based purely upon physiological constraints. The therapsids were rapidly becoming endothermic (producing more of their own internal heat through metabolism) to fuel new energy demands and to defend the consequent elevated body temperature, especially as they got smaller during the Triassic. And herein lies a problem. As their body temperature started to approach that of the air, around 93.2°F (34°C), they would not have been able to offload excess heat generated by being active during the day without losing vast amounts of body water through evaporative cooling, such as by sweating or panting. Archaic mammals did not have scrotums, in which the testes are kept cool, and without a way to keep sperm cool, quality would have declined through the accumulation of free radicals as temperatures rose during sperm maturation. By becoming active during the cooler nights, now that they were "warm-blooded" and had the newly acquired thermoregulatory toolkit to cope with the cooler night air, these mammals were able to preserve sperm quality.
Pollution
2,019
October 10, 2019
https://www.sciencedaily.com/releases/2019/10/191010142109.htm
New tool visualizes nature's benefits worldwide
Nature supports people in critical ways, often at a highly local level. Wild bees buzz through farms, pollinating vegetables as they go. Nearby, wetlands might remove chemicals from the farm's runoff, protecting a community drinking water source. In communities all around the world, nature's contributions are constantly flowing to people. Scientists have mapped these contributions at local levels for years, but a new Stanford-led study puts these local analyses on an interactive global map that emphasizes nature's declining ability to protect people from water pollution, coastal storms and under-pollinated crops.
The study was published October 10. "Thanks to rapid recent technological improvements, we're now able to map these local contributions from nature in a detailed, accessible way at a global scale," said Becky Chaplin-Kramer, lead scientist at Stanford's Natural Capital Project and lead author on the study. "By applying this new technology, we are able to clearly see where people are receiving benefits from nature around the world. We also see where people are most likely to lose vital benefits as ecosystems degrade." Chaplin-Kramer and her fellow researchers set out to understand and map where nature contributes the most to people and how many people may be affected by future changes in climate, fossil fuel use and development. They focused on three fundamental benefits that nature provides to people: water quality regulation, protection from coastal hazards and crop pollination. Using open-source software developed by the Natural Capital Project -- a global partnership focused on natural capital research and application -- they modeled how the flow of these benefits might change in the future. Across the board, they found that where people's needs for nature are greatest, nature's ability to meet those needs is declining. By 2050, their projections show that up to 5 billion people could be at higher risk of water pollution, coastal storms and under-pollinated crops. Critically, the team's research shows that these impacts are inequitably distributed. In all scenarios, developing countries shoulder a disproportionate share of the burden. "Our analyses suggest that the current environmental governance at local, regional and international levels is failing to encourage the most vulnerable regions to invest in ecosystems," said study coauthor Unai Pascual, co-chair of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) Values Assessment.
"If we continue on this trajectory, ecosystems will be unable to provide natural insurance in the face of climate change-induced impacts on food, water and infrastructure." People in Africa and South Asia are the most disadvantaged in the face of diminishing contributions from nature. More than half the population in these regions is facing higher-than-average "benefit gaps," the tangible elements -- like vulnerability to coastal storms, water pollution or crop losses -- that people feel when contributions from nature stop flowing. The impacts aren't isolated to certain countries, though. Under climate change, projected sea-level rise increases risk to coastal communities everywhere and may impact over 500 million people worldwide by 2050. The researchers' technological application of integrated, high-resolution data provides an opportunity to incorporate nature into worldwide policy decisions. The vehicle for this is an online viewer that presents complex global data in a digestible way -- high-resolution, interactive maps. The team is looking to policymakers, development banks and other global influencers to use this information to drive sustainable development and conservation. "Determining when and where nature is most important is critical to understanding how best to enhance people's livelihoods and wellbeing," said study coauthor Stephen Polasky, a professor of environmental economics at the University of Minnesota and coordinating lead author of the recent IPBES Global Assessment. Looking forward, the researchers are expanding their analysis to model other ecosystem benefits. They're also looking to more deeply understand where nature's contributions could best support the planet's most vulnerable populations. "We hope that this work will advance the integration of nature's contributions to people into decision making and further galvanize global action," said Chaplin-Kramer.
"We're equipped with the information we need to avert the worst scenarios our models project and move toward an equitable, sustainable future. Now is the time to wield it." This research was funded by the Marianne and Marcus Wallenberg Foundation and by gifts to the Natural Capital Project from P. and H. Bing and R. and V. Sant.
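The "benefit gap" idea above can be made concrete with a toy calculation: for each place, the gap is the number of people whose need for a service (say, clean water) nature no longer meets. All regions and numbers below are invented for illustration; they are not values from the study or its models.

```python
# Hypothetical grid cells: (region, people needing the service,
# fraction of that need currently met by nature).
cells = [
    ("coastal delta",   2_000_000, 0.35),
    ("upland farms",      400_000, 0.90),
    ("megacity fringe", 5_000_000, 0.20),
]

def benefit_gap(population, fraction_met):
    """People left exposed when nature meets only part of the need."""
    return round(population * (1.0 - fraction_met))

for region, pop, met in cells:
    print(f"{region}: {benefit_gap(pop, met):,} people in the gap")
```

Summing such gaps over a global grid, and projecting how `fraction_met` changes under future scenarios, is the kind of aggregation behind a "5 billion people at higher risk" figure.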
Pollution
2,019
October 9, 2019
https://www.sciencedaily.com/releases/2019/10/191009131742.htm
Electronic solid could reduce carbon emissions in fridges and air conditioners
A promising replacement for the toxic and flammable greenhouse gases that are used in most refrigerators and air conditioners has been identified by researchers from the University of Cambridge.
The device is based on layers of a material composed of oxygen and three metallic elements, known as PST, and it displays the largest electrocaloric effects -- changes in temperature when an electric field is applied -- yet observed in a body large enough for cooling applications. The results have been reported in a journal article. "When facing a challenge as big as climate change and reducing carbon emissions to net zero, we tend to focus on how we generate energy -- and rightly so -- but it's critical that we're also looking at the consumption of energy," said co-author Dr Xavier Moya from Cambridge's Department of Materials Science & Metallurgy. Refrigeration and air conditioning currently consume a fifth of all energy produced worldwide, and as global temperatures continue to rise, demand is only going to keep going up. In addition, the gases currently used in the vast majority of refrigerators and air conditioners are toxic, highly flammable greenhouse gases that only add to the problem of global warming when they leak into the air. Researchers have been trying to improve cooling technology by replacing these gases with solid magnetic materials, such as gadolinium. However, the performance of prototype devices has been limited to date, as the thermal changes are driven by the limited magnetic fields available from permanent magnets. In research published earlier this year, the same Cambridge-led team identified an inexpensive, widely available solid that might compete with conventional coolants when put under pressure. However, developing that material for cooling applications will involve a lot of new design work, which the Cambridge team is pursuing. In the current work, the thermal changes are instead driven by voltage.
"Using voltage instead of pressure to drive cooling is simpler from an engineering standpoint, and allows existing design principles to be repurposed without the need for magnets," said Moya. The Cambridge researchers, working with colleagues in Costa Rica and Japan, used high-quality layers of PST with metallic electrodes sandwiched in between. This made the PST able to withstand much larger voltages, and produce much better cooling over a much larger range of temperatures. "Replacing the heart of prototype magnetic fridges with a material that performs better, and does not require permanent magnets, could represent a game-changer for those currently trying to improve cooling technology," said co-author Professor Neil Mathur. In future, the team will use high-resolution microscopy to examine the PST microstructure, and optimise it further in order to apply even larger voltages.
Pollution
2,019
October 7, 2019
https://www.sciencedaily.com/releases/2019/10/191007164351.htm
Dual approach needed to save sinking cities and bleaching corals
Local conservation can boost the climate resilience of coastal ecosystems, species and cities and buy them precious time in their fight against sea-level rise, ocean acidification and warming temperatures, a new paper by scientists at Duke University and Fudan University suggests.
The peer-reviewed paper was published Oct. 7. "The answer is, you need both," said Brian R. Silliman, Rachel Carson Associate Professor of Marine Conservation Biology at Duke's Nicholas School of the Environment. "Our analysis of local conservation efforts shows that in all but extreme situations, these interventions significantly buffer the impacts of climate change and can buy our sinking cities and bleaching corals time to adapt until the beneficial impacts of global emissions reductions kick in," Silliman said. In the Florida Keys, for instance, local efforts to cull populations of coral-eating snails reduced thermal bleaching on corals by 40% compared to bleaching on non-treated corals during a three-month spike in water temperatures in 2014. It also promoted faster recoveries. In Chesapeake Bay, seagrass beds that were wiped out by warming waters and heavy pollution are now reappearing, largely due to local efforts to cut nutrient pollution flowing into the bay. In Shanghai, where the weight of thousands of high-rises and the depletion of groundwater aquifers causes the ground to sink further each year as the sea rises, efforts to pump water back into wells and place tighter controls on groundwater use have slowed the subsidence and given city officials time to enact other protective measures. "A common thread in many of the most successful scenarios we reviewed is that the local actions increased climate resilience by removing or reducing human-related stresses that were compounding climate stresses and increasing a species' or site's vulnerability," said Qiang He, professor of coastal ecology at Fudan University in Shanghai, China, who co-authored the new paper with Silliman. Understanding how human and climate stresses interact is critical for predicting when, where or if local interventions are likely to be effective and what their limits might be, so that efforts can be targeted accordingly and adaptive measures begun while there is still time, He said. This is especially true in areas with high human population densities. One of the most telling examples is the tragedy now facing the Indonesian capital of Jakarta, where massive groundwater withdrawal and the weight of 10 million people and their buildings are causing the city to sink by roughly 25 centimeters a year, He noted. By 2050, 95% of the city will be submerged as a result of the compounding effects of sea-level rise and human actions. "Because Jakarta -- unlike Shanghai -- did not decrease its human impacts through local conservation or adaptation, the government's only recourse now is to move the entire city to a new, higher location on the island of Borneo," Silliman said. "Unfortunately, other massive migrations of cities inland will become more and more common in coming decades, but we can reduce their number and how quickly they have to happen if we take dual action now on the local and global fronts," Silliman said. "For certain, this is no time to scale back on local conservation. We need to increase our investment at all scales."
Pollution
2,019
October 7, 2019
https://www.sciencedaily.com/releases/2019/10/191007113327.htm
China is on track to meet its emissions goals for 2020
Polluting emissions from Chinese thermal power plants declined significantly between 2014 and 2017, according to research involving UCL.
The reductions are important in helping to control China's national emissions, which could lead to an improvement in air quality and considerable health benefits. A team of experts from the UK and China analysed emissions from coal, oil, natural gas and biomass power plants, with a focus on coal-fired power plants as the major contributors to ambient air pollution. The study, published today in Nature Energy, analysed data from 2014, when China introduced the ambitious Ultra-Low Emissions (ULE) Standards Policy for renovating coal-fired power stations to limit air pollutant emissions, to 2017. The team found that between 2014 and 2017, China's annual power plant emissions of sulphur dioxide, nitrogen oxide and particulate matter dropped by 65%, 60% and 72% respectively, from 2.21, 3.11 and 0.52 million tonnes in 2014 to 0.77, 1.26 and 0.14 million tonnes in 2017, in compliance with ULE standards. This suggests China is on track to further reduce its emissions if all thermal power plants meet the ULE standards by 2020. These standards aim to limit sulphur dioxide, nitrogen oxide and particulate matter emissions to 35, 50 and 10 milligrams per cubic metre respectively. UCL co-author Dr Zhifu Mi (UCL Bartlett School of Construction and Project Management) said: "This is encouraging news for China, as well as other countries wishing to reduce their power emissions. Thermal power plants combusting coal, oil, natural gas and biomass are one of the major contributors to global air pollution." "These significant emission reductions demonstrate the technical and economic feasibility of controlling emissions from power plants to reach ultra-low levels, which is an important step towards reducing the number of deaths attributable to air pollution." The study shows that previous methods of estimating Chinese power emissions overestimated numbers by at least 18%, and in some cases up to 92%.
This is because previous research was carried out using ex-ante studies -- estimations made ahead of the introduction of ULE standards -- which looked at how the standards might affect emissions based on assumptions about changes in emission concentrations. The research is the first to use data on emission concentrations collected by China's Continuous Emission Monitoring Systems network (CEMS), which covers 96-98% of Chinese thermal power capacity. The team constructed a nationwide emissions dataset -- the China Emissions Accounts for Power Plants (CEAP) -- based on data collected from the CEMS network between 2014 and 2017. CEAP is now publicly available and continues to present, organise and analyse data from the network. This gives accurate results for each power plant as well as real-time results at an hourly frequency. "With coal being the most widely-used fuel in China, cutting the number of thermal power plants within a short timeframe would be challenging. The results of this research are encouraging in demonstrating that coal can be used in a much cleaner way to generate electricity," concluded Dr Mi. The research was carried out in collaboration with Beijing University of Chemical Technology, Beihang University, the Ministry of Environmental Protection (Beijing), HeBei University of Science and Technology, the University of Science and Technology (Beijing), Xi'an Jiaotong University, the University of Cambridge and the Chinese Academy of Sciences (Beijing). It was funded by grants from the National Science Foundation for Outstanding Young Scholars, the National Programme for Support of Top Notch Young Professionals and the National Research Programme for Key Issues in Air Pollution Control.
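The headline percentages can be cross-checked against the reported tonnages. The reproduced figures come out at roughly 65%, 59% and 73% rather than exactly 65%, 60% and 72%, simply because the published tonnages are themselves rounded to two decimal places.

```python
# Reported annual power-plant emissions, in million tonnes (2014 vs 2017).
emissions = {
    "sulphur dioxide": (2.21, 0.77),
    "nitrogen oxide": (3.11, 1.26),
    "particulate matter": (0.52, 0.14),
}

for pollutant, (y2014, y2017) in emissions.items():
    drop = 100 * (y2014 - y2017) / y2014
    print(f"{pollutant}: down {drop:.1f}% from 2014 to 2017")
```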
Pollution
2,019
October 7, 2019
https://www.sciencedaily.com/releases/2019/10/191007100408.htm
Particles emitted by consumer 3D printers could hurt indoor air quality
Consumer-grade 3D printers have grown in popularity in recent years, but the particles emitted from such devices can negatively impact indoor air quality and have the potential to harm respiratory health, according to a study from researchers at the Georgia Institute of Technology and UL Chemical Safety.
The study was published September 12. "All of these tests, which were done at high doses, showed that there is a toxic response to the particles from various types of filaments used by these 3D printers," said Rodney Weber, a professor in Georgia Tech's School of Earth & Atmospheric Sciences, who led the research. The study was part of a multi-year research project aimed at characterizing particle emissions from the printers in a controlled environment and identifying measures that could be taken by both 3D printer manufacturers and users to reduce the potential for harm. While earlier studies had focused on quantifying the particles being emitted, this time the researchers looked more closely at the chemical composition of the particles and their potential for toxicity. 3D printers typically work by melting plastic filaments and then depositing the melt layer upon layer to form an object. Heating the plastic to melt it releases volatile compounds, some of which form ultrafine particles that are emitted into the air near the printer and the object. In earlier research, the team found that generally the hotter the temperature required to melt the filament, the more emissions were produced. As a result, acrylonitrile butadiene styrene (ABS) plastic filaments, which require a higher temperature to melt, produced more emissions than filaments made of polylactic acid (PLA), which melt at a lower temperature. To test the impact of the emissions on live cells, the researchers partnered with the Weizmann Institute of Science in Israel, which exposed human respiratory cells and rat immune system cells to concentrations of the particles from the printers. They found that both ABS and PLA particles negatively impacted cell viability, with the latter prompting a more toxic response.
But these tests did not reflect actual exposures. The researchers also performed a chemical analysis of the particles to gain further insight into their toxicity and to allow comparisons with the toxicity of particles found in outdoor urban environments. The analysis -- called oxidative potential -- simulates the toxic response that an aerosol would have on cellular organisms. "The toxicity tests showed that PLA particles were more toxic than the ABS particles on a per-particle comparison, but because the printers emitted so much more of the ABS -- it's the ABS emissions that end up being more of the concern," Weber said. "Taken together, these tests indicate that exposure to these filament particles could over time be as toxic as the air in an urban environment polluted with vehicular or other emissions." Another finding of the study was that the ABS particles emitted from the 3D printers had chemical characteristics different from those of the ABS filament. "When the filament companies manufacture a certain type of filament, they may add small mass percentages of other compounds to achieve certain characteristics, but they mostly do not disclose what those additives are," Weber said. "Because these additives seem to affect the amount of emissions for ABS, and there can be great variability in the type and amount of additives added to ABS, a consumer may buy a certain ABS filament, and it could produce far more emissions than one from a different vendor." The study also looked at which types of indoor environments would be most affected by 3D printer emissions. The researchers estimated that in a commercial building setting such as a school or an office, better ventilation would limit the amount of exposure to the emissions.
However, in a typical residential setting with less effective ventilation, the exposure could be much higher, they reported. "These studies show that particle and chemical emissions from 3D printers can result in unintentional pollutant exposure hazards, and we are pleased to share this research so that steps can be taken to reduce health risks," said Marilyn Black, senior technical advisor for UL. In the meantime, some measures can be taken by operators of 3D printers to lessen their impact on air quality.
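The ventilation point can be illustrated with a standard well-mixed box model, in which the steady-state particle concentration equals the emission rate divided by the product of the air-exchange rate and the room volume. The emission rate, room sizes and air-change rates below are assumed round numbers for illustration, not measurements from the study.

```python
def steady_state_concentration(emission_ug_per_h, room_m3, ach):
    """Steady-state concentration (ug/m^3) in a well-mixed room.

    emission_ug_per_h -- particle mass emitted per hour
    room_m3           -- room volume in cubic metres
    ach               -- air changes per hour (ventilation rate)
    """
    return emission_ug_per_h / (ach * room_m3)

EMISSION = 2000.0  # ug/h, assumed printer emission rate

# Well-ventilated office vs a small, poorly ventilated residential room.
office = steady_state_concentration(EMISSION, room_m3=60.0, ach=4.0)
bedroom = steady_state_concentration(EMISSION, room_m3=30.0, ach=0.5)
print(f"office: {office:.1f} ug/m3, bedroom: {bedroom:.1f} ug/m3")
```

With these assumed numbers the small residential room reaches a concentration sixteen times higher than the ventilated office for the same printer, which is the qualitative point the study makes.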
Pollution
2,019
October 3, 2019
https://www.sciencedaily.com/releases/2019/10/191003114007.htm
Exposure to air pollution increases violent crime rates
Breathing dirty air can make you sick. But according to new research, it can also make you more aggressive.
That's the conclusion from a set of studies recently authored by Colorado State University researchers in economics, atmospheric science and statistics. Together, the team found strong links between short-term exposure to air pollution and aggressive behavior, in the form of aggravated assaults and other violent crimes across the continental United States. The results, derived from daily Federal Bureau of Investigation crime statistics and an eight-year, detailed map of daily U.S. air pollution, will be published in a forthcoming journal edition. The paper's lead author is Jesse Burkhardt, assistant professor in the Department of Agricultural and Resource Economics, who teamed up with fellow economist Jude Bayham in the same department; Ander Wilson in the Department of Statistics; and several air pollution experts in civil engineering and atmospheric science. The CSU researchers cross-analyzed three highly detailed datasets: daily criminal activity from the National Incident-Based Reporting System managed by the FBI; daily, county-level air pollution from 2006-2013 collected by U.S. Environmental Protection Agency monitors; and daily data on wildfire smoke plumes from satellite imagery provided by the National Oceanic and Atmospheric Administration's Hazard Mapping System. Air pollution scientists typically measure rates of pollution through concentrations of ozone, as well as of "PM2.5," or breathable particulate matter 2.5 microns in diameter or smaller, which has documented associations with health effects. Eighty-three percent of crimes considered "violent" by the FBI are categorized as assaults in crime databases.
In the study, the researchers observed whether crimes occurred inside or outside the home; they found that 56 percent of violent crimes and 60 percent of assaults occurred within the home, an indication that many such crimes are tied to domestic violence. The results show that a 10 microgram-per-cubic-meter increase in same-day exposure to PM2.5 is associated with a 1.4% increase in violent crimes, nearly all of which is driven by crimes categorized as assaults. Researchers also found that a 0.01 parts-per-million increase in same-day exposure to ozone is associated with a 0.97% increase in violent crime, or a 1.15% increase in assaults. Changes in these air pollution measures had no statistically significant effect on any other category of crime. "We're talking about crimes that might not even be physical -- you can assault someone verbally," co-author Bayham said. "The story is, when you're exposed to more pollution, you become marginally more aggressive, so those altercations -- some things that may not have escalated -- do escalate." The researchers made no claims about the physiological, mechanistic relationship of how exposure to pollution leads someone to become more aggressive; their results only show a strong correlative relationship between such crimes and levels of air pollution. The researchers were careful to correct for other possible explanations, including weather, heat waves, precipitation, and more general, county-specific confounding factors. The team also published a companion paper. The tool that allowed the team to overlay crime data with pollution data was originally used in collaboration with CSU epidemiologist Sheryl Magzamen to study health effects from air pollution, explained co-author Jeff Pierce, associate professor in the Department of Atmospheric Science and a Monfort Professor.
Pierce, associate professor Emily Fischer and researchers Kate O'Dell and Bonne Ford had previously worked with Magzamen to detail how smoke and particulate matter exposure correlated with outcomes like hospitalizations and asthma inhaler refills. Burkhardt had been wanting to study whether breathing smoke could drive behavioral change when he met atmospheric scientist Pierce. "Several years ago, Fort Collins experienced a fairly severe wildfire season," Burkhardt said. "The smoke was so bad that after a few days, I started to get frustrated, and I wondered if frustration and aggression would show up in aggregate crime data." Pierce recognized that the pollution-concentration product he and colleagues had designed, which provided detailed concentrations of total particulate matter and the fraction from smoke, would be useful for Burkhardt's desired application. "The results are fascinating, and also scary," Pierce said. "When you have more air pollution, this specific type of crime, domestic violent crime in particular, increases quite significantly." The economists calculated that a 10 percent reduction in daily PM2.5 could save $1.1 million in crime costs per year, which they called a "previously overlooked cost associated with pollution." The authors remain interested in the relationships between pollution and cognitive outcomes, Burkhardt said. They are now working with a large online chess platform to determine if increased pollution exposure is correlated with worse chess performance. The results are just one outcome of CSU's philosophy of "cluster hiring" faculty from disparate fields to study interdisciplinary problems. In this case, several of the researchers came to CSU under the Partnership for Air Quality, Climate and Health initiative launched several years ago by the Office of the Vice President for Research.
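The reported associations can be applied as simple multipliers on a baseline crime rate. The baseline count below is invented, and combining the PM2.5 and ozone effects multiplicatively is an assumption of this sketch, not something stated in the paper.

```python
def expected_crimes(baseline, pm25_increase_ugm3=0.0, ozone_increase_ppm=0.0):
    """Scale a baseline daily violent-crime count by the reported associations:
    +1.4% per +10 ug/m3 same-day PM2.5, +0.97% per +0.01 ppm same-day ozone."""
    pm_factor = 1 + 0.014 * (pm25_increase_ugm3 / 10.0)
    o3_factor = 1 + 0.0097 * (ozone_increase_ppm / 0.01)
    return baseline * pm_factor * o3_factor

BASELINE = 100.0  # assumed violent crimes per day in some region (invented)
print(f"{expected_crimes(BASELINE, pm25_increase_ugm3=10.0):.2f}")
print(f"{expected_crimes(BASELINE, ozone_increase_ppm=0.01):.2f}")
```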
Pollution
2,019
October 2, 2019
https://www.sciencedaily.com/releases/2019/10/191002165233.htm
Aspirin may halve air pollution harms
A new study is the first to report evidence that nonsteroidal anti-inflammatory drugs (NSAIDs) like aspirin may lessen the adverse effects of air pollution exposure on lung function. The team of researchers from the Columbia Mailman School of Public Health, Harvard Chan School of Public Health, and Boston University School of Medicine published their findings in the
The researchers analyzed a subset of data collected from a cohort of 2,280 male veterans from the greater Boston area who were given tests to determine their lung function. The average age of participants was 73 years. The researchers examined the relationship between test results, self-reported NSAID use, and ambient particulate matter (PM) and black carbon in the month preceding the test, while accounting for a variety of factors, including the health status of the subject and whether or not he was a smoker. They found that the use of any NSAID nearly halved the effect of PM on lung function, with the association consistent across all four weekly air pollution measurements, from same-day to 28 days prior to the lung function test. Because most of the people in the study cohort who took NSAIDs used aspirin, the researchers say the modifying effect they observed was mainly from aspirin, but add that the effects of non-aspirin NSAIDs are worthy of further exploration. While the mechanism is unknown, the researchers speculate that NSAIDs mitigate inflammation brought about by air pollution. "Our findings suggest that aspirin and other NSAIDs may protect the lungs from short-term spikes in air pollution," says first and corresponding author Xu Gao, PhD, a post-doctoral research scientist in the Department of Environmental Health Sciences at the Columbia Mailman School. "Of course, it is still important to minimize our exposure to air pollution, which is linked to a host of adverse health effects, from cancer to cardiovascular disease." "While environmental policies have made considerable progress toward reducing our overall exposure to air pollution, even in places with low levels of air pollution, short-term spikes are still commonplace," says senior author Andrea Baccarelli, MD, PhD, chair of the Department of Environmental Health Sciences at the Columbia Mailman School.
"For this reason, it is important to identify means to minimize those harms." An earlier study by Baccarelli found that B vitamins may also play a role in reducing the health impact of air pollution. Co-authors include Brent Coull, Xihong Lin, and Joel Schwartz at Harvard; and Pantel Vokonas at the Boston University School of Medicine. The current study was supported by grants from the National Institute of Environmental Health Sciences (ES009089, ES021733, ES025225, ES027747). The VA Normative Aging Study is supported by the Cooperative Studies Program/Epidemiology Research and Information Center of the U.S. Department of Veterans Affairs and is a component of the Massachusetts Veterans Epidemiology Research and Information Center in Boston.
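The effect modification reported above (NSAID use roughly halving the PM effect on lung function) can be sketched as a scaling factor applied to an assumed exposure-response slope. The slope value and units below are invented for illustration; only the halving factor reflects the article's finding.

```python
BETA_PER_UGM3 = -0.008   # assumed lung-function change per ug/m3 PM (invented slope)
NSAID_MODIFIER = 0.5     # NSAID use roughly halves the PM effect (reported association)

def pm_effect(pm_ugm3, uses_nsaid):
    """Estimated lung-function change for a given PM exposure."""
    factor = NSAID_MODIFIER if uses_nsaid else 1.0
    return BETA_PER_UGM3 * pm_ugm3 * factor

# Same 10 ug/m3 PM spike, with and without NSAID use.
print(f"no NSAID: {pm_effect(10.0, uses_nsaid=False):.3f}")
print(f"with NSAID: {pm_effect(10.0, uses_nsaid=True):.3f}")
```

In the actual analysis this kind of modification would appear as an interaction term in a regression model, not a fixed constant.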
Pollution
2019
October 2, 2019
https://www.sciencedaily.com/releases/2019/10/191002131941.htm
Cleaning with bleach could create indoor air pollutants
For generations, people have used chlorine bleach to clean and disinfect their homes. However, researchers have now discovered that bleach fumes, in combination with light and a citrus compound found in many household products, can form airborne particles that might be harmful when inhaled by pets or people. They report their results in ACS'
Bleach cleaning products emit chlorine-containing compounds, such as hypochlorous acid (HOCl) and chlorine gas (Cl2). The researchers added limonene, HOCl and Cl2
Pollution
2019
October 2, 2019
https://www.sciencedaily.com/releases/2019/10/191002121719.htm
Carbon emissions soar as tourism reaches new heights
A researcher at The University of Texas at San Antonio (UTSA) is examining how the flight routes people take to get to tourist destinations impact the amount of pollution in the air in a newly published study he coauthored in the
"This paper provides one of the first efforts to quantify the carbon emissions associated with tourist air travel in the continental United States," explained Neil Debbage, assistant professor of geography and environmental sustainability in UTSA's Department of Political Science and Geography.The researchers wanted to know whether nonstop routes to tourist destinations can mitigate air travel carbon emissions compared to connecting routes through big airline hubs.Using International Civil Aviation Organization data, USTA researchers analyzed carbon emissions for direct and connecting routes between the 10 most-populated metropolitan areas in the northeastern United States (New York, Philadelphia, Boston, etc.) and 13 different tourist destinations located in the Sunbelt and Western regions of the United States (Bexar County, Texas; Los Angeles County, California; Miami-Dade County, Florida; etc.).Some of the key findings:"One potential tactic to mitigate the carbon footprint associated with tourist air travel is to select nonstop routes whenever possible," replied Debbage, who worked on the paper with Keith G. Debbage, professor of geography at the University of North Carolina at Greensboro.The researchers said they hope this paper will help policymakers consider making new initiatives that accelerate technological innovations regarding aircraft fuel usage, jet engines and jet fuel. Additionally, they also emphasized the importance of broader structural shifts such as implementing realistic carbon pricing for air travel.Neil Debbage's research focuses on climate change, natural hazards and resiliency. He utilizes geographic information systems, statistical modeling and numerical weather modeling to better understand the changing climate at various scales.
Pollution
2019
October 2, 2019
https://www.sciencedaily.com/releases/2019/10/191002102752.htm
Preventing future forest diebacks
Bark beetles, heat, drought, storms, and fires have damaged the German forests. Those who go for a walk there often encounter dead spruces and dried beech trees. "The forests are affected in all regions and need quick help," says the website of the German Federal Ministry of Food and Agriculture.
Clear and reforest: this is how the ministry imagines this help. Minister Julia Klöckner plans a large-scale clear-up followed by a reforestation programme. At least 500 million euros are needed for the programme and subsequent maintenance. Clear-up and reforestation is not the right strategy, forest ecologists Simon Thorn, Joerg Mueller and Alexandro Leverkus from Julius-Maximilians-Universität (JMU) Wuerzburg in Bavaria, Germany, write in Germany should therefore reconsider its strategic and financial efforts to create forests resilient to future climate change. Here a radical change is necessary: the scientists suggest not removing dead wood and not conducting reforestation on large scales. For centuries, forestry has followed a clearing and reforestation strategy. The consequences: a steady decline in biological diversity and the extinction of many fungi and insects that depend on dead wood. According to Thorn, large-scale clear-ups following natural disturbances have negative effects on the diversity of insects that depend on deadwood. This collides with the goals of the government's coalition agreement, according to which the dramatic decline of insects should be halted. Instead, public subsidies should be aimed at preserving dead wood created by disturbances. Natural disturbances such as storms, bark beetle outbreaks and drought create canopy gaps, which enable the regrowth of a wide variety of native tree species. According to the scientists, this increases the resistance of a forest to extreme weather events. In contrast, rapid reforestation leads to dense groups of trees of the same age, which are highly susceptible to weather events and pests. Subsidies for forestry should better promote a diverse tree and age structure as well as the presence of canopy gaps. 
This strategy would simultaneously benefit economically important tree species and preserve endangered insects. In the 1980s there was extensive forest damage in Central Europe, mainly caused by air pollution due to industry and traffic. At that time there was talk of "Waldsterben" or "Forest Dieback." The current catchword "Waldsterben 2.0" refers to this period. The addition "2.0" expresses that the current forest damage has other causes this time -- namely climate change.
Pollution
2019
October 2, 2019
https://www.sciencedaily.com/releases/2019/10/191002075933.htm
Managing stormwater and stream restoration projects together
Both stormwater control and stream restoration are proven ways to reduce erosion along water channels. Often, though, each method is managed by a different urban land-management department, each measuring success by different metrics. Efforts are rarely coordinated due to funding and other constraints.
Rod Lammers and his colleagues at the University of Georgia looked at some computerized models to see if coordinating these land management practices with common goals might have a greater positive impact on erosion. The good news? It does. First, let's take a look at why stormwater management systems are necessary. In nature, precipitation falls onto forests, prairies and other soil-based areas. The water is soaked into the soil, down into the water table, and out into water bodies. Eventually, through evaporation, that water gets back into the atmosphere -- until the next precipitation event. In cities, though, pavement, rooftops, and other structures break the water cycle. City managers and engineers develop stormwater management systems to collect and move water in long tunnels, under buildings, and out to waterways. The more impermeable structures and the larger the area, the more complex the system must be. No matter the system, the water must go somewhere. After all, we don't want it in our basement or parking garage. Because this stormwater hasn't been able to take advantage of soils' natural ability to clean water, the water can be filled with sediment and undesirable nutrients. These can take a toll on the stream habitats and harm sensitive ecosystems downstream. In addition, the larger runoff volumes and higher and more frequent peak flows can lead to stream bank erosion. The UGA study only looked at sediments and nutrients coming from the soil eroded in the channels. Lammers and his team looked at newer stormwater management approaches, called green infrastructure. These types of structures attempt to allow more water to soak into the soil like a natural system. "We are essentially trying to 'restore' the city to a more natural water cycle," says Lammers. Each combination of stormwater controls and restoration projects results in its own improvements. 
However, "piecemeal approaches to stormwater management and stream restoration miss synergistic benefits," says Lammers. "They make restoration projects more prone to failure, wasting valuable resources for pollutant reduction."Stormwater management programs often focus on peak flow rates of large, less frequent storms. They also attempt to removed suspended solids, as well and nitrogen and phosphorus.Lammers' team developed computerized models to predict the effects of three different stream restoration scenarios and three different stormwater treatment scenarios. Thus, there were scenarios with a combination of restoration and treatment techniques. Such an "experiment" in the field would take a long time and involve a lot of expense."Computer modeling is a powerful tool. We can test the relative success of different management approaches, over years or even decades," says Lammers. "These results can then be used by agencies to help with their planning. Of course, modeling has its limitations. Monitoring the actual performance of stormwater practices and stream restoration is essential. They also have to adapt management approaches based on observed successes and failures.""Our results suggest that watershed-scale implementation of stormwater controls that reduce runoff volume is essential," says Lammers. "The controls need to address a spectrum of storm sizes. This is a more effective approach for reducing channel erosion than stream restoration. Aggressive, early implementation may have resulted in even less pollution by avoiding erosion early on. Much like investing early in life leads to greater financial returns, early implementation of stormwater controls and restoration can result in greater water quality and channel stability benefits.""Stream restoration can complement effective stormwater treatment to reduce erosion and pollutant loading," says Lammers. "However, these approaches should be coordinated to achieve the best results. 
In addition, stormwater controls have a much greater potential to reduce stream erosion than channel restoration. Cities need to address the root cause of erosion -- the altered urban water cycle. That is more effective than only treating the symptoms by stabilizing the channel itself." Since this study was done in Colorado, future research could be done to apply similar approaches in different climates. Different rainfall patterns might result in different effectiveness of stormwater controls. Also, looking at different restoration strategies, like floodplain reconnection to reduce the velocity and erosive power of floods, would be interesting. Similarly, it would be useful to compare different stormwater control strategies, to see which perform best in different scenarios.
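The scenario comparison the team ran is far more sophisticated than anything shown here, but the basic logic -- simulate channel erosion under combinations of stormwater control and restoration, then compare -- can be sketched with a toy model. Every coefficient below is invented for illustration and is not from the study:

```python
# Toy erosion model: annual channel erosion scales with runoff volume;
# stormwater controls cut runoff, restoration makes the channel harder
# to erode. All numbers are invented for illustration.

def annual_erosion(runoff_reduction, channel_protection,
                   base_erosion_tonnes=100.0):
    runoff_factor = 1.0 - runoff_reduction      # e.g. 0.5 -> half the runoff
    channel_factor = 1.0 - channel_protection   # e.g. 0.3 -> 30% more resistant
    return base_erosion_tonnes * runoff_factor * channel_factor

scenarios = {
    "no action":        annual_erosion(0.0, 0.0),
    "restoration only": annual_erosion(0.0, 0.3),
    "stormwater only":  annual_erosion(0.5, 0.0),
    "coordinated":      annual_erosion(0.5, 0.3),
}
for name, tonnes in scenarios.items():
    print(f"{name:17s} {tonnes:5.1f} t/yr")
```

Even in this crude multiplicative form, the coordinated scenario beats either measure alone, and runoff reduction outperforms channel protection when its effect size is larger -- the qualitative pattern the study reports.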
Pollution
2019
September 30, 2019
https://www.sciencedaily.com/releases/2019/09/190930214511.htm
African child deaths could be prevented by improving environmental quality and reducing population
Children under 5 years of age in Africa are much more likely to die than those in wealthy countries as a direct result of poor health outcomes linked to air pollution, unsafe water, lack of sanitation, an increased family size, and environmental degradation, according to the first continent-wide investigation of its kind.
An international team of researchers led by Flinders University in South Australia, and the University of Western Australia in Perth, have analysed data to break down the correlation between increased child mortality, environmental degradation, and the population density of mainland countries across the African continent. Published in the journal "Across African countries, national child health was lowest when water quality, improved sanitation, air quality, and environmental performance were lowest. We have also provided the first empirical evidence that large households are linked to worsening child health outcomes in developing nations," says Professor Bradshaw. Population size in many African countries will increase rapidly over the coming decades, raising concerns that the added pressures on infrastructure and the environment will further compromise child-health outcomes. "In most regions of Africa, this result suggests that environmental degradation is possibly now already at a point where it is compromising food production, water or air quality, or defence against infectious disease." "These concerning results emphasise the importance of continued investment in clean water and sanitation services, measures to improve air quality, broad-scale family planning, and efforts to restrict further environmental degradation, all to promote the United Nations' Sustainable Development Goals in Africa by 2030." The World Health Organisation estimates that 5.6 million children under five years of age died in 2016, with over half of those deaths deemed preventable or treatable with minimal intervention, particularly in sub-Saharan Africa where 1 in 13 children dies before turning five. Co-author Professor Peter Le Souëf from the University of Western Australia says: "Health professionals have largely been ignoring the negative consequences of overpopulation and environmental degradation -- including climate change -- on child health in developing nations. 
They no longer have a reason to do so with this new evidence." The relationship between child-health outcomes and causes is based on the most recent data and presents a snapshot in time, rather than what might have been more important historical challenges, according to the authors. "Failing to break out of the poverty trap is partially a result of poor health causing lower economic performance, which itself erodes health outcomes. In fact, it has been estimated that the African region will lose approximately 6% of its gross domestic product from the future years of life lost," says Professor Bradshaw. Better environmental management and dedicated family planning across Africa will improve these figures.
Pollution
2019
September 30, 2019
https://www.sciencedaily.com/releases/2019/09/190930131543.htm
Curbing diesel emission could reduce big city mortality rate
U.S. cities could see a decline in mortality rates and an improved economy through midcentury if federal and local governments maintain stringent air pollution policies and diminish concentrations of diesel freight truck exhaust, according to Cornell University research.
"The U.S. must reduce emission in the transportation sector. By improving air quality through better policies and technology in the freight transportation sector, we can breathe better and save lives," said senior author Oliver Gao, professor of civil and environmental engineering.Freight transportation is a pillar of the U.S. national economy, but while long-haul trucks account for less than 6% of the vehicle miles traveled over U.S. highways, they account for about 40% of the emissions of air polluting particulate matter and about 55% of nitrogen oxides -- the precursor to ozone in the atmosphere, the study said."People use their family cars some 10 to 12 years, and log about 120,000 miles over the car's lifetime," said Gao. "A diesel truck can stay on a fleet about 25 to 30 years and easily log a million miles."To reduce emissions by midcentury, the researchers said, truck manufacturers need to add advanced pollution-reduction technology to new trucks and retire older, highly polluting vehicles.Freight trucks primarily use diesel engines, which are efficient and durable but emit fine-particulate exhaust, which poses a cancer risk 7.5 times larger than all other air toxins. Diesel exhaust is classified as a Group 1 (highest level) carcinogenic, according to the World Health Organization's International Agency for Research on Cancer.Gao and his colleagues modeled the public health impacts of restraining particulate matter, based on emission change for future air quality. 
They estimated improved health outcomes (preventing 3,600 premature deaths nationally each year) and $38 billion annually in economic benefits from reducing those deaths. In order to achieve emission reduction goals and the benefits in public health, stringent emission standards and fuel policies should be continuously and effectively implemented, the study said. In addition, the societal benefits of reduced freight emissions are expected to largely exceed the implementation costs of such standards and policies. For instance, the total compliance cost of particulate restraint from 2011 to 2050 would be about $1.8 billion annually, which is about 5% of the calculated yearly health savings of $38 billion. The researchers point out that employing a carbon tax on freight serves to increase oil prices and shifts at least 15% of freight to energy-efficient rail, reducing overall emissions and obtaining 9% more health benefits nationally. While current federal regulations have emissions limits on new vehicles, the regulations do not affect vehicles already in use, Gao said. Aging trucks, however, can easily degrade from normal to high-emitting conditions. Eliminating super-emitting vehicles completely could further reduce long-haul freight emissions by nearly 70% and provide 20% more health benefits, the researchers said. "Getting rid of the particulate matter is an important part of reducing air pollution from diesel truck emissions," said Gao.
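The quoted cost-benefit figures are easy to sanity-check. The arithmetic below uses only the numbers reported in the article; the observation that the implied value per avoided death (about $10.6 million) is broadly in line with commonly used value-of-statistical-life estimates is our gloss, not a claim from the study:

```python
# Back-of-the-envelope check of the figures quoted in the study
# (dollar amounts in billions per year).
compliance_cost = 1.8    # annual particulate-control compliance cost
health_savings = 38.0    # annual economic benefit of avoided deaths
deaths_avoided = 3600    # premature deaths prevented nationally per year

cost_share = compliance_cost / health_savings
value_per_death = health_savings * 1e9 / deaths_avoided

print(f"cost as a share of savings: {cost_share:.1%}")            # roughly 5%
print(f"implied value per avoided death: ${value_per_death/1e6:.1f}M")
```

The ratio confirms the article's "about 5%": compliance costs are a small fraction of the estimated health benefits.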
Pollution
2019
September 26, 2019
https://www.sciencedaily.com/releases/2019/09/190926105838.htm
People living near green spaces are at lower risk of metabolic syndrome
Middle-aged and older adults who live in greener neighbourhoods are at lower risk of developing metabolic syndrome than those living in areas with fewer green spaces. This is the main conclusion of a new study by the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa," which provides further evidence on the health benefits of green spaces.
Metabolic syndrome is a cluster of conditions that occur together and include obesity, hypertension, high blood sugar levels, and abnormal fat levels. It is a major risk factor for non-communicable diseases such as heart attacks, diabetes or stroke. To date, a number of studies have analysed the relationship between exposure to green spaces and individual components of metabolic syndrome. In this study, ISGlobal examined the link with metabolic syndrome as a whole, providing an indicator of overall cardiometabolic health, and in the long term. The longitudinal study, published in These findings suggest that long-term exposure to green spaces can play an important role in preventing metabolic syndrome as a whole, as well as individual components such as large waist circumference, high levels of blood fats or hypertension. The mechanisms underlying this association "could be related to better opportunities provided by green spaces to perform physical activity as well as a decrease in exposure to air pollution," explains Carmen de Keijzer, ISGlobal researcher and first author of the study. The association observed was stronger for women than for men. "Women tend to spend more time in their residential neighbourhood, which could explain this gender difference," adds the researcher. "The study found more health benefits in those areas with higher tree coverage, which provides a basis for investigating the types of vegetation that impact positively on our health," says Payam Dadvand, ISGlobal researcher and last author of the study. Green spaces could help reduce the burden of non-communicable diseases, one of the top priorities in public health nowadays. "We need greener cities if we want healthier cities," Dadvand stresses. A recent study, also by ISGlobal, showed that people living in greener areas have a slower cognitive decline. Less stress, greater longevity, and better overall and mental health are other benefits demonstrated by scientific studies.
Pollution
2019
September 28, 2019
https://www.sciencedaily.com/releases/2019/09/190928082727.htm
Ditch the delicate wash cycle to help save our seas
Delicate wash cycles in washing machines have been found to release more plastic microfibres than other cycles.
New research led by Newcastle University has shown that it is the volume of water used during the wash cycle, rather than the spinning action of the washing machine, which is the key factor in the release of plastic microfibres from clothes. Millions of plastic microfibres are shed every time we wash clothes that contain materials such as nylon, polyester and acrylic. Because these fibres are so small, they drain out of our washing machines and can ultimately enter the marine environment. Once in the ocean, they are ingested by the animals living there, and two years ago Newcastle University scientists showed for the first time that these fibres have now reached the deepest parts of our ocean. Working with Procter & Gamble in Newcastle, the team measured the release of plastic microfibres from polyester clothing for a range of cycles and water volumes. Counting the fibres released, the team found the higher the volume of water, the more fibres released, regardless of the speed and abrasive forces of the washing machine. In fact, they found that on average, 800,000 more fibres were released in a delicate wash than a standard cycle. Publishing their findings today in the academic journal "Counterintuitively, we discovered that 'delicate' cycles release more plastic microfibres into the water, and then the environment, than standard cycles." "Previous research has suggested the speed the drum spins at, the number of times it changes spinning direction during a cycle and the length of pauses in the cycle -- all known as the machine agitation -- is the most important factor in the amount of microfibre released." "But we have shown here that even at reduced levels of agitation, microfibre release is still greatest with higher water-volume-to-fabric ratios." "This is because the high volume of water used in a delicate cycle, which is supposed to protect sensitive clothing from damage, actually 'plucks' away more fibres from the material." "Plastic pollution is one of the biggest challenges 
facing society today, and understanding the key sources is an important process to help reduce our impact on the environment. Laundry has been recognised as a major contributor of microplastics, but until now, precisely measuring the release of these fibres has been difficult because it is almost impossible to accurately simulate the reality of what happens in people's machines in a lab setting. Using a tergotometer -- a benchtop device comprising eight 1000 mL washing vessels that simulate full-scale domestic washing -- the team were able to carry out tests under different conditions, making changes to water volume, spin speed, temperature and time. A DigiEye camera -- a digital colour imaging system -- was then used to accurately calculate the amount of microfibres released. To test whether the observations made using the tergotometers were reflective of full-size domestic washing machines, the team then tested the fabrics on a delicate wash cycle using identical washing machines in the test centre at Procter and Gamble (P&G). The team showed that previous recommendations by groups to move towards high water volumes and low levels of agitation as a way of reducing the amount of microfibre released were actually making the problem worse. Neil Lant, Research Fellow at P&G and co-author on the study, said: "The appliance industry has started to introduce microfibre filters in some new washing machines and the textile industry is looking to reduce the fibre shedding levels of new clothing. "We hope that the issue will ultimately be solved by such actions, and our work on the mechanistic causes will help in the development of these solutions." Max Kelly adds: "Reducing the amount of plastic pollution is everyone's responsibility and often it's the small changes that make a huge difference. "By avoiding high water-volume-to-fabric washes such as the delicate cycles and ensuring full wash loads then we can all do our bit to help reduce the amount of these synthetic 
fibres being released into the environment." "Hopefully, these findings may also be used by manufacturers to influence the design of future washing machines and reduce our plastic footprint. Over time these changes could also see a global reduction in the amount of energy and water required to wash our clothes."
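The core finding -- release scales with the water-volume-to-fabric ratio, not agitation -- can be sketched as a toy proportional model. The per-litre coefficient and the wash volumes below are invented, with the coefficient tuned only so that the delicate-versus-standard gap matches the roughly 800,000 extra fibres the study reports on average:

```python
# Toy model (illustrative only): microfibre release proportional to
# the water-volume-to-fabric ratio, independent of drum agitation.
def fibres_released(water_litres, fabric_kg, fibres_per_litre_per_kg=53_000):
    return fibres_per_litre_per_kg * water_litres / fabric_kg

# Assumed wash volumes for a 2 kg polyester load.
standard = fibres_released(water_litres=50, fabric_kg=2)
delicate = fibres_released(water_litres=80, fabric_kg=2)  # more water, gentler spin

extra = delicate - standard
print(f"extra fibres on the delicate cycle: {extra:,.0f}")
```

Under a model like this, reducing agitation does nothing; only lowering the water-to-fabric ratio (smaller volumes, fuller loads) reduces release, which is exactly the behavioural advice the researchers give.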
Pollution
2019