Columns: Date (string), Link (string), Title (string), Summary (string), Body (string), Category (string, 20 classes), Year (int64)
March 1, 2021
https://www.sciencedaily.com/releases/2021/03/210301133829.htm
True cost of the planet's energy and transport systems
The hidden social, environmental and health costs of the world's energy and transport sectors are equal to more than a quarter of the globe's entire economic output, new research from the University of Sussex Business School and Hanyang University reveals.
According to analysis carried out by Professor Benjamin K. Sovacool and Professor Jinsoo Kim, the combined externalities for the energy and transport sectors worldwide amount to an estimated average of $24.662 trillion, the equivalent of 28.7% of global Gross Domestic Product.
The study found that the true cost of coal should be more than twice as high as current prices when factoring in the currently unaccounted financial impact of externalities such as climate change, air pollution and land degradation.
The study authors say the research highlights the market failure of the world's energy systems. Factoring in their true costs by including social costs, which are almost equal to production costs, would make many fossil-fuelled and nuclear power stations economically unviable, the researchers found.
Benjamin K. Sovacool, Professor of Energy Policy in the Science Policy Research Unit (SPRU) at the University of Sussex Business School, said: "Our research has identified immense hidden costs that are almost never factored into the true expense of driving a car or operating a coal-fired power station. Including these social costs would dramatically change least-cost planning processes and integrated resource portfolios that energy suppliers and others depend upon.
"It is not that these costs are never paid by society; they are just not reflected in the costs of energy. And unfortunately these hidden costs are not distributed equally or fairly. The most affected parties are under-represented in the marketplace and have external costs imposed upon them, whether that be the families forced to live in areas of the highest air pollution and toxicity because they have no other choice, or the inhabitants of low-lying island states such as the Maldives or Vanuatu, who are threatened most immediately by rising sea levels."
Professor Jinsoo Kim, from the Department of Earth Resources and Environmental Engineering at Hanyang University, said: "Our study clearly reveals that oil, coal, and waste in electricity portfolios generate far more externalities than alternative sources of supply. If you factored in the true cost of fossil fuels, the multinational giants that dominate this sector would be huge loss-making operations. Instead, it is left to society and government to pick up the considerable bill."
The researchers sought to establish the range and scope of externalities -- the uncompensated costs or benefits that economic activity imposes on people other than those engaged in it -- associated with electricity supply, energy efficiency, and transport. To do so, they carried out a meta-analysis and research synthesis of 139 studies with 704 distinct estimates of externalities: 83 studies (with 318 observations) for electricity supply, 13 studies (with 13 observations) for energy efficiency, and 43 studies (with 373 observations) for transport.
They found that coal accounts for by far the largest share of energy externalities ($4.78 trillion, or 59%), followed by oil (more than $2 trillion, or 26%) and gas ($552 billion, or 7%), across the four largest energy markets of China, Europe, India, and the United States. The study found coal to have about three times as many negative externalities as solar PV, five times as many as wind energy, and 155 times as many as geothermal energy.
The researchers found that the externalities of coal amounted to 14.5 ¢/kWh, compared to its levelized cost of energy (LCOE) of between 6.6 and 15.2 ¢/kWh. Similarly, natural gas combined cycle turbines have externalities of 3.5 ¢/kWh and an LCOE of 4.4 to 6.8 ¢/kWh.
Prof Sovacool said: "The challenge is for policymakers, regulators, and planners to ensure that electricity and transport markets function as they should and accurately price the trillions of dollars in external costs that the energy and mobility industries currently shift to society surreptitiously.
"At the moment, consumers are shielded from the true costs of energy extraction, conversion, supply, distribution and use, which means the immense ecological and community impacts of our existing systems become far less discernible. The fundamental policy question is whether we want global markets that manipulate the presence of externalities to their advantage, or a policy regime that attempts to internalize them."
Prof Kim said: "Our findings are timely and we hope they will help inform the design of Green New Deals or post-pandemic Covid-19 recovery packages around the world.
"Some of the most important commonalities of many stimulus packages have been bailouts for the fossil fuel, automotive and aeronautic industries, but a global and national recovery may not be sustainable if the true cost of these industries is not correctly factored in."
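The per-kWh figures reported here can be combined into a quick back-of-the-envelope comparison. This is only a sketch of the arithmetic, not the study's method: using the midpoint of the reported LCOE range and simply adding the externality are my assumptions.

```python
# Illustrative only: combining the article's per-kWh figures to show how
# externalities change the apparent cost of generation. Values in US cents/kWh.
sources = {
    # name: (LCOE range low, LCOE range high, externality)
    "coal": (6.6, 15.2, 14.5),
    "natural gas (combined cycle)": (4.4, 6.8, 3.5),
}

for name, (lo, hi, ext) in sources.items():
    mid = (lo + hi) / 2          # midpoint of the reported LCOE range
    true_cost = mid + ext        # naive "social cost" = LCOE + externality
    print(f"{name}: LCOE ~{mid:.1f} c/kWh, with externalities ~{true_cost:.1f} c/kWh "
          f"({true_cost / mid:.1f}x)")
```

On this rough accounting, coal's cost more than doubles once externalities are included, consistent with the study's claim that the true cost of coal should be more than twice current prices.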
Pollution
2021
March 1, 2021
https://www.sciencedaily.com/releases/2021/03/210301133628.htm
Transmission risk of COVID-19 from sewage spills into rivers can now be quickly quantified
A team of researchers, including water quality, epidemiology, remote sensing and modelling experts, led by Dr Jamie Shutler at the University of Exeter, have developed a fast and simple way to assess the potential risk of water-borne transmission of the COVID-19 virus, posed by sewage spills into open and closed freshwater networks.
The new study used information on the environment, a population's infection rate, and water usage to calculate the potential potency of viral loads in the event of a sewage spill.
The research team believe the new study could provide fresh impetus in identifying new ways to prevent the spread of the virus among communities and the environment.
Dr Jamie Shutler, lead author of the study, based at the University of Exeter's Penryn Campus in Cornwall, said: "It's important to identify and break all viable transmission routes if we want to stop any future outbreaks.
"Airborne water droplets have previously been highlighted as the main route for transmission of the virus which causes COVID-19, but we know that other forms of transmission are likely to exist."
Previous studies have shown that COVID-19 viral pathogens can be found in untreated wastewater, in concentrations consistent with population infection rates. While studies are still relatively early in relation to COVID-19, other human coronaviruses are documented to survive in wastewater, with colder water temperatures likely to increase viral survival.
Using this knowledge and existing methods, the research team identified how the transmission risk from water contaminated with sewage reduces over time. This issue is likely to be especially problematic in parts of the world with a large proportion of temporary settlements, such as shanty towns, favelas or refugee camps, which are less likely to have safe sanitation systems, or in any densely populated region with high infection rates that also suffers a sewage spill.
Modifying established pollution analysis methods, the team were able to estimate the viral concentration in rivers after a sewage spill. This meant they could calculate the relative transmission risk posed to humans by contaminated waterways for 39 countries.
These methods, the team argue, provide a fast way to assess the transmission risk associated with sewage spills using easily available population, infection rate and environmental data, allowing evidence-based guidance following a spill.
Dr Shutler added: "We hope that water companies or NGOs will use our simple spreadsheet calculator, which is freely available, to estimate the transmission risk after a spill. They can then use this information to advise the public."
This research was partially funded by the European Union project Aquasense, which is focusing on novel methods to study and monitor water quality. The research resulted from a collaboration between the University of Exeter in Cornwall, the University of Glasgow, the Łukasiewicz Institute of Electron Technology in Poland, and the University of Agriculture in Kraków, Poland.
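The reduction of risk over time can be sketched with a simple first-order decay model. This is an assumption-laden illustration, not the study's calculator: the exponential form and the T90 value (time for a 90% reduction) below are hypothetical, chosen only to show the shape of the calculation.

```python
import math

# Sketch of a dilution-and-decay estimate (assumptions mine: first-order
# decay; the T90 value and initial concentration are hypothetical, not
# taken from the study).
def viral_concentration(c0, hours, t90_hours=24.0):
    """Concentration remaining after `hours`, given c0 at the spill and a
    T90 (time for a 90% reduction) of `t90_hours`."""
    k = math.log(10) / t90_hours      # first-order decay constant
    return c0 * math.exp(-k * hours)

c0 = 1e5  # viral copies per litre at the spill site (illustrative)
for t in (0, 12, 24, 48):
    print(f"{t:>2} h after spill: {viral_concentration(c0, t):,.0f} copies/L")
```

Colder water would correspond to a longer T90 (slower decay), matching the article's note that lower water temperatures are likely to increase viral survival.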
Pollution
2021
March 1, 2021
https://www.sciencedaily.com/releases/2021/03/210301084509.htm
Microplastic sizes in Hudson-Raritan Estuary and coastal ocean revealed
Rutgers scientists for the first time have pinpointed the sizes of microplastics from a highly urbanized estuarine and coastal system with numerous sources of fresh water, including the Hudson River and Raritan River.
Their study of tiny pieces of plastic in the Hudson-Raritan Estuary in New Jersey and New York indicates that stormwater could be an important source of the plastic pollution that plagues oceans, bays, rivers and other waters and threatens aquatic and other life.
"Stormwater, an understudied pathway for microplastics to enter waterways, had similar or higher concentrations of plastics compared with effluent from wastewater sewage treatment plants," said senior author Nicole Fahrenfeld, an associate professor in the Department of Civil and Environmental Engineering in the School of Engineering at Rutgers University-New Brunswick. "More research is needed to increase understanding of the full impact of microplastics on ecosystems."
In the early 1900s, General Bakelite began manufacturing Bakelite, the first synthetic plastic on Earth, in Perth Amboy, New Jersey. Today, plastics are used in myriad products worldwide and are widespread in marine and other environments, posing risks to wildlife and aquatic life. Possible sources of microplastics -- often fragments of larger pieces of plastic -- include municipal, industrial and stormwater outfalls.
The Rutgers team collected water samples during a relatively dry period in July 2018 and after a heavy rainfall in April 2019. They also collected samples of wastewater entering treatment plants, wastewater discharges and stormwater.
The highest levels of microplastics, ranging from two-hundredths of an inch to less than a tenth of an inch long, were observed during summer low-flow conditions at the mouth of the Raritan River, according to the study.
"The smaller microplastics likely spent more time in the turbulent Hudson River, leading to increased aging and breakdown of plastics," Fahrenfeld said.
Polyethylene, which is widely used in high-density polyethylene bottles, trash bags and other items, was the most commonly observed polymer in the Raritan River and Hudson-Raritan Estuary.
A 2017 Rutgers-led study found high levels of microplastics in the Raritan and Passaic rivers. Scientists later identified more than 300 organic chemical compounds that appeared to be associated with microplastic particles in the two rivers.
Pollution
2021
March 1, 2021
https://www.sciencedaily.com/releases/2021/02/210224100901.htm
Follow the smell of the ocean to find where marine predators feed
A joint research project between organizations in Japan and the US has demonstrated that zooplankton, a major food source for marine predators, can be located by following the concentration gradient of the chemical dimethyl sulfide (DMS) in ocean water and air. Currently, little is known about how marine predators search for and find enough food to maintain their body size. This study is expected to expand research into the chemical cues marine organisms use while foraging.
Zooplankton, such as krill and copepods, are the main energy source for many large marine animals. The big predators must consume a large amount of these tiny creatures to provide enough energy to power their enormous bodies. How they find their food is still not clearly understood.
Krill feed on phytoplankton, which produce and retain water-soluble compounds in their bodies to cope with osmotic pressure. This is essential for survival in seawater. One of these compounds is dimethylsulfoniopropionate (DMSP). DMSP contains sulfur and is zwitterionic, meaning that it carries both a positive and a negative charge, like an amino acid. It is broken down by bacteria into DMS, a component of the familiar aromas associated with ocean air or dried seaweed.
DMSP stored in phytoplankton is released into seawater when zooplankton graze on the phytoplankton, which is hypothesized to result in higher DMS concentrations in areas of dense zooplankton. It is thought that marine predators could use DMS concentration to locate food sources. While attraction to artificially released DMS has been shown in some predatory species, whether natural gradients of DMS are used by predators and serve as a useful foraging cue remains unknown.
To investigate the phenomenon, an international team of researchers from Kumamoto University and Woods Hole Oceanographic Institution developed a new instrument to continuously and automatically analyze seawater and atmospheric concentrations of DMS. Together with a researcher from Stony Brook University, they then used the device to conduct a survey in June 2019 off the coast of Cape Cod, Massachusetts, a summer feeding ground for many baleen whale species.
Researchers took chemical measurements and recorded zooplankton and fish biomass and whale locations over a series of transects across the ocean surface. Their work revealed that, as hypothesized, zooplankton grazing on phytoplankton seems to result in higher localized concentrations of DMS compared to surrounding areas. In contrast, no association was found between fish biomass and DMS concentration. Simulations based on their measurements show that if large marine predators, such as whales, are able to detect the DMS concentration gradient, following increasing concentrations of DMS would lead them to denser zooplankton feeding areas than if they swam randomly.
"We plan to expand this research project in the future to investigate the relationship between DMS and predation by measuring the concentration of the chemical alongside marine predator movement trajectories," said Professor Kei Toda, who led the chemical measurements. "We also plan to explore other attractant chemicals and study their relationship with the behavior of marine predators like whales, seabirds and penguins. A pilot study tagging humpback whales to examine their movements in relation to DMS was conducted in Antarctica in February 2020, but there are still some issues that need to be addressed to pursue the relationship between chemical substances and predation. We believe that we will have some interesting findings in the near future."
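The gradient-following idea in those simulations can be illustrated with a toy model. This is my sketch, not the study's simulation: the plume shape, step sizes, and random seed are all arbitrary; it only shows why climbing a concentration gradient beats searching at random.

```python
import random

# Toy model (assumptions mine): a DMS "plume" peaks where zooplankton are
# densest, and a searcher that climbs the gradient ends up closer to the
# peak than a random walker starting from the same point.
def dms(x, peak=50.0):
    return 1.0 / (1.0 + (x - peak) ** 2)   # concentration, maximal at x=peak

def search(x, steps, follow_gradient, step=1.0, rng=None):
    rng = rng or random.Random(0)           # fixed seed for repeatability
    for _ in range(steps):
        if follow_gradient:
            # move toward whichever neighbouring position has more DMS
            x += step if dms(x + step) > dms(x - step) else -step
        else:
            x += rng.choice((-step, step))  # unbiased random walk
    return x

start = 10.0
gradient_x = search(start, 100, follow_gradient=True)
random_x = search(start, 100, follow_gradient=False)
print(f"gradient follower ends at {gradient_x}, random walker at {random_x}")
```

The gradient follower reliably reaches the plume peak, while the random walker stays near its starting point, mirroring the paper's contrast between gradient-guided and random swimming.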
Pollution
2021
February 23, 2021
https://www.sciencedaily.com/releases/2021/02/210223164440.htm
Oxidation processes in combustion engines and in the atmosphere take the same routes
Alkanes, an important component of fuels for combustion engines and an important class of urban trace gases, react via different reaction pathways than previously thought. These hydrocarbons, formerly called paraffins, produce large amounts of highly oxygenated compounds that can contribute to organic aerosol and thus to air pollution in cities. An international research team has now been able to prove this through laboratory experiments with state-of-the-art measurement technology at the University of Helsinki and the Leibniz Institute for Tropospheric Research (TROPOS) in Leipzig.
The results of this interdisciplinary work provide crucial information about oxidation processes both in combustion engines and in the atmosphere -- with direct implications for engine efficiency and the formation of aerosols, especially in cities, the research team writes.
Oxidation processes play a major role both in the atmosphere and in combustion. A chain reaction called autoxidation is enabled by high engine temperatures. But it also acts as an important source of highly oxygenated compounds in the atmosphere that form organic aerosol, as researchers from Finland, Germany and the USA demonstrated in 2014. Autoxidation is one reason organic compounds age on exposure to oxygen from the air; it contributes to the spoilage of food and wine.
This chain reaction is initiated by the formation of peroxy radicals (RO2). The propensity of organic compounds to undergo such multistep autoxidation determines the ignition timing of fuels in engines and, on the other hand, the potential for the formation of low-volatility condensable vapours and consequently organic aerosol in the atmosphere. The extent to which multistep autoxidation takes place depends on the molecular structure of the organic compounds and the reaction conditions. Determining the different reaction pathways of peroxy radicals, which are important intermediates in all oxidation reactions, is crucial for understanding the formation of the different reaction products and their key properties, which can ultimately affect both human health and the climate.
Since peroxy radicals are very reactive, their chemical reactions take place very quickly, and individual reaction steps were thus overlooked for a long time. The discovery of highly oxygenated organic molecules (HOMs) seven years ago was only possible due to advances in measurement techniques.
A special mass spectrometer (a Chemical Ionisation -- Atmospheric Pressure Interface -- Time of Flight, or CI-APi-TOF, mass spectrometer) that can monitor very short-lived compounds has now been used to measure the radicals and oxidation products of alkanes. "Until now, there have been no studies on HOM formation from alkanes because it was assumed that their structure would be unfavourable for autoxidation," reports Dr. Torsten Berndt from TROPOS.
Methane, an important greenhouse gas, belongs to the group of alkanes. But the most important fossil fuels of the world economy, derived from crude oil and natural gas, also consist of alkanes: these include propane, butane, pentane, hexane, heptane and octane. New findings about the oxidation behaviour of this group of substances therefore have great relevance in many areas.
To gain a deeper insight into alkane autoxidation, experiments were carried out in the free-jet flow reactor at TROPOS in Leipzig in addition to experiments in Helsinki. The experimental set-up is optimised so that the gases do not come into contact with the walls during the reaction, in order to exclude interference with the results from wall processes. During the experiments, almost all reactive intermediates, RO2 radicals and their reaction products could be directly monitored. The interdisciplinary cooperation between researchers from combustion chemistry and atmospheric chemistry proved very useful, because processes analogous to those in the atmosphere take place in combustion, only at higher temperatures.
"As a result, it became visible that not only isomerisation reactions of RO2 radicals but also of RO radicals are responsible for the build-up of more highly oxidised products. The study made it possible to identify, with the alkanes, the last and perhaps most surprising group of organic compounds for which autoxidation is important," Torsten Berndt concludes.
Even at high concentrations of nitrogen oxides, which otherwise quickly terminate autoxidation reactions, the alkanes apparently produce considerable amounts of highly oxidised compounds in the air. The new findings allow for a deeper understanding of autoxidation processes and give rise to further investigations on isomerisation reactions of RO radicals.
Pollution
2021
February 23, 2021
https://www.sciencedaily.com/releases/2021/02/210223110713.htm
Simply speaking while infected can potentially spread COVID-19
COVID-19 can spread from asymptomatic but infected people through small aerosol droplets in their exhaled breath. Most studies of the flow of exhaled air have focused on coughing or sneezing, which can send aerosols flying long distances.
However, speaking while near one another is also risky, since the virus can be ejected by merely talking.
In this study, electronic cigarettes were used to produce artificial smoke consisting of droplets about one-tenth of a micron in diameter, similar to the size of a virus particle. The liquid used in these vaping devices, a mixture of glycerin and propylene glycol, produces a cloud of tiny droplets that scatter light from a laser, allowing visualization of airflow patterns.
"We analyzed the characteristics of exhalation diffusion with and without a mask when a person was standing, sitting, facing down, or lying face up," said author Keiko Ishii.
To study the effect of speech on exhalation, the word "onegaishimasu," a typical Japanese greeting in a business setting, was uttered repeatedly while filming the resulting vapor cloud. The experiments were carried out in a hair salon at the Yamano College of Aesthetics in Tokyo, with postures chosen to simulate typical customer service scenarios, including shampooing, where a customer is lying back and the technician is standing and leaning over the customer.
"A significant amount of similar face-to-face contact would occur not only in cosmetology but also in long-term and medical care," said Ishii.
The experiments revealed that the exhaled air from an unmasked person who is speaking tends to move downward under the influence of gravity. If a customer or patient is lying below, they could be infected.
When a mask is worn while standing or sitting, the vapor cloud tends to attach to that person's body, which is warmer than the surrounding air, and flows upward along the body. If the technician is leaning over, however, the aerosol cloud tends to detach from the body and fall onto the client below.
The investigators also experimented with face shields and found that they can prevent aerosols that leak from around the technician's mask from traveling down to the customer.
"The face shield promoted the rise of the exhaled breath," said Ishii. "Hence, it is more effective to wear both a mask and a face shield when providing services to customers."
Pollution
2021
February 23, 2021
https://www.sciencedaily.com/releases/2021/02/210223110441.htm
New sensor paves way to low-cost sensitive methane measurements
Researchers have developed a new sensor that could allow practical and low-cost detection of low concentrations of methane gas. Measuring methane emissions and leaks is important to a variety of industries because the gas contributes to global warming and air pollution.
"Agricultural and waste industries emit significant amounts of methane," said Mark Zondlo, leader of the Princeton University research team that developed the sensor. "Detecting methane leaks is also critical to the oil and gas industry for both environmental and economic reasons because natural gas is mainly composed of methane."
The new sensor is described in a journal published by The Optical Society (OSA).
"We hope that this research will eventually open the door to low-cost, accurate and sensitive methane measurements," said Nathan Li, first author of the paper. "These sensors could be used to better understand methane emissions from livestock and dairy farms and to enable more accurate and pervasive monitoring of the climate crisis."
Building a less expensive sensor
Laser-based sensors are currently the gold standard for methane detection, but they cost between USD 10,000 and 100,000 each. A sensor network that detects leaks across a landfill, petrochemical facility, wastewater treatment plant or farm would be prohibitively expensive to implement using laser-based sensors.
Although methane sensing has been demonstrated with mid-IR LEDs, performance has been limited by the low light intensities generated by available devices. To substantially improve the sensitivity and develop a practical system for monitoring methane, the researchers used a new ICLED developed by Jerry Meyer's team at the U.S. Naval Research Laboratory.
"The ICLEDs we developed emit roughly ten times more power than commercially available mid-IR LEDs, and could potentially be mass-produced," said Meyer. "This could enable ICLED-based sensors that cost less than USD 100 per sensor."
To measure methane, the new sensor compares infrared light transmitted through clean air with no methane against light transmitted through air that contains methane. To boost sensitivity, the researchers sent the infrared light from the high-power ICLED through a 1-meter-long hollow-core fiber containing an air sample. The inside of the fiber is coated with silver, which causes the light to reflect off its surfaces as it travels down the fiber to the photodetector at the other end. This allows the light to interact with additional molecules of methane in the air, resulting in higher absorption of the light.
"Mirrors are commonly used to bounce light back and forth multiple times to increase sensor sensitivity but can be bulky and require precise alignment," said Li. "Hollow-core fibers are compact, require low volumes of sample gas and are mechanically flexible."
Measuring up to laser-based sensors
To test the new sensor, the researchers flowed known concentrations of methane into the hollow-core fiber and compared the infrared transmission of the samples with state-of-the-art laser-based sensors. The ICLED sensor was able to detect concentrations as low as 0.1 parts per million while showing excellent agreement with both calibrated standards and the laser-based sensor.
"This level of precision is sufficient to monitor emissions near sources of methane pollution," said Li. "An array of these sensors could be installed to measure methane emissions at large facilities, allowing operators to affordably and quickly detect leaks and mitigate them."
The researchers plan to improve the design of the sensor to make it practical for long-term field measurements by investigating ways to increase the mechanical stability of the hollow-core fiber. They will also study how extreme weather conditions and changes in ambient humidity and temperature might affect the system. Because most greenhouse gases, and many other chemicals, can be identified using mid-IR light, the methane sensor could also be adapted to detect other important gases.
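The comparison of transmission with and without methane is, at its core, a Beer-Lambert calculation. A minimal sketch of that inversion follows; the absorption coefficient `alpha` is a hypothetical lumped value, not the paper's calibration.

```python
import math

# Beer-Lambert sketch of absorption-based gas sensing (assumption: alpha
# lumps the absorption coefficient per ppm per metre; its value here is
# illustrative, not from the paper).
ALPHA = 1e-4       # absorbance per ppm per metre (hypothetical)
PATH_M = 1.0       # the paper's 1-metre hollow-core fiber

def transmission(concentration_ppm, path_m=PATH_M, alpha=ALPHA):
    """Fraction of IR light transmitted through the sample."""
    return math.exp(-alpha * concentration_ppm * path_m)

I0 = transmission(0)             # clean-air reference measurement
I = transmission(2.0)            # sample at ~2 ppm, roughly ambient methane
absorbance = -math.log(I / I0)   # compare sample against reference
recovered_ppm = absorbance / (ALPHA * PATH_M)  # invert Beer-Lambert
print(f"recovered concentration: {recovered_ppm:.2f} ppm")
```

The silver-coated fiber increases the effective path length, which in this picture raises `path_m` and therefore the absorbance per ppm, improving sensitivity.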
Pollution
2021
February 22, 2021
https://www.sciencedaily.com/releases/2021/02/210222192825.htm
Environmental policies not always bad for business, study finds
Critics claim environmental regulations hurt productivity and profits, but the reality is more nuanced, according to an analysis of environmental policies in China by a pair of Cornell economists.
The analysis found that, contrary to conventional wisdom, market-based or incentive-based policies may actually benefit regulated firms in the traditional and "green" energy sectors by spurring innovation and improvements in production processes. Policies that mandate environmental standards and technologies, on the other hand, may broadly harm output and profits.
"The conventional wisdom is not entirely accurate," said Shuyang Si, a doctoral student in applied economics and management. "The type of policy matters, and policy effects vary by firm, industry and sector."
Si is the lead author of "The Effects of Environmental Policies in China on GDP, Output, and Profits."
Si mined Chinese provincial government websites and other online sources to compile a comprehensive data set of nearly 2,700 environmental laws and regulations in effect in at least one of 30 provinces between 2002 and 2013. This period came just before China declared a "war on pollution," instituting major regulatory changes that shifted its longtime prioritization of economic growth over environmental concerns.
"We really looked deep into the policies and carefully examined their features and provisions," Si said.
The researchers categorized each policy as one of four types: "command and control," such as mandates to use a portion of electricity from renewable sources; financial incentives, including taxes, subsidies and loans; monetary awards for cutting pollution or improving efficiency and technology; and nonmonetary awards, such as public recognition. They assessed how each type of policy impacted China's gross domestic product, industrial output in traditional energy industries and the profits of new energy sector companies, using publicly available data on economic indicators and publicly traded companies.
Command and control policies and nonmonetary award policies had significant negative effects on GDP, output and profits, Si and Lin Lawell concluded. But a financial incentive -- loans for increasing renewable energy consumption -- improved industrial output in the petroleum and nuclear energy industries, and monetary awards for reducing pollution boosted new energy sector profits.
"Environmental policies do not necessarily lead to a decrease in output or profits," the researchers wrote.
That finding, they said, is consistent with the "Porter hypothesis" -- Harvard Business School Professor Michael Porter's 1991 proposal that environmental policies could stimulate growth and development by spurring technology and business innovation to reduce both pollution and costs.
While certain policies benefited regulated firms and industries, the study found that those benefits came at a cost to other sectors and to the overall economy. Nevertheless, Si and Lin Lawell said, these costs should be weighed against the benefits of these policies to the environment and society, and to the regulated firms and industries.
Economists generally prefer market-based or incentive-based environmental policies, Lin Lawell said, with a carbon tax or tradeable permit system representing the gold standard. The new study led by Si, she said, provides more support for those types of policies.
"This work will make people aware, including firms that may be opposed to environmental regulation, that it's not necessarily the case that these regulations will be harmful to their profits and productivity," Lin Lawell said. "In fact, if policies promoting environmental protection are designed carefully, there are some that these firms might actually like."
Additional co-authors contributing to the study were Mingjie Lyu of Shanghai Lixin University of Accounting and Finance, and Song Chen of Tongji University. The authors acknowledged financial support from the Shanghai Science and Technology Development Fund and an ExxonMobil ITS-Davis Corporate Affiliate Fellowship.
Pollution
2021
February 22, 2021
https://www.sciencedaily.com/releases/2021/02/210222164132.htm
How outdoor pollution affects indoor air quality
Just when you thought you could head indoors to be safe from the air pollution that plagues the Salt Lake Valley, new research shows that elevated air pollution events, like horror movie villains, claw their way into indoor spaces. The research was conducted in conjunction with the Utah Division of Facilities Construction and Management.
In a long-term study in a Salt Lake-area building, researchers found that the amount of air pollution that comes indoors depends on the type of outdoor pollution. Wildfires, fireworks and wintertime inversions all affect indoor air to different degrees, says Daniel Mendoza, a research assistant professor in the Department of Atmospheric Sciences and visiting assistant professor in the Department of City & Metropolitan Planning. The study is unique, Mendoza says, in combining a long-term indoor air quality monitoring project with paired outdoor measurements and research-grade instruments.
"We all know about the inversions," Mendoza says. "We all know how large of a problem wildfires are. But do we really know what happens when we're inside?"
Mendoza, who also holds appointments as an adjunct assistant professor in the Pulmonary Division at the School of Medicine and as a senior scientist at the NEXUS Institute, and his colleagues set up their air monitoring equipment at the Unified State Laboratories in Taylorsville, Utah. They placed three sensors to measure airborne concentrations of particulate matter: one on the roof to measure outdoor air, one in the air handling room, where the outdoor air comes in, and one in an office. The building uses a 100% outside air filtration system; this is not typical for most commercial buildings, which usually use some amount of recirculated air.
The sensors stayed in place from April 2018 to May 2019, just over a year. In the Salt Lake Valley, a year's air quality events include fireworks-laden holidays on Independence Day and Pioneer Day (July 24), smoke from wildfires throughout the West that settles in the bowl-like valley, and wintertime inversions in which the whole valley's emissions are trapped in a pool of cold air.
Through it all, the team's sensors kept watch. Amid the expected events, however, a private fireworks show took place on Aug. 17, 2018, within five miles of the study building, providing an unexpected research opportunity. More on that later.
During a wintertime inversion event in December, as the Air Quality Index outdoors reached orange and red levels, the indoor air quality reached yellow levels and stayed there until the inversion cleared. In all, the pollution levels inside were about 30% of what they were outside.
That's not surprising, Mendoza says. During inversions, only around 20% of the air pollution is what's called primary pollution -- the particulate matter that comes directly from combustion exhaust. The rest is secondary -- formed as gases undergo chemical reactions under specific meteorological conditions and combine to form solid particulates. As soon as the air comes indoors, those meteorological conditions change.
"That changes the chemical environment for these particles and they actually dissociate," Mendoza says. "That's what we're suspecting is happening when these particles come into the building and that's why we don't observe them."
In late August 2018, when three active wildfires were burning in California, indoor air pollution rose to about 78% of outside pollution levels. "For nearly 48 hours," the researchers wrote, "indoor air quality reached levels considered problematic for health-compromised populations and nearly reached levels considered unsafe for all populations." It's important to note, though, that thanks to the building's air handling system, the air was still safer inside than outside.
The reason for the higher infiltration of particulate matter, Mendoza says, is that smoke particles are stable and don't break down in different temperature and humidity conditions. "We see those particles travel straight through the system," Mendoza says, "because there's no specific filtration that blocks out these particles. Smoke particles can also be smaller in size; that's why they're so dangerous for us."
Utah has two major fireworks holidays: July 4 and July 24 (Pioneer Day). But the researchers happened to catch a signal from a private fireworks event just a few weeks before the wildfire smoke event, providing an opportunity to see how fireworks shows, both large and small, affected indoor air quality.
The smoke from fireworks is somewhere between inversion pollution and wildfire smoke. It contains primary smoke particles as well as the gases that can combine to produce secondary particulates, which can come from the chemicals used to produce fireworks' bright colors.
On the night of July 4, 2018, air quality sharply deteriorated once fireworks shows began and stayed in the red range, with spikes into the purple "very unhealthy" range, for about three hours. Indoor air quality reached orange levels, registering about 30% of the outdoor air pollution. "It was only after 8 a.m. on July 5 that indoor air quality returned to pre-fireworks levels," the researchers write.
The private fireworks show on August 17 lasted only 30 minutes, and although its scope was much smaller, the smoke was still enough to raise the indoor air quality index to orange for several minutes. "Even a 'small' fireworks show did have a marked impact on indoor air quality," Mendoza says. That matters to people with respiratory challenges, who can see large-scale poor air quality events like inversions and fireworks holidays coming but might find private fireworks shows an unpleasant surprise.
The commercial building that the researchers studied is a somewhat controlled environment. Learning about indoor air quality in homes will be a greater challenge. "You have kids coming in with mud or with dirt on their feet, you have vacuuming and cooking. So that's going to be our next step."
As many people are spending more time at home due to the COVID-19 pandemic, the researchers hope their work will help people understand what actions they can take to improve their indoor air quality. "There is a lot of opportunity to reduce the pollutants that reach occupants in buildings, both commercial and residential," says Sarah Boll, assistant director of the Utah Division of Facilities Construction and Management. "To me, that is the great part of this work -- with more research it can point the way to protecting people indoors."
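The 30% (inversion), roughly 78% (wildfire smoke) and 30% (fireworks) figures in the article all come down to one quantity: the indoor-to-outdoor concentration ratio from the paired sensors. As a loose illustration only, here is a minimal Python sketch of that calculation; the function name and all sensor readings below are invented, not the study's data.

```python
# Hypothetical sketch: estimate the indoor/outdoor (I/O) infiltration ratio
# from paired PM2.5 readings -- the basic quantity behind the ~30% (inversion)
# and ~78% (wildfire smoke) figures reported above. Values are invented.

def infiltration_ratio(indoor_pm25, outdoor_pm25):
    """Mean indoor-to-outdoor PM2.5 ratio over paired readings."""
    pairs = [(i, o) for i, o in zip(indoor_pm25, outdoor_pm25) if o > 0]
    if not pairs:
        raise ValueError("no valid outdoor readings")
    return sum(i / o for i, o in pairs) / len(pairs)

# Invented example: stable wildfire-smoke particles pass through largely
# unfiltered, while secondary inversion particles dissociate indoors.
inversion_ratio = infiltration_ratio([12, 15, 18], [40, 50, 60])
wildfire_ratio = infiltration_ratio([78, 80, 74], [100, 100, 95])
print(round(inversion_ratio, 2))  # ~0.30
print(round(wildfire_ratio, 2))   # ~0.79
```

The study's actual analysis used a year of continuous research-grade measurements; this sketch only shows the ratio the reported percentages refer to.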
Pollution
2021
February 22, 2021
https://www.sciencedaily.com/releases/2021/02/210222124613.htm
Air pollution puts children at higher risk of disease in adulthood
Children exposed to air pollution, such as wildfire smoke and car exhaust, for as little as one day may be doomed to higher rates of heart disease and other ailments in adulthood, according to a new Stanford-led study. The analysis, published in Nature
"I think this is compelling enough for a pediatrician to say that we have evidence air pollution causes changes in the immune and cardiovascular system associated not only with asthma and respiratory diseases, as has been shown before," said study lead author Mary Prunicki, director of air pollution and health research at Stanford's Sean N. Parker Center for Allergy & Asthma Research. "It looks like even brief air pollution exposure can actually change the regulation and expression of children's genes and perhaps alter blood pressure, potentially laying the foundation for increased risk of disease later in life."The researchers studied a predominantly Hispanic group of children ages 6-8 in Fresno, California, a city beset with some of the country's highest air pollution levels due to industrial agriculture and wildfires, among other sources. Using a combination of continuous daily pollutant concentrations measured at central air monitoring stations in Fresno, daily concentrations from periodic spatial sampling and meteorological and geophysical data, the study team estimated average air pollution exposures for 1 day, 1 week and 1, 3, 6 and 12 months prior to each participant visit. When combined with health and demographics questionnaires, blood pressure readings and blood samples, the data began to paint a troubling picture.The researchers used a form of mass spectrometry to analyze immune system cells for the first time in a pollution study. The approach allowed for more sensitive measurements of up to 40 cell markers simultaneously, providing a more in-depth analysis of pollution exposure impacts than previously possible.Among their findings: Exposure to fine particulate known as PM2.5, carbon monoxide and ozone over time is linked to increased methylation, an alteration of DNA molecules that can change their activity without changing their sequence. This change in gene expression may be passed down to future generations. 
The researchers also found that air pollution exposure correlates with an increase in monocytes, white blood cells that play a key role in the buildup of plaques in arteries, and could possibly predispose children to heart disease in adulthood. Future studies are needed to verify the long-term implications.Hispanic children bear an unequal burden of health ailments, especially in California, where they are exposed to higher traffic-related pollution levels than non-Hispanic children. Among Hispanic adults, prevalence for uncontrolled hypertension is greater compared with other races and ethnicities in the U.S., making it all the more important to determine how air pollution will affect long-term health risks for Hispanic children.Overall, respiratory diseases are killing more Americans each year, and rank as the second most common cause of deaths globally."This is everyone's problem," said study senior author Kari Nadeau, director of the Parker Center. "Nearly half of Americans and the vast majority of people around the world live in places with unhealthy air. Understanding and mitigating the impacts could save a lot of lives."Nadeau is also the Naddisy Foundation Professor in Pediatric Food Allergy, Immunology, and Asthma, professor of medicine and of pediatrics and, by courtesy, of otolaryngology at the Stanford School of Medicine, and a senior fellow at the Stanford Woods Institute for the Environment. Co-authors of the study include Justin Lee, a graduate student in epidemiology and population health; Xiaoying Zhou, a research scientist at the Parker Center; Hesam Movassagh, a postdoctoral research fellow at the Parker Center during the research; Manisha Desai, a professor of biomedical informatics research and biomedical data science; and Joseph Wu, director of the Stanford Cardiovascular Institute and the Simon H. 
Stertzer, MD, Professor of Medicine and Radiology; and researchers from the University of Leuven; the University of California, Berkeley; the University of California, San Francisco; and Sonoma Technology.
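The exposure-averaging step described above (1 day, 1 week, and 1, 3, 6 and 12 months prior to each participant visit) can be pictured as a set of trailing-window means over a daily pollutant series. The sketch below is a simplified illustration, not the study's actual estimation pipeline; the series and window lengths are invented.

```python
# Hypothetical sketch of the exposure-windowing step: average a daily
# pollutant series over several trailing windows ending at a visit date.
# The daily values and the smoky-spell scenario are invented.

def trailing_means(daily_values, visit_index, windows):
    """Mean exposure over each trailing window (in days) ending at visit_index."""
    out = {}
    for days in windows:
        start = max(0, visit_index + 1 - days)
        segment = daily_values[start:visit_index + 1]
        out[days] = sum(segment) / len(segment)
    return out

daily_pm25 = [10.0] * 300 + [50.0] * 65   # invented: a 65-day polluted spell
exposures = trailing_means(daily_pm25, visit_index=364, windows=[1, 7, 30, 365])
print(exposures[1])    # 50.0 -- yesterday only
print(exposures[365])  # ~17.1 -- the full year dilutes the recent spell
```

Short and long windows can tell very different stories for the same child, which is why the study estimated several of them per visit.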
Pollution
2021
February 22, 2021
https://www.sciencedaily.com/releases/2021/02/210222082622.htm
Long-term exposure to low levels of air pollution increases risk of heart and lung disease
Exposure to what is considered low levels of air pollution over a long period of time can increase the risk of heart attack, stroke, atrial fibrillation and pneumonia among people ages 65 and older, according to new research published today in the American Heart Association's flagship journal
Air pollution can cause harm to the cardiovascular and respiratory systems due to its effect on inflammation in the heart and throughout the body. Newer studies on the impact of air pollution on health are focused on understanding the potential harm caused by long-term exposure and are researching the effects of multiple air pollutants simultaneously. Research on air pollution is critical to informing recommendations for national environmental and health guidelines. "People should be conscious of the air quality in the region where they live to avoid harmful exposure over long periods of time, if possible," said Mahdieh Danesh Yazdi, Pharm.D., M.P.H., Ph.D., a post-doctoral research fellow at the Harvard T.H. Chan School of Public Health and lead author of the study. "Since our study found harmful effects at levels below current U.S. standards, air pollution should be considered as a risk factor for cardiovascular and respiratory disease by clinicians, and policy makers should reconsider current standards for air pollutants." Researchers examined hospitalization records for more than 63 million Medicare enrollees in the contiguous United States from 2000 to 2016 to assess how long-term exposure to air pollution impacts hospital admissions for specific cardiovascular and respiratory issues. The study measured three components of air pollution: fine particulate matter (PM2.5), nitrogen dioxide (NO2) and ozone (O3). Using hundreds of predictors, including meteorological values, satellite measurements and land use to estimate daily levels of pollutants, researchers calculated the study participants' exposure to the pollutants based upon their residential zip code. 
Additional analysis included the impact of the average yearly amounts of each of the pollutants on hospitalization rates for non-fatal heart attacks, ischemic strokes, atrial fibrillation and flutter, and pneumonia.Statistical analyses found thousands of hospital admissions were attributable to air pollution per year. Specifically:The risks for heart attacks, strokes, atrial fibrillation and flutter, and pneumonia were associated with long-term exposure to particulate matter.Data also showed there were surges in hospital admissions for all of the health outcomes studied with each additional unit of increase in particulate matter. Specifically, stroke rates increased by 2,536 for each additional ug/m3 (micrograms per cubic meter of air) increase in fine particulate matter each year.There was an increased risk of stroke and atrial fibrillation associated with long-term exposure to nitrogen dioxide.Pneumonia was the only health outcome in the study that seemed impacted by long-term exposure to ozone; however, researchers note there are currently no national guidelines denoting safe or unsafe long-term ozone levels."When we restricted our analyses to individuals who were only exposed to lower concentrations of air pollution, we still found increased risk of hospital admissions with all of the studied outcomes, even at concentration levels below current national standards," added Danesh Yazdi. "More than half of the study population is exposed to low levels of these pollutants, according to U.S. 
benchmarks, therefore, the long-term health impact of these pollutants should be a serious concern for all, including policymakers, clinicians and patients."The researchers further stratified the analyses to calculate the cardiovascular and respiratory risks associated with each of the pollutants among patient subgroups including gender, race or ethnicity, age and socioeconomic factors, detailed in the study.The causality in the study could only be interpreted and not proven definitively due to the limitations of the data available, which may have not included other known CVD risk factors. In addition, coding errors can occur in the Medicare database, which would impact the analyses.
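The per-unit figure quoted above (about 2,536 additional stroke admissions per 1 ug/m3 increase in annual PM2.5) implies a simple linear scaling, which a short sketch can make concrete. The function name and the baseline and target concentrations below are invented for illustration; they are not from the study.

```python
# Hedged sketch of the linear attribution quoted above: each 1 ug/m3 rise in
# annual PM2.5 was associated with ~2,536 extra stroke admissions per year.
# The scenario concentrations are invented.

ADMISSIONS_PER_UNIT_PM25 = 2536  # stroke admissions per 1 ug/m3, from the study

def avoidable_stroke_admissions(pm25_now, pm25_target):
    """Admissions avoidable by lowering annual PM2.5 to a target level."""
    reduction = max(0.0, pm25_now - pm25_target)
    return reduction * ADMISSIONS_PER_UNIT_PM25

# Invented scenario: lowering annual-average PM2.5 from 10 to 8 ug/m3
print(avoidable_stroke_admissions(10.0, 8.0))  # 5072.0
```

A linear dose-response like this is exactly why the authors argue that harm continues below current standards: there is no threshold term in the relationship.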
Pollution
2021
February 18, 2021
https://www.sciencedaily.com/releases/2021/02/210218140126.htm
Human impact on solar radiation levels for decades
In the late 1980s and 1990s, researchers at ETH Zurich discovered the first indications that the amount of sunlight reaching the Earth's surface had been steadily declining since the 1950s. The phenomenon was known as "global dimming." However, a reversal in this trend became discernible in the late 1980s. The atmosphere brightened again at many locations and surface solar radiation increased.
"In previous studies, we showed that the amount of sunlight that reaches the Earth's surface is not constant over many decades but instead varies substantially -- a phenomenon known as global dimming and brightening," says ETH Professor Martin Wild of the Institute for Atmospheric and Climate Science.Yet little is known about the reasons for these fluctuations, which have been observed for decades. One particularly controversial point is whether the fluctuations are caused by air pollution, with aerosols blocking the sunlight, or whether they are a result of natural variations in the climate system.A number of scientists suspected that cloud cover may have changed over the years, absorbing the sun's rays more effectively during the dimming phase than during the brightening phase.This is why Wild and colleagues from other research institutes analysed measurements collected between 1947 and 2017 in the Potsdam radiation time series. The series offers one of the longest, most homogeneous, continuous measurements of solar radiation on the Earth's surface.In this new study, they were able to show that rather than these fluctuations being due to natural changes in the cloud cover, they are instead generated by varying aerosols from human activity. The paper was published in the journal "In our analysis, we filtered out the effects of cloud cover to see whether these long-term fluctuations in solar radiation also occurred in cloud-free conditions," Wild explains. As it turned out, the decadal fluctuations in the sunlight received at the Earth's surface were apparent even when skies were clear.The researchers identified aerosols entering the atmosphere due to air pollution as the major contributor to global dimming and brightening. 
"Although we'd already assumed as much, we'd been unable to prove it directly until now," he says.The fact that the transition from dimming to brightening coincided with the economic collapse of the former communist countries in the late 1980s supports the argument that these variations have a human cause. Moreover, around this time, many western industrialised nations introduced strict air pollution regulations, which improved air quality significantly and facilitated the transfer of the sunbeams through the atmosphere. Lastly, the atmosphere was recovering from the volcanic eruption of Mount Pinatubo, which had ejected vast amounts of aerosols into the air in 1991.Wild and his colleagues had already ruled out fluctuations in solar activity in an earlier study. "The sun itself had only an infinitesimal, negligible effect, which in no way accounts for the magnitude of the intensity changes that had been observed over the years at the surface," Wild says.Surface solar radiation is a key parameter for climate issues. Not only does it govern the temperature, it also has a fundamental impact on the water cycle by regulating evaporation, which, in turn, governs cloud formation and affects precipitation. During the global dimming, less water evaporated from the Earth's surface, causing precipitation to decline worldwide.Solar radiation also affects the cryosphere, i.e. glaciers, snow and ice. "Glacial retreat accelerated when the atmosphere began brightening again," Wild says, adding: "It's also becoming increasingly important for the solar industry to gain a better understanding of these fluctuations when it comes to planning new facilities."Germany's National Meteorological Service, the Deutscher Wetterdienst, operates an observatory in Potsdam that has been measuring solar radiation since 1937. This means the station boasts one of the world's longest radiation time series. 
"I'm extremely grateful to have access to decades' worth of data; after all, it is only thanks to measurement series such as this that we're able to record and show changes in our environment and climate," Wild says, adding that this makes it imperative to support monitoring networks around the world for prolonged periods of time. Admittedly, this task isn't particularly spectacular, making it difficult to secure funding. "But if we want to understand climate change and clarify the impact of human activities, we need time series that go back far enough," he says. To this end, ETH maintains the Global Energy Balance Archive (GEBA), an unparalleled database of surface energy fluxes worldwide.
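The clear-sky filtering idea Wild describes can be pictured as: discard cloudy days, then compare decadal means of surface solar radiation. The toy sketch below uses invented numbers and a made-up cloud-fraction threshold; it is not the Potsdam series or the authors' actual method.

```python
# Illustrative sketch of clear-sky filtering for dimming/brightening:
# keep near-cloud-free days, then average surface radiation by decade.
# Records and the 0.1 cloud-fraction cutoff are invented.

def decadal_clear_sky_means(records):
    """records: (year, cloud_fraction, radiation W/m2) -> decade: mean."""
    by_decade = {}
    for year, cloud_fraction, radiation in records:
        if cloud_fraction > 0.1:      # filter out cloudy days
            continue
        by_decade.setdefault(year // 10 * 10, []).append(radiation)
    return {d: sum(v) / len(v) for d, v in by_decade.items()}

records = [
    (1955, 0.05, 180.0), (1958, 0.0, 178.0),   # invented "brighter" 1950s
    (1975, 0.05, 170.0), (1978, 0.0, 168.0),   # dimming despite clear skies
    (1995, 0.05, 176.0), (1998, 0.6, 120.0),   # cloudy day is dropped
]
print(decadal_clear_sky_means(records))
# {1950: 179.0, 1970: 169.0, 1990: 176.0}
```

If a dimming-then-brightening pattern survives after cloudy days are removed, clouds cannot be the cause, which is the logic behind attributing the fluctuations to aerosols.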
Pollution
2021
February 18, 2021
https://www.sciencedaily.com/releases/2021/02/210218140118.htm
Shale gas development in PA increases exposure of some to air pollutants
Air pollution levels may have exceeded air quality standards during the development of some Marcellus Shale natural gas wells in Pennsylvania, potentially impacting more than 36,000 people in one year alone during the drilling boom, according to Penn State scientists.
"The construction and drilling of these wells are a relatively short-term thing, and assessment of the impact on air quality is something that often falls through the cracks," said Jeremy Gernand, associate professor of industrial health and safety at Penn State. "But there are thousands and thousands of wells drilled depending on the year, and we wanted to see what the impact would be if we added it all up."More than 20,000 unconventional Marcellus Shale gas wells have been drilled since Pennsylvania's boom began around 2005. Large diesel-powered equipment and gas turbines used during the drilling and hydraulic fracturing stages of shale gas development emit air pollution, and these emissions can affect air quality within the vicinity of shale well sites and farther downwind, the scientists said.The scientists found emissions at some of the sites could have impacted air quality for people who live beyond the 500-foot setbacks required by state regulations."We found in one year alone, 36,000 people, or about 1% of the population of Pennsylvania's Marcellus Shale region, could have been exposed to pollution levels exceeding air quality standards," Gernand said. "However, we found doubling the required setback distance reduced that number by about half."The scientists developed a dispersion model to estimate concentrations of fine particulate matter (PM 2.5) resulting from well development. One form of air pollution created by the burning of fossil fuels, PM 2.5 are tiny particles that can be inhaled and cause lung damage, according to the scientists."Very few studies have investigated local residents' exposure to PM 2.5 emissions from shale gas development, specifically in the Marcellus region of Pennsylvania," Gernand said.The model considered meteorological conditions during well development, indicating how factors like wind carried emissions from individual well sites, the scientists said. 
The team then used census data to estimate how many people were in the areas affected by higher levels of air pollution.Their findings, published in "I think the main message is that a one-size-fits-all policy to constrain the impacts of industry probably isn't the most effective approach," Gernand said. "In this case, there are real benefits to making some alterations to setback regulations. We only need to push certain sites back farther from inhabited areas to see a big reduction in the number of people whose air quality is affected by this."Setback policy is shown to be an effective method to reduce exposure exceedances, but the scientists said their results indicate the policies should consider the number of wells per well pad and local conditions to further limit air quality impacts.The scientists said drilling activity has moved closer to populated areas as Marcellus development progressed in Pennsylvania. And while the construction, drilling and fracking stages are relatively short, sites now often have multiple wells so development can extend for months."Staying back 500 feet was probably fine when we drilled one well per pad and moved on, but under the current conditions that's not really sufficient anymore," Gernand said. "We really need to take into consideration things like how much construction and drilling activity will take place and for how long and population density in the area and use those things in some kind of decision framework."Zoya Banan, an air quality specialist with South Coast Air Quality Management District, also conducted this research while a doctoral student at Penn State.
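One way to picture the setback comparison is to count residents who live beyond the legal setback yet inside a well's modeled exceedance radius. The sketch below is a hypothetical simplification (circular exceedance zones, residents as points, invented distances); the study itself used a dispersion model driven by meteorology plus census data.

```python
# Hypothetical simplification of the setback analysis: a resident is
# "exposed" if they sit outside the setback but inside the distance at
# which modeled PM2.5 exceeds the standard. All coordinates are invented.
import math

def exposed_count(residents, wells, setback_m):
    """Residents outside the setback but inside a well's exceedance radius."""
    exposed = 0
    for rx, ry in residents:
        for wx, wy, exceed_radius_m in wells:
            d = math.hypot(rx - wx, ry - wy)
            if setback_m <= d <= exceed_radius_m:
                exposed += 1
                break
    return exposed

wells = [(0.0, 0.0, 400.0)]                      # exceedance out to 400 m
residents = [(200.0, 0.0), (350.0, 0.0), (600.0, 0.0)]
print(exposed_count(residents, wells, setback_m=152.0))  # ~500 ft setback: 2
print(exposed_count(residents, wells, setback_m=304.0))  # doubled setback: 1
```

Even in this toy version, doubling the setback halves the exposed count, mirroring the paper's headline finding.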
Pollution
2021
February 17, 2021
https://www.sciencedaily.com/releases/2021/02/210217134842.htm
Scientists develop blood test to predict environmental harms to children
Scientists at Columbia University Mailman School of Public Health developed a method using a DNA biomarker to easily screen pregnant women for harmful prenatal environmental contaminants like air pollution linked to childhood illness and developmental disorders. This approach has the potential to prevent childhood developmental disorders and chronic illness through the early identification of children at risk.
While environmental factors -- including air pollutants -- have previously been associated with DNA markers, no studies to date have used DNA markers to flag environmental exposures in children. Study results are published online in the journal There is ample scientific evidence that links prenatal environmental exposures to poor outcomes in children, yet so far there is no early warning system to predict which children are at highest risk of adverse health outcomes. The researchers took a major step toward overcoming this barrier by identifying an accessible biomarker measured in a small amount of blood to distinguish newborns at elevated risk due to prenatal exposure. They used air pollutants as a case study, although they say their approach is easily generalizable to other environmental exposures, and could eventually be made into a routine test.The researchers used machine learning analysis of umbilical cord blood collected through two New York City-based longitudinal birth cohorts to identify locations on DNA altered by air pollution. (DNA can be altered through methylation, which can modify gene expression, which can, for example, impact the amount of proteins that are important for development.) Study participants had known levels of exposure to air pollution measured through personal and ambient air monitoring during pregnancy, with specific measures of fine particulate matter (PM2.5), nitrogen dioxide (NO2), and polycyclic aromatic hydrocarbons (PAH).They tested these biomarkers and found that they could be used to predict prenatal exposure to NO2 and PM2.5 (which were monitored throughout pregnancy), although only with modest accuracy. PAH (which was only monitored for a short period during the third trimester) was less well predicted. The researchers now plan to apply their biomarker discovery process using a larger pool of data collected through the ECHO consortium, which potentially could lead to higher levels of predictability. 
It might also be possible to link these biomarkers with both exposures and adverse health outcomes. With better predictability and lower cost, the method could become a routine test used in hospitals and clinics. "Using a small sample of cord blood, it may be possible to infer prenatal environmental exposure levels in women where exposures were not explicitly measured," says senior author Julie Herbstman, PhD, director of Columbia Center for Children's Environmental Health (CCCEH) and associate professor of Environmental Health Sciences. "While further validation is needed, this approach may help identify newborns at heightened risk for health problems. With this information, clinicians could increase monitoring for high-risk children to see if problems develop and prescribe interventions, as needed." Approximately 15 percent of children in the United States ages 3 to 17 years are affected by neurodevelopmental disorders, including attention deficit hyperactivity disorder (ADHD), learning disabilities, intellectual disability, autism and other developmental delays. The prevalence of childhood asthma in the US is 8 percent, with the highest rates in African-American boys. Environmental exposures are known to contribute, or suspected of contributing, to multiple childhood disorders and are by nature preventable once identified as harmful. Prenatal air pollution exposure has been associated with adverse neurodevelopmental and respiratory outcomes, as well as obesity.
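The prediction step can be pictured, very loosely, as fitting a model from methylation levels to measured prenatal exposure, then applying it to new cord-blood samples. The study used machine learning over many DNA locations; the single-site least-squares sketch below, with an invented CpG site and invented data, only illustrates the idea.

```python
# Loose, hypothetical sketch: fit a one-predictor linear model mapping a
# methylation level at a single (invented) CpG site to prenatal NO2 exposure,
# then predict exposure for a new cord-blood sample. Not the study's model.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

methylation = [0.2, 0.4, 0.6, 0.8]      # invented beta values at one site
no2_ppb = [10.0, 20.0, 30.0, 40.0]      # invented measured exposures
a, b = fit_line(methylation, no2_ppb)
print(round(a * 0.5 + b, 1))  # predicted NO2 for a new sample: 25.0
```

The real task is far harder: thousands of candidate sites, modest accuracy (as the article notes for NO2 and PM2.5), and the need to validate on independent cohorts.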
Pollution
2021
February 17, 2021
https://www.sciencedaily.com/releases/2021/02/210217091025.htm
Plastic recycling results in rare metals being found in children's toys and food packaging
Some of the planet's rarest metals -- used in the manufacture of smartphones and other electrical equipment -- are increasingly being found in everyday consumer plastics, according to new research.
Scientists from the University of Plymouth and University of Illinois at Urbana-Champaign tested a range of new and used products including children's toys, office equipment and cosmetic containers.Through a number of detailed assessments, they examined levels of rare earth elements (REEs) but also quantities of bromine and antimony, used as flame retardants in electrical equipment and a sign of the presence of recycled electronic plastic.The results showed one or more REEs were found in 24 of the 31 products tested, including items where unregulated recycling is prohibited such as single-use food packaging.They were most commonly observed in samples containing bromine and antimony at levels insufficient to effect flame retardancy, but also found in plastics where those chemicals weren't present.Having also been found in beached marine plastics, the study's authors have suggested there is evidence that REEs are ubiquitous and pervasive contaminants of both contemporary and historical consumer and environmental plastics.The study, published in While they have previously been found in a variety of environments -- including ground water, soils and the atmosphere -- the study demonstrates the wide REE contamination of the "plastisphere" that does not appear to be related to a single source or activity.Dr Andrew Turner, Associate Professor (Reader) in Environmental Sciences at the University of Plymouth and the study's lead author, said: "Rare earth elements have a variety of critical applications in modern electronic equipment because of their magnetic, phosphorescent and electrochemical properties. However, they are not deliberately added to plastic to serve any function. So their presence is more likely the result of incidental contamination during the mechanical separation and processing of recoverable components."The health impacts arising from chronic exposure to small quantities of these metals are unknown. 
But they have been found in greater levels in food and tap water and certain medicines, meaning plastics are unlikely to represent a significant vector of exposure to the general population. However, they could signify the presence of other more widely known and better-studied chemical additives and residues that are a cause for concern."The research is the latest work by Dr Turner examining the presence of toxic substances within everyday consumer products, marine litter and the wider environment.In May 2018, he showed that hazardous chemicals such as bromine, antimony and lead are finding their way into food-contact items and other everyday products because manufacturers are using recycled electrical equipment as a source of black plastic.His work was part of a successful application by the University to earn the Queen's Anniversary Prize for Higher and Further Education for its pioneering research on microplastics pollution.It also builds on previous work at the University, which saw scientists blend a smartphone to demonstrate quantities of rare or so-called 'conflict' elements in each product.
Pollution
2021
February 17, 2021
https://www.sciencedaily.com/releases/2021/02/210217091019.htm
The 20 best places to tackle US farm nitrogen pollution
A pioneering study of U.S nitrogen use in agriculture has identified 20 places across the country where farmers, government, and citizens should target nitrogen reduction efforts.
Nitrogen from fertilizer and manure is essential for crop growth, but in high levels can cause a host of problems, including coastal "dead zones," freshwater pollution, poor air quality, biodiversity loss, and greenhouse gas emissions.The 20 nitrogen "hotspots of opportunity" represent a whopping 63% of the total surplus nitrogen balance in U.S. croplands, but only 24% of U.S. cropland area. In total, they comprise 759 counties across more than 30 states, finds the study in The top-ranked hotspot to target, based on total excess nitrogen, is a 61-county area across Illinois, Indiana, Missouri and Wisconsin. That's followed by a 55-county region in Kansas and Nebraska in second place, and 38 counties in Iowa, Minnesota and South Dakota in third.Several of the 20 hotspots -- with high nitrogen balances per acre -- surprised the researchers, particularly in the West -- including a 32-county hotspot in Idaho, Montana, Wyoming and Utah -- and the South (six hotspots across Texas, Louisiana, Mississippi, Alabama, Georgia, and Florida). Also on the list are chronic nitrogen problem areas, such as the Mississippi River Basin, Chesapeake Bay, and California's Central Valley."This study provides new perspective on where to focus efforts to tackle America's nitrogen problems," says lead author Eric Roy of the University of Vermont. "The U.S. has so many nitrogen trouble zones, and making progress will be easier in some locations than others. That's why this research is important. It reveals where programs aiming to increase the efficiency of farm nitrogen use are most likely to be successful."Why these particular 20 hotspots? First, the study shows that nitrogen inputs are so high in many of these areas that farmers can most likely reduce nitrogen use without hurting crop yields. "This is a crucial finding because farmers naturally worry about lower crop yields when reducing nitrogen inputs," says UVM co-author Meredith Niles. 
"And we don't want to compromise food security goals." Second -- and perhaps most importantly -- the study is the first to provide a robust, national analysis of underlying social, economic and agronomic factors linked to nitrogen balances on croplands at the county level. That makes it one of the most comprehensive studies of U.S. nitrogen use to date. Examples of these underlying factors include climate change beliefs, crop mix, precipitation, soil productivity, farm operating expenses, and more. By examining these predictors, researchers were able to identify nitrogen hotspots where reductions in excess nitrogen are most achievable. Surplus nitrogen use was higher than expected in these regions based on the mix of underlying factors -- suggesting fewer barriers to successful nitrogen reduction efforts. "This suggests that nitrogen reduction programs -- including those that offer farmers financial incentives -- have the highest potential for success in these 20 regions," says co-author Courtney Hammond Wagner of Stanford University, who recently completed a PhD at the University of Vermont.
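The hotspot ranking rests on a county-level surplus balance: nitrogen inputs (fertilizer, manure, fixation) minus nitrogen removed in harvested crops, summed and ranked by region. A minimal sketch, with invented county names and numbers rather than the paper's data:

```python
# Illustrative sketch of the surplus-ranking idea: surplus N = inputs minus
# crop removal, ranked largest-first. All names and values are invented.

def rank_surplus(counties):
    """counties: name -> (n_inputs_kt, n_removed_kt). Ranked by surplus N."""
    surplus = {name: inputs - removed
               for name, (inputs, removed) in counties.items()}
    return sorted(surplus.items(), key=lambda kv: kv[1], reverse=True)

counties = {
    "County A": (120.0, 70.0),   # surplus 50 kt N
    "County B": (90.0, 80.0),    # surplus 10 kt N
    "County C": (200.0, 110.0),  # surplus 90 kt N
}
print(rank_surplus(counties))
# [('County C', 90.0), ('County A', 50.0), ('County B', 10.0)]
```

The study's extra step was to model which of the high-surplus counties also have the social and economic conditions that make reductions achievable; the balance itself is the simple part.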
Pollution
2021
February 16, 2021
https://www.sciencedaily.com/releases/2021/02/210216185852.htm
Electricity source determines benefits of electrifying China's vehicles
Each year an estimated 1.2 million Chinese citizens die prematurely due to poor air quality. And public health consequences are particularly dire during extreme air quality events, such as infamous "Airpocalypse" winter haze episodes.
A team of Northwestern University researchers wondered if widespread adoption of electric vehicles (EVs) could help China avoid these deadly events. The answer? It depends.

In a new study, the researchers concluded that the air quality and public health benefits of EVs -- as well as their ability to reduce carbon emissions -- in China are dependent on the type of transport electrified and the composition of the electric grid. The study was published today (Feb. 16) in the February 2021 issue of the journal.

"A significant fraction of China's electricity is currently sourced from the burning of coal, which is a very carbon-intensive power source," said Jordan Schnell, the study's lead author. "When the coal-heavy power is used to charge light-duty vehicles, carbon emissions are reduced because of the efficiency of light-duty EVs. Heavy-duty electric vehicles require significantly more energy, so we see a net increase in carbon dioxide emissions. However, even when heavy-duty vehicle power is sourced entirely by coal-fired electricity, we still see air quality improvements because the on-road emission reductions outweigh what the power plants add. Fine particulate matter emissions, which are a main ingredient in haze, are reduced."

"We find that to achieve net co-benefits from heavy-duty EV adoption, the real sticking point in China's infrastructure lies in its power-generation sector. Greater adoption of emission-free renewable power generation is needed," said Northwestern's Daniel Horton, the study's senior author. "For light-duty vehicles the co-benefits are already there under the current infrastructure."

Horton is an assistant professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences. At the time of the research, Schnell was a postdoctoral fellow in Horton's laboratory. He is now a research scientist at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado at Boulder.

Pollutants come from a variety of sources, both natural and human-caused, and include emissions from transportation and power-generation facilities. Adoption of EVs reduces the emission of air pollutants and greenhouse gases from vehicle tailpipes, but the overall emissions budget must account for the shifting of emissions to the power plants used to charge EV batteries. Country-wide vehicle distribution and energy consumption also play roles in overall emissions profiles. Heavy-duty versus light-duty vehicles, for example, differ substantially, complicating the net outcome.

To reconcile these complicating factors, the researchers combined chemistry-climate modeling with emissions, weather and public health data. They examined the air quality and climate benefits and tradeoffs of heavy-duty versus light-duty vehicle adoption using meteorological conditions from a notorious Airpocalypse event in January 2013. Unlike previous EV-air quality studies that focused on chronic exposure, the researchers focused on the acute public health impacts of exposure to this short, but extremely hazardous, haze event.

The researchers discovered that EV adoption could have a modest role in reducing the public health burden of individual Airpocalypse events, with the extent depending on the type of vehicle electrified. They also found the realization of public health and climate co-benefits depended on the addition of emission-free renewable energy to the electric grid.

During the January 2013 extreme pollution episode, poisonous haze hung over much of China, including the major population centers of Beijing, Tianjin and Hebei. Acute exposure to the record-high levels of fine particulate matter and nitrogen dioxide increased pollution-related respiratory diseases, heart disease and stroke, which the researchers estimate led to approximately 32,000 premature deaths and $14.7 billion in health care costs.

To assess the consequences of EV adoption, in one model simulation scenario the researchers replaced roughly 40% of China's heavy-duty vehicles (such as construction equipment, long-haul trucks and buses) with electrified versions. An alternative scenario simulated the replacement of roughly 40% of China's light-duty vehicles with electric alternatives. The energy needed to charge the EV batteries is equivalent in both scenarios and is sourced from power-generation facilities on the grid. Emissions of greenhouse gases and air pollutants are determined according to the battery-charging load and power plant profile.

The research team found that electrifying 40% of heavy-duty vehicles consistently improved air quality -- avoiding up to 562 premature deaths. It did not, however, reduce greenhouse gas emissions. Light-duty EV adoption, on the other hand, reduced carbon dioxide emissions by 2 megatons but had more modest air quality benefits.

To drive home this point, the researchers provided an additional scenario comparison. When all traffic emissions are removed from the 2013 event, air quality improvements lead to a 6% decrease in acute premature mortality. When all power-sector emissions are removed, however, acute premature mortality declines 24%.

"Overall, we found that EV-induced pollution changes and avoided premature deaths were modest for the extreme event," Schnell said. "But air quality will improve more drastically as the power-generation sector moves away from fossil fuel-fired electricity."
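The "modest" scale of the heavy-duty EV benefit can be seen by relating the figures above. This is a back-of-envelope check using the article's reported numbers, not the study's own health-impact model:

```python
# Figures reported in the article for the January 2013 Airpocalypse event.
total_premature_deaths = 32_000   # estimated acute premature deaths from the event
avoided_by_hdv_ev = 562           # upper estimate avoided by electrifying 40% of heavy-duty vehicles

# Fraction of the event's death toll avoided in the heavy-duty EV scenario.
avoided_share = avoided_by_hdv_ev / total_premature_deaths
print(f"Heavy-duty EV scenario avoids about {avoided_share:.1%} of event deaths")

# For comparison, the article's bounding scenarios: removing ALL traffic
# emissions cuts acute mortality by 6%, versus 24% for all power-sector emissions.
traffic_bound, power_bound = 0.06, 0.24
print(f"Power-sector cleanup bound is {power_bound / traffic_bound:.0f}x the traffic bound")
```

The comparison makes the authors' point numerically: even full vehicle electrification is bounded well below what cleaning up power generation could deliver for this kind of acute event.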
Pollution
2021
February 15, 2021
https://www.sciencedaily.com/releases/2021/02/210215092402.htm
Commuters are inhaling unacceptably high levels of carcinogens
A new study finds that California's commuters are likely inhaling chemicals at levels that increase the risk for cancer and birth defects.
As with most chemicals, the poison is in the amount. Under a certain threshold of exposure, even known carcinogens are not likely to cause cancer. Once you cross that threshold, the risk for disease increases.

Governmental agencies tend to regulate that threshold in workplaces. However, private spaces such as the interior of our cars and living rooms are less studied and less regulated.

Benzene and formaldehyde -- both used in automobile manufacturing -- are known to cause cancer at or above certain levels of exposure and are Prop. 65-listed chemicals. New UC Riverside research shows that the average commuter in California is exceeding the threshold for exposure, breathing in unsustainably high levels of both chemicals. Both benzene and formaldehyde are carcinogens, and benzene carries the additional risk of reproductive and developmental toxicity.

"These chemicals are very volatile, moving easily from plastics and textiles to the air that you breathe," said David Volz, UCR professor of environmental toxicology.

The study found that up to 90% of the population in Los Angeles, San Diego, Orange, Santa Clara, and Alameda counties has at least a 10% chance of exceeding cancer risk from inhaling the chemicals, based on 30-minute average commute times.

"Of course, there is a range of exposure that depends on how long you're in the car and how much of the compounds your car is emitting," said Aalekhya Reddam, a graduate student in the Volz laboratory and lead author of the study.

Previously, Volz and Reddam studied commuter exposure to a flame retardant called TDCIPP, or chlorinated tris, and found that longer commute times increased exposure to that carcinogen as well. They set out on this study wanting to understand the risk of that compound relative to other chemicals introduced during car manufacturing.

Reddam advises commuters to keep the windows open during their rides if possible. "At least with some air flow, you'd be diluting the concentration of these chemicals inside your car," she said.

Benzene is used to produce synthetic fibers, and formaldehyde is a binder in plastics. "There should be alternatives to these chemicals to achieve the same goals during vehicle manufacturing," Volz said. "If so, these should be used."
Pollution
2021
February 11, 2021
https://www.sciencedaily.com/releases/2021/02/210211171119.htm
Facts on the ground: How microplastics in the soil contribute to environmental pollution
Plastic, with its unabated global production, is a major and persistent contributor to environmental pollution. In fact, the accumulation of plastic debris in our environment is only expected to increase in the future. "Microplastics" (MP) -- plastic debris <5 mm in size -- are particularly problematic in this regard, owing to how easily they can be ingested by marine organisms and eventually find their way to humans. But, it is not just the marine environment that contains MP debris. Studies on agricultural soil have revealed that MPs adversely affect not only the soil quality but also the physiology of soil organisms and, in turn, the interaction between soil and plants. Still, because most studies on MPs have focused on marine environments, it is not clear how abundant MPs are in different types of soils based on the agricultural practice (a source of MP) employed. Moreover, it remains to be determined whether only external sources of MP (sewage, wastewater, and runoff water due to rain) are responsible for the soil pollution.
Scientists from Incheon National University, Korea, headed by Prof. Seung-Kyu Kim, now explore these questions in their latest study.

For their study, the scientists examined four soil types corresponding to different agricultural practices: soils from outside and inside a greenhouse (GS-out and GS-in, respectively), mulching (MS), and rice field soil (RS). Of these, the former three samples represented the use of polyethylene film, while the RS sample represented little to no use of plastic. To minimize the effect of non-agricultural sources of MP, the scientists collected the samples from rural farmlands during the dry season. They only considered MPs in the size range of 0.1-5 mm and classified them by shape: fragment (uneven), sheet (thin and even), spherule (round), and fiber (thread-like).

As expected, the scientists found the highest average MP abundance in GS-in and GS-out (GS-in > GS-out), but surprisingly, they found the lowest MP content in MS rather than RS. Further, they found that among the different shapes of MPs, fragments dominated GS-in; fibers, GS-out and MS; and sheets, RS. Interestingly, all soils except GS-in had a major contribution from sheets, which hinted at potential internal sources of fragment-type MPs within greenhouses.

The scientists also observed an interesting trend regarding MP size distribution in the soil samples. They found that, unlike GS-out, MS, and RS (which showed MP abundance only for a range of sizes), GS-in showed an increasing abundance for progressively smaller sizes. They attributed this to the absence of an "environmental fate effect" -- the removal of MPs by surface runoff, infiltration, and wind -- in the GS-in samples.

Prof. Kim explains, "Contrary to previous studies, which stress MPs originating mostly from external sources, our study reveals that MPs in agricultural soil can come from external as well as internal sources, and that their concentration and sizes can be strongly affected by environmental conditions."

These findings can contribute to an enhanced understanding of the role of the agricultural environment as an MP source. Hopefully, assessing potential risks of MPs in agricultural soils and establishing efficient management strategies can help us reduce the threat from MPs.
Pollution
2021
February 10, 2021
https://www.sciencedaily.com/releases/2021/02/210210133423.htm
Emissions of banned ozone-depleting substance are back on the decline
Global emissions of a potent substance notorious for depleting the Earth's ozone layer -- the protective barrier which absorbs the Sun's harmful UV rays -- have fallen rapidly and are now back on the decline, according to new research.
The findings come from two international studies published today.

Dr Luke Western, from the University of Bristol, a co-lead author of one of the studies, said: "The findings are very welcome news and hopefully mark an end to a disturbing period of apparent regulatory breaches. If the emissions had stayed at the significantly elevated levels we found, there could have been a delay, possibly of many years, in ozone layer recovery. On top of that, since CFC-11 is also a potent greenhouse gas, the new emissions were contributing to climate change at levels similar to the carbon dioxide emissions of a megacity."

The production of CFC-11 was banned globally in 2010 as part of the Montreal Protocol, a historic international treaty which mandated the phase-out of ozone-depleting substances. Thereafter, CFC-11 emissions should have steadily fallen. But in 2018 some of the same scientists behind the recent, more reassuring discovery found a jump in emissions had begun around 2013, prompting alarm at the time that production of the banned substance had resumed in an apparent violation of the Montreal Protocol.

The first sign of something untoward was spotted by an international atmospheric monitoring team led by the National Oceanic and Atmospheric Administration (NOAA). Dr Steve Montzka from NOAA, lead author of the original research paper, explained: "We noticed the concentration of CFC-11 had declined more slowly since 2013 than predicted, clearly indicating an upturn in emissions. The results suggested that some of the increase was from eastern Asia."

These unexpected findings were confirmed by an independent global measurement network, the Advanced Global Atmospheric Gases Experiment (AGAGE). Professor Ron Prinn from the Massachusetts Institute of Technology (MIT), AGAGE principal investigator and co-author of both new papers, said: "The global data clearly suggested new emissions. The question was where exactly?"

The answer lay in the measurements at AGAGE and affiliate monitoring stations that detect polluted air from nearby regions. Using data from Korean and Japanese stations, it appeared around half of the increase in global emissions originated from parts of eastern China.

Further investigation by media and environmental campaigners exposed usage of CFC-11 in the manufacture of insulating foams in China. Chinese authorities took notice, and at meetings of the Montreal Protocol in 2018 and 2019 they confirmed some banned ozone-depleting substances were identified during factory inspections, but only in very small amounts relative to those inferred from the atmospheric data. According to their reports, arrests, material seizures, and the demolition of production facilities ensued.

The scientific teams have continued to closely monitor atmospheric levels, and the latest evidence, reported in the two papers on global CFC-11 emissions and eastern Chinese emissions, indicates that those efforts have likely contributed to dramatic emission declines.

Professor Matt Rigby, from the University of Bristol, co-author of both studies, explained: "To quantify how emissions have changed at regional scales, we compared the pollution enhancements observed in the Korean and Japanese measurement data to computer models simulating how CFC-11 is transported through the atmosphere. With the global data, we used another type of model that quantified the emissions change required to match the observed global CFC-11 concentration trends. At both scales, the findings were striking; emissions had dropped by thousands of tonnes per year between 2017 and 2019. In fact, we estimate this recent decline is comparable to or even greater than the original increase, which is a remarkable turnaround."

Whilst the findings suggest the rapid action in eastern China and other regions of the world has likely prevented a substantial delay in ozone layer recovery, any unreported production will have a lingering environmental impact. Professor Rigby added: "Even if the new production associated with the emissions from eastern China, and other regions of the world, has now stopped, it is likely only part of the total CFC-11 that was made has been released to the atmosphere so far. The rest may still be sitting in foams in buildings and appliances and will seep out into the air over the coming decades."

Since the estimated eastern Chinese CFC-11 emissions could not fully account for the inferred global emissions, there are calls to enhance international efforts to track and trace any future emitting regions. Professor Ray Weiss, from Scripps Institution of Oceanography, a principal investigator in AGAGE, said: "As a direct result of these findings, the Parties of the Montreal Protocol are now taking steps to identify, locate and quantify any future unexpected emissions of controlled substances by expanding the coverage of atmospheric measurements in key regions of the globe."
Pollution
2021
February 10, 2021
https://www.sciencedaily.com/releases/2021/02/210210091158.htm
Pre-COVID subway air polluted from DC to Boston, but New York region's is the worst, study finds
Commuters now have yet another reason to avoid packing themselves into subway stations. New York City's transit system exposes riders to more inhaled pollutants than any other metropolitan subway system in the Northeastern United States, a new study finds. Yet even its "cleaner" neighbors struggle with enough toxins to give health-conscious travelers pause.
Led by NYU Grossman School of Medicine researchers, the study measured air quality samples in 71 stations at morning and evening rush hours in Boston, New York City, Philadelphia, and Washington, D.C. Among the 13 underground stations tested in New York, the investigators found concentrations of hazardous metals and organic particles that ranged anywhere from two to seven times that of outdoor air samples. The report was published online Feb. 10.

Air quality was also measured in another 58 stations during rush hours in Boston, Philadelphia, and Washington. While no station's readings reached the severe levels of contamination seen in New York's worst transit lines, underground subway stations within each of these cities still showed at least twice the airborne particle concentrations as their respective outside samples at morning and evening rush hours.

"Our findings add to evidence that subways expose millions of commuters and transit employees to air pollutants at levels known to pose serious health risks over time," says study lead author David Luglio, a doctoral student at NYU Grossman.

"As riders of one of the busiest, and apparently dirtiest, metro systems in the country, New Yorkers in particular should be concerned about the toxins they are inhaling as they wait for trains to arrive," adds co-senior study author Terry Gordon, PhD, a professor in the Department of Environmental Medicine at NYU Grossman.

Further analysis of air samples showed that iron and organic carbon, a chemical produced by the incomplete breakdown of fossil fuels or from decaying plants and animals, composed three-quarters of the pollutants found in the underground air samples for all measured subway stations. Although iron is largely nontoxic, some forms of organic carbon have been linked to increased risk of asthma, lung cancer, and heart disease, the study authors say. Gordon notes that further research is needed to assess potentially higher risk for transit workers who spend far longer periods of time in the stations than riders.

The Metropolitan Transit Authority reported that 5.5 million people rode New York City's subways every day in 2019, while PATH puts its daily ridership at more than 284,000.

For the investigation, the research team took over 300 air samples during rush hour in stations in Manhattan, Philadelphia, Washington, Boston, and along train lines connecting New York City to New Jersey and Long Island. The data reflect more than 50 total hours of sampling across about 70 subway stops. In addition to real-time monitoring of the air quality, the team also used filters to collect airborne particles for later analysis.

According to the findings, the PATH-New York/New Jersey system had the highest airborne particle concentration at 392 micrograms per cubic meter, followed by the MTA-New York at 251 micrograms per cubic meter. Washington had the next highest levels at 145 micrograms per cubic meter, followed by Boston at 140 micrograms per cubic meter. Philadelphia was comparatively the cleanest system at 39 micrograms per cubic meter. By comparison, aboveground air concentrations for all measured cities averaged just 16 micrograms per cubic meter. Meanwhile, the Environmental Protection Agency advises that daily exposures at fine particle concentrations exceeding 35 micrograms per cubic meter pose serious health hazards.

Besides the Christopher Street PATH station, the most polluted stations in the Northeast included Capitol South in Washington, Broadway in Boston, 2nd Avenue on the F line in New York City, and 30th Street in Philadelphia, according to the findings.

Gordon cautions that the researchers did not measure riders' short-term exposure to the airborne substances, which would more closely mimic their experiences dashing to catch a train at the last minute. In addition, it remains unclear whether the steep drop in New York subway ridership due to the COVID-19 pandemic has influenced the metro's air quality, he adds.

Next, Gordon says he plans to investigate sources of subway station air contamination, such as exhaust given off by diesel maintenance locomotives, whipped-up dust from the remains of dead rodents, and poor ventilation as potential culprits. He also encourages researchers and transit authorities to examine why some systems are less polluted than others in a bid to adopt practices that might relatively quickly make stations safer for riders.

Funding for the study was provided by National Institute of Environmental Health Sciences grants P30 ES000260, P30 ES009089, and T32 ES007324.

In addition to Gordon and Luglio, other NYU Grossman researchers include Maria Katsigeorgis, Jade Hess, Rebecca Kim, John Adragna, George Thurston, Colin Gordon, and Amna Raja. Jonathan Fine, MD, at Norwalk Hospital in Norwalk, Conn., was also involved in the study. M.J. Ruzmyn Vilcassim, PhD, at the University of Alabama at Birmingham served as the study co-senior author.
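The reported rush-hour concentrations can be put side by side with the EPA's daily fine-particle guideline to show just how far underground air departs from it. This is an illustrative sketch using the figures quoted above, not an analysis from the study:

```python
# EPA guidance cited in the article: daily exposures above 35 µg/m³ of fine
# particles pose serious health hazards.
EPA_DAILY_LIMIT = 35  # micrograms per cubic meter

# Rush-hour airborne particle concentrations reported in the article (µg/m³).
systems = {
    "PATH (NY/NJ)": 392,
    "MTA (New York)": 251,
    "Washington": 145,
    "Boston": 140,
    "Philadelphia": 39,
    "Aboveground average": 16,
}

for name, concentration in systems.items():
    ratio = concentration / EPA_DAILY_LIMIT
    print(f"{name}: {ratio:.1f}x the EPA daily guideline")
```

Every underground system sampled sits above the guideline, with PATH more than eleven times over it, while the aboveground average is comfortably below.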
Pollution
2021
February 9, 2021
https://www.sciencedaily.com/releases/2021/02/210209151839.htm
The pandemic lockdown leads to cleaner city air across Canada, paper reveals
The COVID-19 pandemic that shuttered cities around the world did not just affect the way we work, study and socialize. It also affected our mobility. With millions of workers no longer commuting, vehicle traffic across Canada has plummeted. This has had a significant impact on the quality of air in major Canadian cities, according to a new study by Concordia researchers.
The researchers' findings appear in a newly published paper. Not surprisingly, they found that emission levels dropped dramatically over the course of the pandemic. The most noticeable drop-off occurred in week 12 of 2020 -- the one beginning Sunday, March 15, when national lockdown measures were implemented.

"We saw traffic congestion levels decrease by 69 per cent in Toronto and by 75 per cent in Montreal, compared to the same week in 2019," says the paper's lead author, Xuelin Tian, a second-year MSc student at the Gina Cody School of Engineering and Computer Science. Her co-authors include fellow student Zhikun Chen, her supervisor Chunjiang An, assistant professor in the Department of Building, Civil and Environmental Engineering, and Zhiqiang Tian of Xi'an Jiaotong University in China.

Less gasoline means less pollution

The paper notes that motor gasoline consumption fell by almost half during the pandemic's early weeks, with a similar, corresponding drop seen in carbon dioxide emissions. Motor gasoline consumption added 8,253.52 million kilograms of carbon dioxide to the atmosphere in April 2019, according to the authors' data. That number dropped to 4,593.01 million kilograms in April 2020.

There have also been significant drops in the concentration levels of nitrogen dioxide in Vancouver, Edmonton, Toronto and Montreal since the beginning of the pandemic. Similarly, concentration levels of carbon monoxide, closely linked to the transportation and mobile equipment sectors, dropped. In Edmonton, carbon monoxide concentration levels fell by as much as 50 per cent, from 0.14 parts per million in March 2018 to 0.07 in March 2020.

Emissions began to grow again over the summer, but the researchers have not yet had a chance to examine data from the second lockdown that began in late fall/winter 2020.

Aside from providing a kind of snapshot of a particularly unusual period, the data can also help governments assess the long-term impact of replacing gas-burning vehicles with electric ones on Canadian city streets. "This pandemic provided an opportunity for scenario analysis, although it wasn't done on purpose," says An, Concordia University Research Chair in Spill Response and Remediation. "Governments everywhere are trying to reduce their use of carbon-based fuels. Now we have some data that shows what happens when we reduce the number of gasoline-powered vehicles and the effect that has on emissions."

The Natural Sciences and Engineering Research Council of Canada (NSERC) provided support for this study.
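The "fell by almost half" claim can be verified directly from the April figures quoted above. A quick check using the reported numbers (not the paper's own code):

```python
# Reported CO2 from motor gasoline consumption, in million kilograms.
co2_april_2019 = 8253.52
co2_april_2020 = 4593.01

# Relative decrease, April 2020 versus April 2019.
drop = (co2_april_2019 - co2_april_2020) / co2_april_2019
print(f"CO2 from motor gasoline fell by {drop:.1%} year over year")
```

The result, a roughly 44 per cent decrease, matches the paper's characterization of the drop as "almost half."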
Pollution
2021
February 9, 2021
https://www.sciencedaily.com/releases/2021/02/210209151824.htm
Long-term environmental damage from transportation projects in Kenya, scientists warn
The construction of a major railway through Kenya will have long-term environmental impacts on the area, suggesting more work needs to be done to limit the damage on future infrastructure projects, a major study reveals.
The biggest impact of the Standard Gauge Railway (SGR), which runs from Mombasa to Nairobi, was pollution and contamination of soil, water and air, as well as disruption of natural processes.

The research, led by the University of York and part of the Development Corridors Partnership project, also showed environmental issues as a result of breaking up large areas of habitat into smaller, more isolated patches that may not be able to support long-term natural processes.

The SGR project was given the go-ahead following the completion of two Environmental and Social Impact Assessments, but scientists question how effectively recommendations were implemented in the development, given the evidence of widespread environmental degradation that can be seen in the area.

Professor Robert Marchant, from the University of York's Department of Environment and Geography, said: "African nations are looking forward to large-scale infrastructure investment as a catalyst for economic growth, but our research shows that before this can happen more work is needed to quantify ecological impacts on the land. Not only this, but should issues arise once the projects are complete, there must be a ready-to-go mitigation strategy that can be applied to reduce further damage quickly."

The researchers recommend that environmental impacts are integrated into the planning of large-scale infrastructure projects at every stage, and call for a particular focus on engaging and consulting key stakeholders in the design and implementation phases of the project.

Dr Tobias Nyumba, Post-Doctoral Researcher at the Development Corridors Partnership, said: "These steps are essential if a 'transportation corridor' is to become a true 'development corridor', bringing sustainable development and social wellbeing to a country such as Kenya, while minimizing or eliminating environmental damage."

As part of the Development Corridors Partnership, which aims to address environmental concerns about large development areas, researchers continue to work with a wide range of East African and Chinese developers to improve thinking around sustainability.
Pollution
2021
February 8, 2021
https://www.sciencedaily.com/releases/2021/02/210208161948.htm
Fast-growing parts of Africa see a surprise: Less air pollution from seasonal fires
Often, when populations and economies boom, so does air pollution -- a product of increased fossil-fuel consumption by vehicles, industry and households. This has been true across much of Africa, where air pollution recently surpassed AIDS as the leading cause of premature death. But researchers have discovered at least a temporary bright spot: dangerous nitrogen oxides, byproducts of combustion, are declining across the north equatorial part of the continent. The reason: a decline in the longtime practice of setting dry-season fires to manage land.
The study, along with previous research, links the decline to increasing population densities, along with switches from animal herding to row-crop agriculture and other pursuits. Shifting weather patterns also seem to have played a role. The research was published this week in the People in many parts of Africa have long set fires during dry seasons to clear land and release mineral nutrients held in vegetation back into the soil -- so much so that in many years, the continent is home to about 70 percent of global burned areas. The practice is especially prevalent in north equatorial Africa, spanning some 15 countries from Senegal and Ivory Coast in the west, to South Sudan, Uganda and Kenya in the east. Here, many people live as nomadic herders amid vast expanses of savanna and grasslands, and they traditionally set fires during the November-February dry season.However, recent years have seen steady population growth, and the conversion of savanna into villages and plots for crops, along with growth in incomes. Thus, say the researchers, fewer people are setting fires, in order to protect infrastructure and livelihoods. As a result, from 2005 to 2017, the region saw a 4.5 percent overall decrease in lower-atmosphere concentrations of nitrogen oxides (known for short as NOx) during the dry season -- the time when fires normally combine with urban pollution to make this the worst time of year for air quality. The NOx decline has been so strong that it has more than offset a doubling of emissions from fossil-fuel use in vehicles, factories and other sources coming mainly from urban areas.Previous research has also attributed part of the decline to temporary cyclical changes in winds coming from the Indian Ocean. In some years, these shifts have caused dry seasons to become a little wetter, dampening fires, or caused rainy seasons to become a little drier, reducing the amount of new vegetation that can subsequently serve as fuel. 
But the human factor has been steady."It's nice to see a decline occurring when you'd expect to see pollution increasing," said the study's lead author, Jonathan Hickman, a researcher at the NASA Goddard Institute for Space Studies, an affiliate of Columbia University's Earth Institute. The flip side, says Hickman: Overall NOx pollution has continued to increase during the rainy season, when fires are not a factor. "In the rainy season, we see a straight increase related to economic growth," he said.The density of NOx compounds is considered by many scientists to be a proxy for overall air quality. They are linked directly to asthma and premature death, and once in the air, they are involved in chemical reactions that produce an array of other dangerous pollutants, including low-level ozone and aerosols that can damage both crops and human health.Satellite data has enabled researchers to measure NOx in the air over time, and the authors of the new study took advantage of this. They also used satellite imagery to document trends in burned land. Combining both sets of observations, they found that they were tightly linked. Furthermore, economic and demographic data showed that declines in NOx matched areas where population density and economic activity have increased.That said, Hickman says that as population continues to grow and urbanize, more and more people will almost certainly be subjected to concentrated urban pollution, and this could cancel out the benefits of decreased fires. "The projections in this regard are not optimistic," he says. While some efforts to expand monitoring of urban air quality are underway, most African cities don't currently even measure air pollution, much less do much to curb it.Historically, economic booms elsewhere have led to similarly rampant problems, usually followed by an inflection point, where governments finally rein things in. 
London's Great Smog of 1952, which killed some 10,000 people, was followed by some of the world's first clean-air standards. After World War II, booming U.S. industries poured pollutants into the air virtually unchecked, until the EPA was created in 1970. In the early 21st century, China was the world's air-pollution capital, until reforms in 2013 started easing the problem. Along with Africa, India is now undergoing a boom in emissions, with few rules in place. "Hopefully, this one seasonal bright spot gives African nations a little opportunity to avoid the mistakes made by other countries," said Hickman. The paper was coauthored by Niels Andela of the NASA Goddard Space Flight Center; Kostas Tsigaridis and Susanne Bauer of the Goddard Institute for Space Studies; Corinne Galy-Lacaux of the Université Toulouse III Paul Sabatier, France; and Money Ossohou of the Université Félix Houphouët-Boigny, Ivory Coast.
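The core trend estimate in studies like this -- a percent change over 2005-2017 derived from a satellite time series -- can be sketched as a simple least-squares fit. The values below are hypothetical placeholders, not the study's data:

```python
import numpy as np

# Hypothetical dry-season NOx column averages (arbitrary units), 2005-2017.
years = np.arange(2005, 2018)
nox = np.array([1.00, 0.99, 1.01, 0.98, 0.99, 0.97, 0.98,
                0.96, 0.97, 0.95, 0.96, 0.955, 0.955])

# Least-squares linear trend over the record.
slope, intercept = np.polyfit(years, nox, 1)

# Percent change between the fitted start and end values.
fitted_start = slope * years[0] + intercept
fitted_end = slope * years[-1] + intercept
pct_change = 100 * (fitted_end - fitted_start) / fitted_start
print(f"trend over {years[0]}-{years[-1]}: {pct_change:.1f}%")
```

A real analysis would work with gridded retrievals and control for seasonality, but a fitted percent change of this kind is the quantity behind the article's "4.5 percent decrease."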
Pollution
2021
February 8, 2021
https://www.sciencedaily.com/releases/2021/02/210208161936.htm
Cleaning Up the Mississippi River
Louisiana State University College of the Coast & Environment Boyd Professor R. Eugene Turner reconstructed a 100-year record chronicling water quality trends in the lower Mississippi River by compiling water quality data collected from 1901 to 2019 by federal and state agencies as well as the New Orleans Sewerage and Water Board. The Mississippi River is the largest river in North America, with about 30 million people living within its watershed. In this study, Turner focused on data that tracked the water's acidity through pH levels and concentrations of bacteria, oxygen, lead and sulfate.
Rivers have historically been used as disposal sites worldwide. From the polluted Cuyahoga River in Cleveland, Ohio, which infamously caught fire, to the Mississippi River, where dumped sewage raised lead concentrations and depleted oxygen, rivers were environmentally hazardous until the passage of the U.S. Clean Water Act in 1972. The Clean Water Act, as well as the Clean Air Act, the Toxic Substances Control Act and others, established a federal structure to reduce pollutant discharges into the environment and gave the Environmental Protection Agency the authority to restrict the amounts and uses of certain toxic chemicals such as lead. Turner's study assesses changes in water quality before and after the Clean Water Act and Clean Air Act went into effect. The water quality data he compiled were collected from four locations on the southern end of the Mississippi River: St. Francisville, Plaquemine, two locations in New Orleans and Belle Chasse, Louisiana. His research found that after these environmental policies were put into place, bacterial concentrations decreased by about three orders of magnitude, oxygen content increased, lead concentrations decreased and sulfate concentrations declined less dramatically. His research also found that as sulfur dioxide emissions peaked in 1965, the river's pH dropped to a low of 5.8. In the U.S., natural waters typically fall between pH 6.5 and 8.5, with 7.0 being neutral. As sulfur dioxide emissions declined, however, the river's pH had recovered to an average of 8.2 by 2019. "The promulgation and acceptance of the Clean Water Act and Clean Air Act demonstrates how public policy can change for the better and help everyone who is demonstrably 'downstream' in a world of cycling pollutants," Turner said. Consistent vigilance and monitoring are necessary to ensure water quality in the Mississippi River and northern Gulf of Mexico. 
Plastics fill the oceans, pharmaceuticals are distributed in sewage, and the COVID-19 virus and other viruses spread in partially treated sewage from aging septic tanks, wetland treatment systems with insufficient hydrologic controls, and overloaded treatment systems. New pollutants are added to the river each year, which will require monitoring and testing. Unfortunately, lead monitoring has stopped, but decades of sustained and effective efforts at a national scale created water quality improvements and are an example for addressing new and existing water quality challenges, Turner said.
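Because pH is a logarithm of hydrogen-ion concentration, the recovery from pH 5.8 to 8.2 reported above represents a large change in actual acidity. A quick check using the article's numbers:

```python
# pH = -log10([H+]), so each pH unit is a tenfold change in acidity.
ph_1965 = 5.8  # the river's low during peak sulfur dioxide emissions
ph_2019 = 8.2  # the restored average

h_1965 = 10 ** -ph_1965  # hydrogen-ion concentration, mol/L
h_2019 = 10 ** -ph_2019

acidity_ratio = h_1965 / h_2019  # equals 10**(8.2 - 5.8) = 10**2.4
print(f"the 1965 river was ~{acidity_ratio:.0f}x more acidic")
```

The "three orders of magnitude" drop in bacterial concentrations is the same kind of statement on a log scale: a factor of roughly 1,000.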
Pollution
2021
February 5, 2021
https://www.sciencedaily.com/releases/2021/02/210205121259.htm
Healthy oceans need healthy soundscapes
Rain falls lightly on the ocean's surface. Marine mammals chirp and squeal as they swim along. The pounding of surf along a distant shoreline heaves and thumps with metronomic regularity. These are the sounds that most of us associate with the marine environment. But the soundtrack of the healthy ocean no longer reflects the acoustic environment of today's ocean, plagued with human-created noise.
A global team of researchers set out to understand how human-made noise affects wildlife in the oceans, from invertebrates to whales, and found overwhelming evidence that marine fauna and their ecosystems are negatively impacted by noise. This noise disrupts their behavior, physiology and reproduction and, in extreme cases, causes mortality. The researchers call for human-induced noise to be considered a prevalent stressor at the global scale and for policy to be developed to mitigate its effects. The research was led by Professor Carlos M. Duarte, distinguished professor at King Abdullah University of Science and Technology (KAUST). "The landscape of sound -- or soundscape -- is such a powerful indicator of the health of an environment," noted Ben Halpern, a coauthor on the study and director of the National Center for Ecological Analysis and Synthesis at UC Santa Barbara. "Like we have done in our cities on land, we have replaced the sounds of nature throughout the ocean with those of humans." The deterioration of habitats such as coral reefs, seagrass meadows and kelp beds by overfishing, coastal development, climate change and other human pressures has further silenced the characteristic sound that guides the larvae of fish and other animals drifting at sea into finding and settling on their habitats. The call home is no longer audible for many ecosystems and regions. The Anthropocene marine environment, according to the researchers, is polluted by human-made sound and should be restored along its sonic dimension as well as the more traditional chemical and climatic ones. Yet current frameworks to improve ocean health ignore the need to mitigate noise as a prerequisite for a healthy ocean. Sound travels far, and quickly, underwater. And marine animals are sensitive to sound, which they use as a prominent sensory signal guiding all aspects of their behavior and ecology. 
"This makes the ocean soundscape one of the most important, and perhaps under-appreciated, aspects of the marine environment," the study states. The authors' hope is that the evidence presented in the paper will "prompt management actions ... to reduce noise levels in the ocean, thereby allowing marine animals to re-establish their use of ocean sound." "We all know that no one really wants to live right next to a freeway because of the constant noise," commented Halpern. "For animals in the ocean, it's like having a mega-freeway in your backyard." The team set out to document the impact of noise on marine animals and on marine ecosystems around the world. They assessed the evidence contained across more than 10,000 papers to consolidate compelling evidence that human-made noise impacts marine life from invertebrates to whales across multiple levels, from behavior to physiology. "This unprecedented effort, involving a major tour de force, has shown the overwhelming evidence for the prevalence of impacts from human-induced noise on marine animals, to the point that the urgency of taking action can no longer be ignored," KAUST Ph.D. student Michelle Havlik said. The research involved scientists from Saudi Arabia, Denmark, the U.S. and the U.K., Australia, New Zealand, the Netherlands, Germany, Spain, Norway and Canada. "The deep, dark ocean is conceived as a distant, remote ecosystem, even by marine scientists," Duarte said. "However, as I was listening, years ago, to a hydrophone recording acquired off the U.S. West Coast, I was surprised to hear the clear sound of rain falling on the surface as the dominant sound in the deep-sea ocean environment. I then realized how acoustically connected the ocean surface, where most human noise is generated, is to the deep sea; just 1,000 m, less than 1 second apart!" The takeaway of the review is that "mitigating the impacts of noise from human activities on marine life is key to achieving a healthier ocean." 
The KAUST-led study identifies a number of actions that may come at a cost but are relatively easy to implement to improve the ocean soundscape and, in so doing, enable the recovery of marine life and the goal of sustainable use of the ocean. For example, simple technological innovations are already reducing propeller noise from ships, and policy could accelerate their use in the shipping industry and spawn new innovations. Deploying these mitigation actions is low-hanging fruit because, unlike other forms of human pollution such as emissions of chemical pollutants and greenhouse gases, the effects of noise pollution cease as soon as the noise is reduced, so the benefits are immediate. The study points to the quick response of marine animals to the human lockdown under COVID-19 as evidence for the potential rapid recovery from noise pollution. Using sounds gathered from around the globe, multimedia artist and study coauthor Jana Winderen created a six-minute audio track that demonstrates both the peacefully calm and the devastatingly jarring acoustic aspects of life for marine animals. The research is truly eye-opening, or rather ear-opening, both in its groundbreaking scale and in its immediacy.
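Duarte's remark that the surface and the deep sea are "just 1,000 m, less than 1 second apart" follows directly from the speed of sound in seawater, roughly 1,500 m/s (the exact value varies with temperature, salinity and depth):

```python
SOUND_SPEED_SEAWATER = 1500.0  # m/s, a typical value (vs ~340 m/s in air)

depth_m = 1000.0
travel_time = depth_m / SOUND_SPEED_SEAWATER  # seconds
print(f"{travel_time:.2f} s")  # about two-thirds of a second
```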
Pollution
2021
February 5, 2021
https://www.sciencedaily.com/releases/2021/02/210205104243.htm
Sensor and detoxifier in one
Ozone is a problematic air pollutant that causes serious health problems. A newly developed material not only quickly and selectively indicates the presence of ozone, but also simultaneously renders the gas harmless, Chinese researchers report.
Ozone (O3) is a harmful pollutant. A team led by Zhenjie Zhang at Nankai University (Tianjin, China) set themselves the goal of developing a material that can both rapidly detect and efficiently remove ozone. Their approach uses materials known as covalent organic frameworks (COFs). COFs are two- or three-dimensional organic solids with extended porous crystalline structures; their components are bound together by strong covalent bonds. COFs can be tailored to many applications through the selection of different components. The researchers selected easily producible, highly crystalline COFs made of aromatic ring systems. The individual building blocks are bound through connective groups called imines (a nitrogen atom bound to a carbon atom by a double bond). These are at the center of the action. The imine COFs indicate the presence of ozone through a rapid color change from yellow to orange-red, which can be seen with the naked eye and registered by a spectrometer. Unlike many other detectors, the imine COF also works very reliably, sensitively, and efficiently at high humidity and over a wide temperature range. In the presence of water, the water molecules will preferentially bind to the imine groups; consequently, the researchers assert, a hydroxide ion (OH-) is involved in the response.
Pollution
2021
February 8, 2021
https://www.sciencedaily.com/releases/2021/02/210208145919.htm
MARLIT, artificial intelligence against marine litter
Floating sea macro-litter is a threat to the conservation of marine ecosystems worldwide. The largest density of floating litter is in the great ocean gyres -- systems of circular currents that spin and catch litter -- but the polluting waste is also abundant in coastal waters and semi-enclosed seas such as the Mediterranean.
MARLIT, an open access web app based on an algorithm designed with deep learning techniques, will enable the detection and quantification of floating plastics in the sea with a reliability of over 80%, according to a study published in the journal Environmental Pollution. This methodology results from the analysis, using artificial intelligence techniques, of more than 3,800 aerial images of the Mediterranean coast in Catalonia, and it will allow researchers to make progress in the assessment of the presence, density and distribution of plastic pollutants in the seas and oceans worldwide. Among the participants in the study are the experts of the Consolidated Research Group on Large Marine Vertebrates of the UB and IRBio, and the Research Group on Biostatistics and Bioinformatics (GRBIO) of the UB, integrated in the Bioinformatics Barcelona platform (BIB). Historically, direct observations (from boats, planes, etc.) have been the basis of the common methodology for assessing the impact of floating marine macro-litter (FMML). However, the vast area of the ocean and the volume of data involved make it hard for researchers to advance with monitoring studies. "Automatic aerial photography techniques combined with analytical algorithms are more efficient protocols for the control and study of this kind of pollutants," notes Odei Garcia-Garin, first author of the article and member of the CRG on Large Marine Vertebrates, led by Professor Àlex Aguilar. "However," he continues, "automated remote sensing of these materials is at an early stage. There are several factors in the ocean (waves, wind, clouds, etc.) that hinder the automatic detection of floating litter in aerial images of the marine surface. 
This is why there are only a few studies that have worked on algorithms to apply to this new research context." The experts designed a new algorithm to automate the quantification of floating plastics in the sea from aerial photographs by applying deep learning, a machine-learning methodology based on artificial neural networks that can learn progressively higher-level representations. "The great number of images of the marine surface obtained by drones and planes in monitoring campaigns on marine litter -- also in experimental studies with known floating objects -- enabled us to develop and test a new algorithm that reaches 80% precision in the remote sensing of floating marine macro-litter," notes Garcia-Garin, member of the Department of Evolutionary Biology, Ecology and Environmental Sciences of the UB and IRBio. The new algorithm has been implemented in MARLIT, an open access web app described in the article, which is available to all managers and professionals studying the detection and quantification of floating marine macro-litter with aerial images. In particular, this is a proof of concept based on an R Shiny package, a methodological innovation of great interest for speeding up the monitoring of floating marine macro-litter. MARLIT enables images to be analyzed individually, as well as divided into several segments according to the user's guidelines, to identify the presence of floating litter in each area and estimate its density from the image metadata (height, resolution). In the future, the researchers expect to adapt the app to a remote sensor (for instance, a drone) to automate the remote sensing process. At a European level, the EU Marine Strategy Framework Directive calls for the application of FMML monitoring techniques to support continuous assessment of the environmental state of the marine environment. 
"Therefore, the automatization of monitoring processes and the use of apps such as MARLIT would ease the member states' fulfilment of the directive," conclude the authors of the study.
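The density estimate MARLIT derives "with the image metadata (height, resolution)" amounts to dividing a litter count by the ground footprint the image covers. A toy sketch with a hypothetical function and made-up camera parameters (the app's actual interface is not documented here):

```python
import math

def litter_density_per_km2(n_items, altitude_m, hfov_deg, vfov_deg):
    """Items per square kilometer, from a detection count and the footprint
    of a downward-pointing camera at the given altitude and field of view."""
    width_m = 2 * altitude_m * math.tan(math.radians(hfov_deg / 2))
    height_m = 2 * altitude_m * math.tan(math.radians(vfov_deg / 2))
    area_km2 = (width_m * height_m) / 1e6
    return n_items / area_km2

# e.g. 3 detected items in one image taken from 120 m altitude
density = litter_density_per_km2(3, 120.0, 70.0, 50.0)
print(f"{density:.0f} items/km^2")
```

The footprint grows with the square of altitude, which is why the same count implies a much lower density in imagery taken from a plane than from a low-flying drone.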
Pollution
2021
February 4, 2021
https://www.sciencedaily.com/releases/2021/02/210204135742.htm
How air pollution may increase the risk of cardiovascular disease
Tiny particles of air pollution -- called fine particulate matter -- can have a range of effects on health, and exposure to high levels is a known risk factor for cardiovascular disease. New research led by investigators at Massachusetts General Hospital (MGH) reveals that fine particulate matter has a detrimental impact on cardiovascular health by activating the production of inflammatory cells in the bone marrow, ultimately leading to inflammation of the arteries. The findings have now been published.
The retrospective study included 503 patients without cardiovascular disease or cancer who had undergone imaging tests at MGH for various medical reasons. The scientists estimated participants' annual average fine particulate matter levels using data obtained from the U.S. Environmental Protection Agency's air quality monitors located closest to each participant's residential address. Over a median follow-up of 4.1 years, 40 individuals experienced major cardiovascular events, such as heart attacks and strokes, with the highest risk seen in participants with higher levels of fine particulate matter at their home address. Their risk was elevated even after accounting for cardiovascular risk factors, socioeconomic factors, and other key confounders. Imaging tests assessing the state of internal organs and tissues showed that these participants also had higher bone marrow activity, indicating a heightened production of inflammatory cells (a process called leukopoiesis), and elevated inflammation of the arteries. Additional analyses revealed that leukopoiesis in response to air pollution exposure is a trigger that causes arterial inflammation. "The pathway linking air pollution exposure to cardiovascular events through higher bone marrow activity and arterial inflammation accounted for 29% of the relationship between air pollution and cardiovascular disease events," says co-first author Shady Abohashem, MD, a cardiovascular imaging fellow at MGH. "These findings implicate air pollution exposure as an underrecognized risk factor for cardiovascular disease and suggest therapeutic targets beyond pollution mitigation to lessen the cardiovascular impact of air pollution exposure." Co-first author Michael Osborne, MD, a cardiologist at MGH, explains that therapies targeting increased inflammation following exposure to fine particulate matter may benefit patients who cannot avoid air pollution. 
"Importantly, most of the population studied had air pollution exposures well below the unhealthy thresholds established by the World Health Organization, suggesting that no level of air pollution can truly be considered safe," he says. This work was supported by the National Institutes of Health.
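The "29% of the relationship" figure is a mediation estimate: the fraction of the total pollution-to-events effect that flows through the bone-marrow/arterial-inflammation pathway. A minimal sketch with made-up effect sizes (not the study's estimates):

```python
def proportion_mediated(indirect_effect, total_effect):
    """Share of a total effect explained by a mediating pathway."""
    return indirect_effect / total_effect

# Illustrative numbers on an arbitrary effect scale:
total = 0.45     # total effect of exposure on cardiovascular events
indirect = 0.13  # part transmitted via leukopoiesis -> arterial inflammation

share = proportion_mediated(indirect, total)
print(f"proportion mediated: {share:.0%}")  # ~29% with these numbers
```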
Pollution
2021
February 3, 2021
https://www.sciencedaily.com/releases/2021/02/210203123451.htm
Extreme UV laser shows generation of atmospheric pollutant
Hokkaido University scientists show that under laboratory conditions, ultraviolet light reacts with nitrophenol to produce smog-generating nitrous acid.
An advanced laser technique has allowed researchers to observe, in real time, the decomposition of a pollutant into atmospheric nitrous acid, which plays a key role in the formation of ozone and photochemical smog. The technique was developed by Hokkaido University researchers. Nitrophenols are a type of fine particulate matter found in the atmosphere that form as a result of fossil fuel combustion and from forest fires. It is hypothesised that light interacts with nitrophenols and breaks them down into nitrous acid; atmospheric nitrous acid is known to generate the hydroxyl radicals responsible for ozone formation. Too much ozone and nitrogen oxides lead to the formation of an atmospheric haze called photochemical smog, which can cause respiratory illnesses. Until now, there has been no evidence for the decomposition of nitrophenol into nitrous acid by sunlight. Hokkaido University applied physicist Taro Sekikawa and colleagues developed a new probing technique to observe the process in real time. They then compared their measurements with theoretical quantum chemistry calculations. "Our study showed that irradiation of o-nitrophenol with sunlight is one of the direct causes of nitrous acid production in the atmosphere," says Sekikawa. The team developed an advanced laser technique that involves exciting nitrophenol with 400 nanometer-wavelength laser light and then shining very short, very fast pulses of ultraviolet light on it to see what happens. Specifically, they used extreme UV light, which has very short wavelengths, delivered in femtosecond pulses; a femtosecond is a millionth of a billionth of a second. The technique measures the energy states and molecular changes that occur as the nitrophenol compound decomposes over time. The scientists found that nitrous acid begins to form 374 femtoseconds after the nitrophenol is first excited by light. 
The decomposition process involves distortion of the shape of the nitrophenol molecule by light irradiation and changes in its energy states, ultimately leading to the formation of nitrous acid. "Photoelectron spectroscopy with extreme ultraviolet light is expected to have a wide range of applications as a method for measuring chemical reactions," says Sekikawa. "It could be used, for example, to understand the mechanism by which ultraviolet rays inactivate viruses at the molecular level, and to understand other chemical reactions that take place in the atmosphere."
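Resolving an onset time of 374 femtoseconds is possible because pump-probe delays are set optically: light covers about 0.3 micrometers per femtosecond, so micrometer-scale changes in the probe path control the timing. (This describes pump-probe practice in general, not a confirmed detail of this particular setup.)

```python
C = 299_792_458.0  # speed of light, m/s

delay_s = 374e-15  # the observed onset of nitrous acid formation

# Extra optical path needed to delay the probe pulse by 374 fs:
path_difference_m = C * delay_s

# A retroreflecting delay stage adds twice its travel to the path,
# so the required stage movement is half the path difference.
stage_travel_um = path_difference_m / 2 * 1e6
print(f"{stage_travel_um:.1f} micrometers of stage travel")
```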
Pollution
2021
February 2, 2021
https://www.sciencedaily.com/releases/2021/02/210202164535.htm
COVID-19 lockdowns temporarily raised global temperatures, research shows
The lockdowns and reduced societal activity related to the COVID-19 pandemic affected emissions of pollutants in ways that slightly warmed the planet for several months last year, according to new research led by the National Center for Atmospheric Research (NCAR).
The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight. When emissions of aerosols dropped last spring, more of the Sun's warmth reached the planet, especially in heavily industrialized nations, such as the United States and Russia, that normally pump high amounts of aerosols into the atmosphere. "There was a big decline in emissions from the most polluting industries, and that had immediate, short-term effects on temperatures," said NCAR scientist Andrew Gettelman, the study's lead author. "Pollution cools the planet, so it makes sense that pollution reductions would warm the planet." Temperatures over parts of Earth's land surface last spring were about 0.2-0.5 degrees Fahrenheit (0.1-0.3 degrees Celsius) warmer than would have been expected with prevailing weather conditions, the study found. The effect was most pronounced in regions that normally are associated with substantial emissions of aerosols, with the warming reaching about 0.7 degrees F (0.37 C) over much of the United States and Russia. The new study highlights the complex and often conflicting influences of different types of emissions from power plants, motor vehicles, industrial facilities, and other sources. While aerosols tend to brighten clouds and reflect heat from the Sun back into space, carbon dioxide and other greenhouse gases have the opposite effect, trapping heat near the planet's surface and elevating temperatures. Despite the short-term warming effects, Gettelman emphasized that the long-term impact of the pandemic may be to slightly slow climate change because of reduced emissions of carbon dioxide, which lingers in the atmosphere for decades and has a more gradual influence on climate. 
In contrast, aerosols -- the focus of the new study -- have a more immediate impact that fades away within a few years. The study has now been published. Although scientists have long been able to quantify the warming impacts of carbon dioxide, the climatic influence of various types of aerosols -- including sulfates, nitrates, black carbon, and dust -- has been more difficult to pin down. One of the major challenges for projecting the extent of future climate change is estimating the extent to which society will continue to emit aerosols in the future and the influence of the different types of aerosols on clouds and temperature. To conduct the research, Gettelman and his co-authors used two of the world's leading climate models: the NCAR-based Community Earth System Model and a model known as ECHAM-HAMMOZ, which was developed by a consortium of European nations. They ran simulations on both models, adjusting emissions of aerosols and incorporating actual meteorological conditions in 2020, such as winds. This approach enabled them to identify the impact of reduced emissions on temperature changes that were too small to tease out in actual observations, where they could be obscured by the variability in atmospheric conditions. The results showed that the warming effect was strongest in the mid and upper latitudes of the Northern Hemisphere. The effect was mixed in the tropics and comparatively minor in much of the Southern Hemisphere, where aerosol emissions are not as pervasive. Gettelman said the study will help scientists better understand the influence of various types of aerosols in different atmospheric conditions, helping to inform efforts to minimize climate change. Although the research illustrates how aerosols counter the warming influence of greenhouse gases, he emphasized that emitting more of them into the lower atmosphere is not a viable strategy for slowing climate change. "Aerosol emissions have major health ramifications," he said. 
"Saying we should pollute is not practical."
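The paired-simulation approach described above -- same 2020 meteorology, different aerosol emissions, then difference the runs -- is what isolates a signal too small to see in raw observations. A toy illustration with synthetic temperature fields (not model output):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two runs on a coarse lat-lon grid sharing the same weather "noise":
# a baseline with normal aerosol emissions, and a reduced-emissions run
# that is warmer by a small imposed signal.
weather = 288.0 + rng.normal(0.0, 0.5, size=(90, 180))  # shared field, kelvin
baseline = weather
reduced = weather + 0.1 + rng.normal(0.0, 0.02, size=(90, 180))

# Differencing the paired runs cancels the shared variability, leaving
# the ~0.1 K emissions signal that would be invisible in either run alone.
anomaly = reduced - baseline
print(f"mean warming: {anomaly.mean():.2f} K")
```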
Pollution
2021
February 2, 2021
https://www.sciencedaily.com/releases/2021/02/210202164445.htm
Sea ice kept oxygen from reaching deep ocean during last ice age
Extensive sea ice covered the world's oceans during the last ice age, which prevented oxygen from penetrating into the deep ocean waters, complicating the relationship between oxygen and carbon, a new study has found.
"The sea ice is effectively like a closed window for the ocean," said Andreas Schmittner, a climate scientist at Oregon State University and co-author on the paper. "The closed window keeps fresh air out; the sea ice acted as a barrier to keep oxygen from entering the ocean, like stale air in a room full of people. If you open the window, oxygen from outside can come in and the air is not as stale." The findings were published recently. The ocean plays an important role in the carbon cycle; carbon dioxide from the atmosphere dissolves in surface waters, where algae turn the carbon into organic matter. Respiration of that organic matter removes oxygen as carbon sinks to the deep ocean. The process of transferring carbon from the surface of the ocean to the deep is known as the biological pump. Currently, the ocean is losing oxygen, and that trend is expected to continue because the solubility of oxygen decreases as temperatures warm. That would lead scientists to expect higher oxygen concentrations during the last ice age, when oceans were colder, Schmittner said, but sediment collected previously from below the sea floor shows lower oxygen levels in the deep ocean during that period. Researchers have previously theorized that the biological pump was stronger during the last ice age, increasing carbon respiration. But that assumes that surface ocean oxygen is equilibrated with the atmosphere, Schmittner said. In their new work, Schmittner and his colleagues, Ellen Cliff and Samar Khatiwala of the University of Oxford, used modeling to investigate the lower oxygen levels in the deep ocean. They found that disequilibrium played an important role in the carbon cycle. Deep ocean oxygen concentrations were reduced because surface waters were less equilibrated with the atmosphere. 
The disequilibrium was a result of the vast sea ice mainly over the Southern Ocean, as well as higher iron fertilization from the ice age atmosphere, which was dustier, Schmittner said. That means the deep ocean oxygen levels are informed not just by the biological pump process, but also by the sea ice, or lack thereof, just as a room's air quality may change with the opening or closing of a window, he said. The researchers' method for understanding the role of sea ice and other processes in the ocean carbon and oxygen cycles could also be applied to future climate modeling, Schmittner said. "Current models cannot separate effects from the biological pump on ocean oxygen from sea ice or other influences," he said. "This changes our understanding of the process and the reasons for those changes."
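Schmittner's window analogy can be captured in a toy box model: surface water relaxes toward atmospheric oxygen saturation, but uptake is throttled by the ice-free fraction of the surface. This is purely illustrative, with invented numbers, not the study's model:

```python
def equilibrate(o2, o2_sat, open_fraction, rate=0.2, steps=50):
    """Relax surface-ocean O2 toward saturation; exchange with the
    atmosphere only happens through the ice-free ("open window") fraction."""
    for _ in range(steps):
        o2 += rate * open_fraction * (o2_sat - o2)
    return o2

o2_sat = 350.0    # hypothetical saturation concentration, umol/kg
upwelled = 250.0  # undersaturated water newly arrived at the surface

modern = equilibrate(upwelled, o2_sat, open_fraction=0.9)
ice_age = equilibrate(upwelled, o2_sat, open_fraction=0.1)

# Under extensive ice, water sinks again before reaching saturation,
# carrying a larger oxygen deficit (disequilibrium) into the deep ocean.
print(f"modern: {modern:.0f} umol/kg, ice age: {ice_age:.0f} umol/kg")
```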
Pollution
2021
February 1, 2021
https://www.sciencedaily.com/releases/2021/02/210201115954.htm
Failed storage tanks pose atmospheric risks during disasters
When aboveground storage tanks fail during a storm and their toxic contents spread, the threat to human health can and probably will flow downwind of the immediate area.
Rice University engineers have developed a model to quantify what could happen when a hurricane or other natural disaster causes such damage, based on data gathered from the Houston Ship Channel, the largest petrochemical complex in the United States, during and after two hurricanes: Ike in 2008 and Harvey in 2017. Pollutants like toxic organic chemicals evaporate from spills and can be carried a long way from the site by the wind, depending on the storm's characteristics. The computational model, according to atmospheric scientist Rob Griffin of Rice's Brown School of Engineering, uses real data from the two storms as a proof of concept. The model is available upon request to help researchers predict the "what if" consequences of future storms that threaten storage tanks or chemical spills in general. "We first had to understand the surge models, and that takes time after a storm," Griffin said. "This paper combines the hydrology from those storm scenarios with structural fragility models, leading to predicting the atmospheric consequences." The study has now been published. The model follows the hypothetical fate of toxins like organic solvents, including benzene and toluene, and their reaction products as they drift with the wind for up to 12 hours after a spill, up to about 5,000 feet. The pollutants pose a further threat as they evolve into secondary toxins within the downwind plumes, according to the researchers. The model predicted downwind oil plumes would cover a broader region than organic solvent plumes, which would remain concentrated along the path of the prevailing wind, according to the study. 
It showed substantial formation of ozone and secondary organic aerosols forming in the solvent plumes, depending on other factors like sunlight and background pollutants. The researchers noted models could provide the only means to estimate the spread of pollutants that threaten the population downwind of a spill if a storm knocks out air quality monitoring systems in its path. A 2015 study by Rice civil and environmental engineer Jamie Padgett and alumnus Sabarethinam Kameshwar, now an assistant professor of civil and environmental engineering at Louisiana State University, predicted a percentage of storage tanks would fail if a Category 4 or greater hurricane hit the Houston Ship Channel, either lifting them off their foundations, crushing them or penetrating them with debris. They estimated a 24-foot storm surge could release 90 million gallons or more of oil and hazardous substances. That study was a jumping-off point for Griffin, Padgett and Phil Bedient, director of Rice's Severe Storm Prediction, Education and Evacuation from Disasters (SSPEED) Center, along with Hanadi Rifai, a Rice alumna and environmental engineer at the University of Houston, to model how such a spill would spread pollutants through the atmosphere. "The earlier study predicted the (surface-bound) plume in the ship channel if this spill were to happen," Griffin said. "It talked about potential exposure to the environment and damage to the channel itself, but as an atmospheric chemist, I thought, that stuff's not going to just sit there. It's going to evaporate." One model, based on conditions during and after Hurricane Ike, showed a diesel plume from a single tank spill would expand slowly for the first six hours to cover about 42 square kilometers, but then expand rapidly to cover 500 square kilometers after nine hours. 
The swirling winds would have kept the plume within Texas, the model shows. But travel downwind would have been significantly different during Harvey, for which the model showed a narrower and more concentrated plume directed by the wind straight into the Gulf of Mexico. Models of the fate of evaporated toluene and benzene during Ike showed the plumes tracked with the storm's eastward path, with levels dangerous to human health most likely at the center of the plume within the first minutes of a tank failure. These would pose a risk to workers and communities near the spill, but the chemicals' concentrations would decrease rapidly further downwind. "I don't think the information about what's in the tanks is public knowledge, so we had to make some assumptions about what would spill if there were a spill," Griffin said. "But it's valuable to think about what would happen to those chemicals once they're in the atmosphere. The same results could be just as applicable to something like the Deepwater Horizon. Once that material reached the surface of the ocean, what happens to it when it evaporates?" "I would love to see some of the owners of these tanks use this to look at their structures," he said. "I can imagine folks like the Environmental Defense Fund or other advocates picking up on the study as well." Griffin noted the model isn't set up to allow a company to predict the effects of a single tank failure. "It's a more general model of a region with tanks that are likely to fail, based on real situations that happened," he said. "But we can make the chemistry and atmospheric code available. If others want to study a given storm situation and a given leaking tank, some significant legwork would need to go into that."
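The kind of downwind dispersion described above is often introduced via the textbook Gaussian plume equation. The sketch below is not the Rice model -- it is a generic, illustrative plume calculation with assumed emission rate, wind speed, and dispersion-coefficient fits -- but it shows why concentrations are highest near the source and fall off rapidly downwind, as the toluene and benzene results describe.

```python
import math

def gaussian_plume(q, u, x, y, z, h=0.0):
    """Steady-state Gaussian plume concentration (g/m^3).

    q: emission rate (g/s), u: wind speed (m/s),
    x: downwind distance, y: crosswind offset, z: height, h: release height (m).
    The sigma formulas are simple illustrative power-law fits,
    not site- or stability-specific coefficients.
    """
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # crosswind spread (m)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)   # vertical spread (m)
    return (q / (2 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * (math.exp(-(z - h)**2 / (2 * sigma_z**2))
               + math.exp(-(z + h)**2 / (2 * sigma_z**2))))

# Centerline ground-level concentration falls off with downwind distance:
for x in (500.0, 2000.0, 5000.0):
    print(x, gaussian_plume(q=100.0, u=5.0, x=x, y=0.0, z=0.0))
```

Because both spread parameters grow with distance, the centerline concentration decreases monotonically downwind, consistent with the article's point that risk is greatest near the spill in the first minutes after tank failure.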
Pollution
2021
February 1, 2021
https://www.sciencedaily.com/releases/2021/02/210201103022.htm
Salt battery design overcomes bump in the road to help electric cars go the extra mile
Using salt as a key ingredient, Chinese and British researchers have designed a new type of rechargeable battery that could accelerate the shift to greener, electric transport on our roads.
Many electric vehicles (EVs) are powered by rechargeable lithium-ion batteries, but these can lose energy and power over time. Under certain conditions, such batteries can also overheat while working or charging, which can degrade battery life and reduce miles per charge. To solve these issues, the University of Nottingham is collaborating with six scientific research institutes across China to develop an innovative and affordable energy store with the combined performance merits of a solid-oxide fuel cell and a metal-air battery. The new battery could significantly extend the range of electric vehicles, while being fully recyclable, environmentally friendly, low-cost and safe. A solid-oxide fuel cell converts hydrogen and oxygen into electricity through a chemical reaction. While such cells are highly efficient at extracting energy from a fuel, durable, low-cost and greener to produce, they are not rechargeable. Meanwhile, metal-air batteries are electrochemical cells that use a cheap metal such as iron and the oxygen present in air to generate electricity. During charging, they emit only oxygen into the atmosphere. Although not very durable, these energy-dense batteries are rechargeable and can store and discharge as much electricity as lithium-ion batteries, but much more safely and cheaply. In the early research phases, the team explored a high-temperature, iron-air battery design that used molten salt as a type of electrolyte -- activated by heat -- for electrical conductivity. Cheap and non-flammable, molten salts help to give a battery impressive energy storage and power capability and a lengthy lifecycle. However, molten salts also possess adverse characteristics. University of Nottingham study lead Professor George Chen said: "In extreme heat, molten salt can be aggressively corrosive and volatile, and can evaporate or leak, which is challenging to the safety and stability of battery design. 
There was an urgent need to fine-tune these electrolyte characteristics for better battery performance and to enable its future use in electric transport." The researchers have now successfully improved the technology by turning the molten salt into a soft-solid salt, using solid oxide nano-powders. Professor Jianqiang Wang, from the Shanghai Institute of Applied Physics, Chinese Academy of Sciences, who is leading this collaborative project, has predicted that this quasi-solid-state (QSS) electrolyte is suitable for metal-air batteries operating at 800 °C, as it suppresses the evaporation and fluidity of the molten salts that can occur at such high operating temperatures. Project collaborator Dr Cheng Peng, also from the Shanghai Institute of Applied Physics, Chinese Academy of Sciences, explains a unique and useful design aspect of this experimental research: the quasi-solidification has been achieved using nanotechnology to construct a flexibly connected network of solid oxide particles that acts as a structural barrier, locking in the molten salt electrolytes while still allowing them to safely conduct electricity in extreme heat. Professor Chen, who leads a molten salt electrolysis laboratory in Nottingham, hopes the team's "encouraging results" will help to establish a simpler and more efficient approach to designing low-cost, high-performance molten salt metal-air batteries with high stability and safety. He adds: "The modified molten salt iron-oxygen battery has great potential applications in new markets, including electric transport and renewable energy, which require innovative storage solutions in our homes and at grid level. The battery is also, in principle, capable of storing solar heat as well as electricity, which is highly desirable for both domestic and industrial energy needs. 
Molten salts are currently used at large scale in Spain and China to capture and store solar heat which is then converted to electricity -- our molten salt metal air battery does the two jobs in one device."
Pollution
2021
January 28, 2021
https://www.sciencedaily.com/releases/2021/01/210128155643.htm
Aerosol particles cool the climate less than we thought
The impact of atmospheric aerosols on clouds and climate may be different than previously thought. That is the conclusion of cloud researcher Franziska Glassmeier from TU Delft. The results of her study will be published in
Cloud decks cover vast stretches of the subtropical oceans. They cool our planet because they reflect incoming sunlight back to space. Air pollution in the form of aerosols -- particles suspended in the atmosphere -- can increase this cooling effect because it makes clouds brighter. The cooling effect of pollution offsets part of the warming effect of greenhouse gases. How much exactly is one of the largest uncertainties faced by climate scientists. A striking illustration of clouds becoming brighter as a result of aerosols is provided by shipping emissions in the form of 'ship tracks'. These are visible as bright lines within a cloud deck that reveal the paths of polluting ships that travel beneath the clouds. 'Such ship tracks are a good example of how aerosol effects on clouds are traditionally thought of, and of how they are still represented in most climate models', says Glassmeier. But according to the cloud researcher, ship tracks do not tell the whole story. 'The problem is that the clouds get brighter at first, but after a while they start to get thinner and thus less bright again. And ship tracks disappear before we can observe this dimming effect.' To figure out the climate effect of air pollution in general, which is much more persistent than fleeting ship tracks, Glassmeier and her colleagues did not rely on ship-track observations. Instead, they created an extensive data set of detailed cloud simulations. At the heart of their study, the researchers designed a clever new way to compare their simulated cloud decks to satellite snapshots. Such snapshots contain information about aerosol effects on clouds all over the globe, but have so far been hard to interpret. 'Our conclusion is that the cooling effect of aerosols on clouds is overestimated when we rely on ship-track data', says Glassmeier. 'Ship tracks are simply too short-lived to provide the correct estimate of cloud brightening.' 
The reason for this is that ship-track data don't account for the reduced cloud thickness that occurs in widespread pollution. 'To properly quantify these effects and get better climate projections, we need to improve the way clouds are represented in climate models', Glassmeier explains further. The study also has implications in the context of climate engineering. Climate engineering denotes targeted, so far mostly hypothetical, interventions in the climate system with the goal of alleviating the consequences of climate change. One example of a climate engineering method is the deliberate brightening of clouds through targeted emissions of sea salt aerosols, known as marine cloud brightening. 'Our results show that even in terms of cloud physics, marine cloud brightening may not be as straightforward as it may seem. A naïve implementation could even result in cloud darkening and the opposite of what was intended', says Glassmeier. 'We certainly have to do a lot more research on the feasibility and risks of such methods. There is still a lot to learn about how these tiny aerosol particles influence clouds and eventually climate.' The study was performed in collaboration with researchers from the Ludwig Maximilian University of Munich, the University of Leeds, the University of Colorado, and the National Oceanic and Atmospheric Administration (NOAA) in Boulder, Colorado, USA.
Pollution
2021
January 28, 2021
https://www.sciencedaily.com/releases/2021/01/210128155612.htm
Marine heatwaves becoming more intense, more frequent
When thick, the surface layer of the ocean acts as a buffer against extreme marine heating -- but a new study from the University of Colorado Boulder shows this "mixed layer" is becoming shallower each year. The thinner it becomes, the easier it is to warm. The new work could explain recent extreme marine heatwaves, and points to a future of more frequent and destructive ocean warming events as global temperatures continue to climb.
"Marine heatwaves will be more intense and happen more often in the future," said Dillon Amaya, a CIRES Visiting Fellow and lead author on the study out this week. The mixed layer -- the water in which temperature remains consistent -- blankets the top 20-200 meters of the ocean. Its thickness is responsible for heat events: the thicker it is, the more the layer can act as a buffer to shield the waters below from incoming hot air. But as this armor thins, the mixed layer becomes more susceptible to rapid swings in temperature. "Think of the mixed layer as boiling a pot of water," said Amaya. "It will take no time at all for an inch of water to come to a boil, but much longer for a pot filled to the brim to heat through." Amaya and his team used a combination of ocean observations and models to estimate the depth of the mixed layer back to 1980, and also to project it into the future. They determined that over the last 40 years, the layer has thinned by nearly 3 meters (9 feet) in some regions of the North Pacific. And by 2100, the mixed layer will be 4 meters (12 feet) thinner -- 30 percent less than what it is today. This thinner mixed layer, combined with warmer global temperatures, will set the stage for drastic swings in ocean temperatures, leading to much more frequent and extreme heating events, the researchers say. And it's already happening. Take the 2019 heatwave in the Northeast Pacific. Weakened winds and higher air temperatures came together to warm Pacific Ocean waters by about 3 degrees C (5.5 F). A thinning mixed layer most likely contributed to this surge of warm waters, the authors found. And it will get worse. "If you take the same wind and ocean conditions that occurred in 2019 and you apply them to the estimated mixed layer in 2100, you get a marine heatwave that is 6.5 degrees C (12 F) warmer than what we saw in 2019," said Amaya. "An event like that would absolutely devastate sensitive marine ecosystems along the U.S. 
west coast." Amaya also points out that, as the climate continues to warm and the mixed layer continues to thin, scientists might start to lose the ability to predict year-to-year ocean surface temperatures. Without the ability to accurately forecast ocean temperatures, fisheries and other coastal operations could be in danger. Other studies also suggest marine heatwaves will become more common in the future, but not many have explored the root cause: ocean dynamics and physics. "In order to simulate these events in models and help predict them, we must understand the physics of why that's happening," said Amaya.
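Amaya's pot-of-water analogy can be made quantitative: for a fixed net surface heat flux, the temperature rise of the mixed layer is inversely proportional to its depth (ΔT = Q·t / (ρ·c·d)). The heat-flux anomaly, duration, and layer depths below are illustrative assumptions, not values from the study.

```python
RHO = 1025.0   # seawater density, kg/m^3
CP = 3850.0    # specific heat of seawater, J/(kg K)

def mixed_layer_warming(heat_flux_w_m2, days, depth_m):
    """Temperature rise (K) of a mixed layer of given depth absorbing
    a constant net surface heat flux for the given number of days."""
    joules_per_m2 = heat_flux_w_m2 * days * 86400.0
    return joules_per_m2 / (RHO * CP * depth_m)

# Same anomalous 50 W/m^2 flux for 30 days, two layer depths:
deep = mixed_layer_warming(50.0, 30, 40.0)   # a ~40 m mixed layer
thin = mixed_layer_warming(50.0, 30, 36.0)   # the same layer thinned by ~4 m
print(deep, thin)
```

The thinner layer warms more for the same forcing, which is the mechanism the article describes for more intense future marine heatwaves.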
Pollution
2021
January 27, 2021
https://www.sciencedaily.com/releases/2021/01/210127122410.htm
Carbon: Getting to net zero -- and even net negative -- is surprisingly feasible, and affordable
Reaching zero net emissions of carbon dioxide from energy and industry by 2050 can be accomplished by rebuilding U.S. energy infrastructure to run primarily on renewable energy, at a net cost of about $1 per person per day, according to new research published by the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), the University of San Francisco (USF), and the consulting firm Evolved Energy Research.
The researchers created a detailed model of the entire U.S. energy and industrial system to produce the first detailed, peer-reviewed study of how to achieve carbon neutrality by 2050. According to the Intergovernmental Panel on Climate Change (IPCC), the world must reach zero net CO2 emissions. The researchers developed multiple feasible technology pathways that differ widely in remaining fossil fuel use, land use, consumer adoption, nuclear energy, and bio-based fuel use, but share a key set of strategies. "By methodically increasing energy efficiency, switching to electric technologies, utilizing clean electricity (especially wind and solar power), and deploying a small amount of carbon capture technology, the United States can reach zero emissions," the authors write in "Carbon Neutral Pathways for the United States," published recently in the scientific journal. "The decarbonization of the U.S. energy system is fundamentally an infrastructure transformation," said Berkeley Lab senior scientist Margaret Torn, one of the study's lead authors. "It means that by 2050 we need to build many gigawatts of wind and solar power plants, new transmission lines, a fleet of electric cars and light trucks, millions of heat pumps to replace conventional furnaces and water heaters, and more energy-efficient buildings -- while continuing to research and innovate new technologies." In this transition, very little infrastructure would need "early retirement," or replacement before the end of its economic life. "No one is asking consumers to switch out their brand-new car for an electric vehicle," Torn said. "The point is that efficient, low-carbon technologies need to be used when it comes time to replace the current equipment." The pathways studied have net costs ranging from 0.2% to 1.2% of GDP, with higher costs resulting from certain tradeoffs, such as limiting the amount of land given to solar and wind farms. 
In the lowest-cost pathways, about 90% of electricity generation comes from wind and solar. One scenario showed that the U.S. can meet all its energy needs with 100% renewable energy (solar, wind, and bioenergy), but it would cost more and require greater land use. "We were pleasantly surprised that the cost of the transformation is lower now than in similar studies we did five years ago, even though this achieves much more ambitious carbon reduction," said Torn. "The main reason is that the cost of wind and solar power and of batteries for electric vehicles has declined faster than expected." The scenarios were generated using new energy models complete with details of both energy consumption and production -- such as the entire U.S. building stock, vehicle fleet, power plants, and more -- for 16 geographic regions in the U.S. Costs were calculated using projections for fossil fuel and renewable energy prices from the DOE Annual Energy Outlook and the NREL Annual Technology Baseline report. The cost figures would be lower still if they included the economic and climate benefits of decarbonizing our energy systems. For example, less reliance on oil will mean less money spent on oil and less economic uncertainty due to oil price fluctuations. Climate benefits include the avoided impacts of climate change, such as extreme droughts and hurricanes, avoided air and water pollution from fossil fuel combustion, and improved public health. The economic costs of the scenarios are almost exclusively capital costs from building new infrastructure. But Torn points out there is an economic upside to that spending: "All that infrastructure build equates to jobs, and potentially jobs in the U.S., as opposed to sending money overseas to buy oil from other countries. 
There's no question that there will need to be a well-thought-out economic transition strategy for fossil fuel-based industries and communities, but there's also no question that there are a lot of jobs in building a low-carbon economy." An important finding of this study is that the actions required in the next 10 years are similar regardless of long-term differences between pathways. In the near term, we need to increase generation and transmission of renewable energy, make sure all new infrastructure, such as cars and buildings, is low carbon, and maintain current natural gas capacity for now for reliability. "This is a very important finding. We don't need to have a big battle now over questions like the near-term construction of nuclear power plants, because new nuclear is not required in the next ten years to be on a net-zero emissions path. Instead we should make policy to drive the steps that we know are required now, while accelerating R&D and further developing our options for the choices we must make starting in the 2030s," said study lead author Jim Williams, associate professor of Energy Systems Management at USF and a Berkeley Lab affiliate scientist. Another important achievement of this study is that it is the first published work to give a detailed roadmap of how the U.S. energy and industrial system can become a source of negative CO2 emissions. According to the study, with higher levels of carbon capture, biofuels, and electric fuels, the U.S. energy and industrial system could be "net negative" to the tune of 500 metric tons of CO2 when combined with increasing CO2 uptake by the land. The study was supported in part by the Sustainable Development Solutions Network, an initiative of the United Nations.
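As a rough consistency check, the headline figure of "about $1 per person per day" can be compared with the 0.2%-1.2%-of-GDP range quoted above. The per-capita GDP value below is an assumed round number for the U.S., not a figure from the study.

```python
gdp_per_capita = 65_000          # USD per person per year (assumed round figure)
cost_per_person_per_day = 1.0    # headline net cost from the study, USD

annual_cost = cost_per_person_per_day * 365      # USD per person per year
share_of_gdp = annual_cost / gdp_per_capita      # fraction of per-capita GDP
print(f"{share_of_gdp:.2%}")
```

Under these assumptions the $1-a-day figure works out to roughly half a percent of GDP, which sits comfortably inside the study's 0.2%-1.2% range.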
Pollution
2021
January 27, 2021
https://www.sciencedaily.com/releases/2021/01/210127122338.htm
More than just CO2: It's time to tackle short-lived climate-forcing pollutants
Climate change mitigation is about more than just CO2.
It is common practice in climate policy to bundle the climate-warming pollutants together and express their total effects in terms of "CO2-equivalents." One of the more pernicious qualities of CO2 is its very long atmospheric lifetime, which short-lived climate-forcing pollutants (SLCPs) do not share. Measures to reduce SLCP emissions could be implemented using existing technologies and practices, such as the collection of landfill gas to generate energy. Changes in other sectors will be needed to achieve further reductions. Methane and soot emissions from the agriculture and waste management sectors, for example, have important climate as well as health impacts. Similarly, hydrofluorocarbons (HFCs) are significantly more potent climate forcers than CO2. Mar argues that clear communication on the different time horizons relevant for CO2 and for SLCPs is needed. Recognizing the broader benefits of SLCP reductions, several countries have stepped up and made SLCP mitigation a central element of their national climate strategies. Chile, Mexico, and Nigeria all include SLCPs in their national commitments under the Paris Agreement. If this type of holistic approach can be expanded and translated into on-the-ground emissions reductions at a global scale, then it will certainly be a win, not only for climate, but for air quality, health, and sustainable development.
Pollution
2021
January 26, 2021
https://www.sciencedaily.com/releases/2021/01/210126171636.htm
Emissions, pollution and economy: Satellite data reveal links
Burning fossil fuels has long powered world economies while contributing to air pollution and the buildup of greenhouse gases. A new analysis of nearly two decades of satellite data shows that economic development, fossil-fuel combustion and air quality are closely linked on the continental and national scales, but can be decoupled at the national level, according to Penn State scientists.
"We know air pollution and economic development are linked, but we want to know how tightly and whether our actions can change this," said Ruixue Lei, a post-doctoral researcher in the Department of Meteorology and Atmospheric Science. "We found they are not inherently bonded and can be decoupled under favorable policies." While previous research has explored the connections between air pollution, fossil-fuel emissions and economic growth, this study is the first to examine all three jointly to determine their long-term, global relationships, the scientists said. "The significance of this study is that data from satellites was used for the first time to prove that we actually do not need to sacrifice our environment while at the same time having a growing economy," said Sha Feng, assistant research professor of meteorology and atmospheric science. "This relationship can be detangled, but countries may need infrastructure or policy support to make it happen." The team analyzed 18 years of satellite data measuring the amounts of anthropogenic aerosols in the atmosphere, along with fossil-fuel carbon dioxide emission estimates from the Open-Data Inventory for Anthropogenic Carbon product, to determine anthropogenic emissions on continental and national scales. They then compared those findings to gross domestic product data for individual countries. Their data showed that the fastest-growing nations suffer the most severe pollution, while countries like the United States were able to grow their economies while slowing emissions, the scientists said. The team developed a filter that allowed them to focus on cities and other areas where emissions result from human activities. "We found the linkage between fossil-fuel combustion and air quality is not how much you emitted, it is how fast the annual increase of the combustion was," Lei said. 
"Maybe at this stage all countries cannot unbind these factors, but we still see good examples that give us hope." There are different types of pollutants associated with the burning of fossil fuels, and the satellite data also indicated that these varied widely by country, the scientists said. The results were published in the journal. "This paper is a first step to look at fossil-fuel emissions using satellite data at a national scale and to provide information for policy makers who face difficult challenges in balancing economic growth and reducing fossil-fuel emissions," Feng said. NASA provided funding for this research.
Pollution
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125191844.htm
Air pollution linked to higher risk of sight loss from AMD
Air pollution is linked to a heightened risk of progressive and irreversible sight loss, known as age related macular degeneration (AMD), reveals a large long term study led by UCL researchers.
They found that people in the most polluted areas were at least 8% more likely to report having AMD, according to the published findings. Lead author Professor Paul Foster (UCL Institute of Ophthalmology) said: "Here we have identified yet another health risk posed by air pollution, strengthening the evidence that improving the air we breathe should be a key public health priority. Our findings suggest that living in an area with polluted air, particularly fine particulate matter or combustion-related particles that come from road traffic, could contribute to eye disease. "Even relatively low exposure to air pollution appears to impact the risk of AMD, suggesting that air pollution is an important modifiable risk factor affecting risk of eye disease for a very large number of people." AMD is the leading cause of irreversible blindness among people over 50 in high-income countries, with the number of those affected projected to reach 300 million by 2040. Known risk factors include older age, smoking, and genetic make-up. Air pollution has been implicated in brain conditions such as Alzheimer's disease, Parkinson's disease and stroke, while a 2019 study by the same research team found that air pollution was linked to elevated glaucoma risk. Particulate matter exposure is one of the strongest predictors of mortality among air pollutants. To see if air pollution might also be implicated in AMD risk, the researchers drew on data from 115,954 UK Biobank study participants aged 40-69 with no eye problems at the start of the study in 2006. Participants were asked to report any formal diagnosis of AMD by a doctor. 
Structural changes in the thickness and/or numbers of light receptors in the retina -- indicative of AMD -- were also assessed in 52,602 of the participants, for whom complete data were available in 2009 and 2012, using retinal imaging (non-invasive optical coherence tomography, or OCT). Measures of ambient air pollution included those for particulate matter (PM2.5), nitrogen dioxide (NO2), and nitrogen oxides (NOx). The estimates for these were provided by the Small Area Health Statistics Unit as part of the BioSHaRE-EU Environmental Determinants of Health Project. Official information on traffic, land use, and topography was used to calculate the annual average air pollution levels at participants' home addresses. The research team found that people in areas with higher levels of fine particulate matter pollution were more likely to report having AMD (specifically, they found an 8% difference in AMD risk between people living in the 25th and 75th percentiles of pollution levels), after accounting for potentially influential factors such as underlying health conditions and lifestyle. All pollutants, except coarse particulate matter, were associated with changes in retinal structure. The researchers caution that this observational study cannot confirm cause, but their findings align with evidence from elsewhere in the world. While they cannot yet confirm a mechanism, they suggest that ambient air pollution could plausibly be associated with AMD through oxidative stress or inflammation. Dr Sharon Chua (UCL Institute of Ophthalmology), the paper's first author, adds: "Higher exposure to air pollution was also associated with structural features of AMD. 
This may indicate that higher levels of air pollution may cause the cells to be more vulnerable to adverse changes and increase the risk of AMD." The study was funded by Moorfields Eye Charity, the NIHR Biomedical Research Centre at Moorfields Eye Hospital NHS Foundation Trust and UCL Institute of Ophthalmology, the Alcon Research Institute, and the International Glaucoma Association.
Pollution
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125191835.htm
Aircraft could cut emissions by better surfing the wind
Airlines could save fuel and reduce emissions on transatlantic flights by hitching a better ride on the jet stream, new research has shown.
Scientists at the University of Reading have found that commercial flights between New York and London last winter could have used up to 16% less fuel if they had made better use of the fast-moving winds at altitude. New satellites will soon allow transatlantic flights to be tracked more accurately while remaining a safe distance apart. This could allow aircraft to be more flexible in their flight paths, following favourable tailwinds and avoiding headwinds more closely, and so offering the aviation sector a cheaper and more immediate way of cutting emissions than advances in technology. Cathie Wells, a PhD researcher in mathematics at the University of Reading and lead author of the research, said: "Current transatlantic flight paths mean aircraft are burning more fuel and emitting more carbon dioxide than they need to. "Although winds are taken into account to some degree when planning routes, considerations such as reducing the total cost of operating the flight are currently given a higher priority than minimising the fuel burn and pollution." Professor Paul Williams, an atmospheric scientist at the University of Reading and co-author of the new study, said: "Upgrading to more efficient aircraft or switching to biofuels or batteries could lower emissions significantly, but will be costly and may take decades to achieve. "Simple tweaks to flight paths are far cheaper and can offer benefits immediately. This is important, because lower emissions from aviation are urgently needed to reduce the future impacts of climate change." The new study, published today, found that taking better advantage of the winds would have saved around 200 kilometres worth of fuel per flight on average, adding up to a total reduction of 6.7 million kilograms of carbon dioxide emissions across the winter period. 
The average fuel saving per flight was 1.7% when flying west to New York and 2.5% when flying east to London. The study was led by the University of Reading in collaboration with the UK National Centre for Earth Observation, the University of Nottingham, and Poll AeroSciences Ltd. Aviation is currently responsible for around 2.4% of all human-caused carbon emissions, and this figure is growing. The International Civil Aviation Organisation (ICAO) and countries around the world have responded by establishing policies to improve the fuel efficiency of international flights or offset emissions, but most of this action relies on technological advances and is therefore costly and slow to implement. Climate change is likely to have a big impact on air travel, with previous Reading research showing flights will encounter two or three times more severe clear-air turbulence if emissions are not cut.
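To put the "200 kilometres worth of fuel per flight" figure in emissions terms, the back-of-envelope calculation below uses an assumed wide-body cruise burn rate together with the standard jet-fuel emission factor of about 3.16 kg of CO2 per kg of fuel. The burn rate is an illustrative assumption, not a value from the Reading study.

```python
fuel_per_km = 7.0        # kg of fuel burned per km at cruise (assumed, wide-body jet)
co2_per_kg_fuel = 3.16   # kg CO2 per kg of jet fuel (standard emission factor)
saving_km = 200          # reported average saving per flight, in km of flying

fuel_saved_kg = saving_km * fuel_per_km          # fuel not burned per flight
co2_saved_kg = fuel_saved_kg * co2_per_kg_fuel   # CO2 avoided per flight
print(fuel_saved_kg, co2_saved_kg)
```

Under these assumptions each optimised crossing avoids on the order of a tonne of fuel and several tonnes of CO2, which is the scale of per-flight saving the article describes.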
Pollution
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125191821.htm
Light pollution linked to preterm birth increase
Scientists conducted the first study to examine the fetal health impact of light pollution based on a direct measure of skyglow, an important aspect of light pollution. Using an empirical regularity discovered in physics, called Walker's Law, a team from Lehigh University, Lafayette College and the University of Colorado Denver in the U.S., found evidence of reduced birth weight, shortened gestational length and preterm births.
Specifically, the likelihood of a preterm birth could increase by approximately 1.48 percentage points (or 12.9%), according to the researchers, as a result of increased nighttime brightness. The nighttime brightness studied is characterized by being able to see only one-fourth to one-third of the stars that are visible in the natural unpolluted night sky. The findings have been published in a journal article. One possible biological mechanism underlying the findings, based on the existing literature, is light-pollution-induced circadian rhythm disruption, according to Muzhe Yang, a co-author of the study and a professor of economics in Lehigh's College of Business. Yang says circadian rhythm disruption can cause sleep disorders that subsequently lead to adverse birth outcomes. "While greater use of artificial light at night (ALAN) is often associated with greater economic prosperity, our study highlights an often neglected health benefit of 'darkness'," says Yang. "We must realize that the biological clock (i.e., the circadian rhythm) of a human body, like all life on Earth, needs the 'darkness' as part of the light-dark cycle in order to effectively regulate physiological functions, such as sleep." While essential to a modern society, ALAN can disrupt a human body's circadian rhythm and thereby become a "pollutant." The societal benefits of ALAN, for example through increased economic activity, may be offset by ALAN's negative externalities, such as adverse health effects, say the authors. The contribution of ALAN to the alteration of natural nocturnal lighting levels is often referred to as light pollution. Light pollution is considered an ongoing worldwide problem.
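The two numbers quoted for the preterm-birth effect (an absolute rise of 1.48 percentage points, equal to a relative rise of 12.9%) jointly imply a baseline preterm-birth rate, which is a quick way to sanity-check paired absolute/relative figures like these:

```python
absolute_increase_pp = 1.48   # rise in percentage points
relative_increase = 0.129     # the same rise expressed as 12.9% relative

# relative = absolute / baseline, so the implied baseline rate is:
baseline_rate_pp = absolute_increase_pp / relative_increase
print(f"implied baseline preterm rate: {baseline_rate_pp:.1f}%")
```

The implied baseline is around 11-12%, a plausible preterm-birth rate, so the two quoted figures are mutually consistent.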
Pollution
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125113119.htm
Dramatic increase in microplastics in seagrass soil since the 1970s
Large-scale production of vegetables and fruit in Spain with intensive plastic consumption in its greenhouse industry is believed to have leaked microplastic contaminants since the 1970s into the surrounding Mediterranean seagrass beds. This is shown in a new study where researchers have succeeded in tracing plastic pollution since the 1930s and 1940s by analyzing seagrass sediments.
About half of Sweden's cucumbers and a fifth of its tomatoes are currently imported from Spain, according to the Swedish Board of Agriculture. One area of particularly large-scale vegetable cultivation is Almería, on the Mediterranean coast in southeastern Spain. A new study from the area of Almería, also known as "the sea of plastic," shows that the intensive use of plastics in the greenhouse industry seems to have led to ever-increasing emissions of microplastics since the development of intensive greenhouse farming in the 1970s. "Almería is unique in Europe because it is one of the few human structures that can be seen from space because it is so large. The area accounts for about a quarter, or three million tonnes, of the total Spanish exports of vegetables and fruit," says Martin Dahl, researcher in marine ecology at the Department of Ecology, Environment and Plant Sciences, Stockholm University, who is the first author of the study. The study was conducted by researchers from Stockholm University in collaboration with the Center for Advanced Studies of Blanes, the Spanish High Council for Scientific Research (CEAB-CSIC), the Swedish Environmental Research Institute (IVL) and Södertörn University. Seagrass beds act as filters for coastal areas and can therefore capture particles, including microplastics, from land that get stuck on the leaves or end up in the sea bed. 
This makes seagrass beds interesting to study as they stabilize and build up thick sediment layers that can be used as historical environmental archives to, among other things, study the accumulation of microplastics over time.The high concentrations of microplastics that have accumulated in the seagrass bed can potentially lead to the spread of microplastics to other environments or to animals:"Seagrass beds could serve as a first step in the transfer of microplastics to animals, as many graze on seagrass or live in its sediment, and in this way could be exposed to plastic," says Martin Dahl.Microplastics can also bind to heavy metals and other environmental toxins."We generally don't know enough about the effect of microplastics on the environment, but on the other hand we know that today plastic and microplastics occur almost everywhere in the oceans and therefore I think you should see it as a warning signal. Historically, there is usually a certain time lag from the introduction of environmental toxins until effects can be seen, such as with PCBs and DDT," says Martin Dahl.The high use of plastic in Almería is mainly due to the plastic films they use to cover the greenhouses. These wear out quickly and need to be replaced relatively often. Through usage and weathering of the plastic film and other types of plastic used in the production of vegetables and fruits, they end up in the environment and are passed on to the sea through runoff.The researchers were able to find PVC and polystyrene used in greenhouse cultivation in Almería. 
However, the analysis could not identify all specific plastic polymers and link them directly to the type of plastic that the greenhouses are covered with.The researchers have chosen to investigate the area on the coast outside Almería due to the fact that it has previously been known that greenhouse cultivation has a high consumption of plastic and that this might influence microplastic contaminants in the surrounding seagrass beds, which are known to capture particles that are transported with the water.According to Martin Dahl, there are several ongoing research projects on microplastics in Sweden, especially at Kristineberg's marine research station outside Fiskebäckskil in western Sweden, but they have mainly looked at plastics in other types of bottoms and in the open sea, not in seagrass sediments."Studying microplastics in seagrass beds is very new and this is the first study, as far as I know, where dated seagrass sediments have been used to analyze the accumulation of microplastics over time, which makes the study very exciting," says Martin Dahl."There are still many questions about the effect of microplastics on seagrass ecosystems, but I hope that this study can draw attention to the problems that obviously exist around microplastic pollutants, not only in Almería but in the ocean in general," says Martin Dahl.
Pollution
2021
January 22, 2021
https://www.sciencedaily.com/releases/2021/01/210122101956.htm
Combined river flows could send up to 3 billion microplastics a day into the Bay of Bengal
The Ganges River -- with the combined flows of the Brahmaputra and Meghna rivers -- could be responsible for up to 3 billion microplastic particles entering the Bay of Bengal every day, according to new research.
The study represents the first investigation of microplastic abundance, characteristics and seasonal variation along the river, and was conducted using samples collected by an international team of scientists as part of the National Geographic Society's Sea to Source: Ganges expedition. Over two expeditions in 2019, 120 samples (60 each in pre- and post-monsoon conditions) were gathered at 10 sites by pumping river water through a mesh filter to capture any particles. The samples were then analysed in laboratories at the University of Plymouth, with microplastics found in 43 (71.6%) of the samples taken pre-monsoon and 37 (61.6%) post-monsoon. More than 90% of the microplastics found were fibres and, among them, rayon (54%) and acrylic (24%) -- both of which are commonly used in clothing -- were the most abundant. Combining the predicted microplastic concentration at the mouth of the river (Bhola, Bangladesh) with the discharge of the river, scientists estimate that between 1 billion and 3 billion microplastics may be released from the Ganges Brahmaputra Meghna River Basin every day. The research has now been published. Research Fellow and National Geographic Explorer Dr Imogen Napper, the study's lead author, was among the participants in the Sea to Source: Ganges expedition. She said: "Globally, it has been estimated that 60 billion pieces of plastic are discharged into the ocean from rivers worldwide each day. However, what has been lacking until now has been a detailed analysis of how microplastic concentrations vary along a river's course. By working with local communities and partners, this expedition always aimed to help us stem the flow of plastic entering the Gangetic basin. 
These results provide the first step in understanding how it, as well as other major rivers, may contribute to oceanic microplastic."The Ganges River rises in the Himalayas and runs through India and Bangladesh, where it joins the Brahmaputra and Meghna rivers shortly before reaching the Indian Ocean.The combined flows of the three rivers are the largest in South Asia and form the most populous basin in the world, with over 655 million inhabitants relying on the water it provides.The samples were collected during pre-monsoon (May to June 2019) and post-monsoon (October to December 2019), at sites ranging from Harsil closest to the source of the Ganges to Bhola in southern Bangladesh where it meets the Bay of Bengal.The sample sites were selected to ensure a mixture of rural, urban, agricultural, tourism and religious locations, with the highest concentrations found closer to the river's mouth at Bhola, in Bangladesh.Pre-monsoon samples collected there had four times as many particles as those taken at Harsil, while post-monsoon samples had double the amount.Professor Richard Thompson OBE, Head of the International Marine Litter Research Unit at the University and one of the study's co-authors, said: "We know that rivers are a substantial source of microplastics in the ocean. But the information like this can help identify the key sources and pathways of microplastic and hence inform management interventions. 
With this type of evidence, we can progress toward using plastics more responsibly so as to get the many benefits they can bring without unnecessary contamination of the environment." This study is the latest in the field from the University, which was awarded a Queen's Anniversary Prize for Higher and Further Education in 2020 for its ground-breaking research and policy impact on microplastics pollution in the oceans. It is currently among the partners in Preventing Plastic Pollution (PPP), a €14 million project which aims to prevent plastic pollution from rivers entering the English Channel, and LimnoPlast, a €4.1 million project examining the distribution of microplastics in European rivers and lakes.
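The study's headline estimate multiplies the predicted microplastic concentration at the river mouth by the river's discharge. That calculation can be sketched in a few lines; the discharge and concentration values below are illustrative assumptions chosen to land near the article's upper estimate, not figures taken from the study:

```python
# Back-of-envelope flux estimate: particles entering the Bay of Bengal per
# day = concentration at the mouth x daily discharge volume. Both input
# values are assumed for illustration, not measured values from the paper.
discharge_m3_s = 38_000        # assumed combined GBM discharge (m^3/s)
concentration_per_m3 = 0.9     # assumed microplastic count (particles/m^3)

seconds_per_day = 86_400
daily_volume_m3 = discharge_m3_s * seconds_per_day
particles_per_day = concentration_per_m3 * daily_volume_m3

print(f"Daily discharge volume: {daily_volume_m3:.2e} m^3")
print(f"Estimated flux:         {particles_per_day:.1e} particles/day")
```

Even a concentration below one particle per cubic metre yields a flux of roughly three billion particles per day at this scale of discharge, which is why large rivers dominate ocean inputs.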
Pollution
2021
January 21, 2021
https://www.sciencedaily.com/releases/2021/01/210121132059.htm
Butterfly wing clap explains mystery of flight
The fluttery flight of butterflies has so far been somewhat of a mystery to researchers, given their unusually large and broad wings relative to their body size. Now researchers at Lund University in Sweden have studied the aerodynamics of butterflies in a wind tunnel. The results suggest that butterflies use a highly effective clap technique, therefore making use of their unique wings. This helps them rapidly take off when escaping predators.
The study explains the benefits of both the wing shape and the flexibility of their wings.The Lund researchers studied the wingbeats of freely flying butterflies during take-off in a wind tunnel. During the upward stroke, the wings cup, creating an air-filled pocket between them. When the wings then collide, the air is forced out, resulting in a backward jet that propels the butterflies forward. The downward wingbeat has another function: the butterflies stay in the air and do not fall to the ground.The wings colliding was described by researchers almost 50 years ago, but it is only in this study that the theory has been tested on real butterflies in free flight. Until now, the common perception has been that butterfly wings are aerodynamically inefficient, however, the researchers suggest that the opposite is actually true."That the wings are cupped when butterflies clap them together, makes the wing stroke much more effective. It is an elegant mechanism that is far more advanced than we imagined, and it is fascinating. The butterflies benefit from the technique when they have to take off quickly to escape from predators," says biology researcher Per Henningsson, who studied the butterflies' aerodynamics together with colleague Christoffer Johansson."The shape and flexibility of butterfly wings could inspire improved performance and flight technology in small drones," he continues.In addition to studying the butterflies in a wind tunnel, the researchers designed mechanical wings that mimic real ones. The shape and flexibility of the mechanical wings as they are cupped and folded confirm the efficiency."Our measurements show that the impulse created by the flexible wings is 22 percent higher and the efficiency 28 percent better compared to if the wings had been rigid," concludes Christoffer Johansson.
Pollution
2021
January 19, 2021
https://www.sciencedaily.com/releases/2021/01/210119122043.htm
New clues help explain why PFAS chemicals resist remediation
The synthetic chemicals known as PFAS, short for perfluoroalkyl and polyfluoroalkyl substances, are found in soil and groundwater where they have accumulated, posing risks to human health ranging from respiratory problems to cancer.
New research from the University of Houston and Oregon State University helps explain why. The work focused on the interactions sparked when firefighters use firefighting foam, which contains PFAS, to combat fires involving jet fuel, diesel or other hydrocarbon-based fuels. Firefighter training sites are well-documented sources of PFAS pollution. Konstantinos Kostarelos, a researcher with UH Energy and corresponding author for the work, said the interactions form a viscous water-in-oil microemulsion, which chemical analysis determined retains a high level of the PFAS. Unlike many emulsions of oil and liquid, which separate into their component parts over time, these microemulsions -- composed of liquids from the firefighting foam and the hydrocarbon-based fuel -- retain their composition, Kostarelos said. "It behaves like a separate phase: the water phase, oil phase and the microemulsion phase. And the microemulsion phase encapsulates these PFAS." Experimental trials that simulate the subsurface determined about 80% of PFAS were retained in the microemulsions when they flow through the soil, he said. "If they passed through easily, they wouldn't have been so persistent over the course of decades." Produced during the post-World War II chemical boom, PFAS are found in consumer products ranging from anti-stain treatments to Teflon and microwave popcorn bags, in addition to firefighting foam. They were prized because they resist heat, oil and water -- traditional methods of removing or breaking down chemicals -- as a result of the strong bond between the carbon and fluorine atoms that make up PFAS molecules. They have been the target of lawsuits and regulatory actions, and new chemical formulations have shortened their half-life. In the meantime, the toxic legacy of the older formulations continues to resist permanent remediation. 
Kostarelos said the new understanding of microemulsion formation will help investigators better identify the source of the contamination, as well as stimulate new methods for clean-up efforts. "It's very viscous," he said. "That's very useful information for designing a way to recover the microemulsion." The project was funded by the Strategic Environmental Research and Development Program of the U.S. Department of Defense. In addition to Kostarelos, co-authors on the publication include Pushpesh Sharma of UH; and Emerson Christie, Thomas Wanzek and Jennifer Field, all of Oregon State University.
Pollution
2021
January 18, 2021
https://www.sciencedaily.com/releases/2021/01/210118103456.htm
Eliminating microplastics in wastewater directly at the source
A research team from the Institut national de la recherche scientifique (INRS) has developed a process for the electrolytic treatment of wastewater that degrades microplastics at the source. The results of this research have now been published.
Wastewater can carry high concentrations of microplastics into the environment. These small particles of less than 5 mm can come from our clothes, usually as microfibers. Professor Patrick Drogui, who led the study, points out that there are currently no established degradation methods to handle this contaminant during wastewater treatment. Some techniques already exist, but they often involve physical separation as a means of filtering pollutants. These technologies do not degrade them, which requires additional work to manage the separated particles. Therefore, the research team decided to degrade the particles by electrolytic oxidation, a process not requiring the addition of chemicals. "Using electrodes, we generate hydroxyl radicals (•OH) to attack microplastics. This process is environmentally friendly because it breaks them down into CO2." Professor Drogui envisions the use of this technology at the exit of commercial laundries, a potential source of microplastics release into the environment. "When this commercial laundry water arrives at the wastewater treatment plant, it is mixed with large quantities of water, the pollutants are diluted and therefore more difficult to degrade. Conversely, by acting at the source, i.e., at the laundry, the concentration of microplastics is higher (per litre of water), thus more accessible for electrolytic degradation," explains the specialist in electrotechnology and water treatment. Laboratory tests conducted on water artificially contaminated with polystyrene showed a degradation efficiency of 89%. The team plans to move on to experiments on real water. 
"Real water contains other materials that can affect the degradation process, such as carbonates and phosphates, which can trap radicals and reduce the performance of the oxidation process," says Professor Drogui, scientific director of the Laboratory of Environmental Electrotechnologies and Oxidative Processes (LEEPO). If the technology demonstrates its effectiveness on real commercial laundry water, the research group intends to conduct a study to determine the cost of treatment and the adaptation of the technology to treat larger quantities of wastewater. Within a few years, the technology could be implemented in laundry facilities.
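Degradation efficiency in treatment studies of this kind is conventionally the fractional reduction in contaminant concentration between inflow and outflow. A minimal sketch; the concentrations are illustrative values chosen to reproduce the 89% figure reported for the polystyrene tests, not data from the study:

```python
# Degradation efficiency = (initial - final) / initial concentration.
# The two concentrations below are assumed for illustration only.
initial_mg_per_L = 100.0   # assumed polystyrene concentration before treatment
final_mg_per_L = 11.0      # assumed concentration after electrolytic oxidation

efficiency = (initial_mg_per_L - final_mg_per_L) / initial_mg_per_L
print(f"Degradation efficiency: {efficiency:.0%}")
```

The same ratio is what would drop on real laundry water if carbonates and phosphates scavenge part of the hydroxyl radicals, as Professor Drogui cautions.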
Pollution
2021
January 15, 2021
https://www.sciencedaily.com/releases/2021/01/210115103020.htm
2020 tied for warmest year on record, NASA analysis shows
Earth's global average surface temperature in 2020 tied with 2016 as the warmest year on record, according to an analysis by NASA.
Continuing the planet's long-term warming trend, the year's globally averaged temperature was 1.84 degrees Fahrenheit (1.02 degrees Celsius) warmer than the baseline 1951-1980 mean, according to scientists at NASA's Goddard Institute for Space Studies (GISS) in New York. 2020 edged out 2016 by a very small amount, within the margin of error of the analysis, making the years effectively tied for the warmest year on record."The last seven years have been the warmest seven years on record, typifying the ongoing and dramatic warming trend," said GISS Director Gavin Schmidt. "Whether one year is a record or not is not really that important -- the important things are long-term trends. With these trends, and as the human impact on the climate increases, we have to expect that records will continue to be broken."Tracking global temperature trends provides a critical indicator of the impact of human activities -- specifically, greenhouse gas emissions -- on our planet. Earth's average temperature has risen more than 2 degrees Fahrenheit (1.2 degrees Celsius) since the late 19th century.Rising temperatures are causing phenomena such as loss of sea ice and ice sheet mass, sea level rise, longer and more intense heat waves, and shifts in plant and animal habitats. Understanding such long-term climate trends is essential for the safety and quality of human life, allowing humans to adapt to the changing environment in ways such as planting different crops, managing our water resources and preparing for extreme weather events.A separate, independent analysis by the National Oceanic and Atmospheric Administration (NOAA) concluded that 2020 was the second-warmest year in their record, behind 2016. NOAA scientists use much of the same raw temperature data in their analysis, but have a different baseline period (1901-2000) and methodology. 
Unlike NASA, NOAA also does not infer temperatures in polar regions lacking observations, which accounts for much of the difference between NASA and NOAA records.Like all scientific data, these temperature findings contain a small amount of uncertainty -- in this case, mainly due to changes in weather station locations and temperature measurement methods over time. The GISS temperature analysis (GISTEMP) is accurate to within 0.1 degrees Fahrenheit with a 95 percent confidence level for the most recent period.While the long-term trend of warming continues, a variety of events and factors contribute to any particular year's average temperature. Two separate events changed the amount of sunlight reaching the Earth's surface. The Australian bush fires during the first half of the year burned 46 million acres of land, releasing smoke and other particles more than 18 miles high in the atmosphere, blocking sunlight and likely cooling the atmosphere slightly. In contrast, global shutdowns related to the ongoing coronavirus (COVID-19) pandemic reduced particulate air pollution in many areas, allowing more sunlight to reach the surface and producing a small but potentially significant warming effect. These shutdowns also appear to have reduced the amount of carbon dioxide (CO2) emissions last year, but overall CO2 concentrations continued to increase, and since warming is related to cumulative emissions, the overall amount of avoided warming will be minimal.The largest source of year-to-year variability in global temperatures typically comes from the El Nino-Southern Oscillation (ENSO), a naturally occurring cycle of heat exchange between the ocean and atmosphere. While the year has ended in a negative (cool) phase of ENSO, it started in a slightly positive (warm) phase, which marginally increased the average overall temperature. 
The cooling influence from the negative phase is expected to have a larger influence on 2021 than 2020."The previous record warm year, 2016, received a significant boost from a strong El Nino. The lack of a similar assist from El Nino this year is evidence that the background climate continues to warm due to greenhouse gases," Schmidt said.The 2020 GISS values represent surface temperatures averaged over both the whole globe and the entire year. Local weather plays a role in regional temperature variations, so not every region on Earth experiences similar amounts of warming even in a record year. According to NOAA, parts of the continental United States experienced record high temperatures in 2020, while others did not.In the long term, parts of the globe are also warming faster than others. Earth's warming trends are most pronounced in the Arctic, which the GISTEMP analysis shows is warming more than three times as fast as the rest of the globe over the past 30 years, according to Schmidt. The loss of Arctic sea ice -- whose annual minimum area is declining by about 13 percent per decade -- makes the region less reflective, meaning more sunlight is absorbed by the oceans and temperatures rise further still. This phenomenon, known as Arctic amplification, is driving further sea ice loss, ice sheet melt and sea level rise, more intense Arctic fire seasons, and permafrost melt.NASA's analysis incorporates surface temperature measurements from more than 26,000 weather stations and thousands of ship- and buoy-based observations of sea surface temperatures. These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions if not taken into account. 
The result of these calculations is an estimate of the global average temperature difference from a baseline period of 1951 to 1980. NASA measures Earth's vital signs from land, air, and space with a fleet of satellites, as well as airborne and ground-based observation campaigns. The satellite surface temperature record from the Atmospheric Infrared Sounder (AIRS) instrument aboard NASA's Aura satellite confirms the GISTEMP results of the past seven years being the warmest on record. Satellite measurements of air temperature, sea surface temperature, and sea levels, as well as other space-based observations, also reflect a warming, changing world. The agency develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet. NASA's full surface temperature data set -- and the complete methodology used to make the temperature calculation -- are publicly available. GISS is a NASA laboratory managed by the Earth Sciences Division of the agency's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York. More information about NASA's Earth science missions is available online.
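The anomaly calculation described above, each year's value expressed as a difference from the mean of a fixed 1951-1980 baseline period, can be sketched in a few lines. The yearly temperatures below are made-up illustrative values, not GISTEMP data:

```python
# Temperature "anomaly" = yearly value minus the mean of a fixed baseline
# period (GISTEMP uses 1951-1980). All temperatures here are invented
# illustrative values in degrees C, not real observations.
temps = {1955: 13.9, 1965: 14.0, 1975: 14.1, 2016: 15.0, 2020: 15.0}

baseline_years = [y for y in temps if 1951 <= y <= 1980]
baseline_mean = sum(temps[y] for y in baseline_years) / len(baseline_years)

for year in (2016, 2020):
    anomaly = temps[year] - baseline_mean
    print(f"{year}: anomaly = {anomaly:+.2f} C")
```

Because the baseline is fixed, two agencies using different baseline periods (NASA's 1951-1980 versus NOAA's 1901-2000) report different anomaly numbers for the same underlying warming, which is one reason their rankings can differ at the margin.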
Pollution
2021
January 15, 2021
https://www.sciencedaily.com/releases/2021/01/210115091351.htm
Changing resilience of oceans to climate change
Oxygen levels in the ancient oceans were surprisingly resilient to climate change, new research suggests.
Scientists used geological samples to estimate ocean oxygen during a period of global warming 56 million years ago -- and found "limited expansion" of seafloor anoxia (absence of oxygen).Global warming -- both past and present -- depletes ocean oxygen, but the new study suggests warming of 5°C in the Paleocene Eocene Thermal Maximum (PETM) led to anoxia covering no more than 2% of the global seafloor.However, conditions are different today to the PETM -- today's rate of carbon emissions is much faster, and we are adding nutrient pollution to the oceans -- both of which could drive more rapid and expansive oxygen loss.The study was carried out by an international team including researchers from ETH Zurich, the University of Exeter and Royal Holloway, University of London."The good news from our study is that the Earth system was resilient to seafloor deoxygenation 56 million years ago despite pronounced global warming," said lead author Dr Matthew Clarkson, of ETH Zurich."However, there are reasons why things are different today."In particular, we think the Paleocene had higher atmospheric oxygen than today, which would have made anoxia less likely."Additionally, human activity is putting more nutrients into the ocean through fertilisers and pollution, which can drive oxygen loss and accelerate environmental deterioration."To estimate ocean oxygen levels during the PETM, the researchers analysed the isotopic composition of uranium in ocean sediments, which tracks oxygen concentrations.Surprisingly, these barely changed during the PETM.This sets an upper limit on how much ocean oxygen levels could have changed.Computer simulations based on the results suggest a maximum ten-fold increase in the area of seafloor devoid of oxygen -- taking the total to no more than 2% of the global seafloor.This is still significant, at around ten times the modern area of anoxia, and there were clearly detrimental impacts and extinctions of marine life in some parts of the 
ocean. Co-author Professor Tim Lenton, Director of Exeter's Global Systems Institute, notes: "This study shows how the resilience of the Earth's climate system has changed over time." "The order of mammals we belong to -- the primates -- originated in the PETM. Unfortunately, as we primates have been evolving for the last 56 million years, it looks like the oceans have been getting less resilient." Professor Lenton added: "Although the oceans were more resilient than we thought at this time in the past, nothing should distract us from the urgent need to reduce emissions and tackle the climate crisis today."
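The two bounds reported in this article are internally consistent: if the PETM maximum of about 2% of the seafloor represents at most a ten-fold expansion, the implied modern anoxic area is roughly 0.2%. A quick check of that arithmetic:

```python
# Consistency check of the article's two figures: a PETM upper bound of ~2%
# anoxic seafloor, described as at most a ten-fold increase over today.
petm_max_fraction = 0.02    # upper bound from the uranium-isotope constraint
max_increase_factor = 10    # ten-fold expansion, per the simulations

implied_modern_fraction = petm_max_fraction / max_increase_factor
print(f"Implied modern anoxic seafloor fraction: {implied_modern_fraction:.1%}")
```

This matches the article's remark that the PETM bound is "around ten times the modern area of anoxia."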
Pollution
2021
January 14, 2021
https://www.sciencedaily.com/releases/2021/01/210114180605.htm
Extreme fire weather
When the Thomas Fire raged through Ventura and Santa Barbara counties in December 2017, Danielle Touma, at the time an earth science researcher at Stanford, was stunned by its severity. Burning for more than a month and scorching 440 square miles, the fire was then considered the worst in California's history.
Six months later the Mendocino Complex Fire upended that record and took out 717 square miles over three months. Record-setting California wildfires have since been the norm, with five of the top 10 occurring in 2020 alone. The disturbing trend sparked some questions for Touma, who is now a postdoctoral researcher at UC Santa Barbara's Bren School for Environmental Science & Management. "Climate scientists knew that there was a climate signal in there but we really didn't understand the details of it," she said of the transition to a climate more ideal for wildfires. While research has long concluded that anthropogenic activity and its products -- including greenhouse gas emissions, biomass burning, industrial aerosols (a.k.a. air pollution) and land-use changes -- raise the risk of extreme fire weather, the specific roles and influences of these activities were still unclear. Until now. In the first study of its kind, Touma, with fellow Bren School researcher Samantha Stevenson and colleagues Flavio Lehner of Cornell University and the National Center for Atmospheric Research (NCAR), and Sloan Coats from the University of Hawaii, has quantified competing anthropogenic influences on extreme fire weather risk in the recent past and into the near future. By disentangling the effects of those human-made factors, the researchers were able to tease out the roles these activities have had in generating an increasingly fire-friendly climate around the world and the risk of extreme fire weather in decades to come. Their work has now been published. "By understanding the different pieces that go into these scenarios of future climate change, we can get a better sense of what the risks associated with each of those pieces might be, because we know there are going to be uncertainties in the future," Stevenson said. 
"And we know those risks are going to be expressed unequally in different places too, so we can be better prepared for which parts of the world might be more vulnerable.""To get a wildfire to ignite and spread, you need suitable weather conditions -- you need warm, dry and windy conditions," Touma said. "And when these conditions are at their most extreme, they can cause really large, severe fires."Using state-of-the-art climate model simulations available from NCAR, the researchers analyzed the climate under various combinations of climate influences from 1920-2100, allowing them to isolate individual effects and their impacts on extreme fire weather risk.According to the study, heat-trapping greenhouse gas emissions (which started to increase rapidly by mid-century) are the dominant contributor to temperature increases around the globe. By 2005, emissions raised the risk of extreme fire weather by 20% from preindustrial levels in western and eastern North America, the Mediterranean, Southeast Asia and the Amazon. The researchers predict that by 2080, greenhouse gas emissions are expected to raise the risk of extreme wildfire by at least 50% in western North America, equatorial Africa, Southeast Asia and Australia, while doubling it in the Mediterranean, southern Africa, eastern North America and the Amazon.Meanwhile, biomass burning and land-use changes have more regional impacts that amplify greenhouse gas-driven warming, according to the study -- notably a 30% increase of extreme fire weather risk over the Amazon and western north America during the 20th century caused by biomass burning. 
Land use changes, the study found, also amplified the likelihood of extreme fire weather in western Australia and the Amazon. The role of industrial aerosols has been more complex in the 20th century, actually reducing the risk of extreme fire weather by approximately 30% in the Amazon and Mediterranean, but amplifying it by at least 10% in southeast Asia and western North America, the researchers found. "(Industrial aerosols) block some of the solar radiation from reaching the ground," Stevenson said. "So they tend to have a cooling effect on the climate." "And that's part of the reason why we wanted to do this study," she continued. "We knew something had been compensating in a sense for greenhouse gas warming, but not the details of how that compensation might continue in the future." The cooling effect may still be present in regions such as the Horn of Africa, Central America and the northeast Amazon, where aerosols have not been reduced to pre-industrial levels. Aerosols may still compete with greenhouse gas warming effects in the Mediterranean, western North America and parts of the Amazon, but the researchers expect this effect to dissipate over most of the globe by 2080, due to cleanup efforts and increased greenhouse gas-driven warming. Eastern North America and Europe are likely to see the warming and drying due to aerosol reduction first. Southeast Asia meanwhile, "where aerosol emissions are expected to continue," may see a weakening of the annual monsoon, drier conditions and an increase in extreme fire weather risk. "Southeast Asia relies on the monsoon, but aerosols cause so much cooling on land that it actually can suppress a monsoon," says Touma. 
"It's not just whether you have aerosols or not, it's the way the regional climate interacts with aerosols."The researchers hope that the detailed perspective offered by their study opens the door to more nuanced explorations of the Earth's changing climate."In the broader scope of things, it's important for climate policy, like if we want to know how global actions will affect the climate," Touma said. "And it's also important for understanding the potential impacts to people, such as with urban planning and fire management."
Pollution
2021
January 14, 2021
https://www.sciencedaily.com/releases/2021/01/210114085423.htm
A climate in crisis calls for investment in direct air capture, new research finds
There is a growing consensus among scientists, as well as national and local governments representing hundreds of millions of people, that humanity faces a climate crisis that demands a crisis response. New research from the University of California San Diego explores one possible mode of response: a massively funded program to deploy direct air capture (DAC) systems that remove CO2 directly from the atmosphere.
The findings reveal such a program could reverse the rise in global temperature well before 2100, but only with immediate and sustained investments from governments and firms to scale up the new technology. Despite the enormous undertaking explored in the study, the research also reveals the need for governments, at the same time, to adopt policies that would achieve deep cuts in CO2 emissions. "DAC is substantially more expensive than many conventional mitigation measures, but costs could fall as firms gain experience with the technology," said first-author Ryan Hanna, assistant research scientist at UC San Diego. "If that happens, politicians could turn to the technology in response to public pressure if conventional mitigation proves politically or economically difficult." Co-author David G. Victor, professor of industrial innovation at UC San Diego's School of Global Policy and Strategy, added that atmospheric CO2 is still rising. "Current pledges to cut global emissions put us on track for about 3 degrees C of warming," Victor said. "This reality calls for research and action around the politics of emergency response.
In times of crisis, such as war or pandemics, many barriers to policy expenditure and implementation are eclipsed by the need to mobilize aggressively." The study calculates the funding, net CO2 removal and climate effects of an emergency DAC program. The authors find that if an emergency direct air capture program were to commence in 2025 and receive investment of 1.2-1.9% of global GDP annually, it would remove 2.2-2.3 gigatons of CO2 per year. Even with such a massive program, the globe would see temperature rise of 2.4-2.5ºC in the year 2100 without further cuts in global emissions below current trajectories. According to the authors, DAC has attributes that could prove attractive to policymakers if political pressures continue to mount to act on climate change while deep emissions cuts remain out of reach. Policymakers, they suggest, might see value in the installation of a fleet of CO2 scrubbers. Drawing on previous U.S. spending in times of crisis, from the Civil War to Operation Warp Speed, the authors estimate the financial resources that might be available for emergency deployment of direct air capture -- in excess of one trillion dollars per year. The authors then built a bottom-up deployment model that constructs, operates and retires successive vintages of DAC scrubbers, given available funds and the rates at which direct air capture technologies might improve with time.
They link the technological and economic modeling to climate models that calculate the effects of these deployments on atmospheric CO2 concentrations and global temperature. With massive financial resources committed to DAC, the study finds that the ability of the DAC industry to scale up is the main factor limiting how much CO2 can be removed. "Crisis deployment of direct air capture, even at the extreme of what is technically feasible, is not a substitute for conventional mitigation," the authors write. Nevertheless, they note that the long-term vision for combating climate change requires taking negative emissions seriously. "For policymakers, one implication of this finding is the high value of near-term direct air capture deployments -- even if societies today are not yet treating climate change as a crisis -- because near term deployments enhance future scalability," they write. "Rather than avoiding direct air capture deployments because of high near-term costs, the right policy approach is the opposite." Additionally, they note that such a large program would grow a new economic sector, producing a substantial number of new jobs. The authors conclude it is time to extend research on direct air capture systems to real-world conditions and constraints that accompany deployment -- especially in the context of acute political pressures that will arise as climate change becomes viewed as a crisis.
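The cost-decline mechanism the authors lean on -- DAC getting cheaper as firms gain experience -- is conventionally modeled with a one-factor experience curve, in which unit cost falls by a fixed learning rate with each doubling of cumulative installed capacity. A minimal sketch; the $600/tCO2 starting cost and 18% learning rate are illustrative assumptions, not figures from the study:

```python
import math

def experience_curve_cost(c0, cumulative, initial, learning_rate):
    """One-factor experience curve: unit cost falls by `learning_rate`
    (e.g. 0.18 = 18%) with each doubling of cumulative capacity."""
    b = math.log2(1 - learning_rate)  # progress exponent (negative)
    return c0 * (cumulative / initial) ** b

# Illustrative: start at $600/tCO2 with 1 MtCO2/yr installed; after six
# capacity doublings (64 MtCO2/yr), cost falls to about $182/tCO2.
cost_after_scaleup = experience_curve_cost(600, 64, 1, 0.18)
```

Under these assumed numbers, sustained deployment cuts unit cost by roughly 70% -- the dynamic that makes early, "expensive" deployments valuable for future scalability.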
Pollution
2021
January 14, 2021
https://www.sciencedaily.com/releases/2021/01/210114085407.htm
Concept for a hybrid-electric plane may reduce aviation's air pollution problem
At cruising altitude, airplanes emit a steady stream of nitrogen oxides into the atmosphere, where the chemicals can linger to produce ozone and fine particulates. Nitrogen oxides, or NOx, are a major source of air pollution and have been associated with asthma, respiratory disease, and cardiovascular disorders. Previous research has shown that the generation of these chemicals due to global aviation results in 16,000 premature deaths each year.
Now MIT engineers have come up with a concept for airplane propulsion that they estimate would eliminate 95 percent of aviation's NOx emissions, and thereby reduce the number of associated early deaths by 92 percent.The concept is inspired by emissions-control systems used in ground transportation vehicles. Many heavy-duty diesel trucks today house postcombustion emissions-control systems to reduce the NOx generated by engines. The researchers now propose a similar design for aviation, with an electric twist.Today's planes are propelled by jet engines anchored beneath each wing. Each engine houses a gas turbine that powers a propeller to move the plane through the air as exhaust from the turbine flows out the back. Due to this configuration, it has not been possible to use emissions-control devices, as they would interfere with the thrust produced by the engines.In the new hybrid-electric, or "turbo-electric," design, a plane's source of power would still be a conventional gas turbine, but it would be integrated within the plane's cargo hold. Rather than directly powering propellers or fans, the gas turbine would drive a generator, also in the hold, to produce electricity, which would then electrically power the plane's wing-mounted, electrically driven propellers or fans. The emissions produced by the gas turbine would be fed into an emissions-control system, broadly similar to those in diesel vehicles, which would clean the exhaust before ejecting it into the atmosphere."This would still be a tremendous engineering challenge, but there aren't fundamental physics limitations," says Steven Barrett, professor of aeronautics and astronautics at MIT. 
"If you want to get to a net-zero aviation sector, this is a potential way of solving the air pollution part of it, which is significant, and in a way that's technologically quite viable."The details of the design, including analyses of its potential fuel cost and health impacts, are published today in the journal The seeds for the team's hybrid-electric plane grew out of Barrett and his team's work in investigating the Volkswagen diesel emissions scandal. In 2015, environmental regulators discovered that the car manufacturer had been intentionally manipulating diesel engines to activate onboard emissions-control systems only during lab testing, such that they appeared to meet NOx emissions standards but in fact emitted up to 40 times more NOx in real-world driving conditions.As he looked into the health impacts of the emissions cheat, Barrett also became familiar with diesel vehicles' emissions-control systems in general. Around the same time, he was also looking into the possibility of engineering large, all-electric aircraft."The research that's been done in the last few years shows you could probably electrify smaller aircraft, but for big aircraft, it won't happen anytime soon without pretty major breakthroughs in battery technology," Barrett says. "So I thought, maybe we can take the electric propulsion part from electric aircraft, and the gas turbines that have been around for a long time and are super reliable and very efficient, and combine that with the emissions-control technology that's used in automotive and ground power, to at least enable semielectrified planes."Before airplane electrification had been seriously considered, it might have been possible to implement a concept such as this, for example as an add-on to the back of jet engines. 
But this design, Barrett notes, would "kill any stream of thrust" that a jet engine would produce, effectively grounding the design. Barrett's concept gets around this limitation by separating the thrust-producing propellers or fans from the power-generating gas turbine. The propellers or fans would instead be directly powered by an electric generator, which in turn would be powered by the gas turbine. The exhaust from the gas turbine would be fed into an emissions-control system, which could be folded up, accordion-style, in the plane's cargo hold -- completely isolated from the thrust-producing propellers. He envisions the bulk of the hybrid-electric system -- gas turbine, electric generator, and emissions control system -- would fit within the belly of a plane, where there can be ample space in many commercial aircraft. In their new paper, the researchers calculate that if such a hybrid-electric system were implemented on a Boeing 737 or Airbus A320-like aircraft, the extra weight would require about 0.6 percent more fuel to fly the plane. "This would be many, many times more feasible than what has been proposed for all-electric aircraft," Barrett says. "This design would add some hundreds of kilograms to a plane, as opposed to adding many tons of batteries, which would be an order of magnitude more weight." The researchers also calculated the emissions that would be produced by a large aircraft, with and without an emissions control system, and found that the hybrid-electric design would eliminate 95 percent of NOx emissions. If this system were rolled out across all aircraft around the world, they further estimate that 92 percent of pollution-related deaths due to aviation would be avoided. They arrived at this estimate by using a global model to map the flow of aviation emissions through the atmosphere, and calculated how much various populations around the world would be exposed to these emissions.
They then converted these exposures to mortalities, or estimates of the number of people who would die as a result of exposure to aviation emissions.The team is now working on designs for a "zero-impact" airplane that flies without emitting NOx and other chemicals like climate-altering carbon dioxide."We need to get to essentially zero net-climate impacts and zero deaths from air pollution," Barrett says. "This current design would effectively eliminate aviation's air pollution problem. We're now working on the climate impact part of it."
Pollution
2021
January 14, 2021
https://www.sciencedaily.com/releases/2021/01/210114111929.htm
Strategies to improve water quality
Illinois residents value efforts to reduce watershed pollution, and they are willing to pay for environmental improvements, according to a new study from agricultural economists at the University of Illinois.
Nutrient runoff from agricultural production is a major cause of pollution in the Mississippi River Basin and contributes to hypoxia -- limited oxygen to support sea life in the Gulf of Mexico. The U.S. Environmental Protection Agency (EPA) set up action plans to reduce pollution in 12 midwestern states and reduce transmissions of nitrate-nitrogen and phosphorus by 45% by 2040. Illinois agencies have established the Illinois Nutrient Loss Reduction Strategy (INLRS) to address those issues, relying on voluntary efforts as well as policy measures such as state subsidies. Understanding the level of support from local residents can help inform nutrient reduction initiatives. "We know water quality is important and water pollution has costs in terms of both health and economic damages," says Bryan Parthum, lead author on the paper. Parthum was a graduate student in the Department of Agricultural and Consumer Economics at U of I when the study was conducted; he now works as an economist in the EPA Office of Policy. "We wanted to find out how much people care about local water quality, fish populations, and algal blooms, and how much they care about meeting EPA targets which benefit the Gulf of Mexico," Parthum explains. The researchers surveyed respondents in the Upper Sangamon River Watershed in Central Illinois, an INLRS priority watershed due to high levels of nitrate and phosphorus transmission. Respondents completed a choice experiment survey containing different scenarios that described water quality improvements such as reducing algal blooms, increasing the overall fish population and number of fish species, and meeting EPA targets for reducing nutrient pollution. The scenarios also included information about proximity to the respondent, and a range of potential costs associated with those measures. The researchers could then estimate how much people were willing to pay for the various improvements.
For example, a 50% reduction of algal blooms in the watershed would, on average, be worth $32 annually per household."And if you combine reduced algal blooms, improved fish population and diversity, and meeting the nutrient targets, that whole bundle is worth $85 per household," Parthum says.Amy Ando, professor of agricultural and consumer economics at U of I and co-author of the paper, notes the Sangamon River isn't known as a tourist destination, yet still holds value for recreational purposes. Close to 50% of respondents indicated using the river for fishing, boating, or hiking at least once a year."The actions taken by the INLRS, if successful, will reduce hypoxia in the Gulf of Mexico, and people in Illinois care about doing that. But the strategies also have value for improvements in fish populations and reductions in algal blooms right here at home, so people get some local benefits as well," Ando notes.The survey included 373 respondents, about evenly divided between urban and rural residents. The researchers observed no difference in attitudes towards water policy measures among the two groups."There may be a perception that urban residents care more about environmental improvements. But we found both urban and rural groups expressed value for improved fish populations and water quality in the rivers that run through where they live," Ando says.The Department of Agricultural and Consumer Economics is in the College of Agricultural, Consumer and Environmental Sciences (ACES) at the University of Illinois.
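In a choice experiment like this, willingness to pay (WTP) for an attribute is typically recovered from a conditional logit model as the ratio of the attribute's utility coefficient to the (negative) cost coefficient. A hedged sketch with made-up coefficients, chosen only so the outputs echo the article's $32 and $85 figures; they are not the study's estimates:

```python
def wtp(beta_attr, beta_cost):
    """Marginal willingness to pay implied by a conditional logit:
    attribute coefficient divided by the negative of the cost coefficient."""
    return -beta_attr / beta_cost

beta_cost = -0.05  # utility per dollar of annual household cost (assumed)

# Assumed attribute coefficients (illustrative only)
betas = {"halve_algal_blooms": 1.60,
         "improve_fish": 1.50,
         "meet_nutrient_targets": 1.15}

wtp_algae = wtp(betas["halve_algal_blooms"], beta_cost)      # ~$32/yr
wtp_bundle = sum(wtp(b, beta_cost) for b in betas.values())  # ~$85/yr
```

The bundle value is just the sum of the attribute-level WTPs under this linear-utility assumption.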
Pollution
2021
January 13, 2021
https://www.sciencedaily.com/releases/2021/01/210113144458.htm
Early COVID-19 lockdown in Delhi had less impact on urban air quality than first believed
The first COVID-19 lockdowns led to significant changes in urban air pollution levels around the world, but the changes were smaller than expected, a new study reveals.
After developing new corrections for the impact of weather and seasonal trends, the researchers found that changes such as reduced NO2 were smaller than the raw measurements suggested. Led by experts at the University of Birmingham, the international team of scientists discovered that the beneficial reductions in NO2 were partly offset by increases in ozone. Publishing their findings today, lead-author Zongbo Shi, Professor of Atmospheric Biogeochemistry at the University of Birmingham, commented: "Rapid, unprecedented reduction in economic activity provided a unique opportunity to study the impact of interventions on air quality. Emission changes associated with the early lockdown restrictions led to abrupt changes in air pollutant levels but their impacts on air quality were more complex than we thought, and smaller than we expected." "Weather changes can mask changes in emissions on air quality. Importantly, our study has provided a new framework for assessing air pollution interventions, by separating the effects of weather and season from the effects of emission changes." Roy Harrison, Queen Elizabeth II Birmingham Centenary Professor of Environmental Health, a co-author, also commented on the smaller-than-expected reduction in NO2. William Bloss, Professor of Atmospheric Sciences, who is also a co-author, commented: "We found increases in ozone levels due to lockdown in all the cities studied. This is what we expect from the air chemistry, but this will counteract at least some of the health benefit from NO2 reductions." Scientists at Birmingham used machine learning to strip out weather impacts and seasonal trends before analysing the data -- site-specific hourly concentrations of key pollutants from December 2015 to May 2020. Air pollution is the single largest environmental risk to human health globally, contributing to 6.7 million deaths each year. The World Bank estimated that air pollution costs the global economy $3 trillion.
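The weather-normalization idea used here -- separating emission changes from weather and season -- can be sketched simply: train a model to predict a pollutant from meteorological and time features, then average its predictions over weather conditions resampled from the historical record while holding the time features fixed. A minimal, model-agnostic sketch; the `predict` callable stands in for the trained machine-learning model, and the toy model below is invented, not the Birmingham team's code:

```python
import random
import statistics

def deweather(predict, weather_history, time_features, n_samples=500):
    """Weather-normalized concentration at one time point: average the
    model's prediction over weather resampled (with replacement) from the
    historical record, holding the time (trend/season) features fixed."""
    resampled = random.choices(weather_history, k=n_samples)
    return statistics.mean(predict(w, time_features) for w in resampled)

# Toy stand-in for a trained model: NO2 rises with stagnant, cold weather.
toy_model = lambda w, t: 30.0 + 0.5 * w["stagnation"] - 0.2 * w["temp"] + t["trend"]
```

With the weather averaged out, any remaining change over time in the deweathered series is attributable to emissions rather than meteorology.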
Pollution
2021
January 13, 2021
https://www.sciencedaily.com/releases/2021/01/210113120714.htm
Northern lakes at risk of losing ice cover permanently, impacting drinking water
Close to 5,700 lakes in the Northern Hemisphere may permanently lose ice cover this century, 179 of them in the next decade, at current greenhouse gas emissions, despite a possible polar vortex this year, researchers at York University have found.
Those lakes include large bays in some of the deepest of the Great Lakes, such as Lake Superior and Lake Michigan, which could permanently become ice-free by 2055 if nothing is done to curb greenhouse gas emissions, or by 2085 with moderate changes. Many of these lakes that are predicted to stop freezing over are near large human populations and are an important source of drinking water. A loss of ice could affect the quantity and quality of the water. "We need ice on lakes to curtail and minimize evaporation rates in the winter," says lead researcher Sapna Sharma, an associate professor in the Faculty of Science. "Without ice cover, evaporation rates would increase, and water levels could decline. We would lose freshwater, which we need for drinking and everyday activities. Ice cover is extremely important both ecologically and socio-economically." The researchers, including Postdoctoral Fellows Kevin Blagrave and Alessandro Filazzola, looked at 51,000 lakes in the Northern Hemisphere to forecast whether those lakes would become ice-free using annual winter temperature projections from 2020 to 2098 with 12 climate change scenarios. "With increased greenhouse gas emissions, we expect greater increases in winter air temperatures, which are expected to increase much more than summer temperatures in the Northern Hemisphere," says Filazzola. "It's this warming of a couple of degrees, as a result of carbon emissions, that will cause the loss of lake ice into the future." The most at-risk lakes are those in southern and coastal regions of the Northern Hemisphere, some of which are amongst the largest lakes in the world. "It is quite dramatic for some of these lakes, that froze often, but within a few decades they stop freezing indefinitely," says Filazzola. "It's pretty shocking to imagine a lake that would normally freeze no longer doing so." The researchers found that when the air temperature was above -0.9 C, most lakes no longer froze.
For shallow lakes, the air temperature could be zero or a bit above. Larger and deeper lakes need colder temperatures than shallow lakes to freeze -- some as cold as -4.8 C. "Ice cover is also important for maintaining the quality of our freshwater," says Sharma. "In years where there isn't ice cover or when the ice melts earlier, there have been observations that water temperatures are warmer in the summer, there are increased rates of primary production, plant growth, as well as an increased presence of algal blooms, some of which may be toxic." To preserve lake ice cover, more aggressive measures to mitigate greenhouse gas emissions are needed now, says Sharma. "I was surprised at how quickly we may see this transition to permanent loss of ice cover in lakes that had previously frozen near consistently for centuries."
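The temperature thresholds reported above lend themselves to a simple rule of thumb. A hedged sketch -- the two cutoffs come straight from the article, but the function name and the binary deep/shallow split are simplifications of my own:

```python
def likely_stays_ice_free(mean_winter_air_temp_c, deep_lake=False):
    """Rule of thumb from the study: most lakes stop freezing once mean
    winter air temperature rises above about -0.9 C; large, deep lakes
    need colder air -- down to about -4.8 C -- before they reliably freeze."""
    threshold_c = -4.8 if deep_lake else -0.9
    return mean_winter_air_temp_c > threshold_c
```

For example, a winter averaging -2 C would still freeze a shallow lake but could leave a deep one such as a Lake Superior bay open.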
Pollution
2021
January 13, 2021
https://www.sciencedaily.com/releases/2021/01/210113090944.htm
Framework sheds light on nitrogen loss of producing common food items
The element nitrogen is a double-edged sword. It is essential for growing plants and feeding people, but it is also a leading cause of pollution across the world. Only by using nitrogen more sustainably can the positive and harmful effects of nitrogen be balanced.
Xia (Emma) Liang, a member of the American Society of Agronomy, studies nitrogen loss during food production.Liang and her team created a framework that accurately measures nitrogen loss across a wide variety of crops and food products. She recently presented their research at the virtual 2020 ASA-CSSA-SSSA Annual Meeting."This framework can capture the environmental impacts and societal costs of nitrogen losses," Liang explains. "This allows us to potentially provide information to inform consumers, producers, and policymakers."The team hopes this research will help make major progress in making agricultural systems across the world more sustainable, less polluting, and more profitable.Their framework measured both overall nitrogen loss and nitrogen loss intensity. The latter is the loss per unit of food or per unit of nitrogen produced. This allowed better comparisons across different crops and food items.For example, cereal grains have a low loss intensity but a high overall loss because they are grown in such large quantities. On the other hand, an animal product like buffalo meat has a high loss intensity but a low overall loss. This is due to the small amount produced.The framework reveals that the loss quantity and loss intensity vary a lot for different food products, especially when compared between farmers and countries. The database includes 115 crop and 11 livestock commodities at the global scale.Cattle contribute the most to global nitrogen pollution. They are followed by the production of rice, wheat, maize, pork, and soybeans. Beef is also the food with the highest loss intensity, followed by lamb, pork and other livestock products. Generally, the loss intensity of livestock is much larger than the loss intensity of crop products."The lowest nitrogen loss for the 11 livestock products exceeds that of vegetable substitutes," Liang says. 
"This confirms the importance of dietary change to reduce nitrogen loss through consumption."The nitrogen loss from fields can cause harm in multiple ways. It can cause smog and further climate change. It harms soil and water, as well as the plants and animals that live there. For humans, high levels of nitrogen in the air and water have been connected to illness.Liang highlights that with current activities, the planet's nitrogen boundary, a "safe operating space" for humanity, is exceeded by over two-fold.Solutions are complex. On farms themselves, there are many techniques to better manage nitrogen. These include better fertilizer technologies and practices, improved crop varieties, and following the "4 Rs." This means using the right fertilizer in the right amount at the right time in the right place. There are also ways to improve nitrogen management in livestock.However, Liang explains that on-the-farm solutions are only half the battle. An economic approach is also needed."An economic approach would provide incentives for adopting better nitrogen management practices," she says. "For instance, incentives should be given to promote sustainable measures to maintain the soil nitrogen. These include reducing the risk of soil degradation and erosion and the overuse of fertilizers."Individuals can also adopt helpful changes, she adds. Reducing consumption of meat and reducing food waste are two options. Another is having discussions about sustainable nitrogen management."When we buy a washing machine or a car, we can choose a more water efficient and energy efficient product by water and energy rating," Liang says. "However, despite growing recognition of the importance of nitrogen in sustainable food production and consumption, we don't follow a similar idea for foods we eat."
Pollution
2021
January 11, 2021
https://www.sciencedaily.com/releases/2021/01/210111143416.htm
Researchers report quantum-limit-approaching chemical sensing chip
University at Buffalo researchers are reporting an advancement of a chemical sensing chip that could lead to handheld devices that detect trace chemicals -- everything from illicit drugs to pollution -- as quickly as a breathalyzer identifies alcohol.
The chip, which also may have uses in food safety monitoring, anti-counterfeiting and other fields where trace chemicals are analyzed, is described in a study that appears on the cover of the journal's Dec. 17 edition. "There is a great need for portable and cost-effective chemical sensors in many areas, especially drug abuse," says the study's lead author Qiaoqiang Gan, PhD, professor of electrical engineering in the UB School of Engineering and Applied Sciences. The work builds upon previous research Gan's lab led that involved creating a chip that traps light at the edges of gold and silver nanoparticles. When biological or chemical molecules land on the chip's surface, some of the captured light interacts with the molecules and is "scattered" into light of new energies. This effect occurs in recognizable patterns that act as fingerprints of chemical or biological molecules, revealing information about what compounds are present. Because all chemicals have unique light-scattering signatures, the technology could eventually be integrated into a handheld device for detecting drugs in blood, breath, urine and other biological samples. It could also be incorporated into other devices to identify chemicals in the air or from water, as well as other surfaces. The sensing method is called surface-enhanced Raman spectroscopy (SERS). While effective, the chip the Gan group previously created wasn't uniform in its design.
Because the gold and silver nanoparticles were spaced unevenly, the scattered signatures of molecules could be difficult to identify, especially if they appeared on different locations of the chip. Gan and a team of researchers -- featuring members of his lab at UB, and researchers from the University of Shanghai for Science and Technology in China, and King Abdullah University of Science and Technology in Saudi Arabia -- have been working to remedy this shortcoming. The team used four molecules (BZT, 4-MBA, BPT, and TPT), each with different lengths, in the fabrication process to control the size of the gaps in between the gold and silver nanoparticles. The updated fabrication process is based upon two techniques, atomic layer deposition and self-assembled monolayers, as opposed to the more common and expensive method for SERS chips, electron-beam lithography. The result is a SERS chip with unprecedented uniformity that is relatively inexpensive to produce. More importantly, it approaches quantum-limit sensing capabilities, says Gan, which was a challenge for conventional SERS chips. "We think the chip will have many uses in addition to handheld drug detection devices," says the first author of this work, Nan Zhang, PhD, a postdoctoral researcher in Gan's lab. "For example, it could be used to assess air and water pollution or the safety of food. It could be useful in the security and defense sectors, and it has tremendous potential in health care."
Pollution
2021
January 5, 2021
https://www.sciencedaily.com/releases/2021/01/210105130110.htm
Sweat, bleach and gym air quality: Chemical reactions make new airborne chemicals
One sweaty, huffing, exercising person emits as many chemicals from their body as up to five sedentary people, according to a new University of Colorado Boulder study. And notably, those human emissions, including amino acids from sweat or acetone from breath, chemically combine with bleach cleaners to form new airborne chemicals with unknown impacts to indoor air quality.
"Humans are a large source of indoor emissions," said Zachary Finewax, CIRES research scientist and lead author of the new study out in the current edition of In 2018, the CU Boulder team outfitted a weight room in the Dal Ward Athletic Center -- a campus facility for university student athletes, from weightlifters to cheerleaders -- with a suite of air-sampling equipment. Instruments collected data from both the weight room and supply air, measuring a slew of airborne chemicals in real time before, during and after workouts of CU athletes. The team found the athletes' bodies produced 3-5 times the emissions while working out, compared to when they were at rest."Using our state-of-the-art equipment, this was the first time indoor air analysis in a gym was done with this high level of sophistication. We were able to capture emissions in real time to see exactly how many chemicals the athletes were emitting, and at what rate," said Demetrios Pagonis, postdoctoral researcher at CIRES and co-author on the new work.Many gym facilities frequently use chlorine bleach-based products to sanitize sweaty equipment. And while these cleaning products work to kill surface bacteria -- they also combine with emissions from sweat -- mixing to form a new cocktail of chemicals.The team was the first to observe a chemical group called N-chloraldimines -- a reaction product of bleach with amino acids -- in gym air. 
That meant chlorine from bleach cleaner sprayed onto equipment was reacting with the amino acids released from sweating bodies, the authors report.And although more research is needed to determine specific impacts this might have on indoor air quality, chemically similar reaction products of ammonia with bleach can be harmful to human health."Since people spend about 90 percent of our time indoors, it's critical we understand how chemicals behave in the spaces we occupy," said Joost de Gouw, CIRES Fellow, professor of chemistry at CU Boulder and corresponding author on the paper. Although the researchers collected all data for this study pre-pandemic, the team says their results illustrate that a modern gym with low occupancy and good ventilation may still be relatively safe for a workout, especially if masks are used.
Pollution
2021
January 5, 2021
https://www.sciencedaily.com/releases/2021/01/210105095621.htm
Catalyst transforms plastic waste to valuable ingredients at low temperature
For the first time, researchers have used a novel catalyst process to recycle a type of plastic found in everything from grocery bags and food packaging to toys and electronics into liquid fuels and wax.
The team published their results on Dec. 10. "Plastics are essential materials for our life because they bring safety and hygiene to our society," said paper co-authors Masazumi Tamura, associate professor in the Research Center for Artificial Photosynthesis in the Advanced Research Institute for Natural Science and Technology in Osaka City University, and Keiichi Tomishige, professor in the Graduate School of Engineering in Tohoku University. "However, the growth of the global plastic production and the rapid penetration of plastics into our society brought mismanagement of waste plastics, causing serious environmental and biological issues such as ocean pollution." Polyolefinic plastics -- the most common plastic -- have physical properties that make it difficult for a catalyst, responsible for inducing chemical transformation, to interact directly with the molecular elements to cause a change. Current recycling efforts require temperatures of at least 573 K, and up to 1,173 K. For comparison, water boils at 373.15 K, and the surface of the Sun is about 5,778 K. The researchers looked to heterogeneous catalysts in an effort to find a reaction that might require a lower temperature to activate. By using a catalyst in a different state of matter than the plastics, they hypothesized that the reaction would be stronger at a lower temperature. They combined ruthenium, a metal in the platinum family, with cerium dioxide, used to polish glass among other applications, to produce a catalyst that caused the plastics to react at 473 K.
While 473 K is still hot by everyday standards, it requires significantly less energy input than other catalyst systems. According to Tamura and Tomishige, ruthenium-based catalysts have never been reported in the scientific literature as a way to directly recycle polyolefinic plastics. "Our approach acted as an effective and reusable heterogeneous catalyst, showing much higher activity than other metal-supported catalysts, working even under mild reaction conditions," Tamura and Tomishige said. "Furthermore, a plastic bag and waste plastics could be transformed to valuable chemicals in high yields." The researchers processed a plastic bag and waste plastics with the catalyst, producing a 92% yield of useful materials, including a 77% yield of liquid fuel and a 15% yield of wax. "This catalyst system is expected to contribute to not only suppression of plastic wastes but also to utilization of plastic wastes as raw materials for production of chemicals," Tamura and Tomishige said.
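The temperature and yield figures quoted above can be sanity-checked with a few lines of arithmetic. This is a simple illustration: the 273.15 offset between kelvin and degrees Celsius is standard, and the yield numbers are those reported in the article.

```python
# Convert the reaction temperatures quoted above from kelvin to Celsius.

def kelvin_to_celsius(k: float) -> float:
    """Convert a temperature from kelvin to degrees Celsius."""
    return k - 273.15

conventional_min = kelvin_to_celsius(573)    # conventional pyrolysis, lower bound
conventional_max = kelvin_to_celsius(1173)   # conventional pyrolysis, upper bound
new_catalyst = kelvin_to_celsius(473)        # the Ru/CeO2 catalyst (~200 C)

# The reported product yields are consistent: liquid fuel plus wax
# cannot exceed the 92% total yield of useful materials.
liquid_fuel, wax = 77, 15
assert liquid_fuel + wax <= 92
```

The gap between roughly 200 C and the 300-900 C conventional range is where the claimed energy savings come from.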
Pollution
2021
January 4, 2021
https://www.sciencedaily.com/releases/2021/01/210104131938.htm
A robotic revolution for urban nature
Drones, robots and autonomous systems can transform the natural world in and around cities for people and wildlife.
International research, involving over 170 experts and led by the University of Leeds, assessed the opportunities and challenges that this cutting-edge technology could have for urban nature and green spaces. The researchers highlighted opportunities to improve how we monitor nature, such as identifying emerging pests and ensuring plants are cared for, and helping people engage with and appreciate the natural world around them. As robotics, autonomous vehicles and drones become more widely used across cities, pollution and traffic congestion may decrease, making towns and cities more pleasant places to spend time outside. But the researchers also warned that advances in robotics and automation could be damaging to the environment. For instance, robots and drones might generate new sources of waste and pollution themselves, with potentially substantial negative implications for urban nature. Cities might have to be re-planned to provide enough room for robots and drones to operate, potentially leading to a loss of green space. And they could also increase existing social inequalities, such as unequal access to green space. Lead author Dr Martin Dallimer, from the School of Earth and Environment at the University of Leeds, said: "Technology, such as robotics, has the potential to change almost every aspect of our lives.
As a society, it is vital that we proactively try to understand any possible side effects and risks of our growing use of robots and automated systems. "Although the future impacts on urban green spaces and nature are hard to predict, we need to make sure that the public, policy makers and robotics developers are aware of the potential pros and cons, so we can avoid detrimental consequences and fully realise the benefits." The research, published today in The researchers conducted an online survey of 170 experts from 35 countries, which they say provides a current best guess of what the future could hold. Participants gave their views on the potential opportunities and challenges for urban biodiversity and ecosystems arising from the growing use of robotics and autonomous systems. These are defined as technologies that can sense, analyse, interact with and manipulate their physical environment. This includes unmanned aerial vehicles (drones), self-driving cars, robots able to repair infrastructure, and wireless sensor networks used for monitoring. These technologies have a large range of potential applications, such as autonomous transport, waste collection, infrastructure maintenance and repair, policing and precision agriculture. The research was conducted as part of Leeds' Self Repairing Cities project, which aims to enable robots and autonomous systems to maintain urban infrastructure without causing disruption to citizens. First author Dr Mark Goddard conducted the work whilst at the University of Leeds and is now based at Northumbria University.
He said: "Spending time in urban green spaces and interacting with nature brings a range of human health and well-being benefits, and robots are likely to transform many of the ways in which we experience and gain benefits from urban nature. "Understanding how robotics and autonomous systems will affect our interaction with nature is vital for ensuring that our future cities support wildlife that is accessible to all." This work was funded by the Engineering and Physical Sciences Research Council (EPSRC).
Pollution
2021
January 4, 2021
https://www.sciencedaily.com/releases/2021/01/210104114105.htm
New data-driven global climate model provides projections for urban environments
Cities only occupy about 3% of the Earth's total land surface, but they bear the burden of the human-perceived effects of global climate change, researchers said. Global climate models are set up for big-picture analysis, leaving urban areas poorly represented. In a new study, researchers take a closer look at how climate change affects cities by using data-driven statistical models combined with traditional process-driven physical climate models.
The results of the research led by University of Illinois Urbana-Champaign engineer Lei Zhao are published in the journal Home to more than 50% of the world's population, cities experience more heat stress, water scarcity, air pollution and energy insecurity than suburban and rural areas because of their layout and high population densities, the study reports. "Cities are full of surfaces made from concrete and asphalt that absorb and retain more heat than natural surfaces and perturb other local-scale biophysical processes," said Zhao, a civil and environmental engineering professor and National Center for Supercomputing Applications affiliate. "Incorporating these types of small-scale variables into climate modeling is crucial for understanding future urban climate. However, finding a way to include them in global-scale models poses major resolution, scale and computational challenges." Global climate models project future scenarios by modeling how broader-scale processes like greenhouse gas emissions force the global climate to respond. By combining this technique with a statistical model that emulates a complex and detailed climate model for urban landscapes, Zhao's team confronted the urban-to-global information gap. The team applied its urban climate emulation technique to data from 26 global climate models under intermediate- and high-emissions scenarios.
This approach allowed researchers to turn model outputs into city-level projections of temperature and relative humidity through the year 2100, permitting quantification of projected climate change and its uncertainty. The model predicts that by the end of this century, average warming across global cities will increase by 1.9 degrees Celsius with intermediate emissions and 4.4 degrees Celsius with high emissions, with good agreement among existing climate models over certain regions, Zhao said. The projections also predicted a near-universal decrease in relative humidity in cities, making surface evaporation more efficient and implying that adaptation strategies like urban vegetation could be useful. "Our findings highlight the critical need for global projections of local urban climates for climate-sensitive urban areas," Zhao said. "This could give city planners the support they need to encourage solutions such as green infrastructure intervention to reduce urban heat stress on large scales." Currently, the projections do not account for the effects of future urban development. However, the researchers hypothesize that they can extend their strategy to make up for this. "The methodology, overall, is very flexible and can be adjusted to capture things like finer time scales and can even be applied to other ecosystems, like forests and polar regions, for example," Zhao said. The National Science Foundation and the Army Research Office supported this study.
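The core idea of an emulator — learning a cheap statistical map from a coarse global-model signal to a local urban response, then applying it to new projections — can be sketched in a few lines. This is a toy illustration only: the study's emulator is far more sophisticated, and the training pairs below are invented numbers, not data from the paper.

```python
# Toy emulator: fit a linear map from global-mean warming to urban
# warming, then apply it to a new global projection.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b (pure Python)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return a, b

# Invented training pairs: (global-mean warming, urban warming), deg C.
global_warming = [0.5, 1.0, 1.5, 2.0]
urban_warming = [0.7, 1.3, 1.9, 2.5]   # cities warm slightly faster here

a, b = fit_linear(global_warming, urban_warming)

# Apply the fitted emulator to a new global projection of 3.0 C:
projected_urban = a * 3.0 + b
```

The appeal of the approach is exactly this asymmetry: the expensive physical model is run once to generate training pairs, and the fitted statistical surrogate is then applied cheaply to many scenarios.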
Pollution
2021
January 4, 2021
https://www.sciencedaily.com/releases/2021/01/210104110418.htm
Surprising news: Drylands are not getting drier
New Columbia Engineering study -- first to investigate the long-term effect of soil moisture-atmosphere feedbacks in drylands -- finds that soil moisture exerts a negative feedback on surface water availability in drylands, offsetting some of the expected decline
New York, NY -- January 4, 2021 -- Scientists have thought that global warming will increase the availability of surface water -- freshwater resources generated by precipitation minus evapotranspiration -- in wet regions, and decrease water availability in dry regions. This expectation is based primarily on atmospheric thermodynamic processes. As air temperatures rise, more water evaporates into the air from the ocean and land. Because warmer air can hold more water vapor than cooler air, a more humid atmosphere is expected to amplify the existing pattern of water availability, causing the "dry-get-drier, and wet-get-wetter" atmospheric responses to global warming. A Columbia Engineering team led by Pierre Gentine, Maurice Ewing and J. Lamar Worzel professor of earth and environmental engineering and affiliated with the Earth Institute, wondered why coupled climate model predictions do not project significant "dry-get-drier" responses over drylands, tropical and temperate areas with an aridity index of less than 0.65, even when researchers use the high emissions global warming scenario. Sha Zhou, a postdoctoral fellow at Lamont-Doherty Earth Observatory and the Earth Institute who studies land-atmosphere interactions and the global water cycle, thought that soil moisture-atmosphere feedbacks might play an important part in future predictions of water availability in drylands. The new study, published today by "These feedbacks play a more significant role than realized in long-term surface water changes," says Zhou. "As soil moisture variations negatively impact water availability, this negative feedback could also partially reduce warming-driven increases in the magnitudes and frequencies of extreme high and extreme low hydroclimatic events, such as droughts and floods.
Without the negative feedback, we may experience more frequent and more extreme droughts and floods." The team combined a unique, idealized multi-model land-atmosphere coupling experiment with a novel statistical approach they developed for the study. They then applied the algorithm to observations to examine the critical role of soil moisture-atmosphere feedbacks in future water availability changes over drylands, and to investigate the thermodynamic and dynamic mechanisms underpinning future water availability changes due to these feedbacks. They found, in response to global warming, strong declines in surface water availability (precipitation minus evaporation, P-E) in dry regions over oceans, but only slight P-E declines over drylands. Zhou suspected that this phenomenon is associated with land-atmosphere processes. "Over drylands, soil moisture is projected to decline substantially under climate change," she explains. "Changes in soil moisture would further impact atmospheric processes and the water cycle." Global warming is expected to reduce water availability and hence soil moisture in drylands. But this new study found that the drying of soil moisture actually negatively feeds back onto water availability -- declining soil moisture reduces evapotranspiration and evaporative cooling, and enhances surface warming in drylands relative to wet regions and the ocean. The land-ocean warming contrast strengthens the air pressure differences between ocean and land, driving stronger winds and greater water vapor transport from the ocean to land. "Our work finds that soil moisture predictions and associated atmosphere feedbacks are highly variable and model dependent," says Gentine. "This study underscores the urgent need to improve future soil moisture predictions and accurately represent soil moisture-atmosphere feedbacks in models, which are critical to providing reliable predictions of dryland water availability for better water resources management."
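The two quantities the article leans on — surface water availability (P-E) and the aridity index used to delimit drylands (P/PET below 0.65) — are simple enough to state in code. A minimal sketch, with illustrative values rather than anything from the study:

```python
# P-E and the aridity-index dryland criterion, as defined in the text.

def water_availability(precip_mm: float, evapotrans_mm: float) -> float:
    """Surface water availability P - E, in mm per year."""
    return precip_mm - evapotrans_mm

def is_dryland(precip_mm: float, pet_mm: float) -> bool:
    """Dryland if the aridity index P/PET falls below 0.65."""
    return precip_mm / pet_mm < 0.65

# A hypothetical semi-arid site: 400 mm rain, 900 mm potential ET,
# 310 mm actual evapotranspiration.
net = water_availability(400, 310)   # 90 mm/yr of net supply
dry = is_dryland(400, 900)           # True: 400/900 ~= 0.44 < 0.65
```

Note the distinction the definitions encode: the aridity index uses *potential* evapotranspiration (atmospheric demand), while P-E uses the *actual* flux, which is itself limited by soil moisture — the feedback at the heart of the study.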
Pollution
2021
December 23, 2020
https://www.sciencedaily.com/releases/2020/12/201223125738.htm
Plastic is blowing in the wind
As the plastic in our oceans breaks up into smaller and smaller bits without breaking down chemically, the resulting microplastics are becoming a serious ecological problem. A new study at the Weizmann Institute of Science reveals a troubling aspect of microplastics -- defined as particles smaller than 5 mm across. They are swept up into the atmosphere and carried on the wind to far-flung parts of the ocean, including those that appear to be clear. Analysis reveals that such minuscule fragments can stay airborne for hours or days, spreading the potential to harm the marine environment and, by climbing up the food chain, to affect human health.
"A handful of studies have found microplastics in the atmosphere right above the water near shorelines," says Dr. Miri Trainic, in the groups of Prof. Ilan Koren of the Institute's Earth and Planetary Sciences Department in collaboration with that of Prof. Yinon Rudich of the same department, and Prof. Assaf Vardi of the Institute's Plant and Environmental Sciences Department. "But we were surprised to find a non-trivial amount above seemingly pristine water." Koren and Vardi have been collaborating for a number of years on studies designed to understand the interface between ocean and air. While the way the oceans absorb materials from the atmosphere has been well studied, the process in the opposite direction -- aerosolization, in which volatiles, viruses, algal fragments and other particles are swept from seawater into the atmosphere -- had been much less investigated. As part of this ongoing effort, aerosol samples were collected for study in the Weizmann labs during the 2016 run of the Tara research vessel, a schooner on which several international research teams at a time come together to study the effects of climate change, primarily on marine biodiversity. The Weizmann team affixed the inlet of their measuring equipment to the top of one of the Tara's masts (so as to avoid any aerosols produced by the schooner itself) and Dr. J. Michel Flores, of Koren's group, joined the mission to tend to the collecting as the schooner sailed across the North Atlantic Ocean. Identifying and quantifying the microplastic bits trapped in their aerosol samples was far from easy, as the particles turned out to be hard to pick out under the microscope. To understand exactly what plastic was getting into the atmosphere, the team conducted Raman spectroscopy measurements with the help of Dr. Iddo Pinkas of the Institute's Chemical Research Support to determine their chemical makeup and size.
The researchers detected high levels of common plastics -- polystyrene, polyethylene, polypropylene and more -- in their samples. Then, calculating the shape and mass of the microplastic particles, along with the average wind directions and speeds over the oceans, the team showed that the source of these microplastics was most likely the plastic bags and other plastic waste that had been discarded near the shore and made its way into the ocean hundreds of kilometers away. Checking the seawater beneath the sample sites showed the same type of plastic as in the aerosol, providing support for the idea that microplastics enter the atmosphere through bubbles on the ocean surface or are picked up by winds, and are transported on air currents to remote parts of the ocean. "Once microplastics are in the atmosphere, they dry out, and they are exposed to UV light and atmospheric components with which they interact chemically," says Trainic. "That means the particles that fall back into the ocean are likely to be even more harmful or toxic than before to any marine life that ingests them." "On top of that," adds Vardi, "some of these plastics become scaffolds for bacterial growth for all kinds of marine bacteria, so airborne plastic could be offering a free ride to some species, including pathogenic bacteria that are harmful to marine life and humans." "The real amount of microplastic in the ocean aerosols is almost certainly greater than what our measurements showed, because our setup was unable to detect those particles below a few micrometers in size," says Trainic. "For example, in addition to plastics that break down into even smaller pieces, there are the nanoparticles that are added to cosmetics and which are easily washed into the ocean, or are formed in the ocean through microplastic fragmentation." Size, in the case of plastic particles, does matter, not only because lighter ones may stay airborne for longer periods.
When they do land on the water's surface, they are more likely to be eaten by equally small marine life, which, of course, cannot digest them. Thus, every one of these particles has the potential to harm a marine organism or to work its way up the food chain and into our bodies. "Last, but not least, like all aerosols, microplastics become part of the large planetary cycles -- for example, carbon and oxygen -- as they interact with other parts of the atmosphere," says Koren. "Because they are both lightweight and long-lived, we will be seeing more microplastics transported in the air as the plastics that are already polluting our oceans break up -- even if we do not add any further plastics to our waterways," he adds.
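Why particle size governs airborne residence time can be sketched with Stokes' law for small settling spheres — a textbook simplification, not necessarily the method the Weizmann team used, and the property values below are rough literature figures chosen for illustration:

```python
# Stokes settling: terminal speed of a small sphere scales with radius^2,
# so halving the diameter quarters the fall speed and multiplies the
# time aloft accordingly. Simplified: ignores humidity, shape, turbulence.

G = 9.81                 # gravitational acceleration, m/s^2
AIR_VISCOSITY = 1.8e-5   # dynamic viscosity of air, Pa*s (approx.)
AIR_DENSITY = 1.2        # kg/m^3 (approx., sea level)

def stokes_settling_velocity(radius_m: float, particle_density: float) -> float:
    """Terminal settling speed (m/s) of a small sphere in still air."""
    return (2.0 / 9.0) * (particle_density - AIR_DENSITY) * G * radius_m**2 / AIR_VISCOSITY

# Polyethylene (~950 kg/m^3): a 10-micrometre particle vs a 1-micrometre one.
v_10um = stokes_settling_velocity(5e-6, 950.0)    # a few mm/s
v_1um = stokes_settling_velocity(0.5e-6, 950.0)   # 100x slower
```

The quadratic dependence on radius is the point: the sub-micrometre particles the team's setup could not detect would settle orders of magnitude more slowly, consistent with the remark that the real airborne load is almost certainly underestimated.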
Pollution
2020
December 21, 2020
https://www.sciencedaily.com/releases/2020/12/201221160416.htm
What if clean air benefits during COVID-19 shutdown continued post-pandemic?
A new study by Columbia University Mailman School of Public Health researchers poses a hypothetical question: What if air quality improvements in New York City during the spring 2020 COVID-19 shutdown were sustained for five years without the economic and health costs of the pandemic? They estimate cumulative benefits of clean air during this period would amount to thousands of avoided cases of illness and death in children and adults, as well as associated economic benefits of between $32 billion and $77 billion. The study's findings are published in the journal
The researchers leveraged the unintended "natural experiment" of cleaner air in New York City during the COVID-19 shutdown to simulate the potential future health and economic benefits from sustained air quality improvements of a similar magnitude. They do not frame this study as an estimate of the benefits of the pandemic. Rather they offer this hypothetical clean air scenario as an aspirational goal for policies to reduce emissions, largely from fossil fuel combustion. Exploratory analyses found that neighborhoods with higher percentages of low-income residents or higher percentages of Black or Latinx residents tended to have proportionally higher benefits from reduced PM2.5 concentrations when compared to neighborhoods with lower levels of poverty or Black or Latinx populations. However, this does not mean that the disparity in health outcomes across neighborhoods would be eliminated under this scenario because underlying risk factors would still remain. The researchers also caution that limited air quality monitors and available data during the shutdown period constrained their ability to assess the impact of improved air quality on health disparities across neighborhoods. Air quality improvements during the New York City spring shutdown were the result of an estimated 60-percent decline in automobile traffic, as well as declines in air traffic, construction, restaurant operation, and electricity generation. "Air quality improvements from the shutdown happened as the result of a tragic situation.
However, our hypothetical clean air scenario could be achieved through air pollution and climate mitigation policies, including those that support low carbon modes of transportation and reduce emissions in other sectors," says study first author Frederica Perera, DrPH, PhD, director of translational research at the Columbia Center for Children's Environmental Health and professor of environmental health sciences at Columbia Mailman School. The researchers estimated a citywide 23-percent reduction in fine particulate matter (PM2.5) concentrations during the COVID-19 shutdown period (March 15-May 15, 2020) compared to the average level for those months in 2015-2018 (the business-as-usual period) using air quality monitoring data from the New York State Department of Environmental Conservation. Based on 2020 data, they extrapolated ambient levels of PM2.5 for a five-year period. They then used BenMAP, a publicly available computer tool supported by the U.S. Environmental Protection Agency, to estimate the number of avoided air pollution-related illnesses and deaths and quantify their economic value using methods the researchers developed in earlier research. Specifically, they estimate potential avoided cases of infant and adult mortality, adverse birth outcomes, autism spectrum disorder, and childhood asthma. Co-authors include Alique Berberian and Thomas Matte at Columbia Mailman; David Cooley and Elizabeth Shenaut at Abt Associates, Durham, North Carolina; and Hollie Olmstead and Zev Ross at Zev Ross Spatial Analysis, Ithaca, NY. This study was supported by grants from the Environmental Protection Agency (RD83615401) and the National Institute of Environmental Health Sciences (ES09600), the John and Wendy Neu Foundation, the John Merck Fund, and New York Community Trust.
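Health-impact tools of the BenMAP kind commonly rest on a log-linear concentration-response function: avoided cases scale with population, a baseline incidence rate, and an exponential term in the pollution reduction. The sketch below shows that generic form only — the beta coefficient, baseline rate, and population are invented placeholders, not the study's inputs or results.

```python
import math

# Generic log-linear concentration-response function (the form used by
# BenMAP-style tools). All numeric inputs below are hypothetical.

def avoided_cases(pop: int, baseline_rate: float, beta: float, delta_pm25: float) -> float:
    """Estimated avoided annual cases for a PM2.5 drop of delta_pm25 ug/m^3."""
    return pop * baseline_rate * (1.0 - math.exp(-beta * delta_pm25))

# Hypothetical inputs: 8.4M residents, 0.8% baseline annual incidence,
# beta of 0.006 per ug/m^3, and a sustained 2 ug/m^3 PM2.5 reduction.
estimate = avoided_cases(8_400_000, 0.008, 0.006, 2.0)   # ~800 cases/yr
```

For small reductions the exponential is nearly linear, so avoided cases grow roughly in proportion to the concentration drop — which is why a modest citywide reduction, sustained over five years, can accumulate into the large totals the study reports.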
Pollution
2020
December 18, 2020
https://www.sciencedaily.com/releases/2020/12/201218112509.htm
Potentially damaging surface ozone levels rose in lockdown, UK study finds
Less traffic on the roads during the first lockdown led to a reduction in air pollution but may have caused potentially damaging surface ozone levels to rise, a new study has revealed.
The study -- led by the University of York -- shows levels of nitrogen dioxide (NO2) fell during the lockdown while levels of surface ozone (O3) rose. Surface, or ground-level, ozone can trigger a variety of health problems, particularly for children, the elderly, and people of all ages who have lung diseases such as asthma. Scientists believe the warm and sunny spring weather may have been a contributing factor. The report concludes that if the Covid-19 lockdown is taken as an example of how air quality will respond to future reductions in vehicle emissions -- with more electric vehicles being introduced -- it serves as a warning that the problem of O3 pollution could worsen. Professor James Lee from the Department of Chemistry and the National Centre for Atmospheric Science said O3 levels increased during the first lockdown. Professor Lee added: "The problem is being created by the change in chemistry between NOx (nitrogen oxides) and O3. London, Chilton in Oxfordshire and Camborne in Cambridgeshire saw increases of around 50% compared to previous years, with Glasgow and Inverness showing smaller increases of around 30%. These results are a cautionary tale. As well as looking at how we reduce levels of nitrogen dioxide by cutting diesel and petrol emissions, we also need to keep an eye on what is happening with ozone so we don't create other forms of pollution dangerous to human health." The report says that in China, nitrogen oxide reductions have also led to increases in O3. Professor Lee added: "Our research shows it will be vital to control man-made VOCs to avoid losing any health gains made by the reduction of NO2." Data was collected from 175 Automatic Urban and Rural Network (AURN) traffic monitoring sites across the UK between 23rd March and 31st May 2020 and compared with figures from the previous five years.
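The comparison described above — lockdown levels against figures from the previous five years — reduces to a percent change relative to a multi-year baseline mean. A minimal sketch; the ozone values are made up for illustration, not AURN data:

```python
# Percent change of a 2020 measurement against a five-year baseline mean.

def percent_change(current: float, baseline_values: list) -> float:
    """Percent change of `current` relative to the mean of a baseline."""
    baseline = sum(baseline_values) / len(baseline_values)
    return 100.0 * (current - baseline) / baseline

# Hypothetical mean springtime O3 (ug/m^3) at one site, 2015-2019, then 2020:
previous_years = [58.0, 60.0, 59.0, 61.0, 62.0]  # baseline mean: 60.0
lockdown_2020 = 90.0
change = percent_change(lockdown_2020, previous_years)  # 50.0 (%)
```

Averaging over several prior years, as the study does, damps out any single unusual year in the baseline before the lockdown anomaly is computed.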
Pollution
2020
December 17, 2020
https://www.sciencedaily.com/releases/2020/12/201217140209.htm
Taking greenhouse gas analysis on the road, er, rails
Research-grade air quality sensors are costly -- around $40,000. For cities trying to monitor their greenhouse gas emissions, the cost may limit the number of sensors they can install and the data they can collect.
Unless... Since 2014, the University of Utah has maintained research-grade suites of air quality instruments installed on light rail trains that move throughout the Salt Lake Valley every day. These mobile sensors, researchers estimate in a new study, cover the same area as 30 stationary sensors, providing the Salt Lake Valley with a highly cost-effective way to monitor its greenhouse emissions and fill in gaps in emissions estimates. The study is published in Environmental Science & Technology. "Pollutant levels in the atmosphere are going to be rapidly changing in the coming decade as clean energy technologies are deployed," says Logan Mitchell, research assistant professor of atmospheric sciences, and a co-author of the study. "Cost-effective atmospheric monitoring will help policymakers understand what policies lead to reductions in pollutant levels, where there needs to be more focus, and if there are environmental inequalities emerging as some areas reduce their emissions faster than other areas." The story of mounting sensors on the trains of the Utah Transit Authority's TRAX system begins in 2009 with then-doctoral student Heather Holmes (now an associate professor of chemical engineering). Holmes installed a particulate matter sensor on a train, but only for a short period of time. When Mitchell arrived at the U as a postdoctoral scholar in 2013, he discussed reviving Holmes' project with faculty advisors Jim Ehleringer, distinguished professor of biology, and John Lin, professor of atmospheric sciences and a co-author of the current study. With support from UTA, Mitchell ran a preliminary study in 2014. The first test placed air inlet tubes out the window of an unoccupied driver's cab.
Mitchell noticed a small CO2 enhancement from people waiting on the train platforms. The test was a success, and Mitchell partnered with professor John Horel's research group to launch a full-fledged research effort to monitor air quality and greenhouse gases -- this time with the sensors on the roof of the train so they aren't affected by people waiting on the train platforms. Now the program has expanded to additional TRAX lines, and ongoing state funding supports the air quality monitoring while additional funding from the National Oceanic and Atmospheric Administration supported this study on greenhouse gas emissions. The study evaluates the TRAX air sensors as a top-down measurement of greenhouse gas emissions. "Top-down" analysis means measuring the atmospheric concentration, and then figuring out where the emissions come from. Another approach, "bottom-up" analysis, inventories all the possible emissions sources and adds them together to estimate the total. "Top-down measurements allow us to evaluate if the bottom-up emission inventories are accurate," says Derek Mallia, lead author of the study and research assistant professor of atmospheric sciences. "If an emission inventory is off by a little bit or is missing an emissions source, the top-down approach gives us a way to figure that out." NASA satellites can also be used to produce top-down emission estimates for cities around the world, an effort Lin and his group are also pursuing. "These satellite measurements are useful for assessing whole cities and for cities that lack ground observations," Lin says, "but the TRAX-based sensors allow for more granularity in emissions throughout the city and can complement the space-based observations." Top-down measurements of this type over a large area can focus in on particular elements of a city's emissions inventory to identify ways that the inventory needs to change. "A really simple example of this would be looking at on-road emissions," Mallia says.
The researchers found underestimates of on-road emissions by bottom-up inventories, which, if observed by only a stationary sensor near a single main road, would suggest only potential underestimations for that particular road. But if on-road emissions are being underestimated consistently over an entire city, Mallia says, "fundamentally, this tells us that we are not accounting for something about on-road emissions, in general. This could be really important to understand as more and more people start driving electric vehicles that have zero tailpipe emissions." As cities work to reduce environmental inequalities, mobile air monitoring can also help monitor if some urban areas' air is improving faster than others, Mitchell adds. "The TRAX-based measurements, combined with the network of stationary sites, means that Salt Lake City is one of the best-instrumented cities in the world in terms of pollution observations," Lin says. To the researchers' knowledge, air quality sensors have been installed on public transport platforms in only a handful of cities in Europe. But the same approach could be used in any city with similar light rail systems -- Portland, Oregon and Denver, for example. In cities with rail systems that run partially or entirely underground, sensors could be mounted on electric buses. The cost savings of such an approach are staggering. One research-grade mobile sensor costing $40,000, the authors find, can cover the same area as around 30 stationary sensors costing upwards of $1.2 million. "This excludes the manpower needed to maintain a 30-station network, which would be immense," Mitchell says. "Long story short -- based on our preliminary analysis, semi-continuous mobile measurements on public transit vehicles are a very cost-effective strategy for monitoring emissions in cities."
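The cost comparison in the passage above is straightforward arithmetic, laid out here using the figures the article reports (the $40,000 unit cost and the ~30-sensor equivalence are the authors' estimates):

```python
# One mobile research-grade sensor vs. the stationary network the
# authors estimate it replaces, equipment cost only (no maintenance labor).

MOBILE_SENSOR_COST = 40_000       # dollars, one research-grade unit
STATIONARY_SENSOR_COST = 40_000   # same unit cost assumed per fixed site
STATIONARY_EQUIVALENT = 30        # fixed sensors matched by one mobile unit

stationary_network_cost = STATIONARY_SENSOR_COST * STATIONARY_EQUIVALENT
savings_factor = stationary_network_cost / MOBILE_SENSOR_COST

# stationary_network_cost is 1200000 -- the "$1.2 million" quoted above;
# savings_factor is 30, before counting the labor to maintain 30 sites.
```

As Mitchell notes, the real gap is even wider, since a 30-station network also carries ongoing staffing and maintenance costs that a single train-mounted unit does not.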
Pollution
2020
December 17, 2020
https://www.sciencedaily.com/releases/2020/12/201217135317.htm
Fertilizer runoff in streams and rivers can have cascading effects, analysis shows
Fertilizer pollution can have significant ripple effects in the food webs of streams and rivers, according to a new analysis of global data. The researchers also found some detection methods could miss pollution in certain types of streams.
The analysis, published in "Overall, we found that high levels of nutrients affect streams and rivers everywhere," said the study's lead author Marcelo Ardón, associate professor of forestry and environmental resources at North Carolina State University. "Wherever we looked, we saw increases in the abundance and biomass of organisms that live in streams, and also the speeding up of processes that happen in streams -- how fast algae grow, how fast leaves decompose, and how fast organisms grow that feed on them." Across the studies, the researchers saw that nitrogen and phosphorus led to increased growth across the food web, such as in algae, the insects that eat the algae and the fish that eat the insects. In shaded streams where algae do not grow, they reported, nitrogen and phosphorus sped decomposition of leaves and boosted growth of organisms that feed on them. "We saw an average 48 percent increase overall in biomass abundance and activity in all levels of the food web," Ardón said. "We also found that the food webs responded most strongly when both nitrogen and phosphorus were added together." While experts already use the presence of a specific type of chlorophyll -- chlorophyll a -- in water to detect algae growth, researchers said using that method could miss pollution in waterways where algae do not grow, and where decomposition of leaves or other plant matter is the primary source of food for other organisms. "The food webs in those streams don't depend on algae -- the trees shade out the algae," Ardón said. "The streams there depend on leaves that fall in and decompose, which is what the insects, such as caddisflies and stoneflies, are eating.
In those detrital-based streams, we found similar responses to increases in nitrogen and phosphorus as has been found in algae." Another finding was that factors such as light, temperature, and baseline concentrations of nitrogen and phosphorus impacted the response to increases in the two nutrients. "All of those things will determine how much of a response you get to increased nitrogen and phosphorus," said study co-author Ryan Utz of Chatham University. The findings have implications for environmental policy, Ardón said. "The EPA has been asking states to come up with ways to reduce runoff of nitrogen and phosphorus into streams, because we know they can cause these really big problems," said Ardón. "We know that at a big scale, and we don't really know the details. A lot of states that are coming up with criteria to reduce the amount of nutrients in the water focus only on algal responses. Our study suggests regulators should expand their view."
Pollution
2,020
December 17, 2020
https://www.sciencedaily.com/releases/2020/12/201217112936.htm
The most consumed species of mussels contain microplastics all around the world
"If you eat mussels, you eat microplastics." This was already known to a limited extent about mussels from individual ocean regions. A new study by the University of Bayreuth, led by Prof. Dr. Christian Laforsch, reveals that this claim holds true globally. The Bayreuth team investigated the microplastic load of four mussel species which are particularly often sold as food in supermarkets from twelve countries around the world. The scientists now present their research results in the journal
All the samples analyzed contained microplastic particles, and the researchers detected a total of nine different types of plastic. Polypropylene (PP) and polyethylene terephthalate (PET) were the most common types of plastic. Both are plastics ubiquitous in people's everyday lives all over the world. To make the analyses of different-sized mussels comparable, one gram of mussel meat was used as a fixed reference. According to the study, one gram of mussel meat contained between 0.13 and 2.45 microplastic particles. Mussel samples from the North Atlantic and South Pacific were the most contaminated. Because mussels filter out microplastic particles from the water in addition to food particles, a microplastic investigation of the mussels allows indirect conclusions to be drawn about pollution in their respective areas of origin. The four species of mussels sampled were the European blue mussel, the greenshell mussel, the undulate venus, and the Pacific venus clam. All of the mussels sampled were purchased from grocery stores. Some of them had been farmed while some were wild catch from the North Sea, the Mediterranean Sea, the Atlantic Ocean, the South Pacific Ocean, the South China Sea, and the Gulf of Thailand. The microplastic particles detected in the mussels were between three and 5,000 micrometres (0.003 to five millimetres) in size. Special enzymatic purification was followed by chemical analysis of the particles via micro-Fourier transform infrared spectrometry (micro-FTIR) and Raman spectroscopy. "To analyze the types of microplastic, we used so-called random forest algorithms for the first time in this study, both for the immensely large micro-FTIR data sets and for the Raman measurement data. These enabled us to evaluate data quickly, automatically, and reliably," says Dr. Martin Löder, head of the plastics working group at the chair of Prof. Dr.
Christian Laforsch. Indeed, the contamination of different organisms with microplastics has been investigated in earlier research. However, the results available to date can only be compared with each other to a very limited extent, because the studies often used different analytical methods. "Our new study represents an important advance in terms of methodology. We have combined the latest technologies and procedures in sample preparation, measurement, and analysis of microplastic contamination in such a way that comparable results can be obtained on this basis in the future. Such methodological harmonization is an indispensable prerequisite for correctly assessing and evaluating risks potentially emanating from the spread of microplastics in the environment," says Prof. Dr. Christian Laforsch, spokesperson for the "Microplastics" Collaborative Research Centre at the University of Bayreuth, and Chair of Animal Ecology I.
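The random-forest classification Löder describes can be sketched as follows, using synthetic stand-in "spectra" rather than real micro-FTIR data: each polymer class is given one characteristic absorption peak, and a forest learns to separate them. The peak positions, class labels, and noise level are all illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy spectrum: one Gaussian absorption peak plus measurement noise.
# Real micro-FTIR spectra have many bands across hundreds of wavenumber channels.
def make_spectrum(peak_channel, n_channels=200):
    x = np.arange(n_channels)
    return np.exp(-((x - peak_channel) ** 2) / 50.0) + rng.normal(0, 0.05, n_channels)

polymers = {"PP": 60, "PET": 120, "PE": 160}  # peak positions are illustrative
X = np.array([make_spectrum(peak) for peak in polymers.values() for _ in range(50)])
y = np.repeat(list(polymers), 50)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

The appeal of random forests for this task is that they handle thousands of correlated spectral channels without manual band selection, which is what makes automatic evaluation of large micro-FTIR data sets feasible.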
Pollution
2,020
December 16, 2020
https://www.sciencedaily.com/releases/2020/12/201216085035.htm
New study links cadmium to more severe flu, pneumonia infections
High levels of cadmium, a chemical found in cigarettes and in contaminated vegetables, are associated with higher death rates in patients with influenza or pneumonia -- and may increase the severity of COVID-19 and other respiratory viruses, according to a new study.
"Our study suggests the public in general, both smokers and nonsmokers, could benefit from reduced exposure to cadmium," said lead author Sung Kyun Park, associate professor of epidemiology and environmental health sciences at the University of Michigan School of Public Health.Long-term exposure to cadmium, even at low levels, may undermine our defense system in the lungs, and people with high levels of the chemical may not be able to cope with influenza virus attacks, Park said.The study by researchers at U-M, the University of Southern California and the University of Washington is published in the December issue of "The associations we found need to be verified in other populations and also studied with respect to cadmium's potential impact on COVID-19 related morbidity and mortality," said senior author Howard Hu, professor and chair of USC's Department of Preventive Medicine and an occupational/environmental physician."Unfortunately, the human body finds it much more difficult to excrete cadmium than other toxic metals, and its presence in many nutritious foods means it is critical to continue reducing sources of environmental pollution that contribute to its presence in air, soil and water."Early in the pandemic, as data was starting to come out of Wuhan, China, a large percentage of people dying from the coronavirus shared a few characteristics -- they were male, smokers and older.That prompted Finnish researcher Matti Sirén, a co-author of the study, to reach out to Park and Hu, who a decade ago had conducted a comprehensive study on the impact of cadmium on chronic diseases, including lung and cardiovascular diseases.Interested in looking into the association between cadmium and COVID-19, but understanding that little data would be available to look at this link, the researchers instead focused on studying the potential association of cadmium to other viral infections: influenza and pneumonia.The researchers utilized data from the U.S. 
National Health and Nutrition Examination Survey from 1988-1994 and 1999-2006. NHANES is conducted by the National Center for Health Statistics and provides nationally representative survey data on the health and nutritional status of the noninstitutionalized U.S. population.Nearly 16,000 participants in the two separate cohorts were used for the analysis. Cadmium was measured in urine in the first survey and blood in the second. And because tobacco has more than 3,000 chemical components, researchers also looked at cadmium levels in nonsmokers.After adjusting for age, sex, race/ethnicity, education, body mass index, serum cholesterol and hypertension, researchers found that patients with cadmium levels in the 80th percentile were 15% more likely to die of influenza or pneumonia compared to those in the 20th percentile.Among those who never smoked, the difference was even greater with a 27% higher risk of mortality among those in the 80th percentile compared to the 20th percentile."We couldn't directly look at cadmium body burden among COVID-19 patients in the early pandemic," Park said. "Our motivation was to find a modifiable risk factor that can predispose people with COVID-19 infection to develop a severe complication and die of COVID-19."COVID-19 may not be a one-time event. Our findings suggest that the public can benefit from reduced cadmium exposure when the next pandemic occurs. This cannot be done suddenly and takes time through policy changes."In the meantime, Park said smokers should stop smoking. And everyone should be aware of the major sources of cadmium in their diets: cereal, rice, animal organs such as the liver and kidneys, soybeans and some types of leafy vegetables.There are many other sources of vitamins, he said. 
Cruciferous vegetables, such as cabbage and broccoli, contain high levels of antioxidants but relatively low levels of cadmium."This isn't a recommendation for a draconian change in lifestyle, since many of these foods are typical staples of a balanced, nutritious diet, and their overall contribution to cadmium burden is likely modest," Hu said. "Rather, the suggestion is to consider some shifts in choices."Meanwhile, epidemiologists need to focus on the issue we raised. Increased scrutiny is needed of sources of cadmium exposure and surveillance of cadmium levels in the general population, and policymakers need to work on continuing to reduce environmental cadmium pollution."The study was supported by the U-M's Lifestage Environmental Exposures and Disease Center, a National Institute of Environmental Health Sciences Core Center.
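The 80th-versus-20th-percentile contrast reported above is typically derived from a fitted survival-regression coefficient. A sketch of how such a contrast is computed; the coefficient and percentile concentrations below are illustrative assumptions chosen to reproduce a roughly 15% contrast, not the study's estimates:

```python
import math

# Assumed Cox-style coefficient: log-hazard increase per doubling of urinary
# cadmium after covariate adjustment (illustrative, not the study's value).
beta_per_doubling = 0.065

# Assumed cadmium concentrations (ug/g creatinine) at the two percentiles
cd_p20, cd_p80 = 0.18, 0.80

doublings = math.log2(cd_p80 / cd_p20)
hazard_ratio = math.exp(beta_per_doubling * doublings)
print(f"HR, 80th vs 20th percentile: {hazard_ratio:.2f} "
      f"({100 * (hazard_ratio - 1):.0f}% higher influenza/pneumonia mortality)")
```

Reporting the contrast between two percentiles of the exposure distribution, rather than the raw coefficient, is a common way to make such regression results interpretable for a general audience.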
Pollution
2,020
December 16, 2020
https://www.sciencedaily.com/releases/2020/12/201215164921.htm
Urban land and aerosols amplify hazardous weather, steer storms toward cities
Urban landscapes and human-made aerosols -- particles suspended in the atmosphere -- have the potential to not only make gusts stronger and hail larger; they can also start storms sooner and even pull them toward cities, according to new research exploring the impact of urban development on hazardous weather, led by scientists at the U.S. Department of Energy's Pacific Northwest National Laboratory.
By modeling two thunderstorms -- one near Houston, Texas, and another in Kansas City, Missouri -- atmospheric scientist Jiwen Fan teased out the separate and synergistic effects that urban landscapes and human-caused aerosols can have on storms, rain and hail. In the case of the Kansas City storm, urban land and aerosols worked together to amplify the frequency of large hail by roughly 20 percent. In Houston, an otherwise gentler thunderstorm saw amplified, longer-lasting rainfall that developed sooner, among other changes. Fan shared her findings at the American Geophysical Union's 2020 fall meeting on Tuesday, Dec. 1, and answered questions virtually on Tuesday, Dec. 15. "The novelty of our study is that we consider both urban land and aerosols together," said Fan, "instead of their separate impacts." In previous work, researchers have shown that urban land shapes weather, both through its topographical nature and the heat it produces. Cities are often warmer than their surroundings, because buildings not only absorb and retain the sun's heat differently than trees and agricultural land, but also block wind flow. Yet many studies focus primarily on how cities and aerosols change precipitation and temperature, or only examine the influence of those factors separately, rather than their joint effect. Fan modeled two very different types of storms: Kansas City's violent, rotating, hail-filled thunderstorm, and Houston's gentler, sea breeze-induced thunderstorm. She simulated multiple versions of the same storms, with and without cities and aerosols present, to isolate the effects of these two distinct factors. In Houston, afternoon showers swelled as urban land and aerosols worked synergistically to amplify rainfall. Compared to simulations without cities, rain drenched Houston roughly a half-hour sooner, increasing its total by an extra 1.5 millimeters.
Sea breeze winds blew stronger as well, whipped up by the influence of urban land.When cooler, denser air from the sea breeze flowed toward Houston, it brought moisture with it and clashed with warmer, lighter city air. The two mixed upon meeting, creating stronger convection compared to simulations without urban land.Houston's thunderstorm clouds began as warm clouds with only liquid drops, but the strengthened sea breeze caused a quickened transition to mixed-phase clouds, named for their simultaneous mixture of water vapor, ice particles and supercooled water droplets. Even after the sea breeze trickled off, said Fan, residual heat from the city continued feeding storm convection well through the night, causing longer-lasting rain. Contrast that with Fan's simulation where the city was removed, showing a weaker sea breeze and a storm that dissipated sooner.Aerosols played a larger role in enhancing precipitation than urban land in Houston. As mixed-phase clouds formed and convection grew stronger, numerous ultrafine particles were transformed into cloud droplets. This transformation enhanced the conversion of water vapor into cloud condensates, thereby increasing latent heating and further strengthening the storm.In the case of the Kansas City storm, heat from the city was carried downwind, where it met the already formed storm at the northern urban-rural boundary. When warmer, drier air met with cooler, moister rural air, it intensified convergence, creating turbulent mixing and a more violent storm that moved toward urban land.In contrast with Houston's thunderstorm, Kansas City's aerosols did not influence storm initiation or propagation, nor did they, on their own, greatly influence hail. But, when simulated alongside urban land, the two amplified hail, synergistically producing a more hazardous hailstorm. 
Because of this relationship, said Fan, it's important to consider both urban land and aerosols when exploring the impact cities have on weather and associated hazards. Hail alone inflicts billions of dollars of damage in the U.S. and, according to the National Oceanic and Atmospheric Administration, it is possible for especially large hailstones to fall at over 100 miles per hour. Urban land and aerosols shape weather differently, according to Fan, depending on other environmental conditions, like whether air is already polluted. "The aerosol effect really depends on the background concentration," said Fan. "If the environment is already polluted, adding more aerosols doesn't seem to affect much. But if you're already in a clean condition and you add aerosols, it may produce a large impact." Houston's busy shipping channel and nearby oil refineries, three of which are in its metro area, regularly discharge aerosols into the atmosphere, said Fan. Humidity, too, she added, can amplify the aerosol effect. Fan hopes her work may lead to more accurate predictions of hazardous weather, mitigating the death and damage dealt by storms. She plans to more deeply explore how sprawling urbanization will shape severe storms in future climate scenarios. This work was funded by DOE's Office of Science and the National Science Foundation, and was supported by the National Energy Research Scientific Computing Center, a DOE Office of Science user facility.
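Fan's with/without simulation design is a 2x2 factorial, where synergy is the joint effect minus the sum of the two individual effects. A sketch with hypothetical large-hail frequencies; only the roughly 20 percent joint amplification echoes the reported result, the other numbers are illustrative:

```python
# Hypothetical large-hail frequencies relative to a control run of the same storm.
control            = 1.00  # no urban land, no human-caused aerosols
urban_only         = 1.05  # urban land added, clean aerosol background
aerosol_only       = 1.03  # aerosols added, no urban land
urban_plus_aerosol = 1.20  # both factors present together

urban_effect   = urban_only - control
aerosol_effect = aerosol_only - control
joint_effect   = urban_plus_aerosol - control
synergy        = joint_effect - (urban_effect + aerosol_effect)

print(f"joint effect: +{100 * joint_effect:.0f}%, "
      f"of which synergy accounts for +{100 * synergy:.0f} points")
```

If the synergy term is near zero, the two factors act additively; a large positive value, as sketched here, is what "amplified synergistically" means quantitatively.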
Pollution
2,020
December 15, 2020
https://www.sciencedaily.com/releases/2020/12/201215082101.htm
Scientists warn of likely massive oil spill endangering the Red Sea, region's health
A paper to be published in
Called the Safer, the tanker is a floating storage and offloading unit (FSO) abandoned for years, and with access controlled by Yemen's Houthis. The paper, titled "A Closing Window of Opportunity to Save a Unique Marine Ecosystem," comes shortly after The New York Times reported on November 24 that the Houthis will grant permission to a United Nations (UN) team to board the Safer to inspect and repair the vessel in the near future."The time is now to prevent a potential devastation to the region's waters and the livelihoods and health of millions of people living in half a dozen countries along the Red Sea's coast," says Dr. Kleinhaus. "If a spill from the Safer is allowed to occur, the oil would spread via ocean currents to devastate a global ocean resource, as the coral reefs of the northern Red Sea and Gulf of Aqaba are projected to be among the last reef ecosystems in the world to survive the coming decades."She explained that the reason the coral reefs of the northern Red Sea are unique is because they survive in much warmer waters than today's ocean temperatures, which are becoming too high for most coral to tolerate (over half of the Great Barrier Reef has degraded due to marine heat waves caused by climate change). Additionally, the fish living on the reefs off Yemen in the southern Red Sea are a major resource of food for the populations of the region, and the entire sea and its coral reefs are a highly biodiverse and rich ecosystem.Dr. Kleinhaus and co-authors point out that in May 2020 seawater breached the Safer and entered the engine compartment, and news agencies have reported oil spots next to the tanker, indicating likely seepage. The tanker has been abandoned since 2015, which the authors emphasize is a long advance warning of a decaying tanker poised to degrade to the point of a mass oil leak into the Red Sea.The paper reveals a computer model of how the oil will disperse if a major leak begins this winter. 
The model shows that the oil will reach much further if the spill occurs now rather than in summer, due to the typical winter currents in that region of the Red Sea. A spill now will cause much broader and more extensive devastation as a result. Despite the signs of the Safer's structural deterioration, access to the tanker has yet to be achieved and concrete steps to repair or to prevent an oil spill have yet to be taken, the authors point out. Dr. Kleinhaus adds that winter is the worst time to have an oil spill in that region, as winter currents will disperse oil much more widely. The authors urge that "Emergent action must be taken by the UN and its International Maritime Organization to address the threat of the Safer, despite political tensions, as a spill will have disastrous environmental and humanitarian consequences, especially if it occurs during winter. With millions of barrels of oil a day passing through the Red Sea, a regional strategy must be drafted for leak prevention and containment that is specific to the Red Sea's unique ecosystems, unusual water currents, and political landscape."
Pollution
2,020
December 14, 2020
https://www.sciencedaily.com/releases/2020/12/201214192338.htm
Marine pollution: How do plastic additives dilute in water and how risky are they?
Plastic pollution has been at the center of environmental debate for decades. While it is well-known that plastic in the environment can break down into microplastics, be ingested by humans and other organisms, transfer up the food chain, and cause harm, this is only one part of the picture. Plastics are almost always enriched with additives, which makes them easier to process, more resistant, or more performant. This poses a second problem: when the polymer material is left in an environment for long durations, these additives can easily leach out and contaminate the environment.
This is the case with styrene oligomers (SOs), a type of plastic additive commonly found in polystyrene, which have been causing growing concern due to their effects on hormonal disruption and thyroid function. Authorities usually rely on scientists' risk assessments to evaluate such public hazards and determine the appropriate action to minimize their impact. But scientists struggle to accurately measure the proportion of leachable plastic additives (i.e., the bioavailable fraction), as it is difficult to discriminate between leached compounds and those still bound to the source plastic material. Adding to the problem is the fact that these additives can diffuse into the environment at different rates.Now, in a new study, Prof. Seung-Kyu Kim from Incheon National University, Korea, and his team have come up with an assessment method that could change the game. Their findings are published in Prof. Kim and his team collected surface sediments from an artificial lake connected to the Yellow Sea, with several potential sources of SO pollution from the surrounding land area and from marine buoys. "We were hoping that the distribution of SO contaminants in the lake's sediments would help identify their most likely source and measure the leachable amount from the source material," Prof. Kim explains. The scientists also examined one of these potential sources by dissecting a locally-used polystyrene buoy, measuring the concentration of SOs in it and how much leached out of it.A key finding from their investigation was that SO dimers (SDs) and trimers (STs) dilute in water at different rates, so their composition in coastal sediments is very different from what can be observed in the buoys and other potential sources. This was especially true for STs: heavy, hydrophobic molecules that tended to remain in the source microplastics and moved at a slower rate in the lake. The lighter SD molecules leached out much more readily and traveled further. 
This meant that the SD to ST ratio would increase further away from the source of the contaminant.Based on this dynamic, the researchers suggest using this ratio as a "reference index" to identify the source of SOs and to estimate the bioavailable fraction of SOs in a given sample. In Prof. Kim's words, this would "be critically important to the assessment of ecological and human risk caused by plastic additives," enabling more accurate risk assessments for potential exposure, and perhaps, formulating policies for disallowing certain more leachable, and therefore more hazardous, additives.
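The proposed "reference index" works because the lighter, more mobile dimers (SDs) outrun the trimers (STs), so the SD/ST ratio grows with distance from the source. A sketch of the idea; the sample locations and concentrations below are hypothetical, not measured data from the study:

```python
# Hypothetical styrene dimer (SD) and trimer (ST) concentrations, ng/g dry weight.
samples = {
    "buoy (source polystyrene)": (12.0, 60.0),
    "sediment, 0.5 km away":     (10.0, 25.0),
    "sediment, 5 km away":       (8.0,  8.0),
}

source_sd, source_st = samples["buoy (source polystyrene)"]
source_ratio = source_sd / source_st

ratios = {name: sd / st for name, (sd, st) in samples.items()}
for name, ratio in ratios.items():
    print(f"{name}: SD/ST = {ratio:.2f} ({ratio / source_ratio:.1f}x source)")
```

A sediment ratio well above the source material's ratio would indicate preferential SD leaching and transport, which is the signal the researchers suggest using both to trace the source and to estimate the bioavailable fraction.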
Pollution
2,020
December 14, 2020
https://www.sciencedaily.com/releases/2020/12/201214192334.htm
Exploring the relationship between nitrogen and carbon dioxide in greenhouse gas emissions
A University of Oklahoma-led interdisciplinary study on a decade-long experiment (1997-2009) at the University of Minnesota found that lower nitrogen levels in soil promoted release of carbon dioxide from soils under high levels of atmospheric carbon dioxide, and could therefore contribute to rising atmospheric greenhouse gas levels and climate change.
"Soil microorganisms help extract carbon from non-living sources and make the carbon available to living organisms and play an important role in influencing future climate and carbon cycle feedbacks," said Jizhong Zhou, the OU director for the Institute for Environmental Genomics, a George Lynn Cross Research Professor in the College of Arts and Sciences, and an adjunct professor in the Gallogly College of Engineering. Zhou and the international research team sought to better understand how regional differences in soil nitrogen levels resulting from pollution or natural soil variation could be affecting how soils release carbon dioxide and impact atmospheric carbon dioxide levels. "The interactive effects of nitrogen and carbon dioxide on soil respiration, a measure of carbon dioxide released from decomposition in the soil, is particularly important for our future climate, but are not all well understood, due to the lack of long-term manipulative experiments of these two elements together," said Peter Reich, a Distinguished McKnight University Professor at the University of Minnesota. In the study, published in the journal, "Our study highlights that low nitrogen supply gradually accelerates the amount of carbon dioxide released to the atmosphere through decomposition of soil detritus," said Sarah Hobbie, a Distinguished McKnight University Professor at the University of Minnesota. "Considering the worldwide nitrogen limitation in natural environments, heightened release of CO2
Pollution
2,020
December 14, 2020
https://www.sciencedaily.com/releases/2020/12/201214104710.htm
COVID-19: Indoor air in hospitals and nursing homes requires more attention
A variety of measures are necessary to prevent the spread of the coronavirus SARS-CoV-2 in hospitals and nursing homes. It is particularly important to develop an appropriate strategy to protect healthcare workers from airborne transmission.
Researchers from the Leibniz Institute for Tropospheric Research (TROPOS) in Leipzig, the CSIR National Physical Laboratory in New Delhi, the Institute of Atmospheric Science and Climate (ISAC) in Rome and 2B Technologies, Colorado, recommend that more attention is required with respect to indoor air in such facilities and to further training of the staff. From an aerosol experts' point of view, it is necessary to combine these different measures, the research team writes in an Editorial article in the The risk of infection is particularly high in hospitals and nursing homes because infected and healthy people stay in the same room for long periods of time and the virus can be transmitted via invisible aerosol particles in the air, even over distances of several metres. According to media reports, COVID-19 infections are already reported in almost one tenth of the 12,000 old people's homes and nursing homes in Germany. Homes are now also considered a hotspot for the spread of the virus among new infections in Saxony. Since the outbreak of the pandemic in early 2020, there have been increasing reports of transmissions via aerosol particles in the indoor air of hospitals and nursing homes. These include scientific reports from hospitals in China and the USA, but also from a nursing home in the Netherlands, where the virus apparently spread via the ventilation system using aerosol particles because unfiltered indoor air circulated in a ward. As further evidence, SARS-CoV-2 was detected on the dust filters of the air conditioning system there. "The complexity of the aerosol transmission of SARS-CoV-2, especially indoors, is far from being solved and there is a need to establish appropriate guidelines to protect medical staff.
With this publication, we are therefore trying to give recommendations for measures that could contribute to the containment of not only current, but also future virus pandemics," reports Prof Alfred Wiedensohler from TROPOS.The aerosol spread of the virus is, according to many experts, a major reason why the number of corona infections in Europe increased dramatically in the autumn. People stay indoors for longer durations and as temperatures fall, many indoor spaces are much less ventilated. Concentrations of viral particles in the air can rise sharply when infected people stay indoors. Simple mouth-nose masks can significantly reduce but not completely prevent the release of viral aerosol particles through the airways. The risk can therefore increase significantly with the number of people and the length of time they stay in the room. Hospitals and nursing homes are particularly affected by this, because additional risk factors are added there: particularly sensitive people, very long stays in a room and sometimes medical procedures such as intubation in intensive care units, where a lot of aerosol is produced.The spread of viruses via the room air can be reduced with a number of measures. However, there is no single measure that can achieve this completely, but it is important to control indoor air and combine different measures:"As protection against the transmission of SARS-CoV-2 via the air in closed rooms, especially in cold and dry weather, we recommend humidifiers to keep the relative humidity in the room in the range of 40 to 60 percent and to reduce the risk of respiratory tract infection. It is in this middle range that the human mucous membranes are most resistant to infections. 
In addition, the viruses in the aerosol particles can survive at a relative humidity around 50 percent for less time than in drier or highly humid air," explains Dr Ajit Ahlawat of TROPOS. It is very important that there should be a constant supply of fresh air through the air conditioning system or ventilation. This can be controlled with measuring devices for carbon dioxide (CO2). If it is not possible to ventilate the room sufficiently, an attempt can be made to reduce the concentration of viruses in the room air by using air purifiers. These air purifiers should have so-called HEPA (high-efficiency particulate absorbing) filters. However, air purifiers can only ever be an additional measure, as they cannot replace the supply of fresh air and thus oxygen. Medical staff need special protection during procedures and surgical operations that involve potentially infectious aerosol particles -- such as dental treatment or intubation in intensive care units. Valve-free particle filter masks, so-called respiratory masks such as N95, should be worn and care should be taken to ensure that they lie close to the skin. "Avoid the use of FFP2 and FFP3 type respirators, which have an exhalation valve or ventilation, as these types of respirators are not sufficient. To reduce the risk, protective equipment such as goggles should also be worn," advises Dr Francesca Costabile of the Institute of Atmospheric Science and Climate (ISAC) in Rome. In addition, the researchers recommend avoiding aerosol-generating procedures and treatments in patients with COVID-19 wherever possible to reduce the risk of infection for medical staff. Aerosol-generating treatments usually include medication administered via a nebulizer.
In order to avoid the risk of aerosolisation of SARS-CoV-2 by the nebulisation process, inhaled drugs should be administered by a metered dose inhaler rather than a nebulizer, if possible.Care should also be taken when disinfecting rooms: "We recommend that disinfection with UV-C light should not be used too often. Although it is known that UV-C light destroys the SARS-CoV-2 viruses, it ultimately increases indoor ozone concentrations and can thus have a negative impact on health if the indoor air is not adequately replaced," stresses Dr Sumit Kumar Mishra of CSIR -- National Physical Laboratory. Spraying oxidizing chemicals in the air, such as hydrogen peroxide (H2O2), can also have negative consequences. Indoors, these chemicals cause toxic chemical reactions that create other air pollutants and damage the central nervous system and lungs of humans.The international research team emphasises that the training of hospital and nursing home staff is extremely important to prevent the spread of viruses via indoor air. Medical staff must be adequately trained to follow the recommendations. It is important to draw attention to the risks of airborne transmission of SARS-CoV-2. Such recommendations, if adequately provided by health authorities and implemented by medical staff, could significantly reduce the risk of airborne transmission in hospitals and nursing homes until vaccination is effective on a large scale. Tilo Arnhold
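The CO2-based ventilation control mentioned above can be sketched as a simple threshold check on an indoor CO2 reading. The thresholds below are common indoor-air rules of thumb, not values specified by the researchers:

```python
OUTDOOR_CO2_PPM = 420          # typical ambient concentration
VENTILATION_ALERT_PPM = 1000   # common indoor-air guideline; an assumption here

def ventilation_status(co2_ppm):
    """Crude indoor-air assessment from a CO2 sensor reading in ppm."""
    if co2_ppm <= 800:
        return "well ventilated"
    if co2_ppm <= VENTILATION_ALERT_PPM:
        return "increase fresh-air supply"
    return "ventilate now"

for reading in (550, 900, 1400):
    print(f"{reading} ppm: {ventilation_status(reading)}")
```

Indoor CO2 works as a ventilation proxy because exhaled breath is its dominant indoor source: the further a reading sits above the outdoor baseline, the larger the fraction of rebreathed air in the room.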
Pollution
2,020
December 9, 2020
https://www.sciencedaily.com/releases/2020/12/201209140342.htm
New-found phenomenon that may improve hurricane forecasts
In a year like no other, it's certainly fitting that we had a hurricane season that followed suit. It seemed every time we turned around, there was a tropical disturbance brewing that eventually became a named storm.
As these storms made their way through the Atlantic Ocean or Caribbean Sea, those in the "cone of concern" watched intently to see where each was heading, its intensity and if it was time to put up their shutters. The science of forecasting storms has come a long way from the days of Hurricane David or Andrew, but scientists know there's more that can be learned. Step in a team of research scientists, led by a group from Nova Southeastern University's Halmos College of Arts and Sciences and Guy Harvey Oceanographic Research Center, who have just had a paper -- "Potential Effect of Bio-Surfactants on Sea Spray Generation in Tropical Cyclone Conditions" -- published by Nature's Rapid storm intensification and decay remain a challenge for hurricane forecasts. Many factors are involved and some of them are either poorly known or not yet identified. One such factor appears to be the presence of surface-active materials of biological (e.g., coral reefs) or anthropogenic (e.g., oil spills) origin. This new research paper was authored by an ad hoc team of researchers from NSU, The University of Miami (UM), The University of Hawaii (UH), The University of Rhode Island (URI) and the high-performance computing company Ansys, Inc. "We have conducted computational and laboratory experiments and found that under certain environmental conditions, surface-active materials significantly alter the size distribution of sea spray," said Breanna Vanderplow, a NSU Halmos College Ph.D. student, who is the first author of this paper.
"Since sea spray is 'fuel' for hurricanes, the hurricane intensity can be altered." Improved tropical cyclone prediction is particularly critical during pandemics, such as the COVID-19 outbreak, where poor prediction could cost lives if unnecessary sheltering of large groups occurs. Breanna presented her work on surfactants and sea spray at the 2019 Tropical Cyclone Ocean Interaction (TCOI 2019) conference in Jeju Island, South Korea and received feedback from the tropical cyclone community. Subsequently, she submitted the collaborative paper to "Surfactants reduce interfacial tension between air and water, which results in an increased rate of sea spray generation," said Alexander Soloviev, Ph.D., a professor and principal investigator at NSU Halmos College's Department of Marine and Environmental Sciences. "Evaporating sea spray is part of tropical cyclone thermodynamics. Spray particles also produce additional resistance to the airflow, since they increase the total surface exposed to the wind. Yet, surfactants have never previously been considered as a factor in tropical cyclone thermodynamics. Breanna has identified a new phenomenon, which may contribute to improving hurricane intensity forecasts." The paper is co-authored by Breanna Vanderplow (NSU), Alexander V. Soloviev (NSU), Cayla W. Dean (NSU), Brian K. Haus (UM), Roger Lukas (UH), Muhammad Sami (Ansys), and Isaac Ginis (URI).
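Soloviev's point about interfacial tension can be framed with the Weber number, We = rho * u^2 * L / sigma, the standard dimensionless ratio of inertial to surface-tension forces: lowering sigma raises We, favoring droplet (spray) formation. A sketch with illustrative values; the surfactant-reduced tension below is an assumption, not a measurement from the paper:

```python
def weber_number(density, speed, length, surface_tension):
    """We = rho * u^2 * L / sigma: inertial forces vs. surface tension."""
    return density * speed**2 * length / surface_tension

rho = 1025.0   # seawater density, kg/m^3
u = 50.0       # hurricane-force wind speed, m/s
L = 1e-4       # droplet length scale, m (illustrative)

sigma_clean = 0.072       # clean seawater surface tension, N/m
sigma_surfactant = 0.050  # surfactant-reduced value (assumed for illustration)

we_clean = weber_number(rho, u, L, sigma_clean)
we_surf = weber_number(rho, u, L, sigma_surfactant)
print(f"We rises by {we_surf / we_clean - 1:.0%} when surfactants lower sigma")
```

Since We scales inversely with sigma, any surfactant-driven reduction in interfacial tension directly increases this ratio, which is one way to see why surface-active films could change the spray size distribution that feeds storm thermodynamics.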
Pollution
2020
December 9, 2020
https://www.sciencedaily.com/releases/2020/12/201209140329.htm
Study connects diabetes, air pollution to interstitial lung disease
People with pre-diabetes or diabetes who live in ozone-polluted areas may have an increased risk for an irreversible disease with a high mortality rate. A new study published in the
"Our findings are especially important today as we're in the midst of the COVID-19 pandemic, where we have great concern regarding the convergence of health effects from air pollution and SARS-CoV-2 in susceptible populations like people with diabetes," said James Wagner, lead author and associate professor for the MSU College of Veterinary Medicine's Department of Pathobiology and Diagnostic Investigation.

Ozone -- a gas often referred to as "smog" -- is known to exacerbate certain lung diseases, such as asthma and rhinitis, which are primarily upper airway diseases. But recent epidemiology (Johannson et al. and Sesé et al.) suggests an association between high ozone concentrations and adverse health effects in the deep lung, which cause difficulty breathing due to lung restriction and stiffness.

"More than 170,000 people in the U.S. suffer from interstitial lung disease. Furthermore, type 2 diabetes and insulin resistance are recently suggested risk factors for developing pulmonary fibrosis," said Jack Harkema, University Distinguished Professor, Albert C. and Lois E. Dehn Endowed Chair in Veterinary Medicine, and director of the Laboratory for Environmental and Toxicologic Pathology and the Mobile Air Research Laboratories at MSU.

In the study, Wagner, Harkema and their collaborators, Robert Tighe and Christina Barkauskas from Duke University's Department of Medicine, studied healthy mice, mice with mild insulin resistance and mice with marked insulin resistance. The study found a direct relationship between insulin resistance levels and the severity of lung inflammation and scarring (fibrosis); diabetes-prone mice were particularly susceptible to inflammation and tissue remodeling caused by repeated ozone exposure.

"Evidence suggests that ozone exposure could exacerbate pulmonary fibrosis, particularly in individuals that are diabetic," said Tighe, a pulmonologist who specializes in interstitial disease at Duke.
"Poorly controlled diabetes, in particular, may be an important co-morbidity for worsened lung damage."

According to Wagner, these findings are of critical importance for public health. "Our results propose a causal link for ozone exposure to preferentially promote early pulmonary fibrosis and interstitial lung disease in pre-diabetic mice. We only exposed these mice for three weeks, but there are millions of people living in cities like Los Angeles and New York who are exposed to high levels of ozone day after day," he said. "Then, you must consider the prevalence of pre-diabetes -- approximately 33% in this country. Our study results suggest that people who are borderline insulin resistant -- or diabetic -- and living in areas with high levels of ozone pollution might be at an increased risk for developing interstitial lung disease."

This study is just the latest in Wagner and Harkema's research efforts, which describe pre-diabetes as a risk factor for multiple possible adverse responses to air pollution. Wagner has previously shown deleterious effects on heart rate, blood pressure and adipose tissue inflammation in pre-diabetic rodents that were exposed to ozone (Wagner et al. and Zhong et al.).

The authors believe this study is the first of its kind, as it describes exacerbated pulmonary inflammation and remodeling due to repetitive, short-term ozone exposures in insulin-resistant rodents that also exhibit other manifestations of type 2 diabetes. The work was supported in large part by the researchers' grant from the Environmental Protection Agency's Great Lakes Air Center for Integrative Environmental Research (GLACIER) at MSU. For more information about this research, contact Dr. James Wagner.
Pollution
2020
December 8, 2020
https://www.sciencedaily.com/releases/2020/12/201208193353.htm
Colorado mountains bouncing back from 'acid rain' impacts
A long-term trend of ecological improvement is appearing in the mountains west of Boulder. Researchers from the University of Colorado Boulder have found that Niwot Ridge -- a high alpine area of the Rocky Mountains, east of the Continental Divide -- is slowly recovering from increased acidity caused by vehicle emissions in Colorado's Front Range.
Their results show that nitric and sulfuric acid levels in the Green Lakes Valley region of Niwot Ridge have generally decreased over the past 30 years, especially since the mid-2000s. The findings, which suggest that alpine regions across the Mountain West may be recovering, are published in the

This is good news for the wildlife and wildflowers of Rocky Mountain National Park to the north of Niwot Ridge, which depend on limited levels of acidity in the water and soil to thrive. Colorado's Rocky Mountains are also the source of much of the water for people living in the Mountain West, and the integrity of these ecosystems influences both the quantity and the quality of that water.

"It looks like we're doing the right thing. By controlling vehicle emissions, some of these really special places that make Colorado unique are going back to what they used to be," said Jason Neff, co-author on the paper and director of the Sustainability Innovation Lab at Colorado (SILC).

Almost every area in the world, including Colorado's Rocky Mountains, has been affected in the past 200 years by increased acidic nutrients, like nitrogen, contained in rain and snow. Nitrogen oxides, like nitrate, are produced primarily by vehicles and energy production. Ammonium is a main ingredient in common agricultural fertilizers.

Nitrogen is a fundamental nutrient required in ecosystems. But when nitrogen levels increase too much, the changed soil and water chemistry can make it difficult for native plants to thrive or even survive -- leading to a cascade of negative consequences.

In the summer, the sun heats up the eastern flanks of the Front Range, causing the warmer air to rise -- bringing nitrogen from cars, industry and agriculture with it.
As this air cools, it forms clouds over the Rocky Mountains and falls back down as afternoon thunderstorms -- depositing contaminants, explained Neff.

In the 1970s, so-called "acid rain" hit East Coast ecosystems much harder than the Mountain West, famously wiping out fish populations and killing trees across large swaths of upstate New York. But scientists are still working to understand how increased levels of acidic nutrients affect the alpine region and how long these ecosystems take to recover.

To fill this gap in knowledge, the researchers analyzed data from 1984 to 2017 on atmospheric deposition and stream water chemistry from the Mountain Research Station, a research facility of the Institute of Arctic and Alpine Research (INSTAAR) and CU Boulder located on Niwot Ridge. They found that around the early 2000s, levels of nitric and sulfuric acid stopped increasing in the Green Lakes Valley. In the mid-2000s they started decreasing.

Their findings were not all good news, however. Levels of ammonium from fertilizer more than doubled in rainfall in this area between 1984 and 2017, indicating a need to continue monitoring this agricultural chemical and its effects on the mountain ecosystem.

This work builds on decades of field work by Colorado researchers at CU Boulder and beyond. Niwot Ridge is one of 28 Long Term Ecological Research (LTER) Network sites in the U.S., funded by the National Science Foundation. Its 4 square miles stretch from the Continental Divide down to the subalpine forest, 25 miles northwest of Boulder.
Researchers at CU Boulder, as well as Colorado State University and the United States Geological Survey, have been collecting data here since the mid-1970s, hiking through snow, sleet and rain to get it. In the '80s, '90s and 2000s they worked to bring attention to increasing acidification in Colorado mountain ecosystems as grounds for pollution regulation in the Front Range.

This new research was made possible by these dedicated scientists, stresses Neff.

"We used water quality modeling and statistical approaches to analyze the long-term datasets that Niwot researchers have been collecting for decades," said Eve-Lyn Hinckley, a co-author on the paper and fellow of INSTAAR. "The data are available for anyone to download. Our modeling approaches allowed us to evaluate the patterns they hold in a rigorous way."

Since 1990, Bill Bowman, director of the Mountain Research Station and a professor of ecology and evolutionary biology, has been looking into how nutrients like nitrogen affect plants in mountain ecosystems. He has found that alpine environments are unique in how they respond to these nutrients.

"It's a system that is adapted to low nutrients, as well as a harsh climate and a very short growing season -- and frost in the middle of the season. These are very slow growing plants. And they just simply can't respond to the addition of more nitrogen into the system," said Bowman, also a fellow in INSTAAR.

He has also found that these ecosystems recover quite slowly, even after acidic elements like nitrogen are no longer being added. But like Neff, who completed his undergraduate honors thesis with Bowman in 1993 using Niwot Ridge data, he sees this research as encouraging. Even if it's slow going, they said, these results show that the ecosystem has a chance to recover.

"We still have air quality issues in the Front Range.
But even with those air quality issues, this research shows that regulating vehicle and power plant emissions is having a big impact," said Neff.

Additional authors on this paper include lead author John Crawford of the Institute of Arctic and Alpine Research (INSTAAR) and CU Boulder.
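The long-term recovery described above rests on fitting trends to decades of deposition measurements. As a minimal sketch of that kind of analysis -- using a closed-form least-squares fit on entirely synthetic, illustrative numbers, not the actual Niwot Ridge data -- a declining trend can be estimated like this:

```python
# Minimal sketch: fitting a linear trend to a synthetic annual series of
# nitrate deposition (kg N/ha/yr). Values are illustrative only, not the
# actual Niwot Ridge measurements.

def linear_trend(years, values):
    """Ordinary least-squares slope and intercept via the closed form."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Hypothetical deposition series declining after the mid-2000s.
years = list(range(2005, 2018))
deposition = [3.1, 3.0, 2.9, 2.9, 2.7, 2.6, 2.6, 2.4, 2.3, 2.3, 2.1, 2.0, 1.9]

slope, intercept = linear_trend(years, deposition)
print(f"trend: {slope:.3f} kg N/ha/yr per year")  # negative slope = recovery
```

In practice, researchers often prefer non-parametric trend tests (e.g., Mann-Kendall) for environmental time series, but the least-squares slope conveys the same idea of a long-term decline.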
Pollution
2020
December 8, 2020
https://www.sciencedaily.com/releases/2020/12/201208162957.htm
Environmental impacts of the COVID-19 pandemic, as observed from space
COVID-19 has changed the way we live and work, as various health and safety restrictions keep more of us at home more often. The resulting changes to our behavior are already impacting the environment around us in myriad ways, according to comparisons of remote sensing data before and during the pandemic collected by NASA, U.S. Geological Survey (USGS), and ESA (European Space Agency) Earth-observing satellites and others.
Researchers from several institutions presented their early results in a virtual press conference on Dec. 7 at the American Geophysical Union's 2020 fall meeting. They found that the environment is quickly changing, and the timing of those changes seems to indicate that the pandemic may be a reason. Deforestation rates are changing in some places, air pollution is diminishing, water quality is improving, and snow is becoming more reflective in some areas since the pandemic began earlier this year.

"But we will need more research to clearly attribute environmental change to COVID," said Timothy Newman, National Land Imaging Program Coordinator for the United States Geological Survey (USGS).

Scientists and engineers like Newman use remote sensing data to observe how the world is changing during the COVID-19 pandemic, comparing current remote sensing data to pre-pandemic trends. Newman's program monitors weekly changes with satellite images from the joint NASA/USGS Landsat satellites and the ESA's Sentinel-2 satellites.

Newman's program observed that large swaths of the Brazilian Amazon rainforest were cleared from June to September of this year, since the start of the COVID-19 pandemic. Rapid deforestation also is occurring in the tropics near Indonesia and the Congo. Yet, in other parts of the Amazon rainforest, such as Colombia and Peru, deforestation appears to have slowed somewhat since the onset of the pandemic.

Satellite images and data from Landsat also show a reduction in environmental pollution in this time period. Industrial activities in India, including extracting and crushing stone for construction projects, slowed or ground to a halt because of COVID-19 lockdowns. Soon after, surface air measurements and Landsat thermal infrared data showed that air pollution levels had dropped significantly.
One study found that the concentration of the air pollutant particulate matter 10 (PM10) decreased to around a third to a fourth of the pre-pandemic level in India.

For years, Ned Bair has been studying snow in the Indus River Basin -- a network of mountain ranges and rivers near India, China, and Pakistan that supplies water for more than 300 million people.

"Once the COVID-19 lockdown started in India, I immediately thought that it would have an impact on the snowpack," said Bair, a snow hydrologist with the University of California Santa Barbara's Earth Research Institute.

Bair saw posts on social media about how clear the air was in Delhi and preliminary data showing that air quality was improving during the pandemic. With less pollution in the air, he thought, there would be less dust and soot accumulating on nearby snow. Dust and other air pollutants affect snow albedo -- how white and, therefore, reflective the snow is -- as they accumulate on the surface of snow. Cleaner snow has a higher albedo, which means it reflects more light energy and, thus, melts at a slower rate.

Bair and his team found that snow albedo was higher during pandemic-related lockdowns than in the 20 years prior -- likely a result of the significant reduction in travel and industrial activity as fewer people were leaving home and workplaces were shut down or reduced operations.

They used data from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard NASA's Terra satellite, and two computer models to filter out clouds, rocks, trees, and anything that wasn't snow. Both models showed that snow in the Indus was significantly cleaner during the COVID-19 lockdowns.
Using dust to approximate all pollutants, the models showed that pollutants accumulating on the snow decreased by 36 parts per million below the pre-pandemic average -- a change that could delay the melting of enough snow to fill Lake Tahoe Dam in California, or about 0.17-0.22 cubic miles (0.73-0.93 cubic kilometers).

Snowmelt is an important source of drinking water for more than 300 million people living in the Indus River Basin. While changes in albedo won't change the overall amount of snowmelt, they will change the timing of when that snow melts -- potentially affecting the available water supply in the region.

Nima Pahlevan, a research scientist at NASA's Goddard Space Flight Center, dove into examining the pandemic's impact on water quality around the world. He looked at Landsat-8 and Sentinel-2 data on water quality by analyzing proxies like chlorophyll-a, solid material suspended in the water, and turbidity -- essentially a measure of how clear water is, based on things like suspended particles of inorganic sediment or phytoplankton -- during the pandemic, and compared those measurements to years prior.

The findings were murky in some areas. For example, in San Francisco, California, changes in rainfall made it difficult to tell whether the pandemic impacted water quality. But a clearer picture surfaced in the western Manhattan area of New York City.

"The water has become clearer in the western Manhattan area because there were fewer people commuting to Manhattan during the lockdown," he explained.

Sewage water from homes and businesses, as well as runoff from streets, is treated in wastewater treatment plants before being released into nearby rivers. When the city imposed a stay-at-home order in mid-March, many of Manhattan's 2.1 million commuters began working from home or left the city. Fewer people producing those pollutants means that fewer particles end up in the Hudson River.
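The snowmelt volume quoted above can be sanity-checked with a one-line unit conversion (the kilometre figures in the text appear to be rounded; the exact conversion lands close to, but slightly below, the quoted 0.73-0.93 range):

```python
# Quick check of the quoted unit conversion: 0.17-0.22 cubic miles of
# delayed snowmelt expressed in cubic kilometers.

MILE_KM = 1.609344  # exact definition of the international mile in km

def cubic_miles_to_km3(mi3):
    """Convert a volume in cubic miles to cubic kilometers."""
    return mi3 * MILE_KM ** 3

low, high = cubic_miles_to_km3(0.17), cubic_miles_to_km3(0.22)
print(f"{low:.2f}-{high:.2f} km^3")
```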
Satellite data showed a more than 40% drop in turbidity during the pandemic in a section of the Hudson River. The better water quality probably won't last, though, Pahlevan says. Once we return to pre-pandemic behaviors, water quality will revert as well. Many of the environmental improvements that researchers are seeing won't last if the world goes back to its pre-pandemic ways.

For more insight into how the environment is responding to changes in human behavior during the pandemic, see NASA's COVID-19 dashboard.
Pollution
2020
December 8, 2020
https://www.sciencedaily.com/releases/2020/12/201208111536.htm
Pollution from cooking remains in atmosphere for longer
Particulate emissions from cooking stay in the atmosphere for longer than previously thought, making a prolonged contribution to poor air quality and human health, according to a new study.
Researchers at the University of Birmingham succeeded in demonstrating how cooking emissions -- which account for up to 10 per cent of particulate pollution in the UK -- are able to survive in the atmosphere over several days, rather than being broken up and dispersed.

The team collaborated with experts at the University of Bath, the Central Laser Facility and Diamond Light Source to show how these fatty acid molecules react with molecules found naturally in the Earth's atmosphere. During the reaction process, a coating, or crust, is formed around the outside of the particle that protects the fatty acid inside from gases such as ozone which would otherwise break up the particles.

This is the first time scientists have been able to recreate the process in a way that enables it to be studied in laboratory conditions, by using the powerful X-ray beam at Diamond Light Source to follow the degradation of thin layers of molecules representative of these cooking emissions in minute detail. The results are published in the Royal Society of Chemistry's

The ability of these particles to remain in the atmosphere has a number of implications for climate change and human health. Because the molecules are interacting so closely with water, this affects the ability of water droplets to form clouds. In turn this may alter the amount of rainfall, and also the amount of sunlight that is either reflected by cloud cover or absorbed by the Earth -- all of which could contribute to temperature changes.

In addition, as the cooking emission particles form their protective layer, they can also incorporate other pollutant particles, including those known to be harmful to health, such as carcinogens from diesel engine emissions.
These particles can then be transported over much wider areas.

Lead author Dr Christian Pfrang, of the University of Birmingham's School of Geography, Earth and Environmental Sciences, said: "These emissions, which come particularly from cooking processes such as deep fat frying, make up a significant proportion of air pollution in cities, in particular of small particles that can be inhaled known as PM2.5 particles. In London it accounts for around 10 per cent of those particles, but in some of the world's megacities, for example in China, it can be as much as 22 per cent, with recent measurements in Hong Kong indicating a proportion of up to 39%."

"The implications of this should be taken into account in city planning, but we should also look at ways we can better regulate the ways air is filtered -- particularly in fast food industries where regulations do not currently cover air quality impacts from cooking extractor emissions, for example."

The research was supported by the Science and Technology Facilities Council (STFC) and the Natural Environment Research Council (NERC).
Pollution
2020
December 8, 2020
https://www.sciencedaily.com/releases/2020/12/201208111422.htm
SMART researchers design portable device for fast detection of plant stress
Researchers from the Disruptive & Sustainable Technologies for Agricultural Precision (DiSTAP) Interdisciplinary Research Group (IRG) of Singapore-MIT Alliance for Research and Technology (SMART), MIT's research enterprise in Singapore and Temasek Life Sciences Laboratory (TLL) have designed a portable optical sensor that can monitor whether a plant is under stress. The device offers farmers and plant scientists a new tool for early diagnosis and real-time monitoring of plant health in field conditions.
Precision agriculture is an important strategy for tackling growing food insecurity through sustainable farming practices, but it requires new technologies for rapid diagnosis of plant stresses before the onset of visible symptoms and subsequent yield loss. SMART's new portable Raman leaf-clip sensor is a useful tool in precision agriculture, allowing early diagnosis of nitrogen deficiency in plants, which can be linked to premature leaf deterioration and loss of yield.

The work is described in a paper titled "Portable Raman leaf-clip sensor for rapid detection of plant stress" published in the journal

"Our findings showed that in vivo measurements using the portable leaf-clip Raman sensor under full-light growth conditions were consistent with measurements obtained with a benchtop Raman spectrometer on leaf-sections under laboratory conditions," says MIT Professor Rajeev Ram, co-lead author of the paper and Principal Investigator at DiSTAP. "We demonstrated that early diagnosis of nitrogen deficiency -- a critical nutrient and the most important component of fertilizers -- in living plants is possible with the portable sensor."

While the study mainly looked at measuring nitrogen levels in plants, the device can also be used to detect levels of other plant stress phenotypes such as drought, heat and cold stress, saline stress, and light stress. The wide range of plant stressors that can be detected by these leaf-clip Raman probes, and their simplicity and speed, makes them ideal for field use by farmers to ensure crop health.

"While we have focused on the early and specific diagnosis of nitrogen deficiency using the leaf-clip sensor, we were able to measure peaks from other metabolites that are also clearly observed in popular vegetables such as Kailan, Lettuce, Choy Sum, Pak Choi, and Spinach," says Dr.
Chung Hao Huang, co-first author of the paper and postdoctoral fellow at TLL.

The team believes its findings can aid farmers to maximise crop yield, while ensuring minimal negative impacts on the environment, including minimising pollution of aquatic ecosystems by reducing nitrogen runoff and infiltration into the water table.

"The sensor was demonstrated on multiple vegetable varieties and supports the effort to produce nutritious, low-cost vegetables as part of the Singapore 30 by 30 initiative," says Professor Nam-Hai Chua, co-lead Principal Investigator at DiSTAP, Deputy Chairman at TLL and co-lead author of the study. "Extension of this work to a wider variety of crops may contribute globally to improved crop yields, greater climate resiliency, and mitigation of environmental pollution through reduced fertilizer use."
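Raman diagnosis of this kind works by locating metabolite-associated peaks in a scattering spectrum and tracking their heights. The sketch below is purely illustrative and is not the DiSTAP sensor's actual pipeline: the spectrum is synthetic, and the peak position (~1046 cm^-1, often associated with the nitrate symmetric stretch) and Gaussian shape are assumptions for demonstration.

```python
# Illustrative sketch (not the actual DiSTAP pipeline): locating a
# nitrate-associated peak in a synthetic Raman spectrum.
import math

def gaussian(x, center, width, height):
    """A Gaussian peak profile."""
    return height * math.exp(-((x - center) ** 2) / (2 * width ** 2))

# Synthetic spectrum: flat baseline plus one nitrate-like peak.
wavenumbers = [900 + i for i in range(301)]            # 900-1200 cm^-1
spectrum = [5.0 + gaussian(w, 1046, 8, 40) for w in wavenumbers]

# A nitrogen-deficient plant would show a weaker peak; quantify via
# baseline-subtracted peak height.
peak_index = max(range(len(spectrum)), key=lambda i: spectrum[i])
peak_position = wavenumbers[peak_index]
peak_height = spectrum[peak_index] - 5.0

print(peak_position, round(peak_height, 1))
```

A field instrument would additionally need baseline estimation and calibration against reference samples before peak height could be mapped to a nutrient concentration.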
Pollution
2020
December 7, 2020
https://www.sciencedaily.com/releases/2020/12/201207102109.htm
Military flights biggest cause of noise pollution on Olympic Peninsula
An area in the Olympic Peninsula's Hoh Rain Forest in Washington state for years held the distinction as one of the quietest places in the world. Deep within the diverse, lush, rainy landscape the sounds of human disturbance were noticeably absent.
But in recent years, the U.S. Navy switched to a more powerful aircraft and increased training flights from its nearby base on Whidbey Island, contributing to more noise pollution on the peninsula -- and notably over what used to be the quietest place in the continental U.S. While local residents and visitors have noticed more aircraft noise, no comprehensive analysis has been done to measure the amount of noise disturbance, or the impact it has on people and wildlife.

Now, as the Navy is set to implement another increase in flight activities, a University of Washington study provides the first look at how much noise pollution is impacting the Olympic Peninsula. The paper found that aircraft were audible across a large swath of the peninsula at least 20% of weekday hours, or for about one hour during a six-hour period. About 88% of all audible aircraft in the pre-pandemic study were military planes.

"I think there is a huge gap between what the Navy is telling people -- that its aircraft are not substantially louder and operations haven't changed -- and what people are noticing on the ground," said lead author Lauren Kuehne, who completed the work as a research scientist at the UW School of Aquatic and Fishery Sciences and is now an independent consultant. "Our project was designed to try and measure noise in the ways that reflect what people are actually experiencing."

The study was published Nov. 25 in the journal

The Navy is set to implement a 62% increase in airborne electronic warfare and a 13% increase in air-to-air combat training over the Olympic Peninsula, a place that is historically, culturally and ecologically significant. Eight American Indian tribes call the peninsula home, while Olympic National Park receives more than 3 million visitors a year and is a UNESCO World Heritage Site.
More than two dozen animal species are found only on the peninsula, and multiple species are listed as threatened or endangered under the federal Endangered Species Act.

"The Olympic Peninsula is a renowned hotspot for wildlife, home for people of many different cultures and a playground for outdoor enthusiasts," said co-author Julian Olden, professor at the UW School of Aquatic and Fishery Sciences.

The researchers chose three primary sites on the Olympic Peninsula to monitor the soundscape during four seasonal periods from June 2017 to May 2018. Two sites, at Third Beach and Hoh Watershed, were near the coast, while the third site was inland on the Hoh River Trail. They placed recorders at each site to capture sound continuously for 10 days at a time, then recruited and trained volunteers to help process the nearly 3,000 hours of recorded audio.

"This data is very accessible -- you can hear and see it, and it's not rocket science," Kuehne said. "I wanted people to feel like they could really own the process of analyzing it."

From their analysis, the researchers identified nearly 5,800 flight events across all monitoring locations and periods. Of these, 88% were military aircraft, 6% were propeller planes, 5% were commercial airplanes and less than 1% were helicopters. Three-quarters of all recorded military aircraft noise occurred between 9 a.m. and 5 p.m. on weekdays. Most of the military aircraft were Growlers, or Boeing EA-18G jets that are used for electronic warfare -- drills that resemble "hide and seek" with a target.

The researchers found that most of the aircraft noise was intermittent, detectable across all the sites that were monitored simultaneously, and followed no set pattern. The noise mostly registered between 45 and 60 decibels, which is comparable to air traffic sounds in Seattle, Kuehne said.
Occasionally, the sound level would hit 80 decibels or more, which is akin to the persistent noise when walking under Seattle's former waterfront viaduct.

Conversations with local residents also revealed a majority who notice the low-level jet noise, the researchers said. The chronic and unpredictable nature of the noise is especially tiresome for residents, and some report difficulty sleeping, learning in school and even interference with hearing aids.

Previous research has shown that loudness is only one aspect of how sound can impact human health. Studies have found that the duration of noise, unpredictable patterns and the inability to control exposure all contribute to stress, annoyance, sleep disturbance and interference with learning.

Noise impacts on wildlife are less studied, but some research has shown it can prompt physiological stress and impact animals' ability to reproduce successfully. Noise can also interfere with how animals communicate and find prey.

"The deafening sound of anthropogenic noise not only threatens wildlife but may also deter people from visiting in the future," Olden said. "Why travel to the Olympic Peninsula to only experience noise comparable to Seattle?"

The researchers hope these results will prompt follow-up assessments of how chronic aircraft noise impacts residents on the peninsula. They also hope the Navy will publicly acknowledge the extent of its noise pollution and consider changing its operations near the peninsula.

"My wildest-dream scenario is that this would allow the Navy to take seriously people's requests that they move at least some of the training elsewhere, to other military operations areas," Kuehne said.

This research was funded by The Suquamish Foundation, the One Square Inch of Silence Foundation and the UW School of Aquatic and Fishery Sciences.
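Because the decibel scale is logarithmic, the occasional 80 dB peaks reported above carry vastly more acoustic intensity than the typical 45-60 dB jet noise. A quick check of the arithmetic:

```python
# Sound pressure levels in decibels are logarithmic: every +10 dB is a
# tenfold increase in acoustic intensity.

def intensity_ratio(db_high, db_low):
    """How many times more acoustic intensity db_high carries than db_low."""
    return 10.0 ** ((db_high - db_low) / 10.0)

# An 80 dB peak versus typical 50 dB background jet noise.
ratio = intensity_ratio(80, 50)
print(ratio)  # a 30 dB gap corresponds to a 1000-fold intensity difference
```

Perceived loudness does not scale the same way as intensity (a common rule of thumb is that +10 dB sounds roughly twice as loud), which is one reason noise studies report durations and patterns alongside levels.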
Pollution
2020
December 5, 2020
https://www.sciencedaily.com/releases/2020/12/201205143458.htm
Research reveals how airflow inside a car may affect COVID-19 transmission risk
A new study of airflow patterns inside a car's passenger cabin offers some suggestions for potentially reducing the risk of COVID-19 transmission while sharing rides with others.
The study, by a team of Brown University researchers, used computer models to simulate the airflow inside a compact car with various combinations of windows open or closed. The simulations showed that opening windows -- the more windows the better -- created airflow patterns that dramatically reduced the concentration of airborne particles exchanged between a driver and a single passenger. Blasting the car's ventilation system didn't circulate air nearly as well as a few open windows, the researchers found.

"Driving around with the windows up and the air conditioning or heat on is definitely the worst scenario, according to our computer simulations," said Asimanshu Das, a graduate student in Brown's School of Engineering and co-lead author of the research. "The best scenario we found was having all four windows open, but even having one or two open was far better than having them all closed."

Das co-led the research with Varghese Mathai, a former postdoctoral researcher at Brown who is now an assistant professor of physics at the University of Massachusetts, Amherst. The study is published in the journal

The researchers stress that there's no way to eliminate risk completely -- and, of course, current guidance from the U.S. Centers for Disease Control (CDC) notes that postponing travel and staying home is the best way to protect personal and community health. The goal of the study was simply to examine how changes in airflow inside a car may worsen or reduce the risk of pathogen transmission.

The computer models used in the study simulated a car, loosely based on a Toyota Prius, with two people inside -- a driver and a passenger sitting in the back seat on the opposite side from the driver. The researchers chose that seating arrangement because it maximizes the physical distance between the two people (though still less than the 6 feet recommended by the CDC).
The models simulated airflow around and inside a car moving at 50 miles per hour, as well as the movement and concentration of aerosols coming from both driver and passenger. Aerosols are tiny particles that can linger in the air for extended periods of time. They are thought to be one way in which the SARS-CoV-2 virus is transmitted, particularly in enclosed spaces.

Part of the reason that opening windows is better in terms of aerosol transmission is that it increases the number of air changes per hour (ACH) inside the car, which helps to reduce the overall concentration of aerosols. But ACH was only part of the story, the researchers say. The study showed that different combinations of open windows created different air currents inside the car that could either increase or decrease exposure to remaining aerosols.

Because of the way air flows across the outside of the car, air pressure near the rear windows tends to be higher than pressure at the front windows. As a result, air tends to enter the car through the back windows and exit through the front windows. With all the windows open, this tendency creates two more-or-less independent flows on either side of the cabin. Since the occupants in the simulations were sitting on opposite sides of the cabin, very few particles end up being transferred between the two. The driver in this scenario is at slightly higher risk than the passenger because the average airflow in the car goes from back to front, but both occupants experience a dramatically lower transfer of particles compared to any other scenario.

The simulations for scenarios in which some but not all windows are down yielded some possibly counterintuitive results. For example, one might expect that opening windows directly beside each occupant might be the simplest way to reduce exposure.
The simulations found that while this configuration is better than no windows down at all, it carries a higher exposure risk compared to putting down the window opposite each occupant. "When the windows opposite the occupants are open, you get a flow that enters the car behind the driver, sweeps across the cabin behind the passenger and then goes out the passenger-side front window," said Kenny Breuer, a professor of engineering at Brown and a senior author of the research. "That pattern helps to reduce cross-contamination between the driver and passenger." It's important to note, the researchers say, that airflow adjustments are no substitute for mask-wearing by both occupants when inside a car. And the findings are limited to potential exposure to lingering aerosols that may contain pathogens. The study did not model larger respiratory droplets or the risk of actually becoming infected by the virus. Still, the researchers say the study provides valuable new insights into air circulation patterns inside a car's passenger compartment -- something that had received little attention before now. "This is the first study we're aware of that really looked at the microclimate inside a car," Breuer said. "There had been some studies that looked at how much external pollution gets into a car, or how long cigarette smoke lingers in a car. But this is the first time anyone has looked at airflow patterns in detail." The research grew out of a COVID-19 research task force established at Brown to gather expertise from across the University to address widely varying aspects of the pandemic. Jeffrey Bailey, an associate professor of pathology and laboratory medicine and a coauthor of the airflow study, leads the group.
Bailey was impressed with how quickly the research came together, with Mathai suggesting the use of computer simulations that could be done while laboratory research at Brown was paused for the pandemic. "This is really a great example of how different disciplines can come together quickly and produce valuable findings," Bailey said. "I talked to Kenny briefly about this idea, and within three or four days his team was already doing some preliminary testing. That's one of the great things about being at a place like Brown, where people are eager to collaborate and work across disciplines."
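The role of air changes per hour can be sketched with a simple well-mixed single-zone model (a deliberate simplification of the study's CFD simulations; the cabin volume, emission rate, and ACH values below are illustrative assumptions, not figures from the paper):

```python
import math

def aerosol_concentration(t_hours, source_rate, volume_m3, ach, c0=0.0):
    """Aerosol concentration (particles per m^3) in a well-mixed cabin.

    Solves dC/dt = S/V - ACH*C for a steady source S (particles/hour)
    emitted into volume V, with ventilation at `ach` air changes per hour.
    """
    c_inf = source_rate / (volume_m3 * ach)  # steady-state concentration
    return c_inf + (c0 - c_inf) * math.exp(-ach * t_hours)

# Hypothetical numbers: a ~3 m^3 cabin, one occupant shedding 1e6 particles/hour.
closed = aerosol_concentration(0.5, 1e6, 3.0, ach=2.0)     # windows up, HVAC only
open_win = aerosol_concentration(0.5, 1e6, 3.0, ach=50.0)  # several windows open
print(f"windows closed: {closed:.0f}/m^3, windows open: {open_win:.0f}/m^3")
```

As the article notes, ACH is only part of the story -- a well-mixed model says nothing about the cabin-scale flow patterns that determine whom the particles actually reach.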
Pollution
2020
December 2, 2020
https://www.sciencedaily.com/releases/2020/12/201202192806.htm
Satellite-tagged bottles show promise for tracking plastic litter through rivers
A new study demonstrates the potential for plastic bottles tagged with tracking devices to deepen our understanding of how plastic pollution moves through rivers. Emily Duncan of the University of Exeter, U.K., and colleagues present this research in an open-access journal.
Plastic pollution threatens natural ecosystems and human health worldwide. Previous research suggests that rivers transport up to 80 percent of the plastic pollution found in oceans. However, while ocean modeling and tracking technology have revealed detailed insights into how plastic litter moves and accumulates within oceans, river transport of plastic pollution remains poorly understood. To help address this knowledge gap, Duncan and colleagues developed a new, low-cost, open-source tracking method that uses reclaimed 500 mL plastic bottles to house custom-designed electronics, allowing the bottles to be tracked via GPS cellular networks and satellite technology. These "bottle tags" mimic plastic beverage bottles, in the hopes that they realistically replicate the path of plastic pollution down a river. As part of the National Geographic Sea to Source Ganges Expedition, the researchers released 25 bottle tags at various sites along the Ganges River. They successfully tracked several of them through the river and into the Bay of Bengal. They also released three bottles directly into the Bay of Bengal to mimic paths followed by litter once it reaches the sea. The farthest distance traveled by any of the bottles was 2,845 kilometers, which took 94 days. This study demonstrates that future research could use bottle tags to significantly boost understanding of plastic litter's movement through rivers and into oceans. These devices could reveal new insights into areas where plastic litter is likely to accumulate and periods when large amounts of plastic pollution are moving through the waterways. The authors also highlight the potential for bottle tags to engage the public -- such as by enabling people to follow the bottles' journeys for themselves -- potentially boosting awareness, discouraging littering, and informing changes to pollution policy. The authors add: "Our 'message in a bottle' tags show how far and how fast plastic pollution can move.
It demonstrates that this is a truly global issue, as a piece of plastic dropped in a river or ocean could soon wash up on the other side of the world."
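A track logged by such a bottle tag is essentially a sequence of GPS fixes, and total drift distance can be approximated by summing great-circle distances between consecutive fixes. The coordinates below are hypothetical points along the Ganges, not the expedition's data:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes (degrees)."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fixes: upper Ganges -> Varanasi area -> Bay of Bengal region
track = [(30.09, 78.27), (25.32, 83.01), (22.57, 88.36)]
total_km = sum(haversine_km(*track[i], *track[i + 1]) for i in range(len(track) - 1))
print(f"straight-line track length: {total_km:.0f} km")
```

A real river track is much longer than the straight-line sum between sparse fixes, which is why positions need to be logged frequently relative to the river's meanders.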
Pollution
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201144048.htm
Bleach-alternative COVID-19 surface disinfectants may pollute indoor air, study finds
Cleaning surfaces with hydrogen peroxide-based disinfectants has the potential to pollute the air and pose a health risk, according to research led by University of Saskatchewan (USask).
The research team found that mopping a floor with a commercially available hydrogen peroxide-based disinfectant raised the level of airborne hydrogen peroxide to more than 600 parts per billion -- about 60 per cent of the maximum level permitted for exposure over eight hours, and 600 times the level naturally occurring in the air. The results were recently published in a peer-reviewed journal. "When you're washing surfaces, you are also changing the air you are breathing," said USask chemistry researcher Tara Kahan, senior author of the study and Canada Research Chair in Environmental Analytical Chemistry. "Poor indoor air quality is associated with respiratory issues such as asthma." Too much exposure to hydrogen peroxide could lead to respiratory, skin, and eye irritation, according to the U.S. Centers for Disease Control. The COVID-19 pandemic has led to increased cleaning and demand for all types of cleaning products, including bleach alternatives that contain hydrogen peroxide. "At the beginning of the pandemic, we couldn't do research on this topic because hydrogen peroxide solutions were out of stock," Kahan said. Kahan's team, which also included researchers from Syracuse University, York University (Toronto), and University of York (England), sprayed the vinyl floor in a simulated room environment with 0.88 per cent hydrogen peroxide disinfectant and wiped it dry with paper towel either immediately or after letting it soak in for an hour. The team then tested the air at human head height. "The real risk is for people who get repeatedly exposed, such as janitors and house cleaners," Kahan said.
"We washed the floor and collected measurements at face height -- the concentrations will be even stronger at the floor or at the level of a countertop." Kahan said that the impact on children and pets -- those physically closer to the disinfected surfaces -- is not yet known. More than 10 per cent of disinfectants approved by Health Canada that are deemed likely to be effective against SARS-CoV-2 use hydrogen peroxide as the active ingredient. A total of 168 disinfecting products containing hydrogen peroxide as the active ingredient are approved or marketed in Canada. Kahan says there are a few ways to reduce risks while disinfecting your home. Funded by the Canada Research Chairs program and the Alfred P. Sloan Foundation, Kahan's team -- mostly women in a discipline which tends to be male-dominated -- is currently repeating the experiment in a house and apartment in Saskatoon to determine whether the high numbers occur in a real-world environment and to find practical ways to mitigate exposure risks.
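The exposure figures reduce to simple unit arithmetic. Assuming a 1 ppm (1,000 ppb) eight-hour limit, which is consistent with 600 ppb being described as about 60 per cent of the permitted level:

```python
def fraction_of_limit(measured_ppb, limit_ppm=1.0):
    """Fraction of an 8-hour exposure limit that a measurement represents.

    The 1 ppm default is an assumption inferred from the article's figures,
    not a value quoted in the study.
    """
    return measured_ppb / (limit_ppm * 1000.0)

measured = 600.0   # ppb of airborne H2O2 measured after mopping
background = 1.0   # ppb; implied by "600 times the level naturally occurring"
print(f"{fraction_of_limit(measured):.0%} of the 8-hour limit, "
      f"{measured / background:.0f}x background")
```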
Pollution
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201124124.htm
Climate change warms groundwater in Bavaria
Groundwater reservoirs in Bavaria have warmed considerably over the past few decades. A new study by researchers at Martin Luther University Halle-Wittenberg (MLU) compares temperatures at 35 measuring stations, taken at different depths, with data from the 1990s. Water found at a depth of 20 metres was almost one degree warmer on average than 30 years ago. The findings were published in a peer-reviewed journal.
As the air warms, the ground also becomes warmer over time -- ultimately resulting in warmer groundwater. Geologists call this thermal coupling. "Unlike the atmosphere, however, the earth's sub-surface is very sluggish," explains Professor Peter Bayer, a geoscientist at MLU and co-author of the study. Because the ground below the surface does not react to short-term temperature fluctuations and thus tends to reflect long-term trends, it is a good indicator of climate change. "This ground warming effect has been known to scientists, however there is still little data on it," explains Bayer. For the new study, Bayer and his doctoral student Hannes Hemmerle repeated measurements that had been carried out in the 1990s at 35 measuring stations in groundwater reservoirs in Bavaria. The measuring points are distributed throughout the state, which provides a rare insight into the development of an entire region. The geologists were able to show that almost all the groundwater reservoirs they investigated had warmed up in a similar way over the decades. "Climate change has a very clear effect at depths starting at around 15 metres; at that point short-term local or seasonal fluctuations can no longer be measured," explains Hemmerle. The groundwater at depths of 20 metres was, on average, nearly 0.9 degrees Celsius warmer than in the 1990s. At depths of 60 metres it was still nearly 0.3 degrees warmer. During the same period, the average air temperature rose by 1.05 degrees Celsius. "It can be assumed that the groundwater will warm up even more as a delayed reaction to air temperatures and that it will continue to react to rising atmospheric temperatures in the future," says Hemmerle.
The consequences of this warming are still difficult to gauge, says Bayer, who adds that higher water temperatures affect the growth of microbes and put pressure on underground ecosystems that are adapted to very constant temperatures. In order to get a feel for the magnitude of the measurements, Bayer and Hemmerle also compared ground warming at a depth of 15 metres with Bavaria's annual heating requirements. Their findings: the increase in temperature correlates to about ten percent of demand. "At least a portion of the heat could possibly be reused as geothermal energy," says Bayer. However, the results cannot be directly transferred to the whole of Germany. "But it can be assumed that the trend is the same," says Hemmerle.
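The "sluggishness" of the subsurface is what classical heat-conduction theory predicts: a periodic surface temperature signal decays exponentially with depth, with damping depth d = sqrt(2*kappa/omega). A rough sketch with a typical soil thermal diffusivity (an assumed textbook value, not one from the study) shows why seasonal swings vanish below roughly 15 metres while long-term trends persist:

```python
import math

kappa = 1e-6                              # thermal diffusivity, m^2/s (assumed)
omega = 2 * math.pi / (365.25 * 86400.0)  # angular frequency of the annual cycle
d = math.sqrt(2 * kappa / omega)          # damping depth, ~3 m for these values

for z in (5, 15, 20):
    attenuation = math.exp(-z / d)        # amplitude ratio A(z)/A(0)
    print(f"at {z} m: seasonal amplitude reduced to {attenuation:.1%}")
```

At 15 m the annual wave is attenuated to well under one percent of its surface amplitude, so what remains at that depth is essentially the long-term climate signal.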
Pollution
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201124119.htm
Hydrogen-powered heavy duty vehicles could contribute significantly to achieving climate goals
A partial transition of German road transport to hydrogen energy is among the possibilities being discussed to help meet national climate targets. A team of researchers from the Institute for Advanced Sustainability Studies (IASS) has examined the hypothetical transition to a hydrogen-powered transport sector through several scenarios. Their conclusion: A shift towards hydrogen-powered mobility could significantly reduce greenhouse gas emissions and greatly improve air quality -- in particular, heavy duty vehicles represent a low-hanging fruit for decarbonization of German road transport.
"Hydrogen fuel cell vehicles offer competitive advantages over battery electric vehicles regarding heavy loads, longer driving ranges and shorter fuelling times -- making them particularly attractive to the heavy duty vehicle segment," explains lead author Lindsey Weger: "Moreover, transitioning heavy-duty vehicles to green hydrogen could already achieve a deep reduction in emissions -- our results indicate a potential of -57 MtCO2." Accordingly, heavy duty vehicles (which here include not only trucks but also commercial vehicles and buses) equipped with (green) hydrogen fuel cells are a possibility worth considering on the path to road transport decarbonization. Transport is one of the most emission-intensive sectors for both climate and air pollutants. In 2017, for example, Germany's transport sector accounted for 18.4 percent of CO2 emissions. While Germany has successfully decreased its emissions considerably in most areas of the economy since 1990, little progress has been made in the transport sector, which is in large part responsible for Germany's failure to meet its target of a (lasting) 40 percent reduction in greenhouse gas emissions by 2020 compared to 1990 levels. The major reasons for this are the continued dominance of fossil fuels in transport and high average vehicular CO2 emissions. Due to extraordinary circumstances, including the countermeasures adopted to contain the Covid-19 pandemic, Germany is now set to meet its original 2020 emissions reduction target. However, this reduction is not expected to be lasting, with emissions from the transport sector almost returning to their original levels in mid-June 2020. The overall emissions impact depends on the method of hydrogen production: According to the analysis, emissions change between -179 and +95 MtCO2. The green hydrogen scenario also promises to deliver the largest reduction in air pollutants -- up to 42 percent for NMVOCs, NOx and CO -- compared to emissions from the German energy sector for the current conditions.
However, producing hydrogen with the current (fossil fuel-intense) electricity mix would result in an increase or minimal effect (i.e., no benefit) in emissions of some pollutants. Transitioning only heavy duty vehicles to green hydrogen would already deliver a large reduction in emissions (-57 MtCO2). Hydrogen is a non-toxic, colourless, and odourless gas. It has been safely produced for decades and is used in industry and space research. Hydrogen has the highest energy density by mass among conventional fuels (although not by volume at standard atmospheric pressures) and, crucially, hydrogen refuelling infrastructure is comparable to that used for conventional road fuels. In addition, hydrogen can be produced from a wide range of energy forms, including renewable electricity. It can be easily stored, compressed or liquefied either in pure form, mixed with natural gas, or bound with larger molecules. Hydrogen is easily transported by pipeline, truck, or ship. It can be safely used to fuel vehicles and is in many respects even safer than petrol and diesel.
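Why the production method dominates the outcome can be sketched with a single multiplication: the CO2 intensity of electrolytic hydrogen is roughly the electricity's CO2 intensity times the energy needed per kilogram of hydrogen. The ~50 kWh/kg electrolyzer figure and the grid intensities below are common ballpark assumptions, not numbers from the IASS study:

```python
def h2_co2_intensity(grid_g_per_kwh, kwh_per_kg_h2=50.0):
    """Approximate kg of CO2 emitted per kg of electrolytic hydrogen."""
    return grid_g_per_kwh * kwh_per_kg_h2 / 1000.0

print(h2_co2_intensity(400))  # fossil-heavy grid mix: ~20 kg CO2 per kg H2
print(h2_co2_intensity(0))    # fully renewable ("green") electricity: 0
```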
Pollution
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201103624.htm
Producing ammonia with a much smaller carbon footprint
Ammonia is the second most commonly produced chemical in the world and an important component of most fertilizers, but current industrial processes to make ammonia produce several million tons of carbon dioxide, a potent greenhouse gas, each year.
Now, researchers led by Meenesh Singh, assistant professor of chemical engineering at the University of Illinois Chicago College of Engineering, describe a new process to produce ammonia with a potentially much lower carbon footprint. They report their findings in a peer-reviewed journal. Nitrogen gas is one of the components used to make ammonia, but because nitrogen bonds in nitrogen gas are very stable, a lot of energy is needed to break them so the nitrogen can bind to hydrogen to produce ammonia. "Current methods to make ammonia from nitrogen are very energy-intensive and require the burning of fossil fuels to generate enormous amounts of heat, and this produces a lot of greenhouse gas as a byproduct," said Singh. Singh and colleagues have developed a new method to produce ammonia that relies on the use of a mesh screen coated in copper -- a catalyst that helps bind nitrogen to hydrogen to make ammonia. The electrification of the screen helps drive the reactions. Pure nitrogen gas is pushed through the screen and then interacts with water, which provides the hydrogen. Even though Singh's process uses similar amounts of energy compared to the traditional process, it requires far less fossil fuel than traditional methods -- just enough to electrify the screen. "The electricity can come from solar or wind energy, which would really make a huge difference in reducing greenhouse gas emissions," said Singh. "Even modern electricity-generating powerplants are highly efficient, and if the grid is powered conventionally, our process still uses less fossil fuels and generates less harmful greenhouse gases than conventional ammonia production." Currently, Singh's process produces 20 percent ammonia and 80 percent hydrogen gas.
"We are hoping to increase the production of ammonia, but our early efforts so far are promising, and the savings in the carbon emissions are still significant if you were to scale up our process to produce large amounts of ammonia," Singh said. A provisional patent for the new process has been filed by the UIC Office of Technology Management. Singh's group is now looking at using air -- instead of purified nitrogen gas -- as a source of nitrogen for producing ammonia using their unique method. "Using air would give us even more savings when it comes to greenhouse gases because we're using readily available air instead of nitrogen gas, which needs to be purified and bottled."
Pollution
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201103622.htm
Air pollution spikes linked to lower test scores for Salt Lake County third graders
Fine particulate matter (PM2.5), the tiny particles responsible for hazy air pollution, is detrimental to children's health even inside the classroom. Mounting evidence has linked chronic exposure with poor academic performance in K-12 students. Until now, no research had examined the impact of "peak" air pollution events, the 24-hour spikes of extremely high PM2.5 levels. For students in Salt Lake County, Utah, these episodes are a dangerous reality -- the county's largest city, Salt Lake City, was among the top 10 most polluted American cities for short-term particle pollution in the latest American Lung Association report.
In a new study, University of Utah researchers found that more frequent peak air pollution exposure was associated with reduced math and English language arts (ELA) test scores for third graders in all primary public schools in Salt Lake County during the 2016-2017 year. The minimum peak pollution levels in this study are below what the U.S. Environmental Protection Agency (EPA) determines are "safe" levels of PM2.5. The results stress the need for legislators to enact policies that reduce the number of peak pollution days, and to advocate for lower federal pollution standards. "The huge takeaway is that this isn't about school location -- it's not just the schools in the most polluted parts of the city. Everyone is impacted by peak pollution," said lead author Casey Mullen, doctoral student in sociology at the University of Utah. The study highlights the environmental injustice of Salt Lake County's air pollution problem -- schools with a higher proportion of students of color and from households experiencing poverty were exposed to higher mean concentrations of fine particulates and more peak pollution days than were schools serving middle- to upper-class and predominately white students. Peak exposures, though, had a stronger effect on lower math proficiency in more socially advantaged schools. "There are so many studies showing us that air pollution damages our brain's cognitive processing ability," said co-author Sara Grineski, professor of sociology at the University of Utah. "Utah has made great strides to lower pollution in the past decades, but we need to keep pushing forward policies to reduce pollution. We already know it will improve Utahns' respiratory health, but it also can help kids do a little bit better in schools." The paper was published online on Sept. 22, 2020. The researchers looked at the Student Assessment of Growth and Excellence (SAGE) math and ELA scores of third graders in 156 primary public schools in Salt Lake County in 2017.
They focused on the percentage of students whose scores were lower than grade-level expectations. In order to ensure that air pollution was the only variable affecting test scores, the researchers created a school disadvantage variable that took into account Title I school status; the percentage of Hispanic students; the percentage of non-Hispanic minority students who were Black, Asian/Pacific Islander or Native American/Alaska Native; and the percentage of students on free and/or reduced-price meals. They also accounted for the school's neighborhood context. "It is important to account for social disadvantage since social factors are tightly linked with standardized test scores. Students from low-income families have additional struggles that don't tend to affect more affluent students, such as food insecurity. Students from racial/ethnic minority backgrounds often have unequal educational experiences in the U.S. In some cases, they are immigrants themselves or the children of immigrants, and they might still be learning English," said Grineski. "These factors influence standardized test scores." Then, they evaluated each school's chronic and peak air pollution concentrations. For chronic air pollution levels, they analyzed the daily PM2.5 concentrations for each school using daily concentrations in the census tract housing each school from the U.S. EPA's Downscaler data. On average, schools had chronic PM2.5 levels of just over 8 micrograms per cubic meter. To establish peak air pollution episodes, the researchers identified the number of days each school was exposed to PM2.5 levels in the 95th percentile of PM2.5 concentrations for the year, which was 23 micrograms per cubic meter. The U.S. EPA's unhealthy air standard is 35 micrograms per cubic meter. While chronic pollution exposure was associated with lower test scores, the effect disappeared when researchers controlled for the social disadvantage factors.
In contrast, the frequency of peak pollution exposure was associated with a higher percentage of students who tested below grade proficiency in math and ELA, even after controlling for social disadvantage. "Research shows that air pollution is associated with brain cell inflammation. That is well-established," said Mullen. "With that in mind, our study findings suggest that future research should examine if repeated peak exposures to concentrations of fine particulate matter might be more damaging to children's brain functioning than chronic exposures." The authors suggest that legislators could advance public health initiatives that protect children from exposure to air pollution. The researchers have consulted with urban planners about creating cleaner air routes for children to move around their communities. In the future, the state could improve regulations that would prevent schools from being built in high-pollution areas. For now, investing in better air filtration systems in classrooms could help mitigate the poor air quality that already exists. In the face of growing evidence that even low PM2.5 concentrations do damage to the human body, legislators should advocate for lower federal air pollution standards. "Here, with constant air quality issues throughout the seasons, we need to increase awareness about air pollution exposures," Mullen said. "This is a conversation everyone should be having -- people need to be informed about what's at stake."
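The peak-day definition used in the study -- days above the 95th percentile of a year's daily PM2.5 readings -- is straightforward to compute. The readings below are synthetic, for illustration only:

```python
import random

random.seed(0)
# Synthetic daily PM2.5 readings (ug/m^3) for one year; not the study's data.
daily_pm25 = [random.lognormvariate(2.0, 0.5) for _ in range(365)]

ranked = sorted(daily_pm25)
threshold = ranked[int(0.95 * len(ranked))]  # simple 95th-percentile estimate
peak_days = sum(1 for x in daily_pm25 if x > threshold)
print(f"95th-percentile threshold: {threshold:.1f} ug/m^3, peak days: {peak_days}")
```

For Salt Lake County in the study year this threshold worked out to 23 micrograms per cubic meter, still well below the EPA's 35-microgram unhealthy-air standard.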
Pollution
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130201958.htm
Engineers combine light and sound to see underwater
Stanford University engineers have developed an airborne method for imaging underwater objects by combining light and sound to break through the seemingly impassable barrier at the interface of air and water.
The researchers envision their hybrid optical-acoustic system one day being used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches of sunken ships and planes, and map the ocean depths with a similar speed and level of detail as Earth's landscapes. Their "Photoacoustic Airborne Sonar System" is detailed in a recently published, peer-reviewed study. "Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth's landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage. However, seawater is much too absorptive for imaging into the water," said study leader Amin Arbabian, an associate professor of electrical engineering in Stanford's School of Engineering. "Our goal is to develop a more robust system which can image even through murky water." Oceans cover about 70 percent of the Earth's surface, yet only a small fraction of their depths have been subjected to high-resolution imaging and mapping. The main barrier has to do with physics: Sound waves, for example, cannot pass from air into water or vice versa without losing most -- more than 99.9 percent -- of their energy through reflection against the other medium. A system that tries to see underwater using soundwaves traveling from air into water and back into air is subjected to this energy loss twice -- resulting in a 99.9999 percent energy reduction. Similarly, electromagnetic radiation -- an umbrella term that includes light, microwave and radar signals -- also loses energy when passing from one physical medium into another, although the mechanism is different than for sound. "Light also loses some energy from reflection, but the bulk of the energy loss is due to absorption by the water," explained study first author Aidan Fitzpatrick, a Stanford graduate student in electrical engineering.
Incidentally, this absorption is also the reason why sunlight can't penetrate to the ocean depth and why your smartphone -- which relies on cellular signals, a form of electromagnetic radiation -- can't receive calls underwater. The upshot of all of this is that oceans can't be mapped from the air and from space in the same way that the land can. To date, most underwater mapping has been achieved by attaching sonar systems to ships that trawl a given region of interest. But this technique is slow and costly, and inefficient for covering large areas. Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to break through the air-water interface. The idea for it stemmed from another project that used microwaves to perform "non-contact" imaging and characterization of underground plant roots. Some of PASS's instruments were initially designed for that purpose in collaboration with the lab of Stanford electrical engineering professor Butrus Khuri-Yakub. At its heart, PASS plays to the individual strengths of light and sound. "If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds," Fitzpatrick said. To do this, the system first fires a laser from the air that gets absorbed at the water surface. When the laser is absorbed, it generates ultrasound waves that propagate down through the water column and reflect off underwater objects before racing back toward the surface. The returning sound waves are still sapped of most of their energy when they breach the water surface, but by generating the sound waves underwater with lasers, the researchers can prevent the energy loss from happening twice. "We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging," Arbabian said. The reflected ultrasound waves are recorded by instruments called transducers.
Software is then used to piece the acoustic signals back together like an invisible jigsaw puzzle and reconstruct a three-dimensional image of the submerged feature or object. "Similar to how light refracts or 'bends' when it passes through water or any medium denser than air, ultrasound also refracts," Arbabian explained. "Our image reconstruction algorithms correct for this bending that occurs when the ultrasound waves pass from the water into the air." Conventional sonar systems can penetrate to depths of hundreds to thousands of meters, and the researchers expect their system will eventually be able to reach similar depths. To date, PASS has only been tested in the lab in a container the size of a large fish tank. "Current experiments use static water but we are currently working toward dealing with water waves," Fitzpatrick said. "This is a challenging but, we think, feasible problem." The next step, the researchers say, will be to conduct tests in a larger setting and, eventually, an open-water environment. "Our vision for this technology is on-board a helicopter or drone," Fitzpatrick said. "We expect the system to be able to fly at tens of meters above the water."
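The energy argument can be restated in decibels, which makes clear what generating the sound in the water buys. The 0.1 percent transmission figure is taken from the article; the decibel conversion is standard:

```python
import math

def loss_db(fraction_retained):
    """Transmission loss in decibels for a given fraction of energy retained."""
    return -10.0 * math.log10(fraction_retained)

crossing = 0.001  # ~0.1% of acoustic energy survives one air-water crossing
round_trip = loss_db(crossing ** 2)  # conventional airborne sonar: two crossings
one_way = loss_db(crossing)          # PASS: only the echo crosses the interface
print(f"two crossings: {round_trip:.0f} dB loss, one crossing: {one_way:.0f} dB loss")
```

Halving the loss in dB terms (60 dB down to 30 dB) corresponds to a thousand-fold increase in recoverable signal energy, which is what makes detection feasible.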
Pollution
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130131519.htm
Plastic contaminants harm sea urchins
Plastics in the ocean can release chemicals that cause deformities in sea urchin larvae, new research shows.
Scientists soaked various plastic samples in seawater, then removed the plastic and raised sea urchin embryos in the water. The study, led by the University of Exeter, found that urchins developed a variety of abnormalities, including deformed skeletons and nervous systems. These abnormalities were caused by chemicals embedded in the plastics leaching out into the water, rather than the plastics themselves. The plastic-to-water ratio in the study would only be seen in severely polluted places, but the findings raise questions about the wider impact of plastic contaminants on marine life. "We are learning more and more about how ingesting plastic affects marine animals," said Flora Rendell-Bhatti, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall. "However, little is known about the effects of exposure to chemicals that leach into the water from plastic particles. This study provides evidence that contamination of the marine environment with plastic could have direct implications for the development of larvae, with potential impacts on wider ecosystems. Our work contributes to the growing evidence that we all need to help reduce the amount of plastic contamination released into our natural environment, to ensure healthy and productive ecosystems for future generations." Dr Eva Jimenez-Guri, also of the Centre for Ecology and Conservation, added: "Many plastics are treated with chemicals for a variety of purposes, such as making them mouldable or flame retardant. If such plastics find their way to the oceans, these chemicals can leach out into the water. Plastics can also pick up and transport chemicals and other environmental contaminants, potentially spreading them through the oceans." The study used pre-production "nurdles" (pellets from which most plastics are made) from a UK supplier, and also tested nurdles and "floating filters" (used in water treatment) found on beaches in Cornwall, UK. For the tests, each plastic type was soaked in seawater for 72 hours, then the plastic was removed. Analysis of the water showed all samples contained chemicals known to be detrimental to development of animals, including polycyclic aromatic hydrocarbons and polychlorinated biphenyls. Water from the different kinds of plastic affected urchin development in slightly different ways, though all sample types led to deformity of skeletons and nervous systems, and caused problems with gastrulation (when embryos begin to take shape). The study also raised urchin embryos in water that had contained "virgin" polyethylene particles that had not been treated with additive chemicals or collected any environmental pollutants. These urchins developed normally, suggesting that abnormalities observed in other samples were caused by industrial additives and/or environmentally adsorbed contaminants -- rather than the base plastics themselves. Nurdles and floating filters are not waste products, so they are not deliberately discarded, but the study highlights the importance of preventing their accidental release. The researchers say most plastics may have similar effects as those in the study, so the findings emphasise the importance of finding alternatives to replace harmful additives, and reducing overall marine plastic pollution. The study was funded by the European Union's Horizon 2020 programme and the Natural Environment Research Council.
Pollution
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130131412.htm
Caribbean coral reefs under siege from aggressive algae
Human activity endangers coral health around the world. A new algal threat is taking advantage of coral's already precarious situation in the Caribbean and making it even harder for reef ecosystems to grow.
Just-published research documents this emerging threat. For the past four years, the University of Oxford's Bryan Wilson, Carnegie's Chen-Ming Fan, and California State University Northridge's Peter Edmunds have been studying the biology and ecology of peyssonnelid algal crusts, or PAC, in the U.S. Virgin Islands, which are out-competing coral larvae for limited surface space and then growing over the existing reef architecture, greatly damaging these fragile ecosystems. "This alga seems to be something of an ecological winner in our changing world," described lead author Wilson, noting that the various other threats to coral communities make them more susceptible to the algal crusts. Edmunds first took note of the crusts' invasive growth in the wake of category 5 hurricanes Irma and Maria, when they were rapidly taking over spaces that had been blasted clean by the storms. Corals are marine invertebrates that build large exoskeletons from which reefs are constructed. To grow new reef structures, free-floating baby corals first have to successfully attach to a stable surface. They prefer to settle on the crusty surface created by a specific type of friendly algae that grows on the local rocks. These coralline crustose algae, or CCA, act as guideposts for the coral larvae, producing biochemical signals along with their associated microbial community, which entice the baby coral to affix itself. What puzzled the researchers is that both the destructive PAC and the helpful CCA grow on rocks and create a crust, yet PAC exclude coral settlement while CCA entice it. What drives this difference? The team set out to determine how the golden-brown PAC affect Caribbean coral reefs, and found that the PAC harbor a microbial community that is distinct from the one associated with CCA, which is known to attract corals. "These PAC crusts have biochemical and structural defenses that they deploy to deter grazing from fish and other marine creatures," explained Fan. "It is possible that these same mechanisms, which make them successful at invading the marine bio-space, also deter corals." More research is needed to elucidate the tremendous success that the algal crusts are having in taking over Caribbean reef communities and to look for ways to mitigate the risk that they pose. "There is a new genomic and evolutionary frontier to explore to help us understand the complexity of organismal interactions on the reef, both mutualistic and antagonistic," added Fan. Lead author Wilson concluded: "The coral and their ecosystem are so fragile as it is. They are under assault by environmental pollution and global warming. We have made their lives so fragile, yet they are sticking in there. And now this gets thrown into the mix. We don't know if this is the straw that breaks the camel's back, but we need to find out."
Pollution
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130113536.htm
Forest fires, cars, power plants join list of risk factors for Alzheimer's disease
A new study led by researchers at UC San Francisco has found that among older Americans with cognitive impairment, the greater the air pollution in their neighborhood, the higher the likelihood of amyloid plaques -- a hallmark of Alzheimer's disease. The study adds to a body of evidence indicating that pollution from cars, factories, power plants and forest fires joins established dementia risk factors like smoking and diabetes.
When applied to the U.S. population, with an estimated 5.8 million people over 65 with Alzheimer's disease, the study's findings suggest that high exposure to microscopic airborne particles may be implicated in tens of thousands of cases. "This study provides additional evidence to a growing and convergent literature, ranging from animal models to epidemiological studies, that suggests air pollution is a significant risk factor for Alzheimer's disease and dementia," said senior author Gil Rabinovici, MD, of the UCSF Memory and Aging Center, Department of Neurology and the Weill Institute for Neurosciences. The 18,178 participants had been recruited for the IDEAS study (Imaging Dementia -- Evidence for Amyloid Scanning), which had enrolled Medicare beneficiaries whose mild cognitive impairment or dementia had been diagnosed following comprehensive evaluation. Not all of the participants were later found to have positive PET scans -- 40 percent showed no evidence of plaques on the scan, suggesting non-Alzheimer's diagnoses like frontotemporal or vascular dementias, which are not associated with the telltale amyloid plaques. Air pollution in the neighborhood of each participant was estimated with Environmental Protection Agency data that measured ground-level ozone and PM2.5, atmospheric particulate matter that has a diameter of less than 2.5 micrometers. The researchers also divided locations into quartiles according to the concentration of PM2.5. They found that the probability of a positive PET scan rose progressively as concentrations of pollutants increased, and predicted a difference of 10 percent in probability between the least and most polluted areas. "Exposure in our daily lives to PM2.5, even at levels that would be considered normal, could contribute to induce a chronic inflammatory response," said first author Leonardo Iaccarino, PhD, also of the UCSF Memory and Aging Center, Department of Neurology and the Weill Institute for Neurosciences. "Over time, this could impact brain health in a number of ways, including contributing to an accumulation of amyloid plaques." Overall concentrations of PM2.5 did not need to be very high to show a significant association with amyloid plaques, amounting to levels comparable to annual averages in San Francisco during the study period, added Rabinovici. "I think it's very appropriate that air pollution has been added to the modifiable risk factors highlighted by the Lancet Commission on dementia," he said, referring to the journal's decision this year to add air pollution, together with excessive alcohol intake and traumatic brain injury, to its list of risk factors. The study complements previous large-scale studies that tie air pollution to dementia and Parkinson's disease, and adds novel findings by including a cohort with mild cognitive impairment -- a frequent precursor to dementia -- and using amyloid plaques as a biomarker of disease. Other studies have linked air pollution to adverse effects on cognitive, behavioral and psychomotor development in children, including a UCSF-University of Washington study that looked at its impact on the IQ of the offspring of pregnant women.
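The quartile grouping described above can be sketched in a few lines. This is an illustrative example only: the PM2.5 values below are invented sample concentrations, not the EPA data used in the study.

```python
# Sketch of dividing locations into quartiles by PM2.5 concentration.
# The concentrations are invented sample values, not EPA measurements.
import statistics

pm25 = [5.1, 6.3, 7.8, 8.2, 9.0, 9.9, 11.4, 12.7, 13.5, 15.2, 16.8, 18.1]

# Cut points splitting the sample into four equal-count groups
q1, q2, q3 = statistics.quantiles(pm25, n=4)

def quartile(concentration):
    """Return the quartile (1-4) a PM2.5 concentration falls into."""
    if concentration <= q1:
        return 1
    if concentration <= q2:
        return 2
    if concentration <= q3:
        return 3
    return 4

print([quartile(x) for x in pm25])  # [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4]
```

Grouping locations this way lets an analysis compare outcome rates (here, the probability of a positive PET scan) between the least and most polluted bins without assuming a linear dose-response.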
Pollution
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130101239.htm
Even razor clams on sparsely populated Olympic Coast can't escape plastics
Portland State University researchers and their collaborators at the Quinault Indian Nation and Oregon State University found microplastics in Pacific razor clams on Washington's sparsely populated Olympic Coast -- proof, they say, that even in more remote regions, coastal organisms can't escape plastic contamination.
Microplastics are pieces of plastic smaller than 5 millimeters that are either intentionally produced at that size or break down from synthetic clothing, single-use plastic items, or other products. These particles enter the environment and pervade freshwater and marine environments, soils and even the air we breathe. Britta Baechler, the study's lead author and a recent graduate of PSU's Earth, Environment and Society doctoral program, analyzed the concentrations of microplastics in razor clams collected from eight beaches along the Washington coast and, after surveying recreational clam harvesters, estimated the annual microplastic exposure of those who eat them. The Pacific razor clam is one of the most sought-after shellfish in Washington. The state's Department of Fish and Wildlife said that during a recent season, the recreational razor clam fishery saw more than 280,000 digger trips, with diggers harvesting 4 million clams for the season. It's also a key first food, cultural resource, and vital source of income for members of the Quinault Indian Nation. During the study, a total of 799 suspected microplastics were found in the 138 clam samples, 99% of which were microfibers. On average, clams had seven pieces of plastic each. Clams from Kalaloch Beach, the northernmost site near the mouth of Puget Sound, contained significantly more microplastics than clams from the other seven sites. Though the study did not explore the reasons behind this, Baechler noted that there were no major differences in land cover types between Kalaloch and the other sites, but Kalaloch is the closest of all the sites to the densely populated Seattle metro area. Baechler's team compared whole clams -- minimally processed, as if being consumed by an animal predator -- and cleaned clams -- gutted, cleaned of sand debris and grit, and prepared as if being eaten by a person. They found that in thoroughly cleaned clams, the amount of microplastics was reduced by half. Baechler said this bodes better for people -- 88 percent of the survey respondents reported cleaning clams before eating them -- than for ocean predators, which aren't afforded the luxury of cleaning clams prior to consumption. Surveys of 107 recreational harvesters determined the average number of razor clams consumed per meal and the number of meals containing clams each year. Combining consumption information with the average number of microplastics per clam, the researchers estimated Olympic Coast razor clam harvester-consumers were exposed to between 60 and 3,070 microplastics per year if they thoroughly cleaned their clams before eating them, or between 120 and 6,020 microplastics a year if they ate them whole, without removing the guts, gills or other organs. "We don't know the exact human health impacts of the microplastics we inevitably ingest through food and beverages," said Baechler, who now works as an ocean plastics researcher at Ocean Conservancy. "Our estimates of microplastic exposure from this single seafood item are, for context, far lower than what we likely take in from inhalation, drinking bottled water and other sources, but no amount of plastic in our marine species or seafood items is desirable." Baechler and Elise Granek, a professor of environmental science at PSU, said that everyone has a role to play in reducing plastic pollution in the marine environment -- from plastic producers and product designers, who can develop effective upstream pollution control solutions, to consumers, who can make substitutions in their daily lives to reduce their plastic footprints. "We all have become dependent on plastics for our clothing and packaging, and the more plastic we use, the more likely it's going to end up in our drinking water, our food and our air," Granek said.
"All of us have a responsibility to do what we can to limit the amount of plastic that we're using."
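The exposure arithmetic reported above is simple to reproduce: particles per clam multiplied by clams per meal and clam meals per year. A minimal sketch follows, in which the consumption figures are hypothetical placeholders rather than the study's survey data; only the average of about seven particles per whole clam, and the halving from thorough cleaning, come from the article.

```python
# Sketch of the annual-exposure estimate: particles per clam x clams
# per meal x clam meals per year. Consumption figures are hypothetical.

PLASTICS_PER_WHOLE_CLAM = 7.0    # study average for whole clams
PLASTICS_PER_CLEANED_CLAM = 3.5  # thorough cleaning roughly halves the count

def annual_exposure(plastics_per_clam, clams_per_meal, meals_per_year):
    """Estimated microplastic particles ingested per year from razor clams."""
    return plastics_per_clam * clams_per_meal * meals_per_year

# Hypothetical harvester: 6 clams per meal, 12 clam meals a year
print(annual_exposure(PLASTICS_PER_WHOLE_CLAM, 6, 12))    # 504.0
print(annual_exposure(PLASTICS_PER_CLEANED_CLAM, 6, 12))  # 252.0
```

Varying the per-meal and per-year inputs across the survey responses is what produces the study's wide 60-6,020 particles-per-year range.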
Pollution
2020
November 26, 2020
https://www.sciencedaily.com/releases/2020/11/201119103043.htm
A filter for environmental remediation
A team of researchers at Osaka University has developed a nanopowder shaped like seaweed for a water filter to help remove toxic metal ions. Made of layered sodium titanate, the randomly oriented nanofibers increase the efficacy of cobalt-II (Co²⁺) removal.
As the global population continues to increase, so will the need for drinkable water. Sadly, many water sources have become contaminated with heavy metals, such as cobalt, from industrial waste or radioactive runoff. Sodium titanate has been widely used to filter out these toxic substances, but its efficiency is not high enough. Sodium titanate is generally a two-dimensional layered material, but its crystal structure can vary based on the chemical composition and preparation method. To effectively capture radioactive and/or heavy metal ions, morphological control of the sodium titanate is very important. Now, researchers from the Institute of Scientific and Industrial Research at Osaka University have developed a new method to create highly efficient sodium titanate filters. "We used a template-free alkaline hydrothermal process to produce the mats," first author Yoshifumi Kondo says. The researchers found that increasing the hydrothermal synthesis time caused the initially round crystals to become elongated and fibrous, and to form seaweed-shaped mats consisting of randomly oriented nanofibers. This seaweed-like nanoscale morphology increased the surface area of the mats, which improved the removal efficiency of Co²⁺ ions. "Due to the progress of global warming and serious environmental pollution, the need for safe ways to remove radioactive materials and heavy metals from water resources has become even more critical," senior author Tomoyo Goto says. Compared with a commercially available material, the nanostructured sodium titanate mats showed improved performance for capturing Co²⁺ ions.
Pollution
2020
November 25, 2020
https://www.sciencedaily.com/releases/2020/11/201125091503.htm
Ghost fishing threatens endangered river dolphins, critically endangered turtles, otters
Waste fishing gear in the River Ganges poses a threat to wildlife including otters, turtles and dolphins, new research shows.
The study says entanglement in fishing gear could harm species including the critically endangered three-striped roofed turtle and the endangered Ganges river dolphin. Surveys along the length of the river, from the mouth in Bangladesh to the Himalayas in India, show levels of waste fishing gear are highest near the sea. Fishing nets -- all made of plastic -- were the most common type of gear found. Interviews with local fishers revealed high rates of fishing equipment being discarded in the river -- driven by short gear lifespans and a lack of appropriate disposal systems. The study, led by researchers from the University of Exeter with an international team including researchers from India and Bangladesh, was conducted as part of the National Geographic Society's "Sea to Source: Ganges" expedition. "The Ganges River supports some of the world's largest inland fisheries, but no research has been done to assess plastic pollution from this industry, and its impacts on wildlife," said Dr Sarah Nelms, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall. "Ingesting plastic can harm wildlife, but our threat assessment focussed on entanglement, which is known to injure and kill a wide range of marine species." The researchers used a list of 21 river species of "conservation concern" identified by the Wildlife Institute of India. They combined existing information on entanglements of similar species worldwide with the new data on levels of waste fishing gear in the Ganges to estimate which species are most at risk. Speaking about why so much fishing gear was found in the river, Dr Nelms said: "There is no system for fishers to recycle their nets." "Most fishers told us they mend and repurpose nets if they can, but if they can't do that, the nets are often discarded in the river." "Many held the view that the river 'cleans it away', so one useful step would be to raise awareness of the real environmental impacts." National Geographic Fellow and science co-lead of the expedition Professor Heather Koldewey, of ZSL (the Zoological Society of London) and the University of Exeter, said the study's findings offer hope for solutions based on a "circular economy" -- where waste is dramatically reduced by reusing materials. "A high proportion of the fishing gear we found was made of nylon 6, which is valuable and can be used to make products including carpets and clothing," she said. "Collection and recycling of nylon 6 has strong potential as a solution because it would cut plastic pollution and provide an income." "We demonstrated this through the Net-Works project in the Philippines, which has been so successful it has become a standalone social enterprise called COAST-4C." Professor Koldewey added: "This is a complex problem that will require multiple solutions -- all of which must work for both local communities and wildlife." Dr Nelms' work was partly funded by the ExeMPLaR Project, and was supported by access to the analytical facilities of the Greenpeace Research Laboratories.
Pollution
2020
November 24, 2020
https://www.sciencedaily.com/releases/2020/11/201124150845.htm
Clean Air Act saved 1.5 billion birds
U.S. pollution regulations meant to protect humans from dirty air are also saving birds. So concludes a new continentwide study published today.
"Our research shows that the benefits of environmental regulation have likely been underestimated," says Ivan Rudik, a lead author and Ruth and William Morgan Assistant Professor at Cornell's Dyson School of Applied Economics and Management. "Reducing pollution has positive impacts in unexpected places and provides an additional policy lever for conservation efforts." Ozone is a gas that occurs in nature and is also produced by human activities, including by power plants and cars. It can be good or bad. A layer of ozone in the upper atmosphere protects the Earth from the harmful ultraviolet rays of the sun. But ground-level ozone is hazardous and is the main pollutant in smog. To examine the relationship between bird abundance and air pollution, the researchers used models that combined bird observations from the Cornell Lab of Ornithology's eBird program with ground-level pollution data and existing regulations. They tracked monthly changes in bird abundance, air quality, and regulation status for 3,214 U.S. counties over a span of 15 years. The team focused on the NOx (nitrogen oxide) Budget Trading Program, which was implemented by the U.S. Environmental Protection Agency to protect human health by limiting summertime emissions of ozone precursors from large industrial sources. Study results suggest that ozone pollution is most detrimental to the small migratory birds (such as sparrows, warblers, and finches) that make up 86 percent of all North American landbird species. Ozone pollution directly harms birds by damaging their respiratory system, and indirectly affects birds by harming their food sources. "Not only can ozone cause direct physical damage to birds, but it also can compromise plant health and reduce numbers of the insects that birds consume," explains study author Amanda Rodewald, Garvin Professor at the Cornell Department of Natural Resources and the Environment and Director of the Center for Avian Population Studies at the Cornell Lab of Ornithology. "Not surprisingly, birds that cannot access high-quality habitat or food resources are less likely to survive or reproduce successfully. The good news here is that environmental policies intended to protect human health return important benefits for birds too." Last year, a separate study by the Cornell Lab of Ornithology showed that North American bird populations have declined by nearly 3 billion birds since 1970 (Rosenberg et al.). "This is the first large-scale evidence that ozone is associated with declines in bird abundance in the United States and that regulations intended to save human lives also bring significant conservation benefits to birds," says Catherine Kling, Tisch University Professor at the Cornell Dyson School of Applied Economics and Management and Faculty Director at Cornell's Atkinson Center for Sustainability. "This work contributes to our ever-increasing understanding of the connectedness of environmental health and human health."
Pollution
2020
November 24, 2020
https://www.sciencedaily.com/releases/2020/11/201124111313.htm
COVID-19: Air quality influences the pandemic
The correlation between the high concentration of fine particles and the severity of influenza waves is well known to epidemiologists. An interdisciplinary team from the University of Geneva (UNIGE) and the ETH Zürich spin-off Meteodat investigated possible interactions between acutely elevated levels of fine particulate matter and the virulence of the coronavirus disease. Their results suggest that acute increases in fine-particle concentrations may influence the severity of COVID-19 outbreaks.
Epidemiologists widely agree that there is a correlation between acute and locally elevated concentrations of fine particles and the severity of influenza waves. "We have investigated whether such a link also exists with the virulence of COVID-19 disease," says Mario Rohrer, researcher at the Institute for Environmental Sciences of the Faculty of Sciences of UNIGE and director of Meteodat. COVID-19 studies conducted in Italy and France suggest that SARS-CoV-2 was already present in Europe at the end of 2019, while the sharp increase in morbidity and mortality was only recorded in spring 2020 in Paris and London. "This time lag is surprising, but also suggests that something else than just the mere interaction of people may promote the transmission of the virus, and particularly the severity of the infection," says Mario Rohrer. His research team has been able to show that these increases in cases followed phases where the levels of fine particles in the air were higher. The team made similar observations in the Swiss canton of Ticino, where fine-particle pollution increased sharply during a period of shallow fog on the Magadino plain and the Sotto Ceneri, observed at the end of February 2020. "Shortly afterwards, an explosive increase in hospital admissions due to COVID-19 was recorded in Ticino. The fact that a large carnival event with some 150,000 visitors took place at the same time probably had an additional impact on the spread of the virus," says Mario Rohrer. The information is important for Switzerland because the increase in fine particle concentrations is particularly frequent during thermal inversions, i.e. when fog forms on the Swiss Plateau, thus limiting the exchange of air masses. In these situations, emissions accumulate in the layer of air underneath the fog. Switzerland is also frequently swept by dust from Saharan sandstorms, a factor also pointed out in this study. The Swiss research team shows that acute concentrations of fine particles, especially those smaller than 2.5 micrometers, cause inflammation of the respiratory, pulmonary and cardiovascular tracts and thicken the blood. "In combination with a viral infection, these inflammatory factors can lead to a serious progression of the disease. Inflammation also promotes the attachment of the virus to cells," he says. In addition, the coronavirus may also be transported by the fine particles. "This has already been demonstrated for influenza, and an Italian study found coronavirus RNA on fine particles. All this remains to be demonstrated, of course, but it is a likely possibility," adds Rohrer. Nonetheless, the researchers also emphasize that, although particulate matter pollution can influence the virulence of the virus and possible severe disease progression, physiological, social or economic factors will clearly also influence the further course of the pandemic. Mario Rohrer concludes that the findings of this study offer the possibility of taking preventive measures in the event of future increases in fine particulate matter concentrations, thus limiting a new flare-up of COVID-19 morbidity and mortality.
Pollution
2020
November 23, 2020
https://www.sciencedaily.com/releases/2020/11/201123173452.htm
The science of windy cities
Global population and urbanization have boomed over the last few decades. With them came scores of new tall buildings, drones, more energy-efficient ventilation systems, and planned air taxis by Uber and other companies. But these technological advancements must contend with a natural physical phenomenon: wind.
Scientists presented the latest findings on modeling and predicting urban airflow -- in the hope of building better buildings, cities, and transportation -- at the 73rd Annual Meeting of the American Physical Society's Division of Fluid Dynamics. The urban skies of the future could teem with autonomous aircraft: air taxis, drones, and other self-flying systems. A team from Oklahoma State University has developed techniques to model environmental hazards these vehicles might encounter so they can safely navigate cities. "Urban environments present enormous challenges for drone and urban air mobility platforms," said researcher Jamey Jacob, who led the team. "In addition to the challenges of traffic congestion and obstacles, critical technology gaps exist in modeling, detecting, and accommodating the dynamic urban local wind fields as well as in precision navigation through uncertain weather conditions." Researchers attached sensors to robotic aircraft to take more cohesive measurements of building wakes, or the disturbed airflow around buildings. They combined this data with numerical predictions to get a better picture of the complex wind patterns found in urban environments. The work could help improve wind and weather forecasting, not only for unmanned aircraft but also for conventional airplanes. "The potential of outfitting every drone and urban air taxi, as well as other aircraft, with sensors provides a game-changing opportunity in our capability to monitor, predict, and report hazardous weather events," said Jacob. Another group, based at the University of Surrey, also investigated building wakes. With an eye toward enhancing air quality in cities, they looked for wake differences between a single tall building and a cluster of tall buildings. "Understanding how to model the wake of tall buildings is the first step to enable city planners to reduce the heat-island effect as well as improve urban air quality," said Joshua Anthony Minien, a researcher in mechanical engineering. The team carried out experiments in a wind tunnel, varying the grouping, aspect ratio, and spacing of tall buildings. They were encouraged to see that, when measured far enough downstream, a cluster of buildings and an isolated building have similar wake characteristics. Changes to wind direction also seem to significantly affect the wakes of clusters of buildings. All buildings, tall or not, must be ventilated. "The ability to predict ventilation flow rates, purging times and flow patterns is important for human comfort and health, as highlighted by the need to prevent the airborne spread of coronavirus," said University of Cambridge researcher Nicholas Wise. With engineering professor Gary Hunt, Wise found a problem in current models of passive natural ventilation systems. These often use displacement flow -- where cooler night air enters a building through one opening and warmer air accumulated during the day exits through another opening. Their mathematical modeling revealed that displacement flow does not continue during the purge of warm air, as was believed. Instead, the room experiences an "unbalanced exchange flow," which can slow down the purging process. "Every displacement flow transitions to unbalanced exchange flow," said Wise. The researchers were surprised at just how much adding a small low-level opening speeds up room cooling, compared to a room with only a high-level opening. Their model will be useful for designers of natural ventilation systems.
Pollution
2020
November 22, 2020
https://www.sciencedaily.com/releases/2020/11/201122094643.htm
Airflow studies reveal strategies to reduce indoor transmission of COVID-19
Wear a mask. Stay six feet apart. Avoid large gatherings. As the world awaits a safe and effective vaccine, controlling the COVID-19 pandemic hinges on widespread compliance with these public health guidelines. But as colder weather forces people to spend more time indoors, blocking disease transmission will become more challenging than ever.
At the 73rd Annual Meeting of the American Physical Society's Division of Fluid Dynamics, researchers presented a range of studies investigating the aerodynamics of infectious disease. Their results suggest strategies for lowering risk based on a rigorous understanding of how infectious particles mix with air in confined spaces. Research early in the pandemic focused on the role played by large, fast-falling droplets produced by coughing and sneezing. However, documented super-spreader events hinted that airborne transmission of tiny particles from everyday activities may also be a dangerous route of infection. Fifty-three of 61 singers in Washington state, for example, became infected after a 2.5-hour choir rehearsal in March. Of 67 passengers who spent two hours on a bus with a COVID-19-infected individual in Zhejiang Province, China, 24 tested positive afterward. William Ristenpart, a chemical engineer at the University of California, Davis, found that when people speak or sing loudly, they produce dramatically larger numbers of micron-sized particles compared to when they use a normal voice. The particles produced during yelling, they found, greatly exceed the number produced during coughing. In guinea pigs, they observed that influenza can spread through contaminated dust particles. If the same is true for SARS-CoV-2, the researchers said, then objects that release contaminated dust -- like tissues -- may pose a risk. Abhishek Kumar, Jean Hertzberg, and other researchers from the University of Colorado, Boulder, focused on how the virus might spread during music performance. They discussed results from experiments designed to measure aerosol emission from instrumentalists. "Everyone was very worried about flutes early on, but it turns out that flutes don't generate that much," said Hertzberg. On the other hand, instruments like clarinets and oboes, which have wet vibrating surfaces, tend to produce copious aerosols. The good news is they can be controlled. "When you put a surgical mask over the bell of a clarinet or trumpet, it reduces the amount of aerosols back down to levels in a normal tone of voice." Engineers led by Ruichen He at the University of Minnesota investigated a similar risk-reduction strategy in their study of the flow field and aerosols generated by various instruments. Although the level of aerosols produced varied by musician and instrument, they rarely traveled more than a foot away. Based on their findings, the researchers devised a pandemic-sensitive seating model for live orchestras and described where to place filters and audience members to reduce risk. While many formerly office-bound employees continue to work from home, employers are exploring ways to safely reopen their workplaces by maintaining sufficient social distance between individuals. Using two-dimensional simulations that modeled people as particles, Kelby Kramer and Gerald Wang from Carnegie Mellon University identified conditions that would help avoid crowding and jamming in confined spaces like hallways. Traveling to and from office buildings in passenger cars also poses an infection risk. Kenny Breuer and his collaborators at Brown University performed numerical simulations of how air moves through passenger car cabins to identify strategies that may reduce infection risk. If air enters and exits a room at points far away from passengers, then it may reduce the risk of transmission. In a passenger car, they said, that means strategically opening some windows and closing others. MIT mathematicians Martin Bazant and John Bush proposed a new safety guideline, built on existing models of airborne disease transmission, to identify maximum levels of exposure in a variety of indoor environments. Their guideline depends on a metric called "cumulative exposure time," which is determined by multiplying the number of people in a room by the duration of the exposure. The maximum depends on the size and ventilation rate of the room, the face coverings of its occupants, the infectiousness of aerosolized particles, and other factors. To facilitate easy implementation of the guideline, the researchers worked with chemical engineer Kasim Khan to design an app and online spreadsheet that people can use to gauge the risk of transmission in a variety of settings. As Bazant and Bush wrote in a forthcoming paper on the work, staying six feet apart "offers little protection from pathogen-bearing aerosol droplets sufficiently small to be continuously mixed through an indoor space." A better, flow-dynamics-based understanding of how infected particles move through a room may ultimately yield smarter strategies for reducing transmission.
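The "cumulative exposure time" metric described above lends itself to a one-line calculation: occupants multiplied by hours of shared air, compared against a room-specific cap. In the sketch below, the cap value is a made-up placeholder; the actual guideline derives it from room size, ventilation rate, masking, aerosol infectiousness and other factors.

```python
# Sketch of a cumulative-exposure-time check: people x hours vs. a cap.
# The cap here is an arbitrary placeholder, not a value from the guideline.

def cumulative_exposure_time(occupants, hours):
    """Person-hours of shared indoor air."""
    return occupants * hours

def within_limit(occupants, hours, cap_person_hours):
    """True if the gathering stays under the assumed exposure cap."""
    return cumulative_exposure_time(occupants, hours) <= cap_person_hours

print(cumulative_exposure_time(10, 2.5))  # 25.0 person-hours
print(within_limit(10, 2.5, 20.0))        # False: cut occupancy or duration
```

The same cap can be met by trading occupancy against duration, which is the practical lever the guideline exposes to event organizers.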
Pollution
2020
November 23, 2020
https://www.sciencedaily.com/releases/2020/11/201123173448.htm
Indonesian wildfires a 'fixable problem'
Indonesian wildfires that cause widespread air pollution and vast carbon emissions are a "fixable problem," according to the leader of a project set up to help tackle the issue.
In dry years, Indonesia's peatland fires burn for months -- and smoke exposure from 2015 alone is expected to lead to up to 100,000 premature deaths. The 2015 fires, described as the year's "worst environmental disaster" (Guardian, 2015), also emitted four-and-a-half times as much carbon as the UK economy in a whole year.

A combination of factors including deforestation and climate change play a role, and Project "KaLi" (Kalimantan Lestari -- Sustainable Kalimantan) will investigate the causes and possible solutions. The project is funded by UK Research and Innovation (UKRI) through the Global Challenges Research Fund.

"The recurring peat fires in Indonesia are a huge problem and some of the statistics on their implications are just mind-boggling," said project leader Professor Frank Van Veen, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall. "Ultimately, though, this is a fixable problem and there is clearly a strong desire in Indonesia to get on top of this. This funding provides us with a fantastic opportunity to contribute to these efforts. It is really very exciting to be part of such an interdisciplinary team, conducting collaborative research that can make a real difference to people's lives and to the environment."

Indonesia's Central Kalimantan province, on the island of Borneo, is home to extensive peatlands and is the epicentre of the country's wildfires. Most fires in this region are started deliberately, primarily as part of agricultural practices, but the duration and severity of fires are strongly linked to El Niño-driven droughts, which may be exacerbated by ongoing deforestation.

"In their intact, naturally waterlogged, forested state, these peatlands rarely burn; fires are therefore concentrated in the extensive areas that have dried to some degree due to deforestation and drainage for agriculture and timber extraction," said Professor Susan Page of the School of Geography, Geology and the Environment at the University of Leicester. "Here, smouldering fires burn down into the underlying peat, often continuing for months and generating immense clouds of toxic haze. This is the primary cause of the air pollution events that are now happening in South East Asia in most years."

Dr Muhammad Ali Imron, Vice Dean for Research, Community Service and Cooperation of the Forestry Faculty at Gadjah Mada University, said: "The drivers behind the peatland fires are a combination of climatic processes, land use and ignition by humans. The resulting impacts are, therefore, to a large extent preventable -- but effective action requires a more detailed understanding of future climate-associated risks, physical conditions, and human systems and behaviour."

The project aims to develop this understanding and identify the groups and communities most at risk from wildfires. The team will then identify priority actions and policies to support Indonesian-led initiatives to reduce the risk of future fires. They will also identify "hurdles" that might prevent progress, with the goal of "better environmental and socio-economic circumstances for all."

Dr Darmae Nasir, Director of the Center for International Cooperation in Sustainable Management of Tropical Peatland (CIMTROP) at the University of Palangka Raya, said: "The ultimate aim of this project is to build long-term resilience to the multiple hazards associated with drought and fire in Central Kalimantan's peatlands. Fully understanding the human costs can guide the appropriate action to take to minimise the impacts when a disaster does occur. Our proposed research on building resilience emphasises the need to do this in the context of sustainable development and building positive economic opportunities for people in Indonesia."

Project partners include Indonesian government agencies and departments, an NGO with significant experience of engaging rural communities in the region, and equal partnerships between UK and Indonesian universities. The Indonesian institutions involved are Badan Restorasi Gambut (Peatland Restoration Agency), Yayasan Borneo Nature Indonesia and the universities of Gadjah Mada, Indonesia and Palangka Raya. The UK institutions are the universities of Exeter, Leicester, Leeds and East Anglia, and the London School of Economics.
Pollution
2020