Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---
March 30, 2021
|
https://www.sciencedaily.com/releases/2021/03/210330171006.htm
|
A second look at sunlight
|
A year ago scientists everywhere were scrambling to get their minds around SARS-CoV-2, the novel coronavirus that caused the pandemic from which we are only now beginning to emerge. The world clung to every new development, every bit of science that could provide clues to managing life in the presence of this mysterious killer.
|
Many science-backed COVID-19 management concepts remain unchanged to this day: handwashing with soap and warm water disrupts the virus's lipid membrane. Social distancing can attenuate the virus's spread, ideally keeping it out of a host until it degrades. Other notions, such as droplet contact being the primary mode of transmission, were modified when emerging evidence showed that under certain conditions, the virus could remain suspended in air for extended periods of time.

In a letter, the team describes how the idea that an additional mechanism might be in play arose when they compared data from a July 2020 study that reported rapid sunlight inactivation of SARS-CoV-2 in a lab setting with a theory of coronavirus inactivation by solar radiation published just a month earlier.

"The theory assumes that inactivation works by having UV-B hit the RNA of the virus, damaging it," said UC Santa Barbara mechanical engineering professor and lead author Paolo Luzzatto-Fegiz. Judging from the discrepancies between the experimental results and the predictions of the theoretical model, however, the research team felt that RNA inactivation by UV-B "might not be the whole story."

According to the letter, the experiments demonstrated virus inactivation times of about 10-20 minutes -- much faster than predicted by the theory. "The theory predicts that inactivation should happen an order of magnitude slower," Luzzatto-Fegiz said. In the experiments, viruses in simulated saliva exposed to UV-B lamps were inactivated more than eight times faster than the theory would have predicted, while those cultured in a complete growth medium before exposure to UV-B were inactivated more than three times faster than expected. To make the math of the theory fit the data, according to the letter, SARS-CoV-2 would have to exceed the highest UV-B sensitivity of any currently known virus.

Or, Luzzatto-Fegiz and colleagues reasoned, there could be another mechanism at play aside from RNA inactivation by UV-B rays. For instance, UV-A, another, less energetic component of sunlight, might be playing a more active role than previously thought. "People think of UV-A as not having much of an effect, but it might be interacting with some of the molecules in the medium," he said. Those reactive intermediate molecules in turn could be interacting with the virus, hastening inactivation. It's a concept familiar to those who work in wastewater treatment and other environmental science fields.

"So, scientists don't yet know what's going on," Luzzatto-Fegiz said. "Our analysis points to the need for additional experiments to separately test the effects of specific light wavelengths and medium composition."

Results of such experiments might provide clues to new ways of managing the virus with widely available and accessible UV-A and UV-B radiation. While UV-C radiation has proven effective against SARS-CoV-2, this wavelength does not reach the earth's surface and must be generated artificially. Although UV-C is presently used in air filtration and other settings, its short wavelengths and high energy also make UV-C the most damaging form of UV radiation, limiting its practical application and raising other safety concerns. "UV-C is great for hospitals," said co-author Julie McMurry. "But in other environments -- for instance kitchens or subways -- UV-C would interact with the particulates to produce harmful ozone."
While no single intervention will eliminate risk, this research could provide one further tool to reduce exposure, thus slowing transmission and improving health outcomes.

Co-author and UCSB mechanical engineering professor Yangying Zhu added that if UV-A turns out to be capable of inactivating the virus, that could be very advantageous: inexpensive LED bulbs many times stronger than natural sunlight are now widely available, which could accelerate inactivation times. UV-A could potentially be used far more broadly to augment air filtration systems at relatively low risk to human health, especially in high-risk settings such as hospitals and public transportation, though the specifics of each setting warrant consideration, said co-author Fernando Temprano-Coleto.

Research in this paper was also conducted by François J. Peaudecerf at ETH Zurich and Julien Landel at the University of Manchester.
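The gap between theory and experiment described above comes down to simple ratios. Here is a minimal sketch of the arithmetic, assuming only the figures quoted in the article (the 10-20 minute observed range, and the factors of roughly eight and three by which the UV-B-only theory was too slow for each medium):

```python
# Illustrative arithmetic only, based on figures quoted in the article:
# experiments reported inactivation in roughly 10-20 minutes, while the
# UV-B-only theory implied times ~8x longer for simulated saliva and
# ~3x longer for complete growth medium.

observed_minutes = (10, 20)
theory_slowdown = {"simulated saliva": 8, "complete growth medium": 3}

for medium, factor in theory_slowdown.items():
    lo, hi = observed_minutes
    print(f"{medium}: observed ~{lo}-{hi} min, "
          f"UV-B-only theory implies ~{lo * factor}-{hi * factor} min")
```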
|
Environment
| 2021 |
March 30, 2021
|
https://www.sciencedaily.com/releases/2021/03/210330143059.htm
|
Mysterious living monuments
|
Giant trees in tropical forests, witnesses to centuries of civilization, may be trapped in a dangerous feedback loop, according to a new report.
|
Evan Gora, STRI Tupper postdoctoral fellow, studies the role of lightning in tropical forests. Adriane Esquivel-Muelbert, a lecturer at the University of Birmingham, studies the effects of climate change in the Amazon. The two teamed up to find out what kills big tropical trees. But as they sleuthed through hundreds of papers, they discovered that nearly nothing is known about the biggest trees and how they die, because they are extremely rare in field surveys.

"Big trees are hard to measure," said Esquivel-Muelbert. "They are the pain in a field campaign because we always have to go back with a ladder to climb up to find a place to measure the circumference above the buttresses. It takes a long time. Studies focusing on the reasons trees die don't have enough information for the biggest trees and often end up excluding them from their analysis."

"Because we generally lack the data necessary to tell us what kills trees that are above approximately 50 centimeters in diameter, that leaves out half of the forest biomass in most forests," Gora said. Only about 1% of trees in mature tropical forests make it to this size. Others wait their turn in the shade below.

The other thing that makes tropical forests so special -- high biodiversity -- also makes it difficult to study big trees: there are so many different species, and many of them are extremely rare. "Because only 1-2% of big trees in a forest die every year, researchers need to sample hundreds of individuals of a given species to understand why they are dying," Gora said. "That may involve looking for trees across a huge area." Imagine a study of blood pressure in people who have lived to be 103. One would have to locate and test seniors from cities and towns around the world: a time-consuming, logistically complex and expensive proposition.

A large body of evidence shows that trees are dying faster in tropical forests than ever before. This is affecting the ability of forests to function and, in particular, to capture and store carbon dioxide. "We know the deaths of the largest and oldest trees are more consequential than the deaths of smaller trees," Gora said. "Big trees may be at particular risk because the factors that kill them appear to be increasing more rapidly than the factors that seem to be important for smaller-tree mortality."

In large parts of the tropics, climate change is resulting in more severe storms and more frequent and intense droughts. Because big trees tower above the rest, they may be more likely to be hit by lightning or damaged by wind. Because they have to pull ground water higher than other trees, they are most likely to be affected by drought.

Hoping to better understand what is happening to big trees, Gora and Esquivel-Muelbert identified three glaring knowledge gaps. First, almost nothing is known about disease, insects and other biological causes of death in big trees. Second, because big trees are often left out of analyses, the relationship between cause of death and size is not clear. And, finally, almost all of the detailed studies of big tropical trees are from a few locations like Manaus in Brazil and Barro Colorado Island in Panama.

To understand how big trees die, there is a trade-off between putting effort into measuring large numbers of trees and measuring them often enough to identify the cause of death.
Gora and Esquivel-Muelbert agree that a combination of drone technology and satellite views of the forest will help to find out how these big trees die, but this approach will only work if it is combined with intense, standardized, on-the-ground observations, such as those used by the Smithsonian's international ForestGEO network of study sites.

Esquivel-Muelbert hopes that the impetus for this research will come from a shared appreciation for these mysterious living monuments: "I think they are fascinating to everyone," she said. "When you see one of those giants in the forest, they are so big. My colleague and Amazonian researcher, Carolina Levis, says that they are the monuments we have in the Amazon, where we don't have big pyramids or old buildings. ... That is the feeling, that they have been through so much. They are fascinating, not just in the scientific sense but also in another way. It moves you somehow."
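The sampling problem described above is easy to make concrete. A minimal back-of-the-envelope sketch in Python, using the article's figures (roughly 1% of stems reach big-tree size, and 1-2% of big trees die per year); the target of 100 observed deaths is a hypothetical illustration:

```python
# Back-of-the-envelope sampling arithmetic for big-tree mortality studies.
# From the article: ~1% of trees in mature tropical forests reach ~50 cm
# diameter, and only 1-2% of those big trees die each year. The target of
# 100 observed deaths per year is a hypothetical illustration.

big_tree_fraction = 0.01      # share of stems that are "big" (>= ~50 cm)
annual_mortality = 0.015      # midpoint of the 1-2% annual mortality rate
target_deaths_per_year = 100  # deaths needed for a meaningful analysis

big_trees_needed = target_deaths_per_year / annual_mortality
total_stems_surveyed = big_trees_needed / big_tree_fraction

print(f"Big trees to monitor: ~{big_trees_needed:,.0f}")
print(f"Total stems to survey to find them: ~{total_stems_surveyed:,.0f}")
# -> roughly 6,700 big trees, i.e. on the order of 670,000 stems, which is
#    why studies of big-tree death demand enormous survey areas.
```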
|
Environment
| 2021 |
March 30, 2021
|
https://www.sciencedaily.com/releases/2021/03/210330121259.htm
|
Shining, colored LED lighting on microalgae for next-generation biofuel
|
As ethanol, biodiesel, and other biofuels continue to present challenges, such as competing with food supplies or lacking the technology for more efficient and low-cost production, microalgae are gaining momentum as a biofuel energy crop.
|
In their paper, the researchers focused on the microalga Dunaliella salina. Microalgae tend to accumulate higher amounts of lipids (fatty acids that make up natural oils and waxes) than other biomass feedstocks do, which means a much higher percentage of the organisms can be turned into usable biofuel.

LEDs, as tunable single-color light sources, are already used to optimize plant growth, particularly in greenhouse cultivation. All parts of the visible spectrum are used in photosynthesis, but light also influences plant development. Adding more blue or red light, for instance, affects different plants in different ways. Optimal illumination conditions for microalgae growth and lipid production yield remain unknown.

In their study, the researchers applied red, blue, or combined red-blue illumination to Dunaliella salina cultures. When red and blue lights were simultaneously applied in various ratios, the microalgae showed a major boost in growth and lipid productivity. The optimal 4-to-3 ratio of red and blue light significantly improved lipid productivity by more than 35% and increased dry biomass yield by more than 10% compared to the white light control.

The researchers are planning to analyze the composition of fatty acids synthesized in the algae under the favorable combined lighting for increased lipid production. "Biodiesel performance is dependent on the composition of fatty acids, so we want to determine how the combined monochromatic lights would affect the quality of microalgae biodiesel," author Xiaojian Zhou said.
|
Environment
| 2021 |
March 30, 2021
|
https://www.sciencedaily.com/releases/2021/03/210330121202.htm
|
In the deep sea, the last ice age is not yet over
|
Gas hydrates are solid compounds of gases and water that have an ice-like structure at low temperatures and high pressures. Compounds of methane and water, so-called methane hydrates, are found at many ocean margins -- including in the Black Sea. Beyond their possible use as an energy source, methane hydrate deposits are being investigated for their stability, as they can dissolve with changes in temperature and pressure. Besides releasing methane, this can also have an impact on submarine slope stability.
|
During a six-week expedition with the German research vessel METEOR in autumn 2017, a team from MARUM and GEOMAR investigated a methane hydrate deposit in the deep-sea fan of the Danube in the western Black Sea. During the cruise, which was part of the joint project SUGAR III "Submarine Gas Hydrate Resources" jointly funded by the BMWi and BMBF, the gas hydrate deposits were drilled using the mobile seafloor drilling device MARUM-MeBo200. The results of the investigations have now been published in an international journal.

"Based on data from previous expeditions, we selected two working areas where, on the one hand, methane hydrate and free methane gas coexist in the upper 50 to 150 metres of the hydrate stability zone and, on the other hand, a landslide and gas seeps were found directly at the edge of the gas hydrate stability zone," explains Prof. Dr. Gerhard Bohrmann, expedition leader from MARUM and co-author of the study. "For our investigations we used our drilling device MARUM-MeBo200 and broke all previous depth records with a maximum depth reached of almost 145 metres."

In addition to obtaining samples, the scientists were, for the first time, also able to carry out detailed in situ temperature measurements down to the base of the gas hydrate stability zone under the seabed. Previously, this base was determined using seismic methods, from which the so-called "bottom simulating reflector" (BSR) was obtained as an indicator. "However, our work has now proven for the first time that the approach using the BSR does not work for the Black Sea," explains Dr. Michael Riedel from GEOMAR, lead author of the study. "From our point of view, the gas-hydrate stability boundary has already adjusted to the warmer conditions in the subsurface, but the free methane gas, which is always found at this lower edge, has not yet managed to rise with it," Riedel continues. The reasons for this could be attributed to the low permeability of the sediments, which means the methane gas is still "stuck" down there and can only rise very, very slowly under its own power, according to the scientist.

"However, our new analyses of the seismic data have also shown that in a few places the methane gas can break through the BSR. There, a new BSR is just establishing itself over the 'old' reflector. This is new and has never been seen before," says Dr Matthias Haeckel, co-author of the study from GEOMAR. "Our interpretation is that the gas can rise in these places, as disturbances in the seabed here favour the flow of gas," Haeckel continues.

"In summary, we have found a very dynamic situation in this region, which also appears to be related to the development of the Black Sea since the last ice age," says Michael Riedel. After the last glacial maximum (LGM), the sea level rose (a pressure increase), and when the global sea level rose above the threshold of the Bosporus, salty water from the Mediterranean Sea was able to propagate into the Black Sea. Before that, this ocean basin was basically a freshwater lake. In addition, global warming since the LGM has caused a temperature rise of the bottom water in the Black Sea. The combination of these three factors -- salinity, pressure and temperature -- has had drastic effects on the methane hydrates, which decompose as a result.
The current study exemplifies the complex feedbacks and time scales that climate changes induce in the marine environment, and it is therefore well suited to estimating the expected consequences of today's more rapid global warming -- especially for the Arctic gas hydrate deposits.

Cruise leader Gerhard Bohrmann summarizes: "At the end of the SUGAR-3 programme, the drilling campaign with MeBo200 in the Black Sea showed us once again very clearly how quickly methane hydrate stability in ocean deposits changes with environmental fluctuations."
|
Environment
| 2021 |
March 30, 2021
|
https://www.sciencedaily.com/releases/2021/03/210330081251.htm
|
New oil palm map to inform policy and landscape-level planning
|
IIASA researchers have used Sentinel 1 satellite imagery from the European Space Agency to produce a map of the extent and year of detection of oil palm plantations in Indonesia, Malaysia, and Thailand that will help policymakers and other stakeholders understand trends in oil palm expansion to inform landscape-level planning.
|
The world's appetite for palm oil seems to know no bounds. We use it in everything from beauty products and food to industrial processes and biofuels. This ever-growing demand has caused oil palm production to more than double in the last two decades, a development which has in turn deeply impacted natural forest ecosystems and biodiversity, while also significantly contributing to climate change by releasing carbon from converted forests and peatlands into the atmosphere.

Today, almost 90% of the world's oil palm production takes place in Southeast Asia. While oil palm is known to be the most efficient oil-producing plant globally, yields vary dynamically with plantation stand age, management practices, and location. To understand trends in oil palm plantation expansion and for landscape-level planning, accurate maps are needed. To this end, IIASA researchers have provided an age-specific map of oil palm extent in 2017, built from Sentinel 1 satellite imagery from the European Space Agency, in a new paper.

"We specifically wanted to determine the extent and age of oil palm plantations across Southeast Asia and see if we could use technologies such as Google Earth Engine and data mining algorithms to produce an accurate map of oil palm extent from Sentinel 1 radar data," explains lead author Olga Danylo, a researcher with the IIASA Novel Data Ecosystems For Sustainability Research Group.

While oil palm extent has been mapped before, this paper uses Sentinel 1 satellite data in combination with other data sets to map extent, along with time series from the Landsat archive to derive the year of plantation detection (a proxy for the productive age of the plantations). This additional information is valuable to complement discussions related to oil palm expansion over the last two decades. Specifically, the map can inform palm oil yield calculations based on stand age information: yields increase during the plant's youth phase in the first seven years, reach a plateau during the prime age of 7-15 years, and then slowly start to decline before palms are replaced at the age of 25-30 years. Knowing the exact extent and age of plantations across a landscape is therefore crucial for landscape-level planning to allow for both sustainable oil palm production and forest conservation.

The paper's key output is a 30 m resolution map of Southeast Asia that indicates if oil palm is present and the year of detection of the plantation -- a brand new feature that allows for a better understanding of oil palm expansion in Southeast Asia. The map was validated based on the interpretation of several thousand very high resolution satellite images (on-site ground data was not considered in the study). From this, the researchers estimated that the remotely sensed oil palm extent map has an overall accuracy comparable to similar remotely sensed products. The largest area of oil palm can be found in Sumatra and Kalimantan, with expansions in all major regions since the year 2000.
The maps show that the largest relative expansions over the last decade have taken place in Kalimantan, insular Malaysia, and Thailand, but interestingly, the net oil palm plantation area, excluding milling facilities, roads, and other related infrastructure, might be significantly smaller than previously thought.

According to the researchers, the new map could furthermore support the calculation of estimates of greenhouse gas emissions and removals for specific regions and inform the generation of large-scale yield estimates. It could also be used in analyses related to determining the economic trade-offs in different types of land use. In addition, the oil palm map can inform replanting initiatives that can be crucial in preventing future land conversion while ensuring the stability of supply.

"Tropical deforestation is a complicated issue. Conserving valuable tropical forests is important, but protecting the development rights of countries with tropical forests is equally important. Bridging solutions to deal with such a delicate issue requires detailed and contextualized understanding. If combined with effective land use planning by governments in these countries, our map can, for instance, inform which sites can be replanted to rejuvenate plantations, thus enhancing yields and helping to avoid the need for future land conversion. This is an example of how bridges can be built to resolve oil palm disputes between policymakers in importing countries and oil palm producing countries," concludes coauthor Johannes Pirker, a guest researcher with the Agriculture, Forestry, and Ecosystem Services Research Group at IIASA.

The data set used in this paper is publicly accessible for download from the IIASA DARE repository.
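The stand-age/yield relationship described above lends itself to a simple lookup. A minimal sketch, with the phase boundaries taken from the article; the function name and return labels are illustrative, not from the paper:

```python
# Map an oil palm stand age to its productive phase, using the boundaries
# quoted in the article: yields rise during the first ~7 years, plateau at
# ages 7-15, then slowly decline until replanting at 25-30 years.
# Function name and labels are illustrative, not from the paper.

def oil_palm_phase(stand_age_years: float) -> str:
    if stand_age_years < 0:
        raise ValueError("stand age must be non-negative")
    if stand_age_years < 7:
        return "youth: yields increasing"
    if stand_age_years <= 15:
        return "prime: yields at plateau"
    if stand_age_years <= 30:
        return "declining: approaching replanting age"
    return "beyond typical replanting age (25-30 years)"

# Example: a plantation first detected in 2005 is ~12 years old in the 2017 map.
print(oil_palm_phase(2017 - 2005))  # -> prime: yields at plateau
```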
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329140751.htm
|
How coastal forests are managed can impact water cycle
|
Younger trees take up and release less water than mature trees 10 years or older, researchers from North Carolina State University found in a new study that tracked how water moves through wetland pine forests near the North Carolina coast.
|
"The water balance, especially in coastal sites, is very important," said the study's lead author Maricar Aguilos, a postdoctoral research associate in forestry and environmental resources at NC State. "We have so much water there. We wanted to understand how land-use changes impact water use and drainage in the forests, as well as how they affect the growth of the trees."

The findings come from a long-term research project designed to understand how wetland forests in eastern North Carolina -- including pine forests managed for timber and a natural hardwood forest at the Alligator River National Wildlife Refuge in Dare County -- are responding to changing climate conditions.

Using meteorological sensors perched on towers above the forest canopy, the researchers are able to track water flow to and from the site, including during a severe drought in 2007-2008. They've also used the sensors to track carbon sequestration -- an important marker for the forests' ability to mitigate or contribute to climate change. They have gathered data on forest carbon and water cycling spanning 14 years.

"In order to study the response of coastal ecosystems to climate change and sea-level rise, we need long-term observations," said study co-author John King, professor of forestry and environmental resources at NC State. "The longer we can let those studies run, the better our data will be, and the more effectively we can help inform policy."

The latest study evaluated how much water the trees use and release as vapor, compared to how much is lost as drainage. The researchers found that younger pine plantations had increasingly higher "evapotranspiration," which is the amount of water released in combination from two sources: evaporation of water from the soil, and the process in which trees consume water and release it from their leaves as vapor, known as "transpiration." Mature plantations had the highest ratio of evapotranspiration to rainfall, and drained less water than younger pine forests.

"We found that the trees use more water as they mature," said study co-author Ge Sun, a research hydrologist and project leader at the U.S. Department of Agriculture Forest Service and adjunct professor in forestry and environmental resources at NC State. "Water use stabilized by about year 10 in the pine forests."

That finding suggests clear-cutting a site and replanting it could lead to increased drainage and flooding off the site initially, but the impacts would diminish as the trees grow. "The mature plantations help to mitigate effects of forest harvesting on drainage at a landscape scale," Aguilos said. "If you harvest to leave trees of different ages, they can help each other."
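The accounting in the study boils down to a simple annual water balance. A minimal sketch, with hypothetical rainfall and evapotranspiration values chosen only to illustrate the young-versus-mature contrast described above (the numbers are not measurements from the study):

```python
# Simple annual water balance: drainage is what remains of rainfall after
# evapotranspiration (ET = soil evaporation + transpiration). Values below
# are hypothetical illustrations; only the qualitative pattern (mature
# stands have a higher ET/rainfall ratio and less drainage) is from the
# article.

sites = {
    # site: (annual rainfall in mm, annual evapotranspiration in mm)
    "young pine plantation":  (1300, 700),
    "mature pine plantation": (1300, 1000),
}

for site, (rainfall, et) in sites.items():
    drainage = rainfall - et
    print(f"{site}: ET/rainfall = {et / rainfall:.2f}, "
          f"drainage = {drainage} mm/yr")
```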
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329122921.htm
|
Tires turned into graphene that makes stronger concrete
|
This could be where the rubber truly hits the road.
|
Rice University scientists have optimized a process to convert waste from rubber tires into graphene that can, in turn, be used to strengthen concrete. The environmental benefits of adding graphene to concrete are clear, chemist James Tour said. "Concrete is the most-produced material in the world, and simply making it produces as much as 9% of the world's carbon dioxide emissions," Tour said. "If we can use less concrete in our roads, buildings and bridges, we can eliminate some of the emissions at the very start."

Recycled tire waste is already used as a component of Portland cement, but graphene has been proven to strengthen cementitious materials, concrete among them, at the molecular level. While the majority of the 800 million tires discarded annually are burned for fuel or ground up for other applications, 16% of them wind up in landfills. "Reclaiming even a fraction of those as graphene will keep millions of tires from reaching landfills," Tour said.

The "flash" process introduced by Tour and his colleagues in 2020 has been used to convert food waste, plastic and other carbon sources by exposing them to a jolt of electricity that removes everything but carbon atoms from the sample. Those atoms reassemble into valuable turbostratic graphene, which has misaligned layers that are more soluble than graphene produced via exfoliation from graphite. That makes it easier to use in composite materials.

Rubber proved more challenging than food or plastic to turn into graphene, but the lab optimized the process by using commercial pyrolyzed waste rubber from tires. After useful oils are extracted from waste tires, this carbon residue has until now had near-zero value, Tour said. Tire-derived carbon black or a blend of shredded rubber tires and commercial carbon black can be flashed into graphene. Because turbostratic graphene is soluble, it can easily be added to cement to make more environmentally friendly concrete. The research, led by Tour and Rouzbeh Shahsavari of C-Crete Technologies, is detailed in a journal paper.

The Rice lab flashed tire-derived carbon black and found about 70% of the material converted to graphene. When flashing shredded rubber tires mixed with plain carbon black to add conductivity, about 47% converted to graphene. Elements besides carbon were vented out for other uses. The electrical pulses lasted between 300 milliseconds and 1 second. The lab calculated that the electricity used in the conversion process would cost about $100 per ton of starting carbon.

The researchers blended minute amounts of tire-derived graphene -- 0.1 weight percent (wt%) for tire carbon black and 0.05 wt% for carbon black and shredded tires -- with Portland cement and used it to produce concrete cylinders. Tested after curing for seven days, the cylinders showed gains of 30% or more in compressive strength. After 28 days, 0.1 wt% of graphene sufficed to give both products a strength gain of at least 30%. "This increase in strength is in part due to a seeding effect of 2D graphene for better growth of cement hydrate products, and in part due to a reinforcing effect at later stages," Shahsavari said.
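A quick calculation shows how far a ton of feedstock goes at the reported conversion yield and dosing level. A minimal sketch; the percentages and cost figure come from the article, while the one-ton feedstock basis is a hypothetical input:

```python
# Illustrative arithmetic from the reported figures: ~70% graphene yield
# from tire-derived carbon black, ~$100 of electricity per ton of starting
# carbon, and a 0.1 wt% graphene dose in Portland cement. The 1-ton
# feedstock basis is a hypothetical input.

feedstock_tons = 1.0              # tire-derived carbon black
conversion_yield = 0.70           # fraction converted to graphene
electricity_cost_per_ton = 100.0  # USD per ton of starting carbon
dose_wt_percent = 0.1             # graphene dose in cement, by weight

graphene_tons = feedstock_tons * conversion_yield
cement_treatable_tons = graphene_tons / (dose_wt_percent / 100)

print(f"Graphene produced: {graphene_tons:.2f} t "
      f"(electricity ~${feedstock_tons * electricity_cost_per_ton:.0f})")
print(f"Cement treatable at 0.1 wt%: ~{cement_treatable_tons:,.0f} t")
# -> one ton of carbon black yields ~0.7 t of graphene, enough to dose
#    roughly 700 t of cement.
```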
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329122918.htm
|
Coastal lupine faces specific extinction threat from climate change
|
Climate change is altering the world we share with all living things. But it's surprisingly difficult to single out climate change as an extinction threat for any one particular species protected under the Endangered Species Act.
|
To date, the U.S. Fish and Wildlife Service has only formally considered impacts from climate change in listing actions for four animal species and one alpine tree. But the effects of climate change extend to temperate climates as well, as a new analysis of population data shows.

Biologists including Eleanor Pardini at Washington University in St. Louis have tracked all of the known stands of Tidestrom's lupine at Point Reyes in coastal California. If average temperatures increase by one degree Celsius (1° C, or about 1.8 degrees Fahrenheit) -- a conservative assumption -- the scientists project that 90% of individual lupine plants could be lost in the next 30 years.

"In general, it is fairly difficult to conclusively say that climate change is a species threat," said Pardini, assistant director of environmental studies at Washington University and senior lecturer in Arts & Sciences. Modeling the threat of climate change requires long-term population data, which is difficult to collect and thus not available for most species. "We were able to perform this analysis and show climate change is an important additional threat factor for this species because we have spent considerable effort collecting a long-term dataset," Pardini said.

To date, regulators have considered climate change in their listing actions only for four animal species: the polar bear, American pika, American wolverine and Gunnison sage-grouse. Tidestrom's lupine is different, and not just because it's a plant. It's from a more seasonally mild coastal area -- not someplace that one might think would be rocked by a few degrees of rising temperatures. The animals that have been previously considered all occur in arctic, alpine or arid regions.

The delicate, purple-flowering lupine is part of a dune ecosystem along the west coast of the United States that is highly disturbed. In many of these coastal places, people have planted exotic plants to be able to develop and farm closer to the beach. Over time, exotic plants have over-stabilized dunes, disrupting wind and sand movement and harming plants and animals.

For the new study, Pardini worked with Tiffany Knight and Aldo Compagnoni, both of Martin Luther University Halle-Wittenberg and the German Centre for Integrative Biodiversity Research (iDiv) in Germany. Pardini and Knight have been tracking populations of Tidestrom's lupine at Point Reyes since 2005. Compagnoni joined the team as an expert in demographic modeling incorporating climate data. The scientists produced population trajectories for all populations of the species at Point Reyes for the next 30 years.

"Using 14 years of demographic data from 2005 to 2018 and model selection, we found that survival and fertility measures responded negatively to temperature anomalies," said Compagnoni, first author of the new study. "We then produced forecasts based on stochastic individual-based population models that account for uncertainty in demographic outcomes."

If temperatures remain at the 1990-2018 average levels, the scientists expect that the number of individual lupine plants would double over the next 30 years.
However, with a 1° C increase in temperature, the number of plants will instead drop off dramatically, with an expected 90% reduction in the number of individual plants. This scenario is conservative, as temperature increases even more dramatic than 1° C are projected for this region of California in the next 30 years.

"Despite large uncertainties, we predict that all populations will decline if temperatures increase by 1° Celsius," Compagnoni said. "Considering the total number of individuals across all seven populations, the most likely outcome is a population decline of 90%. Moreover, we predict local extinction is certain for one of our seven populations."

"Our species has a range so small that its response to climate cannot be inferred from its geographic distribution," Pardini said. "In these cases, long-term data collection becomes an important alternative option to assess the climatic vulnerability of a species."

Some rare species that are endemic to coastal habitats are currently protected by the Endangered Species Act and by various state listings. Many Tidestrom's lupine populations are protected against development because they occur in a national park or state parks. However, Knight expressed general concern about proposed new regulations that would allow coastal habitats to be excluded in the future because they are economically valuable to developers. Coastal plant communities provide a wide variety of valuable ecosystem services, such as mitigating the effects of coastal erosion and flooding.
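As an illustration of the style of forecast quoted above, here is a minimal stochastic projection sketch. The growth rates, their spread, and the starting population are hypothetical placeholders tuned only to reproduce the article's headline numbers; the actual study fits individual-based models to 14 years of demographic data:

```python
# Minimal stochastic population projection, illustrating the kind of
# forecast described above. All parameters are hypothetical placeholders,
# chosen so the baseline roughly doubles in 30 years and the +1 C scenario
# loses roughly 90%, matching the article's summary.
import math
import random

def project(n0: float, mean_log_growth: float, sd: float,
            years: int = 30, reps: int = 1000) -> float:
    """Median population size after `years` of lognormal annual growth."""
    finals = []
    for _ in range(reps):
        n = n0
        for _ in range(years):
            n *= math.exp(random.gauss(mean_log_growth, sd))
        finals.append(n)
    finals.sort()
    return finals[reps // 2]

random.seed(42)
n0 = 1000.0  # hypothetical starting count
print(f"stable climate: median {project(n0,  0.023, 0.10):.0f} plants")
print(f"+1 C anomalies: median {project(n0, -0.077, 0.10):.0f} plants")
```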
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329094852.htm
|
Mapping policy for how the EU can reduce its impact on tropical deforestation
|
EU imports of products including palm oil, soybeans, and beef contribute significantly to deforestation in other parts of the world. In a new study, researchers from Chalmers University of Technology, Sweden, and the University of Louvain, Belgium, evaluated over a thousand policy proposals for how the EU could reduce this impact, to assess which would have the largest potential to reduce deforestation -- while also being politically feasible.
|
"Unsurprisingly, there is weaker support for tougher regulations, such as import restrictions on certain goods. But our study shows that there is broad support in general, including for certain policies that have real potential to reduce imported deforestation," says Martin Persson, Associate Professor of Physical Resource Theory at Chalmers University of Technology.Previous research from Chalmers University of Technology has already shown the EU's great impact in this area. More than half of tropical deforestation is linked to production of food and animal feed, such as palm oil, soybeans, wood products, cocoa and coffee -- goods which the EU imports in vast quantities. The question is, what can the EU do to reduce its contribution to deforestation?"This issue is particularly interesting now, as this year the EU is planning to present legislative proposals for reducing deforestation caused by European consumption. The question has been discussed by the EU since 2008, but now something political is actually happening," says Simon Bager, a doctoral student at the Université Catholique de Louvain, Belgium, and lead author of the study.The authors of the article mapped 1141 different proposals, originating from open consultations and workshops, where the EU has collected ideas from companies, interest groups and think tanks. The researchers also compiled proposals from a large number of research reports, policy briefs and other publications, where different stakeholders have put forward various policy proposals. After grouping together similar proposals, they arrived at 86 unique suggestions.Finding proposals for measures that would have the desired effect but are also possible to implement in practice, and enjoy the necessary political support, is no easy task. But after their extensive survey, the researchers identify two policy options in particular which show promise. The first is to make importers of produce responsible for any deforestation in their supply chains, by requiring them to carry out the requisite due diligence."If the importing companies' suppliers have products that contribute to deforestation, the company may be held responsible for this. We consider such a system to be credible and possible to implement both politically and practically -- there are already examples from France and England where similar systems have been implemented or are in the process thereof," says Simon Bager."Due diligence is also the measure which is most common in our survey, put forward by many different types of actors, and there is broad support for this proposal. However, it is important to emphasise that for such a system to have an impact on deforestation, it must be carefully designed, including which companies are affected by the requirements, and which sanctions and liability options exist."The other possibility is to support multi-stakeholder forums, where companies, civil society organisations, and politicians come together to agree on possible measures for ridding a supply-chain, commodity, or area, of deforestation. There are positive examples here too, the most notable being the Amazon Soy Moratorium from 2006, when actors including Greenpeace and the World Wide Fund for Nature gathered with soy producers and exporters and agreed to end soy exports from deforested areas in the Amazon rainforest."Examples such as these demonstrate the effect that multi-stakeholder forums can have. 
And in our opinion, it is a measure that is easier to get acceptance for, because it is an opportunity for the affected parties to be directly involved in helping design the measures themselves," says Martin Persson. Such discussions can also be adapted to the relevant areas or regions, increasing the likelihood of local support for the initiatives.

The researchers also investigated how to deal with the trade-off between policy impacts and feasibility. An important part of this is combining different complementary measures. Trade regulations on their own, for example, risk hitting poorer producing countries harder, and should therefore be combined with targeted aid to help introduce more sustainable production methods, increasing yields without having to resort to deforestation. This would also reduce the risk of goods that are produced on deforested land simply being sold in markets other than the EU.

"If the EU now focuses on its contribution to deforestation, the effect may be that what is produced on newly deforested land is sold to other countries, while the EU gets the 'good' products. Therefore, our assessment is that the EU should ensure that the measures introduced are combined with those which contribute to an overall transition to sustainable land use in producing countries," says Simon Bager.

In conclusion, the researchers summarise three essential principles needed for new measures, if the EU is serious about reducing its impact on tropical deforestation. "First, enact measures that actually are able to bring about change. Second, use a range of measures, combining different tools and instruments to contribute to reduced deforestation. Finally, ensure the direct involvement of supply chain actors within particularly important regions, expanding and broadening the measures over time," concludes Simon Bager.

The authors hope that the research and identified policy options can serve as inspiration for policy makers, NGOs, industries, and other stakeholders working to address the EU's deforestation footprint. With at least 86 unique alternatives, there is a wide range of opportunities to focus on the problem -- very few of these are political 'non-starters' or proposals which would have no effect on the issue.
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329085936.htm
|
Forests on caffeine: Coffee waste can boost forest recovery
|
A new study finds that coffee pulp, a waste product of coffee production, can be used to speed up tropical forest recovery on post-agricultural land. The findings are published in a British Ecological Society journal.
|
In the study, researchers from ETH-Zurich and the University of Hawai`i spread 30 dump truck loads of coffee pulp on a 35 × 40 m area of degraded land in Costa Rica and marked out a similar sized area without coffee pulp as a control.

"The results were dramatic," said Dr Rebecca Cole, lead author of the study. "The area treated with a thick layer of coffee pulp turned into a small forest in only two years while the control plot remained dominated by non-native pasture grasses."

After only two years the coffee pulp treated area had 80% canopy cover compared to 20% in the control area. The canopy in the coffee pulp area was also four times taller than that of the control area. The addition of the half metre thick layer of coffee pulp eliminated the invasive pasture grasses which dominated the land. These grasses are often a barrier to forest succession, and their removal allowed native, pioneer tree species, that arrived as seeds through wind and animal dispersal, to recolonise the area quickly.

The researchers also found that after two years, nutrients including carbon, nitrogen and phosphorus were significantly elevated in the coffee pulp treated area compared to the control. This is a promising finding given that former tropical agricultural land is often highly degraded and poor soil quality can delay forest succession for decades.

Dr Cole said: "This case study suggests that agricultural by-products can be used to speed up forest recovery on degraded tropical lands. In situations where processing these by-products incurs a cost to agricultural industries, using them for restoration to meet global reforestation objectives can represent a 'win-win' scenario."

As a widely available waste product that's high in nutrients, coffee pulp can be a cost-effective forest restoration strategy. Such strategies will be important if we are to achieve ambitious global objectives to restore large areas of forest, such as those agreed in the 2015 Paris Accords.

The study was conducted in Coto Brus county in southern Costa Rica, on a former coffee farm that is being restored to forest for conservation. In the 1950s the region underwent rapid deforestation and land conversion to coffee agriculture and pasture, with forest cover reduced to 25% by 2014. In 2018, the researchers set out two areas of roughly 35 × 40 m, spreading coffee pulp into a half metre-thick layer on one area and leaving the other as a control.

The researchers analysed soil samples for nutrients immediately prior to the application of the coffee pulp and again two years later. They also recorded the species present, the size of woody stems and the percentage of forest ground cover, and used drones to record canopy cover.

Dr Cole warns that as a case study with two years of data, further research is needed to test the use of coffee pulp to aid forest restoration. "This study was done at only one large site so more testing is needed to see if this strategy works across a broader range of conditions. The measurements we share are only from the first two years. Longer-term monitoring would show how the coffee pulp affected soil and vegetation over time.
Additional testing can also assess whether there are any undesirable effects from the coffee pulp application."

A limitation of using coffee pulp or other agricultural by-products is that its use is mostly limited to relatively flat and accessible areas where the material can be delivered, and where the risk of the added nutrients being washed into nearby watersheds can be managed.

On further research into the use of coffee pulp, Dr Cole said: "We would like to scale up the study by testing this method across a variety of degraded sites in the landscape. Also, this concept could be tested with other types of agricultural non-market products like orange husks."

"We hope our study is a jumping off point for other researchers and industries to take a look at how they might make their production more efficient by creating links to the global restoration movement."
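The quantities involved are easy to sanity-check from the figures above. A minimal sketch of the arithmetic, with all inputs taken from the article:

```python
# Sanity-check of the application described above, using the article's
# figures: a half-metre-thick layer of coffee pulp over a 35 x 40 m plot,
# delivered by 30 dump truck loads.

plot_area_m2 = 35 * 40      # treated area
layer_thickness_m = 0.5     # half-metre-thick pulp layer
truckloads = 30

pulp_volume_m3 = plot_area_m2 * layer_thickness_m
per_truck_m3 = pulp_volume_m3 / truckloads

print(f"Coffee pulp applied: {pulp_volume_m3:.0f} m^3")
print(f"Implied volume per truckload: ~{per_truck_m3:.1f} m^3")
# -> ~700 m^3 in total, roughly 23 m^3 per truck, consistent with a large
#    dump truck.
```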
|
Environment
| 2021 |
March 29, 2021
|
https://www.sciencedaily.com/releases/2021/03/210329094839.htm
|
Genomic secrets of organisms that thrive in extreme deep-sea
|
A study led by scientists at Hong Kong Baptist University (HKBU) has decoded the genomes of the deep-sea clam (Archivesica marissinica) and the chemoautotrophic bacteria (Candidatus Vesicomyosocius marissinica) that live in its gill epithelium cells. Through analysis of their genomic structures and profiling of their gene expression patterns, the research team revealed that symbiosis between the two partners enables the clams to thrive in extreme deep-sea environments.
|
The research findings have been published in an academic journal.

Due to the general lack of photosynthesis-derived organic matter, the deep sea was once considered a vast "desert" with very little biomass. Yet clams often form large populations in the high-temperature hydrothermal vents and freezing cold seeps in the deep oceans around the globe, where sunlight cannot penetrate but toxic molecules, such as hydrogen sulfide, are available below the seabed. The clams are known to have a reduced gut and digestive system, and they rely on endosymbiotic bacteria to generate energy in a process called chemosynthesis. However, when this symbiotic relationship developed, and how the clams and chemoautotrophic bacteria interact, remain largely unclear.

A research team led by Professor Qiu Jianwen, Associate Head and Professor of the Department of Biology at HKBU, collected the clam specimens at 1,360 metres below sea level from a cold seep in the South China Sea. The genomes of the clam and its symbiotic bacteria were then sequenced to shed light on the genomic signatures of their successful symbiotic relationship.

The team found that the ancestor of the clam split with its shallow-water relatives 128 million years ago, when dinosaurs roamed the earth. The study revealed that 28 genes have been transferred from the ancestral chemoautotrophic bacteria to the clam, the first discovery of horizontal gene transfer -- a process that transmits genetic material between distantly related organisms -- from bacteria to a bivalve mollusc.

The following genomic features of the clam were discovered; combined, they have enabled it to adapt to the extreme deep-sea environment.

The clam relies on its symbiotic chemoautotrophic bacteria to produce the biological materials essential for its survival. In their symbiotic relationship, the clam absorbs hydrogen sulfide from the sediment, and oxygen and carbon dioxide from seawater, and it transfers them to the bacteria living in its gill epithelium cells to produce the energy and nutrients in a process called chemosynthesis. The process is illustrated in Figure 1.

The research team also discovered that the clam's genome exhibits gene family expansion in cellular processes such as respiration and diffusion that likely facilitate chemoautotrophy, including gas delivery to support energy and carbon production, the transfer of small molecules and proteins within the symbiont, and the regulation of the endosymbiont population. This helps the host obtain sufficient nutrients from the symbiotic bacteria.

Cellulase is an enzyme that facilitates the decomposition of the cellulose found in phytoplankton, a major primary food source in the marine food chain. It was discovered that the clam's cellulase genes have undergone significant contraction, which is likely an adaptation to the shift from phytoplankton-derived to bacteria-based food.

The genome of the symbiont also holds the secrets of this mutually beneficial relationship. The team discovered that the symbiont has a reduced genome, as it is only about 40% of the size of its free-living relatives.
Nevertheless, the symbiont genome encodes complete and flexible sulfur metabolic pathways, and it retains the ability to synthesise 20 common amino acids and other essential nutrients, highlighting the importance of the symbiont in generating energy and providing nutrients to support the symbiotic relationship.

Unlike in vertebrates, haemoglobin, a metalloprotein found in the blood and tissues of many organisms, is not commonly used as an oxygen carrier in molluscs. However, the team discovered several kinds of highly expressed haemoglobin genes in the clam, suggesting an improvement in its oxygen-binding capacity, which can enhance the ability of the clam to survive in deep-sea low-oxygen habitats.

Professor Qiu said: "Most of the previous studies on deep-sea symbiosis have focused only on the bacteria. This first coupled clam-symbiont genome assembly will facilitate comparative studies that aim to elucidate the diversity and evolutionary mechanisms of symbiosis, which allows many invertebrates to thrive in 'extreme' deep-sea ecosystems."

The research was jointly conducted by scientists from HKBU and the HKBU Institute for Research and Continuing Education, the Hong Kong Branch of the Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou), The Hong Kong University of Science and Technology, City University of Hong Kong, the Japan Agency for Marine-Earth Science and Technology, the Sanya Institute of Deep-Sea Science and Engineering, and the Guangzhou Marine Geological Survey.
|
Environment
| 2021 |
March 26, 2021
|
https://www.sciencedaily.com/releases/2021/03/210326151336.htm
|
Controlling bubble formation on electrodes
|
Using electricity to split water into hydrogen and oxygen can be an effective way to produce clean-burning hydrogen fuel, with further benefits if that electricity is generated from renewable energy sources. But as water-splitting technologies improve, often using porous electrode materials to provide greater surface areas for electrochemical reactions, their efficiency is often limited by the formation of bubbles that can block or clog the reactive surfaces.
|
Now, a study at MIT has for the first time analyzed and quantified how bubbles form on these porous electrodes. The researchers found that there are three different ways bubbles can form on and depart from the surface, and that these can be precisely controlled by adjusting the composition and surface treatment of the electrodes. The findings could apply to a variety of other electrochemical reactions as well, including those used for the conversion of carbon dioxide captured from power plant emissions or air to form fuel or chemical feedstocks. The work is described in a new journal paper.

"Water-splitting is basically a way to generate hydrogen out of electricity, and it can be used for mitigating the fluctuations of the energy supply from renewable sources," says Iwata, the paper's lead author. That application was what motivated the team to study the limitations on that process and how they could be controlled. Because the reaction constantly produces gas within a liquid medium, the gas forms bubbles that can temporarily block the active electrode surface. "Control of the bubbles is a key to realizing a high system performance," Iwata says. But little study had been done on the kinds of porous electrodes that are increasingly being studied for use in such systems.

The team identified three different ways that bubbles can form and release from the surface. In one, dubbed internal growth and departure, the bubbles are tiny relative to the size of the pores in the electrode. In that case, bubbles float away freely and the surface remains relatively clear, promoting the reaction process. In another regime, the bubbles are larger than the pores, so they tend to get stuck and clog the openings, significantly curtailing the reaction. And in a third, intermediate regime, called wicking, the bubbles are of medium size and are still partly blocked, but manage to seep out through capillary action.

The team found that the crucial variable in determining which of these regimes takes place is the wettability of the porous surface. This quality, which determines whether water spreads out evenly across the surface or beads up into droplets, can be controlled by adjusting the coating applied to the surface. The team used a polymer called PTFE, and the more of it they sputtered onto the electrode surface, the more hydrophobic it became. It also became more resistant to blockage by larger bubbles.

The transition is quite abrupt, Zhang says, so even a small change in wettability, brought about by a small change in the surface coating's coverage, can dramatically alter the system's performance. Through this finding, he says, "we've added a new design parameter, which is the ratio of the bubble departure diameter [the size it reaches before separating from the surface] and the pore size. This is a new indicator for the effectiveness of a porous electrode."

Pore size can be controlled through the way the porous electrodes are made, and the wettability can be controlled precisely through the added coating. So, "by manipulating these two effects, in the future we can precisely control these design parameters to ensure that the porous medium is operated under the optimal conditions," Zhang says.
This will provide materials designers with a set of parameters to help guide their selection of chemical compounds, manufacturing methods and surface treatments or coatings in order to provide the best performance for a specific application.

While the group's experiments focused on the water-splitting process, the results should be applicable to virtually any gas-evolving electrochemical reaction, the team says, including reactions used to electrochemically convert captured carbon dioxide, for example from power plant emissions.

Gallant, an associate professor of mechanical engineering at MIT, says that "what's really exciting is that as the technology of water splitting continues to develop, the field's focus is expanding beyond designing catalyst materials to engineering mass transport, to the point where this technology is poised to be able to scale." While it's still not at the mass-market commercializable stage, she says, "they're getting there. And now that we're starting to really push the limits of gas evolution rates with good catalysts, we can't ignore the bubbles that are being evolved anymore, which is a good sign."

The MIT team also included Kyle Wilke, Shuai Gong, and Mingfu He. The work was supported by Toyota Central R&D Labs, the Singapore-MIT Alliance for Research and Technology (SMART), the U.S.-Egypt Science and Technology Joint Fund, and the Natural Science Foundation of China.
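The design parameter quoted above, the ratio of bubble departure diameter to pore size, suggests a simple classification. A minimal sketch; the three regime names come from the study, while the numeric thresholds are illustrative assumptions, since the paper's exact boundary values are not given here:

```python
# Classify the bubble behavior of a porous electrode by the ratio of bubble
# departure diameter to pore size -- the design parameter described above.
# Regime names follow the article; the threshold values are illustrative
# assumptions, not the paper's fitted boundaries.

def bubble_regime(departure_diameter_um: float, pore_size_um: float) -> str:
    ratio = departure_diameter_um / pore_size_um
    if ratio < 1.0:
        # Bubbles smaller than the pores float away freely.
        return "internal growth and departure (surface stays clear)"
    if ratio < 3.0:  # assumed upper bound for the intermediate regime
        # Medium bubbles are partly blocked but escape by capillary action.
        return "wicking (partial blockage, capillary escape)"
    # Large bubbles get stuck and clog the openings.
    return "clogging (reaction significantly curtailed)"

for d in (20.0, 80.0, 300.0):  # hypothetical departure diameters, microns
    print(f"departure {d:5.0f} um, pore 50 um -> {bubble_regime(d, 50.0)}")
```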
|
Environment
| 2021 |
March 26, 2021
|
https://www.sciencedaily.com/releases/2021/03/210326122732.htm
|
Study exposes global ripple effects of regional water scarcity
|
Water scarcity is often understood as a problem for regions experiencing drought, but a new study from Cornell and Tufts universities finds that not only can localized water shortages impact the global economy, but changes in global demand send positive and negative ripple effects to water basins across the globe.
|
"We are looking at water scarcity as a globally connected and multi-sector phenomenon," said Jonathan Lamontagne, assistant professor of civil and environmental engineering at Tufts University, who co-authored the study with Patrick Reed, the Joseph C. Ford Professor of Civil and Environmental Engineering at Cornell. Tufts graduate student Flannery Dolan is lead author of the study, which suggests water scarcity dynamics are more complicated than traditionally acknowledged.The study, "Evaluating the economic impact of water scarcity in a changing world," was published March 26 in The researchers coupled physical and economic models to simulate thousands of potential climate futures for 235 major river basins -- a technique known as scenario discovery -- to better understand how water scarcity is a globally-connected phenomenon, with local conditions having reverberations across the globe in industries such as agriculture, energy, transportation and manufacturing.The research found that global trade dynamics and market adaptations to regional water scarcity result in positive and negative economic outcomes for every regional river basin considered in the study.For instance, in the lower Colorado River basin, the worst economic outcomes arise from limited groundwater availability and high population growth, but that high population growth can also prove beneficial under some climatic scenarios. In contrast, the future economic outcomes in the Indus Basin depend largely on global land-use policies."What is happening elsewhere in the world through differences in regional choices related to energy transitions -- how land is being managed as well as different regional water demands and adaptive choices -- can shape relative advantages and disadvantages of water intensive economic activities," said Reed.Restrictions in water availability usually lead to a negative regional economic impact, but the research revealed that some regions can experience a positive economic impact if they hold an advantage over other water basins and can become a virtual exporter of water. The Orinoco basin in Venezuela, for example, usually has a reliable supply of water and is often in a relative position that can benefit when other regions are under stress, according to the researchers.The study also found that small differences in projections for future climate conditions can yield very large differences in the economic outcomes for water scarcity."Human activities and market responses can strongly amplify the economic effects of water scarcity, but the conditions that lead to this amplification vary widely from one basin to the next," said Lamontagne.A river basin can be considered economically robust if it is able to adapt to drought with alternative sources of water or adjust economic activity to limit usage. If a basin is unable to adapt its supply options and if prolonged water scarcity leads to persistent economic decline, then the researchers describe the loss in water basin adaptive capacity as having reached an 'economic tipping point.'For example, in the Indus region in South Asia, the water supply is under stress due to heavy agricultural use and irrigation leading to unsustainable consumption of groundwater, which places it close to the tipping point.The conditions that lead to these tipping points are highly variable from basin to basin, depending on a combination of local factors and global conditions. 
In the Arabian Peninsula, low groundwater availability and pricing of carbon emissions are key factors. In the lower Colorado River basin, a mixture of low groundwater availability, low agricultural productivity, and strong economic demands from the U.S. and Europe leads to tipping points.
"It is noteworthy that the lower Colorado River basin has some of the most uncertain and widely divergent economic outcomes of water scarcity of the basins analyzed in this study," said Reed. "This implies that assumed differences in regional, national and global human system conditions, as well as the intensity of climate change, can dramatically amplify the uncertainty in the basin's outcomes."
As climate change makes the physical and economic effects of water scarcity more challenging for policymakers to understand, the researchers hope their work will provide the basis for similar analyses and draw attention to the importance of expanded data collection to improve modeling and decision making.
The study was co-authored by researchers from the Joint Global Change Research Institute at the Pacific Northwest National Laboratory and was supported by the U.S. Department of Energy's Office of Science.
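The scenario-discovery technique at the heart of the study lends itself to a compact illustration. The sketch below is a toy version only: it samples three uncertain drivers named in the article (groundwater availability, population growth and carbon pricing), applies an invented stress function in place of the study's coupled physical-economic models, and fits a shallow decision tree to recover interpretable rules for which simulated futures cross a tipping point.
```python
# Toy sketch of scenario discovery. The stress function and thresholds
# are invented for illustration; they are not the study's models.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 5000  # number of simulated futures

groundwater = rng.uniform(0.0, 1.0, n)     # 0 = scarce, 1 = abundant
pop_growth = rng.uniform(0.0, 0.03, n)     # annual population growth rate
carbon_price = rng.uniform(0.0, 200.0, n)  # $/tCO2, a global policy driver

# Hypothetical basin stress: scarce groundwater plus growing demand,
# modulated by the cost of energy for pumping and desalination.
stress = (1 - groundwater) * (1 + 50 * pop_growth) + carbon_price / 400
tipping = stress > 1.2  # label futures that reach a "tipping point"

# Learn which combinations of drivers separate tipping-point futures
# from the rest -- the interpretable core of scenario discovery.
X = np.column_stack([groundwater, pop_growth, carbon_price])
tree = DecisionTreeClassifier(max_depth=3).fit(X, tipping)
print(export_text(tree, feature_names=["groundwater", "pop_growth", "carbon_price"]))
```
In the actual study the outcome labels come from simulated basin economics rather than a formula, but the discovery step -- classifying which uncertain factors drive bad outcomes -- has this general shape.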
|
Environment
| 2,021 |
March 26, 2021
|
https://www.sciencedaily.com/releases/2021/03/210326122730.htm
|
The persistent danger after landscape fires
|
Reactive oxygen species (ROS) cause oxidative stress at the cellular level. Research shows that, among other effects, they inhibit the germination capacity of plants, produce cytotoxins and exert toxic effects on aquatic invertebrates. Environmentally persistent free radicals (EPFR) are potential precursors of ROS because they can react with water to form these radical species. "Therefore, EPFR are associated with harmful effects on the ecosystem and human health," explains Gabriel Sigmund, the lead investigator of the study.
|
"Our study shows that these environmentally persistent free radicals can be found in large quantities and over a long period of time in fire derived charcoal," reports Sigmund, environmental geoscientist at the Center for Microbiology and Environmental Systems Science (CMESS) at the University of Vienna. In all 60 charcoal samples from ten different fires, the researchers detected EPFR in concentrations that exceeded those typically found in soils by as much as ten to a thousand times. Other than expected, this concentration remained stable for at least five years, as an analysis of charcoal samples showed which were gathered at the same location and over several years after a forest fire. "The more stable the environmentally persistent free radicals are, the more likely it is that they will have an impact on ecosystems over longer periods of time," explains Thilo Hofmann, co-author of the study and head of the research group.The researchers collected charcoal samples from fires of diverse intensity in boreal, temperate, subtropical, and tropical climates. They considered forest, shrubland and grassland fires and, thus, also different fuel materials (woods and grasses). The original material and the charring conditions determine the degree of carbonization. Consequently, both indirectly influence the extent to which EPFR are formed and how persistent they are. "The analyses show that the concentration of environmentally persistent free radicals increased with the degree of carbonization," Sigmund reports. Woody fuels favored higher concentrations. For these, the researchers were also able to demonstrate the stability of EPFR over several years. "We assume that woody wildfire derived charcoal is a globally important source of these free radicals and thus potentially also of harmful reactive oxygen species," adds Hofmann."It is our collaboration with colleagues at Swansea University in the United Kingdom that enables us to make these highly differentiated statements," explains Sigmund. The wildfire experts at Swansea University are conducting global research into the effects of fire on environmental processes such as the carbon cycle and erosion. They have collected charcoal samples from around the world and sent them to Vienna for analysis, along with information on the timing, duration and intensity of the fires. CMESS researchers analyzed the samples in collaboration with Marc Pignitter of the Faculty of Chemistry using electron spin resonance spectroscopy (ESR spectroscopy). ESR spectroscopy made it possible to quantify the environmentally persistent free radicals in the studied material and to identify their adjacent chemical structures.The study has provided insights, but also raised further questions: The fact that environmentally persistent free radicals occur in such high concentrations and remain stable over several years was surprising. In future studies, the researchers are planning to also assess the consequences this may have for the environment. "To what extent is this a stress factor for microorganisms after a fire? How does it affect an ecosystem? The study is an impetus for further research," reports Sigmund.
|
Environment
| 2,021 |
March 26, 2021
|
https://www.sciencedaily.com/releases/2021/03/210326104732.htm
|
A general approach to high-efficiency perovskite solar cells
|
Perovskites, a class of materials first reported in the 19th century, were "re-discovered" in 2009 as a possible candidate for power generation via their use in solar cells. Since then, they have taken the photovoltaic (PV) research community by storm, reaching new record efficiencies at an unprecedented pace. This improvement has been so rapid that by 2021, barely more than a decade of research later, they are already achieving performance similar to conventional silicon devices. What makes perovskites especially promising is the manner in which they can be created. Where silicon-based devices are heavy and require high temperatures for fabrication, perovskite devices can be lightweight and formed with minimal energy investment. It is this combination -- high performance and facile fabrication -- which has excited the research community.
|
As the performance of perovskite photovoltaics rocketed upward, some of the supporting developments needed to make a commercially viable technology were left behind. One issue that continues to plague perovskite development is device reproducibility: while some PV devices can be made with the desired level of performance, others made in the exact same manner often have significantly lower efficiencies, puzzling and frustrating the research community.
Recently, researchers from the Emerging Electronic Technologies Group of Prof. Yana Vaynzof identified that fundamental processes occurring during perovskite film formation strongly influence the reproducibility of the photovoltaic devices. When depositing the perovskite layer from solution, an antisolvent is dripped onto the perovskite solution to trigger its crystallization. "We found that the duration for which the perovskite was exposed to the antisolvent had a dramatic impact on the final device performance, a variable which had, until now, gone unnoticed in the field," says Dr. Alexander Taylor, a postdoctoral research associate in the Vaynzof group and the first author on the study. "This is related to the fact that certain antisolvents may at least partly dissolve the precursors of the perovskite layer, thus altering its final composition. Additionally, the miscibility of antisolvents with the perovskite solution solvents influences their efficacy in triggering crystallization."
These results reveal that, as researchers fabricate their PV devices, differences in this antisolvent step could cause the observed irreproducibility in performance. Going further, the authors tested a wide range of potential antisolvents and showed that by controlling for these phenomena, they could obtain cutting-edge performance from nearly every candidate tested. "By identifying the key antisolvent characteristics that influence the quality of the perovskite active layers, we are also able to predict the optimal processing for new antisolvents, thus eliminating the need for the tedious trial-and-error optimization so common in the field," adds Dr. Fabian Paulus, leader of the Transport in Hybrid Materials Group at cfaed and a contributor to the study.
"Another important aspect of our study is the fact that we demonstrate how an optimal application of an antisolvent can significantly widen the processability window of perovskite photovoltaic devices," notes Prof. Vaynzof, who led the work. "Our results offer the perovskite research community valuable insights necessary for the advancement of this promising technology into a commercial product."
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325190246.htm
|
Researchers harvest energy from radio waves to power wearable devices
|
From microwave ovens to Wi-Fi connections, the radio waves that permeate the environment are not just signals of energy consumed but are also sources of energy themselves. An international team of researchers, led by Huanyu "Larry" Cheng, Dorothy Quiggle Career Development Professor in the Penn State Department of Engineering Science and Mechanics, has developed a way to harvest energy from radio waves to power wearable devices.
|
The researchers recently published their method. According to Cheng, current energy sources for wearable health-monitoring devices each have their place in powering sensor devices, but each also has its setbacks. Solar power, for example, can only harvest energy when exposed to the sun, and a self-powered triboelectric device can only harvest energy when the body is in motion.
"We don't want to replace any of these current power sources," Cheng said. "We are trying to provide additional, consistent energy."
The researchers developed a stretchable wideband dipole antenna system capable of wirelessly transmitting data collected from health-monitoring sensors. The system consists of two stretchable metal antennas integrated onto conductive graphene material with a metal coating. The wideband design allows the system to retain its frequency functions even when stretched, bent and twisted. The antenna system is connected to a stretchable rectifying circuit, creating a rectified antenna, or "rectenna," capable of converting energy from electromagnetic waves into electricity. This electricity can be used to power wireless devices or to charge energy storage devices, such as batteries and supercapacitors.
The rectenna can convert radio, or electromagnetic, waves from the ambient environment into energy to power the sensing modules on the device, which track temperature, hydration and pulse oxygen level. Compared to other sources, less energy is produced, but the system can generate power continuously -- a significant advantage, according to Cheng.
"We are utilizing the energy that already surrounds us -- radio waves are everywhere, all the time," Cheng said. "If we don't use this energy found in the ambient environment, it is simply wasted. We can harvest this energy and rectify it into power."
Cheng said that this technology is a building block for him and his team. Combining it with their novel wireless transmissible data device will provide a critical component that will work with the team's existing sensor modules.
"Our next steps will be exploring miniaturized versions of these circuits and working on developing the stretchability of the rectifier," Cheng said. "This is a platform where we can easily combine and apply this technology with other modules that we have created in the past. It is easily extended or adapted for other applications, and we plan to explore those opportunities."
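To get a rough feel for the power budgets involved in ambient RF harvesting, the sketch below applies the standard Friis transmission equation to a single assumed source. Every number here (transmitter power, antenna gains, distance, rectifier efficiency) is a hypothetical placeholder, not a figure from the paper, which is not modeled this way in the article.
```python
# Back-of-the-envelope RF harvesting estimate via the Friis equation.
# All parameter values are illustrative assumptions.
import math

def harvested_power_uw(p_tx_w, gain_tx, gain_rx, freq_hz, dist_m, rect_eff):
    """DC power out of the rectifier, in microwatts."""
    wavelength = 3e8 / freq_hz
    p_rx = p_tx_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * dist_m)) ** 2
    return p_rx * rect_eff * 1e6

# Hypothetical scenario: a 1 W Wi-Fi source 5 m away, unity-gain
# antennas, 50% RF-to-DC conversion efficiency.
print(f"{harvested_power_uw(1.0, 1.0, 1.0, 2.4e9, 5.0, 0.5):.1f} uW")  # ~2.0 uW
```
Microwatt-scale outputs like this are consistent with the article's point: the harvest is small compared with solar or motion sources, but it is available around the clock.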
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325190243.htm
|
Turning wood into plastic
|
Efforts to shift from petrochemical plastics to renewable and biodegradable plastics have proven tricky -- the production process can require toxic chemicals and is expensive, and the mechanical strength and water stability are often insufficient. But researchers have made a breakthrough, using wood byproducts, that shows promise for producing more durable and sustainable bioplastics.
|
In the study, the researchers describe their process. "There are many people who have tried to develop these kinds of polymers in plastic, but the mechanical strength is not good enough to replace the plastics we currently use, which are made mostly from fossil fuels," says Yao. "We've developed a straightforward and simple manufacturing process that generates biomass-based plastics from wood, but also plastic that delivers good mechanical properties as well."
To create the slurry mixture, the researchers used a wood powder -- a processing residue usually discarded as waste in lumber mills -- and deconstructed the loose, porous structure of the powder with a biodegradable and recyclable deep eutectic solvent (DES). The resulting mixture, which features nanoscale entanglement and hydrogen bonding between the regenerated lignin and cellulose micro/nanofibrils, has a high solid content and high viscosity, and can be cast and rolled without breaking.
Yao then led a comprehensive life cycle assessment to test the environmental impacts of the bioplastic against common plastics. Sheets of the bioplastic were buried in soil, fracturing after two weeks and completely degrading after three months; additionally, the researchers say the bioplastic can be broken back down into the slurry by mechanical stirring, which also allows the DES to be recovered and reused.
"That, to me, is what really makes this plastic good: it can all be recycled or biodegraded," says Yao. "We've minimized all of the materials and the waste going into nature."
The bioplastic has numerous applications, says Liangbing Hu, a professor at the Center for Materials Innovation at the University of Maryland and co-author of the paper. It can be molded into a film for plastic bags and packaging -- one of the major uses of plastic and causes of waste production. Because the bioplastic can be molded into different shapes, Hu says, it also has potential for use in automobile manufacturing.
One area the research team continues to investigate is the potential impact on forests if manufacturing of this bioplastic is scaled up. While the process currently uses wood byproducts, the researchers are keenly aware that large-scale production could require massive amounts of wood, with far-reaching implications for forests, land management, ecosystems and climate change, to name a few.
Yao says the research team has already begun working with a forest ecologist to create forest simulation models, linking the growth cycle of forests with the manufacturing process. She also sees an opportunity to collaborate with people who work in forest-related fields at YSE -- an uncommon convenience.
"It's not often an engineer can walk down the hall and talk to a forester," says Yao.
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325150153.htm
|
Ocean's mammals at crucial crossroads
|
The ocean's mammals are at a crucial crossroads -- with some at risk of extinction and others showing signs of recovery, researchers say.
|
In a detailed review of the status of the world's 126 marine mammal species -- which include whales, dolphins, seals, sea lions, manatees, dugongs, sea otters and polar bears -- scientists found that accidental capture by fisheries (bycatch), climate change and pollution are among the key drivers of decline.
A quarter of these species are now classified as being at risk of extinction (vulnerable, endangered or critically endangered on the IUCN Red List), with the near-extinct vaquita porpoise and the critically endangered North Atlantic right whale among those in greatest danger.
Conservation efforts have enabled recoveries among other species, including the northern elephant seal, humpback whale and Guadalupe fur seal.
The international research team -- led by the University of Exeter and including scientists from more than 30 institutions in 13 countries -- highlights conservation measures and research techniques that could protect marine mammals into the future.
"We have reached a critical point in terms of marine mammal conservation," said lead author Dr Sarah Nelms, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall. "Very few marine mammal species have been driven to extinction in modern times, but human activities are putting many of them under increasing pressure. Our paper examines a range of conservation measures -- including Marine Protected Areas (MPAs), bycatch reduction methods and community engagement -- as well as highlighting some of the species that are in urgent need of focus."
The researchers say 21% of marine mammal species are listed as "data deficient" on the IUCN Red List -- meaning not enough is known to assess their conservation status. This lack of knowledge makes it difficult to identify which species are in need of protection and what actions should be taken to save them.
Professor Brendan Godley, who leads the Exeter Marine research group, said: "To continue conservation successes and reverse the downward trend in at-risk species, we need to understand the threats they face and the conservation measures that could help. Technology such as drone and satellite imaging, electronic tags and molecular techniques are among the tools that will help us do this. Additionally, sharing best practice will empower us -- and this is why we are so proud to be part of such a large and international group for this project."
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325150055.htm
|
New documentation: Old-growth forest carbon sinks overestimated
|
The claim that old-growth forests play a significant role in climate mitigation, based upon the argument that even the oldest forests keep sucking CO2 out of the atmosphere, does not hold up, according to researchers at the University of Copenhagen.
|
Old and unmanaged forest has become the subject of much debate in recent years, both in Denmark and internationally. In Denmark, setting aside forests as unmanaged has often been argued to play a significant role in climate mitigation. The argument doesn't stand up, according to researchers at the University of Copenhagen, whose documentation has just been published as a commentary.
The entire climate mitigation argument is based upon a widely cited 2008 research article which reports that old-growth forests continue to suck up and sequester large amounts of CO2.
"The climate mitigation effect of unmanaged forests with trees more than 200 years old is estimated to be at least one-third too high -- and is based solely upon their own data, which, incidentally, is subject to great uncertainty. Thus, the basis for the article's conclusions is very problematic," explains Professor Per Gundersen, of the University of Copenhagen's Department of Geosciences and Natural Resource Management.
The original research article concluded that old-growth forests more than 200 years old bind an average of 2.4 tonnes of carbon per hectare per year, and that 1.3 tonnes of this amount is bound in forest soil. According to the UCPH researchers, this claim is particularly unrealistic, because carbon storage in soil requires the addition of a certain amount of externally sourced nitrogen.
"The large amounts of nitrogen needed for their numbers to stand up don't exist in the areas of forest which they studied. The rate is equivalent to the soil's carbon content doubling in 100 years, which is also unlikely, as it has taken 10,000 years to build up the soil's current carbon content. It simply isn't possible to bind such large quantities of carbon in soil," says Gundersen.
Unlike the authors of the 2008 article, and in line with the classical view in this area, the UCPH researchers believe that old unmanaged forests reach a saturation point after a number of years, at which point significant CO2 uptake ceases.
"As we know, trees don't just grow into the sky. Trees age. And at some point, they die. When that happens, decay begins, sending carbon back into the atmosphere as CO2," says Gundersen. He adds that the 2008 article does not document any mechanism which would allow the forest to keep sequestering CO2.
The UCPH researchers' view is supported by observations from Suserup Forest, near Sorø, Denmark, a forest that has remained largely untouched for the past century. The oldest trees within it are 300 years old. Inventories taken in 1992, 2002 and 2012 all demonstrated that there was no significant CO2 uptake.
"We feel a bit like the child in the Emperor's New Clothes, because what we say is based on classic scientific knowledge, thermodynamics and common sense. Nevertheless, many have embraced an alternative view -- and brought the debate to a dead end. I hope that our contribution provides an exit," says Per Gundersen.
He would like to make it clear that this should in no way be perceived as a position against protection of old-growth forest or setting aside unmanaged forest areas.
"Old-growth forest plays a key role in biodiversity. However, from a long-term climate mitigation perspective, it isn't an effective tool. Grasping the nuance is important so that debate can be based upon scientifically substantiated assertions, and so that policy is not influenced on an incorrect basis," concludes Gundersen.
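The arithmetic behind the soil-doubling objection is simple enough to check directly. In the snippet below, the current topsoil carbon stock (~130 tonnes of carbon per hectare) is an assumed typical value chosen for illustration; only the 1.3-tonne annual figure comes from the 2008 article under discussion.
```python
# Quick check of the "doubling in 100 years" objection.
claimed_soil_uptake = 1.3  # t C per hectare per year, from the 2008 article
assumed_stock = 130.0      # t C per hectare, assumed current topsoil stock

years_to_double = assumed_stock / claimed_soil_uptake
print(f"Soil carbon would double in about {years_to_double:.0f} years")
```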
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325150037.htm
|
Study introduces 13 new, threatened species of sparkly moths from Hawaii
|
Akito Kawahara was snapping pictures at a scenic outlook in Hawaii when he spotted the moth equivalent of a dodo.
|
An entomologist, Kawahara recognized the squiggly patterns on nearby plants as trails carved by leaf-mining caterpillars and lowered his camera to take a closer look. To his astonishment, he saw a tiny moth most experts assumed was extinct. It belonged to a genus known as Philodoria. "I thought, 'Oh my God, there's a Philodoria,'" he recalled.
Kawahara's chance sighting along a tourist footpath would kick off a hunt for the moths across all major volcanic islands of Hawaii, as well as in museum collections and back through time, resulting in the rediscovery of one of the archipelago's oldest living lineages of native animals. Over the past eight years, Kawahara and collaborators have labored to fill gaps in our understanding of these poorly studied insects: what they look like, where they live and what they eat. When the researchers began their work, 30 species of the slender, feathery moths were recorded in the scientific literature. That number has since grown to 51.
Now, the team is capping its project with a nearly 200-page-long study, the first to detail the natural history of all members of the genus.
"This is an amazing opportunity to add another key piece that deepens the Hawaiian story," said study co-author Chris Johns, who carried out the project's fieldwork during his doctoral studies in the Kawahara Lab. "Because of its rarity and isolated existence, Philodoria is an important part of that story."
Many of the newly described species are named after native Hawaiian plants, places and people who have contributed to conservation on the islands, Kawahara said.
Shrinking habitat, disease and invasive species have wiped out much of Hawaii's native flora and fauna, and more than 530 species on the islands are federally listed as endangered or threatened. Somehow, these micromoths, with a wingspan the length of an eyelash, have persisted. But their restricted range and the scarcity of their host plants place them in danger of extinction. The researchers were unable to find living representatives of 10 Philodoria species.
According to research led by Johns, the moths' lineage likely dates back about 21 million years. Can they survive the next century?
"This tiny, tiny insect has a special relationship to Hawaiian plants," said study lead author Shigeki Kobayashi, a former postdoctoral researcher in the Kawahara Lab and now a visiting researcher at Japan's Osaka Prefecture University.
While the diet of the genus as a whole includes 12 families of Hawaiian plants, most individual species depend on a single group of host plants. It's a delicate system in which the balance can easily be tipped. Invasive species, in particular, can disrupt an island's native ecosystems with mind-boggling speed, Johns said.
"Within a couple of years, a forest that is fully functioning can be completely erased and replaced with non-native species," he said. "This can happen in places that researchers just can't reach. A pig or goat or bird could bring an invasive plant up there. It's a huge problem."
Blown or rafted from parts unknown, the moths' ancestors eventually made the islands their home. Some of their leaf-mining relatives in other parts of the world have become so intertwined with their hosts that neither species can live without the other. In the South Pacific, a local species of plant relies on another leaf-mining moth species for pollination. In turn, the moth deposits its eggs in the plants' flowers, which will later provide food for its offspring. Whether any members of Philodoria have similarly close relationships with their host plants remains an open question, Kawahara said.
"The problem is that, in some cases, there's only one or two of the plants left on the planet," he said. "They're often growing off steep cliffs, very hard to find, and you have to be in the right place at the right time to see them flowering.
It was hard enough to find the larvae."
Kawahara should know. After spotting that first moth as a postdoctoral researcher 10 years ago, he spent months combing the jungle for others with no success. He had moved halfway across the world for a faculty position at the Florida Museum when Johns knocked on his door. Johns was a recent University of Florida anthropology graduate and a plant guy in search of a moth job. As he described his previous conservation work in Hawaii, an idea began to take shape in Kawahara's mind. Would Johns be willing to scour the rainforest for an ultra-rare moth that had eluded researchers for nearly half a century?
Johns' response: Game on.
"Most people think about Hawaii as this vacation spot with Mai-Tais and beaches, but it's so much more than that," he said. "It really all revolves around the culture, which revolves around its nature. There are so many things in Hawaii that are small and understated. The fact that Philodoria is among them says a lot."
After a month on the islands, Johns appeared in Kawahara's doorway in Gainesville with a small cooler. Inside were live Philodoria.
"I was shocked," Kawahara said. "I was convinced he was going to come back empty-handed. That was the best plane ticket I ever bought."
Johns' strategy was to look not for the moths, but their host plants. He leaned on his ties with local conservation biologists to reach some of the islands' most remote forests, accessible only by helicopter, four-by-four or a long hike. Once there, he searched for the moths' host plants and the telltale mines of their caterpillars.
It wasn't easy. Some mountainsides were so difficult to get to, he had to be dropped off in the middle of a stream, the helicopter touching a single skid to a rock long enough for him to hop out. He would spot a piece of flagging tied to a faraway tree across a dense forest -- that was the trail. Navigating it was "mostly falling," he recalled.
"You're on a volcano in the middle of the biggest ocean in the world," Johns said. "When you get up into these places, you really get a sense of how quiet, still and slow the entire place is. You understand the isolation so much better."
But due to the islands' conelike shape, Johns could look out and see the rows of hotels and condominiums packing the distant coastline.
Micromoths often play important but overlooked roles in ecosystems, and Philodoria is likely no exception. In one study, Johns, Kawahara and collaborators at the Bishop Museum in Honolulu discovered the dried pupae of an undescribed and possibly extinct Philodoria species on plant specimens preserved in collections.
"The only way we could document this particular moth-plant interaction was through museum collections," Kawahara said. "Whoever collected those leaves may not have even realized there were pupae attached, but that's the only record we have of that moth."
The team also used the moths to revisit a classic evolutionary puzzle: When did plants and animals first appear in Hawaii?
On a map, Hawaii may look like a smattering of islands surrounded by ocean in all directions, but it's actually the easternmost tip of a vast underwater mountain range that spans more than 3,600 miles, ending off the coast of Russia.
The main Hawaiian Islands -- the youngest and largest -- sit atop a hotspot, a single magma plume that has been birthing volcanoes for about 85 million years, which are sent on a slow westward journey by the movement of tectonic plates. The farther from the hotspot the islands travel, the more they sink and erode, finally disappearing beneath the ocean surface.
The hotspot has created an estimated 180,000 cubic miles of rock over its history.
About 23 million years ago, it formed the Northwest Hawaiian Islands of Lisianski and Laysan. Once enormous landmasses rivalling the size of today's main islands, they're now in advanced stages of erosion: Lisianski's highest point above sea level is a 40-foot sand dune, and Laysan -- more than 800 nautical miles west of Honolulu -- will likely be submerged this century.
The first of the main Hawaiian Islands to appear, Kauai, surfaced about 4.7 million years ago. Many researchers believe Hawaii's existing native plants and animals date from this period. But evidence is mounting that some, including certain kinds of insects and spiders, hail from a much earlier era, when Lisianski and Laysan were in their prime.
This is the problem with using the ages of islands to timestamp the origin of species, Kawahara said. Plants and animals can predate the islands they now inhabit.
"Islands can also go extinct," he said. "There are a lot of gaps in our knowledge, not just in terms of insects and plants, but also in terms of island geology."
Hawaii's flora and fauna may have been riding a conveyor belt of islands for millions of years, gradually vanishing from older islands and moving to new ones.
Using a combination of DNA evidence, island ages and studies of closely related insect groups, Johns and Kawahara estimated that the moth lineage originated more than 21 million years ago, at least 19 million years before the formation of Kauai. If accurate, this would make the moths the islands' oldest known living lineage of native arthropods -- and possibly their oldest animal lineage alive today.
This timeline and fossil pollen evidence from the moths' host plants suggest that Lisianski and Laysan were the moths' first Hawaiian homes.
How did such tiny insects wind up on volcanoes in the middle of the ocean?
"Plants and animals arrived in Hawaii somehow," Kawahara said. "How they got there, when they got there and how they ended up this way are some of the most fundamentally interesting evolutionary questions."
The future of Philodoria remains uncertain. For Kawahara, though, the study fulfills a longstanding goal. "This was one of my dreams," he said. "It's not necessarily flashy science. It's natural history. We're doing it for the moths. We're doing it for conservation. We're doing it because it's important."
From one Hawaiian's perspective, the moths are essential to the islands, regardless of whether or not we ever learn their precise roles within the ecosystem.
"Our culture depends on the existence of these moths and other insects and plants," said collaborator and conservationist Keahi Bustamante in an award-winning video created by Johns. "It's talked about. It's put into songs and legends. It's documented that we respect these things and they respect us in a way that allows us to survive."
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325120821.htm
|
Warm water has overlooked importance for cold-water fish, like salmon and trout
|
Warm river habitats appear to play a larger than expected role supporting the survival of cold-water fish, such as salmon and trout, a new Oregon State University-led study published today found.
|
The research has important implications for fish conservation strategies. A common goal among scientists and policymakers is to identify and prioritize habitat for cold-water fish that remains suitably cool during the summer, especially as the climate warms.
This implicitly devalues areas that are seasonally warm, even if they are suitable for fish most of the year, said Jonny Armstrong, lead author of the paper and an ecologist at Oregon State. He called this a "potentially severe blind spot for climate change adaptation."
"Coldwater fish like trout and salmon are the polar bears of river ecosystems -- iconic species that are among the most vulnerable to climate change," Armstrong said. "A huge challenge for conservation is to figure out how to help these fish survive a warmer future. The conclusion is that we should not waste money on warm habitats and instead focus on saving the coldest places, such as high mountain streams, which are already the most pristine parts of basins. Most people agree we should give up on places that are warm in summer, but forget that these places are actually optimal for much of the year."
"The synergy between cold water and warm water is really important," said Armstrong, an assistant professor in the Department of Fisheries and Wildlife in the College of Agricultural Sciences. "We're not saying cold water is not important. We're saying that warm portions of basins are also important because they grow fish during the shoulder seasons. Conserving this habitat is critical for unlocking the full potential of rivers to support fisheries. In a warmer future, many fish will need to take a summer vacation and move to cold places to survive the hottest months of the year. Their ability to do that could often depend on how much energy they can get in the spring and how well they can feed in the fall to bounce back. The places that are stressfully warm in summer are just right in spring and fall, and there is growing evidence that they can fuel fisheries."
For the study, the researchers used data from another team of scientists that used remote sensing technology to obtain river water temperature data across entire landscapes throughout the year. That team compiled data for 14 river basins in Oregon, Washington and Idaho.
The OSU-led team plugged these temperature data into a "bioenergetics model" that predicts fish growth potential based on equations derived from lab studies. This provided new insights into how growth opportunities shift across river basins throughout the year, and how a large fraction of total growth potential can accrue during the spring and autumn in places that are too hot during summer.
To explore how these warm habitats could contribute to fisheries, the team created a simulation model in which virtual rainbow trout were given simple behavior rules and allowed to forage throughout the year in a basin with cold tributaries and a warm, productive main-stem river. The simulations showed the majority of fish moved into cooler waters in the summer and exhibited meager growth rates there. Outside summer, however, the fish resided primarily in seasonally warm downstream habitats, which fueled the vast majority of their growth.
"In conservation, we often judge streams by their summer conditions; this is when we traditionally do field work, and this is the season we focus on when planning for climate change," Armstrong said.
"We place value on places that hold fish during summer and devalue those that don't. Our simulation showed why this can be a problem -- the portions of rivers that contribute most to growth may not be the places where fish are found during summer, so they get written off."The simulations reveal the synergy between seasonally warm and perennially cool habitats and that fish that lived in these two types of habitats grew much more than fish that were restricted to either habitat alone, Armstrong said."We think of things in this binary way -- it's either warm-water habitat or its cold-water habitat," Armstrong said. "And we have definitions for fish -- it's either a warm-water fish or a cold-water fish. But the places we think of as warm are, in fact, cold way more than they are warm."He then mentioned an example using rivers in Oregon, including the Willamette, a tributary of the Columbia River that runs nearly 200 miles from Eugene to Portland."When it's warm enough for humans to swim, it's bad for cold-water fish. But there's only like six weeks of the year where it is comfortable to go swimming in Oregon," Armstrong said. "That speaks to the fact that we write off places because they get too hot through the lens of August. They're actually pretty nice for most of the year if you're a cold-water fish. And fish don't necessarily have to live there in August, just like you don't have to go swimming in the Willamette in December."This research is continuing in the field at Upper Klamath Lake in Southern Oregon, where Armstrong and a team of researchers are tracking the movement and feeding behavior of redband trout as water temperature changes.Co-authors of the paper are Aimee Fullerton and Chris Jordan of the National Oceanic and Atmospheric Administration Northwest Fisheries Science Center; Joseph Ebersole of the Environmental Protection Agency; James Bellmore, Brooke Penaluna and Gordon Reeves of the U.S. Forest Service Pacific Northwest Research Station; and Ivan Arismendi of Oregon State.
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325115319.htm
|
'Keep off the grass': The biofuel that could help us achieve net zero
|
The Miscanthus genus of grasses, commonly used to add movement and texture to gardens, could quickly become the first choice for biofuel production. A new study shows these grasses can be grown in lower agricultural grade conditions -- such as marginal land -- due to their remarkable resilience and photosynthetic capacity at low temperatures.
|
Miscanthus is a promising biofuel crop thanks to its high biomass yield and low input requirements, which allow it to adapt to a wide range of climate zones and land types. It is seen as a viable commercial option for farmers, but yields can come under threat from insufficient or excessive water supply, such as increasing winter floods or summer heat waves.
With very little known about its productivity in flooded and moisture-saturated soil conditions, researchers at the Earlham Institute in Norwich wanted to understand the differences in water-stress tolerance among Miscanthus species to guide genomics-assisted crop breeding.
The research team -- along with collaborators at TEAGASC, the Agriculture and Food Development Authority in the Republic of Ireland, and the Institute of Biological, Environmental and Rural Sciences in Wales -- analysed various Miscanthus genotypes to identify traits that provided insight into gene adaptation and regulation during water stress.
They found specific genes that play key roles in the response to water stress across different Miscanthus species, consistent with functional biological processes that are critical for surviving drought stress in other organisms.
Dr Jose De Vega, author of the study and Group Leader at the Earlham Institute, said: "Miscanthus is a commercial crop due to its high biomass productivity, resilience, and ability to continue photosynthesis during the winter months. These qualities make it a particularly good candidate for growth on marginal land in the UK, where yields might otherwise be limited by scorching summers and wet winters."
Previously, a decade-long trial in Europe showed that Miscanthus produced up to 40 tonnes of dry matter per hectare each year. This was reached after just two years of establishment, and the crop proved more efficient in ethanol production per hectare than switchgrass and corn.
Miscanthus species have been used as forage species in Japan, Korea and China for thousands of years and, due to their high biomass yield and high ligno-cellulose (plant dry matter) content, they are commercially used as feedstock for bioenergy production.
Ligno-cellulose biomass is the most abundantly available raw material on Earth for the production of biofuels, mainly bio-ethanol. Miscanthus's high biomass yield makes the grass a valuable commodity for farmers on marginal land, but the crop's responses to water stress vary depending on the species' origin.
The scientists compared the physiological and molecular responses among Miscanthus species in both flooded and drought conditions. The induced physiological conditions were used for an in-depth analysis of the molecular basis of water stress in Miscanthus species.
A significant biomass loss was observed under drought conditions in all four Miscanthus species. In flooded conditions, biomass yield was as good as or better than in control conditions in all species.
The low number of differentially expressed genes, and the higher biomass yield in flooded conditions, supported the use of Miscanthus in flood-prone marginal land.
"The global challenge of feeding the ever-increasing world population is exacerbated when food crops are used as feedstock for green energy production," said Dr De Vega.
"Successful plant breeding for ethanol and chemical production requires the ability to grow on marginal lands alongside prioritising the attributes: non-food related, perennial, high biomass yield, low chemical and mechanical input, enhanced water-use efficiency and high carbon storage capacity. Miscanthus fulfils these criteria -- saving money and space for farmers, and helping our over-polluted environment by reducing CO2 emissions.
"The research team is in the early selection process of high biomass genotypes from large Miscanthus populations that are better adapted to UK conditions and require low inputs. The use of genomic approaches is allowing us to better understand the traits that make some Miscanthus species a commercially sustainable alternative for marginal lands, and to apply this to agri-practices."
The paper 'Physiological and transcriptional response to drought stress among bioenergy grass Miscanthus species' has been published.
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325101224.htm
|
Biocrude passes the 2,000-hour catalyst stability test
|
A large-scale demonstration converting biocrude to renewable diesel fuel has passed a significant test, operating for more than 2,000 hours continuously without losing effectiveness. Scientists and engineers led by the U.S. Department of Energy's Pacific Northwest National Laboratory conducted the research to show that the process is robust enough to handle many kinds of raw material without failing.
|
"The biocrude oil came from many different sources, including wastewater sludge from Detroit, and food waste collected from prison and an army base," said John Holladay, a PNNL scientist and co-director of the joint Bioproducts Institute, a collaboration between PNNL and Washington State University. "The research showed that essentially any biocrude, regardless of wet-waste sources, could be used in the process and the catalyst remained robust during the entire run. While this is just a first step in demonstrating robustness, it is an important step."The milestone was first described at a virtual conference organized by NextGenRoadFuels, a European consortium funded by the EU Framework Programme for Research and Innovation. It addresses the need to convert biocrude, a mixture of carbon-based polymers, into biofuels. In the near term, most expect that these biofuels will be further refined and then mixed with petroleum-based fuels used to power vehicles."For the industry to consider investing in biofuel, we need these kinds of demonstrations that show durability and flexibility of the process," said Michael Thorson, a PNNL engineer and project manager.Just as crude oil from petroleum sources must be refined to be used in vehicles, biocrude needs to be refined into biofuel. This step provides the crucial "last mile" in a multi-step process that starts with renewables such as crop residues, food residues, forestry byproducts, algae, or sewage sludge. For the most recent demonstration, the biocrude came from a variety of sources including converted food waste salvaged from Joint Base Lewis-McChord, located near Tacoma, Wash., and Coyote Ridge Corrections Center, located in Connell, Wash. The initial step in the process, called hydrothermal liquefaction, is being actively pursued in a number of demonstration projects by teams of PNNL scientists and engineers.The "last mile" demonstration project took place at the Bioproducts, Sciences, and Engineering Laboratory on the Richland, Wash. campus of Washington State University Tri-Cities. For 83 days, reactor technician Miki Santosa and supervisor Senthil Subramaniam fed a constant flow of biocrude into carefully honed and highly controlled reactor conditions. The hydrotreating process introduces hydrogen into a catalytic process that removes sulfur and nitrogen contaminants found in biocrude, producing a combustible end-product of long-chain alkanes, the desirable fuel used in vehicle engines. Chemist Marie Swita analyzed the biofuel product to ensure it met standards that would make it vehicle-ready."Processing food and sewage waste streams to extract useful fuel serves several purposes," said Thorson. Food waste contains carbon. When sent to a landfill, that food waste gets broken down by bacteria that emit methane gas, a potent greenhouse gas and contributor to climate change. Diverting that carbon to another use could reduce the use of petroleum-based fuels and have the added benefit of reducing methane emissions.The purpose of this project was to show that the commercially available catalyst could stand up to the thousands of hours of continuous processing that would be necessary to make biofuels a realistic contributor to reducing the world's carbon footprint. 
But Thorson pointed out that it also showed that the biofuel product produced was of high quality, regardless of the source of biocrude -- an important factor for the industry, which would likely be processing biocrude from a variety of regional sources.
Indeed, knowing that transporting biocrude to a treatment facility could be costly, modelers are looking at areas where rural and urban waste could be gathered from various sources in local hubs. For example, they are assessing the resources available within a 50-mile radius of Detroit, Mich. There, the sources of potential biocrude feedstock could include food waste, sewage sludge and cooking oil waste. In areas where food waste could be collected and diverted from landfills, much as recycling is currently collected, a processing plant could be up to 10 times larger than in rural areas and provide significant progress toward cost and emission-reduction targets for biofuels.
Milestones such as hours of continuous operation are being closely watched by investor groups in the U.S. and Europe, which has set aggressive goals, including being the first climate-neutral continent by 2050 and achieving a 55% reduction in greenhouse gas emissions by 2030. "A number of demonstration projects across Europe aim to commercialize this process in the next few years," Holladay said.
The next steps for the research team include gathering more sources of biocrude from various waste streams and analyzing the biofuel output for quality. In a new collaboration, PNNL will partner with a commercial waste management company to evaluate waste from many sources. Ultimately, the project will result in a database of findings from various manures and sludges, which could help decide how facilities can scale up economically.
"Since at least three-quarters of the input and output of this process consists of water, the ultimate success of any industrial scale-up will need to include a plan for dealing with wastewater," said Thorson. This, too, is an active area of research, with many viable options available in many locations for wastewater treatment facilities.
DOE's Bioenergy Technologies Office has been instrumental in supporting this project, as well as the full range of technologies needed to make biofuels feasible.
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325101202.htm
|
How improving acoustic monitoring of bats could help protecting biodiversity
|
In order to assess the risk of bats dying at wind turbines, it is common practice to record the acoustic activity of bats within the operating range of the rotor blades. For this purpose, ultrasonic detectors are attached to the nacelles atop the turbine masts. In a recent analysis, a team of scientists led by the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) concludes that the effectiveness of this acoustic monitoring is insufficient to reliably predict mortality risk, especially for bats at large turbines. They therefore recommend installing supplementary ultrasonic detectors at other locations on the wind turbines and developing additional techniques such as radar and thermal imaging cameras for monitoring. The results of their analysis have been published.
|
Wind is a widely used source of renewable energy. One downside of wind energy is that many bats die when colliding with the rotor blades of wind turbines. This is an urgent conservation problem because all bat species are protected by law owing to their rarity. To find out when the operation of wind turbines poses a threat to bats and when it does not, researchers determine the temperature and wind conditions at which bats are particularly active at turbines. For this purpose, the echolocation calls of bats are recorded when they fly into the risk zone near the rotor blades. From this, threshold values for wind speed and temperature can be derived for bat-safe operation: the turbines then produce electricity only when no bats, or only a few, are active.
"This approach is a good starting point. Its methodological implementation is, however, often insufficient, especially for large wind turbines," summarises bat expert Dr Christian Voigt, Head of the Leibniz-IZW Department of Evolutionary Ecology, together with colleagues from the German Bat Association (Bundesverband für Fledermauskunde Deutschland), the University of Naples Federico II, the University of Bristol and the Max Planck Institute for Ornithology in a joint publication.
Automated ultrasonic detectors on the nacelles of wind turbines are usually used for acoustic monitoring. These record the calls of passing bats. "Each bat species produces echolocation sounds at a pitch and volume typical for the species," explains Voigt. He and his colleagues simulated sound propagation using the example of the common noctule, whose calls have a low frequency (about 20 kHz) but a high sound pressure level (110 dB), and Nathusius's pipistrelle, whose calls have a higher frequency (about 40 kHz) and a lower sound pressure level (104 dB). "Our simulations show that, according to the laws of physics, the calls are attenuated as they propagate through the air by 0.45 dB per metre for common noctules and by 1.13 dB per metre for Nathusius's pipistrelle," says Voigt. With the widely used detection threshold of 60 dB, ultrasonic detectors record calls of common noctules up to 40 m away; for Nathusius's pipistrelle, the detection range is on average 17 m. Neither distance is sufficient to completely cover the danger zone of large wind turbines. New turbines in particular have rotor blades more than 60 m long, well beyond the distance at which ultrasonic detectors can register bats.
The directionality of bat sonar adds to the problem: echolocation calls do not spread evenly in all directions but project forward, in the direction of flight. If bats do not fly directly towards the microphone, the effective detection range shrinks further. In addition, ultrasonic detectors are usually mounted on the underside of the nacelle with the microphone pointing downwards, so bat calls above the nacelle are not registered. Monitoring thus focuses on the lower half of the danger zone, although bats can also be found in the upper half.
"At a wind turbine with rotor blades of 60 m length, the detectors only cover a maximum of 23% of the risk zone for the common noctule and only a maximum of 4% of the risk zone for Nathusius's pipistrelle, two species with a high risk of colliding with turbines.
With modern wind turbines, rotor blade lengths continue to increase, so the relative coverage will be even lower in the future," says Voigt, first author of the article. As a consequence, existing acoustic monitoring does not adequately reflect the collision risk; the conditions under which wind turbines are switched off for bat protection are therefore insufficient, and many animals continue to die.
In order to improve coverage of the risk zone around the rotor blades, the scientists recommend additional detectors at other locations, e.g. above as well as on the lee side of the nacelle. To also detect bats circling up the mast of the turbine, it may be advisable to install ultrasonic detectors directly on the mast; this would also register animals flying at lower levels above ground or collecting insects from the mast surface. Complementary sensor technology such as radar systems or thermal imaging cameras could provide additional information.
Based on the recordings, consultants and researchers can determine the bat species and assess under which conditions (temperature, time of day, wind strength) they are most active. With this information, conditions can be defined that restrict the operation of wind turbines during times of particularly high bat activity, thus reducing the risk of fatalities. "Through suitable monitoring schemes, the operation of wind turbines can be effectively adjusted to ensure that wind energy production does not come at the expense of biodiversity," Voigt concludes.
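The geometry behind these detection ranges can be reproduced with a few lines of code. The sketch below assumes spherical spreading (20·log10 of distance) plus the linear atmospheric attenuation quoted in the article, with source levels referenced to 1 m; the propagation model is an assumption for illustration, but under it the quoted figures of roughly 40 m and 17 m fall out directly.
```python
# Detection range under spherical spreading plus linear attenuation.
# Source levels and attenuation rates are from the article; the
# propagation model itself is an assumption for illustration.
import math

def received_level(source_db, atten_db_per_m, r_m):
    """Sound pressure level reaching a detector at distance r."""
    return source_db - 20 * math.log10(r_m) - atten_db_per_m * r_m

def detection_range(source_db, atten_db_per_m, threshold_db=60.0):
    """Largest distance (found by bisection) still above the threshold."""
    lo, hi = 1.0, 500.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if received_level(source_db, atten_db_per_m, mid) >= threshold_db:
            lo = mid
        else:
            hi = mid
    return lo

print(f"common noctule:        {detection_range(110, 0.45):.0f} m")  # ~40 m
print(f"Nathusius pipistrelle: {detection_range(104, 1.13):.0f} m")  # ~17 m
```
Scaling such ranges against rotor blades more than 60 m long makes the coverage gap the authors describe immediately visible.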
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325084856.htm
|
Technology uses 'single' approach to develop electronics, acoustics
|
A Purdue University innovator has developed a new approach to creating popular thin films used for devices across a broad range of fields, including optics, acoustics and electronics.
|
Epitaxial lithium niobate (LNO) thin films are an attractive material for electronics and other devices. These films offer flexibility and other properties that are important to manufacturers.
The challenge is that such devices demand high-quality thin films that can be difficult to grow and produce. Haiyan Wang, a Purdue materials engineer, developed a new approach to creating these films; the work has been published.
"We created an approach that makes these films easier to produce," said Wang, the Basil S. Turner Professor of Engineering in Purdue's College of Engineering. "We developed a versatile nanocomposite-seeded approach that allows us to create single-layer films. Typically, engineers have used a double-layer approach, which adds to the complicated production process."
This work is supported by Sandia National Laboratories through its Academic Alliance initiative; the technology work is also supported through Sandia's Diversity Initiative.
"Our approach offers an efficient new option for optics, acoustics and electronics," said Robynne Paldi, a Ph.D. candidate at Purdue who helped lead the research. "Our films are grown through a pulsed laser deposition method and growth conditions are optimized to achieve high-quality films that can be easily integrated into devices."
The innovators worked with the Purdue Research Foundation Office of Technology Commercialization to patent their technology.
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325084830.htm
|
Two new species of already-endangered screech owls discovered in Amazon rainforest
|
The Amazon rainforest is teeming with creatures unknown to science -- and that's just in broad daylight. After dark, the forest is a whole new place, alive with nocturnal animals that have remained even more elusive to scientists than their day-shift counterparts. In a new paper, researchers describe two new species of screech owl from the Amazon and Atlantic forests -- both already in danger of disappearing.
|
"Screech owls are considered a well-understood group compared to some other types of organisms in these areas," says John Bates, curator of birds at the Field Museum in Chicago and one of the study's authors. "But when you start listening to them and comparing them across geography, it turns out that there are things that people hadn't appreciated. That's why these new species are being described.""Not even professional ornithologists who have worked on owls for their entire lives would agree about the actual number of species found in this group, so a study like ours has been awaited for a really long time," says Alex Aleixo, head of the research team responsible for the study, and currently curator of birds at the Finnish Museum of Natural History in the University of Helsinki, Finland.The newly-discovered screech owls are cousins of the Eastern Screech Owls that are common in the United States. "They're cute little owls, probably five or six inches long, with tufts of feathers on their heads," says Bates. "Some are brown, some are gray, and some are in between." Until this study, the new species were lumped together with the Tawny-bellied Screech Owl and the Black-capped Screech Owl, which are found throughout South America.Teasing out the differences between the species started with years of fieldwork in the Amazon rainforest as well as the Atlantic forest running along the eastern part of Brazil and surrounding countries. Bates, who usually conducts fieldwork during the day, says that doing fieldwork in the rainforest at night comes with new challenges. "For me it's more a feeling of fascination than being scared, but at the same time, you're running into spider webs. If you're wearing a headlight you see the eyeshine of the nocturnal animals. One time I was stepping over a log and I looked down and there was a tarantula the size of my hand just sitting there," says Bates. "If I had been a kid I would have been scared to death."The owls that the researchers were looking for live in the trees, often a hundred feet above the forest floor. That makes studying them difficult. But the researchers had a secret weapon: the screech owls' namesake screech."To draw the birds out, we used tape recordings," explains Bates. "We'd record their calls and then play them back. The owls are territorial, and when they heard the recordings, they came out to defend their territory."The scientists compared the birds' calls and found that there were variations in the sounds they made, indicative of different species. They also examined the birds' physical appearances and took tissue samples so they could study the owls' DNA at the Field Museum's Pritzker DNA Lab.Altogether, 252 specimens, 83 tape-recordings, and 49 genetic samples from across the range of the Tawny-bellied Screech Owl complex in South America were analyzed. A significant number of specimens were collected by the research team itself, especially the study's lead author Sidnei Dantas, who spent a good share of his time in graduate school searching for and tape-recording screech owls in South American rainforests. In addition, natural history collections and their materials collected over the centuries were essential to complete the study´s unprecedented sampling."The study would not have been possible if it were not for the great biological collections in Brazil and USA which I visited during my work, and that sent us essential material, either genetic and morphological. 
This highlights the importance of such research institutions for the progress of science and hence of the countries they represent," says Dantas, who conducted the study as part of his PhD dissertation at the Goeldi Museum in Belém and is currently working as a nature guide in Brazilian Amazonia.The combination of genetic variation, physical differences, and unique vocalizations led the team to describe two new species in addition to the previously known Tawny-bellied Screech Owl: the Xingu Screech Owl and the Alagoas Screech Owl. The Xingu owl's scientific name is in honor of Sister Dorothy May Stang, an activist who worked with Brazilian farmers to develop sustainable practices and fight for their land rights; its common name is for the area where the owl is found near the Xingu River. The Alagoas owl's name is a reference to the northeastern Brazilian state of Alagoas where the owl is primarily found.While the owls are new to science, they're already in danger of disappearing forever. "Both new species are threatened by deforestation," says Jason Weckstein, associate curator of Ornithology in the Academy of Natural Sciences of Drexel University and associate professor in the university's Department of Biodiversity, Earth, and Environmental Science. "The Xingu Screech Owl is endemic to the most severely burned area of the Amazon by the unprecedented 2019 fires, and the Alagoas Screech Owl should be regarded as critically endangered given the extensive forest fragmentation in the very small area where it occurs," says Weckstein, who is a co-author and began work on this project as a postdoctoral researcher at the Field Museum.Bates says he hopes that the study will shed light on how varied the Amazon and Atlantic forests are and how simply protecting certain areas isn't enough to preserve the forests' biodiversity. "If you just say, 'Well, you know Amazonia is Amazonia, and it's big,' you don't end up prioritizing efforts to keep forests from being cut in these different parts of Amazonia. That could mean losing entire faunas in this region," says Bates.In addition to the study's conservation implications, the authors highlight the international collaboration that made the work possible. "This study shows how important it is to train the next generation of scientists at a global level," says Bates. "That means to having students like Sidnei come from Brazil and work in the Field's Pritzker Lab and measure specimens in our collection for their research. It's a great thing to build those connections."
|
Environment
| 2,021 |
March 25, 2021
|
https://www.sciencedaily.com/releases/2021/03/210325115342.htm
|
Pumice the key to solving seabird mass death mystery
|
Researchers have used evidence from pumice produced by an underwater volcanic eruption to solve a long-standing mystery about a mass death of migrating seabirds.
|
New research into the mass death of millions of shearwater birds in 2013 suggests seabirds are eating non-food materials, including floating pumice stones, because they are starving, potentially indicating broader health issues for the marine ecosystem.

The research, which was led by CSIRO, Australia's national science agency, and QUT, has been published in a peer-reviewed journal. The lead author on the paper, CSIRO and IMAS-UTas researcher Dr Roman, said there was much discussion in the scientific community about the causes of mass mortality in seabirds, which are often found with plastic and other non-food items in their stomachs.

"We found that in the instance of the shearwater bird deaths in 2013, these birds were starving, and in their starved state had reduced prey discrimination," Dr Roman said. "Our study investigates the chicken-and-egg dilemma: do animals starve from eating non-food, or do animals eat non-food because they are starving?"

"Seabirds are widely considered to be indicators of the health of a marine ecosystem, and mass mortalities can indicate changing food webs and ecological conditions."

Short-tailed shearwaters (Ardenna tenuirostris), which were the subject of the study, migrate from Australia in April to the North Pacific and return late in the year. Necropsies of 172 seabirds recovered from beaches along the New South Wales and Queensland coasts found that 96.7 per cent of the birds had ingested pumice or plastic.

The research team, including Dr Natalie Bool (IMAS-UTas), Leah Gustafson (QUT) and Dr Kathy Townsend (USC), used satellite systems to track the 2013 shearwater migration and overlaid that onto locations of the pumice raft produced by the 2012 Havre eruption in the Kermadec arc north of New Zealand.

QUT's Associate Professor Bryan has been studying pumice rafts for over 20 years and recently has been tracking the impact of another giant pumice raft, from a 2019 underwater eruption near Tonga, that landed along the Australian coastline last year.

"We proposed that a short time period between non-food ingestion and death would indicate that birds were already starving at the time of non-food ingestion, and a starving state would be reflected by poor body condition and reduced muscle mass," Professor Bryan said. "Death after a longer period would indicate that birds starved following ingestion of non-food."

Professor Bryan said detailed information about the route of the pumice raft was integral to determining where the birds had consumed the pumice. "We combined the tracking data of the short-tailed shearwaters, using location tags on migrating birds, and the geological signature of the ingested pumice," Professor Bryan said. "By October 2013, when the shearwaters were returning to Australia on their annual migration from the North Pacific, the floating pumice was located along their flight path as they approached Australia."

By determining when the birds ate the pumice, and their physical condition at the time, the researchers were able to conclude that the birds were already starving when they ate the pumice, which they had ingested about 12 to 41 hours before death.

"With a projected increase in challenging times for wildlife because of threats such as climate change, marine pollution and overexploitation of resources, this study has implications for mass mortalities and the exacerbation of existing threats to marine species," Dr Roman said.

It was the coming together of researchers studying seabirds, volcanoes and pumice rafts, apparently unrelated phenomena, and the crossing of discipline boundaries, that helped solve this chicken-and-egg dilemma. The research team included scientists from CSIRO, QUT, the Institute for Marine and Antarctic Studies and the University of the Sunshine Coast.
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324155110.htm
|
Deadly heat waves will be common in South Asia, even at 1.5 degrees of warming
|
Residents of South Asia already periodically experience heat waves at the current level of warming. But a new study projecting the amount of heat stress residents of the region will experience in the future finds that, with 2 degrees Celsius of warming, the population's exposure to heat stress will nearly triple.
|
Limiting warming to 1.5 degrees Celsius will likely reduce that impact by half, but deadly heat stress will become commonplace across South Asia, according to the new study. With almost one quarter of the world's population living in South Asia, the findings underline the urgency of addressing climate change.

"The future looks bad for South Asia, but the worst can be avoided by containing warming to as low as possible," said Moetasim Ashfaq, a computational climate scientist at Oak Ridge National Laboratory and corresponding author of the new study. "The need for adaptation over South Asia is today, not in the future. It's not a choice anymore."

Earth has warmed by 1 degree Celsius since the start of the Industrial Revolution, according to the Intergovernmental Panel on Climate Change. On the current climate trajectory, it may reach 1.5 degrees Celsius of warming by 2040. This deadline leaves little time for South Asian countries to adapt. "Only a half-degree increase from today is going to cause a widespread increase in these events," Ashfaq said.

People living in South Asia are especially vulnerable to deadly heat waves because the area already experiences very hot, humid summers. Much of the population lives in densely populated cities without regular access to air conditioning, and about 60% perform agricultural work and can't escape the heat by staying indoors.

In the new study, the researchers used climate simulations and projections of future population growth to estimate the number of people who will experience dangerous levels of heat stress in South Asia at warming levels of 1.5 and 2 degrees Celsius. They estimated the wet bulb temperature residents will experience, which is similar to the heat index in that it takes into account humidity as well as temperature. A wet bulb temperature of 32 degrees Celsius (89.6 degrees Fahrenheit) is considered the point at which labor becomes unsafe, and 35 degrees Celsius (95 degrees Fahrenheit) is the limit to human survivability, when the body can no longer cool itself.

Their analysis suggests that at 2 degrees of warming, the population's exposure to unsafe labor temperatures will rise more than two-fold, and exposure to lethal temperatures will rise 2.7 times, compared to recent years. Curbing warming to 1.5 degrees Celsius will likely cut that exposure in half, but large numbers of people across South Asia will still experience extreme temperatures. An increase in heat events that create unsafe labor conditions is likely in major crop-producing regions of India, such as West Bengal and Uttar Pradesh, and in the Pakistani provinces of Punjab and Sindh. Coastal regions and urban centers such as Karachi, Kolkata, Mumbai, Hyderabad and Peshawar are also likely to be heavily affected, according to the study.

"Even at 1.5 degrees, South Asia will have serious consequences in terms of heat stress," Ashfaq said. "That's why there is a need to radically alter the current trajectory of greenhouse gas emissions."

The results differ from a similar study conducted in 2017, which predicted that heat waves of lethal temperatures would occur in South Asia toward the end of the 21st century. The researchers suspect the earlier study was too conservative, as deadly heat waves have already hit the region. In 2015, large parts of Pakistan and India experienced the fifth deadliest heat wave in recorded history, which caused about 3,500 heat-related deaths.

"A policy framework is very much needed to fight against heat stress and heat wave-related problems," said T.V. Lakshmi Kumar, an atmospheric scientist at India's SRM Institute of Science and Technology who was not involved in the work. "India has already committed to reduce emissions to combat climate change issues."

The study was supported by the National Climate-Computing Research Center, which is located within ORNL's National Center for Computational Sciences and supported under a Strategic Partnership Project between the Department of Energy and the National Oceanic and Atmospheric Administration.
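For readers who want a feel for the wet bulb metric the study relies on, the empirical fit of Stull (2011) approximates wet bulb temperature from air temperature and relative humidity at sea-level pressure. The short Python sketch below is only an illustration of the metric, not the researchers' code, and the example inputs are hypothetical:

    import math

    def wet_bulb_stull(temp_c, rh_pct):
        # Stull (2011) empirical fit: wet-bulb temperature in deg C from
        # air temperature (deg C) and relative humidity (%), valid roughly
        # for RH between 5% and 99% at sea-level pressure.
        t, rh = temp_c, rh_pct
        return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
                + math.atan(t + rh) - math.atan(rh - 1.676331)
                + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
                - 4.686035)

    # 40 deg C air at 50% relative humidity already approaches the
    # 32 deg C wet bulb threshold for unsafe labor cited in the study.
    print(round(wet_bulb_stull(40.0, 50.0), 1))  # ~30.9

The example shows why humid heat is so dangerous: air well below the 35 degree survivability limit in dry-bulb terms can approach unsafe wet bulb values once humidity is accounted for.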
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113450.htm
|
Female salmon are dying at higher rates than male salmon
|
Female adult sockeye from the Fraser River are dying at significantly higher rates than their male counterparts on the journey back to their spawning grounds, finds new UBC research. For every male salmon that doesn't make it to its natal stream, at least two, and sometimes three, female salmon die.
|
"This is causing skewed sex ratios in their spawning grounds, something that has been observed in recent years," says lead researcher Dr. Scott Hinch, a professor in the faculty of forestry and head of the Pacific Salmon Ecology and Conservation Laboratory at UBC. "The implications on the health of Fraser River stocks are concerning, particularly as Pacific salmon populations in British Columbia have been declining over the past several decades."Hinch noted that records in the 1930s and even up to the early 1990s show that for most years, females outnumbered males on spawning grounds. The sex ratios started to change in the early 2000s towards relatively fewer females."A combination of environmental stressors could have triggered the shift," he explains. "More females die relative to males when migration conditions are challenging. This happens when the water is too warm, or there is too much turbulence, or when the fish have been handled or released from capture. Stressful events have a larger impact on females."The trend of higher female mortality when environmental conditions are challenging also was identified in other Pacific salmon species including Coho and Chinook salmon, and in sockeye in other river systems.Hinch and his collaborators came up with their finding after reviewing 19 major studies on salmon, including tagging and tracking studies in the field, and laboratory studies. They are proposing four reasons why females are dying at higher rates than males in the studies they reviewed: depletion of energy reserves, reduced cardiac capacity, stress and disease."Females have higher heart rates and smaller hearts than males leading to reduced cardiac capacity. Because female gonads are so large compared to males, they have to divert way more blood to them especially as the eggs are developing and this requires even more oxygen supply from the heart, so it's likely that when the migration is difficult, females are not able to get enough oxygen to swim."Sockeye and other Pacific salmon don't feed during their river migration and the females, more so than the males, can also run out of stored energy reserves earlier. "Females are also more susceptible to stress, and to pathogens, so a combination of factors is likely causing their higher mortality."In 2019, the Committee on the Status of Endangered Wildlife in Canada (COSEWIC), an independent committee of wildlife experts and scientists, designated 24 salmon populations in southern B.C. as threatened or endangered, including several of the sockeye populations that Hinch and his colleagues studied."The conservation and management implications of our findings are significant," says Hinch. "Pacific salmon stocks are important ecologically, but also culturally and as food security to First Nations, and for commercial and recreational fisheries. Salmon fishing in British Columbia supports more than 8,000 jobs and generates over $200 million in tax revenues annually, while commercial fisheries bring in up to $200 million a year. Recreational fishing contributes almost $1 billion in economic impact each year."Hinch and his team are recommending actions like adjusting harvest rates to protect female salmon, and ensuring migration routes have fewer obstacles to ensure females are able to complete their migrations. 
This is a particularly large issue at present as a fishway is now being built at the site of the Fraser River Big Bar landslide, an area that has impeded spawning migrations for the past two years."A few years ago we studied the migration of sockeye salmon through the Seton River Dam and Fishway near Lillooet, B.C., and proposed that some small adjustments by BC Hydro to flows at this dam could improve salmon passage, which they did. Female salmon benefitted the most, showing us that basic research can be used to fine-tune our management actions to improve the survival of female salmon."Hinch and his team acknowledge that more research is needed to fully understand the mechanisms behind the different mortality rates in salmon. But they warn that as rivers continue to warm with climate change, we will see even higher rates of female mortality.
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113425.htm
|
Tiny currents may impact vital ocean food source
|
Copepods are tiny crustaceans about the size of a grain of rice, but they are one of the most important parts of the Earth's aquatic ecosystems. Their behavior and interaction with the environment, however, remain a relative mystery. Now, a recent paper reports that copepods gather around tiny swirling currents in the ocean, a finding with significant implications for the food web.
|
Researchers from Bigelow Laboratory for Ocean Sciences and the Georgia Institute of Technology found that copepods gather around small vortexes in the ocean, a finding which could have significant implications for the food web.

"We're getting at a mechanism that helps us understand how the ecosystem works," said Bigelow Laboratory Senior Research Scientist David Fields, a co-author on the paper. "These vortexes influence the behavior of copepods in a way that allows other animals in the food web to survive."

Copepods can be found in almost every freshwater and saltwater body in the world. If you were to take all copepods and put them together, their weight would be equivalent to about a trillion humans. Their abundance makes them critical to ocean health, where they serve as a cornerstone of the ocean food web and play an important role in global cycles.

"That many organisms breathing oxygen, eating phytoplankton, and producing waste is a major driver in how the ocean carbon cycle works," Fields said. "Despite their tiny individual size, they have a huge impact on the ecosystem."

Abundance alone, however, is not enough to make them such a vital food source for marine life, from baby fish to right whales. Although plentiful, they are dispersed in the mind-bogglingly vast ocean. Fortunately for predators, copepods group together. Exactly where and why they do so has been a challenge for scientists to identify.

The newly published study suggests one gathering place is swirling ocean currents less than an inch in diameter. The scientists teamed up with engineers and developed a new type of instrument that can replicate these vortexes and allow for control over their size and speed. The researchers discovered that the copepods could not only detect the vortexes but actually aggregate around them. The finding could be significant for understanding copepods, and the new ability to create these tiny vortexes in a laboratory may enable scientists to study ocean food webs in a new light.

"We've come up with an explanation for why these animals aggregate in what we like to think of as this well-mixed, homogeneous ocean," Fields said.

These small vortexes have always been difficult to study in the field because of their scale, their ephemeral nature, and how much other turbulence exists in the ocean. However, previous research using mathematical models has suggested these processes could explain a number of phenomena, such as the behavior of marine organisms and nutrients mixing up from the deep ocean.

"People use this kind of concept to explain a lot about how ocean processes work, but nobody's really ever seen it," Fields said. "Until recently, you couldn't hold onto them long enough to study because they just pop up and disappear within seconds to minutes."

Researchers have previously observed some interactions between copepods and turbulent water. However, this study was the first to examine the interaction of individual copepods with a single vortex, which opens up new possibilities for understanding these vital organisms.

"These tiny vortexes are happening everywhere in the ocean, but we've never gotten the chance to really look at them," Fields said. "Now, we can create one of those little vortexes that live out in nature and hold it in the laboratory so we can analyze it in detail."
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113416.htm
|
Floating solar farms could help reduce impacts of climate change on lakes and reservoirs
|
Floating solar farms could help to protect lakes and reservoirs from some of the harms of climate change, a new study suggests.
|
However, given the complex nature of water bodies and the differing designs of solar technologies, deploying floating solar arrays could also have detrimental ecosystem impacts.

Conventional solar farms are controversial due to the amount of land they take up. This is leading to increasing interest in floating solar farms, which make use of the additional space that bodies of water provide. So far, there are three commercial-size floating solar arrays in the UK, and hundreds more across the world. The number of installations is likely to grow significantly in coming decades as demand rises for renewable energy sources and more countries commit to net zero carbon targets.

However, until now little has been known about the impacts, both positive and negative, that these floating solar farms have on the lakes and reservoirs where they are installed. Scientists from Lancaster University and the University of Stirling have completed the first detailed modelling of the environmental effects of floating solar installations on bodies of water.

"As demand for land increases, water bodies are increasingly being targeted for renewable energy. Deployment of solar on water increases electricity production, but it is critical to know if there will be any positive or negative environmental consequences," said Mr Giles Exley, PhD researcher and lead author from Lancaster University. "Given the relative immaturity of floating solar farms, it is important to further the scientific evidence of the impacts. Our results provide initial insight into the key effects that will help inform the decisions of water body managers and policy makers."

The research team undertook computer modelling using the MyLake simulation programme and data collected by the UK's Centre for Ecology and Hydrology from England's largest lake, Windermere. Although the researchers believe it is unlikely floating solar farms will be deployed on Windermere, it presents a rich data-set as it is one of the most comprehensively studied lakes in the world.

Their results show that floating solar arrays can cool water temperatures by shading the water from the sun. At scale, this could help to mitigate harmful effects caused by global warming, such as blooms of toxic blue-green algae and increased water evaporation, which could threaten water supply in some regions.

The scientists found that floating solar installations also reduce the duration of 'stratification', where the sun heats the water and forms distinct layers of water at different temperatures. This tends to happen more in the warmer summer months and can result in the bottom layer of water becoming deoxygenated, which deteriorates water quality -- an obvious issue for supplies of drinking water. However, the picture is complex, and there are also conditions under which stratification, and therefore detrimental water quality impacts, could increase if floating solar farms are deployed.

Mr Exley said: "The effects of floating solar on the temperature of the water body and stratification, both of which are major drivers of biological and chemical processes, could be comparable in magnitude to the changes lakes will experience with climate change. Floating solar could help to mitigate the negative effects global warming will have on these bodies of water. However, there are also real risks of detrimental impacts, such as deoxygenation causing undesirable increases in nutrient concentrations and killing fish. We need to do more research to understand the likelihood of both positive and negative impacts."

The effects on water temperature increased with the size of the solar installation; small arrays covering less than ten per cent of the lake surface generally had minimal impacts. However, this model concentrated on one lake. Further studies will be needed to determine the optimum array size and design, and their effects, for individual lakes and reservoirs, all of which have unique characteristics. Different designs of solar installations also have different shading and sheltering effects for the sun and wind.

Arrays covering more than 90 per cent of a lake could increase the chances of the lake freezing over in winter, the study found, though these effects would also be specific to the body of water and the design of the installation, and require further study. Field studies and further modelling work to build on these initial findings are ongoing.
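As a toy illustration of the shading mechanism described above (and not the MyLake model the researchers actually used), one can scale the shortwave radiation reaching the water by the uncovered fraction of the lake surface. The numbers below are hypothetical:

    def surface_shortwave(incident_w_m2, array_coverage):
        # Crude estimate of the mean shortwave flux (W/m2) reaching the
        # water, assuming the array blocks all sunlight over the fraction
        # of the surface it covers and transmits nothing beneath panels.
        if not 0.0 <= array_coverage <= 1.0:
            raise ValueError("coverage must be a fraction between 0 and 1")
        return incident_w_m2 * (1.0 - array_coverage)

    # A 10% array trims a 200 W/m2 summer mean by only 20 W/m2, consistent
    # with the finding that small arrays generally have minimal impacts.
    print(surface_shortwave(200.0, 0.10))  # 180.0

Real arrays also absorb and re-emit heat and alter wind-driven mixing, which is why the study's full energy-balance modelling can produce more complex outcomes than this simple scaling suggests.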
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113359.htm
|
How grasslands respond to climate change
|
"Based on field experiments with increased carbon dioxide concentration, artificial warming, and modified water supply, scientists understand quite well how future climate change will affect grassland vegetation. Such knowledge is largely missing for effects that already occurred in the last century," says Hans Schnyder, Professor of Grassland at the TUM.
|
Based on the Park Grass Experiment at Rothamsted, researchers have now shown that effects of climate change on the nutrient status of grassland vegetation that were predicted for the future have already taken hold over the last century.

Since 1856, research at Rothamsted has been testing the effects of different fertilizer applications on the yield performance and botanical composition of hay meadows. Harvested material has been archived since the experiment began. This material is now available to researchers for studies of vegetation nutrient status and of the carbon and oxygen isotope composition of biomass.

A key driver is the increase in atmospheric CO2 concentration since the beginning of industrialization. Plants control how far their stomata, small pores in the leaf epidermis, open to optimize the balance between carbon dioxide uptake (photosynthesis) and water loss (transpiration). With increased CO2, plants can take up the same amount of carbon through narrower stomatal openings, which reduces stomatal conductance and water loss.

Combining the new analyses of oxygen and carbon isotope composition, nitrogen and phosphorus in biomass, and yield and climate data, the research team, led by Professor Schnyder, analyzed the physiological effects of the emission-related increase in CO2. They found that, in particular, the grass-rich communities that were heavily fertilized with nitrogen experienced a deterioration in their nitrogen nutrition status. Climate change also resulted in greatly reduced stomatal conductance (now detectable with the new research methods) and significantly reduced yields. The core element of the researchers' observations is the hypersensitive response of the grasses' stomata to rising CO2.

"We also observed that fields that were heavily fertilized with nitrogen, and therefore rich in grass, largely lost their yield superiority over forb- and legume-rich fields that were either less fertilized or completely unfertilized with nitrogen, despite being otherwise equally supplied with nutrients, over the course of the last century," says the first author of the study, Juan Baca Cabrera, who is pursuing a doctorate at TUM's Chair of Grassland.

In the researchers' view, the results indicate that restraining the nitrogen supply to grasslands in the future would enhance the yield contribution from forbs and legumes while at the same time helping to limit nitrogen emissions to the environment. Professor Schnyder states, "Our findings are important for understanding the importance of grasses in earth systems and provide guidance for sustainable future grassland use."
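The carbon isotope approach mentioned above has a standard quantitative backbone: the widely used Farquhar et al. discrimination model links leaf carbon isotope discrimination to the ratio of internal to ambient CO2, from which intrinsic water-use efficiency can be derived. The Python sketch below illustrates that textbook relation with hypothetical numbers; it is not the analysis pipeline used in the study:

    def ci_over_ca(delta_permil, a=4.4, b=27.0):
        # Invert the simple Farquhar et al. discrimination model for C3
        # plants, Delta = a + (b - a) * ci/ca, where a and b (permil) are
        # fractionations from diffusion and carboxylation, respectively.
        return (delta_permil - a) / (b - a)

    def intrinsic_wue(delta_permil, ca_ppm):
        # Intrinsic water-use efficiency A/gs (micromol CO2 per mol H2O):
        # iWUE = ca * (1 - ci/ca) / 1.6, where 1.6 is the ratio of H2O to
        # CO2 diffusivities through stomata.
        return ca_ppm * (1.0 - ci_over_ca(delta_permil)) / 1.6

    # The same leaf discrimination of 20 permil implies a much higher iWUE
    # at modern CO2 (~410 ppm) than at pre-industrial CO2 (~285 ppm):
    for ca in (285.0, 410.0):
        print(ca, round(intrinsic_wue(20.0, ca), 1))  # 55.2, then 79.4

This is why archived biomass is so valuable: isotope measurements on century-old hay allow physiological quantities such as stomatal conductance to be reconstructed retrospectively.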
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113345.htm
|
Dow-like index for energy prices might help smooth transition to clean power
|
Since the early industrial revolution in the mid-1700s, fossil fuels have acquired an ever-growing footprint in energy production. However, environmental concerns over fossil fuel use and the inevitable depletion of these fuels have led to a global shift toward renewable energy sources. This transition raises questions about the best choice of renewables and about the impact of investing in these resources on consumer cost.
|
In a recent study published in a peer-reviewed journal, researchers at Texas A&M University developed a single metric, an energy price index, that tracks the price of energy across the entire energy landscape on a monthly basis.

"Energy is affected by all kinds of events, including political developments, technological breakthroughs and other happenings going on at a global scale," said Stefanos Baratsas, a graduate student in the Artie McFerrin Department of Chemical Engineering at Texas A&M and the lead author on the study. "It's crucial to understand the price of energy across the energy landscape, along with its supply and demand. We came up with one number that reflects exactly that. In other words, our metric monitors the price of energy as a whole on a monthly basis."

Today, the energy industry is largely dominated by fossil fuels, like coal, natural gas and petroleum. An increase in fossil fuel consumption, particularly in the last few decades, has raised growing concerns about their environmental impact. Most notably, the Intergovernmental Panel on Climate Change has reported an estimated increase of 0.2 degrees Celsius per decade in global temperature, which is directly linked to burning fossil fuels.

But only around an 11% share of the total energy landscape comes from renewable sources. Although many countries, including the United States, have committed to using more renewable energy sources, there isn't a way to quantitatively and accurately measure the price of energy as a whole. For example, an establishment might use a combination of solar and fossil fuels for various purposes, including heating, power and transportation. In this case, it is unclear how the price would change if a higher tax on fossil fuels were imposed or subsidies in favor of renewables were introduced.

"Energy transition is a complex process and there is no magic button that one can press and suddenly transition from almost 80% carbon-based energy to 0%," said Dr. Stratos Pistikopoulos, director of the Texas A&M Energy Institute and senior author on the study. "We need to navigate this energy landscape from where we are now, toward the future in steps. For that, we need to know the consolidated price of energy for end users. But we don't have an answer to this fundamental question."

To address this research gap, the researchers first identified different energy feedstocks, such as crude oil, wind, solar and biomass, and their energy products. So, for example, crude oil's energy products are petrol, gasoline and diesel. Next, they categorized the energy end users as residential, commercial, industrial or transportation. Further, they obtained information from the United States Energy Information Administration on which energy products, and how much of each, are consumed by each type of user. Last, they identified the supply chains that connect the energy products to consumers. All this information was used to calculate the average price of energy, called the energy price index, for a given month, and to forecast energy prices and demands for future months.

As a potential real-world use of this metric, the researchers explored two policy case studies. In the first scenario, they studied how the energy price index would change if a tax on crude oil was imposed. One of their main findings upon tracking the energy price index was that around $148 billion could be generated in four years for every $5-per-barrel increase in crude oil tax. Also, this tax would not significantly increase the monthly cost of energy for U.S. households. In the second case study, which explored the effect of subsidies on the production of electricity from renewable energy sources, they found that these policies can cause a dip in energy prices even with no tax credit.

Baratsas said their approach offers a way to optimize policies at the state, regional and national levels for a smooth and efficient transition to clean energy. Further, he noted that their metric could adapt or self-correct its forecasting of energy demands and prices in the event of sudden, unforeseen situations, like the COVID-19 pandemic, that may trigger a drastic decrease in the demand for energy products.

"This metric can help guide lawmakers, government or non-government organizations and policymakers on whether, say, a particular tax policy or the impact of a technological advance is good or bad, and by how much," said Pistikopoulos. "We now have a quantitative and accurate, predictive metric to navigate the evolving energy landscape, and that's the real value of the index."
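At its core, an index of this kind is a consumption-weighted average price. The minimal Python sketch below conveys only that idea; the study's actual index also models supply chains and forecasts demand, and the product names, prices and consumption figures here are hypothetical:

    # product: (price in $ per million BTU, monthly consumption in million BTU)
    monthly_data = {
        "gasoline":    (18.0, 1.2e9),
        "natural gas": ( 9.0, 2.5e9),
        "electricity": (35.0, 1.1e9),
    }

    def energy_price_index(data):
        # Average price of delivered energy, weighted by how much of each
        # product end users actually consumed that month.
        total_spend = sum(price * qty for price, qty in data.values())
        total_energy = sum(qty for _, qty in data.values())
        return total_spend / total_energy

    print(round(energy_price_index(monthly_data), 2))  # 17.21 $/million BTU

Weighting by actual consumption is what makes such an index respond to a policy change: a tax that raises one product's price moves the index in proportion to how much of that product end users buy.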
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324113342.htm
|
Waste from making purple corn chips yields a natural dye, supplements, kitty litter
|
The more colorful a food, the more nutritious it probably is. For example, purple corn contains compounds associated with a reduced risk of developing diabetes and heart disease. The cobs contain the same compounds but are typically thrown out. Now, researchers report a step-wise biorefinery approach that turns purple corncobs into a natural dye, potential nutraceutical supplements and an absorbent animal litter, leaving no waste behind.
|
Eating a rainbow of fruits and vegetables provides a variety of health benefits, with vitamins and nutrients stored within the plants' color-producing compounds. One group of compounds contributing distinct hues to food is the anthocyanins, vibrant pigments desired as natural dyes that also have antioxidant and anti-inflammatory properties. Anthocyanins are found in purple corn's kernels and corncobs, though the cobs are typically discarded. Past attempts at repurposing cobs have involved harmful and expensive solvents to extract the compounds. Water could be used as an eco-friendly and cost-effective agent for this process, but it is not very efficient, and the insoluble cob material is still left over as waste. So, Fabrizio Adani, Roberto Pilu, Patrizia De Nisi and colleagues wanted to extract beneficial pigments from purple corncobs with a multi-step approach that makes many value-added products, while also closing the loop with zero waste at the end.

The researchers devised a biorefinery approach to extract anthocyanins from a new variety of purple corn they developed. First, ground-up corncobs and water were mixed and heated, removing 36% of the pigments compared to methods with acetone and ethanol solvents. The pigments from this step were used to dye cotton and wool fabrics. In the next step of the biorefinery method, the researchers removed an additional 33% of the anthocyanin content from the water-treated cobs with an ethanol mixture. These extracts showed antioxidant activity and anti-inflammatory properties in cells in petri dishes and could be used in the future to develop nutraceutical supplements, the researchers say.

Finally, the team found that the remaining insoluble purple grounds were similar to commercial corncob animal litter. In tests, the residual cob material was even more absorbent than the commercial product. And because the material still contains anthocyanins, which have antimicrobial activity, the purple litter could fight bacteria and reduce odors, the researchers say. Used purple corncob litter could also be composted along with other organic matter, resulting in no waste, they explain.
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324094708.htm
|
Once-in-a-century UK wildfire threats could happen most years by end of century
|
Extremely hot and dry conditions that currently put parts of the UK in the most severe danger of wildfires once a century could happen every other year in a few decades' time due to climate change, new research has revealed.
|
A study, led by the University of Reading, predicting how the danger of wildfires will increase in future, showed that parts of eastern and southern England may be at the very highest danger level on nearly four days per year on average by 2080 under high emissions, compared to once every 50-100 years currently.

Wildfires need a source of ignition, which is difficult to predict, so wildfire risk is typically measured by the likelihood that a fire would develop after a spark of ignition. This fire danger is affected by weather conditions. As temperatures rise and summer rainfall decreases, conditions highly conducive to wildfire could be nearly five times more common in some regions of the UK by the latter part of the century. In the driest regions, this could put habitats at risk for up to four months per year on average, the scientists found.

Professor Nigel Arnell, a climate scientist at the University of Reading who led the research, said: "Extremely hot and dry conditions that are perfect for large wildfires are currently rare in the UK, but climate change will make them more and more common. In future decades, wildfires could pose as much of a threat to the UK as they currently do in the south of France or parts of Australia. This increased fire danger will threaten wildlife and the environment, as well as lives and property, yet it is currently underestimated as a threat in many parts of the UK. This research highlights the growing importance of taking the threat of wildfires seriously in the UK, as they are likely to be an increasing problem in future."

In the new study, the researchers projected how often weather conditions conducive to dangerous fires will occur across the UK in future decades. They found the average number of 'very high' danger days each year will increase significantly in all parts of the UK by 2080. Excluding London, southern and eastern England were predicted to be worst affected, with the average number of danger days more than quadrupling, up to 111 days in the South East and 121 days in the East of England on average.

Significant increases by 2080 were also seen in the West Midlands (from 13 up to 96 days). Even traditionally wet parts of the UK would dry out for longer, leaving them vulnerable to severe fires for several weeks on average each year, including Wales (5 to 53 days), Northern Ireland (2 to 20) and West Scotland (3 to 16).

'Exceptional danger' days, currently extremely rare across the UK, were found to become more commonplace by 2080, with the East of England (0.02 to 3.55 days per year), East Midlands (0.03 to 3.23), South East (0.01 to 1.88), and Yorkshire and Humberside (0.01 to 1.55) all seeing large increases.

The research showed that the projected increase in fire danger is predominantly due to the hotter temperatures, reduced rainfall, lower humidity and stronger winds expected across the UK in future decades due to climate change.

Wildfires pose environmental, health and economic risks. Although the UK records tens of thousands of fires each year, these are almost all very small, especially in comparison to those in countries and regions like Australia and California, which have the kinds of hotter, drier climates forecast for the UK in future decades. Although the UK has experienced very low losses from wildfires so far, up to £15m is estimated to be spent each year tackling them. There is no coordinated strategy for wildfire in England, only a voluntary forum which does not have powers to set standards or guidance.

Notable examples of wildfires in the UK are the Swinley Forest fire on the Surrey/Berkshire border in May 2011, which threatened critical infrastructure; the Saddleworth Moor fire in the Peak District in May 2018 and the Wanstead Flats fire in London in July 2018, both of which led to residents being evacuated; residential and commercial property loss in Marlow, Buckinghamshire, in July 2018; and an extensive fire in Moray, Scotland, in April 2019 that endangered an onshore wind farm.

While natural weather and climate conditions directly affect the 'danger' of a wildfire becoming established, the 'risk' of a wildfire occurring often depends on deliberate or accidental human acts. This study therefore does not indicate how likely wildfires are to occur, only their likely severity if one did occur.
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324094656.htm
|
Extreme temperatures, heat stress and forced migration
|
The study, building on cooperation between climate scientists from the MENA region, aimed to assess emerging heatwave characteristics. The research team used a first-of-its-kind multi-model ensemble of climate projections designed exclusively for the geographic area; such detailed downscaling studies had been lacking for this region. The researchers then projected future hot spells and characterised them with the Heat Wave Magnitude Index. The good match among the model results, and between models and observations, indicates a high level of confidence in the heat wave projections.
|
"Our results for a business-as-usual pathway indicate that especially in the second half of this century unprecedented super- and ultra-extreme heatwaves will emerge," explains George Zittis of The Cyprus Institute, first author of the study. These events will involve excessively high temperatures of up to 56 degrees Celsius and higher in urban settings and could last for multiple weeks, being potentially life-threatening for humans and animals. In the second half of the century, about half of the MENA population or approximately 600 million people could be exposed to such annually recurring extreme weather conditions."Vulnerable citizens may not have the means to adapt to such harsh environmental conditions," adds Jos Lelieveld, Director at the Max Planck Institute for Chemistry and leading the research team. "These heat waves combined with regional economic, political, social and demographic drivers have a high potential to cause massive, forced migration to cooler regions in the north."To avoid such extreme heat events in the region, the scientists recommend immediate and effective climate change mitigation measures. "Such measures include drastic decreases of the emissions of greenhouse gases such as carbon dioxide and methane into the atmosphere, but also adaptation solutions for the cities in the area," says Lelieveld. It is expected that in the next 50 years, almost 90 percent of the exposed population in the MENA will live in urban centers, which will need to cope with these societally disruptive weather conditions. "There is an urgent need to make the cities more resilient to climate change," emphasizes Zittis.
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324094654.htm
|
Electric cars: Recharge your batteries
|
Perhaps the most frustrating limitation of owning an all-electric car is how long it takes to fully charge the battery. For a Tesla, for example, it takes about 40 minutes to charge it to 80% capacity using the most powerful charging station.
|
Scientists have long thought the laws of physics limited how fast you could safely recharge a battery, but new research by University of Utah chemical engineering assistant professor Tao Gao has opened the door to creating a battery that can be recharged in just a fraction of the time. Gao's research was detailed in a newly published paper.

"This understanding lays the foundation for the future engineering work needed to address this challenge," says Gao. "Now we know where to go. We have a clear vision of what needs to be done."

Lithium-ion batteries have become a popular choice for portable electronics and all-electric vehicles because of their high energy density, low weight and long life. They are also used in laptop computers, portable electric appliances and solar energy storage.

But how quickly a lithium-ion battery can recharge is hampered by a phenomenon known as "lithium plating," a side reaction that happens when lithium ions are put into graphite particles too fast. Gao compares the operation of a lithium-ion battery to a ping pong ball being batted back and forth on a table. The ball, or lithium ion, travels from the positive electrode to the negative electrode during the charging process. The charging rate is similar to how fast the ball travels. Lithium plating occurs when the lithium ion moves too fast and the graphite particles in the battery fail to catch it, Gao explains. During charging, this can be hazardous and cause the battery to catch fire or explode, which limits how quickly batteries can be recharged. It can also seriously degrade the battery, limiting its life.

Gao's discovery reveals the important physics that governs the lithium plating phenomenon in graphite particles during battery charging and enables the prediction of lithium plating in the operation of a battery.

"We designed an experiment that can visualize what happens to the negative electrode during charging. We can see the graphite particle -- the material in the negative electrode -- and we can see what happens during battery charging in real time," he says. "Now we understand the physics. This provides us direction to address this limitation and improve battery charging performance."

Gao believes that with this fresh understanding, new technologies could create a car battery that could be fully charged five times faster than normal, or in just over 10 minutes, without the risk of a hazard or of degrading too quickly. Smartphones, which typically take more than half an hour with the fastest charger, could also be fully charged in just 10 minutes, he says.

Now that Gao and his co-researchers have a better grasp of the science behind lithium-ion charging, he believes we could see cell phones with better batteries in as little as three to five years, and in all-electric cars in as little as five to 10 years.
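As a back-of-the-envelope check on the figures in this article, treating charge time as inversely proportional to charging rate relates the 40-minute fast-charge baseline to the projected speedup. This is a simplification (real chargers taper current near full charge), and the numbers are just those cited above:

    def fast_charge_minutes(baseline_minutes, speedup):
        # Charge time if the plating-limited charging rate can safely be
        # raised speedup-fold, assuming time scales inversely with rate
        # (a simplification: real chargers taper current near full charge).
        return baseline_minutes / speedup

    # 40 minutes to 80% today, divided by the five-fold improvement Gao
    # suggests is possible:
    print(fast_charge_minutes(40.0, 5.0))  # 8.0 minutes to 80%

An 80% charge in about 8 minutes is consistent with the article's estimate of a full charge in just over 10 minutes, since charging slows as the battery approaches capacity.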
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324094651.htm
|
Bees form scent-driven phone tree to pass along messages
|
Honeybees play a scent-driven game of telephone to guide members of a colony back to their queen, according to a new study led by the University of Colorado Boulder. The research, published today, shows how bees relay the queen's pheromone signal from insect to insect across a swarm.
|
The findings also serve as a testament to a honeybee's love for its queen. These matriarchs are the most important members of any hive: they're the only females able to reproduce. Queens, like other members of a colony, communicate using pheromones, small and odorous molecules that bees produce through special glands.

"It's very important for the bees to know where the queen is and to stay close to her," said study author Orit Peleg, an assistant professor in the BioFrontiers Institute and Department of Computer Science at CU Boulder.

Pheromones, which are too small for scientists to observe directly, can only travel so far before they dissipate into the air. So bees get creative to pass the messages along. Drawing on experiments with live bees and computer simulations, or models, Peleg and her colleagues discovered that when a queen starts sending out pheromones, nearby insects take note. They stop what they're doing, start making their own pheromones, then transmit those scents to friendly bees that are farther away.

Peleg added that the results could one day help engineers to design more efficient telecommunications networks -- for humans. "There are many examples of animals, like ants, who lay pheromones in their environments," Peleg said. "But those pheromones just disperse passively by the laws of physics. Here, the bees are actively directing that signal."

That conclusion, she added, came about from a chance observation. During a previous study, Peleg and her colleagues were tracking how honeybees form giant swarms, undulating blobs made up of thousands to as many as 100,000 bees. In the process, the researchers spotted something strange. As the honeybees in their experiments gathered around a queen to build a swarm, large numbers of them began to engage in what scientists call the "scenting" behavior: they stuck their hind ends into the air and beat their wings furiously.

"When they fan their wings, they're drawing air over their pheromone glands, blowing those molecules away," Peleg said.

She and her team wanted to know what was behind this insect version of twerking. To do that, the group set up a video camera in an arena and recorded bees in the process of forming a swarm. The researchers then analyzed that footage using machine learning tools that automatically tracked the locations and orientations of the bees in a colony. The team discovered the bees didn't seem to spread their scent randomly.

"The signal is broadcasted in a particular direction, and that direction tends to be away from the queen," Peleg said.

Picture an insect phone tree: the bees closest to the queen catch a whiff of her smell molecules, then blow their own pheromones to the bees behind them. That next layer of bees passes the message on in turn, and the chain continues until every bee in the colony is in on the secret.

"It almost resembles a telecommunications network where you have antennas that are talking to each other and amplifying the signal so that it can reach farther away," Peleg said.

Dieu My Nguyen, lead author of the study, noted that at the height of this communication frenzy, a hive's messenger bees mostly spaced themselves evenly across the arena. "The distances between the scenting bees were very uniform," said Nguyen, a graduate student in computer science at CU Boulder. "That suggests that there is some sort of concentration threshold over which pheromones are detectable, and that the bees were responding to that."

Peleg, Nguyen and their colleagues say there is still a lot they don't know about how these communication networks work. Do only some bees, for example, transmit messages for the queen, or can all members of a hive sniff and fan when it suits them? For now, the team is happy to get a new whiff of the social lives of these curious insects.

"We got a few bee stings," Nguyen said. "But it was worth it for those nice movies."
|
Environment
| 2,021 |
March 24, 2021
|
https://www.sciencedaily.com/releases/2021/03/210324094648.htm
|
Scientists improve a photosynthetic enzyme by adding fluorophores
|
Given the finite nature of fossil fuel reserves and the devastating environmental impacts of relying on fossil fuels, the development of clean energy sources is among the most pressing challenges facing modern industrial civilization. Solar energy is an attractive clean energy option, but the widescale implementation of solar energy technologies will depend on the development of efficient ways of converting light energy into chemical energy.
|
Like many other research groups, the members of Professor Takehisa Dewa's research team at Nagoya Institute of Technology in Japan have turned to biological photosynthetic apparatuses, which are, in Prof. Dewa's words, both "a source of inspiration and a target to test ways of improving the efficiency of artificial systems." Specifically, they chose to focus on the purple photosynthetic bacterium Rhodopseudomonas palustris, which uses a light-harvesting 1-reaction center core complex (LH1-RC) to both capture light energy and convert it into chemical energy.

In their initial studies of R. palustris, Prof. Dewa's group quickly noted that the LH1-RC system has certain limitations, such as only being able to harvest light energy efficiently within a relatively narrow wavelength band, due to its reliance on (bacterio)chlorophylls in a single light-harvesting organic pigment assembly (B875, named for its absorption maximum). To overcome this limitation, the researchers, in partnership with collaborators at Osaka University and Ritsumeikan University, experimented with covalently linking the LH1-RC system to a set of fluorophores (Alexa647, Alexa680, Alexa750 and ATTO647N). The results of their experiments appear in a recently published paper.

Having synthesized their modified LH1-RC system, Prof. Dewa's team used a method called "femtosecond transient absorption spectroscopy" to confirm the presence of ultrafast "excitation energy" transfer from the fluorophores to the bacteriochlorophyll a pigments in the B875 assembly. They also confirmed the subsequent occurrence of "charge separation" reactions, a key step in energy harvesting. Unsurprisingly, the rate of excitation energy transfer increased with greater spectral overlap between the emission bands of the fluorophores and the absorption band of B875. Attaching the external light-harvesting fluorophores boosted the enzyme's maximum yield of charge separation and its photocurrent generation activity on an electrode within an artificial lipid bilayer system.

By introducing covalently linked fluorophores into a bacterial photosynthetic enzyme, Prof. Dewa's team succeeded in broadening the enzyme's band of harvestable light wavelengths. This is an important improvement, given the extremely low energy density of sunlight. "This finding could pave the way to developing an efficient artificial photosynthesis system for solar energy conversion," notes Prof. Dewa. "Research on biohybrids should provide insights into the development of implementable energy conversion systems, thereby giving advanced modern civilization a practical option for accessing an inexhaustible supply of clean solar energy," he adds.

The energy conversion systems in question may take many forms, including various nanomaterials such as quantum dots and nanocarbon materials, but a unifying feature will be the need to harness a broad-spectrum light-harvesting apparatus to a photocurrent-generating apparatus, and the biohybrid-type system developed by Prof. Dewa's team provides a feasible means of addressing this need.
|
Environment
| 2,021 |
March 23, 2021
|
https://www.sciencedaily.com/releases/2021/03/210323150824.htm
|
Changes in Antarctic marine ecosystems
|
Understanding the evolution of polar sea ice is not enough to study the effects of climate change on marine ecosystems of Antarctic seafloors. It is also necessary to determine the intensity of local phytoplankton production during the Antarctic summer, according to a new study by a research team of the Faculty of Biology and the Biodiversity Research Institute (IRBio) of the University of Barcelona.
|
Extremely low temperatures, strong ocean currents and the broad seasonal coverage of marine ice are factors that determine the features of the Antarctic marine ecosystems. IN particular, the seasonality regarding the ice formation in the marine surface is a process that directly affects the dynamics of the marine ecosystems and the flow of matter and energy in complex Antarctic trophic networks. During the Antarctic winter, the ice and snow that accumulate limit the availability of light, and as a result, this reduces the activity of photosynthetic organisms and the production of krill (basic food resource within the food network in Antarctic marine ecosystems).The main sources of organic carbon in Antarctic ecosystems located at the shallows are phytoplankton, algae that grow under the ice and algae that are stuck in rocks. However, great part of this primary production does not enter the trophic network directly through herbivores, but as detritus (particles of rock). "The presence of ice in the shallows limits the primary production during great part of the year. This determines that benthic trophic networks depend largely on the accumulated organic matter in seafloors during the summer months," notes Lluís Cardona, first author of the article and lecturer at the Department of Evolutionary Biology, Ecology and Environmental Sciences, and the IRBio."To date, we thought this dependence would be acute in areas where the sea surface remains frozen for a longer time, and this would involve a lesser diversity of trophic niches and a shorter and redundant trophic network while we go south," notes the researcher. The study highlights that the intensity of the summer bloom of phytoplankton alters this gradient and therefore the structure of coastal benthic systems is strongly modified wherever the bloom is intense.The study was based on the analysis of C and N stable isotopes to identify the ecological niche -the role of each organism in the structure and function of the ecosystem- of a series of marine species caught in Base Rothera, Cierva Cove, Maxwell Bay, Hope Bay, and Paradise Harbour, in the western side of the Antarctic peninsula, and Southern Shetland Islands. Using the isotopic analyses, experts could verify the great stability of the trophic level of each species but they also detected a considerable geographical variety in the used carbon sources. With the used methodologies in previous studies -in particular, the study of stomach content-, the obtained data provided a high taxonomic resolution but did not offer a complete version of the diet over time, which generated a great disparity in results.According to the conclusions of the study, "where the phytoplankton production is intense, the benthic ecosystem receives lots of organic matter coming from the phytoplankton that becomes the basic source of carbon for benthic species, regardless of the latitude and length of the marine ice. This reduces the importance of benthic algae as a source of carbon, which is however very high since these are protected from herbivores by chemical defenses (repulsive natural products")," notes Lluís Cardona. Therefore, those areas that feature a summer bloom of intense phytoplankton, show a shorter and redundant trophic network, like in the sea surface that remains frozen for months. 
"Therefore, in order to assess the impact of climate change in benthic ecosystems, it is as much important to predict changes in summer production of phytoplankton as to simulate the length the ice will remain in the sea surface," notes the researcher.The Antarctic peninsula is the most affected area by climate change in the white continent. According to current data, in winter, there is a reduction of the length of marine ice in the north, and a movement towards south regarding the species such as the Antarctic krill. Therefore, current conditions registered in the north of the Antarctic peninsula could be a model for the future of the southern-western peninsular areas as long as the summer production of phytoplankton remains the same, experts note.
|
Environment
| 2,021 |
March 23, 2021
|
https://www.sciencedaily.com/releases/2021/03/210323131244.htm
|
Last Ice Age: Precipitation caused maximum advance of Alpine Glaciers
|
The last glacial period, which lasted about 100,000 years, reached its peak about 20,000 to 25,000 years ago: Huge ice sheets covered large parts of northern Europe, North America and northern Asia, some of them kilometres thick, and the sea level was about 125 metres below today's level. The Earth looked very different during this so-called Last Glacial Maximum than it does today. This relatively recent period of the last maximum ice extent has long been of interest to researchers and subject to intensive research.
|
What actually led to this extreme glacier growth, however, has remained unclear until now. Based on findings from special cave deposits in the Obir Caves in Bad Eisenkappel, in the Austrian state of Carinthia, Christoph Spötl, head of the Quaternary Research Group at the Department of Geology at the University of Innsbruck, together with his colleague Gabriella Koltai, made an interesting observation about an interval within the Last Glacial Maximum that lasted about 3,100 years. During this period, the ice volume in the Alps reached its maximum. The data are based on small, inconspicuous crystals, so-called cryogenic cave carbonates (CCC): "These calcite crystals formed when the Obir Caves were ice caves with temperatures just below zero. CCC are reliable indicators of thawing permafrost. These findings mean that, paradoxically, during one of the coldest periods of the last glacial period, the permafrost above these caves slowly warmed up," says Christoph Spötl. Since climate warming can be ruled out at this time, there is only one way for the geologists to explain this phenomenon: "There must have been a major increase in solid precipitation in the Alps between 26,500 and 23,500 years ago. There is no permafrost in places with a stable, thick snow cover." Cold periods are typically also dry, but in the Alpine region this was not the case during this roughly 3,100-year interval. "The largest advance of Alpine glaciers in the entire last glacial period took place during this time interval. Precipitation was the key source for the growth of the ice giants -- and there must have been a lot of it, especially in autumn and early winter, as the CCC show," says Spötl. "A snow cover of about half a metre already has a strong insulating effect; it shields the ground below from the very cold winter air and thus leads to an increased temperature in the subsurface. The permafrost above the Obir Caves gradually thawed at that time. This thermal phenomenon, triggered by the shift from an Arctic-dry to a significantly wetter climate, remained preserved underground in the form of the CCC until today." Since the North Atlantic -- today a major source of precipitation -- was ice-covered in winter at the time, the team assumes a strong southerly flow from the Mediterranean brought the moisture to the Alps, driven by pronounced southerly föhn conditions. "We consider massive snowfall due to this strong southerly flow to be the cause of the growth of glaciers in the Alpine region at the peak of the Last Glacial Maximum. And our data allow us to even pin down the season: autumn and early winter," concludes Christoph Spötl. Cryogenic cave carbonates have long been overlooked, even by experienced speleologists. However, Koltai and Spötl are convinced: "In Austria alone, around 17,500 caves are known, and further discoveries of CCC are only a matter of time. That's why we work closely with speleologists -- in the case of the Obir Caves, with the specialist group for karst and speleology of the Natural Science Association for Carinthia."
|
Environment
| 2,021 |
March 23, 2021
|
https://www.sciencedaily.com/releases/2021/03/210323131219.htm
|
Climate change can destabilize the global soil carbon reservoir
|
The vast reservoir of carbon that is stored in soils probably is more sensitive to destabilization from climate change than has previously been assumed, according to a new study by researchers at WHOI and other institutions.
|
The study found that the biospheric carbon turnover within river basins is vulnerable to future temperature and precipitation perturbations from a changing climate. Although many earlier, fairly localized studies have hinted at soil organic carbon sensitivity to climate change, the new research sampled 36 rivers from around the globe and provides evidence of sensitivity at a global scale. "The study results indicate that at the large ecosystem scale of river basins, soil carbon is sensitive to climate variability," said WHOI researcher Timothy Eglinton, co-lead author of the paper. The public is generally aware that climate change can potentially destabilize and release permafrost carbon into the atmosphere and exacerbate global warming. But the study shows that this is true for the entire soil carbon reservoir, said WHOI researcher Valier Galy, the other co-lead author of the study. The soil carbon reservoir is a key component in keeping the atmosphere in check in terms of how much carbon dioxide is in the air. The amount of carbon stored in terrestrial vegetation and soils is three times greater than the amount the atmosphere holds, and the terrestrial biosphere takes up more than a third of the anthropogenic carbon that is emitted to the atmosphere. To determine the sensitivity of terrestrial carbon to destabilization from climate change, researchers measured the radiocarbon age of specific organic compounds from the mouths of a diverse set of rivers. Those rivers -- including the Amazon, Ganges, Yangtze, Congo, Danube and Mississippi -- account for a large fraction of the global discharge of water, sediments and carbon from rivers to the oceans. Terrestrial carbon, however, is not so simple to isolate and measure. That's because carbon in rivers comes from a variety of sources -- including rocks and organic contaminants such as domestic sewage or petroleum, which differ widely in their age -- as well as vegetation. To determine what's happening within the rivers' watersheds, and to measure radiocarbon from the terrestrial biosphere, researchers focused on two groups of compounds: the waxes that serve a protective function on plant leaf surfaces, and lignin, the woody "scaffolding" of land plants. These measurements showed a relationship between the age of the terrestrial carbon in the rivers and the latitude where the rivers reside. That latitudinal relationship prompted researchers to infer that climate must be a key control on the age of the carbon that is exported from the terrestrial biosphere to these rivers, and that temperature and precipitation are primary controls on the age of that carbon. "Why this study is powerful is because this large number of rivers, the wide coverage, and the wide range of catchment properties give a very clear picture of what's happening at the global scale," said Galy. "You could imagine that by going after lots of rivers, we would have ended up with a very complicated story. However, as we kept adding new river systems to the study, the story was fairly consistent." "In many respects, Earth scientists see rivers as a source signal that is sent to sedimentary records that we can interpret," said Eglinton. "By going to sedimentary records, we have the opportunity to look at how the terrestrial biosphere has responded to climate variability in the past. In addition, by monitoring rivers in the present day, we can also use them as sentinels in order to assess how these watersheds may be changing."
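For readers unfamiliar with radiocarbon dating of compounds such as leaf waxes and lignin, the short sketch below shows the conventional relation between a sample's measured 14C content (expressed as fraction modern) and its radiocarbon age. This is standard radiocarbon arithmetic, not the authors' analysis, and the example value is invented.

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; defines the conventional radiocarbon age

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age (in 14C years) from the measured 14C
    content of a sample, expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical example: a leaf-wax compound retaining 88% of modern 14C
# dates to roughly a thousand 14C years; older carbon retains less 14C.
print(f"{radiocarbon_age(0.88):.0f} 14C years")  # ~1027
```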
|
Environment
| 2,021 |
March 23, 2021
|
https://www.sciencedaily.com/releases/2021/03/210323131216.htm
|
Sea-level rise in 20th century was fastest in 2,000 years along much of East Coast
|
The rate of sea-level rise in the 20th century along much of the U.S. Atlantic coast was the fastest in 2,000 years, and southern New Jersey had the fastest rates, according to a Rutgers-led study.
|
The global rise in sea level from melting ice and warming oceans from 1900 to 2000 led to a rate that's more than twice the average for the years 0 to 1800 -- the most significant change, according to the study. The study for the first time looked at the phenomena that contributed to sea-level change over 2,000 years at six sites along the coast (in Connecticut, New York City, New Jersey and North Carolina), using a sea-level budget. A budget enhances understanding of the processes driving sea-level change. The processes are global, regional (including geological processes, such as land subsidence) and local, such as groundwater withdrawal. "Having a thorough understanding of sea-level change at sites over the long term is imperative for regional and local planning and responding to future sea-level rise," said lead author Jennifer S. Walker, a postdoctoral associate in the Department of Earth and Planetary Sciences in the School of Arts and Sciences at Rutgers University-New Brunswick. "By learning how different processes vary over time and contribute to sea-level change, we can more accurately estimate future contributions at specific sites." Sea-level rise stemming from climate change threatens to permanently inundate low-lying islands, cities and lands. It also heightens their vulnerability to flooding and damage from coastal and other storms. Most sea-level budget studies are global and limited to the 20th and 21st centuries. Rutgers-led researchers estimated sea-level budgets for longer timeframes over 2,000 years. The goal was to better understand how the processes driving sea level have changed and could shape future change, and this sea-level budget method could be applied to other sites around the world. Using a statistical model, scientists developed sea-level budgets for six sites, dividing sea-level records into global, regional and local components. They found that regional land subsidence -- sinking of the land since the Laurentide ice sheet retreated thousands of years ago -- dominates each site's budget over the last 2,000 years. Other regional factors, such as ocean dynamics, and site-specific local processes, such as groundwater withdrawal that helps cause land to sink, contribute much less to each budget and vary over time and by location. The total rate of sea-level rise for each of the six sites in the 20th century (ranging from 2.6 to 3.6 millimeters per year, or about 1 to 1.4 inches per decade) was the fastest in 2,000 years. Southern New Jersey had the fastest rates over the 2,000-year period: 1.6 millimeters a year (about 0.63 inches per decade) at Edwin Forsythe National Wildlife Refuge, Leeds Point, in Atlantic County and 1.5 millimeters a year (about 0.6 inches per decade) at Cape May Court House, Cape May County. Other sites included East River Marsh in Guilford, Connecticut; Pelham Bay, The Bronx, New York City; Cheesequake State Park in Old Bridge, New Jersey; and Roanoke Island in North Carolina.
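The budget idea described above can be illustrated with a minimal calculation: treat the observed rate at a site as the sum of global, regional and local components and solve for the residual. The rates below are hypothetical, not the study's estimates; only the unit conversion matches the figures quoted in the article.

```python
# Illustrative sketch of a site-level sea-level budget: observed rate =
# global + regional (e.g., glacial land subsidence) + local (e.g.,
# groundwater withdrawal). All component rates are invented.

MM_PER_YR_TO_IN_PER_DECADE = 10 / 25.4  # 10 mm per decade per (mm/yr)

def local_residual(observed, global_component, regional_component):
    """Local component left over once global and regional parts are removed."""
    return observed - global_component - regional_component

observed_rate = 3.1        # mm/yr, hypothetical 20th-century site rate
global_rate = 1.4          # mm/yr, hypothetical global (ice melt + warming)
regional_subsidence = 1.3  # mm/yr, hypothetical land sinking

local = local_residual(observed_rate, global_rate, regional_subsidence)
print(f"local component ~ {local:.1f} mm/yr")
print(f"site rate ~ {observed_rate * MM_PER_YR_TO_IN_PER_DECADE:.1f} inches/decade")
```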
|
Environment
| 2,021 |
March 23, 2021
|
https://www.sciencedaily.com/releases/2021/03/210323131205.htm
|
Mussel sensors pave the way for new environmental monitoring tools
|
Researchers at North Carolina State University have designed and demonstrated a new system that allows them to remotely monitor the behavior of freshwater mussels. The system could be used to alert researchers to the presence of toxic substances in aquatic ecosystems.
|
"When mussels feed, they open their shells; but if there's something noxious in the water, they may immediately close their shells, all at once," says Jay Levine, co-author of a paper on the work and a professor of epidemiology at NC State. "Folks have been trying to find ways to measure how widely mussels or oysters open their shells off and on since the 1950s, but there have been a wide variety of challenges. We needed something that allows the animals to move, can be placed in streams and collects data -- and now we have it.""We've basically designed a custom Fitbit to track the activities of mussels," says Alper Bozkurt, corresponding author of the paper and a professor of electrical and computer engineering at NC State.The fundamental idea for the research stems from the fact that feeding behavior in mussels is generally asynchronous -- it's not a coordinated affair. So, if a bunch of mussels close their shells at once, that's likely a warning there's something harmful in the water.One of the things the researchers are already doing with the new sensor system is monitoring mussel behavior to determine if there are harmless circumstances in which mussels may all close their shells at the same time."Think of it as a canary in the coal mine, except we can detect the presence of toxins without having to wait for the mussels to die," Levine says. "At the same time, it will help us understand the behavior and monitor the health of the mussels themselves, which could give us insights into how various environmental factors affect their health. Which is important, given that many freshwater mussel species are threatened or endangered.""To minimize costs, all the components we used to make this prototype sensor system are commercially available -- we're just using the technologies in a way nobody has used them before," Bozkurt says.Specifically, the system uses two inertial measurement units (IMUs) on each mussel. Each of the IMUs includes a magnetometer and an accelerometer -- like the ones used in smartphones to detect when you are moving the phone. One IMU is attached to the mussel's top shell, the other to its bottom shell. This allows the researchers to compare the movement of the shell halves relative to each other. In other words, this allows the researchers to tell if the mussel is closing its shell, as opposed to the mussel being tumbled in the water by a strong current.Wires from the IMUs are designed to run to a data acquisition system that would be mounted on a stake in the waterway. When placed in a natural setting, the data acquisition system is powered by a solar cell and transmits data from the sensors wirelessly via a cellular network. The current prototype has four mussels connected to the system, but it could handle dozens.The researchers did more than 250 hours of testing with live mussels in a laboratory fish tank, and found that the sensors were exceptionally accurate -- measuring the angle of the mussel's shell opening to within less than one degree."You can definitely tell when it's closed, when it's open and by how much," Bozkurt says."Our aim is to establish an 'internet-of-mussels' and monitor their individual and collective behavior," Bozkurt says. "This will ultimately enable us to use them as environmental sensors or sentinels."The researchers are now continuing their testing to better understand the robustness of the system. For example, how long might it last in practical use under real-life conditions? 
The team plans to begin field testing soon. "In addition to exploring its effectiveness as an environmental monitor, we're optimistic that the technology can help us learn new things about the mussels themselves," Levine says. "What prompts them to filter and feed? Does their behavior change in response to changes in temperature? While we know a lot about these animals, there is also a lot we don't know. The sensors provide us with the opportunity to develop baseline values for individual animals, and to monitor their shell movement in response to environmental changes." The work was done with support from the National Science Foundation, under grants 1160483 and 1554367; and from the U.S. Fish and Wildlife Service, under grant 2018-0535/F18AC00237.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322175036.htm
|
Explosive origins of 'secondary' ice and snow
|
Where does snow come from? This may seem like a simple question to ponder as half the planet emerges from a season of watching whimsical flakes fall from the sky -- and shoveling them from driveways. But a new study on how water becomes ice in slightly supercooled Arctic clouds may make you rethink the simplicity of the fluffy stuff. The study, published by scientists from the U.S. Department of Energy's (DOE) Brookhaven National Laboratory in the
|
"Our results shed new light on prior lab-experiment-based understanding about how supercooled water droplets -- water that's still liquid below its freezing point -- turn into ice and eventually snow," said Brookhaven Lab atmospheric scientist Edward Luke, the lead author on the paper. The new results, from real-world long-term cloud radar and weather-balloon measurements in mixed-phase clouds (composed of liquid water and ice) at temperatures between 0 and -10 degrees Celsius (32 and 14° Fahrenheit), provide evidence that freezing fragmentation of drizzle drops is important to how much ice will form and potentially fall from these clouds as snow."Now climate models and the weather forecast models used to determine how much snow you'll have to shovel can make a leap forward by using much more realistic physics to simulate 'secondary' ice formation," Luke said.Precipitating snow from supercooled clouds usually originates from "primary" ice particles, which form when water crystallizes on select tiny specks of dust or aerosols in the atmosphere, known as ice-nucleating particles. However, at slightly supercooled temperatures (i.e., 0 to -10°C), aircraft observations have shown that clouds can contain far more ice crystals than can be explained by the relatively few ice-nucleating particles present. This phenomenon has puzzled the atmospheric research community for decades. Scientists have thought that the explanation is "secondary" ice production, in which the additional ice particles are generated from other ice particles. But catching the process in action in the natural environment has been difficult.Previous explanations for how secondary ice forms relied mainly on laboratory experiments and limited, short-term aircraft-based sampling flights. A common understanding that came out of several lab experiments was that relatively big, fast-falling ice particles, called rimers, can "collect" and freeze tiny, supercooled cloud droplets -- which then produce more tiny ice particles, called splinters. But it turns out that such "rime splintering" isn't nearly the whole story.The new results from the Arctic show that larger supercooled water droplets, classified as drizzle, play a much more important role in producing secondary ice particles than commonly thought."When an ice particle hits one of those drizzle drops, it triggers freezing, which first forms a solid ice shell around the drop," explained Fan Yang, a co-author on the paper. "Then, as the freezing moves inward, the pressure starts to build because water expands as it freezes. That pressure causes the drizzle drop to shatter, generating more ice particles."The data show that this "freezing fragmentation" process can be explosive."If you had one ice particle triggering the production of one other ice particle, it would not be that significant," Luke said. "But we've provided evidence that, with this cascading process, drizzle freezing fragmentation can enhance ice particle concentrations in clouds by 10 to 100 times -- and even 1,000 on occasion!"Our findings could provide the missing link for the mismatch between the scarcity of primary ice-nucleating particles and snowfall from these slightly supercooled clouds."The new results hinge upon six years of data gathered by an upward-pointing millimeter-wavelength Doppler radar at the DOE Atmospheric Radiation Measurement (ARM) user facility's North Slope of Alaska atmospheric observatory in Utqiagvik (formerly Barrow), Alaska. 
The radar data are complemented by measurements of temperature, humidity, and other atmospheric conditions collected by weather balloons launched from Utqiagvik throughout the study period. Brookhaven Lab atmospheric scientist and study co-author Pavlos Kollias, who is also a professor in the atmospheric sciences division at Stony Brook University, was crucial to the collection of this millimeter-wavelength radar data in a way that made it possible for the scientists to deduce how secondary ice was formed. "ARM has pioneered the use of short-wavelength cloud radars since the 1990s to better understand clouds' microphysical processes and how those affect weather on Earth today. Our team led the optimization of their data sampling strategy so information on cloud and precipitation processes like the one presented in this study can be obtained," Kollias said. The radar's millimeter-scale wavelength makes it uniquely sensitive to the sizes of ice particles and water droplets in clouds. Its dual polarization provides information about particle shape, allowing scientists to identify needlelike ice crystals -- the preferential shape of secondary ice particles in slightly supercooled cloud conditions. Doppler spectra observations recorded every few seconds provide information on how many particles are present and how fast they fall toward the ground. This information is critical to figuring out where there are rimers, drizzle, and secondary ice particles. Using sophisticated automated analysis techniques developed by Luke, Yang, and Kollias, the scientists scanned through millions of these Doppler radar spectra to sort the particles into data buckets by size and shape -- and matched the data with contemporaneous weather-balloon observations on the presence of supercooled cloud water, temperature, and other variables. The detailed data mining allowed them to compare the number of secondary ice needles generated under different conditions: in the presence of just rimers, rimers plus drizzle drops, or just drizzle. "The sheer volume of observations allows us for the first time to lift the secondary ice signal out of the 'background noise' of all the other atmospheric processes taking place -- and quantify how and under what circumstances secondary ice events happen," Luke said. The results were clear: Conditions with supercooled drizzle drops produced dramatic ice multiplication events, many more than rimers. These real-world data give the scientists the ability to quantify the "ice multiplication factor" for various cloud conditions, which will improve the accuracy of climate models and weather forecasts. "Weather prediction models can't handle the full complexity of the cloud microphysical processes. We need to economize on the computations, otherwise you'd never get a forecast out," said Andrew Vogelmann, another co-author on the study. "To do that, you have to figure out what aspects of the physics are most important, and then account for that physics as accurately and simply as possible in the model. This study makes it clear that knowing about drizzle in these mixed-phase clouds is essential." Besides helping you budget how much extra time you'll need to shovel your driveway and get to work, a clearer understanding of what drives secondary ice formation can help scientists better predict how much snow will accumulate in watersheds to provide drinking water throughout the year.
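The bucketing-and-comparison step described above can be illustrated with a toy version of the bookkeeping: group observations by the conditions present, then express needle-ice concentrations in each group relative to the rimer-only baseline. This is a simplified sketch, not the authors' pipeline, and every concentration value is invented.

```python
# Toy illustration of computing an "ice multiplication factor" by comparing
# needle-ice concentrations across observation buckets. Data are invented.

from collections import defaultdict
from statistics import median

# Each record: (conditions detected in the spectra, needle-ice concentration
# per litre) -- hypothetical values only.
observations = [
    ("rimers_only", 1.2), ("rimers_only", 0.8), ("rimers_only", 1.5),
    ("rimers_plus_drizzle", 35.0), ("rimers_plus_drizzle", 80.0),
    ("rimers_plus_drizzle", 22.0),
    ("drizzle_only", 15.0), ("drizzle_only", 40.0),
]

by_condition = defaultdict(list)
for condition, concentration in observations:
    by_condition[condition].append(concentration)

baseline = median(by_condition["rimers_only"])
for condition, values in sorted(by_condition.items()):
    print(f"{condition}: multiplication factor ~ {median(values) / baseline:.0f}x")
```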
The new data will also help improve our understanding of how long clouds will stick around, which has important consequences for climate. "More ice particles generated by secondary ice production will have a huge impact on precipitation, solar radiation (how much sunlight clouds reflect back into space), the water cycle, and the evolution of mixed-phase clouds," Yang said. Cloud lifetime is particularly important to the climate in the Arctic, Luke and Vogelmann noted, and the Arctic climate is very important to the overall energy balance on Earth. "Mixed-phase clouds, which have both supercooled liquid water and ice particles in them, can last for weeks on end in the Arctic," Vogelmann said. "But if you have a whole bunch of ice particles, the cloud can get cleared out after they grow and fall to the ground as snow. Then you'll have sunlight able to go straight through to start heating up the ground or ocean surface." That could change the seasonality of snow and ice on the ground, causing melting and then even less reflection of sunlight and more heating. "If we can predict in a climate model that something is going to change the balance of ice formation, drizzle, and other factors, then we'll have a better ability to anticipate what to expect in future weather and climate, and possibly be better prepared for these impacts," Luke said.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322175033.htm
|
Deep seafloor nutrient vital in global food chain
|
Eroded seabed rocks are providing an essential source of nutrition for drifting marine organisms at the base of the food chain, according to new research.
|
The findings, led by the University of Leeds, show that iron -- an essential nutrient for microscopic marine algae, or phytoplankton -- is being released from sediments on the deep ocean floor. The research shows that, contrary to the expectation that oxygen in the deep sea prevents dissolved iron from escaping the seafloor, a combination of oxygen and organic matter may actually encourage the release of iron from sediments into the ocean. The findings were published today (22 March). Report lead author is Dr Will Homoky, a University Academic Fellow at Leeds' School of Earth and Environment. He said: "Our findings reveal that the shallow surface of the deep seafloor provides an important source of iron -- a scarce micronutrient -- for the ocean. "We show that the degradation of rock minerals with organic matter and oxygen is a recipe to produce tiny rust particles, which are small enough to be dissolved and carried in seawater. "These tiny rust particles and their chemical signatures explain how iron found in large parts of the ocean interior could have come from deep ocean sediments, in a manner which was once thought to be practically impossible." The nanometre-sized iron particles -- known as colloids -- could provide an important source of nutrition for phytoplankton, which provide the primary food source for a wide range of sea creatures, affecting global food chains. The phytoplankton are also important amid rising worldwide pollution levels, as they help the ocean remove about one quarter of the carbon dioxide emitted annually to the atmosphere. The research team, funded by the Natural Environment Research Council (NERC), also included scientists from the universities of Southampton, Liverpool, Oxford, South Florida and Southern California -- a collaboration formed through the international GEOTRACES programme. The findings will help shape further study of the processes that regulate the occurrence of iron in the world's oceans and the role they play in moderating marine life and atmospheric carbon dioxide. Dr Homoky said: "Our findings here are significant, because they mark a turning point in the way we view iron supply from sediments and its potential to reach marine life, and they pave a new way of thinking about the seafloor. "The production of iron colloids we have discovered is different from other forms of iron supplied to the ocean, and will help us design a new generation of ocean models to re-evaluate marine life and climate connections to the seafloor -- where large uncertainty currently exists. "This could help us to better understand how iron in the ocean has contributed to past productivity and climatic variations, and inform our approaches to marine conservation and management." The research team made precise analyses of tiny variations in the fluid content of sediment samples collected from the South Atlantic Ocean at water depths ranging from 60m down to 5km. They aimed to understand how the chemical -- or isotope -- signatures of nano-sized iron in the sediment fluids had been formed, and what this tells us about iron supply processes to the ocean. Report co-author Dr Tim Conway is Assistant Professor at the University of South Florida. He explained: "We can now measure tiny but important variations in the chemical make-up of seawater that were beyond our reach a decade ago. "Here we have characterised an isotope signature belonging to the iron colloids produced in deep ocean sediments that we can use to trace their journey in the ocean. "Our continuing goal is to learn how far this iron travels and how much of it nourishes our marine food webs around the globe."
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322175005.htm
|
What early-budding trees tell us about genetics, climate change
|
One of the surest signs of spring is the vibrantly lime-green tinge trees develop as their buds open and tiny new leaves unfurl. Bud-break is the scientific name for this process -- a straightforward term for the grand genetic mechanism that allows trees to leaf out and do their summer work of photosynthesis to store up energy for the coming winter.
|
Bud-break is preceded by bud-set, which occurs in the autumn. After trees have dropped their leaves and as the days shorten and grow colder, new buds grow on branches. Like many wildflowers, trees require a period of dormancy at colder temperatures -- a process fine-tuned by evolution -- before bud-break can occur. But as the changing climate becomes increasingly unpredictable, late frosts are more common -- and many trees initiate bud-break too early or too late. For farmers who grow fruit- and nut-bearing trees as well as grape vines, a mistimed bud-break and a frost could mean the difference between a good harvest and none at all. For example, a late frost in 2007 across the eastern U.S. resulted in an estimated agricultural loss of $112 million, including $86 million in losses to fruit crops. Poorly synchronized bud-break can also lead to pest and disease outbreaks. Understanding bud-break genetics enables scientists to modify or select crop varieties more resilient to such threats. Victor Busov, professor in the College of Forest Resources and Environmental Science at Michigan Technological University, along with colleagues in the U.S. and Sweden, published new research about the transcription factors responsible for early bud-break. The properties of transcription factors help scientists determine what other genes might be involved in a particular process like starting bud-break. Busov and collaborators previously identified transcription factors for early bud-break 1 (EBB1) and short vegetative phase-like (SVL), which directly interact to control bud-break. The research team has now identified and characterized the early bud-break 3 (EBB3) gene. EBB3 is a temperature-responsive regulator of bud-break controlled by interactions between genes and the surrounding environment. The transcription factor provides a direct link to activation of the cell cycle during bud-break. "We know now EBB3 is providing a direct link through the signaling pathway for how these cells divide," Busov said. "Once we found the third gene, we started to put them together in a coherent pathway, which helps us see the bigger picture." Using poplar and flowering locus trees in the Michigan Tech greenhouses, the researchers mimicked the daylight length and temperature of an average summer day for a period of time, followed by a period that mimicked average winter days. Then, the scientists conducted gene expression analysis to determine how the transcription factors worked together to help the trees judge when to put forth leaves in the greenhouse's artificial springtime. Busov said the analysis reveals how particular genes activate through the season or in response to specific environmental factors. "We need to understand not only three transcription factors, but the whole network," Busov said. "Once we identify the genes, we do experiments where we dial up or down the expression of the gene. We look at what the effect of these actions is on offspring. Identifying variation in the network will allow us to regulate early bud-break. New technologies of sequencing are empowering these areas." The climate has profound effects on the genetic processes that regulate bud-break. The first of these effects is warming winters. In places that no longer experience enough cold, trees do not get the necessary growth-resetting cold exposure.
Cold exposure is crucial for strong and uniform bloom and leaf-out, which is needed to produce a good crop, whether it's peaches, apples, cherries, grapes or almonds. The second way climate change affects trees is late frosts. Bud-break is all about timing; trees shouldn't initiate leaf growth until the danger of frost is past. Instances of extremely late frost are becoming more common, and as Busov notes, research indicates that the frequency of these events is increased by climate change. "Late frost has detrimental effects, not only on fruit trees, resulting in crop loss, but also forest trees," Busov said. "Frost negatively affects growth and inflicts injuries on growing organs, making trees susceptible to disease and pests." To make matters worse, trees are such long-lived organisms that their evolution is not keeping pace with the rate at which the climate is changing. "For trees, adaptation is generational -- but their generations are so long that their adaptation is also slow," Busov said. "You need some way to speed this up, both in fruit trees and in forest populations. With rapid changes, there is no time for this adaptation." Devising new approaches for accelerated tree adaptation to climate change can ensure bud-break happens at precisely the right time each spring. Using their understanding of the genetic pathways that control bud-break, scientists hope to genetically modify crops to adapt to warmer winters and unpredictable frosts. Scientists can also conduct genome-assisted breeding -- the age-old process of selection, now guided by science-enabled knowledge.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322174959.htm
|
Why commercialization of carbon capture and sequestration has failed and how it can work
|
There are 12 essential attributes that explain why commercial carbon capture and sequestration projects succeed or fail in the U.S., University of California San Diego researchers say in a recent study published in
|
Carbon capture and sequestration (CCS) has become increasingly important in addressing climate change. The Intergovernmental Panel on Climate Change (IPCC) relies greatly on the technology to reach zero carbon at low cost. Additionally, it is among the few low-carbon technologies in President Joseph R. Biden's proposed $400 billion clean energy plan that earns bipartisan support. In the last two decades, private industry and government have invested tens of billions of dollars to capture CO2. "Instead of relying on case studies, we decided that we needed to develop new methods to systematically explain the variation in project outcomes and why so many projects fail," said lead author Ahmed Y. Abdulla, research fellow with UC San Diego's Deep Decarbonization Initiative and assistant professor of mechanical and aerospace engineering at Carleton University. "Knowing which features of CCS projects have been most responsible for past successes and failures allows developers to not only avoid past mistakes, but also identify clusters of existing, near-term CCS projects that are more likely to succeed." He added, "By considering the largest sample of U.S. CCS projects ever studied, and with extensive support from people who managed these projects in the past, we essentially created a checklist of attributes that matter and gauged the extent to which each does." The researchers found that the credibility of revenues and incentives -- functions of policy and politics -- is among the most important attributes, along with capital cost and technological readiness, which have been studied extensively in the past. "Policy design is essential to help commercialize the industry because CCS projects require a huge amount of capital up front," note the authors, an international team of researchers. The authors point to existing credible policies that act as incentives, such as the 2018 expansion of the 45Q tax credit. It provides companies with a guaranteed revenue stream if they sequester CO2. The only major incentive companies have had thus far to recoup their investments in carbon capture is by selling the CO2. The 45Q tax credit also incentivizes enhanced oil recovery, but at a lower price per ton of CO2. Beyond selling to oil and gas companies, markets for CO2 remain limited. "If designed explicitly to address credibility, public policy could have a huge impact on the success of projects," said David Victor, co-lead of the Deep Decarbonization Initiative and professor of industrial innovation at UC San Diego's School of Global Policy and Strategy. While technological readiness has been studied extensively and is essential to reducing the cost and risk of CCS, the researchers looked beyond the engineering and engineering economics to determine why CCS continues to be such a risky investment. Over the course of two years, the researchers analyzed publicly available records of 39 U.S. projects and sought expertise from CCS project managers with extensive, real-world experience. They identified 12 possible determinants of project outcomes, including technological readiness, credibility of incentives, financial credibility, cost, regulatory challenges and the burden of CO2 capture. To evaluate the relative influence of the 12 factors in explaining project outcomes, the researchers built two statistical models and complemented their empirical analysis with a model derived through expert assessment. The experts only underscored the importance of credibility of revenues and incentives; the vast majority of successful projects arranged in advance to sell their captured CO2. The authors conclude that the models in the study -- especially when augmented with the structured elicitation of expert judgment -- can likely improve representations of CCS deployment across energy systems. "Assessments like ours empower both developers and policymakers," the authors write. "With data to identify near-term CCS projects that are more likely to succeed, these projects will become the seeds from which a new CCS industry sprouts." Co-authors include Ryan Hanna, assistant research scientist at UC San Diego; Kristen R Schell, assistant professor at Carleton University; and Oytun Babacan, research fellow at Imperial College London.
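One plausible shape for the kind of statistical model described above is a logistic regression relating scored attributes to a binary project outcome. The sketch below is illustrative only: the attribute names echo determinants discussed in the article, but the scores, outcomes and model choice are invented, not the study's data or methods.

```python
# Hedged sketch: logistic regression over scored project attributes.
# All numbers are invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

attributes = ["technological_readiness", "credibility_of_incentives",
              "financial_credibility", "capital_cost_burden"]

# Rows are projects, columns are attribute scores in [0, 1] (hypothetical).
X = np.array([
    [0.9, 0.8, 0.9, 0.3],
    [0.8, 0.9, 0.7, 0.4],
    [0.9, 0.2, 0.4, 0.7],
    [0.5, 0.3, 0.5, 0.6],
    [0.7, 0.9, 0.8, 0.2],
    [0.4, 0.1, 0.3, 0.5],
])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = project succeeded (hypothetical)

model = LogisticRegression().fit(X, y)
for name, weight in zip(attributes, model.coef_[0]):
    print(f"{name}: weight {weight:+.2f}")
```

A fitted model like this would let a developer score a proposed project on the same attributes and estimate its probability of success, which is the spirit of the "checklist" the authors describe.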
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322135230.htm
|
Uniform drying time for goldenseal to enhance medicinal qualities of forest herb
|
Developing a standardized drying protocol for goldenseal could lead to more predictable health applications and outcomes by preserving the alkaloids found in the plant, which is native to Appalachia, according to Penn State researchers, who conducted a new study of the medicinal forest herb.
|
The medicinally active alkaloids are concentrated in the roots and rhizomes of goldenseal. "Three alkaloids -- berberine, hydrastine and canadine -- are recognized as the major bioactive constituents in goldenseal," said Burkhart, who also is program director, Appalachian botany and ethnobotany, at Shaver's Creek Environmental Center. "One important postharvest processing step for goldenseal is drying. However, before this study it was not known how drying temperature influences the concentrations of these alkaloids." To investigate this question, researchers removed goldenseal samples from three plant colonies within a wild population located in central Pennsylvania. Fourteen "ramets," or bunches, were harvested from each plot in early April while plants were dormant. Lead researcher Grady Zuiderveen, doctoral student in ecosystem science and management, freeze-dried or air-dried goldenseal samples at six temperatures, ranging from 80 to 130 degrees Fahrenheit, to determine the relationship between drying temperature and alkaloid content in the rhizome and roots. The findings were published recently. While canadine is the least abundant alkaloid of the three, it is known to have key antibacterial properties, Zuiderveen pointed out, so developing a more standardized drying protocol for goldenseal could lead to a more predictable phytochemical profile. "This work is important because canadine has been found to have significant activity against numerous strains of bacteria, and in previous research it was the only one of the three major alkaloids found to be active against Pseudomonas aeruginosa and Staphylococcus aureus," he said. "Also, canadine possesses significant antioxidant properties and has been identified as effective at strengthening the immune system." The Pennsylvania Department of Conservation and Natural Resources' Wild Resources Conservation Program funded this research.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322130145.htm
|
Refining the hunt for SARS-CoV-2 in wastewater
|
There are many ways to test municipal wastewater for signs of the virus that causes COVID-19, but scientists in Houston have determined theirs is the best yet.
|
A study led by environmental engineer Lauren Stadler of Rice University's Brown School of Engineering, with the aid of the City of Houston Health Department and Baylor College of Medicine, compared five processes used by labs around the country to concentrate samples and find the virus in wastewater from six Houston plants. The process employed at Rice and now Baylor, called "electronegative filtration with bead beating," proved the most sensitive to signs of the virus as well as the most cost-effective. The study appears in an Elsevier journal. There is no standard test, according to the study, but all of the processes -- including electronegative filtration with elution, ultrafiltration, precipitation and direct extraction -- are effective to a degree. "The virus is extremely dilute in wastewater, so we need a way to concentrate it," Stadler said in explaining the Rice process. "First, we add salt to the wastewater sample to enhance adsorption of the virus to the electronegative filter. After filtration, we physically beat the filter with glass beads to release the virus into a lysate. Even though this process might break up the virus, we only detect tiny fragments of its RNA genome to quantify it." Established in spring 2020, the Houston coalition was on the leading edge of what became a nationwide effort to find the SARS-CoV-2 virus in wastewater. The technique quickly proved able to anticipate COVID-19 outbreaks and allowed health officials to ramp up testing where needed. "When we started testing, Baylor was using a different method," Stadler said. "That gave us the opportunity to do a lot of head-to-head comparisons about which method to use. And that led us to want to do a more comprehensive evaluation of several methods that were commonly being used to concentrate SARS-CoV-2 in wastewater by other groups around the world. "There's not one right method, but we wanted to be sure we considered other options," she said. "Our recommendation and final method selection was based on finding a balance between sensitivity, throughput and cost. "The method we selected originally turned out to have the lowest detection limit, while also being relatively high-throughput and cost-effective," Stadler said. "Baylor switched over to the same concentration method as a result, which gave us confidence that we were truly generating the best possible data for the city." The Houston researchers hope the study will guide municipalities around the world that have, or are considering, their own wastewater testing labs. "A lot of major cities are already doing this, and there are now statewide programs emerging in Michigan, Wisconsin, North Carolina and a few others," Stadler said. She noted the Houston labs are already looking for COVID mutations. "We're doing research on sequencing wastewater samples to be able to detect highly transmissible variants circulating in the community," Stadler said. That the study even happened is a bonus, considering the workload for Stadler's Rice lab and counterparts at Baylor and the Houston Health Department. Since ramping up in mid-2020, the labs have analyzed hundreds of samples a week from the city's 39 wastewater plants. Houston has since added testing points at dozens of nursing homes, schools and other critical locations. "The city is planning to expand the number of stations to get zip code-level information, and we're working with them to analyze that data," Stadler said.
"Someday, it could be a tool to look for a panel of viruses, not just this one."Rice graduate student Zachary LaTurner is lead author of the study. Co-authors are Rice postdoctoral researcher David Zong and graduate students Prashant Kalvapalle, Kiara Reyes Gamas, Tessa Crosby and Priyanka Ali; Austen Terwilliger, director of operations of the TAILOR Labs at Baylor; Baylor staff scientist Vasanthi Avadhanula, lab assistant Haroldo Hernandez Santos, research technician Kyle Weesner, Pedro Piedra, a professor of molecular virology and microbiology, and Anthony Maresso, an associate professor of molecular virology and microbiology; and Loren Hopkins, chief environmental science officer for the Houston Health Department and a professor in the practice of statistics at Rice. Stadler is an assistant professor of civil and environmental engineering.The Houston Health Department, National Science Foundation, Johnson & Johnson, the Environmental Research and Education Foundation and a National Academies of Science, Engineering and Medicine Gulf Research Early Career Research Fellowship supported the study.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322120113.htm
|
Making plastics production more energy efficient
|
Northwestern engineering researchers have demonstrated a new approach to chemical catalysis that results in high propylene yields using less energy. The findings could support more energy-efficient production processes for many plastics.
|
Propylene is one of the highest-volume chemical products: more than $100 billion worth is produced each year, used primarily to make polypropylene for a variety of materials, from injection moldings in car parts to consumer products. Producing propylene is also energy intensive, requiring temperatures around 800 degrees Celsius to convert propane gas to propylene. One technique, called oxidative dehydrogenation, has long been studied as an alternative way to make propylene from propane without the high temperature restrictions. This approach reacts propane and oxygen over a catalyst to produce propylene and water. Yet, because propylene is more reactive to oxygen than propane, the reaction typically yields only a small amount of propylene. "The reaction works, but similar to when you turn on your gas grill to cook at home, you don't produce propylene, you just burn the propane," said Justin Notestein, professor of chemical and biological engineering at the McCormick School of Engineering and a co-corresponding author on the research. "Instead of searching for the right catalyst, we deconstructed the oxidative dehydrogenation reaction down into two components -- dehydrogenation and selective hydrogen combustion -- and then designed a tandem material that does both reactions, in a particular order. This produced the highest yields of propylene ever reported." A paper titled "Tandem In2O3-Pt/Al2O3 Catalyst for Coupling of Propane Dehydrogenation to Selective H2 Combustion" was published March 19. In the new approach, the researchers engineered two catalysts in nanoscale proximity: a platinum-based catalyst that selectively removes hydrogen from propane to make propylene, and an indium oxide-based catalyst that selectively burns the hydrogen, but not the propane or propylene. "We found that the nanostructure really matters," Notestein said. "Indium oxide on platinum works great. Platinum on indium oxide doesn't. Platinum physically combined with indium oxide doesn't. This nanostructure is able to separate and sequence the reactions, even though both catalysts can do both reactions. This organization is common in biology, but is very rare with human-made materials." The team's tests produced notable improvements in the yield of propylene from propane. At 450 degrees Celsius, tests produced 30 percent yield from a single pass through the reactor, while ensuring that more than 75 percent of the carbon atoms in the propane went on to become propylene. By comparison, it is impossible to produce yields greater than 24 percent when heating propane in the absence of oxygen, and the required catalysts are often unstable. "No one has ever demonstrated yields exceeding these thermodynamic limitations," Notestein said. "From a utility point of view, our results are some of the first to really justify trying to do this reaction oxidatively, rather than just doing dehydrogenation." The system's simple design could be further optimized by adjusting the reactor conditions and altering the two catalyst components. Current methods to produce higher yields require more complex and expensive engineering solutions. "Since we rely on proven design-build-test cycles from engineering, there can be additional improvements," said Notestein, director of the Center for Catalysis and Surface Science, part of the Institute for Sustainability and Energy at Northwestern. "These findings give us new compositions and rational strategies to try in the search for high-performing catalyst systems. This could especially benefit smaller chemical plants where energy consumption is very important and current engineering strategies may not be feasible." He added the team's approach reflects the larger efforts of the National Science Foundation's Center for Innovative and Strategic Transformation of Alkane Resources (CISTAR), which funded the work. The findings also could further improve the energy efficiency of manufacturing many plastics used in structural and materials applications. Plastic parts in cars, for example, make vehicles lighter in weight and more energy efficient, while polymer house wrap and siding is durable and helps keep homes warm and dry. "Plastics, while much maligned, are essential to modern society, including efforts to make society more energy efficient," Notestein said. "Making propylene, and materials like polypropylene, using this new approach could be much less energy intensive, which would be good news for everyone."
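The reported figures imply a simple piece of reactor bookkeeping worth making explicit: single-pass yield equals conversion times selectivity, so a 30 percent yield at better than 75 percent carbon selectivity implies roughly 40 percent propane conversion per pass. The relation itself is textbook chemistry, not the authors' code; the sketch below just applies it to the numbers quoted above.

```python
# Standard yield/selectivity/conversion relation applied to the reported
# figures. The relation is textbook; only the two inputs come from the text.

def implied_conversion(yield_fraction, selectivity_fraction):
    """yield = conversion * selectivity  =>  conversion = yield / selectivity"""
    return yield_fraction / selectivity_fraction

propylene_yield = 0.30  # single pass at 450 C, as reported
selectivity = 0.75      # fraction of converted carbon ending up as propylene

print(f"implied propane conversion ~ "
      f"{implied_conversion(propylene_yield, selectivity):.0%}")  # ~40%
```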
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322112910.htm
|
New porous material promising for making renewable energy from water
|
One prospective source of renewable energy is hydrogen gas produced from water with the aid of sunlight. Researchers at Linköping University, Sweden, have developed a material, nanoporous cubic silicon carbide, that exhibits promising properties to capture solar energy and split water for hydrogen gas production. The study has been published in the journal
|
"New sustainable energy systems are needed to meet global energy and environmental challenges, such as increasing carbon dioxide emissions and climate change," says Jianwu Sun, senior lecturer in the Department of Physics, Chemistry and Biology at Linköping University, who has led the new study.Hydrogen has an energy density three times that of petrol. It can be used to generate electricity using a fuel cell, and hydrogen-fuelled cars are already commercially available. When hydrogen gas is used to produce energy, the only product formed is pure water. In contrast, however, carbon dioxide is created when the hydrogen is produced, since the most commonly used technology used today depends on fossil fuels for the process. Thus, 9-12 tonnes of carbon dioxide are emitted when 1 tonne of hydrogen gas is produced.Producing hydrogen gas by splitting water molecules with the aid of solar energy is a sustainable approach that could give hydrogen gas using renewable sources without leading to carbon dioxide emissions. A major advantage of this method is the possibility to convert solar energy to fuel that can be stored."Conventional solar cells produce energy during the daytime, and the energy must either be used immediately, or stored in, for example, batteries. Hydrogen is a promising source of energy that can be stored and transported in the same way as traditional fuels such as petrol and diesel," says Jianwu Sun.It is not, however, an easy task to split water using the energy in sunlight to give hydrogen gas. For this to succeed, it is necessary to find cost-efficient materials that have the right properties for the reaction in which water (HJianwu Sun's research group has investigated cubic silicon carbide, 3C-SiC. The scientists have produced a form of cubic silicon carbide that has many extremely small pores. The material, which they call nanoporous 3C-SiC, has promising properties that suggest it can be used to produce hydrogen gas from water using sunlight. The present study has been published in the journal "The main result we have shown is that nanoporous cubic silicon carbide has a higher charge-separation efficiency, which makes the splitting of water to hydrogen much better than when using planar silicon carbide," says Jianwu Sun.The research has received financial support from, among other sources, the Swedish Research Council, FORMAS, and The Swedish Foundation for International Cooperation in Research and Higher Education.
|
Environment
| 2,021 |
March 22, 2021
|
https://www.sciencedaily.com/releases/2021/03/210322091634.htm
|
Systematic approach to forest and water supply management
|
As World Water Day is observed around the globe, new research from UBC Okanagan suggests a systematic approach to forest and water supply research may yield an improved assessment and understanding of connections between the two.
|
Healthy forests play a vital role in providing a clean, stable water supply, says eco-hydrologist Dr. Adam Wei. Acting as natural reservoirs, forests in watersheds release and purify water by slowing erosion and delaying its release into streams. But forests are changing -- in part because of human activity -- and that's having an impact on forests' interaction with hydrological processes. Dr. Wei, Forest Renewal BC's chair of watershed research and management, is a professor of earth, environmental and geographic sciences in the Irving K. Barber Faculty of Science, and study co-author. He says activities like logging, deforestation, creating new forests on previously bare land, agriculture and urbanization are changing the landscape of forests worldwide. "The notion that humans have left enormous, often negative, footprints on the natural world isn't new," he says. "It's why the term Anthropocene was created, to describe these phenomena. But now we need to acknowledge where we're at and figure out a way to fix what's broken." While humans bear much of the blame, they aren't the only culprits. Natural disturbances like insect infestations and wildfires are also contributing to the swift transformation of forests, leading Dr. Wei to examine current forest-water research and management practices. His goal is to identify the gaps and propose a new approach that reflects the numerous variables, and their interactions, that may be at play in any given watershed. He points to an example in the study to illustrate the need for a new perspective. "We were looking at the impacts of deforestation on annual streamflow -- and though we were able to draw the conclusion that deforestation increased it, the variations between studies were large, with increases ranging from less than one per cent to nearly 600 per cent," he explains. Dr. Wei saw similar variations when he researched the 'why.' "We concluded this was due to how water in the soil and on plants evaporates after a loss of forest cover," explains Wei. "But the amount lost ranged from less than two per cent to 100 per cent -- that's a huge difference that can be attributed to the scale, type and severity of forest disturbance, as well as climate and watershed properties such as location. There are so many variables that need to be taken into account, and not doing so can result in contradictory research conclusions." To limit disparities, Dr. Wei says future research and watershed management approaches need to be systematic, and include key contributing factors and a broad spectrum of response variables related to hydrological services. He also suggests new tools like machine learning and climatic eco-hydrological modelling should be utilized. "Implementing a systematic approach to all forest-water research will reduce the likelihood of procuring misleading assessments, which in turn will give us a better chance to solve some of the problems we've created," says Dr. Wei. This study, published in
|
Environment
| 2,021 |
March 19, 2021
|
https://www.sciencedaily.com/releases/2021/03/210319183936.htm
|
How our microplastic waste becomes 'hubs' for pathogens, antibiotic-resistant bacteria
|
It's estimated that an average-sized wastewater treatment plant serving roughly 400,000 residents will discharge up to 2,000,000 microplastic particles into the environment each day. Yet, researchers are still learning about the environmental and human health impacts of these ultra-fine plastic particles, less than 5 millimeters in length, found in everything from cosmetics, toothpaste and clothing microfibers to our food, air and drinking water.
|
Now, researchers at New Jersey Institute of Technology have shown that ubiquitous microplastics can become 'hubs' for antibiotic-resistant bacteria and pathogens to grow once they wash down household drains and enter wastewater treatment plants -- forming a slimy layer of buildup, or biofilm, on their surface that allows pathogenic microorganisms and antibiotic waste to attach and comingle. The findings are published in an Elsevier journal. "A number of recent studies have focused on the negative impacts that millions of tons of microplastic waste a year are having on our freshwater and ocean environments, but until now the role of microplastics in our towns' and cities' wastewater treatment processes has largely been unknown," said Mengyan Li, associate professor of chemistry and environmental science at NJIT and the study's corresponding author. "These wastewater treatment plants can be hotspots where various chemicals, antibiotic-resistant bacteria and pathogens converge, and what our study shows is that microplastics can serve as their carriers, posing imminent risks to aquatic biota and human health if they bypass the water treatment process." "Most wastewater treatment plants are not designed for the removal of microplastics, so they are constantly being released into the receiving environment," added Dung Ngoc Pham, NJIT Ph.D. candidate and first author of the study. "Our goal was to investigate whether or not microplastics are enriching antibiotic-resistant bacteria from activated sludge at municipal wastewater treatment plants, and if so, learn more about the microbial communities involved." In their study, the team collected batches of sludge samples from three domestic wastewater treatment plants in northern New Jersey, inoculating the samples in the lab with two widespread commercial microplastics -- polyethylene (PE) and polystyrene (PS). The team used a combination of quantitative PCR and next-generation sequencing techniques to identify the species of bacteria that tend to grow on the microplastics, tracking genetic changes of the bacteria along the way. The analysis revealed that three genes in particular -- sul1, sul2 and intI1 -- known to aid resistance to sulfonamides, a group of common antibiotics, were found to be up to 30 times greater on the microplastic biofilms than in the lab's control tests using sand biofilms after just three days. When the team spiked the samples with the antibiotic sulfamethoxazole (SMX), they found it further amplified the antibiotic resistance genes by up to 4.5-fold. "Previously, we thought the presence of antibiotics would be necessary to enhance antibiotic-resistance genes in these microplastic-associated bacteria, but it seems microplastics can naturally allow for uptake of these resistance genes on their own," said Pham. "The presence of antibiotics does have a significant multiplier effect, however." Eight different species of bacteria were found highly enriched on the microplastics. Among these species, the team observed two emerging human pathogens typically linked with respiratory infection, Raoultella ornithinolytica and Stenotrophomonas maltophilia, frequently hitchhiking on the microplastic biofilms. The team say the most common strain found on the microplastics by far, Novosphingobium pokkalii, is likely a key initiator in forming the sticky biofilm that attracts such pathogens -- as it proliferates, it may contribute to the deterioration of the plastic and expand the biofilm.
At the same time, the team's study highlighted the role of the gene intI1, a mobile genetic element chiefly responsible for enabling the exchange of antibiotic resistance genes among the microplastic-bound microbes. "We might think of microplastics as tiny beads, but they provide an enormous surface area for microbes to reside," explained Li. "When these microplastics enter the wastewater treatment plant and mix in with sludge, bacteria like Novosphingobium can accidentally attach to the surface and secrete glue-like extracellular substances. As other bacteria attach to the surface and grow, they can even swap DNA with each other. This is how the antibiotic resistance genes are being spread among the community." "We have evidence that the bacteria developed resistance to other antibiotics this way as well, such as aminoglycosides, beta-lactams and trimethoprim," added Pham. Now, Li says the lab is further studying the role of Novosphingobium in biofilm formation on microplastics. The team is also seeking to better understand the extent to which such pathogen-carrying microplastics may be bypassing water treatment processes, by studying the resistance of microplastic biofilms during wastewater treatment with disinfectants such as UV light and chlorine. "Some states are already considering new regulations on the use of microplastics in consumer products. This study raises calls for further investigation on microplastic biofilms in our wastewater systems and development of effective means for removing microplastics in aquatic environments," said Li.
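The fold-enrichment figures quoted above (up to 30 times for sul1, sul2 and intI1) are the kind of result qPCR workflows typically express with the Livak 2^-ΔΔCt method. The sketch below is a minimal illustration of that standard calculation; the Ct values and the choice of the 16S rRNA gene as reference are hypothetical, and the NJIT team's exact pipeline is not described in the article.

```python
# Minimal sketch of relative gene quantification via the Livak (2^-ddCt) method,
# a standard way to express qPCR fold changes. All Ct values below are
# hypothetical illustrations, not data from the NJIT study.

def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of a target gene in a test sample relative to a control
    sample, each normalized to a reference ("housekeeping") gene."""
    delta_ct_test = ct_target_test - ct_ref_test      # normalize test sample
    delta_ct_ctrl = ct_target_ctrl - ct_ref_ctrl      # normalize control sample
    delta_delta_ct = delta_ct_test - delta_ct_ctrl
    return 2 ** (-delta_delta_ct)                     # lower Ct = more template

# Hypothetical cycle-threshold values: sul1 on microplastic vs. sand biofilms,
# both normalized to the 16S rRNA gene.
fc = fold_change(ct_target_test=21.0, ct_ref_test=15.0,   # microplastic biofilm
                 ct_target_ctrl=26.0, ct_ref_ctrl=15.1)   # sand biofilm control
print(f"sul1 fold enrichment: {fc:.1f}x")  # ~30x with these made-up numbers
```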
|
Environment
| 2,021 |
March 19, 2021
|
https://www.sciencedaily.com/releases/2021/03/210319125516.htm
|
Tropical species are moving northward in U.S. as winters warm
|
Notwithstanding last month's cold snap in Texas and Louisiana, climate change is leading to warmer winter weather throughout the southern U.S., creating a golden opportunity for many tropical plants and animals to move north, according to a new study appearing this week.
|
Some of these species may be welcomed, such as sea turtles and the Florida manatee, which are expanding their ranges northward along the Atlantic Coast. Others, like the invasive Burmese python -- in the Florida Everglades, the largest measured 18 feet, end-to-end -- may be less so. Equally unwelcome, and among the quickest to spread into warming areas, are the insects, including mosquitoes that carry diseases such as West Nile virus, Zika, dengue and yellow fever, and beetles that destroy native trees. "Quite a few mosquito species are expanding northward, as well as a lot of forestry pests: bark beetles, the southern mountain pine beetle," said Caroline Williams, associate professor of integrative biology at the University of California, Berkeley, and a co-author of the paper. "In our study, we were really focusing on that boundary in the U.S. where we get that quick tropical-temperate transition. Changes in winter conditions are one of the major, if not the major, drivers of shifting distributions." That transition zone, northward of which freezes occur every winter, has always been a barrier to species that evolved in more stable temperatures, said Williams, who specializes in insect metabolism -- in particular, how winter freezes and snow affect the survival of species. "For the vast majority of organisms, if they freeze, they die," she said. "Cold snaps like the recent one in Texas might not happen for 30 or 50 or even 100 years, and then you see these widespread mortality events where tropical species that have been creeping northward are suddenly knocked back. But as the return times become longer and longer for these extreme cold events, it enables tropical species to get more and more of a foothold, and even maybe for populations to adapt in situ to allow them to tolerate more cold extremes in the future." The study, conducted by a team of 16 scientists led by the U.S. Geological Survey (USGS), focused on the effects warming winters will have on the movement of a broad range of cold-sensitive tropical plants and animals into the Southern U.S., especially into the eight subtropical U.S. mainland states: Florida, Alabama, Mississippi, Louisiana, Texas, New Mexico, Arizona and California. Williams and Katie Marshall of the University of British Columbia in Vancouver co-wrote the section on insects for the study. The team found that a number of tropical species, including insects, fish, reptiles, amphibians, mammals, grasses, shrubs and trees, are enlarging their ranges to the north. Among them are species native to the U.S., such as mangroves, which are tropical salt-tolerant trees, and snook, a warm-water coastal sport fish; and invasive species such as Burmese pythons, Cuban tree frogs, Brazilian pepper trees and buffelgrass. "We don't expect it to be a continuous process," said USGS research ecologist Michael Osland, the study's lead author. "There's going to be northward expansion, then contraction with extreme cold events, like the one that just occurred in Texas, and then movement again. But by the end of this century, we are expecting tropicalization to occur." The authors document several decades' worth of changes in the frequency and intensity of extreme cold snaps in San Francisco, Tucson, New Orleans and Tampa -- all cities with temperature records stretching back to at least 1948.
In each city, they found, mean winter temperatures have risen over time, winter's coldest temperatures have gotten warmer, and there are fewer days each winter when the mercury falls below freezing. Temperature records from San Francisco International Airport, for example, show that before 1980, each winter would typically see several sub-freezing days. For the past 20 years, there has been only one day with sub-freezing temperatures. The study documents changes already underway or anticipated in the home ranges of 22 plant and animal species from California to Florida. The changes are expected to result in some temperate-zone plant and animal communities found today across the southern U.S. being replaced by tropical communities. "Unfortunately, the general story is that the species that are going to do really well are the more generalist species -- their host plants or food sources are quite varied or widely distributed, and they have relatively wide thermal tolerance, so they can tolerate a wide range of conditions," Williams said. "And, by definition, these tend to be the pest species -- that is why they are pests: They are adaptable, widespread and relatively unbothered by changes in conditions, whereas the more specialized or boutique species are tending to decline as they get displaced from their relatively narrow niche." She cautioned that insect populations overall are falling worldwide. "We are seeing an alarming decrease in total numbers in natural areas, managed areas, national parks, tropical rain forests -- globally," she said. "So, although we are seeing some widespread pest species increasing, the overall pattern is that insects are declining extremely rapidly." The authors suggest in-depth laboratory studies to learn how tropical species can adapt to extreme conditions, and modeling to show how lengthening intervals between cold snaps will affect plant and animal communities. "On a hopeful note, it is not that we are heading for extinction of absolutely everything, but we need to prepare for widespread shifts in the distribution of biodiversity as climate, including winter climate, changes," Williams said. "The actions that we take over the next 20 years are going to be critical in determining our trajectory. In addition to obvious shifts, like reducing our carbon footprint, we need to protect and restore habitat for insects. Individuals can create habitat in their own backyards for insects by cultivating native plants that support pollinators and other native insects. Those are little things that people can do and that can be important in providing corridors for species to move through our very fragmented habitats."
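Counting the days each winter when the mercury falls below freezing, as done for the four cities' temperature records, is straightforward to reproduce on any daily minimum-temperature series. A minimal sketch follows; the file name and column names are hypothetical stand-ins, not the study's actual data handling.

```python
# Count sub-freezing days per winter from a daily temperature record.
# File and column names are hypothetical; the logic mirrors the kind of
# analysis described for the San Francisco, Tucson, New Orleans and Tampa
# temperature records.
import pandas as pd

df = pd.read_csv("daily_min_temps.csv", parse_dates=["date"])  # columns: date, tmin_c

# Assign each day to a "winter year": Dec 2010 - Feb 2011 counts as winter 2011.
df["winter"] = df["date"].dt.year + (df["date"].dt.month == 12)
winter_days = df[df["date"].dt.month.isin([12, 1, 2])]

freeze_days = (
    winter_days.assign(freezing=winter_days["tmin_c"] < 0.0)
               .groupby("winter")["freezing"]
               .sum()
)
print(freeze_days)  # number of sub-freezing days for each winter
```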
|
Environment
| 2,021 |
March 19, 2021
|
https://www.sciencedaily.com/releases/2021/03/210319080825.htm
|
Carbon uptake in regrowing Amazon forest threatened by climate and human disturbance
|
Large areas of forest regrowing in the Amazon, which help reduce carbon dioxide in the atmosphere, are being limited by climate and human activity.
|
The forests, which naturally regrow on land previously deforested for agriculture and now abandoned, are developing at different speeds. Researchers at the University of Bristol have found a link between slower tree growth and land previously scorched by fire. The findings were published today [date]. Global forests are expected to contribute a quarter of pledged mitigation under the 2015 Paris Agreement. Many countries pledged in their Nationally Determined Contribution (NDC) to restore and reforest millions of hectares of land to help achieve the goals of the Paris Agreement. Until recently, this included Brazil, which in 2015 vowed to restore and reforest 12 million hectares, an area approximately equal to that of England. Part of this reforestation can be achieved through the natural regrowth of secondary forests, which already occupy about 20% of deforested land in the Amazon. Understanding how the regrowth is affected by the environment and humans will improve estimates of the climate mitigation potential in the decade ahead, which the United Nations has called the "Decade of Ecosystem Restoration." Viola Heinrich, lead author and PhD student from the School of Geographical Sciences at the University of Bristol, said: "Our results show the strong effects of key climate and human factors on regrowth, stressing the need to safeguard and expand secondary forest areas if they are to have any significant role in the fight against climate change." Annually, tropical secondary forests, which grow on previously used land, can absorb carbon up to 11 times faster than old-growth forests. However, there are many driving factors that can influence the spatial patterns of the regrowth rate, such as when forest land is burned, either to clear it for agriculture or when fire has spread from elsewhere. The research was led by researchers at the University of Bristol and Brazil's National Institute for Space Research (INPE) and included scientists from the Universities of Cardiff and Exeter, UK. The scientists used a combination of satellite-derived images that detect changes in forest cover over time to identify secondary forest areas and their ages, as well as satellite data that can monitor aboveground carbon, environmental factors and human activity. They found that the impact of disturbances, such as fire and repeated deforestation prior to regrowth, reduced the regrowth rate by 20% to 55% across different areas of the Amazon. "The regrowth models we developed in this study will be useful for scientists, forest managers and policy makers, highlighting the regions that have the greatest regrowth potential," said Heinrich. The research team also calculated the contribution of Amazonian secondary forests to Brazil's net emissions reduction target and found that, by preserving the current area, secondary forests can contribute 6% of Brazil's net emissions reduction targets. However, this value drops rapidly to less than 1% if only secondary forests older than 20 years are preserved. In December 2020, Brazil amended its pledge (NDC) under the Paris Agreement such that there is now no mention of the 12 million hectares of forest restoration or of eliminating illegal deforestation, as was pledged in Brazil's original NDC target in 2015. Co-author Dr Jo House, University of Bristol, said: "The findings in our study highlight the carbon benefits of forest regrowth and the negative impact of human action if these forests are not protected.
In the run-up to the 26th Conference of the Parties, this is a time when countries should be raising their climate ambitions for protecting and restoring forest ecosystems, not lowering them as Brazil seems to have done." Co-author Dr Luiz Aragão, of the National Institute for Space Research in Brazil, added: "Across the tropics, several areas could be used to regrow forests to remove CO2 ..." The team will now focus their next steps on applying their methods to estimate the regrowth of secondary forest across the tropics.
|
Environment
| 2,021 |
March 15, 2021
|
https://www.sciencedaily.com/releases/2021/03/210315110123.htm
|
Weed invaders are getting faster
|
Dr Daniel Montesinos is a Senior Research Fellow at the Australian Tropical Herbarium, at James Cook University in Cairns. He is studying weeds to better understand (among other things) how they might respond to climate change.
|
He said most invasive plants are characterised by their rapid pace when it comes to taking up nutrients, growing, and reproducing -- and they're even faster in the regions they invade. "New experiments comparing populations from distant regions show a clear trend for already-fast invasive plants to rapidly adapt even faster traits in their non-native regions," Dr Montesinos said. This is even more pronounced in the tropics and sub-tropics. "Even though invasives' growth rates are already among the highest for plants, when they invade new territory in the tropics and sub-tropics, they develop those weedy traits more rapidly than they do when they invade in temperate climates," Dr Montesinos said. "This might be explained by higher chemical processing at higher temperatures, which suggests that global warming will increase invasive impacts in these regions, as long as enough water is available." Dr Montesinos said invasive plants usually take hold in land that has been disturbed by human intervention (for example, farms and roadsides) and then spread to other habitats. "It's important to recognise disturbed habitats as a gateway for plant invasions," Dr Montesinos said. "If we can limit disturbance of natural environments, we can reduce biological invasions, particularly in tropical areas that are threatened by increasing human encroachment." Dr Montesinos said that range expansions by native species trying to 'escape' from changes in climate could be a further complication. This involves climate change enabling some native plants to grow where they previously could not. "This can be seen as a double-edged sword -- some native species will survive climate change, but they might achieve that by disrupting the habitats of others. The study of invasion ecology is complex, but invasive species can be models in which to study, and make predictions about, the responses of native plants to climate change, giving us clues on improved management techniques for both natives and invasives," Dr Montesinos said.
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312155451.htm
|
Glaciers and enigmatic stone stripes in the Ethiopian highlands
|
As the driver of global atmospheric and ocean circulation, the tropics play a central role in understanding past and future climate change. Both global climate simulations and worldwide ocean temperature reconstructions indicate that the cooling in the tropics during the last cold period, which began about 115,000 years ago, was much weaker than in the temperate zone and the polar regions. The extent to which this general statement also applies to the tropical high mountains of Eastern Africa and elsewhere is, however, doubted on the basis of palaeoclimatic, geological and ecological studies at high elevations.
|
A research team led by Alexander Groos, Heinz Veit (both from the Institute of Geography) and Naki Akçar (Institute of Geological Sciences) at the University of Bern, in collaboration with colleagues from ETH Zurich, the University of Marburg and the University of Ankara, used the Ethiopian Highlands as a test site to investigate the extent and impact of regional cooling on tropical mountains during the last glacial period. The results have been published in scientific journals. "The Ethiopian Highlands are currently not covered by ice, despite their elevation of over 4,000 m," explains Groos, who studied the glacial, climatic and landscape history of the Bale and Arsi Mountains in the southern highlands as part of his dissertation. "Moraines and other landforms, however, attest to the fact that these mountains were glaciated during the last cold period," he continues. Moraine boulders in the Bale and Arsi Mountains were mapped and sampled in the field and later dated using the chlorine isotope "36Cl" to accurately determine the extent and timing of past glaciations. The researchers were in for a surprise: "Our results show that glaciers in the southern Ethiopian Highlands reached their maximum extent between 40,000 and 30,000 years ago," says Groos, "several thousand years earlier than in other mountainous regions in Eastern Africa and worldwide." In total, the glaciers in the southern highlands covered an area of more than 350 km² during their maximum. In addition to the cooling of at least 4 to 6 °C, the extensive volcanic plateaus above 4,000 m favored the development of glaciation of this magnitude. The researchers gained important insights by comparing the newly reconstructed glacier fluctuations in the Ethiopian Highlands with those of the highest East African mountains and with climate archives from the Great African Rift Valley. "The cross-comparisons show that the tropical mountains in Eastern Africa have experienced a more pronounced cooling than the surrounding lowlands," Groos concludes. "Furthermore, the results suggest a nonuniform response by East African glaciers and ice caps to climate changes during the last cold period, which can be attributed to regional differences in precipitation distribution and mountain relief, among other factors," he continues. During their fieldwork on the central Sanetti Plateau in the Bale Mountains, the researchers also came across gigantic stone stripes (up to 1,000 m long, 15 m wide and 2 m deep) outside the area of the former ice cap. "The existence of these stone stripes on a tropical plateau surprised us, as so-called periglacial landforms of this magnitude were previously only known from the temperate zone and polar regions and are associated with ground temperatures around freezing point," Groos said. However, the average ground temperature on the Sanetti Plateau is currently about 11 °C. The large boulders and basalt columns that make up the stone stripes originally came from heavily eroded rock formations and volcanic plugs. As things stand, the researchers assume that the stone stripes were formed during the last glacial period through natural sorting of the previously chaotically distributed rocks in the course of the periodic freezing and thawing of the ground near the former ice cap. However, locally this would have required a drop in the mean ground temperature of at least 11 °C and in the mean air temperature of at least 7 °C.
Whether this unprecedented cooling is a regional phenomenon or representative of tropical high mountains during the last glacial period remains to be shown by future studies from other tropical mountain regions.
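For readers curious about the 36Cl dating mentioned above: in its simplest textbook form (no erosion, no inherited nuclides), the cosmogenic nuclide concentration in an exposed boulder surface builds up toward equilibrium and can be inverted for an exposure age. The study's full treatment would also involve production-rate scaling and shielding corrections; the expression below is only the idealized case.

```latex
% Simplest-case cosmogenic exposure age (no erosion, no inheritance).
% N = measured nuclide concentration, P = local production rate,
% \lambda = decay constant of ^{36}Cl (half-life \approx 3.0 \times 10^{5}~yr).
\begin{align*}
N(t) = \frac{P}{\lambda}\left(1 - e^{-\lambda t}\right)
\quad\Longrightarrow\quad
t = -\frac{1}{\lambda}\,\ln\!\left(1 - \frac{\lambda N}{P}\right),
\qquad \lambda = \frac{\ln 2}{t_{1/2}}
\end{align*}
```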
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312140010.htm
|
Zealandia switch: New theory of regulation of ice age climates
|
The origins of ice age climate changes may lie in the Southern Hemisphere, where interactions among the westerly wind system, the Southern Ocean and the tropical Pacific can trigger rapid, global changes in atmospheric temperature, according to an international research team led by the University of Maine.
|
The mechanism, dubbed the Zealandia Switch, relates to the general position of the Southern Hemisphere westerly wind belt -- the strongest wind system on Earth -- and the continental platforms of the southwest Pacific Ocean, and their control on ocean currents. Shifts in the latitude of the westerly winds affect the strength of the subtropical oceanic gyres and, in turn, influence the release of energy from the tropical ocean waters, the planet's "heat engine." Tropical heat spreads rapidly through the atmosphere and ocean to the polar regions of both hemispheres, acting as the planet's thermostat. The Southern Hemisphere climate dynamics may be the missing link in understanding longstanding questions about ice ages, based on the findings of the research team from UMaine, Columbia University's Lamont-Doherty Earth Observatory, the University of Arizona, and GNS Science in New Zealand. For more than a quarter-century, George Denton, UMaine Libra Professor of Geological Sciences, the journal article's first author, has led research reconstructing the history of mountain glaciers in the Southern Hemisphere. In the late 1980s, he and Wallace Broecker, a geochemist at Columbia University, noted that a key question about ice ages remained unresolved -- the link between ice age climate and the orbital cycles in the length and strength of the Earth's seasons. Evidence showed that ice age climate changes were synchronous in both polar hemispheres, with rapid transitions from glacial to interglacial global climate conditions. They concluded that existing theories could not adequately account for changes in seasonality, ice sheet size and regional climate. Mountain glaciers are highly sensitive to climate and well suited to climatic reconstruction, using distinctive moraine deposits that mark the former glacier limits. In the 1990s, Denton led research teams in the mapping and dating of moraine sequences in South America and, more recently, in New Zealand's Southern Alps, with co-author David Barrell, geologist and geomorphologist with the New Zealand government's geoscience research institute, GNS Science. With advances in isotopic dating of moraines in the mid-2000s, Denton teamed up with Columbia University's Joerg Schaefer, who directs the Cosmogenic Nuclide Laboratory at the Lamont-Doherty Earth Observatory. Together with CU-LDEO colleague and co-author Michael Kaplan, Schaefer, Denton, and UMaine assistant professor and co-author Aaron Putnam have guided a succession of UMaine graduate student field and laboratory projects (including Putnam's Ph.D. work) that have developed a chronology of climate-induced glacier changes in the Southern Alps spanning many tens of thousands of years. The most recent participant in the UMaine-CU partnership is UMaine Ph.D. student and co-author Peter Strand. Collectively, the UMaine, CU-LDEO and GNS Science partners have worked to create and compile mountain glacier chronologies from New Zealand and South America, producing a comprehensive chronology of glacier extent during and since the last ice age. The team then compared the moraine dating to paleoclimate data worldwide to gain insights into the climate dynamics of ice ages and millennial-scale abrupt climate events. The findings highlight a general global synchronicity of mountain-glacier advance and retreat during the last ice age. Deep insights into the climate dynamics come from co-author Joellen Russell, climate scientist at the University of Arizona and Thomas R.
Brown Distinguished Chair of Integrative Science. Following on her longstanding efforts at modeling the climatic modulation of the westerly winds, she evaluated simulations done as part of the Southern Ocean Model Intercomparison Project, part of the Southern Ocean Carbon and Climate Observations and Modeling initiative. The modeling showed that changes to the southern wind systems have profound consequences for the global heat budget, as monitored by glacier systems. The "switch" takes its name from Zealandia, a largely submerged continental platform about a third of the size of Australia, with the islands of New Zealand being the largest emergent parts. Zealandia presents a physical impediment to ocean current flow. When the westerly wind belt is farther north, the southward flow of warm ocean water from the tropical Pacific is directed north of the New Zealand landmass (glacial mode). With the wind belt farther south, warm ocean water extends to the south of New Zealand (interglacial mode). Computer modelling shows that global climate effects arise from the latitude at which the westerlies are circulating. A southward shift of the southern westerlies invigorates water circulation in the South Pacific and Southern oceans, and warms the surface ocean waters across much of the globe. The researchers hypothesize that subtle changes in the Earth's orbit affect the behavior of the Southern Hemisphere westerly winds, and that this behavior lies at the heart of global ice age cycles. This perspective is fundamentally different from the long-held view that orbital influences on the extent of Northern Hemisphere continental ice sheets regulate ice age climates. Adding weight to the Zealandia Switch hypothesis is that the Southern Hemisphere westerlies regulate the exchange of carbon dioxide and heat between the ocean and atmosphere, and, thus, exert a further influence on global climate. "Together with interhemispheric paleoclimate records and with the results of coupled ocean-atmosphere climate modeling, these findings suggest a big, fast and global end to the last ice age in which a southern-sourced warming episode linked the hemispheres," according to the researchers, whose work was funded by the Comer Family Foundation, the Quesada Family Foundation, the National Science Foundation and the New Zealand government. The last glacial termination was a global warming episode that led to extreme seasonality (winter vs. summer conditions) in northern latitudes by stimulating a flush of meltwater and icebergs into the North Atlantic from adjoining ice sheets. Summer warming led to freshwater influx, resulting in widespread North Atlantic sea ice that caused very cold northern winters and amplified the annual southward shift of the Intertropical Convergence Zone and the monsoonal rain belts. Although this has created an impression of differing temperature responses between the polar hemispheres, the so-called "bipolar seesaw," the researchers suggest this is due to contrasting interregional effects of global warming or cooling.
A succession of short-lived, abrupt episodes of cold northern winters during the last ice age is suggested to have been caused by temporary shifts of the Zealandia Switch mechanism. The southward shift of the Southern Hemisphere westerlies at the termination of the last ice age was accompanied by a gradual but sustained release of carbon dioxide from the Southern Ocean, which may have helped to lock the climate system into a warm interglacial mode. The researchers suggest that the introduction of fossil CO2 ... "The mapping and dating of mid-latitude Southern Hemisphere mountain-glacier moraines leads us to the view that the latitude and strength of the austral westerlies, and their effect on the tropical/subtropical ocean, particularly in the region spanning the Indo-Pacific Warm Pool and Tasman Sea through to the Southern Ocean, provides an explanation for driving orbital-scale global shifts between glacial and interglacial climatic modes, via the Zealandia Switch mechanism," the research team wrote. "Such behavior of the ocean-atmosphere system may be operative in today's warming world, introducing a distinctly nonlinear mechanism for accelerating global warming due to atmospheric CO2
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312121306.htm
|
'Magical' fire suppressant kills zombie fires 40% faster than water alone
|
The researchers say this is a big step in tackling smouldering peat fires, which are the largest fires on Earth. They ignite very easily, are notoriously difficult to put out, and release up to 100 times more carbon into the atmosphere than flaming fires, contributing to climate change.
|
The fires, known as 'zombie fires' for their ability to hide and smoulder underground and then reanimate as new flames days or weeks after the wildfire has been extinguished, are prevalent in regions like Southeast Asia, North America, and Siberia. They are driven by the burning of soils rich in organic content like peat, which is a large natural reservoir of carbon. Worldwide, peat fires account for millions of tonnes of carbon released into the atmosphere each year. Firefighters currently use millions to billions of litres of water to tackle a single peat fire: the 2008 Evans Road peat fire in the USA consumed 7.5 billion litres of water, and the 2018 Lake Cobrico peat fire in Australia consumed 65 million litres. However, when water alone is used to extinguish peat fires, it tends to create a few large channels in the soil, diverting the water from nearby smouldering hotspots where it is most needed. This is partly why they can take so long to extinguish. Now, researchers at Imperial College London have combined water with an environmentally friendly fire suppressant that is already used to help extinguish flaming wildfires, to measure its effectiveness against peat fires at different concentrations. During laboratory experiments at Imperial's HazeLab, they found that adding the suppressant to water helped them put out peat fires nearly twice as fast as using water alone, while using only a third to a half of the usual amount of water. Lead author Muhammad Agung Santoso of Imperial's Department of Mechanical Engineering said: "The suppressant could enable firefighters to put out peat fires much faster while using between a third to half of the amount of water. This could be critical in ending pollution-related deaths, devastation of local communities, and environmental damage caused by these fires." The results have now been published. The suppressant, also known as a 'wetting agent', increases the penetrating properties of liquids like water by reducing their surface tension. This agent is made from plant matter and is biodegradable, so it doesn't harm the environment. The researchers mixed the wetting agent with water at three concentrations: 0% (pure water), 1% (low concentration), and 5% (high concentration). They used each concentration on a laboratory peat fire with varying rates of flow between 0.3 and 18 litres per hour. They found that the suppressant reduced the surface tension of the liquid, which made it less likely to create large channels and instead flow uniformly through the soil. Low-concentration solutions reduced the average fire suppression time by 39%, and the high-concentration solution reduced it by 26%, but more consistently. The average volume of liquid needed for suppression was 5.7 litres per kilogram of burning peat, regardless of flow rate or suppressant concentration. They also learned that the agent acts thermally and not chemically: it encapsulates the fire to bring down the temperature and remove the 'heat' element from the fire triangle. The other two essential elements for fire are oxygen and fuel. Senior author Professor Guillermo Rein, Head of Hazelab at Imperial's Department of Mechanical Engineering, said: "Fighting peat fires uses an incredible amount of work, time and water, and this biodegradable wetting agent could help everybody: fire brigades, communities and the planet.
This magical suppressant could make it easier to put zombie fires to rest for good." The results provide a better understanding of the suppression mechanism of peat fires and could help to improve firefighting and mitigation strategies. The researchers are now looking to replicate their findings in controlled peat fires outside the lab, in real peatlands. This research was funded by the European Research Council and the Indonesia Endowment Fund for Education.
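A rough back-of-envelope based on the reported average of 5.7 litres of solution per kilogram of burning peat illustrates how little active agent the mixes actually require; the fire size below is invented for illustration, not a case from the study.

```python
# Back-of-envelope: suppressant needed for a hypothetical peat fire, using the
# reported average of ~5.7 litres of solution per kilogram of burning peat.
# The fire mass is an invented example, not a case from the study.
litres_per_kg = 5.7
burning_peat_kg = 10_000          # hypothetical 10-tonne smouldering peat fire

for concentration in (0.0, 0.01, 0.05):     # pure water, 1% and 5% mixes
    solution_l = litres_per_kg * burning_peat_kg
    agent_l = solution_l * concentration
    print(f"{concentration:>4.0%} mix: {solution_l:,.0f} L solution, "
          f"{agent_l:,.0f} L wetting agent")
```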
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312095759.htm
|
Sea-level rise drives wastewater leakage to coastal waters
|
When people think of sea level rise, they usually think of coastal erosion. However, recent computer modeling studies indicate that coastal wastewater infrastructure, which includes sewer lines and cesspools, is likely to flood with groundwater as sea level rises.
|
A new study, published by University of Hawai'i (UH) at Manoa earth scientists, is the first to provide direct evidence that tidally driven groundwater inundation of wastewater infrastructure is occurring today in urban Honolulu, Hawai'i. The study shows that higher ocean water levels are leading to wastewater entering storm drains and the coastal ocean -- creating negative impacts on coastal water quality and ecological health. The study was led by postdoctoral researcher Trista McKenzie and co-authored by UH Sea Grant coastal geologist Shellie Habel and Henrietta Dulai, advisor and associate professor in the UH Manoa School of Ocean and Earth Science and Technology (SOEST). The team assessed coastal ocean water and storm drain water in low-lying areas during spring tides, which serve as an approximation of future sea levels. To understand the connection between wastewater infrastructure, groundwater and the coastal ocean, the researchers used chemical tracers to detect groundwater discharge and wastewater present at each site. Radon is a naturally occurring gas that reliably indicates the presence of groundwater, while wastewater can be detected by measuring specific organic contaminants from human sources, such as caffeine and certain antibiotics. "Our results confirm that indeed, both groundwater inundation and wastewater discharge to the coast and storm drains are occurring today, and that both are tidally influenced," said McKenzie. "While the results were predicted, I was surprised by how prevalent the evidence for these processes was, and by its scale." In low-lying inland areas, storm drains can overflow every spring tide. This study demonstrated that, at the same time, wastewater from compromised infrastructure also discharges into storm drains. During high tides, storm drains become channels for untreated wastewater to flood streets and sidewalks. In addition to impeding traffic, including access by emergency vehicles, this flooding of contaminated water also poses a risk to human health. The team also found evidence that many of the human-derived contaminants were present in concentrations that pose a high risk to aquatic organisms. This has negative consequences for coastal organisms where the groundwater and storm drains discharge. "Many people may think of sea-level rise as a future problem, but in fact, we are already seeing the effects today," said McKenzie. "Further, these threats to human health, ocean ecosystems and the wastewater infrastructure are expected to occur with even greater frequency and magnitude in the future." This project demonstrates that actions to mitigate the impact of sea-level rise on coastal wastewater infrastructure in Honolulu are no longer proactive but are instead critical to addressing current issues. Through its multi-partner effort, the Hawai'i State Climate Commission also raises awareness of the variety of impacts of sea level rise, including those highlighted by this study. "Coastal municipalities should pursue mitigation strategies that account for increased connectivity between wastewater infrastructure and recreational and drinking water resources," said McKenzie. "We need to consider infrastructure that minimizes flooding opportunities and contact with contaminated water, and decreases the number of contaminant sources, such as installation of one-way valves for storm drains, decommissioning cesspools, monitoring defective sewer lines, and construction of raised walkways and streets."
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312084702.htm
|
Breast cancer: The risks of brominated flame retardants
|
Brominated flame retardants (BFRs) are found in furniture, electronics, and kitchenware to slow the spread of flames in the event of a fire. However, it has been shown that these molecules may lead to early mammary gland development, which is linked to an increased risk of breast cancer. The study on the subject by Professor Isabelle Plante from the Institut national de la recherche scientifique (INRS) made the cover of the February issue of the journal
|
Some flame retardants are considered endocrine disruptors, i.e. they interfere with the hormonal system. Since they are not directly bound to the material to which they are added, the molecules escape easily. They are then found in house dust, air and food. This exposure can cause problems for mammary glands, because their development is highly regulated by hormones. "BFRs pose a significant risk, particularly during sensitive periods, from intrauterine life to puberty and during pregnancy," says Professor Plante, co-director of the Intersectoral Centre for Endocrine Disruptor Analysis and environmental toxicologist. Endocrine disruptors, such as BFRs, can mimic hormones and cause cells to respond inappropriately. In their experiments, the research team exposed female rodents to a mixture of BFRs, similar to that found in house dust, prior to mating, during gestation and during lactation. The biologists were able to observe the effects on the offspring at two stages of development, and on the mothers. In pre-pubertal rats, the team noted early development of the mammary glands. For pubescent rats, the results, published in 2019, showed a deregulation of communication between cells. Similar consequences were observed in the mothers in a 2017 study. All of these effects are associated with an increased risk of breast cancer. Professor Isabelle Plante points out that peaks in human exposure to BFRs were observed in the early 2000s. "Young women exposed to BFRs in utero and through breastfeeding are now in the early stages of fertility. Their mothers are in their fifties, a period of increased risk for breast cancer," says Professor Plante. This is why the team is currently studying endocrine disruptors related to a predisposition to breast cancer, funded by the Breast Cancer Foundation and the Cancer Research Society. In all three studies, most of the effects were observed when subjects were exposed to the lowest dose, comparable to exposure from house dust, and not the higher doses. This observation raises questions about the current legislation for endocrine disruptors. "To evaluate the 'safe' dose, experts give an increasing dose and then, when they observe an effect, identify it as the maximum dose. With endocrine disruptors, the long-term consequences would be caused by lower doses," reports Professor Plante. Although counter-intuitive, this observation stems from the fact that high doses trigger a toxic response in the cells. When the body is exposed to lower doses, similar to the concentration of hormones in our body, the consequences instead consist of a deregulation of the hormonal system.
|
Environment
| 2,021 |
March 12, 2021
|
https://www.sciencedaily.com/releases/2021/03/210312140016.htm
|
Computing clean water
|
Water is perhaps Earth's most critical natural resource. Given increasing demand and increasingly stretched water resources, scientists are pursuing more innovative ways to use and reuse existing water, as well as to design new materials to improve water purification methods. Synthetically created semi-permeable polymer membranes used for contaminant solute removal can provide a level of advanced treatment and improve the energy efficiency of treating water; however, existing knowledge gaps are limiting transformative advances in membrane technology. One basic problem is learning how the affinity, or the attraction, between solutes and membrane surfaces impacts many aspects of the water purification process.
|
"Fouling -- where solutes stick to and gunk up membranes -- significantly reduces performance and is a major obstacle in designing membranes to treat produced water," said M. Scott Shell, a chemical engineering professor at UC Santa Barbara, who conducts computational simulations of soft materials and biomaterials. "If we can fundamentally understand how solute stickiness is affected by the chemical composition of membrane surfaces, including possible patterning of functional groups on these surfaces, then we can begin to design next-generation, fouling-resistant membranes to repel a wide range of solute types."Now, in a paper published in the "Solute-surface interactions in water determine the behavior of a huge range of physical phenomena and technologies, but are particularly important in water separation and purification, where often many distinct types of solutes need to be removed or captured," said Monroe, now a postdoctoral researcher at the National Institute of Standards and Technology (NIST). "This work tackles the grand challenge of understanding how to design next-generation membranes that can handle huge yearly volumes of highly contaminated water sources, like those produced in oilfield operations, where the concentration of solutes is high and their chemistries quite diverse."Solutes are frequently characterized as spanning a range from hydrophilic, which can be thought of as water-liking and dissolving easily in water, to hydrophobic, or water-disliking and preferring to separate from water, like oil. Surfaces span the same range; for example, water beads up on hydrophobic surfaces and spreads out on hydrophilic surfaces. Hydrophilic solutes like to stick to hydrophilic surfaces, and hydrophobic solutes stick to hydrophobic surfaces. Here, the researchers corroborated the expectation that "like sticks to like," but also discovered, surprisingly, that the complete picture is more complex."Among the wide range of chemistries that we considered, we found that hydrophilic solutes also like hydrophobic surfaces, and that hydrophobic solutes also like hydrophilic surfaces, though these attractions are weaker than those of like to like," explained Monroe, referencing the eight solutes the group tested, ranging from ammonia and boric acid, to isopropanol and methane. The group selected small-molecule solutes typically found in produced waters to provide a fundamental perspective on solute-surface affinity.The computational research group developed an algorithm to repattern surfaces by rearranging surface chemical groups in order to minimize or maximize the affinity of a given solute to the surface, or alternatively, to maximize the surface affinity of one solute relative to that of another. The approach relied on a genetic algorithm that "evolved" surface patterns in a way similar to natural selection, optimizing them toward a particular function goal.Through simulations, the team discovered that surface affinity was poorly correlated to conventional methods of solute hydrophobicity, such as how soluble a solute is in water. Instead, they found a stronger connection between surface affinity and the way that water molecules near a surface or near a solute change their structures in response. 
In some cases, these neighboring waters were forced to adopt structures that were unfavorable; by moving closer to hydrophobic surfaces, solutes could then reduce the number of such unfavorable water molecules, providing an overall driving force for affinity. "The missing ingredient was understanding how the water molecules near a surface are structured and move around it," said Monroe. "In particular, water structural fluctuations are enhanced near hydrophobic surfaces, compared to bulk water, or the water far away from the surface. We found that fluctuations drove the stickiness of every small solute type that we tested." The finding is significant because it shows that in designing new surfaces, researchers should focus on the response of water molecules around them and avoid being guided by conventional hydrophobicity metrics. Based on their findings, Monroe and Shell say that surfaces comprised of different types of molecular chemistries may be the key to achieving multiple performance goals, such as preventing an assortment of solutes from fouling a membrane. "Surfaces with multiple types of chemical groups offer great potential. We showed that not only the presence of different surface groups, but their arrangement or pattern, influence solute-surface affinity," Monroe said. "Just by rearranging the spatial pattern, it becomes possible to significantly increase or decrease the surface affinity of a given solute, without changing how many surface groups are present." According to the team, their findings show that computational methods can contribute in significant ways to next-generation membrane systems for sustainable water treatment. "This work provided detailed insight into the molecular-scale interactions that control solute-surface affinity," said Shell, the John E. Myers Founder's Chair in Chemical Engineering. "Moreover, it shows that surface patterning offers a powerful design strategy in engineering membranes that are resistant to fouling by a variety of contaminants and that can precisely control how each solute type is separated out. As a result, it offers molecular design rules and targets for next-generation membrane systems capable of purifying highly contaminated waters in an energy-efficient manner." Most of the surfaces examined were model systems, simplified to facilitate analysis and understanding. The researchers say that the natural next step will be to examine increasingly complex and realistic surfaces that more closely mimic actual membranes used in water treatment. Another important step to bring the modeling closer to membrane design will be to move beyond understanding merely how sticky a membrane is for a solute and toward computing the rates at which solutes move through membranes.
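The repatterning procedure described above is a genetic algorithm: candidate surface patterns are scored, the fittest are kept, and new candidates are produced by crossover and mutation. The sketch below shows that generic loop with a stand-in toy objective; in the actual work the scoring came from molecular simulations of solute-surface affinity, which are far too expensive to reproduce here.

```python
# Generic genetic-algorithm skeleton for "evolving" a patterned surface,
# in the spirit of the approach described. The scoring function is a
# stand-in toy objective, not the study's simulation-based affinity.
import random

N_SITES = 64          # surface sites, each hydrophilic (1) or hydrophobic (0)
POP, GENS = 40, 200

def score(pattern):
    # Toy objective: reward clustering of like groups by counting
    # same-type neighbors. A real evaluation would come from simulation.
    return sum(a == b for a, b in zip(pattern, pattern[1:]))

def mutate(pattern, rate=0.02):
    # Flip each site with a small probability.
    return [1 - g if random.random() < rate else g for g in pattern]

def crossover(p1, p2):
    # Single-point crossover between two parent patterns.
    cut = random.randrange(1, N_SITES)
    return p1[:cut] + p2[cut:]

population = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=score, reverse=True)      # rank by objective
    parents = population[: POP // 2]              # selection: keep the fittest
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=score)
print("best score:", score(best))
```

The same loop minimizes instead of maximizes by sorting in ascending order, which mirrors the paper's description of tuning a pattern to either attract or repel a given solute.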
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311190000.htm
|
Climate change damaging North America's largest temperate rainforest, harming salmon
|
New research released in
|
North America's largest remaining temperate rainforest, located in Southeast Alaska, is one of the most pristine and intact ecosystems. The entire ecosystem stretches well over 2,000 km from north to south and stores more carbon in its forests than any other. The region can store more than 1,000 tons per hectare of carbon in biomass and soil. Although the area is extremely remote, researchers say it is not immune to the negative impacts of climate change. Glaciers are disappearing faster than in most other places on Earth, and winter snows are turning into winter rains. This is leading to a change in stream temperatures, which can harm salmon, and changes in ground temperatures, causing the death of forests. "This is an incredible landscape. In a relatively compact area we have as much biomass carbon as 8% of the lower 48 states put together," said Buma. "The 200-foot trees, the deep soils -- it's just layers and layers of life. And that land is so intertwined with the water that any change in one means massive change in the other, downstream and into the ocean." Why is this important? Forests absorb more carbon than they release. Trees absorb carbon during photosynthesis, removing large amounts of carbon from the atmosphere. Since the forest is growing faster as the climate warms, a lot of that carbon "leaks" out through the creeks and rivers. This carbon powers downstream and marine ecosystems, which thrive on the flow of energy off the land. "This region is immensely important to global carbon cycles and our national carbon strategy, but we still don't know the direction overall carbon stocks and movement will take as the world warms," said Buma. "While there is ample research identifying how important this area is, more work is needed to determine where this large reservoir will trend in the future."
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311185932.htm
|
Whooping cranes steer clear of wind turbines when selecting stopover sites
|
As gatherings to observe whooping cranes join the ranks of online-only events this year, a new study offers insight into how the endangered bird is faring on a landscape increasingly dotted with wind turbines. The paper was published this week.
|
Avoidance of wind turbines can decrease collision mortality for birds, but can also make it more difficult and time-consuming for migrating flocks to find safe and suitable rest and refueling locations. The study's insights into migratory behavior could improve future siting decisions as wind energy infrastructure continues to expand. "In the past, federal agencies had thought of impacts related to wind energy primarily associated with collision risks," said Aaron Pearse, the paper's first author and a research wildlife biologist for the U.S. Geological Survey's Northern Prairie Wildlife Research Center in Jamestown, N.D. "I think this research changes that paradigm to a greater focus on potential impacts to important migration habitats." The study tracked whooping cranes migrating across the Great Plains, a region that encompasses a mosaic of croplands, grasslands and wetlands. The region has seen a rapid proliferation of wind energy infrastructure in recent years: in 2010, there were 2,215 wind towers within the whooping crane migration corridor that the study focused on; by 2016, when the study ended, there were 7,622 wind towers within the same area. Pearse and his colleagues found that whooping cranes migrating across the study area in 2010 and 2016 were 20 times more likely to select "rest stop" locations at least 5 km away from wind turbines than those closer to turbines. The authors estimated that 5% of high-quality stopover habitat in the study area was affected by the presence of wind towers. Siting wind infrastructure outside of whooping cranes' migration corridor would reduce the risk of further habitat loss, not only for whooping cranes, but also for millions of other birds that use the same land for breeding, migration, and wintering habitat.
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311185929.htm
|
How India's rice production can adapt to climate change challenges
|
As the global population grows, the demand for food increases while arable land shrinks. A new University of Illinois study investigates how rice production in India can meet future needs by adapting to changing climate conditions and water availability.
|
"Rice is the primary crop in India, China, and other countries in Southeast Asia. Rice consumption is also growing in the U.S. and elsewhere in the world," says Prasanta Kalita, professor in the Department of Agricultural and Biological Engineering at U of I and lead author on the study."If you look at where they traditionally grow rice, it is countries that have plenty of water, or at least they used to. They have tropical weather with heavy rainfall they depend on for rice production. Overall, about 4,000 liters of water go into production and processing per kilogram of rice," he states.Climate change is likely to affect future water availability, and rice farmers must implement new management practices to sustain production and increase yield, Kalita says.The United Nations' Food and Agriculture Organization (FAO) estimates the world population will grow by two billion people by 2050, and food demand will increase by 60%."We will need multiple efforts to meet that demand," Kalita states. "And with two billion more people, we will also need more water for crop production, drinking water, and industrial use."Kalita and his colleagues conducted the study at the Borlaug Institute for South Asia's research farm in Bihar, India. Farmers in the region grow rice during the monsoon season, when heavy rainfall sustains the crop.The researchers collected data on rice yield and climate conditions, then used computer simulations to model future scenarios based on four global climate models. The purpose of the study was to estimate rice yield and water demand by 2050, and evaluate how farmers can adapt to the effects of climate change."As the weather changes, it affects temperature, rainfall, and carbon dioxide concentration. These are essential ingredients for crop growth, especially for rice. It's a complicated system, and effects are difficult to evaluate and manage," Kalita states."Our modeling results show the crop growth stage is shrinking. The time for total maturity from the day you plant to the day you harvest is getting shorter. The crops are maturing faster, and as a result, you don't get the full potential of the yield."If farmers maintain current practices, rice yield will decrease substantially by 2050, the study shows. But various management strategies can mitigate the effects of climate change, and the researchers provide a series of recommendations.Traditional rice farming involves flooding the fields with water. Rice transplants need about six inches of standing water. If fields aren't level, it requires even more water to cover the crops, Kalita says. However, if farmers use direct-seeded rice instead of transplants, they can increase production while using significantly less water.Another practice involves soil conservation technology. "The soil surface continuously loses water because of temperature, humidity, and wind. If you keep crop residue on the ground, it reduces the evaporation and preserves water. Furthermore, when the crop residue decomposes, it will help increase soil quality," Kalita explains.The researchers also suggest implementing strategies to prevent post-harvest crop losses. FAO estimates about 30% of crops are lost or wasted after harvest, so efforts to reduce those losses can further increase crop availability and food security.Overall, the best approach to achieve a 60% increase in rice production while minimizing additional irrigation needs is a combination of conservation strategies and a 30% reduction in post-harvest loss, the researchers conclude.
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311152749.htm
|
The narwhal's tusk reveals its past living conditions
|
Every year, a new growth layer is added to the narwhal's spiralled tusk. The individual layers act as an archive of data that reveals what and where the animal has eaten, providing a glimpse of how the ice and environmental conditions have changed over its long life span (up to 50 years).
|
Like the rings in a tree trunk, a new growth layer is added to the narwhal's tusk every year, and the tusk grows longer and thicker throughout the animal's life. Because the tusk is connected to the rest of the body through blood, each new growth layer records aspects of the animal's physiology during the year it was formed. An international team of researchers has now studied each individual growth layer of the tusks from ten narwhals from North-West Greenland. They specifically analysed mercury and stable isotopes of carbon and nitrogen to give information on what the whales had eaten in each year of their life and on how the ice cover and the impact of potentially toxic compounds such as mercury have changed over time. Most people are familiar with the narwhal's impressive unicorn-like tusk (a canine tooth) that projects from the left side of the upper jaw of the males. Researchers do not fully agree on the purpose of the impressive narwhal tusk. Indications from recent years' research suggest that the tusk may be used when the narwhals search for food, but presumably the males also use the long tusk to impress the females. And it is, indeed, impressive -- this spiralled, pointed tusk can reach up to three metres long. Researchers have now shown that each layer of the tusk offers valuable data on the animals' living conditions from when they are born until they die. "It is unique that a single animal in this way can contribute with a 50-year long-term series of data. It is often through long time series that we as researchers come to understand the development of biological communities, and such series of unbroken data are very rare. Here, the data is a mirror of the development in the Arctic," says Professor Rune Dietz from the Department of Bioscience and the Arctic Research Centre, Aarhus University, Denmark, who headed the studies. The data have just been published. Among the biggest threats to Arctic top predators, such as the polar bear, white whale and narwhal, are climate change and the amount of mercury consumed by the animals. "The higher you are in the food chain, the more mercury you accumulate in your body throughout your life. Heavy metals and other environmental contaminants accumulate at each link in the food chain, so if you are at the top of the food chain, you end up consuming the greatest amount of mercury at each meal," explains Post-Doctoral Research Fellow Jean-Pierre Desforges, Natural Resource Department, McGill University, Canada, who co-headed the study. Elevated amounts of heavy metals in the body are toxic and affect cognitive functions, behaviour and the ability of a species to reproduce and defend itself against infections. The Minamata Convention, which attempts to limit global mercury pollution, entered into force under the UN in 2017. In the Arctic, climate change over the past 30-40 years has led to less sea ice. Many species depend on the ice when searching for food, for example polar bears, while other species use the ice as important breeding grounds, for example seals. For the narwhal, the ice acts as a protection against enemies like killer whales. Changes in temperature and the cover of sea ice also lead to invasion by new species from warmer areas. This affects the entire Arctic food chain and thus the living conditions of the individual species. "We have been able to trace this development in the narwhals' tusks. In each layer of the tusk, we measured the amount of mercury, just as we measured stable isotopes of carbon and nitrogen -- the so-called delta 13C (δ13C) and delta 15N (δ15N) values." The composition of the carbon and nitrogen isotopes in a layer of the tusk provides insight into the diet of each narwhal in the year from which the actual layer originated -- or rather, how high in the food chain the prey sat, and in which part of the ocean the animals lived. (A sketch of the standard delta notation follows below.) The tusks analysed by the researchers were 150 to 248 cm long and contained data from 1962 to 2010. "What we found in narwhal of Northwest Greenland is consistent with a more general trend across the Arctic, where sea ice is declining and changing the spatial distribution of sub-Arctic and Arctic fish as well as top predators. The big question now is how these changes will affect the health and fitness of key Arctic species in the years to come," says Jean-Pierre Desforges. The analyses of the tusks revealed three things in particular. Up until around 1990, the narwhals' food consisted particularly of prey linked to the sea ice, such as halibut and Arctic cod; during this period, the ice cover was extensive but varying. After 1990, the ice cover in North-West Greenland declined consistently year after year, and the diet of the narwhals changed to a dominance of open-ocean prey like capelin and polar cod. From 1990 until 2000, narwhals also accumulated relatively small quantities of mercury, as the new items of prey sat lower in the food chain. However, from around 2000, the amount of mercury increased significantly in the narwhal tusks without a simultaneous shift in food items. The researchers have also measured higher levels of mercury in other Arctic animals over the past few decades, and they attribute this to extensive emission of mercury, primarily from coal combustion in South-East Asia. The rise in mercury might also be due to changing sea-ice conditions in the Arctic as the climate warms, causing changes in the environmental mercury cycle in the Arctic. The development worries Rune Dietz and Jean-Pierre Desforges. "The narwhal is the Arctic mammal most affected by climate change. At the same time, whales lack the physiological properties to eliminate environmental contaminants. They don't get rid of mercury by forming hair and feathers like polar bears, seals and seabirds, and their enzyme system is less efficient at breaking down organic pollutants," explains Rune Dietz. However, the researchers behind the study see it as a positive sign that the narwhal has a greater ability to change its diet than previously believed. "With our new discoveries, we now know that there is a bank of data in the narwhal tusks found in museums around the world. By analysing them, we can hopefully get an insight into the narwhals' food strategy from different areas and periods many years back in time. This will provide us with a solid basis for evaluating how the species copes with the changed conditions that it now encounters in the Arctic," says Rune Dietz. The research team points out that valuable chronological information is also waiting in other types of biological material, for example the teeth of other species, hair, whale baleen, whale earwax plugs, shells from shellfish and year rings in trees.
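For readers unfamiliar with the isotope notation mentioned above, here is a minimal sketch of how δ values are computed. The reference standards shown (VPDB for carbon, atmospheric N2 for nitrogen) are the conventional choices, and the example ratio is illustrative -- neither is a measurement from this study.

```python
# Standard stable-isotope delta notation: the per-mil deviation of a sample's
# heavy/light isotope ratio (13C/12C or 15N/14N) from a reference standard.
def delta_permil(r_sample: float, r_standard: float) -> float:
    """Delta value in per mil (parts per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.011180    # 13C/12C of the Vienna Pee Dee Belemnite carbon standard
R_AIR_N2 = 0.003676  # 15N/14N of atmospheric N2, the usual nitrogen standard

# Illustrative example: a tusk layer whose 13C/12C ratio is 0.010980
print(f"d13C = {delta_permil(0.010980, R_VPDB):+.1f} per mil")  # ~ -17.9

# Each step up the food chain typically raises d15N by roughly 3-4 per mil,
# which is why d15N can indicate how high in the food chain the prey sat.
```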
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311152719.htm
|
Cheaper carbon capture is on the way
|
As part of a marathon research effort to lower the cost of carbon capture, chemists have now demonstrated a method to seize carbon dioxide (CO2) from power plant flue gas more cheaply than existing commercial technology.
|
In a study published in March 2021, researchers at Pacific Northwest National Laboratory (PNNL) described the performance of a water-lean solvent called EEMPA. "EEMPA has some promising qualities," said chemical engineer Yuan Jiang, lead author of the study. "It can capture carbon dioxide without high water content, so it's water-lean, and it's much less viscous than other water-lean solvents." Carbon capture methods are diverse. They range from aqueous amines -- the water-rich solvents that run through today's commercially available capture units, which Jiang used as an industrial comparison -- to energy-efficient membranes that filter CO2. At a cost of $400-$500 million per unit, commercial technology can capture carbon at roughly $58.30 per metric ton of CO2. Jiang's study described seven processes that power plants can adopt when using EEMPA, ranging from simple setups similar to those described in 1930s technology to multi-stage configurations of greater complexity. Jiang modeled the energy and material costs to run such processes in a 550-megawatt coal power plant, finding that each method coalesces near the $47.10 per metric ton mark. One of the first known patents for solvent-based carbon capture technology cropped up in 1930, filed by Robert Bottoms. "I kid you not," said green chemist David Heldebrant, coauthor of the new study. "Ninety-one years ago, Bottoms used almost the same process design and chemistry to address what we now know as a 21st century problem." The chemical process for extracting CO2 has changed remarkably little since then. "We wanted to hit it from the other side and ask, why are we not using 21st century chemistry for this?" Heldebrant said. So, in 2009, he and his colleagues began designing water-lean solvents as an alternative. The first few solvents were too viscous to be usable. "'Look,'" he recalled industry partners saying, "'your solvent is freezing and turning into glass. We can't work with this.' So, we said, OK. Challenge accepted." Over the next decade, the PNNL team refined the solvent's chemistry with the explicit aim of overcoming the "viscosity barrier." The key, it turned out, was to use molecules that aligned in a way that promoted internal hydrogen bonding, leaving fewer hydrogen atoms to interact with neighboring molecules. Heldebrant draws a comparison to children running through a ball pit: if two kids hold each other's hands while passing through, they move slowly. But if they hold their own hands instead, they pass as two smaller, faster-moving objects. Internal hydrogen bonding also leaves fewer hydrogen atoms to interact with overall, akin to removing balls from the pit. Where the team's solvent was once viscous like honey, it now flows like water from the kettle. EEMPA is 99 percent less viscous than PNNL's previous water-lean formulations, now nearly on par with commercial solvents, allowing it to be used in existing infrastructure, which is largely built from steel. Pivoting to plastic in place of steel, the team found, can further reduce equipment costs. Steel is expensive to produce, costly to ship and tends to corrode over time in contact with solvents. At one tenth the weight, substituting plastic for steel can drive the overall cost down another $5 per metric ton, according to a study led by Jiang in 2019. Pairing with plastic offers another advantage to EEMPA, whose reactive surface area is boosted in plastic systems. Because traditional aqueous amines can't "wet" plastic as well (think of water beading on Teflon), this advantage is unique to the new solvent. The PNNL team plans to produce 4,000 gallons of EEMPA in 2022 to analyze at a 0.5-megawatt scale inside testing facilities at the National Carbon Capture Center in Shelby County, Alabama, in a project led by the Electric Power Research Institute in partnership with Research Triangle Institute International. They will continue testing at increasing scales and further refine the solvent's chemistry, with the aim of reaching the U.S. Department of Energy's goal of deploying commercially available technology that can capture CO2 at lower cost.
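To put the per-ton figures above in context, here is a hedged back-of-the-envelope estimate, not a calculation from the paper: the emission factor, capacity factor and capture fraction below are generic assumptions for a 550-megawatt coal plant, while the two costs are the article's figures.

```python
# Illustrative annual savings from the reported cost gap; assumptions flagged.
PLANT_MW = 550                # plant size modeled in Jiang's study
CAPACITY_FACTOR = 0.80        # assumed, not from the study
EMISSION_KG_PER_KWH = 0.90    # assumed typical for coal, not from the study
CAPTURE_FRACTION = 0.90       # assumed typical design point, not from the study

COST_COMMERCIAL = 58.30       # $/metric ton CO2 (article figure)
COST_EEMPA = 47.10            # $/metric ton CO2 (article figure)

kwh_per_year = PLANT_MW * 1000 * 8760 * CAPACITY_FACTOR
tons_captured = kwh_per_year * EMISSION_KG_PER_KWH * CAPTURE_FRACTION / 1000
savings = tons_captured * (COST_COMMERCIAL - COST_EEMPA)

print(f"~{tons_captured / 1e6:.1f} Mt CO2 captured/yr -> ~${savings / 1e6:.0f}M saved/yr")
```

Under these assumptions, the $11.20 per-ton difference works out to roughly $35 million per year for a single plant.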
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311142030.htm
|
Climate change may not expand drylands
|
Does a warmer climate mean more dry land? For years, researchers projected that drylands -- including deserts, savannas and shrublands -- will expand as the planet warms, but new research from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) challenges those prevailing views.
|
Previous studies used atmospheric information, including rainfall and temperature, to make projections about future land conditions. The real picture is more complicated than that, said Kaighin McColl, Assistant Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at SEAS and senior author of the paper. "Historically, we have relatively good records of rainfall and temperature but really poor records of the land surface, things like soil moisture and vegetation," said McColl. "As a result, previous definitions of drylands are based only on how the atmosphere is behaving, as an approximation of the land surface. But models can now simulate both atmospheric and land conditions. By just looking directly at the land surface in climate models, we find that the models aren't showing a clear increase of drylands over time and that there is huge uncertainty about the global average state of drylands in the future." The research has now been published. "If you want to know if the land is going to get drier, if crops are going to fail or if a forest is going to dry out, you have to look at the land itself," said Alexis Berg, a research associate in McColl's lab and first author of the paper. "How much vegetation is there? Are the plants water stressed?" While climate models have historically focused on the atmosphere, modern climate models now also simulate vegetation behavior and land hydrology. For example, when plants absorb CO2 through pores in their leaves, they lose water at the same time; in a CO2-richer atmosphere they can keep those pores more tightly closed and so use water more efficiently. These effects have long been known, but previous atmospheric-only indicators of drylands just weren't capturing these land surface effects. "As the climate is warming, there is a divergence between atmospheric and land surface behavior," said Berg. To account for that divergence, McColl and Berg developed a new metric of drylands, based on land surface properties, including biological responses to higher atmospheric CO2. "Our research shows that while some drylands may expand, climate models don't project that there will be a dramatic and rapid global expansion of drylands," said McColl. However, simulating complex land processes remains challenging in global models. "There is still a lot of uncertainty about how vegetation and the water cycle will change in a warming world," said McColl. McColl and his team aim to reduce that uncertainty in future research by developing more accurate land surface models. This research was supported in part by a Winokur Seed Grant in Environmental Sciences from the Harvard University Center for the Environment.
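For context, here is a minimal sketch of the kind of atmosphere-only dryland metric the paper pushes back against. The precipitation-over-potential-evapotranspiration (P/PET) aridity index and the thresholds below are the commonly used UNEP-style values, not numbers from this study.

```python
# Conventional atmospheric dryland classification via the aridity index.
def aridity_class(p_mm: float, pet_mm: float) -> str:
    """Classify a site by annual precipitation P over potential evapotranspiration PET."""
    ai = p_mm / pet_mm
    if ai < 0.05:
        return "hyper-arid"
    if ai < 0.20:
        return "arid"
    if ai < 0.50:
        return "semi-arid"
    if ai < 0.65:
        return "dry sub-humid"
    return "humid"

# Warming raises PET, so a site can drift into a drier class by this metric
# even if its soil moisture and vegetation (the land-surface view) barely change.
print(aridity_class(400, 900))   # semi-arid
print(aridity_class(400, 2100))  # arid, after a large rise in PET
```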
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311123507.htm
|
Making green energy the default choice can help tackle climate change, study finds
|
Researchers studying the Swiss energy market have found that making green energy the default option for consumers leads to an enduring shift to renewables and thus has the potential to cut CO2 emissions.
|
The study, published today, found that both business and private customers largely accepted the default option, even though it was slightly more expensive, and that the switch to green sources proved a lasting one. Professor Ulf Liebe (University of Warwick), Dr Jennifer Gewinner and Professor emeritus Andreas Diekmann (both ETH Zurich) analysed data from two Swiss energy suppliers which between them supplied around 234,000 households and 9,000 businesses in urban and rural areas. Both companies restructured their products to offer a choice between conventional power, renewable power and "renewable plus," one company in 2009 and the other in 2016. Consumers were assigned the renewable package unless they opted out, a behavioural mechanism known to have success in a range of settings. Further analysis of customer data in the household sector showed that women were around 6 per cent more likely than men to accept the green default, while women business owners were 8 per cent more likely to stick with the renewable package. It is sometimes argued that the introduction of green energy defaults leads to an increase in energy use -- because the energy is 'clean', consumers are more relaxed about using it. Analysis of six years of energy consumption data showed no evidence of this. Commenting on these results, Professor Liebe said: "Our study shows that 'green defaults' have an immediate, enduring impact and as such should be part of the toolkit for policymakers and utility companies seeking to increase renewable energy consumption, not only among household customers but also in the business sector." Large green default effects can considerably reduce CO2 emissions.
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311123501.htm
|
New analysis of 2D perovskites could shape the future of solar cells and LEDs
|
An innovative analysis of two-dimensional (2D) materials from engineers at the University of Surrey could boost the development of next-generation solar cells and LEDs.
|
Three-dimensional perovskites have proved themselves remarkably successful materials for LED devices and solar panels in the past decade. One key issue with these materials, however, is their stability, with device performance decreasing more quickly than in other state-of-the-art materials. The engineering community believes the 2D variant of perovskites could provide answers to these performance issues. The newly published study analysed the effects of combining lead with tin inside the Ruddlesden-Popper structure to reduce the amount of toxic lead. This also allows for the tuning of key properties, such as the wavelengths of light that the material can absorb or emit at the device level -- improving the performance of photovoltaics and light-emitting diodes. Cameron Underwood, lead author of the research and postdoctoral researcher at the ATI, said: "There is rightly much excitement about the potential of 2D perovskites, as they could inspire a sustainability revolution in many industries. We believe our analysis of strengthening the performance of perovskite can play a role in improving the stability of low-cost solar energy and LEDs." Professor Ravi Silva, corresponding author of the research and Director of the ATI, said: "As we wean ourselves away from fossil energy sources to more sustainable alternatives, we are starting to see innovative and ground-breaking uses of materials such as perovskites. The Advanced Technology Institute is dedicated to being a strong voice in shaping a greener and more sustainable future in electronics -- and our new analysis is part of this continuing discussion." This research is sponsored by the EPSRC and the MUSICODE project.
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311123455.htm
|
Starting small to answer the big questions about photosynthesis
|
New scientific techniques are revealing the intricate role that proteins play in photosynthesis.
|
Despite being discovered almost 300 years ago, photosynthesis still holds many unanswered questions for science, particularly the way that proteins organise themselves to convert sunlight into chemical energy and, at the same time, protect plants from too much sunlight. Now a collaboration between researchers at the University of Leeds and Kobe University in Japan is developing a novel approach to the investigation of photosynthesis. Using hybrid membranes that mimic natural plant membranes and advanced microscopes, they are opening photosynthesis to nanoscale investigation -- the study of life at less than one billionth of a metre -- to reveal the behaviour of individual protein molecules. Dr Peter Adams, Associate Professor in the School of Physics and Astronomy at the University of Leeds, who supervised the research, said: "For many decades scientists have been developing an understanding of photosynthesis in terms of the biology of whole plants. This research is tackling it at the molecular level and the way proteins interact. A greater understanding of photosynthesis will benefit humankind. It will help scientists identify new ways to protect and boost crop yields, as well as inspire technologists to develop new solar-powered materials and components." The findings have been published in an academic journal. Photosynthesis happens when photons, or packets of light energy, cause pigments inside light-harvesting proteins to become excited. The way that these proteins arrange themselves determines how the energy is transferred to other molecules. It is a complex system that plays out across many different pigments, proteins and layers of light-harvesting membranes within the plant. Together, it regulates energy absorption, transfer, and the conversion of this energy into other useful forms. To understand this intricate process, scientists have been using a technique called atomic force microscopy, which can reveal components of a membrane that are a few nanometres in size. The difficulty is that natural plant membranes are very fragile and can be damaged by atomic force microscopy. But last year, the researchers at Kobe University announced that they had developed a hybrid membrane made up of natural plant material and synthetic lipids that acts as a substitute for a natural plant membrane -- and, crucially, is more stable when placed in an atomic force microscope. The team at the University of Leeds used the hybrid membrane and subjected it to atomic force microscopy and another advanced visualisation technique called fluorescence lifetime imaging microscopy, or FLIM. PhD researcher Sophie Meredith, also from the School of Physics at the University of Leeds, is the lead author of the paper. She said: "The combination of FLIM and atomic force microscopy allowed us to observe the elements of photosynthesis. It gave us an insight into the dynamic behaviours and interactions that take place. What is important is that we can control some of the parameters in the hybrid membrane, so we can isolate and control factors, and that helps with experimental investigation. In essence, we now have a 'testbed' and a suite of advanced imaging tools that will reveal the sub-molecular workings of photosynthesis." The research was supported by the Royal Society, the Biotechnology and Biological Sciences Research Council, the Engineering and Physical Sciences Research Council, the Medical Research Council and the Japan Society for the Promotion of Science.
|
Environment
| 2,021 |
March 11, 2021
|
https://www.sciencedaily.com/releases/2021/03/210311085321.htm
|
Mapping the best places to plant trees
|
Reforestation could help to combat climate change, but whether and where to plant trees is a complex choice with many conflicting factors. To combat this problem, researchers have now created an interactive tool that maps reforestation opportunity across the United States.
|
"Often the information we need to make informed decisions about where to deploy reforestation already exists, it's just scattered across a lot of different locations," says author Susan Cook-Patton, a Senior Forest Restoration Scientist at the Nature Conservancy. "Not everybody has the computer science experience to delve into the raw data, so we tried to bring this information together to develop a menu of options for reforestation, allowing people to choose what they would like to see in their community, state, or nation."The culmination of these efforts is the Reforestation Hub, a web-based interactive map that color-codes individual counties by reforestation opportunity or level of potential for successful reforestation. And the results show that there is a great deal of reforestation opportunity in the United States."There are up to 51.6 million hectares (about 200,000 square miles) of opportunity to restore forest across the United States after excluding productive cropland and other places where trees are infeasible," she says. "Those additional forested areas could absorb the emissions equivalent to all the personal vehicles in California, Texas, and New York combined."In addition to quantifying the amount of land that could yield viable forests, the Hub also identifies trends in how this opportunity is distributed throughout the country."While there's no single best place to restore forest cover, we did find a particularly high density of opportunity in the Southeastern United States," says Cook-Patton. "This is a region where carbon accumulation rates are high, costs are low, and there is a lot of opportunity to achieve multiple benefits like creating habitats for biodiversity, improving water quality, and climate mitigation."The map also quantifies the acreage of 10 individual opportunity classes -- or categories based on land ownership and quality. Some of these include pastures, post-burn lands, and floodplains. "The choice to plant trees really depends on what people want out of the landscape, whether it's controlling flood waters, improving urban environments, or recovering forests after a fire," she says.The researchers hope to create similar maps for other countries, an important next step for combating the global problem of climate change."We have about a decade to get climate change in check," Cook-Patton says, "and I am excited about the potential for this study to help accelerate decisions to invest in reforestation as a climate solution."
|
Environment
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310132335.htm
|
'Lost' ocean nanoplastic might be getting trapped on coasts
|
As plastic debris weathers in aquatic environments, it can shed tiny nanoplastics. Although scientists have a good understanding of how these particles form, they still don't have a good grasp of where all the fragments end up. Now, researchers reporting in an ACS journal have laboratory evidence that salt gradients near coasts can aggregate nanoplastics and trap them before they reach the open ocean.
|
There is a huge discrepancy between the millions of tons of plastic waste entering rivers and streams and the amount researchers have found in the oceans. As large pieces of plastic break apart into successively smaller fragments on their transit to the sea, some eventually wear down to nano-sized particles. Previous studies have shown that these nanoplastics congregate in well-mixed, stagnant salty water. Yet these results do not apply when the particles encounter dynamic changes in salt content, such as in estuaries, where rivers carrying freshwater meet tidal saltwater. So Hervé Tabuteau, Julien Gigault and colleagues wanted to perform laboratory experiments with micro-sized chambers mimicking the conditions measured in an estuary to show how nanoplastics interact and aggregate in this type of environment. To determine how nanoplastics move in estuarine waters, the team developed a lab-on-a-chip device. They introduced crushed 400-nm-wide polystyrene beads and freshwater into one side of the device, while injecting saltwater through another inlet. At the opposite end of the 1.7-cm-long device, the researchers collected the output. The team tested different flow rates, replicating the salt gradient and water movement they had measured in an estuary on the French Caribbean island of Guadeloupe. Nanoplastic aggregates up to 10 μm wide were detected within the zone of highest salt concentration in the flow chamber, regardless of how fast the water was moving. At the highest flow rate, only 12% of the nanoplastics were collected at the outlets; the remaining particles either clumped together and sank in the flow chamber or formed floating aggregates that stuck to the chamber's sides. The researchers say their results show estuaries and other coastal environments may filter out nanoplastics before they can enter the ocean.
|
Environment
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310132332.htm
|
Producing highly efficient LEDs based on 2D perovskite films
|
Energy-efficient light-emitting diodes (LEDs) have been used in our everyday life for many decades. But the quest for better LEDs, offering both lower costs and brighter colours, has recently drawn scientists to a material called perovskite. A recent joint research project co-led by scientists from City University of Hong Kong (CityU) has now developed a 2D perovskite material for highly efficient LEDs.
|
From household lighting to mobile phone displays, from the pinpoint lighting needed for endoscopy procedures to the light sources used to grow vegetables in space, LEDs are everywhere. Yet current high-quality LEDs still need to be processed at high temperatures and with elaborate deposition technologies -- which makes their production expensive. Scientists have recently realised that metal halide perovskites -- semiconductor materials with the same structure as the calcium titanate mineral, but with a different elemental composition -- are extremely promising candidates for next-generation LEDs. These perovskites can be processed into LEDs from solution at room temperature, largely reducing their production cost. Yet the electro-luminescence performance of perovskites in LEDs still has room for improvement. Led by Professor Andrey Rogach, Chair Professor at the Department of Materials Science and Engineering at CityU, and his collaborator Professor Yang Xuyong from Shanghai University, the team has found a kind of dimmer switch: a way to turn the light emission from perovskites up to a brighter level. They worked with two-dimensional (2D) perovskites (also known as Ruddlesden-Popper perovskites) and succeeded in realising very efficient and bright LEDs, with the best reported performance on both current efficiency and external quantum efficiency for devices based on this kind of perovskite. This work has now put perovskite LEDs close on the heels of current commercial display technologies, such as organic LEDs. The key to the powerful change lies in the addition of around 10% of a simple organic molecule called methanesulfonate (MeS). The 2D perovskites used by the team in this study are only nanometres thick. The MeS reconstructs the structure of the 2D perovskite nanosheets while at the same time enhancing exciton energy transfer between sheets of different thicknesses. Both of these changes have greatly enhanced the electro-luminescence of the thicker, green-emitting perovskite sheets within the 2D structure. The MeS is also useful in reducing the number of defects in the 2D perovskite structure. During the process of light production, where radiative recombination takes place, part of the excitons required for the process are "wasted" in non-radiative recombination, which produces no light. MeS reduces the number of uncoordinated Pb2+ ions, which act as such defects. The results of the research for producing better LEDs have been encouraging: the team reports a brightness of 13,400 candela/m2. "My CityU team has built up its expertise on perovskite materials to a very high level in a relatively short period of time, thanks to funding support from a Senior Research Fellowship by the Croucher Foundation. And already we see the benefit, especially in the outcomes detailed in this latest publication," said Professor Rogach. "The achieved high brightness, excellent colour purity, and commercial-grade operating efficiency mark 2D perovskites as extremely attractive materials for future commercial LEDs, and potentially also display technology. It's a tangible outcome from both fundamental and applied research into novel nano-scale materials," he adds.
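As a quick reference, not drawn from the paper itself, the two LED figures of merit named above can be sketched as follows; all numbers in the example are placeholders, apart from the article's 13,400 cd/m2 brightness figure.

```python
# Two standard LED figures of merit mentioned in the article.
def eqe(photons_out: float, electrons_in: float) -> float:
    """External quantum efficiency: photons emitted per electron injected."""
    return photons_out / electrons_in

def current_efficiency(luminance_cd_m2: float, current_density_a_m2: float) -> float:
    """Current efficiency in cd/A: luminance divided by current density."""
    return luminance_cd_m2 / current_density_a_m2

print(f"EQE = {eqe(2.0e17, 1.0e18):.1%}")                        # 20.0% (placeholder counts)
print(f"{current_efficiency(13400, 500):.1f} cd/A")              # 500 A/m2 is a placeholder
```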
|
Environment
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310122605.htm
|
Venom-extraction and exotic pet trade may hasten the extinction of scorpions
|
An article published by the researchers of the Biodiversity Unit at the University of Turku, Finland, highlights how the amateur venom-extraction business is threatening scorpion species. Sustainably produced scorpion venoms are important, for example, to the pharmacological industry. However, in recent years there has been a dramatic increase in the number of people involved in the trade, and vast numbers of scorpions are harvested from nature. This development is endangering the future of several scorpion species in a number of areas.
|
Scorpions have existed on Earth for over 430 million years. Currently comprising over 2,500 extant species, scorpions occur on almost all the major landmasses in a range of habitats from deserts to tropical rainforests and caves. All scorpions are predators and use their venom to subdue and paralyse prey, as well as for defence. Scorpion venoms are very complex, and they are used in biomedical research. Despite their reputation, most scorpion species are harmless to humans, and in only approximately 50 species is the venom life-threatening. Scorpion stings cause around 200 fatalities each year. "Interest towards scorpion venom has unfortunately led to the situation where enormous amounts of scorpions are collected from nature. For example, a claim was spread in social media in Iran that scorpion venom costs ten million dollars per litre. As the situation escalated, illegal scorpion farms were established in the country and tens of thousands of scorpions were collected into these farms. Simultaneously, businesses devoted to training people in captive husbandry and rearing, marketing, and bulk distribution of live scorpions began to flourish. As a result, many species are quickly becoming endangered," says Doctoral Candidate Alireza Zamani from the Biodiversity Unit at the University of Turku, Finland. Biodiversity loss is accelerating at an alarming rate because of population growth and the related unsustainable overexploitation of natural resources. According to the estimate of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), as many as a million species are in danger of becoming extinct in the next decades if this development is not slowed down. "It is important to understand that long before a species disappears, the number of individuals in its populations decreases and the species becomes endangered. This means that the risk of becoming extinct has increased. With scorpions, the pressure to overharvest populations for venom extraction and the exotic pet trade especially threatens species with a small range. Scorpions also breed relatively slowly when compared with several other invertebrates. In addition to the increased pressure to harvest these animals, they are also threatened by habitat destruction," notes Professor of Biodiversity Research Ilari E. Sääksjärvi from the University of Turku. Research has a very important role in stopping biodiversity loss. Our understanding of biodiversity is still inadequate, and as much as 80 percent of the living organisms on Earth are unknown to science. Protecting biodiversity requires more and more researched information. "Scorpion species are still poorly known. It is vital for the protection of scorpions to produce more information about the species and get them under conservation. At the moment, only a few scorpion species are protected. At the same time, we should ensure that local communities are sufficiently informed about scorpions and their situation. With knowledge, we can help people to understand that many species are endangered and in danger of becoming extinct due to overharvesting. It is also important to make sure that people understand that there is no market for the venom produced by amateur scorpion farms," says Zamani. The researchers of the Biodiversity Unit at the University of Turku are specialised in mapping out species in poorly documented areas. Each year, the researchers discover and describe dozens of species new to science. "These studies help us to better understand biodiversity loss and its drivers. Many species currently suffer from the exotic animal trade, which is enormous on a global scale. Our goal is to continue to shine a light also on scorpions. It is important that people understand these magnificent animals better. Their importance for humans is great as well. As species become extinct, we also lose all the possibilities that their complex venoms could offer, for example, to drug development," emphasises Professor Sääksjärvi.
|
Environment
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310122431.htm
|
Face masks and the environment: Preventing the next plastic problem
|
Recent studies estimate that we use an astounding 129 billion face masks globally every month -- that is 3 million a minute. Most of them are disposable face masks made from plastic microfibers.
|
"With increasing reports on inappropriate disposal of masks, it is urgent to recognize this potential environmental threat and prevent it from becoming the next plastic problem," researchers warn in a comment in the scientific journal The researchers are Environmental Toxicologist Elvis Genbo Xu from University of Southern Denmark and Professor of Civil and Environmental Engineering Zhiyong Jason Ren from Princeton University.Disposable masks are plastic products, that cannot be readily biodegraded but may fragment into smaller plastic particles, namely micro- and nanoplastics that widespread in ecosystems.The enormous production of disposable masks is on a similar scale as plastic bottles, which is estimated to be 43 billion per month.However, different from plastic bottles, (of which app. 25 pct. is recycled), there is no official guidance on mask recycle, making it more likely to be disposed of as solid waste, the researchers write.If not disposed of for recycling, like other plastic wastes, disposable masks can end up in the environment, freshwater systems, and oceans, where weathering can generate a large number of micro-sized particles (smaller than 5 mm) during a relatively short period (weeks) and further fragment into nanoplastics (smaller than 1 micrometer)."A newer and bigger concern is that the masks are directly made from microsized plastic fibers (thickness of ~1 to 10 micrometers). When breaking down in the environment, the mask may release more micro-sized plastics, easier and faster than bulk plastics like plastic bags," the researchers write, continuing:"Such impacts can be worsened by a new-generation mask, nanomasks, which directly use nano-sized plastic fibers (with a diameter smaller than 1 micrometer) and add a new source of nanoplastic pollution."The researchers stress that they do not know how masks contribute to the large number of plastic particles detected in the environment -- simply because no data on mask degradation in nature exists."But we know that, like other plastic debris, disposable masks may also accumulate and release harmful chemical and biological substances, such as bisphenol A, heavy metals, as well as pathogenic micro-organisms. These may pose indirect adverse impacts on plants, animals and humans," says Elvis Genbo Xu.Elvis Genbo Xu and Zhiyong Jason Ren have the following suggestions for dealing with the problem:
|
Environment
| 2,021 |
March 10, 2021
|
https://www.sciencedaily.com/releases/2021/03/210310122428.htm
|
How global sustainable development will affect forests
|
Global targets to improve the welfare of people across the planet will have mixed impacts on the world's forests, according to new research.
|
The United Nations' 17 key areas for global development -- known as the Sustainable Development Goals (SDGs) -- range from tackling poverty, hunger and sanitation to promoting clean energy, economic growth and reduced inequality. Many of these goals, such as improved peace and justice, good health and wellbeing, and quality education, will have a positive impact on the Earth's natural forests. But others, including creating new roads, industry and infrastructure, are likely to have detrimental consequences. The research, led by the University of Leeds, reviewed a wide range of existing academic papers on the UN's global goals. The findings, funded by the Natural Environment Research Council and the United Bank of Carbon, have now been published. Lead author Jamie Carr, of Leeds' School of Earth and Environment, said: "Almost none of the 17 goals are universally good or bad for forests. The only exception to this is the goal concerning education, for which all impacts were identified as beneficial. Well-being and social progress are also most commonly associated with beneficial outcomes. Overall, targets relating to energy and infrastructure have the potential to be the most damaging to the world's forest ecosystems. For example, negative impacts were associated with hard infrastructure including roads, railways, dams, housing and industrial areas. In particular, there is good evidence to suggest that roads designed to boost access to markets are especially damaging for forests. Other damaging impacts included efforts to combat cocaine-associated crime in Colombia. Despite having some forest benefits, coca crop eradication has been shown to result in cultivators simply moving their activities elsewhere or switching to even more damaging agricultural practices. Overall, beneficial impacts are more numerous than damaging ones, but are typically less well understood. This suggests an urgent need for increased research on these so that society and policymakers can take full advantage." Forest ecosystems also help to mitigate climate change while offering watershed protection and preventing soil erosion. In addition, about 1.6 billion people live near forests, and hundreds of millions depend on forest products in the form of fuel, food and timber. The UN goals to create a fairer society for everyone are broken down into 169 more specific targets. These were agreed by the organisation's General Assembly in 2015 and are intended to be achieved by 2030. The research team led by the University of Leeds included scientists from the University of Oxford, the International Union for Conservation of Nature, the United Nations Environment Programme World Conservation Monitoring Centre, and the International Institute for Environment and Development. They reviewed 466 academic papers on the UN targets, collecting 963 examples of impacts. They found that 63 of the 169 targets were likely to have effects on forests that were either damaging or beneficial, or in some cases both. Of the identified targets, 29 were potentially beneficial, 15 damaging and 19 had mixed impacts. Identifying and understanding these effects will help governments to avoid negative impacts while capitalising on the positive ones. Lead supervisors of the research were Professor Dominick Spracklen and Dr Susannah Sallu, of Leeds' School of Earth and Environment. Dr Sallu said: "Institutions working to help achieve the UN goals need to be aware that their actions can have negative implications for forests and the environmental services these forests provide. Inter-sectoral coordination between agriculture, energy, health, transport and forest sectors can help ensure future development does not cause unintended consequences. Inclusive planning involving a diverse range of society further minimises the potential for negative impacts."
|
Environment
| 2,021 |
March 5, 2021
|
https://www.sciencedaily.com/releases/2021/03/210305092414.htm
|
'Falling insect' season length impacts river ecosystems
|
Insects that fall from the surrounding forest provide seasonal food for fish in streams. Researchers at Kobe University and The University of Tokyo have shown that the length of this period has a profound effect on the food webs and ecosystem functions of streams.
|
These research results provide proof that changes in forest seasonality also affect the ecosystems of nearby rivers. This finding highlights the importance of predicting the effects of climate change on ecosystems. The research group consisted of Associate Professor SATO Takuya and post-graduate student UEDA Rui of Kobe University's Graduate School of Science, and Associate Professor TAKIMOTO Gaku of The University of Tokyo's Graduate School of Agricultural and Life Sciences. The results have now been published. Cold, clear flowing streams are home to many salmonid species, including red-spotted masu salmon, cherry salmon and Japanese char (hereafter referred to as 'stream fish'). These stream fish prefer to eat the terrestrial invertebrates that fall into the river from the surrounding forests. When there are many of these land-dwelling insects in the water, the stream fish tend not to eat the benthic invertebrates that reside in the river, such as amphipods and the young of aquatic insects. This results in a sustained large population of benthic invertebrates, which eat the leaves that fall into the water. Consequently, this high population in turn accelerates the speed at which leaves in the river are decomposed (the leaf breakdown rate, a measure of stream ecosystem functionality). Thus the presence of terrestrial invertebrates changes the diets of fish, which has a big impact (a positive indirect effect) on river food webs and ecosystem functions (Figure 1). The amount of terrestrial invertebrates that end up in rivers increases as trees grow new leaves in spring, reaches a peak in early summer, and then decreases as the trees lose their leaves in fall. This seasonal pattern is common to streams located in cool temperate to temperate zones. However, the period between the growth of new leaves and defoliation is short in forests at high latitudes and elevations, but long in forests at low latitudes and elevations. Therefore, even though there may be a similar total number of terrestrial invertebrates in rivers over the course of a year, it is likely that they are present in the water for intensive periods at high latitudes/elevations and prolonged periods at low latitudes/elevations. What kind of effect does the length of the terrestrial invertebrate supply period have on the food webs of streams and stream fish? This research study investigated the impacts that different supply periods had on red-spotted masu salmon (Oncorhynchus masou ishikawae). Outdoor experiments were conducted in large pools that mimic river ecosystems at Kyoto University's Wakayama Research Forest Station. The experiments were carried out from August until November 2016, and exactly the same total amount and type of terrestrial invertebrates (mealworms) were supplied in each experiment during the 90-day period. In the pulsed experiment groups, concentrated amounts of mealworms were supplied every day during the 30-day period in the middle of the 90-day experiment (i.e., from the 30th to 60th days), whereas the prolonged experiment groups were given a steady supply of mealworms -- a third of the pulsed groups' daily amount -- throughout the 90-day period (Photos B and C). Control groups that were not given terrestrial invertebrates were also set up. The following aspects were investigated: the salmonid fishes' stomach contents and body size, the number of benthic invertebrates, and the leaf breakdown rate. In the pulsed groups, it was difficult for the bigger fish to monopolize the mealworm supply because a large amount was given at each time, and therefore smaller fish were also able to eat mealworms (Figure 2A). After the experiment, it was found that there was little difference in size between fish in the pulsed groups (Figure 3), indicating that these conditions resulted in a community where it was difficult for individual fish to dominate the food supply. Conversely, in the prolonged groups, it was easy for larger fish to monopolize food flowing downstream, meaning that the out-competed smaller fish hardly ate any mealworms (Figure 2A). Post-experiment, a big variation in the size of fish was found in the prolonged groups (Figure 3), revealing that these conditions had resulted in a community where large fish could easily monopolize the food supply. Furthermore, individuals that had reached a mature size were found among the dominant fish in the prolonged groups, which is also indicative of an impact on population growth. In the pulsed groups, where both large and small fish could eat mealworms, there was a significant decrease in the amount of benthic invertebrates eaten by all fish compared with the control groups (Figure 2B). On the other hand, small fish tended to frequently consume benthic invertebrates in the prolonged groups, where large fish monopolized the mealworm supply (Figure 2B). Consequently, there was no significant decrease in the amount of consumed benthic invertebrates in the prolonged groups compared with the control. Benthic invertebrate populations were at their highest in the pulsed groups, where all salmon consumed fewer benthic invertebrates, resulting in the quickest breakdown of fallen leaves. On the other hand, in the prolonged groups, where smaller fish ate many benthic invertebrates, the numbers of these insects and the leaf breakdown rate did not reach the levels seen in the pulsed groups. In other words, the presence of terrestrial invertebrates changed the feeding habits of the fish, which had a positive indirect effect on benthic invertebrates and the leaf breakdown rate, and this impact was greater in the pulsed groups than in the prolonged groups. A significant contrast in the strength of this indirect effect between pulsed and prolonged groups was observed when a large percentage of the benthic invertebrates consisted of midges, which are easy for salmon to consume. However, the effect was not observed when isopods, which are rarely found in the stomach contents of salmon, made up a large percentage of the benthic invertebrates. The main reason for this last finding is believed to be that the fishes' dietary habits have little influence on benthic invertebrate numbers and the leaf breakdown rate in such circumstances: if the majority of the benthic invertebrates present are difficult for the fish to eat, then the fishes' diet is unlikely to shift from terrestrial to benthic invertebrates. This research provides initial proof that the length of the period in which forest-dwelling insects are present in rivers has an extensive impact on salmon growth rate and size distribution, stream food webs and ecosystem functions. In addition, the effect on stream ecosystems is more pronounced when there is a high population of benthic invertebrate species that are easy for salmon to consume. These results show the vital importance of studying organisms' seasonality, which connects ecosystems such as those of forests and rivers, in order to understand food web structures and ecosystem functions. Based on these research results, we can see how worldwide climate change is impacting the seasonality of organisms living in specific ecosystems, and these changes in turn are likely to have a significant ripple effect on the surrounding ecosystems. Investigating these aspects, and being able to understand and predict the domino effect that climate change has on ecosystem behavior, are important issues in the study of macrobiology. At present, the researchers have set up observation sites all across Japan, from Hokkaido in the north to Kyushu in the south. They are conducting longitudinal observations on the seasonal rise and decline of forest- and river-dwelling insects in collaboration with local researchers. Through a combination of wide-ranging, longitudinal species observations and outdoor experiments like the ones in this study, they hope to deepen our understanding of how climate change impacts ecosystems' seasonal aspects, with a view to being able to predict these effects.
|
Environment
| 2,021 |
March 5, 2021
|
https://www.sciencedaily.com/releases/2021/03/210305092409.htm
|
Speeding up commercialization of electric vehicles
|
Professor Byoungwoo Kang's team has developed a high-energy-density cathode material by controlling the local structure of Li-rich layered materials.
|
Researchers in Korea have developed a high-capacity cathode material that can be stably charged and discharged for hundreds of cycles without using the expensive cobalt (Co) metal. The day is fast approaching when electric vehicles can drive long distances on Li-ion batteries. Professor Byoungwoo Kang and Dr. Junghwa Lee of POSTECH's Department of Materials Science and Engineering have successfully developed a high-energy-density cathode material that can stably maintain charge and discharge for over 500 cycles without the expensive and toxic Co metal. The research team achieved this by controlling the local structure via a simple synthesis process for the Li-rich layered material that is attracting attention as the next-generation high-capacity cathode material. These research findings have now been published. The mileage and the charge-discharge cycle life of an electric vehicle depend on the unique properties of the electrode material in the rechargeable Li-ion battery. Electricity is generated when lithium ions flow back and forth between the cathode and anode. In the case of Li-rich layered materials, the number of cycles decreases sharply when a large amount of lithium is extracted and inserted. In particular, when a large amount of lithium is extracted and an oxygen reaction occurs in a highly charged state, a structural collapse occurs, rendering it impossible to maintain the charge-discharge properties or the high energy density over long-term cycling. This deterioration of the cycle property has hampered commercialization. The research team had previously revealed that the homogeneous distribution of atoms between the transition metal layer and the lithium layer of the Li-rich layered material can be an important factor in the activation of the electrochemical reaction and the cycle property of Li-rich layered materials. The team then conducted a follow-up study to control the synthesis conditions and increase the degree of the atoms' distribution in the structure. Building on the previously published solid-state reaction, the team developed a new, simple and efficient process that can produce a cathode material with an optimized atomic distribution. As a result, it was confirmed that the synthesized Li-rich layered material has a local structure that is optimized in terms of electrochemical activity and cycle property, enabling a large amount of lithium to be used reversibly. It was also confirmed that the oxygen redox reaction was driven stably and reversibly for several hundred cycles. Under these optimized conditions, the synthesized Co-free Li-rich layered material delivered a reversible energy density of 1,100 Wh/kg -- about 180% of the roughly 600 Wh/kg offered by conventionally commercialized high-nickel layered materials (e.g., LiNi0.8Mn0.1Co0.1O2). In particular, a stable structure was maintained even when a large amount of lithium was removed, with the material retaining about 95% of its capacity over 100 cycles and 83% over 500 cycles -- a breakthrough performance that promises stable, high energy over hundreds of cycles. "The significance of these research findings is that the cycle property, one of the important issues in next-generation high-capacity Li-rich layered materials, has been dramatically improved through relatively simple process changes," explained Professor Byoungwoo Kang of POSTECH. "This is noteworthy in that we have moved a step closer to commercializing next-generation Li-rich layered materials."
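A hedged sketch translating the retention figures quoted above into an average per-cycle fade rate, assuming (purely for illustration) geometric fade between the reported points:

```python
# Figures from the article; the geometric-fade assumption is ours.
retention_100 = 0.95  # ~95% capacity retained after 100 cycles
retention_500 = 0.83  # ~83% capacity retained after 500 cycles

fade_100 = 1 - retention_100 ** (1 / 100)
fade_500 = 1 - retention_500 ** (1 / 500)
print(f"~{fade_100 * 100:.3f}% average capacity loss per cycle over 100 cycles")  # ~0.051%
print(f"~{fade_500 * 100:.3f}% average capacity loss per cycle over 500 cycles")  # ~0.037%
print(f"Energy density vs. benchmark: {1100 / 600:.2f}x the 600 Wh/kg material")  # ~1.83x
```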
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304161058.htm
|
Field study shows icing can cost wind turbines up to 80% of power production
|
Wind turbine blades spinning through cold, wet conditions can collect ice nearly a foot thick on the yard-wide tips of their blades.
|
That disrupts blade aerodynamics. That disrupts the balance of the entire turbine. And that can disrupt energy production by up to 80 percent, according to a recently published field study led by Hui Hu, Iowa State University's Martin C. Jischke Professor in Aerospace Engineering and director of the university's Aircraft Icing Physics and Anti-/De-icing Technology Laboratory.Hu has been doing laboratory studies of turbine-blade icing for about 10 years, including performing experiments in the unique ISU Icing Research Tunnel. Much of that work has been supported by grants from the Iowa Energy Center and the National Science Foundation."But we always have questions about whether what we do in the lab represents what happens in the field," Hu said. "What happens over the blade surfaces of large, utility-scale wind turbines?"We all know about one thing that recently happened in the field. Wind power and other energy sources froze and failed in Texas during last month's winter storm.Hu wanted to quantify what happens on wind farms during winter weather and so several years ago began organizing a field study. But that was more complicated than he expected. Even in Iowa, where some 5,100 wind turbines produce more than 40% of the state's electricity (according to the U.S. Energy Information Association), he wasn't given access to turbines. Energy companies usually don't want their turbine performance data to go public.So Hu -- who had made connections with researchers at the School of Renewable Energy at North China Electric Power University in Beijing as part of an International Research Experiences for Students program funded by the National Science Foundation -- asked if Chinese wind farms would cooperate.Operators of a 34-turbine, 50-megawatt wind farm on a mountain ridgetop in eastern China agreed to a field study in January 2019. Hu said most of the turbines generate 1.5 megawatts of electricity and are very similar to the utility-scale turbines that operate in the United States.Because the wind farm the researchers studied is not far from the East China Sea, Hu said the wind turbines there face icing conditions more like those in Texas than in Iowa. Iowa wind farms are exposed to colder, drier winter conditions; when winter cold drops to Texas, wind farms there are exposed to more moisture because of the nearby Gulf of Mexico.As part of their field work, the researchers used drones to take photos of 50-meter-long turbine blades after exposure to up to 30 hours of icy winter conditions, including freezing rain, freezing drizzle, wet snow and freezing fog.The photographs allowed detailed measurement and analyses of how and where ice collected on the turbine blades. 
Hu said the photos also allowed researchers to compare natural icing to laboratory icing, and they largely validated the team's experimental findings, theories and predictions. "While ice accreted over entire blade spans, more ice was found to accrete on outboard blades, with the ice thickness reaching up to 0.3 meters (nearly 1 foot) near the blade tips," the researchers wrote in a paper recently published online. The researchers used the turbines' built-in control and data-acquisition systems to compare operation status and power production with ice on the blades against more typical, ice-free conditions. "That tells us what's the big deal, what's the effect on power production," Hu said. The researchers found that icing had a major effect: "Despite the high wind, iced wind turbines were found to rotate much slower and even shut down frequently during the icing event, with the icing-induced power loss being up to 80%," the researchers wrote. That means Hu will continue to work on another area of wind-turbine research -- finding effective ways to de-ice the blades so they keep spinning, and the electricity keeps flowing, all winter long.
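To put rough numbers on such a loss: a rotor's output scales with its power coefficient, so degraded blade aerodynamics map directly to lost megawatts. A sketch under assumed values (wind speed and power coefficients are illustrative, not from the paper):

```python
import math

# Rough, illustrative estimate of icing-induced power loss for a turbine
# like those in the study (50 m blades, ~1.5 MW rating). Wind speed and
# power coefficients below are assumptions, not values from the paper.

rho = 1.29            # cold-air density, kg/m^3
blade_length = 50.0   # m, as reported in the study
area = math.pi * blade_length ** 2   # rotor swept area, m^2
v = 9.0               # wind speed, m/s (assumed)
cp_clean = 0.40       # power coefficient, clean blades (assumed)
cp_iced = 0.08        # power coefficient with heavy ice (assumed)

p_clean = 0.5 * rho * area * v ** 3 * cp_clean
p_iced = 0.5 * rho * area * v ** 3 * cp_iced
loss = 1.0 - p_iced / p_clean
print(f"Clean: {p_clean / 1e6:.2f} MW, iced: {p_iced / 1e6:.2f} MW, "
      f"loss: {loss:.0%}")   # an 80% loss corresponds to Cp cut to a fifth
```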
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304145405.htm
|
Dramatic decline in western butterfly populations linked to fall warming
|
Western butterfly populations are declining at an estimated rate of 1.6% per year, according to a new report to be published this week in the journal Science.
|
"The monarch population that winters along the West Coast plummeted from several hundred thousand just a few years ago to fewer than 2,000 this past year," said Katy Prudic, an assistant professor of citizen and data science in the University of Arizona School of Natural Resources and the Environment and a co-author of the report. "Essentially, the western monarch is on the brink of extinction, but what's most unsettling is they are situated in the middle of the pack, so to speak, in our list of declining butterfly species."Declining population groups include butterflies that typically thrive in disturbed and degraded habitats, such as the cabbage white, or Pieris rapae, as well as species with broad migration ranges, such as the West Coast lady, or Vanessa annabella.The research team sourced more than 40 years of data collected by both expert and community scientists across the western United States to identify the most influential drivers in butterfly declines."When we talk about global change, it's often hard to tease out climate and land use change because they are happening simultaneously," Prudic said."Our report is unique in covering a wide area of relatively undeveloped land as compared to, for example, studies from heavily populated areas of western Europe," said Matthew Forister, a biology professor at the University of Nevada, Reno and lead author of the article. "The fact that declines are observed across the undeveloped spaces of the western U.S. means that we cannot assume that insects are OK out there far from direct human influence. And that's because the influence of climate change is, of course, not geographically restricted."The takeaway, Prudic said, is western U.S. butterflies are declining quickly, and autumn warming -- not spring warming or land use change -- is an important contributor to the decline.When it comes to seasonal warming, spring gets a lot of attention, but the warming climate affects temperatures year-round. Fall warming trends have been observed in 230 cities across the U.S., with the greatest fall temperature increases found in much of the Southwest."In Arizona, for example, the period between September and November has warmed about 0.2 degrees Fahrenheit per decade since 1895," said Michael Crimmins, a professor and climate science extension specialist in the UArizona Department of Environmental Science. "Fall temperatures have warmed especially fast since the late 1980s and it's not clear why."Declining butterfly populations have been observed in areas experiencing these fall warming trends across the West, according to the report. The authors suggest fall temperature increases may not only induce physiological stress on butterflies but may influence development and hibernation preparation. Warmer fall temperatures can also reduce the availability of food or host plants, and extend the length of time butterflies' natural enemies are active.In response to recent climate warming, one species in the eastern U.S. has taken flight. In a study published in the journal Over the course of 18 years, the swallowtail butterfly moved 324 kilometers north; that rate of expansion is more than 27 times faster than the average organism."Butterflies are pretty mobile. The swallowtails we worked on are more on the mobile side of things," said Keaton Wilson, who completed his postdoctoral research in the College of Agriculture and Life Sciences and is the lead author on the swallowtail article. 
"The tricky thing is they can move north, but they're really restricted by their host plants, or what the caterpillar larvae feed on. So, they can fly farther north, but they can't really hang out long without something for their young to eat."Both reports suggest butterflies are struggling to adapt to the changing climate, in urban as well as wild spaces. Conservation, management and restoration of public lands, especially in cooler riparian areas, will be critical to preventing further butterfly declines and extinction, Prudic said. However, it can't be assumed that protection of wild spaces without careful management is enough to stem the tide, she said."What we can do better is develop lands and gather resources we need more sustainably and with more sensitivity to conservation," said William Mannan, a conservation biologist in the UArizona School of Natural Resources and the Environment.Mannan points to creative strategies aimed at creating systems of connectivity and minimizing ecological impact, such as overhead power distribution and conservation plans that direct development away from areas of high biodiversity.Wet urban places, such as botanical gardens and restoration projects like Tucson's Santa Cruz River Heritage Project, are going to be more important than ever, Prudic said."Urban areas that have constant water flow are going to be critical refuges moving forward and likely areas of higher butterfly concentration," she said."The widespread butterfly declines highlight the importance of careful management of the lands that we do have control over, including our own backyards where we should use fewer pesticides and choose plants for landscapes that benefit local insects," Forister said.
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304125337.htm
|
Want to cut emissions that cause climate change? Tax carbon
|
Putting a price on producing carbon is the cheapest, most efficient policy change legislators can make to reduce emissions that cause climate change, new research suggests.
|
The case study was published recently in a peer-reviewed journal. "If the goal is reducing carbon dioxide in the atmosphere, what we found is that putting a price on carbon and then letting suppliers and consumers make their production and consumption choices accordingly is much more effective than other policies," said Ramteen Sioshansi, senior author of the study and an integrated systems engineering professor at The Ohio State University. The study did not examine how policy changes might affect the reliability of the Texas power system -- an issue that became acute and painful for Texas residents last month when a winter storm caused the state's power grid to go down. But it did evaluate other policies, including mandates that a certain amount of energy in a region's energy portfolio come from renewable sources, and found that they were either more expensive or not as effective as carbon taxes at reducing the amount of carbon dioxide in the air. Subsidies for renewable energy sources were also not as effective at reducing carbon dioxide, the study found. The researchers modeled what might happen if the government used these various methods to cut carbon emissions to 80% below the 2010 level by the end of 2040. They found that carbon taxes on coal and natural-gas-fired generating units could achieve those cuts at about half the cost of tax credits for renewable energy sources. The study was led by Yixian Liu, a former graduate student in Sioshansi's lab, who is now a research scientist at Amazon. It modeled the expenses and carbon reductions possible from five generation technologies -- wind, solar, nuclear, natural gas and coal-fired units -- along with the costs and carbon reductions associated with storing energy. Storing energy is crucial because it allows energy systems to manage renewable resources as the mix shifts from climate-change-causing fossil fuels -- natural gas and coal -- to cleaner sources like wind and solar. Sioshansi said the results of the study were not surprising, given that a similar program has been used to reduce levels of sulfur dioxide, one of the chemicals that causes acid rain. "We have known for the last 40 or more years that market-based solutions can work on issues like this," Sioshansi said. Although subsidies for renewable sources would work to decrease carbon emissions, the costs of those subsidies would be an issue, the study found. "If no one had to pay for the subsidies and they were truly free, that would be a great option," Sioshansi said. "Unfortunately, that is not how they work."
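The intuition behind pricing carbon can be shown with a toy merit-order example: once each technology's marginal cost includes a charge per tonne of CO2 emitted, the dispatch order shifts on its own, with no mandate or subsidy. All costs and emission factors below are illustrative assumptions, not figures from the study:

```python
# A minimal sketch of how a carbon price reorders generation choices.
# Costs and emission factors are invented for illustration.

techs = {
    # name: (marginal cost $/MWh, emissions tCO2/MWh)
    "wind":    (5.0, 0.0),
    "solar":   (7.0, 0.0),
    "nuclear": (25.0, 0.0),
    "gas":     (30.0, 0.4),
    "coal":    (20.0, 1.0),
}

def dispatch_order(carbon_price):
    """Rank technologies by marginal cost including the carbon price."""
    priced = {t: c + carbon_price * e for t, (c, e) in techs.items()}
    return sorted(priced, key=priced.get)

print(dispatch_order(0))    # coal dispatches ahead of gas and nuclear
print(dispatch_order(50))   # a $50/tCO2 price pushes coal to last
```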
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304100432.htm
|
Performance of methane conversion solid catalyst is predicted by theoretical calculation
|
Japanese researchers have developed a simulation method to theoretically estimate the performance of a heterogeneous catalyst by combining first-principles calculation (1) and kinetic calculation techniques. Until now, simulation studies have mainly focused on a single or limited number of reaction pathways, and it was difficult to estimate the efficiency of a catalytic reaction without experimental information.
|
Atsushi Ishikawa, Senior Researcher, Center for Green Research on Energy and Environmental Materials, National Institute for Materials Science (NIMS), computed reaction kinetic information from first-principles calculations based on quantum mechanics, and developed methods and programs to carry out kinetic simulations without using experimental kinetic results. He then applied the approach to the oxidative coupling of methane (OCM) reaction, an important process in the use of natural gas. He successfully predicted the yield of products, such as ethane, without experimental information on the reaction kinetics. He also predicted changes in yield depending on temperature and partial pressure, and the results faithfully reproduced existing experimental results. This research shows that computer simulation enables forecasting of reactant conversion and product selectivity even when experimental data are unavailable. The search for catalytic materials led by theory and calculation is expected to speed up. Furthermore, the method is highly versatile and can be applied not only to methane conversion catalysts but also to other catalyst systems, such as automobile exhaust gas purification, carbon dioxide reduction and hydrogen generation, and is expected to contribute to the realization of a carbon-free society. The research was supported by JST's Strategic Basic Research Program, Precursory Research for Embryonic Science and Technology (PRESTO). (1) First-principles calculation: a theoretical calculation method for solving quantum mechanical equations without using empirical parameters, often applied to atomic, molecular, solid and interface systems.
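The general workflow described (computed barrier in, predicted conversion out) can be sketched in a few lines. The activation barrier, prefactor and residence times below are invented placeholders, not values from the NIMS study:

```python
import math

KB_EV = 8.617e-5   # Boltzmann constant, eV/K

def rate_constant(ea_ev, prefactor, temperature):
    """Arrhenius rate constant from a first-principles barrier (eV)."""
    return prefactor * math.exp(-ea_ev / (KB_EV * temperature))

T = 1073.0                           # K, a typical OCM temperature
k_act = rate_constant(2.0, 1e13, T)  # CH4 activation (assumed barrier, eV)

# For a simple first-order step, conversion after residence time t is
# X = 1 - exp(-k t); a full microkinetic simulation predicts the same kind
# of quantity across many coupled pathways.
for t in (1e-4, 1e-3, 1e-2):
    conversion = 1 - math.exp(-k_act * t)
    print(f"t = {t:.0e} s -> CH4 conversion {conversion:.1%}")
```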
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304100336.htm
|
Misinformation, polarization impeding environmental protection efforts
|
A group of researchers, spanning six universities and three continents, are sounding the alarm on a topic not often discussed in the context of conservation -- misinformation.
|
In a recent study, the researchers argue that misinformation and polarization are now serious obstacles for conservation. "Outcomes, not intentions, should be the basis for how we view success in conservation," says Dr. Ford. "Misinformation related to vaccines, climate change, and links between smoking and cancer has made it harder for science to create better policies for people," he says. "Weaponizing information to attack other groups impedes our ability to solve problems that affect almost everyone. We wanted to know if these issues were also a problem for people working to conserve biodiversity." "Conservation is not perfect and things can go wrong. Sometimes people mean well, and harm ensues by accident. Sometimes people's actions are much more sinister." The study points to multiple examples of good intentions ending badly from across the globe, including the case of the Huemul deer in Patagonia National Park, Chile. "We reviewed one case where the primary objective of a newly established park was to protect the endangered Huemul deer. The goal was to make the landscape a little better for these deer in hopes of increasing the population," explains Dr. Lamb. "In doing so, they removed the domestic livestock from the park, and as a result, the natural predators in the system lost their usual food source and ate many of the deer, causing the population to decline further. It's a textbook case of misplaced conservation." Dr. Lamb points to other cases, including mass petitions against shark finning in Florida, although the practice was previously banned there; planting a species of milkweed in an attempt to save monarch butterflies, only to ultimately harm them; and, closer to home, the sharing of misinformation regarding the British Columbia grizzly bear hunt. "When we see province-wide policies like banning grizzly hunting, those go against the wishes of some local communities in some parts of the province -- and choosing to steamroll their perspectives is damaging relationships and alienating the partners we need on board to protect biodiversity," says Dr. Ford. He suggests a 'big tent' approach may help combat some of the problems. "We need to work together on the 90 per cent of goals that we share in common, as opposed to focusing on the 10 per cent of issues where we disagree. There are many clear wins for people and wildlife waiting to be actioned right now; we need to work together to make those happen," says Dr. Ford. Dr. Lamb says doing so is likely to improve cooperation among parties and increase the use of evidence-based approaches in conservation, ultimately suppressing the spread of misinformation and occurrences of polarization. "Although we're seeing some misplaced efforts, we're also seeing genuine care and good community energy in many of these cases -- we just need to find a way to harness this energy in the right direction."
|
Environment
| 2,021 |
March 4, 2021
|
https://www.sciencedaily.com/releases/2021/03/210304100400.htm
|
Scientists discover how microorganisms evolve cooperative behaviors
|
Interspecies interactions are the foundation of ecosystems, from soil to ocean to human gut. Among the many different types of interactions, syntrophy is a particularly important and mutually beneficial interspecies interaction where one partner provides a chemical or nutrient that is consumed by the other in exchange for a reward.
|
Syntrophy plays an essential role in global carbon cycles by mediating the conversion of organic matter to methane, which is about 30 times more potent than carbon dioxide as a greenhouse gas and is a source of sustainable energy. And in the human gut, trillions of microbial cells interact with each other and with other species to modulate the physiology of their human host. Deciphering the nature, evolution and mechanism of syntrophic interspecies interactions is therefore fundamental to understanding and manipulating microbial processes, bioenergy production and environmental sustainability. However, what drives these interactions, how they evolve, and how their disruption may lead to disease or ecosystem instability is not well understood. ISB researchers and collaborators aimed to tackle these fundamental questions to shed light on how interspecies interactions -- specifically, cooperation -- arise, evolve and are maintained. Their results provide a new window into the key roles of these interactions in industrial applications, and in the health and disease of humans, animals and plants. The study builds on prior work on syntrophic interactions between two microbes -- Desulfovibrio vulgaris (Dv) and Methanococcus maripaludis (Mm) -- that coexist in diverse environments (gut, soil, etc.) and play a central role in an important step of biogeochemical carbon cycling. With a multidisciplinary approach cutting across systems biology, microbiology, evolutionary biology and other disciplines, researchers analyzed massive amounts of genome sequence data generated from more than 400 samples. They investigated the temporal and combinatorial patterns in which mutations accumulated in both organisms over 1,000 generations, mapped lineages through high-resolution single-cell sequencing, and characterized the fitness and cooperativity of pairings of their individual isolates. The team uncovered striking evidence that mutations accumulated during evolution generate positive genetic interactions among rare individuals of a microbial community. These genetic interactions increase cooperativity within these rare microbial assemblages, enabling their persistence at very low frequency within a larger productive population. In addition, researchers discovered one of the first examples of parallel evolution -- the accumulation of mutations in similar genes across independently evolving populations -- underlying the evolution of both organisms in a mutualistic community. "This study is a significant step in understanding and manipulating early adaptive events in the evolution of mutualistic interactions, with a wide range of applications for biotechnology, medicine and the environment," said Dr. Serdar Turkarslan, senior research scientist in ISB's Baliga Lab and lead author of the recently published paper. These discoveries are of great interest because they explain how diverse microbial populations co-exist in dynamically changing environments, such as in reactors, lake sediments and the human gut.
Moreover, the study incorporates multiscale systems analysis of diverse kinds of longitudinal datasets and experimental testing of hypotheses to characterize a complex phenomenon that emerges from interactions at the genetic level between two members of a microbial community."The methodologies, insights and resources generated by this study will have wide applicability to the study of other interspecies interactions and evolutionary phenomena," said ISB Professor, Director and Senior Vice President Dr. Nitin Baliga, a co-corresponding author of the paper. "One of the fundamental questions of biology is whether we can predict and modulate interspecies interactions such as the ones between the pathogens and their host environment. If we understand what are the genes that drive interaction of pathogens with the host, we can design therapies to alter the host microenvironment and permissibility of infection, or by directly changing pathogen recognition. For industrial applications, we can quickly screen for the most cooperative mutations across different microbial consortia to facilitate production of public goods."
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303161651.htm
|
Bahamas were settled earlier than believed
|
Humans were present in Florida by 14,000 years ago, and until recently, it was believed the Bahamas -- located just across the Florida Straits -- were not colonized until about 1,000 years ago. But new findings from a team including a Texas A&M University at Galveston researcher prove that the area was colonized earlier, and the new settlers dramatically changed the landscape.
|
Peter van Hengstum, associate professor in the Department of Marine and Coastal Environment Science at Texas A&M-Galveston, and colleagues have recently had their findings published. Researchers generated a new environmental record from the Blackwood Sinkhole, which is flooded with 120 feet of groundwater without dissolved oxygen. This is important because it has pristinely preserved organic material for the last 3,000 years. Using core samples and radiocarbon dating, the team examined charcoal deposits from human fires thousands of years ago, indicating that the first settlers arrived in the Bahamas sooner than previously thought. "The Bahamas were the last place colonized by people in the Caribbean region, and previous physical evidence indicated that it may have taken hundreds of years for the indigenous people of the Bahamas -- called the Lucayans -- to move through the Bahamian archipelago that spans about 500 miles," van Hengstum said. While people were present in Florida more than 14,000 years ago at the end of the last ice age, he said, these people never crossed the Florida Straits to the nearby Bahamian islands, only 50 to 65 miles away. Meanwhile, the Caribbean islands were populated by people migrating from South America northward. Van Hengstum said the oldest archaeological sites in the southernmost Bahamian archipelago, from the Turks and Caicos Islands, indicate human arrival likely by 700 A.D. "But in the northern Bahamian Great Abaco Island, the earliest physical evidence of human occupation is skeletons preserved in sinkholes and blueholes," he said. "These two skeletons from Abaco date from 1200 to 1300 A.D. Our new record of landscape disturbance from people indicates that slash-and-burn agriculture likely began around 830 A.D., meaning the Lucayans rapidly migrated through the Bahamian archipelago in likely a century, or spanning just a few human generations." The team's other findings show how the Lucayans changed the new land. When the Lucayans arrived, Great Abaco Island was mostly covered with pine and palm forests and had a unique reptile-dominated ecosystem of giant tortoises and crocodiles. Increased deforestation and burning allowed pine trees to colonize and out-compete native palms and hardwoods. Large land reptiles began to disappear after 1000 A.D. A significant increase in intense regional hurricane activity around 1500 A.D. is thought to have caused considerable damage to the new pine forests, as indicated by a decrease in pine pollen in the sediment core. "The pollen record indicates that the pre-contact forest was not significantly impacted earlier in the record during known times when intense hurricane strike events were more frequent," van Hengstum said. "In our current world, where the intensity of the largest hurricanes is expected to increase over the coming decades, the current pine trees in the northern Bahamas may not be as resilient to the environmental impacts of these changes in hurricane activity." The study was funded by the National Science Foundation.
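The radiocarbon dating mentioned above rests on a simple decay calculation. A minimal sketch (the measured C-14 fraction is invented, and real labs apply calibration curves that are ignored here):

```python
import math

# Radiocarbon age from the remaining fraction of C-14. The activity ratio
# below is an invented illustration; calibration is deliberately omitted.

T_HALF = 5730.0                      # carbon-14 half-life, years
LAMBDA = math.log(2) / T_HALF        # decay constant, 1/years

def radiocarbon_age(activity_ratio):
    """Years elapsed, from the remaining fraction of C-14 (N/N0)."""
    return -math.log(activity_ratio) / LAMBDA

ratio = 0.87                         # assumed remaining fraction of C-14
age = radiocarbon_age(ratio)
print(f"N/N0 = {ratio} -> age ~ {age:.0f} years")
# ~1150 years before present, i.e. roughly the 9th century A.D.
```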
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303161643.htm
|
High end of climate sensitivity in new climate models seen as less plausible
|
A recent analysis of the latest generation of climate models -- known as CMIP6 -- provides a cautionary tale on interpreting climate simulations as scientists develop more sensitive and sophisticated projections of how the Earth will respond to increasing levels of carbon dioxide in the atmosphere.
|
Researchers at Princeton University and the University of Miami reported that newer models with a high "climate sensitivity" -- meaning they predict much greater global warming from the same levels of atmospheric carbon dioxide as other models -- do not provide a plausible scenario of Earth's future climate. Those models overstate the global cooling effect that arises from interactions between clouds and aerosols, and they project that clouds will moderate greenhouse gas-induced warming -- particularly in the northern hemisphere -- much more than climate records show actually happens, the researchers reported. Instead, the researchers found that models with lower climate sensitivity are more consistent with observed differences in temperature between the northern and southern hemispheres, and are thus more accurate depictions of projected climate change than the newer models. The study was supported by the Carbon Mitigation Initiative (CMI) based in Princeton's High Meadows Environmental Institute (HMEI). These findings are potentially significant when it comes to climate-change policy, explained co-author Gabriel Vecchi, a Princeton professor of geosciences and the High Meadows Environmental Institute and principal investigator in CMI. Because models with higher climate sensitivity forecast greater warming from greenhouse gas emissions, they also project more dire -- and imminent -- consequences, such as more extreme sea-level rise and heat waves. The high climate-sensitivity models forecast an increase in global average temperature of 2 to 6 degrees Celsius under current carbon dioxide levels. The current scientific consensus is that the increase must be kept under 2 degrees to avoid catastrophic effects. The 2016 Paris Agreement sets the threshold to 1.5 degrees Celsius. "A higher climate sensitivity would obviously necessitate much more aggressive carbon mitigation," Vecchi said. "Society would need to reduce carbon emissions much more rapidly to meet the goals of the Paris Agreement and keep global warming below 2 degrees Celsius. Reducing the uncertainty in climate sensitivity helps us make a more reliable and accurate strategy to deal with climate change." The researchers found that both the high and low climate-sensitivity models match global temperatures observed during the 20th century. The higher-sensitivity models, however, include a stronger cooling effect from aerosol-cloud interaction that offsets the greater warming due to greenhouse gases. Moreover, the models have aerosol emissions occurring primarily in the northern hemisphere, which is not consistent with observations. "Our results remind us that we should be cautious about a model result, even if the models accurately represent past global warming," said first author Chenggong Wang, a Ph.D. candidate in Princeton's Program in Atmospheric and Oceanic Sciences. "We show that the global average hides important details about the patterns of temperature change." In addition to the main findings, the study helps shed light on how clouds can moderate warming, both in models and in the real world, at large and small scales. "Clouds can amplify global warming and may cause warming to accelerate rapidly during the next century," said co-author Wenchang Yang, an associate research scholar in geosciences at Princeton.
"In short, improving our understanding and ability to correctly simulate clouds is really the key to more reliable predictions of the future."Scientists at Princeton and other institutions have recently turned their focus to the effect that clouds have on climate change. Related research includes two papers by Amilcare Porporato, Princeton's Thomas J. Wu '94 Professor of Civil and Environmental Engineering and the High Meadows Environmental Institute and a member of the CMI leadership team, that reported on the future effect of heat-induced clouds on solar power and how climate models underestimate the cooling effect of the daily cloud cycle."Understanding how clouds modulate climate change is at the forefront of climate research," said co-author Brian Soden, a professor of atmospheric sciences at the University of Miami. "It is encouraging that, as this study shows, there are still many treasures we can exploit from historical climate observations that help refine the interpretations we get from global mean-temperature change."
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303161612.htm
|
Climate change 'winners' may owe financial compensation to polluters
|
Climate change is generally portrayed as an environmental and societal threat with entirely negative consequences. However, some sectors of the global economy may actually end up benefiting.
|
New economic and philosophical research argues that policymakers must consider both the beneficial effects of climate change on "climate winners" as well as its costs in order to appropriately incentivize actions that are best for society and for the environment. The study, by researchers from Princeton University, University College Cork, and HEC Montréal, appears to be the first to develop a systematic, ethical framework for addressing climate winners -- as well as those harmed -- using financial transfers. Their approach, called "Polluter Pays, Then Receives," requires polluters to first compensate those most harmed by climate change. Subsequently, polluters would be eligible to receive compensation from those who are passively benefiting from climate change. "With a global issue like climate change, it's difficult for people to make decisions that account for the harm or benefit their actions cause, because those effects aren't directly or proportionally felt by that actor," said study co-author Kian Mintz-Woo, a former postdoctoral research associate at the Princeton University Center for Human Values and the Princeton School of Public and International Affairs. "Our research argues that payments are one way to help correct incentives: harms should be redressed, and then beneficial actions should be rewarded." Mintz-Woo recently joined the Department of Philosophy and the Environmental Research Institute at University College Cork. While globally the negative consequences of climate change are expected to far outweigh the benefits, some groups or places may experience net benefits. For example, countries at far northern latitudes or specific industries may see improved agricultural conditions, additional tourism, or lower energy costs. "Not systematically considering or accounting for beneficial climate effects makes it easier for climate impact skeptics to think that climate change discussions are oversimplified or alarmist," said study co-author Justin Leroux, professor of applied economics at HEC Montréal and CIRANO research fellow. "Another motivation of our study was to address the unfairness that arises when some benefit from climate change while others are suffering harm. It's a question of solidarity -- both sharing benefits that weren't truly earned and compensating losses that weren't the fault of those harmed." The authors say this compensation approach could be experimented with at a regional or national level before being introduced globally. They explore how it might be implemented in a federal nation using the example of Canada. A national carbon tax could be used to collect funds from greenhouse gas emitters. Those revenues would first and foremost be used to compensate victims. In addition, a corporate tax would be levied on the sectors of the economy that gain passively from climate change, like tourism, which could benefit from more tourists taking advantage of longer summers in Arctic regions. Revenue from the additional corporate tax would be shared with the greenhouse gas emitters. Introducing a policy to reward emitters may sound surprising, but the authors emphasize that those emitters would first need to pay for their harms before receiving any benefit payments. "Payments from passive winners to polluters could either help the polluters more fully compensate the groups that have been harmed by their actions or help fund the polluters' own climate adaptation responses," Mintz-Woo said.
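A toy sketch of the "Polluter Pays, Then Receives" ordering described above; all amounts are invented, and the treatment of surplus revenue is an assumption, since the paper's exact accounting isn't reproduced here:

```python
# Illustrative transfer flows. The key point is the ordering: victims are
# compensated first, and only then do polluters share in winners' gains.

carbon_tax_revenue = 100.0   # collected from emitters (arbitrary units)
harm_to_victims = 80.0       # assessed climate damages
winner_gain_tax = 30.0       # levied on sectors that passively benefit

to_victims = min(carbon_tax_revenue, harm_to_victims)
residual_tax = carbon_tax_revenue - to_victims   # assumed to stay public
to_polluters = winner_gain_tax                   # rebated after harms paid

print(f"Victims compensated: {to_victims}")
print(f"Unused tax revenue: {residual_tax}")
print(f"Transfer from winners to polluters: {to_polluters}")
```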
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303144817.htm
|
Small-scale fisheries offer strategies for resilience in the face of climate change
|
Coastal communities at the forefront of climate change reveal valuable approaches to foster adaptability and resilience, according to a worldwide analysis of small-scale fisheries by Stanford University researchers.
|
Globally important for both livelihood and nourishment, small-scale fisheries employ about 90 percent of the world's fishers and provide half the fish for human consumption. Large-scale shocks -- like natural disasters, weather fluctuations, oil spills and market collapse -- can spell disaster, depending on the fisheries' ability to adapt to change. In an assessment of 22 small-scale fisheries that experienced stressors, researchers revealed that diversity and flexibility are among the most important adaptive capacity factors overall, while access to financial assets was not as important for individual households as it was at the community scale. The research was published Jan. 23. "The idea of assets not being essential at the household level is an empowering finding because we looked at a lot of places in developing nations without a lot of assets," said lead author Kristen Green, a PhD student in the Emmett Interdisciplinary Program in Environment and Resources (E-IPER) at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "It shows we can invest in non-financial or non-asset-based adaptive mechanisms, and fishers can still adapt." The researchers measured adaptive capacity using a new framework with three response pathways: adapt, react and cope. Adaptation is defined as proactive planning or taking collective action, reaction as an unplanned response, and coping as passive acceptance of consequences. The team of 11 study authors determined whether or not each fishery community or household had capacity in the areas of knowledge, assets, diversity and flexibility, governance and institutions, and natural capital. "These adaptive capacity domains don't work in isolation -- it's the recipes or combinations that are important for successful adaptation," Green said. While previous research has calculated a quantitative or numerical resilience score for different regions and sectors, the focus on community response is fairly new, according to senior author Larry Crowder, the Edward Ricketts Provostial Professor and professor of biology in Stanford's School of Humanities and Sciences. "Millions of people are dependent on making a living in small-scale fisheries, and some of them are currently doing it better than others," said Crowder, who is also a senior fellow at the Stanford Woods Institute for the Environment. "If we can identify the features that allow communities and individuals to be better prepared for those perturbations -- in other words, to have an adaptive response -- then we can try to build that capacity in communities that don't have it." In one case study in their analysis, a tropical island in Vanuatu exhibited flexibility when a cyclone disrupted fishery reefs, infrastructure and fisher livelihoods. Because the fishers had agency over management of the marine area, they were able to temporarily open a previously closed section to maintain food supply and income. "Part of our findings run counter to the emerging conventional wisdom that making specialists of fishermen is a good thing," Crowder said.
"Historically, these fishers were generalists, and our findings suggest they're more able to adapt to fluctuating circumstances if they can maintain that generalist fishing approach."The researchers found that diversity and flexibility were important at every scale, for both community and household adaptive capacity in responding to acute and chronic stressors -- for example, being able to diversify fishing portfolios or shift to other means of income. In addition to climate stressors, the researchers assessed responses to biological, economic, political and social changes, as well as environmental degradation and overfishing. The patterns that emerged from the study may be applied to adaptive capacity in other sectors, such as agriculture or manufacturing.Using a broad "way of life" approach allowed the co-authors to consider what factors drive behavior, such as culture, heritage or spending time with their families -- not necessarily economics."From a Western perspective, sustainability would be a nice thing to have happen. But for people in these communities that are highly resource-dependent, it's not nice -- it's necessary," Crowder said. "Their future is potentially compromised if they and we don't help figure out how to make those lifestyles more sustainable in the long term."The analysis revealed several examples of how Western-style management -- such as imposing fixed protection areas or maximizing one product that will make the most money -- doesn't always work for small-scale fisheries.
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142622.htm
|
Ghosts of past pesticide use can haunt organic farms for decades
|
Although the use of pesticides in agriculture is increasing, some farms have transitioned to organic practices and avoid applying them. But it's uncertain whether chemicals applied to land decades ago can continue to influence the soil's health after switching to organic management. Now, researchers reporting in an ACS journal have found that pesticide residues can persist in soils for decades after conversion to organic farming, with measurable effects on soil life.
|
Fungicides, herbicides and insecticides protect crops by repelling or destroying organisms that harm the plants. In contrast, organic agriculture management strategies avoid adding synthetic substances, relying instead on a presumably healthy existing soil ecosystem. However, some organic farms are operating on land treated with pesticides in the past, and it's unclear whether pesticides have a long-lasting presence in organically managed fields -- and what the reverberations are for soil life, specifically microbes and beneficial soil fungi, years after their application. So Judith Riedo, Thomas Bucheli, Florian Walder, Marcel van der Heijden and colleagues wanted to examine pesticide levels and their impact on soil health on farms managed with conventional versus organic practices, as well as on farms converted to organic methods. The researchers measured surface soil characteristics and the concentrations of 46 regularly used pesticides and their breakdown products in samples taken from 100 fields managed with either conventional or organic practices. Surprisingly, the researchers found pesticide residues at all of the sites, including organic farms converted more than 20 years prior. Multiple herbicides and one fungicide remained in the surface soil after the conversion to organic practices, though the total number of synthetic chemicals and their concentrations decreased significantly the longer the fields were under organic management. According to the researchers, some of the pesticides could alternatively have contaminated the organic fields by traveling through the air, water or soil from nearby conventional fields. In addition, the team observed lower microbial abundance and decreased levels of a beneficial microbe in fields with higher numbers of pesticides, suggesting that the presence of these substances can decrease soil health. The researchers say future work should examine the synergistic effects of pesticide residues and other environmental stressors on soil health.
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142601.htm
|
Tenfold increase in CO2 emissions cuts needed to stem climate emergency
|
New research shows 64 countries cut their fossil CO2 emissions during 2016-2019, but the rate of cuts must increase tenfold to stem the climate emergency.
|
This first global stocktake by researchers at the University of East Anglia (UEA), Stanford University and the Global Carbon Project examined progress in cutting fossil CO2 emissions since the Paris Agreement. The annual cuts of 0.16 billion tonnes of CO2 achieved by those 64 countries are only about a tenth of the scale needed globally. While emissions decreased in 64 countries, they increased in 150 countries. Globally, emissions grew by 0.21 billion tonnes of CO2 per year during 2016-2019. The scientists' findings appear in the paper 'Fossil CO2 emissions in the post-COVID-19 era'. In 2020, confinement measures to tackle the COVID-19 pandemic cut global emissions by 2.6 billion tonnes of CO2. Prof Corinne Le Quéré, Royal Society Professor at UEA's School of Environmental Sciences, led the analysis. She said that countries' efforts to cut CO2 emissions are beginning to pay off but fall far short of the tenfold scaling-up needed, and that the drop in CO2 emissions during the pandemic underlines the scale of action required. "It is in everyone's best interests to build back better to speed the urgent transition to clean energy," she said. Annual cuts of 1-2 billion tonnes of CO2 are needed to address the climate emergency. Of the 36 high-income countries, 25 saw their emissions decrease during 2016-2019 compared to 2011-2015, including the USA (-0.7 per cent), the European Union (-0.9 per cent), and the UK (-3.6 per cent). Emissions decreased even when accounting for the carbon footprint of imported goods produced in other countries. Thirty of 99 upper-middle-income countries also saw their emissions decrease during 2016-2019 compared to 2011-2015, suggesting that actions to reduce emissions are now in motion in many countries worldwide. Mexico (-1.3 per cent) is a notable example in that group, while China's emissions increased 0.4 per cent, much less than the 6.2 per cent annual growth of 2011-2015. The growing number of climate change laws and policies appears to have played a key role in curbing the growth in emissions during 2016-2019. There are now more than 2,000 climate laws and policies worldwide. A full bounce-back in 2021 to previous CO2 emission levels remains a risk. Investments post-COVID continue to be overwhelmingly dominated by fossil fuels in most countries, in contradiction with climate commitments, including in the United States and China. The European Union, Denmark, France, the United Kingdom, Germany and Switzerland are among the few countries that have so far implemented substantial green stimulus packages with limited investments in fossil-based activities. Prof Rob Jackson of Stanford University co-authored the study. He said: "The growing commitments by countries to reach net zero emissions within decades strengthens the climate ambition needed at COP26 in Glasgow. Greater ambition is now backed by leaders of the three biggest emitters: China, the United States, and the European Commission." "Commitments alone aren't enough. Countries need to align post-COVID incentives with climate targets this decade, based on sound science and credible implementation plans." Prof Le Quéré added: "This pressing timeline is constantly underscored by the rapid unfolding of extreme climate impacts worldwide." The paper, 'Fossil CO2 emissions in the post-COVID-19 era', is published in Nature Climate Change.
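The arithmetic behind the "tenfold" headline, using the figures quoted above:

```python
# Figures from the article; everything else is arithmetic.

current_cuts = 0.16    # billion tonnes CO2 per year, 64 countries, 2016-2019
needed_low, needed_high = 1.0, 2.0   # billion tonnes CO2 per year needed

print(f"Scale-up needed: {needed_low / current_cuts:.0f}x to "
      f"{needed_high / current_cuts:.0f}x")   # ~6x to ~12x, order tenfold

covid_drop = 2.6       # billion tonnes CO2 cut in 2020 by confinement
print(f"The 2020 COVID drop equals ~{covid_drop / current_cuts:.0f} years "
      f"of pre-pandemic cuts")
```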
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142557.htm
|
How do you know where volcanic ash will end up?
|
When the Eyjafjallajökull volcano in Iceland erupted in April 2010, air traffic was interrupted for six days and then disrupted until May. Until then, models from the nine Volcanic Ash Advisory Centres (VAACs) around the world, which aim to predict when ash clouds interfere with aircraft routes, were based on tracking the clouds in the atmosphere. In the wake of this economic disaster for airlines, ash concentration thresholds were introduced in Europe, which are used by the airline industry when making decisions on flight restrictions. However, a team of researchers led by the University of Geneva (UNIGE), Switzerland, discovered that even the smallest volcanic ash does not behave as expected. Its results have recently been published.
|
The eruption of Iceland's Eyjafjallajökull volcano in 2010 not only disrupted global air traffic, but also called into question the forecast strategies used by the VAACs, which were based only on the spatial tracking of the ash cloud. A meeting of experts refined the strategies based on ash concentration thresholds and enabled flights to resume more quickly, while ensuring the safety of passengers and flight personnel. "During an explosive volcanic eruption, fragments ranging from a few microns to more than 2 metres are ejected from the volcanic vent," explains Eduardo Rossi, a researcher at the Department of Earth Sciences of the UNIGE Faculty of Sciences and the first author of the study. The larger the particles, the faster and closer to the volcano they fall, reducing the concentration of ash in the atmosphere. "This is why the new strategies have integrated concentration thresholds better defining the dangerousness for aircraft engines. From 2 milligrams per cubic metre, airlines must have an approved safety case to operate," says the Geneva-based researcher. Despite existing knowledge about ash clouds, several open questions remained unanswered after the 2010 Eyjafjallajökull eruption, including the discovery of particles in the UK that were much larger than expected. "We wanted to understand how this was possible by accurately analysing the ash particles from the Sakurajima volcano in Japan, which has been erupting 2-3 times a day for more than 50 years," says Costanza Bonadonna, a professor in the Department of Earth Sciences at UNIGE. By using adhesive paper to collect the ash before it hit the ground, the team of scientists had already observed during the Eyjafjallajökull eruption how micrometric particles group together into clusters, which are destroyed on impact with the ground. "It plays an important role in the sedimentation rate," notes Eduardo Rossi. "Once assembled in aggregates, these micrometre particles fall much faster and closer to the volcano than the models predict, because they are ultimately heavier than if they fell individually. This is called premature sedimentation." In Japan the UNIGE team made an important new discovery: the observation of the rafting effect. Using a high-speed camera, the volcanologists observed the sedimentation of the ash in real time and discovered previously unseen aggregates called cored clusters. "These are formed by a large particle of 100-800 microns -- the core -- which is covered by many small particles less than 60 microns," explains Costanza Bonadonna. "And this external layer of small particles can act like a parachute over the core, delaying its sedimentation. This is the rafting effect." The rafting effect had been theoretically suggested in 1993, but was ultimately declared impossible. Today, its existence is well and truly proven by direct observation and accurate theoretical analysis, made possible by high-speed cameras. "Working with Frances Beckett of the UK Met Office, we have carried out several simulations that have enabled us to answer the questions raised by the eruption of Eyjafjallajökull and the unexplained discovery of these oversized ash particles in the UK.
It was the result of this rafting effect, which delayed the fall of these aggregates," enthuses Eduardo Rossi. Now that the ash aggregates, the cored clusters and the rafting effect have been studied, it is a matter of collecting more accurate physical particle parameters so that one day they can be integrated into the operational models of the VAACs, for which size and density play a crucial role in calculating the concentration of ash in the atmosphere.
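The competing effects described here, aggregated fines falling prematurely and fine-particle shells "parachuting" a core, come down to settling velocities. A rough Stokes'-law sketch (particle density and air properties are assumptions, and Stokes' law overestimates speeds for the largest sizes, where the Reynolds number exceeds unity):

```python
# Terminal settling velocity of a small sphere in air (Stokes' law).
# Particle density and air properties are rough assumptions.

def stokes_velocity(diameter_m, rho_particle=2500.0,
                    rho_air=1.2, mu_air=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in air."""
    return (rho_particle - rho_air) * g * diameter_m ** 2 / (18 * mu_air)

for d_um in (10, 60, 300):
    v = stokes_velocity(d_um * 1e-6)
    print(f"{d_um:>4} um particle: ~{v:.4f} m/s")

# A 300-um core falls orders of magnitude faster than the 10-um fines
# riding on it, so aggregated fines sediment prematurely, while a shell
# of fines can slow the core it covers.
```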
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142553.htm
|
Ecosystems across the globe 'breathe' differently in response to rising temperatures
|
Land stores vast amounts of carbon, but a new study led by Cranfield University's Dr Alice Johnston suggests that how much of this carbon enters the atmosphere as temperatures rise depends on how far that land sits from the equator.
|
Ecosystems on land are made up of plants, soils, animals, and microbes -- all growing, reproducing, dying, and breathing in a common currency: carbon. And how much of that carbon is breathed out (known as ecosystem respiration) compared to how much is stored (through primary production) has consequences for climate change. A key concern is that if more carbon is respired than stored, the rate of climate change could accelerate even further. Yet some big assumptions are made in the models used to predict climate changes -- chiefly that ecosystem respiration rises with temperature at the same rate (doubling for a temperature rise of 10 °C) irrespective of the ecosystem itself. A new study, "Temperature thresholds to ecosystem respiration at a global scale," challenges that assumption. The study, funded by the Leverhulme Trust, shows that ecosystem respiration doesn't rise as strongly with temperature in warmer (Mediterranean and tropical) climates as in mild (temperate) climates, but shows an extreme rise with temperature in cold (boreal and tundra) climates. This finding contradicts several studies showing a static temperature-respiration relationship globally but agrees with observations made within different ecosystems. First author Dr Alice Johnston, Lecturer in Environmental Data Science at Cranfield University, said: "Ecosystems are extremely complex, and there is massive variation in how many and what type of plants, animals and microbes are present in one field compared to the next, let alone across global ecosystems. Given those shifting patterns in biodiversity we would expect changes in how ecosystem respiration responds to temperature, because different species exhibit different temperature sensitivities. Our study is very simple and doesn't capture all of that variation, but it does capture three distinct differences in the ecosystem respiration pattern across 210 globally distributed sites. Fundamentally, our results show that temperature has a weak effect on ecosystem respiration in Mediterranean and tropical ecosystems, a well understood effect in temperate ecosystems, and an outsized effect in boreal and tundra ecosystems. On the one hand, that's a concern because huge stores of carbon in cold climates could be released with rising temperatures. On the other hand, if CO2 ..." Professor Chris Venditti, Professor of Evolutionary Biology at the University of Reading, added: "The impact of plant diversity on the terrestrial carbon cycle is far better known than animal diversity. In the future, we need to focus our attention on identifying general but realistic ways to integrate whole community complexity into climate models. That way, we can determine biodiversity loss or gain tipping points beyond which biosphere carbon sinks are enhanced or diminished."
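The "doubles for a temperature rise of 10 °C" assumption is a Q10 of 2, and the study's finding amounts to saying the effective Q10 differs by biome. A sketch with assumed Q10 values chosen to mimic the weak, classic and strong responses described (not fitted values from the paper):

```python
# Ecosystem respiration scaled from a reference rate via a Q10 factor.
# The Q10 values per biome are illustrative assumptions.

def respiration(temp_c, r_ref=1.0, t_ref=15.0, q10=2.0):
    """Respiration rate at temp_c, given the rate r_ref at t_ref."""
    return r_ref * q10 ** ((temp_c - t_ref) / 10.0)

for biome, q10 in [("tropical (weak)", 1.4),
                   ("temperate (classic)", 2.0),
                   ("tundra (strong)", 3.5)]:
    increase = respiration(20.0, q10=q10) / respiration(15.0, q10=q10) - 1
    print(f"{biome:22s}: +{increase:.0%} respiration for a 5 degC warming")
```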
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142537.htm
|
Temperature and aridity fluctuations over the past century linked to flower color changes
|
Clemson University scientists have linked climatic fluctuations over the past century and a quarter with flower color changes.
|
Researchers combined descriptions of flower color from museum flower specimens dating back to 1895 with longitude- and latitude-specific climate data to link changes in temperature and aridity with color change in the human-visible spectrum (white to purple). "Species experiencing larger increases in temperature tended to decline in pigmentation, but those experiencing larger increases in aridity tended to increase in pigmentation," said Cierra Sullivan, a graduate student in the College of Science's Department of Biological Sciences and lead author of the recently published paper, titled "The effects of climate change on floral anthocyanin polymorphisms." Matthew Koski, an assistant professor of biological sciences, co-authored the paper. Previous research by Koski and his team, including Sullivan, showed that the ultraviolet-absorbing pigmentation of flowers increased globally over the past 75 years in response to a rapidly degraded ozone layer. That study discussed how flower color changes could influence the behavior of pollinators, which have UV photoreceptors that enable them to detect patterns not visible to human eyes. This study discusses plant color change visible to humans. "Although we see these changes in flower color, that doesn't inherently mean it's doomsday because the forest, plants and animals naturally respond to what's going on in their environment," Sullivan said. "Seeing changes is not necessarily bad, but it's something to which we should pay attention." Researchers selected 12 species with reported floral color polymorphisms in North America, representing eight families and 10 genera. Sullivan obtained herbarium specimen data from the Southeast Regional Network of Expertise and Collections (SERNEC), Consortium of Pacific Northwest Herbaria, Consortium of California Herbaria and the Consortium of Northeastern Herbaria. She also checked Clemson University Herbarium's physical collection for specimens not already represented in SERNEC. After researchers retrieved the date of specimen collection and latitudinal and longitudinal coordinates, they obtained historical bioclimatic data from the year and month each plant was collected. That data included monthly precipitation; minimum, maximum and mean temperature; minimum and maximum vapor pressure deficit (VPD); and dew point temperature. Vapor pressure deficit is the difference between how much moisture is in the air and the amount of moisture the air can hold when saturated. It has implications for drought stress in plants -- higher VPD means more water loss from plants. Researchers were able to get complete data sets for 1,944 herbarium specimens. They found variation among the 12 species.
Some increased in pigmentation, while others declined in color over the past century. "It was all tightly linked to how much climatic variation they experienced over time across their range," Koski said. Two of the species that tended to get lighter in pigmentation are found in the western parts of North America, which experienced more dramatic temperature changes than the eastern United States, where temperature increases were more moderate. "This study documents that flower color that is visually more obvious to humans is also responding to global change, but is responding to different factors such as temperature and drought," Koski said. He said such flower color changes are likely to affect plant-pollinator and plant-herbivore interactions and warrant further study. "Continued research will help give insight into how species will respond to the various aspects of climate change and which species are the most vulnerable to future climate projections," he said.
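Vapor pressure deficit, as defined above, can be computed from temperature and relative humidity. A minimal sketch using the common Tetens approximation (an assumption here; the study used gridded historical climate products, not this formula):

```python
import math

# VPD = saturation vapour pressure minus actual vapour pressure.
# Saturation vapour pressure via the Tetens approximation.

def saturation_vp_kpa(temp_c):
    """Saturation vapour pressure (kPa), Tetens formula."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vpd_kpa(temp_c, rel_humidity):
    """Vapour pressure deficit (kPa) at a given temperature and RH."""
    es = saturation_vp_kpa(temp_c)
    return es * (1.0 - rel_humidity)

print(f"25 degC, 60% RH -> VPD {vpd_kpa(25, 0.60):.2f} kPa")
print(f"35 degC, 20% RH -> VPD {vpd_kpa(35, 0.20):.2f} kPa (much drier)")
```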
|
Environment
| 2,021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303142446.htm
|
More extreme short-duration thunderstorms likely in the future due to global warming
|
Climate experts have revealed that rising temperatures will intensify future rainfall extremes at a much greater rate than average rainfall, with the largest increases in short thunderstorms.
|
New research by Newcastle University has shown that warming temperatures in some regions of the UK are the main drivers of increases in extreme short-duration rainfall intensities, which tend to occur in summer and cause dangerous flash flooding. These intensities are increasing at significantly higher rates than for winter storms. The study, led by Professor Hayley Fowler of Newcastle University's School of Engineering, highlights the urgent need for climate change adaptation measures, as heavier short-term rainfall increases the risk of flash flooding and extreme rainfall globally. In their newly published findings, the scientists report that rainfall extremes intensify with warming, generally at a rate consistent with increasing atmospheric moisture, which is what would be expected. However, the study has shown that temperature increases in some regions affect short-duration heavy rainfall extremes more than the increase in atmospheric moisture alone, with local feedbacks in convective cloud systems providing part of the answer to this puzzle. Professor Fowler said: "We know that climate change is bringing us hotter, drier summers and warmer, wetter winters. But, in the past, we have struggled to capture the detail in extreme rainfall events as these can be highly localised and occur in a matter of hours or even minutes. Thanks to our new research, we now know more about how really heavy rainfall might be affected by climate change. Because warmer air holds more moisture, rainfall intensity increases as temperatures rise. This new work shows that the increase in intensity is even greater for short and heavy events, meaning localised flash flooding is likely to be a more prominent feature of our future climate." The findings are also highlighted in an accompanying article. It is unclear whether storm size will increase or decrease with warming. However, the researchers warn that increases in rainfall intensity and the footprint of a storm can compound to substantially increase the total rainfall during an event. In recent years, short but significantly heavy rainfall events have caused much disruption across the UK. Recent examples include severe flooding and landslides in August 2020 and damage to the Toddbrook Reservoir, in the Peak District, in August 2019. Information about current and future rainfall intensity is critical for the management of surface water flooding, as well as for guidance on surface water management for new developments and sewer design.
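The "rate consistent with increasing atmospheric moisture" is the Clausius-Clapeyron scaling of roughly 7% more water vapour per degree Celsius; short-duration extremes exceeding it are often called super-CC. A sketch of what those rates imply (the 14% super-CC figure is an assumed illustration, not a number from the study):

```python
# Fractional increase in rainfall intensity after delta_t degrees of
# warming, for a given per-degree scaling rate. Rates: standard CC (7%/degC)
# and an assumed super-CC value for comparison.

def intensity_increase(delta_t, rate_per_degc):
    """Compound growth of rainfall intensity with warming."""
    return (1.0 + rate_per_degc) ** delta_t - 1.0

for label, rate in [("CC scaling", 0.07), ("super-CC (assumed)", 0.14)]:
    print(f"{label}: +{intensity_increase(3.0, rate):.0%} "
          f"for 3 degC of warming")
```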
|
Environment
| 2021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303141305.htm
|
Camera traps reveal newly discovered biodiversity relationship
|
In one of the first studies of its kind, an analysis of camera-trap data from 15 wildlife preserves in tropical rainforests has revealed a previously unknown relationship between the biodiversity of mammals and the forests in which they live.
|
Tropical rainforests are home to half of the world's species, but with species going extinct at a rapid pace worldwide, it's difficult for conservationists to keep close tabs on the overall health of ecosystems, even in places where wildlife is protected. Researchers found that observational data from camera traps can help. "In general, rainforest ecosystems are extremely diverse, and our study shows that mammal communities in rainforests can be predictably different, and these differences may be controlled, in part, by differences in plant productivity in forests," said Rice's Daniel Gorczynski, a graduate student in biosciences and corresponding author of a study featured on the cover of the Royal Society's flagship biological research journal. Gorczynski and more than a dozen co-authors, including his Ph.D. adviser, Rice ecologist Lydia Beaudrot, analyzed camera-trap photos from the Tropical Ecology Assessment and Monitoring Network (TEAM), which uses motion-activated cameras to monitor species trends in tropical forests in Asia, Africa and South America. Beaudrot, an assistant professor of biosciences, said the study's scientific contributions demonstrate the importance of having the same data collection replicated on the ground in forests all around the world. "The TEAM data are an incredible resource for basic and applied ecology and conservation," she said. "Given the pace of tropical forest loss, it is more important now than ever to use standardized camera-trap data to understand environmental and anthropogenic effects on wildlife." For each site, the researchers gathered data about all species of terrestrial mammals with an average body mass greater than 1 kilogram. All the mammal species studied at each site were treated as a single community, and data were compiled for communities with as many as 31 species and as few as five. The researchers also compiled the known functional traits for each species, such as body size, reproductive habits and diet. The combined functional traits of species in a community were used to calculate the community's "functional diversity," or the variety of roles in the forest's overall ecosystem that were filled by that community's species. "We found that species with unique characteristics -- for example, species that are very large or eat unique foods -- are relatively more common in forests with high productivity," Gorczynski said, referring to the measure ecologists use to characterize the overall rate of plant growth within a forest. The research also showed that species with unique characteristics were less common at sites with low productivity. "Higher productivity is thought to make rare resources, like certain food types that unique species often eat, more readily available, which unique species can capitalize on," he said. "And because they are unique, they don't have to compete as much with other species for rare resources, and they can persist at higher abundances." The species that are considered unique vary by site, he said. Examples include elephants, tapirs and ground-dwelling monkeys. Gorczynski said this relationship between mammal functional diversity and productivity had not been previously shown. "Most studies of rainforest mammals rely on range maps, which don't give you an idea of how common different species are," he said. "We were able to find this relationship because we used camera trap observations.
The observational data gives us an idea of how common different species are, which allows us to compare the relative abundances of species with different traits." Study co-author Jorge Ahumada, a wildlife scientist at Conservation International, said the study also shows that destructive human activities, like deforestation, decrease the diversity of species' traits in protected areas. "We found that in areas where local species extinctions have been documented due to significant deforestation or poaching, such as in Korup National Park in Cameroon, large carnivores like leopards and golden cats are the first to go," Ahumada said. "Without these apex predators, entire food chains can be thrown out of balance. Eventually, populations of smaller herbivores will skyrocket, forcing more competition for the same limited resources." He said "simply counting the number of species in a tropical forest does not provide a full picture" of biodiversity or ecosystem health. The researchers said more data science studies are needed to understand the ramifications of local species extinctions and address other fundamental questions in conservation, ecology and wildlife biology. The research was funded by the Gordon and Betty Moore Foundation, HP, the Northrop Grumman Foundation and other donors. TEAM data were provided by the TEAM Network, a collaboration between Conservation International, the Smithsonian Institution and the Wildlife Conservation Society.
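The summary does not specify which functional diversity metric the study used; a common choice in community ecology is the mean pairwise distance between species in a standardized trait space. A minimal sketch under that assumption, with a small hypothetical trait table standing in for the TEAM data:

# Minimal sketch of a functional diversity calculation: mean pairwise
# distance between species in a standardized trait space. The species,
# traits, and values below are hypothetical, for illustration only.
from itertools import combinations
import math

# Trait vectors: (log body mass in kg, litter size, diet breadth on a 1-5 scale)
traits = {
    "elephant":  (8.0, 1, 4),
    "tapir":     (5.3, 1, 2),
    "duiker":    (2.9, 1, 2),
    "leopard":   (4.0, 2, 3),
    "porcupine": (2.3, 2, 3),
}

def standardize(table):
    """Z-score each trait so no single trait dominates the distances."""
    cols = list(zip(*table.values()))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) for c, m in zip(cols, means)]
    return {sp: tuple((x - m) / s for x, m, s in zip(v, means, sds))
            for sp, v in table.items()}

def functional_diversity(table):
    """Mean Euclidean distance over all species pairs in trait space."""
    z = standardize(table)
    pairs = list(combinations(z.values(), 2))
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)

print(f"Functional diversity: {functional_diversity(traits):.2f}")

Standardizing each trait first keeps a single large-valued trait, such as body mass, from swamping the contribution of the others.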
|
Environment
| 2021 |
March 3, 2021
|
https://www.sciencedaily.com/releases/2021/03/210303081410.htm
|
A silver swining: 'Destructive' pigs help build rainforests
|
Wild pigs are often maligned as ecosystem destroyers, but a University of Queensland study has found they also cultivate biodiverse rainforests in their native habitats.
|
Dr Matthew Luskin has been researching the effect of native pigs in Malaysian rainforests and found their nests may be critical to maintaining diverse and balanced tree communities. "We've shown that wild pigs can support higher diversity ecosystems and are not just nuisances and pests, thanks to a beneficial effect of their nesting practices," Dr Luskin said. "Prior to giving birth, pigs build birthing nests made up of hundreds of tree seedlings, usually on flat, dry sites in the forest. "As they build their nests, the pigs kill many of the dominant seedlings and inadvertently reduce the abundance of locally dominant tree species, but usually not rarer local species, supporting tree diversity." Dr Luskin said wild pigs (Sus scrofa) are the same species from which domestic pigs descend, and both have generally been considered pests by farmers, land managers and conservationists. "Their negative impacts on natural and cultivated ecosystems have been well documented -- ranging from soil disturbances to attacking newborn livestock," he said. This is the first study to link animals to this key mechanism for maintaining hyper-diverse rainforests. The researchers tagged more than 30,000 tree seedlings in a Malaysian rainforest and were able to examine how tree diversity changed in the areas where pigs nested after recovering more than 1,800 of those tree tags from inside more than 200 pig birthing nests. "You could consider pigs 'accidental forest gardeners' that prune common seedlings and inadvertently maintain diversity," Dr Luskin said. "In many regions, there's a focus on managing overabundant pig populations to limit their negative environmental impacts. "But our results suggest there may be some positives to maintaining pigs in the ecosystem." Dr Luskin said that as the fieldwork was conducted in Malaysia, where pigs are native, the impacts of invasive pigs in Australia may not create similar effects. "We're currently in the process of designing new research to study the same pig processes here in Queensland," he said. "And we'll also be comparing our initial Malaysian results with conditions in a nearby Malaysian forest that is heavily hunted and where many native pigs have been killed. "It's an intriguing insight, as pigs have become the most widespread large animal on earth, so documenting any new ecological impacts has massive repercussions globally."
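The summary doesn't say which diversity measure the team applied to the tagged seedlings; a standard one is the Shannon index, which rises when dominant species are thinned relative to rare ones -- exactly the effect attributed to nest-building. A minimal sketch with made-up seedling counts:

# Shannon diversity H = -sum(p * ln p) over species proportions. The counts
# are hypothetical; the index is a standard choice, not necessarily the study's.
import math

def shannon(counts):
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total) for n in counts.values() if n > 0)

before = {"dominant_sp": 80, "rare_sp_a": 10, "rare_sp_b": 10}
# Nest-building removes mostly dominant seedlings:
after = {"dominant_sp": 30, "rare_sp_a": 9, "rare_sp_b": 9}

print(f"H before nesting: {shannon(before):.3f}")  # lower: one species dominates
print(f"H after nesting:  {shannon(after):.3f}")   # higher: more even community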
|
Environment
| 2021 |
March 2, 2021
|
https://www.sciencedaily.com/releases/2021/03/210302212004.htm
|
Animals fake death for long periods to escape predators
|
Many animals feign death to try to escape their predators, with some individuals in prey species remaining motionless, if in danger, for extended lengths of time.
|
Charles Darwin recorded a beetle that remained stationary for 23 minutes -- however, the University of Bristol has documented an individual antlion larva pretending to be dead for an astonishing 61 minutes. Of equal importance, the amount of time that an individual remains motionless is not only long but unpredictable. This means that a predator will be unable to predict when a potential prey item will move again, attract attention, and become a meal. Predators are hungry and cannot wait indefinitely. Similarly, prey may be losing opportunities to get on with their lives if they remain motionless for too long. Thus, death-feigning might best be thought of as part of a deadly game of hide and seek in which prey might gain most by feigning death if alternative victims are readily available. The study is published today. Lead author of the paper, Professor Nigel R. Franks from the University of Bristol's School of Biological Sciences, said: "Imagine you are in a garden full of identical soft fruit bushes. You go to the first bush. Initially collecting and consuming fruit is fast and easy, but as you strip the bush, finding more fruit gets harder and harder and more time consuming. "At some stage, you should decide to go to another bush and begin again. You are greedy and you want to eat as much fruit as quickly as possible. The marginal value theorem would tell you how long to spend at each bush given that time will also be lost moving to the next bush. "We use this approach to consider a small bird visiting patches of conspicuous antlion pits and show that antlion larvae that waste some of the predator's time, by 'playing dead' if they are dropped, change the game significantly. In a sense, they encourage the predator to search elsewhere." The modelling suggests that antlion larvae would not gain significantly if they remained motionless for even longer than they actually do. This suggests that in this arms race between predators and prey, death-feigning has been prolonged to such an extent that it can hardly be bettered. Professor Franks added: "Thus, playing dead is rather like a conjuring trick. Magicians distract an audience from seeing their sleights of hand by encouraging them to look elsewhere. Just so with the antlion larvae playing dead -- the predator looks elsewhere. Playing dead seems to be a very good way to stay alive."
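Formally, the marginal value theorem says a forager should leave a patch once its instantaneous gain rate falls to the best average rate achievable across the environment, travel time included; equivalently, it should stay for the residence time t that maximizes g(t) / (t + T). A minimal numerical sketch of the fruit-bush analogy, assuming a hypothetical exponential gain curve rather than the parameters of the Bristol model:

# Minimal sketch of the marginal value theorem (MVT): leave a bush at the
# residence time t that maximizes long-run intake g(t) / (t + T), where T
# is the travel time between bushes. The gain curve and the numbers are
# hypothetical, not taken from the Bristol study.
import math

def gain(t: float) -> float:
    """Cumulative fruit collected after t minutes at one bush (diminishing returns)."""
    return 10.0 * (1.0 - math.exp(-0.5 * t))

def optimal_residence(travel_time: float, t_max: float = 60.0, dt: float = 0.01) -> float:
    """Grid-search the residence time that maximizes the overall gain rate."""
    best_t, best_rate = dt, 0.0
    for i in range(1, int(t_max / dt)):
        t = i * dt
        rate = gain(t) / (t + travel_time)
        if rate > best_rate:
            best_t, best_rate = t, rate
    return best_t

# Longer travel between bushes -> it pays to stay longer at each one.
for T in (1.0, 5.0, 10.0):
    print(f"travel time {T:>4.1f} min -> leave bush after {optimal_residence(T):.2f} min")

In the study's framing, a motionless antlion makes lingering over one pit unprofitable relative to that environment-wide average rate, which is exactly the "search elsewhere" effect Franks describes.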
|
Environment
| 2021 |
March 2, 2021
|
https://www.sciencedaily.com/releases/2021/03/210302185414.htm
|
Environmental impact of computation and the future of green computing
|
When you think about your carbon footprint, what comes to mind? Driving and flying, probably. Perhaps home energy consumption or those daily Amazon deliveries. But what about watching Netflix or having Zoom meetings? Ever thought about the carbon footprint of the silicon chips inside your phone, smartwatch or the countless other devices inside your home?
|
Every aspect of modern computing, from the smallest chip to the largest data center, comes with a carbon price tag. For the better part of a century, the tech industry and the field of computation as a whole have focused on building smaller, faster, more powerful devices -- but few have considered their overall environmental impact. Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) are trying to change that. "Over the next decade, the demand, number and types of devices is only going to grow," said Udit Gupta, a PhD candidate in Computer Science at SEAS. "We want to know what impact that will have on the environment and how we, as a field, should be thinking about how we adopt more sustainable practices." Gupta, along with Gu-Yeon Wei, the Robert and Suzanne Case Professor of Electrical Engineering and Computer Science, and David Brooks, the Haley Family Professor of Computer Science, will present a paper on the environmental footprint of computing at the IEEE International Symposium on High-Performance Computer Architecture on March 3rd, 2021. The SEAS research is part of a collaboration with Facebook, where Gupta is an intern, and Arizona State University. The team not only explored every aspect of computing, from chip architecture to data center design, but also mapped the entire lifetime of a device, from manufacturing to recycling, to identify the stages where the most emissions occur. The team found that most emissions related to modern mobile and data-center equipment come from hardware manufacturing and infrastructure. "A lot of the focus has been on how we reduce the amount of energy used by computers, but we found that it's also really important to think about the emissions from just building these processors," said Brooks. "If manufacturing is really important to emissions, can we design better processors? Can we reduce the complexity of our devices so that manufacturing emissions are lower?" Take chip design, for example. Today's chips are optimized for size, performance and battery life. The typical chip is about 100 square millimeters of silicon and houses billions of transistors. But at any given time, only a portion of that silicon is being used. In fact, if all the transistors were fired up at the same time, the device would exhaust its battery life and overheat. This so-called dark silicon improves a device's performance and battery life, but it's wildly inefficient if you consider the carbon footprint that goes into manufacturing the chip. "You have to ask yourself, what is the carbon impact of that added performance," said Wei. "Dark silicon offers a boost in energy efficiency, but what's the cost in terms of manufacturing? Is there a way to design a smaller and smarter chip that uses all of the silicon available? That is a really intricate, interesting, and exciting problem." The same issues face data centers. Today, data centers, some of which span many millions of square feet, account for 1 percent of global energy consumption, a number that is expected to grow. As cloud computing continues to grow, decisions about where to run applications -- on a device or in a data center -- are being made based on performance and battery life, not carbon footprint. "We need to be asking what's greener, running applications on the device or in a data center," said Gupta.
"These decisions must optimize for global carbon emissions by taking into account application characteristics, efficiency of each hardware device, and varying power grids over the day."The researchers are also challenging industry to look at the chemicals used in manufacturing.Adding environmental impact to the parameters of computational design requires a massive cultural shift in every level of the field, from undergraduate CS students to CEOs.To that end, Brooks has partnered with Embedded EthiCS, a Harvard program that embeds philosophers directly into computer science courses to teach students how to think through the ethical and social implications of their work. Brooks is including an Embedded EthiCS module on computational sustainability in COMPSCI 146: Computer Architecture this spring.The researchers also hope to partner with faculty from Environmental Science and Engineering at SEAS and the Harvard University Center for the Environment to explore how to enact change at the policy level."The goal of this paper is to raise awareness of the carbon footprint associated with computing and to challenge the field to add carbon footprint to the list of metrics we consider when designing new processes, new computing systems, new hardware, and new ways to use devices. We need this to be a primary objective in the development of computing overall," said Wei.The paper was co-authored by Sylvia Lee, Jordan Tse, Hsien-Hsin S. Lee and Carole-Jean Wu from Facebook and Young Geun Kim from Arizona State University.
|
Environment
| 2021 |