Dataset schema: Date (string, 11-18 characters), Link (string, 62 characters), Title (string, 16-148 characters), Summary (string, 1-2.68k characters), Body (string, 22-13k characters), Category (string, 20 classes), Year (int64, min 2k, max 2.02k).
April 6, 2000
https://www.sciencedaily.com/releases/2000/04/000406085917.htm
NASA-European Measurements See Significant Arctic Ozone Loss
Ozone losses of more than 60 percent have occurred in the Arctic stratosphere near 60,000 feet (18 kilometers) in one of the coldest winters on record. This is among the worst ozone losses observed at this altitude in the Arctic.
Investigations into the Arctic stratosphere have provided better insights into the processes that control polar ozone. These insights considerably add to scientists' ability to predict future ozone levels as chlorine levels decline as a result of the Montreal Protocol, and as greenhouse gases increase. Climate change in the stratosphere will likely enhance ozone losses in the Arctic winter in the coming decades, even as the amount of chlorine introduced into the atmosphere is decreased, researchers say. This winter, the NASA-sponsored SAGE III Ozone Loss and Validation Experiment (SOLVE) and the European Union-sponsored Third European Stratospheric Experiment on Ozone (THESEO) obtained measurements of ozone, other atmospheric gases, and particles using satellites, airplanes, large, small, and long-duration balloons, and ground-based instruments. Scientists from the United States joined with scientists from Europe, Canada, Russia, and Japan in mounting the biggest field measurement campaign yet to measure ozone amounts and changes in the Arctic stratosphere. The activities were conducted from November 1999 through March 2000. The total amount of information collected by the international campaign this winter is greater than that collected in any past polar measurement campaign. Most of the measurements were made near Kiruna, Sweden, with additional measurements made from satellites and a network of stations at mid- and high northern latitudes. During the winter of 1999-2000, large ozone losses were observed in the Arctic lower stratosphere, measured by a number of instruments and techniques, including a National Oceanic and Atmospheric Administration ozone instrument aboard the high-altitude NASA ER-2 aircraft, a civilian variant of the U-2 reconnaissance plane. "Measurements from the NASA ER-2 show ozone in the Arctic region decreasing by about 60 percent between January and mid-March," said ER-2 co-project scientist Dr. Paul A. Newman of NASA's Goddard Space Flight Center, Greenbelt, Md. These measurements are comparable to the large chemical losses at this altitude observed in several winters in the mid-1990s. The effect on total column ozone was slightly mitigated by the fact that reductions in ozone were smaller above 20 kilometers (66,000 feet). Spacecraft observations by NASA's Total Ozone Mapping Spectrometer-Earth Probe showed a clear ozone minimum over the polar region during February and March. The average polar column amounts of ozone for the first two weeks of March were 16 percent lower than observed in the early 1980s. High-altitude clouds (at about 18 kilometers or 60,000 feet) that exist only at the poles are called "polar stratospheric clouds," or PSCs. They play a unique role in atmospheric ozone loss. The visually beautiful, opalescent clouds form only at the cold temperatures found at the poles. These clouds help trigger the conversion of chlorine from relatively non-reactive forms to a form (chlorine monoxide, or ClO) that, in combination with sunlight, destroys ozone. PSCs were observed to extend widely over the Arctic region from early December to early March. "We were somewhat surprised to see PSCs so early in December," said Dr. Mark Schoeberl, who was the SOLVE co-project scientist for observations made from NASA's DC-8 aircraft. "Some of the PSC types and their locations which we observed in December did not fit within our current understanding."
The last PSCs were observed on March 8 by instruments aboard the DC-8, and on March 15 by satellite. Polar stratospheric temperatures were extremely low over the course of this winter, and PSCs can only form in these low-temperature regions. At 20 kilometers (66,000 feet) on Jan. 28, the area covered by temperatures low enough to form PSCs was 14.8 million square kilometers (5.7 million square miles), which is larger than the United States. This is the largest area coverage recorded in more than 40 years of Northern Hemisphere stratospheric analyses. "The polar stratospheric clouds covered a larger area, and persisted for a longer period of time, than for any other Arctic winter during the past 20 years. These conditions heighten our concern regarding possible couplings between climate change and stratospheric ozone depletion," said ozone researcher Dr. Ross Salawitch of NASA's Jet Propulsion Laboratory, Pasadena, Calif. The mixing of polar air into middle latitudes, both during the winter and as the polar circulation broke down in late March, influenced ozone levels over the populated middle latitudes. Dilution of ozone-depleted air into middle latitudes is a major contributor to the long-term mid-latitude decline. These mixing processes were studied during SOLVE/THESEO-2000, and detailed analysis of them continues. For further information, visit the SOLVE web site. JPL is managed for NASA by the California Institute of Technology, Pasadena.
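As a quick arithmetic check on the area figures quoted above, here is a minimal Python sketch; the square-kilometer conversion factor is standard, and the roughly 9.8 million square kilometer total U.S. area is an approximate outside figure assumed here, not given in the article:

```python
# Check the PSC area conversion quoted above: 14.8 million km^2 -> mi^2.
KM2_PER_MI2 = 2.589988  # square kilometers in one square mile

psc_area_km2 = 14.8e6
psc_area_mi2 = psc_area_km2 / KM2_PER_MI2
print(f"{psc_area_mi2 / 1e6:.1f} million square miles")  # -> 5.7, matching the article

# "Larger than the United States": total U.S. area is roughly 9.8 million km^2
# (an approximate figure assumed here, not from the article).
print(psc_area_km2 > 9.8e6)  # -> True
```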
Ozone Holes
2000
March 13, 2000
https://www.sciencedaily.com/releases/2000/03/000313081731.htm
Asteroid Devastation Could Be Even Worse Than Feared
CORVALLIS, Ore. - Researchers say in a new report that if a huge asteroid were to hit the Earth, the catastrophic destruction it would cause, and even the "impact winter" that would follow, might only be a prelude to a different but very deadly phase that starts later on.
They're calling it "ultraviolet spring." In an analysis of the secondary ecological repercussions of a major asteroid impact, scientists from Oregon State University and the British Antarctic Survey have outlined some of the residual effects of ozone depletion, acid rain and increased levels of harmful ultraviolet radiation. The results were just published in the journal Ecology Letters. The findings are frightening. As a number of popular movies have illustrated in recent years, a big asteroid or comet impact would in fact produce enormous devastation, huge tidal waves, and a global dust cloud that would block the sun and choke the planet in icy, winter-like conditions for months. Many experts believe such conditions existed on Earth following an impact around the Cretaceous-Tertiary, or K-T boundary, when there was a massive extinction of many animals, including the dinosaurs. That's pretty bad. But according to Andrew Blaustein, a professor of zoology at Oregon State University, there's more to the story. "Scientists have pretty well documented the immediate destruction of an asteroid impact and even the impact winter which its dust cloud would create," Blaustein said. "But our study suggests that's just the beginning of the ecological disaster, not the end of it." Blaustein and colleague Charles Cockell examined an asteroid impact of a magnitude similar to the one that occurred around the K-T boundary, which is believed to have hit off the Yucatan Peninsula with a force of almost one trillion megatons. The immediate results would be catastrophic destruction and an impact winter, with widespread death of plants and the large terrestrial animals - including humans - that most directly depend on those plants for food. That's the beginning of an ugly scenario, the researchers say. As a result of the impact, the atmosphere would become loaded with nitric oxide, causing massive amounts of acid rain. As they become acidified, the lakes and rivers would have reduced amounts of dissolved organic carbons, which would allow much greater penetration of ultraviolet light. At first, of course, the ultraviolet rays would be blocked by the dust cloud, which sets the stage for a greater disaster later on. Many animals depend on some exposure to ultraviolet light to keep operational their biological protective mechanisms against it - without any such light, those protective mechanisms would be eroded or lost. During the extended winter, animals across the biological spectrum would become weaker, starved and more vulnerable. Many would die. Then comes ultraviolet spring, shining down on surviving plants and animals that have lost their resistance to ultraviolet radiation and penetrating more deeply, with greater intensity, into shallow waters than it ever has before. "By our calculations, the dust cloud would shield the Earth from ultraviolet light for an extended period, with it taking about 390 days after impact before enough dust settled that there would be an ultraviolet level equal to before the impact. After that, the ozone depletion would cause levels of ultraviolet radiation to at least double, about 600 days after impact." According to their study, these factors would lead to ultraviolet-related DNA damage about 1,000 times higher than normal, and general ultraviolet damage to plants about 500 times higher than normal. Ultraviolet radiation can cause mutations, cancer, and cataracts.
It can kill plants or slow their growth, suppressing the photosynthesis which forms the base of the world's food chain. Smaller asteroid impacts, which have happened far more frequently in Earth's history, theoretically might cause similar or even worse problems with ultraviolet exposure, the researchers say. The ozone depletion would be less, but there would also be less of a protective dust cloud. "Part of what we're trying to stress here is that with an asteroid collision, there will be many synergistic effects on the environment that go far beyond the initial impact," said Cockell, a researcher with the British Antarctic Survey who did some of this analysis while formerly working with NASA. "Effects such as acid rain, fires, the dust clouds, cold temperatures, ozone depletion and ultraviolet radiation could all build upon each other." During the K-T event, the scientists said, many of the animals may actually have been spared most of the ultraviolet spring they envision. That impact, oddly enough, hit a portion of the Earth's crust that was rich in anhydrite rocks. This produced a 12-year sulfate haze that blocked much of the ultraviolet radiation. But it was a lucky shot - that type of rock covers less than 1 percent of the Earth's surface. So when the next "big one" comes, the scientists said, the ecological repercussions may be more savage than any of those known in Earth's long history. The collision will be devastating, the "impact winter" deadly. But it will be the ultraviolet spring that helps finish off the survivors.
Ozone Holes
2000
February 18, 2000
https://www.sciencedaily.com/releases/2000/02/000217165652.htm
Scientists Find Clues To Different Warming Rates In Lower Atmosphere And Surface
BOULDER -- Three factors--the thinning of the ozone layer, emissions from the Mt. Pinatubo volcano, and the influx of sulfate aerosols and greenhouse gases into the atmosphere--may help explain why the lowest five miles of the earth's atmosphere has not warmed as quickly as the earth's surface, say a group of scientists in a paper appearing in the February 18 issue of the journal Science. The results follow extensive data analysis and modeling studies by the 13 scientists. The team includes second author Tom Wigley and Gerald Meehl, both scientists at the National Center for Atmospheric Research (NCAR). Lead author Ben Santer is at Lawrence Livermore National Laboratory. NCAR's primary sponsor is the National Science Foundation.
The difference in temperature trends at the surface and in the lower troposphere has intensified the climate change debate. Some have pointed to the surface data as more reliable, while others have focused on the satellite measurements. In January the National Research Council (NRC) issued a report from a team of scientists across the spectrum of climate change positions that partly reconciles the differences in data sets and offers some explanation of why the temperature trends would be different. The Santer-Wigley paper, though not published at the time, was fully taken into account in the report, says Kevin Trenberth, head of NCAR's Climate Analysis Section and a coauthor of the NRC report. For the Science paper, the team examined three observational data sets and recent model studies to reach their conclusions. The data sources are: a century of thermometer readings of sea surface temperatures and air temperatures a few meters above land; a half century of radiosonde measurements of troposphere and lower stratosphere temperatures; and two decades of global observations of tropospheric temperatures (up to eight kilometers) taken by a series of satellites that measure the upwelling microwave radiation from oxygen molecules. Over the period 1979 to 1998, the surface data show a warming of 0.2-0.4 degree Celsius, while the radiosonde and satellite data show no warming or only a slight temperature rise (0.1 degree C) in the lower troposphere over the same period. Neither complicated problems with the measurements nor the climate's inherent variability over decades fully explains the temperature trend difference, say the authors. In a comprehensive modeling study, they found that the loss of stratospheric ozone and, to a lesser extent, the influx of Mt. Pinatubo emissions in the stratosphere cooled the lower troposphere more than the surface. The model also took into account the buildup of greenhouse gases and sulfate aerosols. Says Wigley, "This is a very complex problem with large uncertainties in the effects of human activities on the climate. However, we have reasonable confidence that ozone depletion and the Mt. Pinatubo emissions are likely candidates for explaining at least part of the cooler temperatures in the lower to middle troposphere compared to the more intense warming at the surface." NCAR is managed by the University Corporation for Atmospheric Research, a consortium of more than 60 universities offering Ph.D.s in atmospheric and related sciences.
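To make the two quoted trends easier to compare, here is the arithmetic as a short sketch. It assumes the quoted totals accumulate roughly linearly over the 1979-1998 window; the underlying paper fits proper statistical trends, which this does not reproduce:

```python
# Convert the quoted 1979-1998 temperature changes into per-decade rates.
record_years = 1998 - 1979 + 1  # the 20-year window discussed above

def per_decade(total_change_c: float) -> float:
    """Average rate in degrees C per decade, assuming a linear trend."""
    return total_change_c / record_years * 10

print(f"surface: {per_decade(0.2):.2f} to {per_decade(0.4):.2f} C/decade")
print(f"lower troposphere: at most {per_decade(0.1):.2f} C/decade")
```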
Ozone Holes
2000
January 17, 2000
https://www.sciencedaily.com/releases/2000/01/000114173651.htm
NCAR Scientists Seek Ozone-Hole Clues During Largest Campaign Ever in Arctic Stratosphere
BOULDER -- This winter a team of scientists from the National Center for Atmospheric Research (NCAR) is part of the largest international project ever mounted to measure levels of ozone and learn more about its lifecycle in the upper atmosphere of the Arctic. Prompted by observations of very low levels of ozone in the Arctic stratosphere in recent winters, scientists from the United States, Europe, Russia, and Japan are hoping to explain the ozone loss by making detailed measurements of the chemistry and dynamics of this under-studied region.
The SAGE III Ozone Loss and Validation Experiment, sponsored by the National Aeronautics and Space Administration, is being conducted jointly with the European Commission-sponsored Third European Stratospheric Experiment on Ozone. With some 350 scientists participating, SOLVE/THESEO-2000 is the largest stratospheric field mission ever conducted, according to project manager Michael Craig of NASA's Ames Research Center. NCAR's primary sponsor is the National Science Foundation. Begun in the Arctic darkness of November and continuing through March as the sun climbs higher above the horizon, the mission is timed to capture chemical changes in the stratosphere brought about by interaction with increasing solar radiation. As temperatures fall during Arctic winter, polar stratospheric clouds (PSCs) can form. A complex series of chemical reactions on the surface of PSC cloud particles frees up active chlorine and bromine, which react with sunlight to catalyze ozone destruction when the sun returns in early spring. The sources of chlorine and bromine are human-produced chlorofluorocarbons (CFCs) and halocarbons. The colder the Arctic spring, the longer the clouds linger and the more ozone loss. Scientists need to understand the complex interactions among solar radiation, temperature, water, CFCs, aerosol particles, and polar stratospheric clouds before predictions of ozone loss in the Northern Hemisphere can become more reliable. An array of research instruments aboard NASA's DC-8 and ER-2 aircraft is taking measurements in flight and bringing back air samples for testing in the lab. NCAR researchers William Mankin and Michael Coffey developed techniques for using a spectrometer aboard the DC-8 to measure amounts of chlorine, nitrogen-containing gases, CFCs, ozone, and other stratospheric gases important to polar ozone chemistry. They expect the instrument to also detect the infrared signature of polar stratospheric clouds, allowing them to determine cloud structure and composition. Also aboard the DC-8, Richard Shetter's spectroradiometers are gathering data on photolysis (sunlight-produced chemical changes) of 15 different molecules important to the production and destruction of ozone. His team's measurements of actinic flux, which serves as a tracer of photolysis, are the first to be made in the Arctic stratosphere. The ER-2 is carrying an instrument developed by Darrel Baumgardner, Bruce Gandrud, and colleagues to determine the size and concentration of PSC cloud particles from 0.3 to 20 micrometers (thousandths of a millimeter) in diameter. A whole air sampler is collecting and storing up to 32 air samples per ER-2 flight. The samples are shipped the same week to NCAR, where Elliot Atlas is analyzing them using several different gas chromatographs to look for halocarbons, hydrocarbons, and organic nitrates. Several European aircraft are also participating, and additional measurements will be taken by instruments carried up to 100,000 feet aloft by research balloons. Instruments on the ground in Sweden and Norway will round out the profile of the Arctic stratosphere. SOLVE scientists are based above the Arctic Circle at the airport in Kiruna, Sweden, where winter temperatures can reach -50 degrees Fahrenheit or lower. The stratosphere ranges from about 30,000 to 180,000 feet in altitude. Ozone in the stratosphere acts as a protective layer, keeping most of the sun's ultraviolet radiation from reaching the earth, where it causes damage to people and other living things.
Most of the ozone in the stratosphere is concentrated between 50,000 and 100,000 feet--within range of SOLVE's aircraft and balloons. The Antarctic ozone hole and its causes made news in the 1980s. International efforts to reduce manufacture of ozone-destroying CFCs culminated in a production ban for industrialized countries in 1996. The Arctic ozone layer seemed unaffected; ozone concentrations were naturally higher there, and relatively warmer Arctic temperatures stayed above the levels necessary for CFCs to interfere with ozone chemistry. In the late 1990s, however, scientists detected dramatically lower levels of ozone over the Arctic, raising concerns about the possibility of a second ozone hole above the North Pole. NCAR is managed by the University Corporation for Atmospheric Research, a consortium of more than 60 universities offering Ph.D.s in atmospheric and related sciences. -The End- Note to Editors: Journalists are invited to the main field staging area in Kiruna, Sweden, during media week, January 21-28. Members of most of the science teams, including NCAR's, will be on hand. A newsroom will operate in the Scandic Hotel Ferrum near the airport. During escorted tours into the research area, journalists may meet with scientists. Contact for media week is Chris Rink--before January 22: NASA Langley Research Center, Hampton, Virginia, phone: 757-864-6786, fax: 757-864-6333, e-mail: [email protected]; January 21-28: NASA newsroom, Kiruna, Sweden, phone: 011 46-980-398-787, fax: 011 46-980-398-788, e-mail: [email protected]. Writer: Zhenya Gallon
Ozone Holes
2000
December 8, 1999
https://www.sciencedaily.com/releases/1999/12/991208055758.htm
Air Pollution From Asia Could Violate New Federal Ozone Standard
A plume of pollution that crossed the Pacific Ocean from Asia earlier this year contained ozone at levels high enough to violate a new federal ozone standard.
"This is air that has health implications," said Daniel Jaffe, an atmospheric scientist at the University of Washington, Bothell, whose team of researchers discovered evidence of the high pollution content in data collected during a research flight off the Washington coast on April 9. He will present his findings Monday (Dec. 13) at the American Geophysical Union's fall meeting in San Francisco.Equipment aboard the plane detected an ozone level of 85 parts per billion at about 20,000 feet. That would exceed a new U.S. Environmental Protection Agency standard of 80 parts per billion (or 0.08 parts per million). That standard, which was formulated in 1997 but is under legal challenge and has not yet taken effect, includes time limits for how long ozone levels can remain at or above 80 parts per billion. It would replace the current standard that allows concentrations of 120 parts per billion.Eventually, air at 20,000 feet is likely to mix into the lower atmosphere, but it is uncertain where it might come to ground level and create a health risk, Jaffe said.The research flight also found an ozone level of 72 parts per billion at about 10,000 feet, an altitude lower than the tops of many peaks in the Cascade Range. At that concentration, ozone is known to damage vegetation, he said.A meteorological analysis of the plume shows it came from East Asia, though the exact source is unknown, he said. At the same time, elevated levels of other pollutants, including hydrocarbons, carbon monoxide and a key smog ingredient called PAN (peroxyacetylnitrate), proved that the ozone-rich air mass had not come from the upper atmosphere because those pollutants do not exist at high concentrations in the upper atmosphere.At the AGU meeting in 1997, Jaffe presented computer modeling indicating the likelihood that, under the proper springtime conditions, air pollution from East Asia could make its way across the Pacific relatively undiluted within a matter of days. Data collected from the UW's Cheeka Peak Observatory on Washington's northwest coast in 1997 and 1998 confirmed the model's prediction, though that data did not indicate heightened ozone levels.To gather this year's data, Jaffe's team used a University of Wyoming plane that is part of a fleet of research aircraft operated by the National Science Foundation. The plane was outfitted with essentially the same equipment used at Cheeka Peak. On 14 flights between March 15 and April 28, the plane gathered data from several equally spaced levels between 1,500 feet (the same elevation as Cheeka Peak) and 23,000 feet. Pollution layers were observed on about one-third of the flights."This was a day when we could really see haze layers out there," Jaffe said of the April 9 flight.Other scientists involved in the research are from the UW, Seattle; the University of California, Irvine; the National Oceanic and Atmospheric Administration; and the Atmospheric Environment Service of Canada.Jaffe's previous research has shown that Asian pollution travels to North America when meteorological conditions over the Pacific are just right, typically during the spring. A low-pressure system over the Aleutian Islands and a high-pressure cell near Hawaii, which remain stable and in place for at least several days, work in concert to quickly move air from East Asia directly across the ocean to North America. 
The process, which the researchers have dubbed "The Asian Express," takes four to 10 days, too little time for the air to be cleansed over the ocean. "For us to see what we're seeing, I think we have to be talking about a fairly large region of pollutants that remain intact and get transported across in one big blob," Jaffe said.
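The ppb/ppm comparison in the article reduces to a one-line unit conversion; a minimal sketch follows. Note that the real 1997 standard also involves 8-hour averaging rules, which this simple comparison ignores:

```python
# The 1997 EPA standard is written as 0.08 ppm, i.e. 80 parts per billion.
NEW_STANDARD_PPM = 0.08
OLD_STANDARD_PPM = 0.12  # the 120 ppb standard it would replace

def exceeds(ozone_ppb: float, standard_ppm: float) -> bool:
    return ozone_ppb / 1000.0 > standard_ppm  # ppb -> ppm, then compare

print(exceeds(85, NEW_STANDARD_PPM))  # True:  the plume at ~20,000 feet
print(exceeds(72, NEW_STANDARD_PPM))  # False: the ~10,000-foot layer
print(exceeds(85, OLD_STANDARD_PPM))  # False: would pass the older standard
```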
Ozone Holes
1999
December 8, 1999
https://www.sciencedaily.com/releases/1999/12/991208055644.htm
New Energy-Efficient Chinese Refrigerator To Have Global Impact
BERKELEY, CA — China’s refrigerator industry is the largest in the world. As such, it releases a significant share of ozone-depleting chlorofluorocarbons (CFCs) into the environment.
Now, an internationally funded, award-winning project to improve the energy efficiency of Chinese refrigerators, developed by the Department of Energy’s Lawrence Berkeley National Laboratory, has received a green light from the government of China and international funders, and is set to start in early December. The five-year program -- the CFC-Free Energy-Efficient Refrigerator Project -- consists of a series of market-oriented measures for manufacturers and consumers to encourage the production and consumption of CFC-free energy-efficient refrigerators. It is expected to reduce greenhouse gas emissions from China by a total of over 100 million tons of carbon dioxide from 20 million households over the 15-year lifetime of the new refrigerators. Also, because 80 percent of China's electricity is generated by coal-burning plants, the benefits of the project will include avoided emissions of other air pollutants. “Refrigerator production in China jumped from 1.4 million units in 1985 to 10.6 million in 1998,” according to David Fridley, a researcher in Berkeley Lab’s Environmental Energy Technologies Division, and manager of the refrigerator project. “In 1985, only 7 percent of urban households had refrigerators. By 1998, 76 percent had them, a 21 percent annual growth rate. The average Chinese refrigerator uses 2.5 kilowatt-hours per liter of volume per year, compared to 1.5 kWh/l for European refrigerators." The Global Environmental Facility, through the United Nations Development Program, has decided to fund $9.3 million of the $40 million program to help the government of China transform its market for refrigerators. Berkeley Lab has been involved in the project since 1995 through the U.S. Environmental Protection Agency (EPA), developing the market transformation program based on the success of the first phase of the project, which involved designing and testing CFC-free, energy-efficient refrigerators. Fridley says that beyond his technical supervisory role, the Laboratory will be involved in training and working with the State Bureau of Technical Supervision as the new efficiency standards are developed. “Market transformation,” Fridley explains, “is the process of shifting consumer demand for a product, in this case to a more energy-efficient, environmentally benign product through voluntary, market-based means such as technical assistance and training for manufacturers, consumer education, and financial incentives to manufacture and sell the more efficient product.” Berkeley Lab has worked directly with cooperating U.S. and international agencies such as the U.S.
EPA, the United Nations Development Program, the China State Environmental Protection Administration and the former China National Council for Light Industry to determine a comprehensive set of measures based on economic, policy, and technical analysis. “Collectively, we developed a technical training program for Chinese refrigerator manufacturers interested in developing CFC-free, efficient refrigerators; a financial incentive program to motivate manufacturers to build the most efficient refrigerator possible; a dealer incentive program to convince dealers to stock the new refrigerators; and a mass purchasing program for Chinese government agencies that acquire refrigerators in bulk,” Fridley says. Other new project activities will include a recycling buy-back pilot program, revision of existing refrigerator efficiency standards, an energy-efficiency labeling system, and an extensive nationwide consumer education campaign. In 1998, the refrigerator project was awarded an International Climate Protection Award by the EPA. “It is not widely known in the United States, but China has had an energy-efficiency policy in place since the early 1980s,” says Mark Levine, Environmental Energy Technologies Division Director and an advisor to the Chinese government on energy efficiency. “The government of China is committed to using energy more efficiently, and this has allowed the economy to grow at nearly twice the rate of energy consumption. “One effect of the increasing affluence in China is that refrigerators are growing in size and consuming more energy," adds Levine. "The Energy-Efficient Refrigerator Project will have a significant, direct effect on reducing greenhouse gas and pollutant emissions. We at Berkeley Lab are grateful to have the chance to work with the people and government of China on this project, as well as on our other projects in energy data analysis, appliance efficiency standards, and technical advice on cogeneration plants.” The refrigerator project began in 1989 when the EPA signed an agreement with the government of China to assist in the elimination of CFCs from refrigerators. Under the Montreal Protocol, most nations of the world agreed to phase out the use of CFCs to protect the Earth’s ozone layer. The success of the design phase of the project, in which a prototype model of 40 percent greater efficiency was produced and tested, led to eventual multilateral support for the new phase. Major Chinese participants in the project have included the China State Environmental Protection Administration, the State Administration for Light Industry, the Household Electric Appliance Research Institute, and domestic refrigerator manufacturers. Major U.S. participants have included the EPA, the University of Maryland Center for Environmental Energy Engineering, Underwriters Laboratories, and Berkeley Lab. Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified research and is managed by the University of California.
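The quoted energy-intensity gap is easy to turn into a concrete number. A sketch follows, where the 200-liter refrigerator volume is a hypothetical example size, not a figure from the project:

```python
# Annual electricity use implied by the intensities Fridley quotes above.
CHINESE_KWH_PER_LITER = 2.5   # kWh per liter of volume per year
EUROPEAN_KWH_PER_LITER = 1.5

volume_liters = 200  # hypothetical example refrigerator
chinese_kwh = CHINESE_KWH_PER_LITER * volume_liters    # 500 kWh/year
european_kwh = EUROPEAN_KWH_PER_LITER * volume_liters  # 300 kWh/year
print(f"saving at European intensity: {chinese_kwh - european_kwh:.0f} kWh/year")
```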
Ozone Holes
1999
December 7, 1999
https://www.sciencedaily.com/releases/1999/12/991207072741.htm
Explaining How Ozone "Chokes Up" Plants
University Park, Pa. --- Penn State researchers have identified how ozone, a major smog constituent, affects the microscopic breathing pores on plants' leaves, a process that may figure in the estimated $3 billion in agricultural losses caused by ozone air pollution in the U.S. each year.
Dr. Gro Torsethaugen, a postdoctoral researcher in Penn State's Environmental Resources Research Institute, says, "Although elevated ground levels of ozone resulting from traffic and other fossil fuel burning have long been associated with losses in agricultural yield, the precise cellular targets of ozone's action were essentially unknown. Our work has shown, for the first time, that, rather than causing the pores or stomates on a plant's leaves to close, as was generally assumed, ozone actually inhibits stomatal opening by directly affecting the 'guard cells' that control the opening." Torsethaugen adds that knowing ozone's specific cellular targets may make it possible in the future to breed or to genetically engineer new plant varieties to improve productivity in geographic regions, such as California, with significant ozone exposure. Torsethaugen and her co-authors Dr. Eva J. Pell, the Steimer professor of agricultural sciences, and Dr. Sarah M. Assmann, professor of biology, published their findings in a recent issue of the Proceedings of the National Academy of Sciences (PNAS). Plants take in the carbon dioxide they need for photosynthesis through their stomates, Torsethaugen explains. They also release oxygen made in photosynthesis through the same pores. Ozone can also enter the plant through the stomates and can affect photosynthesis via that route. The Penn State experiments point to direct action on the guard cells as an additional path that ozone takes to decrease carbon dioxide assimilation and reduce plant productivity. Torsethaugen conducted the experiments with fava bean plants, an important world food source and a species scientists favor for guard cell studies. Using various techniques, she examined the pores on the leaves of whole plants and portions of leaf surfaces and then studied the isolated guard cells. In whole plants and the leaf surfaces, she found that ozone directly affects the stomatal opening. Using isolated guard cells, she monitored the flow of potassium, in a positively charged or ion form, into and out of the cells. "We monitored potassium because it is a major component in the osmotic process," she says. "If the potassium ion concentration is increased, water comes into the cell by osmosis and the guard cells surrounding the stomate swell. This swelling causes the pore to open." Ozone exposure reduced the flow of potassium ions into the guard cells but did not affect the outward flow, indicating that ozone inhibits the opening of the pores. In their PNAS paper, the authors note that their findings may have particular relevance during drought. They write, "Stomatal closure during a period of drought may be less readily reversed in ozone-exposed plants. This may be particularly relevant because the highest ozone concentrations are sometimes associated with times of drought." In addition, they write, "In major agricultural regions with high light environments and significant ozone exposure - e.g. the South Coast Air Basin of California, which has the most extreme ozone levels in the U.S. - midday stomatal closure often occurs because of the low ambient humidity that results from the high light, high temperature conditions of midday. Because the generation of ozone in photochemical smog depends on high solar irradiation, ozone inhibition of stomatal opening could significantly retard stomatal reopening in the afternoon after this mid-day depression and consequently reduce crop yield."
Identification of the potassium ion channel as a target for ozone action opens the door to selectively breeding or genetically engineering less ozone-sensitive plants to improve plant productivity in geographic regions with significant ozone exposure. However, the Penn State researchers also note that "Our identification of a specific ion channel as a target for ozone action may prompt comparable studies in mammalian systems, leading to improved understanding of and treatment for the disease etiologies exacerbated by ozone." The research was supported in part by a grant from the Binational Agricultural Research and Development/U.S. Department of Agriculture. The Department of Biology, University of Oslo, Norway, where Torsethaugen earned her doctorate, provided additional support.
Ozone Holes
1999
October 8, 1999
https://www.sciencedaily.com/releases/1999/10/991008080015.htm
Annual Antarctic Ozone Depletion Results Are In: "Ozone Hole" Smaller Than Last Year
A NASA satellite has shown that the area of ozone depletion over the Antarctic -- the well-known ozone "hole" -- is a bit smaller in 1999 than it was last year.
"This Antarctic year's ozone depletion area, or ozone 'hole,' is very large, but slightly smaller than that of 1998," said Dr. Richard McPeters, principal investigator for the instrument that made the measurements. This year's study found that an ozone "low" had formed between New Zealand and Antarctica on Sept. 17. This sort of ozone low, commonly referred to as a "mini-hole," is a result of the redistribution of ozone by a large weather system. The "mini-hole" moved eastward along the rim of the Antarctic "ozone hole" for a number of days after Sept. 17. Preliminary data from the satellite show that this year's Antarctic ozone depletion covered 9.8 million square miles on Sept. 15. The record area of Antarctic ozone depletion of 10.5 million square miles was set on Sept. 19, 1998. The ozone levels are expected to decrease over the next two weeks. The lowest amount of total-column ozone recorded to date this year was 92 Dobson Units on Oct. 1. In contrast, ozone levels of 90 Dobson Units were observed at one point last year. Dobson units measure how thick the ozone layer would be if all the overhead ozone molecules in a column of atmosphere could be brought down to the Earth's surface. Globally, the ozone layer averages approximately 300 Dobson Units, which would correspond to a layer about 1/8th of an inch (3 millimeters) thick at the Earth's surface, about the thickness of two stacked pennies. In contrast, during the annual Antarctic ozone "hole," the amount of ozone in the ozone "hole" is about 100 Dobson Units, about 1/25th of an inch, or approximately the thickness of a single dime. The slightly decreased size of the ozone "hole" from last year is not an indication of the recovery of Antarctic ozone levels. The current year-to-year variations of size and depth of the ozone "hole" depend primarily on the variations in meteorological conditions. The Antarctic ozone losses are caused by chlorine and bromine compounds released by chlorofluorocarbons (CFCs) and halons. Due to international treaties regulating the production of these gases, the amount of chlorine in the stratosphere is close to maximum value and, in some regions, is beginning to decline. In the next century, chlorine-induced ozone losses will be reduced as chlorine amounts throughout the stratosphere decline, and ozone levels will begin to recover. The actual rate of recovery will likely be affected by the increasing abundance of greenhouse gases in the atmosphere. Detecting the recovery of the ozone hole will require a number of years of measurements. Ozone molecules, made up of three atoms of oxygen, comprise a thin layer of the atmosphere that absorbs harmful ultraviolet radiation from the Sun. Most atmospheric ozone is found between 6 and 18 miles above the Earth's surface. Ozone shields life on Earth from the harmful effects of the Sun's ultraviolet radiation. Scientists and others have a keen interest in ozone depletion. Increased amounts of ultraviolet radiation that reach the Earth's surface due to ozone loss might increase the incidence of skin cancer and cataracts in humans, depress the human immune system, harm some crops and interfere with marine life. These measurements were obtained between mid-August and early October using the Total Ozone Mapping Spectrometer (TOMS) instrument aboard NASA's Earth Probe (TOMS-EP) satellite. NASA instruments have been measuring Antarctic ozone levels since the early 1970s. 
Since the discovery of the ozone "hole" in 1985, TOMS has been a key instrument for monitoring ozone levels over the Earth. TOMS ozone data and pictures are available on the Internet. TOMS-EP and other ozone-measurement programs are important parts of a global environmental effort of NASA's Earth Science enterprise, a long-term research program designed to study Earth's land, oceans, atmosphere, ice and life as a total integrated system.
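The Dobson-unit comparisons above all follow from a single conversion: one Dobson Unit corresponds to a 0.01-millimeter layer of pure ozone at standard temperature and pressure. A minimal sketch:

```python
# Thickness of the ozone column if compressed to the surface, from Dobson Units.
MM_PER_DOBSON_UNIT = 0.01  # 1 DU = 0.01 mm of pure ozone at STP

def column_thickness_mm(dobson_units: float) -> float:
    return dobson_units * MM_PER_DOBSON_UNIT

print(column_thickness_mm(300))  # 3.0 mm (~1/8 inch): global average, two pennies
print(column_thickness_mm(100))  # 1.0 mm (~1/25 inch): inside the "hole", one dime
```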
Ozone Holes
1999
June 3, 1999
https://www.sciencedaily.com/releases/1999/06/990603071210.htm
UW Scientists Say Arctic Oscillation Might Carry Evidence Of Global Warming
For years, scientists have known that Eurasian weather turns on the whim of a climate phenomenon called the North Atlantic oscillation. But two University of Washington researchers contend that the condition is just a part of a hemisphere-wide cycle they call the Arctic oscillation, which also has far-reaching impact in North America.
Significant changes in the Arctic oscillation during the last 30 years have influenced temperature and precipitation patterns throughout the Northern Hemisphere, said John Wallace, a UW atmospheric sciences professor, and graduate student David Thompson. Those changes might well have been caused by ozone depletion over the northern polar region or a buildup of greenhouse gases, they said. In recent years, the Arctic oscillation seems to have been stuck in its positive phase most of the time, said Wallace, who will present evidence of the phenomenon on Thursday in the Charney Lecture during the American Geophysical Union's spring meeting in Boston. "This Arctic oscillation seems to be doing things in the last decade or two beyond the range of what it has done earlier in the century," he said. In a 1995 study published in the journal Science, Wallace suggested that the global warming widely believed to be occurring could be, in part, the result of natural cycles. His work on the Arctic oscillation, paid for in part by a National Science Foundation grant, is altering his view because it includes findings that imply human-induced climate change. "We are looking at temperature changes over Siberia in the last 30 years that are almost 10 times greater than the global mean temperature rise in the last 100 years," he said. In addition, significant changes in surface winds over the Arctic might have contributed to the thinning of the polar ice cap that has been reported by scientists with last year's SHEBA (Surface Heat Budget of the Arctic Ocean) project, Wallace said. "If it keeps going in this direction, it could be a major force for melting," he said. The Arctic oscillation is a pattern in which atmospheric pressure in northern latitudes switches, or oscillates randomly, between positive and negative phases. The negative phase brings high pressure over the polar region and low pressure at about 45 degrees north latitude - a line that runs through the northern third of the United States and western Europe. The positive phase reverses the conditions, steering ocean storms farther north and bringing wetter weather to Alaska, Scotland and Scandinavia and drier conditions to areas such as California and Spain. In its positive phase, Thompson said, frigid winter air doesn't plunge as far into the heart of North America, keeping much of the United States east of the Rocky Mountains warmer than normal. However, areas such as Greenland and Newfoundland typically are colder than usual. One piece of the puzzle, the researchers said, is the existence of a Southern Hemisphere phenomenon, an Antarctic oscillation, which is virtually identical to its northern counterpart except that the Southern Hemisphere has a stronger vortex, or spinning ring of air, encircling the pole. That's because, unlike in the Northern Hemisphere, there are no large mid-latitude continents like Eurasia to disrupt the circular flow, Wallace and Thompson said. But the cooling of the stratosphere in the last few decades has caused the polar vortexes in both hemispheres to strengthen in winter. Winds at the Earth's surface also have gained speed, sweeping larger quantities of mild ocean air across the cold northern continents. A modeling study being published in the journal Nature on Thursday by researchers at NASA's Goddard Institute for Space Studies suggests the buildup of greenhouse gases might be contributing to colder temperatures in the polar stratosphere. 
If that trend continues, the colder temperatures ultimately could produce a hole in the ozone layer like that in the Southern Hemisphere, Wallace said. The ozone layer acts as a shield against ultraviolet rays from the sun, but ozone is depleted when it interacts with chlorofluorocarbons, or CFCs, once common ingredients in refrigerants. International cooperation has greatly reduced the production of CFCs, but enough still linger from past decades to damage the ozone if wintertime stratospheric temperatures become cold enough to activate them. "If the Northern Hemisphere polar stratosphere keeps getting colder, the ozone becomes more susceptible to the CFCs that are there," Wallace said. "We may have chronic problems with ozone depletion despite curbs on CFC releases."
Ozone Holes
1999
May 31, 1999
https://www.sciencedaily.com/releases/1999/05/990531072528.htm
USGS Study Confirms An Urban Air-Pollution Problem At Mount Rainier National Park
The scenery is spectacular, but don't go for the pure mountain air. According to a recently published study, air in Washington's Mount Rainier National Park contains higher concentrations of ozone, a major component of air pollution, than nearby urban areas. This means local rural residents and park visitors, as well as the beautiful forests, wetlands and alpine meadows of the park, are being exposed to elevated levels of this pollutant, especially during those warm summer days that favor the production of ozone.
The study, by Dr. David Peterson of the USGS Forest and Rangeland Ecosystem Science Center, was published in the scientific journal "Atmospheric Environment," and adds to the growing body of evidence that protected areas such as national parks are vulnerable to pollution from outside sources. The ozone at Mt. Rainier, for example, is blown in from Seattle and other urban centers in western Washington. Ozone is a natural component of the Earth's atmosphere, but its effects vary depending on where and in what concentration it occurs. High above the Earth's surface, in the stratosphere, a protective layer of ozone screens the earth from biologically harmful frequencies of ultraviolet radiation. The stratospheric ozone layer is essential to the existence of many forms of life. However, in the lower part of the atmosphere, called the troposphere, human-produced ozone can be a dangerous pollutant. The colorless gas is formed from byproducts released during the burning of fossil fuels, and can be toxic to both plants and animals, including humans, even at fairly low concentrations. "It's well documented that both periodic episodes of high ozone exposure and chronic moderate ozone exposure can be harmful to plants," said Peterson. "We know this from other regions of North America including national parks such as Sequoia and Great Smoky Mountains." Sources of ozone-producing compounds, including automobile exhaust and fuel-burning industries, tend to be concentrated in urban areas. But the common assumption that ozone pollution is strictly an urban problem is proving to be false, Peterson said. Peterson and his students monitored ozone at Mount Rainier National Park from 1993 to 1997, and quantified the spatial distribution of this pollutant throughout western Washington in 1996. The park consistently had the highest average weekly levels of tropospheric ozone measured anywhere in the state. Ozone concentrations tend to increase at higher altitudes, partly because of passive dispersion from the stratosphere, but largely due to the transport of pollutants by prevailing winds inland from urban sources, such as the Seattle metropolitan area. A common visitor destination in the park, known as Paradise, is on the southern slope of Mount Rainier at an elevation of 5,400 feet and a distance of about 60 miles from Seattle. Peterson said that Paradise had almost twice the monthly mean ozone concentration of Lake Sammamish, which is near sea level and less than ten miles east of Seattle. Mount Rainier National Park resource manager Barbara Samora said the ozone study "is just one more indication of how difficult it is to protect our national parks. It isn't a simple matter of telling people to stay on footpaths or of sensibly locating campgrounds and parking lots. How do you manage a threat that is produced 60 miles from the park and transported here by winds?" A number of other North American wildlands have concentrations of tropospheric ozone sufficiently high to harm vegetation. They, too, are located downwind of metropolitan areas. These areas include El Desierto de Los Leones near Mexico City, San Bernardino National Forest to the east of Los Angeles, Sequoia National Park and Sequoia National Forest to the east of several metropolitan areas in California, and Great Smoky Mountains National Park in the eastern U.S. Peterson noted that the rapidly urbanizing Puget Sound region, with its growing population and additional vehicles, will produce more local pollution sources in the future.
"It is time to consider the potential for damage to park ecosystems and even potential health hazards to park visitors," Peterson said. "We've long known that parks cannot be managed as islands, separate from their surroundings. These findings about tropospheric ozone concentrations lend additional support to that management position." As the nation's largest water, earth and biological science and civilian mapping agency, the USGS works in cooperation with more than 2000 organizations across the country to provide reliable, impartial, scientific information to resource managers, planners, and other customers. This information is gathered in every state by USGS scientists to minimize the loss of life and property from natural disasters, contribute to the sound conservation, economic and physical development of the nation's natural resources, and enhance the quality of life by monitoring water, biological, energy, and mineral resources.
Ozone Holes
1999
May 28, 1999
https://www.sciencedaily.com/releases/1999/05/990527152937.htm
Ozone Linked To Warmer Weekend Temperatures In Toronto
Higher amounts of ground-level ozone on weekends compared to weekdays are causing warmer weekend weather in Toronto, according to a U of T study.
Professor William Gough and graduate student Gary Beaney of environmental science at the University of Toronto at Scarborough have found that, contrary to expectation, the lack of a Saturday morning rush hour in the Greater Toronto Area increases the amount of harmful ozone in the atmosphere. Ground-level ozone forms when air pollutants such as car exhaust mix with sunlight, but on weekday mornings when there is little sun the rush hour pollutants actually destroy much of the previous day's ozone. While this phenomenon has been documented in other major cities, Gough and Beaney also found that when ozone levels are very high on weekends the temperatures are one degree Celsius higher than during the week. "This strong correlation between ozone levels and temperature challenges the assumption that ozone is a minor contributor to greenhouse warming compared to carbon dioxide," says Gough. "It may have implications for climate change assessment and strategies to reduce urban smog—for example, reducing emissions in the morning rush hour may not be as important as reducing emissions later in the day." Beaney analysed data over approximately 30 years from Environment Canada and the Ontario Ministry of the Environment. CONTACT: Megan Easton
Ozone Holes
1999
April 12, 1999
https://www.sciencedaily.com/releases/1999/04/990412075538.htm
Link Between Solar Cycle And Climate Is Blowin' In The Wind
Researchers have found that variations in the energy given off by the sun affect the Earth's wind patterns and thus the climate of the planet, according to results of a new study published in the April 9 issue of Science.
For decades, scientists have tried to understand the link between winds and temperature and the sun and its cycles. There were tell-tale signs of a connection. For instance, the Little Ice Age recorded in Europe between 1550 and 1700 happened during a time of very low solar activity. But how the sun and climate were linked continued to elude researchers. According to Drew Shindell, a climate researcher from NASA's Goddard Institute for Space Studies in New York, NY, and lead author of the new study, a key piece of the puzzle was missing. Previous studies neglected to take into account the effects of increased solar activity on the ozone layer or the complex chemistry of the upper atmosphere, where most of the high-energy radiation, including ultraviolet radiation (the kind responsible for creating the ozone layer), gets absorbed. "When we added the upper atmosphere's chemistry into our climate model, we found that during a solar maximum major climate changes occur in North America." The changes, according to Shindell, are caused by stronger westerly winds. Changes also occur in wind speeds and directions all over the Earth's surface. "Solar variability changes the distribution of energy," said Shindell. "Over an 11-year solar cycle, the total amount of energy has not changed very much. But where the energy goes changes as wind speeds and directions change." During the sun's 11-year cycle, from a solar maximum to a solar minimum, the energy released by the sun changes by only about a tenth of a percent. When the solar cycle is at a maximum, it puts out a larger percentage of high-energy radiation, which increases the amount of ozone in the upper atmosphere. The increased ozone warms the upper atmosphere and the warm air affects winds all the way from the stratosphere (that region of the atmosphere that extends from about 6 to 30 miles high) to the Earth's surface. "The change in wind strength and direction creates different climate patterns around the globe," said Shindell. According to Shindell, the new study also confirms that changing levels of energy from the sun are not a major cause of global warming. Many scientists have argued that the radiation change in a solar cycle - an increase of two to three tenths of a percent over the 20th century - is not strong enough to account for the observed surface temperature increases. The GISS model agrees that the solar increases do not have the ability to cause large global temperature increases, leading Shindell to conclude that greenhouse gases are indeed playing the dominant role. The general circulation model used in the study included solar radiation data from NASA's Upper Atmospheric Research Satellite, launched in 1991. With data from UARS, which was used to calculate ozone changes, scientists have good measurements of how much radiation the sun puts out, increasing the accuracy of the new model.
Ozone Holes
1999
March 25, 1999
https://www.sciencedaily.com/releases/1999/03/990325052848.htm
Atlanta An "Urban Heat Island," With Higher Temperatures Than Surrounding Area, New NASA Study Shows
ATHENS, Ga. -- Atlanta, Georgia, is an island unto itself - an "urban heat island" - that can have temperatures up to 10 degrees Fahrenheit higher than surrounding areas, creating its own weather and causing thunderstorms.
That's the conclusion of a new NASA-sponsored study whose results were revealed today in Honolulu, Hawaii, at the annual meeting of the Association of American Geographers. "We used geographic information system technology to see how land use has changed over the past two decades," said Dr. C. P. Lo, a geographer from the University of Georgia. "It's a very useful technique to see how land cover has changed." Lo and graduate student Xiaojun Yang presented their data at the meeting today in Honolulu. All large urban areas are warmed by their own urban heat islands as a result of the removal of trees and the paving of land, according to Dale Quattrochi and Jeffrey Luvall of NASA's Marshall Space Flight Center, who lead the Atlanta Land-use Analysis: Temperature and Air-quality (ATLANTA) project. Dark, heat-absorbing materials for roofs and roads create the problem. During the day, dark materials absorb heat and hold it long after the sun sets, keeping cities hot hours longer than outlying rural areas. The added heat intensifies Atlanta's air quality problem. The city is plagued with serious ozone pollution. Smog levels are intensified by the urban heat island because with a 10-degree rise in temperature, the chemical reaction that creates ozone - the molecule responsible for smog - doubles, according to Luvall. Ozone is only produced in warm summer months. Ozone is a health hazard regulated by the Environmental Protection Agency (EPA). The ATLANTA project began in 1996 to help solve the problems created and enhanced by urban heat islands. With funding through NASA's Earth Observing System, investigators from a variety of disciplines and institutions are looking at how land use changes since the 1970s have intensified the urban heat island effect. New results of these studies will be presented in a session at the AAG meeting. "NASA had already done a study like this with Huntsville, Alabama, and when it was finished, I suggested that we do Atlanta," said Lo. "It fit well into NASA's project examining urban environments and global change. And that's when we realized we needed many other experts if we were to understand the effects of development on Atlanta." To understand the distribution of increasing populations over the Atlanta metropolitan area, Lo and Yang use aerial photos and Landsat satellite data to study the area's growth since 1973. By interpreting these images, they can see where the vegetation is disappearing and being replaced by roads and suburbs. Lo and Yang report today that between 1973 and 1998, nearly 350,000 acres of forest area have been cleared in Atlanta's 13 metropolitan counties. Replacing the forests are mainly suburbs, according to Lo. Since 1973, the area of developed suburbs ("low density residential area") has doubled to nearly 670,000 acres. Commercial development also doubled. The expanding population and loss of vegetated land leads to a larger urban heat island, according to Lo. Robert Gillies, a Utah State University geographer, uses satellite data to map the heat coming off Atlanta's urban area. When land is covered by plants or soil containing water, heat absorbed during the day is quickly removed by evaporation and plant transpiration - the way that plants lose water through their leaves. From an instrument aboard a National Oceanographic and Atmospheric Administration (NOAA) satellite that detects radiated heat from the earth, Gillies can map what parts of the city are hotter than others, based on which areas are losing heat more quickly.
Gillies will report on the heat distribution around the city, including an intense hot zone encompassing 17 square miles (45 square kilometers) in Atlanta's central business district. Robert Bornstein and Qing Lu Lin, meteorologists from San Jose State University, used data from meteorological stations set up during the 1996 Summer Olympics and discovered that the urban heat island in Atlanta creates thunderstorms south of the city. When the city heats up, a zone of low air pressure forms; cooler, denser air rushes in from the surrounding areas and forces the warm air to rise. The city creates its own wind, and the rising hot air triggers convective thunderstorms, said Bornstein. Increasing thunderstorms could cause urban flooding, said Bornstein, especially because large areas of ground are paved and rainwater can't be absorbed into the soil. One benefit of the added thunderstorms is that the precipitation cleans the atmosphere of pollutants and cools the city. Colorado State University meteorologists Stanley Kidder and Jan Hafner are using Geostationary Operational Environmental Satellite (GOES) and Landsat data to study how clouds interact with Atlanta's urban heat island. They will report on new research into how large urban areas affect cloud cover and how the clouds tend to decrease ozone production by blocking sunlight and cooling the ground surface. "The presence of forest has a large modification effect on local climate," said Lo, "but we can't really tell exactly where it begins or how much it changes local climate generally. What we can say is that there is a huge increase in urban heat, making Atlanta an island in this regard." The project team now hopes to gain funding to extend the project to include modeling of the many events and land-use practices affecting the area.
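Lo and Yang's change detection boils down to classifying each image date into land-cover categories and cross-tabulating pixels between dates. The sketch below illustrates that bookkeeping with toy arrays and hypothetical class codes; it is not the ATLANTA project's actual processing chain.

```python
import numpy as np

# Hypothetical class codes standing in for a real classification legend.
CLASSES = {0: "forest", 1: "low-density residential", 2: "commercial", 3: "other"}

def change_matrix(older, newer, n_classes=4):
    """Cross-tabulate two classified rasters: entry [i, j] counts pixels
    that were class i at the first date and class j at the second."""
    idx = older.astype(int) * n_classes + newer.astype(int)
    return np.bincount(idx.ravel(), minlength=n_classes**2).reshape(n_classes, n_classes)

# Toy 4x4 "rasters" standing in for classified 1973 and 1998 Landsat scenes.
lc_1973 = np.array([[0, 0, 0, 3],
                    [0, 0, 1, 3],
                    [0, 1, 1, 2],
                    [3, 3, 2, 2]])
lc_1998 = np.array([[0, 1, 1, 3],
                    [1, 1, 1, 3],
                    [1, 1, 1, 2],
                    [3, 3, 2, 2]])

m = change_matrix(lc_1973, lc_1998)
print(f"forest -> residential: {m[0, 1]} pixels")
# Multiplying pixel counts by pixel area (a 30 m Landsat pixel is 0.09 ha,
# about 0.22 acres) converts the matrix into acres of change.
```

Scaled by pixel area, counts of this kind yield figures like the 350,000 acres of cleared forest reported above.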
Ozone Holes
1999
March 19, 1999
https://www.sciencedaily.com/releases/1999/03/990319060957.htm
Study Rethinks Atmospheric Chemistry From Ground Up
WEST LAFAYETTE, Ind. -- Don't be snowed by everything the textbooks have to say about atmospheric chemistry.
A Purdue University research team studying natural processes that affect ozone in the Arctic atmosphere has discovered that snowpacks not only absorb chemicals from the atmosphere, but also can help produce them. The findings, published in the March 18 issue of the scientific journal Nature and the March 15 issue of Geophysical Research Letters, cast a new light on scientists' perceptions of how atmospheric gases are processed, says Paul Shepson, professor of atmospheric chemistry at Purdue. The new findings also may affect the way that scientists view data from ice core studies, because researchers have assumed that the air trapped in ice provided representative samples of atmospheric conditions at the time the ice was formed. "Ice core studies designed to look at reactive species such as nitrates may have to be revisited, as the air bubbles found in these ice cores may not be the mirrors of atmospheric composition that we suspected they were," Shepson says. This is not a concern for more stable greenhouse gases such as carbon dioxide and methane, which have been extensively studied in ice cores, because these stable gases are less likely to react with other compounds in snow or ice, Shepson says. His group studies the chemistry of ozone in the troposphere, the lowest part of the atmosphere. Ozone, a beneficial component of the earth's upper atmosphere, is a pollutant at the ground level. Last winter, Shepson led a research group to the Canadian Arctic to observe how sunlight interacts with various gases in the atmosphere to reduce near-surface ozone levels. "It has recently been observed that, at polar sunrise, which occurs in March or April after several months of complete darkness, ozone in a thin layer of air over the Arctic Ocean is completely removed," Shepson says. "This was a big surprise to us, and it indicates that our understanding of atmospheric ozone is poor." From the Environment Canada research site at the Canadian Forces base at Alert, the group tracked levels of atmospheric compounds, including formaldehyde, over a two-month period. Formaldehyde is an important part of the atmosphere's self-cleaning mechanism because it is a major source of free radicals, Shepson says. "The atmosphere acts to clean itself of pollutants through reactions involving free radicals.
When formaldehyde absorbs light, it falls apart to produce these free radicals." Previous studies of formaldehyde in the Arctic had shown concentrations up to 10 times higher than expected, so graduate student Ann Louise Sumner spent two months at the Alert laboratory measuring formaldehyde in the snowpack and in the atmosphere. These measurements, published in the Nature article, suggest that formaldehyde is produced through photochemical reactions at the snow surface. "The data account for much of the discrepancy between the high concentrations of formaldehyde found in the Arctic and the amounts predicted by our models," Shepson says. The second paper, published in Geophysical Research Letters, reports on studies at the ice core site at Summit, Greenland, where the Purdue group participated in an experiment led by Richard Honrath of Michigan Technological University. The studies found further complexity and importance in the photochemical processes that occur at the snow surface, Shepson says. Specifically, the team found that concentrations of nitric oxide and nitrogen dioxide -- collectively known as NOx -- were actually higher within the snowpack than in the atmosphere. The findings suggest that nitrate ions in the snow can interact with sunlight to produce NOx, a pollutant derived largely from the combustion of fossil fuels and a critical precursor to the production of ozone in the atmosphere, Shepson says. "This observation changes the way we look at atmospheric chemistry in a fundamental way, in that deposition of nitric acid to the snow was previously regarded as the final fate of NOx," he says. "Now it appears that nitric acid in the snow can be reprocessed by interactions with light, causing re-release of a variety of pollutants back into the atmosphere." In addition to forcing a re-evaluation of data from ice core studies, the new findings call into question some models that are used to predict long-term changes in the composition of our atmosphere. "Specifically, models of atmospheric chemistry need to do a better job of treating the interaction of gases with surfaces," Shepson says. "Although we are starting to do better with atmospheric particles, it is important to remember that a potentially important atmospheric surface is the surface of the earth." Shepson and his group are working with another group at Purdue to develop new computer models that incorporate the chemical reactions that occur in snowpacks into current models of atmospheric chemistry and transport. Working with Shepson on the studies are graduate students Bryan Splawn, a native of Spartanburg, S.C., Sumner of Lake in the Hills, Ill., and Brian Michalowski of Racine, Wis. Shepson's studies at Purdue are funded by the National Science Foundation and BASF.
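As a rough illustration of the kind of snowpack process such models must capture, the sketch below integrates a toy first-order photolysis of snowpack nitrate into NOx with an assumed rate constant and a crude day-night cycle. All values are invented for illustration; this is not Shepson's model.

```python
import math

# Toy snowpack photochemistry: nitrate in snow photolyzes to NOx in sunlight.
J_MAX = 3.0e-7       # assumed peak photolysis frequency, 1/s (illustrative)
NITRATE_0 = 5.0e13   # assumed initial snowpack nitrate, molecules/cm^2

def j(t_hours):
    """Crude diurnal cycle: photolysis runs only while the sun is up."""
    return J_MAX * max(0.0, math.sin(math.pi * (t_hours % 24.0) / 12.0))

def integrate(days=5, dt=600.0):
    """Forward-Euler integration of d[NO3-]/dt = -j(t) * [NO3-]."""
    nitrate, nox, t = NITRATE_0, 0.0, 0.0
    while t < days * 86400.0:
        loss = j(t / 3600.0) * nitrate * dt
        nitrate -= loss
        nox += loss   # assume each photolyzed nitrate releases NOx to the air
        t += dt
    return nitrate, nox

nitrate, nox = integrate()
print(f"nitrate left: {nitrate:.3e}, NOx released: {nox:.3e} molecules/cm^2")
```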
Ozone Holes
1999
March 11, 1999
https://www.sciencedaily.com/releases/1999/03/990310131223.htm
Atmospheric Scientists Fly Over South Pacific To Sample "World's Cleanest Air"
BOULDER -- Two research aircraft and 100 scientists and support staff are heading to the South Pacific to study what some have called the world's cleanest air. The March 10-April 30 flights are part of the second Pacific Exploratory Mission to the Tropics (PEM-Tropics B). Nine principal investigators from the National Center for Atmospheric Research (NCAR) are participating in the mission, which is part of the Global Tropospheric Experiment sponsored by the National Aeronautics and Space Administration (NASA). NCAR's primary sponsor is the National Science Foundation.
Researchers will gather data on the chemical species that affect formation of tropospheric ozone and sulfate aerosols. The goal is to determine how well the earth's atmosphere cleanses itself. This chemical process, called oxidation, includes removal of gases that would otherwise be warming the troposphere or causing stratospheric ozone depletion. The tropics play a key role in determining the global oxidizing power of the atmosphere because the high levels of humidity and ultraviolet radiation found there promote the formation of oxidizing molecules. PEM researchers aboard the NASA-Ames DC-8 jet and the NASA-Goddard P-3B turboprop will measure hundreds of chemical species and compounds using 35 instruments. The aircraft will be deployed from sites in Hawaii, Christmas Island, Tahiti, American Samoa, Easter Island, and Fiji to cover an area ranging roughly from the Cook Islands to the west coast of South America. Over 30 principal investigators from 17 U.S. universities and research laboratories are involved in the experiment. The chemistry of the tropical Pacific's troposphere (the atmosphere's lowest layer, which reaches an average height of 16 kilometers, or 10 miles, over the tropics) was largely unknown until the PEM-Tropics A mission in 1996, when researchers took extensive measurements during the Southern Hemisphere's dry season from August to October. They found significant levels of human-generated pollution, primarily from fires set to clear land for agriculture in Africa and possibly South America. PEM-Tropics B is returning during the wet season, when the impact of biomass burning in the Southern Hemisphere is expected to be much lower. "We will have so many airborne instruments taking measurements that we'll be able to draw some conclusions about the chemistry of sulfate aerosols and the chemistry that's responsible for production and loss of tropospheric ozone," explains NCAR scientist Brian Ridley. The ability of sulfate aerosols to reflect the sun's radiation may be one reason that increasing greenhouse gases have not warmed the earth as much as some climate models have predicted. Sulfates also contribute to local pollution and acid rain. While the sources of tropospheric ozone include biomass burning and urban smog, this trace gas is also involved in the oxidizing process. Understanding the life cycle of ozone in the troposphere is vital to understanding the oxidizing capacity of the atmosphere. NCAR scientist Lee Mauldin is creating a Web site for the PEM mission aimed at K-12 students around the world. NCAR is managed by the University Corporation for Atmospheric Research, a consortium of more than 60 universities offering Ph.D.s in atmospheric and related sciences.
Ozone Holes
1999
December 3, 1998
https://www.sciencedaily.com/releases/1998/12/981202114432.htm
Ozone Above Indian Ocean Linked To African Lightning
BOULDER -- In one of the first studies to trace lightning's chemical impact across thousands of miles, a team of atmospheric chemists has connected a region of elevated ozone levels in the eastern Indian Ocean with lightning produced in Africa. The results will be presented December 6 at the American Geophysical Union conference in San Francisco by Louisa Emmons, a visiting scientist at the National Center for Atmospheric Research (NCAR). NCAR's primary sponsor is the National Science Foundation.
Emmons and colleagues examined a set of ozone data collected over four years between Japan and Antarctica for their paper, "Evidence of Transport Across the Indian Ocean of Ozone Produced from Biomass Burning and Lightning" (AGU paper A12D-11). Her coauthors are Didier Hauglustaine (France's Centre National de la Recherche Scientifique), Michael Newchurch (University of Alabama at Huntsville), Toshi Takao and Kouji Matsubara (Japan Meteorological Agency), and Guy Brasseur (NCAR). The research was funded by the National Aeronautics and Space Administration. Lightning is known to produce nitrogen oxides (NOx) within thunderstorms. These chemicals may react with others in the presence of sunlight to produce ozone. Until now, most related studies have focused on measuring the production of NOx in the immediate vicinity of storms. However, the ozone produced has a long lifetime in the upper troposphere and thus could be carried over long distances. According to Emmons and colleagues, ozone from storms across southern Africa is being transported by the subtropical jet stream to Australia. Ozone measurements between 2 and 6 miles in altitude (3-10 kilometers) over a large part of the eastern Indian Ocean were as high as 80 parts per billion, similar to a polluted day in a U.S. city and several times more than normal levels, says Emmons. To analyze the source of this ozone, she and colleagues used a new computer model of atmospheric chemistry called MOZART, developed at NCAR by Brasseur and Hauglustaine. Results from MOZART indicate that the ozone did not descend from the stratosphere, the most obvious source. Another possible source was the burning of forests and grasses upwind in Africa. When biomass burning was removed from the model calculations, ozone levels remained high, but when African lightning was removed, the ozone levels dropped significantly. The MOZART results are consistent with the observations above. "Although there are uncertainties in the model results," says Emmons, "they indicate that lightning has a far-reaching and significant impact on tropospheric chemistry." The University Corporation for Atmospheric Research, a consortium of more than 60 universities offering Ph.D.s in atmospheric and related sciences, manages NCAR.
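The attribution logic of the MOZART experiments -- rerun the model with one emission source zeroed out and see how much ozone disappears -- can be mimicked with a toy box model. The source contributions below are made-up numbers, and real atmospheric chemistry is nonlinear, so this is only a sketch of the bookkeeping:

```python
# Toy "zero-out" attribution in the spirit of the MOZART experiments:
# rerun with one source switched off and see how much ozone disappears.
def toy_ozone(sources):
    """Stand-in for a chemistry-transport model run: a background ozone
    level plus whatever sources are switched on (linearity assumed)."""
    background = 25.0  # ppb, assumed stratospheric/other contribution
    return background + sum(sources.values())

full = {"african_lightning": 35.0, "biomass_burning": 10.0, "urban": 8.0}  # ppb, made up
base = toy_ozone(full)

for name in full:
    off = {k: (0.0 if k == name else v) for k, v in full.items()}
    drop = base - toy_ozone(off)
    print(f"zeroing {name:17s} lowers ozone by {drop:4.1f} ppb "
          f"({100 * drop / base:.0f}% of {base:.0f} ppb)")
```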
Ozone Holes
1998
October 26, 1998
https://www.sciencedaily.com/releases/1998/10/981026070151.htm
NASA Helps "Hot" Cities Cool Down
Environmental planning for the 2002 Olympic Games, strategies to reduce ozone levels, focused tree-planting programs and identification of cool roofs are early spinoffs from a NASA urban study just concluding in three U.S. cities.
Researchers from NASA's Marshall Space Flight Center, Huntsville, AL, flew a thermal camera mounted on a NASA aircraft over Baton Rouge, LA; Sacramento, CA; and Salt Lake City, UT. The thermal camera took each city's temperature and produced an image that pinpoints the cities' "hot spots." The researchers are using the images to study which city surfaces contribute to bubble-like accumulations of hot air, called urban heat islands. The bubbles of hot air develop over cities as naturally vegetated surfaces are replaced with asphalt, concrete, rooftops and other man-made materials. "One thing's for sure, the three cities we've looked at were hot," said the study's lead investigator, Dr. Jeff Luvall of Marshall's Global Hydrology and Climate Center. "They can use a lot of trees and reflective rooftops." Salt Lake City is using the early results to help plan sites for the 2002 Olympic Games and develop strategies to reduce ground-level ozone concentrations in the Salt Lake City valley. Though at high altitudes ozone protects the Earth from ultraviolet rays, at ground level it is a powerful and dangerous respiratory irritant found in cities during the summer's hottest months. In Sacramento and Baton Rouge, city planners and tree-planting organizations are using the study to focus their tree-planting programs. "We are helping the cities incorporate the study into their urban planning," said Maury Estes, an urban planner on the science team at Marshall. "By choosing strategic areas in which to plant trees and by encouraging the use of light-colored, reflective building material, we think that the cities can be cooled." The science team will continue to analyze the thermal heat information and work with the cities to incorporate future results into the cities' plans. The team plans to disseminate its findings nationally so other cities can incorporate what the team has learned into their long-range growth plans. This study is supported by NASA's Earth Science enterprise, which is responsible for a long-term, coordinated research effort to study the total Earth system and the effects of natural and human-induced changes on the global environment. The project also supports the enterprise's efforts to make near-term economic and societal benefits of Earth science research and data products available to the broader community of public and private users. Working on the study are researchers from Marshall; the Environmental Protection Agency, Washington, DC; the Department of Energy, Washington, DC; Lawrence Berkeley National Laboratory, Berkeley, CA; Baton Rouge Green, LA; the Sacramento Tree Foundation, CA; Tree Utah, Salt Lake City; and the Utah State Energy Services Department, Salt Lake City.
Ozone Holes
1998
October 7, 1998
https://www.sciencedaily.com/releases/1998/10/981007072632.htm
Antarctic Ozone Depletion Sets New Size Record
NASA and NOAA satellites show that the Antarctic ozone thinning covers the largest expanse of territory since the depletion developed in the early 1980s. The measurements were obtained this year between mid-August and early October using the Total Ozone Mapping Spectrometer (TOMS) instrument aboard NASA's Earth Probe (TOMS-EP) satellite and the Solar Backscatter Ultraviolet Instrument (SBUV) aboard the NOAA-14 satellite.
"This is the largest Antarctic ozone hole we've ever observed, and it's nearly the deepest," said Dr. Richard McPeters, Principal Investigator for Earth Probe TOMS. Preliminary data from the satellites show that this year's ozone depletion reached a record size of 10.5 million square miles (27.3 million square kilometers) on Sept. 19, 1998. The previous record of 10.0 million square miles was set on Sept. 7, 1996. The ozone level fell to 90 Dobson units on Sept. 30, 1998. This nearly equals the lowest value ever recorded of 88 Dobson Units seen on Sept. 28, 1994, over Antarctica. Scientists are not concerned that the hole might be growing because they know it is a direct result of unusually cold stratospheric temperatures, though they do not know why it is colder this year. The decrease in ozone, however, could result in more acute solar or ultraviolet radiation exposure in southern Chile and Argentina if the ozone hole were to pass over that region. One of the primary concerns with an ozone hole of this size is that as the hole "breaks up," the ozone-depleted air will diffuse and reduce the overall ozone levels in the mid-latitudes of the southern hemisphere. These ozone losses are caused by chlorine and bromine compounds released by chlorofluorocarbons (CFCs) and halons. Year-to-year variations of size and depth of the ozone hole depend on the variations in meteorological conditions. Scientists believe that the decrease in Antarctic ozone is attributed to unusually cold (by 5-9 degrees Fahrenheit) temperatures in the southern middle and polar latitudes. "This year was colder than normal and therefore enables greater activation of reactive chlorine that ultimately causes more ozone loss and lower ozone levels," said Dr. Alvin J. Miller of the National Centers for Environmental Prediction (NCEP). This decrease in ozone was observed earlier than usual with the hole opening in mid-August about two weeks before a typical year. This is consistent with expectations, since chlorine levels have slightly increased since the early 1990s. As a result of international agreements known as the Montreal Protocol on ozone-depleting substances (and its amendments), chlorine levels from CFCs already have peaked in the lower atmosphere and should peak in the Antarctic stratosphere within a few years. As we move into the next century, chlorine-catalyzed ozone losses resulting from CFCs and other chlorine-containing species will be reduced. "An ozone hole of substantial depth and size is likely to continue to form for the next few years or until the stratospheric chlorine amount drops to its pre-ozone hole values," said Dr. Paul Newman at NASA's Goddard Space Flight Center (GSFC), Greenbelt, MD. "The decrease in chlorine in our atmosphere is analogous to using a small air cleaner to recycle all of the air in a large indoor sports stadium -- it will take a very, very long time." Scientists and others have a keen interest in ozone depletion, given that the increased amounts of ultraviolet radiation that reach the Earth's surface because of ozone loss have the potential to increase the incidence of skin cancer and cataracts in humans, harm some crops, and interfere with marine life. NASA and NOAA instruments have been measuring Antarctic ozone levels since the early 1970s. Since the discovery of the ozone hole in 1985, TOMS and SBUV have been key instruments for monitoring ozone levels over the Earth. 
Analysis of TOMS and SBUV data has traced in detail the annual development of the Antarctic "ozone hole," a large area of intense ozone depletion that occurs between late August and early October. Analysis of the historical data indicates that the hole has existed since at least 1979. A Dobson unit measures the physical thickness that the ozone layer would have if it were compressed to the pressure found at the Earth's surface. The global average ozone layer thickness is 300 Dobson units, which equals three millimeters, or about 1/8th of an inch -- roughly the thickness of two stacked pennies. In contrast, during these annual occurrences the ozone layer thickness inside the hole falls to about 100 Dobson units (1 millimeter, or about 1/25th of an inch) -- approximately the thickness of a single dime. Ozone shields life on Earth from the harmful effects of the Sun's ultraviolet radiation. The ozone molecule is made up of three oxygen atoms, and most atmospheric ozone is found in a thin layer between 6 and 18 miles up, where it absorbs harmful ultraviolet radiation from the Sun. TOMS-EP and other ozone-measurement programs are key parts of a global environmental effort of NASA's Earth Science enterprise, a long-term research program designed to study Earth's land, oceans, atmosphere, ice and life as a total integrated system. Goddard developed and manages the operation of TOMS-EP for NASA's Office of Earth Science, Washington, DC.
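The Dobson unit arithmetic quoted above is easy to check: 1 DU corresponds to a 0.01-millimeter layer of pure ozone at surface pressure, or about 2.69 x 10^16 molecules per square centimeter (the molecules-per-DU figure is a standard value, not from this article). A quick verification:

```python
# Dobson unit sanity checks: 1 DU corresponds to a 0.01 mm layer of pure
# ozone at surface pressure, or about 2.69e16 molecules/cm^2.
MM_PER_DU = 0.01
MOLEC_PER_DU = 2.69e16

for du in (300, 100, 90):
    mm = du * MM_PER_DU
    print(f"{du:3d} DU -> {mm:4.1f} mm ({mm / 25.4:.3f} in), "
          f"{du * MOLEC_PER_DU:.2e} molecules/cm^2")
# 300 DU -> 3.0 mm (~1/8 in), the global average column;
# 100 DU -> 1.0 mm (~1/25 in), typical inside the ozone hole.
```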
Ozone Holes
1998
April 9, 1998
https://www.sciencedaily.com/releases/1998/04/980409081315.htm
Arctic Ozone Hole, Responding To Greenhouse Gases, Will Worsen Through 2020, Columbia Team Finds
An ozone hole in the Arctic is expected to grow larger as a result of greenhouse gas accumulation and should worsen through the year 2020 before recovering, according to a new study by a team of climate scientists at Columbia University and NASA's Goddard Institute for Space Studies.
The loss of ozone in the Arctic by 2020 will be about double what would occur without greenhouse gases, the scientists report. Ozone absorbs harmful ultraviolet radiation from the sun, and its depletion over the poles is thought to be a cause of increased global levels of skin cancer. Researchers used computer models to project the combined effects over time of future emissions of greenhouse gases and ozone-depleting halogens, the first time such interactions have been studied. The work appears in the April 9 issue of the British journal Nature and is reported by Drew T. Shindell, associate research scientist at Columbia's Center for Climate Systems Research; David H. Rind, senior scientist at the Goddard Institute, adjunct professor of earth and environmental sciences at Columbia and adjunct senior research scientist at Columbia's Lamont-Doherty Earth Observatory; and Patrick Lonergan of Science Systems and Applications Inc. Ozone losses had increased greatly in the 1990s in the Arctic and in late 1997 were the greatest ever observed, according to measurements by NASA satellites. Ozone is destroyed when polar stratospheric clouds form: reactions take place between ozone gas and chlorine, bromine and other halogen gases on the surface of ice or water droplets in these clouds. Most of those halogens are from chlorofluorocarbons emitted into the atmosphere by industrial processes. Because the number of particles that form in polar stratospheric clouds is extremely sensitive to changes in temperature, the reaction that results in ozone depletion is also very sensitive, occurring only below about minus 108 degrees Fahrenheit. Though greenhouse gases cause atmospheric warming at the Earth's surface, they cool the stratosphere, where ozone resides, and thus are a likely cause of the increased ozone depletion, the researchers said. "Since ozone chemistry is very sensitive to temperature, this cooling results in more ozone depletion in the polar regions," Dr. Shindell said. "Even very small amounts of stratospheric cooling can greatly increase ozone depletion." Temperatures are slightly warmer in the Arctic than the Antarctic during their respective winter and spring seasons, with the result that ozone losses in the Northern Hemisphere had been lower than in the Southern. But the Arctic stratosphere has gradually cooled over the last decade, resulting in the increased ozone loss, the scientists believe. One of the reasons for the warmer Arctic is that large-scale planetary atmospheric waves, similar to solitons in the oceans, deposit heat energy in the North, breaking up the atmospheric vortex of cold air that sits over the Arctic. In the simulations performed by the NASA-Columbia team, temperature and wind changes induced by greenhouse gases alter the propagation of planetary waves, which no longer disturb the Arctic vortex as often. The combination of greenhouse-induced stratospheric cooling and the increased stability of the Arctic polar vortex dramatically increases ozone depletion. Because of international controls on the emission of ozone-depleting halogens, those gases are expected to peak about the year 2000. In the Columbia-NASA model, Arctic ozone depletion will be worst in the decade 2010 to 2019, with two-thirds of atmospheric ozone lost in the most severely affected areas. The work was supported by the NASA Atmospheric Chemistry Modeling and Analysis Program and the NASA Climate Modeling Program.
Lamont-Doherty Earth Observatory and the Center for Climate Systems Research are part of the Columbia Earth Institute, launched to develop innovations for wise stewardship of the Earth.
Ozone Holes
1998
February 19, 1998
https://www.sciencedaily.com/releases/1998/02/980219062627.htm
Amphibian Mortality Due To Ultraviolet Radiation, Researchers Find
New Haven, Conn. -- Many frog and other amphibian species throughout the world appear to be experiencing declining populations, with several species already extinct and others showing alarming rates of deformities. No single cause has been identified. Some scientists believe habitat disturbances are to blame, although declines have occurred in relatively undisturbed areas.
Now, field experiments in the Oregon Cascade Mountains have confirmed what many scientists had suspected -- ambient levels of ultraviolet-B (UV-B) radiation from the sun can cause high rates of mortality and deformity in some species of frogs and other amphibians. The earth is shielded from UV radiation by the ozone layer, which is believed to be thinning because of the increased use of chlorofluorocarbons as refrigerants, solvents and cleaning agents. "There has been a great deal of recent attention to the suspected increase in amphibian deformities. However, most reports have been anecdotal, and no experiment in the field under natural conditions had been performed previously," said Joseph M. Kiesecker of Yale University, who presented his findings Feb. 17 at the annual meeting of the American Association for the Advancement of Science in Philadelphia. Kiesecker, along with Andrew R. Blaustein of Oregon State University, compared the embryos of long-toed salamanders shielded from UV-B radiation by mylar filters to unshielded embryos. They found that 95 percent of the shielded embryos hatched, compared to only 14.5 percent of the unshielded embryos. Even more striking, only 0.5 percent of the surviving shielded salamanders had deformities while 91.9 percent of the unshielded salamanders had deformities. Malformed tails, blisters and edema were the most frequent deformities. "The recent thinning of the protective ozone layer in the upper atmosphere has been linked to increased risks of skin cancer and cataracts in humans as well as to the destruction of fragile plant life. Deformed and dying frogs may be linked to thinning ozone as well," said Kiesecker, who is studying other possible factors, such as water level and quality, which also can affect the amount of UV-B radiation reaching amphibians. UV-B radiation also may impair disease defense mechanisms, making amphibians more susceptible to pathogens and parasites that may hamper normal development and increase mortality, Kiesecker said. For example, he found increased mortality associated with a pathogenic fungus (Saprolegnia ferax) infecting some embryos exposed to UV-B, while embryos under mylar filters were not infected. The UV-B may work synergistically with the fungus, said Kiesecker, who reports seeing an outbreak of fungal pathogens in a number of amphibian species in the last 10 years. Amphibians are ideal species for the study of UV-B exposure, he noted. Many lay their eggs in open, shallow water where exposure to UV-B is high. Typically, a population of 200 breeding pairs of toads, for example, will produce as many as 1 million embryos. Furthermore, amphibian species have varying amounts of an enzyme called photolyase, which is the principal enzyme for repairing UV damage to DNA. Photolyase attacks a major UV photoproduct in DNA -- cyclobutane pyrimidine dimers -- which can cause mutations and cell death if left unchecked. Kiesecker, a zoologist and postdoctoral fellow, reported that frog and toad species with the greatest photolyase activity had the lowest mortality rates in developing embryos. For example, he and his colleagues noted an increase in embryo mortality of 15 to 20 percent in the Western toad and the Cascade frog -- two species with low levels of photolyase -- while the Pacific tree frog, which has a high photolyase level, is thriving. All three species live in the same habitat in the Cascade Mountains. 
The field studies, which were completed in May and June 1997, also are reported in part in the December issue of the Proceedings of the National Academy of Sciences. Funding was from the National Science Foundation and the Donnelley Fellowship sponsored by the Yale Institute for Biospheric Studies.
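The strength of the shielding effect is easiest to appreciate as relative risks computed directly from the percentages reported above; the snippet below simply re-expresses those numbers.

```python
# Relative risks computed from the reported long-toed salamander results.
shielded = {"hatched": 0.95, "deformed": 0.005}
unshielded = {"hatched": 0.145, "deformed": 0.919}

failure_rr = (1 - unshielded["hatched"]) / (1 - shielded["hatched"])
deformity_rr = unshielded["deformed"] / shielded["deformed"]
print(f"hatching failure ~{failure_rr:.0f}x more likely without shielding")
print(f"deformity ~{deformity_rr:.0f}x more likely without shielding")
```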
Ozone Holes
1998
December 31, 1997
https://www.sciencedaily.com/releases/1997/12/971231085954.htm
How Will Increased Ultraviolet Radiation Affect Forests?
Consider these few facts:
Forests occupy about 31 percent of the Earth's land area. Forests make up over 90 percent of the Earth's biomass. Forests account for two-thirds of the carbon that is "fixed," or withdrawn, from the atmosphere. Forests thus play a major role in how much carbon is free in the atmosphere, which in turn affects the magnitude of the "greenhouse effect." Forests regulate not only the flow of water, but local, regional and global climate. Now add another fact: We know next to nothing about what effect increased ultraviolet-B radiation will have on forests as the stratospheric ozone shield continues to deteriorate over the next century. Also, since global processes do not operate in isolation, how will the UV-B effect on forests affect their ability to cope with anticipated global warming? Most research on the effect of increased UV-B on plants has been done on annual plants, such as crop plants, says tree physiologist John Bassman. But trees are much different in their relationship to increased UV-B. The most obvious difference is their longevity and the resulting increased exposure to UV-B radiation. With conifers, a single needle can stay on the tree -- and be exposed to UV radiation -- for up to 20 years. Another difference is trees' annual dormancy and their overall exposure to greater environmental extremes. Also, their large size results in considerable physiological complexity, such as the transport of water from roots to leaves far above the ground. Finally, whereas an annual plant might be able to adapt to climatic change, a tree is slow to adapt because it is so slow to respond genetically. Although public attention has lately been diverted from increased UV-B radiation to global climate change, the problem has not gone away. In fact, even if ozone-depleting emissions were halted immediately, the detrimental gases already in the stratosphere would break down only slowly; scientists estimate that their effect on the ozone layer could continue for another 100 years. So what effect, ask Bassman and others, will the resulting enhanced UV-B exposure have not only on individual trees but on forest ecosystems? Studies on agricultural species have shown that about 60 percent are at least moderately sensitive to high levels of UV-B radiation. Among other effects is a lower rate of photosynthesis. Ultraviolet-B radiation damages many important biological molecules, including proteins, DNA and RNA. One of Bassman and his colleagues' primary interests is what effect UV-B might have on RUBISCO, or ribulose-1,5-bisphosphate carboxylase/oxygenase. RUBISCO is not only the most abundant protein on Earth, it is the primary enzyme responsible for capturing carbon dioxide from the atmosphere. Based on work that's been done on crop and herbaceous plants, Bassman and others believe that increased carbon dioxide and global warming will offer a buffer against UV-B damage -- to a certain extent. Increased carbon dioxide can enhance plant growth. "But other things associated with that make the problem less than straightforward," says Bassman. From the broadest possible perspective, he continues, carbon dioxide is going to have a positive effect, at least on physiology. But combine that with the negative effect of UV-B radiation on photosynthesis and the result is far from certain. One thing Bassman worries about is whether the increased UV-B radiation will change carbon allocation within trees. Trees may have to put more of their photosynthetic products into protective mechanisms at the expense of growth.
There could also be more severe direct effects, says Bassman. And considering the role of trees in regulating atmospheric carbon, even small effects could in turn have large effects on climate change. One earlier series of studies on loblolly pine showed that enhanced radiation caused a 20 percent decrease in biomass. Another study, on sweetgum, found no reduction in biomass, even though the radiation did affect the rate of leaf elongation. As with other plants, the effect of increased UV-B seems to vary from species to species. Along with Gerald Edwards and Ron Robberecht, Bassman has begun a project to gather more information on the effect of enhanced UV-B radiation on trees. Doing so is not a simple matter; in fact, one reason so little is known is the difficulty of exposing trees to controlled, measurable amounts of UV-B radiation. Ambient UV-B exposure varies constantly: clouds, the angle of the sun, and the density of the surrounding canopy all affect how much radiation a tree is receiving. Only three or four studies across the country are attempting to mimic the natural environment outside the greenhouse. Bassman has rigged up a system that measures the UV-B output of the sun. It tracks the output second by second, then supplies multiples of that amount of UV-B to the trees, simulating natural exposure to enhanced levels of radiation. So if a cloud goes over the sun, the lamp levels correspondingly go down; as the cloud passes, the light level goes back up. The trees are subjected to the amount of extra UV-B that would be caused by a 25 percent reduction in stratospheric ozone and by a 50 percent reduction. Bassman and his colleagues are examining the effect on four species: poplar, red oak, ponderosa pine, and Douglas-fir. They will consider UV-B's effect on a number of processes: growth and biomass distribution, carbon uptake, carbon allocation and its partitioning into various chemical fractions, and leaf development, anatomy, morphology and aging.
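The tracking system Bassman describes is essentially proportional supplementation: measure ambient UV-B, scale it by the enhancement factor for the simulated ozone loss, and drive the lamps to supply the difference. A minimal sketch follows, with hypothetical sensor and lamp interfaces and illustrative enhancement factors (the true factors depend on wavelength and solar angle):

```python
import time

# Assumed enhancement factors: the multiple of ambient UV-B that simulates
# a 25% or 50% stratospheric ozone loss (illustrative values only).
ENHANCEMENT = {"ozone_minus_25pct": 1.5, "ozone_minus_50pct": 2.2}

def read_ambient_uvb():
    """Hypothetical sensor read (W/m^2); a real system would poll a
    broadband UV-B radiometer."""
    return 1.2

def set_lamp_output(watts_per_m2):
    """Hypothetical lamp driver; a real system would set ballast power."""
    print(f"lamps supplying {watts_per_m2:.2f} W/m^2 of supplemental UV-B")

def control_step(treatment):
    ambient = read_ambient_uvb()
    target = ambient * ENHANCEMENT[treatment]
    # Lamps make up only the difference between target and ambient, so when
    # a cloud lowers the ambient reading the lamp output drops with it.
    set_lamp_output(max(0.0, target - ambient))

for _ in range(3):    # a real controller loops continuously
    control_step("ozone_minus_25pct")
    time.sleep(1)     # poll interval; the article describes per-second tracking
```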
Ozone Holes
1997
December 5, 1997
https://www.sciencedaily.com/releases/1997/12/971205073230.htm
Banning Chemicals To Protect Ozone May Aggravate Global Warming, Atmospheric Scientist Says
CHAMPAIGN, Ill. -- Some of the chemicals being phased out to protect the ozone layer offer offsetting benefits, such as reducing global warming, a University of Illinois researcher says.
"By independently addressing the issues of ozone depletion and global warming, we are jeopardizing desirable options for one effect based on lesser -- or even inconsequential -- impacts on the other," said Don Wuebbles, director of the Environmental Council at the U. of I. and a professor of atmospheric sciences. "We need to stop looking at these issues as though they are separate from one another, and start considering them together when we determine environmental policy."In the Nov. 7 issue of the journal Science, Wuebbles and colleague James Calm, an engineering consultant in Great Falls, Va., write that the regulatory actions on certain chemicals -- imposed by both the Montreal Protocol and the U.S. Clean Air Act to protect the ozone layer -- will have little impact on stratospheric ozone while contributing unnecessarily to global warming."Most of the chemicals responsible for ozone depletion are also greenhouse gases," Wuebbles said. "Chlorofluorocarbons [CFCs], for example, tend to be severe offenders for both the depletion of ozone and for global warming. The need for their regulation is unambiguous."But there are other industrially important chemicals that fall into the gray area. "Some hydrochlorofluorocarbons [HCFCs], for example, have very short atmospheric lifetimes and mostly decompose before reaching the upper atmosphere," Wuebbles said. "Effects on ozone depletion from some of these compounds are likely to be negligible. Nevertheless, they are still being tightly regulated and eventually the intention of the rulings is that they be banned entirely."One such chemical, HCFC-123, is a high-performance refrigerant commonly used in the cooling systems of large buildings. Some of the intended replacements for HCFC-123 not only have much longer atmospheric lifetimes that could contribute to global warming, but they also are far less energy efficient."High efficiency translates into reduced emissions of carbon dioxide and other greenhouse gases from associated energy use, which, in net impact, dwarf those from incidental releases of the refrigerant itself," Wuebbles said. "High efficiency also reduces fuel and other resource requirements."It is probable that HCFC-123 and several other CFC replacements would have survived the ban if the global warming regulations had been implemented before the ones for ozone," Wuebbles said. "With keener awareness of the more limited options to reduce global warming, the framers of the Montreal Protocol and the U.S. Clean Air Act might have been more cautious in rejecting chemicals with minimal impacts and offsetting benefits."There are many other chemicals that also have special uses, small impacts, and where the replacements for them would cause other problems or issues," Wuebbles said. "In such cases, it might make more sense to reconsider current policy and allow the continued use of some chemicals."
Ozone Holes
1997
July 2, 1997
https://www.sciencedaily.com/releases/1997/07/970702203745.htm
NASA's Earth Science Program Adjusts To Loss Of Data From Japanese ADEOS Satellite
"The failure of Japan's Advanced Earth Observing Satellite (ADEOS or Midori) spacecraft with the two NASA instruments aboard it is a real blow to NASA's science program," said Mike Mann, Deputy Associate Administrator, NASA's Mission to Planet Earth Strategic Enterprise, Washington, DC. "Fortunately, much of the ozone data provided by the Total Ozone Mapping Spectrometer (TOMS) science instruments aboard ADEOS can be provided by instruments on another spacecraft. However, the sea-surface winds data provided by the NASA Scatterometer (NSCAT) will be harder to replace and were opening essentially new opportunities for research and operational users worldwide," Mann said. The two NASA instruments were aboard the ADEOS spacecraft, which on June 30 was declared lost by the National Space Development Agency of Japan (NASDA). "The collaboration between NASDA and NASA on this mission has been outstanding and is reflective of the great partnership that exists between Japan and the U.S. in the area of global change research," Mann said. "NASDA has performed in an exemplary and open manner in the development of the spacecraft and in dealing with us. However, space operations is a risky business; those of us involved in the business strive to limit the risk but sometimes mishaps do occur," Mann said. "The data we have obtained to date are extremely valuable," said Jim Graf, NSCAT project manager at NASA's Jet Propulsion Laboratory, Pasadena, CA. "If we knew we were limited to just nine months of data, we would have chosen the period we actually got. We obtained coverage over the summer and winter monsoon seasons and what may be the onset of an El Nino. Perhaps the largest loss is the discontinuity of the long-term data set, which is being used to understand interannual and decadal variations in our climate." The scatterometer measured wind speed and direction over the world's oceans. The data set is extremely valuable and versatile and is being used by climate change researchers, operational weather forecasters, and commercial ship routing firms. During its flight, the instrument gathered 42 weeks' worth of data. Within a few short months after launch, the value of ADEOS data was seen in U.S. weather forecasting. "NOAA had begun using ocean surface wind products, derived from NSCAT, in weather forecasting," said Helen Wood, Director, Office of Satellite Data Processing and Distribution, National Oceanic and Atmospheric Administration. "Ocean surface wind measurements are used in numerical weather prediction models and help forecasters more accurately determine the path and intensity of tropical storms and hurricanes." Because this instrument provided measurements that will be needed over the long term, NASA was already developing a second scatterometer instrument to continue this vital data set. That instrument, called "SeaWinds," will be delivered to NASDA for integration on the spacecraft next April and is scheduled for launch in 1999 on ADEOS II. The launch of a Total Ozone Mapping Spectrometer sensor aboard ADEOS was helping to extend the unique data set of global total column ozone measurements begun by a similar instrument carried aboard NASA's Nimbus-7 satellite in 1978 and extended until December 1994 with the Meteor-3 TOMS.
"The ADEOS spectrometer, along with the TOMS Earth Probe (EP) instruments also observed the unusual loss of Arctic polar ozone reported earlier this year," said Dr. Arlin J. Krueger, PrincipalInvestigator and Instrument Scientist for TOMS/ADEOS at NASA'sGoddard Space Flight Center, Greenbelt, MD. Although it also provided ozone coverage, NASA's Total Ozone Mapping Spectrometer/Earth Probe instrument had also been providing high ground resolution research data to complement the global data of the spectrometer on ADEOS. As a result, its orbit is different than TOMS/ADEOS. The EP satellite has adequate fuel to raise its present 500 km orbit to an orbit near the 800 km ADEOS orbit, where contiguous Earth coverage is possible for monitoring of ozone and volcanic eruption clouds. NASA is considering raising TOMS/EP to a higher orbit. With this adjustment, much more complete global coverage of total ozone measurements previously provided by TOMS/ADEOS could be received. However, some of the unique smaller-scale aerosols and ozone research being done by TOMS/EP would be lost. The next Total Ozone Mapping Spectrometer mission is planned for launch on a Russian Meteor-3M spacecraft in 2000. The loss of the ADEOS platform has a particularly serious impact on oceanographic research since two instruments, the Ocean Color and Temperature Sensor and the Polarization and Directionality of the Earth's Reflectance, both capable of providing routine global estimates of phytoplankton pigment concentrations, were lost. These instruments were providing the first routine global observations of ocean color and were initiating the much-needed, long-term time series of such measurements for global change studies. Future routine global ocean-color information will be provided by SeaWIFS, a commercial mission from which NASA will purchase data, currently scheduled for launch July 18. The NASA Scatterometer and Total Ozone Mapping Spectrometer/ADEOS were developed under NASA's strategic enterprise called Mission to Planet Earth, a comprehensive research effort to study Earth's land, oceans, atmosphere, ice and life as an interrelated system. NASA is cooperating with NASDA to identify the cause of the ADEOS failure and recommend a solution for future missions. -end-
Ozone Holes
1997
May 3, 2021
https://www.sciencedaily.com/releases/2021/05/210503172900.htm
Plastic pollution in the deep sea: A geological perspective
A new focus article in the May issue of Geology examines plastic pollution in the deep sea from a geological perspective.
The authors cite multiple studies, including one in the May issue by Guangfa Zhong and Xiaotong Peng, discussed in a previous GSA story (26 Jan. 2021). Zhong and Peng were surprised to find plastic waste in a deep-sea submarine canyon located in the northwestern South China Sea. "Plastic is generally considered to be the dominant component of marine litter, due to its durability and the large volume produced," write Kane and Fildani. "Nano- and microplastics are a particularly insidious form of anthropogenic pollutant: tiny fragments and fibers may be invisible to the naked eye, but they are ingested with the food and water we consume and absorbed into the flesh of organisms." One of their vital questions is, "If some plastics can survive for >1000 years in terrestrial environments, how long do they last in ocean trenches that are kilometers deep, dark, cold, and at high pressure? How long does it take plastic to break down into microplastics and nanoplastics in the deep sea?" "While it is incumbent on policy makers to take action now to protect the oceans from further harm, we recognize the roles that geoscientists can play," write Kane and Fildani. That includes using their deep-time perspective to address societal challenges, their understanding of the present-day distribution of plastics on the seafloor and in the sedimentary record, and geoscience techniques to record the downstream effects of mitigation efforts and to predict the future of seafloor plastics. In summary, they write, "We understand ... the transient nature of the stratigraphic record and its surprising preservation, and the unique geochemical environments found in deep-sea sediments. Our source-to-sink approach to elucidate land-to-sea linkages can identify the sources and pathways that plastics take while traversing natural habitats and identify the context in which they are ultimately sequestered, and the ecosystems they affect. This will happen by working closely with oceanographers, biologists, chemists, and others tackling the global pollution problem."
Pollution
2021
April 28, 2021
https://www.sciencedaily.com/releases/2021/04/210428133020.htm
Green spaces linked to lower racial disparity in COVID-19 infection rates, study finds
A higher ratio of green spaces at the county level is associated with a lower racial disparity in coronavirus infection rates, according to a new study. It is the first study to report a significant relationship between the supply of green spaces and reduced disparity in infectious disease rates.
The research team included William Sullivan, a landscape architecture professor at the University of Illinois Urbana-Champaign, and was led by Bin Jiang, a landscape architecture professor at The University of Hong Kong who received his Ph.D. at Illinois, and Yi Lu, an architecture professor at City University of Hong Kong. They reported their findings in a peer-reviewed journal. Previous studies by Sullivan, Jiang and Lu have shown that green spaces have positive effects on health. Access to green spaces is associated with improved cognitive performance, reduced mental fatigue and stress, reduced impulsiveness and aggressiveness, an increased sense of safety, a reduced crime rate, increased physical activity and increased social cohesion. Prior studies also provide strong evidence that green spaces may mitigate racial disparities in health outcomes. However, none have looked at the effect on disparities in infectious diseases. Most studies examining the racial disparity in coronavirus infections have focused on its association with socio-economic status or pre-existing chronic disease factors. For this study, the researchers identified 135 of the most urbanized counties in the U.S., with a total population of 132,350,027, representing 40.3% of the U.S. population. They collected infection data from county health departments from late January to July 10, 2020, and used it to calculate the infection rates for Black and white residents of the counties, while controlling for differences in income, pre-existing chronic diseases and urban density. The data showed that the average infection rate for Black residents was more than twice that of white residents -- 497 per 100,000 people for white individuals versus 988 per 100,000 for Black individuals. The researchers compared the infection rates of each population within each county, rather than across all the counties studied. The county-level comparison is critical because it can minimize the bias caused by differences in socioeconomic, transportation, climate and policy conditions among counties, they said. Sullivan, Jiang and Lu said several factors could account for the findings. They proposed that a greater proportion of green space in a county makes it more likely that Black and white individuals have more equal access to the green spaces and the accompanying health benefits. "In many, many counties, Black folks have less access to green space than white folks do. In counties with more green space, that disparity may be less, and it may help account for some of the positive benefits we're seeing," Sullivan said. The coronavirus is spread through aerosol particles, and the spread is heightened in indoor settings without adequate ventilation. Having access to green spaces attracts people outdoors, where air movement and the ease of social distancing can reduce the spread of the virus. More access to green spaces is likely to promote physical activity, which may enhance the immune system. Green spaces enhance mental health and reduce stress, which also promotes immune system health. They strengthen social ties, which are an important predictor of health and well-being, the researchers said.
Green spaces also may decrease infection risk by improving air quality and decreasing exposure to air pollutants in dense urban areas. "We did not measure these things, but we know from previous research that all these things are tied to green spaces and have implications for health and well-being," Sullivan said. Jiang described green space as preventive medicine, encouraging outdoor physical activity and social ties with neighbors that will boost the immune system and promote the social trust and cooperation that reduce the risk of infections. While the study looked at infection rates in the U.S., "we also think the racial disparity issue is not just an American issue. It's an international issue," Jiang said. The research shows the importance for local and regional governments of investing in the development of green spaces, Sullivan said. "One of the things the pandemic has helped us understand is that the built environment has real implications for the spread of disease and for our health. The design of landscape in cities, in neighborhoods, in communities also has really important ways it can contribute to or detract from health and well-being," he said. "There is a lot of competition for investment of public dollars. Lots of times, investments in parks and green spaces are prioritized lower. People think it makes a place look pretty and it's a place to go for walks. What we're finding is these kinds of investments have implications for health and well-being."
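The study's core comparison -- within-county infection rates by race, summarized as a rate ratio -- can be sketched in a few lines. The county names and counts below are fabricated for illustration; the published analysis adds controls for income, chronic disease and density.

```python
# Within-county infection rates by race and the resulting rate ratio.
# County names and counts are fabricated for illustration.
counties = [
    # (name, black_cases, black_pop, white_cases, white_pop)
    ("County A", 1200, 120_000, 900, 180_000),
    ("County B", 450, 60_000, 500, 140_000),
]

for name, b_cases, b_pop, w_cases, w_pop in counties:
    b_rate = 1e5 * b_cases / b_pop   # cases per 100,000 residents
    w_rate = 1e5 * w_cases / w_pop
    print(f"{name}: Black {b_rate:6.0f}, white {w_rate:6.0f} per 100k, "
          f"rate ratio {b_rate / w_rate:.2f}")
# The published analysis then relates this county-level disparity to the
# county's share of green space, rather than comparing across counties.
```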
Pollution
2021
April 27, 2021
https://www.sciencedaily.com/releases/2021/04/210427122434.htm
Study links hydraulic fracking with increased risk of heart attack hospitalization, death
The Marcellus Formation straddles the New York State-Pennsylvania border, a region that shares similar geography and population demographics. However, on one side of the state line unconventional natural gas development -- or fracking -- is banned, while on the other side it represents a multi-billion-dollar industry. New research takes advantage of this 'natural experiment' to examine the health impacts of fracking, finding that people who live in areas with a high concentration of wells are at higher risk for heart attacks.
"Fracking is associated with increased acute myocardial infarction hospitalization rates among middle-aged men, older men and older women as well as with increased heart attack-related mortality among middle-aged men," said Elaine Hill, Ph.D., an associate professor in the University of Rochester Medical Center (URMC) Department of Public Health Sciences, and senior author of the study that appears in the journal Natural gas extraction, including hydraulic fracking, is a well-known contributor to air pollution. Fracking wells operate around the clock and the process of drilling, gas extraction, and flaring -- the burning off of natural gas byproducts -- release organic compounds, nitrogen oxide, and other chemicals and particulates into the air. Additionally, each well requires the constant transportation of equipment, water, and chemicals, as well as the removal of waste water from the fracking process, further contributing to air pollution levels. Fracking wells remain in operation for several years, prolonging exposure to people who work at the wells sites and those who live nearby.Instead of the typical single source of industrial air pollution, such as a factory or power plant, fracking entails multiple well sites spread across a large, and often rural, geographic area. In 2014, there were more than 8,000 fracking well sites in Pennsylvania. Some areas of the state have a dense population of fracking wells -- three Pennsylvania counties have more than 1,000 sites. Contrast that with New York State, which has essentially banned the process of hydraulic fracking since 2010.Exposure to air pollution is recognized as a significant risk factor for cardiovascular disease. Other research has shown that the intensity of oil and gas development and production is positively associated with diminished vascular function, blood pressure, and inflammatory markers associated with stress and short-term air pollution exposure. Light and noise pollution from the continuous operation of the wells are also associated with increasing stress, which is another contributor to cardiovascular disease.The research team decided to measure the impact of fracking on cardiovascular health by studying heart attack hospitalization and death rates in 47 counties on either side of the New York and Pennsylvania state line. Using data from 2005 to 2014, they observed that heart attack rates were 1.4 to 2.8 percent higher in Pennsylvania, depending upon the age group and level of fracking activity in a given county.The associations between fracking and heart attack hospitalization and death were most consistent among men aged 45-54, a group most likely to be in the unconventional gas industry workforce and probably the most exposed to fracking-related air pollutants and stressors. Heart attack deaths also increase in this age group by 5.4 percent or more in counties with high concentrations of well sites. Hospitalization and mortality rates also jumped significantly in women over the age of 65.Fracking is more concentrated in rural communities, which the authors speculate may further compromise cardiovascular heath due to the trend of rural hospital closures. People who suffer from cardiovascular disease in these areas may be at increased risk of adverse health outcomes, including death, due to less access to care. 
The authors suggest that more should be done to raise awareness about fracking-related risks for cardiovascular disease, and that physicians should keep a closer eye on high-risk patients who reside in areas with fracking activity. They also contend that the study should inform policymakers about the trade-offs between public health and the economic activity generated by the industry. "These findings contribute to the growing body of evidence on the adverse health impact of fracking," said Alina Denham, a Ph.D. candidate in Health Policy at the University of Rochester School of Medicine and Dentistry and first author of the study. "Several states, including New York, have taken the precaution of prohibiting hydraulic fracturing until more is known about the health and environmental consequences. If the causal mechanisms behind our findings are ascertained, they would suggest that bans on hydraulic fracturing can be protective for human health." The study was funded with support from the National Institutes of Health Office of the Director.
Pollution
2021
April 27, 2021
https://www.sciencedaily.com/releases/2021/04/210427122417.htm
Cultivated seaweed can soak up excess nutrients plaguing human health and marine life
It's easy to think that more nutrients -- the stuff life needs to grow and thrive -- would foster more vibrant ecosystems. Yet nutrient pollution has in fact wrought havoc on marine systems, contributing to harmful algae blooms, worse water quality and oxygen-poor dead zones.
A team of researchers from UC Santa Barbara has proposed a novel strategy for reducing large amounts of nutrients -- specifically nitrogen and phosphorus -- after they have already been released into the environment: cultivating seaweed. "A key goal of conservation ecology is to understand and maintain the natural balance of ecosystems, because human activity tends to tip things out of balance," said co-author Darcy Bradley, co-director of the Ocean and Fisheries Program at the university's Environmental Markets Lab. Activities on land, like industrial-scale farming, send lots of nutrients into waterways, where they accumulate and flow into the ocean in greater quantities than they naturally would. Opportunistic algae and microbes take advantage of the glut of nutrients, which fuel massive blooms. This growth can have all kinds of consequences, from producing biotoxins to smothering habitats in virtual monocultures. And while these algae produce oxygen when they're alive, they die so suddenly and in such volume that their rapid decomposition consumes all the available oxygen in the water, transforming huge swaths of the ocean into so-called "dead zones." Cultivated seaweed could draw down available nutrients, the authors claim, limiting the resources for unchecked growth of nuisance algae and microbes. Seaweeds also produce oxygen, which could alleviate the development of hypoxic dead zones. The authors analyzed data from the U.S. Gulf of Mexico, which they say exemplifies the challenges associated with nutrient pollution. More than 800 watersheds across 32 states deliver nutrients to the Gulf, which has led to a growing low-oxygen dead zone. In 2019, this dead zone stretched just over 18,000 square kilometers, slightly smaller than the area of New Jersey. (Image caption: Cortez grunt fish swim beneath a "red tide" algae bloom near the Bat Islands in Costa Rica's Santa Rosa National Park. Blooms like these can release biotoxins and create oxygen-poor dead zones in the ocean.) Using open-source oceanographic and human-use data, the team identified areas of the gulf suitable for seaweed cultivation. They found that roughly 9% of the United States' exclusive economic zone in the gulf could support seaweed aquaculture, particularly off the west coast of Florida. "Cultivating seaweed in less than 1% of the U.S. Gulf of Mexico could potentially reach the country's pollution reduction goals that, for decades, have been difficult to achieve," said lead author Phoebe Racine, a Ph.D. candidate at UCSB's Bren School of Environmental Science & Management. "Dealing with nutrient pollution is difficult and expensive," Bradley added. The U.S. alone spends more than $27 billion every year on wastewater treatment. Many regions employ water quality trading programs to manage this issue. In these cap-and-trade systems, regulators set a limit on the amount of a pollutant that can be released, and then entities trade credits in a market. Water quality trading programs exist all over the U.S., though they are often small, bespoke and can be ephemeral. That said, they show a lot of promise and, according to Racine, have bipartisan support. Seaweed aquaculture would fit nicely within these initiatives.
"Depending on farming costs and efficiency, seaweed aquaculture could be financed by water quality trading markets for anywhere between $2 and $70 per kilogram of nitrogen removed," Racine said, "which is within range of observed credit prices in existing markets."What's more, the researchers note that demand is rising for seaweed in food and industry sectors. Potential products include biofuel, fertilizer and food, depending on the water quality, Racine said. This means that, unlike many remediation strategies, seaweed aquaculture could pay for itself or even generate revenue.And the time seems ripe for the authors' proposal. "The U.S. has traditionally had a lot of barriers to getting aquaculture in the ocean," Bradley explained. "But there is mounting political support in the form of drafted bills and a signed executive order that could catalyze the expansion of the U.S. aquaculture industry."This study is the first of several to come out of the Seaweed Working Group, an interdisciplinary group of researchers looking to understand and chart the potential of seaweed aquaculture's benefits to society. They are currently investigating a range of other ecosystem services that seaweed cultivation could provide, such as benefits to surrounding fisheries and carbon capture. The researchers are also working on a paper that explores nitrogen and phosphorous removal at the national level with fine-scale scale analysis modeling nutrient removal from native seaweeds off the coast of Florida.As long as humans continue adding nutrients to the environment, nature will find ways to use them. By deliberately cultivating seaweeds, we can grow algae that we know are benign, helpful, or even potentially useful, rather than the opportunistic algae that currently draw upon these excess nutrients.
Pollution
2021
April 23, 2021
https://www.sciencedaily.com/releases/2021/04/210423095412.htm
US asbestos sites made risky by some remediation strategies
The Environmental Protection Agency (EPA) largely remedies Superfund sites containing asbestos by capping them with soil to lock the buried toxin in place. But new research suggests that this may actually increase the likelihood of human exposure to the cancer-causing mineral.
"People have this idea that asbestos is all covered up and taken care of," said Jane Willenbring, who is an associate professor of geological sciences at Stanford University's School of Earth, Energy & Environmental Sciences (Stanford Earth). "But this is still a lingering legacy pollutant and might be dribbling out pollution, little by little."Willenbring has published several studies about asbestos behavior and, most recently, turned her attention to the lack of information about how asbestos may move through the soils where it is stored. Through lab experiments with asbestos fibers, which were detailed in a paper published Jan. 27 in the They found that dissolved organic matter changes the electric charge on asbestos particles and makes them less sticky, thereby enabling them to move faster through soil. The work disproves the prevailing theory that asbestos fibers cannot easily move through soil -- an assumption that has been made in part because of the mineral's hair-like shape."It's surprising that even though these little fibers are so long, because their shortest diameter is small enough, they can wind their way through these soil pores," said Willenbring, who is senior author on the study.Inhalation of asbestos increases the risk of developing lung disease and lung cancer, and exposure could occur through irrigation, taking showers, using humidifiers or other unfiltered sources that disperse water into the air.Asbestos is a naturally occurring mineral that mainly forms in the subsurface, at the boundary of Earth's oceanic and continental crusts. For much of the 20th century, it was revered as a miracle building material for its high heat capacity and insulation properties, and mining and production boomed worldwide. Following widespread evidence of its link to cancer, including a rare and aggressive form called mesothelioma, production of asbestos in the U.S. declined dramatically starting in the 1970s.In addition to thinking that the shape of the fibers would inhibit transport, the scientific community has been influenced by a 1977 EPA report that minimized the threat of asbestos moving through soil. Since then, new findings about the role of colloids -- microscopic particles that remain dispersed within solutions rather than settling to the bottom -- have led researchers to challenge the assumption that asbestos stays fixed in soil."Now we can show that exactly the thing that they do, which is add manure or other organic sludge to the asbestos piles that creates the production of dissolved organic matter, is exactly what causes the liberation of asbestos," Willenbring said. "It's actually facilitating the transport of asbestos fibers."In some ways, the team's breakthrough about asbestos is not surprising because it aligns so closely with recent findings about the transport of colloids in soil, Willenbring said. But she was stunned by the scale of the problem: Millions of people in the U.S. are living near thousands of sites contaminated with asbestos.At least 16 Superfund sites contain asbestos and areas where the mineral naturally occurs can also pose a risk.As part of the lab experiments, Willenbring and her team sampled soil from the BoRit Superfund Site in Ambler, Pennsylvania before it was capped in 2008. 
The waste dump is located next to a reservoir, as well as a stream that feeds water to the city of Philadelphia. However, there is a silver lining to the team's discovery. "Not all types of dissolved organic matter have the same effect on asbestos mobility," said lead study author Sanjay Mohanty, an assistant professor of civil and environmental engineering at UCLA who collaborated with Willenbring on the experiments. "Thus, by identifying the types that have the worst effect, the remediation design could exclude those organic amendments." As part of the remediation strategy, some sites include vegetation planted on top of the soil to prevent erosion. Willenbring's ongoing research involves figuring out how fungal-vegetation associations may be able to extract iron and make the asbestos fibers less toxic to people. "It's not just inflammation in the lungs that's a problem -- there's a process by which iron contained in the asbestos fiber is actually responsible for causing DNA damage, which can lead to cancer or mesothelioma," Willenbring said.
Pollution
2021
April 22, 2021
https://www.sciencedaily.com/releases/2021/04/210422093849.htm
Faster air exchange in buildings not always beneficial for coronavirus levels
Vigorous and rapid air exchanges might not always be a good thing when it comes to addressing levels of coronavirus particles in a multiroom building, according to a new modeling study.
The study suggests that, in a multiroom building, rapid air exchanges can spread the virus quickly from the source room into other rooms at high concentrations. Particle levels spike in adjacent rooms within 30 minutes and can remain elevated for up to approximately 90 minutes. The findings were published online in final form on April 15. "Most studies have looked at particle levels in just one room, and for a one-room building, increased ventilation is always useful to reducing their concentration," said Leonard Pease, lead author of the study. "But for a building with more than one room, air exchanges can pose a risk in the adjacent rooms by elevating virus concentrations more quickly than would otherwise occur. To understand what's happening, consider how secondhand smoke is distributed throughout a building. Near the source, air exchange reduces the smoke near the person but can distribute the smoke at lower levels into nearby rooms," Pease added. "The risk is not zero, for any respiratory disease." The team modeled the spread of particles similar to SARS-CoV-2, the virus that causes COVID-19, via air-handling systems. Scientists modeled what happens after a person has a five-minute coughing bout in one room of a three-room small office building, running simulations with particles of five microns. Researchers looked at the effects of three factors: different levels of filtration, different rates of outdoor-air incorporation into the building air supply, and different rates of ventilation, or air changes per hour. For downstream rooms, they found an expected clear benefit from increasing outdoor air and improving filtering, but the effect of increased ventilation rate was less obvious. Scientists studied the effects of adding varying amounts of outdoor air to the building air supply, from no outside air to 33 percent of the building's air supply per hour. As expected, the incorporation of more clean outdoor air reduced transmission risk in the connected rooms. Replacing one-third of a building's air per hour with clean outdoor air reduced infection risk in downstream rooms by about 20 percent compared to the lower levels of outdoor air commonly included in buildings. The team noted that the model assumed the outdoor air was clean and virus-free. "More outside air is clearly a good thing for transmission risk, as long as the air is free of virus," said Pease. The second factor studied -- strong filtration -- was also very effective at reducing transmission of the coronavirus. The team studied the effects of three levels of filtration: MERV-8, MERV-11, and MERV-13, where MERV stands for minimum efficiency reporting value, a common measure of filtration. A higher number translates to a stronger filter. Filtration decreased the odds of infection in the connected rooms markedly: a MERV-8 filter decreased the peak level of viral particles in connected rooms to just 20 percent of what it was without filtration, and a MERV-13 filter knocked down the peak concentration of viral particles in a connected room by 93 percent, to less than one-tenth of what it was with a MERV-8 filter. The researchers note that the stronger filters have become more common since the pandemic began. The most surprising finding of the study involved ventilation -- the effect of what researchers call air changes per hour. What's good for the source room -- cutting transmission risk within the room by 75 percent -- is not so good for connected rooms.
The team found that a rapid rate of air exchange, 12 air changes per hour, can cause a spike in viral particle levels within minutes in connected rooms. For a few minutes, this increases the risk of infection in those rooms to more than 10 times what it was at lower air-exchange rates. The higher transmission risk in connected rooms remains for about 20 minutes. "For the source room, clearly more ventilation is a good thing. But that air goes somewhere," said Pease. "Maybe more ventilation is not always the solution." "There are many factors to consider, and the risk calculation is different for each case," said Pease. "How many people are in the building and where are they located? How large is the building? How many rooms? There is not a great deal of data at this point on how viral particles move about in multiroom buildings. These numbers are very specific to this model -- this particular type of model, the amount of viral particles being shed by a person. Every building is different, and more research needs to be done," Pease added. Co-author Timothy Salsbury, a buildings control expert, notes that many of the trade-offs can be quantified and weighted depending on circumstances. "Stronger filtration translates to higher energy costs, as does the introduction of more outside air than would usually be used in normal operations. Under many circumstances, the energy penalty for the increased fan power required for strong filtration is less than the energy penalty for heating or cooling additional outside air," said Salsbury. "There are many factors to balance -- filtration level, outdoor air levels, air exchange -- to minimize transmission risk. Building managers certainly have their work cut out for them," he added. The team is already conducting a series of experimental studies along the same lines as the modeling study. Like the newly published study, the additional analyses look at the effects of filtration, outdoor-air incorporation and air changes. These ongoing studies involve real particles made of mucus (not incorporating the actual SARS-CoV-2 virus) and consider differences among particles expelled from various parts of the respiratory tract, such as the oral cavity, the larynx, and the lungs. Investigators deploy an aerosolizing machine that disperses the virus-like particles much as they'd be dispersed by a cough, as well as fluorescent tracking technology to monitor where they go. Other factors include varying particle sizes, how long viral particles are likely to be infectious, and what happens when they drop and decay. In addition to Pease and Salsbury, authors of the published study include Nora Wang, Ronald Underhill, Julia Flaherty, Alex Vlachokostas, Gourihar Kulkarni and Daniel James. The research, the latest in a series of PNNL findings about COVID-19, brings together PNNL's strengths in building technologies and in aerosol science. The work was funded through the National Virtual Biotechnology Laboratory, a consortium of all 17 DOE national laboratories focused on response to COVID-19, with funding provided by the Coronavirus Aid, Relief, and Economic Security, or CARES, Act.
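The multiroom result can be reproduced qualitatively with a toy well-mixed mass balance. The sketch below is not the PNNL model: the filter efficiency, outdoor-air fraction and the assumption that return air is drawn equally from both rooms are simplifications chosen only to show why a higher air-change rate pushes the connected room's concentration up within minutes.

```python
# Toy two-room, well-mixed mass balance with a shared recirculating air
# handler. All parameter values are illustrative assumptions, not values
# from the PNNL study.
import numpy as np

def simulate(ach, filter_eff=0.2, outdoor_frac=0.1, minutes=120):
    """Concentrations after a unit particle pulse in the source room."""
    dt = 1.0 / 60.0                       # one-minute steps, in hours
    c = np.array([1.0, 0.0])              # [source room, connected room]
    hist = []
    for _ in range(minutes):
        # Supply air mixes clean outdoor air with filtered return air
        # drawn equally from both rooms.
        supply = (1 - outdoor_frac) * (1 - filter_eff) * c.mean()
        c = c + dt * ach * (supply - c)
        hist.append(c.copy())
    return np.array(hist)                 # shape (minutes, 2)

for ach in (2, 6, 12):                    # air changes per hour
    hist = simulate(ach)
    print(f"ACH={ach:2d}: connected room at 10 min = {hist[9, 1]:.3f}, "
          f"peak = {hist[:, 1].max():.3f} at {hist[:, 1].argmax() + 1} min")
```

Running this shows the connected room's 10-minute concentration rising sharply with the air-change rate even though the eventual peak is similar, echoing the study's point that fast exchange moves the exposure into adjacent rooms within minutes rather than eliminating it.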
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421200130.htm
California's worst wildfires are helping improve air quality prediction
UC Riverside engineers are developing methods to estimate the impact of California's destructive wildfires on air quality in neighborhoods affected by the smoke from these fires. Their research, funded by NASA and published in Atmospheric Pollution Research, fills in the gaps in current methods by providing air quality information at the neighborhood scales required by public health officials to make health assessments and evacuation recommendations.
Measurements of air quality depend largely on ground-based sensors that are typically spaced many miles apart. Determining how healthy it is to breathe the air is straightforward in the vicinity of the sensors but becomes unreliable in the areas between them. Akula Venkatram, a professor of mechanical engineering in UC Riverside's Marlan and Rosemary Bourns College of Engineering, directed a group that developed a method to interpret fine particulate matter concentrations observed by ground-based sensors during the 2017 fire complex that included the Atlas, Nuns, Tubbs, Pocket, and Redwood Valley fires, and the 2018 Camp Fire. Their method fills in the gaps in air quality information obtained from ground-level monitors and satellite images using a mathematical model that simulates the transport of smoke from the fires. This approach provides estimates of particulate emissions from wildfires -- the most uncertain input in other methods of interpreting the same data. These emissions, combined with the physics embodied in the smoke transport model, allowed the group to estimate the variation of particulate concentrations over distances as small as one kilometer. "We need better ways to measure air quality so we can let people know when and where it's safe to go out and exercise, or go stay somewhere else, for example," Venkatram said. "In addition to filling in the gaps in the data from monitoring stations and satellite images, our method can also be used to predict the next day's air quality by estimating wildfire emissions for tomorrow based on today's observations." While any smoke can make air unpleasant to breathe, it is the tiniest particles, called PM2.5, that can penetrate lung tissue and cause the most health problems. The UC Riverside model is specifically designed to predict PM2.5 concentrations in areas with insufficient coverage by air quality monitoring stations. The authors hope their work will help efforts to protect public health during California's inevitable annual wildfires.
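The emission-inversion idea can be illustrated in its simplest possible form: back an emission rate out of one monitor's reading with a textbook Gaussian plume, then predict PM2.5 at an unmonitored point. The wind speed, dispersion coefficients and monitor reading below are invented for illustration; the UC Riverside model's transport physics is far more detailed than this sketch.

```python
# Sketch of emission inversion with a toy Gaussian plume. Illustrative
# only -- not the UCR smoke transport model or its parameter values.
import numpy as np

def gaussian_plume(q, x, y, u=3.0, sy=0.08, sz=0.06):
    """Ground-level concentration (ug/m3) at downwind distance x and
    crosswind offset y (metres), for emission rate q (ug/s) and wind
    speed u (m/s); sy and sz are assumed dispersion growth coefficients."""
    sigma_y, sigma_z = sy * x, sz * x
    return (q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2 * sigma_y**2)))

# Invert: a monitor 10 km downwind on the plume axis reads 80 ug/m3.
c_obs, x_mon = 80.0, 10_000.0
q_est = c_obs / gaussian_plume(1.0, x_mon, 0.0)   # concentration is linear in q

# Predict PM2.5 at an unmonitored neighborhood 15 km downwind, 1 km off-axis.
print(f"estimated emission rate: {q_est:.3g} ug/s")
print(f"predicted PM2.5 at (15 km, 1 km): "
      f"{gaussian_plume(q_est, 15_000.0, 1_000.0):.1f} ug/m3")
```

Because concentration scales linearly with the emission rate, one observed value pins down the source term, after which the same plume function fills in concentrations anywhere between the sensors.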
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421151224.htm
Air pollution data in five Chinese cities: Local vs. U.S. monitoring stations
A new analysis of air pollution data from five large Chinese cities has found statistically significant differences between data from monitoring stations run by local governments and data from stations run by U.S. embassies and consulates. Jesse Turiel of the Harvard University John F. Kennedy School of Government and Robert Kaufmann of Boston University present these findings in an open-access journal.
China has experienced poor air quality for several decades, and air pollution has been linked to significant increases in mortality and significant reductions in GDP for the country. In response, the Chinese central government has set targets for local environmental performance. Air quality data are collected at local monitoring stations, and local officials report them to the central government. Meanwhile, in some Chinese cities, U.S. embassies and consulates run their own monitoring stations. For the new study, the researchers analyzed and compared measurements reported by local- and U.S. embassy-controlled monitoring stations in five large Chinese cities. These data, which covered a period from January 2015 to June 2017, consisted of hourly measurements of the air concentration of fine particles known as PM2.5 -- a standard indicator of air quality. The researchers identified hours in which data from local stations temporarily diverged from data from U.S. stations in a statistically significant manner. They found that these divergences occurred more often, and were greater, than one would expect by random chance. Hourly divergences were also more likely when air quality was particularly poor. Together, the results suggest that, when air pollution is high, local stations systematically report lower PM2.5 levels than U.S. stations do. The authors note that these findings add to existing concerns about underreporting of air pollution by some local officials in China. In fact, they say, the general public and other observers are often skeptical of local data because some officials might have an incentive to underreport in order to avoid professional repercussions. Still, the researchers emphasize the usefulness of local air pollution data in China and note that their study does not invalidate other findings that the country's air quality has improved in recent years. The authors add: "Our work finds evidence of systemic local government underreporting of air pollution levels in four out of five tested Chinese cities. This suggests that, between 2015 and 2017, some local governments in China misreported air quality data disclosed to the country's central environmental ministry, particularly on high-pollution days."
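The core statistical move -- flag hours whose local-minus-U.S. difference is too large to be measurement noise, then ask whether flags exceed their chance rate -- can be sketched on synthetic data. Everything below (the noise level, the threshold, the injected bias on high-pollution hours) is an assumption for illustration; the published analysis uses its own, more formal methodology.

```python
# Sketch of a divergence test on synthetic paired PM2.5 series.
# All distributions and thresholds here are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hours = 20_000
true_pm = rng.gamma(shape=2.0, scale=40.0, size=hours)   # synthetic "truth"
us = true_pm + rng.normal(0, 5, hours)                   # embassy monitor
local = true_pm + rng.normal(0, 5, hours)                # local monitor
polluted = true_pm > 150                                 # high-pollution hours
local[polluted] -= 0.15 * true_pm[polluted]              # injected underreporting

sigma_diff = 5 * np.sqrt(2)                 # s.d. of (local - us) with no bias
z = (local - us) / sigma_diff
flags = z < -1.96                           # local significantly below U.S.

p = stats.binomtest(int(flags.sum()), hours, 0.025,
                    alternative="greater").pvalue
print(f"flagged hours: {flags.sum()} vs ~{0.025 * hours:.0f} expected by chance; "
      f"p={p:.2e}")
print(f"flag rate on high-pollution hours: {flags[polluted].mean():.1%} "
      f"vs {flags[~polluted].mean():.1%} otherwise")
```

On this synthetic series the flag rate sits near its 2.5 percent chance level on clean hours but jumps on polluted hours, which is the qualitative pattern the study reports for the real data.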
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421124635.htm
Wildfire smoke linked to skin disease
Wildfire smoke can trigger a host of respiratory and cardiovascular symptoms, ranging from runny nose and cough to a potentially life-threatening heart attack or stroke. A new study suggests that the dangers posed by wildfire smoke may also extend to the largest organ in the human body, and our first line of defense against outside threat: the skin.
During the two weeks in November 2018 when wildfire smoke from the Camp Fire choked the San Francisco Bay Area, health clinics in San Francisco saw an uptick in the number of patients visiting with concerns of eczema, also known as atopic dermatitis, and general itch, compared to the same time of year in 2015 and 2016, the study found. The findings suggest that even short-term exposure to hazardous air quality from wildfire smoke can be damaging to skin health. The report, carried out by physician researchers at the University of California, San Francisco, in collaboration with researchers at the University of California, Berkeley, appears on April 21. "Existing research on air pollution and health outcomes has focused primarily on cardiac and respiratory health outcomes, and understandably so. But there is a gap in the research connecting air pollution and skin health," said study lead author Raj Fadadu, a student in the UC Berkeley-UCSF Joint Medical Program. "Skin is the largest organ of the human body, and it's in constant interaction with the external environment. So, it makes sense that changes in the external environment, such as increases or decreases in air pollution, could affect our skin health." Air pollution from wildfires, which consists of fine particulate matter (PM2.5), polycyclic aromatic hydrocarbons (PAHs), and gases, can impact both normal and eczema-prone skin in a variety of ways. These pollutants often contain chemical compounds that act like keys, allowing them to slip past the skin's outer barrier and penetrate into cells, where they can disrupt gene transcription, trigger oxidative stress or cause inflammation. Eczema, or atopic dermatitis, is a chronic condition that affects the skin's ability to serve as an effective barrier against environmental factors. Because the skin's barrier has been compromised, people with this condition are prone to flare-ups of red, itchy skin in response to irritants, and may be even more prone to harm from air pollution. "Skin is a very excellent physical barrier that separates us and protects us from the environment," said study senior author Dr. Maria Wei, a dermatologist and melanoma specialist at UCSF. "However, there are certain skin disorders, such as atopic dermatitis, in which the barrier is not fully functional. It's not normal even when you don't have a rash. So, it would make sense that when exposed to significant air pollution, people with this condition might see an effect on the skin." Earlier studies have found a link between atopic dermatitis and air pollution in cities with high background levels of air pollution from cars and industry. However, this is the first study to examine the impacts of a very short burst of extremely hazardous air from wildfires. Despite being located 175 miles away from the Camp Fire, San Francisco saw an approximately nine-fold increase over baseline PM2.5 levels during the time of the blaze. To conduct the study, the team examined data from more than 8,000 visits to dermatology clinics by both adults and children between October and the following February in 2015, 2016 and 2018.
They found that, during the Camp Fire, clinic visits for atopic dermatitis and general itch increased significantly in both adult and pediatric patients. "Fully 89 percent of the patients that had itch during the time of the Camp Fire did not have a known diagnosis of atopic dermatitis, suggesting that folks with normal skin also experienced irritation and/or absorption of toxins within a very short period of time," Wei said. While skin conditions like eczema and itch may not be as life-threatening as the respiratory and cardiovascular impacts of wildfire smoke, they can still severely impact people's lives, the researchers say. The study also documented increased rates of prescribed medications, such as steroids, during times of high air pollution, suggesting that patients can experience severe symptoms. Individuals can protect their skin during wildfire season by staying indoors, wearing clothing that covers the skin if they do go outside, and using emollients, which can strengthen the skin's barrier function. A new medication to treat eczema, called Tapinarof, is now in clinical trials and could also be a useful tool during times of bad air. "A lot of the conversations about the health implications of climate change and air pollution don't focus on skin health, but it's important to recognize that skin conditions do affect people's quality of life, their social interactions and how they feel psychologically," Fadadu said. "I hope that these health impacts can be more integrated into policies and discussions about the wide-ranging health effects of climate change and air pollution."
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421124552.htm
Microplastics affect global nutrient cycle and oxygen levels in the ocean
The effects of the steadily increasing amount of plastic in the ocean are complex and not yet fully understood. Scientists at GEOMAR Helmholtz Centre for Ocean Research Kiel have now shown for the first time that the uptake of microplastics by zooplankton can have significant effects on the marine ecosystem even at low concentrations.
Plastic debris in the ocean is a widely known problem for large marine mammals, fish and seabirds. These animals can mistake plastic objects, such as plastic bags, for similar-looking food items, such as jellyfish. Tiny zooplankton can also ingest very small plastic particles, either by mistaking them for food or incidentally, when the particles have combined with organic particles. The direct effects of such microplastic ingestion on zooplankton are poorly understood, and the broader ecosystem effects of zooplankton replacing some of their food with plastic are understood even less. Now, for the first time, a research team has used an Earth system model to simulate how zooplankton that ingest microplastics could affect the base of the ocean food web and nutrient cycling. "These findings are significant because there has long been scepticism in the scientific community that microplastic concentrations in the ocean are high enough to have any impact on nutrient cycling," says Dr Karin Kvale. "Our study shows that even at levels present in the ocean today, it may already be the case if zooplankton replace some of their natural food with microplastics. If zooplankton eat the microplastics and thus take up less food, this can have far-reaching ecological effects that can, for example, lead to increased algal blooms via a reduction in feeding pressure that affect the oxygen content of the oceans almost as much as climate change," Kvale continues. These findings point to a new potential driver of human-induced ocean change that has not been considered before. However, Kvale points out that the results are "very preliminary" because little is yet known about how the base of the food web interacts with microplastic pollution. Further work on this topic is needed, she says, but the study provides strong motivation to expand the capacity of Earth system models to include pollution effects as a new driver of ocean change.
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421124541.htm
Freshwater salt pollution threatens ecosystem health and human water security
Water touches virtually every aspect of human society, and all life on earth requires it. Yet, fresh, clean water is becoming increasingly scarce -- one in eight people on the planet lack access to clean water. Drivers of freshwater salt pollution such as de-icers on roads and parking lots, water softeners, and wastewater and industrial discharges further threaten freshwater ecosystem health and human water security.
"Inland freshwater salt pollution is rising nationwide and worldwide, and we investigated the potential conflict between managing freshwater salt pollution and the sustainable practice of increasing water supply through the addition of highly treated wastewater to surface waters and groundwaters," said Stanley Grant, professor of civil and environmental engineering in the Virginia Tech College of Engineering. "If we don't figure out how to reverse this trend of salt pollution soon, it may become one of our nation's top environmental challenges."Grant and his collaborators have recently published their findings in the journal In a recent modeling study, it was predicted that salt pollution will increase over 50 percent in more than half of U.S. streams by 2100. Freshwater salt pollution is associated with the decline of biodiversity, critical freshwater habitat, and lack of safe drinking water."We found there are numerous opportunities that exist to reduce the contribution of salt pollution in the highly treated wastewater discharged to the Occoquan Reservoir and freshwater pollution more generally," said Peter Vikesland, professor in the Department of Civil and Environmental Engineering and affiliated faculty member in the Global Change Center, housed within Fralin Life Sciences Institute at Virginia Tech. "These efforts will require deliberative engagement with a diverse community of watershed stakeholders and careful consideration of the local political, social and environmental context."From time-series data collected over 25 years, the researchers quantified the contributions of three salinity sources -- highly treated wastewater and outflows from two rapidly urbanizing watersheds in Northern Virginia -- to the rising concentration of sodium, a major ion associated with freshwater pollution.The Occoquan Reservoir, a regionally important drinking-water reservoir in the mid-Atlantic United States, is located approximately 19 miles southwest of Washington, D.C., in Northern Virginia, and is one of two primary sources of water supply for nearly 2 million people in Fairfax County, Virginia, and surrounding communities. On an annual basis, approximately 95% of the water flowing into the reservoir comes from its Occoquan River and Bull Run tributaries."This study exemplifies the power of combining historical data and new computational tools; it underscores the incredible value of long-term monitoring," said Grant who is the Co-Director of the Occoquan Watershed Monitoring Lab and an affiliated faculty member in the Center for Coastal Studies at Virginia Tech. "It is a testimony to the vision of Virginia Tech and the Occoquan Watershed Monitoring Lab and their collaboration with stakeholders in the watershed, including Fairfax Water and the Upper Occoquan Service Authority, over the past two decades."The researchers found that rising salt pollution in the reservoir is primarily from watershed runoff during wet weather and highly treated wastewater during dry weather.Across all timescales evaluated, sodium concentration in the treated wastewater is higher than in outflow from the two watersheds. 
Sodium in the treated wastewater originates from chemicals added during wastewater treatment, industrial and commercial discharges, human excretion, and down-drain disposal of drinking water and sodium-rich household products. "Our study is unique because it brings together engineers, ecologists, hydrologists, and social scientists to investigate and tackle one of the greatest threats to the world's water quality," said Sujay Kaushal, a co-author on the paper, professor of geology at the University of Maryland, and an international expert on freshwater salinization. The researchers envision at least four ways in which salt pollution can be reduced: limit watershed sources of sodium that enter the water supply (such as from deicer use), enforce more stringent pre-treatment requirements on industrial and commercial dischargers, switch to low-sodium water and wastewater treatment methods, and encourage households to adopt low-sodium products. Because drinking water supply and sewage collection systems are linked, sources that contribute salt to the former ultimately contribute salt to the latter as well. "Citizens can start today or tomorrow by thinking more critically about what they put down the drain and how that harms the environment, and in turn, their own drinking water supply," said Vikesland. This research aligns with the One Water vision used nationally and globally by multiple water resource sectors, and it catalyzes robust stakeholder-driven decision making under seemingly conflicting objectives. This research was part of a partnership between Virginia Tech, the University of Maryland, Vanderbilt University, and North Carolina State University. It was funded by a recent multimillion-dollar grant that Grant and his collaborators received from the National Science Foundation aimed at addressing freshwater salt pollution, and is part of the National Science Foundation's Growing Convergence Research (GCR) program, which aims to catalyze solutions to societal grand challenges by merging ideas, approaches, and technologies from widely diverse fields of knowledge to stimulate innovation and discovery. Experience gained and lessons learned from this research will be upscaled nationally and globally in partnership with The Water Research Foundation. "The collaborative effort by this highly interdisciplinary team exemplifies the type of paradigm-shifting science that we seek to catalyze and promote," said William Hopkins, professor in the College of Natural Resources and Environment, director of the Global Change Center, and associate executive director of the Fralin Life Sciences Institute. "Freshwater salt pollution has become a major focus for diverse researchers at Virginia Tech because the problem is so widespread, getting worse, and affects both the environment and society. Fortunately, the team's research advances our understanding of important sources of salt pollution so that evidence-based interventions can be identified and implemented.
The study has far-reaching implications globally as we try to solve this complex environmental problem." The study also reflects the exciting convergent approach the NSF-funded project is taking. "While the biophysical findings are front and center here, it acknowledges the complex socio-political contexts in which that information will be applied and foreshadows the collaborative, multi-stakeholder approaches to tackling the freshwater salt pollution problem that we are currently advancing," said Todd Schenk, assistant professor in the School of Public and International Affairs in the College of Architecture and Urban Studies and affiliated faculty member of the Global Change Center and Center for Coastal Studies.
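A flow-weighted mixing balance is the simplest version of the source-apportionment idea used here: each source's sodium load is its flow times its concentration. The flows and concentrations below are invented placeholders, not Occoquan monitoring values; they only illustrate how a small wastewater flow with a high sodium concentration can dominate the load.

```python
# Toy flow-weighted sodium mixing balance for a reservoir with three inputs.
# All flows and concentrations are illustrative assumptions.
sources = {
    # name: (flow in m3/s, sodium in mg/L) -- assumed values
    "Occoquan River":     (8.0, 15.0),
    "Bull Run":           (4.0, 25.0),
    "treated wastewater": (1.5, 90.0),
}

total_flow = sum(q for q, _ in sources.values())
total_load = sum(q * c for q, c in sources.values())    # flow x concentration
mixed_na = total_load / total_flow
print(f"flow-weighted sodium entering the reservoir: {mixed_na:.1f} mg/L")
for name, (q, c) in sources.items():
    print(f"  {name}: {q * c / total_load:.0%} of the sodium load")
```

With these made-up numbers the wastewater stream carries roughly a third of the sodium load from barely a tenth of the flow, which is the qualitative pattern the monitoring record shows during dry weather.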
Pollution
2021
April 21, 2021
https://www.sciencedaily.com/releases/2021/04/210421082855.htm
Complexity of microplastic pollution
Microplastics -- small plastic pieces less than 5 millimeters in length -- are ubiquitous in the environment, and they can have significant effects on wildlife. A new study finds that microplastics act on organisms as both physical and chemical stressors.
By demonstrating that microplastics are both physical and chemical stressors, the study supports the need for research that considers microplastics a multiple stressor rather than a single contaminant. Importantly, the test methods used in most current microplastic studies do not sufficiently investigate the chemical dimension of microplastic pollution. "The chemical cocktail that is associated with microplastics in the environment consists of additives from manufacturing and contaminants sorbed from the surrounding environment; however, this dimension is often missing from toxicity testing, where pristine microplastics purchased from a manufacturer are often used," said corresponding author Kennedy Bucci, a PhD candidate at the University of Toronto. "Our research shows that the chemical cocktail is an important driver of effects, and suggests that a new framework for risk assessment that captures the multi-dimensionality of microplastic pollution may be necessary."
Pollution
2021
April 20, 2021
https://www.sciencedaily.com/releases/2021/04/210420160903.htm
Deregulated US Government oversight on interstate waters leaves murky implications for states
The familiar murkiness of waters in the Gulf of Mexico can be off-putting for beachgoers visiting Galveston Island. Runoff from the Mississippi River makes its way to local beaches and causes downstream water to turn opaque and brown. Mud is one factor, and river runoff is another. However, concern tends to ratchet up a notch when pollution enters the river runoff discussion on a national scale, specifically when smaller, navigable intrastate bodies of water push pollution into larger interstate waters often involved in commerce (e.g., the Mississippi River, the Great Lakes, the Ohio River).
In a recently published research analysis, "A water rule that turns a blind eye to transboundary pollution," Flatt contributed as the sole legal researcher, explaining how the 2020 Navigable Waters Protection Rule, which retracted federal oversight of interstate waters, did so with the overt assumption that state governments would fill in the oversight gap. Not only did the evidence point toward an alternate outcome, but the rule's federalism rationale was incorrect, according to the researchers. "New administrations get to implement new policies, but those policies have to be consistent with statutes and the Constitution, and be logical," Flatt said. "The legal phrase is: 'they cannot be arbitrary and capricious.' An administration can only do what is allowed by the law and must be rational and logical. This fails that. This is a policy disagreement, but it is a policy disagreement that is out of the bounds of what is allowed by law." The cleanliness of larger transboundary rivers falls under the responsibility of the federal government, and under the 2015 Clean Water Rule (CWR) enacted during former President Barack Obama's administration, this responsibility included small wetlands and streams that could push pollution runoff to these larger rivers that bisect several states. In 2020, under the Navigable Waters Protection Rule (NWPR), the federal regulation of some of those smaller, linked bodies of water was withdrawn, leaving individual states with the responsibility to fill in the gaps. However, many states did not assert control over these waters as assumed by former President Donald Trump's administration, leaving room for pollution to make its way to interstate waters. "One prominent example is 31 states' challenge to the 2015 CWR in court, arguing that it would impose excessive costs. Inexplicably, the NWPR's economics analysis projected that 14 of these states would now change their position," according to the analysis. "The Army Corps and EPA said in their analysis that 31 states will move into the breach and help protect the wetlands that the federal government would no longer protect," Flatt said. "But best practices for economic analysis state that you cannot speculate about future state actions. When I looked at this, I found a lot of these states are even prohibited from enacting a rule more stringent than the federal government's. Here, the data is flawed." In March, President Joe Biden's administration proposed a $111 billion investment in water infrastructure. Flatt said that implementation of the investment will include review of previous policy and research, including information uncovered in the analysis. Research to inform this article was funded by the External Environmental Economics Advisory Committee, with funding and support from the UCLA Luskin Center for Innovation and the Alfred P. Sloan Foundation.
Pollution
2021
April 20, 2021
https://www.sciencedaily.com/releases/2021/04/210420092924.htm
Cool and COVID-safe: How radiant cooling could keep our cities comfortable and healthy
A novel system of chilled panels that can replace air conditioning can also help reduce the risk of indoor disease transmission, suggests new analysis from the University of British Columbia, University of Pennsylvania and Princeton University.
The researchers computed air conditioning requirements in 60 of the world's most populous cities -- with the additional ventilation required due to COVID-19. Then, they compared the energy costs with those of their cooling method, which uses chilled panels and natural ventilation. The results were published in a journal's special COVID-19 edition. Dr. Adam Rysanek, a professor in the school of architecture and landscape architecture at UBC and co-author of the paper, notes that many public health guidelines, as well as building industry bodies, recommend increasing the flow of fresh, outdoor air into buildings in order to reduce the risk of spreading COVID-19 and other diseases. "However, if we continue to rely on conventional HVAC systems to increase indoor fresh air rates, we may actually double energy consumption. That's the nature of conventional HVAC. Alternatively, we can encourage people to install new types of radiant cooling systems, which allow them to keep their windows open even when it's hot outside. These alternative systems can provide a sufficient level of thermal comfort and increase protection against disease while lessening the impact on the environment," noted Rysanek, director of the Building Decisions Research Group at UBC's faculty of applied science. Rysanek and his colleagues earlier demonstrated their cooling system in the hot and humid climate of Singapore. They built a public pavilion featuring a system of chilled tubes enclosed within a condensation-preventing membrane. This allowed occupants to feel comfortable, and even cold, without changing the air temperature surrounding the human body. "You can think of it as lean A/C -- or, even better, as a green alternative to energy-guzzling air conditioning," said Rysanek. Toronto is one of the cities included in the latest analysis, as are Beijing, Miami, Mumbai, New York and Paris. In all these regions, peak summer temperatures can soar past 35 degrees Celsius (95 degrees Fahrenheit). "A key impact of climate change is the accelerating rise in average and peak temperatures, particularly in urban areas. We are expecting the appetite for indoor cooling to ramp up in the years ahead. Yet, if we want to mitigate urban heat and ensure people are healthy and comfortable while reducing our energy use, we need to seriously consider revolutionising our historical approach to air-conditioning," adds Rysanek. Rysanek notes that, though chilled panel systems have been around for decades, adding the special membrane devised by the research team could be the key to making them a commercially viable alternative to traditional HVAC systems in all climates.
Pollution
2021
April 19, 2021
https://www.sciencedaily.com/releases/2021/04/210419182101.htm
Researchers use AI to empower environmental regulators
Monitoring environmental compliance is a particular challenge for governments in poor countries. A new machine learning approach that uses satellite imagery to pinpoint highly polluting brick kilns in Bangladesh could provide a low-cost solution. Like superheroes capable of seeing through obstacles, environmental regulators may soon wield the power of all-seeing eyes that can identify violators anywhere at any time, according to the new Stanford University-led study, published the week of April 19.
"Brick kilns have proliferated across Bangladesh to supply the growing economy with construction materials, which makes it really hard for regulators to keep up with new kilns that are constructed," said co-lead author Nina Brooks, a postdoctoral associate at the University of Minnesota's Institute for Social Research and Data Innovation who did the research while a PhD student at Stanford.While previous research has shown the potential to use machine learning and satellite observations for environmental regulation, most studies have focused on wealthy countries with dependable data on industrial locations and activities. To explore the feasibility in developing countries, the Stanford-led research focused on Bangladesh, where government regulators struggle to locate highly pollutive informal brick kilns, let alone enforce rules.Bricks are key to development across South Asia, especially in regions that lack other construction materials, and the kilns that make them employ millions of people. However, their highly inefficient coal burning presents major health and environmental risks. In Bangladesh, brick kilns are responsible for 17 percent of the country's total annual carbon dioxide emissions and -- in Dhaka, the country's most populous city -- up to half of the small particulate matter considered especially dangerous to human lungs. It's a significant contributor to the country's overall air pollution, which is estimated to reduce Bangladeshis' average life expectancy by almost two years."Air pollution kills seven million people every year," said study senior author Stephen Luby, a professor of infectious diseases at Stanford's School of Medicine. "We need to identify the sources of this pollution, and reduce these emissions."Bangladesh government regulators are attempting to manually map and verify the locations of brick kilns across the country, but the effort is incredibly time and labor intensive. It's also highly inefficient because of the rapid proliferation of kilns. The work is also likely to suffer from inaccuracy and bias, as government data in low-income countries often does, according to the researchers.Since 2016, Brooks, Luby and other Stanford researchers have worked in Bangladesh to pinpoint kiln locations, quantify brick kilns' adverse health effects and provide transparent public information to inform political change. They had developed an approach using infrared to pick out coal-burning kilns from remotely sensed data. While promising, the approach had serious flaws, such as the inability to distinguish between kilns and heat-trapping agricultural land.Working with Stanford computer scientists and engineers, as well as scientists at the International Centre for Diarrheal Disease Research, Bangladesh (icddr,b), the team shifted focus to machine learning.Building on past applications of deep-learning to environmental monitoring, and on specific efforts to use deep learning to identify brick kilns, they developed a highly accurate algorithm that not only identifies whether images contain kilns but also learns to localize kilns within the image. The method rebuilds kilns that have been fragmented across multiple images -- an inherent problem with satellite imagery -- and is able to identify when multiple kilns are contained within a single image. 
They are also able to distinguish between two kiln technologies -- one of which is banned -- based on shape classification. The approach revealed that more than three-fourths of kilns in Bangladesh are illegally constructed within 1 kilometer (six-tenths of a mile) of a school, and almost 10 percent are illegally close to health facilities. It also showed that the government systematically under-reports kilns with respect to regulations and -- according to the shape classification findings -- over-reports the percentage of kilns using a newer, cleaner technology relative to an older, banned approach. The researchers also found higher numbers of registered kilns in districts adjacent to districts where kilns are banned, suggesting that kilns are formally registered in the districts where they are legal but constructed across district borders. The researchers are working to overcome the approach's limitations by developing ways to use lower-resolution imagery, and to expand their work to other regions where bricks are produced similarly. Getting it right could make a big difference: in Bangladesh alone, almost everyone lives within 10 kilometers (6.2 miles) of a brick kiln, and more than 18 million people -- more than twice the population of New York City -- live within 1 kilometer (0.6 mile), according to the researchers' estimates. "We are hopeful our general approach can enable more effective regulation and policies to achieve better health and environmental outcomes in the future," said co-lead author Jihyeon Lee, a researcher in Stanford's Sustainability and Artificial Intelligence Lab.
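As a minimal illustration of the classification stage, the sketch below trains a tiny convolutional network to label satellite tiles as kiln or no-kiln on random stand-in data. The architecture, tile size and labels are placeholders, not the study's; the actual pipeline also localizes kilns within images, stitches fragmented kilns across tiles and classifies kiln technology by shape.

```python
# Toy CNN tile classifier for "kiln / no kiln" labels. Architecture and
# data are illustrative placeholders, not the published model.
import torch
import torch.nn as nn

class KilnNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),        # pool to one value per channel
        )
        self.head = nn.Linear(32, 2)        # logits: [no kiln, kiln]

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = KilnNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: 8 RGB tiles of 64x64 pixels with random labels.
x = torch.randn(8, 3, 64, 64)
y = torch.randint(0, 2, (8,))
for step in range(5):                       # a few illustrative steps
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    print(f"step {step}: loss={loss.item():.3f}")
```

In practice the interesting engineering is upstream of a network like this: assembling labeled tiles, handling kilns split across tile boundaries, and turning per-tile predictions into mapped kiln locations regulators can act on.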
Pollution
2021
April 16, 2021
https://www.sciencedaily.com/releases/2021/04/210416120107.htm
Sunlight to solve the world's clean water crisis
Researchers at UniSA have developed a cost-effective technique that could deliver safe drinking water to millions of vulnerable people using cheap, sustainable materials and sunlight.
Less than 3 per cent of the world's water is fresh, and due to the pressures of climate change, pollution, and shifting population patterns, in many areas this already scarce resource is becoming scarcer. Currently, 1.42 billion people -- including 450 million children -- live in areas of high, or extremely high, water vulnerability, and that figure is expected to grow in coming decades. Researchers at UniSA's Future Industries Institute have developed a promising new process that could eliminate water stress for millions of people, including those living in many of the planet's most vulnerable and disadvantaged communities. A team led by Associate Professor Haolan Xu has refined a technique to derive freshwater from seawater, brackish water, or contaminated water through highly efficient solar evaporation, delivering enough daily fresh drinking water for a family of four from just one square metre of source water. "In recent years, there has been a lot of attention on using solar evaporation to create fresh drinking water, but previous techniques have been too inefficient to be practically useful," Assoc Prof Xu says. "We have overcome those inefficiencies, and our technology can now deliver enough fresh water to support many practical needs at a fraction of the cost of existing technologies like reverse osmosis." At the heart of the system is a highly efficient photothermal structure that sits on the surface of a water source and converts sunlight to heat, focusing energy precisely on the surface to rapidly evaporate the uppermost portion of the liquid. While other researchers have explored similar technology, previous efforts have been hampered by energy loss, with heat passing into the source water and dissipating into the air above. "Previously, many of the experimental photothermal evaporators were basically two-dimensional; they were just a flat surface, and they could lose 10 to 20 per cent of solar energy to the bulk water and the surrounding environment," Assoc Prof Xu says. "We have developed a technique that not only prevents any loss of solar energy, but actually draws additional energy from the bulk water and surrounding environment, meaning the system operates at 100 per cent efficiency for the solar input and draws up to another 170 per cent energy from the water and environment." In contrast to the two-dimensional structures used by other researchers, Assoc Prof Xu and his team developed a three-dimensional, fin-shaped, heatsink-like evaporator. Their design shifts surplus heat away from the evaporator's top surfaces (i.e.
solar evaporation surface), distributing heat to the fin surface for water evaporation, thus cooling the top evaporation surface and realising zero energy loss during solar evaporation. This heatsink technique means all surfaces of the evaporator remain at a lower temperature than the surrounding water and air, so additional energy flows from the higher-energy external environment into the lower-energy evaporator. "We are the first researchers in the world to extract energy from the bulk water during solar evaporation and use it for evaporation, and this has helped our process become efficient enough to deliver between 10 and 20 litres of fresh water per square metre per day." In addition to its efficiency, the practicality of the system is enhanced by the fact that it is built entirely from simple, everyday materials that are low cost, sustainable and easily obtainable. "One of the main aims with our research was to deliver for practical applications, so the materials we used were just sourced from the hardware store or supermarket," Assoc Prof Xu says. "The only exception is the photothermal materials, but even there we are using a very simple and cost-effective process, and the real advances we have made are with the system design and energy nexus optimisation, not the materials." In addition to being easy to construct and easy to deploy, the system is also very easy to maintain, as the design of the photothermal structure prevents salt and other contaminants from building up on the evaporator surface. Together, the low cost and easy upkeep mean the system developed by Assoc Prof Xu and his team could be deployed in situations where other desalination and purification systems would be financially and operationally unviable. "For instance, in remote communities with small populations, the infrastructure cost of systems like reverse osmosis is simply too great to ever justify, but our technique could deliver a very low-cost alternative that would be easy to set up and basically free to run," Assoc Prof Xu says. "Also, because it is so simple and requires virtually no maintenance, there is no technical expertise needed to keep it running and upkeep costs are minimal. This technology really has the potential to provide a long-term clean water solution to people and communities who can't afford other options, and these are the places such solutions are most needed." In addition to drinking water applications, Assoc Prof Xu says his team is currently exploring a range of other uses for the technology, including treating wastewater in industrial operations. "There are a lot of potential ways to adapt the same technology, so we are really at the beginning of a very exciting journey," he says.
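The reported yield can be sanity-checked with a one-line energy balance: available energy per square metre divided by water's latent heat of vaporization. The daily insolation value below is an assumption; the 100 per cent and 170 per cent figures are taken from the article.

```python
# Back-of-envelope check of the reported yield: daily energy per square
# metre divided by water's latent heat of vaporization.
DAILY_INSOLATION_MJ = 18.0      # assumed ~5 kWh/m2/day of solar input
LATENT_HEAT_MJ_PER_KG = 2.45    # latent heat of vaporization near ambient T

solar_fraction = 1.0            # "100 per cent efficiency for the solar input"
env_fraction = 1.7              # "up to another 170 per cent" from water/air

energy = DAILY_INSOLATION_MJ * (solar_fraction + env_fraction)
litres = energy / LATENT_HEAT_MJ_PER_KG     # 1 kg of evaporated water ~ 1 litre
print(f"available energy: {energy:.0f} MJ/m2/day -> ~{litres:.0f} L/m2/day")
```

With roughly 5 kWh of sun per square metre per day, the 270 per cent total energy budget works out to about 20 litres, consistent with the reported 10 to 20 litres once real-world losses are allowed for.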
Pollution
2021
April 15, 2021
https://www.sciencedaily.com/releases/2021/04/210415170716.htm
How the humble woodchip is cleaning up water worldwide
Australian pineapple, Danish trout, and Midwestern U.S. corn farmers are not often lumped together under the same agricultural umbrella. But they and many others who raise crops and animals face a common problem: excess nitrogen in drainage water. Whether it flows out to the Great Barrier Reef or the Gulf of Mexico, the nutrient contributes to harmful algal blooms that starve fish and other organisms of oxygen.
But there's a simple solution that significantly reduces the amount of nitrogen in drainage water, regardless of the production system or location: denitrifying bioreactors. "Nitrogen pollution from farms is relevant around the world, from corn and bean farms here in Illinois to sugarcane and pineapple farms in Australia to diverse farms bordered by ditches in Belgium. We're all dealing with this issue. It's really exciting that bioreactors are bringing us together around a potential solution," says Laura Christianson, assistant professor in the Department of Crop Sciences at the University of Illinois and lead author on a new synthesis article accepted for publication in Transactions of the ASABE. Denitrifying bioreactors come in many shapes and sizes, but in their simplest form, they're trenches filled with wood chips. Water from fields or aquaculture facilities flows through the trench, where bacteria living in wood chip crevices turn nitrate into a harmless gas that escapes into the air. This edge-of-field conservation practice has been studied for at least a dozen years, but most of what scientists know about nitrogen removal rates is based on laboratory replicas and smaller-scale experimental setups. The USDA's Natural Resources Conservation Service published a set of standardized bioreactor guidelines in 2015, based in part on Christianson's early field-scale work, and now more and more U.S. farmers are adding bioreactors. They're catching on in other countries, too. The ASABE article is the first to synthesize the available data from full-size bioreactors on working farms across the world. "After gathering all the data, the message is bioreactors work. We've shown a 20-40% reduction in nitrate from bioreactors in the Midwest, and now we can say bioreactors around the world are pretty consistent with that," Christianson says. She adds bioreactors, like all conservation practices, have their limitations, but nitrous oxide emissions aren't one of them. "People are worried we're just transferring nitrate in water for nitrous oxide, which is a greenhouse gas. We don't know the full story on nitrous oxide with bioreactors yet, but we can say with good confidence they're not creating a huge nitrous oxide problem," she says. "They're just not." Christianson says farmers frequently ask her about monitoring the water in bioreactors, so she and her co-authors detail the process in the ASABE article. She also partnered with the Illinois Farm Bureau to create a series of step-by-step videos explaining how to test the water. "For monitoring, there are two parts. You have to know how much water is flowing through the bioreactor and how much nitrogen is in the water," she says. The short videos, which are aimed at non-researchers such as farmers and water quality volunteers, break the process down into five steps. Christianson notes her students, postdoctoral researchers, and lab staff all pulled together to create the series. The videos are available online. Christianson, who may just be the world's biggest cheerleader for bioreactors, admits the monitoring guidelines and video series are a little self-serving. "We included recommended monitoring approaches so that more people will build them, and then more people will monitor them. And then we'll have more data to show how well bioreactors work and how we can make them work better."
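Christianson's two monitoring quantities, flow and nitrogen concentration, combine into a simple load calculation. A minimal sketch of that arithmetic follows; the function, units, and sample values are illustrative assumptions, not taken from the article or its videos.

```python
def nitrate_removal(flow_l_per_s, n_in_mg_l, n_out_mg_l, hours):
    """Return (grams of nitrate-N removed, percent concentration reduction)
    for a bioreactor over a monitoring period."""
    volume_l = flow_l_per_s * hours * 3600          # total water treated
    removed_g = volume_l * (n_in_mg_l - n_out_mg_l) / 1000.0
    percent = 100.0 * (n_in_mg_l - n_out_mg_l) / n_in_mg_l
    return removed_g, percent

# Illustrative day: 2 L/s through the trench, 12 mg/L in, 8 mg/L out
grams, percent = nitrate_removal(2.0, 12.0, 8.0, hours=24)
print(f"{grams / 1000:.2f} kg nitrate-N removed ({percent:.0f}% reduction)")
```

The illustrative 33 per cent reduction sits inside the 20-40 per cent range reported for Midwest bioreactors.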
Pollution
2021
April 15, 2021
https://www.sciencedaily.com/releases/2021/04/210415170702.htm
From smoky skies to a green horizon: Scientists convert fire-risk wood waste into biofuel
Reliance on petroleum fuels and raging wildfires: Two separate, large-scale challenges that could be addressed by one scientific breakthrough.
Teams from Lawrence Berkeley National Laboratory (Berkeley Lab) and Sandia National Laboratories have collaborated to develop a streamlined and efficient process for converting woody plant matter like forest overgrowth and agricultural waste -- material that is currently burned either intentionally or unintentionally -- into liquid biofuel. Their research was published recently. "According to a recent report, by 2050 there will be 38 million metric tons of dry woody biomass available each year, making it an exceptionally abundant carbon source for biofuel production," said Carolina Barcelos, a senior process engineer at Berkeley Lab's Advanced Biofuels and Bioproducts Process Development Unit (ABPDU). However, efforts to convert woody biomass to biofuel are typically hindered by the intrinsic properties of wood that make it very difficult to break down chemically, added ABPDU research scientist Eric Sundstrom. "Our two studies detail a low-cost conversion pathway for biomass sources that would otherwise be burned in the field or in slash piles, or increase the risk and severity of seasonal wildfires. We have the ability to transform these renewable carbon sources from air pollution and fire hazards into a sustainable fuel." In a study led by Barcelos and Sundstrom, the scientists used non-toxic chemicals, commercially available enzymes, and a specially engineered strain of yeast to convert wood into ethanol in a single reactor, or "pot." Furthermore, a subsequent technological and economic analysis helped the team identify the necessary improvements required to reach ethanol production at $3 per gasoline gallon equivalent (GGE) via this conversion pathway. The work is the first-ever end-to-end process for ethanol production from woody biomass featuring both high conversion efficiency and a simple one-pot configuration. (As any cook knows, one-pot recipes are always easier than those requiring multiple pots, and in this case, it also means lower water and energy usage.) In a complementary study, led by John Gladden and Lalitendu Das at the Joint BioEnergy Institute (JBEI), a team fine-tuned the one-pot process so that it could convert California-based woody biomass -- such as pine, almond, walnut, and fir tree debris -- with the same level of efficiency as existing methods used to convert herbaceous biomass, even when the input is a mix of different wood types. "Removing woody biomass from forests, like the overgrown pines of the Sierra, and from agricultural areas like the almond orchards of California's Central Valley, we can address multiple problems at once: disastrous wildfires in fire-prone states, air pollution hazards from controlled burning of crop residues, and our dependence on fossil fuels," said Das, a postdoctoral fellow at JBEI and Sandia. "On top of that, we would significantly reduce the amount of carbon added to the atmosphere and create new jobs in the bioenergy industry." Ethanol is already used as an emissions-reducing additive in conventional gasoline, typically constituting about 10% of the gas we pump into our cars and trucks. Some specialty vehicles are designed to operate on fuel with higher ethanol compositions of up to 83%. In addition, the ethanol generated from plant biomass can be used as an ingredient for making more complex diesel and jet fuels, which are helping to decarbonize the difficult-to-electrify aviation and freight sectors. Currently, the most common source of bio-based ethanol is corn kernels -- a starchy material that is much easier to break down chemically, but requires land, water, and other resources to produce. These studies indicate that woody biomass can be efficiently broken down and converted into advanced biofuels in an integrated process that is cost-competitive with starch-based corn ethanol. These technologies can also be used to produce "drop-in" biofuels that are chemically identical to compounds already present in gasoline and diesel. The next step in this effort is to develop, design, and deploy the technology at the pilot scale, which is defined as a process that converts 1 ton of biomass per day. The Berkeley Lab teams are working with Aemetis, an advanced renewable fuels and biochemicals company based in the Bay Area, to commercialize the technology and launch it at larger scales once the pilot phase is complete.
Pollution
2021
April 15, 2021
https://www.sciencedaily.com/releases/2021/04/210415170700.htm
AI pinpoints local pollution hotspots using satellite images
Researchers at Duke University have developed a method that uses machine learning, satellite imagery and weather data to autonomously find hotspots of heavy air pollution, city block by city block.
The technique could be a boon for finding and mitigating sources of hazardous aerosols, studying the effects of air pollution on human health, and making better informed, socially just public policy decisions. "Before now, researchers trying to measure the distribution of air pollutants throughout a city would either try to use the limited number of existing monitors or drive sensors around a city in vehicles," said Mike Bergin, professor of civil and environmental engineering at Duke. "But setting up sensor networks is time-consuming and costly, and the only thing that driving a sensor around really tells you is that roads are big sources of pollutants. Being able to find local hotspots of air pollution using satellite images is hugely advantageous." The specific air pollutants that Bergin and his colleagues are interested in are tiny airborne particles called PM2.5. These are particles that have a diameter of less than 2.5 micrometers -- about three percent of the diameter of a human hair -- and have been shown to have a dramatic effect on human health because of their ability to travel deep into the lungs. The Global Burden of Disease study ranked PM2.5 fifth on its list of mortality risk factors in 2015. The study indicated that PM2.5 was responsible in one year for about 4.2 million deaths and 103.1 million years of life lost or lived with disability. A recent study from the Harvard University T.H. Chan School of Public Health also found that areas with higher PM2.5 levels are associated with higher death rates due to COVID-19. But the Harvard researchers could only access PM2.5 data on a county-by-county level within the United States. While a valuable starting point, county-level pollution statistics can't drill down to a neighborhood next to a coal-fired power plant versus one next to a park that is 30 miles upwind. And most countries outside of the Western world don't have that level of air quality monitoring. "Ground stations are expensive to build and maintain, so even large cities aren't likely to have more than a handful of them," said Bergin. "So while they might give a general idea of the amount of PM2.5 in the air, they don't come anywhere near giving a true distribution for the people living in different areas throughout that city." In previous work with doctoral student Tongshu Zheng and colleague David Carlson, assistant professor of civil and environmental engineering at Duke, the researchers showed that satellite imagery, weather data and machine learning could provide PM2.5 measurements on a small scale. Building off that work and focusing on Beijing, the team has now improved their methods and taught the algorithm to automatically find hotspots and cool spots of air pollution with a resolution of 300 meters -- about the length of a New York City block. The advancement was made by using a technique called residual learning. The algorithm first estimates the levels of PM2.5 using weather data alone. It then measures the difference between these estimates and the actual levels of PM2.5 and teaches itself to use satellite images to make its predictions better. "When predictions are made first with the weather, and then satellite data is added later to fine-tune them, it allows the algorithm to take full advantage of the information in satellite imagery," said Zheng. The researchers then used an algorithm initially designed to adjust uneven illumination in an image to find areas of high and low levels of air pollution. Called local contrast normalization, the technique essentially looks for city-block-sized pixels that have higher or lower levels of PM2.5 than others in their vicinity. "These hotspots are notoriously difficult to find in maps of PM levels because some days the air is just really bad across the entire city, and it is really difficult to tell if there are true differences between them or if there's just a problem with the image contrast," said Carlson. "It's a big advantage to be able to find a specific neighborhood that tends to stay higher or lower than everywhere else, because it can help us answer questions about health disparities and environmental fairness." While the exact methods the algorithm teaches itself can't transfer from city to city, the algorithm could easily teach itself new methods in different locations. And while cities might evolve over time in both weather and pollution patterns, the algorithm shouldn't have any trouble evolving with them. Plus, the researchers point out, the number of air quality sensors is only going to increase in coming years, so they believe their approach will only get better with time. "I think we'll be able to find built environments in these images that are related to the hot and cool spots, which can have a huge environmental justice component," said Bergin. "The next step is to see how these hotspots are related to socioeconomic status and hospital admittance rates from long-term exposures. I think this approach could take us really far and the potential applications are just amazing."
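The two steps described above can be sketched schematically. This is not the Duke group's code: the choice of random forests, scikit-learn, and scipy, and all variable names, are assumptions; only the two-stage residual structure and the neighbourhood-contrast idea come from the article.

```python
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestRegressor

def fit_residual_models(weather_X, satellite_X, pm25_y):
    """Stage 1: predict PM2.5 from weather alone. Stage 2: train a second
    model on satellite features to explain what the weather model missed."""
    weather_model = RandomForestRegressor(n_estimators=200, random_state=0)
    weather_model.fit(weather_X, pm25_y)
    residuals = pm25_y - weather_model.predict(weather_X)
    satellite_model = RandomForestRegressor(n_estimators=200, random_state=0)
    satellite_model.fit(satellite_X, residuals)
    return weather_model, satellite_model

def predict_pm25(weather_model, satellite_model, weather_X, satellite_X):
    # Final estimate = weather baseline + satellite-derived correction
    return weather_model.predict(weather_X) + satellite_model.predict(satellite_X)

def local_contrast(pm25_map, neighborhood=11):
    """Subtract each pixel's neighbourhood mean so citywide bad-air days
    cancel out; positive values flag persistent block-level hotspots."""
    return pm25_map - uniform_filter(pm25_map, size=neighborhood)
```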
Pollution
2021
April 14, 2021
https://www.sciencedaily.com/releases/2021/04/210414154952.htm
Reliably measuring oxygen deficiency in rivers or lakes
When wastewater from villages and cities flows into rivers and lakes, large quantities of fats, proteins, sugars and other carbon-containing, organic substances wind up in nature together with the fecal matter. These organic substances are broken down by bacteria that consume oxygen. The larger the volume of wastewater, the better the bacteria thrive. This, however, means the oxygen content of the water continues to decrease until finally the fish, mussels or worms literally run out of air. This has created low-oxygen death zones in many rivers and lakes around the world.
In order to measure how heavily the waters are polluted with organic matter from feces, government bodies and environmental researchers regularly take water samples. One widely used measurement method uses a chemical reaction to determine the content of organic substances. As an international team of scientists now shows, this established method provides values from which the actual degree of the water pollution can hardly be derived. Prof. Helmuth Thomas, Director of Hereon's Institute of Carbon Cycles, is also a contributor to the study, which has now been published. Using the conventional measurement method, water samples are mixed with the chemicals permanganate or dichromate. These are especially reactive and break down all organic substances in a short time. The quantity of consumed permanganates or dichromates can then be used to determine how much organic substance was contained in the water sample. Experts refer to this measurement as "chemical oxygen demand," COD. The problem with the COD measurements is that they do not differentiate between the organic substances that wind up in the water with the sewage, and those that arise naturally -- such as lignin and humic acids -- which are released when wood decays. This means that the water pollution can hardly be distinguished from the natural content of organic substances. "For the Han River in South Korea, for example, we have shown that the pollution with organic substances from wastewater in the past twenty-five years has decreased. The COD measurements, however, still show high values as they were before," says Helmuth Thomas, "because here the natural substances make up a large portion of the organic matter in the water." But how can the actual pollution be measured more reliably? A biological measurement method has been established here for decades, but it is much more complex than the COD method and is therefore used less often by government bodies and research institutions. In this case, a water sample is taken from the river or lake and the oxygen content of the water is measured as an initial value. Another "parallel sample" is immediately sealed airtight. Then this water sample rests for five days. During this time, the bacteria break down the organic substance, whereby they gradually consume the oxygen in the water. After five days, the container is opened and the oxygen is measured. If the water contains a great deal of organic matter, then the bacteria were particularly active. The oxygen consumption was then correspondingly high. Experts refer to the "biological oxygen demand" (BOD) in this measurement. "The BOD measurement is far more precise than the COD because the bacteria preferentially break down the small organic molecules from the wastewater but leave the natural ones, such as lignin, untouched," says Thomas. Nevertheless, the BOD measurement has its disadvantages, too. On the one hand, the BOD measurement takes five days, while the COD value is available after a few minutes. On the other, while filling, storing and measuring the water samples, meticulous care must be taken to ensure that no oxygen from the ambient air winds up in the sample and falsifies the measurement value. "Only a few people with a great deal of laboratory experience have mastered how to entirely handle the BOD measurement," says Thomas. "Therefore, government bodies and researchers even today still prefer the COD despite its greater uncertainties." Helmuth Thomas and his team are therefore introducing an alternative method that improves on the conventional BOD measurement. The advantage of the method is that only one water sample is necessary, which is immediately sealed, and the oxygen consumption is measured without interfering with the sample. It is therefore unnecessary to open the sample again after five days to measure the oxygen content. This prevents the sample from coming into contact with atmospheric oxygen again. With the new approach, an optical fiber is inserted into the sample vessel as soon as the water sample is filled. Through this fiber, the oxygen content can be continuously measured directly in the sample using optical effects. Thomas says, "We can measure the oxygen content non-stop and obtain a far more precise picture of the oxygen consumption by the bacteria." First tests have shown that a meaningful result is already available after about forty-eight hours, something that considerably accelerates the BOD measurement. All in all, the optical method makes the BOD measurements not only more reliable, but also faster. Helmuth Thomas therefore expects the new method to be established as the new standard in the coming years, replacing both the COD and the classic BOD measurements. In the future, for example, it will be possible to determine more reliably than before whether water pollution control measures are actually successful.
Pollution
2021
April 14, 2021
https://www.sciencedaily.com/releases/2021/04/210414131732.htm
Air pollution may affect severity and hospitalization in COVID-19 patients
Patients who have preexisting respiratory conditions such as asthma or chronic obstructive pulmonary disease (COPD) and live in areas with high levels of air pollution have a greater chance of hospitalization if they contract COVID-19, says a University of Cincinnati researcher.
Angelico Mendy, MD, PhD, assistant professor of environmental and public health sciences at the UC College of Medicine, looked at the health outcomes and backgrounds of 1,128 COVID-19 patients at UC Health, the UC-affiliated health care system in Greater Cincinnati. Mendy led a team of researchers in an individual-level study which used a statistical model to evaluate the association between long-term exposure to particulate matter less than or equal to 2.5 micrometers -- it refers to a mixture of tiny particles and droplets in the air that are two-and-one-half microns or less in width -- and hospitalizations for COVID-19. Medical records allowed researchers to use patients' zip codes for estimating their particulate exposure over a 10-year period. "Particulate matter is very small, small enough to be inhaled deep into the lungs; they cross into the blood and also affect other organ systems," says Mendy. "Air pollution as a result of emissions from automobiles, factories or other sources is a generator of particulate matter." "Our study didn't find any correlation between severity of COVID-19 and particulate matter in general, but we found something for people who had asthma and COPD," says Mendy. "People who have preexisting asthma and COPD, when they are exposed to higher levels of particulate matter, they are more likely to have severe COVID-19, severe enough to be hospitalized." Researchers found that a one-unit increase in particulate matter 2.5 was associated with a 60% higher chance of hospitalization for COVID-19 patients with pre-existing respiratory disease. For patients without respiratory disease, no association was observed. The study's findings were published online in a scholarly journal. It is the first study to look at an association between air pollution, COVID-19 and individual patients, says Mendy. A study co-author, Xiao Wu, PhD, in the Department of Biostatistics at Harvard University, led a study last year looking at air pollution and COVID-19 mortality in the United States. "This study may have policy implications such as reducing particulate exposure," says Mendy. "Many people want to have more clean energy and reduced emissions into the atmosphere." Mendy says the findings of his pilot study are preliminary and he hopes to use it to generate support for a larger, more comprehensive study of patients. The UC Health patients in the study were diagnosed with COVID-19 between March 13, 2020 and July 5, 2020. The dataset was stripped of all Health Insurance Portability and Accountability Act (HIPAA) identifiers. The median age for patients was 46 and 96.6% were residents of Ohio, with the remaining 3.4% coming from Kentucky, Indiana, New York, South Carolina, West Virginia and Iowa. Other study co-authors from UC include Jason Keller, a researcher in the Department of Bioinformatics; Cecily Fassler, PhD, postdoctoral fellow in the Department of Environmental and Public Health Sciences; Senu Apewokin, MD, an assistant professor in the Department of Internal Medicine; Tesfaye Mersha, an associate professor of pediatrics; and Changchun Xie, PhD, and Susan Pinney, PhD, both professors in the Department of Environmental and Public Health Sciences. Funding for the study included various grants from the National Institutes of Health supporting researchers.
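Associations of this kind are usually reported as odds ratios from a logistic model, so "a 60% higher chance per one-unit increase" reads as an odds ratio of 1.6. A minimal sketch of that interpretation; the model setup and example increments are assumptions, and only the 1.6 figure comes from the article.

```python
import math

OR_PER_UNIT = 1.60            # odds ratio per 1-unit increase in PM2.5
beta = math.log(OR_PER_UNIT)  # implied logistic-regression coefficient

def odds_multiplier(delta_pm25):
    """Factor by which hospitalization odds scale for a given increase in
    long-term PM2.5 exposure, for patients with respiratory disease."""
    return math.exp(beta * delta_pm25)

print(f"{odds_multiplier(1.0):.2f}x odds for +1 unit")   # 1.60
print(f"{odds_multiplier(2.0):.2f}x odds for +2 units")  # 2.56 (multiplicative)
```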
Pollution
2021
April 13, 2021
https://www.sciencedaily.com/releases/2021/04/210413081424.htm
Scientists identify severe asthma species, show air pollutant as likely contributor
Asthma afflicts more than 300 million people worldwide. The most severe manifestation, known as non-Th2, or non-atopic childhood asthma, represents the majority of the cases, greater than 85%, particularly in low-income countries, according to Hyunok Choi, an associate professor at the Lehigh University College of Health. Yet, whether non-Th2 is a distinct disease (or endotype) or simply a unique set of symptoms (or phenotype) remains unknown.
"Non-Th2 asthma is associated with very poor prognosis in children and great, life-long suffering due to the absence of effective therapies," says Choi. "There is an urgent need to better understand its mechanistic origin to enable early diagnosis and to stop the progression of the disease before it becomes severe."Studies show that nearly 50% of the children whose asthma is poorly controlled are expected to emerge as severe adult cases. Yet, a one-size-fits-all treatment approach, currently the norm for asthma, is ineffective and, says Choi, and partially responsible for asthma's growing economic burden."The primary reason for lack of therapeutic and preventive measures is that no etiologic, or causal, driver has ever been identified for the non-Th2 asthma," says Choi.Now, for the first time, an epidemiological study, led by Choi, has shown that not only is non-Th2 a distinct disease, its likely inducer is early childhood exposure to airborne Benzo[a]pyrene, a byproduct of fossil fuel combustion. Choi and her colleagues are the first to demonstrate air pollution as a driver of the most challenging type of asthma, the severe subtype which is non-responsive to current therapies.The team describes their results in an article recently published online in Environmental Health Journal called "Airborne Benzo[a]Pyrene May Contribute to Divergent Pheno-Endotypes in Children."What is termed asthma is an umbrella word for multiple diseases with common symptoms. Asthma has been broadly classified as two major sets of symptoms: T helper cell high (Th2-high) and T helper cell low (non-Th2). Th2-high is associated with early-childhood allergies to common pollutants such as pet dander, tree pollens, or mold. In contrast, non-TH2 is not related to an allergic response. The non-Th2 type, marked explicitly by being non-allergy-related, is far less understood than the TH-2 type and could transform into severe or difficult to treat type."The identification of non-Th2 asthma as a distinct disease, with early exposure to Benzo[a]pyrene as a driver, has the potential to impact tens of millions of sufferers, since this would make it possible to intervene before the onset of irreversible respiratory injuries," says Choi.The team tested two comparable groups of children from an industrial city, Ostrava, and the surrounding semi-rural area of Southern Bohemia, in the Czech Republic: 194 children with asthma and a control group consisting of 191 children. According to the study, Ostrava is an industrial city with a high level of coal mining activities, coal processing, and metallurgical refinement. The district-level ambient mean for Benzo[a]pyrene at the time of their investigation November 2008) was 11-times higher than the recommended outdoor and indoor air quality standard.Not only was elevated exposure to Benzo[a]pyrene associated with correspondingly elevated odds of non-Th2 asthma, it was also associated with depressed systemic oxidant levels."Contrary to the current body of evidence supporting adult onset of non-atopic asthma, our data suggest for the first time that the lung function deficit and suppressed oxidative stress levels during early childhood are critical sentinel events preceding non-atopic asthma," says Choi.
Pollution
2021
April 13, 2021
https://www.sciencedaily.com/releases/2021/04/210412161911.htm
Plastic planet: Tracking pervasive microplastics across the globe
Really big systems, like ocean currents and weather, work on really big scales. And so too does your plastic waste, according to new research by Janice Brahney of the Department of Watershed Sciences. The plastic straw you discarded in 1980 hasn't disappeared; it has fragmented into pieces too small to see, and is cycling through the atmosphere, infiltrating soil, ocean waters and air. Microplastics are so pervasive that they now affect how plants grow, waft through the air we breathe, and permeate distant ecosystems. They can be found in places as varied as the human bloodstream and the guts of insects in Antarctica.
Understanding how microplastics move through global systems is essential to fixing the problem, said Brahney. Her new research focuses on how these invisible pieces of plastic get into the atmosphere, how long they stay aloft, and where in our global system we can expect to find hotspots of microplastic deposition. Plastics enter the atmosphere ... not directly from garbage cans or landfills as you might expect ... but from old, broken-down waste that makes its way into large-scale atmospheric patterns. Roads are a big source of atmospheric plastics, where vehicle tires churn and launch skyward the tiny pieces through strong vehicle-created turbulence. Ocean waves, too, are full of insoluble plastic particles that used to be food wrappers, soda bottles, and plastic bags. These "legacy plastic" particles bob to the top layer of water and are churned by waves and wind, and catapulted into the air. Another important source for the re-emission of plastics is dust produced from agricultural fields. Plastics are introduced to the soil when fertilizers from waste treatment operations are used (virtually all microplastics that are flushed with wastewater remain with the biowaste after the treatment process). Wind can also be a factor near population centers, whisking broken-down plastic particles into the air. Once in the atmosphere, plastics could remain airborne for up to 6.5 days -- enough time to cross a continent, said Natalie Mahowald, coauthor on the paper. The most likely place for plastic deposition from the atmosphere is over (and into) the Pacific and Mediterranean oceans, but continents actually receive more net plastics from polluted ocean sources than they send to them, according to the models. The U.S., Europe, Middle East, India and Eastern Asia are also hotspots for land-based plastic deposition. Along the coasts, ocean sources of airborne plastic become more prominent, including America's west coast, the Mediterranean and southern Australia. Dust and agriculture sources for airborne plastics factor more prominently in northern Africa and Eurasia, while road-produced sources had a big impact in heavily populated regions the world over. This study is important, said Brahney, but it is just the beginning. Much more work is needed on this pressing problem to understand how different environments might influence the process ... wet climates versus dry ones, mountainous regions versus flatlands. The world hasn't slowed its production or use of plastic, she said, so these questions become more pressing every passing year.
Pollution
2021
April 13, 2021
https://www.sciencedaily.com/releases/2021/04/210408212959.htm
Even 'safe' ambient carbon monoxide levels may harm health, study finds
Data collected from 337 cities across 18 countries show that even slight increases in ambient carbon monoxide levels from automobiles and other sources are associated with increased mortality.
A scientific team led by Yale School of Public Health Assistant Professor Kai Chen analyzed data, including a total of 40 million deaths from 1979 to 2016, and ran it through a statistical model. The research was published today. Overall, a 1 mg/m³ increase in the average CO concentration of the previous day was associated with a 0.91% increase in daily total mortality, the study found. This suggests considerable public health benefits could be achieved by reducing ambient CO concentrations through stricter control of traffic emissions and other measures. Chen and colleagues also discovered that the exposure-response curve was steeper at daily CO levels lower than 1 mg/m³, indicating greater risk of mortality per increment in CO exposure, and this persisted at daily concentrations as low as 0.6 mg/m³ or less. The findings reveal that there is no evidence for a threshold value below which exposure to ambient CO can be considered "safe." The U.S. National Ambient Air Quality Standard for ambient CO (approximately 7 mg/m³ for the daily average) was established in 1971 and has not been revisited for the past five decades. The same air quality guideline for CO has been applied in other regions such as Europe, whereas a lower value of 4 mg/m³ was established as China's air quality standard. The study's findings strongly suggest the need to revisit global and national air quality guidelines for CO and, in addition to single-pollutant standards, policies should also be expanded to address traffic-related air pollution mixtures. "These findings have significant public health implications," Chen said. "Millions and millions of people live in environments with elevated CO levels and in environments where the CO levels are within the current guidelines considered a 'safe range.'" The international study is believed to be the largest epidemiological investigation on mortality and short-term CO exposure. Professor Michelle Bell of the Yale School of the Environment is a co-author of the paper. Chen collaborated with 37 other scientists from the Multi-Country Multi-City (MCC) Collaborative Research Network. The senior authors of this paper are Alexandra Schneider of the Helmholtz Zentrum München in Munich, Germany, and Antonio Gasparrini of the London School of Hygiene & Tropical Medicine.
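Read as an exposure-response coefficient, the headline number supports a simple excess-mortality estimate. The sketch treats the association as linear over small increments, which is an approximation; the city figures are invented for illustration.

```python
PCT_PER_MG_M3 = 0.91  # % rise in daily mortality per 1 mg/m3 rise in CO

def excess_daily_deaths(baseline_daily_deaths, co_increase_mg_m3):
    """Expected additional deaths per day for a given rise in the
    previous day's average CO concentration."""
    return baseline_daily_deaths * (PCT_PER_MG_M3 / 100.0) * co_increase_mg_m3

# Hypothetical city: 150 deaths/day, CO up 0.5 mg/m3 on a heavy-traffic day
print(f"{excess_daily_deaths(150, 0.5):.2f} excess deaths expected that day")
```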
Pollution
2021
April 12, 2021
https://www.sciencedaily.com/releases/2021/04/210412161916.htm
Scientists discover three liquid phases in aerosol particles
Researchers at the University of British Columbia have discovered three liquid phases in aerosol particles, changing our understanding of air pollutants in the Earth's atmosphere.
While aerosol particles were known to contain up to two liquid phases, the discovery of an additional liquid phase may be important to providing more accurate atmospheric models and climate predictions. The study was published today. "We've shown that certain types of aerosol particles in the atmosphere, including ones that are likely abundant in cities, can often have three distinct liquid phases," says Dr. Allan Bertram, a professor in the department of chemistry. "These properties play a role in air quality and climate. What we hope is that these results improve models used in air quality and climate change policies." Aerosol particles fill the atmosphere and play a critical role in air quality. These particles contribute to poor air quality and absorb and reflect solar radiation, affecting the climate system. Nevertheless, how these particles behave remains uncertain. Prior to 2012, it was often assumed in models that aerosol particles contained only one liquid phase. In 2012, researchers from the University of British Columbia and Harvard University provided the first observations of two liquid phases in particles collected from the atmosphere. More recently, researchers at UBC hypothesized three liquid phases could form in atmospheric particles if the particles consisted of low polarity material, medium polarity material, and salty water. To test this, a solvatochromic dye -- a dye that changes color depending on the polarity of its surroundings -- was injected into particles containing a mixture of all three of these components. Although the solvatochromic dye method has been used widely in biology and chemistry, it has not been used to characterize the phase behaviour of atmospheric aerosols. Remarkably, three different colors were observed in these particles, confirming the presence of three liquid phases. Scientists were also able to study the properties of particles containing three phases, including how well these particles acted as seeds for clouds, and how fast gases go into and out of the particles. The study focused on particles containing mixtures of lubricating oil from gas vehicles, oxidized organic material from fossil fuel combustion and trees, and inorganic material from fossil fuel combustion. Depending on the properties of the lubricating oil and the oxidized organic material, different numbers of liquid phases will appear, resulting in different impacts on air quality and climate. "Through what we've shown, we've improved our understanding of atmospheric aerosols. That should lead to better predictions of air quality and climate, and better prediction of what is going to happen in the next 50 years," says Dr. Bertram. "If policies are made based on a model that has high uncertainties, then the policies will have high uncertainties. I hope we can improve that." With the urgency of climate goals, policy that is built on accurate atmospheric modelling reduces the possibility of using resources and finances toward the wrong policies and goals.
Pollution
2021
April 12, 2021
https://www.sciencedaily.com/releases/2021/04/210412084528.htm
Volcanic pollution return linked to jump in respiratory disease cases
Respiratory disease increased markedly following one of Iceland's largest volcanic eruptions, a new study has found.
And the findings could have significant implications for actions taken to protect the health of the 800 million people globally living near active volcanoes. Indeed, only last month (March), lava burst through a crack in Iceland's Mount Fagradalsfjall in the first eruption of its type in more than 800 years. The new research, led by the University of Leeds and the University of Iceland, examined the health impacts of pollution caused by the Holuhraun lava eruption in 2014-2015. It shows that following exposure to emissions that changed chemically from gas to fine particles, incidents of respiratory disease in Iceland rose by almost a quarter, and the incidence of asthma medication dispensing by a fifth. The findings were published today (10:00 GMT 12 April). The report's co-lead author is Dr Evgenia Ilyinskaya, from the University of Leeds' School of Earth and Environment. She said: "Volcanoes are a significant source of air pollution, but of course it's a source that cannot be controlled. "Large volcanic eruptions can cause harmful air pollution both immediately, and also when the plume returns to the same area, which may happen without it triggering air pollution alerts. "Our research shows that during prolonged eruptions such as Holuhraun, both young and mature plumes can be circulating at the same time, increasing the harmful health effects on those living in volcanic regions. "This pollution return is not currently factored into responses to the threat to public health caused by volcanoes." The Holuhraun eruption was one of the biggest of its kind in the last 200 years, releasing 11 million tonnes of sulphur dioxide that spread across Iceland and the Atlantic Ocean towards Europe. During the six-month-long eruption, residents of Iceland's capital, Reykjavík, were repeatedly exposed to the young and mature plumes, despite living 250km from the eruption site. In their previous research, published in 2017, the scientists traced the evolution of the volcanic plume chemistry. They found that the plume had been swept by air currents towards the UK and mainland Europe before circling back to Icelandic cities and towns. During this process, the plume composition matured as it lingered in the atmosphere -- meaning that the volcanic sulphur dioxide had converted to particles. These fine particles found in mature plumes are so small they can penetrate deep into the lungs, potentially causing serious health problems such as exacerbating asthma attacks. In the returning plume, because the sulphur dioxide levels were reduced as the gas converted to particles, concentrations were therefore within European Commission air standards. As a result, no health advisory messages were in place in Iceland for the returning plume. It is estimated that short and long-term exposure to these kinds of fine particles, from both human-made and natural sources, causes over three million premature deaths globally per year and remains the single largest environmental health risk in Europe. The new findings highlight the health risks of pollutants lingering in the atmosphere, and the implications for monitoring emissions from volcanic activity. They point to the global need for health risk assessments and population safety management following volcanic eruptions. Co-lead author Dr Hanne Krage Carlsen, from the University of Iceland and University of Gothenburg, said: "Iceland has some of the most complete health care records in the world. This was the first time a population of a considerable size and density could be assessed following major volcanic activity. "This study provides the most robust evidence to date that exposure to a chemically-mature volcanic plume leads to increased use of a country's health care system. "It also emphasizes that emissions from volcanoes are a region-wide issue, in this case potentially affecting the whole North Atlantic region. "As the Holuhraun plume returned to Iceland, there was increased use of GPs and hospital emergency care units with regards to respiratory diseases. At the same time, there was a lack of public health advice. "We recommend that future Government responses to volcanic air pollution globally consider both the implications to health caused by the initial eruptions, but also those of the returning plumes with additional threats to health."
Pollution
2021
April 9, 2021
https://www.sciencedaily.com/releases/2021/04/210408212954.htm
Sunlight linked with lower COVID-19 deaths, study shows
Sunnier areas are associated with fewer deaths from Covid-19, an observational study suggests.
Increased exposure to the sun's rays -- specifically UVA -- could act as a simple public health intervention if further research establishes it causes a reduction in mortality rates, experts say. Researchers from the University of Edinburgh compared all recorded deaths from Covid-19 in the continental US from January to April 2020 with UV levels for 2,474 US counties for the same time period. The study found that people living in areas with the highest level of exposure to UVA rays -- which makes up 95 per cent of the sun's UV light -- had a lower risk of dying from Covid-19 compared with those with lower levels. The analysis was repeated in England and Italy with the same results. The researchers took into account factors known to be associated with increased exposure to the virus and risk of death, such as age, ethnicity, socioeconomic status, population density, air pollution, temperature and levels of infection in local areas. The observed reduction in risk of death from Covid-19 could not be explained by higher levels of vitamin D, the experts said. Only areas with insufficient levels of UVB to produce significant vitamin D in the body were included in the study. One explanation for the lower number of deaths, which the researchers are following up, is that sunlight exposure causes the skin to release nitric oxide. This may reduce the ability of SARS-CoV-2 -- the cause of Covid-19 -- to replicate, as has been found in some lab studies. Previous research from the same group has shown that increased sunlight exposure is linked to improved cardiovascular health, with lower blood pressure and fewer heart attacks. As heart disease is a known risk factor in dying from Covid-19, this could also explain the latest findings. The team say due to the observational nature of the study it is not possible to establish cause and effect. However, it may lead to interventions that could be tested as potential treatments. The paper has been published. Dr Richard Weller, corresponding author, consultant dermatologist and Reader at the University of Edinburgh, said: "There is still so much we don't understand about Covid-19, which has resulted in so many deaths worldwide. These early results open up sunlight exposure as one way of potentially reducing the risk of death." Professor Chris Dibben, Chair in Health Geography at the University of Edinburgh and co-author, said: "The relationship between Covid-19 mortality, season and latitude has been quite striking; here we offer an alternative explanation for this phenomenon."
Pollution
2021
April 7, 2021
https://www.sciencedaily.com/releases/2021/04/210407143809.htm
Carbon dioxide levels reflect COVID-19 risk
Tracking carbon dioxide levels indoors is an inexpensive and powerful way to monitor the risk of people getting COVID-19, according to new research from the Cooperative Institute for Research in Environmental Sciences (CIRES) and the University of Colorado Boulder. In any given indoor environment, when excess CO2 levels double, the risk of transmission also roughly doubles, the researchers found.
The chemists relied on a simple fact already put to use by other researchers more than a decade ago: Infectious people exhale airborne viruses at the same time as they exhale carbon dioxide. That means CO2 can serve as a proxy for the amount of virus-laden exhaled air accumulating in a room. "You're never safe indoors sharing air with others, but you can reduce the risk," said Jose-Luis Jimenez, co-author of the new assessment, a CIRES Fellow and professor of chemistry at the University of Colorado Boulder. "And CO2 monitoring is an affordable way to track that risk." For many months, researchers around the world have been searching for a way to continually monitor COVID-19 infection risk indoors, whether in churches or bars, buses or hospitals. Some are developing instruments that can detect viruses in the air continually, to warn of a spike or to indicate relative safety. Others tested existing laboratory-grade equipment that costs tens of thousands of dollars. Jimenez and colleagues turned to commercially available carbon dioxide monitors, which can cost just a few hundred dollars. First, they confirmed in the laboratory that the detectors were accurate. Then, they created a mathematical "box model" of how an infected person exhales viruses and CO2 into indoor air, and how both build up there. It's important to understand that there is no single CO2 level at which an indoor space can be considered safe. But in each indoor space, the model can illuminate "relative" risk: if CO2 levels double, the risk of infection also roughly doubles; if they are halved, the risk is roughly halved as well. In the new paper, Peng and Jimenez also shared a set of mathematical formulae and tools that experts in building systems and public health can use to pin down actual, not just relative, risk. But the most important conclusion is that to minimize risk, keep the CO2 level as low as possible. "Wherever you are sharing air, the lower the CO2, the lower the risk."
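The relative-risk logic can be sketched in a few lines. This is a simplified reading of the box-model conclusion, not the authors' published formulae; the outdoor baseline and the example room levels are assumptions.

```python
OUTDOOR_CO2_PPM = 420  # assumed outdoor baseline

def excess_co2(indoor_ppm):
    """CO2 above the outdoor baseline, i.e. the share of the room's air
    attributable to people's exhaled breath."""
    return max(indoor_ppm - OUTDOOR_CO2_PPM, 0)

def relative_risk(indoor_ppm, reference_ppm):
    """Infection risk of one setting relative to another, under the
    simplification that risk scales with excess CO2 inhaled."""
    return excess_co2(indoor_ppm) / excess_co2(reference_ppm)

# A room at 1620 ppm carries roughly twice the risk of one at 1020 ppm
print(relative_risk(1620, 1020))  # -> 2.0
```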
Pollution
2021
April 7, 2021
https://www.sciencedaily.com/releases/2021/04/210407110405.htm
Scientists develop eco-friendly pollen sponge to tackle water contaminants
A team of scientists led by Nanyang Technological University, Singapore (NTU Singapore) has created a reusable, biodegradable sponge that can readily soak up oil and other organic solvents from contaminated water sources, making it a promising alternative for tackling marine oil spills.
Made of sunflower pollen, the sponge is hydrophobic -- it repels water -- thanks to a coat of natural fatty acid on the sponge. In lab experiments, the scientists showed the sponge's ability to absorb oil contaminants of various densities, such as gasoline and motor oil, at a rate comparable to that of commercial oil absorbents. Oil spills are difficult to clean up, and result in severe long-lasting damage to the marine ecosystem. Conventional clean-up methods, including using chemical dispersants to break oil down into very small droplets, or absorbing it with expensive, unrecyclable materials, may worsen the damage. So far, the researchers have engineered sponges that measure 5 cm in diameter. The research team, made up of scientists from NTU Singapore and Sungkyunkwan University in South Korea, believes that these sponges, when scaled up, could be an eco-friendly alternative to tackle marine oil spills. Professor Cho Nam-Joon from the NTU School of Materials Science and Engineering, who led the study, said: "By finetuning the material properties of pollen, our team successfully developed a sponge that can selectively target oil in contaminated water sources and absorb it. Using a material that is found abundantly in nature also makes the sponge affordable, biodegradable, and eco-friendly." This study builds on NTU's body of work on finding new uses for pollen, known as the diamond of the plant kingdom for its hard exterior, by transforming its tough shell into microgel particles. This soft, gel-like material is then used as a building block for a new category of environmentally sustainable materials. Last year, Prof Cho, together with NTU President Professor Subra Suresh, led a research team to create a paper-like material from pollen as a greener alternative to paper created from trees. This 'pollen paper' also bends and curls in response to changing levels of environmental humidity, a trait that could be useful for soft robots, sensors, and artificial muscles. Prof Cho, who also holds the Materials Research Society of Singapore Chair in Materials Science and Engineering, added: "Pollen that is not used for plant pollination is often considered biological waste. Through our work, we try to find new uses for this 'waste' and turn it into a natural resource that is renewable, affordable, and biodegradable. Pollen is also biocompatible. It does not cause an immunological, allergic or toxic reaction when exposed to body tissues, making it potentially suitable for applications such as wound dressing, prosthetics, and implantable electronics." The findings were published in a scientific journal. To form the sponge, the NTU team first transformed the ultra-tough pollen grains from sunflowers into a pliable, gel-like material through a chemical process akin to conventional soap-making. This process includes removing the sticky oil-based pollen cement that coats the grain's surface, before incubating the pollen in alkaline conditions for three days. The resulting gel-like material was then freeze-dried. These processes resulted in the formation of pollen sponges with 3D porous architectures. The sponges were briefly heated to 200°C -- a step that makes their form and structure stable after repeatedly absorbing and releasing liquids. Heating also led to a two-fold improvement in the sponge's resistance to deformation, the scientists found. To make sure the sponge selectively targets oil and does not absorb water, the scientists coated it with a layer of stearic acid, a type of fatty acid found commonly in animal and vegetable fat. This renders the sponge hydrophobic while maintaining its structural integrity. The scientists performed oil-absorption tests on the pollen sponge with oils and organic solvents of varying densities, such as gasoline, pump oil, and n-hexane (a chemical found in crude oil). They found that the sponge had an absorption capacity in the range of 9.7 to over 29.3 g/g.* This is comparable to commercial polypropylene absorbents, which are petroleum derivatives and have an absorption capacity range of 8.1 to 24.6 g/g. They also tested the sponge for its durability and reusability by repeatedly soaking it in silicone oil, then squeezing the oil out. They found that this process could go on for at least 10 cycles. In a final proof-of-concept experiment, the team tested the ability of a sponge 1.5cm in diameter and 5mm in height to absorb motor oil from a contaminated water sample. The sponge readily absorbed the motor oil in less than 2 minutes. "Collectively, these results demonstrate that the pollen sponge can selectively absorb and release oil contaminants and has similar performance levels to commercial oil absorbents while demonstrating compelling properties such as low cost, biocompatibility, and sustainable production," said Prof Cho, the corresponding author of this study. Going forward, the researchers plan to scale up the size of pollen sponges to meet industry needs. They are also looking to collaborate with non-governmental organisations and international partners to conduct pilot tests with pollen sponges in real-life environments. "We hope our innovative pollen materials can one day replace widely-used plastics and help to curb the global issue of plastic pollution," said Prof Cho. *g/g is a unit of measurement for absorption capacity. It refers to how many grams of the contaminant can adhere per gram of the absorbing material.
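The starred g/g unit is a gravimetric ratio, measured by weighing the sponge dry and again at saturation. A worked example with invented masses, not the team's data:

```python
def absorption_capacity_g_per_g(dry_mass_g, saturated_mass_g):
    """Grams of oil taken up per gram of dry sponge."""
    return (saturated_mass_g - dry_mass_g) / dry_mass_g

# Illustrative: a 0.50 g pollen sponge weighing 10.2 g after soaking in oil
print(f"{absorption_capacity_g_per_g(0.50, 10.2):.1f} g/g")  # 19.4, within 9.7-29.3
```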
Pollution
2021
April 6, 2021
https://www.sciencedaily.com/releases/2021/04/210406131953.htm
Houston flooding polluted reefs more than 100 miles offshore
Runoff from Houston's 2016 Tax Day flood and 2017's Hurricane Harvey flood carried human waste onto coral reefs more than 100 miles offshore in the Flower Garden Banks National Marine Sanctuary, according to a Rice University study.
"We were pretty shocked," said marine biologist Adrienne Correa, co-author of the study in The Flower Garden Banks sit atop several salt domes near the edge of the continental shelf about 100 miles from the Texas and Louisiana coast. Rising several hundred feet from the seafloor, the domes are topped with corals, algae, sponges and fish. Each bank, or dome-topped ecosystem, is separated by miles of open ocean. The Flower Garden Banks National Marine Sanctuary, which was recently expanded, protects 17 banks.Correa and colleagues sampled sponges at the sanctuary in 2016, 2017 and 2018. They showed samples collected after extreme storm flooding in 2016 and 2017 contained E. coli and other human fecal bacteria. They also used a catalog of E. coli genetic markers contributed by Rice environmental engineer and co-author Lauren Stadler to show that E. coli on sponges in 2017 came from Harvey floodwaters.Lead author Amanda Shore, who conducted the research while a Rice Academy Postdoctoral Fellow in Correa's lab, said many studies have shown nearshore reefs can be harmed by pollutants that are washed into the ocean by rainfall over land. But marine biologists generally assume ecosystems far from shore are safe from such dangers."This shows perhaps they aren't protected from severe events," said Shore, an assistant professor of biology at Farmingdale State College in New York. "And these events are increasing in frequency and intensity with climate change."Correa said, "That's the other piece of this. There actually was a massive flooding event in 2015 with the Memorial Day flood. Dips in salinity after that event were detected at surface buoys offshore, but nobody looked or sampled out at the Flower Garden Banks. Nobody imagined you would see something like this 160 kilometers out."In April 2016, widespread flooding occurred in the Houston area when a severe storm dropped more than 17 inches of rain in some places in less than 24 hours. Three months after the flood, recreational divers reported murky waters and dead and dying organisms at East Flower Garden Bank. Marine biologists, including study co-author Sarah Davies of Boston University, arrived two weeks later to investigate.Shore and co-authors Carsten Grupstra, a Rice graduate student, and Jordan Sims, a Rice undergraduate, analyzed samples from the expedition, including tissue collected from sponges. Shore said sponges are indicators of water quality because they "are basically filtering seawater to catch organic material to use as food."She said previous studies have shown sponges have a microbiome, a population of bacteria that normally live in and on these animals. In this study, Shore characterized the microbiomes on two species: giant barrel sponges, or Xestospongia muta, and orange elephant ear sponges, or Agelas clathrodes. It was the first time the species' microbiomes had been assayed at Flower Garden Banks, and Correa said that was one reason it took so long to understand what happened in the flood years.Correa said, "In 2016, we saw differences between sponge bacteria at a location that showed signs of death and a location that didn't show signs of death, but we couldn't get at the cause of the differences because we had no baseline data. We thought we'd be able to get the baseline data -- the normal year -- the next year in 2017. But then there was another disaster. 
We couldn't get a normal sample in a no-flood year until 2018."Shore joined Correa's lab in 2018, helped collect samples that year and analyzed the microbiomes from each year.Correa said, "There was a big change in community composition, a shift of the team players, on the sponges that were most affected in 2016. Then, following Harvey in 2017 there was also a shift, but less water made it out there that year, and we think it was less stressful. We didn't see dead and dying organisms like we had the previous year."Harvey, the most intense rainfall event in U.S. history, dropped an estimated 13 trillion gallons of rain over southeast Texas in late August 2017. The researchers said Harvey posed a greater potential threat to the Flower Garden Banks, by far, than the 2016 flood. So why did reefs fare better in 2017?"Because we got lucky with ocean currents," Shore said. "Instead of going straight out from Galveston Bay and over the Flower Garden Banks, the water ended up turning a bit and going down the Texas coast instead."Harvey's runoff still sideswiped the banks. Research buoys at the reefs measured a 10% drop in salinity in less than a day on Sept. 28, and Correa's team found genetic evidence that fecal pollution gathered from the banks in October originated in Harvey floodwaters in Houston.Correa said the story in 2016 was more complicated."There was an upwelling event that brought nutrients and cooler waters up from the deep to the top part of the Flower Garden Banks," she said. "Fresh water is less dense than salt water, and we think the floodwaters came at the surface and sort of sat there like a lens on top of the salt water and kept oxygen from mixing in from the top. The combination of this surface event and the nutrients coming up from the bottom contributed to a bacterial bloom that drew down so much oxygen that things just asphyxiated."The big question is whether pollution from extreme storms poses a long-term threat to the Flower Garden Banks. Correa said the answer could come from an investment in research that follows the health and microbiomes of individual sponges and corals on the reef over time. She said her group at Rice and her collaborators are committed to learning as much as they can about the reefs, and they are determined to support efforts to conserve and protect them.
Pollution
2021
April 6, 2021
https://www.sciencedaily.com/releases/2021/04/210406120656.htm
First air quality profile of two sub-Saharan African cities finds troubling news
Ambient air pollution is a global public health crisis, causing more than 4.9 million premature deaths per year around the world. In Africa, it has surpassed AIDS as the leading cause of premature death. According to one study, air pollution -- specifically, fine particulate matter (PM2.5) -- may cause as many as 780,000 premature deaths annually in Africa and worsen a significant number of diseases, including asthma, lung cancer, and chronic obstructive pulmonary disease.
Kinshasa, capital of the Democratic Republic of the Congo, and Brazzaville, capital of the Republic of Congo, are both large metropolises. However, neither Kinshasa (population 14.3 million) nor Brazzaville (population 2.4 million) has had a comprehensive air quality monitoring program. There are no national ambient air quality standards in either country, according to an analysis done by the UN Environment Programme. A new study, led by Lamont-Doherty Earth Observatory atmospheric scientist Daniel Westervelt and Columbia University undergraduate student Celeste McFarlane, has yielded the first-ever multi-year ambient PM2.5 dataset in Kinshasa and Brazzaville. The team deployed a cadre of low-cost sensors and interpreted data in the context of changing weather and changing human activity related to COVID-19 stay-at-home orders. The study was supported by two local universities and their scientists in both cities, and has now been published online. What it shows is concerning. During the investigation, which began in March 2018, researchers found PM2.5 is highest during the dry season -- June, July, and August -- when it is up to five times higher than World Health Organization guidelines. It is lower in the remaining months, thanks in part to rainfall, but even then, it is more than four times higher than WHO guidelines. "Average PM2.5 concentrations suggest unhealthy levels of human exposure, which, over time, can lead to cardiopulmonary problems and premature death," said Westervelt. The study also found that last year's stay-at-home and lockdown directives in response to COVID-19 corresponded to a 40% decrease in PM2.5. "We were able to demonstrate that it is possible to robustly characterize air quality in African megacities using well-calibrated, relatively simple, cheap devices," Westervelt said. He added that given the health risks from air pollution, this data is urgently needed to draw attention to the problem. Researchers hope this study will lead to more concerted efforts to characterize sources of air pollution and develop strategies to mitigate the negative health impacts. Study collaborators include: Columbia University, Department of Chemical Engineering; Ecole Régionale postuniversitaire d'Aménagement et de Gestion Intégrés des Forêts et Territoires tropicaux (ERAIFT), Kinshasa, Democratic Republic of Congo; World Bank Group, Kinshasa, Democratic Republic of Congo; Département de chimie, Université Marien Ngouabi, Brazzaville, Republic of Congo; Washington State Department of Ecology; Department of Chemistry, University of California Berkeley; OSU-EFLUVE -- Observatoire Sciences de l'Univers-Enveloppes Fluides de la Ville à l'Exobiologie, Université Paris-Est-Créteil, France; NASA Postdoctoral Program Fellow, Goddard Space Flight Center; Center for Atmospheric Particle Studies, Carnegie Mellon University; Kigali Collaborative Research Centre, Kigali, Rwanda; and NASA Goddard Institute for Space Studies.
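For context on the calibration Westervelt mentions: low-cost particulate sensors are usually corrected against a co-located reference monitor. Below is a minimal sketch of the simplest (linear) form of such a correction; the numbers and the single-predictor form are illustrative assumptions, since the study's actual calibration procedure is not described here (field calibrations often also include humidity terms).

```python
# Hypothetical linear calibration of low-cost PM2.5 readings against a
# co-located reference monitor. All values are illustrative, not study data.
import numpy as np

raw = np.array([18.0, 35.0, 52.0, 80.0, 110.0])  # low-cost sensor, ug/m3
ref = np.array([12.0, 25.0, 40.0, 63.0, 90.0])   # reference monitor, ug/m3

slope, intercept = np.polyfit(raw, ref, 1)       # fit ref ~ slope*raw + intercept
corrected = slope * raw + intercept              # apply correction to readings

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
print("corrected:", np.round(corrected, 1))
```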
Pollution
2021
April 5, 2021
https://www.sciencedaily.com/releases/2021/04/210405123311.htm
Masks, ventilation stop COVID spread better than social distancing, study shows
A new study from the University of Central Florida suggests that masks and a good ventilation system are more important than social distancing for reducing the airborne spread of COVID-19 in classrooms.
The research was published recently. "The research is important as it provides guidance on how we are understanding safety in indoor environments," says Michael Kinzel, an assistant professor in UCF's Department of Mechanical and Aerospace Engineering and study co-author. "The study finds that aerosol transmission routes do not display a need for six feet social distancing when masks are mandated," he says. "These results highlight that with masks, transmission probability does not decrease with increased physical distancing, which emphasizes how mask mandates may be key to increasing capacity in schools and other places." In the study, the researchers created a computer model of a classroom with students and a teacher, then modeled airflow and disease transmission, and calculated airborne-driven transmission risk. The classroom model was 709 square feet with 9-foot-tall ceilings, similar to a smaller-size university classroom, Kinzel says. The model had masked students -- any one of whom could be infected -- and a masked teacher at the front of the classroom. The researchers examined the classroom using two scenarios -- a ventilated classroom and an unventilated one -- and using two models, Wells-Riley and Computational Fluid Dynamics. Wells-Riley is commonly used to assess indoor transmission probability, and Computational Fluid Dynamics is often used to understand the aerodynamics of cars, aircraft and the underwater movement of submarines. Masks were shown to be beneficial by preventing direct exposure of aerosols, as the masks provide a weak puff of warm air that causes aerosols to move vertically, thus preventing them from reaching adjacent students, Kinzel says. Additionally, a ventilation system in combination with a good air filter reduced the infection risk by 40 to 50% compared to a classroom with no ventilation. This is because the ventilation system creates a steady current of air flow that circulates many of the aerosols into a filter that removes a portion of them, compared to the no-ventilation scenario, where the aerosols congregate above the people in the room. These results corroborate recent guidelines from the U.S. Centers for Disease Control and Prevention that recommend reducing social distancing in elementary schools from six to three feet when mask use is universal, Kinzel says. "If we compare infection probabilities when wearing masks, three feet of social distancing did not indicate an increase in infection probability with respect to six feet, which may provide evidence for schools and other businesses to safely operate through the rest of the pandemic," Kinzel says. "The results suggest exactly what the CDC is doing, that ventilation systems and mask usage are most important for preventing transmission and that social distancing would be the first thing to relax," the researcher says. When comparing the two models, the researchers found that Wells-Riley and Computational Fluid Dynamics generated similar results, especially in the non-ventilated scenario, but that Wells-Riley underpredicted infection probability by about 29 percent in the ventilated scenario. As a result, they recommend some of the additional complex effects captured in Computational Fluid Dynamics be applied to Wells-Riley to develop a more complete understanding of risk of infection in a space, says Aaron Foster, a doctoral student in UCF's Department of Mechanical and Aerospace Engineering and the study's lead author. "While the detailed Computational Fluid Dynamics results provided new insights into the risk variation and distance relationships, they also validated the more commonly used Wells-Riley models as capturing the majority of the benefit of ventilation with reasonable accuracy," Foster says. "This is important since these are publicly available tools that anyone can use to reduce risk." The research is part of a larger overall effort to control airborne disease transmission and better understand factors related to being a super-spreader. The researchers are also testing the effects of masks on aerosol and droplet transmission distance. The work is funded in part by the National Science Foundation. Kinzel received his doctorate in aerospace engineering from Pennsylvania State University and joined UCF in 2018. In addition to being a member of UCF's Department of Mechanical and Aerospace Engineering, a part of UCF's College of Engineering and Computer Science, he also works with UCF's Center for Advanced Turbomachinery and Energy Research.
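For readers unfamiliar with the Wells-Riley model named above: it estimates airborne infection probability as P = 1 - exp(-Iqpt/Q), a steady-state balance of infectious "quanta" in room air. A minimal sketch follows; the parameter values are illustrative assumptions, not numbers from the UCF study.

```python
# Wells-Riley airborne infection probability: P = 1 - exp(-I*q*p*t/Q).
# Parameter values below are illustrative assumptions only.
import math

def wells_riley(I, q, p, t, Q):
    """I: infectors; q: quanta/h per infector; p: breathing rate (m3/h);
    t: exposure time (h); Q: clean-air supply rate (m3/h)."""
    return 1.0 - math.exp(-I * q * p * t / Q)

# one infected occupant, 50-minute class, modest ventilation
print(f"P = {wells_riley(I=1, q=25, p=0.5, t=50/60, Q=300):.3f}")
```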
Pollution
2021
April 2, 2021
https://www.sciencedaily.com/releases/2021/04/210402095934.htm
Scientists turn to deep learning to improve air quality forecasts
Air pollution from the burning of fossil fuels impacts human health but predicting pollution levels at a given time and place remains challenging, according to a team of scientists who are turning to deep learning to improve air quality estimates. Results of the team's study could be helpful for modelers examining how economic factors like industrial productivity and health factors like hospitalizations change with pollution levels.
"Air quality is one of the major issues within an urban area that affects people's lives," said Manzhu Yu, assistant professor of geography at Penn State. "Yet existing observations are not adequate to provide comprehensive information that may help vulnerable populations to plan ahead."Satellite and ground-based observations each measure air pollution, but they are limited, the scientists said. Satellites, for instance, may pass a given location at the same time each day and miss how emissions vary at different hours. Ground-based weather stations continuously collect data but only in a limited number of locations.To address this, the scientists used deep learning, a type of machine learning, to analyze the relationship between satellite and ground-based observations of nitrogen dioxide in the greater Los Angeles area. Nitrogen dioxide is largely associated with emissions from traffic and power plants, the scientists said."The problem right now is nitrogen dioxide varies a lot during the day," Yu said. "But we haven't had an hourly, sub-urban scale product available to track air pollution. By comparing surface level and satellite observations, we can actually produce estimates with higher spatial and temporal resolution."The learned relationship allowed the researchers to take daily satellite observations and create hourly estimates of atmospheric nitrogen dioxide in roughly 3-mile grids, the scientists said. They recently reported their findings in the journal "The challenge here is whether we can find a linkage between measurements from earth's surface and satellite observations of the troposphere, which are actually far away from each other. That's where deep learning comes in."Deep learning algorithms operate much like the human brain and feature multiple layers of artificial neurons for processing data and creating patterns. The system learns and trains itself based on connections it finds within large amounts of data, the scientists said.The scientists tested two deep-learning algorithms and found the one that compared the ground-based observations directly to the satellite observations more accurately predicted nitrogen dioxide levels. Adding information like meteorological data, elevation and the locations of the ground-based stations and major roads and power plants improved the prediction accuracy further.Yu said the study could be repeated for other greenhouse gases and applied to different cities or on regional and continental scales, the scientists said. In addition, the model could be updated when new, higher-resolution satellites are launched."With a high spatiotemporal resolution, our results will facilitate the study between air quality and health issues and improve the understanding of the dynamic evolution of airborne pollutants," Yu said.
Pollution
2021
April 1, 2021
https://www.sciencedaily.com/releases/2021/04/210401151245.htm
New promise of forecasting meteotsunamis
On the afternoon of April 13, 2018, a large wave of water surged across Lake Michigan and flooded the shores of the picturesque beach town of Ludington, Michigan, damaging homes and boat docks, and flooding intake pipes. Thanks to a local citizen's photos and other data, NOAA scientists reconstructed the event in models and determined this was the first ever documented meteotsunami in the Great Lakes caused by an atmospheric inertia-gravity wave.
An atmospheric inertia-gravity wave is a wave of air that can run from 6 to 60 miles long and is created when a mass of stable air is displaced by an air mass with significantly different pressure. This sets in motion a wave of air with rising and falling pressure that can influence the water below, as it synchronizes with water movement on the lake's surface like two singers harmonizing. "That meteotsunami was hands down off the chart awesome," said Debbie Maglothin of Ludington, who took photos of the event. "The water in between the breakwaters didn't go down like the water on the outside of them, so it created waterfalls that cascaded over the breakwaters. Had this event occurred during summer it could have washed people right off the breakwaters." Meteotsunamis generated from this type of atmospheric condition are common around the globe, but in the Great Lakes, the few well documented meteotsunamis have been driven by sudden severe thunderstorms where both winds and air pressure changes have played significant roles. While there are currently no forecast models that effectively predict meteotsunamis in the U.S., new NOAA research based on the Ludington wave demonstrates that existing NOAA numerical weather prediction models and hydrodynamic forecast models may enable scientists to predict these meteotsunami-driving atmospheric waves minutes to hours in advance. The research is published in a special journal issue. "The good news with this type of meteotsunami is that it is easier to predict than ones triggered by thunderstorms," said Eric Anderson, an oceanographer at NOAA's Great Lakes Environmental Research Laboratory and lead author of the study. "Our short-range weather models can pick up these atmospheric pressure waves, whereas predicting thunderstorms is more difficult." Meteotsunamis are a lesser known category of tsunami. Unlike the more well known tsunami -- such as the catastrophic 2004 Boxing Day tsunami in Indonesia, which was caused by an earthquake on the seafloor -- meteotsunamis are caused by weather, in particular some combination of changing air pressure, strong winds and thunderstorm activity. "Because the lakes are relatively small, meteotsunamis typically need more than a jump in air pressure to drive them," said Anderson. "That's where the thunderstorms and wind come in to give them a push." Meteotsunamis occur around the world, and are known to occur in the United States primarily on the Great Lakes and along the East and Gulf of Mexico coasts. Meteotsunami waves in the Great Lakes can be particularly insidious because they can bounce off the shoreline and come back again when the skies are clear. They are relatively rare and typically small, the largest producing three to six foot waves, which only occur about once every 10 years. Predicting these waves in advance would give communities potentially life-saving warnings and would allow residents and businesses to take measures to better protect property. The Ludington meteotsunami resulted in some property damage but no serious injuries. Had the meteotsunami struck in the summer when swimmers, anglers and vacationers flock to the lakeshore beaches, parks and waters, it might have been a different story, as was the case with a meteotsunami that took the lives of eight people in Chicago in June 1954. "It's a gap in our forecasting," said Anderson. "With this study and other research we are getting closer to being able to predict them in advance."
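The "synchronizing" Anderson describes is often formalized in the meteotsunami literature as Proudman resonance: amplification peaks when the pressure disturbance moves at about the shallow-water wave speed, c = sqrt(g*h). A back-of-envelope sketch with assumed depths follows; these numbers are not from the study.

```python
# Shallow-water wave speed c = sqrt(g*h): an atmospheric disturbance moving
# near this speed can pump energy into the water wave. Depths are assumed.
import math

g = 9.81  # gravity, m/s^2
for depth_m in (20, 50, 100):
    c = math.sqrt(g * depth_m)
    print(f"depth {depth_m:3d} m -> resonant disturbance speed ~ "
          f"{c:4.1f} m/s ({c * 3.6:5.1f} km/h)")
```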
Pollution
2021
March 31, 2021
https://www.sciencedaily.com/releases/2021/03/210331114815.htm
How Middle East dust intensifies summer monsoons on Indian subcontinent
New research from the University of Kansas, now published, examines how dust lofted from Middle East deserts intensifies summer monsoons on the Indian subcontinent.
"We know that dust coming from the desert, when lifted by strong winds into the atmosphere, can absorb solar radiation," said lead author Qinjian Jin, lecturer and academic program associate with KU's Department of Geography & Atmospheric Science. "The dust, after absorbing solar radiation, becomes very hot. These dust particles suspended in the atmosphere can heat the atmosphere enough that the air pressure will change -- and it can result in changes in the circulation patterns, like the winds."This phenomenon, dubbed an "elevated heat pump," drives moisture onto the Indian subcontinent from the sea."The Indian summer monsoon is characterized by strong winds in the summer," Jin said. "So once the winds change, the moisture transport from ocean to land will change, and consequently they will increase the precipitation. The precipitation is very important for people living in South Asia, especially India, and important for agriculture and drinkable water."While the dust from the Middle East boosts the power of the monsoons on the Indian subcontinent, there is also a reverse effect that results in a positive feedback loop where the monsoons can increase the winds in the Middle East to produce yet more dust aerosols."The monsoon can influence dust emission," Jin said. "When we have a stronger monsoon, we have heating in the upper atmosphere. The convection associated with the monsoon can go up to a very high elevation, as much as 10 kilometers. When this pattern of air over the monsoon is heated, you produce something like a wave. Across the area, you'll have high pressure, then low pressure, then high pressure. Those waves can transport air to the Middle East. The air comes upward over the Indian subcontinent, then goes to the Middle East and goes downward -- and when the downward air strikes the surface, it can pick up a lot of dust aerosols."Jin's co-authors on the paper are Bing Pu, assistant professor of geography & atmospheric science at KU; Jiangfeng Wei of the Nanjing University of Information Science and Technology in Nanjing, Jiangsu, China; William K.M. Lau of the University of Maryland and Chien Wang of Laboratoire d'Aerologie, CNRS/UPS, Toulouse, France.Their work encompassed a review of literature on the relationship between Middle East dust and the Indian summer monsoons, along with original research using supercomputers to better understand the phenomenon.While it has been known that an "elevated heat pump" exists due to dust coming from the Arabian Peninsula, Jin and his colleagues argue for another source in South Asia fueling the effect of aerosolized dust upon the Indian summer monsoons: the Iranian Plateau."The Iranian Plateau is located between the Middle East and the Tibetan Plateau, and the Iranian Plateau is also very high," Jin said. "The land at high elevation, when solar radiation reaches the surface, will become very hot. Over this hot Iranian Plateau, we can expect changes to the monsoon circulation and at the same time the hot air over the Iranian Plateau can also strengthen the circulation over the deserts of the Arabian Peninsula. So, the Iranian Plateau can increase dust emission from the Middle East, as well as monsoon circulation and monsoon precipitation. The Iranian Plateau is another driver that can explain the relationship between Middle East dust and Indian summer monsoon."Jin said the paper explores three other mechanisms influencing the Indian summer monsoon. 
One is the snow-darkening effect, where black carbon and dust reduce snow's reflectivity, warming the land and the troposphere above. Another is the solar-dimming effect, where aerosols in the atmosphere cause the land surface to cool. Lastly, the research considers how aerosolized dust can serve as ice-cloud nuclei, which can "alter the microphysical properties of ice clouds and consequently the Indian summer monsoon rainfall."The KU researcher said understanding these mechanisms and climactic effects of dust will prove to be of increasing importance in the face of global climate change."This is especially true for Asia -- there are climate projections showing that in some areas of Asia, the land will become drier," Jin said. "So, we expect more dust emissions and dust will play a more important role for some time to come in Asia. We have a lot of anthropogenic emission, or air pollution, in eastern China, East Asia and India. But as people try to improve our air quality, the ratio between natural dust to anthropogenic aerosol will increase -- that means dust will play a more important role in the future."
Pollution
2021
March 31, 2021
https://www.sciencedaily.com/releases/2021/03/210331103603.htm
Psychological interventions can reduce engine idling and improve air quality
New research by the University of Kent has found that using low-cost psychological interventions can reduce vehicle engine idling and in turn improve air quality, especially when there is increased traffic volume at railway level crossings.
A team of psychologists led by Professor Dominic Abrams, Dr Tim Hopthrow and Dr Fanny Lalot at the University's School of Psychology found that using carefully worded road signage can decrease the number of drivers leaving engines idling during queues at crossing barriers. The research, which was funded by Canterbury City Council following a successful grant bid to the Department for Environment, Food and Rural Affairs (DEFRA), observed 6,049 drivers' engine idling at the St Dunstan's and St Stephen's level crossings in Canterbury, Kent. The researchers tested the effects of three intervention signs fixed to lampposts, which amplified existing signs requesting drivers to switch off their engines. The social norm and outcome efficacy messages successfully increased the proportion of drivers who turned off their engines, by 42% and 25%, respectively. This reduction in vehicle idling significantly reduced concentrations of atmospheric particulate matter (PM2.5) two metres above ground level. The presence of larger numbers of other drivers boosted the impact of the social norm road signage. These findings demonstrate that drivers may feel a stronger urge to conform to the norm of turning their engines off when those ahead of them in traffic do too. This reduces harmful emissions when it is most urgent to do so. As a result of the research, which has now been published, Canterbury City Council has installed permanent road signage at the St Dunstan's, St Stephen's and Sturry railway level crossings. Professor Abrams said: 'People have many creative ideas about how to improve air quality, but how do we know which will work? This research used a scientific method that enabled us to design effective messages to change people's behaviour, improving the air quality for themselves and others. Just as importantly, we have also discovered types of messages that do not work so well. This approach should also work when planning ways to encourage other behaviours that can improve air quality, health and quality of the environment.' Kelly Haynes, Environmental Health Officer -- Air Quality at Canterbury City Council, said: 'Improving air quality in the district is a major focus of the council and research like this is vital to that work. 'The results clearly show the right messages in the right locations can be really effective in reducing the number of people leaving their engines running, which is one of the main contributors to poor air quality in our city. 'These signs are just one of many things we're doing to tackle air quality, including the introduction of a new hybrid car club in Canterbury and plans to install more electric vehicle charging points across the district.'
Pollution
2021
March 31, 2021
https://www.sciencedaily.com/releases/2021/03/210331085742.htm
Evidence of DNA collection from air
Researchers from Queen Mary University of London have shown for the first time that animal DNA shed within the environment can be collected from the air.
The work is a proof-of-concept study and has now been published. Living organisms such as plants and animals shed DNA into their surrounding environments as they interact with them. In recent years, eDNA has become an important tool to help scientists identify species found within different environments. However, whilst a range of environmental samples, including soil and air, have been proposed as sources of eDNA, until now most studies have focused on the collection of eDNA from water. In this study, the researchers explored whether eDNA could be collected from air samples and used to identify animal species. They first took air samples from a room which had housed naked mole-rats, a social rodent species that live in underground colonies, and then used existing techniques to check for DNA sequences within the sampled air. Using this approach, the research team showed that airDNA sampling could successfully detect mole-rat DNA within the animal's housing and from the room itself. The scientists also found human DNA in the air samples, suggesting a potential use of this sampling technique for forensic applications. Dr Elizabeth Clare, Senior Lecturer at Queen Mary University of London and first author of the study, said: "The use of eDNA has become a topic of increasing interest within the scientific community, particularly for ecologists or conservationists looking for efficient and non-invasive ways to monitor biological environments. Here we provide the first published evidence to show that animal eDNA can be collected from air, opening up further opportunities for investigating animal communities in hard to reach environments such as caves and burrows." The research team are now working with partners in industry and the third sector, including the company NatureMetrics, to bring some of the potential applications of this technology to life. Dr Clare added: "What started off as an attempt to see if this approach could be used for ecological assessments has now become much more, with potential applications in forensics, anthropology and even medicine. "For example, this technique could help us to better understand the transmission of airborne diseases such as Covid-19. At the moment social distancing guidelines are based on physics and estimates of how far away virus particles can move, but with this technique we could actually sample the air and collect real-world evidence to support such guidelines." The project was supported by Queen Mary's Impact Acceleration Accounts (IAAs), strategic awards provided to institutions by UK Research and Innovation (UKRI) that support knowledge exchange (KE) and help researchers generate impact from their research.
Pollution
2021
March 30, 2021
https://www.sciencedaily.com/releases/2021/03/210330121311.htm
Environmental antimicrobial resistance driven by poorly managed urban wastewater
Researchers from Newcastle University, UK, working with colleagues at King Mongkut's University of Technology Thonburi (KMUTT) in Thailand and the Institute of Urban Environment of the Chinese Academy of Sciences, analysed samples of water and sediment taken from aquaculture ponds and nearby canals at five locations in central Thailand's coastal region.
The research, which was part-funded by an institutional links grant awarded by the Newton Fund via the British Council and which has now been published, found high levels of antimicrobial resistance (AMR) genes in canals receiving urban wastewater. In comparison, they found a low number of AMR genes in all of the water and sediment samples collected from the aquaculture ponds. Aquaculture is the fastest growing animal food production sector globally, and over 91% of global aquaculture is now produced in Asia. The worldwide increase in demand for farmed fish, shrimp and other shellfish has led to the widespread use of antibiotics in aquaculture, and there have been concerns that this is driving environmental AMR, threatening global food production systems. In recent years, the Thai government has introduced measures aimed at tackling AMR in aquaculture, including reducing the amount of antibiotics used in the industry and routinely monitoring antibiotic residues in aquaculture produce. Dr David Werner, from Newcastle University, said: "We found no evidence that aquaculture is driving environmental AMR. In fact, the data suggests that small-scale aquaculture farmers are complying with Thai government One Health policies to reduce antimicrobial use in aquaculture. "Wide and regular monitoring of environmental antibiotic resistance with high-throughput diagnostic tools can identify pollution hot-spots and sources to pinpoint the most effective countermeasures. This study provides a further line of evidence for the importance of safely managed sanitation for combatting antibiotic resistance. Currently only around half of total domestic wastewater in Thailand is treated, and our findings have identified an urgent need to improve urban sanitation in the country's coastal aquaculture region, for the protection of global food production systems." The global spread of AMR is one of the greatest health threats to human, animal and environmental health. Without effective sanitation and adequate treatment of wastewater, bacteria can evolve quickly, increasing resistance to antibiotic medicines. This has led to fears that so-called superbugs -- bacteria that are resistant to all antibiotics -- will compromise our ability to combat many new biological infections. Reducing the spread of AMR is a World Health Organization (WHO) top five priority, and guidance published by the WHO in 2020 provides a framework for countries to create their own national action plans that suit their own particular regional setting. The guidance included contributions from Professor David Graham, also from Newcastle University, and reflects growing evidence, including research by Professor Graham, which suggests that the spread of AMR will not be solved by prudent antibiotic use alone and that environmental factors may be of equal or greater importance. Professor Graham, who was also part of the team involved with this aquaculture study, said: "The only way we are going to win the fight against antibiotic resistance is to understand and act on all of the pathways that accelerate its spread. Although the types and drivers of resistance are diverse and vary by region and country, there are common roots to its spread -- excess antibiotic use, pollution, poor water quality, and poor sanitation. "This new work is crucial because it exemplifies how inadequate sanitation can affect the food supply, and may be among the strongest drivers of AMR spread." The work in Thailand is just one example of how experts from Newcastle University are working with scientists from countries including China, Malaysia, India, Ethiopia, Tanzania, and Nepal to track down the sources of waterborne hazards in rivers and their associated food production systems. By working together to carry out comprehensive water quality assessments, they are helping to address the global health challenges of safe water, safe food, and controlling AMR and infectious disease.
Pollution
2021
March 30, 2021
https://www.sciencedaily.com/releases/2021/03/210330092530.htm
64% of global agricultural land at risk of pesticide pollution?
The study, now published, maps pesticide pollution risk worldwide and finds that 64 percent of global agricultural land is at risk of pesticide pollution.
The study examined risk to soil, the atmosphere, and surface and ground water. The map also revealed Asia houses the largest land areas at high risk of pollution, with China, Japan, Malaysia, and the Philippines at highest risk. Some of these areas are considered "food bowl" nations, feeding a large portion of the world's population. University of Sydney Research Associate and the study's lead author, Dr Fiona Tang, said the widespread use of pesticides in agriculture -- while boosting productivity -- could have potential implications for the environment, human and animal health. "Our study has revealed 64 percent of the world's arable land is at risk of pesticide pollution. This is important because the wider scientific literature has found that pesticide pollution can have adverse impacts on human health and the environment," said Dr Tang. Pesticides can be transported to surface waters and groundwater through runoff and infiltration, polluting water bodies and thereby reducing the usability of water resources. "Although the agricultural land in Oceania shows the lowest pesticide pollution risk, Australia's Murray-Darling basin is considered a high-concern region both due to its water scarcity issues, and its high biodiversity," said co-author Associate Professor Federico Maggi from the School of Civil Engineering and the Sydney Institute of Agriculture. "Globally, our work shows that 34 percent of the high-risk areas are in high-biodiversity regions, 19 percent in low- and lower-middle-income nations and five percent in water-scarce areas," said Dr Tang. There is concern that overuse of pesticides will tip the balance, destabilise ecosystems and degrade the quality of water sources that humans and animals rely on to survive. Global pesticide use is expected to increase as the global population heads towards an expected 8.5 billion by 2030. "In a warmer climate, as the global population grows, the use of pesticides is expected to increase to combat the possible rise in pest invasions and to feed more people," said Associate Professor Maggi. Dr Tang said: "Although protecting food production is essential for human development, reducing pesticide pollution is equivalently crucial to protect the biodiversity that maintains soil health and functions, contributing towards food security." Co-author Professor Alex McBratney, Director of the Sydney Institute of Agriculture at the University of Sydney, said: "This study shows it will be important to carefully monitor residues on an annual basis to detect trends in order to manage and mitigate risks from pesticide use." "We recommend a global strategy to transition towards a sustainable, global agricultural model that reduces food wastage while reducing the use of pesticides," said the authors of the paper.
Pollution
2021
March 29, 2021
https://www.sciencedaily.com/releases/2021/03/210329200307.htm
Air pollution and physical exercise: When to do more or less
Physical activity is important in preventing heart and blood vessel disease in young people so long as they don't undertake very strenuous activity on days when air pollution levels are high, according to a nationwide study of nearly 1.5 million people published today (Tuesday).
Until now, little has been known about the trade-offs between the health benefits of physical activity taking place outdoors and the potentially harmful effects of air pollution. Previous research by the authors of the current study had investigated the question in middle-aged people at a single point in time, but this is the first time that it has been investigated in people aged 20-39 years over a period of several years. In addition, the researchers wanted to see what happens when people increase or decrease their physical activity over time. The researchers from Seoul National University College of Medicine (South Korea), led by Professor Sang Min Park, looked at information from the National Health Insurance Service (NHIS) in South Korea for 1,469,972 young Koreans living in cities, who underwent two consecutive health examinations during two screening periods: 2009-2010 and 2011-2012. They followed up the participants from January 2013 to December 2018. At each health check-up the participants completed a questionnaire asking about their physical activity in the past seven days, and this information was converted into units of metabolic equivalent task (MET) minutes per week (MET-min/week). The participants were divided into four groups: 0, 1-499, 500-999 and 1000 or more MET-min/week. European Society of Cardiology guidelines recommend people should try to do 500-999 MET-min/week, and this can be achieved by, for example, running, cycling or hiking for 15-30 minutes five times a week, or brisk walking, doubles tennis or slow cycling for 30-60 minutes five times a week. [1] The researchers used data from the National Ambient Air Monitoring System in South Korea to calculate annual average levels of air pollution, in particular the levels of small particulate matter that are less than or equal to 10 or 2.5 microns in diameter, known as PM10 and PM2.5. [2] The amount of exposure to air pollution was categorised at two levels: low to moderate (less than 49.92 and 26.43 micrograms per cubic metre, µg/m3, for PM10 and PM2.5 respectively), and high (49.92 and 26.46 µg/m3 or more, respectively). Dr Seong Rae Kim, first author of the paper, said: "We found that in young adults aged 20-39 years old, the risk of cardiovascular diseases, such as stroke and heart attack, increased as the amount of physical activity decreased between the two screening periods in the group with low levels of exposure to air pollution. "However, in the group with high levels of exposure to air pollution, increasing the amount of physical activity to more than 1000 MET-min/week, which is more than internationally recommended levels for physical activity, could adversely affect cardiovascular health. This is an important result suggesting that, unlike middle-aged people over 40, excessive physical activity may not always be beneficial for cardiovascular health in younger adults when they are exposed to high concentrations of air pollution." He continued: "Ultimately, it is imperative that air pollution is improved at the national level in order to maximise the health benefits of exercising in young adults. These are people who tend to engage in physical activity more than other age groups while their physical ability is at its best. If air quality is not improved, this could result in the incidence of cardiovascular diseases actually increasing despite the health benefits gained from exercise." The researchers adjusted their results to take account of factors that could affect them, such as age, sex, household income, body mass index, smoking and alcohol consumption. During the follow-up period there were 8,706 cardiovascular events. Among people exposed to high levels of PM2.5 air pollution, those who increased their exercise from 0 to 1000 MET-min/week or more between the two screening periods had a 33% increased risk of cardiovascular disease during the follow-up period compared to those who were physically inactive and did not increase their exercise, although this result was slightly weaker than that needed to achieve statistical significance. This means an extra 108 people per 10,000 might develop cardiovascular disease during the follow-up period. Among people exposed to low to moderate levels of PM2.5, those who increased their physical activity from none to 1000 MET-min/week or more had a 27% reduced risk of developing cardiovascular disease compared to those who remained inactive, although this result was also not quite statistically significant. This means 49 fewer people per 10,000 might develop cardiovascular disease during the follow-up period. Dr Kim said: "These results are very close to statistical significance. In fact, a further analysis ... of our paper shows that statistical significance was achieved for increasing and decreasing amounts of physical activity." For low to moderate levels of PM10 air pollution, there was a statistically significant 38% or 22% increased risk of cardiovascular disease among people who started off doing 1000 MET-min/week or more and then reduced their activity to none or to 1-499 MET-min/week, respectively, compared to people who maintained the same high level of activity. These results were statistically significant and mean that 74 and 66 extra people per 10,000 respectively would develop cardiovascular problems during the follow-up period. Professor Sang Min Park, who led the research, said: "Overall, our results show that physical activity, particularly at the level recommended by European Society of Cardiology guidelines, is associated with a lower risk of developing heart and blood vessel disease among young adults. However, when air pollution levels are high, exercising beyond the recommended amount may offset or even reverse the beneficial effects." The study cannot show that air pollution causes the increased cardiovascular risk, only that it is associated with it. Other limitations are that there was no information on whether or not the exercise took place indoors or outdoors; participants may not have remembered correctly the amount of exercise they took in the seven days before they attended their screening interview, although this is unlikely; PM2.5 data were only measured in three major cities; and the researchers did not investigate the short-term effects of exposure to air pollution. [1] Examples of activity for each of the MET-min/week categories: 0 MET-min/week: no physical activity at all; 1-499 MET-min/week: running, bicycling, hiking etc. less than 15 minutes a day and less than 5 times a week, or brisk walking, doubles tennis, slow cycling, etc., less than 30 minutes a day and less than 5 times a week; 500-999 MET-min/week: running, bicycling, hiking etc. 15-30 minutes a day and about 5 times a week, or brisk walking, doubles tennis, slow cycling, etc., 30-60 minutes a day and about 5 times a week; more than 1000 MET-min/week: running, bicycling, hiking etc. more than 30 minutes a day and about 5 times a week, or brisk walking, doubles tennis, slow cycling, etc., more than 60 minutes a day and about 5 times a week. [2] A micron is one millionth of a metre.
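The MET-min/week categories in footnote [1] follow from simple arithmetic: MET intensity times minutes per session times sessions per week. A small sketch follows, using typical compendium MET values as assumptions (the paper's exact conversion factors are not given here); both examples land in the 500-999 band, matching the guideline examples quoted above.

```python
# MET-min/week = MET intensity * minutes per session * sessions per week.
# MET values (~4 brisk walking, ~8 running) are assumed, typical figures.
def met_min_per_week(met, minutes, sessions):
    return met * minutes * sessions

def category(v):
    if v == 0: return "0"
    if v < 500: return "1-499"
    if v < 1000: return "500-999"
    return "1000 or more"

for label, met, minutes in [("brisk walking (~4 METs)", 4, 30),
                            ("running (~8 METs)", 8, 20)]:
    v = met_min_per_week(met, minutes, 5)  # five sessions a week
    print(f"{label}, {minutes} min x 5/week: {v} MET-min/week -> {category(v)}")
```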
Pollution
2021
March 29, 2021
https://www.sciencedaily.com/releases/2021/03/210329160002.htm
Probing wet fire smoke in clouds: Can water intensify Earth's warming?
A first-of-its-kind instrument that samples smoke from megafires and scans humidity will help researchers better understand the scale and long-term impact of fires -- specifically how far and high the smoke will travel; when and where it will rain; and whether the wet smoke will warm the climate by absorbing sunlight.
"Smoke containing soot and other toxic particles from megafires can travel thousands of kilometers at high altitudes where winds are fast and air is dry," said Manvendra Dubey, a Los Alamos National Laboratory atmospheric scientist and co-author on a paper published last week in The new instrument circumvents this problem by developing a gentler technique that uses a low-power, light-emitting diode to measure water's effect on scattering and absorbtion by wildfire smoke and hence its growth. By sampling the smoke and scanning the humidity from dry to very humid conditions while measuring its optical properties, the instrument mimicks what happens during cloud and rain formation, and the effects of water are measured immediately. Laboratory experiments show for the first time that water coating the black soot-like material can enhance the light absorption by up to 20 percent.The instrument will next be tested and the water effects probed in smoke from wildfires sampled at Los Alamos' Center for Aerosol-gas Forensics (CAFÉ). In addition, the effect of water uptake by soot in polluted air on deep convective storms will be measured at the DOE's Atmospheric Radiation Monitoring TRACER-CAT campaign led by Los Alamos in Houston next year.Dubey worked with Christian M. Carrico of the New Mexico Institute of Mining and Technology on a research team that included scientists from Los Alamos, New Mexico Tech, Michigan Technological University, and Aerodyne Research, Inc. The novel findings of experiments were performed by the Los Alamos Director's postdoctoral fellow Kyle Gorkowski and Department of Energy graduate awardee Tyler Capek.
Pollution
2021
March 29, 2021
https://www.sciencedaily.com/releases/2021/03/210329122817.htm
Satellites contribute significant light pollution to night skies
Scientists reported new research results today suggesting that artificial objects in orbit around the Earth are brightening night skies on our planet significantly more than previously understood.
The research has been accepted for publication. "Our primary motivation was to estimate the potential contribution to night sky brightness from external sources, such as space objects in Earth's orbit," said Miroslav Kocifaj of the Slovak Academy of Sciences and Comenius University in Slovakia, who led the study. "We expected the sky brightness increase would be marginal, if any, but our first theoretical estimates have proved extremely surprising and thus encouraged us to report our results promptly." The work is the first to consider the overall impact of space objects on the night sky rather than the effect of individual satellites and space debris affecting astronomers' images of the night sky. The team of researchers, based at institutions in Slovakia, Spain and the US, modelled the space objects' contribution to the overall brightness of the night sky, using the known distributions of the sizes and brightnesses of the objects as inputs to the model. The study includes both functioning satellites as well as assorted debris such as spent rocket stages. While telescopes and sensitive cameras often resolve space objects as discrete points of light, low-resolution detectors of light such as the human eye see only the combined effect of many such objects. The effect is an overall increase in the diffuse brightness of the night sky, potentially obscuring sights such as the glowing clouds of stars in the Milky Way, as seen away from the light pollution of cities. "Unlike ground-based light pollution, this kind of artificial light in the night sky can be seen across a large part of the Earth's surface," explained John Barentine, Director of Public Policy for the International Dark-Sky Association and a study co-author. "Astronomers build observatories far from city lights to seek dark skies, but this form of light pollution has a much larger geographical reach." Astronomers have expressed unease in recent years about the growing number of objects orbiting the planet, particularly large fleets of communications satellites known informally as 'mega-constellations'. In addition to crowding the night sky with more moving sources of artificial light, the arrival of this technology increases the probability of collisions among satellites or between satellites and other objects, generating further debris. Recent reports sponsored by the US National Science Foundation and the United Nations Office for Outer Space Affairs identified mega-constellations as a threat to the continued utility of astronomy facilities on the ground and in low-Earth orbit. In the UK, the Royal Astronomical Society has established several working groups to understand the impact of mega-constellations on optical and radio astronomical facilities used by scientists. The results published today imply a further brightening of the night sky proportional to the number of new satellites launched and their optical characteristics in orbit. Satellite operators like SpaceX have recently worked to lower the brightness of their spacecraft through design changes. Despite these mitigating efforts, though, the collective effect of a sharp increase in the number of orbiting objects stands to change the experience of the night sky for many across the globe. The researchers hope that their work will change the nature of the ongoing dialog between satellite operators and astronomers concerning how best to manage the orbital space around the Earth. "Our results imply that many more people than just astronomers stand to lose access to pristine night skies," Barentine said. "This paper may really change the nature of that conversation."
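To see why many individually invisible objects can still brighten the sky diffusely, note that fluxes add even when each source is below naked-eye visibility. The toy calculation below combines an assumed population of faint magnitudes into one equivalent brightness; the object count and magnitude distribution are illustrative assumptions, not the authors' model inputs.

```python
# Combining many faint sources: flux scales as 10**(-0.4*m), and magnitudes
# combine through summed flux. Population parameters here are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_objects = 10_000                      # assumed sunlit objects above the horizon
mags = rng.normal(8.0, 1.5, n_objects)  # assumed magnitudes, each too faint to see
total_flux = np.sum(10.0 ** (-0.4 * mags))
m_equiv = -2.5 * np.log10(total_flux)
# a single source this bright would outshine any star, though here the light
# is spread across the whole sky as a faint diffuse glow
print(f"combined brightness ~ magnitude {m_equiv:.1f}")
```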
Pollution
2021
March 25, 2021
https://www.sciencedaily.com/releases/2021/03/210325150148.htm
California's diesel emissions rules reduce air pollution, protect vulnerable communities
Extending California's stringent diesel emissions standards to the rest of the U.S. could dramatically improve the nation's air quality and health, particularly in lower income communities of color, finds a new analysis published today.
Since 1990, California has used its authority under the federal Clean Air Act to enact more aggressive rules on emissions from diesel vehicles and engines compared to the rest of the U.S. These policies, crafted by the California Air Resources Board (CARB), have helped the state reduce diesel emissions by 78% between 1990 and 2014, while diesel emissions in the rest of the U.S. dropped by just 51% during the same time period, the new analysis found. The study estimates that by 2014, improved air quality cut the annual number of diesel-related cardiopulmonary deaths in the state in half, compared to the number of deaths that would have occurred if California had followed the same trajectory as the rest of the U.S. Adopting similar rules nationwide could produce the same kinds of benefits, particularly for communities that have suffered the worst impacts of air pollution. "Everybody benefits from cleaner air, but we see time and again that it's predominantly lower income communities of color that are living and working in close proximity to sources of air pollution, like freight yards, highways and ports. When you target these sources, it's the highly exposed communities that stand to benefit most," said study lead author Megan Schwarzman, a physician and environmental health scientist at the University of California, Berkeley's School of Public Health. "It's about time, because these communities have suffered a disproportionate burden of harm." The study also points out that exposure to fine particulate matter (PM2.5) is linked to illness and premature death, and that diesel exhaust, which consists of both particles and gases, contributes significantly to PM2.5 pollution. "There are hundreds of studies around the world that link particulate matter exposure and premature death," said study co-author Álvaro Alvarado, a former air pollution specialist at CARB who now works for OEHHA. "In cities with higher levels of air pollution, there are also higher hospitalization rates for respiratory and cardiovascular illnesses and more emergency room visits for asthma." To improve air quality, CARB's policies have gone beyond federal standards to limit diesel emissions from a variety of mobile sources, including heavy-duty trucks and buses, ships and port equipment, train locomotives, and the engines that power construction equipment and agricultural machinery. In their study, Schwarzman and colleagues catalogued the wide range of CARB policies that target each emissions sector and tracked how changes in diesel emissions corresponded to the implementation of those rules. They then show the impact of CARB policies by comparing California's reductions in diesel emissions to those in the rest of the U.S. Their analysis reveals that CARB's policies reduced emissions to the extent that, by 2014, California was emitting less than half the diesel particulate matter that would be expected had the state followed the same trajectory as the rest of the U.S. One key policy approach that sets California apart is the requirement that older diesel engines be retrofitted to meet strict emissions standards, Schwarzman said. In the rest of the U.S., new diesel engines must meet updated emissions standards, but older, dirtier engines are allowed to operate without upgrades. "The average lifetime of a diesel engine is about 20 years, or a million miles, so waiting for fleet turnover is just too slow," Schwarzman said. "California requires retrofits for existing trucks so that all diesel engines are held to a higher standard. This has made an enormous difference for air quality." Requiring upgrades for the engines that power heavy-duty trucks and buses has reduced California's diesel emissions in that sector by 85% since 1990, the study found. By comparison, the study estimates that if California's heavy-duty vehicle sector had followed the trajectory of other U.S. states, the sector's emissions would have dropped by only 58% in that period. Because the highways, ports and rail yards where diesel engines operate are more likely to be located near lower income communities of color than affluent, white communities, regulating diesel emissions can help correct persistent disparities in air quality and health, said senior study author John Balmes, a Berkeley Public Health professor and professor of medicine at the University of California, San Francisco. "There are truly different levels of exposure to air pollution, and those differences in exposure have been linked to differential health outcomes," said Balmes, who also serves as the physician member of CARB. The study reports that every dollar the state has spent controlling air pollution has generated $38 in benefits from lower rates of illness, premature death and lost productivity attributable to air pollution. As a result, there is no reason why the U.S. as a whole shouldn't adopt diesel emissions standards similar to California's, the authors argue. "In terms of public health, federal air quality policy should be moving toward that of California, because we've shown that it works, and we've also shown that greening transportation can be good for economic growth," Balmes said. "These environmental regulations not only save lives and improve public health, they actually drive innovation and grow the green economy, which is the future." Co-authors of the study include Samantha Schildroth of UC Berkeley and May Bhetraratana of CARB. This research was supported in part by the California Breast Cancer Research Program under grant 23QB-1881.
Pollution
2021
March 25, 2021
https://www.sciencedaily.com/releases/2021/03/210325120810.htm
'Climbing droplets' could lead to more efficient water harvesting
University of Texas at Dallas researchers have discovered that a novel surface they developed to harvest water from the air encourages tiny water droplets to move spontaneously into larger droplets.
When researchers placed microdroplets of water on their liquid-lubricant surface, the microdroplets propelled themselves to climb, without external force, into larger droplets along an oily, ramp-shaped meniscus that forms from the lubricant around the larger droplets. The "coarsening droplet phenomenon" formed droplets large enough for harvesting."This meniscus-mediated climbing effect enabled rapid coalescence on hydrophilic surfaces and has not been reported before. We have discovered a new physical phenomenon that makes it possible to harvest water more rapidly from air without external force," said Dr. Xianming Dai, assistant professor of mechanical engineering in the Erik Jonsson School of Engineering and Computer Science, who led the work. "If we don't have this new phenomenon, the droplets would be too small, and we could hardly collect them."Microdroplets of water on a hydrophilic SLIPS surface (left) propel themselves to climb, without external force, into larger droplets along an oily, ramp-shaped meniscus that forms from the lubricant around the larger droplets. On the right, the video clip shows how microdroplets behave on a solid slippery surface.The findings, published March 25 in Developing new technologies that harvest water from the atmosphere is a growing field of research as more and more people live in areas where fresh water is in short supply. Scientists estimate that 4 billion people live in regions with severe freshwater shortages for at least one month each year. This number is predicted to rise to between 4.8 billion and 5.7 billion by 2050. Reasons include climate change, polluted water supplies and increased demand due to both population growth and changes in usage behavior.The key to the microdroplet's self-climbing action is a surface that Dai and his colleagues previously developed. Their liquid lubricant, a hydrophilic slippery liquid-infused porous surface (SLIPS), has a unique hydrophilic nature for water harvesting and rapidly directs water droplets into reservoirs.Researchers discovered the self-propelling droplet phenomenon on their surface by accident. They were testing different lubricants to determine which could best facilitate water harvesting when they saw the smaller water droplets propel themselves into larger droplets. That led them to collaborate with Dr. Howard A. Stone, chair of mechanical and aerospace engineering at Princeton University and an expert in fluid dynamics, to investigate the underlying physics of the phenomenon."Dr. Dai and his team led this work. The ideas are creative, and they made a series of observations in the laboratory that allowed them to understand the underlying physics and its potential applications," Stone said. "They reached out to me to discuss the mechanism, and we had several Skype or Zoom meetings and email exchanges. It was all very interesting and stimulating. I enjoyed very much seeing the ideas evolve into the published paper."As water vapor condenses on the liquid-lubricant surface, oil from the lubricant forms a meniscus, or curvature, around the droplets. The meniscus looks like an upward-curving ramp, which acts like a bridge along which microdroplets spontaneously climb toward and coalesce with larger water droplets, a process the researchers call the coarsening effect. 
The properties of the lubricated surface prevent the water droplets from being completely submerged in the oil, so they float on the oil, which allows them to climb. "The oil meniscus acts like a bridge, so the droplet can climb on it," Dai said. "The small droplet actively looks for a larger one. After they are connected by the bridge, they become one." As tiny water droplets condense from air on a cooled surface, they become thermal barriers that prevent further condensation. By allowing rapid droplet collection, the coarsening effect helps clear the surface so new droplets can form, which makes water harvesting faster and more efficient. The self-propelled coarsening droplets on hydrophilic SLIPS rapidly remove condensed submicrometer-sized droplets regardless of how the surface is oriented, a promising advantage over other surfaces used for water harvesting. "We cannot harvest a large amount of water unless we have a rapid harvesting process. The problem with other surfaces is that the small water droplets may evaporate before they can be harvested," Dai said. "Based on our experimental data, the coarsening surface enhanced the water harvesting rate by 200% compared with its counterparts," said Zongqi Guo, a mechanical engineering doctoral student and co-lead author. Dai and his colleagues continue to work on ways to use their lubricant to make sustainable water-harvesting systems that are mobile, smaller, lighter and less expensive. "If we can do that, we can harvest water anywhere that has air, which is particularly important in regions where water is scarce," Dai said. The research was funded by the National Science Foundation and the Army Research Office.
Pollution
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324195141.htm
Even small increases in NO2 levels could be linked to heightened risk of heart and respiratory death
Even small increases in nitrogen dioxide levels in the air may be linked to increases in cardiovascular and respiratory deaths, according to newly published research.
The findings suggest a need to revise and tighten the current air quality guidelines, and to consider stricter regulatory limits for nitrogen dioxide concentrations. Nitrogen dioxide (NO2) is a common air pollutant, measured in micrograms (one-millionth of a gram) per cubic metre of air (µg/m3). Many studies have reported the effects of short term exposure to NO2 on health, but uncertainty has remained about the size and consistency of those effects. To address this uncertainty, a team of international researchers set out to investigate the short term associations between NO2 concentrations and deaths. Their findings are based on daily concentrations of nitrogen dioxide from 398 cities in 22 low to high income countries/regions over a 45-year period (1973 to 2018). Daily weather data, including average temperature and humidity, were also recorded, and death records were obtained from local authorities within each country/region. A total of 62.8 million deaths were recorded over the 45-year study period; 19.7 million (31.5%) were cardiovascular related deaths and 5.5 million (8.7%) were respiratory deaths. On average, a 10 µg/m3 increase in NO2 concentration was associated with small but measurable increases in total, cardiovascular and respiratory mortality. These associations did not change after adjusting for levels of other common air pollutants (sulphur dioxide, carbon monoxide, ozone, and varying sizes of fine particulate matter) obtained from the same fixed site monitoring stations, suggesting that the results withstand scrutiny. The researchers also estimated the proportion of deaths attributable to NO2 exposure, and while they acknowledge that reducing NO2 levels is challenging, they argue that doing so would bring meaningful public health benefits. This is an observational study, so it can't establish cause, and the authors point out that because most of the data were obtained from developed areas, such as Europe, North America, and East Asia, any global generalisations should be made with caution. In addition, there might have been slight changes in air pollution measurements over the decades, and the health data collection might be subject to diagnostic or coding errors. However, strengths included the study's scale, providing enormous statistical power and ensuring the stability of the findings, and uniform analytical methods, allowing for more reliable comparisons across different regions and populations. As such, they say their analysis "provides robust evidence" for the independent associations of short term exposure to NO2 with total, cardiovascular and respiratory mortality. They add: "These findings contribute to a better understanding of how to optimise public health actions and strategies to mitigate air pollution."
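To make the dose-response logic concrete, here is a minimal sketch of how an epidemiological estimate of this kind is applied in practice. All numbers in it (the 0.5% effect size, the baseline death count, the NO2 increment) are hypothetical placeholders, not figures from the study.

```python
# Illustrative sketch (not from the study): how a short-term relative-risk
# estimate per 10 µg/m3 of NO2 translates into excess daily deaths,
# assuming a linear exposure-response relationship.

def excess_deaths(baseline_daily_deaths: float,
                  no2_increase_ugm3: float,
                  pct_increase_per_10ugm3: float) -> float:
    """Excess deaths implied by a linear exposure-response relationship."""
    relative_increase = (pct_increase_per_10ugm3 / 100) * (no2_increase_ugm3 / 10)
    return baseline_daily_deaths * relative_increase

# Example: a hypothetical 0.5% mortality increase per 10 µg/m3,
# applied to 100 baseline daily deaths and a 10 µg/m3 NO2 rise.
print(excess_deaths(100, 10, 0.5))  # -> 0.5 extra deaths per day
```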
Pollution
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324142849.htm
Aerosol formation in clouds
Researchers at the Paul Scherrer Institute PSI have studied for the first time how chemical reactions in clouds can influence the global climate. They found that isoprene, the dominant non-methane organic compound emitted into the atmosphere, can strongly contribute to the formation of organic aerosols in clouds. They published their results today.
Aerosols, a mixture of solid or liquid particles suspended in the air, play an important role in Earth's climate. Aerosols originate either from natural or human sources. They influence Earth's radiation balance by interacting with sunlight and forming clouds. However, their effect remains the single most significant uncertainty in climate models. One substance that is very common in the atmosphere is isoprene, an organic compound whose reactions in the gas phase are relatively well understood. Isoprene is given off by trees and can produce aerosols when it is oxidised. How isoprene and its reaction products react in cloud droplets is still largely unknown. That's why researchers at the Paul Scherrer Institute PSI used a type of flow reactor with wetted walls, together with the most advanced mass spectrometers, to investigate for the first time, under atmospherically relevant conditions, what could be happening chemically inside clouds. "Our experimental setup allows us for the first time to precisely investigate the distribution of organic vapours at the air-water interface under near-environmental conditions," says Houssni Lamkaddam, a researcher in the Laboratory of Atmospheric Chemistry at PSI. "With our apparatus, we can now simulate what happens in clouds." In the special apparatus, a so-called wetting reactor, a thin film of water is maintained on the inside of a quartz tube. A gas mixture containing, among other substances, isoprene, ozone, and so-called hydroxyl radicals is fed into the glass cylinder. UV lamps are installed around the glass cylinder to simulate daylight conditions for some of the experiments. Using this setup, the researchers found that up to 70 percent of the isoprene oxidation products can be dissolved in the water film. The subsequent aqueous oxidation of the dissolved species produces substantial amounts of secondary organic aerosols. On the basis of these analyses, they calculated that the chemical reactions that take place in clouds are responsible for up to 20 percent of the secondary organic aerosols on a global scale. "This is another important contribution to a better understanding of the processes in the atmosphere," sums up Urs Baltensperger, scientific head of the Laboratory of Atmospheric Chemistry at PSI. Earth's radiation balance is a very important factor in the entire climate process and thus also in climate change. "And aerosols play a crucial role in this," says the atmospheric scientist. While aerosols form cloud droplets, this research shows that clouds can also form aerosols through the aqueous chemistry of organic vapours, a process that is well known for sulfate aerosols but is shown here for the organic fraction as well. This new experimental setup, developed at PSI, opens up the possibility of investigating aerosol formation in clouds under near-atmospheric conditions so that these processes can ultimately be included in climate models.
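As a rough illustration of the mass-balance reasoning above, the sketch below chains the article's 70 percent dissolution figure with two invented quantities -- the annual mass of isoprene oxidation products and the aqueous SOA yield. It is a toy calculation, not the PSI analysis.

```python
# Toy mass-balance sketch. Only the 70% dissolved fraction comes from the
# article; the emission mass and aqueous SOA yield are hypothetical.

isoprene_oxidation_products_tg = 100.0   # hypothetical annual mass, Tg/yr
dissolved_fraction = 0.70                # "up to 70 percent" (from the article)
aqueous_soa_yield = 0.10                 # hypothetical conversion efficiency

aqueous_soa_tg = isoprene_oxidation_products_tg * dissolved_fraction * aqueous_soa_yield
print(f"Aqueous SOA formed: {aqueous_soa_tg:.1f} Tg/yr")  # -> 7.0 Tg/yr
```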
Pollution
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324113533.htm
Dangerous landfill pollutants ranked in order of toxicity
Nearly 2,000 active landfills are spread across the U.S., with the majority of garbage discarded by homes and businesses finding its way to a landfill. The resulting chemicals and toxins that build up at these sites can then leach into soil and groundwater, and this "leachate" can present serious hazards to the environment and to the people who live nearby.
To help environmental agencies battle the toxic threats posed by landfills, researchers at the University of Missouri -- in partnership with the USDA Forest Service -- have developed a system that ranks the toxins present in a landfill by order of toxicity and quantity, allowing agencies to create more specific and efficient plans to combat leachate. "Leachate from landfills can cause cancer and other serious harm, and it's a threat whether it's ingested, inhaled or touched," said Chung-Ho Lin, an associate research professor with the MU Center for Agroforestry in the College of Agriculture, Food and Natural Resources. "This is the first time a system has been created that can automatically prioritize the pollutants released from a landfill based on their toxicity and abundance." The system relies on an algorithm created by Elizabeth Rogers, a doctoral student working under Lin's guidance at the University of Missouri and a USDA Pathways Intern. Rogers drew from a previously existing system designed to prioritize chemicals in "fracking" wastewater and adapted it to apply to landfill pollution. Combining the algorithm with three "toxicity databases" that are referenced when analyzing a sample from a landfill, the system takes a traditionally time-consuming and expensive process -- identifying a pollutant and determining its abundance and potential harm -- and makes it routine. The result is a prioritization system that can rank pollutants by taking into account both their overall toxicity and prevalence at a given site. In addition, the prioritization of pollutants can be easily customized based on factors and goals that can vary from site to site. Ronald Zalesny Jr., a supervisory research plant geneticist for the USDA Forest Service who is also mentoring Rogers, worked with Lin and Rogers on the study optimizing the prioritization system and exploring its utility. For him, the ability to easily identify, quantify and rank landfill pollutants meets a very real need. Zalesny Jr. is a principal investigator for a project that harnesses trees to clean up contaminated soils and water at landfills. Through a natural process known as phytoremediation, the poplar and willow trees help degrade, absorb and otherwise inhibit pollutants and the groundwater runoff that carries them. Knowing which pollutants are the most important targets at a given location is crucial, said Zalesny Jr., because different trees employ different methods of removing pollutants from the soil, and no single method will work on every type of pollutant. "In the past, we have mostly targeted the most common pollutants, such as herbicides and contaminants from crude oil," Zalesny Jr. said. "Using this prioritization tool, we could now go to basically any contaminated site, identify the top contaminants and match those contaminants with our trees to create a sustainable, long-term solution for cleaning up pollution." Zalesny Jr.'s project is part of the Great Lakes Restoration Initiative, which seeks to protect the Great Lakes from environmental degradation by providing relevant funding to federal agencies. If contaminated runoff from landfills makes its way into rivers and streams, it could ultimately make its way into the Great Lakes, Zalesny Jr. said. Rogers, who created the algorithm that can quickly sort pollutants by their relative toxicity, sees another important benefit to the system.
While many landfill regulations have not been updated in decades, new classes of contaminants continue to arrive in landfills, posing a problem for those seeking to mitigate their effects. By offering scientists and researchers up-to-date information about hundreds of possible pollutants, the prioritization system could help environmental agencies tackle more of these dangerous new arrivals. "Some of the most potentially harmful compounds that we identified using this scheme were from things like antibiotics or prescription medications, which could have serious impacts on the human endocrine system," Rogers said. "There were also compounds from personal care products. And while we know these newer classes of compounds can have negative impacts, there is still a lot we don't know about them, and they're ending up in landfills. Hopefully the use of this system will encourage more research into their impacts."
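The article does not spell out the scoring rule, but a prioritization of this general shape can be sketched as follows: score each pollutant by its abundance relative to a toxicity reference value, then sort. The rule and the example concentrations below are assumptions for illustration, not the published MU/USDA algorithm.

```python
# Minimal sketch of a toxicity-and-abundance prioritization scheme.
# The scoring rule (concentration divided by a toxicity reference value)
# and the example data are hypothetical.

leachate = {                      # pollutant -> (concentration µg/L, reference level µg/L)
    "ibuprofen":   (120.0, 40.0),
    "atrazine":    (3.0,   3.0),
    "bisphenol A": (25.0,  5.0),
}

def hazard_score(concentration: float, reference: float) -> float:
    """Higher score = more abundant relative to its toxicity threshold."""
    return concentration / reference

ranked = sorted(leachate.items(),
                key=lambda kv: hazard_score(*kv[1]),
                reverse=True)

for name, (conc, ref) in ranked:
    print(f"{name}: score {hazard_score(conc, ref):.1f}")
```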
Pollution
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324113348.htm
Pollutant levels after Hurricane Harvey exceeded lifetime cancer risk in some areas
The unprecedented rainfall from Hurricane Harvey in 2017 brought more than flood damage to southeast Texas. For people living in environmental justice communities such as the Manchester neighborhood near the Houston Ship Channel, heavy rainfall and flooding may have increased risks of exposure to harmful chemicals from nearby industry.
To gain a better understanding of how flooding mobilized pollution in the area, a research team led by Garett Sansom, DrPH, research assistant professor in the Department of Environmental and Occupational Health at the Texas A&M University School of Public Health, analyzed samples of soil from the Manchester neighborhood collected immediately after Hurricane Harvey; the findings were just published. PAHs -- polycyclic aromatic hydrocarbons -- come from incomplete burning of hydrocarbons like wood and fossil fuels. They are found in high concentrations near oil refineries and other industrial facilities as well as major highways and other transportation hubs like shipyards and railways. PAHs also attach themselves to particles in the air, meaning once they settle, they can be moved around by flood waters. The high baseline levels of PAHs in Manchester have thus fueled resident concerns that floods such as those caused by Hurricane Harvey could increase exposure risks. Manchester is close to the Houston Ship Channel, a major interstate highway, a large railyard and several oil refineries. Previous studies have found that this neighborhood has a disproportionately high level of PAH pollution and associated health risks. Because of this, it is important to understand how flooding and other disasters impact the area. Flooding is becoming a greater concern for residents in Manchester as well as in other locations in the Houston area as the frequency of heavy rainfall events appears to be increasing. Between 1981 and 2000, the odds of a rainfall event of more than 20 inches increased by one percent, and this frequency is expected to grow by 18 percent between 2018 and 2100. The analysis found differences in PAH concentrations across all 40 sample sites, with nearly half of Manchester contaminated to some degree and nine of the sites having a higher PAH concentration than the minimum standard for increased cancer risk. The highest concentrations were found at sites closest to the highway and the Houston Ship Channel, and the lowest concentrations were in locations farther away. The distribution of PAHs in Manchester may have been controlled in part by the way flood waters moved through the area. However, the researchers did not have data on street-level differences in surfaces for their analysis. Thus, it is unclear how much surfaces that do not absorb water, such as streets and sidewalks, contributed to the distribution pattern. The findings of this study build on prior research showing that people in areas that flood may have a greater risk of PAH exposure. This study also points to the need for a better understanding of how PAHs are dispersed during flood events. More data on baseline pollutant concentrations and improved analysis methods will help researchers, policy makers and community leaders assess the risks people living in environmental justice communities face, and possibly find ways to limit the health risks residents of neighborhoods like Manchester face in the future.
Pollution
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324135430.htm
Rodenticides in the environment pose threats to birds of prey
Over the past decades, the increased use of chemicals in many areas led to environmental pollution -- of water, soil and also wildlife. In addition to plant protection substances and human and veterinary medical drugs, rodenticides have had toxic effects on wildlife. A new scientific investigation by scientists of the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW), the Julius Kühn Institute (JKI) and the German Environment Agency (Umweltbundesamt -- UBA) demonstrates that these substances are widely found in liver tissues of birds of prey from Germany. Anticoagulant rodenticides, commonly used to kill rodents in agriculture and forestry, were frequently detected, particularly in birds of prey close to or in urban environments. Especially northern goshawks in the urban conurbation of Berlin and red kites in all habitats were frequently exposed to rodenticides. Evidence of rodenticides in white-tailed sea eagles demonstrated that scavengers occupying habitats more distant from human-modified landscapes are subjected to exposure as well. The results, which were supported by WWF Germany, have been published in a scientific journal.
Europe's bird populations are currently experiencing a substantial decline. Among the drivers of this decline are continued urbanisation, growing intensification of agriculture, the massive decline of insect populations as well as chemical pollution linked to the aforementioned processes of land use. "Raptors are known to be particularly sensitive to bioaccumulating pollutants," says Oliver Krone, bird of prey specialist at the Leibniz-IZW Department of Wildlife Diseases. Together with doctoral student Alexander Badry from Leibniz-IZW and colleagues Detlef Schenke from JKI and Gabriele Treu from UBA, he analysed in detail which substances are detectable in deceased red kites (Milvus milvus), northern goshawks (Accipiter gentilis), Eurasian sparrowhawks (Accipiter nisus), white-tailed sea eagles (Haliaeetus albicilla) and ospreys (Pandion haliaetus). The team analysed carcasses collected between 1996 and 2018. "We found rodenticide residues in liver tissues of more than 80 percent of the northern goshawks and red kites which we examined," says lead author Badry. In total, 18 percent of the northern goshawks and 14 percent of the red kites exceeded the threshold level of 200 ng per gram body mass for acute toxic effects. This is expected to contribute to previously reported declines in survival of red kites in Germany. "In white-tailed sea eagles we found rodenticides in almost 40 percent of our samples, at lower concentrations, whereas exposure in sparrowhawks and ospreys was low or zero." Overall, more than 50 percent of the birds had rodenticide levels in their liver tissue, and about 30 percent had combinations of more than one of these substances. "Rodenticide poisoning represents an important cause of death for birds of prey," Badry and Krone conclude. "Species that facultatively scavenge have been shown to be at high risk for rodenticide exposure." The application of these pesticides is not restricted to agricultural contexts, such as barns and stables or controlling common vole populations on arable land. Anticoagulant rodenticides are also frequently used in large-scale forest plantations and in the sewage systems and canals of towns and cities to control rodent populations. The results of the analyses demonstrated that the closer a dead bird was found to urban landscapes such as industrial areas and the urban conurbation, the more likely it was exposed to rodenticides. "It seems that urban areas pose a great risk for birds of prey in terms of exposure to rodenticides, although the extent of exposure was not linked to the urban gradient," the authors explain. "This means that birds of prey are more likely to be exposed to rodenticides in the vicinity of or inside urban areas, but it does not automatically mean that more of these substances accumulate." Species-specific traits such as facultative scavenging on small mammals or foraging on birds that have direct access to rodenticide bait boxes seem to be responsible for the extent of exposure, rather than urban habitat use as such. Additionally, accumulation takes place through multiple exposures throughout the life of an individual, which is why adults were more likely to be exposed than juvenile birds. In addition to rodenticides, the scientists also detected medical drugs such as ibuprofen (14.3%) and fluoroquinolones (2.3%) in the bird of prey carcasses.
Among the plant protection products, they detected the insecticide dimethoate, which was approved for use until 2019, and its metabolite omethoate, as well as the neonicotinoid thiacloprid, which was approved for use until 2021, in four red kites. The scientists assume that the levels of dimethoate they found were a consequence of deliberate poisoning. The traces of thiacloprid -- a substance with a very short half-life in bird organs -- hint at an exposure shortly before death. The results of these analyses clearly show that rodenticides and deliberate poisoning in particular pose a threat to birds of prey, the authors conclude. This is true both for raptors living in or near urban habitats and for facultative scavengers. Known sources of these substances need to be re-evaluated in terms of their effects along the food chain, i.e. in terms of secondary poisoning and potential toxicity to birds of prey. Furthermore, the levels of rodenticides found in white-tailed sea eagles, which do not usually feed on the species that the rodenticides target, indicate that further research on the sources is needed.
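A minimal sketch of the threshold screening implied above, using the 200 ng per gram level cited in the article. The bird IDs and residue values are invented, and summing residues across compounds is an assumption made for illustration.

```python
# Sketch of threshold screening: flag birds whose summed liver residues
# exceed 200 ng/g, the level the article cites for acute toxic effects.
# The residue data below are hypothetical.

THRESHOLD_NG_PER_G = 200.0

birds = {
    "goshawk_01":  {"brodifacoum": 150.0, "bromadiolone": 90.0},
    "red_kite_07": {"bromadiolone": 40.0},
}

for bird, residues in birds.items():
    total = sum(residues.values())       # sum over detected anticoagulants (assumption)
    flag = "EXCEEDS" if total > THRESHOLD_NG_PER_G else "below"
    print(f"{bird}: {total:.0f} ng/g ({flag} threshold)")
```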
Pollution
2021
March 23, 2021
https://www.sciencedaily.com/releases/2021/03/210323150822.htm
With drop in LA's vehicular aerosol pollution, vegetation emerges as major source
California's restrictions on vehicle emissions have been so effective that in at least one urban area, Los Angeles, the most concerning source of dangerous aerosol pollution may well be trees and other green plants, according to a new study by University of California, Berkeley, chemists.
Aerosols -- particles of hydrocarbons referred to as PM2.5 because they are smaller than 2.5 microns in diameter and easily lodge in the lungs -- are proven to cause cardiovascular and respiratory problems. As a result of strict vehicle emissions laws, organic aerosol levels have been significantly reduced throughout the United States, but the drop has been particularly dramatic in Los Angeles, which started out at a higher level. Based on pollution measurements over the past 20 years, the UC Berkeley scientists found that concentrations of PM2.5 in the Los Angeles basin in 2012 were half what they were in 1999. As a result, from 2016 to 2018, there were almost no PM2.5 violations in the area when temperatures were low, below 68 degrees Fahrenheit. But at warmer temperatures, aerosol concentrations rose -- over the same time period, 70% to 80% of days over 100 F exceeded the National Ambient Air Quality Standard (NAAQS) threshold. "The positive news is that, where we did understand the source and we took action, that action has been incredibly effective," said Ronald Cohen, an atmospheric chemist and UC Berkeley professor of chemistry. "Twenty years ago, just about every day in LA was in violation of a health-based standard. And now it is only the hot days." As vehicle organic chemicals -- including carcinogens such as benzene and toluene -- dropped, air quality experts focused on other potential sources of aerosols in those cities with unhealthful levels. Many researchers believe that personal care and household cleaning products -- some seemingly as benign as the citrus scent limonene -- may be the culprit. Given the temperature dependence of aerosol levels in Los Angeles, Cohen doubts that. "There is a growing consensus that, as cars became unimportant, household chemicals are dominating the source of organics to the atmosphere and, therefore, dominating the source of aerosols," he said. "I am saying that I don't understand how aerosols from these chemicals could be temperature-dependent, and, therefore, I think it is likely something else. And trees are a good candidate." Plants are known to release more organic chemicals as the temperature rises, and in many forested areas trees are the source of organic chemicals that combine with human-produced nitrogen oxides to form aerosol. President Ronald Reagan was partially correct when he infamously stated in 1981 that "trees cause more pollution than automobiles do." At the time, scientists were learning about the role of forests surrounding Atlanta in causing that city's air pollution. Cohen and former Berkeley master's degree student Clara Nussbaumer reviewed organic chemical emissions from various plants known to grow or be cultivated in the Los Angeles area and found that some, such as the city's iconic Mexican fan palms, produce lots of volatile organic compounds. Oak trees are also high emitters of organic chemicals. They estimated that, on average, 25% of the aerosols in the Los Angeles basin come from vegetation, which includes an estimated 18 million or more trees. Plant-derived aerosols are likely made of the chemical isoprene -- the building block of rubber -- or plant chemicals such as terpenes, which consist of two or more isoprene building blocks combined to form a more complex molecule.
Cohen says that PM2.5 aerosols can be thought of "as little tiny beads of candle wax," with plant-derived aerosols composed of many molecules of isoprene and terpenes, which are found in pine tree resins. "I am not suggesting that we get rid of plants, but I want people who are thinking about large-scale planting to pick the right trees," he said. "They should pick low-emitting trees instead of high-emitting trees." The research was described this month in a journal article. How does global warming affect pollutants? Cohen, who has studied the temperature dependence of urban ozone levels for insight into the impact climate change will have on pollutants, decided two years ago to investigate the temperature dependence of ozone and aerosol pollution in five counties in the Los Angeles basin: Los Angeles, San Bernardino, Riverside, Orange and Ventura. He and Nussbaumer looked at data from 22 measurement sites across the basin -- eight in LA County, two in Orange County, five in Riverside County, four in San Bernardino County, and three in Ventura County -- to study aerosols, and at four sites -- three in LA, one in San Bernardino -- to study ozone. The researchers found that at the beginning of the 21st century, the relationship between temperature and aerosol pollution was quite varied: if the temperature went up, sometimes PM2.5 concentrations would increase a lot, sometimes a little. Today, the relationship is more linear: if the temperature goes up a degree, PM2.5 concentrations predictably increase by a set amount. Cohen and Nussbaumer focused primarily on secondary organic aerosols (SOA), which form as particles when gaseous pollutants -- primarily nitrogen oxides (NOx) and volatile organic compounds (VOCs) -- react with sunlight. The same conditions produce ozone. Using a simple atmospheric model, they concluded that both regulated chemicals from vehicle exhaust and cooking -- primary organic aerosols such as benzene, toluene, ethylbenzene and xylene -- and isoprene from plants were precursors of the majority of the organic aerosols observed. Their model suggests that about a quarter of the SOA in the LA Basin are formed by isoprene or other very similar compounds, and that these represent most of the temperature-dependent increase. While there is evidence that some temperature-dependent VOCs have been controlled over time, such as those from evaporation of gasoline, isoprene is not one of them. Cohen noted that as electric car use increases, the importance of organic aerosols from vegetation will become more dominant, requiring mitigation measures to keep levels within regulatory limits during heat waves. "Cars are also contributing to ozone, and in the LA basin the ozone level is also high, at high temperatures and for the same reason: there are more organic molecules to drive the chemistry when it is hot," Cohen said. "We want some strategy for thinking about which plants might emit fewer hydrocarbons as it gets hot or what other emissions we could control that prevent the formation of aerosols." Cohen hopes to look at data from other urban areas, including the San Francisco Bay Area, to see if temperature-dependent aerosols now dominate, and whether vegetation is the culprit. The study was funded in part by a grant (NA18OAR4310117) from the National Oceanic and Atmospheric Administration (NOAA).
Cohen and Allen Goldstein, a UC Berkeley professor of environmental science, policy and management and of civil and environmental engineering, have also partnered with NOAA scientists and the state and local air quality agencies on an experiment to observe emissions in Los Angeles at different temperatures. Combining these different observing strategies in the LA Basin, Cohen hopes, "will lead to better ideas for reducing high ozone and aerosol events in the basin, ones that can then be used as a guide in other major cities suffering from poor air quality."
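The "more linear" temperature dependence described above is easy to picture as a least-squares fit of PM2.5 concentration against daily temperature. The sketch below uses synthetic observations, not the Berkeley team's monitoring data.

```python
# Sketch of the linear temperature dependence described in the article:
# fit PM2.5 concentration against daily temperature with least squares.
# The observations below are synthetic.

import numpy as np

temps_f = np.array([60, 70, 80, 90, 100, 105])          # daily highs, °F
pm25 = np.array([8.0, 9.5, 11.2, 13.1, 14.8, 15.9])     # µg/m3 (synthetic)

slope, intercept = np.polyfit(temps_f, pm25, deg=1)      # highest degree first
print(f"PM2.5 rises ~{slope:.2f} µg/m3 per °F (synthetic example)")
```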
Pollution
2021
March 18, 2021
https://www.sciencedaily.com/releases/2021/03/210318142527.htm
Organic crystals' ice-forming superpowers
At the heart of clouds are ice crystals. And at the heart of ice crystals, often, are aerosol particles -- dust in the atmosphere onto which ice can form more easily than in the open air.
It's a bit mysterious how this happens, though, because ice crystals are orderly structures of molecules, while aerosols are often disorganized chunks. New research by Valeria Molinero, distinguished professor of chemistry, and Atanu K. Metya, now at the Indian Institute of Technology Patna, shows how crystals of organic molecules, a common component of aerosols, can get the job done. The story is more than that, though -- it's a throwback to Cold War-era cloud seeding research and an investigation into a peculiar memory effect that sees ice form more readily on these crystals the second time around. The research, funded by the Air Force Office of Scientific Research, has been published. Molinero's research is focused on how ice forms, particularly the process of nucleation, which is the beginning of ice crystal formation. Under the right conditions, water molecules can nucleate ice on their own. But often some other material, called a nucleant, can help the process along. After several studies on the ways that proteins can help form ice, Molinero and Metya turned their attention to organic ice nucleants (as used here, "organic" means organic compounds containing carbon) because they are similar to the ice-producing proteins and are found in airborne aerosols. But a review of the scientific literature found that the papers discussing ice nucleation by organic compounds came from the 1950s and 1960s, with very little follow-up work after that until very recently. "That made me really curious," Molinero says, "because there is a lot of interest now on organic aerosols and whether and how they promote the formation of ice in clouds, but all this new literature seemed dissociated from these early fundamental studies of organic ice nucleants." Additional research revealed that the early work on organic ice nucleants was related to the study of cloud seeding, a post-war line of research into how particles (primarily silver iodide) could be introduced into the atmosphere to encourage cloud formation and precipitation. Scientists explored the properties of organic compounds as ice nucleants to see if they might be cost-effective alternatives to silver iodide. But cloud seeding research collapsed in the 1970s after political pressures and fears of weather modification led to a ban on the practice in warfare. Funding and interest in organic ice nucleants dried up until recently, when climate research spurred a renewed interest in the chemistry of ice formation in the atmosphere. "There has been a growing interest in ice nucleation by organic aerosols in the last few years, but no connection to these old studies on organic crystals," Molinero says. "So, I thought it was time to "rescue" them into the modern literature." Phloroglucinol is one of the organic nucleants studied in the mid-20th century. One question to answer is whether phloroglucinol nucleates ice through classical or non-classical processes. When ice nucleates on its own, without any surfaces or other molecules, the only hurdle to overcome is forming a stable crystallite of ice (only about 500 molecules in size under some conditions) that other molecules can build on to grow an ice crystal. That's classical nucleation. Non-classical nucleation, involving a nucleant surface, occurs when a layer of water molecules assembles on the surface, on which other water molecules can organize into a crystal lattice. The hurdle to overcome in non-classical nucleation is the formation of the monolayer. Which applies to phloroglucinol? In the 1960s, researcher L.F.
Evans concluded that it was non-classical. "I am still amazed he was able to deduce the existence of a monolayer and infer the mechanism was non-classical from experiments of freezing as a function of temperature alone!" Molinero says. But Molinero and Metya, using molecular simulations of how ice forms, found that it's more complicated. "We find that the step that really decides whether water transforms to ice or not is not the formation of the monolayer but the growth of an ice crystallite on top," Molinero says. "That makes ice formation by organics classical, although no less fascinating." The researchers also used their simulation methods to investigate an interesting memory effect previously observed with organic and other nucleants. When ice is formed, melted and formed again using these nucleants, the second round of crystallization is more effective than the first. It's assumed that the ice melts completely between crystallizations, and researchers have posed several potential explanations. Molinero and Metya found that the memory effect isn't due to the ice changing the nucleant surface, nor to the monolayer of water persisting on the nucleant surface after melting. Instead, their simulations supported an explanation where crevices in the nucleant can hold on to small amounts of ice that melt at higher temperatures than the rest of the ice in the experiment. If these crevices are adjacent to one of the nucleant crystal surfaces that's good at forming ice, then it's off to the races when the second round of freezing begins. Other mysteries still remain -- the mid-century studies of organic crystals found that at high pressures, around 1,500 times atmospheric pressure, the crystals are as efficient at organizing water molecules into ice as an ice crystal itself. Why? That's the focus of Molinero's next experiments. More immediately, though, phloroglucinol is a naturally occurring compound in the atmosphere, so anything that researchers can learn about it and other organic nucleants can help explain the ability of aerosols to nucleate ice and regulate the formation of clouds and precipitation. "It would be important to investigate whether small crystallites of these crystalline ice nucleants are responsible for the baffling ice nucleation ability of otherwise amorphous organic aerosols," Molinero says.
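For context on the "hurdle" language used above, classical nucleation theory has a standard textbook form; the equations below are generic and are not taken from the paper. A growing spherical crystallite trades a favorable volume term against an unfavorable surface term:

```latex
% Textbook classical-nucleation-theory relations (not from the paper):
\Delta G(r) = -\frac{4}{3}\pi r^{3}\,\lvert\Delta g_{v}\rvert + 4\pi r^{2}\gamma,
\qquad
r^{*} = \frac{2\gamma}{\lvert\Delta g_{v}\rvert},
\qquad
\Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,\lvert\Delta g_{v}\rvert^{2}}
```

Here γ is the ice-water interfacial free energy and |Δg_v| the magnitude of the bulk free-energy gain per unit volume on freezing; crystallites larger than the critical radius r* grow spontaneously, which is the stable-crystallite hurdle the article describes.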
Pollution
2021
March 18, 2021
https://www.sciencedaily.com/releases/2021/03/210318122525.htm
New analysis shows potential for 'solar canals' in California
UC Santa Cruz researchers published a new study -- in collaboration with UC Water and the Sierra Nevada Research Institute at UC Merced -- that suggests covering California's 6,350 km network of public water delivery canals with solar panels could be an economically feasible means of advancing both renewable energy and water conservation.
The concept of "solar canals" has been gaining momentum around the world as climate change increases the risk of drought in many regions. Solar panels can shade canals to help prevent water loss through evaporation, and some types of solar panels also work better over canals, because the cooler environment keeps them from overheating. Pilot projects in India have demonstrated the technical feasibility of several designs, but none have yet been deployed at scale. California's canal network is the world's largest water conveyance system, and the state faces both a drought-prone future and a rapid timeline for transitioning to renewable energy. Solar canals could target both challenges, but making the case for their implementation in California requires first quantifying the potential benefits. So that's exactly what researchers set out to do in their newly published paper. "While it makes sense to cover canals with solar panels because renewable energy and water conservation is a win-win, the devil is in the details," said Brandi McKuin, lead author of the new study and a UC Santa Cruz postdoctoral researcher in environmental studies. "A critical question was whether the infrastructure to span the canals would be cost-prohibitive." Canal-spanning solar panels are often supported either by steel trusses or suspension cables, both of which are more expensive to build than traditional support structures for ground-mounted solar panels. But McKuin led a techno-economic analysis that showed how the benefits of solar canals combine to outweigh the added costs for cable-supported installations. In fact, cable-supported solar canals showed a 20-50 percent higher net present value, indicating greater financial return on investment. In addition to benefits like increased solar panel performance and evaporation savings, shade from solar panels could help control the growth of aquatic weeds, which are a costly canal maintenance issue. Placing solar panels over existing canal sites could also avoid costs associated with land use. Now that the new paper has provided a more concrete assessment of these benefits, members of the research team hope this could lead to future field experiments with solar canals in California. "This study is a very important step toward encouraging investments to produce renewable energy while also saving water," said Roger Bales, a coauthor on the paper who is a distinguished professor of engineering at UC Merced, the former director of the Sierra Nevada Research Institute, and a director at UC Water. Bales was part of the original group that got the project started in 2016, when San Francisco-based social impact agency Citizen Group approached UC Solar and UC Water with the concept. From there, the research grew into a collaboration between UC Merced, UC Santa Cruz, and Citizen Group, with funding support from NRG Energy and the USDA National Institute of Food and Agriculture. Lead author Brandi McKuin started working on the project while completing her Ph.D. at Merced, then continued with help from senior author and UCSC professor Elliott Campbell, the Stephen R. Gliessman Presidential Chair in Water Resources and Food Systems and a fellow Merced transplant.
UC Merced professor Joshua Viers and researcher Tapan Pathak advised on the project, and graduate students Andrew Zumkehr and Jenny Ta contributed to the analysis. Zumkehr led a complex hydrological analysis using data from satellites, climate models, and automated weather stations to model and compare evaporation rates at canal sites across the state, with and without shade from solar panels. McKuin then used this information in her assessment to calculate the financial benefits of reduced evaporation. Ultimately, it was the cost savings of many combined benefits that made solar canals financially viable, rather than benefits from reduced evaporation alone. But the study also notes that benefits from deploying solar canals could extend beyond immediate financial impacts. For example, every megawatt of solar energy produced by solar canals in California's Central Valley has the potential to replace 15-20 diesel-powered irrigation pumps, helping to reduce pollution in a region with some of the nation's worst air quality. And senior author Elliott Campbell says the wide range of benefits identified by the paper is, in itself, an important lesson to carry forward. He sees the findings as not only an assessment of solar canals, but also a clear illustration of the interconnections between urgent global issues like air quality, energy, and water conservation. "What we're seeing here is actually some surprising benefits when you bring water and energy together," Campbell said. "Sometimes it leads to a smoother landing in how we transition to better ways of making energy and saving water."
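Net present value, the metric behind the 20-50 percent figure above, discounts future cash flows back to today. A minimal sketch follows, with entirely hypothetical capital costs, annual benefits and discount rate; it illustrates the comparison, not the study's actual inputs.

```python
# Minimal net-present-value sketch for comparing canal-spanning solar
# designs against ground-mounted solar. All numbers are hypothetical.

def npv(rate: float, cashflows: list) -> float:
    """Discounted sum of yearly cash flows; index 0 = up-front cost."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: capital cost (negative); years 1-25: energy revenue plus avoided
# costs (evaporation savings, weed control, land costs) minus maintenance.
ground_mount = [-1_000_000] + [95_000] * 25
solar_canal  = [-1_300_000] + [130_000] * 25   # higher capex, higher benefits

print(f"ground mount NPV: ${npv(0.05, ground_mount):,.0f}")
print(f"solar canal NPV:  ${npv(0.05, solar_canal):,.0f}")
```

In this toy example the canal-spanning design wins despite its higher up-front cost because the combined annual benefits dominate over the project lifetime, which mirrors the paper's qualitative conclusion.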
Pollution
2021
March 17, 2021
https://www.sciencedaily.com/releases/2021/03/210317141722.htm
Double-duty catalyst generates hydrogen fuel while cleaning up wastewater
Hydrogen is a pollution-free energy source when it's extracted from water using sunlight instead of fossil fuels. But current strategies for "splitting" or breaking apart water molecules with catalysts and light require the introduction of chemical additives to expedite the process. Now, researchers report a catalyst that sidesteps this problem by degrading pollutants in wastewater while it generates hydrogen.
Harnessing the sun's energy to split water into hydrogen fuel is a promising renewable strategy, but it is a slow process even when catalysts are used to speed it along. In some cases, alcohols or sugars are added to boost the rate of hydrogen production, but these chemicals are destroyed as hydrogen is generated, meaning the approach is not renewable. In a separate strategy, researchers have tried using contaminants in wastewater to enhance hydrogen fuel generation. While titanium-based catalysts worked for both removing contaminants and generating hydrogen, the efficiencies were lower than expected for both steps because of their overlapping reaction sites. One way to reduce such interferences is to make catalysts by fusing together different conductive metals, thus creating separate places for reactions to occur. So, Chuanhao Li and colleagues wanted to combine cobalt oxide and titanium dioxide to create a dual-functioning catalyst that would break down common drugs in wastewater while also efficiently converting water into hydrogen for fuel. To make the catalyst, the researchers coated nanoscale titanium dioxide crystals with a thin layer of cobalt oxide. Initial tests showed that this material didn't produce much hydrogen, so as a next step, the team spiked this dual catalyst with 1% by weight of platinum nanoparticles -- an efficient though expensive catalyst for generating hydrogen. In the presence of simulated sunlight, the platinum-impregnated catalyst degraded two antibiotics and produced substantial amounts of hydrogen. Finally, the team tested their product on real wastewater, water from a river in China, and deionized water samples. Under simulated sunlight, the catalyst stimulated hydrogen production in all three samples. The greatest amount of hydrogen was obtained from the wastewater sample. The researchers say their catalyst could be a sustainable wastewater treatment option by generating hydrogen fuel at the same time.
Pollution
2021
March 17, 2021
https://www.sciencedaily.com/releases/2021/03/210317111751.htm
Artificial light affects plant pollination even during the daytime
The use of artificial light at night around the world has increased enormously in recent years, causing adverse effects on the survival and reproduction of nocturnal organisms. Artificial light at night interferes with vital ecological processes such as the nighttime pollination of plants by nocturnal insects, which could have consequences for agricultural crop yields and reproduction of wild plants.
Scientists from the University of Zurich and Agroscope have now demonstrated for the first time that artificial light at night also adversely affects insects' pollination behavior during the daytime. In an experiment, they used commercial streetlamps to illuminate natural plant-pollinator communities during the nighttime on six natural meadows; six other natural meadows were left dark. The research team concentrated its analysis on 21 naturally occurring plant species and the insect orders Diptera, Hymenoptera and Coleoptera. Differing interactions depending on plant species: "Our findings indicate that artificial light during the nighttime alters the number of plant-pollinator interactions during the daytime, depending on the plant species," says Eva Knop from UZH's University Research Priority Program Global Change and Biodiversity and Agroscope. For example, three plant species received significantly fewer, and one other species slightly fewer, pollinator visits during the daytime. A different plant species, in contrast, received many more pollinator visits, and one other a little more, under LED illumination. Interestingly, nocturnal pollinator activity also varied in the presence of artificial light. For example, woodland geraniums (Geranium sylvaticum) in illuminated and dark meadows received the same number of pollinator visits, but not by the same insects: whereas dipterous insects reduced their visits to plants that were illuminated during the night, beetles (Coleoptera) tended to increase their visits. Two other plant species exhibited similar trends. Indirect ecological effects of light pollution: the indirect ecological impact of light pollution has been ignored thus far. "Since insects play a vital role in pollinating crops and wild plants and are already endangered by habitat destruction and climate change regardless of artificial light, it is important to study and clarify these indirect mechanisms," Knop says. On the basis of their findings, Knop and her colleagues call for "the ecological impact of light pollution to be researched more thoroughly and for actions to be devised to avert adverse effects on the environment." They say they see ways to do this even though artificial light is an integral feature of populated areas. Public lighting, for instance, could be carefully designed in combination with new technologies to reduce it to a minimum.
Pollution
2021
March 16, 2021
https://www.sciencedaily.com/releases/2021/03/210316093427.htm
Commercial truck electrification is within reach
When it comes to electric vehicles, particularly for heavy-duty trucks, the limitations of battery technology are often seen as the main barrier to widespread adoption. However, a new analysis concludes that it's the lack of appropriate policies around adoption incentives, charging infrastructure, and electricity pricing that prevents widespread electrification of commercial trucking fleets.
Researchers from the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California, Los Angeles published a new study that makes the case for prioritizing public policy to help move long-haul trucking from diesel to electric. Doing so will mean huge gains in addressing the climate crisis and avoiding premature deaths due to local vehicular pollution, which disproportionately affects communities of color. The study analyzes the total cost of ownership of an electric long-haul truck compared to a diesel long-haul truck. Using the current price of a battery pack and assuming a 375-mile range, the researchers found that an electric long-haul truck has a 13% per mile lower total cost of ownership, with a net savings of $200,000 over the lifetime of the electric truck. The total cost of ownership analysis takes into account the purchase price and operating costs over the lifetime of the truck. The researchers also showed that future reductions in battery costs -- taken together with a more aerodynamic design and monetized benefits of reduced pollution -- would result in a 50% per mile lower total cost of ownership compared to a diesel long-haul truck by 2030. The electrification of long-haul trucks therefore is possible, and figuring out what is required to move the nation's trucking fleet to widely adopt electric trucks is the next step, the authors said. "Given the massive economic and environmental benefits, the case for long-haul electric trucking is stronger than ever before," said Berkeley Lab Research Scientist Nikit Abhyankar, one of the authors of the study. "Enabling policies such as adoption and charging infrastructure incentives, sales mandates, and cost-reflective electricity pricing are crucial." Electric cars are becoming more prevalent now, with a substantial increase in global sales and commitments from several major auto manufacturers, including General Motors and Volvo, to sell only electric vehicles by 2030-2035. Long-haul trucks have not experienced the same level of growth, yet they are diesel-fuel guzzlers and a major source of air pollution, contributing more than 20% of U.S. transportation-sector greenhouse gas emissions. Berkeley Lab scientists have done extensive research tracking the impact of diesel trucks on air quality and public health in disadvantaged communities. Even though diesel trucks account for just a small fraction of motor vehicles, they are responsible for almost one-third of motor vehicle CO2 emissions. "If we can move away from diesel-dependent heavy-duty vehicles, we have a chance at significantly reducing greenhouse gas and particulate emissions from the transportation sector," said Berkeley Lab Staff Scientist Amol Phadke, lead author on this study. There are currently two main pathways to electrify trucks -- fuel cells and batteries -- and both are actively being pursued by researchers at Berkeley Lab. Long-haul trucks powered by hydrogen fuel cells are on the horizon, and Berkeley Lab scientists are playing a leading role in a new DOE consortium called the Million Mile Fuel Cell Truck (M2FCT) to advance this technology.
Battery-powered electric trucks have seen the most dramatic improvements in technology in recent years, making battery costs more affordable and competitive. What's more, electricity from renewable energy sources is becoming more cost-competitive, and Berkeley Lab researchers have shown that decarbonizing the electric grid is feasible in the coming decades, which means electric long-haul trucks would no longer contribute to greenhouse gas emissions. "It is exciting to see recent dramatic improvements in battery technology and costs," said Phadke. "Electric trucks can generate significant financial savings for truck owners and fleet operators, while enabling inflation-proof freight transportation that can have significant macroeconomic benefits." The study was supported in part by the Hewlett Foundation.
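A total-cost-of-ownership comparison of the kind described above amortizes the purchase price over lifetime miles and adds per-mile running costs. Every input in this sketch is a hypothetical placeholder; it illustrates the calculation, not the Berkeley Lab/UCLA numbers.

```python
# Rough total-cost-of-ownership (TCO) sketch. All inputs are hypothetical.

def tco_per_mile(purchase: float, energy_per_mile: float,
                 maintenance_per_mile: float, lifetime_miles: float) -> float:
    """Purchase price amortized over lifetime miles, plus running costs."""
    return purchase / lifetime_miles + energy_per_mile + maintenance_per_mile

LIFETIME = 1_000_000  # lifetime miles (hypothetical)

diesel = tco_per_mile(purchase=120_000, energy_per_mile=0.55,
                      maintenance_per_mile=0.15, lifetime_miles=LIFETIME)
electric = tco_per_mile(purchase=250_000, energy_per_mile=0.25,
                        maintenance_per_mile=0.10, lifetime_miles=LIFETIME)

print(f"diesel:   ${diesel:.2f}/mile")
print(f"electric: ${electric:.2f}/mile")
print(f"lifetime savings: ${(diesel - electric) * LIFETIME:,.0f}")
```

The pattern to note: the electric truck's higher purchase price is spread thin over lifetime miles, while its lower fuel and maintenance costs accrue on every mile, which is why per-mile TCO can favor electrification.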
Pollution
2021
March 15, 2021
https://www.sciencedaily.com/releases/2021/03/210315110159.htm
Model predicts urban development and greenhouse gasses will fuel urban floods
When rain began falling in northern Georgia on Sept. 15, 2009, little did Atlantans know that they would bear witness to epic flooding throughout the city. Neighborhoods, like Peachtree Hills, were submerged; Georgia's busiest expressway was underwater, as were roads and bridges; untreated sewage mingled with rising flood waters; cars and people were swept away. Then-Georgia governor Sonny Perdue declared a state of emergency.
Moisture from the Gulf of Mexico fueled the flood of 2009. A decade later, Arizona State University researchers are asking whether a combination of urban development -- and climate change fueled by greenhouse gasses -- could bring about comparable scenarios in U.S. cities. Based on a just-published study, the answer is yes. "When we account for these twin forcing agents of environmental change, the effect of the built environment and the effect of greenhouse gasses, we note a strong tendency toward increased extreme precipitation over future US metropolitan regions," said Matei Georgescu, associate professor in ASU's School of Geographical Sciences and Urban Planning and lead author of the study. Previous studies have shown that urban development modifies precipitation, thanks to what's known as the urban heat-island effect, the difference between the temperature in a city and the surrounding rural area. As a city grows, it gets warmer. The added warmth adds energy to the air, which forces it to rise faster, condense, form precipitation and rain out over the city or downwind of the city. So, the amount of precipitation a city receives either increases or decreases in response to the urban heat-island effect. However, when greenhouse gasses and urban development are both taken into account, regional climate modeling focused on the continental United States shows compensating impacts between the effect of urban development and greenhouse gas emissions on extreme precipitation. The study was published online. Researchers have not previously looked at these two variables in tandem. Studies on future precipitation over urban environments typically examine effects for a limited number of events, and they do not account for the twin forcing agents of urban- and greenhouse-gas-induced climate change. "This new study is unique," said Georgescu. "We used climate-scale simulations with a regional climate model to examine potential changes in future extreme precipitation resulting from both urban expansion and increases in greenhouse gasses, across dozens of cities across the continental United States." In essence, the new study showed that incorporating greenhouse gasses into a regional climate model offset the sometimes-diminishing effect of urban development on extreme precipitation, said Georgescu. "These are the effects our cities are likely to experience when accounting for the twin forcing agents of urban expansion and greenhouse gas emissions, simultaneously," explained Georgescu. "What this means for U.S. cities in the future is the need for a consistent response to an increase in extreme precipitation. We're no longer likely to see a decrease in precipitation as we've seen before." Like Atlanta, cities across the U.S., including Denver, Phoenix and Houston, appear to be vulnerable to extreme precipitation and its resultant flooding. Georgescu said the study's findings show the pressing need for cities to develop policies to address flooding that threatens each city's unique resilience and planned infrastructure investments. "If we trust the models' capability to simulate average and extreme precipitation so well, and our results demonstrate such simulation skill, then we can conduct simulations that include future urbanization, future greenhouse gasses, separately and then together, and trust what the model will tell us," explained Georgescu. But it's not just about reducing greenhouse gas emissions, he noted. "It's also about how you build cities.
How extensive they are, how vertical they are, how dense they are, how much vegetation there is, how much waste heat you put into the environment through electricity use, through air conditioning, or through transportation. All of these things can impact future precipitation in our cities." In fact, the study has important implications for climate change adaptation and planning. The study highlights the complex and regionally specific ways in which the competing forces of greenhouse gases and urban development can impact rainfall across U.S. metropolitan regions, explained Ashley Broadbent, assistant research professor in ASU's School of Geographical Sciences and Urban Planning. "This complexity reinforces that future adaptation efforts must be informed by simulations that account for these interacting agents of environmental change," he said. Additional study authors include M. Wang and M. Moustaoui, ASU; and E. Scott Krayenhoff, ASU and University of Guelph, Ontario, Canada. The study was funded by the National Science Foundation through the Urban Water Innovation Network.
Pollution
2021
March 12, 2021
https://www.sciencedaily.com/releases/2021/03/210312121333.htm
New review explores effective sampling techniques for collecting airborne viruses and ultrafine particles
As the world continues to grapple with the COVID-19 pandemic, an international team of researchers have published a review of the best techniques to collect airborne aerosols containing viruses.
In the review, the team describes the most promising approaches for sampling airborne viruses, including liquid-based cyclone samplers. For example, such a sampler draws air through a cyclone separator, then uses centrifugal forces to collect the particles on a sterile cone containing a liquid collection medium, such as DMEM (Dulbecco's minimal essential medium). The collected sample can then be readily used for any analysis for virus detection. The research team hope that this wide-ranging review can serve as an information hub packed with the best methods and samplers involved in airborne virus collection. The study is part of the INHALE project -- an EPSRC funded project that aims to assess air pollution's impact on personal health in urban environments. The project involves Imperial College London, the University of Surrey and the University of Edinburgh. The INHALE team also reviewed effective techniques for capturing fine (PM2.5) and ultrafine (PM0.1) particles to understand their toxicity and their role on reactive oxygen species in cells, their elemental composition and carbon content. The team also set out to find the best solution to prevent samples from being destroyed, a common problem found in toxicological experiments that makes large sample collection challenging. The study concluded that Harvard impactor samplers could be used for both indoor and outdoor environments to effectively collect these fine and ultrafine samples. Professor Prashant Kumar, lead author of the study and Founding Director of the Global Centre for Clean Air Research at the University of Surrey, said: "The scientific community will have to become more efficient and resourceful if we are to overcome foes such as airborne viruses and air pollution. Knowing the right tools to use -- as well as how and where to use them -- is crucial in our ongoing fight to make the air we breathe cleaner and safer for all." Professor Fan Chung, co-lead of INHALE from Imperial College London, said: "I am pleased that this timely review found support for the techniques that have been adopted in the INHALE research program. The collection of ultrafine particles is of particular importance because of the commonly found difficulties of collecting enough for toxicity studies. Ultimately, the success of INHALE will depend on the ability to capture enough of these fine and ultrafine particles as far as possible in their natural state." Professor Chris Pain, co-lead of INHALE from Imperial College London, said: "Understanding the application of these sampling techniques is hugely important for environmental and health research in general and for the INHALE project itself, particularly concerning collecting ultra-fine particles." This work was supported by the EPSRC INHALE (Health assessment across biological length scales for personal pollution exposure and its mitigation) project (EP/T003189/1).
Pollution
2021
March 11, 2021
https://www.sciencedaily.com/releases/2021/03/210311142038.htm
Air pollution: The silent killer called PM 2.5
Millions of people die prematurely every year from diseases and cancer caused by air pollution. The first line of defence against this carnage is ambient air quality standards. Yet, according to researchers from McGill University, over half of the world's population lives without the protection of adequate air quality standards.
Air pollution varies greatly in different parts of the world. But what about the primary weapons against it? To find answers, researchers from McGill University set out to investigate global air quality standards in a newly published study. The researchers focused on air pollution called PM2.5 -- responsible for an estimated 4.2 million premature deaths every year globally. This includes over a million deaths in China, over half a million in India, almost 200,000 in Europe, and over 50,000 in the United States. "In Canada, about 5,900 people die every year from air pollution, according to estimates from Health Canada. Air pollution kills almost as many Canadians every three years as COVID-19 has killed to date," says co-author Parisa Ariya, a Professor in the Department of Chemistry at McGill University. Among the different types of air pollution, PM2.5 kills the most people worldwide. It consists of particles smaller than approximately 2.5 microns. "We adopted unprecedented measures to protect people from COVID-19, yet we don't do enough to avoid the millions of preventable deaths caused by air pollution every year," says Yevgen Nazarenko, a Research Associate at McGill University who conducted the study with Devendra Pal under the supervision of Professor Ariya. The researchers found that where there is protection, standards are often much worse than what the World Health Organization considers safe. Many regions with the most air pollution, such as the Middle East, don't even measure PM2.5 air pollution. They also found that the weakest air quality standards are often violated, particularly in countries like China and India. In contrast, the strictest standards are often met, in places like Canada and Australia. Surprisingly, the researchers discovered that high population density is not necessarily a barrier to fighting air pollution successfully. Several jurisdictions with densely populated areas were successful in setting and enforcing strict standards. These included Japan, Taiwan, Singapore, El Salvador, Trinidad and Tobago, and the Dominican Republic. "Our findings show that more than half of the world urgently needs protection in the form of adequate PM2.5 ambient air quality standards. Putting these standards in place everywhere will save countless lives. And where standards are already in place, they should be harmonized globally," says Nazarenko. "Even in developed countries, we must work harder to clean up our air to save hundreds of thousands of lives every year," he says.
Pollution
2021
March 11, 2021
https://www.sciencedaily.com/releases/2021/03/210311123516.htm
The secrets of the best rainbows on Earth
Rainbows are some of the most spectacular optical phenomena in the natural world, and Hawai'i has an amazing abundance of them. In a new publication, an atmospheric scientist at the University of Hawai'i at Mānoa makes an impassioned case for Hawai'i being the best place on Earth to experience the wonder of rainbows. He begins by highlighting the Hawaiian cultural significance of rainbows, then reviews the science of rainbows and the special combination of circumstances that makes Hawai'i a haven for them.
"The cultural importance of rainbows is reflected in the Hawaiian language, which has many words and phrases to describe the variety of manifestations in Hawai'i," said author Steven Businger, professor in the UH Mānoa School of Ocean and Earth Science and Technology. "There are words for Earth-clinging rainbows (uakoko), standing rainbow shafts (kāhili), barely visible rainbows (punakea), and moonbows (ānuenue kau pō), among others. In Hawaiian mythology the rainbow is a symbol of transformation and a pathway between Earth and Heaven, as it is in many cultures around the world."The essential ingredients for rainbows are, of course, rain and sunlight. To see a rainbow on flat ground the sun must be within about 40 degrees of the horizon. As the sun rises to higher angles in the sky during the morning, the height of the rainbow diminishes until no rainbow is visible above the horizon. The pattern is reversed as the sun lowers in the afternoon, with rainbows rising in the east and the tallest rainbows just prior to sunset.Hawai'i's location in the subtropical Pacific means the overall weather pattern is dominated by trade winds, with frequent rain showers and clear skies between the showers.Businger outlines four additional factors affecting the prevalence of rainbows throughout the islands."At night a warm sea surface heats the atmosphere from below, while radiation to space cools cloud tops, resulting in deeper rain showers in the morning that produce rainbows in time for breakfast," said Businger.Another critical factor in producing frequent rainbows is Hawai'i's mountains, which cause trade wind flow to be pushed up, forming clouds and producing rainfall. Without mountains, Hawai'i would be a desert with a scant 17 inches annual rainfall.A third factor conducive to rainbow sightings is daytime heating, which drives island-scale circulations. During periods of lighter winds, showers form over the ridge crests over Oahu and Kauai in the afternoon, resulting in prolific rainbows as the sun sets.Due to the remoteness of the Hawaiian Islands, the air is exceptionally clean and free of pollution, continental dust, and pollen. This is the fourth factor that contributes to the numerous bright rainbows with the full spectrum of colors.As Businger pursued his passion for finding and photographing these beautiful light displays, he began to imagine a smartphone app with access to Doppler radar data and high-resolution satellite data that could alert users when nearby conditions become conducive for rainbow sightings."After a few years of false starts, Paul Cynn and I finally connected with Ikayso, a Hawaiian smartphone app developer in April of 2020. I am very excited to say that our app, called RainbowChase, is now available to the public for free," said Businger.
Pollution
2021
March 10, 2021
https://www.sciencedaily.com/releases/2021/03/210310132351.htm
Scientists discover attacking fungi that show promise against emerald ash borer
Since its introduction, the emerald ash borer (EAB) has become the most devastating invasive forest insect in the United States, killing hundreds of millions of ash trees at a cost of hundreds of millions of dollars.
Now, new research from the University of Minnesota's Minnesota Invasive Terrestrial Plants and Pests Center (MITPPC) shows a possible path forward in controlling the invasive pest that threatens Minnesota's nearly one billion ash trees. In a recent study, researchers identified fungi that attack the insect. "We discovered that several different species of fungi attack EAB and other insects, and they can now be further tested for their potential for biocontrol," said Robert Blanchette, the study's project leader and professor in the College of Food, Agricultural and Natural Resource Sciences. "This is a very important first step in the search for a biocontrol for emerald ash borer." Larval EAB feed just beneath the bark, leaving behind tunnel galleries that can stretch up to 20 inches long. Beneath the surface, fungi -- some of which may be capable of parasitizing the EAB -- may be carried by the larvae as they develop, or may enter the tree through the tunnel galleries. Some of these fungi also seriously affect urban trees, causing rapid wood decay that results in hazardous tree situations. From Rochester to Duluth, researchers gathered samples where ash is affected by EAB. Through DNA sequencing, scientists identified fungal isolates and revealed a diverse assemblage of fungi. This included entomopathogenic fungi that attack insects, as well as other fungi that cause cankers -- which help EAB kill trees -- and some that cause wood decay. "Before now, we simply haven't been sure what fungi are associated with EAB infestations in Minnesota. This project identified those species and, in doing so, opened up new possibilities for managing one of our state's most devastating tree pests," said Ben Held, the study's lead author and researcher in the College of Food, Agricultural and Natural Resource Sciences. As research continues, the scientists will build on the work from this study to determine if any of the fungi can be used to kill the emerald ash borer. Results will also be of value in helping control the insect in other parts of North America where EAB is found. "Ash trees are vitally important to Minnesota," said Rob Venette, MITPPC director. "They reduce air pollution, storm water runoff, and cooling costs, all while increasing property values in local communities. It's critical we work to protect them from this invasive pest."
Pollution
2021
March 10, 2021
https://www.sciencedaily.com/releases/2021/03/210310122456.htm
Air pollutant reductions could enhance global warming without greenhouse gas cuts
As countries around the world race to mitigate global warming by limiting carbon dioxide emissions, an unlikely source could be making climate goals harder to achieve without even deeper cuts in greenhouse gas production: reductions in air pollution.
New modeling experiments from Kyushu University in Japan predict that long-term reductions in pollutants known as sulfate aerosols will lead to further increases in surface air temperature at current and increased carbon dioxide levels, because of the loss of an overall cooling effect caused by the light-scattering particles. "Air pollution causes an estimated seven million premature deaths per year worldwide, so action is essential, especially in emerging and developing countries, which tend to be most affected," says Toshihiko Takemura, professor at Kyushu University's Research Institute for Applied Mechanics and author of the study. "However, reductions in air pollutants must come hand in hand with reductions in greenhouse gases to avoid accelerating global warming." To analyze how sulfate aerosols -- small particles of sulfur-containing compounds often produced by burning fossil fuels or biomass -- influence climate, Takemura used a combination of models known as MIROC-SPRINTARS. MIROC is a general circulation model taking into account many key aspects of the atmosphere and oceans along with their interactions, while SPRINTARS, which is widely used by news outlets for air pollution forecasts, is capable of predicting the mixing of aerosols in the atmosphere. Combining the two models allows for effects such as the scattering and absorption of light by aerosols and the interaction of aerosols with clouds to be included in the climate projection. Looking at the immediate changes to the atmosphere in the case of reduced emissions of SO2, the model's fast response was similar at both current and doubled carbon dioxide levels. However, considering changes in the climate and surface temperatures over longer time scales showed that not only does the surface air temperature increase with a reduction in sulfate aerosols but this increase is even larger when carbon dioxide levels double. "Although the fast response is similar for both situations, long-term changes caused by more slowly responding factors related to interactions with the oceans and subsequent changes, such as in clouds and precipitation, eventually lead to a bigger temperature increase," explains Takemura. "Thus, global warming will accelerate unless increases in greenhouse gas concentrations are suppressed as air pollution control measures decrease sulfate aerosol concentrations, further emphasizing the urgency for reducing carbon dioxide in the atmosphere," he concludes.
Pollution
2021
March 10, 2021
https://www.sciencedaily.com/releases/2021/03/210310122426.htm
Red Snapper in the Gulf show signs of stress after Gulf oil spill
Nearly 100 percent of the red snapper sampled in the Gulf of Mexico over a six-year period by University of South Florida (USF) marine scientists showed evidence of liver damage, according to a newly reported study.
The study is the first to correlate the concentration of crude oil found in the workhorses of the digestive system -- the liver, gall bladder, and bile -- with microscopic indicators of disease, such as inflammation, degenerative lesions, and the presence of parasites. The team sampled nearly 570 fish from 72 Gulf locations between 2011 and 2017 in the wake of the historic 2010 Deepwater Horizon oil spill. "The results add to the list of other species we've analyzed indicating early warning signs of a compromised ecosystem," said Erin Pulster, PhD, first author of the study and researcher at the USF College of Marine Science. Pulster and the team of researchers studying oil pollution in Gulf of Mexico fishes have previously reported high levels of oil exposure in yellowfin tuna, golden tilefish, and red drum as well. The Gulf of Mexico not only experiences hundreds of annual oil spills with long-lasting effects, such as the historic Deepwater Horizon spill in 2010, but is routinely subject to intense shipping traffic and collects pollutants from faraway places that flow in from coastlines and rivers like the Mighty Mississippi and the Rio Grande. In this study, Pulster and the team looked specifically at the most toxic component of crude oil, called polycyclic aromatic compounds, or PAHs. PAH sources include old oil and gas rigs, fuel from boats and airplanes, and natural oil seeps, which are fractures on the seafloor that can add millions of barrels of oil to the Gulf every year. The presence of PAHs in the bile, which is produced by the liver to aid in digestion, indicates relatively recent oil exposure (days to weeks). The team found that the PAH concentration in the bile declined and remained relatively stable after 2011, but they noted a sharp increase in 2017. Overall, the bile PAH "hot spots" were on the West Florida Shelf (WFS) and in the vicinity of the Deepwater Horizon spill, off the mouth of the Mississippi River. This is the site of the 2004 Taylor oil platform collapse off Louisiana, the longest oil spill in history, which continues to leak oil today. The hotspot west of Tampa on the WFS could be due to shipping traffic or submarine groundwater discharge, Pulster said. PAH found in the liver indicates the fish has been chronically exposed to oil (months to years). The team found liver PAH "hot spots" in the northwest Gulf of Mexico, where a considerable number of inactive oil and gas platforms exist. While the PAH concentrations in the liver remained relatively stable throughout the study, indicating that the red snapper are physically managing the oil exposure, there is a tipping point, Pulster said. Red snapper can live upwards of 40 years, but fish manage oil toxins much as humans manage exposure to greasy burgers and alcohol. Repeated exposure to oil in fish can lead to cancer and eventually to death, but it can also result in sublethal impacts. Virtually all (99 percent) of the red snapper sampled had an average of five physical signs of liver damage. The observed changes can result from natural causes but are also well-documented secondary responses to stress, which could potentially signal disease progression. "We just don't know when we will tip the scale," said Pulster. There was literally one red snapper in the bunch with PAHs but no physical signs of damage when viewed under the microscope, said Pulster. It's a good thing that humans only eat the muscle of the fish, not the liver. Red snapper remain safe to eat, but Pulster stressed the need for continued monitoring.
Only then can scientists keep their finger on the pulse of fish health and know what the impacts of additional oil spills may be -- especially in species like snapper that are so critical to the Gulf economy, she said. "This is a unique study. Most investigations of oil spill effects only last a year or two, and this study gives us both a wide scale of reference across the Gulf and also long-term monitoring, which we lacked prior to Deepwater Horizon," said Steve Murawski, PhD, senior author on the study. Murawski, a professor and the St. Petersburg Downtown Partnership Peter R. Betzer Endowed Chair at the USF College of Marine Science, led the 10-year research effort (C-IMAGE) in response to the Deepwater Horizon oil spill. "There is a story we can tease out of the data," Murawski said. "The observed decline in oil exposure in red snapper in the few years following the Deepwater Horizon accident suggests the high levels measured in earlier years were a direct impact from the spill. Its legacy continues, and we'd be wise to continue the critical research ironically made possible by long-term monitoring post-disaster." The study was supported by The Gulf of Mexico Research Initiative and The Center for the Integrated Modeling and Analysis of the Gulf Ecosystem (C-IMAGE I, II, and III).
Pollution
2021
March 9, 2021
https://www.sciencedaily.com/releases/2021/03/210309132619.htm
Researchers modify air quality models to reflect polluted reality in Latin America
Computational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption in Latin America. New work from North Carolina State University and Universidad de La Salle demonstrates how these models can be adapted to offer practical insights into air quality challenges in the Americas outside the U.S.
Computational air quality models can be used in multiple ways. For example, they can be used to determine which sources are responsible for what fraction of air pollution. They can also help authorities predict how air pollution might change if different pollution control methods are adopted. "Historically, it's been very challenging to apply these modeling tools in Latin America, so it has rarely been done," says Fernando Garcia Menendez, corresponding author of a paper on the work and an assistant professor of environmental engineering at NC State. "This is important because the region has many areas that are dealing with significant air pollution, and these modeling tools can help governments identify the most cost-effective ways of achieving air quality improvements." One challenge to using computational air quality models in Latin America is that the relevant modeling frameworks were developed largely in the context of the U.S. and Europe. That means that some of the assumptions that modelers took for granted when developing the tools don't always apply in Latin American cities. Furthermore, computational resources and trained environmental modelers are still scarce in the region. For example, there are often substantially less air emissions data available. In addition, there are some contributors to air pollution that are common across Latin American metro areas, but that differ from what we see in the U.S. -- more unpaved roads, an older cargo fleet, a large number of motorcycles, informal economies, and so on. With that in mind, Garcia Menendez developed a research project with collaborators at the Universidad de La Salle, in Bogotá, Colombia. Specifically, the research team fine-tuned a modeling framework to reflect the air pollution dynamics in Bogotá and investigate the city's air quality problems. The collaborators at Universidad de La Salle also collected air pollution data that allowed the team to assess the accuracy of its modeling results. "Our paper outlines the techniques we've used to perform computational modeling of air quality issues in a large Latin American city," says James East, first author of the paper and a Ph.D. student at NC State. "This not only demonstrates that it can be done, but provides an approach that others can use to provide insights into air pollution in other parts of the region that are experiencing similar issues." While the paper focuses on an air quality model for fine particulate matter (PM2.5), the researchers say that the model could be used to look at other air pollutants. Exposure to PM2.5 is associated with a wide variety of health problems, including heart and lung disease. In their proof-of-concept demonstration, the researchers found that the largest local sources of PM2.5 in Bogotá were dust from unpaved roads and emissions from heavy-duty vehicles.
However, when the model was used to project future air quality, the study also found that while paving roads would decrease air pollution in some parts of the city, different emission sources would still lead to increased air pollution in other parts of the city -- unless other emission control measures were also implemented. In short, the model offered practical insights into possible solutions for a complex metropolitan area of 10 million people. "These findings are of interest to environmental authorities, from the local to the national level, who are pursuing ways to effectively address air pollution in Bogotá and other Colombian cities," says Jorge Pachon, a co-author of the paper and an associate professor at the Universidad de La Salle.
Pollution
2021
March 9, 2021
https://www.sciencedaily.com/releases/2021/03/210309132539.htm
Recyclable bioplastic membrane to clear oil spills from water
Polymer scientists from the University of Groningen and NHL Stenden University of Applied Sciences, both in the Netherlands, have developed a polymer membrane from biobased malic acid. It is a superamphiphilic vitrimer epoxy resin membrane that can be used to separate water and oil. This membrane is fully recyclable. When the pores are blocked by foulants, it can be depolymerized, cleaned and subsequently pressed into a new membrane. A paper describing the creation of this membrane has now been published.
How do you clean up an oil spill in water? This is quite a challenge. Superamphiphilic membranes, which 'love' both oil and water, are a promising solution but not yet a very practical one. These membranes are often not robust enough for use outside the laboratory environment and the membrane pores can clog up as a result of fouling by algae and sand. Chongnan Ye and Katja Loos from the University of Groningen and Vincent Voet and Rudy Folkersma from NHL Stenden used a relatively new type of polymer to create a membrane that is both strong and easy to recycle. In recent years, the researchers from both institutes have joined forces to investigate vitrimer plastics, polymer materials that have the mechanical properties and chemical resistance of a thermoset plastic. However, vitrimer plastics can also behave like a thermoplastic, since they can be depolymerized and reused. This means that a vitrimer plastic has all the qualities to make a good membrane for oil spill remediation. 'Furthermore, it was made from malic acid, a natural monomer,' adds Loos. 'The polymers in the vitrimer are crosslinked in a reversible manner,' explains Voet. 'They form a dynamic network, which enables recycling of the membrane.' The vitrimer is produced through base-catalysed ring-opening polymerization between pristine and epoxy-modified biobased malic acid. The polymers are ground into a powder by ball milling and turned into a porous membrane through the process of sintering. Both water and oil will spread out on the resulting superamphiphilic membrane. In an oil spill, much more water is present than oil, which means that the membrane is covered by water that can then pass through the pores. Voet: 'The water film on the membrane's surface keeps the oil out of the pores so that it is separated from the water.' The membrane is firm enough to filter oil from the water. When sand and algae clog up the pores, the membrane can be depolymerized and recreated from the building blocks after removal of the pollutants. 'We have tested this on a laboratory scale of a few square centimetres,' says Loos. 'And we are confident that our methods are scalable, both for the polymer synthesis and for the production and recycling of the membrane.' The scientists are hoping that an industrial partner will take up further development. Creating this new membrane for oil spill remediation shows the power of cooperation between a research university and a university of applied sciences. 'A while ago, we decided that the polymer groups at the two institutes should become one, by sharing students, staff and facilities. We recently started the first hybrid research group in the Netherlands,' explains Loos. This makes it easier to find applications for newly designed materials. Voet: 'Polymer chemists strive to link molecular structures to material properties and applications. Our hybrid research team has the experience to do just that.'
Pollution
2021
March 9, 2021
https://www.sciencedaily.com/releases/2021/03/210309114357.htm
Strategic air purifier placement reduces virus spread within music classrooms
The University of Minnesota School of Music was concerned about one-on-one teaching during the COVID-19 pandemic and wondered if it should supplement its ventilation system with portable HEPA air purifiers.
So, school officials reached out to Suo Yang, a professor within the College of Science and Engineering, and his team to figure it out. In Physics of Fluids, from AIP Publishing, Yang and the researchers describe their work to predict how virus particles spread within a music classroom. "The airborne transmission of COVID-19 through droplets or droplet nuclei can be modeled as a typical particle-laden flow, and simulation work needs to accurately capture the movement of particles within the turbulent flow environment," said Yang. "Particle deposition onto the walls and other surfaces also needs to be captured accurately, and we wanted to explore whether particles are trapped in some recirculation zones, ventilated out, or removed by an air purifier." Their simulations suggest that a HEPA air purifier should be placed directly in front of the person or instrument expelling aerosols and located away from other people who are to be protected within the room. The researchers' goal was a ventilation rate of 288 cubic meters per hour per person within the room, per World Health Organization guidelines. To their knowledge, this is the most comprehensive investigation of using HEPA air purifiers to combat COVID-19. "The key thing is that while portable purifiers can be very helpful, their placement is critical," said Yang. It is absolutely critical to not spread aerosols further by inducing a very dispersive flow, according to Sai Ranjeet Narayanan, a graduate research assistant working with Yang. One big surprise for the researchers was that portable HEPA air purifiers reduce in-air aerosols much faster than ventilation alone -- offering orders of magnitude higher aerosol removal compared to not using a purifier. If the flow rate of the HVAC system is known, this work can serve as a guideline for the size and number of portable purifiers needed, as well as where to place them. "Our work also provides a guideline for how long of a break between classes is necessary to ensure the safety of the next class, which is about 25 minutes for this case," said Yang. "And it provides an analytic function, found to be a linear function, to make predictions for the aerosol removal, deposition, or suspension rate in almost any scenario: breathing, speaking, singing, coughing, sneezing, different wind instruments, with and without masks." In the future, "we plan to model larger domains with more people, such as bands, orchestras, or any group gathering," Narayanan said. "And we may also compare the performance of different types of purifiers."
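The scaling behind such guidance can be illustrated with a simple well-mixed-room calculation. The sketch below is not the study's simulation; the room volume and purifier clean-air delivery rate are illustrative assumptions, and the aerosol load is treated as decaying exponentially at the combined air-change rate.

    # A minimal sketch, not the study's simulation: first-order aerosol
    # clearance in a well-mixed room. All numbers are illustrative assumptions.
    import math

    room_volume = 150.0    # m^3, assumed size of a small teaching room
    hvac_flow = 288.0      # m^3/h, the WHO-guideline rate cited in the article
    purifier_cadr = 400.0  # m^3/h, assumed clean-air delivery rate of a purifier

    # Combined air-change rate from ventilation plus filtration (per hour).
    ach = (hvac_flow + purifier_cadr) / room_volume

    # Time for a well-mixed aerosol load to decay to 1% of its initial value:
    # C(t) = C0 * exp(-ach * t)  =>  t = ln(100) / ach
    t_clear = math.log(100.0) / ach
    print(f"Effective air changes per hour: {ach:.1f}")
    print(f"Time to clear 99% of aerosols: ~{t_clear * 60:.0f} minutes")

Under these assumed numbers the room needs roughly an hour to clear; the 25-minute figure quoted above reflects the specific classroom and purifier configuration the team simulated.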
Pollution
2021
March 9, 2021
https://www.sciencedaily.com/releases/2021/03/210309105200.htm
Safe, simple additive could cut agrochemical pollution
Adding a simple polymer to fertilizers or pesticides could dramatically reduce agricultural pollution, suggests a new study by researchers at the University of British Columbia.
When agrochemicals are sprayed onto crops, a large amount typically ends up in the surrounding environment due to droplets splashing, rebounding or rolling off the target plants. This amount could be cut at least in half by mixing fertilizers and pesticides with a small quantity of polyethylene oxide, a common polymer additive that improves the ability of agrochemical solutions to stick to plant surfaces, the study found. "Other studies have explored ways to decrease the loss of agrochemicals to the environment," says John Frostad, the study lead and a chemical and biological engineering professor at UBC. "But this is the first to quantify the results using realistic spray conditions that can be translated directly from the lab to field applications." To conduct the study, Frostad and his colleagues built a lab-scale device that allows liquids to be sprayed onto surfaces through real agricultural nozzles. The device also enables users, for the first time, to measure precisely how much liquid remains on a surface after it has been sprayed at industrial pressures and deposition rates in a laboratory setting. The team found that combining a fertilizer solution with a minuscule amount of polyethylene oxide -- an environmentally safe polymer widely used in cosmetics and biomedical applications -- significantly enhanced the fertilizer's stickiness. In fact, the additive nearly eliminated splashing, bouncing or rolling by droplets when they came into contact with plant surfaces, reducing the percentage of fertilizer that entered the surrounding environment from 30 per cent to just five. "Using this device, researchers can measure exactly how effective different additives are at improving retention," says Frostad. "New formulations of agrochemicals that include these additives could allow crops to be sprayed more efficiently, cutting both environmental pollution caused by agrochemicals and the amount of chemicals that need to be used in the first place."
Pollution
2021
March 5, 2021
https://www.sciencedaily.com/releases/2021/03/210305123815.htm
Eight ways chemical pollutants harm the body
A new review of existing evidence proposes eight hallmarks of environmental exposures that chart the biological pathways through which pollutants contribute to disease: oxidative stress and inflammation, genomic alterations and mutations, epigenetic alterations, mitochondrial dysfunction, endocrine disruption, altered intercellular communication, altered microbiome communities, and impaired nervous system function.
The study was conducted by researchers at Columbia University Mailman School of Public Health, Ludwig Maximilian University, and Hasselt University. "Every day we learn more about how exposure to pollutants in air, water, soil, and food is harmful to human health," says senior author Andrea Baccarelli, MD, PhD, chair of Environmental Health Sciences at Columbia Mailman School. "Less understood, however, are the specific biological pathways through which these chemicals inflict damage on our bodies. In this paper, we provide a framework to understand why complex mixtures of environmental exposures bring about serious illness even at relatively modest concentrations." We are continually exposed to a mixture of pollutants, which lead to changes in our bodies in multiple domains, from conception to old age. They govern gene expression, train and shape our immune systems, trigger physiological responses, and determine wellbeing and disease. The paper summarizes evidence for eight hallmarks of environmental insults:
1. Oxidative stress and inflammation: When antioxidant defenses are depleted, inflammation, cell death, and organ damage occur.
2. Genomic alterations and mutations: An accumulation of DNA errors can trigger cancer and other chronic diseases.
3. Epigenetic alterations: Epigenetic changes alter the synthesis of proteins responsible for childhood development and regular function of the body.
4. Mitochondrial dysfunction: A breakdown in the cellular powerplant may interfere with human development and contribute to chronic disease.
5. Endocrine disruption: Chemicals found in our environment, food, and consumer products disrupt the regulation of hormones and contribute to disease.
6. Altered intercellular communication: Signaling receptors and other means by which cells communicate with each other, including neurotransmission, are affected.
7. Altered microbiome communities: An imbalance in the population of bacteria and other microorganisms in our body can make us susceptible to allergies and infections.
8. Impaired nervous system function: Microscopic particles in air pollution reach the brain through the olfactory nerve and can interfere with cognition.
Not all environmental exposures are harmful. The researchers note that exposure to nature has been reported to have beneficial impacts on mental health. These eight hallmarks are by no means comprehensive and do not capture the full complexity of the chemical and physical properties of environmental exposures, including mixtures of exposures over the short and long term. Further research is needed to understand the complex mechanisms by which exposures affect human biology, and how altered processes interact and contribute to disease or confer health benefits, across the life course. "We need research to expand our knowledge of disease mechanisms going beyond genetics. Advances in biomedical technologies and data science will allow us to delineate the complex interplay of environmental insults down to the single-cell level," says Baccarelli. "This knowledge will help us develop ways to prevent and treat illness. With the serious environmental challenges like air pollution and climate change, most of all, we need strong local, national, and inter-governmental policies to ensure healthy environments."
Pollution
2021
March 5, 2021
https://www.sciencedaily.com/releases/2021/03/210305080124.htm
Fine particulate matter from wildfire smoke more harmful than pollution from other sources
Researchers at Scripps Institution of Oceanography at UC San Diego examining 14 years of hospital admissions data conclude that the fine particles in wildfire smoke can be several times more harmful to human respiratory health than particulate matter from other sources such as car exhaust. While this distinction has been previously identified in laboratory experiments, the new study confirms it at the population level.
This new research work, focused on Southern California, reveals the risks of tiny airborne particles with diameters of up to 2.5 microns, about one-twentieth that of a human hair. These particles -- termed PM2.5 -- are the main component of wildfire smoke and can penetrate the human respiratory tract, enter the bloodstream and impair vital organs. The study appears March 5. To isolate wildfire-produced PM2.5 from other sources of particulate pollution, the researchers defined exposure to wildfire PM2.5 as exposure to strong Santa Ana winds with fire upwind. A second measure of exposure involved smoke plume data from NOAA's Hazard Mapping System. A 10 microgram-per-cubic-meter increase in PM2.5 attributed to sources other than wildfire smoke was estimated to increase respiratory hospital admissions by 1 percent. The same increase, when attributed to wildfire smoke, caused a 1.3 to 10 percent increase in respiratory admissions. Corresponding author Rosana Aguilera said the research suggests that assuming all particles of a certain size are equally toxic may be inaccurate and that the effects of wildfires -- even at a distance -- represent a pressing human health concern. "There is a daily threshold for the amount of PM2.5 in the air that is considered acceptable by the county and the Environmental Protection Agency (EPA)," said Aguilera, a postdoctoral scholar at Scripps Institution of Oceanography. "The problem with this standard is that it doesn't account for different sources of emission of PM2.5." As of now, there is no consensus as to why wildfire PM2.5 is more harmful to humans than other sources of particulate pollution. If PM2.5 from wildfires is more dangerous to human lungs than that of ambient air pollution, the threshold for what are considered safe levels of PM2.5 should reflect the source of the particles, especially during the expanding wildfire season. This is especially relevant in California and other regions where most PM2.5 is expected to come from wildfires. In Southern California, the Santa Ana winds drive the most severe wildfires and tend to blow wildfire smoke towards populated coastal regions. Climate change delays the start of the region's rainy season, which pushes wildfire season closer to the peak of the Santa Ana winds in early winter. Additionally, as populations grow in wildland-urban interface areas, the risks of ignitions and the impacts of wildfire and smoke increase for those who live inland and downwind. Coauthor Tom Corringham points to the implications for climate change: "As conditions in Southern California become hotter and drier, we expect to see increased wildfire activity. This study demonstrates that the harm due to wildfire smoke may be greater than previously thought, bolstering the argument for early wildfire detection systems and efforts to mitigate climate change."
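The reported effect sizes translate directly into expected admission counts. The following sketch applies the study's percentages to a hypothetical baseline; the baseline admission count is an assumption for illustration, not a figure from the paper.

    # A minimal sketch applying the article's effect sizes to an assumed
    # baseline; the baseline of 100 daily admissions is hypothetical.
    baseline_admissions = 100.0   # respiratory admissions/day (assumed)
    delta_pm25 = 10.0             # ug/m^3 increase, the increment used in the study

    effect_per_10ug = {
        "non-wildfire PM2.5": 0.01,        # +1% per 10 ug/m^3
        "wildfire PM2.5, low end": 0.013,  # +1.3% per 10 ug/m^3
        "wildfire PM2.5, high end": 0.10,  # +10% per 10 ug/m^3
    }

    for source, relative_increase in effect_per_10ug.items():
        extra = baseline_admissions * relative_increase * (delta_pm25 / 10.0)
        print(f"{source}: ~{extra:.1f} extra admissions/day")

On these assumed numbers, the same 10 ug/m^3 increment yields roughly one extra daily admission if the particles are not from wildfire smoke, but up to ten if they are.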
Pollution
2021
March 4, 2021
https://www.sciencedaily.com/releases/2021/03/210304161117.htm
Apparent Atlantic warming cycle likely an artifact of climate forcing
Volcanic eruptions, not natural variability, were the cause of an apparent "Atlantic Multidecadal Oscillation," a purported cycle of warming thought to have occurred on a timescale of 40 to 60 years during the pre-industrial era, according to a team of climate scientists who looked at a large array of climate modeling experiments.
The result complements the team's previous finding that what had looked like an "AMO" occurring during the period since industrialization is instead the result of a competition between steady human-caused warming from greenhouse gases and cooling from more time-variable industrial sulphur pollution. "It is somewhat ironic, I suppose," said Michael E. Mann, distinguished professor of atmospheric science and director, Earth System Science Center, Penn State. "Two decades ago, we brought the AMO into the conversation, arguing that there was a long-term natural, internal climate oscillation centered in the North Atlantic based on the limited observations and simulations that were available then, and coining the term 'AMO.' Many other scientists ran with the concept, but now we've come full circle. My co-authors and I have shown that the AMO is very likely an artifact of climate change driven by human forcing in the modern era and natural forcing in pre-industrial times." The researchers previously showed that the apparent AMO cycle in the modern era was an artifact of industrialization-driven climate change, specifically the competition between warming over the past century from carbon pollution and an offsetting cooling factor, industrial sulphur pollution, that was strongest from the 1950s through the passage of the Clean Air Acts in the 1970s and 1980s. But they then asked, why do we still see it in pre-industrial records? Their conclusion, reported today (Mar. 5), is that the apparent pre-industrial oscillation was driven by volcanic eruptions. "Some hurricane scientists have claimed that the increase in Atlantic hurricanes in recent decades is due to the uptick of an internal AMO cycle," said Mann. "Our latest study appears to be the final nail in the coffin of that theory. What has in the past been attributed to an internal AMO oscillation is instead the result of external drivers, including human forcing during the industrial era and natural volcanic forcing during the pre-industrial era." The researchers looked at state-of-the-art climate model simulations of preindustrial times over the past thousand years, both forced simulations in which external factors such as solar and volcanic drivers were included, and unforced "control" simulations in which no external drivers were applied and any changes that happen are internally generated. When they looked at simulations for the short, 3- to 7-year El Niño Southern Oscillation (ENSO) cycles, they found that these cycles occurred in the models without adding forcing by climate change, volcanic activity, or anything else. However, when they looked for the AMO, it did not occur in the unforced model and only appeared in modern times using climate change variables as forcing and in preindustrial times with forcing by volcanic eruptions. "The models do show intrinsic internal oscillations on a 3- to 7-year time scale characteristic of the established El Niño phenomenon, but nothing on the multi-decadal scale that would be the AMO," said Byron A. Steinman, associate professor of Earth and environmental sciences, University of Minnesota Duluth, who was also on the project. "What we know is an oscillation like El Niño is real, but the AMO is not." Mann suggested that while some influential scientists continue to dismiss certain climate change trends as the result of a supposed internal AMO climate cycle, the best available scientific evidence does not support the existence of such a cycle. Other researchers from Penn State on this project were Daniel J. Brouillette and Sonya K. Miller, both researchers in meteorology. The National Science Foundation partially funded this research.
Pollution
2021
March 3, 2021
https://www.sciencedaily.com/releases/2021/03/210303081403.htm
How 'green' are environmentally friendly fireworks?
Fireworks are used in celebrations around the world, including Independence Day in the U.S., the Lantern Festival in China and the Diwali Festival in India. However, the popular pyrotechnic displays emit large amounts of pollutants into the atmosphere, sometimes causing severe air pollution. Now, researchers have assessed just how "green" environmentally friendly fireworks really are.
Fireworks displays can cause health problems, such as respiratory ailments, because they release high levels of air pollutants, including particulate matter (PM), sulfur dioxide, heavy metals and perchlorates. As a result, some cities have banned their use. But because the displays are an important aspect of many traditional celebrations, researchers and manufacturers have tried to develop more environmentally friendly pyrotechnics, including those with smokeless charges and sulfur-free propellants. Although research suggests that these fireworks emit fewer pollutants, their impact on air quality has not been evaluated. Ying Li and colleagues wanted to use data from a large fireworks display held in Shenzhen, China, on Chinese National Day on Oct. 1, 2019, to assess how "green" these fireworks really are. The researchers estimated emissions of PM from the display.
Pollution
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302154243.htm
Indoors, outdoors, 6 feet apart? Transmission risk of airborne viruses can be quantified
In the 1995 movie "Outbreak," Dustin Hoffman's character realizes, with appropriately dramatic horror, that an infectious virus is "airborne" because it's found to be spreading through hospital vents.
The issue of whether our real-life pandemic virus, SARS-CoV-2, is "airborne" is predictably more complex. The current body of evidence suggests that COVID-19 primarily spreads through respiratory droplets -- the small, liquid particles you sneeze or cough, that travel some distance, and fall to the floor. But consensus is mounting that, under the right circumstances, smaller floating particles called aerosols can carry the virus over longer distances and remain suspended in air for longer periods. Scientists are still determining SARS-CoV-2's favorite way to travel. That the science was lacking on how COVID-19 spreads seemed apparent a year ago to Tami Bond, professor in the Department of Mechanical Engineering and Walter Scott, Jr. Presidential Chair in Energy, Environment and Health. As an engineering researcher, Bond spends time thinking about the movement and dispersion of aerosols, a blanket term for particles light and small enough to float through air -- whether cigarette smoke, sea spray, or hair spray. "It quickly became clear there was some airborne component of transmission," Bond said. "A virus is an aerosol. Health-wise, they are different than other aerosols like pollution, but physically, they are not. They float in the air, and their movement depends on their size." The rush for scientific understanding of the novel coronavirus has focused -- understandably -- on biological mechanisms: how people get infected, the response of the human body, and the fastest path to a vaccine. As an aerosol scientist, Bond went a different route, convening a team at Colorado State University that would treat the virus like any other aerosol. This team's findings are now published. The cross-section of expertise to answer this question existed in droves at CSU, Bond found. The team she assembled includes epidemiologists, aerosol scientists, and atmospheric chemists, and together they created a new tool for defining how infectious pathogens, including SARS-CoV-2, are transported through the air. Their tool is a metric they're calling Effective Rebreathed Volume: the amount of exhaled air from one person that, by the time it travels to the next person, contains the same number of particles. Treating virus-carrying particles agnostically like any other aerosol allowed the team to make objective, physics-based comparisons between different modes of transmission, accounting for how the sizes of particles would affect the number of particles that traveled from one person to another. They looked at three size categories of particles that cover a biologically relevant range: 1 micron, 10 microns, and 100 microns -- about the width of a human hair. Larger droplets expelled by sneezing would be closer to the 100-micron region. Particles closer to the size of a single virion would be in the 1-micron region. Each has very different air-travel characteristics, and depending on the size of the particles, different mitigation measures would apply, from opening a window to increasing fresh air delivery through an HVAC system. They compiled a set of models to compare different scenarios. For example, the team compared the effective rebreathed volume of someone standing outdoors 6 feet away to how long it would take someone to rebreathe the same amount of air indoors but standing farther away. The team found that distancing indoors, even 6 feet apart, isn't enough to limit potentially harmful exposures, because confinement indoors allows particle volumes to build up in the air.
Such insights aren't revelatory, in that most people avoid confinement in indoor spaces and generally feel safer outdoors. What the paper shows, though, is that the effect of confinement indoors and subsequent particle transport can be quantified, and it can be compared to other risks that people find acceptable, Bond said. Co-authors Jeff Pierce in atmospheric science and Jay Ham in soil and crop sciences helped the team understand atmospheric turbulence in ways that could be compared in indoor and outdoor environments. Pierce said he sought to constrain how the virus-containing particles disperse as a function of distance from the emitting person. When the pandemic hit last year, the public had many questions about whether it was safe to run or bike on trails, Pierce said. The researchers found that longer-duration interactions outdoors at greater than 6-foot distances appeared safer than similar-duration indoor interactions, even if people were farther apart indoors, due to particles filling the room rather than being carried away by wind. "We started fairly early on in the pandemic, and we were all filled with questions about: 'Which situations are safer than others?' Our pooled expertise allowed us to find answers to this question, and I learned a lot about air filtration and air exchange in my home and in my CSU classroom," Pierce said. What remains unclear is which size particles are most likely to cause COVID-19 infection. Viruses can be carried on droplets large and small, but there is likely a "sweet spot" between droplet size, ability to disperse and remain airborne, and desiccation time, all of which factor into infective potential, explained Angela Bosco-Lauth, paper co-author and assistant professor in biomedical sciences. The paper includes an analysis of the relative infection risk of different indoor and outdoor scenarios and mitigation measures, depending on the numbers of particles being inhaled. "The problem we face is that we still don't know what the infectious dose is for people," Bosco-Lauth said. "Certainly, the more virus present, the higher the risk of infection, but we don't have a good model to determine the dose for people. And quantifying infectious virus in the air is tremendously difficult." The team is now pursuing follow-up questions, like comparing different mitigation measures for reducing exposures to viruses indoors. Some of these inquiries fall into the category of "stuff you already know, but with numbers," Bond said. "People are now thinking, OK, more ventilation is better, or remaining outside is better, but there is not a lot of quantification and numbers behind those recommendations," Bond said. Bond hopes the team's work can lay a foundation for more up-front quantification of transmission dynamics in the unfortunate event of another pandemic. "This time, there was a lot of guessing at the beginning, because the science of transmission wasn't fully developed," she said. "There shouldn't be a next time."
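To make the metric concrete, here is a deliberately simplified, steady-state sketch in the spirit of effective rebreathed volume. It is not the paper's model: it assumes a single well-mixed room, one exhaling occupant, and illustrative breathing and ventilation rates.

    # A minimal, hedged sketch (not the paper's model): how much of another
    # person's exhaled air you inhale in a well-mixed room at steady state.
    breathing_rate = 0.5        # m^3/h inhaled by the exposed person (typical resting value)
    exhaled_rate = 0.5          # m^3/h exhaled by the emitting person
    outdoor_air_supply = 300.0  # m^3/h of outdoor air delivered to the room (assumed)
    exposure_hours = 1.0

    # At steady state, this fraction of room air is the emitter's exhaled breath.
    exhaled_fraction = exhaled_rate / outdoor_air_supply

    # Volume of rebreathed air inhaled over the exposure period, in liters.
    rebreathed_liters = breathing_rate * exhaled_fraction * exposure_hours * 1000.0
    print(f"~{rebreathed_liters:.1f} L of rebreathed air per hour indoors")

Outdoors there is no fixed air volume for exhaled breath to accumulate in, so the equivalent fraction falls off rapidly with distance, which is the intuition the paper quantifies.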
Pollution
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302130720.htm
Dethroning electrocatalysts for hydrogen production with inexpensive alternative material
Today, we can say without a shadow of doubt that an alternative to fossil fuels is needed. Fossil fuels are not only non-renewable sources of energy but also among the leading causes of global warming and air pollution. Thus, many scientists worldwide have their hopes placed on what they regard as the fuel of tomorrow: hydrogen (H2). Although H2 is a clean fuel with incredibly high energy density, efficiently generating large amounts of it remains a difficult technical challenge.
Water splitting -- the breaking of water molecules -- is among the most explored methods to produce H2. While there are many ways to go about it, the best-performing water splitting techniques involve electrocatalysts made from expensive metals, such as platinum, ruthenium, and iridium. The problem lies in that known electrocatalysts made from abundant metals are rather ineffective at the oxygen evolution reaction (OER), the most challenging aspect of the water-splitting process. In a recent study, a research team set out to find a more affordable alternative. To this end, the team tested six kinds of iron-based oxides, including CaFe2O4. They soon found that the OER performance of CaFe2O4 was vastly greater than that of other bimetallic electrocatalysts and even higher than that of iridium oxide, a widely accepted benchmark. Additionally, they tested the durability of this promising material and found it to be remarkably stable; no significant structural or compositional changes were seen after measurement cycles, and the performance of the CaFe2O4 electrode in the electrochemical cell remained high. Eager to understand the reason behind the exceptional capabilities of this unexplored electrocatalyst, the scientists carried out calculations using density functional theory and discovered an unconventional catalytic mechanism. It appears that CaFe2O4 offers an energetically favorable pathway for the formation of oxygen bonds, which is a limiting step in the OER. Although more theoretical calculations and experiments will be needed to be sure, the results indicate that the close distance between multiple iron sites plays a key role. The newly discovered OER electrocatalyst could certainly be a game changer, as Dr Sugawara remarks: "CaFe2O4 has many advantages, from its easy and cost-effective synthesis to its environmental friendliness. We expect it will be a promising OER electrocatalyst for water splitting and that it will open up a new avenue for the development of energy conversion devices." In addition, the new OER-boosting mechanism found in CaFe2O4 could lead to the engineering of other useful catalysts. Let us hope these findings help pave the way to the much-needed hydrogen society of tomorrow!
Pollution
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302130659.htm
Rice variety resists arsenic
The agricultural cultivation of rice, a staple food, harbours the risk of contamination with arsenic, which can reach the grains following uptake by the roots. In their investigation of over 4,000 variants of rice, a Chinese-German research team under the direction of Prof. Dr Rüdiger Hell from the Centre for Organismal Studies (COS) of Heidelberg University and Prof. Dr Fang-Jie Zhao of Nanjing Agricultural University (China) discovered a plant variant that resists the toxin. Although the plants thrive in arsenic-contaminated fields, the grains contain far less arsenic than those of other rice plants. At the same time, this variant has an elevated content of the trace element selenium.
The researchers explain that especially in agricultural regions in Asia, increasing amounts of the metalloid arsenic get into the groundwater through large-scale fertilisation or wastewater sludge, for example. Because rice is cultivated in submerged fields, the plants absorb a good deal of arsenic through the roots, thus giving the potential carcinogen a pathway into the food chain. According to Prof. Hell, arsenic pollution in some soils in Asia is now so high that it is also causing significant crop losses because the arsenic is poisonous to the plants themselves. In the course of their research project, the scientists exposed over 4,000 rice variants to water containing arsenic and then observed their growth. Only one of the plants studied proved to be tolerant of the toxic metalloid. What biologically characterises the rice variant called astol1 is a so-called amino acid exchange in a single protein. "This protein is part of a sensor complex and controls the formation of the amino acid cysteine, which is an important component in the synthesis of phytochelatins. Plants form these detoxifying substances in response to toxic metals and thus neutralise them," explains Prof. Hell, who together with his research group at the COS is studying the function of this sensory complex. The neutralised arsenic is stored in the roots of the plant before it reaches the edible rice grains and can endanger humans. In the field study, astol1 rice grains absorbed one third less arsenic than conventional rice grains that were also exposed to arsenic-contaminated water. The researchers further discovered a 75 percent higher content of the essential trace element selenium, which is involved in the production of thyroid hormones in humans. As for yield, astol1 is just as good as the standard high-yield rice variants, making it especially suitable for agricultural use. "In future, rice plants like astol1 could be used in arsenic-contaminated regions to feed the population as well as help fight diet-related selenium deficiency," states Dr Sheng-Kai Sun with optimism. The junior researcher was instrumental in discovering the rice variant during the course of his PhD work at Nanjing Agricultural University. Thanks to a scholarship from the Alexander von Humboldt Foundation, he has been working since last year at the Centre for Organismal Studies in the groups of Prof. Hell and Dr Markus Wirtz to investigate the sensor complex causing the astol1 phenotype. The basic research into this sensor complex is being funded by the German Research Foundation.
Pollution
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302130636.htm
'Canary in the mine' warning follows new discovery of effects of pollutants on fertility
New research has found that shrimp-like creatures on the South Coast of England have 70 per cent less sperm than those in less polluted locations elsewhere in the world. The research also discovered that individuals living in the survey area are six times less numerous per square metre than those living in cleaner waters.
This discovery is published today. Professor Alex Ford, Professor of Biology at the University of Portsmouth, says: "We normally study the effect of chemicals on species after the water has been treated. The shrimp that we have tested are often in untreated water. The study site suffers from storm water surges, which are likely to become more common with climate change. This means that the creatures could be exposed to lots of different contaminants via sewage, historical landfills and legacy chemicals such as those in antifouling paints. There is a direct relationship between the incidence of high rainfall events and the levels of untreated sewage." Professor Ford describes the shrimp as "the canary in the mine" -- concerned that the plight of the shrimp is only just the tip of the iceberg in terms of fertility problems in male creatures, both great and small. "It is thought that some male fertility problems are related to pollution," said Professor Ford. "It may not be the same pollutants, but it is all chemicals that are being released into the environment. It is not being stopped and, more importantly, the effects are not being properly monitored or understood." Most male fertility research has historically focused on vertebrate species. Very little is known about the effects of pollution on invertebrate fertility, especially those amphipods at the bottom of the food chain. A decade ago, University of Portsmouth scientists observed little shrimp with very low sperm counts in nearby Langstone Harbour. Surprised by such a result, they decided to monitor the animals over the next 10 years. When Marina Tenório Botelho, a University of Portsmouth PhD student, couldn't continue with her lab-based research due to COVID restrictions, she was given the task of data mining the decade's worth of statistics. Her routine study uncovered a worrying reality: these animals have consistently low sperm counts, similar to those in areas that are industrially contaminated. Professor Ford explains that other marine creatures are also suffering: "We know that pollutants are affecting male fertility levels of all species. Killer whales around our coasts are contaminated with so many pollutants that some can't reproduce. Recent studies have also suggested that harbour porpoises contaminated with highly toxic industrial compounds, known as polychlorinated biphenyls (PCBs), have smaller testes." Researchers at the University of Portsmouth believe this new study feeds into wider studies on male fertility. Professor Ford says: "Researchers have been looking at worldwide declines in sperm counts of humans over the past 50 years. Research has shown that in some countries, a boy born today will have half the sperm count of his grandfather, and there are fears boys are getting critically close to being infertile." Marina Tenório Botelho's research also showed that female shrimp produce fewer eggs and appear in low densities in the same waters. It suggests that because male shrimps' capacity to fertilise females is compromised, the females in turn have fewer eggs. Scientists are concerned this could lead to a population collapse in the area, which would have a knock-on effect on the rest of the food chain. Less food to go round would also eventually mean fewer birds and fish in the region.
Pollution
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302130628.htm
Indoor air quality study shows aircraft in flight may have lowest particulate levels
If you're looking for an indoor space with a low level of particulate air pollution, a commercial airliner flying at cruising altitude may be your best option. A newly reported study of air quality in indoor spaces such as stores, restaurants, offices, public transportation -- and commercial jets -- shows aircraft cabins with the lowest levels of tiny aerosol particles.
Conducted in July 2020, the study included monitoring both the number of particles and their total mass across a broad range of indoor locations, including 19 commercial flights in which measurements took place throughout departure and arrival terminals, the boarding process, taxiing, climbing, cruising, descent, and deplaning. The monitoring could not identify the types of the particles and therefore does not provide a direct measure of coronavirus exposure risk. "We wanted to highlight how important it is to have a high ventilation rate and clean air supply to lower the concentration of particles in indoor spaces," said Nga Lee (Sally) Ng, associate professor and Tanner Faculty Fellow in the School of Chemical and Biomolecular Engineering and the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. "The in-flight cabin had the lowest particle mass and particle number concentration." The study, believed to be the first to measure both size-resolved particle mass and number in commercial flights from terminal to terminal and a broad range of indoor spaces, has been accepted for publication in a peer-reviewed journal. As scientists learn more about transmission of the coronavirus, the focus has turned to aerosol particles as an important source of viral spread indoors. Infected people can spread the virus as they breathe, talk, or cough, creating particles ranging in size from less than a micron -- one millionth of a meter -- to 1,000 microns. The larger particles quickly fall out of the air, but the smaller ones remain suspended. "Especially in poorly ventilated spaces, these particles can be suspended in the air for a long period of time, and can travel to every corner of a room," Ng said. "If they are viral particles, they can infect people who may be at a considerable distance from a person emitting the particles." To better understand the circulation of airborne particles, Delta approached Ng to conduct a study of multiple indoor environments, with a strong focus on air travel conditions. Using handheld instruments able to measure the total number of particles and their mass, Georgia Tech researchers examined air quality in a series of Atlanta-area restaurants, stores, offices, homes, and vehicles -- including buses, trains, and private automobiles. They trained Delta staff to conduct the same type of measurements in terminals, boarding areas, and a variety of aircraft through all phases of flight. The Delta staff recorded their locations as they moved through the terminals, and the instruments produced measurements consistent with the restaurants and stores they passed on their way to and from boarding and departure gates. "The measurements started as soon as they stepped into the departure terminal," Ng said. "We were thinking about the whole trip, what a person would encounter from terminal to terminal." In flight, aircraft air is exchanged between 10 and 30 times per hour. Some aircraft bring in exclusively outside air, which at cruising altitude is largely free of the pollutant particles found in air near the ground. Other aircraft mix outdoor air with recirculated air that goes through HEPA filters, which remove more than 99% of particles. In all, the researchers evaluated measurements from 19 commercial flights with passenger loads of approximately 50%.
The flights included a mix of short- and medium-length flights, and aircraft ranging from the CRJ-200 and A220 to the 757, A321, and 737. Among all the spaces measured, restaurants had the highest particle levels because of the cooking being done there. Stores were next, followed by vehicles, homes, and offices. The average sub-micron particle number concentration measured in restaurants, for instance, was 29,400 particles per cubic centimeter, and in offices it was 2,473 per cubic centimeter. "We have quite a comprehensive data set to look at the size distribution of particles across these different spaces," Ng said. "We can now compare indoor air quality in a variety of different spaces." Because of the portable instruments used, the researchers were unable to determine the source of the particles, which could have included both biological and non-biological sources. "Further studies can include direct measurements of viral loads and tracing particle movements in indoor spaces," she added. Jonathan Litzenberger, Delta's managing director of Global Cleanliness Strategy, said the research helps advance the company's goals of protecting its customers and employees. "Keeping the air clean and safe during flight is one of the most foundational layers of protection Delta aims to provide to our customers and employees," he said. "We are always working to better understand the travel environment and confirm that the measures we are implementing are working." Overall, the study highlights the importance of improving indoor air quality as a means of reducing coronavirus transmission. "Regardless of whether you are in an office or an aircraft, having a higher ventilation rate and good particle filtration are the keys to reducing the total particle concentration," said Ng. "That should also reduce the concentration of any viral particles that may be present."
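The role of ventilation and filtration described above can be made concrete with a simple well-mixed box model. The sketch below is illustrative only and is not the instrumentation or analysis used in the Georgia Tech study; the particle source strength, volume, and recirculation fraction are assumed values, chosen purely to show why roughly 20 air changes per hour with HEPA filtration drives concentrations far below those of a typical office.

# Minimal well-mixed box model of indoor particle concentration.
# Illustrative sketch only: S, V, and the recirculation fraction are
# made-up values, not measurements from the study.

def steady_state_concentration(source_rate, volume, ach, filter_eff, recirc_frac):
    """Steady-state particle concentration (particles/cm^3) in a well-mixed space.

    source_rate -- particle emission into the space, particles per hour
    volume      -- space volume, cm^3
    ach         -- air changes per hour (total supply air)
    filter_eff  -- fraction of particles removed from recirculated air
    recirc_frac -- fraction of supply air that is recirculated
    """
    # Outside air displaces particles entirely; recirculated air removes
    # them only at the filter's efficiency.
    effective_removal = ach * ((1 - recirc_frac) + recirc_frac * filter_eff)
    return source_rate / (volume * effective_removal)

S = 1e12  # assumed particle source, particles/hour (illustrative)
V = 1e8   # assumed volume, cm^3 (100 m^3)

# Office at ~1 ACH with no filtration vs. a cabin at ~20 ACH where half the
# supply air is recirculated through a HEPA filter (>99% efficient, per the article).
office = steady_state_concentration(S, V, ach=1.0, filter_eff=0.0, recirc_frac=0.0)
cabin = steady_state_concentration(S, V, ach=20.0, filter_eff=0.99, recirc_frac=0.5)
print(f"office ~{office:.0f} /cm^3, cabin ~{cabin:.0f} /cm^3")
# With these assumptions the cabin concentration is roughly 20x lower,
# consistent with the article's point that high ventilation plus HEPA
# filtration is what keeps in-flight particle levels low.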
Pollution
2021
March 1, 2021
https://www.sciencedaily.com/releases/2021/03/210301151553.htm
COVID-19 lockdown highlights ozone chemistry in China
In early 2020, daily life in Northern China slammed to a halt as the region entered a strict period of lockdown to slow the spread of COVID-19. Emissions from transportation and industry plummeted. Emissions of nitrogen oxides (NOx) from fossil fuels fell by 60 to 70 percent.
And yet, environmental researchers noticed that ground-level ozone pollution in Beijing and the Northern China Plain skyrocketed during this period, despite the decrease in NOx, a precursor of ozone. The region is no stranger to severe ozone pollution, but until about five years ago most ozone events occurred during the summer. Recently, the ozone season in China has been getting longer, spreading into early spring and late winter. As it turns out, the COVID-19 lockdown can help explain why. Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Nanjing University of Information Science & Technology (NUIST) have found that another ingredient of ozone formation, volatile organic compounds (VOCs), may be to blame for the increase in winter ozone. The research has been published in a peer-reviewed journal. "The COVID-19 lockdown was an involuntary experiment in which the emissions decreased abruptly and a lot of ozone appeared suddenly," said Daniel J. Jacob, the Vasco McCoy Family Professor of Atmospheric Chemistry and Environmental Engineering at SEAS and co-corresponding author of the paper. Ozone is formed through a series of chemical reactions, starting with the oxidation of VOCs. This reaction forms chemical radicals, which drive reactions between NOx and VOCs to produce ozone in the presence of sunlight. In a previous study, researchers from SEAS and NUIST found that in the summertime, particulate matter (PM2.5) acts like a sponge for the radicals needed to generate ozone pollution, sucking them up and preventing them from producing ozone. In that paper, the researchers found that air pollution policies instituted by the Chinese government that reduced PM2.5 were causing an increase in harmful ground-level ozone pollution, especially in large cities. In this research, the team found that NOx plays a similar role in the wintertime, scavenging radicals and preventing them from forming ozone. As NOx levels decrease, either abruptly with lockdown or gradually with air pollution controls, more radicals become available to react with VOCs. This enhanced oxidation of VOCs is self-amplifying, because it produces still more radicals, and the process raises the ozone production efficiency of NOx. "The COVID-19 experience helps explain the trend of increasing ozone pollution in the late winter and spring in China," said Ke Li, a postdoctoral fellow at SEAS and first author of the study. "As NOx emissions have decreased, the ozone season in China is getting longer." The research highlights the need to better understand the sources and species of VOCs and to regulate their emissions. "VOC emission controls would stop the spread of the ozone season and have major benefits for public health, crop production, and particulate pollution," said Hong Liao, Professor at NUIST and co-corresponding author of this work. The paper was co-authored by Yulu Qiu, Lu Shen, Shixian Zhai, Kelvin H. Bates, Melissa P. Sulprizio, Shaojie Song, Xiao Lu, Qiang Zhang, Bo Zheng, Yuli Zhang, Jinqiang Zhang, Hyun Chul Lee, and Su Keun Ku. It was supported by NUIST through the Harvard-NUIST Joint Laboratory for Air Quality and Climate (JLAQC).
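The radical-scavenging mechanism described above implies a non-monotonic relationship between NOx levels and ozone production. The toy calculation below uses an assumed, textbook-style functional form, not the chemical model from the Harvard/NUIST paper, and its constants are arbitrary; it is included only to illustrate how cutting NOx in a NOx-saturated (VOC-limited) regime can increase ozone, as observed during the lockdown.

# Toy illustration of NOx-limited vs. NOx-saturated ozone production.
# The functional form and constants are assumptions for illustration,
# not the chemistry used in the study.

def ozone_production(nox, voc, k=1.0, scavenging=0.05):
    # Production rises with NOx at low NOx (NOx-limited), but at high NOx
    # radical scavenging dominates and production falls (NOx-saturated).
    return k * voc * nox / (1.0 + scavenging * nox ** 2)

for n in [1, 2, 5, 10, 20, 40, 60]:
    p = ozone_production(n, voc=10.0)
    print(f"NOx = {n:4d}  ->  relative O3 production = {p:6.2f}")
# Output rises to a peak near NOx ~ 5 and then falls. On the right-hand
# (NOx-saturated) side of that peak, reducing NOx -- as the lockdown did --
# *increases* ozone production, matching the winter ozone spike described above.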
Pollution
2021
March 1, 2021
https://www.sciencedaily.com/releases/2021/03/210301151529.htm
On calm days, sunlight warms the ocean surface and drives turbulence
In tropical oceans, a combination of sunlight and weak winds drives up surface temperatures in the afternoon, increasing atmospheric turbulence, according to unprecedented new observational data collected by an Oregon State University researcher.
The new findings could have important implications for weather forecasting and climate modeling, said Simon de Szoeke, a professor in OSU's College of Earth, Ocean, and Atmospheric Sciences and the lead author of the study. "The ocean warms in the afternoon by just a degree or two, but it is an effect that has largely been ignored," said de Szoeke. "We would like to know more accurately how often this is occurring and what role it may play in global weather patterns." The findings were just published in a peer-reviewed journal. Over land, afternoon warming can lead to atmospheric convection and turbulence and often produces thunderstorms. Over the ocean, the afternoon convection also draws water vapor from the ocean surface to moisten the atmosphere and form clouds. The warming over the ocean is more subtle and gets stronger when the wind is weak, said de Szoeke. De Szoeke's study of ocean warming began during a research trip in the Indian Ocean several years ago. The research vessel was equipped with Doppler lidar, a remote sensing technology similar to radar that uses a laser pulse to measure air velocity. That allowed researchers to collect measurements of the height and strength of the turbulence generated by the afternoon warming for the first time. Previous observations of the turbulence over the ocean had been made only by aircraft, de Szoeke said. "With lidar, we have the ability to profile the turbulence 24 hours a day, which allowed us to capture how these small shifts in temperature lead to air turbulence," he said. "No one has done this kind of measurement over the ocean before." Researchers gathered data from the lidar around the clock for about two months. At one point, surface temperatures warmed each afternoon for four straight days with calm wind speeds, giving researchers the right conditions to observe a profile of the turbulence created in this type of sea surface warming event. It took a "perfect storm" of conditions, including round-the-clock sampling by the lidar and a long ocean deployment, to capture these unprecedented observations, de Szoeke said. When sunlight warms the ocean surface in the afternoon, surface temperatures go up by a degree Celsius or more. This warming occurs during roughly 5% of days in the world's tropical oceans; averaged over time, that amounts to about 2% of the Earth's surface, about the equivalent of the size of the United States. The calm wind and warming air conditions occur in different parts of the ocean in response to weather conditions, including monsoons and Madden-Julian Oscillation, or MJO, events, which are ocean-scale atmospheric disturbances that occur regularly in the tropics. To determine the role these changing temperatures play in weather conditions in the tropics, weather models need to include the effects of surface warming, de Szoeke said. "There are a lot of subtle effects that people are trying to get right in climate modeling," de Szoeke said. "This research gives us a more precise understanding of what happens when winds are low." The research was supported by NOAA and the Office of Naval Research.
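As a quick sanity check on the area comparison above, the back-of-the-envelope calculation below uses standard reference values for the Earth's surface area and the area of the United States, plus an assumed tropical-ocean fraction; none of these numbers come from the study itself.

# Back-of-the-envelope check of the article's area comparison.
# All figures are standard reference values or stated assumptions.
EARTH_SURFACE_KM2 = 510.1e6     # total surface area of Earth, km^2
TROPICAL_OCEAN_FRACTION = 0.40  # rough fraction of Earth covered by tropical ocean (assumed)
WARM_DAY_FRACTION = 0.05        # "roughly 5% of days" from the article
US_AREA_KM2 = 9.8e6             # approximate total area of the United States, km^2

# Expected area experiencing afternoon warming on any given day:
affected = EARTH_SURFACE_KM2 * TROPICAL_OCEAN_FRACTION * WARM_DAY_FRACTION
print(f"Estimated warming area: {affected / 1e6:.1f} million km^2")    # ~10.2
print(f"US total area:          {US_AREA_KM2 / 1e6:.1f} million km^2") # ~9.8
# Both come out near 10 million km^2 (about 2% of Earth's 510 million km^2),
# consistent with "about the equivalent of the size of the United States."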
Pollution
2021