
Monday, March 23, 2015

Chlorine Use in Sewage Treatment Could Promote Antibiotic Resistance

Date: March 22, 2015

Source: American Chemical Society


Graduate student Nicole Kennedy measures the antibiotic activity of various samples in the lab.
Credit: Olya Keen

Chlorine, a disinfectant commonly used in most wastewater treatment plants, may be failing to completely eliminate pharmaceuticals from wastes. As a result, trace levels of these substances get discharged from the plants to the nation's waterways. And now, scientists are reporting preliminary studies that show chlorine treatment may encourage the formation of new, unknown antibiotics that could also enter the environment, potentially contributing to the growing problem of antibiotic resistance.

The research, which will be presented today at the 249th National Meeting & Exposition of the American Chemical Society (ACS), suggests that a re-evaluation of wastewater treatment and disinfection practices is needed.

"Pharmaceuticals that get out into the environment can harm aquatic life, making them react slowly in the wild and disrupting their hormone systems," notes Olya Keen, Ph.D. She adds that increased antibiotic exposure, even at low levels in the environment, can lead to development of antibiotic-resistant microbes and a general weakening of antibiotics' abilities to fight bacterial infections in humans.

"Treated wastewater is one of the major sources of pharmaceuticals and antibiotics in the environment," says Keen. "Wastewater treatment facilities were not designed to remove these drugs. The molecules are typically very stable and do not easily get biodegraded. Instead, most just pass through the treatment facility and into the aquatic environment."

But besides failing to remove all drugs from wastewater, sewage treatment facilities using chlorine may have the unintended consequence of encouraging the formation of other antibiotics in the discharged water. Keen, graduate student Nicole Kennedy and others on her team at the University of North Carolina at Charlotte ran several lab experiments and found that exposing doxycycline, a common antibiotic, to chlorine in wastewater increased the antibiotic properties of their samples.

"Surprisingly, we found that the products formed in the lab sample were even stronger antibiotics than doxycycline, the parent and starting compound," she adds. Keen has not yet identified all the properties of these "transformation products," and that research is now underway. She notes that these compounds could turn out to be previously unidentified antibiotics.

Keen explains that the best solution may be to decrease the amount of these drugs that reach a treatment plant in the first place. Currently, however, disposal of pharmaceuticals is not regulated. So she urges a greater emphasis on collecting and incinerating old pharmaceuticals, rather than dumping them down the drain or placing them in the trash, both of which can lead to harmful environmental exposures.

In addition, this research has applications to drinking water treatment systems, most of which also use chlorine as a disinfectant, she says. To purify drinking water, chlorine must remain in the distribution piping system for hours, which blocks microbes from growing. But this also provides ample time for chlorine to interact with pharmaceuticals that may be in the water, encouraging development of new antibiotic compounds.

Story Source: "Chlorine use in sewage treatment could promote antibiotic resistance." ScienceDaily. ScienceDaily, 22 March 2015. www.sciencedaily.com/releases/2015/03/150322080204.htm

Tuesday, March 10, 2015

Earth's Climate is Starting to Change Faster, New Research Shows

Date: March 9, 2015

Source: Pacific Northwest National Laboratory

Summary: Earth is now entering a period of changing climate that will likely be faster than what's occurred naturally over the last thousand years, according to a new article, committing people to live through and adapt to a warming world.

The rate of temperature change is rising and will continue to do so, as seen here with the thick gray line. This model depicts rates measured in 40-year windows of time, a time frame comparable to a human lifespan.
Credit: Image courtesy of Pacific Northwest National Laboratory


An analysis of changes to the climate that occur over several decades suggests that these changes are happening faster than historical levels and are starting to speed up. The Earth is now entering a period of changing climate that will likely be faster than what's occurred naturally over the last thousand years, according to a new paper in Nature Climate Change, committing people to live through and adapt to a warming world.

In this study, interdisciplinary scientist Steve Smith and colleagues at the Department of Energy's Pacific Northwest National Laboratory examined historical and projected changes over decades rather than centuries to determine the temperature trends that will be felt by humans alive today.

"We focused on changes over 40-year periods, which is similar to the lifetime of houses and human-built infrastructure such as buildings and roads," said lead author Smith. "In the near term, we're going to have to adapt to these changes."

See CMIP Run

Overall, the Earth is getting warmer due to increasing greenhouse gases in the atmosphere that trap heat. But the rise is not smooth -- temperatures bob up and down. Although natural changes in temperature have long been studied, less well-understood is how quickly temperatures changed in the past and will change in the future over time scales relevant to society, such as over a person's lifetime. A better grasp of how fast the climate might change could help decision-makers better prepare for its impacts.

To examine rates of change, Smith and colleagues at the Joint Global Change Research Institute, a collaboration between PNNL and the University of Maryland in College Park, turned to the Coupled Model Intercomparison Project. The CMIP combines simulations from more than two dozen climate models from around the world to compare model results.

All the CMIP models used the same data for past and future greenhouse gas concentrations, pollutant emissions, and changes to how land is used, which can emit or take in greenhouse gases. The more models in agreement, the more confidence in the results.

The team calculated how fast temperatures changed between 1850 and 1930, a period when people started keeping records but when the amount of fossil fuel gases collecting in the atmosphere was low. They compared these rates to temperatures reconstructed from natural sources of climate information, such as from tree rings, corals and ice cores, for the past 2,000 years.

Taken together, the shorter time period simulations were similar to the reconstructions over a longer time period, suggesting the models reflected reality well.

While there was little average global temperature increase in this early time period, Earth's temperature fluctuated due to natural variability. Rates of change over 40-year periods in North America and Europe rose and fell as much as 0.2 degrees Celsius per decade. The computer models and the reconstructions largely agreed on these rates of natural variability, indicating the models provide a good representation of trends over a 40-year scale.

Now Versus Then

Then the team performed a similar analysis using CMIP but calculated 40-year rates of change between 1971 and 2020. They found the average rate of change over North America, for example, to be about 0.3 degrees Celsius per decade, higher than can be accounted for by natural variability. The CMIP models show that, at the present time, most world regions are almost completely outside the natural range for rates of change.
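
To make the windowed-trend calculation concrete, here is a minimal Python sketch of the general approach: fit a linear trend inside each sliding 40-year window of an annual temperature series and report it in degrees Celsius per decade. The synthetic data, random seed, and helper function are illustrative assumptions, not the team's CMIP analysis code.

```python
import numpy as np

# Illustrative synthetic record of annual temperature anomalies (deg C).
# A real analysis would use observations or CMIP model output instead.
years = np.arange(1850, 2021)
rng = np.random.default_rng(0)
anomalies = 0.008 * (years - 1850) + 0.15 * rng.standard_normal(years.size)

WINDOW = 40  # years -- roughly the lifetime of houses and other infrastructure

def windowed_rates(yrs, temps, window=WINDOW):
    """Linear trend (deg C per decade) inside each sliding window of `window` years."""
    rates = []
    for start in range(len(yrs) - window + 1):
        y = yrs[start:start + window]
        t = temps[start:start + window]
        slope_per_year = np.polyfit(y, t, 1)[0]             # deg C per year
        rates.append((int(y[-1]), 10.0 * slope_per_year))   # report per decade
    return rates

for end_year, rate in windowed_rates(years, anomalies)[-3:]:
    print(f"40-year window ending {end_year}: {rate:+.2f} deg C per decade")
```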

The team also examined how the rates of change would be affected in possible scenarios of future emissions. Climate change picked up speed in the next 40 years in all cases, even in scenarios with lower rates of future greenhouse gas emissions. A scenario where greenhouse gas emissions remained high resulted in high rates of change throughout the rest of this century.

Still, the researchers can't say exactly what impact faster rising temperatures will have on the Earth and its inhabitants.

"In these climate model simulations, the world is just now starting to enter into a new place, where rates of temperature change are consistently larger than historical values over 40-year time spans," said Smith. "We need to better understand what the effects of this will be and how to prepare for them."

This work was supported by the Department of Energy Office of Science.


Story Source:
The above story is based on materials provided by Pacific Northwest National Laboratory. "Earth's climate is starting to change faster, new research shows." ScienceDaily, 9 March 2015. www.sciencedaily.com/releases/2015/03/150309134642.htm.

Thursday, February 12, 2015

Electricity from Biomass with Carbon Capture Could Make Western US Carbon-Negative

Date: February 9, 2015

Source: University of California - Berkeley

Summary: Biomass conversion to electricity combined with technologies for capturing and storing carbon, which should become viable within 35 years, could result in a carbon-negative power grid in the western US by 2050. That prediction comes from an analysis of various fuel scenarios. Bioenergy with carbon capture and sequestration may be a better use of plant feedstocks than making biofuels.


This is a chart showing how different mixes of fuels can affect the carbon emissions in 2050 from the electrical grid in the western US. Biomass plants with carbon capture and sequestration (CCS) and coal CCS plants that co-fire biomass provide negative carbon dioxide emissions. As emissions limits are reduced, fossil-fuel CO2 emissions shift from coal and combined-cycle gas turbine (CCGT) technology to CCGT with CCS.
Credit: Daniel Kammen and Daniel Sanchez, UC Berkeley

Generating electricity from biomass, such as urban waste and sustainably sourced forest and crop residues, is one strategy for reducing greenhouse gas emissions, because it is roughly carbon-neutral: burning the biomass releases only as much carbon as the plants drew out of the atmosphere while growing.

A new UC Berkeley study shows that if biomass electricity production is combined with carbon capture and sequestration in the western United States, power generators could actually store more carbon than they emit and make a critical contribution to an overall zero-carbon future by the second half of the 21st century.

By capturing carbon from burning biomass -- termed bioenergy with carbon capture and sequestration (BECCS) -- power generators could become carbon-negative even while retaining gas- or coal-burning plants. The carbon reduction might even offset the emissions from fossil fuel used in transportation, said study leader Daniel Sanchez, a graduate student in UC Berkeley's Energy and Resources Group.

"There are a lot of commercial uncertainties about carbon capture and sequestration technologies," Sanchez admitted. "Nevertheless, we're taking this technology and showing that in the Western United States 35 years from now, BECCS doesn't merely let you reduce emissions by 80 percent -- the current 2050 goal in California -- but gets the power system to negative carbon emissions: you store more carbon than you create."

BECCS may be one of the few cost-effective carbon-negative opportunities available to mitigate the worst effects of anthropogenic climate change, said energy expert Daniel Kammen, who directed the research. This strategy will be particularly important should climate change be worse than anticipated, or emissions reductions in other portions of the economy prove particularly difficult to achieve.

"Biomass, if managed sustainably can provide the 'sink' for carbon that, if utilized in concert with low-carbon generation technologies, can enable us to reduce carbon in the atmosphere," said Kammen, a Professor of Energy in UC Berkeley's Energy and Resources Group and director of the Renewable and Appropriate Energy Laboratory (RAEL) in which the work was conducted.

Sanchez, Kammen and their colleagues published their analysis of BECCS in western North America Feb. 9 in the online journal Nature Climate Change.

Carbon Capture & Sequestration

Though the financial costs, not to mention technological hurdles, of capturing carbon from biomass power plants and compressing it for storage underground are huge, the Intergovernmental Panel on Climate Change (IPCC), the major international body studying the issue, assumes that the technology will become viable in 50 years and includes it in its long-term predictions.

"BECCS technologies figure prominently in the IPCC's recent Fifth Assessment Report (AR5), which focuses in part on mitigating climate change, but previous models examining BECCS deployment have not investigated its role in power systems in detail or in aggressive time frames," said Kammen, who serves as a coordinating lead author on the IPCC.

To remedy this, the UC Berkeley scientists used a detailed computer model they developed of the West's electric power grid to predict deployment of BECCS in low-carbon and carbon-negative power systems. This model of western North America, called SWITCH-WECC, was developed in the RAEL lab. Researchers can use SWITCH to study generation, transmission and storage options for the United States west of the Kansas/Colorado border as well as in northwest Mexico and the Canadian provinces of Alberta and British Columbia.

The study found that BECCS, combined with aggressive renewable energy deployment and fossil emissions reductions, can enable a carbon-negative power system in western North America by 2050 with up to 145 percent emissions reduction from 1990 levels. Such reductions can occur with as little as 7 percent of the power coming from BECCS. In most scenarios explored, the carbon offsets produced by BECCS are more valuable to the power system than the electricity it provides.
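
The arithmetic behind an emissions reduction of more than 100 percent is straightforward to illustrate. The toy Python calculation below uses entirely hypothetical numbers (they are not outputs of the SWITCH-WECC model) to show how negative emissions from BECCS can push a grid's net emissions below zero and its reduction relative to a 1990 baseline above 100 percent.

```python
# Toy carbon balance showing how a power grid can cut emissions by more than
# 100 percent of a 1990 baseline. All numbers are hypothetical, chosen only
# for illustration -- they are not outputs of the SWITCH-WECC model.

BASELINE_1990 = 100.0   # annual grid emissions in 1990 (arbitrary units, e.g. MtCO2)
fossil_2050 = 15.0      # residual fossil emissions left after renewables and CCS
beccs_2050 = -60.0      # net-negative emissions from BECCS (more CO2 stored than released)

net_2050 = fossil_2050 + beccs_2050                           # -45.0: the grid is carbon-negative
reduction = (BASELINE_1990 - net_2050) / BASELINE_1990 * 100

print(f"Net 2050 emissions: {net_2050:+.1f}")   # negative means net CO2 removal
print(f"Reduction vs 1990:  {reduction:.0f}%")  # 145% in this illustrative case
```

The same accounting explains the study's observation that the stored carbon, rather than the electricity, is what makes BECCS valuable to the power system.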

The study relies on a detailed spatial and temporal inventory of potential bioenergy feedstocks, such as forest residues, municipal solid waste and switchgrass, as well as complementary renewable energy, such as wind and solar power.

Sanchez noted that burning biomass as part of BECCS may have a greater impact on greenhouse gas emissions than using these same feedstocks for biofuels, solely because of the possibility of carbon capture.

"We're evaluating a technology with some uncertainty behind it, but we are saying that if the technology exists, it really sketches out a different kind of climate mitigation pathway than what people are assuming," Sanchez said.

Story Source:
The above story is based on materials provided by University of California - Berkeley. "Electricity from biomass with carbon capture could make western US carbon-negative." ScienceDaily, 9 February 2015. www.sciencedaily.com/releases/2015/02/150209130732.htm.

Tuesday, February 10, 2015

Preventing Greenhouse Gas from Entering the Atmosphere

Date: February 5, 2015

Source: Harvard University

Summary: A novel class of materials that enable a safer, cheaper, and more energy-efficient process for removing greenhouse gas from power plant emissions has been developed by a multi-institution team of researchers. The approach could be an important advance in carbon capture and sequestration.


Microcapsule method offers new approach to carbon capture and storage at power plants. The new carbon-capture technique employs an abundant and environmentally benign sorbent: sodium carbonate, a chemical cousin of kitchen-grade baking soda. The microencapsulated carbon sorbents (MECS) achieve an order-of-magnitude increase in CO2 absorption rates compared to sorbents currently used. This illustration shows the flow-focusing microfluidic capillary device used to produce the silicone microcapsules.
Credit: John Vericella, Chris Spadaccini, and Roger Aines/LLNL; James Hardin and Jennifer Lewis/Harvard University; Nature

A team of researchers has developed a novel class of materials that enable a safer, cheaper, and more energy-efficient process for removing greenhouse gas from power-plant emissions. The approach could be an important advance in carbon capture and sequestration.

The team, led by scientists from Harvard University and Lawrence Livermore National Laboratory, employed a microfluidic assembly technique to produce microcapsules that contain liquid sorbents, or absorbing materials, encased in highly permeable polymer shells. They have significant performance advantages over the carbon-absorbing materials used in current capture and sequestration technology.

The work is described in a paper published online today in the journal Nature Communications.

"Microcapsules have been used in a variety of applications -- for example, in pharmaceuticals, food flavoring, cosmetics, and agriculture -- for controlled delivery and release, but this is one of the first demonstrations of this approach for controlled capture," said Jennifer A. Lewis, the Hansjörg Wyss Professor of Biologically Inspired Engineering at the Harvard School of Engineering and Applied Sciences (SEAS) and a co-lead author. Lewis is also a core faculty member of the Wyss Institute for Biologically Inspired Engineering at Harvard.

Power plants are the single largest source of carbon dioxide (CO2), a greenhouse gas that traps heat and makes the planet warmer. According to the U.S. Environmental Protection Agency, coal- and natural gas-fired plants were responsible for a third of U.S. greenhouse gas emissions in 2012.

That's why the agency has proposed rules mandating dramatically reduced carbon emissions at all new fossil fuel-fired power plants. Satisfying the new standards will require operators to equip plants with carbon-trapping technology.

Current carbon-capture technology uses caustic amine-based solvents to separate CO2 from the flue gas escaping a facility's smokestacks. But state-of-the-art processes are expensive, result in a significant reduction in a power plant's output, and yield toxic byproducts. The new technique employs an abundant and environmentally benign sorbent: sodium carbonate, a chemical cousin of kitchen-grade baking soda. The microencapsulated carbon sorbents (MECS) achieve an order-of-magnitude increase in CO2 absorption rates compared to sorbents currently used in carbon capture. Another advantage is that amines break down over time, while carbonates have a virtually limitless shelf life.

"MECS provide a new way to capture carbon with fewer environmental issues," said Roger D. Aines, leader of the fuel cycle innovations program at Lawrence Livermore National Laboratory and a co-lead author. "Capturing the world's carbon emissions is a huge job. We need technology that can be applied to many kinds of carbon dioxide sources, with the public's full confidence in the safety and sustainability."

Researchers at Lawrence Livermore and the U.S. Department of Energy's National Energy Technology Lab are now working on enhancements to the capture process to bring the technology to scale.

Aines says that the MECS-based approach could also be tailored to industrial processes like steel and cement production, which are significant greenhouse gas sources.

"These permeable silicone beads could be a 'sliced-bread' breakthrough for CO2 capture -- efficient, easy-to-handle, minimal waste, and cheap to make," said Stuart Haszeldine, a professor of carbon capture and storage at the University of Edinburgh, who was not involved in the research. "Durable, safe, and secure capsules containing solvents tailored to diverse applications can place CO2 capture … firmly onto the cost-reduction pathway."

MECS are produced using a double-capillary device in which the flow rates of three fluids -- a carbonate solution combined with a catalyst for enhanced CO2 absorption, a photo-curable silicone that forms the capsule shell, and an aqueous solution -- can be independently controlled.

"Encapsulation allows you to combine the advantages of solid-capture media and liquid-capture media in the same platform," said Lewis. "It is also quite flexible, in that both the core and shell chemistries can be independently modified and optimized."

"This innovative gas separation platform provides large surface areas while eliminating a number of operational issues, including corrosion, evaporative losses, and fouling," said Ah-Hyung (Alissa) Park, the chair in applied climate science and associate professor of Earth and environmental engineering at Columbia University, who was not involved in the research.

Lewis has previously conducted groundbreaking research in the 3-D printing of functional materials, including tissue constructs with embedded vasculature, lithium-ion microbatteries, and ultra-lightweight carbon-fiber epoxy materials.

Funding for the encapsulated liquid carbonates work was provided by the Innovative Materials and Processes for Advanced Carbon Capture Technology program of the U.S. Department of Energy's Advanced Research Projects Agency-Energy.

Story Source:
The above story is based on materials provided by Harvard University. "Preventing greenhouse gas from entering the atmosphere." ScienceDaily, 5 February 2015.

Wednesday, February 4, 2015

Rivers Might Constitute Just 20 Percent of Continental Water Flowing into Oceans

Date: February 2, 2015

Source: University of South Carolina

Summary: The Amazon, Nile and Mississippi are mighty rivers, but they and all their worldwide brethren might be a relative trickle compared with an unseen torrent below the surface. New research shows that rivers might constitute as little as 20 percent of the water that flows yearly into the Atlantic and Indo-Pacific Oceans from the continents. The rest flows through what is termed the 'subterranean estuary,' which some researchers think supplies the lion's share of terrestrial nutrients to the oceans.



Professor emeritus Willard Moore was elected this year as a Fellow of the American Association for the Advancement of Science, which is among the oldest scientific societies in America and is the publisher of the journal Science.



In recently published research, Moore was part of a team that used observed concentrations of radium-228 to model the global distribution of the radionuclide between 60 degrees South and 70 degrees North latitude. [Image adapted from Geophysical Research Letters]


If you think rivers are what send terrestrial rainfall back into the oceans, you don't know the half of it. And that fraction keeps shrinking. According to new research, it might be that only one-fifth of the water flowing from the continents into the Atlantic and Indo-Pacific Oceans runs through overland channels of water. And just as surprising, a vast amount flows into the land from the ocean.

University of South Carolina professor Willard Moore is part of an international team that recently estimated how much of the water flowing into the oceans comes not from surface rivers, creeks and streams, but instead from what he has termed the "subterranean estuary." For two decades, Moore has drawn attention to the oft-overlooked flow and exchange of ocean and groundwater in the permeable layers of rock and sediment, a process that occurs both near the coastline and extending out on the continental shelf. But the roots of his work in the field go back even further in his 50-year scientific career.

Developing the Tools of the Trade

Soon after earning a doctorate at the State University of New York at Stony Brook, Moore began working in the early 1970s as a civilian in the Naval Oceanographic Office, then located in Maryland. The task at hand was to study deep ocean processes, and one of the best tools for doing that was to measure the amounts of certain naturally occurring radioactive elements dissolved in the seawater at different locations and depths.

"There's a little bit of uranium and thorium in all rocks, and as those elements decay, they produce a whole string of different elements, which themselves are radioactive," Moore says. "So say a rock is in seawater and the uranium decays to thorium and then it decays to radium. Well, the radium is much more soluble than the other components, so it can go into solution in the seawater. There are very few ways to remove it naturally, except by radioactive decay."

Moore likens the radium that dissolves from a rock to a dye that slowly loses its intensity over time. The half-life of radium establishes how fast the "dye" loses intensity, and by measuring how radium concentrations diminish with increasing distance from the seafloor -- the rock source -- a scientist can come up with a model for how the water there is flowing and mixing.
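
Moore's fading-dye analogy is ordinary first-order radioactive decay. The sketch below is a hypothetical illustration, not the team's model: it assumes a radium-228 half-life of about 5.75 years and a constant drift speed, and shows how the relative concentration falls off with distance from the seafloor source. Fitting observed concentrations to such a curve is what lets a measured radium gradient constrain how the water is moving and mixing.

```python
import math

# First-order decay: C(t) = C0 * exp(-lambda * t), with lambda = ln(2) / half-life.
# The half-life below is the approximate value for radium-228; the drift speed
# and source concentration are illustrative assumptions, not study values.
HALF_LIFE_YEARS = 5.75
C0 = 100.0                      # relative radium concentration at the seafloor source
DRIFT_SPEED_KM_PER_YEAR = 50.0  # assumed transport speed of the water parcel

decay_constant = math.log(2) / HALF_LIFE_YEARS

def concentration(distance_km):
    """Relative radium concentration after drifting `distance_km` from the source."""
    elapsed_years = distance_km / DRIFT_SPEED_KM_PER_YEAR
    return C0 * math.exp(-decay_constant * elapsed_years)

for d in (0, 100, 300, 600):
    print(f"{d:>4} km from the source: {concentration(d):6.1f} (relative units)")
```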

A major shortcoming at the time, though, was how laborious it was to collect data. Radioactive elements are present in very small concentrations in seawater.

"It used to be that you needed about 600 to 800 liters of water," amounting to more than 200 gallons for a single data point, Moore says. "It was a very time-consuming series of chemical processes, and I decided early on that if we were really going to use radium to understand the ocean, we had to have a better way to extract it."

From Lake Oneida to the Sea

Moore put together a new method after mulling over a few disparate observations. Some colleagues had come up with a much more efficient means for extracting a different radioactive isotope, silicon-32, by coating an acrylic fiber with an iron compound and then exposing it to flowing seawater. Silica (which contains silicon-32) in the water is adsorbed onto the fiber, effectively concentrating the radionuclide into a much smaller area (namely, on the surface of the fiber rather than dispersed in many gallons of water).

The iron coating on the fiber didn't pick up radium, but Moore found something that did while working on a small side project. He was trying to understand the growth of a characteristic kind of rock formation found in certain places on the bottom of Lake Oneida, a freshwater lake of glacial origin in central New York. The formation is called a manganese nodule, which has alternating layers rich in manganese and iron oxides.

In the course of that research, he found that the nodules were rich in radium as well, which put him on the idea that perhaps manganese dioxide could be used to extract radium from seawater.

"I remember very clearly when I saw the first counts on the radium in the nodule. I walked into the lab, mixed up a manganese solution, and put it on the fiber," Moore says. "I was living on Chesapeake Bay and had a sailboat, so I went out, towed it through the bay, came back and it was loaded with radium. It's just an illustration of how if you have several irons in the fire, they all don't get hot at the same time, but sometimes one will ignite another one."

Putting the Tools to Work

Moore joined the Carolina faculty in 1976 with a ready means of determining radium concentration in seawater, and he expanded his repertoire from the deep-sea research that characterized his work with the Navy to closer-to-shore studies. A primary goal in South Carolina was to understand the exchange processes between water and surface sediments in estuaries near the coastline.

Much of the early work with radium, though, raised all sorts of questions, Moore says. The gradients they were seeing simply didn't make sense, but in retrospect it was because one basic assumption was way off.

At the time, it was generally thought that groundwater flow into the ocean was insignificant, maybe 3 percent to 5 percent of river flow, Moore says. The breakthrough came when one of his colleagues suggested that a sizable salty groundwater flow must be responsible for their observations.

They measured radium in inland wells, finding that fresh groundwater had almost none, but that saltier groundwater was loaded with it. The inescapable conclusion: water from the ocean was being exchanged with groundwater in prodigious quantities, and it was happening underground.

"The action was in the permeable sediments below. It started this whole idea that the continents were connected to the ocean not only by riverine processes, but by submarine processes." Moore says. "I came up with the term subterranean estuary. So just like the surface estuary, it's the region between the coast and the ocean where freshwater is coming in on one side and seawater is coming in on the other side, they're mixing, and after chemical reactions, some of that water is expelled back into the ocean."

Moore was part of an international team that developed a quantitative model for submarine groundwater discharge across most of the globe, and they just published a paper in Geophysical Research Letters showing that the amount of subterranean water flow into the Atlantic and Indo-Pacific Oceans is some three to four times that of all rivers combined -- which means rivers account for only about 20 to 25 percent of the total. Perhaps even more important is the conclusion that most of the flow of terrestrial nutrients is subterranean.

"If you put a lot of nutrients into the ocean, you increase primary productivity. You make lots of algae, which may be good, but excessive algae settles out and as it decays it uses up oxygen from the bottom water," Moore says. "We call it hypoxia, where the oxygen is so low fish can't breathe. So, productivity is a delicate balance.

"Currently, in most of the estimates of how nutrients come into the water, it's thought to be coming from rivers, springs, streams -- things you can see -- or from point source pollution, sewage, drainage pipes off of golf courses. But people have not considered how much is coming from the subterranean estuary. It's a whole biogeochemical process that's going on that people haven't really thought about very much."

Story Source:
The above story is based on materials provided by University of South Carolina. The original article was written by Steven Powell.

Monday, January 12, 2015

Algae Blooms Create Their Own Favorable Conditions

Date: January 8, 2015
Source: Dartmouth College
Summary: Fertilizers are known to promote the growth of toxic cyanobacterial blooms in freshwater and oceans worldwide, but a new multi-institution study shows the aquatic microbes themselves can drive nitrogen and phosphorus cycling in a combined one-two punch in lakes.



This is a cyanobacterial bloom in China's Lake Taihu.
Credit: Cayelan Carey


The findings suggest cyanobacteria -- sometimes known as pond scum or blue-green algae -- that get a toe-hold in low-to-moderate nutrient lakes can set up positive feedback loops that amplify the effects of pollutants and climate change and make conditions even more favorable for blooms, which threaten water resources and public health worldwide. The findings shed new light on what makes cyanobacteria so successful and may lead to new methods of prevention and control.

The study appears in the journal Ecosphere.

"We usually think of cyanobacteria as responders to human manipulations of watersheds that increase nutrient loading, but our findings show they can also be drivers of nitrogen and phosphorus cycling in lakes," says Dartmouth Professor Kathryn Cottingham, one of the study's lead authors. "This is important because cyanobacteria are on the increase in response to global change -- both warming temperatures and land use -- and may be driving nutrient cycling in more lakes in the future, especially the clear-water, low-nutrient lakes that are so important for drinking water, fisheries and recreation."

Biogeochemical cycling is the natural recycling of nutrients between living organisms and the atmosphere, land and water. The researchers found that cyanobacterial blooms can influence lake nutrient cycling and the ability of a lake to maintain its current conditions by tapping into pools of nitrogen and phosphorus not usually accessible to phytoplankton. The ability of many cyanobacterial organisms to fix dissolved nitrogen gas is a well-known potential source of nitrogen, but some organisms can also access pools of phosphorus in sediments and bottom waters. Both of these nutrients can subsequently be released to the water column via leakage or decomposing organisms, thereby increasing nutrient availability for other phytoplankton and microbes.
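
As a purely hypothetical sketch of that feedback (not a model from the Ecosphere study), the short Python simulation below lets a growing bloom release nitrogen and phosphorus into the water column, and lets the released nutrients accelerate further growth; every coefficient is invented solely to show the qualitative behavior.

```python
# Toy illustration of the positive feedback: a bloom fixes nitrogen and
# mobilizes sediment phosphorus, and the released nutrients feed further
# growth. Every coefficient is hypothetical -- this is not the model from
# the Ecosphere study, just a sketch of the qualitative behavior.

BASE_GROWTH = 0.01           # weekly growth fraction without the feedback
GROWTH_PER_NUTRIENT = 0.02   # extra weekly growth per unit of available nutrient
NUTRIENT_PER_BIOMASS = 0.05  # nutrient leaked/released per unit of bloom biomass

biomass, nutrients = 1.0, 1.0
for week in range(21):
    if week % 5 == 0:
        print(f"week {week:2d}: biomass = {biomass:5.2f}, nutrients = {nutrients:5.2f}")
    growth = BASE_GROWTH + GROWTH_PER_NUTRIENT * nutrients
    biomass *= 1.0 + growth
    # The bloom itself adds N (fixation) and P (from sediments and bottom water),
    # raising nutrient availability for the next step.
    nutrients += NUTRIENT_PER_BIOMASS * biomass
```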

Story Source: The above story is based on materials provided by Dartmouth College.

Tuesday, December 16, 2014

Hazy Road to Mecca

Date: December 15, 2014

Source: University of California - Irvine

Summary: Dangerously high levels of air pollutants are being released in Mecca during the hajj, the annual holy pilgrimage in which millions of Muslims on foot and in vehicles converge on the Saudi Arabian city, according to new findings.



UC Irvine and other researchers are testing air pollution in the Middle East, including in Mecca during the annual hajj, at burning landfills and elsewhere. Dangerously high levels of smog-forming contaminants are being released, the scientists have found.
Credit: Image courtesy of Dr. Azhar Siddique


Dangerously high levels of air pollutants are being released in Mecca during the hajj, the annual holy pilgrimage in which millions of Muslims on foot and in vehicles converge on the Saudi Arabian city, according to findings reported today at the American Geophysical Union meeting in San Francisco.

"Hajj is like nothing else on the planet. You have 3 to 4 million people -- a whole good-sized city -- coming into an already existing city," said Isobel Simpson, a UC Irvine research chemist in the Nobel Prize-winning Rowland-Blake atmospheric chemistry laboratory. "The problem is that this intensifies the pollution that already exists. We measured among the highest concentrations our group has ever measured in urban areas -- and we've studied 75 cities around the world in the past two decades."

Scientists from UCI, King Abdulaziz University in Saudi Arabia, the University of Karachi in Pakistan, the New York State Department of Health's Wadsworth Center, and the University at Albany in New York captured and analyzed air samples during the 2012 and 2013 hajj seasons on roadsides; near massive, air-conditioned tents; and in narrow tunnels that funnel people to the Grand Mosque, the world's largest, in the heart of Mecca.

The worst spot was inside the Al-Masjid Al-Haram tunnel, where pilgrims on foot, hotel workers and security personnel are exposed to fumes from idling vehicles, often for hours. The highest carbon monoxide level -- 57,000 parts per billion -- was recorded in this tunnel during October 2012. That's more than 300 times regional background levels.

Heart attacks are a major concern linked to such exposure: The risk of heart failure hospitalization or death rises sharply as the amount of carbon monoxide in the air escalates, the researchers note in a paper published in the journal Environmental Science & Technology. Headaches, dizziness and nausea have also been associated with inhaling carbon monoxide.

"There's carbon monoxide that increases the risk of heart failure. There's benzene that causes narcosis and leukemia," Simpson said. "But the other way to look at it is that people are not just breathing in benzene or CO, they're breathing in hundreds of components of smog and soot."

The scientists detected a stew of unhealthy chemicals, many connected to serious illnesses by the World Health Organization and others.

"Air pollution is the cause of one in eight deaths and has now become the single biggest environmental health risk globally," said Haider Khwaja of the University at Albany. "There were 4.3 million deaths in 2012 due to indoor air pollution and 3.7 million deaths because of outdoor air pollution, according to WHO. And more than 90 percent of those deaths and lost life years occur in developing countries."

Khwaja experienced sooty air pollution firsthand as a child in Karachi, Pakistan, and saw his elderly father return from the hajj with a wracking cough that took weeks to clear. He and fellow researchers braved the tunnels and roads to take air samples and install continuous monitors in Mecca.

"Suffocating," he said of the air quality.

In addition to the high smog-forming measurements, the team in follow-up work found alarming levels of black carbon and fine particulates that sink deep into lungs. Once the hajj was over, concentrations of all contaminants fell but were still comparable to those in other large cities with poor air quality. Just as unhealthy "bad air" days once plagued Greater Los Angeles, research is now showing degraded air in the oil-rich, sunny Arabian Peninsula and elsewhere in the Middle East. Because the number of pilgrims and permanent residents is increasing, the scientists recommend reducing emissions by targeting fossil fuel sources.

Besides vehicle exhaust, other likely culprits include gasoline high in benzene, a lack of vapor locks around gas station fuel nozzles, and older cars with disintegrating brake liners and other parts. Coolants used for air-conditioned tents sleeping up to 40 people also contribute to greenhouse gas buildup. And the dearth of regulations exacerbates these problems.

The researchers said that Saudi officials are aware of the issues and taking steps to address them, such as working to reduce benzene in area gasoline supplies. Directing Mecca pedestrians and vehicles to separate tunnels would be optimal. In addition, clearing the region's air with time-tested technologies used elsewhere in the world could sharply reduce pollution and save lives.

"This is a major public health problem, and the positive news is that some of the answers are very much within reach, like putting rubber seals on nozzles at gas stations to reduce leaks," Simpson said. "It's a simple, doable solution."

Story Source:
The above story is based on materials provided by University of California - Irvine.