
Tuesday, May 26, 2015

Climate Engineering May Save Coral Reefs, Study Shows

Date: May 25, 2015

Source: University of Exeter

Summary: Mass coral bleaching, which can lead to coral mortality, is predicted to occur far more frequently over the coming decades, due to the stress exerted by higher seawater temperatures. Geoengineering of the climate may be the only way to save coral reefs from mass bleaching, according to new research.


Current coral bleaching in Fiji.
Credit: Professor Peter J Mumby, University of Queensland

Geoengineering of the climate may be the only way to save coral reefs from mass bleaching, according to new research.


Coral reefs are considered one of the most vulnerable ecosystems to future climate change due to rising sea surface temperatures and ocean acidification, which is caused by higher atmospheric levels of carbon dioxide.

Mass coral bleaching, which can lead to coral mortality, is predicted to occur far more frequently over the coming decades, due to the stress exerted by higher seawater temperatures.

Scientists believe that, even under the most ambitious future CO2 reduction scenarios, widespread and severe coral bleaching and degradation will occur by the middle of this century.

The collaborative new research, which includes authors from the Carnegie Institution for Science, the University of Exeter, the Met Office Hadley Centre and the University of Queensland, suggests that a geoengineering technique called Solar Radiation Management (SRM) reduces the risk of global severe bleaching.

The SRM method involves injecting gas into the stratosphere, forming microscopic particles which reflect some of the sun's energy and so help limit rising sea surface temperatures.
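As a rough back-of-the-envelope illustration of the energy balance involved (ours, not the study's climate model), the widely used simplified expression for CO2 radiative forcing, F = 5.35 ln(C/C0) W/m^2, shows how much sunlight an SRM scheme would need to reflect to offset a given rise in CO2. The doubling scenario below is hypothetical:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998), in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Hypothetical scenario: CO2 doubles from the preindustrial ~280 ppm.
forcing = co2_forcing(560.0)  # ~3.7 W/m^2 of extra heating
print(f"CO2 forcing to offset: {forcing:.2f} W/m^2")

# SRM aerosols would need to reflect an equal amount of incoming sunlight;
# expressed as a fraction of globally averaged insolation (~340 W/m^2 at
# the top of the atmosphere):
print(f"Required reflection: {forcing / 340.0:.1%} of incoming sunlight")
```

Note that this offsets only the heating: the ocean acidification driven by dissolved CO2, which the study flags as the key uncertainty for bleaching thresholds, is untouched.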

The study compared a hypothetical SRM geoengineering scenario to the most aggressive future CO2 reduction strategy considered by the Intergovernmental Panel on Climate Change (IPCC), and found that coral reefs fared much better under geoengineering despite increasing ocean acidification.

The pioneering international study is published in the leading scientific journal Nature Climate Change.

Lead author Dr Lester Kwiatkowski of the Carnegie Institution for Science said, "Our work highlights the sort of climate scenarios that now need to be considered if the protection of coral reefs is a priority."

Dr Paul Halloran, from the Geography department of the University of Exeter, added: "The study shows that the benefit of SRM over a conventional CO2 reduction scenario is dependent on the sensitivity of future thermal bleaching thresholds to changes in seawater acidity.

"This emphasises the need to better characterise how warming and ocean acidification may interact to influence coral bleaching over the 21st century."

Professor Peter Cox, a co-author of the research, also from the University of Exeter, said: "Coral reefs face a dire situation regardless of how intensively society decarbonises the economy. In reality there is no direct choice between conventional mitigation and climate engineering, but this study shows that we need to either accept that the loss of a large percentage of the world's reefs is inevitable or start thinking beyond conventional mitigation of CO2 emissions."

This work shows how differently coral bleaching responds to the various measures for tackling climate change; those measures will likewise differ in their effects on other impacts, such as regional crop growth or water availability.

Story Source: University of Exeter. "Climate engineering may save coral reefs, study shows." ScienceDaily. ScienceDaily, 25 May 2015. www.sciencedaily.com/releases/2015/05/150525120430.htm.

Monday, April 20, 2015

Engineers Purify Sea and Wastewater in 2.5 Minutes

Date: April 17, 2015
Source: Investigación y Desarrollo
Summary: A group of engineers has created technology to recover and purify either seawater or wastewater from households, hotels, hospitals, and commercial and industrial facilities, regardless of the pollutants and microorganisms they contain, in, incredibly, just 2.5 minutes, experts say.



Credit: Image courtesy of Investigación y Desarrollo

A group of Mexican engineers from the Jhostoblak Corporate has created technology to recover and purify either seawater or wastewater from households, hotels, hospitals, and commercial and industrial facilities, regardless of the pollutants and microorganisms it contains, in, incredibly, just 2.5 minutes, researchers say.

The PQUA system works with a mixture of dissociating elements capable of separating and removing contaminants, both organic and inorganic. "The methodology is founded on molecularly dissociating water pollutants to recover the minerals necessary and sufficient for the human body to function properly nourished," technical staff explained.

Notably, the engineers developed eight dissociating elements and, after extensive testing on different types of contaminated water, arrived at a methodology that indicates which of the elements, and how much of each, should be combined.

"During the purification process no gases, odors nor toxic elements that may damage or alter the environment, human health or quality of life are generated," said the Mexican firm.

The corporation has a pilot plant at its offices that was used to demonstrate the purification process, which uses gravity to save energy. We observed that the residual water in the container was pumped to a reactor tank, where it was dosed with the dissociating elements in predetermined amounts.

In this phase, solid organic and inorganic matter as well as heavy metals are removed by precipitation and gravity settling, and a sludge collects at the bottom of the reactor. The sludge is removed and examined to determine whether it is suitable for use as fertilizer or in the manufacture of construction materials.

Subsequently, the water is conducted to a clarifier tank, where the excess load of dissolved elements settles out; the liquid then passes through a filter to remove turbidity and finally through a polishing tank that eliminates odors, colors and flavors. The treated water is transferred to a container where ozone is added to ensure its purity, at which point it is ready to drink. The resulting liquid is fresh, odorless and neutral in taste.
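The article does not disclose the underlying chemistry, but the sequence of stages it describes can be laid out as a simple pipeline. A minimal sketch; every stage name and description below is paraphrased from the article rather than taken from any published PQUA specification:

```python
# Treatment train as described in the article; the labels are our own.
STAGES = [
    ("reactor",   "dose dissociating elements; precipitate solids and heavy metals"),
    ("clarifier", "settle out the excess load of dissolved elements"),
    ("filter",    "remove turbidity"),
    ("polisher",  "eliminate odors, colors and flavors"),
    ("ozonation", "add ozone to ensure purity before storage"),
]

def treat(source: str) -> None:
    """Print the path water takes through the described stages."""
    print(f"intake: {source}")
    for name, action in STAGES:
        print(f"  {name:>9}: {action}")
    print("output: potable water (per the firm's claims)")

treat("household wastewater or seawater")
```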

"We have done over 50 tests on different types of wastewater and all have been certified and authorized by the laboratories of the Mexican Accreditation Agency (EMA). Also, the Monterrey Institute of Technology and Higher Education (ITESM), the College of Mexico and the National Polytechnic Institute (IPN) have given their validation that the water treated with our technology meets the SSA NOM 127 standard, which indicates the parameters and quality characteristics for vital liquid to be used for human consumption," says the Corporate Jhostoblak.

Moreover, they report that this development is protected as a trade secret in America and will soon receive the same protection in Switzerland. Its introduction to the market will depend on the needs of users and on the issuing of new laws regarding water use, consumption and discharge.

Story Source:
The above story is based on materials provided by Investigación y Desarrollo. "Engineers purify sea and wastewater in 2.5 minutes." ScienceDaily. ScienceDaily, 17 April 2015.

Monday, March 23, 2015

Chlorine Use in Sewage Treatment Could Promote Antibiotic Resistance

Date: March 22, 2015

Source: American Chemical Society


Graduate student Nicole Kennedy measures the antibiotic activity of various samples in the lab.
Credit: Olya Keen

Chlorine, a disinfectant commonly used in most wastewater treatment plants, may be failing to completely eliminate pharmaceuticals from wastes. As a result, trace levels of these substances get discharged from the plants to the nation's waterways. And now, scientists are reporting preliminary studies that show chlorine treatment may encourage the formation of new, unknown antibiotics that could also enter the environment, potentially contributing to the growing problem of antibiotic resistance.

The research, which will be presented today at the 249th National Meeting & Exposition of the American Chemical Society (ACS), suggests that a re-evaluation of wastewater treatment and disinfection practices is needed.

"Pharmaceuticals that get out into the environment can harm aquatic life, making them react slowly in the wild and disrupting their hormone systems," notes Olya Keen, Ph.D. She adds that increased antibiotic exposure, even at low levels in the environment, can lead to development of antibiotic-resistant microbes and a general weakening of antibiotics' abilities to fight bacterial infections in humans.

"Treated wastewater is one of the major sources of pharmaceuticals and antibiotics in the environment," says Keen. "Wastewater treatment facilities were not designed to remove these drugs. The molecules are typically very stable and do not easily get biodegraded. Instead, most just pass through the treatment facility and into the aquatic environment."

But besides failing to remove all drugs from wastewater, sewage treatment facilities using chlorine may have the unintended consequence of encouraging the formation of other antibiotics in the discharged water. Keen, graduate student Nicole Kennedy and others on her team at the University of North Carolina at Charlotte ran several lab experiments and found that exposing doxycycline, a common antibiotic, to chlorine in wastewater increased the antibiotic properties of their samples.

"Surprisingly, we found that the products formed in the lab sample were even stronger antibiotics than doxycycline, the parent and starting compound," she adds. Keen has not yet identified all the properties of these "transformation products," and that research is now underway. She notes that these compounds could turn out to be previously unidentified antibiotics.

Keen explains that the best solution may be to decrease the amount of these drugs that reach a treatment plant in the first place. Currently, however, disposal of pharmaceuticals is not regulated, so she urges a greater emphasis on collecting and incinerating old pharmaceuticals rather than dumping them down the drain or placing them in the trash, both of which can lead to harmful environmental exposures.

In addition, this research has applications to drinking water treatment systems, most of which also use chlorine as a disinfectant, she says. To purify drinking water, chlorine must remain in the distribution piping system for hours, which blocks microbes from growing. But this also provides ample time for chlorine to interact with pharmaceuticals that may be in the water, encouraging development of new antibiotic compounds.
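The hours of pipe contact described here are usually budgeted with a "CT" value: the disinfectant residual concentration multiplied by contact time. A minimal sketch with illustrative numbers only (actual CT requirements depend on the target pathogen, temperature and pH):

```python
def contact_time_needed(ct_required, residual_mg_per_l):
    """Contact time (minutes) to reach a CT target (mg/L * min)
    at a given free-chlorine residual (mg/L)."""
    return ct_required / residual_mg_per_l

# Hypothetical target: CT = 60 mg/L*min at a 0.5 mg/L residual.
minutes = contact_time_needed(60.0, 0.5)
print(f"required contact time: {minutes:.0f} min ({minutes / 60:.0f} h)")
# Those same hours give chlorine time to react with any pharmaceuticals
# in the water -- the window in which new antibiotic byproducts can form.
```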

Story Source: American Chemical Society. "Chlorine use in sewage treatment could promote antibiotic resistance." ScienceDaily. ScienceDaily, 22 March 2015. www.sciencedaily.com/releases/2015/03/150322080204.htm.

Tuesday, March 10, 2015

Earth's Climate is Starting to Change Faster, New Research Shows

Date: March 9, 2015

Source: Pacific Northwest National Laboratory

Summary: Earth is now entering a period of changing climate that will likely be faster than what's occurred naturally over the last thousand years, according to a new article, committing people to live through and adapt to a warming world.

The rate of temperature change is rising and will continue to do so, as seen here with the thick gray line. This model depicts rates measured in 40-year windows of time, a time frame that reflects lifespans of people.
Credit: Image courtesy of Pacific Northwest National Laboratory


An analysis of changes to the climate that occur over several decades suggests that these changes are happening faster than historical levels and are starting to speed up. The Earth is now entering a period of changing climate that will likely be faster than what's occurred naturally over the last thousand years, according to a new paper in Nature Climate Change, committing people to live through and adapt to a warming world.

In this study, interdisciplinary scientist Steve Smith and colleagues at the Department of Energy's Pacific Northwest National Laboratory examined historical and projected changes over decades rather than centuries to determine the temperature trends that will be felt by humans alive today.

"We focused on changes over 40-year periods, which is similar to the lifetime of houses and human-built infrastructure such as buildings and roads," said lead author Smith. "In the near term, we're going to have to adapt to these changes."

See CMIP Run

Overall, the Earth is getting warmer due to increasing greenhouse gases in the atmosphere that trap heat. But the rise is not smooth -- temperatures bob up and down. Although natural changes in temperature have long been studied, less well-understood is how quickly temperatures changed in the past and will change in the future over time scales relevant to society, such as over a person's lifetime. A better grasp of how fast the climate might change could help decision-makers better prepare for its impacts.

To examine rates of change, Smith and colleagues at the Joint Global Change Research Institute, a collaboration between PNNL and the University of Maryland in College Park, turned to the Coupled Model Intercomparison Project (CMIP), which combines simulations from more than two dozen climate models from around the world so that their results can be compared.

All the CMIP models used the same data for past and future greenhouse gas concentrations, pollutant emissions, and changes to how land is used, which can emit or take in greenhouse gases. The more models in agreement, the more confidence in the results.

The team calculated how fast temperatures changed between 1850 and 1930, a period when people started keeping records but when the amount of fossil fuel gases collecting in the atmosphere was low. They compared these rates to temperatures reconstructed from natural sources of climate information, such as from tree rings, corals and ice cores, for the past 2,000 years.

Taken together, the simulations over this shorter period were similar to the reconstructions spanning the longer one, suggesting the models reflect reality well.

While there was little average global temperature increase in this early time period, Earth's temperature fluctuated due to natural variability. Rates of change over 40-year periods in North America and Europe rose and fell as much as 0.2 degrees Celsius per decade. The computer models and the reconstructions largely agreed on these rates of natural variability, indicating the models provide a good representation of trends over a 40-year scale.
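The study's core metric, the temperature trend within a moving 40-year window, is easy to reproduce in miniature: fit a straight line to each 40-year slice of an annual temperature series and report the slope in degrees per decade. A sketch on synthetic data (the paper itself used CMIP simulations and paleoclimate reconstructions):

```python
import numpy as np

def rolling_40yr_rates(years, temps, window=40):
    """Least-squares trend, in deg C per decade, for every 40-year window."""
    rates = []
    for i in range(len(years) - window + 1):
        slope = np.polyfit(years[i:i + window], temps[i:i + window], 1)[0]
        rates.append(slope * 10.0)  # deg C per year -> deg C per decade
    return np.array(rates)

# Synthetic stand-in for a model run: a weak warming trend plus noise
# playing the role of natural variability.
rng = np.random.default_rng(0)
years = np.arange(1850, 2021)
temps = 0.008 * (years - 1850) + rng.normal(0.0, 0.15, years.size)

rates = rolling_40yr_rates(years, temps)
print(f"largest 40-year rate in either direction: {np.abs(rates).max():.2f} C/decade")
```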

Now Versus Then

Then the team performed a similar analysis using CMIP, but calculated 40-year rates of change between 1971 and 2020. They found the average rate of change over North America, for example, to be about 0.3 degrees Celsius per decade, higher than can be accounted for by natural variability. The CMIP models show that, at the present time, most world regions are almost completely outside the natural range for rates of change.

The team also examined how the rates of change would be affected in possible scenarios of future emissions. Climate change picked up speed in the next 40 years in all cases, even in scenarios with lower rates of future greenhouse gas emissions. A scenario where greenhouse gas emissions remained high resulted in high rates of change throughout the rest of this century.

Still, the researchers can't say exactly what impact faster rising temperatures will have on the Earth and its inhabitants.

"In these climate model simulations, the world is just now starting to enter into a new place, where rates of temperature change are consistently larger than historical values over 40-year time spans," said Smith. "We need to better understand what the effects of this will be and how to prepare for them."

This work was supported by the Department of Energy Office of Science.


Story Source:
The above story is based on materials provided by Pacific Northwest National Laboratory. "Earth's climate is starting to change faster, new research shows." ScienceDaily. ScienceDaily, 9 March 2015. www.sciencedaily.com/releases/2015/03/150309134642.htm.

Thursday, February 12, 2015

Electricity from Biomass with Carbon Capture Could Make Western US Carbon-Negative

Date: February 9, 2015

Source: University of California - Berkeley

Summary: Biomass conversion to electricity combined with technologies for capturing and storing carbon, which should become viable within 35 years, could result in a carbon-negative power grid in the western US by 2050. That prediction comes from an analysis of various fuel scenarios. Bioenergy with carbon capture and sequestration may be a better use of plant feedstocks than making biofuels.


This is a chart showing how different mixes of fuels can affect the carbon emissions in 2050 from the electrical grid in the western US. Biomass carbon capture and sequestration and biomass co-firing CCS on coal CCS plants provide negative carbon dioxide emissions. As emissions limits are reduced, fossil-fuel CO2 emissions shift from coal and combined-cycle gas turbine technology to CCGT with CCS.
Credit: Daniel Kammen and Daniel Sanchez, UC Berkeley

Generating electricity from biomass, such as urban waste and sustainably-sourced forest and crop residues, is one strategy for reducing greenhouse gas emissions, because it is carbon-neutral: it produces as much carbon as the plants suck out of the atmosphere.

A new UC Berkeley study shows that if biomass electricity production is combined with carbon capture and sequestration in the western United States, power generators could actually store more carbon than they emit and make a critical contribution to an overall zero-carbon future by the second half of the 21st century.

By capturing carbon from burning biomass -- termed bioenergy with carbon capture and sequestration (BECCS) -- power generators could become carbon-negative even while retaining gas- or coal-burning plants. The carbon reduction might even offset the emissions from fossil fuel used in transportation, said study leader Daniel Sanchez, a graduate student in UC Berkeley's Energy and Resources Group.

"There are a lot of commercial uncertainties about carbon capture and sequestration technologies," Sanchez admitted. "Nevertheless, we're taking this technology and showing that in the Western United States 35 years from now, BECCS doesn't merely let you reduce emissions by 80 percent -- the current 2050 goal in California -- but gets the power system to negative carbon emissions: you store more carbon than you create."

BECCS may be one of the few cost-effective carbon-negative opportunities available to mitigate the worst effects of anthropogenic climate change, said energy expert Daniel Kammen, who directed the research. This strategy will be particularly important should climate change be worse than anticipated, or emissions reductions in other portions of the economy prove particularly difficult to achieve.

"Biomass, if managed sustainably can provide the 'sink' for carbon that, if utilized in concert with low-carbon generation technologies, can enable us to reduce carbon in the atmosphere," said Kammen, a Professor of Energy in UC Berkeley's Energy and Resources Group and director of the Renewable and Appropriate Energy Laboratory (RAEL) in which the work was conducted.

Sanchez, Kammen and their colleagues published their analysis of BECCS in western North America Feb. 9 in the online journal Nature Climate Change.

Carbon Capture & Sequestration

Though the financial costs, not to mention the technological hurdles, of capturing carbon from biomass power plants and storing it underground are huge, the Intergovernmental Panel on Climate Change (IPCC), the major international body studying the issue, assumes that the technology will become viable within 50 years and includes it in its long-term projections.

"BECCS technologies figure prominently in the IPCC's recent Fifth Assessment Report (AR5), which focuses in part on mitigating climate change, but previous models examining BECCS deployment have not investigated its role in power systems in detail or in aggressive time frames," said Kammen, who serves as a coordinating lead author on the IPCC.

To remedy this, the UC Berkeley scientists used a detailed computer model they developed of the West's electric power grid to predict deployment of BECCS in low-carbon and carbon-negative power systems. This model of western North America, called SWITCH-WECC, was developed in the RAEL lab. Researchers can use SWITCH to study generation, transmission and storage options for the United States west of the Kansas/Colorado border as well as in northwest Mexico and the Canadian provinces of Alberta and British Columbia.

The study found that BECCS, combined with aggressive renewable energy deployment and fossil emissions reductions, can enable a carbon-negative power system in western North America by 2050 with up to 145 percent emissions reduction from 1990 levels. Such reductions can occur with as little as 7 percent of the power coming from BECCS. In most scenarios explored, the carbon offsets produced by BECCS are more valuable to the power system than the electricity it provides.
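The arithmetic behind a carbon-negative grid with only a 7 percent BECCS share is simple once one generator type is allowed a negative emission factor. A toy mix for illustration; the shares and factors below are ours, not SWITCH-WECC outputs:

```python
# (share of generation, emission factor in kg CO2 per MWh) -- illustrative.
mix = {
    "wind/solar/hydro": (0.55,     0.0),
    "gas (CCGT)":       (0.10,   400.0),
    "gas + CCS":        (0.28,    50.0),
    "BECCS":            (0.07, -1000.0),  # biomass absorbs CO2; CCS buries it
}

net = sum(share * factor for share, factor in mix.values())
print(f"net grid intensity: {net:.0f} kg CO2/MWh")  # negative = net storage
```

Even this crude accounting shows how a small negative-emissions slice can cancel the residual fossil emissions of the rest of the system.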

The study relies on a detailed spatial and temporal inventory of potential bioenergy feedstocks, such as forest residues, municipal solid waste and switchgrass, as well as complementary renewable energy, such as wind and solar power.

Sanchez noted that burning biomass as part of BECCS may have a greater impact on greenhouse gas emissions than using these same feedstocks for biofuels, solely because of the possibility of carbon capture.

"We're evaluating a technology with some uncertainty behind it, but we are saying that if the technology exists, it really sketches out a different kind of climate mitigation pathway than what people are assuming," Sanchez said.

Story Source:
The above story is based on materials provided by University of California - Berkeley.

University of California - Berkeley. "Electricity from biomass with carbon capture could make western US carbon-negative." ScienceDaily. ScienceDaily, 9 February 2015. www.sciencedaily.com/releases/2015/02/150209130732.htm.

Tuesday, February 10, 2015

Preventing Greenhouse Gas from Entering the Atmosphere

Date: February 5, 2015

Source: Harvard University

Summary: A novel class of materials that enable a safer, cheaper, and more energy-efficient process for removing greenhouse gas from power plant emissions has been developed by a multi-institution team of researchers. The approach could be an important advance in carbon capture and sequestration.


Microcapsule method offers new approach to carbon capture and storage at power plants. The new carbon-capture technique employs an abundant and environmentally benign sorbent: sodium carbonate, a close chemical relative of kitchen-grade baking soda. The microencapsulated carbon sorbents (MECS) achieve an order-of-magnitude increase in CO2 absorption rates compared to sorbents currently used. This illustration shows the flow-focusing microfluidic capillary device used to produce the silicone microcapsules.
Credit: John Vericella, Chris Spadaccini, and Roger Aines/LLNL; James Hardin and Jennifer Lewis/Harvard University; Nature

A team of researchers has developed a novel class of materials that enable a safer, cheaper, and more energy-efficient process for removing greenhouse gas from power-plant emissions. The approach could be an important advance in carbon capture and sequestration.

The team, led by scientists from Harvard University and Lawrence Livermore National Laboratory, employed a microfluidic assembly technique to produce microcapsules that contain liquid sorbents, or absorbing materials, encased in highly permeable polymer shells. They have significant performance advantages over the carbon-absorbing materials used in current capture and sequestration technology.

The work is described in a paper published online today in the journal Nature Communications.

"Microcapsules have been used in a variety of applications -- for example, in pharmaceuticals, food flavoring, cosmetics, and agriculture -- for controlled delivery and release, but this is one of the first demonstrations of this approach for controlled capture," said Jennifer A. Lewis, the Hansjörg Wyss Professor of Biologically Inspired Engineering at the Harvard School of Engineering and Applied Sciences (SEAS) and a co-lead author. Lewis is also a core faculty member of the Wyss Institute for Biologically Inspired Engineering at Harvard.

Power plants are the single largest source of carbon dioxide (CO2), a greenhouse gas that traps heat and makes the planet warmer. According to the U.S. Environmental Protection Agency, coal- and natural gas-fired plants were responsible for a third of U.S. greenhouse gas emissions in 2012.

That's why the agency has proposed rules mandating dramatically reduced carbon emissions at all new fossil fuel-fired power plants. Satisfying the new standards will require operators to equip plants with carbon-trapping technology.

Current carbon-capture technology uses caustic amine-based solvents to separate CO2 from the flue gas escaping a facility's smokestacks. But these state-of-the-art processes are expensive, significantly reduce a power plant's output, and yield toxic byproducts. The new technique employs an abundant and environmentally benign sorbent: sodium carbonate, a close chemical relative of kitchen-grade baking soda. The microencapsulated carbon sorbents (MECS) achieve an order-of-magnitude increase in CO2 absorption rates compared to the sorbents currently used in carbon capture. Another advantage is that amines break down over time, while carbonates have a virtually limitless shelf life.
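One plausible way to see where an order-of-magnitude rate gain can come from (our reasoning, not a calculation from the paper): if uptake is limited by gas-liquid contact area, dividing the sorbent into small spheres multiplies its surface-to-volume ratio, which for a sphere is 3/r. The sizes below are illustrative:

```python
def surface_to_volume(radius_m):
    """Sphere surface-area-to-volume ratio: (4*pi*r^2) / (4/3*pi*r^3) = 3/r."""
    return 3.0 / radius_m

# Illustrative sizes: a ~2 mm solvent droplet in a conventional contactor
# versus a ~200-micron-radius microcapsule.
drop    = surface_to_volume(2e-3)
capsule = surface_to_volume(200e-6)

print(f"area/volume, 2 mm drop:       {drop:,.0f} per m")
print(f"area/volume, 200 um capsule:  {capsule:,.0f} per m")
print(f"enhancement if area-limited: ~{capsule / drop:.0f}x")
```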

"MECS provide a new way to capture carbon with fewer environmental issues," said Roger D. Aines, leader of the fuel cycle innovations program at Lawrence Livermore National Laboratory and a co-lead author. "Capturing the world's carbon emissions is a huge job. We need technology that can be applied to many kinds of carbon dioxide sources, with the public's full confidence in the safety and sustainability."

Researchers at Lawrence Livermore and the U.S. Department of Energy's National Energy Technology Lab are now working on enhancements to the capture process to bring the technology to scale.

Aines says that the MECS-based approach could also be tailored to industrial processes like steel and cement production, which are significant greenhouse gas sources.

"These permeable silicone beads could be a 'sliced-bread' breakthrough for CO2 capture -- efficient, easy-to-handle, minimal waste, and cheap to make," said Stuart Haszeldine, a professor of carbon capture and storage at the University of Edinburgh, who was not involved in the research. "Durable, safe, and secure capsules containing solvents tailored to diverse applications can place CO2 capture … firmly onto the cost-reduction pathway."

MECS are produced using a double-capillary device in which the flow rates of three fluids -- a carbonate solution combined with a catalyst for enhanced CO2 absorption, a photo-curable silicone that forms the capsule shell, and an aqueous solution -- can be independently controlled.

"Encapsulation allows you to combine the advantages of solid-capture media and liquid-capture media in the same platform," said Lewis. "It is also quite flexible, in that both the core and shell chemistries can be independently modified and optimized."

"This innovative gas separation platform provides large surface areas while eliminating a number of operational issues, including corrosion, evaporative losses, and fouling," said Ah-Hyung (Alissa) Park, the chair in applied climate science and associate professor of Earth and environmental engineering at Columbia University, who was not involved in the research.

Lewis has previously conducted groundbreaking research in the 3-D printing of functional materials, including tissue constructs with embedded vasculature, lithium-ion microbatteries, and ultra-lightweight carbon-fiber epoxy materials.

Funding for the encapsulated liquid carbonates work was provided by the Innovative Materials and Processes for Advanced Carbon Capture Technology program of the U.S. Department of Energy's Advanced Research Projects Agency-Energy.

Story Source:

Harvard University. "Preventing greenhouse gas from entering the atmosphere." ScienceDaily. ScienceDaily, 5 February 2015.

Wednesday, February 4, 2015

Rivers Might Constitute Just 20 Percent of Continental Water Flowing into Oceans

Date: February 2, 2015

Source: University of South Carolina

Summary: The Amazon, Nile and Mississippi are mighty rivers, but they and all their worldwide brethren might be a relative trickle compared with an unseen torrent below the surface. New research shows that rivers might constitute as little as 20 percent of the water that flows yearly into the Atlantic and Indo-Pacific Oceans from the continents. The rest flows through what is termed the 'subterranean estuary,' which some researchers think supplies the lion's share of terrestrial nutrients to the oceans.



Professor emeritus Willard Moore was elected this year as a Fellow of the American Association for the Advancement of Science, which is among the oldest scientific societies in America and is the publisher of the journal Science.



In recently published research, Moore was part of a team that used observed concentrations of radium-228 to model the global distribution of the radionuclide between 60 degrees South and 70 degrees North latitude. [Image adapted from Geophysical Research Letters]


If you think rivers are what send terrestrial rainfall back into the oceans, you don't know the half of it. And that fraction keeps shrinking. According to new research, it might be that only one-fifth of the water flowing from the continents into the Atlantic and Indo-Pacific Oceans runs through overland channels of water. And just as surprising, a vast amount flows into the land from the ocean.

University of South Carolina professor Willard Moore is part of an international team that recently estimated how much of the water flowing into the oceans comes not from surface rivers, creeks and streams, but instead from what he has termed the "subterranean estuary." For two decades, Moore has drawn attention to the oft-overlooked flow and exchange of ocean and groundwater in the permeable layers of rock and sediment, a process that occurs both near the coastline and extending out on the continental shelf. But the roots of his work in the field go back even further in his 50-year scientific career.

Developing the Tools of the Trade

Soon after earning a doctorate at the State University of New York at Stony Brook, Moore began working in the early 1970s as a civilian in the Naval Oceanographic Office, then located in Maryland. The task at hand was to study deep ocean processes, and one of the best tools for doing that was to measure the amounts of certain naturally occurring radioactive elements dissolved in the seawater at different locations and depths.

"There's a little bit of uranium and thorium in all rocks, and as those elements decay, they produce a whole string of different elements, which themselves are radioactive," Moore says. "So say a rock is in seawater and the uranium decays to thorium and then it decays to radium. Well, the radium is much more soluble than the other components, so it can go into solution in the seawater. There are very few ways to remove it naturally, except by radioactive decay."

Moore likens the radium that dissolves from a rock to a dye that slowly loses its intensity over time. The half-life of radium establishes how fast the "dye" loses intensity, and by measuring how radium concentrations diminish with increasing distance from the seafloor -- the rock source -- a scientist can come up with a model for how the water there is flowing and mixing.
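Moore's fading-dye analogy reduces to simple radiochemistry: radium-228, the isotope used in the global study described below, has a half-life of about 5.75 years, so the ratio of a sample's activity to the activity at its source yields an apparent transit time. A minimal sketch, assuming radioactive decay is the only sink (as Moore notes, little else removes radium from seawater); the 40 percent figure is hypothetical:

```python
import math

HALF_LIFE_YR = 5.75                 # radium-228 half-life
DECAY = math.log(2) / HALF_LIFE_YR  # decay constant, 1/yr

def apparent_age_years(activity_ratio):
    """Years for activity to fall to this fraction of its source value."""
    return -math.log(activity_ratio) / DECAY

# Hypothetical: an offshore sample retains 40% of the near-source activity.
print(f"apparent age: {apparent_age_years(0.40):.1f} years")
```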

A major shortcoming at the time, though, was how laborious it was to collect data. Radioactive elements are present in very small concentrations in seawater.

"It used to be that you needed about 600 to 800 liters of water," amounting to more than 200 gallons for a single data point, Moore says. "It was a very time-consuming series of chemical processes, and I decided early on that if we were really going to use radium to understand the ocean, we had to have a better way to extract it."

From Lake Oneida to the Sea

Moore put together a new method after mulling over a few disparate observations. Some colleagues had come up with a much more efficient means for extracting a different radioactive isotope, silicon-32, by coating an acrylic fiber with an iron compound and then exposing it to flowing seawater. Silica (which contains silicon-32) in the water is adsorbed onto the fiber, effectively concentrating the radionuclide into a much smaller area (namely, on the surface of the fiber rather than dispersed in many gallons of water).

The iron coating on the fiber didn't pick up radium, but Moore found something that did while working on a small side project. He was trying to understand the growth of a characteristic kind of rock formation found in certain places on the bottom of Lake Oneida, a freshwater lake of glacial origin in central New York. The formation is called a manganese nodule, which has alternating layers rich in manganese and iron oxides.

In the course of that research, he found that the nodules were rich in radium as well, which put him on the idea that perhaps manganese dioxide could be used to extract radium from seawater.

"I remember very clearly when I saw the first counts on the radium in the nodule. I walked into the lab, mixed up a manganese solution, and put it on the fiber," Moore says. "I was living on Chesapeake Bay and had a sailboat, so I went out, towed it through the bay, came back and it was loaded with radium. It's just an illustration of how if you have several irons in the fire, they all don't get hot at the same time, but sometimes one will ignite another one."

Putting the Tools to Work

Moore joined the Carolina faculty in 1976 with a ready means of determining radium concentration in seawater, and he expanded his repertoire from the deep-sea research that characterized his work with the Navy to closer-to-shore studies. A primary goal in South Carolina was to understand the exchange processes between water and surface sediments in estuaries near the coastline.

Much of the early work with radium, though, raised all sorts of questions, Moore says. The gradients they were seeing simply didn't make sense, but in retrospect it was because one basic assumption was way off.

At the time, it was generally thought that groundwater flow into the ocean was insignificant, maybe 3 percent to 5 percent of river flow, Moore says. The breakthrough came when one of his colleagues suggested that a sizable salty groundwater flow must be responsible for their observations.

They measured radium in inland wells, finding that fresh groundwater had almost none, but that saltier groundwater was loaded with it. The inescapable conclusion: water from the ocean was being exchanged with groundwater in prodigious quantities, and it was happening underground.

"The action was in the permeable sediments below. It started this whole idea that the continents were connected to the ocean not only by riverine processes, but by submarine processes." Moore says. "I came up with the term subterranean estuary. So just like the surface estuary, it's the region between the coast and the ocean where freshwater is coming in on one side and seawater is coming in on the other side, they're mixing, and after chemical reactions, some of that water is expelled back into the ocean."

Moore was part of an international team that developed a quantitative model for submarine groundwater discharge across most of the globe, and they just published a paper in Geophysical Research Letters showing that the amount of subterranean water flow into the Atlantic and Indo-Pacific Oceans is some three to four times that of all rivers combined. Perhaps even more important is the conclusion that most of the flow of terrestrial nutrients is subterranean.
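That multiple is exactly where the headline figure comes from: if subterranean discharge is three to four times river flow, rivers carry 1/(1+3) to 1/(1+4) of the total, i.e. roughly 20 to 25 percent. A one-line check:

```python
for multiple in (3.0, 4.0):
    print(f"SGD = {multiple:.0f}x rivers -> rivers are {1 / (1 + multiple):.0%} of total flow")
```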

"If you put a lot of nutrients into the ocean, you increase primary productivity. You make lots of algae, which may be good, but excessive algae settles out and as it decays it uses up oxygen from the bottom water," Moore says. "We call it hypoxia, where the oxygen is so low fish can't breathe. So, productivity is a delicate balance.

"Currently, in most of the estimates of how nutrients come into the water, it's thought to be coming from rivers, springs, streams -- things you can see -- or from point source pollution, sewage, drainage pipes off of golf courses. But people have not considered how much is coming from the subterranean estuary. It's a whole biogeochemical process that's going on that people haven't really thought about very much."

Story Source:
The above story is based on materials provided by University of South Carolina. The original article was written by Steven Powell.