Category Archives: Research

A promising concept on the path to fusion energy (IEEE Transactions on Plasma Science)

by John Greenwald, Princeton Plasma Physics Laboratory

QUASAR stellarator design (Source: PPPL)

Completion of a promising experimental facility at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) could advance the development of fusion as a clean and abundant source of energy for generating electricity, according to a PPPL paper published this month in the journal IEEE Transactions on Plasma Science.

The facility, called the Quasi-Axisymmetric Stellarator Research (QUASAR) experiment, represents the first of a new class of fusion reactors based on the innovative theory of quasi-axisymmetry, which makes it possible to design a magnetic bottle that combines the advantages of the stellarator with the more widely used tokamak design. Experiments in QUASAR would test this theory. Construction of QUASAR — originally known as the National Compact Stellarator Experiment — was begun in 2004 and halted in 2008 when costs exceeded projections after some 80 percent of the machine’s major components had been built or procured.

“This type of facility must have a place on the roadmap to fusion,” said physicist George “Hutch” Neilson, the head of the Advanced Projects Department at PPPL.

Both stellarators and tokamaks use magnetic fields to control the hot, charged plasma gas that fuels fusion reactions. While tokamaks put electric current into the plasma to complete the magnetic confinement and hold the gas together, stellarators don’t require such a current to keep the plasma bottled up. Stellarators rely instead on twisting — or 3D — magnetic fields to contain the plasma in a controlled “steady state.”

Stellarator plasmas thus run little risk of disrupting — or falling apart — as can happen in tokamaks if the internal current abruptly shuts off. Developing systems to suppress or mitigate such disruptions is a challenge that builders of tokamaks like ITER, the international fusion experiment under construction in France, must face.

Stellarators had been the main line of fusion development in the 1950s and early 1960s before taking a back seat to tokamaks, whose symmetrical, doughnut-shaped magnetic field geometry produced good plasma confinement and proved easier to create. But breakthroughs in computing and physics understanding have revitalized interest in the twisty, cruller-shaped stellarator design and made it the subject of major experiments in Japan and Germany.

PPPL developed the QUASAR facility with both stellarators and tokamaks in mind. Tokamaks produce magnetic fields and a plasma shape that are the same all the way around the axis of the machine — a feature known as “axisymmetry.” QUASAR is symmetrical too, but in a different way. While QUASAR was designed to produce a twisting and curving magnetic field, the strength of that field varies gently, as in a tokamak — hence the name “quasi-symmetry” (QS) for the design. This property of the field strength was predicted to give QUASAR plasma confinement properties identical to those of tokamaks.

“If the predicted near-equivalence in the confinement physics can be validated experimentally,” Neilson said, “then the development of the QS line may be able to continue as essentially a ‘3D tokamak.’”

Such development would test whether a QUASAR-like design could be a candidate for a demonstration — or DEMO — fusion facility that would pave the way for construction of a commercial fusion reactor generating electricity for the power grid.

Read the paper.

George Neilson, David Gates, Philip Heitzenroeder, Joshua Breslau, Stewart Prager, Timothy Stevenson, Peter Titus, Michael Williams, and Michael Zarnstorff. Next Steps in Quasi-Axisymmetric Stellarator Research. IEEE Transactions on Plasma Science, vol. 42, no. 3, March 2014.

The research was supported by the U.S. Department of Energy under contract DE-AC02-09CH11466. Princeton University manages PPPL, which is part of the national laboratory system funded by the U.S. Department of Energy through the Office of Science.

A more potent greenhouse gas than CO2, methane emissions will leap as Earth warms (Nature)

Freshwater wetlands can release methane, a potent greenhouse gas, as the planet warms. (Image source: RGBstock.com)

By Morgan Kelly, Office of Communications

While carbon dioxide is typically painted as the bad boy of greenhouse gases, methane is roughly 30 times more potent as a heat-trapping gas. New research in the journal Nature indicates that for each degree the Earth’s temperature rises, the amount of methane entering the atmosphere from microorganisms dwelling in lake sediment and freshwater wetlands — the primary sources of the gas — will increase severalfold. As temperatures rise, methane emissions from these sources will increase proportionally more than carbon dioxide emissions, the researchers report.

The findings condense the complex and varied process by which methane — currently the third most prevalent greenhouse gas after carbon dioxide and water vapor — enters the atmosphere into a measurement scientists can use, explained co-author Cristian Gudasz, a visiting postdoctoral research associate in Princeton’s Department of Ecology and Evolutionary Biology. In freshwater systems, methane is produced as microorganisms digest organic matter, a process known as “methanogenesis.” This process hinges on a slew of temperature, chemical, physical and ecological factors that can bedevil scientists working to model how the Earth’s systems will contribute, and respond, to a hotter future.

The researchers’ findings suggest that methane emissions from freshwater systems will likely rise with the global temperature, Gudasz said. But without knowing the extent of the methane contribution from such widely dispersed ecosystems — including lakes, swamps, marshes and rice paddies — climate projections are left with a glaring hole.

“The freshwater systems we talk about in our paper are an important component to the climate system,” Gudasz said. “There is more and more evidence that they have a contribution to the methane emissions. Methane produced from natural or manmade freshwater systems will increase with temperature.”

To provide a simple and accurate way for climate modelers to account for methanogenesis, Gudasz and his co-authors analyzed nearly 1,600 measurements of temperature and methane emissions from 127 freshwater ecosystems across the globe.

New research in the journal Nature found that for each degree that the Earth’s temperature rises, the amount of methane entering the atmosphere from microorganisms dwelling in freshwater wetlands — a primary source of the gas — will increase several times. The researchers analyzed nearly 1,600 measurements of temperature and methane emissions from 127 freshwater ecosystems across the globe (above), including lakes, swamps, marshes and rice paddies. The size of each point corresponds with the average rate of methane emissions in milligrams per square meter, per day, during the course of the study. The smallest points indicate less than one milligram per square meter, while the largest-sized point represents more than three milligrams. (Image courtesy of Cristian Gudasz)

The researchers found that a common effect emerged from those studies: freshwater methane production rises sharply with temperature. Methane emissions would be 57 times higher at 30 degrees Celsius than at 0 degrees Celsius, the researchers report. For those inclined to model it, the results translate to a temperature dependence of 0.96 electron volts (eV), a measure of how sensitive these methane-emitting ecosystems are to temperature.
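
For readers inclined to check the arithmetic, a temperature dependence of 0.96 eV corresponds to a Boltzmann-Arrhenius scaling of emission rates, and that scaling reproduces the reported 57-fold increase between 0 and 30 degrees Celsius. The short Python sketch below is an illustration of that relationship, not code from the study.

    import math

    # Boltzmann-Arrhenius scaling: rate is proportional to exp(-E / (k_B * T)).
    # Illustrative check of the reported numbers, not the authors' code.
    K_B = 8.617e-5   # Boltzmann constant in eV per kelvin
    E_A = 0.96       # reported apparent activation energy in eV

    def emission_ratio(t_cold_c, t_warm_c):
        """Ratio of methane emission rates between two temperatures in deg C."""
        t_cold = t_cold_c + 273.15   # convert to kelvin
        t_warm = t_warm_c + 273.15
        return math.exp((E_A / K_B) * (1.0 / t_cold - 1.0 / t_warm))

    print(round(emission_ratio(0.0, 30.0)))   # prints 57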

“We all want to make predictions about greenhouse gas emissions and their impact on global warming,” Gudasz said. “Looking across these scales and constraining them as we have in this paper will allow us to make better predictions.”

Read the abstract.

Yvon-Durocher, Gabriel, Andrew P. Allen, David Bastviken, Ralf Conrad, Cristian Gudasz, Annick St-Pierre, Nguyen Thanh-Duc, and Paul A. del Giorgio. 2014. Methane fluxes show consistent temperature dependence across microbial to ecosystem scales. Nature. Published online before print March 19, 2014, and in the March 27, 2014 print edition. DOI: 10.1038/nature13164.

It slices, it dices, and it protects the body from harm (Science)

By Catherine Zandonella, Office of the Dean for Research

Researchers at Princeton have deciphered the 3D structure of RNase L, an enzyme that slices through RNA and is a first responder in the innate immune system. The structure contains two subunits, represented in red as two parts of a pair of scissors. Illustration by Sneha Rath; inset courtesy of Science.

An essential weapon in the body’s fight against infection has come into sharper view. Researchers at Princeton University have discovered the 3D structure of an enzyme that cuts to ribbons the genetic material of viruses and helps defend against bacteria.

The discovery of the structure of this enzyme, a first-responder in the body’s “innate immune system,” could enable new strategies for fighting infectious agents and possibly prostate cancer and obesity. The work was published Feb. 27 in the journal Science.

Until now, the research community has lacked a structural model of the human form of this enzyme, known as RNase L, said Alexei Korennykh, an assistant professor of molecular biology and leader of the team that made the discovery.

“Now that we have the human RNase L structure, we can begin to understand the effects of carcinogenic mutations in the RNase L gene. For example, families with hereditary prostate cancers often carry genetic mutations in the region, or locus, encoding RNase L,” Korennykh said. The connection is so strong that the RNase L locus also goes by the name “hereditary prostate cancer 1.” The newly found structure reveals the positions of these mutations and explains why some of these mutations could be detrimental, perhaps leading to cancer, Korennykh said. RNase L is also essential for insulin function and has been implicated in obesity.

The Princeton team’s work has also led to new insights on the enzyme’s function.

The enzyme is an important player in the innate immune system, a rapid and broad response to invaders that includes the production of a molecule called interferon. Interferon relays distress signals from infected cells to neighboring healthy cells, thereby activating RNase L and turning on its ability to slice through RNA, a type of genetic material that is similar to DNA. The result is neighboring cells armed to destroy the foreign RNA.

The 3D structure uncovered by Korennykh and his team consists of two nearly identical subunits called protomers. The researchers found that one protomer finds and attaches to the RNA, while the other protomer snips it.

The initial protomer latches onto one of the four “letters” that make up the RNA code, in particular, the “U,” which stands for a component of RNA called uridine. The other protomer “counts” RNA letters starting from the U, skips exactly one letter, then cuts the RNA.
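
As a purely schematic illustration of that counting rule (find a U, skip exactly one letter, cut), the toy Python function below marks where cuts would fall in an RNA string. It simply restates the description above; it is not a biochemical model of RNase L.

    def cleavage_sites(rna):
        """Indices after which the RNA would be cut under the rule described
        above: anchor on a 'U', skip exactly one letter, then cleave.
        Schematic illustration only, not a model of the enzyme."""
        sites = []
        for i, base in enumerate(rna):
            if base == "U" and i + 1 < len(rna):
                sites.append(i + 1)   # cut falls after the letter following the U
        return sites

    print(cleavage_sites("AUGCUUAGC"))   # [2, 5, 6]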

Although the enzyme can slice any RNA, even that of the body’s own cells, it only does so when activated by interferon.

“We were surprised to find that the two protomers were identical but have different roles, one binding and one slicing,” Korennykh said. “Enzymes usually have distinct sites that bind the substrate and catalyze reactions. In the case of RNase L, it appears that the same exact protein surface can do both binding and catalysis. One RNase L subunit randomly adopts a binding role, whereas the other identical subunit has no other choice but to do catalysis.”

To discover the enzyme’s structure, the researchers first created a crystal of the RNase L enzyme. The main challenge was finding the right combination of chemical treatments that would force the enzyme to crystallize without destroying it.

After much trial and error and with the help of an automated system, postdoctoral research associate Jesse Donovan and graduate student Yuchen Han succeeded in making the crystals.

Next, the crystals were bombarded with powerful X-rays, which diffract when they hit the atoms in the crystal and form patterns indicative of the crystal’s structure. The diffraction patterns revealed how the atoms of RNase L are arranged in 3D space.

At the same time Sneha Rath, a graduate student in Korennykh’s laboratory, worked on understanding the RNA cleavage mechanism of RNase L using synthetic RNA fragments. Rath’s results matched the structural findings of Han and Donovan, and the two pieces of data ultimately revealed how RNase L cleaves its RNA targets.

Han, Donovan and Rath contributed equally to the paper and are listed as co-first authors.

Finally, senior research specialist Gena Whitney and graduate student Alisha Chitrakar conducted additional studies of RNase L in human cells, confirming the 3D structure.

Now that the human structure has been solved, researchers can explore ways to either enhance or dampen RNase L activity for medical and therapeutic uses, Korennykh said.

“This work illustrates the wonderful usefulness of doing both crystallography and careful kinetic and enzymatic studies at the same time,” said Peter Walter, professor of biochemistry and biophysics at the University of California-San Francisco School of Medicine. “Crystallography gives a static picture which becomes vastly enhanced by studies of the kinetics.”

Support for the work was provided by Princeton University.

Read the abstract.

Han, Yuchen, Jesse Donovan, Sneha Rath, Gena Whitney, Alisha Chitrakar, and Alexei Korennykh. Structure of Human RNase L Reveals the Basis for Regulated RNA Decay in the IFN Response. Science. Published online Feb. 27, 2014. DOI: 10.1126/science.1249845.

Now in 3D: Video of virus-sized particle trying to enter cell (Nature Nanotechnology)

3D movie (below) of a virus-like nanoparticle trying to gain entry to a cell

By Catherine Zandonella, Office of the Dean for Research

Tiny and swift, viruses are hard to capture on video. Now researchers at Princeton University have achieved an unprecedented look at a virus-like particle as it tries to break into and infect a cell. The technique they developed could help scientists learn more about how to deliver drugs via nanoparticles — which are about the same size as viruses — as well as how to prevent viral infection from occurring.

The video reveals a virus-like particle zipping around in a rapid, erratic manner until it encounters a cell, bounces and skids along the surface, and either lifts off again or, in much less time than it takes to blink an eye, slips into the cell’s interior. The work was published in Nature Nanotechnology.

Video caption: ‘Kiss and run’ on the cell surface. This 3D movie shows actual footage of a virus-like particle (red dot) approaching a cell (green with reddish brown nucleus), as captured by Princeton University researchers Kevin Welsher and Haw Yang. The color of the particle represents its speed, with red indicating rapid movement and blue indicating slower movement. The virus-like particle lands on the surface of the cell, appears to try to enter it, then takes off again. Source: Nature Nanotechnology.

“The challenge in imaging these events is that viruses and nanoparticles are small and fast, while cells are relatively large and immobile,” said Kevin Welsher, a postdoctoral researcher in Princeton’s Department of Chemistry and first author on the study. “That has made it very hard to capture these interactions.”

The problem can be compared to shooting video of a hummingbird as it roams around a vast garden, said Haw Yang, associate professor of chemistry and Welsher’s adviser. Focus the camera on the fast-moving hummingbird, and the background will be blurred. Focus on the background, and the bird will be blurred.

The researchers solved the problem by using two cameras, one that locked onto the virus-like nanoparticle and followed it faithfully, and another that filmed the cell and surrounding environment.

Putting the two images together yielded a level of detail about the movement of nano-sized particles that has never before been achieved, Yang said. Prior to this work, he said, the only way to see small objects at a similar resolution was to use a technique called electron microscopy, which requires killing the cell.

“What Kevin has done that is really different is that he can capture a three-dimensional view of a virus-sized particle attacking a living cell, whereas electron microscopy is in two-dimensions and on dead cells,” Yang said. “This gives us a completely new level of understanding.”

In addition to simply viewing the particle’s antics, the researchers can use the technique to map the contours of the cell surface, which is bumpy with proteins that push up from beneath the surface. By following the particle’s movement along the surface of the cell, the researchers were able to map the protrusions, just as a blind person might use his or her fingers to construct an image of a person’s face.

“Following the motion of the particle allowed us to trace very fine structures with a precision of about 10 nanometers, which typically is only available with an electron microscope,” Welsher said. (A nanometer is one billionth of a meter; a human hair is roughly 100,000 nanometers wide.) He added that measuring changes in the speed of the particle allowed the researchers to infer the viscosity of the extracellular environment just above the cell surface.
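
The paper's exact procedure is not spelled out here, but one standard way to turn measured particle motion into a viscosity estimate is the Stokes-Einstein relation. The sketch below uses that relation with hypothetical numbers; it is a generic physics illustration, not necessarily the method used in the study.

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K

    def viscosity_from_diffusion(d_coeff, radius, temp_k=298.0):
        """Viscosity in Pa*s from a measured diffusion coefficient (m^2/s)
        and particle radius (m), via the Stokes-Einstein relation."""
        return K_B * temp_k / (6.0 * math.pi * d_coeff * radius)

    # Hypothetical example: a particle of 50 nm radius diffusing at
    # 4.4e-12 m^2/s implies a roughly water-like viscosity near 1e-3 Pa*s.
    print(viscosity_from_diffusion(4.4e-12, 50e-9))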

The technology has potential benefits for both drug discovery and basic scientific discovery, Yang said.  “We believe this will impact the study of how nanoparticles can deliver medicines to cells, potentially leading to some new lines of defense in antiviral therapies,” he said. “For basic research, there are a number of questions that can now be explored, such as how a cell surface receptor interacts with a viral particle or with a drug.”

Welsher added that such basic research could lead to new strategies for keeping viruses from entering cells in the first place.

“If we understand what is happening to the virus before it gets to your cells,” said Welsher, “then we can think about ways to prevent infection altogether. It is like deflecting missiles before they get there rather than trying to control the damage once you’ve been hit.”

To create the virus-like particle, the researchers coated a minuscule polystyrene ball with quantum dots, which are semiconductor bits that emit light and allow the camera to find the particle. Next, the particle was studded with protein segments known as Tat peptides, derived from the HIV-1 virus, which help the particle find the cell. The width of the final particle was about 100 nanometers.

The researchers then let loose the particles into a dish containing skin cells known as fibroblasts. One camera followed the particle while a second imaging system took pictures of the cell using a technique called laser scanning microscopy, which involves taking multiple images, each in a slightly different focal plane, and combining them to make a three-dimensional picture.
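
The final step of that imaging pipeline, combining the per-focal-plane frames into a volume, amounts to stacking 2D images along a z-axis. Below is a minimal sketch with synthetic data standing in for microscope frames.

    import numpy as np

    # Minimal sketch of assembling a 3D volume from laser scanning microscopy
    # frames, one 2D image per focal plane. The data here are synthetic;
    # in practice each plane would come from the microscope.
    n_planes, height, width = 50, 512, 512
    planes = [np.random.rand(height, width) for _ in range(n_planes)]

    # Stacking along a new axis gives a (z, y, x) volume that can be
    # sliced or rendered to view the cell in three dimensions.
    volume = np.stack(planes, axis=0)
    print(volume.shape)   # (50, 512, 512)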

The research was supported by the U.S. Department of Energy (DE-SC0006838) and by Princeton University.

Read the abstract.

Kevin Welsher and Haw Yang. 2014. Multi-resolution 3D visualization of the early stages of cellular uptake of peptide-coated nanoparticles. Nature Nanotechnology. Published online Feb. 23, 2014. DOI: 10.1038/nnano.2014.12.

Rife with hype, exoplanet study needs patience and refinement (PNAS)

By Morgan Kelly, Office of Communications

Exoplanet transiting in front of its star. Princeton’s Adam Burrows argues against drawing too many conclusions about such distant objects with today’s technologies. Photo credit: ESA/C. Carreau

Imagine someone spent months researching new cities to call home using low-resolution images of unidentified skylines. The pictures were taken from several miles away with a camera intended for portraits, and at sunset. From these fuzzy snapshots, that person claims to know the city’s air quality, the appearance of its buildings, and how often it rains.

This technique is similar to how scientists often characterize the atmosphere — including the presence of water and oxygen — of planets outside of Earth’s solar system, known as exoplanets, according to a review of exoplanet research published in the Proceedings of the National Academy of Sciences.

A planet’s atmosphere is the gateway to its identity, including how it was formed, how it developed and whether it can sustain life, stated Adam Burrows, author of the review and a Princeton University professor of astrophysical sciences.

But the dominant methods for studying exoplanet atmospheres are not intended for objects as distant, dim and complex as planets trillions of miles from Earth, Burrows said. They were instead designed to study much closer or brighter objects, such as planets in Earth’s solar system and stars.

Nonetheless, scientific reports and the popular media brim with excited depictions of Earth-like planets ripe for hosting life and other conclusions that are based on vague and incomplete data, Burrows wrote in the first in a planned series of essays that examine the current and future study of exoplanets. Despite many trumpeted results, few “hard facts” about exoplanet atmospheres have been collected since the first planet was detected in 1992, and most of these data are of “marginal utility.”

The good news is that the past 20 years of study have brought to the fore a new generation of exoplanet researchers who are establishing new techniques, technologies and theories. As with any relatively new field of study, fully understanding exoplanets will require a lot of time, resources and patience, Burrows said.

“Exoplanet research is in a period of productive fermentation that implies we’re doing something new that will indeed mature,” Burrows said. “Our observations just aren’t yet of a quality that is good enough to draw the conclusions we want to draw.

“There’s a lot of hype in this subject, a lot of irrational exuberance. Popular media have characterized our understanding as better than it actually is,” he said. “They’ve been able to generate excitement that creates a positive connection between the astrophysics community and the public at large, but it’s important not to hype conclusions too much at this point.”

The majority of data on exoplanet atmospheres come from low-resolution photometry, which captures the variation in light and radiation an object emits, Burrows reported. That information is used to determine a planet’s orbit and radius, but its clouds, surface, and rotation, among other factors, can easily skew the results. Even newer techniques such as capturing planetary transits — which is when a planet passes in front of its star, and was lauded by Burrows as an unforeseen “game changer” when it comes to discovering new planets — can be thrown off by a thick atmosphere and rocky planet core.

All this means that reliable information about a planet can be scarce, so scientists attempt to wring ambitious details out of a few data points. “We have a few hard-won numbers and not the hundreds of numbers that we need,” Burrows said. “We have in our minds that exoplanets are very complex because this is what we know about the planets in our solar system, but the data are not enough to constrain even a fraction of these conceptions.”

Burrows emphasizes that astronomers need to acknowledge that they will never achieve a comprehensive understanding of exoplanets through the direct-observation, stationary methods inherited from the exploration of Earth’s neighbors. He suggests that exoplanet researchers treat photometric interpretations as inherently flawed and ambiguous. Instead, the future of exoplanet study should focus on the more difficult but comprehensive method of spectrometry, wherein the physical properties of objects are gauged by the interaction of their surfaces and elemental features with light wavelengths, or spectra. Spectrometry has been used to determine the age and expansion of the universe.

Existing telescopes and satellites are likewise vestiges of pre-exoplanet observation. Burrows calls for a mix of small, medium and large initiatives that will allow the time and flexibility scientists need to develop tools to detect and analyze exoplanet spectra. He sees this as a challenge in a research environment that often puts quick-payback results over deliberate research and observation. Once scientists obtain high-quality spectral data, however, Burrows predicted, “Many conclusions reached recently about exoplanet atmospheres will be overturned.”

“The way we study planets out of the solar system has to be radically different because we can’t ‘go’ to those planets with satellites or probes,” Burrows said. “It’s much more an observational science. We have to be detectives. We’re trying to find clues and the best clues since the mid-19th century have been in spectra. It’s the only means of understanding the atmosphere of these planets.”

A longtime exoplanet researcher, Burrows predicted the existence of “hot-Jupiter” planets — gas planets similar to Jupiter but orbiting very close to the parent star — in a paper in the journal Nature months before the first such planet, 51 Pegasi b, was discovered in 1995.

Read the abstract.

Citation: Burrows, Adam S. 2014. Spectra as windows into exoplanet atmospheres. Proceedings of the National Academy of Sciences. First published online Jan. 13, 2014. DOI: 10.1073/pnas.1304208111.

Asian ozone pollution in Hawaii is tied to climate variability (Nature Geoscience)

Asian pollution drifts east toward North America in 2010. Hawaii is denoted by the star. (Source: Nature Geoscience)

By Joanne Curcio, Program in Atmospheric and Oceanic Sciences

Air pollution from Asia has been rising for several decades but Hawaii had seemed to escape the ozone pollution that drifts east with the springtime winds. Now a team of researchers has found that shifts in atmospheric circulation explain the trends in Hawaiian ozone pollution.

Researchers found that ozone levels measured during autumn at Mauna Loa Observatory in Hawaii (black line) accurately reflect the trend in rising Asian air pollution from 1975 to 2012. The researchers demonstrated that the autumnal rise in ozone could be explained by atmospheric and climatic shifts over periods of decades. Using a chemistry-climate model, the researchers modeled this autumnal variation in ozone using constant (red) and time-varying (purple) emissions of ozone precursors. (Source: Nature Geoscience.)

The researchers found that since the mid-1990s, these shifts in atmospheric circulation have caused Asian ozone pollution reaching Hawaii to be relatively low in spring but rise significantly in autumn. The study, led by Meiyun Lin, an associate research scholar in the Program in Atmospheric and Oceanic Sciences at Princeton University and a scientist at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory (GFDL), was published in Nature Geoscience.

“The findings indicate that decade-long variability in climate must be taken into account when attributing U.S. surface ozone trends to rising Asian emissions,” Lin said. She conducted the research with Larry Horowitz and Songmiao Fan of GFDL, Samuel Oltmans of the University of Colorado and the NOAA Earth System Research Laboratory in Boulder; and Arlene Fiore of the Lamont-Doherty Earth Observatory at Columbia University.

Although protective at high altitudes, ozone near the Earth’s surface is a greenhouse gas and a health-damaging air pollutant. The longest record of ozone measurements in the U.S. dates back to 1974 in Hawaii. Over the past few decades, emissions of ozone precursors in Asia have tripled, yet the 40-year Hawaiian record revealed little change in ozone levels during spring and a surprising rise in autumn.

Through their research, Lin and her colleagues solved the puzzle. “We found that changing wind patterns ‘hide’ the increase in Asian pollution reaching Hawaii in the spring, but amplify the change in the autumn,” Lin said.

Using chemistry-climate models and observations, Lin and her colleagues uncovered the different mechanisms driving spring versus autumn changes in atmospheric circulation patterns. The findings indicate that the flow of ozone-rich air from Eurasia towards Hawaii during spring weakened in the 2000s as a result of La-Niña-like decadal cooling in the equatorial Pacific Ocean. The stronger transport of Asian pollution to Hawaii during autumn since the mid-1990s corresponds to a positive pattern of atmospheric circulation variability known as the Pacific-North American pattern.

“This study not only solves the mystery of Hawaiian ozone changes since 1974, but it also has broad implications for interpreting trends in surface ozone levels globally,” Lin said. “Characterizing shifts in atmospheric circulation is of paramount importance for understanding the response of surface ozone levels to a changing climate and evolving global emissions of ozone precursors,” she said.

The work was supported by NOAA’s Cooperative Institute for Climate Science at Princeton University. Ozone measurements were obtained at Mauna Loa Observatory, operated by NOAA’s Earth System Research Laboratory.

Read the abstract.

Meiyun Lin, Larry W. Horowitz, Samuel J. Oltmans, Arlene M. Fiore, and Songmiao Fan. Tropospheric ozone trends at Mauna Loa Observatory tied to decadal climate variability. Nature Geoscience. Published online Jan. 26, 2014. DOI: 10.1038/ngeo2066.

Model anticipates ecological impacts of human responses to climate (Conservation Biology)

A Princeton University research team has created a readily transferable method for conservation planners trying to anticipate how human adaptations to climate change will affect agriculture. They tested their model by studying wheat and maize production in South Africa. (Image source: WWS)

By B. Rose Huber, Woodrow Wilson School of Public and International Affairs

Throughout history, humans have responded to climate.

Take, for example, the Maya, who, between the eighth and 10th centuries, were forced to move away from their major ceremonial centers after a series of multi-year droughts, bringing about agricultural expansion and the clearing of forests in Mesoamerica. Much later, in the late 20th century, frequent droughts caused the people of Burkina Faso in West Africa to migrate from the dry north to the wetter south, where they transformed forests into croplands and cut the nation’s area of natural vegetation in half.

Such land transformations, while necessary to ensure future crop productivity, can themselves have large ecological impacts, but few studies have examined their effects. To that end, a Princeton University research team has created a model to evaluate how a human response to climate change may alter the agricultural utility of land. The study, featured in Conservation Biology, provides a readily transferable method for conservation planners trying to anticipate how agriculture will be affected by such adaptations.

“Humans can transform an ecosystem much more rapidly and completely than it can be altered by shifting temperature and precipitation patterns,” said Lyndon Estes, lead author and associate research scholar in the Woodrow Wilson School of Public and International Affairs. “This model provides an initial approach for understanding how agricultural land-use might shift under climate change, and therefore which currently natural areas might be converted to farming.”

Under the direction of faculty members Michael Oppenheimer and David Wilcove, both from the Wilson School’s Program in Science, Technology and Environmental Policy (STEP), and with the help of visiting student research collaborator Lydie-Line Paroz from ETH Zurich and colleagues from several other institutions, Estes studied South Africa, an area projected to be vulnerable to climate change where wheat and maize are the dominant crops.

Before determining how climate change could impact the crops, the team first needed to determine which areas have been or might be farmed for maize and wheat. They created a land-use model based on an area’s potential crop output and simulated how much of each crop was grown from 1979 to 1999 – the two decades for which historical weather data were available. They also calculated the ruggedness of each area of land, which is related to the cost of farming it. Taking all factors into account, the model provides an estimate of whether the land is likely to be profitable or unprofitable for farming.
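
The profitability logic can be pictured with a toy calculation: utility rises with potential crop output and falls with terrain ruggedness, which stands in for the cost of farming. The Python sketch below uses coefficients invented for illustration; the published model is far more detailed.

    def agricultural_utility(crop_output, ruggedness, price=1.0, cost_per_unit=0.5):
        """Toy utility score: revenue from potential crop output minus a
        ruggedness-driven farming cost. All coefficients are made up."""
        return price * crop_output - cost_per_unit * ruggedness

    # Land is deemed worth farming when utility is positive.
    for output, rugged in [(10.0, 4.0), (3.0, 12.0)]:
        u = agricultural_utility(output, rugged)
        label = "profitable" if u > 0 else "unprofitable"
        print(f"output={output}, ruggedness={rugged} -> utility={u:.1f} ({label})")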

To investigate any climate-change impacts, the team then examined the production of wheat and maize under 36 different climate-response scenarios. Many possible future climates were taken into account as well as how the crops might respond to rising levels of carbon dioxide. Based on their land-use model, the researchers calculated how the climate-induced productivity changes alter a land’s agricultural utility. In their analysis, they included only conservation lands – current nature reserves and those that South African conservation officials plan to acquire – that contained land suitable for growing one of the two crops either currently or in the future. However, Estes said the model could be adapted to assess whether land under other types of uses (besides conservation) is likely to be profitable or unprofitable for future farming.

They found that most conservation lands currently have low agricultural utility because of their rugged terrain, which makes them difficult to farm, and that they are likely to stay that way under future climate-change scenarios. The researchers did pinpoint several areas that could become more valuable for farming in the future, putting them at greater risk of conversion. However, some areas were predicted to decrease in value for farming, which could make them easier to protect and conserve.

“While studying the direct response of species to climatic shifts is important, it’s only one piece of a complicated puzzle. A big part of that puzzle relates to how humans will react, and history suggests you don’t need much to trigger a change in the way land is used that has a fairly long-lasting impact,” said Estes. “We hope that conservation planners can use this approach to start thinking about human climate change adaptation and how it will affect areas needing protection.”

Other researchers involved in the study include: Lydie-Line Paroz, Swiss Federal Institute of Technology; Bethany A. Bradley, University of Massachusetts; Jonathan Green, STEP; David G. Hole, Conservation International; Stephen Holness, Centre for African Conservation Ecology; and Guy Ziv, University of Leeds.

The work was funded by the Princeton Environmental Institute‘s Grand Challenges Program.

Read the abstract.

Estes LD, Paroz LL, Bradley BA, Green JM, Hole DG, Holness S, Ziv G, Oppenheimer MG, Wilcove DS. Using Changes in Agricultural Utility to Quantify Future Climate-Induced Risk to Conservation. Conservation Biology (2013). First published online Dec. 26, 2013.