Unstoppable magnetoresistance (Nature)

Mazhar Ali (left) and Steven Flynn (right), co-authors on the Nature paper. Photo by C. Todd Reichart.

by Tien Nguyen, Department of Chemistry

Mazhar Ali, a fifth-year graduate student in the laboratory of Robert Cava, the Russell Wellman Moore Professor of Chemistry at Princeton University, has spent his academic career discovering new superconductors, materials coveted for their ability to let electrons flow without resistance. While testing his latest candidate, the semimetal tungsten ditelluride (WTe2), he noticed a peculiar result.

Ali applied a magnetic field to a sample of WTe2, a standard way to destroy superconductivity if it is present, and saw that its resistance doubled. Intrigued, Ali worked with Jun Xiong, a student in the laboratory of Nai Phuan Ong, the Eugene Higgins Professor of Physics at Princeton, to re-measure the material’s magnetoresistance, the change in its resistance as it is exposed to stronger magnetic fields.

“He noticed the magnetoresistance kept going up and up and up—that never happens,” said Cava. The researchers then exposed WTe2 to a 60-tesla magnetic field, close to the strongest magnetic field mankind can create, and observed a magnetoresistance of 13 million percent. The material’s magnetoresistance displayed unlimited growth, making it the only known material with no apparent saturation point. The results were published online on September 14 in the journal Nature.

Crystal structure of WTe2 (Source: Nature)

Electronic information storage depends on the use of magnetic fields to switch between distinct resistivity values that correspond to either a one or a zero. The larger the magnetoresistance, the smaller the magnetic field needed to change from one state to another, Ali said. Today’s devices use layered materials with so-called “giant magnetoresistance,” with changes in resistance of 20,000 to 30,000 percent when a magnetic field is applied. “Colossal magnetoresistance” is close to 100,000 percent, so for a magnetoresistance percentage in the millions, the researchers hoped to coin a new term.
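For scale, magnetoresistance is conventionally reported as the percentage change in resistance relative to its zero-field value (this is the standard definition, not a formula quoted from the paper):

\[
\mathrm{MR}\,(\%) = \frac{R(B) - R(0)}{R(0)} \times 100
\]

By that measure, the 13 million percent observed at 60 tesla means the resistance of WTe2 grew by a factor of roughly 130,000 in the field.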

Their original choice was “ludicrous” magnetoresistance, which was inspired by “ludicrous speed,” the fictional form of fast-travel used in the comedy “Spaceballs.” They even included an acknowledgement to director Mel Brooks. After other lab members vetoed “ludicrous,” the researchers considered “titanic” before Nature editors ultimately steered them towards the term “large magnetoresistance.”

Terminology aside, the fact remained that the magnetoresistance values were extraordinarily high, a phenomenon that might be understood through the structure of WTe2. To look at the structure with an electron microscope, the research team turned to Jing Tao, a researcher at Brookhaven National Laboratory.

“Jing is a great microscopist. They have unique capabilities at Brookhaven,” Cava said. “One is that they can measure diffraction at 10 Kelvin (-441 °F). Not too many people on Earth can do that, but Jing can.”

Electron microscopy experiments revealed the presence of tungsten dimers, paired tungsten atoms, arranged in chains responsible for the key distortion from the classic octahedral structure type. The research team proposed that WTe2 owes its lack of saturation to the nearly perfect balance of electrons and electron holes, which are empty docks for traveling electrons. Because of its structure, WTe2 only exhibits magnetoresistance when the magnetic field is applied in a certain direction. This could be very useful in scanners, where multiple WTe2 devices could be used to detect the position of magnetic fields, Ali said.
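The link between carrier balance and non-saturation is a textbook result. In the semiclassical two-band model (a standard expression, not notation from the paper), a conductor with perfectly compensated electron and hole densities, \(n_e = n_h\), has magnetoresistance

\[
\frac{\Delta\rho}{\rho_0} = \mu_e \mu_h B^2,
\]

where \(\mu_e\) and \(\mu_h\) are the carrier mobilities. This quadratic growth never saturates; any imbalance between \(n_e\) and \(n_h\) instead caps the magnetoresistance at high field.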

“Aside from making devices from WTe2, the question to ask yourself as a scientist is: How can it be perfectly balanced? Is there something more profound?” Cava said.

Read the abstract.

Ali, M. N.; Xiong, J.; Flynn, S.; Tao, J.; Gibson, Q. D.; Schoop, L. M.; Haldolaarachchige, N.; Hirschberger, M.; Ong, N. P.; Cava, R. J. “Large, non-saturating magnetoresistance in WTe2.” Nature 514, 205–208 (2014). Published online Sept. 14, 2014.

This research was supported by the Army Research Office, grants W911NF-12-1-0461 and W911NF-11-1-0379, and the NSF MRSEC Program Grant DMR-0819860. This work was supported by the US Department of Energy’s Basic Energy Sciences (DOE BES) project “Science at 100 Tesla.” The electron microscopy study at Brookhaven National Laboratory was supported by the DOE BES, by the Materials Sciences and Engineering Division under contract DE-AC02-98CH10886, and through the use of the Center for Functional Nanomaterials.

Study questions the prescription for drug resistance (Proceedings of the Royal Society B)

A new study examines the effects of aggressive versus moderate drug treatment on the emergence of drug-resistant pathogens. Shown is a strain of bacteria known as methicillin-resistant Staphylococcus aureus (MRSA). Photo by James Gathany

By Catherine Zandonella, Office of the Dean for Research

In response to the rise of drug-resistant pathogens, doctors are routinely cautioned against overprescribing antimicrobials. But when a patient has a confirmed bacterial infection, the advice is to treat aggressively to quash the infection before the bacteria can develop resistance.

A new study questions the accepted wisdom that aggressive treatment with high drug dosages and long durations is always the best way to stem the emergence and spread of resistant pathogens. The review of nearly 70 studies of antimicrobial resistance, which was authored by researchers at Princeton and other leading institutions and published last week in the journal Proceedings of the Royal Society B, reveals the lack of evidence behind the practice of aggressive treatment in many cases.

“We found that while there are many studies that test for resistance emergence between different drug regimes, surprisingly few have looked at the topic of how varying drug dosage might affect the emergence and spread of resistance,” said Ruthie Birger, a Princeton graduate student who works with C. Jessica Metcalf, an assistant professor of ecology and evolutionary biology and public affairs at Princeton’s Woodrow Wilson School, and Bryan Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs in Princeton’s Woodrow Wilson School. Birger, Metcalf and Grenfell coauthored the paper with colleagues from 16 universities. “We are a long way from having the evidence for the best treatment decisions with respect to resistance for a range of diseases,” Birger said.

Microbes such as bacteria and parasites can evade today’s powerful drugs by undergoing genetic mutations that enable them to avoid being killed by the drug. For example, bacteria can develop enzymes that degrade certain antibiotics. The logic behind aggressive treatment goes something like this: kill off as many microbes as you can so that few will be around to evolve into resistant forms.

But some scientists have observed a different outcome in mice infected with both an already-resistant strain of malaria and a non-resistant strain. The high-dose drug treatment killed off the non-resistant malarial parasites, leaving the resistant strains to multiply and make the mice even sicker.

The idea that aggressive treatment may backfire against malarial parasites led the authors of the current study to comb the scientific literature to examine whether the same may be true for other types of microbes such as bacteria. The few studies that they found — mostly in laboratory cell cultures rather than animal models or patients — suggest that the picture is complicated, and depends on whether the resistance is new or existing, how many mutations are necessary for the pathogen to become resistant, and how long the drugs have been in use. “It’s remarkable how little we know about this topic,” said Metcalf. “The malaria study conducted by Silvie Huijben and colleagues at Pennsylvania State University is an inspiring step towards developing an evidence base for these important issues.”

In the current analysis, the study authors found that drug resistance is governed by two factors: the abundance of the pathogen and the strength of the selection pressure that drives the pathogen to evolve. Aggressive treatment deals with the first factor by killing off as much pathogen as possible, while moderate treatment may, for some pathogens, reduce the resistant pathogen’s ability to thrive (for example, by maintaining the competitive advantage of a co-infecting drug-sensitive strain) while still reducing total pathogen levels enough for the patient to recover.
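The competition effect is easy to see in a toy model. The sketch below is illustrative only: a minimal two-strain model with invented parameters, not the authors’ analysis. A drug-sensitive strain S and an already-resistant strain R share one niche; the drug kills only S, so clearing S aggressively frees R to expand.

```python
# Toy two-strain competition model (illustrative only: the structure and
# every parameter here are assumptions for this sketch, not values from
# the paper). A drug-sensitive strain S and a resistant strain R share
# one niche; the drug kills only S, and resistance carries a growth cost
# (r_r < r_s).

def treat(dose, days=14, dt=0.001):
    S, R = 1.0, 0.01                        # initial loads; resistance starts rare
    r_s, r_r, K = 1.0, 0.7, 10.0            # growth rates and shared carrying capacity
    for _ in range(int(days / dt)):
        crowd = 1.0 - (S + R) / K           # shared competition term
        S += (r_s * crowd - dose) * S * dt  # drug acts on S only
        R += r_r * crowd * R * dt           # R is untouched by the drug
    return S, R

for label, dose in [("none", 0.0), ("moderate", 0.5), ("aggressive", 2.0)]:
    S, R = treat(dose)
    print(f"{label:>10}: sensitive={S:6.2f}  resistant={R:6.2f}")
```

In this toy run, the untreated case leaves the resistant strain rare but the infection unchecked; the aggressive dose wipes out the sensitive strain and hands the niche to the resistant one; the moderate dose sits in between, retaining a drug-sensitive competitor that slows the resistant strain’s expansion.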

Finding the ideal dose and duration of treatment, one that cures the patient without aiding the spread of resistance, will likely have to be done on a disease-by-disease basis, the authors found.

One possibility is that moderate treatment might be best used against already-resistant microbes to prevent their spread. Moderate treatment may also be best for drugs that have been on the market for several years with plenty of time for resistant strains to develop.

Aggressive treatment might be best for pathogens that develop resistance slowly, over the course of several mutations. High doses early in the process could be effective at heading off the development of resistance.

Read the abstract.

Kouyos, R. D.; Metcalf, C. J. E.; Birger, R.; Klein, E. Y.; Abel zur Wiesch, P.; Ankomah, P.; Arinaminpathy, N.; Bogich, T. L.; Bonhoeffer, S.; Brower, C.; Chi-Johnston, G.; Cohen, T.; Day, T.; Greenhouse, B.; Huijben, S.; Metlay, J.; Mideo, N.; Pollitt, L. C.; Read, A. F.; Smith, D. L.; Standley, C.; Wale, N.; Grenfell, B. Proc. R. Soc. B 281, 20140566 (2014). Published Sept. 24, 2014.

The work emerged from two workshops held at Princeton University and funded by the RAPIDD program of the Science and Technology Directorate, Department of Homeland Security, and the Fogarty International Center, National Institutes of Health (contract HSHQDC-12-C-00058).

Longstanding bottleneck in crystal structure prediction solved (Science)

By Tien Nguyen, Department of Chemistry

Orthographic projections of a cluster cut from the benzene crystal along the two directions (Image courtesy of Science/AAAS)

Two years after its release, the HIV-1 drug Ritonavir was pulled from the market. Scientists discovered that the drug had crystallized into a slightly different form—called a polymorph—that was less soluble and made it ineffective as a treatment.

The various patterns that atoms of a solid material can adopt, called crystal structures, can have a huge impact on its properties. Being able to accurately predict the most stable crystal structure for a material has been a longstanding challenge for scientists.

“The holy grail of this particular problem is to say, I’ve written down this chemical formula for a material, and then just from the formula be able to predict its structure—a goal since the dawn of chemistry,” said Garnet K. L. Chan, the A. Barton Hepburn Professor of Theoretical Chemistry at Princeton University. One major bottleneck towards achieving this goal has been to compute the lattice energy—the energy associated with a structure—to sufficient accuracy to distinguish between several competing polymorphs.
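In its simplest textbook form (not the notation of the paper), the lattice energy per molecule measures how much a molecule is stabilized by sitting in the crystal rather than in isolation:

\[
E_{\text{latt}} = \frac{E_{\text{crystal}}}{N} - E_{\text{molecule}},
\]

where \(E_{\text{crystal}}\) is the energy of a crystal containing \(N\) molecules (the result is negative when the crystal is more stable). Because competing polymorphs often differ by only a kilojoule per mole or so, ranking them reliably requires computing this quantity to sub-kilojoule accuracy.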

Chan’s group has now accomplished this task, publishing their results in the journal Science on August 8. The research team demonstrated that new techniques could be used to calculate the lattice energy of benzene, a simple yet important molecule in pharmaceutical and energy research, to sub-kilojoule per mole accuracy—a level of certainty that allows polymorphism to be resolved.

Chan credited this success to the combined application of advances in the field of quantum mechanics over the last 15 years. “Some of these advances allow you to resolve the behavior of electrons more finely, do computations on more atoms more quickly, and allow you to consider more electrons at the same time,” Chan said. “It’s a triumph of the modern field of quantum chemistry that we can now determine the behavior of Nature to this level of precision.”

The group’s next goal is to shorten the time it takes to run the desired calculations. These initial calculations consumed several months of computer time, Chan said, but with some practical modifications, future predictions should take only a few hours.

Chan’s colleagues on the work included first author Jun Yang, an electronic structure theory specialist and lecturer in chemistry, and graduate student Weifeng Hu at Princeton University. Additional collaborators were Denis Usvyat and Martin Schütz of the University of Regensburg and Devin Matthews of the University of Texas at Austin.

The work was supported by the U.S. Department of Energy under grant no. DE-SC0008624, with secondary support from grant no. DE-SC0010530. Additional funding was received from the National Science Foundation under grant nos. OCI-1265278 and CHE-1265277. D.M. was supported by the U.S. Department of Energy through a Computational Science Graduate Fellowship, funded by grant no. DE-FG02-97ER25308.

Read the abstract.

Yang, J.; Hu, W.; Usvyat, D.; Matthews, D.; Schütz, M.; Chan, G. K. L. “Ab initio determination of the crystalline benzene lattice energy to sub-kilojoule/mol accuracy.” Science 345, 640 (2014).

Conservation versus innovation in the fight against antibiotic resistance (Science)

Pills (Image source: NIH)

“Antibiotic resistance is a problem of managing an open-access resource, such as fisheries or oil,” writes Ramanan Laxminarayan, a research scholar at Princeton University and the director of the Center for Disease Dynamics, Economics & Policy in Washington, D.C., in today’s issue of the journal Science. He goes on to say that individuals have little incentive to use antibiotics wisely, just as people have little incentive to conserve oil when it is plentiful.

As with many other natural resources, maintaining the effectiveness of antibiotics requires two approaches: conserving the existing resource and exploring new sources, Laxminarayan says. These two approaches are linked, however. “Just as incentives for finding new sources of oil reduce incentives to conserve oil,” Laxminarayan writes, “large public subsidies for new drug development discourage efforts to improve how existing antibiotics are used.” Yet new antibiotics tend to cost more than existing ones due to the expense of clinical trials and the fact that the easiest-to-find drugs may have already been discovered.

Laxminarayan’s analysis finds that the benefits of conserving existing drugs are significant, and he argues that proposed increases in public subsidies for new antibiotics should be matched by greater spending on conserving antibiotic effectiveness through public education, research and surveillance.

Ramanan Laxminarayan is a research scholar at the Princeton Environmental Institute. His perspective, “Antibiotic effectiveness: Balancing conservation against innovation,” appeared in the September 12, 2014 issue of Science.

Read the article.

PPPL scientists take key step toward solving a major astrophysical mystery

By John Greenwald, Princeton Plasma Physics Laboratory

Magnetic reconnection in the atmospheres of the Earth and the sun can trigger geomagnetic storms that disrupt cell phone service, damage satellites and black out power grids. Understanding how reconnection transforms magnetic energy into explosive particle energy has been a major unsolved problem in plasma astrophysics.

Scientists at the Princeton Plasma Physics Laboratory (PPPL) and Princeton University have taken a key step toward a solution, as described in a paper published this week in the journal Nature Communications. In research conducted on the Magnetic Reconnection Experiment (MRX) at PPPL, the scientists not only identified how the mysterious transformation takes place, but measured experimentally the amount of magnetic energy that turns into particle energy. The work is supported by the U.S. Department of Energy as well as the NSF-funded Center for Magnetic Self-Organization.

Fast-camera image of plasma during magnetic reconnection with rendering of the field lines, shown in white, based on measurements made during the experiment. The converging horizontal lines represent the field lines prior to reconnection. The outgoing vertical lines represent the field lines after reconnection. Image courtesy of Jongsoo Yoo.

Magnetic field lines represent the direction, and indicate the shape, of magnetic fields. In magnetic reconnection, the magnetic field lines in plasma snap apart and violently reconnect. The MRX, built in 1995, allows researchers to study the process in a controlled laboratory environment.

The new research shows that reconnection converts about 50 percent of the magnetic energy, with one-third of the conversion heating the electrons and two-thirds accelerating the ions — or atomic nuclei — in the plasma. In large bodies like the sun, such converted energy can equal the power of millions of tons of TNT.
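As a back-of-the-envelope tally (our arithmetic from the figures above, not the paper’s notation), if \(W_B\) is the incoming magnetic energy, then

\[
W_e \approx \tfrac{1}{3} \times 0.5\,W_B \approx 0.17\,W_B,
\qquad
W_i \approx \tfrac{2}{3} \times 0.5\,W_B \approx 0.33\,W_B,
\]

so roughly a sixth of the magnetic energy ends up heating electrons and a third accelerating ions, with the other half left unconverted.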

“This is a major milestone for our research,” said Masaaki Yamada, a research physicist, the principal investigator for the MRX and first author of the Nature Communications paper. “We can now see the entire picture of how much of the energy goes to the electrons and how much to the ions in a prototypical reconnection layer.”

The findings also suggested the process by which the energy conversion occurs. Reconnection first propels and energizes the electrons, according to the researchers, and this creates an electric field that “becomes the primary energy source for the ions,” said Jongsoo Yoo, an associate research physicist at PPPL and co-author of the paper.

The other contributors to the paper were Hantao Ji, professor of astrophysical sciences at Princeton; Russell Kulsrud, professor of astrophysical sciences, emeritus, at Princeton; and doctoral candidates in astrophysical sciences Jonathan Jara-Almonte and Clayton Myers.

If confirmed by data from space explorations, the PPPL results could help resolve decades-long questions and create practical benefits. These could include a better understanding of geomagnetic storms that could lead to advanced warning of the disturbances and an improved ability to cope with them. Researchers could shut down sensitive instruments on communications satellites, for example, to protect the instruments from harm.

Next year NASA plans to launch a four-satellite mission to study reconnection in the magnetosphere — the magnetic field that surrounds the Earth. The PPPL team plans to collaborate with the venture, called the Magnetospheric Multiscale (MMS) Mission, by providing MRX data to it. The MMS probes could help to confirm the laboratory’s findings.

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Fusion takes place when atomic nuclei fuse and release a burst of energy. This compares with the fission reactions in today’s nuclear power plants, which operate by splitting atoms apart.

Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

Read the abstract.

Yamada, M.; Yoo, J.; Jara-Almonte, J.; Ji, H.; Kulsrud, R. M.; Myers, C. E. “Conversion of magnetic energy in the magnetic reconnection layer of a laboratory plasma.” Nature Communications. Article published online Sept. 10, 2014. DOI: 10.1038/ncomms5774.


‘Fracking’ in the dark: Biological fallout of shale-gas production still largely unknown (Frontiers in Ecology and the Environment)

Eight conservation biologists from various organizations and institutions, including Princeton University, found that shale-gas extraction in the United States has vastly outpaced scientists’ understanding of the industry’s environmental impact. Each gas well can act as a source of air, water, noise and light pollution (above) that — individually and collectively — can interfere with wild animal health, habitats and reproduction. Of particular concern is the fluid and wastewater associated with hydraulic fracturing, or “fracking,” a technique that releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals. (Frontiers in Ecology and the Environment )

By Morgan Kelly, Office of Communications

In the United States, natural-gas production from shale rock has increased by more than 700 percent since 2007. Yet scientists still do not fully understand the industry’s effects on nature and wildlife, according to a report in the journal Frontiers in Ecology and the Environment.

As gas extraction continues to vastly outpace scientific examination, a team of eight conservation biologists from various organizations and institutions, including Princeton University, concluded that determining the environmental impact of gas-drilling sites — such as chemical contamination from spills, well-casing failures and other accidents — must be a top research priority.

With shale-gas production projected to surge during the next 30 years, the authors call on scientists, industry representatives and policymakers to cooperate on determining — and minimizing — the damage inflicted on the natural world by gas operations such as hydraulic fracturing, or “fracking.” A major environmental concern, hydraulic fracturing releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals, which can include carcinogens and radioactive substances.

“We can’t let shale development outpace our understanding of its environmental impacts,” said co-author Morgan Tingley, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy in Princeton’s Woodrow Wilson School of Public and International Affairs.

With shale-gas production projected to surge during the next 30 years, determining and minimizing the industry’s effects on nature and wildlife must become a top priority for scientists, industry and policymakers. Image of Wyoming’s Jonah Field. Although modern shale-gas wells need less surface area than the older methods shown here, the ecological impact from extraction operations past and present poses a long-lasting threat to the natural world. (Photo courtesy of Ecoflight.)

“The past has taught us that environmental impacts of large-scale development and resource extraction, whether coal plants, large dams or biofuel monocultures, are more than the sum of their parts,” Tingley said.

The researchers found that there are significant “knowledge gaps” when it comes to direct and quantifiable evidence of how the natural world responds to shale-gas operations. A major impediment to research has been the lack of accessible and reliable information on spills, wastewater disposal and the composition of fracturing fluids. Of the 24 American states with active shale-gas reservoirs, only five — Pennsylvania, Colorado, New Mexico, Wyoming and Texas — maintain public records of spills and accidents, the researchers report.

“The Pennsylvania Department of Environmental Protection’s website is one of the best sources of publicly available information on shale-gas spills and accidents in the nation. Even so, gas companies failed to report more than one-third of spills in the last year,” said first author Sara Souther, a postdoctoral research associate at the University of Wisconsin-Madison.

“How many more unreported spills occurred, but were not detected during well inspections?” Souther asked. “We need accurate data on the release of fracturing chemicals into the environment before we can understand impacts to plants and animals.”

One of the greatest threats to animal and plant life identified in the study is the impact of rapid and widespread shale development, which has disproportionately affected rural and natural areas. A single gas well results in the clearance of 3.7 to 7.6 acres (1.5 to 3.1 hectares) of vegetation, and each well contributes to a collective mass of air, water, noise and light pollution that can interfere with wild animal health, habitats and reproduction, the researchers report.

“If you look down on a heavily ‘fracked’ landscape, you see a web of well pads, access roads and pipelines that create islands out of what was, in some cases, contiguous habitat,” Souther said. “What are the combined effects of numerous wells and their supporting infrastructure on wide-ranging or sensitive species, like the pronghorn antelope or the hellbender salamander?”

The chemical makeup of fracturing fluid and wastewater is often unknown. The authors reviewed chemical-disclosure statements for 150 wells in three of the top gas-producing states and found that, on average, two out of every three wells were fractured with at least one undisclosed chemical. The exact effect of fracturing fluid on natural water systems as well as drinking water supplies remains unclear, even though improper wastewater disposal and failures of pollution-prevention measures are among the top state-recorded violations at drilling sites, the researchers found.

“Some of the wells in the chemical disclosure registry were fractured with fluid containing 20 or more undisclosed chemicals,” said senior author Kimberly Terrell, a researcher at the Smithsonian Conservation Biology Institute. “This is an arbitrary and inconsistent standard of chemical disclosure.”

The paper’s co-authors also include researchers from the University of Bucharest in Romania, Colorado State University, the University of Washington, and the Society for Conservation Biology.

The work was supported by the David H. Smith Fellowship program administered by the Society for Conservation Biology and funded by the Cedar Tree Foundation; and by a Policy Fellowship from the Wilburforce Foundation to the Society for Conservation Biology.

Read the abstract.

Souther, S.; Tingley, M. W.; Popescu, V. D.; Hayman, D. T. S.; Ryan, M. E.; Graves, T. A.; Hartl, B.; Terrell, K. “Biotic impacts of energy development from shale: research priorities and knowledge gaps.” Frontiers in Ecology and the Environment. Article published online Aug. 1, 2014. DOI: 10.1890/130324.

Water, Water — Not Everywhere: Mapping water trends for African maize (Environmental Research Letters)

By Molly Sharlach, Office of the Dean for Research

Researchers analyzed water availability trends in African maize-growing regions from 1979 to 2010. Each quarter-degree grid cell represents a 200-square-mile area and is colored according to its average water availability level during the maize growing season. In redder areas, water availability is more limited by rainfall levels, while bluer areas are more limited by evaporative demand. (Image source: Environmental Research Letters)

Today’s food production relies heavily on irrigation, but across sub-Saharan Africa only 4 percent of cultivated land is irrigated, compared with a global average of 18 percent. Small-scale farming is the main livelihood for many people in the region, who depend on rainfall to water their crops.

To understand how climate change may affect the availability of water for agriculture, researchers at Princeton University analyzed trends in the water cycle in maize-growing areas of 21 African countries between 1979 and 2010. The team examined both levels of rainfall and the evaporative demand of the atmosphere — the combined effects of evaporation and transpiration, which is the movement of water through plants.

Overall, they found increases in water availability during the maize-growing season, although the trends varied by region. The greater availability of water generally resulted from a mixture of increased rainfall and decreased evaporative demand.
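The core calculation is a per-grid-cell trend in growing-season water availability. The sketch below shows that kind of analysis in miniature; the synthetic data and the simple precipitation-minus-evaporative-demand (P − PET) availability measure are assumptions for illustration, not the study’s actual dataset or index.

```python
import numpy as np

# Illustrative sketch: estimate a linear trend in growing-season water
# availability for one grid cell over 1979-2010, taking availability
# simply as precipitation minus evaporative demand (P - PET).
# All numbers below are synthetic, not the study's data.
rng = np.random.default_rng(0)
years = np.arange(1979, 2011)
precip = 500 + 1.5 * (years - 1979) + rng.normal(0, 40, years.size)  # mm/season
pet = 900 - 1.0 * (years - 1979) + rng.normal(0, 30, years.size)     # mm/season
availability = precip - pet

slope, intercept = np.polyfit(years, availability, 1)
print(f"water-availability trend: {slope:+.2f} mm per year")
```

A positive slope marks a cell where the growing season got wetter over 1979 to 2010; in the study’s maps, such trends come from some mix of rising rainfall and falling evaporative demand.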

However, some regions of East Africa experienced declines in water availability, the study found. “Some places, like parts of Tanzania, got a double whammy that looks like a declining trend in rainfall as well as an increasing evaporative demand during the more sensitive middle part of the growing season,” said Lyndon Estes, the study’s lead author and an associate research scholar in the Program in Science, Technology and Environmental Policy at the Woodrow Wilson School of Public and International Affairs. The analysis was published in the July issue of the journal Environmental Research Letters.

A key goal of the study was to incorporate reliable data on factors that influence evaporative demand. These include temperature, wind speed, humidity and net radiation — defined as the amount of energy from the sun that is absorbed by the land, minus the amount reflected back into the atmosphere by the Earth’s surface. Measurements of three of these parameters came from the Princeton University Global Meteorological Forcing Dataset (PGF) previously developed by two of the study’s authors, Research Scholar Justin Sheffield and Eric Wood, the Susan Dod Brown Professor of Civil and Environmental Engineering and the study’s senior author.

The PGF merges a variety of weather and satellite data, and covers all land areas at a resolution of three hours and one degree of latitude or longitude (one degree of latitude is about 70 miles). Nathaniel Chaney, a graduate student who works with Sheffield, downscaled the data to a resolution of about 15 miles. He incorporated observations from African weather stations to improve the accuracy of the data. To do this, he used statistical techniques based on the principle that areas close to one another are likely to have similar weather.
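Inverse-distance weighting is one simple interpolation scheme built on exactly that principle; the sketch below illustrates the idea and is not necessarily the method the team used.

```python
import numpy as np

# Inverse-distance weighting: nearby stations count more than distant
# ones, embodying "areas close to one another have similar weather."
# Illustration of the principle only, not the study's downscaling code.
def idw(stations, values, target, power=2.0):
    """Interpolate a value at `target` from station observations."""
    d = np.linalg.norm(stations - target, axis=1)
    if np.any(d < 1e-9):               # target sits exactly on a station
        return float(values[np.argmin(d)])
    w = 1.0 / d**power                 # closer stations get larger weights
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # lon, lat
temps = np.array([24.0, 26.0, 22.0])                        # deg C
print(idw(stations, temps, np.array([0.25, 0.25])))
```

Each station’s observation is weighted by the inverse square of its distance from the target point, so the nearest stations dominate the estimate.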

The team also had to correct the data for errors due to changes in instruments or satellites, which can create what appear to be sudden jumps in temperature or wind speed. “When you’re dealing with gridded global weather data, they come with many warts,” Estes said. “So we try to remove as many of those warts as possible,” he said, to gain a faithful picture of weather changes at each location.

Most areas saw a decrease in evaporative demand, leading to higher water availability. The researchers analyzed the contributions of different factors to this decrease, and found that a downward trend in net radiation was largely responsible for the change. This was a surprising result, according to Estes, who said he expected to see decreases in evaporative demand, but thought lower wind speeds would have a greater impact than drops in net radiation. In a 2012 study published in the journal Nature, Sheffield and Wood showed that diminished wind speeds have helped to offset the effects of rising temperatures that would otherwise lead to an increase in droughts. Another study found that decreasing wind speeds contributed to declining evaporative demand in South Africa. The current study only examined water availability during the maize growing season, which could account for this discrepancy, Estes said.

The trends revealed by this research could have implications for agricultural policies and practices, including irrigation planning, timing of planting and choice of crop varietals. For example, in Burkina Faso in West Africa, a comparison of different parts of the growing season showed a decrease in water availability early in the season, but an increase at later time points. This might mean that the rainy season is starting later, in which case farmers in that region might adapt by planting their maize later. In South Africa, evaporative demand dropped in many areas; this could inform a reallocation of water use.

According to Estes, this study, which examined only 34 percent of all African maize-growing areas, may serve as a framework to guide more detailed analyses within individual countries. It’s also essential to understand the relationship between changes in water availability and changes in actual crop yields, which is more complex because yield trends are influenced by numerous political and economic factors, in addition to farming practices. That’s where Estes hopes to focus his next efforts. “All those factors would have to be teased out to isolate what these changes in water supply and demand mean for crop production,” he said.

Other researchers in Princeton’s Department of Civil and Environmental Engineering involved in the study include graduate student Julio Herrera Estrada and Associate Professor Kelly Caylor.

This work was funded by the United States Army Corps of Engineers Institute for Water Resources, the NASA Measures Program and the Princeton Environmental Institute Grand Challenges Program.

Read the abstract.

Estes, L. D.; Chaney, N. W.; Herrera-Estrada, J.; Sheffield, J.; Caylor, K. K.; Wood, E. F. “Changing water availability during the African maize-growing season, 1979–2010.” Environmental Research Letters 9, 075005 (2014). DOI: 10.1088/1748-9326/9/7/075005.