Study questions the prescription for drug resistance (Proceedings of the Royal Society B)

A new study examines the question of aggressive versus moderate drug treatment on the emergence of drug-resistant pathogens. Shown is a strain of bacteria known as methicillin-resistant Staphylococcus aureus (MRSA). Photo by James Gathany

By Catherine Zandonella, Office of the Dean for Research

In response to the rise of drug-resistant pathogens, doctors are routinely cautioned against overprescribing antimicrobials. But when a patient has a confirmed bacterial infection, the advice is to treat aggressively to quash the infection before the bacteria can develop resistance.

A new study questions the accepted wisdom that aggressive treatment with high drug dosages and long durations is always the best way to stem the emergence and spread of resistant pathogens. The review of nearly 70 studies of antimicrobial resistance, which was authored by researchers at Princeton and other leading institutions and published last week in the journal Proceedings of the Royal Society B, reveals the lack of evidence behind the practice of aggressive treatment in many cases.

“We found that while there are many studies that test for resistance emergence between different drug regimes, surprisingly few have looked at the topic of how varying drug dosage might affect the emergence and spread of resistance,” said Ruthie Birger, a Princeton graduate student who works with C. Jessica Metcalf, an assistant professor of ecology and evolutionary biology and public affairs at Princeton’s Woodrow Wilson School, and Bryan Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs in Princeton’s Woodrow Wilson School. Birger, Metcalf and Grenfell coauthored the paper with colleagues from 16 universities. “We are a long way from having the evidence for the best treatment decisions with respect to resistance for a range of diseases,” Birger said.

Microbes such as bacteria and parasites can evade today’s powerful drugs by undergoing genetic mutations that enable them to avoid being killed by the drug. For example, bacteria can develop enzymes that degrade certain antibiotics. The logic behind aggressive treatment goes something like this: kill off as many microbes as you can so that few will be around to evolve into resistant forms.

But some scientists have observed a different outcome in mice infected with both an already-resistant strain of malaria and a non-resistant strain. The high-dose drug treatment killed off the non-resistant malarial parasites, leaving the resistant strains to multiply and make the mice even sicker.

The idea that aggressive treatment may backfire against malarial parasites led the authors of the current study to comb the scientific literature to examine whether the same may be true for other types of microbes such as bacteria. The few studies that they found — mostly in laboratory cell cultures rather than animal models or patients — suggest that the picture is complicated, and depends on whether the resistance is new or existing, how many mutations are necessary for the pathogen to become resistant, and how long the drugs have been in use. “It’s remarkable how little we know about this topic,” said Metcalf. “The malaria study conducted by Silvie Huijben and colleagues at Pennsylvania State University is an inspiring step towards developing an evidence base for these important issues.”

In the current analysis, the study authors found that drug resistance is governed by two factors: the abundance of the pathogen and the strength of the selection pressure that drives the pathogen to evolve. Aggressive treatment deals with the first factor by killing off as much pathogen as possible, while moderate treatment may, for some pathogens, reduce the ability for the resistant pathogen to thrive (for example, by maintaining the competitive advantage of a co-infecting drug-sensitive strain of the pathogen) but still reduce total pathogen levels sufficiently that the patient can recover.
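A toy model helps make that trade-off concrete. The short Python sketch below is an illustration only, with invented parameters rather than anything from the study; it tracks a drug-sensitive and a drug-resistant strain competing within one host under no, moderate and aggressive dosing.

```python
# Toy within-host competition model (illustrative only; all parameters are invented).
# A sensitive (S) and a resistant (R) strain share a carrying capacity K, so the
# sensitive strain's presence suppresses the resistant one. The drug kills S at a
# rate proportional to dose but barely touches R.

def simulate(dose, days=14, dt=0.01):
    S, R = 1e6, 1e2            # initial loads: resistant strain starts rare
    r_s, r_r = 1.0, 0.9        # growth rates; resistance carries a small fitness cost
    K = 1e8                    # shared carrying capacity (the competition term)
    kill_s, kill_r = 1.0, 0.02 # per-unit-dose kill rates
    for _ in range(int(days / dt)):
        crowding = 1.0 - (S + R) / K
        dS = (r_s * crowding - kill_s * dose) * S
        dR = (r_r * crowding - kill_r * dose) * R
        S = max(S + dS * dt, 0.0)
        R = max(R + dR * dt, 0.0)
    return S, R

for dose in (0.0, 0.6, 3.0):   # none, moderate, aggressive
    S, R = simulate(dose)
    print(f"dose={dose:3.1f}  sensitive={S:9.2e}  resistant={R:9.2e}  total={S+R:9.2e}")
```

With these invented numbers, the aggressive dose leaves the fewest pathogens overall but by far the most resistant ones, while the moderate dose keeps the resistant strain partly in check by preserving its drug-sensitive competitor, the competitive-release effect suggested by the malaria experiments.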

Finding the ideal dose and duration of treatment, one that cures the patient without aiding the spread of resistance, will likely be done on a disease-by-disease basis, the authors found.

One possibility is that moderate treatment might be best used against already-resistant microbes to prevent their spread. Moderate treatment may also be best for drugs that have been on the market for several years with plenty of time for resistant strains to develop.

Aggressive treatment might be best for pathogens that develop resistance slowly, over the course of several mutations. High doses early in the process could be effective at heading off the development of resistance.

Read the abstract.

Roger D. Kouyos, C. Jessica E. Metcalf, Ruthie Birger, Eili Y. Klein, Pia Abel zur Wiesch, Peter Ankomah, Nimalan Arinaminpathy, Tiffany L. Bogich, Sebastian Bonhoeffer, Charles Brower, Geoffrey Chi-Johnston, Ted Cohen, Troy Day, Bryan Greenhouse, Silvie Huijben, Joshua Metlay, Nicole Mideo, Laura C. Pollitt, Andrew F. Read, David L. Smith, Claire Standley, Nina Wale and Bryan Grenfell. Proc. R. Soc. B: Biological Sciences, 281, 20140566. Published Sept. 24, 2014

The work emerged from two workshops held at Princeton University and funded by the RAPIDD program of the Science and Technology Directorate, Department of Homeland Security, and the Fogarty International Center, National Institutes of Health (contract HSHQDC-12-C-00058).

Longstanding bottleneck in crystal structure prediction solved (Science)

By Tien Nguyen, Department of Chemistry

Orthographic projections of a cluster cut from the benzene crystal along the two directions (Image courtesy of Science/AAAS)

Two years after its release, the HIV-1 drug Ritonavir was pulled from the market. Scientists discovered that the drug had crystallized into a slightly different form—called a polymorph—that was less soluble and made it ineffective as a treatment.

The various patterns that the atoms of a solid material can adopt, called crystal structures, can have a huge impact on the material’s properties. Being able to accurately predict the most stable crystal structure for a material has been a longstanding challenge for scientists.

“The holy grail of this particular problem is to say, I’ve written down this chemical formula for a material, and then just from the formula be able to predict its structure—a goal since the dawn of chemistry,” said Garnet K. L. Chan, the A. Barton Hepburn Professor of Theoretical Chemistry at Princeton University. One major bottleneck towards achieving this goal has been to compute the lattice energy—the energy associated with a structure—to sufficient accuracy to distinguish between several competing polymorphs.

Chan’s group has now accomplished this task, publishing their results in the journal Science on August 8. The research team demonstrated that new techniques could be used to calculate the lattice energy of benzene, a simple yet important molecule in pharmaceutical and energy research, to sub-kilojoule per mole accuracy—a level of certainty that allows polymorphism to be resolved.
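To see why that level of accuracy matters, note that competing polymorphs can sit within a kilojoule or two per mole of one another. The sketch below uses invented energies and error bars purely to show how the uncertainty on a computed lattice energy decides whether two forms can be told apart at all.

```python
# Illustrative only: invented lattice energies (kJ/mol) for two hypothetical polymorphs.
# A computed energy difference is meaningful only if it exceeds the combined uncertainty.

import math

def distinguishable(e1, e2, sigma1, sigma2):
    """Return (resolved, gap, combined 1-sigma uncertainty) for two computed energies."""
    gap = abs(e1 - e2)
    combined_sigma = math.sqrt(sigma1**2 + sigma2**2)
    return gap > combined_sigma, gap, combined_sigma

# Two hypothetical forms separated by 1.5 kJ/mol.
for sigma in (4.0, 0.5):   # a loose error bar vs. sub-kilojoule-per-mole accuracy
    ok, gap, comb = distinguishable(-51.5, -50.0, sigma, sigma)
    verdict = "resolved" if ok else "not resolved"
    print(f"uncertainty ±{sigma} kJ/mol: gap {gap:.1f} vs ±{comb:.1f} -> {verdict}")
```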

Chan credited this success to the combined application of advances in the field of quantum mechanics over the last 15 years. “Some of these advances allow you to resolve the behavior of electrons more finely, do computations on more atoms more quickly, and allow you to consider more electrons at the same time,” Chan said. “It’s a triumph of the modern field of quantum chemistry that we can now determine the behavior of Nature to this level of precision.”

The group’s next goal is to shorten the time it takes to run the desired calculations. These initial calculations consumed several months of computer time, Chan said, but with some practical modifications, future predictions should take only a few hours.

Chan’s colleagues on the work included first author Jun Yang, an electronic structure theory specialist and lecturer in chemistry, and graduate student Weifeng Hu at Princeton University. Additional collaborators were Denis Usvyat and Martin Schütz of the University of Regensburg and Devin Matthews of the University of Texas at Austin.

The work was supported by the U.S. Department of Energy under grant no. DE-SC0008624, with secondary support from grant no. DE-SC0010530. Additional funding was received from the National Science Foundation under grant no. OCI-1265278 and CHE-1265277. D.M. was supported by the U.S. Department of Energy through a Computational Science Graduate Fellowship, funded by grant no. DE-FG02-97ER25308.

Read the abstract.

Yang, J.; Hu, W.; Usvyat, D.; Matthews, D.; Schütz, M.; Chan, G. K. L. Ab initio determination of the crystalline benzene lattice energy to sub-kilojoule/mol accuracy. Science 2014, 345, 640.

Conservation versus innovation in the fight against antibiotic resistance (Science)

Pills (Image source: NIH)

“Antibiotic resistance is a problem of managing an open-access resource, such as fisheries or oil,” writes Ramanan Laxminarayan, a research scholar at Princeton University and the director of the Center for Disease Dynamics, Economics & Policy in Washington, D.C., in today’s issue of the journal Science. He goes on to say that individuals have little incentive to use antibiotics wisely, just as people have little incentive to conserve oil when it is plentiful.

As with many other natural resources, maintaining the effectiveness of antibiotics requires two approaches: conserving the existing resource and exploring new sources, Laxminarayan says. These two approaches are linked, however. “Just as incentives for finding new sources of oil reduce incentives to conserve oil,” Laxminarayan writes, “large public subsidies for new drug development discourage efforts to improve how existing antibiotics are used.” Yet new antibiotics tend to cost more than existing ones due to the expense of clinical trials and the fact that the easiest-to-find drugs may have already been discovered.

Laxminarayan’s analysis reveals that the benefits of conserving existing drugs are significant, and argues that the proposed increases in public subsidies for new antibiotics should be matched by greater spending on conservation of antibiotic effectiveness through public education, research and surveillance.

Ramanan Laxminarayan is a research scholar at the Princeton Environmental Institute. His perspective, “Antibiotic effectiveness: Balancing conservation against innovation,” appeared in the September 12, 2014 issue of Science.

Read the article.

PPPL scientists take key step toward solving a major astrophysical mystery

By John Greenwald, Princeton Plasma Physics Laboratory

Magnetic reconnection in the atmospheres of the Earth and the sun can trigger geomagnetic storms that disrupt cell phone service, damage satellites and black out power grids. Understanding how reconnection transforms magnetic energy into explosive particle energy has been a major unsolved problem in plasma astrophysics.

Scientists at the Princeton Plasma Physics Laboratory (PPPL) and Princeton University have taken a key step toward a solution, as described in a paper published this week in the journal Nature Communications. In research conducted on the Magnetic Reconnection Experiment (MRX) at PPPL, the scientists not only identified how the mysterious transformation takes place, but measured experimentally the amount of magnetic energy that turns into particle energy. The work is supported by the U. S. Department of Energy as well as the NSF-funded Center for Magnetic Self-Organization.

Fast-camera image of plasma during magnetic reconnection with rendering of the field lines, shown in white, based on measurements made during the experiment. The converging horizontal lines represent the field lines prior to reconnection. The outgoing vertical lines represent the field lines after reconnection. Image courtesy of Jongsoo Yoo.

Magnetic field lines represent the direction, and indicate the shape, of magnetic fields. In magnetic reconnection, the magnetic field lines in plasma snap apart and violently reconnect. The MRX, built in 1995, allows researchers to study the process in a controlled laboratory environment.

The new research shows that reconnection converts about 50 percent of the magnetic energy, with one-third of the conversion heating the electrons and two-thirds accelerating the ions — or atomic nuclei — in the plasma. In large bodies like the sun, such converted energy can equal the power of millions of tons of TNT.
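As a quick worked example of that partition (the starting value below is a placeholder, not a measurement from the experiment):

```python
# Worked example of the reported partition: about 50 percent of the magnetic energy is
# converted, with one-third of that heating electrons and two-thirds accelerating ions.
# The starting energy is a placeholder in arbitrary units, not a value from the paper.

magnetic_energy = 1.0
converted = 0.5 * magnetic_energy
to_electrons = converted / 3.0
to_ions = 2.0 * converted / 3.0

print(f"converted: {converted:.3f}  electrons: {to_electrons:.3f}  ions: {to_ions:.3f}")
```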

“This is a major milestone for our research,” said Masaaki Yamada, a research physicist, the principal investigator for the MRX and first author of the Nature Communications paper. “We can now see the entire picture of how much of the energy goes to the electrons and how much to the ions in a prototypical reconnection layer.”

The findings also suggested the process by which the energy conversion occurs. Reconnection first propels and energizes the electrons, according to the researchers, and this creates an electric field that “becomes the primary energy source for the ions,” said Jongsoo Yoo, an associate research physicist at PPPL and co-author of the paper.

The other contributors to the paper were Hantao Ji, professor of astrophysical sciences at Princeton; Russell Kulsrud, professor of astrophysical sciences, emeritus, at Princeton; and doctoral candidates in astrophysical sciences Jonathan Jara-Almonte and Clayton Myers.

If confirmed by data from space explorations, the PPPL results could help resolve decades-long questions and create practical benefits. These could include a better understanding of geomagnetic storms that could lead to advanced warning of the disturbances and an improved ability to cope with them. Researchers could shut down sensitive instruments on communications satellites, for example, to protect the instruments from harm.

Next year NASA plans to launch a four-satellite mission to study reconnection in the magnetosphere — the magnetic field that surrounds the Earth. The PPPL team plans to collaborate with the venture, called the Magnetospheric Multiscale (MMS) Mission, by providing MRX data to it. The MMS probes could help to confirm the laboratory’s findings.

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Fusion takes place when atomic nuclei fuse and release a burst of energy. This compares with the fission reactions in today’s nuclear power plants, which operate by splitting atoms apart.

Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

Read the abstract.

Yamada, M.; Yoo, J.; Jara-Almonte, J.; Ji, H.; Kulsrud, R. M.; Myers, C. E. Conversion of magnetic energy in the magnetic reconnection layer of a laboratory plasma. Nature Communications. Article published online Sept. 10, 2014. DOI: 10.1038/ncomms5774

 

‘Fracking’ in the dark: Biological fallout of shale-gas production still largely unknown (Frontiers in Ecology and the Environment)

Eight conservation biologists from various organizations and institutions, including Princeton University, found that shale-gas extraction in the United States has vastly outpaced scientists’ understanding of the industry’s environmental impact. Each gas well can act as a source of air, water, noise and light pollution (above) that — individually and collectively — can interfere with wild animal health, habitats and reproduction. Of particular concern is the fluid and wastewater associated with hydraulic fracturing, or “fracking,” a technique that releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals. (Frontiers in Ecology and the Environment)

By Morgan Kelly, Office of Communications

In the United States, natural-gas production from shale rock has increased by more than 700 percent since 2007. Yet scientists still do not fully understand the industry’s effects on nature and wildlife, according to a report in the journal Frontiers in Ecology and the Environment.

As gas extraction continues to vastly outpace scientific examination, a team of eight conservation biologists from various organizations and institutions, including Princeton University, concluded that determining the environmental impact of gas-drilling sites — such as chemical contamination from spills, well-casing failures and other accidents — must be a top research priority.

With shale-gas production projected to surge during the next 30 years, the authors call on scientists, industry representatives and policymakers to cooperate on determining — and minimizing — the damage inflicted on the natural world by gas operations such as hydraulic fracturing, or “fracking.” A major environmental concern, hydraulic fracturing releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals, which can include carcinogens and radioactive substances.

“We can’t let shale development outpace our understanding of its environmental impacts,” said co-author Morgan Tingley, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy in Princeton’s Woodrow Wilson School of Public and International Affairs.

With shale-gas production projected to surge during the next 30 years, determining and minimizing the industry’s effects on nature and wildlife must become a top priority for scientists, industry and policymakers. Image of Wyoming’s Jonah Field. Although modern shale-gas wells need less surface area than the older methods shown here, the ecological impacts of extraction operations past and present pose a long-lasting threat to the natural world. (Photo courtesy of Ecoflight.)

“The past has taught us that environmental impacts of large-scale development and resource extraction, whether coal plants, large dams or biofuel monocultures, are more than the sum of their parts,” Tingley said.

The researchers found that there are significant “knowledge gaps” when it comes to direct and quantifiable evidence of how the natural world responds to shale-gas operations. A major impediment to research has been the lack of accessible and reliable information on spills, wastewater disposal and the composition of fracturing fluids. Of the 24 American states with active shale-gas reservoirs, only five — Pennsylvania, Colorado, New Mexico, Wyoming and Texas — maintain public records of spills and accidents, the researchers report.

“The Pennsylvania Department of Environmental Protection’s website is one of the best sources of publicly available information on shale-gas spills and accidents in the nation. Even so, gas companies failed to report more than one-third of spills in the last year,” said first author Sara Souther, a postdoctoral research associate at the University of Wisconsin-Madison.

“How many more unreported spills occurred, but were not detected during well inspections?” Souther asked. “We need accurate data on the release of fracturing chemicals into the environment before we can understand impacts to plants and animals.”

One of the greatest threats to animal and plant life identified in the study is the impact of rapid and widespread shale development, which has disproportionately affected rural and natural areas. A single gas well results in the clearance of 3.7 to 7.6 acres (1.5 to 3.1 hectares) of vegetation, and each well contributes to a collective mass of air, water, noise and light pollution that can interfere with the health, habitats and reproduction of wild animals, the researchers report.
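A back-of-the-envelope calculation using only the per-well clearance range quoted above shows how individual wells add up to a landscape-scale footprint; the well counts below are arbitrary examples, not figures from the study.

```python
# Cumulative vegetation clearance from the per-well range quoted in the study
# (3.7 to 7.6 acres per well). The well counts are arbitrary illustrative values.

LOW_ACRES, HIGH_ACRES = 3.7, 7.6

for wells in (100, 1_000, 10_000):
    low, high = wells * LOW_ACRES, wells * HIGH_ACRES
    print(f"{wells:6d} wells: {low:9,.0f} - {high:9,.0f} acres cleared")
```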

“If you look down on a heavily ‘fracked’ landscape, you see a web of well pads, access roads and pipelines that create islands out of what was, in some cases, contiguous habitat,” Souther said. “What are the combined effects of numerous wells and their supporting infrastructure on wide-ranging or sensitive species, like the pronghorn antelope or the hellbender salamander?”

The chemical makeup of fracturing fluid and wastewater is often unknown. The authors reviewed chemical-disclosure statements for 150 wells in three of the top gas-producing states and found that an average of two out of every three wells were fractured with at least one undisclosed chemical. The exact effect of fracturing fluid on natural water systems as well as drinking water supplies remains unclear even though improper wastewater disposal and pollution-prevention measures are among the top state-recorded violations at drilling sites, the researchers found.

“Some of the wells in the chemical disclosure registry were fractured with fluid containing 20 or more undisclosed chemicals,” said senior author Kimberly Terrell, a researcher at the Smithsonian Conservation Biology Institute. “This is an arbitrary and inconsistent standard of chemical disclosure.”

The paper’s co-authors also include researchers from the University of Bucharest in Romania, Colorado State University, the University of Washington, and the Society for Conservation Biology.

The work was supported by the David H. Smith Fellowship program administered by the Society for Conservation Biology and funded by the Cedar Tree Foundation; and by a Policy Fellowship from the Wilburforce Foundation to the Society for Conservation Biology.

Read the abstract.

Souther, Sara, Morgan W. Tingley, Viorel D. Popescu, David T. S. Hayman, Maureen E. Ryan, Tabitha A. Graves, Brett Hartl, Kimberly Terrell. 2014. Biotic impacts of energy development from shale: research priorities and knowledge gaps. Frontiers in Ecology and the Environment. Article published online Aug. 1, 2014. DOI: 10.1890/130324.

Water, Water — Not Everywhere: Mapping water trends for African maize (Environmental Research Letters)

By Molly Sharlach, Office of the Dean for Research

Researchers analyzed water availability trends in African maize-growing regions from 1979 to 2010. Each quarter-degree grid cell represents a 200-square-mile area and is colored according to its average water availability level during the maize growing season. In redder areas, water availability is more limited by rainfall levels, while bluer areas are more limited by evaporative demand. (Image source: Environmental Research Letters)

Today’s food production relies heavily on irrigation, but across sub-Saharan Africa only 4 percent of cultivated land is irrigated, compared with a global average of 18 percent. Small-scale farming is the main livelihood for many people in the region, who depend on rainfall to water their crops.

To understand how climate change may affect the availability of water for agriculture, researchers at Princeton University analyzed trends in the water cycle in maize-growing areas of 21 African countries between 1979 and 2010. The team examined both levels of rainfall and the evaporative demand of the atmosphere — the combined effects of evaporation and transpiration, which is the movement of water through plants.
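Conceptually, the quantity being tracked behaves like rainfall minus evaporative demand, and its long-term trend can be estimated with a simple least-squares fit across the 1979-2010 growing seasons. The Python sketch below is a generic illustration with synthetic data, not the study's actual method or data.

```python
# Generic sketch: trend in a water-availability index (rainfall minus evaporative demand)
# over 1979-2010. The series below is synthetic; the study used gridded observational data.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2011)
rainfall = 600 + 1.5 * (years - 1979) + rng.normal(0, 40, years.size)     # mm per season
evap_demand = 900 - 1.0 * (years - 1979) + rng.normal(0, 30, years.size)  # mm per season
availability = rainfall - evap_demand

slope, intercept = np.polyfit(years, availability, 1)
print(f"trend in seasonal water availability: {slope:+.2f} mm per year")
```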

Overall, they found increases in water availability during the maize-growing season, although the trends varied by region. The greater availability of water generally resulted from a mixture of increased rainfall and decreased evaporative demand.

However, some regions of East Africa experienced declines in water availability, the study found. “Some places, like parts of Tanzania, got a double whammy that looks like a declining trend in rainfall as well as an increasing evaporative demand during the more sensitive middle part of the growing season,” said Lyndon Estes, the study’s lead author and an associate research scholar in the Program in Science, Technology and Environmental Policy at the Woodrow Wilson School of Public and International Affairs. The analysis was published in the July issue of the journal Environmental Research Letters.

A key goal of the study was to incorporate reliable data on factors that influence evaporative demand. These include temperature, wind speed, humidity and net radiation — defined as the amount of energy from the sun that is absorbed by the land, minus the amount reflected back into the atmosphere by the Earth’s surface. Measurements of three of these parameters came from the Princeton University Global Meteorological Forcing Dataset (PGF) previously developed by two of the study’s authors, Research Scholar Justin Sheffield and Eric Wood, the Susan Dod Brown Professor of Civil and Environmental Engineering and the study’s senior author.

The PGF merges a variety of weather and satellite data, and covers all land areas at a resolution of three hours and one degree of latitude or longitude (one degree of latitude is about 70 miles). Nathaniel Chaney, a graduate student who works with Sheffield, downscaled the data to a resolution of about 15 miles. He incorporated observations from African weather stations to improve the accuracy of the data. To do this, he used statistical techniques based on the principle that areas close to one another are likely to have similar weather.
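The nearby-places-have-similar-weather idea can be illustrated with a simple inverse-distance-weighted adjustment, in which station-versus-grid differences are blended with more weight given to closer stations. This is a generic sketch with invented coordinates and values, not the actual PGF downscaling procedure.

```python
# Generic inverse-distance-weighted correction: nudge a coarse gridded estimate toward
# nearby station observations, weighting closer stations more heavily.
# All coordinates and values are invented for illustration.

import math

def idw_correction(target, stations, power=2.0):
    """target: (lat, lon); stations: list of ((lat, lon), station_minus_grid_bias)."""
    num = den = 0.0
    for (lat, lon), bias in stations:
        d = math.hypot(target[0] - lat, target[1] - lon)
        if d == 0:
            return bias                      # exactly at a station: use its bias directly
        w = 1.0 / d**power
        num += w * bias
        den += w
    return num / den

grid_estimate = 27.5                         # coarse-grid temperature (deg C), invented
stations = [((0.1, 0.1), -0.8), ((0.4, 0.0), -0.3), ((1.2, 1.0), +0.5)]
corrected = grid_estimate + idw_correction((0.2, 0.2), stations)
print(f"corrected estimate: {corrected:.2f} deg C")
```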

The team also had to correct the data for errors due to changes in instruments or satellites, which can create what appear to be sudden jumps in temperature or wind speed. “When you’re dealing with gridded global weather data, they come with many warts,” Estes said. “So we try to remove as many of those warts as possible,” he said, to gain a faithful picture of weather changes at each location.

Most areas saw a decrease in evaporative demand, leading to higher water availability. The researchers analyzed the contributions of different factors to this decrease, and found that a downward trend in net radiation was largely responsible for the change. This was a surprising result, according to Estes, who said he expected to see decreases in evaporative demand, but thought lower wind speeds would have a greater impact than drops in net radiation. In a 2012 study published in the journal Nature, Sheffield and Wood showed that diminished wind speeds have helped to offset the effects of rising temperatures that would otherwise lead to an increase in droughts. Another study found that decreasing wind speeds contributed to declining evaporative demand in South Africa. The current study only examined water availability during the maize growing season, which could account for this discrepancy, Estes said.

The trends revealed by this research could have implications for agricultural policies and practices, including irrigation planning, timing of planting and choice of crop varietals. For example, in Burkina Faso in West Africa, a comparison of different parts of the growing season showed a decrease in water availability early in the season, but an increase at later time points. This might mean that the rainy season is starting later, in which case farmers in that region might adapt by planting their maize later. In South Africa, evaporative demand dropped in many areas; this could inform a reallocation of water use.

According to Estes, this study, which examined only 34 percent of all African maize-growing areas, may serve as a framework to guide more detailed analyses within individual countries. It’s also essential to understand the relationship between changes in water availability and changes in actual crop yields, which is more complex because yield trends are influenced by numerous political and economic factors, in addition to farming practices. That’s where Estes hopes to focus his next efforts. “All those factors would have to be teased out to isolate what these changes in water supply and demand mean for crop production,” he said.

Other researchers in Princeton’s Department of Civil and Environmental Engineering involved in the study include graduate student Julio Herrera Estrada and Associate Professor Kelly Caylor.

This work was funded by the United States Army Corps of Engineers Institute for Water Resources, the NASA Measures Program and the Princeton Environmental Institute Grand Challenges Program.

Read the abstract.

Estes, L. D.; Chaney, N. W.; Herrera-Estrada, J.; Sheffield, J.; Caylor, K. K.; Wood, E. F. Changing water availability during the African maize-growing season, 1979–2010. Environmental Research Letters 2014, 9 (7), 075005. DOI: 10.1088/1748-9326/9/7/075005

Solar panels light the way from carbon dioxide to fuel (Journal of CO2 Utilization)

By Tien Nguyen, Department of Chemistry

Research to curb global warming caused by rising levels of atmospheric greenhouse gases, such as carbon dioxide, usually involves three areas: Developing alternative energy sources, capturing and storing greenhouse gases, and repurposing excess greenhouse gases. Drawing on two of these approaches, researchers in the laboratory of Andrew Bocarsly, a Princeton professor of chemistry, collaborated with researchers at start-up company Liquid Light Inc. of Monmouth Junction, New Jersey, to devise an efficient method for harnessing sunlight to convert carbon dioxide into a potential alternative fuel known as formic acid. The study was published June 13 in the Journal of CO2 Utilization.

Pictured with the photovoltaic-electrochemical cell system from left to right: Graduate student James White (Princeton), Professor Andrew Bocarsly (Princeton and Liquid Light) and principal engineer Paul Majsztrik (Liquid Light). (Photo by Frank Wojciechowski)

The transformation from carbon dioxide and water to formic acid was powered by a commercial solar panel, provided by the energy company PSE&G, of the kind found atop electric poles across New Jersey. The process takes place inside an electrochemical cell, which consists of metal plates the size of rectangular lunch boxes that enclose liquid-carrying channels.

To maximize the efficiency of the system, the amount of power produced by the solar panel must match the amount of power the electrochemical cell can handle, said Bocarsly. This optimization process is called impedance matching. By stacking three electrochemical cells together, the research team was able to reach almost 2 percent energy efficiency, which is twice the efficiency of natural photosynthesis. It is also the best energy efficiency reported to date using a man-made device.
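The matching idea can be illustrated with a toy calculation: pick the number of series-connected cells whose combined operating voltage lands closest to the panel's maximum-power voltage. The voltages below are invented placeholders, not values from the paper.

```python
# Illustrative sketch of the matching idea: choose how many electrochemical cells to stack
# so that the stack's operating voltage sits near the solar panel's maximum-power voltage.
# All numbers are invented placeholders, not values from the paper.

PANEL_MPP_VOLTAGE = 8.1      # volts at the panel's maximum-power point (invented)
CELL_VOLTAGE = 2.7           # operating voltage of one CO2-reduction cell (invented)

best_n, best_mismatch = None, float("inf")
for n_cells in range(1, 7):
    stack_voltage = n_cells * CELL_VOLTAGE
    mismatch = abs(stack_voltage - PANEL_MPP_VOLTAGE)
    print(f"{n_cells} cells -> {stack_voltage:4.1f} V (mismatch {mismatch:.1f} V)")
    if mismatch < best_mismatch:
        best_n, best_mismatch = n_cells, mismatch

print(f"best match with these numbers: {best_n} cells in series")
```

With these made-up voltages, three cells in series happen to give the closest match, consistent with the three-cell stack described above.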

A number of energy companies are interested in storing solar energy as formic acid in fuel cells. Additionally, formate salt—readily made from formic acid—is the preferred de-icing agent on airplane runways because it is less corrosive to planes and safer for the environment than chloride salts. With increased availability, formate salts could supplant more harmful salts in widespread use.

Using waste carbon dioxide and easily obtained machined parts, this approach offers a promising route to a renewable fuel, Bocarsly said.

This work was financially supported by Liquid Light, Inc., which was cofounded by Bocarsly, and the National Science Foundation under grant no. CHE-0911114.

Read the abstract.

White, J. L.; Herb, J. T.; Kaczur, J. J.; Majsztrik, P. W.; Bocarsly, A. B. Photons to formate: Efficient electrochemical solar energy conversion via reduction of carbon dioxide. Journal of CO2 Utilization. Available online June 13, 2014.

With climate change, heat more than natural disasters will drive people away (PNAS)

By Morgan Kelly, Office of Communications

Although scenes of people fleeing from dramatic displays of Mother Nature’s power dominate the news, gradual increases in an area’s overall temperature — and to a lesser extent precipitation — actually lead more often to permanent population shifts, according to Princeton University research.

The researchers examined 15 years of migration data for more than 7,000 families in Indonesia and found that increases in temperature and, to a lesser extent, rainfall influenced a family’s decision to permanently migrate to another of the country’s provinces. They report in the journal Proceedings of the National Academy of Sciences that increases in average yearly temperature took a detrimental toll on people’s economic well-being. On the other hand, natural disasters such as floods and earthquakes had a much smaller, or even nonexistent, impact on permanent moves, suggesting that relocation after such events was most often temporary, with people seeking refuge in other areas of the country before returning home to rebuild their lives.

The results suggest that the consequences of climate change will likely be more subtle and permanent than is popularly believed, explained first author Pratikshya Bohra-Mishra, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy (STEP) in Princeton’s Woodrow Wilson School of Public and International Affairs. The effects likely won’t be limited to low-lying areas or developing countries that are unprepared for an uptick in hurricanes, floods and other natural disasters, she said.

“We do not think of ‘environmental migrants’ in a broader sense; images of refugees from natural disasters often dominate the overall picture,” Bohra-Mishra said. “It is important to understand the often less conspicuous and gradual effect of climate change on migration. Our study suggests that in areas that are already hot, a further increase in temperature will increase the likelihood that more people will move out.”

Indonesia’s tropical climate and dependence on agriculture may amplify the role of temperature as a migration factor, Bohra-Mishra said. However, existing research shows that climate-driven changes in crop yields can affect Mexican migration to the United States, and that extreme temperatures have played a role in the long-term migration of males in rural Pakistan.

“Based on these emerging findings, it is likely that the societal reach of climate change could be much broader to include warm regions that are now relatively safe from natural disasters,” Bohra-Mishra said.

Indonesia became the case study because the multi-island tropical nation is vulnerable to climate change and events such as earthquakes and landslides. In addition, the Indonesian Family Life Survey (IFLS) conducted by the RAND Corporation from 1993 to 2007 provided thorough information about the movements of 7,185 families from 13 of the nation’s 27 provinces in 1993. The Princeton researchers matched province-to-province movement of households over 15 years to data on temperature, precipitation and natural disasters from those same years. Bohra-Mishra worked with co-authors Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs and director of STEP, and Solomon Hsiang, a past Princeton postdoctoral researcher now an assistant professor of public policy at the University of California, Berkeley.

People start to rethink their location with each degree that the average annual temperature rises above 25 degrees Celsius (77 degrees Fahrenheit), the researchers found. The chances that a family will leave an area for good in a given year rise with each degree. With a change from 26 to 27 degrees Celsius (78.8 to 80.6 Fahrenheit), the probability of a family emigrating that year increased by 0.8 percent when other factors for migration were controlled for. From 27 to 28 degrees Celsius (80.6 to 82.4 Fahrenheit), those chances jumped to 1.4 percent.
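The reported per-degree increments can be tallied directly; the tiny sketch below simply adds up the numbers quoted above and is not the study's statistical model.

```python
# The reported per-degree jumps in annual out-migration probability imply a convex response:
# warming from 26 to 27 C adds about 0.8 percentage points, and from 27 to 28 C about 1.4.
# The 25-to-26 step is not quoted in the article and is left out.

reported_increments = {  # degrees C -> added probability of out-migration (percentage points)
    27: 0.8,   # warming from 26 to 27 C
    28: 1.4,   # warming from 27 to 28 C
}

cumulative = 0.0
for temp, added in sorted(reported_increments.items()):
    cumulative += added
    print(f"warming to {temp} C: +{added:.1f} points this degree, +{cumulative:.1f} points vs. 26 C")
```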

When it comes to annual rainfall, families seem to prefer an average of about 2.2 meters (7.2 feet). The chances of outmigration rose with each additional meter of average annual precipitation above that level, as well as with further declines in rainfall below it.

Landslides were the only natural disaster with a consistent positive influence on permanent migration. With every 1 percent increase in the number of deaths or destroyed houses in a family’s home province, the likelihood of permanent migration went up by only 0.0006 and 0.0004 percent, respectively.

The much higher influence of heat on permanent migration can be pinned on its effect on local economies and social structures, the researchers write. Previous research has shown that a one-degree change in the average growing-season temperature can reduce yields of certain crops by as much as 17 percent. At the same time, research conducted by Hsiang while at Princeton and published in 2013 showed a correlation between higher temperatures and social conflict such as civil wars, ethnic conflict and street crime.

In the current study, the researchers found that in Indonesia, a shift from 25 to 26 degrees Celsius resulted in a significant 14 to 15 percent decline in the value of household assets, for example. Precipitation did not have a notable effect on household worth, nor did natural disasters except landslides, which lowered assets by 5 percent for each 1 percent increase in the number of people who died.

Read the abstract.

Bohra-Mishra, Pratikshya, Michael Oppenheimer, Solomon Hsiang. 2014. Nonlinear permanent migration response to climatic variations but minimal response to disasters. Proceedings of the National Academy of Sciences. Article published online June 23, 2014. DOI: 10.1073/pnas.1317166111.

A farewell to arms? Scientists developing a novel technique that could facilitate nuclear disarmament (Nature)

Alexander Glaser and Robert Goldston with the British Test Object. Credit: Elle Starkman/PPPL Communications Office

By John Greenwald, Princeton Plasma Physics Laboratory Office of Communications

A proven system for verifying that apparent nuclear weapons slated to be dismantled contain true warheads could provide a key step toward the further reduction of nuclear arms. The system would achieve this verification while safeguarding classified information that could lead to nuclear proliferation.

Scientists at Princeton University and the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) are developing the prototype for such a system, as reported this week in the journal Nature. Their novel approach, called a “zero-knowledge protocol,” would verify the presence of warheads without collecting any classified information at all.

“The goal is to prove with as high confidence as required that an object is a true nuclear warhead while learning nothing about the materials and design of the warhead itself,” said physicist Robert Goldston, coauthor of the paper, a fusion researcher and former director of PPPL, and a professor of astrophysical sciences at Princeton.

While numerous efforts have been made over the years to develop systems for verifying the actual content of warheads covered by disarmament treaties, no such methods are currently in use for treaty verification.

Traditional nuclear arms negotiations focus instead on the reduction of strategic — or long-range — delivery systems, such as bombers, submarines and ballistic missiles, without verifying their warheads. But this approach could prove insufficient when future talks turn to tactical and nondeployed nuclear weapons that are not on long-range systems. “What we really want to do is count warheads,” said physicist Alexander Glaser, first author of the paper and an assistant professor in Princeton’s Woodrow Wilson School of Public and International Affairs and the Department of Mechanical and Aerospace Engineering.

The system Glaser and Goldston are mapping out would compare a warhead to be inspected with a known true warhead to see if the weapons matched. This would be done by beaming high-energy neutrons into each warhead and recording how many neutrons passed through to detectors positioned on the other side. Neutrons that passed through would be added to those already “preloaded” into the detectors by the warheads’ owner — and if the total number of neutrons were the same for each warhead, the weapons would be found to match. But different totals would show that the putative warhead was really a spoof. Prior to the test, the inspector would decide which preloaded detector would go with which warhead.
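The counting logic lends itself to a small simulation. The sketch below is a cartoon of the scheme described above, with made-up counts and none of the physics: the owner preloads detectors so that preload plus transmission sums to an agreed total for a genuine warhead, and the inspector chooses which preloaded detector goes with which item.

```python
# Cartoon of the counting scheme described above (made-up numbers, no neutron physics).
# The owner preloads each detector so that preload + transmitted counts reach a fixed
# total for a true warhead. The inspector randomly assigns preloaded detectors to items,
# so the owner cannot tailor a preload to a specific (possibly fake) object.

import random

TARGET_TOTAL = 10_000             # agreed total count (arbitrary)
TRUE_TRANSMISSION = 6_200         # counts a genuine warhead lets through (invented)
random.seed(1)

def transmitted(item):
    """Counts that pass through an item; a spoof transmits a different number."""
    base = TRUE_TRANSMISSION if item == "genuine" else 7_900
    return base + random.randint(-30, 30)      # small statistical noise

# Owner prepares preloads sized for a genuine warhead.
preloads = [TARGET_TOTAL - TRUE_TRANSMISSION] * 3

# Inspector decides which preloaded detector goes with which item.
items = ["genuine", "genuine", "spoof"]
random.shuffle(preloads)                       # all equal here, but the random pairing is the point

for item, preload in zip(items, preloads):
    total = preload + transmitted(item)
    verdict = "match" if abs(total - TARGET_TOTAL) < 100 else "MISMATCH"
    print(f"{item:7s}: total {total:6d} -> {verdict}")
```

Only the totals are compared, so a matching result reveals nothing beyond the fact that the item transmits neutrons like a true warhead.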

No classified data would be measured in this process, and no electronic components that might be vulnerable to tampering and snooping would be used. “This approach really is very interesting and elegant,” said Steve Fetter, a professor in the School of Public Policy at the University of Maryland and a former White House official. “The main question is whether it can be implemented in practice.”

A project to test this approach is under construction at PPPL. The project calls for firing high-energy neutrons at a non-nuclear target, called a British Test Object, that will serve as a proxy for warheads. Researchers will compare results of the tests by noting how many neutrons pass through the target to bubble detectors that Yale University is designing for the project. The gel-filled detectors will add the neutrons that pass through to those already preloaded to produce a total for each test.

The project was launched with a seed grant from The Simons Foundation of Vancouver, Canada, that came to Princeton through Global Zero, a nonprofit organization. Support also was provided by the U.S. Department of State, the DOE (via PPPL pre-proposal development funding), and most recently, a total of $3.5 million over five years from the National Nuclear Security Administration.

Glaser hit upon the idea for a zero-knowledge proof over a lunch hosted by David Dobkin, a computer scientist and, until June 2014, dean of the Princeton faculty. “I told him I was really interested in nuclear warhead verification without learning anything about the warhead itself,” Glaser said. “We call this a zero-knowledge proof in computer science,” Dobkin replied, according to Glaser. “That was the trigger,” Glaser recalled. “I went home and began reading about zero-knowledge proofs,” which are widely used in applications such as verifying online passwords.

Glaser’s reading led him to Boaz Barak, a senior researcher at Microsoft New England who had taught computer science at Princeton and is an expert in cryptology, the science of disguising secret information. “We started having discussions,” Glaser said of Barak, who helped develop statistical measures for the PPPL project and is the third coauthor of the paper in Nature.

Glaser also reached out to Goldston, with whom he had taught a class for three years in the Princeton Department of Astrophysical Sciences. “I told Rob that we need neutrons for this project,” Glaser recalled. “And he said, ‘That’s what we do — we have 14 MeV [or high-energy] neutrons at the Laboratory.’” Glaser, Goldston and Barak then worked together to refine the concept, developing ways to assure that even the statistical noise — or random variation — in the measurements conveyed no information.

If proven successful, dedicated inspection systems based on radiation measurements, such as the one proposed here, could help to advance disarmament talks beyond the New Strategic Arms Reduction Treaty (New START) between the United States and Russia, which runs from 2011 to 2021. The treaty calls for each country to reduce its arsenal of deployed strategic nuclear arms to 1,550 weapons, for a total of 3,100, by 2018.

Not included in the New START treaty are more than 4,000 nondeployed strategic and tactical weapons in each country’s arsenal. These very weapons, note the authors of the Nature paper, are apt to become part of future negotiations, “which will likely require verification of individual warheads, rather than whole delivery systems.” Deep cuts in the nuclear arsenals and the ultimate march to zero, say the authors, will require the ability to verifiably count individual warheads.

Read the abstract: http://dx.doi.org/10.1038/nature13457

A. Glaser, B. Barak, R. Goldston. A zero-knowledge protocol for nuclear warhead verification. Nature, 26 June 2014. DOI: 10.1038/nature13457

Strange physics turns off laser (Nature Communications)

By Steve Schultz, School of Engineering Office of Communications

An electron microscope image shows two lasers placed just two microns apart from each other. (Image source: Türeci lab)

Inspired by anomalies that arise in certain mathematical equations, researchers have demonstrated a laser system that paradoxically turns off when more power is added rather than becoming continuously brighter.

The finding, by a team of researchers at Vienna University of Technology and Princeton University, could lead to new ways to manipulate the interaction of electronics and light, an important tool in modern communications networks and high-speed information processing.

The researchers published their results June 13 in the journal Nature Communications.

Their system involves two tiny lasers, each one-tenth of a millimeter in diameter, or about the width of a human hair. The two are nearly touching, separated by a distance 50 times smaller than the lasers themselves. One is pumped with electric current until it starts to emit light, as is normal for lasers. Power is then added slowly to the other, but instead of it also turning on and emitting even more light, the whole system shuts off.

“This is not the normal interference that we know,” said Hakan Türeci, assistant professor of electrical engineering at Princeton, referring to the common phenomenon of light waves or sound waves from two sources cancelling each other. Instead, he said, the cancellation arises from the careful distribution of energy loss within an overall system that is being amplified.

Manipulating minute areas of gain and loss within individual lasers (shown as peaks and valleys in the image), researchers were able to create paradoxical interactions between two nearby lasers. (Image source: Türeci lab)

“Loss is something you normally are trying to avoid,” Türeci said. “In this case, we take advantage of it and it gives us a different dimension we can use – a new tool – in controlling optical systems.”

The research grows out of Türeci’s longstanding work on mathematical models that describe the behavior of lasers. In 2008, he established a mathematical framework for understanding the unique properties and complex interactions that are possible in extremely small lasers – devices with features measured in micrometers or nanometers. Unlike conventional desktop lasers, these devices fit on a computer chip.

That work opened the door to manipulating gain or loss (the amplification or loss of an energy input) within a laser system. In particular, it allowed researchers to judiciously control the spatial distribution of gain and loss within a single system, with one tiny sub-area amplifying light and an immediately adjacent area absorbing the generated light.

Türeci and his collaborators are now using similar ideas to pursue counterintuitive ideas for using distribution of gain and loss to make micro-lasers more efficient.

The researchers’ ideas for taking advantage of loss derive from their study of mathematical constructs called “non-Hermitian” matrices in which a normally symmetric table of values becomes asymmetric. Türeci said the work is related to certain ideas of quantum physics in which the fundamental symmetries of time and space in nature can break down even though the equations used to describe the system continue to maintain perfect symmetry.

Over the past several years, Türeci and his collaborators at Vienna worked to show how the mathematical anomalies at the heart of this work, called “exceptional points,” could be manifested in an actual system. In 2012, the team published a paper in the journal Physical Review Letters demonstrating computer simulations of a laser system that shuts off as energy is being added. In the current Nature Communications paper, the researchers created an experimental realization of their theory using a light source known as a quantum cascade laser.
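A minimal coupled-mode calculation reproduces the flavor of that result. In the sketch below (illustrative parameters only, not those of the experiment), the largest imaginary part of the eigenvalues of a 2-by-2 non-Hermitian matrix plays the role of the net modal gain; as the second cavity is pumped from strong loss toward gain, that gain dips below zero and the coupled system stops lasing before switching back on at higher pump.

```python
# Minimal coupled-mode sketch of the effect described above: two coupled resonators,
# one with fixed net gain, the other pumped from strong loss toward gain. The largest
# imaginary part of the non-Hermitian matrix's eigenvalues acts as the net modal gain;
# if it is negative, the coupled system does not lase. Parameters are illustrative only.

import numpy as np

omega0 = 0.0    # common resonance frequency (used as the reference)
g1 = 0.4        # net gain of the first, fully pumped cavity
kappa = 1.0     # coupling strength between the two cavities

print(" pump (g2)   max modal gain   lasing?")
for g2 in (-4.0, -3.0, -2.0, -1.0, 0.0, 0.4):   # pumping the second cavity reduces its loss
    H = np.array([[omega0 + 1j * g1, kappa],
                  [kappa, omega0 + 1j * g2]])
    modal_gain = np.linalg.eigvals(H).imag.max()
    print(f"   {g2:5.1f}        {modal_gain:+.3f}        {'yes' if modal_gain > 0 else 'no'}")
```

With these made-up numbers the output goes from lasing to off and back to lasing as the second cavity's pump increases, mirroring the on-off-on behavior reported for the real device.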

The researchers report in the article that results could be of particular value in creating “lab-on-a-chip” devices – instruments that pack tiny optical devices onto a single computer chip. Understanding how multiple optical devices interact could provide ways to manipulate their performance electronically in previously unforeseen ways. Taking advantage of the way loss and gain are distributed within tightly coupled laser systems could lead to new types of highly accurate sensors, the researchers said.

“Our approach provides a whole new set of levers to create unforeseen and useful behaviors,” Türeci said.

The work at Vienna, including creation and demonstration of the actual device, was led by Stefan Rotter at Vienna along with Martin Brandstetter, Matthias Liertzer, C. Deutsch, P. Klang, J. Schöberl, G. Strasser and K. Unterrainer. Türeci participated in the development of the mathematical models underlying the phenomena. The work on the 2012 computer simulation of the system also included Li Ge, who was a post-doctoral researcher at Princeton at the time and is now an assistant professor at City University of New York.

The work was funded by the Vienna Science and Technology Fund and the Austrian Science Fund, as well as by the National Science Foundation through a major grant for the Mid-Infrared Technologies for Health and the Environment Center based at Princeton and by the Defense Advanced Research Projects Agency.

Read the abstract.

M. Brandstetter, M. Liertzer, C. Deutsch, P. Klang, J. Schöberl, H. E. Türeci, G. Strasser, K. Unterrainer & S. Rotter. Reversing the pump dependence of a laser at an exceptional point. Nature Communications, 13 June 2014. DOI: 10.1038/ncomms5034

Science 2 May 2008. DOI: 10.1126/science.1155311

Physical Review Letters 24 April 2012. DOI:10.1103/PhysRevLett.108.173901