Author Archives: Catherine Zandonella

Conservation versus innovation in the fight against antibiotic resistance (Science)

Pills (Image source: NIH)

“Antibiotic resistance is a problem of managing an open-access resource, such as fisheries or oil,” writes Ramanan Laxminarayan, a research scholar at Princeton University and the director of the Center for Disease Dynamics, Economics & Policy in Washington, D.C., in today’s issue of the journal Science. He goes on to say that individuals have little incentive to use antibiotics wisely, just as people have little incentive to conserve oil when it is plentiful.

As with many other natural resources, maintaining the effectiveness of antibiotics requires two approaches: conserving the existing resource and exploring new sources, Laxminarayan says. These two approaches are linked, however. “Just as incentives for finding new sources of oil reduce incentives to conserve oil,” Laxminarayan writes, “large public subsidies for new drug development discourage efforts to improve how existing antibiotics are used.” Yet new antibiotics tend to cost more than existing ones due to the expense of clinical trials and the fact that the easiest-to-find drugs may have already been discovered.

Laxminarayan’s analysis reveals that the benefits of conserving existing drugs are significant, and argues that the proposed increases in public subsidies for new antibiotics should be matched by greater spending on conservation of antibiotic effectiveness through public education, research and surveillance.

Ramanan Laxminarayan is a research scholar at the Princeton Environmental Institute. His perspective, “Antibiotic effectiveness: Balancing conservation against innovation,” appeared in the September 12, 2014 issue of Science.

Read the article.

PPPL scientists take key step toward solving a major astrophysical mystery

By John Greenwald, Princeton Plasma Physics Laboratory

Magnetic reconnection in the atmospheres of the Earth and the sun can trigger geomagnetic storms that disrupt cell-phone service, damage satellites and black out power grids. Understanding how reconnection transforms magnetic energy into explosive particle energy has been a major unsolved problem in plasma astrophysics.

Scientists at the Princeton Plasma Physics Laboratory (PPPL) and Princeton University have taken a key step toward a solution, as described in a paper published this week in the journal Nature Communications. In research conducted on the Magnetic Reconnection Experiment (MRX) at PPPL, the scientists not only identified how the mysterious transformation takes place, but measured experimentally the amount of magnetic energy that turns into particle energy. The work is supported by the U.S. Department of Energy as well as the NSF-funded Center for Magnetic Self-Organization.

Fast-camera image of plasma during magnetic reconnection with rendering of the field lines, shown in white, based on measurements made during the experiment. The converging horizontal lines represent the field lines prior to reconnection. The outgoing vertical lines represent the field lines after reconnection. Image courtesy of Jongsoo Yoo.

Magnetic field lines represent the direction, and indicate the shape, of magnetic fields. In magnetic reconnection, the magnetic field lines in plasma snap apart and violently reconnect. The MRX, built in 1995, allows researchers to study the process in a controlled laboratory environment.

The new research shows that reconnection converts about 50 percent of the magnetic energy, with one-third of the conversion heating the electrons and two-thirds accelerating the ions — or atomic nuclei — in the plasma. In large bodies like the sun, such converted energy can equal the power of millions of tons of TNT.
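It helps to make the arithmetic behind those fractions explicit: the one-third and two-thirds refer to shares of the converted energy, not of the total magnetic energy. A quick restatement in Python, using only the figures quoted above:

```python
# Energy bookkeeping from the MRX result: about 50 percent of the incoming
# magnetic energy is converted to particle energy, and that converted share
# is split one-third to electrons and two-thirds to ions.
converted = 0.50
to_electrons = converted * (1 / 3)   # fraction of TOTAL magnetic energy heating electrons
to_ions = converted * (2 / 3)        # fraction of TOTAL magnetic energy accelerating ions

print(f"electrons: {to_electrons:.0%}, ions: {to_ions:.0%}")
# prints: electrons: 17%, ions: 33%
```

So roughly a sixth of the inflowing magnetic energy ends up in electrons and a third in ions, with the remaining half staying magnetic.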

“This is a major milestone for our research,” said Masaaki Yamada, a research physicist, the principal investigator for the MRX and first author of the Nature Communications paper. “We can now see the entire picture of how much of the energy goes to the electrons and how much to the ions in a prototypical reconnection layer.”

The findings also suggested the process by which the energy conversion occurs. Reconnection first propels and energizes the electrons, according to the researchers, and this creates an electric field that “becomes the primary energy source for the ions,” said Jongsoo Yoo, an associate research physicist at PPPL and co-author of the paper.

The other contributors to the paper were Hantao Ji, professor of astrophysical sciences at Princeton; Russell Kulsrud, professor of astrophysical sciences, emeritus, at Princeton; and doctoral candidates in astrophysical sciences Jonathan Jara-Almonte and Clayton Myers.

If confirmed by data from observations in space, the PPPL results could help resolve decades-long questions and create practical benefits. These could include a better understanding of geomagnetic storms that could lead to advanced warning of the disturbances and an improved ability to cope with them. Researchers could shut down sensitive instruments on communications satellites, for example, to protect the instruments from harm.

Next year NASA plans to launch a four-satellite mission to study reconnection in the magnetosphere — the magnetic field that surrounds the Earth. The PPPL team plans to collaborate with the venture, called the Magnetospheric Multiscale (MMS) Mission, by providing MRX data to it. The MMS probes could help to confirm the laboratory’s findings.

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Fusion takes place when atomic nuclei fuse and release a burst of energy. This compares with the fission reactions in today’s nuclear power plants, which operate by splitting atoms apart.

Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

Read the abstract.

Yamada, M.; Yoo, J.; Jara-Almonte, J.; Ji, H.; Kulsrud, R.M.; Myers, C.E. Conversion of magnetic energy in the magnetic reconnection layer of a laboratory plasma. Nature Communications. Article published online Sept. 10, 2014. DOI: 10.1038/ncomms5774.

Shale-gas field

‘Fracking’ in the dark: Biological fallout of shale-gas production still largely unknown (Frontiers in Ecology and the Environment)

Fracking diagram

Eight conservation biologists from various organizations and institutions, including Princeton University, found that shale-gas extraction in the United States has vastly outpaced scientists’ understanding of the industry’s environmental impact. Each gas well can act as a source of air, water, noise and light pollution (above) that — individually and collectively — can interfere with wild animal health, habitats and reproduction. Of particular concern is the fluid and wastewater associated with hydraulic fracturing, or “fracking,” a technique that releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals. (Frontiers in Ecology and the Environment)

By Morgan Kelly, Office of Communications

In the United States, natural-gas production from shale rock has increased by more than 700 percent since 2007. Yet scientists still do not fully understand the industry’s effects on nature and wildlife, according to a report in the journal Frontiers in Ecology and the Environment.

As gas extraction continues to vastly outpace scientific examination, a team of eight conservation biologists from various organizations and institutions, including Princeton University, concluded that determining the environmental impact of gas-drilling sites — such as chemical contamination from spills, well-casing failures and other accidents — must be a top research priority.

With shale-gas production projected to surge during the next 30 years, the authors call on scientists, industry representatives and policymakers to cooperate on determining — and minimizing — the damage inflicted on the natural world by gas operations such as hydraulic fracturing, or “fracking.” A major environmental concern, hydraulic fracturing releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals, which can include carcinogens and radioactive substances.

“We can’t let shale development outpace our understanding of its environmental impacts,” said co-author Morgan Tingley, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy in Princeton’s Woodrow Wilson School of Public and International Affairs.

Shale-gas extraction in Wyoming

With shale-gas production projected to surge during the next 30 years, determining and minimizing the industry’s effects on nature and wildlife must become a top priority for scientists, industry and policymakers. Image of Wyoming’s Jonah Field. Although modern shale-gas wells need less surface area than the older methods shown here, the ecological impact from extraction operations past and present poses a long-lasting threat to the natural world. (Photo courtesy of Ecoflight.)

“The past has taught us that environmental impacts of large-scale development and resource extraction, whether coal plants, large dams or biofuel monocultures, are more than the sum of their parts,” Tingley said.

The researchers found that there are significant “knowledge gaps” when it comes to direct and quantifiable evidence of how the natural world responds to shale-gas operations. A major impediment to research has been the lack of accessible and reliable information on spills, wastewater disposal and the composition of fracturing fluids. Of the 24 American states with active shale-gas reservoirs, only five — Pennsylvania, Colorado, New Mexico, Wyoming and Texas — maintain public records of spills and accidents, the researchers report.

“The Pennsylvania Department of Environmental Protection’s website is one of the best sources of publicly available information on shale-gas spills and accidents in the nation. Even so, gas companies failed to report more than one-third of spills in the last year,” said first author Sara Souther, a postdoctoral research associate at the University of Wisconsin-Madison.

“How many more unreported spills occurred, but were not detected during well inspections?” Souther asked. “We need accurate data on the release of fracturing chemicals into the environment before we can understand impacts to plants and animals.”

One of the greatest threats to animal and plant life identified in the study is the impact of rapid and widespread shale development, which has disproportionately affected rural and natural areas. A single gas well results in the clearance of 3.7 to 7.6 acres (1.5 to 3.1 hectares) of vegetation, and each well contributes to a collective mass of air, water, noise and light pollution that can interfere with wild animal health, habitats and reproduction, the researchers report.

“If you look down on a heavily ‘fracked’ landscape, you see a web of well pads, access roads and pipelines that create islands out of what was, in some cases, contiguous habitat,” Souther said. “What are the combined effects of numerous wells and their supporting infrastructure on wide-ranging or sensitive species, like the pronghorn antelope or the hellbender salamander?”

The chemical makeup of fracturing fluid and wastewater is often unknown. The authors reviewed chemical-disclosure statements for 150 wells in three of the top gas-producing states and found that an average of two out of every three wells were fractured with at least one undisclosed chemical. The exact effect of fracturing fluid on natural water systems as well as drinking water supplies remains unclear even though improper wastewater disposal and pollution-prevention measures are among the top state-recorded violations at drilling sites, the researchers found.

“Some of the wells in the chemical disclosure registry were fractured with fluid containing 20 or more undisclosed chemicals,” said senior author Kimberly Terrell, a researcher at the Smithsonian Conservation Biology Institute. “This is an arbitrary and inconsistent standard of chemical disclosure.”

The paper’s co-authors also include researchers from the University of Bucharest in Romania, Colorado State University, the University of Washington, and the Society for Conservation Biology.

The work was supported by the David H. Smith Fellowship program administered by the Society for Conservation Biology and funded by the Cedar Tree Foundation; and by a Policy Fellowship from the Wilburforce Foundation to the Society for Conservation Biology.

Read the abstract.

Souther, Sara, Morgan W. Tingley, Viorel D. Popescu, David T.S. Hyman, Maureen E. Ryan, Tabitha A. Graves, Brett Hartl, Kimberly Terrell. 2014. Biotic impacts of energy development from shale: research priorities and knowledge gaps. Frontiers in Ecology and the Environment. Article published online Aug. 1, 2014. DOI: 10.1890/130324.

Water, Water — Not Everywhere: Mapping water trends for African maize (Environmental Research Letters)

By Molly Sharlach, Office of the Dean for Research

Water availability trends in Africa

Researchers analyzed water availability trends in African maize-growing regions from 1979 to 2010. Each quarter-degree grid cell represents a 200-square-mile area and is colored according to its average water availability level during the maize growing season. In redder areas, water availability is more limited by rainfall levels, while bluer areas are more limited by evaporative demand. (Image source: Environmental Research Letters)

Today’s food production relies heavily on irrigation, but across sub-Saharan Africa only 4 percent of cultivated land is irrigated, compared with a global average of 18 percent. Small-scale farming is the main livelihood for many people in the region, who depend on rainfall to water their crops.

To understand how climate change may affect the availability of water for agriculture, researchers at Princeton University analyzed trends in the water cycle in maize-growing areas of 21 African countries between 1979 and 2010. The team examined both levels of rainfall and the evaporative demand of the atmosphere — the combined effects of evaporation and transpiration, which is the movement of water through plants.

Overall, they found increases in water availability during the maize-growing season, although the trends varied by region. The greater availability of water generally resulted from a mixture of increased rainfall and decreased evaporative demand.

However, some regions of East Africa experienced declines in water availability, the study found. “Some places, like parts of Tanzania, got a double whammy that looks like a declining trend in rainfall as well as an increasing evaporative demand during the more sensitive middle part of the growing season,” said Lyndon Estes, the study’s lead author and an associate research scholar in the Program in Science, Technology and Environmental Policy at the Woodrow Wilson School of Public and International Affairs. The analysis was published in the July issue of the journal Environmental Research Letters.

A key goal of the study was to incorporate reliable data on factors that influence evaporative demand. These include temperature, wind speed, humidity and net radiation — defined as the amount of energy from the sun that is absorbed by the land, minus the amount reflected back into the atmosphere by the Earth’s surface. Measurements of three of these parameters came from the Princeton University Global Meteorological Forcing Dataset (PGF) previously developed by two of the study’s authors, Research Scholar Justin Sheffield and Eric Wood, the Susan Dod Brown Professor of Civil and Environmental Engineering and the study’s senior author.

The PGF merges a variety of weather and satellite data, and covers all land areas at a resolution of three hours and one degree of latitude or longitude (one degree of latitude is about 70 miles). Nathaniel Chaney, a graduate student who works with Sheffield, downscaled the data to a resolution of about 15 miles. He incorporated observations from African weather stations to improve the accuracy of the data. To do this, he used statistical techniques based on the principle that areas close to one another are likely to have similar weather.
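One standard way to blend station observations into a finer grid under that "nearby places have similar weather" assumption is inverse-distance weighting. The sketch below is a hypothetical illustration of the idea only, not the study's actual downscaling code; the function name, grid coordinates and temperatures are all made up:

```python
from math import hypot

def idw_downscale(coarse_vals, coarse_xy, fine_xy, power=2):
    """Estimate values at fine-grid points as an inverse-distance-weighted
    average of coarse-grid cells: the closer a coarse cell, the more its
    value counts, encoding the assumption that nearby locations have
    similar weather. (Illustrative helper, not the study's method.)"""
    out = []
    for fx, fy in fine_xy:
        weights, total = [], 0.0
        for (cx, cy), val in zip(coarse_xy, coarse_vals):
            d = hypot(cx - fx, cy - fy)
            if d == 0.0:                  # fine point sits on a coarse cell center
                weights, total = [(1.0, val)], 1.0
                break
            w = 1.0 / d ** power          # nearer cells get larger weights
            weights.append((w, val))
            total += w
        out.append(sum(w * v for w, v in weights) / total)
    return out

# Four hypothetical one-degree cells (x, y in degrees) with temperatures in C
coarse_xy = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
coarse_t = [24.0, 26.0, 25.0, 27.0]
# Estimate the value at a quarter-degree point near the 24 C cell
fine_t = idw_downscale(coarse_t, coarse_xy, [(0.25, 0.25)])
```

The interpolated value lands between the four cell values, pulled toward the nearest cell, which is the behavior the downscaling step relies on.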

The team also had to correct the data for errors due to changes in instruments or satellites, which can create what appear to be sudden jumps in temperature or wind speed. “When you’re dealing with gridded global weather data, they come with many warts,” Estes said. “So we try to remove as many of those warts as possible,” he said, to gain a faithful picture of weather changes at each location.
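A minimal example of removing one such "wart": if the index at which an instrument or satellite changed is known, the artificial jump can be taken out by shifting the later segment so its mean matches the earlier one. This sketch assumes no real change across the break and is purely illustrative of the idea, not the authors' actual homogenization procedure:

```python
def remove_step(series, breakpoint):
    """Remove an artificial jump at a known instrument-change index by
    shifting the post-break segment so its mean matches the pre-break
    mean. Assumes the true signal has no shift across the break."""
    pre = series[:breakpoint]
    post = series[breakpoint:]
    offset = sum(post) / len(post) - sum(pre) / len(pre)
    return pre + [x - offset for x in post]

# Hypothetical wind-speed record (m/s) with a 0.5 m/s jump at index 100,
# as if a new satellite started reporting systematically higher values
clean = [3.0 + 0.1 * ((i * 37) % 11 - 5) / 5 for i in range(200)]
jumpy = clean[:100] + [x + 0.5 for x in clean[100:]]
fixed = remove_step(jumpy, 100)
```

After the correction the two segments share a common mean, so the spurious step no longer masquerades as a sudden change in the weather.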

Most areas saw a decrease in evaporative demand, leading to higher water availability. The researchers analyzed the contributions of different factors to this decrease, and found that a downward trend in net radiation was largely responsible for the change. This was a surprising result, according to Estes, who said he expected to see decreases in evaporative demand, but thought lower wind speeds would have a greater impact than drops in net radiation. In a 2012 study published in the journal Nature, Sheffield and Wood showed that diminished wind speeds have helped to offset the effects of rising temperatures that would otherwise lead to an increase in droughts. Another study found that decreasing wind speeds contributed to declining evaporative demand in South Africa. The current study only examined water availability during the maize growing season, which could account for this discrepancy, Estes said.

The trends revealed by this research could have implications for agricultural policies and practices, including irrigation planning, timing of planting and choice of crop varietals. For example, in Burkina Faso in West Africa, a comparison of different parts of the growing season showed a decrease in water availability early in the season, but an increase at later time points. This might mean that the rainy season is starting later, in which case farmers in that region might adapt by planting their maize later. In South Africa, evaporative demand dropped in many areas; this could inform a reallocation of water use.

According to Estes, this study, which examined only 34 percent of all African maize-growing areas, may serve as a framework to guide more detailed analyses within individual countries. It’s also essential to understand the relationship between changes in water availability and changes in actual crop yields, which is more complex because yield trends are influenced by numerous political and economic factors, in addition to farming practices. That’s where Estes hopes to focus his next efforts. “All those factors would have to be teased out to isolate what these changes in water supply and demand mean for crop production,” he said.

Other researchers in Princeton’s Department of Civil and Environmental Engineering involved in the study include graduate student Julio Herrera Estrada and Associate Professor Kelly Caylor.

This work was funded by the United States Army Corps of Engineers Institute for Water Resources, the NASA Measures Program and the Princeton Environmental Institute Grand Challenges Program.

Read the abstract.

Estes, L. D.; Chaney, N. W.; Herrera-Estrada, J.; Sheffield, J.; Caylor, K. K.; Wood, E. F. Changing water availability during the African maize-growing season, 1979–2010. Environmental Research Letters. Volume 9, Number 7, 075005. DOI: 10.1088/1748-9326/9/7/075005.

Solar panels light the way from carbon dioxide to fuel (Journal of CO2 Utilization)

By Tien Nguyen, Department of Chemistry

Research to curb global warming caused by rising levels of atmospheric greenhouse gases, such as carbon dioxide, usually involves three areas: Developing alternative energy sources, capturing and storing greenhouse gases, and repurposing excess greenhouse gases. Drawing on two of these approaches, researchers in the laboratory of Andrew Bocarsly, a Princeton professor of chemistry, collaborated with researchers at start-up company Liquid Light Inc. of Monmouth Junction, New Jersey, to devise an efficient method for harnessing sunlight to convert carbon dioxide into a potential alternative fuel known as formic acid. The study was published June 13 in the Journal of CO2 Utilization.

Pictured with the photovoltaic-electrochemical cell system from left to right: Graduate student James White (Princeton), Professor Andrew Bocarsly (Princeton and Liquid Light) and principal engineer Paul Majsztrik (Liquid Light). (Photo by Frank Wojciechowski)

The transformation from carbon dioxide and water to formic acid was powered by a commercial solar panel provided by the energy company PSE&G that can be found atop electric poles across New Jersey. The process takes place inside an electrochemical cell, which consists of metal plates the size of rectangular lunchboxes that enclose liquid-carrying channels.

To maximize the efficiency of the system, the amount of power produced by the solar panel must match the amount of power the electrochemical cell can handle, said Bocarsly. This optimization process is called impedance matching. By stacking three electrochemical cells together, the research team was able to reach almost 2 percent energy efficiency, which is twice the efficiency of natural photosynthesis. It is also the best energy efficiency reported to date using a man-made device.
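The impedance-matching logic can be sketched with back-of-the-envelope numbers. The voltages below are assumptions for illustration, not values from the paper; the point is that cells are stacked in series until the stack's operating voltage approaches the panel's maximum-power-point voltage:

```python
# Power matching between a solar panel and an electrochemical cell stack.
# A panel delivers its maximum power at a particular operating voltage, so
# cells are wired in series until the stack voltage approaches that point.
panel_vmpp = 8.1       # panel maximum-power-point voltage (V) -- assumed value
cell_voltage = 2.7     # operating voltage of one CO2-reduction cell (V) -- assumed value

n_cells = round(panel_vmpp / cell_voltage)        # series cells needed
stack_voltage = n_cells * cell_voltage
mismatch = abs(stack_voltage - panel_vmpp) / panel_vmpp
```

With these assumed numbers, three cells in series sit almost exactly at the panel's maximum-power point, mirroring the three-cell stack the team used to reach its efficiency figure.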

A number of energy companies are interested in storing solar energy as formic acid in fuel cells. Additionally, formate salt—readily made from formic acid—is the preferred de-icing agent on airplane runways because it is less corrosive to planes and safer for the environment than chloride salts. With increased availability, formate salts could supplant more harmful salts in widespread use.

Using waste carbon dioxide and easily obtained machined parts, this approach offers a promising route to a renewable fuel, Bocarsly said.

This work was financially supported by Liquid Light, Inc., which was cofounded by Bocarsly, and the National Science Foundation under grant no. CHE-0911114.

Read the abstract.

White, J. L.; Herb, J. T.; Kaczur, J. J.; Majsztrik, P. W.; Bocarsly, A. B. Photons to formate: Efficient electrochemical solar energy conversion via reduction of carbon dioxide. Journal of CO2 Utilization. Available online June 13, 2014.

Indonesia

With climate change, heat more than natural disasters will drive people away (PNAS)

By Morgan Kelly, Office of Communications

Although scenes of people fleeing from dramatic displays of Mother Nature’s power dominate the news, gradual increases in an area’s overall temperature — and to a lesser extent precipitation — actually lead more often to permanent population shifts, according to Princeton University research.

The researchers examined 15 years of migration data for more than 7,000 families in Indonesia and found that increases in temperature and, to a lesser extent, rainfall influenced a family’s decision to permanently migrate to another of the country’s provinces. They report in the journal Proceedings of the National Academy of Sciences that increases in average yearly temperature took a detrimental toll on people’s economic well-being. On the other hand, natural disasters such as floods and earthquakes had a much smaller, or even nonexistent, impact on permanent moves, suggesting that during natural disasters relocation was most often temporary as people sought refuge in other areas of the country before returning home to rebuild their lives.

The results suggest that the consequences of climate change will likely be more subtle and permanent than is popularly believed, explained first author Pratikshya Bohra-Mishra, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy (STEP) in Princeton’s Woodrow Wilson School of Public and International Affairs. The effects likely won’t be limited to low-lying areas or developing countries that are unprepared for an uptick in hurricanes, floods and other natural disasters, she said.

“We do not think of ‘environmental migrants’ in a broader sense; images of refugees from natural disasters often dominate the overall picture,” Bohra-Mishra said. “It is important to understand the often less conspicuous and gradual effect of climate change on migration. Our study suggests that in areas that are already hot, a further increase in temperature will increase the likelihood that more people will move out.”

Indonesia’s tropical climate and dependence on agriculture may amplify the role of temperature as a migration factor, Bohra-Mishra said. However, existing research shows that climate-driven changes in crop yields can affect Mexican migration to the United States, and that extreme temperatures played a role in the long-term migration of males in rural Pakistan.

“Based on these emerging findings, it is likely that the societal reach of climate change could be much broader to include warm regions that are now relatively safe from natural disasters,” Bohra-Mishra said.

Indonesia became the case study because the multi-island tropical nation is vulnerable to climate change and events such as earthquakes and landslides. In addition, the Indonesian Family Life Survey (IFLS) conducted by the RAND Corporation from 1993 to 2007 provided thorough information about the movements of 7,185 families from 13 of the nation’s 27 provinces in 1993. The Princeton researchers matched province-to-province movement of households over 15 years to data on temperature, precipitation and natural disasters from those same years. Bohra-Mishra worked with co-authors Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs and director of STEP, and Solomon Hsiang, a past Princeton postdoctoral researcher now an assistant professor of public policy at the University of California-Berkeley.

People start to rethink their location with each degree that the average annual temperature rises above 25 degrees Celsius (77 degrees Fahrenheit), the researchers found. The chances that a family will leave an area for good in a given year rise with each degree. With a change from 26 to 27 degrees Celsius (78.8 to 80.6 Fahrenheit), the probability of a family emigrating that year increased by 0.8 percent when other factors for migration were controlled for. From 27 to 28 degrees Celsius (80.6 to 82.4 Fahrenheit), those chances jumped to 1.4 percent.
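Those two data points imply a convex response: each additional degree of warming costs more than the last. A straight line through the two reported increments makes the pattern concrete; the interpolation is illustrative only (and hypothetical beyond those two points), not the authors' fitted model:

```python
def marginal_increase_pct(temp_c):
    """Percentage-point increase in the annual chance a family migrates,
    for a one-degree-Celsius warming starting at temp_c. A straight line
    through the two increments reported in the study: +0.8 for the
    26->27 C step and +1.4 for the 27->28 C step. Illustrative only."""
    return 0.8 + 0.6 * (temp_c - 26.0)

step_26_27 = marginal_increase_pct(26.0)   # 0.8, as reported
step_27_28 = marginal_increase_pct(27.0)   # 1.4, as reported
```

Extending the same line suggests ever-larger increments at higher starting temperatures, consistent with the finding that already-hot areas are hit hardest, though the true functional form is what the study estimates, not this sketch.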

When it comes to annual rainfall, families seem to prefer an average of about 2.2 meters (7.2 feet). The chances of outmigration increased as precipitation deviated from that level in either direction, whether through each additional meter of average annual rainfall or through further declines.

Landslides were the only natural disaster with a consistent positive influence on permanent migration. With every 1 percent increase in the number of deaths or destroyed houses in a family’s home province, the likelihood of permanent migration went up by only 0.0006 and 0.0004 percent, respectively.

The much higher influence of heat on permanent migration can be pinned on its effect on local economies and social structures, the researchers write. Previous research has shown that a one-degree change in the average growing-season temperature can reduce yields of certain crops by as much as 17 percent. At the same time, research conducted by Hsiang while at Princeton and published in 2013 showed a correlation between higher temperatures and social conflict such as civil wars, ethnic conflict and street crime.

In the current study, the researchers found that in Indonesia, a shift from 25 to 26 degrees Celsius resulted in a significant 14 to 15 percent decline in the value of household assets, for example. Precipitation did not have a notable effect on household worth, nor did natural disasters except landslides, which lowered assets by 5 percent for each 1 percent increase in the number of people who died.

Read the abstract.

Bohra-Mishra, Pratikshya, Michael Oppenheimer, Solomon Hsiang. 2014. Nonlinear permanent migration response to climatic variations but minimal response to disasters. Proceedings of the National Academy of Sciences. Article published online June 23, 2014. DOI: 10.1073/pnas.1317166111.

A farewell to arms? Scientists developing a novel technique that could facilitate nuclear disarmament (Nature)

Alexander Glaser and Robert Goldston

Alexander Glaser and Robert Goldston with the British Test Object. Credit: Elle Starkman/PPPL Communications Office

By John Greenwald, Princeton Plasma Physics Laboratory Office of Communications

A proven system for verifying that apparent nuclear weapons slated for dismantlement contain true warheads could provide a key step toward the further reduction of nuclear arms. The system would achieve this verification while safeguarding classified information that could lead to nuclear proliferation.

Scientists at Princeton University and the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) are developing the prototype for such a system, as reported this week in the journal Nature. Their novel approach, called a “zero-knowledge protocol,” would verify the presence of warheads without collecting any classified information at all.

“The goal is to prove with as high confidence as required that an object is a true nuclear warhead while learning nothing about the materials and design of the warhead itself,” said physicist Robert Goldston, coauthor of the paper, a fusion researcher and former director of PPPL, and a professor of astrophysical sciences at Princeton.

While numerous efforts have been made over the years to develop systems for verifying the actual content of warheads covered by disarmament treaties, no such methods are currently in use for treaty verification.

Traditional nuclear arms negotiations focus instead on the reduction of strategic — or long-range — delivery systems, such as bombers, submarines and ballistic missiles, without verifying their warheads. But this approach could prove insufficient when future talks turn to tactical and nondeployed nuclear weapons that are not on long-range systems. “What we really want to do is count warheads,” said physicist Alexander Glaser, first author of the paper and an assistant professor in Princeton’s Woodrow Wilson School of Public and International Affairs and the Department of Mechanical and Aerospace Engineering.

The system Glaser and Goldston are mapping out would compare a warhead to be inspected with a known true warhead to see if the weapons matched. This would be done by beaming high-energy neutrons into each warhead and recording how many neutrons passed through to detectors positioned on the other side. Neutrons that passed through would be added to those already “preloaded” into the detectors by the warheads’ owner — and if the total number of neutrons were the same for each warhead, the weapons would be found to match. But different totals would show that the putative warhead was really a spoof. Prior to the test, the inspector would decide which preloaded detector would go with which warhead.
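The matching test described above can be sketched as a toy simulation. Everything here is illustrative: the counts, the `DETECTOR_MAX` constant, and the function names are hypothetical, and the real system relies on physical, preloaded bubble detectors rather than software. The sketch only shows the logic of the comparison: because the owner preloads each detector with the complement of a true warhead's transmitted counts, every detector tops out at the same total for a genuine warhead, so the inspector learns nothing about the counts themselves, while a spoof leaves at least one detector short.

```python
DETECTOR_MAX = 1000  # hypothetical full-scale count per detector position

def preload(true_counts):
    """Owner preloads each detector with the complement of the counts
    a genuine warhead would transmit, so a true item tops every
    detector up to DETECTOR_MAX."""
    return [DETECTOR_MAX - c for c in true_counts]

def measure(item_counts, preloaded):
    """Beam neutrons through the item and add the transmitted counts
    to the preloaded ones; only the totals are revealed."""
    return [c + p for c, p in zip(item_counts, preloaded)]

true_warhead = [620, 480, 710, 350]   # hypothetical transmitted counts
spoof        = [620, 480, 300, 350]   # differs at the third detector

# The inspector, not the owner, decides which preloaded detector set
# goes with which item, so the owner cannot tailor preloads to a spoof.
detectors = preload(true_warhead)

# A true warhead yields a uniform total at every position...
assert all(t == DETECTOR_MAX for t in measure(true_warhead, detectors))
# ...while a spoof leaves at least one total off the mark.
assert any(t != DETECTOR_MAX for t in measure(spoof, detectors))
```

The uniform totals are what make the test "zero-knowledge" in spirit: a genuine warhead produces the same number everywhere, regardless of its internal design, so the measurement confirms a match without exposing the transmission profile.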

No classified data would be measured in this process, and no electronic components that might be vulnerable to tampering and snooping would be used. “This approach really is very interesting and elegant,” said Steve Fetter, a professor in the School of Public Policy at the University of Maryland and a former White House official. “The main question is whether it can be implemented in practice.”

A project to test this approach is under construction at PPPL. The project calls for firing high-energy neutrons at a non-nuclear target, called a British Test Object, that will serve as a proxy for warheads. Researchers will compare results of the tests by noting how many neutrons pass through the target to bubble detectors that Yale University is designing for the project. The gel-filled detectors will add the neutrons that pass through to those already preloaded to produce a total for each test.

The project was launched with a seed grant from The Simons Foundation of Vancouver, Canada, that came to Princeton through Global Zero, a nonprofit organization. Support also was provided by the U.S. Department of State, the DOE (via PPPL pre-proposal development funding), and most recently, a total of $3.5 million over five years from the National Nuclear Security Administration.

Glaser hit upon the idea for a zero-knowledge proof over a lunch hosted by David Dobkin, a computer scientist and, until June 2014, dean of the Princeton faculty. "I told him I was really interested in nuclear warhead verification without learning anything about the warhead itself," Glaser said. "We call this a zero-knowledge proof in computer science," Glaser said Dobkin replied. "That was the trigger," Glaser recalled. "I went home and began reading about zero-knowledge proofs," which are widely used in applications such as verifying online passwords.

Glaser’s reading led him to Boaz Barak, a senior researcher at Microsoft New England who had taught computer science at Princeton and is an expert in cryptology, the science of disguising secret information. “We started having discussions,” Glaser said of Barak, who helped develop statistical measures for the PPPL project and is the third coauthor of the paper in Nature.

Glaser also reached out to Goldston, with whom he had taught a class for three years in the Princeton Department of Astrophysical Sciences. “I told Rob that we need neutrons for this project,” Glaser recalled. “And he said, ‘That’s what we do — we have 14 MeV [or high-energy] neutrons at the Laboratory.’” Glaser, Goldston and Barak then worked together to refine the concept, developing ways to assure that even the statistical noise — or random variation — in the measurements conveyed no information.

If proven successful, dedicated inspection systems based on radiation measurements, such as the one proposed here, could help to advance disarmament talks beyond the New Strategic Arms Reduction Treaty (New START) between the United States and Russia, which runs from 2011 to 2021. The treaty calls for each country to reduce its arsenal of deployed strategic nuclear arms to 1,550 weapons, for a total of 3,100, by 2018.

Not included in the New START treaty are more than 4,000 nondeployed strategic and tactical weapons in each country’s arsenal. These very weapons, note the authors of the Nature paper, are apt to become part of future negotiations, “which will likely require verification of individual warheads, rather than whole delivery systems.” Deep cuts in the nuclear arsenals and the ultimate march to zero, say the authors, will require the ability to verifiably count individual warheads.

Read the abstract: http://dx.doi.org/10.1038/nature13457

A. Glaser, B. Barak, R. Goldston. A zero-knowledge protocol for nuclear warhead verification. Nature, 26 June 2014. DOI: 10.1038/nature13457