‘Fracking’ in the dark: Biological fallout of shale-gas production still largely unknown (Frontiers in Ecology and the Environment)

Eight conservation biologists from various organizations and institutions, including Princeton University, found that shale-gas extraction in the United States has vastly outpaced scientists’ understanding of the industry’s environmental impact. Each gas well can act as a source of air, water, noise and light pollution that — individually and collectively — can interfere with wild animal health, habitats and reproduction. Of particular concern is the fluid and wastewater associated with hydraulic fracturing, or “fracking,” a technique that releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals. (Frontiers in Ecology and the Environment)

By Morgan Kelly, Office of Communications

In the United States, natural-gas production from shale rock has increased by more than 700 percent since 2007. Yet scientists still do not fully understand the industry’s effects on nature and wildlife, according to a report in the journal Frontiers in Ecology and the Environment.

As gas extraction continues to vastly outpace scientific examination, a team of eight conservation biologists from various organizations and institutions, including Princeton University, concluded that determining the environmental impact of gas-drilling sites — such as chemical contamination from spills, well-casing failures and other accidents — must be a top research priority.

With shale-gas production projected to surge during the next 30 years, the authors call on scientists, industry representatives and policymakers to cooperate on determining — and minimizing — the damage inflicted on the natural world by gas operations such as hydraulic fracturing, or “fracking.” A major environmental concern, hydraulic fracturing releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals, which can include carcinogens and radioactive substances.

“We can’t let shale development outpace our understanding of its environmental impacts,” said co-author Morgan Tingley, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy in Princeton’s Woodrow Wilson School of Public and International Affairs.

With shale-gas production projected to surge during the next 30 years, determining and minimizing the industry’s effects on nature and wildlife must become a top priority for scientists, industry and policymakers. Pictured is Wyoming’s Jonah Field. Although modern shale-gas wells need less surface area than the older methods shown here, the ecological impacts of extraction operations past and present pose a long-lasting threat to the natural world. (Photo courtesy of Ecoflight.)

“The past has taught us that environmental impacts of large-scale development and resource extraction, whether coal plants, large dams or biofuel monocultures, are more than the sum of their parts,” Tingley said.

The researchers found that there are significant “knowledge gaps” when it comes to direct and quantifiable evidence of how the natural world responds to shale-gas operations. A major impediment to research has been the lack of accessible and reliable information on spills, wastewater disposal and the composition of fracturing fluids. Of the 24 American states with active shale-gas reservoirs, only five — Pennsylvania, Colorado, New Mexico, Wyoming and Texas — maintain public records of spills and accidents, the researchers report.

“The Pennsylvania Department of Environmental Protection’s website is one of the best sources of publicly available information on shale-gas spills and accidents in the nation. Even so, gas companies failed to report more than one-third of spills in the last year,” said first author Sara Souther, a postdoctoral research associate at the University of Wisconsin-Madison.

“How many more unreported spills occurred, but were not detected during well inspections?” Souther asked. “We need accurate data on the release of fracturing chemicals into the environment before we can understand impacts to plants and animals.”

One of the greatest threats to animal and plant life identified in the study is the impact of rapid and widespread shale development, which has disproportionately affected rural and natural areas. A single gas well results in the clearance of 3.7 to 7.6 acres (1.5 to 3.1 hectares) of vegetation, and each well contributes to a collective mass of air, water, noise and light pollution that can interfere with wild animal health, habitats and reproduction, the researchers report.

“If you look down on a heavily ‘fracked’ landscape, you see a web of well pads, access roads and pipelines that create islands out of what was, in some cases, contiguous habitat,” Souther said. “What are the combined effects of numerous wells and their supporting infrastructure on wide-ranging or sensitive species, like the pronghorn antelope or the hellbender salamander?”

The chemical makeup of fracturing fluid and wastewater is often unknown. The authors reviewed chemical-disclosure statements for 150 wells in three of the top gas-producing states and found that an average of two out of every three wells were fractured with at least one undisclosed chemical. The exact effect of fracturing fluid on natural water systems as well as drinking water supplies remains unclear even though improper wastewater disposal and pollution-prevention measures are among the top state-recorded violations at drilling sites, the researchers found.

“Some of the wells in the chemical disclosure registry were fractured with fluid containing 20 or more undisclosed chemicals,” said senior author Kimberly Terrell, a researcher at the Smithsonian Conservation Biology Institute. “This is an arbitrary and inconsistent standard of chemical disclosure.”

The paper’s co-authors also include researchers from the University of Bucharest in Romania, Colorado State University, the University of Washington, and the Society for Conservation Biology.

The work was supported by the David H. Smith Fellowship program administered by the Society for Conservation Biology and funded by the Cedar Tree Foundation; and by a Policy Fellowship from the Wilburforce Foundation to the Society for Conservation Biology.

Read the abstract.

Souther, Sara, Morgan W. Tingley, Viorel D. Popescu, David T.S. Hyman, Maureen E. Ryan, Tabitha A. Graves, Brett Hartl, Kimberly Terrell. 2014. Biotic impacts of energy development from shale: research priorities and knowledge gaps. Frontiers in Ecology and the Environment. Article published online Aug. 1, 2014. DOI: 10.1890/130324.

Water, Water — Not Everywhere: Mapping water trends for African maize (Environmental Research Letters)

By Molly Sharlach, Office of the Dean for Research

Researchers analyzed water availability trends in African maize-growing regions from 1979 to 2010. Each quarter-degree grid cell represents a 200-square-mile area and is colored according to its average water availability level during the maize growing season. In redder areas, water availability is more limited by rainfall levels, while bluer areas are more limited by evaporative demand. (Image source: Environmental Research Letters)

Today’s food production relies heavily on irrigation, but across sub-Saharan Africa only 4 percent of cultivated land is irrigated, compared with a global average of 18 percent. Small-scale farming is the main livelihood for many people in the region, who depend on rainfall to water their crops.

To understand how climate change may affect the availability of water for agriculture, researchers at Princeton University analyzed trends in the water cycle in maize-growing areas of 21 African countries between 1979 and 2010. The team examined both levels of rainfall and the evaporative demand of the atmosphere — the combined effects of evaporation and transpiration, which is the movement of water through plants.

Overall, they found increases in water availability during the maize-growing season, although the trends varied by region. The greater availability of water generally resulted from a mixture of increased rainfall and decreased evaporative demand.

However, some regions of East Africa experienced declines in water availability, the study found. “Some places, like parts of Tanzania, got a double whammy that looks like a declining trend in rainfall as well as an increasing evaporative demand during the more sensitive middle part of the growing season,” said Lyndon Estes, the study’s lead author and an associate research scholar in the Program in Science, Technology and Environmental Policy at the Woodrow Wilson School of Public and International Affairs. The analysis was published in the July issue of the journal Environmental Research Letters.

A key goal of the study was to incorporate reliable data on factors that influence evaporative demand. These include temperature, wind speed, humidity and net radiation — defined as the amount of energy from the sun that is absorbed by the land, minus the amount reflected back into the atmosphere by the Earth’s surface. Measurements of three of these parameters came from the Princeton University Global Meteorological Forcing Dataset (PGF) previously developed by two of the study’s authors, Research Scholar Justin Sheffield and Eric Wood, the Susan Dod Brown Professor of Civil and Environmental Engineering and the study’s senior author.

The PGF merges a variety of weather and satellite data, and covers all land areas at a resolution of three hours and one degree of latitude or longitude (one degree of latitude is about 70 miles). Nathaniel Chaney, a graduate student who works with Sheffield, downscaled the data to a resolution of about 15 miles. He incorporated observations from African weather stations to improve the accuracy of the data. To do this, he used statistical techniques based on the principle that areas close to one another are likely to have similar weather.
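
The study’s exact interpolation scheme is not described in this article; as an illustration only, the short Python sketch below applies the stated principle (nearby places tend to have similar weather) using simple inverse-distance weighting to blend coarse gridded values with station observations. The function names and all numbers are hypothetical, not drawn from the PGF itself.

```python
import numpy as np

def idw_downscale(coarse_xy, coarse_vals, station_xy, station_vals, fine_xy, power=2):
    """Illustrative statistical downscaling (hypothetical, not the study's method):
    interpolate a coarse gridded field onto a finer grid, then nudge the result
    toward nearby weather-station observations."""
    def idw(xy_src, vals_src, xy_dst):
        # inverse-distance weights from each destination point to every source point
        d = np.linalg.norm(xy_dst[:, None, :] - xy_src[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-6) ** power
        return (w * vals_src).sum(axis=1) / w.sum(axis=1)

    fine_vals = idw(coarse_xy, coarse_vals, fine_xy)        # coarse field on the fine grid
    station_bg = idw(coarse_xy, coarse_vals, station_xy)    # coarse field at the stations
    residuals = station_vals - station_bg                   # what the stations capture that the grid misses
    return fine_vals + idw(station_xy, residuals, fine_xy)  # spread the corrections spatially

# Toy example: four coarse grid points, two stations, one fine-grid location (lon, lat)
coarse_xy = np.array([[34.0, -6.0], [35.0, -6.0], [34.0, -7.0], [35.0, -7.0]])
coarse_vals = np.array([24.0, 25.0, 23.5, 24.5])            # e.g. temperature in deg C
station_xy = np.array([[34.4, -6.3], [34.9, -6.8]])
station_vals = np.array([24.8, 23.9])
fine_xy = np.array([[34.5, -6.5]])
print(idw_downscale(coarse_xy, coarse_vals, station_xy, station_vals, fine_xy))
```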

The team also had to correct the data for errors due to changes in instruments or satellites, which can create what appear to be sudden jumps in temperature or wind speed. “When you’re dealing with gridded global weather data, they come with many warts,” Estes said. “So we try to remove as many of those warts as possible,” he said, to gain a faithful picture of weather changes at each location.

Most areas saw a decrease in evaporative demand, leading to higher water availability. The researchers analyzed the contributions of different factors to this decrease, and found that a downward trend in net radiation was largely responsible for the change. This was a surprising result, according to Estes, who said he expected to see decreases in evaporative demand, but thought lower wind speeds would have a greater impact than drops in net radiation. In a 2012 study published in the journal Nature, Sheffield and Wood showed that diminished wind speeds have helped to offset the effects of rising temperatures that would otherwise lead to an increase in droughts. Another study found that decreasing wind speeds contributed to declining evaporative demand in South Africa. The current study only examined water availability during the maize growing season, which could account for this discrepancy, Estes said.

The trends revealed by this research could have implications for agricultural policies and practices, including irrigation planning, timing of planting and choice of crop varietals. For example, in Burkina Faso in West Africa, a comparison of different parts of the growing season showed a decrease in water availability early in the season, but an increase at later time points. This might mean that the rainy season is starting later, in which case farmers in that region might adapt by planting their maize later. In South Africa, evaporative demand dropped in many areas; this could inform a reallocation of water use.

According to Estes, this study, which examined only 34 percent of all African maize-growing areas, may serve as a framework to guide more detailed analyses within individual countries. It’s also essential to understand the relationship between changes in water availability and changes in actual crop yields, which is more complex because yield trends are influenced by numerous political and economic factors, in addition to farming practices. That’s where Estes hopes to focus his next efforts. “All those factors would have to be teased out to isolate what these changes in water supply and demand mean for crop production,” he said.

Other researchers in Princeton’s Department of Civil and Environmental Engineering involved in the study include graduate student Julio Herrera Estrada and Associate Professor Kelly Caylor.

This work was funded by the United States Army Corps of Engineers Institute for Water Resources, the NASA Measures Program and the Princeton Environmental Institute Grand Challenges Program.

Read the abstract.

Estes, L. D., N. W. Chaney, J. Herrera-Estrada, J. Sheffield, K. K. Caylor and E. F. Wood. 2014. Changing water availability during the African maize-growing season, 1979–2010. Environmental Research Letters 9(7): 075005. DOI: 10.1088/1748-9326/9/7/075005.

Solar panels light the way from carbon dioxide to fuel (Journal of CO2 Utilization)

By Tien Nguyen, Department of Chemistry

Research to curb global warming caused by rising levels of atmospheric greenhouse gases, such as carbon dioxide, usually involves three areas: Developing alternative energy sources, capturing and storing greenhouse gases, and repurposing excess greenhouse gases. Drawing on two of these approaches, researchers in the laboratory of Andrew Bocarsly, a Princeton professor of chemistry, collaborated with researchers at start-up company Liquid Light Inc. of Monmouth Junction, New Jersey, to devise an efficient method for harnessing sunlight to convert carbon dioxide into a potential alternative fuel known as formic acid. The study was published June 13 in the Journal of CO2 Utilization.

Pictured with the photovoltaic-electrochemical cell system from left to right: Graduate student James White (Princeton), Professor Andrew Bocarsly (Princeton and Liquid Light) and principal engineer Paul Majsztrik (Liquid Light). (Photo by Frank Wojciechowski)

The transformation from carbon dioxide and water to formic acid was powered by a commercial solar panel provided by the energy company PSE&G that can be found atop electric poles across New Jersey. The process takes place inside an electrochemical cell, which consists of metal plates the size of rectangular lunch-boxes that enclose liquid-carrying channels.

To maximize the efficiency of the system, the amount of power produced by the solar panel must match the amount of power the electrochemical cell can handle, said Bocarsly. This optimization process is called impedance matching. By stacking three electrochemical cells together, the research team was able to reach almost 2 percent energy efficiency, which is twice the efficiency of natural photosynthesis. It is also the best energy efficiency reported to date using a man-made device.
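
The article reports the outcome (three stacked cells, nearly 2 percent efficiency) but not the underlying electrical numbers. As a hypothetical back-of-the-envelope sketch only, the snippet below shows the logic: pick the number of series-connected cells whose combined operating voltage sits near the panel’s maximum-power-point voltage, then divide the chemical energy stored in formic acid by the solar energy supplied. Every value here is an illustrative placeholder, not data from the study.

```python
# Hypothetical back-of-the-envelope sketch; none of these numbers come from the paper.
V_MPP = 7.5        # solar panel maximum-power-point voltage (V), assumed
I_MPP = 4.0        # panel current at the maximum power point (A), assumed
V_CELL = 2.5       # operating voltage of one electrochemical cell (V), assumed

# "Impedance matching": stack enough cells in series that the stack voltage lands
# near the panel's maximum-power-point voltage, so the panel keeps delivering
# close to its peak power.
n_cells = round(V_MPP / V_CELL)              # -> 3 cells in this toy example

# Solar-to-fuel efficiency: chemical energy stored per second in formic acid
# divided by the solar power striking the panel.
FARADAY = 96485.0                            # coulombs per mole of electrons
E_FORMIC = 255e3                             # approx. heat of combustion of formic acid (J/mol)
faradaic_efficiency = 0.9                    # fraction of electrons ending up in formate, assumed
panel_area = 0.25                            # m^2, assumed
irradiance = 1000.0                          # W/m^2, standard test-condition sunlight

mol_per_second = faradaic_efficiency * I_MPP / (2 * FARADAY)   # CO2 + 2 e- + 2 H+ -> HCOOH
efficiency = mol_per_second * E_FORMIC / (irradiance * panel_area)
print(f"{n_cells} cells in series, solar-to-fuel efficiency ~ {efficiency:.1%}")
```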

A number of energy companies are interested in storing solar energy as formic acid in fuel cells. Additionally, formate salt—readily made from formic acid—is the preferred de-icing agent on airplane runways because it is less corrosive to planes and safer for the environment than chloride salts. With increased availability, formate salts could supplant more harmful salts in widespread use.

Using waste carbon dioxide and easily obtained machined parts, this approach offers a promising route to a renewable fuel, Bocarsly said.

This work was financially supported by Liquid Light, Inc., which was cofounded by Bocarsly, and the National Science Foundation under grant no. CHE-0911114.

Read the abstract.

White, J. L.; Herb, J. T.; Kaczur, J. J.; Majsztrik, P. W.; Bocarsly, A. B. Photons to formate: Efficient electrochemical solar energy conversion via reduction of carbon dioxide. Journal of CO2 Utilization. Available online June 13, 2014.

With climate change, heat more than natural disasters will drive people away (PNAS)

By Morgan Kelly, Office of Communications

Although scenes of people fleeing from dramatic displays of Mother Nature’s power dominate the news, gradual increases in an area’s overall temperature — and to a lesser extent precipitation — actually lead more often to permanent population shifts, according to Princeton University research.

The researchers examined 15 years of migration data for more than 7,000 families in Indonesia and found that increases in temperature and, to a lesser extent, rainfall influenced a family’s decision to permanently migrate to another of the country’s provinces. They report in the journal Proceedings of the National Academy of Sciences that increases in average yearly temperature took a toll on people’s economic wellbeing. Natural disasters such as floods and earthquakes, on the other hand, had a much smaller impact on permanent moves, or none at all, suggesting that relocation during disasters was most often temporary, with people seeking refuge in other areas of the country before returning home to rebuild their lives.

The results suggest that the consequences of climate change will likely be more subtle and permanent than is popularly believed, explained first author Pratikshya Bohra-Mishra, a postdoctoral research associate in the Program in Science, Technology and Environmental Policy (STEP) in Princeton’s Woodrow Wilson School of Public and International Affairs. The effects likely won’t be limited to low-lying areas or developing countries that are unprepared for an uptick in hurricanes, floods and other natural disasters, she said.

“We do not think of ‘environmental migrants’ in a broader sense; images of refugees from natural disasters often dominate the overall picture,” Bohra-Mishra said. “It is important to understand the often less conspicuous and gradual effect of climate change on migration. Our study suggests that in areas that are already hot, a further increase in temperature will increase the likelihood that more people will move out.”

Indonesia’s tropical climate and dependence on agriculture may amplify the role of temperature as a migration factor, Bohra-Mishra said. However, existing research shows that climate-driven changes in crop yields can affect Mexican migration to the United States, and that extreme temperatures have played a role in the long-term migration of males in rural Pakistan.

“Based on these emerging findings, it is likely that the societal reach of climate change could be much broader to include warm regions that are now relatively safe from natural disasters,” Bohra-Mishra said.

Indonesia became the case study because the multi-island tropical nation is vulnerable to climate change and events such as earthquakes and landslides. In addition, the Indonesian Family Life Survey (IFLS) conducted by the RAND Corporation from 1993 to 2007 provided thorough information about the movements of 7,185 families from 13 of the nation’s 27 provinces in 1993. The Princeton researchers matched province-to-province movement of households over 15 years to data on temperature, precipitation and natural disasters from those same years. Bohra-Mishra worked with co-authors Michael Oppenheimer, the Albert G. Millbank Professor of Geosciences and International Affairs and director of STEP, and Solomon Hsiang, a past Princeton postdoctoral researcher now an assistant professor of public policy at the University of California-Berkeley.

People start to rethink their location with each degree that the average annual temperature rises above 25 degrees Celsius (77 degrees Fahrenheit), the researchers found. The chances that a family will leave an area for good in a given year rise with each degree. With a change from 26 to 27 degrees Celsius (78.8 to 80.6 Fahrenheit), the probability of a family emigrating that year increased by 0.8 percent when other factors for migration were controlled for. From 27 to 28 degrees Celsius (80.6 to 82.4 Fahrenheit), those chances jumped to 1.4 percent.

When it comes to annual rainfall, families appear to fare best at an average of about 2.2 meters (7.2 feet). The chances of outmigration increased with each additional meter of average annual precipitation, as well as with declines in rainfall below that level.

Landslides were the only natural disaster with a consistent positive influence on permanent migration. With every 1 percent increase in the number of deaths or destroyed houses in a family’s home province, the likelihood of permanent migration went up by only 0.0006 and 0.0004 percent, respectively.

The much higher influence of heat on permanent migration can be pinned on its effect on local economies and social structures, the researchers write. Previous research has shown that a one-degree change in the average growing-season temperature can reduce yields of certain crops by as much as 17 percent. At the same time, research conducted by Hsiang while at Princeton and published in 2013 showed a correlation between higher temperatures and social conflict such as civil wars, ethnic conflict and street crime.

In the current study, the researchers found that in Indonesia, a shift from 25 to 26 degrees Celsius resulted in a significant 14 to 15 percent decline in the value of household assets, for example. Precipitation did not have a notable effect on household worth, nor did natural disasters except landslides, which lowered assets by 5 percent for each 1 percent increase in the number of people who died.

Read the abstract.

Bohra-Mishra, Pratikshya, Michael Oppenheimer, Solomon Hsiang. 2014. Nonlinear permanent migration response to climatic variations but minimal response to disasters. Proceedings of the National Academy of Sciences. Article published online June 23, 2014. DOI: 10.1073/pnas.1317166111.

A farewell to arms? Scientists developing a novel technique that could facilitate nuclear disarmament (Nature)

Alexander Glaser and Robert Goldston with the British Test Object. Credit: Elle Starkman/PPPL Communications Office

By John Greenwald, Princeton Plasma Physics Laboratory Office of Communications

A proven system for verifying that putative nuclear weapons slated for dismantlement contain true warheads could provide a key step toward the further reduction of nuclear arms. The system would achieve this verification while safeguarding classified information that could lead to nuclear proliferation.

Scientists at Princeton University and the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) are developing the prototype for such a system, as reported this week in the journal Nature. Their novel approach, called a “zero-knowledge protocol,” would verify the presence of warheads without collecting any classified information at all.

“The goal is to prove with as high confidence as required that an object is a true nuclear warhead while learning nothing about the materials and design of the warhead itself,” said physicist Robert Goldston, coauthor of the paper, a fusion researcher and former director of PPPL, and a professor of astrophysical sciences at Princeton.

While numerous efforts have been made over the years to develop systems for verifying the actual content of warheads covered by disarmament treaties, no such methods are currently in use for treaty verification.

Traditional nuclear arms negotiations focus instead on the reduction of strategic — or long-range — delivery systems, such as bombers, submarines and ballistic missiles, without verifying their warheads. But this approach could prove insufficient when future talks turn to tactical and nondeployed nuclear weapons that are not on long-range systems. “What we really want to do is count warheads,” said physicist Alexander Glaser, first author of the paper and an assistant professor in Princeton’s Woodrow Wilson School of Public and International Affairs and the Department of Mechanical and Aerospace Engineering.

The system Glaser and Goldston are mapping out would compare a warhead to be inspected with a known true warhead to see if the weapons matched. This would be done by beaming high-energy neutrons into each warhead and recording how many neutrons passed through to detectors positioned on the other side. Neutrons that passed through would be added to those already “preloaded” into the detectors by the warheads’ owner — and if the total number of neutrons were the same for each warhead, the weapons would be found to match. But different totals would show that the putative warhead was really a spoof. Prior to the test, the inspector would decide which preloaded detector would go with which warhead.
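
A toy simulation helps make the counting logic concrete. The sketch below is a schematic reading of the protocol as described in this article, not the authors’ implementation: the owner preloads each detector so that preload plus transmitted counts equals an agreed total for a genuine warhead, the inspector randomly assigns the preloaded detector sets to the items, and a spoof that transmits a different neutron pattern produces mismatched totals. The detector positions, counts and absence of statistical noise are all hypothetical simplifications.

```python
import random

AGREED_TOTAL = 10_000   # hypothetical target count for every detector position

# Hypothetical three-position detector geometry; "transmission" is the number of
# high-energy neutrons reaching each detector after passing through the item.
TRUE_WARHEAD = {"transmission": [4200, 6100, 3800]}
CANDIDATE    = {"transmission": [4200, 6100, 3800]}   # genuine item: same pattern
SPOOF        = {"transmission": [5200, 6100, 2600]}   # fake: different materials/geometry

def preload_for_true_warhead():
    """Owner preloads a detector set so that, for a genuine warhead,
    preload + transmitted = AGREED_TOTAL at every position."""
    return [AGREED_TOTAL - c for c in TRUE_WARHEAD["transmission"]]

def inspect(item_a, item_b):
    # The owner prepares one preloaded detector set per item; the inspector then
    # decides at random which set is used with which item, so a preload cannot be
    # tailored to a particular object.
    preloads = [preload_for_true_warhead(), preload_for_true_warhead()]
    random.shuffle(preloads)
    totals = []
    for item, preload in zip((item_a, item_b), preloads):
        totals.append([p + c for p, c in zip(preload, item["transmission"])])
    # Only these totals are revealed. A genuine item always reads the agreed total,
    # so the numbers disclose nothing about the warhead's internal design.
    return totals

print(inspect(TRUE_WARHEAD, CANDIDATE))   # both rows: [10000, 10000, 10000] -> match
print(inspect(TRUE_WARHEAD, SPOOF))       # spoof row deviates from the agreed total
```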

No classified data would be measured in this process, and no electronic components that might be vulnerable to tampering and snooping would be used. “This approach really is very interesting and elegant,” said Steve Fetter, a professor in the School of Public Policy at the University of Maryland and a former White House official. “The main question is whether it can be implemented in practice.”

A project to test this approach is under construction at PPPL. The project calls for firing high-energy neutrons at a non-nuclear target, called a British Test Object, that will serve as a proxy for warheads. Researchers will compare results of the tests by noting how many neutrons pass through the target to bubble detectors that Yale University is designing for the project. The gel-filled detectors will add the neutrons that pass through to those already preloaded to produce a total for each test.

The project was launched with a seed grant from The Simons Foundation of Vancouver, Canada, that came to Princeton through Global Zero, a nonprofit organization. Support also was provided by the U.S. Department of State, the DOE (via PPPL pre-proposal development funding), and most recently, a total of $3.5 million over five years from the National Nuclear Security Administration.

Glaser hit upon the idea for a zero-knowledge proof over a lunch hosted by David Dobkin, a computer scientist who was dean of the Princeton faculty until June 2014. “I told him I was really interested in nuclear warhead verification without learning anything about the warhead itself,” Glaser said. Dobkin replied, “We call this a zero-knowledge proof in computer science.” “That was the trigger,” Glaser recalled. “I went home and began reading about zero-knowledge proofs,” which are widely used in applications such as verifying online passwords.

Glaser’s reading led him to Boaz Barak, a senior researcher at Microsoft New England who had taught computer science at Princeton and is an expert in cryptology, the science of disguising secret information. “We started having discussions,” Glaser said of Barak, who helped develop statistical measures for the PPPL project and is the third coauthor of the paper in Nature.

Glaser also reached out to Goldston, with whom he had taught a class for three years in the Princeton Department of Astrophysical Sciences. “I told Rob that we need neutrons for this project,” Glaser recalled. “And he said, ‘That’s what we do — we have 14 MeV [or high-energy] neutrons at the Laboratory.’” Glaser, Goldston and Barak then worked together to refine the concept, developing ways to assure that even the statistical noise — or random variation — in the measurements conveyed no information.

If proven successful, dedicated inspection systems based on radiation measurements, such as the one proposed here, could help to advance disarmament talks beyond the New Strategic Arms Reduction Treaty (New START) between the United States and Russia, which runs from 2011 to 2021. The treaty calls for each country to reduce its arsenal of deployed strategic nuclear arms to 1,550 weapons, for a total of 3,100, by 2018.

Not included in the New START treaty are more than 4,000 nondeployed strategic and tactical weapons in each country’s arsenal. These very weapons, note the authors of the Nature paper, are apt to become part of future negotiations, “which will likely require verification of individual warheads, rather than whole delivery systems.” Deep cuts in the nuclear arsenals and the ultimate march to zero, say the authors, will require the ability to verifiably count individual warheads.

Read the abstract: http://dx.doi.org/10.1038/nature13457

Glaser, A., B. Barak, R. Goldston. 2014. A zero-knowledge protocol for nuclear warhead verification. Nature. Article published June 26, 2014. DOI: 10.1038/nature13457.

Strange physics turns off laser (Nature Communications)

By Steve Schultz, School of Engineering Office of Communications

An electron microscope image shows two lasers placed just two microns apart. (Image source: Türeci lab)

Inspired by anomalies that arise in certain mathematical equations, researchers have demonstrated a laser system that paradoxically turns off when more power is added rather than becoming continuously brighter.

The finding by a team of researchers at Vienna University of Technology and Princeton University could lead to new ways to manipulate the interaction of electronics and light, an important tool in modern communications networks and high-speed information processing.

The researchers published their results June 13 in the journal Nature Communications.

Their system involves two tiny lasers, each one-tenth of a millimeter in diameter, or about the width of a human hair. The two are nearly touching, separated by a distance 50 times smaller than the lasers themselves. One is pumped with electric current until it starts to emit light, as is normal for lasers. Power is then added slowly to the other, but instead of it also turning on and emitting even more light, the whole system shuts off.

“This is not the normal interference that we know,” said Hakan Türeci, assistant professor of electrical engineering at Princeton, referring to the common phenomenon of light waves or sound waves from two sources cancelling each other.  Instead, he said, the cancellation arises from the careful distribution of energy loss within an overall system that is being amplified.

By manipulating minute areas of gain and loss within individual lasers (shown as peaks and valleys in the image), researchers were able to create paradoxical interactions between two nearby lasers. (Image source: Türeci lab)

“Loss is something you normally are trying to avoid,” Türeci said. “In this case, we take advantage of it and it gives us a different dimension we can use – a new tool – in controlling optical systems.”

The research grows out of Türeci’s longstanding work on mathematical models that describe the behavior of lasers. In 2008, he established a mathematical framework for understanding the unique properties and complex interactions that are possible in extremely small lasers – devices with features measured in micrometers or nanometers. Unlike conventional desktop lasers, these devices fit on a computer chip.

That work opened the door to manipulating gain or loss (the amplification or loss of an energy input) within a laser system. In particular, it allowed researchers to judiciously control the spatial distribution of gain and loss within a single system, with one tiny sub-area amplifying light and an immediately adjacent area absorbing the generated light.

Türeci and his collaborators are now using similar ideas to pursue counterintuitive ideas for using distribution of gain and loss to make micro-lasers more efficient.

The researchers’ ideas for taking advantage of loss derive from their study of mathematical constructs called “non-Hermitian” matrices in which a normally symmetric table of values becomes asymmetric. Türeci said the work is related to certain ideas of quantum physics in which the fundamental symmetries of time and space in nature can break down even though the equations used to describe the system continue to maintain perfect symmetry.

Over the past several years, Türeci and his collaborators at Vienna worked to show how the mathematical anomalies at the heart of this work, called “exceptional points,” could be manifested in an actual system. In 2012 (Ref. 3), the team published a paper in the journal Physical Review Letters demonstrating computer simulations of a laser system that shuts off as energy is being added. In the current Nature Communications paper, the researchers created an experimental realization of their theory using a light source known as a quantum cascade laser.
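
For readers who want to see the mathematics, the eigenvalues of a standard two-mode non-Hermitian coupled-resonator model (a textbook illustration, not the specific model in the paper) show how an exceptional point arises: when the gain-loss imbalance of two coupled resonators reaches the coupling strength, the two eigenfrequencies coalesce and the system’s behavior changes qualitatively. The parameter values below are arbitrary.

```python
import numpy as np

# Textbook two-mode non-Hermitian Hamiltonian (illustrative only, not the paper's model):
#   H = [[w + i*g1, k], [k, w + i*g2]]
# w  : common resonance frequency of the two resonators
# g1 : gain (+) or loss (-) rate of resonator 1;  g2 : same for resonator 2
# k  : coupling strength between the resonators
def eigenfrequencies(w, g1, g2, k):
    H = np.array([[w + 1j * g1, k],
                  [k, w + 1j * g2]])
    return np.linalg.eigvals(H)

w, k = 1.0, 0.10                       # arbitrary units
for g in (0.02, 0.10, 0.30):           # sweep gain in resonator 1, matched loss in resonator 2
    ev = eigenfrequencies(w, g1=g, g2=-g, k=k)
    print(f"gain/loss = {g:.2f}:  eigenfrequencies = {np.round(ev, 4)}")

# With balanced gain and loss the eigenvalues are w +/- sqrt(k**2 - g**2):
#   g < k : two distinct real frequencies (below the exceptional point)
#   g = k : the eigenvalues coalesce, i.e. the exceptional point itself
#   g > k : the pair becomes w +/- i*sqrt(g**2 - k**2), so one mode is amplified
#           while the other is damped; crossing such a point is the kind of
#           behavior that lets added pump power switch the coupled system off.
```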

The researchers report in the article that results could be of particular value in creating “lab-on-a-chip” devices – instruments that pack tiny optical devices onto a single computer chip. Understanding how multiple optical devices interact could provide ways to manipulate their performance electronically in previously unforeseen ways. Taking advantage of the way loss and gain are distributed within tightly coupled laser systems could lead to new types of highly accurate sensors, the researchers said.

“Our approach provides a whole new set of levers to create unforeseen and useful behaviors,” Türeci said.

The work at Vienna, including creation and demonstration of the actual device, was led by Stefan Rotter at Vienna along with Martin Brandstetter, Matthias Liertzer, C. Deutsch, P. Klang, J. Schöberl, G. Strasser and K. Unterrainer. Türeci participated in the development of the mathematical models underlying the phenomena. The work on the 2012 computer simulation of the system also included Li Ge, who was a post-doctoral researcher at Princeton at the time and is now an assistant professor at City University of New York.

The work was funded by the Vienna Science and Technology Fund and the Austrian Science Fund, as well as by the National Science Foundation through a major grant for the Mid-Infrared Technologies for Health and the Environment Center based at Princeton and by the Defense Advanced Research Projects Agency.

Read the abstract.

M. Brandstetter, M. Liertzer, C. Deutsch, P. Klang, J. Schöberl, H. E. Türeci, G. Strasser, K. Unterrainer & S. Rotter. Reversing the pump dependence of a laser at an exceptional point. Nature Communications, June 13, 2014. DOI: 10.1038/ncomms5034.

Science 2 May 2008. DOI: 10.1126/science.1155311

Physical Review Letters 24 April 2012. DOI:10.1103/PhysRevLett.108.173901

 

Migrating north may trigger immediate health declines among Mexicans (Demography)

Photo credit: Ticiana Jardim Marini, Woodrow Wilson School

By B. Rose Huber, Woodrow Wilson School of Public and International Affairs

Mexican immigrants who relocate to the United States often face barriers like poorly paying jobs, crowded housing and family separation. Such obstacles – including the migration process itself – may be detrimental to the health of Mexican immigrants, especially those who have recently moved.

A study led by Princeton University’s Woodrow Wilson School of Public and International Affairs finds that Mexican immigrants who relocate to the United States are more likely to experience declines in health within a short time period compared with other Mexicans.

While past studies have attempted to examine the consequences of immigration for a person’s health, few have had adequate data to compare recent Mexican immigrants, those who moved years ago and individuals who never left Mexico. Published in the journal Demography, the Princeton-led study is one of the first to examine self-reported health at two stages among these groups.

“Our study demonstrates that declines in health appear quickly after migrants’ arrival in the United States,” said Noreen Goldman, lead author and professor of demography and public affairs at the Wilson School and faculty associate at the Wilson School’s Office of Population Research (OPR). “Overall, we find that recent Mexican migrants are more likely to experience rapid changes in health, both good and bad, than the other groups. The deteriorations in health within a year or two of migration far outweigh the improvements.”

For the study, the researchers used data from the Mexican Family Life Survey, a longitudinal survey containing demographic and health information on nearly 20,000 Mexicans who were 20 years or older at the time of the first interview in 2002. Follow-up interviews took place in 2005-06 with individuals who stayed in Mexico as well as with those who moved to the United States between 2002 and 2005. Goldman and her collaborators based their analysis on a sample of 14,257 adults, excluding those who didn’t report health conditions at the follow-up interview.

In order to assess whether migrants from Mexico to the United States experienced changes in their health after they moved, the researchers used two health assessments: self-rated health (compared to someone of the same age and sex) at each of the two interviews and perceived change in health at the second interview. The latter measure was based on the following question: “Comparing your health to a year ago, would you say your health is much better, better, the same, worse or much worse?” Goldman and her collaborators narrowed the original five response categories to three: better, worse or the same. Changes in health for Mexicans who migrated between 2002 and 2005 were compared with those of migrants from earlier time periods and with people who remained in Mexico.

The researchers also took health measures at the first wave into account: obesity, anemia and hypertension – all determined during at-home visits by trained health workers – and hospitalization within the past year. They also controlled for socioeconomic factors – years of schooling and household spending. Additionally, they included data from 136 municipalities in Mexico, as past research has found that migration decisions can differ based on place of origin.

Using statistical models, the researchers analyzed changes in health status. The two health measures revealed that recent migrants to the United States were more apt to experience both improvements and declines in their health than either earlier migrants or non-migrants. However, the overall net change was a substantial deterioration in the health of recent migrants relative to the other groups. The health of recent migrants was about 60 percent more likely to have worsened within a one- or two-year period than that of those who never left Mexico.

“The speed of the health decline for recent migrants suggests that the process of border crossing for both documented and undocumented immigrants combined with the physical and psychological costs of finding work, crowded housing, limited access to health care in the United States and isolation from family members can result in rapid deterioration of immigrants’ physical and mental wellbeing,” said Goldman.

“Immigrants are often assumed to be resilient and in good health because they have not yet adopted unhealthy American behaviors like poor diet and a sedentary lifestyle,” said co-author Anne Pebley from the California Center for Population Research at the University of California, Los Angeles. “But these results suggest that the image of the ‘healthy migrant’ is an illusion – at least for many recent immigrants.”

“These results demonstrate the high personal costs that many immigrants are willing to pay for a chance to improve their lives,” said Goldman. “From a humanitarian standpoint, the health declines underscore the need for public health, social service and immigration agencies to provide basic services for physical and psychological health to recent migrants.”

Given the limitations of the dataset, Goldman and her collaborators could not provide a more nuanced analysis of the causes of the changes in health status, but with the availability of the third wave of data (collected between 2009 and 2012), many of these questions can be addressed in future work.

In addition to Goldman and Pebley, study researchers include Chang Chung from OPR; Mathew Creighton from the University of Massachusetts; Graciela Teruel from the Universidad Iberoamericana; and Luis Rubalcava from the Centro de Análisis y Medición del Bienestar Social.

Support for this project came from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (R01HD051764, R24HD047879, R03HD040906 and R01HD047522) and from the Sector Research Fund for Social Development of the National Council for Science and Technology of Mexico.

Read the abstract.

Goldman, N., A. R. Pebley, M. J. Creighton, G. M. Teruel, L. N. Rubalcava and C. Chung. 2014. The Consequences of Migration to the United States for Short-Term Changes in the Health of Mexican Immigrants. Demography. Article published online May 1, 2014.