Do biofuel policies seek to cut emissions by cutting food? (Science)

By Catherine Zandonella, Office of the Dean for Research

A study published today in the journal Science found that government biofuel policies rely on reductions in food consumption to generate greenhouse gas savings.

Shrinking the amount of food that people and livestock eat decreases the amount of carbon dioxide that they breathe out or excrete as waste. The reduction in food available for consumption, rather than any inherent fuel efficiency, drives the decline in carbon dioxide emissions in government models, the researchers found.

“Without reduced food consumption, each of the models would estimate that biofuels generate more emissions than gasoline,” said Timothy Searchinger, first author on the paper and a research scholar at Princeton University’s Woodrow Wilson School of Public and International Affairs and the Program in Science, Technology, and Environmental Policy.

Searchinger’s co-authors were Robert Edwards and Declan Mulligan of the Joint Research Centre of the European Commission; Ralph Heimlich of the consulting practice Agricultural Conservation Economics; and Richard Plevin of the University of California-Davis.

The study looked at three models used by U.S. and European agencies, and found that all three estimate that some of the crops diverted from food to biofuels are not replaced by planting crops elsewhere. About 20 percent to 50 percent of the net calories diverted to make ethanol are not replaced through the planting of additional crops, the study found.

The result is that less food is available, and, according to the study, these missing calories are not simply extras enjoyed in resource-rich countries. Instead, when less food is available, prices go up. “The impacts on food consumption result not from a tailored tax on excess consumption but from broad global price increases that will disproportionately affect some of the world’s poor,” Searchinger said.

The emissions reductions from switching from gasoline to ethanol have been debated for several years. Automobiles that run on ethanol emit less carbon dioxide, but this is offset by the fact that making ethanol from corn or wheat requires energy that is usually derived from traditional greenhouse gas-emitting sources, such as natural gas.

The models used by both the U.S. Environmental Protection Agency and the California Air Resources Board indicate that ethanol made from corn and wheat generates modestly fewer emissions than gasoline. The fact that these lowered emissions come from reductions in food consumption is buried in the methodology and not explicitly stated, the study found.

The European Commission’s model found an even greater reduction in emissions. It includes reductions in both food quantity and overall food quality, due to the replacement of oils and vegetables by corn and wheat, which are of lesser nutritional value. “Without these reductions in food quantity and quality, the [European] model would estimate that wheat ethanol generates 46% higher emissions than gasoline and corn ethanol 68% higher emissions,” Searchinger said.
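The arithmetic behind that sign flip is easy to lose in prose, so here is a toy carbon ledger in Python. Every number in it is a hypothetical placeholder rather than a value from the EPA, California, or European models; only the shape of the calculation, in which a food-reduction credit is subtracted from ethanol’s gross emissions, reflects the accounting the study describes.

```python
# Toy carbon ledger for one unit of fuel energy. All numbers are
# hypothetical placeholders, NOT values from the EPA, California, or
# European models; only the structure of the calculation matters.

GASOLINE = 100.0  # baseline gasoline emissions, arbitrary units

def ethanol_net_emissions(unreplaced_fraction):
    """Net ethanol emissions after crediting unreplaced food calories.

    unreplaced_fraction: share of diverted food calories never replaced
    by new crop production (roughly 0.2 to 0.5, per the study).
    """
    gross = 115.0  # hypothetical: farming + refining + tailpipe exceeds gasoline
    food_credit = 80.0 * unreplaced_fraction  # CO2 never exhaled or excreted
    return gross - food_credit

for f in (0.0, 0.2, 0.5):
    net = ethanol_net_emissions(f)
    verdict = "beats" if net < GASOLINE else "exceeds"
    print(f"unreplaced fraction {f:.1f}: ethanol {net:.0f} {verdict} gasoline {GASOLINE:.0f}")
```

With the credit set to zero, ethanol scores worse than gasoline, which is the pattern Searchinger describes; crediting 20 to 50 percent of the diverted calories flips the comparison.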

The paper recommends that modelers try to show their results more transparently so that policymakers can decide if they wish to seek greenhouse gas reductions from food reductions. “The key lesson is the trade-offs implicit in the models,” Searchinger said.

The research was supported by The David and Lucile Packard Foundation.

Read the abstract.

T. Searchinger, R. Edwards, D. Mulligan, R. Heimlich, and R. Plevin. Do biofuel policies seek to cut emissions by cutting food? Science 27 March 2015: 1420-1422. DOI: 10.1126/science.1261221.

When attention is a deficit: How the brain switches strategies to find better solutions (Neuron)

By Catherine Zandonella, Office of the Dean for Research

Sometimes being too focused on a task is not a good thing.

During tasks that require our attention, we might become so engrossed in what we are doing that we fail to notice there is a better way to get the job done.

For example, let’s say you are coming out of a New York City subway late one afternoon and you want to find out which way is west. You might begin to scan street signs and then suddenly realize that you could just look for the setting sun.

A new study explored the question of how the brain switches from an ongoing strategy to a new and perhaps more efficient one. The study, conducted by researchers at Princeton University, Humboldt University of Berlin, the Bernstein Center for Computational Neuroscience in Berlin, and the University of Milan-Bicocca, found that activity in a region of the brain known as the medial prefrontal cortex was involved in monitoring what is happening outside one’s current focus of attention and shifting focus from a successful strategy to one that is even better. They published the finding in the journal Neuron.

“The human brain at any moment in time has to process quite a wealth of information,” said Nicolas Schuck, a postdoctoral research associate in the Princeton Neuroscience Institute and first author on the study. “The brain has evolved mechanisms that filter that information in a way that is useful for the task that you are doing. But the filter has a disadvantage: you might miss out on important information that is outside your current focus.”

Schuck and his colleagues wanted to study what happens at the moment when people realize there is a different and potentially better way of doing things. They asked volunteers to play a game while their brains were scanned with magnetic resonance imaging (MRI). The volunteers were instructed to press one of two buttons depending on the location of colored squares on a screen. However, the game contained a hidden pattern that the researchers did not tell the participants about: green squares always appeared in one part of the screen and red squares in another, so a player who noticed the pattern could improve performance by attending to color rather than location.
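To make the design concrete, here is a minimal sketch of the hidden contingency in code. The screen layout, trial structure, and button mapping are illustrative assumptions, not the parameters of the actual experiment.

```python
import random

# Sketch of the task: the instructed rule uses location, but a hidden
# regularity makes color equally informative. Layout and mapping are
# illustrative assumptions, not the actual experimental parameters.

TOP = ["upper-left", "upper-right"]
BOTTOM = ["lower-left", "lower-right"]

def make_trial():
    color = random.choice(["green", "red"])
    # Hidden regularity: color perfectly predicts the half of the screen.
    location = random.choice(TOP if color == "green" else BOTTOM)
    # Instructed rule: respond to location. A player who notices the
    # regularity can respond to color alone and get every trial right.
    correct_button = 1 if location in TOP else 2
    return color, location, correct_button

for _ in range(5):
    print(make_trial())
```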

Volunteers played a game where they had to press one button or another depending on the location of squares on a screen. Participants who switched to a strategy based on the color of the squares were able to improve their performance on the game. (Image source: Schuck, et al.)

Not all of the players figured out that there was a more efficient way to play the game. Among those who did, brain images revealed specific signals in the medial prefrontal cortex that corresponded to the color of the squares. These signals arose minutes before the participants switched their strategies, and were so reliable that the researchers could use them to predict spontaneous strategy shifts ahead of time, Schuck said.
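The prediction claim rests on a standard idea in neuroimaging: if a brain region carries information about a variable, a pattern classifier trained on that region’s activity should decode the variable above chance. The sketch below illustrates the idea on synthetic data; it is a generic decoding recipe, not the study’s own analysis pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for trial-by-trial activity in a region of interest:
# a fixed color-specific pattern buried in noise. If decoding accuracy
# beats 0.5, the "region" carries color information.

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50
colors = rng.integers(0, 2, n_trials)        # 0 = red, 1 = green
pattern = rng.normal(size=n_voxels)          # fixed color-specific pattern
voxels = np.outer(colors - 0.5, pattern)     # weak signal...
voxels += rng.normal(scale=2.0, size=(n_trials, n_voxels))  # ...plus noise

accuracy = cross_val_score(LogisticRegression(max_iter=1000),
                           voxels, colors, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")
```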

“These findings are important to better understand the role of the medial prefrontal cortex in the cascade of processes leading to the final behavioral change, and more generally, to understand the role of the medial prefrontal cortex in human cognition,” said Carlo Reverberi, a researcher at the University of Milan-Bicocca and senior author on the study. “Our findings suggest that the medial prefrontal cortex is ‘simulating’ in the background an alternative strategy, while the overt behavior is still shaped by the old strategy.”

The study design – specifically, not telling the participants that there was a more effective strategy – enabled the researchers to show that the brain can monitor background information while focused on a task, and choose to act on that background information.

“What was quite special about the study was that the behavior was completely without instruction,” Schuck said. “When the behavior changed, this reflected a spontaneous internal process.”

Before this study, he said, most researchers had focused on strategy switches that happen because you made a mistake or realized that your current approach isn’t working. “But what we were able to explore,” he said, “is what happens when people switch to a new way of doing things based on information from their surroundings.” In this way, the study sheds light on how learning and attention can interact, he said.

The study has relevance for the question of how the brain balances the need to maintain attention with the need to incorporate new information about the environment, and may eventually help our understanding of disorders that involve attention deficits.

Schuck designed and conducted the experiments together with the other authors while a graduate student at Humboldt University and the International Max Planck Research School on the Life Course (LIFE), and carried out the analysis at Princeton University in the laboratory of Yael Niv, an assistant professor of psychology and the Princeton Neuroscience Institute, in close collaboration with Reverberi.

The research was supported through a grant from the U.S. National Institutes of Health, the International Max Planck Research School LIFE, the Italian Ministry of University, the German Federal Ministry of Education and Research, and the German Research Foundation.

Read the abstract.

Nicolas W. Schuck, Robert Gaschler, Dorit Wenke, Jakob Heinzle, Peter A. Frensch, John-Dylan Haynes, and Carlo Reverberi. Medial Prefrontal Cortex Predicts Internally Driven Strategy Shifts, Neuron (2015) http://dx.doi.org/10.1016/j.neuron.2015.03.015.


Cytomegalovirus hijacks human enzyme for replication (Cell Reports)

By Tien Nguyen, Department of Chemistry

More than 60 percent of the world’s population is infected with a type of herpes virus called human cytomegalovirus. The virus replicates by commandeering the host cell’s metabolism, but the details of this maneuver have been unclear.

Researchers at Princeton University have discovered that cytomegalovirus manipulates a process called fatty acid elongation, which makes the very-long-chain fatty acids necessary for virus replication. In a study published in the journal Cell Reports on March 3, the team identified a specific human enzyme—elongase enzyme 7—that the virus induces to turn on fatty acid elongation.

“Elongase 7 was just screaming, ‘I’m important, study me,’” said John Purdy, a postdoctoral researcher in the laboratory of Princeton’s Joshua Rabinowitz and lead author on the study.

He found that once a cell was infected by cytomegalovirus, the level of elongase 7 RNA increased over 150-fold. Purdy then performed a genetic knockdown experiment to silence elongase 7 and established that in its absence the virus was unable to efficiently replicate.

“Elongases are a family of seven related proteins. The particular importance of elongase 7 for cytomegalovirus replication was a pleasant surprise, and enhances its appeal as a drug target,” said Joshua Rabinowitz, a professor of chemistry and the Lewis-Sigler Institute for Integrative Genomics at Princeton and co-author on the paper.

Activation of the elongase enzyme led to an increase in very-long-chain fatty acids, which are used by the virus to build its viral envelope and replicate. To track this, the researchers fed infected cells glucose labeled with the heavy isotope carbon-13, which is metabolized by the cell to form the substrates for fatty acid elongation. The carbon-13 atoms were incorporated into new products, which were detected and identified by their mass using mass spectrometry. This powerful technique provided insight into the amount of fatty acids produced and how they are constructed.
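The readout logic is compact enough to sketch. Each carbon-13 atom adds about 1.00336 daltons relative to carbon-12, so the number of labeled carbons in a detected product can be read off from its mass shift; the code below is a simplified stand-in for the paper’s mass spectrometry workflow, and the example mass is an approximate monoisotopic value for a 24-carbon saturated fatty acid.

```python
# Counting carbon-13 atoms from a mass shift. A simplified stand-in for
# the actual mass spectrometry analysis; the example mass is approximate.

MASS_SHIFT_13C = 1.00336  # Da gained per 13C-for-12C substitution

def labeled_carbons(observed_mass, unlabeled_mass, tolerance=0.01):
    """Infer how many carbon-13 atoms a detected product carries."""
    shift = observed_mass - unlabeled_mass
    n = round(shift / MASS_SHIFT_13C)
    if abs(shift - n * MASS_SHIFT_13C) > tolerance:
        raise ValueError("shift is not a clean multiple of the 13C offset")
    return n

# Elongation adds two-carbon units; if both carbons come from labeled
# glucose, each cycle shifts the product mass by about 2.007 Da.
c24_fatty_acid = 368.365  # approx. monoisotopic mass of a C24:0 fatty acid
print(labeled_carbons(c24_fatty_acid + 2 * MASS_SHIFT_13C, c24_fatty_acid))  # -> 2
```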

Cytomegalovirus infection mostly threatens people with compromised immune systems, as well as developing fetuses, and is a leading cause of hearing loss in children. Current treatments target the DNA replication step of the virus and are not very effective. These findings have advanced the understanding of the virus’s operations and identified fatty acid elongation as a key process that warrants further study.

This work was funded by National Institutes of Health grants AI78063, CA82396, and GM71508 and an American Heart Association postdoctoral fellowship to J.G.P. (12POST9190001).

Read the full article here:

Purdy, J. G.; Shenk, T.; Rabinowitz, J. D. “Fatty Acid Elongase 7 Catalyzes the Lipidome Remodeling Essential for Human Cytomegalovirus Replication.” Cell Reports, 2015, 10, 1375.


Letting go of the (genetic) apron strings (Cell)

Researchers explore the shift from maternal genes to the embryo’s genes during development

By Catherine Zandonella, Office of the Dean for Research

Fruit fly embryo
Cells in an early-stage fruit fly embryo. (Image courtesy of NIGMS image gallery).

A new study from Princeton University researchers sheds light on the handing over of genetic control from mother to offspring early in development. Learning how organisms manage this transition could help researchers understand larger questions about how embryos regulate cell division and differentiation into new types of cells.

The study, published in the March 12 issue of the journal Cell, provides new insight into the mechanism for this genetic hand-off, which happens within hours of fertilization, when the newly fertilized egg is called a zygote.

“At the beginning, everything the embryo needs to survive is provided by mom, but eventually that stuff runs out, and the embryo needs to start making its own proteins and cellular machinery,” said Shelby Blythe, first author on the study and a postdoctoral researcher in the Department of Molecular Biology at Princeton. “We wanted to find out what controls that transition.”

Blythe conducted the study with senior author Eric Wieschaus, Princeton’s Squibb Professor in Molecular Biology, Professor of Molecular Biology and the Lewis-Sigler Institute for Integrative Genomics, a Howard Hughes Medical Institute investigator, and a Nobel laureate in physiology or medicine.

Researchers have known that in most animals, a newly fertilized egg cell divides rapidly, producing exact copies of itself using gene products supplied by the mother. After a short while, this rapid cell division pauses, and when it restarts, the embryonic DNA takes control and the cells divide much more slowly, differentiating into new cell types that are needed for the body’s organs and systems.

To find out what controls this maternal-to-zygotic transition, also called the midblastula transition (MBT), Blythe conducted experiments in the fruit fly Drosophila melanogaster, which has long served as a model for development in higher organisms, including humans.

These experiments revealed that the slower cell division is a consequence of an upswing in DNA errors after the embryo’s genes take over. Cell division slows down because the cell’s DNA-copying machinery has to stop and wait until the damage is repaired.

Blythe found that it wasn’t the overall amount of embryonic DNA that caused this increase in errors. Instead, his experiments indicated that the high error rate was due to molecules that bind to DNA to activate the reading, or “transcription,” of the genes. These molecules stick to the DNA strands at thousands of sites and prevent the DNA copying machinery from working properly.

To discover this link between DNA errors and slower cell replication, Blythe used genetic techniques to create Drosophila embryos that were unable to repair DNA damage and typically died shortly after beginning to use their own genes. He then blocked the molecules that initiate the process of transcription of the zygotic genes, and found that the embryos survived, indicating that these molecules that bind to the DNA strands, called transcription factors, were triggering the DNA damage. He also discovered that a protein involved in responding to DNA damage, called Replication Protein A (RPA), appeared near the locations where DNA transcription was being initiated. “This provided evidence that the process of awakening the embryo’s genome is deleterious for DNA replication,” he said.

The study also demonstrates a mechanism by which the developing embryo ensures that cell division happens at a pace that is slow enough to allow the repair of damage to DNA during the switchover from maternal to zygotic gene expression. “For the first time we have a mechanistic foothold on how this process works,” Blythe said.

The work also enables researchers to explore larger questions of how embryos regulate DNA replication and transcription. “This study allows us to think about the idea that the ‘character’ of the DNA before and after the MBT has something to do with the DNA acquiring the architectural features of chromatin [the mix of DNA and proteins that make up chromosomes] that allow us to point to a spot and say ‘this is a gene’ and ‘this is not a gene’,” Blythe said. “Many of these features are indeed absent early in embryogenesis, and we suspect that the absence of these features is what allows the rapid copying of the DNA template early on. Part of what is so exciting about this is that early embryos may represent one of the only times when this chromatin architecture is missing or ‘blank’. Additionally, these early embryos allow us to study how the cell builds and installs these features that are so essential to the fundamental processes of cell biology.”

This work was supported in part by grant 5R37HD15587 from the Eunice Kennedy Shriver National Institute of Child Health and Human Development.

Read the abstract.

Blythe, Shelby A. & Eric R. Wieschaus. Zygotic Genome Activation Triggers the DNA Replication Checkpoint at the Midblastula Transition. Cell. Published online on March 5, 2015. doi:10.1016/j.cell.2015.01.050. http://www.sciencedirect.com/science/article/pii/S0092867415001282


Scientists make breakthrough in understanding how to control intense heat bursts in fusion experiments (Physical Review Letters)

Computer simulation
Computer simulation of a cross-section of a DIII-D plasma responding to tiny magnetic fields. The left image models the response that suppressed the ELMs while the right image shows a response that was ineffective. Simulation by General Atomics.

By Raphael Rosen, Princeton Plasma Physics Laboratory

Researchers from General Atomics and the U.S. Department of Energy (DOE)’s Princeton Plasma Physics Laboratory (PPPL) have made a major breakthrough in understanding how potentially damaging heat bursts inside a fusion reactor can be controlled. Scientists performed the experiments on the DIII-D National Fusion Facility, a tokamak operated by General Atomics in San Diego. The findings represent a key step in predicting how to control heat bursts in future fusion facilities including ITER, an international experiment under construction in France to demonstrate the feasibility of fusion energy. This work is supported by the DOE Office of Science (Fusion Energy Sciences).

The studies build upon previous work pioneered on DIII-D showing that these intense heat bursts – called “ELMs” for short – could be suppressed with tiny magnetic fields. These tiny fields cause the edge of the plasma to smoothly release heat, thereby avoiding the damaging heat bursts. But until now, scientists did not understand how these fields worked. “Many mysteries surrounded how the plasma distorts to suppress these heat bursts,” said Carlos Paz-Soldan, a General Atomics scientist and lead author of the first of the two papers that report the seminal findings back-to-back in the March 12 issue of Physical Review Letters.

Paz-Soldan and a multi-institutional team of researchers found that tiny magnetic fields applied to the device can create two distinct kinds of response, rather than just one response as previously thought. The new response produces a ripple in the magnetic field near the plasma edge, allowing more heat to leak out at just the right rate to avert the intense heat bursts. Researchers applied the magnetic fields by running electrical current through coils around the plasma. Pickup coils then detected the plasma response, much as the pickup on an electric guitar detects string vibrations.

The second result, led by PPPL scientist Raffi Nazikian, who heads the PPPL research team at DIII-D, identified the changes in the plasma that lead to the suppression of the large edge heat bursts, or ELMs. The team found clear evidence that the plasma was deforming in just the way needed to allow the heat to slowly leak out. The measured magnetic distortions of the plasma edge indicated that the magnetic field was gently tearing in a narrow layer, a key prediction for how heat bursts can be prevented. “The configuration changes suddenly when the plasma is tapped in a certain way,” Nazikian said, “and it is this response that suppresses the ELMs.”

Paz-Soldan and Nazikian
Carlos Paz-Soldan, left, and Raffi Nazikian at the DIII-D tokamak. (Photo by Lisa Petrillo/General Atomics)

The work involved a multi-institutional team of researchers who for years have been working toward an understanding of this process. The researchers came from General Atomics, PPPL, Oak Ridge National Laboratory, Columbia University, the Australian National University, the University of California-San Diego, the University of Wisconsin-Madison, and several other institutions.

The new results suggest further possibilities for tuning the magnetic fields to make ELM control easier. These findings point the way to overcoming a persistent barrier to sustained fusion reactions. “The identification of the physical processes that lead to ELM suppression when applying a small 3D magnetic field to the inherently 2D tokamak field provides new confidence that such a technique can be optimized in eliminating ELMs in ITER and future fusion devices,” said Mickey Wade, the DIII-D program director.

The results further highlight the value of the long-term multi-institutional collaboration between General Atomics, PPPL and other institutions in DIII-D research. This collaboration, said Wade, “was instrumental in developing the best experiment possible, realizing the significance of the results, and carrying out the analysis that led to publication of these important findings.”

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

General Atomics has participated in fusion research for over 50 years and presently operates the DIII-D National Fusion Facility for the U.S. Department of Energy Office of Science with a mission “to provide the physics basis for the optimization of the tokamak approach to fusion energy production.” The General Atomics group of companies is a world-renowned leader in developing high-technology systems ranging from the nuclear fuel cycle to electromagnetic systems; remotely operated surveillance aircraft; airborne sensors; advanced electronic, wireless, and laser technologies; and biofuels.

Read the articles:

C. Paz-Soldan, R. Nazikian, S. R. Haskey, N. C. Logan, E. J. Strait, N. M. Ferraro, J. M. Hanson, J. D. King, M. J. Lanctot, R. A. Moyer, M. Okabayashi, J-K. Park, M. W. Shafer, and B. J. Tobias. Observation of a Multimode Plasma Response and its Relationship to Density Pumpout and Edge-Localized Mode Suppression. Phys. Rev. Lett. 114, 105001 – Published 12 March 2015.

R. Nazikian, C. Paz-Soldan, J. D. Callen, J. S. deGrassie, D. Eldon, T. E. Evans, N. M. Ferraro, B. A. Grierson, R. J. Groebner, S. R. Haskey, C. C. Hegna, J. D. King, N. C. Logan, G. R. McKee, R. A. Moyer, M. Okabayashi, D. M. Orlov, T. H. Osborne, J-K. Park, T. L. Rhodes, M. W. Shafer, P. B. Snyder, W. M. Solomon, E. J. Strait, and M. R. Wade. Pedestal Bifurcation and Resonant Field Penetration at the Threshold of Edge-Localized Mode Suppression in the DIII-D Tokamak. Phys. Rev. Lett. 114, 105002 – Published 12 March 2015.


Second natural quasicrystal found in ancient meteorite (Scientific Reports)

By Catherine Zandonella, Office of the Dean for Research

A team from Princeton University and the University of Florence in Italy has discovered a quasicrystal — so named because of its unorthodox arrangement of atoms — in a 4.5-billion-year-old meteorite from a remote region of northeastern Russia, bringing to two the number of natural quasicrystals ever discovered. Before the team found the first natural quasicrystal in 2009, researchers thought that the structures were too fragile and energetically unstable to be formed by natural processes.

“The finding of a second naturally occurring quasicrystal confirms that these materials can form in nature and are stable over cosmic time scales,” said Paul Steinhardt, Princeton’s Albert Einstein Professor of Science and a professor of physics, who led the study with Luca Bindi of the University of Florence. The team published the finding in the March 13 issue of the journal Scientific Reports.

The discovery raises the possibility that other types of quasicrystals can be formed in nature, according to Steinhardt. Quasicrystals are very hard, have low friction, and don’t conduct heat very well — making them good candidates for applications such as protective coatings on items ranging from airplanes to non-stick cookware.

The newly discovered quasicrystal, which is yet to be named, has a structure that resembles flat 10-sided disks stacked in a column. This type of structure is impossible in ordinary crystals, in which atoms are packed closely together in a repeated and orderly fashion. The difference between crystals and quasicrystals can be visualized by imagining a tiled floor: Tiles that are 6-sided hexagons can fit neatly against each other to cover the entire floor. But 5-sided pentagons or 10-sided decagons laid next to each other will leave gaps between tiles. “The structure is saying ‘I am not a crystal, but on the other hand, I am not random either,’” Steinhardt said.
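The tiled-floor picture can be made precise with a little geometry. The interior angle of a regular n-sided tile is θ_n = (n − 2) × 180° / n, and identical tiles can meet around a point without gaps only if θ_n divides 360° evenly. Hexagons give θ_6 = 120°, and 360°/120° = 3, so exactly three hexagons close up around each corner. Pentagons give θ_5 = 108° (360°/108° ≈ 3.33) and decagons give θ_10 = 144° (360°/144° = 2.5), so neither can close up around a corner, and gaps are unavoidable.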

Crystals with these forbidden symmetries had been created in the laboratory, but it wasn’t until 2009 that Bindi, Steinhardt, Nan Yao of Princeton and Peter Lu of Harvard reported the first natural quasicrystal, now known as icosahedrite, in a rock that had been collected years before in Chukotka, Russia. To confirm that this quasicrystal, which has the five-fold symmetry of a soccer ball, was indeed of natural origin, Steinhardt and a team of scientists including geologists from the Russian Academy of Sciences traveled to the region in 2011 and returned with additional samples, which they analyzed at the University of Florence; the Smithsonian Museum in Washington, DC; the California Institute of Technology; and the Princeton Institute for the Science and Technology of Materials (PRISM) Imaging and Analysis Center.

Quasicrystal
The top panel shows an X-ray tomography image (similar to a “CAT” scan) at two different rotations of the whole mineral grain. The brighter and the darker regions are copper-aluminum metals and meteoritic silicates, respectively. The bottom panel shows a scanning electron micrograph image of the quasicrystal (QC) in apparent contact with another mineral, olivine (Ol). Source: Paul Steinhardt.

The researchers confirmed that the quasicrystal originated in an extraterrestrial body that formed about 4.57 billion years ago, which is around the time our solar system formed. They published the results in the Proceedings of the National Academy of Sciences in 2012. “Bringing back the material and showing that it was of natural origins was an important scientific barrier to overcome,” Steinhardt said.

This new quasicrystal, which was found in a different grain of the same meteorite, has 10-fold, or decagonal, symmetry. It is made up of aluminum, nickel and iron, which normally are not found together in the same mineral because aluminum binds quickly to oxygen, blocking attachment to nickel and iron.

Pattern of ten-fold symmetry
The new mineral is the grain shown in panel (a). The ten-fold symmetry is evident when the mineral is hit with X-rays (b). Aiming the beam from a different direction results in patterns as in (c) or (d), in which the spots form along horizontal lines that are equally spaced. Source: Paul Steinhardt.

The researchers are now exploring how the mineral formed. “We know there was a meteor impact, and that the temperature was around 1000 to 1200 degrees Kelvin, and that the pressure was a hundred thousand times greater than atmospheric pressure, but that is not enough to tell us all the details,” Steinhardt said. “We’d like to know whether the formation of quasicrystals is rare or is fairly frequent, how it occurs, and whether it could happen in other solar systems. What we find out could answer basic questions about the materials found in our universe.”

The team included, from Princeton: Nan Yao, a senior research scholar at PRISM and director of the PRISM Imaging and Analysis Center; Chaney Lin, a graduate student in physics; and Lincoln Hollister, professor of geosciences, emeritus, and a senior geologist. Co-authors also included Christopher Andronicos of Purdue University; Vadim Distler, Valery Kryachko and Marina Yudovskaya of the Russian Academy of Sciences; Alexander Kostin of BHP Billiton; Michael Eddy of the Massachusetts Institute of Technology; Glenn MacPherson of the Smithsonian Institution; and William Steinhardt, a graduate student at Harvard University.

This work was supported in part by the National Science Foundation-MRSEC program (DMR-0820341), the Princeton Center for Complex Materials (DMR-0819860), and NASA (NNX11AD43G).

Ten-fold symmetry
The ordered yet non-standard pattern of the quasicrystal is revealed by an electron beam, which enables a view of a pattern of spots with ten-fold symmetry. Source: Paul Steinhardt.

Read the paper: Bindi et al., 2015. Natural quasicrystal with decagonal symmetry. Scientific Reports, 5, 9111. doi:10.1038/srep09111

Additional reading:

Bindi et al., 2009. Natural quasicrystals. Science 324, 1306-1309. http://www.sciencemag.org/content/324/5932/1306

Bindi et al., 2012. Evidence for the extraterrestrial origin of a natural quasicrystal. Proceedings of the National Academy of Sciences 109, 1396-1401. http://www.pnas.org/content/109/5/1396.full


Beautiful but strange: The dark side of cosmology (Science)

By Catherine Zandonella, Office of the Dean for Research

It’s a beautiful theory: the standard model of cosmology describes the universe using just six parameters. But it is also strange. The model predicts that dark matter and dark energy – two mysterious entities that have never been detected — make up 95% of the universe, leaving only 5% composed of the ordinary matter so essential to our existence.
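For reference, in the parameterization most commonly used in analyses of the cosmic microwave background, those six numbers are the densities of ordinary matter and of cold dark matter, the angular scale of the sound horizon (equivalently, the Hubble constant), the optical depth to reionization, and the amplitude and tilt of the primordial density fluctuations.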

In an article in this week’s Science, Princeton astrophysicist David Spergel reviews how cosmologists came to be certain that we are surrounded by matter and energy that we cannot see. Observations of galaxies, supernovae, and the universe’s temperature, among other things, have led researchers to conclude that the universe is mostly uniform and flat, and that its expansion is speeding up due to a puzzling phenomenon called dark energy. The rate of expansion is increasing over time, counteracting the attractive force of gravity. This last observation, says Spergel, is as strange as throwing a ball upward and watching it accelerate away from you instead of falling back.

The components of our universe
The components of our universe. Dark energy comprises 69% of the mass energy density of the universe, dark matter comprises 25%, and “ordinary” atomic matter makes up 5%. Three types of neutrinos make up at least 0.1%, the cosmic background radiation makes up 0.01%, and black holes comprise at least 0.005%. (Source: Science/AAAS)

A number of experiments to detect dark matter and dark energy are underway, and some researchers have already claimed to have found particles of dark matter, although the results are controversial. New findings expected in the coming years from the Large Hadron Collider, the world’s most powerful particle accelerator, could provide evidence for a proposed theory, supersymmetry, that could explain the dark particles.

But explaining dark energy, and why the universe is accelerating, is a tougher problem. Over the next decade, powerful telescopes will come online to map the structure of the universe and trace the distribution of matter over the past 10 billion years, providing new insights into the source of cosmic acceleration.

Yet observations alone are probably not enough, according to Spergel. A full understanding will require new ideas in physics, perhaps even a new theory of gravity, possibly including extra dimensions, Spergel writes. “We will likely need a new idea as profound as general relativity to explain these mysteries.”

When that happens, our understanding of the dark side of cosmology will no longer accelerate away from us.

Read the article

Citation: Spergel, David. The dark side of cosmology: Dark matter and dark energy. Science, 6 March 2015: Vol. 347 no. 6226 pp. 1100-1102 DOI: 10.1126/science.aaa0980.

–David Spergel is the Charles A. Young Professor of Astronomy on the Class of 1897 Foundation, a professor of astrophysical sciences, and chair of Princeton’s Department of Astrophysical Sciences. His research is supported by the National Science Foundation and NASA.

Pennies reveal new insights on the nature of randomness (PNAS)

By Tien Nguyen, Department of Chemistry

The concept of randomness appears across scientific disciplines, from materials science to molecular biology. Now, theoretical chemists at Princeton have challenged traditional interpretations of randomness by computationally generating random and mechanically rigid arrangements of two-dimensional hard disks, such as pennies, for the first time.

“It’s amazing that something so simple as the packing of pennies can reveal to us deep ideas about the meaning of randomness or disorder,” said Salvatore Torquato, professor of chemistry at Princeton and principal investigator of the report published on December 30 in the journal Proceedings of the National Academy of Sciences.

In two dimensions, conventional wisdom held that the most random arrangements of pennies were those most likely to form upon repeated packing, or in other words, those most “entropically” favored. But when a group of pennies is rapidly compressed, the most probable states are actually highly ordered ones with small imperfections—known as polycrystalline states.

“We’re saying that school of thought is wrong because you can find much lower density states that have a high degree of disorder, even if they are not seen in typical experiments,” Torquato said.

Torquato and coworkers proposed that randomness should be judged from the disorder of a single state as opposed to many states. “It’s a new way of searching for randomness,” said Morrel Cohen, a senior scholar at Princeton and the editor assigned to the article.

Using a computer algorithm, the researchers produced so-called maximally random, jammed (rigid) states as defined by a set of “order metrics.” These measurements reflect features of a single configuration, such as the fluctuations of density within a system and the extent to which one penny’s position can be used to predict another’s.
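To give a sense of what such a metric can look like in practice, here is a short sketch of one standard measure for two-dimensional disk packings, the bond-orientational order parameter psi-6, which registers how hexagonally ordered each disk’s neighborhood is. This is an illustrative choice, not necessarily one of the order metrics used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

# Global bond-orientational order parameter psi_6 for a 2D disk packing:
# 1 for a perfect hexagonal (triangular-lattice) packing, falling toward
# 0 as the packing disorders. An illustrative metric, not necessarily
# one of those used in the paper.

def global_psi6(points, cutoff):
    """|average over disks of the per-disk complex psi_6|."""
    tree = cKDTree(points)
    per_disk = []
    for i, p in enumerate(points):
        neighbors = [j for j in tree.query_ball_point(p, cutoff) if j != i]
        if not neighbors:
            continue
        angles = np.array([np.arctan2(points[j][1] - p[1], points[j][0] - p[0])
                           for j in neighbors])
        per_disk.append(np.exp(6j * angles).mean())
    return abs(np.mean(per_disk))

# Triangular lattice of touching unit-diameter disks: psi_6 = 1.
lattice = np.array([(x + 0.5 * (y % 2), y * np.sqrt(3) / 2)
                    for x in range(10) for y in range(10)])
print("ordered:   ", round(global_psi6(lattice, 1.1), 3))

# Uniformly random points (not jammed, shown only for contrast): near 0.
rng = np.random.default_rng(1)
print("disordered:", round(global_psi6(rng.random((200, 2)) * 10, 1.1), 3))
```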

The algorithm generated random states that have never been seen before in systems with up to approximately 200 disks. Theoretically, these maximally random states should exist for even larger systems, but are beyond the computational limits of the program.

These findings hold promise especially for the physics and chemistry of surfaces. Randomly dispersed patterns can be relayed to a 3D printer to create materials with unique properties. This may be desirable in photonics—analogous to electronics, but with photons instead of electrons—where the orientation of particles affects light’s ability to travel through a material.

This work also provides a tool for measuring degrees of order that may be applied broadly to other fields. For example, the degree of disorder in the spatial distribution of cancer cells versus healthy cells could be measured and compared for possible biological links. The next challenge in this line of research will be for experimentalists to replicate these findings in the laboratory.

Read the article.

Atkinson, S.; Stillinger, F. H.; Torquato, S. “Existence of isostatic, maximally random jammed monodisperse hard-disk packings,” Proc. Natl. Acad. Sci., 2014, 111, 18436.

This work was supported in part by the National Science Foundation under Grants DMR-0820341 and DMS-1211087. This work was partially supported by Simons Foundation Grant in Theoretical Physics 231015.