Flexibility is key in mechanism of biological self-assembly

By Catherine Zandonella, Office of the Dean for Research

A new study has modeled a crucial first step in the self-assembly of cellular structures such as drug receptors and other protein complexes, and found that the flexibility of the structures has a dramatic impact on how fast they join together.

The study, published this week in the journal Proceedings of the National Academy of Sciences, explored what happens when two water-repelling surfaces connect to build more complex structures. Using molecular simulations, researchers at Princeton University illustrated the mechanism by which the process occurs and explored factors that favor self-assembly.

A surprising finding was how sensitively the surfaces’ flexibility determined the rate at which they eventually came together, with more flexible surfaces joining faster. “Flexibility is like a knob that nature can tune to control the self-assembly of molecules,” said Pablo Debenedetti, senior author on the study and Princeton’s Dean for Research. Debenedetti is the Class of 1950 Professor in Engineering and Applied Science and a professor of chemical and biological engineering.

Researchers have long been interested in how biological structures can self-assemble according to physical laws. Tapping the secrets of self-assembly could, for example, lead to new methods of building nanomaterials for future electronic devices. Self-assembled protein complexes are the basis not only of drug receptors but also many other cellular structures, including ion channels that facilitate the transmission of signals in the brain.

The study illustrated the process by which two water-repelling, or hydrophobic, structures come together. At the start of the simulation, the two surfaces were separated by a watery environment. Researchers knew from previous studies that these surfaces, due to their hydrophobic nature, will push water molecules away until only a very few water molecules remain in the gap. The evaporation of these last few molecules allows the two surfaces to snap together.

The new molecular simulation conducted at Princeton yielded a more detailed look at the mechanism behind this process. In the simulation, when the surfaces were sufficiently close to each other, their hydrophobic nature triggered fluctuations in the number of water molecules in the gap, causing the liquid water to evaporate and form bubbles on the surfaces. The bubbles grew as more water molecules evaporated. Eventually, bubbles on the two opposing surfaces connected to form a gap-spanning tube, which expanded and pushed away any remaining water until the two surfaces collided.

Biological surfaces, such as cellular membranes, are flexible, so the researchers explored how the surfaces’ flexibility affected the process. The researchers tuned the flexibility of the surfaces by varying the strength of the coupling between the surface atoms. The stronger the coupling, the less each atom can wiggle relative to its neighbors.
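To get a feel for how coupling strength translates into atomic “wiggle,” a generic back-of-the-envelope estimate helps (this is an illustration of the underlying statistical mechanics, not the simulation model used in the study): for an atom tethered to its neighbors by an effective harmonic spring of stiffness k, equipartition gives a mean-square thermal displacement of kB·T/k, so stiffer coupling means smaller fluctuations.

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Illustrative (invented) effective spring constants, in N/m,
# spanning a "floppy" surface to a stiff one.
for label, k in [("flexible", 0.5), ("intermediate", 5.0), ("stiff", 50.0)]:
    rms = math.sqrt(kB * T / k)   # equipartition: <x^2> = kB*T / k
    print(f"{label:12s} k = {k:5.1f} N/m  ->  RMS displacement ~ {rms * 1e12:.0f} pm")
```

In this picture, lowering the stiffness by a factor of 100 increases each atom’s thermal wiggle by a factor of 10, which is the kind of knob the researchers turned when they varied the coupling between surface atoms.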

The researchers found that the speed at which the two surfaces snap together depended greatly on flexibility. Small changes in flexibility led to large changes in the rate at which the surfaces stuck together. For example, two very flexible surfaces adhered in just nanoseconds, whereas two inflexible surfaces fused incredibly slowly, on the order of seconds.

Another finding was that the last step in the process, where the vapor tube expands, was critical for ensuring that the surfaces came together. In simulations where the tube failed to expand, the surfaces never joined. Flexibility was key to ensuring that the tube expanded, the researchers found. Making the material more flexible lowered the barriers to evaporation and stabilized the vapor tube, increasing the chances that the tube would expand.

The molecular simulation provides a foundation for understanding how biological structures assemble and function, according to Elia Altabet, a graduate student in Debenedetti’s group, and first author on the study. “A deeper understanding of the formation and function of protein assemblies such as drug receptors and ion channels could inform the design of new drugs to treat diseases,” he said.

Funding for this study was provided by National Science Foundation grants CHE-1213343 and CBET-1263565. Computations were performed at the Terascale Infrastructure for Groundbreaking Research in Engineering and Science (TIGRESS) at Princeton University.

The study, “Effect of material flexibility on the thermodynamics and kinetics of hydrophobically induced evaporation of water,” by Y. Elia Altabet, Amir Haji-Akbari and Pablo Debenedetti, was published online in the journal Proceedings of the National Academy of Sciences the week of March 13, 2017. doi: 10.1073/pnas.1620335114

Deep-sea corals reveal why atmospheric carbon was lower during the ice ages

Deep sea corals reveal that efficient nutrient consumption by plankton drove carbon sequestration in the deep ocean during the ice ages. Photo courtesy of Caltech.

By Robert Perkins, Caltech

We know a lot about how carbon dioxide (CO2) levels can drive climate change, but how about the way that climate change can cause fluctuations in CO2 levels? New research from an international team of scientists reveals one of the mechanisms by which a colder climate was accompanied by depleted atmospheric CO2 during past ice ages.

The overall goal of the work is to better understand how and why the earth goes through periodic climate change, which could shed light on how man-made factors could affect the global climate.

Now, an international team of scientists has shown that periods of colder climates are associated with higher phytoplankton efficiency and a reduction in nutrients in the surface of the Southern Ocean (the ocean surrounding the Antarctic), which is related to an increase in carbon sequestration in the deep ocean. A paper about their research appears this week in the online edition of the Proceedings of the National Academy of Sciences.

“It is critical to understand why atmospheric CO2 concentration was lower during the ice ages. This will help us understand how the ocean will respond to ongoing anthropogenic CO2 emissions,” says Xingchen (Tony) Wang, lead author of the study. Wang was a graduate student at Princeton University while conducting the research in the lab of Daniel Sigman, the Dusenbury Professor of Geological and Geophysical Sciences. Wang is now a Simons Foundation Postdoctoral Fellow on the Origins of Life at Caltech. The study used a library of 10,000 deep-sea corals collected by Caltech’s Jess Adkins.

Xingchen (Tony) Wang and Jess Adkins. Photo courtesy of Caltech

Earth’s average temperature has naturally fluctuated by about 4 to 5 degrees Celsius over the course of the past million years as the planet has cycled in and out of glacial periods. During that time, the earth’s atmospheric CO2 levels have fluctuated between roughly 180 and 280 parts per million (ppm) every 100,000 years or so. (In recent years, man-made carbon emissions have boosted that concentration up to over 400 ppm.)

About 10 years ago, researchers noticed a close correspondence between the fluctuations in CO2 levels and in temperature over the last million years. When the earth is at its coldest, the amount of CO2 in the atmosphere is also at its lowest. During the most recent ice age, which ended about 11,000 years ago, global temperatures were 5 degrees Celsius lower than they are today, and atmospheric CO2 concentrations were at 180 ppm.

There is 60 times more carbon in the ocean than in the atmosphere—partly because the ocean is so big. The mass of the world’s oceans is roughly 270 times greater than that of the atmosphere. As such, the ocean is the greatest regulator of carbon in the atmosphere, acting as both a sink and a source for atmospheric CO2.

Biological processes are the main driver of CO2 absorption from the atmosphere to the ocean. Just like photosynthesizing trees and plants on land, plankton at the surface of the sea turn CO2 into sugars that are eventually consumed by other creatures. As the sea creatures who consume those sugars—and the carbon they contain—die, they sink to the deep ocean, where the carbon is locked away from the atmosphere for a long time. This process is called the “biological pump.”

A healthy population of phytoplankton helps lock away carbon from the atmosphere. In order to thrive, phytoplankton need nutrients—notably, nitrogen, phosphorus, and iron. In most parts of the modern ocean, phytoplankton deplete all of the available nutrients in the surface ocean, and the biological pump operates at maximum efficiency.

However, in the modern Southern Ocean, there is a limited amount of iron—which means that there are not enough phytoplankton to fully consume the nitrogen and phosphorus in the surface waters. When there is less living biomass, there is also less that can die and sink to the bottom—which results in a decrease in carbon sequestration. The biological pump is not currently operating as efficiently as it theoretically could.

To track the efficiency of the biological pump over the span of the past 40,000 years, Adkins and his colleagues collected more than 10,000 fossils of the coral Desmophyllum dianthus.

Why coral? Two reasons: first, as it grows, coral accretes a skeleton around itself, precipitating calcium carbonate (CaCO3) along with trace elements (including nitrogen) out of the water around it. That process creates a rocky record of the chemistry of the ocean. Second, coral can be precisely dated using a combination of radiocarbon and uranium dating.

“Finding a few centimeter-tall fossil corals 2,000 meters deep in the ocean is no trivial task,” says Adkins, the Smits Family Professor of Geochemistry and Global Environmental Science at Caltech.

Adkins and his colleagues collected coral from the relatively narrow (500-mile) gap known as the Drake Passage between South America and Antarctica (among other places). Because the Southern Ocean flows around Antarctica, all of its waters funnel through that gap—making the samples Adkins collected a robust record of the water throughout the Southern Ocean.

Coauthors include scientists from Caltech, Princeton University, Pomona College, the Max Planck Institute for Chemistry in Germany, University of Bristol, and ETH Zurich in Switzerland.

Wang analyzed the ratios of two isotopes of nitrogen atoms in these corals – nitrogen-14 (14N, the most common variety of the atom, with seven protons and seven neutrons in its nucleus) and nitrogen-15 (15N, which has an extra neutron). When phytoplankton consume nitrogen, they prefer 14N to 15N. As a result, there is a correlation between the ratio of nitrogen isotopes in sinking organic matter (which the corals then eat as it falls to the seafloor) and how much nitrogen is being consumed in the surface ocean—and, by extension, the efficiency of the biological pump.
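For reference, isotope ratios like these are conventionally reported in the delta notation (a standard geochemical convention, not spelled out in the article), which expresses a sample’s 15N/14N ratio relative to that of air in parts per thousand (‰):

```latex
\delta^{15}\mathrm{N} \;=\; \left[ \frac{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\mathrm{sample}}}{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\mathrm{air}}} - 1 \right] \times 1000
```

Because phytoplankton prefer 14N, the nitrogen remaining in the surface water, and the organic matter produced from it as consumption nears completion, becomes progressively enriched in 15N.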

A higher amount of 15N in the fossils indicates that the biological pump was operating more efficiently at that time. An analogy would be monitoring what a person eats in their home. If they are eating more of their less-liked foods, then one could assume that the amount of food in their pantry is running low.

Indeed, Wang found that higher amounts of 15N were present in fossils corresponding to the last ice age, indicating that the biological pump was operating more efficiently during that time. As such, the evidence suggests that colder climates allow more biomass to grow in the surface Southern Ocean—likely because colder climates experience stronger winds, which can blow more iron into the Southern Ocean from the continents. That biomass consumes carbon, then dies and sinks, locking it away from the atmosphere.

Adkins and his colleagues plan to continue probing the coral library for further details about the cycles of ocean chemistry changes over the past several hundred thousand years.

The research was funded by the National Science Foundation, Princeton University, the European Research Council, and the Natural Environment Research Council.

The study, “Deep-sea coral evidence for lower Southern Ocean surface nitrate concentrations during the last ice age,” by Xingchen Tony Wang, Daniel M. Sigman, Maria G. Prokopenko, Jess F. Adkins, Laura F. Robinson, Sophia K. Hines, Junyi Chai, Anja S. Studer, Alfredo Martínez-García, Tianyu Chen, and Gerald H. Haug, was published in the journal Proceedings of the National Academy of Sciences early edition the week of March 13, 2017. doi: 10.1073/pnas.1615718114

Article provided courtesy of Caltech

Researchers develop technique to track yellow fever virus replication

Infection with a strain of yellow fever virus (YFV-17D) in mouse liver. The liver of a mouse whose immune cells lack the immune signaling component known as STAT1 shows severe lymphocyte infiltration and inflammation, as well as necrosis, after infection with YFV-17D. Credit: Florian Douam and Alexander Ploss

By Staff, Department of Molecular Biology

Researchers from Princeton University's Department of Molecular Biology have developed a new method that can precisely track the replication of yellow fever virus in individual host immune cells. The technique, which is described in a paper published March 14 in the journal Nature Communications, could aid the development of new vaccines against a range of viruses, including Dengue and Zika.

Yellow fever virus (YFV) is a member of the flavivirus family that also includes Dengue and Zika virus. The virus, which is thought to infect a variety of cell types in the body, causes up to 200,000 cases of yellow fever every year, despite the widespread use of a highly effective vaccine. The vaccine consists of a live, attenuated form of the virus called YFV-17D, whose RNA genome is more than 99 percent identical to that of the virulent strain. This small genomic difference may subtly alter the attenuated virus’ interactions with the host immune system so that it induces a protective immune response without causing disease.

To explore how viruses interact with their hosts, and how these processes lead to virulence and disease, Alexander Ploss, assistant professor of molecular biology, and colleagues at Princeton University adapted a technique — called RNA Prime flow — that can detect RNA molecules within individual cells. They used the technique to track the presence of replicating viral particles in various immune cells circulating in the blood of infected mice. Mice are usually resistant to YFV, but Ploss and colleagues found that even the attenuated YFV-17D strain was lethal if the transcription factor STAT1, part of the antiviral interferon signaling pathway, was removed from mouse immune cells. The finding suggests that interferon signaling within immune cells protects mice from YFV, and that species-specific differences in this pathway allow the virus to replicate in humans and certain other primates but not mice.

Accordingly, YFV-17D was able to replicate efficiently in mice whose immune systems had been replaced with human immune cells capable of activating interferon signaling. However, just like humans immunized with the attenuated YFV vaccine, these “humanized” mice didn’t develop disease symptoms when infected with YFV-17D, allowing Ploss and colleagues to study how the attenuated virus interacts with the human immune system. Using their viral RNA flow technique, the researchers determined that the virus can replicate inside certain human immune cell types, including B lymphocytes and natural killer cells, in which the virus has not been detected previously. The researchers found that the panel of human cell types targeted by the virus changes over the course of infection in both the blood and the spleen of the animals, highlighting the distinct dynamics of YFV-17D replication in the human immune system.

The next step, said Florian Douam, a postdoctoral research associate in the Department of Molecular Biology and first author on the study, is to confirm YFV replication in these subsets of immune cells in YFV-infected patients and in recipients of the YFV-17D vaccine. Viral RNA flow now provides the means to perform such analyses, Douam said.

The researchers also plan to study whether the virulent and attenuated strains of yellow fever virus infect different host immune cells. The approach may help explain why some people infected with the virus die while others develop only the mildest of symptoms, as well as which changes in the YFV-17D genome weaken the virus’ ability to cause disease. “This could guide the rational design of vaccines against related pathogens, such as Zika and Dengue virus,” Ploss said.

This work was supported by a grant from the Health Grand Challenge program from Princeton University, the New Jersey Commission on Cancer Research (Grant No. DHFS16PPC007), the Genentech Foundation and Princeton University’s Anthony Evnin ’62 Senior Thesis Fund.

Florian Douam, Gabriela Hrebikova, Yentli E. Soto Albrecht, Julie Sellau, Yael Sharon, Qiang Ding and Alexander Ploss. Single-cell tracking of flavivirus RNA uncovers species-specific interactions with the immune system dictating disease outcome. Nature Communications. 8: 14781. (2017). doi: 10.1038/ncomms14781

A new cosmic survey offers unprecedented view of galaxies

A color composite image in the green, red and infrared bands of a patch of the sky known as the COSMOS field, as imaged by the Subaru Telescope in Hawaii. The galaxies are seen at such large distances that the light from them has taken billions of years to reach Earth. The light from the faintest galaxies in this image was emitted when the universe was less than 10 percent of its present age. (Credit: Princeton University/HSC Project)

By the Office of the Dean for Research

The universe has come into sharper focus with the release this week of new images from one of the largest telescopes in the world. A multinational collaboration led by the National Astronomical Observatory of Japan that includes Princeton University scientists has published a “cosmic census” of a large swath of the night sky containing roughly 100 million stars and galaxies, including some of the most distant objects in the universe. These high-quality images allow an unprecedented view into the nature and evolution of galaxies and dark matter.

The images and accompanying data were collected using a digital optical-imaging camera on the Subaru Telescope, located at the Mauna Kea Observatory in Hawaii. The camera, known as Hyper Suprime-Cam, is mounted directly in the optical path, at the “prime focus,” of the Subaru Telescope. A single image from the camera captures an amount of sky equal to the area of about nine full moons.

The project, known as the Hyper Suprime-Cam Subaru Strategic Program, is led by the National Astronomical Observatory of Japan (NAOJ) in collaboration with the Kavli Institute for the Physics and Mathematics of the Universe in Japan, the Academia Sinica Institute of Astronomy and Astrophysics in Taiwan, and Princeton University.

The release includes data from the first one-and-a-half years of the project, consisting of 61.5 nights of observations beginning in 2014. The project will take 300 nights over five to six years.

The data will allow researchers to look for previously undiscovered galaxies and to search for dark matter, which is matter that neither emits nor absorbs light but which can be detected via its effects on gravity. A 2015 study using Hyper Suprime-Cam surveyed 2.3 square degrees of sky and found gravitational signatures of nine clumps of dark matter, each weighing as much as a galaxy cluster (Miyazaki et al., 2015). The current data release covers about 50 times more sky than was used in that study, showing the potential of these data to reveal the statistical properties of dark matter.

The survey consists of three layers: a Wide survey that will eventually cover an area equal to 7,000 full moons, or 1,400 square degrees; a Deep survey that will look farther into the universe and encompass 26 square degrees; and an UltraDeep survey that will cover 3.5 square degrees and penetrate deep into space, allowing observations of some of the most distant galaxies in the universe. The surveys use optical and near-infrared wavelengths in five broad wavelength bands (green, red, infrared, z, and y) and four narrow-band filters. The resulting multi-band images are extremely sharp, with star images only 0.6 to 0.8 arcseconds across. (One arcsecond is 1/3600th of a degree.)

Figure 2: Cluster of galaxies
An image of a massive cluster of galaxies in the Virgo constellation showing numerous strong gravitational lenses. The distance to the central galaxy is 5.3 billion light years, while the lensed galaxies, apparent as the arcs around the cluster, are much more distant. This is a composite image in the green, red, and infrared band, and has a spatial resolution of about 0.6 arcsecond. (Credit: NAOJ/HSC Project)

The ability to capture images from deep in space is made possible by the light-collection power of the Subaru Telescope’s mirror, which has an aperture of 8.2 meters, as well as the image exposure time. The depth into space that one can look is measured in terms of the magnitude, or brightness of objects that can be seen from Earth in a given wavelength band. The depths of the three surveys are characterized by magnitudes in the red band of 26.4, 26.6 and 27.3 in the Wide, Deep and Ultradeep data, respectively. As the survey continues, the Deep and Ultradeep surveys will be able to image fainter objects.
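For readers unfamiliar with the astronomical magnitude scale (a standard convention, not specific to this survey), magnitudes are logarithmic and run backwards: larger numbers mean fainter objects, with each magnitude corresponding to a factor of about 2.5 in brightness. The survey depths quoted above therefore mean the Ultradeep data reach objects roughly 2.3 times fainter than the Wide data:

```latex
m_1 - m_2 = -2.5 \, \log_{10}\!\left(\frac{F_1}{F_2}\right)
\quad\Longrightarrow\quad
\frac{F_{\mathrm{Wide}}}{F_{\mathrm{UltraDeep}}} = 10^{\,(27.3 - 26.4)/2.5} \approx 2.3
```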

The Hyper Suprime-Cam contains 104 scientific charge-coupled devices (CCDs) for a total of 870 million pixels. The total amount of data taken so far comprises 80 terabytes, which is comparable to the size of about 10 million images by a typical digital camera, and covers 108 square degrees. Because it is difficult to search such a huge dataset with standard tools, NAOJ has developed a dedicated database and interface for ease of access and use of the data.

Figure 3: Interaction between galaxies
A color composite image in the green, red and infrared bands of UGC 10214, known as the Tadpole Galaxy in the ELAIS-N1 region. The distance to this galaxy is about 400 million light years. The long tail of stars is due to gravitational interaction between two galaxies. (Credit: NAOJ/HSC Project)

“Since 2014, we have been observing the sky with HSC, which can capture a wide-field image with high resolution,” said Satoshi Miyazaki, the leader of the project and a scientist at NAOJ. “We believe the data release will lead to many exciting astronomical results, from exploring the nature of dark matter and dark energy, as well as asteroids in our own solar system and galaxies in the early universe. The team members are now preparing a number of scientific papers based on these data. We plan to publish them in a special issue of the Publications of the Astronomical Society of Japan. Moreover, we hope that interested members of the public will also access the data and enjoy the real universe imaged by the Subaru telescope, one of the largest in the world.”

At Princeton, the project is co-led by Michael Strauss and Robert Lupton of the Department of Astrophysical Sciences. “The HSC data are really beautiful,” Strauss said. “Princeton scientists are using these data to explore the nature of merging galaxies, to search for the most distant quasars in the universe, to map the outer reaches of the Milky Way Galaxy, and for many other projects. We are delighted to make these wonderful images available to the world-wide astronomical community.”

Funding for the HSC Project was provided in part by the following grants: Grant-in-Aid for Scientific Research (B) JP15340065; Grant-in-Aid for Scientific Research on Priority Areas JP18072003; and the Funding Program for World-Leading Innovative R&D on Science and Technology (FIRST) entitled, “Uncovering the origin and future of the Universe-ultra-wide-field imaging and spectroscopy reveal the nature of dark matter and dark energy.” Funding was also provided by Princeton University.

This article was adapted from a press release from the National Astronomical Observatory of Japan.

Come together: Nucleolus forms via combination of active and passive processes

Movie caption: Researchers at Princeton studied the temperature dependence of the formation of the nucleolus, a cellular organelle. The movie shows the nuclei of intact fly cells as they are subjected to temperature changes in the surrounding fluid. As the temperature is shifted from low to high, the spontaneously assembled proteins dissolve, as can be seen in the disappearance of the bright spots.

By Catherine Zandonella, Office of the Dean for Research

Researchers at Princeton found that the nucleolus, a cellular organelle involved in RNA synthesis, assembles in part through the passive process of phase separation – the same type of process that causes oil to separate from water. The study, published in the journal Proceedings of the National Academy of Sciences, is the first to show that this happens in living, intact cells.

Understanding how cellular structures form could help explain how organelles change in response to diseases. For example, a hallmark of cancer cells is the swelling of the nucleolus.

To explore the role of passive processes – as opposed to active processes that involve energy consumption – in nucleolus formation, Hanieh Falahati, a graduate student in Princeton’s Lewis-Sigler Institute for Integrative Genomics, looked at the behavior of six nucleolus proteins under different temperature conditions. Phase separation is enhanced at lower temperatures, which is why salad dressing containing oil and vinegar separates when stored in the refrigerator. If phase separation were driving the assembly of proteins, the researchers should see the effect at low temperatures.

Falahati showed that four of the six proteins condensed and assembled into the nucleolus at low temperatures and reverted when the temperature rose, indicating that the passive process of phase separation was at work. However, the assembly of the other two proteins was irreversible, indicating that active processes were in play.

“It was kind of a surprising result, and it shows that cells can take advantage of spontaneous processes for some functions, but for other things, active processes may give the cell more control,” said Falahati, whose adviser is Eric Wieschaus, Princeton’s Squibb Professor in Molecular Biology and a professor of molecular biology and the Lewis-Sigler Institute for Integrative Genomics, and a Howard Hughes Medical Institute researcher.

The research was funded in part by grant 5R37HD15587 from the National Institute of Child Health and Human Development (NICHD), and by the Howard Hughes Medical Institute.

The study, “Independent active and thermodynamic processes govern the nucleolus assembly in vivo,” by Hanieh Falahati and Eric Wieschaus, was published online ahead of print in the journal Proceedings of the National Academy of Sciences on January 23, 2017. doi: 10.1073/pnas.1615395114

Theorists propose new class of topological metals with exotic electronic properties (Physical Review X)

Band structure spectral function
A new theory explains the behavior of a class of metals with exotic electronic properties. Credit: Muechler et al., Physical Review X

By Tien Nguyen, Department of Chemistry

Researchers at Princeton, Yale, and the University of Zurich have proposed a theory-based approach to characterizing a class of metals with exotic electronic properties, an approach that could help scientists find other materials with similar properties.

Published in the journal Physical Review X, the study described a new class of metals based on their symmetry and a mathematical classification known as a topological number, which is predictive of special electronic properties. Topological materials have drawn intense research interest since the early 2000s culminating in last year’s Nobel Prize in Physics awarded to three physicists, including F. Duncan Haldane, Princeton’s Eugene Higgins Professor of Physics, for theoretical discoveries in this area.

“Topological classification is a very general way of looking at the properties of materials,” said Lukas Muechler, a Princeton graduate student in the laboratory of Roberto Car, Princeton’s Ralph W. Dornte *31 Professor in Chemistry, and lead author on the article.

A popular way of explaining this abstract mathematical classification involves breakfast items. In topological classification, donuts and coffee cups are equivalent because they both have one hole and can be smoothly deformed into one another. Donuts, meanwhile, cannot be deformed into muffins, which makes the two inequivalent. The number of holes is an example of a topological invariant: it is the same for the donut and the coffee cup, but distinguishes them from the muffin.

“The idea is that you don’t really care about the details. As long as two materials have the same topological invariants, we can say they are topologically equivalent,” he said.

Muechler and his colleagues’ interest in the topological classification of this new class of metals was sparked by a peculiar discovery in the neighboring laboratory of Robert Cava, Princeton’s Russell Wellman Moore Professor of Chemistry. While searching for superconductivity in a crystal called tungsten telluride (WTe2), the Cava lab instead found that the material could continually increase its resistance in response to ever stronger magnetic fields – a property that might be used to build a sensor of magnetic fields.

The origin of this property was, however, mysterious. “This material has very interesting properties, but there had been no theory around it,” Muechler said.

The researchers first considered the arrangement of the atoms in the WTe2 crystal. Patterns in the arrangement of atoms are known as symmetries, and they fall into two fundamentally different classes – symmorphic and nonsymmorphic – which lead to profound differences in electronic properties, such as the transport of current in an electromagnetic field.

a) Symmorphic symmetry b) Nonsymmorphic symmetry Credit: Lukas Muechler

While WTe2 is composed of many layers of atoms stacked upon each other, Car’s team found that a single layer of atoms has a particular nonsymmorphic symmetry, where the atomic arrangement is unchanged overall if it is first rotated and then translated by a fraction of the lattice period (see figure).
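To make “unchanged overall after a point operation plus a fractional translation” concrete, here is a small self-contained check in Python. It uses a generic glide-type operation on an invented two-sublattice chain, not the actual WTe2 monolayer geometry: the reflection alone does not map the structure onto itself, but the reflection combined with a half-period translation does.

```python
# Toy example of a nonsymmorphic (glide) symmetry on a 1D chain with two
# sublattices per unit cell. Coordinates are invented for illustration;
# the real WTe2 monolayer symmetry involves a rotation plus a fractional
# translation, but the logic of the check is the same.
import numpy as np

a = 1.0                                   # lattice period
cells = np.arange(-5, 6)                  # a finite patch of the chain
A = np.array([(n + 0.25, +0.2) for n in cells])   # sublattice A
B = np.array([(n + 0.75, -0.2) for n in cells])   # sublattice B
sites = np.vstack([A, B])

def is_site(p, tol=1e-9):
    """True if point p coincides with some lattice site in the patch."""
    return np.any(np.all(np.abs(sites - p) < tol, axis=1))

def mirror(p):                 # point-group part: reflect y -> -y
    return np.array([p[0], -p[1]])

def glide(p):                  # mirror followed by a half-period translation
    return mirror(p) + np.array([a / 2, 0.0])

test = np.array([0.25, 0.2])   # an A site near the middle of the patch
print("mirror alone maps onto the lattice:", is_site(mirror(test)))   # False
print("mirror + half translation does:    ", is_site(glide(test)))    # True
```

In a symmorphic crystal the point operation alone would already be a symmetry; the need for the extra fractional translation is what makes an operation nonsymmorphic.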

Having established the symmetry, the researchers mathematically characterized all possible electronic states having this symmetry, and classified as topologically equivalent those states that can be smoothly deformed into each other, just as a donut can be deformed into a coffee cup. From this classification, they found that WTe2 belongs to a new class of metals, which they dubbed nonsymmorphic topological metals. These metals are characterized by a different electron number than the nonsymmorphic metals that had previously been studied.

In nonsymmorphic topological metals, the current-carrying electrons behave like relativistic particles, in other words, like particles traveling at nearly the speed of light. Transport by such electrons is not as susceptible to impurities and defects as it is in ordinary metals, making these materials attractive candidates for electronic devices.

The abstract topological classification also led the researchers to suggest explanations for some of the outstanding electronic properties of bulk WTe2, most importantly its perfect compensation, meaning that it has an equal number of holes and electrons. Through theoretical simulations, the researchers found that this property could be achieved in the three-dimensional crystalline stacking of the WTe2 monolayers, which was a surprising result, Muechler said.

“Usually in theory research there isn’t much that’s unexpected, but this just popped out,” he said. “This abstract classification directly led us to explaining this property. In this sense, it’s a very elegant way of looking at this compound and now you can actually understand or design new compounds with similar properties.”

Recent photoemission experiments have also shown that the electrons in WTe2 absorb right-handed photons differently than they would left-handed photons. The theory formulated by the researchers showed that these photoemission experiments on WTe2 can be understood based on the topological properties of this new class of metals.

In future studies, the theorists want to test whether these topological properties are also present in atomically thin layers of these metals, which could be exfoliated from a larger crystal to make electronic devices. “The study of these phenomena has big implications for the electronics industry, but it’s still in its infant years,” Muechler said.

This work was supported by the U.S. Department of Energy (DE-FG02-05ER46201), the Yale Postdoctoral Prize Fellowship, the National Science Foundation (NSF CAREER DMR-095242 and NSF-MRSEC DMR-0819860), the Office of Naval Research (ONR-N00014-11-1-0635), the U.S. Department of Defense (MURI-130-6082), the David and Lucile Packard Foundation, the W. M. Keck Foundation, and the Eric and Wendy Schmidt Transformative Technology Fund.

Ultrafast lasers reveal light-harvesting secrets of photosynthetic algae (Chem)


By Tien Nguyen

Photosynthetic algae have been refining their technique for capturing light for millions of years. As a result, these algae boast powerful light harvesting systems — proteins that absorb light to be turned into energy for the plants — that scientists have long aspired to understand and mimic for renewable energy applications.

Microscopy image of cryptophyte algae. Credit: Desmond Toa

Now, researchers at Princeton University have revealed a mechanism that enhances the light harvesting rates of the cryptophyte algae Chroomonas mesostigmatica. Published in the journal Chem on December 8, these findings provide valuable insights for the design of artificial light-harvesting systems such as molecular sensors and solar energy collectors.

Cryptophyte algae often live below other organisms that absorb most of the sun’s rays. In response, the algae have evolved to thrive on wavelengths of light that aren’t captured by their neighbors above, mainly the yellow-green colors. The algae collect this yellow-green light energy and pass it through a network of molecules that converts it into red light, which chlorophyll molecules need to perform important photosynthetic chemistry.

Graduate student Desmond Toa (left) and Jacob Dean, a postdoctoral research associate and lecturer in chemistry, with the laser set-up, Credit: C. Todd Reichart

The speed of the energy transfer through the system has both impressed and perplexed the scientists who study it. In Gregory Scholes’ lab at Princeton University, predicted rates were always about three times slower than the observed rates. “The timescales that the energy is moved through the protein — we could never understand why the process was so fast,” said Scholes, the William S. Tod Professor of Chemistry.

In 2010, Scholes’ team found evidence that the culprit behind these fast rates was a strange phenomenon called quantum coherence, in which molecules could share electronic excitation and transfer energy according to quantum mechanical probability laws instead of classical physics. But the research team couldn’t explain exactly how coherence worked to speed up the rates until now.

Using a sophisticated method enabled by ultrafast lasers, the researchers were able to measure the molecules’ light absorption and essentially track the energy flow through the system. Normally the absorption signals would overlap, making them impossible to assign to specific molecules within the protein complex, but the team was able to sharpen the signals by cooling the proteins down to very low temperatures, said Jacob Dean, lead author and postdoctoral researcher in the Scholes lab.

The researchers observed the system as energy was transferred from molecule to molecule, from high-energy green light to lower energy red light, with excess energy lost as vibrational energy. These experiments revealed a particular spectral pattern that was a ‘smoking gun’ for vibrational resonance, or vibrational matching, between the donor and acceptor molecules, Dean said.

This vibrational matching allowed energy to be transferred much faster than it otherwise would be by distributing the excitation between molecules. This effect provided a mechanism for the previously reported quantum coherence. Taking this redistribution into account, the researchers recalculated their prediction and landed on a rate that was about three times faster.

“Finally the prediction is in the right ballpark,” Scholes said. “Turns out that it required this quite different, surprising mechanism.”

The Scholes lab plans to study related proteins to investigate if this mechanism is operative in other photosynthetic organisms. Ultimately, scientists hope to create light-harvesting systems with perfect energy transfer by taking inspiration and design principles from these finely tuned yet extremely robust light-harvesting proteins. “This mechanism is one more powerful statement of the optimality of these proteins,” Scholes said.

Read the full article here:

Dean, J. C.; Mirkovic, T.; Toa, Z. S. D.; Oblinsky, D. G.; Scholes, G. D. “Vibronic Enhancement of Algae Light Harvesting.” Chem 2016, 1, 858.

An explanation for the mysterious onset of a universal process (Physics of Plasmas)

Solar flares
Magnetic reconnection happens in solar flares on the surface of the sun, as well as in experimental fusion energy reactors here on Earth. Image credit: NASA.

By John Greenwald, Princeton Plasma Physics Laboratory Communications

Scientists have proposed a groundbreaking solution to a mystery that has puzzled physicists for decades. At issue is how magnetic reconnection, a universal process that sets off solar flares, northern lights and cosmic gamma-ray bursts, occurs so much faster than theory says should be possible. The answer, proposed by researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University, could aid forecasts of space storms, explain several high-energy astrophysical phenomena, and improve plasma confinement in doughnut-shaped magnetic devices called tokamaks designed to obtain energy from nuclear fusion.

Magnetic reconnection takes place when the magnetic field lines embedded in a plasma — the hot, charged gas that makes up 99 percent of the visible universe — converge, break apart and explosively reconnect. This process takes place in thin sheets in which electric current is strongly concentrated.

According to conventional theory, these sheets can be highly elongated and severely constrain the velocity of the magnetic field lines that join and split apart, making fast reconnection impossible. However, observation shows that rapid reconnection does exist, directly contradicting theoretical predictions.

Detailed theory for rapid reconnection

Now, physicists at PPPL and Princeton University have presented a detailed theory for the mechanism that leads to fast reconnection. Their paper, published in the journal Physics of Plasmas in October, focuses on a phenomenon called “plasmoid instability” to explain the onset of the rapid reconnection process. Support for this research comes from the National Science Foundation and the DOE Office of Science.

Plasmoid instability, which breaks up plasma current sheets into small magnetic islands called plasmoids, has generated considerable interest in recent years as a possible mechanism for fast reconnection. However, correct identification of the properties of the instability has been elusive.

Luca Comisso, lead author of the study. Photo courtesy of PPPL.

The Physics of Plasmas paper addresses this crucial issue. It presents “a quantitative theory for the development of the plasmoid instability in plasma current sheets that can evolve in time,” said Luca Comisso, lead author of the study. Co-authors are Manasvi Lingam and Yi-Min Huang of PPPL and Princeton, and Amitava Bhattacharjee, head of the Theory Department at PPPL and Princeton professor of astrophysical sciences.

Pierre de Fermat’s principle

The paper describes how the plasmoid instability begins in a slow linear phase that goes through a period of quiescence before accelerating into an explosive phase that triggers a dramatic increase in the speed of magnetic reconnection. To determine the most important features of this instability, the researchers adapted a variant of the 17th century “principle of least time” originated by the mathematician Pierre de Fermat.

Use of this principle enabled the researchers to derive equations for the duration of the linear phase, and for computing the growth rate and number of plasmoids created. Hence, this least-time approach led to a quantitative formula for the onset time of fast magnetic reconnection and the physics behind it.

The paper also produced a surprise. The authors found that such relationships do not reflect traditional power laws, in which one quantity varies as a power of another. “It is common in all realms of science to seek the existence of power laws,” the researchers wrote. “In contrast, we find that the scaling relations of the plasmoid instability are not true power laws – a result that has never been derived or predicted before.”

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by Princeton University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Read the abstract here: Comisso, L.; Lingam, M.; Huang, Y.-M.; Bhattacharjee, A. General theory of the plasmoid instability. Physics of Plasmas 23, 2016. DOI: 10.1063/1.4964481

Outlook for subtropical rainfall under climate change not so gloomy (Nature Climate Change)

Researchers found a clear difference in the rate of global surface warming (left panel) and the rate of subtropical rainfall decline (indicated by the brown shading in the right panel) when forced with an instantaneous increase of CO2. This is the main evidence to show that the subtropical rainfall decline is unrelated to the global surface warming. (Credit: Jie He, Ph.D., Princeton University and Brian J. Soden, Ph.D., University of Miami Rosenstiel School of Marine and Atmospheric Science)

By Diana Udel, University of Miami

Terrestrial rainfall in the subtropics — including the southeastern United States — may not decline in response to increased greenhouse gases as much as it could over oceans, according to a study from Princeton University and the University of Miami (UM). The study challenges previous projections of how dry subtropical regions could become in the future, and it suggests that the impact of decreased rainfall on people living in these regions could be less severe than initially thought.

“The lack of rainfall decline over subtropical land is caused by the fact that land will warm much faster than the ocean in the future — a mechanism that has been overlooked in previous studies about subtropical precipitation change,” said first author Jie He, a postdoctoral research associate in Princeton’s Program in Atmospheric and Oceanic Sciences who works at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory located on Princeton’s Forrestal Campus.

In the new study, published in the journal Nature Climate Change, He and co-author Brian Soden, a UM professor of atmospheric sciences, used an ensemble of climate models to show that the subtropical rainfall decline occurs faster than global surface warming, and that another mechanism must therefore be at play. They found that direct heating from increasing greenhouse gases is causing the land to warm faster than the ocean. The associated changes in atmospheric circulation are thus driving the rainfall decline over the oceans rather than over land.

Subtropical rainfall changes have been previously attributed to two mechanisms related to global warming: greater moisture content in air that is transported away from the subtropics, and a pole-ward shift in air circulation. While both mechanisms are present, this study shows that neither one is responsible for a decline in rainfall.

“It has been long accepted that climate models project a large-scale rainfall decline in the future over the subtropics. Since most of the subtropical regions are already suffering from rainfall scarcity, the possibility of future rainfall decline is of great concern,” Soden said. “However, most of this decline occurs over subtropical oceans, not land, due to changes in the atmospheric circulation induced by the more rapid warming of land than ocean.”

Most of the reduction in subtropical rainfall occurs instantaneously with an increase of greenhouse gases, independent of the warming of the Earth’s surface, which occurs much more slowly. According to the authors, this indicates that emission reductions would immediately mitigate subtropical rainfall decline, even though the surface will continue to warm for a long time.

He is supported by the Visiting Scientist Program at the department of Atmospheric and Oceanic Science, Princeton University.

Read the abstract:

The study, “A re-examination of the projected subtropical precipitation decline,” was published in the Nov. 14 issue of the journal Nature Climate Change.

Researchers’ Sudoku strategy democratizes powerful tool for genetics research (Nature Communications)

Princeton University researchers Buz Barstow (left), graduate student Kemi Adesina and undergraduate researcher Isao Anzai, Class of 2017, together with colleagues at Harvard University, have developed a strategy called “Knockout Sudoku” for figuring out gene function.

By Tien Nguyen, Department of Chemistry

Researchers at Princeton and Harvard Universities have developed a way to produce the tools for figuring out gene function faster and cheaper than current methods, according to new research in the journal Nature Communications.

The function of sizable chunks of many organisms’ genomes is a mystery, and figuring out how to fill these information gaps is one of the central questions in genetics research, said study author Buz Barstow, a Burroughs-Wellcome Fund Research Fellow in Princeton’s Department of Chemistry. “We have no idea what a large fraction of genes do,” he said.

One of the best strategies that scientists have to determine what a particular gene does is to remove it from the genome and then evaluate what the organism can no longer do. The end result, known as a whole-genome knockout collection, is a full set of mutants in which single genes have been deleted or “knocked out.” Researchers then test the entire knockout collection against a specific chemical reaction. If a mutant organism fails to perform the reaction, it must be missing the particular gene responsible for that task.

It can take several years and millions of dollars to build a whole-genome knockout collection through targeted gene deletion. Because it’s so costly, whole-genome knockout collections only exist for a handful of organisms such as yeast and the bacterium Escherichia coli. Yet, these collections have proven to be incredibly useful as thousands of studies have been conducted on the yeast gene-deletion collection since its release.

The Princeton and Harvard researchers are the first to create a collection quickly and affordably, doing so in less than a month for several thousand dollars. Their strategy, called “Knockout Sudoku,” relies on a combination of randomized gene deletion and a powerful reconstruction algorithm. Though other research groups have attempted this randomized approach, none have come close to matching the speed and cost of Knockout Sudoku.

“We sort of see it as democratizing these powerful tools of genetics,” said Michael Baym, a co-author on the work and a Harvard Medical School postdoctoral researcher. “Hopefully it will allow the exploration of genetics outside of model organisms,” he said.

Their approach began with steep pizza bills and a technique called transposon mutagenesis that ‘knocks out’ genes by randomly inserting a single disruptive DNA sequence into the genome. This technique is applied to a large collection of microbes to make it likely that every single gene is disrupted in at least one of them. For example, the team started with a collection of about 40,000 mutant microbes for the bacterium Shewanella oneidensis, which has approximately 3,600 genes in its genome.
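A quick back-of-the-envelope estimate shows why roughly 40,000 random mutants comfortably cover about 3,600 genes (this assumes insertions land uniformly at random across genes, which real transposon libraries only approximate):

```python
G = 3600     # approximate number of genes in Shewanella oneidensis (from the article)
N = 40000    # number of mutants in the starting collection (from the article)

p_miss = (1 - 1 / G) ** N        # probability a particular gene receives no insertion
print(f"per-gene miss probability ~ {p_miss:.1e}")      # about 1.5e-05
print(f"expected genes never hit  ~ {G * p_miss:.2f}")  # about 0.05
```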

Barstow recruited undergraduates and graduate students to manually transfer 40,000 mutants out of laboratory Petri dishes into separate wells using toothpicks. He offered pizza as an incentive, but after a full day of labor, they only managed to move a couple thousand mutants. “I thought to myself, ‘Wait a second, this pizza is going to ruin me,’” Barstow said.

Instead, they decided to rent a colony-picking robot. In just two days, the robot was able to transfer each mutant microbe to individual homes in 96-well plates, 417 plates in total.

But the true challenge and opportunity for innovation was in identifying and cataloging the mutants that could comprise a whole-genome knockout collection in a fast and practical way.

DNA amplification and sequencing is a straightforward way to identify each mutant, but doing it individually quickly gets expensive and time-consuming. So the researchers proposed a scheme in which mutants could be combined into groups that would require only 61 amplification reactions and a single sequencing run.

But still, after sequencing each of the pools, the researchers had an enormous amount of data. They knew the identities of all the mutants, but now they had to figure out exactly where each mutant came from in the grid of plates. This is where the Sudoku aspect of the method came in. The researchers built an algorithm that could deduce the location of each mutant from its repeated appearances in particular row, column, plate-row and plate-column pools.
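The reconstruction idea can be sketched in a few lines of Python (a toy version with an invented, much smaller grid; the published algorithm also has to cope with sequencing noise and with mutants that appear in more than one well, which this sketch ignores): each well contributes to one pool per coordinate, and intersecting the pools in which a mutant is detected pins down its address.

```python
# Toy sketch of the pooled "Sudoku" deconvolution. Each well has a four-part
# address (plate-row, plate-column, well-row, well-column) and is mixed into
# one pool per coordinate. Sequencing reports which pools each mutant appears
# in; intersecting those pools recovers the address when the mutant is unique.
from itertools import product

PLATE_ROWS, PLATE_COLS, WELL_ROWS, WELL_COLS = 2, 3, 4, 6   # invented toy grid
addresses = list(product(range(PLATE_ROWS), range(PLATE_COLS),
                         range(WELL_ROWS), range(WELL_COLS)))

def pools_of(addr):
    """The four pools a well at this address is mixed into."""
    pr, pc, r, c = addr
    return {f"plate-row-{pr}", f"plate-col-{pc}", f"row-{r}", f"col-{c}"}

# Pretend each well holds one distinct mutant and record the pool contents,
# mimicking what sequencing the pools would reveal.
mutant_at = {addr: f"mutant_{i}" for i, addr in enumerate(addresses)}
pool_contents = {}
for addr, mutant in mutant_at.items():
    for pool in pools_of(addr):
        pool_contents.setdefault(pool, set()).add(mutant)

def locate(mutant):
    """Deduce the well address from the pools in which the mutant was detected."""
    hits = {pool for pool, members in pool_contents.items() if mutant in members}
    candidates = [addr for addr in addresses if pools_of(addr) <= hits]
    return candidates[0] if len(candidates) == 1 else candidates

print(locate("mutant_17"))   # recovers (plate-row, plate-col, well-row, well-col)
```

Because there is one pool per row, column, plate-row and plate-column, the number of pools grows with the side lengths of the grid rather than with the number of wells, which is why a few dozen amplification reactions can cover hundreds of plates.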

Knockout Sudoku helps find genes' functions.

But there’s a problem. Because the initial gene-disruption process is random, it’s possible that the same mutant is formed more than once, which means that playing Sudoku wouldn’t be simple. To find a solution for this issue, Barstow recalled watching the movie “The Imitation Game,” about Alan Turing’s work on the Enigma code, for inspiration.

“I felt like the problem in some ways was very similar to code breaking,” he said. There are simple codes that substitute one letter for another that can be easily solved by looking at the frequency of the letter, Barstow said. “For instance, in English the letter A is used 8.2 percent of the time. So, if you find that the letter X appears in the message about 8.2 percent of the time, you can tell this is supposed to be decoded as an A. This is a very simple example of Bayesian inference.”
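Barstow’s frequency-analysis analogy is easy to make concrete (the ciphertext and the abbreviated frequency table below are invented for illustration): tally how often each cipher symbol appears and line the ranking up against known English letter frequencies.

```python
from collections import Counter

# A few of the best-known English letter frequencies, in percent.
english = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7}

# Invented ciphertext in which each plaintext letter was replaced by another symbol.
ciphertext = "XQZZQ WQX ZQXW QZX WXQZ QQXZ WZQX XQZW"

counts = Counter(ch for ch in ciphertext if ch.isalpha())
total = sum(counts.values())

# Rank cipher symbols by how often they occur and pair them with the most
# common English letters: a symbol seen ~8 percent of the time is a good
# candidate for 'a', one seen ~13 percent of the time for 'e', and so on.
ranked_symbols = [s for s, _ in counts.most_common()]
ranked_letters = sorted(english, key=english.get, reverse=True)
for symbol, letter in zip(ranked_symbols, ranked_letters):
    print(f"{symbol}: {100 * counts[symbol] / total:.1f}%  ->  likely '{letter}'")
```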

With that same logic, Barstow and colleagues developed a statistical picture of what a real location assignment should look like based on a mutant that only appeared once and used that to rate the likelihood of possible locations being real.

“One of the things I really like about this technique is that it’s a prime example of designing a technique with the mathematics in mind at the outset which lets you do much more powerful things than you could do otherwise,” Baym said. “Because it was designed with the mathematics built in, it allows us to get much, much more data out of much less experiments,” he said.

Using their expedient strategy, the researchers created a collection for the microbe Shewanella oneidensis. These microbes are especially good at transferring electrons, and understanding their powers could prove highly valuable for developing sustainable energy sources, such as artificial photosynthesis, and for environmental remediation in the neutralization of radioactive waste.

Using the resultant collection, the team was able to recapitulate 15 years of research, Barstow said, bolstering their confidence in their method. In an early validation test, they noticed a startlingly poor accuracy rate. After finding no fault with the math, they looked at the original plates to realize that one of the researchers had grabbed the wrong sample. “The least reliable part of this is the human,” Barstow said.

The work was supported by a Career Award at the Scientific Interface from the Burroughs Wellcome Fund and Princeton University startup funds and Fred Fox Class of 1939 funds.

Read the full article here:

Baym, M.; Shaker, L.; Anzai, I. A.; Adesina, O.; Barstow, B. “Rapid construction of a whole-genome transposon insertion collection for Shewanella oneidensis by Knockout Sudoku.” Nature Communications. Available online Nov. 10, 2016.