Unconventional quasiparticles predicted in conventional crystals (Science)

Fermi arcs on the surface of unconventional materials
Electronic states known as Fermi arcs, localized on the surface of a material, stem from the projection of a 3-fold degenerate new bulk fermion. This new fermion is a cousin of the Weyl fermion discovered last year in another class of topological semimetals. Unlike the recently discovered Weyl fermions, which carry spin-½, the new fermion carries spin-1, a reflection of its 3-fold degeneracy.

By Staff

An international team of researchers has predicted the existence of several previously unknown types of quantum particles in materials. The particles — which belong to the class of particles known as fermions — can be distinguished by several intrinsic properties, such as their responses to applied magnetic and electric fields. In several cases, fermions in the interior of the material show their presence on the surface via the appearance of electron states called Fermi arcs, which link the different types of fermion states in the material’s bulk.

The research, published online this week in the journal Science, was conducted by a team at Princeton University in collaboration with researchers at the Donostia International Physics Center (DIPC) in Spain and the Max Planck Institute for Chemical Physics of Solids in Germany. The investigators propose that many of the materials hosting the new types of fermions are “protected metals,” which are metals that do not allow, in most circumstances, an insulating state to develop. This research represents the newest avenue in the physics of “topological materials,” an area of science that has already fundamentally changed the way researchers see and interpret states of matter.

The team at Princeton included Barry Bradlyn and Jennifer Cano, both associate research scholars at the Princeton Center for Theoretical Science; Zhijun Wang, a postdoctoral research associate in the Department of Physics; Robert Cava, the Russell Wellman Moore Professor of Chemistry; and B. Andrei Bernevig, associate professor of physics. The research team also included Maia Vergniory, a postdoctoral research fellow at DIPC, and Claudia Felser, a professor of physics and chemistry and director of the Max Planck Institute for Chemical Physics of Solids.

For the past century, gapless fermions, which are quantum particles with no energy gap between their highest filled and lowest unfilled states, were thought to come in three varieties: Dirac, Majorana and Weyl. Condensed matter physics, the study of quantum phases of matter, has become fertile ground for the discovery of these fermions in different materials through experiments conducted in crystals. These experiments enable researchers to explore exotic particles using relatively inexpensive laboratory equipment rather than large particle accelerators.

In the past four years, all three varieties of gapless fermions have been theoretically predicted and experimentally observed in different types of crystalline materials grown in laboratories around the world. The Weyl fermion was thought to be the last of the group of predicted quasiparticles in nature. Research published earlier this year in the journal Nature (Wang et al., doi:10.1038/nature17410) has shown, however, that this is not the case, with the discovery of a bulk insulator that hosts an exotic surface fermion.

In the current paper, the team predicted and classified the possible exotic fermions that can appear in the bulk of materials. The energy of these fermions can be characterized as a function of their momentum into so-called energy bands, or branches. Unlike the Weyl and Dirac fermions, which, roughly speaking, exhibit an energy spectrum with 2- and 4-fold branches of allowed energy states, the new fermions can exhibit 3-, 6- and 8-fold branches. The 3-, 6-, or 8-fold branches meet up at points – called degeneracy points – in the Brillouin zone, which is the parameter space where the fermion momentum takes its values.

“Symmetries are essential to keep the fermions well-defined, as well as to uncover their physical properties,” Bradlyn said. “Locally, by inspecting the physics close to the degeneracy points, one can think of them as new particles, but this is only part of the story,” he said.

Cano added, “The new fermions know about the global topology of the material. Crucially, they connect to other points in the Brillouin zone in nontrivial ways.”

During the search for materials exhibiting the new fermions, the team uncovered a fundamentally new and systematic way of finding metals in nature. Until now, searching for metals involved performing detailed calculations of the electronic states of matter.

“The presence of the new fermions allows for a much easier way to determine whether a given system is a protected metal or not, in some cases without the need to do a detailed calculation,” Wang said.

Vergniory added, “One can just count the number of electrons of a crystal, and figure out, based on symmetry, if a new fermion exists within observable range.”

The researchers suggest that this is because the new fermions require multiple electronic states to meet in energy: The 8-branch fermion requires the presence of 8 electronic states. As such, a system with only 4 electrons can only occupy half of those states and cannot be insulating, thereby creating a protected metal.
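The counting argument above can be made concrete with a small sketch. The function name and the bare modular check below are inventions for illustration, not the authors' classification machinery; they only capture the reasoning that a filling which cannot complete whole symmetry-enforced band groups rules out an insulator.

```python
def must_be_metal(electrons_per_cell: int, degeneracy: int) -> bool:
    """Illustrative counting check: if crystal symmetry forces bands to
    stick together in groups of `degeneracy` states, a band insulator needs
    the electron count to fill whole groups. A leftover partial filling
    forces the system to stay metallic (a 'protected metal')."""
    return electrons_per_cell % degeneracy != 0

# Example from the article: an 8-fold fermion requires 8 states to meet;
# 4 electrons can fill only half of them, so the system cannot insulate.
print(must_be_metal(4, 8))   # True -> protected metal
print(must_be_metal(8, 8))   # False -> counting alone does not rule out insulation
```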

“The interplay between symmetry, topology and material science hinted by the presence of the new fermions is likely to play a more fundamental role in our future understanding of topological materials – both semimetals and insulators,” Cava said.

Felser added, “We all envision a future for quantum physical chemistry where one can write down the formula of a material, look at both the symmetries of the crystal lattice and at the valence orbitals of each element, and, without a calculation, be able to tell whether the material is a topological insulator or a protected metal.”

Read the abstract.

Funding for this study was provided by the US Army Research Office Multidisciplinary University Research Initiative, the US Office of Naval Research, the National Science Foundation, the David and Lucile Packard Foundation, the W. M. Keck Foundation, and the Spanish Ministry of Economy and Competitiveness.

Study Models How the Immune System Might Evolve to Conquer HIV (PLOS Genetics)

By Katherine Unger Baillie, courtesy of the University of Pennsylvania

It has remained frustratingly difficult to develop a vaccine for HIV/AIDS, in part because the virus, once in our bodies, rapidly reproduces and evolves to escape being killed by the immune system.

“The viruses are constantly producing mutants that evade detection,” said Joshua Plotkin, a professor in the University of Pennsylvania’s Department of Biology in the School of Arts & Sciences. “A single person with HIV may have millions of strains of the virus circulating in the body.”

Yet the body’s immune system can also evolve. Antibody-secreting B-cells compete among themselves to survive and proliferate depending on how well they bind to foreign invaders. They dynamically produce diverse types of antibodies during the course of an infection.

In a new paper in PLOS Genetics, Plotkin, along with postdoctoral researcher Jakub Otwinowski and Armita Nourmohammad, an associate research scholar at Princeton University’s Lewis-Sigler Institute for Integrative Genomics, mathematically modeled these dueling evolutionary processes to understand the conditions that influence how antibodies and viruses interact and adapt to one another over the course of a chronic infection.

Notably, the researchers considered the conditions under which the immune system gives rise to broadly neutralizing antibodies, which can defeat broad swaths of viral strains by targeting the most vital and immutable parts of the viral genome. Their findings, which suggest that presenting the immune system with a large diversity of viral antigens may be the best way to encourage the emergence of such potent antibodies, have implications for designing vaccines against HIV and other chronic infections.

“This isn’t a prescription for how to design an HIV vaccine,” Plotkin said, “but our work provides some quantitative guidance for how to prompt the immune system to elicit broadly neutralizing antibodies.”

The biggest challenge in attempting to model the co-evolution of antibodies and viruses is keeping track of the vast quantity of different genomic sequences that arise in each population during the course of an infection. So the researchers focused on the statistics of the binding interactions between the virus and antibodies.

“This is the key analytical trick to simplify the problem,” said Otwinowski. “It would otherwise be impossible to track and write equations for all the interactions.”

The researchers constructed a model to examine how mutations would affect the binding affinity between antibodies and viruses. Their model calculated the average binding affinities between the entire population of viral strains and the repertoire of antibodies over time to understand how they co-evolve.
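The population-averaged picture can be caricatured with a toy simulation. The sketch below is a generic model of antagonistic co-evolution, not the authors' equations: each antibody and virus is reduced to a single number, antibodies are selected to track the viral population while viruses are selected to escape it, and the population-averaged "binding" is recorded each generation.

```python
import random

random.seed(0)

def simulate(generations=300, pop=200, mut=0.05):
    """Toy antagonistic co-evolution (illustration only). Binding between
    the two populations is taken as minus the distance between their mean
    traits, so antibodies chase the viruses and viruses flee."""
    antibodies = [0.0] * pop
    viruses = [0.0] * pop
    mean_binding = []
    for _ in range(generations):
        # mutation: every individual drifts a little
        antibodies = [a + random.gauss(0, mut) for a in antibodies]
        viruses = [v + random.gauss(0, mut) for v in viruses]
        mean_ab = sum(antibodies) / pop
        mean_vir = sum(viruses) / pop
        # selection: antibodies closest to the viral mean survive;
        # viruses farthest from the antibody mean survive
        antibodies.sort(key=lambda a: abs(a - mean_vir))
        viruses.sort(key=lambda v: -abs(v - mean_ab))
        antibodies = antibodies[: pop // 2] * 2
        viruses = viruses[: pop // 2] * 2
        mean_binding.append(-abs(mean_ab - mean_vir))
    return mean_binding

trace = simulate()
print(f"final mean binding: {trace[-1]:.3f}")
```

The out-of-equilibrium "chase" the researchers describe shows up here as a mean binding that never settles: each population's adaptation keeps moving the other's target.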

“It’s one of the things that is unique about our work,” said Nourmohammad. “We’re not only looking at one virus binding to one antibody but the whole diversity of interactions that occur over the course of a chronic infection.”

What they saw was an S-shaped curve: at times the immune system appeared to control the infection with high levels of binding, but then a viral mutation would arise that could evade neutralization, and binding affinities would drop.

“The immune system does well if there is active binding between antibodies and virus,” Plotkin said, “and the virus does well if there is not strong binding.”

Such a signature is indicative of a system that is out of equilibrium where the viruses are responding to the antibodies and vice versa. The researchers note that this signature is likely common to many antagonistically co-evolving populations.

To see how well their model matched data from an actual infection, the researchers looked at time-shifted experimental data from two HIV patients, whose antibodies were collected at different time points and then “competed” against the viruses that had been in their bodies at various times during their infections.

They saw that these patient data are consistent with their model: Viruses from earlier time points would be largely neutralized by antibodies collected at later time points but could outcompete antibodies collected earlier in infection.

Finally, the researchers used the model to try to understand the conditions under which broadly neutralizing antibodies, which could defeat most strains of virus, would emerge and rise to prominence.

“Despite the effectiveness of broadly neutralizing antibodies, none of the patients with these antibodies has been cured of HIV,” Plotkin said. “It’s just that by the time they develop them, it’s too late and their T-cell repertoire is depleted. This raises the intriguing idea that, if only they could develop these antibodies earlier in infection, they might be prepared to combat an evolving target.”

“The model that we built,” Nourmohammad said, “was able to show that, if viral diversity is very large, the chance that these broadly neutralizing antibodies outcompete more specifically targeted antibodies and proliferate goes up.”

The finding suggests that, in order for a vaccine to elicit these antibodies, it should present a diverse set of viral antigens to the host. That way no one specialist antibody would have a significant fitness advantage, leaving room for the generalist, broadly neutralizing antibodies to succeed.

The researchers said that there has been little theoretical modeling of co-evolutionary systems such as this one. As such, their work could have implications for other co-evolution scenarios.

“Our theory can also apply to other systems, such as bacteria-phage co-evolution,” said Otwinowski, referring to the process in which viruses infect bacteria, driving bacterial evolution and ecology.

“It could also shed light on the co-evolution of the influenza virus in the context of evolving global immune systems,” Nourmohammad said.

Read the article.

The work was supported by funding from the U.S. National Science Foundation, James S. McDonnell Foundation, David and Lucile Packard Foundation, U.S. Army Research Office and National Institutes of Health.


Role for enhancers in bursts of gene activity (Cell)


By Marisa Sanders for the Office of the Dean for Research

A new study by researchers at Princeton University suggests that sporadic bursts of gene activity may be important features of genetic regulation rather than just occasional mishaps. The researchers found that snippets of DNA called enhancers can boost the frequency of bursts, suggesting that these bursts play a role in gene control.

The researchers analyzed videos of Drosophila fly embryos undergoing DNA transcription, the first step in the activation of genes to make proteins. In a study published on July 14 in the journal Cell, the researchers found that placing enhancers in different positions relative to their target genes resulted in dramatic changes in the frequency of the bursts.

“The importance of transcriptional bursts is controversial,” said Michael Levine, Princeton’s Anthony B. Evnin ’62 Professor in Genomics and director of the Lewis-Sigler Institute for Integrative Genomics. “While our study doesn’t prove that all genes undergo transcriptional bursting, we did find that every gene we looked at showed bursting, and these are the critical genes that define what the embryo is going to become. If we see bursting here, the odds are we are going to see it elsewhere.”

The transcription of DNA occurs when an enzyme known as RNA polymerase converts the DNA code into a corresponding RNA code, which is later translated into a protein. About ten years ago, researchers were puzzled to find that transcription can be sporadic and variable rather than smooth and continuous.

In the current study, Takashi Fukaya, a postdoctoral research fellow, and Bomyi Lim, a postdoctoral research associate, both working with Levine, explored the role of enhancers in transcriptional bursting. Enhancers are recognized by DNA-binding proteins that augment or diminish transcription rates, but the exact mechanisms are poorly understood.

Until recently, visualizing transcription in living embryos was impossible due to limits in the sensitivity and resolution of light microscopes. A new method developed three years ago has now made that possible. The technique, developed by two separate research groups, one at Princeton led by Thomas Gregor, associate professor of physics and the Lewis-Sigler Institute for Integrative Genomics, and the other led by Nathalie Dostatni at the Curie Institute in Paris, involves placing fluorescent tags on RNA molecules to make them visible under the microscope.

The researchers used this live-imaging technique to study fly embryos at a key stage in their development, approximately two hours after the onset of embryonic life, when the genes undergo fast and furious transcription for about one hour. During this period, the researchers observed a significant ramping up of bursting, in which the RNA polymerase enzymes cranked out a newly transcribed segment of RNA every 10 or 15 seconds over a period of perhaps 4 or 5 minutes per burst. The genes then relaxed for a few minutes, followed by another episode of bursting.
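The on-off timing described above resembles a two-state ("telegraph") process. The sketch below plugs in the article's numbers purely as an illustrative caricature; the dwell-time distributions and parameter values are assumptions for this example, not the study's analysis.

```python
import random

random.seed(1)

def simulate_bursting(total_s=3600, on_s=270, off_s=150, init_s=12.5):
    """Two-state caricature of transcriptional bursting: the gene alternates
    between an ON window of ~4-5 minutes, during which RNA polymerase
    initiates a new transcript every ~10-15 seconds, and an OFF window of
    a few minutes. Returns polymerase initiation times over one hour."""
    t, on, events = 0.0, True, []
    while t < total_s:
        dwell = random.expovariate(1.0 / (on_s if on else off_s))
        if on:
            n = int(dwell / init_s)  # initiations fit inside the ON window
            events.extend(t + i * init_s for i in range(1, n + 1))
        t += dwell
        on = not on
    return [e for e in events if e < total_s]

events = simulate_bursting()
print(f"{len(events)} initiation events in one hour")
```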

The team then looked at whether the location of the enhancer – either upstream from the gene or downstream – influenced the amount of bursting. In two different experiments, Fukaya placed the enhancer either upstream of the gene’s promoter or downstream of the gene, and saw that the different enhancer positions resulted in distinct responses. When the researchers positioned the enhancer downstream of the gene, they observed periodic bursts of transcription. However, when they positioned the enhancer upstream of the gene, the researchers saw some fluctuations but no discrete bursts. They found that the closer the enhancer is to the promoter, the more frequent the bursting.

To confirm their observations, Lim applied further data analysis methods to tally the amount of bursting that they saw in the videos. The team found that the frequency of the bursts was related to the strength of the enhancer in upregulating gene expression. Strong enhancers produced more bursts than weak enhancers. The team also showed that inserting a segment of DNA called an insulator reduced the number of bursts and dampened gene expression.

In a second series of experiments, Fukaya showed that a single enhancer can simultaneously activate two genes that are located some distance apart on the genome and have separate promoters. It was originally thought that such an enhancer would facilitate bursting at one promoter at a time—that is, it would arrive at a promoter, linger, produce a burst, and come off, then randomly select one of the two genes for another round of bursting. Instead, the researchers observed bursting occurring simultaneously at both genes.

“We were surprised by this result,” Levine said. “Back to the drawing board! This means that traditional models for enhancer-promoter looping interactions are just not quite correct,” Levine said. “It may be that the promoters can move to the enhancer due to the formation of chromosomal loops. That is the next area to explore in the future.”

The study was funded by grants from the National Institutes of Health (U01EB021239 and GM46638).

Access the paper here:

Takashi Fukaya, Bomyi Lim and Michael Levine. “Enhancer Control of Transcriptional Bursting,” Cell (2016). Published July 14; Epub ahead of print June 9. http://dx.doi.org/10.1016/j.cell.2016.05.025

Study of individual neurons in flies reveals memory-related changes in gene activity (Cell Reports)

Image of the Drosophila brain (magenta) with a subset of mushroom body neurons expressing green fluorescent protein (GFP) via a genetic marker. This marker was used to harvest the neurons following the learning and memory assay. (Credit: Crocker, et al.)

By Kristin Qian for the Office of the Dean for Research

Researchers at Princeton University have developed a highly sensitive and precise method to explore genes important for memory formation within single neurons of the Drosophila fly brain. With this method, the researchers found an unexpected result: certain genes involved in creating long-term memories in the brain are the same ones that the eye uses for sensing light.

The study, published in the May 17 issue of the journal Cell Reports, demonstrated the utility of the new method and also identified new patterns of gene expression that drive long-term memory formation.

“Ultimately, to understand the brain, we want to know what individual neurons are doing,” said Mala Murthy, assistant professor in the Princeton Neuroscience Institute and the Department of Molecular Biology. “We found that single neurons can be defined by the pattern of their gene expression, even if they’re all in the same brain network.”

To their surprise, the researchers found that many of the active genes in these neurons produce proteins that are best known for their roles in detecting light in the fly’s eye or sensing odor in the fly’s nose. “It is possible that these sensory proteins have been repurposed by the brain for a different function,” Murthy said.

“Even though the paper is focused on the methodology, which I think will be impactful for the field, there is this new science here—a whole new class of molecules we found that is in the central brain and seems to be involved in memory formation,” Murthy said.

Researchers have known that genes “turn on,” or start making proteins, during the formation of long-term memories in Drosophila, a widely used organism in studies of neurobiology, but they didn’t know exactly which genes in which neurons were involved.

To investigate this question, the researchers first trained flies to form long-term memories. Then they extracted single neurons from the fly brains and evaluated all of the gene readouts, or transcripts, which encode proteins. By comparing the transcripts of the memory-trained flies to those of non-trained flies, researchers were able to identify genes involved in long-term memory formation.

The task was complicated by the tiny size of the fly’s head, which is just one millimeter across and contains fewer than 100,000 neurons. Murthy’s team focused on neuron types in one part of the brain, the mushroom body, named for its distinctive shape.

First author Amanda Crocker, a former postdoctoral fellow in Murthy’s lab and now an assistant professor of neuroscience at Middlebury College, conducted the experiments in collaboration with co-authors Xiao-Juan Guan, a senior research specialist in the Princeton Neuroscience Institute; Coleen Murphy, professor of molecular biology and the Lewis-Sigler Institute for Integrative Genomics; and Murthy.

“Our work opens up the ability to use Drosophila as a way to study how gene expression in single neurons relates to brain function,” Crocker said. “This has been a challenge because the fly brain is very small and contains fewer neurons than other organisms that neuroscientists study. The advantage of using flies is that they have significantly less redundancy in the neurons that they do have. We can look at specific neurons and gene expression, and ask what the genes are doing in that cell to cause the behavior.”

The researchers trained the flies to form long-term memories by exposing them to an odor – either an earthy, mushroom-like smell (3-octanol) or a menthol-like smell (4-methylcyclohexanol) – while simultaneously delivering a negative stimulus in the form of an electric shock.

Flies experience two odor spaces in each tube. If neither odor has been paired with electric shock, flies spend an equal amount of time on both sides of the tube (control). If one of the odors is paired with electric shock, flies avoid that side of the tube. For example, flies trained to associate the odor 3-OCT with electric shock avoid the red side (containing 3-OCT) of the tube. (Credit: Crocker, et al.)

The training took place in a tube containing the two odors, one at each end of the tube. Researchers paired one of the odors with the electric shock, and as a result the fly avoided that end of the tube. The assay was conducted in the dark, so that the flies could use only their sense of smell, not their vision, to navigate the tube.

A second group of flies received the electric shock and the odor, but not at the same time, so they did not form the memory that linked odor to shock.

The researchers then isolated single neurons from the fly brains using tiny glass tubes to suction out the cells. Harvesting neurons using this technique is not common, Murthy said, and it had not been combined with a complete analysis of gene activity in fly neurons before. With this novel method, they were able to use only 10 to 90 femtograms – a quintillionth of a kilogram – of genetic material.

They evaluated gene activity by looking at the production of messenger ribonucleic acid (mRNA), an intermediary between DNA and proteins. The result is a “transcriptome,” or readout of all of the genetic messages that the cell uses to produce proteins. The researchers then read the transcriptome to see which genes produced proteins in the memory-trained flies versus the non-trained flies, and found that some of the active genes in memory-trained flies were the same as ones used in the sensory organs to detect light, odors and taste.
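The trained-versus-untrained comparison described above is, in essence, a differential expression screen. The sketch below illustrates the idea with hypothetical gene names and invented transcript counts (none of these numbers come from the study):

```python
from math import log2

# Hypothetical per-gene transcript counts (illustrative numbers only):
trained = {"rhodopsin-like": 120, "odor-receptor-like": 85, "housekeeping": 400}
untrained = {"rhodopsin-like": 30, "odor-receptor-like": 20, "housekeeping": 410}

def log2_fold_change(gene, pseudo=1.0):
    """Compare trained vs. non-trained transcript abundance for one gene.
    A pseudocount keeps the ratio defined when a count is zero."""
    return log2((trained[gene] + pseudo) / (untrained[gene] + pseudo))

# Genes more than 2x up in memory-trained flies are candidate memory genes.
upregulated = [g for g in trained if log2_fold_change(g) > 1.0]
print(upregulated)  # ['rhodopsin-like', 'odor-receptor-like']
```

Real single-cell analyses add normalization and statistical testing on top of this fold-change idea, since femtogram-scale samples are noisy.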

To follow up, the researchers bred mutant flies that lacked genes for some of the light-sensing proteins and thus could not see. The same memory experiments as before were carried out, and the researchers confirmed that the flies lacking light-sensing proteins were both unable to see and unable to form long-term memories.

The discovery of the expression of genes for classical ‘light-sensing’ proteins, such as rhodopsin, as well as other sensory-related proteins for odor and taste detection, was unexpected because these proteins were not known to be utilized in mushroom bodies, Murthy said. Although studies in other organisms, including humans, have detected sensory genes in areas of the brain unrelated to the sensory organ itself, this may be the first study to link these genes to memory formation.

The study was funded by a National Institutes of Health Ruth L. Kirschstein Institutional National Research Service Award, the Alfred P. Sloan Foundation, the Human Frontier Science Program, a National Science Foundation (NSF) CAREER award, the McKnight Endowment Fund for Neuroscience, the Klingenstein Foundation, a National Institutes of Health New Innovator award, and an NSF BRAIN Initiative EAGER award. The study was also funded in part through Princeton’s Glenn Center for Quantitative Aging Research, directed by Coleen Murphy.

The paper, “Cell-Type-Specific Transcriptome Analysis in the Drosophila Mushroom Body Reveals Memory-Related Changes in Gene Expression,” was published in the May 17 issue of Cell Reports.

Read the journal article.

Scientists capture the elusive structure of essential digestive enzyme (JACS)

Stylized graphic of data on the structure of an active form of an important digestive enzyme, phenylalanine hydroxylase. The cyan cross-section shows the elution profile and the magenta cross-section shows the scattering profile. At right is the structure of the activated enzyme. (Image source: Ando et al.)

By Tien Nguyen, Department of Chemistry

Using a powerful combination of techniques from biophysics to mathematics, researchers have revealed new insights into the mechanism of a liver enzyme that is critical for human health. The enzyme, phenylalanine hydroxylase, turns the essential amino acid phenylalanine – found in eggs, beef and many other foods, and as an additive in diet soda – into tyrosine, a precursor for multiple important neurotransmitters.

“We need phenylalanine hydroxylase to control levels of phenylalanine in the blood because too much is toxic to the brain,” said Steve Meisburger, lead author on the study and a postdoctoral researcher in the Ando lab. Genetic mutations in phenylalanine hydroxylase can lead to disorders such as phenylketonuria, an inherited condition that can cause intellectual and behavioral disabilities unless detected at birth and managed through dietary restrictions.

Published earlier this month in the Journal of the American Chemical Society, the article presented detailed structural data on the enzyme’s active state – the shape it adopts when performing its chemical duties – that has eluded scientists for years.

“It’s a floppy enzyme, which means it’s dynamic,” said Nozomi Ando, an assistant professor of chemistry at Princeton and corresponding author on the paper. “That also means it doesn’t like to crystallize,” she said. This is problematic for the classic method used to study enzymatic structure, known as x-ray crystallography, which requires solid crystal samples. Efforts to crystallize phenylalanine hydroxylase have only recently met with success, and even then captured the enzyme only in its inactive state.

The researchers in the Ando lab were able to bypass the tricky task of growing crystals of the active enzyme by using their expertise in a special technique akin to crystallography, called small-angle x-ray scattering (SAXS), which allows scientists to study enzymes in solution. And because the enzyme is susceptible to aggregation, or clumping up, in solution, the researchers coupled their scattering method with a purification technique called size-exclusion chromatography (SEC), in which different species in a sample flow through a column at different speeds based on their size.

Steve Meisburger (left) and Nozomi Ando (right)

“Pairing SEC with SAXS is an emergent technique. Our contribution is that we saw a clever way to use it,” Ando said. The experiment is highly specialized and relies on powerful x-rays emitted by particles speeding around the circular track at a synchrotron facility. The research team traveled from Princeton to the Cornell High Energy Synchrotron Source in Ithaca, New York, for multiple intensive data-collection sessions. “Any time on the machine that is available, we use it. Not a single photon gets wasted,” Ando said.

As the enzyme solution flows out of the purification column and across the path of the x-ray beam, the researchers record snapshots of the x-ray scattering patterns. The resulting dataset is quite complex because the sample also contains phenylalanine, the compound that “turns on” phenylalanine hydroxylase so that the researchers can catch the dynamic enzyme in action.

“Current approaches for analyzing this type of dataset are very crude,” Meisburger said. Essentially, these methods assume that each signal – known as an elution peak – represents a single species, when each peak is actually a mixture of species. In this work, the team used an advanced linear algebra method known as evolving factor analysis that allowed them to separate the scattering components. “We can use these linear algebra methods to ‘un-mix’ species that are overlapping,” Meisburger said. “That’s the piece that I think is really exciting.”
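The "un-mixing" idea can be illustrated with basic linear algebra on synthetic data. The sketch below is a simplified construction, not the paper's evolving-factor-analysis implementation: two species with overlapping elution peaks and distinct (invented) scattering profiles are summed into a frames-by-angles data matrix, and a singular value decomposition reveals that exactly two components hide inside the overlapping peaks.

```python
import numpy as np

frames = np.arange(100)                      # snapshots along the elution
q = np.linspace(0.01, 0.5, 200)              # scattering-vector grid

# Two species with overlapping Gaussian elution peaks...
conc = np.vstack([
    np.exp(-0.5 * ((frames - 40) / 8.0) ** 2),   # species A elutes first
    np.exp(-0.5 * ((frames - 55) / 8.0) ** 2),   # species B overlaps A
])
# ...and distinct scattering profiles (arbitrary decaying shapes):
profiles = np.vstack([np.exp(-q * 5), np.exp(-q * 15)])

data = conc.T @ profiles                     # each frame is a mixture
s = np.linalg.svd(data, compute_uv=False)

# Two significant singular values; everything beyond rank 2 is negligible,
# which is how factor analysis counts the species in the overlapping peaks.
print(s[:4])
```

Evolving factor analysis goes further by tracking how these singular values appear and disappear as frames are added, which pins down where each species enters and leaves the elution.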

By applying their unique approach, the researchers were able to provide evidence for a model of the active structure of phenylalanine hydroxylase that builds upon recent work by their collaborators in Paul Fitzpatrick’s group at UT Health Science Center at San Antonio. In this model, two phenylalanine molecules dock to a pair of sites on the enzyme, bringing a pair of arms together and freeing up the active sites for doing chemistry once more phenylalanine molecules come along.

“I’m very proud that this is our first paper [published since Ando joined the faculty at Princeton]. We wanted it to be very quantitative and heavy on the biochemistry plus heavy on the physical chemistry. I’m really pleased with the way it turned out,” Ando said.

This work was supported by National Institutes of Health grants GM100008 and GM098140 and Welch Foundation grant AQ-1245.

Access the paper here:

Meisburger, S. P.; Taylor, A. B.; Khan, C. A.; Zhang, S.; Fitzpatrick, P. F.; Ando, N. “Domain movements upon activation of phenylalanine hydroxylase characterized by crystallography and chromatography-coupled small-angle X-ray scattering.” J. Am. Chem. Soc. 2016, 138 (20), 6506–6516. DOI: 10.1021/jacs.6b01563. Published online May 4, 2016.



Theorists smooth the way to solving one of quantum mechanics’ oldest problems: Modeling quantum friction (J. Phys. Chem. Letters)

From left to right: Herschel Rabitz, Renan Cabrera, Andre Campos and Denys Bondar. (Photo credit: C. Todd Reichart)

By Tien Nguyen, Department of Chemistry

Theoretical chemists at Princeton University have pioneered a strategy for modeling quantum friction, or how a particle’s environment drags on it, a vexing problem in quantum mechanics since the birth of the field. The study was published in the Journal of Physical Chemistry Letters.

“It was truly a most challenging research project in terms of technical details and the need to draw upon new ideas,” said Denys Bondar, a research scholar in the Rabitz lab and corresponding author on the work.

Quantum friction may operate at the smallest scale, but its consequences can be observed in everyday life. For example, when fluorescent molecules are excited by light, it’s because of quantum friction that the atoms are returned to rest, releasing photons that we see as fluorescence. Realistically modeling this phenomenon has stumped scientists for almost a century and recently has gained even more attention due to its relevance to quantum computing.

“The reason why this problem couldn’t be solved is that everyone was looking at it through a certain lens,” Bondar said. Previous models attempted to describe quantum friction by considering the quantum system as interacting with a surrounding, larger system. This larger system presents an impossible amount of calculations, so in order to simplify the equations to the pertinent interactions, scientists introduced numerous approximations.

These approximations led to numerous different models that could each only satisfy one or the other of two critical requirements. In particular, they could either produce useful observations about the system, or they could obey the Heisenberg Uncertainty Principle, which states that there is a fundamental limit to the precision with which a particle’s position and momentum can be simultaneously measured. Even famed physicist Werner Heisenberg’s attempt to derive an equation for quantum friction was incompatible with his own uncertainty principle.
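For reference, the uncertainty principle bounds the product of the position and momentum spreads by ħ/2, with equality for a Gaussian wavepacket. The short numerical check below is purely illustrative (it has nothing to do with the friction model itself; the width and grid are arbitrary choices): it builds a Gaussian wavepacket, computes Δx from |ψ(x)|² and Δp from the Fourier transform, and confirms the product sits at the Heisenberg bound.

```python
import numpy as np

hbar = 1.0                                   # work in units where hbar = 1
N = 4096
x = np.linspace(-20, 20, N, endpoint=False)
dx = x[1] - x[0]

# A Gaussian wavepacket: the minimum-uncertainty state.
sigma = 1.3
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt((np.abs(psi)**2).sum() * dx)  # normalize

# Position uncertainty from |psi|^2.
prob_x = np.abs(psi)**2
mean_x = (x * prob_x).sum() * dx
dx_unc = np.sqrt(((x - mean_x)**2 * prob_x).sum() * dx)

# Momentum uncertainty from the Fourier transform of psi.
p = 2 * np.pi * np.fft.fftfreq(N, d=dx) * hbar
dp = p[1] - p[0]
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= prob_p.sum() * dp                  # normalize
mean_p = (p * prob_p).sum() * dp
dp_unc = np.sqrt(((p - mean_p)**2 * prob_p).sum() * dp)

print(dx_unc * dp_unc)                       # ~ hbar/2 = 0.5
```

Any valid model of quantum friction must keep this product at or above ħ/2 as the system relaxes; that is the constraint earlier friction models violated.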

The researchers’ approach, called operational dynamic modeling (ODM) and introduced in 2012 by the Rabitz group, led to the first model for quantum friction to satisfy both demands. “To succeed with the problem, we had to literally rethink the physics involved, not merely mathematically but conceptually,” Bondar said.

Bondar and his colleagues focused on the two ultimate requirements for their model – that it should obey the Heisenberg principle and produce real observations – and worked backwards to create the proper model.

“Rather than starting with approximations, Denys and the team built in the proper physics in the beginning,” said Herschel Rabitz, the Charles Phelps Smyth ’16 *17 Professor of Chemistry and co-author on the paper. “The model is built on physical and mathematical truisms that must hold. This distinct approach creates a new rigorous and practical formulation for quantum friction,” he said.

The research team included research scholar Renan Cabrera and Ph.D. student Andre Campos as well as Shaul Mukamel, professor of chemistry at the University of California, Irvine.

Their model opens a way forward to understand not only quantum friction but other dissipative phenomena as well. The researchers are interested in exploring the means to manipulate these forces to their advantage. Other theorists are rapidly taking up the new paradigm of operational dynamic modeling, Rabitz said.

Reflecting on how they arrived at such a novel approach, Bondar recalled the unique circumstances under which he first started working on this problem. After he received the offer to work at Princeton, Bondar spent four months awaiting a US work visa (he is a citizen of Ukraine) and pondering fundamental physics questions. It was during this time that he first thought of this strategy. “The idea was born out of bureaucracy, but it seems to be holding up,” Bondar said.

Read the full article here:

Bondar, D. I.; Cabrera, R.; Campos, A.; Mukamel, S.; Rabitz, H. A. “Wigner–Lindblad Equations for Quantum Friction.” J. Phys. Chem. Lett. 2016, 7, 1632.

This work was supported by the US National Science Foundation CHE 1058644, the US Department of Energy DE-FG02-02ER-15344, and ARO-MURI W911NF-11-1-0268.

PPPL scientists challenge conventional understanding and improve predictions of the bootstrap current at the edge of fusion plasmas (Physics of Plasmas)

Simulation shows trapped electrons at left and passing electrons at right that are carried in the bootstrap current of a tokamak. Credit: Kwan-Liu Ma, University of California, Davis.

By John Greenwald, PPPL Office of Communications

Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have challenged understanding of a key element in fusion plasmas. At issue has been an accurate prediction of the size of the “bootstrap current” — a self-generating electric current — and an understanding of what carries the current at the edge of plasmas in doughnut-shaped facilities called tokamaks. This bootstrap-generated current combines with the current in the core of the plasma to produce a magnetic field to hold the hot gas together during experiments, and can produce stability at the edge of the plasma.

The recent work, published in the April issue of the journal Physics of Plasmas, focuses on the region at the edge in which the temperature and density drop off sharply. In this steep gradient region — or pedestal — the bootstrap current is large, enhancing the confining magnetic field but also triggering instability in some conditions.

The bootstrap current appears in a plasma when the pressure is raised. It was first discovered at the University of Wisconsin by Stewart Prager, now director of PPPL, and Michael Zarnstorff, now deputy director for research at PPPL. Prager was Zarnstorff’s thesis advisor at the time.

Physics understanding and accurate prediction of the size of the current at the edge of the plasma is essential for predicting its effect on instabilities that can diminish the performance of fusion reactors. Such understanding will be vital for ITER, the international tokamak under construction in France to demonstrate the feasibility of fusion power. This work was supported by the DOE Office of Science (FES).

The new paper, by physicists Robert Hager and C.S. Chang, leader of the Scientific Discovery through Advanced Computing project’s Center for Edge Physics Simulation headquartered at PPPL, reports that the bootstrap current in the tokamak edge is mostly carried by “magnetically trapped” electrons that cannot travel as freely as the “passing” electrons in the plasma. The trapped particles bounce between two points in the tokamak while the passing particles swirl all the way around it.
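The trapped-versus-passing distinction can be made concrete with a textbook mirror-trapping estimate (this is a standard argument, not the XGCa calculation, and the aspect ratio chosen is hypothetical). On a flux surface with inverse aspect ratio ε, the field strength varies around the torus, and a particle starting at the outboard midplane is trapped when its parallel velocity satisfies v∥²/v² < 2ε/(1+ε), giving a trapped fraction of roughly √(2ε/(1+ε)) for an isotropic distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Inverse aspect ratio of the flux surface (hypothetical value).
eps = 0.3

# On the surface, B = B0 / (1 + eps*cos(theta)), so the mirror ratio
# between the outboard midplane (B_min) and the inboard side (B_max)
# gives the trapping condition: v_par^2 / v^2 < 2*eps / (1 + eps).
threshold = 2 * eps / (1 + eps)

# Isotropic velocities: pitch lambda = v_par/v is uniform on [-1, 1].
lam = rng.uniform(-1.0, 1.0, size=1_000_000)
trapped_frac = np.mean(lam**2 < threshold)

analytic = np.sqrt(threshold)
print(f"Monte Carlo trapped fraction: {trapped_frac:.3f}")
print(f"Analytic sqrt(2*eps/(1+eps)): {analytic:.3f}")
```

At realistic edge aspect ratios a large minority of electrons are trapped, which is why a current carried mainly by trapped electrons, as Hager and Chang report, can be substantial.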

The discovery challenges conventional understanding and provides an explanation of how the bootstrap current can be so large at the tokamak edge, where the passing electron population is small. Previously, physicists thought that only the passing electrons carry the bootstrap current. “Correct modeling of the current enables accurate prediction of the instabilities,” said Hager, the lead author of the paper.

The researchers performed the study by running an advanced global code called “XGCa” on the Mira supercomputer at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility located at the Department’s Argonne National Laboratory. Researchers turned to the new global code, which models the entire plasma volume, because simpler local computer codes can become inadequate and inaccurate in the pedestal region.

Numerous XGCa simulations led Hager and Chang to construct a new formula that greatly improves the accuracy of bootstrap current predictions. The new formula was found to fit well with all the XGCa cases studied and could easily be implemented into modeling or analysis codes.

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Read the abstract and article.

The paper, “Gyrokinetic neoclassical study of the bootstrap current in the tokamak edge pedestal with fully non-linear Coulomb collisions,” by Robert Hager and C.S. Chang, was published in the April 2016 issue of Physics of Plasmas, doi: 10.1063/1.4945615.

Support for this work was provided through the Scientific Discovery through Advanced Computing (SciDAC) program funded by the U.S. Department of Energy Office of Advanced Scientific Computing Research and the Office of Fusion Energy Sciences.

Dopamine neurons have a role in movement, new study finds (Nature Neuroscience)

Dopamine neurons. Image credit: Witten et al., Nature Neuroscience

By Catherine Zandonella, Office of the Dean for Research

Princeton University researchers have found that dopamine – a brain chemical involved in learning, motivation and many other functions – also has a direct role in representing or encoding movement. The finding could help researchers better understand dopamine’s role in movement-related disorders such as Parkinson’s disease.

The researchers used a new, more precise technique to record the activity of dopamine neurons at two regions within a part of the brain known as the striatum, which oversees action planning, motivation and reward perception. The researchers found that while all of the neurons carried signals needed to learn and plan movement, one of the nerve bundles, the one that went to the region called the dorsomedial striatum, also carried a signal that could be used to control movements.

The work was published online in the journal Nature Neuroscience this week.

“What we learned from this study is that dopamine neurons that go to one part of the brain act differently than dopamine neurons that go to another part of the brain,” said Ilana Witten, assistant professor of psychology and the Princeton Neuroscience Institute. “This is contrary to what has been the mainstream view of dopamine neurons.”

The research may shed light on how Parkinson’s disease, which involves the destruction of dopamine neurons in the dorsomedial striatum, deprives patients of the ability to move. Previous studies have failed to find a direct link between dopamine neuron activity and the control of movement or actions. Instead, the mainstream view suggested an indirect role for dopamine: the neurons make it possible for us to learn which actions are likely to lead to a rewarding experience, which in turn enables us to plan to take that action. When dopamine neurons are destroyed, the individual cannot learn to plan actions and thus cannot move.

The new study affirmed the role of dopamine in reward-based learning, but also found that in the dorsomedial striatum, dopamine neurons can play a direct role in movement. The researchers used a method for measuring neuron activity at very precise locations in the brain. They measured the activity at the ends of neurons – the terminals where dopamine is released into the junction, or synapse, between two cells – in two locations in the striatum: the nucleus accumbens, known to be involved in processing reward, and the dorsomedial striatum, known for evaluating and generating actions.

Until recently, it has been difficult to measure dopamine neuron activity in these regions due to the small size of the regions and the fact that there are many other neurons present that are delivering other brain chemicals, or neurotransmitters, to the same areas of the brain.

To restrict their measurements to only dopamine-carrying neurons, the researchers used mice whose brains carry genetically altered cells that glow green when active. The mice also contain a second gene that ensured that the glowing could only occur when dopamine was present.

The researchers then recorded neuron activity from either the nucleus accumbens or the dorsomedial striatum by inserting a very thin optical fiber into each region to record the fluorescing dopamine cells in only the desired regions.

Once the ability to measure neuron activity was in place, the researchers gave the mice a task that involved both reward-based learning as well as movement.

The task involved presenting the mice with two levers, one of which, when pressed, gave a drink of sweetened water. Through trial and error, the mice learned which lever would give the reward. During the task, the researchers recorded their brain activity.

The task is analogous to playing slot machines at a casino. Picture yourself at a casino with two slot machines in front of you. You pull the lever on the machine to your left and it spits out some coins. Your brain learns that the left lever leads to a reward, so you plan and execute an action: you pull the left lever again. After a few more pulls on the left lever without a reward, you switch to the machine on the right.

When an action is rewarding, you are likely to remember it, an important step in learning. The difference between how much reward you expect, and how much you get, is also important, because it tells you whether or not something is new and how much you should pay attention to it. Researchers call this gap between your predicted reward and the reward you actually get the “reward-prediction error” and consider it an important teaching signal.
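The reward-prediction error described above is the core of temporal-difference models of dopamine. The sketch below is a minimal illustration of the two-lever task, not the authors’ model: the payout probabilities, learning rate, and softmax temperature are all invented. Each trial, the agent picks a lever, compares the reward it got to the reward it expected, and nudges its value estimate by that error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-lever bandit: lever 0 pays out 80% of the time, lever 1 pays 20%.
# (Hypothetical probabilities, chosen for illustration.)
p_reward = [0.8, 0.2]

V = np.zeros(2)      # learned value estimate for each lever
alpha = 0.1          # learning rate

choices = []
for trial in range(1000):
    # Softmax action selection: mostly pick the lever currently valued higher.
    prefs = np.exp(3.0 * V)
    a = rng.choice(2, p=prefs / prefs.sum())

    r = float(rng.random() < p_reward[a])   # 1 if the lever pays out
    delta = r - V[a]                        # reward-prediction error
    V[a] += alpha * delta                   # learning update

    choices.append(a)

print("learned values:", V)   # V[0] approaches 0.8, V[1] approaches 0.2
```

The surprise signal `delta` shrinks as predictions improve, which is exactly the "teaching signal" role the mainstream view assigns to dopamine; the new study's point is that some dopamine terminals carry movement information on top of it.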

By matching the mice’s actions to the dopamine activity in their brains during these tasks, the researchers could determine which parts of the brain were active during reward-based learning, and which parts were active when choosing to press a lever. Assistance with computational modeling of the mice’s behaviors was provided by Nathaniel Daw, a professor of neuroscience and psychology at the Princeton Neuroscience Institute.

The researchers found that the dopamine neurons that innervate the nucleus accumbens and the dorsomedial striatum did indeed encode reward-prediction cues, which is consistent with previous findings. But they also found that in the dorsomedial striatum, the dopamine neurons carried information about what actions the animal is going to take.

“This idea was that dopamine neurons carry this reward-prediction error signal, and that could indirectly affect movement or actions, because if you don’t have this, you won’t correctly learn which actions to perform,” Witten said. “We show that while this is true, it is certainly not the whole story. There is also a layer where dopamine is directly coding movement or actions.”

Nathan Parker, a graduate student in the Witten lab who designed and conducted the experiments and is first author on the paper, added that new findings were made possible both by the improvements in recording of neurons and by the experimental design, which gave researchers a detailed evaluation of neuron activity during a relatively complex task.

Additional research assistance was provided by Princeton postdoctoral research associates Courtney Cameron and Junuk Lee, and graduate student Jung Yoon Choi. Research Specialist Joshua Taliaferro, Class of 2015, began working on the project as part of his senior thesis. The study also involved contributions from Thomas Davidson, a postdoctoral researcher at Stanford University.

The study also addresses the general question of how dopamine can be involved in so many functions in the brain, Witten said. “We think that some of the way that dopaminergic neurons achieve such diverse functions in the brain is by having specific roles based on their anatomical target.”

Naoshige Uchida, a professor of molecular and cellular biology at Harvard University who was not involved in the study, said the results challenge long-held views and open up new directions of research. “This study by the Witten lab elegantly shows that the activity of some dopamine neurons is modulated by the direction of motion,” Uchida said. “More importantly, they found some of the clearest evidence indicating the heterogeneity of dopamine neurons: A specific population of dopamine neurons projecting to the dorsomedial striatum encodes movement direction more so compared to another population projecting to the ventral striatum.”

Uchida continued, “A similar phenomenon has also been reported in an independent study in non-human primates (Kim, et al., Cell, 2015), suggesting that the Witten lab finding is more universal and not specific to mice. This is particularly important because dopamine has been implicated in Parkinson’s disease but how dopamine regulates movement remains a large mystery.”

Funding for the study was provided by the Pew Charitable Trusts, the McKnight Foundation, the Brain & Behavior Research (NARSAD) Foundation, the Alfred P. Sloan Foundation, the National Institutes of Health, the National Science Foundation, and Princeton’s Stuart M. Essig ’83 and Erin S. Enright ’82 Fund for Innovation in Engineering and Neuroscience.

Read the abstract.

The study, “Reward and choice encoding in terminals of midbrain dopamine neurons depends on striatal target,” by Nathan Parker, Courtney Cameron, Joshua Taliaferro, Junuk Lee, Jung Yoon Choi, Thomas Davidson, Nathaniel Daw and Ilana Witten, was published online in the journal Nature Neuroscience (Advance Online Publication, http://dx.doi.org/10.1038/nn.4287).

Chemical tracers reveal oxygen-dependent switch in cellular pathway to fat (Nature Chemical Biology)

By Tien Nguyen, Department of Chemistry

Using tracer compounds, scientists have been able to track the cellular production of NADPH, a key coenzyme for making fat, through a pathway that has never been measured directly before.

By tracking this pathway, known as malic enzyme metabolism, which is one of a few recognized routes to make NADPH, researchers from the Rabinowitz lab discovered a novel switch in the way fat cells make NADPH depending on the presence of oxygen. The findings were published in Nature Chemical Biology.

Ling Liu (left) and Joshua Rabinowitz (right)

“No one had ever shown an environment-dependent switch in any NADPH production pathway,” said Joshua Rabinowitz, Professor of Chemistry and the Lewis-Sigler Institute for Integrative Genomics at Princeton and principal investigator of the work. “No one had the tools to look,” he said.

NADPH is critical to not only fat synthesis, but also protein and DNA synthesis, and antioxidant defense, implicating it in many diseases such as cancer and diabetes. By understanding and monitoring the pathways through which NADPH is made, scientists can work towards influencing these processes using therapeutic compounds.

The Rabinowitz lab first applied their tracer method in 2014 to study the best-known NADPH production pathway, the oxidative pentose phosphate pathway (oxPPP). The method relied on compounds labeled with deuterium atoms, hydrogen’s heavier cousin, which can be deployed in the cell and measured by a technique called mass spectrometry.

In this work, the researchers extended their method to probe the lesser-known malic enzyme pathway by developing two new, orthogonal tracer compounds specific to this pathway. One tracer, a deuterated succinate compound, enters the cycle more directly but is somewhat challenging for the cell to uptake, while the other, a deuterated glucose molecule, is taken up by the cell readily but takes an extra step to enter the pathway.

The research team investigated the malic enzyme pathway under various concentrations of oxygen. Low oxygen environments, which are found in fat cells in obesity, are of particular clinical interest. They found that in a low oxygen environment, the oxidative pentose phosphate pathway produced more NADPH than the malic enzyme pathway, but in a higher oxygen environment, the pathway contributions completely flipped.

“It’s like the cells are quite clever. They choose the pathway depending on what they want to make, and what nutrients they can access,” said Ling Liu, a graduate student in the Rabinowitz lab and lead author on the work.

One advantage of this method is that it tracks NADPH made specifically in the cytosolic compartment of the cell, whereas the previous leading technique, which relied on tracer compounds with carbon-13 atoms, is unable to differentiate between malic enzyme activity in the cytosol and mitochondria.

NADPH involvement in essential cellular processes has a direct impact on diseases such as diabetes, obesity and cancer. “All of these central biomedical questions depend on an understanding of NADPH pathways, and if you can’t quantify how a metabolite is made and used, you can’t understand what’s going on,” Rabinowitz said. “Ultimately, we’re trying to understand the fundamental chemistry that’s leading to these important biological outcomes,” he said.

Read the full article or abstract:

Liu, L.; Shah, S.; Fan, J.; Park, J. O.; Wellen, K. E.; Rabinowitz, J. D. “Malic enzyme tracers reveal hypoxia-induced switch in adipocyte NADPH pathway usage.” Nat. Chem. Biol. Published online March 21, 2016.

This work was supported by the US National Institutes of Health grants R01CA163591, R01AI097382 and P30DK019525 (to the University of Pennsylvania Diabetes Research Center).

Electrons slide through the hourglass on surface of bizarre material (Nature)

An illustration of the hourglass fermion predicted to lie on the surface of crystals of potassium mercury antimony. (Image credit: Laura R. Park and Aris Alexandradinata)

By Staff

A team of researchers at Princeton University has predicted the existence of a new state of matter in which current flows only through a set of surface channels that resemble an hourglass. These channels are created through the action of a newly theorized particle, dubbed the “hourglass fermion,” which arises due to a special property of the material. The tuning of this property can sequentially create and destroy the hourglass fermions, suggesting a range of potential applications such as efficient transistor switching.

In an article published in the journal Nature this week, the researchers theorize the existence of these hourglass fermions in crystals made of potassium and mercury combined with either antimony, arsenic or bismuth. The crystals are insulators in their interiors and on their top and bottom surfaces, but perfect conductors on two of their sides where the fermions create hourglass-shaped channels that enable electrons to flow.

The research was performed by Princeton University postdoctoral researcher Zhijun Wang and former graduate student Aris Alexandradinata, now a postdoctoral researcher at Yale University, working with Robert Cava, Princeton’s Russell Wellman Moore Professor of Chemistry, and Associate Professor of Physics B. Andrei Bernevig.

The new hourglass fermion exists – theoretically for now, until detected experimentally – in a family of materials broadly called topological insulators, which were first observed experimentally in the mid-2000s and have since become one of the most active and interesting branches of quantum physics research. The bulk, or interior, acts as an insulator, which means it prohibits the travel of electrons, but the surface of the material is conducting, allowing electrons to travel through a set of channels created by particles known as Dirac fermions.

Fermions are a family of subatomic particles that include electrons, protons and neutrons, but they also appear in nature in many lesser known forms such as the massless Dirac, Majorana and Weyl fermions. After years of searching for these particles in high-energy accelerators and other large-scale experiments, researchers found that they can detect these elusive fermions in table-top laboratory experiments on crystals. Over the past few years, researchers have used these “condensed matter” systems to first predict and then confirm the existence of Majorana and Weyl fermions in a wide array of materials.

The next frontier in condensed matter physics is the discovery of particles that can exist in the so-called “material universe” inside crystals but not in the universe at large. Such particles come about due to the properties of the materials but cannot exist outside the crystal the way other subatomic particles do. Classifying and discovering all the possible particles that can exist in the material universe is just beginning. The work reported by the Princeton team lays the foundations of one of the most interesting of these systems, according to the researchers.

In the current study, the researchers theorize that the laws of physics prohibit current from flowing in the crystal’s bulk and on its top and bottom surfaces, but permit electron flow in completely different ways on the side surfaces through the hourglass-shaped channels. This type of channel, known more precisely as a dispersion, was previously unknown.

The researchers then asked whether this dispersion is a generic feature found in certain materials or just a fluke arising from a specific crystal model.

It turned out to be no fluke.

A long-standing collaboration with Cava, a material science expert, enabled Bernevig, Wang, and Alexandradinata to uncover more materials exhibiting this remarkable behavior.

“Our hourglass fermion is curiously movable but unremovable,” said Bernevig. “It is impossible to remove the hourglass channel from the surface of the crystal.”

Bernevig explained that this robust property arises from the intertwining of spatial symmetries, which are characteristics of the crystal structure, with the modern band theory of crystals. Spatial symmetries in crystals are distinguished by whether a crystal can be rotated or otherwise moved without altering its basic character.

In a paper published in Physical Review X this week to coincide with the Nature paper, the team detailed the theory behind how the crystal structure leads to the existence of the hourglass fermion.

An illustration of the complicated dispersion of the surface fermion arising from a background of mercury and bismuth atoms (blue and red). (Image credit: Mingyee Tsang and Aris Alexandradinata)

“Our work demonstrates how this basic geometric property gives rise to a new topology in band insulators,” Alexandradinata said. The hourglass is a robust consequence of spatial symmetries that translate the origin by a fraction of the lattice period, he explained. “Surface bands connect one hourglass to the next in an unbreakable zigzag pattern,” he said.

The team found esoteric connections between their system and high-level mathematics. Origin-translating symmetries, also called non-symmorphic symmetries, are described by a field of mathematics called cohomology, which classifies all the possible crystal symmetries in nature. For example, cohomology gives the answer to how many crystal types exist in three spatial dimensions: 230.

In the cohomological perspective, there are 230 ways to combine origin-preserving symmetries with real-space translations, known as the “space groups.” The theoretical framework to understand the crystals in the current study requires a cohomological description with momentum-space translations.

“The hourglass theory is the first of its kind that describes time-reversal-symmetric crystals, and moreover, the crystals in our study are the first topological material class which relies on origin-translating symmetries,” added Wang.

Out of the 230 space groups in which materials can exist in nature, 157 are non-symmorphic, meaning they can potentially host interesting electronic behavior such as the hourglass fermion.

“The exploration of the behavior of these interesting fermions, their mathematical description, and the materials where they can be observed, is poised to create an onslaught of activity in quantum, solid state and material physics,” Cava said. “We are just at the beginning.”

The study was funded by the National Science Foundation, the Office of Naval Research, the David and Lucile Packard Foundation, the W. M. Keck Foundation, and the Eric and Wendy Schmidt Transformative Technology Fund at Princeton University.

The paper, “Hourglass fermions,” by Zhijun Wang, A. Alexandradinata, R. J. Cava and B. Andrei Bernevig, was published in the April 14, 2016 issue of Nature, 532, 189–194, doi: 10.1038/nature17410. Read the preprint.

The paper, “Topological insulators from group cohomology” by A. Alexandradinata, Zhijun Wang, and B. Andrei Bernevig, was published in the April 15, 2016 issue of Phys. Rev. X 6, 021008.