Genes for age-related cognitive decline found in adult worm neurons (Nature)

By Staff

A research team from Princeton University led by Coleen Murphy, professor of Molecular Biology and the Lewis-Sigler Institute for Integrative Genomics, has developed a new method for isolating neurons from adult C. elegans worms. The first panel shows worms containing neurons labeled with green fluorescent protein (GFP). The researchers extracted and purified the neurons using rapid, chilled chemomechanical disruption, then sorted them using fluorescence-activated cell sorting (FACS).

Researchers from Princeton University have identified genes important for age-related memory decline by studying adult worm neurons, a cell population that had not been profiled previously. The research, published in the journal Nature, could eventually point the way toward therapies to extend life and enhance health in aging human populations.

“The newly discovered genes regulate enhanced short-term memory as well as the ability to repair damaged neurons, two factors that play an important role in healthy aging,” said Coleen Murphy, a professor of Molecular Biology and the Lewis-Sigler Institute for Integrative Genomics, director of the Glenn Center for Quantitative Aging Research at Princeton, and senior author on the study. “Identifying the individual factors involved in neuron health in the worm is the first step to understanding human neuronal decline with age.”

The small soil-dwelling roundworm Caenorhabditis elegans contains genes that determine the rate of aging and overall health during aging. Mutations in one of these genetic pathways, the insulin/IGF-1 signaling (IIS) pathway, can double worm lifespan, and similar mutations have been found in long-lived humans.

But studying the IIS mutation in adult worm neurons was difficult because the adults have a thick, durable covering that protects the neurons.

Using a new technique they developed to break up the tough outer covering, researchers at Princeton succeeded in isolating adult neurons, which enabled the detection of the new set of genes regulated by the insulin/IGF-1 signaling pathway.

“Our technique enabled us to study gene expression in adult neurons, which are the cells that govern cognitive aspects such as memory and nerve-regeneration,” said Murphy, whose research on aging is funded in part by the National Institutes of Health. “Prior to this work, researchers were only able to examine gene regulation either using adult worms or individual tissues from young worms.”

The work allowed co-first authors Rachel Kaletsky and Vanisha Lakhina to explore why long-lived IIS mutants maintain memory and neuron-regeneration abilities with age. Until now, the known targets of the insulin longevity pathway were located mostly in the intestine and skin of the worm rather than the neurons. Kaletsky is a postdoctoral research fellow and Lakhina is a postdoctoral research associate in the Lewis-Sigler Institute.

Kaletsky worked out the new way to isolate neurons from adult worms and, with Lakhina, proceeded to profile gene activity in adult C. elegans neurons for the first time. They discovered that the IIS mutant worms express genes that keep neurons working longer, and that these genes are completely different from the previously known longevity targets. They also discovered a new factor responsible for regenerating damaged nerve fibers (axons) in adult worms, a finding that could have implications for human traumatic brain injury.

“Kaletsky and Lakhina developed a new technique that is going to be used by the entire worm community, so it really opens up new avenues of research even beyond the discoveries we describe in the paper,” Murphy said.

One of the newly identified genes, fkh-9, regulates both enhanced memory and neuronal regeneration in IIS mutants. Previous studies have detected only one other gene that regulates neuronal regeneration in the mutants, demonstrating the power of the technique to identify new gene regulators. The researchers also found that fkh-9 expression is required for the long lifespan of many IIS mutants, but that it acts outside neurons in that role, suggesting the gene governs multiple outcomes in the worm.

Murphy’s lab is now working to understand how fkh-9 works to influence memory, axon regeneration, and lifespan. The gene codes for a protein, FKH-9, that acts as a transcription factor, meaning it controls the expression of other genes and is likely part of a larger regulatory network. FKH-9 also appears to regulate different processes in different tissues: It is required in neurons for memory and axon repair, but not for lifespan. Murphy’s group is working to figure out how FKH-9 acts in distinct tissues to regulate such different processes.

The study provides a more complete picture of how IIS mutants control gene expression in different tissues to promote healthy aging, Murphy said.

“fkh-9 is likely only one of the exciting genes that will emerge from using this technique,” Murphy said. “By identifying the suite of IIS-regulated neuronal genes, there are many candidates for follow-up, only a fraction of which have been characterized in any great detail.”

Other contributors to the study included Rachel Arey, a postdoctoral research fellow; former graduate students April Williams and Jessica Landis; and Jasmine Ashraf, a research specialist in the Lewis-Sigler Institute.

Additional funding for the study was provided by the Keck Foundation, the Ruth L. Kirschstein National Research Service Awards, the National Science Foundation and the New Jersey Commission on Brain Injury Research.

Read the abstract.

Rachel Kaletsky, Vanisha Lakhina, Rachel Arey, April Williams, Jessica Landis, Jasmine Ashraf and Coleen T. Murphy. “The C. elegans adult neuronal IIS/FOXO transcriptome reveals adult phenotype regulators.” Nature, published online ahead of print Dec. 14, 2015. doi:10.1038/nature16483.

Using powerful computers, physicists uncover mechanism that stabilizes plasma (Physical Review Letters)

A cross-section of the virtual plasma showing where the magnetic field lines intersect the plane. The central section has field lines that rotate exactly once. Image Credit: Stephen Jardin, PPPL.

By Raphael Rosen, Princeton Plasma Physics Laboratory Communications

A team of physicists led by Stephen Jardin of the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) has discovered a mechanism that prevents the electrical current flowing through fusion plasma from repeatedly peaking and crashing. This behavior is known as a “sawtooth cycle” and can cause instabilities within the plasma’s core. The results have been published online in Physical Review Letters. The research was supported by the DOE Office of Science.

The team, which included scientists from General Atomics and the Max Planck Institute for Plasma Physics, performed calculations on the Edison computer at the National Energy Research Scientific Computing Center, a division of the Lawrence Berkeley National Laboratory. Using M3D-C1, a program they developed that creates three-dimensional simulations of fusion plasmas, the team found that under certain conditions a helix-shaped whirlpool of plasma forms around the center of the tokamak. The swirling plasma acts like a dynamo — a moving fluid that creates electric and magnetic fields. Together these fields prevent the current flowing through plasma from peaking and crashing.

The researchers found two specific conditions under which the plasma behaves like a dynamo. First, the magnetic field lines that circle the plasma must rotate exactly once, both the long way and the short way around the doughnut-shaped configuration, so that an electron or ion following a field line ends up exactly where it began. Second, the pressure in the center of the plasma must be significantly greater than at the edge, creating a gradient between the two sections. This gradient combines with the rotating magnetic field lines to create spinning rolls of plasma that swirl around the tokamak, giving rise to the dynamo that maintains equilibrium and produces stability.
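In standard tokamak notation (a gloss added here; the text above states the condition purely geometrically), the first requirement corresponds to a safety factor q of one:

```latex
% Safety factor q: number of toroidal (long-way) transits a field line
% makes per poloidal (short-way) transit. Large-aspect-ratio estimate,
% with r the minor radius, R the major radius, and B_phi, B_theta the
% toroidal and poloidal magnetic field components.
q(r) \approx \frac{r\,B_\phi}{R\,B_\theta}, \qquad
q = 1 \;\Rightarrow\; \text{the field line closes on itself after one circuit each way.}
```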

This dynamo behavior arises only under certain conditions. Both the electrical current running through the plasma and the pressure that the plasma’s electrons and ions exert on their neighbors must be in a range that is “not too large and not too small,” said Jardin. In addition, the speed at which the conditions for the fusion reaction are established must be “not too fast and not too slow.”

Jardin stressed that once a range of conditions like pressure and current are set, the dynamo phenomenon occurs all by itself. “We don’t have to do anything else from the outside,” he noted. “It’s something like when you drain your bathtub and a whirlpool forms over the drain by itself. But because a plasma is more complicated than water, the whirlpool that forms in the tokamak needs to also generate the voltage to sustain itself.”

During the simulations the scientists were able to virtually add new diagnostics, or probes, to the computer code. “These diagnostics were able to measure the helical velocity fields, electric potential, and magnetic fields to clarify how the dynamo forms and persists,” said Jardin. The persistence produces the “voltage in the center of the discharge that keeps the plasma current from peaking.”

Physicists have indirectly observed what they believe to be the dynamo behavior on the DIII-D National Fusion Facility that General Atomics operates for the Department of Energy in San Diego and on the ASDEX Upgrade in Garching, Germany. They hope to learn to create these conditions on demand, especially in ITER, the huge multinational fusion machine being constructed in France to demonstrate the practicality of fusion power. “Now that we understand it better, we think that computer simulations will show us under what conditions this will occur in ITER,” said Jardin. “That will be the focus of our research in the near future.”

Learning how to create these conditions will be particularly important for ITER, which will produce helium nuclei that could amplify the sawtooth disruptions. If large enough, these disruptions could cause other instabilities that could halt the fusion process. Preventing the cycle from starting would therefore be highly beneficial for the ITER experiment.

Read the abstract.

S.C. Jardin, N. Ferraro, and I. Krebs. “Self-Organized Stationary States of Tokamaks.” Physical Review Letters. Published November 17, 2015. DOI: http://dx.doi.org/10.1103/PhysRevLett.115.215001

This article is courtesy of the Princeton Plasma Physics Laboratory.

Warm nights could flood the atmosphere with carbon under climate change (PNAS)

Amazonian tropical rainforest near Manaus, Brazil. Photo courtesy of William Anderegg, Princeton University.

By Morgan Kelly, Office of Communications

The warming effects of climate change usually conjure up ideas of parched and barren landscapes broiling under a blazing sun, its heat amplified by greenhouse gases. But a study led by Princeton University researchers suggests that hotter nights may actually wield much greater influence over the planet’s atmosphere as global temperatures rise — and could eventually lead to more carbon flooding the atmosphere.

Since measurements began in 1959, nighttime temperatures in the tropics have had a strong influence over year-to-year shifts in the land’s carbon-storage capacity, or “sink,” the researchers report in the journal Proceedings of the National Academy of Sciences. Earth’s ecosystems absorb about 25 percent of the excess carbon from the atmosphere, and tropical forests account for about one-third of land-based plant productivity.

During the past 50 years, the land-based carbon sink’s “interannual variability” has grown by 50 to 100 percent, the researchers found. They used climate and satellite-imaging data to determine which of various climate factors — including rainfall, drought and daytime temperatures — had the greatest effect on the carbon sink’s swings, and found the strongest association with variations in tropical nighttime temperatures, which have risen by about 0.6 degrees Celsius since 1959.
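As a rough illustration of this kind of attribution analysis, the minimal Python sketch below ranks candidate drivers by their correlation with sink anomalies. The synthetic data and single-predictor correlations are stand-ins for the study's actual records and statistical methods:

```python
import numpy as np

# Illustrative attribution analysis on synthetic data: correlate
# year-to-year anomalies of the land carbon sink with candidate
# climate drivers and rank the drivers by strength of association.
rng = np.random.default_rng(0)
years = np.arange(1959, 2012)

# Synthetic, detrended driver anomalies, stand-ins for real records.
drivers = {
    "tropical nighttime temperature": rng.standard_normal(len(years)),
    "daytime temperature": rng.standard_normal(len(years)),
    "rainfall": rng.standard_normal(len(years)),
}

# Synthetic sink anomalies, constructed here to track nighttime
# temperature (warmer nights -> weaker sink).
sink = -0.8 * drivers["tropical nighttime temperature"] \
       + 0.2 * rng.standard_normal(len(years))

for name, series in drivers.items():
    r = np.corrcoef(series, sink)[0, 1]
    print(f"{name:35s} r = {r:+.2f}")
# The driver with the largest |r| is the leading candidate; the study
# found tropical nighttime temperature dominated.
```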

First author William Anderegg, an associate research scholar in the Princeton Environmental Institute, explained that he and his colleagues determined that warm nighttime temperatures lead plants to put more carbon into the atmosphere through a process known as respiration.

Just as people are more active on warm nights, so too are plants. Although plants take up carbon dioxide from the atmosphere, they also internally consume sugars to stay alive. That process, known as respiration, produces carbon dioxide. Plants step up respiration in warm weather, Anderegg said. The researchers found that yearly variations in the carbon sink strongly correlated with variations in plant respiration.

“When you heat up a system, biological processes tend to increase,” Anderegg said. “At hotter temperatures, plant respiration rates go up and this is what’s happening during hot nights. Plants lose a lot more carbon than they would during cooler nights.”
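The temperature sensitivity of respiration is often summarized with a textbook Q10 relation (a standard form, not a formula taken from the paper):

```latex
% Q10 form of the temperature response of respiration:
% R(T) is the respiration rate at temperature T, R_ref the rate at a
% reference temperature T_ref, and Q10 (typically ~2 for plants) the
% factor by which the rate grows per 10 degrees Celsius of warming.
R(T) = R_{\mathrm{ref}}\, Q_{10}^{\,(T - T_{\mathrm{ref}})/10}
```

With Q10 near 2, the roughly 0.6-degree-Celsius rise in tropical nighttime temperatures noted above would, all else being equal, raise nightly respiration by about 4 percent (2^(0.6/10) ≈ 1.04).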

Previous research has shown that nighttime temperatures have risen significantly faster as a result of climate change than daytime temperatures, Anderegg said. This means that in future climate scenarios respiration rates could increase to the point that the land is putting more carbon into the atmosphere than it’s taking out, “which would be disastrous,” he said.

Of course, plants consume carbon dioxide as a part of photosynthesis, during which they convert sunlight into energy. Photosynthesis also is sensitive to rises in temperature, but it occurs only during the day, whereas respiration occurs at all hours and thus is more sensitive to nighttime warming, Anderegg said.

“Nighttime temperatures have been increasing faster than daytime temperatures and will continue to rise faster,” Anderegg said. “This suggests that tropical ecosystems might be more vulnerable to climate change than previously thought, risking crossing the threshold from a carbon sink to a carbon source. But there’s certainly potential for plants to acclimate their respiration rates and that’s an area that needs future study.”

This research was supported by the National Science Foundation MacroSystems Biology Grant (EF-1340270), RAPID Grant (DEB-1249256) and EAGER Grant (1550932); and a National Oceanic and Atmospheric Administration (NOAA) Climate and Global Change postdoctoral fellowship administered by the University Corporation for Atmospheric Research.

William R. L. Anderegg, Ashley P. Ballantyne, W. Kolby Smith, Joseph Majkut, Sam Rabin, Claudie Beaulieu, Richard Birdsey, John P. Dunne, Richard A. Houghton, Ranga B. Myneni, Yude Pan, Jorge L. Sarmiento, Nathan Serota, Elena Shevliakova, Pieter Tans and Stephen W. Pacala. “Tropical nighttime warming as a dominant driver of variability in the terrestrial carbon sink.” Proceedings of the National Academy of Sciences, published online ahead of print Dec. 7, 2015. DOI: 10.1073/pnas.1521479112.


PPPL physicists propose new plasma-based method to treat radioactive waste (Journal of Hazardous Materials)

Securing a shipment of mixed, low-level waste from Hanford for treatment and disposal. Credit: U.S. Department of Energy

By Raphael Rosen, Princeton Plasma Physics Laboratory Communications

Physicists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) are proposing a new way to process nuclear waste that uses a plasma-based centrifuge. Known as plasma mass filtering, the new mass-separation technique would supplement existing chemical techniques. It is hoped that this combined approach would reduce both the cost of nuclear waste disposal and the amount of byproducts produced during the process. This work was supported by PPPL’s Laboratory Directed Research and Development Program.

“The safe disposal of nuclear waste is a colossal problem,” said Renaud Gueroult, staff physicist at PPPL and lead author of the paper that appeared in the Journal of Hazardous Materials in October. “One solution might be to supplement existing chemical separation techniques with plasma separation techniques, which could be economically attractive, ideally leading to a reevaluation of how nuclear waste is processed.”

The immediate motivation for safe disposal is the radioactive waste currently stored at the Hanford Site, a facility in Washington State that produced plutonium for nuclear weapons during the Cold War. The waste, which originally totaled 54 million gallons, is stored in 177 underground tanks.

In 2000, Hanford engineers began building machinery that would encase the radioactive waste in glass. The method, known as “vitrification,” had been used at another Cold War-era nuclear production facility since 1996. A multibillion-dollar vitrification plant is currently under construction at the Hanford site.

To reduce the cost of high-level waste vitrification and disposal, it may be advantageous to pack more waste into each glass canister, reducing the total number of canisters. The volume to be vitrified could be cut further by separating nonradioactive components, such as aluminum and iron, out of the waste. However, in its 2014 report, the DOE Task Force on Technology Development for Environmental Management argued that, “without the development of new technology, it is not clear that the cleanup can be completed satisfactorily or at any reasonable cost.”

The high-throughput, plasma-based, mass separation techniques advanced at PPPL offer the possibility of reducing the volume of waste that needs to be immobilized in glass. “The interesting thing about our ideas on mass separation is that it is a form of magnetic confinement, so it fits well within the Laboratory’s culture,” said physicist Nat Fisch, co-author of the paper and director of the Princeton University Program in Plasma Physics. “To be more precise, it is ‘differential magnetic confinement’ in that some species are confined while others are lost quickly, which is what makes it a high-throughput mass filter.”

How would a plasma-based mass filter system work? The method begins by atomizing and ionizing the hazardous waste and injecting it into the rotating filter, where the individual elements can be influenced by electric and magnetic fields. The filter then separates the lighter elements from the heavier ones using centrifugal and magnetic forces. The lighter elements are typically less radioactive than the heavier ones and often do not need to be vitrified. Processing the high-level waste would therefore require fewer high-level glass canisters overall, while the less radioactive material could be immobilized in a less costly wasteform (e.g., concrete or bitumen).
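As a toy illustration of the filtering logic in Python (the cutoff mass, the element list and the two output streams are assumptions made for the sketch, not parameters from the paper):

```python
# Toy model of a plasma mass filter: once the waste is atomized and
# ionized, separation depends only on atomic mass, not chemistry.
# Ions above a cutoff mass are flung outward centrifugally (the
# "heavy" stream, to be vitrified); lighter ions remain confined
# (the "light" stream, eligible for concrete or bitumen). In a real
# device the cutoff is set by field strength, rotation and geometry.

ATOMIC_MASS = {  # approximate masses of representative isotopes (amu)
    "Na": 23, "Al": 27, "Fe": 56, "Sr": 90, "Cs": 137, "U": 238,
}

def mass_filter(stream, cutoff_amu=80):
    """Partition an ionized waste stream by atomic mass alone."""
    light = [el for el in stream if ATOMIC_MASS[el] <= cutoff_amu]
    heavy = [el for el in stream if ATOMIC_MASS[el] > cutoff_amu]
    return light, heavy

light, heavy = mass_filter(["Na", "Al", "Fe", "Sr", "Cs", "U"])
print("low-cost wasteform:", light)  # ['Na', 'Al', 'Fe']
print("vitrify:", heavy)             # ['Sr', 'Cs', 'U']
```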

The new technique would also be more widely applicable than traditional chemical-based methods since it would depend less on the nuclear waste’s chemical composition. While “the waste’s composition would influence the performance of the plasma mass filter in some ways, the effect would most likely be less than that associated with chemical techniques,” said Gueroult.

Gueroult points out why savings by plasma techniques can be important. “For only about $10 a kilogram in energy cost, solid waste can be ionized. In its ionized form, the waste can then be separated into heavy and light components. Because the waste is atomized, the separation proceeds only on the basis of atomic mass, without regard to the chemistry. Since the total cost of chemical-based techniques can be $2,000 per kilogram of the vitrified waste, as explained in the Journal of Hazardous Materials paper, it stands to reason that even if several plasma-based steps are needed to achieve pure enough separation, there is in principle plenty of room to cut the overall costs. That is the point of our recent paper. It is also why we are excited about our plasma-based methods.”

Fisch notes that “our original ideas grew out of the thesis of Abe Fetterman, who began by considering centrifugal mirror confinement for nuclear fusion, but then realized the potential for mass separation. Now the key role on this project is being played by Renaud, who has developed the concept substantially further.”

According to Fisch, the current developments are a variation and refinement of a plasma-based mass separation system first advanced by a private company called Archimedes Technology Group. That company, started by the late Dr. Tihiro Ohkawa, a fusion pioneer, raised private capital to advance a plasma-based centrifuge concept to clean up the legacy waste at Hanford, but ceased operation in 2006 after failing to receive federal funding.

Now an updated understanding of the complexity of the Hanford problem, combined with an increased appreciation of new ideas, has led to renewed federal interest in waste-treatment solutions. Completion of the main waste-processing operations, projected in 2002 for 2028, has slipped by 20 years over the last 13 years, and the total cleanup cost is now estimated by the Department of Energy to be greater than $250 billion, according to the DOE Office of Inspector General, Office of Audits and Inspections. DOE, which is responsible for cleaning up the legacy nuclear waste at Hanford and other sites, conducted a Basic Research Needs Workshop on nuclear waste cleanup in July that both Fisch and Gueroult attended. The report of that workshop, which is expected to highlight new approaches to the cleanup problem, is due out this fall.

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Read the abstract.

Renaud Gueroult, David T. Hobbs, Nathaniel J. Fisch. “Plasma filtering techniques for nuclear waste remediation.” Journal of Hazardous Materials, published October 2015. doi:10.1016/j.jhazmat.2015.04.058.

Dolphin-disease outbreak shows how to account for the unknown when tracking epidemics (Journal of the Royal Society Interface)

By Morgan Kelly, Office of Communications

Common bottlenose dolphin. Image credit: Allison Henry, NOAA.

Stopping the outbreak of a disease hinges on a wealth of data such as what makes a suitable host and how a pathogen spreads. But gathering these data can be difficult for diseases in remote areas of the world, or for epidemics involving wild animals.

A new study led by Princeton University researchers and published in the Journal of the Royal Society Interface explores an approach to studying epidemics for which details are difficult to obtain. The researchers analyzed the 2013 outbreak of dolphin morbillivirus — a potentially fatal pathogen from the same family as the human measles virus — that resulted in more than 1,600 bottlenose dolphins becoming stranded along the Atlantic coast of the United States by 2015. Because scientists were able to observe dolphins only after they washed up on shore, little is known about how the disease is transmitted and persists in the wild.

The researchers used a Poisson process — a statistical tool used to model the random nature of disease transmission — to determine from sparse data how dolphin morbillivirus can spread. They found that individual bottlenose dolphins may be infectious for up to a month and can spread the disease over hundreds of miles, particularly during seasonal migrations. In 2013, the height of disease transmission occurred toward the end of summer around an area offshore of Virginia Beach, Virginia, where multiple migratory dolphin groups are thought to cross paths.

In the interview below, first author Sinead Morris, a graduate student in ecology and evolutionary biology, explains what the researchers learned about the dolphin morbillivirus outbreak, and how the Poisson process can help scientists understand human epidemics. Morris is in the research group of co-author Bryan Grenfell, Princeton’s Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs.

Q: How does the Poisson process track indirectly observed epidemics and what specific challenges does it overcome?

A: One of the main challenges in modeling indirectly observed epidemics is a lack of data. In our case, we had information on all infected dolphins that had been found stranded on shore, but no data on the number of individuals that became infected but did not strand. The strength of the Poisson process is that its simple framework can extract important information about how the disease spreads across space and time, despite such incomplete data. Essentially, the process keeps track of where and when individual dolphins stranded, and at each new point in the epidemic it uses the history of what has happened before to project what will happen in the future. For example, an infected individual is more likely to transmit the disease onward to individuals in close spatial proximity than to those far away. By keeping track of all these infections, the model can identify where and when the risk of new infections will be largest.
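A minimal Python sketch of such a history-dependent intensity appears below. The kernel shapes and parameter values are illustrative stand-ins rather than the paper's fitted model, though the time and distance scales echo the estimates discussed in the next answer:

```python
import numpy as np

# Toy self-exciting spatiotemporal intensity: each past stranding
# (s_i, t_i) raises the expected rate of new infections nearby in
# space and soon afterward in time, as described above.

def intensity(s, t, events, mu=0.01, alpha=0.5, tau=24.0, sigma=220.0):
    """Expected stranding rate at coastal position s (km) and time t (days).

    events : list of (s_i, t_i) past strandings
    mu     : background rate; alpha: expected secondary cases per case
    tau    : infectious-period scale (~24 days, per the study)
    sigma  : spatial dispersal scale (~220 km, per the study)
    """
    rate = mu
    for s_i, t_i in events:
        if t_i < t:  # only the history of past events contributes
            temporal = np.exp(-(t - t_i) / tau) / tau
            spatial = (np.exp(-0.5 * ((s - s_i) / sigma) ** 2)
                       / (sigma * np.sqrt(2.0 * np.pi)))
            rate += alpha * temporal * spatial
    return rate

# Example: transmission risk 10 days after two strandings 50 km apart.
events = [(0.0, 0.0), (50.0, 3.0)]
print(intensity(s=25.0, t=10.0, events=events))
```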

Q: Why was this 2013-15 outbreak of dolphin morbillivirus selected for study, and what key insights does this work provide?

A: The recent outbreak of dolphin morbillivirus spread rapidly along the northwestern Atlantic coast from New York to Florida, causing substantial mortality among coastal bottlenose dolphin populations. Despite the clear detrimental impact that this disease can have, however, it is still poorly understood. Therefore, our aim in modeling the epidemic was to gain much needed information about how the virus spreads. We found that a dolphin may be infectious for up to 24 days and can travel substantial distances (up to 220 kilometers, or 137 miles) within this time. This is important because such long-range movements — for example, during periods of seasonal migration — are likely to create many transmission opportunities from infected to uninfected individuals, and may have thus facilitated the rapid spread of the virus down the Atlantic coast.

Q: Can this model be used for human epidemics?

A: The Poisson process framework was originally developed to model the occurrence of earthquakes and has since been used in a variety of other contexts that also tend to suffer from noisy, indirectly observed data, such as urban crime distribution. To model dolphin morbillivirus, we adapted the framework to incorporate more biological information, and similar techniques have also been applied to model meningococcal disease in humans, which can cause meningitis and sepsis. Generally, the data characterizing human epidemics are more detailed than the data we had for this project and, as such, models that can incorporate greater complexity are more widely used. However, we hope that our methods will stimulate greater use of Poisson process models in epidemiological systems that suffer from indirectly observed data.

A new study led by Princeton University researchers used a Poisson process to analyze sparse data from the 2013 outbreak of morbillivirus among bottlenose dolphins along the United States’ Atlantic coast. This graph shows the model predictions of how the risk of disease transmission (marginal hazard) changes over space (A) and time (B) since the beginning of the epidemic. The peaks indicate that the greatest risk of transmission occurred around day 70 of the epidemic between 36 and 37 degrees north latitude, which is an area that encompasses the offshore waters of Virginia Beach, Virginia. These peaks coincide with a period towards the end of summer when large numbers of dolphins are known to gather around Virginia Beach as their seasonal migratory ranges overlap. (Image courtesy of Sinead Morris, Princeton University)

This research was supported by the RAPIDD program of the Science and Technology Directorate of the Department of Homeland Security; the National Institutes of Health Fogarty International Center; the Bill and Melinda Gates Foundation; and the Marine Mammal Unusual Mortality Event Contingency Fund and John H. Prescott Marine Mammal Rescue Assistance Grant Program operated by the National Oceanic and Atmospheric Administration.

Read the abstract.

Sinead E. Morris, Jonathan L. Zelner, Deborah A. Fauquier, Teresa K. Rowles, Patricia E. Rosel, Frances Gulland and Bryan T. Grenfell. “Partially observed epidemics in wildlife hosts: modeling an outbreak of dolphin morbillivirus in the northwestern Atlantic, June 2013–2014.” Journal of the Royal Society Interface, published Nov. 18, 2015. DOI: 10.1098/rsif.2015.0676.


Identifying new sources of turbulence in spherical tokamaks (Physics of Plasmas)

By John Greenwald, Princeton Plasma Physics Laboratory Communications

Computer simulation of turbulence in a model of the NSTX-U, a spherical tokamak fusion facility at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory. Credit: Eliot Feibush

For fusion reactions to take place efficiently, the atomic nuclei that fuse together in plasma must be kept sufficiently hot. But turbulence in the plasma that flows in facilities called tokamaks can cause heat to leak from the core of the plasma to its outer edge, causing reactions to fizzle out.

Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have for the first time modeled previously unsuspected sources of turbulence in spherical tokamaks, an alternative design for producing fusion energy. The findings, published online in October in Physics of Plasmas, could influence the development of future fusion facilities. This work was supported by the DOE Office of Science.

Spherical tokamaks, like the recently completed National Spherical Torus Experiment-Upgrade (NSTX-U) at PPPL, are shaped like cored apples, compared with the doughnut-like design of the conventional tokamaks that are more widely used. The cored-apple shape gives the plasma inside some distinct behavioral characteristics.

The paper, with PPPL principal research physicist Weixing Wang as lead author, identifies two important new sources of turbulence based on data from experiments on the National Spherical Torus Experiment prior to its upgrade. The discoveries were made by using state-of-the-art large-scale computer simulations. These sources are:

  • Instabilities caused by plasma that flows faster in the center of the fusion facility than toward the edge when rotating strongly in L-mode — or low-confinement — regimes. These instabilities, called “Kelvin-Helmholtz modes” after physicists Lord Kelvin and Hermann von Helmholtz, act like wind that stirs up waves as it blows over water, and have been found for the first time to be relevant to realistic fusion experiments. Non-uniform plasma flows have long been known to play favorable roles in fusion plasmas in conventional and spherical tokamaks; the new results suggest that such flows may also need to be kept at an optimal level.
  • Trapped electrons that bounce between two points in a section of the tokamak instead of swirling all the way around the facility. These electrons were shown to cause significant leakage of heat in H-mode — or high-confinement — regimes by driving a specific instability when they collide frequently. This type of instability is believed to play little role in conventional tokamaks but can provide a robust source of plasma turbulence in spherical tokamaks.

Most interestingly, the model predicts a range of trapped-electron collision rates at which spherical tokamak plasmas can be turbulence-free, thus improving plasma confinement. Such favorable plasmas could possibly be achieved in future advanced spherical tokamaks operating at high temperature.

Findings of the new model can be tested on the NSTX-U and will help guide experiments to identify non-traditional sources of turbulence in the spherical facility. Results of this research can shed light on the physics behind key obstacles to plasma confinement in spherical facilities and on ways to overcome them in future machines.


Read the abstract.

Weixing X. Wang, Stephane Ethier, Yang Ren, Stanley Kaye, Jin Chen, Edward Startsev, Zhixin Lu, and Zhengqian Li. “Identification of new turbulence contributions to plasma transport and confinement in spherical tokamak regime.” Physics of Plasmas, published October 2015. doi:10.1063/1.4933216.