Cytomegalovirus hijacks human enzyme for replication (Cell Reports)

By Tien Nguyen, Department of Chemistry

More than 60 percent of the world’s population is infected with a type of herpes virus called human cytomegalovirus. The virus replicates by commandeering the host cell’s metabolism, but the details of this maneuver are unclear.

Researchers at Princeton University have discovered that cytomegalovirus manipulates a process called fatty acid elongation, which makes the very-long-chain fatty acids necessary for virus replication. In the study, published in the journal Cell Reports on March 3, the research team identified a specific human enzyme—elongase enzyme 7—that the virus induces to turn on fatty acid elongation.

“Elongase 7 was just screaming, ‘I’m important, study me,’” said John Purdy, a postdoctoral researcher in the Rabinowitz lab and lead author on the study.

He found that once a cell was infected by cytomegalovirus, the level of elongase 7 RNA increased over 150-fold. Purdy then performed a genetic knockdown experiment to silence elongase 7 and established that in its absence the virus was unable to efficiently replicate.

“Elongases are a family of seven related proteins. The particular importance of elongase 7 for cytomegalovirus replication was a pleasant surprise, and enhances its appeal as a drug target,” said Joshua Rabinowitz, a professor of chemistry and the Lewis-Sigler Institute for Integrative Genomics at Princeton and co-author on the paper.

Activation of the elongase enzyme led to an increase in very-long-chain fatty acids, which the virus uses to build its viral envelope and replicate. The researchers fed infected cells glucose labeled with the heavy isotope carbon-13, which the cell metabolizes into substrates for fatty acid elongation. The carbon-13 atoms were incorporated into new products, which were detected and identified by their mass using mass spectrometry. This powerful technique revealed both how much fatty acid was produced and how the molecules were constructed.
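The arithmetic behind this readout can be sketched in a few lines. This is an illustrative example, not the authors' analysis code; the choice of palmitic acid (a 16-carbon fatty acid) and the standard monoisotopic masses are assumptions made for the sketch.

```python
# Illustrative sketch (not the study's code): each carbon-13 atom that a
# cell incorporates into a fatty acid shifts the molecule's mass by about
# 1.0034 daltons, which is the signal a mass spectrometer detects.
MASS_12C = 12.0        # monoisotopic mass of carbon-12, Da (exact by definition)
MASS_13C = 13.003355   # monoisotopic mass of carbon-13, Da
MASS_H = 1.007825      # monoisotopic mass of hydrogen-1, Da
MASS_O = 15.994915     # monoisotopic mass of oxygen-16, Da

def fatty_acid_mass(n_carbons, n_labeled=0):
    """Mass (Da) of a saturated fatty acid CnH(2n)O2 with
    n_labeled of its carbons replaced by carbon-13."""
    assert 0 <= n_labeled <= n_carbons
    return ((n_carbons - n_labeled) * MASS_12C
            + n_labeled * MASS_13C
            + 2 * n_carbons * MASS_H   # saturated acid: 2n hydrogens
            + 2 * MASS_O)              # two oxygens in the -COOH group

# Palmitic acid (16 carbons), unlabeled vs. fully carbon-13 labeled:
unlabeled = fatty_acid_mass(16, 0)
labeled = fatty_acid_mass(16, 16)
print(f"unlabeled: {unlabeled:.4f} Da, fully labeled: {labeled:.4f} Da")
print(f"shift per labeled carbon: {(labeled - unlabeled) / 16:.6f} Da")
```

Counting how many such mass shifts appear in each detected product indicates how many labeled carbons, and hence how many glucose-derived building blocks, went into the molecule.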

Cytomegalovirus infection mostly threatens people with compromised immune systems and developing fetuses, and is a leading cause of hearing loss in children. Current treatments target the DNA replication step of the virus and are not very effective. These findings have advanced the understanding of the virus’s operations and identified fatty acid elongation as a key process that warrants further study.

This work was funded by National Institutes of Health grants AI78063, CA82396, and GM71508 and an American Heart Association postdoctoral fellowship to J.G.P. (12POST9190001).

Read the full article here:

Purdy, J. G.; Shenk, T.; Rabinowitz, J. D. “Fatty Acid Elongase 7 Catalyzes the Lipidome Remodeling Essential for Human Cytomegalovirus Replication.” Cell Reports, 2015, 10, 1375.

 

Letting go of the (genetic) apron strings (Cell)

Researchers explore the shift from maternal genes to the embryo’s genes during development

By Catherine Zandonella, Office of the Dean for Research

Cells in an early-stage fruit fly embryo. (Image courtesy of NIGMS image gallery).

A new study from Princeton University researchers sheds light on the handing over of genetic control from mother to offspring early in development. Learning how organisms manage this transition could help researchers understand larger questions about how embryos regulate cell division and differentiation into new types of cells.

The study, published in the March 12 issue of the journal Cell, provides new insight into the mechanism for this genetic hand-off, which happens within hours of fertilization, when the newly fertilized egg is called a zygote.

“At the beginning, everything the embryo needs to survive is provided by mom, but eventually that stuff runs out, and the embryo needs to start making its own proteins and cellular machinery,” said Shelby Blythe, a postdoctoral researcher in Princeton’s Department of Molecular Biology and first author on the study. “We wanted to find out what controls that transition.”

Blythe conducted the study with senior author Eric Wieschaus, Princeton’s Squibb Professor in Molecular Biology, Professor of Molecular Biology and the Lewis-Sigler Institute for Integrative Genomics, a Howard Hughes Medical Institute investigator, and a Nobel laureate in physiology or medicine.

Researchers have known that in most animals, a newly fertilized egg cell divides rapidly, producing exact copies of itself using gene products supplied by the mother. After a short while, this rapid cell division pauses, and when it restarts, the embryonic DNA takes control and the cells divide much more slowly, differentiating into new cell types that are needed for the body’s organs and systems.

To find out what controls this maternal-to-zygotic transition, also called the midblastula transition (MBT), Blythe conducted experiments in the fruit fly Drosophila melanogaster, which has long served as a model for development in higher organisms, including humans.

These experiments revealed that the slower cell division is a consequence of an upswing in DNA errors after the embryo’s genes take over. Cell division slows down because the cell’s DNA-copying machinery has to stop and wait until the damage is repaired.

Blythe found that it wasn’t the overall amount of embryonic DNA that caused this increase in errors. Instead, his experiments indicated that the high error rate was due to molecules that bind to DNA to activate the reading, or “transcription,” of the genes. These molecules stick to the DNA strands at thousands of sites and prevent the DNA copying machinery from working properly.

To discover this link between DNA errors and slower cell replication, Blythe used genetic techniques to create Drosophila embryos that were unable to repair DNA damage and typically died shortly after beginning to use their own genes. He then blocked the molecules that initiate the process of transcription of the zygotic genes, and found that the embryos survived, indicating that these molecules that bind to the DNA strands, called transcription factors, were triggering the DNA damage. He also discovered that a protein involved in responding to DNA damage, called Replication Protein A (RPA), appeared near the locations where DNA transcription was being initiated. “This provided evidence that the process of awakening the embryo’s genome is deleterious for DNA replication,” he said.

The study also demonstrates a mechanism by which the developing embryo ensures that cell division happens at a pace that is slow enough to allow the repair of damage to DNA during the switchover from maternal to zygotic gene expression. “For the first time we have a mechanistic foothold on how this process works,” Blythe said.

The work also enables researchers to explore larger questions of how embryos regulate DNA replication and transcription. “This study allows us to think about the idea that the ‘character’ of the DNA before and after the MBT has something to do with the DNA acquiring the architectural features of chromatin [the mix of DNA and proteins that make up chromosomes] that allow us to point to a spot and say ‘this is a gene’ and ‘this is not a gene’,” Blythe said. “Many of these features are indeed absent early in embryogenesis, and we suspect that the absence of these features is what allows the rapid copying of the DNA template early on. Part of what is so exciting about this is that early embryos may represent one of the only times when this chromatin architecture is missing or ‘blank’. Additionally, these early embryos allow us to study how the cell builds and installs these features that are so essential to the fundamental processes of cell biology.”

This work was supported in part by grant 5R37HD15587 from the Eunice Kennedy Shriver National Institute of Child Health and Human Development.

Read the abstract

Blythe, Shelby A. & Eric R. Wieschaus. Zygotic Genome Activation Triggers the DNA Replication Checkpoint at the Midblastula Transition. Cell. Published online on March 5, 2015. doi:10.1016/j.cell.2015.01.050. http://www.sciencedirect.com/science/article/pii/S0092867415001282

 

Scientists make breakthrough in understanding how to control intense heat bursts in fusion experiments (Physical Review Letters)

Computer simulation of a cross-section of a DIII-D plasma responding to tiny magnetic fields. The left image models the response that suppressed the ELMs while the right image shows a response that was ineffective. Simulation by General Atomics.

By Raphael Rosen, Princeton Plasma Physics Laboratory

Researchers from General Atomics and the U.S. Department of Energy (DOE)’s Princeton Plasma Physics Laboratory (PPPL) have made a major breakthrough in understanding how potentially damaging heat bursts inside a fusion reactor can be controlled. Scientists performed the experiments on the DIII-D National Fusion Facility, a tokamak operated by General Atomics in San Diego. The findings represent a key step in predicting how to control heat bursts in future fusion facilities including ITER, an international experiment under construction in France to demonstrate the feasibility of fusion energy. This work is supported by the DOE Office of Science (Fusion Energy Sciences).

The studies build upon previous work pioneered on DIII-D showing that these intense heat bursts – called edge-localized modes, or “ELMs” for short – could be suppressed with tiny magnetic fields. These tiny fields cause the edge of the plasma to smoothly release heat, thereby avoiding the damaging heat bursts. But until now, scientists did not understand how these fields worked. “Many mysteries surrounded how the plasma distorts to suppress these heat bursts,” said Carlos Paz-Soldan, a General Atomics scientist and lead author of the first of the two papers that report the seminal findings back-to-back in the March 12 issue of Physical Review Letters.

Paz-Soldan and a multi-institutional team of researchers found that tiny magnetic fields applied to the device can create two distinct kinds of response, rather than just one response as previously thought. The new response produces a ripple in the magnetic field near the plasma edge, allowing more heat to leak out at just the right rate to avert the intense heat bursts. Researchers applied the magnetic fields by running electrical current through coils around the plasma. Pickup coils then detected the plasma response, much as the microphone on a guitar picks up string vibrations.

The second result, led by PPPL scientist Raffi Nazikian, who heads the PPPL research team at DIII-D, identified the changes in the plasma that lead to the suppression of the large edge heat bursts or ELMs. The team found clear evidence that the plasma was deforming in just the way needed to allow the heat to slowly leak out. The measured magnetic distortions of the plasma edge indicated that the magnetic field was gently tearing in a narrow layer, a key prediction for how heat bursts can be prevented.  “The configuration changes suddenly when the plasma is tapped in a certain way,” Nazikian said, “and it is this response that suppresses the ELMs.”

Carlos Paz-Soldan, left, and Raffi Nazikian at the DIII-D tokamak. (Photo by Lisa Petrillo/General Atomics)

The work involved a multi-institutional team of researchers who for years have been working toward an understanding of this process. The team included researchers from General Atomics, PPPL, Oak Ridge National Laboratory, Columbia University, Australian National University, the University of California-San Diego, the University of Wisconsin-Madison, and several other institutions.

The new results suggest further possibilities for tuning the magnetic fields to make ELM-control easier. These findings point the way to overcoming a persistent barrier to sustained fusion reactions. “The identification of the physical processes that lead to ELM suppression when applying a small 3D magnetic field to the inherently 2D tokamak field provides new confidence that such a technique can be optimized in eliminating ELMs in ITER and future fusion devices,” said Mickey Wade, the DIII-D program director.

The results further highlight the value of the long-term multi-institutional collaboration between General Atomics, PPPL and other institutions in DIII-D research. This collaboration, said Wade, “was instrumental in developing the best experiment possible, realizing the significance of the results, and carrying out the analysis that led to publication of these important findings.”

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

General Atomics has participated in fusion research for over 50 years and presently operates the DIII-D National Fusion Facility for the U.S. Department of Energy Office of Science with a mission “to provide the physics basis for the optimization of the tokamak approach to fusion energy production.” The General Atomics group of companies is a world-renowned leader in developing high-technology systems ranging from the nuclear fuel cycle to electromagnetic systems; remotely operated surveillance aircraft; airborne sensors; advanced electronic, wireless, and laser technologies; and biofuels.

Read the articles:

C. Paz-Soldan, R. Nazikian, S. R. Haskey, N. C. Logan, E. J. Strait, N. M. Ferraro, J. M. Hanson, J. D. King, M. J. Lanctot, R. A. Moyer, M. Okabayashi, J-K. Park, M. W. Shafer, and B. J. Tobias. Observation of a Multimode Plasma Response and its Relationship to Density Pumpout and Edge-Localized Mode Suppression. Phys. Rev. Lett. 114, 105001 – Published 12 March 2015.

R. Nazikian, C. Paz-Soldan, J. D. Callen, J. S. deGrassie, D. Eldon, T. E. Evans, N. M. Ferraro, B. A. Grierson, R. J. Groebner, S. R. Haskey, C. C. Hegna, J. D. King, N. C. Logan, G. R. McKee, R. A. Moyer, M. Okabayashi, D. M. Orlov, T. H. Osborne, J-K. Park, T. L. Rhodes, M. W. Shafer, P. B. Snyder, W. M. Solomon, E. J. Strait, and M. R. Wade. Pedestal Bifurcation and Resonant Field Penetration at the Threshold of Edge-Localized Mode Suppression in the DIII-D Tokamak. Phys. Rev. Lett. 114, 105002 – Published 12 March 2015.


Second natural quasicrystal found in ancient meteorite (Scientific Reports)

By Catherine Zandonella, Office of the Dean for Research

A team from Princeton University and the University of Florence in Italy has discovered a quasicrystal — so named because of its unorthodox arrangement of atoms — in a 4.5-billion-year-old meteorite from a remote region of northeastern Russia, bringing to two the number of natural quasicrystals ever discovered. Before the team found the first natural quasicrystal in 2009, researchers thought that the structures were too fragile and energetically unstable to be formed by natural processes.

“The finding of a second naturally occurring quasicrystal confirms that these materials can form in nature and are stable over cosmic time scales,” said Paul Steinhardt, Princeton’s Albert Einstein Professor of Science and a professor of physics, who led the study with Luca Bindi of the University of Florence. The team published the finding in the March 13 issue of the journal Scientific Reports.

The discovery raises the possibility that other types of quasicrystals can be formed in nature, according to Steinhardt. Quasicrystals are very hard, have low friction, and don’t conduct heat very well — making them good candidates for applications such as protective coatings on items ranging from airplanes to non-stick cookware.

The newly discovered quasicrystal, which is yet to be named, has a structure that resembles flat 10-sided disks stacked in a column. This type of structure is impossible in ordinary crystals, in which atoms are packed closely together in a repeated and orderly fashion. The difference between crystals and quasicrystals can be visualized by imagining a tiled floor: Tiles that are 6-sided hexagons can fit neatly against each other to cover the entire floor. But 5-sided pentagons or 10-sided decagons laid next to each other will result in gaps between tiles. “The structure is saying ‘I am not a crystal, but on the other hand, I am not random either,’” Steinhardt said.
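The tile analogy can be made precise: a regular polygon tiles the plane edge-to-edge only if copies of it fit around a shared corner with no gap, that is, only if its interior angle divides 360 degrees evenly. A quick sketch (illustrative only, not part of the study):

```python
# Which regular n-gons can tile the plane? A regular polygon fits around
# a vertex with no gaps only when its interior angle divides 360 degrees.
def interior_angle(n):
    """Interior angle, in degrees, of a regular n-sided polygon."""
    return (n - 2) * 180.0 / n

for n in (3, 4, 5, 6, 8, 10):
    angle = interior_angle(n)
    tiles = 360.0 % angle == 0.0
    print(f"{n:2d}-gon: interior angle {angle:5.1f} deg -> tiles: {tiles}")
# Triangles, squares, and hexagons tile the plane; pentagons (108 deg) and
# decagons (144 deg) leave gaps, matching the "forbidden" symmetries.
```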

Crystals with these forbidden symmetries had been created in the laboratory, but it wasn’t until 2009 that Bindi, Steinhardt, Nan Yao of Princeton and Peter Lu of Harvard reported the first natural quasicrystal, now known as icosahedrite, in a rock that had been collected years before in Chukotka, Russia. To confirm that this quasicrystal, which has the five-fold symmetry of a soccer ball, was indeed of natural origins, Steinhardt and a team of scientists including geologists from the Russian Academy of Sciences traveled to the region in 2011 and returned with additional samples, which they analyzed at the University of Florence; the Smithsonian Museum in Washington, DC; the California Institute of Technology; and the Princeton Institute for the Science and Technology of Materials (PRISM) Imaging and Analysis Center.

The top panel shows an X-ray tomography image (similar to a “CAT” scan) at two different rotations of the whole mineral grain. The brighter and the darker regions are copper-aluminum metals and meteoritic silicates, respectively. The bottom panel shows a scanning electron micrograph image of the quasicrystal (QC) in apparent contact with another mineral, olivine (Ol). Source: Paul Steinhardt.

The researchers confirmed that the quasicrystal originated in an extraterrestrial body that formed about 4.57 billion years ago, which is around the time our solar system formed. They published the results in the Proceedings of the National Academy of Sciences in 2012. “Bringing back the material and showing that it was of natural origins was an important scientific barrier to overcome,” Steinhardt said.

This new quasicrystal, which was found in a different grain of the same meteorite, has 10-fold, or decagonal, symmetry. It is made up of aluminum, nickel and iron, which normally are not found together in the same mineral because aluminum binds quickly to oxygen, blocking attachment to nickel and iron.

The new mineral is the grain shown in panel (a). The ten-fold symmetry is evident when the mineral is hit with x-rays (b). Aiming the beam from a different direction results in patterns as in (c) or (d), in which the spots form along equally spaced horizontal lines. Source: Paul Steinhardt.

The researchers are now exploring how the mineral formed. “We know there was a meteor impact, and that the temperature was around 1000 to 1200 degrees Kelvin, and that the pressure was a hundred thousand times greater than atmospheric pressure, but that is not enough to tell us all the details,” Steinhardt said. “We’d like to know whether the formation of quasicrystals is rare or is fairly frequent, how it occurs, and whether it could happen in other solar systems. What we find out could answer basic questions about the materials found in our universe.”

The team included, from Princeton: Nan Yao, a senior research scholar at PRISM and director of the PRISM Imaging and Analysis Center; Chaney Lin, a graduate student in physics; and Lincoln Hollister, professor of geosciences, emeritus, and a senior geologist. Co-authors also included Christopher Andronicos of Purdue University; Vadim Distler, Valery Kryachko and Marina Yudovskaya of the Russian Academy of Sciences; Alexander Kostin of BHP Billiton; Michael Eddy of the Massachusetts Institute of Technology; Glenn MacPherson of the Smithsonian Institution; and William Steinhardt, a graduate student at Harvard University.

This work was supported in part by the National Science Foundation MRSEC program (DMR-0820341), the Princeton Center for Complex Materials (DMR-0819860), and NASA (NNX11AD43G).

The ordered yet non-standard pattern of the quasicrystal is revealed by an electron beam, which enables a view of a pattern of spots with ten-fold symmetry. Source: Paul Steinhardt.

Read the paper: Bindi, et al., 2015 – Natural quasicrystal with decagonal symmetry. Scientific Reports, 5, 9111. doi:10.1038/srep09111

Additional reading:

Bindi et al., 2009. Natural quasicrystals. Science 324, 1306-1309. http://www.sciencemag.org/content/324/5932/1306

Bindi et al., 2012. Evidence for the extraterrestrial origin of a natural quasicrystal. Proceedings of the National Academy of Sciences 109, 1396-1401. http://www.pnas.org/content/109/5/1396.full


Beautiful but strange: The dark side of cosmology (Science)

By Catherine Zandonella, Office of the Dean for Research

It’s a beautiful theory: the standard model of cosmology describes the universe using just six parameters. But it is also strange. The model predicts that dark matter and dark energy – two mysterious entities that have never been detected – make up 95% of the universe, leaving only 5% composed of the ordinary matter so essential to our existence.

In an article in this week’s Science, Princeton astrophysicist David Spergel reviews how cosmologists came to be certain that we are surrounded by matter and energy that we cannot see. Observations of galaxies, supernovae, and the universe’s temperature, among other things, have led researchers to conclude that the universe is mostly uniform and flat, but is expanding due to a puzzling phenomenon called dark energy. The rate of expansion is increasing over time, counteracting the attractive force of gravity. This last observation, says Spergel, implies that if you throw a ball upward you will see it start to accelerate away from you.

The components of our universe. Dark energy comprises 69% of the mass energy density of the universe, dark matter comprises 25%, and “ordinary” atomic matter makes up 5%. Three types of neutrinos make up at least 0.1%, the cosmic background radiation makes up 0.01%, and black holes comprise at least 0.005%. (Source: Science/AAAS)

A number of experiments to detect dark matter and dark energy are underway, and some researchers have already claimed to have found particles of dark matter, although the results are controversial. New findings expected in the coming years from the Large Hadron Collider, the world’s most powerful particle accelerator, could provide evidence for a proposed theory, supersymmetry, that could explain the dark particles.

But explaining dark energy, and why the universe is accelerating, is a tougher problem. Over the next decade, powerful telescopes will come online to map the structure of the universe and trace the distribution of matter over the past 10 billion years, providing new insights into the source of cosmic acceleration.

Yet observations alone are probably not enough, according to Spergel. A full understanding will require new ideas in physics, perhaps even a new theory of gravity, possibly including extra dimensions, Spergel writes. “We will likely need a new idea as profound as general relativity to explain these mysteries.”

When that happens, our understanding of the dark side of cosmology will no longer accelerate away from us.

Read the article

Citation: Spergel, David. The dark side of cosmology: Dark matter and dark energy. Science, 6 March 2015, Vol. 347, no. 6226, pp. 1100–1102. DOI: 10.1126/science.aaa0980.

–David Spergel is the Charles A. Young Professor of Astronomy on the Class of 1897 Foundation, a professor of astrophysical sciences, and chair of Princeton’s Department of Astrophysical Sciences. His research is supported by the National Science Foundation and NASA.

Pennies reveal new insights on the nature of randomness (PNAS)

By Tien Nguyen, Department of Chemistry

The concept of randomness appears across scientific disciplines, from materials science to molecular biology. Now, theoretical chemists at Princeton have challenged traditional interpretations of randomness by computationally generating random and mechanically rigid arrangements of two-dimensional hard disks, such as pennies, for the first time.

“It’s amazing that something so simple as the packing of pennies can reveal to us deep ideas about the meaning of randomness or disorder,” said Salvatore Torquato, professor of chemistry at Princeton and principal investigator of the report published on December 30 in the journal Proceedings of the National Academy of Sciences.

In two dimensions, conventional wisdom held that the most random arrangements of pennies were those most likely to form upon repeated packing, or in other words, those most “entropically” favored. But when a group of pennies is rapidly compressed, the most probable states are actually highly ordered ones with small imperfections, known as polycrystalline states.

“We’re saying that school of thought is wrong because you can find much lower density states that have a high degree of disorder, even if they are not seen in typical experiments,” Torquato said.

Torquato and coworkers proposed that randomness should be judged from the disorder of a single state as opposed to many states. “It’s a new way of searching for randomness,” said Morrel Cohen, a senior scholar at Princeton and the editor assigned to the article.

Using a computer algorithm, the researchers produced so-called maximally random, jammed (rigid) states as defined by a set of “order metrics.” These measurements reflect features of a single configuration, such as the fluctuations of density within a system and the extent to which one penny’s position can be used to predict another’s.
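As a rough illustration of what an order metric measures (a simplified sketch; the paper’s actual metrics are more sophisticated), the widely used local bond-orientational order parameter psi-6 scores how close a disk’s ring of neighbors is to a perfect hexagonal arrangement:

```python
# A minimal sketch (not the paper's actual order metrics) of one common
# 2D order metric: the local bond-orientational order parameter psi_6,
# which is 1 for a perfect hexagonal shell of neighbors and smaller for
# disordered arrangements.
import cmath
import math

def psi6(center, neighbors):
    """|psi_6| for a disk at `center`, given its neighbors' coordinates."""
    total = 0j
    for (x, y) in neighbors:
        theta = math.atan2(y - center[1], x - center[0])
        total += cmath.exp(6j * theta)  # 6-fold phase of each bond angle
    return abs(total) / len(neighbors)

# Perfect hexagonal shell: six neighbors at 60-degree intervals.
hexagonal = [(math.cos(k * math.pi / 3), math.sin(k * math.pi / 3))
             for k in range(6)]
print(psi6((0.0, 0.0), hexagonal))   # ~1.0: fully ordered

# Irregular shell: six neighbors at uneven angles.
irregular = [(math.cos(a), math.sin(a))
             for a in (0.1, 1.2, 1.9, 3.3, 4.0, 5.5)]
print(psi6((0.0, 0.0), irregular))   # well below 1: disordered
```

A crystalline penny packing would score near 1 on a metric like this, while the maximally random jammed states the algorithm seeks would score far lower.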

The algorithm generated random states that have never been seen before in systems with up to approximately 200 disks. Theoretically, these maximally random states should exist for even larger systems, but are beyond the computational limits of the program.

These findings hold promise especially for the physics and chemistry of surfaces. Randomly dispersed patterns can be relayed to a 3D printer to create materials with unique properties. This may be desirable in photonics—analogous to electronics, but with photons instead of electrons—where the orientation of particles affects light’s ability to travel through a material.

This work also provides a tool for measuring degrees of order that may be applied broadly to other fields. For example, the degree of disorder in the spatial distribution of cancer cells versus healthy cells could be measured and compared for possible biological links. The next challenge in this line of research will be for experimentalists to replicate these findings in the laboratory.

Read the article.

Atkinson, S.; Stillinger, F. H.; Torquato, S. “Existence of isostatic, maximally random jammed monodisperse hard-disk packings,” Proc. Natl. Acad. Sci., 2014, 111, 18436.

This work was supported in part by the National Science Foundation under Grants DMR-0820341 and DMS-1211087, and partially by Simons Foundation Grant in Theoretical Physics 231015.

Genome-wide search reveals new genes involved in long-term memory (Neuron)

By Catherine Zandonella, Office of the Dean for Research

Whole genome expression data reveals new genes involved in long-term memory formation in worms. (Image source: Murphy lab)

A new study has identified genes involved in long-term memory in the worm as part of research aimed at finding ways to retain cognitive abilities during aging.

The study, which was published in the journal Neuron, identified more than 750 genes involved in long-term memory, including many that had not been found previously and that could serve as targets for future research, said senior author Coleen Murphy, an associate professor of molecular biology and the Lewis-Sigler Institute for Integrative Genomics at Princeton University.

“We want to know, are there ways to extend memory?” Murphy said. “And eventually, we would like to ask, are there compounds that could maintain memory with age?”

Long-term memory training in worms (left) led to induction of the transcription factor CREB in AIM neurons (shown by arrows in right). CREB-induced genes were shown to be involved in forming long-term memories in worm neurons. (Image source: Murphy lab)

The newly pinpointed genes are “turned on” by a molecule known as CREB (cAMP-response element-binding protein), a factor known to be required for long-term memory in many organisms, including worms and mice.

“There is a pretty direct relationship between CREB and long-term memory,” Murphy said, “and many organisms lose CREB as they age.” By studying the CREB-activated genes involved in long-term memory, the researchers hope to better understand why some organisms lose their long-term memories as they age.

To identify the genes, the researchers first instilled long-term memories in the worms by training them to associate meal-time with a butterscotch smell. Trained worms were able to remember that the butterscotch smell means dinner for about 16 hours, a significant amount of time for the worm.

The researchers then scanned the genomes of both trained worms and non-trained worms, looking for genes turned on by CREB.

The researchers detected 757 CREB-activated genes in the long-term memory-trained worms, and showed that these genes were turned on primarily in worm cells called the AIM interneurons.

They also found CREB-activated genes in non-trained worms, but the genes were not turned on in AIM interneurons and were not involved in long-term memory. CREB turns on genes involved in other biological functions such as growth, immune response, and metabolism. Throughout the worm, the researchers noted distinct non-memory (or “basal”) genes in addition to the memory-related genes.

The next step, said Murphy, is to find out what these newly recognized long-term memory genes do when they are activated by CREB. For example, the activated genes may strengthen connections between neurons.

Worms are a perfect system in which to explore that question, Murphy said. The worm Caenorhabditis elegans has only 302 neurons, whereas a typical mammalian brain contains billions of the cells.

“Worms use the same molecular machinery that higher organisms, including mammals, use to carry out long-term memory,” said Murphy. “We hope that other researchers will take our list and look at the genes to see whether they are important in more complex organisms.”

Murphy said that future work will involve exploring CREB’s role in long-term memory as well as reproduction in worms as they age.

The team included co-first-authors Postdoctoral Research Associate Vanisha Lakhina, Postdoctoral Research Associate Rachel Arey, and Associate Research Scholar Rachel Kaletsky of the Lewis-Sigler Institute for Integrative Genomics. Additional research was performed by Amanda Kauffman, who earned her Ph.D. in Molecular Biology in 2010; Geneva Stein, who earned her Ph.D. in Molecular Biology in 2014; William Keyes, a laboratory assistant in the Department of Molecular Biology; and Daniel Xu, who earned his B.A. in Molecular Biology in 2014.

Funding for the research was provided by the National Institutes of Health and the Paul F. Glenn Laboratory for Aging Research at Princeton University.

Read the abstract

Citation: Vanisha Lakhina, Rachel N. Arey, Rachel Kaletsky, Amanda Kauffman, Geneva Stein, William Keyes, Daniel Xu, and Coleen T. Murphy. Genome-wide Functional Analysis of CREB/Long-Term Memory-Dependent Transcription Reveals Distinct Basal and Memory Gene Expression Programs, Neuron (2015), http://dx.doi.org/10.1016/j.neuron.2014.12.029.