DNA Gridlock – Cells undo glitches to prevent mutations (Nature)

By Catherine Zandonella, Office of the Dean for Research

The diagram shows a G-quadruplex (G4) on the upper of the two strands that make up DNA. The purple shape represents DNA polymerase, which is blocked by the G4 in its attempt to copy DNA. Regions of the genome that are especially susceptible to forming G-quadruplexes are ones rich in guanine, which is one of the four nucleotides, designated by the letters G, A, C, and T, in DNA. Adapted from Nature Genetics, 2012.

Roughly six feet of DNA are packed into every human cell, so it is not surprising that our genetic material occasionally folds into odd shapes such as hairpins, crosses and cloverleaves. But these structures can block the copying of DNA during cell division, leading to gene mutations that could have implications for cancer and aging.

Now researchers based at Princeton University have uncovered evidence that cells contain a built-in system for eliminating one of the worst of these roadblocks, a structure known as a G-quadruplex. In a paper published earlier this month in Nature, a group of researchers led by Princeton’s Virginia Zakian reported that an enzyme known as the Pif1 helicase can unfold these structures both in test tubes and in cells, bringing DNA replication back on track.

Given that Pif1 mutations have been associated with an increased risk of breast cancer, Zakian said, the study of how Pif1 ensures proper DNA replication could be relevant to human health. Zakian is Princeton’s Harry C. Wiess Professor in the Life Sciences.

Most DNA is made up of two strands twisted about each other in a way that resembles a spiral staircase. Every time a cell divides, each DNA molecule must be duplicated, a process that involves unwinding the staircase so that an enzyme known as DNA polymerase can work down each strand, copying each letter in the DNA code. During this exposed period, regions of the unwound single strands can fold into G-quadruplexes (see diagram).
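
Guanine-rich stretches like these can be flagged computationally. Below is a small, purely illustrative scan for the kind of motif commonly used to mark potential G-quadruplex-forming regions – four runs of three or more guanines separated by short loops. This is not the analysis performed in the study, and the example sequence is made up.

```python
# Illustrative only: a simple scan for putative G-quadruplex-forming motifs,
# i.e., four runs of three or more guanines separated by loops of 1-7 bases.
# This is a common heuristic, not the analysis performed in the paper.
import re

G4_MOTIF = re.compile(r"(?:G{3,}\w{1,7}){3}G{3,}")

def find_putative_g4(sequence):
    """Return (start position, matched substring) for each putative G4 motif."""
    return [(m.start(), m.group()) for m in G4_MOTIF.finditer(sequence.upper())]

example = "ATCGGGTTAGGGTTAGGGTTAGGGATCATCG"   # telomere-like repeat, made up
print(find_putative_g4(example))              # one hit spanning the four G-runs
```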

Like a car that encounters a pile-up on an interstate, the DNA polymerase halts when it encounters a G-quadruplex, explained Matthew Bochman, a postdoctoral researcher who was a co-first author with Katrin Paeschke, now an independent investigator at the University of Würzburg in Germany. The work also included Princeton graduate student Daniela Garcia.

“The DNA that is folded into a G-quadruplex cannot be replicated, so essentially it is skipped,” Bochman said. “Failure to copy specific areas of DNA that you really need is a serious problem, especially in regions that control genes that either suppress or contribute to cancer.”

Last year, the Zakian group in collaboration with human geneticists at the University of Washington reported that a mutation in human Pif1 is associated with an increased risk of breast cancer, suggesting that the ability to unwind G-quadruplexes could be important for protecting against cancer. The finding was published in the journal PLoS One.

G-quadruplexes could also be implicated in the process of aging, according to the researchers. The structures are thought to form at the ends of chromosomes in regions called telomeres, said Zakian, an expert on telomere biology.  Damaged or shortened telomeres are associated with premature aging and cancer.

To explore the role of Pif1 helicases in tackling G-quadruplexes, Bochman and Paeschke purified Pif1 helicases from yeast and bacteria and found that, in test tubes, all of the Pif1 helicases unwound G-quadruplex structures far faster and more efficiently than other helicases tested in the same way.

Next, these investigators set up an experiment to determine whether Pif1 acts on G-quadruplexes inside cells. Using a system that could precisely evaluate the effects of G-quadruplex structures on the integrity of chromosomes, the researchers found that normal cells had no problem with the addition of a G-quadruplex structure, but when cells lacked Pif1 helicases, the G-quadruplex induced a high degree of genome instability.

“To me, the most remarkable aspect of the study was the demonstration that Pif1-like helicases taken from species ranging from bacteria to humans and placed in yeast cells can suppress G-quadruplex-induced DNA damage,” Zakian said. “This finding suggests that resolving G-quadruplexes is an evolutionarily conserved function of Pif1 helicases.”

The Zakian lab also found that replicating through G-quadruplexes in the absence of Pif1 helicases results not only in mutations of the DNA at the site of the G-quadruplex but also in unexpected “epigenetic” effects on the expression of nearby genes. Epigenetic events cause changes in gene expression that are inherited, yet they do not involve loss or mutation of DNA. Garcia has proposed that the epigenetic silencing of gene expression that occurs near G-quadruplexes in the absence of Pif1 helicases results from the addition or removal of molecular tags on histones, which are proteins that bind DNA and regulate gene expression. This hypothesis is currently being studied.

The study involved contributions from Petr Cejka and Stephen C. Kowalczykowski of the University of California-Davis, and Katherine Friedman of Vanderbilt University.

Read the abstract.

Paeschke, Katrin, Matthew L. Bochman, P. Daniela Garcia, Petr Cejka, Katherine L. Friedman, Stephen C. Kowalczykowski & Virginia A. Zakian. Pif1 family helicases suppress genome instability at G-quadruplex motifs. Nature. 2013. doi:10.1038/nature12149.

This research was supported by the National Institutes of Health (V.A.Z., GM026938-34; S.C.K., GM041347), the National Science Foundation (K.L.F., MCB-0721595), the German Research Foundation (DFG), the New Jersey Commission on Cancer Research (K.P.) and the American Cancer Society (M.L.B., PF-10-145-02-01).

 

How the ice ages ended (Nature)

By Catherine Zandonella, Office of the Dean for Research

Antarctica. Photo credit: Harley D. Nygren, NOAA

A study of sediment cores collected from the deep ocean supports a new explanation for how glacier melting at the end of the ice ages led to the release of carbon dioxide from the ocean.

The study published in Nature suggests that melting glaciers in the northern hemisphere caused a disruption of deep ocean currents, leading to the release of trapped carbon dioxide from the Southern Ocean around Antarctica.

Understanding what happened when previous glaciers melted could help climate researchers make accurate predictions about future global temperature increases and their effects on the planet.

The evidence is strong that ice ages are driven by periodic changes in the amount of sunlight reaching the poles due to cyclic changes in Earth’s rotation and orbit. Yet scientists have been puzzled by a mismatch: although the timing of ice ages is best explained by changes in sunlight in the northern part of the globe, the warming at the end of ice ages occurred first in the southern hemisphere, with the rise in carbon dioxide levels appearing to be cued from the south.

The new study suggests that changes in ocean currents, connecting the north to the south through the deep ocean, were to blame.

As glaciers melted in the northern reaches of the globe (far upper left), the influx of freshwater, which is naturally less dense than salt-laden ocean water, reduced the normally strong sinking of water in that region. This allowed silicate-rich deep water to rise upward into the shallower ocean waters (upward blue arrows), stimulating the production of opal by diatoms, while warm surface water mixed downward (red arrows) into the southern-sourced deep water. The rising silicate-rich water drew dense cold water from near Antarctica, yielding a cycle of water movement (in yellow). The new circulation pattern caused carbon dioxide stored in the deep water to be released to the atmosphere near Antarctica (far upper right). Image source: Daniel Sigman.

Part of this story was suggested more than a decade ago and is already accepted by many climate scientists: As glaciers in the north started melting, the influx of fresh water diluted the salty waters that today flow to the north from the tropics as an extension of the Gulf Stream. Normally, these salty waters become cool and sink into the deep ocean, forming cold and dense water that flows southward, and allowing more salty tropical water to take its place in a sort of ocean conveyor belt. But the influx of fresh water due to melting glaciers stalled the conveyor belt.

So how did this lead to changes in the southern hemisphere?

The new research suggests that the shutdown in northern sinking water allowed southern-sourced water to fill up the deep Atlantic, setting up a new ocean circulation pattern. This new circulation pattern brought deep-sea water, which was rich in carbon dioxide due to sunken dead marine algae, to the surface near Antarctica, where the gas escaped into the atmosphere and acted to drive global warming.  (See diagram.)

The researchers included investigators from ETH Zürich, Princeton University, the University of Miami, the University of British Columbia, the University of Bremen and the Alfred Wegener Institute in Germany. The Princeton effort was led by Daniel Sigman, the Dusenbury Professor of Geological and Geophysical Sciences.

The team tracked these historic movements of water through the study of sediment cores that are rich in silicon dioxide, or opal. Tiny marine algae known as diatoms make their cell walls out of opal, and when the organisms die, their opal remains sink to the deep sea bed.

The researchers looked at opal in sediment core samples drilled from deep beneath the ocean floor off the coasts of northwest Africa and Antarctica. The team found that each period of glacier melting, which occurred five times over the last 550,000 years, corresponded to a spike in the amount of opal in the sediment, signaling an increase in diatom growth. The timing of the opal spikes provides evidence that the deep, opal-rich waters in the south were drawn to the surface in response to meltwater entering the northern ocean.

The mechanism clashes with a previously offered explanation of why the melting of northern glaciers during deglaciations led to the release of carbon dioxide from the Southern Ocean – the theory that melting glaciers in the north strengthened the southern hemisphere westerly winds, which in turn caused upwelling of Southern Ocean deep waters. “While distinguishing between these alternatives is important,” says Sigman, “the greater challenge is to test and understand a premise that is shared by both of these scenarios: that ice age conditions around Antarctica caused the deep ocean to be sluggish and rich in carbon dioxide. If this was really how the ice age ocean operated, then it calls for us to reconsider how we expect deep ocean circulation to respond to modern global warming.”

Read the abstract.

A. N. Meckler, D. M. Sigman, K. A. Gibson, R. François, A. Martínez-García, S. L. Jaccard, U. Röhl, L. C. Peterson, R. Tiedemann & G. H. Haug. 2013. Deglacial pulses of deep-ocean silicate into the subtropical North Atlantic Ocean. Nature 495 (7442): 495-498. doi:10.1038/nature12006. Published online March 27, 2013.

This research used samples provided by the ODP, which is sponsored by the US National Science Foundation (NSF) and participating countries under the management of the Joint Oceanographic Institutions. XRF data were acquired at the XRF Core Scanner Lab at MARUM – Center for Marine Environmental Sciences, University of Bremen, with support from the DFG-Leibniz Center for Surface Process and Climate Studies at the University of Potsdam. Further support was provided by the US NSF through grant OCE-1060947 to D.M.S. and by NSERC and CFCAS to R.F.

Shape from sound — new methods to probe the universe (Physical Review Letters)

By Morgan Kelly, Office of Communications

As the universe expands, it is continually subjected to energy shifts, or “quantum fluctuations,” that send out little pulses of “sound” into the fabric of spacetime. In fact, the universe is thought to have sprung from just such an energy shift.

A recent paper in the journal Physical Review Letters reports a new mathematical tool that should allow one to use these sounds to help reveal the shape of the universe. The authors reconsider an old question in spectral geometry that asks, roughly, to what extent can the shape of a thing be known from the sound of its acoustic vibrations? The researchers approached this problem by breaking it down into small workable pieces, according to author Tejal Bhamre, a Princeton University graduate student in the Department of Physics.

To understand the authors’ method, consider a vase. If one taps a vase with a spoon, it will make a sound that is characteristic of its shape. Similarly, the technique Bhamre and her coauthors developed could, in principle, determine the shape of spacetime from the perpetual ringing caused by quantum fluctuations.
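
The idea can be seen in a toy setting. The sketch below is illustrative only and is not the authors’ method: it treats the “object” as a one-dimensional string with fixed ends, whose Laplacian eigenvalues are (nπ/L)², so the lowest “note” alone is enough to recover the string’s length L. The function name is made up for this example.

```python
# Illustrative only: recover the "shape" (here, just the length) of a 1-D
# string from its vibration spectrum. The eigenvalues of -d^2/dx^2 on [0, L]
# with fixed ends are (n*pi/L)^2, so the lowest eigenvalue reveals L.
import numpy as np

def laplacian_eigenvalues(length, n_points=800):
    """Approximate eigenvalues of -d^2/dx^2 on [0, length] with fixed ends,
    using a standard finite-difference matrix."""
    h = length / (n_points + 1)
    main = np.full(n_points, 2.0 / h**2)
    off = np.full(n_points - 1, -1.0 / h**2)
    matrix = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(matrix)          # sorted, smallest first

for true_length in (1.0, 2.5):
    lowest = laplacian_eigenvalues(true_length)[0]
    recovered = np.pi / np.sqrt(lowest)        # invert lambda_1 = (pi/L)^2
    print(f"true length {true_length:.2f} -> recovered {recovered:.3f}")
```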

The researchers’ technique also provides a unique connection between the two pillars of modern physics — quantum theory and general relativity — by using vibrational wavelengths to describe the geometry of spacetime.

Bhamre worked with coauthors David Aasen, a physics graduate student at Caltech, and Achim Kempf, a University of Waterloo professor who studies the physics of information.

Read the abstract.

David Aasen, Tejal Bhamre and Achim Kempf. 2013. Shape from Sound: Toward New Tools for Quantum Gravity. Physical Review Letters. Article first published online: March 18, 2013. DOI: 10.1103/PhysRevLett.110.121301.

This research was supported by the Natural Sciences and Engineering Research Council of Canada.

Serendipity Pays Off (Science)

By Catherine Zandonella, Office of the Dean for Research

Serendipity – the act of finding something good or useful while not specifically searching for it – can sometimes pay off. Now Princeton University chemistry researchers report that this non-specific type of searching has yielded a new method of building molecules for use in new drugs, new agricultural chemicals and even new perfumes.

In a paper published today in the journal Science, Princeton’s David MacMillan and his team describe the discovery of a new chemical reaction – not noted before in nature or in any lab – that could assist pharmaceutical chemists and others who routinely create new chemicals for a variety of industries.

Until now, no one realized this chemical reaction – which involves adding atoms to a specific carbon atom on a molecule – could occur, according to MacMillan, the James S. McDonnell Distinguished University Professor of Chemistry at Princeton. “If you show this chemical reaction to most chemists, they immediately say ‘that’s impossible,’” MacMillan said.

In this case, the team discovered this “impossible” reaction using an approach MacMillan pioneered that he calls “accelerated serendipity.” The researchers use robotic arms to conduct thousands of reactions per day by combining in test tubes different combinations of chemicals along with catalysts that spur the reactions. When the investigators find a reaction that makes an interesting product, they study it to understand how the reaction occurs.
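
The combinatorial logic behind such a screen can be sketched in a few lines. The example below is purely schematic – the substrate and catalyst libraries and the run_assay stand-in are hypothetical, not MacMillan’s actual workflow – but it captures the idea of enumerating many pairings and flagging the rare hits for follow-up study.

```python
# Purely schematic: enumerate substrate/catalyst pairings and flag rare "hits".
# The libraries and the simulated assay below are made up for illustration.
from itertools import product
import random

random.seed(0)
substrates = [f"substrate_{i}" for i in range(50)]   # hypothetical library
catalysts = [f"catalyst_{j}" for j in range(20)]     # hypothetical library

def run_assay(substrate, catalyst):
    """Stand-in for the real experiment: did this pairing give a new product?"""
    return random.random() < 1 / 500                 # rare, arbitrary "hit"

hits = [(s, c) for s, c in product(substrates, catalysts) if run_assay(s, c)]
print(f"screened {len(substrates) * len(catalysts)} combinations; {len(hits)} flagged for follow-up")
```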

“We didn’t invent this new reaction – nature did that,” MacMillan said, “but we figured out how to get the reaction to happen in the lab.” His team, which included graduate student Michael Pirnot, postdoctoral researcher David Martin and former postdoctoral researcher Danica Rankic, uses light from ordinary light bulbs to activate the catalysts that spur the reactions, a technique developed in MacMillan’s lab and published in Science in 2008.

Going forward, chemists can add this new reaction to their toolbox of methods for building up molecules, which they do in a way analogous to joining together pieces of K’NEX or Tinkertoys, swapping in new parts to change a molecule’s function. In the reaction published today, the team discovered a way to join so-called “functional groups” to a specific carbon atom (see diagram) in larger structures known as ketones and aldehydes. The ability to add functional groups to that carbon atom was thought impossible until now.

Caption: Upper and lower left: Green spots indicate carbon atoms known to undergo reactions. Right panel: Purple spot indicates a carbon atom thought not to undergo reactions. The team discovered, using accelerated serendipity, a way to cause this carbon to react, resulting in addition of functional groups, and potentially leading to new drugs or other important industrial chemicals. (Source: Science)

This new chemical reaction has wide applications, MacMillan said. “This is a fundamental reaction which any chemist can start using.”

For example, a chemist who is building a drug to treat Alzheimer’s disease might desire to add a chemical group to the reluctant carbon atom. Normally that would require the chemist to conduct several different chemical reactions over several weeks, but with the new reaction the chemist could build the drug in two days and be testing drug candidates much more quickly.

Similarly, a chemist at a fragrance company could use the new reaction to experiment with new perfume formulations.

MacMillan’s original paper on accelerated serendipity, published in 2011 in Science, reported the discovery of a reaction now used in the drug industry. Yet it was controversial because other scientists interpreted the robotic searches as random searches, when in fact they were not. “We chose chemicals that had never been shown to react with each other – those are the ones we believe might lead to as-yet undiscovered reactions,” MacMillan said. He added that these reactions may have been produced in the past by chemists who didn’t recognize what they were.

Read the abstract.

Michael T. Pirnot, Danica A. Rankic, David B. C. Martin, David W. C. MacMillan. Photoredox Activation for the Direct β-Arylation of Ketones and Aldehydes. Science 29 March 2013. Vol. 339 no. 6127 pp. 1593-1596.

This research was supported by the National Institute of General Medical Sciences grant R01 GM103558-01 and gifts from Merck, Amgen, Abbott, and Bristol-Myers Squibb.

Younger cancer patients experience greater increase in religiosity (Social Science Research)

By Michael Hotchkiss, Office of Communications

People diagnosed with cancer at younger ages are more likely to become more religious than their counterparts diagnosed at older ages, researchers including a Princeton research scholar have found.

Overall, the researchers found that people diagnosed with cancer experienced a one-time increase in religiosity, with a greater increase among those diagnosed at a younger age, what’s known as an “off-time diagnosis.”

“Off-time diagnoses may also be related to increased religiosity because the meaning of having cancer may be different for those in middle adulthood compared to older adulthood,” the researchers said. The results come from a review of surveys of more than 3,400 people conducted in 1994-95 and 2004-06.

The research, detailed in an article in the March issue of Social Science Research, was conducted by Michael McFarland, a postdoctoral researcher at Princeton’s Office of Population Research; Tetyana Pudrovska, an assistant professor at Pennsylvania State University; Scott Schieman, a professor at the University of Toronto; Christopher Ellison, a professor at the University of Texas at San Antonio; and Alex Bierman, an assistant professor at the University of Calgary.

Read the abstract.

McFarland, Michael J., Tetyana Pudrovska, Scott Schieman, Christopher G. Ellison, and Alex Bierman. March 2013. Does a cancer diagnosis influence religiosity? Integrating a life course perspective. Social Science Research. Vol. 42, Issue 2, pp. 311–20.

Drug-resistant MRSA bacteria – here to stay in both hospital and community (PLoS Pathogens)

By Catherine Zandonella, Office of the Dean for Research

A colorized scanning electron micrograph of a white blood cell eating an antibiotic resistant strain of Staphylococcus aureus bacteria, commonly known as MRSA. (Source: National Institute of Allergy and Infectious Diseases (NIAID))

The drug-resistant bacteria known as MRSA, once confined to hospitals but now widespread in communities, will likely continue to exist in both settings as separate strains, according to a new study.

The prediction that both strains will coexist is reassuring because previous projections indicated that the more invasive and fast-growing community strains would overtake and eliminate hospital strains, a takeover that could have posed a threat to public health.

Researchers at Princeton University used mathematical models to explore what will happen to community and hospital MRSA strains, which differ genetically.  Originally MRSA, which is short for methicillin-resistant Staphylococcus aureus, was confined to hospitals. However, community-associated strains emerged in the past decade and can spread widely from person to person in schools, athletic facilities and homes.

Both community and hospital strains cause diseases ranging from skin and soft-tissue infections to pneumonia and septicemia. Hospital MRSA is resistant to numerous antibiotics and is very difficult to treat, while community MRSA is resistant to fewer antibiotics.

The new study found that these differences in antibiotic resistance, combined with the more aggressive antibiotic usage patterns of hospitals compared with community settings, will over time allow hospital strains to survive despite competition from community strains. Antibiotic use in hospitals is likely to successfully treat patients infected with community strains, preventing the newcomer strains from spreading to new patients and gaining the foothold they need to out-compete the hospital strains.

The researchers made their predictions using mathematical models of MRSA transmission that take into account data on drug usage, resistance profiles, person-to-person contact, and patient age.
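
To make that logic concrete, here is a deliberately simplified sketch – not the published model; every parameter name and value below is hypothetical, and the two patches are treated as equal-sized – of how heavier antibiotic use in hospitals can let a less transmissible but more resistant hospital strain persist there while the community strain dominates outside.

```python
# A minimal, illustrative two-patch (hospital + community), two-strain model of
# MRSA competition. This is NOT the published Kouyos/Klein/Grenfell model;
# parameters are hypothetical and patches are treated as equal-sized for
# simplicity. It only shows how stronger antibiotic clearance of the community
# strain inside hospitals can let the hospital strain persist there.
import numpy as np
from scipy.integrate import solve_ivp

beta_C, beta_H = 0.30, 0.15        # transmission rates (community strain spreads faster)
gamma = 0.05                       # natural clearance of colonization
tau_comm, tau_hosp = 0.02, 0.40    # antibiotic treatment rates by setting
res_C, res_H = 0.2, 0.9            # fraction of treatments each strain resists
admit, discharge = 0.01, 0.10      # movement between community and hospital

def clearance(tau, resistance):
    """Total clearance = natural loss + treatments the strain cannot resist."""
    return gamma + tau * (1.0 - resistance)

def rhs(t, y):
    # Fractions colonized with the community strain (C) and hospital strain (H)
    # in the community (c_*) and in the hospital (h_*).
    c_C, c_H, h_C, h_H = y
    s_comm = 1.0 - c_C - c_H       # uncolonized fraction in the community
    s_hosp = 1.0 - h_C - h_H       # uncolonized fraction in the hospital
    dc_C = beta_C * s_comm * c_C - clearance(tau_comm, res_C) * c_C - admit * c_C + discharge * h_C
    dc_H = beta_H * s_comm * c_H - clearance(tau_comm, res_H) * c_H - admit * c_H + discharge * h_H
    dh_C = beta_C * s_hosp * h_C - clearance(tau_hosp, res_C) * h_C + admit * c_C - discharge * h_C
    dh_H = beta_H * s_hosp * h_H - clearance(tau_hosp, res_H) * h_H + admit * c_H - discharge * h_H
    return [dc_C, dc_H, dh_C, dh_H]

sol = solve_ivp(rhs, (0.0, 3000.0), [0.01, 0.01, 0.01, 0.01], rtol=1e-8)
c_C, c_H, h_C, h_H = sol.y[:, -1]
print(f"community: CA strain {c_C:.2f}, HA strain {c_H:.2f}")
print(f"hospital:  CA strain {h_C:.2f}, HA strain {h_H:.2f}")
```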

Published February 28 in the journal PLOS Pathogens, the study was conducted by postdoctoral researcher Roger Kouyos, now a scholar at the University of Zurich, and Eili Klein, a former graduate student who is now an assistant professor in the Johns Hopkins School of Medicine. They conducted the work under the guidance of Bryan Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs at Princeton’s Woodrow Wilson School of International and Public Affairs.

Read the article (open access).

Kouyos R., Klein E. & Grenfell B. (2013). Hospital-Community Interactions Foster Coexistence between Methicillin-Resistant Strains of Staphylococcus aureus. PLoS Pathogens 9 (2): e1003134.

RK was supported by the Swiss National Science Foundation (Grants PA00P3_131498 and PZ00P3_142411). EK was supported by Princeton University (Harold W. Dodds Fellowship), as well as the Models of Infectious Disease Agent Study (MIDAS), under Award Number U01GM070708 from the National Institute of General Medical Sciences. BG was supported by the Bill and Melinda Gates Foundation; the Research and Policy for Infectious Disease Dynamics (RAPIDD) program of the Science and Technology Directorate, Department of Homeland Security; and the Fogarty International Center, National Institutes of Health.

Quantum computing moves forward (Science)

By Catherine Zandonella, Office of the Dean for Research

New technologies that exploit quantum behavior for computing and other applications are closer than ever to being realized due to recent advances, according to a review article published this week in the journal Science.

A silicon chip levitates individual atoms used in quantum information processing. Photo: Curt Suplee and Emily Edwards, Joint Quantum Institute and University of Maryland. Credit: Science.

These advances could enable the creation of immensely powerful computers as well as other applications, such as highly sensitive detectors capable of probing biological systems. “We are really excited about the possibilities of new semiconductor materials and new experimental systems that have become available in the last decade,” said Jason Petta, one of the authors of the report and an associate professor of physics at Princeton University.

Petta co-authored the article with David Awschalom of the University of Chicago, Lee Bassett of the University of California-Santa Barbara, Andrew Dzurak of the University of New South Wales and Evelyn Hu of Harvard University.

Two significant breakthroughs are enabling this forward progress, Petta said in an interview. The first is the ability to control quantum units of information, known as quantum bits, at room temperature. Until recently, temperatures near absolute zero were required, but new diamond-based materials allow spin qubits to be operated on a table top, at room temperature. Diamond-based sensors could be used to image single molecules, as demonstrated earlier this year by Awschalom and researchers at Stanford University and IBM Research (Science, 2013).

The second big development is the ability to control these quantum bits, or qubits, for several seconds before they lapse into classical behavior, a feat achieved by Dzurak’s team (Nature, 2010) as well as by Princeton researchers led by Stephen Lyon, professor of electrical engineering (Nature Materials, 2012). The development of highly pure forms of silicon, the same material used in today’s classical computers, has enabled researchers to control a quantum mechanical property known as “spin.” At Princeton, Lyon and his team used highly purified silicon-28 to maintain coherence – the preservation of controlled spin states – in billions of electrons for several seconds.

Quantum-based technologies exploit the physical rules that govern very small particles — such as atoms and electrons — rather than the classical physics evident in everyday life. New technologies based on “spintronics,” which uses electron spin rather than the electron charge on which today’s devices rely, could be far more powerful than current technologies.

In quantum-based systems, the direction of the spin (either up or down) serves as the basic unit of information, analogous to the 0 or 1 bit in a classical computing system. Unlike a classical bit, however, an electron spin can assume both 0 and 1 at the same time, a feat called superposition, which – together with entanglement between multiple qubits – greatly enhances the ability to do computations.
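
A minimal numerical sketch of that idea (illustrative only, not tied to any particular hardware or to the authors’ work): a single spin qubit is just a two-component complex vector, and a simple rotation puts it into an equal superposition of 0 and 1.

```python
# Illustrative only: a spin qubit as a two-component state vector, with
# "spin up" playing the role of 0 and "spin down" the role of 1.
import numpy as np

up = np.array([1, 0], dtype=complex)     # |0>, spin up
down = np.array([0, 1], dtype=complex)   # |1>, spin down

# A Hadamard rotation puts the spin into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ up

# Born rule: probabilities of reading out 0 or 1 when the spin is measured.
probs = np.abs(state) ** 2
print(probs)   # [0.5 0.5] -- "both" 0 and 1 until measurement
```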

A remaining challenge is to find ways to transmit quantum information over long distances. Petta is exploring how to do this with collaborator Andrew Houck, associate professor of electrical engineering at Princeton. Last fall in the journal Nature, the team published a study demonstrating the coupling of a spin qubit to a particle of light, known as a photon, which acts as a shuttle for the quantum information.

Another hurdle is to scale up the number of qubits from a handful to hundreds, according to the researchers. Single qubits have been made in a variety of physical systems, including electron and nuclear spins as well as superconducting circuits.

Some of the most exciting applications are in new sensing and imaging technologies rather than in computing, said Petta. “Most people agree that building a real quantum computer that can factor large numbers is still a long ways out,” he said. “However, there has been a change in the way we think about quantum mechanics – now we are thinking about quantum-enabled technologies, such as using a spin qubit as a sensitive magnetic field detector to probe biological systems.”

Read the abstract.

Awschalom D.D., Bassett L.C., Dzurak A.S., Hu E.L. & Petta J.R. (2013). Quantum spintronics: engineering and manipulating atom-like spins in semiconductors. Science 339 (6124): 1174-1179.

The research at Princeton University was supported by the Alfred P. Sloan Foundation, the David and Lucile Packard Foundation, US Army Research Office grant W911NF-08-1-0189, DARPA QuEST award HR0011-09-1-0007 and the US National Science Foundation through the Princeton Center for Complex Materials (DMR-0819860) and CAREER award DMR-0846341.

Researchers discover workings of brain’s ‘GPS system’ (Nature)

By Catherine Zandonella, Office of the Dean for Research

Just as a global positioning system (GPS) helps find your location, the brain has an internal system for helping determine the body’s location as it moves through its surroundings.

A new study from researchers at Princeton University provides evidence for how the brain performs this feat. The study, published in the journal Nature, indicates that certain position-tracking neurons — called grid cells — ramp their activity up and down by working together in a collective way to determine location, rather than each cell acting on its own as was proposed by a competing theory.

Grid cells are neurons that become electrically active, or “fire,” as animals travel through an environment. Grid cells were first discovered in the mid-2000s; each cell fires when the body moves to specific locations – for example, particular spots in a room. Amazingly, these locations are arranged in a hexagonal pattern like the spaces on a Chinese checkers board. (See figure.)

As the mouse moves around in a square arena (left), a single grid cell in the mouse’s brain becomes active, or spikes, when the animal arrives at particular locations in the arena (right). These locations are arranged in a hexagonal pattern. The red dots indicate the mouse’s location in the arena when the grid cell fired. (Image credit: Cristina Domnisoru, Princeton University)

“Together, the grid cells form a representation of space,” said David Tank, Princeton’s Henry L. Hillman Professor in Molecular Biology and leader of the study. “Our research focused on the mechanisms at work in the neural system that forms these hexagonal patterns,” he said. The first author on the paper was graduate student Cristina Domnisoru, who conducted the experiments together with postdoctoral researcher Amina Kinkhabwala.

Domnisoru measured the electrical signals inside individual grid cells in mouse brains while the animals traversed a computer-generated virtual environment, developed previously in the Tank lab. The animals moved on a mouse-sized treadmill while watching a video screen in a set-up that is similar to video-game virtual reality systems used by humans.

She found that the cell’s electrical activity, measured as the difference in voltage between the inside and outside of the cell, started low and then ramped up, growing larger as the mouse reached each point on the hexagonal grid and then falling off as the mouse moved away from that point.

This ramping pattern corresponded with a proposed mechanism of neural computation called an attractor network. The brain is made up of vast numbers of neurons connected together into networks, and the attractor network is a theoretical model of how patterns of connected neurons can give rise to brain activity by collectively working together. The attractor network theory was first proposed 30 years ago by John Hopfield, Princeton’s Howard A. Prior Professor in the Life Sciences, Emeritus.
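
To illustrate what settling into an attractor means, here is a classic, textbook-style Hopfield network – not the grid-cell network analyzed in the paper, and with arbitrary sizes and noise levels – in which many simple interconnected units collectively fall back into a stored activity pattern from a noisy starting cue.

```python
# Illustrative only: a classic Hopfield-style attractor network. Units update
# based on the summed input from all other units and collectively settle into
# one of the stored patterns (an "attractor").
import numpy as np

rng = np.random.default_rng(0)
n_units, n_patterns = 100, 3

# Store a few random +1/-1 patterns in the connection weights (Hebbian rule).
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0.0)

# Start from a corrupted version of pattern 0 (30% of units flipped).
state = patterns[0].copy()
state[rng.random(n_units) < 0.3] *= -1

# Asynchronous updates: each unit aligns with its total input until settled.
for _ in range(20):
    for i in rng.permutation(n_units):
        state[i] = 1 if W[i] @ state >= 0 else -1

overlap = (state @ patterns[0]) / n_units
print(f"overlap with stored pattern after settling: {overlap:.2f}")  # close to 1.0
```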

The team found that their measurements of grid cell activity corresponded with the attractor network model but not a competing theory, the oscillatory interference model. This competing theory proposed that grid cells use rhythmic activity patterns, or oscillations, which can be thought of as many fast clocks ticking in synchrony, to calculate where animals are located. Although the Princeton  researchers detected rhythmic activity inside most neurons, the activity patterns did not appear to participate in position calculations.

Read the abstract.

Domnisoru, Cristina, Amina A. Kinkhabwala & David W. Tank. 2013. Membrane potential dynamics of grid cells. Nature. doi:10.1038/nature11973. Published online Feb. 10, 2013.

This work was supported by the National Institute of Neurological Disorders and Stroke under award numbers 5RC1NS068148-02 and 1R37NS081242-01, the National Institute of Mental Health under award number 5R01MH083686-04, a National Institutes of Health Postdoctoral Fellowship grant F32NS070514-01A1 (A.A.K.), and a National Science Foundation Graduate Research Fellowship (C.D.).

 

 

How do bacteria clog medical devices? Very quickly. (PNAS)

A new study has examined how bacteria clog medical devices, and the result isn’t pretty. The microbes join to create slimy ribbons that tangle and trap other passing bacteria, creating a full blockage in a startlingly short period of time.

The finding could help shape strategies for preventing clogging of devices such as stents — which are implanted in the body to keep open blood vessels and passages — as well as water filters and other items that are susceptible to contamination. The research was published in Proceedings of the National Academy of Sciences.

Over a period of about 40 hours, bacterial cells (green) flowed through a channel, forming a green biofilm on the walls. Over the next ten hours, researchers sent red bacterial cells through the channel. The red cells became stuck in the sticky biofilm and began to form thin red streamers. Once stuck, these streamers in turn trapped additional cells, leading to rapid clogging. (Image source: Knut Drescher)

Using time-lapse imaging, researchers at Princeton University monitored fluid flow in narrow tubes or pores similar to those used in water filters and medical devices. Unlike previous studies, the Princeton experiment more closely mimicked the natural features of the devices, using rough rather than smooth surfaces and pressure-driven fluid instead of non-moving fluid.

The team of biologists and engineers introduced a small number of bacteria known to be common contaminants of medical devices. Over a period of about 40 hours, the researchers observed that some of the microbes — dyed green for visibility — attached to the inner wall of the tube and began to multiply, eventually forming a slimy coating called a biofilm. These films consist of thousands of individual cells held together by a sort of biological glue.

Over the next several hours, the researchers sent additional microbes, dyed red, into the tube. These red cells became stuck to the biofilm-coated walls, where the force of the flowing liquid shaped the trapped cells into streamers that rippled in the current like flags in a breeze. During this time, the fluid flow slowed only slightly.

At about 55 hours into the experiment, the biofilm streamers tangled with each other, forming a net-like barrier that trapped additional bacterial cells, creating a larger barrier which in turn ensnared more cells. Within an hour, the entire tube became blocked and the fluid flow stopped.

The study was conducted by lead author Knut Drescher with assistance from technician Yi Shen. Drescher is a postdoctoral research associate working with Bonnie Bassler, Princeton’s Squibb Professor in Molecular Biology and a Howard Hughes Medical Institute Investigator, and Howard Stone, Princeton’s Donald R. Dixon ’69 and Elizabeth W. Dixon Professor of Mechanical and Aerospace Engineering.

“For me the surprise was how quickly the biofilm streamers caused complete clogging,” said Stone. “There was no warning that something bad was about to happen.”

By constructing their own controlled environment, the researchers demonstrated that rough surfaces and pressure-driven flow are features of real systems that need to be taken into account experimentally. The researchers used stents, soil-based filters and water filters to show that the biofilm streamers indeed form in real scenarios and likely explain why such devices fail.

The work also allowed the researchers to explore which bacterial genes contribute to biofilm streamer formation. Previous studies, conducted under non-realistic conditions, identified several genes involved in formation of the biofilm streamers. The Princeton researchers found that some of those previously identified genes were not needed for biofilm streamer formation in the more realistic habitat.

Read the abstract.

Drescher, Knut, Yi Shen, Bonnie L. Bassler, and Howard A. Stone. 2013. Biofilm streamers cause catastrophic disruption of flow with consequences for environmental and medical systems. Proceedings of the National Academy of Sciences. Published online February 11.

This work was supported by the Howard Hughes Medical Institute, National Institutes of Health grant 5R01GM065859, National Science Foundation (NSF) grant MCB-0343821, NSF grant MCB-1119232, and the Human Frontier Science Program.

Where the wild things go (Folia Primatologica)

A lack of fresh water makes swamp life hard for animals such as the endangered Zanzibar red colobus monkey, pictured here drinking from a container of fresh water provided by locals. (Photo by Katarzyna Nowak)

By Morgan Kelly, Office of Communications

Ecologists have evidence that some endangered primates and large cats faced with relentless human encroachment will seek sanctuary in the sultry thickets of mangrove and peat swamp forests. These harsh coastal biomes are characterized by thick vegetation — particularly clusters of salt-loving mangrove trees — and poor soil in the form of highly acidic peat, which is the waterlogged remains of partially decomposed leaves and wood. As such, swamp forests are among the few areas in many African and Asian countries that humans are relatively less interested in exploiting (though that is changing).

Yet conservationists have been slow to consider these tropical hideaways when keeping tabs on the distribution of threatened animals such as Sumatran orangutans and Javan leopards, according to a recent Princeton University study in the journal Folia Primatologica. To draw attention to peat and mangrove swamps as current — and possibly future — wildlife refuges, Katarzyna Nowak, a former postdoctoral researcher of ecology and evolutionary biology at Princeton, compiled a list of 60 primates and 20 felids (the large-cat family that includes tigers and leopards) known to divide their time between their natural forest habitats and some 47 swamp forests in Africa and Asia.

Because swamp forests often lack food sources, fresh water and easy mobility, few mammals are exclusive to these areas, Nowak reported. Consequently, conservation groups have not intensely monitored the animals’ swamp use.

But the presence of endangered cats and primates in swamp forests might be seriously overlooked, Nowak found. About 55 percent of Old World monkeys — primates such as baboons and macaques that are native to Africa and Asia — take to the swamps either regularly, seasonally or as needed. In 2008, the Wildlife Conservation Society reported that the inaccessible Lake Télé swamp forest in the Republic of the Congo was home to 125,000 lowland gorillas — more than were thought to exist in the wild. Among big cats, the Bengal tiger, for instance, holds its sole ground in Bangladesh in the Sundarbans, the world’s largest mangrove forest.

Princeton University research compiled 21 swamp forests in Africa (left) and 26 in Asia where primates and felids (a large cat family that includes tigers and leopards) are known to seek refuge from human encroachment. The colored dots indicate the overall “threat score,” or vulnerability, of species living in a particular site. Purple denotes a site with high species diversity, and where some resident primates and felids are likely listed as a conservation concern on the Red List of the International Union for Conservation of Nature. (Image by Katarzyna Nowak)

Life in the swamps can still be harsh for some animals. Species such as the crab-eating macaque and the fishing cat can adapt somewhat readily to a life of swimming and foraging for crustaceans. Meanwhile, Zanzibar’s red colobus monkey — driven to coastal mangroves by deforestation — can struggle to find the fresh water it needs, as Nowak reported in the American Journal of Primatology in 2008. Such a trend could nonetheless result in local extinction of the red colobus, she said.

Nowak concludes that swamp forests warrant further exploration as places where endangered species such as lowland gorillas and flat-headed cats have preserved their numbers — and where humans could potentially preserve them into the future.

Read the abstract.

Citation: Nowak, Katarzyna. 2013. Mangrove and Peat Swamp Forests: Refuge Habitats for Primates and Felids. Folia Primatologica. Vol. 83, no. 3-6, pp. 361-76.