Movie caption: Researchers at Princeton studied the temperature dependence of the formation of the nucleolus, a cellular organelle. The movie shows the nuclei of intact fly cells as they are subjected to temperature changes in the surrounding fluid. As the temperature is shifted from low to high, the spontaneously assembled proteins dissolve, as can be seen in the disappearance of the bright spots.
By Catherine Zandonella, Office of the Dean for Research
Researchers at Princeton found that the nucleolus, a cellular organelle involved in RNA synthesis, assembles in part through the passive process of phase separation – the same type of process that causes oil to separate from water. The study, published in the journal Proceedings of the National Academy of Sciences, is the first to show that this happens in living, intact cells.
Understanding how cellular structures form could help explain how organelles change in response to diseases. For example, a hallmark of cancer cells is the swelling of the nucleolus.
To explore the role of passive processes – as opposed to active processes that involve energy consumption – in nucleolus formation, Hanieh Falahati, a graduate student in Princeton’s Lewis-Sigler Institute for Integrative Genomics, looked at the behavior of six nucleolus proteins under different temperature conditions. Phase separation is enhanced at lower temperatures, which is why salad dressing containing oil and vinegar separates when stored in the refrigerator. If phase separation were driving the assembly of proteins, the researchers should see the effect at low temperatures.
Falahati showed that four of the six proteins condensed and assembled into the nucleolus at low temperatures and reverted when the temperature rose, indicating that the passive process of phase separation was at work. However, the assembly of the other two proteins was irreversible, indicating that active processes were in play.
“It was kind of a surprising result, and it shows that cells can take advantage of spontaneous processes for some functions, but for other things, active processes may give the cell more control,” said Falahati, whose adviser is Eric Wieschaus, Princeton’s Squibb Professor in Molecular Biology and a professor of molecular biology and the Lewis-Sigler Institute for Integrative Genomics, and a Howard Hughes Medical Institute researcher.
The research was funded in part by grant 5R37HD15587 from the National Institute of Child Health and Human Development (NICHD), and by the Howard Hughes Medical Institute.
By Catherine Zandonella, Office of the Dean for Research
A group of prominent researchers from seven institutions including Princeton University are calling for the establishment of a worldwide program to collect and test blood and other human bodily fluids to aid in the study and prevention of emerging infectious diseases such as the mosquito-borne Zika fever, which is caused by the Zika virus and has spread throughout Latin America and the Caribbean since early 2015.
In an article published in The Lancet April 5, the authors call for the creation of a World Serology Bank that would include storage repositories located around the world that use a common set of best practices to collect and evaluate blood and other bodily fluids from people infected by various pathogens. The resulting data would help scientists better understand the susceptibility of humans to emerging diseases such as Zika fever. The information could be shared widely among scientists who track disease.
Why is it important to create a World Serology Bank?
Serology is the study of bodily fluids including serum, the part of the blood that contains antibodies, with the aim of detecting the body’s immune response to pathogens. Serology provides us with the clearest window we have onto the landscape of susceptibility to pathogens across human populations, and the consequent risk of outbreaks. A World Serology Bank would shed light on the global risk of infectious disease outbreaks, and would be of tremendous public health benefit.
Why do you feel it is important to address this issue now?
With the emergence of pathogens like Zika virus, it becomes ever more important to understand what enhances or limits the global spread of these pathogens, and what the consequences of such spread may be across our pathogen communities. A World Serology Bank would provide a powerful mechanism toward such a global perspective.
What are the challenges involved in creating the Bank?
Challenges range from developing systems for collecting fluids, which can be done on a regular schedule or during specific disease events, to methods for sera storage and sera testing. Other challenges include defining who will administer the World Serology Bank, and what global data-sharing agreements will be put in place. Finally, we will need to develop new methods to translate what we learn from the evaluation of sera, such as patterns of susceptibility to specific pathogens, or protection from those pathogens. These methods will be driven by the underlying biology, and are likely to require an array of analytical innovations.
The article, “Use of serological surveys to generate key insights into the changing global landscape of infectious disease,” by C. Jessica E Metcalf, Jeremy Farrar, Felicity T. Cutts, Nicole E. Basta, Andrea L. Graham, Justin Lessler, Neil M. Ferguson, Donald S. Burke and Bryan T. Grenfell was published online in the journal The Lancet on April 5, 2016. http://dx.doi.org/10.1016/S0140-6736(16)30164-7.
Stopping the outbreak of a disease hinges on a wealth of data such as what makes a suitable host and how a pathogen spreads. But gathering these data can be difficult for diseases in remote areas of the world, or for epidemics involving wild animals.
A new study led by Princeton University researchers and published in the Journal of the Royal Society Interface explores an approach to studying epidemics for which details are difficult to obtain. The researchers analyzed the 2013 outbreak of dolphin morbillivirus — a potentially fatal pathogen from the same family as the human measles virus — that resulted in more than 1,600 bottlenose dolphins becoming stranded along the Atlantic coast of the United States by 2015. Because scientists were able to observe dolphins only after they washed up on shore, little is known about how the disease transmits and persists in the wild.
The researchers used a Poisson process — a statistical tool used to model the random nature of disease transmission — to determine from sparse data how dolphin morbillivirus can spread. They found that individual bottlenose dolphins may be infectious for up to a month and can spread the disease over hundreds of miles, particularly during seasonal migrations. In 2013, the height of disease transmission occurred toward the end of summer around an area offshore of Virginia Beach, Virginia, where multiple migratory dolphin groups are thought to cross paths.
In the interview below, first author Sinead Morris, a graduate student in ecology and evolutionary biology, explains what the researchers learned about the dolphin morbillivirus outbreak, and how the Poisson process can help scientists understand human epidemics. Morris is in the research group of co-author Bryan Grenfell, Princeton’s Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs.
Q: How does the Poisson process track indirectly observed epidemics and what specific challenges does it overcome?
A: One of the main challenges in modeling indirectly observed epidemics is a lack of data. In our case, we had information on all infected dolphins that had been found stranded on shore, but had no data on the number of individuals that became infected but did not strand. The strength of the Poisson process is that its simple framework means it can be used to extract important information about how the disease is spreading across space and time, despite such incomplete data. Essentially, the process keeps track of where and when individual dolphins stranded, and then at each new point in the epidemic it uses the history of what has happened before to project what will happen in the future. For example, an infected individual is more likely to transmit the disease onwards to other individuals in close spatial proximity than to those far away. So, by keeping track of all these infections the model can identify where and when the largest risk of new infections will be.
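The bookkeeping described above can be sketched as a simple self-exciting point process, in which each past stranding raises the expected rate of new cases nearby, with the effect decaying over time and distance. This is an illustrative toy, not the authors' actual model; the function name, exponential kernels and parameter values are all assumptions made for the sketch:

```python
import math

def intensity(x, t, events, mu=0.1, alpha=0.5, tau=7.0, sigma=50.0):
    """Conditional intensity (expected case rate) at position x (km) and time t (days).

    mu    -- background stranding rate
    alpha -- boost each past event contributes
    tau   -- time scale (days) over which an event's influence fades
    sigma -- spatial scale (km) over which an event's influence fades
    """
    rate = mu
    for xi, ti in events:
        if ti < t:  # only past events influence the present
            time_decay = math.exp(-(t - ti) / tau)
            space_decay = math.exp(-((x - xi) ** 2) / (2 * sigma ** 2))
            rate += alpha * time_decay * space_decay
    return rate

# Two hypothetical strandings: (position along coast in km, time in days)
events = [(0.0, 0.0), (30.0, 5.0)]
near = intensity(20.0, 10.0, events)   # close to both past strandings
far = intensity(500.0, 10.0, events)   # hundreds of km away
```

Because the kernels decay with distance, `near` comes out well above `far`: locations close to recent strandings carry the highest projected risk, which is exactly the pattern the model exploits when data on unstranded infections are missing.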
Q: Why was this 2013-15 outbreak of dolphin morbillivirus selected for study, and what key insights does this work provide?
A: The recent outbreak of dolphin morbillivirus spread rapidly along the northwestern Atlantic coast from New York to Florida, causing substantial mortality among coastal bottlenose dolphin populations. Despite the clear detrimental impact that this disease can have, however, it is still poorly understood. Therefore, our aim in modeling the epidemic was to gain much needed information about how the virus spreads. We found that a dolphin may be infectious for up to 24 days and can travel substantial distances (up to 220 kilometers, or 137 miles) within this time. This is important because such long-range movements — for example, during periods of seasonal migration — are likely to create many transmission opportunities from infected to uninfected individuals, and may have thus facilitated the rapid spread of the virus down the Atlantic coast.
Q: Can this model be used for human epidemics?
A: The Poisson process framework was originally developed to model the occurrence of earthquakes, and has since been used in a variety of other contexts that also tend to suffer from noisy, indirectly observed data, such as urban crime distribution. To model dolphin morbillivirus, we adapted the framework to incorporate more biological information, and similar techniques have also been applied to model meningococcal disease in humans, which can cause meningitis and sepsis. Generally, the data characterizing human epidemics are more detailed than the data we had for this project and, as such, models that can incorporate greater complexity are more widely used. However, we hope that our methods will stimulate the greater use of Poisson process models in epidemiological systems that also suffer from indirectly observed data.
This research was supported by the RAPIDD program of the Science and Technology Directorate of the Department of Homeland Security; the National Institutes of Health Fogarty International Center; the Bill and Melinda Gates Foundation; and the Marine Mammal Unusual Mortality Event Contingency Fund and John H. Prescott Marine Mammal Rescue Assistance Grant Program operated by the National Oceanic and Atmospheric Administration.
Sinead E. Morris, Jonathan L. Zelner, Deborah A. Fauquier, Teresa K. Rowles, Patricia E. Rosel, Frances Gulland and Bryan T. Grenfell. “Partially observed epidemics in wildlife hosts: modeling an outbreak of dolphin morbillivirus in the northwestern Atlantic, June 2013–2014.” Journal of the Royal Society Interface, published Nov. 18, 2015. DOI: 10.1098/rsif.2015.0676
“Antibiotic resistance is a problem of managing an open-access resource, such as fisheries or oil,” writes Ramanan Laxminarayan, a research scholar at Princeton University and the director of the Center for Disease Dynamics, Economics & Policy in Washington, D.C., in today’s issue of the journal Science. He goes on to say that individuals have little incentive to use antibiotics wisely, just as people have little incentive to conserve oil when it is plentiful.
As with many other natural resources, maintaining the effectiveness of antibiotics requires two approaches: conserving the existing resource and exploring new sources, Laxminarayan says. These two approaches are linked, however. “Just as incentives for finding new sources of oil reduce incentives to conserve oil,” Laxminarayan writes, “large public subsidies for new drug development discourage efforts to improve how existing antibiotics are used.” Yet new antibiotics tend to cost more than existing ones due to the expense of clinical trials and the fact that the easiest-to-find drugs may have already been discovered.
Laxminarayan’s analysis reveals that the benefits of conserving existing drugs are significant, and argues that the proposed increases in public subsidies for new antibiotics should be matched by greater spending on conservation of antibiotic effectiveness through public education, research and surveillance.
Ramanan Laxminarayan is a research scholar at the Princeton Environmental Institute. His perspective, “Antibiotic effectiveness: Balancing conservation against innovation,” appeared in the September 12, 2014 issue of Science.
By Catherine Zandonella, Office of the Dean for Research
A decades-long debate over how nitrogen is removed from the ocean may now be settled by new findings from researchers at Princeton University and their collaborators at the University of Washington.
The debate centers on how nitrogen — one of the most important food sources for ocean life and a controller of atmospheric carbon dioxide — becomes converted to a form that can exit the ocean and return to the atmosphere where it is reused in the global nitrogen cycle.
Researchers have argued over which of two nitrogen-removal mechanisms, denitrification and anammox, is most important in the oceans. The question is not just a scientific curiosity, but has real world applications because one mechanism contributes more greenhouse gases to the atmosphere than the other.
“Nitrogen controls much of the productivity of the ocean,” said Andrew Babbin, first author of the study and a graduate student who works with Bess Ward, Princeton’s William J. Sinclair Professor of Geosciences. “Understanding nitrogen cycling is crucial to understanding the productivity of the oceans as well as the global climate,” he said.
In the new study, the researchers found that both of these nitrogen “exit strategies” are at work in the oceans, with denitrification mopping up about 70 percent of the nitrogen and anammox disposing of the rest.
The researchers also found that this 70-30 ratio could shift in response to changes in the quantity and quality of the nitrogen in need of removal. The study was published online this week in the journal Science.
The two other members of the research team were Richard Keil and Allan Devol, both professors at the University of Washington’s School of Oceanography.
Essential for the Earth’s life and climate, nitrogen is an element that cycles between soils and the atmosphere and between the atmosphere and the ocean. Bacteria near the surface help shuttle nitrogen into the ocean food chain by converting or “fixing” atmospheric nitrogen into forms that phytoplankton can use.
Without this fixed nitrogen, phytoplankton could not absorb carbon dioxide from the air, a feat which is helping to check today’s rising carbon dioxide levels in the atmosphere. When these tiny marine algae die or are consumed by predators, their biomass sinks to the ocean interior where it becomes food for other types of bacteria.
Until about 20 years ago, most scientists thought that denitrification, carried out by some of these bacteria, was the primary way that fixed nitrogen was recycled back to nitrogen gas. The second process, known as anaerobic ammonia oxidation, or anammox, was discovered by Dutch researchers studying how nitrogen is removed in sewage treatment plants.
Both processes occur in regions of the ocean that are naturally low in oxygen, or anoxic, due to local lack of water circulation and intense phytoplankton productivity overlying these regions. Within the world’s ocean, such regions occur only in the Arabian Sea, and off the coasts of Peru and Mexico.
In these anoxic environments, anaerobic bacteria feast on the decaying phytoplankton, and in the process cause the denitrification of nitrate into nitrogen gas, which cannot be used as a nutrient by most phytoplankton. During this process, ammonium is also produced, although marine geochemists had never been able to detect the ammonium that they knew must be there.
That riddle was solved in the early 2000s by the discovery of the anammox reaction in the marine environment, in which anaerobic bacteria feed on ammonium and convert it to nitrogen gas.
But another riddle soon appeared: the anammox rates that Dutch and German teams of researchers measured in the oceans appeared to account for the entire nitrogen loss, leaving no role for denitrification.
Then in 2009, Ward’s team published a study in the journal Nature showing that denitrification was still a major actor in returning nitrogen to the air, at least in the Arabian Sea. The paper further fueled the controversy.
Back at Princeton, Ward suspected that both processes were necessary, with denitrification churning out the ammonium that anammox then converted to nitrogen gas.
To settle the issue, Ward and Babbin decided to look at exactly what was going on in anoxic ocean water when bacteria were given nitrogen and other nutrients to chew on.
They collected water samples from an anoxic region in the ocean south of Baja California and brought test tubes of the water into an on-ship laboratory. Working inside a sturdy, flexible “glove bag” to keep air from contaminating the low-oxygen water, Babbin added specific amounts and types of nitrogen and organic compounds to each test tube, and then noted whether denitrification or anammox occurred.
“We conducted a suite of experiments in which we added different types of organic matter, with variable ammonium content, to see if the ratio between denitrification and anammox would change,” said Babbin. “We found that not only did increased ammonia favor anammox as predicted, but that the precise proportions of nitrogen loss matched exactly as predicted based on the ammonium content.”
The explanation of why, in past experiments, some researchers found mostly denitrification while others found only anammox comes down to a sort-of “bloom and bust” cycle of phytoplankton life, explained Ward.
“If you have a big plankton bloom, then when those organisms die, a large amount of organic matter will sink and be degraded,” she said, “but we scientists are not always there to measure this. In other words, if you aren’t there on the day lunch is delivered, you won’t measure these processes.”
The researchers also linked the rates of nitrogen loss with the supply of organic material that drives the rates: more organic material equates to more nitrogen loss, so the quantity of the material matters too, Babbin said.
The two pathways have distinct metabolisms that turn out to be important in global climate change, he said. “Denitrification produces carbon dioxide and both produces and consumes nitrous oxide, which is another major greenhouse gas and an ozone depletion agent,” he said. “Anammox, however, consumes carbon dioxide and has no known nitrous oxide byproduct. The balance between the two therefore has a significant impact on the production and consumption of greenhouse gases in the ocean.”
The research was funded by National Science Foundation grant OCE-1029951.
Andrew R. Babbin, Richard G. Keil, Allan H. Devol, and Bess B. Ward. Organic Matter Stoichiometry, Flux, and Oxygen Control Nitrogen Loss in the Ocean. Science. Published online April 10, 2014. DOI: 10.1126/science.1248364
While carbon dioxide is typically painted as the bad boy of greenhouse gases, methane is roughly 30 times more potent as a heat-trapping gas. New research in the journal Nature indicates that for each degree that the Earth’s temperature rises, the amount of methane entering the atmosphere from microorganisms dwelling in lake sediment and freshwater wetlands — the primary sources of the gas — will increase several times. As temperatures rise, the relative increase of methane emissions will outpace that of carbon dioxide from these sources, the researchers report.
The findings condense the complex and varied process by which methane — currently the third most prevalent greenhouse gas after carbon dioxide and water vapor — enters the atmosphere into a measurement scientists can use, explained co-author Cristian Gudasz, a visiting postdoctoral research associate in Princeton’s Department of Ecology and Evolutionary Biology. In freshwater systems, methane is produced as microorganisms digest organic matter, a process known as “methanogenesis.” This process hinges on a slew of temperature, chemical, physical and ecological factors that can bedevil scientists working to model how the Earth’s systems will contribute, and respond, to a hotter future.
The researchers’ findings suggest that methane emissions from freshwater systems will likely rise with the global temperature, Gudasz said. But not knowing the extent of the methane contribution from such widely dispersed ecosystems, which include lakes, swamps, marshes and rice paddies, leaves a glaring hole in climate projections.
“The freshwater systems we talk about in our paper are an important component to the climate system,” Gudasz said. “There is more and more evidence that they have a contribution to the methane emissions. Methane produced from natural or manmade freshwater systems will increase with temperature.”
To provide a simple and accurate way for climate modelers to account for methanogenesis, Gudasz and his co-authors analyzed nearly 1,600 measurements of temperature and methane emissions from 127 freshwater ecosystems across the globe.
The researchers found that a common effect emerged from those studies: freshwater methane generation increases sharply with temperature. Methane emissions would rise 57-fold as the temperature climbed from 0 degrees Celsius to 30 degrees Celsius, the researchers report. For those inclined to model it, the results translate to a temperature dependence of 0.96 electron volts (eV), a measure of how sensitive the methane-emitting ecosystems are to temperature.
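An activation energy in electron volts implies Boltzmann-Arrhenius scaling, in which rates vary as exp(-E/kT). As a rough consistency check (this calculation is not from the paper itself; the function name and constants are this sketch's own), the 57-fold increase between 0 and 30 degrees Celsius follows directly from the reported 0.96 eV:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV per kelvin

def arrhenius_factor(e_a, t_from_c, t_to_c):
    """Ratio of rates at two Celsius temperatures under exp(-E/kT) scaling."""
    t0 = t_from_c + 273.15  # convert to kelvin
    t1 = t_to_c + 273.15
    return math.exp((e_a / K_B) * (1.0 / t0 - 1.0 / t1))

factor = arrhenius_factor(0.96, 0.0, 30.0)  # close to the reported 57-fold rise
```

The computed ratio lands near 57, matching the emissions increase the researchers report for that temperature range.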
“We all want to make predictions about greenhouse gas emissions and their impact on global warming,” Gudasz said. “Looking across these scales and constraining them as we have in this paper will allow us to make better predictions.”
Yvon-Durocher, Gabriel, Andrew P. Allen, David Bastviken, Ralf Conrad, Cristian Gudasz, Annick St-Pierre, Nguyen Thanh-Duc, Paul A. del Giorgio. 2014. Methane fluxes show consistent temperature dependence across microbial to ecosystem scales. Nature. Article published online before print: March 19, 2014. DOI: 10.1038/nature13164 and in the March 27, 2014 print edition.
The oxygen content of the ocean may be subject to frequent ups and downs in a very literal sense — that is, in the form of the numerous sea creatures that dine near the surface at night then submerge into the safety of deeper, darker waters at daybreak.
Research begun at Princeton University and recently reported in the journal Nature Geoscience found that animals ranging from plankton to small fish daily consume vast amounts of what little oxygen is available in the ocean’s aptly named “oxygen minimum zone.” The sheer number of organisms that seek refuge in water roughly 200 to 650 meters deep (650 to 2,000 feet) every day results in the global consumption of between 10 and 40 percent of the oxygen available at these depths.
The findings reveal a crucial and underappreciated role that animals have in ocean chemistry on a global scale, explained first author Daniele Bianchi, a postdoctoral researcher at McGill University who began the project as a doctoral student of atmospheric and oceanic sciences at Princeton.
“In a sense, this research should change how we think of the ocean’s metabolism,” Bianchi said. “Scientists know that there is this massive migration, but no one has really tried to estimate how it impacts the chemistry of the ocean.
“Generally, scientists have thought that microbes and bacteria primarily consume oxygen in the deeper ocean,” Bianchi said. “What we’re saying here is that animals that migrate during the day are a big source of oxygen depletion. We provide the first global data set to say that.”
Much of the deep ocean can replenish (often just barely) the oxygen consumed during these mass migrations, which are known as diel vertical migrations (DVMs).
But the balance between DVMs and the limited deep-water oxygen supply could be easily upset, Bianchi said — particularly by climate change, which is predicted to further decrease levels of oxygen in the ocean. That could mean these animals would not be able to descend as deep, putting them at the mercy of predators and inflicting their oxygen-sucking ways on a new ocean zone.
“If the ocean oxygen changes, then the depth of these migrations also will change. We can expect potential changes in the interactions between larger guys and little guys,” Bianchi said. “What complicates this story is that if these animals are responsible for a chunk of oxygen depletion in general, then a change in their habits might have a feedback in terms of oxygen levels in other parts of the deeper ocean.”
The researchers produced a global model of DVM depths and oxygen depletion by mining acoustic oceanic data collected by 389 American and British research cruises between 1990 and 2011. Using the background readings caused by the sound of animals as they ascended and descended, the researchers identified more than 4,000 DVM events.
They then chemically analyzed samples from DVM-event locations to create a model that could correlate DVM depth with oxygen depletion. With that data, the researchers concluded that DVMs indeed intensify the oxygen deficit within oxygen minimum zones.
“You can say that the whole ecosystem does this migration — chances are that if it swims, it does this kind of migration,” Bianchi said. “Before, scientists tended to ignore this big chunk of the ecosystem when thinking of ocean chemistry. We are saying that they are quite important and can’t be ignored.”
Bianchi conducted the data analysis and model development at McGill with assistant professor of earth and planetary sciences Eric Galbraith and McGill doctoral student David Carozza. Initial research of the acoustic data and development of the migration model was conducted at Princeton with K. Allison Smith (published as K.A.S. Mislan), a postdoctoral research associate in the Program in Atmospheric and Oceanic Sciences, and Charles Stock, a researcher with the Geophysical Fluid Dynamics Laboratory operated by the National Oceanic and Atmospheric Administration.
Citation: Bianchi, Daniele, Eric D. Galbraith, David A. Carozza, K.A.S. Mislan and Charles A. Stock. 2013. Intensification of open-ocean oxygen depletion by vertically migrating animals. Nature Geoscience. Article first published online: June 9, 2013. DOI: 10.1038/ngeo1837
This work was supported in part by grants from the Canadian Institute for Advanced Research and the Princeton Carbon Mitigation Initiative.
A new study has examined how bacteria clog medical devices, and the result isn’t pretty. The microbes join to create slimy ribbons that tangle and trap other passing bacteria, creating a full blockage in a startlingly short period of time.
The finding could help shape strategies for preventing clogging of devices such as stents — which are implanted in the body to keep open blood vessels and passages — as well as water filters and other items that are susceptible to contamination. The research was published in Proceedings of the National Academy of Sciences.
Using time-lapse imaging, researchers at Princeton University monitored fluid flow in narrow tubes or pores similar to those used in water filters and medical devices. Unlike previous studies, the Princeton experiment more closely mimicked the natural features of the devices, using rough rather than smooth surfaces and pressure-driven fluid instead of non-moving fluid.
The team of biologists and engineers introduced a small number of bacteria known to be common contaminants of medical devices. Over a period of about 40 hours, the researchers observed that some of the microbes — dyed green for visibility — attached to the inner wall of the tube and began to multiply, eventually forming a slimy coating called a biofilm. These films consist of thousands of individual cells held together by a sort of biological glue.
Over the next several hours, the researchers sent additional microbes, dyed red, into the tube. These red cells became stuck to the biofilm-coated walls, where the force of the flowing liquid shaped the trapped cells into streamers that rippled in the liquid like flags rippling in a breeze. During this time, the fluid flow slowed only slightly.
At about 55 hours into the experiment, the biofilm streamers tangled with each other, forming a net-like barrier that trapped additional bacterial cells, creating a larger barrier which in turn ensnared more cells. Within an hour, the entire tube became blocked and the fluid flow stopped.
The study was conducted by lead author Knut Drescher with assistance from technician Yi Shen. Drescher is a postdoctoral research associate working with Bonnie Bassler, Princeton’s Squibb Professor in Molecular Biology and a Howard Hughes Medical Institute Investigator, and Howard Stone, Princeton’s Donald R. Dixon ’69 and Elizabeth W. Dixon Professor of Mechanical and Aerospace Engineering.
“For me the surprise was how quickly the biofilm streamers caused complete clogging,” said Stone. “There was no warning that something bad was about to happen.”
By constructing their own controlled environment, the researchers demonstrated that rough surfaces and pressure-driven flow, both characteristic of real-world conditions, need to be taken into account experimentally. The researchers used stents, soil-based filters and water filters to show that the biofilm streamers indeed form in real scenarios and likely explain why such devices fail.
The work also allowed the researchers to explore which bacterial genes contribute to biofilm streamer formation. Previous studies, conducted under non-realistic conditions, identified several genes involved in formation of the biofilm streamers. The Princeton researchers found that some of those previously identified genes were not needed for biofilm streamer formation in the more realistic habitat.
Drescher, Knut, Yi Shen, Bonnie L. Bassler, and Howard A. Stone. 2013. Biofilm streamers cause catastrophic disruption of flow with consequences for environmental and medical systems. Proceedings of the National Academy of Sciences. Published online February 11.
This work was supported by the Howard Hughes Medical Institute, National Institutes of Health grant 5R01GM065859, National Science Foundation (NSF) grant MCB-0343821, NSF grant MCB-1119232, and the Human Frontier Science Program.
In mammals such as rodents that raise their young as a group, infants will nurse from their mother as well as other females, a dynamic known as allosuckling. Ecologists have long hypothesized that allosuckling lets newborns stockpile antibodies to various diseases, but the experimental proof has been lacking until now.
An in-press report in the journal Mammalian Biology found that infant Mongolian gerbils that suckled from females given separate vaccines for two different diseases wound up with antibodies for both illnesses.
The findings not only demonstrate the potential purpose of allosuckling, but also provide the first framework for further studying it in the wild by using traceable antibodies, said first author Romain Garnier, a postdoctoral researcher in Princeton University’s Department of Ecology and Evolutionary Biology. Garnier conducted the research with Sylvain Gandon and Thierry Boulinier of the Center for Functional and Evolutionary Ecology in France, and with Yannick Chaval and Nathalie Charbonnel at the Center for Biology and Management of Populations in France.
Garnier and his coauthors administered an influenza vaccine to one group of female gerbils, and a vaccine for Borrelia burgdorferi — the bacterial agent of Lyme disease — to another group. Once impregnated, female gerbils from each vaccine group were paired and, as the gerbils do in nature, kept separate from the male gerbils to birth and rear their young. In the wild, females can choose which young to nurse and infant gerbils can likewise choose which female to suckle. In the typical lab, however, one male, one female and their young are housed together, the researchers wrote.
When screened at birth, none of the infant gerbils had detectable antibodies against influenza, and only one had antibodies against B. burgdorferi, according to the paper. But after eight days of nursing, all the infants carried high levels of antibodies against both influenza and B. burgdorferi, suggesting that the females nursed the young — their own and those of the other female — evenly. These results suggest that allosuckling does indeed serve to expose newborn animals to a host of antibodies.
This benefit sheds light on a peculiar arrangement in cooperative mammals that ecologists have puzzled over, the authors wrote. In social species, females usually fall into dominant or subordinate groups with the subordinate females typically involved in tending to the young produced by dominant females. Yet, in many cases, subordinate females are “allowed” to breed. Garnier and his colleagues suggest that the potentially larger antibody pool available through nursing might be one of the reasons why.
Citation: Garnier, R., et al., Evidence of cross-transfer of maternal antibodies through allosuckling in a mammal: Potential importance for behavioral ecology. Mammal. Biol. (2012).
Researchers use mathematical models to consider the implications of “self-boosting” vaccines—a class of emerging vaccines that can establish long-term intermittent antigen presentation within a host—on herd immunity.
“Self-boosting vaccines and their implications for herd immunity” by Nimalan Arinaminpathy, et al.