Researchers propose surveillance system for Zika virus and other infectious diseases (The Lancet)

Test tube image courtesy of the National Institutes of Health

By Catherine Zandonella, Office of the Dean for Research

A group of prominent researchers from seven institutions, including Princeton University, is calling for the establishment of a worldwide program to collect and test blood and other human bodily fluids to aid in the study and prevention of emerging infectious diseases such as Zika fever. The mosquito-borne disease, caused by the Zika virus, has spread throughout Latin America and the Caribbean since early 2015.

In an article published in The Lancet April 5, the authors call for the creation of a World Serology Bank that would include storage repositories located around the world that use a common set of best practices to collect and evaluate blood and other bodily fluids from people infected by various pathogens. The resulting data would help scientists better understand the susceptibility of humans to emerging diseases such as Zika fever. The information could be shared widely among scientists who track disease.

The authors include Princeton’s C. Jessica Metcalf, an assistant professor of Ecology and Evolutionary Biology and Public Affairs in the Woodrow Wilson School of Public and International Affairs, and Bryan Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs. In the interview below, Metcalf explains more about the World Serology Bank proposal.

Why is it important to create a World Serology Bank?

Serology is the study of bodily fluids including serum, the part of the blood that contains antibodies, with the aim of detecting the body’s immune response to pathogens. Serology provides us with the clearest window we have onto the landscape of susceptibility to pathogens across human populations, and the consequent risk of outbreaks. A World Serology Bank would shed light on the global risk of infectious disease outbreaks, and would be of tremendous public health benefit.

Why do you feel it is important to address this issue now?

With the emergence of pathogens like Zika virus, it becomes ever more important to understand what enhances or limits the global spread of these pathogens, and what the consequences of such spread may be across our pathogen communities. A World Serology Bank would provide a powerful mechanism toward such a global perspective.

What are the challenges involved in creating the Bank?

Challenges range from developing systems for collecting fluids, which can be done on a regular schedule or during specific disease events, to methods for sera storage and sera testing. Other challenges include defining who will administer the World Serology Bank, and what global data-sharing agreements will be put in place. Finally, we will need to develop new methods to translate what we learn from the evaluation of sera, such as patterns of susceptibility to specific pathogens, or protection from those pathogens. These methods will be driven by the underlying biology, and are likely to require an array of analytical innovations.

Read more

The article, “Use of serological surveys to generate key insights into the changing global landscape of infectious disease,” by C. Jessica E. Metcalf, Jeremy Farrar, Felicity T. Cutts, Nicole E. Basta, Andrea L. Graham, Justin Lessler, Neil M. Ferguson, Donald S. Burke and Bryan T. Grenfell was published online in the journal The Lancet on April 5, 2016.


Dolphin-disease outbreak shows how to account for the unknown when tracking epidemics (Journal of the Royal Society Interface)

By Morgan Kelly, Office of Communications

Common bottlenose dolphin. Image credit: Allison Henry, NOAA.

Stopping the outbreak of a disease hinges on a wealth of data such as what makes a suitable host and how a pathogen spreads. But gathering these data can be difficult for diseases in remote areas of the world, or for epidemics involving wild animals.

A new study led by Princeton University researchers and published in the Journal of the Royal Society Interface explores an approach to studying epidemics for which details are difficult to obtain. The researchers analyzed the 2013 outbreak of dolphin morbillivirus — a potentially fatal pathogen from the same family as the human measles virus — that resulted in more than 1,600 bottlenose dolphins becoming stranded along the Atlantic coast of the United States by 2015. Because scientists were able to observe dolphins only after they washed up on shore, little is known about how the disease spreads and persists in the wild.

The researchers used a Poisson process — a statistical tool used to model the random nature of disease transmission — to determine from sparse data how dolphin morbillivirus can spread. They found that individual bottlenose dolphins may be infectious for up to a month and can spread the disease over hundreds of miles, particularly during seasonal migrations. In 2013, the height of disease transmission occurred toward the end of summer around an area offshore of Virginia Beach, Virginia, where multiple migratory dolphin groups are thought to cross paths.

In the interview below, first author Sinead Morris, a graduate student in ecology and evolutionary biology, explains what the researchers learned about the dolphin morbillivirus outbreak, and how the Poisson process can help scientists understand human epidemics. Morris is in the research group of co-author Bryan Grenfell, Princeton’s Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs.

Q: How does the Poisson process track indirectly observed epidemics and what specific challenges does it overcome?

A: One of the main challenges in modeling indirectly observed epidemics is a lack of data. In our case, we had information on all infected dolphins that had been found stranded on shore, but had no data on the number of individuals that became infected but did not strand. The strength of the Poisson process is that its simple framework means it can be used to extract important information about how the disease is spreading across space and time, despite having such incomplete data. Essentially the way the process works is that it keeps track of where and when individual dolphins stranded, and then at each new point in the epidemic it uses the history of what has happened before to project what will happen in the future. For example, an infected individual is more likely to transmit the disease onwards to other individuals in close spatial proximity than to those far away. So, by keeping track of all these infections the model can identify where and when the largest risk of new infections will be.
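The self-exciting logic Morris describes — each recorded stranding raises the near-term risk of new infections close by — can be sketched as a toy spatiotemporal point-process intensity. Everything here (function names, kernel forms, parameter values) is an illustrative assumption for exposition, not the authors' actual model:

```python
import math

def triggering_kernel(dt, dx, alpha=0.5, beta=0.1, sigma=50.0):
    """Contribution of one past stranding dt days and dx km away."""
    if dt <= 0:
        return 0.0  # past events cannot be influenced by future ones
    temporal = alpha * beta * math.exp(-beta * dt)  # influence decays over time
    spatial = math.exp(-dx**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
    return temporal * spatial  # nearby, recent events contribute most

def conditional_intensity(t, x, events, mu=0.01):
    """Expected rate of new strandings at time t (days), position x (km along
    the coast), given the history of (time, position) stranding events."""
    rate = mu  # background rate unrelated to past transmission
    for t_i, x_i in events:
        rate += triggering_kernel(t - t_i, x - x_i)
    return rate

# Toy history of strandings: (day, km along coast)
history = [(10, 0.0), (12, 30.0), (20, 150.0)]
risk_near = conditional_intensity(25, 40.0, history)   # close to recent events
risk_far = conditional_intensity(25, 800.0, history)   # far from all events
assert risk_near > risk_far
```

Projecting this intensity forward over a grid of positions and times is what lets such a model flag where and when the risk of new infections peaks, even though unstranded infections are never observed.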

Q: Why was this 2013-15 outbreak of dolphin morbillivirus selected for study, and what key insights does this work provide?

A: The recent outbreak of dolphin morbillivirus spread rapidly along the northwestern Atlantic coast from New York to Florida, causing substantial mortality among coastal bottlenose dolphin populations. Despite the clear detrimental impact that this disease can have, however, it is still poorly understood. Therefore, our aim in modeling the epidemic was to gain much needed information about how the virus spreads. We found that a dolphin may be infectious for up to 24 days and can travel substantial distances (up to 220 kilometers, or 137 miles) within this time. This is important because such long-range movements — for example, during periods of seasonal migration — are likely to create many transmission opportunities from infected to uninfected individuals, and may have thus facilitated the rapid spread of the virus down the Atlantic coast.

Q: Can this model be used for human epidemics?

A: The Poisson process framework was originally developed to model the occurrence of earthquakes, and has since been used in a variety of other contexts that also tend to suffer from noisy, indirectly observed data, such as urban crime distribution. To model dolphin morbillivirus, we adapted the framework to incorporate more biological information, and similar techniques have also been applied to model meningococcal disease in humans, which can cause meningitis and sepsis. Generally, the data characterizing human epidemics are more detailed than the data we had for this project and, as such, models that can incorporate greater complexity are more widely used. However, we hope that our methods will stimulate the greater use of Poisson process models in epidemiological systems that also suffer from indirectly observed data.

Graph of predictions of risk of disease transmission.

A new study led by Princeton University researchers used a Poisson process to analyze sparse data from the 2013 outbreak of morbillivirus among bottlenose dolphins along the United States’ Atlantic coast. This graph shows the model predictions of how the risk of disease transmission (marginal hazard) changes over space (A) and time (B) since the beginning of the epidemic. The peaks indicate that the greatest risk of transmission occurred around day 70 of the epidemic between 36 and 37 degrees north latitude, which is an area that encompasses the offshore waters of Virginia Beach, Virginia. These peaks coincide with a period towards the end of summer when large numbers of dolphins are known to gather around Virginia Beach as their seasonal migratory ranges overlap. (Image courtesy of Sinead Morris, Princeton University)

This research was supported by the RAPIDD program of the Science and Technology Directorate of the Department of Homeland Security; the National Institutes of Health Fogarty International Center; the Bill and Melinda Gates Foundation; and the Marine Mammal Unusual Mortality Event Contingency Fund and John H. Prescott Marine Mammal Rescue Assistance Grant Program operated by the National Oceanic and Atmospheric Administration.

Read the abstract.

Sinead E. Morris, Jonathan L. Zelner, Deborah A. Fauquier, Teresa K. Rowles, Patricia E. Rosel, Frances Gulland and Bryan T. Grenfell. “Partially observed epidemics in wildlife hosts: modeling an outbreak of dolphin morbillivirus in the northwestern Atlantic, June 2013–2014.” Journal of the Royal Society Interface, published Nov. 18, 2015. DOI: 10.1098/rsif.2015.0676


Genetic tweak gave yellow fever mosquitoes a nose for human odor (Nature)

By Morgan Kelly, Office of Communications

One of the world’s deadliest mosquitoes sustains its taste for human blood thanks in part to a genetic tweak that makes it more sensitive to human odor, according to new research.

Researchers report in the journal Nature that the yellow fever mosquito contains a version of an odor-detecting gene in its antennae that is highly attuned to sulcatone, a compound prevalent in human odor. The researchers found that the gene, AaegOr4, is more abundant and more sensitive in the human-preferring “domestic” form of the yellow fever mosquito than in its ancestral “forest” form that prefers the blood of non-human animals.

The research provides a rare glimpse at the genetic changes that cause behaviors to evolve, explained first author Carolyn “Lindy” McBride, an assistant professor in Princeton University’s Department of Ecology and Evolutionary Biology and the Princeton Neuroscience Institute who conducted the work as a postdoctoral researcher at the Rockefeller University. Uncovering the genetic basis of changes in behavior can help us understand the neural pathways that carry out that behavior, McBride said.

The research also could help in developing better ways to stem the yellow fever mosquito’s appetite for humans, McBride said. The yellow fever mosquito is found in tropical and subtropical areas worldwide and is the principal carrier of yellow fever, dengue fever, and the painful infection known as chikungunya. Yellow fever annually kills tens of thousands of people worldwide, primarily in Africa, while dengue fever infects hundreds of millions. The research also suggests a possible genetic root for human preference in other mosquitoes, such as malaria mosquitoes, although that species is genetically very different from the yellow fever mosquito.

“The more we know about the genes and compounds that help mosquitoes target us, the better chance we have of manipulating their response to our odor,” McBride said, adding that scent is not the only driver of mosquito behavior, but it is a predominant factor.

The researchers first conducted a three-part series of experiments to establish the domestic yellow fever mosquito’s preference for human scent. Forest and domestic mosquitoes were put into a large cage and allowed to bite either a guinea pig or a researcher’s arm. Then the mosquitoes were allowed to choose between streams of air that had passed over a guinea pig or human arm. Finally, to rule out general mosquito attractants such as exhaled carbon dioxide, mosquitoes were allowed to choose between the scent of nylon sleeves that had been in contact with a human or a guinea pig.

In all three cases, the domestic form of the yellow fever mosquito showed a strong preference for human scent, while the forest form primarily chose the guinea pig. Although domestic mosquitoes would sometimes go for the guinea pig, it happened very rarely, McBride said.

McBride and colleagues then decided to look for differences in the mosquitoes’ antennae, which are equivalent to a human’s nose. They interbred domestic and forest mosquitoes, then interbred their offspring to create a second hybrid generation. The genomes of these second-generation hybrids were so completely reshuffled that when the researchers compared the antennae of the human- and guinea pig-preferring individuals they expected to see only genetic differences linked directly to behavior, McBride said.

The researchers found 14 genes that differed between human- and guinea pig-preferring hybrids — two of them were the odorant receptors Or4 and Or103. Choosing to follow up on Or4, the researchers implanted the gene into fruit-fly neurons. They found that the neurons exhibited a burst of activity when exposed to sulcatone, but no change when exposed to guinea pig odors. McBride plans to further study Or103 and other genes that could be linked to host preference at Princeton.

Gene expression

A comparison of domestic and forest form antennae found that two odorant-receptor genes, Or4 and Or103, are more “expressed,” or abundant, in the human-preferring domestic mosquitoes (top bar) than in the forest form that feeds primarily on non-human animals (bottom bar). The color scale indicates the level of gene expression with purple standing for the least amount and red for the most. The numbers to the left of the colored bars represent three different colonies of each mosquito form. The slanted line under each gene’s name points to the level of expression of that gene in each colony. (Image courtesy of Carolyn McBride, Department of Ecology and Evolutionary Biology and the Princeton Neuroscience Institute)

This work provides insight into how the domestic form of the yellow fever mosquito evolved from its animal-loving ancestor into a human-biting specialist, McBride said. “At least one of the things that happened is a retuning of the ways odors are detected by the antennae,” she said. “We don’t yet know whether there are also differences in how odor information is interpreted by the brain.”

This work was supported in part by the National Institutes of Health (NIDCD grant no. DC012069; NIAID grant no. HHSN272200900039C; and NCATS CTSA award no. 5UL1TR000043); the Swedish Research Council and the Swedish University of Agricultural Science’s Insect Chemical Ecology, Ethology and Evolution initiative; and the Howard Hughes Medical Institute.

Read the abstract.

Carolyn S. McBride, Felix Baier, Aman B. Omondi, Sarabeth A. Spitzer, Joel Lutomiah, Rosemary Sang, Rickard Ignell, and Leslie B. Vosshall. 2014. Evolution of mosquito preference for humans linked to an odorant receptor. Nature. Article published in print Nov. 13, 2014. DOI: 10.1038/nature13964

Model anticipates ecological impacts of human responses to climate (Conservation Biology)

A Princeton University research team has created a readily transferable method for conservation planners trying to anticipate how agriculture will be affected by human adaptations to climate change. They tested their model by studying wheat and maize production in South Africa. (Image source: WWS)

By B. Rose Huber, Woodrow Wilson School of Public and International Affairs

Throughout history, humans have responded to climate.

Take, for example, the Mayans, who, between the eighth and 10th centuries, were forced to move away from their major ceremonial centers after a series of multi-year droughts, spurring agricultural expansion and forest clearing elsewhere in Mesoamerica. Much later, in the late 20th century, frequent droughts caused the people of Burkina Faso in West Africa to migrate from the dry north to the wetter south, where they have transformed forests to croplands and cut the nation’s area of natural vegetation in half.

Such land transformations, while necessary to ensure future crop productivity, can themselves have large ecological impacts, but few studies have examined their effects. To that end, a Princeton University research team has created a model to evaluate how a human response to climate change may alter the agricultural utility of land. The study, featured in Conservation Biology, provides a readily transferable method for conservation planners trying to anticipate how agriculture will be affected by such adaptations.

“Humans can transform an ecosystem much more rapidly and completely than it can be altered by shifting temperature and precipitation patterns,” said Lyndon Estes, lead author and associate research scholar in the Woodrow Wilson School of Public and International Affairs. “This model provides an initial approach for understanding how agricultural land-use might shift under climate change, and therefore which currently natural areas might be converted to farming.”

Under the direction of faculty members Michael Oppenheimer and David Wilcove, both from the Wilson School’s Program in Science, Technology and Policy, and with the help of visiting student research collaborator Lydie-Line Paroz from ETH Zurich and colleagues from several other institutions, Estes studied South Africa, an area projected to be vulnerable to climate change where wheat and maize are the dominant crops.

Before determining how climate change could impact the crops, the team first needed to determine which areas have been or might be farmed for maize and wheat. They created a land-use model based on an area’s potential crop output and simulated how much of each crop was grown from 1979 to 1999 – the two decades for which historical weather data was available. They also calculated the ruggedness of each area of land, which is related to the cost of farming it. Taking all factors into account, the model provides an estimate of whether the land is likely to be profitable or unprofitable for farming.
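The profitability calculation described above — potential crop output weighed against a farming cost that rises with terrain ruggedness — can be sketched in a few lines. All numbers and functional forms below are illustrative assumptions, not the authors' actual model:

```python
def expected_revenue(yield_t_per_ha, price_per_t):
    """Gross revenue from a hectare given projected yield and crop price."""
    return yield_t_per_ha * price_per_t

def farming_cost(ruggedness, base_cost=300.0, cost_per_unit_ruggedness=120.0):
    """Rugged terrain is costlier to farm (access, machinery, erosion control)."""
    return base_cost + cost_per_unit_ruggedness * ruggedness

def agricultural_utility(yield_t_per_ha, price_per_t, ruggedness):
    """Positive utility -> farming likely profitable; negative -> unprofitable."""
    return expected_revenue(yield_t_per_ha, price_per_t) - farming_cost(ruggedness)

# Compare the same parcel under current and climate-shifted maize yields
# (illustrative values: tonnes/ha and price per tonne).
current = agricultural_utility(yield_t_per_ha=4.0, price_per_t=150.0, ruggedness=1.0)
future = agricultural_utility(yield_t_per_ha=5.5, price_per_t=150.0, ruggedness=1.0)
assert future > current  # higher projected yield raises conversion risk
```

Run over every parcel and every climate scenario, this kind of comparison is what flags conservation lands whose agricultural utility rises — and with it, the risk of conversion to farming.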

To investigate any climate-change impacts, the team then examined the production of wheat and maize under 36 different climate-response scenarios. Many possible future climates were taken into account, as well as how the crops might respond to rising levels of carbon dioxide. Based on their land-use model, the researchers calculated how the climate-induced productivity changes alter a land’s agricultural utility. In their analysis, they included only conservation lands – current nature reserves and those that South African conservation officials plan to acquire – that contained land suitable for growing one of the two crops either currently or in the future. However, Estes said the model could be adapted to assess whether land under other types of uses (besides conservation) is likely to be profitable or unprofitable for future farming.

They found that most conservation lands currently have low agricultural utility because of their rugged terrain, which makes them difficult to farm, and that they are likely to stay that way under future climate-change scenarios. The researchers did pinpoint several areas that could become more valuable for farming in the future, putting them at greater risk of conversion. However, some areas were predicted to decrease in value for farming, which could make them easier to protect and conserve.

“While studying the direct response of species to climatic shifts is important, it’s only one piece of a complicated puzzle. A big part of that puzzle relates to how humans will react, and history suggests you don’t need much to trigger a change in the way land is used that has a fairly long-lasting impact,” Estes said. “We hope that conservation planners can use this approach to start thinking about human climate change adaptation and how it will affect areas needing protection.”

Other researchers involved in the study include: Lydie-Line Paroz, Swiss Federal Institute of Technology; Bethany A. Bradley, University of Massachusetts; Jonathan Green, STEP; David G. Hole, Conservation International; Stephen Holness, Centre for African Conservation Ecology; and Guy Ziv, University of Leeds.

The work was funded by the Princeton Environmental Institute‘s Grand Challenges Program.

Read the abstract.

Estes LD, Paroz LL, Bradley BA, Green JM, Hole DG, Holness S, Ziv G, Oppenheimer MG, Wilcove DS. “Using Changes in Agricultural Utility to Quantify Future Climate-Induced Risk to Conservation.” Conservation Biology (2013). First published online Dec. 26, 2013.