Microbiome hype is relatively new, but the idea that some microbial ecosystems help, not hinder us, is much older than many people realise.
With a creak, the hatch in the floor opened, and a faceless figure emerged dripping from the pool of liquid disinfectant below. The intruder had a roughly human-like shape but was sheathed from head to toe in specially treated canvas. A glass panel in the head revealed a human face sealed within the steampunk-style diving suit. These strangely suited figures, who were in reality 1940s researchers from the University of Notre Dame, were the only contact the inhabitants of the enclosure would ever have with the outside world.
The guinea pigs, rats, rabbits, monkeys, and chickens that lived within the enclosure would spend their entire lives without ever encountering a germ. The mammals had been delivered by premature c-section and left to grow in a completely sterile environment, accessible only by traversing a tank of liquid disinfectant and emerging from the germ-free side through a hatch in the floor.
The unexposed animals filled a key need for twentieth-century biology research. Since the 1870s, microbiologists had been injecting animals with various microorganisms in order to figure out which ‘germs’ cause disease. But each of those experiments included billions of confounding factors in the form of the microbes that live on and inside animals. It had taken many years and several attempts, but by the 1940s, James Reyniers of the University of Notre Dame had gotten the first germ-free animal operation up and running.
“The germ-free animals represent the greatest advance in bacteriology since scientists learned how to isolate and grow germs in pure cultures,” LIFE magazine declared in a glowing 1949 photo essay on the germ-free animal lab. The germ-free animals’ existence refuted the long-held idea that the colonies of microbes that live on and in animals were essential for animals’ survival, the magazine story argued, adding, “Despite the pristine purity of their existence, the germ-free animals develop into normal adults.”
Subsequent research would discover that germ-free animals, though similar to their microbe-bearing counterparts, were not entirely normal. They gained weight slowly, and the linings of their intestines were thinner and leakier. Though the germless animals could survive on their own, they had a harder time absorbing nutrients. Scientists would go on to discover that many resident bacteria secrete chemical signals that encourage their host’s cells to turn on helpful genes. Many germ-free animals also behaved strangely, often mirroring anxiety and other behavioural issues in humans.
However, in the 1940s, those findings were decades away, and most people would have seen germ-free life as the way of the future. For the first time ever, children were being spared from infectious diseases. The clean, submarine-like aesthetic of Notre Dame’s sterile chamber would come to be seen as futuristic.
And for the time being, microbes would remain the living avatars of filth and plague.
Germ Theory, or the idea that microscopic organisms cause infectious disease, is one of the most successful scientific concepts of all time. Its adoption enabled the rise of vaccines, antibiotics, and modern sanitation. Exact estimates vary, but germ theory has saved hundreds of millions if not billions of lives since its rise to prominence in the late 1800s.
However, germ theory has always had a quieter twin, the idea that life would not be possible without the multitudinous masses of microbes that live inside larger organisms. Many scientists thought that intestinal bacteria might play a role in digestion, but attempts to document these changes were largely overshadowed by studies on disease-causing germs.
“The very importance of these discoveries [on pathogenic bacteria] has been a potent factor in diverting attention from the studies of normal intestinal flora,” bacteriologist Arthur I. Kendall opined in a 1909 study.
Most Victorian-era bacteriologists chose to focus on fighting infectious diseases. Such an approach meant zeroing in on bacterial and viral culprits and devising ways to kill them. One of the few microbiologists who looked at microbial ecosystems, René Dubos, gained fame by ‘feeding’ pathogenic bacteria to a sample of soil bacteria and isolating the chemical that allowed the soil to fend off pathogens. That bacteria-killing chemical became the first antibiotic synthesised for the commercial market, though drug manufacturers soon realised it was too toxic to use as a go-to medicine.
So while soil microbiologists and others continued to quietly work at understanding microbial ecosystems, most people rarely heard about bacteria outside of discussions about disease. The advent of antibiotics in the 1940s and 1950s did little to reverse the trend.
“When someone is told that his skin supports a large population of microorganisms, he may look a bit uneasy and respond that he takes a shower every morning,” New Zealand microbiologist Mary Marples wrote in a 1969 piece for Scientific American. She continued, “If, on the other hand, one considers skin from the standpoint of its natural inhabitants, rather than in terms of the appearance, comfort, and defense mechanisms of the human host, a fascinating new world comes into view... This environment and the populations that live on it form an ecosystem, a discrete world whose living and nonliving components, all interacting with one another, exist in equilibrium.”
In comparing the tiny skin ecosystems she studied to large-scale biomes, Marples was likely influenced by bacteriologist Theodor Rosebury. His 1962 book The Microorganisms Indigenous to Man is still considered one of microbiome science’s founding texts but saw little circulation outside of academia. In 1969, he followed up with a book called Life on Man, which was written for general audiences.
“These microbes are found in every mouth,” Rosebury told journalist Studs Terkel during a 1969 radio interview. “They are found in every healthy mouth. Their presence in the mouth are entirely compatible with continued good health, but... they can do damage.”
Though Life on Man received rave reviews, it was a mere drop in the bucket against the onslaught of advertising urging consumers to cleanse themselves and control their bodily odours by killing bacteria.
Part of the problem was that these tiny ecosystems didn’t have a catchy nickname. And with nearly a century of anti-bacterial rhetoric painting microbes as villains, they would need one.
Tracing the origins of the term ‘microbiome’ is difficult. Many mistakenly attribute it to a 2001 essay co-authored by Alexa McCray and Nobel laureate and microbiologist Joshua Lederberg, whose work in the 1940s and 1950s revealed that bacteria can swap DNA plasmids. The word microbiome never appears in the main text of the essay but does feature alongside about three dozen other -ome words. (For even more -ome words, check out the previous Paradigms installment on -Omics.)
When evolutionary microbiologist Jonathan Eisen looked into the term’s origins, he was able to find an obscure footnote in an Italian gynaecological text from 1895 that appears to use the word microbiome. (I did a similar Google Books search on just biome and found that biome in its modern spelling does indeed pre-date 1895, if only slightly. But I would agree with Eisen’s assessment that “more digging is needed.”)
Regardless of its origins, the term microbiome was very rare until the early 2000s, when Lederberg and other prominent microbiologists began using it to describe assemblages of microbes and their genomes.
Research into microbial ecosystems had been slowly growing through the second half of the twentieth century. One key group was Jeffrey Gordon’s lab at Washington University in St. Louis. In the late 1970s, as a young gastroenterologist, Gordon became interested in the geography of the intestines and how intestinal cells knew what roles to take on. In the 1980s and 1990s, Gordon and his trainees showed several times that bacteria living in the gut secreted key signals that told intestinal host cells what to do.
As genome sequencing technology accelerated and the catalogues of known bacterial species expanded, studies looking at interactions between microbes and their hosts became more feasible and more common. By the time the term microbiome caught fire in the early 2000s, research into the area had exploded.
And the pace hasn’t slowed since.
Evidence shows that gut microbes are key players in mental health, immune function, metabolism, and many other dimensions of human health. Hundreds of labs around the world are analysing microbial genomes and secretions, looking for potential drug targets or for helpful species that can be used as treatments themselves. The ambitious Earth Microbiome Project — led by another New Zealander, Rob Knight of UCSD — is sequencing hundreds of human, animal, and environmental microbiome samples. The long-term goal is to create a central repository of information about Earth’s microscopic diversity.
The concept of the microbiome, a network of friendly bacterial ecosystems that keep us healthy, has made quite an impression in the past two decades. Probiotics, packets of microbes that allegedly boost health, line the shelves of pharmacies — despite the absence of solid scientific evidence backing them. (It’s clear that the microbiome can shape health, but it’s unclear whether the species currently sold as probiotics actually help.) Bacteria-filled yoghurt is now seen as a health food.
However, Thomas Kuhn’s description of scientific revolutions doesn’t quite fit modern microbiology. (See our “Paradigm Shift 101” post for a refresher.) According to Kuhn, revolutionary paradigms challenge and eventually replace paradigms that are “in crisis”. But Germ Theory has yet to weaken or wane. Medical researchers have updated it to account for diseases that stem from genes, unhealthy lifestyles, or a combination of factors. Updating an existing big idea isn’t revolutionary; it’s what Kuhn called ‘normal science’.
Microbiome research hasn’t replaced traditional pathogen-centric microbiology. Instead, research exploring microbial ecosystems has grown alongside its predecessor. Some people consider microbiome research to be its own sub-discipline (spawning new sub-disciplines is also a hallmark of scientific revolutions), but many still consider it part of microbiology. If a revolution has occurred, it has been a bloodless one.
However, the microbiome does exhibit many of the key traits of a revolutionary paradigm. It answers questions that can’t be accounted for with just one “bad” bacterium and has inspired thousands of new questions and follow-up studies. Though the answers new microbiome data offers are rarely clear-cut, there’s no doubt that these microbial ecosystems are important.
Perhaps most importantly, microbiome research has redefined its subjects — the bacteria, archaea, fungi, and protists that inhabit the world around and within us. Instead of tiny bogeymen out to kill us, microbiome research often makes microbes sound like a ragtag band of unlikely anti-heroes, with both the power to kill and the power to grant great boons to humankind. In some circles, germs are a source of hope.
The microbiome may not be revolutionary in the “overthrowing its predecessor” sense, but it is a transformational idea that offers a different point of view on a familiar story.
Edited by Tessa Evans