Eat Dirt?

Allergies, autoimmune disease, and the hygiene hypothesis

Shay Blufarb

We are prone to odd fantasies about which bits of our world should be germ free. Until a few weeks ago my youngest daughter relied on breast milk. If bottle feeding had been straightforward, we might have introduced an element of it, but the ceremony of constant sterilization helped put us off. It’s a strange thing, when you think about it. What delusional ideas we have about the human breast! It’s far from sterile—no more sterile, indeed, than the mouth of a six-month-old baby. Now she’s getting her first semisolid food. The instructions on the side of the jar of rice pudding say the remnants have to be thrown away immediately if left in the jar, but can last forty-eight hours if I transfer them to a bowl—so long as I transfer them with a wooden and not a metal spoon. Although I’m an attending physician with substantial clinical experience and postgraduate training in microbiology and infectious diseases, I can’t figure out those warnings. Yet in our search for good health, particularly for our children, can there be anything wrong with erring on the side of cleanliness?

In a brief British Medical Journal article in 1989—“Hay Fever, Hygiene, and Household Size”—the idea was born that came to be christened the “hygiene hypothesis.” Hay fever, an allergic reaction, was noted to have risen rapidly in the modern world, and the article tried to work out why. It looked at 17,000 twenty-three-year-old Britons born during the same week, and it sought factors associated with their risk of getting hay fever. Back then the idea was that respiratory infections damaged the lungs and brought on hay fever. The paper, though, showed that the more infections you got, the less likely you were to get hay fever. The author of the study thought his findings could be explained “if allergic diseases were prevented by infection in early childhood, transmitted by unhygienic contact with older siblings, or acquired prenatally from a mother infected by contact with her older children.” He thought the data fit with our environment having become too sterile for our own good:

Over the past century declining family size, improvements in household amenities, and higher standards of personal cleanliness have reduced the opportunity for cross infection in young families. This may have resulted in more widespread clinical expression of atopic [allergic] disease, emerging earlier in wealthier people, as seems to have occurred for hay fever.

It’s an appealing idea. You want an immune system fierce enough to flare up promptly in response to challenges, but not so out of control that it does so inappropriately. Perfection in evolved systems is an impossible goal—most of the time all that’s achievable is efficient compromise. Some people are more prone to infections, some to allergies and autoimmune diseases. Genetic tendencies are likely to run one way or the other, although they can flip-flop cheerfully in response to different triggers.

It makes sense to expect that our immune systems adapt themselves to their environment; everything else in our bodies does. Logically, living in a world full of pathogens should cause our immune system to be more fired up; living in one free of them should let it settle down. What the hygiene hypothesis suggests, paradoxically, is that when our environment is uncannily free of germs, our immune system runs into trouble. An immune system that isn’t given enough to do turns sour. Without the parasites and plagues it’s built for, it starts finding other ways to occupy its energies, getting edgy about things that really shouldn’t be a problem at all—like peanuts or house dust mites or some of the body’s own cells. When an excessive response is elicited by something in the environment, we call it an allergic reaction; if it’s a reaction to some benign part of one’s own body, we call it autoimmune disease.

All this sounds reasonable enough. But many ideas about health and the human body sound reasonable without being true. Since that 1989 article, which did no more than point out a suggestive association, what other evidence has been uncovered?

There have certainly been lots of articles on the subject, including lots of peer-reviewed studies that seek to look at original data. I hesitate to term them all “science,” since many are descriptive: they design and carry out no experiment and attempt to falsify no hypothesis. They fall into the sort of technically proficient studies of natural history that most of us think of as science but which are really just the groundwork for it. They develop theories about the way the world works; science only happens when those theories get properly tested.

Consistently, allergic and autoimmune diseases have risen in recent decades, their rise taking place in the richer parts of the world that have also seen drops in many common infections. Hepatitis A, measles, rheumatic fever, mumps—these things have fallen away. Subtler exposures to infections have also changed. All of us get colonized by bacteria soon after birth, but for the richer citizens of the world, exposure to many common microorganisms is being delayed. Among other things, cesarean birth and bottle feeding both interfere with normal routes of infection.

Children sent early to nursery schools or day care centers—wonderful places, as parents know, for them to acquire a stream of minor infections—have lower risks of allergic and autoimmune disease than others. Having siblings seems to offer protection also, as does growing up on a farm. Being exposed to germs in soil, on animals, around the house, and from the people you live, go to school, and play with—all these things have been linked with reducing lifetime risk of allergies, asthma, diabetes, and some autoimmune conditions such as Crohn’s disease. In contrast, early use of antibiotics has been associated with an increase in the risk of developing allergic diseases in later life.

Broken-down parts of bacteria are in the air we breathe, and the more of those particles we are exposed to, the better we seem to do. Living with animals helps—bacteria thrive in the guts of other mammals, as they do in our own—and so does not having the cleanest, most spotless house. In experiments on rats and mice, keeping them infection free (isolated in sterile environments after being born by cesarean delivery to prevent them acquiring germs from their mother’s vagina) increases rates of allergic and autoimmune illness. Deliberately infecting them, or exposing them to non-pathogenic fragments of infectious organisms, does the opposite. The associations are consistent, their possible mechanistic links would seem to make sense, and some animal experiments support the idea—but that’s as far as we can go. “Eating dirt or moving to a farm,” said a New England Journal of Medicine editorial on the subject, “are at best theoretical rather than practical clinical recommendations.”

Since 1989, a lot of data has focused not only on bacterial and viral infections, but also on helminths. They’re the parasitic worms that were a daily part of our lives for the bulk of evolutionary history and now, happily, are not. Once the hygiene hypothesis was suggested, they were an obvious target of interest. Several studies noted that rates of allergic and autoimmune diseases went up when worm infections went down, and that carrying parasites around appeared to fend those diseases off. (While being chronically parasitized by worms seemed to offer advantages, briefer infections with them were associated with rises in the number of allergy or autoimmune problems—something to consider should over-enthusiasm have you preparing plates of worm-filled food for your kids.) Work in Gabon, in equatorial Africa, showed that if you dewormed large numbers of children, rates of illnesses linked to overactive immune systems rose.

The working hypothesis is that interacting with these parasites—“old friends,” as some term them—is important for the immune system to develop and work well. It’s even been suggested that the way our immune system reshapes itself in the absence of helminths makes us fatter—not through an increase in our overall health, but by deranging the normal homeostatic mechanisms that have evolved to keep us alive in the company of such parasites.

The most important thing to note, though, is that what started off inconclusive has stayed that way. The association between infections and allergies was first suggested as long ago as half a century—although originally the idea was that infections caused such diseases, not prevented them. Allergies and autoimmune diseases have risen and parasites have dropped away at the same time as a host of other things have changed—democracy has spread, satellites have been launched, mobile phones have been developed, and a raft of modern politicians elected. If you wanted, you could show an association between any of those aspects of contemporary life and the increase in rates of allergies and autoimmune disorders.

Associations are easy to spot, even plausible ones—but proving causation is hard. Might early exposure to bugs and parasites be entirely unrelated to allergies and autoimmune diseases? Quite possibly. It’s been suggested that the spread of detergents in the world (which lines up chronologically with every other modern phenomenon) disrupts the mucus lining the inside of our intestines, with some diseases being caused as a result. Sounds like a crazy idea to me, but it would be a tough one to sit down and disprove—and there is no shortage of others like it. We’re under no mental obligation to explain the modern rise in allergies and autoimmune diseases in any one way.

The hygiene hypothesis is the sort of story that’s attractive to mass media, and that doesn’t help. What editors tend to want—because it’s what most readers prefer—is cheerful certainty. If it completely goes against another cheerful certainty from a previous week, that’s OK. (Readers are flattered into feeling superior to the scientific elite, who can’t seem to agree on anything.)

It’s always best to expect the simplest explanation for life’s phenomena, we’re told, generally before being offered something sweet and easy to swallow. It’s easy to overlook the fact that the most reliable explanations often aren’t very simple or even that explanatory. Pointing out doubt and uncertainty is worth doing. It’s good for us. Which is partly what I find so charming about the idea that getting our hands dirty is also good for us. I believe we profit from rubbing our noses in the grimy realities of mental life, so I’m drawn to the idea of a physical parallel.

I started writing this essay with more than a passing knowledge of the evidence, but no deep appreciation of it. I had the strong impression that research into the hygiene hypothesis had shown that dirt was good for us. I found I was mistaken. The evidence is in favor of the idea, but it is also clearly weak. My mistake was not to notice the extent to which my morals and tastes—my belief that we’re irrationally fastidious—were shaping my beliefs about the natural world.

Although he didn’t coin the term “hygiene hypothesis,” David Strachan, an English epidemiologist, was the man responsible for starting it when he wrote that 1989 paper in the British Medical Journal. He has since reflected on the fate of his notion. “An idea which was introduced on the basis of epidemiological data and contrary to immunological orthodoxy,” he went on to write, “now survives as much on the grounds of biological plausibility as confirmatory epidemiological observations. . . . There is undoubtedly something to explain, but the results of studies which have more directly addressed infection as the explanatory factor have been disappointing and often difficult to interpret.” This is a remarkably sane, measured, and serious approach from a man who, you might expect, would be prone to exaggerating the evidence for his famous idea.

Physical sterility—the absence of germs—may yet turn out to be good for us, the hygiene hypothesis completely wrong. Mental sterility—the replacement of uncertainties with simplifications, and the acceptance of untested ideas as flat and featureless facts—requires no experiment to demonstrate its harms. Strachan’s insistence on doubt, his insistence that all he has put forward is an idea worth testing, is an example of how good science works.

So should we encourage our children to play outdoors in order to protect them from asthma? We don’t know, but we do know that given the chance, most children love to play outdoors. Should we foster physical activity? By all means, but let’s do so for the joy of physical exertion and for the camaraderie and athleticism of sport. Should we encourage ourselves to tolerate a bit of dirt, a bit of exposure to germs and microbes? Yes, absolutely, but let’s do so on the basis that it’s unlikely to do us any harm and allows us to get on with our lives with less fuss and anxiety. Healthy bodies contain more bacteria than human cells. Putting energy into trying to sterilize our fecund world may or may not prolong our lives, but it’ll certainly suck up precious hours.
