The biggest, baddest myth about gluten and wheat

Gluten-free diets are all the rage right now. It seems like every second celebrity, athlete and blogger has lost weight, improved their tennis game, cured their anxiety/acne/cancer/leprosy and won Lotto, all because they cut evil gluten out of their diet.

Now, don’t get me wrong. There are people who should absolutely avoid eating wheat or any other gluten-containing food: those with coeliac disease, wheat allergy, appropriately diagnosed non-coeliac gluten sensitivity, or a non-coeliac autoimmune disease that benefits from gluten restriction. Some people who have one of these conditions in a silent form, or who haven’t been diagnosed correctly, would also experience health benefits if they cut gluten out of their diets.

But since coeliac disease and wheat allergy combined currently affect less than 2% of the Australian population, and non-coeliac gluten sensitivity is only slightly more common than coeliac disease, the majority of the more than 10% of Australians who actively avoid gluten don’t actually need to. Worse yet, following a gluten-free diet may actually be harmful, since it reduces the population of beneficial gut bacteria, setting up the circumstances for an overgrowth of harmful bacteria, and decreases the immune system’s ability to respond to infections.

Many people who follow gluten-free diets have been persuaded by 5 major myths about wheat and other gluten-containing foods that circulate freely on the Internet and in popular books and media. (I thoroughly debunked these 5 myths in my Deep Dive webinar, ‘Should I Be Gluten-Free’; EmpowerEd members can access the video recording of the webinar along with the fully-referenced slides.)

I’ll cover the #1 most frequently circulated myth in this article.

The Biggest Myth About Gluten:

Wheat and other gluten-containing foods cause health problems in humans because we have only been consuming them since agriculture began, and that’s not long enough for us to adapt to them.

Here’s the reality:

Absolutely everything that humans eat now, animals included, has been transformed since agriculture began. We have extensively modified all food species (plant and animal) through selective breeding, which was – and still is – intended to enhance desirable characteristics such as size, sweetness, hardiness to environmental stress and cropping duration. It is simply no longer possible to find the plant species that our ancestors ate unless we go foraging in the ever-shrinking wilderness; and the nutritional composition of beef, lamb, pork and poultry is dramatically different to that of the wild animals that pre-agricultural humans hunted.

If you think that you should only be eating what humans ate in Paleolithic times, prepare to be very hungry indeed… and forget about living in a city!

How long have humans been eating wheat, anyway? Wheat was first domesticated – that is, deliberately cultivated – in southeastern Anatolia (now part of Turkey) roughly 11,000 years ago. However, archaeological evidence from the Ohalo II site in Israel (a hunter-gatherer camp on the shore of the Sea of Galilee) shows that humans gathered, processed and ate wild grains, including barley and wheat, around 23,000 years ago – that is, during the Paleolithic era. They also ate herbs, nuts, fruits and legumes, as indicated by the tens of thousands of seeds and fruits discovered at the site.

Aside from the fact that humans have been eating grains (including wheat) for far longer than we’ve been intentionally growing them, the argument that humans have not had enough time to genetically adapt to grains just doesn’t stack up.

Humans, along with all other species, are constantly adapting to their environment through random genetic mutations, and through the shuffling of parental genes that occurs when sperm and egg cells form (a process called ‘recombination’). If those adaptations are beneficial to survival and reproduction, the ‘new’ gene variant will persist in the population. That’s how modern humans came to carry more copies of the gene that codes for the starch-digesting enzyme amylase than earlier humans and non-human primates did – being able to digest cooked starches from tubers, and later on from wild grains, was a significant survival advantage. (In fact, without this capacity to harvest energy from starches, we would never have developed the brain size and capacity that distinguish us as humans.)

Furthermore, population growth increases the speed of adaptation – more individuals means more reproduction and more genetic diversity – and agriculture facilitated a dramatic increase in the human population. Only a few million of us walked the Earth 10,000 years ago, at the beginning of the agricultural revolution. After roughly 8,000 years of agriculture, the human population had swelled to about 200 million; by the year 1700 it had reached roughly 600 million. Now there are over 7 billion of us.

This rapid population expansion facilitates evolutionary adaptation. In fact, researchers have found evidence of this recent adaptation in roughly 7 percent of all human genes.

Let’s put it this way: if the CCR5-Δ32 mutation, which arose only about 4,000 years ago, can now be found in the genomes of about 10% of Europeans (it probably increased resistance to smallpox, and has since been found to protect against HIV infection), then adapting to wheat and other gluten-containing grains over the course of 23,000 years is a total cinch.

The bottom line: If you have a health condition such as coeliac disease, wheat allergy, non-coeliac gluten sensitivity, or a non-coeliac autoimmune disease that benefits from gluten restriction, you should be on a gluten-free diet. Everyone else can eat gluten to their heart’s content.

Want to find out the truth about the other 4 gluten myths, and learn more about who should, and shouldn’t, follow a gluten-free diet? Join EmpowerEd and you’ll receive instant access to the Deep Dive webinar ‘Should I Be Gluten-Free’ and all other membership benefits.
