Is charred meat bad for you?

As the end of summer approaches, you love to grill your food over an open flame.  You savor that char-grilled flavor on your meat or fish.  Perhaps you fancy yourself a modern-day caveman, inspired by the Paleo movement and getting back to Nature.

At the same time, you’ve probably heard that eating grilled meat is a bad idea, because compounds in the meat char can cause cancer.

According to the National Cancer Institute, grilling meat to the point of charring causes the formation of heterocyclic amines (HCAs), polycyclic aromatic hydrocarbons (PAHs), and Maillard reaction products such as acrylamide (AA) or advanced glycation end-products (AGEs).  HCAs and AGEs are formed when the amino acids, sugars and creatine in meat react at high temperatures. PAHs are formed when meat fats burn.  Maillard reaction products are those tasty brown “caramelized” substances produced by the reaction of sugars and amino acids when meats and other foods are cooked by grilling, baking, frying or toasting.

The National Cancer Institute reports that HCAs, PAHs and acrylamide have been shown to cause cancer in laboratory animals.  Added to this are a number of epidemiological studies purporting to show an association between consumption of cooked meats and cancer.

So you reluctantly curtail your inner caveman and carefully scrape the blackened parts off your meats, or grill them at a lower temperature.  Or perhaps you avoid grilling altogether, retreating indoors and lightly sautéing or boiling your meat dishes.

Relax. I’m here to make the case that charred meat is not to be feared.  It may actually be good for you, hormetically boosting your general ability to neutralize and dispose of dietary toxins. In this blog post, we will take a closer look at the animal and human studies, combined with a deeper look at the evolutionary record, aided by the perspective of modern toxicology.  I think it may change your mind.

Conventional wisdom.  It’s not just mainstream health authorities that push the idea that charred meat causes cancer.  Even Paleo advocates like Mark Sisson warn against the dangers of grilling and cooking meats at high temperatures:

But there’s a dark side to cooking. Depending on the methods and ingredients you use and the temperature you apply, cooking can create carcinogenic and toxic compounds, and oxidized fats – and these may be involved in some of the diseases studied. It may not be the meat itself, but how we treat the meat. …The easiest way to minimize your exposure to heat-related toxins is to emphasize gentle cooking methods and de-emphasize higher heat methods.

Sisson cites the usual culprits — HCAs, AGEs, acrylamide — and oxidized lipids.  He cautions particularly against “grilling over an open flame – the worst”, pointing to its association with higher levels of HCAs.  He recommends gentle methods such as steaming, poaching, boiling, braising, simmering and baking.

We should be a bit suspicious of these claims, given that humans have been cooking meat over open flames for hundreds of thousands of years, inevitably creating char. If these claims were correct, we’d expect to see elevated rates of cancer among modern hunter-gatherers who cook their meat — but we don’t.  Just as the demonization of meat and saturated fat as agents of disease has been challenged by reappraisals of epidemiological and biomedical evidence, I think we should take a closer look at the shibboleth that charbroiling meat or otherwise cooking it at high temperatures poses a health risk.

The studies. Let’s look more closely at the animal and human studies that gave rise to these concerns about the compounds in grilled meats.  Here’s what the National Cancer Institute actually says about the studies of HCAs and PAHs:

  • In many experiments, rodents fed a diet supplemented with HCAs developed tumors of the breast, colon, liver, skin, lung, prostate, and other organs. Rodents fed PAHs also developed cancers, including leukemia and tumors of the gastrointestinal tract and lungs.
  • However, the doses of HCAs and PAHs used in these studies were very high—equivalent to thousands of times the doses that a person would consume in a normal diet.
  • Population studies have not established a definitive link between HCA and PAH exposure from cooked meats and cancer in humans. One difficulty with conducting such studies is that it can be difficult to determine the exact level of HCA and/or PAH exposure a person gets from cooked meats.
  • Although dietary questionnaires can provide good estimates, they may not capture all the detail about cooking techniques that is necessary to determine HCA and PAH exposure levels.

So the NCI concludes that there is no direct evidence that HCAs and PAHs from cooked meats cause cancer in humans.  The argument rests on extrapolation from studies in mice and rats.

Even if you are worried about PAHs, there is nothing distinctive about meats. A 2003 Spanish study of actual measured dietary exposures to PAHs found that vegetables, fruits, cereals and milk all contain comparable or higher levels of PAHs than meat.  The study estimated that the average male in Spain who consumes these foods raises his lifetime risk of cancer by only about five in a million.
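For a sense of where a number like “five in a million” comes from, here is a minimal sketch of the standard excess-lifetime-risk arithmetic (chronic daily dose multiplied by an oral cancer slope factor). The intake, body weight and slope factor below are illustrative assumptions of mine, not values taken from the Spanish study.

```python
# Minimal sketch of a dietary PAH excess-lifetime-risk estimate:
# risk = chronic daily dose (mg/kg-day) x oral cancer slope factor.
# All numbers below are illustrative assumptions, not values from the 2003 Spanish study.

BODY_WEIGHT_KG = 70             # assumed adult body weight
PAH_INTAKE_UG_PER_DAY = 0.01    # assumed benzo[a]pyrene-equivalent intake, micrograms/day
SLOPE_FACTOR = 7.3              # assumed oral slope factor, risk per (mg/kg-day)

daily_dose_mg_per_kg = (PAH_INTAKE_UG_PER_DAY / 1000.0) / BODY_WEIGHT_KG
excess_lifetime_risk = daily_dose_mg_per_kg * SLOPE_FACTOR   # linear low-dose extrapolation

print(f"chronic daily dose:   {daily_dose_mg_per_kg:.2e} mg/kg-day")
print(f"excess lifetime risk: {excess_lifetime_risk:.1e}")   # about 1e-6 with these inputs
```

With inputs of that magnitude, the estimate lands in the one-in-a-million range, a level conventionally treated as negligible risk.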

What about acrylamide?  According to the American Cancer Society, “Based on the studies done so far, it’s not yet clear if acrylamide affects cancer risk in people.” They further state:

Acrylamide has been found to increase the risk of several types of cancer when given to lab animals (rats and mice) in their drinking water. The doses of acrylamide given in these studies have been as much as 1,000 to 10,000 times higher than the levels people might be exposed to in foods. It’s not clear if these results would apply to people as well, but in general it makes sense to limit human exposure to substances that cause cancer in animals.
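To make the scale of that gap concrete, here is a quick back-of-the-envelope comparison of a rodent-study dose with a typical dietary exposure. Both numbers are illustrative assumptions on my part, chosen only to show the order of magnitude, not figures quoted by the ACS.

```python
# Back-of-the-envelope comparison of acrylamide doses: rodent study vs. typical diet.
# Both values are illustrative assumptions, chosen only to show the order of magnitude.

RODENT_STUDY_DOSE_MG_PER_KG_DAY = 1.0       # assumed drinking-water study dose
HUMAN_DIETARY_DOSE_MG_PER_KG_DAY = 0.0005   # assumed dietary intake (~0.5 micrograms/kg-day)

ratio = RODENT_STUDY_DOSE_MG_PER_KG_DAY / HUMAN_DIETARY_DOSE_MG_PER_KG_DAY
print(f"Rodent study dose is roughly {ratio:,.0f} times the assumed dietary dose")
# With these assumptions the gap is about 2,000-fold, consistent with the
# "1,000 to 10,000 times higher" range quoted above.
```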

Writing for the American Council on Science and Health, Joseph Rosen of Rutgers University criticized the rodent studies as flawed, citing the extreme doses, the mode of administration, and key differences between the endocrine organs of humans and rats, where the tumors were observed.  Based on his review of the animal studies, Rosen concluded that “There is no credible evidence that acrylamide in food poses a human cancer risk.”

What about studies of acrylamide specifically in humans?  A 2007 Dutch study linked acrylamide consumption to cancer in women, finding that women who absorbed more acrylamide were twice as likely to develop ovarian or womb cancer as those who ingested a smaller amount.  But a closer look at this study revealed that the measure of acrylamide was based upon a self-reported food questionnaire, not actual clinical measurements.  And there was no clear linkage specifically to meat consumption. As noted by Dr. Lesley Walker of Cancer Research UK:

Women shouldn’t be unduly worried by this news…It’s not easy to separate out one component of the diet from all the others when studying the complex diets of ordinary people. And as acrylamide levels are highest in carbohydrate containing foods – such as chips and crisps – other factors need to be firmly ruled out, especially being overweight or obese, which we know is strongly linked to womb cancer and probably linked to ovarian cancer.

As Dr. Walker notes, it is misleading to connect the potential dangers of acrylamide specifically to meat, since levels are much higher in other foods, particularly high carbohydrate foods.  According to the American Cancer Society, “Acrylamide is found mainly in plant foods, such as potato products, grain products, or coffee. Foods such as French fries and potato chips seem to have the highest levels of acrylamide, but it’s also found in breads and other grain products. Acrylamide does not form (or forms at lower levels) in dairy, meat, and fish products.”

For an eye-opener, consult the FDA’s survey of acrylamide levels in food. If you are worried about acrylamide, you might start by cutting out Ore Ida french fries (with 1098 ppb acrylamide), Hershey’s cocoa (909 ppb), Health Valley Original Oat Bran Graham Crackers (1540 ppb), Ak-mak 100% whole wheat stone ground sesame crackers (343 ppb), Safeway pitted olives (226 ppb) or Starbucks coffee (175 ppb).   By contrast, none of the meat products on the FDA list had more than 100 ppb; many had undetectable levels.
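Since ppb here means nanograms of acrylamide per gram of food, the concentrations above convert directly into micrograms per serving. The short sketch below does that conversion; the ppb figures come from the FDA list quoted above, while the serving sizes are my own rough assumptions.

```python
# Convert the FDA acrylamide concentrations quoted above (ppb = ng per gram of food)
# into micrograms per serving. Serving sizes are rough assumptions for illustration.

foods = {
    # name: (acrylamide in ppb, assumed serving size in grams)
    "Ore Ida french fries": (1098, 85),
    "Hershey's cocoa": (909, 5),
    "Oat bran graham crackers": (1540, 30),
    "Meat product at the FDA ceiling": (100, 150),   # no meat on the list exceeded 100 ppb
}

for name, (ppb, grams) in foods.items():
    micrograms = ppb * grams / 1000.0   # ng/g x g = ng, then /1000 -> micrograms
    print(f"{name:32s} {micrograms:6.1f} micrograms per serving")
```

Even granting a generous serving of meat at the maximum concentration on the list, the per-serving exposure from fries or crackers comes out several times higher.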

In short, if you are concerned about acrylamide, there are more worrisome places to look than meats.

Evolution and dietary adaptation.  But are mice and rats a valid model for assessing the toxicity to humans of compounds from cooked foods?  Rodents may indeed be suitable animal models for many aspects of human physiology and toxicology, such as testing the safety of drugs, synthetic chemicals, and evolutionarily novel compounds. But animal models are useful for understanding human physiological responses only where the two species share metabolic, immune, or detoxification pathways.  This is of particular importance in toxicology, because what is toxic to one species is often harmless or even beneficial to another. Detoxification pathways vary considerably across species.  For example, chocolate can poison dogs because theobromine, its active ingredient, is highly toxic to canines, while humans typically tolerate it well and even find it beneficial. So if you relied on dog testing to determine your food choices, you would never eat another piece of chocolate.

Organisms evolve detoxification processes that respond to what they are likely to encounter in their diets and their environments. Humans have been cooking (and burning) food over hot flames for hundreds of thousands of years, and possibly since the emergence of Homo erectus nearly two million years ago, so they have had ample opportunity to adapt their detoxification responses to the compounds present in charred meat.  It’s just not novel for us.

However, to the best of my knowledge, mice and rats don’t cook their meat — with or without fire.  Meat char represents a novel toxin to rodents, so they are unlikely to have developed a strong system for detoxifying HCAs, PAHs, or acrylamide.   Thus, rodents are not a reasonable model for assessing the toxicity of compounds produced by cooking meat.

The barbecuing species.  There is compelling evidence that humans evolved to eat and thrive on a diet that includes fire-cooked foods – both meat and plants.

The evidence comes from several independent and mutually reinforcing lines.  While the idea of “man as cook” goes back at least to the anthropologist Claude Lévi-Strauss, his main interest was in how cooking changed social psychology.  The most comprehensive and convincing argument for the “cooking hypothesis” is laid out by Richard Wrangham in his 2009 masterpiece, Catching Fire: How Cooking Made Us Human.

Wrangham supports his contention that cooking shaped human biology with several independent lines of evidence — including the findings of archaeology, anthropology, nutrition, evolutionary biology and physiology.  I’ll try to summarize here the key elements and implications of Wrangham’s argument:

  1. Archaeology. Based on excavation of artifacts like burnt bones at campsites, archaeologists trace the origins of cooking by fire to physical evidence dated to less than a million years ago, but other evidence suggests the advent of cooking may have been between 1 and 2 million years ago.  The timing of these archaeological events is mirrored by simultaneous dramatic changes in the teeth, jaws, digestive apparatus and brain of our human ancestors, most particularly Homo erectus, between 1-2 million years ago, as elaborated in Point #4 below.
  2. Anthropology.  Cooking is practiced in every human society, including by modern hunter-gatherers. Hunter-gatherers lived in a wide variety of environments including deserts, mountains, the arctic and rain forests — but cooking is universal. While foods like fruits, organ meats, grubs and fish are typically eaten raw, most hunter-gatherers prefer to cook their meat, eggs and tubers.
  3. Nutrition. Cooking food dramatically increases the palatability, digestibility, tenderness and available caloric value of both meat and plants. Cooking meat and other protein denatures it, rendering it more accessible to the action of digestive enzymes.  The benefits of cooking are not restricted to meats — cooking also makes plant foods more digestible.  Given the limited availability and lean composition of meat in the tropics, equatorial hunter-gatherers also need to consume plant carbohydrates, such as starchy tubers.  Cooking gelatinizes the starch, increasing its glycemic index and making its sugars easier to digest and absorb than those locked up in raw starch.  Raw foodism is effective as a weight loss diet because uncooked food provides less caloric value and requires more energy and time to digest than cooked food.  Domestic animals grow faster and fatter on cooked food than on uncooked food of the same caloric content.
  4. Evolutionary biology.  More nutritive and tender cooked food “remodeled” the bodies of early humans, enabling the evolution of a weaker jaw, smaller teeth, and a smaller digestive tract. Gorillas and chimpanzees are mainly herbivores. They have large molars to chew plant foods, and voluminous stomachs and large intestines capable of digesting and fermenting fiber from plants.  By contrast, humans are less able than apes to digest, ferment and utilize fiber — largely because eating cooked food reduces the need for this capacity. The tenderness of cooked food enabled the human jaw and teeth to shrink.  Human jaws and molars are the smallest of any primate species, relative to body mass.  As a result of their more nutrient-dense cooked diet, humans eat half as much per pound of body weight as do great apes. The shift to a cooked meat diet also coincided with a doubling of the size of the brain of Homo erectus relative to that of Homo habilis and earlier hominids.  The “expensive tissue hypothesis” holds that more nutritive cooked food dramatically increased the energetic efficiency and intelligence of humans, freeing us from time spent gathering and digesting, increasing hunting and migration range and propelling reproductive success.

The evolution of toxicity.   One of the corollaries of the shift from raw to cooked food by early humans was a re-tooling of their sensitivity to and tolerance for dietary toxins.  Wrangham draws out some very important, but often overlooked, consequences of this evolutionary change:

Beyond reducing the size of teeth and guts, the adoption of cooking must have had numerous effects on our digestive system because it changed the chemistry of our food. Cooking would have created some toxins, reduced others, and probably favored adjustments to our digestive enzymes….Take, for example, Maillard compounds, such as heterocyclic amines and acrylamide…They occur at low concentration in natural foods but under the influence of heat their concentration becomes much higher than what is found in nature…They can also induce a chronic state of inflammation, a process that raw-foodists invoke to explain why they feel better on raw diets. The cooking hypothesis suggests that our long evolutionary history of exposure to Maillard compounds has led humans to be more resistant to their damaging effects than other mammals are. It is an important question because many processed foods contain Maillard compounds that are known to cause cancer in other animals. Acrylamide is an example. In 2002 acrylamide was discovered to occur widely in commercially produced potato products, such as potato chips. If it is as carcinogenic to humans as it is to other animals, it is dangerous. If not, it may provide evidence of human adaptation to Maillard compounds, and hence of a long exposure to heated foods.

So while humans have adapted to better tolerate “toxins” like Maillard compounds in cooked foods, the converse is true for certain plant toxins that we expect apes to tolerate better than humans.  Wrangham observes how this shift is readily demonstrated by differences in palatability and food preference between our species and the apes:

In my experience of sampling many wild foods eaten by primates, items eaten by chimpanzees in the wild taste better than foods eaten by monkeys. Even so, some of the fruits, seeds, and leaves that chimpanzees select taste so foul that I can barely swallow them. The tastes are strong and rich, excellent indicators of the presence of non-nutritional compounds, many of which are likely to be toxic to humans—but presumably much less so to chimpanzees….The shifts in food preference between chimpanzees and humans suggest that our species has a reduced physiological tolerance for foods high in toxins or tannins. Since cooking predictably destroys many toxins, we may have evolved a relatively sensitive palate. By contrast, if we were adapted to a raw-meat diet we would expect to see evidence of resistance to the toxins produced by bacteria that live on meat. No such evidence is known. Even when we cook our meat we are vulnerable to bacterial infections….The best prevention is to cook meat, fish, and eggs beyond 140º F (60º C), and not to eat foods containing unpasteurized milk or eggs. The cooking hypothesis suggests that because our ancestors have typically been able to cook their meat, humans have remained vulnerable to bacteria that live on raw meat….We fare poorly on raw diets, no cultures rely on them, and adaptations in our bodies explain why we cannot easily utilize raw foods. Even vegetarians thrive on cooked diets. We are cooks more than carnivores.

Wrangham’s “cooking hypothesis” is certainly intriguing.  But is there any hard evidence that humans and apes (to say nothing of mice) actually have different detoxification mechanisms going on in their bodies?

I believe the answer is yes, but we first need a brief detour into basic toxicology to understand how mammals deal with dietary toxins.

Detox 101.  All organisms process and detoxify “foreign” or poorly tolerated compounds (including alcohol, caffeine and prescription drugs) by a process called “xenobiotic metabolism” (“xeno” = foreign).  While the liver is our primary organ of detoxification, cells throughout the body have the ability to detoxify.  The key point to understand is that detoxification occurs in two steps — Phase I and Phase II:

  • The Phase I detoxification system primarily employs what is known as the Cytochrome P450 system to modify toxins.  The Cytochrome P450 (CYP) system varies by species and occurs in all life forms: animals, plants, microbes — even in viruses.  In humans, CYP enzymes are located not just in the liver, but also in the mitochondria or endoplasmic reticulum of cells in most tissues of the body.  The CYP enzymes work by chemically modifying (hydrolyzing, oxidizing or reducing) the toxins into less harmful, more soluble compounds.  However, the end products of Phase I are typically reactive “free radicals” that can act as carcinogens if they persist in excessive amounts.
  • The Phase II detoxification system is a separate set of enzymes that render the products of Phase I less harmful by “conjugating” them – that is, combining their reactive groups with cysteine, glycine or sulfur molecules to create unreactive, water soluble compounds that are readily excreted in the urine or bile.

So what do we know about how our Phase I and Phase II enzymes handle the compounds in our cooked food diet? Let’s look at Phase I and Phase II separately.
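Before looking at each phase in detail, here is a toy sketch of the two-step pipeline, just to fix the logic in mind: Phase I chemically modifies a fat-soluble toxin into a reactive intermediate, and Phase II conjugates that intermediate into an unreactive, water-soluble product for excretion. The code is purely illustrative; the compound names and flags are made up for the example.

```python
# Toy model of the two-phase xenobiotic detoxification pipeline described above.
# Purely illustrative: the enzyme families are real, but the "chemistry" is just labels.

from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    water_soluble: bool = False
    reactive: bool = False

def phase_one(toxin: Compound) -> Compound:
    """Cytochrome P450 enzymes (e.g. CYP3A4) oxidize, reduce or hydrolyze the toxin,
    leaving a more processable but often reactive intermediate."""
    return Compound(name=toxin.name + " (oxidized)", water_soluble=False, reactive=True)

def phase_two(intermediate: Compound) -> Compound:
    """Conjugating enzymes (e.g. glutathione transferase) attach a polar group,
    yielding an unreactive, water-soluble product ready for excretion in urine or bile."""
    return Compound(name=intermediate.name + " + glutathione", water_soluble=True, reactive=False)

dietary_toxin = Compound("PAH from char")
excretable = phase_two(phase_one(dietary_toxin))
print(excretable)   # water-soluble and non-reactive, ready for excretion
```

The point the sketch captures is that the two phases have to stay in balance: a Phase I product left unconjugated is exactly the kind of reactive intermediate described in the first bullet above.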

Phase I enzymes and the human diet.  As noted in an excellent study by Kumar et al. (2009), the single most important and abundant Phase I enzyme in humans is CYP3A4.  It is found not just in the liver, but throughout our bodies. CYP3A4 accounts for 30% of all the P450 enzymes expressed in the liver and about 80% of the CYP enzymes in the intestine!  According to Kumar et al., CYP3A4 is considered “the most important drug-metabolizing enzyme in humans, due to abundance, wide spectrum and inducibility.”

Now here is an amazing fact: despite sharing about 99% of our DNA with chimpanzees, we humans have developed a very different detoxification system.  Our CYP system has changed enormously since the evolutionary split from chimps.  Our most prevalent detox enzyme, CYP3A4, is found at about twice the level, and in a changed form, in humans relative to chimpanzees. According to Kumar et al., “CYP3A4 evolution in the human lineage would most likely reflect the adaptation to a change in the physiology or environment of our direct ancestors.”

While human CYP3A4 for the most part has breadth and specificity of activity against toxins similar to the chimp version, there is one interesting exception: unlike the chimp enzyme, the human variant is very active in deactivating (“de-benzylating”) a toxic bile acid compound known as lithocholic acid (LCA).  As Kumar notes,

The activation of human CYP3A4 by LCA reported in our present work would be expected to increase the detoxification of this and other bile acids metabolized by the enzyme, although this remains to be formally demonstrated. In contrast to LCA, no activation differences were detected in response to the less toxic LCA precursor chenodeoxycholic acid and the other major primary bile acid, cholic acid. This suggested a previously unknown defense mechanism against LCA-mediated cholestasis, which evolved after the split of the common human-chimpanzee lineage. The physiological necessity of such a mechanism may be related to our ancestors, beginning with Homo erectus some 1.8 million years ago, having adapted to an energy-dense, meat-based diet. Contemporary human foraging populations derive more than half of their dietary energy from animal foods, in comparison with 5 to 10% observed in chimpanzees. This adaptation may have been a prerequisite for the subsequent dramatic increase in the brain size in the human lineage. It is noteworthy that meat-based diet increased the load of animal steroids and thus the risk of cholestasis.

In short, the explosive increase of a single detoxifying enzyme, CYP3A4, is a kind of “molecular archaeology” that points to the human adaptation to a cooked meat diet.

Interestingly, CYP3A4 is also induced by PAH compounds, one of the “compounds of concern” associated with charred meat.  In other words, humans are much better adapted than chimpanzees to detoxify potential meat toxins.

Phase II enzymes and the human diet.  Now let’s look at how well humans deal with the reaction products of the Phase I system.  In Phase II, the altered toxins from Phase I are conjugated and neutralized by the Phase II antioxidant enzymes to form easily excreted compounds.  The Phase II system, sometimes called the Antioxidant Response Element (ARE), consists of endogenous antioxidant enzymes such as:

  • glutathione reductase
  • glutathione transferase
  • glutathione peroxidase
  • glucuronosyl transferase
  • quinone reductase
  • epoxide hydrolase
  • superoxide dismutase
  • gamma-glutamylcysteine synthetase

In an earlier post, The case against antioxidants, I described how our body’s own Antioxidant Response Element is far more capable and nuanced in dealing with free radical oxidants than ingested exogenous antioxidant “vitamins” like vitamins C and E.

How do we activate the Phase II enzymes? The Phase II system is not so much “induced” as activated by Phase I products and strengthened by nutrients.  Different foods and nutrients activate different Phase II enzymes:

  • Glutathione conjugation: Brassica family foods (cabbage, broccoli, Brussels sprouts); limonene-containing foods (citrus peel, dill weed oil, caraway oil)
  • Amino acid conjugation: Glycine
  • Methylation: Lipotropic nutrients (choline, methionine, betaine, folic acid, vitamin B12)
  • Sulfation: Cysteine, methionine, taurine
  • Glucuronidation: Fish oils, limonene-containing foods

In my earlier post, The case against antioxidants, I provided a more extensive list of phytochemical-rich plant foods, herbs and spices that have been shown to activate the Phase II enzymes, including curcumin, green tea, garlic, rosemary, ginkgo, bee propolis, and even…coffee!

Dietary hormesis. Once we understand that the Phase II system is turned on in response to environmental exposure, it becomes logical to ask: Can the char compounds in cooked meat themselves, if consumed in moderation, actually improve our health by building our tolerance for potential toxins and carcinogens?

There is some evidence that this is indeed the case.  For example, a 2007 study by Hayes in the European Journal of Clinical Nutrition found evidence for dietary acrylamide as a “hormesis-inducing agent”.  Among the studies cited by Hayes:

  • Mucci (2003) found that higher levels of acrylamide intake were associated with significant reductions in large bowel, colorectal and kidney cancers
  • Collins (1989), in a study of 9000 workers exposed to acrylamide over 50 years, found a statistically significant decrease in deaths from all causes

More to the point, there is some evidence that consumption of grilled beef itself activates our endogenous Phase II anti-oxidant enzymes.  In a 2013 study of 29 healthy non-smoking males without prior occupational exposure to PAHs, twice-daily consumption of charcoal-broiled hamburgers resulted in significant increases in serum levels of the antioxidant enzymes GOT, GPT and ALP.  Levels of non-enzymatic “sacrificial” antioxidants decreased, but that is not surprising given the “load” put on the detoxification system.
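One way to picture the hormesis argument is as a biphasic dose-response curve: a small adaptive benefit at low doses that is overwhelmed by direct harm at higher doses. The toy model below sketches that shape; the functional form and parameters are invented purely for illustration and are not fit to any of the studies cited above.

```python
# Toy biphasic (hormetic) dose-response: a saturating adaptive benefit competing
# with damage that grows faster at higher doses. Functional form and parameters
# are invented for illustration only; they are not fit to any study.

def net_effect(dose: float) -> float:
    adaptive_benefit = 2.0 * dose / (1.0 + dose)   # e.g. upregulation of Phase II defenses, saturating
    direct_harm = 0.15 * dose ** 2                 # damage keeps growing with dose
    return adaptive_benefit - direct_harm          # positive = net benefit, negative = net harm

for dose in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"dose {dose:4.1f}  ->  net effect {net_effect(dose):+6.2f}")
# Low doses come out net-positive, high doses net-negative: the classic hormetic shape.
```

The point is not the particular numbers but the shape: with any saturating adaptive response competing against damage that keeps growing with dose, moderate exposure can come out net-positive while heavy exposure does not.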

The bottom line.  It’s time to summarize the argument in this blog post:

  1. Unlike any other mammal or primate, humans evolved to specialize in eating cooked meats and plants.
  2. There is no direct evidence that the compounds produced by grilling meat are toxic or carcinogenic to humans.
  3. Activating our xenobiotic detoxification system by moderate consumption of grilled meat may actually strengthen our generalized ability to neutralize toxins and carcinogens
  4. Therefore, embrace grilled meats — don’t fear them.
  5. If you have any remaining qualms, grill in moderation, and eat a side of broccoli or Brussels sprouts with your steak or burger, or mix a little turmeric, rosemary, or garlic into the recipe.
Happy grilling!

19 Comments

  1. Van

    Very insightful post. I bought into the concerns with regards to the pernicious effects of acrylamide, HCAs etc. Thanks for contributing to my edification.

    A quick question about vegetables. This is a question posed to Michael Eades on his blog:

    “What I’m struggling with now is why even need vegetables and fruits at all in our diet. I can’t help wonder what the First Nations people ate 9000 years ago. There is plenty of wild game and fish and wild seasonal fruit (berries) but I’ve yet to see a wild carrot patch. Why are vegetables and fruits so sacred? What’s your take on this?”

    “My take is that we really don’t need them, but we eat them because they’re there and because they taste good. And they keep the diet from being so boring. I’m sure they add a few phytonutrients that are good for us, but I’m not sure we wouldn’t get the same nutrition in just a little different form from an all meat diet.” Michael Eades

    Are vegetables overrated?

    Flavonoids/SCFAs: The truth is that human beings are very inefficient at extracting mineral nutrients from foods of animal source, and almost completely unable to absorb minerals from plant sources. Almost all the biological activity of polyphenolic compounds comes not from the phyto-nutrient, and not from the liver metabolite of the original polyphenol, but from derivatives of the phyto-nutrient produced by the intestinal flora. Those with healthy intestinal flora convert these phyto-nutrients into extremely beneficial polyphenol derivatives that are absorbed into the bloodstream and have tremendous biological activity. Those with rotten intestinal flora produce little or no beneficial phyto-nutrient derivatives, and in many cases have nasty bacteria that produce entirely different polyphenols that are actually toxic.

    Where, then, are we to get our mineral nutrients? Our drinking water, if sufficient in total dissolved solids (that is, ionized mineral salts) is our most reliable and efficiently absorbed source of minerals. Calcium and magnesium are particularly difficult to extract from our foods, and must be concentrated in our drinking water.

    SCFAs

    The structure and function of colon cells, as well as the link between the GI tract and the immune system, is tied in with the proper quantity and balance of short-chain fatty acids. Short-chain fatty acids (SCFA) are end-products of the anaerobic colonic bacterial fermentation of carbohydrates. There are 3 SCFAs (acetate, propionate, and butyrate), which must exist in the healthy adult colon in the ratio 3:1:1; they play a vital role in maintenance of colon cell integrity and metabolism, and modulate immune system activation.

    Vegetables (fibrous/non-starchy) provide indigestible carbohydrate as the primary food source for normal intestinal flora. The acetate formed via colonic fermentation of undigested carbohydrates is readily absorbed from the colon. Short chain fatty acids like butyrate not only provide a sustained source of energy, but support the health of the intestinal lumen.

    • Todd

      Great question about vegetables. Anthropology and biology tell us that humans are opportunistic omnivores, “flex-fuel” creatures able to subsist and thrive on a wide variety of diets, from the mostly carnivorous (like the Inuit) to the highly vegetarian (like the Kitavans). I do agree with the points you make about the benefits of short chain fatty acids, which beneficial gut flora ferment from vegetable-derived soluble fiber. In that sense, a “high fiber” diet is actually a high fat diet! SCFAs are not only a great instant source of energy, they appear to play an important role in the health and integrity of the intestines themselves. And one should not underestimate the important role that “good” gut flora like bifidobacteria, lactobacilli and other species play in regulating immunity and moderating inflammation. We are only beginning to understand the complexity of their function.

      I also concur that vegetable-derived phytonutrients are beneficial, less because of their inherent antioxidant activity and more because of the way they stimulate the Nrf2 system to upregulate detoxifying Phase II enzymes. I discussed that in some detail in “The case against antioxidants” .

      I’ve not written much about minerals and mineral deficiencies. What I’ve gathered is that the problem for most people is not a deficiency of minerals in their food intake, but rather poor absorption or even misallocation by the metabolism. For example, hormones such as insulin can have a big impact on what the body decides to do with dietary calcium: an insulinogenic diet tends to lay down calcium within the arteries instead of the bones, leading to the paradox that consuming high levels of calcium in supplements fails to correct osteoporosis. Conversely, someone with a healthy gut, inhabited by good microflora, who eats a whole food diet of modest mineral content, is able to readily absorb and hold onto the minerals they need.

      Todd

  2. Tom Brennan

    Another great post. Thanks for your in-depth analysis.

  3. Seamus

    Studies seem to show smoke inhalation is harmful, be it from open fires, cigarettes, burning candles or incense, etc. Where can this fit in from an evolutionary standpoint? Can smoking one cigarette per week be beneficial from a hormetic point of view?

    • Todd

      Seamus,

      What you say about smoke is true, but unless you are a miserable barbecue chef, I don’t imagine you’ll be inhaling an excessive amount of smoke and soot — at least relative to the amount of char on your steak 🙂

      I have not researched the topic of smoking and hormesis, so I can’t give you a well considered answer, only some preliminary thoughts. The question of whether occasional smoking might have hormetic benefit has actually been debated quite a bit. I’m not sure there is a clear answer; as you might imagine, it’s not the easiest question to answer experimentally. But consider that — despite the proven association between smoking and cancer — exposure to tobacco smoke still only involves a statistical change in the odds. Only 10% of smokers get lung cancer and 15% of lung cancer victims didn’t smoke:

      http://www.utsouthwestern.edu/life-at/med-talks/why-do-smokers-never-get-lung-cancer.html
      http://www.science20.com/science_20/cigarette_smoking_woo_time_day_impacts_cancer_risk-81804

      It still remains that smoking sharply increases your odds of getting cancer. Yet, as with anything to do with hormesis, there is likely an optimum dose – and that optimum may vary depending on the age, health and previous experience of the individual. I don’t smoke and never have, but I’m open to the possibility that moderate smoking could have benefits. And I think that uncured tobacco may be better tolerated, or more hormetic, than tobacco “cured” with sugar and chemicals.

      If you are interested in delving deeper into this topic, here is one thread on a Paleo forum that includes some provocative thinking on the topic:

      https://www.paleohacks.com/china-study/could-smoking-tobacco-have-a-benefit-if-so-what-is-the-mechanism-177

      Todd

  4. Igor Bukanov

    I think the evolutionary arguments here are rather weak. It could be that humans consumed cooked meat for a million years, but it does not imply that it was grilled meat prepared over a high-temperature open flame like modern grills use. In particular, I do not see proof that food was mostly grilled, not baked.

    For example, without cookware, baking is pretty much the only way to prepare vegetables.

    Also, an easy way to cook a killed bird or a small animal in a forest is to put it into a small pit (ideally one covers the body with clay first, but this is not necessary), put some dirt on top and make a fire over it. After that one pulls the body from the pit and just peels away the skin with its feathers or fur. This preserves almost all of the animal fat with all its calories. Moreover, as baking takes time, the meat stays at high temperature much longer than on an open grill, so parasites get a better chance to die. Such baking in one form or another is also used by aboriginal peoples.

    And consider that meat for grills comes from farmed animals that are very different from their wild counterparts. In particular, as they are much fattier, baking would leave too much fat, so people grill to burn some of it away. And if one eats the results of modern-day industrial farming, all evolutionary arguments should just be thrown away.

    • Todd

      Igor,

      You do raise a valid concern — how do we know that food was frequently cooked on an open flame, rather than baked? Wrangham devotes a good part of Chapter 4 in his book to this question. The most relevant evidence comes from archaeological sites and from surveying the cooking habits of modern hunter-gatherers. (I don’t know what other possible evidence there could be). At sites in Spain, Gibraltar and Egypt, dozens of cooking hearths have been found, all with abundant charcoal, charred logs, and deep ash deposits. That in itself doesn’t establish whether the food was directly exposed to flames, or buried for baking.

      However, the archaeological record also contains ample evidence of charred grasses, seeds and protein of meat origin. So at the very least, our ancestors were not too careful to avoid burning, or at least heating food to high temperatures. We should keep in mind that heating to high temperatures — without charring — is sufficient to produce the HCAs, PAHs and acrylamide that are discussed in my post.

      As to fat content: you are right that industrialized farming produces a fatter cut of muscle meat. But hunter-gatherers tend to eat the whole animal, savoring the fatty internal organs, glands and brains; and in arctic regions, blubber. The “expensive tissue hypothesis” of Aiello and Wheeler argues that this helped provide the energy that drove the evolution of the enlarged human brain. So we can’t assume that our forebears ate a low fat diet.

      I will agree that the evidence of archaeology by itself is not conclusive. But add to it the evidence of the reduced gut, the diminished jaw and elevated levels of CYP3A4, and the case gets stronger that our food was often cooked hot, burned or charred.

      I also add one common sense argument: If you grant that our ancestors cooked with fire, is it really reasonable to suppose they went out of their way to avoid high temperatures and charring? Why would they be so picky? I will also grant you may be right that early humans might have baked their meat and vegetables some of the time. But is it reasonable that they ONLY baked and never grilled on the open flame? What would motivate such a restrictive practice?

      I personally find that a little charring enhances the flavor and texture of meat, and I don’t think I’m alone in this preference. Without being warned by finicky modern toxicologists to avoid the high heat and blackening, what would motivate earlier humans to fear the char?

      Todd

      • Igor Bukanov

        Some time ago I read an article that discussed grilling from an energy point of view. Grilling the lean bodies of wild animals really loses almost all their fat, even if one eats them whole. As digesting animal proteins is a rather energy-intensive process (all those enzymes that you listed are very expensive to make), in some cases eating burned meat could take more calories than it brings, making one hungrier.

        Another thing is that modern aboriginal people have much more sophisticated tools than those that were available, say, 100 thousand years ago. For example, try to remove fur or feathers with what we think was available at that time. It would be a very time-consuming process!

        That puzzled me. If the archaeological findings about meat consumption are right, then perhaps grilling meat was a cultural/socializing ritual rather than a source of calories. Then I read a story in Russian about survival camp training that strongly recommended baking small game precisely because it preserves all the fat, the biggest source of calories.

        That matched my student experience: on a trip I saw how to bake a domestic chicken that we got from a farmer. It was a very convenient process. We needed no tools. The only drawback was that we had to wait at the fire for quite some time. But that nicely kept the mosquitoes away. However, the end result was not very pleasant, as all the fat from the chicken’s body stayed.

        My own speculation is that it was only big game that was grilled, making it a *rare* socializing event. Vegetables and small game were baked, as that was about survival and the need to get all possible calories from food. Of course, I have no proof of this.

        In any case, I also want to point out that we have no idea from the archaeological evidence how often grilling or any other food preparation process took place. Another argument about the weakness of the current evidence is that humans have very little variation in their DNA, meaning that we are descendants of a very small group of survivors. Thus, unless we get evidence that our direct ancestors with our particular DNA really used a particular food preparation process, current archaeological findings may point essentially to extinct DNA variations. And what if that DNA went extinct precisely because of the particular food preparation process that archaeologists discovered?

        • Todd

          You raise good questions and ideas, and have a curious mind, Igor. I think you would make a good anthropologist or archaeologist. 🙂

          • Igor Bukanov

            Note that besides the evolutionary arguments, your article nicely pointed out that we do not have evidence that occasionally consuming grilled meat and vegetables can harm us. Moreover, it presented a very reasonable mechanism for how it can in fact even be beneficial to health, as long as one allows sufficient time for recovery.

            So one may avoid grilled meat for ethical, religious or other reasons, but not for health reasons.

  5. Seamus

    Eagerly awaiting a response/rebuttal to the well-publicized WHO release this week on processed meat and red meat. Having read the statement, I see that the WHO specifically tries to pinpoint cooking methods as a culprit.

    I’ve read Mark Sisson’s response and others but was interested in yours as other authors seem much more sympathetic to the conventional wisdom that charred meat is indeed bad for health.

    • Todd

      Seamus,

      Yes indeed. The W.H.O. analysis looks at correlation, not causation, because it considers only observational data on large populations. Furthermore, it fails to correct for the covariance of other dietary and lifestyle factors. It is well known that those who eat the most meat tend to be heavier and less active. For all we know, they also consume more carbohydrates and eat less fiber.

      On my Facebook page, I’ve linked to a nice deconstruction of the W.H.O. study by Chris Kresser.

      I read Mark’s column about the W.H.O. study. He makes a number of good points, but repeats the same erroneous information about the risks of eating meat that is cooked at high temperatures. I wonder if Mark thinks our paleolithic forebears were really so careful to avoid exposing their meat to fire.

      Todd

  6. Skeezix

    Thanks for a good article. I’ve always felt that studies involving rodents have little to no bearing on what’s bad or good for human beings. Conventional Wisdom seldom recognizes this and insists on promoting various rat studies as gospel. Thankfully there are many of us (like yourself) who are able to roll our eyes at the nonsense instead of buying into it.

    • Todd

      Thanks, Skeezix.

      I have no problem with rodent studies when they are biologically relevant or even helpful. But you can’t just blindly always use them. Certain systems, such as the circulatory system, are quite similar among mammals. However, others, like the detoxification or immune systems, are highly differentiated and adaptive to specific environmental exposures. So you need to apply some intelligence. If one wants to get information on the ability to detoxify compounds in cooked foods, one might think about the fact that rats don’t cook their food. Then take the next step and look for the prevalence of specific detox enzymes. Answer: rats lack what humans have.

      I’m not always against rat studies. I’m always open to new information, but one has to look at the full context and not follow a fixed recipe when doing science.

      Todd

  7. Van

    Thanks for the link (Chris Kresser’s analysis) on your FB page.

    Does IF significantly improve the gut microbiome (assuming one’s eating plan is optimized)?

    Going off on a tangent: what is your take on Ray Peat? Ray Peat is an amusing character. I have read much of what he has written over many years. What I’ve gathered about his personal eating plan is that he consumes every day: 1 pint of coffee, 1 pint of orange juice, 1 pint of milk, and 1 pint of ice cream. Surely he must spend so much time standing at the urinal that it is a wonder he has time to do anything else.

    • Todd

      Hi Van,

      It is likely that IF alters the gut microbiome — in a positive way. I could not find any direct evidence of this, but there is some indirect evidence that comes from studies of calorie restriction per se:

      http://www.nature.com/ncomms/2013/130716/ncomms3163/full/ncomms3163.html

      “Calorie restriction enriches phylotypes positively correlated with lifespan, for example, the genus Lactobacillus on low-fat diet, and reduces phylotypes negatively correlated with lifespan. These calorie restriction-induced changes in the gut microbiota are concomitant with significantly reduced serum levels of lipopolysaccharide-binding protein, suggesting that animals under calorie restriction can establish a structurally balanced architecture of gut microbiota that may exert a health benefit to the host via reduction of antigen load from the gut.”

      By themselves, studies like this don’t establish that IF has the same impact on the microbiome as general calorie restriction. But indirect evidence, showing that CR and IF have similar effects on insulin resistance, inflammation and immunity, makes such an inference likely.

      As to Ray Peat, that’s a hard one. He is definitely hard to pigeonhole. I think he has some provocative ideas worthy of further investigation (like his skepticism about fish oil), and other ideas with some partial validity that he takes too far (like his ideas about sugar). My biggest disconnect with him is that he is stuck in a worldview in which oxidative stress and “damage accumulation” are at the root of aging and disease, and that he fails to appreciate hormesis. I’ve discussed the flaws in the damage accumulation theory in my review of the book “Spring Chicken” in the post “Live Longer“.

      For a critical but balanced analysis of Ray Peat, I found Michael Allen Smith’s blog post to be instructive:
      http://criticalmas.com/2012/11/the-peatarian-diet-for-those-of-us-with-average-iqs/

      And his follow-up article regarding the issue of hormesis (with a reference to my blog 🙂 )
      http://criticalmas.com/2013/01/peat-atarians-and-fear-of-hormetic-stress/

      Todd

  8. Coco

    That was really interesting! I still won’t eat charred food because to me it has the foulest taste I have ever tasted. In fact, it tastes so bad to me that I thought it confirmed the theory that it’s not good for our health. And when I say charred, I mean overcooked in a broad way. I can’t eat bread crust to save my life and actively avoid the bread aisle at the grocery store because it smells awful. Same for black tea or coffee. The list is long. Thanks!

    • Todd

      Hi Coco,

      I remember how before I weaned myself off of sugar I used to love my coffee with cream and sugar. After going low carb / Paleo, I cut the sugar, but still loved my coffee with cream. Couldn’t fathom black coffee or black tea — too bitter! But a year ago I stopped putting cream in my coffee and just drink it black now. And I love the subtle flavors of different black and green teas.

      Bitterness can be a natural sign of toxicity, but mild bitterness is frequently associated with health-inducing phytonutrients. What is toxic at high doses can hormetically stimulate endogenous detoxification pathways at lower doses, for example by activating the Nrf2 pathway and Phase II detox enzymes, as I explained in The case against antioxidants.

      Mildly bitter flavors are an acquired taste — a learned pleasure. Or maybe it is more accurate to say that it takes time for the mask of sweetness to wash out and allow the rich subtlety and variety of bitter flavors to re-establish their rightful place in an evolved palate.

      That said, I’m sympathetic to anyone who rejects these flavors at first encounter. The subjective repulsion is what it is. But consider giving it some time, gradually allowing for some degree of caramelization in grilled foods, or reducing the sweetening of coffee and tea. Based on my own personal trajectory, I find my reformed palate able to detect a wider and more beautiful array of flavors.

      Todd

      • Coco

        Interesting answer, thanks! It’s funny that I just purchased some bitter herbs from my local health store.

        I need to say that I’ve never been able to drink coffee with any amount of milk or sugar and I don’t plan to ever try again.

        I enjoy the taste of caramelization just fine (caramelized onions, anyone?); it’s the other type of “brownness” that I can’t manage. It’s hard to describe, but I can’t eat my eggs overcooked, and sometimes there is not even a hint of brown.


