In the beginning, medicine was messy. Before we had the technology to see into cells, all we had was observation, inference, and good old-fashioned knives. And in the very beginning, we didn’t even have the luxury of being able to cut people open to see how everything fit together: in many parts of the world, for religious and cultural reasons, human dissection was strictly off the table. So early students of human form and function had to make do with animals and hope that our own guts were arranged in the same way. If you wanted to be a surgeon in the early Roman Empire, you also had serious social stigma working against you, since surgery was considered dirty, undignified manual labor by the wealthy upper class. Thankfully for anyone who’s ever studied science or been to a hospital, this didn’t deter Galen, a young philosopher who started out as a gladiatorial medic and ended up authoring stacks of meticulous treatises on human diet and disease that formed the bedrock of much of modern medicine.
In part due to the social strictures of the times, early doctors like Galen were primarily dietitians. Among the models of disease Galen advanced was the idea that optimal health could be achieved through the balance of the “humours” — blood, phlegm, and black and yellow bile — and that excesses or deficits in any of these could be brought about by poor diet or overexertion, leading to specific kinds of symptoms. In a land before syringes and anesthesia, the primary means of removing things from patients were sharp and frightening and often involved fire. So rather than grisly and injurious elimination of disease-causing agents from the body (with a very high potential for the introduction of even more), a focus on food became paramount in treatment, and also became the subject of careful accounts of different diets and their effects.
We seem to find ourselves in a similar situation today. Meal planning and fitness apps are ubiquitous, and meal kit delivery services cater to every possible dietary regimen currently vying for attention as the most effective way to lose weight, boost your mental acuity, and help you live longer. A cursory Google will bombard you with entreaties to try paleo, keto, plant-based, “clean” lifestyles, assuring you that you’ll reap immediate benefits. I personally embrace the lifestyle choices (if only in theory and not strictly in practice) of Jeanne Calment, whose recorded age at the time of her death was 122. She spent the majority of her century-plus on earth chain-smoking, drinking port wine, eating chocolate, and possibly also lying about her age. Whether or not this last part is true is still contentious, but either way, in my mind, it stands as a ringing endorsement for just doing whatever the hell you feel like doing and seeing what happens.
Because a lot of these diets that don’t consist primarily of alcohol and addictive stimulants are restrictive and require skillful nutritional planning, many people find them hard to stick to for prolonged periods of time and turn to more enticing alternatives that dictate when you eat, but not necessarily what you eat. There are many variations on the theme of intermittent fasting, but the basic idea is that you restrict your food intake to a brief window during the day, usually between 8 and 10 hours, and during this time you can eat whatever kinds of food you want (provided you don’t lean too heavily into the port wine and chocolate). The rest of the time, your body switches metabolic gears and initiates pathways that reportedly defend against cancer, protect you from neurodegenerative disease, prolong life, and ultimately guide you down the yellow-brick road to the elusive, magical land of “Optimal Health”.
That yellow-brick road is paved with poorly controlled studies, weak correlations, and conflicting results that are further muddied by attempts to translate findings across species with fundamentally different physiologies. But the basic biology is (mostly) straightforward, and many of these claims do have solid support in mice and humans. At the center of the story is your brain: a greedy, glucose-hungry monster that constantly demands attention from the rest of your tireless body. And the easiest place to find glucose is in carbohydrate-rich foods that, for evolutionary reasons, we find highly palatable and tend to binge on around the holidays.
Glucose is the primary source of all the cellular energy we produce under normal conditions, but when we can’t find glucose, our bodies have to find another way to make it or face the wrath of an angry brain. So we scavenge alternate energy sources stored in our fat, along with free amino acids that would otherwise be used to make new proteins. This is actually a generalized cellular response to any kind of extreme stress: in lean times, we switch from building things up to breaking things down.
Some interesting things start to happen at the intersection of cell biology and behavior. When the stress response pathway is activated and your intracellular protein factories are temporarily shut down for repairs, a host of different proteins and chemical messengers is produced to aid in the process — part molecular demolition derby, part team of highly specialized janitors. The last thing you want at a time like this is more new junk coming in that you’ll eventually have to find a way to get rid of.
One of the proteins engaged in the cellular stress response pathway is GDF15, a growth factor whose biological functions are so numerous it’s easier to ask what it doesn’t do. But one surprising thing it does seem to do is curb your appetite. Two recent studies found that, in both mice and humans, at least part of the reason the anti-diabetic drug metformin is so effective is that it causes people (or mice) to eat less, especially if they’re obese or on a high-fat diet. It does this by chemically triggering the stress response, which in turn stimulates the production of GDF15. Other studies in mice have also pointed to GDF15 as a possible gut-to-brain signal of nutritional imbalance. In taste preference tests, when given a choice, mice will learn to actively avoid an otherwise neutral flavor if it’s paired with a dose of GDF15. Elevated levels of the protein have also been found in pregnant women who suffer from severe morning sickness, strongly suggesting that beyond simply keeping you from going back for seconds, GDF15 might just steer you out of the lunch line in the first place.
All of this presumes that we consider food first and foremost as an energy source. This may have been true hundreds of thousands of years ago, but it isn’t anymore. Eating is a social activity. From dinner dates to weddings to wakes, food brings us closer to each other. And if you think of the kinds of food you might find at any of these events, you probably aren’t envisioning tables buckling beneath a mouth-watering bounty of berries, kale, and flax seeds. Foods that are delicious didn’t become that way by accident; glucose was assigned a very high reward value by our brains when we learned that if we ate a lot of it, we could run farther and climb faster and were far less likely to die of exhaustion or be eaten. I personally can’t remember the last time I was forced to run because I was being pursued by anything more relentless or frightening than my own guilt and neuroses, and the existence of the elliptical is testament to the fact that this is probably also true for a lot of people in present-day Western society.
If you’re shopping for diets to help you lose weight, you’ll find that in spite of their differences, most of them are united against the latter-day nutritional demons that are carbohydrates. None are quite so restrictive as the ketogenic diet, which aims ultimately to trigger the same biochemical processes as extreme stress, forcing your cells to go to a lot of trouble to maintain your basic energy requirements. The energetic currency of your cells is a very valuable little molecule called ATP, and to earn it they have to break glucose down into smaller and smaller chunks of carbon until they get to another molecule called acetyl CoA. This gets fed into the Krebs cycle, which you may remember from the many painstaking drawings you made in biology class and then gleefully tossed into the garbage as soon as you’d passed your exams. The major things to remember for our purposes, though, are that acetyl CoA enters, and eventually ATP leaves, and your cells then have the energy to go about their business. When you start running low on glucose, though, your fat stores provide the building blocks for ATP. Lipids are broken down into smaller parts, free fatty acid chains, that are in turn metabolized to ketones, which can be further metabolized into the acetyl CoA that the Krebs cycle needs to keep on Krebbing.
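For a rough sense of the bookkeeping, the stage-by-stage tallies below are standard textbook numbers; the conversion factors for NADH and FADH2 (the electron carriers that ultimately yield most of the ATP) are approximations, and published totals for one glucose molecule range from about 30 to 38 ATP depending on the assumptions used:

```python
# Back-of-the-envelope tally of ATP yield from one glucose molecule.
# The per-carrier conversion factors (~2.5 ATP per NADH, ~1.5 per FADH2)
# are commonly cited approximations, not exact values.
ATP_PER_NADH = 2.5
ATP_PER_FADH2 = 1.5

# (direct ATP/GTP, NADH, FADH2) produced per glucose at each stage
stages = {
    "glycolysis":             (2, 2, 0),
    "pyruvate -> acetyl CoA": (0, 2, 0),
    "Krebs cycle (2 turns)":  (2, 6, 2),
}

total = sum(
    atp + nadh * ATP_PER_NADH + fadh2 * ATP_PER_FADH2
    for atp, nadh, fadh2 in stages.values()
)
print(total)  # ~32 ATP per glucose under these assumptions
```

The point of the arithmetic is simply that the Krebs cycle and the electron carriers it loads up account for the overwhelming majority of the payoff, which is why cells go to such lengths to keep feeding it acetyl CoA — from glucose when they can, from ketones when they can’t.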
There are a lot of diets currently branding themselves as ketogenic, but many of these simply recommend increasing your ratio of protein and fat to simple carbohydrates. True ketogenic diets began as treatment plans in extreme cases of epilepsy, and only as a last resort when surgery and drugs had failed. These diets require you to almost exclusively consume fat, with about 10 percent of your calorie allotment coming from protein and carbohydrates. And this works — consistently, patients see a dramatic reduction in the frequency of their seizures within just a few days. Exactly how this happens is biochemically mysterious, but the overall effect is to dial down the brain’s electrical activity. Even then, it’s still unclear whether ketone bodies themselves are the missing molecular puzzle piece; prior to the implementation of ketogenic diet plans, starvation was found to be equally effective in seizure control (although obviously this wasn’t a sustainable treatment model in the long run).
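To make that ratio concrete, here is a rough sketch of what a roughly 90-percent-fat calorie split looks like in grams, using the standard Atwater factors (about 4 kcal per gram for protein and carbohydrate, 9 kcal per gram for fat). The 2,000-calorie budget, the even protein/carbohydrate split, and the `keto_grams` helper are all illustrative assumptions, not a prescription from any clinical plan:

```python
# Standard Atwater energy densities, in kcal per gram.
KCAL_PER_G = {"fat": 9, "protein": 4, "carbohydrate": 4}

def keto_grams(daily_kcal, fat_fraction=0.90):
    """Split a daily calorie budget into gram targets.

    The remaining non-fat calories are divided evenly between protein
    and carbohydrate -- an assumption made purely for illustration.
    """
    other_fraction = (1 - fat_fraction) / 2
    return {
        "fat": daily_kcal * fat_fraction / KCAL_PER_G["fat"],
        "protein": daily_kcal * other_fraction / KCAL_PER_G["protein"],
        "carbohydrate": daily_kcal * other_fraction / KCAL_PER_G["carbohydrate"],
    }

targets = keto_grams(2000)
print({k: round(v) for k, v in targets.items()})
# On a 2,000-calorie budget: 200 g fat, 25 g protein, 25 g carbohydrate
```

Seen in grams rather than percentages, it becomes obvious why the diet is so hard to sustain: 25 grams of carbohydrate is less than you’d get from a single large apple.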
This brings us back to the basic idea of how our cells decide whether to create or destroy, depending on the resources available to them. Creating things chiefly means creating new proteins, and part of the reason fasting or ketogenic diets might be so effective is that neurotransmission is a very complex and energetically expensive process, requiring a daunting array of specialized proteins that connect nerve cells or convey chemical signals between them. One idea is that when fewer of these proteins are available, the flow of electricity slows down. Not only epilepsy but many neurodegenerative diseases, like Alzheimer’s and Parkinson’s, are thought to arise or progress more aggressively because of a phenomenon called excitotoxicity. The idea here is that if receptors that promote neuronal activity are engaged for too long, the cell is overwhelmed and eventually dies. In this context, it makes sense that drugs that target stress pathways, or dietary interventions that activate them, can also be effective in the management or treatment of a variety of neurological conditions beyond epilepsy.
What a lot of health and fitness websites gloss over in their ardent endorsements of these kinds of extreme diets for anyone looking to lose a few pounds or boost their energy levels is that the list of risks and unpleasant side effects of eating mainly fat and protein is pretty extensive. Patients who follow a ketogenic diet for seizures are usually advised to do so with the guidance of a nutritionist and routine follow-ups with a physician. The high ratio of fat to carbs required by the diet can raise LDL (better known as the “bad cholesterol”) and lead to hypoglycemia, especially in people who are already taking insulin for diabetes. It’s also been shown to lead to kidney stones, and could pose serious health risks to people with underlying kidney issues. On top of the strict contraindications for people with specific metabolic conditions, commonly reported side effects are abdominal pain, mood and sleep disturbances, and a general feeling of sickness similar to the flu.
It’s just as hard to conclude that a diet or exercise trend is uniformly bad as it is to conclude that it’s a miracle cure for every ailment. Dubious results from dozens of studies might also leave you at a loss if you’re trying to make a decision based on solid science. One important consideration is that many of these initial studies were conducted in children, or in people for whom the diet was being used as a treatment for a serious medical condition. Far fewer long-term studies have looked at its effects on otherwise healthy adults. A strong case can be made, though, for the beneficial effects of mild, short-term activation of cellular stress response pathways, which you can achieve even through short bursts of exercise or limiting your calorie intake for as few as 12 hours out of every 24. But as human beings, we tend to drift toward extremes, and it seems like Oscar Wilde’s tongue-in-cheek exhortation to practice everything in moderation — even moderation — continues to be the hardest advice for us to follow.