Cooking for yourself: You are worth the effort.

I think this is a lesson that all of us could stand a refresher in, myself included, as we continue down the path toward feeding ourselves like competent adults.

Caveat: Feeling that it’s not worth the effort to prepare food for yourself is very different from literally not having the energy or resources to prepare food for yourself. The latter happens sometimes to all of us – either we are just bone-tired and pressed for time and eating is just not going to happen (or can only happen in rudimentary form), or we honestly don’t have the money to buy the food we want or need, or we haven’t yet learned to cook in the way we want or need. I have personally been in every single one of these situations at some point, and none of them are easy. But they are not what I am talking about in this article, just to be clear.

What I am talking about is that feeling, when you genuinely do have time and energy and food and skill on hand, but you manage to talk yourself out of making food that would truly nourish you because somehow cooking “just for yourself” doesn’t feel worthwhile.

Story time: the last time I was in this situation was when my husband started working really late hours, not getting home until like 8pm. Previously, we were eating dinner together around 6:30, and then I would have evening clients afterward. This change in his schedule meant we were each on our own for dinner, and I wouldn’t even see him until 9pm.

For the first couple of weeks, I was THE SADDEST, gazing out the window like fat Cyndi Lauper. I completely abandoned the wifely habit of having dinner ready promptly at 6:30pm (smoke detector blazing) and would just sort of listlessly snack on whatever random food came to hand. Toast, cereal, Cheetos, peanut butter from the jar, whatever.

Many cat selfies were taken during this dark time.

I was a tragic, grazing Camille, and inside a week, I felt horrible. Even more horrible than lonely. I soon connected the horribleness to the lack of eating an actual dinner. (Thank you, nutrition degree! $20,000 well spent.) So I resolved that I would Cook An Actual Dinner, no matter what time my husband came home, and I would even dish up a plate for him like the saddest make-believe tea party of all time.

It took some activation energy, no doubt. I had to convince myself to stop dragging my feet, and remind myself that not feeling like crap truly was worth the effort, that my own well-being (and by extension, I) was truly worth the effort. Through the bad-and-wrong feelings and the internal toddler whining, “I don’t waaaaant to,” I did it once. And it was tasty and made me feel better. So I did it again. And again.

I plated my husband’s part, wrapped it in foil, and put it in the oven to warm, just like I imagine many pre-microwave era housewives had done. He was appreciative, and most importantly, both of us were well-fed instead of coasting on fumes.

So, anecdote concluded. Not feeling like death: it’s worth it!

Let’s talk strategies. For those of you with roommates or significant others who share food, the cook-and-keep-warm strategy I described above can work if you’re already in the routine of cooking most nights. You could also keep it cold instead by putting leftovers in the fridge. Either way, you’re staying out of the temperature DANGER ZONE. (As well as the Kenny Loggins DANGER ZONE.) And for some reason, just feeling like you are cooking for more than one person, even if that person isn’t physically there, can get you over the hump.

For those of you who live or eat alone, things are a little trickier because you don’t have another person depending on you (and therefore motivating you) to cook something every night. When I was single, I solved this problem by cooking once or twice on the weekend and freezing it in portions. I tell you this as someone who is not a fancy cook: it’s not as hard or as fussy as it sounds. I believe in cooking things in a single pot, if at all possible – especially since at the time, I lived in an apartment whose kitchen was a stove, a fridge, a sink, one rickety square foot of counter, and a cart I bought from Canadian Tire.

More importantly, it is worth it if you are sick of eating out or scrounging. I have five or so basic recipes that I learned can be frozen in portions and reheated rather well: Hoppin’ John with rice, Beans Bretonne with arborio rice, beef stew, creamed chicken and mushrooms with mashed potatoes, and chili con carne. These all have the advantage of being one-pot meals (oh, except the mashed potatoes), and several of them make good vegetarian recipes with small adjustments. I make them all in a Dutch oven.

The trick is to remind yourself that having a stash of frozen meals does not mean you are obligated to eat those meals every single day. If you pressure yourself with this, there will be resentment and tears, trust me. These meals are your safety net for when better plans (going out with a friend, getting a roasted chicken from the store, ordering a pizza) don’t materialize. You will never be in a situation where you regret having a few frozen meals stashed away. Eventually, you may even come to prefer your own cooking to ordering pizza. But only if you don’t force it. Intersperse eating them with your scrounging method of choice.

I still cook and freeze something every other week, so I can have a hot, homemade lunch during the weekdays, and for emergency dinners. In fact, I just finished a nice cup of chili, an apple and some rye bread for lunch. It was delicious. Furthermore, I made it spicy enough that my husband won’t go near it. TIGER BLOOD.

(I know Charlie Sheen jokes are very 2011, but I’m old and time moves more rapidly for me.)

Alright, so here’s where I ask for your input, like a good little blogger: I am a pretty uninspired, workmanlike cook, so I’m sure the rest of you have even more brilliant ideas for 1) convincing yourself you are worth the effort, and 2) using amazing Crock-Pot technologies to sink further into lazy debauchery. Go for it in comments.


Health at Every Size is not fat politics.

Despite all the words I have spent on this topic over the past decade or so, there are still a lot of misunderstandings about what Health at Every Size is and what it isn’t.

People often conflate the Health at Every Size philosophy with the fat political movement, assuming they are one and the same thing, while simultaneously framing Health at Every Size as completely opposed to current weight and health science.

In truth, Health at Every Size does intersect with both fat politics and weight science, and yet it is neither of these things. It incorporates parts of both to form a bridge between them.

Health at Every Size developed as something of a response, or corollary, to fat politics. The principles of HAES arise from a foundation of (personal, and maybe political) fat acceptance, while not actually being the fat acceptance movement. It is a different, but attached, thing.

Health at Every Size is also not in complete disagreement with current weight science — or at least, not any more than weight science is in disagreement with itself much of the time. Health at Every Size acknowledges the data of weight science, but interprets its methods and context critically — sometimes agreeing, and sometimes disagreeing with its conclusions. It is a different, but compatible, thing.

I’d like to offer a more in-depth definition of these terms, and describe how they relate to Health at Every Size, starting today with the fat political movement.

1. Fat Politics

Fat politics is sometimes termed “the fat acceptance movement” or “fat liberation,” but it goes by other names as well. The goal of this movement is political and social: to address societal power imbalances affecting fat people, and, hopefully, to restore balance through political actions like agitating for legal protections from size discrimination, and advocating for change in how fat people are treated in settings ranging from the sidewalk to the workplace to local businesses to the doctor’s office.

People who benefit from thin privilege may feel excluded from fat politics and from the social milieu that has developed around the fat acceptance movement. That is because fat politics is primarily a movement for fat people — even though fat stigma affects people of all shapes and sizes, even though a reduction in fat discrimination and inequality is likely to benefit everyone as a side effect, and even though, fundamentally, the problem of fat oppression is not located in the fat body itself, but rather in a hierarchical social order that is pathologically devoted to defining certain people as worthy, and others as garbage to be thrown away.

Because it is intended expressly to help fat people, the fat acceptance movement is one of the few areas of our thin-centric culture that does not prioritize the needs, viewpoints, and feelings of thin or average-sized people. This can make it an uncomfortable place for thin or average-sized people to be, especially if they are not well-versed in these concepts.

That’s okay; as a thin or average-sized person, you can still educate yourself and be supportive of people of all shapes and sizes without needing to access fat politics or fat social spaces for your own personal use. I know it hurts to feel excluded on the basis of your body size, trust me. You will survive.

Within fat politics, sometimes people talk about the science of health and weight, but that science is presented to make a political and moral argument: that fat people are worthy of rights and equal respect, and that negative stereotypes about fat people (most of which centre around health because, in our culture, health is used as a proxy for moral goodness and deservingness of basic human rights) are both inaccurate and morally wrong.

I actually wish that the conversation about health within fat politics would shift more to a social model of disability perspective — which means affirming that people naturally come in a diverse array of different bodies, and rather than labeling some bodies “right” and other bodies “wrong,” and setting up societies to only accommodate “right” bodies, and then seeking to address the resulting inequities by forcing the “wrong” ones to more closely resemble the “right,” it is actually the responsibility of society at large to ensure that all bodies are accommodated, valued, and given equitable access to the human world.

I also wish the conversation would focus more on social determinants of health and less on individual health habits, and also less on stereotype-busting to prove that fat people can be “healthy” by what I think is an exclusionary, unrealistic, and ultimately oppressive definition of “health” — but you can’t always get what you want.

When people discuss health and science within fat politics, you must take those points in context, for what they are: they are tools to serve an ultimately moral, not scientific, argument — that fat people are human beings who belong in the world, and who deserve basic rights, compassion, and dignity. They are not intended, in the context of a political discussion, to be engaged in a search for the ultimate medical and scientific truth about body weight (interesting as that subject is to me), nor are they being used with clinical detachment. This doesn’t make the scientific arguments inherently untrue, but it does mean they are secondary to the moral agenda.

There is always some bias in using scientific evidence to service what is fundamentally a moral argument (as opposed to a political argument that arises from scientific findings.) The truth is, regardless of what the science says about weight and health, the moral argument will always stand: fat people exist, they are in the world, and if human history is any example, they will continue to exist — and therefore, they must be afforded the same rights, access, and dignity that other human beings enjoy. Regardless of their health.

Within fat acceptance, some people do a better job at scientific accuracy than others, and many fat political arguments using weight science have been published in peer-reviewed journals. But when you enter the world of fat social gatherings, Facebook status updates, message board grudge matches, Twitter and personal blogs, you are going to witness wide variation in the accuracy and subjectivity with which science is presented to service the moral argument. Some arguments will be painstakingly accurate. Others will highlight one truth while displacing another to make a larger point. And some will be hopelessly garbled, or oversimplified to the point of uselessness.

There is also going to be heat and defensiveness and loss of temper — because people are not really fighting about whether science shows that fat people can be healthy, they are fighting to be treated as human beings.

That is true for any political movement. Politics are emotional. Politics are important. But they are not science, and they are not exactly Health at Every Size.


This should always be assumed, but I want to make it clear that I am not the official ambassador for Health at Every Size or the fat acceptance movement, and the above is just my viewpoint. I’m sure people involved in either movement might disagree, and that is fine.


On wheat and death.

Several months ago, I happened upon this little review about the connection between wheat (and other grains) and inflammation, which was pretty interesting.

It reports that there are plausible physiological mechanisms linking wheat to inflammation, and that there is some animal and some human evidence available to back them up, but also that population-based studies and human trials have either not shown a significant effect, or haven’t been controlled in such a way as to properly isolate the question of whether wheat and its inflammatory effects have measurable, significant health outcomes in people.

The evidence is suggestive in some respects, but not conclusive by any stretch – meaning that the basic dietary advice given to the general population stands: eat a variety of foods, including whole grains, provided you can tolerate them. If you can’t, you probably already know that by now. If you aren’t sure, go see a doctor (preferably an allergist and/or gastroenterologist) to get assessed, and see a dietitian for appropriate nutrition advice. (And beware of geeks bearing IgG tests.)

What follows is a selection of quotes from the review about the limitations of the research in humans:

  • “It should be noted that whole grains contain phytochemicals, like polyphenols, that can exert anti-inflammatory effects which could possibly offset any potentially pro-inflammatory effects of gluten and lectins [73].”
  • “Most of the intervention studies mentioned above attempted to increase whole grain intake and were using refined grain diets as controls, thereby making it very difficult to draw any conclusions on the independent role of cereal grains in disease and inflammation.”
  • “There are few studies that investigate the influence of a paleolithic type diet comprising lean meat, fruits, vegetables and nuts, and excluding food types, such as dairy, legumes and cereal grains, on health.”
  • “Because these [paleolithic diet] studies are confounded by the presence or absence of other dietary substances and by differences in energy and macronutrient intake, factors that could all affect markers of inflammation, it is difficult to make a concise statement on the impact of cereal grains on these health outcomes.”

The authors call, as most reviews of this nature do, for more research, preferably of the randomized controlled trial variety, or population studies that do a better job of controlling for confounders.

In other words: don’t panic. There’s a whole lot we still don’t know, and no one is taking anyone’s wheat away.

One thing that is missing in this discussion, so far, however, is acknowledgement of the cultural and practical importance of wheat and other grains in our diets. It always concerns me when this is left out, because whether we want to believe it or not, tradition, cultural foodways, and plain old accessibility probably inform the average person’s eating habits to a much greater extent than biochemical considerations of the inflammatory response provoked by selected components of a given staple.

Even though we might not want this to be true, it is true – and even though we might not want this to be important, it is important. We are humans. We are omnivores. We eat lots of different things, and not all of them for reasons of pure biochemistry.

It is difficult to overstate wheat’s importance in feeding the people of the world, both in a biological sense and in a cultural sense. Wheat forms the basis for cultural food staples spanning from bread to noodles to couscous to pastry to beer to gravy to breakfast cereals.

Wheat, a grass that today feeds 35 percent of the earth’s population, appeared as a crop among the world’s first farmers 10,000 years ago. It increased in importance from its initial role as a major food for Mediterranean peoples in the Old World to become the world’s largest cereal crop, feeding more than a billion people in the late twentieth century (Feldman 1976: 121).

-Cambridge World History of Food, Volume 1

Which means you are going to need seriously strong evidence to impugn a food source that supports a huge proportion of our world’s nearly seven billion people. If there really were something nutritionally catastrophic about wheat, it would be a major concern – but, again, extraordinary claims require extraordinary proof. And this paper is not it.

Aside from its pure biological importance, the cultivation of wheat also marks a technological milestone in human evolution —

…with the domestication of wheat, humankind began the shift from hunting and gathering food to producing it. This change in lifestyle set humans on a new evolutionary course, and their society and environment were never the same after farming was established.


— making it not only an extraordinarily important food source, but an extraordinarily symbolic food. Wheat is one of the pivotal crops of modernity. Though its first cultivation well predates the modern era, it set us on the path that led to the industrialized food production systems many of us rely on today, for better or worse.

Wheat and similar grains also require more intensive processing to be edible than many other foods: fruits and vegetables can often be eaten whole and raw; dairy is frequently processed for safety (pasteurization) or preservation (cheese) but can still be consumed raw; and even meat and fish, at their simplest, require only killing, butchering, and cooking (and sometimes not even cooking.)

From an early time, wheat was harvested, milled into flour, stripped of various parts of its grain, and further combined with other ingredients, then boiled or baked to produce edible products. Or the grain was fermented and/or distilled for alcoholic beverages. In modern industrialized food systems, wheat and other grains provide the basis for many highly-processed, and profitable, food products that are shelf-stable, very palatable, and very cheap.

I don’t think this fact of processing is lost on people, even people who don’t routinely think much about where their food comes from. Humans are masters of symbolic thinking, and I believe there is some level of awareness that wheat and other grains are subject to high levels of processing and refining which, depending on how you view those activities on a social and moral level, imbues the food product itself with either a sense of purity and goodness, or contamination and risk.

I regularly speak with people who are concerned about “nutrients being stripped from, and then sprayed back on” wheat products like enriched flour. There is a sense that all the good things have been taken away through refining, and suspect, man-made substitutes sneaked back in to fool everyone into thinking the food is nutritious. And yet, it is this same enriched flour that has significantly reduced the incidence of vitamin deficiencies and neural tube defects in the decades since its implementation in Canada and the U.S.

Humans are naturally both curious about and suspicious of their food, in a sort of Hegelian dialectic referred to by food scholars (namely, Claude Fischler and Paul Rozin) as “the omnivore’s paradox.” We express anxiety about this paradox in a variety of different ways, including, in my opinion, through popular food fads – both positive fads, where some food is (usually temporarily, and often misleadingly) awarded the health halo and exalted to superfood status, and negative fads, where a former superfood or a perfectly neutral-seeming dietary staple is blamed for all human misery and misfortune.

I believe this is happening currently with several foods, the most notable, to me, being wheat. I believe this is due to several factors: our natural suspicion about food and its potential contamination or toxicity, combined with an increasing cultural discomfort with the products of modernity which has focused largely on industrialized food production and its discontents, as well as a growing awareness of Celiac disease, food allergy and intolerance, and non-Celiac gluten sensitivity.

Add to this our frustration with the fact that, despite all our technological advances, there still exist medical conditions that defy treatment or explanation, our ever-present fear of death, our idealization-bordering-on-worship of perfect biomedical health, our vulnerability to placebo and nocebo effects, and a soupçon of trendiness derived from evolutionary nutrition theories, and you get a heady cocktail intoxicating enough to produce a damn-near religious conversion.

If wheat doesn’t work for you, for whatever reason, you don’t have to eat it. You can find ways to live well without it, though it will take some effort and some money.

But if you’re looking for something to believe in, something to resolve existential angst, the discomfort of ambiguity and not-knowing, and the fear of your own mortality, avoidance of wheat is probably not going to do you much good.


Why diets don’t work.

Most diets seem to succeed in the short-term, and fail in the long-term. This is not a new, or even particularly controversial, observation among researchers:

“There are two indisputable facts regarding dietary treatment of obesity. The first is that virtually all programs appear to be able to demonstrate moderate success in promoting at least some short-term weight loss. The second is that there is virtually no evidence that clinically significant weight loss can be maintained over the long-term by the vast majority of people.”

-Confronting the failure of behavioral and dietary treatments for obesity, Garner & Wooley, 1991

“Although weight loss can usually be achieved through dietary restriction and/or increased physical activity, the overwhelming majority of people regain the weight that they have lost over the long-term.”

-The Defence of Body Weight: A physiological basis for weight regain after weight loss, Sumithran & Proietto, 2013

“Of course, we can all endorse the call for a healthier lifestyle, but we must be realistic about what it can and cannot accomplish – including that it is not by itself an effective approach to long-term obesity treatment.”

-An Inconvenient Truth about Obesity, Schwartz, 2012

More in-depth analysis of the failure rate of dieting can wait for another post. The question I’m asking here is, if diets fail for some proportion of people, which they indisputably do, why is that? What is the reason? What are the specific mechanisms at work?

The usual assumption among non-researchers about why diets fail is that when a dieter regains weight, it must be because they stopped dieting, which is in turn attributed to things like not having enough willpower, personal and moral failure, gluttony and laziness, or being too ignorant to know better.

These are assumptions which reflect the mythology of our culture: that anyone, if they try hard enough, can be anything they want — and therefore that weight is entirely a choice, a product of effort and moral character. This story centres the individual, their behaviour, their character traits, and their moral attributes as the cause of fatness in the first place, and the reason why weight is regained following a diet.

But these explanations are not satisfactory to me, nor, as you will see, are they reflected in the scientific literature.

To explore other answers, I haphazardly gathered peer-reviewed articles, spanning a range of more than 30 years, that investigated or discussed the various reasons why weight loss produced by dieting is not maintained long-term.

Here is what they theorize about why diets fail.

1. Behavioural relapse, a.k.a. “going off the diet”

The earlier papers on the failure of dieting focused on behavioural factors, since dieting was, at the time, a relatively new and exciting behavioural intervention for “obesity.” (By the mid-20th century, dieting as a popular pastime was not new, but as a subject of medical research, it was still fairly novel.) Researchers assumed that when someone could not sustain weight loss, it must’ve been due to a breakdown in their new behaviours — people must have gone back to eating more and moving less, just as is popularly assumed.

However, the researchers tended not to lean so heavily on moral explanations for this relapse. One study suggested that the fault lay with lack of scholarly attention to the maintenance phase of behavioural change in designing weight loss plans. This was further complicated by the fact that no one can avoid eating entirely, which makes dieting quite different from other behavioural interventions like smoking cessation programs and abstinence from alcohol.

Alongside this, researchers proposed cultural and commercial pressures to eat, especially calorie-rich and highly palatable foods. There also appeared to be few natural rewards provided by dieting once the intervention phase ended — apparently nothing, not even thinness, feels as good as food tastes.

The researchers were not very optimistic about the usefulness of dieting if it only resulted in regaining weight. An illuminating quote from the conclusion of one paper:

“Research on humans suggests that the deleterious effects of obesity are exerted primarily during periods of weight gain…Its medical consequences may be unfortunate enough that if people cannot maintain weight loss, they would be better off not trying to lose weight!”

-Behavior Modification in the Treatment of Obesity: The problem of maintaining weight loss, Stunkard & Penick, 1979

Another paper suggested that culprits for the breakdown of dieting behaviours were negative moods, emotional stress, and social pressures to eat more, as well as feelings of intense hunger that prompted overeating. But an interesting quote from this same article hints at more than purely behavioural factors:

“The obvious reason for weight regain after weight loss treatment is that participants return to inappropriate eating and exercise habits. These habits need not be as bad as pretreatment habits to cause regain, because metabolic factors may make it easier to regain after a period of dietary restriction…The pattern of relapse and regain appears to be the result of a war between the will and physiologic demands over which self-control appears relatively powerless.”

-Why Treatments for Obesity Don’t Last, Goodrick & Foreyt, 1991

So even in cases where behavioural relapse was implicated, researchers seemed to acknowledge that other factors contributed to that relapse (like stress, biological and cultural pressures to eat, and increased hunger), or to the weight regain itself (metabolic changes.)

2. Lowered energy expenditure

Reduced calorie intake and weight loss, it turns out, cause some interesting changes to the body that result in expending fewer calories. In animal studies, changes include decreased body temperature, less spontaneous activity, and lowered resting metabolic rate (the amount of energy the body uses while at rest.)

Reduced total energy expenditure and, possibly, lowered resting metabolic rate after diet-induced weight loss have also been observed in humans. (Conversely, humans who gain weight above their baseline weight through eating have been observed to have an increased resting metabolic rate.)

A person who gains weight would be expected to expend more energy just due to their increased body mass, thus requiring more energy to physically move and biologically maintain it. The same, but in reverse, is true for someone who loses weight – less energy is required to maintain a smaller body.

But the changes in energy expenditure resulting from dieting have been described as “disproportionate,” meaning that they were greater than the changes expected for the amount of weight gain or loss, indicating that some compensatory mechanism meant to restore preferred weight may exist.

In other words, a person who lost weight to reach 150 lbs. may expend fewer calories just existing than someone who has always weighed 150 lbs. And someone who purposely gained weight to reach 150 lbs. may use more calories to maintain their weight than the person who has always weighed 150 lbs.

However, other studies of weight loss in humans have not demonstrated the effect of lowered resting metabolic rate, which leaves the question open.

A nod to weight diversity from the last study linked:

“Body weight in adults is remarkably stable for long periods of time. In the Framingham Study the body weight of the average adult increased by only 10 percent over a 20-year period. Such a fine balance is evidence of the presence of regulatory systems for body weight. Whatever the mechanism (or mechanisms), the weight at which regulation occurs differs from one person to another, and these differences are almost certainly due in part to genetic and developmental influences.”

-Changes in Energy Expenditure Resulting from Altered Body Weight, Leibel, Rosenbaum, and Hirsch, 1995

3. Fat storage and insulin sensitivity

Another physiological change produced by weight loss is increased insulin sensitivity. This is generally considered a good thing, but it may also leave people vulnerable to weight regain. We may need to go back to a little high school biology to cover this one adequately.

Insulin is a hormone that the pancreas releases into your bloodstream. Insulin’s main life goal is to act like a key that allows glucose, also flowing through your bloodstream, into your cells, which then use the glucose for energy.

When a person’s cells become resistant to insulin, the glucose can’t get into the cells — it then builds up in the blood, eventually causing high blood sugar. Meanwhile, the cells switch to using fat for fuel.

With weight loss, cells become more sensitive to insulin, which allows glucose to enter the cell once more. Those cells use that glucose, and the fat that would otherwise be used for energy is directed back into storage, which may spell weight gain.

Experimental research in humans has indeed demonstrated that increased insulin sensitivity following weight loss from dieting predicts the amount of weight the person will eventually regain. The researchers are careful to point out that increased insulin sensitivity, alone, is not enough to cause weight regain, but in combination with lowered energy expenditure (see above) and increased food intake (see below), it certainly helps.

From this same paper:

“Following weight reduction, there is a 95% failure rate for obese individuals to stay weight-reduced more than 4 years (5). After obese subjects undergo weight reduction, metabolism shifts to favor weight regain…These metabolic phenomena result in the shunting of lipid fuels away from oxidation in muscle to storage in adipose tissue, and in the setting of positive energy balance, increases in body weight and percent body fat occur.”

-Weight Regain Following Sustained Weight Reduction is Predicted by Relative Insulin Sensitivity, Yost, Jensen, and Eckel, 1995

4. Increased appetite

During and after weight loss, levels of several hormones involved in appetite regulation change significantly.

Hormones that promote feelings of fullness and inhibit food intake (including leptin, peptide YY, GLP-1, cholecystokinin, and amylin) are decreased with weight loss. Meanwhile, ghrelin, a hormone that stimulates hunger, is increased, along with reported food preoccupation and appetite.

Again, these responses may indicate the existence of a regulatory mechanism intended to restore preferred body weight:

“Taken together, these findings indicate that in obese persons who have lost weight, multiple compensatory mechanisms [encourage] weight gain…Furthermore, the activation of this coordinated response in people who remain obese after weight loss supports the view that there is an elevated body-weight set point in obese persons and that efforts to reduce weight below this point are vigorously resisted.”

-Long-Term Persistence of Hormonal Adaptations to Weight Loss, Sumithran et al., 2011

In addition to feeling hungrier, weight-reduced people show a stronger preference for high-calorie (high sugar and high fat) foods. There are also changes in brain activity patterns indicating that weight-reduced people are more responsive to food rewards, while brain areas associated with controlling one’s food intake are less active.

The hypothalamus, a part of the brain that may act as a “brake” on the homeostatic tendency toward weight gain, shows decreased activity in people who have lost weight, shifting both food-seeking behaviour and metabolism to favour eating more and regaining weight.

5. Genetic predisposition to gain weight

It has long been understood that body weight has a significant genetic component.

Research in pairs of identical twins shows that there is also a significant genetic component to weight loss, including how much and what type of fat is lost, and the rate of fat burning relative to use of glucose for energy.

On the other side of the coin, population studies of twins have shown an association between dieting attempts and subsequent weight gain, which probably reflects a pre-existing tendency to gain weight that is powerful enough to counteract weight loss attempts.

From that study:

“The poor success in weight maintenance after dieting predisposes individuals to the vicious cycle of frequent dieting attempts and weight regain. The relation between weight cycling and subsequent weight gain is well described in the literature. Part of the weight gain occurring in young adults may be regarded as physiologic, and is likely to occur independently of attempts to lose weight.”

-Weight-loss attempts and risk of major weight gain: a prospective study in Finnish adults, Korkeila et al., 1999

Another study using twin data indicates that some of the weight gain may also be due to dieting itself, independent of genetics.


As you can see, moral explanations for weight regain leave a lot to be desired. They reflect lazy thinking. A person’s drive to eat, combined with their tendency to regain lost weight, is clearly more dependent on physiology than on moral corruption, or even simple ignorance.

Biology drives behaviour. It also primes the body to most efficiently exploit that behaviour. What is often interpreted as weakness of will and greediness by our culture is actually the result of a complex orchestration of genetic, homeostatic, metabolic, hormonal, and neurological processes influencing us to eat, restore lost weight, and ultimately survive.

And a final quote:

“…metabolic conditions after weight loss may not be the same as they were prior to gaining the weight in the first place. Instead of working in our favor to prevent weight gain, biology becomes one of the driving pressures that underlie weight regain.”

-Biology’s response to dieting: the impetus for weight regain, MacLean et al., 2011

If you’ve ever regained weight after a diet, you are in very good company. Most dieters regain the weight. You are not lazy, stupid, or greedy. You did not fail — on the contrary, your body worked hard to save you.

Posted in Diets | Comments closed

Real food.

“Real food” is a term I dislike almost as much as “real women,” and for many of the same reasons.

On occasion, I run into this idea coupled with the concept of intuitive eating. People will proclaim how much they believe in permission and fulfilling your hunger and eating whatever you want (so far, so good)…but with one small caveat (uh-oh.) Permission and eating as much as you’re hungry for and eating what you like are, apparently, only legitimate if the food being eaten meets some mysterious criteria that imbues it with that holiest of all holy contemporary food values, the coveted title of “real food.”

For some people, real food means “food I make entirely at home from scratch [for varying values of ‘from scratch.’]” For some, it means “mainly plant-based foods with a smattering of dairy and animal protein.” For others, it means “entirely raw foods that have not been cooked.” And for yet others, it might mean anything from “a vegetarian diet” to “mostly meat and certain vegetables and no grains” to “a vegan diet composed entirely of homemade food” to “I grow everything I eat on my own land, including grains which I mill into flour myself and then deep-fry unrepentantly.”

There is a lot of wiggle-room in this term.

Before I go further, it is important for me to make it crystal clear that for people who choose to eat in one of these ways, I say good for you. I sincerely hope you enjoy it and feel great. Rock on. I am all for people making very personal choices about what foods they eat and don’t eat. I think the above are all decent options, but most importantly, it doesn’t matter what I think, because your body belongs to you. Personal autonomy around food is the driving force behind this entire website.

The problem is that I’ve met very few people who make personal choices of the “real food” persuasion without also pressuring those around them…without also proclaiming that the foods most people rely on to survive are inherently inferior…without also implying that the reason the rest of us are fat, or poor, or don’t have shiny hair, or don’t walk around perpetually bathed in magical sunbeams of happiness, is entirely because we eat the terrible, horrible, no-good, very bad food — the food that is not Real.

(Those who can make such choices without also being rude about other people’s food choices often comment here, or hang out with me on Twitter, and to all of you I give my unalloyed thanks.)

That large caveat disposed of, I will now proceed to my central argument, which is this:

All foods, like all women, are real.

No, this does not mean that all foods are nutritionally equivalent, or that all foods are good for all people in all situations, but it does mean that choices around food must be individual, that all food choices can be valid, depending on the person and the circumstances, and that universal pronouncements on a food’s relative realness are not helpful or, well…real.

“Real food” is not a real thing. Because what constitutes food is too many things.

There is no One True Way to Eat. Most people tend to accept this as a generality, and express mild agreement through such aphorisms as “Do what works for you,” “Your mileage may vary,” etc. But I’m afraid general, mild agreement with the idea that different people are different does not quite do justice to the reality. Thankfully, I can provide you with a little glimpse into that reality.

The reality is, even foods we tend to recognize as universally wholesome and healthy are not actually appropriate for everyone. Bodies differ, and circumstances differ too. For example, that universally beloved superfood, dark leafy greens, is considered a food to avoid (along with a bunch of other “healthy” foods like whole grains, legumes, and many fruits and vegetables) for people with kidney disease who require a low-potassium diet.

Eating more sodium, not less, can actually be critical for people who experience hypotension. When I was working in the hospital, we had to stop purchasing a popular brand of bouillon for exactly this reason: the company lowered the sodium in their product in an attempt to offer consumers a healthier option. Well, it wasn’t healthier for our patients on tube feeds, some of whom required a sodium boost between feedings — in fact, it was quite dangerous.

And while you may be tempted to write off hospitalized patients as the exception to the rule, they are consumers too, and there are far more people with serious medical conditions in the world than our culture allows us to be aware of.

Some of them are kept out of sight and out of mind in hospitals (except to those of us who work there), but many more are living their lives and buying their food right alongside us. I wish we could all be a little more aware of that, and I wish food companies and public health nutrition education campaigns alike would take these very real and present needs into account, rather than continually and exclusively prioritizing the speculative health needs of the generally well.

Right this minute, there is someone going through chemotherapy shopping at your grocery store, buying popsicles and ice cream to help their sore mouth, and worrying what the cashier is going to think.

There is someone on hemodialysis buying white bread instead of whole wheat, trying to keep their phosphorus levels reasonable between appointments and hoping for the best.

There is a person attending intensive outpatient treatment for their eating disorder who has been challenged by their therapist to buy a Frappuccino.

There are dietitians picking up a dozen different candy bars to eat alongside clients who feel ashamed and guilty about enjoying them.

There is someone who just doesn’t have it in them to cook right now, and this frozen pizza and canned soup will keep them going.

There are people recovering from chronic dieting and semi-starvation who are buying chocolate and chips at their deprived body’s insistence.

All around us are people listening to what their bodies need and attempting to make the best possible choice within a context of overwhelming food pressure. All of their choices are valid, and every single one of these foods is “real.”

It is not a coincidence that the foods popularly imbued with “realness” map so cleanly onto class-related ideas of healthy, high-status food. Yes, they may be nourishing and wonderful, but these foods also tend to require more resources to acquire or prepare.

Those resources might be financial, in the case of going out to eat at a splendid restaurant, or they might be temporal and energetic, in the case of high-quality raw ingredients that require careful shopping, preparation, and cooking. They might even be educational, in the case of culturally novel foods requiring that you learn of their existence in the first place, and then have the knowledge and skill to render them into something edible to you. Resources can also be emotional and psychological, in the form of having a good relationship with food and being lucky enough not to feel either overly compliant with, or stubbornly rebellious against, cultural messages telling you what and how to eat.

None of it comes cheap.

It is wonderful if you have these resources and inclinations, and if the resulting food choices work well with your unique needs, but it is also lucky. Which means you should appreciate your good fortune enough not to go around spoiling other people’s food choices by insinuating that only yours are real.

If food is keeping someone, somewhere alive, then it is real enough.


P.S. Someone brought up an important point that I want to include, which is: Buying and eating food simply because you like it is just as valid as any other reason. I only highlight people with clinical nutrition needs here because they are so often overlooked.

Posted in eating | Tagged | Comments closed