Decision Time

Shortly after midnight on Saturday, April 14, 2018, I had a dream in which I was at the staging area for that morning’s Newport marathon, but I had not yet checked in for the race or stored my belongings even though it was 7:13 AM, just 17 minutes shy of race time. Frantic, I was trying to figure out how I was going to take care of these logistical to-do items and get to the starting line on schedule. Then I woke up.

Approximately seven hours later, I was at the staging area down in Newport for the actual marathon. The truck containing the mobile locker I had rented in advance was mysteriously not there yet. Confused and anxious, I wondered what I was going to do with all of the gear I had planned on locking up, including my wallet, keys, phone, clothes, and post-race snacks. Standing there feeling somewhat paralyzed by uncertainty, I took out my phone and checked the time. It was 7:15 AM. (Premonitions allow for a two-minute margin of error, no?)

Midnight clairvoyance and the subsequent inauspicious sunrise set the tone for the rest of my day. Eventually, unable to wait any longer for the mobile lockers to arrive, I got in line for gear check, an unsecured area where runners can leave their belongings. Good thing I did not hold out for the lockers, as I found out later that the driver overslept and did not arrive with them until well after my race began.

As I was standing in a long line made up mainly of runners competing in the 5K and half marathon events that were commencing later in the morning, I heard the national anthem and then saw my fellow marathoners starting down the road. After several minutes, I got the attention of a volunteer and stammered, “I don’t know what I’m doing and my race just started without me.” He told me to drop my bag, that he would take care of it, and from there I hurried to the course and crossed the starting line well after the rest of the field.

I prepare meticulously for race-day logistics, but this time my plans went out the window amid the chaos that followed the mobile lockers’ absence. About a mile down the road, I realized I had accidentally left two of my three anti-nausea medicines in the bag I checked with the volunteer. Such a mistake was quite concerning, as nausea tends to be my limiting factor in marathons, even more than muscle soreness or general fatigue.

Not having my medication only compounded problems that began with a poor training cycle due to a herniated disc in my lower back, an abdominal hernia for which surgery was scheduled six days after the marathon, and a couple of other medical hindrances. Things were not looking good already, and yet they got worse.

Quickly, my fellow runners and I discovered that hydration was going to be a problem. Unlike most marathons that offer both water and sports drinks regularly along the course, most of the beverage stations on this course featured only water. Moreover, the cups were maybe a quarter full. Subtract from that the fluid that splashed out during the drinking process, and the net amount that made it down my throat was not nearly enough to keep me hydrated. The stations that did offer a beverage other than water had a low-calorie electrolyte drink, woefully insufficient to replenish the carbohydrates expended during such an endeavor.

Despite these challenges, I was inexplicably on pace for my all-time best marathon through mile 18, but by then things were getting ridiculous. We had not had a beverage station since mile 13, no electrolyte drink since probably mile 11, and the course was in the midst of a miles-long uphill stretch that felt more challenging to me than Boston’s Heartbreak Hill ever has.

The nausea, which had been building slowly, was pronounced enough that I felt the time was right to use the one anti-nausea medication that I remembered to bring out on the course with me. In keeping with the theme of the day, the pills promptly fell out of my Ziploc bag onto the road. The quiet tick of the medication hitting the pavement was likely inaudible to anybody else, but to me it was the thunder of my last hope for a great marathon finish crashing down.

Limited by nausea and dehydration-induced muscle cramping, my pace slowed significantly over the final miles. Around mile 25, a blister that I did not know I had burst on the bottom of my right foot, altering my gait and slowing me even further. Hobbled, I kept running and crossed the finish line limping.

Somehow, out of the day’s nonsense sprang my fastest marathon time in 15 years, but this is less a story of resolve and more a tale of someone struggling in real time to weigh the pros and cons of disregarding or honoring my body’s signals, which in this case were clearly telling me to drop out of the race.

The course was essentially a figure eight with the start, midpoint, and finish all at the center. If I was going to call it a day early, hitting the eject button at the midpoint made the most sense, so I took stock of the situation as I neared the 13.1-mile mark. Inadequate fluids, dehydration and cramping that were already setting in, insufficient medication, and memories of my 2004 Boston marathon – which ended with an ambulance ride to the emergency room – all suggested that dropping out was the sensible and safest play.

On the other hand, my speed was inexplicably fast up to that point and I did not want to take for granted that I would ever have a shot at a marathon personal best again. While I reserve the right to change my mind, I went into this race figuring it was probably my last marathon. Although I enjoy the training and racing, impending parenthood had me looking at the situation from a different perspective. Long training runs take a lot out of me, so much so that I am pretty much useless the rest of the day, and I do not think it is fair to put our daughter in a position where daddy cannot play, or go to the playground, or go for a walk, or do pretty much anything at all because he ran far and needs to rest.

Even if I do decide to train for another marathon someday, who knows, I could wake up sick on race day, or sprain my ankle on a Baby Einstein guitar while heading out the door to the starting line, or suffer any item on a tremendously long list of afflictions or mishaps beyond my control that could throw the whole endeavor out the window at any point in my training cycle or at the very last instant.

As I neared the half, I was cognizant of the reality that being 13 miles into a marathon with a chance for a personal best might never happen again. For as much had gone wrong, a lot had also gone right to allow me to be in such a position. Having weighed the pros and cons, I decided to continue on with the race despite all of the reasons to stop.

Disregarding my body’s cues eventually caught up with me. A few minutes after I crossed the finish line, the nausea worsened, I was shivering (a symptom of dehydration) despite the warm temperatures, and my breathing was abnormally rapid. Lying face up in the sun while wearing a hooded sweatshirt and winter jacket did not help. With my condition deteriorating, I made my way to the medical tent.

The paramedics took my blood pressure, which was sky high compared to my baseline, and I was having trouble answering their questions. While I have a history of occasionally feeling miserable after long runs, this was worse than my norm. The scariest part to me was that I was aware of my incoherence, yet I could not do anything about it. They asked me what medications I take, but I could not put together an articulate response. In my mind, I was like, “Come on, dude, you know what meds you take, just tell them,” but I was incapable of getting the words out.

The paramedics wrapped me in blankets, put me in the back of an ambulance, and cranked the heat to warm me. They gave me oxygen and placed leads to monitor my pulse, heart rhythm, and oxygen saturation. After two hours of lying on the gurney getting rehydrated and warmed, we agreed that I was well enough – albeit still far from 100% – to leave the ambulance and make my way back to my car.

Stepping out of the ambulance, I was startled to discover that the finish area was virtually deserted, as the spectators, volunteers, race organizers, and my fellow runners had pretty much all gone home. Watching the few remaining workers disassemble the food tent and the final handful of artifacts from a post-race party that had presumably been so happy and festive just a short time earlier, I felt an eerie and unsettling sensation: loneliness.

Later that evening, Joanne commented over dinner that I looked sad. She was right. Ending up in an ambulance with a health scare is no way to conclude an event. Finishing a marathon normally yields a significant sense of accomplishment, but this time I felt conflicted and somewhat hollow. Even though completing the course was a triumph of sorts, I had mixed feelings for having put myself in unnecessary jeopardy.

Like I tell my patients who are working on listening to and honoring their internal cues: assessing hunger and fullness levels, sorting through matching criteria, checking for humming and beckoning, and utilizing other intuitive eating tools are never meant to be leading questions, and there are no such things as absolute right answers. Decisions made regarding what, when, and how much to eat matter much less than having utilized a thoughtful process to reach them.

Similarly, having considered all of the pros and cons of the options available to me at the moment I had to make a choice, I feel like continuing to run was the best course of action for me despite my body’s cues suggesting that I stop. Ultimately, I am glad I finished the race even if I did pay a price for my decision.

Just after crossing the finish line. Am I having fun or what?

Intuitive Eating: An Introduction

This article originally appeared as a guest piece in the Progress Wellness newsletter.

What the heck is intuitive eating? We often hear the term, but what does it mean, how can it help us, what are its common misconceptions, and how can we begin to put it into practice?

First, some context: In our society, we are often taught that we cannot trust our bodies and that we need something external from ourselves to guide our eating. Hence, we have calorie counting, tracking apps, points systems, lists of foods to eat and those to avoid, meal plans, and other tools that tell us what, when, and how much to eat.

Intuitive eating, on the other hand, is a system based on the reality that contrary to popular belief, we can actually trust our bodies to guide our eating decisions. Internal signals give us information regarding our hunger and fullness, what foods will hit the spot at any given eating occasion, and how much of those foods we need to feel satisfied. Think of how much better water tastes when we are thirsty versus when we are already well hydrated, for example. Someone with anemia might not know that red meat is high in iron; they just know that a hamburger sounds mighty fine.

In contrast to external tools, intuitive eating tends to be a more peaceful and satisfying way of making decisions regarding what, when, and how much to eat. Not only that, but clinical trials have also found that intuitive eating is associated with improvements in physiological measures (blood pressure, blood lipids), health behaviors (eating and physical activity habits, dietary quality), and psychosocial outcomes (body image, self-esteem).

Whereas diet culture has rules and judgment, intuitive eating offers guidelines and flexibility, and it encourages neutral curiosity when events do not transpire as one would hope. Some people turn intuitive eating into the “hunger and fullness diet” by believing that they must eat when they reach a certain level of hunger and must stop when a certain level of fullness is attained, but such action is an oversimplification and misuse of the skills. If someone practicing intuitive eating ends up overly full, rather than beating themselves up for it and judging themselves as bad or undisciplined, they will just explore what happened to see if perhaps next time they might want to make a different decision.

Some people use intuitive eating as a weight loss tool, but doing so is a mistake. While some individuals will lose weight when they eat intuitively, many will not. By focusing on weight loss, people are likely going to end up disappointed and also stunt their development as intuitive eaters.

We are born intuitive eaters, and internal eating cues still reside in virtually all of us. Even if we fear our signals are gone, more likely they are simply buried by years of disuse, and we can uncover them and put them to use once again.

As a first step, when you are considering eating, take a moment to ask yourself, “How hungry am I right now?” You can imagine hunger and fullness existing on a linear continuum with extreme hunger at one end and extreme fullness at the opposite end. Ask yourself where on that continuum you are. Keep in mind that this is never to be a leading question, and your answer has nothing to do with permission to eat. You are simply gathering data and trying to notice the signals that your body gives you.

As a second step, if you have decided you are going to eat, rather than jumping to immediately see what your options are, take a moment to first look inward. Ask yourself if a particular flavor (sweet, salty, spicy, etc.) would hit the spot. Similarly, consider temperature (hot, frozen, chilled, room temperature, etc.), texture (crunchy, smooth, liquid, etc.), and even color. You might not have answers for all of these questions, but even knowing one of them (temperature tends to be easiest for most people to discern) can give you some direction. With your answer(s) in mind, now survey your choices, whether on a restaurant menu or in your own pantry or refrigerator, and try choosing the food that most matches your identified criteria.

Most people who are looking to become intuitive eaters need more help than can be found in a blog. Consider seeking the help of a registered dietitian who specializes in intuitive eating, and remember to be patient, as it can often take six months to a year, or even longer, of work and practice before your intuitive eating skills once again take their natural place as your default decision-making tools.

“Sometimes I want to binge so bad.”

A guy two months removed from spinal fusion surgery has no business moving a 45-pound plate. For that reason, in the late spring of 2014, I introduced myself to a new personal trainer at my gym and asked him to please put away the plate that another member had left on a machine so that I could use the equipment.

Typically, I shy away from new trainers, who tend to pitch themselves to virtually every member they meet in an effort to build their client rosters. As a former trainer myself, I get it, but I also do not like being pressured. This trainer was different though, and once I saw that he was not going to push me for a sale, I began talking with him on a regular basis. That hey-can-you-please-put-this-weight-away interaction turned out to mark the beginning of what has evolved into a friendship of sorts.

In the five years since, we have chatted about superficial matters, such as the rise and fall of the Celtics, as well as issues of more substance, like marriage and fatherhood. Despite the connection we have developed and my opinion that he is generally an excellent trainer, I have never referred my patients to him because of one factor that makes it ethically impossible for me to do so: He unintentionally encourages disordered eating.

Food and eating behaviors are common topics of conversation during his training sessions. Calories, cheat days, tracking apps, Halo Top, junk food, clean eating, intermittent fasting, and willpower are just some of the buzz words and trendy features of diet culture that I frequently hear him and his clients discuss.

My patients and I sometimes talk about these topics too, but the substance of our conversations is entirely different. Whereas I work towards dismantling diet culture and helping my patients understand the harm that comes from relating to food in such a way, this trainer sees these as positives. He tracks his calories, fasts, and weighs himself regularly, and he cites his own weight loss from the past year as evidence that his behaviors are the secrets to success that his clients should replicate.

Last week, one of his clients texted him to say he was going to be a half hour late. With an unexpected chunk of free time on his hands, the trainer came over and struck up a conversation with me while I was stretching. “Do you help people lose weight?” he asked. No, I do not, and I gave him my elevator speech explanation as to why.

His response somewhat surprised me. He told me how difficult weight loss was for him, how exhausting it is to track everything he eats, and how he just cannot keep up the behaviors. “Sometimes I want to binge so bad,” he conceded. The restriction is unsustainable: he regains the 15 pounds he lost, resolves to become lean again, reengages in his previous diet behaviors, loses 15 pounds once more, and the cycle repeats.

In the last five years, I have overheard literally hundreds of conversations he has had with his clients regarding nutrition, many of which have referenced his own eating behaviors, but never have I witnessed him disclose his struggles and concerns as he did last week when none of his clients were around to hear about them.

So, I told him about the Ancel Keys starvation study and how binge behaviors were commonplace among the subjects once the dietary restrictions placed upon them were lifted. In their excellent book, Beyond a Shadow of a Diet, Judith Matz and Ellen Frankel explain the following:

“What these men [the study’s subjects] experienced as a result of their semi-starvation is typical of feelings and behaviors exhibited by dieters. When the men entered the refeeding portion of the study, the food restrictions were lifted. Free to eat what they wanted, the men engaged in binge eating for weeks yet continued to feel ravenous. They overate frequently, sometimes to the point of becoming ill, yet they continued to feel intense hunger. The men quickly regained the lost weight as fat. Most of the subjects lost the muscle tone they enjoyed before the experiment began, and some of the men added more pounds than their pre-diet weight. Only after weight was restored did the men’s energy and emotional stability return.”

Modern day dieting, I pointed out to the trainer, is really just self-imposed starvation, and it is completely understandable that dieters respond just like the study’s subjects. It is not a matter of willpower, but rather one of biological mechanisms, honed through evolution, that resist weight loss and encourage weight gain in order to help our species survive famines and other times of food scarcity.

Soon enough, our day’s conversation came to a close. He had to get ready to train his client, and it was time for me to head home and prepare for my own day’s work. Just before we went our separate ways, he told me that his clients have no idea how hard it is for him to try to maintain his eating behaviors, and we agreed that we never really know what someone else is dealing with behind the scenes.

Our parting sentiment is also the key takeaway from this blog. Said differently, consider the words of one of our most experienced and knowledgeable colleagues, Dr. Deb Burgard, who once said, “In almost 40 years of treating eating issues, I have found that when someone sits down across from me, I have no idea what they are going to tell me they are doing with food.”

In this trainer’s case, while many of his clients see him as a role model and look to him for nutrition advice, they do not realize that he is struggling and that the behaviors they seek to emulate are actually signs of disordered eating.

Macy’s

This month, Macy’s found themselves in hot water for selling plates, made by Pourtions, that many people criticized for encouraging eating disorders and fat shaming.

One of the plates, for example, features three concentric circles, the smallest of which is labeled “skinny jeans,” while the middle one reads “favorite jeans,” and on the largest of the three circles is emblazoned “mom jeans,” insinuating that the bigger the portion, the larger the pants size.

According to Huffington Post, Mary Cassidy, Pourtions’ president, explained, “Pourtions is intended to support healthy eating and drinking. Everyone who has appreciated Pourtions knows that it can be tough sometimes to be as mindful and moderate in our eating and drinking as we’d like, but that a gentle reminder can make a big difference. That was all we ever meant to encourage.”

Her company’s intentions do matter, for if they had set out to cause harm, this would be a very different story, but the impact remains the same whether their actions were malicious or an attempt at humor that missed the mark.

“These expectations can actually kill someone, and I know someone it has,” read a tweet from one responder, who elaborated that the plates spread a “toxic message, promoting even greater women beauty standards and dangerous health habits.”

Eating disorders are serious business. They can wreak havoc on one’s health, family, career, and life in general. And yes, they can be fatal. Additionally, they are more common than many people realize.

“As we all know, pressure to be thin leads to dieting, which can lead to a variety of problems, including eating disorders,” I wrote in the April 2016 issue of Boston Baseball. “These life-threatening illnesses are so common in Massachusetts that if the crowd at a sold-out Fenway Park represented a random sample of the state’s population, those in attendance with a diagnosed eating disorder would fill section 41,” which is a large section in the bleachers behind the Red Sox bullpen.

One does not even have to have a diagnosed eating disorder to be suffering the effects of diet culture and weight stigma. We see plenty of disordered eating that does not meet the diagnostic criteria for a specific eating disorder but can be just as disruptive and dangerous, often involving a constellation of symptoms, such as a strong good/bad food dichotomy or feelings of guilt and virtue associated with eating behaviors.

When we work with people recovering from eating disorders and disordered eating, we help them to uncouple judgment from their eating behaviors, and part of this work entails exploring where they learned such judgment in the first place.

The judgments implied by the Pourtions plates are so blatant that they are self-explanatory, but sometimes the message is more subtle. For example, Trader Joe’s has a line of “reduced guilt” products, such as their low-fat mac and cheese, which implies increased guilt for its full-fat counterpart. One might argue that the “reduced guilt” tag is a tongue-in-cheek marketing gimmick and is not to be taken to heart. Perhaps, but messages like these – whether in your face or toned down – are so commonplace that they are insidious.

Honoring internal eating cues is difficult to do in a society with pervasive messages that our bodies are not to be trusted. We have 100-calorie snack packs, for example, that people often utilize in an attempt to limit their consumption via an external control – in this case, the pre-portioned quantity – but the implication is that 100 calories is the correct amount to consume, that it should be enough food. In some cases, it will be, but 100 calories is an arbitrary amount of energy, and chances are low that it will just so happen to match up with someone’s hunger/fullness cues. If someone gets to the bottom of the bag and yet they are still hungry, the dissonance between their body saying, “Hey, I need more food,” and society saying, “Hey, you have already eaten enough,” is confusing and stressful.

The small print on food labels reads, “Percent Daily Values are based on a 2,000 calorie diet. Your daily values may be higher or lower depending on your calorie needs,” but time and time again, I have patients who believe they should be consuming 2,000 daily calories because food labels imply that this is the standard amount for an adult human. They then have difficulty making sense of their bodies asking for more food than that and feel tempted to restrict in an effort to match the label.

While I am not advocating for the abolition of food labels or snack packs, we have to consider the gap between impact and intent and realize that these tools might not actually be as helpful in reality as they seemed in their creators’ imaginations.

To Macy’s credit, they took the feedback they received to heart; seemingly realized that despite the humorous intent of the Pourtions products, the reality is that the plates are offensive and send harmful and dangerous messages; and consequently stopped selling them.

A Reader’s Intuitive Eating Question

“The concept of intuitive eating is hard for me to grasp. The way I understand it is that I need to listen to my body so I will recognize when I’m hungry, and eat until my body tells me I’m not hungry anymore. If that’s basically correct, my problem is that I’m rarely ever hungry because I only recently ate, and always continue to eat until external clues tell me to stop (e.g. I ran out of time or food, or my eating partner has finished). How can I begin to listen to my body so I know when I’ve become hungry enough so that it’s okay to eat, and when I should stop eating?”

A reader emailed us the question above in response to an invitation in a previous newsletter to suggest future topics. It sounds as if the writer is still working to fully understand the concept of intuitive eating and how to incorporate it into his life, and I hope I can help.

Some of the language that the writer uses caught my eye: need, enough, okay, should. Diets have rules and directives that are clear and crisp. Even though diets typically fail in the end, part of the reason they are enticing is that they tell us what to do, which simplifies things by taking some of the decision-making out of our hands while paradoxically making us feel like we have more control over the situation.

People who are coming to intuitive eating from a history of dieting commonly and understandably assume that intuitive eating is just a different house built from the same framework of dieting, hence absolute language that implies a set of rules. In reality, intuitive eating has no rules, but rather guidelines and ideas for consideration. The difference is more than semantics, as people who attempt to pound intuitive eating into a rules-based framework end up warping it into the hunger-and-fullness diet, which both misses the point of the approach and makes incorporation more difficult.

With that in mind, I might suggest tweaking the writer’s question in order to remove the implication that his hunger has to reach a certain threshold for him to gain permission to eat and that he must stop when his fullness hits a particular level. He – and everybody else who follows an intuitive eating approach – always has unconditional permission to eat. Tearing down constructs that tell us when we can and cannot eat oftentimes feels scary, but it is essential in order to create the space necessary for us to make multifaceted eating decisions that are in our own best interests.

Instead of the question being how can the writer listen to his body so he can adhere to rules regarding when he can and cannot eat, perhaps a more helpful set of questions would include: How can he listen to his body so he can notice what different levels of hunger and fullness feel like and how different foods make him physically feel? How can he listen to his body in order to be more adept at distinguishing between times when he is eating for physical hunger versus some other factor, such as emotional or social reasons?

In that sense, I actually think the writer is further along than he realizes, for he listed some of the external factors – time, quantity of food available, his partner’s own eating behaviors – that are hindering him from making food decisions from an internal standpoint. The next step on this front might be to explore the pros and cons of maintaining the status quo versus implementing change in order to determine the extent to which he wants to and is ready to create change.

Another avenue to explore is the writer’s statement that as a consequence of his eating behaviors, he rarely experiences hunger cues. If we are not hungry as we head into an eating experience, detecting subtle signs of fullness as they set in can be more difficult due to a lack of contrast. In other words, we cannot notice hunger signals subsiding if they were never there to begin with. If we grow accustomed to an absence of hunger cues, we might lose the ability to recognize the more subtle stages of hunger. Therefore, the writer might benefit from performing some experiments to intentionally let himself get hungry, to really notice what that feels like, and then consciously eat in response to it and see how the experience contrasts to when he eats in response to external cues.

Becoming an intuitive eater is a process. The journey never looks exactly the same for two people, as we are all so different and unique, but one commonality is that the road traveled is rarely direct. We discuss ideas, experiment, gather data that suggests areas of opportunity for further growth, and repeat the cycle until someone finds peace with food.

Walking While Jacketed

The Needham police stopped me while I was out for a walk yesterday morning. Reportedly, someone had called them to express, umm, “concern” that I was pushing an empty stroller. But the stroller was not empty, as the officer quickly realized when I introduced him to our infant daughter.

Even if the stroller had been empty, that is not a crime. Maybe I was returning home from dropping my baby off at daycare, or on my way to pick her up from visiting with a family member. Perhaps I was going to use the stroller to transport groceries home from the supermarket.

After I asked the officer exactly what the caller said, he made mention of the heavy winter jacket I was wearing, suggesting that my wardrobe choice raised suspicion. Some people run warm, some people run cold like me, but neither one of these characteristics is illegal either.

Before I get to the elements of this incident with which I take issue, let me first state what my problems are not:

My problems are not with the police department, and I am glad they responded to the call. What if I had actually been up to no good and they declined to pursue a tip that could have prevented a crime?

My problems are not with the responding officer. He was respectful throughout our encounter, and while he was understandably guarded at the outset, he became super friendly once he saw our daughter.

My problems are not with somebody keeping an eye on the neighborhood. “See something, say something” is an important call to action. Even in a relatively safe town like Needham, crimes still do occur, and we have to look out for each other and help the police to protect us.

My first problem is that what constitutes suspicion needs to be set at a higher threshold than what was exhibited yesterday. All the caller saw was a guy, a stroller, and their own prejudices.

My second problem is that not everybody gets treated the same by first responders, so when somebody ponders calling the police, they have to consider not just what crimes their call might prevent, but also what crimes their call might cause. As a white guy, I can see a police officer approaching me and feel confident that whatever transpires during our imminent encounter, I am likely going to be treated fairly and that my safety is probably not in danger. If I had dark skin, I would be less optimistic. We do not have to watch the news for very long before we see examples of seemingly-benign calls to the police resulting in murders of minorities.

My third problem – and the reason I am writing about this in a nutrition blog – is that this incident is emblematic of a broader issue in our town: We judge each other for our looks. Some of my fellow Needhamites have given me a hard time for my appearance as far back as elementary school, when my chosen attire and hair style were out of step with the hip children's fashions of the day. While I am not equating picking on a kid on the playground for his hair and clothes with calling the police on an adult for his jacket, I am saying that they exist on the same bullying continuum and that they are both symptomatic of an intolerance/phobia/disrespect of people who are different from oneself.

This latter point is what most frustrated and disappointed me about yesterday morning. All these years later, from the 1980s Broadmeadow playground to 2019 in my own neighborhood, the message is the same: Look different in this town at your own peril. Despite all of the changes that Needham has undergone over the past few decades, the pressure to conform remains fully intact.

Nobody should be surprised then that so many of our patients are working to overcome eating disorders, many of which – but certainly not all – were triggered by a desire to escape weight-based stigma, shaming, and bullying and to become a member of a more socially accepted group. No wonder then that some of our patients with restrictive disorders are reluctant to weight restore; after having a taste of thin privilege, surrendering it and returning to the crosshairs of stigma is a difficult proposition. Similarly, it is understandable that patients of all ages have a hard time giving up their fantasies of becoming thin, which is a necessary step in healing their disordered relationships with food.

A small fraction of our readers take umbrage at our occasional discussion of politics and societal issues, but most people seem to understand that if we are truly going to help our patients with their nutrition, we have to do more than address the nitty-gritty of food and eating behaviors. We have to advocate not just for greater tolerance of questionable fashion choices, but also for serious issues of equality. We have to fight for size acceptance.

Humming and Beckoning

Posted on by

Patients working with me on intuitive eating inevitably hear me use the terms humming and beckoning in the context of eating dynamics. Based on the feedback I receive, being able to differentiate between humming and beckoning is one of the most helpful skills for an aspiring intuitive eater to develop. So, what do these two terms mean, and why are they helpful? Let’s discuss.

Humming occurs when we are internally inspired to consume a food. Unprompted by anything external, we just feel that a particular food would hit the spot. Maybe you are sitting at your desk in the late afternoon, and as you begin to look ahead to dinner and consider what to purchase or make, you think to yourself, “Man, I could really go for [insert the object of your food craving].” Organically and unprompted, you just really want a particular food.

Beckoning happens when we are externally inspired to consume a food. We are not thinking about a given food, but circumstances unfold that result in us wanting it. Maybe brownies are not on your mind at all, perhaps you are not even hungry, but you walk by the break room, spy that someone brought in a pile of the homemade goodies, and suddenly you think, “Oh, hey brownies!” and take a couple back to your desk to munch on while you work.

Whether a food is humming or beckoning is not directly based upon a food’s nutrition profile, our ability to obtain the food, our beliefs about its appropriateness for the meal/snack at hand, our feelings about the food, or where it might fit on our good/bad food dichotomy (if applicable). Rather, humming and beckoning are directly based upon the source of our motivation – whether internal or external – for wanting a particular food.

Indirectly, however, our relationships with food can certainly influence our humming/beckoning dynamic. Going back to the brownie scenario I previously mentioned, someone who restricts their intake of sweets will likely experience a stronger pull towards the brownies than somebody who has a healthier relationship with such treats and knows they are free to have brownies at any time. The brownies might still beckon to both people, but the intensity of the sparkle differs, as might their responses.

Eating in response to humming has its upsides. From the standpoint of satisfaction, foods that we are humming for are more likely to hit the spot and leave us feeling content. In contrast, if we are humming for one food but eat something else for whatever reason, we might overconsume in an effort to make up for quality with quantity, or we might scrounge around going from food to food in search of satisfaction. Think of someone who really wants ice cream but gets frozen yogurt instead because they believe it to be healthier. They might overeat on the yogurt and perhaps eat another dessert or two afterwards, whereas if they had just had a little bit of ice cream in the first place, it would have hit the spot, and they could have gotten on with the rest of their day having found contentment in their eating experience.

Sometimes we do not give enough credit to our bodies, which are pretty good at directing us towards what we need. Think of how water tastes so much better and is that much more satisfying to drink when we are thirsty versus when we are already well hydrated. Someone with anemia might not know that red meat is high in iron; they just know that they could really go for a steak, as their body increases its perceived appeal of high-iron foods. Personally, I discovered that salted crackers and pretzels were particularly satisfying during and after marathons long before I understood that my body was trying to replenish its sodium and carbohydrate stores.

While eating in response to humming is typically a positive, beckoning is often viewed as a negative phenomenon, something to be resisted. However, I believe that beckoning gets a bad rap, and sometimes letting it guide our eating decisions is actually both sensible and helpful. Consider the following examples.

Rarity: My first job as a dietitian was a research position that had me flying all over the eastern United States examining food and eating behaviors in elementary school cafeterias. Every night, I went out for dinner at local restaurants. In Philadelphia, I ordered a steak and cheese. A few weeks later in Tennessee, I made sure to get barbecue. My last trip took me to Tampa, where I ate plenty of seafood. These were not cases of humming just so happening to coincide with popular regional cuisines. Rather, these foods beckoned to me because these locales were known for them, and I wanted to take advantage of my rare opportunities to experience authentic fare.

Similarly, you likely find yourself in situations on occasion in which you have an atypical opportunity to try a particular food. One of my patients, for example, told me that his co-worker makes amazing Chinese dumplings every year for their office holiday party. If he passed them up one December, he would have to wait another year for the opportunity to come around again, so of course he partakes in the dumplings whether or not he is humming for them the day of the party. Letting a rare chance slip away could leave one feeling like they are not living life to its fullest.

Deprivation: For someone still working to improve their relationship with food, especially if they have a history of dieting or other form of restriction, denying themselves a beckoning food can trigger feelings of deprivation that can have ramifications, such as subsequent overconsumption. Someone might decline the cake and ice cream at a birthday party and then rebel against their self-imposed restriction by consuming an entire pint of ice cream later in the day. In this example, the person would have been better served to remind themselves that they have unconditional permission to eat whatever and whenever they want and then celebrate with the other partygoers by having a little dessert.

Uncertainty: Sometimes our humming signals are just not that strong. We know we are hungry, but identifying the best fit proves a challenge. We might ask ourselves matching questions regarding what taste, color, temperature, or flavor food we feel like consuming, yet come up with limited criteria that still leave us feeling directionless and frustrated. In such cases, beckoning can be our friend by helping us to resolve the uncertainty and make a decision. For example, you might be gazing at a restaurant menu in frustration, unsure which entree to order, but then you glance at another patron’s meal, think to yourself, “That looks good,” and suddenly you have your answer.

Other times, not responding to beckoning might be the best move. The person who walks by the break room and spies the brownies might decide, “You know what, those brownies do look good, but I was not really feeling like having brownies; I am only interested in them because I saw them, and they are probably not going to hit the spot as they would if I were humming for them. Besides, I have unconditional permission, so I can make or buy brownies anytime I want. So, I am going to pass on them for today.” Five minutes later, they could be back at their desk and engrossed in their work, having totally forgotten about the brownies.

In my view, eating in response to either humming or beckoning is a morally neutral action, and there are no absolute right or wrong responses. However, understanding the dynamics behind our draw to a food can help us engage in whatever eating behavior we feel is in our best interest at the given time.

What’s the deal with that egg study?

Posted on by

One of the most common sources of nutrition-related frustration that patients express to me is the apparent fickleness of nutrition advice. It feels as though headlines and sound bites demonize a food that only yesterday was deemed the food of the Gods, or vice versa, leaving exasperated and confused eaters at a loss.

Eggs became the latest example when a recent Northwestern University study was picked up by mainstream media and turned into “clickbaity” headlines, such as “Bad news for egg lovers,” “Eating Eggs and Cholesterol Linked to Heart Disease and Death Risk,” “Are eggs good or bad for you? New research rekindles the debate,” and “Northwestern study cracks dietary guidelines for eggs.”

Unfortunately, disconnects often exist between headlines – which, remember, are sometimes sensationalized and designed to generate clicks, views, and shares – and the research behind them. For example, the Northwestern study in question is not actually bad news for egg lovers. Far from it. Let’s take a look at the study.

The study relied on self-reported dietary data, which are terribly flawed. During the course of our work, I sometimes ask a patient to keep a food journal and return it to me for analysis. Despite patients' best efforts to keep accurate journals, the sources of error are numerous. People misremember what they consumed, forget to report some of what they ate, provide vague information that I can easily misinterpret, and purposely falsify data for fear of judgment.

Close to a decade ago, I was working on a research study that in part required that I interview people about what they ate the preceding day. As I sit here right now, I could not tell you what I ate for dinner last night, and the subjects were no different. One of the gentlemen I interviewed got frustrated because I had to drill down to such a specific level of detail that I was asking him for the measurements of the piece of lettuce he put in his previous day’s sandwich; meanwhile, he could not even be sure that he had eaten a sandwich at all. Eventually, my research team made the decision to drop the dietary recall portion from our study because the data were just so poor. Similarly, how confident can we really be that subjects included in the Northwestern study accurately reported their egg consumption?

Even if we take the data at face value and assume them to be completely accurate, we must remember that this study only found associations between egg consumption and disease, which is not the same as establishing a causal relationship. One of the most common mistakes that people make is to assume that correlation implies causation, but such an assumption is premature at best and can turn out to be just plain wrong.

Just because two events tend to occur together does not mean that one causes the other. Consider what our friend and colleague, Ragen Chastain, famously wrote in 2017. “Imagine if I got together everyone who had survived a skydiving accident when their parachute didn’t open and started looking for things they have in common. Even if every single one of them wore a green shirt and had oatmeal for breakfast, I cannot say that wearing a green shirt and eating oatmeal will allow you to survive a skydiving accident, nor can I ethically start Ragen’s School of No Parachute Skydiving ‘free green shirt and oatmeal with every jump!'”

In other words, even if it is true that people who consumed more eggs had a greater incidence of cardiovascular events and death, we cannot say for sure that the eggs were responsible – just as we cannot say that blueberries reduce heart attack risk – because another factor, or combination of factors, common to people who consumed more eggs could be responsible for their disease and death rather than the eggs themselves.
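For readers who like to see the arithmetic, here is a toy simulation of exactly this kind of confounding. All of the probabilities are made up for illustration and have nothing to do with the actual Northwestern study; the point is only that when a hypothetical lifestyle factor drives both egg consumption and cardiovascular risk, egg eaters show more events in the data even though eggs play no causal role whatsoever.

```python
import random

random.seed(0)

# Toy simulation with made-up numbers: a hypothetical lifestyle factor
# raises BOTH egg consumption and cardiovascular risk. Eggs have no
# causal effect here, yet high-egg eaters show more events in the data.
n = 100_000
events = {"high_egg": 0, "low_egg": 0}
counts = {"high_egg": 0, "low_egg": 0}

for _ in range(n):
    risky_lifestyle = random.random() < 0.4
    # People with the risky lifestyle happen to eat more eggs.
    high_egg = random.random() < (0.7 if risky_lifestyle else 0.2)
    # Cardiovascular risk depends ONLY on the lifestyle factor, not eggs.
    risk = 0.06 if risky_lifestyle else 0.02
    group = "high_egg" if high_egg else "low_egg"
    counts[group] += 1
    events[group] += random.random() < risk

for g in ("high_egg", "low_egg"):
    print(f"{g}: {events[g] / counts[g]:.2%} event rate")
```

Run it and the high-egg group shows a markedly higher event rate than the low-egg group, purely because of the shared lifestyle factor. An observational study of this population would duly report an association between eggs and heart disease, and a headline writer would take it from there.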

Observational studies like these are great for developing hypotheses to be explored in subsequent research, but their design prevents them from establishing causal relationships. Unfortunately, this incredibly important point is often glossed over or ignored entirely when a study is distilled to pop culture news articles and then further condensed into headlines.

Consequently, the news that we see leaves us with the impression that nutrition information and guidance are always changing like early springtime New England weather. Don’t like seeing your favorite food being vilified? Just wait until tomorrow when a new headline will sing its virtues.

In reality, nutrition science moves at a more glacial pace. One study generates hypotheses that subsequent studies investigate, followed by yet more research that looks at the given questions from different angles in an attempt to confirm or refute the original findings and gain a deeper understanding that policymakers eventually take into account when issuing dietary guidelines.

If someone’s current egg consumption is working for them, I see no compelling reason – based on what we know at this point – for changing it.

Evelyn

Posted on by

Some blogs take me longer to write than others. This one, I started four years ago, shortly after my grandmother, Evelyn, died suddenly of a stroke at 95 years old. Ravaged by Alzheimer’s, her memory had badly deteriorated, and she was residing in a senior living facility with a great staff who cared for her.

At least, that is what I was told; I do not know firsthand, because I never actually visited her there. My grandmother and I had not seen each other in probably a couple of years when she passed. Although her memory problems were at first an annoyance to which we responded with humor – for example, my father would respond to her “How’s work?” questions with “Fine” rather than remind her that he was retired – her memory grew more concerning over time. First, she called my wife by the wrong name, then forgot her name entirely. My fear was that I would walk into her room and hear, “Who are you?” That would have been tough to take.

My grandmother was a complicated person. Everybody has challenges, some more than others, and she quite often met hers with twists of the truth. If you knew Evelyn well, then you know exactly what I mean. So the distance that divided us in recent years stemmed both from my own choices and from her limitations.

Before that though, our relationship was solid. Although Evelyn was a reluctant mother who never truly embraced parenthood and the life changes that it requires, grandmotherhood was an entirely different story, and she was damn good at it. That included great-grandmotherhood. At a family gathering close to a decade ago, my niece and nephew were acting a bit rambunctiously and ignoring their parents’ directives to calm down. Their great-grandmother came over and said to the kids, “Let’s have a contest to see who can stay quietest the longest.” Right away, both children went silent. My brother turned to me, shocked. “I can’t believe that actually worked!”

My three favorite memories of my grandmother are as follows:

  1. When I was little – and I mean little, like nursery school or early elementary school little – she handed me a couple of dollars, as my grandparents often generously did when they visited. Not meaning it as a hint, but rather just stating a fact, I told her that I was just a couple more dollars shy of being able to buy a Dukes of Hazzard toy that I wanted. Right away, she reached into her pocket and gave me the money I needed. Thirty-something years later, that generous move has stuck with me.
  2. My brother and I occasionally had sleepovers at my grandparents’ condo. Typically, I stayed in one room with my grandmother while my brother shared a room with my grandfather. One evening, they switched things up, which did not go over well. Faced with the prospect of spending the night with my grandfather, I began crying. And then, apparently, I did not stop. I remember him, totally at a loss, calling for his wife, “Ev, he’s crying!” We switched back to the traditional configuration. In the morning, I woke up to find my grandmother looking at me and smiling, and I remember feeling very comfortable and safe.
  3. My grandparents visited us practically every Sunday except during the winters when they migrated to Florida. Each week, Evelyn arrived with food, including baked goods of various qualities. When I was a teenager, she caught wind of my liking peanut butter and jelly sandwiches. Every Sunday, for weeks and weeks on end, she showed up with PB&J she had made for me. Peanut butter and jelly is cool and all, but there is a limit. Afraid of offending her, I was wary of asking her to stop, yet I could see no end in sight. Anxiously, I dreaded waves of weekly sandwiches that could potentially keep coming until I went away to college. Still, I certainly appreciated the kindness behind her gesture, and that is what I remember most.

Food was a source of stress with my grandmother in other ways, too. As is typical of people who lived through the Great Depression, both she and my grandfather hated to waste food themselves, and it irked them when others did as well. Americans often forget that it was not too long ago in our history that food scarcity was a widespread and significant problem. Some of the original dietary guidelines from the 1940s emphasized the importance of butter and sugar because so many calorie-starved young men were failing their military physicals. Today, our area food banks and the lines outside food pantries are evidence that many of our neighbors still struggle to get enough sustenance.

People who have experienced food shortages oftentimes rebound by eating too much when food eventually becomes plentiful again. Virtually anybody who has ever dieted can relate to this, as food scarcity is often self-imposed. For Evelyn, these behaviors became so ingrained that decades later she still cleaned her plate and expressed dismay if others left food. “But there are starving people in China!” she would exclaim, as if someone overeating in Boston would make any difference whatsoever for a malnourished individual on the other side of the globe.

Eating with my grandparents was stressful, as I never liked being told to continue eating when I knew I was already full. To my parents’ credit, they stood up for me and overrode my grandparents’ commands. Still, the tension made family meals unpleasant because I felt pressure from both grandparents to eat past the point of comfortable fullness. They would comment if the portion I served myself seemed too small to them, and I certainly heard about it if I left food on my plate.

It took me years to figure out why I sometimes get anxious eating in restaurants, but through working on my own relationship with food, now I understand that it traces back to my grandparents. If a portion is set in front of me that I assess as more than I can comfortably eat, the anxiety sets in, the enjoyment of eating diminishes, and then the internal questioning begins. What fraction of the meal must I eat to feel confident that the waitstaff will not get mad at me? Can I entice my wife to eat some of it? Will anybody notice if I hide food in my napkin?

Rationally, I know the truth is that the waitstaff probably do not care how much I eat. So long as I pay for the food, how much of it I eat is irrelevant to them. If they do judge my consumption, it probably has more to do with disturbances in their own relationships with food or perhaps fear that I did not enjoy my meal.

Irrationally though, I continue to project my grandparents’ judgment onto the waitstaff. My work is ongoing, and I know that eventually I will overcome this, but in the meantime, I have figured out some workarounds that mitigate my anxiety while also honoring my body’s intuitive eating cues. For example, I may ask the waitstaff to pack up the remainder of my meal even if I know I will dispose of the leftovers as soon as we leave. One might argue that is a waste of packing materials, a valid point, but it is certainly a better choice than using my body as a garbage disposal.

Sometimes, I challenge myself. If I feel particularly courageous, I will just leave a heap of food on my plate, ask the waitstaff to take it away, and see how they react. In every single case, they have taken the plate without commenting on the amount I left. Seeing the juxtaposition between my fears and reality has helped significantly, but the process continues.

Few of you care about my grandmother and my own food woes, a reality to which I take no offense, but all of this is meant to illustrate that the work we do in my office is typically deeper than people expect. In order to create meaningful change, we often have to look beyond calories and grams and instead focus on how people make decisions about what, when, and how much to eat. Doing so may involve examining the historical influences that shaped one’s current eating behaviors, which in turn paves the way for moving into the future with a healthier relationship with food.

The Natural Purple Pill?

Posted on by

At this year’s Cardiometabolic Health Congress, a cardiologist I will call “Dr. Q” began his nutrition presentation with a factoid: 90% of cardiologists reported zero or minimal nutrition education, yet 95% of them felt it was their personal responsibility to discuss it. Meanwhile, 61% of the public thinks that doctors are “very credible” sources of nutrition information.

In other words, we have doctors who do not know what they are talking about talking about it anyway, and patients are listening and trusting them because they are doctors.

He called blueberries “the natural purple pill” and cited research in which 93,600 women were followed for 18 years; those who consumed three servings of blueberries per week throughout the study had a 34% reduced risk of myocardial infarction. He then flashed a slide listing the dozens of known chemical compounds in blueberries, asked how we know which nutrient or combination of nutrients is responsible for the benefits, and answered his own question with, “I don’t think any of that really matters,” intimating that the bottom line is that blueberries offer health benefits.

But the underlying mechanism absolutely does matter. He assumed a causal relationship between at least one of the chemical compounds and reduced risk of heart attack, but the relationship between blueberry intake and heart attack risk could be mere correlation. For example, the real factor at play might not be some minute compound, but rather money.

Relative to other fruits, blueberries are incredibly expensive. According to data I obtained from Peapod.com, blueberries cost $0.44-$0.64/oz. (depending on the size of the container purchased), which exceeds the price of apples, grapes, melons, strawberries, and every other fruit I examined except for pomegranate seeds ($0.63/oz.) and raspberries ($0.56/oz.).

Could it be that the women in the study who could afford to eat blueberries three times a week also had other financial advantages that enabled them to take better care of themselves, such as the ability to absorb higher insurance costs for office visits and testing, health club memberships, time off from work or no work at all, massages, and psychotherapy?

On the flip side, you know who is probably not splurging on blueberries or able to engage so extensively in taking care of their health? Those working multiple jobs just to get by, those living paycheck to paycheck, those suffering from food scarcity, those relying upon the Thrifty Food Plan, and those who need to make $3.33 stretch enough to buy multiple items to feed their entire family instead of blowing it on a small container of “purple pills.”

“Whether measured by income, formal education, or job status, there is a socioeconomic gradient to health,” Bacon and Aphramor write in Body Respect. “And the greater the inequality in society, the steeper the gradient. The United States has the greatest inequality of all wealthy nations – and the greatest health disparities.”

This is what I was getting at last year when I wrote about nutrition and politics. We talk about the concept of intersectionality and how various layers of oppression aggregate. The further one’s identity lies from that of the pinnacle of privilege – a thin, white, heterosexual, educated, wealthy, American-born, Christian male – the more the individual is subject to oppression.

It might not be just that one’s economic situation makes regularly consuming blueberries unrealistic and limits their access to health care; in addition to fretting about cash flow, that person might also have to worry about suffering a hate crime or having their rights stripped away. Even if someone does not fall victim to such misfortune, remember that stress itself is associated with cardiovascular disease, so the mere threat is problematic.

Assuming that the reduced risk of heart attack was due to a few weekly handfuls of berries without considering the greater context is ridiculous and exemplifies the problems inherent in viewing nutrition solely as a hard science. Anybody who has extensively studied the field should know to consider social, cultural, and other factors, which makes me wonder: When Dr. Q told us that 90% of cardiologists reported zero or minimal nutrition education and yet 95% of them felt it was their personal responsibility to discuss it, was he describing himself?