**Epic post warning – in which I discuss teenage “O” research**

I’d usually start with “Happy New Year”, but it doesn’t feel right. These horrendous bushfires have devastated so many Australian communities, and it’s not over yet. My heart goes out to everyone across this wide brown land, and I hope that you and your family, friends and colleagues are safe. May it be over soon, and may our government heed Mother Nature’s call – we simply have to get our heads out of the sand, take climate change seriously and throw everything we can at reducing our emissions.

Last week I was interviewed for the Full Bloom podcast about everything I’ve been doing to push back against the Fast Track trial, a prolonged semi-starvation diet experiment targeting teens in Sydney & Melbourne. It was great to reflect on how much has happened since the trial first came to light at that eating disorders conference in 2018.

In my last newsletter, I told you about an awesome win in this epic battle, which was having my Letter to the Editor published in “Obesity Reviews”. My letter detailed major concerns about the Fast Track team’s recent paper, a meta-analysis which concluded that putting teenagers on diets is safe and doesn’t increase eating disorder (ED) risk. The Fast Trackers published a response to my letter, which I’m going to unpack for you. I’m doing this because my letter & their response are behind a paywall, so unless you’re an academic you won’t get free access. You can read a blog post I wrote dissecting the meta-analysis here.

The Fast Track team’s response began by “acknowledging” 2 “typographical errors” in the meta-analysis which I’d pointed out (where they’d overstated the number of studies in their analysis – sloppy).

They then dismissed my other points, airily claiming that

“the remaining concerns raised by Ms Adams were misunderstandings of the inclusion criteria for this review and of the way in which data were reported”.

So let’s dig in and see just what I have allegedly “misunderstood”:

Insufficient Long Term Data

I said: The researchers’ grand claim that weight loss dieting for teens is ‘safe’ is absurd, given the absence of long term data to back it up. Eating disorders take time to appear in adolescents – one eating disorder research team (Stice & Van Ryzin, 2019) empirically demonstrated that once teenagers begin to diet, eating disorder symptoms don’t show up for 27 months. The Fast Trackers’ review only had 3 studies which followed the teens for this long, representing just 7.5% of the entire sample.

They said: Stice & Van Ryzin’s model doesn’t apply, because it wasn’t based on fat kids.

They then tried to claw back legitimacy by claiming that 7 of the studies followed the teens for 2 years or more “from baseline”. This is an arbitrary definition of ‘long term’ which they have created – and it’s sneaky. Most people would read that phrase and think that the researchers followed up 2 years after the diets had finished, but that’s not the case: 2 years from baseline includes the time when the kids were actually dieting. Defining things this way makes a study look longer and more important than it really is. It’s only after a study finishes – once participants are away from the researchers and their agendas – that the important long term effects can be studied.

But let’s humour the Fast Trackers. Let’s assume that 7 studies with 2 year follow up are worth throwing into the mix. I’ve already analysed 3 of them in a previous blog, and the outcomes weren’t great, showing 5 to 9% of the kids had increased ED risk.

When we turn to the remaining 4 studies, we immediately discover that there are actually only 3 to discuss, because one (Goldschmidt et al 2014) didn’t even bother to report follow up ED data, rendering it totally useless – which makes you wonder why it was left in the meta-analysis at all.

So now we’re down to 3 papers. The first is by Epstein and colleagues (2001), who subjected kids aged 8-12 to the infamous “Traffic Light Diet,” recently in the spotlight thanks to WW using it in their Kurbo app. This diet cut the kids’ calorie intake down to 1200 (and below), and taught them to avoid ‘red’ foods and limit ‘orange’ foods. 64 kids started, and 47 finished. The diet went for 6 months, and kids were followed up 18 months later.

The Traffic Light Diet is often touted as ‘scientifically proven’ and ‘safe’. But 7 children who did not have eating disorder symptoms when they began the diet had elevated ED scores at 2 year follow up. Of the children who began with elevated ED scores, half were no better 2 years on, while the other half had reduced their scores by the end. But does that improvement mean they were psychologically well? Or had this group simply learned to “white knuckle” the diet – to become ‘better’ at dieting?

7 new cases of elevated ED risk in 47 very young kids – that’s 15%. That’s high. The real number is likely to be even higher – Epstein himself noted that a limitation of this study was “the age of the children is below the age at which many symptoms of disordered eating increase.” AND we don’t know the fate of the kids who dropped out.

We also know that at least one of the apparent “success stories” from the Epstein study was actually very damaged by it. In her blog, “Liza not Lisa” described her horror when she heard about the Fast Track study, because it triggered so many painful memories from her time as a participant in Epstein’s trial. She wrote:

“I was considered “successful” during the initial program, and probably when they followed up because I think it stayed off for a little while, and like all shams diet studies, they didn’t follow up far enough in the future to get the truth.”

In reality, Liza’s relationship with food, her body, and exercise was damaged for years –

“I spent the bulk of my teen years and early twenties trapped in a cycle of trying a new ridiculous diet, being unable to sustain it, and feeling terrible. Rinse, repeat…I still have a very “all or nothing” approach to whether or not I’ve succeeded at something, much like we did in our reward system. But hey, at least Dr. Epstein got JAMA publication and eventually a lifetime achievement award, amirite?”

The truth is, “childhood obesity” research is simply not sensitive enough to detect the myriad ways we damage kids’ relationships with food, exercise, and their bodies when we introduce them to diets at a young age. And the researchers are long gone once the real impacts are felt.

The next medium term study was by Halberstadt et al (2016). This included 120 kids aged 8-19 who were subjected to an intensive inpatient diet program (literally locked in a hospital ward & forced to diet). The diet prison period went for 1 year, and 76 kids were followed up 1 year later.

Percentage of girls & boys with complete ED data who scored above average at baseline, post treatment & 1 year follow up

The table above shows the percentages of kids who were showing signs of disordered eating before, just after, and 1 year after the intense diet. There’s a lot happening here – especially for the girls. The % of girls in the above average range for ‘eating for external reasons’ dropped during the enforced diet (not surprising), but was back up again 1 year later – a classic example of “white knuckling” the diet while under supervision, followed by rebound once the pressure was off. Meanwhile, the % of girls showing elevated dietary restraint almost doubled, meaning that half of them were showing worrying signs by the time the diet had ended. For a lot of girls, this treatment is mucking with their sense of permission to eat – those high restraint scores are a definite worry. The boys look like they were less damaged than the girls, but again, how are they really going? Have they simply been taught to be more compliant dieters? What happened to them after this study finished?

Our final long-term-that’s-not-really study involved our very own Fast Track team, in the RESIST trial which targeted kids aged 10 to 17 with “pre-diabetes or insulin resistance”. All of these kids were put on metformin, a drug known for enhancing weight loss.

It was a 6 month intervention where they divided the kids into 3 diet groups with different macronutrient levels (messing with carbs & fat). 111 kids started but only 42 returned for a 2 year follow up – a staggering 62% dropout rate. As a weight loss intervention it was a spectacular failure – by the end of the experiment, all of the kids had regained the weight.

For the few kids who returned the ED symptom survey (just 20 questions) at 2 years, there were no changes in dietary restraint or parental pressure to eat. There were reductions in external eating and emotional eating scores, but not enough detail was given to work out how many kids were elevated to begin with, or how many may have worsened over the course of the experiment. And with such massive dropouts we certainly can’t conclude anything about ED safety following childhood obesity treatment from this study!

In summary, adding data from the additional 3 studies shows that even in the medium term, signs of eating disorder risk following the diets were present in 15-48% of the kids enrolled. This is hardly a “small number”. There’s also no way of determining the impact of these diets on the kids who dropped out. Drilling into the data also shows that many of the observed “improvements” in ED measures could be attributed to kids temporarily “white knuckling” their way through diet compliance in a high pressure environment. Let’s not forget that in clinical trials, participants experience significant demand characteristics – they know that they’re in a diet study, and want to please the researchers. With internalised weight bias undoubtedly playing a role in these kids’ perceptions of what is ‘desirable’, it’s not surprising that in the medium term, some of the ED measures will show what look to be improvements.

And returning to the original point – even if we add in the extra kids from these 4 (really 3) studies, we still only have complete ED risk data on 360 teens. That’s a measly 14% of the whole sample. It is RIDICULOUS to claim that something is safe when 86% of the data is missing.

The Missing Data

I said: The Fast Trackers’ paper also left out important ED results from a long term paper which showed 9% of the kids had elevated ED scores 6 years after a hospital based weight loss trial, and that one had subsequently been hospitalised for an ED.

They said: Their analysis was only interested in the CHANGE in ED symptoms from baseline to follow up, and because this particular ED measure was only taken at follow up, it was not relevant and “did not meet inclusion criteria for this review”.

But then they immediately contradicted themselves by stating “These data provide useful insight into the prevalence of bulimic symptoms in treatment seeking samples”.

I can understand why they wouldn’t include the data in their statistical analysis, but it absolutely should have been discussed in the paper. As they themselves agree, it is RELEVANT! Leaving it out entirely smacks of a cover-up – of a paper built to push an agenda rather than one which genuinely seeks to understand the intricacies of this topic.

With regards to the teen who was hospitalised for an ED after the diet, they said “the participant may have entered the trial with an ED”. I mean JFC. Is it so hard to admit it might equally be due to the extreme diet? Data which shows potential harm SHOULD HAVE BEEN INCLUDED in a paper which is supposedly interested in exploring the relationship between adolescent dieting and eating disorder risk.

Choice to Meta-Analyse

I said: The meta analysis clumped together studies which were hugely different from each other: from inpatient hospital treatments to a Jenny Craig program for teens. I also mentioned that another team, De Giuseppe et al (2019), had reviewed exactly the same topic but decided not to meta-analyse because the studies were too different from each other.

They said: Their superior statistical wizardry skills made all of this ok.

Quality of Included Papers

I said: Researchers often rate the quality of the papers they include in their meta-analyses, to give readers an idea of how much confidence they can place in the studies underpinning the conclusions.

De Giuseppe et al conducted quality ratings for several of the same papers, and for 4 of the 5 studies in common, the Fast Trackers’ quality ratings were suspiciously higher than De Giuseppe’s:

Comparison of Quality Ratings of Included Studies

They said: “There are two studies in common between our review and that conducted by De Giuseppe et al which may be perceived as having a different quality ratings (p.1)”

This is clearly not true – there were 5 with different quality ratings.

A team which cannot tell the difference between 2 and 5 must be questioned!

Dietary Restraint

I said: The paper completely left out any analysis of dietary restraint, which is weird because it is an important risk factor for ED development, especially in teens. The Fast Trackers managed to analyse bulimic symptoms, binge eating, emotional eating, drive for thinness, and eating concern, but dietary restraint was nowhere to be seen.

They said: They didn’t need to, because subscales of dietary restraint were included within measures of “global ED risk”, which – via their statistical wizardry – showed an overall reduction, so there.

This defence makes no sense whatsoever. Their paper painstakingly extracted, analysed, and discussed all sorts of ED subscales – such as binge eating and emotional eating – but dietary restraint wasn’t worth looking at?

Could it be that this is because it’s a hugely contentious issue, because dietary restraint is an important early warning sign for development of an ED, because childhood obesity treatments are clearly increasing dietary restraint in our teens, and they don’t want that fact exposed?

This is lazy, dangerous science that smacks of an agenda being built.

Bits They Simply Ignored

I also included a list of 9 errors which I asked to be corrected. 5 of them were simply ignored by the researchers, who had clearly had enough of explaining themselves. So we’re left with a paper riddled with simple errors.

It’s arrogant to ignore valid criticisms, and simply awful to consider that this extremely damaging paper is now out there, circulating and building the illusion of safety when it’s really nothing more than smoke and mirrors. I am heartened that my letter to the editor is out there, and I hope that researchers in this field start to take the valid concerns of eating disorders experts more seriously.

I don’t believe that the issues in this paper are due to my “misunderstanding”. I think the problem here is the Fast Track team misrepresenting the data and hoodwinking us into thinking that dieting is safe. The Fast Trackers desperately want to continue to subject adolescents to ineffective, harmful weight loss interventions in order to keep the funding rolling in and to further their careers. It’s enough to make my blood boil!

I promised to tell you more about how the Fast Trackers came after my website, but I have definitely stretched the word limit for this post! More on that next time!