Last week, the peer-reviewed medical journal JAMA Internal Medicine published a new study that found that greater consumption of sweetened soft drinks was associated with a higher risk of death. According to the study, which tracked more than 450,000 adults across 10 European countries over 16 years, habitual drinkers of artificially sweetened soda — like Diet Coke — were 26 percent more likely to die prematurely than those who rarely consumed sugar-free soft drinks, while consumers of sugar-sweetened soda — like regular Coca-Cola — were 8 percent more likely to die early than those who seldom drank sugary pop.
News outlets seized on this particular finding with predictably flashy headlines: “You’re even more likely to die if you choose diet soda” (Washington Examiner), “Just two Diet Cokes a day ‘increases your risk of deadly heart attack or stroke by 50%’” (The Sun), “It’s Not Just Sugary Drinks That Are Bad for You—Artificially Sweetened Ones Also Appear to Increase the Risk of Death” (Newsweek).
But, as the New York Times points out, the study was still “unable to resolve a key question: Does consuming drinks sweetened with aspartame or saccharin harm your health? Or could it be that people who drink lots of Diet Snapple or Sprite Zero lead a more unhealthy lifestyle to begin with?” For instance, Harvard T.H. Chan School of Public Health researcher Vasanti S. Malik — who authored a study in April that failed to confirm a conclusive link between artificial sweeteners and increased mortality in women — told the Times it’s entirely feasible that diet soda drinkers could be using their consumption of Diet Coke to rationalize an indulgence in less healthy foods.
In other words, there’s a risk of conflating correlation with causation. As expansive as this particular study was, and as much as the researchers tried to mitigate those risks by removing subjects who were smokers or obese, observational studies aren’t easily able to establish cause and effect due to the number of variables outside researchers’ control (as opposed to clinical trials, which are more airtight but can be both expensive and logistically unmanageable).
Food science — and, by extension, writing about food science — is rife with studies that both journalists and readers would do well not to take at face value, without noting their inherent complications and caveats. As journalist John Bohannon highlighted in his 2015 stunt that spread the “scientific” finding that chocolate leads to weight loss, pitfalls can include flawed study designs (too few subjects, too many variables, etc.) and shady pay-for-play publications. Sometimes health studies are funded by the very industries the researchers are studying, as when Coca-Cola provided financial and logistical support to a new nonprofit that promoted the idea that it’s more effective to address obesity through exercise than through food and drink consumption. Or, as in the recent downfall of prolific Cornell food scientist Brian Wansink, some studies may be the product of deliberate data manipulation in order to produce more interesting — and headline-making — findings.
Food writing is often seen as one of the “soft” subsets of journalism, as opposed to the hard-hitting beats of politics, business, and the like. But food journalism isn’t (or shouldn’t be) just rosy profiles of star chefs, or Big Brand press releases regurgitated as gushing 200-word articles: food is politics, and labor, and business, and tech, and culture, and crucially, health and science. Figuring out what and how to eat and drink is already hard enough; let’s all just add a dash more healthy skepticism to our content consumption habits to avoid complicating it even further.