It’s impossible to read the news these days without encountering one (or several) headlines about AI. Seemingly every industry is exploring ways to leverage AI-powered tech, and these applications have the potential to transform the way we interact with the world and each other. This is understandable: AI capabilities are advancing at an exponential rate, and the possibilities feel endless. But it’s vital to recognize not only the possibilities, but also the limitations — and eating disorder treatment is one place where this comes vividly to life.
Most AI news has centered around highly advanced chatbots, most notably those built on GPT, or Generative Pre-trained Transformer, technology. GPT is, essentially, a large language model: it uses natural language processing to understand user input and generate helpful, relevant responses. The model is trained on a huge dataset of existing (human-created) text, absorbing grammar, word meanings, and patterns of usage, and that training informs how it both interprets and responds to human input.
It would be an understatement to say that both consumers and industry leaders are enthusiastic about the potential of AI. About three-quarters of internet users prefer using chatbots to people when looking for answers to simple questions, 65% of consumers feel comfortable handling an issue without a human agent, and 57% of executives say that chatbots bring in significant ROI with minimal effort. A striking 84% of companies believe that AI chatbots will become more important for communication between customers and businesses.
The healthcare industry isn’t exempt from these seismic shifts despite the more fragile nature of the industry’s work. Experts estimate that up to 73% of healthcare admin tasks could be automated by AI within the next year, and chatbots are starting to take over more “human” roles as well. A 2021 survey found that 22% of adults had used a mental health chatbot, and 47% were interested in using one. When asked why they sought out a chatbot over a human, most respondents cited affordability, ease of use, and ability to connect with a chatbot at any time.
At first glance, it makes sense that chatbots could serve as quick, low-cost alternatives to trained mental health professionals. After all, GPT technology is adept at responding to questions, synthesizing research, and having human-like conversations. But AI-powered chatbots also have points of failure, and in the realm of mental health, and eating disorders in particular, those failures can come at a serious cost.
Eating disorders are complex and highly nuanced mental illnesses; their signs and symptoms may be hidden or counterintuitive, and treatment needs to be highly individualized in order to be effective. What’s more, up to 90% of people with eating disorders have co-occurring illnesses, which makes their treatment even more complex. This murky landscape is not one in which chatbots thrive. Because the language models look for rules and patterns in their dataset, they’re likely to miss eating disorders that don’t fit a certain mold, or to misdiagnose someone whose symptoms correspond to more than one pattern.
There’s also the reality that while bots themselves don’t have biases, the data they learn from does. GPT chatbots are trained on massive sets of human-created language, and much human-created content around food, weight, bodies, and eating disorders is shaped by the powerful force of diet culture. This means that their interpretation of and response to questions may reflect weight bias, fatphobia, and the thin ideal. As Dr. Angela Celio Doyle, Equip’s VP of Behavioral Health, put it in a recent article: “Our society endorses many unhealthy attitudes toward weight and shape, pushing thinness over physical or mental health. This means AI will automatically pull information that’s directly unhelpful or harmful to someone struggling with an eating disorder.”
This potential risk was brought to life recently, when the National Eating Disorders Association (NEDA) replaced its human helpline staff with a chatbot named Tessa. Within a week, Tessa was reported to have given harmful advice promoting dieting and weight loss behaviors to callers struggling with eating disorders, and the bot was shut down.
With chatbots, there’s also the risk of false information, which can be extremely detrimental for eating disorder patients. Even chatbots designed to draw only on vetted sources are prone to what are called “hallucinations,” in which a bot fabricates information and delivers it as fact. For a parent trying to assess their child for eating disorder risk, a patient researching treatment options, or a provider looking up the medical criteria for admitting a patient to the hospital, this sort of false information can have dire consequences.
Lastly, chatbots simply can’t deliver the support that has been proven to help people recover from eating disorders. Though eating disorders are mental illnesses, the most important initial focus of treatment is behavioral modification. This means helping patients normalize their eating habits, stop eating disorder behaviors (like restricting, binge eating, purging, or compulsive exercise), and restore their weight if necessary. While a chatbot might be able to advise someone to do all these things, it cannot supervise or support them in doing so; and without that accountability and wraparound support, someone in the throes of an eating disorder is unlikely to simply stop their disordered behaviors cold turkey.
Human connection is another vital part of eating disorder recovery. Research shows that the support of loved ones greatly increases a person’s likelihood of recovery, and clinical guidelines recommend that family members be involved in care. Mentorship — connecting with someone who has been through recovery themselves — has also been shown to increase a person’s likelihood of achieving full recovery. Chatbots, no matter how intelligent, simply can’t provide this sort of emotional, human element, and likely won’t be able to for a long time, if ever. Eating disorder care without empathy will always fall short.
This isn’t to say that AI has no place in the eating disorder treatment field. The technology is powerful and expansive, and we’re only just beginning to scratch the surface of what it might be able to do. And even in its current state, it’s easy to see use cases for it.
For instance, AI could be programmed to send “nudges” to people in treatment, using patterns and other input to sense when they may be in a high-risk situation (at a restaurant, for instance, or at the gym) and suggesting healthy coping techniques. AI could help direct patients and loved ones to the most relevant resources, giving people timely and tailored information. AI could be used to synthesize and summarize a patient’s different treatment options, comparing and contrasting them across an endless list of variables, like cost, efficacy, time-intensiveness, and more. Dietitians could leverage AI to put together individualized meal plans. Researchers could use AI to summarize large bodies of academic literature and identify new areas of research. Physicians could use AI-powered algorithms to generate highly individualized weight goals for patients. And so on.
In the United States alone, some 30 million people will be affected by an eating disorder in their lifetime, and there’s a nationwide shortage of mental health professionals (not to mention mental health professionals who are also trained in treating eating disorders). This is to say: there’s a great deal of work to be done, and AI offers a promising way to lighten the load of both eating disorder treatment providers and the patients they serve.
AI isn’t bad. It is likely to be a powerful tool as companies like Equip and our colleagues in the field work to transform the eating disorder treatment landscape and deliver effective care to everyone who needs it. But in order to ensure we do more good than harm, it’s essential to stay aware of the limitations of GPT and similar technologies. We need to apply thoughtful scrutiny, honestly acknowledge their mistakes and failures, and adjust course accordingly.
At Equip, our treatment is data-driven and built on technology — we know that in order to tackle a problem as massive as the eating disorder crisis, we need technology — but it’s powered by humans. Learn more about what Equip’s evidence-based care looks like, and how we balance the human and the automated to deliver eating disorder care that works.