
Eating Disorder Helpline Fires Staff, Transitions to Chatbot After Unionization

by Chloe Xiang

Executives at the National Eating Disorders Association (NEDA) decided to replace hotline workers with a chatbot named Tessa four days after the workers unionized.

NEDA, the largest nonprofit organization dedicated to eating disorders, has run the helpline for the last twenty years, providing support to hundreds of thousands of people via chat, phone call, and text. NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. "But do not be fooled: this isn't really about a chatbot. This is about union busting, plain and simple," helpline associate and union member Abbie Harper wrote in a blog post.

According to Harper, the helpline is composed of six paid staffers, a couple of supervisors, and up to 200 volunteers at any given time. A group of four full-time workers at NEDA, including Harper, decided to unionize because they felt overwhelmed and understaffed.

"We asked for adequate staffing and ongoing training to keep up with our changing and growing Helpline, and opportunities for promotion to grow within NEDA. We didn't even ask for more money," Harper wrote. "When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won on March 17. Then, four days after our election results were certified, all four of us were told we were being let go and replaced by a chatbot."

The chatbot, named Tessa, is described as a "wellness chatbot" and has been in operation since February 2022. The Helpline program will be discontinued starting June 1, and Tessa will become the main support system available through NEDA. Helpline volunteers were also asked to step down from their one-on-one support roles and serve as "testers" for the chatbot. According to NPR, which obtained a recording of the call where NEDA fired helpline staff and announced a transition to the chatbot, Tessa was created by a team at Washington University's medical school and spearheaded by Dr. Ellen Fitzsimmons-Craft. The chatbot was trained to specifically address body image issues using therapeutic methods and only has a limited number of responses.

"The chatbot was created based on decades of research conducted by myself and my colleagues," Fitzsimmons-Craft told Motherboard. "I'm not discounting in any way the potential helpfulness to talk to somebody about concerns. It's an entirely different service designed to teach people evidence-based strategies to prevent and provide some early intervention for eating disorder symptoms."

"Please note that Tessa, the chatbot program, is NOT a replacement for the Helpline; it is a completely different program offering and was borne out of the need to adapt to the changing needs and expectations of our community," a NEDA spokesperson told Motherboard. "Also, Tessa is NOT ChatGBT [sic], this is a rule-based, guided conversation. Tessa does not make decisions or 'grow' with the chatter; the program follows predetermined pathways based upon the researcher's knowledge of individuals and their needs."

The NEDA spokesperson also told Motherboard that Tessa was tested on 700 women from November 2021 through 2023, and that 375 of them gave Tessa a 100% helpful rating. In concluding their evaluation of the study, the researchers wrote that "the success of Tessa demonstrates the potential advantages of chatbots as a cost-effective, easily accessible, and non-stigmatizing option for prevention and intervention in eating disorders."

Harper thinks that the implementation of Tessa strips away the personal aspect of the support hotline, in which many of the associates can speak from their own experiences. "Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference," she wrote in her blog post.

Harper told NPR that many times people ask the staffers if they are a real person or a robot. "No one's like, oh, shoot. You're a person. Well, bye. It's not the same. And there's something very special about being able to share that kind of lived experience with another person."

"We, Helpline Associates United, are heartbroken to lose our jobs and deeply disappointed that the National Eating Disorders Association (NEDA) has chosen to move forward with shutting down the helpline. We're not quitting. We're not striking. We will continue to show up every day to support our community until June 1st. We condemn NEDA's decision to shutter the Helpline in the strongest possible terms. A chat bot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community," the Helpline Associates United told Motherboard in a statement.

Motherboard tested the currently public version of Tessa and was told off the bat that it was a chatbot. "Hi there, I'm Tessa. I am a mental health support chatbot here to help you feel better whenever you need a stigma-free way to talk - day or night," the first text read. The chatbot then failed to respond to any texts I sent, including "I'm feeling down," and "I hate my body."

Though Tessa is not GPT-based and has a limited range of what it can say, there have been many instances of AI going off the rails when applied to people in mental health crises. In January, a mental health nonprofit called Koko came under fire for using GPT-3 on people seeking counseling. Founder Rob Morris said that when people found out they had been talking to a bot, they were "disturbed by the simulated empathy." AI researchers I spoke to then warned against the application of chatbots on people in mental health crises, especially when chatbots are left to operate without human supervision. In a more severe recent case, a Belgian man committed suicide after speaking with a personified AI chatbot called Eliza. Even when people know they are talking to a chatbot, the presentation of a chatbot using a name and first-person pronouns makes it extremely difficult for users to understand that the chatbot is not actually sentient or capable of feeling any emotions.

Update 5/25: This article was updated with additional information and comment from Tessa lead Dr. Ellen Fitzsimmons-Craft.

Update 5/26: This article was updated with additional information and comment from the Helpline Associates United.
