AI chatbot for weight loss prompts warning: APS in Australian Financial Review

This article is featured in The Australian Financial Review and is republished with permission.

Eucalyptus-owned medical weight loss company Juniper says it has trained an AI chatbot to provide advice and coaching to its users on topics such as protein intake, medication use, and plateau management.

Juniper specialises in providing weight loss advice and medical treatments similar to Ozempic to women. It recently introduced an AI bot, named June, which handles 98.94 per cent of Juniper’s conversations with customers and has been used by 44 per cent of the application’s total user base.

But experts warned that using AI for sensitive topics such as weight loss, which can trigger serious mental health issues, must be carefully managed.

Nicole Liu, health futurist and principal product manager at Eucalyptus, said June had been strictly programmed to advise only on topics it had been trained on, and to redirect the user to a human if the discussion strayed beyond its scope. The bot’s responses had been tested for accuracy, tone and appropriateness.

“We’re really clear on the scope that June has or does not have,” Liu said. “Obviously, when it comes to healthcare, we have to be really careful. June is not a health coach, and also not a human.”

Liu said the AI bot trainers started with a clear understanding of their philosophy for weight loss and taught June the answers to common questions. From there, the AI was tested and developed to ensure June was formulating correct responses.

“It’s pulling from our vetted knowledge base that is passed in by all our clinicians and approved by our advisers, and that is the source of truth,” Liu said. “Then that goes through a really rigorous evaluation process.”

Australian Psychological Society chief executive Zena Burgess said AI could be a valuable resource, but warned that AI interventions must be carefully safeguarded.

“When used for weight management, potential risks of AI may include generic advice that ignores medical and psychological conditions and medications as well as disordered eating triggers, which could reinforce existing maladaptive thought patterns and behaviours,” Burgess said.

Burgess worried that there was no clear assurance about how AI models such as June identified and learned from errors.

“Getting things wrong in a therapy context can be very dangerous, especially in crisis situations. In the context of weight management, this may apply to detecting eating disorders.”

Liu said June was cautious in its approach and was trained to defer to humans at an early stage.

“Anything to do with side-effects, medication, the more emotional eating side of things, anything that starts to be a lot more nuanced and requires a lot more clinical knowledge, gets escalated very quickly,” Liu said.

The professionals who trained June now have more purposeful roles, she said. Instead of answering questions such as “Can I eat this chocolate bar?” or “I don’t feel hungry, is that normal?” the coaches and advisers provide advice to people with more complex questions.

“We’re not trying to replace jobs, we’re trying to change them,” Liu said. “We’re trying to free up and unlock the time of these practitioners so they’re able to really give high-impact care and leverage the best out of their qualifications.”