

Can AI be used to treat mental health conditions? HMRI aims to find out


This news article originally appeared in the Newcastle Herald and has been republished with permission.

The ability of artificial intelligence (AI) to fill the role of human mental health professionals will be examined in a study.

About 100 people will be recruited for the joint University of Newcastle and Hunter Medical Research Institute (HMRI) project.

Participants will be asked to rate AI and human responses to questions and scenarios related to mental health and addiction.

A psychologist and a social worker from the project's research team will provide written responses to the questions.

"The AI will generate responses to the same questions," said Louise Thornton, of HMRI's Healthy Minds Research Program.

Participants will be asked in an online survey to rate the responses on whether they seem trustworthy, accurate and empathetic.

"We don't tell the participant which response is an AI chatbot or real person," Dr Thornton said.

"We're trying to work out if an AI chatbot can get close to providing empathetic and accurate responses."

About 120,000 people in the Hunter live with high levels of psychological distress.

And about 18,000 people in the region aged 25 to 64 live with a severe mental illness.

Federal data showed there was a 32 per cent shortfall in mental health workers in 2022, which was expected to rise to 42 per cent by 2030.

The Newcastle study will examine if AI could help fill this gap.

In a 2023 University of California, San Diego study, ChatGPT was used to generate responses to questions posted to a Reddit medical forum.

The study found that healthcare professionals, who evaluated the responses for quality and empathy, preferred the chatbot's replies to the doctors' in 79 per cent of the 195 exchanges studied.

The study concluded that more research could assess "if using AI assistants might improve responses, lower clinician burnout and improve patient outcomes".

The Australian Psychological Society said in February that AI could lead to better mental health services being available to "more people than ever before".

"Psychologists can work hand-in-hand with this technology," society chief executive Zena Burgess said.

"At the same time, AI is challenging because it presents unprecedented, and possibly unforeseeable, risks from unethical application."

Dr Burgess added that "research-informed integration of AI with human-based models of care is vital".

Dr Thornton said it remained to be seen if AI could "effectively and safely deal with complex issues like mental health and substance use".

"We're not willing to take that risk without doing our due diligence and taking things slowly. This project is a first step."

The research team previously developed a social networking site called Breathing Space, where people can seek support from clinicians and each other to improve wellbeing and resilience.

"We use it in our trials to support people using our online digital health resources," Dr Thornton said.

"It offers people a safe space to connect. However, manual moderation is resource-intensive, which limits our scalability."

By training an AI model, the research team will test whether it can make that support more effective, particularly by being available around the clock.

The study is open to people over 18. Visit the website for details.