Article summary:
- AI enhances psychology by streamlining tasks, improving diagnostics and offering new therapeutic tools.
- Key uses include transcription tools and mood tracking via wearables.
- Ethical considerations: client consent, data security and privacy are essential.
- Future innovations: decoding dreams and internal voices for PTSD and schizophrenia treatment.
- Practical preparation: focus on secure tools, stay informed on AI advancements and explore APS cybersecurity, privacy and data-protection CPD resources.
- AI offers opportunities to improve care, but thoughtful integration is key to maintaining trust and efficacy.
- APS is taking proactive steps to address the evolving challenges and opportunities brought about by AI, via research, advocacy work and learning opportunities. See bottom of article for the full list.
From enabling clients’ moods to be tracked in real-time to assisting dream decoding and the management of complex trauma, the potential applications of AI in psychology are both vast and astonishing.
We are on the cusp of revolutionary AI technology entering the mainstream psychology field – including innovations with the potential to fundamentally reshape how psychologists connect with and treat their clients.
"As the practice of psychology goes, AI has a lot to offer us," says Dr Oliver Guidetti MAPS, a postdoctoral research scientist for the Cyber Security Research Cooperative and the Edith Cowan University Experimental Psychology Unit.
The applications range from streamlining administrative tasks, freeing up valuable time for psychologists, to transformative advancements in diagnosing and treating clients.
According to Dr Guidetti, this not-so-distant future could redefine the practice of psychology forever. With due consideration paid to the security, privacy and ethical risks that may be associated with this technology, the next three to five years could see some incredible advancements in this area.
However, before we dive into some of these incoming innovations, let's explore some of the current AI ‘use cases’ that psychologists could apply immediately.
Psychology use cases
A common use case many psychologists may already be experimenting with is using AI technology to transcribe sessions with clients.
"Gone are the days where voice-to-text technology is terrible," says Guidetti. "It's now about 90 to 96 per cent decent. That allows you to be much more present and in the moment [with your clients]."
It's important to acknowledge that some clients may feel hesitant about using such technology, making consent an essential, non-negotiable aspect of its implementation.
Psychologists, too, might share concerns, particularly around data security and the handling of sensitive information.
"We don't want client data being pushed out into the ether," he says.
For example, Dr Guidetti points out there are platforms which are "fantastic" at identifying multiple speakers in a conversation, "which is great for things like couples counselling or family counselling – but the downside is that it sends the recording to a server somewhere that no one really knows about. There's no good, clear answer about what might be happening with that client data.”
This isn't to say psychologists should avoid transcription programs altogether. It just means they might need to spend more time searching for the right tools and, in some instances, build on their cybersecurity and privacy knowledge to help them identify and avoid risks.
In August 2024, Ahpra released guidance about health practitioners' use of AI. Ahpra recognised that the safe use of AI in healthcare has "significant potential to improve health outcomes and create a more person-centred health system," but set out key principles for practitioners to keep in mind to meet their professional obligations.
Psychologists should review this guidance and seek professional supervision or consultation as needed.
Access APS's cybersecurity, privacy and data-protection CPD here.
"There are alternative [platforms] out there. There's one called Ollama, which can run offline on a laptop, meaning it won't send [the data] anywhere."
While these offline systems can be much more complex to set up, they are worth investing time into learning about, as this can give both yourself and your clients peace of mind.
Psychologists should encourage continuous discussion and debate around the ethical concerns and solutions involving this technology, says Dr Guidetti, particularly within student cohorts because "they are the psychologists of tomorrow".
"[This way] they won't be afraid to adopt these new technologies. But we do need to provide them with some cautionary messages, like, at the very least, know where the data is going and to make sure you're not using your personal account to log into these services."
There are also incredible opportunities to be gleaned from the proliferation of wearable technology, says Dr Guidetti.
"If a patient with any kind of mood disorder has a recently released smartwatch, you can continuously track their emotionality."
Again, the major caveat here is that all of this needs to occur with the client's express and ongoing consent.
"For someone who has deep depression, clinically speaking, the idea is to try and stabilise them. So, in this example, it is very possible for a psychologist to get an alert if there's a massive deviation [in mood] – positive or negative."
Of course, there are considerations to keep in mind with the adoption of mood-tracking software like this, such as the potential to create further demands on psychologists' personal and professional time. Dr Guidetti notes that it's important for psychologists to have "the right to disconnect" from their work too, so it's worth considering boundaries that might need to be put in place.
The next question becomes how to use that data.
"You could use it to detect large deviations. For example, for a client who has a sudden burst of energy whenever they pass by a casino, that can be noted, or the patient with depression who has been stable for a while and then suddenly has an extremely bad day, maybe that's worth a welfare check. This is something a psychologist could start doing tomorrow."
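To make the deviation-alert idea concrete, here is a minimal sketch in Python using only the standard library. Everything in it is an assumption for illustration (the function name, the seven-reading baseline window and the z-score threshold are invented, not part of any clinical product or wearable API): a reading is flagged when it sits far outside the client's own recent baseline, in either direction.

```python
from statistics import mean, stdev

def flag_deviations(scores, window=7, threshold=2.5):
    """Flag readings that deviate sharply from the trailing baseline.

    scores: chronological mood readings (e.g. daily 0-10 self-reports
    or a wearable's affect index). A reading is flagged when it lies
    more than `threshold` standard deviations from the mean of the
    preceding `window` readings. Both parameters are illustrative only.
    """
    alerts = []
    for i in range(window, len(scores)):
        baseline = scores[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # perfectly flat baseline; z-score is undefined
        z = (scores[i] - mu) / sigma
        if abs(z) > threshold:
            alerts.append((i, scores[i], round(z, 2)))
    return alerts

# A stable week followed by one extremely bad day:
readings = [6, 6, 7, 6, 6, 7, 6, 1]
print(flag_deviations(readings))  # the final reading is flagged
```

Because the baseline is the client's own recent history rather than a population norm, the same code flags a sudden positive swing just as readily as a negative one, matching the "positive or negative" deviation Dr Guidetti describes.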
This technology could also be used to help track longer-term changes, he adds.
"For a patient with depression or anxiety, you might use [that data as a signal] for whether that person is actually getting better. Is the intervention you're applying having long-term gains? It makes it less artisanal and much more data-driven. These data sets are as robust as a thermometer measuring temperature."
Dr Guidetti is particularly excited about how AI technology might enhance urgent care needs.
"Urgent care needs quite particular interventions. I can see that the tooling around that is probably going to reach a point where it's not offloading the need for a psychologist, but it is going to save a lot more lives, in the sense that people who need help will get it immediately and maybe not necessarily from a human agent.
"It's the capacity to deliver these interventions at scale that's going to massively augment our work which, given our workforce shortages, is exactly the sort of capability that psychologists need."
Stepping into the mind
Beyond the everyday applications of AI, there are also groundbreaking innovations on the horizon that feel more like scenes from a sci-fi movie than real life.
"One of Jung's desires was to 'rescue dream content for the clinic.' He saw dreams as being clinically informative. For example, for disorders like PTSD or complex trauma, dream content can be of significance.
"We can actually use various methods of AI, along with a wearable EEG, that can decode information about dreams and play it back like a video. I've seen it be done. It is truly a Black Mirror capability."
Researchers in Japan have recently made "massive strides" in this space by learning how to decode video from the occipital lobes, he says.
"The internal voice can also be decoded," he says. "That has implications for disorders like schizophrenia, where voices are heard in the head. What exactly this means for a psychologist, I'm not sure. But I think psychologists and psychiatrists could benefit from the extraction of that information."
While these advancements are promising, they also raise important ethical and practical concerns. For example, how might decoding private thoughts or dreams impact patient autonomy and confidentiality? Additionally, there is limited evidence on the clinical efficacy of these technologies, as they are still in their infancy.
Early research shows potential, but further studies are needed to confirm their reliability and applicability in therapeutic settings. Psychologists must weigh these risks and consider how best to use such technologies responsibly and in line with their ethical responsibilities.
Dr Guidetti believes this could become mainstream technology in under five years – and that's his conservative answer.
"It's just a matter of time on the research that will get us there."
While he says these technologies can “feel like feats of magic”, they don’t actually understand biology or neuroscience.
“The way that AI works is that the more time you put into building the AI, the better and better it gets. It's a matter of the volume of data that informs it.”
You can watch this video to learn more about how this technology works.
How to prepare?
This is complex technology, but there's no need for its use to be.
"There's no need to learn how to code. It's more of a case of looking for the right tools that are going to best support your ability to do your work.
"Have an open mind, explore the tools and make sure that the data security is not going to be a concern – you can look at a company's privacy policies for that information – and then experiment with some tools."
Dr Guidetti suggests looking at leaders in the field of AI and following their work to help you stay abreast of important advancements. It's also important to defer to Ahpra's guidance and seek supervision from other psychologists who may already be using AI technology in practice.
With proper consideration of ethical, security and privacy concerns, these emerging AI technologies hold the potential to transform the practice of psychology, enabling psychologists to deliver more effective, data-driven and accessible care. As these innovations continue to evolve, embracing them thoughtfully could unlock new ways to support clients and address complex challenges in the field.
What the APS is doing
APS is taking proactive steps to address the evolving challenges and opportunities presented by advancements in technology, cybersecurity and artificial intelligence. Here’s how we are leading the way:
- Educating members on cybersecurity and data protection: The APS hosted a webinar on cybersecurity, privacy and data protection, equipping psychologists with the knowledge to safeguard sensitive client information in a digital world.
- Exploring the intersection of AI and psychology: At the 2023 AI & Psychology Members’ Symposium, international experts discussed the profound implications of artificial intelligence for the field of psychology. Members can continue to engage with this critical topic by watching the symposium on-demand. (6.5 CPD hours).
- Calling for government investment to help psychologists use and understand AI effectively: In our 2024-25 Pre-Budget Submission, we sought funding from the Federal Government for the APS to develop and deliver training and learning opportunities to help the psychology and mental health workforce to use AI to support their clinical practice. In addition, we proposed that the APS coordinate a psychologist-led project to understand the full psychological impacts of AI on young Australians. In recent submissions, we have reiterated the urgent need for this investment.
- Advocating for safe and responsible AI: In a 2024 submission to the Department of Health and Aged Care, the APS recommended targeted regulatory reform which would allow the benefits of AI in healthcare to be made available to all Australians and not just to advance the interests of commercial AI providers.
- Advancing research into digital mental health services: The APS is driving research into digital mental health services to ensure evidence-based practices remain at the forefront of this rapidly growing area.
- Facilitating peer learning and discussion about AI and technology: The APS ePsychology Interest Group has been active in discussing AI use in clinical practice. There have also been insightful conversations on PsyCommunity about AI for psychologists and examples of how AI can benefit psychologists.
These initiatives demonstrate our commitment to ensuring psychology remains adaptable, ethical and effective in an increasingly digital world. Together, we can shape the future of our profession in ways that prioritise both innovation and the wellbeing of those we serve.