State of Mind
How artificial intelligence is helping identify mental health concerns for better treatment
For millions of people around the world who were already struggling with mental health issues, the past two-and-a-half years of the coronavirus pandemic have been a further trial. Isolation, a sudden shortage of opportunities to interact with friends or family in person, additional stresses in the workplace or the home, new financial worries, and difficulty in accessing appropriate mental healthcare have taken their toll, experts in the field told The ACCJ Journal.
However, in the battle against mental health complaints, this time of adversity has also served to fast-track development and adoption of a new tool: artificial intelligence (AI). While the technology may be relatively new to the sector, the potential is huge, according to companies that are applying it to assist physicians with diagnosis and treatment.
A Tool for Our Time
AI has come a long way since the first chatbots and early mental health monitoring apps appeared in the 1990s, explained Vickie Skorji, Lifeline services director at the Tokyo-based TELL Lifeline and counseling service. And it is urgently needed, she added.
“When we have something such as Covid-19 come along on a global scale, there is inevitably a sharp increase in anxiety, stress, and depression. The mental healthcare systems that were in place were simply flooded,” she said.
“A lot of companies were already playing around in the area of AI and mental healthcare, but the pandemic has really pushed these opportunities to the forefront,” she explained. “If, for example, a physician is not able to meet a client in person, there are now ways to get around that, and there has been an explosion in those options.”
Not every purported tool is effective, she cautioned, and there will be questions around client confidentiality and keeping data current. Clinicians must also become adept at reading a client’s genuine state of mind, which may differ from the feelings communicated through the technology. On the whole, however, Skorji sees AI as an extremely useful weapon in the clinician’s armory.
Voice Matters
One of the most innovative solutions has recently been launched by Kintsugi, a collaboration between Grace Chang and Rima Seiilova-Olson, engineers who met at the 2019 OpenAI Hackathon in San Francisco. In just a couple of years, the company has gone from a startup to being named in the Forbes list of North America’s top 50 AI companies.
Kintsugi has developed an application programming interface (API), called Kintsugi Voice, that can be integrated into clinical call centers, telehealth platforms, and remote patient monitoring applications. It enables a provider who is not a mental health expert to support someone whose speech indicates they may require assistance.
Rather than relying on natural language processing (NLP), Kintsugi’s machine learning models focus on voice biomarkers: signals in speech that are indicative of symptoms of clinical depression and anxiety. Producing speech involves the coordination of many cognitive and motor processes, and the resulting acoustic signal can offer insight into a person’s physical and mental health.
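Kintsugi has not disclosed how its models work, but the general idea behind voice-biomarker screening can be sketched in a few lines of Python: summarize a short speech clip as a handful of acoustic features, then score it with a classifier trained on clinician-labeled examples. Everything in this sketch, from the file names to the choice of features and classifier, is an assumption for illustration only, not Kintsugi’s method.

```python
# Illustrative sketch only: score a speech clip for depression risk from
# simple acoustic features. File names, features, labels, and the classifier
# are hypothetical; this is not Kintsugi's model.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def acoustic_features(path: str) -> np.ndarray:
    """Summarize a speech clip as a small vector of voice features."""
    y, sr = librosa.load(path, sr=16000)                 # mono audio at 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # timbre / articulation
    rms = librosa.feature.rms(y=y)                       # loudness over time
    zcr = librosa.feature.zero_crossing_rate(y)          # rough voicing measure
    # Mean and variability of each feature across the clip
    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        [rms.mean(), rms.std(), zcr.mean(), zcr.std()],
    ])

# Hypothetical clinician-labeled clips: 1 = assessed as depressed, 0 = not
train_paths = ["clip_01.wav", "clip_02.wav", "clip_03.wav", "clip_04.wav"]
train_labels = [1, 0, 1, 0]

X = np.vstack([acoustic_features(p) for p in train_paths])
model = LogisticRegression(max_iter=1000).fit(X, train_labels)

# Score a new caller's clip as a probability for triage, not a diagnosis
risk = model.predict_proba(acoustic_features("new_caller.wav").reshape(1, -1))[0, 1]
print(f"Estimated risk score: {risk:.2f}")
```

In practice, a production system would use far richer features, much larger labeled datasets, and clinical validation; the point of the sketch is simply that the input is how something is said, not what is said.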
In the view of Prentice Tom, chief medical officer of the Berkeley, California-based company, passive signals derived from voice biomarkers in clinical calls can greatly improve speed to triage, enhance behavioral health metadata capture, and benefit the patient.
“Real-time data that augments the clinician’s ability to improve care—and that can be easily embedded in current clinical workflows, such as Kintsugi’s voice biomarker tool—is a critical component necessary for us to move to a more efficient, quality-driven, value-based healthcare system,” he explained. The technology is already in use in the United States, and Japan is on the waiting list for expansion in the near future.
Chang, the company’s chief executive officer, is confident that they are just scratching the surface of what is possible with AI, with one estimate suggesting that AI could help reduce the time between the appearance of initial symptoms and intervention by as much as 10 years.
“Our work in voice biomarkers to detect signs of clinical depression and anxiety from short clips of speech is just the beginning,” she said. “Our team is looking forward to a future where we can look back and say, ‘Wow, I can’t believe there was a time when we couldn’t get people access to mental healthcare and deliver help to people at their time of need.’
“My dream and goal as the CEO of Kintsugi is that we can create opportunities for everyone to access mental health in an equitable way that is both timely and transformational,” she added.
The Power of Data
Maria Liakata, a professor of NLP at Queen Mary University of London, is also joint lead of the NLP and data science for mental health groups at the UK’s Alan Turing Institute. She has studied the use and effectiveness of AI in communicating with the public during a pandemic.
Liakata’s own work has focused on developing NLP methods to automatically capture changes in individuals’ mood and cognition over time, as manifested through their language and other digital content. This information can be used to construct new monitoring tools for clinicians and individuals.
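The article does not detail Liakata’s models, but the underlying idea of tracking mood over time can be illustrated with a deliberately simple sketch: score each dated post with a crude word-list measure, smooth the scores with a rolling average, and flag sustained drops for human follow-up. The word lists, window size, and threshold below are invented for illustration and bear no relation to the research methods themselves.

```python
# Toy illustration only: track a mood proxy over time from dated posts and
# flag a sustained drop. Word lists, threshold, and window are invented.
import pandas as pd

NEGATIVE = {"exhausted", "hopeless", "alone", "worthless", "tired", "anxious"}
POSITIVE = {"happy", "grateful", "excited", "calm", "hopeful", "good"}

def mood_score(text: str) -> int:
    """Crude lexicon score: positive words add one, negative words subtract one."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = pd.DataFrame({
    "date": pd.to_datetime(["2022-05-01", "2022-05-03", "2022-05-07",
                            "2022-05-10", "2022-05-14", "2022-05-18"]),
    "text": ["Feeling good and grateful today",
             "Busy week but calm overall",
             "So tired and a bit anxious lately",
             "Exhausted again, feeling alone",
             "Hopeless about work, tired all the time",
             "Still exhausted and anxious"],
}).set_index("date")

posts["score"] = posts["text"].apply(mood_score)
# A rolling mean over the last three posts smooths day-to-day noise
posts["trend"] = posts["score"].rolling(window=3, min_periods=3).mean()

# Flag a sustained low trend for follow-up by a clinician, not a diagnosis
flagged = posts[posts["trend"] <= -1]
print(flagged[["score", "trend"]])
```

Real systems replace the word lists with learned language models and must handle consent, privacy, and shifts in how people write across platforms, but the longitudinal framing is the same: it is the change over time, not any single post, that carries the signal.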
But, she said, a couple of other projects have caught her eye.
One is Ieso Digital Health, a UK-based company that offers online cognitive behavioral therapy for the National Health Service, utilizing NLP technology to analyze sessions and provide data to physicians. And last October, US-based mental and behavioral health company SonderMind Inc. acquired Qntfy, which builds tools powered by AI and machine learning that analyze online behavioral data to help people find the most appropriate mental health treatment.
“There has definitely been a boom over the past few years in terms of the development of AI solutions for mental health,” Liakata said. “The availability over the past 10 years of large online forums where individuals share experiences about mental health-related issues has certainly helped in this respect. The first work that came to my attention and sparked my interest in this domain was a 2011 paper from Cincinnati Children’s Hospital on constructing a corpus of suicide notes for use in training machine learning models.”
Yet, as is the case during the early stages of any technology being implemented, there are issues that need to be ironed out.
“One big hurdle is the availability of good quality data, especially data over time,” she continued. “Such datasets are hard to collect and annotate. Another hurdle is the personalization of AI models and transferring across domains. What works well, let’s say, for identifying a low mood for one person may not work as well for other people. And there is also the challenge of moving across different domains and platforms, such as Reddit versus Twitter.
“I think there is also some reluctance on the part of clinicians to adopt solutions, and this is why it is very important that AI solutions are created in consultation with clinical experts.”
Over the longer term, however, the outlook is positive, and Liakata anticipates the deployment of AI-based tools to help with the early diagnosis of a range of mental health and neurological conditions, including depression, schizophrenia, and dementia. These tools would also be able to justify and provide evidence for their diagnosis, she suggested.
To Assist, Not Replace
Elsewhere, AI tools will be deployed to monitor the progression of mental health conditions, summarize that progression with supporting evidence, and suggest interventions likely to be of benefit. These would be used both by individuals, to self-manage their conditions, and by clinicians.
Despite all the potential positives, Skorji emphasizes that AI needs to be applied in conjunction with in-person treatment for mental health complaints, rather than as a replacement.
“The biggest problem we are seeing around the world at the moment is loneliness,” she said. “Technology is useful, but it does not give people access to people. How we deal with problems, what the causes of our stress are, how we can have healthy relationships with other people—we are not going to get that from AI. We need to be there as well.”