We are in a world of changing technologies. Many new inventions and uses for technology are shaping how we treat patients and adapt their care. We asked a few companies how they are using technology and voice AI to address mental health concerns. We heard from Ellipsis Health, Lyssn, Kintsugi, and OPTT Health about their technology and how it has impacted patient care.

Ellipsis Health

Mainul Mondal, Founder and CEO of Ellipsis Health, gave the following responses. 

How is Ellipsis Health leveraging technology for early identification of depression and suicidal ideation? 

Today, we are leveraging our technology specifically for the early identification of the severity of anxiety and depression - either of these, left untreated, can lead to a deterioration in condition or crisis, including suicidal ideation. We do this by leveraging the unique power of the human voice and artificial intelligence to identify people in need and then connect them to integrated behavioral health services and personalized care pathways.  

Now, through our groundbreaking technology, we have pioneered the only clinically-validated vital sign for mental health. Our solution generates a clinical-grade assessment of the severity of anxiety and depression by analyzing a short voice sample - creating the first objective, personalized, and scalable clinical decision support tool for mental health. With the use of our technology, we can help payers, providers, and digital care platforms shorten the time to diagnosis, increase efficiencies, reduce costs, and improve patient outcomes.
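
Ellipsis Health has not published its model internals, but the clinical decision support pattern described above can be sketched at a high level: a voice-derived severity score for each condition is mapped to a severity band and a suggested care pathway. The Python sketch below is purely illustrative; the thresholds and pathway names are hypothetical placeholders, not Ellipsis Health's clinical logic.

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    """Hypothetical output of a voice-based severity model (0.0-1.0 per condition)."""
    depression_severity: float
    anxiety_severity: float

def suggested_pathway(a: Assessment) -> str:
    """Map severity scores to an illustrative care pathway for triage.

    The thresholds and pathway names are placeholders, not Ellipsis Health's
    clinical logic.
    """
    worst = max(a.depression_severity, a.anxiety_severity)
    if worst >= 0.75:
        return "escalate to a clinician for prompt evaluation"
    if worst >= 0.50:
        return "refer to telehealth/teletherapy"
    if worst >= 0.25:
        return "offer guided self-help (e.g., meditation or stress management app)"
    return "routine between-visit monitoring"

print(suggested_pathway(Assessment(depression_severity=0.62, anxiety_severity=0.41)))
```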

Triage/Risk Stratification: Our partners look to us to improve risk stratification as we are able to quickly triage people into appropriate care pathways, improving patient outcomes and efficiency and reducing costs. In the clinical setting, care teams lean on us for early identification of mental health conditions to ensure the patient is triaged for the right care at the right time. Also, through partnerships with digital health platforms that offer self-help tools like meditation and stress management apps, we can identify those in need of more clinical care and help introduce them to options like telehealth and/or teletherapy provided by these platforms. 

Longitudinal Measurement: Through our technology, care teams are able to remotely monitor treatment efficacy and be alerted when a person is experiencing a significant change in the severity of their anxiety or depression - eliminating gaps in care by providing ongoing, in-between-visit monitoring. We are also seeing our technology being used by care teams to support those with comorbidities before and after a major medical event, such as surgery, so the individual can avoid a costly readmission or ED visit.

Increased Engagement: The current status quo for measuring the severity of anxiety or depression is self-reported paper surveys (patient-reported outcomes, or PROs), which are impersonal and unengaging. Our technology creates higher engagement than the current paper assessments. For one large partner, 70% of users surveyed were satisfied with our technology, compared with only 30% for a paper survey.

Do you have any examples/stories/use cases you would like to share? 

We have partnered with a large behavioral health provider organization that has implemented our technology within their clinical treatment workflow to determine its effectiveness and ability to identify emerging crisis events. The study had 90 participants who had a depressive disorder diagnosis and were in treatment, and the treatment team and participants utilized our technology before and/or between sessions. The results were used to inform care providers of patients' progress, or lack thereof, in treatment, as well as to allow clinicians to prioritize patient safety during the study period.

Additionally, in a recent pilot with a digital health provider, users give a short voice sample and, based on their scores, are provided with digital interventions like education and mindfulness activities. Users were highly satisfied with this experience. One found that “answering the questions felt comfortable and inspiring,” while another reported that they were “almost amazed at how good it made [them] feel.” Another user found that “This is easier and more personal. I didn’t feel examined in a non-empathetic way.” Ellipsis Health aims to meet people where they are, asking low-stigma questions geared towards making users feel like they are speaking to a friend. Whether we engage patients actively through an app or passively layer onto care calls, we are providing a simple, comfortable way to improve mental health outcomes through early detection, triage, and monitoring.

What do you think needs to happen to make this technology more widespread? 

Once large healthcare companies, payers, and providers adopt innovative technologies like ours, the impact of the technology becomes greater - reinforcing both the financial impact and improved patient outcomes. Armed with the power of data, physicians will be able to drive measurement-based care for mental health - shortening the time to diagnosis, increasing efficiencies, reducing costs, improving patient outcomes, and saving lives.


OPTT Health

Mohsen Omrani, MD, PhD, CEO & President of OPTT Health, gave the following responses.

How is OPTT leveraging technology for early identification of depression and suicidal ideation?

To address this challenge, OPTT has developed a novel Natural Language Processing (NLP) algorithm by combining a custom classifier with a publicly available deep learning Transformer model. This innovative algorithm performs clinically relevant natural language comprehension, meaning it evaluates the relationship of a textual statement to a set of CBT-relevant concepts. These concepts include depression, anxiety, and the five-part elements of CBT: situation (positive or negative), thought (positive or negative), emotion (positive or negative), behaviour, and physical reaction. For instance, when the algorithm is given the statement “I feel my life is challenging most days. I feel irritable and down,” it produces the following output: the statement is 73% related to depression, 68% to anxiety, and 59% to negative thoughts. The percentage in the output is referred to as the “Symptomatic Score” and indicates the probability our algorithm assigns to the statement belonging or relating to each CBT-relevant concept (i.e., depression, anxiety, and the five-part elements of CBT). This algorithm allows us to objectively evaluate clinically relevant variables reflecting a patient’s mental status, which is essential for developing algorithmic, evidence-based decision-making processes.
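
OPTT has not published its classifier, but the general idea of scoring a statement against a fixed set of CBT-relevant concepts can be sketched with an off-the-shelf Transformer. The snippet below is a minimal, hypothetical illustration using the Hugging Face transformers zero-shot classification pipeline with the facebook/bart-large-mnli model; the label set mirrors the concepts described above, but the model and the resulting scores are placeholders rather than OPTT's proprietary algorithm.

```python
from transformers import pipeline

# Hypothetical illustration: score a statement against CBT-relevant concepts
# with an off-the-shelf zero-shot classifier (not OPTT's proprietary model).
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

CBT_CONCEPTS = [
    "depression",
    "anxiety",
    "positive situation", "negative situation",
    "positive thought", "negative thought",
    "positive emotion", "negative emotion",
    "behaviour",
    "physical reaction",
]

statement = "I feel my life is challenging most days. I feel irritable and down."

# multi_label=True scores each concept independently, mirroring the
# "Symptomatic Score" idea described above.
result = classifier(statement, CBT_CONCEPTS, multi_label=True)

for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.0%}")
```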

Do you have any examples/stories/use cases you would like to share?

There is a huge mismatch between mental health demand and resources, and yet there is no reliable method of matching the scarce resources to those with the greatest needs. For instance, there are around 1,960 patients with mental health problems per psychiatrist in the US - 4-5 times more than the number of patients a psychiatrist can handle per year. This means patients must endure long wait times for their initial evaluation, whether they have a mild problem that would benefit from faster access to lower-level resources or a severe problem that now has to wait a long time for appropriate care. We have designed a triage module that relies on patients’ personal narratives of their problems and challenges to provide clinicians with an accurate diagnosis, predict patient compliance, and suggest courses and levels of treatment.

For instance, an important challenge in mental healthcare is patients’ low compliance in completing their course of treatment. Using our proprietary NLP algorithms, we were able to predict patient dropout with 70% accuracy, 4 weeks before it occurred, in a group of more than 250 patients across 4 clinical trials. This information could help clinicians increase their level of engagement with patients to encourage them to finish therapy. In fact, in our latest clinical trial, in which the level of patient engagement was adjusted throughout treatment, the number of sessions completed by patients increased by 20%, and the number of patients completing the whole round of therapy increased by 35%.
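
The dropout-prediction model itself is proprietary, but the general pattern - predicting later dropout from early, per-session signals - can be sketched with a standard classifier. The example below is a hypothetical scikit-learn illustration trained on synthetic Symptomatic Scores; the features, data, and resulting accuracy are placeholders, not OPTT's model or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic placeholder data: each row is one patient, each column a weekly
# depression Symptomatic Score over the first four sessions; label 1 = dropout.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(250, 4))
y = (X.mean(axis=1) + rng.normal(0.0, 0.15, 250) > 0.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.0%}")
```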

What do you think needs to happen to make this technology more widespread?

Given the absence of robust and precise quantitative measures, evidence-based practices, which are the cornerstones of all progress in the rest of medicine, have been largely missing in mental health. It is essential for the field to focus its attention on developing and using data-driven practices, which makes way for innovative solutions like ours.


Kintsugi

Prentice A. Tom, MD, Chief Medical Officer of Kintsugi, gave the following responses. 

How is Kintsugi leveraging technology for early identification of depression and suicidal ideation?

Kintsugi is the global leader in leveraging AI voice biomarker analytic technology to screen for depression and anxiety across populations. We have developed the technology to quantitatively assess any person’s mental health across these two common conditions and can provide a fully automated and unbiased* assessment of whether the user of our technology suffers from any degree of depression or anxiety. Because our technology is completely non-invasive and scalable, it can be used to truly move the needle in the early identification of these mental health conditions.  

*(with respect to gender, educational level, language spoken, ethnicity, or other patient demographics)

Do you have any examples/stories/use cases you would like to share?

Our Kintsugi Voice Analysis Technology is currently being used across a spectrum of clinical settings. We do not own the patient data, as that information belongs to the clinical partners who are using the technology. Also, because the technology is not dependent on the content of the conversation but rather analyzes the impact of psychiatric conditions on human voice generation (i.e., the production of the sounds that make up the components of speech), we do not require any patient-specific information. Thus, for the above reasons, we do not have access to, nor do we archive, any individual patient stories. We can say that we are currently working with some of the nation’s largest healthcare entities, and our customers have found that their clinicians truly appreciate how our technology augments their ability to identify patients suffering from depression and anxiety disorders. In fact, we are currently ramping up to greatly expand the availability of our voice analytic tool to meet the growing demand of our healthcare industry customers.
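
Kintsugi has not published its model internals, but a content-independent approach like the one described here generally rests on acoustic properties of the speech signal rather than on transcripts. The sketch below, which assumes the open-source librosa library, shows one common way to turn a short voice sample into a content-free feature vector; it illustrates the general technique, not Kintsugi's implementation.

```python
import numpy as np
import librosa

def acoustic_features(wav_path: str) -> np.ndarray:
    """Summarize a short voice sample as content-independent acoustic features."""
    y, sr = librosa.load(wav_path, sr=16000)

    # Spectral and prosodic descriptors that ignore *what* was said and
    # instead capture *how* the speech was produced.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # timbre
    f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)        # pitch contour
    rms = librosa.feature.rms(y=y)                        # loudness/energy

    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        [np.nanmean(f0), np.nanstd(f0)],
        [rms.mean(), rms.std()],
    ])

# A downstream classifier (hypothetical here) would map this feature vector
# to a screening score for depression or anxiety severity.
features = acoustic_features("voice_sample.wav")
print(features.shape)
```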

What do you think needs to happen to make this technology more widespread?

Depression in the elderly, anxiety disorder in our nation’s youth, and burnout in our working-age population are now recognized as serious and growing health concerns. The impact of mental health conditions - as an independent risk factor for diseases such as heart disease and a primary contributor to the cost of care in the US (65% of all recurrent emergency department users have been shown to have some mental health condition), and as a comorbidity to other conditions, such as post-partum depression - is garnering much greater national attention. To date, our inability to quantitatively screen for these conditions and to easily identify those who may be suffering from them has led to mental health conditions being among the most frequently, if not the most frequently, underdiagnosed conditions in healthcare. We need to create policies requiring our primary care clinicians to screen for mental health just as they screen for high blood pressure and diabetes, and we need to reimburse clinicians for performing this screening.

The fact that literally millions suffer needlessly or see other medical problems worsen because of untreated mental health conditions is unacceptable. As patients, we need to demand to be assessed and screened for our mental wellness, just as we are for many physical health conditions. As clinicians, we need to take the lead in making it the standard of practice that all patients receive appropriate mental health screening. On the government and commercial insurance side, we need to recognize how early identification of mental health conditions reduces the total cost of care, and we need to appropriately incentivize mental wellness screening for everyone.

Looking to the Future

The future is bright for the use of voice AI in the behavioral health industry. We look forward to seeing the opportunity expand into other common areas of the health field. This advancement has major potential to help in the evaluation of people in crisis, as well as in matching people to the level of care needed at different levels of intervention. Let’s hope this leading technology brings about a change in how we view and treat mental health.