Software detects changes in the clinical states of patients with bipolar disorder, schizophrenia, and depressive disorders from voice data as accurately as attending doctors
The USC Signal Analysis and Interpretation Lab (SAIL), a center that focuses on analyzing signals by people, from people, and for people, has found in a new collaboration with UCLA that AI can detect changes in clinical states from speech as accurately as physicians. The study appears in PLOS ONE.
SAIL, which has long applied AI and machine learning (ML) to identify and classify video, audio, and physiological data, partnered with researchers at UCLA to analyze voice data from patients being treated for serious mental illnesses, including bipolar disorder, schizophrenia, and major depressive disorder. These individuals and their treating clinicians used MyCoachConnect, an interactive voice and mobile tool created and hosted on the Chorus platform at UCLA, to record voice diaries about their mental health states. SAIL researchers then applied custom AI software to hundreds of these recordings to detect changes in the patients’ clinical states; the system’s assessments matched clinicians’ ratings of the same patients.
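The study’s actual features and models are not described here, but the general shape of such a pipeline can be sketched: summarize each voice diary as a fixed-length set of acoustic features, then fit a per-patient model against clinician ratings. Everything in the sketch below is illustrative and assumed, not the study’s implementation: the feature choices (MFCCs, pitch, loudness), the file paths, the binary labels, and the random-forest model are all stand-ins.

```python
# Illustrative sketch only: one plausible pipeline for predicting
# clinician ratings from voice diaries. Not the study's actual code.
import glob

import librosa  # audio loading and feature extraction
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def diary_features(wav_path: str) -> np.ndarray:
    """Summarize one voice diary as a fixed-length acoustic feature vector."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # spectral shape
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)       # pitch contour
    rms = librosa.feature.rms(y=y)                      # loudness
    # Mean/std over time gives a simple per-diary summary.
    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        [np.nanmean(f0), np.nanstd(f0)],
        [rms.mean(), rms.std()],
    ])


# Hypothetical data layout: a folder of WAV diaries for one patient, plus
# one clinician rating per diary (e.g. 0 = stable, 1 = state change).
wavs = sorted(glob.glob("patient_01/diaries/*.wav"))
X = np.vstack([diary_features(w) for w in wavs])
y = np.load("patient_01/clinician_ratings.npy")

# Personalized model: trained and evaluated within a single patient,
# mirroring the study's emphasis on individual-level patterns over time.
model = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV agreement with clinician:", cross_val_score(model, X, y, cv=5).mean())
```

Training within a single patient, rather than pooling everyone, is one way to realize the personalization the researchers describe: the model learns what “baseline” sounds like for that individual and flags deviations from it.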
“ML allowed us to illuminate the various clinically-meaningful dimensions of language use and vocal patterns of the patients over time and personalized at each individual level,” said senior author Dr. Shri Narayanan, Niki and Max Nikias Chair in Engineering and Director of SAIL at the USC Viterbi School of Engineering.