It seems like everybody is invoking "AI" as a cure-all for any problem, no matter how large or small. I need help with a literature review—let this AI tool help you. My calendar is a mess—let AI organize it for you. I need help with routine tasks—let an AI assistant get those cleared up for you. We're at the point where people who aren't even tech-savvy are espousing the virtues of AI. So, what do psychiatrists need to know when using these powerful tools, and what are the best use cases?
For those of you who are willing to be vulnerable and admit that you don't know exactly what AI means, let's start with the American Psychiatric Association's (APA) definition. Artificial intelligence (AI) is "a machine-based system that can perform tasks that otherwise would require human intelligence, including making predictions, recommendations, or decisions." The association prefers the term "augmented intelligence," implying that AI should assist and support, not be all-powerful.
“The role of AI is not to take the place of psychiatrists but to help us accomplish tasks we don’t need to be performing ourselves,” says Todd Peters, MD, senior vice president, chief medical officer, and chief medical informatics officer at Sheppard Pratt. Dr. Peters also serves as chair of the Resource Group on Artificial Intelligence and co-chair of the Health Information Technology Committee for the American Academy of Child and Adolescent Psychiatry (AACAP).
Acknowledging the “different flavors” of AI is necessary as psychiatrists consider how actively and purposefully they can utilize it in their day-to-day workflow, he says.
Could AI provide administrative relief?
Per the APA, most psychiatrists spend just 60% of their time with patients. Physicians in general report spending several hours per week documenting patient care and interactions in electronic health records (EHRs), a burden that contributes to burnout and decreased job satisfaction.
Dr. Peters posits that perhaps the best use of AI is as a clinician’s multitasking personal assistant. “AI can help us with tasks that don’t require ‘top of license’ skills, such as return-to-school notes or prior authorizations,” he says.
Clinicians can also benefit from HIPAA-compliant ambient listening tools that are integrated into their EHR systems. “I can say, ‘I’m seeing Susie, and I’d like you to order a CBC for next week at the lab,’” he says. “AI can take care of this administrative action item by listening, summarizing, and writing the note for me.”
The APA corroborates this, noting that AI-assisted point-of-care documentation and alerts can reduce medical errors.
Given the national psychiatrist shortage, Dr. Peters supports using one’s time most efficiently. “Ideally, if we could whittle down documentation time, we could see more patients and provide more direct care. I’m in support of using a tool that prevents me from getting five weeks behind on notes.”
Safeguarding sensitive data
An APA survey also noted that "beyond concerns about accuracy, veracity, and bias of large language model (LLM) tools," the issue of patient privacy looms large. People are tempted to share sensitive information with conversational bots, which can respond to even a cursory "thank you" like a new best friend.
Dr. Peters worries about entering patients’ protected health information into large open-source data sets and violating HIPAA. “Each version of a dataset can pull from vast quantities of data and continues to expand as people put more data into it that could be extracted.” The question remains: How do we create secure, walled-off spaces? “It’s easy to think, ‘I’m going to take this string of data and see what AI generates,’” he says. “A lot of clinicians are sitting some of this out to observe where privacy concerns go.”
Tomorrow is here
Experts predict that AI's use in mental healthcare will continue to advance quickly, even though standards for its use are scarce and the field is underregulated. Its promise is being weighed with a healthy dose of skepticism, even as its proponents can barely restrain themselves from shouting from the rooftops.
For example, AI may help fill a massive void for underserved groups, such as youth. As a late 2023 study in JAMA Pediatrics acknowledged, fewer than 20% of children receive the mental healthcare they need from a specialist. Many clinicians believe AI could make a big difference in access to care.
The study also notes that children and adolescents use technology more comfortably than other groups, and that the anonymity of a nonjudgmental AI can help them overcome stigma around their mental health.
Caution, paired with asking and answering questions, remains central to judicious use of this new technology, Dr. Peters says. “There’s a lot of hope and optimism that this can curtail ‘the busy work.’ It’s all very exciting.”
How are you using AI in your clinical practice?
We want to hear from you!
Featured Expert
Todd Peters, MD
Senior Vice President and Chief Medical Officer
Specialties: Child and Adolescent Psychiatry, Electroconvulsive Therapy (ECT), LGBTQ+ Mental Health Issues, Medical Informatics, Pediatric Mood Disorders, Pediatric Anxiety Disorders