Can AI therapy help when BIPOC therapists are in short supply?

Last spring, a mixed-race UW–Madison student tried to begin therapy. Like many people seeking mental health support, she expected delays. But after a nearly month-long wait to see a therapist, the sessions, once they started, felt polite but limited, a distance she attributed to a cultural mismatch between her and the provider. 

The student described the provider, a white woman, as a qualified therapist, but not the right match to understand the cultural context behind what she wanted to discuss. 

Switching therapists meant restarting the entire process, so she continued seeing the provider she was assigned to, but said the experience never felt like the right fit. 

Her situation reflects a larger reality. For many people, particularly people of color, finding mental health support that is timely, consistent and culturally responsive remains difficult. Nationwide demand for mental health care continues to increase while the number of BIPOC therapists remains disproportionately low. The National Library of Medicine reports that, as of 2020, 74% of therapists were white. 

Brian Benford, a success coach at the UW Odyssey Project, sees those barriers every day in the work he does counseling students. 

“Culturally competent physical and mental health care is really lacking within Dane County,” he said. 

Even when clients finally find a therapist they feel comfortable with, the typically high cost of therapy becomes another barrier. “If they don’t have insurance, which [is the case] 90% of the time,” and they don’t have BadgerCare, “which doesn’t cover a whole lot,” therapy can be economically out of reach for many, Benford said.

Long waitlists, limited provider options, insurance barriers and cultural mismatches push many people to look for alternatives. In that gap, artificial intelligence mental health tools have become increasingly common. When culturally attuned therapists are not readily available, apps such as Woebot, Wysa and Mindspa offer conversational support, mood tracking, guided prompts and 24/7 availability, all without insurance barriers or long waitlists. 

The shortage of BIPOC therapists shapes nearly every part of the mental health landscape for people of color. Research on cross-racial therapy has found that race can add an extra, sometimes stressful, layer to the relationship between clients and therapists when they come from different racial backgrounds. In these situations, research shows clients can feel less free to talk openly about their race-related concerns, which can make it harder to open up about their main problems in therapy. These disconnects aren’t usually intentional, but they often reflect a lack of lived context behind what a client is describing.

Against this backdrop, and with affordable and timely in-person therapy often difficult to access, the appeal of using AI as a personal coach or therapist is understandable: AI tools respond instantly, don’t judge and are available at moments when traditional support isn’t. 

Wellness educator Angelee Andorfer-Lopez told AFROPUNK, an online platform focused on Black culture and community, that many younger BIPOC users aren’t choosing AI over therapy, but rather they’re using it because the existing system doesn’t meet their needs. AI is what’s available when instant help is needed. 

However, a growing body of research suggests that while AI tools can offer support, they may not be as beneficial as the support patients receive from human therapists. A 2025 study published in the International Journal of Human-Computer Interaction compared people’s experiences with AI-guided emotional conversations to sessions with trained clinicians. Participants reported that AI tools felt more approachable and better at active listening than their human counterparts. The study found that users appreciated the nonjudgmental nature of AI chatbots and felt more willing to express emotions quickly. 

But the same research also found limits: While AI could mirror emotions, offer coping suggestions and give users the feeling that they were being heard, study participants rated human therapists far higher in warmth, nuance, cultural attunement and the ability to make them feel deeply understood. The presence of a real person who can notice tone shifts, body language and cultural subtext is critical for building trust and working through emotions. 

This aligns with what Andorfer-Lopez told AFROPUNK: AI can help people reflect, organize their thoughts or feel less alone in the moment, but it can’t fully replicate relational understanding, cultural context or the sense of safety human clinicians provide. She worries, however, “that if we begin to rely solely on AI for healing, that this only further encourages individualism — something our BIPOC community members are realizing more and more has not been serving us.” 

Despite the growing popularity of these platforms and large language models’ capacity to mimic supportive conversation, they cannot read nonverbal cues, track a user’s emotional state over time or reliably recognize when someone is slipping from everyday stress into crisis. Researchers studying these interactions, including the authors of a 2024 study in JMIR Mental Health, have found that chatbots often miss warning signs a trained clinician would catch.

These shortcomings have raised concerns. Last November, the American Psychological Association issued a health advisory cautioning against the use of AI therapy bots or wellness apps as a replacement for a qualified mental health care provider because of the risks they pose, including misinformation, incomplete assessments and unreliable crisis management. 

Even some app developers are having second thoughts. Joe Braidwood, co-founder of the AI therapy startup Yara, shut down the platform in November after concluding that the technology could not operate safely for vulnerable users. In an interview with Fortune, he said that while AI worked well for stress management and day-to-day emotional processing, it became “dangerous” the moment someone with trauma or acute distress reached out to Yara for help. 

Braidwood said they spent months testing guardrails, crisis-detection tools and safety filters, but even then the AI models struggled to tell the difference between someone needing basic support and someone experiencing a mental health crisis. “Sometimes the most valuable thing you can learn is where to stop,” he said.

Even with unanswered questions about safety, many people are consulting AI for interpersonal or therapeutic advice. A study published in the journal JAMA Network Open last November found that about one in eight adolescents and young adults in the U.S. use chatbots for mental health advice, and 92.7% of respondents to the study’s survey found the advice somewhat or very helpful.

And for people who might have to wait weeks between appointments, having a resource for support or advice available 24/7 can be a big part of the appeal.

That kind of everyday support appeals to Benford, who said he has tried AI therapy himself and “liked it.” Being able to tap questions into his phone and get feedback reminded him of early mindfulness apps developed at UW–Madison and elsewhere: tools that offered consistency and attention when traditional support wasn’t available. 

“Anything that’s going to help someone, I’m really all for it,” he said. 

When asked whether he would ever recommend AI counseling to his Odyssey students as a partner to vent to about routine life stress, he didn’t hesitate: “Oh, absolutely.” He hasn’t yet, but he sees how it could relieve stress for students facing structural barriers and provide them with hope, he said. 

There is also growing interest in how AI might help BIPOC therapists themselves. Some providers frame it as a potential load-lightener that can complete administrative tasks, generate worksheets or create journaling prompts, freeing therapists to spend more time with their patients. If used this way, AI could be what makes human support more available, rather than being relied on to provide support itself. 

The existence of AI therapy points to the needs that remain unmet: timely access, cultural understanding and care that feels responsive to real experiences, as well as more research to understand when chatbots can provide support safely and when their use might be riskier. Until those needs are addressed on a larger scale, AI is likely to continue sitting in the background of how people manage their mental health day to day.

Adobe Stock Image.
