Teens Are Turning to AI for Mental Health Help—But Experts Warn It May Miss the Reality of Racial Stress

A new concern is emerging at the intersection of technology and equity: teens are increasingly turning to artificial intelligence for emotional support, but the tools they rely on may not be equipped to understand their lived experiences.

Dr. Riana Elyse Anderson, a New York City-based clinical and community psychologist who works with youth and an Associate Professor at the Columbia University School of Social Work, is raising awareness about how young people are using AI chatbots to seek guidance, and where those tools fall short.

AI platforms are often praised for being accessible, private, and immediate. For teens hesitant to speak with parents, teachers, or counselors, chatbots can feel like a safe first step. But Dr. Anderson warns that these systems are not consistently designed to recognize racial stress, discrimination, or racialized bullying—realities that disproportionately affect Black youth.

Data underscores the urgency. In 2023, nearly one in three high school students reported experiencing racism at school, a figure that reached 45.9% among Black students. Meanwhile, 34% of teens reported being bullied, and about one in eight adolescents and young adults said they had already used generative AI for mental health advice.

The problem is not just usage—it’s accuracy and empathy.

AI tools may offer what appears to be reasonable advice in general bullying scenarios, but they often miss the cultural and emotional weight of race-related harm. When a teen describes a racially charged incident, a generic response like “ignore them” or “stay positive” can minimize the experience rather than validate it. That gap in understanding can have real consequences.

“Racial stress is not ‘drama’ or ‘miscommunication.’ It’s a health issue,” Dr. Anderson emphasizes in her work. If AI fails to recognize racism, or treats it as a generic conflict, it risks normalizing harm rather than addressing it.

This raises concerns about unequal outcomes. If some teens receive culturally responsive guidance while others do not, AI could unintentionally widen disparities in mental health support.

Dr. Anderson, who developed the EMBRace (Engaging, Managing, and Bonding through Race) intervention to support Black families navigating racial stress, says parents and educators need to treat AI as a tool, not a replacement for human care. She encourages families to have open conversations about racism and to help children process their experiences with trusted adults who understand cultural context.

For schools, the message is equally clear: when bullying is racialized, responses must go beyond standard discipline policies and address the deeper emotional and psychological impact.

As AI becomes more embedded in daily life, the question is no longer whether teens will use it—but whether the technology will evolve to truly see them.
