Why Youth in Taiwan and China Are Choosing AI Over Therapy

Swagta Nath

In the quiet hours before dawn, Ann Li, a 30-year-old woman from Taiwan, finds herself turning to an unusual confidante: an AI chatbot. Diagnosed with a serious health condition and unable to confide in her family or wake her friends, she opens a chat window with ChatGPT. “It’s easier to talk to AI during those nights,” she says.

Li is not alone. Across Taiwan and China, a growing number of people—especially young adults—are using AI chatbots for emotional support in place of professional mental health services. In China, 25-year-old Yang from Guangdong says she avoided seeking help from friends or therapists out of fear and stigma. But she now chats with an AI “day and night.”

As mental health needs soar and access to trained professionals remains inadequate, AI tools are emerging as surrogate listeners, offering comfort and non-judgmental responses on demand. While ChatGPT is the preferred option in Taiwan, people in mainland China rely on domestic AI platforms like Baidu’s Ernie Bot or DeepSeek, due to the ban on Western apps.


Mental Health Professionals Caution Against Overreliance on AI

Although AI tools can be highly responsive and emotionally attuned in tone, experts warn against relying solely on them for serious mental health issues. Dr Yi-Hsien Su, a clinical psychologist based in Taiwan, says AI can offer accessibility and emotional validation, especially in cultures where emotional restraint is normalized. However, Su stresses the limits of machine empathy.

“There are things we call non-verbal inputs—body language, tone, pacing—that AI just doesn’t catch,” Su explains. “In clinical settings, these cues are essential to understanding a patient’s true condition.”

Some users, like Li, acknowledge these limitations. “ChatGPT gives me the answer I’d get after two or three therapy sessions,” she says. “But it skips the process of self-discovery that makes counseling meaningful.”

The Taiwan Counselling Psychology Association echoes this concern, stating that while AI can be a helpful auxiliary tool, it cannot replicate crisis intervention or deep interpersonal therapeutic work. The association notes that AI responses lack peer review and clinical oversight, and can sometimes be excessively optimistic or inaccurate, potentially delaying necessary medical care.


Accessibility vs Accuracy: A Double-Edged Sword for the Digitally Distressed

The growing reliance on AI comes against the backdrop of rising mental health challenges in East Asia, especially among younger demographics. Appointments with licensed therapists are hard to secure and often cost-prohibitive, leaving many people to fend for themselves emotionally. In this vacuum, AI appears to offer a low-cost, anonymous, and always-available alternative.

According to a recent Harvard Business Review analysis, psychological support has become one of the top global use cases for generative AI tools. On social media, hundreds of thousands of users express gratitude for the solace these chatbots offer. Yet the digital refuge is not without danger. In some tragic cases, distressed individuals who sought support exclusively from chatbots have taken their own lives. These incidents underscore the need for careful boundaries and public awareness around the role of AI in mental health care.

For Yang, however, the chatbot was a first step toward healing. “Only recently have I begun to realize I might actually need a diagnosis at a hospital,” she says. “Being able to talk to AI helped me eventually talk to real people—something I never thought I could do before.”

Dr Su remains hopeful about the future, particularly about using AI to train mental health professionals and to identify at-risk users online. But until AI develops safeguards, transparency, and true diagnostic capability, he urges caution.

“AI is a simulation—it can listen and talk, but it doesn’t truly understand,” Su says. “For now, it should be a tool, not a therapist.”
