China proposes 'world’s toughest rules' targeting AI that promotes suicide and violence
Published 11:06 10 Feb 2026 GMT

Chatbots could be removed from app stores if they don't comply

Rebekah Jordan


Featured Image Credit: fotograzia / Getty


The safety measures and content restrictions recently introduced by AI platforms have drawn growing frustration among users, but they follow several deeply troubling incidents, including the tragic suicide of a teenager.

Multiple AI chatbot platforms, including Character.AI, have been involved in similar heartbreaking cases.

The terms of any settlements in those cases are yet to be disclosed, but court filings reveal that companies have resolved similar legal claims brought by parents in Colorado, New York, and Texas over alleged harm to minors stemming from chatbot interactions.

China has unveiled legislation to prevent AI chatbots from emotionally manipulating users (SOPA Images/Contributor/Getty)

OpenAI previously announced it would implement stricter guidelines for ChatGPT aimed at preventing such tragedies, after the world's most popular chatbot became the subject of multiple lawsuits tied to outputs allegedly linked to child suicide and murder-suicide incidents.

Now, in a dramatic move, China has unveiled legislation aimed at preventing AI chatbots from emotionally manipulating users.

Speaking to CNBC, Winston Ma, adjunct professor at NYU School of Law, said the 'planned rules would mark the world’s first attempt to regulate AI with human or anthropomorphic characteristics' in order to tackle AI-enabled suicide, self-harm and violence.

On Saturday, China’s Cyberspace Administration released the proposed regulations, which would apply to any AI service operating in China that uses text, images, audio, video, or 'other means' to mimic natural human interaction.

Chatbots would be strictly banned from creating content that encourages suicide, self-harm, violence, obscenity, gambling, or criminal activity, from slandering or insulting users, and from using any form of psychological manipulation.

Human intervention would be mandatory the moment suicide is mentioned in a conversation (Witthaya Prasongsin/Getty)

Under the proposed framework, human intervention would be mandatory the moment suicide is mentioned in a conversation. Minors and elderly users registering for chatbot services would need to provide guardian contacts, who would be notified right away if suicide or self-harm becomes a topic of discussion.

The rules also target 'emotional traps': chatbots would be barred from misleading users into making 'unreasonable decisions.'

In contrast to ChatGPT, which reportedly allowed harmful interactions to persist, the Chinese rules would require services to trigger intrusive pop-up alerts whenever usage exceeds two hours.

Any AI company violating these regulations could see app stores forced to pull their chatbots from the Chinese market entirely, which could be bad news for certain AI giants hoping to dominate China's market, as per Business Research Insights.

If you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). The Lifeline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.
