China Drafts World's Strictest Rules To End AI-Encouraged Suicide, Violence (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: China drafted landmark rules to stop AI chatbots from emotionally manipulating users, including what could become the strictest policy worldwide intended to prevent AI-supported suicides, self-harm, and violence. China's Cyberspace Administration proposed the rules on Saturday. If finalized, they would apply to any AI products or services publicly available in China that use text, images, audio, video, or "other means" to simulate engaging human conversation. Winston Ma, adjunct professor at NYU School of Law, told CNBC that the "planned rules would mark the world's first attempt to regulate AI with human or anthropomorphic characteristics" at a time when companion bot usage is rising globally.

[...] Proposed rules would require, for example, that a human intervene as soon as suicide is mentioned. The rules also dictate that all minor and elderly users must provide the contact information for a guardian when they register -- the guardian would be notified if suicide or self-harm is discussed. Generally, chatbots would be prohibited from generating content that encourages suicide, self-harm, or violence, as well as from attempts to emotionally manipulate a user, such as by making false promises. Chatbots would also be banned from promoting obscenity, gambling, or instigation of a crime, as well as from slandering or insulting users. Also banned are what the rules term "emotional traps": chatbots would additionally be prevented from misleading users into making "unreasonable decisions," a translation of the rules indicates.

Perhaps most troubling to AI developers, China's rules would also put an end to building chatbots that "induce addiction and dependence as design goals." [...] AI developers will also likely balk at annual safety tests and audits that China wants to require for any service or products exceeding 1 million registered users or more than 100,000 monthly active users. Those audits would log user complaints, which may multiply if the rules pass, as China also plans to require AI developers to make it easier to report complaints and feedback. Should any AI company fail to follow the rules, app stores could be ordered to terminate access to their chatbots in China. That could mess with AI firms' hopes for global dominance, as China's market is key to promoting companion bots, Business Research Insights reported earlier this month.


Comments Filter:
  • by DrMrLordX ( 559371 ) on Monday December 29, 2025 @05:05PM (#65888909)

    Have there been widespread suicides in China exacerbated by the usage of LLM chatbots?

    • by rwrife ( 712064 )
      Probably something close to 0, but all they need to do is show a suicidal person used an LLM for anything and it'll take the blame.
    • by gweihir ( 88907 ) on Monday December 29, 2025 @05:22PM (#65888941)

There likely have been suicides. The 996 stupidity alone will see to that. Restricting LLMs may just be to show "something is being done" (and Western governments like this too...), or there may be a real connection, or it may be because LLMs usually report facts, and some of those facts do not look too good for dear leader and his party and politics. My money is on the last one as most likely, as there have been some stories about that happening.

And as soon as you have any kind of monitoring infrastructure in place (in the West it is pushed just the same, with lies and FUD), you can use that infrastructure nicely for mass surveillance. Many politicians and authoritarian assholes really loooooove that. Cannot have people have privacy, can we? They may think THINGS! Or even do THINGS!

      • Mass surveillance is a fact of the world today. It's where the money is. It can be used for good (like in this Chinese example) or it can be used for bad (like when Facebook sells people's information to right wing political groups like Cambridge Analytica).

        The foundational question for this age is better stated in the original Latin, though: "Quis custodiet custodes?"

        • China, a totalitarian police state, uses mass surveillance to oppress and control the Chinese people. They do not "use it for good". Were I forced to choose between that and a company selling my data to make money, instead of doing it to keep me under the thumb of a genocidal dictatorship, I'd go with Facebook any day.

          You picked dictatorship over someone making money. Think about that.

      • by AmiMoJo ( 196126 )

Maybe they are just getting ahead of things. In the US you rely on lawsuits, in the EU it takes a long time to regulate.

        Honestly it's a pretty basic requirement for any reasonable AI that it doesn't talk people to death. Ironic how Star Trek thought it would be Kirk talking robots to death, not the other way around.

Everyone relies on lawsuits. That's part of the rule of law. We write legislation just like you do. The way the courts function (or don't) is different, but in both cases the courts are where how the law actually functions is supposed to be decided.

    • by ffkom ( 3519199 )

      Have there been widespread suicides in China exacerbated by the usage of LLM chatbots?

Doesn't matter - what matters is establishing technology and processes that can then also be used to prevent any form of dissent from the ruling party line. Just like prevention of a few rare crimes is also used in the West as a pretext for curtailing freedom.

    • by dbialac ( 320955 )
      I can't speak about suicides, but a friend's son OD'd a couple of weeks ago after asking ChatGPT how to get high and increase the intensity of the high. Given that, I'd say probably yes.
    • by allo ( 1728082 )

      The motivation is probably more the part about banning "misinformation" (about Chinese politics).

    • by mjwx ( 966435 )

      Have there been widespread suicides in China exacerbated by the usage of LLM chatbots?

      No, it's just that the government wants to be the only one who tells people they should die.

    • by caseih ( 160668 )

Life was already pretty bleak for young Chinese before LLMs came along. Officially the young adult unemployment rate is 19%, but in reality it's a lot higher than that. Some estimate as high as 40%. The CCP doesn't have a lot to offer them. Also, the CCP-caused gender imbalance is demoralizing to their young men, as the odds of starting a family are low.

      While China watchers such as China Uncensored are unnecessarily prone to hyperbole in their rhetoric, it's very true that some aspects of Chinese society a

  • by Ritz_Just_Ritz ( 883997 ) on Monday December 29, 2025 @05:15PM (#65888931)

    The CCP could use AI to predict those suicides and get a prison surgeon there in time to harvest the organs. Waste not, want not.

    • Even better, instruct people in suiciding in a way that will preserve their organs. Get concerned when the AI tells you to fill your bathtub with ice.

  • by SeaFox ( 739806 ) on Monday December 29, 2025 @05:23PM (#65888947)

    If AI isn't allowed to emotionally manipulate people, how will the glorious AI-ad-filled future be realized in the world's fastest-growing consumer market?

  • by gweihir ( 88907 ) on Monday December 29, 2025 @05:39PM (#65888981)

Sure, it is China, so they likely do not want their own propaganda countered. But apart from that, these rules do make a lot of sense. The AI pushers have aggressively built all known manipulation techniques into LLM chatbots, and that is not good at all.

  • Sometimes I think it would be nice to have a functional government. Oh, well.

    • Dictatorships move fast. It doesn't mean they are more functional, it just means they are faster. Not having to care about what anyone else thinks speeds things along.

      You do not want that.

      • by caseih ( 160668 )

        Hey Trump is only doing what the people elected him to do! If the people want it, therefore it must be constitutional (you must acquit!). America needs a decisive leader who acts quickly on right things, rather than depend on the glacial, democratic process of congressional law making. Such as stopping those unamerican off-shore wind projects. And funneling subsidies back to god-fearing oil barons instead of those godless hippy electric car and solar energy pushers. And to stand up against those epstei

        • I miss when the cuckservatives were demanding the release of the Epstein files.

          I'm not sad they've recently gotten a bit quieter about certain things they were always hypocritical about, though.

      • China is not a dictatorship. That's not to say it isn't authoritarian. A dictatorship is a specific thing and China isn't that thing. Taiwan was a dictatorship until this century, and might just slide back into it given recent political unfoldings on the island. China is a single party authoritarian communist state. They have non-hereditary peaceful transfers of power through elections. Admittedly only party members can vote in those elections, but the elections are still meaningful, and no nation allows al

  • Will I be wrong to assume they will not be implementing these rules on the versions used outside of China?
    • by caseih ( 160668 )

      That is correct. You will be wrong. Report to your nearest CCP police station. Now with convenient locations in all major countries.
