AI Security

A Single Cloud Compromise Can Feed an Army of AI Sex Bots (krebsonsecurity.com) 28

An anonymous reader quotes a report from KrebsOnSecurity: Organizations that get relieved of credentials to their cloud environments can quickly find themselves part of a disturbing new trend: Cybercriminals using stolen cloud credentials to operate and resell sexualized AI-powered chat services. Researchers say these illicit chat bots, which use custom jailbreaks to bypass content filtering, often veer into darker role-playing scenarios, including child sexual exploitation and rape. Researchers at security firm Permiso Security say attacks against generative artificial intelligence (AI) infrastructure like Bedrock from Amazon Web Services (AWS) have increased markedly over the last six months, particularly when someone in the organization accidentally exposes their cloud credentials or key online, such as in a code repository like GitHub.
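
The exposure path described above is the familiar one: a long-lived access key pasted into a file that ends up in a public repository. As a purely illustrative sketch (the regex, file handling, and exit-code convention are mine, not from the article or from any particular secret-scanning tool), a minimal pre-commit style check for key-shaped strings might look like this:

# Illustrative sketch only: flag strings shaped like AWS access key IDs
# before they leave the machine. Pattern and conventions are assumptions.
import re
import sys

# Long-lived AWS access key IDs start with "AKIA"; temporary ones with "ASIA".
KEY_PATTERN = re.compile(r"\b(AKIA|ASIA)[0-9A-Z]{16}\b")

def scan(paths):
    hits = []
    for path in paths:
        try:
            with open(path, encoding="utf-8", errors="ignore") as fh:
                text = fh.read()
        except OSError:
            continue
        hits.extend((path, m.group(0)) for m in KEY_PATTERN.finditer(text))
    return hits

if __name__ == "__main__":
    findings = scan(sys.argv[1:])
    for path, key in findings:
        print(f"possible AWS access key in {path}: {key[:8]}...")
    sys.exit(1 if findings else 0)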

Investigating the abuse of AWS accounts for several organizations, Permiso found attackers had seized on stolen AWS credentials to interact with the large language models (LLMs) available on Bedrock. But they also soon discovered none of these AWS users had enabled logging (it is off by default), and thus they lacked any visibility into what attackers were doing with that access. So Permiso researchers decided to leak their own test AWS key on GitHub, while turning on logging so that they could see exactly what an attacker might ask for, and what the responses might be. Within minutes, their bait key was scooped up and used in a service that offers AI-powered sex chats online.
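
For anyone unfamiliar with the "logging is off by default" detail: Bedrock's model invocation logging is a per-region setting that has to be turned on explicitly before prompts and responses are recorded anywhere. A rough boto3 sketch follows; the bucket, log group, and role names are placeholders, and the exact parameter names should be checked against current AWS documentation.

# Rough sketch, not a drop-in config: enable Bedrock model invocation logging.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        # Deliver full prompts and responses to S3 for later review.
        "s3Config": {
            "bucketName": "example-bedrock-invocation-logs",  # placeholder
            "keyPrefix": "bedrock/",
        },
        # Also stream to CloudWatch Logs for near-real-time alerting.
        "cloudWatchConfig": {
            "logGroupName": "/aws/bedrock/invocations",  # placeholder
            "roleArn": "arn:aws:iam::123456789012:role/BedrockLoggingRole",  # placeholder
        },
        "textDataDeliveryEnabled": True,
        "imageDataDeliveryEnabled": True,
        "embeddingDataDeliveryEnabled": True,
    }
)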

"After reviewing the prompts and responses it became clear that the attacker was hosting an AI roleplaying service that leverages common jailbreak techniques to get the models to accept and respond with content that would normally be blocked," Permiso researchers wrote in a report released today. "Almost all of the roleplaying was of a sexual nature, with some of the content straying into darker topics such as child sexual abuse," they continued. "Over the course of two days we saw over 75,000 successful model invocations, almost all of a sexual nature."

This discussion has been archived. No new comments can be posted.

  • Oh shit (Score:5, Funny)

    by locater16 ( 2326718 ) on Thursday October 03, 2024 @06:24PM (#64838241)
    Skynet is sexy now, we're doomed
    • On the bright side, we now have a "Libraries of Congress"-style unit of measure for cloud data compromises. The hack of MS executives' email must have been at least ten standard armies of sex bots in size.
    • by AmiMoJo ( 196126 )

      That was literally the plot of Terminator 3. The TX (female terminator) quickly realizes that it can use sex to manipulate men. Apparently they had a lot of trouble with the scene where she inflates her breasts, they couldn't get the two airbags to work reliably together.

      • by mjwx ( 966435 )

        That was literally the plot of Terminator 3. The TX (female terminator) quickly realizes that it can use sex to manipulate men. Apparently they had a lot of trouble with the scene where she inflates her breasts, they couldn't get the two airbags to work reliably together.

        I wouldn't exactly call it a plot, more a trope, maybe a cliche (see also: MIB 2).

        I thought it was CG... anyway.

        I've wondered before why Skynet didn't use lady terminators as infiltration units against the resistance (the T-X was a prototype anti-terminator terminator, so its gender really is irrelevant). Psychologically it would make sense: women are generally seen as more trustworthy and less of a threat, and there's the whole male ego thing (i.e. the desire to protect women). Then I realised that human men wouldn't be

        • by AmiMoJo ( 196126 )

          The canon explanation was that the early model Terminators' exoskeleton was too large for anything but a big jacked up guy. Later models were more compact, and we saw female Terminators like Cameron in Sarah Connor Chronicles.

          I guess the T1000 liquid metal models were genderless, being able to take any form. The TX was an evolution of that, able to create complex machines like guns.

  • by ls671 ( 1122017 ) on Thursday October 03, 2024 @06:26PM (#64838245) Homepage

    Given the previous FA, I guess I'd prefer an army of physical AI sex bots to a 3.8 Tbps DDoS attack. /s

    • by gweihir ( 88907 )

      Probably not if they go around and rape people (and children)...

      Hmm. There may be some valid applications for AI in warfare here. I mean, let the poor soldiers from barbaric country XYZ rest after a day full of killing enemy soldiers, instead of then having to rape the local civilians all evening! "Rape-Bot" as an anti-civilian terror weapon. AI has so many great potential applications!

      (Yes, some simply rape civilians as a personal criminal choice, but in many wars it gets ordered as a war-crime to terroriz

      • Probably not if they go around and rape people (and children)...

        You can't rape anyone with pixels, and since there is no child involved, it isn't "child exploitation," even if the customer thinks it is.

        • by gweihir ( 88907 )

          Those that can read have a clear advantage ...

          But since you seem to be getting old and senile, here, let me help you: I was replying to "I'd prefer an army of physical AI sex bots". "Physical AI sex bots" would be robots with a robotic body that allows them to act in the real world. "Pixels" commonly refers to virtual entities, not physical ones.

    • by vbdasc ( 146051 )

      I guess I'd prefer an army of physical AI sex bots

      Until they start demanding that you satisfy their needs.

  • Funny thing (Score:2, Redundant)

    by gweihir ( 88907 )

    About 35 years ago, while studying CS, I talked with a friend about the possibility of AI-powered sex chats as one of the few probably viable mainstream AI applications. We both agreed that it would probably be possible, but neither of us was interested in actually doing something like that. I am surprised it took so long to become a thing.

  • Why though? (Score:2, Interesting)

    by Anonymous Coward

    AI roleplaying service that leverages common jailbreak techniques to get the models to accept and respond with content that would normally be blocked

    Why, though?
    I get blocking information about creating dirty bombs, but are we really going to ban dirty talk?

    Almost all of the roleplaying was of a sexual nature, with some of the content straying into darker topics such as child sexual abuse

    So, what you are saying is that most of the roleplaying was perfectly legal by any measure, while the rest was on uncomfortable topics that are probably still legal (IANAL, but I imagine fictional stories are legal regardless of the subject).

  • by Baron_Yam ( 643147 ) on Thursday October 03, 2024 @07:45PM (#64838389)

    What was in the training data and where did it come from?

    • Transcripts from phone sex lines?
      Romance novels?
      Cosmopolitan?

    • It also raises the question of why host them illegally, when there is apparently nothing illegal about making sex chat bots (it seems from the summary that most were not about children).
      • ... apparently nothing illegal about making sex chat bots ...

        I'm guessing Freedom of Speech, even in the USA, means a computer cannot claim to be a naked 12-year-old girl. Of course, this will be a non-issue in countries without Freedom of Speech.

        In this case, the cry will be "think of the police": they can't find a real naked 12 y.o. victim if the internet is full of fakes. Also, "think of the children", because the absence of a girl somehow still makes girls victims.

      • The article notes, “If undiscovered, this type of attack could result in over $46,000 of LLM consumption costs per day for the victim.”

        Presumably being against the ToS makes finding unwitting intermediaries even more attractive; but this sounds like a case where the economics of stealing service are particularly favorable: extremely high compute cost, and very low volume of data that needs to be sent in and retrieved.

        The zOMG pedobots! angle makes for a good headline; and it wouldn't be a
        • Fascinating that someone would prefer to use the computer time for this rather than for Bitcoin mining.
          • My understanding is that the AWS 'Bedrock' service that is being exploited in this case is some sort of managed-serverless-abstracted overlay on an Amazon copy of something from Anthropic; it's not just a "here's an EC2 instance with a bunch of H200s connected to it; you do you".

            It wouldn't be entirely surprising if people do try to sneak some mining in on free-as-in-stolen AWS resources (probably not bitcoin; that has basically only been worthwhile on ASICs for a while now; monero seems to be the one you
