
South Korea Faces Deepfake Porn 'Emergency'

An anonymous reader quotes a report from the BBC: South Korea's president has urged authorities to do more to "eradicate" the country's digital sex crime epidemic, amid a flood of deepfake pornography targeting young women. Authorities, journalists and social media users recently identified a large number of chat groups where members were creating and sharing sexually explicit "deepfake" images -- including some of underage girls. Deepfakes are generated using artificial intelligence, and often combine the face of a real person with a fake body. South Korea's media regulator is holding an emergency meeting in the wake of the discoveries.

The spate of chat groups, linked to individual schools and universities across the country, was discovered on the social media app Telegram over the past week. Users, mainly teenage students, would upload photos of people they knew -- both classmates and teachers -- and other users would then turn them into sexually explicit deepfake images. The discoveries follow the arrest of the Russian-born founder of Telegram, Pavel Durov, on Saturday, after it was alleged that child pornography, drug trafficking and fraud were taking place on the encrypted messaging app.
South Korean President Yoon Suk Yeol on Tuesday instructed authorities to "thoroughly investigate and address these digital sex crimes to eradicate them."

"Recently, deepfake videos targeting an unspecified number of people have been circulating rapidly on social media," President Yoon said at a cabinet meeting. "The victims are often minors and the perpetrators are mostly teenagers." To build a "healthy media culture," President Yoon said young men needed to be better educated. "Although it is often dismissed as 'just a prank,' it is clearly a criminal act that exploits technology to hide behind the shield of anonymity," he said.

The Guardian notes that making sexually explicit deepfakes with the intention of distributing them is punishable by five years in prison or a fine of $37,500.

Further reading: 1 in 10 Minors Say Their Friends Use AI to Generate Nudes of Other Kids, Survey Finds (Source: 404 Media)

South Korea Faces Deepfake Porn 'Emergency'

  • by Anonymous Coward

    I'm sure it's not the only country deeply affected by fakes.

    • by ShanghaiBill ( 739463 ) on Wednesday August 28, 2024 @09:10PM (#64745256)

      I'm sure it's not the only country deeply affected by fakes.

      But perhaps it is the only country with politicians who are naive enough to think declaring it an "emergency" will have any effect.

      Deepfakes are here to stay. Photographs can no longer be trusted. Get over it.

      • Re: (Score:2, Insightful)

        by dirk ( 87083 )

        Or they are one of the only governments to actually care about their female citizens. Yes, fakes will always be around; you can't completely stop them. But you can certainly make it a crime and prosecute people for doing it, thus giving them a big incentive not to do it. They aren't just making these for themselves; they are posting them online and asking other people to denigrate the women, going so far as to try and find people they know to include. This goes far beyond a "fake picture" and into pure misogyny.

        • Re: (Score:2, Insightful)

          by dfghjk ( 711126 )

          "Or they are one of the only governments to actually care about their female citizens. "
          No, that's not it. What kind of person thinks that? And why do you think this is only about "female citizens"?

          "But you can certainly make it a crime and prosecute people for doing it, this giving them a big incentive not to do it."
          Make exactly WHAT a crime?

          "...and asking other people to denigrate the women..."
          What?

          "...going so far as to try and find people they know to include."
          What???

          "This goes far beyond a "fake pic

          • by dirk ( 87083 ) <dirk@one.net> on Wednesday August 28, 2024 @10:15PM (#64745336) Homepage

            "Or they are one of the only governments to actually care about their female citizens. "
            No, that's not it. What kind of person thinks that? And why do you think this is only about "female citizens"?

            I think it is about female citizens because they are over 90% of the people this is happening to. Are there male victims? Most likely. But overall, and especially in South Korea, it is men making deepfakes of women and girls they know.

            "But you can certainly make it a crime and prosecute people for doing it, this giving them a big incentive not to do it."
            Make exactly WHAT a crime?

            Publishing nude photos of people without their consent. This is the same as revenge porn. Just because they are fake pictures doesn't change things.

            "...and asking other people to denigrate the women..."
            What?
            "...going so far as to try and find people they know to include."
            What???

            Read some of the other articles about what is going on in South Korea. They are setting up Telegram groups for specific schools, posting the deepfakes there, and asking people to denigrate the girls. This is not people making them for themselves; they are doing it specifically to post for others who know the victims, and encouraging them to post vile things about them.

            "This goes far beyond a "fake picture" and into pure misogyny."
            LOL You've gone off the reservation.

            Again, read about what is going on and then come back and speak on it.

            "The victims are often minors and the perpetrators are mostly teenagers."
            Spend more time thinking and less running your mouth.

            I never said that, but it is true. Does that make it somehow better? Does the fact that it is fake child porn somehow make it better in your eyes? Or does the fact that boys in high school are doing it to girls in high school somehow make it less misogynistic?

            • by AmiMoJo ( 196126 )

              Worth mentioning that pornography is basically illegal in South Korea. Even if the victims were not upset by it, it would still be illegal to produce.

          • by nukenerd ( 172703 ) on Thursday August 29, 2024 @03:58AM (#64745596)

            "Or they are one of the only governments to actually care about their female citizens. " No, that's not it. What kind of person thinks that? And why do you think this is only about "female citizens"?

            And who the heck modded this offtopic too? Just because they don't agree? I don't particularly agree either, but it is still on topic.

        • The current president of South Korea actively courted the anti-feminist vote in the 2022 election, and his government didn't stop there.

          There is a gender war occurring in Korea right now. Even being suspected of being a feminist could cause a bunch of angry, creepy men to make a woman a target for online abuse/harassment.

          https://www.vice.com/en/articl... [vice.com]

          • by Anonymous Coward

            The current president of South Korea actively courted the anti-feminist vote in the 2022 election, and his government didn't stop there.

            There is a gender war occurring in Korea right now. Even being suspected of being a feminist could cause a bunch of angry, creepy men to make a woman a target for online abuse/harassment.

            https://www.vice.com/en/articl... [vice.com]

            A 1960s feminist wouldn’t even recognize feminism today. We have entire countries debating the definition of a “woman”. Men happily Going Their Own Way to the detriment of women who refuse to see they are the cause of that problem. Lesbians fought hard for marriage rights, only to confirm even another woman can’t get along with a modern woman and now lead a fucking horrific divorce rate. The 304 culture and women being proud of their whore-rific accomplishments, with STDs being p

            • Yeah, they'd recognize it as victory.
            • I'm currently living and working in the Philippines.

              Lots of Korean men come here to shop for a spouse.

              Many Korean women are uninterested in marriage and even less interested in having a family. The birth rate in Korea is 0.8 births per woman, the lowest in the world.

              I predict that future generations of Koreans will look more and more like Filipinos.

        • ...Yes, fakes will always be around; you can't completely stop them. But you can certainly make it a crime and prosecute people for doing it, thus giving them a big incentive not to do it.

          Who the heck modded this "Offtopic"? It's on-topic whether you agree or not.

        • Or they are one of the only governments to actually care about their female citizens.

          No, they aren't. They don't care nearly as much about female citizens as the US government does. The US government even protects females from the burden of having to make decisions about their own bodies. Sooo much more generous than just addressing deepfakes. /s

      • Photographs can no longer be trusted.

        Wait... you trusted photographs? I have a great book from 1930 showing photographs depicting a diver emerging from a shallow puddle in the street, a tree supporting a house complete with garden, a city sitting on a cloud clearly in the sky - not just some optical illusion.

        Photographs have been manipulated since the days of the darkroom. It's not a question of not trusting them *now*.

        That said, it is truly trivial to make a convincing AI fake these days. Whereas a few years ago you still had to have something

        • by unrtst ( 777550 )

          Totally agree, but most people did (and do) trust photos. And video. I'm kinda thankful that AI has made it apparent to everyone that all that imagery can be artificial, and I'm looking forward to seeing how the current generation proceeds. They'll be the first to grow up in a world of realistic AI-generated images, video, and sound, much like the generation when photography came out, and the ones for radio and TV. How will that change the interpretation of things like "photographic evidence" and such?

      • by hey! ( 33014 )

        Deepfakes are here to stay. Photographs can no longer be trusted. Get over it.

        Interesting. So how widely are we to apply this logic? For example, to date, no government in the world has been able to eradicate burglary or fraud. So does this mean we should all get over it and not attempt to prosecute those things?

        I do think futility *should* play a role in public policy. For example, marijuana laws are so widely flouted that practically nobody really wants to see them universally and uniformly applied. But it's jumping to conclusions to say that government can do nothing to reduce the deepfake problem.

      • I'm sure it's not the only country deeply affected by fakes.

        But perhaps it is the only country with politicians who are naive enough to think declaring it an "emergency" will have any effect.

        Deepfakes are here to stay. Photographs can no longer be trusted. Get over it.

        Which is going to have interesting consequences for law enforcement, legal proceedings, and national security.

        • by suutar ( 1860506 )

          It's going to be a chain-of-evidence situation. The media can be trusted only as much as the least-trusted link in the chain (a toy sketch of that rule follows below).

          Now, how long it will take the courts to recognize that is an open question. But I'm sure the national security agencies are already thinking about that.
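
          To make the least-trusted-link rule concrete, here is a minimal Python sketch; the custody steps and trust scores are invented for illustration and not drawn from any real provenance standard:

          # Toy model: each step in a media file's chain of custody carries a
          # trust score in [0, 1]. The chain as a whole is only as trustworthy
          # as its weakest step. Step names and scores are made up.
          chain = [
              ("camera sensor signature", 0.95),
              ("newsroom editing workstation", 0.80),
              ("social media re-encode", 0.30),
          ]

          # Overall trust is the minimum score across the chain.
          chain_trust = min(score for _, score in chain)
          weakest_step, weakest_score = min(chain, key=lambda step: step[1])

          print(f"chain trust = {chain_trust:.2f}, weakest link = {weakest_step}")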

      • Deepfakes are here to stay. Photographs can no longer be trusted. Get over it.

        Text has never been trustworthy and so libel laws date back to ancient Rome.

        Your attitude is just a funhouse mirror image of all those shitty patents for "X but on the INTERNET". Makin' shit up about someone and publishing it has been illegal for a very long time, and it doesn't magically become OK just because AI was involved.

  • by zkiwi34 ( 974563 ) on Wednesday August 28, 2024 @09:00PM (#64745236)
    Of making porn at least somewhat protected.
    • But Google tells me:
      "Pornographic websites, books, writings, films, magazines, photographs or other materials of a pornographic nature are illegal in South Korea. Distribution of pornography is a felony, and can result in a fine or a prison sentence not exceeding one-year."

  • What? None of the other comments are any better.

  • by dknj ( 441802 ) on Wednesday August 28, 2024 @09:29PM (#64745280) Journal

    Telegram is going to go the way of TikTok, Mega, and Backpage. TPTB will not allow it.

  • by Big Hairy Gorilla ( 9839972 ) on Wednesday August 28, 2024 @09:40PM (#64745292)
    that teenagers love sex or that anyone is surprised this is happening.

    It's supposed to be alarming that tools so obviously designed specifically for fraud are hand-waved away as harmless. The malignant application of these tools clearly creates negative impacts on society. To say otherwise just means you're a fool... or possibly profiting off the idea, i.e. self-interest.

    This is Black Mirror territory. The prime minister is blackmailed into having sex with a live pig on TV. Yup. Coming to a channel near you. A lot of people probably would want to see that, but I'm fairly sure there's a preponderance of people who would agree the negative impact is real.
    • by AmiMoJo ( 196126 )

      It would help if the law was clarified to state that firing someone because of deep fake images, or stolen images, is illegal. That would at least remove one of the biggest reasons why this kind of extortion works.

      • I certainly agree the laws need to catch up.

        However, I'm not optimistic, because legislators really and truly don't understand anything about technology. It might as well be Medieval Alchemy or Rocket Surgery for all any politician can understand, and now, with climate and humanitarian crises exploding all around us, I doubt this will get priority.
    • This is Black Mirror territory. The prime minister is blackmailed into having sex with a live pig on TV.

      OH please NOOOOoo...

      I don't wanna see Biden riding Kamala... you just put that in my head and now I cannot UN-see it...

      [shudder]

  • You don't need AI (Score:2, Insightful)

    by Powercntrl ( 458442 )

    In ye olden days, people would just do the head-swap thing using Photoshop. It still works for that, too. Ever wonder what Musk would look like if he spent more time at a gym rather than on Twitter? On second thought, never mind; I'm pretty sure no one here wants to see that.

    The only difference with AI is that it makes these sorts of shenanigans accessible to people who don't have the patience to watch a YouTube video explaining how layer masks work.

    • Re: (Score:3, Interesting)

      Exactly, and it can be automated. As class photos are published online, equivalent nudes are generated and published on Telegram (or wherever; Mastodon is probably next, just less talked about).

      This isn't going to go away, and all it's going to do is create a black market for the problem. Want to see Kamala Harris naked? $10. (not serious, officer)

      The only realistic outcome is that this becomes the norm. You won't be able to arrest all of the people who do this in every country, so focus on
      • Getting rid of the "tech savvy" kids capable of pulling this off is only going to hurt the economy by limiting the pool of talented individuals to those who conformed as children, and those are few.

        I think you're overestimating the level of savviness required to generate these images. This is script-kiddy stuff - not something that requires a deep understanding of gen AI models.

        Yes, I get that it's a problem, but jail is not the solution unless an existing law is being broken (i.e., harassment, using AIG-CSAM to lure 15-year-olds, etc.).

        Why should we take for granted that existing laws are sufficient to address a problem created by novel technology?

    • Ever wonder what Musk would look like if he spent more time at a gym rather than on Twitter?

      Are you telling me that those pictures with Musk's head on top of a polythene sack of lard are the real ones?

    • by AmiMoJo ( 196126 )

      It requires skill and effort to create a convincing fake in Photoshop. It requires a few seconds to create an AI deepfake nude that is reasonably convincing.

      What's more, the companies behind these tools advertise them specifically and explicitly for sextortion. Their ads show men sending a woman an AI nude, and the woman responding by begging them to delete it and promising to "do anything".

      If a company offered to photoshop someone's head onto a nude body, and advertised it as being ideal for sextortion, the fac

      • I am somewhat proficient with GIMP, and in my experience you can do a good face swap in a couple of minutes once you learn the process. Without any AI, just masks and layer blending (the sketch below shows the same idea in code).
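
        For the curious, a minimal Python sketch of that mask-and-blend compositing using the Pillow library; the filenames are hypothetical placeholders and the feather radius is an arbitrary choice:

        # pip install pillow
        from PIL import Image, ImageFilter

        # Hypothetical inputs: two RGB images plus an 8-bit grayscale mask,
        # where white pixels select `top` and black pixels keep `base`.
        base = Image.open("base.jpg").convert("RGB")
        top = Image.open("top.jpg").convert("RGB").resize(base.size)
        mask = Image.open("mask.png").convert("L").resize(base.size)

        # Feather the mask edge so the blend looks less like a cut-out.
        mask = mask.filter(ImageFilter.GaussianBlur(radius=8))

        # Per-pixel blend: Image.composite takes from `top` where the mask
        # is bright and from `base` where it is dark.
        result = Image.composite(top, base, mask)
        result.save("blended.jpg")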

  • .... that has such people in it!

  • We need to accept that we're going to have to ruin the lives of a few bad kids to protect all the good kids. Start charging teenagers as adults for child pornography and make the cases (and the punishments) very public.

  • by quenda ( 644621 ) on Thursday August 29, 2024 @09:52AM (#64746080)

    ... now when your ex leaks your embarrassingly bad home sex videos, you can claim they are just deepfakes.

    Maybe the solution is just to accept that the genie is out of the bottle and stop getting so uptight about it. Why are Koreans and Americans so freaked out about nudity and sex?

  • Why isn't it treated as a more generic attempt at reputation destruction by spreading forged evidence? How is that different from distributing a deepfake video of a victim doing drugs or getting caught shoplifting? What makes deepfake porn special here?
  • That's disgusting! Where?
