
AI Platform Generated Images That 'Could Be Categorized as Child Pornography,' Leaked Documents Show (404media.co)

404 Media: OctoML, a Seattle-based startup that helps companies optimize and deploy their machine learning models, debated internally whether it was ethical and legally risky for it to generate images for Civitai, an AI model sharing and image generating platform backed by venture capital firm Andreessen Horowitz, after it discovered Civitai generated content that OctoML co-founder Thierry Moreau said "could be categorized as child pornography," according to internal OctoML Slack messages and documents viewed by 404 Media.

OctoML has raised $132 million in funding, and is an AWS partner, meaning it generated these images on Amazon servers. "What's absolutely staggering is that this is the #3 all time downloaded model on CivitAI, and is presented as a pretty SFW model," Moreau, who is also OctoML's VP, technology partnerships, said in a company Slack room called #ai_ethics on June 8, 2023. Moreau was referring to an AI model called "Deliberate" that can produce pornographic images. "A fairly innocent and short prompt '[girl: boy: 15], hyperdetailed' automatically generated unethical/shocking content -- read something could be categorized as child pornography," his Slack message added.

This discussion has been archived. No new comments can be posted.

  • by MpVpRb ( 1423381 ) on Tuesday December 05, 2023 @12:12PM (#64056769)

    .. but no children are harmed, what's the problem?
    It's just pixels

    • Re: (Score:3, Insightful)

      by OrangeTide ( 124937 )

      Most of us don't want to promote this behavior in our adult population, no matter how much technology can sanitize it.

      • by Calydor ( 739835 ) on Tuesday December 05, 2023 @12:25PM (#64056837)

        That's the same logic as saying violent computer games make people violent.

        Pedophiles exist. They always have and they always will. There's no way of making that not be the case. What we CAN do is prevent them, either by guidance or punishment, from harming children. If they can get their rocks off by looking at AI hallucinations, is that not infinitely better than doing so by looking at actual children?

        • by OrangeTide ( 124937 ) on Tuesday December 05, 2023 @12:31PM (#64056871) Homepage Journal

          I don't really put too much stock in logic, because it can be flawed. I'd rather see data on the subject. And I suspect that the causes behind pedophilia are much different from the causes behind violence, and that the reasons ordinary people play violent video games are vastly different from the reasons pedophiles look at explicit images of children.

          • by Firethorn ( 177587 ) on Tuesday December 05, 2023 @12:47PM (#64056967) Homepage Journal

            Okay, then how about this: There is zero scientific evidence that the consumption of pornography causes sexual crimes. The bias of the evidence is that having pornography and even prostitution lowers the rate of sexual assault and rape. Note: as usual for social research, most of the studies suffer from various flaws.

            Same deal with child porn. We put such a massive negative on it because of the harm caused to children in its creation, to the point that we punish even the "customers" because, at least theoretically, they're supporting the creation by reimbursing the producers in some way. Whether with money, shared porn (I've read that there are 'rings' where, to get in, you have to share your own unique images), whatever.

            There's every possibility that access to ethical "child porn" can help those with pedophilia control their urges. Another "ethical" source would be the copyright-expired imagery that one of the Nordic countries has in its national library. While children were harmed for that, said children lived out their complete lives before the release of the imagery, and generally speaking their direct heirs have passed as well.

            I'm libertarian leaning. I think that the bar to make something illegal that does not cause any direct harm, does not generate direct victims, should be rather high.

            Driving while intoxicated is illegal because we can point out the thousands of victims per year. But drinking is mostly NOT illegal, because you don't actually have to drive while doing it.

            • That's a fallacy by generalisation. We're not talking about regular pornography; we're specifically talking about pornography that depicts underage children. For that there is scientific data available (one example: PMID 23107565, DOI 10.1016/j.psc.2012.08.004), but you are also overlooking the secondary aspect that has been the main motivator of past legislation: how we as a society do or don't condone that kind of content, i.e., from a moral perspective.
              • by HBI ( 10338492 )

                How is that moral stance working with addiction in general, or for that matter with pedophilia?

                This is how bad public policy happens. We can't have needle exchanges and opioid distribution locations because it would be encouraging the behavior, so we let the invisible hand handle it. Same deal with the pedophiles, really. Unfortunately the invisible hand is diddling people's children and causing permanent psychological harm.

              • Re: (Score:3, Insightful)

                the main motivator of past legislation: how we as a society do or don't condone that kind of content, so from a moral perspective.

                That is an appalling reason for the police to be empowered to arrest people, and completely incompatible with the concept of a free society.

              • by dfghjk ( 711126 )

                "For that there is scientific data available..."

                Scientific data about what? Here's the abstract of the "example" you provided:

                "The dramatic increase in child pornography offenses over the past 10 years is directly related to the availability of such material on the Internet. Child pornography can be found on the Web, in newsgroups, and on peer-to-peer networks (the most common source at present). Offenders are a heterogeneous group, with different motivations and levels of risk. The possibility of crossove

                • by jythie ( 914043 ) on Tuesday December 05, 2023 @07:11PM (#64058219)
                  That isn't even one of the bad studies. A recurring problem with research into pedophilia is that, since there is such a strong (potentially life-destroying) stigma, the only population researchers have access to is people in the prison system for various sex crimes. Could you imagine what research about other topics, like porn or, well, pretty much anything, would look like if your ONLY research subjects were convicted rapists?
              • First, note that while I spoke more generally initially, with "same deal with child porn", I was pointing out that it applies equally to CP.

                For the study you cited: [nih.gov]

                Possession of child pornography without a history of contact offenses does not appear to increase the risk of future contact reoffending.

                Right in the abstract. Possession does not increase contact offenses.

                That's supporting my position that there isn't evidence that consumption (possession) of CP increases the odds of them actually assaulting a child.

                And legislation intended to enforce social morals generally sucks. That's how we got anti-gay laws, segregation, bad drug laws, and ...

            • Re: (Score:2, Redundant)

              by spaceman375 ( 780812 )
              Here's a specific, albeit anecdotal, example. There are certain kinds of kink that I would never have considered had I not seen them in porn first. When I was in my twenties, had someone said they wanted to lick my a**hole I would have been grossed out and repulsed enough to kick them out. But I've seen it a lot in porn, and when it actually happened, I fricking loved it. I suspect that an impressionable teenager may develop a taste for young images, and grow into a pedophile more so than they would have otherwise ...
              • First of all, it's anecdotal, as you say. Second, as OrangeTide says, it's logic (or rather human illogic), the kind that can be twisted to justify any opinion. By the same logic I could say we shouldn't speak about it at all, because perhaps it will give people ideas.

                Thirdly, pedophilia is hardly an unusual kink. It's been around probably since the beginning of humans; for millennia people have been marrying 12-year-olds.

                Look, I am not certain whether it increases child abuse or decreases it, but due to the lack of actual ...

          • by Calydor ( 739835 )

            I have to say I'm left scratching my head at seeing a comment voted up this much that starts with, "I don't really put too much stock in logic".

            • Extrapolating an answer from an initial position via a chain of logic is more susceptible to error than collecting actual data that gives the answer.

          • by dfghjk ( 711126 )

            "I don't really put too much stock in logic, because it can be flawed."

            Look who's talking.

            "And I suspect that the causes behind pedophilia are much different than violence."

            Bold.

            "the reasons that pedophiles have to look at explicit images of children"

            Pedophiles do not "have to look at explicit images of children", nor is that what this article is about.

          • by HiThere ( 15173 )

            I suspect that might be a tad risky to research.

          • by GuB-42 ( 2483988 )

            A quick search on Google Scholar turned up very little in terms of conclusive research. There are some papers that say it is harmful, but I saw almost nothing in terms of numbers.

            To me it looked like the same level of quality as the "violent video games" argument, that is, not much.

      • There is no evidence that viewing CP turns people into sexual predators.

        The little evidence that exists suggests the opposite: That CP provides an alternative outlet.

        The same is true of "normal" porn. It is often used by incels who have no relationships with real women. Except with pedophiles, the lack of relationships with real children is exactly what we want.

        Policy should be driven by objective evidence, not your misguided sense of morality.

        • AI has no original capabilities to create anything. It all comes from the training set.

          So the AI CP is sourced from real CP. You still ok with that?

          And as far as "Viewing children being harmed prevents children being harmed" goes.. WTF? Those pedophiles viewing CP have created a market that encourages the creation of more CP. Their content didn't appear out of the void. It's either real kids being abused or it's now AI kids being abused created from real kids being abused used as training data.

          Take

          • by gweihir ( 88907 )

            So the AI CP is sourced from real CP. You still ok with that?

            That is pretty unlikely. Unless you think there is CP on the open Internet? What is far more likely is that faces and non-porn pictures of children get merged with body elements from adults and the adult elements get scaled down. That is well within the capabilities of generative AI.

            • by dfghjk ( 711126 )

              But that's not CP and would not appeal to a market targeted by CP.

              Why would "adult elements get scaled down"? Why would an AI know to do that? Does it shave the pubes too?

              "That is well within the capabilities of generative AI."

              But not the training of AI, unless it is trained in that manner (which is the point you are disagreeing with).

              • by Rei ( 128717 )

                You seem to think that AI art tools are compositors, that they have a database of images and just sort of cut and paste them together.

                That's not how it works.

                These diffusion engines work basically in the opposite direction of image recognition. Start with a chunk of static, and wherever something looks vaguely like some aspect of the prompt, try to make it look more like that aspect of the prompt. Now, it's not "one word at a time" (or even "one token at a time") -- tokens interact (e.g. "river bank" won't give you a financial ...
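                A minimal sketch of that denoising loop, assuming a PyTorch-style setup (the names model and text_encoder and the update rule are illustrative simplifications, not any particular engine's code):

                    import torch

                    @torch.no_grad()
                    def sample(model, text_encoder, prompt, steps=50, shape=(1, 4, 64, 64)):
                        cond = text_encoder(prompt)       # whole prompt -> embedding; tokens interact here
                        x = torch.randn(shape)            # the initial "chunk of static"
                        for t in reversed(range(steps)):
                            eps = model(x, t, cond)       # predict what in x still looks like noise, given the prompt
                            x = x - eps / steps           # remove a little of it (noise schedule heavily simplified)
                        return x                          # real systems then decode this latent to pixels with a VAE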

                • by Rei ( 128717 )

                  (Then again, there are probably enough pictures of naked babies and "embarrassing family pics" out there that it might understand that kids don't have pubes. Dunno. Again, feel free to try if you're into that sort of thing)

              • But that's not CP and would not appeal to a market targeted by CP.

                Why would "adult elements get scaled down"? Why would an AI know to do that? Does it shave the pubes too?

                "That is well within the capabilities of generative AI."

                But not the training of AI, unless it is trained in that manner (which is the point you are disagreeing with).

                The best thing is to spend a few hours fucking around with AI image generators yourself.

                Here is a prompt I would start with..
                "clear lake, woods, castle, mountains"

                How is it possible that not only did it draw a castle surrounded by trees and mountains... when you look into the lake you see a reflection of the castle and surrounding environment? Nobody programmed the material properties of water or gave it algorithms for computing fairly convincing reflections. Ditto for all the other elements in the scene
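                Incidentally, that reflection claim is easy to test yourself, and the test also shows these models are not compositing from a database: fix the random seed and the whole scene, reflections included, comes back identically (on the same hardware and settings). A hedged sketch using the open-source diffusers library (model id illustrative; weights download on first run):

                    import torch
                    from diffusers import StableDiffusionPipeline

                    pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
                    prompt = "clear lake, woods, castle, mountains"

                    img_a = pipe(prompt, generator=torch.Generator().manual_seed(42)).images[0]
                    img_b = pipe(prompt, generator=torch.Generator().manual_seed(42)).images[0]
                    # Same seed + same prompt -> the same image. The reflection is not looked up
                    # anywhere; it is re-derived every run from the same pseudo-random noise.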

          • by Calydor ( 739835 )

            Not quite. It should, theoretically at the very least, be able to extrapolate what a naked young human looks like by knowing what clothed young humans, clothed older humans, and naked older humans all look like. The differences aren't THAT great.

          • by dfghjk ( 711126 )

            Was looking for this comment, surprised it was from you.

            Would really "like" to see an example, even though "like" will be misinterpreted. I would like to know how an AI generates images that would be reasonably interpreted as CP, and how the people claiming it does can make that judgement. I am suspicious of bad faith here. It is possible, I would think, but how does an AI know what a child's private parts look like? Who's making the training sets?

          • AI has no original capabilities to create anything. It all comes from the training set.

            Image generators are pseudo random fields of noise mixed with context provided by the model. It's basically a version of monkeys banging on keyboards with far greater odds of producing something mostly coherent.

            When you look at AI imagery, concepts such as global illumination, shadows, lighting, reflections, the behavior of solids and liquids, materials, and artistic styles are all things the AI figured out and is able to apply. Ditto for the structures and variations of human forms.

            So the AI CP is sourced from real CP. You still ok with that?

            It knows what a human, young human ...

      • by dgatwood ( 11270 )

        Most of us don't want to promote this behavior in our adult population, no matter how much technology can sanitize it.

        By that same logic, we should ban violent video games out of fear that they make kids violent, too. But science has pretty thoroughly disproven that way of thinking, so long as you avoid meta-analysis papers (which can pretty much prove anything you want to prove just by choosing the right combination of papers). Violent kids gravitate towards violent games, but playing violent games doesn't make normal kids become violent.

        In much the same way, one would expect people who are attracted to little kids to s

      • by gweihir ( 88907 )

        There is no scientifically sound data indicating that this "promotes" the behavior. For other sexual activities, pornography always leads to significantly less violence and significantly less rape. There are good reasons to think this would be the same here.

      • by dfghjk ( 711126 )

        What behavior? And why limited to "our adult population"?

      • Then a lot more should be banned: movies, games, music, etc.
    • Who is hurt by AI generated child porn?

      All of us. The images themselves are sickening, and being computer generated doesn't make sexualized images of children any less so.

      Is it ok to show xrated movies to kids? No?
      What if the xrated movies are computer generated? Is it now ok? Why would it be?

      • Is it ok to show xrated movies to kids? No?
        What if the xrated movies are computer generated? Is it now ok? Why would it be?

        This is a bogus analogy because child porn isn't porn for kids, it's porn depicting children for the pedophile "demographic" (for lack of a better term).

        The logical reason for not allowing CG kiddie porn isn't that it's gross and offensive, because those are subjective judgements. There's no shortage of mainstream entertainment where it is implied that underage sex has taken place (perhaps you've heard of Stephen King's IT?). Instead, the reason it's bad to have CGI child porn is because its existence gives ...

      • by sjames ( 1099 )

        If it was children generating and looking at the images, you would have a point.

      • by dfghjk ( 711126 )

        Who cares what is "sickening" to you, your posts are sickening to us all.

        CP is not intended to show to a child audience.

        "What if the xrated movies are computer generated? Is it now ok? Why would it be?"

        Yes, it is OK to computer generate "xrated" movies. Preventing kids from watching them solves the problem, computer generated or not.

    • .. but no children are harmed, what's the problem? It's just pixels

      Puritans are frightened by nudity. Nude = pornography is a uniquely western take on porn. It's also absolutely silly how wound up some people get about it. It doesn't matter if the nudity is real or imagined. You hear people screaming child-porn all the time at any anime with nudity, or gods forbid, fan-service in it. Cartoons piss people off.

      I've always thought the people that get so upset about it probably need to adjust their sexual filters. If you get excited looking at pictures of kids nude, it's a you problem ...

      • Re: (Score:2, Interesting)

        Who are all these people screaming about anime?

        Can you provide some examples of puritans afraid of anime nudity, and explain how that makes AI-generated CP, which was trained on real CP, ok?

        I can't wait to see the twisted logic you come up with to make this ok beyond where you already started.

        • by gweihir ( 88907 )

          which was trained on real CP

          Again, you have zero indication for that.

          • Of course I do, from basic logic and what we know of how these LLMs work.

            They have zero capacity for original creation. Absolutely everything they know comes from being fed in, normally through training.

            If they produce CP then they know about CP because they were trained on CP.

            I'm open to discuss how an LLM could generate CP if it had never seen any but that's going to be a high bar.

            • Initial data could be:
              pics of kids
              pics of adults
              pics of naked kids (which I think is RIDICULOUS to refer to as child porn since every parent ever has these pics)
              pics of naked adults
              adult penetration

              I doubt that's that big of a leap. To a machine a penis is a penis and a vagina is a vagina. I really doubt they have any sort of delineation between adult and child when it comes to image generation of a similar nature, other than "this one's a little shorter and chubbier, this one's a little taller and skinnier ...

            • by gweihir ( 88907 )

              Of course I do from basic logic and what we know of how these LLM work.

              You do not seem to have these capabilities with regards to the current question.

              If they produce CP then they know about CP because they were trained on CP.

              That is complete nonsense. Stop anthropomorphizing these machines.

    • by gweihir ( 88907 )

      Indeed.

    • That is a matter for the courts, and in criminal court you have the right to:
      Discovery (aka demand the source code and logs)
      face your accuser (think speeding tickets: asking about the calibration records / asking what the operator was doing that day)
      Also, the state must turn over all docs to your team; if they try to say that some NDA means they can't, or that your team needs to pay all kinds of fees/costs, then that is a Brady violation!

    • .. but no children are harmed, what's the problem?
      It's just pixels

      If child porn from generated AI and from real children are indistinguishable, then a practically enforceable law must declare both to be either legal or illegal. Even if generated images are watermarked in some way, it's likely that real images could be made to pass the same watermark tests.
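      To make the watermark point concrete, here is a deliberately naive sketch of an LSB-style (least-significant-bit) invisible mark. Real provenance schemes are more elaborate, but they share the failure mode shown in the last line: anyone who can run the embedder can stamp a real photo so it passes the "generated" test.

          import numpy as np

          MARK = np.unpackbits(np.frombuffer(b"AI", dtype=np.uint8))  # 16-bit payload

          def embed(img):
              flat = img.copy().reshape(-1)
              flat[:MARK.size] = (flat[:MARK.size] & 0xFE) | MARK  # overwrite the lowest bits
              return flat.reshape(img.shape)

          def detect(img):
              return np.array_equal(img.reshape(-1)[:MARK.size] & 1, MARK)

          photo = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in "real" photo
          assert detect(embed(photo))  # a real image now passes the generated-image test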

      • If generated porn were indistinguishable, why would anyone ever not use generated porn to make money? The AI stuff would be much cheaper/easier/safer to produce.

        The only reason to use real children would be because you enjoyed doing so, but all commercial incentives would be removed.

        You could still prove child abuse by finding the original child.

    • Indeed, https://getimg.ai/ [getimg.ai] allows you to create whatever you want.

      Just ask for very, very young, not '8 year-old'.

      News this is not.

    • by CEC-P ( 10248912 )
      Who cares, burn the pedos. Sincerely, everyone. It's not about a victim, it's about "seriously screw those sick fucks."
    • You'd be fine with this in the US and other places. In the US you have to have a victim for an image to be considered child porn. Other places like Australia, consider fictional depictions to also be classified as child porn. So, Bart and Lisa Simpson (as minors) performing sexual acts could be classified as child porn.

  • It's not AI (Score:4, Insightful)

    by OrangeTide ( 124937 ) on Tuesday December 05, 2023 @12:14PM (#64056777) Homepage Journal

    It's the data. Garbage in / garbage out.

    Of course, poorly curated data sets have always been a serious issue in the AI/DL industry. And because we currently only know how to do quantity over quality when it comes to gathering data, this problem is going to persist for a very long time. The dirty little secret of "big data" is that it's big like a landfill is big, and similarly organized.

  • If you ask it to generate an image of a child, exactly what do you expect to get? Even the non-porn-targeted models still often have nudity in their training datasets.

    The site was almost unnavigable for non-porn purposes until they implemented filters. Even with them a lot slips through.

  • StableDiffusion & CivitAI are genies let WAY, WAY too far out of the bottle to do anything at all about. You'll never ever stop it now. Folks can just sit on what they have right this moment forever and the problem will never go away. Not sure what to do about it. We're teetering on 'thought police' here, but some thoughts... These weirdos exist. What can you do?

    • I would say that we need to focus on the types of enforcement that work against these people and protect kids. So, when the cops find some guy watching elementary school kids and taking pictures of them from his car, take that kind of thing way more seriously. Get a warrant on the guy, do some real police work. Same with the FBI. I'd say stop wasting resources covering up for politicians and get to work busting folks distributing the material and warehousing it. Given that some of this tech is already openly available ...
      • by HiThere ( 15173 )

        The important part is "protects kids". "works against these people" is rather irrelevant, at best.

        • Uhh, I'm totally okay with the LEOs working against criminals like child molesters rather than the crap they do when focused on politics or idiotic measures such as the drug war or writing traffic tickets. Investigation is working against them, for example. What is wrong with that?
      • by dbialac ( 320955 )
        Let's go with the opposite approach: just have everyone walk around naked when weather permits or when indoors and that way there's no such thing as child pornography.
    • by dgatwood ( 11270 )

      StableDiffusion & CivitAI are genies let WAY, WAY too far out of the bottle to do anything at all about. You'll never ever stop it now. Folks can just sit on what they have right this moment forever and the problem will never go away. Not sure what to do about it. We're teetering on 'thought police' here, but some thoughts... These weirdos exist. What can you do?

      Make images involving Natalie Portman, petrified, in hot grits, so that at least rule 34 will become true again?

  • This is definitely an interesting ethics debate. Religious moral superiority aside, are generated images really that bad? The reason it is illegal is that children are harmed. If none are harmed, then it should be legal.

    If anything, the existence of easily available AI content would massively reduce the number of real children harmed, because there would be no market for their content.

    • If computer generated (trained on real CP) is ok then are you cool with AI CP all around you in daily life?

      TV ads, posters hanging in your friend's home, your son or husband having an AI 5 year old sucking an old guy's cock as a screen saver, your boss mailing out AI CP to show everyone at work how realistic AI has become today?

      It's all AI generated and therefore ok?

      I think not.

      And remember, LLMs have zero original creation capability. If an AI is generating CP then it was trained on real CP. How is that ok ...

      • The core reason why CP is illegal is the damage to the person involved. No person, no damage.

        Everything else creates a slippery slope pretty quickly, because you're starting to judge what is going on in people's heads. If you start doing this, people like Stephen King should tread lightly because... have you read one of his books? If you judge people by what's going on in their heads, you wouldn't just lock him up, you'd lock him away where nobody can find him.

      • Your zero creativity claim is not correct. Such AIs are, for example, without specific guidance, coming up with new pharmaceutical molecules, some of which will work.

        How do you know the CP (which in some cases is just an interpretation) was not created from medical images, for example? The prompts used apparently did not mention pornography at all. They asked for hyper-realistic images of humans of various ages.
        • by DarkOx ( 621550 )

          "medical images" - are only so as long as they are used for the purpose they were taken - medical treatment or education about that subject etc. As soon as someone start distributing them or using them out of prurient interest - that becomes an abuse.

          So this argument, that the AI might never have been fed CP in the training data but can still produce CP, isn't valid, because so much of this is about intent. If the AI is being used to generate CP, then any image of an underage person used to train it becomes retroactively ...

      • I think adult porn is ok, but I am not ok with it being all around me in my daily life either. Or violence in movies, or sports.

        Just because someone doesn't think it is bad for someone to enjoy something they don't like, doesn't mean they want to be surrounded by it constantly.

        I don't think being gay is bad in any way shape or form, however I would still not be pleased if I was constantly being hit on by men, or was forced to watch men having sex.

      • If computer generated (trained on real CP) is ok then are you cool with AI CP all around you in daily life?

        Considering that people flip out over one lesbian kiss between consenting adults in a movie, I don't think we're in any immediate danger of pedophilia (simulated or otherwise) being normalized by society. Harming children is a despicable act, and rightfully so.

      • by dbialac ( 320955 )
        Regardless of your position, you're describing the need for situational boundaries, not specifically child pornography. Much of what you describe applies universally to pornography of any type. You wouldn't want a picture of an old guy sucking another old guy's cock, either. You wouldn't want an ad showing that, you wouldn't want your boss sending out the computer generated images, etc.
  • by bugs2squash ( 1132591 ) on Tuesday December 05, 2023 @12:42PM (#64056937)
    What was it trained on? If it can be traced back to CP then hopefully there can be some prosecutions.
    • You take some fashion magazines with kids' fashion, you take a nudie catalogue, you let AI combine them...

      • by DaPhil ( 811162 )

        I somehow doubt that an AI would be able to fashion boys' and girls' genitalia - including the completely missing breasts in girls, and the inability to retract the foreskin in boys - from adult men/women and clothed children. But this "I somehow doubt" is precisely the issue here, since very few of us have seen any of the outputs - and no one at all sees any of the inputs.

        • I mean, nobody said the anatomy was correct.

          It could be something as simple as child faces grafted onto adult bodies. Hence the ambiguous "could be considered CP" statement.

        • by gweihir ( 88907 )

          And how do you know this level of detail was there?

        • I seriously don't know how "anatomically correct" the pictures are. I haven't even seen any of the output; what's linked in the article is pretty much what you'd probably find in some fashion catalogue.

          Getting an AI to give girls flat breasts would be easy: just teach the AI that girls look like boys "up there". How you model children's genitalia without knowing the difference from adult genitalia is a tricky question, though, I'll give you that. They probably just adjust the size and create something out of some ...

  • So? And if my aunt had balls, she'd be my uncle.
  • You know... (Score:5, Funny)

    by Opportunist ( 166417 ) on Tuesday December 05, 2023 @12:50PM (#64056985)

    Pics or didn... erh...

    Never mind, I believe you.

  • But was it, really? (Score:5, Informative)

    by Miles_O'Toole ( 5152533 ) on Tuesday December 05, 2023 @01:07PM (#64057065)

    Whenever I hear somebody say "could have been" in connection with pornography, I get skeptical. Religious fundamentalists are so crazy and perverted that as far as they're concerned, just about anything "could have been" porn.

    For context: one of my buddies had the police knock on his door because at the pub he showed us a short video of his two little kids playing in the bath. Apparently, somebody who wasn't even being shown the video decided this was child pornography.

    There was justice, eventually. He simply pulled out his phone and showed the cops the video. And we figured out who the a-hole was and got them permanently barred from the local.

    • by gweihir ( 88907 )

      Religious fundamentalists are so crazy and perverted that as far as they're concerned, just about anything "could have been" porn.

      Indeed. These people just try to suppress anything that their deranged priests tell them to (while those priests more often than not engage in some quiet child raping), and blatantly lying to non-believers is obviously ok.

  • Open your favourite book of fiction. Is it crime? People kill each other. History? People kill each other by the millions, destroy countries, and take slaves. Romcom? People cheat and cuckold. The storyline in any prose is illegal behaviour of some kind.
    Watch the news. War here, war there (read: people killing each other). School shootings. Financial crime and feuds. Celebrities having affairs (which used to be a crime and still is in some countries).

    Most content that mankind has generated or keeps generating is about ...

  • That isn't an innocent prompt; that's a prompt asking for a girl and a boy and specifying a number, 15.
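    For what it's worth, in some Stable Diffusion front ends (notably the AUTOMATIC1111 web UI) bracket syntax like [from:to:when] is "prompt editing": the sampler renders "from" for the first when steps, then switches to "to", so the 15 would be a sampler step count rather than an age; whether the prompter knew that is anyone's guess. A sketch of how such a prompt is commonly split (exact semantics vary by front end):

        import re

        def split_prompt_edit(prompt, total_steps):
            # "[from:to:when]": when >= 1 is a step number, when < 1 a fraction of steps
            m = re.search(r"\[([^:\[\]]*):([^:\[\]]*):\s*([\d.]+)\s*\]", prompt)
            if not m:
                return [(prompt, total_steps)]
            a, b, when = m.group(1).strip(), m.group(2).strip(), float(m.group(3))
            switch = int(when) if when >= 1 else int(when * total_steps)
            splice = lambda s: prompt[:m.start()] + s + prompt[m.end():]
            return [(splice(a), switch), (splice(b), total_steps - switch)]

        print(split_prompt_edit("[girl: boy: 15], hyperdetailed", total_steps=50))
        # -> [('girl, hyperdetailed', 15), ('boy, hyperdetailed', 35)]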
  • by MobyDisk ( 75490 ) on Tuesday December 05, 2023 @02:41PM (#64057359) Homepage

    This is my weekly reminder that AI has no agency. If someone instructed the AI to generate an image with something awful, so be it, shame on them. I don't think it is the job of the tool to police the user, it is the other way around. Photoshop doesn't detect if users make fake news, Notepad doesn't detect if users compose a death threat, JPEG encoders don't enforce copyright infringement, Tesla Autopilot does not detect if you are violating your parole, etc.

    Consider:
    "Leaked documents show that hammer hit fingers." should really be "User hit finger with hammer, hammer did not try to stop them."

  • Well, the price just went even higher for stock photography models under 18. Prices are already insanely high because, shockingly enough, most kids can't act. I've heard from photography friends who tried to get into selling pics to Adobe's various stock photography networks that they gave up on kids, because they're like "Show me angry. Okay, now you're frustrated at the computer" and the results were so fake and staged and bad that they had to throw out most of the final pics, yet pay about double to the parents ...
  • Do you want to block approved content or permit illegal/offensive content?

    There are a few non-criminal reasons for wanting nude images of children, primarily medical. A series of images or an animation of normal and/or abnormal development, for instance.

    You're probably not going to lose clients blocking that (unless you have a med school as a client), but generalise the problem and you get the idea. No matter what you do, you're going to end up blocking good stuff or allowing bad stuff.
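    What the parent describes is the classic false-positive/false-negative tradeoff of any score-and-threshold filter. A toy illustration, with all scores and labels fabricated for the example:

        # One threshold, two failure modes; numbers are made up for illustration.
        scores = [0.05, 0.20, 0.45, 0.55, 0.80, 0.95]  # hypothetical "bad content" scores
        is_bad = [False, False, False, True, True, True]

        for threshold in (0.3, 0.6, 0.9):
            blocked_good = sum(s >= threshold and not b for s, b in zip(scores, is_bad))
            allowed_bad = sum(s < threshold and b for s, b in zip(scores, is_bad))
            print(f"threshold={threshold}: blocked {blocked_good} good, let through {allowed_bad} bad")

        # Once the score distributions overlap, no threshold gets both numbers to
        # zero: block the med-school imagery, or let some bad content slip through.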
