AI-generated child sex imagery has every US attorney general calling for action: “A race against time to protect the children of our country from the dangers of AI.”

  • db2@sopuli.xyz · 1 year ago

    They’re not pictures of real people; proceeding against it on that basis undermines the point and makes them look like idiots. It should be banned on principle, but ffs, there’s got to be a way to do it that doesn’t look reactionary and foolish.

    • ram@lemmy.ca · 1 year ago

      Except when they are pictures of real people doing a body swap

      • db2@sopuli.xyz · 1 year ago

        That isn’t at all what an AI-generated image is. People have been doing that for more than 50 years.

        • ram@lemmy.ca · 1 year ago

          🤦‍♀️ I obviously mean that the replaced portions of the body are AI-generated, like Photoshop and various other tools have been doing.

      • BadRS@lemmy.world · 1 year ago

        That’s been possible since before Photoshop, and it’s certainly still possible after it.

      • fidodo@lemm.ee · 1 year ago

        Shouldn’t that already be covered under revenge porn laws? At least the distribution side of it.

    • ThrowawayOnLemmy@lemmy.world · 1 year ago

      But aren’t these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.

      • Rivalarrival@lemmy.today · 1 year ago

        No, you can use a genetic algorithm. You have your audience rate a legal, acceptable work. You present the same work to an AI and ask it to manipulate traits, and provide a panel of works to your audience. Any derivative that the audience rates better than the original is then given back to the AI for more mutations.

        Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.

        Have a bunch of pedophiles doing the training, and you end up with “BeyondCP”.
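        The loop being described is essentially a genetic algorithm with the audience acting as the fitness function. A minimal sketch, with the “works” abstracted to plain trait vectors and a stand-in rating function in place of human raters (every name here is hypothetical, not any real system):

```python
import random

random.seed(0)  # deterministic for the sketch

def mutate(traits, rate=0.3):
    # Randomly perturb some traits to produce a "derivative work".
    return [t + random.gauss(0, 1) if random.random() < rate else t
            for t in traits]

def evolve(original, rate_fn, generations=100, panel_size=10):
    # Present a panel of mutated derivatives, keep whichever one the
    # "audience" rates highest, then mutate the winner again --
    # the feedback loop described in the comment above.
    best, best_score = original, rate_fn(original)
    for _ in range(generations):
        panel = [mutate(best) for _ in range(panel_size)]
        for candidate in panel:
            score = rate_fn(candidate)
            if score > best_score:
                best, best_score = candidate, score
    return best, best_score

# Stand-in "audience": rates a work higher the closer its traits
# sit to some hidden preference vector.
preference = [3.0, -1.0, 2.0]
def rating(traits):
    return -sum((t - p) ** 2 for t, p in zip(traits, preference))

result, score = evolve([0.0, 0.0, 0.0], rating)
```

        The point the commenter is making falls out of the structure: the optimizer only ever sees scores, so it can steer toward what raters prefer without any specialized training data.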

      • mind@lemmy.world · 1 year ago

        No, for the same reason the AI pictures of astronauts riding horses on the moon are not trained on real astronauts riding horses on the moon.

    • NegativeInf@lemmy.world · 1 year ago

      My question is where they got the training data for a sufficiently advanced CP image generator. Unless it’s just AI porn with kids’ faces? Which is still creepy, but I guess there are tons of pictures that people post of their own kids?

      • SinningStromgald@lemmy.world · 1 year ago

        Manga, manhwa, CG sets, etc. of shota/loli. Sprinkle in some general child statistics for height, weight, etc. And I’m sure social media helped as well, with people making accounts for their babies, for God’s sake.

        Plenty of “data” I’m sure to train up an AI.

      • bobs_monkey@lemm.ee · 1 year ago

        Wouldn’t put it past some sick fucks to feed undesirable content into AI training

      • Kerfuffle@sh.itjust.works · 1 year ago

        It’s obviously very distasteful but those needs don’t just go away. If people with that inclination can’t satisfy their sexual urges at home just looking at porn, it seems more likely they’re going to go out into the world and try to find some other way to do it.

        Also, controlling what people do at home that isn’t affecting anyone else, even in a case like this, isn’t likely to target exactly just those people, and it’s also very likely not to stop there. I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful and that the cure wasn’t going to be worse than the disease.

        • phx@lemmy.ca · 1 year ago

          Even if the imagery was 100% computer-generated, without a training model based on abuse, at some point there’s a line between “is somebody being hurt” and “what is acceptable in civilized society”. If we accept CG CSAM, then what else? Gore porn? Snuff? Bestiality? How about a child being sexually assaulted and mutilated by animals at the same time? There is always stuff that’s going to push the envelope further and further, and how do you even tell if real individuals are involved, if it’s CG, or some combination of the two?

          If it’s all CG, then nobody real is being hurt, BUT there still has to be a line between acceptable and unacceptable. Society, at least Western society, has by and large decided that the majority of those categories are not acceptable by law, so regardless of how it’s made, it’s still illegal, and the dissemination of such material can still have a harmful effect on society in general.

          • PopOfAfrica@lemmy.world · 1 year ago

            We already have that line, though. Beheading photos, for example, aren’t illegal, but they are banned from most websites.

        • JackbyDev@programming.dev · 1 year ago

          My brother in Christ, they have to train the fucking models on real CSAM. It’s not like AI-generated CSAM is suddenly a victimless crime.

        • ram@lemmy.ca · 1 year ago

          those needs don’t just go away

          Get psychological help

          If people with that inclination can’t satisfy their sexual urges at home just looking at porn, it seems more likely they’re going to go out into the world and try to find some other way to do it.

          Get psychological help

          Also, controlling what people do at home that isn’t affecting anyone else

          Feeding pedophilia is directly harmful to children who grow more at risk

          I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful and that the cure wasn’t going to be worse than the disease

          I’d personally be very hesitant to say “it’s okay to beat off to children” unless there was an actual clinical psychologist involved with the person I’m speaking to saying as such.

          • Kerfuffle@sh.itjust.works · 1 year ago

            Get psychological help

            How about addressing my points instead of the ad hominem attacks?

            Feeding pedophilia is directly harmful to children who grow more at risk

            Like I said: “I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful”

            If what you’re saying here were actually true, then the type of evidence I mentioned would exist. I kind of doubt it works that way, though. If you stop “feeding” being straight, gay, whatever, does it just go away and you no longer have those sexual desires? I doubt it.

            Much as we might hate it that some people do have those urges, it’s the reality. Pretending reality doesn’t exist usually doesn’t work out well.

            I’d personally be very hesitant to say “it’s okay to beat off to children”

            I never said any such thing. Also, in this case, we’re also talking about images that resemble children, not actual children.

            It should be very clear to anyone reading that I’m not defending any kind of abuse. A knee-jerk emotional response here could easily increase the chances that children are abused. Or we could give up our rights “for the children” in a way that doesn’t actually help them at all. Those are the things I’m not in favor of.

            • Jamie@jamie.moe · 1 year ago

              I’m not the guy you’re replying to, but I will say this is a topic that is never going to reach a good consensus, because there are two moral principles at play which, under normal circumstances, are completely uncontroversial. In this context, however, they collide.

              1. Pornography depicting underage persons is reprehensible and should not exist

              2. The production and related abuse of children should absolutely be stopped

              To allow AI child porn is to say that, to some extent, we allow the material to exist, whether it depicts an approximation of a real person or not, with the potential gain of undercutting the industry producing the real thing. To make it illegal is to agree with the consensus that it shouldn’t exist, but maintains the status quo on issue #2 and, in theory, causes more real children to be harmed.

              Of course, the argument here goes much deeper than that. If you try to dig into it mentally, you end up going into recursive branches that lead in both directions. I’m not trying to dive into that rabbit hole here, but I simply wanted to illustrate the moral dilemma of it.

              • Droechai@lemm.ee · 1 year ago

                So should we ban books like Lolita, since they can be interpreted as porn, or is it only visual material that should be banned? If books are okay, is an image of stick figures with a sign saying “child” okay? How much detail should a visual image have before it gets banned?

                How about 1000-year-old dragons in a child’s body? How about images of porn stars with very petite bodies?

            • ram@lemmy.ca · 1 year ago

              How about addressing my points instead of the ad hominem attacks?

              That is addressing your point. These people need to get psychological help.

              If you stop “feeding” being straight, gay, whatever, does it just go away and you no longer have those sexual desires? I doubt it.

              The harms brought by conversion therapy to gay and straight people outweigh the harms brought about by allowing them to exist. The same is not true of pedophilia. Though it is interesting, if you do see these as the same: are you for persecuting gay and straight people the way you are pedophiles, or are you in favour of pedophiles being able to act on their desires?

              Much as we might hate it that some people do have those urges, it’s the reality. Pretending reality doesn’t exist usually doesn’t work out well.

              It is the reality, and pretending people will just safely keep their desires to themselves has proven to not work.

              I never said any such thing.

              I never said you said it, but it is the result of what you’re saying.

              we’re also talking about images that resemble children

              Since you’re drawing this distinction from words you decided were put in your mouth (they weren’t), would you say “it’s okay to beat off to children who may not exist”?

              It should be very clear to anyone reading I’m not defending any kind of abuse

              You’re outwardly expressing pedophile apologia.

              Or we could give up our rights “for the children” in a way that doesn’t actually help them at all.

              What rights are you giving up?

              • Falmarri@lemmy.world · 1 year ago

                That is addressing your point. These people need to get psychological help.

                What help do you think exists for these people? And what if CG porn is that help?

                • ram@lemmy.ca · 1 year ago

                  Psychologists.

                  There’s no evidence that CSAM, real or virtual, helps reduce rates of child predation.

              • mean_bean279@lemmy.world · 1 year ago

                There’s a disgusting number of people on this site that I’ve seen defending AI pedos. I honestly don’t understand where it comes from. Some people cannot and should not be helped, as their views are incompatible with society.

                Not to mention that AI pedophilia could simply be a massive stepping stone to the real thing. I’ve also seen a number of people on Lemmy defend people possessing CSAM, saying that because they didn’t produce it, they aren’t the criminal. It’s pure insanity. I’m incredibly liberal and progressive, and even I know that’s a slope I don’t wish society to slip down; it’s not worth the risk to innocent children caught in the crossfire.

                • ram@lemmy.ca · 1 year ago

                  I can’t say I entirely agree. I do think they should be helped, but in a measured and rigorous way. None of this “let them find shit online that quells their needs”. Pedophilia, in the psychological profession, is viewed in a similar light to sexual orientations; on that point, the person I’m responding to is correct. It’s simply that they seem blind to any nuance beyond that stance.

                  AI pedophilia is certainly a very risky thing for us to simply accept, when we don’t even have any data on how consumption of real or virtual CSAM impacts those who indulge in it, and getting that data would require us to do very unethical and likely illegal research, as far as I can tell. The approach Kerfuffle@sh.itjust.works is suggesting is naive and myopic in the most generous light; which is how I’m choosing to take it, so as not to accuse them of something they may not be guilty of.

                  I’m also someone who’s extremely progressive, and while I can sympathize with people who have these urges and no true wish to act on them, I think it’s outright malicious to say that the solution is to simply allow them to exist with informal self-treatments based on online “common sense” idealism. Mental health support should absolutely be available and encouraged; part of that is making sure people are safe to disclose this stuff to medical professionals, but no part of that is just having this shit freely spread online.

                  I appreciate your measured and metered response. I think these are extremely tricky conversations to have, but important, especially with how technology is progressing.

      • PilferJynx@lemmy.world · 1 year ago

        Humans have been raping kids since our inception. Childhood is a relatively modern concept that young adults are now a part of. It’s an ugly and pervasive subject that needs further study to reduce child harm.

    • Tibert@compuverse.uk · 1 year ago

      Whatever you wanted to say, I do not understand your point.

      It’s not about directly protecting images of real people; it’s about preventing pedophilia.

      Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI, will worsen the ADDICTION!

      • hh93@lemm.ee · 1 year ago

        No one chooses to be a pedophile - as far as we know it’s just the unluckiest possible sexual attraction.

        Stigmatizing it won’t help anyone - those people need help and everything that doesn’t hurt real children until they get themselves that psychological help is good in my book

        • Tibert@compuverse.uk · 1 year ago

          Wtf are you talking about.

          They may not be choosing it, but accepting that they exist and ignoring them is a way of saying you accept pedophilia.

          They should know that what they’re doing is wrong and that there is help (at least in France there are programs to try to help them; it’s treated about the same as an addiction, just with different content).

          • hh93@lemm.ee · 1 year ago

            Of course they know that what they do is wrong

            But thoughts aren’t criminal and if they manage to control their attraction in a way that hurts no-one (like this ai-generated stuff or the child-sex-dolls or other things that are very creepy to almost everyone else) then they should totally be normal members of society

            I’m not saying that they should be proud of it and see it as something normal - it’s definitely a psychological issue - but denying them stuff that helps them satisfy their urges without hurting anyone should be a good step to help them face their sexuality instead of suppressing them

            I think it should totally be possible to access things like this, but it should also be necessary to link to a helpline like the one you mentioned (Germany has something like that), similar to how a suicide helpline link is necessary whenever someone mentions that subject

      • Ted Bunny@lemmy.ml · 1 year ago

        Are you saying they’re “born this way?” Has that been demonstrated anywhere?

  • DarkSpectrum@lemmy.world · 1 year ago

    Isn’t AI generated better than content sourced from real life? It could actually drive a reduction in child sexual abuse instances due to offenders leveraging alternative sources.

    • doggle@lemmy.dbzer0.com · 1 year ago

      I see what you’re saying, but AI has yet to offset regular porn production at all. There’s no reason I can see to think accepting AI CP would do anything but normalize it and make it more accessible, possibly increasing demand for the real thing.

      Also, the AI models need to be trained on something…

      • regbin_@lemmy.world · 1 year ago

        Based on my understanding of how current diffusion models work, you don’t actually need to train them on CP. As long as a model knows what humans look like without clothes and what children look like even when fully clothed in abayas and the like, it can make the connection and generate CP when asked to.

        Just to be clear, I’m totally against any form of CP and CSAM. Just explaining how the tech works.

    • krayj@sh.itjust.works · 1 year ago

      One big problem is that it makes enforcement against real abuse impossible. If there were an explosion of that kind of AI-generated content, and it got good enough to be confused with the real thing, real abuse would slip under the radar. It would be impossible to sift through all that content trying to differentiate AI-generated material from real if AI-generated material were ever allowed.

    • Tibert@compuverse.uk · 1 year ago

      No, it won’t.

      Fueling the addiction will harden the person’s mental health problem.

      One of the ways to help a person with such an addiction is to replace the material with adult pornography.

      Fueling it with more of that content won’t do any good.

      • Thorny_Thicket@sopuli.xyz · 1 year ago

        Mind linking the study you’re referring to here?

        The majority of pedophiles never offend, and most of the people in jail for child abuse are plain old rapists with no particular interest in kids per se; they’re just an easy target.

        This is the same old “violent games make people violent” argument all over again.

        • Tibert@compuverse.uk · 1 year ago

          Wtf.

          I’m not talking about all of them being violent or creating issues for all the children in the world.

          Everyone is different, and only a small % is violent.

          And no, it’s not the same as violent games, nor was there anything in my argument pointing to that, wtf!

          If you want a study or whatever, here: https://www.cairn.info/revue-l-information-psychiatrique-2011-2-page-133.htm It’s a bit old; things may have changed.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            You mind quoting the part where it talks about the role of adult pornography and how fueling this “addiction” makes it worse?

            • Tibert@compuverse.uk · 1 year ago

              There’s no part to quote about worsening; there are plenty of studies on pornography addiction, and addiction to child abuse material is the same kind of thing.

              However, child abuse material is in a way different content from adult pornography, and each can have its own level of addiction.

                • Tibert@compuverse.uk · 1 year ago

                  I’m not sure I follow you; why wouldn’t it have a link?

                  It obviously states there is psychological help.

                  Doesn’t that answer your first question about a link?

                  If you want more links, go look on Wikipedia; there are hundreds of them.

                  I won’t bother with you if you’re not reading them.

            • Tibert@compuverse.uk · 1 year ago

              It was in a news/investigative program on TV at one point. I can’t link it because I don’t remember when it aired or which program it was.

              They talked about people being placed in some sort of care home. There, they helped the people get away from such content, replacing it with adult content.

              This was pretty recent, if I remember correctly, so the article above, being from 2012, may not include all the studies and information.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 1 year ago

    In the United States, there are significantly greater dangers to kids than AI porn. Hunger, poverty and the climate crisis come to mind.

    If we are refusing to address these for ideological reasons (e.g. because it’s “socialism”), then the established system itself is a threat to kids.

    Priorities.

  • Sanctus@lemmy.world · 1 year ago

    Pandora’s digital box has been opened, and I don’t think this one ends with everything going back in the box.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM).

    In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability.

    Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation’s top prosecutors.

    Establishing a proper balance between the necessity of protecting children from exploitation and not unduly hamstringing a rapidly unfolding tech field (or impinging on individual rights) may be difficult in practice, which is likely why the attorneys general recommend the creation of a commission to study any potential regulation.

    In the past, some well-intentioned battles against CSAM in technology have included controversial side effects, opening doors for potential overreach that could affect the privacy and rights of law-abiding people.

    Similarly, the letter’s authors use a dramatic call to action to convey the depth of their concern: “We are engaged in a race against time to protect the children of our country from the dangers of AI.”


    The original article contains 960 words, the summary contains 225 words. Saved 77%. I’m a bot and I’m open source!

  • Blue2a2@sh.itjust.works · 1 year ago

    I’m sure it is entirely coincidental that the call to action is to restrict/control the free open-source software, and leaves Google and Microsoft safely in control with their curated models.

    This is just like the time the US made websites responsible for their users’ content, and coincidentally made it much more legally dangerous to start your own social media platform.

    But sure, I mean, just think of the (imaginary) children! We need to stop this theoretical abuse of imaginary children by passing laws that make it harder for any AI not created by a tech giant to operate.

  • ram@lemmy.ca · 1 year ago

    A lot of people here really defending child pornography

    • Serdan@lemm.ee · 1 year ago

      It’s an obvious overreach.

      An AI-generated image is essentially the solution to a math problem. Say the images are/become illegal. Is it then also illegal to possess the input to that equation? The input can be used to perfectly replicate the illegal image, after all. What if I change a word in the prompt so that the subject of the generated image becomes clothed? Is that suddenly legal?
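      A toy illustration of the determinism point: with a fixed “model”, the output is a pure function of (prompt, seed), so holding the inputs lets you reproduce the image bit-for-bit. (`generate_image` below is a made-up stand-in, not a real diffusion API.)

```python
import hashlib
import random

def generate_image(prompt: str, seed: int, size: int = 64) -> bytes:
    # Toy stand-in for an image generator: the "image" is derived
    # deterministically from (prompt, seed) and nothing else.
    key = hashlib.sha256(f"{prompt}|{seed}".encode()).hexdigest()
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(size))

a = generate_image("a landscape", seed=42)
b = generate_image("a landscape", seed=42)
c = generate_image("a landscape", seed=43)

assert a == b  # identical inputs reproduce the image exactly
assert a != c  # changing one input changes the output
```

      A real image-synthesis pipeline behaves the same way given a fixed model, sampler, and seed, which is what makes possessing the input practically equivalent to possessing the output.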

      I understand the concern, but it’s just incredibly messy to legislate what amounts to thought crimes.

      Maybe we could do something to discourage distribution, but the law would have to be very carefully worded to prevent abuse.

    • Absolutemehperson@lemmy.world · 1 year ago

      And everyone pointing that out gets downvoted into oblivion, rofl. I hate the internet and the sick degenerates on it.