• HauntedCupcake@lemmy.world · 4 months ago

      I’m not sure where you’re going with that? I would argue that yes, it is: it’s sexual material of a child, with that child’s face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.

      But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn; no child was actually harmed in the process.” Which, as I’ve said above, I disagree with, especially in this particular circumstance.

      Apologies if it’s just my reading comprehension being shit

      • Todd Bonzalez@lemm.ee · 4 months ago

        That’s one definition, sure.

        Now answer the very simple question I asked about whether or not child porn is abusive.

      • NotMyOldRedditName@lemmy.world · 4 months ago

        It’s actually not clear that viewing such material leads a person to commit in-person abuse.

        Providing non-harmful ways to access the content may lead to less abuse, since the content they seek would no longer come from abuse, reducing demand for abusive material.

        That being said, this instance isn’t completely fabricated, and its further release is harmful because it involves a real person and will have an emotional impact on her.

          • NotMyOldRedditName@lemmy.world · 4 months ago

            There has been, yes, but that doesn’t mean it’s the right ruling. The law also varies by jurisdiction, because it’s a murky area.

            Edit: in the USA it might not even be illegal unless there was intent to distribute:

            By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.

            So local AI generating fictional material that is not distributed may be okay federally in the USA.

            • delirious_owl@discuss.online · 4 months ago

              Serious value? How does one legally argue that their AI-generated child porn stash has “serious value” so that they don’t get incarcerated?

              Laws are weird.

              • NotMyOldRedditName@lemmy.world · 4 months ago

                Have the AI try to recreate existing CP already deemed to have serious value, then submit all the prompts/variations leading up to the closest match as part of an exhibit.

                Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.

      • Todd Bonzalez@lemm.ee · 4 months ago

        So you don’t think that nudifying pics of kids is abusive?

        Says something about you I think…

              • Zoot@reddthat.com · 4 months ago

                Then don’t defend them? You’re trying to tell everyone that what is literally described in the article above, about a child who had PHOTOREALISTIC pictures made of her, isn’t CSAM.

                It is. Deleting everyone’s comments who disagree with you will not change that, and if anything, WILL make you seem even more like the bad guy.

          • Todd Bonzalez@lemm.ee · 4 months ago

            drawings

            Nobody said anything about drawings, but interesting default argument… Thanks for telling the class that you’re a lolicon pedo.

            the liberty of masses be stomped and murdered

            Nobody said that anyone should be stomped and murdered, so calm down, lmao. We’re just saying that child porn producers, consumers, and apologists are vile, disgusting perverts who should be held accountable for their crimes against children.

              • magi@lemmy.blahaj.zone · 4 months ago

                They’re making them unsafe? You and your bullshit are making them unsafe. Every comment you post reeks of your true character. Go get help.

                • Todd Bonzalez@lemm.ee · 4 months ago

                  I’m making kids unsafe by…

                  checks notes

                  …being firmly and unwaveringly against the sexual exploitation of children.

                  I really can’t stress enough that this was an actual 15 year old girl who was pornified with AI. This isn’t some “erotic drawings” argument, the end results were photorealistic nudes with her unmodified face. This isn’t some completely AI generated likeness. Pictures of her from social media were exploited to remove her clothes and fill in the gaps with models trained on porn. It was nonconsensual pornography of a kid.

                  Anyone who read this story, and feels the need to defend what was done to this girl is a fucking monster.

                  I can’t believe that the person defending sex crimes of this magnitude is a fucking mod.