A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • Sume@reddthat.com · 1 year ago

    Not sure how people can be so into this shit. It’s all so generic-looking.

    • BreakDecks@lemmy.ml · 1 year ago

      The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…
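The “50 or more photos” figure works because a LoRA doesn’t retrain the whole diffusion model: it learns a low-rank update to a few weight matrices, so there are comparatively few parameters to fit. A toy NumPy sketch of that parameter math (the matrix size and rank below are illustrative assumptions, not Stable Diffusion’s real dimensions):

```python
import numpy as np

# LoRA idea: instead of fine-tuning a full d x d weight matrix W,
# train two small matrices A (r x d) and B (d x r) whose product
# B @ A is a rank-r update added to W's output.
d, r = 768, 8  # hypothetical layer width and LoRA rank

rng = np.random.default_rng(0)
W = rng.normal(scale=0.02, size=(d, d))  # frozen pretrained weight
A = rng.normal(scale=0.02, size=(r, d))  # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-init

x = rng.normal(size=d)
y = W @ x + B @ (A @ x)  # forward pass: original path plus low-rank update

full_params = W.size           # weights you'd touch fine-tuning W directly
lora_params = A.size + B.size  # weights the LoRA adapter actually trains
print(f"full: {full_params}, lora: {lora_params}")  # full: 589824, lora: 12288
```

Because B starts at zero, the adapter initially leaves the model’s output unchanged; training only has to fit the small A and B matrices, which is part of why a small single-subject photo set can be enough.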

      • lloram239@feddit.de · 1 year ago

        Using a LoRA is the old way; these days you can use Roop, FaceSwapLab, or ReActor, which not only work from as little as a single good photo, they also produce better-looking results than a LoRA. There is no time-consuming training either: just drag and drop an image and you get results in a couple of seconds.

        • pinkdrunkenelephants@sopuli.xyz · 1 year ago

          So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.

          Or even worse, deepfake evidence of rape.

          Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

          They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

          Actual victims’ movements would be cut off at the knees, because there would no longer be a definitive way to prove a rape happened: defendants could credibly claim that real videos are just AI-generated crap and get acquitted. No rape or abuse claim would ever be believed, because there would be no way left to establish objective truth.

          This would leave the fascists open to do whatever they want to anybody with no serious consequences.

          But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

          • hyperhopper@lemmy.ml · 1 year ago

            People will have to learn to stop believing everything they see. This has been possible with Photoshop for more than a decade now. All that’s changed is that it takes less skill and time.

            • pinkdrunkenelephants@sopuli.xyz · 1 year ago

              That stops working when AI-generated images are impossible to distinguish from reality, or even with expertly done photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI into Photoshop too, so ban that garbage as well.

              It has to stop. We can’t allow the tech industry to enable fascism and propaganda.

          • Silinde@lemmy.world · edited · 1 year ago

            Because that’s called libel, and it’s very much illegal in practically every country on earth. Depending on the country, it’s either easy or trivial to bring and win a libel case in court, since the onus is on the defendant to prove that what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

              • Silinde@lemmy.world · 1 year ago

                The burden of liability would then fall on the media company, which could be sued for not carrying out due diligence in its reporting.

          • Liz@midwest.social · 1 year ago

            We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

            • pinkdrunkenelephants@sopuli.xyz · 1 year ago

              But then people will say “Well how do we know they’re not lying?” and then it’s back to square 1.

              Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.

    • Psythik@lemm.ee · 1 year ago

      AI is still a brand-new tech. It’s like getting mad at AM radio for being staticky and low-quality. It’ll improve with time as the tech gets better.

      Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

        • JackbyDev@programming.dev · 1 year ago

          The article mentioned that at least one OnlyFans creator reached out to make a model of their own content and also mentioned that some OnlyFans creators outsource writers to chat with fans. I don’t think this will meaningfully affect cam girls’ jobs. Once we are able to make live animated images realtime with convincing speech and phrases then probably.