Each of these reads like an extremely horny and angry man yelling his basest desires at Pornhub’s search function.

  • P03 Locke@lemmy.dbzer0.com · +219 / −18 · 1 year ago

    There is so much wrong with just the title of this article:

    1. What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out.
    2. “Everyone is for sale”. No, that’s the current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don’t need to force an underage, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
    3. “For Sale”. Again, where’s the sale? This shit is free.

    A 404 Media investigation shows that recent developments

    Get the fuck outta here! This two-bit blog wants to call itself “a 404 Media investigation”? Maybe don’t tackle subjects you have no knowledge or expertise in.

    The Product

    Repeat: FOR FREE! No product!

    In one user’s feed, I saw eight images of the cartoon character from the children’s show Ben 10, Gwen Tennyson, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.

    Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called “Rule 34”. You remember that whole Tifa porn video that showed up in an Italian courtroom? Somebody had to animate that. 3D porn artists take their donations from Patreon. Are you going to go after Patreon, too?

    These dumbasses are describing things like they’ve been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt of what they found.

    What a shitty, pathetic attempt at creating some sort of moral panic.

    • jeremyparker@programming.dev · +28 / −2 · 1 year ago

      The danbooru aspect of the “AI” moral panic is what annoys me.

      So many of my friends - many of whom are amateur artists - hate computer generated images because the artists’ copyrights were violated, and they weren’t even asked. And I agree that does kinda suck - but - how did that happen?

      Danbooru.

      The art had already been “stolen” and was available online for free. Where was their morality then? For the last decade or whatever that danbooru has been up? Danbooru is who violated the copyright, not stable diffusion or whatever.

      At least computer generated imagery is different, like, the stuff it was trained on was exactly their art, while this stuff, while it might look like theirs, is unique. (And often with a unique number of fingers.)

      And, if “copyright” is their real concern, then surely they understand that copyright only protects against someone making a profit off their work, right? Surely they’ll have looked into it and they already know that “art” made by models that used copyrighted content for training is prohibited from being copyrighted itself, right? And that you can only buy/sell content made from models that are in the copyright clear, surely they know all this?

      No, of course not. They don’t give a shit about copyright, they just got the ickies from new tech.

      • adrian783@lemmy.world · +6 / −5 · 1 year ago

        no one is moral panicking over ai. people just want control over their creation, whether it’s profit sharing or not being used to train models.

        you really can’t see how an imageboard has completely different considerations over image generating models?

        or that people are going after ai because there is only like a couple of models that everyone uses vs uncountable image hosts?

        both danbooru and stable diffusion could violate copyright, not one or the other.

        why would someone want training models to ingest their creation just to spit out free forgeries that they cannot claim the copyright to?

        • TwilightVulpine@lemmy.world · +5 · 1 year ago

          Yeah. It’s pretty iffy to go “well, these other guys violated copyright so they might as well take it” as if once violated it’s all over and nobody else is liable.

          • jeremyparker@programming.dev · +0 / −1 · 1 year ago

            This is a bad faith reading. The argument isn’t that “someone else did it first” - the argument is that the concern over copyright is suspiciously sudden. No one has gotten mad about danbooru - or Reddit, or Facebook, or any of the other billions of sites that use content created by others to draw users and make a profit from ad revenue. Why are people mad about some neckbeard’s $3/month patreon based on an unoriginal art style, but not about Facebook (etc) destroying the entire thing that used to be called journalism? Danbooru literally stole the work, why is no one mad about that? Why are they only mad when someone figuratively steals the work?

            AI art has similar potential: it could do to art what Facebook did to journalism - I just wrote a long post about it in another reply in this thread so I won’t repeat it all here - but wealthy corporations will be able to use AI art to destroy the career of being an artist. That’s what’s dangerous about AI.

        • P03 Locke@lemmy.dbzer0.com · +3 · 1 year ago

          no one is moral panicking over ai.

          This is one of the most inaccurate statements I’ve seen in 2023.

          Everybody is morally panicking over AI.

          stable diffusion could violate copyright, not one or the other.

          Or they don’t, because Stable Diffusion is a 4GB file of weights and numbers that have little to do with the content it was trained on. And, you can’t copyright a style.

        • jeremyparker@programming.dev · +1 · 1 year ago

          you really can’t see how an imageboard has completely different considerations over image generating models?

          Of course I see the difference - direct, outright theft and direct profiting from the theft is much worse than using content that’s been stolen to train computer image generation software.

          If your complaint is about the copyright infringement, then danbooru should be the target of your complaint - but no one seems to care about that. Why don’t people care about that?

          If the concern is that this software makes it easier to commit crimes, sure, I guess? But, again, danbooru. And like every other site on the internet.

          The concern, it seems to me, is with person A being an artist, and person B making art and trying to pass it off as an original work by person A. And that’s valid - but I still don’t feel like it’s worse than actually just taking the artwork, calling it “content”, and using it to generate ad revenue.

          The main problem i have with this criticism is that (imo) there are much more important issues at stake with midjourney or whatever - and this (alleged) concern (alleged because it only seems to go skin-deep) prevents people from caring about the real issues.

          Many many many jobs now, when a person leaves, they’re replaced with 2 part time people. This benefits profits and hurts everyone else.

          The issue with computer generated images is that, when a movie studio needs a sci fi background, it used to require an artist; now, it just requires midjourney - and you can hire the artist for 4 hours (instead of 4 days) to touch it up, fix the fingers, etc - which not only takes less time, but also less talent, which increases the labor supply, which pushes wages down.

          This technology has the potential to take the career of being an artist and turn it into a low-wage, part-time thing that you can’t live off of. This has happened in so many parts of our economy, and it’s really bad, and we need to protect artists from that fate.

          So no, I really can’t muster up giving a shit about whether someone on pixiv copies your art and makes $3 a month from a Patreon. The entire field of visual arts is under threat of complete annihilation from greedy capitalists. They’re the villains here, not some neckbeard’s Patreon.

    • Schneemensch@programming.dev · +21 / −9 · 1 year ago

      Just because something is free does not mean that there is no marketplace or product. Social media is generally free, but I would still call Facebook, TikTok or Instagram a product.

      Nowadays a lot of industries start out completely free, but move into paid subscription models later.

        • JuxtaposedJaguar@lemmy.ml · +1 / −1 · 1 year ago

          People buy and sell paintings despite the fact that you could also make paintings pretty easily. You’re paying for the time they spent creating it and the expertise it required. Just because some people scan and upload their paintings for free, doesn’t mean that all paintings are not products. I don’t see why the same couldn’t be true for AI porn.

      • Touching_Grass@lemmy.world · +1 · 1 year ago

        You pay by giving up your free time, which they sell. Technically we’re just working for free, and the product is our attention.

    • drfuzzyness@lemmy.world · +13 / −1 · 1 year ago

      I’m guessing that the “marketplace” and “sale” refers to sites like “Mage Space” which charge money per image generated or offer subscriptions. The article mentions that the model trainers also received a percentage of earnings off of the paid renderings using their models.

      Obviously you could run these models on your own, but my guess is that the crux of the article is about monetizing the work, rather than just training your own models and sharing the checkpoints.

      The article is somewhat interesting as it covers the topic from an outsider’s perspective more geared towards how monetization infests open sharing, but yeah the headline is kinda clickbait.

      • P03 Locke@lemmy.dbzer0.com · +15 / −6 · 1 year ago

        “Mage Space” which charge money per image generated

        Well, instead of bitching about the AI porn aspect, perhaps they should spend more time talking about how much of a scam it is to charge for AI-generated images.

        • darth_helmet@sh.itjust.works · +8 · 1 year ago

          Compute costs money, it’s more ethical to charge your users than it is to throw shady ads at them which link to malware.

          • JuxtaposedJaguar@lemmy.ml · +3 · 1 year ago

            Also buying and eventually replacing expensive hardware. Running AI at scale requires hundreds of thousands of dollars of infrastructure.

          • P03 Locke@lemmy.dbzer0.com · +1 · 1 year ago

            I get no malware or shady ads when I generate AI images with Stable Diffusion. I don’t know what kind of sites or tools you’re using where you’re getting shady ads, but you’re getting ripped off.

    • Send_me_nude_girls@feddit.de · +10 · 1 year ago

      I just wanted to say I love your comment. You’re totally correct and I enjoyed the passion in your words. That’s how we’ve got to deal with shit articles more often. Thx

    • solstice@lemmy.world · +1 · 1 year ago

      I mean that’s kind of worse though isn’t it? The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now. Whether you gotta pay or not is beside the point. Maybe I’m misunderstanding the situation and your point though?

      • Echo Dot@feddit.uk · +4 · 1 year ago

        The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now.

        So I can, but I could also do that without AI. People have photoshopped celebrities’ heads onto porn actors’ bodies for decades. It doesn’t happen as much now because there’s no point.

        Realistically, what has really changed except for the tools?

        • solstice@lemmy.world · +3 · 1 year ago

          Simplicity, barriers to entry, skill requirements? It’s kinda different to just enter a prompt like “such and such actress choking on a dildo” than to photoshop it, isn’t it? I for one don’t know how to do one but could probably figure out the other.

          Again I’m just speculating, I don’t really know.

          • Krauerking@lemy.lol · +1 · 1 year ago

            This is absolutely accurate. Basically, humanity is constantly reducing the cost and skill barriers for tasks and jobs. It’s weird that we are now aggressively doing it on the creative side, but that’s what has been done, and it’s making a mess of garbage media and porn that could have happened before, just now in much higher quantities and with less oversight/input from multiple people.

    • rhabarba@feddit.de (OP) · +12 / −56 · 1 year ago

      Repeat: FOR FREE! No product!

      If it’s free, chances are you’re the product. I assume that there is a market for user-generated “prompts” somewhere.

      • P03 Locke@lemmy.dbzer0.com · +79 / −7 · 1 year ago

        No, that’s not how open-source or open-source philosophies work. They share their work because they were able to download other people’s work, and sometimes people improve upon their own work.

        These aren’t corporations. You don’t need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It’s all free.

        • Sethayy@sh.itjust.works · +4 / −2 · 1 year ago

          Maybe there are commissions for specific people/poses, ’cause I certainly couldn’t keep a hard-on long enough to generate a spankin’-worthy image

        • rhabarba@feddit.de (OP) · +6 / −20 · 1 year ago

          These aren’t corporations.

          I know, I know: “but the website is free” (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.

            • jeremyparker@programming.dev · +8 · 1 year ago

              I feel like you’re implying people should look into things before making accusations. Like, find out if what they’re saying is true before they say it. And that’s why no one asked you to the prom.

          • infamousta@sh.itjust.works · +6 / −4 · 1 year ago

            They’re probably losing money now and just trying to build a user base as a first-mover. They accept donations and subscriptions with fairly minor benefits, but I imagine hosting and serving sizable AI models is not cheap.

            They’ll probably have to transition to paid access at some point, but I don’t see it as particularly unethical as they have bills to pay and do attempt to moderate content on the site.

            I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made. I don’t think there should be open avenues for sharing that kind of stuff online, and their rules should be better enforced.

            • Joshua Casey@lemmynsfw.com · +2 / −1 · 1 year ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              wholeheartedly disagree. “real porn” is literally made by consenting adult performers. Hence, it’s ethical. Generating adult content of real people is (typically) done without the consent of the people involved, thereby making it unethical.

              • infamousta@sh.itjust.works · +4 · 1 year ago

                If you don’t think anything unethical happens in the production of porn, I’m not sure what to tell you. It’s getting better, but exploitation, sex trafficking, revenge porn, etc. have been a thing since pornography was invented.

                AI porn at least does not necessarily need to consider consent. Plenty of AI porn involves animated figures or photorealistic humans that don’t represent any identifiable person.

                The only hang-up I have is producing images of actual people without their consent, and I don’t think it’s a new problem, as Photoshop has existed for a while.

                • Joshua Casey@lemmynsfw.com · +1 / −1 · 1 year ago

                  i’m sorry to tell you but you have swallowed the propaganda from anti-porn/anti-sex work organizations like Exodus Cry and Morality in Media (who now go by the name NCOSE).

            • aesthelete@lemmy.world · +1 / −7 · 1 year ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              Well, even if that were the case, the “real porn” is still required to train the model in the first place.

              So, it’s unethical shit on top of what you think was even more unethical.

              • infamousta@sh.itjust.works · +2 · 1 year ago

                Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat. But it’s a better alternative. Porn is not going anywhere. If generative AI means less real people get exploited that’s a win in my book.

                • aesthelete@lemmy.world · +1 · 1 year ago

                  Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat

                  This comparison only holds water if impossible meat were composed of bits of rearranged animal meat… Which it isn’t.

                  If generative AI means less real people get exploited that’s a win in my book.

                  That’s not necessarily a win for everyone. Some people actually like working in the porn industry. Besides that, their likenesses are being stolen and used to produce reproductions and derivative works without consent or compensation.

                  Also, I think you and your buddies here are missing the plot. Generated porn and generated porn of real people are related but different things. I think that’s pretty commonly understood which is why these sites have policies in the first place.

      • And009@reddthat.com · +8 · 1 year ago

        There’s been a market for commission artists doing this for money since the dawn of art