A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.

  • bioemerl@kbin.social
    1 year ago

    OP has pointed out that he doesn’t actually think there are exact replicas being produced, which just makes this even more confusing.

    You misread their first comment, I think.

    They were saying that DESPITE the common argument that AI only learns and doesn’t copy exactly, it might still be good to require consent before people’s content is included in training data.