Is it material that sexually abuses a child?
It would be material of, and/or containing, child sexual abuse.
I’m not sure where you’re going with that? I would argue that yes, it is. It’s sexual material of a child, with that child’s face on it, made explicitly for the purpose of defaming her. So I would say it sexually abused a child.
But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn; no child was actually harmed in the process.” As I’ve said above, I disagree with that, especially in this particular circumstance.
Apologies if it’s just my reading comprehension being shit
Is it material that may encourage people to sexually abuse a child?
That’s one definition, sure.
Now answer the very simple question I asked about whether or not child porn is abusive.
Any sex act involving an adult and a child/minor is abusive by its very nature.
It’s actually not clear that viewing such material leads a person to commit in-person abuse.
Providing non-harmful ways to access the content may lead to less abuse, as the content they seek would no longer come from abuse, reducing demand for abusive content.
That being said, this instance isn’t completely fabricated, and its further release is harmful because it involves a real person and will have an emotional impact.
There’s other instances where it was completely fabricated, and the courts ruled it was CSAM and convicted
There has been, yes, but that doesn’t mean it’s the right ruling/law. The law varies by jurisdiction as well, because it is a murky area.
Edit: in the USA it might not even be illegal unless there was intent to distribute:
“By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.”
So local AI generating fictional material that is not distributed may be okay federally in the USA.
Serious value? How does one legally argue that their AI-generated child porn stash has “serious value” so that they don’t get incarcerated?
Laws are weird.
Have the AI try to recreate existing CP already deemed to have serious value and then have all the prompts/variations leading up to the closest match as part of an exhibit.
Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.
Prison*
Ah my bad, you’re right.
Then you’ll probably get shanked if any of the other inmates find out you were sent there for CP.
Removed by mod
So you don’t think that nudifying pics of kids is abusive?
Says something about you I think…
Removed by mod
attack the argument, not the person
Removed by mod
Then don’t defend them? You’re trying to tell everyone that what is literally described in the article above, about a child who had PHOTOREALISTIC pictures made of her, isn’t CSAM.
It is. Deleting the comments of everyone who disagrees with you will not change that, and if anything, it WILL make you seem even more like the bad guy.
Nobody said anything about drawings, but interesting default argument… Thanks for telling the class that you’re a lolicon pedo.
the liberty of masses be stomped and murdered
Nobody said that anyone should be stomped and murdered, so calm down, lmao. We’re just saying that child porn producers, consumers, and apologists are vile, disgusting perverts who should be held accountable for their crimes against children.
Removed by mod
They’re making them unsafe? You and your bullshit are making them unsafe. Every comment you post reeks of your true character. Go get help.
I’m making kids unsafe by…
checks notes
…being firmly and unwaveringly against the sexual exploitation of children.
I really can’t stress enough that this was an actual 15-year-old girl who was pornified with AI. This isn’t some “erotic drawings” argument; the end results were photorealistic nudes with her unmodified face. This isn’t some completely AI-generated likeness. Pictures of her from social media were exploited to remove her clothes and fill in the gaps with models trained on porn. It was nonconsensual pornography of a kid.
Anyone who read this story and feels the need to defend what was done to this girl is a fucking monster.
I can’t believe that the person defending sex crimes of this magnitude is a fucking mod.
Removed by mod
Removed by mod
Do you really think being insufferable is going to change any minds here?
Found the weirdo
Found the Loli* ftfy