Drawn art depicting minors in sexual situations has been deemed protected as free speech in the US. It’s why, at least in the US, you don’t have to worry about the anime girl that’s 17 landing you in prison on child porn charges. The reasoning: there is no victim, the anime girl is not sentient, therefore the creation of that art is protected as free speech.
I suspect a similar thing will happen with this. As long as it is not depicting a real person, the completely invented person is not sentient, there is no victim, this will fall under free speech. At least in the US.
However, it is likely a very, very bad idea to have any photo-realistic art of this manner, as it may not be clear to authorities if it is from AI or if there is in fact a person you are victimizing. Doubly so if you download this from someone else, as you don’t know if that is a real person either.
Deepfakes of an actual child should be considered defamatory use of a person’s image; but they aren’t evidence of actual abuse the way real CSAM is.
Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.
Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)
But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.
Add in an extra twist: hopefully, if the sickos are at least satisfied with the AI stuff, they won’t need the “real” thing.
Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.
This is the part where I disagree, and I would love for people to prove me wrong. Because whether this is true or false, it will probably be the deciding factor in allowing or restricting “artificial CSAM”.
Have you got a source for this?
Some actually fetishize causing suffering.
Some people are sadists and rapists, yes, regardless of what age group they’d want to do it with.
Possession of CSAM that’s 20 years old (i.e. the subject is now an adult) or even 100 years old (i.e. the subject is likely deceased) is not legal. You don’t have to pay for it or create it; mere possession is enough. Yeah, they’ll find a way to prosecute for images of non-existent children.
Anything that looks realistic should be illegal, if you ask me, as otherwise it would become harder to prosecute real child porn. “Oh, that picture is just modified with AI” could be hard to disprove…
We shouldn’t be prosecuting people just because they have things that look like child porn; we have to prove that there’s a victim, or people get accused of things they didn’t do.
https://www.midsouthcriminaldefense.com/blog/2013/february/porn-star-appears-in-court-and-vindicates-man-ch/
We shouldn’t be prosecuting these people, but we should be figuring out how to get them help.
An adult who is attracted to children obviously cannot have any legal sexual contact with a child, just like anybody else, so we need to make sure they have the tools and the ability to get by without that.
I don’t know what’s best for these people. Maybe the best way to help them is to let them have this fake material. Maybe the best way is to deny them this sort of material entirely. There’s probably some scientist out there who has studied which approach actually works.
Allowing someone to act out their deranged fantasies just reinforces the behavior. No, it would not help them.
We learned in the early eighties that letting people scream, tear things up, and generally destroy stuff did not help them move past their feelings of anger. If you hit things to deal with anger, it becomes a feedback loop: hitting more things, more often, to cope with the emotion.
Exactly that has already been tried, and it was struck down by the Supreme Court in Ashcroft v. Free Speech Coalition. It turns out porn of people over 18 very often looks the same as porn of people under 18, so such a law would ban a considerable amount of legal adult content.