A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.
The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”
What a weird populist law tbh. There’s already an established legal framework that covers this: defamation. Not a lawyer, but it seems like this could be handled under existing law instead of writing up some new meme legislation.
They’ll use this as an opportunity to sneak in more government spyware/control is my guess.
It’s not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often isn’t, for multiple reasons, including that these bills generally change over time). This is more of a free speech issue than photoshopping someone’s head onto someone else’s nude body: no real person’s head or body is involved, just an inhumanly good artist drawing a nude. And on top of that, the law punishes possession, not just creation.
An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that’s what’s being pitched here.
It actually proposes “possession with the intention to distribute,” which just shows what a meme law this is. How do you determine the intention to distribute for an image?
And I disagree with your take that this can’t be defamation. Quick googling says the general consensus is that this would fall into the defamation family of laws, which makes absolute sense, since a deepfake is an intentional misrepresentation.
Removed by mod
When you find it broken down into individual baggies?
When they find the scale too
Even better: Intentional infliction of emotional distress
There are business interests behind this. There is a push to turn a likeness (and voice, etc.) into an intellectual property. This bill is not about protecting anyone from emotional distress or harm to their reputation. It is about requiring “consent”, which can obviously be acquired with money (and also commercial porn is an explicit exception). This bill would establish this new kind of IP in principle. It’s a baby step but still a step.
You can see in this thread that proposing to expand this to all deepfakes gets a lot of upvotes. Indeed, there are bills out there that go all the way and would even make “piracy” of this IP a federal crime.
Taylor Swift could be out there, making music or having fun, while also making money from “her consent”, i.e. by licensing her likeness. She could star in movies or make cameos by deepfaking herself onto some nobody actor. She could license all sorts of YouTube channels. Or how about a webcam chat with Taylor? She could be an avatar for ChatGPT, or she could be deepfaked onto one of those Indian or Kenyan low-wage workers who do tech support now.
We are not quite there yet, technologically, but we will obviously get there soonish. Fakes in the past were just some pervs who were making fan art of a sort. Now the smell of money is in the air.
This seems like the most likely scenario tbh. I’m not sure whether personal-likeness IP is a bad thing per se, but one thing is sure: it’s not being done to “protect the kids”.
It is. It means that famous people (or their heirs, or maybe just the rights-holder) can make even more money from their fame without having to do extra work. That should be opposed on principle.
The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.
It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.
Not necessarily. I’m optimistic that this could empower status and personality as the main resources and push money out of society.
How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show their face and make their voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.
When you steal a person’s likeness for profit or defame them, then that’s a CIVIL matter.
This bill will make AI sexualization a CRIMINAL matter.
Where do you see that?
Here:
That doesn’t seem to be correct. More likely loose wording, as “criminalize” in a headline =/= actual criminal law.
Always interesting to see people who even admit that they don’t know, but they still have a rather strong opinion.
So only lawyers can have an opinion on law and be allowed public discourse? Lol
Obviously not. Everyone is allowed to voice their opinion, and has to accept that other people might find that opinion stupid and tell them so.
My point is more that you seem, on the one hand, to realize it’s a complex matter and that you lack the expert knowledge (“I’m not a lawyer”), but on the other hand still feel the need to express your opinion. There is nothing inherently wrong with that. It’s extremely common. Just something I have fun pointing out.
How interesting that someone chose to comment, here of all places, the comment section.
And I have chosen to comment on that comment, go figure.
Nobody’s saying you should be barred from participating, you just rightfully look like an idiot while you do it.