Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:
Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.
It will not be possible to upload any new avatars or banners while this limit is in effect.
I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.
I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it'd be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools with their platforms, to better support moderators and folks running instances.
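The integration being asked for here essentially amounts to a hook in the image-upload path that checks each file against a database of known-bad content before storing it. A minimal sketch of that idea (the blocklist contents, function name, and exact-hash matching are all illustrative assumptions, not anything Lemmy actually ships):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images.
# Real detection tools (e.g. Microsoft PhotoDNA, Meta's open-source PDQ)
# use *perceptual* hashes from curated industry databases, so they also
# catch re-encoded or slightly altered copies -- an exact digest match
# like this only illustrates where the hook would sit, not how matching
# is done in production.
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in BLOCKLIST
```

An instance's upload endpoint would call something like `screen_upload` before writing the file to storage, rejecting (and ideally reporting) anything that matches, which is far less disruptive than disabling uploads entirely.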
I just shut down my instance because of this attack. Once there are better controls to prevent this, I will stand it back up.
Yeah, there was a gardening instance run by a great guy who just did the same.
What do you think the purpose of these attacks is? The fediverse is so small in the grand scheme of things that I can only assume the worst.
Good thing my instance is only friends and friends of friends, otherwise I’d have to do the same
What was your instance?
https://github.com/LemmyNet/lemmy/issues/3920
That’s fucking dope, thank you very much for the link to the issue!