Lemmyshitpost community closed until further notice - Lemmy.World
Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance since we changed our registration policy. We keep working on a solution; we have a few things in the works, but that won't help us now. Thank you for your understanding, and apologies to our users, moderators, and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world [https://lemmy.world/u/Striker], the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance. But we will not give up. We are lucky to have a very dedicated team, and we can hopefully make an announcement about what's next very soon.

Edit 2: Removed the bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we felt helpless. Anyway, I hope we can announce something more positive soon.
They also shut down registration
Whoever is spamming CP deserves the woodchipper
Stop misconstruing it as safety. It’s about legality. Nobody’s safety is in jeopardy because they saw an illegal image accidentally. This is about following the law, not protecting the safety of users.
It ties into safety as well, websites have “trust and safety” teams. This is where it falls under. Sorry for not being more concise.
No need to apologize, I just think safety is a misnomer here.
You know, except for those abuse victims whose pictures are being spread around lemmy. Just sayin’
The theory behind why CSAM is illegal is that if someone is willing to pay for CSAM, that payment incentivizes the production of even more CSAM content. That incentivized additional production means even more abuse. A perfectly reasonable take and something that I think can be demonstrated.
But why would you accidentally seeing CSAM prompt you to give payment to create that incentivization? Are you worried that you’re a closeted pedophile that will be ready to shower those who record such content to see more and more as soon as you get your first taste?
I thought it was pretty apparent we were talking about Lemmy, but okay.
The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs have some direct power over that.
I’m sure you feel very morally aloof with your righteous retort, though.
“CSAM laws aren’t for the safety of real people” is one of the hottest takes I’ve ever seen in my life
Straight outta reddit with that one.
I’m just going to copy paste my other comment:
I thought it was pretty apparent we were talking about Lemmy, but okay.
The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs have some direct power over that.
I’m sure you feel very morally aloof with your righteous retort, though.
Yes. Obviously we’re talking about Lemmy. We just still fundamentally disagree on the forms of harm, psychic and physical, that can be experienced through the rapid propagation of CSAM. Lemmy’s lacking mod tools have been a major topic of discussion for a while now. I don’t care to carry on this conversation because it’s clear our starting points are too far apart to meet in the middle.
I think the other guy’s comment is well suited as a response to this, so again I’ll copy paste:
How could reason possibly prevail when the subject matter is so sensitive?