cross-posted from: https://jamie.moe/post/113630

Users have been spamming CSAM content in !lemmyshitpost@lemmy.world, and the images are federating to other instances. If your instance is subscribed to this community, you should take action immediately to remove the content. I recommend performing a hard delete via the command line on the server.

I personally deleted every image from the past 24 hours using the following command:

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
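
One caveat worth noting: on its own, shred overwrites a file's contents in place but leaves the emptied file on disk; GNU shred's -u option also deallocates and removes the file after overwriting. A more cautious variant of the same approach (assuming the same pict-rs storage path as above, which will differ per deployment) is to list the matches first, then overwrite and unlink them:

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -print
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \;

Keep in mind that -ctime -1 matches every file whose status changed in the last 24 hours, so this will also destroy legitimate uploads from that window.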

Note: Your local jurisdiction may impose a duty to report or other obligations. Check those requirements, but always prioritize ensuring that the content is no longer being served.

Update

The Lemmy Shitpost community has apparently been shut down for now.

  • regalia · 1 year ago

    Why do these deranged fucks do this?

      • regalia · 1 year ago

        This isn’t trolling; this is just a disgusting crime.

        • Not_Alec_Baldwin@lemmy.world · 1 year ago (edited)

          The crime happened in the past when the children were abused. This is some weird amalgam of criminal trolling.

          Edit: Yeah, yeah, I get that CSAM is criminal; that’s why I called it an amalgam. It’s both trolling and criminal.

          • chiisana@lemmy.chiisana.net · 1 year ago

            Depending on jurisdiction (I am not a lawyer, etc.), I’d imagine with a fairly high degree of probability that redistribution of CSAM is also a crime.

          • ChunkMcHorkle@lemmy.dbzer0.com · 1 year ago

            “The crime happened in the past when the children were abused.”

            That’s true. You could look at it that way and stop right there and remain absolutely correct. Or, you could also look at it from the eventual viewpoint of that victim as a human being: as long as that picture exists, they are being victimized by every new use of it, even if the act itself was done decades ago.

            Not trying to pile on, but anyone who has suffered that kind of violation as a child suffers for life to some extent. There are many who kill themselves, and even more who cannot escape addiction because the addiction is the only safe mental haven they have where life itself is bearable. Even more have PTSD and other mental difficulties that are beyond understanding for those who have not had their childhood development shattered by that, or worse, had that kind of abuse be a regular occurrence for them growing up.

            So to me, adding a visual record of that original violating act to the public domain, where anyone can find and use it for sick pleasure, is an extension of the original violation and not very different from it.

            The visual records are kind of a sick gift that never stops giving, worse still if the victim knows the pics or videos are out there somewhere.

            I am well aware not everyone sees it this way, but an extra bit of understanding for the victims would not go amiss. Imagine being an adult and browsing the web, thinking it’s all in the past and maybe you’re safe now, and stumbling across a picture of yourself being raped at the age of five, or whatever, or worse still, having friends or family or spouse or children stumble across it.

            So speaking only for myself, I think CSAM is a moral crime whenever it is accessed, one of the most hellish that can be committed against another human being, regardless of the specificities of the law.

            I don’t have a problem with much else that people share, but goddamn I do have a problem with that.