• The Cuuuuube
    7 · 10 months ago

    “CSAM laws aren’t for the safety of real people” is one of the hottest takes I’ve ever seen in my life

    • @Rodeo@lemmy.ca
      -1 · 10 months ago

      Straight outta reddit with that one.

      I’m just going to copy paste my other comment:

      I thought it was pretty apparent we were talking about Lemmy, but okay.

      The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because that is something the Lemmy devs have some direct power over.

      I’m sure you feel very morally aloof with your righteous retort, though.

      • The Cuuuuube
        3 · 10 months ago

        Yes. Obviously we’re talking about Lemmy. We just still fundamentally disagree on the forms of harm, psychic and physical, that can be experienced through the rapid propagation of CSAM. Lemmy’s lack of mod tools has been a major topic of discussion for a while now. I don’t care to carry on this conversation because it’s clear our starting points are too far apart to meet in the middle.

        • @Rodeo@lemmy.ca
          1 · 10 months ago

          I think the other guy’s comment is well suited as a response to this, so again I’ll copy paste:

          The theory behind why CSAM is illegal is that if someone is willing to pay for CSAM it incentivizes production of even more CSAM content to receive more payment. That incentivized additional production means even more abuse. A perfectly reasonable take and something that I think can be demonstrated.

          But why would accidentally seeing CSAM prompt you to pay for it, creating that incentive?

          How could reason possibly prevail when the subject matter is so sensitive?