• mysoulishome@lemmy.world · 1 year ago

    Ok… so I’m aware there’s a feature (“check for sensitive media”) that parents can turn on, and AI can send you an alert if it looks like your kid might be texting nude pics… it only works with iMessage, since Apple doesn’t have access to photos in other apps. No human sees the photos. But that isn’t the same as what you’re saying, and I don’t know if what you’re saying is accurate.

      • 6xpipe_@lemmy.world · 1 year ago

        Apple Kills Its Plan to Scan Your Photos for CSAM

        That headline literally says they’re not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.

        I’m typically one of the first to assume malice or a profit motive when a company does something, but I really think Apple was trying to do something good for society in a way that was otherwise as privacy-focused as they could manage. They just didn’t stop to consider whether they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.

        • Thorny_Thicket@sopuli.xyz · 1 year ago

          Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos, and that is worrying.

          However, I’m not denying that it’s probably still the most privacy-focused phone you can get. For now.

          • kirklennon@kbin.social · 1 year ago

            They still intended to start scanning your photos, and that is worrying.

            They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing it the way every other photo service does, by scanning all of your photos on the server, they created a complex privacy-preserving method: an initial scan happens on device as part of the upload process and, through the magic of math, images would only get matched as CSAM on the server if Apple was confident (a one-in-a-trillion false-positive rate) that you were uploading literally dozens of CSAM images, at which point they’d have a person verify to make absolutely certain, and then finally report your crime.
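
            To get a feel for why requiring a threshold of matches makes the per-account false-positive rate so small, here’s a rough back-of-the-envelope sketch in Python. The per-image false-match rate, library size, and threshold below are made-up illustrative numbers, not Apple’s published parameters; the point is just that demanding dozens of independent matches drives the combined probability down to essentially nothing.

            ```python
            from math import exp, factorial

            # Illustrative, assumed numbers -- not Apple's published parameters.
            p = 1e-6        # assumed chance one innocent photo falsely matches a known hash
            n = 20_000      # assumed number of photos in a library
            threshold = 30  # assumed number of matches required before anything is flagged

            lam = n * p     # expected false matches in the whole library (Poisson approximation)

            # Probability of at least `threshold` false matches; terms beyond
            # threshold + 50 are numerically negligible at these scales.
            tail = sum(exp(-lam) * lam ** k / factorial(k) for k in range(threshold, threshold + 50))

            print(f"expected false matches per library: {lam}")
            print(f"P(>= {threshold} false matches) ~ {tail:.2e}")
            ```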

            The system would do the seemingly impossible: preserve the privacy of literally everybody except the people everyone agrees don’t deserve it. If you didn’t upload a bunch of CSAM, Apple itself would legitimately never see your images. The scan happened on device, and a match was only revealed in the cloud once there were enough matches to guarantee confidence. It’s honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
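
            For anyone curious what the “magic of math” roughly looks like: as I understand the published design, it used threshold secret sharing, where each uploaded photo’s encrypted safety voucher carries one share of an account-level key, and the server can only reconstruct that key (and decrypt anything for human review) once it holds at least a threshold number of shares from matching images. Below is a toy Shamir-style sketch of just that threshold property, written in Python; it’s an illustration under assumed parameters, not Apple’s actual PSI protocol, and all the names and numbers are made up.

            ```python
            import random

            # Toy Shamir-style threshold secret sharing over a prime field.
            # Purely illustrative -- not Apple's protocol, parameters, or code.
            PRIME = 2 ** 127 - 1  # field modulus (a Mersenne prime)
            THRESHOLD = 3         # shares needed before the secret can be reconstructed


            def make_shares(secret: int, num_shares: int) -> list[tuple[int, int]]:
                """Split `secret` so that any THRESHOLD shares reconstruct it."""
                # Random polynomial of degree THRESHOLD - 1 whose constant term is the secret.
                coeffs = [secret] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
                return [
                    (x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                    for x in range(1, num_shares + 1)
                ]


            def reconstruct(shares: list[tuple[int, int]]) -> int:
                """Lagrange interpolation at x = 0; gives garbage with fewer than THRESHOLD shares."""
                secret = 0
                for xi, yi in shares:
                    num, den = 1, 1
                    for xj, _ in shares:
                        if xj != xi:
                            num = num * -xj % PRIME
                            den = den * (xi - xj) % PRIME
                    secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
                return secret


            # The "server" holds one share per photo that matched the hash database.
            # Below the threshold the shares reveal nothing; at the threshold it can
            # recover the key that unlocks the flagged vouchers for human review.
            account_key = random.randrange(PRIME)
            shares = make_shares(account_key, num_shares=10)

            print(reconstruct(shares[:2]) == account_key)  # False: two matches reveal nothing
            print(reconstruct(shares[:3]) == account_key)  # True: threshold reached
            ```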

          • dynamojoe@lemmy.world · 1 year ago

            but they only did so because of the massive public outcry

            Well, shit. For once the voice of the people worked and you’re still bitching about it.

            • Thorny_Thicket@sopuli.xyz · 1 year ago

              You’re right. Maybe I’m being a bit too harsh and should give them some credit. After all, they reversed the decision to switch to those shitty butterfly switches on the MacBook keyboard too, and brought back the HDMI port and SD card slot. Also ditched that stupid Touch Bar.