• mysoulishome@lemmy.world

      Yep. They really doubled down on privacy/security and it’s pretty admirable. The President doesn’t use an Android or a BlackBerry for a reason. (Well, two in the case of BlackBerry: security, and the fact that it barely still exists.) If only there were no other problematic areas of Apple’s business (manufacturing, wages, environmental impact).

      • Areopagus@lemmy.world

        Can’t wait for them to put their money where their mouth is and do the same in China and other large population countries that demand the same thing 😂

      • Thorny_Thicket@sopuli.xyz

        They’re hypocrites though. They brand themselves as privacy-focused, and in some cases actually are, but at the same time they also scan your photos and messages and report to authorities/parents if there’s something inappropriate.

        Inb4 the “no need to worry if you have nothing to hide” argument

        • mysoulishome@lemmy.world

          Ok… so I’m aware there is a feature, “check for sensitive media,” that parents can turn on, and AI can send you an alert if it seems like your kid might be texting nude pics. It only works with iMessage, since Apple doesn’t have access to photos in other apps, and no human sees the photos. But that isn’t the same as what you’re saying, and I don’t know whether what you’re saying is accurate.
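
          As far as I understand it, the flow is roughly this (a sketch with hypothetical names and a placeholder heuristic, not Apple’s actual API; the point is that detection runs on the device and only an alert, never the photo, goes to the parent):

          import Foundation

          struct IncomingImage {
              let pixels: Data
          }

          // Hypothetical on-device classifier; the real feature uses a local ML model,
          // and the image is never uploaded anywhere.
          func looksSensitive(_ image: IncomingImage) -> Bool {
              // Placeholder heuristic, just for the sketch.
              return image.pixels.count > 1_000_000
          }

          func handleIncomingImage(_ image: IncomingImage, parentalAlertsEnabled: Bool) {
              guard looksSensitive(image) else { return }
              // The image is blurred locally and the child is warned; no human ever sees it.
              print("Image blurred on device; child shown a warning.")
              if parentalAlertsEnabled {
                  // Only a notification reaches the parent; the photo itself stays on the phone.
                  print("Parent alert: a possibly sensitive image was flagged in Messages.")
              }
          }

          handleIncomingImage(IncomingImage(pixels: Data(count: 2_000_000)), parentalAlertsEnabled: true)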

            • 6xpipe_@lemmy.world

              Apple Kills Its Plan to Scan Your Photos for CSAM

              That headline literally says they’re not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.

              I’m usually one of the first to assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that was otherwise as privacy-focused as possible. They just didn’t stop to consider whether they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.

              • Thorny_Thicket@sopuli.xyz

                Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos and that is worrying.

                However, I’m not denying that it’s probably still the most privacy-focused phone you can get. For now.

                • kirklennon@kbin.social

                  They still intended to start scanning your photos and that is worrying.

                  They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing it like every other photo service, which scans all of your photos on the server, they created a complex privacy-preserving method: an initial scan happens on device as part of the upload process and, through the magic of math, images would only get matched as CSAM on the server if the system was confident (one-in-a-trillion false-positive rate) that you were uploading literally dozens of CSAM images. At that point a person would verify to make absolutely certain, and only then would your crime be reported.

                  The system would do the seemingly impossible: preserve the privacy of literally everybody except the people everyone agrees don’t deserve it. If you didn’t upload a bunch of CSAM, Apple itself would legitimately never scan your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It’s honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
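
                  To make the threshold idea concrete, here’s a rough sketch in Swift with made-up names and numbers (this is not Apple’s actual code or parameters; the real design used blinded perceptual hashes and encrypted “safety vouchers” the server could only open past the threshold):

                  import Foundation

                  // Hypothetical stand-ins: real NeuralHash values are perceptual hashes,
                  // and the known-CSAM list is blinded so the device can't read it.
                  typealias PerceptualHash = UInt64
                  let knownHashes: Set<PerceptualHash> = [0x1111, 0x2222, 0x3333]  // placeholder values
                  let matchThreshold = 30  // "dozens" of matches required before anything is revealed

                  // Count how many uploads match the database; below the threshold the
                  // server learns nothing, above it a human review happens before any report.
                  func evaluateLibrary(_ photoHashes: [PerceptualHash]) {
                      let matches = photoHashes.filter { knownHashes.contains($0) }.count
                      if matches >= matchThreshold {
                          print("Threshold exceeded (\(matches) matches): human review before any report.")
                      } else {
                          print("Below threshold: the server learns nothing about this library.")
                      }
                  }

                  evaluateLibrary([0x1111, 0x4444, 0x5555])  // prints the below-threshold branch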

                • dynamojoe@lemmy.world

                  but they only did so because of the massive public outcry

                  Well, shit. For once the voice of the people worked and you’re still bitching about it.

  • Artemis@sh.itjust.works

    Mine is mainly a YouTube and Books machine. During the NFL season I’ll use it to keep tabs on games that my team isn’t in, or pull up NFL RedZone as a PiP kind of setup from the couch.

    Sometimes I use it for recipes too.

  • Ghostalmedia@lemmy.world

    This is one thing Apple has been pretty firm on. You can’t have a secure product and have backdoors. You can try to hide them all you want, but a backdoor will always be a massive security vulnerability.

    • roadkill@kbin.social

      ahem

      https://techcrunch.com/2018/02/25/apple-moves-icloud-encryption-keys-for-chinese-users-to-china/

      Chinese authorities can now get a Chinese legal order and tell Apple’s local partner to hand over user data. The local partner (and by extension Apple) will have no choice but to comply with the order.

      Apple’s statement to Reuters is quite telling. “While we advocated against iCloud being subject to these laws, we were ultimately unsuccessful,” the company told Reuters. Apple simply couldn’t win this fight.

      They moved the storage of encryption keys for Chinese users to servers in China instead of shutting down iMessage and Facetime. Quite a different response compared to here.

      • kirklennon@kbin.social

        They moved the storage of encryption keys for Chinese users to servers in China instead of shutting down iMessage and Facetime.

        These are totally separate things. Apple users in China can still use iMessage and FaceTime, and those are still end-to-end encrypted. If you choose to store your iMessages in iCloud, those can be accessed by the government, but the same is true in every other country. The UK’s proposal is to directly break the security of iMessage itself, something worse than what China has done.

      • abhibeckert@lemmy.world

        They moved the storage of encryption keys for Chinese users to servers in China

        No, they didn’t. iMessage can only be decrypted by keys stored in the Secure Enclave on your device.

        There are some things the Chinese government can access. The contents of messages aren’t one of them.

        And as for FaceTime… those calls aren’t recorded at all. Not sure how a legal order is supposed to allow access to data that doesn’t even exist.

        • JiveTurkey@lemmy.world

          I agree that’s not how it works in most places, but I don’t presume to know the inner workings of a Chinese iPhone or the version of iOS it’s running. If there’s a financial incentive, Apple will bend for China while also saying it didn’t.

          • abhibeckert@lemmy.world

            The way FaceTime works is extensively documented and thoroughly audited by third parties - many of whom publish their findings.

            If China had a backdoor into FaceTime, I suspect I’d know about it as someone who follows these things pretty closely.

            • JiveTurkey@lemmy.world

              Right, but none of it is open source, so being extensively documented doesn’t mean much, and what I said still stands. You’re assuming that what Apple has told you is the truth, with zero third-party audits of the underlying code.