2024 might be the breakout year for efficient ARM chips in desktop and laptop PCs.

  • Justin@lemmy.jlh.name · +70/-13 · 1 year ago

    This entire article just to hype up Qualcomm releasing a new CPU? I haven’t seen any evidence to suggest that this new Qualcomm CPU won’t be trash like all the other ones.

    ARM on PC isn’t happening any time soon. ARM CPUs aren’t more efficient than x86 CPUs at all.

    Here’s a speed comparison between Qualcomm’s and AMD’s best CPUs from last year. Same TDP.

    https://www.cpu-monkey.com/en/compare_cpu-qualcomm_snapdragon_microsoft_sq3-vs-amd_ryzen_7_7840u

    Here’s Jim Keller, the father of both AMD Ryzen and the Apple M1, saying that ARM is not necessarily more efficient than x86:

    https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-matter/

    The only reason Apple was able to make a successful ARM CPU is that they control the entire OS and the entire supply chain, and they have super expensive exclusivity contracts with TSMC (they literally make 50% of all the phones in the world).

    AMD’s x86 CPUs are actually faster and more efficient than Apple’s ARM CPUs on the same 5nm process node, but Apple is consistently 2 years ahead when it comes to silicon manufacturing, because of their TSMC deals.

    Qualcomm doesn’t have any of that, and there is no way their CPUs are going to be so much better than AMD’s that people are going to be willing to put up with ISA incompatibilities. Windows on ARM has been a flop.

    Servers are at least a more reasonable place for ARM chips, because all the software is open-source and all the major cloud vendors are making their own CPUs.

    Nothing against ARM, or alternative ISAs in general; people just don’t understand that x86 vs. ARM is not about power efficiency at all. It’s about supply chains and software compatibility.

      • MonkderZweite@feddit.ch · +2/-1 · 1 year ago

        “I don’t think anyone thinks slapping an ARM processor in a Windows laptop is going to suddenly make them more efficient.”

        I say most think exactly that.

    • Lojcs@lemm.ee · +18/-5 · 1 year ago

      “Here’s a speed comparison between Qualcomm’s and AMD’s best CPUs from last year. Same TDP.”

      AMD’s chip runs at 28 W and is built on 4 nm; Qualcomm’s runs at 7 W and is built on 5 nm. They are not equivalent.

      “AMD’s x86 CPUs are actually faster and more efficient than Apple’s ARM CPUs on the same 5nm process node, but Apple is consistently 2 years ahead when it comes to silicon manufacturing, because of their TSMC deals.”

      Comparing the AMD Ryzen 7 PRO 7840U (4 nm, 28 W) with the 10-core Apple M2 Pro (5 nm, 28 W), AMD is 7% faster in single-core and 10% faster in multi-core. It’s unclear how that would shake out if they were on the same node. It feels like they’d be about the same.

    • corbin@infosec.pub (OP) · +10/-6 · 1 year ago

      The SQ3 was a custom design only for Surface tablets; I’m not sure it’s representative of Qualcomm’s future generally available hardware. Early benchmarks of the Snapdragon X Elite are much more promising, but TDP and other important details are still missing.

      You’re definitely right that software vertical integration is the missing piece. We’re starting to see a little bit of that in the PC ecosystem (e.g. Windows using the AI core on newer CPUs/SoCs for live camera and mic effects), but more needs to happen there.

      • Justin@lemmy.jlh.name · +4/-2 · 1 year ago

        That’s true. I haven’t looked that closely at QC’s most recent chips, just pointing out that they’re usually slower, hotter, and more expensive.

        It’s good to see competition, but people should manage their expectations. They’re gonna have to be a lot faster and more efficient than the AMD 7840U to make running ARM worth it on PC.

        It’ll be a fight, and in 2025 they’ll have to compete with Zen 5, too.

    • mryessir@lemmy.sdf.org · +1 · 1 year ago

      My X13s running Linux, at 250 nits of brightness, browsing over WLAN and playing music from the browser via Bluetooth, uses 5-8 W in total.
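
      (If you want to reproduce that kind of measurement, here is a minimal Go sketch. It assumes a Linux machine exposing the common /sys/class/power_supply/BAT0/power_now node in microwatts; the battery name and unit vary between machines.)

          package main

          import (
              "fmt"
              "os"
              "strconv"
              "strings"
          )

          func main() {
              // power_now reports instantaneous battery draw, usually in microwatts.
              raw, err := os.ReadFile("/sys/class/power_supply/BAT0/power_now")
              if err != nil {
                  fmt.Fprintln(os.Stderr, "cannot read power_now:", err)
                  os.Exit(1)
              }
              uw, err := strconv.ParseFloat(strings.TrimSpace(string(raw)), 64)
              if err != nil {
                  fmt.Fprintln(os.Stderr, "unexpected format:", err)
                  os.Exit(1)
              }
              fmt.Printf("current draw: %.1f W\n", uw/1e6) // e.g. "current draw: 6.2 W"
          }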

    • frezik@midwest.social · +1/-1 · 1 year ago

      You shouldn’t trust TDP numbers. They’re most useful for getting a ballpark idea of what size cooler you’ll need for a given chip (and even then, Noctua has its own rating system for matching coolers to chips). AMD, in particular, reinvents its TDP formula regularly and plays with the numbers to get the output it wants for comparison purposes.

      Anyway, I’d be fine if ARM ends up being only on par with x86. It’s still a way out of the insanity of the x86 architecture, and it opens the field to many more companies that can make chips.

  • TheGrandNagus@lemmy.world · +41/-2 · 1 year ago

    This is a Qualcomm marketing piece.

    And no, the most exciting 2024 tech won’t be a CPU with similar or lower performance to other comparable CPUs on the market, with the added benefit of less software compatibility.

  • Eager Eagle@lemmy.world · +18/-2 · 1 year ago

    Can’t wait. I recently bought a firewall that gets noticeably warm at idle, even with a little case that has a heat sink. We need more energy-efficient PCs.

  • geekworking@lemmy.world · +9/-2 · 1 year ago

    One of the hurdles for ARM is that you need to recompile and maintain a separate version of every piece of software for each processor family.

    This is a much easier task in a tightly controlled ecosystem like the Mac than in the Windows ecosystem, with its tons of different suppliers. You can use some sort of emulation to run non-native software, but at the cost of the efficiency you were hoping to gain.
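
    To make the recompilation point concrete, here is a minimal sketch of per-target builds using Go, whose toolchain cross-compiles by setting environment variables (the file name arch.go is made up for illustration; the GOOS/GOARCH values are standard Go targets):

        // arch.go: the same source has to be built, tested, and shipped
        // once per target ISA.
        package main

        import (
            "fmt"
            "runtime"
        )

        func main() {
            // GOOS and GOARCH are baked in at compile time, so every
            // processor family gets its own distinct binary to maintain.
            fmt.Printf("compiled for %s/%s\n", runtime.GOOS, runtime.GOARCH)
        }

        // Build once per target:
        //   GOOS=windows GOARCH=amd64 go build arch.go   (x86-64 Windows)
        //   GOOS=windows GOARCH=arm64 go build arch.go   (ARM64 Windows)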

    Another OS variant also adds a big cost/burden for enterprise customers, who need to manage patches, security, etc.

    I would expect to see more inroads in non-corporate areas following Apple’s success, but not any sort of explosion.

    • originalucifer@moist.catsweat.com · +6/-2 · 1 year ago

      Microsoft has spent the last few years rebuilding their shit to work on ARM. No idea how far they’ve come, but you will absolutely see Windows on ARM in the enterprise.

      • frezik@midwest.social · +2 · 1 year ago

        Apple has the benefit of having done architecture transitions a few times already. Microsoft has been trying to get everyone out of the “Program Files (x86)” directory for over a decade.

        • originalucifer@moist.catsweat.com · +1/-2 · 1 year ago

          Apple doesn’t have the burden of staying backwards compatible for 3 decades and running on most commodity hardware.

          Apple undoubtedly has it easier than a company whose software is actually in use across most of the business world.

          • frezik@midwest.social · +1/-1 · 1 year ago

            Uhh, one reason they don’t have that burden is that they’ve already made the switch twice. Even if they didn’t have to deal with any other third party, they still had to convince Adobe, and Adobe doesn’t want to do shit if they don’t have to.

    • qjkxbmwvz@lemmy.sdf.org · +1 · 1 year ago

      On the other hand, a completely open ecosystem works well too; Linux on ARM feels exactly like Linux on x86-64 in my experience. Granted, this is for headless stuff (an RPi and an Orange Pi, both ARM, both running Debian), but really the only difference is the bootloader situation.

  • bamboo@lemm.ee · +7/-1 · 1 year ago

    It would be fascinating to see Qualcomm, NVIDIA, AMD, MediaTek, and possibly others all competing to build the best ARM SoCs for Windows devices, especially after so many years of Intel stagnating and Apple eating their lunch with its ARM SoCs.

    • akrot@lemmy.world · +4/-3 · 1 year ago

      “competing to build the best ARM SoCs for Windows devices”

      You mean desktop, and not Windows? Because if anything, Windows is becoming a botnet device. I hope Linux support is there out of the box.

      • bamboo@lemm.ee · +2/-1 · 1 year ago

        Windows ARM devices boot with UEFI, so standard ARM UEFI images should work, just like on x86. I’d bet drivers will be alright too, since these ARM SoCs will likely be similar to the ones used in Linux SBCs and Android devices.

  • 👍Maximum Derek👍@discuss.tchncs.de · +3/-1 · 1 year ago

    Does anyone else worry that the rise of personal computers using super-custom SoCs is going to have a negative effect on our ability to build our own machines?

  • smileyhead@discuss.tchncs.de · +2/-1 · 1 year ago

    Ah yes, let’s welcome the one-device, one-operating-system myth to the desktop, with people choosing hardware because of software features that could just as well be installable. Welcome the expiration date on computers called “years of software support”, and welcome the overall unfriendliness toward alternative systems.

    Performance and efficiency are one side of the coin. But let me remind you that Qualcomm (along with Google) is the reason we cannot have lifetime updates for our phones, ROM builds need to be specific to each model, and making a phone run anything but Android is nearly impossible.

    I’ll take ARM over x86, but I’ll take AMD/Intel over Qualcomm a thousand times over.

  • Chemical Wonka@discuss.tchncs.de · +1/-1 · 1 year ago

    But one detail we cannot forget is that, as the ARM architecture spreads to PCs and laptops, we will probably see an increase in fully locked-down hardware. We don’t need ARM expanding to PCs if it doesn’t come with hardware and software freedom.

  • R0cket_M00se@lemmy.world · +11/-11 · 1 year ago

    “The most exciting tech isn’t the thing that currently exists and is being improved and integrated daily, it’s this other thing we don’t even know for sure will maybe happen.”

    • corbin@infosec.pub (OP) · +3/-2 · 1 year ago

      Right, it’s less exciting now because it’s already here. I’m not expecting radically improved GPT models or whatever in 2024, probably just more iteration. The most exciting stuff there might be local AI tech becoming more usable, like we’ve seen with Stable Diffusion.

      • sir_reginald@lemmy.world · +1/-1 · 1 year ago

        I’m just expecting performance optimizations, especially for local LLMs. Right now there are models arguably as good as GPT-4 (Goliath 120B), but they require two RTX 4090s to run.

        The models that require less powerful equipment are not as good, of course.

        But hopefully, given enough time, good-enough models will be able to run on mid-range hardware.
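
        The VRAM arithmetic behind that is simple enough to sketch. Here is a rough weights-only estimate in Go (it uses Goliath’s advertised 120B parameter count and ignores the KV cache and activations, which add more on top):

            package main

            import "fmt"

            // weightGB estimates VRAM for the weights alone:
            // parameters × bits per weight / 8 bits per byte.
            func weightGB(params, bitsPerWeight float64) float64 {
                return params * bitsPerWeight / 8 / 1e9
            }

            func main() {
                const params = 120e9 // a 120B-parameter model such as Goliath
                fmt.Printf("fp16:  %.0f GB\n", weightGB(params, 16)) // 240 GB
                fmt.Printf("4-bit: %.0f GB\n", weightGB(params, 4))  // 60 GB
                // Two RTX 4090s offer 2 × 24 GB = 48 GB of VRAM, which is
                // why heavy quantization (and CPU offloading) is needed.
            }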

  • fubarx@lemmy.ml · +1/-2 · 1 year ago

    How long before indie devs can make their own custom processor chips?

    • stealth_cookies@lemmy.ca · +7/-1 · 1 year ago

      Now? FPGAs have been a thing for decades and are the closest thing I can see to getting custom chips made without massive investment.

      • fubarx@lemmy.ml · +1/-2 · 1 year ago

        Yup. But I was thinking more of ultra-small-run ARM or RISC-V processors. It’d be cool if we ever get there.

        • StarDreamer@lemmy.blahaj.zone · +1/-1 · 1 year ago

          You can build a RISC core on an FPGA; plenty of people have done that.

          Performance will probably be an issue, though.
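
          In case “build a RISC core” sounds abstract: at heart it’s a register file plus a fetch/decode/execute loop over fixed-format instructions. Here is a toy software model in Go (the three-instruction ISA is invented purely for illustration; an FPGA version would express the same datapath in an HDL such as Verilog):

              package main

              import "fmt"

              // Opcodes for a toy RISC-style ISA. A real core (or an FPGA
              // implementation) would pipeline these stages; this model
              // simply runs them one instruction at a time.
              const (
                  opAdd  = iota // regs[a] = regs[b] + regs[c]
                  opAddi        // regs[a] = regs[b] + imm
                  opHalt
              )

              type insn struct{ op, a, b, c, imm int }

              func main() {
                  var regs [8]int
                  prog := []insn{
                      {op: opAddi, a: 1, imm: 5},    // r1 = r0 + 5
                      {op: opAddi, a: 2, imm: 7},    // r2 = r0 + 7
                      {op: opAdd, a: 3, b: 1, c: 2}, // r3 = r1 + r2
                      {op: opHalt},
                  }
                  for pc := 0; ; pc++ { // fetch
                      i := prog[pc]
                      switch i.op { // decode + execute
                      case opAdd:
                          regs[i.a] = regs[i.b] + regs[i.c]
                      case opAddi:
                          regs[i.a] = regs[i.b] + i.imm
                      case opHalt:
                          fmt.Println("r3 =", regs[3]) // prints: r3 = 12
                          return
                      }
                  }
              }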

    • sir_reginald@lemmy.world · +1/-1 · 1 year ago

      Do you know how a CPU is designed? Even the simple RISC CPUs we studied in college were crazy hard just to understand, and those were very basic, old processors.

      Designing a processor whose performance can match modern CPUs is no task for one indie dev, at all.

      You need a team of professionals in the field, a huge budget, and the technology to manufacture it, which you would probably end up outsourcing to one of the big manufacturers anyway, because that capability is very rare.

      So the answer to your question is never, unless you’re expecting low-performance CPUs based on FPGAs.