Intel doesn’t think that Arm CPUs will make a dent in the laptop market: “They’ve been relegated to pretty insignificant roles in the PC business.”

  • GenderNeutralBro@lemmy.sdf.org
    1 year ago

    Nobody tell Intel about Apple Silicon! Or that Apple’s sales are increasing while the rest of the industry is in a slump.

    • Trippin@feddit.nl
      1 year ago

      Do you have numbers? Cause I’m thinking at 8.6% worldwide, it’s not really a big chunk of the pie. Especially as the article states, it’s declining compared to the year before.

      • GenderNeutralBro@lemmy.sdf.org
        1 year ago

        The article you linked pretty much sums it up.

        Apple’s Mac market share increased to 8.6%, reporting year-over-year shipment growth of 10.3%, the only major manufacturer to do so.

        The year-over-year Mac shipment growth comes even as the broader market and competitors notch sharp declines in shipments, and as the Intel transition wraps up.

        Lenovo, HP, Dell and Acer all had year-over-year drops in shipments, according to IDC data.

    • long_chicken_boat@sh.itjust.works
      1 year ago

      you clearly don’t know what you’re talking about. Apple’s laptop sales are decreasing. And most Mac users can’t tell the difference between Intel, the M chips, AMD or whatever. They just know that there’s a pretty apple on the back of their laptop and that’s why they buy it.

      • vzq@lemmy.blahaj.zone
        1 year ago

        Apple’s laptop sales are decreasing.

        Right now. But that’s because the M1 and M2 Macs sold like hot cakes and we’re at a quiet spot in the product timeline. They’ll pick right back up when the new models drop.

        And most Mac users can’t tell the difference between Intel, the M chips, AMD or whatever.

        This is silly. The M1 models were leaps and bounds better than their Intel predecessors. That’s not all due to the ARM chip, but list a “MacBook Air 2020” for sale and watch the Mac heads stumble over each other to ask “intel or m1”.

        They just know that there’s a pretty apple on the back of their laptop and that’s why they buy it.

        People generally tend to stick to one ecosystem because of lock-in effects, such as investment in apps, but that’s equally true in the Windows world. I’m not sure I would classify Mac users on the whole as tech-illiterate though. Most know exactly what they are getting and why.

        And there are also A LOT of Mac users who just want a decent Unix machine and sturdy, capable hardware.

        • ink@r.nf
          1 year ago

          But that’s because the M1 and M2 Macs sold like hot cakes and we’re at a quiet spot in the product timeline.

          So you agree that it’s mainly Apple stans who buy in whenever a new Mac drops, and that Apple hasn’t been able to really win over anyone else.

          That’s what the poster above you said. Apple stans buy in because they’re a cult, so of course sales will increase once a new Mac drops.

          sold like hot cakes

          The same was true for every other manufacturer during covid, as everyone was bound to work from home and needed a laptop, but nobody seems to mention that.

          • vzq@lemmy.blahaj.zone
            1 year ago

            So you agree that it’s mainly Apple stans who buy in whenever a new Mac drops, and that Apple hasn’t been able to really win over anyone else.

            No, I very much do not. M1 laptops have drawn in a lot of new users, simply because of the battery life and performance.

            Apple stans buy in because they’re a cult, so of course their sales will increase once a new mac drops.

            I’m not sure what you’re trying to say here. There are a lot of factors that influence purchasing decisions. It’s expensive equipment and people try to decide whether it’s worth spending cash on. Apple users are not made of money.

            The same was true for every other manufacturer

            Yes. So?

            Market share of ARM laptops skyrocketed from 0% to almost 10% overnight regardless. That’s the topic, right?

  • simple@lemm.ee
    1 year ago

    Of course Intel would be the last company to admit x86 is dying. It just doesn’t make sense to keep doubling down on it anymore; Apple has proven ARM is more power efficient and in many cases more powerful than x86. I wanted to buy a new laptop this year, but it makes no sense to do so considering Windows ARM machines are right around the corner and will triple battery life and increase performance.

    • MrSpArkle@lemmy.ca
      1 year ago

      Intel is a licensed ARM manufacturer. They’re just doing PR but are capable of playing both sides.

    • morrowind@lemmy.ml
      1 year ago

      This seems to be a doggedly persistent rumor. Apple’s M chips are better due to better engineering and vertical integration.

      There is no inherent benefit to the underlying ISA.

      • simple@lemm.ee
        1 year ago

        ARM has a more efficient instruction set, uses less power, and generates less heat while matching performance. Not really a rumor.

        • morrowind@lemmy.ml
          1 year ago

          Source?

          Here’s mine

          It’s down to the engineering. Saying ARM has a more efficient instruction set is like saying C has more efficient syntax than Python. Especially these days with pipelining 'n stuff, it all becomes very similar under the hood.

          • Cocodapuf@lemmy.world
            1 year ago

            Source?

            Here’s mine

            That article may be out of date though. From the article:

            What limits computer performance today is predictability, and the two big ones are instruction/branch predictability, and data locality.

            This is true, and it points to one of the ways Intel has made their architecture so competitive: Intel has bet very heavily on branch prediction, and they’ve done a lot of optimisation around it.

            But more recently, branch prediction has proven to be quite problematic in terms of security. Branch prediction was the root of the problem that led to the Meltdown and Spectre vulnerabilities. And the only real mitigation was to completely redesign how branch prediction was done, significantly reducing the performance gains.

            So yeah, to sum up: one of the big differences between ARM and Intel’s x86 architecture is branch prediction, except branch prediction just got nerfed big time.
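To make the branch-prediction point concrete, here's a toy two-bit saturating-counter predictor (the textbook scheme, not Intel's actual design, which is far more sophisticated). It nails a regular loop branch but does barely better than a coin flip on pattern-free data:

```python
# Toy 2-bit saturating-counter branch predictor (the textbook scheme,
# not any vendor's real design). States 0-1 predict "not taken",
# states 2-3 predict "taken".
import random

def predict_accuracy(outcomes):
    state = 2  # start weakly predicting "taken"
    correct = 0
    for taken in outcomes:
        predicted_taken = state >= 2
        if predicted_taken == taken:
            correct += 1
        # saturating update toward the actual outcome
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return correct / len(outcomes)

# A loop branch: taken 7 times, then one "exit" miss, repeated.
loop_pattern = ([True] * 7 + [False]) * 100

# A data-dependent branch with no pattern looks like a coin flip.
random.seed(0)
random_pattern = [random.random() < 0.5 for _ in range(800)]

print(f"loop branch:   {predict_accuracy(loop_pattern):.2%} correct")   # 87.50%
print(f"random branch: {predict_accuracy(random_pattern):.2%} correct")
```

The mispredicted cases are exactly where speculative execution goes down the wrong path, which is the behavior Spectre-class attacks exploit.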

      • vzq@lemmy.blahaj.zone
        1 year ago

        Is it me or is this even worse news for Intel?

        The new guys have better engineering than the guys who have been doing it since the dawn of the semiconductor age.

        Pack it in.

    • NotSoCoolWhip@lemmy.world
      1 year ago

      CPU aside, it’s best to wait for Thunderbolt 5 to mature. Might finally be able to get back to using one device for travel and an eGPU for gaming.

      • polle@feddit.de
        1 year ago

        Did you have issues with TB and an eGPU? Besides some minor issues, my laptop with TB3 and an eGPU actually works well.

      • LemmyIsFantastic@lemmy.world
        1 year ago

        I mean, that’s fine. I’m just saying that x86 chips are still faster. If you want a beefy laptop, especially a work device that only needs to be slightly portable (e.g. dragged to conference rooms and back to your desk), there is little current reason to go with ARM. I’m not saying they won’t catch up, but folks in here seem to be thinking that ARM is currently faster.

      • simple@lemm.ee
        1 year ago

        M2 Max chips are close to the high-end i9, but the M-series CPUs are mobile chips. They’re designed for laptops. If competition gets a bit harder, then no doubt desktop-focused ARM CPUs will match their performance soon.

        • AzureKevin@lemmy.world
          1 year ago

          AFAIK they’re large chips though, and larger generally means more performance but also much higher manufacturing cost.

        • LemmyIsFantastic@lemmy.world
          1 year ago

          Which Apple chips are faster than AMD or Intel’s best? The M2? Because it’s not faster. More efficient, sure. Not faster.

          • Telodzrum@lemmy.world
            1 year ago

            It’s often faster. Use cases vary, and for a lot of workflows there’s nothing as fast as Apple Silicon at the consumer level.

            • zik@zorg.social
              1 year ago

              There are basically few to no cases where the M2 is faster than either Intel or AMD’s best. On power consumption, however, it’s a total win for Apple. But performance… no. Not in any way.

              • ASeriesOfPoorChoices@lemmy.world
                1 year ago
                1. Article is about laptop CPUs.

                2. M2 is the budget entry chip. M2 Pro/Max is a big jump up.

                3. there’s basically nothing out there for laptops that comes close to the M2 Pro.

                4. the only benefit Intel/AMD have right now is access to external and better graphics chips made by companies that aren’t Intel.

                5. desktops / servers are a different matter, but again, the article is talking about laptops.

                • zik@zorg.social
                  1 year ago

                  there’s basically nothing out there for laptops that comes close to the M2 Pro

                  The Intel Core i9-13980HX laptop CPU is between 20% and 100% faster than the M2 Max on nearly every benchmark.

  • mlg@lemmy.world
    1 year ago

    Even AMD has shown just how power-hungry and thermally inefficient Intel generally is.

    As Arm develops more every year, laptop OEMs will eventually switch just because of the insane power and thermal benefit.

    I hope RISC-V gets its chance to shine too

    • ichbinjasokreativ@lemmy.world
      1 year ago

      Current gen AMD laptop CPUs rival apple silicon in performance and power consumption on mobile. x86 is nowhere near as close to dying as people think.

      • JK_Flip_Flop@lemmy.world
        1 year ago

        Aye, exactly. Apple’s marketing, which is often basically lying, has a lot to answer for in the prevalence of this idea. They’d have you believe that they’re making chips with 14 billion percent more performance per watt and class-beating performance. Whereas in reality they’re very much going toe to toe with AMD and other high-end ARM chip vendors.

        • CobraChicken@lemmy.ca
          1 year ago

          My M2 air is silent because it has no fans. I’ve never had any trouble doing any office / photoshop / illustrator work. Battery life lasts hours and hours and hours.

          It’s not all marketing, there’s substance too

          • JK_Flip_Flop@lemmy.world
            1 year ago

            Did I ever say it was all lies? They’re incredibly capable machines; I’d love to own one. I just take issue with Apple’s lack of transparency in marketing the performance of the chips vs the competition.

        • Defaced@lemmy.world
          1 year ago

          Every vendor is guilty of doing this, not just Apple; even AMD. The fact is Apple found a way to make desktop ARM chips accessible and viable. If you’ve ever used an M1 or M2 Mac, you’ll understand how big of an impact they’ve made. My M1 Mac mini (8GB) could run several games above 60fps at 1440p at reasonable settings, examples being WoW (retail with upgraded graphics), LoL and DotA 2, StarCraft 2, Diablo 3, etc. It was and still is a very capable chip.

    • Never_Sm1le@lemdro.id
      1 year ago

      No laptop manufacturer would switch to ARM until a good x86 compatibility layer comes along. People would make a huge fuss if they couldn’t use their favorite apps or if those apps didn’t run decently.

      • Natanael@slrpnk.net
        1 year ago

        There are already CPUs with extra instructions specifically designed for efficient emulation of other instruction sets. This includes ARM CPUs with x86 emulation at near-native speed.
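As a cartoon of what emulating another instruction set involves (a toy op-for-op translator over made-up tuples; real translators like Rosetta 2, FEX and Box64 work on actual machine code and do far more, e.g. register mapping and memory-model fixups):

```python
# Toy "dynamic binary translation": map made-up x86-style ops onto
# ARM-style ops, then execute the translated program. Purely a cartoon;
# real translators (Rosetta 2, FEX, Box64) work on actual machine code.
X86_TO_ARM = {
    "MOV": "MOV",   # translates one-to-one
    "ADD": "ADD",
    "IMUL": "MUL",  # different mnemonic, same effect here
}

def translate(program):
    return [(X86_TO_ARM[op], dst, src) for op, dst, src in program]

def run(program):
    regs = {}
    for op, dst, src in program:
        val = regs[src] if isinstance(src, str) else src
        if op == "MOV":
            regs[dst] = val
        elif op == "ADD":
            regs[dst] += val
        elif op == "MUL":
            regs[dst] *= val
    return regs

guest = [("MOV", "r0", 6), ("MOV", "r1", 7), ("IMUL", "r0", "r1")]
print(run(translate(guest)))  # {'r0': 42, 'r1': 7}
```

The hardware assists people mention are about making exactly this kind of translation cheap, e.g. matching x86's memory-ordering behavior without extra barrier instructions.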

        • AzureKevin@lemmy.world
          1 year ago

          Wouldn’t adding a bunch of extra features to an ARM CPU make it less power efficient and more like x86?

          I’ve heard that ARM isn’t inherently more power efficient in some special way over x86; x86 has just been around so long and has had so many extra instructions added over the years, and that’s what allows it to do so much and be so performant. If you took an ARM CPU and did the same, you’d have roughly the same performance per watt.

          • Natanael@slrpnk.net
            1 year ago

            Yes and no; it’s not about the instruction set size but about general overhead.

            The x86 architecture makes a lot of assumptions that require a bunch of circuitry to be powered on continuously, unless you spend a ton of effort on power management and making sure anything not currently needed can go idle. For mobile CPUs there’s a lot of talk about “race to idle” as a way to minimize power consumption for this exact reason: you try to run everything in batches and then cut power.

            The more you try to make ARM cover the same use cases and emulate x86, the more overhead you add, but you can keep all that extra stuff powered off when not in use. So you wouldn’t increase baseline power usage much, but once you turn everything on at once, efficiency ends up being very similar.
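The "race to idle" idea is just arithmetic; with made-up wattages (illustrative numbers, not measurements of any real chip):

```python
# "Race to idle" with made-up wattages (illustrative, not measured):
# sprinting through the work and then sleeping can use less energy
# than running slowly the whole time, despite the higher active power.
def energy_joules(active_w, idle_w, active_s, total_s):
    return active_w * active_s + idle_w * (total_s - active_s)

TOTAL_S = 10.0  # wall-clock window we account energy over

# Fast mode: 15 W for 2 s of work, then a 0.5 W idle state.
fast = energy_joules(active_w=15.0, idle_w=0.5, active_s=2.0, total_s=TOTAL_S)
# Slow mode: only 5 W, but the same work takes 8 s.
slow = energy_joules(active_w=5.0, idle_w=0.5, active_s=8.0, total_s=TOTAL_S)

print(f"race to idle: {fast:.0f} J, steady and slow: {slow:.0f} J")  # 34 J vs 41 J
```

The win depends entirely on how low the idle power really is, which is why all that power-gating circuitry matters.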

    • vzq@lemmy.blahaj.zone
      1 year ago

      RISC-V is going to be huge, but it will take at least another decade for performance versions to catch up with Intel and ARM. Hopefully by that time we’ll know how to deal with architecture changes in consumer gear, thanks to the ARM switch, and can just painlessly move over.

    • jabjoe@feddit.uk
      1 year ago

      My fear is losing what we have with x86 PCs: the standardization of the platform. ARM, and even more so RISC-V, is a messy sea of bespokeness. I want hardware to be auto-discoverable so a generic OS can be installed.

    • sanqueue@lemmy.world
      1 year ago

      Blockbuster also didn’t think Netflix would make a dent in the entertainment industry. Guess where they are now? 😂

      • pandacoder@lemmy.world
        1 year ago

        If the entertainment industry was a car, Netflix is a truck that keeps backing up and repeatedly t-boning the door of quality.

        It made more than a dent.

  • arthurpizza@lemmy.world
    1 year ago

    Intel is finally innovating because of increased pressure. Don’t let Pat Gelsinger’s calm tone fool you; he knows exactly what the competition is bringing. Apple has proven what Linux users have known for a few years: the CPU architecture is not as directly tied to the software as it once was. It doesn’t matter if it’s x86, ARM, or RISC-V. As long as we have native builds (or a powerful compatibility layer) it’s going to be business as usual.

    • kalleboo@lemmy.world
      1 year ago

      the CPU architecture is not as directly tied to the software as it once was

      Yeah, it used to be that emulating anything at all would be slow as balls. These days, as long as you have a native browser you’re halfway there; then 90% of native software will emulate without the user noticing, since it doesn’t need much power at all, and you just need to entice the stuff that really needs power (Photoshop etc.), half of which is already ARM-ready since it supports Macs.

      The big wrench in switching to ARM will be games. Game developers are very stubborn; see how all games stopped working on Mac when Apple dropped 32-bit support, even though no Macs had been 32-bit for a decade.

      • Powerpoint@lemmy.ca
        1 year ago

        The game support was pretty much crap even before then, and a lot of the blame lies with Apple.

  • vzq@lemmy.blahaj.zone
    1 year ago

    Intel is doing all the things dying companies do.

    I hope I’m wrong about this, but I don’t think I am.

    • fat_stig@lemmy.world
      1 year ago

      Under previous non-technical CEOs, Intel lost its focus on innovation leadership and became a commodity supplier, losing ground to AMD and NVIDIA. Pat Gelsinger is different: he’s an engineer, he led the 486 program, and he is committed to regaining technical leadership and competing with TSMC as a foundry player. The Intel 4 node is now in mass production; Intel 3, 20A and 18A will follow in the next 2 years. New foundry capacity is being added in every factory, new sites are being developed with tens of billions in investment, and non-core business units are being divested. 15th-generation processors will be AI native; the plan is that, in the same way Centrino kick-started WiFi, AI support on the desktop will be a game changer.

      Are these things that dying companies do?

      • vzq@lemmy.blahaj.zone
        1 year ago

        Spend money they don’t have in a last-ditch attempt to hit the right buzzwords and turn things around for real this time? Yes. That’s exactly what they do.

        But then again, so do companies that have just hit a minor slump.

        Let’s hope they manage to fix things. I don’t like TSMC’s monopoly position right now.

  • onlinepersona@programming.dev
    1 year ago

    Will Intel exist in 2026? NVIDIA and AMD are making ARM chips for 2025, China is investing heavily in RISC-V, and AMD has already released an x86 CPU that rivals Apple’s M2. Who knows how things will turn out once they release an ARM chip.

    Things are shaping up to become an NVIDIA vs AMD arms race, with some Chinese company becoming a dark horse and announcing a RISC-V chip in 2-3 years.

    There was a company that announced a major technological advancement in chip fabrication in the US, but I can’t remember who or what it was. My maggot brain thinks it was something with light-based chips? I dunno… that might also be something to look out for.

    Edit: it was Intel: Intel Demos 8-Core, 528-Thread PIUMA Chip with 1 TB/s Silicon Photonics

    • BetaDoggo_@lemmy.world
      1 year ago

      It will take at least another 10 years to get a majority of the market off of x86, with 20+ years of legacy software bound to it. Not to mention all of the current-gen x86 CPUs that will still be usable 10 years from now.

      • PopOfAfrica@lemmy.world
        1 year ago

        Honestly, we just need some sort of compatibility layer. Direct porting isn’t completely required yet.

      • Patch@feddit.uk
        1 year ago

        You don’t really need the majority of the market to have moved before things start to get tricky for Intel. They’re very much a non-diversified company; the entire house is bet on x86. They’ve only just started dabbling in discrete GPUs, despite having made integrated GPU SoCs for years. Other than a bit of contract fabbing, almost every penny they make is from x86.

        If ARM starts to make inroads into the laptop/desktop space and RISC-V starts to take a chunk of the server market, the bottom could fall out of Intel’s business model fast.

      • onlinepersona@programming.dev
        1 year ago

        I’m not sure about that. If, for example, the EU says “for the environment, you may not use chips that use X watts/GHz” or something, x86 might be out of the game pretty quickly. Also, becoming market leader isn’t about old hardware; it’s about new hardware. I bet by 2030 the majority of chips sold will be either ARM or RISC-V. AMD did make an ARM rival with the 7840U, but with their entry into ARM in 2025, it’s not preposterous to believe the ARM ecosystem will pick up steam.

        Also, recompiling open-source stuff for ARM is probably not going to be a huge issue. clang and gcc already support ARM as a compilation target, and unless there’s x86-specific code in the Python or Ruby interpreters or in UI frameworks like Qt and GTK, they should compile without much issue. If proprietary code can’t or won’t keep up, the most likely outcome will be x86 emulators or the dumping of money into QEMU or something like Rosetta for Windows.
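To illustrate the "compilation target" point: targeting ARM is mostly a matter of handing the compiler a different triple. A hypothetical sketch (the arch-to-triple mapping is illustrative, and actually running the printed command needs a matching cross toolchain and sysroot installed):

```python
# Hypothetical sketch: choose a clang target triple per architecture.
# The mapping is illustrative; actually running the printed command
# requires the matching cross toolchain/sysroot to be installed.
TARGET_TRIPLES = {
    "x86_64": "x86_64-linux-gnu",
    "arm64": "aarch64-linux-gnu",
    "riscv64": "riscv64-linux-gnu",
}

def cross_compile_cmd(arch, source, output):
    triple = TARGET_TRIPLES[arch]
    return f"clang --target={triple} -O2 -o {output} {source}"

print(cross_compile_cmd("arm64", "hello.c", "hello"))
# clang --target=aarch64-linux-gnu -O2 -o hello hello.c
```

Portable C and the big open-source stacks mostly just recompile this way; it's hand-written assembly and JITs that need real porting work.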

        Anyway, I’m talking out of my ass here as I don’t write C/C++ and don’t have to deal with cross-compilation, nor do I have any experience in hardware. It’s all just a feeling.

        • Marius@lemmy.mariusdavid.fr
          1 year ago

          FYI, ARM can already handle most open source software with no problem as far as compiling is concerned. In particular, Qt and GTK do work, and cross-compiling is very easy too. Not that it’s necessary anyway (aside from probably faster compilation, unless you have a really good ARM CPU). In particular, QEMU has qemu-user (if you didn’t know), which is basically Rosetta for Linux, but with a sizable performance hit when running cross-compiled code.

          Edit: In my opinion, what will switch to non-x86 fastest on a large scale (for computers, not counting phones, tablets and microcontrollers) is servers. A lot of them use standard open source software, so switching might be pretty easy if the package manager abstracts it (like all of those I know).

          I mean, certain cloud providers are already starting to offer such servers for rent (not to speak of all those hackers who host servers on Raspberry Pis, and those who run standard Linux on mobile phones).

          • Giooschi@lemmy.world
            1 year ago

            QEMU has qemu-user (if you didn’t know), which is basically Rosetta for Linux, but with a sizable performance hit when running cross-compiled code.

            Aren’t Box64 and FEX faster though?

      • Buffalox@lemmy.world
        1 year ago

        I think it’s safe to say Apple has proven that wrong three times:
        when they switched from Motorola to PowerPC, then from PowerPC to Intel, and latest from Intel to ARM.
        If necessary, software will be quickly modified, or it will run well enough on compatibility layers.

        The switch can happen very fast for new hardware. The old systems may stay around for a while, but the previous CPU architecture can be phased out very quickly in new systems. Apple has proven that.

        • vzq@lemmy.blahaj.zone
          1 year ago

          Apple’s advantage is that it controls the whole stack from silicon to App Store. That’s a problem for all sorts of reasons, but here they can use that power to implement the shift in a way that minimally impacts users.

          Keep in mind that the M1 is not just an ARM CPU. It’s an ARM CPU with specially designed features that make Intel compatibility fast. Rosetta 2 is a marvel of technology, but it would run like crap on anything that does not have Intel memory-model emulation in hardware.

          If you are in a position where

          1. Old binaries have to keep running indefinitely (ie the entire windows market) and
          2. You have to buy regular ARM chips from a regular supplier without special x86 emulation support features

          things are looking quite a bit less rosy for you.

          • Buffalox@lemmy.world
            1 year ago

            from silicon to App Store.

            That wasn’t the case from Motorola to PowerPC, or PowerPC to x86. Very similar software infrastructure to what PCs have, with lots and lots of 3rd-party software vendors.

    • ██████████@lemmy.world
      1 year ago

      I think it’s neat how geopolitically this is all connected to the Taiwan issue; only when the mainland can make chips as good as Nvidia’s in Taiwan will they be able to economically handle an invasion.

      If they invaded today, GPU and CPU prices would explode to dumb-dumb levels for a few years. Bro, it would suck for the whole world.

  • some_guy@lemmy.sdf.org
    1 year ago

    Posturing. It’s already obvious that Arm is kicking ass. It may not take over, but it’s more than made a dent.

  • MystikIncarnate@lemmy.ca
    1 year ago

    A brief history lesson relating to Intel and ARM: Intel made ARM processors. They were not great. Of course, this was many, many years ago, but even compared to others from the same generation and year range, they were kind of poo.

    The product was Intel XScale. Manufactured starting in 2002, it lasted only about 3-4 years before being dropped, right before there was a big smartphone boom. The processors found their way into the smartphone’s predecessor, the PDA. Notably, I purchased one device with this type of processor right before the whole thing collapsed: a Dell Axim X51v. It ran Windows Mobile, which later turned into Microsoft’s attempt to compete with the likes of Google and Apple in the smartphone space, and it’s obvious how that worked out for them.

    Intel is saying this because they have to believe it’s true. They’ve abandoned all ARM development and seem to have no intention of picking it up again. They failed in the ARM space, creating fairly poor versions of the chips that they did produce, and they seem to have no intention of repeating that failure.

    Mark my words, Intel will likely go all in on RISC-V if anything. They’ll continue to build x86, as they have way too much invested in that space and it’s one of the few where they’ve actually had significant success, but when it comes to mobile/RISC, ARM isn’t something they will be picking up again.

    So bluntly, this is true… For Intel. They must believe it because they have given themselves no alternative.

  • banneryear1868@lemmy.world
    1 year ago

    ARM just makes sense for portable devices for obvious reasons, x86 isn’t dying though. For the average person who needs a laptop to do some professional-managerial work ARM is perfect.

    • flying_sheep@lemmy.ml
      1 year ago

      What are those reasons that you think are so obvious? I have no idea what you could be referring to 😅

      • banneryear1868@lemmy.world
        1 year ago

        ARM is more efficient, and as a “system on a chip” it reduces the need for as many other components on the board (phones, for example). Unless you’re doing heavy CPU- or GPU-intensive tasks, there are a bunch of upsides and no downsides to ARM.

        • flying_sheep@lemmy.ml
          1 year ago

          That’s my impression as well. I’m confused about the “just”. There are many non-portable devices that don’t have too heavy workloads and that I’d think would benefit from better energy efficiency.

          • banneryear1868@lemmy.world
            1 year ago

            Oh yeah, the article is about the laptop market, but of course all sorts of non-portable devices run on non-x86 platforms. I’d even say x86 is in the minority unless you reduce it to just desktop workstations.

      • neeshie@lemmy.world
        1 year ago

        Arm tends to be a lot more power efficient, so you can get better battery life on portable devices.

        • flying_sheep@lemmy.ml
          1 year ago

          And lower power consumption and heat production on all devices, so I don’t get the “just”.

    • Rednax@lemmy.world
      1 year ago

      There is also a sizable market for laptops that do little more than log onto a remote desktop. Especially with remote working, that has become the perfect middle ground between security, cost, and ease of use. A cheap ARM processor would work perfectly for those machines.

      • banneryear1868@lemmy.world
        1 year ago

        I’m a sysadmin and would much rather have a light arm machine to remote in from than a standard Intel laptop.

  • gnuplusmatt@startrek.website
    1 year ago

    Whatever the new architecture ends up being, at some point we will see x86 relegated to a daughterboard in the machine while we transition, or x86 will live in a datacenter and you’ll buy time on a “cloud PC” like what Microsoft will already sell you in Azure.

    • terminhell@lemmy.dbzer0.com
      1 year ago

      I’ve been saying for a while now that MS is likely trying to ditch the NT kernel. But I forgot about their Azure cloud desktop. I can see them rolling out a Chromebook-like environment (Linux based) that would hook into a full Azure cloud desktop instance. That way, their Surface devices (for example) could be used for basic web browsing on their own, and then you could connect to your desktop for everything else.

  • spudwart@spudwart.com
    1 year ago

    Ah yes, refusing to put up an umbrella because “the weather is clear now”, even though you’ll get wet if you’re wrong. Peak copium.

    See how well that worked for Sears, Blockbuster, Dial-Up providers, TV… etc.