I guess my question is, if ARM chips in desktops and laptops are the future, who will be making those chips? Will Intel and AMD switch over? Will other companies step in to provide CPUs? Will we see Windows machines with chips made by Apple? Etc.
I can't believe we'll ever see Windows systems with Apple chips in them. I also can't believe Intel and AMD will switch to the ARM instruction set, especially since ARM will soon be owned by Nvidia.
Intel's and AMD's problem is that they are both hugely invested in two technologies: the x86-64 instruction set, which is very complex, and the underlying execution engines that translate those instructions into what both companies call "micro-ops". They both have a deep pipeline of engineering innovation built on these technologies, and the inertia they'd have to overcome to switch to a different instruction set and architecture strategy seems overwhelming.
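For anyone unfamiliar with the term: a "micro-op" is one of the simpler, RISC-like internal operations a modern x86 core actually executes after decoding an architectural instruction. Here's a minimal toy sketch in Python of the idea; the instruction strings and micro-op names are made up for illustration and don't correspond to any real decoder:

```python
# Toy illustration (not a real decoder): how a complex x86-style
# instruction gets split into simpler micro-ops, the internal
# operations the core's execution units actually run.
# All instruction and micro-op mnemonics here are invented.

def decode_to_uops(instruction: str) -> list[str]:
    """Map one architectural instruction to its micro-op sequence."""
    table = {
        # A memory-to-register ADD can't execute as one step:
        # it needs a load, then an ALU operation.
        "add eax, [mem]": ["load tmp, [mem]", "alu_add eax, tmp"],
        # A push touches both the stack pointer and memory.
        "push eax": ["alu_sub esp, 4", "store [esp], eax"],
        # A register-register add is already a single micro-op.
        "add eax, ebx": ["alu_add eax, ebx"],
    }
    return table[instruction]

print(decode_to_uops("add eax, [mem]"))   # two micro-ops
print(decode_to_uops("add eax, ebx"))     # one micro-op
```

The point is that decades of Intel's and AMD's engineering investment lives in this decode-and-execute machinery, which is one reason abandoning x86 is so hard for them.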
If I were running Intel, after I fired at least half of their senior executive team, I'd take the best CPU architects and engineers they have, put them in a dedicated and isolated team, and direct them to design a new ground-up architecture to lead the industry. It wouldn't look like x86 does, that's for sure. It would be unlikely to have the ARM instruction set, because there's nothing magic about it, and I wouldn't want Nvidia dictating the instruction set I had to support. The catch with this strategy is that it takes a while to develop a new instruction set and CPU architecture. Two years would be a miracle. Even RISC-V took longer than that. (Interesting tidbit: RISC-V, an open CPU instruction set and architecture definition emerging as a competitor to ARM, was conceived in the Parallel Computing Lab at Berkeley, originally funded by Intel and Microsoft.)
Intel could probably hasten development quite a bit by using and extending the RISC-V instruction set, but then they'd have to donate any extensions they came up with, and might have to support extensions others got accepted. I can't picture Intel giving up control like that, and it wouldn't be my choice either. But still, Intel needs a new ground-up instruction set and architecture, and it has to be better than their last attempt, which was Itanium. (Yuck, although HP deserves as much blame as Intel does.) Intel's current strategy, being a jack of all trades with CPUs, GPUs, FPGAs, AI chips, smart NICs, switches, and a software ecosystem to tie it all together into a you-name-it-we've-got-it system architecture solution, seems a bit like aiming at the past. The cloud computing world, the real market to aim at, builds its own software ecosystems and doesn't like to lock itself into much of anything from anyone else. They're even replacing TCP, for example (https://engineering.fb.com/2020/10/21/networking-traffic/how-facebook-is-bringing-quic-to-billions/).
One way or another, Intel (and AMD for that matter) is going to have to innovate its way out of the x86 mess. AMD has out-innovated Intel, at least for now, but both, IMO, are vulnerable to someone starting from a blank page without legacy thinking. The Apple M1 is just a hint of what can happen.