
The Slow Death and Rebirth of Intel

Shame, because nothing beats a custom build. The level of control and parts selection is just far greater if you DIY.
I expect my Intel i5K custom DIY gaming box to last me 10+ years, and even up to 15 for basic surfing and video viewing.
 
AI workloads benefit from the maximum number of threads, hence AMD is currently the rising star with Threadripper.
AI uses massively parallel matrix math and benefits from thousands of parallel cores on a single GPU (and millions across a cluster). Threadripper is nice on the desktop because it has a lot of PCIe lanes to push data between GPUs and memory/storage, but high-end desktop is a bit player.
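For anyone wondering why that is, here is a toy sketch (my own illustration, not anything from the thread) of why matrix math spreads so naturally over many cores: every row-block of a matrix product can be computed independently.

```python
# Toy illustration: each row-block of C = A @ B depends only on that block of A
# and on B, so the blocks need no coordination -- four CPU threads here, or
# thousands of GPU lanes in a real AI workload.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)
A = rng.random((1024, 1024))
B = rng.random((1024, 1024))

blocks = np.array_split(A, 4)                    # one row-block per worker
with ThreadPoolExecutor(max_workers=4) as pool:  # NumPy's BLAS releases the GIL
    C = np.vstack(list(pool.map(lambda blk: blk @ B, blocks)))

assert np.allclose(C, A @ B)                     # same answer as one big matmul
```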

Similarly, in an AI context the CPU compute at scale is pretty irrelevant. PCIe lanes and bandwidth (and the ability to use things like CXL and accelerators) become as important as, or more important than, the compute cores. Outside of an AI context but still at scale, CPUs are more relevant, but accelerators, cores and PCIe lanes remain the dominant priorities. Intel is actually positioned pretty well here with the latest round of Xeons, a lot better than they've been the last few years.
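To put rough numbers on the bandwidth point (back-of-the-envelope figures of mine, assuming PCIe 5.0 signalling; not from the post above):

```python
# Rough arithmetic only: why lane count matters once you're moving model
# weights between GPUs, memory and storage.
GT_PER_S = 32           # PCIe 5.0 raw rate per lane (GT/s)
ENCODING = 128 / 130    # 128b/130b line-coding overhead
LANES    = 16           # one x16 slot

gbytes_per_s = GT_PER_S * ENCODING * LANES / 8   # bits -> bytes
print(f"PCIe 5.0 x16 ~ {gbytes_per_s:.0f} GB/s per direction")   # ~63 GB/s

# A 70B-parameter model in fp16 is ~140 GB of weights, so even a full x16
# link spends a couple of seconds just streaming it across once.
print(f"Time to move 140 GB once: {140 / gbytes_per_s:.1f} s")
```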

The downsides of this are that the hyperscale market is leaving the desktop market behind, and what used to be pretty direct tech trickle-down from high performance commercial compute to PCs is increasingly disconnected. (This is true on the GPU side as well.)

I wouldn't expect home PCs to die out, but the market is still looking for the next killer app; that was going to be VR, but that's been fizzling, and it doesn't look like it's going to be Copilot either.
 
The ARM instruction set is ARM IP; Apple, Qualcomm and others pay ARM for an Architectural license in order to build their own CPU designs using the ARM architecture.
That's true, the instruction set definition is IP (which you get via an agreement Arm calls an "Architecture License"), but it is very thin IP. AMD and Intel have the equivalent of a cross-licensed (meaning anything one company invents in the instruction set, the other company gets it) "Architecture License" for the variants of the x86 instruction set, but the CPU designs are vastly different.
But what I was talking about is the fact that most of the world has an ARM CPU in their pocket and does most of their computing with it. Most of the world has already transitioned away from x86 (or was never there in the first place).
For mobile devices and tablets that's true, but for laptop and desktop computing you're incorrect.
 
For mobile devices and tablets that's true, but for laptop and desktop computing you're incorrect.

Why make the distinction between device types? They all have CPUs in them. If I'm honest, I could do most, if not all, of my personal computing on my tablet and phone - I don't really need a 12-core Ryzen and 64GB of RAM under my desk.
 
Why make the distinction between device types? They all have CPUs in them. If I'm honest, I could do most, if not all, of my personal computing on my tablet and phone - I don't really need a 12-core Ryzen and 64GB of RAM under my desk.
Well, if you want to do multi-programming (run multiple apps simultaneously), OS virtualization, multi-threaded applications, or high-performance graphics, you'll probably want an OS like Windows or macOS, much larger real and virtual memories, and sophisticated memory protection. Apple has been putting M-series CPUs in iPads for a few years now (I have one), but the iPad can't replace what I do with a Mac.
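As a trivial illustration of the multi-threaded-applications point (my own toy example, using processes rather than threads to sidestep Python's GIL), a desktop OS will happily keep every core busy at once:

```python
# Spawn one busy worker per core; a desktop OS schedules them all in parallel.
import os
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> int:
    # Pointless CPU-bound work, just to occupy a core.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    with ProcessPoolExecutor(max_workers=cores) as pool:
        results = list(pool.map(burn, [2_000_000] * cores))
    print(f"kept {cores} cores busy, got {len(results)} results")
```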
 
I’d think that if you want to build an enthusiast-grade x86 PC, go right ahead and do it. Windows on ARM has been around for years; only now is it being touted as the AI future. But the success of Copilot+ devices (as they are now called) isn’t assured, and so far it seems limited to a few premium-priced portable devices aimed at early adopters. I’m not aware of reference designs for other classes of hardware being promoted yet.

To date, the only real benefit that I’ve gotten from AI is as a natural-language front-end to search engines. And that’s being done in the cloud, not on my 7th-gen i7 desktop.
 
Well, if you want to do multi-programming (run multiple apps simultaneously), OS virtualization, multi-threaded applications, or high-performance graphics, you'll probably want an OS like Windows or macOS, much larger real and virtual memories, and sophisticated memory protection. Apple has been putting M-series CPUs in iPads for a few years now (I have one), but the iPad can't replace what I do with a Mac.
In fairness this is because the iPad runs on a battery and has a super-low thermal dissipation capability, not because of the chip or software architecture. The same is true of phones, generally.
 
To date, the only real benefit that I’ve gotten from AI is as a natural-language front-end to search engines. And that’s being done in the cloud
...on Linux-based clusters packed full of Nvidia GPUs, and quite possibly with AMD processors.
 
In fairness this is because the iPad runs on a battery and has a super-low thermal dissipation capability, not because of the chip or software architecture. The same is true of phones, generally.
The software architecture you're actually referring to, the iOS operating system, is the cause of the lesser capability even on iPads that use M-series CPUs. The same is true with Android.
 
Well, if you want to do multi-programming (run multiple apps simultaneously), OS virtualization, multi-threaded applications, or high-performance graphics, you'll probably want an OS like Windows or macOS, much larger real and virtual memories, and sophisticated memory protection.

Does the world need any of this to read the news, check their bank balance, check their train or flight time, file a tax return, book a medical appointment, order a pizza, watch a film, listen to some music, read a book or chat to their friends on their favourite audio forum?

At home, I run multiple PCs and laptops (running Windows 11 and Linux), I have a custom-built pfSense router, I run 2 Proxmox hosts with multiple VMs, and I run two Synology NAS units. I have wired networking to most rooms because I flood-wired my house with Cat5e 15 years ago; this is now running at 2.5GbE after a recent switch upgrade, and I have a 10G OM3 fibre 'backbone' linking the switches in my office and attic. I also have multiple 802.11ax APs (with wired backhaul) providing Wi-Fi 6 everywhere. Do I need any of this to carry out the common tasks above? No, I could do it all with a telco-provided wireless router and my phone/tablet, but it's my hobby and my career and it keeps me (mostly) out of mischief :)
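If anyone wants to sanity-check links like that after a switch upgrade, something like this rough sketch works (iperf3 is just one way to do it, and the hostname is made up for the example):

```python
# Run `iperf3 -s` on the far end first, then measure from this end.
import json
import subprocess

def link_speed_gbps(server: str) -> float:
    out = subprocess.run(
        ["iperf3", "-c", server, "-J"],   # -J = machine-readable JSON output
        capture_output=True, text=True, check=True,
    ).stdout
    bps = json.loads(out)["end"]["sum_received"]["bits_per_second"]
    return bps / 1e9

if __name__ == "__main__":
    # "attic-switch.lan" is a placeholder hostname for this example.
    print(f"office -> attic: {link_speed_gbps('attic-switch.lan'):.2f} Gb/s")
```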
 
I’d think that if you want to build an enthusiast-grade x86 PC, go right ahead and do it. Windows on ARM has been around for years; only now is it being touted as the AI future. But the success of Copilot+ devices (as they are now called) isn’t assured, and so far it seems limited to a few premium-priced portable devices aimed at early adopters. I’m not aware of reference designs for other classes of hardware being promoted yet.

To date, the only real benefit that I’ve gotten from AI is as a natural-language front-end to search engines. And that’s being done in the cloud, not on my 7th-gen i7 desktop.
I have a feeling AI is behind the recent increase in data breaches. Don't think I consider that a benefit.
 
I’d think that if you want to build an enthusiast-grade x86 PC, go right ahead and do it. Windows on ARM has been around for years; only now is it being touted as the AI future. But the success of Copilot+ devices (as they are now called) isn’t assured, and so far it seems limited to a few premium-priced portable devices aimed at early adopters.

I just organized one of the latest Copilot+ notebooks (Lenovo Yoga S7, Qualcomm Snapdragon X1E78100 12-core processor (3.4GHz)) for a close mate of mine. It's obscenely fast. I mean, it's hard to comprehend how quick it is at everything. Nothing fazes it. 3K screen, full touch, all USB-C; he needs a USB-C to 4-port HDMI adapter for legacy work on his 4x1080p displays. It can do another 2 external 4K screens at the same time. Also grabbed a nice LG curved monitor, 3840x1600 (USB-C).

It was only released a month back - it had to be ordered and came with a "DO NOT SELL OR DISPLAY BEFORE x date" seal on the box, with threats that the retailer would breach their contract with Lenovo if they sold it prior.
 
I attended a small Financial Services gig in the City last week. Robert Hormuth from AMD was one of the panellists and he had some interesting things to say about ARM vs x86 and the challenges of cloud, hybrid and polycloud deployment. One of his arguments was that x86 was ubiquitous and therefore workloads running on it could run on-prem and in the cloud without re-compilation. Whilst this is certainly a valid point, it does smell a bit of FUD directed at large enterprises with a high number of legacy systems, i.e. this argument is much less relevant to a start-up with no on-prem infrastructure or legacy systems to maintain or migrate. Even in the enterprise space, it's becoming less relevant as we consume more SaaS for core business functions. e.g. Do I care what architecture Microsoft 365, Salesforce or ServiceNow is running on as long as it's functional, reliable, performant and cheap?
 
Nothing fazes it.
Try gaming on it ;)

Sadly the graphics drivers don’t seem to be made to get the most out of the systems, at least not yet. Give it some time to let the drivers catch up and have the games patched to run better, and then gaming will also be halfway decent on these.
 
Try gaming on it ;)

Sadly the graphics drivers don’t seem to be made to get the most out of the systems, at least not yet. Give it some time to let the drivers catch up and have the games patched to run better, and then gaming will also be halfway decent on these.
Yes, reviews I've read of it say it is not great for DaVinci video work, Photoshop or gaming. Not that it cannot be one day, but the software isn't there yet. That sort of thing is what the AMD guy had in mind in Berwhales' comments.
 
Intel would need to replace a large percentage of their staff for a rebirth... This era is seeing a lot of megacaps starting to fail technically (see Boeing for a recent, obvious example), for reasons I think all of us know but cannot say.

A lot of the above discussion is about ARM and architectures and stuff, but AMD exists as an x86 competitor.
 
I attended a small Financial Services gig in the City last week. Robert Hormuth from AMD was one of the panellists and he had some interesting things to say about ARM vs x86 and the challenges of cloud, hybrid and polycloud deployment. One of his arguments was that x86 was ubiquitous and therefore workloads running on it could run on-prem and in the cloud without re-compilation. Whilst this is certainly a valid point, it does smell a bit of FUD directed at large enterprises with a high number of legacy systems, i.e. this argument is much less relevant to a start-up with no on-prem infrastructure or legacy systems to maintain or migrate. Even in the enterprise space, it's becoming less relevant as we consume more SaaS for core business functions. e.g. Do I care what architecture Microsoft 365, Salesforce or ServiceNow is running on as long as it's functional, reliable, performant and cheap?
Agreed. Hormuth used to work for Intel, and the software optimization for x86 argument was invented there (not by him), though back then it was more specific and aimed at just Intel CPUs.

And x86 object code compatibility means a lot less than it used to. On desktop/laptop CPUs, all NPUs from everyone are proprietary, and you don't access them via instructions and compilers - you use libraries. Libraries mean programming changes. And Intel, for example, puts proprietary hardware accelerators in Xeon server CPUs now, which are accessed in differing ways. AMD and Intel CPUs are growing further apart regarding compatibility.
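As a concrete example of the "libraries, not instructions" point, here is a rough sketch using ONNX Runtime (one library among several; the provider names and the model path are assumptions for illustration, not a statement about any specific product):

```python
# Which accelerator you actually hit depends on whichever execution provider
# the vendor ships for this library, not on new CPU instructions.
import onnxruntime as ort

available = ort.get_available_providers()
print("providers on this box:", available)

# Prefer a vendor NPU/accelerator provider if present, else fall back to CPU.
preferred = [
    "QNNExecutionProvider",       # Qualcomm NPU stack (name assumed)
    "OpenVINOExecutionProvider",  # Intel
    "CPUExecutionProvider",
]
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

# "model.onnx" is a placeholder path; any exported model would do.
session = ort.InferenceSession("model.onnx", providers=providers)
print("running on:", session.get_providers())
```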
 
Agreed. Hormuth used to work for Intel, and the software optimization for x86 argument was invented there (not by him), though back then it was more specific and aimed at just Intel CPUs.

And x86 object code compatibility means a lot less than it used to. On desktop/laptop CPUs, all NPUs from everyone are proprietary, and you don't access them via instructions and compilers - you use libraries. Libraries mean programming changes. And Intel, for example, puts proprietary hardware accelerators in Xeon server CPUs now, which are accessed in differing ways. AMD and Intel CPUs are growing further apart regarding compatibility.

That kind of proprietary lock-in is not new. My company are only getting off our old Netezza TwinFins (with their FPGA accelerators) because we're closing a DC, and you can't pay IBM (or anyone else) enough to move them with a guarantee that they will come back to life at the other end.

One of the slightly surprising take-aways from the round table was the unanimous advice to use open systems for everything. I wasn't surprised to hear this from the lady from Red Hat, but she had Microsoft and AMD either side and they said the same thing. I guess when your enemies are Broadcom/VMware and ARM, everyone else is a friend :).
 