
The CPU comeback nobody saw coming
For years, the AI story was basically: GPUs, GPUs, GPUs. But Counterpoint Research says the script is getting a rewrite, and the humble CPU is suddenly back in the group chat.
As AI shifts from training big models to actually using them, the plumbing underneath gets a lot more important. Inference pipelines, orchestration layers, and those ever-busy head nodes still need serious x86 muscle. Translation: the stuff that keeps AI systems from becoming expensive chaos still runs on CPUs.
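To make the "CPU in the middle" point concrete, here is a toy sketch of an agentic loop. Everything in it is made up for illustration (the function names, the tool, the prompt protocol); the point is simply that the GPU does one job, model inference, while the CPU-side orchestrator handles all the routing, parsing, and tool execution wrapped around it.

```python
# Toy agentic loop: one simulated GPU step, lots of CPU-side plumbing.
# All names here are hypothetical; no real framework is being depicted.

def gpu_generate(prompt: str) -> str:
    """Stand-in for a GPU-bound model call."""
    if "[tool result:" in prompt:
        return "FINAL:It is 72F and clear."
    if "weather" in prompt:
        return "CALL_TOOL:weather"
    return "FINAL:Nothing to do."

def run_tool(name: str) -> str:
    """CPU-side tool execution (API calls, parsing, retries...)."""
    return {"weather": "72F and clear"}.get(name, "unknown tool")

def orchestrate(user_query: str, max_steps: int = 4) -> str:
    """CPU-side loop: every model call is surrounded by CPU work,
    i.e. prompt assembly, response parsing, tool dispatch, state updates."""
    context = user_query
    for _ in range(max_steps):
        response = gpu_generate(context)       # the only "GPU" step
        if response.startswith("CALL_TOOL:"):  # CPU: parse and route
            tool_result = run_tool(response.split(":", 1)[1])
            context = f"{context}\n[tool result: {tool_result}]"
        else:
            return response.removeprefix("FINAL:")
    return "step limit reached"

print(orchestrate("What's the weather today?"))
```

Every extra tool call or agent hop in a loop like this is CPU work, which is the mechanism behind the ratio shift the report describes.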
Agentic AI = more work for the processor in the middle
The report argues that agentic AI is changing the hardware math in a pretty big way:
- CPU-to-GPU ratios are moving closer to 1:1, versus earlier setups that were more like 1:8
- CPUs may now handle roughly 50% to 90% of agentic AI workloads
- The data center CPU market could grow to about $80 billion by 2028
That’s not a cute little tweak. Going from 1:8 to 1:1 means roughly eight times as many CPUs for the same GPU fleet: a whole new layer of demand for the server stack.
Who wins? Depends on who can feed the beast
Intel, AMD, and Arm-based chip providers are all fighting for the same AI-optimized server dollars, while Taiwan Semiconductor sits in the background like the overbooked restaurant everyone needs a reservation at. The report also flags advanced nodes like Intel 18A, Intel 14A, and TSMC’s 3nm and 2nm processes as key battlegrounds.
For investors, the headline here is simple: AI isn’t just a GPU party anymore. If this shift sticks, the CPU market gets a much bigger seat at the table, and the winners will be the vendors that can build faster, more efficient chips without melting the power bill.
Big picture: the AI boom may be broadening out, and that usually means more companies get to participate — which is great until everyone starts fighting over the same wafers.
