Nvidia revealed a $5 billion investment in Intel common stock at $23.28 per share, giving it around a 5% ownership stake.
The deal, however, goes far beyond equity—it’s about co-developing multiple generations of custom chips.
For data centers, Intel will build custom x86 CPUs integrated into Nvidia’s AI infrastructure platforms.
On the PC side, Intel plans to launch new x86 RTX SoCs—system-on-chips combining Intel CPUs with Nvidia RTX GPU chiplets—targeting the massive 150 million-unit laptop market.
Although the investment is pending regulatory approval, analysts believe this marks a pivotal turning point in AI system architecture.
Goldman Sachs: AMD And ARM Under Pressure
Goldman Sachs analyst James Schneider called the Nvidia-Intel collaboration a net negative for both Advanced Micro Devices Inc. (AMD) and Arm Holdings Inc. (ARM), noting that the partnership could shift enterprise AI and PC dynamics in favor of the two tech giants.
Schneider said this custom chip initiative will likely “weaken AMD’s enterprise server CPU position on the margin” and slow its pace of desktop share gains.
On ARM, he said the move could “dampen investor sentiment on ARM’s rate of datacenter market share gain.”
Goldman Sachs also highlighted Synopsys Inc. (SNPS) as a key beneficiary of the Nvidia-Intel collaboration, tying it to a rebound in confidence surrounding one of its largest customers, Intel Corp.
“We view this announcement as a positive for Synopsys, as it should help restore confidence in the health of a key customer (Intel) following the company’s muted FY26 IP guidance (driven in part by Intel).”
22V Research: This Is About Infrastructure, Not Hype
Jordi Visser, analyst at 22V Research, said the Nvidia-Intel tie-up was the logical next step after Oracle Corp.'s (ORCL) blockbuster earnings earlier this month, which he called "the first real proof that AI demand is outstripping supply."
Visser argued the market is underestimating the implications of inference—running AI models in real time—as the next driver of infrastructure demand.
“This wasn’t about future hype,” Visser said. “It was about businesses already committing billions of dollars to use AI every day.”
By integrating Intel’s x86 CPUs into its AI systems, Nvidia is scaling its proprietary NVLink interconnect protocol to an architecture that has long dominated enterprise and cloud computing. “AI is now infrastructure,” Visser added. “And this infrastructure, just like the internet or electricity, needs factories.”
He also highlighted the shift toward “physical AI”—embedding intelligence into robots, autonomous machines, and industrial systems—as the next frontier.
The implication? This collaboration is not just about servers and chips. It’s about equipping machines to reason and act in the real world.
Bank of America: A $25B To $50B Market Opportunity
Bank of America’s semiconductor team, led by analyst Vivek Arya, called the deal a “historic collaboration” with long-term revenue potential between $25 billion and $50 billion per year. The analysts said the partnership benefits both Intel and Nvidia—but with caveats.
For Intel, the deal validates its x86 CPU technology and significantly increases exposure to the booming enterprise AI market. However, Bank of America flagged ongoing concerns around the lack of foundry customers on Intel’s cutting-edge nodes like 18A and 14A.
For Nvidia, the benefit is clear: increased scale-up capabilities for enterprise customers using x86 CPUs, deeper integration of its NVLink protocol, and improved access to customers beyond the ARM ecosystem.
Importantly, the collaboration keeps Nvidia’s foundry options open, allowing it to scale without relying on Intel manufacturing.
While the announcement included no commitment to Intel Foundry Services, Bank of America noted that the expanded Nvidia-Intel ecosystem could be a headwind for PCIe and UALink suppliers, as NVLink gains adoption.
Source: Benzinga