Intel’s Gaudi accelerators have struggled in the broader AI market, but hope is emerging in the enterprise segment, as opposed to the cloud world where Nvidia is king with its formidable GPUs.
The hope comes partly from what Dell announced at Dell Technologies World 2025 on Monday: a Dell AI platform with Gaudi 3 AI accelerators, delivered by way of the Dell AI Factory. As the companies put it, the combination of Gaudi 3 hardware, an open-source software stack and Dell’s infrastructure expertise will accelerate AI innovation across the industrial base.
The Dell AI platform with Intel is built on the PowerEdge XE9680 server, which houses eight Gaudi 3 accelerators, each with 128 GB of HBM and 3.7 TB/s of memory bandwidth, along with 5th Gen Intel Xeon processors and air cooling.
Intel noted that Gaudi 3 offers 70% better price-performance than Nvidia H100 GPUs on inference jobs with Llama 3 70B. Notably, Intel’s solution avoids vendor lock-in, offering long-term flexibility with open networking and software stacks. “With the Dell AI platform with Intel, we’re delivering a future-ready foundation for AI innovation that grows with our customers’ needs,” Varun Chhabra, senior vice president at Dell Technologies, said in a statement.
Jack Gold, an analyst at J. Gold Associates, told Fierce that the Intel approach with Dell follows the same logic behind IBM offering Gaudi on the IBM Cloud. IBM also focuses on more modest small language models, which customers see as attractive for more specific jobs, rather than the extremely large language models.
“Getting Dell to offer these AI systems is important for Intel. It won’t result in a major competitive posture against the likes of Nvidia and AMD GPUs, but it could help at the mid- to lower end of AI solutions in a market just now emerging,” Gold added. “Dell supports AMD and Nvidia and now Intel Gaudi. Many AI factory systems already go out with Intel CPUs powering them, so this is another option for customers to keep a fully Intel-powered solution in play.”
The announcement is significant for Intel because it gives customers a way to buy Gaudi systems from a major supplier, along with the support needed to make them work, Gold said. “Gaudi is a good solution for smaller-scale, inference-based AI needs or for more modest training use,” he said. “Intel claims a significant inference-per-watt advantage, so the cost and efficiency of Gaudi systems could be attractive to organizations running inference workloads, not major model training workloads like those reserved for the major Nvidia hyperscaler solutions.”
In addition to the Dell news with Gaudi, Intel announced new Arc Pro B-Series GPUs at Computex for use in AI inference and workstations.
Also at Computex, Intel said Gaudi 3 is available in PCIe cards for inferencing within existing data center servers; the cards will ship in the second half of 2025. Before the Dell announcement, Intel had disclosed Gaudi 3 rack-scale systems at Computex without naming the Dell partnership.