At Dell Technologies World in Las Vegas, the company has announced Dell AI Factory advancements – including powerful and energy-efficient AI infrastructure, integrated partner ecosystem solutions, and professional services to drive simpler and faster AI deployments.
AI is now essential for businesses, with 75% of organisations saying AI is key to their strategy, and 65% successfully moving AI projects into production. However, challenges like data quality, security concerns and high costs can slow progress.
“It has been a non-stop year of innovating for enterprises, and we’re not slowing down,” says Jeff Clarke, COO at Dell. “We have introduced more than 200 updates to the Dell AI Factory since last year. Our latest AI advancements – from groundbreaking AI PCs to cutting-edge data centre solutions – are designed to help organisations of every size to seamlessly adopt AI, drive faster insights, improve efficiency, and accelerate their results.”
The Dell AI Factory approach can be up to 62% more cost-effective for inferencing LLMs on-premises than the public cloud and helps organisations securely and easily deploy enterprise AI workloads at any scale. Dell says it offers the industry’s most comprehensive AI portfolio designed for deployments across client devices, data centres, edge locations and clouds. More than 3,000 global customers across industries are accelerating their AI initiatives with the Dell AI Factory.
With the announcement, Dell introduces end-to-end AI infrastructure to support everything from edge inferencing on an AI PC to managing massive enterprise AI workloads in the data centre:
The Dell Pro Max Plus laptop with Qualcomm AI 100 PC Inference Card is the world’s first mobile workstation with an enterprise-grade discrete NPU. It offers fast and secure on-device inferencing at the edge for large AI models typically run in the cloud, such as today’s 109-billion-parameter models.
The Qualcomm AI 100 PC Inference Card features 32 AI cores and 64GB of memory, providing the power to meet the needs of AI engineers and data scientists deploying large models for edge inferencing.
The industry-first Dell PowerCool Enclosed Rear Door Heat Exchanger (eRDHx) is a Dell-engineered alternative to standard rear door heat exchangers. Designed to capture 100% of IT heat generated with its self-contained airflow system, the eRDHx could reduce cooling energy costs by up to 60% compared to currently available solutions.
And with Dell’s factory integrated IR7000 racks equipped with future-ready eRDHx technology, organisations can:
- Significantly cut costs and eliminate reliance on expensive chillers, because the eRDHx operates with water temperatures warmer than traditional solutions (between 32 and 36 degrees Celsius).
- Maximise data centre capacity by deploying up to 16% more racks of dense compute, without increasing power consumption.
- Enable air cooling capacity up to 80 kW per rack for dense AI and HPC deployments.
- Minimise risk with advanced leak detection, real-time thermal monitoring and unified management of all rack-level components with Dell Integrated Rack Controller software.
Dell PowerEdge XE9785 and XE9785L servers will support AMD MI350x accelerators with 288GB of HBM3e memory per GPU and up to 35 times greater inferencing performance. Available in liquid-cooled and air-cooled configurations, the servers will reduce facility cooling energy costs.
Because AI is only as powerful as the data that fuels it, organisations need a platform designed for performance and scalability. The Dell AI Data Platform updates improve access to high quality structured, semi-structured, and unstructured data across the AI lifecycle.
- Dell Project Lightning, which Dell describes as the world’s fastest parallel file system, accelerates training for large-scale and complex AI workflows. According to early benchmarking results, Project Lightning delivers up to two times greater throughput than competing parallel file systems.
- Dell Data Lakehouse enhancements simplify AI workflows and accelerate use cases – such as recommendation engines, semantic search, and customer intent detection – by creating and querying AI-ready datasets.