Intel and Aible, provider of an end-to-end serverless generative AI (GenAI) and augmented analytics enterprise solution, now offer shared customers solutions for running advanced GenAI and retrieval-augmented generation (RAG) use cases on multiple generations of Intel Xeon CPUs.

The collaboration, which includes engineering optimizations and a benchmarking program, enhances Aible’s ability to deliver GenAI results at low cost for enterprise customers and helps developers embed AI capabilities into applications.

Together, the companies offer scalable, efficient AI solutions that draw on high-performing Intel hardware to help customers solve business challenges with AI.

“Customers are looking for efficient, enterprise-grade solutions to harness the power of AI,” says Mishali Naik, Intel senior principal engineer, Data Center and AI Group. “Our collaboration with Aible shows how we’re working closely with the industry to deliver innovation in AI and lower the barrier to entry for many customers to run the latest GenAI workloads on Intel Xeon processors.”

Aible’s solutions demonstrate that CPUs can deliver strong performance across a range of the latest AI workloads, from running language models to RAG. Optimized for Intel processors, Aible’s technology uses an efficient, serverless, end-to-end approach to AI, consuming resources only when there are active user requests. For example, the vector database activates for just a few seconds to retrieve information relevant to a user query, and the language model similarly powers up briefly to process and respond to the request. This on-demand operation helps reduce total cost of ownership (TCO).
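The on-demand pattern can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not Aible’s implementation: the embedding model, the tiny in-memory document list, and the stubbed language model call are all assumptions made for the example.

```python
# Minimal sketch of on-demand RAG: nothing stays resident between requests.
# Model name and documents are illustrative assumptions, not Aible's stack.
import gc

import numpy as np
from sentence_transformers import SentenceTransformer

DOCUMENTS = [
    "Intel Xeon Scalable processors include AMX instructions for AI workloads.",
    "Retrieval-augmented generation grounds model output in retrieved text.",
]


def handle_query(query: str) -> str:
    # 1. Spin up the CPU embedding model only for the duration of the request.
    embedder = SentenceTransformer("all-MiniLM-L6-v2", device="cpu")
    doc_vecs = embedder.encode(DOCUMENTS, normalize_embeddings=True)
    query_vec = embedder.encode([query], normalize_embeddings=True)

    # 2. Retrieve the most relevant document via cosine similarity.
    best = int(np.argmax(doc_vecs @ query_vec.T))
    context = DOCUMENTS[best]

    # 3. A CPU-hosted language model would likewise run only for this request;
    #    the placeholder below marks where that call would go.
    answer = f"[LLM answer grounded in: {context!r}]"

    # 4. Release resources so no compute is consumed between requests.
    del embedder, doc_vecs
    gc.collect()
    return answer


print(handle_query("How does RAG reduce hallucinations?"))
```

In a production system the document embeddings would be precomputed and held in a vector database rather than rebuilt per request; the point of the sketch is that the compute-heavy components activate only while a request is being served.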

While RAG is often implemented on GPUs (graphics processing units) and accelerators to leverage their parallel processing capabilities, Aible’s serverless technique, combined with Intel Xeon Scalable processors, allows RAG use cases to run entirely on CPUs. Benchmark data from the program shows that multiple generations of Intel Xeon processors can run RAG workloads efficiently.
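As a concrete illustration of CPU-only inference, the following sketch uses the Hugging Face transformers library to run a small text-generation model pinned to the CPU. The model name and prompt are examples chosen for brevity, not ones cited by Intel or Aible.

```python
# Hedged example of CPU-only text generation with Hugging Face transformers.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="distilgpt2",  # small example model; a RAG deployment would use a larger LLM
    device=-1,           # -1 pins inference to the CPU, no GPU or accelerator needed
)

prompt = "Answer using the retrieved context: Xeon processors support AMX."
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```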