Kathy Gibson reports from Red Hat Summit in Johannesburg – Artificial intelligence (AI) has been the big IT buzzword for the past year and is now ready to become a mainstream technology that can help organisations solve their IT and business challenges.

Dion Harvey, regional GM of Red Hat Sub-Saharan Africa, says 2024 has been a fast-paced year for innovation in the IT industry – and it’s only going to get faster.

“Whether we are grappling with the continued impact of cloud computing, or disruption in the virtualisation space, or trying to plot a path for AI – that is the nature of the industry,” Harvey says. “But in change and disruption there is opportunity.”

Hans Roth, senior vice-president and GM: EMEA at Red Hat, explains that businesses around the world are looking to new innovations in the AI space.

“As the leader in open source, we need to ensure that we are enabling the open source community to contribute to AI,” he says. “We truly believe that AI will be the next open revolution.”

Red Hat has played a big role in bringing waves of technology to more customers through the application of open source technology and collaboration, Roth points out.

“Together, Red Hat and its partners have built value for customers – and we have the foundation to do more.”

Customers face a number of challenges in implementing AI, including how to integrate it into tremendously complex environments where applications run across multiple clouds, on-premises and at the edge.

“They need a secure automated infrastructure that can run any application,” says Roth. “And the best teams are those that are working together.”

Red Hat recently announced that it is expanding its Open Hybrid Cloud, enabling the entire portfolio with AI. The new platform elevates automation to a mission-critical function and adds tools to make developers more productive, with the goal of delivering AI in trusted, secure swim lanes that keep businesses safe. The edge portfolio has also been expanded with AI.

“AI has become the core innovation driver of the past 10 years and there is much more to come with the advent of GenAI,” Roth says.

Brian Gracely, senior director: portfolio strategy at Red Hat, agrees that we are at the beginning of the GenAI era. “AI is not new, but it feels like the first time we can all touch, feel and experiment with AI.”

And open source has a big role to play in how AI rolls out going forward, he says.

“Open technology has influenced every era that came before,” Gracely says. “We have seen corporate IT embracing open technologies, and private cloud, public cloud and hybrid cloud all moving to open source.”

AI continues to evolve quickly, Gracely adds. “The first models were quite limited, and they weren’t open. In the last year, we have seen the marketplace become a lot more open and more permissive.”

But there is a lot that can still be done: “At Red Hat, we have thought about what we can do – as a leader in open source – to really make it more open, and to take advantage of what can happen when communities work together.”

To start, Red Hat has become involved in open source AI models. “Yes, some models are open in a permissive way,” Gracely says. “But, most often, users can’t see where the data came from, or change the models.”

The goal is to create open source models that deliver what we expect from software, Gracely explains.

“So we collaborated with the IBM Research team to take the Granite models and open source those with open licensing, open access to the data sources, and open access to how the models are built.”

The open source Granite models are now available on HuggingFace and GitHub.
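For readers who want to experiment, the snippet below is a minimal sketch of pulling one of the published Granite models from Hugging Face using the widely used transformers library. The model name “ibm-granite/granite-7b-base” is an assumption for illustration; check the ibm-granite organisation page on Hugging Face for the models actually available.

```python
# Minimal sketch: loading an open source Granite model from Hugging Face.
# "ibm-granite/granite-7b-base" is an assumed model id, used for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-7b-base"  # assumption: check the ibm-granite org for current names

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to confirm the model loads and runs.
inputs = tokenizer("Open source AI means", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```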

Red Hat also took a step back and reconsidered why models have to be so big. “Every few weeks, there is a new biggest model, with millions of parameters, containing huge volumes of data,” says Gracely. “But, from the perspective of your business, do you need all of that data?”

Red Hat has found that companies are seeing significant results from smaller models, which are often more powerful for the task at hand – with smaller footprints that are less costly to run. “With these smaller models, companies can get big results provided by experts across the organisation,” Gracely says.

Another challenge for companies is the cost of running AI, which limits its accessibility.

“Because AI can be so powerful, it can also be expensive,” Gracely explains. “Many organisations don’t have the data scientists available to create the models. It needs to be easier for subject matter experts to use the technology to build the models.

“And, if these models are smaller, we can bring down the cost of running them.”

Because customers wanted access to tooling that lets them build their own models, and take control of their own destinies, Red Hat earlier this year announced InstructLab. This is a set of tools that allows subject matter experts to bring their data to these models in a simple way.

“We want businesses to bring their own AI – on-premises or in the cloud – and own the AI, knowing their data is not going to get mixed up in ways they don’t want it to be,” he says. “Red Hat wants to let customers run their models their own way, with their own standards.”
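InstructLab itself is driven by its own command-line workflow and a taxonomy of question-and-answer examples contributed by subject matter experts. Purely as a loose illustration of the underlying idea Gracely describes – bringing a small set of expert-supplied domain examples to an existing open model – the sketch below uses the general-purpose Hugging Face transformers and datasets libraries rather than InstructLab’s actual tooling; the model name and examples are hypothetical.

```python
# Loose illustration only -- NOT InstructLab's own tooling. It sketches the idea of
# a subject matter expert adapting an existing open model with a handful of domain
# question-and-answer examples. Model id and examples are hypothetical.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

model_id = "ibm-granite/granite-7b-base"  # assumed model id; in practice a smaller
                                          # model or parameter-efficient tuning may be preferable
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:           # some causal LM tokenizers ship without a pad token
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)

# A tiny, hypothetical set of domain examples a subject matter expert might supply.
examples = [
    {"text": "Q: What is our refund window? A: 30 days from delivery."},
    {"text": "Q: Which regions do we ship to? A: All SADC countries."},
]

def tokenize(batch):
    # Tokenise the text and reuse the input ids as labels (standard causal LM training).
    tokens = tokenizer(batch["text"], truncation=True,
                       padding="max_length", max_length=64)
    tokens["labels"] = [ids.copy() for ids in tokens["input_ids"]]
    return tokens

dataset = Dataset.from_list(examples).map(tokenize, batched=True,
                                          remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="granite-domain-tuned",
                           num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=dataset,
)
trainer.train()  # adapts the base model on the expert-supplied examples
```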

In May, Red Hat launched a complete AI portfolio running on Open Hybrid Cloud platforms. This includes AI models with Red Hat Enterprise Linux (RHEL) AI; an AI platform with Red Hat OpenShift AI; an AI-enabled portfolio with Red Hat Lightspeed; and support for AI workloads.

“We believe this gives customers trust, choice and consistency,” Gracely says.