Kathy Gibson reports from Gitex, Dubai – Cloud computing is the basic building block that will allow artificial intelligence (AI) to flourish.
That’s the word from Uwem Ukpong, vice-president: global services at Amazon Web Services (AWS), who points out that CEOs are under tremendous pressure to demonstrate what they are doing with AI.
“CEOs and boards around the world are looking for use cases, and running proof of concept projects,” Ukpong says. “But then they face the challenge of how to get these into production.”
The challenge lies in the fact that AI, and particularly generative AI using large language models (LLMs), churns through a lot of data. This data needs to be secured and ring-fenced. “When you introduce an LLM into the company you don’t want your data to become public, so it has to sit within the company firewall and effectively become your own LLM.”
Doing this on-premise can add to the challenge, which is why Ukpong believes cloud is the only effective way to go. “AWS complies with 98 different data privacy and security standards, so our systems are already watertight.”
The hardware required to effectively run AI is expensive and in short supply. “You need GPUs to do any kind of generative AI,” Ukpong explains, adding that the hyperscaler cloud providers can offer these platforms.
There are already a number of real-world instances where companies are deploying generative AI, Ukpong adds. These include applications for education, regulatory compliance, and customer experience.
When it comes to making AI investments, he urges companies to consider the advantages offered by the cloud providers. “The investments required for generative AI are going to be costly, especially when it comes to GPU chips and memory. Driving down the cost is going to be key, so enterprises must ensure they are on the right platform.”