Kathy Gibson reports from CeBIT in Hannover – Doubts that public cloud services can handle just about any workload can be laid to rest: CERN bursts out into the cloud whenever it needs more power than its own IT systems can deliver.
Tim Bell, compute and monitoring group leader at CERN IT, explains that the organisation is the European centre for particle physics research.
Most of the centre is underground, where the 27km ring of the Large Hadron Collider fires beams of protons.
As much as CERN pushes the boundaries of science, it also asks a lot of its computing platforms.
“We recognised that we need to collaborate to resolve challenges as we go forward,” Bell says.
The organisation set up the CERN OpenLab in 2001. Here, it works with industry partners to solve CERN's computing problems, validate the solutions in the CERN environment, and then make them available back to the industry.
“This is a public-private partnership to solve difficult computing problems,” Bell explains.
The CERN IT environment is complex and powerful: an OpenStack cloud running 220 000 cores supports the needs of 22 000 physicists.
“At the same time, we sometimes need more than that,” says Bell. “And so we have been exploring the use of public cloud resources.”
In 2016, CERN tested Large Hadron Collider (LHC) workloads on Deutsche Telekom's Open Telekom Cloud, built in collaboration with Huawei.
“We were able to use standard tooling to deploy virtual machines both on-premise and in the cloud,” Bell says. “And we were able to run all workloads successfully.”
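Bell did not name the tooling, but in an OpenStack environment that cross-cloud pattern typically means pointing the same client at different entries in a clouds.yaml configuration file. A minimal sketch using the openstacksdk Python library, where the cloud names "cern-onprem" and "open-telekom" and the image and flavor names are hypothetical placeholders, not CERN's actual configuration:

```python
# Sketch: boot an identical VM on-premise and in a public cloud
# using the same OpenStack SDK calls. Cloud entries "cern-onprem"
# and "open-telekom" are assumed to exist in clouds.yaml
# (~/.config/openstack/clouds.yaml); image/flavor names are illustrative.
import openstack

for cloud_name in ("cern-onprem", "open-telekom"):
    # Connection details (auth URL, credentials) come from clouds.yaml.
    conn = openstack.connect(cloud=cloud_name)

    server = conn.create_server(
        name=f"batch-worker-{cloud_name}",
        image="cc7-base",     # hypothetical base image name
        flavor="m1.large",    # hypothetical flavor name
        network="default",
        wait=True,            # block until the VM reaches ACTIVE
    )
    print(cloud_name, server.status)
```

The point of the pattern is that the create_server call is unchanged between clouds; only the connection target differs, which is what lets a workload burst to a public region without being rewritten.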
By collaborating on the Open Telekom Cloud, CERN will be able to work within OpenStack to source solutions for its own environment. These will be made available to the OpenStack community so that all users can benefit.