Kathy Gibson at Huawei Connect in Shanghai – Cern has partnered with Huawei on OpenStack developments, to provide the compute power required for nuclear particle experimentation.
The Large Hadron Collider (LHC) presents a massive computing challenge: it produces 10-million collisions per second, generating 1 petabyte of data per second, which is handled by 13 000 servers.
Jan van Eldik, resource provisioning team lead at Cern, explains that Cern was set up to study the fundamental structure of the universe, which it does by conducting large-scale physics experiments.
The 27km-long LHC is running four experiments that collect data at an enormous rate, Van Eldik says: up to 9Gbps between them.
This data is sent to the organisation’s data centre, where it must be stored safely. It is also distributed to a worldwide computing grid for further analysis, alongside processing at the Cern data centres in Switzerland and Budapest.
The organisation has been running its private OpenStack cloud since 2013. It now comprises 8 500 servers with 280 000 CPU cores, running 33 000 virtual machines with 4 700 volumes across 3 600 projects, and supports a wide range of applications, users and service levels.
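For readers unfamiliar with OpenStack, the short sketch below (a hypothetical illustration, not Cern's own tooling) shows how an operator might count the virtual machines, volumes and projects in such a cloud using the openstacksdk Python library; the cloud name "private-cloud" and admin-level credentials are assumptions for the example.

    import openstack

    # Connect using an entry in clouds.yaml; "private-cloud" is a placeholder name.
    conn = openstack.connect(cloud="private-cloud")

    # Count the instances, volumes and projects visible to these credentials.
    servers = list(conn.compute.servers(all_projects=True))
    volumes = list(conn.block_storage.volumes())
    projects = list(conn.identity.projects())

    print("virtual machines:", len(servers))
    print("volumes:", len(volumes))
    print("projects:", len(projects))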
“But there are dark clouds on the horizon,” Van Eldik says. “We realised that we are soon going to run out of CPU capacity, and have started to look at extending the private cloud into the public cloud.”
After a tender process that attracted 21 responses, Cern signed up for T-Systems’ Open Cloud, which has been running for a few months.
“Public cloud is attractive to boost resource provisioning,” Van Eldik says.