Scientific discovery has long been a function of the tools developed and utilised by the pioneers of innovation.

By Jim Holland, regional director of Lenovo

While Lippershey was busy seeking a patent for his telescope, Galileo pointed his to the heavens, detecting craters on the moon, moons orbiting Jupiter and the strange appendages of Saturn that would later be recognised as its rings.

In the field of astronomy, observations made with the naked eye became obsolete, and the telescope became the indispensable tool of the trade. Other instruments, such as the barometer for weather and the microscope for the study of bacteria and cells, attained similar “vital” status.

But in the last 25 or so years, they have all become secondary to the preeminent scientific tool of the 21st century: the supercomputer.

Ever since the mid-1990s, when the use of supercomputers to map the human genome was first proposed, then executed, technology has been the “man behind the curtain” of discovery.

Recently the world was captivated by the first-ever image of a black hole. To create the image, radio telescopes around the world were linked together to form a single virtual telescope “the size of the earth”.

Fascinating science from gifted astrophysicists. However, the data collected took nearly two years to compile, using multiple supercomputers across a number of sites. Such an endeavour could not even have been contemplated without them.

Today, the ability to connect thousands of inexpensive yet powerful Intel x86 systems to solve pieces of a problem, then reassemble those pieces into an answer, has brought high performance computing (HPC) to an even broader spectrum of research.
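For the technically minded, the pattern is easy to see in miniature. The sketch below is purely illustrative and assumes nothing about Lenovo’s actual software stack: it splits a simple numerical integration into independent pieces, hands each piece to a worker process, and then reassembles the partial results into a single answer. Real HPC codes do the same thing across thousands of nodes, typically with frameworks such as MPI.

```python
# A toy illustration of the "split, solve, reassemble" pattern behind HPC.
# The problem (integrating x**2 over [0, 1]) and the chunk sizes are
# illustrative choices; production codes distribute work across whole
# clusters rather than local worker processes.
from multiprocessing import Pool


def integrate_chunk(bounds):
    """Approximate the integral of x**2 over one sub-interval (one 'piece')."""
    start, end, steps = bounds
    width = (end - start) / steps
    return sum(((start + (i + 0.5) * width) ** 2) * width for i in range(steps))


if __name__ == "__main__":
    # Split the full problem into eight independent pieces...
    chunks = [(i / 8, (i + 1) / 8, 100_000) for i in range(8)]
    with Pool(processes=8) as pool:
        partial_results = pool.map(integrate_chunk, chunks)
    # ...then reassemble the pieces into one answer (the exact value is 1/3).
    print(sum(partial_results))
```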

Indeed, Lenovo HPC customers all over the globe are working on solving many of humanity’s greatest challenges in areas such as seismology (Harvard University, US, and LRZ, Germany), lunar surface identification (SciNet, Canada), childhood cancer treatments (LRZ) and continental species distribution modelling (University of Adelaide, Australia).

These are but a few of the thousands of projects that rely on the computational muscle of a supercomputer.

Tomorrow, sensor technology, with its ability to replace multiple scientific instruments and provide a constant stream of actionable data, combined with 5G cellular networks that transmit that data in near real time, will create a dilemma not of “how do I get the data?” but of “where do I start?” Data is no longer a boundary or an obstacle, but a plentiful asset, one that presents its own set of challenges.

To deal with this data deluge, scientists will need to adapt. One can easily envision an augmented research environment emerging in which Artificial Intelligence (AI) sifts through the data and flags the most compelling pieces for researchers to focus on.
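To make that idea concrete, here is a minimal, hypothetical sketch of such a triage step. It stands in for a trained model with a simple statistical score: it scans a simulated sensor stream, scores fixed-length windows by how far they deviate from the baseline, and surfaces only the most unusual ones for a researcher to inspect. Everything in it, from the stream to the injected events, is invented for illustration.

```python
# A toy sketch of machine-assisted triage of a sensor stream: score windows
# of data and surface only the most unusual ones for human review.
# A real system would use a trained model; a z-score-style measure stands in here.
import numpy as np

rng = np.random.default_rng(0)
# Simulated stream: quiet background noise plus a few injected "events".
stream = rng.normal(0.0, 1.0, size=10_000)
stream[2_500:2_510] += 8.0   # hypothetical transient worth a closer look
stream[7_000:7_020] -= 6.0   # another one

window = 100
windows = stream[: len(stream) // window * window].reshape(-1, window)
# Score each window by its largest deviation from the global baseline.
scores = np.abs(windows - stream.mean()).max(axis=1) / stream.std()

# Hand the five most interesting windows to a human.
for idx in np.argsort(scores)[::-1][:5]:
    print(f"window {idx:3d} (samples {idx * window}-{(idx + 1) * window}): score {scores[idx]:.1f}")
```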

HPC systems themselves will evolve. Today, we stand on the doorstep of the exascale era, where systems will achieve a quintillion (10^18) floating point operations per second (FLOPS). That will require some serious technological heft.
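To put that number in perspective, here is a quick back-of-the-envelope calculation; the 100 GFLOPS figure used for a capable laptop is an assumption chosen purely for illustration.

```python
# Rough arithmetic on the scale of an exascale (10**18 FLOPS) system.
# The laptop figure is an illustrative assumption, not a measured benchmark.
EXA_FLOPS = 1e18        # floating point operations per second at exascale
LAPTOP_FLOPS = 1e11     # ~100 GFLOPS, a rough figure for a capable laptop

ratio = EXA_FLOPS / LAPTOP_FLOPS
days = ratio / (60 * 60 * 24)       # one exascale-second, measured in laptop-time
print(f"An exascale system is roughly {ratio:,.0f} times faster;")
print(f"one second of its work would keep the laptop busy for about {days:.0f} days.")
```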

Compute, storage and networking will all get faster, with greater capacity, but significant advances in cooling technologies, packaging and management are also required. Making those technologies available to HPC customers of all sizes is what Lenovo’s “Exascale to Everyscale” initiative is all about.

Microscopes, telescopes and seismographs will not go away any time soon. There will always be a place in science for tools that have allowed us to “think big.” But now, because of supercomputers, scientists can “think unlimited”.

One wonders what Galileo would be able to find with a radio telescope and a supercomputer.