Human space exploration has come on in leaps and bounds since man first landed on the moon in 1969. We have now entered a paradigm where data takes centre stage, writes Morne Bekker, country manager of NetApp South Africa.
Of course, creating a 3D map of over one billion stars, charting the Milky Way and even searching for life on Mars are no small feats. They require decades of hard work and reliable access to massive quantities of data to move beyond one small step for man towards one giant leap for mankind.
In order to analyse this ‘library of the universe’, artificial intelligence, and specifically machine learning, needs to play a pivotal role. Machine learning is a machine’s ability to automatically learn and improve its performance without being explicitly programmed. This is critically important for space exploration for a few reasons.
Firstly, it is impossible for scientists around the world to bring their collective knowledge together under one umbrella, let alone automate every task by hand. Secondly, machines are excellent learners. They can turn data into assets, allowing scientists to accelerate innovation, achieve superhuman performance, drive efficiencies, create insights and even aid new research developments.
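To illustrate the principle, the short Python sketch below uses the open-source scikit-learn library to fit a simple model to example data. It is a hypothetical illustration of learning a pattern from data rather than being explicitly programmed with a rule; the numbers and feature names are assumptions, and it is not code from ESA or NetApp.

```python
# A minimal sketch (not ESA's or NetApp's actual code) of machine learning:
# the model infers a relationship from example data instead of being
# explicitly programmed with the rule.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: an input measurement and an observed outcome.
X = np.array([[1.0], [2.0], [3.0], [4.0]])   # input feature
y = np.array([2.1, 4.2, 5.9, 8.1])           # observed values

model = LinearRegression()
model.fit(X, y)                    # the model "learns" the trend from the data
print(model.predict([[5.0]]))      # and generalises to an unseen input
```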
It therefore goes without saying that the capabilities of AI and machine learning in processing massive amounts of data are far-reaching. Not only do they deliver extreme performance, but also massive non-disruptive scalability, allowing scientists to scale to 20 PB and beyond in a single namespace to support the largest learning data sets.
Importantly, it also allows scientists to expand their data where needed. NetApp enables organisations to do this by building a data fabric for the hybrid cloud that seamlessly connects resources and allows data management, movement and protection across internal and external cloud services. From a space exploration perspective, this addresses the challenge that data from every mission needs to remain accessible indefinitely, so that future scientists can continue their exploration of the universe using historical data.
A practical example of this in action can be seen in the work done by the European Space Agency (ESA). On 30 September 2016, the world watched via livestream as ESA’s Rosetta spacecraft successfully landed on the surface of a comet. For more than two years, the spacecraft had travelled with the comet along its orbit, gathering data.
But with the comet speeding away from the sun at 120,000 km/h, Rosetta would soon lose solar power.
Scientists seized the opportunity to attempt what no one had ever tried before – to gather unique observations through a controlled impact with the comet. Despite blistering speeds and countless unknowns, the spacecraft landed just 33m from its target point, sending invaluable high-resolution images and measurements back to Earth.
As the data authority in a hybrid cloud world, NetApp used artificial intelligence and machine learning to constantly analyse and provide consistent insight across the data centre, so that scientists could monitor and manage multivendor hybrid IT storage, compute and networking infrastructure. In fact, every day ESA receives massive volumes of raw telemetry data from its spacecraft and observatories.
That data must be stored and processed before it can be archived or shared. Scientists across Europe depend on ESA’s daily observations, so the reliability of that data is critical.
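As a purely illustrative sketch of that store-then-process-then-archive pattern, the hypothetical Python snippet below reads raw telemetry files, derives processed records and compresses them into an archive. All file paths, field names and the placeholder calibration step are assumptions for illustration, not ESA’s or NetApp’s actual pipeline.

```python
# Hypothetical sketch of the store -> process -> archive workflow described above.
# Paths, field names and the calibration step are illustrative assumptions.
import gzip
import json
from pathlib import Path

RAW_DIR = Path("raw_telemetry")    # incoming raw packets land here first
ARCHIVE_DIR = Path("archive")      # processed records kept for future research
ARCHIVE_DIR.mkdir(exist_ok=True)

def process(packet: dict) -> dict:
    """Derive a calibrated record from a raw telemetry packet (placeholder logic)."""
    return {"timestamp": packet["timestamp"], "value": packet["reading"] * 0.1}

for raw_file in RAW_DIR.glob("*.json"):
    packets = json.loads(raw_file.read_text())
    records = [process(p) for p in packets]
    # Compress and archive so historical data stays accessible to later missions.
    out_path = ARCHIVE_DIR / (raw_file.stem + ".json.gz")
    with gzip.open(out_path, "wt") as f:
        json.dump(records, f)
```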
NetApp’s high availability means data is available to scientists around the clock. In the coming years, as additional missions are launched, NetApp will continue to provide scalable solutions so that scientists can truly reach for the stars.