South Africa’s selection to host the bulk of the world’s most powerful radio telescope, the Square Kilometre Array (SKA), has major ramifications for the IT sector.

South Africa will host 70% of the R23bn super-telescope, hailed as the largest scientific project in the world. The SKA will allow scientists to study how galaxies have evolved as they peer back in time using 3,000 antennas, concentrated in the Northern Cape with others in Namibia, Botswana, Ghana, Kenya, Madagascar, Mauritius, Mozambique and Zambia.

The SKA will bring new infrastructure, foreign investment and the need for highly specialised skills. Already, 400 bursaries have been awarded to university students, and science and engineering centres of expertise will be developed throughout Africa.

The project also highlights the need to develop big data skills locally. Big data is the term used to describe huge volumes of information, measured in terabytes, petabytes and even yottabytes, in structured or unstructured forms.
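To give a sense of the scale behind those units, here is a minimal Python sketch (the dictionary and helper function are illustrative, not from the article) laying out the decimal storage prefixes side by side:

```python
# Decimal (SI) storage units, as powers of 1000.
# Binary prefixes (tebibytes etc.) use powers of 1024 instead.
UNITS = {
    "terabyte": 1000 ** 4,
    "petabyte": 1000 ** 5,
    "exabyte": 1000 ** 6,
    "zettabyte": 1000 ** 7,
    "yottabyte": 1000 ** 8,
}

def in_terabytes(n_bytes: int) -> float:
    """Express a byte count in terabytes."""
    return n_bytes / UNITS["terabyte"]

# A petabyte is a thousand terabytes; a yottabyte is a trillion.
print(in_terabytes(UNITS["petabyte"]))   # 1000.0
print(in_terabytes(UNITS["yottabyte"]))  # 1000000000000.0
```

The jump from terabytes to yottabytes spans twelve orders of magnitude, which is why storage and analytics at this scale demand specialised tooling.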

Big data is not only an issue for stargazers, but for any large corporation facing an ever-increasing data deluge. All that information must be stored, processed and interrogated using analytics and business intelligence tools to gain greater insight into business processes.

To help companies achieve that, big data has been added to the agenda of IP Expo, a technology event launched in Johannesburg last year. The inaugural show focused on IP infrastructure, virtualisation and cloud computing. Now big data is joining that line-up as event organisers Montgomery Africa ensure that IP Expo covers the hottest technology issues facing businesses today.

Visitors to IP Expo will be able to explore the problems and solutions at workshops and seminars, while exhibitors will demonstrate the latest technologies to help companies not only cope with the data, but use it to improve their operations and gain a competitive edge.

Big data is a challenge for IT departments because making sense of massive volumes of information requires key skills, such as analytics, that are already expensive and in short supply. Organisations will either have to hire those skills at significant cost, or outsource their big data processes to keep costs down and gain access to the right technology skills.

IP Expo manager Michelle Meldau believes outsourcing will prove very popular, as it will overcome the skills shortages and allow companies to rely on specialist providers with a deep understanding of their software solutions. Outsourcing will also help companies stay ahead by having access to the latest technologies, as well as saving them the headache of maintaining the tools themselves.

Meldau says the problem of big data has arisen thanks to internet searches, social media, user-generated content, machine-to-machine telemetry and industries like astronomy and climate research continually pumping out information.

Yet businesses can capitalise on that by analysing the data to identify useful patterns and trends, and applying them to improve productivity, drive innovation and maximise profits. New tools and applications are being developed to manage, interpret and analyse the information being created, Meldau says.

Meeting the storage requirements created by this trend will be one of the biggest IT challenges facing companies in the medium term, however. Storage issues can often be resolved through outsourcing and cloud computing. Storage providers can host data in virtual data centres comprising numerous separate centres connected via high-speed networks. These virtual data centres have the capacity to accommodate all the data being created and the processing power to analyse it in real time. They can also hold all varieties of data in a single repository and integrate it with other hosted applications and services.

That leaves businesses free to focus on unlocking the value hidden within, Meldau says.