As the hyper-scale cloud providers make massive investments in South Africa, opening in-country data centres, they are prompting organisations to look seriously at public cloud services to enhance their digital strategies.

The fact that the hyperscale cloud providers are now operating in South Africa is significant, says Abdul Moosa, chief technology officer of Fujitsu South Africa. “Prior to them investing locally, there was a lot of concern about data locality, data sovereignty, security and performance, all of which affected the adoption of cloud services.”

Bandwidth was another issue that suppressed public cloud adoption, he adds. “When it comes to being able to consume services offshore, bandwidth could be a big constraint.

“The opening of the cloud hyperscalers in South Africa means that organisations can leverage the extensive fibre networks that have been established; and they will also be able to take advantage of 5G when it is commercially available.

“Overall, the investments into South Africa mean that customers are starting to seriously look at the adoption of public cloud.”

There’s been a big uptick in interest in the public cloud, says Moosa, and there have been some significant announcements around adoption. On the whole, however, organisations are taking a managed risk approach in terms of their cloud adoption.

“The biggest challenge is that South African companies have not necessarily bedded down their cloud strategies or the policies that direct how they will adopt cloud,” Moosa points out.

Having said that, organisations are already adopting digital services that are not necessarily available in their on-premise environments. These are typically around data analytics, artificial intelligence (AI) and Internet of Things (IoT).

“This is typically the first step towards getting the benefit of public cloud,” Moosa says. “The second step is usually in terms of infrastructure: when organisations want to implement new systems and look to the cloud for their infrastructure.

“The first deployments in the cloud would usually be non-mission critical systems like test and development, disaster recovery and backup. This is pretty much where South African organisations are now.”

With the range of public cloud services available, coupled with companies’ existing on-premise data centres, a multi-cloud strategy has become vital to help manage what is rapidly turning into a complex IT environment.

 

What is multi-cloud?

One of the biggest headaches organisations face when moving into the cloud environment is which hyper-scale cloud provider to work with, Moosa explains. “Each of the cloud providers has a unique value proposition in terms of what they provide; so companies look for the feature sets that best suit their business.

“But they could end up using two or more public clouds.”

Multi-cloud is becoming the prevailing ecosystem for most organisations around the world, with a majority of them now employing at least five clouds.

These consist of cloud services from private or public clouds, with each cloud service used for a particular business application. In this way, organisations can customise their cloud needs, and select services that closely match the requirements of a particular workload.

Importantly, in a multi-cloud environment, organisations employ a cloud-first model where all the clouds are interconnected.

Among the trends driving increased multi-cloud adoption are new technologies like Internet of Things (IoT) and machine learning, while governance and policy management are also key.

Organisations also find the ability to use microservices, agility and automation, and the ability to avoid lock-in attractive reasons to adopt a multi-cloud environment.

 

Data management challenges

Being part of a multi-cloud ecosystem creates challenges around data handling and management, says Moosa. Because each cloud provider holds data in different formats, it’s difficult for organisations to move their data between clouds, or to migrate it if they change providers.

When it comes to monitoring and managing this data, CIOs tend to rely on the tools provided by the public cloud vendor. The drawback to this is that they end up managing the data in a particular cloud provider’s data centre in isolation and not as part of a cohesive multi-cloud ecosystem.

Compounding this issue is the fact that data residing on-premise often cannot be managed by cloud tools at all, and vice versa: on-premise tools are not designed to cope with cloud-based workloads.

As if these challenges aren’t enough, CIOs have to cope with issues around data or workload migration from one cloud to another; or from on-premise to cloud.

Compliance is often the first victim of these challenges. It is all too common for organisations to sign up with a cloud provider before ascertaining its ability to meet industry or country compliance requirements.

Increasingly, security is also seen to suffer. Not only is it vital to evaluate the options and agree security SLAs with each cloud provider; CIOs also need to make sure data is secured during migration.

With these challenges in mind, CIOs must find ways to manage the data risks in a multi-cloud ecosystem, ensuring that data can safely traverse the ecosystem in a unified format.

“This is where NetApp comes into play,” Moosa says. “Because NetApp has a relationship with all the hyper-scalers, it gives the organisation control of their own data. They are able to see where it is residing and where it is being consumed; plus they are able to move data around within the multi-cloud ecosystem.”

NetApp is able to offer this value through its development of software-defined storage (SDS), where data is not tied to any particular hardware, and can co-exist in any format, Moosa explains.

“This ability to traverse the multi-cloud ecosystem is the data fabric; which allows data to run on any cloud provider with a consistent look and feel.”

 

What is a data fabric?

A data fabric is an architecture and set of data services that provides consistent capabilities across a choice of endpoints spanning on-premises and multiple cloud environments.

A data fabric simplifies and integrates data management across cloud and on-premises to accelerate digital transformation. It delivers consistent and integrated hybrid cloud data services for data visibility and insights, data access and control and data protection and security.

The NetApp Data Fabric helps organisations employ the power of data to meet business demands and gain a competitive edge. It allows the IT organisation to better harness the power of hybrid cloud, build a next-generation data centre, and modernise storage through data management.

This data fabric can be achieved by using a software-defined storage interface that integrates with any storage offering, giving freedom of choice for data movement via a single pane of glass.

Bringing this into the multi-cloud ecosystem, NetApp offers software-defined storage solutions and has existing partnerships with the major cloud hyper-scalers, putting data management back into the organisation’s control.

 

Microsoft Azure Platform

The partnership between NetApp and Microsoft lets customers deliver seamless cloud services using Azure NetApp Files.

Azure NetApp Files is a high-performance Azure file-share service powered by NetApp that allows for integrated enterprise-grade data management for both Windows and Linux file shares.

Because it is hardwired into the Azure data centre, it provides a rich hybrid cloud experience with great performance – all with a simple, intuitive Azure-native interface.

With Azure NetApp Files, CIOs are assured of the data performance their workloads require for file-based applications, DevOps and analytics.

Because the rich data-management capabilities of the NetApp ONTAP software platform are hardwired into Azure, the solution provides multiple high-performance data-service levels to meet the organisation’s application needs. Users can therefore migrate more of their applications to Azure faster, with both NFS and SMB file shares.
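As an illustration of how such a file share is typically provisioned, the sketch below uses the standard Azure CLI `az netappfiles` command group. The resource group, account, pool, volume and network names, and the region, are hypothetical placeholders, not details from this article.

```shell
# Hedged sketch: provision an Azure NetApp Files NFS volume with the Azure CLI.
# All names (rg-demo, anf-account, pool1, vol1, vnet-demo, anf-subnet) and the
# region are placeholders; substitute your own values.

# Create the NetApp account that owns pools and volumes.
az netappfiles account create \
    --resource-group rg-demo --name anf-account --location southafricanorth

# Create a capacity pool (size in TiB) at a chosen service level.
az netappfiles pool create \
    --resource-group rg-demo --account-name anf-account --name pool1 \
    --location southafricanorth --size 4 --service-level Premium

# Create the NFS volume inside the pool (usage threshold in GiB).
az netappfiles volume create \
    --resource-group rg-demo --account-name anf-account --pool-name pool1 \
    --name vol1 --location southafricanorth --service-level Premium \
    --usage-threshold 100 --file-path "vol1" \
    --vnet vnet-demo --subnet anf-subnet --protocol-types NFSv3
```

The service level chosen at the pool (Standard, Premium or Ultra) is what gives the “multiple high-performance data-service levels” mentioned above.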

Developers can reduce cycle times and gain productivity by bringing advanced data management to the development pipeline. Azure NetApp Files features Snapshot copies with near instant deployment of new workspaces, so developers can spend less time managing the environment and more time developing.
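For the Snapshot copies mentioned above, a point-in-time snapshot of an existing Azure NetApp Files volume can typically be taken from the same CLI; again, every name below is a hypothetical placeholder.

```shell
# Hedged sketch: take a point-in-time Snapshot copy of an existing
# Azure NetApp Files volume. All names and the region are placeholders.
az netappfiles snapshot create \
    --resource-group rg-demo --account-name anf-account --pool-name pool1 \
    --volume-name vol1 --name nightly-snap --location southafricanorth
```

A developer workspace can then be spun up from such a snapshot rather than from a full copy of the data, which is what makes the near-instant deployment described above possible.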

NetApp tools make it easier to consume data as well, easing data movement across on-premises and cloud infrastructures. This lets users get closer to the analytics, compute, and application services they require from Azure. Azure NetApp Files features the same data management technologies that are used on-premises, providing a consistent experience.

 

Google Cloud

NetApp extends the reach of its world-class data services to Google Cloud’s application development, analytics, and machine learning environment.

NetApp Cloud Volumes for Google Cloud Platform is a simple, cloud-native, no-ops file storage service for running applications, analytics and DevOps in Google Cloud Platform (GCP).

This allows users to move data back and forth between Google Cloud and on-premise environments.

In addition, CIOs can use NetApp Cloud Volumes Service for Google Cloud to manage workloads and applications with ease. Primary use cases for this service include file services, analytics, DevOps, and databases.

Together, NetApp and Google Cloud provide a tightly integrated, self-service, multiprotocol, file service offering.

Customers are able to provision, automate, and scale through simple API calls or the Google Cloud Launcher GUI. They can sync datasets automatically, take snapshots of data regularly, or even create full copies of their dataset in seconds – all without impacting application performance.

 

Amazon Web Services

NetApp File Storage for Amazon Web Services (AWS) delivers agile, secure, and flexible data services for the most demanding cloud requirements.

Users can migrate file-based applications to the cloud, achieving the precise performance, scale, and security each application demands.

It also allows them to move analytics to the cloud, making data-driven decisions faster.

NetApp File Storage for AWS allows users to store databases in high-performance file services in the cloud, with the ability to revert to previous states and to create rapid clones for dev/test.
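From the application side, such a file service is consumed as an ordinary NFS mount from a Linux client (for example an EC2 instance). The server address and export path below are placeholders that would come from the volume’s own mount instructions, not from this article.

```shell
# Hedged sketch: mount a NetApp cloud file volume over NFSv3 from a Linux
# client such as an EC2 instance. The server IP (10.0.0.5) and export path
# (/demo-export) are hypothetical placeholders.
sudo mkdir -p /mnt/cloudvol
sudo mount -t nfs -o rw,hard,vers=3,rsize=65536,wsize=65536 \
    10.0.0.5:/demo-export /mnt/cloudvol
```

Because the application only sees a standard NFS mount point, reverting the volume to a previous state or attaching a clone for dev/test does not require any change on the client side.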

DevOps is simplified, with the ability to automate development and test environments as well as to clone hundreds of environments in minutes.

 

Fujitsu and NetApp

As a NetApp Global System Partner, Fujitsu offers one-stop shopping for integrated and automated data management products, solutions, and services that help their joint customers maximise IT investments and adapt quickly and easily to fast-changing business environments.

Highlights of the NetApp and Fujitsu partnership include a collaborative go-to-market strategy that drives synchronised programs to sell joint solutions; OEM and partner agreements worldwide; and maintenance, support, and managed services across the full shared portfolio handled by Fujitsu.

Fujitsu and NetApp have had a relationship since 1998, Moosa explains.

“We also do joint initiatives, and are involved in a lot of the NetApp product development,” Moosa says. “Because of that relationship we are a true value-added distributor and are able to supply end-to-end solutions that embrace design, strategy and roadmaps.”

Fujitsu engages with its customers on a strategic level in understanding and planning their digital transformation journey. “We understand how their business needs to transform; and work with them on Fujitsu and NetApp solutions that can assist them in accelerating that transformation.

“One of the key elements is helping them with the move to cloud services, giving them the tools to transform.”

Moosa adds that South African customers get the advantage of dealing with a global team. “Fujitsu is able to harness skills, knowledge and co-creation from around the world to help our customers to accelerate adoption.”