Today’s business environment is radically different from the one at the beginning of 2020.
By Daniel Thenga, senior NetApp BDM at Westcon-Comstor Sub-Saharan Africa
It has taken roughly 17 months to disrupt the traditional processes and systems that companies spent decades perfecting. Fundamental to this has been the importance of data and how powerful it has become in a digitally driven world.
We have changed the way we work, live, and relax thanks to our increased connectedness, fuelled by things such as social distancing and lockdowns. Maintaining competitiveness in this ‘new normal’ requires organisations to simplify how they analyse data and accelerate the transformation of insights into business benefits.
The normalisation of digital transformation has highlighted how data has become the most significant asset any organisation can have. Pre-pandemic, it could be argued that every company was a technology company, given how IT permeated every facet of the organisation.
Over the past several months, what’s become apparent is that this has been supplanted by the fact that every company has now transformed into a data company. For every new app installed on a smartphone, every customer touchpoint, and every engagement with any stakeholder, data is being collected in the background, stored, and analysed to power the modern business landscape.
Digital transformation is changing how data management is done. With this comes previously unimaginable opportunities. However, these are only possible if organisations have the right expertise and tools to extract value from their data. In turn, this improves efficiencies, provides companies with more value, and delivers the capabilities necessary to develop new business models with alternative revenue streams.
Warming up data
Part of this is assessing the relevance of the data available to the organisation. With the hybrid cloud becoming the new standard architecture for data management and accelerating business outcomes, companies would do well to ensure the data stored in that environment is as relevant as possible. This has given rise to the concept of cold data.
Essentially, this is data that the business has not touched for 30 days. Filling a data centre with this ‘old’ data is akin to relying on last year’s news to inform the decision-making process.
The high-performance computing systems of the cloud deliver the most benefit when they analyse the latest data. In this way, decision-makers get the latest insights on what customers want now, not what they needed a month ago.
By eliminating infrequently accessed data from the ‘active’ environment, companies can significantly speed up how analysis is done. Leveraging machine learning and artificial intelligence to assist in this regard adds even more speed to a landscape that demands insights in as close to real time as possible.
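The point that an access-time cutoff makes ‘cold’ a mechanical, automatable test can be illustrated with a few lines of code. The following Python sketch is purely illustrative (the directory layout, the file-level granularity, and the 30-day threshold are assumptions, not any vendor’s API): it flags files untouched for 30 days and moves them out of the active tier.

```python
import shutil
import time
from pathlib import Path

# Threshold from the article: data untouched for 30 days is considered cold.
COLD_AFTER_DAYS = 30

def tier_cold_files(active_dir, archive_dir, now=None):
    """Move files not accessed in COLD_AFTER_DAYS from the active tier to an archive tier.

    Uses last-access time (st_atime), which assumes the filesystem records it
    (mounts with 'noatime' will not). Returns the names of the files moved.
    """
    now = time.time() if now is None else now
    cutoff = now - COLD_AFTER_DAYS * 86400  # 86400 seconds per day
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    moved = []
    for path in Path(active_dir).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            shutil.move(str(path), str(archive / path.name))
            moved.append(path.name)
    return moved
```

Production storage systems tier at the block or object level with far richer policies, but the principle is the same: a simple, automated rule keeps the active environment limited to data that is actually being used.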
Even as this shift to data-centricity is becoming the norm, IT teams are under constant pressure to reduce operational costs. These transformations are also causing a shift in personas that operate the infrastructure.
Instead of the point specialists of the last decade, IT generalists with broad skill sets now manage the full stack of infrastructure. Today, with the dwindling number of specialists, nobody has the time to fine-tune storage system performance anymore.
This is even more reason to ensure the data being used for analysis, regardless of where this is done, is relevant to the immediate and future needs of the organisation.
There is an underlying customer-centricity to all this data management. People want flexibility, security, efficiency, and immediacy. Where relevant, they want self-service opportunities to manage things in their own time without having to go through a call centre.
Inevitably, this is resulting in organisations needing to manage data analysis in far more integrated ways than in the past.
But data management can prove to be time-consuming and complex. Instead, companies should shift these tasks to data centres that can deliver more proactive and predictable storage management while giving access to higher-performance computing and network resources.
These forces combine to drive the increasing use of IT as a service (ITaaS). Going this subscription route means organisations can move from a capex model to an opex one, tapping into new budgets to fund the ‘new’ environment.
It comes down to businesses needing operational agility to adjust to the data-driven insights that come from this hybrid cloud environment. They need to accommodate the new applications that come up as fast as their internal teams need them.
Companies can no longer manage data infrastructure as they did at the start of 2020. It is now about delivering the right set of data services for customers faster while continually evaluating and customising as they go along.
Much of this comes down to simplifying how data, and the rest of the technology stack, is managed. Companies must do more to let the technology manage the complexity, relying on things such as automation and machine learning to do the heavy lifting and leaving decision-makers free to focus on addressing business deliverables.