Data is the one thing every business has in abundance. Paradoxically, that abundance is often why organisations struggle to turn it into a strategic asset.
By Henry Adams, country manager of InterSystems
Disconnected legacy systems and data stored in siloes and in different formats are just some of the challenges facing business leaders as they search for the best information to run the company and identify risks and opportunities.
Some have described data as the ‘new oil’ or the ‘new gold’. And yet, you might say data is more like water. Too much and you drown; too little and you thirst. When it is old, stagnant, or dirty, it can harm you. It is, therefore, fresh, clean, healthy data that an organisation needs to thrive.
As the world struggles to cope with the Covid-19 pandemic, companies have realised the importance of having access to this clean, current, and healthy data. Of course, the idea is not new: a proper data management infrastructure extracts the information and insights hidden away in disparate systems and sources, giving a business real-time visibility and the ability to adapt to changing market conditions.
What has changed is that competitive businesses have a better appreciation for the need to differentiate their offerings when competition is intense, and the pace of innovation is fast and unrelenting. Much of this comes down to taming the data analytics beast. Research shows that 89% of retail and manufacturing organisations see effective data management technology as an absolute requirement in the current landscape.
Data comes first
It, therefore, stands to reason that any data strategy must begin with the data at hand. Data quality is perhaps the most significant obstacle facing organisations looking to harness the power that artificial intelligence (AI) and machine learning (ML) analytics can provide. Without fast and easy access to the right kind of data, AI and analytics deployments will fail to realise the potential of these technologies, and big data projects will struggle to gain momentum.
Throughout this, handling the volume of data in the organisation remains a challenge. It is therefore critical for a company to determine what data it needs before it moves into the analysis stage. Further complicating matters is the amorphous nature of data: it grows and expands as the business changes. In retail, for instance, it can be seasonal or influenced by promotions. Being agile helps businesses handle spikes caused by real-time market changes or surges in transactions that increase the volume of streaming data.
At a fundamental level, companies must integrate their data silos to access and analyse the data they need, eliminate blind spots, improve agility, and deliver outstanding customer service – all elements that have come to the fore following developments of the past 16 months.
Step by step
More than three-quarters of companies have started focusing on increasing their adoption of data management technologies as a result of the pandemic. But much like digital transformation or cloud migration projects, this is not something that can be done on a ‘rip and replace’ basis. If a business tries to implement a master data management strategy across all its data in one go, it will be akin to eating an elephant in one sitting.
Yes, modern data management platforms are scalable, provide rich data integration and transformation capabilities, and use advanced analytics such as AI and ML to enable scenario planning. These also equip the organisation with the ability to respond faster to unexpected events and incorporate intelligence into automated processes to become more resilient to fluctuations in the environment.
But the rollout must be managed in a phased approach to deliver consistency throughout the organisation. The temptation will be there to rush into more sophisticated analytical tools and extract as much insight as quickly as possible from the available data. However, by identifying where the most pressing business cases are, business leaders can manage the rollout in a more structured manner.
Overcoming the constraints of legacy systems that do not speak the same language as more modern ones will be a vital stepping stone on the road to analytical transformation. As such, consolidation and standardisation should be central to any data strategy. Without them, dreams of harnessing data for big data analytics will not make it out of the starting blocks.
To fully exploit AI and big data analytics, a company must ensure that unstructured data can play a role alongside structured internal data. This is not an easy process, and for many businesses it is unfamiliar territory; integrating the two sources requires care and expertise. Marrying disparate kinds of data sources, especially when third-party data is involved, can come with risks. This has led many businesses to build solutions that employ only their internal data rather than data from external sources such as social media.
An alternative to this bespoke development is a smart data fabric approach, which speeds up and simplifies access to data assets across the business. It accesses, transforms, and harmonises data from multiple sources, on demand, to make it usable and actionable for a variety of business applications. The smart fabric empowers the company to interweave all kinds of information from many sources. A business can therefore pool fast-moving, real-time data together with batch-oriented data, along with data from cloud and legacy sources, connecting where required through APIs or web services.
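To make the idea concrete, the sketch below shows, in miniature, what this kind of on-demand harmonisation looks like: records from a legacy system and a cloud API, each with its own field names and formats, are transformed into one common model that analytics can query. This is a minimal illustration only; all field names, systems, and functions here are invented for the example and do not represent any particular product's API.

```python
# Hypothetical sketch: harmonising customer records from two
# differently shaped sources into one common model, the way a
# data fabric layer might do on demand. All names are invented.
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    name: str
    email: str

def from_legacy(row: dict) -> Customer:
    # The legacy system stores upper-case names under its own keys.
    return Customer(
        customer_id=str(row["CUST_NO"]),
        name=row["CUST_NAME"].title(),
        email=row["EMAIL_ADDR"].lower(),
    )

def from_cloud_api(record: dict) -> Customer:
    # The cloud service returns nested JSON with different field names.
    return Customer(
        customer_id=record["id"],
        name=f'{record["firstName"]} {record["lastName"]}',
        email=record["contact"]["email"].lower(),
    )

def harmonise(legacy_rows, cloud_records):
    """Transform both feeds into one consistent, queryable list."""
    unified = [from_legacy(r) for r in legacy_rows]
    unified += [from_cloud_api(r) for r in cloud_records]
    return unified

legacy = [{"CUST_NO": 1001, "CUST_NAME": "JANE SMITH",
           "EMAIL_ADDR": "Jane@Example.com"}]
cloud = [{"id": "c-42", "firstName": "Raj", "lastName": "Patel",
          "contact": {"email": "Raj@Example.com"}}]

customers = harmonise(legacy, cloud)
for c in customers:
    print(c.customer_id, c.name, c.email)
```

A real data fabric does this at scale, across many more sources and formats, but the principle is the same: transformation and standardisation happen in one layer, so every consumer sees consistent data.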
A smart data fabric gives an organisation the ability to scale dynamically to cope with surges in data volumes, makes information usable through a simplified, streamlined architecture, and takes a location-agnostic approach to analytics. That is ultimately what a company is looking for in today’s continuously evolving market landscape, which puts the focus squarely on extracting meaningful insights from the data at hand as quickly as possible.