Data analytics is going boldly into the future

Controlling costs and optimising performance from IT assets is not a new mandate for corporate IT execs – particularly in the data analytics arena. But there’s a slight change in the way managers are approaching it these days, because enterprises want operational analytics: analytics that feed them corporate performance management information and operational business intelligence.

Those two capabilities drive a number of benefits, not least among which are better customer retention and greater customer value, writes Mervyn Mooi, director at Knowledge Integration Dynamics (KID).
And the change? Gartner reckons that enterprises are beginning to accept a higher initial cost for lower ongoing costs and better performance.
As a result, data architecture is evolving – and that means it is one of the most exciting times to be a data-head.
Sure, the future of analytics is predictive, and it will fold social media data into the mix. But right now, on the ground in South African enterprises, the dark days when users simply placed data in a warehouse and accessed it as needed are slowly receding into memory.
Replacing them are lighter times that demand users be on top of their game: they must deliver data to those who need it and help them make sense of it, putting it in context and relating it to their needs. It’s proactive, not reactive.
Enterprises today have tons of data, but it is making sense of the data – turning it into information and putting it into the hands of the right people – that is really the key to maximising its value. As an example: a business has customer data – what customers bought, where and when, and how much they paid – as well as product data, distribution and supply chain data, and HR data.
These undoubtedly sit in different systems, possibly even in different geographic locations. Users need to combine the data so that they can begin to mine it and make some business decisions.
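As a rough sketch of that combination step, the snippet below joins customer records with sales records so they can be mined together. It uses Python’s built-in sqlite3 module, and the table and column names are purely illustrative – in practice these data sets would live in separate systems (a CRM, a point-of-sale system) and need integrating first.

```python
import sqlite3

# Hypothetical, simplified schemas standing in for two separate systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE sales (customer_id INTEGER, product TEXT, amount REAL, sold_on TEXT);
INSERT INTO customers VALUES (1, 'Acme Ltd', 'Gauteng'), (2, 'Bongani & Co', 'Western Cape');
INSERT INTO sales VALUES (1, 'widget', 1200.0, '2013-05-01'),
                         (1, 'gadget', 800.0, '2013-05-03'),
                         (2, 'widget', 400.0, '2013-05-02');
""")

# Combine the two data sets so decisions can be mined from them:
# total spend per customer, joined with who and where they are.
rows = conn.execute("""
    SELECT c.name, c.region, SUM(s.amount) AS total_spend
    FROM customers c JOIN sales s ON s.customer_id = c.id
    GROUP BY c.id ORDER BY total_spend DESC
""").fetchall()
for name, region, total in rows:
    print(name, region, total)
```

The join itself is trivial once the data sits in one place; the hard, hidden work is getting it there.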
There’s much integration and data quality work that goes on in the background, but so far that’s nothing new. And if businesses know their data from their information, their analytics from their intelligence, and their dashboard from their tool (not always easily distinguished), they pull a data warehouse together, copying all the data from the original sources into it.
But that’s a slow process. It requires data replication – always a problem. And it means they’re basing their decisions on what happened yesterday, last week, or even last month.
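That traditional batch approach can be sketched in a few lines. This is an illustrative toy, not a real ETL pipeline: a periodic job fully replicates an operational table into a separate warehouse database, and any transaction that lands after the load is invisible to warehouse reports until the next run – exactly the duplication and staleness described above.

```python
import sqlite3

# Hypothetical operational system and warehouse; names are illustrative.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 250.0)])

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

def nightly_load():
    # Full replication of the source table: the duplicated data the
    # article warns about, refreshed only on the batch schedule.
    rows = source.execute("SELECT id, amount FROM orders").fetchall()
    warehouse.execute("DELETE FROM orders")
    warehouse.executemany("INSERT INTO orders VALUES (?, ?)", rows)

nightly_load()
# A new order lands in the operational system *after* the load...
source.execute("INSERT INTO orders VALUES (3, 75.0)")
# ...so the warehouse's answer is already stale until the next batch run.
stale_total = warehouse.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
live_total = source.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

The gap between `stale_total` and `live_total` is the gap between yesterday’s decisions and today’s reality.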
Surely it would be far better to do without the data warehouse? That would save costs. Warehouses are terrifically expensive and there’s the ongoing cost of running and maintaining them. Using operational stores means no duplication of data. It also means using current data – literally up to the minute.
The ramifications are that this architecture grows the online transaction processing systems into data warehouses and business intelligence back-ends. Executives probably don’t care about that, though, so tell them it saves money and it gives them up-to-the-minute data with which to work.
But surely one of the biggest problems is the performance hit that the systems will take. Not so. The architecture uses data appliance technology to create a virtual warehouse and a business intelligence model is created using standard query and reporting tools plugged into the appliance. There is no need to acquire additional warehousing skills. And it is supported by a high-performance virtual hardware environment.
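The virtual-warehouse idea – querying the operational stores in place, with no copy into a separate warehouse – can be sketched as follows. Here sqlite3’s `ATTACH DATABASE` stands in for the cross-source federation a data appliance would provide; the databases, tables and names are all hypothetical.

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
crm_path = os.path.join(tmp, "crm.db")
pos_path = os.path.join(tmp, "pos.db")

# Two "operational" stores, standing in for live line-of-business systems.
crm = sqlite3.connect(crm_path)
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Ltd')")
crm.commit(); crm.close()

pos = sqlite3.connect(pos_path)
pos.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
pos.execute("INSERT INTO sales VALUES (1, 99.5)")
pos.commit(); pos.close()

# The "virtual warehouse": one query surface over both live stores,
# with no replicated copy of the data and no staleness.
hub = sqlite3.connect(crm_path)
hub.execute(f"ATTACH DATABASE '{pos_path}' AS pos")
row = hub.execute("""
    SELECT c.name, SUM(s.amount)
    FROM customers c JOIN pos.sales s ON s.customer_id = c.id
    GROUP BY c.id
""").fetchone()
```

Standard query and reporting tools plug into that single surface just as they would into a physical warehouse, which is why no additional warehousing skills are needed.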
And then it goes one step further: it sweats existing assets by using the disaster recovery system to shadow the appliance and load-balance its processing.
It’s going to fundamentally change vendor market dynamics because modern analytics is the final frontier – enterprises boldly going where none has yet ventured.