Data integration and data integrity are two of the most critical challenges keeping a company from true enterprise intelligence. 

While our back-end technology systems keep growing, there is often little opportunity to view all of our data from a single, integrated, trustworthy data store, writes Annemarie Cronje, project manager: DI Studio at SAS Institute SA.
It is with this ideal in mind that data integration needs to be a critical component of an overarching business intelligence (BI) and storage strategy.
Companies that have embraced a BI policy need quick access to, and management of, data gathered throughout the organisation.
This data has to be transformed to make it consistent and trustworthy in order to provide credible strategic BI as opposed to ad-hoc reporting on data which is obtained from silos in the company’s storage areas.
A good data integration tool can remove the headache of moving, accessing and transforming data throughout the enterprise.
After researching our customers’ enhancement requests and improvement suggestions, we discovered that customers are looking for a series of data integration tools which not only respond to data integration and data quality improvement requirements but also enable the enterprise to consolidate the number of vendors they use by standardising on a single integration tool platform.
This ensures enterprisewide data integrity, reduces the cost of data integration, enhances the sustainability of integration projects and ultimately reduces the overall cost of downstream data exploitation.
In our experience, the keyword for data integration is elimination – eliminating arduous and time-consuming custom hand-coding; eliminating documentation backlogs; eliminating reliance on specialist skill sets. It is with this ideal that we have developed a series of data integration tools to assist companies in ensuring that the data in their organisations passes all the required health checks.
When working with a client to decide on their data requirements, the first element we look at is the elimination of delivery delays and the high costs associated with having customised systems built for every integration project.
Not only is it expensive to develop customised software tools, but it can be extremely risky when the intellectual property behind that software leaves the company with the people who built it.
In many instances these tools are developed to handle only a single element or integration challenge of the business, and often prove ineffective when needed to execute data integration across a myriad of disparate systems.
A data integration tool that allows integration tasks to be developed in a collaborative, self-documenting, re-usable manner, that runs across disparate systems, and that can read and transform a variety of data streams can prove far more effective for a company.
This also eliminates time delays in execution and potential hold-ups in the business environment due to ineffective data integrity tools.
But bad data in means bad data propagated. Data integration is not only about getting data into a single system so that data exploration can occur, but also about improving the accuracy and relevance of the data contents during integration, using standardisation and matching techniques.
Data which is inaccurate or contradictory can have devastating effects on reporting, analysing or forecasting. Imagine investing in a company with 5-million subscribers, only to find out that 1-million were duplicate clients.
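As an illustrative sketch only (not the SAS tooling described here), standardising and matching can be as simple as normalising the fields used for comparison and collapsing records that agree on the normalised key; the field names and records below are hypothetical:

```python
# Hypothetical sketch: standardise subscriber records during integration and
# collapse duplicates, so downstream counts reflect unique clients.

def standardise(record):
    """Normalise the fields used for matching (case and surrounding whitespace)."""
    return (
        record["name"].strip().lower(),
        record["email"].strip().lower(),
    )

def deduplicate(records):
    """Keep the first record seen for each standardised key."""
    seen = {}
    for record in records:
        key = standardise(record)
        if key not in seen:
            seen[key] = record
    return list(seen.values())

subscribers = [
    {"name": "A. Smith", "email": "a.smith@example.com"},
    {"name": "a. smith ", "email": "A.Smith@example.com"},  # duplicate in disguise
    {"name": "B. Jones", "email": "b.jones@example.com"},
]

unique = deduplicate(subscribers)
print(len(subscribers), "raw records ->", len(unique), "unique clients")
```

Real matching engines go far beyond exact keys (phonetic codes, fuzzy similarity, householding), but the principle is the same: the comparison happens on standardised values, not raw input.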
It is our opinion that data quality should dovetail with your data integration efforts rather than being an afterthought in the integration process.
By embedding data quality within a data integration process, the end result is one which is clear and consistent, allowing problems to be identified before the fact, as opposed to wasting time and money after the fact tracking down inaccurate or inconsistent data.
The customer is then also assured of having an accurate, trustworthy data platform to support future analytics and business performance management across the enterprise.
But projects change, and when one data integration task has ended there is often cause for a new one. In the past, companies have had to view each data integration project as a separate occurrence and had to develop a budget and project scope for each instance.
With a single enterprisewide data integration tool users can avoid the spiralling costs of each new project, as existing tools can be recycled to work across a variety of projects, as they are able to interface with all of the existing systems within an organisation.
Real BI is not achieved in the analytics or presentation of the data after extraction – it starts with having the right data, clean, healthy and in place from the outset of every project.