Data quality is critical for IT projects

In any typical IT project, most businesses focus solely on the functional design of the solution, to the exclusion of the data and how this data supports business processes, says Gary Allemann, MD at Master Data Management.

However, if data quality and integration are not planned for from the start, the data has to be retrofitted into the solution, which often causes delays, budget overruns and reduced functionality within the system itself.

Any data-intensive project should factor in data from the outset; doing so also offers an opportunity to improve data quality, helping to ensure that the project is delivered successfully, on time and on budget, and continues to benefit the organisation in the future.

When implementing an IT project, there are three aspects, or streams, that are typically involved:
* The functional components of the new system;
* Change management; and
* Data.

Solutions such as customer relationship management (CRM), master data management (MDM), enterprise resource planning (ERP), business intelligence (BI), data warehousing, data governance, and any data migration or consolidation, rely upon data to deliver on the expected benefits.

Despite the critical nature of data, however, the data stream itself is often an afterthought in any IT project, or is excluded from the scope of the implementation. This creates a number of issues: data must be made to fit the solution after it has been implemented, often resulting in extended delivery times, budget overruns and system problems.

To avoid these problems, it is vital to bring the data stream into play from the early stages of planning a solution. When implementing a new system, the majority of organisations focus on the functional components, giving some attention to change management, training and behaviour modification, particularly on bigger projects with a large user base.

However, the data stream tends to be overlooked, although it is equally critical to project success. Organisations need to plan for the data stream that will be required and integrate it from the beginning, to prevent the issues that arise when it is left to the last minute.

It is also critical to have the involvement of business users throughout the process, as they are the ones who need to consume the data after a new solution has been implemented, and therefore know best what information they need the system to deliver.

Business users understand how information supports their ability to deliver on the business objectives of both the new solution and the organisation as a whole.

A typical project plan consists of several phases, with related business processes. To include the data stream in this plan, technical processes should be added at each stage. During project preparation, the current technology should be analysed, and data risks assessed.
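
As a rough illustration of what such a data risk assessment might involve, the sketch below profiles a source table for missing values, blank fields and duplicate keys. It is a minimal example only; the column names (such as "customer_id" and "email") and the checks chosen are hypothetical assumptions, not a prescribed method.

```python
# Illustrative data-risk profiling sketch; column names such as
# "customer_id" and "email" are hypothetical examples.
import pandas as pd

def profile_source(df: pd.DataFrame) -> pd.DataFrame:
    """Summarise basic data risks per column: missing values, cardinality, blanks."""
    return pd.DataFrame({
        "null_pct": (df.isna().mean() * 100).round(1),   # percentage of missing values
        "distinct": df.nunique(dropna=True),              # cardinality of each column
        "blank_pct": df.apply(                            # empty strings posing as data
            lambda col: (col.astype(str).str.strip() == "").mean() * 100
        ).round(1),
    })

source = pd.DataFrame({
    "customer_id": [101, 102, 102, None],
    "email": ["a@example.com", "", "b@example.com", None],
})
print(profile_source(source))
print("duplicate customer_ids:", int(source["customer_id"].duplicated().sum()))
```

A summary like this gives the project team an early, factual view of where the data risks lie, before the blueprint is designed.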

During the creation of a blueprint, source data should be analysed, a data baseline captured, and a data architecture designed. During the implementation phase, data quality processes should be created, including investigation, cleansing and standardisation, matching and linking. Data quality processes should also be integrated with applications and services during this phase.
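
The sketch below suggests, in simplified form, what cleansing, standardisation and matching can look like in practice: fields are normalised to a canonical form and records sharing a key attribute are linked as one entity. The field names and the matching rule (linking on email address) are illustrative assumptions rather than a recommended rule set.

```python
# Minimal cleansing, standardisation and matching sketch; field names
# and the email-based matching rule are illustrative assumptions.
import pandas as pd

def standardise(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["name"] = out["name"].str.strip().str.title()                 # tidy casing and whitespace
    out["email"] = out["email"].str.strip().str.lower()               # canonical email form
    out["phone"] = out["phone"].str.replace(r"\D", "", regex=True)    # keep digits only
    return out

def match_duplicates(df: pd.DataFrame) -> pd.DataFrame:
    # Naive matching rule: records sharing an email address are linked as one entity.
    out = df.copy()
    out["entity_id"] = out.groupby("email").ngroup()
    return out

records = pd.DataFrame({
    "name": [" jane smith", "Jane Smith "],
    "email": ["Jane@Example.com", "jane@example.com"],
    "phone": ["(011) 555-0101", "011 555 0101"],
})
print(match_duplicates(standardise(records)))
```

In a real implementation these rules would be far richer and would be embedded in the applications and services mentioned above, but the principle is the same: standardise first, then match and link.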

During rollout preparation, the production system data cutover plan should be defined, and the initial data cleanse and loading should take place. During the 'go live' transition, ongoing data quality processing should be performed, and finally, during project maintenance, the technology should be tuned, and change requests and exceptions managed.
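
A minimal sketch of what that ongoing data quality processing could look like is shown below: a small set of validation rules is run against each new batch of records, and failing rows are reported so they can be handled as exceptions or change requests. The rule names, thresholds and column names are hypothetical examples.

```python
# Illustrative ongoing data quality check; rules and column names are assumptions.
import pandas as pd

RULES = {
    "email_not_null": lambda df: df["email"].notna(),
    "phone_is_10_digits": lambda df: df["phone"].str.fullmatch(r"\d{10}", na=False),
}

def run_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that fail each rule, for exception handling."""
    failures = []
    for rule, predicate in RULES.items():
        for idx in df[~predicate(df)].index:
            failures.append({"rule": rule, "row": idx})
    return pd.DataFrame(failures, columns=["rule", "row"])

batch = pd.DataFrame({
    "email": ["a@example.com", None],
    "phone": ["0115550101", "555"],
})
print(run_checks(batch))
```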

For project success, data should be planned and managed from the start, with a full project plan setting out what the data stream should look like, who needs to be involved at each level, and what their roles are.

This will assist the project manager in putting together a project plan that involves the right users, both business and technical, at the right time, merging the functional and data streams and preventing data from becoming a problem later on.