Many market commentators are invoking master data management, or MDM, as a comprehensive and enduring solution to the prickly issue of data quality, writes Mervyn Mooi, a director of Knowledge Integration Dynamics (KID).

Some of them have oversold the promise of MDM, not surprising given that it is a technology still in the early part of the hype curve.
First of all, for anyone involved in the daily fight to get data right, it is clear that no single technology or approach will fulfil all data quality requirements.
It's worth revisiting a definition of MDM, as provided by Wikipedia: "… a set of processes and tools that consistently defines and manages the non-transactional data entities of an organisation (also called reference data). MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing such data throughout an organisation to ensure consistency and control in the ongoing maintenance and application use of this information."
So MDM has a very clear and noble role to play, but on its own it cannot guarantee data quality. MDM deals primarily with applying rules that match the same entity to a single convention and identity. This is effective for achieving a "single view" of the customer, which enables linking across systems and data using a uniquely identified key assigned or used by MDM. But this is still not the "golden record" of the customer that companies are seeking.
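Purely to illustrate the idea, the following is a simplified Python sketch of how matching rules might consolidate the same customer from two source systems under one master key. The field names and the crude email-based matching rule are assumptions for illustration, not how any particular MDM product works.

```python
# Illustrative sketch: consolidate matching records under one master ID.
from uuid import uuid4

def normalise(record):
    """Apply a simple convention: strip whitespace, lower-case values."""
    return {k: v.strip().lower() for k, v in record.items()}

def match_key(record):
    """Crude matching rule (an assumption): same email means same customer."""
    return record["email"]

def build_master_index(records):
    """Assign one master ID per matched entity across source systems."""
    index = {}
    for rec in map(normalise, records):
        key = match_key(rec)
        if key not in index:
            index[key] = {"master_id": str(uuid4()), "sources": []}
        index[key]["sources"].append(rec)
    return index

crm = {"name": "J. Smith ", "email": "J.Smith@example.com"}
billing = {"name": "John Smith", "email": "j.smith@example.com "}
print(build_master_index([crm, billing]))  # both records share one master_id
```

The single view comes from the shared master key; nothing in this step checks whether the attributes themselves are correct.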
For a golden record, you need ALL of the customer's attributes validated and adhering to standard/agreed data rules. This falls in the realm of reference data, of which MDM is a part.
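Again as a hypothetical sketch only, attribute-level validation against agreed rules is the extra step that separates a matched record from a golden record. The rules below are assumptions chosen for illustration, not a standard.

```python
# Illustrative sketch: validate every attribute against agreed data rules.
import re

RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"ZA", "GB", "US"},          # agreed reference list
    "id_number": lambda v: v.isdigit() and len(v) == 13,   # assumed ID format
}

def validate(record):
    """Return the attributes that are missing or break the agreed rules."""
    return [field for field, rule in RULES.items()
            if field not in record or not rule(record[field])]

candidate = {"email": "j.smith@example.com", "country": "ZA",
             "id_number": "800101500908"}
print(validate(candidate))  # ['id_number'] -> only 12 digits, so not golden yet
```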
In addition, there is the application of business rules, which are used to surface data anomalies that standard matching does not typically identify. This is where data profiling and rules engines become relevant. They can be built on an enterprise service bus (ESB), in a data warehouse, or in an enterprise application integration solution.
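A minimal profiling sketch, again with assumed column values and thresholds, shows the kind of anomaly this surfaces: nulls, outliers and unexpected values that no matching rule would ever flag.

```python
# Illustrative sketch: basic column profiling to surface anomalies.
from collections import Counter

def profile_column(values):
    """Simple profile: fill rate, distinct count, range and top values."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    return {
        "fill_rate": len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "min": min(non_null, default=None),
        "max": max(non_null, default=None),
        "top_values": Counter(non_null).most_common(3),
    }

ages = [34, 29, None, 41, 999, 38, None, 29]   # nulls and 999 are anomalies
print(profile_column(ages))  # low fill rate and max of 999 stand out
```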
An even greater requirement for preserving the integrity of data resources is to manage their definitions, or metadata, in order to achieve integration, which is again a function of data quality.
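In the simplest terms, this amounts to a shared data dictionary that integrating systems consult so they agree on what each field means. The entries below are hypothetical and purely illustrative.

```python
# Illustrative sketch: a shared data dictionary holding field definitions.
from dataclasses import dataclass, asdict

@dataclass
class FieldDefinition:
    name: str
    data_type: str
    description: str
    source_system: str

DICTIONARY = {
    "customer_id": FieldDefinition("customer_id", "string",
                                   "Master key assigned by MDM", "MDM hub"),
    "email": FieldDefinition("email", "string",
                             "Primary contact address, lower-cased", "CRM"),
}

def describe(field):
    """Look up the agreed definition before mapping or integrating a field."""
    return asdict(DICTIONARY[field]) if field in DICTIONARY else None

print(describe("customer_id"))
```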
So, for trustworthy data, blend master and reference data management, add data profiling, and stir in metadata management.