The Data Quality Skeptic
As a Configuration Management Database (CMDB) or IT Asset Management (ITAM) repository owner or sponsor, it’s only a matter of time until you meet The Data Quality Skeptic within your organization.
The Data Quality Skeptic has the following characteristics:
- Does not believe that the repository can be kept up to date
- Refuses to use repository data and disparages it at every opportunity. “I saw bad data in there! You can’t trust it.”
- May be in a position to direct an entire team to avoid the repository, reducing your value proposition
- Often has their own data management capability or process they favor
In the worst cases, The Skeptic can derail your entire CMDB or ITAM repository initiative.
How do you deal with this person?
The first thing is to accept that their concerns are valid. A CMDB or ITAM repository with poor data quality may be worse than no CMDB at all. Discounting data quality as an issue will not help you!
Instead, you need to change the terms of the debate. Fortunately, while data quality is often an unfamiliar topic in IT service management, it has its own practitioner community that has developed useful guidance over the years.
The Data Management Body of Knowledge is a rich and detailed survey of data management best practices. You can start with the concept of a data strategy:
> Typically, a data strategy is a data management program strategy – a plan for maintaining and improving data quality, integrity, security and access. However, a data strategy may also include business plans to use information to competitive advantage and support enterprise goals.
>
> Data Management Body of Knowledge
A data strategy for a CMDB must place data quality front and center. Data quality management typically includes the following:
- Quality indicators and controls (what the data is expected to look like)
- Exception reports
- Trending (displayed and used for Organizational Change Management purposes)
- A continuous improvement effort that addresses the root causes of quality issues (not just fixing individual defects)
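The quality indicators and exception reports above can be sketched in code. This is a minimal illustration with an invented record layout (host name, a "services" list, and a flag for presence in the Fixed Asset system); a real CMDB would expose these through its own schema or API.

```python
# Hypothetical sketch of two quality indicators run against a CMDB extract.
# The record layout below is an assumption for illustration, not a real schema.

cmdb_hosts = [
    {"name": "srv-001", "services": ["Accounts Receivable"], "in_fixed_asset": True},
    {"name": "srv-002", "services": [], "in_fixed_asset": True},
    {"name": "srv-003", "services": ["Payroll"], "in_fixed_asset": False},
]

# Indicator 1: every host should map to at least one service
hosts_no_services = [h["name"] for h in cmdb_hosts if not h["services"]]

# Indicator 2: every host should also appear in the Fixed Asset system
hosts_not_in_fa = [h["name"] for h in cmdb_hosts if not h["in_fixed_asset"]]

# These lists become the exception report
print("Hosts with no services:", hosts_no_services)
print("Hosts missing from Fixed Asset:", hosts_not_in_fa)
```

The point of expressing indicators this way is that they become repeatable controls: the same checks run on every extract, rather than ad hoc spot inspections.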
Perhaps the biggest challenge with data quality is that perfect data quality is not possible, any more than 100% uptime is possible. This makes dealing with The Skeptic difficult, as they will use any instance of data inaccuracy to discredit the entire repository effort. Again, the best response is to shift the debate from "managing by anecdote" to "continuous improvement."
This requires measurement. You must track your exceptions (whether manually or automatically identified) and be able to present data such as this:
*Figure: Percentage of Data Quality Exceptions*
What does the above diagram represent? It is tracking two kinds of data quality exceptions:
- Hosts with no services (e.g. applications) – we know the asset is there, but we do not know what it is doing or how it is providing value. Is it the Accounts Receivable System? Or the employee potluck signup?
- Hosts in the IT Asset Management system but not in the Fixed Asset system.
It is telling us that both of these exceptions have trended down over the year (perhaps as the result of some focused cleanup programs).
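Trend data like this is simple to compute once exceptions are measured consistently. The sketch below uses invented quarterly counts (not real measurements) to show how an exception rate and its direction of travel can be derived:

```python
# Hypothetical sketch: trending one exception rate over four quarters.
# The counts are invented illustration data, not real measurements.

quarterly = [
    # (quarter, total hosts, hosts with no services)
    ("Q1", 400, 80),
    ("Q2", 410, 62),
    ("Q3", 420, 45),
    ("Q4", 430, 30),
]

rates = [(q, round(100 * bad / total, 1)) for q, total, bad in quarterly]
for q, pct in rates:
    print(f"{q}: {pct}% exceptions")

# A simple check that the trend is moving in the right direction
improving = all(rates[i][1] > rates[i + 1][1] for i in range(len(rates) - 1))
print("Trending down:", improving)
```

Presenting rates rather than raw defect counts is what moves the conversation away from anecdote: a single bad record no longer "proves" anything, while a rising rate is a genuine signal.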
With data like this, you change the conversation:
The question is: how much data quality do you need, and how much can you afford? "No errors" is unrealistic. Instead, you need to have a business conversation about the risk that bad data quality presents, and weigh that risk against all the other risks your organization must respond to. For example, it is very difficult to fully reconcile an IT Asset system with a Fixed Asset system (driving the exception rate below even 10% might be unrealistic). Your company might decide that 20% exceptions are acceptable and that the risks of those exceptions will be handled via other controls.
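An agreed risk tolerance can itself be encoded as a control. In this sketch, the 20% figure stands in for whatever tolerance the business conversation actually produces; it is an assumed value, not a standard:

```python
# Hypothetical sketch: comparing a measured exception rate against an
# agreed risk threshold. The 20% figure is an assumed, negotiated
# tolerance, not a universal benchmark.

AGREED_THRESHOLD_PCT = 20.0  # assumption: tolerance agreed with the business

def within_tolerance(exception_count: int, total_records: int) -> bool:
    """Return True if the exception rate is at or below the agreed threshold."""
    rate = 100.0 * exception_count / total_records
    return rate <= AGREED_THRESHOLD_PCT

print(within_tolerance(75, 430))   # ~17.4% exception rate
print(within_tolerance(110, 430))  # ~25.6% exception rate
```

Encoding the tolerance makes the agreement explicit and auditable: when the rate breaches the threshold, the response is a planned escalation, not an argument with The Skeptic.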
The important point is to focus on the business needs that drive your need for quality IT management data. By keeping this as your priority and basing the conversation on “continuous improvement” rather than “unattainable perfection,” your CMDB/ITAM effort will be on solid footing.