
Analysts, operations staff, managers and IT leaders spend a tremendous amount of time looking at data about their IT operations. “What do we have?” “What is it used for?” “How accurate is it?” “Why don’t the numbers add up?” “What happens if we miss something?” These are all necessary questions for managing an organization’s IT environment, allocating resources efficiently and addressing risk and vulnerability. Why, then, is managing this data so difficult for most organizations? The answer lies in data quality.


It’s good that you have data. It’s better if you have data from a variety of sources. It is better still if you are gathering that data in one place, such as a CMDB, that provides a single source of truth about your IT environment. It is great if that data is used to drive decisions. It is not so great if the data is incomplete, inaccurate or outdated. What started as the potential for valuable insights is now causing a monumental headache for your staff.


At the heart of the issue is the way companies integrate data and import it into their CMDB. Each discovery tool provides a narrow view of the overall environment, and when viewed together, those views often don’t fit well. There are gaps, overlaps and conflicts among the different perspectives. There is also the issue of timing: data about an environment that is constantly moving and changing is collected at different times, which makes the different perspectives difficult to reconcile. When the data from different discovery tools is loaded into a CMDB, the problems don’t just disappear – the CMDB inherits the quality issues of its source systems. To make sense of what you’re seeing, the data must be reconciled and viewed as a single image.
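To make the reconciliation problem concrete, here is a minimal sketch (not Blazent’s actual method) of merging two hypothetical discovery feeds keyed by hostname, where the most recent observation wins on conflicting attributes and older scans still contribute attributes the newer scan missed:

```python
from datetime import datetime

# Hypothetical discovery records: each feed reports a partial view of the
# same environment, captured at different times.
feed_a = [
    {"hostname": "srv01", "os": "RHEL 8", "seen": datetime(2023, 5, 1)},
    {"hostname": "srv02", "os": "Windows 2019", "seen": datetime(2023, 5, 1)},
]
feed_b = [
    {"hostname": "srv01", "os": "RHEL 9", "ip": "10.0.0.5",
     "seen": datetime(2023, 6, 1)},  # newer scan: conflicting OS value
    {"hostname": "srv03", "os": "Ubuntu 22.04", "seen": datetime(2023, 6, 1)},
]

def reconcile(*feeds):
    """Merge records by hostname. Processing records oldest-first means the
    most recent observation overwrites conflicts, while attributes that only
    an older scan supplied are preserved."""
    merged = {}
    for record in sorted((r for f in feeds for r in f), key=lambda r: r["seen"]):
        merged.setdefault(record["hostname"], {}).update(record)
    return merged

cis = reconcile(feed_a, feed_b)
```

Even this toy version shows why timing matters: without the timestamp ordering, whichever feed happened to load last would silently win every conflict.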


Today, most IT organizations rely on staff with experience and subject matter expertise to scrub data as it comes out of the CMDB and is transformed into information and insights that drive decisions. While this approach can be somewhat effective, it isn’t efficient, and it doesn’t address the underlying problem. As the data grows in volume and complexity, more resources are required to address its quality issues. A better approach is to clean the data and address quality issues as new data enters the CMDB – fixing it once and creating a solid foundation for decision making.
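The clean-on-ingest idea can be sketched as a simple validation gate in front of the CMDB. This is an illustrative example, not a specific product’s API: the required fields and hostname rule are assumptions, and a real pipeline would carry many more checks.

```python
import re

# Hypothetical minimum attributes a configuration item must carry to be loaded.
REQUIRED = ("hostname", "os", "owner")
HOSTNAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]*$")

def validate_ci(record):
    """Return a list of data-quality problems for one incoming CI record;
    an empty list means the record is clean enough to load."""
    problems = []
    for field in REQUIRED:
        if not record.get(field):
            problems.append(f"missing {field}")
    name = record.get("hostname", "")
    if name and not HOSTNAME_RE.match(name.lower()):
        problems.append("malformed hostname")
    return problems

def ingest(records):
    """Split an incoming batch into clean records and rejects for review,
    so bad data never reaches the CMDB in the first place."""
    clean, rejects = [], []
    for r in records:
        issues = validate_ci(r)
        (rejects if issues else clean).append((r, issues))
    return [r for r, _ in clean], rejects
```

The point of the design is that quality problems are caught once, at the boundary, instead of being rediscovered by every analyst who later queries the CMDB.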


Once your organization depends heavily on its operations and configuration data to make decisions, there is no option but to clear the mess, hit reset and start again. You are where you are, and you must initiate a two-part strategy to fix the problem. The first part is cleaning your current data. If you’ve been operating your ITSM system for a few years and collecting data that entire time, you likely have both data quality and data quantity issues to address. The second part is changing your methodology so that new data makes the overall picture clearer and the data cleaner, instead of making the situation worse.


It seems simple, right? Just clean the mess and keep it clean (a bit like confronting a teenager’s bedroom). For many organizations, however, this change can be difficult. Legacy discovery and collection processes, the lack of a data lifecycle management process, fragmented data ownership and the sheer scale of the challenge can seem insurmountable. The good news is that improving your data quality isn’t as difficult as it seems, and the benefits are tremendous. The key is having the right tools to help you.


Blazent’s Data Quality Management solutions enable you to reconcile all of your discovery feeds and other sources with the current data in your CMDB: to verify accuracy and completeness, to resolve timing issues so your data is the most current, and to highlight missing elements and attributes so you can resolve them. The result is a set of data in your CMDB that is more complete, correct and current – a trustworthy foundation for decision making. In short, better data quality leads to better decisions. Find out more about the top 5 costs associated with poor data quality by downloading the whitepaper here.