
In the wake of the May 2017 WannaCry malware attack that hit computer systems around the world, company executives are in urgent discussions with IT leaders, asking for assessments of risks and vulnerabilities and for recommendations to safeguard the company’s information and operations. CIOs and IT leaders depend heavily on the accuracy, completeness and trustworthiness of the data at their disposal to make informed decisions. How confident are you in the data being used to protect your organization from harm?


One of the biggest challenges for IT leaders is creating a dependable “big picture” view of their organization’s technology ecosystem and its dependencies, because the pieces of their operational data are spread across a wide variety of technology management tools, ITSM systems, asset inventories, maintenance/patching logs and fragmented architecture models. While all of the pieces of the puzzle may be available, they often don’t fit together well (or easily), and data quality issues frequently appear: gaps, duplications, overlaps, conflicts, inaccuracies and out-of-date records. The result is a confusing picture for IT leaders, and one that cannot be shared with company executives without undermining confidence in IT leaders’ recommendations and decisions.


If the quality of a decision is only as good as the data on which it is based, then the solution to this problem seems simple: “Improve the quality of the data.” For most organizations, however, “simple” is not a term that applies to data management. Integrating IT data from multiple systems into a unified enterprise picture involves a series of complex steps: integration, validation, reconciliation, correlation and contextualization must all be performed to ensure the quality and trustworthiness of the information consumed. Unfortunately, most companies’ ITSM, Data Warehouse, Operations Management and even reporting systems lack the capabilities to perform these unification steps effectively. This is where specialized Data Quality management technology is needed.
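
To make these steps concrete, here is a simplified sketch of how a handful of asset records might flow through validation, reconciliation, correlation and contextualization. The field names, record shapes and matching rules are illustrative assumptions for this example only, not a description of any specific product’s implementation.

```python
# Simplified illustration of the unification steps described above.
# All field names, source systems and matching rules are hypothetical.

def validate(records):
    """Drop records that lack the fields every downstream step depends on."""
    return [r for r in records if r.get("hostname") and r.get("source")]

def reconcile(records):
    """Collapse duplicate records for the same device, letting newer data win."""
    merged = {}
    for r in sorted(records, key=lambda rec: rec.get("last_seen", "")):
        merged.setdefault(r["hostname"].lower(), {}).update(r)
    return list(merged.values())

def correlate(devices, patch_log):
    """Attach the OS version reported for each device in the patch log."""
    latest = {p["hostname"].lower(): p["os_version"] for p in patch_log}
    for d in devices:
        d["os_version"] = latest.get(d["hostname"].lower(), d.get("os_version"))
    return devices

def contextualize(devices, owners):
    """Add the business context (owning department) needed for impact analysis."""
    for d in devices:
        d["department"] = owners.get(d.get("assigned_user"), "unknown")
    return devices

def unify(asset_records, patch_log, owners):
    """Integration (gathering the inputs) happens upstream; the rest runs here."""
    return contextualize(correlate(reconcile(validate(asset_records)), patch_log), owners)
```

In practice each step is far richer (fuzzy matching, source precedence rules, audit trails), but the order of operations is the important point: records must be cleansed and reconciled before they can be correlated with other sources and placed in business context.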


Consider for a moment where the IT operational data related to the latest malware attack resides, focusing on identifying the areas of vulnerability and assessing the potential extent of the impact on the business. This latest attack exploited a known security issue in certain operating system versions in use both on end-user computer systems and on some systems in data centers.


The asset records that identify the potentially impacted devices are typically found in an Asset Management system, a Finance/Purchasing system, network access logs, or as configuration items in the CMDB of the ITSM system. Patching records that indicate which version of the operating system each device is running may be found in a change management or deployment system (used by IT to distribute patches to users); an asset management system (if devices are inventoried frequently); an infrastructure management system (if the devices are in a data center); or in the ITSM system (if records are maintained when manual patching is done).
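
To illustrate, a first pass at identifying potentially vulnerable devices might union the hostnames reported by every inventory source and compare each device’s last recorded OS build against the builds named in a vendor advisory. The source names, fields and version strings below are placeholder assumptions, not data from any real system.

```python
# Hypothetical example: flag devices whose last recorded OS build appears on
# an advisory's vulnerable list. Sources, fields and builds are placeholders.

VULNERABLE_BUILDS = {"6.1.7601", "6.3.9600"}  # illustrative builds from an advisory

def flag_vulnerable(asset_sources, patch_records):
    """asset_sources: dict of source name -> list of hostnames it reports.
    patch_records: list of {hostname, os_build, patched_on} from patch/deploy logs."""
    # Integrate: union of every device any source has ever reported.
    all_devices = {h.lower() for hosts in asset_sources.values() for h in hosts}

    # Latest known OS build per device, taken from the most recent patch record.
    latest_build = {}
    for rec in sorted(patch_records, key=lambda r: r["patched_on"]):
        latest_build[rec["hostname"].lower()] = rec["os_build"]

    flagged, unknown = [], []
    for host in sorted(all_devices):
        build = latest_build.get(host)
        if build is None:
            unknown.append(host)        # no patch data at all: treat as suspect
        elif build in VULNERABLE_BUILDS:
            flagged.append(host)
    return flagged, unknown

# Example inputs drawn from the kinds of systems described above.
sources = {
    "asset_mgmt": ["LAPTOP-0142", "SRV-DB-01"],
    "cmdb":       ["srv-db-01", "SRV-WEB-02"],
    "network":    ["laptop-0142", "LAPTOP-0388"],
}
patches = [
    {"hostname": "SRV-DB-01", "os_build": "6.3.9600", "patched_on": "2017-03-02"},
    {"hostname": "LAPTOP-0142", "os_build": "10.0.14393", "patched_on": "2017-04-18"},
]
print(flag_vulnerable(sources, patches))
# -> (['srv-db-01'], ['laptop-0388', 'srv-web-02'])
```

Note that the devices with no patch data at all are often the most important output of this exercise: they are the gaps that a data quality process must surface rather than silently ignore.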


Once potentially vulnerable devices have been identified, IT staff and decision makers must understand where the devices are being used within the organization in order to assess the impact on business operations. For end-user devices, assigned-user/owner data is typically contained in asset inventory records, IT access management/account management systems and server access logs. The user can then be associated with a business function or department through HR records. For devices installed in data centers and other shared locations, the ITSM system, purchasing records, asset inventories and/or architecture models can often be used to identify the relationships between a device and a business process or responsible department/function.
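
Continuing the illustration, once devices have been flagged, a simple join of device-owner assignments with HR department records turns a list of hostnames into a per-department impact view. Again, the record shapes and names are hypothetical.

```python
# Hypothetical example: roll flagged devices up to business functions by joining
# device-owner assignments with HR department records.
from collections import Counter

def impact_by_department(flagged_hosts, device_owners, hr_records):
    """flagged_hosts: hostnames identified as potentially vulnerable.
    device_owners: dict hostname -> assigned user (from asset/access systems).
    hr_records: dict user id -> department (from HR)."""
    impact = Counter()
    for host in flagged_hosts:
        user = device_owners.get(host.lower())
        dept = hr_records.get(user, "unassigned / data center")
        impact[dept] += 1
    return impact

owners = {"laptop-0142": "jsmith", "laptop-0388": "mlee"}
hr = {"jsmith": "Finance", "mlee": "Operations"}
print(impact_by_department(["LAPTOP-0142", "LAPTOP-0388", "SRV-DB-01"], owners, hr))
# -> Counter({'Finance': 1, 'Operations': 1, 'unassigned / data center': 1})
```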


There are commonly at least five independent sources of data that must be combined to identify which devices are potentially vulnerable and which business functions depend on them. When these data sets are gathered, there will inevitably be a large number of duplicates, partial records, records for devices that have been retired or replaced, conflicting data about the same device, and records containing old, inaccurate data. According to Gartner, at any moment as much as 40% of enterprise data is inaccurate, missing or incomplete. Data quality technology can help integrate the data, resolve these issues, alert data management staff to areas that need attention and help decision makers understand the accuracy and completeness of the data on which they depend.
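
A data quality layer typically begins by quantifying exactly these problems. The sketch below, using made-up records, reports duplicates, conflicting attribute values across sources and a basic completeness score; the rules and field names are assumptions for illustration, and production tooling would layer fuzzy matching, source precedence and staleness thresholds on top of measurements like these.

```python
# Hypothetical sketch: measure duplicates, conflicts and completeness across sources.
from collections import defaultdict

REQUIRED_FIELDS = ("hostname", "os_version", "assigned_user")

def quality_report(records):
    """records: asset rows from any source, each a dict including a 'source' field."""
    by_host = defaultdict(list)
    for r in records:
        by_host[r["hostname"].lower()].append(r)

    # Same device reported by more than one source (or more than once).
    duplicates = {h: len(rs) for h, rs in by_host.items() if len(rs) > 1}

    # Sources that disagree about the same device's OS version.
    conflicts = {}
    for h, rs in by_host.items():
        versions = {r.get("os_version") for r in rs if r.get("os_version")}
        if len(versions) > 1:
            conflicts[h] = sorted(versions)

    complete = sum(all(r.get(f) for f in REQUIRED_FIELDS) for r in records)
    return {
        "devices": len(by_host),
        "duplicate_devices": duplicates,
        "conflicting_os_versions": conflicts,
        "record_completeness": complete / len(records) if records else 0.0,
    }
```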


Blazent has been a leader in Data Quality solutions for more than 10 years and specializes in integrating the types of IT operational data that CIOs and IT leaders need to assemble an accurate, unified big-picture view of their technology ecosystem. With the data quality and trustworthiness enabled by Blazent’s technology, your leaders and decision makers can be confident that the information they use to assess vulnerabilities and risks will lead to sound recommendations and decisions that protect your organization from harm. Learn more about Blazent’s approach to building a high-quality CMDB in our best practice white paper here, or contact us directly at Sales@blazent.com.