This past weekend, I had the pleasure of racing in the Reebok Spartan Race at AT&T Park in San Francisco. For those of you unfamiliar with the event, it involves running long stretches up and down bleacher steps or across uneven terrain until you reach your next “reward”: one of 20 or so obstacles featuring Spartan-esque activities such as scaling six-foot-high walls, climbing 20-foot ropes, carrying heavy objects, or traversing 25 feet of uneven monkey bars.

One of the reasons I enjoy this event so much is that even though you can prepare by running, lifting weights, doing pull-ups, executing hundreds of box jumps, or performing many other cross-training exercises, race-day obstacles are a known unknown: you know there will be obstacles; you just don’t know their type, order, or placement. To succeed, athletes must predict what is coming using the data sources at their disposal.

So what does this have to do with IT?

Poor Data Quality Leads to Failure

According to Gartner research, “Poor data quality is a primary reason for 40% of all business initiatives failing to achieve their targeted benefits.” Just as with the Spartan Race, IT has to balance discipline and preparation with the ability to respond to the known unknowns that are a daily part of IT life.

The idea of the known unknown comes to mind every time I read about a technology-related business failure, such as the troubles at United Airlines and the NYSE. In these instances, it appears the outages were self-inflicted, which means they could have been prevented with the right approach and technology. Still, the questions abound: Did they need to happen at all? Were these organizations operating and making decisions based on high-quality data? Had they taken the time to prepare and plan for their obstacles to success? Did they even have the data from all of the sources they needed to plan and execute properly?

Data Quality: The Key to Prevention and Preparation

In one outage, the culprit is now being linked to poor change management practices. Effective change management relies on stakeholders being able to understand relationships and dependencies and to conduct change impact analysis. At Blazent, after 13 years of working with enterprise data, we’ve found that at any one time up to 40% of IT data can be inaccurate or incomplete. This is why a data intelligence platform that provides current, complete, and accurate data is critical to supporting any IT or business initiative.
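
To make this concrete, here is a minimal sketch, in Python, of what change impact analysis over dependency data can look like. It is illustrative only: the configuration items and the impacted_items function are hypothetical, not part of any particular product.

    from collections import deque

    # Hypothetical CMDB extract: each configuration item maps to the
    # items that depend on it (e.g., a database feeding two services).
    DEPENDENTS = {
        "db-orders": ["svc-checkout", "svc-reporting"],
        "svc-checkout": ["web-storefront"],
        "svc-reporting": [],
        "web-storefront": [],
    }

    def impacted_items(changed_item, dependents):
        """Breadth-first walk of the dependency graph to list every
        item that could be affected by changing `changed_item`."""
        seen, queue = set(), deque([changed_item])
        while queue:
            for downstream in dependents.get(queue.popleft(), []):
                if downstream not in seen:
                    seen.add(downstream)
                    queue.append(downstream)
        return seen

    print(sorted(impacted_items("db-orders", DEPENDENTS)))
    # ['svc-checkout', 'svc-reporting', 'web-storefront']

The walk itself is trivial; the hard part is the data feeding it. If a large share of those dependency records is wrong or missing, the impact list is wrong too, and the change goes out half-blind.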

As I look at what happened at the Wall Street Journal, I wonder how many IT organizations are predicting what might happen based on the known unknown. When the NYSE went down and United Airlines flights were grounded, did anyone at the WSJ, or any algorithm, predict the possible obstacle to the success of their own business? On this occasion, it appears the answer was no. But now that they have that information as part of their history, let’s hope the executives are taking measures to ensure they are properly prepared for the next known unknown.

By this point, everyone knows IT is under immense pressure, and the complexity of systems is only increasing. IT and the enterprise know obstacles are lurking around the corner; they just don’t know when or where those obstacles will appear. To reduce risk, increase agility, and improve overall efficiency, IT must operate with quality data it can trust and act upon.

A Data Intelligence Platform

At Blazent, we are committed to enabling IT to make decisions based on quality data. To that end, we have developed the Blazent Data Intelligence Platform, which includes a five-step Data Evolution process that will reduce that 40% of bad data down to zero. We want to ensure bad things don’t happen to good people by telling our customers what they don’t know and helping them prepare for the known unknown.

In his latest blog post, our CEO discussed ‘Rebalancing the Data/Decision Equation.’ In it, he highlighted some key characteristics of the Data/Decision Equation, illustrated by the short sketch after the list. The data must be:

  • All-encompassing: based on ALL relevant data, no matter the type or source. If it exists, we should be able to use it.
  • Capable of delivering immediate (real-time), actionable information.
  • As ‘association-rich’ as possible.
  • As ‘history-rich’ as possible.
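
As a rough illustration of why ‘all-encompassing’ and ‘association-rich’ matter, here is a small Python sketch that reconciles the same asset record from two sources. The sources, field names, and merge rule are hypothetical, chosen only to show the idea:

    # Hypothetical records for the same server from two data sources.
    from_cmdb  = {"hostname": "app01", "os": "RHEL 7", "owner": None}
    from_agent = {"hostname": "app01", "os": "RHEL 8", "owner": "ops-team"}

    def reconcile(primary, secondary):
        """Merge two records field by field, filling gaps and flagging
        conflicts so decisions rest on one reconciled, complete view."""
        merged, issues = {}, []
        for field in sorted(set(primary) | set(secondary)):
            a, b = primary.get(field), secondary.get(field)
            if a == b:
                merged[field] = a
            elif a is None or b is None:
                merged[field] = a if b is None else b
                issues.append(f"{field}: filled gap from one source")
            else:
                merged[field] = b  # assumption: prefer the fresher agent data
                issues.append(f"{field}: conflict ({a!r} vs {b!r})")
        return merged, issues

    merged, issues = reconcile(from_cmdb, from_agent)
    print(merged)  # {'hostname': 'app01', 'os': 'RHEL 8', 'owner': 'ops-team'}
    print(issues)  # ["os: conflict ('RHEL 7' vs 'RHEL 8')", 'owner: filled gap from one source']

Even this toy example surfaces the kinds of gaps and conflicts that, multiplied across an enterprise, produce the inaccurate or incomplete data described above.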

In all of the outages listed above, the spotlight was shone on IT for the wrong reasons, and their PR teams are now hard at work rebuilding customer confidence. What about all of the near misses we never heard about? What are your IT teams doing right now to prepare for and predict the known unknown? Are they really prepared for their moment in the spotlight?