Removing Roadblocks in the Path to Data Analytics Success

By Michael Risse, Vice President, Seeq Corporation

Trade shows and conferences are about what’s new; vendors introduce the latest in products, architectures and services—and claim these new offerings are much better than their predecessors. And in the last few years, there have been many waves of “new” for data analytics and other industrial automation vendors to present. In a rough order of progression, they might include:

  • Cloud computing
  • Big data
  • The Internet of Things, whatever it may be called in your organization
  • Fog computing
  • Wireless sensors, and
  • A flavor of cognitive computing such as machine learning, artificial intelligence or deep learning. 

So many new things, and so much pressure on end users to make the right technology choices in a changing market.

But if you’ve been through the exercise of owning or implementing a data analytics or business intelligence project, you know that looking at new technologies first is the wrong way to proceed. Failure rates on data analytics projects are very high, estimated at 70 to 80% by industry analysts.

Decisions on which technology to use often get most of the attention, but choosing Microsoft Azure instead of Amazon AWS doesn’t make or break a project. Neither does the Hadoop distribution selected (Cloudera or Hortonworks), or the specific sensor endpoint technology chosen.

What does matter is the people in organizations charged with implementing data analytics projects, as demonstrated at a recent industry conference (Figure 1). In a hotel conference room with a panel of speakers on a raised dais, the topic was sensors, but an entirely different discussion broke out about the organizational implications of new and different data insights on production results.

Figure 1, Panel discussion. Panel discussions at automation industry conferences often start out as technology discussions, but frequently switch to people issues, and rightly so.

Who would be impacted?  What was the plan for change management?  How was the project sold to stakeholders and key leaders?  And what were the key governance issues – data, security, and processes – associated with the project plan?

It was as if the discussion got turned upside down: not bottom-up starting with sensors, but starting from the top with business objectives and organizational issues. Technology matters, of course, but the killers of analytics projects are typically people issues. The panelists were obviously experienced project leads with successful implementations to inform their perspective.

Would that I had had their experience during my first project, when I was introduced to the five people you meet when implementing analytics projects, all potential roadblocks:

  • The IT person who won’t
  • The employee who isn’t incentivized
  • The data expert who said it was easy
  • The colleague with ownership
  • The executive with talking body parts

These are people you already know; they just take on a different role when it comes to organizational issues of control, access and information when implementing a data analytics project.

Here’s how these people can negatively impact your data analytics project (Figure 2). These descriptions will help identify points of resistance in advance so you can deal proactively with each.

Figure 2, Person as roadblock. Certain types of people within organizations can block implementation of data analytics projects.

The IT person who won’t

There is a maxim that 70% of IT department resources and budget are spent maintaining existing projects and applications, with the balance deployed on new work. This means any new request to an IT department isn’t going to land on the desk of a capable, accomplished employee just waiting for a project to show up.

They are already busy, and likely under pressure to deliver on existing commitments. Getting on their priority list means competing for attention and budget through a planning cycle, likely annual, complete with executive reviews and approval processes. And there is the challenging issue that many analytics projects are fishing expeditions to find where changes can be made to improve performance. Analytics projects are therefore somewhat speculative to begin with, which makes them easy to question or criticize for their lack of provable ROI.

For example, if the objective is to operationalize a set of KPIs for process or business effectiveness, how do the KPIs demonstrate, in advance, that the improved business impact will justify the cost of the project? The result is that the IT department, a critical resource and partner in many analytics projects, can be a roadblock for data analytics project owners needing priority and support for their efforts.

The employee who isn’t incentivized

Any organization of reasonable size has an annual review cycle in which employees are evaluated and given incentives tied to outcomes or production for the coming year. The question is what happens when the process or the findings of the data analytics effort conflict with an employee’s annual objectives. The findings may, in fact, show that an employee’s efforts run counter to the desired outcomes, so their incentives work against finding insights with data analytics and improving operations.

From an organizational perspective, goals are set and maintained by managers, and they are backed up by training, KPIs and metrics. At the individual level, these are internalized as incentives: doing nothing is a known, low-risk outcome, while taking the risk of doing something new may deliver a significant benefit to the organization but could undermine the employee’s performance review.

The result can range from noncompliance to actively undermining the objectives of a project, from “I know what’s best” to “I’m sticking to my existing goals until they are changed next year.” This is entirely rational from each employee’s perspective, but it is not what’s best for the organization, which must continuously improve by implementing new solutions.

The data expert who said it was easy

The fastest way to get a sense of the challenges associated with modern data analytics is to understand the magnitude of the data challenges – variety, volume, and velocity – and then the amount of time spent just getting the data ready to work.

Data scientists, for example, spend an estimated 70 to 80% of their time simply prepping data for analytics efforts, an experience shared by engineers in process manufacturing and other roles who work with large and complex data sets.

The result is that most estimates for the aggregation, cleansing, contextualization and modeling of data have a high degree of uncertainty built into them. This may work out well if the estimates are correct, but that doesn’t change the potential variability of the project timelines.

Further, this data setup work must happen first, before any actual analytics take place, which means the iteration of algorithms or investigation by engineers is all downstream of the “mature data” state. Additional issues can come from initial access to data sources and security concerns over where (the cloud?) and how the data will be used, and by whom.

The challenges of getting to a known good set of data to work with, and the variability implicit in the process, are hard to overstate. This means timelines will come under pressure along the way as even experienced data wranglers and analysts search for innovative ways to address data readiness challenges, as sketched in the example below.
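
To make the scope of this preparation concrete, here is a minimal sketch in Python with pandas of the kind of cleanup that typically precedes any analysis of process sensor data. The file name, column names, sampling interval, and validity limits are hypothetical and would vary by plant and historian.

```python
import pandas as pd

# Load a raw historian export (hypothetical file and column names).
raw = pd.read_csv("reactor_temps.csv", parse_dates=["timestamp"])

# Drop exact duplicate rows, which are common in exported data.
raw = raw.drop_duplicates()

# Index by time, sort, and resample onto a regular 1-minute grid so
# signals recorded at irregular intervals can be compared and joined.
df = (
    raw.set_index("timestamp")
       .sort_index()
       .resample("1min")
       .mean(numeric_only=True)
)

# Interpolate short gaps, but leave longer outages as missing values
# rather than silently inventing data.
df["temperature_c"] = df["temperature_c"].interpolate(limit=5)

# Flag physically implausible readings (sensor faults) so they are
# not analyzed as real process behavior; the limits are assumptions.
implausible = ~df["temperature_c"].between(-50, 500)
df.loc[implausible, "temperature_c"] = float("nan")

print(df.describe())
```

Each step is simple in isolation, but multiplied across dozens of signals, sources, and edge cases, this is the work that consumes the 70 to 80% of time cited above.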

The colleague with ownership

As mentioned in the data expert section, the complexity of the modern data environment yields a set of challenges related to getting data ready to work, which might mean crossing organizational boundaries to secure access, permission, and use of data from various departments or groups.

Moreover, in a typical organization, ownership of the various parts of an analytics project, including data acquisition, analysis, and corrective action, crosses organizational boundaries. This means perceptions of ownership and possible impact may create opposition to analytics projects.

Technology decisions like sensor vendor or wireless communication protocol may be restricted to one department, but not so for the rest of the data analytics project. 

It’s hard to imagine many interesting data analytics projects that don’t span organizational boundaries, and that don’t cross responsibility lines for employees. These are the very engagements required for success, as peers become partners rather than staying protective of whatever part of the project may impact them.

The executive with talking body parts

Executive engagement in analytics projects typically takes one of two forms: as the sponsor of a project, or as the final reviewer of the outcomes. In addition to engagement on a specific project, executives are typically under tremendous pressure to drive improvements in the organization through the adoption of innovations in data analytics technologies. Their bosses know there’s value in the data owned by the organization, and they are pressuring the executives below them to find it.

Many big data or Six Sigma initiatives are the result of this pressure to achieve better business results. But without prior interaction and active engagement of the right stakeholders, lack of project context and confidence can lead to an unpleasant outcome when the results of a data analytics project run counter to established orthodoxy or expectations.

Often the sponsoring executive, who up until that point has claimed to be data-driven, suddenly goes sideways and wants to rely on gut feelings and instincts. This is not a good place for anyone – the person driving the project, the project team, or the organization – because employees with talking body parts are never a sign of inquiry and confidence. 

This is a clear sign of a project in trouble. The data and analytics have gone one way and instinct another, to the point where there is a sizeable gap so uncomfortable for those responsible that they must stop the project, or at least delay it for further education and reconciliation with expectations.

Removing roadblocks

Analytics or business intelligence projects always begin with optimism: they are frequently initiated by employees with an enthusiasm for change and a sense of confidence that there must be a better way to create value from data (Figure 3). But for anyone engaged in, or who has completed, an analytics project, the personalities described above are potential roadblocks.

The keys to avoiding these obstacles are proactive engagement, a clear focus on business value and outcomes, realistic expectations for project timelines, and well-defined milestones for progress. It’s not as exciting to focus on these project execution details as it is to discuss the finer points of alternative new technologies, but they are the keys to success when implementing a data analytics project.

Figure 3, Seeq screen shot. Picking the right data analytics tool, like Seeq, can deliver clear visuals showing insights to enable improved outcomes.

About the Author

Michael Risse is a Vice President at Seeq Corporation, a company building innovative productivity applications for engineers and analysts that accelerate insights into industrial process data. He was formerly a consultant with big data platform and application companies, and prior to that worked with Microsoft for 20 years. Michael is a graduate of the University of Wisconsin at Madison, and he lives in Seattle.
