
How to turn IoT data science into real business value

You’ve succeeded in connecting your devices, which are now dutifully publishing IoT data to the cloud. Now you’re in a position to remotely monitor and control your equipment in a production environment.  Congratulations, that’s no small accomplishment! However, you have bigger plans for your IoT investment. You know there are valuable insights to be gained by applying data science to your burgeoning store of bits and bytes that have the potential to significantly affect your bottom line.

You’re on to something.  

Having worked in industrial IoT for over a decade, we at Bright Wolf have become increasingly convinced that much of the value of IoT technology lies beyond the basic connect-and-monitor scenario, in the ability to analyze IoT data to improve products and operations as well as discover new business opportunities.

There’s gold in them thar’ IoT data hills.  So how do we get it?  

At Bright Wolf, we’ve seen firsthand the astonishing percentage of IoT initiatives that end up either discontinued or in pilot purgatory. It’s not because the technology isn’t really cool (it is), but rather because the initiatives fail to demonstrate any real or potential return on investment. For this reason, we have a pretty unwavering commitment to a methodology we call Zero Waste Engineering™. The basic idea is to employ an iterative discovery process that doesn’t break the bank while proving out the value of your initiative. While this seems like common sense, many industrial organizations are trapped in legacy patterns of development resulting in IoT project failures like the $3 million spreadsheet and other common missteps on the path toward digital transformation.

With Zero Waste Engineering™ as a backdrop, and an eye toward better business outcomes, let’s walk through a proven approach for applying data science to industrial IoT systems.

Define the goal

With a nod to Stephen Covey, it’s imperative to start with the end in mind. If you don’t know where you’re going, you’re unlikely to get there. It’s important to clearly define the problem you are trying to solve, or the specific insight you’re looking for, and it’s equally important that the solution aligns with and supports your business goals.   

A common IoT data science application is predictive maintenance, an effort aimed at eliminating expensive, disruptive, or catastrophic failures. Other common use cases include lowering operational costs, improving products and services, and providing better customer support.

Build your coalition

Finding and operationalizing insights to improve your business or enterprise requires a team; you can’t do this work alone, even if you are wearing your superhero cape. You obviously need a data scientist, but it’s critical to identify and enlist others to ensure the best chance of success.

Your coalition will certainly include management at some level; the project sponsor or champion is responsible for defining the business goals. Of equal importance are subject matter experts who have an in-depth understanding of the equipment or process you’re focusing on. They could be plant managers, operators, or maintenance personnel. Your sales and customer support organizations are likely to provide valuable insights here as well. Product owners and engineers should also have a seat at the table.

Finally, be sure to include information technology (IT) leaders, and involve them early. They will be instrumental in accessing the data needed for your research, and are likely to be involved in anything you ultimately deploy resulting from your work.

Generate your hypotheses

Data science is just that – science, and this is how science is done. It’s an iterative process of observation-based “guess and check”. Your hypotheses will emerge from the collaborative efforts of your coalition, and you will generate and prove or disprove many hypotheses over the course of your investigations.

Identify data requirements

Data science generally requires lots of data. For IoT applications, this will be both device data and the reference data needed for context. Identify the device data you need, how much history is required, where it resides, and how to access it. Are there additional sensors that need to be added to your installations, perhaps temperature, humidity, or vibration? There are inexpensive off-the-shelf sensors you can start with for initial investigations. Your IT team and device engineers are your friends, and will help you with this.

You’ll use other data sources to contextualize the IoT data. Enterprise data from your ERP, CRM, and other systems may be relevant. The same goes for external data sources such as weather, population, and energy or fuel pricing. For example, a predictive maintenance use case might require equipment age and model – information that resides in a separate sales system.
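As a minimal sketch of what this contextualization can look like, the example below joins device telemetry with asset records in pandas. The file names, column names, and the 30-day asset registry schema are all hypothetical, stand-ins for whatever your ERP, CRM, or sales systems actually provide.

```python
import pandas as pd

# Hypothetical telemetry export: one row per device reading
telemetry = pd.read_csv("device_telemetry.csv", parse_dates=["timestamp"])
#   device_id, timestamp, temperature_c, vibration_rms, ...

# Hypothetical reference data pulled from a sales or asset system
assets = pd.read_csv("asset_registry.csv", parse_dates=["install_date"])
#   device_id, model, install_date, site, ...

# Join each reading with its context so sensor values carry
# the equipment model and age alongside them
df = telemetry.merge(assets, on="device_id", how="left")
df["equipment_age_days"] = (df["timestamp"] - df["install_date"]).dt.days
```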

Collect your data

Depending on the goal of your effort, collecting the data for data science can be a considerable undertaking that requires IT and data architect support. If your organization has already invested in an IoT data pipeline platform as discussed previously in this series, this will be a much easier task.

Choose your analysis tools

Your data scientist probably has a preferred set of tools, so have them start with those unless they are cost prohibitive. There are many good options you can download for free and run locally, such as the individual edition of Anaconda’s data science suite. High-powered data science machines and services can be spun up and accessed as needed via the public cloud providers, eliminating the need to invest in specialized hardware and software. Both AWS and Azure also offer managed services for machine learning that provide a suite of tools for data science.
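Getting started in a local environment can be as simple as a quick exploratory pass over the data in a notebook. The sketch below assumes a default Python/pandas setup such as a stock Anaconda install; the file, column, and device names are hypothetical.

```python
import pandas as pd

# First look at the data in a local notebook: ranges, gaps, device counts
df = pd.read_csv("device_telemetry.csv", parse_dates=["timestamp"])

print(df.describe())                       # value ranges and missing data
print(df["device_id"].nunique(), "devices reporting")

# Resample one device's temperature to hourly averages to spot trends
one = df[df["device_id"] == "unit-001"].set_index("timestamp")
print(one["temperature_c"].resample("1H").mean().head())
```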

Do your investigations

Data investigations and experiments are the heart of the discovery process. This is where you take a ton of data and (hopefully) turn it into game-changing insights to transform your business. This will take time. Be prepared to iterate, and do expect failures. This is normal. Your goal is to prove or disprove each hypothesis and ultimately get to your business value as quickly as possible.

Note: it’s crucial to have a skilled data scientist with a thorough understanding of the math and methods doing the investigations. It’s very easy to use the tools to hack together something that lets you jump to the wrong conclusions at lightning speed, with numbers that seemingly back them up. It is a much more difficult task to produce an accurate, meaningful, quality model that informs the matter at hand, but that’s what you’re after.
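To make that concrete, here is a minimal sketch of testing a predictive maintenance hypothesis with scikit-learn. The dataset, feature names, and failure label are hypothetical; the point is the discipline of holding out data and cross-validating before believing a promising number.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import classification_report

# Hypothetical labeled dataset: sensor features plus a column marking
# whether the unit failed within the following 30 days
df = pd.read_csv("labeled_telemetry.csv")
features = ["temperature_c", "vibration_rms", "equipment_age_days"]
X, y = df[features], df["failed_within_30d"]

# Hold out data the model never sees, so results reflect more than
# memorization of the training set
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
print("Cross-validated accuracy:",
      cross_val_score(model, X_train, y_train, cv=5).mean())

model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```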

Verify results

When you think you’ve found a difference maker, it’s time to test your newly discovered insight to determine if it delivers the desired impact. At this point you’re not looking to operationalize anything, you’re just performing a trial to validate your conclusions.  

As with all experiments, you need to quantify assumptions and expectations. Start by defining and baselining key performance indicators (KPIs). Apply your newly found insights to the problem or process at hand. You may have a friendly customer who is willing to participate. Monitor KPIs. Is this experiment delivering the expected results? Adjust and repeat as necessary until you’ve either proved and refined, or disproved your theory. Remember, this is an iterative process.
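A baseline-versus-trial KPI comparison can be as simple as the sketch below. The maintenance log, KPI column, and trial start date are assumptions for illustration; substitute whichever KPIs your coalition agreed to track.

```python
import pandas as pd

# Hypothetical maintenance log with one KPI value per unit per month,
# e.g., unplanned downtime hours
log = pd.read_csv("maintenance_log.csv", parse_dates=["month"])

trial_start = pd.Timestamp("2021-01-01")   # assumed start of the trial
baseline = log[log["month"] < trial_start]["downtime_hours"]
trial    = log[log["month"] >= trial_start]["downtime_hours"]

print(f"Baseline mean downtime: {baseline.mean():.1f} h/unit/month")
print(f"Trial mean downtime:    {trial.mean():.1f} h/unit/month")
print(f"Change: {100 * (trial.mean() - baseline.mean()) / baseline.mean():+.1f}%")
```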

It’s important to incorporate coalition members into the feedback loop. Share successes and failures. Even seemingly innocuous results may mean something to stakeholders and may inform subsequent activity.

Operationalize

Once you’ve proven your hypothesis and confirmed, at least on a small scale, that it will achieve your goal and deliver the expected return on investment, the next step is to determine what to operationalize and how.

This is a project in its own right. It’s worth a second mention: involve IT early. Whatever you ultimately deploy will likely be integrated with your existing systems and infrastructure.

Azure Machine Learning provides tools that make it fairly simple to create, manage, and deploy ML models. AWS offers a similar service, SageMaker, with comparable capabilities. We’ll dive deeper into ML tools in the next post.
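Whatever platform you land on, operationalizing usually means packaging the validated model behind some kind of prediction interface. As a minimal, platform-neutral sketch (not the Azure ML or SageMaker approach specifically), the example below loads a model saved with joblib and serves it from a small Flask endpoint; the model file, feature names, and route are hypothetical.

```python
import joblib
import pandas as pd
from flask import Flask, jsonify, request

# Load the model validated during the investigation phase
# (saved earlier with joblib.dump(model, "failure_model.joblib"))
model = joblib.load("failure_model.joblib")
features = ["temperature_c", "vibration_rms", "equipment_age_days"]

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body with one value per feature, e.g.
    # {"temperature_c": 71.2, "vibration_rms": 0.9, "equipment_age_days": 412}
    row = pd.DataFrame([request.get_json()], columns=features)
    return jsonify({"failure_risk": float(model.predict_proba(row)[0, 1])})

if __name__ == "__main__":
    app.run(port=8080)
```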

To learn more, contact us today and we’ll be happy to share a few best practices and provide an initial evaluation for how we can help you achieve your goals.

About Bright Wolf

Bright Wolf helps industrial enterprises increase business value by transforming operations and organizations with digital strategy, technology, solution delivery, and team enablement.
