29-08-2018

The Democratization of the Internet of Things


Industry 4.0, Manufacturing 4.0, Digital Transformation. Across industries, we've seen a number of labels coined to try to encapsulate the massive opportunities and challenges associated with the flood of data coming from a variety of sources, and in particular from machines.

In the consumer space, the data tsunami is primarily coming from our smartphones and from home security and climate control systems. Wearables are also on the cusp of gaining critical mass. We have also seen the threats these devices pose when data is misused or abused. This led to GDPR and a myriad of state data privacy regulations. With every data breach in the news, the discussion is reinvigorated: who controls the data, what value can we derive from it, and how much of it do we really need?

What you hear little about are the last two aspects: the use of the data and the critical mass needed to make decisions.

The same is true on the B2B side. Here, wearables monitor employees' health and performance to boost safety. Most often, though, organizations use sensor data for monitoring equipment. This could take the form of real-time monitoring of production lines, measuring the yield of a set of industrial robots. Another use case is monitoring equipment during its uptime (operations) to see how it deteriorates under certain environmental factors not directly related to its inputs and outputs (e.g. temperature, tilt, humidity, pressure, distance). Another angle for companies to explore is the use of sensor data during the R&D (engineering) process. This could use field performance data, as mentioned earlier, but also behavioral data in a more controlled environment to isolate root causes of failure during the quality assurance process.
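To make the equipment-monitoring idea concrete, here is a minimal sketch in Python of what such a telemetry record and a basic out-of-band check might look like. The field names, operating limits, and the flag_anomalies helper are illustrative assumptions, not part of any specific product.

```python
from dataclasses import dataclass

# Hypothetical telemetry record mirroring the environmental factors
# mentioned above (temperature, tilt, humidity, pressure, distance).
@dataclass
class SensorReading:
    machine_id: str
    timestamp: float        # Unix epoch seconds
    temperature_c: float
    tilt_deg: float
    humidity_pct: float
    pressure_kpa: float
    distance_mm: float

# Illustrative operating limits; real limits come from the equipment spec.
LIMITS = {
    "temperature_c": (5.0, 70.0),
    "tilt_deg": (-2.0, 2.0),
    "humidity_pct": (10.0, 85.0),
    "pressure_kpa": (90.0, 110.0),
    "distance_mm": (0.0, 500.0),
}

def flag_anomalies(reading: SensorReading) -> list[str]:
    """Return the names of any factors outside their operating band."""
    out_of_band = []
    for factor, (lo, hi) in LIMITS.items():
        value = getattr(reading, factor)
        if not lo <= value <= hi:
            out_of_band.append(factor)
    return out_of_band
```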

To explore and really take advantage of all the promise Industry 4.0 holds, organizations need to stop treating machine/sensor data at a plant, project, study, or even departmental level and start considering the longer-term, cross-enterprise benefits. They also need to develop an agile methodology for failing fast, so they do not waste resources on continued exploration of fruitless tests, especially if prior test data indicates a similar outcome. Even more important is the ability to trust the results, because the data used has been certified via communal (crowdsourced) vetting based on available reviews and source lineage. This calls for a central repository, likely Hadoop-based, where developers, engineers, and analysts work together to search for, augment, blend, annotate, and reuse data sets as they work with them.
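What might a communally vetted catalog entry in such a repository capture? Here is a minimal sketch; the entry structure, the review scale, and the certify thresholds are assumptions made purely for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical catalog entry for a shared data set; names are illustrative
# and not tied to any specific product's metadata model.
@dataclass
class DataSetEntry:
    name: str
    source_lineage: list[str]                         # upstream systems / files
    annotations: list[str] = field(default_factory=list)
    reviews: list[int] = field(default_factory=list)  # 1-5 star peer ratings

    def certify(self, min_reviews: int = 3, min_avg: float = 4.0) -> bool:
        """Treat a data set as trusted once enough peers have vetted it."""
        if len(self.reviews) < min_reviews:
            return False
        return sum(self.reviews) / len(self.reviews) >= min_avg

# Usage: a data set becomes decision-grade only after communal vetting.
entry = DataSetEntry("robot_cell_yield_q2", ["plant3/line_b/plc_export.csv"])
entry.reviews.extend([5, 4, 5])
print(entry.certify())  # True
```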

Unlike the current understanding of "failure-as-a-service", which has more to do with cloud-based high-availability infrastructure, I dub this approach "failure-as-a-service 2.0", because this is ultimately what sensors deliver: they are supposed to tell us quickly in which scenarios things are going sideways, and advanced (machine learning) algorithms allow us to predict similar outcomes in the future and move on to the next problem. This lets our intelligent machines run with little unscheduled downtime and operate at peak yield.
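As a rough illustration of the predictive piece, here is a minimal sketch using scikit-learn on synthetic data. The feature set (temperature, humidity, pressure, vibration) and the failure labels are invented for the example; a real model would be trained on historical sensor readings paired with recorded failures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical sensor readings: each row is
# (temperature, humidity, pressure, vibration); labels mark past failures.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train a classifier to recognize the "going sideways" scenarios...
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# ...then score new readings so maintenance can be scheduled before failure.
failure_risk = model.predict_proba(X_test)[:, 1]
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")
```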

IoT solutions like the one developed by our partner Integrationworx on top of the Informatica Intelligent Data Platform are a prime example of this approach. Gone are the days when teams of quality engineers and field techs took days or weeks to set up a sensor array on a piece of equipment, calibrate and test it, and wait for results, only to find out that the data sets were not meaningful or were very similar to a past test they were unaware of. More importantly, engineers can now work with their business counterparts to assess whether a new sensor output signals a need to change existing processes, given the prior test history captured in data historians. These SWAT teams are now technically enabled to act as citizen integrators on rapid prototyping projects.

This is a sharing economy, so share your meaningful data with other engineers, suppliers, customers, or even competitors. Co-opetition scenarios often allow all participants to share some basic insight to improve the common good and reduce economic externalities, such as emissions. You could even develop a new revenue stream from this work and monetize your investment.

If you provide high-fidelity sensor data sets in the context of verifiable pieces of other "core" information, such as the training and certifications of the staff who installed the equipment and sensors, the equipment supplier, the model and serial number, the location, and the environmental conditions during install, operation, and test, you enable better decisions.
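The sketch below illustrates that enrichment step in Python. The asset record, field names, and the certification check are all hypothetical; the point is simply that telemetry without trusted "core" context should not drive automated decisions.

```python
# Hypothetical master-data record holding the verifiable "core" context
# named above: installer certification, supplier, model/serial, location,
# and conditions at install time.
asset_master = {
    "PUMP-0042": {
        "supplier": "Acme Pumps",
        "model": "AP-900",
        "serial": "SN-18-7741",
        "location": "Plant 3 / Line B",
        "installer_certified": True,
        "install_temp_c": 21.5,
    }
}

reading = {"machine_id": "PUMP-0042", "temperature_c": 74.2}

# Blend telemetry with master data; readings lacking trusted context
# are routed for review instead of feeding automated decisions.
context = asset_master.get(reading["machine_id"])
if context and context["installer_certified"]:
    enriched = {**reading, **context}
    print("Decision-ready record:", enriched)
else:
    print("Insufficient context; queue for data steward review.")
```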

This improves not only operational performance but also informs changes in equipment design, fault testing, training, build-process redesign, and regulatory reporting; and ultimately it drives customer success, which means repeat business.

Stephan Zoder is Business Consultant at Informatica.
