Industry 4.0, Manufacturing 4.0, Digital Transformation. Across industries, a handful of labels have emerged to encapsulate the massive opportunities and challenges associated with the flood of data coming from a variety of sources, and machines in particular.
In the consumer space, the data tsunami comes primarily from our smartphones and from home security and climate-control systems, with wearables on the cusp of gaining critical mass. We have also seen the threats these devices pose when data is misused or abused, which led to GDPR and a myriad of state data-privacy regulations. With every data breach in the news, the discussion is reinvigorated: who controls the data, what uses can we derive from it, and how much of it do we really need?
What you hear little about are the last two aspects: how the data is actually used, and whether there is enough of it to support decisions.
The same is true on the B2B side. Here, wearables monitor employees' health and performance to boost safety. Most often, though, organizations use sensor data to monitor equipment. This could take the form of real-time monitoring of production lines, such as measuring the yield of a set of industrial robots. Another use case is monitoring equipment during its uptime (operations) to see how it deteriorates under environmental factors not directly related to its inputs and outputs (e.g. temperature, tilt, humidity, pressure, distance). A further angle for companies to explore is the use of sensor data during the R&D (engineering) process. This could draw on field performance data, as mentioned earlier, but also on behavioral data captured in a more controlled environment to isolate root causes of failure during quality assurance.
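As a minimal illustration of this kind of condition monitoring (a sketch, not part of any specific product), the snippet below flags readings that drift sharply away from their recent baseline. The window size, the z-score threshold and the temperature trace are all illustrative assumptions:

```python
from statistics import mean, stdev

def flag_drift(readings, window=20, z_threshold=3.0):
    """Flag readings that sit more than z_threshold standard
    deviations away from the trailing window's baseline."""
    flags = []
    for i, value in enumerate(readings):
        baseline = readings[max(0, i - window):i]
        if len(baseline) < 2:
            # Not enough history yet to establish a baseline.
            flags.append(False)
            continue
        mu, sigma = mean(baseline), stdev(baseline)
        flags.append(sigma > 0 and abs(value - mu) > z_threshold * sigma)
    return flags

# A stable temperature trace with one sudden excursion (illustrative data)
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 27.5, 21.1]
print(flag_drift(temps, window=5))  # only the 27.5 reading is flagged
```

In practice the baseline would span far more samples and the threshold would be tuned per sensor, but the principle (compare each reading against its own recent history) is the same.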
To truly take advantage of all the promise Industry 4.0 holds, organizations need to stop treating machine and sensor data at the plant, project, study or even departmental level and start considering the longer-term, cross-enterprise benefits. They also need an agile methodology for rapid failure, so they do not waste resources on continued exploration of fruitless tests, especially when prior test data indicates a similar outcome. Even more important is the ability to trust the results, because the data used has been certified through communal (crowdsourced) vetting based on available reviews and source lineage. This implies a central repository, likely Hadoop-based, where developers, engineers and analysts work together to search for, augment, blend, annotate and reuse data sets.
Unlike the current understanding of "failure-as-a-service", which has more to do with cloud-based high-availability infrastructure, I dub this approach "failure-as-a-service 2.0", because this is ultimately what sensors deliver: they quickly tell us in which scenarios things are going sideways, and advanced (machine-learning) algorithms allow us to predict similar outcomes in the future and move on to the next problem. This lets our intelligent machines operate with little unscheduled downtime and at peak yield.
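The simplest form of such a prediction, sketched here with plain least-squares trend extrapolation rather than any particular machine-learning library, estimates when a wear indicator will cross its failure threshold. The wear readings and threshold below are illustrative assumptions:

```python
def estimate_cycles_to_failure(wear_readings, failure_threshold):
    """Fit a least-squares line through wear readings (one per cycle)
    and extrapolate when the trend crosses the failure threshold.
    Returns None if wear is flat or improving."""
    n = len(wear_readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(wear_readings) / n
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, wear_readings)) / denom
    if slope <= 0:
        return None  # no degradation trend to extrapolate
    intercept = y_mean - slope * x_mean
    crossing = (failure_threshold - intercept) / slope
    return max(0.0, crossing - (n - 1))

# Illustrative wear trace: 0.02 mm of wear per cycle, failure at 0.30 mm
remaining = estimate_cycles_to_failure([0.10, 0.12, 0.14, 0.16, 0.18], 0.30)
print(remaining)  # roughly 6 cycles of useful life left
```

Production predictive-maintenance models are of course far richer (multiple sensors, nonlinear degradation, survival models), but they answer the same question: how long until this trend crosses a failure boundary?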
IoT solutions like the one developed by our partner Integrationworx on top of the Informatica Intelligent Data Platform are a prime example of this approach. Gone are the days when teams of quality engineers and field techs took days or weeks to set up a sensor array on a piece of equipment, calibrate and test it, and wait for results, only to find out that the data sets were not meaningful or closely resembled a past test they were unaware of. More importantly, engineers can now work with their business counterparts to assess whether a new sensor output signals a need to change existing processes, given the prior test history captured in historians. These SWAT teams are now technically enabled to become citizen integrators for rapid prototyping projects.
This is a sharing economy, so share your meaningful data with other engineers, suppliers, customers or even competitors. Co-opetition scenarios often allow all participants to share some basic insight to improve the common good and reduce economic externalities, such as emissions. You could even develop a new revenue stream from this work and monetize your investment.
If you provide high-fidelity sensor data sets in context with verifiable pieces of other "core" information, such as the training and certifications of the staff who installed the equipment and sensors, the equipment supplier, the model and serial number, the location, and the environmental conditions during installation, operation and testing, you enable better decisions.
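As a sketch of what such a context record might look like as a data structure (all field names and values here are illustrative assumptions, not a reference to any actual product schema):

```python
from dataclasses import dataclass, field, asdict

@dataclass
class SensorDataSetContext:
    """Core context attached to a sensor data set so downstream
    consumers can judge its trustworthiness and provenance."""
    supplier: str
    model: str
    serial_number: str
    location: str
    installer_certifications: list = field(default_factory=list)
    environmental_conditions: dict = field(default_factory=dict)

# Illustrative values only
ctx = SensorDataSetContext(
    supplier="Acme Robotics",
    model="AR-200",
    serial_number="SN-0042",
    location="Plant 3, Line B",
    installer_certifications=["ISO 9001 installer"],
    environmental_conditions={"temp_c": 21.5, "humidity_pct": 40},
)
print(asdict(ctx)["serial_number"])  # prints SN-0042
```

Storing this context alongside the raw readings, rather than in a separate spreadsheet, is what makes the data set searchable and vettable later.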
This affects not only operational performance but also equipment design, fault testing, training, build-process redesign and regulatory reporting; and ultimately customer success, which means repeat business.
Stephan Zoder is Business Consultant at Informatica.