This series discusses the stark realisation that fewer than 10% of manufacturers are satisfied with their Industry 4.0 accomplishments.
Part one of this series criticised the ‘technology-first’ approach to implementing digital solutions, and argued that the complexity of broad Industry 4.0 solutions has led to delayed or non-existent ROI. Read on for part two, where we discuss how unrealistic expectations of Industry 4.0 technology can lead to further dissatisfaction.
What is Industry 4.0?
The Fourth Industrial Revolution, commonly termed Industry 4.0, is the cyber-physical transformation of manufacturing. Industry 4.0 is marked by technology breakthroughs such as ‘The Smart Factory’, ‘Communications Platforms’ and ‘Event-Driven Services’, and is transforming traditional manufacturing.
As mentioned in part one of this series, the terms used to describe Industry 4.0 are complex. The premise of a 'Smart Factory' is essentially this: the critical components and functions of a production facility are digitally modelled (using a digital twin and system modelling), along with the elements in the facility required to perform them (cyber-physical systems). These digital models are then used to control the equipment (via an industrial digital computer), enabling event-driven automation of production processes and services.
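The event-driven idea above can be sketched as a minimal control loop. This is a toy illustration under stated assumptions, not a real industrial control API; every name here (the event bus, the event type, the machine identifier) is hypothetical:

```python
# Minimal sketch of event-driven production control.
# All names are hypothetical; not a real industrial control system.
from collections import defaultdict

class EventBus:
    """Routes named plant events to registered handlers."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.handlers[event_type]:
            handler(payload)

actions = []

def on_temp_high(payload):
    # The digital model of the line decides how to react to the event.
    actions.append(f"throttle {payload['machine']} (temp={payload['temp']}C)")

bus = EventBus()
bus.subscribe("temperature_high", on_temp_high)

# A cyber-physical sensor reading crosses a threshold and raises an event.
bus.publish("temperature_high", {"machine": "press-7", "temp": 92})
print(actions)  # ["throttle press-7 (temp=92C)"]
```

The point is the decoupling: sensors only publish events, and the digital model decides what control action follows, which is what lets automation be driven by events rather than by fixed schedules.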
By digitally modelling the production facility, staff are able to input and retrieve data on the move using their mobile devices and wearables. They can also communicate with computers, making use of communications platforms and natural language processing.
Staff become empowered with resources and information that enhances their roles rather than eliminates them.
By working symbiotically alongside computers, production staff keep the baseline of expected performance updated with relevant data from across the factory. Machine learning and predictive analytics can then control and improve production processes, detect and predict mechanical failures on site (prognostics and health management, PHM) and even trigger a maintenance event (predictive or condition-based maintenance, PdM/CBM).
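A minimal sketch of how such failure detection can work, assuming a simple rolling-baseline threshold rule (a toy illustration, not a production PHM algorithm; the sensor values are invented):

```python
# Sketch of condition-based anomaly detection for predictive maintenance.
# Toy threshold rule: flag readings far above the recent baseline.
from statistics import mean, stdev

def detect_anomalies(readings, window=5, k=3.0):
    """Flag indices where a reading exceeds the rolling baseline
    (mean of the previous `window` readings) by more than k std devs."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if readings[i] > mu + k * max(sigma, 1e-9):
            flagged.append(i)
    return flagged

# Normal vibration levels, then a spike suggesting imminent failure.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.1, 0.95, 5.2, 1.0]
print(detect_anomalies(vibration))  # [7] — the index of the spike
```

Real PHM systems replace this threshold rule with learned models of normal behaviour, but the principle is the same: a continuously updated baseline, and an event raised when live data departs from it.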
Hype or Hope
Disillusionment occurs when the expectation of a particular technology exceeds its level of maturity or capability. Gartner, a global research and advisory firm, has published a graphical representation of the gap between the expectations of a technology and its actual outcomes. Gartner has termed this graph the "Hype Cycle".
The Hype Cycle illustrates that expectations of new technology and innovative products follow a predictable pattern throughout their life cycle. The pattern begins with the 'Technology Trigger', when a product is first released. At this point expectations for the technology are low, as only those close to and knowledgeable about the innovation are making judgements, and these are generally fact-based.
Expectations then rise dramatically during the 'Peak of Inflated Expectations', as general consumers hear of the innovative product and begin to blur what can be achieved with what they expect. Once the realisation sets in that the product will not live up to consumers' unrealistic expectations, the 'Trough of Disillusionment' follows. Only then do expectations and product capability slowly converge during the 'Slope of Enlightenment', when realism and understanding of the new technology begin.
Unfortunately, today, many Industry 4.0 technologies sit at the 'Peak of Inflated Expectations'. The Hype Cycle is used to help a company decide whether to invest time and resources into one of these technologies now, or to wait until it begins its path up the 'Slope of Enlightenment'. The hype causes confusion, but it does not imply that these technologies are without real benefit; rather, it warns that judging a technology by the expectations around it, rather than by how mainstream-ready it is, will lead to disappointment.
At the opposite end of the spectrum is the idea that Industry 4.0 is nothing new. I mean... hasn't the manufacturing industry been digitising, automating, simulating and monitoring for years?
The answer to this turns out to be “Yes”. Many of the central concepts behind Industry 4.0 have roots tracing back well into the 20th century.
The digital twin relies on simulation, whose origins date back to the 1940s; AI originated in the 1950s. Data has been collected for over 50 years. Three-dimensional printing, in the form of stereolithography, dates back to the late 1980s, and WiFi (802.11) to the late 1990s.
So how is Industry 4.0 considered new and innovative when this technology has been around for years?
Prerequisite Technologies Arrived
It is worth looking, therefore, at what may have changed to make these technologies deliver more today than they have in the past. For a technology to move from possible to practical, a number of prerequisite technologies are always needed for it to actually work. Consider the smartphone. Before it could exist, it needed computer chip miniaturisation, touch screen technology, satellites, and rockets to launch those satellites so Google™ Maps could work. It needed network technology, voice and video compression technology and much more. Most of these innovations were not created for the smartphone, but were prerequisites for its existence. Once they all existed, the smartphone was almost inevitable (Kevin Kelly, The Inevitable).
The Internet of Things is just "things" without the internet.
In manufacturing, networked factories are the fundamental glue, and they have only recently become a reality; many companies are yet to start even today. Clearly, if devices are to talk to the office and to each other, they need a way to communicate. The evolution of robust, industrial-strength WiFi removed a major cost barrier, especially in clean environments, which may otherwise need stainless steel conduit everywhere you want a “blue wire”.
Machine Learning Matures
Another big change in manufacturing has been the maturing of categories of AI, such as machine learning, as the field moved from the 'AI winter' into spring. Since the 1960s, experts have predicted the imminent arrival of fully cognitive AI systems, with the forecast slipping to the 70s, then the 80s. It was not until the mid-2000s, with the availability of expensive graphics GPUs, that momentum began to grow. This in turn spawned better machine learning algorithms, and it all accelerated from there.
Some forms of machine learning have undergone year-on-year incremental improvement until they reached a threshold of performance better than a human's, and therefore became useful. In this situation there is no breakthrough leap. Rather, one day it is a novelty and the next day a useful tool.
Consider the machine learning applications used for image recognition, or for voice recognition and interpretation. Early versions of both were very limited and not useful, and hence tended to be dismissed.
It was, for example, difficult to create a general machine learning image recognition solution to reliably distinguish between a puppy and a muffin or cookie.
But incrementally, over what may be years, the technology gets better and better until one day it falls within the human error range, at which point usage of the capability explodes.
Siri, Alexa, Cortana and equivalent systems went from being useless toys to attaining a level of performance where many writers already find dictation the most efficient way to create documents, and home assistants now function at very high levels.
Soon we will be saying the same of AI and manufacturing facilities. Then we will have a truly 'Smart Factory'.
We might be in the Twilight Zone of Industry 4.0: some things are real, and some, practically speaking, are still fantasy. Keeping your company moving towards Industry 4.0 means focusing on high-value solutions attainable with today's technology. This takes effort. At the same time, an experimental mindset, one that patiently keeps pace with the evolution of technology and leaps only when innovations become practical, is more important than ever. Ensuring that your company doesn't fall prey to inflated expectations while remaining open to innovation and exploring options that could save time and resources isn't easy, but it is fun.
Roland Thomas is the