Submitted by Andrew Waycott, COO & Co-Founder at TwinThread
A little bit of history: through the 1960s and ’70s, programmable logic controllers (PLCs) and automation technology were introduced and integrated into the manufacturing sector. At first, this evolution did not remove the human element from production. It wasn’t uncommon to have one operator physically stationed per asset. This operator, responsible for the proper function of their assigned asset, could easily oversee the full scope of operation as the asset automated the production process. When a box jammed in a machine, for example, the operator was there to unjam it - problem solved.
As industry opened itself up to technological infusion and trust grew from experience, the human element was steadily scaled back - at least in terms of proximity. Operators were relocated further and further from the assets they oversaw. This progress created something of a “one step forward, two steps back” scenario. When operators moved from the factory floor to a control room, they gained broader control and greater safety, but the spatial disconnect cost them direct situational awareness. When they were relocated again, operators could manage an entire facility’s assets instead of a single line, equipped with far more sophisticated control systems that fed them seemingly endless amounts of information.
The price of admission for these advancements, however, was further disconnection from real-time asset operation and factory conditions, along with the introduction of information overload. It seems counter-intuitive, but without anything tying together the wealth of data points, the technology was introducing risk by blindfolding operators in a way they had never been before.
Yes, the machine automation worked, and yes, operators could now control assets from a removed position. But with the situational awareness lost to this evolved approach and a flood of information that could not be meaningfully interpreted, it was a bit like riding a runaway train. The good news is that no one needs to accept the chaos this scenario invites. With the aligning technology of today, operators and engineers can accurately and immediately interpret performance across their entire operational spread, and even compare assets against one another to identify opportunities to optimize process and functionality.
At TwinThread, we believe predictive data centers will, in some ways, give operators back the situational awareness they’ve lost. Enabled to “listen” to the process through constant data analysis, they will once again know what’s going on with their assets - at a far deeper level. This reconstitution of awareness through accurately curated data will empower problem-solvers to make critical decisions with the best information their assets have to offer.
To activate your domain experts with the full extent of your data - get started now.