Lately I’ve been writing about standards used in Industrial Internet of Things applications and in related parts of the IoT ecosystem such as Big Data: REST, MQTT, and OPC.
Dennis Nash, president of Control Station, noticed the series and called about a Big Data and IoT application that uses Control Station software as part of a system for helping engineers optimize control loops in a process plant.
Even when a plant has installed and actively uses an advanced process control (APC) or model predictive control (MPC) application, loops eventually slip out of tune. These loops can cost a plant a lot of money even though the process generates no alarms and does not appear to be causing problems.
The system builds on historical loop-change data recorded in an OSI PI historian. The data flows into a model of a tuned loop. Says Nash, “The basis for our innovation starts with a unique ability to model highly variable process data (i.e., noisy, oscillatory data). We use a proprietary method that no one has successfully emulated.”
He continues, “With the ability to accurately model highly variable data, control loop performance monitoring (CLPM) tools like ours can capitalize on the 100s/1000s of output changes that happen every day. By aggregating the model data and comparing results with existing tuning parameters, CLPM tools are now getting into the Big Data game.”
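To make the idea concrete, here is a minimal, purely illustrative sketch of that comparison step: given a loop's current controller gains and gains recommended from a fitted process model, flag loops whose tuning has drifted beyond a threshold. The model-fitting step itself is proprietary in Control Station's case; the loop names, gain values, and threshold below are invented for illustration.

```python
# Illustrative sketch, not Control Station's method: compare current
# tuning with model-recommended tuning and flag drifted loops.

def tuning_drift(current, recommended):
    """Relative drift between current and recommended (Kc, Ti) pairs."""
    kc_cur, ti_cur = current
    kc_rec, ti_rec = recommended
    return max(abs(kc_cur - kc_rec) / abs(kc_rec),
               abs(ti_cur - ti_rec) / abs(ti_rec))

def flag_drifted(loops, threshold=0.5):
    """loops: name -> (current_gains, recommended_gains)."""
    return [name for name, (cur, rec) in loops.items()
            if tuning_drift(cur, rec) > threshold]

loops = {
    "FIC-101": ((2.0, 10.0), (0.8, 12.0)),  # Kc 150% above recommendation
    "TIC-205": ((1.0, 30.0), (1.1, 28.0)),  # close to recommended
}
print(flag_drifted(loops))  # ['FIC-101']
```

Run over hundreds or thousands of loops per day, even a crude score like this turns raw output-change data into a short, actionable list.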
Checking loop performance rarely hits an engineer’s to-do list, especially when there are hundreds or thousands of loops. This system sends a notification identifying the worst-actor loops, where action can actually improve plant efficiency and profits.
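The "worst actor" notification can be pictured as a simple ranking problem. The sketch below scores each loop by its error variance relative to a baseline and returns the bottom performers; the scoring metric and loop data are hypothetical, chosen only to show the shape of the workflow.

```python
# Hypothetical sketch: rank "worst actor" loops by a simple health score
# (error variance relative to a baseline) and pick the top offenders.

def rank_worst_actors(loops, top_n=3):
    """loops: dict mapping loop name -> (error_variance, baseline_variance)."""
    scored = []
    for name, (err_var, base_var) in loops.items():
        # 1.0 means performing at baseline; higher is worse.
        score = err_var / base_var if base_var > 0 else float("inf")
        scored.append((score, name))
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_n]]

plant_loops = {
    "FIC-101": (4.2, 1.0),   # badly oscillating flow loop
    "TIC-205": (1.1, 1.0),   # near baseline
    "PIC-310": (2.6, 1.0),
    "LIC-120": (0.9, 1.0),
}
print(rank_worst_actors(plant_loops, top_n=2))  # ['FIC-101', 'PIC-310']
```

The value is in the triage: the engineer only ever sees the two or three loops where retuning pays off.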
This system works in a one-off application. What if there are more applications in a plant? That is where interoperability and standards come into play. The point is not so much a standard within Control Station itself, but that an application such as Control Station’s can use standards and data interoperability to pull data from a variety of sources. The extensive use of these standards and data interoperability enables a continual push of innovation and process improvement.
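One way to picture that interoperability is a thin adapter layer that normalizes samples arriving over different standards (an MQTT payload, a historian REST response) into one common record a CLPM application could consume. The topic names, payload shapes, and field names below are invented for illustration; real OPC, MQTT, and PI Web API payloads differ.

```python
# Hypothetical sketch: normalize samples from different sources into one
# common record. Payload shapes are assumptions, not any vendor's format.
import json
from dataclasses import dataclass

@dataclass
class Sample:
    tag: str        # loop/point name, e.g. "FIC-101.PV"
    value: float
    timestamp: str  # ISO 8601

def from_mqtt(topic: str, payload: bytes) -> Sample:
    # Assume a JSON payload like {"v": 42.7, "t": "2024-05-01T12:00:00Z"}
    body = json.loads(payload)
    return Sample(tag=topic.replace("/", "."), value=body["v"], timestamp=body["t"])

def from_historian_row(row: dict) -> Sample:
    # Assume a REST historian returns {"Name": ..., "Value": ..., "Timestamp": ...}
    return Sample(tag=row["Name"], value=row["Value"], timestamp=row["Timestamp"])

s1 = from_mqtt("plant/FIC-101/PV", b'{"v": 42.7, "t": "2024-05-01T12:00:00Z"}')
s2 = from_historian_row({"Name": "TIC-205.PV", "Value": 180.4,
                         "Timestamp": "2024-05-01T12:00:05Z"})
print(s1.tag, s2.tag)  # plant.FIC-101.PV TIC-205.PV
```

Once every source lands in the same record shape, the analytics upstream never need to know which standard carried the data.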