
TwinThread Launches Perfect Batch

TwinThread is one of those smallish software companies in an interesting niche that I can’t believe has yet to find a buyer. Back in 2020, I quoted noted software developer and LinkedIn commentator Rick Bullotta extolling the value of the AVEVA/TwinThread link after AVEVA’s purchase of OSIsoft. Just last year, I wrote about a stronger partnership between the two.

I see the company has pivoted a bit, now proclaiming itself “the world’s first to have a complete Industrial AI platform.” I’ll leave that proclamation to your judgment. But the product looks worth checking out.

Last week’s news involved TwinThread releasing Perfect Batch, an AI-powered manufacturing analytics solution targeting batch processes. The product empowers manufacturers to standardize and consistently replicate their best-performing or “golden” batches.

Perfect Batch applies industrial AI to dynamically identify ideal batch profiles from historical data and actively recommend actions for increasing efficiency. This enables organizations to rapidly shift from reactive firefighting to proactive optimization in a matter of weeks – not months or years.

Perfect Batch At-a-Glance:

  • Rapid Speed to Value: Perfect Batch connects to existing batch execution systems and automatically interrogates past data to build digital twins and apply models in hours.
  • Dynamic Perfect Profile Learning: Instead of setting limits manually, Perfect Batch dynamically learns ideal control limits and process centerlines, based on actual process capability and historical performance.
  • Unlocked Hidden Capacity: Granular cycle time analysis identifies bottlenecks and lost production time, facilitating capacity improvements from existing assets without new capital investment.
  • Optimization by Exception: Automated alerting and issue diagnosis empowers operations teams to focus on solving problems, without getting bogged down with endless troubleshooting and investigations.
  • Optional Closed-Loop Action: Thread Builder, a real-time workflow engine that works with Perfect Batch, automates anomaly responses, performs automatic diagnoses, and can trigger specific corrective actions automatically.
  • Automated Compliance: Tailored for regulated industries, Perfect Batch provides automated material tracking, quality and yield conformance, and audit-ready histories.
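
TwinThread doesn’t publish its algorithms, but the “dynamic perfect profile learning” idea described above is essentially statistical process control: derive centerlines and control limits from historical golden batches rather than setting them by hand. A minimal sketch of that concept (hypothetical data and function names, not TwinThread’s implementation):

```python
from statistics import mean, stdev

def learn_profile(golden_batches):
    """Derive a centerline and +/-3-sigma control limits per time step
    from historical 'golden' batch profiles (e.g., temperature traces)."""
    profile = []
    for step in zip(*golden_batches):  # align batches by time step
        m, s = mean(step), stdev(step)
        profile.append((m, m - 3 * s, m + 3 * s))
    return profile

def excursions(batch, profile):
    """Return the time steps where a running batch leaves the learned band."""
    return [i for i, (value, (_, lo, hi)) in enumerate(zip(batch, profile))
            if not lo <= value <= hi]

golden = [
    [50.0, 61.0, 70.2, 80.1],
    [50.4, 60.5, 70.0, 79.8],
    [49.8, 60.8, 69.7, 80.3],
]
profile = learn_profile(golden)
print(excursions([50.1, 60.7, 74.0, 80.0], profile))  # step 2 is out of band
```

The point of “learning” rather than hand-setting limits is that the band reflects what the process actually did when it ran well, not what an engineer guessed it should do.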

Beyond the plant floor, Perfect Batch helps drive strategic and collaborative alignment across an organization’s entire manufacturing portfolio by providing a global view of asset utilization and batch-making performance. The platform thus serves as a single source of truth for cross-functional teams: a common lens that operations, engineering, and supply chain leaders can all use to identify, prioritize, and proactively execute improvement initiatives that optimize the deployment of capital across the supply network.

Click on the Follow button at the bottom of the page to subscribe to a weekly email update of posts. Click on the mail icon to subscribe to additional email thoughts.

Buggy AI-generated Code

Notes and news about AI continue to build in my pending folder. Too many to sort through. I’ll start with this one, a news item from Morning Brew, one of my daily news feeds: “The AI…it filled the code with bugs.”

The amount of bugs popping up in AI-generated code is reaching the loose Sour Patch Kids under a camper’s bunk level. Amazon’s e-commerce senior VP, Dave Treadwell, called an all-hands for engineers at the company yesterday to address the growing frequency of outages, some of which can be traced back to code developed by generative AI, according to the Financial Times.

It continues…

  • Last week, Amazon’s store malfunctioned for a few hours, which the company attributed to “a software code deployment.”
  • And Amazon’s cloud services unit, AWS, had at least two large outages recently related to AI coding assistants. In December, the company’s cost calculator was down for 13 hours when Kiro, its AI coding tool, tried to change the code, and delete and remake the entire system.
  • Though Amazon downplayed the meeting as routine in comments to the FT, the paper reported that Treadwell told employees that senior engineers will now need to sign off on AI-assisted changes made by junior and mid-level engineers.

Solutions?

An expensive solution. Anthropic rolled out a review tool yesterday in Claude Code to (hopefully) catch those vibe-coded mistakes—but with each pull request costing up to $25, it may get pricey fast.

Coinciding with this news, I received a PR pitch to interview Pramin Pradeep, CEO of BotGauge AI. I receive this sort of thing many times daily. Pradeep supposedly wanted to talk about “shadow code” left behind, perhaps maliciously, in AI-generated code.

I asked for something in writing. They sent the usual PR piece that mentions shadow code but switches the topic to cybersecurity and then cites an irrelevant “case study.”

However, BotGauge AI does participate in a market called AI-assisted QA for code (Claude told me they rank 20 out of 128 in that market, for what it’s worth).

My research suggests that the basic problem comes from how the LLM was trained. If trained too broadly, it will tend to bring in superfluous code. Managers, meanwhile, are discovering that while coders may save development time using LLMs, the task of checking and approving the output is becoming onerous.

If you’re using LLMs to help code, it would probably pay to check out companies like BotGauge AI for automated QA.
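
What these QA vendors actually do is proprietary, but a first line of defense against superfluous AI-generated code can be as simple as static analysis. A naive sketch using Python’s standard `ast` module to flag unused imports, one common form of LLM bloat (the snippet being checked is made up for illustration):

```python
import ast

def unused_imports(source: str) -> list[str]:
    """Flag imported names that never appear elsewhere in the module --
    one naive check for superfluous code in an AI-generated file."""
    tree = ast.parse(source)
    imported, used = {}, set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                imported[alias.asname or alias.name.split(".")[0]] = node.lineno
        elif isinstance(node, ast.ImportFrom):
            for alias in node.names:
                imported[alias.asname or alias.name] = node.lineno
        elif isinstance(node, ast.Name):
            used.add(node.id)
    return [name for name in imported if name not in used]

snippet = """
import os
import json

def load(path):
    with open(path) as f:
        return json.load(f)
"""
print(unused_imports(snippet))  # ['os']
```

Commercial tools go much further (test generation, behavioral diffing), but even this level of automation takes some of the review burden off a senior engineer.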


Velotic Launches Combining Former Proficy, Kepware, ThingWorx

I thought something like this would happen when asset management firm TPG scarfed up some castoff small software divisions of larger companies. They’ve brought GE Vernova’s former Proficy business and PTC’s former Kepware and ThingWorx businesses together into a new company. 

From the press release (which must mention AI to meet today’s standards), Velotic will provide new levels of AI-driven manufacturing efficiency, productivity, and data visibility.

The company will be led by Brian Shepherd (CEO) and James Heppelmann (Executive Chairman). Shepherd was formerly at Rockwell Automation and PTC. Heppelmann led PTC.

My observation is that this company will have a tough go competing against Inductive Automation (yes, they sponsor me, but don’t tell me what to write, and I like their continual innovation). Then there are established companies such as AVEVA (Wonderware, etc.) and Rockwell Automation (FactoryTalk etc.). 

This will be interesting to watch.

Velotic today announced its launch as a leading independent industrial software company, uniting multiple trusted platforms to advance a new era for industrial and manufacturing technology. The formation of Velotic coincides with the closings of TPG’s previously announced acquisitions of Proficy, the former manufacturing software business of GE Vernova, and PTC’s former industrial connectivity and Internet of Things (IoT) businesses. Backed by TPG, Velotic delivers a leading suite of data-driven solutions focused on improving processes by unlocking efficiency, enhancing productivity, and providing visibility across complex data and industrial operations.

The obligatory marketing justification geared not to you, the prospect or user, but to market analysts.

Velotic is purpose-built to meet the rapidly evolving productivity and data needs of manufacturing operators across the globe with a focus on creating next-generation, AI-powered industrial and manufacturing software solutions. By bringing together Proficy’s automation and production management expertise with Kepware’s industrial connectivity leadership and ThingWorx’s best-in-class industrial data and analytics applications, Velotic will provide customers with greater visibility, unparalleled insight, and the robust data and AI capabilities needed to produce and compete in today’s complex manufacturing environment.

Status.

Based in the Boston area, Velotic has more than $300 million of revenue and serves customers across manufacturing, oil & gas, utilities, and infrastructure. Proficy, Kepware, and ThingWorx will remain as distinct product lines within the broader Velotic portfolio, now operating under one mission and platform.


ABB Robotics Partners with NVIDIA to Deliver Industrial-Grade Physical AI at Scale 

I walked into my local Starbucks this morning for my usual Doppio Espresso with cinnamon powder. I told my barista I was about to listen to a press conference on “physical AI.” “What do you think that is?” I asked her. “I don’t know. Maybe something like robots?” she countered. She saved me doing a deep dive with my buddy Claude.

The press conference was ABB Robotics and NVIDIA announcing an expansion (for a fee) of ABB’s RobotStudio software to incorporate AI models, establishing a new product called RobotStudio HyperReality, coming to a computer near you in a few months.

  • ABB Robotics integrates NVIDIA Omniverse libraries into RobotStudio to deliver physical AI for industry, closing the gap from virtual training to real-world deployment with up to 99% accuracy
  • New RobotStudio HyperReality, available second half of 2026, will fundamentally change how quickly and reliably manufacturers can scale production, reducing costs by up to 40% and accelerating time-to-market by 50%  
  • Full range and breadth of industrial applications, with real-world pilot being conducted by Foxconn in consumer electronics assembly 
  • At NVIDIA GTC, the robotic workforce company WORKR will showcase how it’s using the solution to help manufacturers across the U.S. address critical labor shortages  

The collaboration focuses on combining ABB Robotics’ software programming, design and simulation suite, RobotStudio, with the physically accurate simulation power of NVIDIA Omniverse libraries to close technology’s long-standing ‘sim-to-real’ gap. Developers can simulate robots in digital twins and generate synthetic data to train their physical AI models, enabling businesses of all types and sizes to deploy AI-driven robotics for various industrial workflows.  

Called RobotStudio HyperReality, the resulting physically accurate simulations and foundation models are endlessly optimized with real-world data feedback continuously improving the system. These models can be used to train any number of ABB robots, anywhere in the world, with the reliability and accuracy demanded by industry.   

The long-standing deficit between simulation accuracy and real-world lighting, materials and environments is known as the ‘sim-to-real’ gap. For decades, this gap has limited the ability of manufacturers to design and develop advanced manufacturing processes in the virtual world.  

By integrating NVIDIA Omniverse libraries into RobotStudio, ABB Robotics will deliver unprecedented robotics simulation and synthetic data generation capabilities that will allow intelligent robots to bridge this gap with up to 99 percent accuracy. ABB is the only robot manufacturer with a virtual controller running the same firmware as the hardware, ensuring near perfect correlation between simulation and real world performance. Combined with ABB Robotics’ Absolute Accuracy technology, which reduces positioning errors from 8–15 mm to around 0.5 mm, ABB delivers unmatched precision in both virtual and physical environments, making it suited to high-precision industrial-grade applications.  

ABB Robotics is also assessing the potential to integrate the NVIDIA Jetson edge computing platform into its Omnicore controller to achieve real-time AI inference at the edge for its extensive robot portfolio. Today’s announcement builds upon ABB Robotics’ long-standing work with NVIDIA, including the previous integration of NVIDIA Jetson into ABB Robotics’ VSLAM autonomous mobile robots as well as the development of gigawatt-scale AI data centers.

RobotStudio HyperReality will serve industrial clients at any scale, across a breadth of industries and applications, with select customers already testing its capabilities ahead of a full release to ABB Robotics’ 60,000 RobotStudio customers worldwide in the second half of 2026.   

Foxconn, the world’s largest electronics contract manufacturer, is piloting the first joint use case in consumer electronics assembly. Automating the assembly of a tiny piece in consumer electronics is challenging, as multiple device variants require different production methods and the delicate metal structure requires precise pick-and-place and assembly control, as well as fine-tuned setup, often demanding additional debugging time and engineering resources. Using RobotStudio HyperReality, Foxconn’s assembly robots are trained virtually, using synthetic data to perfect multiple real-world production processes in various scenarios, before moving them to the production line with 99 percent accuracy. By optimizing production lines virtually, Foxconn will reduce set-up times and costs by eliminating physical training and tests, and accelerate time-to-market for consumer electronics. 

WORKR, a California-based robotic workforce company that delivers robotic manufacturing solutions to industry, is extending the reach of this technology to small and medium manufacturers across the United States. At NVIDIA GTC 2026 (March 16-19, San Jose, CA), WORKR will demonstrate AI-powered robotic systems built on ABB technology, trained with synthetic data using NVIDIA Omniverse libraries, and deployed without operators needing to know any programming. By combining ABB’s industrial-grade robotics with its proprietary WorkrCore™ AI platform, the company is helping manufacturers address critical labor shortages with a robotic workforce that can learn new tasks in minutes and be operated by anyone.

AI and Programming: A Useful Tool

I reflected recently on the changes in programming since my first experiences around 1977.

Everything back then was text-based. You typed everything line by line. I started with BASIC and assembler, and also RPG on an IBM minicomputer. I went on to C and C++ and then picked up Java in the mid-’90s.

Then I discovered integrated development environments (IDEs), such as Eclipse for Java, and later the IDE for C#. At that point, I was thinking, “This isn’t programming. There’s so much built in that you don’t even have to type.”

I try to forget the horrible experience of Ladder Diagram on a PLC.

(Oh, I should note that I was never a professional programmer. Fortunately, I had other roles.)

Lately, automation suppliers have been adding Copilot-style assistants to their programming interfaces.

Why this reflection on the migration? I’ve been reading mass media and social media angst about the end of programmers with things like vibe coding and Claude Code.

Programming automation has been a constant for decades. Each advance served to make programmers faster and better able to tackle tougher problems.

Even with AI, someone must have the ideas of what needs to be developed, do the thinking about approaching the problem, and make the decisions for the best application.

We’re only going to see better applications solving harder problems. Those who lose their jobs will be those who cannot adapt.

New people? They will just think it’s the only way.


IEC 61131 Process Control Function Standards Working Group Launched

The Open Process Automation Forum has been building a standard of standards to promote open and interoperable technology for process automation. PLCopen has been at the forefront of international standards promulgation as the organization promoting IEC 61131-3. PLCopen has now instituted a Working Group to create an IEC 61131 process automation standard and certifications for application engineers to efficiently deploy PLC, DCS, and open platform controls in process industry applications.

I’ve been following and promoting openness and interoperability for decades. This should be a useful step forward.

Bill Lydon sent this explanation of the background and current status of programming standards.

The cost of programming process automation and control continues to grow and is a significant part of project costs. Each supplier having unique function blocks that do not follow a single worldwide standard increases training costs, application development costs, and project profit risk. PLCopen standardization and a modular methodology lower training time and project development costs, and reduce the risk of project cost overruns.

This further expands the base of PLCopen standards that enable no-code/low-code industrial automation programming across vendor platforms, including industrial computers. It will include incorporation of the function blocks defined in the O-PAS standard into a new PLCopen standard.

The new PLCopen Process Functions standards and certification make it easier for application engineers to deploy PLC, DCS, and open platform controls in process applications.

Working Group Goal

The PLCopen Process Industry Working Group’s goal is accelerating the convergence of discrete and process control & automation into harmonized PLC, DCS, and open platform system architectures to achieve industrial business digitalization.

Today there are many diverse ways to program applications for process control and automation. The goal is to develop PLCopen function block standards for process control functions. Function blocks are encapsulations of variables, parameters, and their processing algorithms. Similar standardization has been done with PLCopen standards for motion control, safety, fluid power, XML program interchange, and OPC UA.
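
IEC 61131-3 source is outside this blog’s scope, but the function-block idea translates directly to any language: encapsulated state plus a cyclic algorithm behind standardized inputs and outputs. Here is a rough Python analogue of the standard TON (on-delay timer) function block, simplified for illustration and not drawn from any PLCopen specification text:

```python
class TON:
    """Rough Python analogue of the IEC 61131-3 TON (on-delay timer)
    function block: Q goes true once IN has been true for PT seconds.
    Simplified -- a real function block runs inside a cyclic PLC scan."""
    def __init__(self, pt: float):
        self.PT = pt      # preset time (input parameter)
        self.ET = 0.0     # elapsed time (output)
        self.Q = False    # done flag (output)

    def __call__(self, IN: bool, dt: float) -> bool:
        """Evaluate one scan cycle; dt is the scan interval in seconds."""
        if IN:
            self.ET = min(self.ET + dt, self.PT)
        else:
            self.ET = 0.0
        self.Q = IN and self.ET >= self.PT
        return self.Q

timer = TON(pt=0.5)
states = [timer(IN=True, dt=0.2) for _ in range(4)]
print(states)  # [False, False, True, True]
```

Because the interface (IN, PT, Q, ET) is standardized, any conforming vendor’s TON behaves identically from the application engineer’s point of view, which is exactly the portability the Working Group wants to extend to process control functions.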

He notes that process control applications are being done using PLCs. I actually sold a PLC to a chemical plant engineer, who used it to control one of his processes. That was in 1995. So, while unusual, not unheard of.

Today many process control applications are being done using PLCs (Programmable Logic Controllers), since the capabilities of these devices are far beyond the original 1970s relay replacement applications. The emerging use of industrial edge computers with IEC 61131 runtime software engines is another segment that benefits from the results of the PLCopen Process Industry Working Group.

PLCopen Background

PLCopen has been successful in defining IEC 61131 functions and certifications used widely throughout industry worldwide, increasing engineering efficiency and quality and empowering more people in motion control, fluid power, safety, and other functions. The standards define common inputs, outputs, and behaviors, with vendors certifying conformance to the functions or additional features.

PLCopen Standards

  • Logic – The PLCopen basis is provided by the worldwide standard IEC 61131, and especially Part 3 – Programming Languages.
  • Motion Control – Creating reusable, hardware-independent motion control applications via IEC 61131-3 and PLCopen function blocks, including fluid power.
  • Safety – PLCopen Safety integrates safety functionality into IEC 61131-3 development environments and meets IEC 61508 and related standards.
  • Communication – PLCopen and the OPC Foundation combine their technologies into a platform- and manufacturer-independent information and communication architecture.
  • XML Exchange – PLCopen added independent XML schemas to IEC 61131-3.

Movements including Industry 4.0, the Industrial Internet of Things, the Open Process Automation Forum, and Smart Manufacturing are creating a drive for more standards. IEC 61131-3, along with PLCopen extensions and certifications, is well established in discrete and hybrid applications and, with the addition of OPC UA function blocks, is already part of newer Industry 4.0 and Industrial Internet of Things offerings.

Working Group

As part of our ongoing efforts to drive standardization and interoperability in industrial automation, PLCopen will start a new workgroup exploring the incorporation of the function blocks we have developed for the O-PAS standard into a new PLCopen standard.

The O-PAS (Open Process Automation Standard) is an open, interoperable, and vendor-neutral standard developed by the Open Process Automation Forum (OPAF) to enable flexible and modular process automation systems. It is designed to replace traditional, proprietary DCSs with a standards-based, plug-and-play architecture, allowing components from different vendors to work seamlessly together. O-PAS is based on existing industry standards, such as (among others) IEC 61131 and IEC 61499.

Part 6.4 of O-PAS defines a set of standard function blocks to ensure interoperability, consistency, and comparability across different process automation systems. These function blocks provide a reference model with standardized inputs, outputs, and behaviors. By establishing a uniform function block framework, Part 6.4 supports modular automation, making it easier to adopt open, vendor-independent control solutions. PLCopen helped create several pre-defined function blocks for Part 6.4 of the O-PAS standard.

In order to standardize these function blocks within PLCopen, we are starting a new workgroup to create a new PLCopen standard for process automation.

