Many engineers and programmers like open source projects combined with open APIs. Some open source projects catch on and quietly become widely used. Others languish. The Linux Foundation’s LF Edge project, especially EdgeX Foundry, keeps quietly growing. What are the odds that this becomes a widely used Internet of Things tool?
Today’s news in brief:
- EdgeX’s fifth release offers more scalable solutions to move data from devices to cloud, enterprise and on-premises applications
- The first LF Edge project to achieve Stage 3 ratification, EdgeX hits widespread adoption and production-level maturity
- EdgeX and LF Edge onsite at IoT Solutions World Congress with demos from Dell Technologies, Home Edge, IOTech and Project EVE
EdgeX Foundry, a project under the LF Edge umbrella organization within the Linux Foundation that aims to establish an open, interoperable framework for IoT edge computing independent of connectivity protocol, hardware, operating system, applications or cloud, announced the availability of its “Fuji” release. This release offers additional security and testing features on top of the production-ready “Edinburgh” release launched this spring.
“EdgeX Foundry has experienced significant momentum in developing an open IoT platform for edge-related applications and shows no signs of slowing down,” said Arpit Joshipura, general manager, Networking, Edge and IoT, the Linux Foundation. “As the only Stage 3 project under LF Edge, EdgeX Foundry is a clear example of how open collaboration is the key to an active community dedicated to creating an interoperable open source framework across IoT, Enterprise, Cloud and Telco Edge.”
Launched in April 2017, and now part of the LF Edge umbrella, EdgeX Foundry is an open source, loosely-coupled microservices framework that provides the choice to plug and play from a growing ecosystem of available third-party offerings or to augment proprietary innovations. With a focus on the IoT Edge, EdgeX simplifies the process to design, develop and deploy solutions across industrial, enterprise, and consumer applications. As a Stage 3 project under LF Edge, EdgeX has a self-sustaining cycle of development, maintenance, and long-term support. As an example of the rapidly accelerating use of the code, EdgeX hit a milestone of 1 million platform container downloads, almost half of which took place in the last few months.
“The 1M container download isn’t our only milestone,” said Keith Steele, EdgeX Foundry chair of the Technical Steering Committee and LF Edge Governing Board member. “The development team has expanded with more than 150 active contributors globally and the partner ecosystem of complementary products and services continues to increase. As a result, we’re seeing more end-user case studies spanning energy and utilities, building automation, industrial process control and factory automation, smart cities, retail and distribution, and health monitoring.”
The Fuji Release
As the fifth release in the EdgeX Foundry roadmap, Fuji offers significant enhancements to the Edinburgh 1.0 release, which launched in July, including:
- New and improved security features, including a public key infrastructure (PKI) for token/key generation.
- Application services that offer a full replacement for the older export services provided with previous EdgeX releases. These application services offer more scalable and easier-to-use ways to get data from the EdgeX framework to cloud, enterprise and on-premises applications.
- Example application services are provided with this release to allow users to quickly move data from EdgeX to the Azure and AWS IoT platforms.
- A new applications function Software Development Kit (SDK) also provides the EdgeX user community with the ability to create new and customized solutions on top of EdgeX – for example, allowing EdgeX to move edge data to legacy and non-standard environments.
- Unit test coverage is considerably increased (in some services by more than 200 percent) across EdgeX core and supporting microservices.
- New device service connectors for BLE, BACnet, IP cameras, OPC UA, GPS, and REST devices.
- Choices for commercially supported EdgeX device connectors are also starting to blossom, with offerings for CANopen, PROFINET, Zigbee, and EtherCAT available through EdgeX community members.
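EdgeX’s application functions SDK is Go-based, but the pipeline-of-functions idea behind it can be mocked in a few lines of Python. This is an illustrative sketch only, not the real EdgeX API; all function and device names are hypothetical:

```python
import json

# Each pipeline function takes an event and returns a (possibly transformed)
# event, or None to stop processing -- mirroring the filter/transform/export
# chain that EdgeX application services run on device data.

def filter_by_device(allowed):
    """Pass through only events from the devices we care about."""
    def fn(event):
        return event if event["device"] in allowed else None
    return fn

def transform_to_json(event):
    """Serialize the event for export to a cloud or enterprise endpoint."""
    return json.dumps(event)

def run_pipeline(event, functions):
    """Apply each function in order; a None result short-circuits the chain."""
    for fn in functions:
        if event is None:
            return None
        event = fn(event)
    return event

pipeline = [filter_by_device({"thermostat-01"}), transform_to_json]
print(run_pipeline({"device": "thermostat-01", "reading": 21.5}, pipeline))
print(run_pipeline({"device": "camera-07", "reading": 0}, pipeline))
```

A real application service would end the chain with an export function targeting, say, Azure IoT Hub or AWS IoT, as the release notes describe.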
Inaugural EdgeX Open
The EdgeX Foundry community recently kicked off a series of hackathons, titled the EdgeX Open. More than 70 attendees participated in the first event on October 7-8, 2019, in Chicago. Hosted by LF Edge and the Retail Industry Leaders Association (RILA), and sponsored by Canonical, Dell Technologies, Deep Vision, Intel, IOTech, IoTium and Zededa, the event featured five teams that competed in retail use case categories. More details on the event, including the winning use case from Volteo, are available in this blog post.
The next hackathon will coincide with the Geneva release, targeted for Spring 2020. It will be centered on the Manufacturing vertical and held in a location in Europe.
The IT architecture of industrial and manufacturing applications increasingly elevates the role of cloud and edge computing. These technologies have become core to the Industrial Internet of Things (IIoT) and to improved Software as a Service (SaaS).
Two recent acquisition news items reflect the acceleration of this trend: one from Siemens, the other from PTC.
Siemens plans acquisition of Edge technology
- Siemens further expands its digitalization portfolio for industry
- Technology basis is the Docker IT standard
- Siemens Industrial Edge ecosystem enables easy and flexible use of Edge apps
Siemens is planning the acquisition of Edge technology from the US company Pixeom. With this action, Siemens is strengthening its Industrial Edge portfolio by adding software components for Edge runtime and device management. Siemens Industrial Edge provides an ecosystem that enables the flexible provision and use of apps. This means, for example, that appropriate apps can analyze data locally at the machine and send relevant data to the higher-level Industrial Edge Management System for global analytics. With this acquisition, Siemens is driving forward the expansion of its Digital Enterprise portfolio and the integration of cutting-edge technologies for the digital transformation of industry.
With the resulting Industrial Edge ecosystem, industrial companies can use production data even more efficiently and react more flexibly to changes in conditions.
Ralf-Michael Franke, CEO of Siemens’ Factory Automation Business Unit, explains: “Cutting edge technologies such as Edge Computing open up new scope for automation. With Siemens Industrial Edge, we are creating an open edge ecosystem which offers benefits for companies of any size.”
Siemens is using standard Docker container technology: provisioning apps in the management system, as well as functional upgrades and updates of Edge devices in the factory, can therefore be handled simply from a central point.
Siemens intends to acquire this technology from Pixeom and use it in the Factory Automation Business Unit, which is part of Siemens Digital Industries. Pixeom has sites in San José, California and Udaipur, India and employs 81 people worldwide. Closing of the transaction is planned for the fourth quarter of 2019. Both companies have agreed not to comment on the financial details of the transaction.
PTC Makes SaaS Acquisition
I sat in on the analyst/press conference where PTC president and CEO Jim Heppelmann discussed the reasoning behind the announced acquisition of Onshape, creator of the “first” Software as a Service product development platform. The company had also just released fourth-quarter results. PTC has a little more than $1 billion in revenues, with about 45% from CAD and 35% from PLM. Interestingly, the IoT business contributes just over 10% of revenues.
PTC is acquiring Onshape, whose product development platform unites computer-aided design (CAD) with data management and collaboration tools, for approximately $470 million, net of cash acquired. The acquisition is expected to accelerate PTC’s ability to attract new customers with a SaaS-based product offering and position the company to capitalize on the inevitable industry transition to SaaS. Heppelmann believes that cloud-based SaaS is the future of CAD. Pending regulatory approval and satisfaction of other closing conditions, the transaction is expected to be completed in November 2019.
Located in Cambridge, MA, Onshape was founded in 2012 by CAD pioneers and tech legends, including Jon Hirschtick, John McEleney, and Dave Corcoran, inventors and former executives of SolidWorks. Onshape has secured more than $150 million in funding from leading venture capital firms and has more than 5,000 subscribers around the world. The company’s software offering is delivered in a SaaS model, making it accessible from any connected location or device, eliminating the need for costly hardware and administrative staff to maintain. Distributed and mobile teams of designers, engineers, and others can benefit from the product’s cloud nature, enabling them to improve collaboration and to dramatically reduce the time needed to bring new products to market – while simultaneously staying current with the latest software.
“PTC has earned a reputation for successfully pursuing new innovations that drive corporate growth,” said Heppelmann. “Building on the strong momentum we have with our on-premises CAD and PLM businesses, we look to our future and see a new growth play with SaaS.”
This acquisition is the logical next step in PTC’s overall evolution to a recurring revenue business model, the first step of which was the company’s successful transition to subscription licensing, completed in January 2019. The SaaS model, while nascent in the CAD and PLM market, is rapidly becoming industry best practice across most other software domains.
“Today, we see small and medium-sized CAD customers in the high-growth part of the CAD market shifting their interest toward SaaS delivery models, and we expect interest from larger customers to grow over time,” continued Heppelmann. “The acquisition of Onshape complements our on-premises business with the industry’s only proven, scalable pure SaaS platform, which we expect will open new CAD and PLM growth opportunities while positioning PTC to be the leader as the market transitions toward the SaaS model.”
For customers, the SaaS model enables faster work, improved collaboration and innovation, with lower up-front costs and with no IT infrastructure to administer and maintain. For software providers, the SaaS model has been proven to generate a more stable and predictable revenue stream, increase customer loyalty as customers benefit from earlier adoption of technology innovations, and enable expansions into new segments and geographies.
“At Onshape, we share PTC’s vision for helping organizations transform the way they develop products,” said Jon Hirschtick, CEO and co-founder, Onshape. “We and PTC believe that the product development industry is nearing the ‘tipping point’ for SaaS adoption of CAD and data management tools. We look forward to empowering the customers we serve with the latest innovations to improve their competitive positions.”
Onshape will operate as a business unit within PTC, with current management reporting directly to Heppelmann.
This is still more follow-up from the Emerson Global Users Exchange relative to sessions on projects and pilot purgatory. I thought I had already written this, but just discovered it languishing in my drafts folder. While in Nashville, I ran into Jonas Berge, senior director, applied technology for Plantweb at Emerson Automation. He has been a source for technology updates for years. We followed up a brief conversation with a flurry of emails in which he updated me on some presentations.
One important topic centered on IoT projects—actually applicable to other types of projects as well. He told me the secret sauce is to start small. “A World Economic Forum white paper on the fourth industrial revolution, in collaboration with McKinsey, suggests that to avoid getting stuck in prolonged ‘pilot purgatory,’ plants shall start small with multiple projects – just like we spoke about at EGUE and just like Denka and Chevron Oronite and others have done,” he told me.
“I personally believe the problem is when plants get advice to take a ‘big bang’ approach starting by spending years and millions on an additional ‘single software platform’ or data lake and hiring a data science team even before the first use case is tackled,” said Berge. “My blog post explains this approach to avoiding pilot purgatory in greater detail.”
I recommend visiting Berge’s blog for more detail, but I’ll provide some teaser ideas here.
First, he recommends:
- Think Big
- Start Small
- Scale Fast
Plants must scale digital transformation across the entire site to fully enjoy the safety benefits: fewer incidents, faster incident response time, and reduced instances of non-compliance. The same holds for reliability benefits such as greater availability, reduced maintenance cost, extended equipment life, greater integrity (fewer instances of loss of containment), and shorter and less frequent turnarounds; for energy benefits like lower energy consumption and cost and reduced emissions and carbon footprint; and for production benefits like reduced off-spec product (higher quality/yield), greater throughput, greater flexibility (feedstock use and products/grades), reduced operations cost, and shorter lead times.
The organization can only absorb so much change at any one time. If too many changes are introduced in one go, the digitalization will stall:
- Too many technologies at once
- Too many data aggregation layers
- Too many custom applications
- Too many new roles
- Too many vendors
Multiple Phased Projects
McKinsey research shows that plants successfully scaling digital transformation instead run smaller digitalization projects: multiple small projects across the functional areas. This matches what I have personally seen in projects I have worked on.
From what I can tell, it is the plants that attempt a big-bang approach with many digital technologies at once that struggle to scale. There are forces that encourage companies to attempt sweeping changes to go digital, which can lead to counterproductive overreaching.
The Boston Consulting Group (BCG) suggests a disciplined phased approach rather than attempting to boil the ocean. I have seen plants focus on a technology that can digitally transform and help multiple functional areas with common infrastructure. A good example is wireless sensor networks. Deploying wireless sensor networks in turn enables many small projects that help many departments digitally transform the way they work. The infrastructure for one technology can be deployed relatively quickly after which many small projects are executed in phases.
Small projects are low-risk. A small trial of a solution in one plant unit finishes fast. After a quick success, the team can scale it to the full plant area, then to the entire plant, and then move on to start the next pilot project. This way plants move from proof of concept (PoC) to full-scale, plant-wide implementation at speed. For large organizations with multiple plants, an innovation often emerges at an individual plant, then gets replicated at other sites and rolled out nationwide and globally.
Use Existing Platform
I have also seen a big-bang approach where a plant pours a lot of money and resources into an additional “single software platform” layer for data aggregation before the first use case even gets started. This new data aggregation platform layer is meant to be added above the ERP, with the intention of collecting data from the ERP and plant historian before making it available to analytics through a proprietary API that requires custom programming.
Instead, successful plants start small projects using the existing data aggregation platform: the plant historian. The historian can be scaled with additional tags as needed. This way a project can be implemented within two weeks, with the pilot running an additional three months, at low risk.
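Berge’s historian-first advice can be illustrated with a toy example. The sketch below is hypothetical and self-contained: a plain dictionary stands in for historian tag reads (real historians expose REST, OPC, or ODBC interfaces), and a declining temperature drop across a heat exchanger serves as a crude proxy for degraded heat transfer.

```python
# Mock historian data: tag name -> recent hourly samples.
# Tag names and values are invented for illustration.
historian = {
    "HX101.inlet_temp":  [80.1, 80.2, 80.0, 80.1, 80.2],
    "HX101.outlet_temp": [60.2, 61.0, 62.1, 63.5, 65.2],
}

def temperature_drop(tags, inlet, outlet):
    """Per-sample temperature drop across the exchanger."""
    return [hot - cold for hot, cold in zip(tags[inlet], tags[outlet])]

def is_declining(values):
    """True if each sample is strictly lower than the one before it."""
    return all(b < a for a, b in zip(values, values[1:]))

drops = temperature_drop(historian, "HX101.inlet_temp", "HX101.outlet_temp")
if is_declining(drops):
    print("ALERT: falling temperature drop across HX101 - degraded heat transfer?")
```

A pilot like this needs only a handful of existing tags, which is why it can be stood up in weeks rather than waiting on a new platform layer.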
I would personally add that you must also keep the bigger vision in mind. A plant cannot run multiple small projects in isolation without ending up with siloed solutions. Plants successful with digital transformation establish, early on, a vision of what the end goal looks like. Based on this they can select the technologies and architecture to build the infrastructure that supports that end goal.
NAMUR Open Architecture (NOA)
The system architecture for the digital operational infrastructure (DOI) is important. The wrong architecture leads to delays and an inability to scale. NAMUR (User Association of Automation Technology in Process Industries) has defined the NAMUR Open Architecture (NOA) to enable Industry 4.0. I have found that plants that have deployed a DOI modeled on the same principles as NOA are able to pilot and scale very fast.
Flying Start
The I&C department in plants can accelerate digital transformation to achieve operational excellence and top quartile performance by remembering Think Big, Start Small, Scale Fast. These translate into a few simple design principles:
- Phased approach
- Architecture modeled on the NAMUR Open Architecture
- Ready-made apps
- Easy-to-use software
- Digital ecosystem
I have just returned from a weekend in Eastern Ohio at a youth soccer tournament. You learn a lot about human nature–your own as well as others’–when you’re in a competitive, tightly compressed space.
The games I refereed featured coaches and parents whose exuberance was carried way too far–probably into less positive descriptions. As director of referees for the tournament, I walked around observing other games as well. I talked with a 15-year-old girl about her game. She told me the parents were the worst. They yelled unkind things directly at their goalkeeper, including calling her a “bitch.” Sometimes I wonder.
This week I’m heading west for another IT conference. This one is Hitachi Vantara. I have had a few interviews lately with people from there as they have ramped up an Industrial IoT practice. I’m sure there will be more later this week.
What started me thinking about human nature and Industrial IoT suppliers was a comment I received a couple of weeks ago at another conference. “The trouble with the IT companies is that their sales people come in and promise that their Industrial IoT solution will solve all their problems.”
What engineer do you know who would believe that? Which ones would immediately tune them out and start thinking about their hobby?
I was a sales guy once. Or twice. I also was the guy from engineering who tried to explain the technology, benefits, and competitive advantage of our product versus the market. I also watched for when the salespeople’s eyes glazed over. They didn’t want too much information. Too much gets in the way of a sales pitch. It’s partly just human nature and partly knowing their job.
That was a good comment. I don’t work with sales at these companies. Sure, the CEO is “selling” when they talk to me, but it’s a different selling. I write; I don’t buy.
It taught me to probe a little deeper into all these companies I cover–IT and OT–and get into what message they take to the prospect or customer. It may be entirely different from what I hear. And that would be a valuable part of the story.
Presentations abound at Emerson Global Users Exchange. Attendees can choose to take deep technical dives into Emerson products, get overviews of technology and industry trends, or even pursue personal development. Yes, there was even a 6 am fitness session with either running or yoga.
Session topics ranged from “Where’s ‘The Edge’?” to using good presentation skills for career success to building your personal brand through digital transformation (social media and networking). Here’s a recap of the 2019 Emerson Global Users Exchange based upon several sessions I attended led by people I’ve known for a long time: Dave Imming, Mike Boudreaux, and Jim Cahill.
The Secure First Mile–IIoT and the Edge
A panel discussion, assembled and led by Emerson’s director of connected plant Mike Boudreaux, addressed the Industrial Internet of Things in relation to the question “Where is the Edge?” The blend of IT and OT on the panel was refreshing and informative. Most instructive was how far each side has come toward understanding the entire picture, broadening out from its own silo.
Attila Fazekas of ExxonMobil stated that IoT connects at Level 4 of the Purdue model. He is part of the IT organization, taking the view from that side of the divide. A strong proponent of IT governance, he noted that his company tries to keep a hard line between IoT (IT) and control systems, although he admitted that in practice the line occasionally becomes blurred.
Peter Zornio, CTO of Emerson Automation, relates IoT and the Edge to “a giant SCADA system.” He reflects those who come from the plant, where intelligent devices are connected to an automation system that formerly was the single point where data was collected and then passed through. I have talked with Zornio for years; few people in the industry are as knowledgeable about the plant. He is beginning to adjust to the IT world with which he’s going to have to work in the future, especially given Emerson’s expanded strategy around digital transformation and “Top Quartile Performance.” He sees security concerns helping drive Edge applications that divide systems, providing a firm break between control systems and IT systems.
Jose Valle, CTO Energy/Manufacturing at Microsoft, brought another IT view to the panel. For him, the Edge becomes a place for security with a separation of functions. He also brought an emphasis on provisioning devices through the cloud.
Rich Carpenter, executive product manager, Emerson Automation / Machinery (former CTO of GE Fanuc/GE Intelligent Platforms), discussed a new Edge computer from Emerson (GE). It uses a hypervisor to run an RTOS and PLC control on one part of the chip, segmented by a firewall from the regular PC portion running Linux for IoT functions. He noted that for the latter, they’ve discovered it is better to use Node-RED and Python for programming. Congratulations to Rich for landing at Emerson—he’s another long-time contact. And thanks for mentioning Node-RED.
Overall, the panel expressed concerns about providing security with the IIoT and Edge devices. The best part was Boudreaux’s assembling a panel split evenly with IT and OT and there was no acrimony or “you think this, we think that” nonsense. They are all trying to solve bigger problems than just IT or OT only. Businesses are driving them together to solve “digital transformation” challenges. Good stuff.
DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with other former Kepware leaders Tony Paine and Torey Penrod-Cambra, left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday. HighByte announced that the beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.
HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.
HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.
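HighByte has not published API details here, but the core idea of data contextualization can be sketched: merge raw tag values (bare register addresses) with an asset model so downstream applications receive consistent, self-describing payloads. All tag names, assets, and the `contextualize` function below are hypothetical, for illustration only.

```python
# Raw values as a driver might report them: register address -> value.
raw = {"PLC1.N7:0": 72.4, "PLC1.N7:1": 1}

# Asset model supplying the missing context for each raw tag.
model = {
    "PLC1.N7:0": {"asset": "Pump-12", "measure": "temperature", "units": "degF"},
    "PLC1.N7:1": {"asset": "Pump-12", "measure": "running", "units": "bool"},
}

def contextualize(raw, model):
    """Attach asset context so consumers never see bare register addresses."""
    payload = {}
    for tag, value in raw.items():
        meta = model[tag]
        payload.setdefault(meta["asset"], {})[meta["measure"]] = {
            "value": value, "units": meta["units"], "source": tag,
        }
    return payload

print(contextualize(raw, model))
```

The point of a hub-style product is to do this modeling once, centrally, instead of re-scripting it inside every consuming application.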
“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”
The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”
To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected]
Learn more about the HighByte founding team: all people I’ve known for many years in the data connectivity business.
From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.
DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.
DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.
From Oracle: DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.
At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products that are relevant to your customers — which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, etc., an ongoing process that will likely include input from your engineering, IT, and business teams.
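One concrete DataOps practice is automated data-quality assertions run inside the pipeline, the analytics analog of unit tests. The sketch below is illustrative and not tied to any particular tool; the field names and thresholds are invented.

```python
# Validate a batch of customer records before it feeds the churn model.
# Failing rows are reported rather than silently propagated downstream.

def check_batch(rows):
    """Return a list of data-quality violations found in the batch."""
    errors = []
    for i, row in enumerate(rows):
        if row.get("customer_id") is None:
            errors.append(f"row {i}: missing customer_id")
        if not (0 <= row.get("churn_score", -1) <= 1):
            errors.append(f"row {i}: churn_score out of range")
    return errors

batch = [
    {"customer_id": "C-001", "churn_score": 0.12},
    {"customer_id": None,    "churn_score": 0.80},
    {"customer_id": "C-003", "churn_score": 1.7},
]
for e in check_batch(batch):
    print(e)
```

In a mature DataOps setup, checks like these run automatically on every refresh, so the data science, IT, and business teams all see problems before the model does.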
As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.