NI held NI Connect, its annual user group meeting, virtually again this year. It announced several product advances, a couple of which relate to advanced driver-assistance systems with wider applicability, I’m sure. First was a brief discussion of digital thread—something NI was doing before the buzzword was invented. I loved co-founder Jeff Kodosky’s many years of technical discussions of software-defined instruments and data traces through the software.
To quote from this year, “NI’s software-connected approach creates a more complete enterprise data and insight chain, collecting and connecting the data that accelerates digital transformation, enabling customers to optimize every step of the product life cycle.”
“A digital thread of data across each phase of the product life cycle delivers powerful insights to enhance product performance,” said NI CEO Eric Starkloff. “At NI, our software-connected approach unlocks the power of test, from early research to the production floor and beyond. We continue to aggressively invest in the technology to make this compelling vision a reality.”
Product announcements include:
- Streamlined SystemLink Software Interface to Increase Efficiency — By connecting test systems and test data to enterprise outcomes, SystemLink software substantially accelerates each phase of the product life cycle. With a unified view of test operations in design validation and production environments, SystemLink manages and simplifies test scheduling, resource utilization, system health and maintenance. The latest software enhancements include new UI customization options, simplified product navigation and expanded asset health monitoring capabilities. The result is test insight acceleration, more efficient use of assets and reduced cost of test.
- New LabVIEW 2021 to Improve Interoperability with Python and MathWorks MATLAB Software — Open-source software is increasingly important as systems become more diverse and complex. NI’s 2021 version of LabVIEW, the leading software platform for building test and measurement systems, features improved interoperability with Python and MathWorks MATLAB software, improved support for version control using Git and usability enhancements. These updates make it easier for engineers to connect disparate systems and hardware to accelerate innovation, especially in the design and validation environments.
- PXI Hardware Solution to Enable Software-Connected Workflow in a Smaller, Cost-Effective Package — Like open-source software, modular hardware is also increasingly important to flexibly connect with existing systems and workflows. PXI hardware delivers openness, software options, modularity and I/O coverage for customers seeking to develop adaptive and scalable systems. NI’s first 2-slot PXI chassis delivers these benefits in a smaller, more cost-effective package. Modular hardware like PXI enables a software-connected workflow to achieve better results.
- NI Collaboration with Seagate to Deliver First-of-Its-Kind In-Vehicle Edge Storage and Data Transfer Service — The next generation of autonomous vehicles requires more real road data than ever before, making efficient data storage exceedingly important. NI and Seagate Technology Holdings, a world leader in data storage infrastructure solutions, announced a new collaboration to enhance data storage services, including a first-of-its-kind advanced driver-assistance systems (ADAS) record offering. This in-vehicle data storage as a service (STaaS), powered by Seagate’s Lyve Mobile edge storage and data transfer service, enables original equipment manufacturers (OEMs) and suppliers to modernize their data storage strategy from self-managed to STaaS, leading to reduced costs and efficient storage.
- NI Ettus USRP X410 Software Defined Radio Platform to Accelerate Wireless Innovation — The next generation of wireless technologies, 5G and 6G, are poised to transform the way people and systems connect, making test data insights that much more important. Because wireless technologies are becoming increasingly complex, advanced tools to support research and prototyping are needed. The new NI Ettus USRP X410 Software Defined Radio Platform is high performance and fully open source, allowing engineers to achieve a faster time to prototype and accelerate wireless innovation.
In this year of all things data, I’ve only just discovered a company called Datadobi (news items in March and May). It bills itself as the “global leader in unstructured data management software.” I’ve had an interest in the growth of unstructured data management for several years, knowing that rapid growth in data from manufacturing was coming (and it is now here).
I have two news items from the company. Just released today was news about a new training portal. Some news about an application of its technology came my way a week or so ago. That is below.
Training Portal for DatadobiDriven Program
Customers Can Now Eliminate Pain and Risk Associated with Complex Data Migrations
Datadobi announced the launch of a new DatadobiDriven Training Portal, intended to provide strategic partners and end customers with tailored technical certification, support, and ongoing communication. The new portal is an enhancement to the DatadobiDriven Program, which is focused on adding value for Sales Engineers, Professionals, and Administrators in ways that help drive business success.
Over a decade ago, Datadobi raised the bar for data migration solutions with the launch of DobiMigrate, enterprise-class migration software for NAS (network-attached storage) and object data. With DobiMigrate, channel solutions providers and end customers now have a solution that is proven and can be trusted in the most complex and demanding environments to deliver fast, efficient, secure, accurate, and verifiable migration to new storage and/or the cloud.
“With the launch of the DatadobiDriven Training Portal, we continue to set new industry standards. It is now faster and easier than ever for channel partners to be trained and prepared to sell, deploy, and support Datadobi solutions. As a result, our partners are able to increase customer satisfaction, enjoy optimum revenue, and accelerate time to profitability,” said Michael Jack, Chief Revenue Officer and Co-Founder, Datadobi. “Likewise, for end users the portal facilitates direct access to information, training, and solutions for eliminating the pain and risk associated with seemingly straightforward, but more often than not, complex data migrations.”
In related news today, Datadobi announced it has partnered with CLIMB Channel Solutions to provide DatadobiDriven Program benefits to its Climbing Club members. “The Climbing Club is an exclusive group of valued reseller partners that we reward for their efforts in working with Climb and its partners,” said Charles Bass, Vice President of Alliances and Marketing, Climb.
Sports Gear and Equipment Company Teams with Datadobi
DobiMigrate Meets Complex and Demanding Requirements — Compliantly Migrating Heterogeneous Archive Data
Datadobi announced Decathlon, a global leader in sports gear and equipment, has deployed its DobiMigrate software to help enable the move of Decathlon’s entire IT operations into the cloud.
To support the company’s tremendous growth and success, Decathlon made the decision to leave its onsite datacenters completely and migrate to a number of the main cloud providers (Azure, AWS, GCP, Alibaba Cloud, Yandex, and others). One of the final steps would also be one of the most critical on its trek toward a successful digital transformation — the migration of all its on-premises unstructured archive data, ranging from product and inventory to customer data.
“We knew a migration of this magnitude could be very complicated, particularly in relation to moving the archive data,” said Tony Devert, IS Engineer, Decathlon. “We considered using the cloud providers’ data movement capabilities but knew this wasn’t their core competency and that their tools were not really qualified to preserve our legal timestamps.” He explained, “We had over six years of archive data to move to the cloud. Every application has its own particularities and would have needed its own migration. In other words, the project would have been chopped up into little pieces, application by application. In addition, we had the added complication that the archives included legal data with required retention periods. So we needed extra checks and safety measures in place that would provide proof of the correct migration of the content.”
After careful research and a successful POC, Decathlon chose to deploy DobiMigrate, finding it the ideal solution for meeting its complex and demanding requirements. These included maintaining data integrity and providing chain of custody via the hashing of every file as it is migrated. With DobiMigrate, a file would only be declared successfully migrated if the source and target were an identical match. A report could then be created showing the hash of every file, which could be kept for future auditing.
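The chain-of-custody idea is straightforward to illustrate. Below is a minimal Python sketch of hash-based migration verification; this is not DobiMigrate’s implementation, just the general technique of declaring a file migrated only when source and target hashes match, with per-file records kept for auditing (all names here are hypothetical):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_migration(source_dir: Path, target_dir: Path) -> list[dict]:
    """Compare every source file against its migrated copy.

    Returns one audit record per file; a file counts as migrated
    only when the source and target hashes are identical.
    """
    report = []
    for src in sorted(p for p in source_dir.rglob("*") if p.is_file()):
        rel = src.relative_to(source_dir)
        dst = target_dir / rel
        src_hash = sha256_of(src)
        dst_hash = sha256_of(dst) if dst.is_file() else None
        report.append({
            "file": str(rel),
            "source_sha256": src_hash,
            "target_sha256": dst_hash,
            "verified": src_hash == dst_hash,
        })
    return report
```

Keeping the full report rather than a single pass/fail flag is what makes the result auditable years later, as Decathlon describes below.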
“With DobiMigrate, we were able to dramatically accelerate our migration and complete it well under our timeline objective,” said Devert. “And, with its chain of custody capabilities, we don’t have to check and double check that our data was moved to the destination. With DobiMigrate, if we are audited in two years, five years, or 10 years we can be confident our data is there, and it is correct.”
He continued, “Now that we are in our new cloud environment, we can benefit from the speed, agility, and elasticity of on-demand solutions that we can intelligently adapt to our business requirements, thereby positively impacting our bottom line.”
With its IT infrastructure now deployed 100% across public clouds, Decathlon’s training, recruitment, and partnerships with technology visionaries and experts have become a key strategic priority. Its goal is to continue to leverage the newest and most innovative technologies in areas such as serverless applications, automation, and continuous integration and delivery (CI/CD), using tools such as Terraform, Git, and Kubernetes.
Containers have become a must-have technology for those pursuing some form of Digital Transformation, or whatever you wish to label it. I’ve written little about the subject. Following is a news release concerning a way to run cloud-native Microsoft SQL Server with high availability.
DH2i, a provider of multi-platform Software Defined Perimeter (SDP) and Smart Availability software, announced June 22 the general availability (GA) of DxEnterprise (DxE) for Containers, enabling cloud-native Microsoft SQL Server container Availability Groups (AG) outside and inside Kubernetes (K8s).
Container use is skyrocketing for digital transformation projects—particularly the use of stateful containers for databases such as Microsoft SQL Server. This growing stateful database container use is also generating a hard production deployment requirement for database-level high availability (HA) in Kubernetes.
For medium and large organizations running SQL Server, database-level HA has traditionally been provided by SQL Server Availability Groups (AGs). However, SQL Server AGs have not been supported in Kubernetes until now—hindering organizations’ ability to undergo digital transformations. DxEnterprise (DxE) for Containers is the answer to the problem.
DxEnterprise for Containers accelerates an enterprise’s digital transformation (DX) by speeding the adoption of highly available stateful containers. DxEnterprise (DxE) for Containers provides SQL Server Availability Group (AG) support for SQL Server containers, including for Kubernetes clusters. It enables customers to deploy stateful containers to create new and innovative applications while also improving operations with near-zero RTO to more efficiently deliver better products and services at a lower cost. Additionally, it helps organizations generate new revenue streams by enabling them to build distributed Kubernetes AG clusters across availability zones/regions, resulting in hybrid cloud and multi-cloud environments which can rapidly adapt to changes in market conditions and consumer preferences.
“Kubernetes lacks SQL Server AG support, which is essential for using stateful containers in production,” said Shamus McGillicuddy, Vice President of Research, EMA Network Management Practice. “DxEnterprise for Containers solves this problem. It enables AG support in Kubernetes.”
“DxE for Containers is the perfect complement to Kubernetes’ pod/node-level cluster HA,” said Don Boxley, DH2i CEO and Co-Founder. “DxE for Containers enables Microsoft users to confidently deploy highly available SQL Server containers in production, speeding their organizations’ digital transformation.”
DxEnterprise for Containers Features & Benefits:
– Kubernetes SQL Server Container Availability Groups with automatic failover, an industry first – Enables customers to deploy stateful containers to create new and innovative applications
– Near-zero recovery time objective (RTO) container database-level failover – Improves operations to more efficiently and resiliently deliver better products and services at a lower cost to the business
– Distributed Kubernetes AG clusters across availability zones/regions, hybrid cloud and multi-cloud environment with built-in secure multi-subnet express micro-tunnel technology – Enables customers to rapidly adapt to changes in market conditions and consumer preferences
– Intelligent health and performance QoS monitoring and alerting – Simplifies system management
– Mix-and-match support for Windows and Linux on bare metal, virtual, and cloud servers – Maximizes IT budget ROI
Organizations can now purchase DxEnterprise (DxE) for Containers directly from the DH2i website to get immediate full access to the software and support. Customers have the flexibility to select the support level and subscription duration to best meet the needs of their organization. Users can also subscribe to the Developer Edition of DxEnterprise (DxE) for Containers to dive into the technology for free for non-production use.
DH2i Company is the leading provider of multi-platform Software Defined Perimeter (SDP) and Smart Availability software for Windows and Linux. DH2i software products DxOdyssey and DxEnterprise enable customers to create an entire IT infrastructure that is “always-secure and always-on.”
Here is an example of software integration and of pulling data together for improved business results.
GE Digital announced it is adding Proficy Operations Analytics to its Proficy suite of software solutions. As an integral part of the Proficy suite, Proficy Operations Analytics can use the data already collected in Proficy Historian, Proficy Plant Applications / MES, and Proficy Manufacturing Data Cloud to achieve 2-5% more efficiency from manufacturing operations year over year.
Proficy Operations Analytics is a self-provisioning, ready-to-deploy SaaS-based predictive operations center for industrial IoT and AI. This cloud analytics solution helps operations, engineering, and executive teams gain data visibility and uncover efficiencies and proactive operational actions at enterprise scale. Within minutes of connecting Proficy Operations Analytics to operational and maintenance data sources, users can gain visibility into insights that can improve operational and revenue performance.
To achieve speedy visibility, Proficy Operations Analytics includes 100+ pre-built industrial data agents that automatically connect to historians, PLCs, MES, SCADA, ERP, lab databases, and IoT devices in a secure-by-design, frictionless way, with automated normalization of disparate properties for immediate analysis. Self-provisioning processes configure Digital Twins automatically to add the context required to automate analytics on all process and event data. Thirty pre-built industrial predictive applications, easily assigned to Digital Twins without any data science or application development work, help to optimize operations.
With Proficy Operations Analytics, typical deployments achieve a fast return on the subscription investment. These gains are accelerated with pre-built predictive analytics applications such as Predictive Quality, Predictive Throughput, Predictive Energy Efficiency, Predictive Uptime, Predictive Asset Reliability, and Predictive Asset Life. These applications do not require data science expertise to implement, and they incorporate curated datasets that are easily visualized in traditional analytics dashboards, so operators can surface the economic impact of actions to executives.
As part of an overall plan to continuously optimize quality and production, a progressive manufacturer of specialty film products has leveraged predictive technologies to pull in data from multiple disparate data sources, monitor more than 400 measures of line stability, and leverage machine learning to continuously predict top potential causes of line instability and film breaks in real time. Predictive insights are presented to the process engineers in easy-to-understand displays, distilling thousands of pieces of data into just the elements needed to address the current problems. Process engineers can use analytics to understand the causes of line instability and make real-time recommendations to the operations team.
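The ranking idea described above can be sketched simply: score each monitored measure by how far its current reading deviates from its recent history, then surface the top candidates. This toy Python example uses a plain z-score as a stand-in for the machine learning GE describes; the function and sample measure names are hypothetical:

```python
from statistics import mean, stdev

def rank_instability_drivers(history: dict, current: dict) -> list:
    """Score each measure by how far its current reading sits from its
    recent history (a simple z-score), and return measure names sorted
    from most to least anomalous. A toy stand-in for the richer ML
    ranking described in the article."""
    scores = {}
    for name, series in history.items():
        mu, sigma = mean(series), stdev(series)
        scores[name] = abs(current[name] - mu) / sigma if sigma else 0.0
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical line-stability measures for a film line.
history = {
    "LineTension": [10.0, 10.1, 9.9, 10.0],
    "RollSpeed": [5.0, 5.1, 4.9, 5.0],
}
current = {"LineTension": 15.0, "RollSpeed": 5.0}
drivers = rank_instability_drivers(history, current)
```

Distilling hundreds of measures down to a short ranked list is exactly the "just the elements needed" presentation the paragraph describes, even though a production system would use far more sophisticated models.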
“GE Digital is uniquely positioned to help industrial companies accelerate AI and ML with a full set of industrial data management and analytics-based solutions that feature a scalable architecture, single pane of glass for visibility, and security and availability on premise and in the cloud,” said Richard Kenedi, General Manager of GE Digital’s Manufacturing and Digital Plant business. “Proficy Operations Analytics puts industrial data to work to empower workers and lead global decision-making frameworks, putting data in context to drive resilient business outcomes that make people, assets, and processes work together efficiently.”
HPE Discover was held this week, virtually, of course. I can’t wait for the return of in-person conferences. It’s easier for me to find relevant conversations and learn from technology users when everyone is gathered together. You can attend on demand here.
I didn’t have any specific industrial/manufacturing discussions this year, although I had met up with Dr. Tom Bradicich earlier to get the latest on IoT and Edge. You can check out that conversation here.
I suppose the biggest company news was the acquisition of Determined AI (see news release below). This year’s theme was the Age of Insight (into data), and AI and ML are the technologies required to pull insight out of the swamp of data.
HPE’s strategy remains to be an as-a-service company, and that strategy is gaining momentum. It announced 97% customer retention with GreenLake, its cloud-as-a-service platform. We are seeing an uptake of this strategy specifically among manufacturing software companies, so I hope you manufacturing IT people are studying this.
Dr. Eng Lim Goh, CTO, stated in his keynote, “We are awash in data, but it is siloed. This brings a need for a federation layer.” Later, in the HPE Labs keynote, the concept of Dataspace was discussed. My introduction to that concept came from a consortium in Europe. More on that in a bit. Goh gazed into the future, predicting that we will need to know what data to collect, and then look at how and where to collect, find, and store it.
The HPE Labs look into Dataspaces highlighted these important characteristics: democratize data access; lead with open source; connect data producers and consumers; and remove silos. Compute can’t keep up with the amount of data being generated, hence the need for the exascale computing HPE is developing. Further, AI and ML are critical capabilities, but data is growing too fast to train models on all of it.
The Labs presentation brought out the need to think differently about programming in the future. There was also a look into future connectivity, with a focus on photonics research. This technology will enhance data movement, increasing bandwidth with low power consumption. To realize the benefits, engineers will have to recognize that it is more than a wire-to-wire exchange; this connectivity opens up new avenues of design freedom. Also, to obtain the best results from this technology for data movement, companies and universities must emphasize cross-disciplinary training.
Following is the news release on the Determined AI acquisition.
HPE acquires Determined AI to accelerate artificial intelligence innovation
Hewlett Packard Enterprise has acquired Determined AI, a San Francisco-based startup that delivers a software stack to train AI models faster, at any scale, using its open source machine learning (ML) platform.
HPE will combine Determined AI’s unique software solution with its world-leading AI and high performance computing (HPC) offerings to enable ML engineers to easily implement and train machine learning models to provide faster and more accurate insights from their data in almost every industry.
“As we enter the Age of Insight, our customers recognize the need to add machine learning to deliver better and faster answers from their data,” said Justin Hotard, senior vice president and general manager, HPC and Mission Critical Solutions (MCS), HPE. “AI-powered technologies will play an increasingly critical role in turning data into readily available, actionable information to fuel this new era. Determined AI’s unique open source platform allows ML engineers to build models faster and deliver business value sooner without having to worry about the underlying infrastructure. I am pleased to welcome the world-class Determined AI team, who share our vision to make AI more accessible for our customers and users, into the HPE family.”
Building and training optimized machine learning models at scale is considered the most demanding and critical stage of ML development, and doing it well increasingly requires researchers and scientists to face many challenges frequently found in HPC. These include properly setting up and managing a highly parallel software ecosystem and infrastructure spanning specialized compute, storage, fabric and accelerators. Additionally, users need to program, schedule and train their models efficiently to maximize the utilization of the highly specialized infrastructure they have set up, creating complexity and slowing down productivity.
Determined AI’s open source machine learning training platform closes this gap to help researchers and scientists focus on innovation and accelerate their time to delivery by removing the complexity and cost associated with machine learning development. This includes making it easy to set up, configure, manage and share workstations or AI clusters that run on-premises or in the cloud.
Determined AI also makes it easier and faster for users to train their models through a range of capabilities that significantly speed up training, which, in one use case related to drug discovery, went from three days to three hours. These capabilities include accelerator scheduling, fault tolerance, high speed parallel and distributed training of models, advanced hyperparameter optimization and neural architecture search, reproducible collaboration and metrics tracking.
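Hyperparameter optimization with early stopping is easier to grasp with a toy example. The sketch below implements a simplified successive-halving scheduler, the core idea behind ASHA-style schedulers such as those Determined AI popularized; it is illustrative only, with a made-up objective function, and is not Determined AI’s code:

```python
import random

def successive_halving(configs, train_step, rungs=(1, 3, 9)):
    """Toy successive-halving scheduler: evaluate all candidates on a
    small budget, keep the best-scoring half at each rung, and spend
    the bulk of the budget only on survivors."""
    survivors = list(configs)
    for budget in rungs:
        scored = [(train_step(cfg, budget), cfg) for cfg in survivors]
        scored.sort(key=lambda pair: pair[0], reverse=True)  # higher score = better
        survivors = [cfg for _, cfg in scored[: max(1, len(scored) // 2)]]
    return survivors[0]

# Hypothetical objective: score improves with budget and peaks at lr = 0.1.
def train_step(cfg, budget):
    return budget - abs(cfg["lr"] - 0.1)

random.seed(0)
candidates = [{"lr": 10 ** random.uniform(-4, 0)} for _ in range(8)]
best = successive_halving(candidates, train_step)
```

Early termination of poor performers is where the large speedups come from: most of the training budget goes to the few configurations that survive the early rungs.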
“The Determined AI team is excited to join HPE, who shares our vision to realize the potential of AI,” said Evan Sparks, CEO of Determined AI. “Over the last several years, building AI applications has become extremely compute, data, and communication intensive. By combining with HPE’s industry-leading HPC and AI solutions, we can accelerate our mission to build cutting edge AI applications and significantly expand our customer reach.”

To tackle the growing complexity of AI with faster time-to-market, HPE is committed to continuing to deliver advanced and diverse HPC solutions to train machine learning models and optimize applications for any AI need, in any environment. By combining Determined AI’s open source capabilities with its own, HPE is furthering its mission of making AI heterogeneous and empowering ML engineers to build AI models at greater scale.
Additionally, through HPE GreenLake cloud services for High Performance Computing (HPC), HPE is making HPC and AI solutions even more accessible and affordable to the commercial market with fully managed services that can run in a customer’s data center, in a colocation or at the edge using the HPE GreenLake edge to cloud platform.
Determined AI was founded in 2017 by Neil Conway, Evan Sparks, and Ameet Talwalkar and is based in San Francisco. It launched its open-source platform in 2020.
These IT cloud services are penetrating ever more deeply into industrial and manufacturing applications. I’m beginning to wonder where the trend is going for traditional industrial suppliers as they combine AWS (and Google Cloud and Azure) with control that is becoming more and more a commodity. What sort of business shake-ups lie in store for us? At any rate, here is news from a company called Element Analytics, which bills itself as “a leading software provider in IT/OT data management.” I’ve written about this company a couple of times recently. It has started strongly.
Element, a leading software provider in IT/OT data management for industrial companies, announced June 9 a new offering featuring an API integration between its Element Unify product and AWS IoT SiteWise, a managed service from Amazon Web Services Inc. (AWS) that makes it easy to collect, store, organize, and monitor data from industrial equipment at scale. The API integration is designed to give customers the ability to centralize plant data model integration and metadata management, enabling data to be ingested into AWS services, including AWS IoT SiteWise and an Amazon Simple Storage Service (S3) industrial data lake.
Available in AWS Marketplace, the Element Unify AWS IoT SiteWise API integration is designed to allow engineers and operators to monitor operations across facilities, quickly compute performance metrics, create applications that analyze industrial equipment data to prevent costly equipment issues, and reduce gaps in production.
“We are looking forward to bridging the on-premises data models we’ve built for systems like OSIsoft PI to AWS for equipment data monitoring using Element Unify,” said Philipp Frenzel, Head of Competence Center Digital Services at Covestro.
Element also announced its ISO 27001 certification, proving both that its security controls protect customer data and that its Information Security Management System (ISMS) provides the governance, risk management, and controls required for modern SaaS applications. Element Unify also supports AWS PrivateLink to provide an additional level of network security and control for customers.
“Our customers are looking for solutions that can help them improve equipment uptime, avoid revenue loss, cut O&M costs, and improve safety,” said Prabal Acharyya, Global Head of IoT Partners for Energy at AWS. “Now, the Industrial Machine Connectivity (IMC) on AWS initiative, along with Element Unify, makes possible a seamless API integration of both real-time and asset context OT data from multiple systems into an industrial data lake on AWS.”
Industrial customers need the ability to digitally transform to maximize productivity and asset availability, and lower costs in order to remain competitive. Element Unify, which aligns IT and operational technology (OT) around contextualized industrial IoT data, delivers key enterprise integration and governance capabilities to industrials. This provides them with rich insight, enabling smarter, more efficient operations. By integrating previously siloed and critical time-series metadata generated by sensors across industrial operations with established IT systems, such as Enterprise Asset Management, IT and OT teams can now easily work together using context data for their entire industrial IoT environment.
Through the API integration, AWS IoT SiteWise customers can now:
- Centralize plant data model integration and metadata management into a single contextual service for consumption by AWS IoT SiteWise.
- Ingest data directly into AWS services, including AWS IoT SiteWise and Amazon S3 data lake.
- Integrate and contextualize metadata from IT/OT sources in single-site, multi-site, and multi-instance deployments, and deploy the data model(s) to AWS IoT SiteWise.
- Enable metadata to be imported into AWS IoT SiteWise from customers’ systems through Inductive Automation Ignition and PTC KEPServerEX to create context data.
- Keep both greenfield and brownfield AWS IoT SiteWise asset models and asset hierarchies updated as the Element Unify models adapt to changes in the underlying data.
- Close the gap between raw, siloed, and disorganized IT/OT data and enriched, contextualized, and structured data that can be easily paired with business intelligence (BI), analytical tools, and condition monitoring tools like AWS IoT SiteWise.
- Empower the user to build complex asset data models with ease.
- Build and deploy data models to AWS IoT SiteWise and keep them up to date with changes happening in source systems.
- Easily create the underlying data models to power AWS IoT SiteWise and many other analytical systems including Amazon SageMaker, Amazon QuickSight, and Amazon Lookout for Equipment.
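To make the asset-model side of such an integration concrete, here is a hedged Python sketch that folds contextualized tag metadata into the kind of payload the AWS IoT SiteWise CreateAssetModel API expects. The payload shape follows the public API documentation, but the function name and sample tags are hypothetical, and field details may differ by SDK version:

```python
def to_sitewise_asset_model(equipment_name: str, tags: list) -> dict:
    """Fold contextualized OT tag metadata into the property list a
    SiteWise CreateAssetModel call expects. Each historian tag becomes
    a measurement property on the asset model."""
    properties = [
        {
            "name": tag["name"],
            "dataType": tag.get("data_type", "DOUBLE"),
            "unit": tag.get("unit", ""),
            "type": {"measurement": {}},
        }
        for tag in tags
    ]
    return {
        "assetModelName": equipment_name,
        "assetModelProperties": properties,
    }

# Hypothetical historian metadata for one pump.
pump_tags = [
    {"name": "DischargePressure", "unit": "kPa"},
    {"name": "MotorCurrent", "unit": "A", "data_type": "DOUBLE"},
]
model = to_sitewise_asset_model("Pump-101", pump_tags)
```

In a real deployment the resulting dict would be passed to the SDK's create-asset-model call, and the integration's job is keeping that model in sync as the source metadata changes, per the list above.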
“Operations data must be easy to use and understand in both the plant and in the boardroom. Industrial organizations who transform their plant data into context data for use in AWS IoT SiteWise, and other AWS services, will drive greater operational insights and achieve breakthrough business outcomes,” said Andy Bane, CEO of Element.
The Element Unify and AWS IoT SiteWise integration will be available to AWS IoT SiteWise customers in AWS Marketplace.