DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with fellow former Kepware leaders Tony Paine and Torey Penrod-Cambra, left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday: HighByte announced that the beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.
HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.
HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.
“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”
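The "wrong format, no context" problem Paine describes is concrete enough to sketch. The following is a minimal illustration of what industrial data contextualization means in practice, not HighByte's actual implementation: a raw tag value arrives with a cryptic address and no units, and a modeling layer merges it with asset metadata to produce an application-ready payload. The tag names, asset model, and field names are all invented for this example.

```python
# Raw reading as it might come off a PLC or OPC server: cryptic address,
# no units, no context.
raw_tag = {"id": "PLC1.DB20.W4", "v": 72.4, "t": 1618330000}

# Asset model supplying the missing context. This mapping and its field
# names are illustrative assumptions, not any real product's schema.
asset_model = {
    "PLC1.DB20.W4": {
        "asset": "Pump-07",
        "site": "Portland",
        "measurement": "bearing_temperature",
        "units": "degF",
    }
}

def contextualize(tag: dict, model: dict) -> dict:
    """Merge a raw tag reading with its asset metadata."""
    meta = model[tag["id"]]
    return {
        "asset": meta["asset"],
        "site": meta["site"],
        "measurement": meta["measurement"],
        "value": tag["v"],
        "units": meta["units"],
        "timestamp": tag["t"],
    }

payload = contextualize(raw_tag, asset_model)
print(payload["asset"], payload["measurement"], payload["value"])
```

The point of the exercise: the consuming application never has to know that `PLC1.DB20.W4` means a bearing temperature on Pump-07, and the same model can be reused for every application that needs the data.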
The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”
To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected]
Learn more about the HighByte founding team—all people I've known for many years in the data connectivity business.
From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.
DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.
DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.
From Oracle: DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.
At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products that are relevant to your customers — which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, etc., an ongoing process that will likely include input from your engineering, IT, and business teams.
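Oracle's churn example can be made concrete with a toy version of the recommendation engine it describes: recommend products that co-occur with a customer's past purchases. Real systems are far more involved; this sketch (with invented purchase data) just shows why the data science team needs frictionless access to purchase data in the first place.

```python
from collections import Counter
from itertools import permutations

# Hypothetical purchase histories — the data DataOps must make accessible.
orders = [
    ["hammer", "nails", "tape"],
    ["hammer", "nails"],
    ["nails", "tape"],
]

# Count how often each ordered pair of products appears in the same order.
co_counts = Counter()
for order in orders:
    for a, b in permutations(order, 2):
        co_counts[(a, b)] += 1

def recommend(product: str, k: int = 2) -> list:
    """Top-k products most often bought alongside `product`."""
    scored = [(n, other) for (p, other), n in co_counts.items() if p == product]
    return [other for n, other in sorted(scored, reverse=True)[:k]]

print(recommend("hammer"))  # nails co-occurs most often, then tape
```

Keeping a model like this fed with new orders, monitored, and integrated with the website is exactly the ongoing, cross-team process the paragraph above describes.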
As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.
Digitalization requires digital data, which in turn requires a place to robustly store that data. OSIsoft PI System must be the most widely used industrial historian database. Last November I wrote about the company bringing its PI System to Amazon Web Services (AWS). The company has released OSIsoft Cloud Services—a cloud-native, real-time data management system for unifying and augmenting critical operations data from across an organization to accelerate industrial analytics, data science projects, data sharing, and other digital transformation initiatives.
Given how deep I’ve been led into IT over the past few years, this advancement from OSIsoft becomes part of a significant trend of blending IT and OT just above the control layer. In fact, if you segregate off the actual “control” part of automation, that system itself has become an important element of the Internet of Things (IoT) blending into the overall IT infrastructure. If you are not thinking about the big picture in today’s industrial environment, you will miss important paths to profitability.
Back to today’s OSIsoft news. The OSIsoft Cloud Services (OCS) in a capsule:
- Data sharing – partner companies can access a shared data stream to remotely monitor technology
- Functionality – seamless crossover between the PI System and OCS to compare facilities, perform root cause analysis and run hypotheticals
- Scalability – tests proved OCS can simultaneously manage over two billion data streams, and safely share information with partners
- Petuum uses OCS to stream historical and live data on production, temperature, and variability to its AI platform, helping Cemex, a global cement manufacturer, improve yield and energy from 2% to 7%.
- DERNetSoft uses OCS to aggregate data in one place, allowing users to access useful analytics for ways to reduce power and save money.
- Pharma companies will use OCS to give a regulator access to anonymized drug testing or production data without the risk of unauthorized users entering manufacturing networks.
With OCS, an engineer at a chemical producer, for example, could combine maintenance and energy data from multiple facilities into a live superset of information to boost production in real-time while planning analysts could merge several years’ worth of output and yield data to create a ‘perfect plant’ model for capital forecasts.
OCS can also be leveraged by software developers and system integrators to build new applications and services or to link remote assets.
“OSIsoft Cloud Services is a fundamental part of our mission to help people get the most out of the data that is at the foundation of their business. We want their cost of curiosity to be as close to zero as possible,” said Gregg Le Blanc, Vice President of Product at OSIsoft. “OCS is designed to complement the PI System by giving customers a way to uncover new operational insights and use their data to solve new problems that would have been impractical or impossible before.”
Data scientists spend 50 percent or more of their time curating large data sets instead of conducting analytics. IT teams get bogged down in managing VPNs for third parties or writing code for basic administrative tasks. Data becomes inaccessible and locked in silos. Over 1,000 utilities, 80% of the largest oil and gas companies, and 65% of the Fortune 500 industrial companies already use the PI System to harness critical operations data, turning it into an asset for improving productivity, saving money, and developing new services.
Natively compatible with the PI System, OCS extends the range of possible applications and use cases of OSIsoft’s data infrastructure while eliminating the challenges of capturing, managing, enhancing, and delivering operations data across an organization. Within a few hours, thousands of data streams containing years of historical data can be transferred to OCS, allowing customers to explore, experiment, and share large data sets the same day.
The core of OCS is a highly scalable sequential data store optimized for time series data, depth measurements, temperature readings, and similar data. OSIsoft has also embedded numerous usability features for connecting devices, managing users, searching, transferring data from the PI System to OCS, and other functions. OCS can also accept data from devices outside of traditional control networks or other sources.
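The "sequential data store optimized for time series" idea is worth a sketch. Below is a minimal illustration of the concept, not OSIsoft's implementation: because readings are appended in time order, a time-range query reduces to two binary searches over the timestamp column. All data here is invented.

```python
import bisect

class SequenceStore:
    """Toy append-only time-series store."""

    def __init__(self):
        self.timestamps = []  # stays sorted because data arrives in time order
        self.values = []

    def append(self, ts: float, value: float) -> None:
        # Sequential stores optimize for in-order appends.
        if self.timestamps and ts < self.timestamps[-1]:
            raise ValueError("out-of-order write")
        self.timestamps.append(ts)
        self.values.append(value)

    def range(self, start: float, end: float) -> list:
        """Return (ts, value) pairs with start <= ts <= end."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_right(self.timestamps, end)
        return list(zip(self.timestamps[lo:hi], self.values[lo:hi]))

store = SequenceStore()
for ts, temp in [(100, 21.5), (101, 21.7), (102, 21.6), (103, 22.0)]:
    store.append(ts, temp)
print(store.range(101, 102))  # [(101, 21.7), (102, 21.6)]
```

Keeping the data sorted on arrival is what lets a store like this scale to billions of streams: every range query is logarithmic in the stream length rather than a full scan.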
“The scale and scope of data that will be generated over the coming decades is unprecedented, but our mission remains the same,” said Dr. J. Patrick Kennedy, CEO and Founder of OSIsoft. “OSIsoft Cloud Services represents the latest step in a nearly 40-year journey and there’s more to come.”
OCS is a subscription service currently available to customers and partners for use in facilities in North America. OCS will be extended to Europe and to other regions in the near future.
The Manufacturing Connection was conceived in 2013, when I decided to go it alone, built on the ideas of a new industrial infrastructure and enhanced connectivity. I even worked out a cool mind map to figure it all out.
Last week I was on vacation, spending some time at the beach reading, thinking, and catching up on some long-neglected things. Next week I am off to Las Vegas for the Hewlett Packard Enterprise “Discover” conference, where I’ll be inundated with new ideas in infrastructure.
Meanwhile, I’ll share something I picked up from the Sloan Management Review (from MIT). This article was developed from a blog post by Jason Killmeyer, enterprise operations manager in the Government and Public Sector practice of Deloitte Consulting LLP, and Brenna Sniderman, senior manager in Deloitte Services LP.
They approach things from a much higher level in the organization than I usually do. They recognize what I’ve often stated about business executives reading about all these new technologies, such as, cloud computing, internet of things, AI, blockchain, and others. “The potential resulting haste to adopt new technology and harness transformative change can lead organizations to treat these emerging technologies in the same manner as other, more traditional IT investments — as something explored in isolation and disconnected from the broader technological needs of the organization. In the end, those projects can eventually stall or be written off, leaving in their wake skepticism about the usefulness of emerging technologies.”
This analysis correctly identifies the organizational challenges that arise when leaders read about these technologies or hear other executives at the club talk about them.
The good news, according to the authors: “These new technologies are beginning to converge, and this convergence enables them to yield a much greater value. Moreover, once converged, these technologies form a new industrial infrastructure, transforming how and where organizations can operate and the ways in which they compete. Augmenting these trends is a third factor: the blending of the cyber and the physical into a connected ecosystem, which marks a major shift that could enable organizations to generate more information about their processes and drive more informed decisions.”
They identify three capabilities and three important technologies that make them possible:
Connect: Wi-Fi and other connectivity enablers. Wi-Fi and related technologies, such as low-power wide-area networks (LPWAN), allow for cable-free connection to the internet almost anywhere. Wi-Fi and other connectivity and communications technologies (such as 5G) and standards connect a wide range of devices, from laptops to IoT sensors, across locations and pave the way for the extension of a digital-physical layer across a broader range of physical locations. This proliferation of connectivity allows organizations to expand into new markets and geographies more easily.
Store, analyze, and manage: cloud computing. The cloud has revolutionized how many organizations distribute critical storage and computing functions. Just as Wi-Fi can free users’ access to the internet across geographies, the cloud can free individuals and organizations from relying on nearby physical servers. The virtualization inherent in cloud, supplemented by closer-to-the-source edge computing, can serve as a key element of the next wave of technologies blending the digital and physical.
Exchange and transact: blockchain. If cloud allows for nonlocal storage and computing of data — and thus the addition or extraction of value via the leveraging of that data — blockchain supports the exchange of that value (typically via relevant metadata markers). As a mechanism for value or asset exchange that executes in both a virtualized and distributed environment, blockchain allows for the secure transacting of valuable data anywhere in the world a node or other transactor is located. Blockchain appears poised to become an industrial and commercial transaction fabric, uniting sensor data, stakeholders, and systems.
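The property that makes blockchain a candidate "transaction fabric" can be shown in a few lines. This is a bare-bones, purely illustrative sketch: each block commits to the previous one by hash, so tampering with any recorded transaction breaks every later link. Real chains add consensus, signatures, and distribution across nodes; the sensor transactions here are invented.

```python
import hashlib
import json

def make_block(prev_hash: str, transactions: list) -> dict:
    """Build a block whose hash covers its transactions and its parent."""
    body = {"prev": prev_hash, "txs": transactions}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain: list) -> bool:
    """Check every block's hash and its link to the previous block."""
    prev = "0" * 64  # genesis parent
    for block in chain:
        body = {"prev": block["prev"], "txs": block["txs"]}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != recomputed:
            return False
        prev = block["hash"]
    return True

chain = [make_block("0" * 64, [{"sensor": "S1", "reading": 42}])]
chain.append(make_block(chain[-1]["hash"], [{"sensor": "S2", "reading": 17}]))
print(verify(chain))   # True

chain[0]["txs"][0]["reading"] = 99  # tamper with recorded sensor data
print(verify(chain))   # False — the altered block no longer matches its hash
```

That tamper-evidence, combined with distributed execution, is what the authors mean by secure transacting of valuable data wherever a node is located.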
My final thought about infrastructure—they made it a nice round number, namely three. However, I’d add another piece especially to the IT hardware part. That would be the Edge. Right now it is all happening at the edge. I bet I will have a lot to say and tweet next week about that.
The point of manufacturing is to design and make a product, then deliver the right product to the customer. Sometimes we just churn out a large quantity and hope they will sell. Sometimes we configure to order or “mass customize” products.
For example, I once had a job where I reported to the Vice President of product development for a manufacturing company. Two of us reported to him. The other guy headed the engineering teams for all of our standard products. I had a small team and we did special projects. One task was to help sales people go through complex specs and configure our product to meet the specs. My technology was a 4-column accounting ledger, pencil, and Singer adding machine.
I bet I made mistakes.
Frederic Laziou, CEO of Tacton, a Swedish company with a product in the CPQ space, just talked with me about what’s happening with his company and the technology.
CPQ stands for Configure, Price, Quote, a category recognized by Gartner, and Tacton is firmly situated in Gartner’s Magic Quadrant for CPQ.
A SaaS company born in the cloud out of a research institute in Sweden, Tacton combined AI with search research to become, in effect, a product search engine. Among CPQ companies, Tacton is unique as a niche player with deep domain knowledge in manufacturing.
CPQ would have made my job from the late 70s easier, more accurate, and better documented. Good stuff.
Laziou states that Tacton has a presence in Europe, Japan, and North America. There is a common thread among its customers: they cannot compete on price. Mass customization, or individualization, helps these companies compete.
The company has been seeking to deepen its North American presence as a niche player in these areas:
- Production lines
- Power Generation (Dresser/Siemens)
- Medical Technology (Siemens and GE)
- Heavy commercial vehicles
- Fluid and air flow
Some 30-40% of that business is in the US. To sustain leadership, the company needs to be closer to its customers and partners, hence a US headquarters. It established an office in the Chicago area. Two additional reasons for the Chicago location: direct flights to Stockholm and a wealth of necessary talent in manufacturing, software, and sales and marketing.
Putting Tacton in the context of Industry 4.0, it puts the customer at the center of the customer’s digitization.
Laziou says that as more people bring consumer technology into the business context, Tacton has developed AR to enhance customer engagement. AR also addresses a big pain point: errors in quotes.
To fuel the expansion and drive increased sales, the company is making a $12 million investment over the next three years to establish joint headquarters in Chicago.
In conjunction with its expansion and investment in the U.S. market, Tacton is also announcing new capabilities in its cloud-based CPQ platform. The new features include augmented reality (AR)-powered visualization and expanded integration with Salesforce that makes it even easier for manufacturers to design, configure and sell complex products.
Founded in 1998, Tacton offers CPQ software and design automation solutions that help the world’s largest manufacturers, such as Bosch, Siemens, and Caterpillar, manage the complexities traditionally associated with producing customized and configured products that meet strict customer requirements.
For example, the Industrial Power division at Siemens uses Tacton’s sales configuration software to slash the time it takes to prepare price quotes and simplify product configuration of custom solutions. It used to take Siemens eight weeks to produce a custom quote for its gas turbine units. With Tacton’s CPQ, the sales team now produces the same quote in a matter of minutes, without requiring any help from product specialists.
“The beauty of the Tacton Configurator is that it will guide the sales representative and get the product configuration for an accurate price quote each time. It now takes us only five minutes to generate a complete budget offer including pricing. This saves us tremendous amounts of time and money,” said Siemens Senior Engineer CRM process & IT Development Jan Nilsson.
Tacton CPQ now includes visual product configuration, including real-time, interactive 3D drag-and-drop functionality. Sales engineers can interact directly with a sophisticated configuration tool, powered by AR, to visualize the configuration within the actual environment.
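The guided-configuration idea behind a CPQ engine, the kind of logic I once did by hand with a ledger and adding machine, comes down to compatibility rules plus pricing. Here is a toy sketch of that idea; the options, rules, and prices are invented for illustration and are not Tacton's model.

```python
# Hypothetical option catalog with list prices.
PRICES = {"motor_5kW": 900, "motor_10kW": 1400, "frame_small": 300, "frame_large": 500}

# Each rule is (option, required_option): e.g. the 10 kW motor
# needs the large frame to be a buildable product.
RULES = [("motor_10kW", "frame_large")]

def configure(selections: list) -> dict:
    """Validate a selection against the rules and price the quote."""
    for option, required in RULES:
        if option in selections and required not in selections:
            return {"valid": False, "error": f"{option} requires {required}"}
    return {"valid": True, "price": sum(PRICES[s] for s in selections)}

print(configure(["motor_10kW", "frame_small"]))  # rejected: invalid combination
print(configure(["motor_10kW", "frame_large"]))  # valid, priced at 1900
```

A real configurator handles thousands of interacting constraints with a solver rather than a rule scan, but the payoff is the same one Siemens describes: every quote that comes out is buildable and correctly priced.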
The new capabilities extend Tacton’s integration with Salesforce to boost sales with features including needs-based configuration. Salesforce Sales Cloud customers can now connect to the full power of Tacton CPQ leveraging its best-in-class AI-driven configurator for the manufacturing industry. Tacton CPQ for Salesforce features automated CAD drawing and engineer-to-order (ETO) processes with out-of-the-box integration to all leading CAD solutions, integration with SAP ERP and Variant Configurator (SAP LO-VC) and open APIs that enable full integration with the existing IT stack.
“We experienced significant customer adoption for Tacton CPQ solutions among European manufacturers over the last two years, making it the ideal time to meet a similar demand in the United States,” said Frederic Laziou, CEO of Tacton. “By continuing to add breakthrough enhancements like AR-powered visual configuration to our CPQ solution, we can drive even greater efficiency for manufacturers, making it even simpler and faster to sell complex products.”
The CEO of Zededa told me in an interview a few months ago that his mission was no less than to build the largest computing company on Earth without owning infrastructure. Its vision—create a new edge economy that allows applications to run anywhere.
When I wrote in April, the company was emerging from stealth mode. Its most recent announcement proclaims:
- First demonstrable cloud-native platform for edge applications, early customer access to end-to-end app operations platform purpose built for the edge underway
- Zero-touch infrastructure modernization for legacy embedded systems; simple to move legacy apps and OS from outdated systems to newer, cloud-native edge hardware
- Zededa joins EdgeX Foundry to bolster the organization’s vision of an open and secure cloud-native future that enables all new IoT applications
- Major edge system vendors turning to Zededa for operational automation, insights and protection of applications running on their systems
Zededa announced early access to its platform that provides real-time apps a simple “on-ramp” to the cloud-native edge. From legacy embedded systems to modern, AI-based IoT apps, the platform provides the scalability, security and visibility required to allow operations teams to unlock the power of real-time apps without concerns about bandwidth, latency or dependency on the cloud.
Operations technology teams have three primary situations to deal with when it comes to IoT applications: how to upgrade and secure a massive install base of legacy embedded systems, how to retrofit existing equipment with IoT sensors and applications to take advantage of real-time data, and how to deploy entirely new applications like AI-powered robots and self-driving fleets.
Closed, monolithic systems at the edge—either closed by design or closed because of legacy embedded device development workflows—are the last major impediment to solving these problems and enabling IoT to achieve its stated $1.3 trillion market potential. Zededa’s platform demonstrates how cloud-native edge solves the most urgent problem for organizations looking into digital transformation—upgrading and protecting legacy systems without truck-rolls—and gives solution providers a way to easily adopt IoT sensors and industrial gateways to provide real-time data to operational software. Initial natively-supported hardware partners include platforms built on ARM and Intel x86 processors from leading vendors including Advantech Corporation, Lanner, SuperMicro, and Scalys.
“Cloud-native edge computing will be a diverse universe unlike anything in cloud datacenters today,” said Roman Shaposhnik, VP of Product and Strategy at Zededa. “We are making the modernization of edge infrastructure secure, simple and automated in preparation for a fundamental shift away from legacy embedded systems. An open system that allows BYO hardware into a cloud-native platform is a start of the future: a computing environment that is distributed, autonomous and cooperative.”
To help drive entirely new applications and operational possibilities at the edge across a diverse universe of devices, Zededa has joined EdgeX Foundry, a vendor-neutral open source project hosted by The Linux Foundation with a goal to build a common open framework for IoT edge computing.
“Interoperability and convergence on common industry standards is vital for organizations deploying next-generation distributed computing solutions at the IoT Edge,” said Jason Shepherd, Chair of EdgeX Foundry Governing Board and Dell Technologies IoT CTO. “By joining EdgeX Foundry’s efforts Zededa will help promote the project’s important work of creating an open ecosystem of secure, interoperable edge applications that will change user experiences and drive the future of business.”
Currently providing early access to select customers, Zededa is accepting sign-ups for demonstrations and private briefings.
Founded in 2016, Zededa is pioneering a cloud-native approach to the deployment, management and security of real-time edge applications at hyperscale for solutions ranging from self-driving cars to industrial robots. Zededa is headquartered in Santa Clara, CA with engineering and market development teams based in India, UK, Germany and Korea.
EdgeX Foundry is an open source project hosted by The Linux Foundation building a common open framework for IoT edge computing and an ecosystem of interoperable components that unifies the marketplace and accelerates the deployment of IoT solutions. Designed to run on any hardware or operating system and with any combination of application environments, EdgeX enables developers to quickly create flexible IoT edge solutions that can easily adapt to changing business needs.
I’m interested in where all this IoT, IT/OT convergence, and digital transformation technology is taking us. This survey on platforms is revealing. Nearly forty percent of respondents are using Platform as a Service (PaaS), containers, and serverless technologies together for flexibility and interoperability.
Companies are using different cloud-native technologies side-by-side at an increasing pace to build both new cloud-native applications and to refactor traditional applications, according to the latest report <https://www.cloudfoundry.org/multi-platform-trend-report-2018> released by the Cloud Foundry Foundation <http://www.cloudfoundry.org/>, home of the most widely-adopted open source cloud technologies in the world.
“As IT Decision Makers settle into their cloud journey, they are more broadly deploying a combination of available platforms, including PaaS, containers and serverless,” said Abby Kearns, Executive Director, Cloud Foundry Foundation. “In this multi-platform world, it should come as no surprise that, as they become more comfortable with these tools, IT decision makers are searching for a suite of technologies to work together. They want technologies that integrate with their current solutions in order to address their needs today—but are flexible enough to address their needs in the future.”
Key Findings include:
• A Multi-Platform World: Technologies are being used side by side more than ever before. Among IT decision makers, 77 percent are using or evaluating Platform-as-a-Service (PaaS), 72 percent are using or evaluating containers, and 46 percent are using or evaluating serverless computing. More than a third (39 percent) are using a combination of all three technologies together.
• A Mix of New Cloud-Native and Refactoring Legacy Applications: 57 percent of IT decision makers report their companies do a mix of building new cloud-native applications and refactoring existing applications, an increase of nine percentage points from late 2017.
• Containers Have Crossed the Chasm: For companies choosing to develop new or refactor existing applications, they are choosing containers.
• Serverless is on the Upswing: Serverless computing is being evaluated with rapid momentum. Only 43 percent of respondents are not using serverless and 10 percent more companies are evaluating serverless than in 2017.
• PaaS Usage Continues to Swell: PaaS is being more broadly deployed than ever before and companies are developing new cloud-native applications at increased momentum. It stands to reason that these two upsurges happen in tandem. This growth in usage is bolstered by the 62 percent of IT decision makers who report their companies save over $100,000 by using a PaaS.
• Flexibility and Interoperability are Key: IT decision makers ranked “Integration with existing tools” and “Flexibility to work with new tools” in the top five attributes in a platform, alongside Security, Support, and Price.
Less than six months ago, the Foundation’s 2017 report “Innovation & Relevance: A Crossroads on the Cloud Journey <https://www.cloudfoundry.org/cloud-journey-report/>” indicated IT decision makers were advancing in their cloud journeys. In 2016, IT decision makers reported a lack of clarity around IaaS (Infrastructure-as-a-Service) and PaaS (Platform-as-a-Service) technologies, and most were still in the evaluation stages. By late 2017, they had progressed to selection and broader deployment of cloud solutions. Twenty percent of IT decision makers report primarily building new cloud-native applications, up five percentage points from last year, while 13 percent say they are primarily refactoring, a drop of 11 points.
Cloud Foundry Application Runtime is a mature and growing cloud application platform used by large enterprises to develop and deploy cloud-native applications, saving them significant amounts of time and resources. Enterprises benefit from the consistency of Cloud Foundry Application Runtime across a variety of distributions of the platform, thanks to a Certified Provider program. Cloud Foundry Container Runtime combines Kubernetes with the power of Cloud Foundry BOSH, enabling a uniform way to instantiate, deploy and manage highly available Kubernetes clusters on any cloud, making deployment, management and integration of containers easy.
To receive a copy of the survey, go here <https://www.cloudfoundry.org/multi-platform-trend-report-2018>. The survey was conducted and produced by ClearPath Strategies <http://www.clearpath-strategies.com/>, a strategic consulting and research firm for the world’s leaders and progressive forces.
Cloud Foundry is an open source technology backed by the largest technology companies in the world, including Dell EMC, Google, IBM, Microsoft, Pivotal, SAP and SUSE, and is being used by leaders in manufacturing, telecommunications and financial services. Only Cloud Foundry delivers the velocity needed to continuously deliver apps at the speed of business. Cloud Foundry’s container-based architecture runs apps in any language on your choice of cloud — Amazon Web Services (AWS), Google Cloud Platform (GCP), IBM Cloud, Microsoft Azure, OpenStack, VMware vSphere, and more. With a robust services ecosystem and simple integration with existing technologies, Cloud Foundry is the modern standard for mission critical apps for global organizations.
The Cloud Foundry Foundation is an independent non-profit open source organization formed to sustain the development, promotion and adoption of Cloud Foundry as the industry standard for delivering the best developer experiences to companies of all sizes. The Foundation projects include Cloud Foundry Application Runtime, Cloud Foundry Container Runtime, BOSH, Open Service Broker API, Abacus, CF-Local, CredHub, ServiceFabrik, Stratos and more. Cloud Foundry makes it faster and easier to build, test, deploy and scale applications, and is used by more than half the Fortune 500, representing nearly $15 trillion in combined revenue. Cloud Foundry is hosted by The Linux Foundation and is an Apache 2.0 licensed project available on GitHub: https://github.com/cloudfoundry. To learn more, visit: http://www.cloudfoundry.org <http://www.cloudfoundry.org/>.