Digitalization requires digital data, which in turn requires a robust place to store that data. These days, that place is most often the cloud. OSIsoft's PI System is arguably the most widely used industrial database, and the company has released OSIsoft Cloud Services (OCS), a cloud-native, real-time data management system for unifying and augmenting critical operations data from across an organization to accelerate industrial analytics, data science projects, data sharing, and other digital transformation initiatives.
OCS highlights and capabilities:
- Data sharing – partner companies can access a shared data stream to remotely monitor technology
- Functionality – seamless crossover between the PI System and OCS to compare facilities, perform root cause analysis and run hypotheticals
- Scalability – tests showed OCS can simultaneously manage over two billion data streams and safely share information with partners
- Petuum uses OCS to stream historical and live data on production, temperature, and variability to its AI platform, helping Cemex, a global cement manufacturer, improve yield and energy performance from 2% to 7%.
- DERNetSoft uses OCS to aggregate data in one place, giving users access to analytics that identify ways to reduce power consumption and save money.
- Pharma companies will use OCS to give regulators access to anonymized drug testing or production data, without the risk of admitting unauthorized users into manufacturing networks.
With OCS, an engineer at a chemical producer, for example, could combine maintenance and energy data from multiple facilities into a live superset of information to boost production in real time, while planning analysts could merge several years’ worth of output and yield data to create a ‘perfect plant’ model for capital forecasts.
OCS can also be leveraged by software developers and system integrators to build new applications and services or to link remote assets.
“OSIsoft Cloud Services is a fundamental part of our mission to help people get the most out of the data that is at the foundation of their business. We want their cost of curiosity to be as close to zero as possible,” said Gregg Le Blanc, Vice President of Product at OSIsoft. “OCS is designed to complement the PI System by giving customers a way to uncover new operational insights and use their data to solve new problems that would have been impractical or impossible before.”
The Data Dilemma
Critical operations data—i.e. data generated by production lines, safety equipment, grids, and other systems essential to a company’s survival—is part of one of the fastest growing segments in the data universe. IDC and Seagate estimate in “Data Age 2025: The Evolution of Data to Life Critical” that “hypercritical” data for applications such as distributed control systems is growing by 54% a year and will constitute 10% of all data by 2025 while real-time data will nearly double to more than 25% of all data.
Critical operations data, however, can be extremely difficult to manage or use.
Data scientists spend 50 percent or more of their time curating large data sets instead of conducting analytics. IT teams get bogged down in managing VPNs for third parties or writing code for basic administrative tasks. Data becomes inaccessible and locked in silos. Over 1,000 utilities, 80% of the largest oil and gas companies, and 65% of the Fortune 500 industrial companies already use the PI System to harness critical operations data, turning it into an asset for improving productivity, saving money, and developing new services.
Natively compatible with the PI System, OCS extends the range of possible applications and use cases of OSIsoft’s data infrastructure while eliminating the challenges of capturing, managing, enhancing, and delivering operations data across an organization. Within a few hours, thousands of data streams containing years of historical data can be transferred to OCS, allowing customers to explore, experiment, and share large data sets the same day.
Two Billion Data Streams
The core of OCS is a highly scalable sequential data store optimized for time series data, depth measurements, temperature readings, and similar data. OSIsoft has also embedded numerous usability features for connecting devices, managing users, searching, transferring data from the PI System to OCS, and other functions. OCS can also accept data from devices outside of traditional control networks or other sources.
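The "sequential data store" idea is straightforward to sketch: events are kept ordered by an index (typically a timestamp), so a time-range query reduces to a binary search plus a slice. The toy Python class below illustrates the concept only; it is not the OCS API, and every name in it is invented.

```python
import bisect
from typing import List, Tuple

class SequentialStore:
    """Toy sequential data store: events kept ordered by timestamp,
    so time-range queries are a cheap binary search and slice."""

    def __init__(self) -> None:
        self._times: List[float] = []
        self._values: List[float] = []

    def append(self, t: float, value: float) -> None:
        # Insert in timestamp order; out-of-order arrivals are allowed.
        i = bisect.bisect_right(self._times, t)
        self._times.insert(i, t)
        self._values.insert(i, value)

    def range(self, start: float, end: float) -> List[Tuple[float, float]]:
        # All events with start <= t <= end.
        lo = bisect.bisect_left(self._times, start)
        hi = bisect.bisect_right(self._times, end)
        return list(zip(self._times[lo:hi], self._values[lo:hi]))

store = SequentialStore()
for t, v in [(3.0, 21.5), (1.0, 20.9), (2.0, 21.1), (4.0, 21.8)]:
    store.append(t, v)
print(store.range(1.5, 3.5))  # events at t=2.0 and t=3.0
```

A production store like OCS layers indexing, compression, and access control on top of this basic ordered-by-time contract, which is what makes bulk transfer of years of history tractable.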
“The scale and scope of data that will be generated over the coming decades is unprecedented, but our mission remains the same,” said Dr. J. Patrick Kennedy, CEO and Founder of OSIsoft. “OSIsoft Cloud Services represent the latest step in a nearly 40-year journey and there’s more to come.”
To test the scalability and stability of OCS, OSIsoft created a deployment that contained the equivalent of the data generated by all of the smart meters in the U.S. over the last two years, or two billion data streams (100 million meters with 20 data streams each). OCS successfully stored up to 1.2 billion data points per hour and was managing all two billion streams simultaneously within 48 hours.
PaaS for OSIsoft Marketplace Partners
Software developers are already creating services based around OCS. DERNetSoft is creating a secure marketplace for sharing utility and electric power data to improve energy forecasts and peak shaving strategies. Meanwhile, others are collaborating with customers on efforts to bolster well integrity at oil drilling sites, pinpoint tank leakage, predict maintenance problems, and reduce energy consumption with OCS. OSIsoft partners developing OCS services include Petuum, Seeq, Toumetis, Transpara, Aperio, and TrendMiner. These services will be available from the OSIsoft Marketplace as they are released.
“Digital transformation requires the ability to compare data and outcomes across multiple plants and data sources,” says Michael Risse, VP/CMO at Seeq. “OCS is a unified solution for process manufacturing customers to enable this type of analysis, generating predictive insights on thousands of assets across company operations to improve production outcomes.”
Pricing and Availability
OCS is a subscription service currently available to customers and partners for use in facilities in North America. OCS will be extended to Europe and to other regions in the near future.
Pricing is based on the average number of data streams accessed, rather than the unique data streams stored, giving customers the freedom to experiment with their data without incurring added costs.
Siemens is serious about building out its IoT platform, MindSphere, on its way to realizing its vision of becoming the technology supplier for digital transformation in manufacturing. How else to describe the €0.6 billion (about $700 million) acquisition of Mendix, a popular low-code application development platform?
Mendix, which was founded in the Netherlands but now has its headquarters in Boston, will continue to operate as usual and keep its name, but Siemens notes that it will also use the company’s technology to accelerate its own cloud, IoT and digital enterprise ambitions.
“As part of our digitalization strategy, Siemens continues to invest in software offerings for the Digital Enterprise. With the acquisition of Mendix, Siemens continues to add to its comprehensive Digital Enterprise and MindSphere IoT portfolio, with cloud domain expertise, cloud agnostic platform solutions and highly skilled people,” said Jan Mrosik, CEO of Siemens’ Digital Factory Division.
Mendix’s service is already deeply integrated into IBM’s, SAP’s, and Pivotal’s cloud services. Mendix co-founder and CEO Derek Roos notes that his company and Siemens first discussed a strategic partnership, but as those talks progressed, the two companies moved toward an acquisition instead. Roos argues that the two companies’ visions are quite similar and that Siemens is committed to helping accelerate Mendix’s growth, extending the company’s platform, and combining it with Siemens’ existing MindSphere IoT system.
“If you’ve ever wondered which low-code platform will have the viability to invest and win in the long term, you no longer have to guess,” Roos writes. “This commitment and investment from Siemens will allow us to accelerate R&D and geo-expansion investments significantly. You’re going to see faster innovation, more reach and an even better customer experience from us.”
Over the course of the last few years, ‘low-code’ has become increasingly popular as more and more enterprises try to enable all of their employees to access and use the data they now store. Not every employee is going to learn how to program, though, so tools like Mendix, K2, and others now make it easy for non-developers to quickly build (mostly database-backed) applications. (See my last post on ERP and “consumerization”.)
Here is a longer explanation from Roos’ blog:
As the world around us gets increasingly connected, organizations are facing increasing challenges to cope with vast amounts of data and customers are increasingly expecting entirely new experiences and interactions. New technologies like VR, IoT and AI will drive an incredible convergence between the digital and physical worlds, creating entirely new industries and business moments in which people, data, businesses and things work together, dynamically.
This, once again, will put more pressure on business/IT organizations to adapt and change how apps are built and consumed, in ways that few can comprehend right now. And just like we’ve done for web and mobile applications, we also intend to set the direction and lead the market for our customers in this new era.
And this is where Siemens comes in.
As one of the world’s largest industrial powerhouses, there are few companies on the planet that are dealing with more mission-critical data and better positioned to blur the lines between our physical and digital worlds. With millions of connected devices and systems, operations in more than 200 countries, and more than 15,000 software engineers, Siemens has access to know-how, expertise and reach few others can match. Even fewer software companies can attempt to compete with such scale in ‘things’.
Siemens has been on a mission to leverage its foothold and data-rich infrastructure in the physical world, to become a leader in the digital world, investing over $10B in the last decade to acquire and build out software businesses, and to create the Industrial IoT platform, MindSphere.
Our two teams first met over a year ago and what started as a discussion about a strategic partnership, gradually evolved into a much bigger vision. The more time we spent together, the more we realized how our visions were aligned. Together with Joe Kaeser, CEO Siemens AG, Jan Mrosik, CEO Digital Factory Division, and Tony Hemmelgarn, CEO PLM Software, we identified three strategic areas where we could win together:
- Accelerate Mendix’ leadership in low-code by doubling down on R&D investments and geographical expansion: By becoming a part of Siemens, we will be able to access an even bigger investment than going public, and we will immediately get access to an enormous global infrastructure that would take much longer to stand up ourselves. We are committed to extending our leadership in low-code and will significantly accelerate investments in R&D, Customer Success and global expansion.
- Combine Mendix and MindSphere to create the digital operating system for the physical world: With billions of intelligent devices and machines connected to the cloud, organizations will require a new kind of platform to turn these massive amounts of data into real-time business value. By combining Mendix and MindSphere, we will be in a unique position to bridge the physical and digital worlds.
- Extend the Mendix platform to develop world-class and deeply integrated industry SaaS solutions: Becoming a part of Siemens gives us unprecedented access to deep industry know-how, network and expertise. Together with our partner ecosystem, we’ll be able to extend the Mendix platform with deeply integrated vertical solutions across a wide range of industries. Combining low-code with best-practice solutions and templates will provide even more value and speed to market for our customers.
I’m all about IoT and digitalization these days. This is the next movement following the automation trend I championed some 15 years ago.
Last month, I started receiving emails about predictions for 2018. Not my favorite topic, but I started saving them. I really only received a couple of good ones. Here they are: one from Cisco and one from FogHorn Systems.
From a Cisco blog written by Rowan Trollope, Cisco’s SVP of the Internet of Things (IoT) and Applications Division, come several looks at IoT from a variety of angles. There is more at the blog, and I encourage you to visit for the details.
Until now, the Internet-of-Things revolution has been, with notable outlier examples, largely theoretical and experimental. In 2018, we expect that many existing projects will show measurable returns, and more projects will be launched to capitalize on data produced by billions of new connected things.
With increased adoption there will be challenges: Our networks were not built to support the volumes and types of traffic that IoT generates. Security systems were not originally designed to protect connected infrastructure against IoT attacks. And managing industrial equipment that is connected to traditional IT requires new partnerships.
I asked the leaders of some of the IoT-focused teams at Cisco to describe their predictions for the coming year, to showcase some of these changes. Here they are.
IoT Data Becomes a Bankable Asset
In 2018, winning with IoT will mean taking control of the overwhelming flood of new data coming from the millions of things already connected, and the billions more to come. Simply consolidating that data isn’t the solution, nor is giving data away with the vague hope of achieving business benefits down the line. Data owners need to take control of their IoT data to drive business growth. The Economist this year said, “Data is the new oil,” and we agree.
This level of data control will help businesses deliver new services that drive top-line results.
– Jahangir Mohammed, VP & GM of IoT, Cisco
AI Revolutionizes Data Analytics
In 2018, we will see a growing convergence between the Internet of Things and Artificial Intelligence. AI+IoT will lead to a shift away from batch analytics based on static datasets, to dynamic analytics that leverages streaming data.
Typically, AI learns from patterns. It can predict future trends and recommend business-critical actions. AI plus IoT can recommend, say, when to service a part before it fails or how to route transit vehicles based on constantly-changing data.
– Maciej Kranz, VP, Strategic Innovation at Cisco, and author of New York Times bestseller, Building the Internet of Things
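The shift from batch to streaming analytics described above can be illustrated with a minimal rolling-statistics detector: each new sensor reading is compared against the mean and standard deviation of a sliding window of recent readings, and large outliers trigger a "service this part" style alert. This is a hedged sketch, not any vendor's method; the window size and threshold are arbitrary choices.

```python
from collections import deque
from math import sqrt

def stream_anomalies(readings, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the
    rolling mean of the previous `window` readings."""
    recent = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(readings):
        if len(recent) == recent.maxlen:
            mean = sum(recent) / len(recent)
            var = sum((r - mean) ** 2 for r in recent) / len(recent)
            std = sqrt(var)
            if std > 0 and abs(x - mean) / std > threshold:
                flagged.append(i)
        recent.append(x)
    return flagged

# Stable vibration signal with one spike the detector should catch.
signal = [1.0, 1.1, 0.9, 1.0] * 10 + [5.0] + [1.0] * 5
print(stream_anomalies(signal))  # -> [40] (the spike)
```

Real AI+IoT pipelines replace the rolling z-score with learned models, but the structure is the same: decisions are made per reading as data streams in, rather than over a static batch.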
Interoperable IoT Becomes the Norm
The growth of devices and the business need for links between them has made for a wild west of communications in IoT. In 2018, a semblance of order will come to the space.
With the release of the Open Connectivity Foundation (OCF) 1.3 specification, consumer goods manufacturers can now choose a secure, standards-based approach to device-to-device interactions and device-to-cloud services in a common format, without having to rely on, or settle for, a proprietary device-to-cloud ecosystem.
Enterprise IoT providers will also begin to leverage OCF for device-to-device communications in workplace and warehouse applications, and Open Mobile Alliance’s Lightweight Machine-to-Machine (LwM2M) standard will take hold as the clear choice for remote management of IoT devices.
In Industrial IoT, OPC Unified Architecture (OPC UA) has emerged as the clear standard for interoperability, seeing record growth in adoption with over 120 million installs expected as 2017 draws to a close. It will continue to grow into new industrial areas in 2018, driven by support for Time Sensitive Networking.
– Chris Steck, Head of Standardization, IoT & Industries, Cisco
IoT Enables Next-Gen Manufacturing
Manufacturing is buzzing about Industrie 4.0, the term for a collection of new capabilities for smart factories that is driving the next industrial revolution. IoT technologies are connecting new devices, sensors, machines, and other assets together, while Lean Six Sigma and continuous improvement methodologies are harvesting value from new IoT data. Early adopters are already seeing big reductions in equipment downtime (from 15% to 95%), process waste, and energy consumption in factories.
– Bryan Tantzen, Senior Director, Industry Products, Cisco
Connected Roadways Lay the Groundwork for Connected Cars
Intelligent roadways that sense conditions and traffic will adjust speed limits, synchronize street lights, and issue driver warnings, leading to faster and safer trips for drivers and pedestrians sharing the roadways. As these technologies are deployed, they become a bridge to the connected vehicles of tomorrow. The roadside data infrastructure gives connected cars a head start.
Connected cities will begin using machine learning (ML) to strategically deploy emergency response and proactive maintenance vehicles like tow trucks, snow plows, and more.
– Bryan Tantzen, Senior Director, Industry Products, Cisco
Botnets Make More Trouble
Millions of new connected consumer devices make a nice attack surface for hackers, who will continue to probe the connections between low-power, somewhat dumb devices and critical infrastructure.
The biggest security challenge I see is the creation of Distributed Destruction of Service (DDeOS) attacks that employ swarms of poorly-protected consumer devices to attack public infrastructure through massively coordinated misuse of communication channels.
IoT botnets can direct enormous swarms of connected sensors like thermostats or sprinkler controllers to cause damaging and unpredictable spikes in infrastructure use, leading to things like power surges, destructive water hammer attacks, or reduced availability of critical infrastructure on a city or state-wide level.
– Shaun Cooley, VP and CTO, Cisco
Blockchain Adds Trust
Cities are uniquely complex connected systems that don’t work without one key shared resource: trust.
From governmental infrastructure to private resources, to financial networks, to residents and visitors, all of a city’s constituents have to trust, for example, that the roads are sound and that power systems and communication networks are reliable. Those working on city infrastructure itself can’t live up to this trust without knowing that they are getting accurate data. With the growth of IoT, the data from sensors, devices, people, and processes is getting increasingly decentralized—yet systems are more interdependent than ever.
As more cities adopt IoT technologies to become smart—thus relying more heavily on digital transactions to operate—we see blockchain technology being used more broadly to put trust into data exchanges of all kinds. A decentralized data structure that monitors and verifies digital transactions, blockchain technology can ensure that each transaction—whether a bit of data streaming from distributed air quality sensors, a transaction passing between customs agencies at an international port, or a connection to remote digital voting equipment—is intact and verifiable.
– Anil Menon, SVP & Global President, Smart+Connected Communities, Cisco
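The trust property described above rests on a simple data structure: each block records a hash of its predecessor, so altering any past transaction invalidates every later link in the chain. A deliberately minimal, illustrative hash chain (not any production blockchain, and with no consensus layer) might look like this:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash everything except the block's own hash field, deterministically.
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev": prev, "data": data}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """A reading is trustworthy only if every link back to the first
    block still hashes to what the next block recorded."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
append_block(chain, {"sensor": "air-quality-17", "pm25": 12.4})
append_block(chain, {"sensor": "air-quality-17", "pm25": 13.1})
print(verify(chain))            # True
chain[0]["data"]["pm25"] = 2.0  # tamper with a past reading
print(verify(chain))            # False
```

The sensor names and fields are invented for the example; the point is only that tamper evidence comes from the chained hashes, which is what makes decentralized city data verifiable without a central gatekeeper.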
Sastry Malladi, CTO of FogHorn Systems, has shared his top five predictions for the IIoT in 2018.
1. Momentum for edge analytics and edge intelligence in the IIoT will accelerate in 2018.
Almost every notable hardware vendor has a ruggedized line of products promoting edge processing. This indicates that the market is primed for Industrial IoT (IIoT) adoption. With technology giants announcing software stacks for the edge, there is little doubt that this momentum will only accelerate during 2018. Furthermore, traditional industries, like manufacturing, that have been struggling to showcase differentiated products will now embrace edge analytics to drive new revenue streams and/or significant yield improvements for their customers.
2. Additionally, any industry with assets being digitized and making the leap toward connecting or instrumenting brownfield environments is well positioned to leverage the value of edge intelligence.
Usually, the goal of these initiatives is to have deep business impact. This can be delivered by tapping into previously unknown or unrealized efficiencies and optimizations. Often these surprising insights are uncovered only through analytics and machine learning. Industries with often limited access to bandwidth, such as oil and gas, mining, fleet and other verticals, truly benefit from edge intelligence.
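One concrete way bandwidth-limited verticals benefit from edge intelligence: instead of shipping every raw sample over a constrained uplink, the edge node computes and transmits compact summaries. The sketch below is illustrative only, with invented names and a trivially simple summary; real edge analytics layers filtering, ML inference, and alerting on top of the same pattern.

```python
def edge_summary(raw_readings):
    """Summarize one interval of raw samples at the edge, so only a few
    numbers cross the constrained uplink instead of every sample."""
    return {
        "count": len(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
        "mean": sum(raw_readings) / len(raw_readings),
    }

# One minute of 10 Hz pressure samples: 600 floats stay at the edge,
# four numbers go over the satellite link.
samples = [101.3 + 0.01 * (i % 7) for i in range(600)]
print(edge_summary(samples))
```

For an oil field or mine on a satellite or cellular link, this two-orders-of-magnitude reduction in transmitted data is often the difference between a viable deployment and none at all.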
3. Business cases and ROI are critical for IIoT pilots and adoption in 2018.
The year 2017 was about exploring IIoT and led to the explosion of proof of concepts and pilot implementations. While this trend will continue into 2018, we expect increased awareness about the business value edge technologies bring to the table. Companies that have been burned by the “Big Data Hype” – where data was collected but little was leveraged – will assess IIoT engagements and deployments for definitive ROI. As edge technologies pick up speed in proving business value, the adoption rate will exponentially rise to meet the demands of ever-increasing IoT applications.
4. IT and OT teams will collaborate for successful IIoT deployments
IIoT deployments will start forcing closer engagement between IT and operations technology (OT) teams. Line-of-business leaders will get more serious about investing in digitization, and IT will become the cornerstone of these initiatives’ success. What was considered a wide gap between the two sectors, IT and OT, will be bridged thanks to the collaboration needed to successfully deploy IIoT solutions and initiatives.
5. Edge computing will reduce security vulnerabilities for IIoT assets.
While industries do recognize the impact of an IIoT security breach, there is surprisingly little implementation of specific solutions. This stems from two emerging trends:
a) Traditional IT security vendors are still repositioning their existing products to address IIoT security concerns.
b) A number of new entrants are developing targeted security solutions that are specific to a layer in the stack, or a particular vertical.
This creates the expectation that, if and when an event occurs, these two classes of security solutions will be sufficient. IoT deployments are often considered greenfield and emerging, so security breaches still seem futuristic, even though they are happening now. Consequently, there is little urgency to deploy security solutions, and most leaders seem to employ a wait-and-watch approach. The good news is that major security threats, like WannaCry, Petya/GoldenEye, and BadRabbit, resurface IIoT security concerns during the regular news cycle. However, until security solutions are more targeted and evoke trust, they may not help move the needle.
Standards that enable interoperability drive innovation and industry growth.
For some reason, technology suppliers tend to avoid standards at almost all costs—and the costs can be substantial in terms of losing market share or momentum—in order to build a “complete” solution all their own.
One reason beyond the obvious is that standards creation can be a time-consuming and tedious process.
Where would we be without standardized shipping containers, standardized railway tracks and cars, standardized Ethernet and the ISO stack, and more?
I’ve been working with the OPC Foundation and am finishing a white paper about the technology of combining two standards: OPC UA and Time Sensitive Networking. This is going to be huge someday.
I also work with a standards organization known as MIMOSA, which has promulgated an information standard for asset lifecycle management.
These are key technologies that can move industry forward.
I ran across this article by Ron Miller on TechCrunch about standards in another area—cloud services. This article discusses Amazon Web Services (AWS).
AWS just proved why standards drive technology platforms
Quoting from Miller:
When AWS today became a full-fledged member of the container standards body, the Cloud Native Computing Foundation, it represented a significant milestone. By joining Google, IBM, Microsoft, Red Hat and just about every company that matters in the space, AWS has acknowledged that when it comes to container management, standards matter.
AWS has been known to go the proprietary route, after all. When you’re that big and powerful, and control vast swaths of market share as AWS does, you can afford to go your own way from time to time. Containers are an area it hasn’t controlled, though. That belongs to Kubernetes, the open source container management tool originally developed inside Google.
AWS was smart enough to recognize that Kubernetes is becoming an industry standard in itself, and that when it comes to build versus buy versus going open source, AWS wisely recognized that battle has been fought and won.
What we have now is a clearer path to containerization, a technology that is all the rage inside large companies — for many good reasons. They allow you to break down the application into discrete manageable chunks, making updates a heck of a lot easier, and clearly dividing developer tasks and operations tasks in a DevOps model.
Standards provide a common basis for managing containers. Everyone can build their own tools on top of them. Google already has when it built Kubernetes, Red Hat has OpenShift, Microsoft makes Azure Container Service — and so forth and so on.
Companies like standards because they know the technology is going to work a certain way, regardless of who built it. Each vendor provides a similar set of basic services, then differentiates itself based on what it builds on top.
Technology tends to take off once a standard is agreed upon by the majority of the industry. Look at the World Wide Web. It has taken off because there is a standard way of building web sites. When companies agree to the building blocks, everything else seems to fall into place.
A lack of standards has traditionally held back technology. Having common building blocks just makes sense. A clear market leader doesn’t always agree. Today AWS showed why it matters, even to them.
One under-the-radar trend in industrial automation and software is the development of a marketplace. Several companies have one type of marketplace or another. I think it’s going to prove to be a powerful concept. Here is a take on it from Advantech.
Advantech, a leader in the global industrial computing market, has launched the WISE-PaaS Marketplace, an online software shopping website that features exclusive software services provided by Advantech and its partners. The WISE-PaaS Marketplace provides diverse WISE-PaaS IoT application software, including WebAccess/SCADA, WebAccess/HMI, WebAccess/IVS, WebAccess/IMM, WebAccess/NMS, online IoT cloud services, and IoT security services. It is a sharing platform that integrates IoT solutions developed by solution partners, providing the building blocks for customers to upgrade existing business systems to the Industrial IoT and Industry 4.0 quickly and easily.
Share Success, Grow Business, and Innovate Services
The WISE-PaaS Marketplace offers diverse solutions that provide cloud infrastructure services, security services, integrated WISE-PaaS IoT software services, and domain-focused applications for simple and rapid deployment. It is a value-sharing platform and ecosystem that enables customers to market unique IoT applications and services, increase business opportunities and growth, and maximize returns under the profit-sharing system.
Ongoing Innovation for Future IoT Trends
Customers can subscribe to software services via the WISE-PaaS Marketplace using the WISE-Points included in WISE-PaaS VIP membership packages, gaining access to numerous IoT solutions and the ability to create IoT innovations for future IoT trends.