Ecosystem Collaboration on 5G Distributed Edge Solution for Service Providers

This announcement hits many trends and things you will eventually grow tired of hearing—partnerships, collaboration among companies, ecosystems, Kubernetes, containers, and, yes, 5G. The latter is coming. We just don’t know when and how, yet.

Wind River, a leader in delivering software for the intelligent edge, announced that it is collaborating with Dell EMC as a key hardware partner for distributed edge solutions. A combined software and hardware platform would integrate Wind River Cloud Platform, a Kubernetes-based software offering for managing edge cloud infrastructure, with Dell EMC PowerEdge server hardware. The initial target use case will be virtual RAN (vRAN) infrastructure for 5G networks.

“As telecom infrastructure continues to evolve, service providers are facing daunting challenges around deploying and managing a physically distributed, cloud native vRAN infrastructure,” said Paul Miller, vice president of Telecommunications at Wind River. “By working with Dell EMC to pre-integrate our technologies into a reference distributed cloud solution, we can cost-effectively deliver carrier grade performance, massive scalability, and rapid service instantiation to service providers as their foundation for 5G networks.”

“In a 5G world, new services and applications will not be driven by massively scaled, centralized data centers but by intelligently distributed systems built at the network edge,” said Kevin Shatzkamer, vice president of Enterprise and Service Provider Strategy and Solutions at Dell EMC. “The combination of Dell EMC and Wind River technology creates a foundation for a complete, pre-integrated distributed cloud solution that delivers unrivaled reliability and performance, massive scalability, and significant cost savings compared to conventional RAN architectures. The solution will provide CSPs with what they need to migrate to 5G vRAN and better realize a cloud computing future.”

Wind River Cloud Platform combines a fully cloud-native, Kubernetes- and container-based architecture with the ability to manage truly physically and geographically separated infrastructure for vRAN and core data center sites. Cloud Platform delivers single-pane-of-glass, zero-touch automated management of thousands of nodes.
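
Since Cloud Platform exposes a standard Kubernetes control plane, one way to picture "single pane of glass" management is a single API query that inventories nodes across distributed sites. The sketch below uses the generic open-source client-go library (v0.18 or later) and a hypothetical site label; it illustrates the Kubernetes pattern, not Wind River's actual management API.

```go
// Sketch: inventory Kubernetes nodes across distributed sites from one API endpoint.
// Assumes standard client-go (v0.18+); the "topology.example.com/site" label is hypothetical.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// nodeReady reports whether a node's Ready condition is true.
func nodeReady(n corev1.Node) bool {
	for _, c := range n.Status.Conditions {
		if c.Type == corev1.NodeReady {
			return c.Status == corev1.ConditionTrue
		}
	}
	return false
}

func main() {
	// Load credentials the same way kubectl does (~/.kube/config).
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// One query returns every node carrying the (hypothetical) site label,
	// whether it sits in a core data center or at a far-edge location.
	nodes, err := clientset.CoreV1().Nodes().List(context.TODO(),
		metav1.ListOptions{LabelSelector: "topology.example.com/site"})
	if err != nil {
		panic(err)
	}
	for _, n := range nodes.Items {
		fmt.Printf("%-30s site=%-12s ready=%v\n",
			n.Name, n.Labels["topology.example.com/site"], nodeReady(n))
	}
}
```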

Dell EMC hardware delivers potent compute power, with high-performance, high-capacity memory well suited to low-latency applications.

A commercial implementation of the open source project StarlingX, Cloud Platform scales from a single compute node at the network edge, up to thousands of nodes in the core to meet the needs of high value applications. With deterministic low latency required by edge applications and tools that make the distributed edge manageable, Cloud Platform provides a container-based infrastructure for edge implementations in scalable solutions ready for production.

Project Alvarium from Linux Foundation for Trusted Data

The IoT group that I’ve been working with for the past few years has been absorbed into the OEM group, which is carrying on an expanded function. This blog post from Steve Todd, Dell Technologies Fellow, details the development of the data confidence work that has been contributed to the Linux Foundation to seed the open source Project Alvarium.

Following is a quick summary. Go to the blog for additional information about trusted data work.

A team of Dell Technologies specialists finished building the first-ever Data Confidence Fabric (DCF for short). The prototype code will be contributed to the Linux Foundation to seed Project Alvarium.

For several years, the CTO of the Dell Technologies Edge and IoT business unit has been touting a vision of data monetization. However, it’s hard to monetize untrusted Edge and IoT data. As he likes to say, “It’s midnight. Do you know where your data has been?” 

Enterprise storage systems have delivered trusted data to applications for a long time. We started our initial investigation wondering if these same trust principles could be applied to Edge and IoT ecosystems. Recent developments in data valuation, distributed ledgers, and data marketplaces allowed everything to come together.

Five Levels of Trust

We started with Trevor Conn, chair of the EdgeX Foundry Core Working Group. Trevor wrote the first-ever Data Confidence Fabric software in Go, the same programming language EdgeX is written in. His Data Confidence Fabric software registered with EdgeX as a client and began processing simulated device data. The initial confidence score for this data was “0” (no trust was inserted).

Dell Technologies then hired three computer science interns from Texas A&M to deploy EdgeX and the Data Confidence Fabric software on a Dell Gateway 3000 with a Trusted Platform Module (TPM) chip.

EdgeX was then adjusted to support N-S-E-W authentication by using VMware’s open-source Lightwave technology.

The Data Confidence Fabric software invoked Dell Boomi software to gather provenance and append this metadata to the sensor reading.

The Data Confidence Fabric software then stored the data locally using IPFS (an immutable, open-source storage system). This fourth level of trust insertion gives an application confidence that the data/provenance has not been tampered with. It also has the additional benefit of enabling analytics to access data closer to the source.

The Data Confidence Fabric software then registered the data into VMware’s blockchain (based on the open-source Project Concord consensus algorithm).
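
As a mental model of those five trust insertions, here is a minimal Go sketch in which a simulated reading starts at confidence "0" and accumulates a score as each layer is applied. The Annotation type, field names, and equal weights are my own illustration, not the actual DCF or Project Alvarium scoring scheme.

```go
package main

import (
	"fmt"
	"time"
)

// Annotation records a single trust insertion applied to a sensor reading.
// The names and weights here are illustrative, not the actual DCF schema.
type Annotation struct {
	Layer  string  // e.g. "tpm", "auth", "provenance", "ipfs", "ledger"
	Detail string
	Weight float64 // contribution to the overall confidence score
}

// Reading is a simulated device reading flowing through the fabric.
type Reading struct {
	Device      string
	Value       float64
	Timestamp   time.Time
	Annotations []Annotation
}

// Confidence sums the weights of all trust insertions.
// A raw reading with no annotations scores 0, as in the prototype.
func (r Reading) Confidence() float64 {
	score := 0.0
	for _, a := range r.Annotations {
		score += a.Weight
	}
	return score
}

func main() {
	r := Reading{Device: "sim-sensor-01", Value: 42.7, Timestamp: time.Now()}
	fmt.Printf("initial confidence: %.1f\n", r.Confidence()) // 0.0

	// Each step mirrors one of the five trust insertions described above.
	r.Annotations = append(r.Annotations,
		Annotation{"tpm", "gateway attested via TPM", 1},
		Annotation{"auth", "N-S-E-W authentication (Lightwave)", 1},
		Annotation{"provenance", "provenance metadata appended (Boomi)", 1},
		Annotation{"ipfs", "stored immutably in IPFS", 1},
		Annotation{"ledger", "registered on distributed ledger", 1},
	)
	fmt.Printf("confidence after trust insertion: %.1f\n", r.Confidence()) // 5.0
}
```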

Taking a Digital Journey

Keynoters have a tough time with originality these Digital Days, with everyone emphasizing Digital Transformation. Steve Lomholt-Thomson, chief revenue officer of AVEVA, took us on a Digital Journey this morning, setting the tone for the three days of AVEVA World Congress (North America edition).

Three technology trends to watch: an IoT boom, cloud/empowered edge, and AI/ML. The theme is digital. The Digital Organization discovers its Digital DNA, figures out how to build that Digital DNA through people who challenge the status quo, and then figures out how to track talent flow.

Which all starts us on our Digital Journey. On this journey, we unify end-to-end data, connect data silos taking a holistic view of the business, and then visualize our assets and supply chain. I believe implied in all this is the company’s product, AVEVA System Platform. The company touted six customer stories, with at least five of them (and probably the sixth) leveraging System Platform.

Oh, and the only time the “W” word was used referred to past tense.

Other areas of the company were highlighted:

Focus on assets–asset performance management, including how to use machine learning (ML) and artificial intelligence (AI) for predictive analytics (predictive maintenance).

How to combine it all into a Digital Twin–bringing the design lifecycle and physical lifecycle into congruence.

Recently hired head of North America business, Christine Harding, interviewed customers from Campbell’s (soup/snacks), Quantum Solutions (integration project at St. Louis/Lambert airport), and Suncor (Canadian oil sands).

I have the rest of today and then tomorrow to take deeper dives into many of these topics. If there is anything you want me to ask, send a note.

HighByte – A New Company Unveiled for DataOps

DataOps—a phrase I had not heard before. Now I know. Last week while I was in California I ran into John Harrington, who, along with other former Kepware leaders Tony Paine and Torey Penrod-Cambra, had left Kepware following its acquisition by PTC to found a new company in the DataOps for Industry market. The news he told me about went live yesterday. HighByte announced that its beta program for HighByte Intelligence Hub is now live. More than a dozen manufacturers, distributors, and system integrators from the United States, Europe, and Asia have already been accepted into the program and granted early access to the software in exchange for their feedback.

Intelligence Hub

HighByte Intelligence Hub will be the company’s first product to market since incorporating in August 2018. HighByte launched the beta program as part of its Agile approach to software design and development. The aim of the program is to improve performance, features, functionality, and user experience of the product prior to its commercial launch later this year.

HighByte Intelligence Hub belongs to a new classification of software in the industrial market known as DataOps solutions. HighByte Intelligence Hub was developed to solve data integration and security problems for industrial businesses. It is the only solution on the market that combines edge operations, advanced data contextualization, and the ability to deliver secure, application-specific information. Other approaches are highly customized and require extensive scripting and manual manipulation, which cannot scale beyond initial requirements and are not viable solutions for long-term digital transformation.

“We recognized a major problem in the market,” said Tony Paine, Co-Founder & CEO of HighByte. “Industrial companies are drowning in data, but they are unable to use it. The data is in the wrong place; it is in the wrong format; it has no context; and it lacks consistency. We are looking to solve this problem with HighByte Intelligence Hub.”
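
To make the "wrong format, no context" problem concrete, here is a small sketch of what data contextualization looks like in general terms: a raw, tag-addressed device value is merged with an asset model into a standardized, application-ready record. The types, tag names, and mapping below are my own illustration, not HighByte's API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// RawReading is what a device driver might emit: terse tags, no units, no context.
type RawReading struct {
	Tag   string  `json:"tag"`   // e.g. a controller register address
	Value float64 `json:"value"`
}

// ContextualReading is the application-ready form: the same value, standardized
// and merged with the asset model so every consumer sees consistent information.
type ContextualReading struct {
	Asset     string    `json:"asset"`
	Measure   string    `json:"measure"`
	Value     float64   `json:"value"`
	Unit      string    `json:"unit"`
	Site      string    `json:"site"`
	Timestamp time.Time `json:"timestamp"`
}

// assetModel maps device tags to business context. In practice this mapping
// would live in configuration, not in code.
var assetModel = map[string]ContextualReading{
	"N7:0": {Asset: "Mixer-3", Measure: "MotorTemperature", Unit: "degC", Site: "Plant-A"},
}

// contextualize merges a raw reading with its modeled context.
func contextualize(r RawReading) (ContextualReading, bool) {
	ctx, ok := assetModel[r.Tag]
	if !ok {
		return ContextualReading{}, false
	}
	ctx.Value = r.Value
	ctx.Timestamp = time.Now().UTC()
	return ctx, true
}

func main() {
	raw := RawReading{Tag: "N7:0", Value: 71.3}
	if out, ok := contextualize(raw); ok {
		b, _ := json.MarshalIndent(out, "", "  ")
		fmt.Println(string(b))
	}
}
```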

The company’s R&D efforts have been fueled by two non-equity grants awarded by the Maine Technology Institute (MTI) in 2019. “We are excited to join HighByte on their journey to building a great product and a great company here in Maine,” said Lou Simms, Investment Officer at MTI. “HighByte was awarded these grants because of the experience and track record of their founding team, large addressable market, and ability to meet business and product milestones.”

To further accelerate product development and go-to-market activities, HighByte is actively raising a seed investment round. For more information, please contact [email protected]

Learn more about the HighByte founding team—all people I’ve known for many years in the data connectivity business.

Background

From Wikipedia: DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.

DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.

DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access and ease of use.

From Oracle, DataOps, or data operations, is the latest agile operations methodology to spring from the collective consciousness of IT and big data professionals. It focuses on cultivating data management practices and processes that improve the speed and accuracy of analytics, including data access, quality control, automation, integration, and, ultimately, model deployment and management.

At its core, DataOps is about aligning the way you manage your data with the goals you have for that data. If you want to, say, reduce your customer churn rate, you could leverage your customer data to build a recommendation engine that surfaces products that are relevant to your customers  — which would keep them buying longer. But that’s only possible if your data science team has access to the data they need to build that system and the tools to deploy it, and can integrate it with your website, continually feed it new data, monitor performance, etc., an ongoing process that will likely include input from your engineering, IT, and business teams.

Conclusion

As we move further along the Digital Transformation path of leveraging digital data to its utmost, this looks to be a good tool in the utility belt.

Cloud-based Database Platform Complements PI System

Digitalization requires digital data, which in turn requires a place to robustly store that data. OSIsoft PI System must be the most widely used industrial historian database. Last November I wrote about the company bringing its PI System to Amazon Web Services (AWS). The company has released OSIsoft Cloud Services—a cloud-native, real-time data management system for unifying and augmenting critical operations data from across an organization to accelerate industrial analytics, data science projects, data sharing, and other digital transformation initiatives.

Given how deep I’ve been led into IT over the past few years, this advancement from OSIsoft becomes part of a significant trend of blending IT and OT just above the control layer. In fact, if you segregate off the actual “control” part of automation, that system itself has become an important element of Internet of Things (IoT) blending into the overall IT infrastructure. If you are not thinking the big picture in today’s industrial environment, then you will be missing important paths to profitability.

Back to today’s OSIsoft news. The OSIsoft Cloud Services (OCS) in a capsule:

  • Data sharing – partner companies can access a shared data stream to remotely monitor technology
  • Functionality – seamless crossover between the PI System and OCS to compare facilities, perform root cause analysis and run hypotheticals
  • Scalability – tests proved OCS can simultaneously manage over two billion data streams, and safely share information with partners
  • Petuum uses OCS to stream historical and live data on production, temperature, and variability to its AI platform to help Cemex, a global cement manufacturer, improve yield and energy efficiency to 7% from 2%.
  • DERNetSoft uses OCS to aggregate data in one place, allowing users to access useful analytics for ways to reduce power and save money.
  • Pharma companies will use OCS to give a regulator access to anonymized drug testing or production data, without the risk of unauthorized users entering the manufacturing networks.

With OCS, an engineer at a chemical producer, for example, could combine maintenance and energy data from multiple facilities into a live superset of information to boost production in real-time while planning analysts could merge several years’ worth of output and yield data to create a ‘perfect plant’ model for capital forecasts.

OCS can also be leveraged by software developers and system integrators to build new applications and services or to link remote assets.

“OSIsoft Cloud Services is a fundamental part of our mission to help people get the most out of the data that is at the foundation of their business. We want their cost of curiosity to be as close to zero as possible,” said Gregg Le Blanc, Vice President of Product at OSIsoft. “OCS is designed to complement the PI System by giving customers a way to uncover new operational insights and use their data to solve new problems that would have been impractical or impossible before.”

Data scientists spend 50 percent or more of their time curating large data sets instead of conducting analytics. IT teams get bogged down in managing VPNs for third parties or writing code for basic administrative tasks. Data becomes inaccessible and locked in silos. Over 1,000 utilities, 80% of the largest oil and gas companies, and 65% of the Fortune 500 industrial companies already use the PI System to harness critical operations data, turning it into an asset for improving productivity, saving money, and developing new services.

Natively compatible with the PI System, OCS extends the range of possible applications and use cases of OSIsoft’s data infrastructure while eliminating the challenges of capturing, managing, enhancing, and delivering operations data across an organization. Within a few hours, thousands of data streams containing years of historical data can be transferred to OCS, allowing customers to explore, experiment, and share large data sets the same day.

The core of OCS is a highly scalable sequential data store optimized for time series data, depth measurements, temperature readings, and similar data. OSIsoft has also embedded numerous usability features for connecting devices, managing users, searching, transferring data from the PI System to OCS, and other functions. OCS can also accept data from devices outside of traditional control networks or other sources.
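
As a rough model of what a "highly scalable sequential data store" provides, the sketch below keeps named streams of time-indexed events and supports ordered appends and range queries, the basic operations behind the use cases above. It is purely illustrative; it is not the OCS SDK or its API.

```go
package main

import (
	"fmt"
	"sort"
	"time"
)

// Event is one value in a stream, indexed by timestamp.
type Event struct {
	Index time.Time
	Value float64
}

// Stream is an append-mostly, time-ordered series of events.
type Stream struct {
	ID     string
	events []Event
}

// Append inserts an event, keeping the stream ordered by index.
func (s *Stream) Append(e Event) {
	i := sort.Search(len(s.events), func(i int) bool { return s.events[i].Index.After(e.Index) })
	s.events = append(s.events, Event{})
	copy(s.events[i+1:], s.events[i:])
	s.events[i] = e
}

// Range returns events with start <= index < end, the basic query a
// time-series store optimizes for.
func (s *Stream) Range(start, end time.Time) []Event {
	var out []Event
	for _, e := range s.events {
		if !e.Index.Before(start) && e.Index.Before(end) {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	s := Stream{ID: "plant-a.boiler-1.temperature"}
	t0 := time.Date(2019, 6, 1, 0, 0, 0, 0, time.UTC)
	for h := 0; h < 24; h++ {
		s.Append(Event{Index: t0.Add(time.Duration(h) * time.Hour), Value: 180 + float64(h)})
	}
	for _, e := range s.Range(t0.Add(6*time.Hour), t0.Add(9*time.Hour)) {
		fmt.Println(e.Index.Format(time.RFC3339), e.Value)
	}
}
```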

“The scale and scope of data that will be generated over the coming decades is unprecedented, but our mission remains the same,” said Dr. J. Patrick Kennedy, CEO and Founder of OSIsoft. “OSIsoft Cloud Services represent the latest step in a nearly 40 year journey and there’s more to come.”

OCS is a subscription service currently available to customers and partners for use in facilities in North America. OCS will be extended to Europe and to other regions in the near future.

Navigating a New Industrial Infrastructure

The Manufacturing Connection was conceived in 2013, when I decided to go it alone in the world, built around the ideas of a new industrial infrastructure and enhanced connectivity. I had even worked out a cool mind map to figure it out.

Last week I was on vacation, spending some time at the beach, reading, thinking, and catching up on some long-neglected things. Next week I am off to Las Vegas for the Hewlett Packard Enterprise “Discover” conference, where I’ll be inundated with learning about new ideas in infrastructure.

Meanwhile, I’ll share something I picked up from the Sloan Management Review (from MIT). This article was developed from a blog post by Jason Killmeyer, enterprise operations manager in the Government and Public Sector practice of Deloitte Consulting LLP, and Brenna Sniderman, senior manager in Deloitte Services LP.

They approach things from a much higher level in the organization than I usually do. They recognize what I’ve often stated about business executives reading about all these new technologies, such as cloud computing, the internet of things, AI, blockchain, and others. “The potential resulting haste to adopt new technology and harness transformative change can lead organizations to treat these emerging technologies in the same manner as other, more traditional IT investments — as something explored in isolation and disconnected from the broader technological needs of the organization. In the end, those projects can eventually stall or be written off, leaving in their wake skepticism about the usefulness of emerging technologies.”

This analysis correctly identifies the organizational challenges when leaders read things or hear other executives at the Club talk about them.

The good news, according to the authors: “These new technologies are beginning to converge, and this convergence enables them to yield a much greater value. Moreover, once converged, these technologies form a new industrial infrastructure, transforming how and where organizations can operate and the ways in which they compete. Augmenting these trends is a third factor: the blending of the cyber and the physical into a connected ecosystem, which marks a major shift that could enable organizations to generate more information about their processes and drive more informed decisions.”

They identify three capabilities and three important technologies that make them possible:

Connect: Wi-Fi and other connectivity enablers. Wi-Fi and related technologies, such as low-power wide-area networks (LPWAN), allow for cable-free connection to the internet almost anywhere. Wi-Fi and other connectivity and communications technologies (such as 5G) and standards connect a wide range of devices, from laptops to IoT sensors, across locations and pave the way for the extension of a digital-physical layer across a broader range of physical locations. This proliferation of connectivity allows organizations to expand their connectivity to new markets and geographies more easily.

Store, analyze, and manage: cloud computing. The cloud has revolutionized how many organizations distribute critical storage and computing functions. Just as Wi-Fi can free users’ access to the internet across geographies, the cloud can free individuals and organizations from relying on nearby physical servers. The virtualization inherent in cloud, supplemented by closer-to-the-source edge computing, can serve as a key element of the next wave of technologies blending the digital and physical.

Exchange and transact: blockchain. If cloud allows for nonlocal storage and computing of data — and thus the addition or extraction of value via the leveraging of that data — blockchain supports the exchange of that value (typically via relevant metadata markers). As a mechanism for value or asset exchange that executes in both a virtualized and distributed environment, blockchain allows for the secure transacting of valuable data anywhere in the world a node or other transactor is located. Blockchain appears poised to become an industrial and commercial transaction fabric, uniting sensor data, stakeholders, and systems.

My final thought about infrastructure—they made it a nice round number, namely three. However, I’d add another piece especially to the IT hardware part. That would be the Edge. Right now it is all happening at the edge. I bet I will have a lot to say and tweet next week about that.
