The Salesforce Economy Bolsters Manufacturing Cloud

Salesforce recently began reaching out to me, and I found a (to me) surprising connection to industrial and manufacturing applications beyond CRM and the like. In general, more and more applications are moving to the cloud. In brief: new research finds the Salesforce Economy will create more than $1 trillion in new business revenues and 4.2 million jobs between 2019 and 2024. The Salesforce ecosystem is on track to become nearly six times larger than Salesforce itself by 2024, earning $5.80 for every dollar Salesforce makes.

Financial services, manufacturing and retail industries will lead the way, creating $224 billion, $212 billion and $134 billion in new business revenue respectively by 2024.

Salesforce announced new research from IDC that finds Salesforce and its ecosystem of partners will create 4.2 million new jobs and $1.2 trillion in new business revenues worldwide between 2019 and 2024. The research also finds Salesforce is driving massive gains for its partner ecosystem, which will see $5.80 in gains for every $1 Salesforce makes by 2024.

Cloud computing is driving this growth and giving rise to a host of new technologies, including mobile, social, IoT and AI, that are creating new revenue streams and jobs that further fuel the growth of the cloud, creating an ongoing virtuous cycle of innovation and growth. According to IDC, by 2024 nearly half of cloud computing software spend will be tied to digital transformation, and that spending will account for nearly half of all software sales. Worldwide spending on cloud computing will grow 19 percent annually between now and 2024, from $179 billion in 2019 to $418 billion.

“The Salesforce ecosystem is made possible by the amazing work of our customers and partners around the world, and because of our collaboration we’re able to generate the business and job growth that we see today,” said Tyler Prince, EVP, Industries and Partners at Salesforce. “Whether it’s through industry-specific extensions or business-aligned apps, the Salesforce Customer 360 platform helps accelerate the growth of our partner ecosystem, and most importantly, the growth of our customers.”

Because organizations that spend on cloud computing subscriptions also spend on ancillary products and services, the Salesforce ecosystem in 2019 is more than four times larger than Salesforce itself and will grow to almost six times larger by 2024. IDC estimates that from 2019 through 2024, Salesforce will drive the creation of 6.6 million indirect jobs, which are created from spending in the general economy by those people filling the 4.2 million jobs previously mentioned.

“The tech skills gap will become a major roadblock for economic growth if we don’t empower everyone – regardless of class, race or gender – to skill up for the Fourth Industrial Revolution,” said Sarah Franklin, EVP and GM of Platform, Developers and Trailhead at Salesforce. “With Trailhead, our free online learning platform, people don’t need to carry six figures in debt to land a top job; instead, anyone with an Internet connection can now have an equal pathway to landing a job in the Salesforce Economy.”

Industry Economic Benefits of the Salesforce Economy

Specifically, the manufacturing industry will gain $211.7 billion in new revenues and see 765,800 new jobs created by 2024.

Salesforce’s multi-faceted ecosystem is the driving force behind the Salesforce Economy’s massive growth:

  • The global ecosystem includes multiple stakeholders, all of which play an integral part in the Salesforce Economy. This includes the world’s top five consulting firms, all of whom have prominent Salesforce digital transformation practices; independent software vendors (ISVs) that build their businesses on the Salesforce Customer 360 Platform and bring Salesforce into new industries; more than 1,200 Community Groups, with different areas of focus and expertise; and more than 200 Salesforce MVPs, product experts and brand advocates.
  • Launched in 2006, Salesforce AppExchange is the world’s largest enterprise cloud marketplace, and hosts more than 4,000 solutions including apps, templates, bots and components that have been downloaded more than 7 million times. Ninety-five percent of the Fortune 100, 81 percent of the Fortune 500, and 86 percent of Salesforce customers are using AppExchange apps.
  • Trailhead is Salesforce’s free online learning platform that empowers anyone to skill up for the future, learn in-demand skills and land a top job in the Salesforce Economy. Since Trailhead launched in 2014, more than 1.7 million Trailblazers have earned over 17.5 million badges; a quarter of all learners on Trailhead have leveraged their newfound skills to jump-start their careers with new jobs. Indeed, the world’s #1 job site, included Salesforce Developer in its list of best jobs in the US for 2019, noting that the number of job postings for that position had increased 129 percent year-over-year.
Inductive Automation Announces Ignition Firebrand Awards

Inductive Automation has selected the recipients of its Ignition Firebrand Awards for 2019. The announcements were made at the Ignition Community Conference (ICC), which took place September 17-19. I get to see the poster displays and chat with the companies at ICC. I love the technology developers, but it’s fascinating to talk with people who actually use the products.

[Disclaimer: Inductive Automation is a long-time and much appreciated sponsor of The Manufacturing Connection. If you are a supplier, you, too, could be a sponsor. Contact me for more details. You would benefit from great visibility.]

The Ignition Firebrand Awards recognize system integrators and industrial organizations that use the Ignition software platform to create innovative new projects. Ignition by Inductive Automation is an industrial application platform with tools for the rapid development of solutions in human-machine interface (HMI), supervisory control and data acquisition (SCADA), manufacturing execution systems (MES), and the Industrial Internet of Things (IIoT). Ignition is used in virtually every industry, in more than 100 countries.

“The award-winning projects this year were really impressive,” said Don Pearson, chief strategy officer for Inductive Automation. “Many of them featured Ignition 8 and the new Ignition Perspective Module, both of which were released just six months ago. We were really impressed with how quickly people were able to create great projects with the new capabilities.”

These Ignition Firebrand Award winners demonstrated the power and flexibility of Ignition:

  • Brock Solutions worked with the Dublin Airport in Ireland to replace the baggage handling system in Terminal 2. The new system has 100,000 tags and is the largest Ignition-controlled airport baggage handling system in the world.
  • Corso Systems & SCS Engineers partnered on a pilot project for the landfill gas system of San Bernardino County, California. The pilot was so successful, it will be expanded to 27 other county sites. It provides a scalable platform with strong mobile capabilities from Ignition 8 and Ignition Perspective, plus 3D imaging from drone video and virtual reality applications.
  • ESM Australia developed a scalable asset management system to monitor performance and meet service requirements for a client with systems deployed all over Australia. The solution leveraged Ignition 8, Ignition Perspective, MQTT, and legacy FTP-enabled gateways in the field.
  • H2O Innovation & Automation Station partnered to create a SCADA system for the first membrane bioreactor wastewater treatment plant in Arkansas. The new system for the City of Decatur shares real-time data with neighboring water agencies as well as the mayor.
  • Industrial Networking Solutions created a new oil & gas SCADA system in just six months for 37 sites at ARB Midstream. The solution included hardware upgrades, a new control room, and a diverse collection of technologies with cloud-hosted SCADA, MQTT, Ignition Edge, and SD-WAN.
  • MTech Engineering developed an advanced real-time monitoring and control system for the largest data center campus in Italy. The project for Aruba S.p.A. had to work with huge amounts of data — and was done at a much lower cost than was possible with any other SCADA solution.
  • NLS Engineering created a single, powerful operations and management platform for more than 30 solar-power sites for Ecoplexus, a leader in renewable energy systems. The solution provided deep data acquisition, included more than 100,000 tags, and led to the creation of a platform that can be offered to other clients.
  • Streamline Innovations used Ignition, Ignition Edge, Ignition Perspective, and MQTT to facilitate the automation of natural gas treating units that convert extremely toxic hydrogen sulfide into fertilizer-grade sulfur. The solution increased uptime, reduced costs, and provided access to much more data than Streamline had seen previously.

How To Avoid Pilot Purgatory For Your Projects

This is still more follow-up from the Emerson Global Users Exchange, this time relative to sessions on pilot purgatory for projects. I thought I had already written this, but just discovered it languishing in my drafts folder. While in Nashville, I ran into Jonas Berge, senior director of applied technology for Plantweb at Emerson Automation Solutions. He has been a source for technology updates for years. We followed up a brief conversation with a flurry of emails in which he updated me on some presentations.

One important topic centered on IoT projects, though it applies to other types of projects as well. He told me the secret sauce is to start small. “A World Economic Forum white paper on the fourth industrial revolution, in collaboration with McKinsey, suggests that to avoid getting stuck in prolonged ‘pilot purgatory’ plants shall start small with multiple projects – just like we spoke about at EGUE and just like Denka and Chevron Oronite and others have done,” he told me.

“I personally believe the problem is when plants get advice to take a ‘big bang’ approach starting by spending years and millions on an additional ‘single software platform’ or data lake and hiring a data science team even before the first use case is tackled,” said Berge. “My blog post explains this approach to avoiding pilot purgatory in greater detail.”

I recommend visiting Berge’s blog for more detail, but I’ll provide some teaser ideas here.

First, he recommends:

  • Think Big
  • Start Small
  • Scale Fast

Scale Fast

Plants must scale digital transformation across the entire site to fully enjoy the safety benefits, such as fewer incidents, faster incident response time, and reduced instances of non-compliance, as well as reliability benefits such as greater availability, reduced maintenance cost, extended equipment life, greater integrity (fewer instances of loss of containment), shorter turnarounds, and longer intervals between turnarounds. The same holds true for energy benefits such as lower energy consumption and cost and reduced emissions and carbon footprint, and for production benefits such as reduced off-spec product (higher quality and yield), greater throughput, greater flexibility (in feedstock and in products/grades), reduced operations cost, and shorter lead times.

Start Small

The organization can only absorb so much change at any one time. If too many changes are introduced in one go, the digitalization will stall:

  • Too many technologies at once
  • Too many data aggregation layers
  • Too many custom applications
  • Too many new roles
  • Too many vendors

Multiple Phased Projects

McKinsey research shows that plants successfully scaling digital transformation instead run smaller digitalization projects: multiple small projects across the functional areas. This matches what I have personally seen in projects I have worked on.

From what I can tell, it is the plants that attempt a big bang approach, with many digital technologies at once, that struggle to scale. There are forces that encourage companies to attempt sweeping changes to go digital, which can lead to counterproductive overreaching.

The Boston Consulting Group (BCG) suggests a disciplined phased approach rather than attempting to boil the ocean. I have seen plants focus on a technology that can digitally transform and help multiple functional areas with common infrastructure. A good example is wireless sensor networks. Deploying wireless sensor networks in turn enables many small projects that help many departments digitally transform the way they work. The infrastructure for one technology can be deployed relatively quickly after which many small projects are executed in phases.

Small projects are low risk. A small trial of a solution in one plant unit finishes fast. After a quick success, the team scales it to the full plant area and then to the entire plant, and can then move on to start the next pilot project. This way plants move from proof of concept to full-scale, plant-wide implementation at speed. For large organizations with multiple plants, innovations often emerge at an individual plant, then get replicated at other sites and rolled out nationwide and globally.

Use Existing Platform

I have also seen the big bang approach where a plant pours a lot of money and resources into an additional “single software platform” layer for data aggregation before the first use case even gets started. This new data aggregation layer is meant to sit above the ERP, collecting data from the ERP and the plant historian before making it available to analytics through a proprietary API that requires custom programming.

Instead, successful plants start small projects using the existing data aggregation platform: the plant historian. The historian can be scaled with additional tags as needed. This way a project can be implemented within two weeks, with the pilot running an additional three months, at low risk.
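To give a sense of how small such a pilot can be, here is a minimal sketch in Python of the kind of first analysis a two-week project might run against existing historian data; the exported file name, tag name, and threshold value are my own illustrative assumptions, not anything specific Berge described.

```python
# A minimal "start small" pilot: one KPI computed from data the plant
# historian already collects. File name, tag name, and threshold are
# hypothetical examples.
import pandas as pd

# Load a CSV export of one historian tag, indexed by timestamp.
df = pd.read_csv(
    "historian_export.csv",
    parse_dates=["timestamp"],
    index_col="timestamp",
)

# Resample to hourly averages to smooth sensor noise.
hourly = df["heat_exchanger_dT"].resample("1H").mean()

# Flag hours where the temperature differential falls below a fouling limit.
FOULING_LIMIT = 12.0  # degrees C, illustrative only
suspect = hourly[hourly < FOULING_LIMIT]

print(f"{len(suspect)} of {len(hourly)} hours fall below the fouling limit")
```

Nothing more than the historian export and a general-purpose scripting tool is needed to produce a result worth discussing with the reliability team.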

Think Big

I personally like to add that you must also think of the bigger vision. A plant cannot run multiple small projects in isolation, which results in siloed solutions. Plants successful with digital transformation establish, early on, a vision of what the end goal looks like. Based on this they can select the technologies and architecture to build the infrastructure that supports this end goal.

NAMUR Open Architecture (NOA)

The system architecture for the digital operational infrastructure (DOI) is important. The wrong architecture leads to delays and an inability to scale. NAMUR (User Association of Automation Technology in Process Industries) has defined the NAMUR Open Architecture (NOA) to enable Industry 4.0. I have found that plants that have deployed a digital operational infrastructure modeled on the same principles as NOA are able to pilot and scale very fast. A minimal sketch of this pattern follows the design principles listed below.

Flying Start

The I&C department in plants can accelerate digital transformation to achieve operational excellence and top-quartile performance by remembering Think Big, Start Small, Scale Fast. These translate into a few simple design principles:

  • Phased approach
  • Architecture modeled on the NAMUR Open Architecture
  • Ready-made apps
  • Easy-to-use software
  • Digital ecosystem
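To make the NOA idea a little more concrete, here is a minimal sketch of the pattern as I understand it: data flows one way, out of the core automation over OPC UA and up to a monitoring-and-optimization layer over MQTT, with nothing ever written back to the control system. The endpoint, node ID, broker address, and topic below are hypothetical placeholders, not any vendor’s actual configuration.

```python
# Hedged sketch of a NOA-style "second channel": read-only data leaves the
# core automation (OPC UA) for a monitoring/optimization layer (MQTT).
# Endpoint URL, node ID, broker, and topic are illustrative only.
import asyncio
import json

from asyncua import Client        # pip install asyncua
from paho.mqtt import publish     # pip install paho-mqtt

OPC_ENDPOINT = "opc.tcp://plant-dcs.example.local:4840"
NODE_ID = "ns=2;s=Pump01.BearingTemperature"
MQTT_BROKER = "noa-broker.example.local"
TOPIC = "plant/area1/pump01/bearing_temperature"


async def main() -> None:
    async with Client(url=OPC_ENDPOINT) as opc:
        node = opc.get_node(NODE_ID)
        while True:
            value = await node.read_value()  # read-only; nothing is written to the DCS
            publish.single(TOPIC, json.dumps({"value": value}), hostname=MQTT_BROKER)
            await asyncio.sleep(10)          # modest polling keeps the load negligible


asyncio.run(main())
```

The point of the architecture is exactly what the one-way data flow in the sketch shows: monitoring and optimization ride alongside the core automation without touching it.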

Integrating Engineering and Project Execution

The design engineering function originates data. It includes data about the structure of the plant or factory, data about the equipment and processes used to make the product, and data about the products themselves. Early in my career, I embodied the movement of data from design to operations and then back to design in a continuous loop of as designed → as built → as designed. I was also involved for a while in the development of a platform to automate this process using standards.

To say I’m interested in this area would be an understatement. And this process is important to all of you, too, including those who siphon off some data for other uses such as accounting, customer service, maintenance, and reliability.

AVEVA, formed by combining its iconic design engineering software with Schneider Electric’s industrial software business, just introduced integrated engineering software designed to help customers transform the way capital projects are engineered, executed, and integrated into operations and maintenance.

The integrated portfolio comprises three software solutions. AVEVA Unified Engineering integrates process design with front-end engineering and detailed 3D-based design. AVEVA Unified Project Execution links and streamlines procurement and construction processes for capital projects. AVEVA Enterprise Learning enables the rapid skilling of operators and engineers using Extended Reality (XR) and simulation tools, to ensure efficient startups and shutdowns, normal operations, and the ability to handle abnormal situations.

“This launch builds on the recent news describing AVEVA’s capabilities as the first company in the engineering and industrial software market to comprehensively address the end-to-end digital transformation imperatives with an integrated portfolio of solutions that deliver efficiency, unlock value and empower people across the lifecycle of capital assets and operational value chains,” commented Craig Hayman, CEO, AVEVA. “It changes the way that owner operators engage with Engineering, Procurement and Construction (EPC) companies in designing, building, commissioning, and operating their capital assets.”

The functionality provided in these integrated solutions enables the realization of an EPC 4.0 strategy for owner operators, central to digital transformation in the capital-intensive process sectors. This allows collaboration on a global scale, through hybrid cloud architectures and on a common platform. The entire manufacturing process can be traced, tracked, and linked – from engineering and design, through procurement and construction, to handover and to operations and maintenance, as a comprehensive Digital Twin for the capital asset.

“As competition in the business world accelerates, the time has come for industrial organizations to innovate to facilitate the transition from manual, document-centric processes toward a data-driven vision of project design, procurement, and execution in order to increase safety, reduce costs, and minimize delays,” commented Craig Hayman, CEO, AVEVA. “With the launch of AVEVA Unified Engineering, a first of its kind solution, we are breaking down the silos between engineering disciplines and enabling our customers to turn conceptual designs into 3D models quickly, accelerating engineering to estimation and ensuring designs can be operated before committing billions of dollars.”

New AVEVA Unified Engineering enables the integration of the process model and plant model lifecycles from concept to detailed design, delivering frictionless collaboration for multi-discipline engineers in the cloud. The net result is a minimum 50% improvement in engineering efficiency in FEED and up to 30% in detail design, which can yield a 3% total installed cost improvement. These savings can be re-invested to ensure engineering quality, accuracy, and maturity for downstream project execution business processes.

AVEVA Unified Project Execution solutions integrate with AVEVA Unified Engineering to further break down the silos within procurement and construction by combining key disciplines covering Contract Risk Management, Materials and Supply Chain Control, and Construction Management into one cloud-based digital project execution environment. AVEVA Unified Project Execution solutions deliver up to a 15% reduction in material costs and a 10% reduction in field labor costs, and reduce unbudgeted supplier change orders by up to 50%, which translates to a 10% total installed cost savings opportunity for customers.

AVEVA’s Enterprise Learning solutions combine traditional simulation-based learning with 3D connected learning management solutions. AVEVA’s learning solutions extend process models and 3D models from AVEVA Unified Engineering to fast track DCS panel operator training, field operator training, process and maintenance procedural training, and process safety situational awareness training using cloud and Extended Reality (XR) technology to deliver up to 2% Total Installed Cost reduction by improved operations readiness.

“Our Engineering portfolio enhancements will deliver increased agility for our customers, enabling them to reduce cost, risk, and delays, minimizing errors and driving rapid capital project execution. The cost savings are realized by mitigating capital investment risks at the process design stage, cutting engineering man-hours by up to 30% in plant design, reducing material costs in procurement by up to 15% as well as reducing field labor costs in construction by up to 10%,” commented Amish Sabharwal, SVP, Engineering Business, AVEVA. “With these new solutions AVEVA is providing integration across all stages of the capital project, from conceptual design to handover, to optimize collaboration and break down silos between both engineering disciplines and project stages.”

Hitachi Vantara NEXT 2019–All About Data

[Updated] Hitachi, a large industrial conglomerate, brought together several businesses it owned, including the Pentaho acquisition and its Lumada IoT platform, into a wholly owned enterprise called Hitachi Vantara in 2017. NEXT 2019, its third customer conference, attracted a large attendance in Las Vegas.

Hitachi was founded more than 100 years ago “to make products for good.” Thus the conference theme, “Powering Good.” I have to say, it is refreshing to see an emphasis on ethics and doing good on display. The company made donations to the American Heart Association and the Rainforest Connection.

Hitachi is involved in manufacturing, so the IT group has roots there, which is of course relevant to all of us. It is not just a random act that brought a manufacturing emphasis to Lumada, although it is used in many other industries as well.

I previously wrote about new products under the Lumada brand in May and September. Following are summaries of important announcements from last week.

DataOps: Data Management for the AI Era

I walked into the stand and told the guy, “DataOps is my new hobby.” I learned much about this new (to me) technology that I wrote about for the first time only a couple of weeks ago.

DataOps was briefly described to me as a pipeline for data. Hitachi Vantara says, “DataOps is enterprise data management for the artificial intelligence (AI) era, seamlessly connecting data consumers with data creators to rapidly find and use all the value in an organization’s data.”

DataOps is not a product, service or solution. Rather, it’s a methodology, and a technological and cultural change, to improve an organization’s use of data through better data quality, shorter cycle time and superior data management. Because organizations are not analyzing most of the data they have due to legacy methods, Hitachi Vantara believes DataOps will have significant impact on the future of IT by unlocking vast amounts of previously unused data.

Hitachi Vantara announced the expansion of the Lumada platform services and solutions portfolio to help customers across industries break down data silos and drive more innovation through DataOps. Hitachi is now extending Lumada’s capabilities beyond the internet of things.

New and updated Lumada offerings include:

Lumada Data Services, a set of software services that help customers manage increasingly complex data ecosystems with an intelligent data foundation.

Interoperating with Hitachi’s proven technologies for object storage, data integration, and analytics – underpinning Hitachi Content Platform (HCP), Pentaho and Lumada – customers can now cost-effectively govern and manage all their data assets, including structured and unstructured, across data center, cloud and edge locations. Policy-based automation tools orchestrate enterprise data flows to deliver on cost savings, compliance and business growth demands.

Lumada Data Lake, an innovative, “smart” data lake offering that is self-optimizing: it intelligently places data sets in an optimal location, continuously curates itself to avoid data swamps, and is readily accessible to analytics anywhere.

Lumada Edge Intelligence, a new set of software and validated edge hardware devices that enable organizations to manage data and analytics at the network edge for digital use cases such as IoT, connected products, immersive customer experiences, remote and disconnected sites, and branch offices.

Hitachi Virtual Storage Platform (VSP) 5000 series and Hitachi Ops Center software form the company’s powerful next-generation storage and infrastructure foundation with a new scale-out, scale-up architecture for any workload at any scale. These technologies can accelerate data center workloads and deliver future-proof IT with a new, innovative architecture that is the ideal foundation for modernizing data center, cloud, and DataOps environments. The platform also features the world’s fastest NVMe flash array.

Hitachi Vantara expanded and enhanced its capabilities for cloud services in the first major announcement of the company’s newly formed cloud services portfolio. The portfolio includes cloud migration services, application modernization services, operations managed services, consulting services and Hitachi Enterprise Cloud (HEC). The portfolio leverages critical capabilities and industry-leading expertise from the company’s acquisition of REAN Cloud in 2018.

Data Integration and Analytics

Pentaho 8.3, the latest version of the company’s data integration and analytics platform software, introduces a series of features designed to support DataOps, a collaborative data management practice. This latest version delivers improved data agility from customers’ edge-to-multicloud environments while facilitating privacy, security and overall data governance.

Pentaho 8.3 introduces several enhancements:

  • Improved drag and drop data pipeline capabilities to access and blend data that’s difficult to access
  • A new SAP connector offers drag-and-drop blending, enriching, and offloading of data from SAP ERP and Business Warehouse, providing deeper insights into, and greater analytic value from, enterprise information
  • Amazon Kinesis support provides real-time data capability in an AWS environment. Pentaho allows AWS developers to ingest and process streaming data in a powerful visual environment, as opposed to writing code, and to blend it with other data, reducing manual effort (a rough sketch of the hand-coded alternative follows this list)
  • Improved integration with Hitachi Content Platform (HCP)
  • IBM Information Governance Catalog (IGC) Integration
  • Streaming data lineage makes it easier to trace real-time data from popular protocols such as AMQP, JMS, Kafka, and MQTT
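For context on the Kinesis item above, here is roughly what the hand-coded alternative looks like: a minimal boto3 sketch with a placeholder stream name and region, reading a single shard. This is the sort of plumbing a visual, drag-and-drop pipeline is meant to replace.

```python
# Hedged sketch of hand-coded Kinesis ingestion, the kind of boilerplate a
# visual pipeline tool is meant to replace. Stream name and region are
# placeholder assumptions; a real consumer would iterate over all shards.
import json
import time

import boto3  # pip install boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

shard_iterator = kinesis.get_shard_iterator(
    StreamName="plant-sensor-stream",
    ShardId="shardId-000000000000",
    ShardIteratorType="LATEST",
)["ShardIterator"]

while True:
    response = kinesis.get_records(ShardIterator=shard_iterator, Limit=100)
    for record in response["Records"]:
        event = json.loads(record["Data"])   # decode each sensor event
        print(event)                         # blending/enrichment would happen here
    shard_iterator = response["NextShardIterator"]
    time.sleep(1)                            # simple throttle between polls
```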

Use The Right Kind of Analytics

I am still talking about Emerson Exchange, and have a few more posts to go. This one is about analytics. Jonas Berge, Senior Director, Applied Technology, Plantweb, Emerson Automation Solutions, has often supplied me with great insight in the past, usually about networks. We chatted briefly at Exchange and then followed up with email conversations, and this time he talked about analytics.

Digital Transformation has a foundation in data. Data is useless without a formal way of thinking about it. There are two kinds of analytics tools.

We are left with two tasks. We must first understand the two types, how they are derived, and their strengths and weaknesses. Then we choose the right analytics tool for the problem.

There are principles-driven tools and data-driven tools.

Data Science

One must remember that advanced predictive techniques can only be practically applied to a subset of use cases.

An over-emphasis on one approach means companies won’t position themselves to capture all the potential benefits.

When factoring in the effort and expertise required to develop accurate machine-learning models, remember that most organizations already have systems in place to record maintenance- and reliability-related data, but the effectiveness of such systems can be undermined by poor housekeeping. The same assets or issues may be described in different ways in different systems, for example, making integration difficult. Companies may use free-text fields to record issues or maintenance actions, making automated search or data analysis harder. Or critical data may be inaccessible, locked away in spreadsheets or on paper notes.

The application of machine-learning techniques to monitor asset condition has already received considerable attention, even though their cost and complexity may ultimately limit their application.

Engineered Analytics

When a machine is prone to a narrow range of well-understood failure modes, it is often possible to address a potential problem in a simpler way, for example by monitoring the temperature or vibration of a component against a set threshold.

Model-based predictive maintenance becomes a breakthrough way to solve selected high-value problems. This approach has the most potential where there are well-documented failure modes with high associated downtime impact, for example in a critical machine on a larger production line.

Root-cause problem solving, using approaches such as fault-tree analysis as well as cause-and-effect or failure-modes-and-effects analysis (FMEA), is a fundamental part of any organization’s maintenance and reliability strategy.

Not all condition-monitoring techniques require elaborate algorithms or complex models, however. Data-driven condition-monitoring approaches use simple queries that are run periodically or in real time against time-series data generated by machines and external sensors. If threshold conditions are passed, these systems can trigger investigative or corrective action in the digital-reliability-engineering workflow, or directly to maintenance execution.
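To make the distinction concrete, here is a minimal sketch contrasting the two kinds of analytics on the same time series: an engineered rule with a fixed threshold drawn from a known failure mode, and a simple data-driven check that looks only at the statistics of recent readings. The readings, threshold value, and notify() stub are my own illustrative assumptions, not any vendor’s API.

```python
# Minimal contrast of engineered (principles-driven) vs data-driven analytics.
# Readings, threshold, and the notify() stub are illustrative only.
import statistics


def notify(message: str) -> None:
    """Stand-in for triggering a maintenance workflow (work order, e-mail, etc.)."""
    print("ALERT:", message)


# Recent vibration readings for one bearing, in mm/s (e.g. pulled from the historian).
readings = [2.1, 2.2, 2.0, 2.3, 2.2, 2.4, 2.3, 4.9]
latest = readings[-1]

# Engineered analytics: a fixed limit taken from the machine's known failure mode.
VIBRATION_LIMIT = 4.5
if latest > VIBRATION_LIMIT:
    notify(f"Vibration {latest} mm/s exceeds engineered limit {VIBRATION_LIMIT} mm/s")

# Data-driven analytics: flag the latest reading if it deviates strongly from
# the recent baseline, with no physics or failure model assumed.
baseline = readings[:-1]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)
if abs(latest - mean) > 3 * stdev:
    notify(f"Vibration {latest} mm/s is an outlier against baseline {mean:.2f} ± {stdev:.2f}")
```

Both checks fire on the same suspicious reading, but they get there by different routes, which is exactly why the right tool depends on the problem at hand.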
