The 2018 EY Industrial Products Survey polled 500 executives at Industrial Products (IP) companies with over $1 billion in annual revenue. Surveys like this keep returning similar results. You can look at the numbers and say, “Wow, almost half of the executives at these companies see innovation or technology as important,” or you can ask, “How can half of all executives surveyed not see how important innovation is?”
I have spent years in manufacturing and marketing leadership and have studied the industry for many more, and I lay most of the problems of the manufacturing business squarely at the feet of (lack of) managerial leadership. Looking at these results, I think the coming years will produce many winners and just as many losers.
The study surveyed executives from a variety of sectors, including aerospace and defense; industrial and mechanical components; machinery and electrical systems; chemicals and base materials; and packaging, paper, and wood. The survey was conducted between February 22 and March 22, 2018. Its purpose was to evaluate where IP companies fall on their journey toward continuous innovation.
Move over R&D: IP companies see digital technology and innovation as their path to success
- 48% of respondents view innovation as quite or extremely important to company success
- 43% of businesses are learning from and/or following the technology industry to influence innovation at their company
- 67% of companies plan to make significant investments in innovation beyond traditional R&D over the next three years
- 52% of businesses say the adoption of emerging technologies will be quite important or critical to the success of their business in the next three years
Additional results from the survey include:
Facing a culture crisis: The perception of the IP industry is hindering the talent search
- 67% agree/strongly agree that the image of the industrial products industry hurts when recruiting for needed skills
- 38% say that difficulty competing with tech-first companies for top talent is a leading barrier to filling the skills gap
- 25% say that attracting/retaining top talent is one of the biggest drivers of their company’s technology investment
- 64% agree/strongly agree that the IP industry needs to change its culture to thrive
IP is looking for outside inspiration. While the tech industry is the leading source, IP has a ways to go
- 43% of respondents are learning from and/or following the technology industry to influence innovation at their company
- Only 29% of businesses say they are extremely or quite innovative compared to close competitors
- 82% of respondents have made minimal or no investment in AI today
- 22% are learning from and/or following the automotive industry to influence innovation at their own company
- 21% are learning from and/or following the consumer products industry to influence innovation at their own company
Robotics, mobile and big data, oh my! What is getting the largest share of investment attention?
- 63% of respondents say that technology investments have driven measurable returns in agility to a significant/meaningful extent
- 46% are making substantial or major investments in robotics and 56% predict they will in the next three years
- 31% of businesses are increasing investment in emerging technologies in response to US tax reform
- 31% say that big data/analytics will be most influential on their business over the next three years
Not a matter of if but when disruption will hit. IP companies are staying nimble in order to prepare
- 49% of businesses say that preparation for disruption will be quite important or critical to the success of their business in the next three years
- 52% of businesses say that flexibility to adapt to trends will be quite important or critical to the success of their business in the next three years
- 53% of businesses say that access to specialized skills for emerging tech will be quite important or critical to the success of their business in the next three years
Living with technology a decade from now
Dell Technologies and the Institute for the Future conducted an in-depth discussion with 20 experts to explore how various social and technological drivers will influence the next decade and, specifically, how emerging technologies will recast our society and the way we conduct business by the year 2030.
There is no universally agreed-upon determination of which technologies are considered emerging. For the purpose of this study, IFTF explored the impact that Robotics, Artificial Intelligence (AI) and Machine Learning, Virtual Reality (VR) and Augmented Reality (AR), and Cloud Computing will have on society by 2030. These technologies, enabled by significant advances in software, will underpin the formation of new human-machine partnerships, according to the IFTF.
Talk of digital transformation is virtually everywhere in Information Technology circles and Operations Technology circles. My long and varied experiences have often placed me at the boundaries where the two meet—and are now increasingly overlapping.
The take on robotics is right on target. And forget about all the SciFi scare stories that mainstream media loves to promote: the future is definitely about human-machine partnership, or collaboration. For example, I often talk with EMTs about life in the rescue squad. These people are always in the gym. Our population in the US has gotten so large and obese that they often have to lift 300+ lb. patients who haven't the strength to help themselves up. Think about a robot assistant helping the EMT.
The AI discussion is likewise fraught, with prominent people like Ray Kurzweil or Elon Musk offering SciFi visions of the future. We are a long way from “intelligence.” Where we actually are is machine learning and neural networks that help machines (and us) learn by deciphering recurring patterns.
Back to the study, the authors state, “If we start to approach the next decade as one in which partnerships between humans and machines transcend our limitations and build on our strengths, we can begin to create a more favorable future for everyone.”
Jordan Howard, Social Good Strategist and Executive Director of GenYNot, sees tremendous promise for the future of human-machine partnerships: “Many of the complex issues facing society today are rooted in waste, inefficiency, and simply not knowing stuff, like how to stop certain genes from mutating. What if we could solve these problems by pairing up more closely with machines and using the mass of data they provide to make breakthroughs at speed? As a team, we can aim higher, dream bigger, and accomplish more.”
Liam Quinn, Dell Chief Technology Officer, likens the emerging technologies of today to the roll-out of electricity 100 years ago. Quinn argues that we no longer fixate on the “mechanics” or the “wonders” of electricity, yet it underpins almost everything we do in our lives. Similarly, Quinn argues, in the 2030s, today’s emerging technologies will underpin our daily lives. As Quinn provokes, “Imagine the creativity and outlook that’s possible from the vantage point these tools will provide: In 2030, it will be less about the wonderment of the tool itself and more about what that tool can do.”
By 2030, we will no longer revere the technologies that are emerging today. They will have long disappeared into the background conditions of everyday life. If we engage in the hard work of empowering human-machine partnerships to succeed, their impact on society will enrich us all.
While offshoring manufacturing jobs to low-cost economies can save up to 65% on labor costs, replacing human workers with robots can save up to 90% of these costs.
China is currently embarking upon an effort to fill its factories with advanced manufacturing robots, as workers’ wages rise and technology allows the industry to become more efficient. The province of Guangdong, the heartland of Chinese manufacturing, has promised to invest $154 billion in installing robots.
Buoyed by their commercial success, the adoption of robots will extend beyond manufacturing plants and the workplace. Family robots, caregiving robots, and civic robots will all become commonplace as deep learning improves robots’ abilities to empathize and reason. Google recently won a patent to build worker robots with personalities.
Artificial Intelligence and Machine Learning
Approximately 1,500 companies in North America alone are doing something related to AI today, which equates to less than 1% of all medium-to-large companies. We’re seeing this in the financial services industry already, with data recognition, pattern recognition, and predictive analytics being applied to huge data sets on a broad scale. In a 2015 report, Bank of America Merrill Lynch estimated that the AI market will expand to $153 billion over the next five years—$83 billion for robots, and $70 billion for artificial intelligence-based systems.
In addition to their ability to make decisions with imperfect information, machines are now able to learn from their experiences and share that learning with other AI programs and robots. But AI progress also brings new challenges. Discussions surrounding who or what has moral and ethical responsibility for decisions made by machines will only increase in importance over the next decade.
Virtual Reality and Augmented Reality
Although both Virtual and Augmented Reality are changing the form factor of computing, there is a simple distinction between the two. VR blocks out the physical world and transports the user to a simulated world, whereas AR creates a digital layer over the physical world.
Despite the difference, both technologies represent a fundamental shift in information presentation because they allow people to engage in what Toshi Hoo, Director of IFTF’s Emerging Media Lab, calls “experiential media” as opposed to representative media. No longer depending on one or two of our senses to process data, immersive technologies like AR and VR will enable people to apply multiple senses—sight, touch, hearing, and soon, taste and smell—to experience media through embodied cognition.
Over the next decade, Hoo forecasts that VR, combined with vast sensor networks and connected technologies, will be one of many tools that enable distributed presence and embodied cognition, allowing people to experience media with all their senses.
Cloud Computing
It's important to recognize that Cloud Computing isn't a place; it's a way of doing IT. Whether public, private, or hybrid (a combination of private and public), the technology is now used by 70% of U.S. organizations. This figure is expected to grow further, with 56% of businesses surveyed saying they are working on transferring more IT operations to the cloud, according to IDG Enterprise's 2016 Cloud Computing Executive Summary.
While the cloud is not a recent technological advancement, cloud technology only really gathered momentum in recent years, as enterprise grade applications hit the market, virtualization technologies matured, and businesses became increasingly aware of its benefits in terms of efficiency and profitability. Increasing innovation in cloud-native apps and their propensity to be built and deployed in quick cadence to offer greater agility, resilience, and portability across clouds will drive further uptake. Start-ups are starting to use cloud-native approaches to disrupt traditional industries; and by 2030, cloud technologies will be so embedded, memories from the pre-cloud era will feel positively archaic by comparison.
Human Machine Partnership
Recent conversations, reports, and articles about the intersection of emerging technologies and society have tended to promote one of two extreme perspectives about the future: the anxiety-driven issue of technological unemployment or the optimistic view of tech-enabled panaceas for all social and environmental ills.
Perhaps a more useful conversation would focus on what the new relationship between technology and society could look like, and what needs to be considered to prepare accordingly.
By framing the relationship between humans and machines as a partnership, we can begin to build capacity in machines to improve their understanding of humans, and in society and organizations, so that more of us are prepared to engage meaningfully with emerging technologies.
Digital (Orchestra) Conductors
Digital natives will lead the charge. By 2030, many will be savvy digital orchestra conductors, relying on their suite of personal technologies, including voice-enabled connected devices, wearables, and implantables, to infer intent from their patterns and relationships and to activate and deactivate resources accordingly.
Yet, as is often the case with any shift in society, there is a risk that some segments of the population will get left behind. Individuals will need to strengthen their ability to team up with machines to arrange the elements of their daily lives to produce optimal outcomes. Without empowering more to hone their digital conducting skills, the benefits that will come from offloading ‘life admin’ to machine partners will be limited to the digitally literate.
Work Chasing People
Human-machine partnerships will not only help automate and coordinate lives, they will also transform how organizations find talent, manage teams, deliver products and services, and support professional development. Human-machine partnerships won’t spell the end of human jobs, but work will be vastly different.
By 2030, expectations of work will reset and the landscape for organizations will be redrawn, as the process of finding work gets flipped on its head. As an extension of what is often referred to as the ‘gig economy’ today, organizations will begin to automate how they source work and teams, breaking up work into tasks, and seeking out the best talent for a task.
Instead of expecting workers to bear the brunt of finding work, work will compete for the best resource to complete the job. Reputation engines, data visualization, and smart analytics will make individuals’ skills and competencies searchable, and organizations will pursue the best talent for discrete work tasks.
Hannover Messe continues to reflect the trend of companies joining alliances to develop and promote standards and interoperability. While I did not have an interview with the Avnu Alliance while I was in Hannover, I talked with some members and obtained other information. Avnu Alliance promotes adoption of the Time Sensitive Networking (TSN) extension to Ethernet.
Specifically, Avnu Alliance is a community creating an interoperable ecosystem of low-latency, time-synchronized, highly reliable networked devices using open standards. Avnu creates comprehensive certification programs to ensure interoperability of networked devices. The foundational technology enables deterministic, synchronized networking based on the IEEE Audio Video Bridging (AVB) / Time Sensitive Networking (TSN) base standards. The Alliance, in conjunction with other complementary standards bodies and alliances, provides a unified network foundation for use in professional AV, automotive, industrial control, and consumer segments.
The pace of TSN adoption from 2017 to 2018 was remarkable.
I always drop by the Industrial Internet Consortium (IIC) area at Hannover and check out the TSN Testbed for Flexible Manufacturing. The testbed was developed with two major goals: to show TSN's readiness to accelerate the marketplace, and to show the business value of TSN in converged, deterministic IIoT networks. Momentum is increasing for the testbed, with the IIC hosting its 10th plugfest in an 18-month timeframe at the Bosch Rexroth facility in Frankfurt, Germany; its 9th plugfest was held in Austin, TX in February at National Instruments (NI) headquarters, following a joint workshop on interoperability with Avnu Alliance. The TSN Testbed recently integrated test tools from Avnu Alliance members Calnex, Ixia, and Spirent into plugfest activities, and demonstrated interoperability of TSN devices from more than 25 companies performing real-time automation and control functions over TSN.
Any Avnu Alliance member is welcome to join the IIC TSN Testbed or to participate in a plugfest. Upcoming plugfests will be held in Austin, TX from June 26-29, 2018 and in Stuttgart from July 24-27, 2018.
The Edge Computing Consortium (ECC) along with members and Avnu Alliance, hosted a press conference to announce new developments surrounding the newly created OPC UA TSN testbed. The testbed demonstrates six major IIoT scenarios mimicking processes found in smart manufacturing settings and utilizing products across different TSN vendors. Avnu Alliance is a key partner supporting the development of the testbed with the ECC in the shared goal of enabling manufacturers to test their products for interoperability and conduct trials of real-world systems as an early check for problems.
Tom Weingartner, Avnu Alliance member and Analog Devices’ marketing director for Deterministic Ethernet Technology Group, represented the Alliance at an announcement ceremony.
Paul Didier, Avnu Alliance member and IoT solutions architect at Cisco, delivered a talk at the Industrie 4.0 Meets the Industrial Internet Forum titled “Time Sensitive Networks – Where Does the Technology Stand and What to Expect,” providing an update on TSN and how manufacturers, alliances, and liaison groups are working together to advance the technology and its implementation in the IIoT.
Paul also presented a lecture for the Forum on “Modernizing Your Industrial Manufacturing Network.” The presentation followed the findings coming out of the IIC TSN Testbed and its capabilities, including information on how manufacturing automation and control infrastructure vendors and key decision-makers can leverage TSN for a variety of operational benefits, including increased connectivity between devices and the ability to extract and analyze valuable information through interconnectivity.
“HANNOVER continues to be a key industry event for both Avnu Alliance members and liaison groups that we work with to educate and increase awareness of TSN as a solution for the growing IIoT,” said Todd Walter, Avnu Alliance Industrial Segment Leader and Chief Marketing Manager at NI. “Whether through the developments coming from the TSN testbeds, speaking engagements or product demonstrations, our members and partners are committed to creating an interoperable TSN network that gives all industrial devices a more streamlined path to participating in the TSN ecosystem.”
Much time was devoted last week at Dell Technologies World to Dell's Legacy of Good, highlighting people and companies doing some really cool and worthwhile things. I'm especially impressed with the AeroFarms people (see photos below), who are using IoT to find a better way to grow wholesome vegetables. Hey engineers–maybe there's a thought in here to spark your next creative interest.
Let me take you on a photo journey through the prominent booth at the DT World Expo floor highlighting a number of projects.
Plastic waste floating in the ocean is fast becoming an environmental catastrophe. Here is someone doing something about it.
How about genetic mapping improvements for fighting rare diseases?
A bug’s eye view with drones to help the honeybee population.
All kinds of wild robot science fiction stories are hitting mainstream media. How about a reality check?
Oh, another mainstream media hype fest–AI. In reality it can be a boost to business, not a scary threat.
Here is a manufacturing product lifecycle story.
And the AeroFarms story.
Artificial Intelligence, better known simply as AI, along with its sometime companion robots, leads the mainstream media hype cycle. It's going to put everyone out of a job, destroy civilization as we know it, and probably destroy the planet.
I lived through the Japanese robotic revolution-that-wasn’t in the 80s. Media loved stories about robots taking over and how Japan was going to rule the industrialized world because they had so many. Probing the details told an entirely different story. Japan and the US counted robots differently. What we called simple pick-and-place mechanisms they called robots.
What set Japanese industrial companies apart in those days was not technology. It was management. The Toyota Production Method (aka Lean Manufacturing) turned the manufacturing world on its head.
My take for years, based on living in manufacturing and selling and installing automation, has been, and still is, that much of this technology actually assists humans: it performs dangerous work, removing people from harm's way; takes over repetitive tasks that lead to long-term stress-related injuries; and does work humans realistically couldn't do.
Now for AI. This press release went out the other day: “With AI, humans and machines work smarter and better, together.” So I was intrigued. How do they define AI, and what does it do?
Sensai, an augmented productivity platform for manufacturing operations, recently announced the launch of its pilot program in the United States. Sensai increases throughput and decreases downtime with an AI technology that enables manufacturing operations teams to effectively monitor machinery, accurately diagnose problems before they happen and quickly implement solutions.
The company says it empowers both people and digital transformation using a cloud-based collaboration hub.
“The possibility for momentous change within manufacturing operations through digital transformation is here and now,” said Porfirio Lima, CEO of Sensai. “As an augmented productivity platform, Sensai integrates seamlessly into old or new machinery and instantly maximizes uptime and productivity by harnessing the power of real time data, analytics and predictive AI. Armed with this information, every person involved – from the shop floor to the top floor – has the power to make better and faster decisions to increase productivity. Sensai is a true digital partner for the operations and maintenance team as the manufacturing industry takes the next step in digital transformation.”
By installing a set of non-invasive wireless sensors that interconnect through a smart mesh network of gateways, Sensai collects data through its IIoT Hub, gateways and sensors, and sends it to the cloud or an on-premise location to be processed and secured. Data visualization and collaboration are fostered through user-friendly dashboards, mobile applications and cloud-based connectivity to machinery.
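Sensai's actual stack is not public, but the gateway-to-cloud flow described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the sensor names, batch size, and `Gateway` class are all invented for the example) of a gateway buffering sensor readings and forwarding them as JSON batches to a collector:

```python
import json
import time

class Gateway:
    """Minimal sketch of a gateway that batches wireless sensor
    readings before forwarding them to a cloud or on-premise collector."""

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []
        self.sent = []  # stands in for the network/cloud side

    def ingest(self, sensor_id, value):
        # Each reading is timestamped at the gateway.
        self.buffer.append({"sensor": sensor_id,
                            "value": value,
                            "ts": time.time()})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            # In a real deployment this would be an MQTT/HTTPS publish.
            self.sent.append(json.dumps(self.buffer))
            self.buffer = []

gw = Gateway(batch_size=2)
gw.ingest("motor-7-vibration", 0.42)
gw.ingest("motor-7-temp", 71.5)
print(len(gw.sent))  # → 1 (one batch forwarded)
```

Batching at the gateway is a common design choice for mesh networks: it amortizes the cost of each uplink transmission while keeping per-reading timestamps intact for later analysis.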
The AI part
Sensai's differentiator is that it provides a full state of awareness, not only of the current status but also of the future conditions of the people, assets, and processes on the manufacturing floor. Sensai learns a business's processes and systems with coaching from machine operators and from process and maintenance engineers. It then makes recommendations based on repeating patterns that were not previously detected. Sensai does this by assessing the team's experience and historical data from the knowledge base and cross-checking patterns of previous failures against a real-time feed. With this information, Sensai provides recommendations to avoid costly downtime and production shutdowns. Sensai is a true digital peer, connecting variables in ways that are not humanly possible at the speed required on today's plant floor.
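The idea of cross-checking historical failure patterns against a real-time feed can be made concrete with a toy example. The sketch below is my own illustration, not Sensai's algorithm: the feature vectors, failure names, threshold, and `check_feed` function are all hypothetical, using a simple nearest-pattern distance check in place of whatever models the product actually uses.

```python
import math

# Hypothetical "failure patterns": feature vectors captured before
# previous breakdowns, e.g. [vibration (g), temperature (°C), current (A)].
failure_patterns = {
    "bearing_wear":  [0.9, 78.0, 12.5],
    "belt_slippage": [0.3, 55.0, 18.0],
}

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def check_feed(live_features, threshold=5.0):
    """Compare a real-time feature vector against known failure
    patterns; return the closest match if it is within threshold."""
    name, pattern = min(failure_patterns.items(),
                        key=lambda kv: euclidean(live_features, kv[1]))
    return name if euclidean(live_features, pattern) < threshold else None

print(check_feed([0.85, 77.2, 12.9]))  # → bearing_wear
```

A production system would of course use learned models over many more variables, but the shape of the logic is the same: a live window of data is scored against signatures of past failures, and a close match triggers a recommendation before the failure recurs.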
About the Pilot Program
Participation in Sensai's pilot program is open now to interested manufacturers. Already deployed throughout Metalsa, a leading global manufacturer of automotive structural components, Sensai aims to digitally disrupt the manufacturing industry through AI, including manufacturers in automotive, heavy metal and stamping, construction materials, consumer goods, and more.
Porfirio Lima, Sensai CEO, answered a number of follow up questions I had. (I hate when I receive press releases with lots of vague benefits and buzz words.)
1. You mention AI. What specifically is meant by AI, and how is it used?
Sensai uses many different aspects of Artificial Intelligence. We are specifically focused on machine learning (ML), natural language processing (NLP), deep learning, data science, and predictive analytics. When used together correctly, these tools serve a specific use case, allowing us to generate knowledge from the resulting data. We use NLP to enable human-computer interaction, helping us derive meaning from human input. We use ML and deep learning to learn from data and create predictive and statistical models. Finally, we use data science and predictive analytics to extract insights from the unstructured data deriving from multiple sources. All of these tools and techniques allow us to cultivate an environment of meaningful data coming from people, sensors, programmable logic controllers (PLCs), and business systems.
2. “Learn processes through operators”—how do you get the input, how do you log it, and how do you feed it back?
Our primary sources of data (inputs) are people, sensors, PLCs, and business systems. In the case of people on the shop floor, or operators, we created a very intuitive and easy-to-use interface that they can use on their cellphones or on the Human Machine Interfaces (HMIs) installed in their machines, so they can give us feedback about the root causes of failures and machine stoppages. We acquire this data in real time and utilize complex machine learning algorithms to generate knowledge that people can use in their day-to-day operations. Currently, we offer web and mobile interfaces so that users can quickly consume this knowledge to make decisions. We then store their decisions in our system and correlate them with the existing data, allowing us to optimize their decision-making process over time. The more a set of decisions and conditions repeats, the easier it is for our system to determine the expected outcome of a given set of data.
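That last point, "the more a set of decisions and conditions repeats, the easier it is to determine the expected outcome," is essentially a frequency model over logged (condition, decision, outcome) triples. Here is a minimal sketch of that feedback loop; the condition and decision names are invented for illustration, and a real system would weight far richer features than simple counts:

```python
from collections import Counter, defaultdict

# Log of operator decisions: (condition, decision) -> outcome frequencies.
history = defaultdict(Counter)

def log_decision(condition, decision, outcome):
    """Record what happened after an operator took a decision
    under a given machine condition."""
    history[(condition, decision)][outcome] += 1

def expected_outcome(condition, decision):
    """Return the most frequently observed outcome for this pairing,
    or None if it has never been seen."""
    outcomes = history.get((condition, decision))
    if not outcomes:
        return None
    return outcomes.most_common(1)[0][0]

log_decision("high_vibration", "replace_bearing", "resolved")
log_decision("high_vibration", "replace_bearing", "resolved")
log_decision("high_vibration", "ignore", "unplanned_stop")

print(expected_outcome("high_vibration", "replace_bearing"))  # → resolved
```

As more history accumulates, the same structure lets the system surface the decision with the best track record for the current condition, which is the "optimize their decision-making process over time" behavior described in the answer.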
3. Pattern? What patterns? How is it derived? Where did the data come from? How is it displayed to managers/engineers?
We create “digital fingerprints” (patterns) with ALL the data we are collecting. These “patterns” allow us to see how indicators look before a failure occurs, enabling us to then predict when another failure will happen. Data comes from the machine operators, the machines or equipment, our sensors, and other systems that have been integrated with Sensai's IIoT Hub.
We trigger alerts to let managers and engineers know that a specific situation is happening. They can then review it on their cellphones as a push notification that takes them to a detailed description of the condition in their web browser, where they can review more information in depth.
4. What specifically are you looking for from the pilots?
We are not a cumbersome solution; for us it is all about agility and value creation. We look for pilots that can give us four main outcomes:
– Learn more about our customer needs and how to better serve them
– A clear business case that can deliver ROI in less than 6 months after implementation and can begin demonstrating value in less than 3 months.
– A pilot that is easy to scale up and replicate across the organization so we can take the findings from the pilot and capitalize them in a short period of time.
– A pilot that can help Sensai and its customers suspend disbelief: proof that technology can truly deliver the value intended and can be quickly deployed across the entire organization.