More Partnership News from Security Firm Claroty

Claroty has been busy. Following the news of an investment and partnership with Rockwell Automation, Claroty and Siemens have announced a global partnership. Siemens will leverage Claroty's advanced behavioral analysis technology in its recently announced Industrial Anomaly Detection solution.

Siemens, through its global venture firm Next47, also invested in Claroty, joining a global syndicate of industrial giants that invested $60 million in the company’s Series B round, bringing the company’s total investment to date to $93 million.

Siemens initiated the Charter of Trust in February 2018, gaining the support of other giant companies in the global fight against the rising cybersecurity threat to industrial systems. Siemens also continues to expand its cybersecurity portfolio, debuting at the 2018 Hannover Messe industrial automation conference a new Industrial Anomaly Detection solution, which will deliver significant value for both operations and cybersecurity teams. Operations teams receive a detailed inventory of industrial assets and changes to the network. Cybersecurity teams can continuously monitor these critical networks for vulnerabilities, malicious activity, and high-risk changes, across distributed industrial sites.
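
For readers who want a feel for what this kind of passive anomaly detection means in practice, here is a minimal, purely illustrative sketch of the baseline-and-deviation idea: learn which assets normally talk to which over which protocols, then flag anything outside that baseline. This is my own toy example, not Claroty's or Siemens' actual technology; the asset names and protocols are hypothetical, and real products go far deeper than this.

```python
# Toy sketch of baseline-and-deviation anomaly detection for an OT network.
# Illustrative only -- not Claroty's or Siemens' algorithm; names are made up.
from collections import namedtuple

Flow = namedtuple("Flow", ["src", "dst", "protocol"])

class NetworkBaseline:
    def __init__(self):
        self.known_flows = set()   # (src, dst, protocol) tuples seen during learning
        self.known_assets = set()  # device names/addresses seen during learning

    def learn(self, flows):
        """Build the baseline from traffic observed during a learning period."""
        for f in flows:
            self.known_flows.add((f.src, f.dst, f.protocol))
            self.known_assets.update((f.src, f.dst))

    def check(self, flow):
        """Return alerts for a newly observed flow that deviates from the baseline."""
        alerts = []
        if flow.src not in self.known_assets or flow.dst not in self.known_assets:
            alerts.append("new asset on the network")
        if (flow.src, flow.dst, flow.protocol) not in self.known_flows:
            alerts.append("communication pattern not in baseline")
        return alerts

# Learn from normal traffic, then flag an unexpected connection to a PLC.
baseline = NetworkBaseline()
baseline.learn([Flow("plc-01", "hmi-01", "s7comm"), Flow("hmi-01", "historian", "opcua")])
print(baseline.check(Flow("laptop-99", "plc-01", "s7comm")))  # both alerts fire
```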

Claroty was selected by Siemens following an intensive technical evaluation. “In selecting our security partner for Industrial Anomaly Detection, we reviewed the market, conducted a detailed evaluation, and rigorously tested possible technology in our industrial lab environment,” said Dr. Thomas Moser, CEO of the Siemens Customer Services business unit. “Claroty’s advanced behavioral analysis provides a significant advantage to our customers in reducing risk to their OT environment.”

“Our mission is to help our customers secure industrial networks so they can avoid costly operations downtime, and maintain the safety of people and expensive assets,” said Amir Zilberstein, Claroty Co-founder and CEO. “Siemens’ selection of Claroty as a strategic partner and their investment in our company is further validation of our technology, our team, and our ability to deliver world-class, enterprise-level protection.”

Siemens uses Claroty's technology in a pre-packaged offering that enables customers to quickly and safely deploy anomaly detection in their operations. Siemens brings the offering to market as pre-installed packages on Siemens IPCs; in the future, it plans to also offer it on Siemens switches with an application processing engine, provided by the Ruggedcom RX1500 series.

Siemens, as owner and operator of nearly 300 factories, heavily leverages digitalization for efficiency gains. Responsible digitalization must go hand in hand with cybersecurity. Therefore, Siemens is implementing a defense-in-depth security concept in its factories. Industrial Anomaly Detection is an important element of this concept.

The Claroty Platform comprises multiple integrated products built on Claroty's advanced CoreX technology. The products provide the full range of cybersecurity protection, control, detection, and response. Claroty has received multiple industry awards in recent months. It was recently named an Energy Innovation Pioneer at CERAWeek 2018, and the company's flagship Continuous Threat Detection product won the ICS Detection Challenge during the S4x18 conference in Miami.

More Accurate Location Services

I met with the representative of an interesting company with a different take on indoor location services. Years ago I listened to a podcast called the Gillmor Gang, on which the (at the time) famous blogger Robert Scoble was always extolling the virtues of beacons. They will be everywhere and do all sorts of things, he repeated like a mantra.

Things got quiet, and then I met Quuppa at Hannover Messe 2018. They have a beacon system with multiple antennas that does a better job of determining location than some of the older triangulation technologies.

The company has just announced a partner event, something that gives me an excuse to point you toward something interesting. I’m assuming that few if any of my readers are heading to Finland any time soon.

Quuppa, a Finnish company that delivers indoor positioning technology, announced its second annual partner event will take place June 5-7 in Helsinki, Finland. With a theme of “Defining the Future,” the event will feature speakers from Quuppa and its partner ecosystem, networking events and a Solutions Showcase Expo that demonstrates the current and future capabilities of real-time, global indoor location services and solutions. The event demonstrates the success Quuppa has had delivering on its go-to-market strategy that centers on providing an open positioning platform both in terms of hardware and software APIs, where each company focuses on what it does best, helping speed time-to-market.

The event will also highlight a day of presentations featuring "success stories," with case study presentations that showcase the wide range of use cases for Quuppa's unique indoor location technology. Featured success story topics include improving efficiency and customer experience in retail, large-scale asset tracking, Industry 4.0, manufacturing use cases from Japan, safety in a secure environment, generating business KPIs from location data, and employee safety indoors and outdoors.

Quuppa utilizes a unique combination of Bluetooth Low Energy (BLE) and the Angle of Arrival (AoA) methodologies, as well as advanced location algorithms that have been developed over the course of more than 15 years, to calculate highly accurate indoor positioning.
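
For the curious, the geometry behind angle-of-arrival positioning can be sketched very simply: each locator measures a bearing toward the transmitting tag, and the position estimate is where those bearing lines intersect. The Python sketch below is my own illustration of that 2-D geometry with made-up coordinates; Quuppa's actual algorithms involve far more sophisticated filtering and many more measurements.

```python
# Minimal 2-D angle-of-arrival fix: intersect two bearing lines from two locators.
# Illustrative geometry only, not Quuppa's algorithm.
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Estimate a tag position from two locators' bearings (in radians)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2-D cross product of the directions
    if abs(denom) < 1e-9:
        return None  # bearings are (nearly) parallel, so no unique fix
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom  # distance along the first bearing
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two locators 10 m apart, each sighting the same tag at 45 and 135 degrees.
print(intersect_bearings((0, 0), math.radians(45), (10, 0), math.radians(135)))
# -> approximately (5.0, 5.0)
```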

The Quuppa Ecosystem includes more than 70 best-of-breed companies worldwide that deliver best-in-class software solutions, tags and installation services, as well as system integrators and solution providers that offer end-to-end solutions. Companies across a wide range of industries, including manufacturing and logistics, retail, healthcare, sports, law enforcement and security, government and others rely on Quuppa and its ecosystem partners to unlock the full potential of indoor location-based services without compromising accuracy, compatibility or cost.

“Quuppa’s ecosystem continues to thrive, and our partner event is a place to gather and share expertise and best practices for global indoor location services,” said Fabio Belloni, head of Quuppa’s Partner Ecosystem. “What we are seeing more of as the ecosystem expands is partner companies seeking answers from their peers—not just from Quuppa—on wide-ranging topics such as how to launch a large-scale deployment, how to forge partnerships to grow in new geographic areas, how to best conduct a demo, and more. Companies are realizing they no longer need to develop everything on their own, they can choose best-of-breed solutions from our incredible ecosystem partners. It’s amazing to see how quickly the Quuppa Ecosystem is growing and the unique partnerships that are forming because of it.”

One such partnership that has emerged within the Quuppa Ecosystem is between Japanese motor manufacturer Nidec Corp. and Synapses Lab, an Italian technology design company. The companies work together utilizing Quuppa’s precision location technology, Synapses’ platform for tracking and 3D modeling, and Nidec’s electronics and engineering expertise to develop autonomous solutions that will deliver improved productivity and security in the manufacturing industry.

“Building a solid and reliable ecosystem is essential for our company,” said Domenico Mariotti, CEO and cofounder of Synapses. “Such a system enables us to tackle new challenges and different use cases every day, sometimes beating any expectations we ourselves had for our solutions.”

“In the Japanese manufacturing industry, some early birds are now trying to introduce IoT to their factories,” said Hiroshi Mochizuki, Small Precision Motor and Solutions business unit at Nidec. “They do not allow position data to have jitter, so Nidec decided to select Synapses’ platform utilizing the Quuppa Ecosystem. Synapses has successfully developed its platform, of which the filtering capability and database structure is duly optimized for Quuppa’s technology. Nidec strongly believes that problem-solving requests by its customers will be soon made, and good results in increase of productivity and security are expected to become visible in a short period of time, thanks to the availability of Synapses platform.”

Microsoft Acquires GitHub and Other Big Company and Open Source Thoughts

Microsoft acquiring GitHub, the repository of many open source projects, on the surface appears almost an oxymoron. However, as I've written previously about big companies and the OPC UA standard, big companies now find open source and interoperability to be sound business decisions rather than threats to their proprietary hold on technology.

OPC and Standards

Two years ago, in my podcast Gary on Manufacturing 149 (also found on YouTube), I asked why major suppliers of automation technology for manufacturing/production hated OPC UA, an industry information-model standard. That is by far the most-viewed YouTube podcast I've ever done. I followed up with Gary on Manufacturing 175, also on YouTube, to bring the situation up to date.

It is still getting comments, some two years later. Some guy (probably works for a big company?) even dissed me about it.

However, the industry has witnessed an almost tectonic shift in these automation suppliers' approach toward standards. First, Siemens went all in on OPC UA. Then, beginning last November, Rockwell Automation has had several deep discussions with me about its adoption of OPC UA.

Why? Users demand more interoperability. And using standards is the easiest way forward for interoperability. Suppliers have discovered that standards allow them to continue to push development of their “black boxes” of technology while allowing themselves and their customers to assemble systems of technology.
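
To make that interoperability point concrete: with a standard like OPC UA, any compliant client can read data from any vendor's compliant server without a proprietary driver. Here is a minimal sketch using the open-source python-opcua package; the endpoint URL and node ID are placeholders you would replace with your own server's values.

```python
# Minimal OPC UA read using the open-source python-opcua package (pip install opcua).
# The endpoint URL and node ID below are placeholders, not a real server.
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")  # hypothetical server endpoint
client.connect()
try:
    node = client.get_node("ns=2;i=1001")       # hypothetical node for a process value
    print("Current value:", node.get_value())
finally:
    client.disconnect()
```

The same few lines work regardless of whose controller or gateway is serving the data, which is exactly the point.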

Microsoft News

On my favorite news site, Axios, Ina Fried writes:

Microsoft announced this morning it is acquiring GitHub, the social network for coders as well as home to millions of different software projects, for $7.5 billion.

“The era of the intelligent cloud and intelligent edge is upon us. Computing is becoming embedded in the world, with every part of our daily life and work and every aspect of our society and economy being transformed by digital technology. Developers are the builders of this new era, writing the world’s code. And GitHub is their home.”
— Satya Nadella, CEO, Microsoft

Why it matters: This would further highlight the complete turnaround the company has already made in its stance toward open source software.

Behind the scenes: While former Microsoft CEO Steve Ballmer once called Linux a cancer, the company has steadily warmed to open source, with Nadella embracing it with open arms.

GitHub plays into that strategy as it’s used by developers of all stripes to store their code projects. The San Francisco-based company was founded in 2008 and is now home to 80 million software repositories. The company has been searching for a new CEO since last year.

Why it matters: Playing host to the world’s code doesn’t necessarily make Microsoft a more central player, but it could tightly integrate GitHub into its developer tools. Microsoft decided last year to shut down its own CodePlex software repository, bowing to GitHub’s popularity.

What about Windows? Though certainly a fan of its homegrown operating system, Microsoft’s main goal these days is to be in tight with developers and get them writing code that can live in its Azure cloud.

Microsoft even dropped the Windows name from Azure, reflecting the fact you don’t have to use Windows to work with Azure.

History lesson: Microsoft’s shift to embrace Linux is somewhat reminiscent of the earlier move IBM made to do so. Both companies are now seen as the mature veterans of the enterprise market, more interested in meeting corporate computing needs than pushing homegrown architectures.

This information was also posted on the Microsoft Blog.

Other Open Source Information

My other travels and interviews have turned up other companies that have invested heavily in open source.

Within the last two years I have had a few conversations with Microsoft about its open source code contributions. While I am a little surprised at the GitHub acquisition, perhaps it will lend financial stability to the platform (although we do have to note that large-company investment does not always ensure financial stability).

Dell Technologies and Hewlett Packard Enterprise, two companies I have studied more recently, are both proud contributors to open source. A couple of years ago, Dell devoted considerable time during one of the Dell World keynotes to open source projects.

I think that some of these companies are realizing that they don’t have to invent everything themselves. Being good software citizens benefits them as well as the community.

AI Understood and Misunderstood

Artificial Intelligence (AI) draws a lot of attention and media space. But what is it, really? Elon Musk has recently spoken about AI in terms of dystopian sci-fi movies. Recently I've heard Eric Schmidt and Michael Dell talk much more positively about the potential of AI to solve human problems.

Most of the discussion about AI these days has little to do with replicating human brains with silicon circuitry. The core involves machine learning and neural networks. Recently two different organizations have sent me studies about the current state of AI. The first comes from ABI Research. The second from McKinsey Global Institute.

ABI Research: AI from the Cloud to the Edge

Artificial Intelligence (AI) will see a significant shift out of the cloud and onto the edge (that is, on-device, gateway, and on-premise server). This will happen initially for inference and later for training. This shift means a huge opportunity for chipset vendors with power-efficient chipsets and other products that can meet the demand for edge AI computing. Edge AI inference will grow from just 6% in 2017 to 43% in 2023, according to ABI Research, a market-foresight advisory firm providing strategic guidance on the most compelling transformative technologies.

“The shift to the edge for AI processing will be driven by cheaper edge hardware, mission-critical applications, a lack of reliable and cost-effective connectivity options, and a desire to avoid expensive cloud implementation. Consumer electronics, automotive, and machine vision vendors will play an initial critical role in driving the market for edge AI hardware. Scaling said hardware to a point where it becomes cost effective will enable a greater number of verticals to begin moving processing out of the cloud and on to the edge,” says Jack Vernon, Industry Analyst at ABI Research.

ABI Research has identified 11 verticals ripe for the adoption of AI, including the automotive, mobile device, wearables, smart home, robotics, small unmanned aerial vehicle, smart manufacturing, smart retail, smart video, smart building, and oil and gas sectors, split across a further 58 use cases. By 2023, the market will witness 1.2 billion shipments of devices capable of on-device AI inference, up from 79 million in 2017.

Cloud providers will still play a pivotal role, particularly when it comes to AI training. Of the 3 billion AI device shipments that will take place in 2023, over 2.2 billion will rely on cloud service providers for AI training. That is still a real-term decline in cloud providers' market share for AI training, which currently stands at around 99% but will fall to 76% by 2023. Hardware providers should not be too concerned about this shift away from the cloud, as AI training is likely to be supported by the same hardware, only at the edge, on either on-premise servers or gateway systems.

The power-efficient chipset is the main driver of edge AI. Mobile vendor Huawei is already introducing on-device AI training for battery power management in its P20 Pro handset, in partnership with Cambricon Technologies. Chip vendors NVIDIA, Intel, and Qualcomm are also making a push to deliver the hardware that will enable automotive OEMs to experiment with on-device AI training to support their efforts in autonomous driving. Training at the edge on-device is beginning to gain momentum in terms of R&D, but it could still take some time for it to become a realistic approach in most segments.

“The massive growth in devices using AI is positive for all players in the ecosystem concerned, but critically those players enabling AI at the edge are going to see an increase in demand that the industry to date has overlooked. Vendors can no longer go on ignoring the potential of AI at the edge. As the market momentum continues to swing toward ultra-low latency and more robust analytics, end users must start to incorporate edge AI in their roadmap. They need to start thinking about new business models like end-to-end integration or chipset as a service,” Vernon concludes.

These findings are from ABI Research’s Artificial Intelligence and Machine Learning market data. This report is part of the company’s AI and Machine Learning research service, which includes research, data, and Executive Foresights.

McKinsey

AI Could Add $2 Trillion to Manufacturing Value, McKinsey Paper Says

Artificial intelligence for manufacturing is like the old BASF chemical company slogan: it does not make many of the things you buy; it makes them better.

As much as $2 trillion better, according to a McKinsey Global Institute discussion paper covering more than 400 AI use cases. In 69 percent of the use cases, researchers found that adding AI to established analytical techniques could improve performance and generate additional insights and applications.

Nearly a quarter of the use cases directly or indirectly touched manufacturing.

"Manufacturing is the second largest domain when it comes to potential in value creation (right behind Marketing & Sales)," said Mehdi Miremadi, an MGI partner and co-author of the paper. "Application of advanced deep learning models in manufacturing and supply chain have the potential to create $1.2-2 trillion in annual economic value."

Two deep-learning neural networks offer the greatest promise in applying AI to manufacturing.

Feed Forward — information moves from the input layer forward through the "hidden" layers to the output layer. This approach has been enhanced by advances in computing power, training algorithms, and available data.

Convolutional – connections between the neural layers mimic the human visual cortex, which processes images. These networks are well suited to visual perception tasks.

Of the two, feed-forward neural networks are the most applicable, with wide applications in predictive maintenance, yield, efficiency, and energy. The most value is derived when AI is used alongside existing analytics tools.
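
For readers who want a concrete picture of what "feed forward" means, here is a minimal sketch of the forward pass: an input vector flowing through a hidden layer to an output. It is illustrative only, has nothing to do with MGI's models, and the layer sizes and the "risk score" framing are my own invention.

```python
# Minimal feed-forward pass: input layer -> hidden layer -> output layer.
# Illustrative only; real predictive-maintenance models are trained on plant data.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def feed_forward(x, weights, biases):
    """Propagate an input vector forward through successive dense layers."""
    activation = x
    for W, b in zip(weights, biases):
        activation = relu(activation @ W + b)
    return activation

rng = np.random.default_rng(0)
# Four sensor readings in, one hidden layer of eight units, one output (a risk score).
weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 1))]
biases = [np.zeros(8), np.zeros(1)]
print(feed_forward(np.array([0.2, 1.3, 0.7, 0.1]), weights, biases))
```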

“In predictive maintenance, neural networks improve the ability to incorporate and process a broader set of data, including unstructured data such as videos and images,” Miremadi said. “Better algorithm precision and accuracy can result in better decisions. And there is the possibility of taking better advantage of live data.”

Small and medium size manufacturers, often left on the sidelines of Industry 4.0 technologies like AI, can do more than they might think, Miremadi said.

The cost of setting up AI applications is declining. Hardware sensors and actuators are much more affordable and reliable. And data systems and deep-learning algorithms are more accessible. This allows a choice of internal development or working with one of the many tech providers in the market.

Advantech Morphs Strategy for Internet of Things Development

Taiwan-based Advantech’s leaders have always been intellectual strategic thinkers. They have clued me in on several good management books. The company makes industrial computers along with industrial data acquisition and I/O devices, and it has successfully positioned itself as an edge device leader in the Internet of Things space.

The company has announced its strategies for entering the next phase of IoT development. To expand local operations, Advantech will fully activate the deployment of branch locations throughout various regions. In addition, a co-creation model will be adopted to construct the Industrial IoT (IIoT) ecosystem and strengthen the influence of vertical domains.

Advantech’s Executive Director of the Board, Chaney Ho, stated that since taking over as executive director last year, he has been focusing on developing regional strategies and establishing development goals and directions for each region based on its scale.

In larger-scale regions (Europe, the United States, and China), the company’s primary goals are reinforcing Advantech’s brand recognition in IoT and Industry 4.0, cultivating talent, and increasing its local sales presence, in order to respond to Industry 4.0 trends in the EU, U.S. government plans to shift production back to America, and China’s One Belt One Road policy.

For medium and small-scale regions, Mr. Ho stated that Advantech will develop Japan, South Korea, India, and Russia to generate $130 million in revenue. The company also plans to further increase investment in Malaysian and Thai IIoT organizations, and new branch locations in Vietnam, Russia, and Turkey will be established through mergers and acquisitions as well as joint ventures.

Regarding developments in the European region, Miller Chang, President of Advantech’s Embedded-IoT (EIoT) Group, said the EIoT group has practiced a sector-lead strategy since 2014. Various product divisions from headquarters have been fully connected with overseas frontline business teams, and the compound annual growth rate from 2014 to 2017 reached 25%.

Key development points for the next three years in Europe are:

1. Elevating operation levels in five key regions, the UK, France, Germany, Italy, and the Netherlands.
2. Establishing branch offices in emerging European regions for conducting business and providing technical support.
3. Focusing on key industries, such as gaming, medical, transportation, and automotive, in Germany, the UK, and the Netherlands.

With respect to development in the Greater China Region, Linda Tsai, President of Advantech’s IIoT Group, believes that the embedded systems/hardware from Phase I IoT development as well as the IoT solution platforms from Phase II are Advantech’s “double-growth engine” in IIoT development. Accordingly, three key strategies have been proposed.

1. Implement an IIoT sector-lead organizational development model, expanding industry management and optimizing regional resource allocation.
2. Set successful examples in the Greater China Region to accelerate the marketing of hardware/software and imaging solutions.
3. Actively cultivate local personnel to become mid-to-high-level supervisors to expand into the Chinese market.

Fantine Lee, Manager of Advantech’s Corporate Investment Division, pointed out that through the co-creation model, Advantech will continue to actively promote platform management during Phase II IoT development, SRP co-creation, and the co-created digital transformation of vertical-industry cloud services during Phase III. As for the vertical-industry cloud service companies to be co-created during Phase III, Advantech plans to establish subsidiaries in Taiwan and China covering domains such as Smart Manufacturing, Smart Environmental Protection, and Smart Retail. These companies will be managed together with Advantech’s co-creation partners. Furthermore, opportunities in other domains, such as Smart Hospitals, Smart Factories, Industrial Vision Systems, Consultant Training, and Integration Services, will continue to be promoted and co-created.

Ms. Lee further stated that for Phase II development, Advantech’s WISE-PaaS cloud platform will serve as the foundation for building a comprehensive value chain for SRPs. This year, third-party software and WISE-PaaS platform integration with SaaS suppliers and collective sales/agents will be introduced at an accelerated pace. In addition, partnerships will be established with software developers specializing in monitoring and diagnosing connected equipment, energy management, data analysis, machine learning, and other vertical industries.

NI, Focusing on Test and Measurement, Updates LabVIEW

NI Week was last week, and for only the second time in 20 years, I didn’t go. NI, formerly National Instruments, has been focusing more on test and measurement lately, and not so much on automation. My interest is mostly in its IoT efforts, especially TSN. I figure I can get an interview with Todd Walter or whomever without the expense of a conference.

NI’s core competency is providing a software-defined platform that helps accelerate the development and performance of automated test and automated measurement systems. At NI Week, it announced the release of LabVIEW 2018.

Applications that impact our daily lives are increasing in complexity due to the rapid innovation brought on by industry trends such as 5G, the Industrial Internet of Things, and autonomous vehicles. Consequently, the challenge of testing these devices to ensure reliability, quality, and safety introduces new demands and test configurations, with decreased time and budgets. Engineers need better tools to organize, develop, and integrate systems so they can accomplish their goals within acceptable boundaries.

Engineers can use LabVIEW 2018 to address a multitude of these challenges. They can integrate more third-party IP from tools like Python to make the most of the strengths of each package or existing IP from their stakeholders. Test engineers can use new functionality in LabVIEW 2018 to strengthen code reliability by automating the building and execution of software through integration with open interface tools like Jenkins for continuous delivery. Capabilities like this empower test engineers to focus on system integration and development where they can offer unique differentiation, rather than get bogged down in the semantics of how to use software tools or move IP from one to another. For test engineers using FPGAs for high-performance processing, new deep learning functions and improved floating-point operations can reduce time to market.
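
As an aside on that Python integration: the Python side of such a setup is just an ordinary module whose functions LabVIEW calls with parameters and reads return values back from. Here is a hedged sketch of what such a module might look like; the function name and the statistics it computes are my own invention, not anything NI ships.

```python
# analysis.py -- a plain Python module of the sort LabVIEW's Python integration
# can call into. The function name and statistics chosen here are hypothetical.
import statistics

def summarize_measurements(samples):
    """Return (mean, standard deviation, peak) for a list of measurement samples.

    LabVIEW would pass the acquired samples in as a numeric array and read the
    summary values back for reporting or pass/fail limit checks.
    """
    mean = statistics.mean(samples)
    spread = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mean, spread, max(samples)

if __name__ == "__main__":
    # Quick self-test when run outside LabVIEW.
    print(summarize_measurements([1.02, 0.98, 1.05, 0.99]))
```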

“NI’s continued commitment to its software-centric platform accelerates my productivity so I can focus on the challenges that yield the highest ROIs,” says Chris Cilino, LabVIEW framework architect at Cirrus Logic. “LabVIEW continues to minimize the effort of adding tests and code modifications to our validation framework, delivering a consistent process to maintain our software and incorporate the reuse of valuable IP without rewrites.”

To meet demands like testing higher-complexity DUTs (devices under test) on shorter timeframes, engineers need tools tailored to their needs that they can use efficiently throughout their workflow, helping them meet their exact application requirements. LabVIEW 2018 is the latest addition to NI’s software-centric platform, which features products tailored to needs within distinct stages of the workflow – products that have been adopted in whole or in part by more than 300,000 active users.

With InstrumentStudio software providing an interactive multi-instrument experience, TestStand test management software handling overall execution and reporting, and SystemLink software managing assets and software deployments, this workflow improves the productivity of test and validation labs across many industries. Each piece of the workflow is also interoperable with third-party software to maximize code and IP reuse, and draws on the LabVIEW Tools Network ecosystem of add-ons and tools for more application-specific requirements.

Engineers can access both LabVIEW 2018 and LabVIEW NXG with a single purchase of LabVIEW.
