Year End Internet of Things Acquisition

I’ve taken some time during the holidays to get off the daily posting gerbil wheel and dig more deeply into the Industrial Internet of Things.

You may ask why. Every analyst firm now has an IoT practice. They do consulting of one sort or another. But many are constrained by their models. I’ve seen some of the analyses. I think I can contribute.

Last week, just before Christmas, PTC announced the acquisition of Kepware to deepen its Internet of Things offering. I’ll have a longish analysis to kick off the New Year on Monday.

The Internet of Things is a strategy, not a thing. It is described by an ecosystem, not a product.

A look at the 30-year history of the company reveals that it has grown by acquisition. First within its (then) core technology from CAD to modeling. Then into PLM. Then a Retail practice. Then it developed a services platform. None of these were core to my coverage, so PTC is not a company I’ve followed closely.

Then Jim Heppelmann, CEO, caught the Internet of Things virus. Meanwhile, the ThingWorx developers had a cool technology and were casting about for a focus. It fit perfectly within the Internet of Things, and an acquisition was consummated.

Following ThingWorx (2014) came Axeda, a company that had itself undergone more than one transformation. Then ColdLight helped complete the portfolio with its analytics engine.

I’ve consulted with a few companies and talked with others that wanted to jump into the Internet of Things by simply bolting on a product acquisition. They thought that by just adding sensors (the “things” of the IoT, right?) they would be an IoT company. Or maybe by buying a networking company.

No, PTC has the right idea. It remains to be seen whether it bought the best technologies and whether it can make them work together. I’ve seen companies fail at that point.

More later.

Meanwhile, I hope your 2015 was successful and that your 2016 will be one of personal growth and success.

High Performance Computing Advancements

Anyone who thinks PCs when the company name Dell comes up (“Dude, you’re getting a Dell!”) has missed the company’s growth over the past decade. I’ve written about its new foray into the Internet of Things with a product specifically targeted at manufacturing industries. Now the company has announced advances in its High Performance Computing platform.

High Performance Computing

These advances include innovative new systems designed to simplify mainstream adoption of HPC and data analytics in research, manufacturing and genomics. Dell also unveiled expansions to its HPC Innovation Lab and showcased next-generation technologies including the Intel Omni-Path Fabric.

HPC is becoming increasingly critical to how organizations of all sizes innovate and compete. Many organizations lack the in-house expertise to configure, build and deploy an HPC system without losing focus on their core science, engineering and analytic missions. As an example, according to the National Center for Manufacturing Sciences, 98 percent of all products will be designed digitally by 2020, yet 95 percent of the center’s 300,000 manufacturing companies have little or no HPC expertise.

“HPC is no longer a tool only for the most sophisticated researchers. We’re taking what we’ve learned from working with some of the most advanced, sophisticated universities and research institutions and customizing that for delivery to mainstream enterprises,” said Jim Ganthier, vice president and general manager, Engineered Solutions and Cloud, Dell. “As the leading provider of systems in this space, Dell continues to break down barriers and democratize HPC. We’re seeing customers in even more industry verticals embrace its power.”

Accelerating Mainstream Adoption

The Dell HPC System Portfolio, a family of HPC and data analytics solutions, combines the flexibility of custom systems with the simplicity, reliability and value of a preconfigured, factory-built system. The portfolio offers:

  • Simplified design, configuration, and ordering in a matter of hours instead of weeks;
  • Domain-specific designs, tuned by Dell engineers and domain experts for specific science, engineering and analytics workloads using flexible industry-standard building blocks; and
  • Systems fully tested and validated by Dell engineering, with a single point of hardware support and a wide range of additional service options.

New application-specific Dell HPC System Portfolio offerings include:

  • Dell HPC System for Genomic Data Analysis is designed to meet the needs of genomic research organizations, enabling cost-effective bioinformatics centers that deliver results and identify treatments in clinically relevant timeframes while maintaining compliance and protecting confidential data. The platform is a result of key learnings from Dell’s relationship with the Translational Genomics Research Institute (TGen) to help clinical researchers and doctors expand the reach and impact of the world’s first Food and Drug Administration-approved precision medicine trial for pediatric cancer. TGen has been able to improve outcomes for more patients by creating targeted treatments at least one week faster than previously possible.
  • Dell HPC System for Manufacturing is designed for customers running complex manufacturing design simulations using workstations, clusters or both. Applicable use cases include finite element analysis for structural work using ANSYS Mechanical, and computational fluid dynamics for predicting fluid behavior in designs using ANSYS Fluent or CD-adapco STAR-CCM+.
  • Dell HPC System for Research is designed as a foundation, or reference architecture, for baseline research systems and numerous applications involving complex scientific analysis. This standard cluster configuration can be used as a starting point from which Dell’s customers and systems engineers can quickly develop systems matched to the unique needs of a wide variety of research agendas.

Accelerating HPC Technology Innovation and Partnerships

Dell announced a new expansion of its Dell HPC Innovation Lab in cooperation with Intel specifically for support of its Intel Scalable System Framework. This multi-million dollar expansion to the Austin, Texas, facility includes additional domain expertise, infrastructure and technologists. The lab is designed to unlock the capabilities and commercialize the benefits of advanced processing, network and storage technologies as well as enable open standards across the industry.

Beyond becoming the first major original equipment manufacturer (OEM) to join the Intel Fabric Builders program, Dell is working closely with Intel to support its Intel Scalable System Framework, which includes Intel Omni-Path Fabric technology, next-generation Intel Xeon processors, the Intel Xeon Phi processor family, and the Intel Enterprise Edition for Lustre. Announcements include:

  • New Dell Networking H-Series switches and adapters for PowerEdge servers featuring the Intel Omni-Path Architecture. These provide a next-generation fabric technology designed for HPC deployments. The architecture includes advanced features such as traffic flow optimization, packet integrity protection and dynamic lane scaling, allowing finer-grained control at the fabric level to enable high resiliency, high performance and optimized traffic movement.
  • Dell and Intel support for the Linux Foundation’s OpenHPC community. The community aims to provide a common platform on which end users can collaborate and innovate to reduce the complexity of installing, configuring and maintaining a custom HPC software stack, easing the path to exascale.

“We’re excited to collaborate with Dell to bring advanced systems to market early next year using the Intel Scalable System Framework,” said Charles Wuischpard, vice president and general manager of HPC Platform Group at Intel. “Dell’s position as our largest and fastest-growing customer for Intel Enterprise Edition for Lustre, their work on Omni-Path Architecture and next-generation Intel Xeon Phi, and their initiatives to expand the Dell Innovation Lab demonstrate their commitment to rapidly expanding the ecosystem for HPC.”

Mellanox Partnership

Dell and Mellanox announced additional investment in Dell’s existing HPC Innovation Lab to provide an end-to-end EDR 100Gb/s InfiniBand supercomputer system. The system is designed to showcase extreme scalability by leveraging the offloading capabilities and advanced acceleration engines of the Mellanox interconnect, as well as to provide application-specific benchmarking and characterization for customers and partners.

“With this new investment, Dell’s HPC Innovation Lab will now enable new levels of applications efficiency and innovative research capabilities. Together we will help build the solutions of the future,” said Gilad Shainer, vice president of marketing, Mellanox Technologies.

Availability

  • The Dell HPC System for Genomic Data Analysis is available today.
  • The Dell HPC Systems for Manufacturing and Research will be available in early 2016.
  • The Dell Networking H-series switches, adapters and software based on the Intel Omni-Path Architecture will be available in the first half of 2016.

Manufacturers Turn to Advanced Information Management Solutions

PwC US does some interesting and relevant research for my areas of interest. Here are details of the latest.

After five years of anemic economic recovery, manufacturers continue to add inventory to their books much faster than GDP growth. To better manage inventory levels while still ensuring the right part is in the right place at the right time, manufacturers are increasingly relying on advanced information management solutions, according to a survey released by PwC US in collaboration with the Manufacturers Alliance for Productivity and Innovation (MAPI).

Further, inventory turns – which indicate whether the supply chain is getting more efficient at moving goods from suppliers to customers – have declined steadily since 2011. PwC and MAPI surveyed senior executives from 75 global manufacturers (with U.S. headquarters) to better understand this decline in inventory performance and polled respondents on the effectiveness and benefits of using advanced inventory data management strategies to reduce inventory.
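
The report leans heavily on this metric, so the standard definition is worth spelling out: inventory turns are conventionally computed as cost of goods sold over a period divided by the average inventory carried during that period, with a common companion figure converting turns into days of inventory:

```latex
\text{Inventory Turns} = \frac{\text{Cost of Goods Sold (annual)}}{\text{Average Inventory}}
\qquad
\text{Days of Inventory} = \frac{365}{\text{Inventory Turns}}
```

Declining turns, as reported here, therefore mean inventory is sitting on the books longer relative to sales.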

“Inventory is often considered by manufacturers to be the most valuable category of assets on their books; however, it can tie up large amounts of cash and diminish in value for a host of reasons,” said Stephen Pillsbury, principal in PwC’s U.S. industrial products practice. “As a result, it has become common practice for manufacturers to minimize inventory as much as possible without hurting customer service levels. While they continue to focus on managing inventory, they seem to have reached a point of diminishing returns and are now turning to advanced information management solutions to further reduce their inventory.”

Benefits of Effective Information Management

When it comes to enabling agility, responsiveness and operating flexibility, 37 percent of respondents reported that their core ERP system was either not very effective or ineffective. Conversely, respondents with effective ERP systems were quite bullish on the usefulness of their supply chain visibility (SCV) systems for replacing inventory and costs with actionable and timely data.

Interestingly, companies with ineffective ERP systems experienced average annual margin erosion of 3.5 percent, while those with effective systems in place experienced average annual margin growth of 2 percent. Companies with both effective ERP and SCV systems did even better, with margin growth of 2.4 percent. Put another way, we found a clear connection between strong margin performance and effective ERP implementations.

In regard to inventory turns, almost half of those surveyed said their supply chain system was effective or very effective, while one third said their system was not very effective or ineffective at replacing inventory and costs with actionable and timely data. When comparing the two groups, the companies with effective SCV systems outperformed the ineffective ones by 30 percent.

“Information management systems matter because they get the right information to the right place at the right time in order to improve effectiveness,” said Cam Mackey, SVP, Operations and Partnerships, MAPI. “To that end, many companies have invested a great deal of time and money implementing ERP and SCV systems. Together, these platforms can provide manufacturers with detailed information including orders, lead times, stock quantities and locations. Effectively integrating these information systems enables manufacturers to do a better job of synchronizing supplier deliveries with production schedules and customer orders, resulting in improved customer service and less overall inventory.”

Improving Supply Chain Management

In an effort to improve supply chain management, many companies are embracing SCV systems, which enable them to track and manage raw materials, work-in-process, and finished goods across the extended supply chain. When fully implemented, these systems provide extensive demand, planning, supply and inventory information throughout the supply chain, enabling users to optimally balance customer service levels with costs to serve. Of those surveyed, 70 percent reported having an SCV system in place.

Although effective systems drive better margin growth and higher turns, companies are still experiencing inventory growth and supply chain issues. When asked about the factors having the biggest impact on supply chain visibility, respondents most often cited uncertainty of supplier deliveries and unpredictable customer demand. While SCV systems are intended to link customer demand to production schedules and supplier orders and deliveries, respondents continue to cite problems with forecast accuracy.

Respondents also addressed their ability to maintain optimal inventory levels, listing lack of discipline in operating processes and practices, a high degree of product complexity or number of stock keeping units (SKUs), and poor forecasts from marketing/sales as having significant impact.

Many of the factors listed above are driven by management disciplines, not information systems. The management discipline that most strongly addresses these factors is integrated materials management (IMM), commonly enabled through sales, inventory, and operations planning. This discipline is focused on synchronizing sales forecasts with delivery commitments and material supplies and involves all of a company’s key functional stakeholders. According to Mackey, “While systems matter, the bottom line is that effective supply chain visibility all comes down to management discipline.”

“The single biggest driver of excess inventory and unreliable delivery performance is inadequate material management practices. While SCV systems can greatly enhance IMM, they cannot replace disciplined review and approval by critical management stakeholders. Making this task easier and more effective requires buy-in and coordination across key functions in the organization; even with the best technology in the world, it still comes back to management discipline,” Pillsbury said.

For more information, download the report here: Inventory Performance Today: Why is it Declining?

Machine Learning Algorithms for Big Data

Big Data comes with all that data transported by the Internet of Things. But Big Data has little value unless you can tap into it for the information you need for the decision you must make in the next hour or two.

Anodot recently contacted me about a new analytics solution it has developed. For most of us, it is sufficient to know that such a solution exists. Others may want to dig into what it’s up to.

The company just exited stealth mode, introducing its real-time anomaly detection solution, which it maintains will disrupt the static nature of today’s Business Intelligence (BI) with patented machine learning algorithms for big data. By pinpointing performance issues and business opportunities in real time, Anodot says it enables customers to increase operational efficiency and maximize revenue generation.

The company also announced it closed a $3 million Series A funding round led by Disruptive Partners, bringing total funding in the company to $4.5 million. The company will use the funding to accelerate its product roadmap and expand its sales activity, focusing on the ad tech, e-commerce, IoT and manufacturing industries in the U.S. and EMEA.

Founded in June 2014, Anodot claims to offer the only analytics and anomaly detection solution that is data agnostic and automates the discovery of outliers in all business and operational data. Its platform isolates issues and correlates them across multiple parameters to surface and alert on incidents in real time.
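
Anodot’s algorithms are patented and not public, so the snippet below is only a minimal sketch of the general technique class the company is describing: score each new sample of a metric against a rolling baseline of its own recent history, and alert when the deviation is extreme. Every name, window size and threshold here is a hypothetical illustration, not Anodot’s method:

```python
import math
from collections import deque

class RollingAnomalyDetector:
    """Minimal rolling z-score detector (illustrative only, not Anodot's method)."""

    def __init__(self, window: int = 120, threshold: float = 4.0, min_baseline: int = 10):
        self.values = deque(maxlen=window)  # sliding window of recent samples
        self.threshold = threshold          # z-score beyond which we alert
        self.min_baseline = min_baseline    # samples required before scoring

    def update(self, value: float) -> bool:
        """Ingest one sample; return True if it deviates sharply from recent history."""
        anomalous = False
        if len(self.values) >= self.min_baseline:
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var) or 1e-9    # guard against flat (zero-variance) metrics
            anomalous = abs(value - mean) / std > self.threshold
        self.values.append(value)
        return anomalous

# Hypothetical usage: watch an orders-per-minute metric as samples stream in.
detector = RollingAnomalyDetector()
for sample in [100, 103, 98, 101, 97, 102, 99, 100, 101, 98, 99, 12]:
    if detector.update(sample):
        print(f"anomaly detected: {sample}")  # fires on the sudden drop to 12
```

A production system would also have to handle seasonality, trends and correlation across metrics, which is presumably where the patented machine learning comes in.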

Data analysis lag problem

“I experienced the data analysis lag problem first hand as CTO of Gett,” said Anodot CEO David Drai. “Gett is a mobile taxi app; SMS text orders were dropped by the carrier, and it could take up to three days to spot critical issues and fix them, costing tens of thousands of dollars per incident. That’s where I got the idea for Anodot—to employ the latest advances in machine learning to detect performance problems automatically and in real time, eliminating the latency.”

Data-centric organizations share a common problem—they collect mountains of data, but deriving business value comes long after the actual event and requires data modeling experts using homegrown custom tools in a static and time-consuming process. The resulting delays in getting business insights can cost companies millions of dollars in lost revenue or production.

“There is a huge opportunity to disrupt the BI market by enabling automated and real-time insights into big data pools of metrics and KPIs,” said Tal Barnoach, Anodot board member and general partner at The Disruptive Fund, a privately held Tel Aviv/New York-based venture fund. “We have been with Anodot from the beginning. The team and the technology are terrific, we are impressed with the progress they have made to date and are excited to be participating in this next stage as they scale upward.”

Anodot is led by a proven team of three co-founders with strong credentials as entrepreneurs and technologists with deep experience in data science and global-scale SaaS infrastructures. CEO David Drai was co-founder and CTO of Cotendo for four years until it was acquired by Akamai for $300 million.

Chief Data Scientist Dr. Ira Cohen held the same position at HP Software where he led research and development in machine learning and data mining techniques. R&D VP Shay Lang has led engineering teams for more than 10 years at leading technology companies. On the board of directors, the team also includes Anthony Bettencourt, president and CEO at Imperva and a board member at Proofpoint, and Ben Lorica, O’Reilly Media’s chief data scientist and a top influencer on Twitter, as a board advisor.

Anodot is already being used in production by dozens of organizations, including Avantis (not the Avantis that is part of Schneider Electric Software), which develops advanced desktop tools and monetization platforms, and Wix, a leading cloud-based Web development platform with millions of users worldwide.

Important issues missed

Wix said that before using Anodot, vast amounts of metrics and KPIs were measured and analyzed manually by data analysts; despite the time spent, important issues sometimes took days to identify.

“With Anodot we are able to detect changes very early and make decisions that have a direct impact on our business,” said Mark Sonis, monitoring team leader at Wix. “We are able to investigate issues in minutes, not hours, and it does its magic every day with little effort on our side. Anodot has become an essential solution across many of our teams including BI, R&D and DevOps.”

According to Doron Ben-David, CTO, VP R&D, Avantis, “Our vision is to enable publishers to focus on creating content, applications and media, while we provide the revenue mechanism. Fulfilling that promise means our core business revolves around numbers—the impressions, downloads, ad shares and other metrics that feed revenues and bottom lines. Anodot gives us the BI tool to track all of the metrics and KPIs that are essential to our success in real time and alerts us to issues as they happen so we can immediately respond before it impacts our customers.”

According to IDC, the use of solutions with advanced and predictive analytics, including machine learning, such as Anodot’s, will grow 65 percent faster than the use of those without predictive functionality.

Unique features and advantages of Anodot Anomaly Detection include:

  • Operates in real time
  • Works with any type of metric or KPI and scales to any big data volume
  • Uses proprietary patented machine learning algorithms
  • Correlates different metrics to help identify root causes of problems and eliminate alert storms (a rough sketch of this grouping idea follows this list)
  • Simulation capability optimizes alert planning and reduces false positive alerts
  • Eliminates the need for time-intensive manual analysis
  • Enables non-specialists to gain the insights they want and delivers fast time-to-value
  • Provides clear visualizations that help any user to understand what the data is showing them
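
On the correlation bullet above: one naive way to collapse an alert storm is to cluster per-metric alerts that fire close together in time into a single incident. The sketch below is my own illustration of that general idea, with assumed metric names and an assumed one-minute window; it is not a description of Anodot’s implementation:

```python
def group_alerts_into_incidents(alerts, window_seconds=60):
    """Cluster (timestamp_seconds, metric_name) alerts that fire close together.

    An alert within `window_seconds` of the previous alert joins the same
    incident, so one underlying failure yields one notification instead of
    an alert storm. A naive illustration, not Anodot's method.
    """
    incidents = []
    for ts, metric in sorted(alerts):
        if incidents and ts - incidents[-1]["last_ts"] <= window_seconds:
            incidents[-1]["metrics"].add(metric)   # extend the open incident
            incidents[-1]["last_ts"] = ts
        else:
            incidents.append({"start_ts": ts, "last_ts": ts, "metrics": {metric}})
    return incidents

# Hypothetical usage: three metrics misbehaving together become one incident.
alerts = [(1000, "checkout_errors"), (1012, "payment_latency"),
          (1030, "order_rate"), (5000, "cpu_load")]
for incident in group_alerts_into_incidents(alerts):
    print(incident["start_ts"], sorted(incident["metrics"]))
# -> 1000 ['checkout_errors', 'order_rate', 'payment_latency']
# -> 5000 ['cpu_load']
```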

Aveva Calls Off Schneider Electric Software Reverse Merger

A news item from the Department of the Blindingly Obvious: Aveva’s board has called off discussions of the “reverse merger” with Schneider Electric Software. It cited two reasons: the overly complex structure of the deal and software integration issues. Yep, it is certainly correct on both counts.

You may recall that Schneider would give Aveva its software businesses (Wonderware, Avantis, SimSci, InduSoft, and Citect, I presume) and a huge chunk of cash in return for a 53% stake in the “New Aveva.”

I have watched many software mergers over the past 10 years. None ever achieved technology integration. They were all organizational and technical nightmares.

Then think about all the Wonderware technology embedded in Foxboro products. I’d get a headache even trying to sort through the legal and organizational problems.

A good software company

Wonderware (perhaps with other pieces of the software portfolio) would make a marvelous software company with adequate investment behind it. I still think that Schneider Electric will find a way to divest it. Running software is qualitatively different from running an electrical components and process automation business. Right now Schneider is still investing in the business, and that is a good sign.
