QAD CEO Speaks To Recent Acquisitions

While I was researching the QAD acquisition of RedZone, I noticed the prominence of something called “Adaptive Solutions.” When I mentioned I was curious about what that meant beyond the marketing language, the PR team went to work and set up a conference call with QAD CEO Anton Chilton.

He told me, “The pace of change facing manufacturers has required a real-time response to situations. Industry models are changing. For instance, look at the automotive industry transitioning to electric vehicles. So they need solutions to adapt to rapid change.”

This explanation comes from the company’s website under the manufacturing tab: “Digital manufacturing fully integrates planning, scheduling, quality, cost management, material movement and shop floor control. The solution allows manufacturers to leverage advanced digital manufacturing technologies to better communicate, analyze and use information to meet cost and quality objectives. Build a strong foundation for lean manufacturing concepts that eliminates waste throughout your operations. QAD’s manufacturing ERP capabilities also adapt to any style of manufacturing and to the unique needs of a geographic location and industry.”

I mentioned that my experience and coverage usually ended with the MES layer. I have only a little ERP experience. Chilton said, “Some people see ERP as something like concrete poured in the form and left to harden. Enter a platform emphasizing no-code or low-code where users can build new capabilities on it without intrusive customization.” That sounds like a step in the right direction.

We spoke of the meaning of the RedZone acquisition. “We speak of the foundation of people, process, systems,” he said. “We do process and systems well with our current portfolio. With the RedZone acquisition, we can better address the people part of the equation. RedZone is a pure SaaS play providing real-time information to front-line workers. It’s in the hands of workers on a tablet configured to each person’s role. The secret sauce includes locking in best practices such as kaizen right in the system on the tablet. The system encourages the team to work collaboratively.”

I’m always curious about integrating the new acquisition into the existing structure. “RedZone can take in information directly from QAD. It offers deeper interpretation through additional modules, such as enterprise quality management and others.”

How good is this application? Chilton: “On average, RedZone users have seen a 42% productivity increase at medium-sized companies and 20% at large enterprises. It scales because it’s implemented at the plant level. The improvements are typically seen within the first 90 days. It’s in 1,000 locations with 300,000 users.”

Only a few months earlier, QAD had acquired LiveJourney. Its product is a data mining and predictive modeling application. It offers analysis of real-time data on the fly. It compares patterns from the actual to the as-designed. Managers and workers can use the results to find constraints or other problems and attack them as part of their Lean continuous improvement.
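LiveJourney’s actual algorithms aren’t public, so purely as an illustration of the idea, here is one naive way to flag where an observed process run departs from its as-designed sequence. The function name and the step lists are my own inventions, not anything from the product:

```python
# Naive illustration of comparing an actual process run to the as-designed
# sequence; real process-mining tools use far more sophisticated methods.
def deviations(designed, actual):
    """Return (position, designed_step, actual_step) wherever the two
    sequences differ; None marks a missing or extra step at the tail."""
    diffs = []
    longest = max(len(designed), len(actual))
    for i in range(longest):
        d = designed[i] if i < len(designed) else None
        a = actual[i] if i < len(actual) else None
        if d != a:
            diffs.append((i, d, a))
    return diffs

# A run that skipped the 'weld' step:
print(deviations(["cut", "weld", "paint"], ["cut", "paint"]))
# [(1, 'weld', 'paint'), (2, 'paint', None)]
```

Even this toy version shows the payoff: the mismatches point straight at the step where the real process diverged, which is exactly where a continuous-improvement team would start digging.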

New IoT Services

Two announcements from HiveMQ.

HiveMQ Accelerates IoT Data Ingestion into Google Cloud 

Aligning your IT infrastructure with a major supplier can be not only expensive but can also leave you vulnerable to corporate decisions made far away. For example, Google recently announced plans to retire Google IoT Core, leaving customers with less than a year to migrate their IoT applications to a new service.

The latest Rework podcast from 37 Signals features co-founder David Heinemeier Hansson (@DHH) and Operations Director Eron Nicholson discussing leaving the cloud. They address this issue and others.

If you are caught by this shift in Google strategy, one possible solution was just announced. HiveMQ has developed an MQTT broker that bypasses Google IoT Core and sends billions of messages per day directly to Google Cloud for advanced analytics.

The HiveMQ Enterprise Extension for Google Cloud Pub/Sub is a new feature that seamlessly integrates MQTT data into Google Cloud. Organizations can now benefit from HiveMQ’s flexible, standards-based platform to send IoT data reliably and securely to Google Cloud enterprise services such as monitoring, advanced analytics and machine learning.

HiveMQ can replace IoT Core’s MQTT data ingestion service by connecting MQTT clients to HiveMQ’s MQTT broker and then mapping the MQTT messages into Google Pub/Sub.
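HiveMQ hasn’t published the mapping internals here, but the core problem any such bridge must solve is translating MQTT’s slash-delimited topic paths into valid Pub/Sub topic names (which cannot contain slashes). A minimal sketch of that translation, with a function name and naming scheme of my own choosing, not HiveMQ’s actual configuration API:

```python
# Illustrative sketch only (not HiveMQ's actual API): mapping an MQTT
# topic path onto a fully-qualified Google Pub/Sub topic name.
def map_mqtt_to_pubsub(mqtt_topic: str, project: str) -> str:
    """Map an MQTT topic like 'plant1/line3/temperature' to a Pub/Sub
    topic name; Pub/Sub topic IDs cannot contain '/', so flatten with '.'."""
    flat = mqtt_topic.strip("/").replace("/", ".")
    return f"projects/{project}/topics/{flat}"

print(map_mqtt_to_pubsub("plant1/line3/temperature", "acme-iot"))
# projects/acme-iot/topics/plant1.line3.temperature
```

In a real deployment the extension would also handle wildcard subscriptions and message attributes, but the topic-name translation above is the piece that makes the two systems addressable from each other.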

HiveMQ Enables Real-Time IoT Observability from Device to Cloud 

New feature traces MQTT data in real-time to give users better visibility into their IoT applications.

HiveMQ, a global leader in enterprise MQTT solutions, announced the availability of the HiveMQ Distributed Tracing Extension, a new feature that makes it possible to trace and debug MQTT data streams from device to cloud and back. Complete IoT observability requires insight into three pillars: metrics, traces and logs. HiveMQ has added distributed tracing to help organizations achieve end-to-end observability and make their IoT applications more performant and resilient. 

Distributed Tracing is a way to trace events and achieve a high-level overview of a message’s journey through multiple, complex systems. With the Distributed Tracing Extension, HiveMQ is the first MQTT broker to add OpenTelemetry support to provide complete transparency for every publish message that uses the HiveMQ MQTT broker. OpenTelemetry is an open standard for instrumentation that allows for interoperability across all services so organizations can achieve visibility over their entire system.

Lenovo Celebrates 30th Anniversary of ThinkSystem Innovation with Broadest Portfolio Enhancement in its History

My knowledge of Lenovo stopped not long after its acquisition of the ThinkPad line of laptops and other personal computing devices many years ago. I accepted an invitation to a press event recently. Good thing. I’ve been attending IT company conferences for a few years. Turns out that Lenovo is a strong competitor in this market. They amazed me with the depth and breadth of their product line.

To celebrate the 30th anniversary, the company announced many new products. You can find several summarized here; check the website for more complete information.

  • Lenovo Infrastructure Solutions V3 delivers advanced ThinkSystem, ThinkAgile, and ThinkEdge servers and storage with next-generation AMD, Intel and Arm-based processors, NVIDIA AI Enterprise software and enhanced Lenovo ThinkShield security
  • New Lenovo XClarity One, a cloud-based, unified software management platform, provides an industry-first integration of TruScale Infrastructure-as-a-Service, Management-as-a-Service and Smarter Support analytics to simplify orchestration, automation and metering from edge to cloud 
  • Next-generation Lenovo Neptune warm water cooling and CO2 Offset Services help customers achieve their sustainability goals
  • Supported by next-generation AMD EPYC, Intel Xeon Scalable and Arm-based processors, as well as AMD Instinct and NVIDIA GPUs and NVIDIA AI Enterprise software 
  • Lenovo’s next-generation of ThinkAgile V3 hyperconverged infrastructure solutions are pre-integrated with an open ecosystem of partners, including Microsoft, Nutanix and VMware software capabilities
  • Three new Lenovo Microsoft Azure Solutions: SQL for AI and Machine Learning (ML) Insights, Backup and Recovery and Azure Virtual Desktop. 
  • The new Lenovo Open Cloud Automation (LOC-a) version 2.5 securely authenticates and activates leading ThinkEdge AI servers on site via a phone, accelerating business insights with a fully operational edge system within minutes or hours. 
  • Lenovo’s Modular Root of Trust helps protect, detect and recover from cyberattacks and digital compromises with bolstered tamper-detection and monitoring embedded into the chip design
  • Lenovo System Guard ensures heightened security between manufacturing, delivery and deployment with advanced hardware monitoring

Return of the Large Trade Show

IMTS / Hannover Messe invaded Chicago this week. I drove down for a couple of days. It was huge. Booths populated all four halls. I did not see everything. Or even half.

Hannover Messe has co-located with IMTS in Chicago for the past three or four events. As in the past, the automation / Hannover Messe part encompassed a few aisles in the East hall.

I’ll have more news items in the next post.

Best of what I saw:

Nokia. What?! I was approached for an appointment. I said yes figuring on a 5G private network discussion. I was partly right.

Let me back up for context.

  • Enterprises crave data to feed their information systems.
  • Data from industrial / manufacturing operations were bottled in isolated, siloed systems
  • Networking became robust
  • Interoperable protocols grew
  • The Internet of Things (IoT) became a thing
  • Suddenly data could go where and when needed

Solutions.

  • Automation vendors claimed connectivity to enterprise but that fell short
  • IT suppliers, supporters of the enterprise, tried to enter the market with gateways, networking, partnerships and ecosystems to get the data.
  • They couldn’t find the formula to sell to manufacturing (known as OT)
  • We have gateways, databases, networking, but still no enterprise solution

Nokia.

  • Builds off networking technology which has progressed to 5G Private Networks
  • Has added edge compute devices
  • Partnership with PTC (Kepware / Thingworx) for software connectivity
  • Attacking this open market from a new perspective–both the enterprise IT side and the operations OT side

I am not predicting success. I never do. What I love about trade shows is finding this nugget of original thinking cloaked in the mundane. They have the foundation. Can they sell?

Check out this page on the Nokia site.

Identify Data Sets With Orphaned Data Enabling Appropriate Actions

Steve Leeper, VP Systems Engineering and Marketing, and Carl D’Halluin, Chief Technology Officer of Datadobi, met with me recently to discuss data, unstructured data management, data migration, the company, and those poor little orphaned files and data left in storage long after they had outlived their usefulness.

I’ve been writing about the company’s products for about two years. Datadobi is about to celebrate its 13th birthday. Organically funded, management expects it to be around for some time to come.

The occasion of this conversation was release 6.2 and an upgrade to the recently announced StorageMAP product—a multi-vendor, multi-cloud data management platform with the introduction of capabilities to discover and remediate orphaned data.

Orphaned data is data owned by employees who are no longer active but whose accounts are still enabled on an organization’s network or systems. These datasets, which can amount to a significant part of a company’s total stored data, represent risk and cost to every organization. Any enterprise with a large employee base, normal-to-high attrition rates, or merger and acquisition (M&A) activity is vulnerable to orphaned data. Orphaned data is mostly of no value to the organization and creates liability due to its unknown content. Eliminating it enables IT leaders to reduce cost, lower carbon footprint, and lower risk while maximizing the value of relevant unstructured data used by active users.

With the new capabilities, StorageMAP can identify data sets with high amounts of orphaned data, allowing IT leaders to group and organize the data independent of the business organization. The orphaned datasets can then be migrated, archived to a lower-cost platform, or deleted by StorageMAP. The platform provides an easy-to-read, granular report on the top users with orphaned data and for additional detail, a search function allows targeted searches to be executed across part of or the entire data estate.
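StorageMAP’s implementation is proprietary, but the kernel of the idea is easy to sketch: walk the storage estate and flag any file whose owning account is not in the set of active users. A minimal Unix-only sketch of my own, not Datadobi’s code:

```python
# Minimal sketch (my own, not StorageMAP's implementation): flag files
# whose owning account is not in the set of active users.
import os
import pwd

def orphaned_files(root, active_users):
    """Yield (path, owner) for files under `root` owned by inactive
    accounts; owner is None when the UID is gone from the passwd db."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                owner = pwd.getpwuid(os.stat(path).st_uid).pw_name
            except KeyError:  # UID no longer maps to any account
                owner = None
            if owner not in active_users:
                yield path, owner
```

A real product layers onto this the reporting, grouping, and remediation (migrate, archive, delete) described above, and handles multi-vendor and cloud back ends rather than a local file tree, but the ownership check is the heart of orphan detection.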

The 6.2 release comes after Datadobi launched its StorageMAP platform earlier this year. StorageMAP provides a single pane of glass for organizations to manage unstructured data across their complete data storage estate on-premises and in the cloud. This latest update enables IT leaders to fully take control of orphaned data on their networks.

“Due to the scale and complexity of unstructured data in today’s heterogeneous storage environments, enterprises can easily lose track of orphaned data within networks and open themselves up to excess storage costs and risk,” said Carl D’Halluin, CTO, Datadobi. “StorageMAP’s new capabilities allow for a seamless experience identifying and taking the appropriate action on orphaned data to give IT leaders peace of mind.”

“As data proliferation continues, IT leaders are going to see more orphaned data on their networks. This is why it is so important that organizations turn to unstructured data management solutions like StorageMAP to find datasets associated with inactive users and take the appropriate action,” said Craig Ledo, IT Validation Analyst at Enterprise Strategy Group (ESG). “Combined with a high level of cross-vendor knowledge, years of real-life experience, and great customer support, enterprises can let StorageMAP do the heavy lifting when it comes to orphaned data.”

Fix Your Data Flow and Transform Your Business

Back in the 70s I actually had a job in a manufacturing company whose principal function concerned data: all the engineering, product, and cost data. Little did I know then that 45 years later data would be the new oil. (Don’t really believe all the hype.)

I’ve been interested in DataOps, a new function and class of applications for data management. I’ve been part of the HPE VIP Community for a few years. HPE isn’t talking as much about manufacturing lately, but many of its technologies are germane. I picked up this short post from the Community site on data management.

Data management complexity is everywhere in the modern enterprise, and it can feel insurmountable. IT infrastructure has evolved into a complex web of fragmented software and hardware in many organizations, with disparate management tools, complicated procurement processes, inflexible provisioning cycles, and siloed data. Organizations are running multiple storage platforms, each with their own management tools, which becomes a progressively tougher problem at scale.

Complexity is a growing problem, and according to a recent ESG study, 74% of IT decision makers acknowledge that it is holding them back in their digital transformation journey. Their data management capabilities simply can’t keep pace with business demands. As a result, IT organizations are forced to spend time managing the infrastructure, as opposed to really leveraging the data.

Constant firefighting might sound like business as usual to many tech pros, but it can be avoided. To accelerate transformation, IT leaders must confront and eliminate data complexity first, getting data to flow where and when it needs to, and allowing the business to get back to business.

Of course, HPE has a solution—GreenLake for Block Storage.
