First Large-Scale Enterprise IoT Test Platform Launched

Many people remain confused about just what an Internet of Things (IoT) project consists of. Various analysts predict astronomical numbers for potential connected things. We know that a large industrial facility could have thousands, or indeed hundreds of thousands, of sensors and other data points. Testing the performance and resilience of such a network can be both daunting and critical. HiveMQ claims a solution with a platform built upon MQTT.

HiveMQ Swarm Complements HiveMQ MQTT Platform to Deliver a Complete Solution for Enterprises Deploying Large-Scale IoT Solutions

HiveMQ announced HiveMQ Swarm, the industry’s first solution that enables organizations of all sizes to reliably simulate and test large-scale IoT networks. HiveMQ Swarm lets enterprises easily test the scalability and performance of their IoT deployments, significantly increasing the quality and reliability of their systems. It is also the first solution that lets global enterprises forecast capacity, infrastructure, and financial costs before putting an IoT system into production.

“As IoT solutions continue to grow in both scope and volume, the ability to test IoT solutions for scale and performance becomes mission-critical,” said Dominik Obermaier, CTO and founder of HiveMQ. “We have extensive experience with some of the largest IoT systems in the world. These customers have asked us to help them validate their systems before going into production. Swarm meets these needs by enabling our enterprise customers to ensure their large-scale IoT systems perform to expectation the first time they’re deployed.”

IoT systems are incredibly difficult to test prior to production. Emulating behavior in a production environment is often unreliable and individual IoT devices can demonstrate multiple complex behavior patterns. For example, autonomous vehicles at rest behave very differently than those navigating the unexpected events they encounter in the real world, be it a highway or a factory floor.

Despite these challenges, load and stress testing is an unavoidable reality: fixing IoT production errors in the field can be incredibly expensive, and these errors can have potentially catastrophic results for the system itself. As a result, determining system resilience is a mission-critical endeavor.

HiveMQ Swarm was designed specifically to solve these challenges. Swarm is a distributed platform able to create hundreds of millions of unique network connections that simulate devices, messages, and MQTT topics (a form of addressing that allows MQTT clients to share information), as well as develop reusable scenarios that emulate device behaviors. In addition to a custom data generator to create complex use cases for testing, HiveMQ Swarm is designed to seamlessly integrate with enterprise cloud infrastructure, including public clouds (e.g., AWS, Azure, GCP) and Kubernetes-based systems.
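
The press release names the capabilities but not the mechanics. As a rough illustration of what “simulating devices, messages, and MQTT topics” involves, the sketch below spins up a handful of MQTT clients with the open-source Eclipse Paho library and has each publish telemetry to its own topic. This is not HiveMQ Swarm’s scenario format or API; the broker address, topic layout, and payloads are placeholder assumptions, and it assumes paho-mqtt 1.x.

```python
# Minimal sketch: simulate a small fleet of MQTT "devices" publishing telemetry.
# Assumes the Eclipse Paho client (pip install "paho-mqtt<2") and a reachable
# broker; hostname, topics, and payloads are placeholders, not Swarm's format.
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"   # placeholder broker address
DEVICE_COUNT = 100                   # Swarm itself scales to millions

clients = []
for i in range(DEVICE_COUNT):
    c = mqtt.Client(client_id=f"sim-device-{i:05d}")
    c.connect(BROKER_HOST, 1883)
    c.loop_start()                   # network loop runs in a background thread
    clients.append(c)

# Each simulated device publishes a reading to its own topic.
for i, c in enumerate(clients):
    payload = json.dumps({"device": i, "temperature": round(random.uniform(20, 80), 2)})
    c.publish(f"plant/line-1/device/{i:05d}/telemetry", payload, qos=1)

time.sleep(2)                        # allow in-flight QoS 1 messages to complete
for c in clients:
    c.loop_stop()
    c.disconnect()
```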

HiveMQ Swarm is complementary to the HiveMQ MQTT platform, an MQTT broker messaging platform designed for the fast, efficient and reliable movement of data to and from connected IoT devices. It uses the MQTT protocol for instant, bi-directional push of data between devices and enterprise systems.
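
For the receiving side of that bi-directional push, a minimal subscriber sketch, again using paho-mqtt and the same placeholder broker and topic tree, would look roughly like this; a real enterprise system would route the messages into its own pipelines rather than print them.

```python
# Minimal sketch of an enterprise consumer subscribing to the placeholder
# topic tree used above. Assumes paho-mqtt 1.x; not HiveMQ's own code.
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # In a real system this would feed a database or stream processor.
    print(f"{msg.topic}: {msg.payload.decode()}")

subscriber = mqtt.Client(client_id="enterprise-consumer")
subscriber.on_message = on_message
subscriber.connect("broker.example.com", 1883)
subscriber.subscribe("plant/line-1/device/+/telemetry", qos=1)
subscriber.loop_forever()
```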

WhiteHat Security Releases AppSec Stats Flash Volume 3

Cybersecurity news is always relevant, especially in our hyper-connected time. This item concerns information leakage.

Findings reveal more than 40 percent of applications actively leaking information and at risk of exposing sensitive data

WhiteHat Security, a wholly owned, independent subsidiary of NTT Ltd. and a world leader in application security, released AppSec Stats Flash Volume 3, the latest installment of the company’s monthly report and podcast reflecting on the current state of application security and the wider cyber threat landscape.

In AppSec Stats Flash Volume 3, WhiteHat Security’s Setu Kulkarni, vice president of corporate strategy and business development, and Zach Jones, senior director of detection research, are joined by Dino Boukouris, founder and managing director at Momentum Cyber, to primarily discuss how information leakage can expose vulnerabilities in connected applications across business-to-business partnerships, as well as analyze the latest application security data found in this month’s report.

“In any partnership or merger and acquisition activity, organizations reach a stage where they need to integrate applications to sync data, enhance productivity and grow revenue. While application integration issues have been simplified, there is still no way to predict how their security posture will be affected by the complex orchestrations that form a digital supply chain,” said Kulkarni. “When two companies decide to integrate their applications, they should explicitly account for the risks that both companies will inherit, particularly concerning sensitive user and infrastructure data.”

Key findings from AppSec Stats Flash Volume 3 include:

  • More than 40 percent of applications are actively leaking information and are at risk of exposing sensitive data (a minimal illustration of this class of flaw follows this list). “When we talk about information leakage, we often do not realize the vast amount of sensitive or partially sensitive information that the applications we interact with are collecting,” said Jones.
  • Exposure of A3-Sensitive Data, one of the leading vulnerabilities reported within information leakage, can result in a supply chain-type attack across connected applications. “Too often, by the time a formal security assessment takes place in an acquisition, application security is viewed as a ‘check-the-box’ diligence item as opposed to a key value driver,” said Boukouris.
  • Applications in the manufacturing sector continue to report the highest Window of Exposure, with 70 percent of applications having at least one serious vulnerability open over the previous 12 months. “Window of Exposure is a major concern as applications remain increasingly vulnerable across all industries, particularly manufacturing and finance. To improve these metrics, security and DevOps teams must take a holistic approach to identifying, prioritizing, and remediating these vulnerabilities in a manner that configures all changes with the development controls in process,” said Kulkarni.
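
For readers unfamiliar with the vulnerability class, here is a minimal, generic illustration of information leakage, not an example taken from the WhiteHat report: an error handler that echoes internal details to the caller versus one that keeps them in server-side logs. The framework (Flask) and handler names are my own choices for the sketch.

```python
# Illustrative only (not from the WhiteHat report): a very common form of
# information leakage is returning internal details in error responses.
import traceback
from flask import Flask, jsonify

app = Flask(__name__)

def leaky_response(exc):
    # Anti-pattern: exposes stack traces, file paths, and framework versions.
    return jsonify({"error": str(exc), "trace": traceback.format_exc()}), 500

@app.errorhandler(Exception)
def safe_response(exc):
    # Safer: keep the detail in server-side logs, return a generic message.
    app.logger.exception("unhandled error")
    return jsonify({"error": "internal server error"}), 500
```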

Those interested in learning more about the findings and analysis in AppSec Stats Flash Volume 3 can now download the report and stream the latest podcast episode on WhiteHat Security’s website and popular platforms including Apple Podcasts, Spotify, Stitcher, Amazon, and more.

About WhiteHat Security

WhiteHat Security, a wholly-owned, independent subsidiary of NTT Ltd., is the leading advisor for application security with the most comprehensive platform powered by artificial and human intelligence. Trusted for nearly two decades by Fortune 500 organizations, WhiteHat Security helps organizations accelerate their digital future in our application-driven world. WhiteHat Security is based in San Jose, California, with regional offices across the U.S. and Europe.

Seeq Announces Enterprise and Team Editions for Its Analytics Solutions

Seeq has developed some pretty cool analytics solutions. This announcement seems to be just some repackaging and rebranding of components, but it serves as a refresher for its suite of products.

New editions address end-user deployment requirements from individual plants and facilities to multi-plant enterprise and cloud deployments.

Seeq Corporation, a leader in manufacturing and industrial internet of things (IIoT) advanced analytics software, announces a new packaging of Seeq features and applications as Seeq Team and Seeq Enterprise editions. These editions address the needs of customers from a local water utility to a multi-national chemical, pharmaceutical, or oil & gas company.

Both Seeq editions, which run best as SaaS on AWS or Microsoft Azure, represent the culmination of learning and experiences with hundreds of Seeq deployments in process manufacturing organizations. For these manufacturers, Seeq enables advanced analytics insights to improve production and business outcomes across their organizations.

Seeq Cortex, a renaming of Seeq Server, is included in both editions and is the execution engine that delivers key capabilities, including multi-source, multi-type data connectivity, security, and calculation scalability. Seeq Cortex ensures immediate and long-term support for customer data architectures and IT requirements.

“Seeq Cortex enables immediate access to analytics innovation with existing data architectures and silos, while also supporting customer data roadmaps and strategies on the cloud,” says Steve Sliwa, CEO and Co-Founder of Seeq Corporation. “Cortex is the backbone of the predictive, diagnostic, machine learning, and descriptive analytics used by customers around the globe.”

Cortex benefits include:

  • Abstraction of data sources, with high-speed connectivity to multiple and diverse time-series and contextual data sources, including historians and SQL-based data sources.
  • Calculation speed, with a highly parallelized, time-series-specific engine that enables fast execution of analytics including data interpolation, filtering, cleansing, and modeling (a generic sketch of these operations follows this list).
  • Data security and user access control, through integration with OSIsoft PI for tag-level access control, or organizations may implement their own governance policies.
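
Seeq Cortex itself is proprietary, but the time-series operations named above, interpolation, filtering, and cleansing, can be sketched generically with pandas. The sensor values, timestamps, and thresholds below are invented for illustration.

```python
# Generic sketch (not Seeq Cortex's API) of the operations named above:
# cleansing, interpolation onto a regular grid, and smoothing of sensor data.
import pandas as pd

# Irregularly sampled sensor data, as a historian might return it.
idx = pd.to_datetime(["2021-03-01 00:00:03", "2021-03-01 00:00:17",
                      "2021-03-01 00:00:41", "2021-03-01 00:01:05"])
raw = pd.Series([72.1, 72.4, 950.0, 72.6], index=idx, name="temperature")

clean = (
    raw[raw.between(0, 150)]               # cleanse: drop physically impossible spikes
       .resample("5s").mean()              # put samples on a regular 5-second grid
       .interpolate(method="time")         # interpolate gaps based on timestamps
       .rolling(3, min_periods=1).mean()   # simple smoothing filter
)
print(clean)
```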

Seeq Team is optimized for new deployments in a single-site facility such as a water utility or power generation plant with a limited number of time series and contextual data sources, or for the first usage of Seeq by a workgroup within a larger organization. Quick and easy deployment of Seeq and the resulting ROI will demonstrate the benefit of leveraging existing data and expertise to improve production outcomes.

Seeq Enterprise is designed for complex single-plant, multi-site, or enterprise deployments with hundreds or thousands of users. It includes support for more complex data sources such as data lakes and ERP systems, along with features for integrating OT and IT data science teams driving digital transformation initiatives. Features specific to Seeq Enterprise include:

  • Seeq Data Lab is built on Jupyter Notebooks to enable easy Python access to Seeq functionality and to the vast library of open-source modules and algorithms (see the sketch after this list).
  • Unlimited Seeq users: Seeq Enterprise does not limit the number of users.
  • More and complex data sources: Seeq Enterprise supports up to 10 connections, including data sources such as data lakes, ERP systems, and non-SQL databases.
  • Audit trail support: Seeq Enterprise includes features and administration tools for customers in regulated industries, including support for CFR Part 11.
  • Data visualization tools: Seeq Enterprise supports data integration with BI and process applications such as Tableau, Power BI, Spotfire, and OSIsoft PI Vision.
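
Since Seeq Data Lab exposes Seeq data to Python through Jupyter, a typical workflow is to search for signals and pull them into a DataFrame. The snippet below is a rough sketch of that pattern using the SPy module; treat the exact function names, arguments, and signal names as assumptions and defer to Seeq’s documentation for the authoritative API.

```python
# Rough sketch of a Seeq Data Lab workflow using the SPy module.
# Function names and arguments here are assumptions for illustration;
# the signal name and time range are placeholders.
from seeq import spy

# Find signals by name and type (assumed search syntax).
items = spy.search({"Name": "Reactor Temperature", "Type": "Signal"})

# Pull the matching signals into a pandas DataFrame for analysis.
df = spy.pull(items, start="2021-02-01", end="2021-03-01", grid="15min")

print(df.describe())
```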

In addition to support for large deployments with Seeq Enterprise, Seeq also has a licensing option for indirect usage of Seeq through its REST API and related software development kits for Java, C#, and Python, plus Seeq connectivity to more than 10 enterprise data sources. This license, the Seeq Strategic Agreement, is appropriate for Seeq customers who require these capabilities and are making a multi-year commitment to Seeq’s success within their organization.

“Seeq continues to release compelling analytics solutions for end users in process manufacturing and Industry 4.0 engagements,” comments Janice Abel, Principal Analyst at ARC Advisory Group. “The need for the faster and better insights provided by Seeq is a consistent requirement for organizations investing in IIoT and Smart Manufacturing.”

Seeq’s rapid growth is being fueled in part by its partnerships and commitment to cloud-based computing. Seeq is available in the AWS marketplace and is an AWS Industrial Competency Partner. On Microsoft Azure, Seeq has been available in the Azure Marketplace since 2019 and was recognized last year as a 2020 Microsoft Energy Partner of the Year Finalist.

Linux Foundation Announces Free sigstore Signing Service to Confirm Origin and Authenticity of Software

Another open-source software advance.

Red Hat, Google and Purdue University lead efforts to ensure software maintainers, distributors and consumers have full confidence in their code, artifacts and tooling

The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced the sigstore project. sigstore improves the security of the software supply chain by enabling the easy adoption of cryptographic software signing backed by transparency log technologies. 

sigstore will empower software developers to securely sign software artifacts such as release files, container images and binaries. Signing materials are then stored in a tamper-proof public log. The service will be free to use for all developers and software providers, with the sigstore code and operation tooling developed by the sigstore community. Founding members include Red Hat, Google and Purdue University.

“sigstore enables all open source communities to sign their software and combines provenance, integrity and discoverability to create a transparent and auditable software supply chain,” said Luke Hinds, Security Engineering Lead, Red Hat office of the CTO. “By hosting this collaboration at the Linux Foundation, we can accelerate our work in sigstore and support the ongoing adoption and impact of open source software and development.” 

Understanding and confirming the origin and authenticity of software relies on an often disparate set of approaches and data formats. The solutions that do exist often rely on digests stored on insecure systems that are susceptible to tampering, which can lead to attacks such as swapped-out digests or users falling prey to targeted attacks.

“Securing a software deployment ought to start with making sure we’re running the software we think we are. Sigstore represents a great opportunity to bring more confidence and transparency to the open source software supply chain,” said Josh Aas, executive director, ISRG | Let’s Encrypt. 

Very few open source projects cryptographically sign software release artifacts. This is largely due to the challenges software maintainers face with key management, key compromise and revocation, and the distribution of public keys and artifact digests. In turn, users are left to work out which keys to trust and to learn the steps needed to validate signing. Further problems exist in how digests and public keys are distributed, often stored on websites susceptible to hacks or in a README file on a public git repository. sigstore seeks to solve these issues through short-lived ephemeral keys with a trust root derived from open and auditable public transparency logs.
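
sigstore’s tooling handles identity-bound key issuance and transparency-log submission, but the core sign-then-verify step with a short-lived key can be illustrated generically with the Python cryptography library. This is not sigstore’s API; the artifact bytes are a placeholder, and the identity binding and public log that make the scheme trustworthy are deliberately out of scope here.

```python
# Generic illustration (not sigstore's API) of signing a release artifact with
# an ephemeral key and verifying it. sigstore adds what this sketch omits:
# binding the key to an identity and recording the signature in a public,
# auditable transparency log.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

artifact = b"contents of release-1.0.0.tar.gz"   # placeholder artifact bytes

# Ephemeral key pair: generated for this signing operation, then discarded.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

signature = private_key.sign(artifact, ec.ECDSA(hashes.SHA256()))

# A consumer verifies the artifact against the published signature;
# verify() raises InvalidSignature if the artifact was tampered with.
public_key.verify(signature, artifact, ec.ECDSA(hashes.SHA256()))
print("signature verified")
```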

“I am very excited about the prospects of a system like sigstore. The software ecosystem is in dire need of something like it to report the state of the supply chain. I envision that, with sigstore answering all the questions about software sources and ownership, we can start asking the questions regarding software destinations, consumers, compliance (legal and otherwise), to identify criminal networks and secure critical software infrastructure. This will set a new tone in the software supply chain security conversation,” said Santiago Torres-Arias, Assistant Professor of Electrical and Computer Engineering, Purdue University, and in-toto project founder.

“sigstore is poised to advance the state of the art in open source development,” said Mike Dolan, senior vice president and general manager of Projects at the Linux Foundation. “We are happy to host and contribute to work that enables software maintainers and consumers alike to more easily manage their open source software and security.” 

“sigstore aims to make all releases of open source software verifiable, and easy for users to actually verify them. I’m hoping we can make this as easy as exiting vim,” said Dan Lorenc, Google Open Source Security Team. “Watching this take shape in the open has been fun. It’s great to see sigstore in a stable home.”

More information, including how to contribute, is available from the sigstore project.

ThinkIQ Announces New Suite of Manufacturing SaaS Solutions

Here’s some news from a new company using SaaS led by people I’ve known from other places in my past. They have an interesting take on manufacturing information systems.

ThinkIQ announced a suite of new solutions under its SaaS manufacturing platform, which features four new areas of data functionality, including ThinkIQ’s Visualize, Insight, Transform and Enterprise solutions.

This expanded functionality enables manufacturers to make sense of their data, surfacing actions that enhance safety, reliability, and efficiency by leveraging a fact-based, granular, and data-centric view of material flows and related provenance attribute data.

New platform components integrate with existing IoT infrastructure to help manage everything from supply chains to manufacturing processes and beyond. These added capabilities will continue to build upon ThinkIQ’s unprecedented material traceability and insight which helps manufacturers improve yield, quality, safety, compliance and brand confidence while reducing waste and environmental impact.

“The addition of the newest solutions within our platform will help manage the manufacturing process from supply chain to customer,” said Niels Andersen, CTO and CPO of ThinkIQ. “Having this truly transformative intelligence and insight into your supply chain helps organizations make smarter decisions about their processes, which in turn makes them more profitable and more competitive.”

The latest solutions offered on the ThinkIQ platform include:

  • ThinkIQ Visualize – This functionality moves companies past raw data to being able to explore, compare, and understand their data – with standardized metrics and views to bring wide visibility and context. ThinkIQ Visualize takes the existing data stream and uses on-premises gateways and connectors to centralize the data, so organizations can see all their data on one screen, across multiple locations.
  • ThinkIQ Insight – This new feature uses advanced analytics to enable a material-centric view of operations. Deliverables include advanced visualizations, initial cause-and-effect identification, industry benchmarking, and cross-plant KPIs. Alerts and notifications bring problems to immediate attention, mitigate recall risks, and flag potential yield improvements, delivering insights that could not be seen before.
  • ThinkIQ Transform – This feature uses the results of the earlier steps to supply transformational intelligence and uncover root causes and effects. Data is correlated to the most important metrics. From the process engineer, through the plant manager, to the CEO, everyone has an instant, intelligent view of operations – one that stretches from the beginning of the supply chain through the plant and beyond.
  • ThinkIQ Enterprise – This functionality of the platform offers the ongoing benefits of Industry 4.0 manufacturing. Manufacturing processes now include traceability from raw materials to product delivery, as well as optimized supply chains and real-time contextualized data.

ThinkIQ’s SaaS manufacturing cloud-based platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling, and Machine Learning. The platform collects data inputs across the supply chain (from existing and IIoT sensors) and analyzes them with AI and ML to identify correlations and root causes. It creates a new set of value-added traceability data that is delivered with actionable insights and decisions to guide systems across the supply chain.
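
ThinkIQ’s analytics are proprietary, but the “identify correlations and root causes” step it describes can be sketched in a few lines of pandas: rank process variables by how strongly they track a quality outcome. The column names and values below are invented for illustration.

```python
# Generic sketch (not ThinkIQ's API) of correlating process variables with a
# quality outcome to surface candidate root causes. Column names are made up.
import pandas as pd

batches = pd.DataFrame({
    "supplier_moisture_pct": [2.1, 3.4, 2.0, 4.1, 2.2, 3.9],
    "oven_temp_c":           [181, 178, 182, 175, 180, 176],
    "line_speed_m_min":      [12.0, 12.5, 11.8, 12.7, 12.1, 12.6],
    "defect_rate_pct":       [0.4, 1.9, 0.3, 2.6, 0.5, 2.2],
})

# Rank process variables by the strength of their correlation with defects.
correlations = batches.corr()["defect_rate_pct"].drop("defect_rate_pct")
print(correlations.sort_values(key=abs, ascending=False))
```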

About ThinkIQ

ThinkIQ, a pioneer of Digital Manufacturing Transformation SaaS, delivers unprecedented material traceability and insight into ways to improve yield, quality, safety, compliance, and brand confidence. Our transformational intelligence platform delivers a fact-based, granular, and data-centric contextualized view of material flows and related provenance attribute data; it integrates into existing IoT infrastructures and crosses supply chains to Smart Manufacturing processes and beyond. Our customers have saved tens of millions of dollars by identifying waste and underperforming assets, as well as by reducing warranty reserves for quality and safety issues. ThinkIQ is a privately held company headquartered in Aliso Viejo, CA.
