by Bill Scudder

To make industrial data actionable, evolve your data historian with an AIoT strategy

Opinion
Nov 29, 2021
Artificial Intelligence | Industry | Internet of Things

The ability to mobilize and integrate volumes of complex industrial data across the enterprise is business critical. That’s why industrial leaders need to invest in cloud-ready, purpose-built AI infrastructure and applications.


In my last article, I wrote about how industrial organizations, in the rush to implement new technologies like AI, the cloud, and the Industrial IoT, have found themselves with a technology stack packed with legacy, plumbed-together, on-premises solutions. The result is an environment with not only multiple siloed data sources, each storing, formatting, and securing data in its own unique way, but also an equally siloed approach to understanding how to turn that data into something actionable across the enterprise. Domain experts become not just go-to sources for understanding a certain process or workflow, but the only people with meaningful insight and context into the different data sets tracked or generated by different sources.

Workforce shifts put industrial data value capture at risk

In a rapidly digitizing organization, this is a poor way of maintaining and processing data across a site – but it’s especially counterproductive when you consider the generational churn happening in today’s industrial workforce. Veteran domain experts are increasingly retiring and being replaced with newer workers who are neither academically trained to handle such specific legacy technologies nor equipped with the domain knowledge and operational expertise of their predecessors. This operational expertise gap leaves industrial organizations with not just more aggregated data than they know what to do with, but data that they don’t have any real visibility into.

Making industrial data useful and actionable in this scenario is a two-step process.

Step one involves leveraging next-generation data historians to democratize data access, ensuring that everyone within a plant and across the enterprise – regardless of skill, training, tenure, or expertise – has equal access to, and the ability to tap into, data that sits in any source, across the plant, from the edge to the cloud. Making data truly universal means using edge-to-cloud integrated data historians to eliminate silos, clean up data lakes, give structure to unstructured data, apply tags to make datasets easier to find, and make industrial data accessible in an AI-ready state to drive industrial intelligence.
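As a rough illustration of what “structured, tagged, and AI-ready” can mean in practice, here is a minimal Python sketch using pandas. The point names, metadata schema, and values are hypothetical – this is not any particular historian’s export format or vendor API:

```python
import pandas as pd

# Hypothetical raw historian export: untyped rows keyed by point name.
raw_rows = [
    {"point": "FIC-101.PV", "ts": "2021-11-29 08:00:00", "value": "42.7"},
    {"point": "FIC-101.PV", "ts": "2021-11-29 08:01:00", "value": "43.1"},
    {"point": "TI-205.PV", "ts": "2021-11-29 08:00:00", "value": "351.2"},
]

# Hypothetical tag metadata that gives raw points searchable structure.
tag_metadata = {
    "FIC-101.PV": {"unit": "m3/h", "asset": "feed-pump-1", "site": "plant-a"},
    "TI-205.PV": {"unit": "degC", "asset": "reactor-2", "site": "plant-a"},
}

def to_ai_ready(rows, metadata):
    """Normalize types and join tag metadata so downstream ML sees one tidy table."""
    df = pd.DataFrame(rows)
    df["ts"] = pd.to_datetime(df["ts"])       # enforce a real timestamp type
    df["value"] = pd.to_numeric(df["value"])  # enforce numeric sensor values
    meta = pd.DataFrame.from_dict(metadata, orient="index")
    return df.join(meta, on="point")          # attach unit/asset/site tags per reading

print(to_ai_ready(raw_rows, tag_metadata))
```

The specifics will differ by platform, but the pattern is the same: consistent types, consistent timestamps, and metadata tags joined to every reading so that no single domain expert is the only person who knows what a point name means.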

Step two is then making that data actionable so that decision makers, from the production floor to the management level, are able to understand not just what the data is telling them, but what next steps to take.

Evolving data historians with Artificial Intelligence of Things (AIoT)

To turn raw data into actionable insights, industrial organizations need to evolve their data historians to benefit from machine learning (ML) and AI algorithms by leveraging an Industrial AI infrastructure that helps accelerate business value from industrial data. Data historians can’t just be used to collect process data; they have to be treated as the core of a greater industrial data management strategy, one that shifts gears from mass data accumulation to more thoughtful application, integration, and mobility of industrial data. Purposeful application of AI and ML is key to facilitating that evolution in the data historian’s function in an industrial organization, to tap previously undiscovered or unoptimized industrial data sets for new business value.
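As a simple sketch of what that evolution can look like in code, the example below layers an off-the-shelf anomaly detector over synthetic historian data. IsolationForest is a stand-in here for whatever model a real Industrial AI application would use, and the process variables and numbers are invented:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic historian extract: 1,000 readings of two correlated process
# variables (a flow rate and a temperature) under normal operating conditions.
flow = rng.normal(loc=42.0, scale=1.5, size=1000)
temp = 350.0 + 0.8 * flow + rng.normal(scale=0.5, size=1000)
normal_ops = np.column_stack([flow, temp])

# Fit an unsupervised anomaly detector on the historical "normal" envelope.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_ops)

# Score fresh readings: -1 flags a reading that breaks the learned pattern.
new_readings = np.array([
    [42.5, 395.0],  # temperature inconsistent with the usual flow/temp relation
    [41.9, 383.5],  # consistent with normal operation
])
print(model.predict(new_readings))  # expected roughly: [-1  1]
```

The point is not the particular algorithm but the shift in the historian’s role: from a passive archive of process values to the feedstock for models that surface conditions worth acting on.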

Many leading industrial organizations are adopting an AIoT strategy to accelerate time-to-value from their AI investments. An AIoT strategy provides integrated data management, edge and cloud infrastructure, and a production-grade AI environment to build, deploy, and host Industrial AI applications at enterprise speed and scale. It also serves as the foundational infrastructure to realize a transformative vision for the Self-Optimizing Plant.

Scaling AI for real-world applications requires providing the tools, infrastructure, and workflows for powering Industrial AI across the solution lifecycle. It also requires the software, hardware, and enterprise architecture needed to productize AI in industrial environments, including broader collaboration between development, data science, and infrastructure capabilities such as CloudOps, DevOps, MLOps, and others. This productization capability is critical to helping organizations mature beyond sporadic AI proofs of concept to an enterprise-wide Industrial AI strategy.

Industrial AI supersedes “generic” AI in delivering real-world value

But not all AI is equal, and trying to apply a “generic” AI approach to your data historian in an industrial setting can undercut any ROI you’re hoping to get out of it. It may be tempting to think that training a generic AI model on large volumes of plant data would acclimatize the model to the plant’s needs. But if the plant, for safety or design reasons, is working within a limited scope of conditions, then the AI model is also ingesting that narrow band of data and teaching itself to operate within those guardrails. As a result, a generic AI model trained on plant data may not be as nimble as you want and expect an AI model to be – for instance, responding to real-time market changes and adjusting production schedules accordingly.
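This extrapolation failure is easy to reproduce. In the hypothetical sketch below, a model is trained only on the narrow band of conditions a plant safely operates in, then queried outside that band; the “process” is synthetic, but the confidently wrong answer is exactly the behavior described above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=1)

# The true (unknown) process response is nonlinear, e.g., yield vs. throughput.
def true_response(x):
    return -0.05 * (x - 60.0) ** 2 + 95.0

# For safety/design reasons the plant only ever runs at 55-65 units of
# throughput, so that narrow band is all the data the model ever sees.
x_train = rng.uniform(55.0, 65.0, size=200).reshape(-1, 1)
y_train = true_response(x_train.ravel()) + rng.normal(scale=0.2, size=200)

model = LinearRegression().fit(x_train, y_train)

# Inside the band the fit looks fine; outside it, the model is confidently wrong.
for x in (60.0, 85.0):
    pred = model.predict(np.array([[x]]))[0]
    print(f"throughput={x}: predicted={pred:.1f}, actual={true_response(x):.1f}")
```

Inside the training band the fit is accurate to within noise; at a throughput of 85 it misses the true response by roughly 30 points, with no signal that it has left familiar territory.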

Even worse, this generic AI model could end up inferring spurious correlations or causal links between industrial processes and plant equipment, giving decision makers insights or prescribed next steps that aren’t correct. This doesn’t just harm the plant’s ability to function and leaders’ ability to make it more optimized and efficient; it also undermines the ability to productize AI in the industrial space and harms AI adoption overall.

Generic AI and ML won’t do. Evolving your plant or refinery’s data historian to match the needs of a more complex data environment means using more specific, fit-for-purpose Industrial AI – in other words, AI that has been embedded into domain-specific applications focused on targeted business needs, rather than trained on a larger pool of plant data.

By deploying AI through specific purpose-built Industrial AI applications, rather than spray-and-pray AI approaches across the entire plant, industrial leaders both evade some of the (perceived) hurdles associated with implementing new technologies and ensure that the AI algorithms are incorporating domain knowledge that’s specific to industrial processes and real-world engineering. This ensures that the Industrial AI is both ingesting relevant data guided by domain-specific purposes and generating insights that give decision makers a more accurate picture of their environment. The result is a safe, sustainable, and holistic workflow for decision-making that supports reliable long-term results.
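To make “embedding domain knowledge” slightly more concrete, one simple form it can take is hard engineering guardrails wrapped around a model’s raw recommendations. The sketch below is a minimal, hypothetical illustration – the setpoint names and limits are invented, and real Industrial AI applications embed domain knowledge far more deeply, in features, model structure, and first-principles constraints:

```python
# Hypothetical engineering limits; a real application would source these from
# equipment specs, safety envelopes, and first-principles process models.
EQUIPMENT_LIMITS = {
    "reactor_temp_degC": (300.0, 420.0),  # vendor-rated safe envelope
    "feed_rate_m3h": (10.0, 80.0),        # pump capacity and metallurgy limits
}

def constrain(setpoints: dict) -> dict:
    """Clamp raw model recommendations to hard domain guardrails."""
    safe = {}
    for name, value in setpoints.items():
        lo, hi = EQUIPMENT_LIMITS[name]
        safe[name] = min(max(value, lo), hi)  # never recommend outside the envelope
    return safe

# A raw recommendation a generic, domain-blind model might happily emit:
raw = {"reactor_temp_degC": 455.0, "feed_rate_m3h": 35.0}
print(constrain(raw))  # {'reactor_temp_degC': 420.0, 'feed_rate_m3h': 35.0}
```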

To achieve their profitability, production, and sustainability goals, industrial organizations must evolve their current data historians into next-generation, industrial-grade data management solutions powered by an AIoT strategy, which provides the anchor technology for deploying Industrial AI applications across the enterprise. Having a data historian capable of mobilizing and integrating volumes of complex industrial data across the enterprise is not just a convenience; it’s business critical. And to do that, industrial leaders need to invest in cloud-ready, purpose-built Industrial AI infrastructure and applications to future-proof the business against volatile and complex market conditions.