Industrial AI Needs Context Engineers, NOT Prompt Engineers

Originally published September 2025 on ARCweb.com by Colin Masson, Director of Research for Industrial AI, ARC Advisory Group Inc.

In ARC Advisory Group’s recent podcast series, "Industrial Systems Engineering in the New Era of AI," I explored the foundational shifts reshaping our industry with serial entrepreneur Rick Bullotta. If you listened closely, especially to Episode 3 on the rise of domain-specific innovators, you probably heard the word 'context' used liberally, in the context (if you'll pardon the pun) of the industrial data challenges we've been trying to overcome for decades. This was no accident. For years, the core problem has been getting the right information to the right place at the right time.

That long-standing challenge has now met a new architectural paradigm. The initial hype around generative AI suggested the answer was "prompt engineering." Yet, anyone who has moved beyond simple chatbots knows the limitations of this approach in a complex industrial setting. Industrial processes are not single-turn, stateless events; they are complex, multi-step workflows that require memory, access to real-time data, and the ability to interact with other systems. The practical failures of prompt-centric systems have made it clear that a more robust, systems-level discipline is required.

This is the architectural shift from tactical prompts to strategic systems. The new discipline is called Context Engineering, and it is the true engine that connects the potential of the Industrial Data Fabric to the reality of enterprise-grade Agentic AI.

An Expanded Lexicon for the Modern Industrial AI Stack

To grasp this shift, we must first clarify our vocabulary. The AI landscape is evolving at a breakneck pace, and a shared understanding of these interconnected concepts is crucial for any organization crafting its industrial AI strategy.

  • Prompt Engineering is the foundational skill of crafting a single instruction to elicit a desired response from an AI model. Think of it as giving a single, clear order. It’s essential, but it’s not a strategy for managing a complex operation.
  • Context Engineering is the superset of prompt engineering. It is the systems-level discipline of designing, building, and orchestrating the entire information ecosystem an AI model sees at inference time. If a prompt is "what you say," context is "everything else the model sees": system instructions, retrieved knowledge from the data fabric, available tools, conversation memory, and real-time operational state. It is a fundamental shift from text composition to systems design.
  • Agentic AI represents the application of this thinking. These are systems that can reason, plan, and use tools to accomplish complex, multi-step goals with a degree of autonomy. A reliable AI agent is the result of effective Context Engineering.
  • Knowledge Graphs & Vector Databases are the data technologies that power the "Knowledge" pillar of the context architecture. The primary output of a mature Industrial Data Fabric is a dynamic knowledge graph that models the explicit, structured relationships between industrial assets and processes, becoming the definitive source of grounded truth.
  • MCP (Model Context Protocol) & A2A (Agent-to-Agent Protocol) are emerging open standards that form the communication layer for agentic systems. MCP standardizes agent-to-tool communication, acting as a universal plug-and-play interface for an agent to securely access external data and APIs. A2A standardizes inter-agent communication, allowing different AI agents to collaborate and delegate tasks.

In short, the Industrial Data Fabric creates the AI-ready knowledge. Context Engineering is the discipline of orchestrating that knowledge and connecting it to tools (via MCP) and other agents (via A2A). And Agentic AI is the autonomous system that this entire stack enables.
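
To make the MCP half of that stack concrete, here is a minimal sketch of how a single data fabric capability could be exposed as an agent-discoverable tool. It uses the official MCP Python SDK's FastMCP server; the asset lookup itself is a hypothetical stub standing in for a real Industrial Data Fabric query.

```python
# Minimal sketch: exposing one Industrial Data Fabric capability as an
# MCP tool, using the official MCP Python SDK (pip install "mcp[cli]").
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("industrial-data-fabric")

@mcp.tool()
def get_asset_status(asset_id: str) -> dict:
    """Return current status and maintenance context for an asset."""
    # Hypothetical stub; a real server would query the fabric here.
    return {
        "asset_id": asset_id,
        "status": "RUNNING",
        "last_maintenance": "2025-08-14",
        "upstream_dependencies": ["PUMP-101", "VALVE-204"],
    }

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to any MCP-compliant agent
```

Any MCP-compliant agent can now discover and call get_asset_status without bespoke integration code, which is precisely the plug-and-play property described above.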

The Industrial Data Fabric: Prerequisite for Intelligence

This brings us back to the core of our series. Context Engineering, for all its power, is a theoretical software architecture without a foundation of high-quality, contextualized data. This is where its symbiotic relationship with the Industrial Data Fabric becomes clear. The Industrial Data Fabric is the specialized technology stack purpose-built to create and serve the "Knowledge" and "State" components that the Context Engineering layer orchestrates.

When an industrial AI agent needs to troubleshoot a production line, its Context Engineering system doesn't just rely on a clever prompt. It dynamically assembles a context window (sketched in code after this list) by:

  1. Querying the Industrial Data Fabric's knowledge graph for the real-time status of all relevant assets, their maintenance histories, and their upstream/downstream dependencies.
  2. Retrieving standard operating procedures from a vector database.
  3. Accessing its memory of similar past events.
  4. Loading the definitions of tools it can use via MCP, such as creating a work order in an ERP system.
  5. Potentially delegating a sub-task (like checking spare part availability with a supplier) to a specialized procurement agent via the A2A protocol.
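
A sketch of that assembly loop appears below. Every collaborator here (the fabric client, vector store, memory store, MCP client, and A2A client) is a hypothetical stand-in for whatever concrete systems an organization deploys; the point is the orchestration pattern, not the specific APIs.

```python
# Sketch of a Context Engineering assembly step for a troubleshooting
# agent. All collaborators are hypothetical stand-ins for real systems.
from dataclasses import dataclass, field

@dataclass
class ContextWindow:
    system_instructions: str
    knowledge: list = field(default_factory=list)   # grounded facts from the fabric
    procedures: list = field(default_factory=list)  # retrieved SOPs
    memory: list = field(default_factory=list)      # similar past events
    tools: list = field(default_factory=list)       # MCP tool definitions

def assemble_context(line_id, fabric, vector_db, memory_store, mcp_client, a2a_client):
    ctx = ContextWindow(
        system_instructions="You are a production-line troubleshooting agent. "
                            "Ground every recommendation in the facts provided."
    )
    # 1. Query the fabric's knowledge graph for asset state and dependencies.
    ctx.knowledge = fabric.query(
        f"assets feeding {line_id}, with status, maintenance history, dependencies")
    # 2. Retrieve relevant standard operating procedures semantically.
    ctx.procedures = vector_db.search(f"troubleshooting {line_id}", top_k=3)
    # 3. Recall similar past events from long-term memory.
    ctx.memory = memory_store.similar_events(line_id, limit=5)
    # 4. Load the MCP tool definitions the agent may invoke (e.g., work orders).
    ctx.tools = mcp_client.list_tools()
    # 5. Delegate a sub-task to a specialist agent over A2A.
    a2a_client.delegate("procurement-agent",
                        task=f"check spare part availability for {line_id}")
    return ctx
```

The value of this pattern is that the prompt itself stays small and stable; what changes from incident to incident is the context assembled around it.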

This architectural shift isn't just theoretical; it's being actively built by innovators in the industrial space. In fact, I'll soon be discussing this very topic and its practical implications on an upcoming podcast with Litmus Automation Founder and CEO, Vatsal Shah, whose team is focused on delivering the AI-ready data that makes this all possible.

An Updated Framework: Industrial-grade AI Solution Services

This evolution from a data-centric to an intelligence-centric architecture requires us to update our framework for what constitutes a complete, industrial-grade AI solution. The services required go beyond simple data management to encompass this new orchestration and intelligence layer, enabled by open standards. As promised, here is the updated and expanded list of core services that organizations should consider as they assemble their modern industrial AI stack.

Figure: Industrial-grade AI Solution Services Built on Industrial Data Fabric, ARC Advisory Group 2025

  • Connectivity & Data Ingestion (incl. Events, Agents, A2A) This foundational service connects to all IT, OT, and ET data sources. In a modern stack, it must also handle event-driven architectures and agent-to-agent communication via protocols like A2A, ingesting not just raw data but also messages and tasks from a distributed ecosystem of collaborating AI agents.
  • Data Processing & Contextualization: The Foundation for Context Engineering (incl. MCP) This remains the core value of the Industrial Data Fabric: transforming raw, siloed data into a coherent, understandable model of operations. The contextualized data and capabilities of the fabric are exposed securely as "tools" that AI agents can discover and use via the MCP standard, turning the fabric from a passive repository into an active, queryable resource for any compliant AI agent.
  • Data Storage & Modeling (incl. Vector DB, Knowledge Graphs) This service provides the persistence layer. Knowledge graphs are essential for modeling the explicit, structured relationships between industrial assets for precise reasoning. Vector databases complement this by enabling semantic search over unstructured data like technical manuals. Together, they form the comprehensive "long-term memory" that grounds AI agents in factual, enterprise-specific knowledge.
  • Context Engineering: The AI Orchestration & Intelligence Layer This is the active intelligence layer that sits atop the data fabric, serving as the "operating system" for the AI. This service is responsible for orchestrating Process & Domain Intelligence (AI+). It dynamically assembles the context window by codifying business rules and domain expertise into instructions, and by intelligently selecting knowledge from the fabric's knowledge graph, managing memory, and deciding when to use tools (via MCP) or delegate tasks (via A2A).
  • AI/ML Model Development & Management (Lifecycle) This service provides the tools and governance for developing, training, deploying, and monitoring the AI models themselves. In a context-engineered system, this goes beyond traditional MLOps to include managing the lifecycle of specialized agentic models and evaluating their performance based on the quality of the context they receive.
  • AI Agent & Workflow Orchestration This capability evolves traditional workflow management by leveraging context-aware AI agents. It involves automating and managing data pipelines and analytical workflows, with agents actively participating in and even directing these workflows based on real-time context provided by the Context Engineering layer.
  • Intelligent Applications: Delivering AI Agents & Copilots to Users This service delivers the power of the agentic system to end-users. It’s no longer just about dashboards. This now includes building AI "copilots" for frontline workers, creating conversational interfaces for complex troubleshooting, and providing the tools for human-in-the-loop oversight of autonomous agent actions.
  • Industry Standards & Protocols A mature industrial AI stack must be built on a foundation of openness and interoperability. This service ensures adherence to critical industry data standards (e.g., OPC-UA) as well as emerging AI communication protocols like MCP and A2A to prevent vendor lock-in and ensure the AI ecosystem can evolve.

The message is clear: while the industry has been rightly focused on building the data fabric, the next competitive frontier is already here. The advantage will belong to those who not only build the foundation but also master the art and science of Context Engineering to construct truly intelligent and collaborative systems upon it.

Stay Connected

The dialogue on industrial AI and digital transformation is constantly evolving. To stay informed and hear more insights from industry leaders, we invite you to subscribe to the ARC Advisory Group's Digital Transformation podcast series.

As noted above, I'll be discussing Context Engineering and its practical implications on an upcoming podcast with Litmus Automation Founder and CEO Vatsal Shah, whose team is focused on delivering the AI-ready data that makes this all possible.

We believe the best conversations include diverse perspectives. If you are an innovator in this space and would like to contribute to a future discussion, please reach out to Colin Masson at ARC Advisory Group.

Engage with ARC Advisory Group

For ARC Advisory Group's recommendations on Navigating the AI Wars, Closing the Digital Divide by Embracing Industrial AI, assembling your Industrial-Grade Data Fabric, and governing major decisions about enterprise, cloud, industrial edge, and AI software, contact me at cmasson@arcweb.com, or set up a meeting with me or my fellow analysts at ARC Advisory Group to learn more about our Executive Insights Service for industrial organizations and our Industrial AI Insights Service for vendors.


Reader comment: Excellent insight! I would only add standardized information models (OPC UA Companion Specs / CESMII SM Profiles) for ecosystem-wide semantic interoperability, and remove A2A, as Kudzai Manditereza proposed; please, no new point-to-point integrations! Actually, those information models are excellent vessels for moving agent communication between digital twins, which presumably go hand-in-hand with OT soon: the AI agent controls, predicts, and simulates assets and processes through digital twins. Just add the needed parameters to the data model (and use an event-driven approach) to enable agent-to-agent communication, as Industrie 4.0 already has built in.
