Industrial AI Needs Context Engineers, NOT Prompt Engineers
Originally published September 2025 on ARCweb.com by Colin Masson, Director of Research for Industrial AI, ARC Advisory Group Inc.
In ARC Advisory Group’s recent podcast series, "Industrial Systems Engineering in the New Era of AI," I explored the foundational shifts reshaping our industry with serial entrepreneur Rick Bullotta. For those who listened closely, especially to Episode 3 on the rise of domain-specific innovators, you probably heard the word 'context' used liberally—in the context (if you'll pardon the pun) of the industrial data challenges we've been trying to overcome for decades. This was no accident. For years, the core problem has been getting the right information to the right place at the right time.
That long-standing challenge has now met a new architectural paradigm. The initial hype around generative AI suggested the answer was "prompt engineering." Yet, anyone who has moved beyond simple chatbots knows the limitations of this approach in a complex industrial setting. Industrial processes are not single-turn, stateless events; they are complex, multi-step workflows that require memory, access to real-time data, and the ability to interact with other systems. The practical failures of prompt-centric systems have made it clear that a more robust, systems-level discipline is required.
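To make that contrast concrete, here is a minimal, purely illustrative sketch. Every name in it (llm_complete, query_historian, create_work_order, WorkflowState) is a hypothetical placeholder, not a real model, historian, or CMMS API.

```python
# Illustrative only: contrasting a stateless prompt with a stateful, multi-step workflow.
# Every name here (llm_complete, query_historian, create_work_order) is a placeholder,
# not a real model, historian, or CMMS API.

from dataclasses import dataclass, field


def llm_complete(prompt: str) -> str:
    """Stand-in for a call to any large language model."""
    return f"[model response to: {prompt[:60]}...]"


def query_historian(asset_id: str, last_minutes: int) -> str:
    """Stand-in for a process-historian query returning recent telemetry."""
    return f"vibration trending up on {asset_id} over the last {last_minutes} min"


def create_work_order(asset_id: str, reason: str) -> None:
    """Stand-in for a CMMS/EAM integration."""
    print(f"Work order raised for {asset_id}: {reason}")


@dataclass
class WorkflowState:
    """Memory carried across steps, something a single stateless prompt cannot hold."""
    asset_id: str
    findings: list[str] = field(default_factory=list)


# Single-turn and stateless: one prompt in, one answer out; no live data, no follow-up action.
def prompt_only_diagnosis(symptom: str) -> str:
    return llm_complete(f"Why might a packaging line show this symptom: {symptom}?")


# Multi-step and stateful: pull fresh data, remember what was learned, act on other systems.
def agentic_diagnosis(state: WorkflowState, symptom: str) -> str:
    telemetry = query_historian(state.asset_id, last_minutes=30)
    state.findings.append(telemetry)
    decision = llm_complete(
        f"Symptom: {symptom}\nFindings so far: {state.findings}\nWhat should happen next?"
    )
    if "work order" in decision.lower():
        create_work_order(state.asset_id, reason=symptom)
    return decision
```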
This is the architectural shift from tactical prompts to strategic systems. The new discipline is called Context Engineering, and it is the true engine that connects the potential of the Industrial Data Fabric to the reality of enterprise-grade Agentic AI.
An Expanded Lexicon for the Modern Industrial AI Stack
To grasp this shift, we must first clarify our vocabulary. The AI landscape is evolving at a breakneck pace, and a shared understanding of these interconnected concepts is crucial for any organization crafting its industrial AI strategy.
In short, the Industrial Data Fabric creates the AI-ready knowledge. Context Engineering is the discipline of orchestrating that knowledge and connecting it to tools (via the Model Context Protocol, or MCP) and to other agents (via agent-to-agent, or A2A, protocols). And Agentic AI is the autonomous system that this entire stack enables.
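To make the layering tangible, here is a minimal sketch. The interfaces below are hypothetical placeholders for illustration only; they are not the actual MCP or A2A SDKs, which define their own message formats and transports.

```python
# Illustrative layering only: these Protocol classes are hypothetical stand-ins,
# not the real Model Context Protocol (MCP) or agent-to-agent (A2A) SDK interfaces.

from typing import Protocol


class DataFabric(Protocol):
    """Industrial Data Fabric: serves AI-ready Knowledge and State."""
    def knowledge(self, query: str) -> str: ...
    def state(self, asset_id: str) -> dict: ...


class McpTool(Protocol):
    """A tool the agent can invoke, e.g. one exposed over MCP."""
    def call(self, name: str, arguments: dict) -> str: ...


class PeerAgent(Protocol):
    """Another agent reachable over an A2A-style channel."""
    def delegate(self, task: str) -> str: ...


class ContextEngine:
    """Context Engineering layer: composes fabric knowledge and state with the
    available tools and peer agents into the context an agentic system reasons over."""

    def __init__(self, fabric: DataFabric, tools: list[McpTool], peers: list[PeerAgent]):
        self.fabric, self.tools, self.peers = fabric, tools, peers

    def build_context(self, task: str, asset_id: str) -> str:
        # Assemble the pieces the downstream agent will reason and act on.
        return "\n".join([
            f"TASK: {task}",
            f"KNOWLEDGE: {self.fabric.knowledge(task)}",
            f"STATE: {self.fabric.state(asset_id)}",
            f"TOOLS AVAILABLE: {len(self.tools)}  PEER AGENTS: {len(self.peers)}",
        ])
```

The point of the sketch is the separation of concerns: the fabric serves knowledge and state, the context engine composes them with tools and peer agents, and the agentic system reasons and acts over the result.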
The Industrial Data Fabric: Prerequisite for Intelligence
This brings us back to the core of our series. Context Engineering, for all its power, remains a theoretical software architecture without a foundation of high-quality, contextualized data. This is where its symbiotic relationship with the Industrial Data Fabric (IDF) becomes clear. The IDF is the specialized technology stack purpose-built to create and serve the "Knowledge" and "State" components that the Context Engineering layer orchestrates.
When an industrial AI agent needs to troubleshoot a production line, its Context Engineering system doesn't just rely on a clever prompt. It dynamically assembles a context window from the Knowledge and State components the data fabric serves, as sketched below.
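The fragment below is a purely illustrative sketch of that assembly step, assuming hypothetical fabric accessors (get_asset_model, get_live_state, get_maintenance_history) and invented sample values rather than any real product API.

```python
# Hypothetical sketch of context-window assembly for a troubleshooting agent.
# The fabric accessors and sample values below are illustrative placeholders only.

from datetime import datetime, timezone


def get_asset_model(asset_id: str) -> str:
    """Knowledge: equipment hierarchy, design limits, and related documentation."""
    return f"{asset_id}: filler -> capper -> labeler; max line speed 600 bpm"


def get_live_state(asset_id: str) -> str:
    """State: current telemetry served by the data fabric."""
    return f"{asset_id}: capper torque 14% above setpoint, reject rate 3.2%"


def get_maintenance_history(asset_id: str, limit: int = 3) -> list[str]:
    """Knowledge: most recent work orders for the asset."""
    return ["2025-08-14 capper clutch replaced", "2025-07-02 torque sensor recalibrated"][:limit]


def assemble_context(asset_id: str, symptom: str) -> str:
    """Build the context window the agent reasons over; no clever prompt required."""
    now = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return "\n".join([
        f"TIME: {now}",
        f"SYMPTOM: {symptom}",
        f"ASSET MODEL: {get_asset_model(asset_id)}",
        f"LIVE STATE: {get_live_state(asset_id)}",
        "RECENT MAINTENANCE:",
        *[f"  - {item}" for item in get_maintenance_history(asset_id)],
    ])


if __name__ == "__main__":
    print(assemble_context("LINE-7", "intermittent cap rejects"))
```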
This architectural shift isn't just theoretical; it's being actively built by innovators in the industrial space. In fact, I'll soon be discussing this very topic and its practical implications on an upcoming podcast with Litmus Automation Founder and CEO, Vatsal Shah, whose team is focused on delivering the AI-ready data that makes this all possible.
An Updated Framework: Industrial-grade AI Solution Services
This evolution from a data-centric to an intelligence-centric architecture requires us to update our framework for what constitutes a complete, industrial-grade AI solution. The services required go beyond simple data management to encompass this new orchestration and intelligence layer, enabled by open standards. As promised, here is the updated and expanded list of core services that organizations should consider as they assemble their modern industrial AI stack.
The message is clear: while the industry has been rightly focused on building the data fabric, the next competitive frontier is already here. Mastery will belong to those who not only build the foundation but also master the art and science of Context Engineering to construct truly intelligent and collaborative systems upon it.
Stay Connected
The dialogue on industrial AI and digital transformation is constantly evolving. To stay informed and hear more insights from industry leaders, we invite you to subscribe to the ARC Advisory Group's Digital Transformation podcast series.
As mentioned above, an upcoming episode will feature Litmus Automation Founder and CEO Vatsal Shah, exploring Context Engineering and its practical implications in depth.
We believe the best conversations include diverse perspectives. If you are an innovator in this space and would like to contribute to a future discussion, please reach out to Colin Masson at ARC Advisory Group.
Engage with ARC Advisory Group
For ARC Advisory Group recommendations on Navigating the AI Wars, Closing the Digital Divide by Embracing Industrial AI, assembling your Industrial-Grade Data Fabric, and governing major decisions about enterprise, cloud, industrial edge, and AI software, please contact Colin Masson at cmasson@arcweb.com. You can also set up a meeting with me, or with my fellow analysts at ARC Advisory Group, to learn more about our Executive Insights Service for Industrial organizations and our Industrial AI Insights Service for Vendors.
Excellent insight! I would only add standardized information models (OPC UA Companion Specs / CESMII SM Profiles) for ecosystem-wide semantic interoperability, and remove A2A, as Kudzai Manditereza proposed: please, no new point-to-point integrations! Actually, those information models are excellent vessels for moving agent communication between digital twins (in OT, the two will presumably go hand in hand soon; an AI agent controls, predicts, or simulates assets and processes through its digital twin). Just add the needed parameters to the data model (and use an event-driven approach) to enable agent-to-agent communication, as Industrie 4.0 already has built in.