**The Real Moat in AI Isn't Models. It's Infrastructure.**

These days it doesn't take long for another model to drop. Bigger context windows. Flashier benchmarks. And the headlines roll in.

But look into the background of what people are calling the "circular economy" and you'll see that the profit is being generated primarily via cloud infrastructure. Microsoft's Azure recently crossed $75 billion in annual revenue, growing ~34% YoY. Meanwhile, AWS pulled in ~$30.9 billion in a single quarter, yet its margin compressed as it invests heavily in AI infrastructure.

The takeaway is clear: the smart money isn't only in building a model; it's in owning the infrastructure that models run on. This creates a circular economy of intelligence: the more models and workloads spin up, the more infrastructure consumption rises, which drives revenue back into infrastructure investment, enabling more models. That loop is the moat.

Personally, I'm in the "DSLMs/SLMs are the future" camp. Our CTO at Cetacean likes to say "the big LLMs have scraped the internet." So to advance the capabilities of language models, users need to take matters into their own hands, as they say: training models for specific use cases. But not everyone has the skills to do this. That drove me to create a no-code SLM training feature as part of **Oceanic**, the new multi-cloud managed AI infrastructure platform I'm building with a team of engineering and product leaders: Anthony Monroy, Salim Lakhani (who built DevPanel), and Agentic Engineer Matt Burch.

Because it's more than clear: models aren't the moat for startups unless they're domain-specific. Flexible, easy-to-use, AI-specific infrastructure is.

I spent years working in the early days of cloud at AT&T, helping Amazon and others scale.
This past year, working with AI and creating AI-driven platforms and features led me to merge those experiences and begin building Oceanic at Cetacean Labs: a multi-cloud managed AI platform and App Builder designed to create and deploy workloads to any major cloud in just a few minutes. It's the same stack that now powers Esteemed Agents, and it will soon enable another project we're incubating inside Esteemed: HCMGPT, domain-specific intelligence for Human Capital Management.

Along the way I've seen that the real leverage in AI doesn't come from who has the biggest model, or who finds the coolest feature in Google Vertex or Databricks. It comes from who can run, adapt, and scale domain-specific intelligence faster, cheaper, and easier, and deploy it to the best infra option. In Oceanic's case, the user can opt for the cheapest: intelligent infrastructure-cost and AI-model switching in our Agents saves 50% on average.

This has been a transformative time in IT, and although it takes focus and determination to adapt, I'm glad to have my community and incredible collaborators I can rely on to share the journey.

#Agentics #HCM #AI #Cloud
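The cost-and-model-switching idea can be sketched as a simple router: pick the cheapest model that still clears a quality floor. Everything below (model names, prices, quality scores, the `pick_model` helper) is illustrative and hypothetical, not Oceanic's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class ModelOption:
    name: str
    cost_per_1k_tokens: float  # USD, made-up prices for illustration
    quality_score: float       # 0..1, made-up benchmark score

def pick_model(options: list[ModelOption], min_quality: float) -> ModelOption:
    """Cost-aware switching: cheapest model that meets the quality floor."""
    eligible = [m for m in options if m.quality_score >= min_quality]
    if not eligible:
        raise ValueError("no model meets the quality floor")
    return min(eligible, key=lambda m: m.cost_per_1k_tokens)

options = [
    ModelOption("large-llm", 0.030, 0.95),
    ModelOption("mid-llm", 0.010, 0.88),
    ModelOption("domain-slm", 0.002, 0.90),  # a fine-tuned small model
]

choice = pick_model(options, min_quality=0.85)
print(choice.name)  # → domain-slm
savings = 1 - choice.cost_per_1k_tokens / 0.030  # vs. always using large-llm
```

The point of the sketch is that a domain-specific SLM often sits in the sweet spot: good enough on its niche task to clear the floor, while costing a fraction of a frontier model, which is where the headline savings come from.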
On point!
Spot on — we’ve seen the same truth building DevPanel: whoever controls and automates the infrastructure, wins.