VAST Data and Lablup Power Korea’s Sovereign AI Initiative with Integrated Backend.AI and the VAST AI Operating System
SEOUL, SOUTH KOREA – November 19, 2025 – VAST Data, the AI Operating System company, today announced a strategic collaboration with Lablup Inc., developer of the open, scalable AI computing platform Backend.AI and a key participant in South Korea’s government-backed Sovereign AI Project. Together, the companies have built a high-performance, scalable data foundation that enables Korea’s national AI consortium, which aims for technological self-reliance, to train and deploy large language models entirely on domestic infrastructure.
Backend.AI provides a unified environment for AI model training, deployment, and inference across GPU and CPU resources. By integrating the VAST AI Operating System, Lablup and its customers can manage training data, model checkpoints, and artifacts with unmatched throughput and resilience, ensuring smooth data flows from model development to real-time serving.
As Lablup’s AI workloads grew, traditional storage systems struggled to keep pace with the throughput and concurrency required for foundation-model training. The team turned to VAST Data to modernize its data layer, eliminating performance bottlenecks while maintaining the reliability and control needed for multi-tenant research environments.
Running on SK Telecom’s Sovereign AI cluster, the integrated Backend.AI and VAST AI Operating System stack now orchestrates and accelerates model training across high-performance GPUs, giving Korean researchers a unified environment for training today and for inference in the future.
“Our mission has always been to democratize AI infrastructure through open, efficient, and scalable technology,” said Jeongkyu Shin, Founder and CEO of Lablup Inc. “By integrating VAST’s AI Operating System into Backend.AI environments, we’re enabling developers and enterprises to handle massive datasets and complex workloads with the reliability and performance that modern AI demands. It’s a technological synergy that really unlocks the full potential of our AI compute while ensuring sovereignty and data security.”
Through direct integration with the VAST AI Operating System, Backend.AI achieves seamless, high-throughput access to data for massive training workloads. Using VAST’s Disaggregated, Shared-Everything (DASE) architecture, Lablup can scale compute and storage independently while maintaining a single global namespace, VAST DataSpace, which lets researchers access checkpoints, datasets, and model outputs instantly across projects.
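To illustrate the workflow this enables, here is a minimal sketch in Python with PyTorch, assuming a hypothetical shared mount point and directory layout, of how a training job running under Backend.AI might write and resume checkpoints through the shared namespace so that other jobs and projects can see them immediately:

```python
# Minimal sketch: checkpointing through a shared POSIX namespace.
# The mount point and directory names below are hypothetical examples.
import os
import torch

SHARED_ROOT = "/mnt/vast"                                   # hypothetical mount
CKPT_DIR = os.path.join(SHARED_ROOT, "checkpoints/foundation-model")
os.makedirs(CKPT_DIR, exist_ok=True)

def save_checkpoint(model, optimizer, step):
    """Write a checkpoint that any job sharing the namespace can read."""
    path = os.path.join(CKPT_DIR, f"step_{step:08d}.pt")
    torch.save(
        {"step": step,
         "model": model.state_dict(),
         "optimizer": optimizer.state_dict()},
        path,
    )
    return path

def load_latest_checkpoint(model, optimizer):
    """Resume from the newest checkpoint visible in the shared namespace."""
    ckpts = sorted(f for f in os.listdir(CKPT_DIR) if f.endswith(".pt"))
    if not ckpts:
        return 0                                            # nothing to resume
    state = torch.load(os.path.join(CKPT_DIR, ckpts[-1]), map_location="cpu")
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    return state["step"]
```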
Powering Lablup’s role in Korea’s Sovereign AI initiative, the VAST AI Operating System enables Backend.AI to deliver everything from large-scale model training to real-time orchestration, including:
Sovereign Data Control: AI sovereignty begins and ends with data infrastructure. Built on VAST DataSpace and the AI Operating System’s Zero Trust architecture, Lablup enforces data residency, access control, and compliance across Korea’s Sovereign AI environments. For government, research, and regulated enterprise participants in the national consortium, this ensures control not only over where data lives, but over how it’s protected, governed, and audited throughout the AI lifecycle.
High-Performance Data Infrastructure: Backend.AI integrates with the VAST DataStore, delivering high-throughput access to training datasets and model artifacts via low-latency, RDMA-based data paths that move data directly between storage and GPUs (a minimal sketch follows this list). This bypasses CPU bottlenecks and ensures consistent performance across GPUs in SK Telecom’s Sovereign AI cluster, accelerating model pre-training and fine-tuning at national scale.
Unified Global Namespace for Seamless Collaboration: Built on VAST’s DASE architecture, Backend.AI provides researchers and developers with a single, unified namespace that spans every workload, from model training and validation to deployment. With VAST DataSpace, teams can instantly access checkpoints, datasets, and outputs across distributed environments without duplication or downtime, simplifying collaboration across the consortium.
Secure, Multi-Tenant Scalability: Lablup leverages VAST DataEngine and DataSpace to unify data and compute management across shared GPU clusters. Tight integration between VAST’s quota model and Backend.AI’s user-based access controls ensures secure isolation between tenants while maintaining predictable performance for every user. This architecture allows Lablup to scale compute and storage independently while providing consistent service to government, academic, and enterprise researchers alike.
Future-Ready for Inference and Agentic AI: As the Sovereign AI initiative transitions from development to deployment, Backend.AI’s integration with the VAST AI Operating System provides a foundation for inference and future agentic AI workloads. With VAST’s DataEngine enabling real-time orchestration and reasoning across massive datasets, Lablup’s customers can move beyond static model training toward dynamic, intelligent systems that continuously learn, adapt, and serve.
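As an illustration of the direct GPU-to-storage data path described in the list above, the following sketch uses NVIDIA’s KvikIO library (Python bindings for cuFile/GPUDirect Storage). The file path is hypothetical, and whether the transfer actually bypasses host memory depends on how GDS and the storage client are configured on the cluster:

```python
# Minimal sketch: reading a training shard straight into GPU memory with
# NVIDIA's KvikIO (cuFile / GPUDirect Storage bindings). The path is a
# hypothetical example; without GDS configured, KvikIO falls back to a
# host-bounced read.
import os
import cupy
import kvikio

SHARD = "/mnt/vast/datasets/korean-corpus/shard_00000.bin"  # hypothetical path

nbytes = os.path.getsize(SHARD)
buf = cupy.empty(nbytes, dtype=cupy.uint8)                  # GPU destination buffer

f = kvikio.CuFile(SHARD, "r")
read = f.read(buf)                                          # read into GPU memory
f.close()
print(f"read {read} of {nbytes} bytes into GPU memory")
```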
Within Lablup’s own infrastructure, the VAST AI Operating System supports AI infrastructure development, optimization testing, and model service orchestration. This integration allows engineers to test and deploy new AI models more efficiently, switching model weights in real time without downtime. Through joint engineering efforts, VAST and Lablup resolved early latency challenges, resulting in a stable, high-throughput system that scales seamlessly for both training and research operations.
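The weight-swapping pattern can be sketched as follows (an illustrative pattern only, not Lablup’s actual implementation; paths and class names are hypothetical): a serving process watches the shared checkpoint directory and loads newer weights into the live model without restarting it.

```python
# Minimal sketch: hot-swapping model weights from a shared checkpoint
# directory without restarting the serving process. Illustrative only;
# paths and names are hypothetical.
import os
import threading
import time
import torch

CKPT_DIR = "/mnt/vast/checkpoints/foundation-model"         # hypothetical path

class HotSwapModel:
    def __init__(self, model: torch.nn.Module):
        self.model = model
        self._lock = threading.Lock()
        self._loaded = None                                 # name of the active checkpoint

    def maybe_reload(self):
        """Load the newest checkpoint if it differs from the one in memory."""
        ckpts = sorted(f for f in os.listdir(CKPT_DIR) if f.endswith(".pt"))
        if not ckpts or ckpts[-1] == self._loaded:
            return
        state = torch.load(os.path.join(CKPT_DIR, ckpts[-1]), map_location="cpu")
        with self._lock:                                    # requests wait only during the swap
            self.model.load_state_dict(state["model"])
            self._loaded = ckpts[-1]

    def predict(self, batch):
        with self._lock, torch.no_grad():
            return self.model(batch)

def watch(hot: HotSwapModel, interval_s: float = 30.0):
    """Background loop that polls the shared namespace for new weights."""
    while True:
        hot.maybe_reload()
        time.sleep(interval_s)

# Usage (hypothetical): wrap the serving model and start the watcher thread.
# hot = HotSwapModel(build_model())
# threading.Thread(target=watch, args=(hot,), daemon=True).start()
```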
Within the Korean Foundation Model Consortium, led by Upstage and supported by other national AI research partners, Backend.AI powered by the VAST AI Operating System processes massive-scale corpora for large-model pre-training and fine-tuning, advancing the development of frontier-grade, sovereign AI capabilities for Korea.