{"id":2224,"date":"2025-07-10T10:34:03","date_gmt":"2025-07-10T10:34:03","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/07\/10\/docker-brings-compose-to-the-agent-era-building-ai-agents-is-now-easy\/"},"modified":"2025-07-10T10:34:03","modified_gmt":"2025-07-10T10:34:03","slug":"docker-brings-compose-to-the-agent-era-building-ai-agents-is-now-easy","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/07\/10\/docker-brings-compose-to-the-agent-era-building-ai-agents-is-now-easy\/","title":{"rendered":"Docker Brings Compose to the Agent Era: Building AI Agents is Now Easy"},"content":{"rendered":"<p>Agents are the future, and if you haven\u2019t already started building agents, you probably will soon. Across industries and use cases, agents can act on our behalf, and offload repetitive work, because they can act on our behalf with judgment and context.\u00a0<\/p>\n<p>But while agentic development is moving fast, today it\u2019s tedious, hard, and not fun: you need to quickly iterate with different prompts and models (both frontier models and local\/open models), you need to find and connect MCP tools to internal data securely, and you need to declaratively package everything so that others can run your agent. And you need this to be built once, and run anywhere: on your laptop, in CI, or in production.<\/p>\n<p>These problems are not new: they are what Docker was originally conceived for. It\u2019s not an overstatement to say that once upon a time, Docker made microservices possible, and today we\u2019re excited to share how we\u2019re evolving Docker for the era of agents.<\/p>\n<h2 class=\"wp-block-heading\">Launching today: Compose enters the agent era<\/h2>\n<p>Starting today, Docker makes it easy to build, ship, and run agents and agentic applications. Docker Compose launched a decade ago, and solved the problem of how to build and describe multi-container applications. 
It\u2019s used and loved by millions of developers every day, which is why we\u2019re excited to announce that Compose now includes agent building blocks.<\/p>\n<p>Now, with just a compose.yaml, you can define your open models, agents, and MCP-compatible tools, then spin up your full agentic stack with a simple docker compose up. From dev to production (more on this later), your agents are wired, connected, and ready to run.\u00a0<\/p>\n<p>And that\u2019s not all: Compose is also seamlessly integrated with today\u2019s most popular agentic frameworks:<\/p>\n<p><strong>LangGraph<\/strong> \u2013 Define your LangGraph workflow, wrap it as a service, plug it into compose.yaml, and run the full graph with docker compose up. <a href=\"https:\/\/github.com\/docker\/compose-agents-demo\/tree\/main\/langgraph\" target=\"_blank\">Try the LangGraph tutorial<\/a>.<\/p>\n<p><strong>Embabel<\/strong> \u2013 Use Compose to connect models, embed tools, and get a complete Embabel environment running. <a href=\"https:\/\/github.com\/embabel\/tripper\" target=\"_blank\">Explore the quickstart guide<\/a>.<\/p>\n<p><strong>Vercel AI SDK<\/strong> \u2013 Compose makes it easy to stand up supporting agents and services locally. <a href=\"https:\/\/github.com\/slimslenderslacks\/scira-mcp-chat\" target=\"_blank\">Check out the Vercel AI examples<\/a>.<\/p>\n<p><strong>Spring AI<\/strong> \u2013 Use Compose to spin up vector stores, model endpoints, and agents alongside your Spring AI backend. <a href=\"https:\/\/github.com\/spring-projects\/spring-ai-examples\/commit\/37b0030c5380846d3726451e0b743c7f76891320\" target=\"_blank\">View the Spring AI samples<\/a>.<\/p>\n<p><strong>CrewAI<\/strong> \u2013 Compose lets you containerize CrewAI agents. 
<a href=\"https:\/\/github.com\/docker\/compose-agents-demo\/tree\/main\/crew-ai\" target=\"_blank\">Try the CrewAI Getting Started guide<\/a>.<\/p>\n<p><strong>Google\u2019s ADK<\/strong> \u2013 Easily deploy your ADK-based agent stack with Docker Compose- agents, tools, and routing layers all defined in a single file. <a href=\"https:\/\/github.com\/docker\/compose-agents-demo\/tree\/main\/adk\" target=\"_blank\">Try our example.<\/a>\u00a0<\/p>\n<p><strong>Agno<\/strong> \u2013 Use Compose to run your Agno-based agents and tools effortlessly. <a href=\"https:\/\/github.com\/docker\/compose-for-agents\/tree\/main\/agno\" target=\"_blank\">Explore the Agno example<\/a>.<\/p>\n<p>But the power of the new Docker Compose goes beyond SDKs: it\u2019s deeply integrated with Docker\u2019s broader suite of AI features.<\/p>\n<p>Docker\u2019s <a href=\"https:\/\/www.docker.com\/products\/mcp-catalog-and-toolkit\/\">MCP Catalog<\/a> gives you instant access to a growing library of trusted, plug-and-play tools for your agents. No need to dig through repos, worry about compatibility, or wire things up manually. Just drop what you need into your Compose file and you\u2019re up and running. <\/p>\n<p>Docker <a href=\"https:\/\/www.docker.com\/products\/model-runner\/\">Model Runner<\/a> lets you pull open-weight LLMs directly from Docker Hub, run them locally, and interact with them via built-in OpenAI-compatible endpoints, so your existing SDKs and libraries just work, no rewrites, no retooling. And they run with full GPU acceleration. But what if you don\u2019t have enough local resources?<\/p>\n<h2 class=\"wp-block-heading\">Introducing Docker Offload: Cloud power, local simplicity<\/h2>\n<p>When building agents, local resource limits shouldn\u2019t slow you down. 
That\u2019s why we\u2019re <a href=\"https:\/\/www.docker.com\/products\/docker-offload\/\">introducing Docker Offload<\/a>, a truly seamless way to run your models and containers on a cloud GPU.<\/p>\n<p>Docker Offload frees you from infrastructure constraints by offloading compute-intensive workloads, like large language models and multi-agent orchestration, to high-performance cloud environments. No complex setup, no GPU shortages, no configuration headaches.<\/p>\n<p>With native integration into Docker Desktop and Docker Engine, Docker Offload gives you a one-click path from Compose to cloud. Build, test, and scale your agentic applications just like you always have locally, while Docker handles the heavy lifting behind the scenes. It\u2019s the same simple docker compose up experience, now supercharged with the power of the cloud.<\/p>\n<p>And to get you started, we\u2019re offering <a href=\"https:\/\/www.docker.com\/products\/docker-offload\/#earlyaccess\">300 minutes of free Offload usage<\/a>. Try it out, build your agents, and scale effortlessly from your laptop to the cloud.<\/p>\n<h2 class=\"wp-block-heading\">Compose is now production-ready with Google Cloud and Microsoft Azure<\/h2>\n<p>Last, but certainly not least, we\u2019ve worked hard to make sure that the exact same Compose file you used during development works in production, with no rewrites and no reconfiguration.<\/p>\n<p>We\u2019re proud to announce new integrations with Google Cloud Run and Microsoft Azure Container Apps Service that allow Docker Compose to specify a serverless architecture. For example, with Google Cloud, you can deploy your agentic app directly to a serverless environment using the new gcloud run compose up command. 
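<\/p>\n<p>As an illustration, here is a minimal sketch of the kind of Compose file that can stay the same from laptop to cloud. The service names, model, and gateway settings below are placeholders for this example, not taken from the announcement; see the linked tutorials for working configurations:<\/p>\n<pre class=\"wp-block-code\"><code># compose.yaml \u2013 illustrative agentic stack (names are placeholders)\nmodels:\n  llm:\n    # an open-weight model served locally by Docker Model Runner\n    model: ai\/qwen3\n\nservices:\n  agent:\n    build: .            # your agent code (e.g. a LangGraph service)\n    models:\n      - llm             # wires OpenAI-compatible endpoint details into the service\n    depends_on:\n      - mcp-gateway\n\n  mcp-gateway:\n    # MCP tools drawn from the Docker MCP Catalog\n    image: docker\/mcp-gateway\n    command: --servers=duckduckgo<\/code><\/pre>\n<p>Locally, docker compose up brings a stack like this up; in the cloud integrations, the same file serves as the deployment spec. 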
And we\u2019re working closely with Microsoft to bring this seamless experience to Azure as well.<\/p>\n<p>From the first line of YAML to production deployment, Compose makes the entire journey consistent, portable, and effortless, just the way agentic development should be.<\/p>\n<h3 class=\"wp-block-heading\">Let\u2019s Compose the future. Together.<\/h3>\n<p>The future of software is agentic, where every developer builds goal-driven, multi-LLM agents that reason, plan, and act across a rich ecosystem of tools and services.\u00a0<\/p>\n<p>With Docker Compose, Docker Offload, Docker\u2019s broader AI capabilities, and our partnerships with Google, Microsoft, and agent SDK providers, we\u2019re making that future accessible to, and easy for, everyone.\u00a0<\/p>\n<p>In short: Docker is the easiest way to build, run, and scale intelligent agents, from development to production.<\/p>\n<p>We can\u2019t wait to see what you create.<\/p>\n<h3 class=\"wp-block-heading\">Resources<\/h3>\n<p><a href=\"https:\/\/www.docker.com\/solutions\/docker-ai\/\">Docker is simplifying Agent Development<\/a><\/p>\n<p>Explore the capabilities of <a href=\"https:\/\/www.docker.com\/products\/docker-offload\/\">Docker Offload<\/a><\/p>\n<p>Learn more about our AI agent: <a href=\"https:\/\/docs.docker.com\/ai\/gordon\/\" target=\"_blank\">Ask Gordon<\/a><\/p>\n<p><a href=\"https:\/\/docs.docker.com\/guides\/agentic-ai\/\" target=\"_blank\">Build Agentic Apps with Docker Compose<\/a><\/p>\n<p>Learn more about <a href=\"https:\/\/www.docker.com\/products\/model-runner\/\">Docker Model Runner<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>Agents are the future, and if you haven\u2019t already started building agents, you probably will soon. 
Across industries and use [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[4],"tags":[],"class_list":["post-2224","post","type-post","status-publish","format-standard","hentry","category-docker"],"_links":{"self":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2224","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/comments?post=2224"}],"version-history":[{"count":0,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2224\/revisions"}],"wp:attachment":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media?parent=2224"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/categories?post=2224"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/tags?post=2224"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}