{"id":2391,"date":"2025-08-19T15:22:18","date_gmt":"2025-08-19T15:22:18","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/08\/19\/building-ai-agents-with-docker-mcp-toolkit-a-developers-real-world-setup\/"},"modified":"2025-08-19T15:22:18","modified_gmt":"2025-08-19T15:22:18","slug":"building-ai-agents-with-docker-mcp-toolkit-a-developers-real-world-setup","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/08\/19\/building-ai-agents-with-docker-mcp-toolkit-a-developers-real-world-setup\/","title":{"rendered":"Building AI Agents with Docker MCP Toolkit: A Developer\u2019s Real-World Setup"},"content":{"rendered":"<p>Building AI agents in the real world often involves more than just making model calls \u2014 it requires integrating with external tools, handling complex workflows, and ensuring the solution can scale in production.<\/p>\n<p>In this post, we\u2019ll walk through a real-world developer setup for creating an agent using the Docker MCP Toolkit.<\/p>\n<p>To make things concrete, I\u2019ve built an agent that takes a Git repository as input and can answer questions about its contents \u2014 whether it\u2019s explaining the purpose of a function, summarizing a module, or finding where a specific API call is made. This simple but practical use case serves as a foundation for exploring how agents can interact with real-world data sources and respond intelligently.<\/p>\n<p>I built and ran it using the Docker MCP Toolkit, which made setup and integration fast, portable, and repeatable. 
This blog walks you through that developer setup and explains why Docker MCP is a game changer for building and running agents.<\/p>\n<h2 class=\"wp-block-heading\">Use Case: GitHub Repo Question-Answering Agent<\/h2>\n<p>The goal: build an AI agent that can connect to a GitHub repository, retrieve relevant code or metadata, and answer developer questions in plain language.<\/p>\n<p>Example queries:<\/p>\n<ul>\n<li>\u201cSummarize this repo: https:\/\/github.com\/owner\/repo\u201d<\/li>\n<li>\u201cWhere is the authentication logic implemented?\u201d<\/li>\n<li>\u201cList the main modules and their purpose.\u201d<\/li>\n<li>\u201cExplain the function parse_config and show where it\u2019s used.\u201d<\/li>\n<\/ul>\n<p><strong>This goes beyond a simple code demo \u2014 it reflects how developers work in real-world environments.<\/strong><\/p>\n<ul>\n<li>The agent acts like a code-aware teammate you can query anytime.<\/li>\n<li>The MCP Gateway handles tooling integration (the GitHub API) without bloating the agent code.<\/li>\n<li>Docker Compose ties the environment together so it runs the same in dev, staging, or production.<\/li>\n<\/ul>\n<h2 class=\"wp-block-heading\">Role of Docker MCP Toolkit<\/h2>\n<p>Without the MCP Toolkit, you\u2019d spend hours wiring up API SDKs, managing auth tokens, and troubleshooting environment differences.<\/p>\n<p>With the MCP Toolkit:<\/p>\n<ul>\n<li><strong>Containerized connectors<\/strong> \u2013 Run the GitHub MCP Gateway as a ready-made service (docker\/mcp-gateway:latest); no SDK setup required.<\/li>\n<li><strong>Consistent environments<\/strong> \u2013 The container image has fixed dependencies, so the setup works identically for every team member.<\/li>\n<li><strong>Rapid integration<\/strong> \u2013 The agent connects to the gateway over HTTP; adding a new tool is as simple as adding a new container.<\/li>\n<li><strong>Faster iteration<\/strong> \u2013 Restart or swap services in seconds using docker compose.<\/li>\n<li><strong>Focus on logic, not plumbing<\/strong> \u2013 The gateway handles the GitHub-specific heavy lifting while you focus on prompt design, reasoning, and multi-agent orchestration.<\/li>\n<\/ul>\n<h2 class=\"wp-block-heading\">Role of Docker Compose<\/h2>\n<p>Running everything via Docker Compose means you treat the entire agent environment as a single deployable unit:<\/p>\n<ul>\n<li><strong>One-command startup<\/strong> \u2013 docker compose up brings up the MCP Gateway (and your agent, if containerized) together.<\/li>\n<li><strong>Service orchestration<\/strong> \u2013 Compose ensures dependencies start in the right order.<\/li>\n<li><strong>Internal networking<\/strong> \u2013 Services talk to each other by name (http:\/\/mcp-gateway-github:8080) without manual port wrangling.<\/li>\n<li><strong>Scaling<\/strong> \u2013 Run multiple agent instances to handle concurrent requests.<\/li>\n<li><strong>Unified logging<\/strong> \u2013 View all logs in one place for easier debugging.<\/li>\n<\/ul>\n<h2 class=\"wp-block-heading\">Architecture Overview<\/h2>\n<div class=\"wp-block-ponyo-image\"><\/div>\n<p>This setup connects a developer\u2019s local agent to GitHub through a Dockerized MCP Gateway, with Docker Compose orchestrating the environment. 
Here\u2019s how it works, step by step:<\/p>\n<ol>\n<li><strong>User interaction<\/strong> \u2013 The developer runs the agent from a CLI or terminal and types a question about a GitHub repository \u2014 e.g., \u201cWhere is the authentication logic implemented?\u201d<\/li>\n<li><strong>Agent processing<\/strong> \u2013 The agent (LLM + MCPTools) receives the question, determines that it needs repository data, and issues a tool call via MCPTools.<\/li>\n<li><strong>MCPTools \u2192 MCP Gateway<\/strong> \u2013 MCPTools sends the request over streamable-http to the MCP Gateway running in Docker. This gateway is defined in docker-compose.yml and configured for the GitHub server (--servers=github --port=8080).<\/li>\n<li><strong>GitHub integration<\/strong> \u2013 The MCP Gateway handles all GitHub API interactions \u2014 listing files, retrieving content, searching code \u2014 and returns structured results to the agent.<\/li>\n<li><strong>LLM reasoning<\/strong> \u2013 The agent sends the retrieved GitHub context to OpenAI GPT-4o as part of a prompt. The LLM reasons over the data and generates a clear, context-rich answer.<\/li>\n<li><strong>Response to user<\/strong> \u2013 The agent prints the final answer back to the CLI, often with file names and line references.<\/li>\n<\/ol>\n<h2 class=\"wp-block-heading\">Code Reference &amp; File Roles<\/h2>\n<p>The detailed source code for this setup is available at this <a href=\"https:\/\/github.com\/rajeshsgr\/mcp-demo-agents\/tree\/main\" target=\"_blank\">link<\/a>.<\/p>\n<p>Rather than walk through it line by line, here\u2019s what each file does in the real-world developer setup:<\/p>\n<h3 class=\"wp-block-heading\"><strong>docker-compose.yml<\/strong><\/h3>\n<ul>\n<li>Defines the MCP Gateway service for GitHub.<\/li>\n<li>Runs the docker\/mcp-gateway:latest container with GitHub as the configured server.<\/li>\n<li>Exposes the gateway on port 8080.<\/li>\n<li>Can be extended to run the agent and additional connectors as separate services on the same network.<\/li>\n<\/ul>\n<h3 
class=\"wp-block-heading\"><strong>app.py<\/strong><\/h3>\n<p>Implements the GitHub Repo Summarizer Agent.<\/p>\n<p>Uses MCPTools to connect to the MCP Gateway over streamable-http.<\/p>\n<p>Sends queries to GitHub via the gateway, retrieves results, and passes them to GPT-4o for reasoning.<\/p>\n<p>Handles the interactive CLI loop so you can type questions and get real-time responses.<\/p>\n<p>In short: the Compose file manages <em>infrastructure and orchestration<\/em>, while the Python script handles <em>intelligence and conversation<\/em>.<\/p>\n<h2 class=\"wp-block-heading\">Setup and Execution<\/h2>\n<p><strong>Clone the repository\u00a0<\/strong><\/p>\n<p>git clone https:\/\/github.com\/rajeshsgr\/mcp-demo-agents\/tree\/main<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\ncd mcp-demo-agents\n<\/div>\n<p><strong>Configure environment<\/strong><\/p>\n<p>Create a .env file in the root directory and add your OpenAI API key:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\nOPEN_AI_KEY = &lt;&lt;Insert your Open AI Key&gt;&gt;\n<\/div>\n<p><strong>Configure GitHub Access<\/strong><\/p>\n<p>To allow the MCP Gateway to access GitHub repositories, set your GitHub personal access token:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\ndocker mcp secret set github.personal_access_token=&lt;YOUR_GITHUB_TOKEN&gt;\n<\/div>\n<p><strong>Start MCP Gateway<\/strong><\/p>\n<p>Bring up the GitHub MCP Gateway container using Docker Compose:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\ndocker compose up -d\n<\/div>\n<p><strong>Install Dependencies &amp; Run Agent<\/strong><\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\npython -m venv .venv &amp;&amp; source .venv\/bin\/activate<br \/>\npip install -r requirements.txt<br \/>\npython app.py\n<\/div>\n<p><strong>Ask Queries<\/strong><\/p>\n<p>Enter your query: Summarize https:\/\/github.com\/owner\/repo<\/p>\n<h2 class=\"wp-block-heading\"><strong>Real-World Agent Development with Docker, 
MCP, and Compose<\/strong><\/h2>\n<p>This setup is built with production realities in mind:<\/p>\n<ul>\n<li><strong>Docker<\/strong> ensures each integration (GitHub, databases, APIs) runs in its own isolated container with all dependencies preconfigured.<\/li>\n<li><strong>MCP<\/strong> acts as the bridge between your agent and real-world tools, abstracting away API complexity so your agent code stays clean and focused on reasoning.<\/li>\n<li><strong>Docker Compose<\/strong> orchestrates all these moving parts, managing startup order, networking, scaling, and environment parity between development, staging, and production.<\/li>\n<\/ul>\n<p><strong>From here, it\u2019s easy to add:<\/strong><\/p>\n<ul>\n<li>More MCP connectors (Jira, Slack, internal APIs).<\/li>\n<li>Multiple agents specializing in different tasks.<\/li>\n<li>CI\/CD pipelines that spin up this environment for automated testing.<\/li>\n<\/ul>\n<h2 class=\"wp-block-heading\"><strong>Final Thoughts<\/strong><\/h2>\n<p>By combining <strong>Docker<\/strong> for isolation, <strong>MCP<\/strong> for seamless tool integration, and <strong>Docker Compose<\/strong> for orchestration, we\u2019ve built more than just a working AI agent \u2014 we\u2019ve created a repeatable, production-ready development pattern. This approach removes environment drift, accelerates iteration, and makes it simple to add new capabilities without disrupting existing workflows. Whether you\u2019re experimenting locally or deploying at scale, this setup ensures your agents are reliable, maintainable, and ready to handle real-world demands from day one.<\/p>\n<h2 class=\"wp-block-heading\"><strong>Before vs. 
After: The Developer Experience<\/strong><\/h2>\n<div class=\"wp-block-ponyo-table style__default\">\n<table>\n<thead>\n<tr><th>Aspect<\/th><th>Without Docker + MCP + Compose<\/th><th>With Docker + MCP + Compose<\/th><\/tr>\n<\/thead>\n<tbody>\n<tr><td><strong>Environment Setup<\/strong><\/td><td>Manual SDK installs, dependency conflicts, \u201cworks on my machine\u201d issues.<\/td><td>Prebuilt container images with fixed dependencies ensure identical environments everywhere.<\/td><\/tr>\n<tr><td><strong>Integration with Tools (GitHub, Jira, etc.)<\/strong><\/td><td>Custom API wiring in the agent code; high maintenance overhead.<\/td><td>MCP handles integrations in separate containers; agent code stays clean and focused.<\/td><\/tr>\n<tr><td><strong>Startup Process<\/strong><\/td><td>Multiple scripts\/terminals; manual service ordering.<\/td><td>docker compose up launches and orchestrates all services in the right order.<\/td><\/tr>\n<tr><td><strong>Networking<\/strong><\/td><td>Manually configuring ports and URLs; prone to errors.<\/td><td>Internal Docker network with service name resolution (e.g., http:\/\/mcp-gateway-github:8080).<\/td><\/tr>\n<tr><td><strong>Scalability<\/strong><\/td><td>Scaling services requires custom scripts and reconfigurations.<\/td><td>Scale any service instantly with docker compose up --scale.<\/td><\/tr>\n<tr><td><strong>Extensibility<\/strong><\/td><td>Adding a new integration means changing the agent\u2019s code and redeploying.<\/td><td>Add new MCP containers to docker-compose.yml without modifying the agent.<\/td><\/tr>\n<tr><td><strong>CI\/CD Integration<\/strong><\/td><td>Hard to replicate environments in pipelines; brittle builds.<\/td><td>Same Compose file works locally, in staging, and in CI\/CD pipelines.<\/td><\/tr>\n<tr><td><strong>Iteration Speed<\/strong><\/td><td>Restarting services or switching configs is slow and error-prone.<\/td><td>Containers can be stopped, replaced, and restarted in seconds.<\/td><\/tr>\n<\/tbody>\n<\/table>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Building AI agents in the real world 
often involves more than just making model calls \u2014 it requires integrating with [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[4],"tags":[],"class_list":["post-2391","post","type-post","status-publish","format-standard","hentry","category-docker"],"_links":{"self":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2391","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/comments?post=2391"}],"version-history":[{"count":0,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2391\/revisions"}],"wp:attachment":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media?parent=2391"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/categories?post=2391"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/tags?post=2391"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}