{"id":3304,"date":"2026-01-26T13:16:58","date_gmt":"2026-01-26T13:16:58","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2026\/01\/26\/run-claude-code-locally-with-docker-model-runner\/"},"modified":"2026-01-26T13:16:58","modified_gmt":"2026-01-26T13:16:58","slug":"run-claude-code-locally-with-docker-model-runner","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2026\/01\/26\/run-claude-code-locally-with-docker-model-runner\/","title":{"rendered":"Run Claude Code Locally with Docker Model Runner"},"content":{"rendered":"<p>We recently showed how to pair <a href=\"https:\/\/www.docker.com\/blog\/opencode-docker-model-runner-private-ai-coding\/\">OpenCode with Docker Model Runner<\/a> for a privacy-first, cost-effective AI coding setup. Today, we\u2019re bringing the same approach to <a href=\"https:\/\/claude.com\/product\/claude-code\" rel=\"nofollow noopener\" target=\"_blank\"><strong>Claude Code<\/strong><\/a>, Anthropic\u2019s agentic coding tool.<\/p>\n<p>This post walks through how to configure Claude Code to use Docker Model Runner, giving you full control over your data, infrastructure, and spend.<\/p>\n<div class=\"wp-block-ponyo-image\">\n                <img data-opt-id=1527100876  fetchpriority=\"high\" decoding=\"async\" width=\"1000\" height=\"616\" src=\"https:\/\/www.docker.com\/app\/uploads\/2026\/01\/Claude-code-DMR-figure-1.png\" class=\"fade-in attachment-full size-full\" alt=\"Claude code DMR figure 1\" title=\"- Claude code DMR figure 1\" \/>\n        <\/div>\n<p class=\"has-xs-font-size\">Figure 1: Using local models like gpt-oss to power Claude Code<\/p>\n<h3 class=\"wp-block-heading\">What Is Claude Code?<\/h3>\n<p><a href=\"https:\/\/claude.com\/product\/claude-code\" rel=\"nofollow noopener\" target=\"_blank\"><strong>Claude Code<\/strong><\/a> is Anthropic\u2019s command-line tool for agentic coding. 
It lives in your terminal, understands your codebase, and helps you code faster by executing routine tasks, explaining complex code, and handling git workflows through natural language commands.<\/p>\n<p><a href=\"https:\/\/docs.docker.com\/ai\/model-runner\/\" rel=\"nofollow noopener\" target=\"_blank\"><strong>Docker Model Runner (DMR)<\/strong><\/a> allows you to run and manage large language models locally. It exposes an Anthropic-compatible API, making it straightforward to integrate with tools like Claude Code.<\/p>\n<h3 class=\"wp-block-heading\">Install Claude Code<\/h3>\n<p>Install <a href=\"https:\/\/claude.com\/product\/claude-code\" rel=\"nofollow noopener\" target=\"_blank\">Claude Code<\/a>:<br \/><strong>macOS \/ Linux:<\/strong><\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\ncurl -fsSL https:\/\/claude.ai\/install.sh | bash\n<\/pre>\n<\/div>\n<p><strong>Windows PowerShell:<\/strong><\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\nirm https:\/\/claude.ai\/install.ps1 | iex\n\n<\/pre>\n<\/div>\n<h2 class=\"wp-block-heading\"><strong>Using Claude Code with Docker Model Runner<\/strong><\/h2>\n<p>Claude Code supports custom API endpoints through the ANTHROPIC_BASE_URL environment variable. 
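<\/p>
<p>As a minimal sketch of that pattern (assuming Docker Model Runner\u2019s default TCP port of 12434, which the setup below uses), the variable is simply prefixed to the claude invocation:<\/p>

```shell
# Sketch: point Claude Code at a local Docker Model Runner endpoint.
# The URL and model tag are the ones used throughout this post; adjust
# them if your DMR instance listens on a different host or port.
DMR_URL="http://localhost:12434"
MODEL="gpt-oss:32k"

# The invocation Claude Code needs (printed here rather than executed):
CMD="ANTHROPIC_BASE_URL=$DMR_URL claude --model $MODEL"
echo "$CMD"
```

<p>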
Since Docker Model Runner exposes an Anthropic-compatible API, integrating the two is simple.<\/p>\n<div class=\"style-plain wp-block-ponyo-houston\">\n<p><em>Note for Docker Desktop users:<\/em><br \/><em>If you are running Docker Model Runner via Docker Desktop, make sure TCP access is enabled:<\/em><\/p>\n<\/div>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\ndocker desktop enable model-runner --tcp\n\n<\/pre>\n<\/div>\n<p><em>Once enabled, Docker Model Runner will be accessible at <\/em><a href=\"http:\/\/localhost:12434\/\" rel=\"nofollow noopener\" target=\"_blank\"><em>http:\/\/localhost:12434<\/em><\/a><em>.<\/em><\/p>\n<h3 class=\"wp-block-heading\">Increasing Context Size<\/h3>\n<p>For coding tasks, context length matters. While models like <a href=\"https:\/\/hub.docker.com\/r\/ai\/glm-4.7-flash\" rel=\"nofollow noopener\" target=\"_blank\"><strong>glm-4.7-flash<\/strong><\/a>, <a href=\"https:\/\/hub.docker.com\/r\/ai\/qwen3-coder\" rel=\"nofollow noopener\" target=\"_blank\"><strong>qwen3-coder<\/strong><\/a>, and <a href=\"https:\/\/hub.docker.com\/r\/ai\/devstral-small-2\" rel=\"nofollow noopener\" target=\"_blank\"><strong>devstral-small-2<\/strong><\/a> come with 128K context by default, <a href=\"https:\/\/hub.docker.com\/r\/ai\/gpt-oss\" rel=\"nofollow noopener\" target=\"_blank\"><strong>gpt-oss<\/strong><\/a> defaults to 4,096 tokens.<\/p>\n<p>Docker Model Runner makes it easy to repackage any model with an increased context size:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\ndocker model pull gpt-oss\ndocker model package --from gpt-oss --context-size 32000 gpt-oss:32k\n<\/pre>\n<\/div>\n<p>Once packaged, use it with Claude Code:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\nANTHROPIC_BASE_URL=http:\/\/localhost:12434 claude --model gpt-oss:32k\n<\/pre>\n<\/div>\n<p>Or pass a prompt directly as a one-off command:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\nANTHROPIC_BASE_URL=http:\/\/localhost:12434 claude --model gpt-oss:32k \"Describe this repo.\"\n<\/pre>\n<\/div>\n<p>That\u2019s it. Claude Code will now send all requests to your local Docker Model Runner instance.<\/p>\n<h2 class=\"wp-block-heading\"><strong>Run Claude Code locally with gpt-oss using Docker Model Runner<\/strong><\/h2>\n<p>Here\u2019s what it looks like in action:<\/p>\n<div class=\"wp-block-ponyo-image\">\n                <img data-opt-id=1723383371  fetchpriority=\"high\" decoding=\"async\" width=\"1000\" height=\"615\" src=\"https:\/\/www.docker.com\/app\/uploads\/2026\/01\/Claude-code-DMR-figure-2.png\" class=\"fade-in attachment-full size-full\" alt=\"Claude code DMR figure 2\" title=\"- Claude code DMR figure 2\" \/>\n        <\/div>\n<p class=\"has-xs-font-size\">Figure 2: Claude Code powered by Docker Model Runner keeps everything local and in your control<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: plain; gutter: false; title: ; notranslate\">\n$ docker model pull gpt-oss\n\n$ docker model package --from gpt-oss --context-size 32000 gpt-oss:32k\n\n$ ANTHROPIC_BASE_URL=http:\/\/localhost:12434 claude --model gpt-oss:32k \"Describe this repo.\"\n\n \u2590\u259b\u2588\u2588\u2588\u259c\u258c   Claude Code v2.1.12\n\u259d\u259c\u2588\u2588\u2588\u2588\u2588\u259b\u2598  gpt-oss:32k \u00b7 API Usage Billing\n  \u2598\u2598 \u259d\u259d    ~\/workspace\/model-runner\n\n\u276f Describe this repo.\n\n\u23fa Read(README.md)\n  \u23bf  Read 423 lines\n\n\u23fa Docker Model Runner (DMR)\n  - A Go-based tool that lets developers pull, run, and serve AI models \n    directly from Docker Hub or any OCI-compatible registry.\n  - Consists of two main components:\n    a. model-runner \u2013 a server\/daemon that manages model lifecycle, \n       inference, and metrics.\n    b. model-cli \u2013 a CLI plugin (docker model) that communicates with \n       the server, listing models, pulling new ones, and executing inference.\n\n  - Supports multiple inference backends: llama.cpp, vLLM, and others.\n  - Built with Docker Desktop in mind, but also works with Docker Engine \n    on Linux.\n  - Provides a REST API for model management and inference, plus a \n    \/metrics endpoint for Prometheus-style monitoring.\n\n<\/pre>\n<\/div>\n<p>Claude Code reads your repository, reasons about its structure, and provides an accurate summary, all while keeping your code entirely on your local machine.<\/p>\n<h2 class=\"wp-block-heading\"><strong>Monitor the requests sent by Claude Code<\/strong><\/h2>\n<p>Want to see exactly what Claude Code sends to Docker Model Runner? 
Use the docker model requests command:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\ndocker model requests --model gpt-oss:32k | jq .\n<\/pre>\n<\/div>\n<div class=\"wp-block-ponyo-image\">\n                <img data-opt-id=1938055853  decoding=\"async\" width=\"1000\" height=\"615\" src=\"https:\/\/www.docker.com\/app\/uploads\/2026\/01\/Claude-code-DMR-figure-3.png\" class=\"fade-in attachment-full size-full\" alt=\"Claude code DMR figure 3\" title=\"- Claude code DMR figure 3\" \/>\n        <\/div>\n<p class=\"has-xs-font-size\">Figure 3: Monitor requests sent by Claude Code to the LLM<\/p>\n<p>This outputs the raw requests, which is useful for understanding how Claude Code communicates with the model and debugging any compatibility issues.<\/p>\n<h3 class=\"wp-block-heading\">Making It Persistent<\/h3>\n<p>For convenience, set the environment variable in your shell profile:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\n# Add to ~\/.bashrc, ~\/.zshrc, or equivalent\nexport ANTHROPIC_BASE_URL=http:\/\/localhost:12434\n<\/pre>\n<\/div>\n<p>Then simply run:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n<pre class=\"brush: bash; gutter: false; title: ; notranslate\">\nclaude --model gpt-oss:32k \"Describe this repo.\"\n\n<\/pre>\n<\/div>\n<h3 class=\"wp-block-heading\">How You Can Get Involved<\/h3>\n<p>The strength of Docker Model Runner lies in its community, and there\u2019s always room to grow. 
To get involved:<\/p>\n<ul class=\"wp-block-list\">\n<li><strong>Star the repository:<\/strong> Show your support by starring the<a href=\"https:\/\/github.com\/docker\/model-runner\" rel=\"nofollow noopener\" target=\"_blank\"> Docker Model Runner repo<\/a>.<\/li>\n<li><strong>Contribute your ideas:<\/strong> Create an issue or submit a pull request. We\u2019re excited to see what ideas you have!<\/li>\n<li><strong>Spread the word:<\/strong> Tell your friends and colleagues who might be interested in running AI models with Docker.<\/li>\n<\/ul>\n<p>We\u2019re incredibly excited about this new chapter for Docker Model Runner, and we can\u2019t wait to see what we can build together. Let\u2019s get to work!<\/p>\n<h3 class=\"wp-block-heading\">Learn More<\/h3>\n<ul class=\"wp-block-list\">\n<li>Read the companion post: <a href=\"https:\/\/www.docker.com\/blog\/opencode-docker-model-runner-private-ai-coding\/\">OpenCode with Docker Model Runner for Private AI Coding<\/a><\/li>\n<li>Check out the Docker Model Runner General Availability<a href=\"https:\/\/www.docker.com\/blog\/announcing-docker-model-runner-ga\/\"> announcement<\/a><\/li>\n<li>Visit our <a href=\"https:\/\/github.com\/docker\/model-runner\" rel=\"nofollow noopener\" target=\"_blank\">Model Runner GitHub repo<\/a><\/li>\n<li>Get started with a simple<a href=\"https:\/\/github.com\/docker\/hello-genai\" rel=\"nofollow noopener\" target=\"_blank\"> hello GenAI application<\/a><\/li>\n<li>Learn more about <a href=\"https:\/\/claude.com\/product\/claude-code\" rel=\"nofollow noopener\" target=\"_blank\">Claude Code<\/a> from Anthropic\u2019s documentation<\/li>\n<\/ul>","protected":false},"excerpt":{"rendered":"<p>We recently showed how to pair OpenCode with Docker Model Runner for a privacy-first, cost-effective AI coding setup. 
Today, we\u2019re [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":3305,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[4],"tags":[],"class_list":["post-3304","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-docker"],"_links":{"self":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/3304","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/comments?post=3304"}],"version-history":[{"count":0,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/3304\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media\/3305"}],"wp:attachment":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media?parent=3304"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/categories?post=3304"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/tags?post=3304"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}