{"id":2612,"date":"2025-10-17T19:16:24","date_gmt":"2025-10-17T19:16:24","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/10\/17\/how-to-add-mcp-servers-to-openais-codex-with-docker-mcp-toolkit\/"},"modified":"2025-10-17T19:16:24","modified_gmt":"2025-10-17T19:16:24","slug":"how-to-add-mcp-servers-to-openais-codex-with-docker-mcp-toolkit","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/10\/17\/how-to-add-mcp-servers-to-openais-codex-with-docker-mcp-toolkit\/","title":{"rendered":"How to add MCP Servers to OpenAI\u2019s Codex with Docker MCP Toolkit"},"content":{"rendered":"<p>AI assistants are changing how we write code, but their true power is unleashed when they can interact with specialized, high-precision tools. OpenAI\u2019s Codex is a formidable coding partner, but what happens when you connect it directly to your running infrastructure?<\/p>\n<p>Enter the <strong><a href=\"https:\/\/docs.docker.com\/ai\/mcp-catalog-and-toolkit\/\" target=\"_blank\">Docker MCP Toolkit<\/a><\/strong>.<\/p>\n<p>The Model Context Protocol (MCP) Toolkit acts as a secure bridge, allowing AI models like Codex to safely discover and use any of the 200+ MCP servers from the trusted MCP catalog curated by Docker.<\/p>\n<p>In this post, we\u2019ll walk through an end-to-end demo, just like our <a href=\"https:\/\/www.docker.com\/blog\/add-mcp-servers-to-claude-code-with-mcp-toolkit\/\">Claude Code<\/a> and <a href=\"https:\/\/www.docker.com\/blog\/how-to-set-up-gemini-cli-with-mcp-toolkit\/\">Gemini CLI<\/a> tutorials. But this time, we\u2019re pairing <strong>Codex<\/strong> with <strong>Neo4j<\/strong> MCP servers.<\/p>\n<p>First, we\u2019ll connect Codex to the Neo4j server using the MCP Toolkit. Then, we\u2019ll show a fun example: building a graph of Pok\u00e9mon species and their types, and exploring the data visually. 
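<\/p>
<p>Before diving in, here\u2019s the shape we\u2019re aiming for, boiled down to plain Python: Pok\u00e9mon and Type nodes joined by HAS_TYPE relationships. This is only an illustrative sketch of the target structure; the names and fields are examples, not the exact model Codex will design later.<\/p>
<div class=\"wp-block-syntaxhighlighter-code \">

```python
# Illustrative sketch of the graph shape we are going to build:
# Pokemon and Type nodes connected by HAS_TYPE relationships.
# Names and fields here are examples, not the final data model.
nodes = [
    {'label': 'Pokemon', 'name': 'Bulbasaur', 'dex': 1},
    {'label': 'Type', 'name': 'Grass'},
    {'label': 'Type', 'name': 'Poison'},
]
relationships = [
    {'type': 'HAS_TYPE', 'start': 'Bulbasaur', 'end': 'Grass'},
    {'type': 'HAS_TYPE', 'start': 'Bulbasaur', 'end': 'Poison'},
]

def types_of(pokemon_name):
    # Follow HAS_TYPE edges out of a Pokemon node in this tiny in-memory graph.
    return sorted(r['end'] for r in relationships
                  if r['type'] == 'HAS_TYPE' and r['start'] == pokemon_name)

print(types_of('Bulbasaur'))  # ['Grass', 'Poison']
```

<\/div>
<p>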
While playful, this example highlights how Codex + MCP can be applied to real-world, semi-structured data pipelines.<\/p>\n<p>Read on to see how a generic AI assistant, when supercharged with Docker and MCP, can evolve into a specialized data engineering powerhouse!<\/p>\n<h2 class=\"wp-block-heading\">Why use Codex with Docker MCP<\/h2>\n<p>While Codex provides powerful AI capabilities and MCP provides the protocol, Docker MCP Toolkit makes automated data modeling and graph engineering practical. Without containerization, building a knowledge graph means managing local Neo4j installations, dealing with database driver versions, writing boilerplate connection and authentication code, and manually scripting the entire data validation and loading pipeline. A setup that should take minutes can easily stretch into hours for each developer.<\/p>\n<p>Docker MCP Toolkit eliminates this friction:<\/p>\n<ul>\n<li><strong>200+ pre-built MCP servers<\/strong> in the Catalog<\/li>\n<li><strong>One-click deployment<\/strong> through Docker Desktop<\/li>\n<li><strong>Neo4j Data Modeling MCP<\/strong> for schema design and validation<\/li>\n<li><strong>Neo4j Cypher MCP<\/strong> for direct database queries and ingestion<\/li>\n<li><strong>Secure credential management<\/strong> for database passwords<\/li>\n<li><strong>Consistent configuration<\/strong> across Mac, Windows, and Linux<\/li>\n<li><strong>Automatic updates<\/strong> when new server versions are released<\/li>\n<\/ul>\n<p>We built Docker MCP Toolkit to meet developers where they are. 
If you\u2019re using Codex, you should be able to engineer a knowledge graph without wrestling with database infrastructure.<\/p>\n<h2 class=\"wp-block-heading\"><strong>The Setup: Connecting Codex to Neo4j Tools<\/strong><\/h2>\n<h3 class=\"wp-block-heading\">Prerequisites<\/h3>\n<p>First, we need to give Codex access to the specialized Neo4j tools.<\/p>\n<ul>\n<li>Install <a href=\"https:\/\/developers.openai.com\/codex\/cli\/\" target=\"_blank\">Codex<\/a> and run it at least once to get authentication out of the way.<\/li>\n<li><a href=\"https:\/\/www.docker.com\/products\/docker-desktop\/\">Install Docker Desktop<\/a> 4.40 or later.<\/li>\n<li><a href=\"https:\/\/docs.docker.com\/ai\/mcp-catalog-and-toolkit\/get-started\/#enable-docker-mcp-toolkit\" target=\"_blank\">Enable MCP Toolkit<\/a>.<\/li>\n<\/ul>\n<h3 class=\"wp-block-heading\">Step 1: Add the Neo4j MCP Servers<\/h3>\n<p>The <a href=\"https:\/\/hub.docker.com\/mcp\/server\/neo4j-cypher\/overview\" target=\"_blank\">Neo4j Cypher<\/a> and <a href=\"https:\/\/hub.docker.com\/mcp\/server\/neo4j-data-modeling\/overview\" target=\"_blank\">Data Modeling<\/a> servers are available out of the box in the main MCP Toolkit catalog.<\/p>\n<ol>\n<li>In Docker Desktop, navigate to the <strong>MCP Toolkit<\/strong> tab.<\/li>\n<li>Click the <strong>Catalog<\/strong> tab.<\/li>\n<li>Search for \u201cNeo4j\u201d and click <strong>+ Add<\/strong> for both the <strong>Neo4j Cypher<\/strong> and <strong>Neo4j Data Modeling<\/strong> servers.<\/li>\n<\/ol>\n<p>They will now appear in your \u201cMy servers\u201d list.<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n\n<h3 class=\"wp-block-heading\">Step 2: Connect Codex to the MCP Toolkit<\/h3>\n<p>With our tools ready, we run a one-time command to make Codex aware of the MCP Toolkit:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\ndocker mcp-client configure codex\n<\/div>\n<p>We can also do this from the Docker Desktop UI: navigate to the <strong>Clients<\/strong> tab and smash that <strong>Connect<\/strong> button next to 
Codex and any other assistants you use:<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n<p>Docker will edit the corresponding configuration files, and the next time Codex starts, it will connect to the MCP Toolkit and you\u2019ll have the tools at your disposal!<\/p>\n<h3 class=\"wp-block-heading\">Step 3: Start and Configure Neo4j<\/h3>\n<p>We still need to configure the Neo4j Cypher MCP server to connect to a Neo4j database, so we\u2019ll set this up now. We\u2019ll use Codex to start our Neo4j database and configure the connection. First, we ask Codex to create the container:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n\u203a Spin up a Neo4j container for me in Docker please.\n<\/div>\n<p>Codex will run the necessary Docker command and get our Neo4j container running. You can of course do this manually, use a cloud service, or download the <a href=\"https:\/\/neo4j.com\/download\/\" target=\"_blank\">Neo4j Desktop application<\/a>, but since we\u2019re having fun in Codex, why not make it do these mundane things for us?<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n<p>With Neo4j available in the container, we now need to configure the <a href=\"https:\/\/hub.docker.com\/mcp\/server\/neo4j-cypher\/overview\" target=\"_blank\"><strong>Neo4j Cypher<\/strong><\/a> MCP server to connect to it. The <a href=\"https:\/\/hub.docker.com\/mcp\/server\/neo4j-data-modeling\/overview\" target=\"_blank\"><strong>Neo4j Data Modeling<\/strong><\/a> MCP server works without any configuration. To simplify things, you can take a screenshot of the configuration form and ask Codex how to fill it in to connect to the container we just started.<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n<p>Next, we ask Codex for help, pointing it to a screenshot of the configuration page:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n\u203a There is a screenshot in this directory. 
It has the information we need to configure the MCP server to connect to the Neo4j database. Please look at it and give me the information to configure the MCP server to connect to our Neo4j container.\n<\/div>\n<p>Codex inspected the Docker environment, found the running container, and determined its authentication details (e.g., NEO4J_AUTH=neo4j\/password). It then provided the correct configuration values. We just need to update the <strong>Neo4j Cypher<\/strong> MCP server configuration in the Docker Desktop UI with the neo4j-cypher.url (neo4j:\/\/localhost:7687) and the password.<\/p>\n<p>Fill in the form for the Neo4j Cypher MCP server in the Docker MCP Toolkit and restart Codex.<\/p>\n<h2 class=\"wp-block-heading\">Use Case: Building a Pok\u00e9mon Graph with Codex + Neo4j<\/h2>\n<p>Now we\u2019re ready to tackle our original problem of building the graph of Pok\u00e9mon species. First of all, enter YOLO mode by typing \/approvals and selecting the mode where Codex has freedom to do whatever it pleases:<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n\n<p>Next, we can throw the full power of Codex models at our task. We give it the full problem statement:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n\u203a Here&#8217;s the page with all pokemon in generation 1: https:\/\/bulbapedia.bulbagarden.net\/wiki\/List_of_Pok%C3%A9mon_by_index_number_in_Generation_I<br \/>\n  and here&#8217;s the types information: https:\/\/bulbapedia.bulbagarden.net\/wiki\/Type\n<p>  we&#8217;re going to create a graph with data about these pokemon (ignore MissingNo entries &#8211; they are a glitch and ignore other glitched pokemon).<br \/>\n  you have a neo4j container running, and the MCP tools for modelling and cypher.\n<\/p><\/div>\n<p>Codex will accept the task, break it down into a plan, and start cranking at it.<\/p>\n<p>There are a few interesting moments in how it approaches the task, though much depends on the model you configure Codex with. 
For me it was \u201cmodel: gpt-5-codex\u201d, which tends to use scripting rather than chaining curl commands, for example.<\/p>\n<p>Here\u2019s the plan you might see if you\u2019re following along:<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n\u2022 I\u2019m preparing to build a Pokemon graph by fetching data from a website using network access. Since this is a complex task, I\u2019ll create a detailed plan involving data gathering, modeling<br \/>\n  entities like Pok\u00e9mon and Types, defining relationships such as evolutions and type effectiveness, and then using tools like MCP Docker to build the Neo4j graph. The first step is to draft<br \/>\n\u2022 Updated Plan<br \/>\n  \u2514 \u25a1 Inspect requirements and gather Generation I Pok\u00e9mon data with types and evolution chains<br \/>\n    \u25a1 Define Neo4j graph data model for Pok\u00e9mon, Types, and relationships<br \/>\n    \u25a1 Load data into Neo4j via Cypher ingestion and validate\n<\/div>\n<p>It then immediately started cranking out Python scripts with requests, BeautifulSoup, and other web-scraping friends to fetch the data from the provided wiki. Interestingly, the Pok\u00e9mon data is not actually on a single page but spread across subpages for every Pok\u00e9mon species, so scraping with Python makes a ton of sense over curling the pages and spending tokens on processing the data in the model. And it leaves the scripts behind so you can verify and reproduce the results if needed.<\/p>\n<p>Eventually the model proceeds to analyze the data, and here the MCP tools become really helpful. You can see below that it\u2019s calling the MCP_DOCKER.validate_data_model tool from the server we configured earlier.<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\nI&#8217;m outlining a comprehensive data model for Pokemon, types, and their relationships with precise property types and units converted for better clarity. 
I plan to generate ingestion-ready<br \/>\n  JSON files via a Python script to integrate cleanly with Neo4j ingestion tools, ensuring the data pipeline is robust and structured.<br \/>\n  \u2514 wrote data model to data\/gen1_pokemon_model.json\n<p>\u2022 Called MCP_DOCKER.validate_data_model({&#8220;data_model&#8221;:&#8221;data\/gen1_pokemon_model.json&#8221;,&#8221;return_validated&#8221;:true})<br \/>\n  \u2514 Input validation error: &#8216;data\/gen1_pokemon_model.json&#8217; is not of type &#8216;object&#8217;<br \/>\n&#8230;.\n<\/p><\/div>\n<p>It took Codex a few attempts to figure out the data model JSON schema and produce one for the Pok\u00e9mon that satisfied the <strong>Neo4j Data Modeling<\/strong> MCP server.<\/p>\n<p>It then returned to Python to create the data ingestion script and loaded the data into the Neo4j instance.<\/p>\n<p>A few MCP tool calls later, it queried the data with Cypher (the query language for graph databases), which it can do because it has access to the Neo4j Cypher MCP server. 
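<\/p>
<p>For a concrete feel of what such an ingestion script does, here\u2019s a minimal, hypothetical sketch of the reshaping step: turning scraped per-species records into the kind of parameter lists a parameterized Cypher statement (e.g. an UNWIND \u2026 MERGE) can consume. The field and relationship names are illustrative, and no database is needed to run it.<\/p>
<div class=\"wp-block-syntaxhighlighter-code \">

```python
# Hypothetical reshaping step of an ingestion script: scraped records in,
# UNWIND-ready parameter lists out. Field names are illustrative.
records = [
    {'dex': 1, 'name': 'Bulbasaur', 'types': ['Grass', 'Poison']},
    {'dex': 4, 'name': 'Charmander', 'types': ['Fire']},
]

def to_payload(recs):
    pokemon = [{'dex': r['dex'], 'name': r['name']} for r in recs]
    # Deduplicate type names; sorting keeps the output deterministic.
    type_names = sorted({t for r in recs for t in r['types']})
    # One HAS_TYPE row per (pokemon, type) pair; slot 1 = primary, 2 = secondary.
    has_type = [{'name': r['name'], 'type': t, 'slot': i + 1}
                for r in recs for i, t in enumerate(r['types'])]
    return {'pokemon': pokemon, 'types': type_names, 'has_type': has_type}

payload = to_payload(records)
print(payload['types'])  # ['Fire', 'Grass', 'Poison']
```

<\/div>
<p>Each list would then be passed as a parameter to a single Cypher statement per entity type, which is far cheaper than issuing one query per node.<\/p>
<p>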
And with it, Codex and the MCP servers can answer analytical questions about our data.<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n  &#8211; Greedy type-coverage search suggests trios such as (Rhydon, Parasect, Dragonite) or (Rhydon, Parasect, Jynx) hit 13 of the 15 defending types super-effectively; no trio can cover Normal\/Rock simultaneously because Normal has no offensive 2\u00d7 matchup.\n<\/div>\n<p>Now what\u2019s really fun about Neo4j is that it comes with a terrific console where you can explore the data.<\/p>\n<p>While our Neo4j container with the Pok\u00e9mon data is still running, we can go to <a href=\"http:\/\/localhost:7474\/\" target=\"_blank\">http:\/\/localhost:7474<\/a>, enter the neo4j\/password credentials, and explore the data visually.<\/p>\n<p>Here, for example, is a subset of the Pok\u00e9mon and their type relationships.<\/p>\n<div class=\"wp-block-ponyo-image\"><\/div>\n\n<p>And if you know Cypher or have an AI assistant that can generate Cypher queries (and verify they work with an MCP tool call), you can generate more complex projections of your data. For example, this query (generated by Codex) shows all Pok\u00e9mon, their evolution relationships, and their primary\/secondary types.<\/p>\n<div class=\"wp-block-syntaxhighlighter-code \">\n  MATCH (p:Pokemon)<br \/>\n  CALL {<br \/>\n    WITH p<br \/>\n    OPTIONAL MATCH (p)-[:EVOLVES_TO*1..]-&gt;(evo:Pokemon)<br \/>\n    WITH collect(DISTINCT evo) AS evos<br \/>\n    RETURN [e IN evos WHERE e IS NOT NULL | {node: e, relType: &#8216;EVOLVES_TO&#8217;}] AS evolutionConnections<br \/>\n  }<br \/>\n  CALL {<br \/>\n    WITH p<br \/>\n    OPTIONAL MATCH (p)-[:HAS_TYPE]-&gt;(type:Type)<br \/>\n    WITH type<br \/>\n    ORDER BY type.name         \/\/ ensures a stable primary\/secondary ordering<br \/>\n    RETURN collect(type) AS orderedTypes<br \/>\n  }<br \/>\n  WITH p, evolutionConnections, orderedTypes,<br \/>\n       CASE WHEN size(orderedTypes) &gt;= 1 THEN 
orderedTypes[0] END AS primaryType,<br \/>\n       CASE WHEN size(orderedTypes) &gt;= 2 THEN orderedTypes[1] END AS secondaryType<br \/>\n  WITH p,<br \/>\n       evolutionConnections +<br \/>\n       CASE WHEN primaryType IS NULL THEN [] ELSE [{node: primaryType, relType: &#8216;HAS_PRIMARY_TYPE&#8217;}] END +<br \/>\n       CASE WHEN secondaryType IS NULL THEN [] ELSE [{node: secondaryType, relType: &#8216;HAS_SECONDARY_TYPE&#8217;}] END AS connections<br \/>\n  UNWIND connections AS connection<br \/>\n  RETURN p   AS pokemon,<br \/>\n         connection.node AS related,<br \/>\n         connection.relType AS relationship<br \/>\n  ORDER BY pokemon.name, relationship, related.name;\n<\/div>\n<div class=\"wp-block-ponyo-image\"><\/div>\n\n<h2 class=\"wp-block-heading\">Turn Your AI Coding Assistant into a Data Engineer, Architect, Analyst and More<\/h2>\n<p>While this Pok\u00e9mon demo is a fun example, it\u2019s also a practical blueprint for working with real-world, semi-structured data. Graph databases like Neo4j are especially well-suited for this kind of work. 
Their relationship-first model makes it easier to represent the complexity of real-world systems.<\/p>\n<p>In this walkthrough, we showed how to connect OpenAI\u2019s Codex to the Neo4j MCP Servers via Docker MCP Toolkit, enabling it to take on multiple specialized roles:<\/p>\n<ul>\n<li><strong>Data Engineer<\/strong>: Writing Python to scrape and transform web data<\/li>\n<li><strong>Data Architect<\/strong>: Designing and validating graph models using domain-specific tools<\/li>\n<li><strong>DevOps Engineer<\/strong>: Starting services and configuring tools based on its environment<\/li>\n<li><strong>Data Analyst<\/strong>: Running complex Cypher and Python queries to extract insights<\/li>\n<\/ul>\n<p>In your own projects, you might ask your AI assistant to \u201cAnalyze production logs and identify the cause of performance spikes,\u201d \u201cMigrate the user database schema to a new microservice,\u201d or \u201cModel our product catalog from a set of messy CSVs.\u201d<\/p>\n<h3 class=\"wp-block-heading\">Summary<\/h3>\n<p>The <a href=\"https:\/\/docs.docker.com\/ai\/mcp-catalog-and-toolkit\/\" target=\"_blank\">Docker MCP Toolkit<\/a> bridges the gap between powerful AI coding agents and the specialized tools they need to be truly useful. By providing secure, one-click access to a curated catalog of over 200 MCP servers, it enables AI agents to interact with real infrastructure, including databases, APIs, command-line tools, and more. Whether you\u2019re automating data workflows, querying complex systems, or orchestrating services, the MCP Toolkit equips your assistant to work like a real developer. 
If you\u2019re building with AI coding assistants and want them to go beyond code generation, it\u2019s time to start integrating with the tools your stack already relies on!<\/p>\n<h3 class=\"wp-block-heading\">Learn more<\/h3>\n<ul>\n<li><a href=\"https:\/\/hub.docker.com\/mcp\" target=\"_blank\">Explore the MCP Catalog<\/a>: Discover containerized, security-hardened MCP servers<\/li>\n<li>Open Docker Desktop and <a href=\"https:\/\/hub.docker.com\/open-desktop?url=https:\/\/open.docker.com\/dashboard\/mcp\" target=\"_blank\">get started with the MCP Toolkit<\/a> <em>(Requires version 4.48 or newer to launch the MCP Toolkit automatically)<\/em><\/li>\n<li>Read our tutorial on <a href=\"https:\/\/www.docker.com\/blog\/add-mcp-servers-to-claude-code-with-mcp-toolkit\/\">How to Add MCP Servers to Claude Code with Docker MCP Toolkit<\/a><\/li>\n<li>Read our tutorial on <a href=\"https:\/\/www.docker.com\/blog\/how-to-set-up-gemini-cli-with-mcp-toolkit\/\">How to Add MCP Servers to Gemini CLI with Docker MCP Toolkit<\/a><\/li>\n<\/ul>","protected":false},"excerpt":{"rendered":"<p>AI assistants are changing how we write code, but their true power is unleashed when they can interact with specialized, 
[&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[4],"tags":[],"class_list":["post-2612","post","type-post","status-publish","format-standard","hentry","category-docker"],"_links":{"self":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2612","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/comments?post=2612"}],"version-history":[{"count":0,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/2612\/revisions"}],"wp:attachment":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media?parent=2612"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/categories?post=2612"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/tags?post=2612"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}