mcpo: A Simple, Secure MCP-to-OpenAPI Proxy Server

Summary

mcpo is a dead-simple proxy server that transforms any Model Context Protocol (MCP) tool into an OpenAPI-compatible HTTP server. This allows seamless integration of MCP tools with LLM agents and applications that expect standard RESTful OpenAPI interfaces, eliminating the need for custom protocols or complex glue code. It enhances security, stability, and interoperability for AI tools, making them instantly usable with modern web standards.

Repository Info

Updated on March 4, 2026

Introduction

mcpo is a powerful yet simple proxy server designed to expose any Model Context Protocol (MCP) tool as an OpenAPI-compatible HTTP server. This lets your MCP tools "just work" with Large Language Model (LLM) agents and applications that expect standard RESTful OpenAPI interfaces, removing the need for custom protocols, glue code, or complex integrations.

Why Use mcpo?

Native MCP servers often communicate over raw stdio, which is insecure by default, incompatible with most modern tooling, and lacks standard features like documentation, authentication, and robust error handling. mcpo addresses these issues with no extra effort on your part:

  • Instant Compatibility: Works immediately with OpenAPI tools, SDKs, and user interfaces.
  • Enhanced Security and Stability: Adds security, stability, and scalability by leveraging trusted web standards.
  • Auto-Generated Documentation: Automatically generates interactive documentation for every tool, requiring no configuration.
  • Pure HTTP: Utilizes pure HTTP, avoiding complex sockets, glue code, or unexpected behaviors.

Far from being an extra step, mcpo streamlines the process and makes your AI tools usable, secure, and interoperable right away.

Installation

We recommend using uv for lightning-fast startup and zero configuration. If you prefer, pip is also an option.

Using uv (recommended):

uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

Using pip:

pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

Via Docker:

You can also run mcpo via Docker with no local installation:

docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command

Examples

Basic Usage:

To expose a simple MCP server, like mcp-server-time:

uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York

Your MCP tool will then be available at http://localhost:8000 with a generated OpenAPI schema, testable live at http://localhost:8000/docs.
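Once the proxy is up, any HTTP client can call the tool. The sketch below builds such a request in Python; the route name get_current_time and the timezone parameter are assumptions based on mcp-server-time — check the live schema at http://localhost:8000/docs for the actual routes and fields.

```python
import json
import urllib.request

# Hedged sketch: calling a tool exposed by mcpo over plain HTTP.
# The route "get_current_time" and the "timezone" field are assumptions
# based on mcp-server-time; consult /docs for the real schema.
req = urllib.request.Request(
    "http://localhost:8000/get_current_time",
    data=json.dumps({"timezone": "America/New_York"}).encode("utf-8"),
    headers={
        "Authorization": "Bearer top-secret",  # the --api-key value
        "Content-Type": "application/json",
    },
    method="POST",
)

# With the proxy running, urllib.request.urlopen(req) returns the tool's
# JSON response; here we only inspect the request we built.
print(req.get_full_url(), req.get_method())
```

The API key passed via --api-key is sent as a standard Bearer token, so any OpenAPI-aware client or SDK can authenticate the same way.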

Serving SSE-compatible MCP Servers:

mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse

Serving Streamable HTTP-compatible MCP Servers:

mcpo --port 8000 --api-key "top-secret" --server-type "streamable-http" -- http://127.0.0.1:8002/mcp

Using a Configuration File:

You can serve multiple MCP tools using a single configuration file. Enable hot-reload mode with --hot-reload to automatically watch your config file for changes.

mcpo --config /path/to/config.json --hot-reload

Example config.json:

{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"]
    }
  }
}

Each tool will be accessible under its own unique route, e.g., http://localhost:8000/memory and http://localhost:8000/time, each with a dedicated OpenAPI schema and proxy handler.
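The per-tool routes can be derived directly from the config: each key under "mcpServers" becomes a route. The sketch below assumes the /docs and /openapi.json paths follow the FastAPI conventions mcpo builds on; verify against your running instance.

```python
# Hedged sketch: deriving each tool's route and documentation URLs from
# the config.json above. Route names mirror the keys under "mcpServers".
config = {
    "mcpServers": {
        "memory": {"command": "npx", "args": ["-y", "@modelcontextprotocol/server-memory"]},
        "time": {"command": "uvx", "args": ["mcp-server-time", "--local-timezone=America/New_York"]},
    }
}

base = "http://localhost:8000"
routes = {name: f"{base}/{name}" for name in config["mcpServers"]}
for name, route in routes.items():
    print(f"{name}: {route}  docs: {route}/docs  schema: {route}/openapi.json")
```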

Links