n8n-nodes-mcp: Seamless AI Integration with Model Context Protocol

Summary
The n8n-nodes-mcp is a custom n8n node designed to facilitate interaction with Model Context Protocol (MCP) servers. It empowers n8n workflows to connect with AI models, access resources, execute tools, and utilize prompts in a standardized manner, significantly enhancing AI agent capabilities.
Introduction
The n8n-nodes-mcp repository provides a custom node for n8n, a fair-code licensed workflow automation platform, enabling seamless integration with Model Context Protocol (MCP) servers. MCP is a powerful protocol designed to allow AI models to interact with external tools and data sources in a standardized and efficient manner. This node empowers your n8n workflows to connect to MCP servers, access various resources, execute tools, and utilize prompts, significantly extending the capabilities of AI agents within your automations.
With 2855 stars and 494 forks, this MIT-licensed project, primarily developed in TypeScript, demonstrates strong community interest and active development.
Installation
To integrate the MCP Client node into your n8n instance, follow the official installation guide for n8n community nodes.
It is crucial to note that if you intend to use the MCP Client node as a tool within n8n AI Agents, you must set the N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable to true. This enables community nodes to function as tools for your AI agents.
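As an illustration, the variable can be set when launching n8n in Docker. This is a hedged sketch using the standard n8nio/n8n image and ordinary Docker flags; adjust ports and volumes to your own setup:

```shell
# Sketch: run n8n with community nodes allowed to act as AI Agent tools.
docker run -it --rm \
  -p 5678:5678 \
  -e N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true \
  n8nio/n8n
```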
Examples
The n8n-nodes-mcp node offers flexible ways to connect and interact with MCP servers, supporting various transport methods and configurations.
Connection Types and Credentials
The node supports three primary connection types:
- Command-line Based Transport (STDIO): Configure the command, arguments, and environment variables to start an MCP server directly.
- HTTP Streamable Transport (Recommended): Connect to an HTTP endpoint that supports streaming responses. This is the modern and recommended method for new implementations.
- Server-Sent Events (SSE) Transport (Deprecated): Available for legacy compatibility, connecting to SSE endpoints. HTTP Streamable is preferred for new projects.
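Whichever transport is chosen, MCP messages are JSON-RPC 2.0 envelopes; only the wire they travel over differs. A minimal sketch of that envelope (the helper function here is illustrative, not part of n8n-nodes-mcp's API; the message shape follows the MCP specification):

```typescript
// Minimal sketch of the JSON-RPC 2.0 envelope MCP uses over any transport.
// The buildRequest helper is hypothetical; the message fields follow the MCP spec.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function buildRequest(
  id: number,
  method: string,
  params?: Record<string, unknown>
): JsonRpcRequest {
  return params === undefined
    ? { jsonrpc: "2.0", id, method }
    : { jsonrpc: "2.0", id, method, params };
}

// The request a client sends to discover a server's tools.
const listTools = buildRequest(1, "tools/list");
console.log(JSON.stringify(listTools));
// → {"jsonrpc":"2.0","id":1,"method":"tools/list"}
```

Because the envelope is transport-agnostic, switching a workflow from STDIO to HTTP Streamable changes only the credentials, not the operations.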
Example: Using a Local MCP Server with HTTP Streamable
- Start a local MCP server that supports HTTP Streamable: npx @modelcontextprotocol/server-example-streamable
- In n8n, configure new credentials of type MCP Client (HTTP Streamable) API and set HTTP Streamable URL to http://localhost:3001/stream.
- Add an MCP Client node to your workflow, select HTTP Streamable as the Connection Type, and choose your newly created credentials.
Environment Variables
Environment variables can be passed to MCP servers via the credentials UI or, for Docker deployments, by prefixing them with MCP_.
Example: Using Brave Search MCP Server
This example demonstrates integrating the Brave Search MCP server:
- Install the Brave Search MCP server globally: npm install -g @modelcontextprotocol/server-brave-search
- Configure MCP Client credentials in n8n:
  - Command: npx
  - Arguments: -y @modelcontextprotocol/server-brave-search
  - Environment Variables: BRAVE_API_KEY=your-api-key
- In an n8n workflow, add an MCP Client node. First, use the "List Tools" operation to see available search tools. Then, add another MCP Client node, select the "Execute Tool" operation, choose "brave_search", and set parameters like {"query": "latest AI news"}.
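Under the hood, an "Execute Tool" operation corresponds to an MCP tools/call request. A hedged sketch of that payload, using the tool name and parameters from the example above (the builder function is hypothetical; the tools/call method and its name/arguments params come from the MCP specification):

```typescript
// Sketch: the JSON-RPC payload behind an "Execute Tool" operation.
// buildToolCall is illustrative; tools/call is the MCP spec's method name.
function buildToolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const call = buildToolCall(2, "brave_search", { query: "latest AI news" });
console.log(JSON.stringify(call.params));
// → {"name":"brave_search","arguments":{"query":"latest AI news"}}
```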
Multi-Server Setup with AI Agent
For advanced scenarios, you can configure multiple MCP servers in a Docker environment and leverage them with n8n's AI Agent capabilities. By setting MCP_ prefixed environment variables in your docker-compose.yml and creating corresponding MCP Client credentials in n8n, an AI Agent can utilize various tools from different MCP servers.
```yaml
version: '3'
services:
  n8n:
    image: n8nio/n8n
    environment:
      # MCP server environment variables
      - MCP_BRAVE_API_KEY=your-brave-api-key
      - MCP_OPENAI_API_KEY=your-openai-key
      # ... other MCP keys
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    ports:
      - "5678:5678"
    volumes:
      - ~/.n8n:/home/node/.n8n
```
Operations
The MCP Client node supports several operations to interact with MCP servers:
- Execute Tool: Run a specific tool with defined parameters.
- Get Prompt: Retrieve a specific prompt template.
- List Prompts: Get a list of all available prompts.
- List Resources: Obtain a list of resources from the MCP server.
- List Tools: Discover all available tools, including their descriptions and parameter schemas.
- Read Resource: Read a specific resource by its URI.
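Each of these operations maps naturally onto an MCP request method. A hedged sketch of that correspondence (the method names follow the MCP specification; the mapping object itself is illustrative and not taken from the node's source):

```typescript
// Illustrative mapping from node operations to MCP JSON-RPC methods.
// Method names follow the MCP spec; the keys here are descriptive labels.
const OPERATION_TO_METHOD: Record<string, string> = {
  executeTool: "tools/call",
  getPrompt: "prompts/get",
  listPrompts: "prompts/list",
  listResources: "resources/list",
  listTools: "tools/list",
  readResource: "resources/read",
};

console.log(OPERATION_TO_METHOD["executeTool"]); // → tools/call
```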
Why Use n8n-nodes-mcp?
The n8n-nodes-mcp node is an essential tool for anyone looking to enhance their n8n workflows with advanced AI capabilities. It provides a standardized and robust way to:
- Integrate AI Models: Connect your n8n automations directly to AI models via the Model Context Protocol.
- Access External Tools & Data: Empower AI agents to use external tools and data sources, such as search engines or weather APIs, within your workflows.
- Streamline AI Agent Development: Simplify the creation of sophisticated AI agents by providing a unified interface for various MCP servers.
- Leverage Community Resources: Benefit from a growing community, extensive documentation, and video tutorials that guide you through setup and advanced usage.
- Future-Proof Integrations: Utilize the recommended HTTP Streamable transport for efficient and flexible communication with MCP servers.
Links
- GitHub Repository: https://github.com/nerding-io/n8n-nodes-mcp
- n8n community nodes documentation: https://docs.n8n.io/integrations/community-nodes/installation/
- Model Context Protocol Documentation: https://modelcontextprotocol.io/docs/
- MCP TypeScript SDK: https://github.com/modelcontextprotocol/typescript-sdk
- MCP Transports Overview: https://modelcontextprotocol.io/docs/concepts/transports
- Official Quickstart Video: https://youtu.be/1t8DQL-jUJk
- MCP Explained YouTube Series: https://www.youtube.com/playlist?list=PLjOCx_PNfJ4S_oOSqrMi6t9_x1GllvQZO