# n8n-nodes-mcp-reneworks
v0.1.7
n8n node to connect to an MCP-enabled Chat API
This is an n8n community node that lets you use an MCP-enabled Chat API as a Language Model in n8n.
## Installation

### For Local n8n
- Go to your n8n root directory (e.g. `~/.n8n`).
- Create a `custom` directory if it doesn't exist: `mkdir custom`.
- Clone or copy this repository into `custom/n8n-nodes-mcp-reneworks`.
- Inside `custom/n8n-nodes-mcp-reneworks`, run `npm install` and `npm run build`.
- In `~/.n8n`, run `npm install ./custom/n8n-nodes-mcp-reneworks`.
- Start n8n: `n8n start`.
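Taken together, the local steps above can be sketched as one shell session. This is a sketch, not a canonical script: the repository URL is the placeholder from the Git install section, and `~/.n8n` is assumed to be your n8n user folder.

```shell
# Sketch of the local install flow (adjust URL and paths to your setup).
cd ~/.n8n
mkdir -p custom
# Clone this node into the custom directory (placeholder URL).
git clone https://github.com/YourUsername/n8n-nodes-mcp-reneworks.git custom/n8n-nodes-mcp-reneworks
# Build the node from source.
cd custom/n8n-nodes-mcp-reneworks
npm install && npm run build
# Register the built package with your n8n installation, then start n8n.
cd ~/.n8n
npm install ./custom/n8n-nodes-mcp-reneworks
n8n start
```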
### Installation via NPM / Git

If you have this node in a Git repository or as an NPM package:

**From Git:**

```
npm install git+https://github.com/YourUsername/n8n-nodes-mcp-reneworks.git
```

Run this command in your `~/.n8n` directory.

**From a Tarball (.tgz):** If you have the `.tgz` file (e.g., `n8n-nodes-mcp-reneworks-0.1.0.tgz`):

```
npm install /path/to/n8n-nodes-mcp-reneworks-0.1.0.tgz
```

Run this command in your `~/.n8n` directory.

**Publishing to NPM:** If you want to publish this to the npm registry:

```
npm login
npm publish --access public
```

Then anyone can install it via:

```
npm install n8n-nodes-mcp-reneworks
```
### For n8n in Docker

You need to build a custom Docker image or mount the custom node as a volume. Use the n8n-nodes-starter guide for details on mounting.
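One possible sketch of the volume-mount approach is below. The image name, ports, and mount paths are assumptions based on the official n8n Docker image defaults; check the n8n-nodes-starter guide for the recommended layout.

```shell
# Hypothetical volume mount: expose a locally built copy of this node
# to the official n8n Docker image via the ~/.n8n/custom directory.
docker run -it --rm \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  -v "$(pwd)/n8n-nodes-mcp-reneworks:/home/node/.n8n/custom/n8n-nodes-mcp-reneworks" \
  docker.n8n.io/n8nio/n8n
```

The alternative is a custom image whose Dockerfile copies the built node into the container and runs `npm install` on it at build time.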
## Usage

- Open your n8n workflow.
- Add an AI Agent node.
- Add the MCP Chat Model node.
- Connect the MCP Chat Model output to the Model input of the AI Agent.
- Configure the MCP Chat Model:
  - Base URL: The address of your MCP Chat API (default: `http://localhost:1234`).
  - Model Name: The model identifier (e.g., `ibm/granite-4-micro`).
  - Integrations: Paste the JSON configuration for your MCP servers/plugins.
### Example Integrations JSON

```json
[
  {
    "type": "ephemeral_mcp",
    "server_label": "huggingface",
    "server_url": "https://huggingface.co/mcp",
    "allowed_tools": [
      "model_search"
    ]
  },
  {
    "type": "plugin",
    "id": "mcp/playwright",
    "allowed_tools": [
      "browser_navigate"
    ]
  }
]
```