n8n-nodes-localai (v2.0.0)
This is an n8n community node that integrates LocalAI, providing a Chat Model node compatible with n8n's AI Agent nodes.
Features
- LocalAI Chat Model: A node fully compatible with n8n's AI Agent ecosystem.
- Connect your local LLMs (Llama, Mistral, Phi, etc.) directly to n8n AI Agents.
- Supports configuration of Model Name, Temperature, and Max Tokens.
Installation
- Go to your n8n instance.
- Navigate to Settings > Community Nodes.
- Select Install.
- Enter the package name:
n8n-nodes-localai
Setup
You will need to create a LocalAI API credential in n8n to connect to your instance.
- Base URL: The URL where your LocalAI instance is running (e.g., http://localhost:8080 or http://host.docker.internal:8080).
- API Key: Optional. Only required if you have secured your LocalAI instance.
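LocalAI exposes an OpenAI-compatible HTTP API, so the two credential fields map directly onto a URL and an optional Authorization header. A minimal Python sketch of that mapping (the function name and the local URL are illustrative, not taken from this package's source):

```python
import urllib.request

def localai_request(base_url, path, api_key=None):
    # Mirror the credential fields: Base URL plus optional API Key.
    url = base_url.rstrip("/") + path
    headers = {"Content-Type": "application/json"}
    if api_key:
        # Only needed when the LocalAI instance is secured.
        headers["Authorization"] = "Bearer " + api_key
    return urllib.request.Request(url, headers=headers)

# e.g. list the models a local instance has loaded:
req = localai_request("http://localhost:8080", "/v1/models")
# urllib.request.urlopen(req) would return the JSON model list
```

Fetching /v1/models this way (or with curl) is a quick sanity check that the Base URL you enter in the credential is reachable from n8n.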
Usage
- Add an AI Agent node to your canvas.
- Add the LocalAI Chat Model node to your canvas.
- Connect the output of the LocalAI Chat Model node to the Model input of the AI Agent node.
- In the LocalAI Chat Model node, specify the Model Name (e.g., llama-3-8b, gpt-4, etc.) that matches your LocalAI configuration.
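Under the hood, an OpenAI-compatible chat model node typically POSTs a request body like the following to {Base URL}/v1/chat/completions. This sketch assumes OpenAI-style field names and shows how the three node options (Model Name, Temperature, Max Tokens) appear in the request; it is not lifted from this package's source:

```python
import json

def chat_payload(model, prompt, temperature=0.7, max_tokens=256):
    # The node's Model Name, Temperature, and Max Tokens options
    # map onto OpenAI-style request fields.
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    })

body = chat_payload("llama-3-8b", "Hello!", temperature=0.2, max_tokens=128)
# POST this body to {Base URL}/v1/chat/completions
```

The "model" value must match a model name configured in your LocalAI instance, which is why the node's Model Name field has to agree with your LocalAI setup.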
License
MIT
