
n8n-nodes-custom-llm
v1.2.3 · Published
874 downloads

Readme

n8n-nodes-custom-llm

Custom LLM node for n8n - use any LLM API via custom HTTP requests, with cURL import support.

Features

  • Manual Request Mode: Manually configure HTTP requests with full control over URL, headers, body, and query parameters
  • cURL Import Mode: Import HTTP request options directly from curl commands
  • Dynamic Prompt Injection: Inject prompts into any location in the JSON body using dot notation (e.g., messages[0].content)
  • Custom Parameters: Configure any LLM parameters like max_tokens, temperature, top_k, etc. with Liquid template support
  • Authentication Support: Built-in support for Header Auth and Basic Auth
  • Liquid Templates: Full support for n8n Liquid expressions in all fields
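The dot-notation injection described above can be sketched as a small helper. `setByPath` is a hypothetical name for illustration, not part of the node's actual API:

```javascript
// Minimal sketch of dot-notation prompt injection, assuming paths like
// "messages[0].content". setByPath is a hypothetical helper, not the node's API.
function setByPath(obj, path, value) {
  // Turn "messages[0].content" into ["messages", "0", "content"]
  const keys = path.replace(/\[(\d+)\]/g, '.$1').split('.');
  let cur = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    const key = keys[i];
    if (cur[key] === undefined) {
      // Create an array if the next key is a numeric index, else an object
      cur[key] = /^\d+$/.test(keys[i + 1]) ? [] : {};
    }
    cur = cur[key];
  }
  cur[keys[keys.length - 1]] = value;
  return obj;
}

const body = { model: 'gpt-4', messages: [{ role: 'user', content: '' }] };
setByPath(body, 'messages[0].content', 'Hello!');
```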

Installation

Global Installation (Development)

  1. Clone or download this repository:
     git clone <repository-url>
     cd n8n-nodes-custom-llm
  2. Build the project:
     npm install
     npm run build
  3. Link the module globally:
     npm link
  4. Navigate to your n8n directory:
     cd ~/.n8n
  5. Link the nodes package:
     npm link n8n-nodes-custom-llm
  6. Restart n8n

Installation in n8n Directory

  1. Copy the entire project to your n8n directory:
     cp -r n8n-nodes-custom-llm ~/.n8n/nodes/
  2. Navigate to the node directory:
     cd ~/.n8n/nodes/n8n-nodes-custom-llm
  3. Install and build:
     npm install
     npm run build
  4. Restart n8n

Docker Installation

Add the following to your n8n docker-compose.yml:

volumes:
  - ./n8n-nodes-custom-llm:/nodes/n8n-nodes-custom-llm

Then rebuild and restart n8n.

Usage

Mode 1: Manual Configuration

  1. Add the "Custom LLM" node to your workflow
  2. Select "Manual" request mode
  3. Fill in:
    • URL: Your LLM API endpoint (e.g., https://api.openai.com/v1/chat/completions)
    • Method: POST (most LLM APIs use POST)
    • Authentication: Choose auth type (Header Auth, etc.)
    • JSON Body: Base request template
    • Prompt Field: Where to inject the prompt (e.g., messages[0].content)
    • Prompt: Your prompt text (supports Liquid templates)
    • Custom Parameters: Add parameters like max_tokens, temperature, top_k

Mode 2: Import from cURL

  1. Get a curl command from your LLM provider documentation
  2. Select "Import from cURL" request mode
  3. Paste the curl command
  4. Use the Prompt Field and Custom Parameters sections to override values dynamically

Example Configurations

OpenAI API Setup

  • URL: https://api.openai.com/v1/chat/completions
  • Method: POST
  • JSON Body:
    {
      "model": "gpt-4",
      "messages": [
        {
          "role": "user",
          "content": ""
        }
      ]
    }
  • Prompt Field: messages[0].content
  • Custom Parameters:
    • max_tokens: 1000
    • temperature: 0.7
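Putting the settings above together, the final request body would look roughly like this. The merge order (prompt injected first, custom parameters spread onto the top level) is an assumption about the node's internals:

```javascript
// Sketch of the request body the node would send for the OpenAI setup above.
// The exact merge behavior is an assumption, not the node's documented internals.
const base = {
  model: 'gpt-4',
  messages: [{ role: 'user', content: '' }],
};

// The Prompt Field "messages[0].content" receives the prompt text
base.messages[0].content = 'Summarize this article in one sentence.';

// Custom Parameters are merged into the top level of the body
const finalBody = { ...base, max_tokens: 1000, temperature: 0.7 };
```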

Anthropic Claude API Setup

  • URL: https://api.anthropic.com/v1/messages
  • Method: POST
  • JSON Body:
    {
      "model": "claude-3-opus-20240229",
      "messages": [
        {
          "role": "user",
          "content": ""
        }
      ]
    }
  • Prompt Field: messages[0].content
  • Custom Parameters:
    • max_tokens: 1000
    • top_k: 250
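For Anthropic, the assembled request would look roughly like the sketch below. The header names (`x-api-key`, `anthropic-version`) come from Anthropic's API documentation and would be supplied through the node's Header Auth credentials; the merge behavior shown is an assumption about the node's internals:

```javascript
// Sketch of the Anthropic request for the setup above.
const body = {
  model: 'claude-3-opus-20240229',
  messages: [{ role: 'user', content: '' }],
};
body.messages[0].content = 'Explain n8n in one paragraph.';

const request = {
  method: 'POST',
  url: 'https://api.anthropic.com/v1/messages',
  headers: {
    'content-type': 'application/json',
    // 'x-api-key': '<your key>',  // set via Header Auth credentials in n8n
    'anthropic-version': '2023-06-01',
  },
  // max_tokens is required by the Anthropic Messages API
  body: { ...body, max_tokens: 1000, top_k: 250 },
};
```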

Liquid Template Examples

Using Previous Node Data

  • Prompt: {{ $json.userInput }}
  • Custom Parameter: {{ $json.tokenCount }}

Conditional Values

{% if $json.priority == 'high' %}
  0.1
{% else %}
  0.7
{% endif %}

Complex Prompt

You are a {{ $json.role }}. 
Please answer the following question: {{ $json.question }}
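To show how values flow into a prompt like the one above, here is a toy renderer that handles only the simple `{{ $json.field }}` form. n8n performs the real template evaluation; this is purely illustrative and supports no filters or `{% if %}` blocks:

```javascript
// Toy illustration of how {{ $json.field }} placeholders resolve against the
// incoming item. Not the real n8n/Liquid engine.
function renderSimple(template, json) {
  return template.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_, key) =>
    json[key] !== undefined ? String(json[key]) : ''
  );
}

const prompt = renderSimple(
  'You are a {{ $json.role }}. Please answer: {{ $json.question }}',
  { role: 'helpful assistant', question: 'What is n8n?' }
);
// prompt === 'You are a helpful assistant. Please answer: What is n8n?'
```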

Requirements

  • n8n 1.0.0 or higher

License

MIT