
notebooklm-mcp

v1.2.1


MCP server for NotebookLM API with session support and human-like behavior


NotebookLM MCP Server

Let your CLI agents (Claude, Cursor, Codex...) chat directly with NotebookLM for zero-hallucination answers based on your own notebooks



The Problem

When you tell Claude Code or Cursor to "search through my local documentation", here's what happens:

  • Massive token consumption: Searching through documentation means reading multiple files repeatedly
  • Inaccurate retrieval: Searches for keywords, misses context and connections between docs
  • Hallucinations: When it can't find something, it invents plausible-sounding APIs
  • Expensive & slow: Each question requires re-reading multiple files

The Solution

Let your local agents chat directly with NotebookLM — Google's zero-hallucination knowledge base powered by Gemini 2.5 that provides intelligent, synthesized answers from your docs.

Your Task → Local Agent asks NotebookLM → Gemini synthesizes answer → Agent writes correct code

The real advantage: No more manual copy-paste between NotebookLM and your editor. Your agent asks NotebookLM directly and gets answers straight back in the CLI. It builds deep understanding through automatic follow-ups — Claude asks multiple questions in sequence, each building on the last, getting specific implementation details, edge cases, and best practices. You can save NotebookLM links to your local library with tags and descriptions, and Claude automatically selects the relevant notebook based on your current task.
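
Under the hood this is plain MCP traffic, so any MCP-capable client can drive the same flow. As a rough sketch (not the package's documented client API; the client name and argument shape below are assumptions), calling the ask_question tool directly with the official TypeScript SDK might look like this:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio the same way the editor configs below do.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "notebooklm-mcp@latest"],
});

const client = new Client({ name: "example-client", version: "0.0.1" });
await client.connect(transport);

// Ask the currently selected notebook a question. The tool name comes from
// the profile table further down; the argument field name is assumed.
const answer = await client.callTool({
  name: "ask_question",
  arguments: { question: "How does Gmail integration work in n8n?" },
});

console.log(answer.content);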


Why NotebookLM, Not Local RAG?

| Approach | Token Cost | Setup Time | Hallucinations | Answer Quality |
|----------|------------|------------|----------------|----------------|
| Feed docs to Claude | 🔴 Very high (multiple file reads) | Instant | Yes - fills gaps | Variable retrieval |
| Web search | 🟡 Medium | Instant | High - unreliable sources | Hit or miss |
| Local RAG | 🟡 Medium-High | Hours (embeddings, chunking) | Medium - retrieval gaps | Depends on setup |
| NotebookLM MCP | 🟢 Minimal | 5 minutes | Zero - refuses if unknown | Expert synthesis |

What Makes NotebookLM Superior?

  1. Pre-processed by Gemini: Upload docs once, get instant expert knowledge
  2. Natural language Q&A: Not just retrieval — actual understanding and synthesis
  3. Multi-source correlation: Connects information across 50+ documents
  4. Citation-backed: Every answer includes source references
  5. No infrastructure: No vector DBs, embeddings, or chunking strategies needed

Installation

Claude Code

claude mcp add notebooklm npx notebooklm-mcp@latest

Codex

codex mcp add notebooklm -- npx notebooklm-mcp@latest

Gemini CLI

gemini mcp add notebooklm npx notebooklm-mcp@latest

Cursor

Add to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["-y", "notebooklm-mcp@latest"]
    }
  }
}

Amp

amp mcp add notebooklm -- npx notebooklm-mcp@latest

VS Code

code --add-mcp '{"name":"notebooklm","command":"npx","args":["notebooklm-mcp@latest"]}'

Generic MCP config:

{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["notebooklm-mcp@latest"]
    }
  }
}

Alternative: Claude Code Skill

Prefer Claude Code Skills over MCP? This server is now also available as a native Claude Code Skill with a simpler setup:

NotebookLM Claude Code Skill - Clone to ~/.claude/skills and start using immediately

Key differences:

  • MCP Server (this repo): Persistent sessions, works with Claude Code, Codex, Cursor, and other MCP clients
  • Claude Code Skill: Simpler setup, Python-based, stateless queries, works only with local Claude Code

Both use the same browser automation technology and provide zero-hallucination answers from your NotebookLM notebooks.


Quick Start

1. Install the MCP server (see Installation above)

2. Authenticate (one-time)

Say in your chat (Claude/Codex):

"Log me in to NotebookLM"

A Chrome window opens → log in with Google

3. Create your knowledge base

Go to notebooklm.google.com → Create notebook → Upload your docs:

  • 📄 PDFs, Google Docs, markdown files
  • 🔗 Websites, GitHub repos
  • 🎥 YouTube videos
  • 📚 Multiple sources per notebook

Share: ⚙️ Share → Anyone with link → Copy

4. Let Claude use it

"I'm building with [library]. Here's my NotebookLM: [link]"

That's it. Claude now asks NotebookLM whatever it needs, building expertise before writing code.


Real-World Example

Building an n8n Workflow Without Hallucinations

Challenge: n8n's API is new — Claude hallucinates node names and functions.

Solution:

  1. Downloaded complete n8n documentation → merged into manageable chunks
  2. Uploaded to NotebookLM
  3. Told Claude: "Build me a Gmail spam filter workflow. Use this NotebookLM: [link]"

Watch the AI-to-AI conversation:

Claude → "How does Gmail integration work in n8n?"
NotebookLM → "Use Gmail Trigger with polling, or Gmail node with Get Many..."

Claude → "How to decode base64 email body?"
NotebookLM → "Body is base64url encoded in payload.parts, use Function node..."

Claude → "How to parse OpenAI response as JSON?"
NotebookLM → "Set responseFormat to json, use {{ $json.spam }} in IF node..."

Claude → "What about error handling if the API fails?"
NotebookLM → "Use Error Trigger node with Continue On Fail enabled..."

Claude → ✅ "Here's your complete workflow JSON..."

Result: Perfect workflow on first try. No debugging hallucinated APIs.


Core Features

Zero Hallucinations

NotebookLM refuses to answer if information isn't in your docs. No invented APIs.

Autonomous Research

Claude asks follow-up questions automatically, building complete understanding before coding.

Smart Library Management

Save NotebookLM links with tags and descriptions. Claude auto-selects the right notebook for your task.

"Add [link] to library tagged 'frontend, react, components'"

Deep, Iterative Research

  • Claude automatically asks follow-up questions to build complete understanding
  • Each answer triggers deeper questions until Claude has all the details
  • Example: For n8n workflow, Claude asked multiple sequential questions about Gmail integration, error handling, and data transformation

Cross-Tool Sharing

Set up once, use everywhere. Claude Code, Codex, Cursor — all share the same library.

Deep Cleanup Tool

Fresh start anytime. Scans your entire system for NotebookLM data and shows a categorized preview.


Tool Profiles

Reduce token usage by loading only the tools you need. Each tool consumes context tokens — fewer tools = faster responses and lower costs.

Available Profiles

| Profile | Tools | Use Case |
|---------|-------|----------|
| minimal | 5 | Query-only: ask_question, get_health, list_notebooks, select_notebook, get_notebook |
| standard | 10 | + Library management: setup_auth, list_sessions, add_notebook, update_notebook, search_notebooks |
| full | 16 | All tools including cleanup_data, re_auth, remove_notebook, reset_session, close_session, get_library_stats |

Configure via CLI

# Check current settings
npx notebooklm-mcp config get

# Set a profile
npx notebooklm-mcp config set profile minimal
npx notebooklm-mcp config set profile standard
npx notebooklm-mcp config set profile full

# Disable specific tools (comma-separated)
npx notebooklm-mcp config set disabled-tools "cleanup_data,re_auth"

# Reset to defaults
npx notebooklm-mcp config reset

Configure via Environment Variables

# Set profile
export NOTEBOOKLM_PROFILE=minimal

# Disable specific tools
export NOTEBOOKLM_DISABLED_TOOLS="cleanup_data,re_auth,remove_notebook"

Settings are saved to ~/.config/notebooklm-mcp/settings.json and persist across sessions. Environment variables override file settings.
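
That override order can be pictured as a small merge, roughly like this (a sketch under assumed key names, not the package's actual loader; only the path and env var names come from the docs above):

import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Assumed defaults and settings-file keys.
const DEFAULTS = { profile: "standard", disabledTools: [] as string[] };

function loadSettings() {
  let fromFile: Partial<typeof DEFAULTS> = {};
  try {
    const path = join(homedir(), ".config", "notebooklm-mcp", "settings.json");
    fromFile = JSON.parse(readFileSync(path, "utf8"));
  } catch {
    // No settings file yet: fall back to defaults.
  }
  return {
    ...DEFAULTS,
    ...fromFile,
    // Environment variables win over the settings file.
    ...(process.env.NOTEBOOKLM_PROFILE && { profile: process.env.NOTEBOOKLM_PROFILE }),
    ...(process.env.NOTEBOOKLM_DISABLED_TOOLS && {
      disabledTools: process.env.NOTEBOOKLM_DISABLED_TOOLS.split(","),
    }),
  };
}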


Architecture

graph LR
    A[Your Task] --> B[Claude/Codex]
    B --> C[MCP Server]
    C --> D[Chrome Automation]
    D --> E[NotebookLM]
    E --> F[Gemini 2.5]
    F --> G[Your Docs]
    G --> F
    F --> E
    E --> D
    D --> C
    C --> B
    B --> H[Accurate Code]
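
On the server side of that diagram, the MCP layer is what exposes tools like ask_question to the agent. A minimal sketch of that shape using the official TypeScript SDK (the real server registers more tools and drives Chrome instead of the stub below):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stub standing in for the real Chrome automation against notebooklm.google.com.
async function askNotebookLM(question: string): Promise<string> {
  return `(NotebookLM's answer to: ${question})`;
}

const server = new McpServer({ name: "notebooklm", version: "1.2.1" });

// One tool from the "minimal" profile; the handler returns plain MCP text content.
server.tool("ask_question", { question: z.string() }, async ({ question }) => ({
  content: [{ type: "text" as const, text: await askNotebookLM(question) }],
}));

await server.connect(new StdioServerTransport());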

Common Commands

| Intent | Say | Result |
|--------|-----|--------|
| Authenticate | "Open NotebookLM auth setup" or "Log me in to NotebookLM" | Chrome opens for login |
| Add notebook | "Add [link] to library" | Saves notebook with metadata |
| List notebooks | "Show our notebooks" | Lists all saved notebooks |
| Research first | "Research this in NotebookLM before coding" | Multi-question session |
| Select notebook | "Use the React notebook" | Sets active notebook |
| Update notebook | "Update notebook tags" | Modify metadata |
| Remove notebook | "Remove [notebook] from library" | Deletes from library |
| View browser | "Show me the browser" | Watch live NotebookLM chat |
| Fix auth | "Repair NotebookLM authentication" | Clears and re-authenticates |
| Switch account | "Re-authenticate with different Google account" | Changes account |
| Clean restart | "Run NotebookLM cleanup" | Removes all data for fresh start |
| Keep library | "Cleanup but keep my library" | Preserves notebooks |
| Delete all data | "Delete all NotebookLM data" | Complete removal |


Comparison to Alternatives

vs. Downloading docs locally

  • You: Download docs → Claude: "search through these files"
  • Problem: Claude reads thousands of files → massive token usage, often misses connections
  • NotebookLM: Pre-indexed by Gemini, semantic understanding across all docs

vs. Web search

  • You: "Research X online"
  • Problem: Outdated info, hallucinated examples, unreliable sources
  • NotebookLM: Only your trusted docs, always current, with citations

vs. Local RAG setup

  • You: Set up embeddings, vector DB, chunking strategy, retrieval pipeline
  • Problem: Hours of setup, tuning retrieval, still gets "creative" with gaps
  • NotebookLM: Upload docs → done. Google handles everything.

FAQ

Is it really zero hallucinations? Yes. NotebookLM is specifically designed to only answer from uploaded sources. If it doesn't know, it says so.

What about rate limits? The free tier has daily query limits per Google account. Quick account switching is supported so you can keep researching.

How secure is this? Chrome runs locally. Your credentials never leave your machine. Use a dedicated Google account if concerned.

Can I see what's happening? Yes! Say "Show me the browser" to watch the live NotebookLM conversation.

What makes this better than Claude's built-in knowledge? Your docs are always current. No training cutoff. No hallucinations. Perfect for new libraries, internal APIs, or fast-moving projects.



The Bottom Line

Without NotebookLM MCP: Write code → Find it's wrong → Debug hallucinated APIs → Repeat

With NotebookLM MCP: Claude researches first → Writes correct code → Ship faster

Stop debugging hallucinations. Start shipping accurate code.

# Get started in 30 seconds
claude mcp add notebooklm npx notebooklm-mcp@latest

Disclaimer

This tool automates browser interactions with NotebookLM to make your workflow more efficient. However, a few friendly reminders:

About browser automation: While I've built in humanization features (realistic typing speeds, natural delays, mouse movements) to make the automation behave more naturally, I can't guarantee Google won't detect or flag automated usage. I recommend using a dedicated Google account for automation rather than your primary account—think of it like web scraping: probably fine, but better safe than sorry!

About CLI tools and AI agents: CLI tools like Claude Code, Codex, and similar AI-powered assistants are incredibly powerful, but they can make mistakes. Please use them with care and awareness:

  • Always review changes before committing or deploying
  • Test in safe environments first
  • Keep backups of important work
  • Remember: AI agents are assistants, not infallible oracles

I built this tool for myself because I was tired of the copy-paste dance between NotebookLM and my editor. I'm sharing it in the hope it helps others too, but I can't take responsibility for any issues, data loss, or account problems that might occur. Use at your own discretion and judgment.

That said, if you run into problems or have questions, feel free to open an issue on GitHub. I'm happy to help troubleshoot!


Contributing

Found a bug? Have a feature idea? Open an issue or submit a PR!

License

MIT — Use freely in your projects.


Built out of frustration with hallucinated APIs, powered by Google's NotebookLM

Star on GitHub if this saves you debugging time!