@pan-sec/notebooklm-mcp
v2026.1.10
Security-hardened MCP server for NotebookLM API with enterprise compliance (GDPR, SOC2, CSSF)
NotebookLM MCP Server (Security Hardened)
🏆 The World's Most Advanced NotebookLM MCP Server
Zero-hallucination answers • Gemini Deep Research • 14 Security Layers • Enterprise Compliance
What's New 2026 • Deep Research • Document API • Create Notebooks • Security • Install
The only NotebookLM MCP with enterprise-grade security, post-quantum encryption, and full Gemini API integration.
Security-hardened fork of PleasePrompto/notebooklm-mcp • Maintained by Pantheon Security
⚡ TL;DR — What You Get
- 🔍 Query your NotebookLM notebooks — source-grounded, zero-hallucination answers
- 📚 Create & manage notebooks programmatically — no manual clicking
- 🎙️ Generate audio overviews — podcast-style summaries of your docs
- 🔬 Gemini Deep Research — comprehensive multi-source research (optional API)
- 📄 Document API — upload & query PDFs without browser (optional API)
- 🔐 14 security layers — post-quantum encryption, audit logs, secrets scanning
- ✅ Enterprise compliance — GDPR, SOC2, CSSF ready
- 💡 No API key required — core features work with just browser auth
🚀 What's New in 2026
Latest: v2026.1.8 — Updated dependencies, post-quantum crypto improvements
| Version | Highlights |
|---------|------------|
| v2026.1.8 | Major dependency updates (zod 4.x, dotenv 17.x, post-quantum 0.5.4) |
| v2026.1.7 | MCP Protocol UX: tool icons, human-friendly titles, behavior annotations |
| v2026.1.4 | Defense-in-depth path validation, security hardening |
| v2026.1.1 | Deep health checks, chat history extraction, context management |
# Quick install
claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
Why Choose This MCP?
| Capability | Other MCPs | This MCP |
|------------|------------|----------|
| Query NotebookLM | ✅ Basic | ✅ + session management, quotas |
| Create notebooks programmatically | ❌ | ✅ UNIQUE |
| Gemini Deep Research | ❌ | ✅ EXCLUSIVE |
| Document API (no browser) | ❌ | ✅ EXCLUSIVE |
| Post-quantum encryption | ❌ | ✅ Future-proof |
| Enterprise compliance | ❌ | ✅ GDPR/SOC2/CSSF |
| Chat history extraction | ❌ | ✅ NEW |
| Deep health verification | ❌ | ✅ NEW |
Core NotebookLM (No API Key Required)
| Tool | Description |
|------|-------------|
| ask_question | Query notebooks with source-grounded answers |
| add_notebook | Add a notebook to your library |
| list_notebooks | List all notebooks in library |
| select_notebook | Set active notebook |
| update_notebook | Update notebook metadata |
| remove_notebook | Remove from library |
| create_notebook | Programmatically create new notebooks |
| batch_create_notebooks | Create multiple notebooks at once |
| sync_library | Sync library with NotebookLM |
| list_sources | List sources in a notebook |
| add_source | Add source to notebook |
| remove_source | Remove source from notebook |
| generate_audio_overview | Create podcast-style audio |
| get_audio_status | Check audio generation status |
| download_audio | Download generated audio |
| list_sessions | List active sessions |
| close_session | Close a session |
| reset_session | Reset session history |
| get_health | Check server & auth status |
| setup_auth | Initial authentication |
| re_auth | Re-authenticate |
| cleanup_data | Clean up local data |
| get_quota | Check usage quotas |
| set_quota_tier | Set quota tier |
| get_query_history | View past queries |
| get_notebook_chat_history | Extract browser chat history |
| get_project_info | Get project context |
| export_library | Export library backup |
Gemini API (Optional - Requires GEMINI_API_KEY)
| Tool | Description |
|------|-------------|
| deep_research | Comprehensive research agent |
| gemini_query | Fast grounded queries |
| get_research_status | Check research progress |
| upload_document | Upload docs to Gemini |
| query_document | Query uploaded documents |
| query_chunked_document | Query large documents |
| list_documents | List uploaded documents |
| delete_document | Delete uploaded document |
Webhooks & Integrations
| Tool | Description |
|------|-------------|
| configure_webhook | Set up webhook notifications |
| list_webhooks | List configured webhooks |
| test_webhook | Test webhook delivery |
| remove_webhook | Remove a webhook |
Enterprise Compliance (16 additional tools)
See Compliance Documentation for full list.
Gemini Deep Research (v1.8.0)
The most powerful research capability for AI agents — now in your MCP toolkit.
v1.8.0 introduces the Gemini Interactions API as a stable, API-based research backend alongside browser automation. This gives your agents access to Google's state-of-the-art Deep Research agent.
Why This Matters
| Challenge | Solution |
|-----------|----------|
| Browser UI changes break automation | Gemini API is stable and versioned |
| Need comprehensive research but no research agent | Deep Research agent does it for you |
| Want current information with citations | Google Search grounding built-in |
| Need reliable, fast queries | API-based = no UI dependencies |
New Tools
deep_research — Comprehensive Research Agent
"Research the security implications of post-quantum cryptography adoption in financial services"- Runs Google's Deep Research agent (same as Gemini Advanced)
- Takes 1-5 minutes for comprehensive, web-grounded analysis
- Returns structured answers with citations and sources
- Perfect for complex topics requiring multi-source synthesis (call-and-poll sketch below)
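If you script this flow instead of prompting it, a minimal sketch might look like the following. The argument key `query`, the `interaction_id` field, and the `done` flag are assumptions based on this README, not the server's published schema, so list the tools first to confirm the real field names.

```typescript
// Hypothetical call-and-poll loop for deep_research / get_research_status.
// "query", "interaction_id", and "done" are assumed names, not a verified schema.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<Record<string, unknown>>;

export async function runDeepResearch(callTool: CallTool, topic: string) {
  // Kick off the Deep Research agent (expect 1-5 minutes of work).
  const started = await callTool("deep_research", { query: topic });
  const interactionId = started["interaction_id"] as string | undefined;
  if (!interactionId) return started; // synchronous result, nothing to poll

  // Poll the background task until it reports completion.
  for (;;) {
    const status = await callTool("get_research_status", { interaction_id: interactionId });
    if (status["done"]) return status;
    await new Promise((resolve) => setTimeout(resolve, 15_000)); // re-check every 15 s
  }
}
```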
gemini_query — Fast Grounded Queries
"What are the latest CVEs for Log4j in 2025?" (with Google Search)
"Calculate the compound interest on $10,000 at 5% over 10 years" (with code execution)
"Summarize this security advisory: [URL]" (with URL context)- Google Search grounding — Current information, not just training data
- Code execution — Run calculations, data analysis
- URL context — Analyze web pages on demand
- Models: gemini-2.5-flash (fast), gemini-2.5-pro (powerful), gemini-3-flash-preview (latest)
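The payload for a grounded query is small. The sketch below only builds the argument object; `model` and `tools` are assumed parameter names derived from the capabilities listed above, so verify them against the tool schema before relying on them.

```typescript
// Hypothetical gemini_query arguments; parameter names are assumptions, not the
// server's published schema. Send them with your MCP client's callTool.
const geminiQueryArgs = {
  query: "What are the latest CVEs for Log4j in 2025?",
  model: "gemini-2.5-flash", // or "gemini-2.5-pro" / "gemini-3-flash-preview"
  tools: ["google_search"],  // "code_execution" and "url_context" per the list above
};

console.log(JSON.stringify(geminiQueryArgs, null, 2));
```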
get_research_status — Background Task Monitoring
Run deep research in the background and check progress:
"Start researching [topic] in the background"
... continue other work ...
"Check research status for interaction_abc123"Hybrid Architecture
┌──────────────────────────────────────────────────────────────────────────────┐
│ NotebookLM MCP Server v2026.1.x │
├──────────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌────────────────────────────────┐ ┌──────────────────────────────────┐ │
│ │ BROWSER AUTOMATION │ │ GEMINI API │ │
│ │ ✅ NO API KEY NEEDED │ │ ⚡ OPTIONAL - needs API key │ │
│ ├────────────────────────────────┤ ├──────────────────────────────────┤ │
│ │ │ │ │ │
│ │ QUERY │ │ RESEARCH │ │
│ │ • ask_question │ │ • deep_research │ │
│ │ • get_notebook_chat_history │ │ • gemini_query │ │
│ │ │ │ • get_research_status │ │
│ │ CREATE & MANAGE │ │ │ │
│ │ • create_notebook │ │ DOCUMENTS │ │
│ │ • batch_create_notebooks │ │ • upload_document │ │
│ │ • manage_sources │ │ • query_document │ │
│ │ • generate_audio │ │ • query_chunked_document │ │
│ │ • sync_notebook │ │ • list/delete_document │ │
│ │ │ │ │ │
│ │ HEALTH & SESSIONS v2026 │ │ │ │
│ │ • get_health (deep_check) │ │ Fast API • 48h retention │ │
│ │ • get_query_history │ │ Auto-chunking for large PDFs │ │
│ └────────────────────────────────┘ └──────────────────────────────────┘ │
│ │
│ ┌─────────────────────────────────┐ │
│ │ 14 SECURITY LAYERS │ │
│ │ Post-Quantum • Audit Logs │ │
│ │ Cert Pinning • Memory Wipe │ │
│ │ GDPR • SOC2 • CSSF Ready │ │
│ └─────────────────────────────────┘ │
└──────────────────────────────────────────────────────────────────────────────┘
💡 Gemini API is completely optional! All core NotebookLM features (ask_question, notebooks, sessions, audio) work via browser automation with no API key required. The Gemini tools below are bonus features for users who want direct API access.
Gemini Configuration (Optional)
# Only required if you want Gemini API features (deep_research, gemini_query, upload_document)
GEMINI_API_KEY=your-api-key # Get from https://aistudio.google.com/apikey
# Optional settings
GEMINI_DEFAULT_MODEL=gemini-2.5-flash # Default model
GEMINI_DEEP_RESEARCH_ENABLED=true # Enable Deep Research
GEMINI_TIMEOUT_MS=30000 # API timeout
When to Use Which
| Task | Best Tool | Why |
|------|-----------|-----|
| Questions about YOUR documents | ask_question | Grounded on your uploaded sources |
| Comprehensive topic research | deep_research | Multi-source synthesis with citations |
| Current events / recent info | gemini_query + google_search | Live web data |
| Code calculations | gemini_query + code_execution | Reliable computation |
| Analyze a webpage | gemini_query + url_context | Direct page analysis |
| Quick PDF/document analysis | upload_document + query_document | Fast API, no browser (NEW!) |
📄 Document API (v1.9.0)
Upload and query documents directly via Gemini API — no browser automation needed.
v1.9.0 introduces the Gemini Files API for fast, reliable document analysis. Upload PDFs, analyze them instantly, and delete when done.
Why This Matters
| Feature | Browser Mode | Document API |
|---------|--------------|--------------|
| Speed | Seconds | Milliseconds |
| Reliability | UI-dependent | API-stable |
| File Support | Via NotebookLM | 50MB PDFs, 1000 pages |
| Retention | Permanent | 48 hours |
| Setup | Auth + cookies | Just API key |
New Tools
upload_document — Fast Document Upload
Upload any document to Gemini for instant querying:
Upload /path/to/research-paper.pdf
- Supported: PDF (50MB, 1000 pages), TXT, MD, HTML, CSV, JSON, DOCX, images, audio, video
- 48-hour retention — files auto-expire, or delete manually
- Returns a file ID for querying
query_document — Ask Questions About Documents
"What are the main findings in this research paper?"
"Summarize section 3 of the document"
"Extract all statistics mentioned in the PDF"- Full document understanding (text, tables, charts, diagrams)
- Multi-document queries (compare multiple files)
- Fast API response (no browser wait; round-trip sketch below)
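For scripted use, a minimal round trip over the MCP TypeScript SDK could look like the sketch below. The argument keys (`path`, `file_name`, `query`) are assumptions inferred from the workflow later in this section; inspect the tool schemas for the exact names your installed version expects.

```typescript
// Hedged sketch: upload a PDF, query it, then delete it, via the MCP TypeScript SDK.
// The argument keys ("path", "file_name", "query") are assumptions; check tools/list.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "doc-api-demo", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "npx", args: ["-y", "@pan-sec/notebooklm-mcp@latest"] }),
);

const uploaded = await client.callTool({
  name: "upload_document",
  arguments: { path: "/research/paper.pdf" },
});
console.log(uploaded); // expect a file id such as "files/abc123" in the result

// Query the uploaded file, then delete it rather than waiting for the 48h expiry:
// await client.callTool({ name: "query_document",
//   arguments: { file_name: "files/abc123", query: "What methodology was used?" } });
// await client.callTool({ name: "delete_document", arguments: { file_name: "files/abc123" } });
await client.close();
```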
list_documents — See All Uploaded Files
List all my uploaded documents
Shows file names, sizes, MIME types, and expiration times.
delete_document — Clean Up Sensitive Files
Delete file xyz123
Immediately remove files (don't wait for 48h expiration).
Workflow Example
1. upload_document("/research/paper.pdf")
→ Returns: files/abc123
2. query_document("files/abc123", "What methodology was used?")
→ Returns: "The paper uses a mixed-methods approach combining..."
3. query_document("files/abc123", "List all cited authors")
→ Returns: "Smith et al. (2024), Johnson (2023)..."
4. delete_document("files/abc123")
→ File removed
Auto-Chunking for Large PDFs (v1.10.0)
No file size limits — PDFs of any size are automatically handled.
When you upload a PDF that exceeds Gemini's limits (50MB or 1000 pages), the system automatically:
- Detects the oversized PDF
- Splits it into optimal chunks (500 pages each)
- Uploads all chunks in parallel
- Returns chunk metadata for querying (range computation sketched below)
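As a rough illustration, 500-page ranges for a 2,000-page file can be computed like this; the package's real splitter (and its chunk size) may differ.

```typescript
// Illustrative chunk-range computation only, not the package's actual splitter.
function chunkRanges(totalPages: number, pagesPerChunk = 500) {
  const ranges: { pageStart: number; pageEnd: number }[] = [];
  for (let start = 1; start <= totalPages; start += pagesPerChunk) {
    ranges.push({ pageStart: start, pageEnd: Math.min(start + pagesPerChunk - 1, totalPages) });
  }
  return ranges;
}

console.log(chunkRanges(2000));
// → [ { pageStart: 1, pageEnd: 500 }, { pageStart: 501, pageEnd: 1000 }, ... ]
```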
upload_document("/research/massive-2000-page-report.pdf")
→ Returns:
{
"wasChunked": true,
"totalPages": 2000,
"chunks": [
{ "fileName": "files/abc1", "pageStart": 1, "pageEnd": 500 },
{ "fileName": "files/abc2", "pageStart": 501, "pageEnd": 1000 },
{ "fileName": "files/abc3", "pageStart": 1001, "pageEnd": 1500 },
{ "fileName": "files/abc4", "pageStart": 1501, "pageEnd": 2000 }
],
"allFileNames": ["files/abc1", "files/abc2", "files/abc3", "files/abc4"]
}
query_chunked_document — Query All Chunks at Once
For chunked documents, use this tool to query all parts and get an aggregated answer:
query_chunked_document(
file_names: ["files/abc1", "files/abc2", "files/abc3", "files/abc4"],
query: "What are the key recommendations in this report?"
)
→ Queries each chunk, then synthesizes a unified answer
When to Use Document API vs NotebookLM
| Scenario | Use |
|----------|-----|
| Quick one-off document analysis | Document API — fast, no setup |
| Building a permanent knowledge base | NotebookLM — permanent storage |
| Analyzing sensitive documents | Document API — 48h auto-delete |
| Multi-source research over time | NotebookLM — organized notebooks |
| CI/CD pipeline document processing | Document API — API-native |
| Large PDFs (1000+ pages) | Document API — auto-chunking |
Programmatic Notebook Creation (v1.7.0+)
Create NotebookLM notebooks entirely from code — no manual clicks required.
Most MCP servers can only read from NotebookLM. This one can create notebooks, add sources, and generate audio — all programmatically.
create_notebook — Build Notebooks Instantly
Create a complete notebook with multiple sources in one command:
{
"name": "Security Research 2025",
"sources": [
{ "type": "url", "value": "https://owasp.org/Top10" },
{ "type": "file", "value": "/path/to/security-report.pdf" },
{ "type": "text", "value": "Custom analysis notes...", "title": "My Notes" }
],
"description": "OWASP security best practices",
"topics": ["security", "owasp", "vulnerabilities"]
}
Supported source types:
- URL — Web pages, documentation, articles
- File — PDF, DOCX, TXT, and more
- Text — Raw text, code snippets, notes (argument shapes sketched below)
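The JSON example above implies roughly the shapes below. Treat these types as an inference from that example rather than the package's exported schema.

```typescript
// Inferred from the create_notebook example above; the real schema may differ.
type NotebookSource =
  | { type: "url"; value: string }
  | { type: "file"; value: string }                  // absolute path on disk
  | { type: "text"; value: string; title?: string };

interface CreateNotebookArgs {
  name: string;
  sources: NotebookSource[];
  description?: string;
  topics?: string[];
}

const args: CreateNotebookArgs = {
  name: "Security Research 2025",
  sources: [{ type: "url", value: "https://owasp.org/Top10" }],
  topics: ["security", "owasp"],
};
console.log(args.name);
```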
batch_create_notebooks — Scale Up
Create up to 10 notebooks in a single operation:
{
"notebooks": [
{ "name": "React Docs", "sources": [{ "type": "url", "value": "https://react.dev/reference" }] },
{ "name": "Node.js API", "sources": [{ "type": "url", "value": "https://nodejs.org/api/" }] },
{ "name": "TypeScript Handbook", "sources": [{ "type": "url", "value": "https://www.typescriptlang.org/docs/" }] }
]
}
Perfect for:
- Setting up project documentation libraries
- Onboarding new team members with curated knowledge bases
- Creating topic-specific research notebooks in bulk
manage_sources — Dynamic Source Management
Add or remove sources from existing notebooks:
{
"notebook_id": "abc123",
"action": "add",
"sources": [{ "type": "url", "value": "https://new-documentation.com" }]
}
generate_audio — Audio Overview Creation
Generate NotebookLM's famous "Audio Overview" podcasts programmatically:
"Generate an audio overview for my Security Research notebook"sync_notebook — Keep Sources Updated
Sync notebook sources from a local directory:
{
"notebook_id": "abc123",
"directory": "/path/to/docs",
"patterns": ["*.md", "*.pdf"]
}
Why This Matters
| Traditional Workflow | With This MCP |
|---------------------|---------------|
| Manually create notebook in browser | create_notebook → done |
| Click "Add source" for each document | Batch add in single command |
| Navigate UI to generate audio | generate_audio → podcast ready |
| Update sources by hand | sync_notebook from local files |
Your agent can now build entire knowledge bases autonomously.
📊 Query History & Chat Extraction (v2026.1.0)
Track your research and recover conversations from NotebookLM notebooks.
get_query_history — Review Past Research (v1.10.8)
All queries made through the MCP are automatically logged for review:
"Show me my recent NotebookLM queries"
"Find queries about security from last week"
"What did I ask the fine-tuning notebook?"- Automatic logging — every Q&A pair saved with metadata
- Search — find specific topics across all queries
- Filter — by notebook, session, or date
- Quota tracking — see query counts and timing (filter payload sketched below)
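A hypothetical filter payload, based only on the capabilities listed above; the actual parameter names may differ, so check the tool schema before scripting against it.

```typescript
// Hypothetical get_query_history filters; names are guesses, not the tool's schema.
const historyArgs = {
  search: "security",          // topic search across all logged queries
  notebook_id: "my-research",  // restrict to one notebook
  since: "2026-01-01",         // date filter
  limit: 50,
};
// e.g. client.callTool({ name: "get_query_history", arguments: historyArgs });
console.log(JSON.stringify(historyArgs, null, 2));
```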
get_notebook_chat_history — Extract Browser Conversations (v2026.1.0)
Extract conversation history directly from a NotebookLM notebook's chat UI with context management to avoid overwhelming your AI context window:
Quick audit (preview mode):
{ "notebook_id": "my-research", "preview_only": true }Returns message counts without content — test the water before extracting.
Export to file (avoids context overflow):
{ "notebook_id": "my-research", "output_file": "/tmp/chat-history.json" }Dumps full history to disk instead of returning to context.
Paginate through history:
{ "notebook_id": "my-research", "limit": 20, "offset": 0 }
{ "notebook_id": "my-research", "limit": 20, "offset": 20 }Page through large histories without loading everything at once.
Returns:
{
"notebook_url": "https://notebooklm.google.com/notebook/xxx",
"notebook_name": "My Research",
"total_messages": 150,
"returned_messages": 40,
"user_messages": 75,
"assistant_messages": 75,
"offset": 0,
"has_more": true,
"messages": [...]
}
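Those limit/offset/has_more fields make pagination a simple loop. A sketch, assuming a generic callTool helper and the response shape shown above:

```typescript
// Pages through get_notebook_chat_history using the documented limit/offset/has_more fields.
interface ChatHistoryPage { messages: unknown[]; has_more: boolean; }
type CallTool = (name: string, args: Record<string, unknown>) => Promise<ChatHistoryPage>;

export async function fetchAllMessages(callTool: CallTool, notebookId: string, pageSize = 20) {
  const all: unknown[] = [];
  for (let offset = 0; ; offset += pageSize) {
    const page = await callTool("get_notebook_chat_history", {
      notebook_id: notebookId,
      limit: pageSize,
      offset,
    });
    all.push(...page.messages);
    if (!page.has_more) break; // no more pages to fetch
  }
  return all;
}
```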
Use cases:
- Recover conversations made directly in the NotebookLM browser (not tracked by MCP)
- Audit research — see what queries were made in a notebook
- Resume context — pick up where a previous session left off
- Quota reconciliation — understand why quota seems off
Why This Fork?
The original NotebookLM MCP is excellent for productivity — but MCP servers handle sensitive data:
- Browser sessions with Google authentication
- Cookies and tokens stored on disk
- Query history that may contain proprietary information
This fork adds 14 security hardening layers to protect that data.
Security Features
| Layer | Feature | Protection |
|-------|---------|------------|
| 🔐 | Post-Quantum Encryption | ML-KEM-768 + ChaCha20-Poly1305 hybrid |
| 🔍 | Secrets Scanning | Detects 30+ credential patterns (AWS, GitHub, Slack...) |
| 📌 | Certificate Pinning | Blocks MITM attacks on Google connections |
| 🧹 | Memory Scrubbing | Zeros sensitive data after use |
| 📝 | Audit Logging | Tamper-evident logs with hash chains |
| ⏱️ | Session Timeout | 8h hard limit + 30m inactivity auto-logout |
| 🎫 | MCP Authentication | Token-based auth with brute-force lockout |
| 🛡️ | Response Validation | Detects prompt injection attempts |
| ✅ | Input Validation | URL whitelisting, sanitization |
| 🚦 | Rate Limiting | Per-session request throttling |
| 🙈 | Log Sanitization | Credentials masked in all output |
| 🐍 | MEDUSA Integration | Automated security scanning |
| 🖥️ | Cross-Platform | Native support for Linux, macOS, Windows |
Post-Quantum Ready
Traditional key exchange (RSA, ECDH) is expected to be breakable by large-scale quantum computers. This fork uses hybrid encryption:
ML-KEM-768 (Kyber) + ChaCha20-Poly1305
- ML-KEM-768: NIST-standardized post-quantum key encapsulation
- ChaCha20-Poly1305: Modern authenticated stream cipher (designed to resist timing attacks)
Even if one algorithm is broken, the other remains secure.
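For illustration, here is roughly what that hybrid looks like with the @noble libraries this fork builds on. This is a sketch in the spirit of the design, not the fork's actual code, and the import paths and field names follow those libraries' documentation as understood here, so double-check them against the current @noble releases.

```typescript
// Hybrid KEM + AEAD sketch (illustrative, not this package's implementation).
import { ml_kem768 } from "@noble/post-quantum/ml-kem";
import { chacha20poly1305 } from "@noble/ciphers/chacha";
import { randomBytes } from "node:crypto";

// Recipient generates an ML-KEM-768 keypair once.
const { publicKey, secretKey } = ml_kem768.keygen();

// Sender encapsulates a fresh 32-byte shared secret against the public key...
const { cipherText, sharedSecret } = ml_kem768.encapsulate(publicKey);

// ...and uses it as the ChaCha20-Poly1305 key for the actual payload.
const nonce = randomBytes(12);
const sealed = chacha20poly1305(sharedSecret, nonce).encrypt(
  new TextEncoder().encode("session cookie material"),
);

// Recipient recovers the same secret from the KEM ciphertext and decrypts.
const recovered = ml_kem768.decapsulate(cipherText, secretKey);
const opened = chacha20poly1305(recovered, nonce).decrypt(sealed);
console.log(new TextDecoder().decode(opened));
```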
Cross-Platform Support
Full native support for all major operating systems:
| Platform | File Permissions | Data Directory |
|----------|-----------------|----------------|
| Linux | Unix chmod (0o600/0o700) | ~/.local/share/notebooklm-mcp/ |
| macOS | Unix chmod (0o600/0o700) | ~/Library/Application Support/notebooklm-mcp/ |
| Windows | ACLs via icacls (current user only) | %LOCALAPPDATA%\notebooklm-mcp\ |
All sensitive files (encryption keys, auth tokens, audit logs) are automatically protected with owner-only permissions on every platform.
Enterprise Compliance (v1.6.0+)
Full compliance support for regulated industries:
| Regulation | Features |
|------------|----------|
| GDPR | Consent management, DSAR handling, right to erasure, data portability |
| SOC2 Type II | Hash-chained audit logs, incident response, availability monitoring |
| CSSF | 7-year retention, SIEM integration, policy documentation |
Compliance Tools (16 MCP tools)
compliance_dashboard - Real-time compliance status
compliance_report - Generate audit reports (JSON/CSV/HTML)
compliance_evidence - Collect evidence packages
grant_consent - Record user consent
submit_dsar - Handle data subject requests
request_erasure - Right to be forgotten
export_user_data - Data portability export
create_incident - Security incident management
...and 8 more
See COMPLIANCE-SPEC.md for full documentation.
Installation
What Works Out of the Box (No API Key)
All core NotebookLM features work immediately with just browser authentication:
| Feature | Tool | Description |
|---------|------|-------------|
| 🔍 Query notebooks | ask_question | Get source-grounded answers from your documents |
| 📚 Manage library | add_notebook, list_notebooks, etc. | Organize your notebook collection |
| 🎙️ Audio overviews | generate_audio_overview | Create podcast-style summaries |
| 📝 Create notebooks | create_notebook | Programmatically create new notebooks |
| 🔄 Session management | list_sessions, reset_session | Manage conversation context |
| 📊 Chat history | get_notebook_chat_history | Extract past conversations |
| ❤️ Health checks | get_health | Verify authentication status |
Optional: Add GEMINI_API_KEY for bonus features like deep_research, gemini_query, and upload_document.
Claude Code
claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
With Authentication + Gemini (Recommended)
claude mcp add notebooklm \
--env NLMCP_AUTH_ENABLED=true \
--env NLMCP_AUTH_TOKEN=$(openssl rand -base64 32) \
--env GEMINI_API_KEY=your-gemini-api-key \
-- npx @pan-sec/notebooklm-mcp@latest
Codex
codex mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
Cursor
Add to ~/.cursor/mcp.json:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"NLMCP_AUTH_ENABLED": "true",
"NLMCP_AUTH_TOKEN": "your-secure-token",
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Antigravity
Add to ~/.gemini/antigravity/mcp_config.json (macOS/Linux) or %USERPROFILE%\.gemini\antigravity\mcp_config.json (Windows):
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"]
}
}
}
With optional env vars:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
Note: Antigravity does NOT support ${workspaceFolder} variables. Use absolute paths.
OpenCode
Add to ~/.config/opencode/opencode.json (global) or opencode.json in project root:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"notebooklm": {
"type": "local",
"command": ["npx", "-y", "@pan-sec/notebooklm-mcp@latest"],
"enabled": true,
"environment": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Note: OpenCode uses "mcp" (not "mcpServers") and "command" is an array.
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
VS Code
Add to your VS Code settings.json:
{
"mcp": {
"servers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
}
Most MCP clients use this standard format:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Common config locations:
| Client | Config File |
|--------|-------------|
| Claude Desktop | ~/.config/claude/claude_desktop_config.json |
| Cursor | ~/.cursor/mcp.json |
| Antigravity | ~/.gemini/antigravity/mcp_config.json |
| OpenCode | ~/.config/opencode/opencode.json |
| Windsurf | ~/.codeium/windsurf/mcp_config.json |
Quick Start
1. Install (see above)
2. Authenticate
"Log me in to NotebookLM"Chrome opens → sign in with Google
3. Add your notebook
Go to notebooklm.google.com → Create notebook → Upload docs → Share link
4. Use it
"Research [topic] using this NotebookLM: [link]"5. Try Deep Research (NEW!)
"Use deep research to investigate [complex topic]"Complete Tool Reference
Complete Tool Reference
Research Tools
| Tool | Description | Backend |
|------|-------------|---------|
| ask_question | Query your NotebookLM notebooks | Browser |
| deep_research | Comprehensive research with citations | Gemini API |
| gemini_query | Fast queries with grounding tools | Gemini API |
| get_research_status | Check background research progress | Gemini API |
Notebook Management
| Tool | Description |
|------|-------------|
| add_notebook | Add notebook to library |
| list_notebooks | List all notebooks |
| get_notebook | Get notebook details |
| update_notebook | Update notebook metadata |
| remove_notebook | Remove from library |
| select_notebook | Set active notebook |
| search_notebooks | Search by query |
Source Management (v1.7.0+)
| Tool | Description |
|------|-------------|
| manage_sources | Add/remove/list sources |
| generate_audio | Create Audio Overview |
| sync_notebook | Sync sources from local files |
Session & System
| Tool | Description |
|------|-------------|
| list_sessions | View active sessions |
| close_session | Close a session |
| reset_session | Reset session chat |
| get_health | Server health check (with deep_check for UI verification) |
| get_query_history | Review past queries with search/filter |
| get_notebook_chat_history | Extract browser conversations (pagination, file export) |
| setup_auth | Initial authentication |
| re_auth | Re-authenticate |
| cleanup_data | Deep cleanup utility |
| get_library_stats | Library statistics |
| get_quota | Check usage limits and remaining quota |
Compliance (v1.6.0+)
16 compliance tools for GDPR, SOC2, and CSSF requirements.
What Gets Protected
| Data | Protection |
|------|------------|
| Browser cookies | Post-quantum encrypted at rest |
| Session tokens | Auto-expire + memory scrubbing |
| Query history | Audit logged with tamper detection |
| Google connection | Certificate pinned (MITM blocked) |
| Log output | Credentials auto-redacted |
| API responses | Scanned for leaked secrets |
| Gemini API key | Secure memory handling |
Configuration
All security features are enabled by default. Override via environment variables:
# Authentication
NLMCP_AUTH_ENABLED=true
NLMCP_AUTH_TOKEN=your-secret-token
# Gemini API (v1.8.0+)
GEMINI_API_KEY=your-api-key
GEMINI_DEFAULT_MODEL=gemini-2.5-flash
GEMINI_DEEP_RESEARCH_ENABLED=true
GEMINI_TIMEOUT_MS=30000
# Encryption
NLMCP_USE_POST_QUANTUM=true
NLMCP_ENCRYPTION_KEY=base64-32-bytes # Optional custom key
# Session Limits
NLMCP_SESSION_MAX_LIFETIME=28800 # 8 hours
NLMCP_SESSION_INACTIVITY=1800 # 30 minutes
# Secrets Scanning
NLMCP_SECRETS_SCANNING=true
NLMCP_SECRETS_BLOCK=false # Block on detection
NLMCP_SECRETS_REDACT=true # Auto-redact
# Certificate Pinning
NLMCP_CERT_PINNING=true
# Audit Logging
NLMCP_AUDIT_ENABLED=true
# Multi-Session Support (v2026.1.2+)
NOTEBOOK_PROFILE_STRATEGY=isolated # isolated|single|auto
NOTEBOOK_CLONE_PROFILE=true # Clone auth from base profile
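Since zod is already a dependency here, the variables above could be validated with a schema along these lines. This is a sketch only; the server's real config loader may differ.

```typescript
// Illustrative env validation with zod, not the package's actual config schema.
import { z } from "zod";

const EnvSchema = z.object({
  NLMCP_AUTH_ENABLED: z.enum(["true", "false"]).default("true").transform((v) => v === "true"),
  NLMCP_SESSION_MAX_LIFETIME: z.coerce.number().int().positive().default(28_800), // seconds
  NLMCP_SESSION_INACTIVITY: z.coerce.number().int().positive().default(1_800),    // seconds
  GEMINI_API_KEY: z.string().optional(),
  NOTEBOOK_PROFILE_STRATEGY: z.enum(["isolated", "single", "auto"]).optional(),
});

const config = EnvSchema.parse(process.env);
console.log(config.NLMCP_SESSION_MAX_LIFETIME);
```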
Multi-Session Mode
Run multiple Claude Code sessions simultaneously with isolated browser profiles:
# Add to ~/.bashrc or ~/.zshrc
export NOTEBOOK_PROFILE_STRATEGY=isolated
export NOTEBOOK_CLONE_PROFILE=true
| Variable | Values | Description |
|----------|--------|-------------|
| NOTEBOOK_PROFILE_STRATEGY | single, auto, isolated | isolated = separate profile per session |
| NOTEBOOK_CLONE_PROFILE | true, false | Clone authenticated base profile into isolated instances |
How it works:
- Each session gets its own Chrome profile (no lock conflicts)
- Isolated profiles clone from the authenticated base profile
- Auth coordination ensures cloning waits for any in-progress authentication
See SECURITY.md for complete configuration reference.
Security Scanning
Run MEDUSA security scanner:
npm run security-scan
Or integrate in CI/CD:
- name: Security Scan
  run: npx @pan-sec/notebooklm-mcp && npm run security-scan
Comparison
vs Other NotebookLM MCPs
| Feature | Others | @pan-sec/notebooklm-mcp |
|---------|--------|-------------------------|
| Zero-hallucination Q&A | ✅ | ✅ |
| Library management | ✅ | ✅ |
| Create Notebooks Programmatically | ❌ | ✅ EXCLUSIVE |
| Batch Create (10 notebooks) | ❌ | ✅ EXCLUSIVE |
| Gemini Deep Research | ❌ | ✅ EXCLUSIVE |
| Document API (no browser) | ❌ | ✅ EXCLUSIVE |
| Auto-chunking (1000+ page PDFs) | ❌ | ✅ EXCLUSIVE |
| Chat History Extraction | ❌ | ✅ NEW |
| Deep Health Verification | ❌ | ✅ NEW |
| Query History & Search | ❌ | ✅ |
| Quota Management | ❌ | ✅ |
| Source Management (add/remove) | ❌ | ✅ |
| Audio Overview Generation | ❌ | ✅ |
| Sync from Local Directories | ❌ | ✅ |
Security & Compliance (Unique to This Fork)
| Feature | Others | @pan-sec/notebooklm-mcp |
|---------|--------|-------------------------|
| Cross-platform (Linux/macOS/Windows) | ⚠️ Partial | ✅ Full |
| Post-quantum encryption | ❌ | ✅ ML-KEM-768 + ChaCha20 |
| Secrets scanning | ❌ | ✅ 30+ patterns |
| Certificate pinning | ❌ | ✅ Google MITM protection |
| Memory scrubbing | ❌ | ✅ Zero-on-free |
| Audit logging | ❌ | ✅ Hash-chained |
| MCP authentication | ❌ | ✅ Token + lockout |
| Prompt injection detection | ❌ | ✅ Response validation |
| GDPR Compliance | ❌ | ✅ Full |
| SOC2 Type II | ❌ | ✅ Full |
| CSSF (Luxembourg) | ❌ | ✅ Full |
Bottom line: If you need more than basic queries, or care about security, there's only one choice.
Version History
| Version | Highlights |
|---------|------------|
| v2026.1.1 | 🔍 Deep health check — verifies NotebookLM chat UI actually loads |
| v2026.1.0 | 📊 Chat history extraction with context management, CalVer versioning |
| v1.10.8 | Query history logging, quota tracking |
| v1.10.0 | Auto-chunking for large PDFs (1000+ pages) |
| v1.9.0 | Document API: upload, query, delete via Gemini Files API |
| v1.8.0 | Gemini Deep Research, Query with Grounding, Background Tasks |
| v1.7.0 | Programmatic notebook creation, batch operations, audio generation |
| v1.6.0 | Enterprise compliance: GDPR, SOC2 Type II, CSSF |
| v1.5.0 | Cross-platform support (Windows ACLs, macOS, Linux) |
| v1.4.0 | Post-quantum encryption, secrets scanning |
Reporting Vulnerabilities
Found a security issue? Do not open a public GitHub issue.
Email: [email protected]
Credits
- Original MCP Server: Gérôme Dexheimer — notebooklm-mcp
- Security Hardening: Pantheon Security
- Post-Quantum Crypto: @noble/post-quantum
- Gemini API: Google AI
License
MIT — Same as original.
Security hardened with 🔒 by Pantheon Security
Powered by Google Gemini 🚀
Full Security Documentation • Compliance Guide • Report Vulnerability
