
@notebook-intelligence/notebook-intelligence v3.0.0

AI coding assistant for JupyterLab

Notebook Intelligence

Notebook Intelligence (NBI) is an AI coding assistant and extensible AI framework for JupyterLab. It can use GitHub Copilot or AI models from any other LLM Provider, including local models from Ollama. NBI greatly boosts the productivity of JupyterLab users with AI assistance.

Feature Highlights

Agent Mode

In Agent Mode, the built-in AI agent creates, edits, and executes notebooks for you interactively. It can detect issues in cells and fix them for you.


Code generation with inline chat

Use the sparkle icon on the cell toolbar or the keyboard shortcuts to show the inline chat popover.

Keyboard shortcuts: Ctrl + G / Cmd + G shows the inline chat popover, Ctrl + Enter / Cmd + Enter accepts the suggestion, and pressing Escape closes the popover.


Auto-complete

Auto-complete suggestions are shown as you type. Pressing Tab accepts the suggestion. NBI provides auto-complete suggestions in code cells and Python file editors.

Chat interface

See the blog posts for more features and usage.

Installation

NBI requires JupyterLab >= 4.0.0. To install the extension, run the command below and restart JupyterLab.

pip install notebook-intelligence

Configuration options

Configuring LLM Provider and models

You can configure the model provider and model options using the Notebook Intelligence Settings dialog. You can access this dialog from the JupyterLab Settings menu -> Notebook Intelligence Settings, by using the /settings command in NBI Chat, or from the command palette. For more details, see the blog post.

Notebook Intelligence extension for JupyterLab

This extension is composed of a Python package named notebook_intelligence for the server extension and an NPM package named @notebook-intelligence/notebook-intelligence for the frontend extension.

Remembering GitHub Copilot login

Notebook Intelligence can remember your GitHub Copilot login so that you don't need to re-login after a JupyterLab or system restart. Please be aware of the security implications of using this feature.

[!CAUTION] If you configure NBI to remember your GitHub Copilot login, it will encrypt the token and store it in a data file at ~/.jupyter/nbi/user-data.json. You should never share this file with others, as they can access your tokens. The token is encrypted with a default password, so it can still be decrypted by others. To prevent that, specify a custom password using the environment variable NBI_GH_ACCESS_TOKEN_PASSWORD.

export NBI_GH_ACCESS_TOKEN_PASSWORD=my_custom_password

To let Notebook Intelligence remember your GitHub access token, go to the Notebook Intelligence Settings dialog and check the option Remember my GitHub Copilot access token.

If your stored access token fails to log in (due to expiration or other reasons), you will be prompted to log in again on the UI.

Built-in Tools

  • Notebook Edit (nbi-notebook-edit): Edit notebook using the JupyterLab notebook editor.
  • Notebook Execute (nbi-notebook-execute): Run notebooks in JupyterLab UI.
  • Python File Edit (nbi-python-file-edit): Edit Python files using the JupyterLab file editor.
  • File Edit (nbi-file-edit): Edit files in the Jupyter root directory.
  • File Read (nbi-file-read): Read files in the Jupyter root directory.
  • Command Execute (nbi-command-execute): Execute shell commands using embedded terminal in Agent UI or JupyterLab terminal.

Disabling Built-in tools

All built-in tools are enabled by default in Agent Mode. However, you can disable them outright, or leave them disabled by default and re-enable them with an environment variable.

To disable any built-in tool, use the disabled_tools config:

c.NotebookIntelligence.disabled_tools = ["nbi-notebook-execute","nbi-python-file-edit"]

Valid built-in tool values are nbi-notebook-edit, nbi-notebook-execute, nbi-python-file-edit, nbi-file-edit, nbi-file-read, nbi-command-execute.

To disable the built-in tools by default but allow re-enabling them with an environment variable, use the allow_enabling_tools_with_env config:

c.NotebookIntelligence.allow_enabling_tools_with_env = True

Then the environment variable NBI_ENABLED_BUILTIN_TOOLS can be used to re-enable specific built-in tools.

export NBI_ENABLED_BUILTIN_TOOLS=nbi-notebook-execute,nbi-python-file-edit
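Taken together, the two settings combine as sketched below (an illustrative sketch of the described semantics, not NBI's actual implementation; effective_tools is a hypothetical helper):

```python
import os

# The six built-in tool identifiers listed above
BUILTIN_TOOLS = {
    "nbi-notebook-edit", "nbi-notebook-execute", "nbi-python-file-edit",
    "nbi-file-edit", "nbi-file-read", "nbi-command-execute",
}

def effective_tools(disabled_tools, allow_enabling_tools_with_env, environ=os.environ):
    """Compute the set of enabled built-in tools (illustrative only)."""
    enabled = BUILTIN_TOOLS - set(disabled_tools)
    if allow_enabling_tools_with_env:
        # NBI_ENABLED_BUILTIN_TOOLS re-enables specific tools, comma-separated
        listed = environ.get("NBI_ENABLED_BUILTIN_TOOLS", "")
        enabled |= BUILTIN_TOOLS & {t.strip() for t in listed.split(",") if t.strip()}
    return enabled
```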

Configuration files

NBI saves configuration at ~/.jupyter/nbi/config.json. It also supports an environment-wide base configuration at <env-prefix>/share/jupyter/nbi/config.json. Organizations can ship default configuration at this environment-wide config path. User changes will be stored as overrides at ~/.jupyter/nbi/config.json.

These config files are used for saving LLM provider, model and MCP configuration. Note that API keys you enter for your custom LLM providers will also be stored in these config files.
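The layering described above could look roughly like this (a minimal sketch; load_nbi_config is a hypothetical helper and a shallow merge is an assumption, not NBI's actual code):

```python
import json
from pathlib import Path

def load_nbi_config(env_prefix, home=Path.home()):
    """Merge the environment-wide base config with user overrides.
    Later files win, so user values override the shipped defaults."""
    merged = {}
    for path in (
        Path(env_prefix) / "share/jupyter/nbi/config.json",  # org-wide defaults
        home / ".jupyter/nbi/config.json",                   # user overrides
    ):
        if path.exists():
            merged.update(json.loads(path.read_text()))
    return merged
```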

[!IMPORTANT] Note that updating config.json manually requires restarting JupyterLab to take effect.

Model Context Protocol (MCP) Support

NBI seamlessly integrates with MCP servers. It supports servers with Standard Input/Output (stdio), Server-Sent Events (SSE), and Streamable HTTP transports. MCP support is limited to server tools at the moment.

You can easily add MCP servers to NBI by editing the configuration file ~/.jupyter/nbi/mcp.json. An environment-wide base configuration is also supported using the file at <env-prefix>/share/jupyter/nbi/mcp.json.

[!NOTE] Using MCP servers requires an LLM model with tool calling capabilities. All of the GitHub Copilot models provided in NBI support this feature. If you are using other providers make sure you choose a tool calling capable model.

[!CAUTION] Note that most MCP servers are run on the same computer as your JupyterLab installation and they can make irreversible changes to your computer and/or access private data. Make sure that you only install MCP servers from trusted sources.

MCP Config file example

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/mbektas/mcp-test"
      ]
    }
  }
}

You can use Agent mode to access tools provided by MCP servers you configured.

For servers with stdio transport, you can also set additional environment variables by using the env key. Environment variables are specified as key value pairs.

"mcpServers": {
    "servername": {
        "command": "",
        "args": [],
        "env": {
            "ENV_VAR_NAME": "ENV_VAR_VALUE"
        }
    },
}

Below is an example of a server configuration with Streamable HTTP transport. For Streamable HTTP transport servers, you can also specify headers to be sent as part of the requests.

"mcpServers": {
    "remoterservername": {
        "url": "http://127.0.0.1:8080/mcp",
        "headers": {
            "Authorization": "Bearer mysecrettoken"
        }
    },
}

If you have multiple servers configured but would like to temporarily disable some, you can do so using the disabled key. In the example below, servername2 is disabled and will not be available to the @mcp chat participant.

"mcpServers": {
    "servername1": {
        "command": "",
        "args": [],
    },
    "servername2": {
        "command": "",
        "args": [],
        "disabled": true
    },
}
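As a sketch of how such a config might be consumed, the hypothetical helper below lists the servers that remain enabled (illustrative only, not NBI's actual code):

```python
import json

def enabled_mcp_servers(mcp_config_text):
    """Return names of MCP servers not marked "disabled": true."""
    config = json.loads(mcp_config_text)
    return [
        name
        for name, server in config.get("mcpServers", {}).items()
        if not server.get("disabled", False)
    ]
```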

Ruleset System

NBI includes a powerful ruleset system that allows you to define custom guidelines and best practices that are automatically injected into AI prompts. This helps ensure consistent coding standards, project-specific conventions, and domain knowledge across all AI interactions.

How It Works

Rules are markdown files with optional YAML frontmatter stored in ~/.jupyter/nbi/rules/. They are automatically discovered and applied based on context (file type, notebook kernel, chat mode).
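A minimal sketch of splitting such a rule file into its frontmatter block and markdown body might look like this (split_frontmatter is a hypothetical helper; NBI's actual parser may differ and would use a real YAML parser for the frontmatter):

```python
def split_frontmatter(text):
    """Split a rule file into (frontmatter_lines, body).
    Frontmatter is delimited by a leading and a closing '---' line;
    files without frontmatter are returned with an empty list."""
    lines = text.split("\n")
    if lines and lines[0].strip() == "---":
        for i, line in enumerate(lines[1:], start=1):
            if line.strip() == "---":
                return lines[1:i], "\n".join(lines[i + 1:]).strip()
    return [], text.strip()
```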

Creating Rules

Global Rules - Apply to all contexts:

Create a file like ~/.jupyter/nbi/rules/01-coding-standards.md:

---
priority: 10
---

# Coding Standards

- Always use type hints in Python functions
- Prefer list comprehensions over loops when appropriate
- Add docstrings to all public functions

Mode-Specific Rules - Apply only to specific chat modes:

NBI supports mode-specific rules for three modes:

  • ask - Question/answer mode
  • agent - Autonomous agent mode with tool access
  • inline-chat - Inline code generation and editing

Create a file like ~/.jupyter/nbi/rules/modes/agent/01-testing.md:

---
priority: 20
scope:
  kernels: ['python3']
---

# Testing Guidelines

When writing code in agent mode:

- Always include error handling
- Add logging for debugging
- Test edge cases

Rule Frontmatter Options

---
apply: always # 'always', 'auto', or 'manual'
active: true # Enable/disable the rule
priority: 10 # Lower numbers = higher priority
scope:
  file_patterns: # Apply to specific file patterns
    - '*.py'
    - 'test_*.ipynb'
  kernels: # Apply to specific notebook kernels
    - 'python3'
    - 'ir'
  directories: # Apply to specific directories
    - '/projects/ml'
---
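The scope options above could be evaluated roughly as follows (an illustrative sketch using Python's fnmatch; rule_applies is a hypothetical helper, and treating an empty constraint as match-all is an assumption):

```python
from fnmatch import fnmatch

def rule_applies(scope, file_path=None, kernel=None):
    """Check whether a rule's scope matches the current context.
    A missing or empty constraint matches everything."""
    patterns = scope.get("file_patterns", [])
    if patterns and not (
        file_path
        and any(fnmatch(file_path.rsplit("/", 1)[-1], p) for p in patterns)
    ):
        return False
    kernels = scope.get("kernels", [])
    if kernels and kernel not in kernels:
        return False
    directories = scope.get("directories", [])
    if directories and not (
        file_path and any(file_path.startswith(d) for d in directories)
    ):
        return False
    return True
```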

Configuration

Enable/Disable Rules System:

Edit ~/.jupyter/nbi/config.json:

{
  "rules_enabled": true
}

Auto-Reload Configuration:

Rules are automatically reloaded when changed (enabled by default). This behavior is controlled by the NBI_RULES_AUTO_RELOAD environment variable.

To disable auto-reload:

export NBI_RULES_AUTO_RELOAD=false
jupyter lab

Or to enable (default):

export NBI_RULES_AUTO_RELOAD=true
jupyter lab

Managing Rules

Rules are automatically discovered from:

  • Global rules: ~/.jupyter/nbi/rules/*.md
  • Mode-specific rules: ~/.jupyter/nbi/rules/modes/{mode}/*.md where {mode} can be:
    • ask - For question/answer interactions
    • agent - For autonomous agent operations
    • inline-chat - For inline code generation

Rules are applied in priority order (lower numbers first) and can be toggled on/off without deleting the files.
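That ordering and toggling behavior can be sketched as below (illustrative; the default priority of 100 for rules that omit it is an assumption, not documented NBI behavior):

```python
def order_rules(rules):
    """Return active rules sorted by priority, lower numbers first.
    Rules with "active": false are skipped without being deleted."""
    return sorted(
        (r for r in rules if r.get("active", True)),
        key=lambda r: r.get("priority", 100),  # assumed default priority
    )
```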

Developer documentation

For building locally and contributing, see the developer documentation.