@xpert-ai/plugin-minimax v0.0.3

MiniMax AI model plugin for XpertAI platform

Xpert Plugin: MiniMax

Overview

@xpert-ai/plugin-minimax connects MiniMax AI models to the XpertAI platform. The plugin integrates MiniMax's OpenAI-compatible API so XpertAI agents can leverage MiniMax's large language models, text embeddings, and text-to-speech capabilities.
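Because the adapter targets MiniMax's OpenAI-compatible API, a chat completion call can be sketched with plain `fetch`. This is an illustrative sketch only: the `/v1/chat/completions` path and payload shape follow the generic OpenAI convention and are assumptions, not details taken from the plugin's source.

```typescript
// Hypothetical sketch of a chat completion request against MiniMax's
// OpenAI-compatible API. Endpoint path and payload shape are assumed to
// follow the standard OpenAI convention.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for an OpenAI-style chat completion request.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  stream = true
) {
  return { model, messages, stream };
}

// Send the request to an OpenAI-compatible base URL.
async function chat(apiKey: string, baseUrl: string, body: object) {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  return res.json();
}

const body = buildChatRequest("MiniMax-M2", [
  { role: "user", content: "Hello" },
]);
```

In the plugin itself this plumbing is handled by the LangChain-based `MiniMaxLargeLanguageModel`; the sketch only shows the wire-level pattern it builds on.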

Core Features

  • Ships MiniMaxModule, which registers the NestJS provider strategy, lifecycle hooks, and configuration schema required by the plugin runtime.
  • Implements MiniMaxLargeLanguageModel, a LangChain-powered adapter built on ChatOAICompatReasoningModel that supports streaming chat completions, function calling, and token accounting callbacks for agent telemetry.
  • Provides MiniMaxTextEmbeddingModel, a custom implementation that handles MiniMax's specific embedding API format with support for document and query embedding types.
  • Exposes MiniMaxTTSModel, which supports streaming text-to-speech synthesis with multiple voice options and audio formats.
  • Shares a console-ready manifest.yaml that drives the XpertAI UI forms (icons, help links, credential prompts) for quick operator onboarding.

Installation

npm install @xpert-ai/plugin-minimax

Peer Dependencies: Ensure your host service also provides @xpert-ai/plugin-sdk, @nestjs/common, @nestjs/config, @metad/contracts, @langchain/core, @langchain/openai, zod, and tslib. Refer to package.json for exact versions.

Enabling in XpertAI

  1. Add the plugin to the service dependencies so Node.js can resolve @xpert-ai/plugin-minimax.
  2. Declare the plugin before bootstrapping the XpertAI server:
    PLUGINS=@xpert-ai/plugin-minimax
  3. In the XpertAI admin console (or config file), create a model provider pointing to minimax, then add individual models that map to the specific MiniMax model variants you want to use.

Credentials & Model Configuration

The plugin schema backs the form fields you see in the console:

| Field | Description |
| --- | --- |
| api_key | Required. Your MiniMax API Key from platform.minimaxi.com. |
| group_id | Required. Your MiniMax Group ID from platform.minimaxi.com. |
| base_url | Optional. Base URL for API requests (defaults to https://api.minimaxi.com). Useful for custom endpoints or proxy configurations. |

During validation, the plugin checks that both api_key and group_id are provided and validates the base URL format if specified.
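The validation rules above can be sketched in plain TypeScript. The real plugin validates through its configuration schema (zod is a peer dependency); this hypothetical helper only mirrors the described checks, and the field names match the table above.

```typescript
// Illustrative re-implementation of the documented credential checks:
// api_key and group_id are required; base_url, if present, must be a
// well-formed URL. Not the plugin's actual schema.

interface MiniMaxCredentials {
  api_key?: string;
  group_id?: string;
  base_url?: string;
}

function validateCredentials(creds: MiniMaxCredentials): string[] {
  const errors: string[] = [];
  if (!creds.api_key) errors.push("api_key is required");
  if (!creds.group_id) errors.push("group_id is required");
  if (creds.base_url) {
    try {
      new URL(creds.base_url); // throws TypeError on malformed URLs
    } catch {
      errors.push("base_url must be a valid URL");
    }
  }
  return errors; // empty array means the credentials pass
}
```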

Supported Models

Large Language Models (LLM)

  • MiniMax-M2.5 - Latest flagship model, excels at coding and agentic tasks (196K context)
  • MiniMax-M2.5-highspeed - High-speed variant of M2.5 with 100 tokens/s throughput
  • MiniMax-M2.1 - Enhanced reasoning model with 230B parameters (196K context)
  • MiniMax-M2.1-highspeed - High-speed variant of M2.1
  • MiniMax-M2 - MiniMax M2 model with agentic capabilities (196K context)
  • M2-her - Specialized for role-playing and multi-turn conversations (65K context)
  • minimax-m1 - MiniMax M1 open-source model (1M context)

Text Embedding Models

  • embo-01 - MiniMax embedding model

Text-to-Speech Models

  • speech-2.8-hd - Latest HD TTS with 40 languages and 7 emotional variants
  • speech-2.8-turbo - Latest turbo TTS with 40 languages and 7 emotional variants
  • speech-2.6-hd - HD voice generation supporting 40 languages
  • speech-2.6-turbo - Low-latency voice generation supporting 40 languages
  • speech-02-hd - HD Speech-02 model with 24 languages
  • speech-02-turbo - Turbo Speech-02 model with 24 languages

Model Capabilities

  • Conversational Models: MiniMaxLargeLanguageModel merges provider credentials with per-model overrides, enables streaming, and registers token usage callbacks so agent telemetry stays accurate.
  • Embedding Models: MiniMaxTextEmbeddingModel implements custom embedding logic to handle MiniMax's specific API format, supporting both document and query embedding types.
  • TTS Models: MiniMaxTTSModel supports streaming text-to-speech synthesis with configurable voice settings, audio formats (mp3, wav, opus, aac, flac), and playback speed.
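A TTS audio configuration can be sketched against the formats listed above. The option names (`format`, `speed`) are hypothetical and chosen for illustration; consult the plugin's actual configuration schema for the real field names.

```typescript
// Hypothetical helper for assembling TTS audio settings using the audio
// formats documented above. Field names are illustrative assumptions.

const AUDIO_FORMATS = ["mp3", "wav", "opus", "aac", "flac"] as const;
type AudioFormat = (typeof AUDIO_FORMATS)[number];

interface AudioSettings {
  format: AudioFormat;
  speed: number; // playback speed multiplier
}

function makeAudioSettings(format: string, speed = 1.0): AudioSettings {
  if (!AUDIO_FORMATS.includes(format as AudioFormat)) {
    throw new Error(`unsupported audio format: ${format}`);
  }
  return { format: format as AudioFormat, speed };
}
```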

Development & Debugging

From the repo root, run Nx commands for this package:

cd xpertai
npx nx build models/minimax
npx nx lint models/minimax

Artifacts land in xpertai/models/minimax/dist. Jest settings live in jest.config.ts.

License

This plugin is distributed under the MIT License located at the repository root.