
siliconflow-hunyuan-mt-mcp v0.1.0

MCP translation server using SiliconFlow Hunyuan model

SiliconFlow Hunyuan Translation MCP Server

A standalone MCP (Model Context Protocol) server for text translation using the SiliconFlow API with the Hunyuan-MT-7B model. This server exposes a single translate tool that can be used by any MCP-compatible client.

Features

  • Standalone MCP Server: Works with any MCP-compatible client
  • SiliconFlow Integration: Pre-configured for SiliconFlow with Hunyuan-MT-7B
  • OpenAI-Compatible: Can be used with any OpenAI-compatible API endpoint
  • Configurable: Supports both config files and environment variables
  • Customizable Prompts: Override the default translation prompt template
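
Under the hood, an OpenAI-compatible integration boils down to a single chat-completions request. The sketch below is hypothetical (the actual server code may differ); the `chatTranslate` name and its parameters are illustrative, but the request shape follows the standard OpenAI-compatible `/chat/completions` contract:

```typescript
// Hypothetical sketch of an OpenAI-compatible chat-completions call.
// baseUrl is e.g. "https://api.siliconflow.cn/v1"; model is e.g. "tencent/Hunyuan-MT-7B".
async function chatTranslate(
  baseUrl: string,
  apiKey: string,
  model: string,
  prompt: string
): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  // The translated text comes back as the first choice's message content.
  return data.choices[0].message.content;
}
```

Because only the base URL, key, and model name vary, the same call works against SiliconFlow, OpenAI, or a local server.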

Installation

Prerequisites

  • Node.js 18 or higher
  • An API key from SiliconFlow or another OpenAI-compatible provider

Install from Source

# Clone or download the repository
cd siliconflow-hunyuan-mt-mcp

# Install dependencies
npm install

# Build the TypeScript source
npm run build

NPM Package Usage

The package is published to npm as siliconflow-hunyuan-mt-mcp. You can run it directly without cloning or installing locally.

Run with npx (Recommended)

npx -y siliconflow-hunyuan-mt-mcp

The -y flag automatically confirms the installation prompt when the package is not already in the npx cache.

Global Installation

npm install -g siliconflow-hunyuan-mt-mcp
siliconflow-hunyuan-mt-mcp

Configuration

The server can be configured via environment variables or a JSON config file. Configuration is loaded in this priority order (later overrides earlier):

  1. Built-in defaults
  2. Config file (if specified)
  3. Environment variables
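
The priority order above amounts to a simple object spread. This is a minimal sketch, not the actual implementation; the `loadConfig` helper and its signature are assumptions, though the defaults and environment variable names match the table below:

```typescript
// Hypothetical sketch of the merge order: defaults < config file < env vars.
type Config = {
  provider: string;
  baseUrl: string;
  apiKeyEnvName: string;
  model: string;
};

const defaults: Config = {
  provider: "siliconflow",
  baseUrl: "https://api.siliconflow.cn/v1",
  apiKeyEnvName: "SILICONFLOW_API_KEY",
  model: "tencent/Hunyuan-MT-7B",
};

function loadConfig(
  fileConfig: Partial<Config>,
  env: Record<string, string | undefined>
): Config {
  const fromEnv: Partial<Config> = {};
  if (env.SILICONFLOW_HUNYUAN_MT_PROVIDER) fromEnv.provider = env.SILICONFLOW_HUNYUAN_MT_PROVIDER;
  if (env.SILICONFLOW_HUNYUAN_MT_BASE_URL) fromEnv.baseUrl = env.SILICONFLOW_HUNYUAN_MT_BASE_URL;
  if (env.SILICONFLOW_HUNYUAN_MT_MODEL) fromEnv.model = env.SILICONFLOW_HUNYUAN_MT_MODEL;
  // Later spreads win, matching the documented priority order.
  return { ...defaults, ...fileConfig, ...fromEnv };
}
```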

Configuration Options

| Option | Environment Variable | Default | Description |
|--------|---------------------|---------|-------------|
| provider | SILICONFLOW_HUNYUAN_MT_PROVIDER | siliconflow | Provider identifier |
| baseUrl | SILICONFLOW_HUNYUAN_MT_BASE_URL | https://api.siliconflow.cn/v1 | API base URL |
| apiKeyEnvName | SILICONFLOW_HUNYUAN_MT_API_KEY_ENV_NAME | SILICONFLOW_API_KEY | Environment variable name containing the API key |
| model | SILICONFLOW_HUNYUAN_MT_MODEL | tencent/Hunyuan-MT-7B | Model identifier |
| promptTemplate | SILICONFLOW_HUNYUAN_MT_PROMPT_TEMPLATE | See below | Custom prompt template |

Config File

Set the config file path via:

  • SILICONFLOW_HUNYUAN_MT_CONFIG_PATH
  • SILICONFLOW_HUNYUAN_MT_CONFIG

Example config.json:

{
  "provider": "siliconflow",
  "baseUrl": "https://api.siliconflow.cn/v1",
  "apiKeyEnvName": "SILICONFLOW_API_KEY",
  "model": "tencent/Hunyuan-MT-7B"
}

Default Prompt Template

The default prompt template is:

You are a precise translation engine. Translate the given text from {{source_lang}} to {{target_lang}}.
Return only the translated text, with no extra commentary.

Text:
{{text}}

You can customize this by setting promptTemplate. Use {{source_lang}}, {{target_lang}}, and {{text}} as placeholders.

Usage Examples

SiliconFlow + Hunyuan-MT-7B (Recommended)

This is the primary use case. Set your SiliconFlow API key:

export SILICONFLOW_API_KEY="your-api-key-here"

Then start the server:

npm start

Or use the binary directly:

./dist/index.js

Custom OpenAI-Compatible Provider

To use a different provider (for example, OpenAI, Azure, or a local LLM server):

Option 1: Environment Variables

export SILICONFLOW_HUNYUAN_MT_BASE_URL="https://api.openai.com/v1"
export SILICONFLOW_HUNYUAN_MT_API_KEY_ENV_NAME="OPENAI_API_KEY"
export SILICONFLOW_HUNYUAN_MT_MODEL="gpt-4"
export OPENAI_API_KEY="your-openai-key"

npm start

Option 2: Config File

Create config.json:

{
  "baseUrl": "https://api.openai.com/v1",
  "apiKeyEnvName": "OPENAI_API_KEY",
  "model": "gpt-4"
}

Then:

export SILICONFLOW_HUNYUAN_MT_CONFIG_PATH="./config.json"
export OPENAI_API_KEY="your-openai-key"
npm start

Local LLM Server (e.g., Ollama, LM Studio)

{
  "baseUrl": "http://localhost:11434/v1",
  "apiKeyEnvName": "LOCAL_API_KEY",
  "model": "llama3"
}

Note: Some local servers do not require an API key. In that case, point apiKeyEnvName at any dummy environment variable.

MCP Tool Schema

The server exposes one tool: translate

Tool: translate

Translate text from one language to another.

Parameters:

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| text | string | Yes | The text to translate |
| source_lang | string | Yes | Source language code (e.g., 'en', 'zh', 'auto') |
| target_lang | string | Yes | Target language code (e.g., 'en', 'zh', 'ja') |

Example Tool Call:

{
  "name": "translate",
  "arguments": {
    "text": "Hello, world!",
    "source_lang": "en",
    "target_lang": "zh"
  }
}
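
Since all three parameters are required strings, a server handling this call would validate them before building the prompt. The following is a hedged sketch; `validateTranslateArgs` is a hypothetical helper, not the package's actual code:

```typescript
// Hypothetical validation for the translate tool's arguments:
// all three parameters are required, non-empty strings.
type TranslateArgs = { text: string; source_lang: string; target_lang: string };

function validateTranslateArgs(args: Record<string, unknown>): TranslateArgs {
  const { text, source_lang, target_lang } = args;
  if (typeof text !== "string" || text === "") {
    throw new Error("Missing or invalid required parameter: text");
  }
  if (typeof source_lang !== "string" || source_lang === "") {
    throw new Error("Missing or invalid required parameter: source_lang");
  }
  if (typeof target_lang !== "string" || target_lang === "") {
    throw new Error("Missing or invalid required parameter: target_lang");
  }
  return { text, source_lang, target_lang };
}
```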

Supported Language Codes:

Common language codes include:

  • en - English
  • zh - Chinese
  • ja - Japanese
  • ko - Korean
  • fr - French
  • de - German
  • es - Spanish
  • ru - Russian
  • auto - Auto-detect source language (if supported by model)

MCP Client Configuration

Add this server to your MCP client configuration:

Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or the equivalent on your platform:

{
  "mcpServers": {
    "siliconflow-translate": {
      "command": "npx",
      "args": ["-y", "siliconflow-hunyuan-mt-mcp"],
      "env": {
        "SILICONFLOW_API_KEY": "your-api-key-here"
      }
    }
  }
}

Cursor

Add to your Cursor MCP settings:

{
  "mcpServers": {
    "siliconflow-translate": {
      "command": "npx",
      "args": ["-y", "siliconflow-hunyuan-mt-mcp"],
      "env": {
        "SILICONFLOW_API_KEY": "your-api-key-here"
      }
    }
  }
}

Generic MCP Client

Any MCP client that supports stdio transport can use:

{
  "name": "siliconflow-translate",
  "transport": "stdio",
  "command": "npx",
  "args": ["-y", "siliconflow-hunyuan-mt-mcp"]
}

Testing

Run the test suite:

npm test

This runs vitest on the test files. Tests cover configuration loading, prompt interpolation, and tool execution.

Development

Build

npm run build

Watch Mode (Rebuild on Change)

npm run watch

Troubleshooting

Server Won't Start

  1. Check that you have set the required API key environment variable
  2. Verify the baseUrl is correct and accessible
  3. Check the config file path is correct (if using one)

Translation Fails

  1. Verify your API key has sufficient credits/quota
  2. Check the model name is correct for your provider
  3. Ensure the language codes are supported by the model

Debug Output

Set NODE_ENV=development for additional logging (if implemented in your version).

License

MIT