@fre4x/huggingface
v1.0.65
Hugging Face MCP server for local inference and Hub repository management.
Hugging Face MCP Server
MCP server for Hugging Face Hub management and local multimodal inference in Node.js.
Tool coverage
Hub management
- huggingface_list_models
- huggingface_get_model_info
- huggingface_list_repo_files
- huggingface_download_file
- huggingface_whoami
- huggingface_create_repo
- huggingface_upload_files
- huggingface_delete_files
- huggingface_delete_repo
Local inference
- huggingface_generate_text
- huggingface_classify_text
- huggingface_embed_text
- huggingface_transcribe_audio
- huggingface_generate_speech
- huggingface_image_to_text
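Once the server is running over stdio, a client invokes these tools with standard MCP JSON-RPC messages. As a rough sketch, the tool list can be probed from a shell by piping the MCP handshake into the server; the handshake messages below follow the MCP specification, not anything specific to this package, and the exact response shape is not shown:

```shell
# Hypothetical stdio probe: MCP servers speak newline-delimited JSON-RPC.
# MOCK=true (documented below) avoids needing a Hub token for this check.
MOCK=true npx -y @fre4x/huggingface <<'EOF'
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.0"}}}
{"jsonrpc":"2.0","method":"notifications/initialized"}
{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}
EOF
```

The `tools/list` response should enumerate the tool names listed above.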
Environment variables
- HF_TOKEN or HUGGINGFACE_ACCESS_TOKEN: Optional for public reads, required for authenticated Hub operations.
- HF_UPLOAD_ROOT or HUGGINGFACE_UPLOAD_ROOT: Optional safe root for huggingface_upload_files. Defaults to the current working directory.
- HF_MODEL_CACHE_DIR: Optional cache directory for local model downloads.
- HF_LOCAL_MODEL_PATH: Optional local model directory for offline-only runs.
- HF_ALLOW_REMOTE_MODELS: Set to false to force local-only model loading by default.
- MOCK: Set to true for offline mock mode.
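For a quick run from a terminal (outside an MCP client), the same variables can be exported before launching the server. A minimal sketch; the token value and the directory paths are placeholders to substitute for your machine:

```shell
# Placeholders: use a real token and paths appropriate for your machine.
export HF_TOKEN="hf_your_access_token"
export HF_UPLOAD_ROOT="$PWD/uploads"              # huggingface_upload_files is confined here
export HF_MODEL_CACHE_DIR="$HOME/.cache/huggingface"
mkdir -p "$HF_UPLOAD_ROOT" "$HF_MODEL_CACHE_DIR"  # ensure the directories exist
npx -y @fre4x/huggingface
```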
Claude Desktop configuration
```json
{
  "mcpServers": {
    "huggingface": {
      "command": "npx",
      "args": ["-y", "@fre4x/huggingface"],
      "env": {
        "HF_TOKEN": "hf_your_access_token",
        "HF_UPLOAD_ROOT": "/absolute/path/to/upload-root",
        "HF_MODEL_CACHE_DIR": "/absolute/path/to/.cache/huggingface",
        "HF_LOCAL_MODEL_PATH": "/absolute/path/to/models",
        "HF_ALLOW_REMOTE_MODELS": "true"
      }
    }
  }
}
```
Mock mode
```shell
MOCK=true npx @fre4x/huggingface
```
Notes
- Local inference uses @huggingface/transformers.
- Hub operations use @huggingface/hub.
- huggingface_upload_files accepts only local files under the configured upload root. Remote HTTP(S) URLs are rejected.
- Local speech-to-text file decoding currently expects local WAV files for filesystem inputs. Remote HTTP(S) audio URLs are passed through directly.
