perplexity-webui-mcp
MCP wrapper around perplexity-webui-scraper for Perplexity WebUI access
quick start
this package is a local mcp wrapper (stdio transport) that launches the upstream perplexity-webui-scraper mcp server via uvx.
manual run:
```bash
PERPLEXITY_SESSION_TOKEN="your_token_here" npx perplexity-webui-mcp
```
important: this uses perplexity's internal webui api with a session cookie. for personal/local tinkering only - not affiliated with perplexity ai.
overview
perplexity-webui-mcp is a local stdio MCP wrapper that launches the upstream perplexity-webui-scraper MCP server through uvx. this keeps the package installable from npm while reusing the upstream battle-tested WebUI implementation (browser impersonation, retry logic, model-specific tools, and the token CLI ecosystem).
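for intuition, here is a minimal sketch of the proxy idea in TypeScript: spawn the upstream server via uvx and inherit stdio so the MCP client talks to it directly. the upstream entry-point name used here is an assumption for illustration only; the real launcher lives in src/index.ts.

```typescript
// illustrative proxy launcher: spawn the upstream MCP server via uvx and
// pass the stdio transport straight through to it.
// NOTE: the entry-point name "perplexity-webui-scraper" is a placeholder
// assumption - check src/index.ts for the exact invocation.
import { spawn } from "node:child_process";

const child = spawn(
  "uvx",
  ["--from", "perplexity-webui-scraper@latest", "perplexity-webui-scraper"],
  {
    stdio: "inherit",        // MCP client <-> upstream server, unmodified
    env: { ...process.env }, // forwards PERPLEXITY_SESSION_TOKEN
  },
);

child.on("error", (err) => {
  console.error("failed to launch uvx - is uv installed?", err);
  process.exit(1);
});
child.on("exit", (code) => process.exit(code ?? 1));
```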
quick installation
paste this into your llm agent session:
```
Install and configure perplexity-webui-mcp by following the instructions here:
https://raw.githubusercontent.com/Microck/perplexity-webui-mcp/refs/heads/master/INSTALL.md
```
npm (recommended)
```bash
npm install -g perplexity-webui-mcp
```
runtime requirement:
```bash
uv --version
```
if uv is missing, install it from https://docs.astral.sh/uv/getting-started/installation/
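if you want to check this programmatically (for example from your own wrapper or install script), a rough sketch using Node's child_process - not something the published package necessarily does:

```typescript
// optional preflight: confirm uvx is on PATH before launching the wrapper.
import { spawnSync } from "node:child_process";

const check = spawnSync("uvx", ["--version"], { encoding: "utf8" });
if (check.error || check.status !== 0) {
  console.error(
    "uvx not found - install uv: https://docs.astral.sh/uv/getting-started/installation/",
  );
  process.exit(1);
}
console.log("uvx available:", check.stdout.trim());
```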
manual installation
from source
```bash
git clone https://github.com/Microck/perplexity-webui-mcp.git
cd perplexity-webui-mcp
npm install
npm run build
```
getting your session token
fastest method (automatic via CLI):
```bash
uvx --with rich --from "perplexity-webui-scraper@latest" get-perplexity-session-token
```
this interactive CLI asks for your email, handles OTP/magic-link verification, and prints the session token.
you can run that command from any directory.
manual method (browser):
- open perplexity.ai in your browser and log in
- open devtools (f12 or cmd+opt+i)
- go to application > cookies > https://www.perplexity.ai
- copy the value of `__Secure-next-auth.session-token`
powered by the token extraction flow from https://github.com/henrique-coder/perplexity-webui-scraper
configuration
because this server uses stdio, you configure it as a local command and pass the token via env.
mcp client config (claude desktop, opencode, etc)
```json
{
  "mcpServers": {
    "perplexity": {
      "command": "perplexity-webui-mcp",
      "env": {
        "PERPLEXITY_SESSION_TOKEN": "your_session_token_here"
      }
    }
  }
}
```
from source
```json
{
  "mcpServers": {
    "perplexity": {
      "command": "node",
      "args": ["/path/to/perplexity-webui-mcp/dist/index.js"],
      "env": {
        "PERPLEXITY_SESSION_TOKEN": "your_session_token_here"
      }
    }
  }
}
```
remote deployment over tailscale (optional)
if your cloud machine gets blocked by cloudflare but your home machine works, run the upstream mcp server on the home machine and connect to it from opencode as a remote mcp.
- copy templates from this repo:
  - deploy/systemd/perplexity-webui-mcp.env.example
  - deploy/systemd/perplexity-webui-mcp-sse.sh
  - deploy/systemd/perplexity-webui-mcp.service
- install and enable service on the home machine (user service):
```bash
mkdir -p ~/.config ~/.config/systemd/user ~/.local/bin
cp deploy/systemd/perplexity-webui-mcp.env.example ~/.config/perplexity-webui-mcp.env
cp deploy/systemd/perplexity-webui-mcp-sse.sh ~/.local/bin/perplexity-webui-mcp-sse.sh
cp deploy/systemd/perplexity-webui-mcp.service ~/.config/systemd/user/perplexity-webui-mcp.service
chmod 600 ~/.config/perplexity-webui-mcp.env
chmod 755 ~/.local/bin/perplexity-webui-mcp-sse.sh
systemctl --user daemon-reload
systemctl --user enable --now perplexity-webui-mcp.service
```
- point opencode (cloud host) to the tailscale endpoint:
```json
{
  "mcp": {
    "perplexity-webui": {
      "type": "remote",
      "url": "http://<tailscale-ip>:8790/sse",
      "enabled": true,
      "oauth": false
    }
  }
}
```
- verify:
```bash
opencode mcp list
```
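as an extra sanity check from the cloud host, you can poke the SSE endpoint directly. a small node/TypeScript sketch follows (the tailscale address is a placeholder; a plain curl against the same URL works just as well):

```typescript
// minimal reachability check for the remote SSE endpoint (node 18+ fetch).
// replace the placeholder address with your home machine's tailscale IP.
const url = "http://<tailscale-ip>:8790/sse";

const res = await fetch(url, { headers: { accept: "text/event-stream" } });
console.log("status:", res.status, res.headers.get("content-type"));

// read one chunk of the event stream to confirm data is flowing, then stop
const reader = res.body!.getReader();
const { value } = await reader.read();
console.log(new TextDecoder().decode(value));
await reader.cancel();
```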
features
| tool | description |
|------|-------------|
| pplx_ask | best-model query (auto model selection) |
| pplx_deep_research | deep research mode |
| pplx_sonar | sonar model |
| pplx_gpt52 / pplx_gpt52_thinking | gpt-5.2 variants |
| pplx_claude_sonnet / pplx_claude_sonnet_think | claude sonnet 4.5 variants |
| pplx_gemini_flash / pplx_gemini_flash_think / pplx_gemini_pro_think | gemini 3 variants |
| pplx_grok / pplx_grok_thinking | grok 4.1 variants |
| pplx_kimi_thinking | kimi k2.5 thinking |
all upstream model tools support source_focus values: web, academic, social, finance, all.
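for example, a programmatic call against this wrapper using the official MCP TypeScript SDK might look like the sketch below. the tool name comes from the table above; the argument names ("query", "source_focus") are assumptions based on this README - check each tool's inputSchema via tools/list for the exact shape.

```typescript
// hypothetical client session: connect to the wrapper over stdio and call
// pplx_ask with an academic source focus.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "perplexity-webui-mcp",
  env: {
    ...(process.env as Record<string, string>),
    PERPLEXITY_SESSION_TOKEN: process.env.PERPLEXITY_SESSION_TOKEN ?? "",
  },
});

const client = new Client({ name: "readme-example", version: "0.0.1" });
await client.connect(transport);

const result = await client.callTool({
  name: "pplx_ask",
  arguments: {
    query: "recent papers on speculative decoding", // assumed parameter name
    source_focus: "academic",
  },
});
console.log(JSON.stringify(result.content, null, 2));

await client.close();
```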
how this differs from v1.0.0
- old v1.0.0: one custom tool (`perplexity_search`) implemented in local TypeScript HTTP logic.
- current: delegates to the upstream `perplexity-webui-scraper` MCP server, exposing the full upstream model-specific toolset.
- result: significantly better compatibility with Perplexity's anti-bot protections.
troubleshooting
| problem | solution |
|---------|----------|
| token invalid / 401 | get a fresh token from browser cookies |
| uvx not found | install uv (uv --version should work) |
| no answer returned | check rate limits or whether your account can access the selected model |
| clarifying questions error | deep research mode may request clarifying questions first |
| timeout | deep research can take several minutes - be patient |
verify both modes quickly
```bash
PERPLEXITY_SESSION_TOKEN="your_token_here" npm run self-test
```
this checks both:
- regular search (`best`)
- deep research (`deep_research`)

and prints pass/fail per mode.
project structure
```
perplexity-webui-mcp/
├── deploy/
│   └── systemd/
│       ├── perplexity-webui-mcp.env.example
│       ├── perplexity-webui-mcp-sse.sh
│       └── perplexity-webui-mcp.service
├── src/
│   └── index.ts          # proxy launcher for upstream MCP
├── package.json
├── tsconfig.json
├── .env.example
├── .gitignore
├── LICENSE
├── INSTALL.md
└── README.md
```
license
mit
author
Microck (https://github.com/Microck)
shoutout
special thanks to henrique-coder/perplexity-webui-scraper for the WebUI reverse-engineering and token CLI workflow that helped this project.
