# prompt-architect-mcp

v2.1.11
Context-aware prompt enhancement for any LLM chat interface.
Remembers your project across conversations. Every prompt you send gets automatically enriched with the full history of what you have been building — no setup, no manual tracking. Works with OpenAI and Google Gemini.
## How it works
Every time you call `enhance_prompt_with_context`, the server:
- Saves your prompt to a local SQLite database
- Reads everything you have discussed before in this project session
- Builds a context block from your history and pinned facts
- Enhances your prompt with that full context via OpenAI or Gemini
- Saves the result — so the next call is even richer
You do one thing. The server handles everything else silently.
## Requirements
- Node.js 18 or higher
- An OpenAI API key or a Google Gemini API key (or both)
## Installation — Claude Desktop
Open your Claude Desktop config file:
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Mac: `~/Library/Application Support/Claude/claude_desktop_config.json`
### Using OpenAI only
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

### Using Gemini only
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-key"
      }
    }
  }
}
```

### Using both (OpenAI takes priority)
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key",
        "GEMINI_API_KEY": "your-gemini-key"
      }
    }
  }
}
```

Fully quit and reopen Claude Desktop after saving.
## Installation — VS Code (GitHub Copilot Agent Mode)
Create or open:
- Global (all projects): `%APPDATA%\Code\User\mcp.json` (Windows) or `~/.config/Code/User/mcp.json` (Mac/Linux)
- Workspace (this project only): `.vscode/mcp.json` in your project root
```json
{
  "servers": {
    "prompt-architect": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

Then open Copilot Chat (`Ctrl+Alt+I`), switch to Agent Mode, and your tools will appear.
## Installation — Cursor
Create or open:
- Global: `~/.cursor/mcp.json`
- Project: `.cursor/mcp.json` in your project root
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

Or go to Cursor Settings → Tools & MCP → New MCP Server and paste the config.
## Installation — Windsurf
Go to Windsurf Settings → MCP Servers → Add Server and paste:
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

## Installation — Firebase Studio (Project IDX / Antigravity)
Create `.idx/mcp.json` in your project root:
```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

Or use the Command Palette: Shift+Ctrl+P → Firebase Studio: Add MCP Server.
To keep your API key out of version control, put it in a `.env` file in your project root and omit the `env` block from `mcp.json`.
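With the key moved to `.env`, a minimal sketch of what `.idx/mcp.json` could then look like (the `env` block simply dropped):

```json
{
  "mcpServers": {
    "prompt-architect": {
      "command": "npx",
      "args": ["-y", "prompt-architect-mcp@latest"]
    }
  }
}
```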
## Tools
### enhance_prompt_with_context — the main tool
Call this every time you want to enhance a prompt. Context grows automatically.
| Field        | Required | Description                                                                     |
| ------------ | -------- | ------------------------------------------------------------------------------- |
| `prompt`     | Yes      | The raw prompt you want to enhance                                               |
| `session_id` | No       | Your project name, e.g. `my-novel`. Auto-detected from the git root if omitted.  |
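Under the hood, MCP clients invoke the tool with a standard JSON-RPC `tools/call` request. A sketch of that message, with illustrative `id` and argument values:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "enhance_prompt_with_context",
    "arguments": {
      "prompt": "add pagination to the task list endpoint",
      "session_id": "my-api-project"
    }
  }
}
```

In practice your chat client builds this for you; you only supply the `prompt` (and optionally `session_id`).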
First call — no context yet:

```
Session: my-api-project
No context yet — this is your first turn. Context will grow automatically from here.
Category: technical
Enhanced Prompt:
...
```

Second call onwards — context applied:

```
Session: my-api-project
Context applied:
[Project context]
History: User is building a Node.js REST API with Express and MongoDB...
Current task: Adding JWT authentication middleware
Pinned facts:
- Using Node.js 20, MongoDB Atlas, RS256 algorithm
Category: technical
Enhanced Prompt:
...already knows your stack, your history, your decisions...
```

### pin_fact — lock in a project decision
Use this when you want something remembered permanently across all future prompts in this project.
| Field        | Required | Description                                                 |
| ------------ | -------- | ----------------------------------------------------------- |
| `fact`       | Yes      | The fact to remember, e.g. `Using React 18 and TypeScript`  |
| `session_id` | No       | Must match the name used in `enhance_prompt_with_context`   |
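The same `tools/call` envelope applies here. A sketch with illustrative values:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "pin_fact",
    "arguments": {
      "fact": "Using React 18 and TypeScript",
      "session_id": "my-api-project"
    }
  }
}
```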
### list_sessions — see all your projects
No fields required. Returns all active project sessions with their last active date.
### enhance_prompt — quick one-off (no context)
For when you just want a single prompt enhanced without any session tracking.
| Field   | Required | Description               |
| ------- | -------- | ------------------------- |
| `input` | Yes      | The raw prompt to enhance |
## Testing the tools — step by step
### Step 1 — Pin your project stack

```
Tool: pin_fact
fact: Building a task management app with Node.js 20, Express 4, MongoDB Atlas, JWT HS256
session_id: task-app
```

Expected: Fact pinned to session "task-app"
### Step 2 — First enhancement (no history yet, but pinned fact appears)

```
Tool: enhance_prompt_with_context
prompt: create a mongoose schema for a task with priority levels
session_id: task-app
```

Expected: context block shows your pinned stack. Enhanced prompt mentions Node.js, MongoDB, Mongoose.
### Step 3 — Second enhancement (context builds automatically)

```
Tool: enhance_prompt_with_context
prompt: write the Express route to create a new task
session_id: task-app
```

Expected: context block now shows history from Step 2. Enhanced prompt knows you already have a Mongoose schema.
### Step 4 — Third enhancement (richer context)

```
Tool: enhance_prompt_with_context
prompt: add JWT middleware to protect the task routes
session_id: task-app
```

Expected: enhanced prompt references your schema and your routes from Steps 2 and 3.
### Step 5 — Verify sessions

```
Tool: list_sessions
```

Expected:

```
Your project sessions:
• task-app (last active: today)
```

## Session ID guide
The `session_id` is just a name for your project. Rules:

- Use the same name every time for the same project
- Can contain letters, numbers, spaces, and hyphens
- Spaces and special characters are converted to hyphens automatically
- If you are in a git repo and omit `session_id`, it is auto-detected from the repo name
- If you are not in a git repo and `session_id` is omitted, the tool will ask you to provide one

Good examples: `my-novel`, `work-api-2025`, `recipe-app`, `thesis-project`
## Data location
All session data is stored locally on your machine at:
- Windows: `C:\Users\YourName\.prompt-architect\sessions.db`
- Mac/Linux: `~/.prompt-architect/sessions.db`
No data is sent to any server other than your chosen LLM provider (OpenAI or Google) for prompt enhancement and summarisation.
## License
MIT
