# LLM Proxy
Minimal LLM agent that normalizes provider access and surfaces MCP tools without executing them.
## Overview
This agent acts as a thin orchestration layer between LLM providers and MCP (Model Context Protocol) servers. It provides tool catalogs to the LLM and returns the raw LLM response to the consumer.
**Important Architecture Change:**
- All LLM providers are accessed through SAP AI Core, not directly
- OpenAI models → SAP AI Core → OpenAI
- Anthropic models → SAP AI Core → Anthropic
- DeepSeek models → SAP AI Core → DeepSeek
- The model name determines which underlying provider SAP AI Core routes to (see the sketch below)
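For example, the same destination can serve multiple providers; only the `model` option changes. A minimal sketch based on the constructor options used in the usage examples below (the destination name is a placeholder):

```typescript
import { SapCoreAIProvider } from '@mcp-abap-adt/llm-proxy';
import { executeHttpRequest } from '@sap-cloud-sdk/http-client';

// Shared HTTP client wiring, as in the usage examples below.
const httpClient = async (config: any) =>
  executeHttpRequest(
    { destinationName: config.destinationName },
    { method: config.method, url: config.url, headers: config.headers, data: config.data }
  );

// Same destination, different models: SAP AI Core routes each request
// to the provider implied by the model name.
const openAiProvider = new SapCoreAIProvider({
  destinationName: 'SAP_AI_CORE_DEST',
  model: 'gpt-4o-mini', // routed to OpenAI
  httpClient,
});

const anthropicProvider = new SapCoreAIProvider({
  destinationName: 'SAP_AI_CORE_DEST',
  model: 'claude-3-5-sonnet', // routed to Anthropic
  httpClient,
});
```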
## Features
- ✅ SAP AI Core integration (all LLM providers through SAP AI Core)
- ✅ MCP client integration with multiple transport protocols
- ✅ Stdio transport (for local processes)
- ✅ SSE transport (Server-Sent Events)
- ✅ Streamable HTTP transport (bidirectional NDJSON)
- ✅ Auto-detection of transport from URL
- ✅ Tool catalog surfacing (no tool execution at this layer)
- ✅ Conversation history management (see the multi-turn sketch after this list)
- ✅ Raw LLM response passthrough for consumers
- 🔄 Streaming support (planned)
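Conversation history is kept inside the agent across `process()` calls, so multi-turn use is just repeated calls on the same instance. A minimal sketch, assuming an `agent` already configured as in the usage examples below:

```typescript
import { SapCoreAIAgent } from '@mcp-abap-adt/llm-proxy';

// Assume `agent` was built as in the usage examples below.
declare const agent: SapCoreAIAgent;

await agent.connect();
const first = await agent.process('List the available ABAP tools.');
console.log(first.message);

// The agent keeps the earlier turns in its history, so this
// follow-up can refer back to the first answer.
const followUp = await agent.process('Which of those read class source code?');
console.log(followUp.message);
```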
## Installation

```bash
npm install @mcp-abap-adt/llm-proxy
```

## Usage
The agent can be used in two ways:
- **Embedded in application** - Import and use directly in your CAP service or application (same process)
- **Standalone service** - Run as a separate service/process
Both modes connect to MCP servers via transport protocols (HTTP/SSE/stdio), not directly to MCP server instances.
### Embedded Usage (Same Process)
When using the agent embedded in your application (e.g., in cloud-llm-hub CAP service), you import it as a module:
```typescript
// srv/agent-service.ts
import cds from '@sap/cds';
import { SapCoreAIAgent, SapCoreAIProvider, MCPClientWrapper } from '@mcp-abap-adt/llm-proxy';
import { executeHttpRequest } from '@sap-cloud-sdk/http-client';

export default class AgentService extends cds.Service {
  private agent!: SapCoreAIAgent;

  async init() {
    // Create SAP AI Core provider (all LLM providers through SAP AI Core)
    const llmProvider = new SapCoreAIProvider({
      destinationName: 'SAP_AI_CORE_DEST', // SAP destination for AI Core
      model: 'gpt-4o-mini', // Model name determines which provider SAP AI Core uses
      httpClient: async (config) => {
        return await executeHttpRequest(
          { destinationName: config.destinationName },
          {
            method: config.method as any,
            url: config.url,
            headers: config.headers,
            data: config.data,
          }
        );
      },
    });

    // Create MCP client
    const mcpClient = new MCPClientWrapper({
      url: 'http://localhost:4004/mcp/stream/http', // MCP proxy endpoint
      headers: {
        'Authorization': 'Basic YWxpY2U6',
        'X-SAP-Destination': 'SAP_DEV_DEST',
      },
    });

    // Create agent
    this.agent = new SapCoreAIAgent({
      llmProvider,
      mcpClient,
    });

    await this.agent.connect();
  }

  async chat(message: string) {
    return await this.agent.process(message);
  }
}
```

**Architecture Note:**
- The agent is imported as a module (like `@fr0ster/mcp-abap-adt`)
- Even when embedded in the same process, the agent connects to the MCP proxy via HTTP transport
- The MCP proxy embeds the `mcp-abap-adt` server instance
- This keeps the architecture clean: agent → MCP proxy (via HTTP) → embedded MCP server
See the Embedded Usage Guide for complete examples, including per-request agent instances and caching strategies.
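As a hedged illustration of the caching idea (not the guide's exact implementation), a service can keep one connected agent per user and create it lazily on first use. `createAgent` here is a hypothetical helper performing the provider/client/agent wiring shown above:

```typescript
import { SapCoreAIAgent } from '@mcp-abap-adt/llm-proxy';

// Hypothetical helper: wires up provider, MCP client, and agent
// exactly as in the embedded example above.
declare function createAgent(): SapCoreAIAgent;

const agentCache = new Map<string, SapCoreAIAgent>();

// Lazily create and cache one connected agent per user/session key.
async function getAgentFor(userId: string): Promise<SapCoreAIAgent> {
  let agent = agentCache.get(userId);
  if (!agent) {
    agent = createAgent();
    await agent.connect();
    agentCache.set(userId, agent);
  }
  return agent;
}
```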
### Standalone Usage (Separate Process)

#### Basic Example (Stdio Transport)
```typescript
import { SapCoreAIAgent, SapCoreAIProvider, MCPClientWrapper } from '@mcp-abap-adt/llm-proxy';
import { executeHttpRequest } from '@sap-cloud-sdk/http-client';

// Create SAP AI Core provider
const llmProvider = new SapCoreAIProvider({
  destinationName: 'SAP_AI_CORE_DEST',
  model: 'gpt-4o-mini', // Routes to OpenAI through SAP AI Core
  httpClient: async (config) => {
    return await executeHttpRequest(
      { destinationName: config.destinationName },
      {
        method: config.method as any,
        url: config.url,
        headers: config.headers,
        data: config.data,
      }
    );
  },
});

const mcpClient = new MCPClientWrapper({
  transport: 'stdio',
  command: 'node',
  args: ['path/to/mcp-server.js'],
});

const agent = new SapCoreAIAgent({
  llmProvider,
  mcpClient,
});

await agent.connect();
const response = await agent.process('What tools are available?');
console.log(response.message);
```

#### HTTP Transport (Auto-Detection)
```typescript
import { Agent, OpenAIProvider, MCPClientWrapper } from '@mcp-abap-adt/llm-proxy';

const llmProvider = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY!,
  model: 'gpt-4o-mini',
});

// Auto-detects 'stream-http' from URL
const mcpClient = new MCPClientWrapper({
  url: 'http://localhost:4004/mcp/stream/http',
  headers: {
    'Authorization': 'Basic YWxpY2U6',
    'Content-Type': 'application/x-ndjson',
  },
});

const agent = new Agent({
  llmProvider,
  mcpClient,
});

await mcpClient.connect();
const sessionId = mcpClient.getSessionId(); // Get session ID for subsequent requests

const response = await agent.process('What tools are available?');
// The response is returned as-is; tool execution is handled by the consumer.
console.log(response.message);
```

#### Explicit Transport Selection
```typescript
// SSE transport
const sseClient = new MCPClientWrapper({
  transport: 'sse',
  url: 'http://localhost:4004/mcp/stream/sse',
  headers: {
    'Authorization': 'Basic YWxpY2U6',
  },
});

// Streamable HTTP transport
const httpClient = new MCPClientWrapper({
  transport: 'stream-http',
  url: 'http://localhost:4004/mcp/stream/http',
  headers: {
    'Authorization': 'Basic YWxpY2U6',
  },
});
```

See `src/mcp/README.md` for detailed transport configuration options.
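Putting the three options together, transport selection can be driven by configuration. A minimal sketch using only the constructor options shown above; the environment variable names here are illustrative, not part of the package:

```typescript
import { MCPClientWrapper } from '@mcp-abap-adt/llm-proxy';

// Build an MCP client from environment-style settings.
// MCP_TRANSPORT and MCP_ENDPOINT are illustrative variable names.
function createMcpClient(): MCPClientWrapper {
  if (process.env.MCP_TRANSPORT === 'stdio') {
    return new MCPClientWrapper({
      transport: 'stdio',
      command: 'node',
      args: ['path/to/mcp-server.js'],
    });
  }
  // URL-based transports: omit `transport` and let the wrapper
  // auto-detect 'sse' vs 'stream-http' from the endpoint URL.
  return new MCPClientWrapper({
    url: process.env.MCP_ENDPOINT ?? 'http://localhost:4004/mcp/stream/http',
    headers: { 'Authorization': 'Basic YWxpY2U6' },
  });
}
```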
### Embedded Usage in CAP Service

The agent can be imported and used directly in CAP services, similar to how `mcp-abap-adt` is used:
```typescript
// srv/agent-service.ts
import cds from '@sap/cds';
import { Agent, OpenAIProvider } from '@mcp-abap-adt/llm-proxy';

export default class AgentService extends cds.Service {
  private agent!: Agent;

  async init() {
    this.agent = new Agent({
      llmProvider: new OpenAIProvider({
        apiKey: process.env.OPENAI_API_KEY!,
      }),
      mcpConfig: {
        url: 'http://localhost:4004/mcp/stream/http',
        headers: {
          'Authorization': 'Basic YWxpY2U6',
          'X-SAP-Destination': 'SAP_DEV_DEST',
        },
      },
    });
    await this.agent.connect();
  }

  async chat(message: string) {
    return await this.agent.process(message);
  }
}
```

See `docs/LLM_AGENT_EMBEDDED_USAGE.md` for the complete embedded usage guide.
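To expose `chat` to OData/REST consumers, the service can register it as an action handler in `init()`. A minimal sketch assuming a CDS model that declares a `chat(message)` action (the action definition itself is not part of this package):

```typescript
// Inside AgentService.init(), after this.agent is connected.
// Assumes the CDS model declares: action chat(message: String) returns String;
this.on('chat', async (req) => {
  const { message } = req.data;
  const response = await this.agent.process(message);
  return response.message;
});
```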
## Development
**Cross-Platform Development:** This project is configured for consistent behavior across Windows, Linux, and macOS. See the parent project's Cross-Platform Development Guide for setup instructions and troubleshooting.

**Verify your setup:** Run `npm run verify:setup` from the root project to check cross-platform configuration.
```bash
# Install dependencies
npm install

# Setup environment (copy template and fill in your values)
cp .env.template .env
# Edit .env with your API keys and settings

# Build
npm run build

# Development mode (with tsx for hot reload)
# Will automatically load .env file if it exists
npm run dev

# Run test launcher (after build)
npm start

# Or with custom message
npm start "List all available ABAP programs"
```

### Environment Configuration
The agent supports configuration via a `.env` file for easier setup:

1. Copy the template:

   ```bash
   cp .env.template .env
   ```

2. Edit `.env` with your settings:

   ```bash
   # SAP AI Core Configuration (required)
   # All LLM providers are accessed through SAP AI Core
   SAP_CORE_AI_DESTINATION=SAP_AI_CORE_DEST
   SAP_CORE_AI_MODEL=gpt-4o-mini  # Model determines provider: gpt-4o-mini → OpenAI, claude-3-5-sonnet → Anthropic
   SAP_CORE_AI_TEMPERATURE=0.7
   SAP_CORE_AI_MAX_TOKENS=2000

   # MCP Configuration (optional, for MCP integration)
   MCP_ENDPOINT=http://localhost:4004/mcp/stream/http
   MCP_DISABLED=false
   ```

   Note: Legacy direct provider configuration (`OPENAI_API_KEY`, etc.) is deprecated. All LLM providers must be accessed through SAP AI Core.

3. Run the agent; it will automatically load `.env`:

   ```bash
   npm run dev:llm
   ```

Environment variables from `.env` can be overridden by actual environment variables.
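That precedence matches the default behavior of the standard dotenv package, which never overwrites variables already present in the process environment (the launcher's actual loading mechanism may differ). A minimal sketch of reading the settings above, with illustrative fallbacks:

```typescript
import 'dotenv/config'; // loads .env but does not overwrite existing process env vars

// Read SAP AI Core and MCP settings; the fallback values are illustrative.
const config = {
  destinationName: process.env.SAP_CORE_AI_DESTINATION ?? 'SAP_AI_CORE_DEST',
  model: process.env.SAP_CORE_AI_MODEL ?? 'gpt-4o-mini',
  temperature: Number(process.env.SAP_CORE_AI_TEMPERATURE ?? '0.7'),
  maxTokens: Number(process.env.SAP_CORE_AI_MAX_TOKENS ?? '2000'),
  mcpEndpoint: process.env.MCP_ENDPOINT, // optional
  mcpDisabled: process.env.MCP_DISABLED === 'true',
};
```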
### Test Launcher
The agent includes a simple CLI test launcher for quick testing.
**Note:** The CLI launcher currently supports legacy direct provider configuration for testing purposes. In production, all LLM providers should be accessed through SAP AI Core.
#### Test LLM Only (Without MCP)
Test just the LLM provider without MCP integration:
**Using SAP AI Core (Recommended):**

```bash
# Set SAP AI Core destination
export SAP_CORE_AI_DESTINATION="SAP_AI_CORE_DEST"
export SAP_CORE_AI_MODEL="gpt-4o-mini" # Routes to OpenAI through SAP AI Core
npm run dev:llm
```

**Legacy Direct Provider (for testing only):**
```bash
# Basic usage - set API key and run
export OPENAI_API_KEY="sk-proj-your-actual-key-here"
npm run dev:llm

# Or inline
OPENAI_API_KEY="sk-proj-your-key" npm run dev:llm

# With custom message
export OPENAI_API_KEY="sk-proj-your-key"
npm run dev:llm "Hello! Can you introduce yourself?"

# With specific model
export OPENAI_API_KEY="sk-proj-your-key"
export OPENAI_MODEL="gpt-4o" # or gpt-4-turbo, gpt-4o-mini, etc.
npm run dev:llm

# With organization ID (for team accounts)
export OPENAI_API_KEY="sk-proj-your-key"
export OPENAI_ORG="org-your-org-id"
npm run dev:llm

# With project ID (for project-specific billing)
export OPENAI_API_KEY="sk-proj-your-key"
export OPENAI_PROJECT="proj-your-project-id" # or OPENAI_PRJ
npm run dev:llm

# Full configuration
export OPENAI_API_KEY="sk-proj-your-key"
export OPENAI_MODEL="gpt-4o"
export OPENAI_ORG="org-your-org-id"
export OPENAI_PROJECT="proj-your-project-id"
npm run dev:llm
```

**Anthropic (Claude):**
```bash
# Set provider and API key
export LLM_PROVIDER=anthropic
export ANTHROPIC_API_KEY="sk-ant-your-actual-key-here"
npm run dev:llm

# With custom message
export LLM_PROVIDER=anthropic
export ANTHROPIC_API_KEY="sk-ant-your-key"
npm run dev:llm "What can you do?"

# With specific model
export LLM_PROVIDER=anthropic
export ANTHROPIC_API_KEY="sk-ant-your-key"
export ANTHROPIC_MODEL="claude-3-5-sonnet-20241022" # or claude-3-opus, etc.
npm run dev:llm
```

**DeepSeek:**
```bash
# Set provider and API key
export LLM_PROVIDER=deepseek
export DEEPSEEK_API_KEY="sk-your-actual-key-here"
npm run dev:llm

# With custom message
export LLM_PROVIDER=deepseek
export DEEPSEEK_API_KEY="sk-your-key"
npm run dev:llm "Explain what you can do"
```

**Alternative methods:**
```bash
# Method 1: Using dedicated script (recommended)
export OPENAI_API_KEY="sk-proj-..."
npm run dev:llm

# Method 2: Using flag
export OPENAI_API_KEY="sk-proj-..."
npm run dev -- --llm-only

# Method 3: Using environment variable
export OPENAI_API_KEY="sk-proj-..."
export MCP_DISABLED=true
npm run dev
```

#### Basic Usage with OpenAI (With MCP)
```bash
# Method 1: Export environment variable
export OPENAI_API_KEY="sk-proj-..."
export MCP_ENDPOINT="http://localhost:4004/mcp/stream/http"
npm run dev

# Method 2: Inline (one-time use)
OPENAI_API_KEY="sk-proj-..." npm run dev

# Method 3: With custom message
export OPENAI_API_KEY="sk-proj-..."
npm run dev "What ABAP programs are available?"

# Method 4: Using .env file (if you have dotenv setup)
# Create .env file:
# OPENAI_API_KEY=sk-proj-...
# MCP_ENDPOINT=http://localhost:4004/mcp/stream/http
npm run dev
```

#### Complete Example
```bash
# From project root
cd submodules/llm-agent

# Set required environment variables
export OPENAI_API_KEY="sk-proj-your-actual-key-here"
export MCP_ENDPOINT="http://localhost:4004/mcp/stream/http"
export SAP_DESTINATION="SAP_DEV_DEST" # Optional, for SAP integration

# Optional: Set model
export OPENAI_MODEL="gpt-4o-mini" # or gpt-4o, gpt-4-turbo, etc.

# Run test launcher
npm run dev

# Or with custom message
npm run dev "List all available tools and describe what they do"
```

#### Testing with Different LLM Providers
**Anthropic (Claude):**

```bash
export LLM_PROVIDER=anthropic
export ANTHROPIC_API_KEY="sk-ant-your-key-here"
export ANTHROPIC_MODEL="claude-3-5-sonnet-20241022" # Optional
npm run dev
```

**DeepSeek:**
```bash
export LLM_PROVIDER=deepseek
export DEEPSEEK_API_KEY="sk-your-key-here"
export DEEPSEEK_MODEL="deepseek-chat" # Optional
npm run dev
```

#### Example Output
```text
🤖 LLM Proxy Test Launcher v0.0.1

📋 Configuration:
   LLM Provider: openai
   MCP Endpoint: http://localhost:4004/mcp/stream/http
   Test Message: What tools are available?

✅ Created OpenAI provider
✅ Created MCP client
✅ Created agent instance
   Agent type: OpenAIAgent

🔌 Connecting to MCP server...
✅ Connected to MCP server
📦 Available tools: 31
   - GetProgram: Retrieve ABAP program source code...
   - GetClass: Retrieve ABAP class source code...
   - GetFunction: Retrieve ABAP function module...
   ... and 28 more

💬 Processing message: "What tools are available?"

📤 Response:
────────────────────────────────────────────────────────────
I can see you have 31 tools available for working with ABAP systems...

⏱️ Duration: 2341ms
📜 Conversation history: 4 messages

✅ Test completed successfully!
```

The test launcher will:
- Connect to MCP server
- List available tools
- Process a test message
- Show response
- Display conversation history
## Tool Execution Responsibility
The agent does not execute tools. It only:
- Fetches MCP tool catalogs
- Passes tool definitions to the LLM
- Returns the raw LLM response to the consumer
If your application needs tool execution, parse the model output in the consumer layer and call MCP tools there.
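A hedged sketch of that consumer-side loop, with `agent` and `mcpClient` as in the standalone example above; a `toolCalls` field on the response and a `callTool` method on the client are assumed shapes for illustration, not this package's documented API:

```typescript
// Consumer layer: inspect the raw LLM response and execute any
// requested tools against the MCP server, outside the agent.
const response = await agent.process('Show me program ZDEMO');

// `toolCalls` and `callTool` are assumed shapes for illustration.
for (const call of (response as any).toolCalls ?? []) {
  const result = await (mcpClient as any).callTool(call.name, call.arguments);
  console.log(`Tool ${call.name} returned:`, result);
  // Feed the result back to the LLM in a follow-up turn if required.
}
```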
## Architecture

- `src/agents/` - Agent implementations (BaseAgent, SapCoreAIAgent, etc.)
- `src/llm-providers/` - LLM provider implementations (SapCoreAIProvider)
- `src/mcp/` - MCP client wrapper
- `src/types.ts` - TypeScript type definitions
## License
MIT
