sympraxi v0.6.0
Sympraxi SDK
TypeScript SDK for the Sympraxi AI prompt management tool.
Installation

```bash
npm install sympraxi-sdk
```

or

```bash
yarn add sympraxi-sdk
```

Usage
Authentication

The SDK supports two authentication methods:

```typescript
import { SympraxiClient } from 'sympraxi-sdk';

// Method 1: Using an API token (recommended)
const client = new SympraxiClient({
  apiToken: 'sym-xxxxxxxxxxxxxxxx',
});

// Method 2: Using an API key (legacy)
const legacyClient = new SympraxiClient({
  apiKey: 'your-api-key',
});
```

Getting Prompts by Reference
```typescript
import { SympraxiClient } from 'sympraxi-sdk';

const client = new SympraxiClient({
  apiToken: 'sym-xxxxxxxxxxxxxxxx',
});

// Get a prompt using a reference in the format 'projectName:nodeName:nodeUUID'
async function getMyPrompt() {
  try {
    const response = await client.getPrompt(
      'myProject:myNode:d5dca613-79bd-41f9-a148-23cccb38b6bb'
    );

    // Access the prompt text
    console.log(response.prompt); // "I am a helpful AI agent."

    // You can also access any other properties returned by the API
    console.log(response);
    return response;
  } catch (error) {
    console.error('Error fetching prompt:', error);
  }
}
```

Working with Project Nodes
The SDK can fetch and manage the entire graph of nodes for a project:
```typescript
import { SympraxiClient } from 'sympraxi-sdk';

const client = new SympraxiClient({
  apiToken: 'sym-xxxxxxxxxxxxxxxx',
});

async function buildAgentConfiguration() {
  // Fetch all nodes for a project
  const projectId = 'a7fc8904-79e9-4366-8dc7-4740dc51694b';
  const nodeMap = await client.getNodeMap(projectId);

  // Access nodes by ID
  const supervisorNode = nodeMap['0c9c2e7f-b1ab-48be-8a1c-48f427aafaa2'];
  if (!supervisorNode) return;

  // Get the latest prompt version
  const supervisorPrompt = client.getLatestPrompt(supervisorNode);

  // Find child nodes (agents/tools)
  const childNodeIds = supervisorNode.outgoingConnections.map(conn => conn.targetNode);
  const childNodes = childNodeIds.map(id => nodeMap[id]).filter(Boolean);

  // Extract node information for agent configuration
  const agentConfigs = childNodes.map(node => ({
    id: node.id,
    name: node.name,
    type: node.type,
    prompt: client.getLatestPrompt(node),
    // Other configuration details...
  }));

  return {
    supervisor: {
      id: supervisorNode.id,
      name: supervisorNode.name,
      prompt: supervisorPrompt,
    },
    agents: agentConfigs,
  };
}
```

LangGraph Integration
The SDK can automatically convert Sympraxi node configurations into runnable LangGraph agents:
```typescript
import { SympraxiClient } from 'sympraxi-sdk';
import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { createSupervisor } from '@langchain/langgraph-supervisor';
import { ChatOpenAI } from '@langchain/openai';
import { Calculator } from '@langchain/community/tools/calculator';

async function buildLangGraphWorkflow() {
  const client = new SympraxiClient({
    apiToken: 'sym-xxxxxxxxxxxxxxxx',
  });

  // Initialize your LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4' });

  // Configure project and tools
  const projectId = 'a7fc8904-79e9-4366-8dc7-4740dc51694b';
  const rootSupervisorId = '0c9c2e7f-b1ab-48be-8a1c-48f427aafaa2';

  // Map tool implementations to node IDs
  const toolImplementations = {
    '1fa89c68-bc47-448b-a9c7-1f9cc3893c26': new Calculator(),
    // Add more tools as needed
  };

  // Provide factory functions for agent creation
  const agentFactory = {
    createReactAgent,
    createSupervisor,
  };

  // Build the complete LangGraph workflow
  const workflow = await client.buildLangGraphWorkflow(
    projectId,
    rootSupervisorId,
    { model: llm },
    toolImplementations,
    agentFactory
  );

  // Run the workflow
  const result = await workflow.invoke(
    { messages: [{ role: 'user', content: 'Hello, I need financial advice' }] },
    { configurable: { thread_id: `thread-${Date.now()}` } }
  );

  return result;
}
```

Using Prompt Variables
The SDK supports variable substitution in prompts. Define variables in your prompts using the `{{ $variableName }}` syntax, then provide values for them when building your workflow:
```typescript
// In Sympraxi, create prompts with variables:
// "Hello, my name is {{ $agentName }}. I'll help you with {{ $topic }}."

// When using the SDK, pass variables when building the workflow:
const workflow = await client.buildLangGraphWorkflow(
  projectId,
  rootSupervisorId,
  { model: llm },
  toolImplementations,
  agentFactory,
  {
    // Variables to substitute in all prompts
    agentName: 'Jane',
    topic: 'financial planning',
    userAge: 42,
    isExistingCustomer: true,
  }
);

// Variables can also be used with the getLatestPrompt method:
const prompt = client.getLatestPrompt(node, {
  agentName: 'Jane',
  customGreeting: 'Welcome back!',
});
```

Variables support string, number, and boolean values, which are converted to strings during substitution. If a variable is used in a prompt but not provided in the variables object, it remains unchanged in the final prompt.
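The substitution rules above can be sketched in a few lines. This is an illustration of the behavior, not the SDK's actual implementation, and the `substituteVariables` helper is hypothetical:

```typescript
// Hypothetical sketch of {{ $variableName }} substitution, mirroring the
// documented behavior: values are stringified, unknown placeholders are kept.
type PromptVariables = Record<string, string | number | boolean>;

function substituteVariables(prompt: string, variables: PromptVariables): string {
  // Replace each {{ $name }} placeholder whose name exists in `variables`;
  // placeholders without a matching entry are left unchanged.
  return prompt.replace(/\{\{\s*\$(\w+)\s*\}\}/g, (match, name) =>
    name in variables ? String(variables[name]) : match
  );
}

const rendered = substituteVariables(
  "Hello, my name is {{ $agentName }}. I'll help you with {{ $topic }}.",
  { agentName: 'Jane', topic: 'financial planning' }
);
// rendered: "Hello, my name is Jane. I'll help you with financial planning."
```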
Extending to Other Frameworks
The LangGraph integration is designed to be extensible to other agent frameworks as well. You can implement custom factory functions to adapt Sympraxi configurations to different frameworks.
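As a rough illustration of what such an adapter could look like, the sketch below turns a node-style configuration into a plain object with an `invoke` method. The `AgentSpec` and `SimpleAgent` shapes are hypothetical, and the exact arguments the SDK passes to factory functions are an assumption, not documented API:

```typescript
// Hypothetical adapter: maps a node's name and prompt into a stand-in for
// another framework's agent object. A real adapter would call into the
// target framework instead of returning this stub.
interface AgentSpec {
  name: string;
  prompt: string;
  tools: string[];
}

interface SimpleAgent {
  name: string;
  invoke: (input: string) => string;
}

function createSimpleAgent(spec: AgentSpec): SimpleAgent {
  return {
    name: spec.name,
    // Echo the configuration so the wiring is visible in this sketch.
    invoke: (input) => `[${spec.name}] ${spec.prompt} :: ${input}`,
  };
}

const agent = createSimpleAgent({
  name: 'advisor',
  prompt: 'You are a financial advisor.',
  tools: ['calculator'],
});
agent.invoke('What is compound interest?');
```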
Node Types and Connections
The SDK provides structured types for working with Sympraxi's graph-based prompt management:
- `Node`: Represents a node in the graph (agent, supervisor, or tool)
- `NodeVersion`: Version history for a node's prompt
- `Connection`: Represents a link between nodes
Node Structure Example
```typescript
interface Node {
  id: string;
  name: string;
  type: 'agent' | 'supervisor' | 'tool';
  versions: NodeVersion[];
  outgoingConnections: Connection[];
  incomingConnections: Connection[];
  // ...other properties
}
```

Error Handling
The SDK throws descriptive errors for various error conditions:
- Invalid prompt reference format
- Invalid UUID format
- API errors (with status code and response)
- Network errors
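For example, a prompt reference must have three colon-separated parts ending in a well-formed UUID. A validation routine in that spirit might look like the sketch below; `parsePromptReference` is hypothetical, not part of the SDK's public API:

```typescript
// Sketch of the kind of checks behind "invalid prompt reference format"
// and "invalid UUID format" errors; not the SDK's actual code.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function parsePromptReference(ref: string): {
  projectName: string;
  nodeName: string;
  nodeUUID: string;
} {
  const parts = ref.split(':');
  if (parts.length !== 3) {
    throw new Error(
      `Invalid prompt reference "${ref}": expected 'projectName:nodeName:nodeUUID'`
    );
  }
  const [projectName, nodeName, nodeUUID] = parts;
  if (!UUID_RE.test(nodeUUID)) {
    throw new Error(`Invalid UUID in prompt reference: "${nodeUUID}"`);
  }
  return { projectName, nodeName, nodeUUID };
}

parsePromptReference('myProject:myNode:d5dca613-79bd-41f9-a148-23cccb38b6bb');
```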
Development
Running Tests
This package includes Jest-based tests. Run them with:
```bash
npm test
```

Building the Package

Build the package with:

```bash
npm run build
```

License
MIT
