# Simforge TypeScript SDK
Simforge client for provider-based API calls.
## Monorepo Setup

This package is part of a pnpm workspace monorepo. The workspace structure separates concerns:

- Root `package.json`: contains only shared dev dependencies (Biome, TypeScript, Vitest, Knip, Madge)
- Package `package.json`: contains runtime dependencies specific to this package
- You can run tests and validation from the root directory: `pnpm test` or `pnpm validate`
- No need to manually update shared dev tooling versions across packages
## Installing Dependencies

Always install dependencies from the root directory:

```bash
# From the root directory
pnpm install
```

## Updating pnpm
To update pnpm across the monorepo:
```bash
# From the root directory
pnpm pnpm:update
```

This automatically updates pnpm and syncs the version in `package.json` and CI/CD workflows.
## Running Commands

You can run commands from this directory or from the root:

```bash
# From this directory
pnpm test
pnpm build
pnpm validate

# From the root directory (runs across all packages)
pnpm test      # Run all tests
pnpm validate  # Run lint + tsc + test
```

## Available Development Tools
This package includes the following shared development tools:

| Tool | Purpose | Command |
|------|---------|---------|
| Vitest | Unit testing with coverage | `pnpm test` |
| Biome | Fast linting & formatting | `pnpm lint` |
| TypeScript | Type checking | `pnpm tsc` |
| Knip | Find unused dependencies/exports | `pnpm knip` |
| Madge | Detect circular dependencies | `pnpm madge` |
Run individual tools:

```bash
# Run tests with coverage
pnpm test

# Lint and format code
pnpm lint

# Type check without building
pnpm tsc

# Find unused dependencies
pnpm knip

# Check for circular dependencies
pnpm madge

# Run all validation checks
pnpm validate
```

## Installation

### Basic Installation
```bash
npm install @goharvest/simforge
# or
pnpm add @goharvest/simforge
# or
yarn add @goharvest/simforge
```

### With OpenAI Agents SDK Tracing

If you want to use the OpenAI Agents SDK tracing integration:
```bash
npm install @goharvest/simforge @openai/agents
# or
pnpm add @goharvest/simforge @openai/agents
```

The `@openai/agents` package is an optional peer dependency; the SDK works fine without it unless you need tracing functionality.
### Local Development

```bash
cd simforge-typescript-sdk
pnpm build
```

To link for local development:

```bash
# In this directory
pnpm link --global

# In your app directory
pnpm link --global @goharvest/simforge
```

To switch back to the npm version:

```bash
pnpm unlink @goharvest/simforge
pnpm add @goharvest/simforge
```

## Usage

### Basic Usage (Local Execution)
By default, the SDK executes BAML functions locally on the client by fetching the BAML prompt from the server and running it with your own API keys:
```typescript
import { Simforge } from "@goharvest/simforge";

const client = new Simforge({
  apiKey: "sf_your_api_key_here", // Required: Your Simforge API key
  serviceUrl: "https://simforge.goharvest.ai", // Optional
  envVars: {
    OPENAI_API_KEY: process.env.OPENAI_API_KEY, // Your LLM provider API keys
  },
  executeLocally: true, // Default: true
});

const result = await client.call("method_name", {
  arg1: "value1",
  arg2: "value2",
});
```

### Server-Side Execution
If you prefer to execute BAML on the server (using the server's API keys), set `executeLocally: false`:
```typescript
const client = new Simforge({
  apiKey: "sf_your_api_key_here",
  executeLocally: false, // Execute on server instead of locally
});

const result = await client.call("method_name", {
  arg1: "value1",
  arg2: "value2",
});
```

### With TypeScript Generics
You can specify the return type for type safety:
```typescript
interface MyResponse {
  answer: string;
  confidence: number;
}

const result = await client.call<MyResponse>("analyze", {
  text: "Hello world",
});

console.log(result.answer); // TypeScript knows this is a string
```

## Error Handling
```typescript
import { Simforge, SimforgeError } from "@goharvest/simforge";

const client = new Simforge({
  apiKey: "sf_your_api_key_here",
  serviceUrl: "https://your-simforge-instance.com",
});

try {
  const result = await client.call("method_name", { input: "value" });
} catch (error) {
  if (error instanceof SimforgeError) {
    console.error("Simforge error:", error.message);
    if (error.url) {
      console.error("Configure at:", error.url);
    }
  }
}
```

## Configuration
| Option | Required | Description |
| --- | --- | --- |
| `apiKey` | Yes | Your Simforge API key (generate from your Simforge dashboard) |
| `serviceUrl` | No | The base URL for the Simforge API (default: `https://simforge.goharvest.ai`) |
| `timeout` | No | Request timeout in milliseconds (default: `120000`) |
| `envVars` | No | Environment variables for LLM provider API keys (e.g., `{ OPENAI_API_KEY: "..." }`). Required for local execution. |
| `executeLocally` | No | Whether to execute BAML locally on the client (default: `true`). Set to `false` to execute on the server with the server's API keys. |
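The defaults in the table can be illustrated with a small sketch. `SimforgeOptions` and `resolveOptions` below are hypothetical names for illustration only, not exports of the SDK:

```typescript
// Illustrative option shape mirroring the configuration table above.
interface SimforgeOptions {
  apiKey: string;
  serviceUrl?: string;
  timeout?: number;
  envVars?: Record<string, string | undefined>;
  executeLocally?: boolean;
}

// Hypothetical helper showing how the documented defaults would apply.
function resolveOptions(opts: SimforgeOptions): Required<SimforgeOptions> {
  return {
    apiKey: opts.apiKey,
    serviceUrl: opts.serviceUrl ?? "https://simforge.goharvest.ai", // documented default
    timeout: opts.timeout ?? 120_000, // 120 seconds
    envVars: opts.envVars ?? {},
    executeLocally: opts.executeLocally ?? true, // local execution by default
  };
}
```

So a client constructed with only an `apiKey` would run locally against the public service URL with a 120-second timeout.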
## Execution Modes

### Local Execution (Default)

When `executeLocally: true` (the default), the SDK:

- Fetches the BAML function definition from the Simforge server
- Executes the BAML locally using the `@boundaryml/baml` runtime
- Uses your own LLM provider API keys (passed via `envVars`)
- Returns the result directly to you
Benefits:

- Full control over API keys and costs
- Lower latency (no server round-trip for execution)
- Privacy: your data doesn't go through the Simforge server

Requirements:

- You must provide LLM provider API keys via `envVars`
- The BAML runtime runs in your environment (Node.js, browser with polyfills, etc.)
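The local-execution steps listed above can be sketched conceptually as follows. `callLocally` and its parameters are illustrative stand-ins, not the SDK's actual internals:

```typescript
// A BAML runner: given a function definition and arguments, produce a result.
type BamlRunner = (
  definition: string,
  args: Record<string, unknown>
) => Promise<unknown>;

// Conceptual local-execution flow (not the real implementation):
// 1. fetch the BAML definition from the server,
// 2. execute it in-process with the caller's own provider keys.
async function callLocally(
  fetchDefinition: (name: string) => Promise<string>, // e.g. a GET to Simforge
  runBaml: BamlRunner, // e.g. backed by the @boundaryml/baml runtime
  name: string,
  args: Record<string, unknown>
): Promise<unknown> {
  const definition = await fetchDefinition(name); // round-trip for the prompt only
  return runBaml(definition, args); // LLM calls happen locally
}
```

The key property is that only the function definition crosses the network; the input data and the LLM traffic stay in your environment.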
### Server-Side Execution

When `executeLocally: false`, the SDK:

- Sends the function name and inputs to the Simforge server
- The server executes the BAML using its own API keys
- Returns the result to you
Benefits:

- No need to manage LLM provider API keys
- Simpler setup
- Works in any environment

Requirements:

- The Simforge server must have the necessary API keys configured
## Environment Variables

- `SIMFORGE_URL`: The base URL for the Simforge API (used if `serviceUrl` is not provided)
- `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.: LLM provider API keys (required for local execution)
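The fallback order for the service URL can be sketched as a small pure function. `pickServiceUrl` is a hypothetical helper for illustration, not an exported SDK function:

```typescript
// Resolve the service URL: explicit option, then SIMFORGE_URL, then the
// public default documented above.
function pickServiceUrl(
  explicit: string | undefined,
  env: Record<string, string | undefined>
): string {
  return explicit ?? env.SIMFORGE_URL ?? "https://simforge.goharvest.ai";
}
```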
## Testing

A test script is provided to verify both local and server-side execution:

```bash
# Set up environment variables
export SIMFORGE_API_KEY="sf_your_api_key"
export OPENAI_API_KEY="your_openai_key"

# Run the test script
pnpm tsx test-local-execution.ts [function-name] [query]

# Example
pnpm tsx test-local-execution.ts extract-entities "What is the capital of France?"
```

The test script will:

- Test local execution with your API keys
- Test server-side execution
- Compare results and performance
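A performance comparison of the two modes could be built on a timing helper along these lines. `timed` is illustrative only; `test-local-execution.ts` may measure things differently:

```typescript
// Run an async call and report its wall-clock duration in milliseconds.
async function timed<T>(fn: () => Promise<T>): Promise<{ result: T; ms: number }> {
  const start = Date.now();
  const result = await fn();
  return { result, ms: Date.now() - start };
}

// Usage sketch: time the same call in both modes and compare.
//   const local  = await timed(() => localClient.call("analyze", input));
//   const server = await timed(() => serverClient.call("analyze", input));
//   console.log(`local ${local.ms}ms vs server ${server.ms}ms`);
```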
## Publishing

```bash
pnpm build
npm publish --access public
```