# L10nMonster Anthropic Helper

This package provides integration between L10nMonster and Anthropic's Claude models via both the direct API and Google Vertex AI.
## Installation

```bash
npm install @l10nmonster/helpers-anthropic
```

## Usage
### Direct API Configuration

```javascript
import { AnthropicAgent } from '@l10nmonster/helpers-anthropic';

const agent = new AnthropicAgent({
    id: 'claude-translator',
    model: 'claude-3-5-sonnet-latest',
    quality: 80,
    temperature: 0.1,
    maxTokens: 4096,
    apiKey: 'your-anthropic-api-key' // Direct API access
});
```

### Vertex AI Configuration
```javascript
import { AnthropicAgent } from '@l10nmonster/helpers-anthropic';

const agent = new AnthropicAgent({
    id: 'claude-translator',
    model: 'claude-3-5-sonnet@20241022',
    quality: 80,
    temperature: 0.1,
    maxTokens: 4096,
    vertexProject: 'your-gcp-project-id',
    vertexLocation: 'us-central1'
});
```

## Authentication
The `AnthropicAgent` supports two authentication methods:

### Direct API

- **API Key**: Obtain an API key from the Anthropic Console
- **Configuration**: Pass your API key as `apiKey` in the configuration

### Vertex AI

- **Service Account**: Set up a service account with Vertex AI permissions
- **gcloud CLI**: Run `gcloud auth login` and `gcloud config set project YOUR_PROJECT_ID`
- **Environment Variables**: Set `GOOGLE_APPLICATION_CREDENTIALS` to point to your service account key file
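The Vertex AI credential setup above can be summarized as follows (the project ID and key file path are placeholders):

```shell
# Authenticate with your Google account and select the target project
gcloud auth login
gcloud config set project YOUR_PROJECT_ID

# Or point Application Default Credentials at a service account key file
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
```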
## Retry Behavior

The `AnthropicAgent` relies on the native retry mechanism built into both the Anthropic SDK and the Anthropic Vertex SDK:

- **Automatic Retries**: Both SDKs automatically retry with exponential backoff
- **Configurable**: Set `maxRetries` in the constructor to control retry attempts (passed directly to both SDKs)
- **Smart Error Handling**: Network errors, rate limits, and temporary service issues are retried automatically
- **No Manual Implementation**: Unlike other providers, this package uses the SDKs' native retry logic for better reliability
- **Consistent Behavior**: The same `maxRetries` value applies whether you use the direct API or Vertex AI
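For example, a minimal sketch of a retry-tuned configuration (the environment variable used for the API key is a placeholder convention, not required by the package):

```javascript
import { AnthropicAgent } from '@l10nmonster/helpers-anthropic';

// maxRetries is forwarded to the underlying SDK client, which
// applies exponential backoff on retryable errors such as rate limits.
const agent = new AnthropicAgent({
    id: 'claude-translator',
    model: 'claude-3-5-sonnet-latest',
    quality: 80,
    maxRetries: 5, // up to 5 retry attempts, for both direct API and Vertex AI
    apiKey: process.env.ANTHROPIC_API_KEY
});
```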
## Configuration Options

- `model` (required): The Claude model to use
- `quality` (required): Quality score for translations (0-100)
- `apiKey`: Your Anthropic API key (for direct API access)
- `temperature`: Controls randomness (0.0-1.0, default: 0.1)
- `maxTokens`: Maximum output tokens (default: 4096)
- `maxRetries`: Maximum number of retries (passed to the Anthropic SDK)
- `vertexProject`: GCP project ID (for Vertex AI; auto-detected if not provided)
- `vertexLocation`: GCP region (for Vertex AI, default: 'global')
- `persona`: Custom translator persona (optional)
- `customSchema`: Custom response schema (optional)
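Putting several of these options together, a fuller configuration might look like this sketch (all values are illustrative placeholders, including the persona text):

```javascript
import { AnthropicAgent } from '@l10nmonster/helpers-anthropic';

const agent = new AnthropicAgent({
    id: 'claude-translator',
    model: 'claude-3-5-sonnet@20241022',
    quality: 80,
    temperature: 0.1,
    maxTokens: 4096,
    maxRetries: 3,
    vertexProject: 'your-gcp-project-id', // omit to auto-detect
    vertexLocation: 'us-central1',        // default is 'global'
    persona: 'You are a professional localization translator for UI strings.'
});
```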
## Testing

Run the test suite using Node.js' built-in test runner:

```bash
# From the package directory
npm test

# From the workspace root
npm test --workspace=helpers-anthropic
```

The test suite includes:
- Unit tests for all major functionality
- Integration tests with the LLMTranslationProvider inheritance
- Error handling and edge case scenarios
- Mock testing without external API calls
## Requirements

- Node.js >= 22.11.0
- Google Cloud project with the Vertex AI API enabled (Vertex AI only)
- Proper Google Cloud authentication setup (Vertex AI only)
## License
MIT
