
@memberjunction/ai-anthropic • v2.124.0 • Published

MemberJunction Wrapper for Anthropic AI Models

Downloads: 4,931

@memberjunction/ai-anthropic

A comprehensive wrapper for Anthropic's AI models (Claude) that provides a standardized interface within the MemberJunction AI framework. This package enables seamless integration with Claude models while maintaining consistency with MemberJunction's AI abstraction layer.

Features

  • Seamless Integration: Direct integration with Anthropic's Claude models
  • Standardized Interface: Implements MemberJunction's BaseLLM abstract class
  • Streaming Support: Full support for streaming responses
  • Advanced Caching: Ephemeral caching support for improved performance
  • Thinking/Reasoning: Support for Claude's thinking/reasoning capabilities with configurable token budgets
  • Message Formatting: Automatic handling of message format conversions and role mappings
  • Error Handling: Comprehensive error handling with detailed error reporting
  • Token Usage Tracking: Detailed token usage tracking including cached tokens
  • Multiple Models: Support for all Claude model variants (Opus, Sonnet, Haiku, etc.)

Installation

npm install @memberjunction/ai-anthropic

Requirements

  • Node.js 16+
  • An Anthropic API key
  • MemberJunction Core libraries (@memberjunction/ai, @memberjunction/global)

Usage

Basic Setup

import { AnthropicLLM } from '@memberjunction/ai-anthropic';

// Initialize with your API key
const anthropicLLM = new AnthropicLLM('your-anthropic-api-key');

Chat Completion

import { ChatParams } from '@memberjunction/ai';

// Create chat parameters
const chatParams: ChatParams = {
  model: 'claude-3-opus-20240229',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello, can you help me understand how AI works?' }
  ],
  maxOutputTokens: 1000,
  temperature: 0.7,
  enableCaching: true // Enable ephemeral caching
};

// Get a response
try {
  const response = await anthropicLLM.ChatCompletion(chatParams);
  if (response.success) {
    console.log('AI Response:', response.data.choices[0].message.content);
    console.log('Token Usage:', response.data.usage);
    
    // Check cache info if available
    if (response.cacheInfo) {
      console.log('Cache Hit:', response.cacheInfo.cacheHit);
      console.log('Cached Tokens:', response.cacheInfo.cachedTokenCount);
    }
  } else {
    console.error('Error:', response.errorMessage);
  }
} catch (error) {
  console.error('Exception:', error);
}

Streaming Chat Completion

const streamParams: ChatParams = {
  model: 'claude-3-sonnet-20240229',
  messages: [
    { role: 'user', content: 'Write a short story about AI' }
  ],
  maxOutputTokens: 2000,
  streaming: true,
  streamCallback: (content: string) => {
    // Handle streaming chunks as they arrive
    process.stdout.write(content);
  }
};

const response = await anthropicLLM.ChatCompletion(streamParams);

Using Thinking/Reasoning Models

const reasoningParams: ChatParams = {
  model: 'claude-3-opus-20240229',
  messages: [
    { role: 'user', content: 'Solve this complex math problem: ...' }
  ],
  effortLevel: 'high',
  reasoningBudgetTokens: 5000, // Allow up to 5000 tokens for reasoning
  maxOutputTokens: 2000
};

const response = await anthropicLLM.ChatCompletion(reasoningParams);

Text Summarization

import { SummarizeParams } from '@memberjunction/ai';

const text = `Long text that you want to summarize...`;

const summarizeParams: SummarizeParams = {
  text: text,
  model: 'claude-2.1',
  temperature: 0.3,
  maxWords: 100
};

const summary = await anthropicLLM.SummarizeText(summarizeParams);
console.log('Summary:', summary.summary);

Direct Access to Anthropic Client

// Access the underlying Anthropic client for advanced usage
const anthropicClient = anthropicLLM.AnthropicClient;

// Use the client directly if needed for features not exposed by the wrapper
const customResponse = await anthropicClient.messages.create({
  model: 'claude-3-haiku-20240307',
  system: 'You are a helpful assistant.',
  messages: [{ role: 'user', content: 'Hello!' }],
  max_tokens: 500
});

API Reference

AnthropicLLM Class

Extends BaseLLM from @memberjunction/ai to provide Anthropic-specific functionality.

Constructor

new AnthropicLLM(apiKey: string)

Creates a new instance of the AnthropicLLM wrapper.

Parameters:

  • apiKey: Your Anthropic API key

Properties

  • AnthropicClient: (read-only) Returns the underlying Anthropic SDK client instance
  • SupportsStreaming: (read-only) Always returns true, since Anthropic supports streaming responses

Methods

ChatCompletion(params: ChatParams): Promise<ChatResult>

Performs a chat completion using Claude models.

Parameters:

  • params: ChatParams object containing:
    • model: The model to use (e.g., 'claude-3-opus-20240229')
    • messages: Array of chat messages
    • maxOutputTokens: Maximum tokens to generate (default: 64000)
    • temperature: Temperature for randomness (0-1)
    • enableCaching: Enable ephemeral caching (default: true)
    • streaming: Enable streaming responses
    • streamCallback: Callback for streaming chunks
    • effortLevel: Enable thinking/reasoning mode
    • reasoningBudgetTokens: Token budget for reasoning (min: 1)

Returns: ChatResult with response data, usage info, and timing metrics

SummarizeText(params: SummarizeParams): Promise<SummarizeResult>

Summarizes text using Claude's completion API.

Parameters:

  • params: SummarizeParams object containing:
    • text: Text to summarize
    • model: Model to use (default: 'claude-2.1')
    • temperature: Temperature setting
    • maxWords: Maximum words in summary

Returns: SummarizeResult with the generated summary

ConvertMJToAnthropicRole(role: ChatMessageRole): 'assistant' | 'user'

Converts MemberJunction chat roles to Anthropic-compatible roles.

Parameters:

  • role: MemberJunction role ('system', 'user', 'assistant')

Returns: Anthropic role ('assistant' or 'user')
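The mapping itself is simple to sketch. Assuming that 'system' is folded into 'user' (Anthropic's messages array only accepts 'user' and 'assistant'; the wrapper may instead hoist system content into the API's top-level system parameter), a minimal version looks like:

```typescript
// Hypothetical sketch of the role conversion; names mirror the documented
// method but this is not the wrapper's actual implementation.
type ChatMessageRole = 'system' | 'user' | 'assistant';
type AnthropicRole = 'assistant' | 'user';

function convertMJToAnthropicRole(role: ChatMessageRole): AnthropicRole {
  // Anthropic's messages array accepts only 'user' and 'assistant',
  // so anything that is not 'assistant' becomes 'user'.
  return role === 'assistant' ? 'assistant' : 'user';
}
```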

Advanced Features

Caching

The wrapper supports Anthropic's ephemeral caching feature, which can significantly improve performance for repeated queries:

const params: ChatParams = {
  model: 'claude-3-opus-20240229',
  messages: [...],
  enableCaching: true // Caching is enabled by default
};

Caching is automatically applied to the last message in the conversation for optimal performance.

Message Format Handling

The wrapper automatically handles:

  • Conversion of system messages to Anthropic's format
  • Prevention of consecutive messages with the same role
  • Proper formatting of content blocks
  • Automatic insertion of filler messages when needed
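Since Anthropic requires strictly alternating user/assistant turns, the consecutive-role and filler-message handling above can be sketched as follows (function name and filler text are hypothetical, not the wrapper's actual code):

```typescript
type Role = 'user' | 'assistant';
interface Msg { role: Role; content: string }

// Hypothetical sketch: when two consecutive messages share a role, insert
// a minimal filler turn with the opposite role to restore alternation.
function enforceAlternation(messages: Msg[]): Msg[] {
  const out: Msg[] = [];
  for (const msg of messages) {
    const prev = out[out.length - 1];
    if (prev && prev.role === msg.role) {
      out.push({ role: msg.role === 'user' ? 'assistant' : 'user', content: 'OK' });
    }
    out.push(msg);
  }
  return out;
}
```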

Error Handling

The wrapper provides comprehensive error information:

try {
  const response = await anthropicLLM.ChatCompletion(params);
  if (!response.success) {
    console.error('Error:', response.errorMessage);
    console.error('Status:', response.statusText);
    console.error('Time Elapsed:', response.timeElapsed, 'ms');
    console.error('Exception Details:', response.exception);
  }
} catch (error) {
  console.error('Exception occurred:', error);
}

Integration with MemberJunction

This package is designed to work seamlessly with the MemberJunction AI framework:

  1. Consistent Interface: Implements the same methods as other AI providers
  2. Type Safety: Full TypeScript support with proper typing
  3. Global Registration: Automatically registers with MemberJunction's class factory using @RegisterClass decorator
  4. Standardized Results: Returns standardized result objects compatible with MemberJunction's AI abstraction

Dependencies

  • @anthropic-ai/sdk (^0.50.4): Official Anthropic SDK
  • @memberjunction/ai (^2.43.0): MemberJunction AI core framework
  • @memberjunction/global (^2.43.0): MemberJunction global utilities

Development

Building

npm run build

Running in Development

npm start

Supported Parameters

The Anthropic provider supports the following LLM parameters:

Supported:

  • temperature - Controls randomness in the output (0.0-1.0)
  • maxOutputTokens - Maximum number of tokens to generate
  • topP - Nucleus sampling threshold (0.0-1.0)
  • topK - Limits vocabulary to top K tokens
  • stopSequences - Array of sequences where the API will stop generating
  • responseFormat - Output format (Text, JSON, Markdown, etc.)

Not Supported:

  • frequencyPenalty - Not available in Anthropic API
  • presencePenalty - Not available in Anthropic API
  • minP - Not available in Anthropic API
  • seed - Not available in Anthropic API
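The supported parameters correspond directly to fields of Anthropic's messages API. A hypothetical sketch of that translation (parameter names on the left are from this document; the snake_case names on the right are from Anthropic's request schema):

```typescript
interface MJParams {
  temperature?: number;
  maxOutputTokens?: number;
  topP?: number;
  topK?: number;
  stopSequences?: string[];
}

// Sketch: rename camelCase MemberJunction parameters to Anthropic's
// snake_case request fields, dropping anything left undefined.
function toAnthropicParams(p: MJParams): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  if (p.temperature !== undefined) out.temperature = p.temperature;
  if (p.maxOutputTokens !== undefined) out.max_tokens = p.maxOutputTokens;
  if (p.topP !== undefined) out.top_p = p.topP;
  if (p.topK !== undefined) out.top_k = p.topK;
  if (p.stopSequences !== undefined) out.stop_sequences = p.stopSequences;
  return out;
}
```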

License

ISC

Support

For issues and questions: