
@genkit-ai/anthropic v0.2.0

Genkit AI framework plugin for Anthropic APIs.

Genkit Anthropic AI Plugin

@genkit-ai/anthropic is the Anthropic plugin for Genkit. It supersedes the earlier community package genkitx-anthropic and is now maintained by Google.

Supported models

The plugin supports the most recent Anthropic models, such as Claude Haiku 4.5, Claude Sonnet 4.5, and Claude Opus 4.5, as well as all non-retired older models.

Installation

Install the plugin in your project with your favorite package manager:

  • npm install @genkit-ai/anthropic
  • yarn add @genkit-ai/anthropic
  • pnpm add @genkit-ai/anthropic

Usage

Initialize

import { genkit } from "genkit";
import { anthropic } from "@genkit-ai/anthropic";

const ai = genkit({
  plugins: [anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })],
  // specify a default model for generate here if you wish:
  model: anthropic.model("claude-sonnet-4-5"),
});
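
Because a default model is configured above, subsequent generate calls can omit the model field entirely:

// Uses the default model configured in genkit() above
// (claude-sonnet-4-5 in this example).
const reply = await ai.generate({
  prompt: "Summarize the plot of Hamlet in one sentence.",
});

console.log(reply.text);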

Basic examples

The simplest way to generate text is by using the generate method:

const response = await ai.generate({
  model: anthropic.model("claude-haiku-4-5"),
  prompt: "Tell me a joke.",
});

console.log(response.text);

Multi-modal prompt

// ...initialize Genkit instance (as shown above)...

const response = await ai.generate({
  prompt: [
    { text: "What animal is in the photo?" },
    { media: { url: imageUrl } },
  ],
});
console.log(response.text);
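
The media URL can also be an inline data: URL, and a contentType hint helps when the URL itself does not convey the format. A minimal sketch, assuming the plugin accepts base64 image data inlined this way (imageBase64 is a placeholder for base64-encoded image bytes):

const localImageResponse = await ai.generate({
  prompt: [
    { text: "What animal is in the photo?" },
    {
      media: {
        // Placeholder: base64-encoded image bytes inlined as a data: URL.
        url: `data:image/jpeg;base64,${imageBase64}`,
        contentType: "image/jpeg",
      },
    },
  ],
});
console.log(localImageResponse.text);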

Extended thinking

Claude 4.5 models can expose their internal reasoning. Enable it per-request with the Anthropic thinking config and read the reasoning from the response:

const response = await ai.generate({
  prompt: "Walk me through your reasoning for Fermat’s little theorem.",
  config: {
    thinking: {
      enabled: true,
      budgetTokens: 4096, // Must be >= 1024 and less than max_tokens
    },
  },
});

console.log(response.text); // Final assistant answer
console.log(response.reasoning); // Summarized thinking steps

When thinking is enabled, request bodies sent through the plugin include the thinking payload ({ type: 'enabled', budget_tokens: … }) that Anthropic's API expects, and streamed responses deliver reasoning parts as they arrive so you can render the chain-of-thought incrementally.
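
A minimal streaming sketch, assuming Genkit's generateStream helper and that reasoning arrives on chunks under the same reasoning field used on the full response:

const { stream, response } = ai.generateStream({
  prompt: "Walk me through your reasoning for Fermat's little theorem.",
  config: {
    thinking: { enabled: true, budgetTokens: 4096 },
  },
});

for await (const chunk of stream) {
  // chunk.reasoning is assumed here to mirror response.reasoning above.
  if (chunk.reasoning) process.stdout.write(chunk.reasoning);
  if (chunk.text) process.stdout.write(chunk.text);
}

console.log("\nFinal answer:", (await response).text);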

Document Citations

Claude can cite specific parts of documents you provide, making it easy to trace where information in the response came from. Use the anthropicDocument() helper to create citable documents. For more details, see the Anthropic Citations documentation.

import { anthropic, anthropicDocument } from "@genkit-ai/anthropic";

const response = await ai.generate({
  model: anthropic.model("claude-sonnet-4-5"),
  messages: [
    {
      role: "user",
      content: [
        anthropicDocument({
          source: {
            type: "text",
            data: "The grass is green. The sky is blue. Water is wet.",
          },
          title: "Basic Facts",
          citations: { enabled: true },
        }),
        { text: "What color is the grass? Cite your source." },
      ],
    },
  ],
});

// Access citations from response parts
const citations = response.message?.content?.flatMap(
  (part) => part.metadata?.citations || [],
) ?? [];

console.log("Citations:", citations);

Important: Citations must be enabled on all documents in a request, or on none of them. You cannot mix documents with citations enabled and disabled in the same request.

Supported document source types:

  • text - Plain text documents (returns char_location citations)
  • base64 - Base64-encoded PDFs (returns page_location citations)
  • url - PDFs accessible via URL (returns page_location citations)
  • content - Custom content blocks with text/images (returns content_block_location citations)
  • file - File references from Anthropic's Files API, beta API only (returns page_location citations)

Citations are returned in the response parts' metadata and include information about the document index, cited text, and location (character indices, page numbers, or block indices depending on the source type).
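
For example, a sketch of a URL-backed PDF document, reusing the imports from the example above (the URL is a placeholder, and the exact source field names are assumed to follow Anthropic's document source shape):

const pdfResponse = await ai.generate({
  model: anthropic.model("claude-sonnet-4-5"),
  messages: [
    {
      role: "user",
      content: [
        anthropicDocument({
          // "url" sources return page_location citations.
          source: { type: "url", url: "https://example.com/report.pdf" },
          title: "Quarterly Report",
          citations: { enabled: true },
        }),
        { text: "Summarize the key findings and cite the pages you used." },
      ],
    },
  ],
});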

Prompt Caching

You can cache prompts by adding cache_control metadata to the prompt. Use the cacheControl() helper for type-safe cache configuration. You can cache system messages, user messages, tools, and media.

import { anthropic, cacheControl } from "@genkit-ai/anthropic";

const response = await ai.generate({
  model: anthropic.model("claude-sonnet-4-5"),
  system: {
    text: longSystemPrompt,
    metadata: { ...cacheControl() }, // default: ephemeral
  },
  messages: [
    {
      role: "user",
      content: [{ text: "What is the main idea of the text?" }],
    },
  ],
});

// Or with an explicit TTL:
metadata: { ...cacheControl({ ttl: "1h" }) },

// Or using the type directly:
import { type AnthropicCacheControl } from "@genkit-ai/anthropic";
metadata: { cache_control: { type: "ephemeral", ttl: "5m" } as AnthropicCacheControl },

Note: Caching only takes effect when the prompt exceeds a minimum token length, which is documented in the Anthropic API documentation.
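
The same helper applies to other cacheable content. A sketch of caching a long user message part, assuming part-level metadata carries cache_control the same way the system message does above (longDocumentText is a placeholder):

const cachedResponse = await ai.generate({
  model: anthropic.model("claude-sonnet-4-5"),
  messages: [
    {
      role: "user",
      content: [
        {
          text: longDocumentText, // large, reusable context worth caching
          metadata: { ...cacheControl({ ttl: "5m" }) },
        },
        { text: "List the action items mentioned in the document." },
      ],
    },
  ],
});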

Beta API Limitations

The beta API surface provides access to experimental features, but some server-managed tool blocks are not yet supported by this plugin. The following beta API features will cause an error if encountered:

  • web_fetch_tool_result
  • code_execution_tool_result
  • bash_code_execution_tool_result
  • text_editor_code_execution_tool_result
  • mcp_tool_result
  • mcp_tool_use
  • container_upload

Note that server_tool_use and web_search_tool_result ARE supported and work with both stable and beta APIs.

Within a flow

import { z } from "genkit";

// ...initialize Genkit instance (as shown above)...

export const jokeFlow = ai.defineFlow(
  {
    name: "jokeFlow",
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await ai.generate({
      prompt: `tell me a joke about ${subject}`,
    });
    return llmResponse.text;
  },
);
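
Flows are plain async functions, so you can call jokeFlow directly in code (it can also be run and inspected from the Genkit Developer UI):

const joke = await jokeFlow("bananas");
console.log(joke);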

Direct model usage (without Genkit instance)

The plugin supports Genkit Plugin API v2, which allows you to use models directly without initializing the full Genkit framework:

import { anthropic } from "@genkit-ai/anthropic";

// Create a model reference directly
const claude = anthropic.model("claude-sonnet-4-5");

// Use the model directly
const response = await claude({
  messages: [
    {
      role: "user",
      content: [{ text: "Tell me a joke." }],
    },
  ],
});

console.log(response);

You can create multiple model references with the plugin's model() method and call whichever one you need:

import { anthropic } from "@genkit-ai/anthropic";

// Create model references
const claudeHaiku45 = anthropic.model("claude-haiku-4-5");
const claudeSonnet45 = anthropic.model("claude-sonnet-4-5");
const claudeOpus45 = anthropic.model("claude-opus-4-5");

// Use the model reference directly
const response = await claudeSonnet45({
  messages: [
    {
      role: "user",
      content: [{ text: "Hello!" }],
    },
  ],
});

This approach is useful for:

  • Framework developers who need raw model access
  • Testing models in isolation
  • Using Genkit models in non-Genkit applications

Acknowledgements

This plugin builds on the community work published as genkitx-anthropic by Bloom Labs Inc. Their Apache 2.0–licensed implementation provided the foundation for this maintained package.

Credits

This plugin is maintained by Google with acknowledgement to the community contributions from Bloom Labs Inc.