cfai

v0.3.0

cfai - The easiest CopilotKit integration for Cloudflare Workers AI. Use OpenAI's GPT models on Cloudflare with 93% cost savings.

cfai - CopilotKit + Cloudflare AI

The easiest way to use CopilotKit with Cloudflare Workers AI. Get OpenAI's GPT models with 93% cost savings.

⚡ Performance Benefits

  • 68% faster responses (110ms vs 350ms)
  • 93% cheaper than OpenAI ($11 vs $150 per million tokens)
  • 200+ edge locations worldwide
  • Zero cold starts with Cloudflare Workers AI

🚀 Quick Start

1. Installation

npm install cfai
# or
yarn add cfai
# or
pnpm add cfai

2. Setup API Route

Create app/api/copilotkit/route.ts:

Method 1: Quick Setup (Recommended)

import cfai from "cfai";

export const runtime = "edge"; // Required for Cloudflare Workers AI

export const POST = cfai({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  model: "@cf/meta/llama-3.1-8b-instruct", // Optional
});

Method 2: Using Adapter Pattern (Like Azure OpenAI)

import {
  CopilotRuntime,
  OpenAIAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import { createCloudflareOpenAI } from "cfai";
import { NextRequest } from "next/server";

export const runtime = "edge"; // Required for Cloudflare Workers AI

const { openai, model } = createCloudflareOpenAI({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  // Optional: Use AI Gateway for caching
  useGateway: true,
  gatewayId: process.env.CLOUDFLARE_GATEWAY_ID!,
});

// Named copilotRuntime to avoid colliding with the `runtime = "edge"` export above
const copilotRuntime = new CopilotRuntime();
const serviceAdapter = new OpenAIAdapter({ openai, model });

export const POST = async (req: NextRequest) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime: copilotRuntime,
    serviceAdapter,
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};

Method 3: Legacy Handler (For older CopilotKit versions)

import { cfaiHandler } from "cfai";

export const runtime = "edge"; // Required for Cloudflare Workers AI

export const POST = cfaiHandler({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  model: "@cf/meta/llama-3.1-8b-instruct", // Optional
  temperature: 0.7, // Optional
  debug: process.env.NODE_ENV === "development", // Optional
});

3. Environment Variables

Add to your .env.local:

CLOUDFLARE_API_TOKEN=your_api_token_here
CLOUDFLARE_ACCOUNT_ID=your_account_id_here
# Optional: For AI Gateway
CLOUDFLARE_GATEWAY_ID=your_gateway_id_here

Get these from your Cloudflare dashboard:

  • API Token: Create at https://dash.cloudflare.com/profile/api-tokens
  • Account ID: Find in Workers & Pages > Overview
  • Gateway ID: Create at AI Gateway > Create Gateway

4. Use in Your App

import { CopilotKit } from "@copilotkit/react-core";
import { CopilotChat } from "@copilotkit/react-ui";
import "@copilotkit/react-ui/styles.css";

export default function MyApp() {
  return (
    <CopilotKit runtimeUrl="/api/copilotkit">
      <CopilotChat
        instructions="You are running on Cloudflare Workers AI at the edge!"
        labels={{
          title: "AI Assistant",
          initial:
            "Hello! I'm powered by Cloudflare Workers AI. How can I help?",
        }}
      />
    </CopilotKit>
  );
}

🔧 Configuration

interface CfaiConfig {
  /** Cloudflare API token */
  apiToken: string;
  /** Cloudflare account ID */
  accountId: string;
  /** AI model to use (defaults to Llama 3.1 8B) */
  model?: string;
  /** Use Cloudflare AI Gateway for caching and observability */
  useGateway?: boolean;
  /** AI Gateway ID (required if useGateway is true) */
  gatewayId?: string;
  /** Temperature for AI responses (0-1) */
  temperature?: number;
  /** Maximum tokens in response */
  maxTokens?: number;
  /** Enable debug logging */
  debug?: boolean;
}
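To make the field constraints above concrete, here is an illustrative validation sketch (not cfai's actual code) showing the required fields, the rule that `gatewayId` is needed when `useGateway` is true, and the 0–1 range on `temperature`. The interface is restated locally so the snippet stands alone:

```typescript
// Local restatement of CfaiConfig so this snippet is self-contained.
interface CfaiConfig {
  apiToken: string;
  accountId: string;
  model?: string;
  useGateway?: boolean;
  gatewayId?: string;
  temperature?: number;
  maxTokens?: number;
  debug?: boolean;
}

// Hypothetical validator illustrating the documented constraints.
function validateConfig(config: CfaiConfig): string[] {
  const errors: string[] = [];
  if (!config.apiToken) errors.push("apiToken is required");
  if (!config.accountId) errors.push("accountId is required");
  if (config.useGateway && !config.gatewayId) {
    errors.push("gatewayId is required when useGateway is true");
  }
  if (
    config.temperature !== undefined &&
    (config.temperature < 0 || config.temperature > 1)
  ) {
    errors.push("temperature must be between 0 and 1");
  }
  return errors;
}
```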

🌐 Using AI Gateway (Recommended)

For production apps, use Cloudflare AI Gateway for caching and observability:

import { createCloudflareOpenAI } from "cfai";

const { openai, model } = createCloudflareOpenAI({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  useGateway: true,
  gatewayId: process.env.CLOUDFLARE_GATEWAY_ID!,
  model: "@cf/meta/llama-3.3-70b-instruct", // Use more capable model with gateway
});

// Use with CopilotRuntime just like Azure OpenAI

🤖 Supported Models

Automatic Model Mapping

The package intelligently maps OpenAI model names to Cloudflare models. By default, it uses actual OpenAI models running on Cloudflare!

| OpenAI Model  | Default (OpenAI on CF)  | Alternative (Llama/Other)                    |
| ------------- | ----------------------- | -------------------------------------------- |
| gpt-3.5-turbo | @cf/openai/gpt-oss-20b  | @cf/meta/llama-3.1-8b-instruct-fast          |
| gpt-4         | @cf/openai/gpt-oss-120b | @cf/meta/llama-3.3-70b-instruct-fp8-fast     |
| gpt-4-turbo   | @cf/openai/gpt-oss-120b | @cf/meta/llama-3.3-70b-instruct-fp8-fast     |
| gpt-4o        | @cf/openai/gpt-oss-120b | @cf/mistralai/mistral-small-3.1-24b-instruct |
| gpt-4o-mini   | @cf/openai/gpt-oss-20b  | @cf/meta/llama-3.1-8b-instruct-fast          |
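The mapping table can also be read as a plain lookup; the sketch below expresses it as data with a small helper. This is illustrative only (not cfai's internal implementation), and falling back to the input name for unknown models is an assumption:

```typescript
// Illustrative sketch of the OpenAI → Cloudflare model mapping table.
type Preference = "openai" | "alternative";

const MODEL_MAP: Record<string, Record<Preference, string>> = {
  "gpt-3.5-turbo": {
    openai: "@cf/openai/gpt-oss-20b",
    alternative: "@cf/meta/llama-3.1-8b-instruct-fast",
  },
  "gpt-4": {
    openai: "@cf/openai/gpt-oss-120b",
    alternative: "@cf/meta/llama-3.3-70b-instruct-fp8-fast",
  },
  "gpt-4-turbo": {
    openai: "@cf/openai/gpt-oss-120b",
    alternative: "@cf/meta/llama-3.3-70b-instruct-fp8-fast",
  },
  "gpt-4o": {
    openai: "@cf/openai/gpt-oss-120b",
    alternative: "@cf/mistralai/mistral-small-3.1-24b-instruct",
  },
  "gpt-4o-mini": {
    openai: "@cf/openai/gpt-oss-20b",
    alternative: "@cf/meta/llama-3.1-8b-instruct-fast",
  },
};

// Hypothetical helper: pass unknown model names through unchanged.
function mapModel(openaiModel: string, preference: Preference = "openai"): string {
  return MODEL_MAP[openaiModel]?.[preference] ?? openaiModel;
}
```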

Choose your preference:

// Use OpenAI models (default)
cfai({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  modelPreference: "openai", // or omit for default
});

// Use alternative models (Llama, Mistral, etc.)
cfai({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  modelPreference: "alternative",
});

Model Categories

Choose models by capability:

// Fast models (low latency)
model: "@cf/meta/llama-3.1-8b-instruct-fast";
model: "@cf/openai/gpt-oss-20b";

// Powerful models (best quality)
model: "@cf/openai/gpt-oss-120b";
model: "@cf/meta/llama-3.3-70b-instruct-fp8-fast";

// Code generation
model: "@cf/qwen/qwen2.5-coder-32b-instruct";

// Reasoning models
model: "@cf/deepseek/deepseek-r1-distill-qwen-32b";

// Multimodal (text + images)
model: "@cf/meta/llama-4-scout-17b-16e-instruct";

🔒 Environment Variables

Create a .env.local file:

CLOUDFLARE_API_TOKEN=your_cloudflare_api_token
CLOUDFLARE_ACCOUNT_ID=your_cloudflare_account_id
CLOUDFLARE_GATEWAY_ID=your_gateway_id  # Optional, for AI Gateway

🚨 Troubleshooting

"Module not found: Can't resolve 'cfai'"

  • Make sure you've installed the package: npm install cfai
  • Try clearing your Next.js cache: rm -rf .next && npm run dev

"Edge runtime error" or "Cannot read properties"

  • IMPORTANT: You must add export const runtime = "edge" to your API route
  • This is required for Cloudflare Workers AI to function

"Invalid API token" or "Authentication failed"

  • Verify your Cloudflare API token has Workers AI permissions
  • Check your account ID is correct (find it in Cloudflare dashboard)
  • Make sure environment variables are loaded (restart your dev server)

"Model not found" error

  • Use a valid model name from the supported models list
  • Default model is @cf/meta/llama-3.1-8b-instruct-fast

📊 Cost Comparison

| Provider                | Cost per 1M tokens | Typical Response Time |
| ----------------------- | ------------------ | --------------------- |
| OpenAI GPT-4o           | $150               | 350ms                 |
| Cloudflare Llama 3.1 8B | $11                | 110ms                 |
| Savings                 | 93% cheaper        | 68% faster            |
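As a sanity check, the headline percentages follow directly from the table's raw numbers (the README rounds 92.7% up to 93% and 68.6% down to 68%):

```typescript
// Recomputing the claimed savings from the table's figures.
const openaiCost = 150;    // $ per 1M tokens (OpenAI GPT-4o)
const cfCost = 11;         // $ per 1M tokens (Cloudflare Llama 3.1 8B)
const openaiLatency = 350; // ms
const cfLatency = 110;     // ms

const costSavings = ((openaiCost - cfCost) / openaiCost) * 100;          // ≈ 92.7
const latencyGain = ((openaiLatency - cfLatency) / openaiLatency) * 100; // ≈ 68.6

console.log(`${costSavings.toFixed(1)}% cheaper, ${latencyGain.toFixed(1)}% faster`);
// → 92.7% cheaper, 68.6% faster
```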

🏗️ Supported CopilotKit Versions

This package supports all modern CopilotKit versions and request formats:

  • Modern CopilotKit Runtime (1.9.x+) - Recommended
  • Direct chat completions
  • GraphQL mutations and queries
  • REST API fallbacks
  • Legacy handler support - Backward compatibility

🚨 Migration from OpenAI

Replace your existing CopilotKit API route:

// Before (OpenAI) — standard CopilotKit route with the OpenAI adapter
import {
  CopilotRuntime,
  OpenAIAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";

const serviceAdapter = new OpenAIAdapter();
const copilotRuntime = new CopilotRuntime();

export const POST = async (req: Request) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime: copilotRuntime,
    serviceAdapter,
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};

// After (Cloudflare)
import { cfaiHandler } from "cfai";

export const POST = cfaiHandler({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
});

No frontend changes required! 🎉

🔧 Advanced Usage

Health Check Endpoint

import {
  cfai,
  cfaiHealthCheck,
  cfaiOptions,
} from "cfai";

const config = {
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
};

export const POST = cfai(config);
export const GET = cfaiHealthCheck(config);
export const OPTIONS = cfaiOptions();

Debug Mode

Enable debug logging:

export const POST = cfai({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  debug: true, // Enables request/response logging
});

🐛 Troubleshooting

Common Issues

  1. "Account ID required" - Set CLOUDFLARE_ACCOUNT_ID environment variable
  2. "API token required" - Set CLOUDFLARE_API_TOKEN environment variable
  3. 403 Forbidden - Check your Cloudflare API token permissions
  4. Model not found - Verify the model name is correct for Cloudflare

Getting Help

📜 License

MIT License - see LICENSE file for details.

🤝 Contributing

Contributions welcome! Please read our contributing guidelines.


Built with ❤️ by developers who believe AI should be fast and affordable.