
@vivgrid/ai-sdk-provider

v1.1.1

Vercel AI SDK provider for vivgrid AI Bridge platform
Vivgrid Vercel AI SDK Provider

The official Vercel AI SDK provider for Vivgrid, a global AI inference infrastructure platform.

Installation

npm install @vivgrid/ai-sdk-provider
# or
pnpm add @vivgrid/ai-sdk-provider
# or
yarn add @vivgrid/ai-sdk-provider

Setup

1. Obtain Your API Key

Retrieve your API key from the Vivgrid Console.

2. Set Environment Variable

export VIVGRID_API_KEY=your-api-key

Usage

Server-Managed Models and System Prompts

Model and System Prompt configurations are managed through the Vivgrid web console, eliminating the need to specify them in your code:

import { vivgrid } from "@vivgrid/ai-sdk-provider";
import { generateText } from "ai";

const { text } = await generateText({
  // Uses the model configured in the web console
  model: vivgrid(), 
  // System prompt is automatically attached based on console configuration
  prompt: "Write a vegetable lasagna recipe for 4 people",
});

console.log(text);

Streaming Text Generation

import { vivgrid } from "@vivgrid/ai-sdk-provider";
import { streamText } from "ai";

const { textStream } = await streamText({
  model: vivgrid(),
  prompt: "Tell me a story about a brave little rabbit",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}

Serverless LLM Functions with MCP Support

Tools (LLM function calling) can be written in TypeScript with full static typing. They are decoupled from the main codebase, and the serverless architecture reduces management overhead. In addition, every tool is automatically served as an MCP (Model Context Protocol) server.

Here's how to implement a get-weather tool:

const description = "Get the current weather for `city_name`";

export type Argument = {
  /**
   * The name of the city to be queried
   */
  city_name: string;
};

export async function handler(args: Argument) {
  // `getWeather` is your own implementation, e.g. a call to a weather API.
  const result = await getWeather(args.city_name);
  return result;
}

Then deploy it to Vivgrid, and use it in your code like this:

import { vivgrid } from "@vivgrid/ai-sdk-provider";
import { generateText } from "ai";

const { text, toolCalls } = await generateText({
  model: vivgrid(),
  prompt: "What's the weather like in San Francisco?",
  /* No need to define tools locally anymore */
  // tools: {
  //   weather: {
  //     description: "Get weather for a specified location",
  //     parameters: z.object({
  //       location: z.string().describe("City name"),
  //     }),
  //     execute: async ({ location }) => {
  //       // Actual weather API call
  //       return `The weather in ${location} is sunny, 22°C`;
  //     },
  //   },
  // },
});

You can explore more Serverless LLM Function examples and deploy them to Vivgrid with one click.
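Before deploying, the `handler` from the get-weather example can be smoke-tested locally. Here's a minimal, self-contained sketch; the `getWeather` stub and the `Weather` type are assumptions standing in for your real weather helper:

```typescript
type Weather = { city: string; condition: string; tempC: number };

// Stub for the external getWeather helper referenced in the tool above.
// A real tool would call a weather API here.
async function getWeather(cityName: string): Promise<Weather> {
  return { city: cityName, condition: "sunny", tempC: 22 };
}

type Argument = {
  /** The name of the city to be queried */
  city_name: string;
};

export async function handler(args: Argument): Promise<Weather> {
  return getWeather(args.city_name);
}

// Usage (local smoke test before deploying to Vivgrid):
// handler({ city_name: "San Francisco" }).then((w) => console.log(w));
```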

Object Generation

import { vivgrid } from "@vivgrid/ai-sdk-provider";
import { generateObject } from "ai";
import { z } from "zod";

const { object } = await generateObject({
  model: vivgrid(),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(
        z.object({
          name: z.string(),
          amount: z.string(),
        })
      ),
      steps: z.array(z.string()),
    }),
  }),
  prompt: "Generate a pasta recipe",
});

console.log(object.recipe);
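If you pass the generated object across a boundary where the zod schema isn't available, a plain type guard mirroring the schema above can re-check its shape at runtime. This is an illustrative sketch, not part of the SDK:

```typescript
// Plain TypeScript equivalent of the recipe shape from the zod schema above.
type Recipe = {
  name: string;
  ingredients: { name: string; amount: string }[];
  steps: string[];
};

// Runtime type guard mirroring the schema, for call sites without zod.
function isRecipe(value: unknown): value is Recipe {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.name === "string" &&
    Array.isArray(v.ingredients) &&
    v.ingredients.every(
      (i: unknown) =>
        typeof i === "object" &&
        i !== null &&
        typeof (i as { name?: unknown }).name === "string" &&
        typeof (i as { amount?: unknown }).amount === "string"
    ) &&
    Array.isArray(v.steps) &&
    v.steps.every((s: unknown) => typeof s === "string")
  );
}
```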

Model Management

All AI model selection and configuration is managed through the Vivgrid Console. You can:

  • Select and switch between different AI models (OpenAI, Anthropic Claude, etc.)
  • Configure model parameters
  • Manage usage quotas
  • Monitor usage statistics

This approach provides the flexibility to switch and manage models without modifying your codebase.

Features

  • ✅ Server-managed models
  • ✅ Server-managed system prompts
  • ✅ Build LLM functions with strongly-typed language support
  • ✅ Serverless tools / MCP integration
  • ✅ Globally deployed models and tools
  • ✅ Text generation
  • ✅ Streaming text generation
  • ✅ Object generation (structured outputs)
  • ✅ JSON mode
  • ✅ OpenAI API compatible

Advanced Configuration

JSON Mode

const model = vivgrid({
  jsonMode: true, // Forces the model to output valid JSON
});

Structured Outputs

const model = vivgrid({
  structuredOutputs: true, // Enables structured outputs (default: true)
});

Error Handling

try {
  const { text } = await generateText({
    model: vivgrid(),
    prompt: "Hello",
  });
} catch (error) {
  if (error instanceof Error) {
    console.error("Error:", error.message);
  }
}
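For transient failures (network hiccups, rate limits), the call above can be wrapped in a simple retry helper with exponential backoff. This helper is illustrative and not part of the Vivgrid SDK:

```typescript
// Illustrative retry helper (not part of the SDK): retries an async
// operation with exponential backoff, rethrowing the last error on failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

// Usage (hypothetical):
// const { text } = await withRetry(() =>
//   generateText({ model: vivgrid(), prompt: "Hello" })
// );
```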

Contributing

We welcome issues and pull requests! Please feel free to contribute to this project.

License

MIT
