AI Gateway Client

The AI Gateway Client is a universal (browser & Node.js) TypeScript module that provides seamless integration with the AI Gateway. Quickly and securely integrate Generative AI models from OpenAI (and soon to be others) into your client and server applications.

The following OpenAI endpoints and models are supported:

Chat Models:

  • gpt-3.5-turbo
  • gpt-3.5-turbo-0613
  • gpt-3.5-turbo-16k
  • gpt-3.5-turbo-16k-0613
  • gpt-4
  • gpt-4-0613
  • gpt-4-32k
  • gpt-4-32k-0613

Embedding Models:

  • text-embedding-ada-002

Legacy Completion Models:

  • gpt-3.5-turbo

Installation

To install the AI Gateway Client package:

npm install @mdb-devx/ai-gateway

Usage

Creating the client:

import { AIGatewayClient } from "@mdb-devx/ai-gateway";
const client = new AIGatewayClient({
  url: "https://example.com",
  jwt: "SOME_OKTA_JWT_TOKEN",
});
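
In a Node.js application you will usually want to keep the JWT out of source control. A minimal sketch, assuming the same constructor options as above and a hypothetical AI_GATEWAY_JWT environment variable:

import { AIGatewayClient } from "@mdb-devx/ai-gateway";

// AI_GATEWAY_JWT is a hypothetical variable name; export your Okta JWT under
// whatever name fits your deployment before the process starts.
const jwt = process.env.AI_GATEWAY_JWT;
if (!jwt) {
  throw new Error("AI_GATEWAY_JWT is not set");
}

const client = new AIGatewayClient({
  url: "https://example.com", // your gateway's base URL
  jwt,
});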

Using the client:

Chat Completions:

import { Chat, RequestOpts } from '@mdb-devx/ai-gateway';

const chat: Chat = await client.chat({
  model: "gpt-3.5-turbo-0613",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is the captial of Canada?" },
  ],
  max_tokens: 10,
  temperature: 0.2,
}, {} as RequestOpts);
const result = chat.choices[0].message.content;
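
As a usage sketch, the call can be wrapped in a small helper that sends a single question and returns the assistant's reply. ask() is a hypothetical convenience function; the response shape (chat.choices[0].message.content) follows the example above:

import { Chat, RequestOpts } from '@mdb-devx/ai-gateway';

// Hypothetical wrapper around client.chat() for single-turn questions.
async function ask(question: string): Promise<string> {
  const chat: Chat = await client.chat({
    model: "gpt-3.5-turbo-0613",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: question },
    ],
    max_tokens: 256,
    temperature: 0.2,
  }, {} as RequestOpts);
  return chat.choices[0].message.content;
}

console.log(await ask("What is the capital of Canada?"));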

Embeddings:

import { Embedding, RequestOpts } from '@mdb-devx/ai-gateway';

const embedding: Embedding = await client.embedding({
  input: "The food was delicious and the waiter...",
  model: "text-embedding-ada-002",
}, {} as RequestOpts);
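
Embedding vectors are typically compared with cosine similarity. The sketch below assumes the Embedding response mirrors OpenAI's shape, with the vector available as a number[] at data[0].embedding; adjust the accessor if the client's type differs:

import { Embedding, RequestOpts } from '@mdb-devx/ai-gateway';

// Plain cosine similarity over two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const [a, b]: Embedding[] = await Promise.all([
  client.embedding({ input: "The food was delicious.", model: "text-embedding-ada-002" }, {} as RequestOpts),
  client.embedding({ input: "The meal tasted great.", model: "text-embedding-ada-002" }, {} as RequestOpts),
]);

// Assumption: the vector lives at data[0].embedding, as in OpenAI's API.
console.log(cosineSimilarity(a.data[0].embedding, b.data[0].embedding));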

Legacy Completions:

import { Completion, RequestOpts } from '@mdb-devx/ai-gateway';

const completion: Completion = await client.completion({
  model: "gpt-3.5-turbo", // Older legacy completion models (e.g. text-davinci-003) are unsupported.
  prompt: "The capital of Canada is",
  max_tokens: 10,
  temperature: 0.2,
}, {} as RequestOpts);
const result = completion.choices[0].text;

Stream completions:

Both chat and legacy completions can stream the response as the model produces tokens. The client provides separate methods, client.chatStream() and client.completionStream(), that hook directly into the stream lifecycle.

import { ChatChunk, EventSourceMessage, StreamOpts } from '@mdb-devx/ai-gateway'; // or CompletionChunk for completionStream()

const ctrl = new AbortController();
await client.chatStream(
  {
    model: "gpt-3.5-turbo-0613",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What is the captial of Canada?" },
    ],
    stream: true,
    max_tokens: 10,
    temperature: 0.2,
  },
  {
    onopen(response: Response) {
      // Called when a response is received. If not provided, will default to
      // a basic validation to ensure the content-type is text/event-stream.
    },
    onmessage(data: ChatChunk, event: EventSourceMessage) {
      // Called on each message. Response data is parsed from JSON event.data
    },
    onerror(error: unknown) {
      // Rethrow to stop the operation.
      // Do nothing to automatically retry.
      // You can also return a specific retry interval here.
    },
    onclose() {
      // Called when a response finishes. If you don't expect the server to kill
      // the connection, you can throw an exception here and retry using onerror.
    },
    signal: ctrl.signal, // Stream can be cancelled elsewhere with ctrl.abort()
  } as StreamOpts
);
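
A common pattern is to accumulate the streamed deltas into a full reply inside onmessage. The sketch below assumes ChatChunk mirrors OpenAI's streaming chunk shape, where each chunk carries an incremental choices[0].delta.content string:

import { ChatChunk, EventSourceMessage, StreamOpts } from '@mdb-devx/ai-gateway';

const ctrl = new AbortController();
let reply = "";

await client.chatStream(
  {
    model: "gpt-3.5-turbo-0613",
    messages: [{ role: "user", content: "What is the capital of Canada?" }],
    stream: true,
    max_tokens: 10,
    temperature: 0.2,
  },
  {
    onmessage(data: ChatChunk, event: EventSourceMessage) {
      // Assumption: each chunk carries an incremental delta, as in OpenAI's API.
      reply += data.choices?.[0]?.delta?.content ?? "";
    },
    onerror(error: unknown) {
      throw error; // rethrow so the stream is not retried
    },
    onclose() {
      console.log(reply); // the fully assembled answer
    },
    signal: ctrl.signal, // call ctrl.abort() elsewhere to cancel the stream
  } as StreamOpts
);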