ai-sdk-elements

v0.1.0

Rich UI elements for the Vercel AI SDK. Define, enrich, and render structured content inline with LLM text.

LLMs output @name{...json...} markers inline with text. The server parses these markers, enriches them with external data, and streams the results to the client. The client renders them as React components with loading, error, and ready states.
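For instance, a raw model response containing a marker might look like this (an illustrative sample following the `@name{...json...}` convention above, not actual library output):

```
Here's the current weather: @weather{"city":"Paris"} Pack an umbrella!
```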

Install

npm install ai-sdk-elements ai zod react streamdown

ai (v5+), zod (v4+), and react are peer dependencies.

Quick Start

1. Define an element (server)

import { defineElement } from "ai-sdk-elements";
import { z } from "zod";

const weatherElement = defineElement({
  name: "weather",
  description: "Display current weather for a city",
  schema: z.object({
    city: z.string().describe("City name"),
  }),
  enrich: async (input, deps) => {
    const response = await fetch(
      `https://api.weather.example/v1/current?city=${encodeURIComponent(input.city)}`,
    );
    const data = await response.json();
    return { city: input.city, temperature: data.temperature, condition: data.condition };
  },
});

2. Define the element UI (client)

import { defineElementUI } from "ai-sdk-elements";
import { z } from "zod";

const weatherElementUI = defineElementUI({
  name: "weather",
  dataSchema: z.object({
    city: z.string(),
    temperature: z.number(),
    condition: z.string(),
  }),
  render: (data) => (
    <div className="weather-card">
      <h3>{data.city}</h3>
      <p>
        {data.temperature}° — {data.condition}
      </p>
    </div>
  ),
  loading: () => <div className="skeleton">Loading weather...</div>,
  error: (error) => <div className="error">Failed to load weather: {error}</div>,
});

3. Generate the system prompt and stream

Use generateElementPrompt to build the instruction block, then pass it as part of the system prompt to streamText or a ToolLoopAgent.

With streamText

import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { generateElementPrompt } from "ai-sdk-elements";
import { createElementStream } from "ai-sdk-elements/server";

const elementPrompt = generateElementPrompt([weatherElement]);

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4.1"),
    system: `You are a helpful assistant.\n\n${elementPrompt}`,
    messages,
    abortSignal: req.signal,
  });

  const enrichedStream = createElementStream({
    source: result.toUIMessageStream(),
    elements: [weatherElement],
    abortSignal: req.signal,
  });

  return new Response(enrichedStream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}

With ToolLoopAgent

import { ToolLoopAgent, createAgentUIStreamResponse } from "ai";
import { openai } from "@ai-sdk/openai";
import { generateElementPrompt } from "ai-sdk-elements";

const elementPrompt = generateElementPrompt([weatherElement]);

const agent = new ToolLoopAgent({
  model: openai("gpt-4.1"),
  instructions: `You are a helpful assistant.\n\n${elementPrompt}`,
  tools: {
    /* your tools here */
  },
});

4. Process the stream (server)

import { createElementStream } from "ai-sdk-elements/server";

const enrichedStream = createElementStream({
  source: aiSdkStream, // ReadableStream<UIMessageChunk> from the AI SDK
  elements: [weatherElement],
  deps: {}, // Dependency injection (API clients, DB connections, etc.)
  abortSignal: req.signal,
  onEnrichError: (error, marker) => {
    console.error(`Failed to enrich ${marker.name}:`, error);
  },
});

createElementStream wraps the AI SDK stream. It passes through all chunks, detects @name{...} markers in text deltas, and emits data-element parts with progressive state updates (loading -> ready or error).
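Conceptually, marker detection can be sketched as a regex scan over streamed text. This is a simplified illustration of the idea, not the library's actual parser (which must also handle markers split across deltas):

```typescript
// Simplified sketch of inline marker detection: find @name{...json...}
// markers in text and extract the element name plus parsed input.
// Note: [^}]* does not handle nested braces; the real parser is more robust.
const MARKER_RE = /@([a-zA-Z][\w-]*)(\{[^}]*\})/g;

function extractMarkers(text: string): { name: string; input: unknown }[] {
  const markers: { name: string; input: unknown }[] = [];
  for (const match of text.matchAll(MARKER_RE)) {
    try {
      markers.push({ name: match[1], input: JSON.parse(match[2]) });
    } catch {
      // Incomplete JSON (marker still streaming) — skip until more text arrives.
    }
  }
  return markers;
}

console.log(extractMarkers('Forecast: @weather{"city":"Berlin"} Enjoy!'));
// → [ { name: "weather", input: { city: "Berlin" } } ]
```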

5. Render with Streamdown (client)

Use useMarkdownElements with Streamdown to render elements inline with markdown. Markers are replaced with HTML tags that map to React components.

import { Streamdown } from "streamdown";
import "streamdown/styles.css";
import { useMarkdownElements } from "ai-sdk-elements/react/streamdown";

const MarkdownMessage = ({ message }) => {
  const textPart = message.parts.find((p) => p.type === "text");
  const { processedText, components, elementNames } = useMarkdownElements({
    text: textPart?.text ?? "",
    parts: message.parts,
    elements: [weatherElementUI],
  });

  return (
    <Streamdown
      allowedTags={Object.fromEntries(
        elementNames.map((name) => [name, ["dataElementId", "dataElementState"]]),
      )}
      components={components}
    >
      {processedText}
    </Streamdown>
  );
};

useMarkdownElements returns:

  • processedText — markdown with markers replaced by <name data-element-id="el-0" data-element-state="loading"></name> HTML tags
  • components — a record mapping element names to React components (pass directly to Streamdown)
  • elementNames — deduplicated list of element names found in the text (use with allowedTags to whitelist them through Streamdown's sanitizer)
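As a simplified illustration of the `processedText` transformation described above (assumed behavior inferred from the description, not the hook's actual implementation):

```typescript
// Sketch: replace each @name{...} marker with an HTML tag carrying a
// sequential element id and an initial "loading" state, as in processedText.
function toProcessedText(text: string): string {
  let i = 0;
  return text.replace(
    /@([a-zA-Z][\w-]*)\{[^}]*\}/g,
    (_match, name) =>
      `<${name} data-element-id="el-${i++}" data-element-state="loading"></${name}>`,
  );
}

console.log(toProcessedText('Weather: @weather{"city":"Oslo"}'));
// → Weather: <weather data-element-id="el-0" data-element-state="loading"></weather>
```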

License

MIT