# Alchemyst AI — Vercel AI SDK Integration
Alchemyst AI is the context layer for your LLM applications — it remembers, reasons, and injects contextual intelligence automatically into every call. This package provides a seamless integration with Vercel’s AI SDK to enhance your Gen-AI apps with memory, retrieval, and context-aware toolchains — all through a single line of configuration.
## 🚀 Installation

```bash
npm install @alchemystai/aisdk
# or
yarn add @alchemystai/aisdk
# or
pnpm add @alchemystai/aisdk
# or
bun add @alchemystai/aisdk
```
## ⚡ Quick Start

Here’s how to plug Alchemyst AI into your `ai` SDK call in one line:

```ts
import { streamText } from 'ai';
import { alchemystTools } from '@alchemystai/aisdk';

const result = await streamText({
  model: 'gpt-5-nano',
  prompt: 'Remember that my name is Alice',
  tools: alchemystTools('YOUR_ALCHEMYST_AI_KEY', true, true),
});
```

This automatically attaches the Alchemyst Context Engine to your model call — enabling persistent memory and context-aware generation across sessions.
## 🧩 API Reference

### `alchemystTools(apiKey: string, enableMemory?: boolean, enableRetrieval?: boolean)`

| Parameter         | Type      | Default | Description                                             |
| ----------------- | --------- | ------- | ------------------------------------------------------- |
| `apiKey`          | `string`  | —       | Your Alchemyst AI API key. Required.                     |
| `enableMemory`    | `boolean` | `true`  | Enables contextual memory between user sessions.         |
| `enableRetrieval` | `boolean` | `true`  | Enables semantic retrieval from your connected sources.  |

Returns an object compatible with the `tools` parameter of the `ai` SDK functions (`streamText`, `generateText`, `streamObject`, etc.).
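Because both flags default to `true`, passing only the API key enables memory and retrieval. A minimal sketch of the two equivalent forms (the environment variable name follows the example further below):

```ts
import { alchemystTools } from '@alchemystai/aisdk';

// enableMemory and enableRetrieval both default to true,
// so these two calls produce the same tools object.
const tools = alchemystTools(process.env.ALCHEMYST_API_KEY!);
const toolsExplicit = alchemystTools(process.env.ALCHEMYST_API_KEY!, true, true);
```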
## 🧠 What It Does
Once integrated, Alchemyst AI automatically:
- Persists context across user sessions.
- Augments prompts with retrieved knowledge from your Alchemyst AI workspace.
- Supports custom tool functions for domain-specific reasoning (see the sketch after this list).
- Runs entirely server-side — no extra infrastructure required.
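Since `alchemystTools(...)` returns an object shaped for the `tools` parameter, you can place your own tool definitions alongside it. A minimal sketch, assuming the returned value is a plain record of tools that can be spread, and assuming an AI SDK version whose `tool()` helper accepts a zod `parameters` schema; the `getOrderStatus` tool and its lookup logic are hypothetical:

```ts
import { streamText, tool } from 'ai';
import { z } from 'zod';
import { alchemystTools } from '@alchemystai/aisdk';

const result = await streamText({
  model: 'gpt-5-nano',
  prompt: 'Where is order #1234?',
  tools: {
    // Context and retrieval tools from Alchemyst AI
    ...alchemystTools(process.env.ALCHEMYST_API_KEY!, true, true),
    // A hypothetical domain-specific tool of your own
    getOrderStatus: tool({
      description: 'Look up the shipping status of an order',
      parameters: z.object({ orderId: z.string() }),
      execute: async ({ orderId }) => ({ orderId, status: 'shipped' }),
    }),
  },
});
```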
## 💡 Example: Contextual Chatbot

```ts
import { streamText } from 'ai';
import { alchemystTools } from '@alchemystai/aisdk';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const response = await streamText({
    model: 'gpt-5-nano',
    messages,
    tools: alchemystTools(process.env.ALCHEMYST_API_KEY!, true, true),
  });

  return response.toAIStreamResponse();
}
```

With this, your AI app will:
- Remember previous user messages (via Alchemyst context memory)
- Retrieve relevant knowledge chunks
- Enrich every prompt dynamically before sending it to the model
## 🔒 Environment Variables

Set your API key in your environment:

```bash
ALCHEMYST_API_KEY=sk-xxxxxx
```

Then reference it in your code as:

```ts
alchemystTools(process.env.ALCHEMYST_API_KEY!);
```

## 🧰 Supported SDK Methods
Alchemyst AI integrates with all Vercel AI SDK entry points:
| SDK Function     | Supported | Notes                               |
| ---------------- | --------- | ----------------------------------- |
| `streamText`     | ✅        | Real-time streaming generation      |
| `generateText`   | ✅        | Non-streaming text generation       |
| `streamObject`   | ✅        | Streaming structured JSON output    |
| `generateObject` | ✅        | Non-streaming structured output     |
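For instance, the same tools object can be passed to non-streaming generation. A minimal sketch (the prompt and key are placeholders):

```ts
import { generateText } from 'ai';
import { alchemystTools } from '@alchemystai/aisdk';

const { text } = await generateText({
  model: 'gpt-5-nano',
  prompt: 'Summarize what you know about Alice.',
  tools: alchemystTools(process.env.ALCHEMYST_API_KEY!, true, true),
});

console.log(text);
```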
## 🧪 Example with Retrieval Off

```ts
const result = await streamText({
  model: 'gpt-5-nano',
  prompt: 'Hello there!',
  tools: alchemystTools('YOUR_ALCHEMYST_AI_KEY', true, false),
});
```

## 📜 License
MIT © 2025 Alchemyst AI
