# @meetdewey/vercel-ai

v0.1.0

Vercel AI SDK integration for Dewey: retrieval tools and RAG helpers.
Dewey integration for the Vercel AI SDK. Adds two ready-to-use AI SDK tools that connect your application to a Dewey document collection.
## Installation

```sh
npm install @meetdewey/vercel-ai
# or
pnpm add @meetdewey/vercel-ai
```

## Tools

### deweyRetrievalTool
Calls Dewey's hybrid semantic + BM25 search and returns chunks with citation metadata. Use this when you want the model to decide when and how to search — it receives query results as tool output and can call it multiple times.
```ts
import { streamText } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { deweyRetrievalTool } from '@meetdewey/vercel-ai'

const result = streamText({
  model: anthropic('claude-haiku-4-5-20251001'),
  system: 'Answer questions using the search tool. Always cite your sources by filename and section.',
  messages,
  tools: {
    search: deweyRetrievalTool({
      apiKey: process.env.DEWEY_API_KEY!,
      collectionId: process.env.DEWEY_COLLECTION_ID!,
      limit: 8,
    }),
  },
  maxSteps: 5,
})
```

Tool output per result:
```ts
{
  content: string      // chunk text
  score: number        // relevance score (RRF)
  filename: string     // source document filename
  sectionTitle: string // section heading
  sectionLevel: number // heading depth (1 = top-level)
  documentId: string   // document UUID
  sectionId: string    // section UUID
}
```

### deweyResearchTool
Delegates a question to Dewey's agentic research endpoint. Dewey runs a multi-step tool-call loop internally — searching, reading sections, reasoning across documents — and returns a cited answer and source list. Use this for complex multi-document questions.
```ts
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'
import { deweyResearchTool } from '@meetdewey/vercel-ai'

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  system: 'You are an assistant. Use the research tool for document questions.',
  messages,
  tools: {
    research: deweyResearchTool({
      apiKey: process.env.DEWEY_API_KEY!,
      collectionId: process.env.DEWEY_COLLECTION_ID!,
      depth: 'balanced',
    }),
  },
  maxSteps: 3,
})
```

Tool output:
```ts
{
  answer: string // full cited answer text
  sources: Array<{
    filename: string
    sectionTitle: string
    sectionId: string
    documentId: string
  }>
}
```

## Next.js example
```ts
// app/api/chat/route.ts
import { streamText } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { deweyRetrievalTool } from '@meetdewey/vercel-ai'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const result = streamText({
    model: anthropic('claude-haiku-4-5-20251001'),
    system:
      'You are a helpful research assistant. Search the document collection ' +
      'to answer questions. Always cite the source filename and section.',
    messages,
    tools: {
      search: deweyRetrievalTool({
        apiKey: process.env.DEWEY_API_KEY!,
        collectionId: process.env.DEWEY_COLLECTION_ID!,
      }),
    },
    maxSteps: 5,
  })
  return result.toDataStreamResponse()
}
```

## Options
### deweyRetrievalTool(options)

| Option | Type | Default | Description |
|---|---|---|---|
| apiKey | string | required | Dewey project API key |
| collectionId | string | required | Collection UUID to search |
| limit | number | 10 | Default max chunks (1–50); model can override per call |
| baseUrl | string | — | Override API base URL for local dev |
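
Downstream of the tool call, the per-result fields documented above can be turned into citation labels for a UI or a follow-up prompt. A minimal sketch — the `RetrievedChunk` interface mirrors the tool output shape, but the `formatCitations` helper is illustrative and not part of this package:

```typescript
// Shape of each chunk returned by deweyRetrievalTool (see the table above).
interface RetrievedChunk {
  content: string
  score: number
  filename: string
  sectionTitle: string
  sectionLevel: number
  documentId: string
  sectionId: string
}

// Hypothetical helper: deduplicate chunks by section and render
// "filename § sectionTitle" labels in the order results arrived.
function formatCitations(chunks: RetrievedChunk[]): string[] {
  const seen = new Set<string>()
  const labels: string[] = []
  for (const chunk of chunks) {
    if (seen.has(chunk.sectionId)) continue
    seen.add(chunk.sectionId)
    labels.push(`${chunk.filename} § ${chunk.sectionTitle}`)
  }
  return labels
}
```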
### deweyResearchTool(options)

| Option | Type | Default | Description |
|---|---|---|---|
| apiKey | string | required | Dewey project API key |
| collectionId | string | required | Collection UUID to research |
| depth | 'quick' \| 'balanced' \| 'deep' \| 'exhaustive' | 'balanced' | Research thoroughness |
| model | string | — | Model ID for Dewey's research loop (requires matching BYOK key) |
| baseUrl | string | — | Override API base URL for local dev |
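
When developing against a self-hosted or local Dewey instance, `baseUrl` points both tools away from the hosted API. A sketch of a local-dev options object — the localhost URL, fallback key, and placeholder UUID are example values, not documented defaults:

```typescript
// Example options for deweyResearchTool during local development.
// Every value below is illustrative; substitute your own key and collection.
const localResearchOptions = {
  apiKey: process.env.DEWEY_API_KEY ?? 'dev-key',
  collectionId:
    process.env.DEWEY_COLLECTION_ID ?? '00000000-0000-0000-0000-000000000000',
  depth: 'quick' as const,          // lighter research loop for fast iteration
  baseUrl: 'http://localhost:3001', // assumed local Dewey server address
}
```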
## Resources
- Dewey documentation
- Free tier signup — no credit card required
- Vercel AI SDK docs
