# @epicdm/flowstate-rag-client

RAG query client library for FlowState - semantic search, context building, and memory recall.

## Overview
This package provides a client library for querying the FlowState RAG (Retrieval-Augmented Generation) system. It connects to SurrealDB for vector similarity search and uses Ollama for text embeddings.
## Features
- Semantic Search - Query documents using natural language
- Context Building - Build LLM-ready context strings from relevant documents
- Memory Recall - Retrieve conversation history and agent memories
- Document Similarity - Find related documents
- Multi-tenancy - Workspace and organization filtering
- Type-safe - Full TypeScript support with comprehensive type definitions
## Installation

```bash
yarn add @epicdm/flowstate-rag-client
```

## Usage
### Basic Setup
```typescript
import { RAGClient } from '@epicdm/flowstate-rag-client';

const client = new RAGClient({
  surrealdbUrl: 'ws://localhost:8000/rpc',
  surrealdbUser: 'root',
  surrealdbPass: 'root',
  surrealdbNamespace: 'flowstate',
  surrealdbDatabase: 'rag',
  ollamaUrl: 'http://localhost:11434',
  embeddingModel: 'nomic-embed-text'
});

await client.connect();
```

### Semantic Search
```typescript
const results = await client.search({
  query: 'What are the high priority bugs?',
  workspaceId: 'ws_1',
  collections: ['tasks', 'issues'],
  limit: 10,
  minScore: 0.7
});

for (const result of results) {
  console.log(`[${result.collection}] ${result.content} (score: ${result.score})`);
}
```

### Context Building
Build a context string suitable for LLM consumption:
```typescript
const context = await client.getContext({
  topic: 'What are the current sprint goals?',
  workspaceId: 'ws_1',
  includeMemories: true,
  maxTokens: 2000
});

console.log(context.context);
// "Based on the following information:
//  [Task] Fix login bug - High priority...
//  [Note] Sprint planning notes..."

console.log(`Token estimate: ${context.tokenEstimate}`);
console.log(`Sources: ${context.sources.length}`);
```

### Memory Recall
Recall conversation history or agent memories:
```typescript
const memories = await client.recall({
  topic: 'project requirements',
  namespace: 'user_123',
  sessionId: 'session_abc',
  limit: 10
});

for (const memory of memories.memories) {
  console.log(memory.content);
}
```

### Find Similar Documents
Find documents similar to a given document:
```typescript
const similar = await client.findSimilar({
  collection: 'tasks',
  docId: 'task_123',
  limit: 5,
  minScore: 0.8
});

console.log('Similar tasks:', similar);
```

### Cleanup

```typescript
await client.disconnect();
```

## API Reference
### RAGClient

#### Constructor

```typescript
new RAGClient(config: RAGClientConfig)
```

#### Methods

- `connect(): Promise<void>` - Connect to SurrealDB
- `disconnect(): Promise<void>` - Close connection
- `isConnected(): boolean` - Check connection status
- `search(options: SearchOptions): Promise<SearchResult[]>` - Semantic search
- `getContext(options: ContextOptions): Promise<ContextResult>` - Build context string
- `recall(options: RecallOptions): Promise<RecallResult>` - Recall memories
- `findSimilar(options: FindSimilarOptions): Promise<SearchResult[]>` - Find similar documents
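Connection lifecycle is the caller's responsibility. As one possible pattern (the `withRAGClient` helper below is a sketch, not part of this package), you can guarantee `disconnect()` runs even when a query throws:

```typescript
// Hypothetical helper (not exported by this package): run work against a
// connected client and always disconnect afterwards, even on error.
interface Connectable {
  connect(): Promise<void>;
  disconnect(): Promise<void>;
}

async function withRAGClient<C extends Connectable, T>(
  client: C,
  work: (client: C) => Promise<T>
): Promise<T> {
  await client.connect();
  try {
    return await work(client);
  } finally {
    // Runs on both success and failure, so the connection never leaks.
    await client.disconnect();
  }
}
```

Usage would look like `await withRAGClient(client, (c) => c.search({ query: '...', workspaceId: 'ws_1' }))`.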
### Types
```typescript
interface RAGClientConfig {
  surrealdbUrl: string;
  surrealdbUser: string;
  surrealdbPass: string;
  surrealdbNamespace: string;
  surrealdbDatabase: string;
  ollamaUrl: string;
  embeddingModel: string;
}

interface SearchResult {
  id: string;
  collection: string;
  docId: string;
  content: string;
  metadata: Record<string, unknown>;
  score: number;
}

interface ContextResult {
  context: string;
  sources: SearchResult[];
  tokenEstimate: number;
}

interface RecallResult {
  memories: SearchResult[];
}
```

## Development
```bash
# Install dependencies
yarn install

# Build
yarn build

# Run tests
yarn test

# Run tests with coverage
yarn test:coverage

# Type check
yarn typecheck

# Lint
yarn lint
yarn lint:fix
```

## Architecture
The RAGClient integrates with two external services:
- SurrealDB - Vector database for storing and querying document embeddings
- Ollama - Local LLM server for generating text embeddings
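Conceptually, each query is embedded once by Ollama and then compared against stored document vectors in SurrealDB. As a sketch of the kind of scoring involved (assuming cosine similarity; the actual SurrealDB query is internal to the client), a result's `score` reflects how closely two embeddings point in the same direction:

```typescript
// Sketch of cosine-similarity scoring between a query embedding and a
// document embedding; scores near 1 mean semantically similar text.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical directions score 1; orthogonal directions score 0.
```

This is why a `minScore` such as 0.7 in the search example filters out weakly related documents.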
```
┌─────────────┐
│  RAGClient  │
└──────┬──────┘
       │
       ├──────────► SurrealDB (Vector Search)
       │
       └──────────► Ollama (Text Embeddings)
```

## Requirements
- Node.js 18+
- SurrealDB server running (for vector storage)
- Ollama server running with embedding model (e.g., nomic-embed-text)
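As a sketch of a local setup matching the Basic Setup config above (exact flags may vary with your SurrealDB and Ollama versions):

```bash
# Start an in-memory SurrealDB instance on the default port (8000).
surreal start --user root --pass root memory

# Pull the embedding model referenced in the Basic Setup example.
ollama pull nomic-embed-text
```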
## Related Packages

- `@epicdm/flowstate-rag-sync` - Syncs RxDB documents to RAG system
- `@epicdm/agent-memory-server` - Manages agent conversation memory
## License

Apache-2.0

## Author

Epic Digital Interactive Media LLC
