# hollow-sdk

v0.2.0

TypeScript SDK for Hollow — serverless web perception for AI agents.
Gives any Node.js application a browser that runs without a machine: no Chromium, no BaaS, no GPU.
## Install

```bash
npm install hollow-sdk
```

## Usage
### Single page

```ts
import { HollowClient } from 'hollow-sdk';

const hollow = new HollowClient();
const page = await hollow.perceive('https://news.ycombinator.com');

console.log(page.gdgMap);
// → [TEXT: news.ycombinator.com]
//   [Stories:]
//   [1] "Ask HN: Who is hiring? (March 2026)" comments:834
//   ...

console.log(page.tier);       // 'text'
console.log(page.confidence); // 0.95
```

### Agent task
```ts
import { HollowClient, runAgent } from 'hollow-sdk';

const hollow = new HollowClient();

// Default — Anthropic (reads ANTHROPIC_API_KEY)
const result = await runAgent(hollow, {
  task: 'What are the top 3 stories on Hacker News right now?',
  onStep: (step) => console.log(`Step ${step.step}: ${step.tier} (${step.confidence})`),
});

console.log(result);
```

## Works with any model
The agent loop is model-agnostic. Pass any adapter — or implement ModelAdapter yourself.
### Anthropic (default)

```ts
import { AnthropicAdapter } from 'hollow-sdk';

const result = await runAgent(hollow, {
  task: '...',
  model: new AnthropicAdapter({ model: 'claude-opus-4-6' }),
});
```

### OpenAI
```bash
npm install openai  # optional peer dependency
```

```ts
import { OpenAIAdapter } from 'hollow-sdk';

const result = await runAgent(hollow, {
  task: '...',
  model: new OpenAIAdapter({ model: 'gpt-4o' }),
});
```

### Any OpenAI-compatible API (Groq, Together, Ollama…)
```ts
import { OpenAIAdapter } from 'hollow-sdk';

// Groq
const result = await runAgent(hollow, {
  task: '...',
  model: new OpenAIAdapter({
    apiKey: process.env.GROQ_API_KEY,
    baseURL: 'https://api.groq.com/openai/v1',
    model: 'llama-3.3-70b-versatile',
  }),
});
```

```ts
// Local Ollama
const result = await runAgent(hollow, {
  task: '...',
  model: new OpenAIAdapter({
    apiKey: 'ollama',
    baseURL: 'http://localhost:11434/v1',
    model: 'llama3.2',
  }),
});
```

### Bring your own model
Implement `ModelAdapter` — one method:

```ts
import type { ModelAdapter } from 'hollow-sdk';

const myAdapter: ModelAdapter = {
  async complete(messages) {
    // messages: { role: 'system' | 'user' | 'assistant', content: string }[]
    return myModel.chat(messages);
  },
};

const result = await runAgent(hollow, {
  task: '...',
  model: myAdapter,
});
```

## Custom endpoint
```ts
const hollow = new HollowClient({
  endpoint: 'https://my-hollow-instance.vercel.app',
});
```

Or set `HOLLOW_ENDPOINT` in your environment — the client reads it automatically.
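For example, before starting your app (the URL below is a placeholder for your own deployment):

```shell
export HOLLOW_ENDPOINT="https://my-hollow-instance.vercel.app"
```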
## API

### HollowClient

```ts
const hollow = new HollowClient(options?);
```

| Option | Type | Default |
|--------|------|---------|
| `endpoint` | `string` | `https://hollow-tan-omega.vercel.app` |
### `hollow.perceive(url, sessionId?, stateId?)`

Load a URL. Returns a `PerceiveResult` with `sessionId`, `gdgMap`, `confidence`, `tier`, and `jsErrors`.
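Since `tier` and `confidence` arrive with every perception, callers can gate actions on them. A minimal sketch of such a guard (the 0.5 threshold and the `partial` check are illustrative choices, not SDK behavior):

```ts
// Subset of PerceiveResult used by the guard.
interface Perception {
  tier: 'hollow' | 'vdom' | 'text' | 'cache' | 'mobile' | 'partial';
  confidence: number;
}

// Treat low-confidence or partial renders as worth a re-perceive
// (0.5 is an arbitrary illustrative threshold, not an SDK constant).
function shouldRetry(page: Perception): boolean {
  return page.tier === 'partial' || page.confidence < 0.5;
}
```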
### `hollow.act(sessionId, action, intervention?)`

Interact with the current page. `action.elementId` values come from the `gdgMap`.

```ts
await hollow.act(page.sessionId, { type: 'click', elementId: 3 });
await hollow.act(page.sessionId, { type: 'fill', elementId: 4, value: 'search query' });
await hollow.act(page.sessionId, { type: 'scroll', direction: 'down' });
```

### `hollow.getSession(sessionId)`
Get the latest state of a session without taking any action.
### `hollow.closeSession(sessionId)`

Free a session's Redis state when done. Idempotent.
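To guarantee the session is freed even when a task throws, you can wrap the lifecycle in a helper. A sketch assuming only the `perceive`/`closeSession` signatures above; `withSession` is a hypothetical helper, not part of the SDK:

```ts
// Minimal structural types for the client surface used here.
interface SessionClient {
  perceive(url: string): Promise<{ sessionId: string; gdgMap: string }>;
  closeSession(sessionId: string): Promise<void>;
}

// Runs `fn` against a freshly perceived page, then always frees the
// session's Redis state, even if `fn` throws.
async function withSession<T>(
  client: SessionClient,
  url: string,
  fn: (page: { sessionId: string; gdgMap: string }) => Promise<T>,
): Promise<T> {
  const page = await client.perceive(url);
  try {
    return await fn(page);
  } finally {
    await client.closeSession(page.sessionId);
  }
}
```

Because `closeSession` is idempotent, calling it again elsewhere is harmless.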
### `runAgent(client, options)`

Run a multi-step AI agent.

| Option | Type | Default |
|--------|------|---------|
| `task` | `string` | required |
| `model` | `ModelAdapter` | `new AnthropicAdapter()` |
| `maxSteps` | `number` | `15` |
| `startUrl` | `string` | `https://www.startpage.com` |
| `onStep` | `(step: AgentStep) => void` | — |
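Because `model` accepts any `ModelAdapter`, agent runs can be made deterministic in tests with a scripted stub. A sketch using the `Message`/`ModelAdapter` shapes from the Types section; `ScriptedAdapter` is illustrative, not part of the SDK:

```ts
// Shapes copied from the Types section below.
interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ModelAdapter {
  complete(messages: Message[]): Promise<string>;
}

// Returns canned replies in order, clamping to the last one, so a
// multi-step run replays the same action sequence every time.
class ScriptedAdapter implements ModelAdapter {
  private i = 0;
  constructor(private replies: string[]) {}

  async complete(_messages: Message[]): Promise<string> {
    const reply = this.replies[Math.min(this.i, this.replies.length - 1)];
    this.i += 1;
    return reply;
  }
}
```

Passing `new ScriptedAdapter([...])` as `model` keeps tests offline and free of API keys.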
### `AnthropicAdapter`

| Option | Type | Default |
|--------|------|---------|
| `apiKey` | `string` | `ANTHROPIC_API_KEY` env var |
| `model` | `string` | `claude-opus-4-6` |
### `OpenAIAdapter`

Requires `npm install openai`.

| Option | Type | Default |
|--------|------|---------|
| `apiKey` | `string` | `OPENAI_API_KEY` env var |
| `model` | `string` | `gpt-4o` |
| `baseURL` | `string` | OpenAI default |
## Types

```ts
type Tier = 'hollow' | 'vdom' | 'text' | 'cache' | 'mobile' | 'partial';

interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ModelAdapter {
  complete(messages: Message[]): Promise<string>;
}

interface PerceiveResult {
  sessionId: string;
  gdgMap: string;
  confidence: number;
  tier: Tier;
  jsErrors: string[];
}

interface AgentStep {
  step: number;
  gdgMap: string;
  action: Action | null;
  confidence: number;
  tier: Tier;
  reasoning?: string;
}
```

## Deploy your own Hollow
This SDK points at the public demo by default (10 req/min rate limit per IP). For production use, deploy your own instance:

```bash
git clone https://github.com/Badgerion/hollow
cd hollow && npm install && vercel deploy
```

Then point the SDK at it:

```ts
const hollow = new HollowClient({ endpoint: 'https://your-instance.vercel.app' });
```

## License

Apache 2.0
