# @withmartian/ai-sdk-provider
A lightweight provider for the Vercel AI SDK that routes requests to the Martian API. You can call any model listed in the Available Models section of our docs: https://app.withmartian.com/docs.
## Install

```bash
npm install @withmartian/ai-sdk-provider
```

Set your API key (e.g. in `.env`):

```bash
MARTIAN_API_KEY=your_martian_api_key_here
```
## Quick Start (AI SDK)

```ts
import { generateText } from "ai";
import { martianProvider } from "@withmartian/ai-sdk-provider";

const { text } = await generateText({
  model: martianProvider("gpt-4o-mini"), // → "openai/gpt-4o-mini"
  prompt: "Write a limerick about cats.",
});

console.log(text);
```
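The provider returns a standard AI SDK language model, so streaming should work the same way; here is a minimal sketch using the AI SDK's `streamText` (not from the original docs):

```ts
import { streamText } from "ai";
import { martianProvider } from "@withmartian/ai-sdk-provider";

// Assumption: any model id accepted by generateText above works here too.
const result = streamText({
  model: martianProvider("gpt-4o-mini"),
  prompt: "Write a limerick about cats.",
});

// Print the response incrementally as chunks arrive.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```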
## More model examples

```ts
import { generateText } from "ai";
import { martianProvider } from "@withmartian/ai-sdk-provider";

// OpenAI
await generateText({ model: martianProvider("o4-mini"), prompt: "Hi" });
await generateText({ model: martianProvider("openai/gpt-4o"), prompt: "Hi" });

// OpenAI with cheap variant
await generateText({ model: martianProvider("openai/gpt-5:cheap"), prompt: "Summarize this" });

// Anthropic
await generateText({ model: martianProvider("anthropic/claude-3-7-sonnet-20250219"), prompt: "Hello" });

// Any other provider/id that is already namespaced is left unchanged
await generateText({ model: martianProvider("mistralai/mixtral-8x7b-instruct"), prompt: "Explain..." });
```
## Manual API (optional)

If you prefer to call the API directly with `fetch`:
```ts
import { getMartianConfig, prefixModelId } from "@withmartian/ai-sdk-provider";

const cfg = getMartianConfig();
const model = prefixModelId("gpt-4o-mini"); // → "openai/gpt-4o-mini"

const res = await fetch(`${cfg.baseURL}/chat/completions`, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${cfg.apiKey}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model,
    messages: [{ role: "user", content: "Say hello" }],
  }),
});

const data = await res.json();
console.log(data.choices?.[0]?.message?.content);
```
## API

```ts
import { martianProvider, createMartianProvider, getMartianConfig, prefixModelId } from "@withmartian/ai-sdk-provider";

// Default instance (callable)
martianProvider(modelId: string)
martianProvider.chat(modelId: string)
martianProvider.text(modelId: string)

// Utilities
martianProvider.getModelId(modelId: string) // prefix helper
martianProvider.getConfig()                 // { baseURL, apiKey }

// Custom instance
const martian = createMartianProvider({ apiKey?: string, baseURL?: string });
```
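A usage sketch for the custom instance, assuming it is callable and exposes the same utilities as the default `martianProvider` (the explicit `apiKey` is an illustration; it is unnecessary if `MARTIAN_API_KEY` is set):

```ts
import { generateText } from "ai";
import { createMartianProvider } from "@withmartian/ai-sdk-provider";

// Custom instance with an explicit key instead of the environment default.
const martian = createMartianProvider({
  apiKey: process.env.MARTIAN_API_KEY,
});

// Utilities (assumed to mirror the default instance).
console.log(martian.getModelId("gpt-4o-mini")); // expected: "openai/gpt-4o-mini"
console.log(martian.getConfig());               // expected: { baseURL, apiKey }

const { text } = await generateText({
  model: martian("gpt-4o-mini"),
  prompt: "Say hello",
});
console.log(text);
```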
## Notes

- OpenAI-style ids are automatically prefixed with `openai/`.
- Already-namespaced ids (e.g. `anthropic/...`) are passed through unchanged.
- Works with AI SDK v5 (peer dependency).
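A small illustration of the prefixing rules above (outputs shown are the expected behavior, not verified):

```ts
import { prefixModelId } from "@withmartian/ai-sdk-provider";

prefixModelId("gpt-4o-mini");                          // → "openai/gpt-4o-mini"
prefixModelId("anthropic/claude-3-7-sonnet-20250219"); // → passed through unchanged
prefixModelId("mistralai/mixtral-8x7b-instruct");      // → passed through unchanged
```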
## License
MIT
