@robota-sdk/agent-provider-gemma
v3.0.0-beta.63
Gemma model-family provider for Robota using OpenAI-compatible local endpoints such as LM Studio.
This provider is separate from Gemini API support. Gemini API behavior belongs to agent-provider-gemini; agent-provider-google remains only as a compatibility wrapper.
This provider owns Gemma/LM Studio serving-template projection and links to the official local setup guides. It filters Gemma reasoning-channel markers out of user-visible text, and it converts the documented native tool-call text emitted by the Gemma/LM Studio template into Robota universal toolCalls whenever the referenced tool was declared in the request.
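As an illustration only, the projection step can be imagined as a single text pass like the sketch below. The `<reasoning>…</reasoning>` marker and `[tool_call: name({...})]` syntax are invented placeholders, not the actual tokens Gemma or LM Studio emit; the real template syntax is what this package documents and handles.

```typescript
// Hypothetical sketch: strip a reasoning channel from user-visible text and
// lift native tool-call text into structured toolCalls. Marker syntax here
// is a placeholder, not the real Gemma/LM Studio template output.
interface ToolCall {
  name: string;
  arguments: string; // raw JSON string, as in OpenAI-compatible payloads
}

function projectAssistantText(
  raw: string,
  declaredTools: Set<string>,
): { text: string; toolCalls: ToolCall[] } {
  // Drop the reasoning channel from user-visible text.
  let text = raw.replace(/<reasoning>[\s\S]*?<\/reasoning>/g, "");

  const toolCalls: ToolCall[] = [];
  // Convert native tool-call text only when the referenced tool was declared.
  text = text.replace(
    /\[tool_call:\s*(\w+)\((\{.*?\})\)\]/g,
    (match: string, name: string, args: string) => {
      if (declaredTools.has(name)) {
        toolCalls.push({ name, arguments: args });
        return ""; // lifted into structured form, removed from text
      }
      return match; // undeclared tool: leave the text untouched
    },
  );

  return { text: text.trim(), toolCalls };
}
```

The key property mirrored from the contract above is the guard: tool-call text referencing an undeclared tool is left as plain text rather than projected into toolCalls.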
Usage
```ts
import { GemmaProvider } from '@robota-sdk/agent-provider-gemma';

const provider = new GemmaProvider({
  apiKey: 'lm-studio',
  baseURL: 'http://localhost:1234/v1',
  defaultModel: 'gemma-local-model',
});
```

Use this package instead of the generic OpenAI provider for Gemma-family local models. The Gemma provider owns reasoning-marker filtering and native tool-call text projection; the shared OpenAI-compatible transport does not infer model-family syntax.
Gemma/LM Studio OpenAI-compatible endpoints support declared Robota function tools. They do not advertise provider-native hosted web search/fetch; use local WebSearch/WebFetch tools for explicit local web access.
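For reference, a declared function tool ultimately reaches the local endpoint as a standard OpenAI-compatible `tools` entry. The sketch below shows that wire shape; the `web_search` tool name and its parameter schema are illustrative assumptions for a locally executed tool, not a hosted provider feature:

```typescript
// Sketch of the OpenAI-compatible request body an LM Studio endpoint
// receives when a function tool is declared. "web_search" is a
// hypothetical local tool executed by the application, not by the server.
const body = {
  model: "gemma-local-model",
  messages: [{ role: "user", content: "Find the Robota docs." }],
  tools: [
    {
      type: "function",
      function: {
        name: "web_search",
        description: "Local web search executed by the application.",
        parameters: {
          type: "object",
          properties: { query: { type: "string" } },
          required: ["query"],
        },
      },
    },
  ],
};

// The application would POST this to http://localhost:1234/v1/chat/completions
// and run any returned tool calls locally before continuing the turn.
```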
Native Replay Payload Capture
When IChatOptions.onProviderNativeRawPayload is provided, the provider emits exact OpenAI-compatible request, response, and stream event payloads before Gemma-specific reasoning/tool-call projection changes the Robota-normalized assistant message. agent-core routes these callbacks into provider-neutral provider_native_raw_payload execution events for replay-grade session logs.
See docs/SPEC.md for the package contract.
