
@cool-ai/beach-llm-mastra

v0.1.2


Mastra adapter for Beach's LLMProvider. Wraps a @mastra/core Agent so its model + tool runtime drives Beach actor turns.

Home: cool-ai.org · Documentation: cool-ai.org/docs

Why this exists

Beach's LLMProvider interface is the seam where actor turns meet a model. Beach ships two implementations directly: AnthropicProvider (native Anthropic SDK, including extended thinking) and VercelAIProvider (the Vercel AI SDK). This package adds a third: a wrapper around a Mastra Agent. Consumers who have already built up a Mastra Agent, with its memory, instructions, tracing, and lifecycle hooks, can drop it into Beach without rewriting their agent runtime.

The wrapper is intentionally narrow. Beach owns the turn loop, the respond() discipline, the ToolRegistry, manifests, and the canonical pipeline. Mastra contributes the model + tool runtime and whatever ergonomics the consumer wired onto its Agent. The two responsibilities meet at one method (agent.generate(messages, options)); Beach normalises the result into its own CompletionResult shape.
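As a rough illustration of that normalisation step, here is a sketch using simplified local types standing in for Beach's CompletionResult and Mastra's generate() result. The field names (toolName, toolCalls) are assumptions for illustration; the real shapes are richer.

```typescript
// Simplified stand-ins for the two shapes that meet at the seam.
// These are illustrative types, not the real Beach / Mastra interfaces.
type CompletionResult = {
  text: string;
  toolCalls: { name: string; args: unknown }[];
};

interface MastraGenerateResult {
  text: string;
  toolCalls?: { toolName: string; args: unknown }[];
}

// Normalise a Mastra generate() result into Beach's CompletionResult shape:
// copy the text through and rename each tool call's fields.
function normaliseResult(res: MastraGenerateResult): CompletionResult {
  return {
    text: res.text,
    toolCalls: (res.toolCalls ?? []).map((tc) => ({
      name: tc.toolName,
      args: tc.args,
    })),
  };
}
```

The point is the direction of the dependency: the adapter translates Mastra's output into Beach's vocabulary, never the reverse.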

Install

npm install @cool-ai/beach-llm-mastra @mastra/core

@mastra/core is a peer dependency. Consumers supply their own Mastra Agent instance — the connection-injection convention shared by every Mastra-quarter adapter. Beach's interior never imports @mastra/core.

Usage

import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';
import { MastraProvider } from '@cool-ai/beach-llm-mastra';
import { callActor, ToolRegistry } from '@cool-ai/beach-llm';

const agent = new Agent({
  name: 'concierge',
  model: openai('gpt-4o'),
  // Leave instructions empty — Beach passes its own `system` per call.
  instructions: '',
});

const provider = new MastraProvider(agent);

// Registry holding the tools the actor may call (registration elided).
const tools = new ToolRegistry();

const result = await callActor({
  config: {
    id: 'concierge',
    model: 'gpt-4o',
    systemPrompt: '… your Beach actor system prompt …',
    tools: ['fetch_user_bookings'],
  },
  provider,
  registry: tools,
  messages: [{ role: 'user', content: 'Suggest a quiet beach in Europe.' }],
  sessionId: 'demo',
  slotKey: 'main',
});

The Mastra Agent's model, tools, and any other configuration come from Mastra. The actor's prompt, tool list, and turn lifecycle come from Beach.

Canonical wrap-pattern signature

Every Mastra-quarter adapter follows the same shape, locked at CR-167 as the canonical signature for the rest of the quarter:

| Concern | Where it lives |
|---|---|
| Underlying SDK / runtime client | Constructed by the consumer, passed to the adapter as a constructor argument. Beach never imports the SDK. |
| Adapter class | Implements one Beach interface (here: LLMProvider). One method per surface (here: complete()). |
| Configuration | Beach's per-call options (CompletionOptions) take precedence; adapter-level options refine the wrap (e.g. instructionsAuthority). |
| Peer dependency | The underlying SDK (@mastra/core) is a peer dep, never a direct dep. Consumers pin their own version. |
| Error handling | Errors from the underlying SDK propagate unchanged. Beach's actor loop owns retry / fallback. |

Subsequent adapters in the quarter (missives, channel, durable, evals, voice) follow the same shape. The canonical signature is the durable contract.
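The table above can be sketched as a minimal class. The interface and client names here (Provider, FakeClient, WrappedProvider) are hypothetical placeholders, not Beach's real API; only the shape of the pattern is the point.

```typescript
// Hypothetical stand-ins for a Beach interface and an underlying SDK client.
interface FakeClient {
  generate(prompt: string): Promise<string>;
}

interface Provider {
  complete(prompt: string): Promise<string>;
}

// The canonical wrap-pattern: the consumer constructs the client and
// injects it; the adapter implements one interface with one method per
// surface, and lets the client's errors propagate unchanged.
class WrappedProvider implements Provider {
  constructor(private client: FakeClient) {}

  complete(prompt: string): Promise<string> {
    return this.client.generate(prompt);
  }
}
```

Because the client arrives through the constructor, the wrapped package stays a peer dependency and the adapter never has to pin or import the SDK itself.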

Configuration

new MastraProvider(agent, {
  // Default: 'beach' — Beach's per-call `system` is forwarded as Mastra's
  // `instructions` and overrides any Agent-level instructions for that call.
  // Set to 'agent' to keep Agent-level instructions and ignore Beach's
  // per-call `system` (rarely the right choice; Beach's actor configs
  // expect their system prompt to be authoritative).
  instructionsAuthority: 'beach',
});
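A minimal sketch of how an authority switch like this could resolve which instructions are forwarded on each call. The function name and signature are illustrative assumptions, not the adapter's actual internals.

```typescript
type InstructionsAuthority = 'beach' | 'agent';

// 'beach': Beach's per-call `system` wins whenever it is present.
// 'agent': Agent-level instructions always win; Beach's `system` is ignored.
function resolveInstructions(
  authority: InstructionsAuthority,
  beachSystem: string | undefined,
  agentInstructions: string,
): string {
  if (authority === 'beach' && beachSystem !== undefined) {
    return beachSystem;
  }
  return agentInstructions;
}
```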

Message normalisation

Beach's Message[] shape (with text / thinking / tool_use / tool_result content parts) is mapped to Mastra's request shape (text / reasoning / tool-call / tool-result). Tool result messages are emitted as Mastra role: 'tool' messages with tool-result content parts. The bidirectional mapping mirrors VercelAIProvider's — the two SDKs share the underlying AI SDK semantics.
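The part-type correspondence described above can be written out as a small lookup table. The simplified type names are assumptions distilled from the mapping the paragraph describes, not the real content-part types.

```typescript
// Simplified names for the content-part kinds on each side of the seam.
type BeachPartType = 'text' | 'thinking' | 'tool_use' | 'tool_result';
type MastraPartType = 'text' | 'reasoning' | 'tool-call' | 'tool-result';

// Beach content-part kind → Mastra content-part kind.
const partTypeMap: Record<BeachPartType, MastraPartType> = {
  text: 'text',
  thinking: 'reasoning',
  tool_use: 'tool-call',
  tool_result: 'tool-result',
};
```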

thinking blocks lose their signature on the way to Mastra (Mastra's reasoning content is plain text). Use AnthropicProvider for Anthropic models with extended thinking enabled.

Architectural commitments

  • Beach's interior never imports @mastra/core. @cool-ai/beach-core, @cool-ai/beach-session, @cool-ai/beach-llm, and the rest of Beach's runtime stay Mastra-independent. The adapter is the boundary.
  • No wrapping under any @mastra/core/ee path. The adapter targets only Apache-2.0 surfaces.
  • Mastra Agents are the consumer's concern. The adapter expects a constructed Agent and does not expose a builder API.

Related

  • beach-llm README — the LLMProvider interface this adapter implements.
  • AnthropicProvider, VercelAIProvider — the other two LLM provider implementations Beach ships.
  • The Mastra-adapter quarter: this package is week 1 of the locked Option B ordering (LLM → missives → channel → durable → evals → voice). CR-166 tracks the quarter.