ai-sdk-provider-llama-cpp
The llama.cpp provider for the AI SDK contains language model support for the llama.cpp server.
Setup
The llama.cpp provider is available in the ai-sdk-provider-llama-cpp module. You can install it with:
```
npm install ai-sdk-provider-llama-cpp
```

Or, to use this provider in another local project, you can install the generated tarball in your target project:

```
npm install /full/path/to/ai-sdk-provider-llama-cpp/ai-sdk-provider-llama-cpp-0.0.1.tgz
```

For example, you can place ai-sdk-provider-llama-cpp-0.0.1.tgz in a folder called localpkg and install it from there.
Setup of llama.cpp
Run llama.cpp as a server:
```
llama-server -m [model path] --port [port]
```

(Use -hf [Hugging Face repo] instead of -m to download a model from Hugging Face.) On Windows the binary is llama-server.exe; the command works the same way on Linux and macOS.
Once the server is running, take note of its URL (by default http://127.0.0.1:8080).
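To verify the server is reachable before wiring up the provider, you can query its health route. A minimal sketch, assuming the default URL above and the /health endpoint exposed by recent llama.cpp server builds:

```
// Quick reachability check for the llama.cpp server.
// Assumes the default URL (http://127.0.0.1:8080) and the /health
// endpoint provided by recent llama.cpp server builds.
const res = await fetch('http://127.0.0.1:8080/health');
console.log(res.ok ? 'llama.cpp server is up' : `server returned ${res.status}`);
```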
Provider Instance
You can import the default provider instance llamaCpp from ai-sdk-provider-llama-cpp:
```
import { llamaCpp } from 'ai-sdk-provider-llama-cpp';
```

If you need to provide a custom base URL or other options, you can use the createLlamaCpp function:

```
import { createLlamaCpp } from 'ai-sdk-provider-llama-cpp';

const llamaCpp = createLlamaCpp({
  baseURL: 'http://127.0.0.1:8080/v1',
});
```

Language Models
The llama.cpp provider is based on the OpenAI Compatible Provider, so it supports the standard Chat Completion API.
You can use the llamaCpp provider to generate text:
- The model name is a dummy value: the llama.cpp server responds with whichever model it was started with, regardless of the name you pass.
```
import { llamaCpp } from 'ai-sdk-provider-llama-cpp';
import { generateText } from 'ai';

const { text } = await generateText({
  model: llamaCpp('llama-2-7b-chat'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```

Or stream text:
```
import { llamaCpp } from 'ai-sdk-provider-llama-cpp';
import { streamText } from 'ai';

const result = streamText({
  model: llamaCpp('llama-2-7b-chat'),
  prompt: 'Write a vegetarian lasagna recipe.',
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```
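Because the provider plugs into the standard AI SDK interfaces, chat-style messages and core call settings work as usual. A minimal sketch (the system/user messages and temperature value here are illustrative, not specific to this provider):

```
import { llamaCpp } from 'ai-sdk-provider-llama-cpp';
import { generateText } from 'ai';

// Chat-style messages and core AI SDK settings such as `temperature`
// are handled by the AI SDK itself; nothing here is provider-specific.
const { text } = await generateText({
  model: llamaCpp('llama-2-7b-chat'),
  messages: [
    { role: 'system', content: 'You are a concise cooking assistant.' },
    { role: 'user', content: 'Suggest a quick vegetarian dinner.' },
  ],
  temperature: 0.2,
});
console.log(text);
```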
Local Build & Usage
To build this package locally:
Install dependencies:
```
npm install
```

Build the package:

```
npm run build
```

Create a tarball (optional):

```
npm pack
```
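After installing the tarball in a target project (see Setup above), a quick smoke test can confirm everything is wired up. A sketch, assuming the llama.cpp server from the setup section is already running on its default URL:

```
// Smoke test for a local tarball install. Assumes the llama.cpp server
// is running on http://127.0.0.1:8080 (the default).
import { llamaCpp } from 'ai-sdk-provider-llama-cpp';
import { generateText } from 'ai';

const { text } = await generateText({
  model: llamaCpp('llama-2-7b-chat'),
  prompt: 'Reply with a single short greeting.',
});
console.log(text);
```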
