@middlebop/client
Multiple AI models, always the same syntax
Welcome to Middlebop! Integrate with multiple AI models using a single syntax. Our library serves as a unified interface, letting you communicate with multiple AI models and seamlessly switch between them in real time without rewriting your code.
Before you begin
Make sure you have a Middlebop account and have created an API key to use with the API and clients.
:warning: Always store your Middlebop API key safely! Never share it, and we recommend always calling the API from your backend.
You can create and manage your API keys at https://middlebop.com
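One common way to keep the key out of source code is to read it from an environment variable. This is a minimal sketch under assumptions: the variable name `MIDDLEBOP_API_KEY` and the `readApiKey` helper are illustrative, not part of the library.

```typescript
// Hypothetical helper (not part of @middlebop/client): read the API key
// from an environment variable instead of hard-coding it.
// The variable name MIDDLEBOP_API_KEY is an assumption.
function readApiKey(env: Record<string, string | undefined>): string {
  const key = env.MIDDLEBOP_API_KEY;
  // Keys shown in this README start with "mb-", so do a basic sanity check.
  if (!key || !key.startsWith("mb-")) {
    throw new Error("MIDDLEBOP_API_KEY is missing or malformed");
  }
  return key;
}

// Typical usage in Node.js:
// const middlebopApiKey = readApiKey(process.env);
```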
npm install @middlebop/client # or yarn add @middlebop/client or bun add @middlebop/client
Chat completion example
import { MiddlebopChatMessage, sendChat } from "@middlebop/client";

const middlebopApiKey = "mb-yourSuperSecretApiKey";

const messages: MiddlebopChatMessage[] = [
  {
    role: "user",
    content: {
      type: "text",
      text: "Hello, who are you?",
    },
  },
];

const response = await sendChat({
  messages,
  model: "gpt-4",
  middlebopApiKey,
});

console.log(response.data);
// OUTPUT:
// {
//   "responseMessages": [
//     {
//       "role": "assistant",
//       "content": {
//         "type": "text",
//         "text": "Hello! I'm an AI developed by OpenAI. How can I help you today?"
//       }
//     }
//   ]
// }
Streaming chat response example
The input is the same as for a non-streaming chat response. When calling the streaming chat function, you pass a callback handler that receives every chunk. You can also pass in an error handler.
import {
  MiddlebopChatCompletionStreamChunk,
  MiddlebopChatMessage,
  startChatStream,
} from "@middlebop/client";

const middlebopApiKey = "mb-yourSuperSecretApiKey";

const messages: MiddlebopChatMessage[] = [
  {
    role: "system",
    content: {
      type: "text",
      text: "You are a helpful assistant called beep-boop",
    },
  },
  {
    role: "user",
    content: {
      type: "text",
      text: "Hello, who are you?",
    },
  },
];

const onStreamResponse = (chunk: MiddlebopChatCompletionStreamChunk) => {
  console.log(chunk);
};

const onStreamError = (error: unknown) => {
  console.error(error);
};

startChatStream(
  {
    messages,
    model: "gemini-pro",
    middlebopApiKey,
  },
  onStreamResponse,
  onStreamError
);
// CHUNK OUTPUT EXAMPLE
// {
//   "type": "chunk",
//   "model": "gemini-pro",
//   "messages": [
//     {
//       "role": "assistant",
//       "content": {
//         "type": "text",
//         "text": "Hello"
//       }
//     }
//   ]
// }
More documentation can be found at https://docs.middlebop.com
