@blaxel/llamaindex v0.2.69
Blaxel TypeScript SDK
Blaxel is a sandbox platform that achieves near-instant latency by keeping secure sandboxes on automatic standby, while co-hosting your agent logic to cut network overhead.
This package contains helper functions for Blaxel's TypeScript SDK that let you retrieve model clients and MCP tool definitions in the format required by LlamaIndex.
Example
// With LlamaIndex
import { agent } from "@llamaindex/workflow";
import { tool } from "llamaindex";
import { z } from "zod";
import { blTools, blModel } from "@blaxel/llamaindex";

const prompt = "You are a helpful agent."; // your system prompt

const stream = agent({
  llm: await blModel("sandbox-openai"),
  tools: [
    ...(await blTools(["blaxel-search", "webcrawl"])),
    tool({
      name: "weather",
      description: "Get the weather in a specific city",
      parameters: z.object({
        city: z.string(),
      }),
      execute: async (input) => {
        console.debug("TOOLCALLING: local weather", input);
        return `The weather in ${input.city} is sunny`;
      },
    }),
  ],
  systemPrompt: prompt,
}).run(process.argv[2]);
Installation
# npm
npm install @blaxel/llamaindex
# yarn
yarn add @blaxel/llamaindex
# bun
bun add @blaxel/llamaindex
Authentication
The SDK authenticates with your Blaxel workspace using these sources (in priority order):
- Blaxel CLI, when logged in
- Environment variables in a .env file (BL_WORKSPACE, BL_API_KEY)
- System environment variables
- Blaxel configuration file (~/.blaxel/config.yaml)
When developing locally, the recommended method is to just log in to your workspace with the Blaxel CLI:
bl login YOUR-WORKSPACE
This allows you to run Blaxel SDK functions that automatically connect to your workspace without additional setup. When you deploy on Blaxel, this connection persists automatically.
When running the Blaxel SDK from a remote server that is not Blaxel-hosted, we recommend authenticating with environment variables as described above.
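For example, a minimal .env file might look like the following (the values shown are placeholders; substitute your own workspace name and API key):

```shell
# .env — placeholder values, not real credentials
BL_WORKSPACE=my-workspace
BL_API_KEY=bl_xxxxxxxxxxxxxxxx
```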
Usage
Model use
Blaxel acts as a unified gateway for model APIs, centralizing access credentials, tracing and telemetry. You can integrate with any model API provider, or deploy your own custom model. When a model is deployed on Blaxel, a global API endpoint is also created to call it.
This package includes a helper function that creates a reference to a model deployed on Blaxel and returns a framework-specific model client that routes API calls through Blaxel's unified gateway.
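As a quick sketch of how the returned client can be used: it implements LlamaIndex's standard LLM interface, so the usual calls work and are routed through Blaxel's gateway. The model name and prompt below are placeholders, and this assumes a model with that name is deployed in your workspace:

```typescript
import { blModel } from "@blaxel/llamaindex";

// Retrieve a LlamaIndex-compatible client for a model deployed on Blaxel.
// "gpt-4o-mini" is a placeholder: use the name of a model in your workspace.
const llm = await blModel("gpt-4o-mini");

// Standard LlamaIndex LLM calls work; requests go through Blaxel's gateway,
// which handles credentials, tracing, and telemetry.
const response = await llm.complete({ prompt: "Say hello in one word." });
console.log(response.text);
```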
// With LlamaIndex
import { blModel } from "@blaxel/llamaindex";
const model = await blModel("gpt-5-mini");
MCP tool use
Blaxel lets you deploy and host Model Context Protocol (MCP) servers, accessible at a global endpoint over streamable HTTP.
This package includes a helper function that retrieves and returns tool definitions from a Blaxel-hosted MCP server in the format required by specific frameworks.
Here is an example of retrieving tool definitions from a Blaxel sandbox's MCP server in the format required by LlamaIndex:
import { SandboxInstance } from "@blaxel/core";
import { blTools } from "@blaxel/llamaindex";
// Create a new sandbox
const sandbox = await SandboxInstance.createIfNotExists({
name: "my-sandbox",
image: "blaxel/base-image:latest",
memory: 4096,
region: "us-pdx-1",
ttl: "24h"
});
// Get sandbox tools
const tools = await blTools(['sandbox/my-sandbox']);
Telemetry
Instrumentation happens automatically when workloads run on Blaxel.
Enable automatic telemetry by importing the @blaxel/telemetry package:
import "@blaxel/telemetry";Requirements
- Node.js v18 or later
Documentation
Contributing
Contributions are welcome! Please feel free to submit a pull request.
License
This project is licensed under the MIT License. See the LICENSE file for details.
