@libertasinception/cortex
v0.1.0
AI framework for Helix Synapse — MCP server, custom AI modules, multi-provider LLM integration, and LangChain tools
Installation
npm install @libertasinception/cortex
Quick Start
import { CortexClient } from "@libertasinception/cortex";

const cortex = new CortexClient(httpClient);

// Start MCP server with blockchain tools
const server = await cortex.startMCPServer({
  port: 3100,
  name: "helix-mcp",
  tools: [{
    name: "query_balance",
    description: "Query PHI balance",
    parameters: { address: { type: "string" } },
    handler: async ({ address }) => await helix.query.balance(address, "uphi"),
  }],
});

// Register AI modules (RAG, chain-analysis)
await cortex.modules.register({
  name: "governance-rag",
  type: "rag",
  config: { vectorStore: { provider: "qdrant" } },
});

// Create AI agent
const agent = await cortex.createAgent({
  name: "advisor",
  modules: ["governance-rag"],
  provider: { name: "anthropic", model: "claude-sonnet-4-6" },
});

const response = await agent.chat("Analyze proposal #42");
Features
- MCP Server — expose blockchain tools to Claude, GPT, etc.
- Custom AI modules: RAG, chain-analysis, custom pipelines
- Multi-provider: Anthropic, OpenAI, Mistral, Ollama
- Vector stores: Pinecone, Qdrant, ChromaDB
- LangChain & LlamaIndex integration
- A2A (Agent-to-Agent) messaging
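The `tools` entries passed to `startMCPServer` in the Quick Start follow a simple name/description/parameters/handler shape. As a package-free illustration, the sketch below models that shape with a local `MCPTool` interface and a mocked balance lookup; the interface and the mock are assumptions based on the example above, not the package's actual exports.

```typescript
// Local stand-in for the tool shape shown in the Quick Start.
// `MCPTool` here is an assumption, not the package's exported type.
interface MCPTool {
  name: string;
  description: string;
  parameters: Record<string, { type: string }>;
  handler: (args: Record<string, string>) => Promise<unknown>;
}

// Mocked balance table in place of helix.query.balance().
const balances: Record<string, string> = { helix1abc: "1000000" };

const queryBalance: MCPTool = {
  name: "query_balance",
  description: "Query PHI balance",
  parameters: { address: { type: "string" } },
  handler: async ({ address }) => ({
    denom: "uphi",
    amount: balances[address] ?? "0",
  }),
};
```

An MCP client connected to the server invokes a tool by name; whatever the handler resolves to is serialized back to the model as the tool result.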
API Reference
Classes
CortexClient
Types
AIProviderConfig, AIProvider, MCPServerConfig, MCPTool, AIAgent, AgentQueryResult, CustomModule, VectorStoreConfig, LangChainConfig
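Judging from the `provider` object in the Quick Start, `AIProviderConfig` plausibly pairs a provider name with a model identifier. The local sketch below is a hedged guess at that shape; the field names beyond `name` and `model` (and the `AIProviderConfig` definition itself) are assumptions, not the package's published typings.

```typescript
// Assumed shape, modeled on the Quick Start's `provider` object —
// not copied from the package's actual type declarations.
type ProviderName = "anthropic" | "openai" | "mistral" | "ollama";

interface AIProviderConfig {
  name: ProviderName;
  model: string;
  apiKey?: string;  // assumption: hosted providers typically need a key
  baseUrl?: string; // assumption: useful for a self-hosted Ollama endpoint
}

const provider: AIProviderConfig = {
  name: "anthropic",
  model: "claude-sonnet-4-6",
};
```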
Dependencies
@libertasinception/core, zod
Requirements
- Node.js >= 18.0.0
- TypeScript >= 5.7 (recommended)
License
MIT - see LICENSE for details.
