# @equinor/fusion-framework-module-ai
v2.0.1
AI services module for Fusion Framework applications, providing integration with language models, embeddings, and vector stores.
## Features
- Language Models: Support for various AI language models (OpenAI, Azure OpenAI, etc.)
- Embeddings: Text embedding services for semantic search and similarity
- Vector Stores: Vector database integration for storing and searching embeddings
- Flexible Configuration: Support for both eager and lazy initialization
- Type Safety: Full TypeScript support with comprehensive type definitions
## Installation

```sh
pnpm add @equinor/fusion-framework-module-ai
```

## Usage
### Basic Configuration

```ts
import { Framework } from '@equinor/fusion-framework';
import { enableAI, type AIServiceType } from '@equinor/fusion-framework-module-ai';
import { AzureOpenAIModel } from '@equinor/fusion-framework-module-ai/lib/azure/AzureOpenAIModel';
import { AzureOpenAiEmbed } from '@equinor/fusion-framework-module-ai/lib/azure/AzureOpenAiEmbed';
import { AzureVectorStore } from '@equinor/fusion-framework-module-ai/lib/azure/AzureVectorStore';

const framework = new Framework({
  name: 'AI Example App',
  modules: [
    enableAI((config) =>
      config
        .setModel('gpt-4', new AzureOpenAIModel({
          apiKey: process.env.AZURE_OPENAI_API_KEY!,
          modelName: 'gpt-4'
        }))
        .setEmbedding('embeddings', new AzureOpenAiEmbed({
          apiKey: process.env.AZURE_OPENAI_API_KEY!,
          modelName: 'text-embedding-ada-002'
        }))
        .setVectorStore('vector-db', new AzureVectorStore({
          endpoint: process.env.AZURE_SEARCH_ENDPOINT!,
          apiKey: process.env.AZURE_SEARCH_API_KEY!,
          indexName: 'documents'
        }))
    )
  ]
});
```

### Lazy Initialization with Factory Functions
```ts
const framework = new Framework({
  name: 'AI App with Lazy Loading',
  modules: [
    enableAI((config) =>
      config
        .setModel('gpt-4-lazy', (args) => new AzureOpenAIModel({
          apiKey: args.env.AZURE_OPENAI_API_KEY,
          modelName: 'gpt-4'
        }))
        .setEmbedding('embeddings-lazy', (args) => new AzureOpenAiEmbed({
          apiKey: args.env.AZURE_OPENAI_API_KEY,
          modelName: 'text-embedding-ada-002'
        }))
        .setVectorStore('vector-db-lazy', (args) => new AzureVectorStore({
          endpoint: args.env.AZURE_SEARCH_ENDPOINT,
          apiKey: args.env.AZURE_SEARCH_API_KEY,
          indexName: 'documents'
        }))
    )
  ]
});
```

### Using the AI Provider
```ts
// Note: IModel, IEmbed, and IVectorStore are assumed here to be exported
// from the module's entry point alongside enableAI.
import type { IModel, IEmbed, IVectorStore } from '@equinor/fusion-framework-module-ai';
import { Framework } from '@equinor/fusion-framework';

async function useAIProvider(framework: Framework) {
  const frameworkInstance = await framework.initialize();
  const aiProvider = frameworkInstance.modules.ai;

  // Get configured services
  const chatService = aiProvider.getService('chat', 'gpt-4') as IModel;
  const embeddingService = aiProvider.getService('embeddings', 'embeddings') as IEmbed;
  const searchService = aiProvider.getService('search', 'vector-db') as IVectorStore;

  // Use the services
  const response = await chatService.execute([
    { role: 'user', content: 'Hello, how are you?' }
  ]);
  const embeddings = await embeddingService.execute('This is a test document');
  const searchResults = await searchService.execute('search query');

  return { response, embeddings, searchResults };
}
```

## API Reference
### AIConfigurator

The `AIConfigurator` class provides a fluent API for configuring AI services.

#### Methods

- `setModel(identifier: string, modelOrFactory: ValueOrCallback<IModel>): this`
- `setEmbedding(identifier: string, embeddingOrFactory: ValueOrCallback<IEmbed>): this`
- `setVectorStore(identifier: string, vectorStoreOrFactory: ValueOrCallback<IVectorStore>): this`
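All three setters follow the same store-and-return-`this` pattern, which is what makes chaining work. A minimal, hypothetical sketch of that pattern (not the real `AIConfigurator`, which adds validation and framework wiring):

```ts
// Simplified stand-in for ValueOrCallback; the real callback argument
// type is defined by the framework's config builder.
type ValueOrCallback<T> = T | ((args: unknown) => T);

class MiniConfigurator<TModel> {
  models: Record<string, ValueOrCallback<TModel>> = {};

  setModel(identifier: string, modelOrFactory: ValueOrCallback<TModel>): this {
    this.models[identifier] = modelOrFactory; // returning `this` enables chaining
    return this;
  }
}

// Eager and lazy entries can be mixed freely in one chain.
const config = new MiniConfigurator<string>()
  .setModel('gpt-4', 'eager-model')
  .setModel('gpt-4-lazy', () => 'lazy-model');

console.log(Object.keys(config.models)); // ['gpt-4', 'gpt-4-lazy']
```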
### AIProvider

The `AIProvider` class provides access to configured AI services.

#### Methods

- `getService(type: AIServiceType, identifier: string): IModel | IEmbed | IVectorStore`: get a configured AI service by type and identifier.
## Types

### AIServiceType

The union of supported AI service types.

```ts
type AIServiceType = 'chat' | 'embeddings' | 'search';
```

### ValueOrCallback
Represents either a value of type T or a callback that creates a value of type T.
```ts
type ValueOrCallback<T> = T | ConfigBuilderCallback<T>;
```
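Both forms can be resolved uniformly at initialization time. The sketch below illustrates the idea with a hypothetical `ConfigBuilderCallback` signature (the real signature is defined by the framework's config builder):

```ts
// Hypothetical callback shape, for illustration only.
type ConfigBuilderCallback<T> = (args: { env: Record<string, string | undefined> }) => T;
type ValueOrCallback<T> = T | ConfigBuilderCallback<T>;

// Resolve either form to a concrete value. Note: the typeof check
// would misfire if T itself were a function type.
function resolve<T>(
  entry: ValueOrCallback<T>,
  args: { env: Record<string, string | undefined> },
): T {
  return typeof entry === 'function' ? (entry as ConfigBuilderCallback<T>)(args) : entry;
}

// Eager: the value exists up front.
const eager: ValueOrCallback<string> = 'gpt-4';
// Lazy: construction is deferred until the module initializes.
const lazy: ValueOrCallback<string> = (args) => `model-for-${args.env.STAGE}`;

console.log(resolve(eager, { env: {} }));              // 'gpt-4'
console.log(resolve(lazy, { env: { STAGE: 'dev' } })); // 'model-for-dev'
```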
### AIModuleConfig

The configuration object produced by the `AIConfigurator`.
```ts
type AIModuleConfig = {
  models?: Record<string, ValueOrCallback<IModel>>;
  embeddings?: Record<string, ValueOrCallback<IEmbed>>;
  vectorStores?: Record<string, ValueOrCallback<IVectorStore>>;
};
```

## Error Handling
The AI module provides comprehensive error handling:
- Service Not Found: Clear error messages when requesting non-existent services
- Configuration Errors: Validation of configuration parameters
- Service Errors: Wrapped service-specific errors with context
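The "service not found" case can be illustrated with a small, hypothetical stand-in; the real `AIProvider`'s error message and shape may differ:

```ts
// Hypothetical mini-provider sketching the lookup-and-throw behavior
// described above; not the real AIProvider implementation.
class MiniProvider {
  #services = new Map<string, unknown>([['chat:gpt-4', { name: 'gpt-4' }]]);

  getService(type: string, identifier: string): unknown {
    const service = this.#services.get(`${type}:${identifier}`);
    if (!service) {
      // A clear, contextual message instead of returning undefined.
      throw new Error(`AI service not found: type="${type}", identifier="${identifier}"`);
    }
    return service;
  }
}

const provider = new MiniProvider();
try {
  provider.getService('chat', 'missing-model');
} catch (err) {
  console.error((err as Error).message);
}
```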
## Contributing
Please refer to the main Fusion Framework contributing guidelines for information on how to contribute to this module.
