Azure Storage MCP Server
A comprehensive Model Context Protocol (MCP) server for Azure Storage services, providing complete integration with Azure Table Storage, Azure Blob Storage, and Azure Service Bus Queues. Features include CRUD operations, batch processing, schema validation, advanced querying, container management, blob operations, and asynchronous workflow orchestration.
Features
🔥 Core Operations
- Complete CRUD: Create, Read, Update, Delete entities
- Batch Processing: Handle up to 100 entities per operation
- Table Management: Create and delete tables
- Advanced Querying: OData filters, pagination, sorting
- Container Management: Create, list, and delete blob containers
- Blob Operations: Upload, download, list, and delete blobs
- Queue Management: Create, list, and delete Service Bus queues
- Message Operations: Send, receive, and peek queue messages
🛡️ Data Integrity
- Schema Validation: Automatic inference and validation against existing data
- Type Safety: Comprehensive Zod validation for all Azure Storage services
- Error Handling: Detailed error messages with actionable suggestions
- Content Type Detection: Automatic MIME type detection for blob uploads
⚡ Performance & Reliability
- Efficient Batching: Automatic grouping by PartitionKey (see the sketch after this list)
- Resource Discovery: Dynamic table, container, and queue discovery
- Connection Flexibility: Support for connection strings and managed identity
- Asynchronous Processing: Queue-based workflows for long-running tasks
- Lazy Loading: Service clients instantiated only when needed
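
Azure Table batch transactions require every entity in a batch to share a PartitionKey and allow at most 100 operations per transaction, which is why entities are grouped before submission. A minimal sketch of that kind of grouping (illustrative only; the groupForBatch helper is hypothetical, not part of this package):

```javascript
// Illustrative only: group entities by PartitionKey and split into chunks of
// at most 100, as Azure Table batch transactions require.
function groupForBatch(entities, maxBatchSize = 100) {
  const byPartition = new Map();
  for (const entity of entities) {
    const key = entity.partitionKey;
    if (!byPartition.has(key)) byPartition.set(key, []);
    byPartition.get(key).push(entity);
  }

  const batches = [];
  for (const group of byPartition.values()) {
    for (let i = 0; i < group.length; i += maxBatchSize) {
      batches.push(group.slice(i, i + maxBatchSize));
    }
  }
  return batches; // each batch shares one PartitionKey and holds <= 100 entities
}
```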
Installation
```bash
npm install -g @ignitionai/azure-storage-mcp
```

Quick Start
1. Configure Authentication
```bash
# Add your connection string
AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

# Add your storage account name
AZURE_STORAGE_ACCOUNT_NAME="your-storage-account"

# Add your Service Bus connection (for queues)
AZURE_SERVICE_BUS_CONNECTION_STRING="Endpoint=sb://your-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
# OR
AZURE_SERVICE_BUS_NAMESPACE="your-namespace"
```

2. Add to Claude Desktop Configuration
Add to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "azure-storage": {
      "command": "npx",
      "args": [
        "-y",
        "@ignitionai/azure-storage-mcp"
      ],
      "env": {
        "AZURE_STORAGE_CONNECTION_STRING": "YOUR_CONNECTION_STRING",
        "AZURE_STORAGE_ACCOUNT_NAME": "YOUR_ACCOUNT_NAME",
        "AZURE_SERVICE_BUS_CONNECTION_STRING": "YOUR_SERVICE_BUS_CONNECTION_STRING"
      }
    }
  }
}
```

3. Start Using with Claude
The server automatically discovers your tables, containers, and queues and exposes the tools listed below (a sketch of invoking a tool from a programmatic MCP client follows the list):
Available Tools
📝 Entity Operations
- create-azure-table-entity - Create a single entity
- update-azure-table-entity - Update an entity (merge/replace modes)
- delete-azure-table-entity - Delete a single entity
- check-azure-table-entity-exists - Verify entity existence
📦 Batch Operations
- batch-create-azure-table-entities - Create up to 100 entities
- batch-update-azure-table-entities - Update up to 100 entities
- batch-delete-azure-table-entities - Delete up to 100 entities
🗄️ Table Management
- create-azure-table - Create new tables
- delete-azure-table - Delete tables (⚠️ irreversible)
- list-azure-tables - List all available tables
🔍 Data Discovery
- read-azure-table - Read with OData filters
- query-azure-table-advanced - Advanced queries with pagination and sorting
- inspect-azure-table-schema - Analyze existing data structure
📦 Container Operations
- create-blob-container - Create new blob containers
- list-blob-containers - List all containers
- delete-blob-container - Delete containers
- get-container-properties - Get container metadata
🗃️ Blob Operations
- upload-blob - Upload files to blob storage
- download-blob - Download blobs with metadata
- read-azure-blob - Read blob content as text
- list-blobs - List blobs in a container
- delete-blob - Delete individual blobs
- get-blob-properties - Get blob metadata
🚀 Queue Operations
- send-queue-message - Send messages to queues
- receive-queue-message - Receive and process messages
- peek-queue-message - Preview messages without consuming them
- create-azure-queue - Create new Service Bus queues
- list-azure-queues - List all available queues
- delete-azure-queue - Delete queues
- get-azure-queue-properties - Get queue metadata and stats
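
These tools are designed to be driven by Claude, but any MCP client can call them by name. A minimal sketch using the official TypeScript SDK (the client setup and the exact argument shape are assumptions based on the examples in the next section, not documented API of this package):

```javascript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, the same way Claude Desktop does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@ignitionai/azure-storage-mcp"],
  env: { AZURE_STORAGE_CONNECTION_STRING: "YOUR_CONNECTION_STRING" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Invoke one of the tools listed above by name.
const result = await client.callTool({
  name: "create-azure-table-entity",
  arguments: {
    tableName: "Users",
    partitionKey: "Department_IT",
    rowKey: "user_001",
    entity: { name: "John Doe", age: 30 },
  },
});
console.log(result.content);
```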
Usage Examples
Table Storage
Creating Entities
```javascript
// Claude will automatically validate against existing schema
await createEntity({
  tableName: "Users",
  partitionKey: "Department_IT",
  rowKey: "user_001",
  entity: {
    name: "John Doe",
    email: "john.doe@example.com",
    age: 30,
    active: true
  }
})
```

Batch Operations
```javascript
// Create multiple users efficiently
await batchCreateEntities({
  tableName: "Users",
  entities: [
    {
      partitionKey: "Department_IT",
      rowKey: "user_002",
      entity: { name: "Jane Smith", age: 28 }
    },
    {
      partitionKey: "Department_IT",
      rowKey: "user_003",
      entity: { name: "Bob Wilson", age: 35 }
    }
  ]
})
```

Advanced Queries
```javascript
// Query with filtering, sorting, and pagination
await queryTableAdvanced({
  tableName: "Users",
  filter: "age gt 25 and active eq true",
  orderBy: ["age desc", "name asc"],
  top: 50,
  skip: 0
})
```

Schema Analysis
```javascript
// Understand existing data structure
await inspectTableSchema({
  tableName: "Users",
  sampleSize: 20
})
```

Blob Storage
Uploading Files
```javascript
// Upload a file to blob storage
await uploadBlob({
  containerName: "documents",
  blobName: "report-2024.pdf",
  content: "<file content>",
  contentType: "application/pdf",
  overwrite: true
})
```

Managing Containers
```javascript
// Create a new container
await createContainer({
  containerName: "project-assets",
  publicAccess: "none",
  metadata: {
    project: "webapp-redesign",
    environment: "production"
  }
})
```

Queue Storage
Asynchronous Task Processing
```javascript
// Send a long-running task to a queue
await sendQueueMessage({
  queueName: "data-processing",
  messageBody: JSON.stringify({
    operation: "analyze-sales-data",
    dataset: "sales-2024-q4.csv",
    filters: ["region=NA", "product=software"]
  }),
  correlationId: "analysis-session-123"
})

// Later, check for results
await peekQueueMessage({
  queueName: "analysis-results",
  maxMessageCount: 1
})
```

Schema Validation
The server automatically analyzes existing data to ensure new entries conform to established patterns (a sketch of this inference follows the list):
- Required Properties: Detected from 80%+ presence in existing data
- Type Validation: Ensures consistent data types across entities
- Azure Constraints: Validates PartitionKey/RowKey format, property limits
- Helpful Warnings: Suggests missing properties or type mismatches
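
For illustration, the required-property inference described above could look roughly like this (a sketch of the 80% rule, not the package's actual implementation):

```javascript
// Illustrative sketch: a property is treated as required when it appears in
// at least 80% of sampled entities; observed types are kept for mismatch warnings.
function inferSchema(sampleEntities, requiredThreshold = 0.8) {
  const stats = new Map(); // property name -> { count, types }
  for (const entity of sampleEntities) {
    for (const [name, value] of Object.entries(entity)) {
      if (!stats.has(name)) stats.set(name, { count: 0, types: new Set() });
      const info = stats.get(name);
      info.count += 1;
      info.types.add(typeof value);
    }
  }

  const schema = {};
  for (const [name, info] of stats) {
    schema[name] = {
      required: info.count / sampleEntities.length >= requiredThreshold,
      types: [...info.types],
    };
  }
  return schema;
}
```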
Configuration
Environment Variables
| Variable | Description | Required |
|----------|-------------|----------|
| AZURE_STORAGE_CONNECTION_STRING | Storage account connection string | One of the two storage variables |
| AZURE_STORAGE_ACCOUNT_NAME | Storage account name (for managed identity) | One of the two storage variables |
| AZURE_SERVICE_BUS_CONNECTION_STRING | Service Bus connection string | One of the two Service Bus variables (queue tools only) |
| AZURE_SERVICE_BUS_NAMESPACE | Service Bus namespace (for managed identity) | One of the two Service Bus variables (queue tools only) |
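
For reference, these two modes typically map onto the Azure SDK clients as sketched below. This is the general connection-string vs. managed-identity pattern, not necessarily this package's exact wiring:

```javascript
import { TableServiceClient } from "@azure/data-tables";
import { DefaultAzureCredential } from "@azure/identity";

// Connection-string mode
const byConnectionString = TableServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING
);

// Managed-identity mode: only the account name is set, and credentials are
// resolved from the environment (e.g. an Azure managed identity).
const account = process.env.AZURE_STORAGE_ACCOUNT_NAME;
const byManagedIdentity = new TableServiceClient(
  `https://${account}.table.core.windows.net`,
  new DefaultAzureCredential()
);
```

The Blob and Service Bus clients follow the same pattern: a `fromConnectionString`-style constructor when a connection string is provided, or a `DefaultAzureCredential` against the account or namespace endpoint otherwise.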
Azure Storage Limits
The server enforces Azure Storage constraints (a client-side validation sketch follows these lists):
Table Storage:
- Entity Size: Max 252 properties per entity
- Property Size: Max 64KB for string/binary values
- Batch Size: Max 100 entities per batch operation
- Key Format: PartitionKey/RowKey cannot contain /, \, #, or ?
Blob Storage:
- Blob Size: Max 200GB per blob (block blobs)
- Container Names: 3-63 characters, lowercase letters, numbers, hyphens
- Blob Names: Max 1024 characters
Service Bus Queues:
- Message Size: Max 256KB (Standard), 1MB (Premium)
- Queue Size: Max 80GB
- Message TTL: Max 14 days
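
The key-format constraint above can also be checked client-side before a tool call, which gives faster feedback than a round trip to Azure. A minimal sketch (illustrative; the validateEntityKeys helper is not part of this package):

```javascript
// Illustrative client-side check of the Table Storage key constraints listed above.
const FORBIDDEN_KEY_CHARS = /[\/\\#?]/;

function validateEntityKeys({ partitionKey, rowKey }) {
  const errors = [];
  for (const [label, value] of [["PartitionKey", partitionKey], ["RowKey", rowKey]]) {
    if (typeof value !== "string" || value.length === 0) {
      errors.push(`${label} must be a non-empty string`);
    } else if (FORBIDDEN_KEY_CHARS.test(value)) {
      errors.push(`${label} must not contain /, \\, # or ?`);
    }
  }
  return errors; // an empty array means the keys are valid
}
```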
Dynamic Resources
All Azure Storage resources are automatically discovered and exposed as MCP resources (a client-side read sketch follows the list):
Tables:
- azure-table://{tableName} - Full table data
- azure-table://{tableName}/{partitionKey} - Filtered by partition
Blob Containers:
- azure-blob://{containerName} - Container contents
- azure-blob://{containerName}/{blobName} - Individual blob
Service Bus Queues:
- azure-queue://{queueName} - Queue properties and messages
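
Any MCP client can list and read these resources directly. A minimal sketch with the TypeScript SDK (assumes the connected `client` from the tool-call sketch earlier; the Users table comes from the earlier examples):

```javascript
// Discover the dynamically exposed resources and print their URIs.
const resources = await client.listResources();
console.log(resources.resources.map((r) => r.uri));

// Read one resource by URI.
const table = await client.readResource({ uri: "azure-table://Users" });
console.log(table.contents);
```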
Development
Building from Source
```bash
git clone https://github.com/IgnitionAI/azure-storage-mcp.git
cd azure-storage-mcp
pnpm install
pnpm build
```

Testing
```bash
# Test the built server
pnpm start:prod

# Use MCP Inspector for debugging
pnpm inspect
```

Contributing
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
License
MIT License - see LICENSE file for details.
Support
Built with ❤️ by IgnitionAI for the MCP ecosystem.
