Alfred Logger SDK
A production-ready data collection SDK for feeding structured events to LLM Data Agents, with automatic HTTP request/response capture.
Features
🚀 Auto-Capture - Automatically captures all HTTP/HTTPS requests and responses
🔒 Security-First - Built-in data sanitization and sensitive field redaction
📊 Custom Context - Add application-specific context to all events
🎯 Trace Management - Built-in request tracing and correlation
⚡ Performance - Configurable sampling, buffering, and rate limiting
🛡️ Production-Ready - Memory leak protection, error recovery, graceful shutdown
🔧 Express Integration - Drop-in middleware for Express.js applications
Installation
npm install alfred-logger-sdk
Quick Start
const { Logger } = require('alfred-logger-sdk');
const logger = new Logger({
  endpoint: 'https://your-api.com/events',
  apiKey: 'your-api-key',
  // Auto-capture all HTTP requests/responses
  autoCapture: {
    enabled: true,
    sampleRate: 1.0 // Capture 100% of requests
  }
});
// Manual event logging
logger.userAction('button_click', { button: 'submit', page: '/checkout' });
logger.error(new Error('Payment failed'), { orderId: '12345' });
// Custom context (included in all events)
logger.setCustomContext({
  userId: 'user-123',
  sessionId: 'sess-abc',
  version: '2.1.0'
});
// Graceful shutdown
process.on('SIGINT', async () => {
  await logger.shutdown();
  process.exit(0);
});
Configuration Options
const logger = new Logger({
  // Required
  endpoint: 'https://your-api.com/events',
  apiKey: 'your-api-key', // Optional if endpoint doesn't require auth

  // Buffering & Performance
  batchSize: 50, // Events per batch
  maxBufferSize: 10000, // Max events in memory
  flushInterval: 5000, // Flush interval (ms)
  maxEventSize: 1024 * 1024, // Max event size (1MB)

  // Application Context
  appName: 'my-app',
  environment: 'production',
  customContext: {
    userId: 'user-123',
    tenantId: 'tenant-456'
  },

  // Security
  sanitizePayloads: true, // Auto-sanitize sensitive data

  // Auto-Capture Configuration
  autoCapture: {
    enabled: true,
    captureRequestBody: true,
    captureResponseBody: true,
    maxBodySize: 64 * 1024, // 64KB body limit
    sampleRate: 1.0, // 0.0 - 1.0 sampling rate
    captureHeaders: true,

    // Filtering
    ignoredPaths: ['/health', '/ping'],
    ignoredHosts: ['localhost', 'internal.com'],
    sensitiveHeaders: ['authorization', 'cookie']
  }
});
API Reference
Event Logging Methods
// User interactions
logger.userAction('click', { button: 'submit', page: '/home' });
// System events
logger.systemEvent('startup', { version: '1.2.3', pid: process.pid });
// Performance metrics
logger.performanceMetric('response_time', 250, 'ms');
// Errors with context
logger.error(new Error('DB connection failed'), { query: 'SELECT * FROM users' });
// Custom events
logger.customData('feature_usage', { feature: 'export', format: 'csv' });
// Generic events
logger.collectEvent('custom_event', { any: 'data' }, { traceId: 'trace-123' });
Context Management
// Set custom context (included in all future events)
logger.setCustomContext('userId', 'user-123');
logger.setCustomContext({
  sessionId: 'sess-abc',
  experimentGroup: 'variant-a'
});
// Get context
const userId = logger.getCustomContext('userId');
const allContext = logger.getCustomContext();
// Remove context
logger.removeCustomContext('sessionId');
logger.removeCustomContext(['key1', 'key2']);
logger.clearCustomContext();
Trace Management
// Manual tracing
const traceId = logger.startTrace(); // or logger.startTrace('custom-trace-id')
logger.collectEvent('step_1', { data: 'value' });
logger.collectEvent('step_2', { data: 'value' });
logger.endTrace();
// Automatic tracing with callback
logger.withTrace(null, (traceId) => {
  // All events here will have the same traceId
  logger.userAction('process_start', {});
  return someAsyncOperation().then(result => {
    logger.userAction('process_complete', { result });
    return result;
  });
});
Auto-Capture Control
// Enable/disable at runtime
logger.enableAutoCapture({ sampleRate: 0.5 });
logger.disableAutoCapture();
// Update configuration
logger.updateAutoCaptureConfig({
  maxBodySize: 32 * 1024,
  ignoredPaths: ['/health', '/metrics']
});
Express.js Integration
const express = require('express');
const { Logger } = require('alfred-logger-sdk');
const app = express();
const logger = new Logger({
  endpoint: 'https://your-api.com/events',
  apiKey: 'your-api-key'
});
// Add middleware to capture all requests/responses
app.use(logger.expressMiddleware());
app.get('/api/users', (req, res) => {
  // Request/response automatically logged
  // Access trace ID: req.traceId
  logger.userAction('api_call', {
    endpoint: '/api/users',
    userId: req.user?.id
  });
  res.json({ users: [] });
});
Axios Integration
const axios = require('axios');
// Automatically log all axios requests/responses
logger.decorateAxios(axios);
// All axios calls now automatically logged
const response = await axios.get('https://api.example.com/data');
Security
The SDK automatically sanitizes sensitive data:
- Header Sanitization: authorization, cookie, x-api-key, etc. are redacted
- Payload Sanitization: Fields containing password, token, secret, or key are redacted
- Body Size Limits: Prevents memory issues with large payloads
- URL Validation: Prevents logging to malicious endpoints
See SECURITY.md for complete security guidelines.
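As a rough illustration, here is a sketch of how payload sanitization plays out, assuming only the behavior described above; the exact redaction placeholder is not documented here, so treat it as an assumption:
const { Logger } = require('alfred-logger-sdk');
// Sketch: with sanitizePayloads enabled, fields named like password/token/secret/key
// are redacted before the event is sent to the endpoint.
const logger = new Logger({
  endpoint: 'https://your-api.com/events',
  sanitizePayloads: true
});
logger.userAction('login_attempt', {
  username: 'jane',   // non-sensitive field, sent as-is
  password: 'hunter2' // matches the sensitive-field rules above, so it is redacted
                      // (the exact placeholder, e.g. '[REDACTED]', is an assumption)
});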
Examples
Check the /examples directory for complete usage examples, such as examples/basic-usage.js and examples/auto-capture.js.
Error Handling
The SDK is designed to never break your application:
// Graceful error handling
logger.collectEvent('test', { data: 'too large'.repeat(1000000) }); // Throws error
// vs
logger.collectEvent('test', { data: 'normal size' }); // Works fine
// Failed HTTP requests are retried automatically
// Failed events are buffered and retried
// Memory limits prevent buffer overflow
TypeScript Support
TypeScript definitions will be added in a future release. Currently works with JavaScript/Node.js projects.
Publishing Your Own Version
To publish this package to npm:
Update package.json:
# Update name, author, repository fields
npm version patch # or minor/major
Test everything:
npm test
npm run test:coverage
Publish to npm:
npm login
npm publish
Performance Considerations
- Buffering: Events are batched to reduce HTTP overhead
- Sampling: Configure sampleRate to capture a subset of requests
- Size Limits: Automatic truncation of large payloads
- Memory Management: Buffer overflow protection prevents memory leaks
- Rate Limiting: Built-in request throttling
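Taken together, these knobs can be tuned for a high-throughput deployment. The snippet below is a sketch built only from the options shown in the configuration section above; the specific numbers are illustrative, not recommendations from the SDK authors:
const { Logger } = require('alfred-logger-sdk');
const logger = new Logger({
  endpoint: 'https://your-api.com/events',
  apiKey: 'your-api-key',
  batchSize: 200,           // larger batches mean fewer HTTP round trips
  flushInterval: 10000,     // flush less often than the 5s used in the example above
  maxBufferSize: 5000,      // cap how many events can sit in memory
  maxEventSize: 256 * 1024, // lower the per-event size cap
  autoCapture: {
    enabled: true,
    sampleRate: 0.1,            // capture roughly 10% of requests
    captureRequestBody: false,  // skip bodies to keep events small
    captureResponseBody: false
  }
});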
Development
# Install dependencies
npm install
# Run tests
npm test
npm run test:watch
npm run test:coverage
# Run examples
node examples/basic-usage.js
node examples/auto-capture.js
License
MIT © [Your Name]
Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Changelog
See CHANGELOG.md for version history and updates.
