@sergdudko/objectstream
A powerful and efficient Node.js library for streaming JSON processing. Transform JSON strings to objects and objects to JSON strings with support for custom separators, multiple encodings, and high-performance streaming operations.
✨ Features
- Dual Package: Full ES Modules (ESM) and CommonJS (CJS) support
- TypeScript: Complete type definitions included
- High Performance: Based on native Node.js stream methods
- Multiple Encodings: Support for utf8, base64, latin1, binary, and hex
- Custom Separators: Configure start, middle, and end separators
- Memory Efficient: Streaming approach for large JSON datasets
- Zero Dependencies: No external dependencies
📦 Installation
npm install @sergdudko/objectstream

🚀 Quick Start
ESM (ES Modules)
import { Parser, Stringifer } from '@sergdudko/objectstream';
// String to Object conversion
const parser = new Parser();
parser.on('data', (obj) => {
console.log('Parsed object:', obj);
});
parser.write('{"name":"John","age":30}');
parser.end();
// Object to String conversion
const stringifer = new Stringifer();
stringifer.on('data', (jsonString) => {
console.log('JSON string:', jsonString.toString());
});
stringifer.write({ name: 'John', age: 30 });
stringifer.end();

CommonJS
const { Parser, Stringifer } = require('@sergdudko/objectstream');
// Or using default export
const objectstream = require('@sergdudko/objectstream');
const { Parser, Stringifer } = objectstream.default;

TypeScript
import { Parser, Stringifer } from '@sergdudko/objectstream';
interface User {
name: string;
age: number;
}
const parser = new Parser();
parser.on('data', (user: User) => {
console.log(`User: ${user.name}, Age: ${user.age}`);
});
parser.write('{"name":"Alice","age":30}');
parser.end();

📚 API Reference
Parser Class
Transform stream that converts JSON strings to JavaScript objects.
Constructor
new Parser(start?: string, middle?: string, end?: string)

Parameters
- start (optional): First separator character (default: none)
- middle (optional): Middle separator character (default: none)
- end (optional): End separator character (default: none)
Methods
setEncoding(encoding): Set input encoding (utf8, utf-8, base64, latin1, binary, hex)
Events
- data: Emitted when an object is parsed
- error: Emitted when parsing fails
- end: Emitted when stream ends
- finish: Emitted when stream finishes
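Putting the Parser pieces together, here is a minimal sketch (the data is illustrative, and it assumes the parser buffers partial input across writes, as a streaming parser normally does):

```js
import { Parser } from '@sergdudko/objectstream';

// Parser configured for a JSON array: '[' opens, ',' separates, ']' closes
const parser = new Parser('[', ',', ']');
parser.on('data', (obj) => console.log('record:', obj));
parser.on('error', (err) => console.error('parse error:', err));
parser.on('end', () => console.log('done'));

// Input may arrive in arbitrary chunks; a boundary can even split an object
parser.write('[{"id":1,"name":"Al');
parser.write('ice"},{"id":2,"name":"Bob"}]');
parser.end();
```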
Stringifer Class
Transform stream that converts JavaScript objects to JSON strings.
Constructor
new Stringifer(start?: string, middle?: string, end?: string)

Parameters
- start (optional): First separator character (default: none)
- middle (optional): Middle separator character (default: none)
- end (optional): End separator character (default: none)
Methods
setEncoding(encoding): Set output encoding (utf8, utf-8, base64, latin1, binary, hex)
Events
- data: Emitted when a JSON string is generated
- error: Emitted when stringification fails
- end: Emitted when stream ends
- finish: Emitted when stream finishes
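As a quick illustration of the separator parameters, the sketch below emits one JSON document per line; it assumes the middle separator is inserted between consecutive objects, and it passes undefined to keep the default start and end:

```js
import { Stringifer } from '@sergdudko/objectstream';

// Only a middle separator: newline-delimited JSON output
const stringifer = new Stringifer(undefined, '\n');
let out = '';
stringifer.on('data', (chunk) => { out += chunk.toString(); });
stringifer.on('end', () => console.log(out)); // Expected: {"id":1}\n{"id":2}

stringifer.write({ id: 1 });
stringifer.write({ id: 2 });
stringifer.end();
```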
💡 Usage Examples
Basic JSON Processing
import { Parser, Stringifer } from '@sergdudko/objectstream';
const parser = new Parser();
const stringifer = new Stringifer();
// Parse JSON string
parser.on('data', (obj) => {
console.log('Parsed:', obj);
});
parser.write('{"message":"Hello World"}');
parser.end();
// Stringify object
stringifer.on('data', (data) => {
console.log('Stringified:', data.toString());
});
stringifer.write({ message: 'Hello World' });
stringifer.end();

Custom Separators for JSON Arrays
import { Parser, Stringifer } from '@sergdudko/objectstream';
// Process JSON array with custom separators
const parser = new Parser('[', ',', ']');
const stringifer = new Stringifer('[', ',', ']');
stringifer.on('data', (data) => {
console.log('JSON Array chunk:', data.toString());
});
// Write multiple objects
stringifer.write({ id: 1, name: 'Alice' });
stringifer.write({ id: 2, name: 'Bob' });
stringifer.write({ id: 3, name: 'Charlie' });
stringifer.end(); // Output: [{"id":1,"name":"Alice"},{"id":2,"name":"Bob"},{"id":3,"name":"Charlie"}]

Different Encodings
import { Parser, Stringifer } from '@sergdudko/objectstream';
// Base64 encoding
const stringifer = new Stringifer();
stringifer.setEncoding('base64');
stringifer.on('data', (data) => {
console.log('Base64 JSON:', data); // Base64 encoded JSON string
});
stringifer.write({ encoded: true });
stringifer.end();
// Parse Base64 encoded JSON
const parser = new Parser();
parser.setEncoding('base64');
parser.on('data', (obj) => {
console.log('Decoded object:', obj);
});
// Write base64 encoded JSON
parser.write(Buffer.from('{"decoded":true}').toString('base64'));
parser.end();

Stream Piping
import { Parser, Stringifer } from '@sergdudko/objectstream';
import { Transform } from 'stream';
// Create a processing pipeline
const parser = new Parser();
const processor = new Transform({
objectMode: true,
transform(obj, encoding, callback) {
// Process each object
obj.processed = true;
obj.timestamp = Date.now();
callback(null, obj);
}
});
const stringifer = new Stringifer();
// Pipe the streams together
parser
.pipe(processor)
.pipe(stringifer)
.on('data', (data) => {
console.log('Processed JSON:', data.toString());
});
// Input data
parser.write('{"name":"test"}');
parser.end();

Error Handling
import { Parser, Stringifer } from '@sergdudko/objectstream';
const parser = new Parser();
parser.on('data', (obj) => {
console.log('Valid object:', obj);
});
parser.on('error', (errors) => {
console.error('Parsing errors:', errors);
});
// Valid JSON
parser.write('{"valid":true}');
// Invalid JSON
parser.write('{"invalid":}');
parser.end();

🎯 Supported Encodings
| Encoding | Input | Output | Description |
|----------|-------|--------|-------------|
| utf8 (default) | ✅ | ✅ | Standard UTF-8 text |
| utf-8 | ✅ | ✅ | Alias for utf8 |
| base64 | ✅ | ✅ | Base64 encoded data |
| latin1 | ✅ | ✅ | Latin-1 encoding |
| binary | ✅ | ✅ | Binary data encoding |
| hex | ✅ | ✅ | Hexadecimal encoding |
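For example, a hex round trip; this sketch assumes setEncoding controls the encoding of the chunks the Stringifer emits and the chunks the Parser accepts, as described in the API reference above:

```js
import { Parser, Stringifer } from '@sergdudko/objectstream';

const stringifer = new Stringifer();
stringifer.setEncoding('hex');

const parser = new Parser();
parser.setEncoding('hex');

parser.on('data', (obj) => console.log('round-tripped:', obj));

// Feed the hex-encoded JSON produced by the Stringifer back into the Parser
stringifer.on('data', (hexChunk) => parser.write(hexChunk));
stringifer.on('end', () => parser.end());

stringifer.write({ roundTrip: true });
stringifer.end();
```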
⚡ Performance
ObjectStream is optimized for high-performance streaming operations:
- Memory Efficient: Processes data in chunks, suitable for large JSON files (see the sketch after this list)
- Zero-Copy Operations: Minimizes memory copying where possible
- Stream-Based: Non-blocking operations using Node.js streams
- Optimized Parsing: Efficient JSON parsing with error recovery
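For large inputs, the natural pattern is to pipe a file stream through the Parser instead of buffering the whole document. A minimal sketch (the file name is illustrative, and it assumes the file contains a single JSON array):

```js
import { createReadStream } from 'fs';
import { Writable } from 'stream';
import { pipeline } from 'stream/promises';
import { Parser } from '@sergdudko/objectstream';

// Count records from a large JSON array without loading the file into memory
const parser = new Parser('[', ',', ']');
let count = 0;

await pipeline(
  createReadStream('large-array.json'), // illustrative file name
  parser,
  new Writable({
    objectMode: true,
    write(obj, encoding, callback) {
      count += 1; // each parsed object arrives here one at a time
      callback();
    },
  })
);

console.log(`records: ${count}`);
```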
🧪 Testing
The library includes comprehensive TypeScript tests:
npm test

Test coverage includes:
- ✅ Parser functionality with various data types
- ✅ Stringifer functionality with validation
- ✅ Custom separators and encodings
- ✅ Stream piping and event handling
- ✅ Error handling and edge cases
- ✅ Performance benchmarks
- ✅ ESM/CJS compatibility
🏗️ Development
# Install dependencies
npm install
# Run tests
npm test
# Build dual package (ESM + CJS)
npm run build
# Lint code
npm run lint

📄 Package Structure
dist/
├── esm/ # ES Modules build
├── cjs/ # CommonJS build
└── types/ # Shared TypeScript definitions

🤝 Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
🎯 Version History
- v3.x: TypeScript rewrite, dual package support, modern Node.js features
- v2.x: Enhanced performance and encoding support
- v1.x: Initial release with basic streaming functionality
📄 License
MIT License - see LICENSE file for details.
🆘 Support
- 📝 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 📧 Email: [email protected]
💝 Support This Project
If ObjectStream helps you build amazing applications, consider supporting its development:
Your support helps maintain and improve ObjectStream for the entire community!
Made with ❤️ by Siarhei Dudko
