@geekmidas/cache
A type-safe, flexible caching library for TypeScript applications with support for multiple cache implementations and a unified interface.
Features
- Type-safe: Full TypeScript support with generics for strongly-typed cache values
- Unified interface: Common Cache<T> interface for all implementations
- Multiple backends: In-memory, Redis (Upstash), and Expo Secure Store implementations
- Async API: Promise-based interface for consistency across implementations
- Flexible TTL: Time-to-live support where the backend allows it (see the per-implementation notes below)
- Easy testing: Simple interface makes mocking and testing straightforward
- Modular exports: Import only what you need
- React Native support: Expo Secure Store implementation for mobile apps
Installation
npm install @geekmidas/cache
Optional Dependencies
For Redis support via Upstash:
npm install @upstash/redis
For React Native support via Expo:
npm install expo-secure-store
Quick Start
In-Memory Cache
Perfect for development, testing, or simple applications:
import { InMemoryCache } from '@geekmidas/cache/memory';
const cache = new InMemoryCache<string>();
// Set a value
await cache.set('user:123', 'John Doe');
// Get a value
const userName = await cache.get('user:123'); // 'John Doe'
// Delete a value
await cache.delete('user:123');
// Check if value exists
const exists = await cache.get('user:123'); // undefined
Redis Cache (Upstash)
For production applications needing distributed caching:
import { UpstashCache } from '@geekmidas/cache/upstash';
const cache = new UpstashCache<User>(
process.env.UPSTASH_REDIS_URL!,
process.env.UPSTASH_REDIS_TOKEN!
);
// Set with TTL (1 hour)
await cache.set('user:123', { id: 123, name: 'John Doe' }, 3600);
// Get typed object
const user = await cache.get('user:123'); // User | undefined
// Delete
await cache.delete('user:123');
API Reference
Cache Interface
All cache implementations follow this interface:
interface Cache<T> {
get(key: string): Promise<T | undefined>;
set(key: string, value: T, ttl?: number): Promise<void>;
delete(key: string): Promise<void>;
}
Methods
get(key: string): Promise<T | undefined>
Retrieves a value from the cache.
- Parameters: key - the cache key
- Returns: Promise resolving to the cached value, or undefined if not found
- Example:
const value = await cache.get('my-key');
if (value !== undefined) {
  // Use the value
}
set(key: string, value: T, ttl?: number): Promise<void>
Stores a value in the cache.
- Parameters: key - the cache key; value - the value to store; ttl - optional time-to-live in seconds
- Returns: Promise resolving when the operation completes
- Example:
// Set without TTL
await cache.set('key', 'value');
// Set with 1 hour TTL
await cache.set('key', 'value', 3600);
delete(key: string): Promise<void>
Removes a value from the cache.
- Parameters: key - the cache key to delete
- Returns: Promise resolving when the operation completes
- Example:
await cache.delete('my-key');
InMemoryCache
Simple in-memory cache implementation.
import { InMemoryCache } from '@geekmidas/cache/memory';
class InMemoryCache<T> implements Cache<T>
Features
- ✅ Fast access (no network latency)
- ✅ Reference equality for objects
- ✅ No external dependencies
- ❌ No TTL support (the TTL parameter is accepted but ignored; see the sketch after the example below)
- ❌ Data lost on process restart
- ❌ Not suitable for multi-instance applications
Example
const cache = new InMemoryCache<number>();
await cache.set('counter', 42);
const count = await cache.get('counter'); // 42
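The two caveats flagged in the feature list have practical consequences: the ttl argument is silently ignored, and cached objects are not cloned. A minimal sketch of both behaviors, based on the feature notes above (the Session shape is illustrative):
import { InMemoryCache } from '@geekmidas/cache/memory';
interface Session {
  hits: number;
}
const sessions = new InMemoryCache<Session>();
const session: Session = { hits: 1 };
// The TTL argument is accepted but ignored by this backend
await sessions.set('session:abc', session, 60);
// Values are held by reference, so later mutations are visible to readers
session.hits = 2;
const cached = await sessions.get('session:abc');
console.log(cached === session); // true
console.log(cached?.hits); // 2
Clone objects before caching if you do not want callers to share mutable state.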
UpstashCache
Redis-based cache using Upstash Redis client.
import { UpstashCache } from '@geekmidas/cache/upstash';
class UpstashCache<T> implements Cache<T>
Constructor
new UpstashCache<T>(url: string, token: string)
- url: Upstash Redis URL
- token: Upstash Redis token
Features
- ✅ Persistent storage
- ✅ TTL support (defaults to 3600 seconds)
- ✅ Distributed caching
- ✅ JSON serialization/deserialization (see the round-trip note after the example below)
- ❌ Requires external Redis service
- ❌ Network latency for operations
Example
interface User {
id: number;
name: string;
email: string;
}
const cache = new UpstashCache<User>(
'https://your-redis-url.upstash.io',
'your-token'
);
await cache.set('user:1', {
id: 1,
name: 'John Doe',
email: '[email protected]'
}, 7200); // 2 hours TTL
const user = await cache.get('user:1');
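One consequence of the JSON serialization noted in the feature list is that types JSON cannot represent do not survive the round trip. A small illustration, assuming standard JSON.stringify/JSON.parse behavior (the SessionInfo shape is hypothetical):
import { UpstashCache } from '@geekmidas/cache/upstash';
interface SessionInfo {
  userId: number;
  createdAt: Date;
}
const sessions = new UpstashCache<SessionInfo>(
  process.env.UPSTASH_REDIS_URL!,
  process.env.UPSTASH_REDIS_TOKEN!
);
await sessions.set('session:abc', { userId: 1, createdAt: new Date() });
// JSON has no Date type, so createdAt is restored as an ISO string at runtime,
// even though the declared type still says Date
const restored = await sessions.get('session:abc');
console.log(typeof restored?.createdAt); // 'string'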
ExpoSecureCache
Secure storage for React Native apps using Expo SecureStore.
import { ExpoSecureCache } from '@geekmidas/cache/expo';
class ExpoSecureCache<T> implements Cache<T>
Features
- ✅ Secure, encrypted storage on device
- ✅ TTL support with automatic expiration
- ✅ Persistent across app restarts
- ✅ iOS Keychain & Android Keystore backed
- ❌ React Native/Expo only
- ❌ Limited storage capacity (varies by platform)
- ❌ Slower than in-memory cache
Example
import { ExpoSecureCache } from '@geekmidas/cache/expo';
interface AuthTokens {
accessToken: string;
refreshToken: string;
}
const cache = new ExpoSecureCache<AuthTokens>();
// Store tokens securely with 1 hour TTL
await cache.set('auth:tokens', {
accessToken: 'eyJhbGciOiJIUzI1NiIs...',
refreshToken: 'dGhpcyBpcyBhIHJlZnJlc2ggdG9rZW4...'
}, 3600);
// Retrieve tokens
const tokens = await cache.get('auth:tokens');
// Clear tokens on logout
await cache.delete('auth:tokens');
Platform Considerations
- iOS: Uses Keychain Services, data persists across app reinstalls
- Android: Uses Android Keystore, data cleared on app uninstall
- Storage Limits: Approximately 2KB per item on most devices
- Performance: Async operations can be slower than a memory cache (see the read-through sketch after this list)
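To offset the slower async reads, one common pattern is to layer an in-memory cache in front of the secure store so repeated reads within a session avoid the native bridge. A minimal read-through sketch; TieredCache is illustrative and not part of this package:
import { Cache } from '@geekmidas/cache';
import { InMemoryCache } from '@geekmidas/cache/memory';
import { ExpoSecureCache } from '@geekmidas/cache/expo';
// Illustrative wrapper: reads try the fast in-memory layer first,
// then fall back to SecureStore and backfill the memory layer.
class TieredCache<T> implements Cache<T> {
  constructor(
    private fast: Cache<T>,
    private slow: Cache<T>
  ) {}
  async get(key: string): Promise<T | undefined> {
    const hit = await this.fast.get(key);
    if (hit !== undefined) return hit;
    const value = await this.slow.get(key);
    if (value !== undefined) {
      await this.fast.set(key, value);
    }
    return value;
  }
  async set(key: string, value: T, ttl?: number): Promise<void> {
    await this.fast.set(key, value, ttl);
    await this.slow.set(key, value, ttl);
  }
  async delete(key: string): Promise<void> {
    await this.fast.delete(key);
    await this.slow.delete(key);
  }
}
const tokenCache = new TieredCache<string>(
  new InMemoryCache<string>(),
  new ExpoSecureCache<string>()
);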
Advanced Usage
Type Safety
The cache is fully type-safe with TypeScript generics:
// String cache
const stringCache = new InMemoryCache<string>();
await stringCache.set('key', 'value');
const str: string | undefined = await stringCache.get('key');
// Object cache
interface Product {
id: number;
name: string;
price: number;
}
const productCache = new InMemoryCache<Product>();
await productCache.set('product:1', { id: 1, name: 'Widget', price: 9.99 });
const product: Product | undefined = await productCache.get('product:1');
Cache Abstraction
Create cache-agnostic services:
interface UserService {
getUser(id: number): Promise<User | undefined>;
setUser(user: User): Promise<void>;
deleteUser(id: number): Promise<void>;
}
class CachedUserService implements UserService {
constructor(
private cache: Cache<User>,
private userRepository: UserRepository
) {}
async getUser(id: number): Promise<User | undefined> {
const cacheKey = `user:${id}`;
// Try cache first
let user = await this.cache.get(cacheKey);
if (user) return user;
// Fallback to repository
user = await this.userRepository.findById(id);
if (user) {
await this.cache.set(cacheKey, user, 3600); // 1 hour
}
return user;
}
async setUser(user: User): Promise<void> {
await this.userRepository.save(user);
await this.cache.set(`user:${user.id}`, user, 3600);
}
async deleteUser(id: number): Promise<void> {
await this.userRepository.delete(id);
await this.cache.delete(`user:${id}`);
}
}
// Use with any cache implementation
const service = new CachedUserService(
new InMemoryCache<User>(), // or new UpstashCache<User>(url, token)
new UserRepository()
);
Factory Pattern
Create cache instances based on configuration:
interface CacheConfig {
type: 'memory' | 'redis' | 'expo';
redis?: {
url: string;
token: string;
};
}
function createCache<T>(config: CacheConfig): Cache<T> {
switch (config.type) {
case 'memory':
return new InMemoryCache<T>();
case 'redis':
if (!config.redis) throw new Error('Redis config required');
return new UpstashCache<T>(config.redis.url, config.redis.token);
case 'expo':
return new ExpoSecureCache<T>();
default:
throw new Error(`Unsupported cache type: ${config.type}`);
}
}
const cache = createCache<User>({
type: process.env.NODE_ENV === 'production' ? 'redis' : 'memory',
redis: {
url: process.env.UPSTASH_REDIS_URL!,
token: process.env.UPSTASH_REDIS_TOKEN!,
},
});
Wrapper Pattern
Add functionality like logging or metrics:
class LoggingCache<T> implements Cache<T> {
constructor(
private cache: Cache<T>,
private logger: Logger
) {}
async get(key: string): Promise<T | undefined> {
this.logger.debug(`Cache GET: ${key}`);
const value = await this.cache.get(key);
this.logger.debug(`Cache GET result: ${value !== undefined ? 'HIT' : 'MISS'}`);
return value;
}
async set(key: string, value: T, ttl?: number): Promise<void> {
this.logger.debug(`Cache SET: ${key} (TTL: ${ttl})`);
await this.cache.set(key, value, ttl);
}
async delete(key: string): Promise<void> {
this.logger.debug(`Cache DELETE: ${key}`);
await this.cache.delete(key);
}
}
const cache = new LoggingCache(
new UpstashCache<User>(url, token),
logger
);
Testing
Mocking
The interface makes testing easy:
import { Cache } from '@geekmidas/cache';
class MockCache<T> implements Cache<T> {
private store = new Map<string, T>();
async get(key: string): Promise<T | undefined> {
return this.store.get(key);
}
async set(key: string, value: T): Promise<void> {
this.store.set(key, value);
}
async delete(key: string): Promise<void> {
this.store.delete(key);
}
}
// Use in tests
const mockCache = new MockCache<User>();
const service = new CachedUserService(mockCache, mockRepository);
Testing with Real Implementations
import { InMemoryCache } from '@geekmidas/cache/memory';
describe('UserService', () => {
let cache: Cache<User>;
let service: CachedUserService;
beforeEach(() => {
cache = new InMemoryCache<User>();
service = new CachedUserService(cache, mockRepository);
});
it('should cache user data', async () => {
const user = { id: 1, name: 'John' };
await service.setUser(user);
const cachedUser = await cache.get('user:1');
expect(cachedUser).toEqual(user);
});
});
Performance Considerations
InMemoryCache
- Pros: Very fast, no network latency
- Cons: Limited by available memory, not persistent
- Best for: Development, testing, single-instance apps
UpstashCache
- Pros: Persistent, distributed, scalable
- Cons: Network latency, external dependency
- Best for: Production, multi-instance apps, shared cache
ExpoSecureCache
- Pros: Secure, encrypted, persistent on device
- Cons: React Native only, storage limits, slower operations
- Best for: Mobile apps, sensitive data (tokens, credentials)
Key Strategies
- Use appropriate TTL: Balance between cache freshness and performance
- Consider cache size: Monitor memory usage for in-memory cache
- Implement fallback: Always have a fallback when cache is unavailable
- Use compression: For large objects, consider compression before caching (a wrapper sketch follows this list)
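As an illustration of the compression strategy, the wrapper below gzips JSON before handing it to an underlying string-valued cache. CompressedJsonCache and Report are hypothetical names, and node:zlib assumes a Node.js runtime:
import { gzipSync, gunzipSync } from 'node:zlib';
import { Cache } from '@geekmidas/cache';
import { UpstashCache } from '@geekmidas/cache/upstash';
interface Report {
  id: number;
  rows: number[][];
}
// Hypothetical wrapper: stores gzip-compressed, base64-encoded JSON
// in any string-valued cache implementation.
class CompressedJsonCache<T> implements Cache<T> {
  constructor(private inner: Cache<string>) {}
  async get(key: string): Promise<T | undefined> {
    const packed = await this.inner.get(key);
    if (packed === undefined) return undefined;
    const json = gunzipSync(Buffer.from(packed, 'base64')).toString('utf8');
    return JSON.parse(json) as T;
  }
  async set(key: string, value: T, ttl?: number): Promise<void> {
    const json = JSON.stringify(value);
    const packed = gzipSync(Buffer.from(json, 'utf8')).toString('base64');
    await this.inner.set(key, packed, ttl);
  }
  async delete(key: string): Promise<void> {
    await this.inner.delete(key);
  }
}
// Usage: wrap a string-valued cache for large payloads
const reportCache = new CompressedJsonCache<Report>(
  new UpstashCache<string>(
    process.env.UPSTASH_REDIS_URL!,
    process.env.UPSTASH_REDIS_TOKEN!
  )
);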
Error Handling
Cache operations can fail, so implement proper error handling:
async function getUserSafely(id: number): Promise<User | undefined> {
try {
const cached = await cache.get(`user:${id}`);
if (cached) return cached;
} catch (error) {
console.warn('Cache get failed, falling back to database:', error);
}
try {
const user = await database.getUser(id);
if (user) {
try {
await cache.set(`user:${id}`, user, 3600);
} catch (error) {
console.warn('Cache set failed:', error);
}
}
return user;
} catch (error) {
console.error('Database query failed:', error);
throw error;
}
}
Migration Guide
From Map to Cache
// Old way
const cache = new Map<string, User>();
cache.set('user:1', user);
const user = cache.get('user:1');
// New way
const cache = new InMemoryCache<User>();
await cache.set('user:1', user);
const user = await cache.get('user:1');
From Other Cache Libraries
Most cache libraries can be adapted to implement the Cache<T> interface:
class RedisCache<T> implements Cache<T> {
constructor(private redis: RedisClient) {}
async get(key: string): Promise<T | undefined> {
const value = await this.redis.get(key);
return value ? JSON.parse(value) : undefined;
}
async set(key: string, value: T, ttl?: number): Promise<void> {
const serialized = JSON.stringify(value);
if (ttl) {
await this.redis.setex(key, ttl, serialized);
} else {
await this.redis.set(key, serialized);
}
}
async delete(key: string): Promise<void> {
await this.redis.del(key);
}
}
Contributing
- Follow the existing code style (2 spaces, single quotes, semicolons)
- Add comprehensive tests for new features
- Ensure all implementations follow the Cache<T> interface
- Update documentation for API changes
- Use the "Integration over Unit" testing philosophy
License
MIT License - see the LICENSE file for details.
