Breath Core
A headless, framework-agnostic library for real-time Breath Rate and Heart Rate monitoring using standard webcams.
Built with MediaPipe for high-performance face and pose tracking in the browser.
🚀 Features
- Breathing Rate: Tracks chest/shoulder displacement to detect inhalation and exhalation.
- Heart Rate (rPPG): Extracts the photoplethysmographic signal (blood volume pulse) from the facial skin using the Green channel.
- Stress Analysis: Derives a stress estimate from Heart Rate Variability (HRV) and the breathing rate.
- Privacy First: All processing happens locally in the browser via WebAssembly (WASM). No video data is sent to the cloud.
📦 Installation
```
npm install breath-core
```
(Note: requires a bundler such as Vite, Webpack, or Rollup.)
🛠 Usage
```js
import { BreathTracker } from 'breath-core';

// 1. Initialize Tracker
const tracker = new BreathTracker({
  mode: 'accurate', // 'accurate' (full model) or 'performance' (lite)
  sampleRate: 30    // Expected video FPS
});

// 2. Listen for Events
tracker.on('breath', (metrics) => {
  console.log(`BPM: ${metrics.breathing.rate}`);
  console.log(`State: ${metrics.breathing.state}`); // 'inhale', 'exhale', 'hold'
  console.log(`Heart Rate: ${metrics.heart.rate}`);
});

// 3. Initialize & Start Loop
await tracker.initialize();

const videoElement = document.getElementById('myVideo');

async function loop() {
  await tracker.processFrame(videoElement);
  requestAnimationFrame(loop);
}
loop();
```

Architecture
The library is designed with a modular, pipeline-based architecture.
- BreathTracker (src/tracker.js): The main public-facing class that orchestrates the entire process. It manages state, runs the processing loop, and emits the final metrics.
- VisionEngine (src/core/vision.js): A dedicated wrapper for the MediaPipe Vision tasks. It initializes the FaceLandmarker and PoseLandmarker models and handles the detection calls on each video frame.
- SignalAnalyzer (src/core/analyzer.js): Converts the raw landmarks from VisionEngine into physiological signals.
  - It calculates the average Y-position of the shoulders for the breathing motion signal.
  - It computes the bounding box of the cheek regions and averages the Green channel value inside it for the rPPG signal.
- dsp/: A directory of pure Digital Signal Processing utilities.
  - filters.js: A first-order IIR (Infinite Impulse Response) bandpass filter to isolate the desired frequencies.
  - analysis.js: Functions for calculating frequency from a signal buffer (calculateFrequency) and Heart Rate Variability (calculateHRV).
  - buffer.js: A simple RollingBuffer class to hold time-series data.
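To make the data flow concrete, here is a minimal sketch of the kinds of utilities dsp/ provides. The RollingBuffer shape and the RMSSD-style HRV estimate below are illustrative assumptions, not the library's exact code:

```js
// Sketch of a fixed-size rolling buffer for time-series samples.
class RollingBuffer {
  constructor(capacity) {
    this.capacity = capacity;
    this.samples = [];
  }
  // Append a sample, evicting the oldest once the buffer is full.
  push(value) {
    this.samples.push(value);
    if (this.samples.length > this.capacity) this.samples.shift();
  }
  // Current window as a plain array, oldest first.
  values() {
    return this.samples.slice();
  }
}

// One common HRV metric is RMSSD: the root mean square of successive
// differences between inter-beat intervals (in milliseconds).
function rmssd(ibiMs) {
  if (ibiMs.length < 2) return 0;
  let sum = 0;
  for (let i = 1; i < ibiMs.length; i++) {
    const d = ibiMs[i] - ibiMs[i - 1];
    sum += d * d;
  }
  return Math.sqrt(sum / (ibiMs.length - 1));
}

// e.g. a 10-second window of shoulder-Y samples at 30 FPS:
const breathSignal = new RollingBuffer(300);
```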
🧠 Algorithmic Principles
This library implements a 3-stage pipeline to convert raw video pixels into physiological metrics.
1. Vision Engine (MediaPipe)
We utilize the MediaPipe Face Landmarker and Pose Landmarker (2024/2025 Tasks API) to extract stable keypoints.
- Pose: We use the pose_landmarker_full model to track the Y-axis displacement of the shoulders (indices 11 & 12).
- Face: We extract a dense mesh to identify the cheek regions.
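VisionEngine hides the model setup, but for orientation this is roughly how the underlying MediaPipe Tasks API is driven. The CDN and model URLs below are illustrative, not necessarily the ones the library uses:

```js
import { FilesetResolver, PoseLandmarker } from '@mediapipe/tasks-vision';

// Load the WASM runtime, then the full-precision pose model.
const fileset = await FilesetResolver.forVisionTasks(
  'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm'
);
const poseLandmarker = await PoseLandmarker.createFromOptions(fileset, {
  baseOptions: {
    modelAssetPath: 'https://storage.googleapis.com/mediapipe-models/pose_landmarker/pose_landmarker_full/float16/1/pose_landmarker_full.task'
  },
  runningMode: 'VIDEO'
});

// Indices 11 and 12 are the left and right shoulders.
const videoElement = document.getElementById('myVideo');
const result = poseLandmarker.detectForVideo(videoElement, performance.now());
const leftShoulder = result.landmarks[0]?.[11];
const rightShoulder = result.landmarks[0]?.[12];
```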
2. Signal Extraction
- Breathing (Motion-based): We track the vertical translation of the shoulder girdle. The signal is inverted (up = inhale) and amplified; see the sketch after this list.
- Heart Rate (rPPG): We implement the method proposed by Li et al. [2] and Wang et al. [1].
- Region of Interest (ROI): We dynamically compute a bounding box for the cheek regions using facial landmarks.
- Signal: We average the Green Channel intensity across all pixels in this ROI. Hemoglobin absorbs green light strongly, making this channel most responsive to blood volume changes [4].
- Optimization: Unlike simple point-sampling, we average hundreds of pixels to improve the Signal-to-Noise Ratio (SNR), allowing detection even with standard webcams.
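To make step 2 concrete, here is a minimal, hypothetical version of both extraction steps; it assumes the current frame has already been drawn to a canvas 2D context (ctx) and that pose landmarks use MediaPipe's normalized coordinates:

```js
// Breathing signal: mean shoulder height. Normalized Y grows downward,
// so the value is negated to make "up = inhale".
function shoulderSignal(poseLandmarks) {
  const left = poseLandmarks[11];
  const right = poseLandmarks[12];
  return -(left.y + right.y) / 2;
}

// rPPG signal: mean Green-channel intensity over a cheek ROI.
// roi is { x, y, width, height } in canvas pixel coordinates.
function greenMean(ctx, roi) {
  const { data } = ctx.getImageData(roi.x, roi.y, roi.width, roi.height);
  let sum = 0;
  for (let i = 1; i < data.length; i += 4) sum += data[i]; // G of RGBA
  return sum / (data.length / 4);
}
```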
3. DSP (Digital Signal Processing)
Raw signals are noisy, so we clean them in two stages (sketched in code after this list):
- Bandpass Filter:
- Breath: 0.1Hz - 0.5Hz (6 - 30 BPM)
- Heart: 0.7Hz - 3.0Hz (42 - 180 BPM)
- Zero-Crossing Detection:
- We detect phases (Inhale/Exhale) by tracking signal direction changes.
- State detection uses a hysteresis threshold (~1% of signal amplitude) to prevent jitter while maintaining responsiveness.
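A plain-JavaScript sketch of this stage follows. filters.js may implement the bandpass differently; cascading a one-pole low-pass with a one-pole high-pass, as below, is one standard way to build a first-order bandpass, and the fixed threshold assumes a roughly unit-amplitude signal:

```js
// First-order bandpass as a one-pole low-pass followed by a one-pole
// high-pass, using the standard RC coefficient form.
function makeBandpass(lowHz, highHz, sampleRate) {
  const dt = 1 / sampleRate;
  const aLow = dt / (1 / (2 * Math.PI * highHz) + dt);  // LP cutoff = highHz
  const aHigh = dt / (1 / (2 * Math.PI * lowHz) + dt);  // HP cutoff = lowHz
  let lp = 0, hp = 0, prevLp = 0;
  return (x) => {
    lp += aLow * (x - lp);                   // low-pass stage
    hp = (1 - aHigh) * (hp + lp - prevLp);   // high-pass stage
    prevLp = lp;
    return hp;
  };
}

// Hysteresis keeps the reported phase stable near zero: the state only
// flips once the filtered signal clears the threshold in either direction.
function makePhaseDetector(threshold) {
  let state = 'hold';
  return (x) => {
    if (x > threshold) state = 'inhale';
    else if (x < -threshold) state = 'exhale';
    return state;
  };
}

const breathFilter = makeBandpass(0.1, 0.5, 30); // 6-30 BPM at 30 FPS
const detectPhase = makePhaseDetector(0.01);
// Per frame: const state = detectPhase(breathFilter(rawShoulderY));
```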
💻 Development
Running the Demo App
The project includes a vanilla JavaScript demo application in the /frontend_html directory.
- Install dependencies: npm install
- Run the dev server: npm run dev

This will open the demo app in your browser (usually at http://localhost:5173).
Running Tests
Unit tests for the DSP and core logic are located in the /tests directory.
```
npm run test
```

📚 References
1. Wang, W., den Brinker, A. C., Stuijk, S., & de Haan, G. (2017). "Algorithmic Principles of Remote PPG," IEEE Transactions on Biomedical Engineering, 64(7), 1479–1491.
2. Li, X., Chen, J., Zhao, G., & Pietikäinen, M. (2014). "Remote Heart Rate Measurement From Face Videos Under Realistic Situations," CVPR, 4264–4271.
3. Rouast, P. V., et al. (2018). "Remote heart rate measurement using low-cost RGB face video: a technical literature review," Frontiers of Computer Science, 12(5), 858–872.
4. Verkruysse, W., Svaasand, L. O., & Nelson, J. S. (2008). "Remote plethysmographic imaging using ambient light," Optics Express, 16(26), 21434–21445.
📄 License
ISC
