# ozon-grabber

v1.1.0
CLI that scans your Ozon orders/returns via a real Chrome session and submits the parsed data to a SQLite-backed service.
The CLI launches Chrome through chrome-devtools-mcp, navigates the order/return pages it has access to, dumps each page's HTML, parses it locally with jsdom, and POSTs the result to a configurable backend (the companion service in ozon-orders-history_v2/backend).
## Quick start
```sh
# 1. Log in once: Chrome opens; log in to Ozon, then press Ctrl+C
npx ozon-grabber login

# 2. Scan orders. user-id and backend-url are picked up from env;
#    --start-order is fetched from the backend (auto-resume).
npx ozon-grabber start --verbose

# 3. Scan returns
npx ozon-grabber start --returns --verbose
```

## Configuration
Set these once in your shell profile (e.g. `~/.zshrc`):

```sh
export OZON_USER_ID=12345678
export OZON_BACKEND_URL=http://localhost:3000  # or your remote service
```

Both can also be passed as flags (`--user-id`, `--backend-url`); flags override env.
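The flag-over-env precedence can be sketched like this. `resolveOption` is a hypothetical helper for illustration, not part of the published CLI:

```javascript
// Sketch of the precedence described above: flag > env var > built-in default.
// resolveOption is a hypothetical name; the real CLI's internals may differ.
function resolveOption(flagValue, envValue, defaultValue) {
  if (flagValue !== undefined && flagValue !== null) return flagValue;
  if (envValue !== undefined && envValue !== '') return envValue;
  return defaultValue;
}

// Example: no --backend-url flag, so OZON_BACKEND_URL wins if set,
// otherwise the documented default applies.
const backendUrl = resolveOption(
  undefined,                     // no --backend-url flag passed
  process.env.OZON_BACKEND_URL,  // may be unset
  'http://localhost:3000'
);
```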
## Flags
| Flag | Default | Purpose |
|---|---|---|
| --user-id | $OZON_USER_ID | Ozon account id (required if env unset) |
| --start-order | from GET /scan-state/<user-id> | Where to begin. Omit to resume |
| --max-orders | unlimited | Cap how many to scan in this run |
| --returns | false | Scan returns (/my/returnDetails) instead of orders |
| --output <path> | stdout | Where to write the JSON summary |
| --backend-url | $OZON_BACKEND_URL or http://localhost:3000 | Submit URL |
| --backend / --no-backend | true | Submit POST per parsed result (or skip with --no-backend) |
| --user-data-dir | ./chrome-profile | Chrome profile directory (cookies + login persisted here) |
| --verbose | false | Per-order log lines |
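The auto-resume behaviour behind `--start-order` can be sketched as below. The `GET /scan-state/<user-id>` endpoint comes from the table above; the response shape (`{ startOrder }`) and the function itself are assumptions for illustration:

```javascript
// Hedged sketch: when --start-order is omitted, ask the backend where the
// last run stopped. The { startOrder } response shape is an assumption.
// fetchFn is injectable so the network call can be stubbed in tests.
async function fetchStartOrder(backendUrl, userId, fetchFn = fetch) {
  const res = await fetchFn(`${backendUrl}/scan-state/${userId}`);
  if (!res.ok) return undefined; // no saved state: start from the newest order
  const state = await res.json();
  return state.startOrder;
}
```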
## Notes
- The Chrome profile (`chrome-profile/` by default) carries your Ozon cookies. Don't share it.
- First navigation can hit the Ozon anti-bot challenge; the CLI auto-retries `Navigation timeout` errors up to three times.
- `--no-backend` writes JSON to `--output` (or stdout) and skips submission. Useful for offline parsing.
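The timeout-retry note above can be sketched as a small helper. `withRetries` is a hypothetical name; the real CLI's retry logic may differ:

```javascript
// Minimal sketch of "auto-retry Navigation timeout up to three times".
// Only timeout errors are retried; anything else is rethrown immediately.
async function withRetries(fn, attempts = 3) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (!/Navigation timeout/.test(String(err))) throw err;
    }
  }
  throw lastErr;
}
```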
## License
MIT
