
@iflow-mcp/cargofy-atlas

v1.0.1


ATLAS — AI Transport Logistics Agent Standard. MCP server for logistics companies.


ATLAS — AI Transport Logistics Agent Standard

The open-source MCP server that gives AI agents deep context about your logistics operations — without your data ever leaving your infrastructure.

License: Apache 2.0 · MCP Compatible · Docker


The Problem

Enterprise logistics companies have years of operational data — emails, contracts, TMS records, carrier relationships, pricing history. AI agents need this context to be useful. But sharing raw data with external cloud services is a non-starter for compliance, legal, and security teams.

The result: AI stays shallow. Agents can't negotiate from context. Every interaction starts from zero.

The Solution

ATLAS runs inside your security perimeter. It connects to your existing systems, indexes your data locally, and exposes a standardized MCP interface. Any AI agent can query ATLAS — getting deep operational context — without your data ever leaving your infrastructure.

```
[Your Company]                        [Cargofy / Any AI Agent]
  ├── Email                                      │
  ├── TMS                    MCP Protocol        │
  ├── ERP          ←─────────────────────────────┤
  ├── Contracts              (questions only,    │
  ├── Knowledge Base          no raw data out)   │
  └── ATLAS instance ────────────────────────────┘
```

Your data stays with you. Agents get the context they need.


Quick Start

Option 1: Docker (recommended)

```sh
docker run -p 3000:3000 cargofy/atlas
```

Open http://localhost:3000 — the Setup Wizard will guide you through initial configuration.

For production use with persistent data and AI features, see DOCKER.md.

Option 2: Claude Desktop

Add to your claude_desktop_config.json under mcpServers:

```json
{
  "atlas": {
    "command": "docker",
    "args": ["run", "--rm", "-i", "cargofy/atlas", "node", "src/index.js"]
  }
}
```
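If your `claude_desktop_config.json` is otherwise empty, the complete file wrapping the entry above would look like this:

```json
{
  "mcpServers": {
    "atlas": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "cargofy/atlas", "node", "src/index.js"]
    }
  }
}
```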

Option 3: Run from source

```sh
git clone https://github.com/cargofy/ATLAS
cd ATLAS
npm install
cp config.example.yml config.yml   # edit with your settings
node seed.js                       # optional: load sample data
node src/ui-server.js              # Web UI + API on port 3000
# or
node src/index.js                  # MCP server on stdio
```

Web UI

ATLAS ships with a full web interface at http://localhost:3000:

| Page | Description |
|------|-------------|
| Dashboard | Server status, record counts, connector health, SLA violations |
| Explorer | Browse data models, execute queries with visual filters |
| Chat | Conversational AI interface with tool calling for logistics queries |
| Playground | Test MCP tools directly from the browser |
| Knowledge Base | Manage enterprise knowledge files (markdown, folders, CRUD) |
| Import | Upload data files (JSON, CSV, XLSX), seed database, import from folders |
| Connectors | View and manage data source configurations, trigger manual sync |
| Modules | Enable/disable plugins, trigger sync, view module status |
| Settings | Visual + YAML config editor with live reload |
| Setup Wizard | First-run configuration (AI provider, security, instance name) |


MCP Tools

ATLAS exposes 35 MCP tools via the Model Context Protocol. Any MCP-compatible agent can connect:

| Category | Tools |
|----------|-------|
| Discovery | get_available_models, get_schema, get_available_carriers, get_available_lanes, get_available_document_types, get_sync_status |
| Query | get_records, query (natural language search across all data) |
| Shipments | get_shipment, get_shipments, get_shipment_events, get_unsigned_documents, get_closure_checklist |
| Carriers | search_carriers, get_carrier_shipments |
| Rates | get_rate_history |
| Documents | list_documents |
| Operations | get_sla_violations, get_idle_assets, get_anomalies, get_active_issues (20+ disruption types) |
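As a loose illustration of how such tools behave, here is a toy dispatcher in the style of an MCP tool handler. The handler bodies and the in-memory `db` are invented for this sketch and are not ATLAS's actual implementation:

```javascript
// Toy MCP-style tool dispatch: a name → handler map, as an MCP server
// might route incoming tool calls. Everything here is illustrative.
const tools = {
  get_shipment: ({ id }, db) =>
    db.shipments.find((s) => s.id === id) ?? null,
  get_shipments: ({ status } = {}, db) =>
    db.shipments.filter((s) => !status || s.status === status),
};

function callTool(name, args, db) {
  const handler = tools[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(args, db);
}

// Tiny in-memory stand-in for the ATLAS storage layer
const db = {
  shipments: [
    { id: "SH-1", status: "in_transit", mode: "road" },
    { id: "SH-2", status: "delivered", mode: "ocean" },
  ],
};

console.log(callTool("get_shipment", { id: "SH-1" }, db).mode); // road
```

A real client would reach these tools over the MCP stdio or HTTP/SSE transport rather than calling them directly.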


AI Features

ATLAS supports multiple AI providers with role-based model routing:

| Provider | Models | Use |
|----------|--------|-----|
| Anthropic | Claude Sonnet/Opus/Haiku | Chat, extraction, knowledge enrichment |
| OpenAI | GPT-4o, GPT-4o-mini | Chat, extraction |
| Ollama | Any local model | Fully offline operation |

AI capabilities:

  • Entity extraction — upload any logistics document (PDF, CSV, XLSX, email) and extract structured data (shipments, carriers, rates, documents)
  • Knowledge enrichment — AI automatically updates your knowledge base from extracted data, detecting contradictions and appending new facts
  • Chat with tools — conversational interface that queries your data using MCP tools
  • Role routing — assign different models to different tasks (chat, extraction, knowledge)
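Role routing can be pictured as a small lookup from task role to provider and model. The role names mirror the capability list above, but the provider/model strings below are illustrative, not ATLAS's shipped defaults:

```javascript
// Hypothetical role-based model routing: each task role gets its own
// provider/model pair. Values are examples only.
const roleModels = {
  chat: { provider: "anthropic", model: "claude-sonnet" },
  extraction: { provider: "openai", model: "gpt-4o-mini" },
  knowledge: { provider: "ollama", model: "llama3" },
};

function modelFor(role) {
  // Unknown roles fall back to the chat model
  return roleModels[role] ?? roleModels.chat;
}

console.log(modelFor("extraction").model); // gpt-4o-mini
```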

Data Models

Core Models (always enabled)

| Model | Description |
|-------|-------------|
| Shipments | Ocean, air, road, rail, multimodal — status, mode, route, carrier, planned delivery |
| Carriers | Profiles, type (trucking, shipping line, airline, rail, broker), country, rating |
| Lanes | Origin → destination pairs with mode and average transit days |
| Rates | Freight pricing by carrier, lane, mode, date range |
| Documents | BOL, CMR, AWB, invoice, customs, POD, packing list, certificate of origin |
| Tracking Events | Pickup, transit, delivery, exception events with location and geolocation |
| Service Levels | Planned transit times per lane/mode/service type |

Extension Models (opt-in via config)

Assets, Drivers, Transport Orders, Facilities, Tenders, Tender Quotes, Tender Awards, Dispatches, Legs, Customs Entries


Connectors

| Connector | Status | Description |
|-----------|--------|-------------|
| REST API | Available | Generic REST with JSONPath mapping, bearer/basic/api_key auth |
| Filesystem | Available | Local JSON, CSV, TXT, MD files (optional PDF/DOCX/XLSX) |
| AI Extract | Available | Upload files for AI-powered entity extraction |
| Email (IMAP/Exchange) | v0.2 | Indexes logistics-related emails |
| SAP TM | Coming soon | SAP Transportation Management |
| Oracle TMS | Coming soon | Oracle Transportation Management |
| Transporeon | Coming soon | Transporeon platform integration |
| project44 | Coming soon | Visibility and tracking data |
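The REST connector's "JSONPath mapping" idea can be sketched as follows. `resolvePath` is a toy dotted-path resolver standing in for a real JSONPath library, and the mapping keys and payload are invented, not the connector's actual configuration format:

```javascript
// Toy field mapping: pull values out of an API payload by path and
// assemble a normalized record. Illustrative only.
function resolvePath(obj, path) {
  return path
    .replace(/^\$\.?/, "")       // strip the leading "$." root marker
    .split(".")
    .filter(Boolean)
    .reduce((cur, key) => (cur == null ? undefined : cur[key]), obj);
}

const mapping = { id: "$.data.id", status: "$.data.state" };
const payload = { data: { id: "SH-9", state: "in_transit" } };

const record = Object.fromEntries(
  Object.entries(mapping).map(([field, path]) => [field, resolvePath(payload, path)])
);
console.log(record); // { id: 'SH-9', status: 'in_transit' }
```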

Modules (Plugins)

| Module | Description |
|--------|-------------|
| file-watch | Monitor a local folder for new files, auto-process through AI extraction pipeline |
| knowledge-enricher | Automatically enrich knowledge base from AI extractions |
| google-drive | Sync files from Google Drive folders with AI analysis (Docs/Sheets export, recursion) |

Modules are enabled/disabled via config.yml or the Modules page in Web UI.
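As an illustration only — the key names below are a guess at a plausible shape, so consult config.example.yml for the actual schema — enabling modules in config.yml might look like:

```yaml
# Illustrative shape only; see config.example.yml for the real keys.
modules:
  file-watch:
    enabled: true
    path: ./inbox          # hypothetical: folder to watch
  knowledge-enricher:
    enabled: true
  google-drive:
    enabled: false
```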


Architecture

```
ATLAS Instance (your infrastructure)
├── AI Layer
│   ├── Multi-provider LLM client (Claude, OpenAI, Ollama)
│   ├── Entity extraction pipeline
│   ├── Knowledge engine (enrichment + contradiction detection)
│   └── Chat with tool calling
├── Module System (plugin architecture)
│   ├── file-watch
│   ├── knowledge-enricher
│   └── google-drive
├── Connectors
│   ├── REST API connector
│   ├── Filesystem connector
│   └── AI extraction connector
├── Storage Layer
│   ├── SQLite (default) or PostgreSQL
│   └── Knowledge base (markdown files)
├── MCP Server
│   ├── stdio transport (CLI / Claude Desktop)
│   └── HTTP/SSE transport (remote agents)
└── Web UI + REST API
    ├── Dashboard, Explorer, Chat, Playground
    ├── Knowledge Base manager
    ├── Settings, Modules, Import
    └── Setup Wizard
```

Security & Privacy

  • Zero data egress — ATLAS never sends your raw data outside your network
  • Local AI option — run Ollama for fully offline operation
  • Bearer token auth — scoped permissions (read/write) per token
  • Non-root Docker — runs as unprivileged atlas user
  • Audit logs — full log of every query made to your instance
  • Open source — inspect every line of code
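A minimal sketch of what per-token scoped permissions look like in general. The token store, token names, and scope model here are hypothetical, not ATLAS's actual auth code:

```javascript
// Hypothetical scoped bearer-token check: each token carries a scope
// list, and a request is allowed only if its token holds the scope.
const tokens = new Map([
  ["tok_read_only", { scopes: ["read"] }],
  ["tok_admin", { scopes: ["read", "write"] }],
]);

function authorize(authHeader, requiredScope) {
  const token = (authHeader ?? "").replace(/^Bearer\s+/i, "");
  const entry = tokens.get(token);
  return Boolean(entry && entry.scopes.includes(requiredScope));
}

console.log(authorize("Bearer tok_read_only", "write")); // false
console.log(authorize("Bearer tok_admin", "write"));     // true
```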

Use Cases

Carrier Negotiation Agent

Agent queries ATLAS: "What's our volume with DHL on DE→PL in Q4?" → Gets answer from your own data → Negotiates from a position of knowledge.

Customer Service Agent

"Where is shipment #12345?" → Agent queries ATLAS for shipment status from your TMS → Answers instantly without manual lookup.

Procurement Agent

"Who are the top 3 carriers for refrigerated transport to Ukraine?" → Agent pulls from your historical performance data in ATLAS → Makes data-driven recommendation.


Powered by Cargofy

ATLAS is built and maintained by Cargofy — the AI platform for logistics. We built ATLAS because our enterprise customers needed it. We open-sourced it because the logistics industry needs a standard.

The Cargofy platform connects to your ATLAS instance to provide:

  • AI agents that make calls, send messages, negotiate on your behalf
  • Analytics and reporting on top of your ATLAS data
  • Managed ATLAS hosting (if you prefer not to self-host)
  • Enterprise connectors and SLA support

Learn more about Cargofy


Contributing

ATLAS is Apache 2.0 licensed. Contributions welcome.

```sh
git clone https://github.com/cargofy/atlas
cd atlas
npm install
npm run dev
```

See CONTRIBUTING.md for guidelines.


License

Apache License 2.0 — see LICENSE


Listed In

ATLAS has been submitted to the following MCP directories and lists:

  • punkpeye/awesome-mcp-servers
  • appcypher/awesome-mcp-servers
  • wong2/awesome-mcp-servers
  • modelcontextprotocol/servers
  • PulseMCP
  • MCP Index
  • Cursor Directory

Submit, discover, and explore MCP servers in the ecosystem.