
@langwatch/server

v3.2.1


Run LangWatch locally with one command — `npx @langwatch/server`. Complete LangWatch stack: observability, evaluations, AI gateway, agent simulations.


Why LangWatch?

The platform for LLM evaluations and AI agent testing. We help teams test, simulate, evaluate, and monitor LLM-powered agents end-to-end — before release and in production. Built for teams that need regression testing, simulations, and production observability without building custom tooling.

LangWatch gives you full visibility into agent behavior and the tools to systematically improve reliability, performance, and cost, while keeping you in control of your AI system.

Getting Started

Cloud ☁️

The easiest way to get started with LangWatch.

Create a free account → create a project → copy your API key.

Local setup 💻

The fastest way to run LangWatch locally — only Node.js required:

npx @langwatch/server

The CLI installs uv, postgres, redis, clickhouse, and the AI gateway binary into ~/.langwatch/, scaffolds a .env with locally-generated secrets, then starts every service in parallel and opens http://localhost:5560. Everything lives under ~/.langwatch/; rm -rf ~/.langwatch is a clean reset.

Prefer Docker? You can still use docker compose:

git clone https://github.com/langwatch/langwatch.git
cd langwatch
cp langwatch/.env.example langwatch/.env
docker compose up -d --wait --build

Once running, LangWatch will be available at http://localhost:5560, where you can create your first project and API key.

Deployment options ⚓️

Run LangWatch on your own infrastructure:

A middle ground for companies with strict data residency and control requirements that don't need to go fully on-prem.

Read more about it on our docs.

You can also run LangWatch locally without Docker to develop and contribute to the project.

Start just the databases using Docker and leave them running:

docker compose up redis postgres opensearch

Then, on another terminal, install the dependencies and start LangWatch:

make install
make start

🚀 Quick Start

Ship safer agents in minutes. Create a free account, then dive into these guides:

🗺️ Integrations

LangWatch builds and maintains several integrations listed below. Our tracing platform is built on top of OpenTelemetry, so we support any OpenTelemetry-compatible library out of the box.

Frameworks:
LangChain · LangGraph · Vercel AI SDK · Mastra · CrewAI · Google ADK

Model Providers:
OpenAI · Anthropic · Azure · Google Cloud · AWS · Groq · Ollama

Platforms:
LangFlow · Flowise · n8n

and many more…
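Because the tracing platform is built on OpenTelemetry, any OTel SDK can be pointed at a LangWatch instance using the standard exporter environment variables. A minimal sketch, assuming a locally running instance from the setup above; the `/api/otel` endpoint path and the bearer-token header format are illustrative assumptions, not documented LangWatch values:

```shell
# Standard OpenTelemetry exporter configuration (variable names are
# defined by the OpenTelemetry specification). The endpoint path and
# auth header below are assumptions; check the LangWatch docs for the
# actual values for your instance.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:5560/api/otel"            # assumed path
export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer ${LANGWATCH_API_KEY:-}" # assumed header
export OTEL_SERVICE_NAME="my-agent"                                             # any name identifying your service
```

With these set, most OpenTelemetry SDKs export traces without any code changes, which is what makes any OTel-compatible library work out of the box.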

Are you using a platform that could benefit from a direct LangWatch integration? We'd love to hear from you. Please fill out this quick form.

💬 Support

Have questions or need help? We're here to support you in multiple ways:

  • Documentation: Our comprehensive documentation covers everything from getting started to advanced features.
  • Discord Community: Join our Discord server for real-time help from our team and community.
  • X (Twitter): Follow us on X for updates and announcements.
  • GitHub Issues: Report bugs or request features through our GitHub repository.
  • Enterprise Support: Enterprise customers receive priority support with dedicated response times. Our pricing page contains more information.

🤝 Collaborating

Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

Please read our Contribution Guidelines for details on our code of conduct, and the process for submitting pull requests.

✍️ License

Please read our LICENSE.md file.

👮‍♀️ Security + Compliance

As a platform with access to data that is likely to be sensitive, we take security extremely seriously and treat it as a core part of our culture.

| Legal Framework | Current Status |
| --------------- | ------------------------------------------------------------------------------ |
| GDPR | Compliant. DPA available upon request. |
| ISO 27001 | Certified. Certification report available upon request on our Enterprise plan. |

Please refer to our Security page for more information. Contact us at [email protected] if you have any further questions.

Vulnerability Disclosure

To responsibly disclose a security vulnerability, email [email protected], or, if you prefer, reach out privately to one of our team on Discord.