
@asksakina/islamic-knowledge-mcp

v1.0.1


Verified Islamic knowledge MCP server — Quranic verses, authenticated du'as, and the 99 Names of Allah. Scholar-reviewed across mainstream Sunni schools (Hanafi, Maliki, Shafi'i, Hanbali). Provides three lookup tools to AI agents over the Model Context Protocol.


Sakina Islamic Knowledge MCP Server v1

Standalone TypeScript MCP server exposing Sakina's Gem-reviewed Islamic knowledge to external AI agents via the Model Context Protocol.

Architectural posture: "Sakina ships records, agents ship answers." The server is a reference library, not an advisor. Every response includes a _sakina_meta envelope with disclaimer, LLM directives, presentation contract, and educational context. What the calling agent does with the record is its responsibility — but every response arms the agent with enough structural context to make mishandling difficult and trackable.

Tools

get_quran_verse

Verbatim Quranic verse lookup by surah:ayah.

// Request
{ "surah": 2, "ayah": 186, "locale": "en" }

Returns the canonical Sakina envelope with Arabic text, translation, surah name (Arabic + English), and the quran presentation contract (no paraphrasing, surah:ayah citation required, "Translation of the Meaning" labelling).

get_dua

Returns du'as matching a life context (anxiety, grief, morning, travel, …). Context is resolved against canonical category slugs, an alias map, and tag dimensions on each category. If the context contains a crisis keyword (matched against the main app's detectCrisis keyword list), the response includes a mandatory crisis_resource block.

// Request
{ "context": "anxiety" }
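
The resolution order described above (canonical slug, then alias map, with crisis detection running independently) can be sketched as follows. The slugs, aliases, and crisis keywords here are illustrative placeholders, not Sakina's actual lists, and tag-dimension matching is omitted:

```typescript
// Hypothetical sketch of get_dua context resolution; data is illustrative.
const CATEGORY_SLUGS = new Set(["anxiety", "grief", "morning", "travel"]);
const ALIASES: Record<string, string> = { worried: "anxiety", mourning: "grief" };
const CRISIS_KEYWORDS = ["suicide", "self-harm"];

function resolveContext(context: string): { slug: string | null; crisis: boolean } {
  const normalized = context.trim().toLowerCase();
  // 1. canonical slug, 2. alias map (tag dimensions omitted in this sketch)
  const slug = CATEGORY_SLUGS.has(normalized)
    ? normalized
    : ALIASES[normalized] ?? null;
  // Crisis detection runs regardless of whether a category matched.
  const crisis = CRISIS_KEYWORDS.some((k) => normalized.includes(k));
  return { slug, crisis };
}
```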

get_name_of_allah

Look up by number (1–99) or string (transliteration, Arabic, or English meaning). Returns Arabic, transliteration, locale-aware meaning, reflection, and Quranic references.

// Request
{ "number": 29 }
// or
{ "name": "Al-Hakam" }

Resource

sakina://about

Read this resource before using the tools. Contains the seven core directives plus the three Gem-cleared educational dawah texts (Quran preservation, hadith grading, madhab attribution).

Response envelope

Every tool returns the same shape:

{
  "_sakina_meta": {
    "version": "1.0",
    "source": "Sakina Islamic Knowledge Server (asksakina.com)",
    "content_type": "quran_verse" | "dua_collection" | "name_of_allah" | "not_found",
    "disclaimer": "Sakina provides verified Islamic reference content for educational purposes. ...",
    "llm_directives": { "CRITICAL_RULES": ["…"] },
    "presentation_contract": { "quran": { … } | "hadith": { … } | "name_of_allah": { … } },
    "educational_context": "…"
  },
  "content": { /* tool-specific record */ },
  // get_dua only, when crisis keywords detected:
  "crisis_resource": { "directive": "…", "text": "…" }
}
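
For clients in TypeScript, the shape above can be restated as a type plus a minimal runtime guard. Field names follow the README; the type names and the guard itself are illustrative, not part of the published package:

```typescript
// Hedged TypeScript restatement of the Sakina response envelope.
type ContentType = "quran_verse" | "dua_collection" | "name_of_allah" | "not_found";

interface SakinaEnvelope {
  _sakina_meta: {
    version: string;
    source: string;
    content_type: ContentType;
    disclaimer: string;
    llm_directives: { CRITICAL_RULES: string[] };
    presentation_contract: Record<string, unknown>;
    educational_context: string;
  };
  content: unknown;                                       // tool-specific record
  crisis_resource?: { directive: string; text: string };  // get_dua only
}

// Minimal check a client might apply before trusting a response.
function hasSakinaMeta(value: unknown): value is SakinaEnvelope {
  return (
    typeof value === "object" &&
    value !== null &&
    "_sakina_meta" in value &&
    "content" in value
  );
}
```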

Connecting from an MCP client

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%/Claude/claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "sakina-islamic-knowledge": {
      "command": "npx",
      "args": ["-y", "@asksakina/islamic-knowledge-mcp"]
    }
  }
}

Restart Claude Desktop. The three tools (get_quran_verse, get_dua, get_name_of_allah) and the sakina://about resource appear in the tool picker.

VS Code (with an MCP-aware extension)

Add to .vscode/mcp.json in your workspace:

{
  "servers": {
    "sakina-islamic-knowledge": {
      "command": "npx",
      "args": ["-y", "@asksakina/islamic-knowledge-mcp"]
    }
  }
}

Remote (Streamable HTTP)

If you want to connect to the public hosted instance instead of running locally:

{
  "mcpServers": {
    "sakina-islamic-knowledge": {
      "type": "streamable-http",
      "url": "https://sakina-mcp.fly.dev/mcp"
    }
  }
}

(Substitute the actual deployed URL — mcp.asksakina.com once the custom domain is provisioned.)

Running locally for development

cd mcp-server
npm install
npm run bundle-data     # snapshot data from main repo into data/
npm run typecheck       # tsc --noEmit
npm run start           # HTTP server on :3030 via tsx
npm run test:tools      # exercise each tool, prints sample JSON

# Or build + run from compiled output:
npm run build           # tsc -> dist/, postbuild adds shebang + dist/package.json
npm run start:dist      # node dist/server.js

POST /mcp accepts standard MCP JSON-RPC 2.0. GET /health returns { "status": "ok", "name", "version" }.
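
A raw tool call against POST /mcp can be built like this. The tools/call method and params shape follow the MCP JSON-RPC convention; the helper and the localhost URL in the comment are illustrative:

```typescript
// Sketch of the JSON-RPC 2.0 body that POST /mcp accepts for a tool call.
function buildToolCall(id: number, tool: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

const body = buildToolCall(1, "get_quran_verse", { surah: 2, ayah: 186 });
// e.g. await fetch("http://localhost:3030/mcp", { method: "POST",
//   headers: { "Content-Type": "application/json" }, body: JSON.stringify(body) });
```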

Rate limiting

Default: 60 requests per minute per IP.

  • UPSTASH_REDIS_REST_URL + UPSTASH_REDIS_REST_TOKEN set → distributed Upstash Redis fixed window.
  • Otherwise → in-memory token bucket, per process.

Responses include X-RateLimit-Limit, X-RateLimit-Remaining, and X-RateLimit-Reset headers. When the limit is exceeded, the server returns HTTP 429 with JSON-RPC error code -32029 and the WO-prescribed message text.
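
The in-memory fallback can be sketched as a per-IP token bucket along these lines. The constants match the stated 60/minute default; everything else is an assumption, not the package's actual implementation:

```typescript
// Illustrative in-memory token bucket, per IP, refilled continuously.
const LIMIT = 60;          // requests
const WINDOW_MS = 60_000;  // per minute

type Bucket = { tokens: number; refilledAt: number };
const buckets = new Map<string, Bucket>();

function allow(ip: string, now = Date.now()): boolean {
  const b = buckets.get(ip) ?? { tokens: LIMIT, refilledAt: now };
  // Refill in proportion to elapsed time, capped at the bucket size.
  const refill = ((now - b.refilledAt) / WINDOW_MS) * LIMIT;
  b.tokens = Math.min(LIMIT, b.tokens + refill);
  b.refilledAt = now;
  if (b.tokens < 1) {
    buckets.set(ip, b);
    return false; // caller maps this to HTTP 429 / code -32029
  }
  b.tokens -= 1;
  buckets.set(ip, b);
  return true;
}
```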

Authentication

v1 ships unauthenticated. Optional X-Sakina-App-Id header is accepted but not enforced or logged in v1. Per the WO this becomes required in v2.

Data sources

| Tool | Source |
|---|---|
| get_quran_verse | alquran.cloud (Tanzil-derived Uthmani Arabic + Saheeh International / Jalandhri / Indonesian MoRA translations). Same upstream the main Sakina app uses. Cached in process for 24h per verse + edition. |
| get_dua | Direct import of Sakina's 444-entry du'a corpus from ../src/data/duas. |
| get_name_of_allah | Direct import of Sakina's 99 Names from ../src/lib/data/99-names. |

The two direct-imported sources mean the MCP server cannot drift from what the Sakina app surfaces. Any update to the main repo's data files automatically lands here on the next build.
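
The 24-hour in-process cache, keyed by verse and edition, could look something like this. The function names and key format are assumptions for illustration:

```typescript
// Sketch of an in-process TTL cache keyed by surah:ayah:edition.
const TTL_MS = 24 * 60 * 60 * 1000; // 24h, per the data-sources table

const cache = new Map<string, { value: unknown; expiresAt: number }>();

function cacheGet(surah: number, ayah: number, edition: string, now = Date.now()) {
  const key = `${surah}:${ayah}:${edition}`;
  const hit = cache.get(key);
  if (hit && hit.expiresAt > now) return hit.value;
  cache.delete(key); // expired or missing
  return undefined;
}

function cacheSet(
  surah: number, ayah: number, edition: string, value: unknown, now = Date.now()
) {
  cache.set(`${surah}:${ayah}:${edition}`, { value, expiresAt: now + TTL_MS });
}
```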

Content rules enforced in code

  • Unicode prophet salutation (U+FDFA) is replaced with (peace be upon him) at every data-loader boundary. The response builder rejects any output containing the symbol.
  • A no-emoji directive is embedded in llm_directives.CRITICAL_RULES for every content type.
  • Hadith grading is paired with each du'a record; the presentation contract for hadith asserts require_grading: true.
  • Crisis keyword detection runs on every get_dua request — same keyword list used by the main app's safety gate.
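
The salutation rule above can be sketched as a pair of functions: one applied at the data-loader boundary, one as the response builder's rejection check. Function names are illustrative:

```typescript
// U+FDFA is the single-codepoint Arabic ligature for the prophetic salutation.
const SALUTATION = "\uFDFA";

// Applied at every data-loader boundary.
function normalizeSalutation(text: string): string {
  return text.split(SALUTATION).join("(peace be upon him)");
}

// The response builder rejects any output still containing the raw symbol.
function assertNoSalutationSymbol(output: string): string {
  if (output.includes(SALUTATION)) {
    throw new Error("Response contains raw U+FDFA; normalize before emitting.");
  }
  return output;
}
```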

Out of scope for v1

Per the Phase 3 planning doc (docs/mcp-phase-3-planning.md):

  • Semantic search / search_islamic_guidance — needs a Gem-reviewed eval set.
  • explain_islamic_concept — directly conflicts with CLAUDE.md's "AI never gives spiritual/fiqh advice" rule until an Architect-level ruling is made.
  • get_pastoral_guidance — Gem 3's framing depends on Sakina-controlled surface; cannot ship via MCP without a separate review.
  • check_halal_ingredient — blocked on WO#61 restoration.
  • HMAC signing enforcement — v1 accepts but ignores the X-Sakina-App-Id header.
  • Authority page + take-down policy — out-of-band Architect work before v2.

Deployment

The package is fully self-contained at publish time — npm run bundle-data snapshots the du'a corpus, the 99 Names, and the canonical safety modules from the main Sakina monorepo into data/ and src/safety/_synced/. The compiled output in dist/ plus the bundled data/ directory have no runtime dependency on the main app.

Fly.io (recommended)

cd mcp-server
fly launch              # first time only — picks up fly.toml
fly secrets set \
  UPSTASH_REDIS_REST_URL=... \
  UPSTASH_REDIS_REST_TOKEN=...
fly deploy
fly status
curl https://<app>.fly.dev/health

The fly.toml config:

  • App name sakina-mcp, region lhr (London Heathrow).
  • Internal port 3030, force HTTPS.
  • HTTP healthcheck on /health every 30s.
  • Auto-stop on idle, auto-start on request — keeps cost low for a v1.
  • 1 shared CPU, 256 MB RAM.

Local container test

docker compose up --build
curl http://localhost:3030/health

The docker-compose.yml is a smoke-test config; production deployment goes through fly.toml.

Other targets

The package is just a Node.js HTTP server, so it runs on any platform that takes a Dockerfile or a Node process: Railway, Render, a small VPS, Vercel Functions (with a small adapter to swap the transport for the Web-Standard variant), Cloudflare Workers (likewise). Pick whichever has the right cold-start profile for your use case; the in-memory rate limiter falls back gracefully when Upstash isn't configured.

npm publishing

The package is registered as @asksakina/islamic-knowledge-mcp. Publishing is an Architect action:

cd mcp-server
npm login                                    # to the @asksakina org
npm publish --access public

prepublishOnly runs bundle-data, build, and test:tools in sequence, so the published tarball always contains fresh data and a clean build. The files field in package.json whitelists dist/, data/, README.md, LICENSE, and .mcp/server.json.

Verify the published tarball with npm pack --dry-run first to see what would ship.

MCP Registry submission

mcp-server/.mcp/server.json declares Sakina's identity for the official MCP Registry:

  • Server name: com.asksakina/islamic-knowledge
  • Package: @asksakina/islamic-knowledge-mcp (npm, stdio transport)
  • Remote: https://sakina-mcp.fly.dev/mcp (Streamable HTTP)

Architect publishing flow:

mcp-publisher login                          # adds DNS TXT record for asksakina.com
mcp-publisher publish .mcp/server.json
curl "https://registry.modelcontextprotocol.io/v0/servers?search=com.asksakina"

Update the remotes.url in server.json after the Fly.io app is deployed (and again if/when the custom domain mcp.asksakina.com is provisioned).