# Growth Nirvana Rails MCP Server

growth-nirvana-mcp-server v1.0.2
External MCP server for account-scoped, API-key-protected read endpoints backed by Growth Nirvana Rails.
## Requirements

- Node v20.19.1
- pnpm
## Setup

```shell
nvm use v20.19.1
pnpm install
```

Environment variables:

- `GROWTH_NIRVANA_API_KEY` (required): plaintext key issued by `APIClientKey.issue!`
- `GROWTH_NIRVANA_BASE_URL` (optional): Rails host URL (default `https://app.growthnirvana.com`); the server automatically appends `/api/v1/mcp` if missing
- `GROWTH_NIRVANA_TIMEOUT_MS` (optional, default `15000`)
- `GROWTH_NIRVANA_MAX_RETRIES` (optional, default `3`)
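The environment handling above can be sketched as follows. This is an illustrative reconstruction, not the server's actual code; the function names `resolveBaseUrl` and `loadConfig` are assumptions, while the variable names and defaults come from this README.

```typescript
// Illustrative sketch of the documented env-var handling.
// Only the env var names and defaults are from the README.
function resolveBaseUrl(raw?: string): string {
  // Default host when GROWTH_NIRVANA_BASE_URL is unset.
  const base = (raw ?? "https://app.growthnirvana.com").replace(/\/+$/, "");
  // Append the MCP API prefix only if the caller did not include it.
  return base.endsWith("/api/v1/mcp") ? base : `${base}/api/v1/mcp`;
}

function loadConfig(env: Record<string, string | undefined>) {
  return {
    apiKey: env.GROWTH_NIRVANA_API_KEY ?? "", // required
    baseUrl: resolveBaseUrl(env.GROWTH_NIRVANA_BASE_URL),
    timeoutMs: Number(env.GROWTH_NIRVANA_TIMEOUT_MS ?? 15000),
    maxRetries: Number(env.GROWTH_NIRVANA_MAX_RETRIES ?? 3),
  };
}
```

In a running process you would pass `process.env` to `loadConfig`.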
## Run and test

```shell
pnpm start
pnpm check
pnpm test
```

## Auth and key behavior

Each request sends:

- `Authorization: Bearer <api_key>` (preferred)
- `X-API-Key: <api_key>`
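Building those two headers can be sketched as below; `buildAuthHeaders` is an illustrative helper name, not a documented export.

```typescript
// Sketch of the headers sent on every request, per the README:
// the bearer form is preferred, and the key is mirrored into X-API-Key.
function buildAuthHeaders(apiKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${apiKey}`,
    "X-API-Key": apiKey,
  };
}
```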
Key semantics enforced by Rails:
- master key: can operate across accounts
- client key: restricted to its own account
- account-scoped tools support `account_id="self"` (default when omitted)
## Tool set

Master discovery tool:

- `search_accounts(q, page?, per_page?)`
Search and account-scoped tools:

- `list_connectors(account_id, provider?, q?, status?, updated_since?, page?, per_page?)`
- `get_connector(account_id, connector_id)`
- `search_datasets(account_id, q, page?, per_page?, updated_since?, type?, enabled?)`
- `list_warehouse_tables(account_id, page?, per_page?, updated_since?, name?, include?)`
- `search_warehouse_tables(account_id, q, page?, per_page?, include?)`
- `get_warehouse_table(account_id, warehouse_table_id, include?)`
- `list_warehouse_fields(account_id, page?, per_page?, updated_since?, name?, include?)`
- `search_warehouse_fields(account_id, q, page?, per_page?, include?)`
- `get_warehouse_field(account_id, warehouse_field_id, include?)`
- `search_transformation_models(account_id, q, dataset_id?, updated_since?, page?, per_page?)`
- `list_datasets(account_id, page?, per_page?, updated_since?, type?, enabled?, include?)`
- `get_dataset(account_id, dataset_id, include?)`
- `list_dataset_warehouse_tables(account_id, dataset_id, page?, per_page?, updated_since?, name?, include?)`
- `get_dataset_warehouse_table(account_id, dataset_id, warehouse_table_id, include?)`
- `list_dataset_warehouse_fields(account_id, dataset_id, warehouse_table_id, page?, per_page?, updated_since?, name?, include?)`
- `get_dataset_warehouse_field(account_id, dataset_id, warehouse_table_id, warehouse_field_id, include?)`
- `list_dataset_transformation_models(account_id, dataset_id, page?, per_page?, updated_since?, folder_id?)`
- `get_dataset_transformation_model(account_id, dataset_id, transformation_model_id)`
- `get_dataset_client_config(account_id, dataset_id)`
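Any of these tools is invoked through the standard MCP `tools/call` JSON-RPC method. The payload below shows one example call for `search_datasets`; the envelope is the generic MCP request shape, and the argument values (`"revenue"`, pagination numbers) are illustrative.

```typescript
// Example MCP `tools/call` request for one of the tools above.
// Tool name and parameters follow the signatures in this README;
// argument values are made up for illustration.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "search_datasets",
    arguments: {
      account_id: "self", // default account scope for a client key
      q: "revenue",
      page: 1,
      per_page: 25,
    },
  },
};
```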
Top-level account tools:

- `list_transformation_models(account_id, page?, per_page?, dataset_id?, updated_since?)`
- `get_transformation_model(account_id, transformation_model_id)`
- `list_data_transformations(account_id, page?, per_page?, dataset_id?, active?, updated_since?)`
- `get_data_transformation(account_id, data_transformation_id)`
Async run tools:

- `create_query_execution(account_id, query, saved_query_id?, run_with_liquid?)`
- `get_query_execution(account_id, query_execution_id, include?, includeResults?, row_limit?)`
- `cancel_query_execution(account_id, query_execution_id)`
- `create_dry_run(account_id, query, context, dataset_id?, package_version_id?, queryable_id?, queryable_type?, run_with_dependencies?, run_with_liquid?)`
- `get_dry_run(account_id, dry_run_id)`
- `install_all_paid_pro(account_id, connector_id, dataset_display_name, package_version_id?, dataset_name?, name_suffix?, hotglue_dataset?, stream_token?, idempotency_key?)`
- `get_package_install(account_id, package_install_id)`
- `create_dataset_bundle_export(account_id, dataset_id, idempotency_key?)`
- `get_dataset_bundle_export(account_id, dataset_id, bundle_export_id)`
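The async tools pair a `create_*` call with a `get_*` poll. A minimal create-then-poll sketch for query executions, assuming a generic `callTool` stand-in for your MCP client (the status values `"pending"`/`"running"` and the `data.id`/`data.status` response fields are assumptions; only the tool and parameter names come from this README):

```typescript
// Assumed stand-in for however your MCP client invokes server tools.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<any>;

async function runQuery(
  callTool: CallTool,
  accountId: string,
  query: string,
  pollMs = 1000,
) {
  // Kick off the async execution.
  const created = await callTool("create_query_execution", {
    account_id: accountId,
    query,
  });
  const id = created.data.id;
  // Poll get_query_execution until a terminal state.
  // "pending"/"running" are assumed, not documented, status values.
  for (;;) {
    const res = await callTool("get_query_execution", {
      account_id: accountId,
      query_execution_id: id,
      includeResults: true,
    });
    const status = res.data.status;
    if (status !== "pending" && status !== "running") return res;
    await new Promise((r) => setTimeout(r, pollMs));
  }
}
```

The same shape applies to dry runs (`create_dry_run`/`get_dry_run`), package installs, and dataset bundle exports.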
## Scopes

Scopes expected by the tools:

- `read:accounts` (for `search_accounts`; master key flow)
- `read:datasets`
- `read:warehouse_tables`
- `read:connectors`
- `read:transformation_models`
- `read:client_dataset_config`
- `read:data_transformations`
- `run:query_executions`
- `run:dry_runs`
- `run:packages`
- `run:dataset_bundle_exports`
- `read:*` and `*` wildcard support is server-side
- `run:*` wildcard support is server-side
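The wildcard semantics can be sketched as a small matcher. Enforcement actually happens server-side in Rails; this only mirrors the documented behavior, and `scopeAllows` is an illustrative name:

```typescript
// Mirrors the documented wildcard rules: `*` grants everything,
// and `read:*` / `run:*` grant any scope with the same prefix.
function scopeAllows(granted: string[], required: string): boolean {
  if (granted.includes("*") || granted.includes(required)) return true;
  const prefix = required.split(":")[0];
  return granted.includes(`${prefix}:*`);
}
```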
## Response and error handling

Success envelopes are normalized and returned as:

- list: `{ data: [...], meta: { page, per_page, total }, errors: [] }`
- show: `{ data: {...}, errors: [] }`

Error envelope from Rails:

`{ data: null, errors: [{ code, message }] }`
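The envelopes above can be written down as types, with a small guard for the error case. The interface names are illustrative; the field names come from this README:

```typescript
// Type sketch of the documented envelopes (interface names are ours).
interface ApiError { code: string; message: string }

interface ListEnvelope<T> {
  data: T[];
  meta: { page: number; per_page: number; total: number };
  errors: ApiError[];
}

interface ShowEnvelope<T> { data: T | null; errors: ApiError[] }

// A response failed if Rails returned null data with error entries.
function isErrorEnvelope(env: ShowEnvelope<unknown>): boolean {
  return env.data === null && env.errors.length > 0;
}
```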
HTTP status behavior:

- `401`: invalid/missing key (no blind retry)
- `403`: missing scope or forbidden account/key type
- `404`: resource not found
- `422`: invalid query params
- `429`/`5xx`: exponential backoff retries
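That retry policy (retry only `429`/`5xx`, never `401` or other `4xx`) can be sketched as a wrapper. `withRetries` and the base delay are assumptions; the retry classification and the `GROWTH_NIRVANA_MAX_RETRIES` ceiling come from this README:

```typescript
// Sketch of the documented policy: 429 and 5xx retry with exponential
// backoff; 401/403/404/422 return immediately. `doRequest` stands in
// for the actual HTTP call.
async function withRetries(
  doRequest: () => Promise<{ status: number }>,
  maxRetries = 3,
  baseDelayMs = 250, // assumed base; doubles each attempt
): Promise<{ status: number }> {
  for (let attempt = 0; ; attempt++) {
    const res = await doRequest();
    const retryable = res.status === 429 || res.status >= 500;
    if (!retryable || attempt >= maxRetries) return res;
    await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
  }
}
```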
## Debug logging

Each request logs:

- request URL
- `account_id`
- status code
- Rails `error.code` and `error.message`
## Orchestration guidance

For ambiguous account names in master-key workflows:

- call `search_accounts("Stackmatix")`
- present matches and pick an `account_id`
- run account-scoped tools using that `account_id`
Sync strategy:

- start with `get_dataset(account_id, dataset_id)`
- fan out to the dataset tables/models list tools
- maintain independent `updated_since` cursors per resource type (datasets, tables, models)
- page until complete using `page` + `per_page`
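The "page until complete" step can be sketched as a loop that drains one list tool, using the `meta.total` field from the list envelope to know when to stop. `listAll` and `CallTool` are illustrative names; an `updated_since` cursor would simply be passed through `args`:

```typescript
// Assumed stand-in for however your MCP client invokes server tools.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<any>;

// Drain one paginated list tool (e.g. list_datasets) into an array.
async function listAll(
  callTool: CallTool,
  tool: string,
  args: Record<string, unknown>,
  perPage = 100,
): Promise<any[]> {
  const items: any[] = [];
  for (let page = 1; ; page++) {
    const res = await callTool(tool, { ...args, page, per_page: perPage });
    items.push(...res.data);
    // Stop once we have paged past the reported total.
    if (page * perPage >= res.meta.total) return items;
  }
}
```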
Dataset search notes:

- `search_datasets` is account-scoped and searches `name` + `alias_name`
- supports `q` (required), `page`, `per_page`, `updated_since`, `type`, `enabled`
- backend ranking is exact/prefix-first, then broader contains matches
Connector notes:

- `get_connector` expects provider-prefixed connector ids like `fivetran_123` or `hotglue_456`
- endpoints are metadata/link-first and do not return plaintext secrets/tokens
## Dataset client config payload

`get_dataset_client_config` exposes both:

- `customTransformationsConfig` (raw YAML string)
- `customTransformationsConfigParsed` (parsed object/hash)
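A consumer would typically prefer the pre-parsed field and keep the raw YAML for display or diffing. The payload type below is a sketch: the two field names come from this README, but whether the parsed field can be `null` is an assumption.

```typescript
// Sketch of consuming the get_dataset_client_config payload.
// Field names are from the README; nullability is assumed.
interface ClientConfigPayload {
  customTransformationsConfig: string;                        // raw YAML
  customTransformationsConfigParsed: Record<string, unknown> | null;
}

function pickTransformationsConfig(p: ClientConfigPayload) {
  // Prefer the pre-parsed form; fall back to the raw YAML string.
  return p.customTransformationsConfigParsed ?? p.customTransformationsConfig;
}
```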
