@einlogic/mcp-fabric-api

v2.2.0

MCP server for Microsoft Fabric REST APIs — enables data engineers and data analysts to use AI assistants beyond Copilot to build and manage Fabric components

mcp-fabric-api

MCP (Model Context Protocol) server for the Microsoft Fabric REST APIs. Built for data engineers and data analysts who want to use AI assistants beyond Copilot — such as Claude, Claude Code, or any MCP-compatible client — to build and manage their Fabric components. Covers workspaces, lakehouses, warehouses, notebooks, pipelines, semantic models, reports, dataflows, eventhouses, eventstreams, reflexes, GraphQL APIs, SQL endpoints, variable libraries, git integration, deployment pipelines, mirrored databases, KQL databases, ML models, ML experiments, copy jobs, and external data shares.

Safe by default: This server blocks all destructive operations (create, update, delete) until you explicitly configure the WRITABLE_WORKSPACES environment variable. Read operations always work. Set WRITABLE_WORKSPACES="*" to allow writes to all workspaces, or use patterns to limit access. See Workspace Safety Guard for details.

Prerequisites

  • Node.js 18+
  • Azure CLI (az login for authentication)
  • Access to a Microsoft Fabric workspace

Quick Start

Authenticate with Azure CLI:

az login

Run directly with npx (no install needed):

npx @einlogic/mcp-fabric-api

Setup

Claude Desktop

Add to your Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"]
    }
  }
}

Claude Code CLI

claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api

To verify it was added:

claude mcp list

HTTP Mode (Remote)

For remote deployments, set environment variables:

export TRANSPORT=http
export PORT=3000
export AZURE_CLIENT_ID=your-client-id
export AZURE_CLIENT_SECRET=your-client-secret
export AZURE_TENANT_ID=your-tenant-id
npx @einlogic/mcp-fabric-api

The server exposes:

  • POST /mcp — MCP endpoint (StreamableHTTP)
  • GET /mcp — SSE stream for server notifications
  • DELETE /mcp — Session cleanup
  • GET /.well-known/oauth-protected-resource — OAuth metadata
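
For illustration, the first message an MCP client exchanges with the POST /mcp endpoint is a JSON-RPC initialize request. The sketch below builds one; the client name, version, and protocol date are placeholder values, not something this server prescribes:

```javascript
// Minimal MCP "initialize" request body (JSON-RPC 2.0) that an HTTP
// client could POST to /mcp. clientInfo values are placeholders.
const initializeRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.1.0" },
  },
};

// A StreamableHTTP client sends it with an Accept header that covers
// both plain JSON responses and SSE streams.
const requestOptions = {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Accept: "application/json, text/event-stream",
  },
  body: JSON.stringify(initializeRequest),
};

console.log(requestOptions.body.includes('"initialize"')); // true
```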

Workspace Safety Guard

The WRITABLE_WORKSPACES environment variable controls which workspaces allow write operations. Only workspaces whose names match the configured patterns permit create, update, and delete operations; read operations are never restricted.

Default behavior: When WRITABLE_WORKSPACES is not set or empty, all destructive operations are blocked. You must explicitly configure this variable to enable writes.

| WRITABLE_WORKSPACES value | Behavior |
|---------------------------|----------|
| Not set / empty | All writes blocked (safe default) |
| * | All workspaces writable |
| *-Dev,*-Test,Sandbox* | Only matching workspaces writable |

Set comma-separated glob patterns:

WRITABLE_WORKSPACES=*-Dev,*-Test,Sandbox*

Wildcard examples:

  • * matches all workspaces (allow everything)
  • *-Dev matches "Sales-Dev", "Finance-Dev"
  • Sandbox* matches "Sandbox-123", "Sandbox-Mike"
  • Exact-Name matches only "Exact-Name" (case-insensitive)
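
The pattern semantics above can be sketched as a simple glob-to-regex check. This is a hypothetical re-implementation for illustration, not the server's actual code; it assumes * is the only wildcard and matching is case-insensitive, as described:

```javascript
// Illustrative sketch of the guard's matching: comma-separated globs
// where * matches any run of characters, compared case-insensitively.
function isWritable(workspaceName, writableWorkspaces) {
  if (!writableWorkspaces) return false; // safe default: block all writes
  return writableWorkspaces.split(",").some((pattern) => {
    const escaped = pattern
      .trim()
      .replace(/[.+?^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
      .replace(/\*/g, ".*");                 // glob * -> regex .*
    return new RegExp(`^${escaped}$`, "i").test(workspaceName);
  });
}

console.log(isWritable("Sales-Dev", "*-Dev,*-Test,Sandbox*")); // true
console.log(isWritable("Production-Analytics", "*-Dev,*-Test")); // false
console.log(isWritable("Exact-Name", "exact-name")); // true (case-insensitive)
console.log(isWritable("Anything", "")); // false (writes blocked by default)
```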

Guarded tools (89 total) — every tool that creates, updates, or deletes workspace items:

| Domain | Guarded tools |
|--------|---------------|
| Workspace | workspace_update, workspace_delete |
| Lakehouse | lakehouse_create, lakehouse_update, lakehouse_delete, lakehouse_load_table, lakehouse_create_shortcut, lakehouse_update_definition, lakehouse_delete_shortcut |
| Warehouse | warehouse_create, warehouse_update, warehouse_delete, warehouse_update_definition |
| Notebook | notebook_create, notebook_update, notebook_delete, notebook_update_definition |
| Pipeline | pipeline_create, pipeline_update, pipeline_delete, pipeline_create_schedule, pipeline_update_schedule, pipeline_delete_schedule, pipeline_update_definition |
| Semantic Model | semantic_model_create_bim, semantic_model_create_tmdl, semantic_model_update_details, semantic_model_delete, semantic_model_update_bim, semantic_model_update_tmdl, semantic_model_take_over |
| Report | report_create_definition, report_update, report_delete, report_clone, report_update_definition, report_rebind |
| Dataflow | dataflow_create, dataflow_update, dataflow_delete |
| Eventhouse | eventhouse_create, eventhouse_update, eventhouse_delete |
| Eventstream | eventstream_create, eventstream_update, eventstream_delete, eventstream_update_definition |
| Reflex | reflex_create, reflex_update, reflex_delete, reflex_update_definition |
| GraphQL API | graphql_api_create, graphql_api_update, graphql_api_delete |
| Variable Library | variable_library_create, variable_library_update, variable_library_delete, variable_library_update_definition |
| Git Integration | git_connect, git_disconnect, git_initialize_connection, git_commit_to_git, git_update_from_git, git_update_credentials |
| Deployment Pipeline | deployment_pipeline_assign_workspace, deployment_pipeline_unassign_workspace, deployment_pipeline_deploy |
| Mirrored Database | mirrored_database_create, mirrored_database_update, mirrored_database_delete, mirrored_database_update_definition, mirrored_database_start_mirroring, mirrored_database_stop_mirroring |
| KQL Database | kql_database_create, kql_database_update, kql_database_delete, kql_database_update_definition |
| ML Model | ml_model_create, ml_model_update, ml_model_delete |
| ML Experiment | ml_experiment_create, ml_experiment_update, ml_experiment_delete |
| Copy Job | copy_job_create, copy_job_update, copy_job_delete, copy_job_update_definition |
| External Data Share | external_data_share_create, external_data_share_revoke |

Not guarded: Read operations (list, get, get_definition, get_bim, get_tmdl), query execution (DAX, KQL, SQL, GraphQL), run/refresh/cancel operations, export operations, and deployment pipeline CRUD (tenant-level, not workspace-scoped).

Claude Desktop config with guard:

{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "@einlogic/mcp-fabric-api"],
      "env": {
        "WRITABLE_WORKSPACES": "*-Dev,*-Test,Sandbox*"
      }
    }
  }
}

Claude Code CLI with guard:

WRITABLE_WORKSPACES="*-Dev,*-Test" claude mcp add fabric -- npx -y @einlogic/mcp-fabric-api

Error when not configured:

WRITABLE_WORKSPACES is not configured. Destructive actions are blocked by default. Set WRITABLE_WORKSPACES to a comma-separated list of workspace name patterns, or "*" to allow all.

Error when workspace not in allow list:

Workspace "Production-Analytics" is not in the writable workspaces list. Allowed patterns: *-Dev, *-Test, Sandbox*

File-Based I/O

To avoid large payloads overwhelming MCP clients, definition tools use file paths instead of inline content. The server reads files from disk when sending definitions to Fabric, and writes files to disk when retrieving definitions from Fabric.

Input tools — the server reads definition files from the specified path and uploads to Fabric:

| Tool | Parameter | Description |
|------|-----------|-------------|
| semantic_model_create_bim | definitionFilePath | Path to model.bim JSON file |
| semantic_model_update_bim | definitionFilePath | Path to model.bim JSON file |
| semantic_model_create_tmdl | filesDirectoryPath | Directory of .tmdl and .pbism files |
| semantic_model_update_tmdl | filesDirectoryPath | Directory of .tmdl and .pbism files |
| notebook_update_definition | definitionDirectoryPath | Directory containing notebook definition files |
| eventstream_update_definition | definitionDirectoryPath | Directory containing eventstream definition files |
| report_create_definition | definitionDirectoryPath | Directory of PBIR report definition files |
| report_update_definition | definitionDirectoryPath | Directory of PBIR report definition files |
| variable_library_create | definitionDirectoryPath | Directory of .json and .platform files |
| variable_library_update_definition | definitionDirectoryPath | Directory of .json and .platform files |
| lakehouse_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| warehouse_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| pipeline_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| reflex_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| mirrored_database_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| kql_database_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |
| copy_job_update_definition | partsDirectoryPath | Directory of definition files (or inline parts) |

Output tools — the server retrieves definitions from Fabric and writes them to disk:

| Tool | Parameter | What gets written |
|------|-----------|-------------------|
| semantic_model_get_bim | outputFilePath | Single model.bim JSON file |
| semantic_model_get_tmdl | outputDirectoryPath | TMDL files preserving folder structure |
| notebook_get_definition | outputDirectoryPath | Notebook definition files |
| lakehouse_get_definition | outputDirectoryPath | Lakehouse definition files |
| warehouse_get_definition | outputDirectoryPath | Warehouse definition files |
| pipeline_get_definition | outputDirectoryPath | Pipeline definition files |
| report_get_definition | outputDirectoryPath | Report definition files (report.json, pages, visuals) |
| dataflow_get_definition | outputDirectoryPath | Dataflow definition files |
| eventstream_get_definition | outputDirectoryPath | Eventstream definition files |
| graphql_api_get_definition | outputDirectoryPath | GraphQL schema definition files |
| reflex_get_definition | outputDirectoryPath | Reflex definition files |
| variable_library_get_definition | outputDirectoryPath | Variable library files (variables.json, valueSets/) |
| mirrored_database_get_definition | outputDirectoryPath | Mirrored database definition files |
| kql_database_get_definition | outputDirectoryPath | KQL database definition files |
| copy_job_get_definition | outputDirectoryPath | Copy job definition files |

TMDL directory structure example:

/tmp/my-model/
  model.tmdl
  definition.pbism
  definition/
    tables/
      Sales.tmdl
      Product.tmdl
    relationships.tmdl

Development

git clone https://github.com/your-org/mcp-fabric-api.git
cd mcp-fabric-api
npm install
npm run build
npm start
npm run dev          # Watch mode
npm run inspect      # Launch MCP Inspector

Tools (193 total)

Auth (4 tools)

| Tool | Description |
|------|-------------|
| auth_get_current_account | Show current Azure identity, tenant, and token expiry |
| auth_list_available_accounts | List subscriptions/tenants from local az login state (does not query Entra) |
| auth_switch_tenant | Switch to a different Azure tenant (with rollback on failure) |
| auth_clear_token_cache | Clear cached tokens to force re-acquisition |

Workspace (6 tools)

| Tool | Description |
|------|-------------|
| workspace_list | List all accessible Fabric workspaces |
| workspace_get | Get details of a specific workspace |
| workspace_create | Create a new workspace |
| workspace_update | Update a workspace's name or description |
| workspace_delete | Delete a workspace |
| workspace_list_items | List all items in a workspace (with optional type filter) |

Lakehouse (14 tools)

| Tool | Description |
|------|-------------|
| lakehouse_list | List all lakehouses in a workspace |
| lakehouse_get | Get lakehouse details (SQL endpoint, OneLake paths) |
| lakehouse_create | Create a new lakehouse (LRO, schemas enabled by default) |
| lakehouse_update | Update lakehouse name or description |
| lakehouse_delete | Delete a lakehouse |
| lakehouse_list_tables | List all tables in a lakehouse (falls back to SQL endpoint for schema-enabled lakehouses) |
| lakehouse_load_table | Load data into a table from OneLake (LRO). Not supported for schema-enabled lakehouses |
| lakehouse_create_shortcut | Create a OneLake shortcut (file, folder, table, or schema level) with support for multiple target types |
| lakehouse_get_sql_endpoint | Get SQL endpoint details |
| lakehouse_get_definition | Get lakehouse definition (LRO). Writes files to outputDirectoryPath |
| lakehouse_update_definition | Update lakehouse definition (LRO). Reads from partsDirectoryPath or inline parts |
| lakehouse_list_shortcuts | List all OneLake shortcuts in a lakehouse |
| lakehouse_get_shortcut | Get details of a specific OneLake shortcut |
| lakehouse_delete_shortcut | Delete a OneLake shortcut |

Warehouse (9 tools)

| Tool | Description |
|------|-------------|
| warehouse_list | List all warehouses in a workspace |
| warehouse_get | Get warehouse details including connection string and provisioning status |
| warehouse_create | Create a new warehouse (LRO) |
| warehouse_update | Update warehouse name or description |
| warehouse_delete | Delete a warehouse |
| warehouse_get_sql_endpoint | Get SQL connection details for a warehouse |
| warehouse_list_tables | List all tables in a warehouse |
| warehouse_get_definition | Get warehouse definition (LRO). Writes files to outputDirectoryPath |
| warehouse_update_definition | Update warehouse definition (LRO). Reads from partsDirectoryPath or inline parts |

Notebook (10 tools)

| Tool | Description |
|------|-------------|
| notebook_list | List all notebooks in a workspace |
| notebook_get | Get notebook details |
| notebook_create | Create a new notebook (LRO) |
| notebook_update | Update notebook name or description |
| notebook_delete | Delete a notebook |
| notebook_get_definition | Get notebook definition (LRO). Writes files to outputDirectoryPath |
| notebook_update_definition | Update notebook definition (LRO). Reads files from definitionDirectoryPath |
| notebook_run | Run a notebook on demand |
| notebook_get_run_status | Get notebook run status |
| notebook_cancel_run | Cancel a running notebook |

Pipeline (15 tools)

| Tool | Description |
|------|-------------|
| pipeline_list | List all data pipelines |
| pipeline_get | Get pipeline details |
| pipeline_create | Create a new pipeline |
| pipeline_update | Update pipeline name or description |
| pipeline_delete | Delete a pipeline |
| pipeline_run | Run a pipeline on demand |
| pipeline_get_run_status | Get pipeline run status |
| pipeline_cancel_run | Cancel a running pipeline |
| pipeline_list_runs | List all run instances |
| pipeline_list_schedules | List pipeline schedules |
| pipeline_create_schedule | Create a pipeline schedule |
| pipeline_update_schedule | Update a pipeline schedule |
| pipeline_delete_schedule | Delete a pipeline schedule |
| pipeline_get_definition | Get pipeline definition (LRO). Writes files to outputDirectoryPath |
| pipeline_update_definition | Update pipeline definition (LRO). Reads from partsDirectoryPath or inline parts |

Semantic Model (15 tools)

| Tool | Description |
|------|-------------|
| semantic_model_list | List all semantic models |
| semantic_model_get_details | Get semantic model metadata (name, ID, description) — does not return the definition |
| semantic_model_create_bim | Create a semantic model from a BIM/JSON file (LRO). Reads model.bim from definitionFilePath |
| semantic_model_create_tmdl | Create a semantic model from TMDL files (LRO). Reads .tmdl/.pbism from filesDirectoryPath |
| semantic_model_update_details | Update semantic model name or description — does not modify the definition |
| semantic_model_delete | Delete a semantic model |
| semantic_model_refresh | Trigger a model refresh (Power BI API) |
| semantic_model_execute_dax | Execute a DAX query (Power BI API) |
| semantic_model_get_bim | Get definition in BIM/JSON format (LRO). Writes model.bim to outputFilePath |
| semantic_model_get_tmdl | Get definition in TMDL format (LRO). Writes TMDL files to outputDirectoryPath |
| semantic_model_update_bim | Update definition from BIM/JSON file (LRO). Reads model.bim from definitionFilePath |
| semantic_model_update_tmdl | Update definition from TMDL files (LRO). Reads .tmdl/.pbism from filesDirectoryPath |
| semantic_model_get_refresh_history | Get refresh history (Power BI API) |
| semantic_model_take_over | Take over ownership of a semantic model (Power BI API) |
| semantic_model_get_datasources | Get data sources of a semantic model (Power BI API) |

Report (13 tools)

| Tool | Description |
|------|-------------|
| report_list | List all reports |
| report_get | Get report details |
| report_create_definition | Create a new report from PBIR definition files (LRO). Reads from definitionDirectoryPath |
| report_update | Update report name or description |
| report_delete | Delete a report |
| report_clone | Clone a report (Power BI API) |
| report_export | Export report to file format (PDF, PPTX, PNG, etc.) via Power BI API |
| report_get_export_status | Check report export status |
| report_get_definition | Get report definition (LRO). Writes files to outputDirectoryPath |
| report_update_definition | Update report definition from PBIR directory (LRO). Reads from definitionDirectoryPath |
| report_rebind | Rebind a report to a different semantic model/dataset (Power BI API) |
| report_get_pages | Get the list of pages in a report (Power BI API) |
| report_get_datasources | Get data sources used by a report (Power BI API) |

Dataflow Gen2 (8 tools)

| Tool | Description |
|------|-------------|
| dataflow_list | List all Dataflow Gen2 items |
| dataflow_get | Get dataflow details |
| dataflow_create | Create a new dataflow |
| dataflow_update | Update dataflow name or description |
| dataflow_delete | Delete a dataflow |
| dataflow_refresh | Trigger a dataflow refresh |
| dataflow_get_refresh_status | Get refresh job status |
| dataflow_get_definition | Get dataflow definition (LRO). Writes files to outputDirectoryPath |

Eventhouse (7 tools)

| Tool | Description |
|------|-------------|
| eventhouse_list | List all eventhouses |
| eventhouse_get | Get eventhouse details |
| eventhouse_create | Create a new eventhouse (LRO) |
| eventhouse_update | Update eventhouse name or description |
| eventhouse_delete | Delete an eventhouse |
| eventhouse_get_sql_endpoint | Get query service URI and connection details |
| eventhouse_execute_kql | Execute a KQL query against a KQL database |

Eventstream (7 tools)

| Tool | Description |
|------|-------------|
| eventstream_list | List all eventstreams |
| eventstream_get | Get eventstream details |
| eventstream_create | Create a new eventstream (LRO) |
| eventstream_update | Update eventstream name or description |
| eventstream_delete | Delete an eventstream |
| eventstream_get_definition | Get eventstream definition (LRO). Writes files to outputDirectoryPath |
| eventstream_update_definition | Update eventstream definition (LRO). Reads from definitionDirectoryPath |

Reflex / Activator (7 tools)

| Tool | Description |
|------|-------------|
| reflex_list | List all Reflex (Activator) items |
| reflex_get | Get reflex details |
| reflex_create | Create a new reflex |
| reflex_update | Update reflex name or description |
| reflex_delete | Delete a reflex |
| reflex_get_definition | Get reflex definition (LRO). Writes files to outputDirectoryPath |
| reflex_update_definition | Update reflex definition (LRO). Reads from partsDirectoryPath or inline parts |

GraphQL API (7 tools)

| Tool | Description |
|------|-------------|
| graphql_api_list | List all GraphQL API items |
| graphql_api_get | Get GraphQL API details |
| graphql_api_create | Create a new GraphQL API |
| graphql_api_update | Update GraphQL API name or description |
| graphql_api_delete | Delete a GraphQL API |
| graphql_api_get_definition | Get GraphQL schema definition (LRO). Writes files to outputDirectoryPath |
| graphql_api_execute_query | Execute a GraphQL query |

SQL Endpoint (4 tools)

| Tool | Description |
|------|-------------|
| sql_endpoint_list | List all SQL endpoints |
| sql_endpoint_get | Get SQL endpoint details |
| sql_endpoint_get_connection_string | Get TDS connection string |
| sql_endpoint_execute_query | Execute a T-SQL query against a lakehouse or warehouse SQL endpoint |

Variable Library (7 tools)

| Tool | Description |
|------|-------------|
| variable_library_list | List all variable libraries in a workspace |
| variable_library_get | Get variable library details including active value set name |
| variable_library_create | Create a variable library, optionally with definition files from definitionDirectoryPath (LRO) |
| variable_library_update | Update name, description, or active value set |
| variable_library_delete | Delete a variable library |
| variable_library_get_definition | Get definition (LRO). Writes files (variables.json, valueSets/) to outputDirectoryPath |
| variable_library_update_definition | Update definition from directory of .json and .platform files (LRO) |

Git Integration (9 tools)

| Tool | Description |
|------|-------------|
| git_get_connection | Get Git connection details for a workspace |
| git_get_status | Get Git status of items (sync state between workspace and remote) |
| git_connect | Connect a workspace to a Git repository (Azure DevOps or GitHub) |
| git_disconnect | Disconnect a workspace from its Git repository |
| git_initialize_connection | Initialize a Git connection after connecting (LRO) |
| git_commit_to_git | Commit workspace changes to the connected Git repository (LRO) |
| git_update_from_git | Update workspace from the connected Git repository (LRO) |
| git_get_credentials | Get Git credentials configuration for the current user |
| git_update_credentials | Update Git credentials configuration for the current user |

Deployment Pipeline (12 tools)

| Tool | Description |
|------|-------------|
| deployment_pipeline_list | List all deployment pipelines accessible to the user |
| deployment_pipeline_get | Get details of a specific deployment pipeline |
| deployment_pipeline_create | Create a new deployment pipeline |
| deployment_pipeline_update | Update deployment pipeline name or description |
| deployment_pipeline_delete | Delete a deployment pipeline |
| deployment_pipeline_list_stages | List all stages in a deployment pipeline |
| deployment_pipeline_list_stage_items | List all items in a specific stage |
| deployment_pipeline_assign_workspace | Assign a workspace to a pipeline stage |
| deployment_pipeline_unassign_workspace | Unassign a workspace from a pipeline stage |
| deployment_pipeline_deploy | Deploy items from one stage to another (LRO) |
| deployment_pipeline_list_operations | List operations (deployment history) |
| deployment_pipeline_get_operation | Get details of a specific deployment operation |

Mirrored Database (11 tools)

| Tool | Description |
|------|-------------|
| mirrored_database_list | List all mirrored databases in a workspace |
| mirrored_database_get | Get details of a specific mirrored database |
| mirrored_database_create | Create a new mirrored database (LRO) |
| mirrored_database_update | Update mirrored database name or description |
| mirrored_database_delete | Delete a mirrored database |
| mirrored_database_get_definition | Get mirrored database definition (LRO). Writes files to outputDirectoryPath |
| mirrored_database_update_definition | Update definition (LRO). Reads from partsDirectoryPath or inline parts |
| mirrored_database_start_mirroring | Start mirroring for a mirrored database |
| mirrored_database_stop_mirroring | Stop mirroring for a mirrored database |
| mirrored_database_get_mirroring_status | Get the mirroring status |
| mirrored_database_get_tables_mirroring_status | Get mirroring status of individual tables |

KQL Database (7 tools)

| Tool | Description |
|------|-------------|
| kql_database_list | List all KQL databases in a workspace |
| kql_database_get | Get details of a specific KQL database |
| kql_database_create | Create a new KQL database (LRO). Requires a parent eventhouse |
| kql_database_update | Update KQL database name or description |
| kql_database_delete | Delete a KQL database |
| kql_database_get_definition | Get KQL database definition (LRO). Writes files to outputDirectoryPath |
| kql_database_update_definition | Update definition (LRO). Reads from partsDirectoryPath or inline parts |

ML Model (5 tools)

| Tool | Description |
|------|-------------|
| ml_model_list | List all ML models in a workspace |
| ml_model_get | Get details of a specific ML model |
| ml_model_create | Create a new ML model (LRO) |
| ml_model_update | Update ML model name or description |
| ml_model_delete | Delete an ML model |

ML Experiment (5 tools)

| Tool | Description |
|------|-------------|
| ml_experiment_list | List all ML experiments in a workspace |
| ml_experiment_get | Get details of a specific ML experiment |
| ml_experiment_create | Create a new ML experiment (LRO) |
| ml_experiment_update | Update ML experiment name or description |
| ml_experiment_delete | Delete an ML experiment |

Copy Job (7 tools)

| Tool | Description |
|------|-------------|
| copy_job_list | List all copy jobs in a workspace |
| copy_job_get | Get details of a specific copy job |
| copy_job_create | Create a new copy job |
| copy_job_update | Update copy job name or description |
| copy_job_delete | Delete a copy job |
| copy_job_get_definition | Get copy job definition (LRO). Writes files to outputDirectoryPath |
| copy_job_update_definition | Update definition (LRO). Reads from partsDirectoryPath or inline parts |

External Data Share (4 tools)

| Tool | Description |
|------|-------------|
| external_data_share_list | List all external data shares for an item |
| external_data_share_get | Get details of a specific external data share |
| external_data_share_create | Create a new external data share for an item |
| external_data_share_revoke | Revoke an external data share |

License

AGPL-3.0