# @fabric-harness/databricks

v0.16.1

Databricks tools and filesystem sources for Fabric Harness agents.
Databricks integration helpers for Fabric Harness.
## Current surface

- `DatabricksRestClient`: a small token-authenticated REST client.
- `databricksSqlTool()`: run SQL Warehouse statements from an agent tool.
- `databricksRunJobTool()`: trigger an existing Databricks job.
- `databricksNotebookTool()`: submit a one-off notebook run.
- `unityCatalogTablesTool()`: list Unity Catalog tables.
- `databricksMlflowLogMetricTool()` / `databricksMlflowLogParamTool()`: write MLflow telemetry from an agent run.
- `databricksWorkspaceSource()`: mount exported workspace notebooks/files as a Fabric filesystem source.
- `databricksSqlSandbox()` (subpath: `@fabric-harness/databricks/sql-sandbox`): a full `SandboxEnv` whose `exec(sql)` runs against a SQL Warehouse, with results serialized as JSONL or CSV.
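When `resultFormat` is `'jsonl'`, sandbox output arrives as one JSON object per line. A small standalone helper along these lines can turn that text back into row objects (`parseJsonlRows` is an illustrative sketch, not part of the package):

```typescript
// Parse newline-delimited JSON (one object per line) back into an
// array of row objects. Blank lines are ignored.
// parseJsonlRows is a hypothetical helper, not part of @fabric-harness/databricks.
function parseJsonlRows<T = Record<string, unknown>>(stdout: string): T[] {
  return stdout
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line) as T);
}

// Example: two rows as they might appear in a jsonl result.
const stdout = '{"customer_id":1,"total":42.5}\n{"customer_id":2,"total":7}\n';
const rows = parseJsonlRows<{ customer_id: number; total: number }>(stdout);
console.log(rows.length); // 2
```

The same approach works for `'csv'` output with a CSV parser in place of `JSON.parse`.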
## Install

```shell
npm install @fabric-harness/databricks @fabric-harness/sdk
```

## SQL Warehouse sandbox

Run an entire agent against a SQL Warehouse, with `session.shell()` (or `session.prompt()` tool calls) routed to SQL statement execution:
```ts
import { init } from '@fabric-harness/sdk';
import { databricksSqlSandbox } from '@fabric-harness/databricks/sql-sandbox';

const fabric = await init({
  sandbox: databricksSqlSandbox({
    host: process.env.DATABRICKS_HOST!,
    token: process.env.DATABRICKS_TOKEN!,
    warehouseId: process.env.DATABRICKS_WAREHOUSE_ID!,
    catalog: 'main',
    schema: 'analytics',
    resultFormat: 'jsonl', // or 'csv'
  }),
});

const session = await fabric.session();
const result = await session.shell('SELECT customer_id, SUM(amount) FROM orders GROUP BY 1');
console.log(result.stdout); // newline-delimited JSON rows
```

For real data files, mount with `databricksVolumeSource` from `@fabric-harness/connectors/databricks-volume`:
```ts
import { databricksVolumeSource } from '@fabric-harness/connectors/databricks-volume';

await session.mount('/mnt/landing', databricksVolumeSource({
  host: process.env.DATABRICKS_HOST!,
  token: process.env.DATABRICKS_TOKEN!,
  volumePath: '/Volumes/main/landing/raw',
}));
```

## Documentation
- Databricks deployment guide
- Roadmap: cluster/notebook-backed `SandboxEnv` and a Jobs deploy target are planned.
Keep Databricks hostnames and tokens in environment variables or secret stores; do not put them in prompts or payloads.
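One way to follow that advice in agent code is to read the variables once at startup and fail fast when any are missing; `requireEnv` below is a hypothetical helper, not part of this package:

```typescript
// Collect required environment variables, throwing one descriptive
// error that lists everything missing or empty.
// requireEnv is a hypothetical helper, not part of @fabric-harness/databricks.
function requireEnv(
  names: string[],
  env: Record<string, string | undefined>,
): Record<string, string> {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return Object.fromEntries(names.map((name) => [name, env[name] as string]));
}

// In real code, pass process.env; a literal object keeps this sketch self-contained.
const cfg = requireEnv(
  ['DATABRICKS_HOST', 'DATABRICKS_TOKEN', 'DATABRICKS_WAREHOUSE_ID'],
  {
    DATABRICKS_HOST: 'https://example.cloud.databricks.com',
    DATABRICKS_TOKEN: 'example-token',
    DATABRICKS_WAREHOUSE_ID: 'abc123',
  },
);
console.log(cfg.DATABRICKS_HOST);
```

Failing at startup with one aggregated error beats discovering a missing token midway through an agent run.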
