Databricks MCP Server

by samhavens
Platforms: Linux, macOS

upload_file_to_volume

Transfer local files to Databricks Unity Catalog volumes for storage or processing. Supports large multi-GB files, progress tracking, and an overwrite option. Returns the upload status, file size, and upload time.

Instructions

Upload a local file to a Databricks Unity Catalog volume.

Args:
    local_file_path: Path to the local file (e.g. './data/products.json')
    volume_path: Full volume path (e.g. '/Volumes/catalog/schema/volume/file.json')
    overwrite: Whether to overwrite an existing file (default: False)

Returns:
    JSON with upload results, including success status, file size in MB, and upload time.

Example:
    # Upload a large dataset to a volume
    result = upload_file_to_volume(
        local_file_path='./stark_export/products_full.json',
        volume_path='/Volumes/kbqa/stark_mas_eval/stark_raw_data/products_full.json',
        overwrite=True
    )

Note: Handles large files (multi-GB) with progress tracking and proper error handling. Perfect for uploading extracted datasets to Unity Catalog volumes for processing.
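For context, uploads to Unity Catalog volumes typically go through the Databricks Files API. The sketch below is a minimal, hypothetical version of that flow using the official databricks-sdk; it is not the server's actual implementation, and the progress tracking and error handling mentioned in the note above are omitted.

```python
# Minimal sketch of a volume upload with the official databricks-sdk (not the server's own code).
import json
import os
import time

from databricks.sdk import WorkspaceClient


def upload_file_to_volume(local_file_path: str, volume_path: str, overwrite: bool = False) -> str:
    """Upload a local file to a Unity Catalog volume and return a JSON status string."""
    w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN from the environment
    size_mb = os.path.getsize(local_file_path) / (1024 * 1024)
    start = time.time()
    with open(local_file_path, "rb") as f:
        # Files API upload; passing a file handle avoids reading the whole file into memory.
        w.files.upload(volume_path, f, overwrite=overwrite)
    return json.dumps({
        "success": True,
        "file_size_mb": round(size_mb, 2),
        "upload_time_seconds": round(time.time() - start, 2),
    })
```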

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| local_file_path | Yes | Path to the local file to upload (e.g. './data/products.json') | |
| overwrite | No | Whether to overwrite an existing file | false |
| volume_path | Yes | Full Unity Catalog volume path (e.g. '/Volumes/catalog/schema/volume/file.json') | |

Input Schema (JSON Schema)

{ "properties": { "local_file_path": { "title": "Local File Path", "type": "string" }, "overwrite": { "default": false, "title": "Overwrite", "type": "boolean" }, "volume_path": { "title": "Volume Path", "type": "string" } }, "required": [ "local_file_path", "volume_path" ], "title": "upload_file_to_volumeArguments", "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/samhavens/databricks-mcp-server'
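The same endpoint can be queried from Python, for example with the requests library (assuming the endpoint returns JSON):

```python
# Fetch this server's directory entry; assumes the endpoint returns JSON.
import requests

resp = requests.get("https://glama.ai/api/mcp/v1/servers/samhavens/databricks-mcp-server")
resp.raise_for_status()
print(resp.json())
```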

If you have feedback or need assistance with the MCP directory API, please join our Discord server.