databases · analytics-data

Databricks

by characat0

Interact with Databricks data catalogs, schemas, and SQL warehouses securely.

Provides a bridge between AI and Databricks workspaces, enabling interaction with data catalogs, schemas, tables, and SQL warehouses for direct querying and analysis of Databricks data.

GitHub stars: 4

Direct SQL execution on warehouses · Full catalog browsing capabilities

best for

  • Data analysts exploring Databricks datasets
  • AI-powered data analysis workflows
  • Automated reporting from Databricks
  • Data discovery and catalog exploration

capabilities

  • Execute SQL queries on Databricks warehouses
  • Browse data catalogs and schemas
  • List available tables with filtering
  • Get detailed table information
  • Access SQL warehouse metadata
  • Query Databricks data structures

what it does

Connects AI assistants to Databricks workspaces for browsing data catalogs and executing SQL queries. Query your Databricks tables and warehouses directly through natural language.

about

Databricks is a community-built MCP server published by characat0 that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets clients interact with Databricks data catalogs, schemas, and SQL warehouses securely. The server is categorized under databases and analytics-data, and exposes six tools that AI clients can invoke during conversations and coding sessions.

how to install

You can install Databricks in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

Databricks is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

Databricks MCP Server

A Model Context Protocol (MCP) server for interacting with Databricks.

Installation

You can download the latest release for your platform from the Releases page.

VS Code

Install the Databricks MCP Server in VS Code by clicking the following badge:

<img src="https://img.shields.io/badge/VS_Code-VS_Code?style=flat-square&label=Install%20Server&color=0098FF" alt="Install in VS Code">

Alternatively, you can register the server manually by running the following command:

# For VS Code
code --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'
# For VS Code Insiders
code-insiders --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'
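For MCP clients configured through a JSON file rather than a CLI flag (Claude Desktop, for example, reads `claude_desktop_config.json`), a roughly equivalent entry might look like the sketch below. This is an illustrative assumption, not taken from the upstream README:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "npx",
      "args": ["databricks-mcp-server@latest"]
    }
  }
}
```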

Tools

The Databricks MCP Server provides a Model Context Protocol (MCP) interface to interact with Databricks workspaces. It offers the following functionalities:

List Catalogs

Lists all catalogs available in the Databricks workspace.

Tool name: list_catalogs

Parameters: None

Returns: JSON array of catalog objects

List Schemas

Lists all schemas in a specified Databricks catalog.

Tool name: list_schemas

Parameters:

  • catalog (string, required): Name of the catalog to list schemas from

Returns: JSON array of schema objects

List Tables

Lists all tables in a specified Databricks schema with optional filtering.

Tool name: list_tables

Parameters:

  • catalog (string, required): Name of the catalog containing the schema
  • schema (string, required): Name of the schema to list tables from
  • filter_pattern (string, optional, default: ".*"): Regular expression pattern to filter table names

Returns: JSON array of table objects
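MCP tool invocations travel as JSON-RPC 2.0 `tools/call` messages, so a client's `list_tables` request can be sketched as follows. The catalog and schema names are placeholders, and the helper function is hypothetical:

```python
import json

def make_tool_call(request_id, name, arguments):
    """Build a JSON-RPC 2.0 'tools/call' request as sent by MCP clients."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# List only tables whose names start with "sales_"
# (filter_pattern is a regular expression; the default ".*" matches all).
request = make_tool_call(1, "list_tables", {
    "catalog": "main",        # placeholder catalog name
    "schema": "analytics",    # placeholder schema name
    "filter_pattern": "^sales_.*",
})
print(json.dumps(request, indent=2))
```

Note that `filter_pattern` is a regex, not a SQL `LIKE` pattern, so `^sales_.*` rather than `sales_%` narrows the listing.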

Execute SQL

Executes SQL statements on a Databricks SQL warehouse and returns the results.

Tool name: execute_sql

Parameters:

  • statement (string, required): SQL statement to execute
  • timeout_seconds (number, optional, default: 60): Timeout in seconds for the statement execution
  • row_limit (number, optional, default: 100): Maximum number of rows to return in the result

Returns: JSON object containing columns and rows from the query result, along with information about the SQL warehouse used to execute the statement.
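To illustrate the parameters and the columns-plus-rows result shape, here is a minimal sketch. The argument defaults match the documentation above, but the exact result field names (`columns`, `rows`, `warehouse`) are assumptions for illustration:

```python
# Build the arguments for an execute_sql call; timeout_seconds and
# row_limit fall back to the documented defaults (60 and 100) when omitted.
def execute_sql_args(statement, timeout_seconds=60, row_limit=100):
    return {
        "statement": statement,
        "timeout_seconds": timeout_seconds,
        "row_limit": row_limit,
    }

args = execute_sql_args("SELECT id, amount FROM main.analytics.orders LIMIT 5")

# A client might zip the returned columns and rows into records.
# This result object is a hypothetical example of the documented shape.
result = {
    "columns": ["id", "amount"],
    "rows": [[1, 9.99], [2, 24.5]],
    "warehouse": {"id": "abc123", "name": "demo-warehouse"},  # hypothetical
}
records = [dict(zip(result["columns"], row)) for row in result["rows"]]
print(records)  # → [{'id': 1, 'amount': 9.99}, {'id': 2, 'amount': 24.5}]
```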

List SQL Warehouses

Lists all SQL warehouses available in the Databricks workspace.

Tool name: list_warehouses

Parameters: None

Returns: JSON array of SQL warehouse objects

Supported Platforms

  • Linux (amd64)
  • Windows (amd64)
  • macOS (Intel/amd64)
  • macOS (Apple Silicon/arm64)

Usage

Authentication

The application uses Databricks unified authentication. For details on how to configure authentication, please refer to the Databricks Authentication documentation.

Running the Server

Start the MCP server:

./databricks-mcp-server

The server will start and listen for MCP protocol commands on standard input/output.
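The stdio transport exchanges newline-delimited JSON-RPC messages over the server's stdin and stdout. A client's opening `initialize` request can be sketched as below; the `protocolVersion` and `clientInfo` values are illustrative assumptions:

```python
import json

# MCP's stdio transport sends one JSON-RPC message per line.
# This builds the client's opening "initialize" request.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",   # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
line = json.dumps(initialize) + "\n"  # one message per line on stdin
print(line, end="")
```

A client would write `line` to the server process's stdin (e.g. after launching `./databricks-mcp-server` with `subprocess.Popen`) and read the response line by line from its stdout.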

Development

Prerequisites

  • Go 1.24 or later

FAQ

What is the Databricks MCP server?
Databricks is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Databricks?
This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.5 · 10 reviews
  • Shikha Mishra· Oct 10, 2024

    Databricks is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Piyush G· Sep 9, 2024

    We evaluated Databricks against two servers with overlapping tools; this profile had the clearer scope statement.

  • Chaitanya Patil· Aug 8, 2024

    Useful MCP listing: Databricks is the kind of server we cite when onboarding engineers to host + tool permissions.

  • Sakshi Patil· Jul 7, 2024

    Databricks reduced integration guesswork — categories and install configs on the listing matched the upstream repo.

  • Ganesh Mohane· Jun 6, 2024

    I recommend Databricks for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Oshnikdeep· May 5, 2024

    Strong directory entry: Databricks surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Dhruvi Jain· Apr 4, 2024

    Databricks has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • Rahul Santra· Mar 3, 2024

    According to our notes, Databricks benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.

  • Pratham Ware· Feb 2, 2024

    We wired Databricks into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Yash Thakker· Jan 1, 2024

    Databricks is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.