
Gemini Bridge

by elyin

Bridge Claude and Google's Gemini AI using the official Gemini CLI. Enable direct queries and file sharing between models.

Bridges Claude with Google's Gemini AI through the official Gemini CLI, enabling direct queries and file-based context sharing between the two language models.

github stars

85

  • Zero API costs via Gemini CLI
  • 60-second timeout protection
  • No complex state management

best for

  • AI developers comparing model responses
  • Code analysis using multiple AI perspectives
  • Research requiring different AI model approaches

capabilities

  • Send queries to Gemini models
  • Share file context with Gemini
  • Execute Gemini CLI commands
  • Analyze files using Gemini

what it does

Connects Claude to Google's Gemini AI through the official Gemini CLI, allowing you to query Gemini models and share file context between the two language models.

about

Gemini Bridge is a community-built MCP server published by elyin that provides AI assistants with tools and capabilities via the Model Context Protocol. It bridges Claude and Google's Gemini AI using the official Gemini CLI, enabling direct queries and file sharing between models. It is categorized under AI/ML and developer tools. This server exposes 2 tools that AI clients can invoke during conversations and coding sessions.

how to install

You can install Gemini Bridge in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

Gemini Bridge is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

Gemini Bridge

Badges: CI status · PyPI version · MIT license · Python 3.10+ · MCP compatible · Gemini CLI

A lightweight MCP (Model Context Protocol) server that enables AI coding assistants to interact with Google's Gemini AI through the official CLI. Works with Claude Code, Cursor, VS Code, and other MCP-compatible clients. Designed for simplicity, reliability, and seamless integration.

✨ Features

  • Direct Gemini CLI Integration: Zero API costs using official Gemini CLI
  • Simple MCP Tools: Two core functions for basic queries and file analysis
  • Stateless Operation: No sessions, caching, or complex state management
  • Production Ready: Robust error handling with configurable 60-second timeouts
  • Minimal Dependencies: Only requires mcp>=1.0.0 and Gemini CLI
  • Easy Deployment: Support for both uvx and traditional pip installation
  • Universal MCP Compatibility: Works with any MCP-compatible AI coding assistant
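
"Stateless operation" means each tool call shells out to the `gemini` binary once and returns its output, with no sessions or caching in between. A minimal, hypothetical sketch of that pattern (illustrative only — `run_cli` is not the server's actual source):

```python
import subprocess

def run_cli(args, cwd=None, timeout=60):
    """Run a CLI command once, statelessly, returning stdout or an error string."""
    try:
        result = subprocess.run(
            args, cwd=cwd, capture_output=True, text=True, timeout=timeout
        )
    except subprocess.TimeoutExpired:
        # Hard timeout protection: the subprocess is killed and an error is returned
        return f"error: command timed out after {timeout}s"
    if result.returncode != 0:
        return f"error: {result.stderr.strip()}"
    return result.stdout

# e.g. run_cli(["gemini", "--version"]) once the Gemini CLI is installed
```

Because every invocation is independent, there is nothing to clean up or recover if a call fails — the next call starts fresh.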

🚀 Quick Start

Prerequisites

  1. Install Gemini CLI:

    npm install -g @google/gemini-cli
    
  2. Authenticate with Gemini:

    gemini auth login
    
  3. Verify installation:

    gemini --version
    

Installation

🎯 Recommended: PyPI Installation

# Install from PyPI
pip install gemini-bridge

# Add to Claude Code with uvx (recommended)
claude mcp add gemini-bridge -s user -- uvx gemini-bridge

Alternative: From Source

# Clone the repository
git clone https://github.com/shelakh/gemini-bridge.git
cd gemini-bridge

# Build and install locally
uvx --from build pyproject-build
pip install dist/*.whl

# Add to Claude Code
claude mcp add gemini-bridge -s user -- uvx gemini-bridge

Development Installation

# Clone and install in development mode
git clone https://github.com/shelakh/gemini-bridge.git
cd gemini-bridge
pip install -e .

# Add to Claude Code (development)
claude mcp add gemini-bridge-dev -s user -- python -m src

🌐 Multi-Client Support

Gemini Bridge works with any MCP-compatible AI coding assistant - the same server supports multiple clients through different configuration methods.

Supported MCP Clients

  • Claude Code ✅ (Default)
  • Cursor
  • VS Code
  • Windsurf
  • Cline
  • Void
  • Cherry Studio
  • Augment
  • Roo Code
  • Zencoder
  • Any MCP-compatible client

Configuration Examples

<details> <summary><strong>Claude Code</strong> (Default)</summary>
# Recommended installation
claude mcp add gemini-bridge -s user -- uvx gemini-bridge

# Development installation
claude mcp add gemini-bridge-dev -s user -- python -m src
</details> <details> <summary><strong>Cursor</strong></summary>

Global Configuration (~/.cursor/mcp.json):

{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}

Project-Specific (.cursor/mcp.json in your project):

{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}

Go to: Settings → Cursor Settings → MCP → Add new global MCP server

</details> <details> <summary><strong>VS Code</strong></summary>

Configuration (.vscode/mcp.json in your workspace):

{
  "servers": {
    "gemini-bridge": {
      "type": "stdio",
      "command": "uvx",
      "args": ["gemini-bridge"]
    }
  }
}

Alternative: Through Extensions

  1. Open Extensions view (Ctrl+Shift+X)
  2. Search for MCP extensions
  3. Add custom server with command: uvx gemini-bridge
</details> <details> <summary><strong>Windsurf</strong></summary>

Add to your Windsurf MCP configuration:

{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
</details> <details> <summary><strong>Cline</strong> (VS Code Extension)</summary>
  1. Open Cline and click MCP Servers in the top navigation
  2. Select Installed tab → Advanced MCP Settings
  3. Add to cline_mcp_settings.json:
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
</details> <details> <summary><strong>Void</strong></summary>

Go to: Settings → MCP → Add MCP Server

{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
</details> <details> <summary><strong>Cherry Studio</strong></summary>
  1. Navigate to Settings → MCP Servers → Add Server
  2. Fill in the server details:
    • Name: gemini-bridge
    • Type: STDIO
    • Command: uvx
    • Arguments: ["gemini-bridge"]
  3. Save the configuration
</details> <details> <summary><strong>Augment</strong></summary>

Using the UI:

  1. Click hamburger menu → Settings → Tools
  2. Click + Add MCP button
  3. Enter command: uvx gemini-bridge
  4. Name: Gemini Bridge

Manual Configuration:

"augment.advanced": { 
  "mcpServers": [ 
    { 
      "name": "gemini-bridge", 
      "command": "uvx", 
      "args": ["gemini-bridge"],
      "env": {}
    }
  ]
}
</details> <details> <summary><strong>Roo Code</strong></summary>
  1. Go to Settings → MCP Servers → Edit Global Config
  2. Add to mcp_settings.json:
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
</details> <details> <summary><strong>Zencoder</strong></summary>
  1. Go to Zencoder menu (...) → Tools → Add Custom MCP
  2. Add configuration:
{
  "command": "uvx",
  "args": ["gemini-bridge"],
  "env": {}
}
  3. Hit the Install button
</details> <details> <summary><strong>Alternative Installation Methods</strong></summary>

For pip-based installations:

{
  "command": "gemini-bridge",
  "args": [],
  "env": {}
}

For development/local testing:

{
  "command": "python",
  "args": ["-m", "src"],
  "env": {},
  "cwd": "/path/to/gemini-bridge"
}

For npm-style installation (if needed):

{
  "command": "npx",
  "args": ["gemini-bridge"],
  "env": {}
}
</details>

Universal Usage

Once configured with any client, use the same two tools:

  1. Ask general questions: "What authentication patterns are used in this codebase?"
  2. Analyze specific files: "Review these auth files for security issues"

The server implementation is identical - only the client configuration differs!

⚙️ Configuration

Timeout Configuration

By default, Gemini Bridge uses a 60-second timeout for all CLI operations. For longer queries (large files, complex analysis), you can configure a custom timeout using the GEMINI_BRIDGE_TIMEOUT environment variable.

Example configurations:

<details> <summary><strong>Claude Code</strong></summary>
# Add with custom timeout (120 seconds)
claude mcp add gemini-bridge -s user --env GEMINI_BRIDGE_TIMEOUT=120 -- uvx gemini-bridge
</details> <details> <summary><strong>Manual Configuration (mcp_settings.json)</strong></summary>
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {
        "GEMINI_BRIDGE_TIMEOUT": "120"
      }
    }
  }
}
</details>

Timeout Options:

  • Default: 60 seconds (if not configured)
  • Range: Any positive integer (seconds)
  • Per-call override: Supply timeout_seconds to either tool for one-off extensions
  • Recommended: 120-300 seconds for large file analysis
  • Invalid values: Fall back to 60 seconds with warning
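
The fallback behavior above can be pictured with a small helper; this is an illustrative sketch under the documented rules, not the server's actual code (`resolve_timeout` is a hypothetical name):

```python
import os

DEFAULT_TIMEOUT = 60  # seconds, used when the variable is unset or invalid

def resolve_timeout(env=None):
    """Read GEMINI_BRIDGE_TIMEOUT, falling back to 60s on missing or bad values."""
    env = os.environ if env is None else env
    raw = env.get("GEMINI_BRIDGE_TIMEOUT")
    if raw is None:
        return DEFAULT_TIMEOUT
    try:
        value = int(raw)
    except ValueError:
        # Non-numeric values fall back to the default, with a warning
        print(f"warning: invalid GEMINI_BRIDGE_TIMEOUT {raw!r}; using {DEFAULT_TIMEOUT}s")
        return DEFAULT_TIMEOUT
    if value <= 0:
        # Only positive integers are accepted
        print(f"warning: non-positive GEMINI_BRIDGE_TIMEOUT {raw!r}; using {DEFAULT_TIMEOUT}s")
        return DEFAULT_TIMEOUT
    return value
```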

🛠️ Available Tools

consult_gemini

Direct CLI bridge for simple queries.

Parameters:

  • query (string): The question or prompt to send to Gemini
  • directory (string): Working directory for the query (default: current directory)
  • model (string, optional): Model to use - "flash" or "pro" (default: "flash")
  • timeout_seconds (int, optional): Override the execution timeout for this request

Example:

consult_gemini(
    query="Find authentication patterns in this codebase",
    directory="/path/to/project",
    model="flash"
)

consult_gemini_with_files

CLI bridge with file attachments for detailed analysis.

Parameters:

  • query (string): The question or prompt to send to Gemini
  • directory (string): Working directory for the query
  • files (list): List of file paths relative to the directory
  • model (string, optional): Model to use - "flash" or "pro" (default: "flash")
  • timeout_seconds (int, optional): Override the execution timeout for this request
  • mode (string, optional): Either "inline" (default) to stream file contents or "at_command" to let Gemini CLI resolve @path references itself

Example:

consult_gemini_with_files(
    query="Analyze these auth files and suggest improvements",
    directory="/path/to/project",
    files=["src/auth.py", "src/models.py"],
    model="pro",
    timeout_seconds=180
)

Tip: When scanning large trees, switch to mode="at_command" so the Gemini CLI handles file globbing and truncation natively.
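
For instance, the earlier file-review call could delegate file resolution to the CLI by switching modes (parameter values here are illustrative):

```
consult_gemini_with_files(
    query="Analyze these auth files and suggest improvements",
    directory="/path/to/project",
    files=["src/auth.py", "src/models.py"],
    mode="at_command"
)
```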

📋 Usage Examples

Basic Code Analysis

# Simple research query
consult_gemini(
    query="What authentication patterns are used in this project?",
    directory="/Users/dev/my-project"
)

Detailed File Review

# Analyze specific files
consult_gemini_with_files(
    query="Review these files and suggest security improvements",
    directory="/Users/dev/my-project",
    files=["src/auth.py", "src/models.py"]
)

---

FAQ

What is the Gemini Bridge MCP server?
Gemini Bridge is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Gemini Bridge?
This profile displays 64 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.8 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.8 · 64 reviews
  • Anaya Rao · Dec 28, 2024

    Strong directory entry: Gemini Bridge surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Shikha Mishra · Dec 24, 2024

    Gemini Bridge has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • Camila Anderson · Dec 24, 2024

    We wired Gemini Bridge into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Advait Haddad · Dec 12, 2024

    Useful MCP listing: Gemini Bridge is the kind of server we cite when onboarding engineers to host + tool permissions.

  • Yuki Robinson · Dec 4, 2024

    Gemini Bridge is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Lucas Farah · Nov 23, 2024

    We wired Gemini Bridge into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Lucas Haddad · Nov 19, 2024

    I recommend Gemini Bridge for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Camila Reddy · Nov 15, 2024

    Gemini Bridge is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Camila Thomas · Nov 3, 2024

    According to our notes, Gemini Bridge benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.

  • Camila Rao · Oct 22, 2024

    We wired Gemini Bridge into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

showing 1-10 of 64
