
Gemini DeepSearch

by alexcong

Gemini DeepSearch automates web research using Google Search API and Gemini models, delivering in-depth, cited insights

Performs automated multi-step web research using Google Search API and Gemini models to generate diverse search queries, conduct parallel searches, and synthesize comprehensive answers with proper source citations through configurable research depth levels.

github stars

25

  • Multi-step automated research workflow
  • Three configurable effort levels
  • LangGraph-powered state management

best for

  • Research analysts needing comprehensive topic investigation
  • Content creators requiring well-sourced information
  • Students and academics conducting literature reviews
  • Professionals preparing detailed reports

capabilities

  • Generate sophisticated search queries automatically
  • Conduct parallel web searches using Google Search API
  • Synthesize information from multiple sources
  • Identify knowledge gaps and iterate research
  • Produce citation-rich answers with source tracking
  • Configure research depth with effort levels

what it does

Performs automated multi-step web research using Google Search and Gemini AI to generate comprehensive answers with citations. Configurable research depth levels control query generation and iteration loops.

about

Gemini DeepSearch is a community-built MCP server published by alexcong that provides AI assistants with tools and capabilities via the Model Context Protocol. It automates web research using the Google Search API and Gemini models, delivering in-depth, cited insights. It is categorized under Search Web and AI/ML.

how to install

You can install Gemini DeepSearch in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

Gemini DeepSearch is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

Gemini DeepSearch MCP

Gemini DeepSearch MCP is an automated research agent that leverages Google Gemini models and Google Search to perform deep, multi-step web research. It generates sophisticated queries, synthesizes information from search results, identifies knowledge gaps, and produces high-quality, citation-rich answers.
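
The generate-search-reflect loop described above can be sketched roughly as follows. This is a toy, self-contained illustration, not the server's actual code: every helper is a stand-in stub (the real agent uses Gemini models for query generation, summarization, and gap reflection, and the Google Search API for retrieval).

```python
# Toy sketch of the generate -> search -> reflect -> synthesize loop.
# All helpers below are illustrative stubs, not the server's real internals.

def generate_queries(topic: str, n: int) -> list[str]:
    # Real server: asks a Gemini model for n diverse search queries.
    return [f"{topic} (angle {i})" for i in range(1, n + 1)]

def web_search(query: str) -> list[dict]:
    # Real server: calls the Google Search API; queries run in parallel.
    return [{"url": f"https://example.com/{abs(hash(query)) % 1000}",
             "snippet": f"notes on {query}"}]

def find_knowledge_gaps(topic: str, notes: list[str]) -> list[str]:
    # Real server: a Gemini reflection step; an empty list means coverage
    # looks complete and the loop can stop early.
    return []

def deep_search(query: str, num_queries: int = 3, max_loops: int = 2) -> dict:
    sources: list[str] = []
    notes: list[str] = []
    queries = generate_queries(query, num_queries)
    for _ in range(max_loops):
        for q in queries:
            for hit in web_search(q):
                sources.append(hit["url"])
                notes.append(hit["snippet"])
        gaps = find_knowledge_gaps(query, notes)
        if not gaps:
            break  # no knowledge gaps identified: stop iterating
        queries = generate_queries(" ".join(gaps), num_queries)
    answer = f"{query}: synthesized from {len(notes)} notes"
    return {"answer": answer, "sources": sources}
```

The `num_queries` and `max_loops` knobs correspond to what the effort levels control (see the API section below in the README).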

Features

  • Automated multi-step research using Gemini models and Google Search
  • FastMCP integration for both HTTP API and stdio deployment
  • Configurable effort levels (low, medium, high) for research depth
  • Citation-rich responses with source tracking
  • LangGraph-powered workflow with state management

Usage

Development Server (HTTP + Studio UI)

Start the LangGraph development server with Studio UI:

make dev

Local MCP Server (stdio)

Start the MCP server with stdio transport for integration with MCP clients:

make local

Testing

Run the test suite:

make test

Test the MCP stdio server:

make test_mcp

Use MCP inspector

make inspect

With Langsmith tracing

GEMINI_API_KEY=AI******* LANGSMITH_API_KEY=ls******* LANGSMITH_TRACING=true make inspect

API

The deep_search tool accepts:

  • query (string): The research question or topic to investigate
  • effort (string): Research effort level - "low", "medium", or "high"
    • Low: 1 query, 1 loop, Flash model
    • Medium: 3 queries, 2 loops, Flash model
    • High: 5 queries, 3 loops, Pro model
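
The effort-to-parameter mapping above can be expressed as a small lookup table. The sketch below is illustrative only: the field names and model labels are placeholders, not the server's actual configuration keys.

```python
# Illustrative mapping of effort levels to research parameters, per the
# table above. Field names and model labels are placeholders.
EFFORT_LEVELS = {
    "low":    {"initial_queries": 1, "max_loops": 1, "model": "flash"},
    "medium": {"initial_queries": 3, "max_loops": 2, "model": "flash"},
    "high":   {"initial_queries": 5, "max_loops": 3, "model": "pro"},
}

def resolve_effort(effort: str) -> dict:
    """Return the research parameters for an effort level, or raise."""
    try:
        return EFFORT_LEVELS[effort]
    except KeyError:
        raise ValueError(
            f"effort must be 'low', 'medium', or 'high', got {effort!r}"
        ) from None
```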

Return Format

HTTP MCP Server (Development mode):

  • answer: Comprehensive research response with citations
  • sources: List of source URLs used in research

Stdio MCP Server (Claude Desktop integration):

  • file_path: Path to a JSON file containing the research results

The stdio MCP server writes results to a JSON file in the system temp directory to optimize token usage. The JSON file contains the same answer and sources data as the HTTP version, but is accessed via file path rather than returned directly.
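
A client can read that file back with a few lines of Python. This is a hedged sketch assuming the file holds a JSON object with "answer" and "sources" keys, mirroring the HTTP response shape described above; verify the exact shape against your installed version.

```python
import json
from pathlib import Path

def load_deep_search_result(file_path: str) -> tuple[str, list[str]]:
    """Load the answer text and source URLs from the result file whose
    path the stdio server returns. Assumed keys: "answer", "sources"."""
    data = json.loads(Path(file_path).read_text(encoding="utf-8"))
    return data["answer"], data.get("sources", [])
```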

Requirements

  • Python 3.12+
  • GEMINI_API_KEY environment variable

Installation

Install directly with uv:

uv tool install gemini-deepsearch-mcp

Claude Desktop Integration

To use the MCP server with Claude Desktop, add this configuration to your Claude Desktop config file:

macOS

Edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "gemini-deepsearch": {
      "command": "uvx",
      "args": ["gemini-deepsearch-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      },
      "timeout": 180000
    }
  }
}

Windows

Edit %APPDATA%\Claude\claude_desktop_config.json:

{
  "mcpServers": {
    "gemini-deepsearch": {
      "command": "uvx",
      "args": ["gemini-deepsearch-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      },
      "timeout": 180000
    }
  }
}

Linux

Edit ~/.config/claude/claude_desktop_config.json:

{
  "mcpServers": {
    "gemini-deepsearch": {
      "command": "uvx",
      "args": ["gemini-deepsearch-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      },
      "timeout": 180000
    }
  }
}

Important:

  • Replace your-gemini-api-key-here with your actual Gemini API key
  • Restart Claude Desktop after updating the configuration
  • Set an ample timeout (the examples use 180000 ms) to avoid MCP error -32001: Request timed out

Alternative: Local Development Setup

For development or if you prefer to run from source:

{
  "mcpServers": {
    "gemini-deepsearch": {
      "command": "uv",
      "args": ["run", "python", "main.py"],
      "cwd": "/path/to/gemini-deepsearch-mcp",
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}

Replace /path/to/gemini-deepsearch-mcp with the actual absolute path to your project directory.

Once configured, you can use the deep_search tool in Claude Desktop by asking questions like:

  • "Use deep_search to research the latest developments in quantum computing"
  • "Search for information about renewable energy trends with high effort"

Agent Source

The deep search agent is from the Gemini Fullstack LangGraph Quickstart repository.

License

MIT

FAQ

What is the Gemini DeepSearch MCP server?
Gemini DeepSearch is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Gemini DeepSearch?
This profile displays 70 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.7 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.7 average · 70 reviews
  • Li Nasser· Dec 28, 2024

    We evaluated Gemini DeepSearch against two servers with overlapping tools; this profile had the clearer scope statement.

  • Li Garcia· Dec 24, 2024

    We wired Gemini DeepSearch into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Ganesh Mohane· Dec 20, 2024

    We wired Gemini DeepSearch into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Isabella Jain· Dec 20, 2024

    Gemini DeepSearch is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Li Anderson· Dec 4, 2024

    Gemini DeepSearch has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • James Mehta· Nov 23, 2024

    Strong directory entry: Gemini DeepSearch surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Mateo Okafor· Nov 19, 2024

    Gemini DeepSearch is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Li Johnson· Nov 15, 2024

    Gemini DeepSearch is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Sakshi Patil· Nov 11, 2024

    Gemini DeepSearch is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Benjamin Smith· Nov 11, 2024

    We wired Gemini DeepSearch into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.
