
Qdrant

by qdrant

Qdrant is a powerful vector database for AI systems to store and retrieve vector-based memories with advanced vector search capabilities.

Store and retrieve vector-based memories for AI systems.

GitHub stars: 1.3K



  • Semantic search capabilities
  • Metadata support for enhanced organization

best for

  • AI assistants that need long-term memory
  • Chatbots requiring context from past conversations
  • Knowledge management systems with semantic search
  • AI applications needing persistent memory storage

capabilities

  • Store text information with vector embeddings
  • Search for semantically similar content
  • Add metadata to stored information
  • Manage multiple collections
  • Retrieve relevant memories based on queries

what it does

Provides vector-based memory storage and retrieval for AI systems using Qdrant database. Enables AI to store information and find semantically similar content later.

about

Qdrant is an official MCP server published by qdrant that provides AI assistants with tools and capabilities via the Model Context Protocol. Qdrant is a powerful vector database for AI systems to store and retrieve vector-based memories with advanced vector search capabilities. It is categorized under AI/ML and Databases.

how to install

You can install Qdrant in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

Apache-2.0

Qdrant is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

mcp-server-qdrant: A Qdrant MCP server


The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you're building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need.

This repository is an example of how to create an MCP server for Qdrant, a vector search engine.

Overview

An official Model Context Protocol server for keeping and retrieving memories in the Qdrant vector search engine. It acts as a semantic memory layer on top of the Qdrant database.

Components

Tools

  1. qdrant-store
    • Store some information in the Qdrant database
    • Input:
      • information (string): Information to store
      • metadata (JSON): Optional metadata to store
      • collection_name (string): Name of the collection to store the information in. This field is required if no default collection name is set. If a default collection name is set, this field is not available.
    • Returns: Confirmation message
  2. qdrant-find
    • Retrieve relevant information from the Qdrant database
    • Input:
      • query (string): Query to use for searching
      • collection_name (string): Name of the collection to search in. This field is required if no default collection name is set. If a default collection name is set, this field is not available.
    • Returns: Information stored in the Qdrant database as separate messages
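
For illustration, a raw qdrant-store call over MCP might look like the following JSON-RPC payload (the envelope follows the MCP tools/call method; the argument values are hypothetical):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "qdrant-store",
    "arguments": {
      "information": "Deployment runbook notes for the staging cluster",
      "metadata": { "source": "team-wiki" },
      "collection_name": "my-collection"
    }
  }
}
```

A matching qdrant-find call would pass a query string (and the same collection_name when no default collection is configured).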

Environment Variables

The configuration of the server is done using environment variables:

| Name | Description | Default Value |
|------|-------------|---------------|
| QDRANT_URL | URL of the Qdrant server | None |
| QDRANT_API_KEY | API key for the Qdrant server | None |
| COLLECTION_NAME | Name of the default collection to use | None |
| QDRANT_LOCAL_PATH | Path to the local Qdrant database (alternative to QDRANT_URL) | None |
| EMBEDDING_PROVIDER | Embedding provider to use (currently only "fastembed" is supported) | fastembed |
| EMBEDDING_MODEL | Name of the embedding model to use | sentence-transformers/all-MiniLM-L6-v2 |
| TOOL_STORE_DESCRIPTION | Custom description for the store tool | See default in settings.py |
| TOOL_FIND_DESCRIPTION | Custom description for the find tool | See default in settings.py |

Note: You cannot provide both QDRANT_URL and QDRANT_LOCAL_PATH at the same time.

[!IMPORTANT] Command-line arguments are no longer supported! Please use environment variables for all configuration.

FastMCP Environment Variables

Since mcp-server-qdrant is based on FastMCP, it also supports all the FastMCP environment variables. The most important ones are listed below:

| Environment Variable | Description | Default Value |
|----------------------|-------------|---------------|
| FASTMCP_DEBUG | Enable debug mode | false |
| FASTMCP_LOG_LEVEL | Set logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL) | INFO |
| FASTMCP_HOST | Host address to bind the server to | 127.0.0.1 |
| FASTMCP_PORT | Port to run the server on | 8000 |
| FASTMCP_WARN_ON_DUPLICATE_RESOURCES | Show warnings for duplicate resources | true |
| FASTMCP_WARN_ON_DUPLICATE_TOOLS | Show warnings for duplicate tools | true |
| FASTMCP_WARN_ON_DUPLICATE_PROMPTS | Show warnings for duplicate prompts | true |
| FASTMCP_DEPENDENCIES | List of dependencies to install in the server environment | [] |
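
As a sketch, the FastMCP variables combine with the server's own settings in a single invocation. For example, running with verbose logging on a non-default port might look like this (the port value and collection name here are illustrative):

```shell
QDRANT_URL="http://localhost:6333" \
COLLECTION_NAME="my-collection" \
FASTMCP_LOG_LEVEL="DEBUG" \
FASTMCP_PORT=9000 \
uvx mcp-server-qdrant --transport sse
```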

Installation

Using uvx

When using uvx, no installation step is needed; you can run mcp-server-qdrant directly.

QDRANT_URL="http://localhost:6333" \
COLLECTION_NAME="my-collection" \
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2" \
uvx mcp-server-qdrant

Transport Protocols

The server supports different transport protocols that can be specified using the --transport flag:

QDRANT_URL="http://localhost:6333" \
COLLECTION_NAME="my-collection" \
uvx mcp-server-qdrant --transport sse

Supported transport protocols:

  • stdio (default): Standard input/output transport, usable only by local MCP clients
  • sse: Server-Sent Events transport, suited for remote clients
  • streamable-http: Streamable HTTP transport, suited for remote clients; more recent than SSE

The default transport is stdio if not specified.

When the SSE transport is used, the server listens on the specified port and waits for incoming connections. The default port is 8000, but it can be changed using the FASTMCP_PORT environment variable.

QDRANT_URL="http://localhost:6333" \
COLLECTION_NAME="my-collection" \
FASTMCP_PORT=1234 \
uvx mcp-server-qdrant --transport sse
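
Once the server is up, you can sanity-check the SSE endpoint from another terminal. This assumes FastMCP's default /sse endpoint path; adjust if your FastMCP version mounts it elsewhere:

```shell
# Expect an event-stream response header if the server is listening on port 1234
curl -i -N "http://localhost:1234/sse" --max-time 3
```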

Using Docker

A Dockerfile is available for building and running the MCP server:

# Build the container
docker build -t mcp-server-qdrant .

# Run the container
docker run -p 8000:8000 \
  -e FASTMCP_HOST="0.0.0.0" \
  -e QDRANT_URL="http://your-qdrant-server:6333" \
  -e QDRANT_API_KEY="your-api-key" \
  -e COLLECTION_NAME="your-collection" \
  mcp-server-qdrant

[!TIP] Please note that we set FASTMCP_HOST="0.0.0.0" to make the server listen on all network interfaces. This is necessary when running the server in a Docker container.

Installing via Smithery

To install Qdrant MCP Server for Claude Desktop automatically via Smithery:

npx @smithery/cli install mcp-server-qdrant --client claude

Manual configuration of Claude Desktop

To use this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your claude_desktop_config.json:

{
  "qdrant": {
    "command": "uvx",
    "args": ["mcp-server-qdrant"],
    "env": {
      "QDRANT_URL": "https://xyz-example.eu-central.aws.cloud.qdrant.io:6333",
      "QDRANT_API_KEY": "your_api_key",
      "COLLECTION_NAME": "your-collection-name",
      "EMBEDDING_MODEL": "sentence-transformers/all-MiniLM-L6-v2"
    }
  }
}

For local Qdrant mode:

{
  "qdrant": {
    "command": "uvx",
    "args": ["mcp-server-qdrant"],
    "env": {
      "QDRANT_LOCAL_PATH": "/path/to/qdrant/database",
      "COLLECTION_NAME": "your-collection-name",
      "EMBEDDING_MODEL": "sentence-transformers/all-MiniLM-L6-v2"
    }
  }
}

This MCP server will automatically create a collection with the specified name if it doesn't exist.

By default, the server will use the sentence-transformers/all-MiniLM-L6-v2 embedding model to encode memories. For the time being, only FastEmbed models are supported.

Support for other tools

This MCP server can be used with any MCP-compatible client. For example, you can use it with Cursor and VS Code, which provide built-in support for the Model Context Protocol.
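
For VS Code, a minimal stdio configuration sketch might look like the following. The .vscode/mcp.json file name and the "servers" key reflect VS Code's MCP support at the time of writing, so verify against the current VS Code documentation; the URL and collection name are illustrative:

```json
{
  "servers": {
    "qdrant": {
      "command": "uvx",
      "args": ["mcp-server-qdrant"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "COLLECTION_NAME": "my-collection"
      }
    }
  }
}
```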

Using with Cursor/Windsurf

You can configure this MCP server to work as a code search tool for Cursor or Windsurf by customizing the tool descriptions:

QDRANT_URL="http://localhost:6333" \
COLLECTION_NAME="code-snippets" \
TOOL_STORE_DESCRIPTION="Store reusable code snippets for later retrieval. \
The 'information' parameter should contain a natural language description of what the code does, \
while the actual code should be included in the 'metadata' parameter as a 'code' property. \
The value of 'metadata' is a Python dictionary with strings as keys. \
Use this whenever you generate some code snippet." \
TOOL_FIND_DESCRIPTION="Search for relevant code snippets based on natural language descriptions. \
The 'query' parameter should describe what you're looking for, \
and the tool will return the most relevant code snippets. \
Use this when you need to find existing code snippets for reuse or reference." \
uvx mcp-server-qdrant --transport sse # Enable SSE transport

In Cursor/Windsurf, you can then configure the MCP server in your settings by pointing to this running server using the SSE transport protocol. Instructions for adding an MCP server to Cursor can be found in the [Cursor documentation](https://docs.cursor.com/context/model-context-protocol#adding).
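
As a sketch, pointing Cursor at the server started above could look like this in Cursor's MCP settings; the "url" form for SSE servers is an assumption based on Cursor's MCP documentation, and the /sse endpoint path is FastMCP's default:

```json
{
  "mcpServers": {
    "qdrant-code-search": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```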


FAQ

What is the Qdrant MCP server?
Qdrant is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Qdrant?
This profile displays 28 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.6 out of 5—verify behavior in your own environment before production use.

MCP server reviews

Ratings

4.6 average · 28 reviews
  • Dhruvi Jain· Dec 20, 2024

    We wired Qdrant into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Aarav Nasser· Dec 20, 2024

    Strong directory entry: Qdrant surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Oshnikdeep· Nov 11, 2024

    Qdrant is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Aarav Park· Nov 11, 2024

    I recommend Qdrant for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Ganesh Mohane· Oct 2, 2024

    Qdrant is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Xiao Bhatia· Oct 2, 2024

    Qdrant reduced integration guesswork — categories and install configs on the listing matched the upstream repo.

  • Daniel Nasser· Sep 25, 2024

    Qdrant is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.

  • Rahul Santra· Sep 9, 2024

    We evaluated Qdrant against two servers with overlapping tools; this profile had the clearer scope statement.

  • Evelyn Sethi· Sep 9, 2024

    Qdrant is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Pratham Ware· Aug 28, 2024

    Useful MCP listing: Qdrant is the kind of server we cite when onboarding engineers to host + tool permissions.
