search-web

AnyCrawl

by any4ai

AnyCrawl offers advanced web scraping and crawling with flexible depth limits. Scrape any website and extract structured data.

Integrates with the AnyCrawl API to provide web scraping and crawling capabilities with configurable depth limits, multiple scraping engines, and structured data extraction in various formats including markdown and JSON.

github stars

5

  • Cloud service available — no server setup
  • Multiple scraping engines supported
  • Async crawling with status monitoring

best for

  • Data analysts gathering web content for research
  • Developers building content aggregation systems
  • SEO professionals auditing website structures
  • Researchers collecting structured web data

capabilities

  • Scrape individual web pages for content
  • Crawl entire websites with depth control
  • Search the web and scrape results
  • Extract data as markdown, JSON, HTML, or screenshots
  • Monitor async crawling job status
  • Choose between Playwright, Cheerio, or Puppeteer engines

what it does

Scrapes web pages and crawls entire websites using the AnyCrawl API, extracting content in multiple formats with configurable depth limits.

about

AnyCrawl is an official MCP server published by any4ai that provides AI assistants with tools and capabilities via the Model Context Protocol. AnyCrawl offers advanced web scraping and crawling with flexible depth limits: scrape any website and extract structured data. It is categorized under search-web.

how to install

You can install AnyCrawl in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

AnyCrawl is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

AnyCrawl MCP Server

🚀 AnyCrawl MCP Server — Powerful web scraping and crawling for Cursor, Claude, and other LLM clients via the Model Context Protocol (MCP).

Features

  • Web Scraping: Extract content from single URLs with multiple output formats
  • Website Crawling: Crawl entire websites with configurable depth and limits
  • Search Engine Integration: Search the web and optionally scrape results
  • Multiple Engines: Support for Playwright, Cheerio, and Puppeteer
  • Flexible Output: Markdown, HTML, text, screenshots, and structured JSON
  • Async Operations: Non-blocking crawl jobs with status monitoring
  • Error Handling: Robust error handling and logging
  • Multiple Modes: STDIO (default), MCP (HTTP), SSE; cloud-ready with Nginx proxy

Installation

Running with npx

ANYCRAWL_API_KEY=YOUR-API-KEY npx -y anycrawl-mcp

Manual installation

npm install -g anycrawl-mcp-server

ANYCRAWL_API_KEY=YOUR-API-KEY anycrawl-mcp

Configuration

AnyCrawl MCP Server supports two deployment modes: Cloud Service (recommended) and Self-Hosted.

Cloud Service (Recommended)

Use the AnyCrawl cloud service at mcp.anycrawl.dev. No server setup required.

# Only need your API key
export ANYCRAWL_API_KEY="your-api-key-here"

Cloud endpoints:

  • MCP (streamable_http): https://mcp.anycrawl.dev/{API_KEY}/mcp
  • SSE: https://mcp.anycrawl.dev/{API_KEY}/sse

Self-Hosted Deployment

For self-hosted deployments, configure the base URL to point to your own AnyCrawl API instance:

export ANYCRAWL_API_KEY="your-api-key-here"
export ANYCRAWL_BASE_URL="https://your-api-server.com"  # Your self-hosted API URL

For local development with custom host/port:

export ANYCRAWL_HOST="127.0.0.1"  # Default: mcp.anycrawl.dev (cloud)
export ANYCRAWL_PORT="3000"       # Default: 3000

Get your API key

  • Visit the AnyCrawl website and sign up or log in: AnyCrawl
  • 🎉 Sign up for free to receive 1,500 credits — enough to crawl nearly 1,500 pages.
  • Open the dashboard → API Keys and copy your key.
  • Set the key as the ANYCRAWL_API_KEY environment variable (see above).

Usage

Available Modes

AnyCrawl MCP Server supports the following deployment modes:

Default mode is STDIO (no env needed). Set ANYCRAWL_MODE to switch.

| Mode  | Description                       | Best For                                  | Transport   |
| ----- | --------------------------------- | ----------------------------------------- | ----------- |
| STDIO | Standard MCP over stdio (default) | Command-type MCP clients, local tooling   | stdio       |
| MCP   | Streamable HTTP (JSON, stateful)  | Cursor (streamable_http), API integration | HTTP + JSON |
| SSE   | Server-Sent Events                | Web apps, browser integrations            | HTTP + SSE  |

Quick Start Commands

# Development (local)
npm run dev            # STDIO (default)
npm run dev:mcp        # MCP mode (JSON /mcp)
npm run dev:sse        # SSE mode (/sse)

# Production (built output)
npm start              # STDIO (default)
npm run start:mcp
npm run start:sse

# Env examples
ANYCRAWL_MODE=MCP ANYCRAWL_API_KEY=YOUR-KEY npm run dev:mcp
ANYCRAWL_MODE=SSE ANYCRAWL_API_KEY=YOUR-KEY npm run dev:sse

Docker Compose (MCP + SSE with Nginx)

This repo ships a production-ready image that runs MCP (JSON) on port 3000 and SSE on port 3001 in the same container, fronted by Nginx. Nginx also supports API-key-prefixed paths /{API_KEY}/mcp and /{API_KEY}/sse and forwards the key via x-anycrawl-api-key header.

docker compose build
docker compose up -d
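The API-key-prefixed routing described above could be sketched roughly as follows. This is a hypothetical illustration of the mechanism, not the repo's actual config (the Nginx file shipped with the image is authoritative); the backend ports match the defaults listed below.

```nginx
# Hypothetical sketch: capture the API key from the path prefix
# and forward it to the backend as the x-anycrawl-api-key header.
location ~ ^/(?<api_key>[^/]+)/mcp$ {
    proxy_set_header x-anycrawl-api-key $api_key;
    rewrite ^ /mcp break;              # strip the key prefix from the path
    proxy_pass http://127.0.0.1:3000;  # MCP (JSON) backend
}

location ~ ^/(?<api_key>[^/]+)/sse$ {
    proxy_set_header x-anycrawl-api-key $api_key;
    proxy_http_version 1.1;
    proxy_set_header Connection '';
    proxy_buffering off;               # required for SSE streaming
    rewrite ^ /sse break;
    proxy_pass http://127.0.0.1:3001;  # SSE backend
}
```

Note that `proxy_pass` inside a regex location cannot carry a URI part, hence the `rewrite ... break` before proxying.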

Environment variables used in Docker image:

  • ANYCRAWL_MODE: MCP_AND_SSE (default in compose), or MCP, SSE
  • ANYCRAWL_MCP_PORT: default 3000
  • ANYCRAWL_SSE_PORT: default 3001
  • CLOUD_SERVICE: true to extract API key from /{API_KEY}/... or headers
  • ANYCRAWL_BASE_URL: default https://api.anycrawl.dev

Running on Cursor

Configure Cursor to use the AnyCrawl MCP server. Note: requires Cursor v0.45.6 or later.

For Cursor v0.48.6 and newer, add this to your MCP Servers settings:

{
  "mcpServers": {
    "anycrawl-mcp": {
      "command": "npx",
      "args": ["-y", "anycrawl-mcp"],
      "env": {
        "ANYCRAWL_API_KEY": "YOUR-API-KEY"
      }
    }
  }
}

For Cursor v0.45.6:

  1. Open Cursor Settings → Features → MCP Servers → "+ Add New MCP Server"
  2. Name: "anycrawl-mcp" (or your preferred name)
  3. Type: "command"
  4. Command:
env ANYCRAWL_API_KEY=YOUR-API-KEY npx -y anycrawl-mcp

On Windows, if you encounter issues:

cmd /c "set ANYCRAWL_API_KEY=YOUR-API-KEY && npx -y anycrawl-mcp"

Running on VS Code

For manual installation, add this JSON to your User Settings (JSON) in VS Code (Command Palette → Preferences: Open User Settings (JSON)):

{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "apiKey",
        "description": "AnyCrawl API Key",
        "password": true
      }
    ],
    "servers": {
      "anycrawl": {
        "command": "npx",
        "args": ["-y", "anycrawl-mcp"],
        "env": {
          "ANYCRAWL_API_KEY": "${input:apiKey}"
        }
      }
    }
  }
}

Optionally, place the following in .vscode/mcp.json in your workspace to share config:

{
  "inputs": [
    {
      "type": "promptString",
      "id": "apiKey",
      "description": "AnyCrawl API Key",
      "password": true
    }
  ],
  "servers": {
    "anycrawl": {
      "command": "npx",
      "args": ["-y", "anycrawl-mcp"],
      "env": {
        "ANYCRAWL_API_KEY": "${input:apiKey}"
      }
    }
  }
}

Running on Windsurf

Add this to ~/.codeium/windsurf/model_config.json:

{
  "mcpServers": {
    "mcp-server-anycrawl": {
      "command": "npx",
      "args": ["-y", "anycrawl-mcp"],
      "env": {
        "ANYCRAWL_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Running on Claude Desktop / Claude Code

Claude Code (CLI)

Add AnyCrawl MCP server using the claude mcp add command:

claude mcp add --transport stdio anycrawl -e ANYCRAWL_API_KEY=your-api-key -- npx -y anycrawl-mcp

Claude Desktop

Add this to your Claude Desktop configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "anycrawl": {
      "command": "npx",
      "args": ["-y", "anycrawl-mcp"],
      "env": {
        "ANYCRAWL_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Running with SSE Server Mode

The SSE (Server-Sent Events) mode provides a web-based interface for MCP communication, ideal for web applications, testing, and integration with web-based LLM clients.

Quick Start

ANYCRAWL_MODE=SSE ANYCRAWL_API_KEY=YOUR-API-KEY npx -y anycrawl-mcp

# Or using npm scripts
ANYCRAWL_API_KEY=YOUR-API-KEY npm run dev:sse

Server Configuration

Optional server settings for local/self-hosted deployments:

export ANYCRAWL_PORT=3000           # Default: 3000
export ANYCRAWL_HOST=127.0.0.1      # Set to override cloud default (mcp.anycrawl.dev)

Health Check

curl -s http://localhost:${ANYCRAWL_PORT:-3000}/health
# Response: ok

Generic MCP/SSE Client Configuration

For other MCP/SSE clients that support SSE transport, use this configuration:

{
  "mcpServers": {
    "anycrawl": {
      "type": "sse",
      "url": "https://mcp.anycrawl.dev/{API_KEY}/sse",
      "name": "AnyCrawl MCP Server",
      "description": "Web scraping and crawling tools"
    }
  }
}

or

{
  "mcpServers": {
    "AnyCrawl": {
      "type": "streamable_http",
      "url": "https://mcp.anycrawl.dev/{API_KEY}/mcp"
    }
  }
}

Environment Setup:

# Start SSE server with API key
ANYCRAWL_API_KEY=your-api-key-here npm run dev:sse

Cursor configuration for HTTP modes (streamable_http)

Configure Cursor to connect to your HTTP MCP server.

Local HTTP Streamable Server:

{
  "mcpServers": {
    "anycrawl-http-local": {
      "type": "streamable_http",
      "url": "http://127.0.0.1:3000/mcp"
    }
  }
}

Cloud HTTP Streamable Server:

{
  "mcpServers": {
    "anycrawl-http-cloud": {
      "type": "streamable_http",
      "url": "https://mcp.anycrawl.dev/{API_KEY}/mcp"
    }
  }
}

Note: For HTTP modes, set ANYCRAWL_API_KEY (and optional host/port) in the server process environment or in the URL. Cursor does not need your API key when using streamable_http.
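To sanity-check an HTTP endpoint without a full MCP client, you can POST a JSON-RPC `initialize` request directly against a running server. This is a sketch: the `Accept` header requirement and message shape come from the MCP Streamable HTTP transport spec, and the local URL assumes the default port from the configuration above.

```shell
curl -s -X POST "http://127.0.0.1:3000/mcp" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-03-26",
      "capabilities": {},
      "clientInfo": { "name": "curl-test", "version": "0.0.0" }
    }
  }'
```

A healthy server should answer with a JSON-RPC result describing its capabilities; requires the MCP server to be running locally.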

Available Tools

1. Scrape Tool (anycrawl_scrape)

Scrape a single URL and extract content in various formats.

Best for:

  • Extracting content from a single page
  • Quick data extraction
  • Testing specific URLs

Parameters:

  • url (required): The URL to scrape
  • engine (required): Scraping engine (playwright, cheerio, puppeteer)
  • formats (optional): Output formats (markdown, html, text, screenshot, screenshot@fullPage, rawHtml, json)
  • proxy (optional): Proxy URL
  • timeout (optional): Timeout in milliseconds (default: 300000)
  • retry (optional): Whether to retry on failure (default: false)
  • wait_for (optional): Wait time for page to load
  • include_tags (optional): HTML tags to include
  • exclude_tags (optional): HTML tags to exclude
  • json_options (optional): Options for JSON extraction

Example:

{
  "name": "anycrawl_scrape",
  "arguments": {
    "url": "https://example.com",
    "engine": "cheerio",
    "formats": ["markdown", "html"],
    "timeout": 30000
  }
}

2. Crawl Tool (anycrawl_crawl)

Start a crawl job to scrape multiple pages from a website. By default this waits for completion and returns aggregated results using the SDK's client.crawl (defaults: poll every 3 seconds, timeout after 60 seconds).
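By analogy with the scrape example above, a call might look like the following sketch. Parameter names such as max_depth and limit are assumptions for illustration — check the tool's schema in your MCP client for the exact fields.

```json
{
  "name": "anycrawl_crawl",
  "arguments": {
    "url": "https://example.com",
    "engine": "cheerio",
    "max_depth": 2,
    "limit": 20
  }
}
```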



FAQ

What is the AnyCrawl MCP server?
AnyCrawl is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for AnyCrawl?
This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.5 · 10 reviews
  • Shikha Mishra· Oct 10, 2024

    AnyCrawl is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Piyush G· Sep 9, 2024

    We evaluated AnyCrawl against two servers with overlapping tools; this profile had the clearer scope statement.

  • Chaitanya Patil· Aug 8, 2024

    Useful MCP listing: AnyCrawl is the kind of server we cite when onboarding engineers to host + tool permissions.

  • Sakshi Patil· Jul 7, 2024

    AnyCrawl reduced integration guesswork — categories and install configs on the listing matched the upstream repo.

  • Ganesh Mohane· Jun 6, 2024

    I recommend AnyCrawl for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Oshnikdeep· May 5, 2024

    Strong directory entry: AnyCrawl surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Dhruvi Jain· Apr 4, 2024

    AnyCrawl has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • Rahul Santra· Mar 3, 2024

    According to our notes, AnyCrawl benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.

  • Pratham Ware· Feb 2, 2024

    We wired AnyCrawl into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Yash Thakker· Jan 1, 2024

    AnyCrawl is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.