firecrawl-download
firecrawl/cli · updated Apr 8, 2026
Download entire websites as organized local files in multiple formats.
- Maps site structure first, then scrapes each discovered page into nested directories under `.firecrawl/`, supporting markdown, links, screenshots, and custom format combinations
- Filters pages by path patterns (`--include-paths`, `--exclude-paths`), search queries, subdomain inclusion, and page limits to control scope
- All standard scrape options work with download: format selection, full-page screenshots, and main-content extraction
firecrawl download
Experimental. Convenience command that combines map + scrape to save an entire site as local files.
Maps the site first to discover pages, then scrapes each one into nested directories under .firecrawl/. All scrape options work with download. Always pass -y to skip the confirmation prompt.
When to use
- You want to save an entire site (or section) to local files
- You need offline access to documentation or content
- Bulk content extraction with organized file structure
Quick start
# Interactive wizard (picks format, screenshots, paths for you)
firecrawl download https://docs.example.com
# With screenshots
firecrawl download https://docs.example.com --screenshot --limit 20 -y
# Multiple formats (each saved as its own file per page)
firecrawl download https://docs.example.com --format markdown,links --screenshot --limit 20 -y
# Creates per page: index.md + links.txt + screenshot.png
# Filter to specific sections
firecrawl download https://docs.example.com --include-paths "/features,/sdks"
# Skip translations
firecrawl download https://docs.example.com --exclude-paths "/zh,/ja,/fr,/es,/pt-BR"
# Full combo
firecrawl download https://docs.example.com \
--include-paths "/features,/sdks" \
--exclude-paths "/zh,/ja" \
--only-main-content \
--screenshot \
-y
Download options
| Option | Description |
|---|---|
| `--limit <n>` | Max pages to download |
| `--search <query>` | Filter URLs by search query |
| `--include-paths <paths>` | Only download matching paths |
| `--exclude-paths <paths>` | Skip matching paths |
| `--allow-subdomains` | Include subdomain pages |
| `-y` | Skip confirmation prompt (always use in automated flows) |
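
For automated pipelines, the options above can be wrapped in a small helper so every run passes `-y`. This is a minimal sketch, not part of the CLI itself: `download_docs`, the URL, and the flag values are placeholders, and `echo` makes it a dry run that only prints the command.

```shell
# Hypothetical wrapper: builds a download command with -y always included.
# "echo" makes this a dry run; remove it to actually execute firecrawl.
download_docs() {
  url="$1"; shift
  echo firecrawl download "$url" \
    --limit 50 \
    --exclude-paths "/zh,/ja" \
    -y "$@"
}

# Extra flags pass through after the defaults:
download_docs https://docs.example.com --allow-subdomains
```

Keeping `-y` inside the helper means a CI job can never hang on the confirmation prompt.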
Scrape options (all work with download)
`-f <formats>`, `-H`, `-S`, `--screenshot`, `--full-page-screenshot`, `--only-main-content`, `--include-tags`, `--exclude-tags`, `--wait-for`, `--max-age`, `--country`, `--languages`
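
As a sketch of mixing scrape flags into a download run: the command is built as a string first so it can be logged or reviewed before running, and every flag value below is an illustrative placeholder rather than a recommended setting.

```shell
# Assemble a download command that combines scope flags with scrape flags.
# All values are placeholders; adjust to the target site.
cmd="firecrawl download https://docs.example.com"
cmd="$cmd -f markdown,links --only-main-content"
cmd="$cmd --exclude-tags nav,footer --limit 10 -y"
printf '%s\n' "$cmd"
```

Running the printed string (e.g. with `eval "$cmd"` once reviewed) behaves the same as typing the full command by hand.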
See also
- firecrawl-map — just discover URLs without downloading
- firecrawl-scrape — scrape individual pages
- firecrawl-crawl — bulk extract as JSON (not local files)
Ratings
4.5 ★★★★★ · 10 reviews

- ★★★★★ Shikha Mishra · Oct 10, 2024
  firecrawl-download is among the better-maintained entries we tried; worth keeping pinned for repeat workflows.
- ★★★★★ Piyush G · Sep 9, 2024
  Keeps context tight: firecrawl-download is the kind of skill you can hand to a new teammate without a long onboarding doc.
- ★★★★★ Chaitanya Patil · Aug 8, 2024
  Registry listing for firecrawl-download matched our evaluation — installs cleanly and behaves as described in the markdown.
- ★★★★★ Sakshi Patil · Jul 7, 2024
  firecrawl-download reduced setup friction for our internal harness; good balance of opinion and flexibility.
- ★★★★★ Ganesh Mohane · Jun 6, 2024
  I recommend firecrawl-download for anyone iterating fast on agent tooling; clear intent and a small, reviewable surface area.
- ★★★★★ Oshnikdeep · May 5, 2024
  Useful defaults in firecrawl-download — fewer surprises than typical one-off scripts, and it plays nicely with `npx skills` flows.
- ★★★★★ Dhruvi Jain · Apr 4, 2024
  firecrawl-download has been reliable in day-to-day use. Documentation quality is above average for community skills.
- ★★★★★ Rahul Santra · Mar 3, 2024
  Solid pick for teams standardizing on skills: firecrawl-download is focused, and the summary matches what you get after install.
- ★★★★★ Pratham Ware · Feb 2, 2024
  We added firecrawl-download from the explainx registry; install was straightforward and the SKILL.md answered most questions upfront.
- ★★★★★ Yash Thakker · Jan 1, 2024
  firecrawl-download fits our agent workflows well — practical, well scoped, and easy to wire into existing repos.