JinaAI Search

by spences10
Integrates JinaAI's search capabilities for web content discovery, information retrieval, and data extraction
best for
- LLM applications needing web search
- Content research and information retrieval
- Documentation and knowledge gathering
capabilities
- Search web content through JinaAI API
- Extract clean text with preserved structure
- Gather images and links from web pages
- Control response size with token budgets
- Cache search results for performance
what it does
Searches the web using JinaAI's API to retrieve clean, LLM-optimized content from web pages and documentation. Note: this repository is no longer maintained; use mcp-omnisearch instead.
about
JinaAI Search is a community-built MCP server published by spences10 that provides AI assistants with tools and capabilities via the Model Context Protocol. It is categorized under search web.
how to install
You can install JinaAI Search in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
license
MIT
JinaAI Search is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
mcp-jinaai-search
⚠️ Notice
This repository is no longer maintained.
The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.
Please use mcp-omnisearch instead.
A Model Context Protocol (MCP) server for integrating Jina.ai's Search API with LLMs. This server provides efficient and comprehensive web search capabilities, optimised for retrieving clean, LLM-friendly content from the web.
<a href="https://glama.ai/mcp/servers/u6603w196t"> <img width="380" height="200" src="https://glama.ai/mcp/servers/u6603w196t/badge" /> </a>
Features
- 🔍 Advanced web search through Jina.ai Search API
- 🚀 Fast and efficient content retrieval
- 📄 Clean text extraction with preserved structure
- 🧠 Content optimised for LLMs
- 🌐 Support for various content types including documentation
- 🏗️ Built on the Model Context Protocol
- 🔄 Configurable caching for performance
- 🖼️ Optional image and link gathering
- 🌍 Localisation support through browser locale
- 🎯 Token budget control for response size
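To see what the server wraps, the underlying Jina Search API can also be called directly. This is a sketch only: it assumes the public s.jina.ai endpoint with Bearer-token auth as described in Jina's own docs, which are not part of this repository.

```shell
# Hypothetical direct call to Jina's Search API (the service this MCP server wraps).
# Assumes the s.jina.ai endpoint and Bearer auth; check Jina's current docs before relying on it.
curl "https://s.jina.ai/Model%20Context%20Protocol" \
  -H "Authorization: Bearer $JINAAI_API_KEY" \
  -H "Accept: application/json"
```

The MCP server performs an equivalent request on your behalf and returns the cleaned results through the `search` tool described below.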
Configuration
This server requires configuration through your MCP client. Here are examples for different environments:
Cline Configuration
Add this to your Cline MCP settings:
{
  "mcpServers": {
    "jinaai-search": {
      "command": "npx",
      "args": ["-y", "mcp-jinaai-search"],
      "env": {
        "JINAAI_API_KEY": "your-jinaai-api-key"
      }
    }
  }
}
Claude Desktop with WSL Configuration
For WSL environments, add this to your Claude Desktop configuration:
{
  "mcpServers": {
    "jinaai-search": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "JINAAI_API_KEY=your-jinaai-api-key npx mcp-jinaai-search"
      ]
    }
  }
}
Environment Variables
The server requires the following environment variable:
- JINAAI_API_KEY: Your Jina.ai API key (required)
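For debugging outside an MCP client, the server can be started by hand with the environment variable set inline. This is a setup sketch; it assumes npx can resolve the published mcp-jinaai-search package, and the key value is a placeholder.

```shell
# Manual launch for debugging (the server speaks MCP over stdio, so it will
# sit waiting for a client; stop it with Ctrl+C).
export JINAAI_API_KEY="your-jinaai-api-key"
npx -y mcp-jinaai-search
```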
API
The server implements a single MCP tool with configurable parameters:
search
Search the web and get clean, LLM-friendly content using Jina.ai Reader. Returns top 5 results with URLs and clean content.
Parameters:
- query (string, required): Search query
- format (string, optional): Response format ("json" or "text"). Defaults to "text"
- no_cache (boolean, optional): Bypass cache for fresh results. Defaults to false
- token_budget (number, optional): Maximum number of tokens for this request
- browser_locale (string, optional): Browser locale for rendering content
- stream (boolean, optional): Enable stream mode for large pages. Defaults to false
- gather_links (boolean, optional): Gather all links at the end of the response. Defaults to false
- gather_images (boolean, optional): Gather all images at the end of the response. Defaults to false
- image_caption (boolean, optional): Caption images in the content. Defaults to false
- enable_iframe (boolean, optional): Extract content from iframes. Defaults to false
- enable_shadow_dom (boolean, optional): Extract content from shadow DOM. Defaults to false
- resolve_redirects (boolean, optional): Follow redirect chains to the final URL. Defaults to true
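For illustration, a raw tools/call request against this tool might look like the following. The tools/call method is standard MCP; the argument values here are hypothetical, and most clients build this envelope for you.

```json
{
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "Model Context Protocol specification",
      "format": "json",
      "token_budget": 4000,
      "gather_links": true
    }
  }
}
```

This is shown only to map the parameter list above onto an actual request shape.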
Development
Setup
- Clone the repository
- Install dependencies:
pnpm install
- Build the project:
pnpm run build
- Run in development mode:
pnpm run dev
Publishing
- Create a changeset:
pnpm changeset
- Version the package:
pnpm version
- Build and publish:
pnpm release
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License - see the LICENSE file for details.
Acknowledgments
- Built on the Model Context Protocol
- Powered by Jina.ai Search API
FAQ
- What is the JinaAI Search MCP server?
- JinaAI Search is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
- How do MCP servers relate to agent skills?
- Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.