LLM.txt Directory

by mcp-get
LLM.txt Directory — Quickly access up-to-date API documentation and developer resources for modern LLM integrations.
best for
- AI developers needing current API documentation
- Building context-aware LLM applications
- Accessing structured documentation for AI tools
capabilities
- Search LLM.txt files by content
- List available LLM.txt documentation files
- Fetch complete content from LLM.txt files
- Perform contextual searches across documentation
what it does
Searches and retrieves content from LLM.txt files, which contain structured documentation and API information for AI applications.
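To make the search behavior concrete, here is a minimal sketch of the kind of contextual search such a server performs: split an LLM.txt document into blank-line-separated sections and rank them by query-term overlap. The function name and scoring are illustrative assumptions, not the server's actual API or algorithm.

```python
# Illustrative sketch only: rank sections of an llms.txt document by
# how many query terms they contain. Not the server's real implementation.

def search_sections(llms_txt: str, query: str, top_k: int = 3) -> list[str]:
    """Return up to top_k sections sharing the most terms with the query."""
    sections = [s.strip() for s in llms_txt.split("\n\n") if s.strip()]
    terms = set(query.lower().split())
    scored = [
        (len(terms & set(section.lower().split())), section)
        for section in sections
    ]
    # Keep only sections that matched at least one term, best match first.
    matches = sorted((s for s in scored if s[0] > 0),
                     key=lambda pair: pair[0], reverse=True)
    return [section for _, section in matches[:top_k]]


doc = (
    "# Example API\n\n"
    "Authentication uses bearer tokens.\n\n"
    "Rate limits apply per key."
)
print(search_sections(doc, "bearer token authentication"))
# → ['Authentication uses bearer tokens.']
```

A real server would expose this as an MCP tool call rather than a local function, but the input/output shape (query in, ranked documentation snippets out) is the same idea.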
about
LLM.txt Directory is a community-built MCP server published by mcp-get that provides AI assistants with tools and capabilities via the Model Context Protocol. It is categorized under AI/ML and developer tools.
how to install
You can install LLM.txt Directory in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
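For reference, a stdio-transport entry in a client's MCP configuration (for example, Claude Desktop's `claude_desktop_config.json`) typically looks like the sketch below. The package name `@mcp-get-community/server-llm-txt` is an assumption based on the registry's naming scheme seen elsewhere on this page; confirm the exact name via the install panel or the mcp-get CLI before use.

```json
{
  "mcpServers": {
    "llm-txt": {
      "command": "npx",
      "args": ["-y", "@mcp-get-community/server-llm-txt"]
    }
  }
}
```

With an entry like this, the client launches the server as a local subprocess and talks to it over stdin/stdout, which is what the stdio transport means in practice.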
license
MIT
LLM.txt Directory is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
MCP Get Community Servers
This repository contains a collection of community-maintained Model Context Protocol (MCP) servers. All servers are automatically listed on the MCP Get registry and can be viewed and installed via CLI:
npx @michaellatman/mcp-get@latest list
Note: While we review all servers in this repository, they are maintained by their respective creators who are responsible for their functionality and maintenance.
Available Servers
- LLM.txt Server - A server for searching and retrieving content from LLM.txt files. Provides tools for listing available files, fetching content, and performing contextual searches.
- Curl Server - A server that allows LLMs to make HTTP requests to any URL using a curl-like interface. Supports all common HTTP methods, custom headers, request body, and configurable timeouts.
- macOS Server - A server that provides macOS-specific system information and operations.
Installation
You can install any server using the MCP Get CLI:
npx @michaellatman/mcp-get@latest install <server-name>
For example:
npx @michaellatman/mcp-get@latest install @mcp-get-community/server-curl
Development
To run in development mode with automatic recompilation:
npm install
npm run watch
Contributing
We welcome contributions! Please feel free to submit a Pull Request.
License
While this repository's structure and documentation are licensed under the MIT License, individual servers may have their own licenses. Please check each server's documentation in the src directory for its specific license terms.
Support
If you find these servers useful, please consider starring the repository!
FAQ
- What is the LLM.txt Directory MCP server?
- LLM.txt Directory is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
- How do MCP servers relate to agent skills?
- Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
- How are reviews shown for LLM.txt Directory?
- This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
Ratings
4.5 ★★★★★ · 10 reviews
- ★★★★★ Shikha Mishra · Oct 10, 2024
  LLM.txt Directory is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.
- ★★★★★ Piyush G · Sep 9, 2024
  We evaluated LLM.txt Directory against two servers with overlapping tools; this profile had the clearer scope statement.
- ★★★★★ Chaitanya Patil · Aug 8, 2024
  Useful MCP listing: LLM.txt Directory is the kind of server we cite when onboarding engineers to host + tool permissions.
- ★★★★★ Sakshi Patil · Jul 7, 2024
  LLM.txt Directory reduced integration guesswork — categories and install configs on the listing matched the upstream repo.
- ★★★★★ Ganesh Mohane · Jun 6, 2024
  I recommend LLM.txt Directory for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.
- ★★★★★ Oshnikdeep · May 5, 2024
  Strong directory entry: LLM.txt Directory surfaces stars and publisher context so we could sanity-check maintenance before adopting.
- ★★★★★ Dhruvi Jain · Apr 4, 2024
  LLM.txt Directory has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.
- ★★★★★ Rahul Santra · Mar 3, 2024
  According to our notes, LLM.txt Directory benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.
- ★★★★★ Pratham Ware · Feb 2, 2024
  We wired LLM.txt Directory into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.
- ★★★★★ Yash Thakker · Jan 1, 2024
  LLM.txt Directory is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.