LLMsTXT Agent
Your gateway to standardized documentation retrieval for Large Language Models
Standardized
Consistent documentation format across all sources, optimized for LLM consumption
Real-time
Instant access to the latest documentation updates as they happen
Fallback Hub
Central repository ensures documentation availability even when primary sources are unavailable
Choose Your Connection Method
MCP Inspector
Visual interface to explore and test the connection
Cursor Integration
Configure Cursor IDE to use LLMsTXT
Command Line
Direct CLI access for advanced users
Using the MCP Inspector
Visual interface for exploring the LLMsTXT Agent
Install and run:
npx @modelcontextprotocol/inspector
Once launched:
1. Select "SSE" as the Transport Type
2. Enter https://mcp.llmtxt.dev/sse as the server URL
3. Click "Connect" to start exploring
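Once connected, the server streams messages over Server-Sent Events (SSE). As a rough illustration of what travels on that transport, here is a minimal parser for the SSE wire framing; the event name and payload in the usage example are made-up placeholders, not the server's actual messages:

```python
def parse_sse(stream: str) -> list[dict]:
    """Split a raw SSE stream into events.

    Events are separated by a blank line; each line is a
    'field: value' pair, and lines starting with ':' are comments.
    """
    events = []
    for block in stream.strip().split("\n\n"):
        event_type = "message"  # default type per the SSE format
        data_lines = []
        for line in block.splitlines():
            if line.startswith(":"):
                continue  # comment line, ignored
            field, _, value = line.partition(":")
            value = value.lstrip(" ")  # drop the space after the colon
            if field == "event":
                event_type = value
            elif field == "data":
                data_lines.append(value)
        if data_lines:
            events.append({"event": event_type, "data": "\n".join(data_lines)})
    return events

# Placeholder events, for illustration only:
raw = "event: endpoint\ndata: /message?id=1\n\ndata: hello\n"
events = parse_sse(raw)  # two events: one named, one default "message"
```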
Cursor Integration
Configure your Cursor IDE settings
Add to your settings:
{
  "mcpServers": {
    "llmtxt.dev": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.llmtxt.dev/sse"]
    }
  }
}
Command Line Usage
Direct connection via CLI
Connect with:
npx mcp-remote https://mcp.llmtxt.dev/sse
What is LLMsTXT Agent?
LLMsTXT Agent is a remote server that retrieves documentation for websites. It looks for llms.txt files, which sites publish to share their documentation in a standard, LLM-friendly format, and it implements the Model Context Protocol (MCP) so any MCP-compatible client can connect to it.
The server first tries to find the llms.txt file on the website you're interested in. If it can't find one, it falls back to our central hub at llmtxt.dev. Either way, you get the documentation you need in a consistent format that AI models can easily use.
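This lookup order can be sketched as follows. It is a minimal sketch, not the server's actual code: the hub's URL layout is an assumption, and `fetch` stands in for whatever HTTP GET helper you prefer (injected here so the flow can be tested offline):

```python
FALLBACK_HUB = "https://llmtxt.dev"  # central repository (hub path layout below is assumed)

def find_llms_txt(host: str, fetch) -> tuple[str, str]:
    """Resolve llms.txt for `host`: try the site itself first,
    then fall back to the central hub. `fetch` is any
    callable(url) -> text that raises on failure."""
    candidates = [
        f"https://{host}/llms.txt",         # primary: site-specific file
        f"{FALLBACK_HUB}/{host}/llms.txt",  # fallback: central hub (hypothetical path)
    ]
    last_error = None
    for url in candidates:
        try:
            return url, fetch(url)
        except Exception as exc:
            last_error = exc  # remember why, keep trying the next source
    raise LookupError(f"no llms.txt found for {host}") from last_error
```

For example, if the site's own llms.txt request fails, the same call transparently returns the hub copy instead, so callers never need to know which source answered.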
How It Works
Primary Source Check
First, the agent looks for site-specific llms.txt files at the source, ensuring the most up-to-date documentation.
Fallback Repository
If the primary source isn't available, it queries our central llmtxt.dev repository, ensuring documentation is always accessible.
Real-time Updates
Documentation updates are delivered instantly, ensuring your LLM always has the latest information.
Questions or Issues?
Reach out to me on Twitter at @ipwanciu for support, feature requests, or general questions about the LLMsTXT MCP server.