MCP server for real-time LLM API documentation — stop hallucinations
🌐 Website: https://llmmcp.vercel.app

🎥 Demo: https://github.com/user-attachments/assets/eaad8d05-b7a8-4bf0-86c6-4fe2726da628
Stop LLM hallucinations and outdated code patterns.
llmmcp is a Model Context Protocol (MCP) server that provides real-time, up-to-date documentation for major LLM providers (OpenAI, Anthropic, and Google Gemini). It ensures your AI agents—like Cursor, Claude Desktop, or Windsurf—base their work on current official documentation instead of stale training data or deprecated library patterns.
LLMs frequently hallucinate about their own latest versions, feature availability (e.g., tool use in certain models), and pricing. llmmcp fixes this by providing tools that query the latest official documentation at request time (see the tools listed below).
You can use llmmcp immediately in your favorite AI tools without local installation.
Add a new MCP server in Settings > Models > MCP Servers:
| Name | Command |
| --- | --- |
| llmmcp | `npx -y llmmcp@latest` |

For Claude Desktop, add the following to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "llmmcp": {
      "command": "npx",
      "args": ["-y", "llmmcp@latest"]
    }
  }
}
```

llmmcp exposes the following tools:

- `search_docs`: Search the latest official documentation for specific technical details. Example: "What are the tool use parameters for Gemini 1.5 Pro?"
- `list_providers`: Get a dynamically updated list of available providers (OpenAI, Anthropic, Google) and their currently promoted models.
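If you want to exercise these tools outside of an editor (for quick testing or scripting), here is a minimal sketch using the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`). It launches the server with the same `npx -y llmmcp@latest` command as the configs above. The `search_docs` argument name (`query`) and the client name/version strings are illustrative assumptions, not documented here; check the tool schemas returned by `listTools` for the real parameter names.

```typescript
// Minimal sketch: connect to llmmcp over stdio and call its tools directly.
// In normal use, your MCP client (Cursor, Claude Desktop, Windsurf) does this for you.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the server the same way the configs above do.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "llmmcp@latest"],
  });

  const client = new Client({ name: "llmmcp-example", version: "0.1.0" });
  await client.connect(transport);

  // Discover the tools the server actually exposes (search_docs, list_providers)
  // and their input schemas.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Ask for current, official documentation instead of relying on training data.
  // NOTE: the `query` argument name is an assumption for illustration.
  const result = await client.callTool({
    name: "search_docs",
    arguments: { query: "What are the tool use parameters for Gemini 1.5 Pro?" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```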
llmmcp is designed for speed and reliability.
This project is open-source, so you can run your own instance of the backend if you prefer.
Developed by Abdullah Al Mahmud
License: MIT