# BetterPrompt MCP

MCP server for AI-enhanced prompt engineering and request conversion.
BetterPrompt MCP is a Model Context Protocol (MCP) server that enhances user requests using advanced prompt engineering techniques. It exposes a single, powerful tool that transforms simple requests into structured, context-rich instructions tailored for optimal AI model performance.
Instead of manually crafting detailed prompts, BetterPrompt MCP converts your requests into expertly engineered prompts that get better results from AI models.
## Before & After Example

Without BetterPrompt:

```text
Write a function to calculate fibonacci numbers
```

With BetterPrompt enhancement:

```text
You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Write a function to calculate fibonacci numbers"

Please enhance your response by:

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request.
```
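The transformation shown above can be sketched as a simple template function. This is a minimal illustration of the idea only; the server's actual template and implementation may differ:

```typescript
// Minimal sketch of the enhancement template (illustrative only;
// the real server's wording and structure may differ).
function enhanceRequest(request: string): string {
  return [
    "You are a world-class AI assistant with expertise in advanced prompt",
    "engineering techniques from top AI research labs like Anthropic, OpenAI,",
    "and Google DeepMind.",
    "",
    "Your task is to provide an exceptional response to the following user request:",
    "",
    `"${request}"`,
    "",
    "Structure your response with clear headings, detailed explanations, and",
    "examples where appropriate. Ensure your answer is comprehensive, actionable,",
    "and directly addresses all aspects of the request.",
  ].join("\n");
}

const prompt = enhanceRequest("Write a function to calculate fibonacci numbers");
console.log(prompt);
```

The original request is embedded verbatim inside a richer instruction frame, so the downstream model receives both the task and explicit quality expectations.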
## Installation

Install and run via npx:

```shell
npx -y betterprompt-mcp
```

Or add to your MCP client configuration:

```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```

## Client Setup

Most MCP clients work with this standard config:
```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```

Pick your client below. Where available, click the install button; otherwise follow the manual steps.
<details>
<summary><b>VS Code</b></summary>

Click a button to install:

<img src="https://img.shields.io/badge/VS_Code-VS_Code?style=flat-square&label=Install%20Server&color=0098FF" alt="Install in VS Code"> <img alt="Install in VS Code Insiders" src="https://img.shields.io/badge/VS_Code_Insiders-VS_Code_Insiders?style=flat-square&label=Install%20Server&color=24bfa5">

Fallback (CLI):

```shell
code --add-mcp '{"name":"betterprompt","command":"npx","args":["-y","betterprompt-mcp"]}'
```

</details>

<details>
<summary><b>Cursor</b></summary>

Click to install:

<img src="https://cursor.com/deeplink/mcp-install-dark.svg" alt="Install in Cursor">

Or add manually: Settings → MCP → Add new MCP Server → Type: command, Command: `npx -y betterprompt-mcp`.
</details>

<details>
<summary><b>LM Studio</b></summary>

Click to install:

Or manually: Program → Install → Edit mcp.json, add the standard config above.
</details>

<details>
<summary><b>Continue</b></summary>

Install button: TODO – no public deeplink available yet.

Manual setup: add this `mcpServers` entry to your Continue configuration:

```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```

Restart Continue if needed.
</details>

<details>
<summary><b>Goose</b></summary>

Click to install:

Or manually: Advanced settings → Extensions → Add custom extension → Type: STDIO → Command: `npx -y betterprompt-mcp`.

</details>

<details>
<summary><b>Claude Code / Claude Desktop</b></summary>

Install via CLI:

```shell
claude mcp add betterprompt npx -y betterprompt-mcp
```

For Claude Desktop, add the standard config above to claude_desktop_config.json, then restart Claude Desktop. See the MCP quickstart:

Model Context Protocol – Quickstart
</details>

<details>
<summary><b>Windsurf</b></summary>

Follow the Windsurf MCP documentation and use the standard config above.
</details>

<details>
<summary><b>Gemini CLI</b></summary>

Follow the Gemini CLI MCP server guide and use the standard config above.

Docs: Configure MCP server in Gemini CLI
</details>

<details>
<summary><b>Qodo Gen</b></summary>

Open the Qodo Gen chat panel → Connect more tools → + Add new MCP → paste the standard config above → Save.
</details>

<details>
<summary><b>opencode</b></summary>

Create or edit ~/.config/opencode/opencode.json:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "betterprompt": {
      "type": "local",
      "command": ["npx", "-y", "betterprompt-mcp"],
      "enabled": true
    }
  }
}
```

</details>

## Tools

### enhance-request

Transforms user requests into world-class AI-enhanced prompts using advanced prompt engineering techniques.
Input:

- `request` (string, required): the user request to transform into an enhanced AI prompt

Output: an AI-enhanced prompt with structure, context, and clear instructions.
Example usage:

```json
{
  "name": "enhance-request",
  "arguments": {
    "request": "Write a function to calculate fibonacci numbers"
  }
}
```

Request:
```json
{
  "name": "enhance-request",
  "arguments": {
    "request": "Explain quantum computing"
  }
}
```

Enhanced result:
```text
You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Explain quantum computing"

Please enhance your response by:

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request.
```
## How It Works

BetterPrompt MCP leverages the MCP Sampling API to enhance user requests: when you call the `enhance-request` tool, the server sends a sampling request to your MCP client, which generates the enhanced prompt using its own model. Because generation is delegated to the client, the server itself needs no model access or API keys.
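The sampling round-trip can be sketched as follows. This is a simplified, self-contained mock: the interfaces and names below are illustrative assumptions, not the MCP SDK's real API; in the actual server the request goes through the SDK's sampling mechanism.

```typescript
// Illustrative sketch of the sampling flow (hypothetical types/names).
// The client is mocked so the example runs standalone.

interface SamplingRequest {
  messages: { role: "user"; content: { type: "text"; text: string } }[];
  maxTokens: number;
}

interface SamplingClient {
  createMessage(req: SamplingRequest): Promise<string>;
}

// Tool handler: wraps the user request in an enhancement instruction and
// asks the *client's* model to produce the enhanced prompt.
async function handleEnhanceRequest(
  client: SamplingClient,
  request: string
): Promise<string> {
  return client.createMessage({
    messages: [
      {
        role: "user",
        content: {
          type: "text",
          text: `Rewrite the following request as a detailed, well-structured prompt:\n"${request}"`,
        },
      },
    ],
    maxTokens: 1024,
  });
}

// Mock standing in for the MCP client's sampling capability.
const mockClient: SamplingClient = {
  createMessage: async (req) => `enhanced: ${req.messages[0].content.text}`,
};

handleEnhanceRequest(mockClient, "Explain quantum computing").then(console.log);
```

The key design point is that the server never calls a model directly; it only forwards a sampling request and returns whatever the client's model produces.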
## Project Structure

```text
betterprompt-mcp/
├── src/
│   └── index.ts       # Main server implementation
├── tests/             # Test files and verification scripts
├── dist/              # Compiled output (generated)
├── package.json       # Dependencies and scripts
├── tsconfig.json      # TypeScript configuration
└── README.md          # Documentation
```

## Development

Build:
```shell
npm run build
```

Watch (dev):

```shell
npm run watch
```

Format:

```shell
npm run format
npm run format:check
```

Test:

```shell
npm run test:comprehensive
```

## Linting

We use ESLint + Prettier to keep the codebase consistent.
- `npm run lint`: run ESLint
- `npm run lint -- --fix` or `npm run lint:fix`: apply automatic fixes
- `npm run lint:ci`: lint for CI (produces artifacts/lint-report.json)
- `scripts/lint-autofix-and-commit.sh`: runs autofix and commits the result. The script uses a conservative heuristic (small change threshold) and will abort auto-commit when changes appear large or potentially behavior-affecting; in such cases, open a PR for human review.

## License

MIT License
For questions or issues, open an issue on GitHub or contact the author via GitHub profile.
Aung Myo Kyaw (GitHub)