Streamline your workflow with Felix. Integrate it into your workspace and tailor its behavior to y…
A tiny Model Context Protocol server with a few useful tools, deployed on Smithery, tested in Claude Desktop, and indexed in NANDA.
## Tools included

- `hello(name)` – quick greeting
- `randomNumber(max?)` – random integer (default 100)
- `weather(city)` – current weather via wttr.in
- `summarize(text, maxSentences?, model?)` – OpenAI-powered summary (requires `OPENAI_API_KEY`)

## Public server page
https://smithery.ai/server/@FelixYifeiWang/felix-mcp-smithery
## MCP endpoint (streamable HTTP)
https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp
(In Smithery/NANDA, auth is attached via the `api_key` query parameter and an optional `profile`, configured in the platform UI; do not hardcode secrets here.)
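For illustration, the query-param form can be composed like this (the placeholder values are stand-ins, not real credentials):

```javascript
// Sketch: building the hosted endpoint URL with platform-style query-param auth.
// YOUR_SMITHERY_API_KEY / YOUR_PROFILE_ID are placeholders supplied by the platform UI.
const base = "https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp";
const url = new URL(base);
url.searchParams.set("api_key", "YOUR_SMITHERY_API_KEY");
url.searchParams.set("profile", "YOUR_PROFILE_ID");
console.log(url.toString());
```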
## Claude Desktop

Open Settings → Developer → mcpServers and add:
```json
{
  "mcpServers": {
    "felix-mcp-smithery": {
      "command": "npx",
      "args": [
        "-y",
        "@smithery/cli@latest",
        "run",
        "@FelixYifeiWang/felix-mcp-smithery",
        "--key",
        "YOUR_SMITHERY_API_KEY",
        "--profile",
        "YOUR_PROFILE_ID"
      ]
    }
  }
}
```

Start a new chat and run:
```json
{ "name": "Felix" }
```

Notes:

- Transport: `StreamableHTTPServerTransport` on `/mcp` (POST/GET/DELETE).
- Sessions tracked via the `Mcp-Session-Id` header (no close recursion).
- `summarize` defaults to `gpt-4o-mini`.
- Requires Node 18+ (tested on Node 20).
## Run locally

```bash
git clone https://github.com/FelixYifeiWang/felix-mcp-smithery
cd felix-mcp-smithery
npm install
```

Set env (only needed if you’ll call summarize locally):

```bash
export OPENAI_API_KEY="sk-..."
```

Run:

```bash
node index.js
# ✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
```

Local curl:
```bash
curl -s -X POST "http://localhost:8081/mcp" \
  -H 'Content-Type: application/json' \
  -H 'Mcp-Protocol-Version: 2025-06-18' \
  --data '{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2025-06-18"}}'
```

hello:

```json
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"hello","arguments":{"name":"Felix"}}}
```

randomNumber:

```json
{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"randomNumber","arguments":{"max":10}}}
```

weather:

```json
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"weather","arguments":{"city":"Boston"}}}
```

summarize (needs `OPENAI_API_KEY` set on the server):

```json
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"summarize","arguments":{"text":"(paste long text)","maxSentences":2}}}
```

Server core:
- `McpServer` from `@modelcontextprotocol/sdk`, with tools registered in `buildServer()`.
- Transport: `StreamableHTTPServerTransport` on `/mcp`, handling:
  - `POST /mcp` — JSON-RPC requests (and first-time `initialize`)
  - `GET /mcp` — server-to-client notifications (SSE)
  - `DELETE /mcp` — end session
- CORS: allows all origins; exposes the `Mcp-Session-Id` header (good for hosted clients).
- OpenAI summarize: thin `fetch` wrapper around `/v1/chat/completions` with a short “crisp summarizer” system prompt.
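As a minimal sketch of that wrapper's request shape (the exact system-prompt wording and the `summarizeRequest` helper name are assumptions for illustration; the real logic lives in `index.js`):

```javascript
// Hypothetical helper: builds the Chat Completions request body the summarize
// tool would POST to https://api.openai.com/v1/chat/completions.
// Default model gpt-4o-mini matches the note above; prompt wording is assumed.
function summarizeRequest(text, maxSentences = 3, model = "gpt-4o-mini") {
  return {
    model,
    messages: [
      {
        role: "system",
        content: `You are a crisp summarizer. Reply in at most ${maxSentences} sentences.`,
      },
      { role: "user", content: text },
    ],
  };
}

console.log(JSON.stringify(summarizeRequest("(paste long text)", 2)));
```

The actual call then just `fetch`es the endpoint with this body and an `Authorization: Bearer $OPENAI_API_KEY` header.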
GitHub repo with:

- `index.js` (Express + MCP)
- `package.json` (`@modelcontextprotocol/sdk`, `express`, `cors`, `zod`)
- `Dockerfile`
- `smithery.yaml`:

```yaml
kind: server
name: felix-mcp-smithery
version: 1.0.0
runtime: container
startCommand:
  type: http
  transport: streamable-http
  port: 8081
  path: /mcp
  ssePath: /mcp
health: /
```

In Smithery:
Set `OPENAI_API_KEY` (optional, for `summarize`) in the server configuration. A successful deploy logs:

```
✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
```

## NANDA

Go to join39.org → Context Agents → Add:
- Name: Felix MCP (Smithery)
- URL: `https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp?api_key=YOUR_KEY&profile=YOUR_PROFILE`
- Description: Streamable-HTTP MCP hosted on Smithery. Tools: hello, randomNumber, weather, summarize (OpenAI).

Test from NANDA: initialize → tools/list → call hello.
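That smoke test can be sketched as three JSON-RPC bodies, POSTed in order to the endpoint above (ids are arbitrary; the protocol version is taken from the local curl example):

```javascript
// Sketch: initialize → tools/list → tools/call smoke-test bodies.
const steps = [
  { jsonrpc: "2.0", id: 0, method: "initialize", params: { protocolVersion: "2025-06-18" } },
  { jsonrpc: "2.0", id: 1, method: "tools/list" },
  { jsonrpc: "2.0", id: 2, method: "tools/call", params: { name: "hello", arguments: { name: "Felix" } } },
];

// Print each body as it would be sent on the wire.
for (const step of steps) console.log(JSON.stringify(step));
```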
## Repo layout

```
.
├─ index.js        # Express + Streamable HTTP + tools
├─ package.json    # sdk/express/cors/zod
├─ Dockerfile      # container build for Smithery
└─ smithery.yaml   # Smithery project config
```

Parts of this project (tool scaffolding, error fixes, and documentation polish) were produced with AI assistance. The final code, deployment, and testing steps were implemented and verified by me.