# Felix MCP (Smithery)
A tiny **Model Context Protocol** server with a few useful tools, deployed on **Smithery**, tested in **Claude Desktop**, and indexed in **NANDA**.
**Tools included**
* `hello(name)` – quick greeting
* `randomNumber(max?)` – random integer (default 100)
* `weather(city)` – current weather via wttr.in
* `summarize(text, maxSentences?, model?)` – OpenAI-powered summary *(requires `OPENAI_API_KEY`)*
**Public server page**
`https://smithery.ai/server/@FelixYifeiWang/felix-mcp-smithery`
**MCP endpoint (streamable HTTP)**
`https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp`
*(In Smithery/NANDA, auth is attached via query param `api_key` and optional `profile`, configured in the platform UI; do **not** hardcode secrets here.)*
---
## Demo
### In Claude Desktop (recommended)
1. Open **Settings → Developer → Edit Config** (this opens `claude_desktop_config.json`) and add under `mcpServers`:
```json
{
  "mcpServers": {
    "felix-mcp-smithery": {
      "command": "npx",
      "args": [
        "-y",
        "@smithery/cli@latest",
        "run",
        "@FelixYifeiWang/felix-mcp-smithery",
        "--key",
        "YOUR_SMITHERY_API_KEY",
        "--profile",
        "YOUR_PROFILE_ID"
      ]
    }
  }
}
```
2. Start a new chat and run:
* “List tools from **felix-mcp-smithery**”
* “Call **hello** with `{ "name": "Felix" }`”
* “Call **summarize** on this text (2 sentences): …”
---
## Features
* **Streamable HTTP MCP** – Express + MCP SDK’s `StreamableHTTPServerTransport` on `/mcp` (POST/GET/DELETE).
* **Session-aware** – proper handling of `Mcp-Session-Id` (avoids recursive close handling).
* **OpenAI summarization** – tidy summaries via chat completions (model default `gpt-4o-mini`).
* **Zero-friction hosting** – packaged as a container and deployed on Smithery.
---
## Install (local)
> Requires **Node 18+** (tested on Node 20).
```bash
git clone https://github.com/FelixYifeiWang/felix-mcp-smithery
cd felix-mcp-smithery
npm install
```
Set the environment variable (needed only if you’ll call `summarize` locally):
```bash
export OPENAI_API_KEY="sk-..."
```
Run:
```bash
node index.js
# ✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
```
Local curl:
```bash
curl -s -X POST "http://localhost:8081/mcp" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H 'Mcp-Protocol-Version: 2025-06-18' \
  --data '{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}'
```
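The initialize response carries an `Mcp-Session-Id` response header; follow-up `tools/list` and `tools/call` requests must echo it back. An illustrative helper (not part of the repo) for building those follow-up headers:

```javascript
// Illustrative helper, not the repo's code: headers for follow-up MCP
// requests. The server assigns Mcp-Session-Id on initialize; later
// requests echo it back so the transport can route to the session.
function mcpHeaders(sessionId) {
  const headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
    "Mcp-Protocol-Version": "2025-06-18",
  };
  if (sessionId) headers["Mcp-Session-Id"] = sessionId;
  return headers;
}

console.log(JSON.stringify(mcpHeaders("abc-123")));
```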
---
## Usage (tools)
**hello**
```json
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"hello","arguments":{"name":"Felix"}}}
```
**randomNumber**
```json
{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"randomNumber","arguments":{"max":10}}}
```
**weather**
```json
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"weather","arguments":{"city":"Boston"}}}
```
**summarize** *(needs `OPENAI_API_KEY` set on the server)*
```json
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"summarize","arguments":{"text":"(paste long text)","maxSentences":2}}}
```
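The four request bodies above share one shape; a hypothetical helper (not in the repo) that builds them, so only the tool name and arguments vary:

```javascript
// Hypothetical helper (not in the repo): builds the JSON-RPC 2.0
// tools/call bodies shown above.
function toolCallBody(id, name, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

console.log(toolCallBody(1, "hello", { name: "Felix" }));
```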
---
## How it works
* **Server core:**
`McpServer` from `@modelcontextprotocol/sdk` with tools registered in `buildServer()`.
Transport: `StreamableHTTPServerTransport` on `/mcp` handling:
* `POST /mcp` — JSON-RPC requests (and first-time `initialize`)
* `GET /mcp` — server-to-client notifications (SSE)
* `DELETE /mcp` — end session
* **CORS:** Allows all origins; exposes `Mcp-Session-Id` header (good for hosted clients).
* **OpenAI summarize:** Thin `fetch` wrapper around `/v1/chat/completions` with a short “crisp summarizer” system prompt.
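A sketch of that wrapper's request construction (the function name, defaults, and prompt wording here are illustrative, not the repo's actual code; the endpoint is OpenAI's documented `/v1/chat/completions`):

```javascript
// Illustrative sketch, not the repo's actual code: builds the
// chat-completions payload that a thin fetch wrapper would POST to
// https://api.openai.com/v1/chat/completions with OPENAI_API_KEY as a
// bearer token. Pure function, so it runs without a key.
function buildSummarizeRequest(text, maxSentences = 2, model = "gpt-4o-mini") {
  return {
    model,
    messages: [
      {
        role: "system",
        content: `You are a crisp summarizer. Reply in at most ${maxSentences} sentences.`,
      },
      { role: "user", content: text },
    ],
  };
}

console.log(JSON.stringify(buildSummarizeRequest("Some long text")));
```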
---
## Deployment (Smithery)
1. GitHub repo with:
* `index.js` (Express + MCP)
* `package.json` (`@modelcontextprotocol/sdk`, `express`, `cors`, `zod`)
* `Dockerfile`
* `smithery.yaml`:
```yaml
kind: server
name: felix-mcp-smithery
version: 1.0.0
runtime: container
startCommand:
  type: http
  transport: streamable-http
  port: 8081
  path: /mcp
  ssePath: /mcp
  health: /
```
2. In Smithery:
* Create server from the repo.
* Add **Environment Variables**: `OPENAI_API_KEY` (optional for `summarize`).
* Deploy → confirm logs show:
`✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)`
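The `Dockerfile` referenced in step 1 can be as small as this sketch (an assumption, not the repo's actual file; it presumes `index.js` is the entrypoint, as in the local run above):

```dockerfile
# Minimal sketch; the repo's actual Dockerfile may differ.
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 8081
CMD ["node", "index.js"]
```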
---
## NANDA Index
* Go to **join39.org → Context Agents → Add**
* **Agent Name:** `Felix MCP (Smithery)`
* **MCP Endpoint:**
`https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp?api_key=YOUR_KEY&profile=YOUR_PROFILE`
* **Description:**
`Streamable-HTTP MCP hosted on Smithery. Tools: hello, randomNumber, weather, summarize (OpenAI).`
* Test from NANDA: `initialize` → `tools/list` → call `hello`.
---
## Project structure
```