Windsor.ai MCP
Query and analyze marketing, sales, and business data from 325+ platforms via Windsor.ai.
# Windsor MCP Server
Windsor MCP (Model Context Protocol) enables your LLM to query, explore, and analyze the full-stack business data you have integrated into Windsor.ai, with no SQL or custom scripting.
It connects seamlessly to 325+ platforms, giving AI-native tools such as Claude, Perplexity, and Cursor real-time access to your performance marketing, sales, and customer data to help you unlock valuable insights.
---
## 🌟 Features
### Natural language access to business data
Windsor MCP is a natural language interface that connects your integrated Windsor.ai datasets to your LLM, letting you better understand your data by asking questions like:
- “What campaigns had the best ROAS last month?”
- “Give me a breakdown of spend by channel over the past 90 days.”
- “What campaigns are wasting our advertising budget?”
All in real-time, directly inside your LLM chat interface.
### Out-of-the-box integration with 325+ sources
Sync data from Facebook Ads, GA4, HubSpot, Salesforce, Shopify, TikTok Ads, and more via native Windsor.ai connectors.
### Zero-code setup
Windsor MCP works via Claude Desktop or a lightweight dev proxy. No custom integrations required.
### Open standard compatibility
Built on Anthropic’s open MCP spec, it’s compatible with Claude, Perplexity, Cursor, and more.
### Real-time analytics without SQL
Get instant breakdowns, summaries, and performance insights from your integrated data.
---
## 🎯 How It Works
You connect Windsor MCP to your preferred LLM as an external connector using the MCP protocol. The LLM can then issue real-time data queries and receive structured results, all within the chat interface.
### Example prompts:
- What was total ad spend by channel last month?
- Break down ROAS for Meta vs Google Ads for Q2
- Are there any campaigns overspending vs target ROAS?
---
## 🚀 Getting Started
### View our official documentation
[https://windsor.ai/introducing-windsor-mcp/](https://windsor.ai/introducing-windsor-mcp/)
---
## Option 1: Claude Desktop (Recommended)
### Prerequisites:
- Claude Pro or higher-tier Claude Desktop plan
- Your Windsor API key
### Steps:
1. Go to **Claude settings → Connectors → Add custom connector**
2. Use one of the following URLs for Windsor MCP:
- `https://mcp.windsor.ai`
- `https://mcp.windsor.ai/sse`
3. Open a new chat and start with:
<pre>
My Windsor.ai API key is {your-key}. {Your question here}
</pre>
4. Accept connector permissions and start querying your data!
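Before adding the connector, you can optionally confirm the endpoint is reachable from your machine. This is a convenience check, not an official setup step (requires a POSIX shell; uses `curl` if available):

```shell
# Optional reachability check for the Windsor MCP endpoint.
# -m 10 caps the wait at 10 seconds; "000" means no HTTP response.
MCP_URL="https://mcp.windsor.ai/sse"
HTTP_CODE="$(curl -s -o /dev/null -w '%{http_code}' -m 10 "$MCP_URL" 2>/dev/null)" || HTTP_CODE="000"
echo "HTTP status for $MCP_URL: $HTTP_CODE"
```

Any three-digit status confirms the host answered; `000` suggests a network, proxy, or firewall issue to resolve before configuring the connector.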
---
## Option 2: Developer Proxy Setup
For users on lower-tier Claude plans, or for anyone who needs a custom setup with more flexibility.
### Prerequisites:
- Claude Desktop with dev mode enabled
### Installation steps:
1. Install mcp-proxy and copy its path.
<pre>
uv tool install mcp-proxy
which mcp-proxy # Copy full path
</pre>
2. Configure Claude Desktop:
Open Settings → Developer → Edit Config and add:
<pre>
{
"mcpServers": {
"windsor": {
"command": "/Users/{your-username}/.local/bin/mcp-proxy",
"args": ["https://mcp.windsor.ai/sse"]
}
}
}
</pre>
💡 Replace `{your-username}` with your system username.
3. Fully quit and reopen Claude. You should now see “windsor” listed in your MCP options.
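If Claude doesn't pick up the server after a restart, a malformed config file is a common culprit. As an optional sanity check (not an official Windsor step), you can validate the snippet with Python's stdlib `json.tool` before pasting it in; the username below is an example:

```shell
# Optional sanity check: confirm the MCP config snippet is valid JSON
# before adding it to Claude's config (example path, adjust as needed).
CONFIG='{
  "mcpServers": {
    "windsor": {
      "command": "/Users/alice/.local/bin/mcp-proxy",
      "args": ["https://mcp.windsor.ai/sse"]
    }
  }
}'
if echo "$CONFIG" | python3 -m json.tool > /dev/null 2>&1; then
  RESULT="config OK"
else
  RESULT="invalid JSON"
fi
echo "$RESULT"
```

A syntax slip such as a trailing comma would print `invalid JSON` instead of `config OK`.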
---
## Option 3: Windsor MCP with Cursor
### Prerequisites:
- Cursor Desktop installed
- Your Windsor API key
### Installation steps:
1. Install mcp-proxy
<pre>
uv tool install mcp-proxy
which mcp-proxy # Copy full path
</pre>
2. Open settings in Cursor Desktop. Select **Tools & Integrations** > **New MCP Server**.
3. The mcp.json file will open. Paste the following configuration into it:
<pre>
{
"mcpServers": {
"windsor": {
"command": "/Users/{your-username}/.local/bin/mcp-proxy",
"args": ["https://mcp.windsor.ai/sse"]
}
}
}
</pre>
4. Windsor MCP is now active in Cursor. When it prompts for your Windsor API key, paste it in, and you can start asking questions about your data.
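If the server doesn't appear in Cursor, a mismatched `command` path is a frequent cause. A quick hedged check (assumes a POSIX shell):

```shell
# Locate mcp-proxy so the "command" field in mcp.json matches reality.
MCP_PROXY="$(command -v mcp-proxy || true)"
if [ -n "$MCP_PROXY" ]; then
  STATUS="found"
  echo "mcp-proxy found at: $MCP_PROXY"
else
  STATUS="missing"
  echo "mcp-proxy not on PATH; run: uv tool install mcp-proxy"
fi
```

If the printed path differs from the one in mcp.json (for example, uv installed it somewhere other than `~/.local/bin`), update the `command` field accordingly.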
---
## Option 4: Windsor MCP with Gemini CLI
### Installation steps:
1. Install mcp-proxy
<pre>
uv tool install mcp-proxy
which mcp-proxy # Copy full path
</pre>
2. Install Gemini CLI
Install the Gemini CLI globally via npm (requires Node.js 18 or later).
<pre>
npm install -g @google/gemini-cli
</pre>
3. Configure Gemini to use Windsor MCP
Navigate to the Gemini config directory:
<pre>
cd ~/.gemini
</pre>
If the `.gemini` directory doesn’t exist yet, run `gemini` once to generate it.
Open the settings.json file:
<pre>
nano settings.json
</pre>
Add the following configuration inside the JSON object:
<pre>
{
"mcpServers": {
"windsor": {
"command": "/Users/{your-username}/.local/bin/mcp-proxy",
"args": ["https://mcp.windsor.ai/sse"]
}
}
}
</pre>
Note: Make sure the overall file remains valid JSON (no trailing commas or syntax errors).
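Editing JSON by hand is error-prone. As an optional alternative (a sketch, not part of Windsor's official setup), the snippet below merges the `windsor` entry into `settings.json` using Python's stdlib, preserving any keys already in the file; the proxy path and settings location are assumed defaults:

```shell
# Hypothetical helper: merge the windsor server entry into settings.json
# without discarding existing keys. Adjust SETTINGS and the proxy path
# to match your system.
SETTINGS="${HOME}/.gemini/settings.json"
python3 - "$SETTINGS" <<'EOF'
import json, os, sys

path = sys.argv[1]
os.makedirs(os.path.dirname(path), exist_ok=True)
data = {}
if os.path.exists(path):
    with open(path) as f:
        data = json.load(f)
# Add (or overwrite) only the "windsor" entry under "mcpServers".
data.setdefault("mcpServers", {})["windsor"] = {
    "command": os.path.expanduser("~/.local/bin/mcp-proxy"),
    "args": ["https://mcp.windsor.ai/sse"],
}
with open(path, "w") as f:
    json.dump(data, f, indent=2)
print("merged windsor entry into", path)
EOF
```

Because it round-trips the file through `json.load`/`json.dump`, the result is guaranteed to be valid JSON with no trailing commas.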
4. Start Gemini with Windsor MCP
Now, simply run Gemini:
<pre>
gemini
</pre>
You’ll be asked for your Windsor API key the first time you query your data; paste it in and start exploring.