
io.github.backspacevenkat/perspectives · v1.8.70

# Polydev - Multi-Model AI Perspectives

Get unstuck faster. Query GPT 5.2, Claude Opus 4.5, Gemini 3, and Grok 4.1 simultaneously — one API call, four expert opinions.



## Why Polydev?

Stop copy-pasting between ChatGPT, Claude, and Gemini. Get all their perspectives in your IDE with one request.

| Metric | Result |
|---|---|
| SWE-bench Verified | 74.6% Resolve@2 |
| Cost vs Claude Opus | 62% lower |
| Response time | 10-40 seconds |

"Different models have different blind spots. Combining their perspectives eliminates yours."


## Supported Models

| Model | Provider | Strengths |
|---|---|---|
| GPT 5.2 | OpenAI | Reasoning, code generation |
| Claude Opus 4.5 | Anthropic | Analysis, nuanced thinking |
| Gemini 3 Pro | Google | Multimodal, large context |
| Grok 4.1 | xAI | Real-time knowledge, directness |

## Quick Start

### 1. Get your free API token

polydev.ai/dashboard/mcp-tokens

| Tier | Messages/Month | Price |
|---|---|---|
| Free | 1,000 | $0 |
| Pro | 10,000 | $19/mo |

### 2. Install in your IDE

#### Claude Code

```bash
claude mcp add polydev -- npx -y polydev-ai@latest
```

Then set your token:

```bash
export POLYDEV_USER_TOKEN="pd_your_token_here"
```
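Every IDE config below relies on this variable, and a typo in it fails silently at request time. A quick shell check (a generic sketch, not part of Polydev) confirms the token is set and carries the `pd_` prefix shown in these docs:

```shell
# Example token; replace with your real one from polydev.ai/dashboard/mcp-tokens.
export POLYDEV_USER_TOKEN="pd_your_token_here"

# Fail fast if the token is missing or lacks the expected "pd_" prefix.
: "${POLYDEV_USER_TOKEN:?POLYDEV_USER_TOKEN is not set}"
case "$POLYDEV_USER_TOKEN" in
  pd_*) STATUS="token configured" ;;
  *)    STATUS="unexpected token format" ;;
esac
echo "$STATUS"
```

Run this in the same shell session that launches your IDE, since the variable is not inherited across sessions.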

Or add to ~/.claude.json:

```json
{
  "mcpServers": {
    "polydev": {
      "command": "npx",
      "args": ["-y", "polydev-ai@latest"],
      "env": {
        "POLYDEV_USER_TOKEN": "pd_your_token_here"
      }
    }
  }
}
```
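A malformed config file silently breaks server registration, so it is worth validating the JSON before restarting the IDE. This sketch uses Python's built-in `json.tool` module (a generic check, not a Polydev feature) and writes the config to a temp file to stay self-contained; in practice you would point it at `~/.claude.json`:

```shell
# Write the config to a temp file and verify it parses as JSON.
CFG="$(mktemp)"
cat > "$CFG" <<'EOF'
{
  "mcpServers": {
    "polydev": {
      "command": "npx",
      "args": ["-y", "polydev-ai@latest"],
      "env": { "POLYDEV_USER_TOKEN": "pd_your_token_here" }
    }
  }
}
EOF
if python3 -m json.tool "$CFG" > /dev/null; then
  RESULT="config OK"
else
  RESULT="invalid JSON"
fi
echo "$RESULT"
rm -f "$CFG"
```

The same check applies unchanged to the Cursor and Windsurf configs below, which use the identical `mcpServers` shape.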

#### Cursor

Add to ~/.cursor/mcp.json:

```json
{
  "mcpServers": {
    "polydev": {
      "command": "npx",
      "args": ["-y", "polydev-ai@latest"],
      "env": {
        "POLYDEV_USER_TOKEN": "pd_your_token_here"
      }
    }
  }
}
```

#### Windsurf

Add to your MCP configuration:

```json
{
  "mcpServers": {
    "polydev": {
      "command": "npx",
      "args": ["-y", "polydev-ai@latest"],
      "env": {
        "POLYDEV_USER_TOKEN": "pd_your_token_here"
      }
    }
  }
}
```

#### Cline (VS Code)

  1. Open Cline settings (gear icon)
  2. Go to "MCP Servers" → "Configure"
  3. Add the same JSON config as above

#### OpenAI Codex CLI

Add to ~/.codex/config.toml:

```toml
[mcp_servers.polydev]
command = "npx"
args = ["-y", "polydev-ai@latest"]

[mcp_servers.polydev.env]
POLYDEV_USER_TOKEN = "pd_your_token_here"

[mcp_servers.polydev.timeouts]
tool_timeout = 180
session_timeout = 600
```

## Usage

### Natural Language

Just mention "polydev" or "perspectives" in your prompt:

```text
"Use polydev to debug this infinite loop"

"Get perspectives on: Should I use Redis or PostgreSQL for caching?"

"Use polydev to review this API for security issues"
```

### MCP Tool

Call the get_perspectives tool directly:

```json
{
  "tool": "get_perspectives",
  "arguments": {
    "prompt": "How should I optimize this database query?",
    "user_token": "pd_your_token_here"
  }
}
```
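Under the hood, MCP clients deliver a tool call like this as a JSON-RPC `tools/call` request over the server's stdio transport. The sketch below only constructs and prints that message so the wire format is visible; the method name comes from the MCP specification, while the prompt text is illustrative:

```shell
# Sketch: the JSON-RPC message an MCP client sends to invoke get_perspectives.
# "tools/call" is the standard MCP method; prompt and token are placeholders.
PROMPT="How should I optimize this database query?"
PAYLOAD=$(printf '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"get_perspectives","arguments":{"prompt":"%s","user_token":"pd_your_token_here"}}}' "$PROMPT")
echo "$PAYLOAD"
```

Your IDE's MCP client builds and sends this for you; you never need to write it by hand.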

### Example Response

```text
🤖 Multi-Model Analysis

┌─ GPT 5.2 ────────────────────────────────────────
│ The N+1 query pattern is causing performance issues.
│ Consider using eager loading or batch queries...
└──────────────────────────────────────────────────

┌─ Claude Opus 4.5 ────────────────────────────────
│ Looking at the execution plan, the table scan on
│ `users` suggests a missing index on `email`...
└──────────────────────────────────────────────────

┌─ Gemini 3 ───────────────────────────────────────
│ The query could benefit from denormalization for
│ this read-heavy access pattern...
└──────────────────────────────────────────────────

┌─ Grok 4.1 ───────────────────────────────────────
│ Just add an index. The real problem is you're
│ querying in a loop - fix that first.
└──────────────────────────────────────────────────

✅ Consensus: Add index on users.email, fix N+1 query
💡 Recommendation: Use eager loading with proper indexing
```

## Research

Our approach achieves 74.6% on SWE-bench Verified (Resolve@2), matching Claude Opus at 62% lower cost.

| Approach | Resolution Rate | Cost/Instance |
|---|---|---|
| Claude Haiku (baseline) | 64.6% | $0.18 |
| + Polydev consultation | 66.6% | $0.24 |
| Resolve@2 (best of both) | 74.6% | $0.37 |
| Claude Opus (reference) | 74.4% | $0.97 |

Read the full paper →


## Available Tools

| Tool | Description |
|---|---|
| `get_perspectives` | Query multiple AI models simultaneously |
| `get_cli_status` | Check status of local CLI tools |
| `force_cli_detection` | Re-detect installed CLI tools |
| `send_cli_prompt` | Send prompts to local CLIs with fallback |

## Links

- IDE Guides


## License

MIT License - see LICENSE for details.
