io.github.neuledge/context

Local-first documentation for AI agents. Offline, instant, private doc retrieval.


<div align="center">
  <h1 align="center">Context</h1>
  <p align="center">
    <strong>Up-to-date docs for AI agents — local, instant, plug and play.</strong>
  </p>
</div>

<p align="center">
  <a href="https://www.npmjs.com/package/@neuledge/context"><img src="https://img.shields.io/npm/v/@neuledge/context.svg" alt="npm version"></a>
  <a href="https://github.com/neuledge/context/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-Apache--2.0-blue.svg" alt="License"></a>
  <a href="https://www.typescriptlang.org/"><img src="https://img.shields.io/badge/TypeScript-5.0-blue.svg" alt="TypeScript"></a>
</p>

---

AI agents are trained on outdated docs. When libraries release new versions, your AI doesn't know — and confidently gives you wrong answers.

```js
// Your AI, trained on AI SDK v5 docs, will suggest:
import { Experimental_Agent as Agent, stepCountIs } from 'ai';

// But v6 changed the API entirely:
import { ToolLoopAgent } from 'ai';
```

The fix isn't better prompting. It's giving your AI the right docs.

## How It Works

Context is an MCP server backed by a [community-driven package registry](registry/) with **100+ popular libraries** already built and ready to use. When your AI agent needs documentation, it searches the registry, downloads the right package, and queries it locally — all automatically.

**Install once. Configure once. Then just ask your AI.**

<p align="center">
  <img src="https://media.githubusercontent.com/media/neuledge/context/main/packages/context/assets/ai-sdk-demo.gif" alt="Context demo" width="800">
</p>
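The same flow can be exercised by hand. A minimal sketch using the two commands this README mentions, `context serve` and `context install` — the exact argument format for `context install` isn't documented here, so the package name below is an assumption:

```shell
# Start the MCP server your agent connects to (stdio transport)
context serve

# Optionally pre-fetch a registry package by hand; agents normally
# trigger this automatically, so this step is rarely needed
context install next.js
```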

---

## :rocket: Quick Start

### 1. Install

```bash
npm install -g @neuledge/context
```

### 2. Connect to your AI agent

Context works with any MCP-compatible agent. Pick yours:

<details>
<summary><strong>Claude Code</strong></summary>

```bash
claude mcp add context -- context serve
```

</details>

<details>
<summary><strong>Claude Desktop</strong></summary>

Add to your config file:
- **Linux**: `~/.config/claude/claude_desktop_config.json`
- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
  "mcpServers": {
    "context": {
      "command": "context",
      "args": ["serve"]
    }
  }
}
```

Restart Claude Desktop to apply changes.

</details>

<details>
<summary><strong>Cursor</strong></summary>

Add to `~/.cursor/mcp.json` (global) or `.cursor/mcp.json` (project-specific):

```json
{
  "mcpServers": {
    "context": {
      "command": "context",
      "args": ["serve"]
    }
  }
}
```

Or use **Settings > Developer > Edit Config** to add the server through the UI.

</details>

<details>
<summary><strong>OpenAI Codex</strong></summary>

Either use the CLI:

```bash
codex mcp add context -- context serve
```

Or add to `~/.codex/config.toml` (global) or `.codex/config.toml` (project-specific):

```toml
[mcp_servers.context]
command = "context"
args = ["serve"]
```

Restart OpenAI Codex to apply changes.

</details>

<details>
<summary><strong>VS Code (GitHub Copilot)</strong></summary>

> Requires VS Code 1.102+ with GitHub Copilot

Add to `.vscode/mcp.json` in your workspace:

```json
{
  "servers": {
    "context": {
      "type": "stdio",
      "command": "context",
      "args": ["serve"]
    }
  }
}
```

Click the **Start** button that appears in the file, then use Agent mode in Copilot Chat.

</details>

<details>
<summary><strong>Windsurf</strong></summary>

Add to your Windsurf MCP config file:
- **macOS/Linux**: `~/.codeium/windsurf/mcp_config.json`
- **Windows**: `%USERPROFILE%\.codeium\windsurf\mcp_config.json`

```json
{
  "mcpServers": {
    "context": {
      "command": "context",
      "args": ["serve"]
    }
  }
}
```

Or access via **Windsurf Settings > Cascade > MCP Servers**.

</details>

<details>
<summary><strong>Zed</strong></summary>

Add to your Zed settings (`cmd+,` or `ctrl+,`):

```json
{
  "context_servers": {
    "context": {
      "command": {
        "path": "context",
        "args": ["serve"]
      }
    }
  }
}
```

Check the Agent Panel settings to verify the server shows a green indicator.

</details>

<details>
<summary><strong>Goose</strong></summary>

Run `goose configure` and select **Command-line Extension**, or add directly to `~/.config/goose/config.yaml`:

```yaml
extensions:
  context:
    type: stdio
    command: context
    args:
      - serve
    timeout: 300
```

</details>

### 3. Ask your AI anything

That's it. Just ask:

> "How do I create middleware in Next.js?"

Your agent searches the [community registry](registry/), downloads the docs, and answers with accurate, version-specific information. Everything happens automatically — no manual `context install` needed for registry packages.

---

## The Community Registry

The registry is what makes Context plug and play. It's a growing collection of **100+ pre-built documentation packages** maintained by the community. Think of it like a package manager, but for AI-ready docs.

**Popular packages a