Opinionated MCP server — Pydantic output schemas, 7 domains, built to evolve.
An information metabolism engine. Every text fragment loaded into an LLM's context window is energy — tokens consumed, output influenced, budget spent. Vivesca applies continuous selection pressure so that energy is never wasted.
The MCP server is one interface. The metabolism is the identity.
```shell
uv add vivesca
```

Should LLM systems manage their own context? Vivesca tests whether they should, and how.
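As a rough sketch of what "Pydantic output schemas" can mean for an MCP tool: a Pydantic model both validates a tool's structured result and emits the JSON Schema that MCP advertises as the tool's `outputSchema`. The names below (`FragmentReport`, its fields) are illustrative assumptions, not the actual vivesca API.

```python
# Hypothetical sketch, not the real vivesca API: a Pydantic model
# describing the structured output of a context-metabolism tool.
from pydantic import BaseModel, Field


class FragmentReport(BaseModel):
    """One selection verdict for a text fragment in the context window."""

    fragment_id: str = Field(description="Stable identifier for the fragment")
    tokens: int = Field(ge=0, description="Energy cost: tokens the fragment consumes")
    keep: bool = Field(description="Selection verdict: retain or evict")
    rationale: str = Field(default="", description="Why the verdict was reached")


# Pydantic v2 emits JSON Schema directly; an MCP server can expose this
# dict as the tool's outputSchema.
schema = FragmentReport.model_json_schema()
print(sorted(schema["properties"]))
```

The same model then double-duties at runtime: the server validates each result with `FragmentReport.model_validate(...)` before returning it, so the advertised schema and the actual payload cannot drift apart.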
Claim: Every text fragment in an LLM's context is energy under selection pressure. Continuous metabolism — governed by taste, not just metrics — outperforms manual optimization.
Predictions:
Status: Hypothesis with one implementation, zero external validation. The signals accumulating now are the first real data.
Tokens are energy. Text is mass. Taste decides how to spend it. The rest is plumbing.
Read the trilogy: