Install
One install. Three ways to use it.
Install once. Pick the integration that fits your workflow.
Python SDK
For custom code — embed Gradata in your app
Use Gradata programmatically inside your Python application. Log corrections, inspect rules, export brains.
- Full API surface
- Runs in your process
- Works with any LLM provider
- 1,900+ tests
CLI
For Claude Code, Codex CLI, Cursor, Aider, bash scripts
A shell command any AI agent can call. Token-efficient, composes with Unix pipes, zero ceremony.
- Any shell-capable AI uses it natively
- Low token overhead (~200 tokens vs MCP's 5K+)
- Composable with pipes
- No schema negotiation
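A hypothetical invocation sketch of the pipe-composition pattern described above. The `correct` and `rules` subcommands and their flags are illustrative assumptions, not documented Gradata CLI syntax:

```
# Hypothetical subcommands -- shown only to illustrate shell composition.
gradata correct --draft draft.txt --final final.txt
gradata rules | grep tone | head -5
```

Because the interface is plain stdin/stdout text, an agent can chain it with grep, head, jq, or any other Unix tool without a client library.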
MCP Server
For Claude Desktop, ChatGPT Desktop, Windsurf, Zed
An MCP server any MCP-capable client can connect to. Auto-discovery, standardized schema, no shell needed.
- Auto-discovery by MCP clients
- Works without shell access
- Standardized tool schema
- Enterprise-friendly (OAuth, audit logging)
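MCP clients and servers exchange JSON-RPC 2.0 messages, so a tool call is just a structured request. A minimal sketch of what such a request might look like; the tool name `log_correction` and its argument shape are assumptions for illustration, not Gradata's actual tool schema:

```python
import json

# Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke a
# server-side tool. "log_correction" and its arguments are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "log_correction",  # hypothetical tool name
        "arguments": {
            "draft": "Dear Sir...",
            "final": "Hey Mike...",
        },
    },
}

payload = json.dumps(request)
print(payload)
```

The client discovers available tools via the standardized schema (the "auto-discovery" above), so no shell access or custom glue code is needed on the client side.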
For developers
Three lines to get started.
from gradata import Brain

# Open (or create) a local brain directory, then log one correction:
# the draft your model produced and the final text you actually sent.
brain = Brain("./my-brain")
brain.correct(draft="Dear Sir...", final="Hey Mike...")