# Pick a model
Anthracode is provider-agnostic. Bring keys from Anthropic, OpenAI, Google, or run a local model via Ollama / LM Studio.
## Configure a provider
Run `anthracode auth login` for an interactive flow, or set environment variables directly.
`~/.env`

```sh
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GOOGLE_API_KEY=...
GROQ_API_KEY=...
```
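As a rough illustration of the environment-variable flow, here is a minimal sketch of how a client could resolve which provider to use from those keys. The variable names come from the file above; the precedence order (Anthropic first) and the `resolveProvider` helper are assumptions for illustration, not Anthracode's actual logic.

```typescript
// Hypothetical provider resolution from environment variables.
// Precedence order here is an assumption, not documented behavior.
type Provider = "anthropic" | "openai" | "google" | "groq";

const keyVars: Record<Provider, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  google: "GOOGLE_API_KEY",
  groq: "GROQ_API_KEY",
};

function resolveProvider(env: Record<string, string | undefined>): Provider | null {
  // First provider whose key is present wins.
  for (const p of Object.keys(keyVars) as Provider[]) {
    if (env[keyVars[p]]) return p;
  }
  return null;
}
```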
## Recommended models

For most coding tasks, use a frontier model with a reasoning option. These are sane defaults:
- `claude-opus-4-7` — best overall judgement; the default.
- `claude-sonnet-4-6` — fast, capable, lower cost.
- `gpt-5.1` — strong on long-context refactors.
- `gemini-2.5-pro` — long context, good for multi-repo work.
## Selecting a model
`~/terminal`

```sh
anthracode --model claude-opus-4-7
```

Or in config:
`~/anthracode.config.ts`

```ts
export default {
  model: "claude-opus-4-7",
  fallbackModel: "claude-sonnet-4-6",
};
```
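A `fallbackModel` setting typically means: try the primary model and retry on the fallback if the first call fails. The sketch below shows that pattern under that assumption; `CallFn` and `completeWithFallback` are illustrative names, not part of Anthracode's API.

```typescript
// Hypothetical fallback behavior: one retry on a secondary model.
type CallFn = (model: string, prompt: string) => Promise<string>;

async function completeWithFallback(
  call: CallFn,
  prompt: string,
  model: string,
  fallbackModel?: string,
): Promise<string> {
  try {
    return await call(model, prompt);
  } catch (err) {
    // No fallback configured: surface the original error.
    if (!fallbackModel) throw err;
    return await call(fallbackModel, prompt);
  }
}
```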
## Local models

Anthracode talks to Ollama and LM Studio out of the box.
`~/anthracode.config.ts`

```ts
export default {
  provider: {
    ollama: { baseUrl: "http://localhost:11434" },
  },
  model: "ollama/qwen2.5-coder:32b",
};
```

Local models are best for privacy-sensitive work and offline use. Expect higher latency and lower task-completion quality than frontier models.
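Before pointing Anthracode at Ollama, it can help to confirm the server is up and the model has been pulled. This sketch queries Ollama's own `/api/tags` endpoint (an Ollama API, not an Anthracode one); the `ollama/` prefix handling mirrors the model string above, and the helper names are hypothetical.

```typescript
// Shape of Ollama's GET /api/tags response (fields we use).
interface TagsResponse {
  models: { name: string }[];
}

// Check whether the model named in config appears in the pulled models.
function isModelAvailable(tags: TagsResponse, configModel: string): boolean {
  // Strip the provider prefix: "ollama/qwen2.5-coder:32b" -> "qwen2.5-coder:32b".
  const name = configModel.replace(/^ollama\//, "");
  return tags.models.some((m) => m.name === name);
}

async function checkOllama(baseUrl: string, configModel: string): Promise<boolean> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) return false;
  return isModelAvailable((await res.json()) as TagsResponse, configModel);
}
```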
## Cost & rate limits
Anthracode reports token usage per turn in the TUI status bar. `anthracode --usage` prints session totals, broken down by model.
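The per-model session totals amount to a simple aggregation over per-turn usage records. The sketch below shows that bookkeeping; the `TurnUsage` record shape is an assumption for illustration, not Anthracode's internal format.

```typescript
// Hypothetical per-turn usage record.
interface TurnUsage {
  model: string;
  inputTokens: number;
  outputTokens: number;
}

// Sum token counts across turns, grouped by model.
function usageByModel(turns: TurnUsage[]): Map<string, { input: number; output: number }> {
  const totals = new Map<string, { input: number; output: number }>();
  for (const t of turns) {
    const cur = totals.get(t.model) ?? { input: 0, output: 0 };
    cur.input += t.inputTokens;
    cur.output += t.outputTokens;
    totals.set(t.model, cur);
  }
  return totals;
}
```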
