Skaro supports four LLM providers out of the box. You can use one provider for everything, or mix them using role-based routing.

## Provider Comparison

| Provider | API Key Required | Default Model | Console URL |
|----------|------------------|---------------|-------------|
| Anthropic | Yes | `claude-sonnet-4-6` | console.anthropic.com |
| OpenAI | Yes | `gpt-5.2` | platform.openai.com |
| Groq | Yes | `llama-3.3-70b-versatile` | console.groq.com |
| Ollama | No | `qwen3:32b` | Local (ollama.com) |

## Available Models

### Anthropic

| Model | Context Window | Max Output |
|-------|----------------|------------|
| Claude Opus 4.6 (`claude-opus-4-6`) | 200K | 128K |
| Claude Sonnet 4.6 (`claude-sonnet-4-6`) | 200K | 64K |
| Claude Sonnet 4.5 (`claude-sonnet-4-5-20250929`) | 200K | 64K |
| Claude Haiku 4.5 (`claude-haiku-4-5-20251001`) | 200K | 64K |

### OpenAI

| Model | Context Window | Max Output |
|-------|----------------|------------|
| GPT-5.2 (`gpt-5.2`) | 256K | 128K |
| GPT-5.1 (`gpt-5.1`) | 256K | 128K |
| GPT-5 (`gpt-5`) | 256K | 65K |
| GPT-5 Mini (`gpt-5-mini`) | 256K | 65K |
| GPT-5.2 Codex (`gpt-5.2-codex`) | 256K | 128K |
| GPT-4.1 (`gpt-4.1`) | 1M | 32K |
| GPT-4.1 Mini (`gpt-4.1-mini`) | 1M | 32K |

### Groq

| Model | Context Window | Max Output |
|-------|----------------|------------|
| Llama 3.3 70B (`llama-3.3-70b-versatile`) | 131K | 32K |
| Llama 3.1 8B Instant (`llama-3.1-8b-instant`) | 131K | 131K |
| GPT-OSS 120B (`openai/gpt-oss-120b`) | 131K | 65K |
| Llama 4 Scout 17B (`meta-llama/llama-4-scout-17b-16e-instruct`) | 131K | 8K |
| Kimi K2 (`moonshotai/kimi-k2-instruct-0905`) | 262K | 16K |
| Qwen3 32B (`qwen/qwen3-32b`) | 131K | 40K |

### Ollama (Local)

| Model | Context Window | Max Output |
|-------|----------------|------------|
| Qwen3 32B (`qwen3:32b`) | 131K | 40K |
| Qwen 3.5 35B (`qwen3.5:35b`) | 131K | 40K |
| Llama 3.3 70B (`llama3.3:70b`) | 131K | 32K |
| DeepSeek R1 70B (`deepseek-r1:70b`) | 131K | 65K |
| Gemma 3 27B (`gemma3:27b`) | 131K | 8K |
| Phi-4 14B (`phi4:14b`) | 16K | 16K |
| CodeLlama 34B (`codellama:34b`) | 16K | 16K |

You can also enter any custom model ID during `skaro init` or via `skaro config --model your-model-id`. The lists above are suggestions, not hard limits.
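For example, to point the Ollama provider at a model tag that is not in the table above (the tag here is just an illustration; any tag you have pulled locally works):

```shell
skaro config --provider ollama --model deepseek-r1:32b
```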

## Choosing a Provider

- **Best quality:** Anthropic or OpenAI. Larger models produce better architecture reviews and more consistent code. Best for the architect role.
- **Fastest inference:** Groq. Hardware-accelerated inference makes it excellent for code generation. Good for the coder role.
- **Full privacy:** Ollama. Code never leaves your machine and there are no API costs. Trade-off: it requires local hardware (16GB+ RAM for 30B+ models) and may produce lower-quality output than cloud providers.
- **Cost-effective start:** Groq offers a generous free tier. Good for trying Skaro without spending money.

## Quick Setup

```bash
# Pick one:
skaro config --provider anthropic --api-key sk-ant-...
skaro config --provider openai --api-key sk-...
skaro config --provider groq --api-key gsk_...
skaro config --provider ollama --model qwen3:32b
```
See Role-Based Routing to use different providers for different phases.
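Skaro's actual routing configuration is documented on that page; purely as an illustration of the idea, here is a minimal Python sketch of role-based routing — a table mapping each role to a (provider, model) pair, with a fallback default. None of these names reflect Skaro's real API; the models are taken from the tables above.

```python
# Minimal sketch of role-based routing (hypothetical, not Skaro's real API):
# each pipeline role resolves to its own (provider, model) pair.

ROUTES = {
    "architect": ("anthropic", "claude-sonnet-4-6"),  # quality-sensitive phase
    "coder": ("groq", "llama-3.3-70b-versatile"),     # speed-sensitive phase
}

DEFAULT = ("anthropic", "claude-sonnet-4-6")

def resolve(role: str) -> tuple[str, str]:
    """Return the (provider, model) pair for a role, falling back to DEFAULT."""
    return ROUTES.get(role, DEFAULT)
```

The point of the pattern is that each phase pays only for the capability it needs: a slow, high-quality model reviews architecture while a fast model generates code.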