# Models
Sidekick supports a wide range of AI models from leading providers. Each model has different capabilities, context windows, and pricing.
## Selecting a Model
1. **Open the model selector.** Click the model dropdown in the chat input area (it shows the current model's name).
2. **Browse available models.** Models are grouped by provider; only models you've enabled in Settings appear here.
3. **Choose your model.** Click a model to select it. Your choice is saved for future conversations.
## Configuring Model Visibility
Control which models appear in the model selector:
- Go to Settings → Models
- Toggle models on/off for each provider
- Disabled models won't appear in the selector
Hide models you don't use to keep the selector clean and focused.
## Using Ollama (Local Models)
Run AI models locally with Ollama:
1. **Install Ollama.** Download and install Ollama on your machine.
2. **Pull a model.**

   ```shell
   ollama pull llama3.2
   ollama pull codellama
   ```

3. **Configure in Sidekick.** Go to Settings → Ollama and ensure the connection is configured.
4. **Select your local model.** Ollama models appear in the model selector under "Ollama".
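If models aren't showing up, it can help to confirm the Ollama server is actually running before troubleshooting Sidekick. A minimal check, assuming Ollama's defaults (port 11434 and the `/api/tags` endpoint, which lists installed models); Sidekick's own connection check in Settings → Ollama may work differently:

```shell
# Query Ollama's local API; -f makes curl fail on HTTP errors,
# -s silences progress output.
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable - try starting it with 'ollama serve'"
fi
```

A successful response also lists every pulled model, so it doubles as a way to verify your `ollama pull` completed.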
Local models run entirely on your machine—no API keys or internet required.
## Choosing the Right Model
### For everyday coding
- Claude Sonnet or GPT-5 — Good balance of speed and capability
- Gemini Flash — Fast responses for quick tasks
### For complex reasoning
- Claude Opus — Deep analysis and planning
- Models with Extended Thinking — Enable for complex problems
### For cost efficiency
- Claude Haiku — Fast and affordable
- Ollama models — Free (runs locally)
### For vision tasks
- Any model with Vision capability
- Upload screenshots for UI debugging, error analysis, etc.
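If you automate tasks outside Sidekick, the guidance above can be encoded as a simple lookup. A sketch only; the model names below are illustrative placeholders, not exact provider identifiers:

```shell
# Map a task type to a suggested model, following the guidance above.
# Model names are illustrative, not exact API identifiers.
pick_model() {
  case "$1" in
    everyday_coding)   echo "claude-sonnet" ;;
    quick_tasks)       echo "gemini-flash" ;;
    complex_reasoning) echo "claude-opus" ;;
    cost_sensitive)    echo "claude-haiku" ;;
    *)                 echo "claude-sonnet" ;;  # sensible default
  esac
}

pick_model quick_tasks   # prints gemini-flash
```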
## API Keys
Each provider requires an API key. See Getting Started for setup instructions.
| Provider | Get API Key |
|---|---|
| Anthropic | console.anthropic.com |
| OpenAI | platform.openai.com |
| Google | aistudio.google.com |
| xAI | x.ai |
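Sidekick stores keys in its Settings, but if you also use these providers from scripts or their SDKs, keeping keys in environment variables avoids hard-coding them. The variable names below are the conventional ones used by each provider's own SDK and are shown only as an example; they are not required by Sidekick:

```shell
# Conventional environment-variable names for provider API keys
# (illustrative; Sidekick reads keys from its own Settings).
export ANTHROPIC_API_KEY="sk-ant-..."   # from console.anthropic.com
export OPENAI_API_KEY="sk-..."          # from platform.openai.com
export GEMINI_API_KEY="..."             # from aistudio.google.com
export XAI_API_KEY="..."                # from x.ai
```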