Which Model Should I Use?
| Mode | What works | Recommendation |
|---|---|---|
| Chat Mode | Any model, including local | Ollama or Gemini Flash |
| Agent Mode | Cloud models only | Claude Opus 4.5 or Kimi K2.5 (open source) |
Kimi K2.5 is an open-source, multimodal model with strong agentic performance, at roughly 60-70% lower cost than Claude models.
Cloud Providers
Connect to powerful AI models using your API keys. Your keys stay on your machine; requests go directly to the provider.
Gemini (Free)
Gemini Flash is fast and free. Google gives you 20 requests per minute at no cost.

Get your API key:
- Go to aistudio.google.com
- Click Get API key in the sidebar
- Click Create API key and copy it

Add to BrowserOS:
- Go to chrome://browseros/settings
- Click USE on the Gemini card
- Set Model ID to gemini-2.5-flash-preview-05-20
- Paste your API key
- Check Supports Images, set Context Window to 1000000
- Click Save
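Before saving, you can confirm the key works by sending a small test request from a terminal. This is an optional sketch using Google's public Gemini REST endpoint; replace YOUR_API_KEY with the key you just copied.

```shell
# Send a one-line prompt to Gemini; a valid key returns JSON with a
# "candidates" field, an invalid key returns an error object.
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash-preview-05-20:generateContent?key=YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"contents": [{"parts": [{"text": "Say hello"}]}]}'
```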

Claude (Best for Agents)
Claude Opus 4.5 gives the best results for Agent Mode.

Get your API key:
- Go to console.anthropic.com
- Click API keys in the sidebar
- Click Create Key and copy it

Add to BrowserOS:
- Go to chrome://browseros/settings
- Click USE on the Anthropic card
- Set Model ID to claude-opus-4-5-20250514
- Paste your API key
- Check Supports Images, set Context Window to 200000
- Click Save
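To check the key before saving, you can call Anthropic's Messages API directly from a terminal. A sketch (replace YOUR_API_KEY; the model ID matches the one used above):

```shell
# A valid key returns a JSON message object; an invalid key
# returns an authentication_error.
curl https://api.anthropic.com/v1/messages \
  -H "x-api-key: YOUR_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-opus-4-5-20250514", "max_tokens": 32, "messages": [{"role": "user", "content": "Say hello"}]}'
```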

OpenAI
GPT-4.1 is solid for both chat and agent tasks.

Get your API key:
- Go to platform.openai.com
- Click the settings icon → API keys
- Click Create new secret key and copy it

Add to BrowserOS:
- Go to chrome://browseros/settings
- Click USE on the OpenAI card
- Set Model ID to gpt-4.1
- Paste your API key
- Check Supports Images, set Context Window to 128000
- Click Save
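As with the other providers, you can verify the key first with a quick request to OpenAI's Chat Completions endpoint (replace YOUR_API_KEY):

```shell
# A valid key returns a chat.completion JSON object;
# an invalid key returns a 401 error.
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4.1", "messages": [{"role": "user", "content": "Say hello"}]}'
```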

OpenRouter
Access 500+ models through one API.

Get your API key:
- Go to openrouter.ai and sign up
- Copy your API key from the homepage
- Pick a model and copy its ID (e.g., anthropic/claude-opus-4.5)

Add to BrowserOS:
- Go to chrome://browseros/settings
- Click USE on the OpenRouter card
- Paste the model ID and your API key
- Set Context Window based on the model
- Click Save
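OpenRouter exposes an OpenAI-compatible endpoint, so the same curl check works there. A sketch (replace YOUR_API_KEY; the model ID is the example from the steps above):

```shell
# Routes the request through OpenRouter to the chosen model;
# a valid key returns a chat.completion JSON object.
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "anthropic/claude-opus-4.5", "messages": [{"role": "user", "content": "Say hello"}]}'
```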

Local Models
Local Model Guide
Run AI completely offline with Ollama or LM Studio. Includes recommended models, context length setup, and configuration steps.
Switching Between Models
Use the model switcher in the Assistant panel to change providers anytime. The default provider is highlighted.
