
Quick Setup

1. Navigate to AI Settings

Open chrome://settings/browseros-ai to add Ollama as a provider.
2. Get the Model ID

Find the ID of the Ollama model you want to use (e.g., gpt-oss:20b).
3. Start the Ollama Server

Start Ollama with CORS enabled:

OLLAMA_ORIGINS="*" ollama serve
4. Select and Use

Select the model in the Agent dropdown and start using it! 🥳
If you'd rather not run Ollama from the CLI with CORS settings, we recommend using LM Studio instead. See the LM Studio setup guide.

Detailed Visual Guide

Step 1: Navigate to Settings Page

Navigate to the BrowserOS AI settings page at chrome://settings/browseros-ai
[Screenshot: Navigate to the settings page]

Step 2: Get the Ollama Model ID

Identify and copy your Ollama model ID for configuration.
[Screenshot: Get the Ollama model ID]
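
If you prefer the terminal, ollama list prints every installed model along with its tag. A minimal sketch, assuming gpt-oss:20b (the example model from above) is the one you want:

# List installed models; the NAME column holds the tag (e.g., gpt-oss:20b)
ollama list

# If the model isn't installed yet, pull it first
ollama pull gpt-oss:20b

Copy the tag from the NAME column into the model ID field in the provider settings.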

Step 3: Start Ollama from CLI

Run Ollama with the required CORS settings to allow BrowserOS to connect:
OLLAMA_ORIGINS="*" ollama serve
By default, Ollama rejects requests from other applications, so this setting is required.

[Screenshot: Start Ollama from the CLI]
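
Before switching back to the browser, you can confirm the server is reachable by querying Ollama's HTTP API, which listens on port 11434 by default:

# Should return a JSON list of your installed models
curl http://localhost:11434/api/tags

If you'd rather not keep a terminal session open, the Ollama documentation also covers setting OLLAMA_ORIGINS persistently (for example, via launchctl setenv on macOS or a systemd service override on Linux).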

Step 4: Use the Model

Select the model in the Agent dropdown and start using it! 🚀
[Screenshot: Use the model]

Alternative: LM Studio

LM Studio Setup

If you prefer not to run Ollama from the command line, LM Studio provides a more user-friendly alternative with a graphical interface.
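
For context, LM Studio exposes an OpenAI-compatible API once you start its local server from the app. Assuming the default port of 1234, you can sanity-check it the same way:

# Should list the models currently loaded in LM Studio
curl http://localhost:1234/v1/models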