
Complete Setup Guide

This guide will walk you through setting up and using OpenAI’s GPT-OSS model with LM Studio and BrowserOS.

Step 1: Set Up LM Studio and Download GPT-OSS

1. Download LM Studio

Download LM Studio from https://lmstudio.ai/

2. Open the Discover Page

Click on Discover in LM Studio (the 🔍 icon on the left).

3. Search and Download the Model

Search for gpt-oss-20b and click Download.

4. Load the Model

After the download finishes, load the model.
Enable the flag to choose model parameters on load.
Set the context length to 32768 (adjust based on your hardware) and load the model.

Step 2: Configure BrowserOS to Use LM Studio

1. Add Provider

Navigate to chrome://settings/browseros-ai and click Add Provider.

2. Select Provider Type

Choose OpenAI Compatible as the Provider Type.

3. Configure Connection

  • Set Base URL to http://127.0.0.1:1234/v1
  • Set Model ID to openai/gpt-oss-20b
  • Set context length to 32768
  • Save your configuration
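Before relying on the Agent, you can sanity-check the same Base URL and Model ID from a quick script. The following is a minimal sketch, assuming the official openai Python package is installed (pip install openai); LM Studio's local server generally accepts any placeholder API key:

```python
from openai import OpenAI

# Point the client at LM Studio's OpenAI-compatible local server.
# The api_key is a placeholder; LM Studio does not require a real key.
client = OpenAI(base_url="http://127.0.0.1:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",  # must match the Model ID configured in BrowserOS
    messages=[{"role": "user", "content": "Reply with the single word: ready"}],
    max_tokens=16,
)
print(response.choices[0].message.content)
```

If this prints a reply, BrowserOS should work with the exact same settings.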

4. Set as Default

Change the default provider to the LM Studio provider you just added, and you’re ready to go!

5. Start Using GPT-OSS

You can now use GPT-OSS from the Agent!
If everything is set up correctly, you should see active connection messages in LM Studio’s server logs.

Configuration Summary

  • Provider Type: OpenAI Compatible
  • Base URL: http://127.0.0.1:1234/v1
  • Model ID: openai/gpt-oss-20b
  • Context Length: 32768 (in both LM Studio and BrowserOS)

Troubleshooting

  • Verify LM Studio is running and the model is loaded
  • Check the server logs in LM Studio for any errors
  • Ensure the Base URL is correct (http://127.0.0.1:1234/v1)
  • Make sure the context length in BrowserOS matches LM Studio
  • Reduce context length if you’re running out of memory
  • Consider using a smaller model if hardware is limited
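
If the Agent still cannot reach the model, a small diagnostic along the lines of the sketch below can help narrow down whether the problem is the server, the model, or the configuration. It assumes the requests package is installed (pip install requests) and uses the values configured above:

```python
import requests

BASE_URL = "http://127.0.0.1:1234/v1"  # must match the Base URL in BrowserOS
MODEL_ID = "openai/gpt-oss-20b"        # must match the Model ID in BrowserOS

try:
    resp = requests.get(f"{BASE_URL}/models", timeout=5)
    resp.raise_for_status()
except requests.RequestException as err:
    # A connection error here usually means LM Studio's local server is not running.
    print(f"Could not reach {BASE_URL}: {err}")
else:
    available = [m["id"] for m in resp.json().get("data", [])]
    if MODEL_ID in available:
        print(f"Server is reachable and {MODEL_ID} is listed.")
    else:
        print(f"Server is reachable, but {MODEL_ID} is not listed. Available models: {available}")
```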