
Talking to a local LLM in the Firefox sidebar

I tried this a few months back, but the LLM sidebar doesn't offer any extra integration that I could see. No variables, and no way to include the page you're looking at as context with something like "@selection" or "@currentpage". I don't understand the point of having it in your sidebar when most people already have shortcuts or workflows set up for their favorite LLM chat applications.

2 days ago · KetoManx64

Even though many browsers plan to include their own local model, I feel nobody is really promoting a practical workflow for in-browser LLM side-usage. It's a good time to explore designs like this.

3 days ago · PaulShomo

Those who care about local models already have the knowledge and ability to run them with tools like LM Studio or llama.cpp.

2 days ago · KetoManx64

TL;DR:

1. Open `about:config`.
2. Set `browser.ml.chat.hideLocalhost` to `false`.
3. The chatbot sidebar now offers "localhost" as a provider when Open WebUI is running there.
4. Set `browser.ml.chat.provider` to the correct URL.
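
A quick sanity check before step 4 is to confirm that something is actually listening on the localhost address you plan to point `browser.ml.chat.provider` at. Here's a minimal sketch in Python; the URL (`http://localhost:3000`) is an assumption, since Open WebUI's port depends on how you installed it, so substitute your own.

```python
# Minimal sketch: verify that a local Open WebUI instance is reachable before
# pointing browser.ml.chat.provider at it. The URL below is an assumption;
# change it to match wherever your Open WebUI actually listens.
import urllib.error
import urllib.request

OPEN_WEBUI_URL = "http://localhost:3000"  # assumed address, adjust as needed


def is_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if the URL answers an HTTP request with a non-error status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    if is_reachable(OPEN_WEBUI_URL):
        print(f"{OPEN_WEBUI_URL} is up; point browser.ml.chat.provider here.")
    else:
        print(f"Nothing answered at {OPEN_WEBUI_URL}; start Open WebUI or fix the URL first.")
```

If the check fails, fix the Open WebUI side first; the sidebar will just show a blank or error page if the provider URL points at nothing.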