I tried this a few months back, but the LLM sidebar doesn't offer any extra integration as far as I could see.
No variables and no ability to include the page you're looking at as context by doing something like "@selection" or "@currentpage" or similar.
I don't understand what the point of having it in your sidebar is when most people already have shortcuts or workflows already setup for their favorite LLM chat applicacations.
Despite many browsers planning to ship their own local models, nobody seems to be promoting a practical workflow for in-browser LLM side-usage. It's a good time to explore designs like this.
Those who care about local models already have the knowledge and ability to run them with things like LM Studio or llama.cpp.
TL;DR
Open `about:config`
Set `browser.ml.chat.hideLocalhost` to `false`
The chatbot sidebar will then show a "localhost" option when Open WebUI is running there.
Set `browser.ml.chat.provider` to your instance's URL.
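The same pref changes can be made persistent with a `user.js` file in your Firefox profile directory, which Firefox applies on every startup. A minimal sketch — the `http://localhost:3000` URL is an assumption (Open WebUI's common default port); point it at wherever your instance actually listens:

```javascript
// user.js — placed in the Firefox profile directory; applied at startup.

// Allow localhost providers to appear in the AI chatbot sidebar settings.
user_pref("browser.ml.chat.hideLocalhost", false);

// Point the sidebar at a local Open WebUI instance.
// Assumption: Open WebUI on its default port 3000 — adjust as needed.
user_pref("browser.ml.chat.provider", "http://localhost:3000");
```

Prefs set this way are reapplied on every launch, so they survive accidental resets in `about:config`.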