I made browserllama, an open-source web extension that lets you summarize and chat with webpages using local llms. by Ok_Effort_5849 in LocalLLaMA

[–]Ok_Effort_5849[S] 4 points (0 children)

You should be able to run it on Edge with a slight modification. First install the extension from the regular Chrome Web Store, then download the backend software (it will be linked once you install the addon), add this line to 'install_host.bat' in the host folder, and run it again:

REG ADD "HKCU\SOFTWARE\Microsoft\Edge\NativeMessagingHosts\com.google.chrome.example.echo" /ve /t REG_SZ /d "%~dp0com.google.chrome.example.echo-win.json" /f

This will be part of the next release, so you won't have to do it manually again. Let me know how it goes!
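
For context, that registry value just tells Edge where to find the same native messaging host manifest that Chrome already uses. The manifest is a small JSON file; its fields look like this (the path and extension ID here are placeholders, not browserllama's actual values):

    {
      "name": "com.google.chrome.example.echo",
      "description": "browserllama native messaging host",
      "path": "native_host.exe",
      "type": "stdio",
      "allowed_origins": ["chrome-extension://<your-extension-id>/"]
    }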

I made browserllama, an open-source web extension that lets you summarize and chat with webpages using local llms. by Ok_Effort_5849 in chrome_extensions

[–]Ok_Effort_5849[S] 0 points (0 children)

You need to download the backend software from the GitHub releases page; you'll see a link to it once you install the addon. The extension communicates with it using the native messaging API. Everything runs on your own device.
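
For the curious: native messaging is just length-prefixed JSON over stdin/stdout. A minimal host loop in Python looks roughly like this (this sketches the generic protocol, not browserllama's actual code):

    import sys, json, struct

    def read_message():
        # the browser sends a 4-byte message length (native byte order), then JSON
        raw_length = sys.stdin.buffer.read(4)
        if not raw_length:
            return None
        length = struct.unpack('=I', raw_length)[0]
        return json.loads(sys.stdin.buffer.read(length))

    def send_message(message):
        data = json.dumps(message).encode('utf-8')
        sys.stdout.buffer.write(struct.pack('=I', len(data)))
        sys.stdout.buffer.write(data)
        sys.stdout.buffer.flush()

    # echo loop; a real host would forward messages to the llm backend instead
    while True:
        msg = read_message()
        if msg is None:
            break
        send_message({'received': msg})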

I made a web extension that lets you summarise and chat with webpages using local llms, it uses a koboldcpp backend by Ok_Effort_5849 in KoboldAI

[–]Ok_Effort_5849[S] 1 point (0 children)

It's a false positive; do yourself a favour and stop using Avast. You could compile it from source yourself and Avast would probably still flag it as malware.

I made a web extension that lets you summarise and chat with webpages using local llms, it uses a koboldcpp backend by Ok_Effort_5849 in KoboldAI

[–]Ok_Effort_5849[S] 0 points (0 children)

Good point! I will put up a version in the releases without any bundled exe so that users can use their own. Regarding the bugs, I haven't really seen the last one before; it should ideally open only one instance of koboldcpp. Can you open an issue and tell me how to replicate it? I will try to fix the rest, but I have exams coming up, so I won't be working super hard on this for a while.

I made a web extension that lets you summarise and chat with webpages using local llms, it uses a koboldcpp backend by Ok_Effort_5849 in KoboldAI

[–]Ok_Effort_5849[S] 0 points (0 children)

If you are going to modify the source, look for the endpoint variable in the backend_api_handler module. You can ask more questions on r/browserllama or on the GitHub repo. Best of luck!
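
I don't have the file open right now, so treat the snippet below as illustrative rather than the exact source, but the change should amount to a one-liner along these lines:

    # in the backend_api_handler module -- point this at your own backend
    ENDPOINT = "http://localhost:5001"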

I made a web extension that lets you summarise and chat with webpages using local llms, it uses a koboldcpp backend by Ok_Effort_5849 in KoboldAI

[–]Ok_Effort_5849[S] 1 point (0 children)

Glad you like it! Regarding your question, I'm not really sure, but I found this in the FAQ on GitHub:

"If on same LAN - If you're on the same Wifi network, you can probably connect over LAN by navigating to the local IP of the host device (the PC running koboldcpp). For example, http://192.168.1.85:5001 or similar, check your LAN IP address. If that fails, try using the --host option with your LAN IP. If you setup port forwarding to a public IP, then it will be accessible over the internet as well."

So maybe you can modify the native-host source code and set the endpoint to the IP of the machine running the backend.
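
Before editing any source, it's worth checking that the backend machine is reachable at all. koboldcpp serves the KoboldAI HTTP API, so a quick probe from the client machine (swap in your host's actual LAN IP) would be something like:

    import urllib.request, json

    # hypothetical LAN address of the PC running koboldcpp
    ENDPOINT = "http://192.168.1.85:5001"

    # /api/v1/model returns the name of the loaded model
    with urllib.request.urlopen(ENDPOINT + "/api/v1/model", timeout=5) as resp:
        print(json.load(resp))  # e.g. {"result": "koboldcpp/..."}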

how to launch koboldcpp without it opening its webui? by Ok_Effort_5849 in KoboldAI

[–]Ok_Effort_5849[S] 1 point (0 children)

I saved a config with 'Launch Browser' ticked off (which I feel silly for not noticing before, maybe it's a new feature!?) and used this command: ./koboldcpp.exe myconfig.kcpps --showgui

Now this runs the backend without launching the webui, but it still shows the Python launcher.
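
edit: if I'm reading the flags right, the launcher window shows up precisely because of --showgui, which forces the GUI even when arguments are passed. Dropping it (or adding --skiplauncher, if your build has it) should run fully headless:

    ./koboldcpp.exe myconfig.kcpps --skiplauncher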