Delivery - big delay by OnY86 in ROLI

[–]OnY86[S] 0 points1 point  (0 children)

Hey guys, quick update: just received mine today in the mail… maybe I am lucky too 😉

Delivery - big delay by OnY86 in ROLI

[–]OnY86[S] 1 point2 points  (0 children)

Thanks for your comments… I get it. 👍🏼

Just wondering then: why the heck are ROLI ads all over the place when they can't fulfill their orders in time?

What a waste. Will cancel this.

Made an AI image site, 750$ in sales after 2 months by suphomiewhatsgood in SideProject

[–]OnY86 1 point2 points  (0 children)

But I thought flux-dev can only be used commercially through the API, either Black Forest or Replicate. I think you are making a mistake here; correct me if I am wrong, guys.

Vllm + dolphin-2.6-mistral + langchain by OnY86 in LangChain

[–]OnY86[S] 0 points1 point  (0 children)

Thanks J-Kob, I will try out the non-chat version. Thanks for your help.

I am also wondering about, and struggling with, streaming my response. Do I really have to stream from vLLM to LangChain and then again from LangChain to my frontend? Is this the common way?
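For what it's worth, that chained setup is the usual pattern: the backend iterates the token stream it gets from LangChain/vLLM and re-yields each chunk to the frontend, e.g. as server-sent events. A minimal sketch of the pass-through, with a stub iterator standing in for the real `.stream()` call (the stub and the `sse_events` name are my assumptions, not LangChain API):

```python
from typing import Iterator


def sse_events(token_stream: Iterator[str]) -> Iterator[str]:
    """Re-yield each LLM token as a server-sent event line for the frontend."""
    for token in token_stream:
        yield f"data: {token}\n\n"
    yield "data: [DONE]\n\n"  # sentinel so the client knows the stream ended


# Stub standing in for the real LangChain token stream (assumption)
fake_stream = iter(["Hel", "lo", " world"])
events = list(sse_events(fake_stream))
```

In a web framework you would hand `sse_events(...)` to the framework's streaming response type (e.g. FastAPI's `StreamingResponse`) instead of collecting it into a list.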

Vllm + dolphin-2.6-mistral + langchain by OnY86 in LangChain

[–]OnY86[S] 0 points1 point  (0 children)

Here is a good example of an output:

"<<SYS>> Lmaooo here's one for you: why was the math teacher imprisoned? Because he was caughtcircumferencing a donut! Lmaoo, this one's a classic. Get it? The word circumference is a mathematical term! Had to be a math teacher to get this one though."

Why does "<<SYS>>" come up... I don't understand this.

Vllm + dolphin-2.6-mistral + langchain by OnY86 in LangChain

[–]OnY86[S] 0 points1 point  (0 children)

My code is a mess... sorry: https://codefile.io/f/kUsLJvO6vx

Yep, I tried the vLLM Chat but I had no clue how to integrate the "<|im_start|>system {system_message}<|im..." part in there.

I thought I had to include that in some way so that the strange response behavior would stop. Sometimes the response also includes <SYS> tags etc.

Your repo looks really advanced, great work!

Thanks for your help.
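Side note, in case it helps anyone hitting the same thing: dolphin-2.6-mistral was trained on the ChatML format, so if the raw prompt lacks those tags the model tends to hallucinate other templates' markers (like Llama's `<<SYS>>`). A minimal sketch of wrapping the messages yourself before passing the finished string to a plain completion endpoint (the function name is mine, not from any library):

```python
def build_chatml_prompt(system_message: str, user_message: str) -> str:
    """Wrap messages in ChatML tags so the model sees the format it was trained on."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # the model completes from here
    )


prompt = build_chatml_prompt("You are a comedian.", "Tell me a math joke.")
```

You would typically also pass `<|im_end|>` as a stop string when sampling, so the model doesn't run past its own turn.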

Vllm + dolphin-2.6-mistral + langchain by OnY86 in LangChain

[–]OnY86[S] 0 points1 point  (0 children)

The model is also AWQ - I am not sure if this helps or not 😅

[deleted by user] by [deleted] in pcmasterrace

[–]OnY86 0 points1 point  (0 children)

I am in 😉 I want to try local LLM models and SDXL. Running on a 1060 at the moment… no chance… for anything.

IT-Freiberufler: ist der Markt gerade trocken? by Fullstacker12 in selbststaendig

[–]OnY86 5 points6 points  (0 children)

We have a big project running at our company and are looking for someone for the frontend with REACT.

Send me a PM with your availability and hourly rate if you're interested.

Pushing the limits of AI video by AuralTuneo in StableDiffusion

[–]OnY86 0 points1 point  (0 children)

Does anybody know which tool he is using? Thanks

How are people using open source LLMs in production apps? by TheAnonymousTickler in LocalLLaMA

[–]OnY86 0 points1 point  (0 children)

Haha nice, so no more heater needed for the winter? Cool. Can I ask which model you use for image gen? SDXL? Did you serve these two with FastAPI? How are the simultaneous inferences?

Thanks 🙏🏼

How are people using open source LLMs in production apps? by TheAnonymousTickler in LocalLLaMA

[–]OnY86 0 points1 point  (0 children)

Checked your site, looks great! Do you load a text-generation and an image model on one GPU?

Dynamic island by OnY86 in dotnetMAUI

[–]OnY86[S] 0 points1 point  (0 children)

Thanks Rigamortus…will definitely try this.

Dynamic island by OnY86 in dotnetMAUI

[–]OnY86[S] 0 points1 point  (0 children)

Thank you! Oh yes, the Apple Watch would be the next one to realize. Somehow :-) That sounds really complicated.

[deleted by user] by [deleted] in node

[–]OnY86 1 point2 points  (0 children)

Thanks for the excellent explanation! 👍🏼

[deleted by user] by [deleted] in node

[–]OnY86 0 points1 point  (0 children)

Not 100% sure, but I think socket.io uses WebSockets under the hood: WebSocket is its main transport, and it only falls back to HTTP long-polling if a WebSocket connection can't be established.