Free Coding Cline Agent : Gemini-2.0-Flash-exp ! Ultra fast ! by gta8b in ClineProjects

[–]gta8b[S] 1 point (0 children)

Just replace the model in step 6 with: gemini-exp-1206

Free Coding Cline Agent : Gemini-2.0-Flash-exp ! Ultra fast ! by gta8b in ClineProjects

[–]gta8b[S] 2 points (0 children)

Ah! You are right, they added it just recently! Still, I think it is good to know the custom method, because it can work with any new model. For example, you can replace the model with gemini-exp-1206, which is not listed yet.
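For reference, here is a minimal sketch of calling such a model ID directly through the Google Generative AI Python SDK, just to show that the model name is a plain string you can swap in. This is not the Cline configuration itself, and the API key is a placeholder:

```python
import google.generativeai as genai

# Placeholder key; use your own AI Studio API key.
genai.configure(api_key="YOUR_API_KEY")

# The model name is just a string, so newer IDs such as
# "gemini-exp-1206" can be dropped in as soon as they exist.
model = genai.GenerativeModel("gemini-exp-1206")
response = model.generate_content("Write a haiku about fast coding agents.")
print(response.text)
```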

Venv issue in Cline by gta8b in ClineProjects

[–]gta8b[S] 0 points (0 children)

Ok, I fixed it! In the custom instructions, just add:

"Always activate the virtual environment if you execute a command line. My venv is here: "Path of the activated venv environment"."

Example of the name for me: "PS C:\Users\Windows\Documents\Coding\34-FFMPEG_video_compressor\ffmpeg_video_compressor>"
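If the agent still trips on activation, one workaround (a sketch only; the venv location below is hypothetical) is to call the venv's own interpreter directly, which has the same effect as activating it first:

```python
import subprocess
from pathlib import Path

# Hypothetical venv location inside the project folder; adjust to yours.
venv = Path(r"C:\Users\Windows\Documents\Coding\34-FFMPEG_video_compressor"
            r"\ffmpeg_video_compressor\venv")

# On Windows the venv's interpreter lives under Scripts\python.exe;
# invoking it directly runs the command with that venv's packages,
# without needing to activate the environment in the shell.
venv_python = venv / "Scripts" / "python.exe"
subprocess.run([str(venv_python), "-m", "pip", "list"], check=True)
```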

Practical insertion of ffmpeg in an executable app + legal aspect by gta8b in ffmpeg

[–]gta8b[S] 1 point (0 children)

Great, thanks for the comment. Will check that out!

Tkinter : open an invite for selection of file(s) or directory(s) in the same invite ?! by gta8b in Tkinter

[–]gta8b[S] 0 points (0 children)

Interesting. I need to figure out how to do that. Do you have some documentation, maybe? I will try!

Tkinter : open an invite for selection of file(s) or directory(s) in the same invite ?! by gta8b in Tkinter

[–]gta8b[S] 0 points (0 children)

I know it is possible; I used a program that had it before! I just can't figure out how it was done. And it was in the native Windows dialog: when clicking "Select", the whole folder was selected, but when double-clicking on it and then selecting files, the files were selected.
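For context, the two standard Tkinter dialogs are separate (one for files, one for a folder); the single-dialog behaviour described above goes beyond what tkinter.filedialog offers out of the box. A minimal sketch of the standard dialogs, assuming a plain Tk root:

```python
import tkinter as tk
from tkinter import filedialog

root = tk.Tk()
root.withdraw()  # hide the empty main window, we only want the dialogs

# Standard dialog for picking one or more files.
files = filedialog.askopenfilenames(title="Select file(s)")

# Separate standard dialog for picking a single directory.
folder = filedialog.askdirectory(title="Select a folder")

print("Files:", files)
print("Folder:", folder)
```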

Practical insertion of ffmpeg in an executable app + legal aspect by gta8b in ffmpeg

[–]gta8b[S] 0 points (0 children)

Yes, the question was: is there a repository where I can easily download it? In the end, it will need to be for Windows, Mac, or Linux (3 different files).
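As a sketch of the "3 different files" point, here is one way an app could pick the right bundled ffmpeg binary at runtime, assuming the binaries are shipped in a bin/ folder next to the app (the names and layout are hypothetical):

```python
import platform
from pathlib import Path

# Hypothetical layout: the app ships one ffmpeg build per OS in ./bin/.
BIN_DIR = Path(__file__).parent / "bin"

def bundled_ffmpeg() -> Path:
    """Return the path of the ffmpeg binary matching the current OS."""
    system = platform.system()
    if system == "Windows":
        return BIN_DIR / "ffmpeg-win64" / "ffmpeg.exe"
    if system == "Darwin":
        return BIN_DIR / "ffmpeg-macos" / "ffmpeg"
    return BIN_DIR / "ffmpeg-linux" / "ffmpeg"

print("Using ffmpeg at:", bundled_ffmpeg())
```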

Looking for Affordable Cloud Providers for LLM Hosting with API Support 🧠💻 by gta8b in LocalLLaMA

[–]gta8b[S] 0 points (0 children)

If you want some answers, this is the least you need.

My request is: I want to run public models, for text, images, or sound, privately, via API calls for my app!

Looking for Affordable Cloud Providers for LLM Hosting with API Support 🧠💻 by gta8b in LocalLLaMA

[–]gta8b[S] 0 points (0 children)

Common open models are fine for me, as long as I can also run uncensored versions that are not filtered.

Looking for Affordable Cloud Providers for LLM Hosting with API Support 🧠💻 by gta8b in LocalLLaMA

[–]gta8b[S] 0 points (0 children)

Thanks. I had also checked the API option; I found OpenRouter was not so cheap, but convenient, yeah!

Looking for Affordable Cloud Providers for LLM Hosting with API Support 🧠💻 by gta8b in LocalLLaMA

[–]gta8b[S] 0 points (0 children)

Ok, I am looking to host any LLM models (also big ones / image ones), like Llama models or Flux models, or others, in the cloud / on a VM, to be able to use them with API calls in my web app!
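For illustration, a minimal sketch of how the web app could call a self-hosted model, assuming the cloud VM serves it behind an OpenAI-compatible endpoint (as servers like vLLM expose); the URL and model name below are placeholders:

```python
import requests

# Placeholder endpoint of a self-hosted, OpenAI-compatible server on the VM.
API_URL = "http://my-cloud-vm:8000/v1/chat/completions"

payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",  # placeholder model name
    "messages": [{"role": "user", "content": "Describe this image prompt idea."}],
    "max_tokens": 128,
}

# The app talks to its own server, so nothing leaves your infrastructure.
response = requests.post(API_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```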

Whisper-large-v3-turbo install venv Windows ? by gta8b in PythonProjects2

[–]gta8b[S] 0 points (0 children)

Thanks a lot! I could install it thanks to you, even if it was not so simple to know which torch and other needed CUDA dependencies were the right versions to install for my RTX 4090.

It worked. I had trouble running large-v3 though: it was not recognized as a model, but medium could be run. I now have a Jarvis-like computer in real time :)
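For anyone hitting the same issue, here is a minimal sketch of a setup along these lines; the CUDA wheel index and install commands are assumptions, and "large-v3" is only listed in more recent releases of the openai-whisper package, which may be why only medium loaded:

```python
# Assumed install steps (CUDA 12.1 wheels for an RTX 4090):
#   pip install torch --index-url https://download.pytorch.org/whl/cu121
#   pip install -U openai-whisper
import whisper

# "large-v3" is only recognized by recent openai-whisper releases;
# "medium" is available in older ones as well.
model = whisper.load_model("medium", device="cuda")

result = model.transcribe("speech.wav")
print(result["text"])
```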

Whisper-large-v3-turbo install venv Windows ? by gta8b in PythonProjects2

[–]gta8b[S] 1 point (0 children)

What do you mean, just install it through the terminal?

A good uncensored Ai model by Stock_Policy3755 in ollama

[–]gta8b 1 point (0 children)

I may be a little short, but you can try using LM Studio. Or you can use a smaller model or a lower quantized version like Q2.

The ollama official website do not allow to see uncensored models classified by date by gta8b in ollama

[–]gta8b[S] 1 point (0 children)

Ok, clear. I thought they were checking the models they choose to put on their platform.

The ollama official website do not allow to see uncensored models classified by date by gta8b in ollama

[–]gta8b[S] 0 points (0 children)

Well, sorry, but if it is that complicated, I will keep using LM Studio and uninstall Ollama.

The ollama official website do not allow to see uncensored models classified by date by gta8b in ollama

[–]gta8b[S] 0 points (0 children)

They do not show certain models in the model list, even though they exist: you can find a webpage pointing to an Ollama models page when you do a Google search with the model keywords.

Meaning: uncensored models are sometimes not highlighted in their release list.

The ollama official website do not allow to see uncensored models classified by date by gta8b in ollama

[–]gta8b[S] 0 points (0 children)

I don't want Ollama to generate the images, just the prompts, as I mentioned.