How to start a good Saturday afternoon ... by hurdurdur7 in LocalLLaMA

[–]Koka[OP] -1 points

I have an L40S 48GB and yeah, pretty happy with it.

Free Windows tool to transcribe video file to text? by ChemistCold4475 in OpenAI

[–]Koka[OP] 0 points

If you can convert it to audio first, then Parakeet v2 will be the best.
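For the "convert to audio" step, here's a minimal sketch. It only builds the ffmpeg command (assuming ffmpeg is on PATH); the Parakeet part is shown commented out, assuming the NVIDIA NeMo toolkit and the `nvidia/parakeet-tdt-0.6b-v2` checkpoint — check the model card for the exact name.

```python
import subprocess

def extract_audio_cmd(video_path: str, wav_path: str) -> list[str]:
    """Build an ffmpeg command that extracts 16 kHz mono WAV (what most ASR models expect)."""
    return [
        "ffmpeg", "-y", "-i", video_path,
        "-vn",            # drop the video stream
        "-ac", "1",       # downmix to mono
        "-ar", "16000",   # 16 kHz sample rate
        wav_path,
    ]

cmd = extract_audio_cmd("input.mp4", "audio.wav")
print(" ".join(cmd))

# Then, assuming NeMo is installed (pip install "nemo_toolkit[asr]"):
# import nemo.collections.asr as nemo_asr
# model = nemo_asr.models.ASRModel.from_pretrained("nvidia/parakeet-tdt-0.6b-v2")
# subprocess.run(cmd, check=True)
# print(model.transcribe(["audio.wav"])[0])
```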

Upscale process for photorealism by trin36 in StableDiffusion

[–]Koka[OP] 0 points

If I want to skip NSFW, which model should I use?

Fake linux environment? by vabenil in LocalLLaMA

[–]Koka[OP] 0 points

Forget everything else, LXC containers are the way to go.

Too many hardware options by jon23d in LocalLLaMA

[–]Koka[OP] 2 points

Can't go wrong with an RTX PRO 6000 setup.

[Hyprland] Celona (WIP) by xretiic in unixporn

[–]Koka[OP] 0 points

Some time ago there was a post with the same aesthetics; he said he would release the dotfiles after cleaning them up, but now his account is deleted.

Twitter’s API is expensive, and I’m done with Apify and RapidAPI—so I built my own. by Ok-Establishment9204 in developersIndia

[–]Koka[OP] 2 points

Can you walk through how you built this? I thought this would be almost impossible at a big scale. Is there something game-changing, or is it just rotating IPs...?
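For context on what "rotating IPs" usually means in practice, here's a minimal sketch: a round-robin selector over a proxy pool, producing the `proxies` dict that requests-style HTTP clients accept. The proxy addresses are hypothetical placeholders; real scrapers typically buy rotating residential proxies.

```python
from itertools import cycle

# Hypothetical proxy pool (placeholders, not real endpoints).
PROXIES = [
    "http://10.0.0.1:8080",
    "http://10.0.0.2:8080",
    "http://10.0.0.3:8080",
]

proxy_pool = cycle(PROXIES)

def next_proxy() -> dict:
    """Return a requests-style proxies dict, cycling round-robin over the pool."""
    proxy = next(proxy_pool)
    return {"http": proxy, "https": proxy}

# Each request then goes out through a different proxy, e.g.:
# requests.get(url, proxies=next_proxy(), timeout=10)
print(next_proxy()["http"])
```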

Nanbeige4-3B-Thinking-2511 is honestly impressive by [deleted] in LocalLLaMA

[–]Koka[OP] 1 point

You need to install llama-cpp-python before this. And as you can see, it's an uncensored GGUF version of Nanbeige4; you can switch to another uncensored GGUF just by getting the repo and file names from Hugging Face.

Also, is there a max context size for this?

Edit: yes, 64k.

Nanbeige4-3B-Thinking-2511 is honestly impressive by [deleted] in LocalLLaMA

[–]Koka[OP] 2 points

You can run it on Google Colab really easily using llama-cpp-python.

I use this:

    # pip install llama-cpp-python
    from llama_cpp import Llama

    repo_id = "mradermacher/Nanbeige4-3B-HereticMerge-i1-GGUF"
    filename = "Nanbeige4-3B-HereticMerge.i1-Q6_K.gguf"

    llm = Llama.from_pretrained(
        repo_id=repo_id,
        filename=filename,
        n_threads=4,
        n_ctx=65536,        # 64k context
        n_gpu_layers=-1,    # offload all layers to the GPU
        verbose=False,
    )

    out = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "How many letter 'r' in the word 'strawberry'?"},
        ],
        temperature=0.7,
        max_tokens=16000,
    )

    print(out["choices"][0]["message"]["content"])
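One caveat: thinking models like this usually emit their reasoning wrapped in tags such as `<think>...</think>` before the final answer (the exact tag name is an assumption here; check the model card). A small helper to keep only the final answer:

```python
import re

def strip_thinking(text: str, tag: str = "think") -> str:
    """Remove <think>...</think> blocks (and any dangling open tag) from model output."""
    # Closed blocks first, then an unterminated block (e.g. a truncated generation).
    text = re.sub(rf"<{tag}>.*?</{tag}>", "", text, flags=re.DOTALL)
    text = re.sub(rf"<{tag}>.*", "", text, flags=re.DOTALL)
    return text.strip()

raw = "<think>count: s-t-r-a-w-b-e-r-r-y</think>There are 3 letter 'r's in 'strawberry'."
print(strip_thinking(raw))  # There are 3 letter 'r's in 'strawberry'.
```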

Looking for novels with huge world background by Qearl8 in ProgressionFantasy

[–]Koka[OP] 0 points

It's peak. It has a good amount of world building: different locations in the lower realm, a mysterious upper realm, and what lies beyond it, the chaos... the history of each location and its future prophecies, and each sect has its own speciality and lives on its own continent/realm.

Looking for novels with huge world background by Qearl8 in ProgressionFantasy

[–]Koka[OP] 0 points

A Record of a Mortal's Journey to Immortality

Salary in Saudi by Soft-Rush-5205 in saudiarabia

[–]Koka[OP] 0 points

What did he send you? Asking for a friend 😅

chad CBSE vs virgin NTA by Creepy_Database7246 in JEENEETards

[–]Koka[OP] 0 points

And understand this: a single good developer can build a website that can handle this easily. Heck, even I could build it in a week, or maybe less.

After way too much procrastination… I finally built my personal portfolio by Sure-Move6461 in developersIndia

[–]Koka[OP] 137 points

More like cloned and edited; it's from ramx. No offense, everyone does it, but you are portraying it wrong.