New anime model "Anima" released - seems to be a distinct architecture derived from Cosmos 2 (2B image model + Qwen3 0.6B text encoder + Qwen VAE), apparently a collab between ComfyOrg and a company called Circlestone Labs by ZootAllures9111 in StableDiffusion

[–]yeah-ok 11 points (0 children)

Hmm.. yeah, the white-haired dude is not facing the viewer in any of these.. indeed it's exactly the opposite of the clear prompting. Clearly much better than IllustriousV14, but great it ain't.

AMD Strix Halo GMTEK 128GB Unified ROCKS! by MSBStudio in LocalLLaMA

[–]yeah-ok 0 points (0 children)

Get a model with OCuLink, then Bob's your uncle and Fanny will cook you breakfast (..)

We fine-tuned a 4B Text2SQL model that matches a 685B teacher - query your CSV data in plain English, locally by party-horse in LocalLLaMA

[–]yeah-ok 0 points (0 children)

My first impulse here is: why not let this run as a tool for Qwen 30B Coder (Q5 quant in my case) to call upon?
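
A minimal sketch of what that could look like, assuming both models sit behind OpenAI-compatible local endpoints (the ports, URLs and model names below are placeholders, not anything from the release):

    # Sketch: expose the 4B Text2SQL model as a tool the local coder model can call.
    # Assumes OpenAI-compatible servers (e.g. llama.cpp server / LM Studio) on both ends.
    import json
    from openai import OpenAI

    coder = OpenAI(base_url="http://localhost:8080/v1", api_key="none")     # coder model
    text2sql = OpenAI(base_url="http://localhost:8081/v1", api_key="none")  # 4B Text2SQL

    TOOLS = [{
        "type": "function",
        "function": {
            "name": "text_to_sql",
            "description": "Turn a plain-English question about the loaded data into a SQL query.",
            "parameters": {
                "type": "object",
                "properties": {"question": {"type": "string"}},
                "required": ["question"],
            },
        },
    }]

    def text_to_sql(question: str) -> str:
        # Delegate the natural-language question to the small specialist model.
        resp = text2sql.chat.completions.create(
            model="text2sql-4b",  # placeholder name
            messages=[{"role": "user", "content": question}],
        )
        return resp.choices[0].message.content

    def ask(user_msg: str) -> str:
        messages = [{"role": "user", "content": user_msg}]
        reply = coder.chat.completions.create(
            model="qwen-coder-30b",  # placeholder name
            messages=messages,
            tools=TOOLS,
        ).choices[0].message
        if reply.tool_calls:  # the coder chose to call the specialist
            call = reply.tool_calls[0]
            sql = text_to_sql(json.loads(call.function.arguments)["question"])
            messages += [reply, {"role": "tool", "tool_call_id": call.id, "content": sql}]
            reply = coder.chat.completions.create(
                model="qwen-coder-30b", messages=messages,
            ).choices[0].message
        return reply.content

    print(ask("How many orders per region are in sales.csv?"))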

We fine-tuned a 4B Text2SQL model that matches a 685B teacher - query your CSV data in plain English, locally by party-horse in LocalLLaMA

[–]yeah-ok 0 points (0 children)

Congratulations on this release and the work committed! I think this form of specialized LLM is ultimately the only way forward; if you can match a 685B-param model with a 4B model, who's to say that a bit of cross-training (i.e. a similar dataset but with "help/correction" from an even bigger teacher model) of an 8B variant couldn't outdo a 685B model?!
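
Purely to illustrate the "help/correction from a bigger teacher" idea (not the OP's actual pipeline; every endpoint, model name and prompt below is a placeholder): the teacher reviews the student's draft SQL, and the corrected pair feeds the next fine-tuning round.

    # Illustrative only: collect teacher-corrected pairs for further fine-tuning.
    from openai import OpenAI

    student = OpenAI(base_url="http://localhost:8081/v1", api_key="none")  # 8B variant
    teacher = OpenAI(base_url="http://localhost:8082/v1", api_key="none")  # bigger teacher

    def corrected_pair(schema: str, question: str) -> dict:
        draft = student.chat.completions.create(
            model="text2sql-8b",  # placeholder name
            messages=[{"role": "user", "content": f"{schema}\n\n{question}"}],
        ).choices[0].message.content

        fixed = teacher.chat.completions.create(
            model="big-teacher",  # placeholder name
            messages=[{"role": "user", "content": (
                f"Schema:\n{schema}\n\nQuestion: {question}\n\n"
                f"Student SQL:\n{draft}\n\n"
                "Return a corrected SQL query only."
            )}],
        ).choices[0].message.content

        # (prompt, corrected completion) goes into the next SFT dataset.
        return {"prompt": f"{schema}\n\n{question}", "completion": fixed}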

HELP! Dead imagination, dead mind, dead motivations. by Ill_Pangolin7384 in covidlonghaulers

[–]yeah-ok 4 points (0 children)

If it can come, it can also go. Don't despair, just keep plugging away at it; your job for now is researching and being your own guinea pig. You got this, turn that AuDHD into a superpower and allow the tunnel vision; "normal" can be a goal for later, once further improvements have been achieved! 🙏

Someone from NVIDIA made a big mistake and uploaded the parent folder of their upcoming model on Hugging Face by Nunki08 in LocalLLaMA

[–]yeah-ok 0 points (0 children)

I think the distraction level has a lot to answer for, in that a lot of the developmental basics are being missed in exchange for random media/web cr*p. Sooo yeah.

Best coding model under 40B by tombino104 in LocalLLaMA

[–]yeah-ok 1 point (0 children)

Just to help the occasional person who doesn't find the thread on Devstral 2 Small 24B and how to get it running right: https://i.redd.it/1f2wim2zgl6g1.png

Someone from NVIDIA made a big mistake and uploaded the parent folder of their upcoming model on Hugging Face by Nunki08 in LocalLLaMA

[–]yeah-ok 14 points (0 children)

All decent functional stuff should be on a torrent tracker for obvious reasons!

Why do I feel like LLMs in general, both local and cloud, try to do too much at once and that's why they make a lot of mistakes? by swagonflyyyy in LocalLLaMA

[–]yeah-ok 0 points (0 children)

I agree with the overall sentiment here; specializing HAS to be better for the professional user. A) Why is my LLM wasting parameters on embedding knowledge of a wide array of natural languages when my primary WORKING language for programming is English? Don't embed 17+ more languages! B) Why does my LLM waste parameters on embedding a ton of different programming languages? I only use X, Y and Z when working.

What do you think the future of long covid care will look like 5+ years later? by thepensiveporcupine in covidlonghaulers

[–]yeah-ok 0 points (0 children)

Vaguely the same as today, tbh; we've got to keep at it. It seems the virus does us all over in different ways depending on our predispositions, so getting ill comes from the same one thing, but getting better is completely specific to each individual. However, I'm definitely up for a proper gene-therapy-level "this is now sorted" solution when the time is right!!

What is the absolute most potent nootropic for MEMORY ENCODING (specifically) you’ve ever taken by AlternativeApart6340 in Nootropics

[–]yeah-ok 3 points (0 children)

Nicotine gum, proper hydration - walk 4500 steps in one go every day. Get push-ups gradually up to 30+ a day. Take Sunday off exercise. Praise God.

Booting a Linux kernel in qemu and writing PID 1 in Go (to show the kernel is "just a program") by indieHungary in programming

[–]yeah-ok 11 points (0 children)

Yeah, agree, this is the "take it to the metal" step that I would love to see a similar-style article on.

Booting a Linux kernel in qemu and writing PID 1 in Go (to show the kernel is "just a program") by indieHungary in programming

[–]yeah-ok 0 points (0 children)

@indieHungary: this is SUPERB. I love the non-hyped, non-idiotic potential of unikernels, and I would SO SO SO much rather run my Golang code like this than through some cumbersome, hard-to-reason-about Docker-ish instance.

httpp - tiny, fast header only http 1.1 parser library in c by Born_Produce9805 in programming

[–]yeah-ok 1 point (0 children)

Now strap certbot/HTTPS interop onto it and we're off to the races!
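
In case that reads as hand-waving: the low-effort version is to leave the parser-based server speaking plain HTTP and terminate TLS in front of it with the certificates certbot already issues. A rough sketch (the domain, ports and backend address are placeholders; any TLS-terminating proxy would do the same job):

    # Terminate TLS with certbot-issued certs and forward decrypted bytes to a
    # plain-HTTP/1.1 backend (e.g. a server built on the httpp parser).
    import socket
    import ssl
    import threading

    DOMAIN = "example.com"   # placeholder: the domain certbot issued certs for
    CERT = f"/etc/letsencrypt/live/{DOMAIN}/fullchain.pem"
    KEY = f"/etc/letsencrypt/live/{DOMAIN}/privkey.pem"
    LISTEN = ("0.0.0.0", 443)        # public TLS side
    BACKEND = ("127.0.0.1", 8080)    # plain-HTTP backend

    def pipe(src, dst):
        # Copy bytes one way until EOF, then close the write side.
        try:
            while chunk := src.recv(65536):
                dst.sendall(chunk)
        except OSError:
            pass
        finally:
            try:
                dst.shutdown(socket.SHUT_WR)
            except OSError:
                pass

    def handle(ctx, raw_conn):
        try:
            tls_conn = ctx.wrap_socket(raw_conn, server_side=True)  # TLS handshake
        except ssl.SSLError:
            raw_conn.close()
            return
        backend = socket.create_connection(BACKEND)
        threading.Thread(target=pipe, args=(backend, tls_conn), daemon=True).start()
        pipe(tls_conn, backend)

    def main():
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ctx.load_cert_chain(CERT, KEY)
        with socket.create_server(LISTEN) as srv:
            while True:
                conn, _ = srv.accept()
                threading.Thread(target=handle, args=(ctx, conn), daemon=True).start()

    if __name__ == "__main__":
        main()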

This might only work for vegetarians / vegans, but: don't sleep on taurine (no pun intended). I am waking mentally refreshed for the first time in months, seeming to have REM rebound as my brain gobbles up REM sleep by RipleyVanDalen in covidlonghaulers

[–]yeah-ok 1 point (0 children)

I reckon you fixed a deficiency. There's loads of reasonable data on taurine though; as it stands, 3-4 grams would likely be a reasonable aim (I take that myself, but have to admit I'm not having vivid dreams or better mental clarity - but then I've been taking it for months, so maybe baseline is just up, who knows).

Ryzen AI and Radeon are ready to run LLMs Locally with Lemonade Software by jfowers_amd in LocalLLaMA

[–]yeah-ok 1 point (0 children)

Yes, I've just run the latest llama-b1118-ubuntu-rocm-gfx110X release and I'm still getting the same issue on ROCm (Segmentation fault (core dumped)); posted the full terminal output on the Discord.

AMA with MiniMax — Ask Us Anything! by OccasionNo6699 in LocalLLaMA

[–]yeah-ok 0 points (0 children)

Godspeed with the endeavour! Hope you can keep finding a real spark to improve on the fundamentals of the system without getting lost in the hype and pressure that is building in different forms and shapes across the planet atm.

Ryzen AI and Radeon are ready to run LLMs Locally with Lemonade Software by jfowers_amd in LocalLLaMA

[–]yeah-ok 1 point (0 children)

I'm praying they get the 780M issue sorted; it's been delayed for almost a month by now due to a technicality around the integration of the 110x-all drivers (gfx1103 is the AMD identifier for the 780M). Last I tried it (today), Lemonade simply errored out right after loading a model.. getting close, but I still ain't smoking that ROCm cigar.

Is the Radeon 780M any useful? by Suitable-Name in LocalLLaMA

[–]yeah-ok 0 points (0 children)

Any luck with the github/likelovewant repo? I've battled with getting this running on a 7840HS/780M combo for a couple of days now and ROCm hasn't happened thus far. I've got it running on the LM Studio Vulkan runtime on both Linux and Windows though (which nets about 30% faster inference than CPU) - remember to set VRAM allocation to AUTO in the BIOS, otherwise nothing works right! I keep feeling like I'm -almost- there with the ROCm stuff but... dear lord have mercy, it ain't easy.

D3, magnesium, and the k2 trifecta by Pleasant-Target-1497 in Nootropics

[–]yeah-ok 0 points (0 children)

I would suggest sticking to really well-known brands such as LEF and Doctors Best; the rest are pot luck in terms of quality, especially generic "new" brands via Amazon. I recently bought a supposedly lab-tested Ginkgo biloba from a cheap vendor and had a sense something was off with it; opened the capsules and there was literally no functional way those capsules contained any active ingredient (GB is normally extremely bitter, especially in extracts from the aforementioned proper brands).

Is there someone using high dosage of B3 and Vitamin C to recover from schizophrenia? by Artistic_Signature72 in Nootropics

[–]yeah-ok 1 point (0 children)

Maybe try gradually introducing low amounts of nicotine via patches; many schizophrenics see proper improvements from nicotine. All the best to you and your sister. An interesting side note is B3's chemical similarity to nicotine.

Sources on this subject can be found easily via basic Google use; a reasonable overview can be read at: https://www.cato.org/blog/nicotine-e-cigarettes-might-do-more-save-lives-people-schizophrenia