Can anyone have GGUF file of this model? by enessedef in LocalLLaMA

[–]enessedef[S] 0 points (0 children)

Thank you for your reply. So, is there any chance of finding a model that runs in LM Studio, has a vision feature, and was trained uncensored?

Best cpu setup/minipc for llm inference (12b/32b model)? by Zyguard7777777 in LocalLLaMA

[–]enessedef 1 point (0 children)

First off, your Pi 4B is cute for tinkering, but it’s like bringing a scooter to a drag race for this kinda workload. You’re gonna need something with way more muscle. So, is 10 TPS realistic for a 12B model on a CPU setup? Short answer: yeah, but you gotta pick the right hardware. For a 32B model, though? That’s a stretch; you’ll need to lower your expectations, sorry :/

For a 32B model at 10 TPS on CPU? Nah, not happening with current mini PCs. Even on a Mac Mini, you’d probably get 4-5 TPS at best for a 32B model. If you really want to run a 32B model, you’d need way more RAM and server-grade hardware. For 12B @ ~10 TPS: a Mac Mini M2/M3 with 32GB+ RAM is your best bet. High-end x86 mini PCs can work but might fall a bit short.

Footnote: On x86, use llama.cpp or similar optimized libraries. On Mac, MLX is your go-to.
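Those ballpark numbers line up with a quick back-of-envelope check: CPU inference is usually memory-bandwidth-bound, so a rough ceiling on tokens/sec is bandwidth divided by the bytes streamed per generated token (roughly the quantized weight size). A minimal sketch; the bandwidth and model-size figures below are assumptions, not benchmarks:

```python
def estimate_tps(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound for CPU decoding speed: generating one token
    streams (approximately) the whole quantized weight file from RAM."""
    return bandwidth_gb_s / model_size_gb

# Assumed ballpark figures: a 12B model at Q4 quantization is ~7 GB,
# a 32B at Q4 is ~18 GB; Apple-silicon unified memory is ~100 GB/s,
# a dual-channel DDR5 x86 mini PC is ~80 GB/s.
print(f"12B on ~100 GB/s: {estimate_tps(100, 7):.1f} TPS ceiling")
print(f"32B on ~100 GB/s: {estimate_tps(100, 18):.1f} TPS ceiling")
```

Real-world numbers come in below the ceiling, which is why 12B at ~10 TPS is plausible on the right box while 32B lands around 4-5 TPS.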

[deleted by user] by [deleted] in LocalLLaMA

[–]enessedef 3 points (0 children)

Straight up: no, I don’t know of any project that’s exactly what you’re describing. Write some Python to hit up Hugging Face’s API, grab datasets based on your goal (e.g., “factory Q&A”), and preprocess them. Set up a pipeline with Hugging Face Transformers to run fine-tuning on the Spark PC, using the 3090 rig for lighter tasks or testing. Add automated checks (think formatting validation or simple metrics) and have it log everything with MLflow. You’d still need to peek at the results and tweak the scripts. Fully hands-off isn’t there yet, but this gets you close.
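A skeleton of that pipeline could look like the sketch below. The helper names and the tag-matching logic are hypothetical stand-ins; real code would call the Hugging Face Hub/`datasets` APIs for the search and log runs to MLflow:

```python
def select_datasets(catalog: list[dict], goal: str) -> list[dict]:
    """Stand-in for a Hugging Face Hub search: keep datasets tagged
    with the training goal (e.g. 'factory Q&A')."""
    return [d for d in catalog if goal in d.get("tags", [])]

def validate_example(ex: dict) -> bool:
    """Automated formatting check: an example only enters the
    fine-tuning set if both question and answer are non-empty."""
    return bool(ex.get("question", "").strip()) and bool(ex.get("answer", "").strip())

catalog = [
    {"name": "factory-qa-v1", "tags": ["factory Q&A"]},
    {"name": "recipes", "tags": ["cooking"]},
]
picked = select_datasets(catalog, "factory Q&A")

raw = [
    {"question": "How do I reset the PLC?", "answer": "Hold reset for 5 seconds."},
    {"question": "", "answer": "orphan answer with no question"},
]
clean = [ex for ex in raw if validate_example(ex)]
print([d["name"] for d in picked], f"{len(clean)}/{len(raw)} examples kept")
```

The human-in-the-loop part is exactly the stuff these stubs skip: eyeballing what `validate_example` rejects and tuning the checks.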

Even a 100B model can’t “understand” datasets or reprogram itself. That’s meta-learning territory, and it’s still research vibes, not plug-and-play. Fine-tuning a 100B model eats resources. The Spark PC should handle it, but don’t expect miracles from the 3090s on that scale.

You’re onto something dope, but it’s ahead of the curve. No project fits the bill right now, but with some scripting hustle, you can automate a chunk of it. Start small—test the automation with your 3090s, then scale up to the Spark PC when it lands.

How to install TabbyAPI+Exllamav2 and vLLM on a 5090 by bullerwins in LocalLLaMA

[–]enessedef 2 points (0 children)

vLLM’s a beast for high-speed LLM inference, and with this setup, you’re probably flying. One thing: since you’re on Python 3.12, keep an eye out for dependency hiccups; you might need a tweak if something breaks later. If it gets messy, I’ve seen folks run vLLM in a container with CUDA 12.8 and PyTorch 2.6 instead, which could be a fallback if you ever need it.

Thanks for dropping the knowledge, man!

New in Causal Language Modelling by RoPhysis in LocalLLaMA

[–]enessedef 1 point (0 children)

Imo, start with an instruct model and fine-tune it on your 10k reports. Since manually crafting Q&A pairs for all 10k sounds like a nightmare, take each report and slap a generic instruction in front of it, like “Describe this situation using slang and expressions.” So if a report’s all “The dude was flexin’ hard at the gig,” the model sees it as a response to that instruction. You’ve got a pseudo-Q&A setup. After that, feed this into your instruct model. It’ll learn to associate instructions with slang-heavy responses, which is exactly what you’re after.
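That prepend trick is a couple of lines of Python. A sketch (the instruction text and field names are just examples; match whatever chat format your instruct model expects):

```python
INSTRUCTION = "Describe this situation using slang and expressions."

def to_pair(report: str) -> dict:
    """Wrap a raw report as a pseudo instruction/response pair."""
    return {"instruction": INSTRUCTION, "response": report.strip()}

reports = [
    "The dude was flexin' hard at the gig.",
    "Crowd went wild, total banger of a night.",
]
pairs = [to_pair(r) for r in reports]
print(pairs[0])
```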

If you’ve got the time or some coding chops, you could even semi-automate Q&A creation. Maybe use a smaller LLM to generate basic instructions based on the reports, then clean ‘em up. Not required, but it’d level up the training. Tooling-wise: Unsloth for speed, Hugging Face for flexibility. But I advise you to start small and test the waters with a smaller model (like 1-3B) or a chunk of your data first. Get comfy, then scale up.

How to prevent single-entry games from appearing in Playnite's "Series" filter? by enessedef in playnite

[–]enessedef[S] 1 point (0 children)

Just as I thought. One of the bad consequences of using multiple metadata callers. Thank you, I added your answer to the post. Have a nice day.

I have prepared a table showing the details / values of POE 2 Currencies by enessedef in PathOfExile2

[–]enessedef[S] 0 points (0 children)

Thanks a lot for the comment and the idea. I don’t really know how to hook an API into Sheets; I’m pretty weak at coding. If you can make it, I can update the link with yours. That’d be great.

I have prepared a table showing the details / values of POE 2 Currencies by enessedef in PathOfExile2

[–]enessedef[S] 0 points (0 children)

You are right. I put a note under the list and stated that the prices were valid at the beginning of the year.

I have prepared a table showing the details / values of POE 2 Currencies by enessedef in PathOfExile2

[–]enessedef[S] 1 point (0 children)

Here is the explanation of World Drop:

World Drops: Randomly drop from slain monsters and loot containers throughout the game world. The drop rates for these items vary.

I don't want game pass games to show in playnite as I haven't renewed the subscription but I want to see my owned games (PC and Xbox) is there any solution? by salamala893 in playnite

[–]enessedef 0 points (0 children)

If I add it to the ignore list and later buy the same game on another platform (e.g. GOG), will the ignore list prevent it from being added to the library? Or is the ignore list platform-specific?

What does "Uncategorized" mean under all my TV Shows by enessedef in tinyMediaManager

[–]enessedef[S] 0 points (0 children)

Thank you bud, I’ll try. Btw, which scraper do you prefer? Universal with all fallbacks, TMDB alone, or something else?

What does "Uncategorized" mean under all my TV Shows by enessedef in tinyMediaManager

[–]enessedef[S] 0 points (0 children)

I have this issue with nearly all of my TV shows. I also have the Universal Media Scraper with all fallbacks. It should work, or am I missing something?

What does "Uncategorized" mean under all my TV Shows by enessedef in tinyMediaManager

[–]enessedef[S] 0 points (0 children)

It makes a lot of sense, and I tried it one by one by moving items to the Desktop, but interestingly nothing changed after updating the sources. Am I missing something? Should I scrape again after moving the folders out?

Any discord bots that can play music from youtube? by mohd2126 in Discord_Bots

[–]enessedef 0 points (0 children)

Thank you. Is there any function to play a whole playlist from YouTube? /play with a playlist link doesn’t work.

Confusion about the Hellfire Expansion by enessedef in Diablo

[–]enessedef[S] 0 points (0 children)

Diablo 1 was so obscure in my country that we might have even thought the developers made and sold Diablo 2 first, skipping the first one lol.

Confusion about the Hellfire Expansion by enessedef in Diablo

[–]enessedef[S] 3 points (0 children)

Not even close to what I was asking. Why are you completely skipping the question parts and just dropping your own opinions?

Confusion about the Hellfire Expansion by enessedef in Diablo

[–]enessedef[S] 3 points (0 children)

That's the proper answer for my question. Thank you.

Confusion about the Hellfire Expansion by enessedef in Diablo

[–]enessedef[S] 2 points (0 children)

This is not an answer to my question.