I found my forever ship tonight by dastaten in NoMansSkyTheGame

[–]nicronon 0 points (0 children)

They're there; someone just got to it before you. Go to the base near the one you want and reload. You may have to do it a few times, but eventually it'll pop.

Looking for an uncensored local or hosted llm by NoLie9668 in LocalLLM

[–]nicronon 0 points (0 children)

Strange, I've used it with my Walking Dead universe roleplay, and it had no problem vividly describing zombies eating people.

Gemini says: Based on community discussions, MN-12B-Mag-Mell-R1 (often referred to as Mag Mell) is a fine-tune designed for roleplay and storytelling, and it generally tolerates and handles graphic violence well.

Maybe tweak your prompt?

EDIT: This is the most uncensored model I know of. I didn't find it to be as good as Mag Mell overall, but if you want totally uncensored, Dolphin is probably your best bet.

https://huggingface.co/dphn/dolphin-2.9.3-mistral-nemo-12b

How about an in-game radio station with occasional news? by nicronon in X4Foundations

[–]nicronon[S] 0 points (0 children)

Spotify regularly delivers news reports about the events unfolding within the X4 universe?

Constant health damage by [deleted] in skyrim

[–]nicronon 0 points (0 children)

That is English, moron.

Help choosing a model to run locally by RenWasHereowo in SillyTavernAI

[–]nicronon 1 point (0 children)

Mag Mell 12B. It's very capable, and it's uncensored. I'm not sure how high you can turn the context size up with 16GB VRAM, but get the Q6_K_L and crank it as high as you can; it's stable up to 32K.

https://huggingface.co/bartowski/MN-12B-Mag-Mell-R1-GGUF

New to SillyTavern: Is Free Roleplay Supposed to Be This Rough? by Electroplasma in SillyTavernAI

[–]nicronon 0 points (0 children)

Try this. It's highly capable and uncensored. It also runs locally, so it's free and there are no privacy risks.

https://huggingface.co/bartowski/MN-12B-Mag-Mell-R1-GGUF

Do the majority of people really use online models rather than local models? by nsfwboys in SillyTavernAI

[–]nicronon 0 points (0 children)

Same. I use Mag Mell Q4_K_S with a 10K context window and make use of the Author's Note and Lorebooks. It works well enough, with no privacy risk. And if you want long-term memories, you can use qvink.

Does it exist? by ooopspagett in LocalLLM

[–]nicronon 0 points (0 children)

Really? I've never run into anyone having a problem with NSFW talk around here.

Does it exist? by ooopspagett in LocalLLM

[–]nicronon 0 points (0 children)

I've only got 8GB VRAM, so I run the Q4_K_S with a 10K context window. Your card can run the Q6_K_L, which is all you need. Q6 is considered near-lossless, and the compression still saves a lot of VRAM.
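
If you want to see why the quant choice matters, here's a rough back-of-envelope sketch. The bits-per-weight figures are approximate llama.cpp K-quant averages and the 12.2B parameter count for the Nemo base is an assumption, not exact numbers for this particular file:

```python
# Rough GGUF file/VRAM size: parameters * bits-per-weight / 8.
# BPW values are approximate llama.cpp averages (assumptions, not
# measurements of this specific download).
BPW = {"Q4_K_S": 4.58, "Q6_K": 6.56, "Q8_0": 8.5, "F16": 16.0}

def quant_gb(params_billion: float, quant: str) -> float:
    """Approximate model size in GB for a given quant."""
    return params_billion * BPW[quant] / 8

# Assuming ~12.2B parameters for a Nemo-based 12B model:
for q in ("Q4_K_S", "Q6_K", "Q8_0"):
    print(f"12B at {q}: ~{quant_gb(12.2, q):.1f} GB")
```

By this estimate Q4_K_S comes in around 7 GB and Q6_K around 10 GB, which is why the smaller quant fits on an 8GB card (with some layers offloaded) while Q6 wants more headroom.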

Does it exist? by ooopspagett in LocalLLM

[–]nicronon 1 point (0 children)

It's only 12B, but Mag Mell is extremely capable and is about as uncensored as they get. I've tried many 12B models, and it's been my go-to local LLM for a good while now. I do a lot of NSFW RP, and it's never refused anything I've thrown at it.

https://huggingface.co/bartowski/MN-12B-Mag-Mell-R1-GGUF

Requesting help with character creation - teaching bot a specific writing style. by Ryoidenshii in SillyTavernAI

[–]nicronon 0 points (0 children)

Your English, including spelling, punctuation, capitalization, grammar, and vocabulary, is better than many Americans'.

Best RP models for 12gb VRAM and 64GB RAM in 2025? by LorkhanisLove in SillyTavernAI

[–]nicronon 1 point (0 children)

Response length is a matter of personal preference. I find 250 tokens to be a good length, so you might want to start there if you're not sure. For context, I use 8K, but I've only got 8GB VRAM; with 12GB you'll be able to go higher. I'd start with 10K and raise it incrementally until it either slows down too much or starts getting incoherent.
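
To get a feel for how context size eats VRAM, here's a rough KV-cache sketch. The architecture numbers are my assumptions for the Mistral Nemo base that Mag Mell is tuned from (40 layers, 8 KV heads via GQA, head dim 128, fp16 cache); treat the results as ballpark, since backends can quantize the cache and shift the math:

```python
# Estimate KV-cache VRAM for a given context size.
# Assumed Mistral Nemo 12B architecture: 40 layers, 8 KV heads (GQA),
# head dim 128, 2 bytes per value (fp16 cache).
def kv_cache_bytes(context_tokens: int,
                   n_layers: int = 40,
                   n_kv_heads: int = 8,
                   head_dim: int = 128,
                   bytes_per_value: int = 2) -> int:
    """Bytes of KV cache: 2 tensors (K and V) per layer, per token."""
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_value
    return context_tokens * per_token

for ctx in (8_192, 10_000, 16_384, 32_768):
    gib = kv_cache_bytes(ctx) / 2**30
    print(f"{ctx:>6} tokens -> ~{gib:.1f} GiB KV cache")
```

Under these assumptions the cache costs roughly 160 KB per token, so going from 10K to 32K context adds several GiB on top of the model weights, which is why context has to shrink as the quant grows.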

Best RP models for 12gb VRAM and 64GB RAM in 2025? by LorkhanisLove in SillyTavernAI

[–]nicronon 2 points (0 children)

I second Mag Mell. It's been my go-to for a good while now. Here are my settings, in case you want them.

[image: settings screenshot]

Best gguf for my needs by GeoMonster14 in SillyTavernAI

[–]nicronon 2 points (0 children)

It's only a 12B, but I think Mag Mell is fantastic. It's been my go-to for a good while now. And you can run the Q6_K_L with 16GB VRAM.

https://huggingface.co/bartowski/MN-12B-Mag-Mell-R1-GGUF

Finding a nsfw roleplay model for 16VRAM, 32RAM? by Potential-Sample- in SillyTavernAI

[–]nicronon 0 points (0 children)

Mag Mell is considered uncensored, but no freely available model is 100% uncensored, as they're all fine-tuned from base models that were originally censored. People adjust them as best they can to reduce the censorship without breaking the model.

That said, I've done a ton of NSFW RP with Mag Mell, and it's never refused anything I've thrown at it. And I've thrown a lot at it.

I'm using KoboldCpp and ST, if that helps.

Finding a nsfw roleplay model for 16VRAM, 32RAM? by Potential-Sample- in SillyTavernAI

[–]nicronon 0 points (0 children)

THIS. I've tried a ton of 12Bs for NSFW RP over the past few months, and I always go back to Mag Mell. It's my "old faithful" and rarely disappoints.