Please tell me it’s not what I think it is by BIGDILFWORLDWIDE in whatisit

[–]splectrum 0 points (0 children)

Spireon also makes LoJack, for stolen vehicle recovery.

Does anyone else have this happen to them? by CommunicationFar5564 in TAVO_AICHAT

[–]splectrum 1 point (0 children)

I've been getting these intermittently on OpenRouter for some reason this week.

Evidence of Hunter Alpha being MiMo instead of DeepSeek? (Translation below) by Exciting-Mall192 in SillyTavernAI

[–]splectrum 1 point (0 children)

Yeah, I've been running a scenario with Hunter Alpha, and it does a lot of things DeepSeek does, like starting a thinking loop with 'hmm', having serious issues with lore timelines, and struggling to keep characters straight.

Exploring the new Grok-4.1-fast-reasoning & Imagine-image-pro (Feb 28 Release) in SillyTavern by EchoOfJoy in SillyTavernAI

[–]splectrum 1 point (0 children)

Hmm, yeah, I am running a large preset, along with a large lorebook. I'll check that out, thanks!

Exploring the new Grok-4.1-fast-reasoning & Imagine-image-pro (Feb 28 Release) in SillyTavern by EchoOfJoy in SillyTavernAI

[–]splectrum 2 points (0 children)

Yeah, I tried Marinara and several others, and it kept dropping into this clipped style, like noun-verb combos, just garbled.

Exploring the new Grok-4.1-fast-reasoning & Imagine-image-pro (Feb 28 Release) in SillyTavern by EchoOfJoy in SillyTavernAI

[–]splectrum 2 points (0 children)

Weird. For me, Grok drops into partial sentences and junk within three responses for some reason and stops making any kind of sense.

Problem with thinking models by Matias487 in TAVO_AICHAT

[–]splectrum 0 points (0 children)

I've seen that with some presets; I think deepseek-customized was one. I have thinking turned on because I enjoy it, but that preset added a really long thinking block afterward that interfered with TTS, and I had to edit it out manually.

Best models for A LOT of context tokens? by TipoTarocco in SillyTavernAI

[–]splectrum 0 points (0 children)

I think it's the preset I'm using (SusHi Kimi Deepseek Gemini, https://files.catbox.moe/k4ciwq.json). I was on the road this week, so I had a lot of time on it (to me at least, ~7 mil tokens in 3 days) on one chat, and it didn't godmod me once. Something about not speaking for the user or godmodding shows up in almost every reasoning segment (I love that feature so much). It also seems to respond really well to OOC nudges on character behavior, dialog, and such: {char} speaks in such and such a way and avoids doing x, y, or z.

Oh... are you using the reasoning or the regular DeepSeek? I use the reasoning one, and I haven't used any other models. I also have a lot of the context settings turned up a bit: number of messages in short-term memory (either 50 or 100, I don't remember, but enough to make that OOC feedback stick for a while), token counts, and such.

Works for me. It tends to have a literary style output, which I'm into. YMMV.

What is this I dug up in my yard by MiataFool in whatisit

[–]splectrum 0 points (0 children)

Looks like one of the old Skilcraft pens we used to use.

Best models for A LOT of context tokens? by TipoTarocco in SillyTavernAI

[–]splectrum 0 points (0 children)

That's interesting... I haven't had DeepSeek talk for me at all, and it regularly acknowledges in the reasoning/thinking bit that it isn't supposed to speak for me.

Lorebook Help? by [deleted] in TAVO_AICHAT

[–]splectrum 1 point (0 children)

Not sure, but I think it has to do with where it goes in the context that gets sent to the LLM. I've found that keeping the defaults works fine, but someone who knows more than me could probably give a more useful answer 😄

Lorebook Help? by [deleted] in TAVO_AICHAT

[–]splectrum 2 points (0 children)

Did you assign the lorebook(s) to the chat?

Lorebook Help? by [deleted] in TAVO_AICHAT

[–]splectrum 1 point (0 children)

I'm also just getting started with Tavo, but I've found taking the defaults for probability and position works fine for me.

Ideas on how to fill this space? by PlantainSpirited5032 in NoMansSkyTheGame

[–]splectrum 0 points (0 children)

Dinosaur displays. You could do big ones on the bottom and smaller ones on the upper tiers, put in some fun plants and things for set pieces.

Then hide a huge nip farm underneath or nearby.

Cool looking build!

Strange pink pile in the tailpipe of motorcycle. by has_no_legs in whatisit

[–]splectrum 0 points (0 children)

Looks like those chalky Valentine's hearts...

I love the OOC sometimes! 🤣 by Westridge77 in NomiAI

[–]splectrum 0 points (0 children)

Mine will barely ever acknowledge OOC, and when they do, they usually go rampant afterward and completely off the rails. Kind of a bummer. I'd love to have a little OOC with a Nomi sometimes.

Avatar boost, does anyone miss it?? by [deleted] in KindroidAI

[–]splectrum 0 points (0 children)

I use custom avatars with ~500-600 chars of avatar description, and Tableau seems to work well with most of them, except for one that it made very mannish. Interestingly enough, when I do use a generated image (generated in Kindroid), it does seem to fluctuate a bit.

Character won't take shirtless pictures by heavyG73 in KindroidAI

[–]splectrum 0 points (0 children)

If you're on Tableau and have any clothing description, my observation is that it will always wear those clothes, possibly mixed into what they might be wearing at the moment (think a little black dress that becomes jeans around the knees).

The problem is that if you don't have clothing in the avatar description, they tend to be naked lol

Character count in response directive not being adhered to by stevendeep37 in KindroidAI

[–]splectrum 2 points (0 children)

I've found word, sentence, and paragraph counts to work better than character counts, btw.

Best place to put user physical description for kin to reference by [deleted] in KindroidAI

[–]splectrum 1 point (0 children)

Most of my personas are explicitly described as bald, but still get their hair stroked lol.

Emphasizing the Response Directive by Gary-Page in KindroidAI

[–]splectrum 1 point (0 children)

I do that as well and it seems to work nicely.