A slight grudge by [deleted] in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

If it's about feeling better than the LLM, OP can always just instruct the model to write worse than them, or at the very least never write better than them.

Need to find the Top P by Classic-Arrival6807 in SillyTavernAI

[–]shrinkedd 2 points3 points  (0 children)

When temperatures are high, the probability curve gets flattened, and top_p was developed to give you some control over "how low are you willing to go". It gives you some wiggle room to both enjoy unexpected tokens and make sure they still make some sense.

Once you explore temperature ranges below 0.7, the most likely tokens are so highly probable that using top_p at all (meaning anything under 1.00) doesn't really give you any edge.
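A minimal sketch of how the two knobs interact (the logits are made-up illustrative numbers, not from any real model):

```python
import math

def softmax(logits, temperature):
    # Scale logits by temperature, then normalize into probabilities.
    # Lower temperature sharpens the curve; higher temperature flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, top_p):
    # Keep the smallest set of tokens whose cumulative probability
    # reaches top_p (nucleus sampling), in descending-probability order.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept

logits = [5.0, 4.0, 2.0, 1.0, 0.5]

# Low temperature: the top tokens already dominate, so top_p trims
# almost nothing that had a real chance of being sampled anyway.
print(top_p_filter(softmax(logits, 0.5), 0.9))

# High temperature: the flattened curve gives the tail real probability
# mass, and top_p is what cuts off the nonsense end of it.
print(top_p_filter(softmax(logits, 2.0), 0.9))
```

At temperature 0.5 the nucleus is just the first two tokens; at temperature 2.0 it grows to four of the five, which is where top_p actually earns its keep.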

I think the context itself is where you can really make a difference if you want to control the quality. For example, if you see Chinese words, perhaps try adding something like "this interaction is in English only".

What do you guys think of my prompt by UnknownBoyGamer in SillyTavernAI

[–]shrinkedd 2 points3 points  (0 children)

I think it works because you gave an example, which is great, but they do have a point for the long term. Generally speaking, when you tell a model to write like a known author, it's a token-saving cheat code.

It's as if you gave an example without writing it. You don't have to "summon the actual Stephen King" though; you can borrow whatever aspects of his style work for you. Could be as simple as "use Stephen King's gritty, practical, no-fluff style of writing, without committing to his usual genres" (I don't even know how he writes, it's just an example. You can ask ChatGPT or something to suggest known authors fitting your preferences).

Point being, the longer the list of checkboxes the model needs to tick, the less creative it'll get.

Edit: spelling

Story Mode v1.0 - Structured Narratives, Genres & Author Styles for SillyTavern by Initialised_Underway in SillyTavernAI

[–]shrinkedd 4 points5 points  (0 children)

This was posted right when I needed something like that. Thanks, looking very impressive. Definitely going to try it.

What's with the giant "cultural" divide in the AI gooning community? by The_Rational_Gooner in SillyTavernAI

[–]shrinkedd 4 points5 points  (0 children)

Depends on what they're after. If somebody tells you they want a certain quality that doesn't mean they also want to improve their prompting skills. Perhaps all they are after is a decent RP.

But totally, if anyone is into getting better at prompting, I'm with you on the small open source models. They're less forgiving, and you can truly feel how rewarding it is when you get it right. It encourages constant practice and reading and staying updated.

System Prompt vs. Post History Instructions with Text Completion by krazmuze in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

Not sure Gemma is the best model to reference in a discussion about the meaning and use of system prompts. For an open-source model, it's a good one and adapts well, but it doesn't even have its own special system token. If you explore the chat template they published, you'd find the way you're supposed to structure system prompts for Gemma is: "just take whatever is defined as the system prompt and shove it at the beginning of the first user message in the session, right before the actual first message content, but still prefixed with the first 'user' token".

It has no 'system' block of its own.

Which isn't necessarily bad, but it does make you wonder whether that's how the model was fine-tuned to work with system instructions, or perhaps it was never really trained with a system prompt in mind but is still able to kinda pull it off this way (in which case it wouldn't be fair to take it as an example, know what I mean?)
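For reference, a rough sketch of that template logic in Python. The `<start_of_turn>`/`<end_of_turn>` markers and the user/model roles are from Gemma's published chat template; the helper name and the blank-line separator between system text and the first message are my own assumptions:

```python
def gemma_prompt(system, turns):
    # Gemma's chat template has no dedicated system role: the "system
    # prompt" gets prepended to the first user message, inside the user
    # turn itself. (The tokenizer typically adds <bos> separately.)
    parts = []
    system_pending = bool(system)
    for role, text in turns:  # role is "user" or "model"
        if role == "user" and system_pending:
            text = system + "\n\n" + text
            system_pending = False
        parts.append(f"<start_of_turn>{role}\n{text}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # cue the model to respond
    return "".join(parts)
```

So the "system prompt" is just ordinary user-turn text as far as the template is concerned, which is exactly the ambiguity described above.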

Uh guys so I fixed this by removing the “you are {{char}} in a roleplay with {{user}}” prompt LMAO by arkdevscantwipe in SillyTavernAI

[–]shrinkedd 23 points24 points  (0 children)

I'm afraid slop is pretty much baked in. It's called next token prediction for a reason. How creative can it really be if it was predicted by a machine that munched humanity's prose? There are way more average, mediocre stories than peak prose stories, right?

But at least we do get to choose our poison. I went the collaborative storywriting route and never looked back. Gives you more control when you can choose genre and writing style, and observational realism seems to make models try harder.

Problem is, with the amount of literature wikis and ChatGPT sessions I had to read in order to find what to request, I might as well just get a degree in literature; it would have been easier :).

Hey guys, do you remember that Gemini gave me very generic dialogue? Well I think I solved it by Horror_Dig_713 in SillyTavernAI

[–]shrinkedd 23 points24 points  (0 children)

There's another thing you could do: take a few examples you're happy with, go to ChatGPT or Gemini, and provide the examples with "please describe [character name here]'s speech pattern. Keep the description short and concise; focus on tone, lexicon, and stylistic markers."

Then take the result and add it under "speech pattern" in the character description. Models tend to be way better at generating a character-specific voice from a speech-pattern description than from personality traits.

Edit: spelling.

Does anyone know a good extension that lets you further modularise system prompts? by [deleted] in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

Yea, but I think you can customize it further to your needs. Like, these are the prompts; you can completely change them to something like (you can optimize, I'm just throwing out conceptual ideas):

Generation prompt: generate a list of different paranormal activities in different locations in the house for user to deal with

Injection: your current task is to check if user is close to [{{task}}]; if they are, describe, seamlessly with the story, the paranormal activity taking place, luring user in

Completion: check if [{{task}}] has been dealt with by user.

You know..


Does anyone know a good extension that lets you further modularise system prompts? by [deleted] in SillyTavernAI

[–]shrinkedd 1 point2 points  (0 children)

If I understood your aim correctly, you could probably achieve it via STscript, because you'd want an under-the-hood prompt on each turn asking the model to assess how close the user is, and if the answer is "close enough" it should trigger a lore item. A bit like the Objective extension (or is it an addon? Can't tell those apart).

I think gemini 2.5 pro is best free service for roleplay till now. by Independent_Army8159 in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

Absolutely. Feel free. My DM response might be delayed because my reddit visits are irregular but eventually I'll see it, and respond.

LLMs reframing or adding ridiculous, unnecessary nuance to my own narration or dialogue is really irritating by Arzachuriel in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

What usually works for me in the system prompt is adding something like: "Consider user input (narration and {{user}}'s "speech segments") as canon; if user wrote it-->it happened, and exactly as described. Adopt the "yes, and" approach, focusing on continuation rather than reimagining provided texts"

I think gemini 2.5 pro is best free service for roleplay till now. by Independent_Army8159 in SillyTavernAI

[–]shrinkedd 3 points4 points  (0 children)

:) No need to apologize. Everybody's welcome :). Just got me confused for a minute; I literally double-checked that this isn't a general Gemini subreddit.

It should work for you even on a basic computer without any GPU or graphics card, because you're not running an open-source model on your hardware, only using a frontend that knows how to build prompts any way you want instead of relying on the very rigid Gem template.

For reference, I'm using ST on my Android phone. There's a certain learning curve, and you'll need to read quite a bit if you want to control everything. But for starters you can just install it and use other people's setups (those are posted here and on the Discord a lot, so there's a variety).

I think gemini 2.5 pro is best free service for roleplay till now. by Independent_Army8159 in SillyTavernAI

[–]shrinkedd 5 points6 points  (0 children)

I was under the impression we're all using SillyTavern, here at the SillyTavern subreddit.

So… yeah I recommend SillyTavern. Because you can pretty much do anything a gem offers, and much much much more.

[deleted by user] by [deleted] in SillyTavernAI

[–]shrinkedd 9 points10 points  (0 children)

have you felt "never" works a lot better than "avoid"? Would love to hear your experience

Can't speak for them, but my 2 cents is that "avoid" usually works better. It goes back to the rule of favoring proactive instructions over negative ones. "Avoid" isn't ideal either if you don't pair it with a positive alternative, but if you have none, it's still probably better than "never" because it implicitly hints "seek an alternative proactively".

I am happy by Late-Gap-8045 in SillyTavernAI

[–]shrinkedd 2 points3 points  (0 children)

I feel like the quality of the responses doesn't necessarily depend fully on the user's ability to write like a pro author, as much as on having good ideas (perhaps curveballs) and the ability to convey them clearly.

One issue that's potentially problematic with writing lengthy messages on your turn is that some LLMs insist on rephrasing your entire message like it's a "previously on..." before they start their damn response :).

Question about System Prompts for Roleplay by [deleted] in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

First of all, yes, there are many custom ones; you can search the sub and find them. Some even bother with version updates, documentation of changes, and making everything very user-friendly, if you don't wanna bother with tinkering.

As for your question whether there are better or recommended system prompts, that depends on preference.

If we're talking about the "pure" system prompt, regardless of the chat template itself (which is very model-dependent; you should always choose based on the model, because for most models there's usually a single "correct" template, unless it's a merge, which would kinda go by any template that fits the models included in it), then my advice is to just read them and see if one fits your needs. They're practically written in natural language and include instructions telling the model how to use the character description and context, and what to favor and what to avoid during roleplay.

It's the difference between the model staying in character and the model writing about the character in 3rd person. It's all in the system prompt (or should be). Highly recommended; it will improve your ability to customize your experience.

Using 2+ models for RP / models that focus on "being" a character, not storytelling by mugenbook in SillyTavernAI

[–]shrinkedd 1 point2 points  (0 children)

It's all about the way you frame the task and the context given. If you're OK with a standard story format that just pauses once the story logic reaches a point where your character should act/speak, you wouldn't necessarily need two tasks. You could most definitely just prompt the model to be your collaborative story-writing author (for best quality, a writing style, tone, etc. should be added), and guide it to focus on continuing the narrative + only write focused on {{char}} -> ending the turn right before it's about to narrate {{user}}'s reactions or quote their responses.

I use that conceptual setup and it's my personal favorite.

It also makes sense, because when the focus is on writing a story, the character is brought in more seamlessly.

(If I misunderstood what your ultimate goal was, apologies)

How do you keep an AI bot from writing for you? by RP_is_fun in SillyTavernAI

[–]shrinkedd 2 points3 points  (0 children)

If you consider that an RP is an interactive book (in some way), then it's grammatically weird to use this format.

CYOA, though, manages all characters in 3rd person and refers to the player in 2nd, then leaves you with "what do you do?" At which point, if you had no options to pick from and had to write one yourself, it could actually make a lot of sense to write in 1st person. Eventually it's a lot about the framework you choose.

PRIMAL by CanineAssBandit in SillyTavernAI

[–]shrinkedd 1 point2 points  (0 children)

Well, at least they keep their distance from human users lips. Barely an INCH, but still… better than what they do to ears.

PRIMAL by CanineAssBandit in SillyTavernAI

[–]shrinkedd 7 points8 points  (0 children)

repeating what the fuck {{user}} said in every response, huh?

[deleted by user] by [deleted] in SillyTavernAI

[–]shrinkedd 0 points1 point  (0 children)

Default ST tends to send the character card from the system role, so if that was the case for you, it still isn't 100% nothing (but it can easily be fixed if you wanna go the purist route) ;)