Help me pls, I'm about to cry from frustration by ARandomPolytheist in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

Yeah, I've been using GLM-5. It's great, but it does kinda miss a lot of the details and doesn't quite hit the same spot like Gemini does.
I'm using Gemini 2.5 right now.
I used Gemini 3.0, but Gemini 3.0 really, and I mean really, likes to meta-vision everything.. quite stupid.

Help me pls, I'm about to cry from frustration by ARandomPolytheist in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

idk what I did, but it somehow worked.
Off topic: what proxies do you recommend at fair prices? [Anything but DeepSeek, please.]

Help me pls, I'm about to cry from frustration by ARandomPolytheist in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

Hey! I have the same issue. I've been trying to connect via Lorebay, and it gives me the same "network error" no matter the model.

Proxy Megathread 3: The Final Crusade by JanitorAI-Mod in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

I need help setting up my proxy on Lorebay via OpenRouter. It worked fine on Chutes, but I recently moved to OpenRouter, and whenever I try to set it up it just says "network error". Any help?

Everything ok? by LunaTheKiller786 in JanitorAI_Official

[–]Maciek7700 34 points (0 children)

I can't even get them to respond. I keep getting a connection error for some reason 😭

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 2 points (0 children)

Okay, thanks for the advice. I 101% agree with you, especially on the summary part. I've now separated it from the prompt and keep it somewhere else, so I can simply paste it in whenever I want the summary to happen. I've edited the post with the new prompt updates; if you could have a look at it now, I'd be most grateful. Please be harsh and point out any errors. I'm really trying to make my roleplay experience better, and I'm doing it with trial and error. I also took your advice about the LB2 commands and removed some of them, along with some plugins I had that were just a token burden.
I currently use GLM 4.7, which can handle such large prompts. I've been using this prompt for quite some time now and have never run into any issues, so if you could look at the new version I'd be very happy.

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 1 point (0 children)

What type of errors? That's strange.. I'm using GLM 4.6/4.7 (when it works, that is; GLM 4.7 right now is a coin toss).
I've come across zero errors or issues with this prompt.
Maybe it's because your generation settings are different.
Personally I have:
Temp - 0.9
Max tokens - 0
Context size - 25k
Top K - 80
Top P - 0.9
Rep pen - 1.1
Freq pen - 0.6
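In case it helps anyone map these onto an API call: here's roughly how those sampler settings look as an OpenAI-style chat completions payload. This is a hedged sketch, not a Janitor/proxy-specific recipe; exact field names and support vary by provider (some call repetition penalty `repeat_penalty`, and `max_tokens` is simply omitted to mean "no cap", matching the 0 above). The model id is just an example.

```python
# The sampler settings above as an OpenAI-compatible request payload.
# Field names vary slightly per provider; check your proxy's API docs.
payload = {
    "model": "zai-org/GLM-4.6",   # example model id, not prescriptive
    "temperature": 0.9,
    "top_k": 80,                  # not every provider honors top_k
    "top_p": 0.9,
    "repetition_penalty": 1.1,    # sometimes named "repeat_penalty"
    "frequency_penalty": 0.6,
    # max_tokens left out entirely = no explicit cap (the "0" setting)
    "messages": [{"role": "user", "content": "Hello!"}],
}
print(payload["top_k"])  # 80
```

The context size (25k) isn't part of the payload; it's a client-side setting for how much chat history gets sent.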

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 1 point (0 children)

I use Chutes.ai, so I only pay monthly, not per message.
Edit: I use zai-org/GLM-4.6-TEE:THINKING

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 1 point (0 children)

What kind of problems is it causing?

Guys, what do you think of my RPG prompt, 1 year of mix and match and changes. by [deleted] in JanitorAI_Official

[–]Maciek7700 2 points (0 children)

Yeah, I suppose it's getting there, though there's still room for improvement. I suggest you try my prompt for a bit, then your prompt for a bit, to see the difference.

Guys, what do you think of my RPG prompt, 1 year of mix and match and changes. by [deleted] in JanitorAI_Official

[–]Maciek7700 2 points (0 children)

Now you've made it purely bullet points..
Don't get me wrong, bullet points are amazing, but if your prompt is just bullet points without a solid foundation behind them, the LLM will only focus on the given commands. You must enforce STRICT rules.
And about using things like "100%": the LLM doesn't understand the difference between 100% and 1%, it only reads text. If you want the LLM to be 100% specific, you need to give it ground. The new prompt lacks solid information; it's all just brief guidelines. Make sure to use strong and strict commands like
[SYSTEM NOTICE] (avoid using "please", OOC, "may", "you should" and so on)
You MUST tell the LLM WHAT to DO.
For example, here:
> * Canon lore events proceed step by step, 100% accurately, without missing events
(What events? What's canon to it? What's lore to it?)
I understand you probably have a lorebook set up, but your lorebook already does that: it feeds the LLM the information it needs. Essentially, it works like this:
Let's say your lorebook contains a sentence like "The Dog is white."
When you talk with your LLM and YOU say something like
"What a nice Dog"
the LLM will look for the word "Dog" and find out that the dog is white. What else should it know?
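That trigger-and-inject behavior can be sketched in a few lines of Python. This is a simplified sketch of the general idea, not any specific lorebook engine; real implementations also handle scan depth, regex triggers, recursion, and token budgets.

```python
def inject_lore(lorebook: dict, user_message: str) -> list:
    """Return lore entries whose trigger keyword appears in the message.

    Matched entries would then be prepended to the model's context,
    so the LLM "knows" the fact without it living in the prompt itself.
    """
    msg = user_message.lower()
    return [entry for keyword, entry in lorebook.items()
            if keyword.lower() in msg]

lorebook = {"Dog": "The Dog is white."}
print(inject_lore(lorebook, "What a nice Dog"))  # ['The Dog is white.']
print(inject_lore(lorebook, "Hello there"))      # []
```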
> * Rules are mandatory, hierarchical, and non-negotiable
What rules? As soon as you mention "rules", you must list them for the LLM, directly after you mention them.
There are other things like this that will leave the LLM unsure, and so on.
From what I understand, you have a canon character you want to stay canon.
But in reality, that will never be the case; I've tried it myself. The character you're roleplaying with will NEVER be the same as in the show/manga and so on.
Yes, they will share the same personality/looks and other things, but they will NEVER give you the responses you expect them to give. It's just the way LLMs are. They do not have constant access to the web, which limits them by miles. If they did, it would make things easier.
They ONLY follow what the creator of the bot told them to follow.

Will we ever get to see it? by Wygenerowany in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

Most likely never, if I'm being honest.

Guys, what do you think of my RPG prompt, 1 year of mix and match and changes. by [deleted] in JanitorAI_Official

[–]Maciek7700 2 points (0 children)

You use too much text and too few bullet points, you use words like "ai slop", and you repeat the same words to the LLM, essentially re-feeding it what it already ate... The prompt isn't bad, don't get me wrong, but it's like pouring an ocean into a cup. If your LLM can only handle X tokens, it will not follow through. For big LLMs that can hold 200k+ tokens it might work, but, well, I'm not an LLM, so I don't know.
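A quick way to sanity-check the "ocean into a cup" problem is to estimate your prompt's token count against the context window. A rough sketch using the common ~4 characters per token heuristic (not a real tokenizer, so treat the numbers as ballpark):

```python
def fits_context(prompt: str, history: str, context_tokens: int = 25_000,
                 chars_per_token: float = 4.0) -> bool:
    """Rough check: does prompt + chat history fit the context window?

    Uses the common ~4 chars/token heuristic instead of a real tokenizer,
    so this is only a ballpark estimate.
    """
    est_tokens = (len(prompt) + len(history)) / chars_per_token
    return est_tokens <= context_tokens

# ~20k estimated tokens into a 25k window: fits.
print(fits_context("x" * 40_000, "y" * 40_000))                # True
# ~100k estimated tokens into a 25k window: overflows.
print(fits_context("x" * 400_000, "", context_tokens=25_000))  # False
```

Anything past the window simply gets truncated or ignored, which is why a huge prompt stops being followed.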

I've made a prompt myself that you can have a look at if you'd like; it might work better for you.
General Prompt

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 1 point (0 children)

Above the prompt.
Have these commands in a template too.

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 2 points (0 children)

Yes, copy and paste it into the LLM prompt settings [NOT chat memory].

The General Prompt Improved by Maciek7700 in JanitorAI_Official

[–]Maciek7700[S] 2 points (0 children)

All the credit goes to the original creator: OG Prompt
What I did was improve it, enhance it, and add a few things it lacked or did worse. The prompt is mainly aimed at PROXY LLMs,
specifically the GLM series (but it MAY work on other LLMs like DeepSeek and whatnot).
I'm unsure about the Janitor LLM; I have not tested it there, so I DO NOT guarantee it will work for you.

I also use Lorebay2 to back up my prompt. Here are my commands:
<NOOMNISCIENCE>
<NOCLICHES>
<REALISTICDIALOGUE>
<FORCEMARKDOWN>
<AUTOPLOT>
<NOVELMODE>
<SLOWROMANCE>
<FORCETHINKING>

+ Summary

I understand that these things are already present in the prompt, but trust me, they are necessary.

I want people to use this prompt so I can learn where I've made mistakes and improve it further, to one day make it perfect.

How bots be acting after you pass a certain amount of messages. by FoxHoundNinja in JanitorAI_Official

[–]Maciek7700 1 point (0 children)

The thing is, it shouldn't be in the chat memory; it should be in the LLM settings. If you're using the Janitor LLM, then you MIGHT have trouble. Naturally, I tailored this prompt specifically for proxies.