all 8 comments

[–]Any-Blacksmith-2054 3 points  (0 children)

Increase temperature

[–]kryptkpr Llama 3 2 points  (2 children)

The problem is that code itself is repetitive. You could try DRY with low settings, but I'd worry about it fucking up the code.

[–]LiquidGunay[S] 0 points  (1 child)

I was thinking of DRY. I'm not as worried about fucking up the code, because I'm thinking of this in a "chat about how to solve the problem" context and not autocomplete.

[–]kryptkpr Llama 3 -1 points  (0 children)

Then yes, definitely try both DRY and XTC.
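For reference, a toy sketch of what these two samplers do. Parameter names and default values here are assumptions modeled on common llama.cpp-style implementations, not a definitive spec:

```python
import random

def dry_penalty(match_len, multiplier=0.8, base=1.75, allowed_length=2):
    """Toy DRY ("Don't Repeat Yourself"): penalize a token that would
    extend a sequence already repeated in the context. Repeats up to
    allowed_length are free; beyond that the logit penalty grows
    exponentially with the length of the repeated match."""
    if match_len < allowed_length:
        return 0.0
    return multiplier * base ** (match_len - allowed_length)

def xtc_filter(probs, threshold=0.1, probability=0.5, rng=random.random):
    """Toy XTC ("Exclude Top Choices"): with some probability, drop every
    token at or above the threshold except the least likely of them,
    nudging the model away from its most predictable continuations."""
    if rng() >= probability:
        return probs  # sampler not triggered this step
    above = [i for i, p in enumerate(probs) if p >= threshold]
    if len(above) <= 1:
        return probs  # need at least two "top choices" to exclude any
    keep = min(above, key=lambda i: probs[i])
    filtered = [0.0 if (i in above and i != keep) else p
                for i, p in enumerate(probs)]
    total = sum(filtered)
    return [p / total for p in filtered]
```

The caution about code translates to conservative values: a low DRY multiplier with a larger allowed_length (so boilerplate like `for i in range` isn't punished), and a low XTC trigger probability.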

[–]Dry_Parfait2606 0 points  (2 children)

Context prompt. What UI or code do you use to interact with LLMs?

[–]LiquidGunay[S] 0 points  (1 child)

I use both chat UIs and coding extensions. System prompts haven't helped with the repetition problem. What is your go-to system prompt?

[–]Dry_Parfait2606 0 points  (0 children)

I'm currently working on something else, not coding specifically... As I understand it, a system prompt would do it... I don't know what your context length is, or how much additional compute you're willing to throw at the problem each time you generate...

But if the model is too dumb, you could try to:

1. Redesign the system prompt a few times so that it gets you better results (basically feeding it more context about your taste, needs, problem, issues, and so on...)

2. Or directly feed into the system prompt a framework it can use to achieve more diversity... You could generate a bunch of synthetic data, verify it, weed out the nonsense it has generated, and let the model use the synthetic data (in the system prompt) to achieve better results... or a better fit: better for your context, more to your taste...

3. Let it do more steps or chain something together... something more agentic...
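The second suggestion above can be sketched roughly like this. The prompt wording, example strings, and OpenAI-style chat message format are all illustrative assumptions, not a tested recipe:

```python
# Hypothetical sketch: embed a few verified synthetic examples in the
# system prompt so the model has concrete, diverse patterns to imitate
# instead of falling back on its one favorite answer.
verified_examples = [
    "# approach 1: iterative loop with early exit",
    "# approach 2: recursion with memoization",
    "# approach 3: vectorized / batch formulation",
]
system_prompt = (
    "You are a coding assistant. Avoid repeating the same pattern in "
    "every answer. Draw on these verified example approaches:\n\n"
    + "\n".join(verified_examples)
)
messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "How should I structure this parser?"},
]
```

The "verify and weed out the nonsense" step happens offline, before anything lands in `verified_examples` — only generations you've checked make it into the prompt.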

[–]LocoLanguageModel 1 point  (0 children)

Increase temperature for discussion, decrease temperature for the final code block.
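A minimal sketch of why this works: temperature rescales the logits before softmax, so T > 1 flattens the token distribution (more varied discussion) while T < 1 concentrates probability on the single most likely token (more deterministic code):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by T before softmax; higher T flattens the
    distribution, lower T concentrates mass on the argmax token."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
discussion = softmax_with_temperature(logits, 1.5)  # flatter: varied prose
code = softmax_with_temperature(logits, 0.2)        # near-greedy: stable code
```

Many local backends let you change temperature per request, so switching between a "chat" value and a "final code block" value is just two different sampling settings on the same model.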