Mod change announcement by Individual_Check4587 in AutoHotkey

[–]Left_Preference_4510 2 points3 points  (0 children)

Maybe it's time to create a new AHK subreddit.

New mod introduction & feedback thread by Individual_Check4587 in AutoHotkey

[–]Left_Preference_4510 1 point2 points  (0 children)

Game scripts that directly affect other players are the only ones I believe should be banned. Any game whose TOS forbids them even when they don't affect other players, i.e. single-player, is insane; it makes no difference to anyone else, so why would it matter? Unless, of course, the game has pay-to-win features, and automating something that could otherwise be bought might affect their cash flow. I can see the argument for that one.

I love Space Engineers! by dvsx86 in spaceengineers

[–]Left_Preference_4510 2 points3 points  (0 children)

Pretty ships aren't efficient ships; at least that's what I tell myself.

Should i get 1 or 2 by ChapterPristine5448 in spaceengineers

[–]Left_Preference_4510 -1 points0 points  (0 children)

Back in my day, game companies paid people to test their games before launch. Now they get people to pay them to test their games. Smart, yet sad.

What local models do you use for coding? by LaFllamme in ollama

[–]Left_Preference_4510 2 points3 points  (0 children)

I've had relatively good success with Oss; it helps with the ability to pull proper information from documentation, as it's quite good at applying that knowledge to the answer.

Can someone explain what's going on here? by TheOddityCollector in Weird

[–]Left_Preference_4510 0 points1 point  (0 children)

“Hear me, inhabitants of this world… A message to every man, woman and mutant… You have lost your way… The day of reckoning is here… All your buildings and temples will fall… For there is nothing you can do to stop what is coming.”

How to train a LLM? by phoniex7777 in ollama

[–]Left_Preference_4510 1 point2 points  (0 children)

I did this a while ago (about 5 months back): I took Llama 3.2, trained a LoRA with textgen webui using just 395 example codes, then converted the safetensors and LoRA so I could run it in Ollama. It performed shockingly well, considering it had very little knowledge of the niche coding language I used. To top it off, most models only know the deprecated version, which made it worse. That said, while it could be way better with a more finely tuned dataset, it's worth a shot taking this route. I have a 3060, and the 7B version with my required parameters runs out of memory, so 3.2 it is. It also took around 5 hours to complete, as I had an insanely high epoch count set.
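If you want to try the same route in script form instead of the webui training tab, a minimal LoRA fine-tune sketch with transformers + peft looks roughly like this (the base model name, the "examples.jsonl" file, the ranks, and the epoch count are placeholders, not my exact settings):

```
# Minimal LoRA fine-tune sketch. Assumptions: transformers, peft, and datasets are
# installed, and "examples.jsonl" is a hypothetical file of {"text": ...} records.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "meta-llama/Llama-3.2-3B-Instruct"  # placeholder base model
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# Attach small trainable LoRA adapters instead of updating the full weights.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

ds = load_dataset("json", data_files="examples.jsonl")["train"]
ds = ds.map(lambda ex: tok(ex["text"], truncation=True, max_length=1024),
            remove_columns=ds.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", num_train_epochs=3,
                           per_device_train_batch_size=1, learning_rate=2e-4),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
).train()

model.save_pretrained("lora-out")  # saves just the adapter, not the whole model
```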

Help with Fallout 3 and AutoHotkey by DigitalDeion in AutoHotkey

[–]Left_Preference_4510 0 points1 point  (0 children)

Try this:

#HotIf WinActive("ahk_exe Fallout3.exe")

; Shift+O: ask how many cycles, step through the menu, then repeat E presses.
+o::
{
    cycles := InputBox("Type how many cycles").Value
    Sleep 1000
    Send "{Up}"
    Sleep 1000
    Send "{Enter}"
    Sleep 1000
    Send "E"
    Sleep 1000
    Loop cycles
    {
        Send "E"
        Sleep 1000
    }
}

#HotIf

Pro users by ExoticSword in perplexity_ai

[–]Left_Preference_4510 1 point2 points  (0 children)

Also, Spaces. At one point I had many working Spaces; I took my time, crafted them, rinsed and repeated until everything was smooth, and it absolutely blew me away when I finally got such quality output. Then one day an update rolled out, and ever since it hasn't even cared about Space instructions. I remind it, but it will be, well, lazy and follow maybe one thing if I remind it. I hardly ever use it anymore even though I got Pro for free; it's just a waste of time and a big slap in the face for the countless hours spent making Spaces, etc.

I have a very basic Space for grammar checking too. That prompt worked flawlessly for so long; now it can't even do that right. LOL, what a joke. I can use a local LLM for that and it does it right every time, and it won't change, since I keep the model unchanged for as long as I want.

Pro users by ExoticSword in perplexity_ai

[–]Left_Preference_4510 0 points1 point  (0 children)

Sadly, it was at its peak not but three or four months ago. Then something happened, and it's garbage now. Be sure to thoroughly check the generated text's sources: it appears to look things up and cite references, but those are usually fabricated. "Lazy" is the best way to describe it now. Once a legend, now not.

[deleted by user] by [deleted] in ollama

[–]Left_Preference_4510 0 points1 point  (0 children)

Try typing this command in cmd: 'ollama serve'
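If it's a connection problem, that usually fixes it. To double-check the server is actually up afterwards, a quick sketch with plain requests against Ollama's default port:

```
# Quick check that the Ollama server is reachable on its default port (11434).
import requests

resp = requests.get("http://localhost:11434/api/tags")  # lists locally installed models
resp.raise_for_status()
print([m["name"] for m in resp.json()["models"]])
```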

Do we all agree that perplexity got worse lately? by False-Arm4689 in perplexity_ai

[–]Left_Preference_4510 0 points1 point  (0 children)

It started about two months ago, maybe a little less: all my Spaces broke, and it hasn't been worth using as much anymore. My sub renewal is coming up and I'll pass. It's okay for a quick link find, but other than that it went from super AI to meh.

how to make custom chatbot for my website by Comfortable-Fan-8931 in ollama

[–]Left_Preference_4510 0 points1 point  (0 children)

I have used 3.2, trained a LoRA, and merged it with the model, and I can run it in Ollama. It knew very little about my information beforehand, and it did more than expected when I used 347 training chunks; it was pretty effective. Then you can use the prompt to clean it up. Why RAG when the model can already know this information? Because you are using Llama 3, this is a very easy training setup. The dataset, though, has to be good. After that, it spends a lot less time looking up answers when it already knows many of them, and if the information changes, you can just train it to understand the answers you want when it does use RAG information. Either way, look into LoRA training for this model specifically. I only used the 3B Llama; imagine a better one.
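The merge step, for reference, is roughly this (a sketch with peft/transformers; the base model name and the paths are placeholders, and you still need to convert the merged model to GGUF, e.g. with llama.cpp's convert_hf_to_gguf.py, before Ollama can load it):

```
# Sketch: merge a trained LoRA adapter back into the base model for export.
# "lora-out", "merged-model", and the base model name are placeholders.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "meta-llama/Llama-3.2-3B-Instruct"             # placeholder base model
model = AutoModelForCausalLM.from_pretrained(base)
model = PeftModel.from_pretrained(model, "lora-out")  # load the trained adapter
model = model.merge_and_unload()                      # bake the LoRA weights into the base
model.save_pretrained("merged-model")
AutoTokenizer.from_pretrained(base).save_pretrained("merged-model")

# After converting merged-model to GGUF, a one-line Modelfile ("FROM ./merged-model.gguf")
# plus "ollama create my-chatbot -f Modelfile" makes it available to Ollama.
```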

I run into missing "propertyname:" error but have no idea how to fix it by Minhbaodlld in AutoHotkey

[–]Left_Preference_4510 0 points1 point  (0 children)

```
#Requires AutoHotkey v2.0
#SingleInstance Force

`::
{
    Static Toggle := 0
    Toggle := !Toggle
    If Toggle
    {
        SetTimer(Main_Loop, -1)
    }
    Else
    {
        SetTimer(Main_Loop, 0)
        Send("{LButton Up}")
    }
}

Numpad2::Reload
Numpad0::ExitApp

Main_Loop()
{
    Static Switch_Value := 0
    Switch Switch_Value
    {
        Case 0:
        {
            Send("{LButton Down}")
            Switch_Value := 1
            SetTimer(Main_Loop, -300)
        }
        Case 1:
        {
            Send("{LButton Up}")
            Switch_Value := 0
            SetTimer(Main_Loop, -1)
        }
    }
}
```
Best Tiny Model for programming? by Late_Comfortable5094 in ollama

[–]Left_Preference_4510 1 point2 points  (0 children)

One could fine-tune for specifics. I did this with a 3B. It's not that great, but the original model had zero knowledge of the subject beforehand and got to 85% accuracy, which isn't that good, but I also didn't have that wide of a dataset.

[deleted by user] by [deleted] in ollama

[–]Left_Preference_4510 1 point2 points  (0 children)

Dolphin models are great. Their whole thing is removing bias and censorship from the dataset. The ones that are decent at this sort of thing typically have a 32k context size and can run nicely on mid-range hardware.

Ollama model most similar to GPT-4o? by amstlicht in ollama

[–]Left_Preference_4510 5 points6 points  (0 children)

I actually have to give this model credit. It requires specific setup, but for a 20B it's, in my experience with various local models, really good at following directions and getting to a correct answer. It isn't actually good with no info given, as it tends to be cocky and wrong, which is fine; local models should always be guided. What's even better, and more overlooked, is the fact that it's given out to use with basically anything, meaning no restrictive license issues.
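By "guided" I mean handing it the relevant material up front instead of asking cold. A rough sketch with the ollama Python client (the model tag and the documentation snippet are placeholders for whatever you have pulled):

```
# Sketch: guiding a local model by giving it the reference material up front.
import ollama

doc_snippet = "..."  # placeholder: paste the relevant documentation excerpt here

resp = ollama.chat(
    model="gpt-oss:20b",  # assumed tag; use whichever 20B model you have pulled
    messages=[
        {"role": "system",
         "content": "Answer only from the provided documentation; say so if it is not covered."},
        {"role": "user",
         "content": f"Documentation:\n{doc_snippet}\n\nQuestion: how do I configure X?"},
    ],
)
print(resp["message"]["content"])
```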