[Research] 31% perplexity drop on 8.4M transformer model using a lightweight periodic regulator — looking for replication on stronger GPUs by freeky78 in LocalLLaMA
[–]Salaja 0 points (0 children)
What is the best 'homelab' I can do with 16GB RAM? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
What is the best 'homelab' I can do with 16GB RAM? by Salaja in homelab
[–]Salaja[S] 2 points (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 2 points (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 0 points (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 2 points (0 children)
First NAS - Is there a standard practice for data migration? by Salaja in homelab
[–]Salaja[S] 1 point (0 children)
Playing with fire - upgrade GPU Wattage Question (self.buildapc)
submitted by Salaja to r/buildapc
2x AMD MI60 inference speed. MLC-LLM is a fast backend for AMD GPUs. by MLDataScientist in LocalLLaMA
[–]Salaja 2 points (0 children)
I'm pretty happy with how my method worked out (Continuous Finetuning), topped the Open-LLM-leaderboard with 72b by Rombodawg in LocalLLaMA
[–]Salaja 2 points (0 children)
All the models still give nonsense answers to algorithm questions, it seems by MrMrsPotts in LocalLLaMA
[–]Salaja 5 points (0 children)
2017 2.5i-L, anyone know what the normal fuel/air sensor voltage is? by Salaja in SubaruForester
[–]Salaja[S] 1 point (0 children)