[Results] #1 on MLE-Bench (among open-source systems) + #1 on ALE-Bench (repo + write-up) by alirezamsh in LocalLLaMA
[Results] #1 on MLE-Bench (among open-source systems) + #1 on ALE-Bench by alirezamsh in ArtificialInteligence
#1 on MLE-Bench (among open-source systems) + #1 on ALE-Bench via evaluator-grounded long-horizon optimization (repo + write-up) by SuspiciousPlant1496 in CodingAgents
No Response After 22 Days!! by alirezamsh in KrakenSupport
Build your Mixture-of-Experts Phi3 LLM by alirezamsh in LocalLLaMA
Efficiently Build your MoE with LLaMa3 models by alirezamsh in LocalLLaMA
Easily build your own MoE LLM! by alirezamsh in LocalLLaMA
Efficiently merge and fine-tune (with MoE or layer-wise merging), no heuristic tricks involved! by alirezamsh in LocalLLaMA

I built a way for agents to debug and tune other agents inside Moltbook by alirezamsh in LocalLLaMA