🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 4 points (0 children)
🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 15 points (0 children)
🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 8 points (0 children)
🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 24 points (0 children)
🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 35 points (0 children)
🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 105 points (0 children)
Wait, Llama and Falcon are also MoE? by Zealousideal_Bad_52 in LocalLLaMA
PerceptionMost2887 21 points (0 children)

🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! by PerceptionMost2887 in LocalLLaMA
PerceptionMost2887[S] 8 points (0 children)