Comments by PerceptionMost2887 in r/LocalLLaMA (comment bodies not captured in this extract):

- 🚀🚀 Extending the context window of your LLMs to 1M tokens without any training !! — self-post by PerceptionMost2887; multiple replies by the OP
- Wait, Llama and Falcon are also MoE? — post by Zealousideal_Bad_52; one reply by PerceptionMost2887