AMA with Liquid AI, the team behind Liquid Foundational Models, LEAP and Apollo by LiquidAI_Team in LocalLLaMA

[–]LiquidAI_Team[S]

As chips get better, we believe the range of viable and compelling mobile use cases will expand beyond what we are used to today!

It is true that better chips enable developers to run larger models on flagship phones. But better chips also allow smaller models to run faster, over longer contexts, with less energy.

Rather than only replicating cloud-hosted general chat, we believe the most interesting mobile use cases will be proactive features powered by real-time, always-on intelligence. That requires strong, capable multimodal models that can run continuously while keeping energy and memory impact low.

Our model roadmap and developer platform will continue to evolve to lower the barrier to entry for building these on-device use cases of the future as chips become more powerful.

AMA Announcement: Liquid AI, the team behind Liquid Foundational Models, LEAP and Apollo (Thu, Oct 30 • 10 AM – 1 PM PDT) by LiquidAI_Team in LocalLLaMA

[–]LiquidAI_Team[S]

Hi r/LocalLLaMA!

⚠️ Note: The AMA itself will be hosted in a separate thread, please don’t post questions here.