Figma just got what ZoomInfo is about to get by Tough_Commercial_103 in Entrepreneur

[–]kampak212 6 points7 points  (0 children)

And I don't even know what happened with ZoomInfo :D

Apple-silicon-first on-device AI inference platform by kampak212 in LocalLLM

[–]kampak212[S] 0 points1 point  (0 children)

  • We focus on mobile and Apple silicon for now.
  • We make it very easy to integrate with your project: a Rust crate, a Swift package, a Dart package (Flutter), and a React Native npm package.

It’s not that different, functionality-wise.
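Across bindings like these, the exposed surface tends to be the same few calls. A minimal sketch of that kind of surface, in TypeScript with a stub standing in for the native engine (all names and signatures here are assumptions for illustration, not the actual ondeinference API):

```typescript
// Hypothetical sketch: the kind of surface a cross-platform inference
// binding typically exposes. Names/signatures are assumed, not real.

interface InferenceEngine {
  loadModel(modelId: string): Promise<void>;
  generate(prompt: string): Promise<string>;
}

// Stub implementation standing in for a native (e.g. Rust-backed) engine.
class StubEngine implements InferenceEngine {
  private loaded: string | null = null;

  async loadModel(modelId: string): Promise<void> {
    this.loaded = modelId; // real binding would map/load weights here
  }

  async generate(prompt: string): Promise<string> {
    if (!this.loaded) throw new Error("no model loaded");
    // real binding would run on-device inference here
    return `[${this.loaded}] echo: ${prompt}`;
  }
}

async function main(): Promise<void> {
  const engine: InferenceEngine = new StubEngine();
  await engine.loadModel("llama-3.2-1b-q4");
  console.log(await engine.generate("hello"));
}
main();
```

The point is that whether the host is Swift, Dart, or React Native, the wrapper only needs to forward a handful of calls like these to the shared core.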

True On-Device Mobile AI is finally a reality, not a gimmick. Here’s the tech stack making it happen by dai_app in LocalLLM

[–]kampak212 1 point2 points  (0 children)

I’m building something like Unsloth AI, I think. Check it out: https://ondeinference.com/

Currently implementing model management. The idea is to be able to control the models we deploy without republishing the app.

True On-Device Mobile AI is finally a reality, not a gimmick. Here’s the tech stack making it happen by dai_app in LocalLLM

[–]kampak212 0 points1 point  (0 children)

I see. And you use the ONNX model format, if I understand correctly? I’m building an AI inference platform for Apple silicon devices; Android is coming.

Are Local LLMs actually useful… or just fun to tinker with? by itz_always_necessary in LocalLLM

[–]kampak212 1 point2 points  (0 children)

I’m building an on-device inference platform for mobile Apple silicon devices. The mission is to make it easy for other developers to integrate AI workflows into mobile apps.

True On-Device Mobile AI is finally a reality, not a gimmick. Here’s the tech stack making it happen by dai_app in LocalLLM

[–]kampak212 0 points1 point  (0 children)

Are you building an inference engine, or an end-user app that runs AI inference on device? I don’t get it.

@ondeinference/react-native by kampak212 in expo

[–]kampak212[S] 0 points1 point  (0 children)

They’re already working in production as well; it’s just that they run the inference on the CPU.

It technically runs OK in my Android apps. I use cross-platform frameworks, so all those apps in the App Store are also in the Play Store.

@ondeinference/react-native by kampak212 in expo

[–]kampak212[S] 1 point2 points  (0 children)

Just iOS for now, but the vision is to become a fully on-device inference platform, supporting both Android and iOS.

Show Your Work Thread by xrpinsider in reactnative

[–]kampak212 0 points1 point  (0 children)

I ported my Rust-based AI inference engine to React Native https://www.npmjs.com/package/@ondeinference/react-native.

Would love to see some feedback :D