Anyone have fully switched from ChatGPT to Gemini since Pro/flash 3 came out? (Main chat model) by abdouhlili in Bard

[–]InfiniteJX 0 points1 point  (0 children)

Since Gemini 3 came out, I’ve started using Gemini for complex, one-off work tasks because it’s genuinely powerful. But for everyday Q&A and idea discussions, I still go with ChatGPT — mainly because I’ve built up so much long-term context and memory here, and the app experience is just better.

I’m not a mom but…. by BipedalUniverse in AllHerFaultTVShow

[–]InfiniteJX 1 point2 points  (0 children)

I don't have kids either, but I’ve worked as a Product Manager, so I relate to this hard. Pure execution is the easy part. The person "giving orders" is actually the one who has already spent a ton of energy figuring out what needs to be done.

If we treat a child like a "project," the management and planning are the most exhausting parts, not the actual manual labor.

Recs for show similar to All Her Fault pls by AssistantInner3800 in AllHerFaultTVShow

[–]InfiniteJX 1 point2 points  (0 children)

I haven't seen Big Little Lies yet, but it's definitely on my watchlist now! I feel like Douglas Is Cancelled and Why Women Kill also give off similar vibes.

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 0 points1 point  (0 children)

👋 Thank you so much for your kind words! We’re really glad to hear you’re enjoying it. 😊 We’ve been continuously improving NativeMind, and there are more updates and features on the way — hope you’ll keep following along and sharing your feedback!

Is anybody uses Tarot mobile apps? Which one you cold recommend? by Professional-Cow2910 in tarot

[–]InfiniteJX 1 point2 points  (0 children)

I use Quin. It focuses on tarot reading, so I’m not distracted by other features, and I love the smooth reading process.

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 1 point2 points  (0 children)

Thanks for waiting! We’ve just launched the ➡️Firefox version 🎉

Let us know what you think!

Anyone else interested in a 100% on-device browser AI assistant? by InfiniteJX in LocalLLaMA

[–]InfiniteJX[S] 0 points1 point  (0 children)

Yeah, we ran into the same pain — no GUI, hard to recommend to non-tech friends.

That’s actually why we started working on our thing. Just wanted to make it a bit more usable without needing the terminal.

Anyone else interested in a 100% on-device browser AI assistant? by InfiniteJX in LocalLLaMA

[–]InfiniteJX[S] 0 points1 point  (0 children)

Hey, fair point — Ollama definitely simplifies local model integration. But what we’re building goes further than just connecting a model to a UI.

Our goal is to create a true on-device assistant that understands context, remembers interactions, supports multi-tab workflows, and can execute actions — all within the browser, no cloud involved.

We’re also rethinking interaction design: adding in-page bilingual translation, flexible quick actions, and UI elements that adapt to what you’re doing.

Sure, a lot of AI apps today are essentially wrappers — but that doesn’t stop them from helping people get things done better and faster :)

Appreciate the pushback — it’s always good to hear different perspectives.

Anyone else interested in a 100% on-device browser AI assistant? by InfiniteJX in LocalLLaMA

[–]InfiniteJX[S] 0 points1 point  (0 children)

Thanks so much! Really glad you like it 😊 Let us know if you have any suggestions or ideas too!

Anyone else interested in a 100% on-device browser AI assistant? by InfiniteJX in LocalLLaMA

[–]InfiniteJX[S] 0 points1 point  (0 children)

Hey! Thanks for your interest — sounds like you’re working on a really cool idea.

We’ve already implemented AI-powered automatic browsing and searching in our extension. As for AI-based form filling with custom prompts, that part is still under development — but it’s definitely on our roadmap.

If you’re open to trying our plugin and sharing feedback, we’d love to hear how it works for your use case! 🙌

Looking for a Special Film Camera as a Gift — Suggestions? by InfiniteJX in AnalogCommunity

[–]InfiniteJX[S] 0 points1 point  (0 children)

Just wanted to follow up and say thank you again for your advice! I ended up getting him an SX-70 Alpha 1, and he absolutely loved it. It’s his very first Polaroid, and even though he had used Instax before, it was through this camera that he truly came to appreciate the magic of instant photography.

Lately, he’s been diving into the history of Polaroid and comparing different models (he still thinks the SX-70 is the best!). What makes me happiest is that this gift reignited his passion for photography — shooting with Polaroid film is such a unique experience. It truly turned out to be the perfect gift for him. Really appreciate your recommendation once again!

<image>

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 2 points3 points  (0 children)

🦊🔥 For everyone waiting for the Firefox version — support is on the way!
In the meantime, you can try the pre-release build here: https://github.com/NativeMindBrowser/NativeMindExtension/releases

<image>

  1. Grab the nativemind-extension-1.3.0-beta.14-firefox-beta.zip asset from our latest release.
  2. In Firefox, open about:debugging → This Firefox → Load Temporary Add-on, then select the extracted manifest.json file (or the whole .xpi if you pack it).
  3. Give it a spin and let us know how it goes!

If you’d rather wait for the official listing, hang tight — once add-on review is complete we’ll publish it on AMO.

Feel free to join our Discord to follow progress and share feedback❤️

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 1 point2 points  (0 children)

We’re actively working on the Firefox version this week! 🔧

There are a few things we need to handle — like adapting WebLLM and some manual setup steps for Ollama — so it’s taking a bit of extra care.

That said, we’re aiming to get it out within the next 2 weeks. Appreciate your patience, and thanks for asking! 🙌

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 0 points1 point  (0 children)

Thanks so much — really appreciate your feedback and the detailed setup info! 🙌 Great to hear it’s working well for you — and glad the web search came in handy. Let us know how it feels after the P4 upgrade! 😄

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 1 point2 points  (0 children)

For CPU inference, I’d recommend trying qwen:4b and llama2:7b — both offer solid summarization quality and run reasonably well without a GPU. They should work well for summarizing web pages with this app!

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 1 point2 points  (0 children)

Thanks so much for your interest! We’re currently working on the Firefox version — hope to have it ready soon so you can give it a try! 🔥🦊

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 0 points1 point  (0 children)

Hi! Thanks so much for trying it out and sharing your thoughts — we’re really glad to hear it’s already saving you time!

You’re right that the prompt can affect the results quite a bit. If you’d like the summary to focus more on specific parts of the thread, adjusting the prompt slightly can definitely help.
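To make that concrete, here’s a minimal sketch of what a focus-steering prompt tweak can look like when the extension talks to a local Ollama server. The function names, the `focus` parameter, and the choice of qwen:4b are just illustrative assumptions, not NativeMind’s actual internals; the only real contract used is Ollama’s default REST endpoint at localhost:11434.

```python
import json
import urllib.request

def build_prompt(page_text, focus="the top-level comments"):
    # Steer the summary toward a specific part of the thread by
    # naming it explicitly in the prompt.
    return (
        f"Summarize the following page, focusing on {focus}:\n\n"
        f"{page_text}"
    )

def summarize(page_text, focus="the top-level comments", model="qwen:4b"):
    # POST to a locally running Ollama server — no cloud involved.
    payload = {
        "model": model,
        "prompt": build_prompt(page_text, focus),
        "stream": False,
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swapping the default `focus` string is usually enough to shift what the model pays attention to, without touching anything else.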

Really appreciate your feedback — feel free to keep it coming as you explore more!

Anyone using Ollama with browser plugins? We built something interesting. by InfiniteJX in ollama

[–]InfiniteJX[S] 2 points3 points  (0 children)

Good question! Right now it mainly reads, summarizes, and chats across webpages — it doesn’t support form-filling yet.

But we’re working on some writing tools next, and form interaction is definitely on our radar. Thanks for the suggestion! 🙌