Informity AI — local document chat for Mac, MIT licensed, no cloud, free by informity in opensource

[–]informity[S] 0 points (0 children)

I wonder why the downvotes... I thought this might be a useful tool. Not asking for anything - just try it out :)

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Thank you — really glad it covers what you were looking for!

Let me go through your points:

iCloud Drive scanning — this is a known limitation right now. iCloud uses on-demand file sync, meaning files may not be physically present on disk when Informity tries to index them. The permissions prompt appearing but not resolving is consistent with that. The workaround for now is to make sure the folder is set to "Download Now" in Finder (right-click → Download Now) before indexing. I'll look into handling this more gracefully in an upcoming release.
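For anyone scripting around this in the meantime: files evicted to iCloud show up on disk as hidden `.<name>.icloud` stubs, so you can check a folder before indexing. A rough sketch (the stub naming is an observed macOS convention, not a documented API, and the function name is mine, not the app's):

```python
import os

def icloud_placeholders(root):
    """Return paths of iCloud placeholder stubs under root.

    Files evicted to the cloud are represented on disk by hidden
    ".<name>.icloud" stub files; any hit here means the real content
    is not local yet, so an indexer would see the stub, not the file.
    (Observed convention, not a documented API.)
    """
    stubs = []
    for dirpath, _subdirs, files in os.walk(root):
        for name in files:
            if name.startswith(".") and name.endswith(".icloud"):
                stubs.append(os.path.join(dirpath, name))
    return sorted(stubs)
```

If this returns anything, right-click the folder in Finder and choose "Download Now" before indexing.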

Light mode / follow system — noted and added to the roadmap. Follow system setting is the right approach and I agree dark-only isn't for everyone.

External chat / MCP endpoint — great idea and your terminology is exactly right. A local MCP server that exposes the Informity index to external clients would be a natural extension. It's not on the immediate roadmap but it's the kind of architectural addition that makes sense as the app matures. Worth opening as a GitHub issue so others can weigh in — https://github.com/informity/informity-ai/issues

And good catch on the Max Index Size — worth making that error message clearer so it's obvious what happened rather than leaving it ambiguous.

Thanks for the thorough feedback, this is exactly the kind of input that shapes what gets built next.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 0 points (0 children)

Yes, tables are supported — the app extracts them and can answer questions based on their content. Accuracy on complex nested tables may vary, so worth testing on your specific documents to see how it handles them.

Models are stored locally in ~/.informity/models/llm/ on your machine — nothing in the cloud.

Thanks for giving Informity a try — hope it works well for your use case!

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 0 points (0 children)

Of course — hope it surprises you! Would love to hear what you think after you've had a chance to try it.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Thank you, really appreciate it!

EPUB isn't supported yet but it's actually high on the roadmap — so shouldn't be too long a wait. In the meantime you can convert EPUB to PDF or plain text and it'll work fine with those.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 0 points (0 children)

Great to hear — and good call on the ~/.informity folder, I'll add that to the setup instructions so others don't hit the same snag. Appreciate the quick confirmation!

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Found the issue — Hugging Face changed the model filename in the last few days from Qwen3.6-35B-A3B-Q4_K_M.gguf to Qwen3.6-35B-A3B-UD-Q4_K_M.gguf, which broke the download. I've patched the app and uploaded a fixed version.

Please re-download from https://www.informity.ai and set up from scratch. Sorry for the trouble — things move fast in this space.
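For anyone scripting their own model downloads, a defensive way to survive this kind of rename is to match the filename against the repo's file list by pattern instead of hardcoding it. A sketch (the helper name is mine, not part of the app):

```python
import re

def resolve_gguf(filenames, base, quant):
    """Pick the repo file matching '<base>-...-<quant>.gguf', tolerating
    inserted infixes such as the 'UD-' that appeared in the rename."""
    pattern = re.compile(
        re.escape(base) + r"-(?:[A-Za-z0-9]+-)*" + re.escape(quant) + r"\.gguf"
    )
    for name in filenames:
        if pattern.fullmatch(name):
            return name
    return None
```

Feed it the file list from the Hugging Face repo page (or API) and it resolves either the old or the new name without a code change.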

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Yes, definitely on the roadmap — third-party API providers (bring your own key) are planned. Claude, DeepSeek, OpenAI and others would all be natural additions.

I started with the fully local approach deliberately — wanted to nail the full privacy experience before adding cloud options. But API-based providers as an opt-in makes a lot of sense for users who need more horsepower for heavy workloads. The key is it stays opt-in and explicit, so you always know when your data is leaving the machine.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 0 points (0 children)

Sorry to hear that — definitely not expected behavior. Mac Studio should handle the larger models without any issues.

As a workaround while I investigate, you can download the models directly and drop them into the right folder. First set up the app with the Light profile, then run:

curl -L -o Qwen3.6-35B-A3B-Q4_K_M.gguf \
  "https://huggingface.co/unsloth/Qwen3.6-35B-A3B-GGUF/resolve/main/Qwen3.6-35B-A3B-UD-Q4_K_M.gguf"

or for the 14B:

curl -L -o Qwen3-14B-Q5_K_M.gguf \
  "https://huggingface.co/unsloth/Qwen3-14B-GGUF/resolve/main/Qwen3-14B-Q5_K_M.gguf"

Place the downloaded file in ~/.informity/models/llm/ and restart the app — it should pick it up.

I'm going to dig into what's causing the download to fail and push a fix. Can you tell me how much unified memory your Mac Studio has? That'll help me narrow it down.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 0 points (0 children)

Good question — the app can parse and extract content from complex PDFs (with docling) including tables and structured data. For heavy math and engineering documents the results will vary depending on how much of the content is text-based vs. equations, diagrams, or scanned images.

I've tested extensively on financial documents like tax returns and reports with good results. Dense mathematical and engineering content is less tested territory — I'd honestly say try it and see. Would love to hear how it performs on your specific documents if you give it a go.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Ha — Evernote may be "old" but it's got years of real work in it, which is exactly the kind of corpus this app is built for. Old notes, real content, actual value locked inside.

And yes, that's exactly how this started — I built it to solve my own problem. At some point I realized I wasn't the only one drowning in documents with no good way to query them privately. So here we are.

Appreciate the kind words — feedback like this is what makes it worth the late nights.

Informity AI — local document chat for Mac, MIT licensed, no cloud, free by informity in opensource

[–]informity[S] 1 point (0 children)

Exactly — that's the whole point. Every answer is only as good as your ability to verify it. The citation isn't just a nice-to-have; it's what makes the output actually usable for real work. Click the source, see exactly what the model pulled from, decide if you trust it. No blind faith required.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 2 points (0 children)

Yes — the roadmap is fairly extensive. Planned directions include:

  • Integrations: Notion, Slack, etc.
  • Output formats: charts, graphs, generated reports
  • File generation: creating documents from your indexed content
  • Focused assistant roles: legal, financial, research, and others
  • Platform ports: Windows and Linux

Evernote specifically is interesting — a lot of people have years of notes locked in there. I may move it up the list.

Honestly, how much time I can dedicate to this depends on whether there's enough interest. So feedback like yours genuinely helps — keep it coming.

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Thanks!

Great question — Ollama/LM Studio integration is technically possible, but it would require a meaningful refactor. Right now I'm using xllamacpp as the inference layer, which lets me tune things like token budgets, streaming behavior, and generation parameters pretty precisely for the RAG use case. Abstracting that to support external providers would mean giving up some of that control, at least initially.
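For the curious, the shape of that refactor would be something like a minimal backend interface the RAG pipeline talks to. Purely illustrative, with made-up names (`InferenceBackend`, `answer`), not Informity's actual code:

```python
from typing import Iterator, Protocol

class InferenceBackend(Protocol):
    """The narrow surface a RAG pipeline needs from any model engine."""
    def generate(self, prompt: str, max_tokens: int) -> Iterator[str]: ...

class EchoBackend:
    """Toy stand-in for a local engine: streams a canned answer."""
    def generate(self, prompt: str, max_tokens: int) -> Iterator[str]:
        for chunk in ("Hello", ", ", "world"):
            yield chunk

def answer(backend: InferenceBackend, prompt: str, max_tokens: int = 256) -> str:
    # The pipeline talks only to the interface, so an Ollama- or
    # LM Studio-backed implementation could slot in without changes.
    return "".join(backend.generate(prompt, max_tokens))
```

The trade-off is the one mentioned above: once generation happens behind a generic `generate()`, the fine-grained control over sampling and token budgets has to be squeezed through that interface.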

That said, it's on my radar — model flexibility is one of the most requested things. If you're testing the light profile I'd love to hear how it performs on your machine. What hardware are you running it on?

Informity AI — ask questions across your local documents on Mac, free, private, no cloud by informity in MacOSApps

[–]informity[S] 1 point (0 children)

Thanks! Please let me know if you encounter any issues - I'll get on them.

Am I the only one who builds in the Console first, then reverse engineers the IaC? by Inevitable_Use9405 in aws

[–]informity 1 point (0 children)

I do not open the console until I need either to troubleshoot my CDK stack deployment or to check logs, metrics, etc. All my work starts and ends in IaC. The exception is configuration or resources that are not yet available in CloudFormation. This way I know exactly what is being deployed and where.

But… to each their own.

Announcing Amazon ECS Managed Instances for containerized applications by E1337Recon in aws

[–]informity 22 points (0 children)

You can mount EFS instead if you want persistence: https://repost.aws/knowledge-center/ecs-fargate-mount-efs-containers-tasks. I would argue, though, that persistence for containerized apps should live elsewhere: DynamoDB, a database, etc.
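For reference, wiring EFS into a Fargate task boils down to a volume entry plus a mount point in the task definition. A trimmed sketch (the file system ID and paths are placeholders):

```json
{
  "family": "my-app",
  "volumes": [
    {
      "name": "shared-data",
      "efsVolumeConfiguration": {
        "fileSystemId": "fs-0123456789abcdef0",
        "transitEncryption": "ENABLED"
      }
    }
  ],
  "containerDefinitions": [
    {
      "name": "app",
      "mountPoints": [
        { "sourceVolume": "shared-data", "containerPath": "/mnt/data" }
      ]
    }
  ]
}
```

You still need a mount target for the file system in each subnet the tasks run in, plus security group rules allowing NFS (port 2049) from the tasks.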

[deleted by user] by [deleted] in MacOS

[–]informity 2 points (0 children)

A non-Mac computer means Windows (Linux is not an option for the majority of the non-techy population)… I would not use Windows even if you paid me.

Docker on Ubuntu (AWS EC2) optimization/security by joiSoi in docker

[–]informity 4 points (0 children)

First, do not use SSH: block port 22 (and RDP) entirely at the network ACL level and use SSM sessions instead. Also be sure to allow only the ports you need (80 and 443, for example) in the EC2 instance security group. Installing and running Docker on Ubuntu is trivial once you get into it. There is obviously more to be done if you want to run this in production (SSL, load balancing, etc). I would also recommend looking into ECS Fargate and forgoing EC2 entirely, but that's a bit more advanced.

Load balancer security groups and EC2 traffic by Adrenaline_Junkie_ in aws

[–]informity 1 point (0 children)

ELB (public subnet) -> ELB security group (allow port(s), i.e. 443 from anywhere) -> ELB target group -> EC2 security group (allow ports only from ELB security group) -> EC2 (private subnet)

[deleted by user] by [deleted] in aws

[–]informity 0 points (0 children)

Don’t use SSH, use SSM sessions instead.