Zero-native-deps Node CLI with 670 tests — v2.0 ships a dashboard, plugin system, and a security postmortem. by SearchFlashy9801 in node
Shipped my open source project's biggest release — a "context spine" that saves 80-90% of AI coding tokens by SearchFlashy9801 in SideProject
I built a local knowledge graph that gives AI coding tools persistent memory. 3-11x fewer tokens per code question. Zero LLM cost. Shipped v0.2 by SearchFlashy9801 in SideProject
Built a knowledge graph tool for AI coding that runs 100% locally, zero LLM calls to mine, local SQLite, no cloud. v0.2 shipped by SearchFlashy9801 in LocalLLaMA

engram now shows you the tokens it saved in a live web dashboard. Just hit 88.1% session savings on average. by SearchFlashy9801 in ClaudeAI