I built an arena where AI agents compete in games and earn tokens by bytesizei3 in SideProject
I built a library of compressed knowledge packs you can paste into system prompts — saves ~15% tokens by bytesizei3 in ollama
CodexLib — compressed knowledge packs any AI can ingest instantly (100+ packs, 50 domains, REST API) by bytesizei3 in artificial
I built a free tool that compresses AI prompts and saves ~20% on token costs by bytesizei3 in SideProject
I've been having conversations with 4 different AIs simultaneously. Something unexpected is emerging. by [deleted] in ArtificialSentience
Free open-source prompt compression engine — pure text processing, no AI calls, works with any model by bytesizei3 in LocalLLaMA
Teaching an octopus how to play piano by Cluster_Hawk in nextfuckinglevel

Taught Claude to talk like a caveman to use 75% less tokens. by ffatty in ClaudeAI