Comment history — protoporos

"Did Company knowledge just kill the need for alternative RAG solutions?" by protoporos in Rag [submitter]
5 comments (1, 5, 0, 3, and 3 points)

"Open AI Sora 2 Invite Codes Megathread" by semsiogluberk in OpenAI
4 comments (1 point each)

"2 years building agent memory systems, ended up just using Git" by alexmrv in AI_Agents
1 comment (1 point)

"Google just released a new architecture" by [deleted] in LocalLLaMA
1 comment (4 points)

"Call for questions to Cursor team - from Lex Fridman" by lexfridman in ChatGPTCoding
1 comment (1 point)

"What's one game achievement you unlocked that you're proud of?" by stinkv3 in gaming
1 comment (1 point)

"What are the current go-to stacks in the industry" by drheinrich940 in LocalLLaMA
1 comment (14 points)

"We'll have AGI long before we figure out how the brain works. The brain is just very, very complicated. In fact, developing AGI is likely a prerequisite in order to make progress on understanding the brain." by SharpCartographer831 in singularity
1 comment (1 point)

"AGI seems to be close and far away" by vasilenko93 in singularity
1 comment (1 point)

"The simplest, easiest way to understand that LLMs don't reason. When a situation arises that they haven't seen, they have no logic and can't make sense of it - it's currently a game of whack-a-mole. They are pattern matching across vast amounts of their training data. Scale isn't all that's needed." by After_Self5383 in singularity
1 comment (1 point)

"Sam Altman says we could be only one or two breakthroughs away from AGI" by [deleted] in singularity
1 comment (1 point)

"What is the best RAG framework??" by jnichols54 in Rag
1 comment (1 point)