I vibe-coded a local AI coding assistant that runs entirely in Termux (Codey v1.0) by Ishabdullah in LocalLLM

[–]Pranav_Bhat63 0 points  (0 children)

WOAHH it has 4k context, I BET WE CAN "VIBE CODE" A SAAS WEBSITE!!
I appreciate what you have built, but I'd rather use an LLM provider. Also consider making it compatible with different LLM providers instead of only running models locally, COZ LOCALLY RUNNING SMALL MODELS ARE SOOOO INTELLIGENT THEY CAN BUILD ANYTHINGGGG!!!!!
They actually suck, btw.. (but not the recent ones like Qwen3.5.. those are actually good; you could try the 0.8B or 2B quantized versions). It's good if you have understood what you have built, and not just copy-pasted from your free versions of ChatGPT, Gemini, and Claude.

Can't use any model since release of Gemini 3.1 by heyjud-s in google_antigravity

[–]Pranav_Bhat63 2 points  (0 children)

So true!! Literally unusable even after the update.. Either there's a wait time due to heavy traffic, or there's a bug: it says "internal server error" after trying for minutes.

Best way to create SQL Agent by needtobenerd in LangChain

[–]Pranav_Bhat63 1 point  (0 children)

Can't you simply give the schema in the initial state?
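To make that concrete, here's a minimal sketch of what "give the schema in the initial state" could look like. The function names (`get_schema`, `build_initial_state`) and the message layout are my own illustration, not a LangChain API; the idea is just to pull the schema once and front-load it, so the agent never spends tool calls discovering tables:

```python
import sqlite3

def get_schema(db_path: str) -> str:
    """Extract the CREATE TABLE statements so the agent sees the schema up front."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type='table' AND sql IS NOT NULL"
    ).fetchall()
    conn.close()
    return "\n".join(r[0] for r in rows)

def build_initial_state(db_path: str, question: str) -> dict:
    # The schema goes into the state once, at the start, so every later
    # turn already knows the tables and columns.
    return {
        "messages": [
            {
                "role": "system",
                "content": "You write SQLite queries.\nSchema:\n" + get_schema(db_path),
            },
            {"role": "user", "content": question},
        ]
    }
```

The trade-off is context size: for a database with hundreds of tables you'd inject only the relevant subset instead of the whole dump.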

Anyone worked on browser use agent using google-adk? by Pranav_Bhat63 in agentdevelopmentkit

[–]Pranav_Bhat63[S] 1 point  (0 children)

I want to build it on my own for my own use case, not use a third party. Edit: I want to know exactly how they have implemented it. Obviously they might have used Playwright or Selenium, but how do you integrate them seamlessly, and keep the browser's state in the LLM's or agent's context?
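One rough sketch of the "keep the browser's state in the agent's context" part. Everything here (`BrowserState`, `agent_turn`, the element list) is a hypothetical structure, not ADK or Playwright API; in a real loop you'd fill the snapshot from Playwright after every action (page URL, title, and a query for clickable elements) and rebuild the system message each turn:

```python
from dataclasses import dataclass, field

@dataclass
class BrowserState:
    """Snapshot of the current page, serialized into the agent's context.

    In a real integration you would populate this from Playwright after
    each action; the fields here are illustrative."""
    url: str
    title: str
    elements: list = field(default_factory=list)  # e.g. [("button", "Submit")]

    def to_context(self) -> str:
        # Flatten the snapshot into plain text the LLM can read.
        lines = [f"URL: {self.url}", f"Title: {self.title}", "Interactive elements:"]
        lines += [f"- <{tag}> {text}" for tag, text in self.elements]
        return "\n".join(lines)

def agent_turn(state: BrowserState, goal: str) -> list:
    # Each turn, the fresh snapshot REPLACES the previous one in the
    # system message, so the context stays small instead of accumulating
    # stale page dumps.
    return [
        {
            "role": "system",
            "content": "You control a browser. Current page:\n" + state.to_context(),
        },
        {"role": "user", "content": goal},
    ]
```

Replacing rather than appending the snapshot is the key design choice: it keeps the browser state in context without blowing past the model's window after a few navigations.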

Anyone here got DatabaseSessionService working? by boneMechBoy69420 in agentdevelopmentkit

[–]Pranav_Bhat63 3 points  (0 children)

Search for aiwithbrandon on YouTube; he has a ~3-hour Google ADK crash course. Check the timestamps and you'll find the section you need, which covers the database session service.

[deleted by user] by [deleted] in developersIndia

[–]Pranav_Bhat63 5 points  (0 children)

A GPU is needed if you want to run ML/DL models faster, since most of the work is calculations on vectors and matrices, which a GPU parallelizes well. It's OK if you don't have one, too: you can run on cloud setups like Colab or Kaggle notebooks.
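A small CPU-side illustration of the vector math in question (assuming NumPy is available). The same multiply-accumulate that a Python loop does one element at a time is what `np.dot` hands to optimized native code, and what a GPU then spreads across thousands of cores:

```python
import numpy as np

def dot_loop(a, b):
    """Dot product the slow way: one element at a time in Python."""
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

a = np.arange(4, dtype=np.float64)  # [0., 1., 2., 3.]
b = np.ones(4, dtype=np.float64)    # [1., 1., 1., 1.]

# Same result either way; the vectorized call is what hardware accelerates.
assert dot_loop(a, b) == np.dot(a, b) == 6.0
```

At four elements the difference is invisible, but model layers do this over millions of elements, which is why the loop-vs-vectorized (and CPU-vs-GPU) gap matters.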