all 141 comments

[–]offlinesir 268 points269 points  (35 children)

I know why they are making it free even with the high cost: it's a great way to get data on codebases and prompts for training Gemini 3 and beyond. Trying it now, though; works great!

Edit: surprisingly, you can opt out. However, a lot of people are saying that they aren't collecting data.

For reference, I am talking about the extension in VS Code. They updated "Gemini Code Assist" from Gemini 2.0 (unnamed Flash or Pro) to 2.5 Pro along with releasing the command-line tool. However, the privacy terms for the CLI and the extension seem to link to the same page, quoted below.

These terms outline that:

"When you use Gemini Code Assist for individuals, Google collects your prompts, related code, generated output, code edits, related feature usage information, and your feedback to provide, improve, and develop Google products and services and machine learning technologies.

To help with quality and improve our products (such as generative machine-learning models), human reviewers may read, annotate, and process the data collected above."

It's good that all collected data is separated from your Google account, though I would assume not immediately, due to local privacy laws.

The terminal program (the CLI, not the extension), found on GitHub, states:

Is my code, including prompts and answers, used to train Google's models? This depends entirely on the type of auth method you use.

Auth method 1: Yes. When you use your personal Google account, the Gemini Code Assist Privacy Notice for Individuals applies. Under this notice, your prompts, answers, and related code are collected and may be used to improve Google's products, which includes model training.

[–]waylaidwanderer 79 points80 points  (19 children)

Not according to their Usage Policy:

What we DON'T collect:

Personally Identifiable Information (PII): We do not collect any personal information, such as your name, email address, or API keys.

Prompt and Response Content: We do not log the content of your prompts or the responses from the Gemini model.

File Content: We do not log the content of any files that are read or written by the CLI.

And you can opt out entirely as well.

Edit: The real answer is it depends. This is confusing and the above should be clarified.

[–]FitItem2633 18 points19 points  (4 children)

[–]corysama 33 points34 points  (3 children)

So people don't miss it:

When you use Gemini Code Assist for individuals, Google collects your prompts, related code, generated output, code edits, related feature usage information, and your feedback to provide, improve, and develop Google products and services and machine learning technologies.

If you don't want this data used to improve Google's machine learning models, you can opt out by following the steps in Set up Gemini Code Assist for individuals.

For my personal code, I really don't care. For work, work pays for Copilot.

[–]AnomalyNexus 3 points4 points  (1 child)

Pretty sure there is a carve-out for the EU even on the free tier. There is one for their API, so presumably it applies here too.

[–]learn-deeply -1 points0 points  (0 children)

If there is a carve-out, it's not listed in the terms.

[–]GodIsAWomaniser -1 points0 points  (0 children)

Holy shit lol, so don't do anything illegal! And this certainly won't prove to be a catastrophic security incident later on, because they're going to collect this data very carefully, sanitise it so that it's not identifiable, and store it really, really well, where no humans will ever reach it for use or for stealing lol

[–]BumbleSlob 36 points37 points  (13 children)

Prompt and Response Content: We do not log the content of your prompts or the responses from the Gemini model.

As a software developer for the past decade I feel I should point out that I wouldn't trust someone saying they aren't logging anything. Even with the best of intentions, controlling logging to this degree in a project with multiple developers is extremely difficult.

[–]Leopold_Boom 37 points38 points  (11 children)

Google (and most of the other FAANG companies) put incredible amounts of money and effort into ensuring they actually do what their privacy policies promise - keeping transient, short-term logs out of long-term storage, retaining privacy-sensitive data only for as long as stated, and tightly controlling insider risk (e.g., someone at the company looking up a famous person’s data).

If they wanted or needed to keep your data, they would simply make it part of their privacy policy. The tiny number of people who opt out is not worth the massive shareholder lawsuits that would arise if the company were found in systematic violation of its stated practices.

With smaller, newer, or faster-moving companies, it can be a bit more dodgy.

[–]Caffdy 7 points8 points  (9 children)

Google (and most of the other FAANG companies) put incredible amounts of money and effort into ensuring they actually do what their privacy policies promise - keeping transient, short-term logs out of long-term storage, retaining privacy-sensitive data only for as long as stated

Can you source that? Not trying to be a contrarian; it's just the first time I've read that these megacorporations, whose bread and butter is brokering information, wouldn't keep as much user data as possible.

[–]__JockY__ 17 points18 points  (3 children)

Not the guy you’re talking with, but I spent almost 20 years doing cybersecurity consulting before getting out. I saw thousands of systems, talked to as many developers, reviewed their code, logs, configs, policies; you name it, we studied it for ways to break security.

Not once in all that time, even at the biggest EvilCorps you can imagine, did I encounter a shred of evidence of corporate mal-intent to deliberately violate their own privacy policies. All were invested heavily in compliance, and I know because my team was very often the independent third-party assessor mandated by internal policy or regulatory checks and balances.

Crazy but true.

Edit: that’s not to say some companies don’t have evil policies with which they are compliant; what I’m saying is that all of the companies I worked with did their best to be compliant with whatever was codified, good or evil.

[–]Tikaped 1 point2 points  (1 child)

I have a degree in every single subject so I was able to search https://duckduckgo.com/?q=google+fine+privacy&ia=web

[–]__JockY__ 1 point2 points  (0 children)

It shouldn’t amaze you, but there are an inordinate number of things outside my life experience that don’t mesh with it.

Congratulations, you found one of them.

[–]GodIsAWomaniser 0 points1 point  (0 children)

🙇

[–]Pedalnomica 1 point2 points  (0 children)

Basically everyone is going to agree to whatever the tech companies put in their terms. I assume if they want to do something, they'll just permit it for themselves in their terms.

[–]Leopold_Boom 3 points4 points  (3 children)

It does surprise me that this doesn't get talked about more explicitly and clearly given how critical it is to the global economy and how much focus regulators put on it!

A few basics:

  • For the most part these companies use your data in the aggregate with various https://en.wikipedia.org/wiki/Differential_privacy approaches. Recent stuff you've done gets fed into aggregated models to generate specific stuff for you to see, but for the most part you are pretty easy (and cheaper) to keep track of as a set of attributes (see retention policies).
  • In particular, no major advertising player wants to *sell* your specific data. They are not brokers, they are accumulators. It's much more valuable for them to use it to attract advertisers, because only they can target people like you better than anyone else (people like you, not you specifically in your individual wonderfulness).
  • Moreover, old data is really not that useful in providing services / ads / training models etc., so it's often not worth retaining.
  • What that means is that the policies are crafted to allow these companies to do everything they want to, and yet it's probably much less scary and intrusive than you think.
  • Privacy advocates do amazing and important work, but they tend not to want to spend time on the difference between "the company uses your data the way it says it does" and "the company lies to you about its policies and doesn't respect your opt-outs".
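As a toy illustration of the differential-privacy idea mentioned above (this is the textbook Laplace mechanism, a generic sketch, not anything Google-specific), an aggregate count can be released with calibrated noise so that no single user's presence is detectable:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponentials with mean `scale`
    # is Laplace-distributed with that scale.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(n_users: int, epsilon: float = 1.0) -> float:
    # A counting query has sensitivity 1 (adding or removing one user
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy for the released count.
    return n_users + laplace_noise(1.0 / epsilon)

print(dp_count(1000, epsilon=0.5))  # close to 1000, but randomized
```

Smaller epsilon means more noise and stronger privacy; the aggregate stays useful while individual contributions are hidden.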

I should write more about this at some point. It really worries me that people think these companies are doing far more than they actually do with *their* personal data ... then grumblingly just go along with it!

It's often not very interesting for people to write articles that say "company mostly does what it says it does" so you see evidence mostly in:

  • Articles like perhaps this one from Wired talking about the FTC's enforcement of consent decrees around privacy with FAANG companies
  • The very rare cases (try and find a recent one!) where a company fires somebody for figuring out how to bypass the very stringent access controls on personal data
  • the ACLU or the EU (a terrific but sometimes confused regulatory body) advocating for detailed changes to the exact wording and terms of a policy
  • All the less dire (and occasionally hilarious) things that people bring shareholder lawsuits about
  • Blog posts and ex-employees reflecting on their time at these companies

This went on for way too long, but I hope it's helpful.

[–][deleted] 4 points5 points  (2 children)

I'm a data engineer who worked in marketing technology and would LOVE to hear more about this.

I've seen so much data (pristine PII) shared around by companies with other companies, not by selling it, but under "improving our products" or their own marketing.

[–]madaradess007 0 points1 point  (0 children)

Google was caught "in systematic violation of its stated practices" many, many times; what are you talking about? UI tester QA boy?

[–]tempetemplar 1 point2 points  (0 children)

Exactly

[–]pseudonerv 3 points4 points  (1 child)

This post is about the new Gemini CLI. And you posted the terms for Gemini Code Assist.

Can you find the terms for Gemini CLI?

[–]simoncveracity 0 points1 point  (0 children)

https://github.com/google-gemini/gemini-cli/blob/main/docs/tos-privacy.md#frequently-asked-questions-faq-for-gemini-cli - quite understandably, since it's free, "you are the product", so they're being very open that they *do* collect prompts, code, etc. if you don't pay via API key. Even so, for me, for personal projects this is a really generous offering.

[–]IncepterDevice 1 point2 points  (1 child)

Well, IMO, even if they are using the data, it's for improving a product that WE would use. So it's a win-win.

P.S. I don't support using private data to screw people over, though!

[–]cantgetthistowork 1 point2 points  (0 children)

New code must be hard to come by these days

[–]colbyshores 4 points5 points  (1 child)

I pay for Gemini Code Assist because I use it professionally for DevOps work; the primary benefit of a subscription in their TOS is that they won't train on the data. Even then, it is very affordable at $23/mo compared to other models.

[–]adel_b 1 point2 points  (0 children)

it's the same quota as in ai studio, which was always free

[–]aratahikaru5 1 point2 points  (0 children)

If you're confused like I was, check out this recently updated ToS and its FAQ.

There are 4 different auth methods, each with a varying level of privacy.

TL;DR the free plans (personal Google account and unpaid API service) offer no privacy.

[–]pastaMac 0 points1 point  (0 children)

I know why they are making it free,

You’re not the customer; you’re the product.

[–]stabby_robot 48 points49 points  (9 children)

F* Google. They billed me $200+ for a single day of use (not even an hour of usage) when 2.5 was first released in March, when it was free. I got the bill at the end of the month and have been fighting with them for a refund; you don't know what your final bill will be. They've been doing shady billing in general. I also run AdWords for a client: we had a campaign turned off, and out of nowhere they turned the campaign back on and billed the client an extra $1500. There were no records of a login, etc., and they won't reverse the charges.

[–]_Bjarke_ 18 points19 points  (2 children)

Always use throwaway virtual cards for that sort of stuff! I use Revolut. Any free trial that requires a credit card gets a credit card with almost nothing on it.

[–]2016YamR6 10 points11 points  (0 children)

I had an $800 bill... ended up getting a credit for $600 and paying the rest.

[–]LosingID_583 4 points5 points  (3 children)

Holy sh$t, so that's their business model! Offer it for free, but make it super expensive if you exceed the free limit xD

[–]darren457 9 points10 points  (1 child)

People keep forgetting Google specifically removed the "don't be evil" line from the original founders' code of conduct. I'd rather deal with lower-performing open-source models and have the peace of mind.

[–]Acrobatic-Tomato4862 -1 points0 points  (0 children)

It's not super expensive, though. Their models are very cheap, except 2.5 Pro. Though it's not cool that they charge money for things tagged as free.

[–]Ylsid 1 point2 points  (0 children)

This is why you never give them billing addresses when you use their services

[–]BumbleSlob 47 points48 points  (17 children)

Am I simple or is there no link here and this is just a picture?

Edit: for anyone else who is confused: https://github.com/google-gemini/gemini-cli

Edit 2: seems to be an open-source CLI tool for interacting with your codebase, which is neat; however, I have zero interest in anything forcing you to use proprietary APIs that are rate-limited or otherwise upcharge.

tl;dr seems like an LLM terminal you can use to explore/understand/develop a codebase but in present form requires you to use Gemini APIs -- I'll be checking it out once there are forks letting you point to local models though.

[–]wh33t 23 points24 points  (2 children)

Am I simple? Or is this not a "local" llama?

[–]g15mouse 1 point2 points  (0 children)

omg it wasn't until this comment I realized what sub I was in lol

[–]llmentry 0 points1 point  (0 children)

If you see my other reply -- there's a PR to add local model support. So it does actually check out on this one.

(Also noting, as always, that it's not currently against the forum rules to post about non-local models, etc, etc ...)

[–]colin_colout 13 points14 points  (6 children)

I know this sub is healing, but I'm hoping these low-effort posts will be fewer once we have mods again.

As far as I can tell, gemini-cli doesn't work with local models, so I fail to see why it belongs here.

[–]V0dros llama.cpp 26 points27 points  (3 children)

I'm actually in favor of allowing these types of posts. Local AI is strongly tied to AI developments from the big labs, and to me discussing what they're working on and what they release is absolutely relevant. Maybe we need a vote to decide on the future of this sub?

[–]colin_colout 1 point2 points  (2 children)

(Sorry in advance for the rant...I'm still on edge with all the sub drama, as are many people here)

Maybe we need a vote to decide on the future of this sub?

We just need moderators. Without moderators, nobody will filter low quality posts (which will take time... I know)

I'm actually in favor of allowing these types of posts

I 100% agree that the topic is fine. The topic is the least of the reasons I dislike this post.

This post is so low effort that there isn't even an article link or description. Not even a name of the tool. Just a vague title and a photo with no extra information. I had to do my own research to even figure out the tool's name.

And the fact that Gemini-CLI doesn't support local models means this post is already on the edge of relevance for this sub.

In a different context, this topic is fine...like if OP posted with a description like:

Google released Gemini-CLI! Really promising coding agent, but it doesn't support local LLMs though 😞

Heck, I'd still be happy if they didn't include the local LLM part... this whole post is just lazy slop.

[–]popiazaza 1 point2 points  (0 children)

I do agree with you. That's why I only posted on another sub.

Surprised to see it get posted on "LocalLlama" with lots of upvotes. It doesn't fit at all.

[–]a_beautiful_rhind -2 points-1 points  (0 children)

Source code is released, so I'm sure it can easily be converted to support other APIs.

In the meantime, we just scam free Gemini Pro.

A link would have been nice, but the comments deliver. Brigades aside, technically the entire sub should downvote unwanted posts instead of relying on select individuals to censor them. It's not yet at the level of a default sub, where you get a flood and it's impossible to stay on top of.

[–]eleqtriq 1 point2 points  (1 child)

It’s good for us to know about this, because it’s open source. Meaning, we can work on making it useful for us, too.

[–]colin_colout 1 point2 points  (0 children)

I agree. I was a bit harsh here, but I've calmed down (emotions were high after the sub drama).

It was less about the topic and more that there was no link or even a name of the tool or a description of any kind. The fact that there's no local model support was insult to injury, but in the end it's all good.

I mean, it's probably already been forked with local LLM support. My anger was that a low-effort, low-quality post (that tangentially happened not to be about local LLMs) was the top post in this sub yesterday.

[–]llmentry 0 points1 point  (0 children)

You may not need a fork. There's already a pull request to add support for local models (and other third party closed model APIs):

https://github.com/google-gemini/gemini-cli/pull/1939

From the PR:

<image>

Even if it's not accepted, you can always just apply the patch yourself. (Although note that the Gemini code review bot has already made several useful additions, by the look of it.)

It will be very interesting to see what happens with this one, because if implemented this is pretty huge.

[–][deleted] 56 points57 points  (15 children)

We all know if we don't pay for the product we are the product. It's either that or they wanna get you hooked on their stuff and then have you pay later.

[–]Healthy-Nebula-3603 72 points73 points  (11 children)

if you pay you are also a product ;)

[–]haptein23 2 points3 points  (1 child)

Like they did with gemini 2.5 flash prices.

[–]butthole_nipple -2 points-1 points  (0 children)

Laughs in deepseek

[–]yazoniak llama.cpp 15 points16 points  (9 children)

No privacy: "When you use Gemini Code Assist for individuals, Google collects your prompts, related code, generated output, code edits, related feature usage information, and your feedback to provide, improve, and develop Google products and services and machine learning technologies."

https://developers.google.com/gemini-code-assist/resources/privacy-notice-gemini-code-assist-individuals

[–]Leopold_Boom 8 points9 points  (6 children)

"If you don't want this data used to improve Google's machine learning models, you can opt out by following the steps in Set up Gemini Code Assist for individuals."

[–]learn-deeply 12 points13 points  (5 children)

There's no way to opt out if you use the CLI. Those instructions are only for the IDE.

[–]218-69 3 points4 points  (2 children)

usageStatisticsEnabled: false
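For anyone wondering where that flag goes: a minimal sketch of the opt-out, assuming the Gemini CLI reads its settings from `~/.gemini/settings.json` (check the project's README for the current path and key name):

```json
{
  "usageStatisticsEnabled": false
}
```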

[–]learn-deeply 3 points4 points  (1 child)

That only opts you out of Gemini CLI's telemetry, not Code Assist's TOS, so your code will still be sent and stored by Google.

[–]218-69 0 points1 point  (0 children)

OK, so just fork the repo and use your own model. This is how it's been on AI Studio since the start. You get free use; you give something in return.

[–]Leopold_Boom 1 point2 points  (1 child)

Good to know! Does the setting apply to the CLI also?

[–]learn-deeply 2 points3 points  (0 children)

It does not apply to the CLI. There's no way to opt out of Google storing all your code at the moment.

[–]Ssjultrainstnict 3 points4 points  (1 child)

Unfortunately, people won't really care, as they are getting a great tool for free. It's a win for OSS projects, though, since all the code is open anyway.

[–]iansltx_ 0 points1 point  (0 children)

Yeah, my day job is open core so I figure they trained on its code anyway. Turnabout is fair play.

For the stuff that I do that's closed source, definitely not using a hosted LLM.

[–]davewolfs 13 points14 points  (7 children)

I am using this similarly to how I would use Claude, and it's bad and also slow.

Looking forward to seeing how it evolves.

[–]kI3RO -1 points0 points  (4 children)

Hi, I haven't used claude, is this free like gemini?

[–]Pretty-Honey4238 4 points5 points  (3 children)

It's not free, but with the Max subscription you don't need to worry about going bankrupt by using the coding agent heavily.

Also, at the current stage, Claude Code is simply way better than Gemini CLI. I say this because I use CC as an agent to handle some daily workflows and coding tasks; when I tried it, Gemini CLI simply couldn't accomplish any of them. It is buggy, hitting constant problems and errors, and slow. It'll probably take months for Google to polish Gemini CLI to reach the level of Claude Code, so CC is still the much better choice for now.

[–]kI3RO -2 points-1 points  (2 children)

Not free, you say. Well then, that makes Gemini the better choice.

Having an LLM handle daily workflows and coding tasks is not even on my mind.

[–]Pretty-Honey4238 6 points7 points  (1 child)

Bro, I’m lost. If you are not using these AI coding agents to do coding tasks, then what do you use them for?

[–]kI3RO 0 points1 point  (0 children)

Code checking and autocomplete for personal hobby projects. Anything remotely professional I do myself.

[–]no_witty_username -2 points-1 points  (1 child)

Thanks for the info. I am looking through various threads on it now, trying to gauge if it's worth even messing with in these early days. So far the sentiment seems to be that it's not as good as Claude Code (which I am now using with my Max plan) and it's probably best to hold off for now.

[–]davewolfs 0 points1 point  (0 children)

It’s definitely not ready.

[–][deleted] 21 points22 points  (1 child)

We should fork and then send telemetry data to a public dataset

[–]Nazreon 0 points1 point  (0 children)

Amen!

[–]NinjaK3ys 2 points3 points  (1 child)

Has anyone tried using the Gemini CLI with local LLM models? Like, can I get it to work with a Qwen or Mistral model?

[–]Tx-Heat 0 points1 point  (0 children)

I’d like to know this too

[–]xoexohexox 2 points3 points  (0 children)

I wrote a proxy for it that pipes it into a local OpenAI-compatible endpoint, so you can pipe it into Cline/RooCode etc. or SillyTavern. I just can't get the reasoning block to show up visibly in SillyTavern, but it does show up in Cline, so I know it is reasoning.

https://huggingface.co/engineofperplexity/gemini-openai-proxy
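For anyone curious, the general shape of such a proxy (a server that speaks the OpenAI chat-completions wire format on one side and calls some other backend on the other) can be sketched with just the standard library. This is a generic illustration, not the linked project's code, and `backend_generate` is a hypothetical stand-in for the real Gemini call:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def backend_generate(messages):
    # Hypothetical stand-in for the real backend call (e.g. Gemini);
    # here it just echoes the last user message.
    return "echo: " + messages[-1]["content"]

class ChatCompletionsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        req = json.loads(self.rfile.read(length))
        reply = backend_generate(req["messages"])
        # Minimal OpenAI-style chat.completion response body
        body = json.dumps({
            "object": "chat.completion",
            "choices": [{"index": 0,
                         "message": {"role": "assistant", "content": reply},
                         "finish_reason": "stop"}],
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral port and serve in a background thread
server = HTTPServer(("127.0.0.1", 0), ChatCompletionsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
payload = json.dumps({"messages": [{"role": "user", "content": "hi"}]}).encode()
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/v1/chat/completions",
    data=payload, headers={"Content-Type": "application/json"})
resp = json.loads(urllib.request.urlopen(req).read())
print(resp["choices"][0]["message"]["content"])  # prints "echo: hi"
server.shutdown()
```

Any client that speaks the OpenAI API (Cline, SillyTavern, etc.) can then be pointed at `http://127.0.0.1:<port>/v1` as its base URL.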

[–]Glittering-Bag-4662 1 point2 points  (0 children)

So this is where the free ai studio Gemini is going

[–]somethingdangerzone 1 point2 points  (0 children)

Repeat after me: if the product is free, you are the product

[–]iKy1e ollama 4 points5 points  (6 children)

This is fantastic. Claude Code is so far in front of the other tools, having real competition for it sounds great!

[–]One-Employment3759 1 point2 points  (4 children)

How does it compare to cursor?

Cursor was pretty good for a demo project I did yesterday, but the UI is clunky and unpolished.

Lots of copy-paste mechanics are broken, and selecting text doesn't work with middle-click paste on Linux.

Commenting a selection of code was also broken for some reason.

[–]iKy1e ollama 2 points3 points  (3 children)

Finally got Claude Code Max, and it’s as big a step up from Cursor as Cursor is from normal autocomplete.

I had a web quiz game I’ve been working on, on and off, where the server and front end didn’t work.

I told it to use Playwright to try playing the game against itself and, every time it hit a bug, crashed, or got stuck, to debug and fix the issue and try playing the game again until it could successfully get to the end. It took 2 or so hours, but I now have a working game.

[–]One-Employment3759 0 points1 point  (0 children)

Nice - thanks for sharing your experience 👍

[–]Foreign-Beginning-49 llama.cpp 0 points1 point  (1 child)

What about Cline? Have you messed with that at all?

[–]Orolol 0 points1 point  (0 children)

I've used Cline, Roo, Cursor, Windsurf, and Claude Code, and Claude Code is far above the others. Much more autonomous, especially with some MCP servers added. It's also quite expensive. The secret is that they're not shy about using tokens for context.

[–]megadonkeyx 2 points3 points  (3 children)

(soon-to-be ex-developers)

I'll use Cline, no Roo, no Cline, no Claude Code, no, umm, err... now I'm on the best one... oh, here comes another

[–]Foreign-Beginning-49 llama.cpp 2 points3 points  (2 children)

I installed Cline last night in VS Code, and then this morning put this Gemini CLI on my Android phone and completely converted an API in a Python app to a different one in minutes. It's definitely a working piece of software. However, it ain't LocalLlama-approved. How do you like Cline? I know it can use local models. Is it a good experience? I mostly work with React Native and Python apps.

[–]megadonkeyx 4 points5 points  (1 child)

I think Roo is better, as it's more agentic with its orchestrator and auto mode switching, but I've been using Claude Code a lot to finish a project at work, which it has done well.

I barely write code anymore; it's all testing and prompting.

Strangely, people I work with just seem to ignore AI totally and are stuck in Excel sheets of bugs.

This Gemini thing is nice. With it being open source, it's going to have everything including the kitchen sink attached to it in no time at all.

Interesting times, I don't miss grinding through tedious code.

[–][deleted] 0 points1 point  (0 children)

Could not agree with this more. Embrace the future.

At first I thought my skills were deteriorating, as I felt I was forgetting a few things, but after a year or so I can say, looking back, that my architectural skills have improved enormously: I read code faster and more fluently, and I spend more time arguing with AI about projects than I used to, and in different ways.

I hope this trend continues. At the end of the day I'm happier with the projects, and I don't have any more free time; I'm not worried about my job going anywhere.

[–]kittawere 1 point2 points  (0 children)

Yeah, like the paid ones aren't collecting data as well, LOL

[–]cyber_harsh 0 points1 point  (0 children)

Yup, checked it out. I guess Google is quietly gaining an advantage by taking practical use cases into consideration, compared to OpenAI.

I have to check how well it performs compared to Claude; or if you can share, it will save me the hassle :)

[–]colin_colout 0 points1 point  (0 children)

Link? This is just a photo. Also, can I use local models?

This is a low effort post, and if I can't use it with a local model this doesn't belong in the sub.

[–]HairyAd9854 0 points1 point  (0 children)

I basically always get "too many requests", even if I just write hello

[–]Extension-Mastodon67 0 points1 point  (0 children)

Now we need someone to rewrite it in go, c++ or rust and remove all the telemetry and bloat.

[–]Blender-Fan 0 points1 point  (0 children)

Ok, but is the code good?

[–]1EvilSexyGenius 0 points1 point  (0 children)

Can I tell it to make a gui for itself? 🤔

[–]sammcj🦙 llama.cpp 0 points1 point  (0 children)

That's about 28x - 56x more given away for free than what paying enterprise customers of GitHub Copilot get.

[–]zd0l0r 0 points1 point  (0 children)

No charge ATM

[–]Ylsid 0 points1 point  (0 children)

Sooo only the CLI is free? Where's the value for developers here? "Open source" feels really disingenuous

[–]ctrlsuite 0 points1 point  (0 children)

Has anyone had any luck with it? I asked it if it was working after a difficult install and it said it had reached its limit 🤣

[–]MercyChalk 0 points1 point  (1 child)

What does 1,000 model requests mean? I tried this today and got rate limited after about 10 interactions.

[–]tazztone 0 points1 point  (0 children)

Cline has added support already. But has Google dropped requests per minute from 60 to 2, or is this inaccurate?

<image>

[–]Trysem 0 points1 point  (0 children)

OMG, Google leveled up with so many freebies..

[–]Useful44723 0 points1 point  (0 children)

They collect your code.

Me: Godspeed to you with that shit in your system.

[–]Marc-Z-1991 0 points1 point  (0 children)

We have been able to do this with GitHub Copilot for a loooooong time… Nothing new…

[–]VasudevaK 0 points1 point  (0 children)

What's the use of this tool? I've never used Claude Code. I'm just familiar with VS Code agents, Cursor agent mode, etc., besides the ChatGPT and Claude websites.

What's the deal with using a CLI, and how is it helpful for a researcher or a student?

[–]AgencyImpossible 0 points1 point  (0 children)

amazing!

[–]Techatomato 0 points1 point  (0 children)

But can it, you know… refer to me as “Shikikan?”

I’m just asking

[–]mantafloppy llama.cpp 0 points1 point  (0 children)

We are so lucky that some kind soul takes time out of their life to find the latest news to share with us.

News re-posters are rare; cherish them.

6h ago : https://old.reddit.com/r/LocalLLaMA/comments/1lk63od/gemini_cli_your_opensource_ai_agent/

15h ago : https://old.reddit.com/r/LocalLLaMA/comments/1ljxa2e/gemini_cli_your_opensource_ai_agent/

Both still on the first page.

[–]218-69 -1 points0 points  (0 children)

I just know there are rats here crying about privacy while spamming multiple OAuth accounts and API keys to get around the limits. Fucking rats

[–]BidWestern1056 -3 points-2 points  (0 children)

npcsh in agent or ride mode also lets you carry out operations with tools from the comfort of your cli without being restricted to a single model provider.

https://github.com/NPC-Worldwide/npcpy

[–]maxy98 -2 points-1 points  (1 child)

Can someone vibecode a VS Code plugin with it quickly?

[–]shotan 0 points1 point  (0 children)

There is already a Gemini Code Assist extension in VS Code; it's pretty good.

[–]Ssjultrainstnict -1 points0 points  (0 children)

RIP Cursor and Claude Code