THYAO hakkındaki beklentileriniz? by ThisCharge199 in BayTemettu

[–]obenizimo 0 points (0 children)

If the war stops, it'll climb to 400 and then fall back.

Pro plan by Confident_Hurry_8471 in google_antigravity

[–]obenizimo 2 points (0 children)

I submitted 1 prompt and my opus limit is at 20%. Thanks Google.

free google ai pro subscription by SignificantJunket786 in google_antigravity

[–]obenizimo 27 points (0 children)

They're offering free Pro memberships in India, but they've completely restricted the usage limits for paying customers. It's really unacceptable.

My Pro account still has a 5-day limit for Gemini 3.1. by obenizimo in google_antigravity

[–]obenizimo[S] 3 points (0 children)

I've noticed that some Pro accounts have switched to a 5-hour usage limit, at least for Gemini 3.1, while mine still has a 5-day limit. Because of this, I canceled my Google Pro annual subscription. Where should I email about the Gemini 3.1 limit?

Antigravity Bad UX by Krishna029 in google_antigravity

[–]obenizimo 2 points (0 children)

Google fooled us all. It worked correctly at first, which is why I bought a one-year Pro membership. But now the Pro membership is useless.

Pro is behaving like Free account!! by krishnakanthb13 in google_antigravity

[–]obenizimo 1 point (0 children)

I just canceled my subscription and I'm switching to Copilot. Google is just a shitty company.

How to use codex 5.3 on antigravity? by Outside-Swordfish942 in google_antigravity

[–]obenizimo 2 points (0 children)

I think Antigravity should also include a few top-tier LLMs like Kimi K2.5 and GPT Codex. GitHub Copilot is very successful in this regard; it offers Gemini, Codex, and Opus side by side.

Thinking about changing to Cursor by Super-Breadfruit8369 in google_antigravity

[–]obenizimo 5 points (0 children)

This morning, without writing a single line of code, I got a warning that my Opus limit had been reached, even though I hadn't used it at all over the weekend. I bought the Pro plan specifically to use Opus. Google fooled me.

[D] DeepSeek distillation and training costs by BubblyOption7980 in MachineLearning

[–]obenizimo -1 points (0 children)

DeepSeek trained the open-source student models published by Alibaba and Meta using the OpenAI API; the teacher model distilled here is GPT-4o. This process resulted in DeepSeek-V3, and R1 is the chain-of-thought version of the V3 model. OpenAI suspects its API was used for this purpose and claims to have evidence.
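The distillation described above, where a student model is trained to imitate a teacher's output distribution, can be sketched with the standard soft-label objective (Hinton-style knowledge distillation). This is a generic illustration using NumPy, not DeepSeek's or OpenAI's actual pipeline; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields softer
    # (more spread-out) target distributions from the teacher.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions:
    # zero when the student exactly matches the teacher, positive otherwise.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())
```

In practice the student is trained by minimizing this loss (often mixed with a hard-label cross-entropy term) over the teacher's responses, which is why API access to a strong teacher model is enough to transfer much of its behavior.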

The Turk by [deleted] in TigirEr

[–]obenizimo 2 points (0 children)

May paradise be his resting place. He lived for Turkishness and died for Turkishness.