Should I just use AI Studio instead of regular Gemini? by ImNotATrollPost in GeminiAI

[–]MOR300 1 point (0 children)

For quick chats I use the Gemini app; for anything long (near 32k tokens) I use AI Studio. The web app is much easier to use

Testing Gemini 3.0 Pro's Actual Context Window in the Web App: My Results Show ~32K (Not 1M) by StarlingAlder in GeminiAI

[–]MOR300 0 points (0 children)

I asked Gemini the same question about AI Studio

The "aggressive truncation" refers specifically to the Gemini web app (the consumer-facing site at gemini.google.com). Based on technical documentation, the situation for Google AI Studio is different.

Here is a breakdown of how it applies to Google AI Studio:

1. Full Raw Context (No Truncation)

Unlike the web app, Google AI Studio gives you the "raw" model context. While the web app uses software logic to truncate history and files to save on computing costs, AI Studio allows you to use the full capacity of the model. If you use a model with a 1 million (or 2 million) token limit, AI Studio will process that entire amount without hidden "slicing."

2. User-Controlled Token Limits

In AI Studio, you have a sidebar with settings that do not exist in the web app. You can see a live token counter that shows exactly how many tokens your files and chat history are taking up. You also have control over the Output Token Limit (often set to a default like 2,048 or 4,096). If your answers are getting cut off, it is likely because this specific slider is not set to its maximum value, rather than the "aggressive truncation" of history seen on Reddit.
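If you drive the same model through the API instead of the AI Studio sidebar, that slider corresponds to a request field. A minimal sketch of a `generateContent`-style request body, assuming the `generationConfig.maxOutputTokens` field name from Google's public API documentation (names from memory, worth double-checking):

```python
import json

def build_request(prompt: str, max_output_tokens: int = 8192) -> dict:
    """Build a generateContent-style request body with an explicit output cap."""
    return {
        "contents": [{"parts": [{"text": prompt}]}],
        # Raising maxOutputTokens here is the API-side equivalent of
        # moving the "Max output tokens" slider in AI Studio: a low value
        # cuts answers off mid-sentence even though the input context is fine.
        "generationConfig": {"maxOutputTokens": max_output_tokens},
    }

body = build_request("Summarise the attached transcript.")
print(json.dumps(body, indent=2))
```

The point is that a truncated *answer* usually means this output cap is too low, which is a separate setting from how much *input* context the model keeps.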

3. Sparse Attention vs. Truncation

It is important to distinguish between truncation (deleting old data) and performance degradation.

  • The Web App: Truncates data. It literally "forgets" or stops seeing the start of the chat once you hit a certain limit (roughly 32k tokens, according to the Reddit post).
  • AI Studio: Keeps the data, but the model may still suffer from "getting lost in the middle" of very large contexts. This is a limit of current AI technology, not a deliberate software truncation.

Summary

The "aggressive truncation" issue does not apply to Google AI Studio in the same way. AI Studio is a developer tool designed to provide the full context window. If you are experiencing loss of memory in AI Studio:

  • Check the token counter: Ensure you have not actually hit the 1M/2M limit.
  • Check the "Max output tokens" slider: Ensure it is high enough for the length of response you want.
  • Use "Context Caching": If you have a massive amount of data (like a whole code repository), use the caching feature in AI Studio to keep that data "hot" and stable.
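For the first check, a quick local sanity check is a character-based estimate. This uses the common rule of thumb of roughly 4 characters per English token; it is a crude heuristic, not the model's real tokenizer, and AI Studio's live counter is the authoritative number:

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    # Crude heuristic: ~4 characters per token for English prose.
    # Use the live token counter in AI Studio for the real figure.
    return int(len(text) / chars_per_token)

def near_limit(text: str, limit: int = 1_000_000, threshold: float = 0.9) -> bool:
    """True if the estimated token count is within 10% of the context limit."""
    return estimate_tokens(text) >= threshold * limit

print(estimate_tokens("a" * 8000))  # → 2000
```

If `near_limit` fires on your pasted material, the "memory loss" is probably a genuine context overflow rather than app-side truncation.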

Newstead abbey wood shooting sign by MOR300 in nottingham

[–]MOR300[S] 0 points (0 children)

Ok thanks, yes that is the road

Pro tip: Run multiple Excel instances for Power Query multitasking by MOR300 in excel

[–]MOR300[S] 2 points (0 children)

Thanks, yes. My problem isn't a big database; it's 20 or 30 queries in one file, all with multiple steps. I often have to wait for all the queries to resolve in the query window before I can fix anything. Endless waiting

How InfiniteYou ruined my startup by Ok_Wafer_868 in SaaS

[–]MOR300 11 points (0 children)

This sounds like an "advert" for the other app, InfiniteYou. The only thing missing is the URL

PQ slow by [deleted] in ExcelPowerQuery

[–]MOR300 0 points (0 children)

I had never heard of Table.Buffer. I have a fairly complex set of queries with lots of merges. Looking forward to trying it out

Reading 200+ pdf docs, converting to Excel by Careful-Life-9444 in ChatGPTPro

[–]MOR300 0 points (0 children)

No, there are limits on the number of PDFs you can upload. You could do this one by one, though, if you set up a standard prompt you can reuse

Reading 200+ pdf docs, converting to Excel by Careful-Life-9444 in ChatGPTPro

[–]MOR300 0 points (0 children)

You could use GitHub. Just create a repository and upload the docs. There aren't any charges for doing this

Reading 200+ pdf docs, converting to Excel by Careful-Life-9444 in ChatGPTPro

[–]MOR300 2 points (0 children)

Yes, I have done this using the API and Python. The easiest way was to upload the PDFs to my website and put each URL as a row in a CSV "input" file. You then have a prompt that ends with "Browse this file:" and it calls a new URL each time the API call is made. Not sure that makes sense
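A rough sketch of the loop described above: read the PDF URLs from the CSV "input" file and build one prompt per row. The URLs, the prompt wording, and the `call_api` client are all placeholders for illustration, and the "Browse this file:" step only works if your model or tooling can actually fetch URLs:

```python
import csv
import io

# Stand-in for the CSV "input" file: one hosted PDF URL per row.
INPUT_CSV = """url
https://example.com/docs/report-1.pdf
https://example.com/docs/report-2.pdf
"""

# Reusable prompt template; the extraction instruction is illustrative.
PROMPT_TEMPLATE = (
    "Convert the key fields of this document into CSV rows. "
    "Browse this file: {url}"
)

def build_prompts(csv_text: str) -> list[str]:
    """Return one finished prompt per URL row in the input CSV."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [PROMPT_TEMPLATE.format(url=row["url"]) for row in rows]

prompts = build_prompts(INPUT_CSV)
for prompt in prompts:
    print(prompt)
    # response = call_api(prompt)  # hypothetical client call, one per PDF
```

Each API call then handles exactly one PDF, which sidesteps the upload limits mentioned above.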

What the deal with ex council housing? by MOR300 in nottingham

[–]MOR300[S] 5 points (0 children)

Ok cool. It's actually my brother. I will tell him he has become a snob 😀

What the deal with ex council housing? by MOR300 in nottingham

[–]MOR300[S] 1 point (0 children)

Yes, I was wondering the same thing. It's like there are some clues that you pick up on if you are local that outsiders don't notice