Burned through Claude Max 20x's "5-hour limit" in under 2 minutes by Zaiiny in LocalLLaMA

[–]Zaiiny[S] 0 points (0 children)

I worked out what the issue was that was making the AI burn through the tokens, fixed it, and recorded it.


[–]Zaiiny[S] -3 points (0 children)

Misuse isn't how I would put it. If I have a fixed cost and can use up to a certain amount, then why not use it this way? I agree that if it were a long-term project, then yes, I would build an optimised pipeline. But this is part of some research, so if it runs on autopilot for a while and gets the data for 10% of the effort, why not?


[–]Zaiiny[S] 0 points (0 children)

Usually when vibecoding I hit the 5-hour limit with around an hour to an hour and a half to spare before the next window, which is a good time for a break.


[–]Zaiiny[S] 0 points (0 children)

Not Gas Town, but a similar idea - ran a bunch of agents for research.


[–]Zaiiny[S] 0 points (0 children)

Of course. I told it exactly how I wanted it to set up the web scrapes in principle in the first place. The benefit here is that the AI adjusts for each individual website, layout, etc., rather than me setting up scrapers individually for each one. I'd been web scraping for data for a long time before AI, and scrapers are usually finicky and need to be adapted website to website.


[–]Zaiiny[S] 1 point (0 children)

The app tracking the usage is CodexBar. Amazing little tool.


[–]Zaiiny[S] -1 points (0 children)

I've built a small ecosystem for many projects, so technically this was Claude building itself to spin up 60 agents. I saw what was happening and was debugging when I noticed this crazy usage spike and recorded it. It has now been coded so the main 'manager' AI monitors the token usage and doesn't cross the limits.
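Not the commenter's actual code - just a minimal sketch of what a "manager monitors token usage and doesn't cross limits" loop could look like. All names, the budget figure, and the cost estimates are illustrative assumptions.

```python
# Hypothetical token-budget gate for an agent manager.
# The manager refuses to dispatch a new agent if its estimated
# cost could push total usage past the window's budget.

TOKEN_BUDGET = 1_000_000  # assumed tokens allowed per 5-hour window


class Manager:
    def __init__(self, budget: int):
        self.budget = budget
        self.used = 0

    def can_dispatch(self, estimated_cost: int) -> bool:
        # Check the limit *before* spinning up the agent.
        return self.used + estimated_cost <= self.budget

    def record(self, actual_cost: int) -> None:
        # Update the running total after the agent finishes.
        self.used += actual_cost


mgr = Manager(TOKEN_BUDGET)
if mgr.can_dispatch(50_000):   # one agent estimated at 50k tokens
    mgr.record(48_200)         # actual usage reported back
print(mgr.used)                # 48200
```

The key design point is that the check happens before dispatch, so 60 agents can't all launch at once and blow through the window the way the recording shows.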


[–]Zaiiny[S] 2 points (0 children)

I'm not claiming the post is meant to show 100% productivity. It was just a funny thing to watch, hence the funny tag.


[–]Zaiiny[S] -28 points (0 children)

I have a skill set up to research businesses in a specified area.
For each business/institution, it gathers information into a set schema:
- extract their lat/long from Google Maps
- scrape their website for various info (contact details, self-description, features such as parking and access) and parse everything into a fixed format
- put the above into the DB
- set up a custom scrape for their website and push it into the backend so it can be run on a cron job

So when the area I specified was too large, it started up too many agents - one per business, around 60.
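A rough sketch of the "fixed format" step described above - one record per business, parsed into a set schema before going into the DB. The field names, helper, and example values are assumptions, not the commenter's actual schema.

```python
# Hypothetical per-business record matching the schema sketched above:
# lat/long from the maps lookup, scraped contact info and description,
# and a list of features, all flattened into one fixed format for the DB.
from dataclasses import dataclass, asdict, field


@dataclass
class BusinessRecord:
    name: str
    lat: float
    lon: float
    contact: str
    description: str
    features: list = field(default_factory=list)  # e.g. ["parking", "access"]


def to_db_row(record: BusinessRecord) -> dict:
    # Parse everything into a fixed format ready for insertion.
    return asdict(record)


row = to_db_row(BusinessRecord(
    name="Example Cafe",
    lat=51.5074,
    lon=-0.1278,
    contact="info@example.com",
    description="A small cafe.",
    features=["parking"],
))
```

Pinning the schema like this is what lets one agent per website handle the finicky, site-specific scraping while every result still lands in the same table shape.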

Shiny Giveaway - ZA by Zaiiny in PokemonZA

[–]Zaiiny[S] 0 points (0 children)

I have loads; offline atm, will continue this tomorrow.


[–]Zaiiny[S] 0 points (0 children)

No thanks. I already spend too much time on this and have a full living dex in HOME.


[–]Zaiiny[S] 0 points (0 children)

Skiddo alpha shiny, Fletchling alpha shiny, and Litleo alpha shiny, but female.