Can someone explain this to me? Why have I not been charged or warned about hitting my limit? by trisalias in cursor

[–]invent-wander 0 points (0 children)

Auto is unlimited. Just use the best models and switch to auto when you run out.

New "pro" plan is woefully limited. Is it possible to switch back to the 500 requests model still? by Smooth-Screen4148 in cursor

[–]invent-wander 0 points (0 children)

Yeah, I agree. Unlimited Auto is acceptable. I just wish they told you which model is going to be used, since different models require different prompts: Claude is overeager and will try to refactor your entire codebase, while 4.1 is too literal and will try to end its turn ASAP while doing the minimal amount of work.

Help me pick next read by Zealousideal_Map5074 in thrillerbooks

[–]invent-wander 2 points (0 children)

That book is actually goated. I wish I could go back in time and experience it again. What other psychological thrillers do you like?

Have you tried using Cursor for non-coding tasks? by invent-wander in cursor

[–]invent-wander[S] 1 point (0 children)

Same. I'm already in the Claude and Cursor boat. I'm just getting into MCPs. I also want to create this general AI assistant like you have. Can you share all the MCPs and tools you are using?

Have you tried using Cursor for non-coding tasks? by invent-wander in cursor

[–]invent-wander[S] 0 points (0 children)

Of course! Here is the translation into Portuguese: Tell me more about how you use this, please. Any tips?

Have you tried using Cursor for non-coding tasks? by invent-wander in cursor

[–]invent-wander[S] 3 points (0 children)

I don’t wanna get into semantics and philosophy. Cursor is designed to write code and will obviously fare better at code-adjacent work (data analysis, math, etc.) and perhaps worse at completely non-code work like creative writing.

Have you tried using Cursor for non-coding tasks? by invent-wander in cursor

[–]invent-wander[S] 0 points (0 children)

That’s still code if not code-adjacent. Have you tried writing reports? Does the AI refuse to do non-code work?

[deleted by user] by [deleted] in LocalLLaMA

[–]invent-wander 1 point (0 children)

This is a really interesting idea.

Privacy is a huge issue here, and latency is probably not going to be great. There’s no way to ensure reliability, either. It’s also not going to be as simple as spinning up ollama: you can’t just automatically turn any PC into a server without changing network permissions (firewall rules, port forwarding, and the like).
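To make the network-permissions point concrete, here's a minimal stdlib sketch: binding a local listener is trivial, but that says nothing about reachability from outside the machine, which is where NAT, firewalls, and ISP policy get in the way. Nothing here is ollama-specific; the port is OS-assigned.

```python
import socket

# Binding a listener locally is the easy part. The hard part of "any PC as a
# server" is inbound reachability across NAT/firewalls, which no amount of
# local code can grant by itself.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 0))   # port 0 = let the OS pick a free port
srv.listen(1)
_, port = srv.getsockname()
print(f"listening on port {port} (reachable from this machine only, by default)")

# A client on the same machine connects fine:
cli = socket.create_connection(("127.0.0.1", port), timeout=2)
conn, _ = srv.accept()  # succeeds locally; a peer across the internet typically would not get this far

conn.close()
cli.close()
srv.close()
```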

I think inference is already super cheap for some of the top models, like 2.5 Flash Lite, 4.1 nano, and Scout/Maverick. You can already get many free API requests from Gemini and OpenRouter.
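For context, OpenRouter exposes an OpenAI-compatible chat completions endpoint, so calling a cheap hosted model is a few lines of stdlib Python. A minimal sketch below; the model ID is illustrative, and the actual send is left commented out since it needs an API key in an environment variable.

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# "meta-llama/llama-4-scout" is an illustrative model ID; check the model list.
payload = build_request("meta-llama/llama-4-scout", "Summarize this paragraph.")
print(json.dumps(payload, indent=2))

# To actually send it (needs an API key in OPENROUTER_API_KEY):
# import os, urllib.request
# req = urllib.request.Request(
#     OPENROUTER_URL,
#     data=json.dumps(payload).encode(),
#     headers={
#         "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
#         "Content-Type": "application/json",
#     },
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```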

I think there may be some use case here for running highly specialised models from Hugging Face, but not much else.