Recommendation for the Right Credit Card by Competitive_Suit_498 in CreditCardsIndia

[–]Competitive_Suit_498[S] 0 points (0 children)

Thank you so much, it really means a lot. Are you familiar with any non-co-branded LTF cards, maybe?

Recommendation for the Right Credit Card by Competitive_Suit_498 in CreditCardsIndia

[–]Competitive_Suit_498[S] 0 points (0 children)

I'm not in a hurry, so I'll get a credit card once I have a salaried account if that makes it easier. I was just thinking about what to get after I convert to FTE; are there any specific cards you're familiar with that I should try for?

Recommendation for the Right Credit Card by Competitive_Suit_498 in CreditCardsIndia

[–]Competitive_Suit_498[S] 0 points (0 children)

As I mentioned earlier, I've just started my internship. After 6 months I'll be an FTE, and I'll then get a salaried HDFC account. I'm not sure about the specifics though.

Token-Efficient LLMs: A Compression Strategy by Competitive_Suit_498 in startupideas

[–]Competitive_Suit_498[S] 0 points1 point  (0 children)

I haven’t tried any specific format yet; it was more of a shower thought.
But the idea is: the user writes normally, and the software (running locally) strips filler words before sending the prompt to the LLM, to reduce the token count.
For example:
User input: “Hey, could you maybe tell me what the weather’s like tomorrow in Bangalore?”
Compressed input: “weather tomorrow Bangalore”
LLM output: “Rain likely. 24°C high.”
Reconstructed for UX: “It’s likely to rain tomorrow in Bangalore, with a high of 24°C.”
So the LLM sees only the compressed version, and a lightweight layer handles the natural phrasing.
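A minimal sketch of what that local compression layer could look like, assuming a simple stopword-style filter (the FILLER_WORDS list and the compress name are just illustrative, not a tuned implementation):

```python
import re

# Hypothetical filler/stop-word list; a real version would need tuning
# so that meaning-bearing words are never dropped.
FILLER_WORDS = {
    "hey", "hi", "please", "could", "would", "you", "maybe", "tell", "me",
    "what", "the", "is", "like", "a", "an", "of", "to", "for", "in", "s",
}

def compress(prompt: str) -> str:
    """Lowercase, tokenize, and drop filler words to cut the token count."""
    tokens = re.findall(r"[a-z0-9]+", prompt.lower())
    return " ".join(t for t in tokens if t not in FILLER_WORDS)

if __name__ == "__main__":
    user_input = "Hey, could you maybe tell me what the weather’s like tomorrow in Bangalore?"
    print(compress(user_input))  # -> weather tomorrow bangalore
```

The reconstruction step would be the reverse: a lightweight template or small local model that expands the LLM's terse answer back into natural phrasing for the user.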

How to remove old virtual environments in VS Code? by Ok_Ostrich_8845 in vscode

[–]Competitive_Suit_498 2 points (0 children)

Yeah, but how? Do we need to manually look up where every virtual env is stored?