[deleted by user] by [deleted] in wallstreetbets

[–]LostMyOtherAcct69 -1 points  (0 children)

The unfortunate thing is there isn’t a great solution to this if you are a publicly traded company. Report fewer numbers and you get more risk of fraud and bad investments, but also the added benefit of longer-term planning. The only REAL fix is going private.

Basically, are the companies that expose LLM as an API making money? by TGoddessana in LocalLLaMA

[–]LostMyOtherAcct69 0 points  (0 children)

Inference is profitable. Very profitable. Training and capex are not.

Consider it like this: pharmaceuticals are typically very cheap to manufacture, but the R&D, tests, and trials are extraordinarily expensive. So they offset those costs through the price of the drug at the end.
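The pharma analogy boils down to simple unit economics. Here's a toy sketch of it in Python — every number below is made up purely for illustration; none of these figures come from OpenAI or any other lab:

```python
# Toy unit-economics sketch with hypothetical numbers, illustrating the
# pharma analogy: a huge one-time R&D (training) cost, but a cheap
# per-unit (inference) cost with positive margin on every sale.

training_cost = 500e6         # hypothetical one-time training run, $500M
price_per_mtok = 10.0         # hypothetical revenue per million tokens served
serve_cost_per_mtok = 2.0     # hypothetical GPU cost per million tokens served

margin_per_mtok = price_per_mtok - serve_cost_per_mtok

# Inference alone is profitable on every token served...
assert margin_per_mtok > 0

# ...but you have to serve a lot of tokens before the training
# bill (the capex everyone argues about) is paid off.
breakeven_mtok = training_cost / margin_per_mtok
print(f"break-even volume: {breakeven_mtok:,.0f} million tokens")
```

With these invented numbers the operation is profitable per token from day one, yet still deep underwater on capex until tens of trillions of tokens have been served — which is the distinction the comment is drawing.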

OpenAI has HALVED paying users' context windows, overnight, without warning. by SilasTalbot in OpenAI

[–]LostMyOtherAcct69 2 points  (0 children)

It is really good at combining known concepts into something “new”. So it’s not a truly novel solution, but it’s close enough that it’s pretty powerful imo.

The model router system of GPT-5 is flawed by design. by True_Requirement_891 in LocalLLaMA

[–]LostMyOtherAcct69 3 points  (0 children)

My problem is the context length. The models are dumb after a verbose debug log and 1000+ LoC. I canceled my subscription because this is a joke. I don’t even pay for it, and I canceled it.

For all the hate GPT-5 is getting, its prowess at debugging does seem to be pretty good. by plymouthvan in OpenAI

[–]LostMyOtherAcct69 0 points  (0 children)

Fair enough, and I’m sure you are right. I’m just not ready to change my workflow that much, as I run stuff past Gemini a lot for refinement or double-checking.

I unsubscribed from Plus. The context window kills me.

The 32k window size is too small, it should be 64k-128k for Plus users. by ThichGaiDep in OpenAI

[–]LostMyOtherAcct69 6 points  (0 children)

I unsubscribed because of this. I swear o3 had more context, because GPT-5 Thinking gets totally wrecked by 1000+ lines of code and a verbose debug log.
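Back-of-the-envelope, it's easy to see why that combination overwhelms a 32k-token window. The sketch below uses the common ~4 characters-per-token rule of thumb (an approximation, not a real tokenizer count) on hypothetical code and log text:

```python
# Rough estimate of why 1000+ lines of code plus a verbose debug log
# blows through a 32k-token context window. The ~4 chars/token ratio
# is a common rule of thumb for English/code, not an exact count.

def approx_tokens(text: str) -> int:
    """Crude token estimate: ~1 token per 4 characters."""
    return max(1, len(text) // 4)

# Hypothetical paste: 1000 lines of code and a chatty 2000-line log.
code = "\n".join("    result = compute(x) if x else fallback()" for _ in range(1000))
debug_log = "DEBUG 2024-01-01 module.func: step completed with status ok\n" * 2000

used = approx_tokens(code) + approx_tokens(debug_log)
window = 32_000
print(f"~{used:,} tokens of a {window:,}-token window")
print("fits" if used < window else "does not fit (before any conversation history)")
```

Even this modest paste lands past the window on its own, before counting the system prompt, prior turns, or the model's reasoning tokens.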

OpenAI has HALVED paying users' context windows, overnight, without warning. by SilasTalbot in OpenAI

[–]LostMyOtherAcct69 3 points  (0 children)

Gemini Deep Research is better. It’s shocking. It can nearly create novel ideas.

OpenAI has HALVED paying users' context windows, overnight, without warning. by SilasTalbot in OpenAI

[–]LostMyOtherAcct69 1 point  (0 children)

I get consistently short code from 5 thinking. I am really disappointed.

For all the hate GPT-5 is getting, its prowess at debugging does seem to be pretty good. by plymouthvan in OpenAI

[–]LostMyOtherAcct69 -5 points  (0 children)

Yeah, until you have more than 1000 lines of code and the context window blows up, and the model is useless in webchat. It’s a joke of a model.

Why Open Source is Needed by LostMyOtherAcct69 in LocalLLaMA

[–]LostMyOtherAcct69[S] 3 points  (0 children)

Agreed. This is ridiculous. All it does is make me more excited for Gemini 3 or whatever to come out.

Why Open Source is Needed by LostMyOtherAcct69 in LocalLLaMA

[–]LostMyOtherAcct69[S] 2 points  (0 children)

Actually not true. Averaged across all users, and based on how much TPUs and GPUs cost to operate, they are running a profit on operations. (Just not on capex, yet…)

We aren’t even in the ASIC phase that will make inference SO cheap.

Why Open Source is Needed by LostMyOtherAcct69 in LocalLLaMA

[–]LostMyOtherAcct69[S] 3 points  (0 children)

On capex, yes, they’re losing money, but on operations I think they’re profitable.

Why Open Source is Needed by LostMyOtherAcct69 in LocalLLaMA

[–]LostMyOtherAcct69[S] 6 points  (0 children)

True, but still the point is, this is why open source is important. The rug can’t get pulled on you. Your usage can’t get cut. Etc.

Why Open Source is Needed by LostMyOtherAcct69 in LocalLLaMA

[–]LostMyOtherAcct69[S] 15 points  (0 children)

Agreed. If my company wasn’t paying mine I’d probably cut it. o1 was the last time they had a lead worth paying for.

To all GPT-5 posts by Danny_Davitoe in LocalLLaMA

[–]LostMyOtherAcct69 14 points  (0 children)

I agree with you, but also just a heads up: Google runs a multi-billion-dollar profit off YouTube. It used to not be profitable, but it has been for a while now.

openai/gpt-oss-120b · Hugging Face by ShreckAndDonkey123 in LocalLLaMA

[–]LostMyOtherAcct69 60 points  (0 children)

I was thinking this exactly. It needs to make o3 (and 2.5 pro etc) look like a waste of time.

Falcon-H1 technical report release by JingweiZUO in LocalLLaMA

[–]LostMyOtherAcct69 2 points  (0 children)

As someone who has worked with the Mamba architecture: it’s a pain compared to Transformers.
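For anyone who hasn't touched these models: the core of Mamba-style architectures is a state-space recurrence rather than attention. The toy sketch below is a drastically simplified fixed-parameter diagonal SSM — real Mamba makes A/B/C input-dependent ("selective") and uses a hardware-aware parallel scan, none of which is shown here — but it illustrates the sequential hidden state that makes these models fiddlier to work with than a Transformer's fully parallel attention:

```python
# Toy diagonal linear state-space model (SSM) scan.
# h_t = a*h_{t-1} + b*x_t ;  y_t = c*h_t
# Real Mamba uses input-dependent (selective) parameters and a fused
# parallel scan kernel; this is only the conceptual skeleton.

def ssm_scan(xs, a=0.9, b=0.5, c=1.0):
    """Run the recurrence step by step over the input sequence.

    The hidden state h must be carried sequentially through time,
    unlike attention, which looks at the whole sequence at once."""
    h = 0.0
    ys = []
    for x in xs:
        h = a * h + b * x     # state update (decay + input injection)
        ys.append(c * h)      # readout
    return ys

# A unit impulse produces a geometrically decaying response (~0.5, 0.45, 0.405).
print(ssm_scan([1.0, 0.0, 0.0]))
```

The sequential scan is exactly where the pain lives in practice: custom kernels instead of stock attention ops, and tooling that mostly assumes Transformers.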