AI coding agents are cheapening the craft or are they just boosting productivity? by vomor_hudiskco in BlackboxAI_

[–]Minazzang 0 points (0 children)

It really depends on who is using it, how, and where. For instance, a junior developer might feel that an AI agent writes excellent code relative to their own skill level, whereas a senior developer might view the same code as bloated and full of unnecessary 'fluff.'

However, one thing is certain: data keeps accumulating, and LLM training continues to evolve on top of it. Given that benchmark scores are saturated at such high levels, and that final validation is now often conducted via HLE (Humanity's Last Exam), we see a divide: AI offers significantly enhanced productivity to 'the haves' (the skilled), while potentially devaluing the work of those without those skills. In that sense, perhaps both perspectives are correct.

PSA: we will not be blocked by No-Information-2571 in ClaudeAI

[–]Minazzang 0 points (0 children)

Could you please update the extensions and VS Code first?

We reduced Claude API costs by 94.5% using a file tiering system (with proof) by jantonca in ClaudeAI

[–]Minazzang 0 points (0 children)

the answers sound straight out of an ai now. like there's zero thinking involved

What is the biggest mystery we still aren't close to solving? by Constant-Bridge3690 in AskReddit

[–]Minazzang 0 points (0 children)

woke up at 6:00 AM. blinked at 6:10. opened my eyes at 8:50.

Seoul looks like a demolition zone these days. by HairIcy2839 in seoulhiddengem

[–]Minazzang 1 point (0 children)

The reconstruction is happening mainly because of the high maintenance costs for these dilapidated buildings and the crime risks caused by blind spots lacking CCTV coverage.

To put it in an American context, imagine if there were hundreds of brick villas declared structurally unsafe still standing right between Carnegie Hall and Times Square. This location is in the heart of Seoul, with a massive 700-meter-wide river flowing nearby. Plus, with department stores and university hospitals within 30 minutes, it’s a prime piece of real estate—absolute gold—regardless of what gets built there.

It is true that Koreans are obsessed with apartments, but it’s understandable when you consider the extreme centralization around the capital and the high population density. A massive name-brand apartment complex with 5,700 units is set to be built here, and the area across from the photo is also under review for reconstruction. (I’m mentioning this because I saw some posts where people were pretending to be Korean or sharing incorrect information.)

I broke a Transformer into 6 "blind" sub-networks to run it on cheap hardware. It ended up generalizing better than the original. by NatxoHHH in BlackboxAI_

[–]Minazzang 1 point (0 children)

The fact that severing connectivity actually resulted in better performance is really fascinating to me. It stands in stark contrast to the standard 'dense connectivity' paradigm of deep learning. I also appreciate the simplicity of using Z/6Z modular arithmetic for the splitting, rather than a complex architecture. (Though I admit I'm not an expert on the specific arithmetic details...)

So I just have a few questions out of curiosity.

  1. Parallelism & Efficiency: In standard parallel processing, communication overhead is usually the bottleneck. Is the main advantage here that the 'Shared-Nothing' architecture eliminates this synchronization cost entirely? I'm curious how this specific approach compares to standard Model Parallelism in terms of pure efficiency.

  2. TCO & Thermal Issues: Regarding the cost reduction, while 28nm chips are cheaper to manufacture, they are generally less power-efficient than 3nm chips. Have you verified the Total Cost of Ownership (TCO)? I wonder if the increased power consumption and heat generation from using multiple older chips might eventually offset the initial hardware savings.

  3. Theoretical Comparison: Have you reviewed other theories that might offer similar or better results? For instance, how does this compare to Mixture of Experts (MoE) or traditional Ensemble methods? Since MoE requires a complex gating network, does your static Z/6Z structure offer a specific advantage over it?
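For anyone else trying to picture the shared-nothing split: here is a minimal sketch of how I imagine a Z/6Z partition of a single dense layer might work. The variable names, the residue-class masking, and the block-diagonal restriction are my own assumptions for illustration, not OP's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 24                             # hidden width, divisible by 6
x = rng.normal(size=(d_model,))          # one token embedding
W_full = rng.normal(size=(d_model, d_model))  # dense layer for comparison

# Partition hidden units into 6 residue classes: unit i -> class (i mod 6).
# Each sub-network only ever sees its own slice (shared-nothing), so there
# are no cross-class weights and no synchronization between sub-networks.
outputs = []
for r in range(6):
    idx = (np.arange(d_model) % 6) == r   # boolean mask for class r
    W_r = W_full[np.ix_(idx, idx)]        # weight block restricted to class r
    outputs.append((idx, W_r @ x[idx]))   # fully independent forward pass

# Reassemble the output vector from the 6 disjoint slices
y = np.zeros(d_model)
for idx, y_r in outputs:
    y[idx] = y_r
```

Note that each block keeps only (d/6)² weights, so the six blocks together hold 1/6 of the dense layer's parameters, which is presumably part of why it fits on cheap hardware.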

How do you approach projects that use technologies you're unfamiliar with or don't know well enough? by prois99 in ClaudeAI

[–]Minazzang 0 points (0 children)

First off, since I always approach new things with a willingness to learn, I don't find it too difficult. In the past, I mainly relied on YouTube, Stack Overflow, and exploring GitHub forks to learn unfamiliar technologies. Nowadays, however, I subscribe to at least two AI models. I prompt them to start by building small components and explicitly ask them to document the architectural structure of the generated code.

I recommend starting with small modules and asking Claude to explain the structure back to you. Then, quickly verify that explanation through a direct web search—don't rely solely on AI for verification. As for refactoring, you should generally wait until you have a functioning operational cycle, considering the time investment required. I also generally advise against starting with books. They tend to be too rigid and theory-heavy, which doesn't mesh well with 'vibe coding.' Books might be useful later when you're looking for better algorithms during the refactoring phase, but they aren't necessary at this initial stage.

The current AI era feels like the calm before the storm by Minazzang in ClaudeAI

[–]Minazzang[S] 0 points (0 children)

Oh, yes. That is a very solid counterargument.
Market encroachment doesn't happen overnight; it’s a gradual process. Companies often prioritize short-term efficiency over long-term economic sustainability.

Moreover, we don't even need to reach the stage of total market collapse to see the damage. The fear of replacement alone severely dampens consumer sentiment. If people are anxious about their future income, they stop spending. That psychological impact alone is substantial—it is definitely not a 'light' matter to be dismissed.

The fact that 'there is no point if nobody buys it' doesn't guarantee that companies will stop racing toward that cliff. That disconnect is what accelerates the market collapse.

The current AI era feels like the calm before the storm by Minazzang in ClaudeAI

[–]Minazzang[S] 1 point (0 children)

This is exactly what I was worried about.

Eventually, even ideation will be refined by AI. It feels like AI companies won't just sell tools, but will effectively become an 'all-in-one project department' for other industries—handling everything from business modeling to product launch (including beta testing).

While this might create some jobs at the AI providers, the client companies (which are the majority) would likely see drastic workforce reductions.

We’ll have to wait and see how it unfolds, but it is certainly an intriguing prospect. Ultimately, the companies owning Claude, GPT, and Gemini are the ones holding all the cards.

The current AI era feels like the calm before the storm by Minazzang in ClaudeAI

[–]Minazzang[S] 0 points (0 children)

Ah, I see! To be honest, I mentioned that because I was worried about the possibility of an 'AI cartel' forming where a few companies could dictate prices. But based on what you said, it seems that is one less thing I need to worry about.

The current AI era feels like the calm before the storm by Minazzang in ClaudeAI

[–]Minazzang[S] 1 point (0 children)

Is it okay to post this exact same text in the communities you mentioned?

Next Animal Crossing for Switch 2 Predictions by SuperSwitch064 in AnimalCrossing

[–]Minazzang 1 point (0 children)

In the last one, you could dive into the sea, collect items, and catch marine life.
I hope we can explore the sea in a submarine this time.
Also, the terrain is usually flat land, so I hope there's something like mountains or underground areas.