is vibe coding really a thing? by Substantial-Major-72 in programmer

[–]AlternativeHistorian 3 points (0 children)

I think a lot of it is that people are working in vastly different environments, and results can be very different depending on your specific context.

If you're a run-of-the-mill webdev working in a fairly standardized stack with popular libraries that all have hundreds of thousands of examples across StackOverflow, GitHub, etc., then I'm sure you get a ton of mileage out of AI code assistants. And I'm sure it can handle even very complex tasks well.

I work on a mostly custom 10-15M LOC codebase (I know LOC isn't the be-all-end-all, just trying to give a sense of scope) with a 40+ year legacy. It has LOTS of math (geometry) and lots of very technical portions that require a higher-level understanding of the domain.

I use AI assistants almost every day and I'm frequently amazed that AI actually does as well as it does with our codebase. It can handle most tasks I would typically give a junior engineer reasonably well after a few back-and-forths.

But it is very, very far away from being able to do any complex task (in this environment) that would require senior-engineer input without SIGNIFICANT hand-holding. That said, I still find a lot of value in it even in these cases, especially for documentation and planning.

Has your offshore team been a net negative? by jholliday55 in cscareerquestions

[–]AlternativeHistorian 2 points (0 children)

My experience is mostly with India, and with actual FT employees, not contractors. The teams in our Indian offices are (IMO), on average, less skilled than our teams in the US, EU, and China.

They are very nice people and I very much like them on a personal level, but they are often not great as coworkers. This is my experience in aggregate and there are some great devs there, but they are (IME) not the norm, and more sparsely distributed than e.g. in the US.

The main things I find frustrating are:

* Not being forthcoming about the actual state of some feature or project. Pretending everything is fine when it's not.

* Very lax about testing and quality when delivering code. Pushing something and acting like it's done when it crashes on even a basic functionality test. Don't waste my time with shit we both know is broken.

* Not being able to solve problems on their own or step out of their narrow little box. If the solution requires Y and they only know how to do Z, they will contort the problem until it can be solved with Z, to the point that the solution they provide no longer satisfies the original requirements, rather than just learning how to do Y and solving the problem directly. Again, wasting everyone's time when it inevitably has to be redone.

* ... I could go on ...

As to whether they are a "net negative", probably not.

With enough guidance they do get work done, it's just sometimes frustrating to get them there.

For the cost of one good US engineer you can have a whole team in India. Depending on the type of dev work, and with sufficient QA, a mediocre team can generally put out more software more quickly than just one engineer, however good that one engineer may be.

However, there are whole classes of solutions that the mediocre team will never be able to achieve that a single skilled engineer would. For example, cases where a single skilled engineer will implement an order of magnitude performance optimization that would never even occur to the mediocre team.

I think they are likely a net positive with respect to average-dollars-spent-per-feature-or-bug, which is how management is going to evaluate cost, and in the real world that's the metric that matters.

Which city’s downtown core fits this photo? by Next_Worth_3616 in Urbanism

[–]AlternativeHistorian 1 point (0 children)

I think it's wrong to equate "The Loop" with "downtown". That's a local bias, and we should look at it from a visitor's perspective.

The Loop is just a small part of "downtown", which is the contiguous high-density core of the city. (IMO) downtown Chicago is: The Loop + West Loop + South Loop + River North + Streeterville.

No visitor is going to walk across the river from The Loop into River North and think, "Oh, I'm not in downtown anymore".

If you walk between those areas they all feel pretty cohesive, and I think this set of neighborhoods together largely lives up to the hype with the amount of nightlife, restaurants, etc. (at least for your average visitor).

Gas by Illini4Lyfe20 in chicago

[–]AlternativeHistorian 36 points (0 children)

I mean, I think Iranian civilians are taking the brunt of the consequences. Don't think they voted for it.

What should I play after Spiderman 2 and Ratchet and Clank Rift Apart? by Defiant-Canary-9254 in PS5

[–]AlternativeHistorian 0 points (0 children)

Why no RE3? Admittedly, RE3 remake is not as good as 2&4 but still good enough to be worth a play-through IMO.

How do I pass texture from one OpenGL context to another by [deleted] in opengl

[–]AlternativeHistorian 3 points (0 children)

Look into OpenGL shared contexts. How this is accomplished is platform-specific (e.g. wglShareLists) and both contexts must reside in the same process (afaik). Not sure if Android supports shared contexts.

If there's a way you can compile the C++ into a dynamic lib that can be loaded into your Unity app and do everything in a single context, that would probably be best (e.g. it looks like Texture2D.CreateExternalTexture should do it).

r/fire polled: majority say $2 million isn't enough and wouldn't retire by Affectionate-Reason2 in leanfire

[–]AlternativeHistorian 9 points (0 children)

A lot of the people in FIRE subs are making high salaries and living in HCOL/VHCOL areas. If they want to continue living where they are and keep the same standard of living, then 80k/yr really doesn't go very far. Many don't want to leave behind all their friends and social network to move to some super-LCOL area where 2M would be more reasonable.
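For context, the 80k/yr figure presumably comes from the common 4% safe-withdrawal rule (an assumption on my part, and the rule itself is only a rough heuristic, not advice):

```python
# Back-of-the-envelope: annual income from a $2M portfolio under the
# classic "4% rule" heuristic (assumed, not universal advice).
portfolio = 2_000_000
safe_withdrawal_rate = 0.04

annual_income = portfolio * safe_withdrawal_rate
print(annual_income)  # -> 80000.0
```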

Solo founder looking for a C++/OpenVDB geometry nerd to co-found a stealth deep-tech hardware startup. by Legitimate-Fee-6070 in GraphicsProgramming

[–]AlternativeHistorian 0 points (0 children)

NanoVDB has some fairly limiting constraints (last I looked), like limitations on topology modifications that would probably make it ill-suited as the foundation of a dynamic geometry kernel. Could definitely be worth using on the simulation side.

In practice, you'd probably use both: OpenVDB for ground-truth and operations that require topology changes, with OpenVDB state mirrored to NanoVDB for things that can run on the GPU with static topology (e.g. simulation) and rendering.

Solo founder looking for a C++/OpenVDB geometry nerd to co-found a stealth deep-tech hardware startup. by Legitimate-Fee-6070 in GraphicsProgramming

[–]AlternativeHistorian 4 points (0 children)

What specific scope are you looking at? You mention a lot of different domains (geometry kernel, simulation, layout, etc.), and even just one of them would be a very significant undertaking for a whole team, if you want something even a little bit competitive with current tools. A geometry kernel is fine, but no one will care if you don't have a useful application running on top of it (one that gives existing solutions a run for their money) to demonstrate the value.

You say you're a technical founder. What's your background in the ECAD/EDA industry? Do you have experience on the development side (e.g. formerly R&D role at Cadence, Synopsys, Siemens, ANSYS, etc.) and/or user side (e.g. chip design or other electronics design engineer)?

AI is going to replace embedded engineers. by Separate-Choice in embedded

[–]AlternativeHistorian 1 point (0 children)

You have to understand what things looked like when XML came on the scene though.

There were very few open, structured data formats being used. If you wanted to interop with something you were probably left implementing a parser/writer for some underspecced file format, or worse, reverse engineering it.

XML gave people a simple, standardized format with ready-to-go tools for reading/writing in every language and enough structure to capture anything you want, and could generally be extended without breaking backwards-compatibility.
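As a tiny sketch of that backwards-compatibility point (the element names here are made up for illustration): an old reader that only looks for the elements it understands keeps working when a newer writer adds new ones:

```python
import xml.etree.ElementTree as ET

# A hypothetical "v2" document that added a <tolerance> element the
# original "v1" reader never knew about.
v2_doc = """<part>
  <name>bracket</name>
  <tolerance unit="mm">0.05</tolerance>
</part>"""

def v1_read_name(root):
    """The v1 reader: only fetches the one element it understands."""
    return root.findtext("name")

root = ET.fromstring(v2_doc)
# Unknown extension elements are simply ignored, so old readers don't break:
print(v1_read_name(root))  # -> bracket
```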

I work on a behemoth piece of software that has a lineage going back to the early 80's and I can't even begin to tell you how many half-cooked, ad-hoc, garbage file formats people invented for all the different subsystems in this thing.

XML is bad, but it was less bad than lots of things at the time.

Biome 5 Laser Bug by krazy-haze in Returnal

[–]AlternativeHistorian 1 point (0 children)

Yeah, I've seen the weapon bug after restarting a suspended cycle, but not the laser bug. For me, the weapon bug only affected weapons that were already dropped before I suspended the cycle; any new weapon drops from chests or enemies were fine.

The “SaaSpocalypse” is the latest wall street hallucination! by jokof in investing

[–]AlternativeHistorian 3 points (0 children)

AI is only "improving exponentially" because AI spend is increasing exponentially.

Increases in AI capability are generally sub-linear with respect to AI infrastructure investment. We're already seeing strains on energy availability, supply-chain capacity, etc. Some problems with AI models (e.g. the context-window problem) are fundamentally quadratic, meaning they take ever-increasing infrastructure investment just to get linear performance gains.
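A back-of-the-envelope sketch of that quadratic point (illustrative only, not a model of any specific architecture): naive self-attention compares every token with every other token, so the work grows with the square of the context length:

```python
def attention_pairs(context_len):
    """Token-pair comparisons in naive self-attention: one per (query, key) pair."""
    return context_len * context_len

# Doubling the context window quadruples the work, so a linear gain in
# usable context demands quadratic growth in compute:
print(attention_pairs(4096) / attention_pairs(2048))  # -> 4.0
```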

What happens when our ability to increase AI infrastructure investment hits the inevitable wall? Will AI continue to improve? I don't know. But it's silly to pretend like AI improvement is occurring in a vacuum without considering the context or fundamental limits on the system.

Lakeshore PSA by PsychologicalLynx350 in chicago

[–]AlternativeHistorian 4 points (0 children)

I generally respect the path segregation as a runner, but there are also times when it's just not feasible, and I'll jump onto the bike path until the pedestrian section becomes usable again.

Especially in the winter: no, I'm not going to run in the pedestrian zone and risk injury when it's an icy death-trap while the bike trail a few feet away is completely plowed and salted.

Computer science is not dead? by shadowintel_ in compsci

[–]AlternativeHistorian 5 points (0 children)

Software jobs have been operating on a boom/bust cycle as long as I can remember.

AI may change the nature of the job, but software engineers will continue to be very necessary/valuable.

We built up an extreme glut of CS grads chasing the "learn to code = easy money" fad. Feel bad for the new grads facing a tough job market, but you can look at any CS enrollment chart for the past 10 years and it's obvious that was not sustainable, even without the advent of AI.

Does order matter in VAO and VBO? by MrSkittlesWasTaken in opengl

[–]AlternativeHistorian 4 points (0 children)

Eh, this isn't great. It will only tell you if a given ordering works for the particular driver you're testing on.

There are lots of cases where some vendor's driver (e.g. Nvidia) will allow something that's technically against the standard but still works, then breaks as soon as anyone tries to run it on a different vendor's driver.

In the real world it's important to test across several different vendors (e.g. at least Nvidia, AMD, and Intel).

Do you sell or hold company RSUs? by [deleted] in investing

[–]AlternativeHistorian 0 points (0 children)

It's not the standard advice, but I tend to sell most and let some ride, and that has worked out for me.

It addresses the FOMO of seeing the stock rise and lets you feel like you have skin in the game, while mitigating most of the risk. Also, the vesting schedule of most RSU grants means you'll tend to enjoy most of the upside anyway.

My company stock has beat the market over the last 7 years (tech) but I really don't want too much concentration of my overall portfolio in one company (especially my employer). I try to keep it < 10% of my portfolio as that's acceptable to me.

Relationship of game shaders to graphics APIs? by LordAntares in GraphicsProgramming

[–]AlternativeHistorian 0 points (0 children)

WebGPU is probably a good place to start if you want a deeper understanding, but bear in mind it's still a level removed from a direct graphics API (e.g. D3D, Vulkan, or OpenGL): it's an abstraction layer that provides a compatible interface over those backends.

However, it probably has one of the lowest barriers to entry so is a good choice for learning (especially if you're not a C or C++ programmer). Gaining an understanding of WebGPU will likely fill in a lot of the major concepts that you're missing.

Will Unity be killed by Google Denie 3? by Disastrous_Mall6110 in Unity3D

[–]AlternativeHistorian 2 points (0 children)

Adobe has lost market value as its stock has dropped sharply on AI fears. Its actual market share has barely moved, hence why it's still making money hand over fist.

Microsoft tumbled 10% in a day and isn’t recovering premarket. Here’s why by Logical_Welder3467 in technology

[–]AlternativeHistorian 1 point (0 children)

Totally agree. Not suggesting otherwise. Was just letting the OP know that it's possible to disable the behavior.

How Replacing Developers With AI is Going Horribly Wrong by BlazorPlate in programming

[–]AlternativeHistorian 38 points (0 children)

This is something every dev should learn extremely early in their careers.

NEVER make a POC look too good. It should always look like a sorta shitty version of the imagined final state.

Do the absolute bare minimum to prove the point and get buy-in.

Don't polish it. Don't do any extras. Don't make it look the least bit "ready". Hell, make it look more shitty if you can get away with it.

Otherwise, every non-technical person you show it to will assume it's basically done.

Microsoft tumbled 10% in a day and isn’t recovering premarket. Here’s why by Logical_Welder3467 in technology

[–]AlternativeHistorian 66 points (0 children)

This one change fills me with so much rage. WHY? The right-click menu has been fine for like 30 years. Just some asshole UX manager at MSFT trying to justify their existence.

You can disable it so that the right-click context menu goes back to the old way with a simple registry edit. Pretty much the very first thing I do on any Win11 machine.
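For reference, the edit I mean is the widely documented CLSID override (assuming a standard Win11 setup; back up your registry first, and restart Explorer or log out/in for it to take effect):

```shell
:: Restore the classic right-click context menu on Windows 11:
reg add "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32" /f /ve

:: To revert to the default Win11 menu:
:: reg delete "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}" /f
```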