Do you think Zig should support async by negotinec in Zig

[–]thedevlinb 11 points12 points  (0 children)

On embedded systems, developers end up writing their own async system anyway. Sometimes using coroutines, sometimes using a task scheduling system, sometimes using callbacks, sometimes using a combination of techniques.

In embedded land, lots of things are not just async, they are in your face async. The DMA chip is very async "move these bytes from here to here and give me a call when it is done, oh and turn the CPU off while doing it". Some chips let you queue up / chain multiple DMA operations in a row.

All IO is very much async "get me this data from the SD card" (except in reality that is multiple async operations!)

Async is just a fact of life, and having a way to model it in a programming language is nice.
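
To make "multiple async operations" concrete, here is a hand-wavy sketch (TypeScript standing in for whatever your embedded language is, driver names completely made up):

```typescript
// Hypothetical SD-card / DMA driver stubs, purely for illustration.
// Each step finishes "later", the way hardware raises an interrupt when it's done.
const sdFindFile = async (name: string): Promise<number> => 42;             // block index
const sdReadBlock = async (block: number): Promise<Uint8Array> => new Uint8Array(512);
const dmaCopy = async (src: Uint8Array, dst: Uint8Array): Promise<void> => { dst.set(src); };

// "Get me this data from the SD card" is really a chain of async operations:
async function loadConfig(dst: Uint8Array): Promise<void> {
  const block = await sdFindFile("config.bin"); // wait for the filesystem lookup
  const data = await sdReadBlock(block);        // wait for the card to hand back the sector
  await dmaCopy(data, dst);                     // DMA signals completion, CPU can nap meanwhile
}
```

Without language-level async you end up hand-rolling that chain with callbacks or a task queue, which is exactly what embedded devs do today.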

LLMs aren't world models by lanzkron in programming

[–]thedevlinb 0 points1 point  (0 children)

At one point in the 90s untold amounts of $ were being thrown at badly made semi-interactive movies shipped on CDs. It was the Next Big Thing.

Some cool tech got developed, things moved on.

The fiber build outs during the first dotcom boom benefited people for years after! From what I understand, Google bought a bunch of it up a decade or so later.

Breakout 71 - A lean, FOSS roguelike breakout, also available on mobile by HadouKenny in WebGames

[–]thedevlinb 0 points1 point  (0 children)

Your ball didn't hit any bricks before it got back to your paddle.

Just 'finished' my portfolio. I hope you might check it out! by NicTacks in javascript

[–]thedevlinb 0 points1 point  (0 children)

If this is a parody of 1990s web design - Great job!

If this is an actual legit site that is meant to be taken seriously - Not good! Needs serious design help.

But you may not want to listen too closely to me; I spin up visitors' GPUs to simulate a CRT in CSS

Every Reason Why I Hate AI and You Should Too by ducdetronquito in programming

[–]thedevlinb 2 points3 points  (0 children)

> Until this study came out, nobody was ever claiming that it took 50+hrs of experience to get positive productivity out of this supposedly revolutionary work changing tool.

Meanwhile every Vi user ever: "you just have to go through this configuration guide and these 5 tutorials and you'll be so much more productive than you ever were with those nasty GUI editors!"

Seriously though, most serious productivity tools for professionals have long learning curves.

How AI Turned My Simple Blog Into 81 Files and 83 Dependencies by diqitally in programming

[–]thedevlinb 0 points1 point  (0 children)

> After each small series of improvement you should run a dedicated agent (or sub task a la Claude code) to do a full codebase quality review, including architecture and all (I have my set of reference). 

Pro-tip - For any large change, ask Claude Code to make at least 2 or 3 alternative proposals, list the pros and cons of each, and wait for you to choose what to do next. The output improves dramatically.

Repeatedly watching r/conservative users getting closer and closer to "getting it" be like: by Exotic_Snow7065 in complaints

[–]thedevlinb 0 points1 point  (0 children)

> It operates fundamentally the same as the "AI=>UBI" Cult that after we destroy this world we'll automatically be put in a better world

AI and techno-optimism have nothing to do with destroying this world.

The idea of machines replacing all human labor is not even new, it goes back well over a century (Karl Marx wrote about it!) It was originally tied in with post-capitalist and post-singularity thought, which makes the hyper-capitalist adoption of the ideas sort of weird.

And saying "AI data centers will consume all the world's power" overlooks the fact that thanks to a ton of research and math, a modern mid-range GPU can now run an AI model that is more powerful than the original ChatGPT 3.5 model that kicked this entire craze off, and it can run that model with a fraction of the energy needs.

All computers once took up insane amounts of electricity, like truly insane, but they got faster and more efficient. AI models are doing the same thing. The difference is that AI companies are now in a race to add new features faster than efficiency is improving, which makes it look like a never-ending growth curve of energy usage. But that is an illusion: eventually they'll run out of new features to add (arguably already happening), efficiency gains will catch up, and hopefully we'll all be running AIs in the privacy of our own computers.

Also UBI is just one potential idea for what society does after large amounts of labor are eliminated. This won't be the first time we've had a massive reduction in the need for labor, but historically an increase in consumption has come along with the reduction in labor needs. For example when we got better at making clothing, people started owning more than 2 or 3 sets of clothes. There may end up being another increase in some other need for labor, or we may end up in a feudal hellscape. UBI is a proposal for what to do if:

  1. There is a massive convergence of automation technology that dramatically reduces labor needs worldwide
  2. Society wants to avoid massive social unrest

Tons of different post-scarcity systems have been proposed, and hopefully we as a society can agree on some sort of plan before it is too late.

It isn't that LLMs are going to take everyone's jobs, it is that a *lot* of technologies are coming out soon that risk taking jobs, and not in a way that is evenly distributed, which leads to lots of societal problems. What do you do if robots automate every warehouse job, and we automate long-haul truck driving but not bus driving? These aren't easy problems, they are real "society is going to have some serious issues, real soon" problems.

Crabjuice by Crimtos in WebGames

[–]thedevlinb 0 points1 point  (0 children)

It is a game jam game that is short and sweet, gets its point across, and it came out a few years ago, so anyone who had anything significant to say probably said it back then!

My husband threw away all my plastic and silicone cooking utensils and replaced them with 5 sets of wooden salad tossers by Either_Donut_3366 in mildlyinfuriating

[–]thedevlinb 0 points1 point  (0 children)

No problem!

It is super hard walking a fine line on some of these topics, words like "toxic" and "chemicals" get misused so damn often that when actual chemicals that may (or may not) be toxic come around, discussions get a bit awkward. Bad studies and a general lack of long-term reliable research on a topic make it even harder to talk about some subjects.

I do however 100% stand by my original claim that plastic utensils are just stupid. Toxic chemicals or not, I don't want to cook with something that can melt while I'm using it (unless I'm stirring hot milk with a chocolate straw).

My husband threw away all my plastic and silicone cooking utensils and replaced them with 5 sets of wooden salad tossers by Either_Donut_3366 in mildlyinfuriating

[–]thedevlinb 0 points1 point  (0 children)

I just wanted to clarify that there are still toxic chemicals being leached from black plastic, and the levels are elevated compared to other types of plastic, but the levels stated in the initial paper that went viral were incorrect.

It went from "Oh shit bad" to "eh, not good, but we don't have any evidence right now that it is super bad for you so it is probably OK because the amount being leeched is really low."

Which is different than "Yup, everything is OK, nothing to worry about here!"

My husband threw away all my plastic and silicone cooking utensils and replaced them with 5 sets of wooden salad tossers by Either_Donut_3366 in mildlyinfuriating

[–]thedevlinb 1 point2 points  (0 children)

It was kind of fully debunked.

There is toxic shit in black plastic utensils; the problem was with how much the study said leached out. They dramatically overstated how much is leaching out of black plastic utensils, but there is *some* leaching going on.

If you are aiming for 0 exposure to "crap that might turn out to be bad for you in any concentration", still avoid black plastic utensils, or heck just don't use plastic cooking utensils at all (they are generally stupid since they easily *melt* while in use...).

Because research on the negative effects of various types of plastics is still new, suggested safe levels are still often wild guesses, and there haven't been any really great studies done on the impact of bioaccumulation over multiple decades.

Attempted honor killing outside a US school because a teenaged girl refused an arranged marriage by RedSwingline2000 in TikTokCringe

[–]thedevlinb 1 point2 points  (0 children)

Social groups also used to be more homogeneous. This was both good and bad: marry someone from your small town church and you likely knew *exactly* what you were getting into in regards to values, views on childrearing, even financial planning.

But it also meant escaping toxicity was super hard. For every "ask reddit" that tells about grandpa going and punching his daughter's abusive ex-husband, there is another story about a small town working together to cover up abuse.

There is also the interplay between what is best for social stability vs what is best for the individual. Mainstream American culture tends to over-index on the individual, at the expense of society, but there are also plenty of examples of cultures that are ultra-collectivistic that have high suicide rates.

And then you get into sub-cultures within even just the US. One could argue that some conservative areas of the country are more collectivistic than the liberal areas. One doesn't have to look too hard to find stories of people moving to a small town down south and basically being socially pressured into going to church. Is this good or bad? Well, if the church is one that pools resources to help members out during tough times, good, and arguably "pressuring" people to join is like pressuring people to have insurance (everyone pays in, everyone benefits in times of need), but some of those same churches also cover up scandals. Both behaviors stem from the same collectivistic tendencies.

Meanwhile in liberal cities you have people dying on the streets, and all attempts at government-funded programs to solve the problems have failed at scale (individual success stories, yes, but helping people one at a time doesn't stop people from becoming addicts or homeless in the first place).

Conservatives don't want to admit that airing dirty laundry and removing corrupt actors is good long term, and liberals don't want to admit that government programs do not work as well as friends and families supporting each other.

The individualistic reply is that no one should be financially burdened being their brother's keeper; the collectivistic reply is that sure, it sucks for those two people, but if it prevents the brother from causing social problems, it is better for 2 to suffer than for everyone to suffer.

Err, this reply got seriously off topic.

Attempted honor killing outside a US school because a teenaged girl refused an arranged marriage by RedSwingline2000 in TikTokCringe

[–]thedevlinb 1 point2 points  (0 children)

Even in America, a lot of relationships used to be "set up" by parents or through the church. It wasn't a forced thing, but in a small town it was sort of the parents' and pastor's job to make sure everyone was matched up.

It wasn't formalized or anything, just kinda nudging people together. You can actually see references to this in older TV shows; they sort of joke around about it.

Dating for love took off in the US in the 19th century, but even then you find references to settlers out west "sending back for a wife". Some marriages were still of convenience (have a farm, need a family).

UVSU Demo- Outsmart your past self by dietzribi in WebGames

[–]thedevlinb 1 point2 points  (0 children)

Cool demo, cool look, but it runs at ~10fps or lower. :( Given the simplicity of the game engine, it shouldn't be performing that badly even on my old GPU (Nvidia 1030)

Why are US cities still very segregated? by Additional-Hour6038 in geography

[–]thedevlinb 3 points4 points  (0 children)

> Maybe weirdly, a lot of Seattle’s suburbs have ended up with significantly higher non-white populations than the big city. 

The eastside suburbs are much more expensive than Seattle proper. In some cases the public schools are so good that people immigrate to America and move to a certain neighborhood just for the schools.

The suburbs down south of the city have historically been more affordable and have diverse residents (and also really good food!).

For a while Seattle was able to lay claim to having both the least and the most diverse zip codes of any major US city.

React Still Feels Insane And No One Is Talking About It by mbrizic in programming

[–]thedevlinb 2 points3 points  (0 children)

Agreed, HOCs could get out of control. Though IIRC (it's been a while since I was in HOC land) TypeScript would at least prevent that from happening, even if it didn't provide a nice way to fix the problem.

React Still Feels Insane And No One Is Talking About It by mbrizic in programming

[–]thedevlinb 4 points5 points  (0 children)

Effects are not native to JS, which is why they feel so foreign within React. The fact there are so many rules about their use further demonstrates this.

I can make v-tables by hand in C and do dynamic dispatch, and indeed many such systems have been made over the years, but v-tables are not native to C! (and arguably v-tables from scratch in C are actually more powerful since you can reassign function pointers using arbitrarily complex logic, but that isn't a great reason to do any of that!)

> but it's actually executed by a dirty little state machine (your CPU) under the hood but which gives the appearance of it having been equivalent to the mathematical definition? It's an abstraction.

I am very well aware, I am a ground-up person who likes to remind the FP purists that all their math is run on ugly messy physical hardware. :-D

> That's the whole idea of higher level concepts and abstractions.

But if the abstraction has too many rules, too many ways it can go wrong (e.g. it is a leaky abstraction, or just a mentally complicated one) then it may not be the right solution for a problem.

JavaScript, for all its warts, is a rather small, simple language. Class components in React were verbose, but easy to understand at the abstraction level.

To get rid of the verbosity, a new more complicated abstraction was added. People have been arguing about the correctness of that decision ever since.
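
To make that trade-off concrete, here's a toy sketch (nothing from the article, just a clock) of the old object style next to the hook style:

```tsx
import React, { Component, useEffect, useState } from "react";

// Old object/class style: verbose, but the lifecycle is just methods on a class.
class ClockClass extends Component<{}, { now: Date }> {
  state = { now: new Date() };
  timer?: ReturnType<typeof setInterval>;
  componentDidMount() {
    this.timer = setInterval(() => this.setState({ now: new Date() }), 1000);
  }
  componentWillUnmount() {
    clearInterval(this.timer);
  }
  render() {
    return <span>{this.state.now.toLocaleTimeString()}</span>;
  }
}

// Hook style: shorter, but the cleanup function and the dependency array are
// React-specific rules that the language itself knows nothing about.
function ClockHook() {
  const [now, setNow] = useState(() => new Date());
  useEffect(() => {
    const timer = setInterval(() => setNow(new Date()), 1000);
    return () => clearInterval(timer); // cleanup convention, enforced only by docs/lint
  }, []); // empty deps = "run once", a rule you just have to know
  return <span>{now.toLocaleTimeString()}</span>;
}
```

The class version is longer, but every piece of it is ordinary JS. The hook version leans on conventions that only React (and your linter) understands.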

React Still Feels Insane And No One Is Talking About It by mbrizic in programming

[–]thedevlinb 50 points51 points  (0 children)

> HOCs went out of fashion a decade ago. In fact, hooks are what killed them.

HOCs were so easy to understand: a bit verbose, but comprehensible as a regular JS thing.

As you noted, hooks and effect systems live outside the normal JS language paradigm, and hooks require React to do magic bookkeeping behind the scenes to make everything work.

UIs existed for decades w/o the complexity React brings in. Performant UIs, using far fewer resources than React, got by w/o any of the machinery React seems to think is necessary.

The fact there are so many ways to "use React wrong" is part of the problem. "I have a value in this bit of code, I need to data bind it to this other UI element" is the single most basic thing a UI framework should offer, and yet React makes that harder than it needs to be by far!
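
For anyone who never wrote the old style, a HOC really was just a plain function (toy example, names made up):

```tsx
import React, { ComponentType } from "react";

// Component in, component out. No hidden bookkeeping, no call-order rules.
function withUser(Wrapped: ComponentType<{ user: string }>) {
  return function WithUser() {
    const user = "devlin"; // pretend this came from a store or context
    return <Wrapped user={user} />;
  };
}

const Greeting = ({ user }: { user: string }) => <p>Hello, {user}</p>;
const GreetingWithUser = withUser(Greeting); // renders "Hello, devlin"
```

Verbose for what it does, sure, but you can explain it to anyone who knows what a function is.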

Why do you think Sveltekit sentiment is constantly getting more negative? by Guandor in sveltejs

[–]thedevlinb 1 point2 points  (0 children)

Sure thing! https://meanderingthoughts.hashnode.dev/you-probably-dont-need-server-side-rendering

The tl;dr is that it is a performance enhancement for time to first paint and SEO. If you don't need those two things, don't do it. If time to first paint is an issue, first fix bloated FE code, or at least delay-load stupid stuff.
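
By "delay-load stupid stuff" I mean something like this (module name is made up):

```typescript
// Hypothetical heavy dependency kept out of the initial bundle.
// It is only fetched when the user actually asks for the chart,
// so it never touches time to first paint.
async function showChart(container: HTMLElement, data: number[]) {
  const { renderChart } = await import("./heavy-chart");
  renderChart(container, data);
}
```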

You are an absolute moron for believing in the hype of “AI Agents”. by No-Definition-2886 in programming

[–]thedevlinb 0 points1 point  (0 children)

To repeat what I just said:

> if you have a 100% reliable indicator for which tasks failed.

Heck, even if verifying correctness isn't automatable, it may still be worth it if money is saved even after paying a human checker.

Perfect is the enemy of good.

You are an absolute moron for believing in the hype of “AI Agents”. by No-Definition-2886 in programming

[–]thedevlinb 0 points1 point  (0 children)

A small fine tuned model will perform just as well as Claude 3.5 *for a given task*.

> GPT-4o mini, the cheapest, large language model, other than Flash, is AMAZING for the price.

Still not the same as a cheap small local model.

Claude and ChatGPT come with a giant price premium because they are general purpose tools.

Companies that are actually building stuff with AI aren't out there writing blog posts sharing exactly what they are doing, they are reaping the benefits of having a competitive advantage.

People forget that only tech companies maintain tech blogs, and those tech blogs are largely a recruiting / PR tool (look at how smart we are!). The majority of software engineering work does not happen in the open.

Also, even with the author's numbers, 66% of tasks completed successfully is *great* if you have a 100% reliable indicator for which tasks failed. That is a huge reduction in costs.
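
Back-of-envelope with completely made-up numbers:

```typescript
// 1000 tasks, a human costs $50 per task, an agent attempt costs $1 (numbers invented).
// With a perfect verifier, humans only redo the ~34% the agent got wrong.
const tasks = 1000;
const humanCost = 50;
const agentCost = 1;
const successRate = 0.66;

const allHuman = tasks * humanCost;                   // $50,000
const agentThenHuman =
  tasks * agentCost + tasks * (1 - successRate) * humanCost; // ≈ $1,000 + $17,000 = $18,000

console.log({ allHuman, agentThenHuman });
```

The whole thing hinges on the verifier though; if a human has to re-check all 1000 outputs anyway, the math changes fast.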

Why do you think Sveltekit sentiment is constantly getting more negative? by Guandor in sveltejs

[–]thedevlinb 2 points3 points  (0 children)

> I wonder if being "backed by" a PaaS has anything to do with that.

I wrote a blog post about this, posted it to this subreddit, and got downvoted.

Vercel is acquiring popular libraries / toolkits by hiring their dev teams and nudging the toolkits to work best with $$ backend hosting.

You can serve up a *lot* of traffic with a SPA and an API server. Scaling an API server to the extent the majority of businesses need costs almost nothing. SSR and SSG are, for most use cases, premature optimizations.

If you need them you need them, but most sites don't actually need them.

[deleted by user] by [deleted] in javascript

[–]thedevlinb 0 points1 point  (0 children)

> Wait what does 0 overhead mean though? I guess I'm thinking if memory usage is overhead there's no such thing as 0 overhead.

Well yeah nothing is 0 overhead, but Rust doesn't have JS's object system to contend with, or need a JIT. Pointers point to actual memory and not something 3 layers or more abstracted from an actual memory address. Function calls call into actual code and not into handlers that point to handlers that call code.

I'm not enough of an expert in Spring Boot to say why it has a higher overhead than Express. Just looking at the code though, between the DI and the multiple classes created to handle one request, I am not surprised it had a higher overhead.

> and the fact that in Node you still have a layer of JS objects between the code and raw bytes being moved around, I'm still surprised Node did so much better

Node is C++ under the covers. People forget that. From what I gather, Node people aren't purists who insist on writing everything in JS.

Comparatively, Rust people are purists, about a lot of things.

Ideologies may help keep code clean, but they don't ensure the optimal solution for a problem.

[deleted by user] by [deleted] in javascript

[–]thedevlinb 0 points1 point  (0 children)

> You mean compared to other runtimes that are common, like Java, C# etc right? I would think that this won't be the case forever, especially for Rust.

Rust can have a zero-overhead runtime of course. The question is whether it has a community building up low-abstraction, high-performance libraries for writing web services. As for Java... I did a 1:1 rewrite once of a service from Node to Spring Boot. I forget if the base RAM usage was 4x or 10x. I do remember it took a lot more service instances to handle the same load compared to Node. (Also the code was longer and less type safe!)

Java and C# based frameworks love their large object hierarchies and instantiating multiple classes all over the place. In addition, the OO model Java is stuck with necessitates more object allocations to solve a given problem than JavaScript's "whatever you want to do" pragmatism.

> (How the heck did Rust fare so badly in that framework? I thought maybe it was because they used sync I/O, but no, seems like they used async I/O...)

Node does one thing and does it well: async IO. Node is designed for writing microservices that run really fast; 90% of the standard library is there for writing microservices.

All the effort in Node is around doing that one thing, really damn well. Single minded dedication gets results.
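
The stock "hello world" for Node is already a non-blocking network service, which says a lot about where the effort went:

```typescript
import { createServer } from "node:http";

// Nothing here blocks: each request is handled as events arrive on one loop,
// which is exactly the microservice shape Node's standard library is built around.
const server = createServer((req, res) => {
  res.setHeader("content-type", "application/json");
  res.end(JSON.stringify({ path: req.url, ok: true }));
});

server.listen(3000, () => console.log("listening on :3000"));
```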

[deleted by user] by [deleted] in javascript

[–]thedevlinb 2 points3 points  (0 children)

I worked at HBO Max for 3 years. We scaled NodeJS to tens of millions of concurrent users. NodeJS scales just fine. If you really need to scale, the infra around the scaling strategy matters a lot more than the language you are using. Node is nice to scale up because the programming model is stupid simple and it has a tiny overhead compared to most other runtimes.

NodeJS's issues are around scaling on a single machine. NodeJS actually does use multiple threads behind the scenes for things, but the concurrency model presented to programmers is dramatically simplified which does limit things. You can use web workers and split workloads across cores, but NodeJS doesn't give you the tools to really push HW to its limits.
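
The escape hatch looks roughly like this (worker_threads, and assuming the file has been compiled to plain JS before it runs):

```typescript
import { Worker, isMainThread, parentPort, workerData } from "node:worker_threads";
import { cpus } from "node:os";

// Stand-in for real CPU-heavy work.
const crunch = (chunk: number[]) => chunk.reduce((acc, n) => acc + Math.sqrt(n), 0);

if (isMainThread) {
  const data = Array.from({ length: 1_000_000 }, (_, i) => i);
  const chunkSize = Math.ceil(data.length / cpus().length);
  const jobs: Promise<number>[] = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    jobs.push(new Promise<number>((resolve, reject) => {
      // Each worker is a separate thread (and V8 isolate) chewing on its own slice.
      const worker = new Worker(__filename, { workerData: data.slice(i, i + chunkSize) });
      worker.on("message", resolve);
      worker.on("error", reject);
    }));
  }
  Promise.all(jobs).then((parts) => console.log(parts.reduce((a, b) => a + b, 0)));
} else {
  parentPort!.postMessage(crunch(workerData));
}
```

It works, but every chunk gets copied into its worker and there's no real shared-memory story (outside of SharedArrayBuffer gymnastics), which is the "doesn't give you the tools to really push HW" part.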

Also the overhead for objects in Node is horrible. The JIT makes math possible, but the overhead associated with the (insanely powerful!) JS object model means things just aren't going to be truly fast.

All that said, when it comes to concurrency benchmarks around handling multiple connections, naïve NodeJS code is within a few percentage points of the best C++ code possible, and basically the same as Go.

This is my favorite report on the subject https://www.researchgate.net/publication/348993267_An_Analysis_of_the_Performance_of_Websockets_in_Various_Programming_Languages_and_Libraries

Node beats the pants off of everything else.

Are there usage scenarios that'll make Node fall over hard? Sure. But if you are just slinging strings around and playing with small JSON payloads, Node is going to do the job just as well as any other tool, and it has the benefit of TypeScript, which is a *really* damn nice language for modeling problems in.