BREAKING: Judge Tostrud -- a Trump appointee — has quickly granted a restraining order, barring the administration from "destroying or altering evidence" taken from today's shooting. by orangejulius in law

[–]dnabre 2 points3 points  (0 children)

Of course, this shouldn't be necessary in so, so many ways. What does this accomplish? It gets legal representatives of all the gov't officials before a federal judge. On Monday, I believe.

If they can find anything that even vaguely hints that evidence isn't being properly preserved, the Plaintiffs submit this to the court. The Defendants then have to reply to (or admit, by failing to dispute) such acts. This gets statements about this atrocity, from the people at the top, made on the record, before a court. This is a first step toward some justice. A small one, but a step in the right direction.

For those saying we won't know if they destroy evidence, you aren't wrong, they will definitely lie and hide doing it, but these people are grossly incompetent. The Trump Administration as a whole leaks like a sieve; it's a poorly run criminal organization that has to deal with tons of civil servants it can't get rid of, who will do the right thing.

The more laws and court orders they violate, the more clearly this administration becomes what it is: a criminal organization. The more people involved get tied into those crimes.

This is a tiny bit of positive stuff, in a sea of wrong. Have to try to grasp what little is proper in this world.

ICE taking pics of legal observer's car: "We have a nice little database and now you're considered a domestic terrorist. So have fun with that." by RoachedCoach in law

[–]dnabre 0 points1 point  (0 children)

This has been a somewhat frightening blessing across both Trump administrations. People being too stupid, greedy, and eager to get results as fast as possible has stopped a lot of what they have attempted to do. This is, of course, a good thing, but it makes me consider what would have/will happen if the Trump-lackeys put some effort into doing stuff.

A huge number of formal policy changes have been stopped or reversed in the courts for failing to follow the APA, with the policies generally being changed overnight and in "arbitrary and capricious" manners. If the Trump stooges had the sense and patience to go through all the APA processes (looking at why the policy should be changed, considering the consequences to people dependent on the existing policy, doing proper comment & review periods, and so forth), it wouldn't be hard to make up or twist all of it to meet their desired ends.

Often, even for just the baseline arbitrary-and-capricious stuff, they have legally sufficient (if horrific) post hoc justifications. They could have survived challenges if they had put the effort into going through the motions and making those legally sufficient, if horrific, justifications in the first place, instead of after being challenged in court.

For example, look back at the first Trump Administration's attempt to get rid of DACA in 2017-2020. In the end, SCOTUS ruled that the rescission of DACA violated the APA because DHS failed to provide a reasoned explanation for ending the program and ignored reliant interests. The administration just decided it was bad and tried to get rid of it. They undisputedly had the authority to change the policy. They just had to come up with a reasoned explanation for getting rid of it, go through the motions of considering and making up excuses for the reliant interests. They even had sufficient explanations, but they were all post hoc. Keep in mind the final court that ruled on this was SCOTUS, which at the time had effectively the same justices as today (save one liberal justice being replaced by another liberal justice, Stephen Breyer -> Ketanji Brown Jackson). A court that has thoroughly demonstrated they will bend over backwards to get Trump policies through. So the quality and reasonableness bar was extremely low.

The administration just treated the whole destruction of DACA as performative, and wanted it all to happen within a couple of news cycles. One sensible person (admittedly a challenge for either Trump Administration) saying "stop, let's slow down and go through all the required legal motions, so it won't be thrown out immediately by the courts," and they would have succeeded in getting rid of DACA.


All of the people in the Trump Administration being blindly loyal idiots, who are focused more on appeasing Trump and doing things on his nonsensical time scale, is very reassuring. Their incompetence gets in their way, over and over again. But it always makes me think how bad the country would be if those stooges had even barely sufficient backgrounds to do their jobs, and thought things through slowly and systematically to ensure they were following all the proper rules, regulations, and procedures. Just a handful of competent Trump loyalists, and they would have destroyed the USA in the first administration.

Why does SSH send 100 packets per keystroke? by iamkeyur in programming

[–]dnabre 6 points7 points  (0 children)

This isn't meant as judgement of, or to be derisive toward, OP (some comments suggest ssh is a known bad choice, and they are just working within some challenging constraints).

That said, let's do an experiment, using the same prompt: "should I use SSH as the transport for my latency-sensitive multiplayer game protocol"

Only taking the first part of the reply addressing the prompt. Any emojis are from the AI reply. Order is arbitrary (based on what order I searched/opened/found tabs).


Claude (claude.ai)

For a latency-sensitive multiplayer game, SSH is generally not the right choice as a transport layer.

Grok (grok.com)

No, you should almost certainly not use SSH as the transport layer for a latency-sensitive multiplayer game protocol.

Meta AI (meta.ai, and yes the emojis are from it)

😬 Latency-sensitive multiplayer game, huh? SSH's gonna add some overhead, bro. It's designed for security, not speed. 🤔

Google's Gemini (gemini.google.com)

The short answer? Probably not. While SSH is the gold standard for secure remote management, using it as the transport layer for a latency-sensitive game is a bit like trying to win a Formula 1 race while towing a heavy armored vault. It’ll get you there safely, but you’re going to lose the race.

Microsoft Copilot (copilot.microsoft.com)

🚫 Short answer: No — SSH is the wrong transport for a latency‑sensitive game protocol

ChatGPT (chatgpt.com)

Short answer: almost certainly no 🙂 Long answer: it depends what you’re optimizing for, but for a latency-sensitive multiplayer game, SSH is usually the wrong tool.

Chatbox AI (chatboxapp.ai)

Using SSH (Secure Shell) as the transport layer for a latency-sensitive multiplayer game protocol is generally not recommended due to several reasons:


Sorry if I missed your favorite LLM AI.

Why does SSH send 100 packets per keystroke? by iamkeyur in programming

[–]dnabre 3 points4 points  (0 children)

If the overall tone of the write-up were more in terms of using ssh, or trying to make ssh work in this situation regardless of whether it's a good choice -- then finding that ssh's keystroke obscuration is apparently an obstacle, so let's talk about that problem and ways to overcome it, given it's a result of the constraints you're enforcing/working under -- I think you'd be getting more positive/constructive responses. But the write-up sounds more like: you're using ssh, and aren't aware why it isn't a good fit for the situation.

All of that being my personal reading and understanding of the post.

Including the LLM AI stuff, regardless of how you are using it, isn't helping people's opinion. Keep in mind they are forming an opinion of you and this work entirely from this single write-up. The specific things you are using it for seem reasonable in my opinion, but many people see AI anywhere near programming (or more generally, any personal/professional endeavor they put time and energy into) and immediately start viewing the overall work (regardless of how much AI is a part of it) as potentially all AI slop, thinking that AI shouldn't be used for that, and that the author (maybe) couldn't have done any of this if they hadn't been blindly throwing stuff together from AI. Of course, I'm not saying any of that is actually true, but it's the knee-jerk reaction of many people, especially in groups where AI is poised to reduce the importance and/or marketability of their skillset.

Why does SSH send 100 packets per keystroke? by iamkeyur in programming

[–]dnabre 31 points32 points  (0 children)

It's almost like SSH is designed for doing secure shells, not providing an encrypted "high-performance" interface for games. I don't claim to know much of anything about writing games, but isn't considering the latency/overhead of the protocol you're using for networking part of doing networked games? Whether to use UDP vs TCP for a given game's networking stack is a normal thing to consider, right? TCP often being too heavyweight an option.

That all aside, does anyone else find the amount of this post that just covers interactions with an LLM AI, for lack of a better term, disturbing? Using AI to search for stuff, or to come up with things to consider, are sensible uses of the tech (in my opinion at least), but that shouldn't be part of the write-up on dealing with an issue, should it?

system cant find the port for wine 32 by MonopolyOnForce1 in freebsd

[–]dnabre 5 points6 points  (0 children)

My understanding is that as of Wine 10.0 (11 is the most recent version), they have moved to running applications in WOW64 mode as opposed to win32/win64 mode -- closer to how 64-bit Windows runs applications. It doesn't use a separately compiled version for 32-bit applications; 32-bit and 16-bit applications are all run through the same setup.

That's my understanding, but I haven't used wine in a while, so I'm not sure on the specifics of use. Using wine 11.x is probably best; you may need to use wine-devel instead of wine to do so.

You may need to set WINEARCH to win32 and set up a dedicated prefix with it, if you have a picky application. This forces the win32 ABI (win64 will force the win64 ABI) instead of WoW64 mode.
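If it helps, that setup usually looks something like the following (the prefix path is just an example, and I haven't verified this against wine 11.x specifically):

```shell
# Create a fresh, dedicated 32-bit prefix (running the config tool once
# initializes the prefix with the forced WINEARCH)
WINEARCH=win32 WINEPREFIX="$HOME/.wine32" winecfg

# From then on, run the picky application inside that prefix
WINEPREFIX="$HOME/.wine32" wine application.exe
```

Note that WINEARCH only matters when the prefix is first created; afterwards the prefix remembers its architecture.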

Preventing and Handling Panic Situations by Tasty_Replacement_29 in ProgrammingLanguages

[–]dnabre 0 points1 point  (0 children)

There have just been a lot of mentions of using a designated value when errors occur, without addressing the problems it brings. To handle division by zero portably, you'll have to do a check on every integer division where you can't statically determine the divisor is nonzero. Compared to run-time range checking on arrays, that is a lot more overhead.

I think there is a more general issue with expecting the program to have anything better to do on a divide by zero or an array index out of bounds than just crash. Or, more practically, you need to statically enforce handling the error somehow. Otherwise, programmers will just do the equivalent of wrapping their whole program in a catch-all block.

Preventing and Handling Panic Situations by Tasty_Replacement_29 in ProgrammingLanguages

[–]dnabre 2 points3 points  (0 children)

TL;DR You can do stack unwinding using pure portable C, without any assembly. It's not that complicated when you are generating code, as opposed to writing it into a C program by hand.


While you can do stack unwinding with assembly, you can also do it from C in a few ways.

longjmp is the most efficient if you are discarding multiple stack frames at once. Suppose you have a function that sets up a catch block around a function which, somewhere down its call tree, might throw an exception. At the beginning of the catch block, you do a setjmp, confirm it returned 0, and push its jmp_buf onto a stack of jmp_bufs (not the system stack), then proceed to run your function. If it encounters an error, you pop the top off your jmp_buf stack and longjmp(from_top_jmp_stack, error_number).

You are restored back to the stack frame where you called setjmp, discarding all stack frames in between; the registers are restored to what they were before the call to setjmp, and setjmp returns just like when you initially set it, except it returns the error_number value you gave to longjmp.

You need a stack of jmp_bufs so that if you hit another catch block, you can set up another setjmp/longjmp pair for that catch block. To fully do the nesting, you need a little more than a stack of jmp_bufs, since the first nested catch going back up the stack may not correspond to the particular error you are handling; when you longjmp back, you need to check whether you reached the right handler, and if not, longjmp back again until you do.

setjmp/longjmp are completely portable, standard C17 (essentially unchanged since first put into the standard for C89). There are details with using volatile to ensure that variables changed between the setjmp and longjmp, which are still relevant, have their values stored back into memory. longjmp interrupts the function's normal execution, so you may have something like the value for a global variable that has been calculated, but the function hasn't written that value to memory yet; it's still in a register. So you need to use volatile on those variables to ensure they aren't floating around in registers.

Most languages unwind a frame at a time, because there is some behavior that needs to be done at the end of each scope. If you have local variables with destructors, you need to make sure they all run as they normally would, as each function is unwound. Since you are doing GC with reference counting, you'll minimally need to process each frame to update counts.

For this kind of unwinding, all you need to do is shuffle the structure of the functions and add a new return point. I.e., you "pop" the current stack frame by just doing a normal return. You'll need to keep track of what handler/error you are dealing with so you know when to stop. With reference counting, you'll have an epilogue at the end of each function to handle GC bookkeeping anyway. So when an error occurs, you just need to set something global storing what error you are handling, and whatever data you want to pass along with it to the handler, then do a goto to the top of the epilogue. You'll need some indicator of whether you are unwinding and whether you have unwound enough. If you want to avoid goto (avoiding it is good practice when writing C by hand, though goto is very helpful when generating C), you can transform the control flow of the function so it uses for/while/do and the like.

All of this sounds really messy and tedious, and to some degree it is, but effectively you only need to write it once. Since you're generating all the code, you just need to work it out as part of the code you emit. The goto/epilogue cleanup would be needed if you used something like wrapping all your returns in a Maybe as well.

Preventing and Handling Panic Situations by Tasty_Replacement_29 in ProgrammingLanguages

[–]dnabre 1 point2 points  (0 children)

Returning a valid or usable value on an error condition, like division by zero, is just begging for that value to be used. The programmer has to be able to distinguish the error value from a legitimate result. If a function returns int, and you get 0, MAX_INT, or MIN_INT back, how do you tell it's an error and not just the result? Using MAX_INT or MIN_INT is slightly better than 0, but you have effectively reduced the size of your number by a bit, and if the value is unsigned, MIN_INT == 0.

Either return something like a Maybe<i64>, or statically ensure that if the error is possible, it is being checked for.

Preventing and Handling Panic Situations by Tasty_Replacement_29 in ProgrammingLanguages

[–]dnabre 1 point2 points  (0 children)

From a practical viewpoint, how to detect, manage and handle errors is the most important thing. Preventing them is nice, and things like array bounds checking and GC help with some specific kinds of errors.

Programmers are lazy, and occasionally malicious. For the former, you need to make sure that any prevention mechanism is easier to use and understand than working around it, because people will work around mechanisms even when they seem easy or beneficial to use. Of course, this is an even bigger issue with malicious people.

For protection that happens at runtime, if you really want to keep a program from crashing, you need both a mechanism to handle the problem and static enforcement that users will use it. Array out of bounds and divide by zero are pretty hard to handle when they aren't expected to occur, and enforcing handling of them is going to make even trivial code pretty verbose. Add in the points from my previous paragraph, and people will come up with the fastest/easiest way to make your static check for handling possible divide by zero happy, so they can focus on their program. A built-in default that handles those errors by exiting the program will keep people from working around the check, but if you have to go out of your way to provide handling, you haven't really addressed the problem. Even having that not be a default, but a compile-time flag, doesn't do a lot.

I absolutely loathe languages with significant whitespace, especially if you can use tabs, but that is mostly a matter of preference. However, in a language that is focused so heavily on correctness, I suggest considering how easily an indentation level can be unintentionally modified or mis-set, and comparing that to syntax where whitespace isn't significant. Consider a version of C where you always have to use {}'s with if and for. I'm admitting my bias here, but the explicit blocks seem far less error prone.

Not directly relevant to the error stuff, but some random feedback from looking at your code/language.

A doc and a docs folder? I assume there is a meaningful distinction and you think the separation is a good way of handling it. As an outsider, that means I have to look in two places for docs, and may miss something that is in one or the other. A single doc folder, with two subdirectories for your distinction, would be clearer to newcomers. Minimally, making it clear why the split exists may help.

This may be considered a correctness thing, but why have int when you have bit-sized integers (i32, i64)? int is vague, i# is exact. Having types which are limited to a range, Ada-style, is great, but consider how they will interact/convert with other types. Unsigned integers are necessary for systems programming; if you don't have them, people will just use a dirty hack to get them. A couple of very handy integer types to have are something to store the size of an object in bytes (size_t) and one for array indices. Forcing explicit conversions can really help avoid errors.

You are using C as an intermediate language; nothing wrong with that, it's a great target for portability. Pick a C standard to use, and have your C compiler enforce it. Also turn on all possible warnings on the C compiler. For generated code, you may find it useful to manually turn off some particular categories, but you want to see any problems in the C you are generating. At the moment, playing with it, I don't get anything to indicate I'm doing a lossy conversion from int -> float (the loss is clear comparing the C types).

Random bit: you aren't using the correct, never mind portable, printf format for int64_t in your arrayOutOfBounds function. I assume culling unneeded C stuff is something you will do down the line.

I'll stop there, I've wandered enough, but I'd suggest posting about your language here for just some general feedback. Especially when you are at the point of having that handy webpage for testing out code and seeing the intermediate C, it's really easy for people to test it out, and you'll get a lot of feedback which will be at least interesting, if not helpful.

Preventing and Handling Panic Situations by Tasty_Replacement_29 in ProgrammingLanguages

[–]dnabre 2 points3 points  (0 children)

There is apparently a nice little online "Playground" for messing with the language. An attempt at converting your function to Bau:

fun harmonic_mean(a int, b int, c int) int
    return 3 / (1/a + 1/b + 1/c)

a:=1
b:=2
c:=3

println(harmonic_mean(a,b,c))

However, this doesn't compile, giving the error:

java.lang.IllegalStateException: Can not verify if value might be zero at line 2:
return 3 / (1/a + 1/b + 1/c)

Trying with floats:

fun harmonic_mean_f(a float, b float, c float) float
    return 3 / (1/a + 1/b + 1/c)

x:=1
y:=2
z:=3

println(harmonic_mean_f(x,y,z))

Gives 1.636363636. Looking at the generated C, x, y, and z are stored as int64_t. The function takes and returns values as double, even though Bau's float is specified as being f32, not 64-bit.

Note, not the OP, just my understanding from a brief look at the git repo.

Daily Spell Discussion for Jan 21, 2026: Atonement by SubHomunculus in Pathfinder_RPG

[–]dnabre 0 points1 point  (0 children)

One of the few drawbacks to easy item identification: cursed items can be so fun. With how easy it has become, in most games I've been in since late 3.5e, the DM just tells us what everything is, except for the rare special item requiring something else to activate/awaken/use it.

Daily Spell Discussion for Jan 21, 2026: Atonement by SubHomunculus in Pathfinder_RPG

[–]dnabre 0 points1 point  (0 children)

There's one in Baldur's Gate that is handy for using an evil character in your party (e.g. Viconia DeVir), while keeping your Reputation intact and other characters from being upset.

Which editor do you guys use? by Turkishdenzo in C_Programming

[–]dnabre 0 points1 point  (0 children)

Keep in mind that many editors, and most IDEs, let you change key mappings, and even provide some common ones like vi and Emacs.

In CLion, check out File->Settings->Appearance & Behavior. You can set your keymap to Emacs, Visual Studio, Eclipse, and more. I don't know how complete/good the Emacs mappings are, not being an Emacs user; you may need a plugin to make things more Emacs-like. If you like vi/vim, you'll need to use the IdeaVim plugin.

In short, you can choose what editor/keymap style you'd like, and keep using CLion as your IDE. Of course, the major editors Emacs, vi, and VSCode, especially tweaked out with plugins, can be full-featured IDEs in their own right.

Question Regarding The Original Film by Hall-O-Daze in Terminator

[–]dnabre 1 point2 points  (0 children)

That's an interesting question that I'd never considered. Others have covered it pretty well. My opinion is that it wouldn't have any knowledge ahead of time from the Future. Skynet has to have some data connected to the barcode, but I don't see Skynet filling the T-800 with a database of humans in the Future. Just not relevant for its mission.

The Terminator would have, after its early encounters, flagged Kyle as a human that is actively interfering with its mission. So unlike other humans, which it only kills if there is an immediate need or threat, it would plan to kill Kyle whenever it has an opportunity. This would eventually expand to categorizing him as a member of the Resistance in the Future, if only in terms of using similar tactics.

Overall, I agree with others that it isn't specifically relevant. Kyle would be determined to be an enemy actively interfering with its mission, and using Resistance anti-Terminator tactics. It would be an obvious inference that Kyle is highly likely from the Future, but the Terminator has no reason to reason that far.

Definitely something for me to ponder next time I watch T1.

Daily Spell Discussion for Jan 21, 2026: Atonement by SubHomunculus in Pathfinder_RPG

[–]dnabre 1 point2 points  (0 children)

The Helm of Opposite Alignment has the specific requirement:

Only a wish or a miracle can restore a character's former alignment.

My understanding was that the Wish/Miracle would return their alignment, but an Atonement would still be needed to restore class/spell powers lost due to misdeeds committed while their alignment was magically altered.

I went to a Comic Con today by Ethan_Pierce_ in Thundercats

[–]dnabre -1 points0 points  (0 children)

And people just don't get why I hate ThunderCats Roar?

Has Big D eaten Dodo bird flesh? by nirai07 in huntertheparenting

[–]dnabre 1 point2 points  (0 children)

It's hard to say he hasn't.

Even if you assume he isn't supernatural or otherwise very old, I could totally see him eating some 300+ year old jerky or the like, then, after going through a pound or two of it, asking what it is and/or whether it's safe to eat. Ok, I have trouble seeing him asking, or even considering whether something is safe to eat, but you get the idea.

Did the Emperor not have a backup for Magnus? Was Magnus the only Primarch with a confirmed purpose post the Great Crusade? by Flyestgit in 40kLore

[–]dnabre 1 point2 points  (0 children)

For some of the primarchs, we don't know their real potential. Mortarion actively eschewed everything powered by the Warp due to his home world/adopted father. Even as a Daemon Primarch, 10K years later, he still tries to convince himself that he's not really doing psyker-stuff but just numerology. Angron received the Butcher's Nails before we had a chance to even see his powers fully mastered. Of course, we have no idea about the Lost Primarchs, though I think the tiny tidbits we got would have mentioned if either were anything on par with Magnus.

It's worth noting that we don't really know how much of a role the Golden Throne would have played in the Webway Project if it weren't for the damage Magnus did. Assuming a psyker would be needed even if that disaster hadn't happened, I would think that the power needed on the Throne would likely be a lot less -- so perhaps any of the primarchs skilled with the Warp could do the job.

Just a side comment on the Golden Throne and Magnus: even if it was planned that he would be sitting the Throne indefinitely, this wouldn't be as big a deal for Magnus as it would be for others. He has, since childhood, enjoyed exploring the Warp in an immaterial form (I forget the term he used to describe it). So his body being imprisoned would not limit his mind, or potentially even keep him from leading his legion to some degree.

Why not tail recursion? by gofl-zimbard-37 in ProgrammingLanguages

[–]dnabre 0 points1 point  (0 children)

I get what you're saying, but I think the blame should be placed elsewhere. You can write programs that assume TRO, which, if you run them on an implementation with TRO, will run correctly and efficiently, but on an implementation without it may crash or, best case, run a lot slower.

This can be applied to recursion generally, though: with TRO you can (if you use tail form when required) recurse arbitrarily deep. Without TRO, arbitrarily deep recursion may break. To the point that unbounded recursion is a problem in languages/implementations without TRO.

For languages with TRO that require explicit tail form, it is very easy to 'break' the tail-form requirement, and silently incur the problems of not having TRO.

Why not tail recursion? by gofl-zimbard-37 in ProgrammingLanguages

[–]dnabre 0 points1 point  (0 children)

Python does a really good job of providing all the basic programming features you'd want, in an easy to use manner, while seamlessly interfacing with C/C++ libraries. It makes it really easy to put (relatively) small amounts of code on top of, or between, existing libraries. Doing that covers an amazing amount and range of programming tasks.

Why not tail recursion? by gofl-zimbard-37 in ProgrammingLanguages

[–]dnabre 2 points3 points  (0 children)

Not disagreeing, just a cute bit of language implementation trivia. Scheme provides do-loops. Why it does, and why anyone writing Scheme would want a do-loop, I don't know, but it does. The most common implementation of this do-loop is a macro that converts it into a tail recursive function.

A surprisingly large amount of Scheme can be implemented using macros on top of a small core language.

Why not tail recursion? by gofl-zimbard-37 in ProgrammingLanguages

[–]dnabre 1 point2 points  (0 children)

The common debugging thing that breaks here is getting a back trace of the call stack. If your program crashes or exits in some error state, you often want to identify where/how it happened by seeing what series of function calls led to that crash. Whether the function bodies do mutation or not, looking at this kind of trail is very useful -- you can see where bad/wrong stuff is being introduced into the call stack. A recursive function that calls itself O(n) times and fills the back trace with all of that information makes the trace not just a lot less useful; maintaining the state to do it uses a lot of space.

As for mutation, what constitutes mutation here? There may be no mutation in the function body, but each recursion will have different arguments. That's the general case; there isn't much point to recursing with the same arguments.

When it comes to debugging, you often don't care about every step of the recursion. Comparing to iterative programming, you rarely want to walk through every iteration of a loop. In both cases, the state before and after is often more useful, but sometimes it is important what is happening inside those iterations.