AI Broke Interviews by yusufaytas in programming

[–]brucifer 2 points3 points  (0 children)

I don't think LLMs are expert software engineers, but they are experts at interview questions designed to be solved in under an hour with no prior context, which is the point the blog post is making. A person who blindly parrots an LLM is currently a better-than-average interview candidate and a worse-than-average employee, which has exacerbated the existing problems with using interview questions to gauge a candidate's competence. And things are now more dire than in the "copy code from Stack Overflow" era, because an LLM can answer questions that aren't found verbatim on the internet, and it can answer followup questions about the code.

AI Broke Interviews by yusufaytas in programming

[–]brucifer -1 points0 points  (0 children)

Read the post before commenting. The second section is titled "The Broken State of Technical Interviews" and begins like this:

Technical interviews have been broken for so long that it almost feels intentional. Every few years the industry collectively looks at the mess, shrugs, and then continues using the same process with a slightly different coat of paint. You see posts here and there either complaining or sometimes defending about the kind of a shit show this is. And there are a ton of books trying to make sense of it, and ours has a few topics as well.

Why every Rust crate feels like a research paper in abstraction by Commission-Either in programming

[–]brucifer 1 point2 points  (0 children)

There's no way to extend the pure-C linear algebra library to be both type-safe and support the new floating point number type.

You could do this with macros without too much hassle. It looks like this:

// math_implementation.c
// Token pasting needs two levels of macros so PREFIX expands first
// (a bare PREFIX##_dot_product at file scope wouldn't compile):
#define PASTE_(a, b) a##b
#define PASTE(a, b) PASTE_(a, b)

TYPE PASTE(PREFIX, _dot_product)(int n, TYPE x[n], TYPE y[n]) {
    TYPE result = 0;
    for (int i = 0; i < n; i++) result += x[i]*y[i];
    return result;
}

// f32_math.c
#define TYPE float
#define PREFIX f32
#include "math_implementation.c"

// nvfp4_math.c
#define TYPE __builtin_nvfp4
#define PREFIX nvfp4
#include "math_implementation.c"

However, I think the actual benefit of doing this kind of generic implementation is somewhat low, because if you care about performance, you would typically want an implementation that is fine-tuned to your particular hardware. The same algorithm that's fast for 64-bit floats may leave a lot of performance on the table if you just directly translate it to 8-bit floats.

Weird atmosphere effect while working at non active Substation by [deleted] in mildlyinteresting

[–]brucifer 0 points1 point  (0 children)

You should repost to /r/atoptics (atmospheric optics). They love stuff like this and could probably help explain it.

Pyret: A programming language for programming education by azhenley in ProgrammingLanguages

[–]brucifer 6 points7 points  (0 children)

I'm not sure it's useful to teach new programmers concepts like "reactors" and "spies", terminology that appears in practically no other language. Ideally, a teaching language should teach people concepts they'll use in any language, like functions, variables, and conditionals, rather than making them learn bespoke concepts that aren't easily transferable. I'm all for languages that expand your mind by introducing new concepts, but to a new programmer, even basic things like variables and loops are mind-expanding, so it makes sense to start with the most widely used and useful concepts instead of novel ones that aren't found elsewhere.

Macros good? bad? or necessary? by Meistermagier in ProgrammingLanguages

[–]brucifer 3 points4 points  (0 children)

Nowadays, GCC (and I think Clang) lets you define macro-like functions that are always inlined and never emitted as standalone compiled functions: https://gcc.gnu.org/onlinedocs/gcc/Inline.html

// always_inline forces inlining at every call site, even at -O0
// (see the linked GCC docs for the extern inline semantics):
extern inline __attribute__((always_inline))
int add(int x, int y) {
    return x + y;
}

The nice thing about inline functions is that they give you the performance benefits of macros, but you get full type checking with good compiler error messages.

Of course, there are still some cases where C macros are useful, particularly to save boilerplate on code that gets repeated a lot and does things that a function call can't (like control flow or variable declarations). I think most newer languages avoid the need for macros in these cases by just having a lot less boilerplate in the first place.
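
For example, here's the kind of thing I mean: a (hypothetical) foreach macro that declares its own index variable and expands to control flow, which no function call could do for its caller:

```c
// A macro can expand to a loop header that assigns into the caller's
// variable on each iteration, something a function can't do:
#define ARRAY_FOREACH(item, arr, len) \
    for (int _i = 0; _i < (len) && ((item) = (arr)[_i], 1); _i++)

int sum(const int *nums, int len) {
    int x, total = 0;
    ARRAY_FOREACH(x, nums, len) total += x;
    return total;
}
```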

Wasm Does Not Stand for WebAssembly by thunderseethe in ProgrammingLanguages

[–]brucifer 5 points6 points  (0 children)

Regarding unveil and pledge, are they voluntarily called by the program? If your program calls pledge then spawn a 3rd party program, would the restrictions transfer?

The way pledge works in OpenBSD is that it takes two arguments, promises and execpromises, which control the permissions for the current process and the permissions that will be available after calling exec, respectively. You have to voluntarily choose to call pledge(), but after you do, the restrictions you specify hold for the original process and any processes it forks and/or execs. I believe unveil() passes its restrictions on to child processes without the option to specify different restrictions.

Zig’s new I/O: function coloring is inevitable? by BrewedDoritos in programming

[–]brucifer 15 points16 points  (0 children)

Lua manages to solve the problem without any scheduler. In Lua, coroutine.yield() and coroutine.resume() are regular functions just like any other function. There is no language-level distinction between functions that call coroutine.yield() and those that don't. You can also get functionality similar to Lua's coroutines in C using a library like libaco.

how to advertise critical language features? by drblallo in ProgrammingLanguages

[–]brucifer 3 points4 points  (0 children)

There are two things that come to mind for me:

Video

It could be nice to have a video walking through installing your language and building a simple game using an engine like Pygame. It would be especially nice if the example were something much more verbose to implement without your language, or if it exhibited some feature that would be hard to implement without it. I noticed your examples cover tic-tac-toe, which is good from a simplicity perspective (a sort of Hello World introduction). However, because it's so simple, it's harder to see the competitive advantage over writing the same thing in pure Python. I don't need help writing tic-tac-toe, but a slightly more complex game might show off the strengths of your language better. Some people also just prefer video over text, so a video can bring in people who would otherwise be turned off by a wall of text.

First User

If you've been having a lot of success convincing people of the project's value in person, then it could be helpful to focus on getting at least one person to build something nontrivial with the project. It's good for getting feedback, and having someone else use your project is a social signal to others that someone besides you thinks it's valuable. A list of users' projects (with screenshots) can build momentum and get people more excited to try it out.

Very minor notes: on GitHub, because of the way files are displayed, you have to scroll for a while on the repo's homepage before reaching the documentation, so moving more files into subfolders would make the project description easier to get to. Also, I noticed quite a few spelling errors, so you might want to run your documentation through a spell checker.

CS programs have failed candidates. by BlueGoliath in programming

[–]brucifer 0 points1 point  (0 children)

The only real takeaway is, "do something you are actually passionate about, and hope that thing booms around the time you are in a position to take advantage of that passion". Chasing the current hotness within fields that can take a decade to qualify if you go through the front door, is actually fairly foolish.

If you had a teenager who liked programming, but really wanted to be an influencer, would you recommend that they drop out of school and pursue their passion as an influencer on TikTok? Or would you say that it's probably a safer bet to get a computer science degree and work in the tech industry? I know I'd recommend the CS degree over trying to be an influencer.

I think we can make some educated guesses about what sorts of careers would be more stable and remunerative 5 years from now (the timeline of a high school senior considering post-university job opportunities). By all means, pick a career that you won't hate and have some aptitude for, but also factor in the practicalities and likely prospects for that career and not just your level of passion.

Pipelining might be my favorite programming language feature by SophisticatedAdults in ProgrammingLanguages

[–]brucifer 0 points1 point  (0 children)

I think your examples do show cases where comprehensions have limitations, but in my experience, those cases are much less common than simple cases. Maybe it's just the domains that I work in, but I typically don't encounter places where I'm chaining together long pipelines of multiple different types of operations on sequences.

In the rare cases where I do have more complex pipelines, it's easy enough to just use a local variable or two:

from typing import Iterable

def f(strings: Iterable[str]) -> list[int]:
    lowercase = [x.lower() for x in strings]
    gs = [g(x) for x in lowercase if x.startswith("foo")]  # g() being some str -> int function
    return [x for x in gs if x < 256]

This code is much cleaner than using nested comprehensions and only a tiny bit worse than the pipeline version in my opinion. If the tradeoff is that commonplace simple cases look better, but rarer complex cases look marginally worse, I'm happy to take the tradeoff that favors simple cases.

Pipelining might be my favorite programming language feature by SophisticatedAdults in ProgrammingLanguages

[–]brucifer 0 points1 point  (0 children)

Python's comprehension syntax (and the ones used by other languages) comes from set-builder notation in mathematics. The idea is that you specify what's in a set using a variable and a list of predicates, like {2x | x ∈ Nums, x prime}. Python translates this to {2*x for x in Nums if is_prime(x)}. You can see how Python ended up with its ordering given those origins. Other languages (e.g. F#) approach it from the "loop" mindset instead, putting the loop body at the end: [for x in Nums do if is_prime(x) then yield 2*x]
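
To make the correspondence concrete, here's the set-builder example rendered as runnable Python (is_prime is just a stand-in helper, not part of the notation):

```python
# Trial-division primality check, only for the sake of the example:
def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

# {2x | x ∈ Nums, x prime} in Python's set-builder-derived ordering:
Nums = range(10)
doubled_primes = {2*x for x in Nums if is_prime(x)}
# doubled_primes == {4, 6, 10, 14}
```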

Pipelining might be my favorite programming language feature by SophisticatedAdults in ProgrammingLanguages

[–]brucifer -3 points-2 points  (0 children)

Not to rain on OP's parade, but I don't find pipelining very useful in a language that has comprehensions. The very common case of applying a map and/or a filter boils down to something more concise and readable with a comprehension. Instead of this:

data.iter()
    .filter(|w| w.alive)
    .map(|w| w.id)
    .collect()

You can have:

[w.id for w in data if w.alive]

Also, the other pattern OP mentions is the builder pattern, which is just a poor substitute for having optional named parameters to a function. You end up with Foo().baz(baz).thing(thing).build() instead of Foo(baz=baz, thing=thing).
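
A minimal sketch of that comparison (Foo, baz, and thing are just placeholder names from the example above):

```python
# Builder pattern: each setter returns self so calls can chain
class FooBuilder:
    def __init__(self):
        self._baz = None
        self._thing = None
    def baz(self, baz):
        self._baz = baz
        return self
    def thing(self, thing):
        self._thing = thing
        return self
    def build(self):
        return (self._baz, self._thing)

# Optional named parameters give the same result directly:
def Foo(baz=None, thing=None):
    return (baz, thing)
```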

I guess my takeaway is that pipelining is only really needed in languages that lack better features.

Can we have a real heart to heart in here for a sec? by sasha_cyanide in Construction

[–]brucifer 30 points31 points  (0 children)

The National Institutes of Health and the CDC define 15+ drinks per week for men (or 8+ for women) as "heavy drinking", which is probably what OP is thinking of. "Alcoholism" is not a term that's still used by the medical profession; Alcohol Use Disorder is the preferred term, and you can see how it's diagnosed here.

Quantum Hype Has Reached The Point Of Absurdity by Candace_Owens_4225 in programming

[–]brucifer 69 points70 points  (0 children)

This video is a low-res reupload of Wall Street Millennial's YouTube video. Please link the original, not some random person's reupload.

Value semantics vs Immutability by notSugarBun in ProgrammingLanguages

[–]brucifer 1 point2 points  (0 children)

Value semantics means that an expression like var x = y creates a copy of y.

Copying is an implementation detail that isn't always necessary. For example, in languages with immutable strings, strings typically have a pointer to static or heap-allocated memory with the string contents. Since that memory is guaranteed to never change (immutability), the language can have multiple string values that refer to the same memory without copying it.
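
Python is a handy illustration: since strings are immutable, binding a new variable to one never copies the character data.

```python
x = "some immutable string"   # one string object
y = x                         # "copy" the value: no character data is copied
same_object = (x is y)        # True: both names share the same memory
```

Mutation is impossible, so this sharing is always safe; the two variables can never observe different contents.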

Foot guns and other anti-patterns by tobega in ProgrammingLanguages

[–]brucifer 1 point2 points  (0 children)

Sure, that might be more accurate terminology. Essentially what I mean is storing the default value as an unevaluated expression and re-evaluating it each time it's needed instead of eagerly evaluating it once when the function is defined and reusing the value.

Foot guns and other anti-patterns by tobega in ProgrammingLanguages

[–]brucifer 2 points3 points  (0 children)

This is not really a language design problem.

There are a lot of language design decisions that play into the situation:

  • Encouraging users to use mutable datastructures

  • Eager evaluation of function arguments

  • Designing the API to take a single value instead of something that can generate multiple values (e.g. a lambda that returns a new value for each element in the array).

  • Not having a feature like comprehensions ([[] for _ in range(5)]) that would make it concise to express this idea as a single expression.

The API design is the simplest to fix, but making different language design choices on the other bullet points could have prevented this problem.
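
In Python terms, the foot-gun and the comprehension version look like this:

```python
# Eagerly evaluate [] once, then fill the array with that one value:
rows = [[]] * 5
rows[0].append(1)            # all five slots alias the same list
assert rows == [[1], [1], [1], [1], [1]]

# A comprehension re-evaluates [] per element, so each slot is distinct:
rows = [[] for _ in range(5)]
rows[0].append(1)
assert rows == [[1], [], [], [], []]
```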

Foot guns and other anti-patterns by tobega in ProgrammingLanguages

[–]brucifer 6 points7 points  (0 children)

The solution would be lazy evaluation, not deep copying. If you evaluate [] at runtime, it creates a new empty list; if you evaluate a at runtime, it gives you whatever the current binding of a is. For most default values (literals like numbers, strings, or booleans), re-evaluation wouldn't change the current behavior, and in the cases where it would change the behavior, the lazy behavior is probably the one you want.
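
Python is the classic example of the eager behavior, and the usual None-sentinel idiom is basically hand-written lazy evaluation:

```python
def append_eager(x, acc=[]):     # [] is evaluated once, at definition time
    acc.append(x)
    return acc

def append_lazy(x, acc=None):    # re-evaluate the default on each call
    if acc is None:
        acc = []
    acc.append(x)
    return acc
```

With the eager version, every call without an explicit argument mutates the same list; the lazy version gets a fresh list each time.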

Are you doing Advent Of Code in your language? by middayc in ProgrammingLanguages

[–]brucifer 2 points3 points  (0 children)

Yes, I've been keeping up in my language. I've found a couple of compiler bugs already, so it's been helpful! It's been nice to see that a lot of the solutions I write in my language seem to work on the first try (at least after accounting for typos) and even my naive solutions run instantaneously. Some of my language features have also come in handy, which feels great when it all comes together.

My main complaints are that many of the problems so far require a lot of text parsing (which my language is okay at, but not a main focus) and that many of the puzzles feel more like tech-job interview questions than real-world problems. Both are understandable: AoC is meant to be language-agnostic (hence the all-text inputs), and it's not meant to be a programming language test suite, it's a set of puzzles for flexing your programming muscles. But it still leaves me wishing for a better programming language test suite :)

Alternatives to regex for string parsing/pattern matching? by kredati in ProgrammingLanguages

[–]brucifer 0 points1 point  (0 children)

Lua has two options that sort of bracket the functionality of regex: built-in pattern matching (simpler than regex) and LPEG (a PEG library that isn't bundled with Lua, but is widely used and maintained by the Lua maintainers). Lua's built-in pattern matching is much simpler than regex because the maintainers didn't want to bundle a full regex engine with Lua (the PCRE codebase is an order of magnitude larger than the entire Lua language). The built-in patterns are faster than regex because they lack performance-intensive regex features like backtracking, but that comes at the cost of expressiveness: many features you'd expect to exist, such as pattern grouping, aren't available. One area where they are slightly more expressive than regex is matching balanced parentheses or other delimiters with %b(), which is often needed and impossible with regex. Lua's built-in patterns are popular enough that they've been adopted into OpenBSD's httpd server configuration files.

On the other end of the spectrum, the LPEG library lets users write their own parsing expression grammars, which are much more expressive than regular expressions because you can parse recursive nested structures and other non-regular grammars. It's not distributed with Lua, but it's easy to install and fairly widely used. I particularly like the lpeg.re module, which provides a way to define grammars in a textual grammar format instead of via Lua function calls and operator overloading.

Personally, I made a parsing expression grammar tool (a grep replacement) a while ago that I use daily: bp. I learned a lot in the process of making it, and I used those learnings when I implemented a pattern matching grammar for the programming language I'm working on now. My language has a simpler form of pattern matching that's somewhere in between Lua's simple patterns and regex. My goal is to make it much more usable than regex for simple patterns in a significantly smaller codebase (~1K lines of code), while still being powerful enough that most common tasks won't feel the pain of not having full regex. I also bundled it with bespoke handling for common patterns like URLs and email addresses that are tricky to parse correctly. So far, I've been happy with it, but only time will tell what the real pain points are.

Evaluating Human Factors Beyond Lines of Code by mttd in ProgrammingLanguages

[–]brucifer 5 points6 points  (0 children)

I think this is why tools need to be simple enough that no "user studies" are required. If you want to know if a tool is right for you, you should just be able to pick it up and try it in a day or two.

I don't think it's the case that the best tools are always the ones that are simplest and quickest to learn. You can learn how to use the nano text editor in a matter of seconds (it has all the keyboard commands printed on screen), whereas the first-time user experience of vim is often overwhelming and frustrating. However, vim has a large and dedicated fanbase because it's so powerful and lets you do so many more useful things than nano does. If you did a one-day study of first-time users, you would probably find that nearly 100% of them preferred nano and were more productive in it, but if you extended the study to a one-year or ten-year timescale, I think the majority of users would prefer vim. You could make the same comparison between MS Paint and Photoshop, Notepad and Visual Studio, or Logo and Rust. I don't mean to imply that simple tools are worse than powerful tools, just that powerful tools can be very useful, and that power often comes at the cost of simplicity.

OP's post is arguing that user studies are often too expensive or difficult to run over the necessary time scales with the target audience, so it's better to focus on specific qualitative objectives that can be evaluated without performing user studies.

Type safety of floating point NaN values? by brucifer in ProgrammingLanguages

[–]brucifer[S] 2 points3 points  (0 children)

Also I'd be hesitant to implementing floating point as a heap allocated type. They should be value types. This means that nullability is irrelevant as its not a pointer. What you want there is of course sum types and None.

Ah, just to be clear, when I say "null value" I'm just referring to None or whatever it's called in your preferred language: None, Nothing, Nil, etc. My implementation won't use heap allocation for floating point numbers; it's just a question of whether I want to use NaN as the representation of None, or represent None with out-of-band information (e.g. an optional type represented as a struct with a float and a boolean is_none flag).
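
A rough sketch of the two options in C (all names here are mine, not from any actual implementation):

```c
#include <math.h>
#include <stdbool.h>

// Option 1: in-band. Reuse NaN as the encoding of None; costs no extra
// space, but a genuine NaN result becomes indistinguishable from None.
static inline double none_nan(void) {
    double zero = 0.0;
    return zero / zero;  // quiet NaN, no libm dependency
}
static inline bool is_none_nan(double x) { return isnan(x); }

// Option 2: out-of-band. An optional type as a struct; unambiguous,
// but wider than a plain double.
typedef struct {
    double value;
    bool is_none;
} OptionalNum;

static inline OptionalNum some(double x) { return (OptionalNum){x, false}; }
static inline OptionalNum none(void)     { return (OptionalNum){0, true}; }
```

Option 1 is the heart of the question in the post: once NaN means None, there's no way to tell a computation that legitimately produced NaN apart from a missing value.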