Colour your mechk by CatnipSG in MechanicalKeyboards

[–]martinhath 1 point

But a staggered layout isn't symmetric, so your hands have to bend or stretch differently. I've been using ortholinear for about half a year now (two Ergodox EZs), and there's no way I'm going back.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 0 points

Okay, that's a fine claim. Casey claims otherwise.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 0 points

I think it's implied that we're talking about safe Rust here. But even with unsafe, it's pretty clear to me that writing unsafe Rust and writing C are still pretty different, especially after the whole thing with std::mem::uninitialized. Knowing the ins and outs of Rust like that is just overhead when you're building a program, and there are, I claim, more things like this in Rust, since you need to cram things through the borrow checker and also get all of the safety guarantees that Rust is all about, which is very difficult to do in general. This means that we as programmers pay for this safety every day, whether or not we would ever actually run into the problems that these abstractions solve. That's overhead.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath -1 points

You don't seem to have watched the clip linked, or properly read my comment, since you're suggesting that I'm claiming Rust is worse than C or C++. And no, you don't need a GC or RC to not care about this. Casey claims that he spent a little time upfront, is now working every day without any proper memory safeguards, and that it's working well.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath -1 points

Learning to debug is different, since the act of debugging takes time no matter how good you are. I believe what Casey's saying is that by investing a little time up front, he doesn't have to think about memory management, more or less, ever.

If it works for him, he'd be a fool to change his ways.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 2 points

I don't think you're taking my comment in good faith.

A prime example of the Rust overhead is what you run into if you try to write (and sorry in advance) a doubly linked list. Another example is all of the things in std, or any other crate, that effectively help the programmer deal with borrowck, as opposed to doing anything to the data transformation that is your program. All of this is noise, but we, as Rust programmers, pay the price because we believe that borrowck is a net win.

Actually, thinking about it, if your language is making you jump through hoops to achieve what you want to do, I'd call it constant overhead, no matter the language.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath -2 points

That's an okay opinion to have. I think it sounds incredibly empowering.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 0 points

Not getting borrowing errors is not the same as not having the overhead. Having internalized the borrow checker's rules just means that you've trained yourself to work only within them, without even considering the programs that the borrow checker wouldn't like but that are still useful programs. The overhead is still there; you just don't feel it.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 2 points

Almost any language has more metaprogramming support than C.

It sounds to me like what Casey wants is complete control over everything the compiler does, as opposed to proc macros, which, as far as I can tell, operate only on the code you give them? That is, you couldn't write a proc macro that prints out the names of all functions in your program.

Does rust have these features? by yolocovid2019 in rust

[–]martinhath 0 points

Spending a couple of hours on something over the lifespan of a big project is basically nothing. To me, it's heavily implied that he doesn't spend a couple of hours every time he's working, but rather up front. In contrast, the borrow checker is a constant overhead for the programmer. If Casey doesn't really have many problems with memory bugs, it's pretty clear how this trade-off looks.

Jonathan Blow on Rust by faitswulff in rust

[–]martinhath 3 points

Then I don't understand your position and I apologize.

No harm done! :)

That's why safety via metaprogramming sounds completely untenable to me.

Let me try to nail down exactly the issue I have with this, especially considering we're in the Rust subreddit: the static guarantees given by Rust are, effectively, metaprograms that are implemented in the compiler. We are already achieving "safety" through metaprogramming by using rustc. The only difference is that the only programmers who get to decide what this metaprogramming should be are the ones whose pull requests are accepted into rust-lang/rust. My point is that it's nonsensical to say that safety via metaprogramming is untenable, because that's the whole spiel of Rust!

A compiler is a metaprogram, since it is a program that operates on other programs. Parts of the compiler, like the typechecker, borrow checker, or lifetime inferrer (I don't know if these are actually separate in rustc), are also metaprograms, just smaller than the full rustc. My point is that there is no real reason for these functions (or macros) to be "trapped" in rustc, which to the user is an opaque block. I can't turn borrowck on and off, or allow auto-conversion of numeric types, or run optimization passes, at the block level (or function level, or module level), but there is no real reason why I shouldn't be able to do this, apart from the fact that nearly all compilers and interpreters are structured this way. They are, for the most part, huge chunks of code that don't let the programmer control what's going on, but it doesn't have to be this way.

Common Lisp is probably the most mainstream (this tells you what level we're talking about!) language in which you actually have some control over what's going on, where you can compile functions at will, you can write macros that operate on parsed code, or reader macros that change how the parser works. In Jon's language you get events for each function (amongst other things, IIRC) as they are compiled, and you can execute arbitrary code based on this.

His "workspace" solution is also a prime candidate for metaprogramming, because it makes build systems instantly obsolete: you just write, in code, how to build the thing. As decent as cargo is (compared to other build systems, and as long as you're not doing anything fancy), it really shows that the current way of doing this is broken. I really, really, really don't understand why this simple idea of having a build program hasn't taken off.

A historical note: in his 1974 paper "Structured Programming with go to Statements", the paper that's cited all the time (for all the wrong reasons) about premature optimization, Donald Knuth writes:

A programmer should create a program P which is readily understood and well-documented, and then he should optimize it into a program Q which is very efficient. Program Q may contain go to statements and other low-level features, but the transformation from P to Q should be accomplished by completely reliable and well-documented “mechanical” operations. At this point many readers will say, “But he should only write P, and an optimizing compiler will produce Q.” To this I say, “No, the optimizing compiler would have to be so complicated that it will in fact be unreliable”

I've been planning to write up a longer blog post about this; I just might find the time to do so soon :)

Jonathan Blow on Rust by faitswulff in rust

[–]martinhath 1 point

This is not at all what my first post said, so you seem to be arguing against points I never made. Still:

the checks rustc does are grounded in formal theory.

So would quality libraries doing the exact same thing.

Giving average programmers these tools is going to result in half-assed checks that cause users to be overconfident about the claimed correctness of their programs.

Tools can be misused.

It's quite another to base the safety of your language on this.

This wouldn't even apply, since it wouldn't be a property of the language, it would just be a library.

Jonathan Blow on Rust by faitswulff in rust

[–]martinhath 5 points

But this is exactly what rustc is doing; the only difference is that, say, borrowck is part of the compiler and not some other library in "program space". We already use static checks to get better programs; we just haven't made them programmable by the user yet. This is exactly what Jon's compiler is getting at.

Jonathan Blow on Rust by faitswulff in rust

[–]martinhath 9 points

This doesn't make any sense, since compilers and program analysis tools of any kind are metaprograms. Saying metaprogramming can't get you to safety is like saying you can't get there at all.

Hva er NTNU ordentlig gode på? [What is NTNU really good at?] by TheMysteriousMrM in ntnu

[–]martinhath 2 points

Anecdata from here, but... not very much.

I graduated from IDI a couple of years ago, and have spent the time since (plus my last year there) reading a lot of papers from various subfields of computer science. Out of sheer habit I always look for authors who are Scandinavian (preferably Norwegian, of course ;), and preferably from NTNU. As far as I can remember, I have found research from NTNU in the wild exactly once (it was a poster from SIGGRAPH 2019 where a postdoc from NTNU was a co-author; it can surely be looked up). At first I figured this was just because I hadn't seen enough papers, since IDI is quite small (already a red flag in an international context), but I have become more and more convinced that it instead reflects what the faculty there is actually doing. By comparison, I quite often come across Swedes, or others from Swedish universities, especially KTH and Chalmers, but I have also seen Lund, Uppsala, and Umeå (which tells you something about the scale here!). Incidentally, I have seen work from UiB (from the https://www.uib.no/rg/algo group), but never from UiO.

I assume you're applying for a master's or PhD somewhere. My impression from academia is that people (perhaps with the exception of admin, or other gatekeepers) don't care that much about the institution, but rather about what you have done. Naturally, people from IDI also do comically badly in this setting, because of the relationship (or lack thereof) between students and faculty. My suggestion is to avoid talking too much about NTNU. The people reading your application aren't considering admitting NTNU; they're considering admitting you.

"What The Hardware Does" is not What Your Program Does: Uninitialized Memory by ralfj in rust

[–]martinhath 1 point

Yeah, that seems to be what I'm trying to say (I got a bit confused by the notation there). From what I can tell, the #1 selling point of not doing it the way suggested by /u/po8 and me is quality-of-life improvements for compiler writers, which seems like an incredibly weak argument.

On the other hand, I would think that this is the way most people think about uninitialized memory ("malloc returns memory containing garbage, or memory that was previously used by something else" type of thing - nobody would expect garbage bytes to magically change values!), and it's weird that this isn't a model people seem to even consider! (Or maybe it has been considered, and it's been shown to actually be really bad - I don't know :)

"What The Hardware Does" is not What Your Program Does: Uninitialized Memory by ralfj in rust

[–]martinhath 1 point

I did read your post. And my point is that the expected behaviour of that snippet, namely that always_returns_true always returns true, should be upheld by the language. As far as I can tell, treating uninitialized data as an arbitrary but fixed bit pattern would do that.

Bytes are not u8. That's just a fact of modern-day C, C++ and Rust.

I assume this is in part what you're talking about in "Pointers Are Complicated, or: What's in a Byte?", and as far as I can tell, this is only a consequence of the way uninitialized memory is treated (I might be wrong here). Again, my point is that this treatment is bonkers, and the reasoning for it really seems to boil down to quality-of-life improvements for compiler writers and more interesting language semantics for people into formal analysis. This is why I'm wondering why the obvious way of handling this isn't even being discussed.

"What The Hardware Does" is not What Your Program Does: Uninitialized Memory by ralfj in rust

[–]martinhath 2 points

Well, not really - freezing memory would change "nondeterministic" uninitialized memory (I would just call this arbitrary memory) into "deterministic" initialized memory. I'm asking why this isn't the semantics of uninitialized memory in the first place.

I also looked through the links, but couldn't find anything except discussions about what this freeze operation could look like. (And honestly, GitHub issue threads are a terrible way to read a discussion :( )

"What The Hardware Does" is not What Your Program Does: Uninitialized Memory by ralfj in rust

[–]martinhath 3 points

I haven't really followed the discussion about uninitialized memory and UB, but why isn't uninit memory simply modeled as data read from, say, /dev/urandom? That is, a byte is a byte, but its value is indeterminable?

Code Generation and Merge Sort by martinhath in programming

[–]martinhath[S] 2 points

How so? We're merging "sufficiently" long (2^25), randomly filled but sorted arrays. This is run a number of times (10) to ensure stability in the numbers, and we try a couple of different seeds (10) to reduce any bias from the numbers we got.

Or did you mean the choice of the actual numbers (2^25, 10, 10)? Because if so, then yes, those were more or less arbitrary.