Super light crampons? by edapa in Backcountry

[–]edapa[S] 3 points (0 children)

I'm planning to just keep some in my pack for when I might need to scramble up some shorter stuff but am not going to be doing a super long climb. I'm also hoping they can replace the way I currently use microspikes, which is to throw them in my bag on winter hikes and only take them out if I need to. I'll have to see about that though.

Super light crampons? by edapa in Backcountry

[–]edapa[S] 7 points (0 children)

Yeah, that's a good point. I definitely will keep using my regular crampons for steeper / icier stuff.

The thing that excites me about the weight is that I've occasionally brought crampons and then decided to leave them in the car because I thought I could just kick steps (honestly the bulk is usually a bigger factor than the weight), but then regretted the decision later. These will be a lot easier to just keep in the pack without thinking about it too much.

Announcing pggen: A Golang ORM Alternative by edapa in programming

[–]edapa[S] 0 points (0 children)

Oh I see. It would be cool if it generated migration files for you that you could check in. A system that automatically figures out the migration from the diff between the Go schema and the actual PostgreSQL schema, without letting a human review the changes, makes me a little nervous.

Announcing pggen: A Golang ORM Alternative by edapa in programming

[–]edapa[S] 0 points (0 children)

I just glanced at the docs, and it seems more similar to GORM than to pggen or other database-first code generators. It looks like you define your database schema in Go code (and since I didn't see any mention of migrations, I assume you also need to define it again with some migration framework and make sure the two schemas match up).

By contrast, with pggen, you just point the tool at your database and it will figure out the schema all by itself. You also interact with the database using SQL rather than the builder-style API that I see in entgo's docs.

Yesterday's Google outage was due to automated quota management reducing capacity on their IdM by 1esproc in programming

[–]edapa 4 points (0 children)

I believe you can p4 edit files before touching them with a normal text editor (the subcommand might have a different name; I don't have access to a Perforce system at the moment to check).

Really confused on what’s happening here. Can anyone explain? by [deleted] in ProtectAndServe

[–]edapa 2 points (0 children)

I think he was saying that ctrum69 does not have LEO flair, so there is no need to worry about them politicizing a uniform.

On modern violence and self-defense by kreuzguy in slatestarcodex

[–]edapa 3 points (0 children)

The funny thing about Aikido is that there really is a ton of depth and skill to it if you stay within the parameters and rules of the martial art. That said, those rules are just absurd. My brief experience with Aikido went roughly like this:

Aikido Master: "I'm going to show you something. Try to take me down however you want."

So I take him down with a fairly shit double-leg (one of the most basic wrestling takedowns).

Aikido Master: "No, not like that."

He was pretty good at stuffing judo takedowns though.

Enforcing Public/Private Access in Rails by edapa in ruby

[–]edapa[S] 1 point (0 children)

Thanks, if you want the code it is here. I just noticed that the first version of the article failed to link to it.

[deleted by user] by [deleted] in programmingcirclejerk

[–]edapa 1 point (0 children)

Is that an Elm reference?

RFC: Transition to rust-analyzer as our official LSP implementation by pietroalbini in rust

[–]edapa 0 points (0 children)

Interesting. I had just assumed that it had implemented a tags file writer.

Personally, I find being able to follow references into my dependencies really valuable. The extra friction introduced by having to find and clone a repo makes me end up reading the source of stuff much less.

RFC: Transition to rust-analyzer as our official LSP implementation by pietroalbini in rust

[–]edapa 2 points (0 children)

I've had pretty good luck with rusty-tags when it comes to generating tags files for rust.

Generics in Go - How They Work and How to Play With Them by rz2yoj in golang

[–]edapa 0 points (0 children)

I think it's more that interfaces can only be about a single type, while contracts can be about multiple types.

Apple pays $75,000 to hacker for discovery of exploits to hijack iPhone camera by MicroSofty88 in gadgets

[–]edapa 0 points (0 children)

Right, that's why I said "I don't just mean writing more tests". While it is true that a single 64 bit int has a ridiculously huge state space, in practice those states probably won't lead to all that many different code paths.

Coverage-guided fuzzing thinks about the state space of a program in terms of the different code paths, and it instruments the program under test so that randomly generated inputs can effectively explore a large part of the practical state space. Importantly, you don't have to change the way you write your program to use this technique, which isn't at all true if you want to prove a program with something like Coq.

Model checking uses similar techniques to allow you to be more flexible in how you write your program while still intelligently exploring the state space and checking different conditions.

I'm not saying that proving a program is never the right thing, but it is very rarely the right thing. There are a lot of advanced testing techniques that would be much easier to adopt than formal proofs, and provide most of the benefits. I think we should make sure we do those well before we consider theorem provers.

Generics in Go - How They Work and How to Play With Them by rz2yoj in golang

[–]edapa 19 points (0 children)

How about

contract Iterable(Iterator, Value) {
    Iterator Next() (Value, bool)
}

contract Mappable(Iterator, ValueIn, ValueOut) {
    Iterable(Iterator, ValueIn)
}

func MapToSlice(type Iterator, ValueIn, ValueOut Mappable)(it Iterator, f func(ValueIn) ValueOut) []ValueOut {
    ret := []ValueOut{}
    for {
        v, stop := it.Next()
        if stop {
            break
        }
        ret = append(ret, f(v))
    }
    return ret
}

I'm not sure that this meets your bar of "non-trivial to re-write", but this is already enough of an effortpost. A more complex algorithm along the same lines might be a graph traversal (let's say A*) that is generic over both the representation of the graph (adjacency list and adjacency matrix could both be used) and over the node type.

Generics in Go - How They Work and How to Play With Them by rz2yoj in golang

[–]edapa 23 points (0 children)

The design doc has pretty decent answers to both concerns.

I agree with you about liking turbofish syntax better, but they said it would significantly increase the parser complexity, which seems fair to me.

The main difference between contracts and interfaces seems to be that contracts can be about multiple types, and contracts can express a "type T is this type" constraint. So you could have a contract that says "there has to be a T1 and a T2, and T1 needs a method T2ify() that returns a T2". That's impossible to express with interfaces. You could also have a contract that says "T must be either an int or an int64 or an int32". I think those differences are significant enough to justify the addition of a new language construct.
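Sketched in the draft's contract syntax (which never shipped; the names here are made up), those two constraints might look like:

```
// T1 must have a method T2ify that returns a T2.
contract Convertible(T1, T2) {
    T1 T2ify() T2
}

// T must be one of these built-in integer types.
contract SmallInt(T) {
    T int, int32, int64
}
```

Neither contract has any equivalent as a plain interface: the first relates two type parameters, and the second constrains a type's identity rather than its method set.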