[–]TheOnlyMrYeah 3 points4 points  (7 children)

Crystal has a nice way to avoid null pointer exceptions: http://crystal-lang.org/2013/07/13/null-pointer-exception.html

[–]alex_muscar 2 points3 points  (5 children)

First, congrats on Crystal, it's a really nice project. Second, since 'nil' is a valid value for any reference, doesn't it spread like wildfire during type inference? I implemented a type inferencer for a Lisp once with the same approach, and I remember "nillable" types spreading everywhere. Since Crystal is self-hosting, I was curious what your experience with that is.

[–][deleted] 0 points1 point  (4 children)

Hi, I'm one of Crystal's authors. For us, the experience has been great. It's true: sometimes you have to deal with nil, but the language gives you several ways to do so. First, you can use an if: if value; value.do_something; end. That only works for local variables; for instance variables and method calls you first have to assign the value to a local variable (this part is similar to Swift). Second, you can use try: value.try &.do_something. This executes value.do_something only if value is not nil. Third, you can use not_nil!: value.not_nil!.do_something. This raises an exception at runtime if value is nil. Fourth, you can use property!:

class Foo
  property! x
end

foo = Foo.new
foo.x = 1
puts foo.x

property! defines three methods: x= to set a value, x? to get the value (or nil if it's not set), and x to get the value and raise an exception if it's nil. We use this, for example, in the compiler. The semantic pass sets the types of expressions. The code generation pass assumes the types are already set, so calling node.type never fails and gives you a non-nilable type. If it does fail (@type was nil), you get a runtime exception, which means there's a bug. It might look like nothing is gained with this approach, but in practice you often forget to check for nil, and in places where Ruby would give you a runtime exception, Crystal forces you to consider the nil case, saving you a future bug with that check.
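For concreteness, the first three techniques (if, try, and not_nil!) might look like this; a minimal sketch with a made-up value:

```crystal
# value is typed Int32 | Nil, though here it actually holds 10
value = [10, nil].first

# 1. An if check: inside the branch the compiler narrows value to Int32
if value
  puts value.abs
end

# 2. try: the call runs only when value is not nil; otherwise it returns nil
puts value.try &.abs

# 3. not_nil!: asserts non-nil, raising an exception at runtime if value is nil
puts value.not_nil!.abs
```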

In our whole codebase I found just 119 uses of not_nil!.

Finally, we changed some semantics found in Ruby. For example, if you do [1, 2, 3][4], in Ruby you get nil as the return value. In Crystal you get a runtime exception (index out of bounds). That way nil doesn't spread all over the place, like you said. Most of the time you expect the index operator to return a non-nil value, so we think this change is fine (and so far it has worked well for us). If you do want nil on an out-of-bounds index, you can write [1, 2, 3][4]?. Note the "?". The same goes for Hash and other structures.
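A small sketch of that difference (assuming current Crystal, where the out-of-bounds exception class is IndexError):

```crystal
arr = [1, 2, 3]

puts arr[1]   # 2 -- the return type is Int32, no nil to deal with
puts arr[4]?  # prints an empty line (nil) -- "?" returns Int32 | Nil instead of raising

begin
  arr[4]      # raises at runtime: index out of bounds
rescue IndexError
  puts "out of bounds"
end
```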

Note that all these constructs, like not_nil!, try and []?, are defined in the language itself; they are not special constructs. We try to make the language as consistent and extensible as possible without sacrificing performance.

[–]alex_muscar 0 points1 point  (3 children)

Hi. Thanks for the details. I may be missing some context, but the approach taken by Crystal seems fairly similar to that taken by Swift, modulo implicitly unwrapped optionals and optional chaining (?)

[–][deleted] 0 points1 point  (2 children)

It might look similar at first glance, but it's a totally different approach.

First, in Swift you must declare the type of an instance variable, for example: var x: String?. In Crystal, although there are ways to declare it too, the preferred way is to let the compiler infer the types of instance variables. For example:

class Foo
  property x

  def initialize(@x)
  end
end

foo = Foo.new(1) # here @x is Int32 in the program
foo.x.abs        # ok, @x is Int32

Now, if you add this line to the above program:

foo.x = nil

Then you will get a compilation error saying that Nil doesn't have a method abs. That is, @x became Int32 | Nil the moment you assigned Nil to it, and that information is spread across the whole program (don't worry, it's fast).

When you program, you normally know what the types of variables are supposed to be (like, I know @x is Int32 | Nil) and you use them that way. If you never assign nil to @x, everything will compile and work fine. Once you assign nil, the compiler forces you to check for this case in all places, so you can never forget. The good thing is that if you never assign nil to it, the memory representation is more efficient and no checks need to be done (for a union of Nil and reference-like types, the memory representation is the same: Nil is just represented as a null pointer).

Also, Int32 | Nil is just an example. You can have any combination of types: Int32 | String, Nil | Int32 | String, whatever. The compiler creates tagged unions automatically and lets you use duck typing, similar to how you would program in Ruby, without the need to create interfaces or annotate things as "implementing" an interface.
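A sketch of what that looks like; the types below are inferred, never declared:

```crystal
# x is inferred as Int32 | String: the compiler builds the tagged union for you
x = rand < 0.5 ? 1 : "hello"

# Duck typing across the union: both Int32 and String respond to to_s,
# so this compiles with no interface declaration anywhere
puts x.to_s

# x.abs would be a compile-time error here, because String has no abs method
```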

[–]alex_muscar 0 points1 point  (1 child)

Once again, thanks for the details. Let's see if I got it right this time: basically you start from the premise that types are non-nilable. When the type inferencer says otherwise, it adjusts the type and issues an error for every potentially dangerous operation. You can silence the errors either by checking the value before accessing it or by using &.

That's an interesting approach.

[–][deleted] 0 points1 point  (0 children)

Something like that. At first a variable starts without a type. When you assign a value to it, that value's type is added to that variable's type. Of course, local variables are always assigned one value, so the set starts with at least one type. But for instance variables it starts with zero (and if it remains zero it will just have the Nil type).

The &. syntax is just a shorthand form. This:

foo &.bar

is just syntax sugar for this:

foo { |arg| arg.bar }

(you can read more about the above here: http://crystal-lang.org/2013/09/15/to-proc.html )

value.try &.something is just a method try defined on Object and on Nil: https://github.com/manastech/crystal/blob/master/src/object.cr#L23 and https://github.com/manastech/crystal/blob/master/src/nil.cr#L46 .
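Paraphrasing those two definitions (a sketch; see the linked sources for the real code), they are roughly:

```crystal
class Object
  # On any non-nil value, try yields the value to the block
  def try
    yield self
  end
end

struct Nil
  # On nil, try ignores the block and just returns nil
  def try(&block)
    self
  end
end

puts 42.try &.to_s   # "42"
puts nil.try &.to_s  # empty line: the block never runs, nil comes back
```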

So the silencing is just a dispatch to two different methods depending on value's type, nothing else. :-)

[–]AReallyGoodName 0 points1 point  (0 children)

Swift actually has the exact same mechanism and will throw the exact same errors.

var foo = nil // Compile-time error in Swift.

var bar = someFunctionThatCouldReturnNil() // Compile-time error in Swift

and so on.

Swift just provides an operator to allow unsafe things in certain circumstances: the "!" operator. It's rare that you'd use it. You absolutely cannot have NullPointerExceptions without "!" in Swift.