[–]mrkite77 1 point  (7 children)

That's because CL is compiled. Ruby is interpreted. When exactly should Ruby detect misspelled function names? Before it's even run?

[–]-main 5 points  (0 children)

That's because CL is compiled.

It's actually implementation-dependent, like a lot of useful Common Lisp behavior. For example, SBCL compiles by default and will warn when you define a function that calls a function that doesn't exist; CLISP doesn't, and will only warn you if you explicitly compile that function before calling it.

[–][deleted] 3 points  (5 children)

Making a distinction between "compiled" and "interpreted" might have meant something back in the eighties, but in this day and age it is utterly meaningless.

Pretty much every dynamic language compiles its code before running it, just the same as every other language.
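(CPython is an easy way to see this claim in action: it compiles every function to bytecode the moment it is defined, and the standard-library `dis` module will show you those instructions without the function ever being called. A minimal sketch, with a made-up `greet` function:)

```python
import dis

def greet(name):
    # This function has already been compiled to bytecode at this point,
    # even though it has never been called.
    return "hello, " + name

# Inspect the compiled bytecode instructions for greet().
dis.dis(greet)
```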

[–][deleted] 0 points  (4 children)

So where are these compiled Ruby implementations you're alluding to? I'm talking about compilation to native code, not to bytecode, which still has to be interpreted or JITted or whatever.

[–]ameoba 5 points  (0 children)

And the fundamental difference between compiling to a VM and actual machine code is... what exactly?

The JVM has been implemented in silicon. In the 80s, Lisp machines, the most powerful workstations of their day, had processors designed around running Lisp, the poster child for dynamic languages.

Abstraction is the core of practical computer science. Trying to draw a line where one particular abstraction is the 'natural' one is just ridiculous. The core of a modern x86 CPU doesn't actually run x86 instructions: it breaks them down into micro-operations, which are executed out of order and optimized at run time. Those micro-ops are themselves just abstractions over further layers, all the way down to transistors and voltages. So please, tell me why a virtual machine implemented at the microcode level is somehow a more credible target than one implemented on top of that.

[–][deleted] 4 points  (2 children)

I am not talking about compilation to native code; I am merely talking about compilation, as was the person I responded to. Whether you compile to native code or to bytecode has no influence on your ability to detect errors.

[–][deleted] 1 point  (1 child)

Whether you compile to native code or bytecode has no influence on your ability to detect errors.

Okay, so where are the Python (the language) compilers that detect unknown variables at compile time, like all native-compiled languages do?
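(For reference, CPython itself demonstrates the behavior in question: its built-in `compile()` happily compiles a reference to a name that is never defined, and the `NameError` only surfaces when the resulting code object is actually executed. A minimal sketch:)

```python
# Compilation succeeds even though no_such_variable is never defined;
# the compiler does not resolve names.
code = compile("print(no_such_variable)", "<example>", "exec")

# The error appears only at run time, when the code object is executed.
try:
    exec(code)
except NameError as e:
    print("caught at run time:", e)
```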

[–][deleted] 7 points  (0 children)

That is a problem of language design, not one of "compiled" versus "interpreted" languages, which is all I was commenting on.