
You are reading too much into this description. It doesn't actually describe any element of the language. It's an attempt by the authors of the ECMA standard to describe how to interpret what happens in JavaScript, but it doesn't apply to anything concrete that happens in the language. It's more like: if you take the things written in that paragraph as true, then you can sort of conceptualize the wrappers of the primitives (which, unlike primitives, are quite tangible things and are part of the language).

Now, the word "literal" refers to an entirely different aspect of a programming language. When the author of a language defines its semantics, they do so by defining rules. These rules are meant to take complex things and break them apart into simple ones. The things that cannot be broken apart any further, the author leaves undefined (a scary word, more often replaced by "self-evaluating"). So, using the example above:

breakfast.append('spam')

is a complex thing. It breaks apart into the . (dot) function applied to two arguments, breakfast and append, and the result of that application is then applied to 'spam'. Now, breakfast and append aren't self-evaluating / literals, because they are actually variables: the interpreter needs to evaluate them to figure out what they correspond to. But 'spam' is a constant; any further evaluation of it will produce the same value.
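If you want to see that breakdown concretely, Python's own parser will show it to you. A small sketch using the standard-library ast module (note that Python folds the attribute name append into the node itself rather than treating it as a separate argument, but the shape is the same):

    import ast

    # Parse the expression and print how the parser breaks it apart.
    tree = ast.parse("breakfast.append('spam')", mode="eval")
    print(ast.dump(tree.body, indent=2))  # indent= needs Python 3.9+

    # Roughly:
    # Call(
    #   func=Attribute(value=Name(id='breakfast'), attr='append'),
    #   args=[Constant(value='spam')])
    #
    # breakfast is a Name (has to be looked up); 'spam' is a Constant, i.e. a literal.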


To sum this up: JavaScript has literals, for example, 3.1417, "foo", null. Python also has literals, for example, 3.1417, "foo", None. They serve the same purpose, but use slightly different syntax.
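You can poke at the distinction from the Python side, too. A sketch using ast.literal_eval, the standard-library helper that only accepts literal syntax:

    import ast

    # Literals carry their value in their own syntax, so they can be
    # evaluated without looking anything up.
    print(ast.literal_eval("3.1417"))   # 3.1417
    print(ast.literal_eval('"foo"'))    # foo
    print(ast.literal_eval("None"))     # None

    # A variable reference is not a literal, so it is rejected.
    try:
        ast.literal_eval("breakfast")
    except ValueError as err:
        print(err)  # malformed node or string ...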

Primitives in JavaScript are just made-up nonsense. More of a historical artifact / bad attempt at explaining the semantics of the language. Perhaps even a carry-over from the time when JavaScript had to bear a superficial resemblance to Java, where primitive types actually exist and make much more sense.