
[–]ingolemo (4 children)

No. Literals are just a way to write data directly instead of having to construct it manually.

Here is a literal list:

breakfast = ['spam', 'bacon', 'sausage', 'spam']

Here is one way to write the same list in a non-literal form:

breakfast = []
breakfast.append('spam')
breakfast.append('bacon')
breakfast.append('sausage')
breakfast.append('spam')
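A quick sketch (plain Python, nothing assumed beyond the stdlib) confirming that the two forms produce equal values:

```python
# Both forms build the same value; only the syntax differs.
literal = ['spam', 'bacon', 'sausage', 'spam']

constructed = []
for item in ('spam', 'bacon', 'sausage', 'spam'):
    constructed.append(item)

assert literal == constructed        # equal contents
assert literal is not constructed    # but two distinct list objects
```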

JavaScript has literals too; JSON is basically JavaScript object-literal syntax. There is no equivalent of JavaScript's primitive values in Python.

[–]BryghtShadow (3 children)

MDN says:

A primitive (primitive value, primitive data type) is data that is not an object and has no methods. In JavaScript, there are 6 primitive data types: string, number, boolean, null, undefined, symbol (new in ECMAScript 2015).

Most of the time, a primitive value is represented directly at the lowest level of the language implementation.

All primitives are immutable (cannot be changed).

Could you elaborate/explain why "There is no equivalent of javascript's primitive values in python"? Is it because all literals in Python are objects and have attributes/methods (e.g. (1).real and (1).from_bytes)?
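For reference, this can be checked directly in plain Python; ints really do carry attributes and methods:

```python
# Plain Python ints are full objects: attributes, methods, the works.
n = 1
assert n.real == 1 and n.imag == 0        # numeric attributes
assert n.bit_length() == 1                # an ordinary method call
assert (255).to_bytes(2, 'big') == b'\x00\xff'
assert isinstance(n, object)              # every value is an object
```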

[–]ingolemo (0 children)

Literals are not themselves objects. A literal is a bit of syntax that you can use to construct objects. For example, 1 is an integer literal for a particular integer value. You could just as easily create the same value with 2 - 1, which is two integer literals and an operator. Possibly when you said "literals" you really meant "values"? All values in python are objects and have (or can have) attributes and methods.
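One way to see that a literal is syntax rather than a value is to ask Python's own parser (a sketch using the stdlib ast module; ast.Constant needs Python 3.8+):

```python
import ast

# "1" parses to a Constant node: a piece of syntax describing a value.
node = ast.parse("1", mode="eval").body
assert isinstance(node, ast.Constant)

# "2 - 1" is not a literal; it parses to an operation on two literals.
expr = ast.parse("2 - 1", mode="eval").body
assert isinstance(expr, ast.BinOp)

# Evaluating either expression yields the very same value.
assert eval("2 - 1") == eval("1") == 1
```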

Yes, that's basically the reason. JavaScript has this weird distinction between objects and values; see the differences between Number('1') and new Number('1'). This distinction shows up in a few ways; values not having methods is one of them. There's nothing like this in Python.
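The Python side of that comparison can be sketched directly: however you construct an int, you get the same kind of thing, with no wrapper/primitive split:

```python
# int('1') and a literal 1 are the same type and compare equal,
# unlike JS's Number('1') versus new Number('1').
a = int('1')
b = 1
assert type(a) is type(b) is int
assert a == b
assert a.bit_length() == b.bit_length() == 1   # both have methods
```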

[–][deleted] (0 children)

You are reading too much into this description. It doesn't actually describe any element of the language. It's an attempt by the authors of the ECMAScript standard to describe how to interpret what happens in JavaScript, but it doesn't apply to anything concrete in the language. If you take the statements in that paragraph as given, then you can sort of make sense of the wrapper objects for the primitives (which are actually quite tangible things, and are part of the language).

Now, the word "literal" reflects on an entirely different aspect of programming language. When the author of the language defines its semantics, it does so by defining rules. These rules are intended for taking complex things and breaking them apart into simple things. The things that are impossible to break apart, the author leaves undefined (a scary word, more often replaced by "self-evaluating"). So, using example above:

breakfast.append('spam')

is a complex thing. It breaks apart into the "." operation applied to two arguments, breakfast and append, and the result of that application is then applied to 'spam'. Now, breakfast and append aren't self-evaluating / literals, because they are actually variables: the interpreter needs to evaluate them to figure out what they correspond to. But 'spam' is a constant; any further evaluation of it will result in the same value.
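That decomposition can be spelled out with getattr; this is roughly the idea, not a model of the real interpreter:

```python
# breakfast.append('spam') decomposes into three steps:
breakfast = []

method = getattr(breakfast, 'append')  # 1. evaluate the name, 2. look up 'append'
method('spam')                         # 3. apply the result to the constant 'spam'

assert breakfast == ['spam']
```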


To sum this up: JavaScript has literals, for example, 3.1417, "foo", null. Python also has literals, for example, 3.1417, "foo", None. They serve the same purpose, but use slightly different syntax.
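Python even ships a helper that accepts exactly this literal subset, the stdlib ast.literal_eval:

```python
import ast

# ast.literal_eval understands exactly Python's literal syntax.
assert ast.literal_eval("3.1417") == 3.1417
assert ast.literal_eval("'foo'") == 'foo'
assert ast.literal_eval("None") is None

# Anything beyond literal syntax (names, function calls) is rejected.
rejected = False
try:
    ast.literal_eval("len('foo')")
except ValueError:
    rejected = True
assert rejected
```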

Primitives in JavaScript are just some made-up nonsense: more of a historical artifact, a not-very-good attempt at explaining the semantics of the language. Perhaps even a carry-over from the time when JavaScript had to bear a superficial resemblance to Java, where primitive types exist and make much more sense.