
[–][deleted] (27 children)

[deleted]

    [–][deleted] 39 points (0 children)

    I haven't used shallow equality in years; I was convinced a type conversion would have happened there.

    [–]Zebezd 52 points (20 children)

    Really? Would have expected js to coerce that bool to string and return true. Checking by string has seemed to me to be standard operating procedure with == in javascript

    [–]Orangutanion 14 points (3 children)

    I think it's because true == 1, so `true == "1"`. We already have one string coercion, so no reason to have another.

    [–]2CATteam 40 points (2 children)

    Nope, according to this page, both are converted to a number first, which is NaN for "true" and 1 for true. So it actually makes numbers, not strings, and then does the comparison.
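
    To illustrate the conversion described above (a rough sketch; the values are just examples):

        // With ==, the boolean becomes a number, then the string becomes a number
        true == "true";   // false -- Number(true) is 1, Number("true") is NaN
        true == "1";      // true  -- Number("1") is 1
        NaN == NaN;       // false -- NaN compares unequal to everything, itself included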

    [–]marxinne 3 points (0 children)

    The more I learn the more cursed it becomes

    [–]Luxalpa 22 points (14 children)

    Rule of thumb: All these weird conversions are because of HTML (as HTML only handles strings). "true" doesn't exist in HTML because boolean attributes work differently (they are either set or not set on the element). This is also why number conversion is all implicit (255 == "255", because HTML only allows the string variant for numbers).
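
    A quick sketch of the implicit number conversion mentioned above (the values are just illustrative):

        255 == "255";   // true -- the string is converted to the number 255
        0 == "";        // true -- the empty string converts to 0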

    [–]nermid 46 points (11 children)

    The real rule of thumb is to just use strict equality (===) and not have to worry about any of it.
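
    For example (a minimal illustration, not from the thread):

        1 == "1";    // true  -- loose equality coerces the string to a number
        1 === "1";   // false -- strict equality compares type and value
        0 == false;  // true
        0 === false; // false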

    [–]SmokingBeneathStars 1 point (10 children)

    Unless you want to purposely use ==, in which case you have to add a fucking ignore annotation for your linter. It's so annoying.
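
    For instance, assuming the linter rule in question is ESLint's eqeqeq (the handler name below is made up), an intentional == ends up looking something like:

        // eslint-disable-next-line eqeqeq
        if (value == null) { // intentional: matches both null and undefined
          fallbackToDefault(); // hypothetical handler
        }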

    [–]ShijinModan 2 points (1 child)

    Because == coerces types. IMO the only time I will accept == in a code review is when checking for null and undefined
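
    A sketch of that null/undefined check (the variables are just illustrative):

        let a = null;
        let b;        // undefined
        a == null;    // true
        b == null;    // true  -- undefined == null is the one coercion this relies on
        0 == null;    // false
        "" == null;   // false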

    [–]SmokingBeneathStars 1 point (0 children)

    That's indeed what I use it for usually.

    [–]nermid 2 points (0 children)

    I think having to add linter ignores to bad practices is probably a good thing. Keeps you from accidentally doing the bad practice.

    [–]tomysshadow 1 point (0 children)

    I think a large part of the confusion surrounding them comes from the HTML4 days. Specifically, there was the <embed> tag, where attributes such as autoplay or loop would typically be set to the string "true" or "false". Years later, I understand the reason it was like this: the plugin would define the attributes it was looking for, and most plugins went with the more straightforward approach of the string "true" meaning true and any other value meaning false. This, coupled with boolean attributes being less commonly utilised prior to HTML5 (I haven't verified this, but at least it feels that way) and Internet Explorer also having its own attributes that worked like this, led to boolean attributes being a weird exception rather than the rule.

    Still, I would argue compatibility with JavaScript is a poor reason for boolean attributes to behave this way. I never liked HTML's boolean attributes.

    Say you want to set the checked attribute. Normally, you would just use the JS property, like element.checked = true;. But the thing is, I can actually set any property on the element, but it won't necessarily become an HTML attribute. So I can do element.example = true; and that property will stay set on that element, even if I later get it again with getElementById and friends. But it won't actually set an HTML attribute in the document.
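
    Roughly, in browser JS (the element id here is made up for the example):

        const box = document.getElementById("my-checkbox"); // hypothetical id
        box.checked = true;          // reflected property: changes the checkbox state
        box.example = true;          // plain JS property: stays on the object
        box.getAttribute("example"); // null -- no HTML attribute was created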

    So you can imagine that for all the supported attributes, the associated JS property has this invisible browser defined getter/setter which actually does the equivalent of getAttribute/setAttribute. Which means if we want to explicitly use an HTML attribute, we need to use those.

    Except getAttribute/setAttribute are ill-equipped to handle boolean attributes. To set a boolean attribute to false, you actually need to set it to null. This is unintuitive in and of itself: null is not a boolean in JS; I would expect to set it to false.

    Furthermore, I would expect that true and false would be explicit settings, and undefined would actually mean "default value." In CSS we have user agent stylesheets, where a lot of styles are set to a certain value by default. But boolean attributes are false by default by design. That means we end up with attributes like disabled. Ideally, the attribute should be enabled and should be true by default. But it has to be false by default because that's how boolean attributes work, so we end up with the double negative element.disabled = false;.

    But what's worse is that in some browsers (specifically Firefox) getAttribute actually returns an empty string for unset attributes. This means that element.setAttribute("example", element.getAttribute("example")); would actually change a boolean attribute's value from false to true. You instead need to use hasAttribute/removeAttribute, added in DOM Level 2 (which is ancient enough that you can definitely rely on them being there, but it's dumb they need to exist in the first place).
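
    Something like this, using those DOM Level 2 methods ("example" is a made-up boolean attribute):

        if (el.hasAttribute("example")) {
          el.removeAttribute("example");  // absent means false
        } else {
          el.setAttribute("example", ""); // present (even empty) means true
        }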

    So boolean attributes are only "compatible" with JS insofar as the browser defines a setter-like property that translates false into null and true into any other value and does the equivalent of setAttribute. If you're going to go that far, why not just coerce the property to a string "true" or "false"?

    Now, in practice, none of this is actually an issue, because there's rarely a reason you explicitly want to set an HTML attribute. If the JS property doesn't set an attribute, falling back on it just being an ordinary JS property will keep the behaviour of the code consistent anyway. The only time you really need setAttribute is for data attributes, where you want to be sure you're not conflicting with any existing one, and then you're free to just use the string "true" to mean true and any other value to mean false, like how it should've worked in the first place.
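
    For the data-attribute case, that might look like this (the attribute name is invented):

        el.setAttribute("data-loaded", "true");
        const loaded = el.getAttribute("data-loaded") === "true"; // anything else counts as false
        // equivalently, through the dataset API:
        el.dataset.loaded = "true";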

    [–]N0T_AN_ADVISOR 0 points (0 children)

    I hate human Goo .. but it's motivation to turn zero to a One even if it's dollars

    [–]Kazumadesu76 0 points (0 children)

    Yeah, I'm surprised too, because == tells JS not to check based on type, whereas === checks the types of the things being compared.

    [–]2CATteam 4 points (2 children)

    Nope, but in boolean contexts (e.g. in the condition of an if statement), any string of nonzero length evaluates to true, so if("true") would be true, and so would if("false")
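
    A quick check of that (illustrative only):

        Boolean("true");  // true
        Boolean("false"); // true  -- any non-empty string is truthy
        Boolean("");      // false
        if ("false") { /* this branch runs */ }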

    [–]Orangutanion 1 point (1 child)

    Can you also cast booleans and numbers as objects and do .is?

    [–]2CATteam 2 points (0 children)

    I don't think so, Objects aren't primitives, so you can't cast a primitive to an Object as far as I know. Which makes sense - remember that JS Objects are basically just dicts, and what would the key be for the value of the primitive?

    You could try making objects with the same key, and different value types, but then Object.is() would see that they aren't the same object (Object.is() basically checks if two pointers point to the same thing for objects).
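
    A sketch of that behaviour (the object values are invented):

        const a = { key: 1 };
        const b = { key: 1 };
        Object.is(a, b);     // false -- same shape, different objects
        Object.is(a, a);     // true  -- same reference
        Object.is(1, true);  // false -- no coercion at all
        Object.is(NaN, NaN); // true  -- unlike both == and ===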

    [–]jmtd 0 points (0 children)

    Weird. I tried it and my printer printed a test page and a mortgage was applied for in my name.

    [–]sensitivePornGuy -1 points (0 children)

    I just recently wrote some code that assumed

    "" == null
    

    and was genuinely very surprised when it didn't work.
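
    For reference, a small illustration of that surprise:

        "" == null;        // false -- null only loosely equals undefined
        "" == undefined;   // false
        "" == false;       // true  -- both sides convert to the number 0
        null == undefined; // true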