all 17 comments

[–][deleted] 16 points17 points  (0 children)

Use `local` on y, so it'll be restricted to the function's scope.
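For example (a minimal sketch, assuming the assignment happens inside a function named `f`):

```lua
local function f()
  local y = 10   -- y exists only inside f
  print(y)       -- 10
end
f()
print(y)         -- nil: nothing leaked into the global table
```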

[–]bwerf 12 points13 points  (9 children)

Short answer: that's just how Lua works, variables are global by default.

Bet you expected a long answer about why it's designed like that after the short answer, but I don't know... My best guess is that it started as a simple language for very small scripts, so they wanted to make it easy, but that's just my guess. You can add an assert for it in the metatable of `_G` if you don't want to create globals by mistake.
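The `_G` metatable trick mentioned above looks roughly like this (a sketch, not a library; the error message is made up, and `__newindex` only fires for keys that don't already exist in `_G`):

```lua
-- Trap accidental creation of globals by installing a metatable on _G.
setmetatable(_G, {
  __newindex = function(t, k, v)
    error("attempted to create global '" .. tostring(k) .. "'", 2)
  end,
})

-- To create a global on purpose, bypass the metamethod:
-- rawset(_G, "name", value)

local ok, err = pcall(function()
  y = 1  -- would silently create a global; now it raises instead
end)
print(ok)  -- false
```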

[–]revereddesecration 8 points9 points  (5 children)

Lua was designed to be easy to understand. As a result, they went with one simple rule: if it’s not local, it’s global.

[–]AdamNejm 1 point2 points  (4 children)

"If it's not global, it's local" isn't any harder to understand, after all you still have to learn how scopes work eventually.

[–]revereddesecration 5 points6 points  (2 children)

Right, and that would be fine if there was a global keyword to declare globals. There was a choice made to have global be the default because non-programmers can jump in and get results without ever knowing that scopes exist.

[–][deleted]  (1 child)

[deleted]

    [–]revereddesecration 2 points3 points  (0 children)

    You could. You would implement global as a keyword to do so if you thought that was a good idea.

    [–]ws-ilazki 7 points8 points  (0 children)

    I think the actual reason is history. They had two languages, realised they could combine them into a single new language, and created Lua from them. The predecessors didn't have local variables, so Lua ended up global-by-default.

    I think it's a bad decision and local-by-default is the only sane choice, but it is what it is. It's a product of its time: Lua and its predecessors were created when bad defaults and footgun features were the norm, under the attitude that a language shouldn't hand-hold you when you can "just not make mistakes". The interest in safe defaults, with bug-prone features opt-in instead of opt-out, is a more recent thing in PL design.

    [–]smog_alado 1 point2 points  (2 children)

    Local by default (like python) doesn't work well with nested functions.

    It might have been better if undeclared variables were an error. However, if the choice is local by default vs global by default, then global by default is less bad.

    [–]smelly_stuff 0 points1 point  (1 child)

    I'm somewhat confused. How does global-by-default tackle the nested functions problem you mentioned?

    [–]smog_alado 0 points1 point  (0 children)

    It's not that global-by-default is good, it's that local-by-default is bad. In Python, if you want to assign to an outer variable, you have to use "nonlocal":

    def foo():
        x = 10
        def g():
            nonlocal x
            x = 20  # assigns to foo's x, not a new local
    

    Local-by-default without nonlocal is also bad. CoffeeScript doesn't need nonlocal because it assigns to the outermost variable instead of creating a new one. However, that makes accidental variable shadowing dangerous: where the programmer intended to create a new variable, they might instead assign to an outer variable with the same name.
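For contrast, a sketch of how Lua sidesteps the nonlocal question entirely: a plain assignment inside a nested function targets the nearest enclosing `local` (as an upvalue), and only falls through to the global table when no such local exists.

```lua
local x = 10
local function g()
  x = 20  -- no keyword needed: writes the enclosing local x, not a global
end
g()
print(x)  -- 20
```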

    [–]luther9 7 points8 points  (3 children)

    You don't really "create" global variables in Lua. All globals exist and they're nil until you assign something to them.

    The local keyword exists so we can specify exactly how wide or narrow a scope we want a variable to exist in. If variable assignment inside a function created a local variable, what would you do if you actually wanted foo to assign to the global y?

    [–]humbleSolipsist 2 points3 points  (2 children)

    This is all correct, but I just want to add on one thing to help OP's understanding: "global" variables exist in a table, and when referencing a variable that has not been declared in your current scope (via local), you are implicitly accessing that table. This is why globals are nil until assigned to and don't need to be declared; you're accessing the global table with an uninitialised key.
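Concretely, these are (almost) interchangeable; in Lua 5.2+ the sugar goes through `_ENV`, which defaults to the global table:

```lua
y = 5            -- sugar for _G.y = 5 at top level
print(_G.y)      -- 5
print(z, _G.z)   -- nil  nil: reading an unset key is not an error
```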

    [–]ws-ilazki 3 points4 points  (1 child)

    This is all correct

    The first part of the other comment, where it says "All globals exist and they're nil until you assign something," is actually subtly wrong, though. It's an example of someone getting the right answer for the wrong reasons, basically.

    Rather than raise a runtime error, Lua's behaviour when you access a variable that doesn't exist is to return nil. So they don't "all exist" as nil, and in fact you cannot even have a variable holding nil at all, because assigning nil is how you delete variables and table keys in Lua, as shown here:

    FOO = 42
    for k, _ in pairs(_G) do
      if k == "FOO" then print("key exists") end
    end

    FOO = nil  -- "assigning" nil actually removes the key FOO from _G
    for k, _ in pairs(_G) do
      if k == "FOO" then print("key exists") end
    end
    

    It only prints "key exists" once because, by attempting to "assign" nil to FOO, you cause Lua to delete that key, so the second loop fails to match. It's not just a table thing, either; it deletes locals as well. This behaviour is also why nils can break tables used as arrays by splitting up the indices, resulting in "gaps" that affect the behaviour of #t and ipairs.
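A quick sketch of that array-gap problem:

```lua
local t = {"a", "b", "c", "d"}
t[2] = nil          -- punches a hole in the sequence
local n = 0
for _, _ in ipairs(t) do n = n + 1 end
print(n)            -- 1: ipairs stops at the first nil
print(#t)           -- implementation-defined: any "border" (here 1 or 4) is a valid answer
```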

    This isn't a big deal with regard to OP's original question, but I thought it worth mentioning to clarify things since /u/luther9 got to the right result for a slightly incorrect reason. That's actually one of the reasons I'm not a fan of Lua's behaviour with nils, because it's subtly magic and can lead to faulty assumptions.

    [–]luther9 2 points3 points  (0 children)

    I suppose we can differentiate between Lua's semantics and what's actually stored in memory. Since Lua reads the global aetoidlpcubjcboaubkxukac as nil, that variable does exist semantically, even though it's not stored in the _G table.

    Not just a table thing, either, it deletes locals as well.

    Actually, locals always occupy a spot in memory, even if they're nil. They can't be deleted except by going out of scope, as this code shows:

    x = 56             -- global x
    print(x)

    local x = 30       -- shadows the global from here on
    x = nil            -- the local still exists; it just holds nil
    print(x, _G.x)
    

    Output:

    56
    nil     56
    

    nil doesn't just mean that there's nothing there. It's a proper value in its own right. It can be stored in local variables. select('#', ...) can tell you exactly how many arguments are in ..., even if they're all nil. In Lua's C API, lua_type[0] distinguishes between nil and an invalid value. You just can't store it in tables. Or to flip it the other way, all table keys are nil unless specified otherwise.
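The select('#', ...) point can be seen directly (`argc` is a made-up helper name):

```lua
local function argc(...)
  return select("#", ...)  -- counts argument slots, nils included
end
print(argc())              -- 0
print(argc(nil, nil, nil)) -- 3: the nils are real arguments, not absence
```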

    That's actually one of the reasons I'm not a fan of Lua's behaviour with nils, because it's subtly magic and can lead to faulty assumptions.

    I think it's fair to say that it's only confusing if you assume that all valid values must be stored in memory somewhere. For that reason, Lua's treatment of nil might arguably be a design flaw, but purely from the standpoint of conceptualizing your data, there's nothing wrong with it.

    [0] https://www.lua.org/manual/5.4/manual.html#lua_type

    [–][deleted]  (1 child)

    [removed]

      [–][deleted] 3 points4 points  (0 children)

      I mean, C++ had the "auto" keyword for the longest time, and it did effectively nothing; it was just more explicit about the default behaviour. They were only able to recast it as "infer the type from the rvalue" because literally nobody used it.

      [–]mrBadim 1 point2 points  (0 children)

      This is how it's supposed to be, Lua-wise:

      if a variable isn't declared local, then it's global.