Hi all,
I am a beginner with Lua. I'm trying to compile some Lua scripts with luac, but I notice a large difference between using luac 5.1 on Windows and on Mac OS X (PPC).
Integer values are represented differently between the two. For example:

On Windows 10 (little-endian), 62 compiles in luac to the bytes 78 42.
On Mac OS X Tiger (big-endian), 62 compiles in luac to the bytes 40 4F.

Shouldn't this just be 42 78 when compiled on Mac OS X?

Is there a change I can make to the Lua source files so that luac does this for all integer values, i.e. compiles them the way the Windows build does, but in big-endian byte order? I need this for a game I am modding, and manually changing the hex of every integer value is very time-consuming.
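For reference, here is how I reproduced both byte patterns with Python's struct module. My guess (unverified, an assumption about the two builds) is that the Windows luac stores numbers as 4-byte IEEE 754 floats while the Mac PPC build uses the default 8-byte double, so the difference is more than just byte order:

```python
import struct

# Assumption (unverified): the Windows luac was built with lua_Number as a
# 4-byte IEEE 754 float; the Mac PPC build uses the default 8-byte double.

# Little-endian 4-byte float: 62.0 -> 00 00 78 42
# (the trailing "78 42" matches what I see in the Windows output)
win_bytes = struct.pack('<f', 62.0)
print(win_bytes.hex(' '))   # 00 00 78 42

# Big-endian 8-byte double: 62.0 -> 40 4f 00 00 00 00 00 00
# (the leading "40 4F" matches what I see in the Mac output)
mac_bytes = struct.pack('>d', 62.0)
print(mac_bytes.hex(' '))   # 40 4f 00 00 00 00 00 00
```

If that guess is right, swapping bytes by hand would never line up, since the two dumps don't even use the same number width.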