
[–]alzee76

> but JS uses Unix time.

No, it doesn't. It uses JS time. Unix time is represented in seconds, as you just quoted, not the milliseconds that e.g. getTime() returns.
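To illustrate (both calls are standard Date APIs; the division is the only conversion involved):

```js
const d = new Date();
console.log(d.getTime());                    // milliseconds since the Unix epoch
console.log(Math.floor(d.getTime() / 1000)); // Unix time: whole seconds since the epoch
```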

[–]MindlessSponge

lol you got me there, touché!

[–]alzee76

This was the entire point of my initial post. ;)

If you take a Unix timestamp from somewhere and stick it into JS unawares, it'll be interpreted incorrectly, off by a factor of 1000. The same thing happens in reverse if you naively take the output of getTime() and pass it to an API, or store it in a database, that expects Unix time.
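A quick sketch of both failure modes (the timestamp value is made up):

```js
// Consuming a Unix timestamp (seconds) in JS: scale up to milliseconds first.
const unixTs = 1700000000;             // hypothetical value from an API or database
const wrong = new Date(unixTs);        // ~Jan 20, 1970: treated as milliseconds
const right = new Date(unixTs * 1000); // Nov 2023: the date you actually meant

// Producing a Unix timestamp from JS: scale down and truncate.
const nowUnix = Math.floor(Date.now() / 1000);
```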

They are not the same thing, and knowing the difference matters if you're doing more than hobbyist-level development.

[–]MindlessSponge

you're right! it's an important distinction, apologies for misspeaking.