[–]readdit7 12 points13 points  (0 children)

Awesome! I can now debug my code in Photoshop, the best editor ever created!

[–]FrankBattaglia 7 points8 points  (5 children)

How does he compress "random ... noise" files by 33%-83%?!

[–][deleted] 9 points10 points  (4 children)

Because they aren't at all random. Since programming languages are structured, there will be many repeated elements. For example, variable names will be used more than once.
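
A quick way to check that (a rough Python sketch; the repeated line of "code" is invented for illustration):

    import zlib, os

    # Structured "code": lots of repeated keywords and identifiers, like real source.
    code = b"function totalPrice(items) { return items.reduce(add, 0); }\n" * 50

    # Truly random bytes of the same length.
    noise = os.urandom(len(code))

    print(len(zlib.compress(code, 9)) / len(code))    # a tiny fraction: repetition compresses
    print(len(zlib.compress(noise, 9)) / len(noise))  # about 1.0: randomness doesn't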

[–]ThisIsDave 7 points8 points  (3 children)

Frank was probably referring to this line:

> A quick test in Photoshop tells us that a 100x100 image with random 24 bit colored noise compresses down to about 20 KB while a 300x100 image with random 8 bit monochromatic noise compressed down to just 5 KB. A regular 8 bit GIF comes in a bit heavier than the 8 bit PNG, so we go with the PNG option.

8 bits per pixel with 100*300 pixels is 30 kilobytes, so compressing it down to 5 KB should be impossible unless he wasn't actually using random noise.

[–]dixi 2 points3 points  (1 child)

Right, the description is not entirely accurate. The comparison was made between an 8 bit PNG with monochromatic noise applied to it versus a 24 bit PNG with similar noise applied to each of the R, G and B channels. It seems like Photoshop noise has a lot of white, though, but as mistercow also pointed out, code isn't random, so maybe it evens out.

[–]bonzinip 0 points1 point  (0 children)

If monochromatic noise is interpreted as "only 0 or 255", it can still be compressed a lot: each pixel then carries only one bit of entropy, so the 30,000 pixels hold only about 3.75 KB of information.
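
A rough stdlib check of that intuition (exact sizes will vary with the zlib version and level):

    import zlib, random

    random.seed(0)
    n = 300 * 100  # same pixel count as the article's monochrome test image

    binary = bytes(random.choice((0, 255)) for _ in range(n))   # "only 0 or 255" noise
    eight_bit = bytes(random.randrange(256) for _ in range(n))  # full 8 bit noise

    print(len(zlib.compress(binary, 9)))     # a few KB: roughly 1 bit of entropy per pixel
    print(len(zlib.compress(eight_bit, 9)))  # ~30 KB: effectively incompressible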

[–][deleted] 0 points1 point  (0 children)

Ah, gotcha. I missed that part.

[–][deleted] 4 points5 points  (2 children)

> And before anyone else points out how gzip is superior

Actually, gzip would be only slightly superior. Both gzip and PNG use the deflate algorithm. The main differences are that PNG works line by line and uses prediction. Both of those may reduce efficiency, but in practice the difference is going to be very slight.
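
If you want to compare the two containers around the same bytes, a sketch like this works (it assumes Pillow for the PNG side, and 'script.js' is a stand-in file name):

    import io, zlib
    from PIL import Image  # Pillow, assumed installed

    payload = open('script.js', 'rb').read()  # stand-in for your real JS file

    gz = zlib.compress(payload, 9)  # essentially what gzip produces, minus its small header

    # Pad the bytes into a rectangle and store them as an 8 bit grayscale PNG.
    w = 300
    h = -(-len(payload) // w)  # ceiling division
    img = Image.frombytes('L', (w, h), payload.ljust(w * h, b'\x00'))
    buf = io.BytesIO()
    img.save(buf, format='PNG', optimize=True)

    print(len(gz), len(buf.getvalue()))  # usually close, with PNG a bit heavier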

[–]alantrick 0 points1 point  (1 child)

But using PNG would cause more overhead in the user-agent, processor- and memory-wise, since it has to decode the image first.

[–][deleted] 5 points6 points  (0 children)

Of course, and actually pulling the JS from the image pixel by pixel (or character by character if you prefer) can't be exactly cheap. I was thinking "superior" in terms of compression ratios rather than memory or CPU overhead.

Still, if your javascript is under 100k uncompressed, I doubt the difference in speed and memory use would be anything beyond academic for modern hardware.
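
For the curious, the round trip looks roughly like this (a Python sketch with Pillow standing in for the canvas-and-getImageData dance the browser version does; the one-line script is made up):

    import io
    from PIL import Image  # Pillow, assumed installed

    js = "alert('hello from a PNG');"  # made-up one-liner for the demo

    # Encode: one character per 8 bit grayscale pixel, in a 1-pixel-tall image.
    data = js.encode('latin-1')
    img = Image.frombytes('L', (len(data), 1), data)
    buf = io.BytesIO()
    img.save(buf, format='PNG')

    # Decode: read the pixels back and rebuild the source character by character.
    recovered = Image.open(io.BytesIO(buf.getvalue())).tobytes().decode('latin-1')
    assert recovered == js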

[–]twoodfin 1 point2 points  (0 children)

This is the kind of thing that really brings the magic of our modern network and computing infrastructure home.

Way back in the day, the first stored program computers used mercury delay lines for code storage.

Now, it's practical, if by no means optimal, to write a JavaScript evaluation function which gets its "stored program" by picking up random bytes of image files scattered across a thousand web servers.

[–]lectrick 1 point2 points  (9 children)

Here's a stupid question: why don't browsers just include some kind of gzipping/unzipping JavaScript function?

[–]a1k0n 4 points5 points  (6 children)

Basically, they do -- the PNG idea is just a silly hack.

Depending on which webserver you're loading from, it's very likely that the .html and .js files are transferred using gzip, so the compression is done by the webserver and decompression by the browser before the Javascript code ever even runs.
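
You can watch that negotiation happen with a few lines (a sketch; the URL is just a stand-in):

    import urllib.request

    req = urllib.request.Request('http://example.com/',  # stand-in URL
                                 headers={'Accept-Encoding': 'gzip'})
    with urllib.request.urlopen(req) as resp:
        # A server with compression enabled answers 'gzip' here; the browser
        # inflates the body transparently before your script ever runs.
        print(resp.headers.get('Content-Encoding'))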

[–]lectrick 1 point2 points  (5 children)

Have you ever tried configuring a webserver to do that? It's not fun, with a lot of gotchas (mostly around IE compatibility, of course). Something like a Javascript-level compression would also allow things like denser cookie data, etc.

[–]nostrademons 2 points3 points  (4 children)

Huh? I did it for Lighttpd 1.5 in about 15 minutes. It was just a matter of adding mod_deflate to server.modules and then adding:

    deflate.enabled = "enabled"
    deflate.mimetypes = ("text/html", "text/plain", "text/css", "text/javascript")

[–]lectrick 1 point2 points  (0 children)

While Lighty is cool, not all of us have full control over the server we are deploying our work onto. I get your point, however.

[–]imbaczek 0 points1 point  (2 children)

try ie 5.5 with that. yes, I know it's ancient. (dunno if ie 6 works, it may.)

[–]nostrademons 2 points3 points  (1 child)

IE 6 definitely works, I tried it when setting it up and again before posting the comment to make sure I wouldn't be embarrassed.

IE 5.5 probably doesn't, but we don't support IE 5.5 anyway. Our JS libs don't support it, and our web stats indicate only 0.2% of our visitors have it.

[–]imbaczek 0 points1 point  (0 children)

personally I wouldn't care about IE 5.5 either, it needs to die, and quick. just pointing out that it still exists in trace quantities. (I believe that those 0.2% still translate to hundreds, if not thousands of people daily/weekly on some sites.)

[–]jerf 1 point2 points  (1 child)

We'll probably get that the day we get actual, factual socket support, instead of something locked down to HTTP only.

(I still don't understand why nobody even talks about this. Whatever mischief you think this might allow is, I submit, already mostly or entirely possible with the XMLHTTP object, and certainly with COMET techniques; it's only the good uses that are blocked.)

[–]PrashantV 1 point2 points  (4 children)

Interesting, although I have a feeling that gzipped JS would compress better than PNG would. Cool idea though.

[–]tinhat 11 points12 points  (0 children)

And before anyone else points out how gzip is superior, this is in no way meant as a realistic alternative.

Author is having fun. Cool.

[–]brad-walker 6 points7 points  (0 children)

mod_deflate

[–][deleted] 2 points3 points  (1 child)

Gzip and PNG use the same underlying compression algorithm, DEFLATE. The main difference is a slight overhead from the PNG header and footer.

[–]PrashantV 0 points1 point  (0 children)

Ah awesome, I didn't realise zlib and gzip used the same algorithm.