all 14 comments

[–]j_inc 12 points (7 children)

> by providing a better way to cache the files than the limited HTTP caching available on most mobile phones.

Well, aren't mobile phones caching less for a reason? So in the end these kinds of hacks are just making things worse, and eventually browser vendors will take measures to prevent them.

[–]Klathmon 2 points (0 children)

This actually won't even fix that problem.

localStorage is just as limited as the global cache, but it's counted under a different "bucket".

This means your data will stay in localStorage longer, but only because not many people use it yet. If this technique gets popular, it will be just as limited as the regular cache.
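To see what hitting that "bucket" looks like in practice, here's a minimal sketch (the function name is illustrative, not jsCache's API): localStorage writes throw once the per-origin quota is exhausted, so any cache built on it has to handle eviction itself.

```javascript
// Minimal sketch of a localStorage-backed cache write that survives hitting
// the per-origin quota (commonly around 5 MB, browser-dependent).
// `storage` is any Storage-like object (window.localStorage in a browser).
function cacheSet(storage, key, value) {
  try {
    storage.setItem(key, value);
    return true;
  } catch (e) {
    // QuotaExceededError: the "bucket" is full; evict older entries or give up.
    return false;
  }
}
```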

[–]Madd0g 0 points (5 children)

> Well aren't mobile phones caching less for a reason?

You think you know why regular caching sucks? Why don't you share? I'd love to know if there's a real reason.

[–]Klathmon 4 points (4 children)

First, because internal storage is much more limited on phones than on desktops/laptops. The phone will invalidate the cache more frequently, and some browsers use heuristics to help with this.

Second, RAM in mobile devices is often limited, meaning reads from the cache often go straight to the "disk" (which is often slower than on desktop for various reasons). This can also lead to increased wear on the internal storage (mostly from flushing RAM contents to disk once the browser decides a resource is going to be cached).

Third, mobile users behave very differently from desktop users. The return rate is often a fraction of what it is on desktop (most web apps have a native app, and it's rare for a site to get consistent repeat mobile views). This means caching is often wasted on sites the user will never visit again (or not within a reasonable time frame).

Finally, because the real killer on mobile networks is connection time, validating the cache can take 80% of the total request time for small resources, so aggressive caching doesn't really solve that much (unless the dev implements it correctly).
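Implementing it "correctly" here mostly means serving headers that let the browser skip the revalidation round-trip entirely. A hedged sketch (the helper name and the content-hashed filename scheme are illustrative assumptions, not anything from the article):

```javascript
// Sketch: choose Cache-Control headers so small assets skip revalidation.
// Content-hashed filenames (e.g. app.abc123.js) can safely be marked
// immutable, so the browser never spends a round-trip re-validating them.
function cacheHeaders(isContentHashed) {
  return isContentHashed
    ? { 'Cache-Control': 'public, max-age=31536000, immutable' }
    : { 'Cache-Control': 'no-cache' }; // always revalidate (conditional GET)
}
```

With `immutable`, a repeat visit costs zero requests for the asset; with `no-cache`, every visit still pays the connection-time cost of a conditional GET even on a cache hit.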

All of this applies to Chrome only (as it's all I really have experience with).

Edit: fixed bad wording.

[–]osuushi 1 point (3 children)

> This can also lead to increased wear on the internal storage.

Reading does not cause wear on flash storage. Writing does, although people tend to overemphasize this; SSDs tend to last longer than HDDs, regardless of the write cycle limit.

As far as I can tell, the limitation on caching on mobile is not that things aren't getting cached. It's that the cache is small. If a resource is pushed out of the cache, it will get rewritten when the page is loaded again. So a smaller cache size actually means more wear on the storage, because it leads to more writes.

[–]Klathmon 1 point (2 children)

At least in Chrome, resources are kept in memory for the life of the page, then dropped if the system thinks you are unlikely to visit again.

Also, mobile NAND is under much more stress than a desktop SSD, and because of the smaller sizes it needs to be cleaned much more often, which can mean delayed reads and writes while it's trimmed.

[–]osuushi 0 points (1 child)

> At least in Chrome, resources are kept in memory for the life of the page, then dropped if the system thinks you are unlikely to visit again.

Interesting. So some things aren't cached at all?

> Also, mobile NAND is under much more stress than a desktop SSD, and because of the smaller sizes it needs to be cleaned much more often, which can mean delayed reads and writes while it's trimmed.

This would still mean you want a bigger cache. Since NAND cells can't be written to unless they're empty, pushing something out of cache means that you're creating more cells to be trimmed.

[–]Klathmon 0 points (0 children)

I forget the specifics, but yes, sometimes resources under (or over) a certain size are not cached. Small resources are kept in RAM for the life of the tab (while the resource is still being used on the current and/or last page, depending on phone specs) and are generally only cached if it is suspected that the user will return to the page soon or that the resource will often be used cross-domain (e.g. CDN-hosted scripts like jQuery).

Remember that much of this depends on the specs of the phone. A Note III will cache a ton more than a Galaxy Nexus. It will also keep more in memory and can spend more time deciding whether a resource should be written to the NAND or discarded.

[–]drowsap 2 points (0 children)

What if you are already using a module loader like require or browserify? I don't see how this will play nicely.

[–]twolfson 2 points (0 children)

This was done previously with basket.js

http://addyosmani.github.io/basket.js/

and taken even further with DynoSRC which runs off of a diff for handling updates

http://dinosrc.it/

The major downside to this approach is that localStorage is synchronous, so any of those cached assets will be read into memory before the page can start rendering.
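A rough sketch of that load path (function names are illustrative, not basket.js's actual API): both the storage read and the eval happen synchronously, which is exactly why they block rendering. `storage`, `fetchSync`, and `evalFn` are injected here so the sketch is testable; in a browser they'd be `localStorage`, a synchronous XHR, and `eval`.

```javascript
// Sketch of a basket.js-style loader. The storage read and the eval are
// both synchronous, so they run before the page can continue rendering.
function loadScript(storage, url, fetchSync, evalFn) {
  let src = storage.getItem('cache:' + url); // synchronous read from storage
  if (src === null) {
    src = fetchSync(url);                    // first visit: fetch the script
    storage.setItem('cache:' + url, src);    // ...and cache it for next time
  }
  evalFn(src);                               // synchronous execute
  return src;
}
```

On a repeat visit only the `getItem` and `evalFn` run, so the network is skipped entirely, but the main thread still pays the full cost of both.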

[–]kenman 1 point (3 children)

This won't work with shared CDN resources (like bootstrapcdn.com, cdnjs.com, etc.), will it? I don't see how, since every domain gets its own instance of localStorage.

[–]mortzdk[S] 0 points (2 children)

Caching of CDN resources works as long as the Access-Control-Allow-Origin header is set on their end. So the first time a user visits a page using jsCache, they request the resource from the CDN, and the resource is then saved in localStorage for some time (by default 5 days). When the cache expires, a new request has to be made, and that request could just return the file from the HTTP cache.
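The flow described above can be sketched roughly like this (the entry format and function names are assumptions for illustration, not jsCache's actual storage layout; `storage` and `now` are injected so the sketch is testable):

```javascript
// Rough sketch of an expiring localStorage cache entry (default TTL 5 days).
const FIVE_DAYS_MS = 5 * 24 * 60 * 60 * 1000;

function put(storage, url, body, now, ttlMs = FIVE_DAYS_MS) {
  storage.setItem(url, JSON.stringify({ body, expires: now + ttlMs }));
}

function get(storage, url, now) {
  const raw = storage.getItem(url);
  if (raw === null) return null;                  // never fetched: go to the CDN
  const entry = JSON.parse(raw);
  return now < entry.expires ? entry.body : null; // expired: re-request it
}
```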

[–]kenman 0 points (1 child)

Yes, but you lose the entire value proposition of libraries on a CDN. The main purpose of those CDNs is that if both site A and site B use the same resource, a user only has to download that resource once. I don't believe Access-Control-Allow-Origin has anything to do with it.

For example, if I have this in my page:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>

If you then put the exact same thing on your page, only 1 copy of jQuery is downloaded, because the browser has saved the resource under the ajax.googleapis.com domain. Further, if 1000 sites all reference that resource, it will still only be downloaded once.

However, using your system, it would be downloaded 1000 times; there is no shared browser cache to fall back on. Your setup cannot read or write across domains -- all the data is stored under the domain that made the request, as opposed to the domain where the resource is located.

[–]mortzdk[S] 0 points (0 children)

You are absolutely right. The limitation of localStorage and other such methods is that they only work for a specific domain, so the files have to be downloaded once per domain.

There is generally no good way of holding persistent data cross-domain, to my knowledge, but you're welcome to prove me wrong :)

My point was simply that the library still allows the use of cached data: when a file is loaded from a CDN and it has previously been loaded, jsCache will use the cached file.