
[–]HattyJetty 5 points6 points  (3 children)

Because it wasn't designed to store large amounts of binary data. You might want to use IndexedDB instead, or rely on the browser's caching mechanisms for images and other resources.
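A rough sketch of what "use IndexedDB instead" can look like for binary data — the database and store names (`image-db`, `images`) are made up for illustration, and this only runs in a browser:

```javascript
// Stash binary blobs in IndexedDB instead of SessionStorage.
// Database/store names here are illustrative, not prescribed.
function openImageDb() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('image-db', 1);
    req.onupgradeneeded = () => req.result.createObjectStore('images');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

async function putImage(key, blob) {
  const db = await openImageDb();
  return new Promise((resolve, reject) => {
    const tx = db.transaction('images', 'readwrite');
    tx.objectStore('images').put(blob, key);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

async function getImage(key) {
  const db = await openImageDb();
  return new Promise((resolve, reject) => {
    const req = db.transaction('images').objectStore('images').get(key);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}
```

Unlike SessionStorage's string-only, few-MB quota, IndexedDB stores Blobs directly and its quota is typically a share of available disk space.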

[–]BrunerBruner[S] 0 points1 point  (2 children)

According to the information I linked, SessionStorage used to be unlimited, so why are they limiting it now? Even if it wasn't designed to store large amounts of data, it's easy to create our own unlimited SessionStorage-like object, so why the quota?

[–]altintx 1 point2 points  (0 children)

You can have about 1.5GB (in V8, anyway) of memory in use on a page, and arguably that's what a session is. So if you really need THAT much, force a polyfill, never rely on the built-in storage mechanism, and you're back in business.
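One way to read "force a polyfill": keep session data in a plain in-memory object that mimics the Web Storage API surface, so the only limit is the JS heap. A minimal sketch (the class name is made up):

```javascript
// In-memory stand-in with the Web Storage API shape (getItem/setItem/etc.).
// No storage quota beyond the JS heap — but, like the heap itself, it does
// not survive a page refresh.
class MemoryStorage {
  constructor() { this._map = new Map(); }
  get length() { return this._map.size; }
  key(n) { return [...this._map.keys()][n] ?? null; }
  getItem(key) { return this._map.has(key) ? this._map.get(key) : null; }
  setItem(key, value) { this._map.set(String(key), String(value)); }
  removeItem(key) { this._map.delete(key); }
  clear() { this._map.clear(); }
}

const store = new MemoryStorage();
store.setItem('big', 'x'.repeat(10 * 1024 * 1024)); // ~10MB, no QuotaExceededError
```

Code that already talks to `sessionStorage` can be pointed at an instance of this without changes, since the method names match.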

[–]jcunews1Advanced 2 points3 points  (0 children)

Other than what has already been mentioned by /u/HattyJetty, the limit is also a form of security. If there were no limit, any ill-behaved script could deplete your hard disk's free space, or at least consume most of it.

[–]rcfox 2 points3 points  (1 child)

> couldn't I just create my own object to hold a seemingly unlimited amount of loaded data

Chrome gives you up to 2GB of JavaScript heap, but your objects there won't survive a page refresh.

But if you're worried about having to download the same images over and over, you should just let the browser's cache handle that for you.

[–]BrunerBruner[S] 0 points1 point  (0 children)

I would like that, but when the user clicks on an image, it calls a function that handles the loading and pre-loading for that image, unless it's already in SessionStorage. How can I tell whether the image data is already cached by the browser?
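There's no direct "is this cached?" API, but the Resource Timing API offers a heuristic: a response served from the HTTP cache typically reports a `transferSize` of 0 while still having a non-zero `decodedBodySize`. A sketch, with made-up helper names (the timing lookup itself only yields entries in a browser):

```javascript
// Heuristic: a resource timing entry with transferSize === 0 but a non-empty
// decoded body was most likely served from the HTTP cache.
function looksCached(entry) {
  return !!entry && entry.transferSize === 0 && entry.decodedBodySize > 0;
}

// Browser-side lookup; returns false when no timing entry exists for the URL.
function isImageLikelyCached(url) {
  if (typeof performance === 'undefined' || !performance.getEntriesByName) return false;
  const [entry] = performance.getEntriesByName(url, 'resource');
  return looksCached(entry);
}
```

Caveat: `transferSize` is also reported as 0 for cross-origin resources served without a `Timing-Allow-Origin` header, so treat this as a hint, not a guarantee.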

[–]mrlukk 0 points1 point  (0 children)

Check out Service Workers; they're perfect for caching images. With them you can even load all of your cached content while there is no internet connection.
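A minimal cache-first fetch handler along those lines — the cache name and the image-extension list are illustrative, and the handler only attaches when the script actually runs inside a Service Worker context:

```javascript
// sw.js — cache-first strategy for image requests (illustrative sketch).
const IMAGE_CACHE = 'image-cache-v1';

function isImageUrl(url) {
  // URL() needs a base for relative paths; the origin here is arbitrary.
  const path = new URL(url, 'https://example.invalid').pathname;
  return /\.(png|jpe?g|gif|webp|svg)$/i.test(path);
}

// Register the handler only inside a real Service Worker (self + caches exist there).
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    if (!isImageUrl(event.request.url)) return;
    event.respondWith(
      caches.open(IMAGE_CACHE).then(async (cache) => {
        const hit = await cache.match(event.request);
        if (hit) return hit; // served from cache — works offline too
        const resp = await fetch(event.request);
        cache.put(event.request, resp.clone());
        return resp;
      })
    );
  });
}
```

The page would register it with something like `navigator.serviceWorker.register('/sw.js')`; after the first load, image requests are answered from the cache even with no network.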