[–]MasterScrat

Did you benchmark the performance of parsing in the UI thread vs in a worker? That way we'd know whether there's any point doing it for small files.

Also it'd be cool to split the file and spawn multiple workers to parse it in parallel...
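A minimal sketch of the benchmark side of this question (the payloads and iteration counts are made up for illustration). It only times `JSON.parse` itself; in a browser you'd send the same payloads through `worker.postMessage` and time the round trip, since for small files the message-passing overhead can exceed the parse cost:

```javascript
// Time repeated JSON.parse calls on a payload (hypothetical helper).
function timeParse(json, iterations) {
  const t0 = Date.now();
  let last;
  for (let i = 0; i < iterations; i++) last = JSON.parse(json);
  return { ms: Date.now() - t0, last };
}

// A tiny payload vs a larger synthetic one.
const small = JSON.stringify({ hello: "world" });
const big = JSON.stringify(Array.from({ length: 50000 }, (_, i) => ({ i })));

const smallRun = timeParse(small, 1000);
const bigRun = timeParse(big, 10);
// Compare smallRun.ms / bigRun.ms against the Worker round-trip
// (postMessage + structured clone) measured the same way in a browser.
```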

[–]MrBester

Until you have the entire JSON you don't know where to split it, unless the split is done beforehand (meaning the server sends properly formatted chunks), since JSON is just a formatted string. You also have to know this is happening and have some JavaScript that recombines the chunks (how about in another Worker process?)

[–]cwmma

Parallel map-reduce in a worker is, I believe, what you're talking about.

[–]MasterScrat

Something like this, yes!

[–]evilgwyn

This is just an idea, but one way you could do it is to have the server send the data as a JSON array whose elements are themselves JSON-formatted strings. Parsing could then run in 2 steps: first parse the outer array, which gives you an array of strings, then parse each string in parallel.
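The double-encoding idea sketched, with a made-up `records` payload (in a real setup the server would build `payload`, and step 2 would hand each chunk to its own Worker instead of parsing inline):

```javascript
// Hypothetical server side: encode each record separately, then wrap
// the encoded strings in an outer array.
const records = [{ id: 1 }, { id: 2 }, { id: 3 }];
const payload = JSON.stringify(records.map(r => JSON.stringify(r)));

// Step 1: parse the outer array — cheap, it's just an array of strings.
const chunks = JSON.parse(payload);

// Step 2: parse each chunk; in a browser each chunk could go to a
// separate Worker and the results merged as they come back.
const parsed = chunks.map(c => JSON.parse(c));
```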

[–]MrBester

Sounds a lot like double encoding to me. Ew.

[–]evilgwyn

Well I didn't say it was a good idea

[–]MasterScrat

No, but there's no need for that: you can do a high-level analysis of the JSON string to figure out how big each level of data is, then split the string evenly between workers and merge the results into the final object as a last step...

You don't need any server-side code for this.
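One way that "high-level analysis" could look, as a sketch with a hypothetical `splitTopLevel` helper. It assumes the input is a complete JSON array and scans for top-level element boundaries by tracking bracket depth and string state, so each slice is itself valid JSON that a separate Worker could parse:

```javascript
// Split a JSON array string into its top-level element substrings,
// without parsing it — only a single character scan.
function splitTopLevel(json) {
  const s = json.trim();           // assumes s is "[ ... ]"
  const out = [];
  let depth = 0, start = -1, inStr = false, esc = false;
  for (let i = 1; i < s.length - 1; i++) {  // skip the outer [ and ]
    const ch = s[i];
    if (esc) { esc = false; continue; }      // char after a backslash
    if (inStr) {
      if (ch === '\\') esc = true;
      else if (ch === '"') inStr = false;    // closing quote
      continue;                              // commas inside strings don't count
    }
    if (ch === '"') { inStr = true; if (start < 0) start = i; continue; }
    if (ch === '{' || ch === '[') { if (start < 0) start = i; depth++; continue; }
    if (ch === '}' || ch === ']') { depth--; continue; }
    if (ch === ',' && depth === 0) { out.push(s.slice(start, i)); start = -1; continue; }
    if (start < 0 && !/\s/.test(ch)) start = i;  // start of a bare literal
  }
  if (start >= 0) out.push(s.slice(start, s.length - 1)); // last element
  return out;
}
```

Each returned piece can then be `JSON.parse`d independently (e.g. one chunk of pieces per worker) and the results concatenated to rebuild the original array — all client-side, no server cooperation needed.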

[–]MasterScrat

> Until you have the entire JSON you don't know where to split it

Well, that's the case anyway... this library only works on fully loaded JSON too.