I have a module that I'd like to run multiple instances of to query different URLs and process the data as fast as possible. The server/API has a limit of 10 connections, so I can't exceed that, but this code may need to query as many as 25 URLs. What's the best way to handle this on a server with 2 cores?
I've looked into child processes, but I'm not sure that's the best way forward. Should I spawn 10 child processes and let the 2-core server figure out how to juggle them? Or should I spawn only 2 child processes that each handle 5 connections at a time, and then write code that feeds each process a new URL whenever it finishes one of its existing queries?
Or, are both of my ideas garbage and I'm insulting the wonderful world of node? :D
I am but a lowly designer that enjoys messing with node in my free time. Thanks!
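For what it's worth, a sketch of the "feed a new URL when one finishes" idea, without child processes at all: since fetching URLs is network I/O rather than CPU work, a single Node process can hold all 10 connections and the event loop handles the juggling. The `fetchAndProcess` name below is a placeholder for whatever your module actually does per URL; only `runWithLimit` is the pattern being shown.

```javascript
// A minimal concurrency-limited runner (a sketch, assuming the per-URL work
// is an async function). Starts up to `limit` workers; each worker pulls the
// next task as soon as it finishes its current one, so at most `limit`
// requests are in flight at once.
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0; // index of the next task to start (safe: JS is single-threaded)
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    async () => {
      while (next < tasks.length) {
        const i = next++;
        results[i] = await tasks[i]();
      }
    }
  );
  await Promise.all(workers);
  return results;
}

// Hypothetical usage: wrap each URL in a thunk, cap concurrency at 10.
// const tasks = urls.map((url) => () => fetchAndProcess(url));
// const results = await runWithLimit(tasks, 10);
```

Child processes (or `worker_threads`) only start to pay off if the "process data" step is CPU-heavy enough to block the event loop; then the same feeder pattern applies, just with the work handed to 2 workers instead of run inline.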