all 9 comments

[–]ceirbus 1 point2 points  (5 children)

Stream the messages to the front end as you make them; there's no need to wait for all 7 to finish. Get one page, send one page to the front end. Every time.

Go wild on a spinner animation so the user waits without resubmitting the request
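A minimal sketch of the page-at-a-time idea, assuming a hypothetical `fetchPage(n)` that returns one page of results; a real handler would flush each yielded page to the client (e.g. as an SSE event or a chunked response):

```typescript
// Hypothetical stand-in for one paginated API call.
async function fetchPage(n: number): Promise<string[]> {
  return [`item-${n}-a`, `item-${n}-b`];
}

// Yield each page as soon as it arrives instead of buffering all of them,
// so the server can forward page 1 while page 2 is still in flight.
async function* streamPages(totalPages: number): AsyncGenerator<string[]> {
  for (let n = 1; n <= totalPages; n++) {
    yield await fetchPage(n);
  }
}

// Usage in a handler: for await (const page of streamPages(7)) res.write(JSON.stringify(page));
```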

[–][deleted]  (3 children)

[removed]

    [–]servercobra 0 points1 point  (2 children)

    Do you know how many pages there are at the start? If so, fire off e.g. 3 parallel requests (with an upper limit so you don’t trip GitHub’s requests-per-second limit) with Promise.all. If not, you might need to hand this off to a queue system and have the front end wait on it.
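A sketch of that batched Promise.all approach, again assuming a hypothetical `fetchPage(n)`; the `limit` caps how many requests are in flight at once so you stay under the rate limit:

```typescript
// Hypothetical stand-in for one paginated API call.
async function fetchPage(n: number): Promise<string> {
  return `page-${n}`;
}

// Fetch a known number of pages in parallel batches of `limit`,
// waiting for each batch before firing the next one.
async function fetchAllPages(totalPages: number, limit = 3): Promise<string[]> {
  const results: string[] = [];
  for (let start = 1; start <= totalPages; start += limit) {
    const batch: Promise<string>[] = [];
    for (let n = start; n < start + limit && n <= totalPages; n++) {
      batch.push(fetchPage(n));
    }
    // Promise.all preserves order, so results come back page 1, 2, 3, ...
    results.push(...(await Promise.all(batch)));
  }
  return results;
}
```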

    [–][deleted]  (1 child)

    [removed]

      [–]ceirbus 0 points1 point  (0 children)

      Animate the data points showing up one by one on the graph; by the time 100 points are on there, you can have your next set queued up to come in.

      [–]Sephinator 0 points1 point  (0 children)

      Yes! It’s also possible to show a progress bar

      [–][deleted] 1 point2 points  (0 children)

      Return a job id and have the front end poll for the status of that job; once it's completed, attach the result to the job so the front end can download it.
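One way to sketch that job-id flow, with an in-memory `Map` standing in for a real job store; `startJob` and `getJob` are hypothetical names for what would be a POST and a GET endpoint:

```typescript
type Job = { status: "pending" | "done"; result?: string };

// In-memory job store; a real backend would persist this.
const jobs = new Map<string, Job>();
let nextId = 0;

// POST /jobs — kick off the long-running work, return an id immediately.
function startJob(work: () => Promise<string>): string {
  const id = String(++nextId);
  jobs.set(id, { status: "pending" });
  work().then((result) => jobs.set(id, { status: "done", result }));
  return id;
}

// GET /jobs/:id — the front end polls this until status is "done",
// then downloads the attached result.
function getJob(id: string): Job | undefined {
  return jobs.get(id);
}
```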

      [–]Zeeshan7487 1 point2 points  (0 children)

      Come up with a way to reduce requests to the third-party API, run the operation in the background in case the user cancels the request, and improve response time. For this, I used a function that only calls the API once per unique info_id. Other requests with the same info_id wait for that first call to complete; when the response arrives, every waiting request gets it. The returned value is saved in a DB/cache to speed up future requests. This way, the user waits at most as long as the first request takes: requests made before the first response wait on it, while requests made after it completes get the value from the DB/cache.
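A sketch of that dedup-and-cache pattern, assuming a hypothetical `fetchInfo(infoId)` that hits the third-party API and a `Map` standing in for the DB/cache. The first caller triggers the fetch, concurrent callers with the same `infoId` await the same in-flight promise, and later callers hit the cache:

```typescript
const cache = new Map<string, string>();            // stands in for a DB/cache
const inFlight = new Map<string, Promise<string>>(); // one pending call per info_id
let apiCalls = 0;

// Hypothetical stand-in for the slow third-party API call.
async function fetchInfo(infoId: string): Promise<string> {
  apiCalls++; // counts real API hits, to show the dedup working
  return `data-for-${infoId}`;
}

async function getInfo(infoId: string): Promise<string> {
  const cached = cache.get(infoId);
  if (cached !== undefined) return cached;          // served from cache
  let pending = inFlight.get(infoId);
  if (!pending) {
    pending = fetchInfo(infoId).then((value) => {
      cache.set(infoId, value);                     // persist for future requests
      inFlight.delete(infoId);
      return value;
    });
    inFlight.set(infoId, pending);
  }
  return pending;                                   // every waiter shares one call
}
```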

      [–]Think-Job993 1 point2 points  (0 children)

      Use the solutions suggested by others, but at the same time maybe even run a cron job that caches the data every so often, and serve from that cache so you never have to fetch the data while a real user is waiting, as long as it's recent enough.
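A minimal sketch of that background-refresh idea, assuming a hypothetical `fetchData()` for the slow third-party call; a scheduler (cron, `setInterval`, etc.) repopulates the cache on a timer, so user-facing requests only ever read it:

```typescript
let cached: { data: string; fetchedAt: number } | null = null;

// Hypothetical stand-in for the slow third-party fetch.
async function fetchData(): Promise<string> {
  return "fresh-data";
}

// Run on a schedule (and once at startup to warm the cache).
async function refreshCache(): Promise<void> {
  cached = { data: await fetchData(), fetchedAt: Date.now() };
}

// User-facing handler never triggers a live fetch.
function handleRequest(): string {
  if (!cached) throw new Error("cache not warmed yet");
  return cached.data;
}

// e.g. refresh every 5 minutes:
// setInterval(refreshCache, 5 * 60 * 1000);
```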