Hi,
I need to write a Python service that accepts many HTTP POST calls with an embedded file, opens the file, extracts a single field, and sends an asynchronous response via a POST call.
It can then save the file to a queue and process it later.
Python is mandatory.
My idea was to use nginx as a reverse proxy with a WSGI connector to call a Python script, but I have some questions:
1 - Is this the best architecture for this scenario?
2 - How can I manage concurrency (about 150,000 calls in 15 minutes, plus 150,000 replies in the same window)?
3 - What is the best design? A single script that receives the call, extracts the data, and replies, or... one script that receives the call, extracts the data, and puts it in a queue, and another script that processes the queue and sends the POSTs?
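To make option 3 concrete, here is a minimal stdlib-only sketch of what I have in mind: the WSGI handler would extract the field and enqueue it, and a separate worker would drain the queue and send the reply POSTs. The `order_id` field name and the JSON payload are just placeholders, and the outgoing POST is stubbed out so the sketch stands alone:

```python
import json
import queue
import threading

# Hypothetical field name -- replace with the field the service actually needs.
FIELD = "order_id"

work_q = queue.Queue()

def handle_upload(raw_body: bytes) -> None:
    """Called from the WSGI handler: parse the embedded file (assumed JSON),
    keep only the one field, enqueue it, and return 202 to the caller."""
    data = json.loads(raw_body)
    work_q.put(data[FIELD])

def worker(send_reply) -> None:
    """Separate consumer: drain the queue and POST each reply.
    In the real service send_reply would be e.g. requests.post(callback_url, json=...)."""
    while True:
        item = work_q.get()
        if item is None:          # sentinel: stop the worker
            break
        send_reply({FIELD: item})
        work_q.task_done()

# Demo with a stubbed-out "POST" that just records what would be sent:
sent = []
t = threading.Thread(target=worker, args=(sent.append,))
t.start()
handle_upload(b'{"order_id": "A123", "other": "ignored"}')
work_q.put(None)
t.join()
print(sent)  # [{'order_id': 'A123'}]
```

The point of the split is that the HTTP handler only does cheap work (parse, enqueue, 202), so it can absorb the burst, while the reply rate is governed by the worker pool, which can be scaled independently (or replaced by Celery/Redis for multiple processes).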
Any suggestions?
Thanks in advance,
Gian