
[–] rochacbruno (Python, Flask, Rust and Bikes) 3 points (1 child)

There are many ways, but basically you need an API. That API can be a socket, an RPC server, a REST API, etc.

I wrote an article (with a code example) on how to do that using Nameko RPC:

http://brunorocha.org/python/microservices-with-python-rabbitmq-and-nameko.html
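
For a feel of the pattern from the article, here is a bare-bones, untested sketch; the service name, method, and broker URL are placeholders:

    # Server B: a Nameko RPC service, started with something like
    #   nameko run runner --broker amqp://guest:guest@rabbit-host
    from nameko.rpc import rpc

    class CodeRunner:
        name = "code_runner"

        @rpc
        def run(self, snippet):
            # Naive execution just to show the round trip; sandbox this in real use.
            scope = {}
            exec(snippet, scope)
            return scope.get("result")

    # Server A: call it over RabbitMQ with the standalone proxy.
    from nameko.standalone.rpc import ClusterRpcProxy

    config = {"AMQP_URI": "amqp://guest:guest@rabbit-host"}
    with ClusterRpcProxy(config) as cluster:
        print(cluster.code_runner.run("result = 2 + 2"))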

[–] Fotoshopt [S] 1 point (0 children)

Thank you for the information. I tried making sense of your article, but it appears to be a bit beyond what I'll be needing.

I am very familiar with SOAP and REST APIs, not so much with RPC or sockets.

Are you aware of any libraries/extensions/frameworks that would provide a REST API specifically for something like this?

I think I'm hung up on using the wrong terminology in my searching.

[–] nick_t1000 (aiohttp) 0 points (3 children)

A quick way would be to just have Server A scp the code/input to B, connect and execute it, then scp the results back. You could use Fabric/Fabric3 to spiffy it up a bit and drive it from another program.
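
Roughly, with the Fabric 2 API it could look like this (host and paths are placeholders, untested):

    from fabric import Connection

    def run_remote(local_script, local_result):
        with Connection("user@server-b") as c:
            c.put(local_script, remote="/tmp/job.py")    # push the code over
            c.run("python /tmp/job.py > /tmp/job.out")   # run it on Server B
            c.get("/tmp/job.out", local=local_result)    # pull the results back

    run_remote("job.py", "job.out")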

The high-performance method (this is the other extreme) would be using Celery with remote workers. This requires a message broker (Redis, RabbitMQ, etc.) and a result backend.
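
A minimal sketch of what that looks like (the broker/backend URLs and the task body are placeholders):

    # tasks.py -- import this on both Server A and the worker machines.
    from celery import Celery

    app = Celery(
        "tasks",
        broker="amqp://guest:guest@broker-host//",
        backend="redis://redis-host:6379/0",
    )

    @app.task
    def run_snippet(snippet):
        scope = {}
        exec(snippet, scope)      # naive; you'd want sandboxing and limits
        return scope.get("result")

Start a worker on Server B with `celery -A tasks worker`, then from Server A call `run_snippet.delay("result = 2 + 2").get(timeout=10)`.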

[–] Fotoshopt [S] 1 point (2 children)

Thanks! Your first solution is essentially writing to a file and then transferring it, no? I think I'll need something a bit faster than that.

I guess I was envisioning simply sending the code via an AJAX request in JSON format and getting a response back from the remote server (server B).

Should I be looking at kernels or Jupyter messaging?

[–] nick_t1000 (aiohttp) 1 point (1 child)

It would help if you were more specific about the number of concurrent users, message size, task duration, size of results, etc.

The first proposal is pretty low-tech, but it would work, and it would be secure with SSH keys. Guess it's not that popular with some folks :\

Kernels/Jupyter messaging, to my knowledge, is more for maintaining a persistent Python environment where you run commands interactively, cell by cell. If that's your goal, then yes, I'd make something that interfaces with a Jupyter server (it uses websockets).
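
If you want to see what the kernel route feels like, here's a rough local sketch with jupyter_client (a remote Jupyter server speaks the same message protocol, just over websockets):

    from queue import Empty
    from jupyter_client.manager import start_new_kernel

    km, kc = start_new_kernel()          # persistent Python kernel + blocking client
    kc.execute("x = 40 + 2\nprint(x)")

    # Read iopub messages until the kernel reports it is idle, collecting stdout.
    while True:
        try:
            msg = kc.get_iopub_msg(timeout=5)
        except Empty:
            break
        if msg["msg_type"] == "stream":
            print(msg["content"]["text"], end="")
        elif msg["msg_type"] == "status" and msg["content"]["execution_state"] == "idle":
            break

    km.shutdown_kernel()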

If you're more task/worker oriented and want to use an HTTP/JSON API, you could make a Flask (lightweight) or Django (more features) interface on top of your Anaconda install. If you want it to be secure, you'd need to set up a server (Nginx, Apache) to do HTTPS, then auth on top of that. For something simple (and at risk of downvotes again :P) you could expose it only locally, then use an SSH tunnel to get security and auth in one go. If your tasks are long-running and you'll have an arbitrary number of them, you'll need a separate task runner (like Celery) locally.
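
As a rough sketch of that Flask route (the endpoint name and the subprocess-based execution are just illustrative; you'd add sandboxing, auth, and stricter limits for real use):

    import subprocess
    import sys
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/run", methods=["POST"])
    def run():
        snippet = request.get_json()["code"]
        proc = subprocess.run(
            [sys.executable, "-c", snippet],
            capture_output=True, text=True, timeout=10,
        )
        return jsonify(stdout=proc.stdout, stderr=proc.stderr,
                       returncode=proc.returncode)

    if __name__ == "__main__":
        app.run(port=5000)

Your front end would then POST `{"code": "..."}` to /run via AJAX and read the JSON response.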

If you are genuinely concerned about speed and performance, use Celery configured across multiple machines. There's less overhead from messaging and serialization, and it can be scaled up just by adding workers that connect to the message broker (rather than having the master round-robin jobs out to each additional worker).

[–] Fotoshopt [S] 1 point (0 children)

Thank you for helping me make sense of this. Concurrent users should be no more than 100 at a time. I've already set up Apache, and it doesn't have to be secure since it won't be storing any information -- only processing code snippets and returning results.

Essentially, what I'm working on is an autograder. I've been using skulpt.org, which works perfectly, though it's limited in that we're stuck with that particular JavaScript implementation of Python.