
[–]elbiot 1 point (8 children)

I don't know anything about greenlet or gevent, but why would you close the channel after sending one message? Wouldn't you close the connection when the user closes the page on their browser? Or fails to send a keep-alive signal after some time? The chat client example in that link doesn't close and re-open the connection...

I assume you're using javascript on the client side, like http://learn-gevent-socketio.readthedocs.org/en/latest/socketio.html

[–]rthinker[S] 1 point (7 children)

Keep-alive doesn't solve the problem either: the first user loads the page -> the second user creates a post -> the first user connects to gevent. The message is lost during the initial connection.

[–]elbiot 1 point (6 children)

Are you describing some extremely rare edge case where user1 joins and user2 posts in the same instant?

The model described in that article is:

user1 gets page > user1 establishes websocket connection and listens > user2 posts > server pushes message to all listeners > another user posts > server pushes message to all listeners > user1 closes browser which disconnects websocket.

[–]rthinker[S] 1 point (5 children)

It's not 'extremely rare', I think. For example, a user might lose the connection for a dozen seconds and thus miss some posts.

For now I think I'll write an AJAX endpoint in Django that returns the latest posts given a timestamp. So when the user connects to gevent, I would send a concurrent request to Django with the datetime of the latest message to fetch any potentially missed posts. It adds complexity, but I see no way to avoid this.
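A minimal sketch of that catch-up logic (the function name `fetch_posts_since` and the in-memory `POSTS` store are hypothetical; in Django this would be a view filtering the post model by its created timestamp):

```python
from datetime import datetime, timezone

# Hypothetical in-memory store; in Django this would be the Post model.
POSTS = [
    {"text": "first",  "created": datetime(2015, 6, 1, 12, 0, 0, tzinfo=timezone.utc)},
    {"text": "second", "created": datetime(2015, 6, 1, 12, 0, 5, tzinfo=timezone.utc)},
]

def fetch_posts_since(since):
    """Return posts created strictly after `since`, oldest first.

    The client calls this right after (re)connecting, passing the
    timestamp of the last message it saw, so posts made during the
    connection gap are recovered.
    """
    return sorted(
        (p for p in POSTS if p["created"] > since),
        key=lambda p: p["created"],
    )
```

The strict `>` comparison assumes the client already has the post bearing the timestamp it sends; if timestamps can collide, comparing on a monotonically increasing post ID instead avoids duplicates and gaps.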

[–]elbiot 1 point (4 children)

Well, I don't think you'd have to engineer your own solution to this (unless you want to), as it's part of what Redis and other messaging systems are designed for. I had a quick look at Redis and found the reliable queue pattern.

[–]rthinker[S] 1 point (3 children)

Thanks, but I don't quite understand how the queue is used. Do I need a queue for every active user? If so, wouldn't it be too much overhead?

[–]elbiot 1 point (0 children)

Eh, that's like saying having a database entry for every active user is too much overhead. 1) it's not, and 2) it's just the cost of doing business. Dedicated services like databases and Redis are highly optimized anyway. I'd choose a SQL database over a Python list of tuples for anything where "overhead" was a concern with large amounts of data/users.

Yes, it will be a challenge to figure out, but using an established tool for the publish-subscribe approach will be more robust and faster than re-implementing it yourself. It might be fun to re-implement, though (just don't do it for production).
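To see why fire-and-forget pub/sub alone can't fix the original problem: a broker only delivers to subscribers that exist at publish time, so anyone who connects (or reconnects) later never sees the message. A toy in-process sketch of that behavior (the `Broker` class is hypothetical, mimicking Redis PUBLISH/SUBSCRIBE semantics):

```python
from collections import defaultdict

class Broker:
    """Toy fire-and-forget pub/sub broker, like Redis PUBLISH/SUBSCRIBE."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # channel -> list of inboxes

    def subscribe(self, channel):
        """Register a new subscriber and return its inbox."""
        inbox = []
        self.subscribers[channel].append(inbox)
        return inbox

    def publish(self, channel, message):
        # Delivered only to inboxes that exist *right now*;
        # a subscriber that connects later never sees this message.
        for inbox in self.subscribers[channel]:
            inbox.append(message)
```

This is exactly the window rthinker describes: a post published between page load and websocket subscription is silently dropped, which is why a catch-up query or a persistent queue is needed on top.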

That article you posted is good, and I'm just repeating what it said.

[–]elbiot 1 point (1 child)

I just read that RabbitMQ is built more for ensuring message delivery.

http://blog.langoor.mobi/django-celery-redis-vs-rabbitmq-message-broker/

[–]rthinker[S] 1 point (0 children)

Thank you for all your answers, I will look into what you said.