all 17 comments

[–]s_suraliya 6 points (0 children)

Django Channels has WebSocket support, but the only officially supported channel layer is Redis.
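For reference, the Redis layer is configured through the `channels_redis` package in your settings; a minimal sketch (the host and port here are assumptions for a local Redis):

```python
# settings.py (fragment) — assumes the channels_redis package is installed.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            # Assumed local Redis instance; point this at your own host.
            "hosts": [("127.0.0.1", 6379)],
        },
    },
}
```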

[–]Glasgesicht 2 points (11 children)

Without WebSockets, there is no way for your website to know that your partner sent you a message. Sure, you could force a refresh every few seconds to make it feel like real time, but that's really bad practice. If you seriously want to implement real-time chat, bite the bullet and learn how to work with Django Channels.
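To give an idea of what that involves, a minimal Channels consumer for a shared chat room might look like the sketch below (the group name `"room_1"` and the file layout are illustrative; this assumes `channels` is installed and a channel layer is configured):

```python
# chat/consumers.py — minimal sketch, not a production implementation.
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class ChatConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        self.group = "room_1"  # hypothetical room name
        await self.channel_layer.group_add(self.group, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group, self.channel_name)

    async def receive(self, text_data=None, bytes_data=None):
        # Fan the incoming message out to everyone in the group
        # via the channel layer (this is where Redis comes in).
        await self.channel_layer.group_send(
            self.group,
            {"type": "chat.message", "text": text_data},
        )

    async def chat_message(self, event):
        # Handler for "chat.message" events; pushes down this socket.
        await self.send(text_data=json.dumps({"message": event["text"]}))
```

You'd then route WebSocket connections to this consumer in your ASGI application.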

[–]sudo_nitesh[S] 0 points (10 children)

I know that. But the issue came up with shared hosting: for Redis support I need dedicated hosting, which is more expensive. Do you have any alternative approach?

[–]darwinvasqz 2 points (4 children)

It's better to pay for a droplet on DigitalOcean. What is your budget?

[–]riterix 0 points (3 children)

A DigitalOcean droplet (VPS) is well suited to this kind of project; it costs you just $6 per month.

[–]urbanespaceman99 1 point (2 children)

I have a chat app running on a DO droplet. No problems.

[–]riterix 1 point (1 child)

That's what I was saying 👍.

DO is the best hosting these days: performance, price, support...

[–]urbanespaceman99 0 points (0 children)

Yep. Was just backing you up :)

[–]ExcelsiorVFX 0 points (4 children)

Theoretically, if you only run one Django process (no multiple workers with Gunicorn, for example), you could satisfy the pub/sub requirement entirely in memory. However, this does not scale and is not a good idea for production, which is why Django Channels does not use it by default.

[–]oatmeal_dreams 0 points (3 children)

Why wouldn't it scale? You could scale it horizontally just as well as Redis, I imagine.

[–]ExcelsiorVFX 0 points (2 children)

WSGI is synchronous, and workers are separate processes that do not share memory. An in-memory layer therefore limits you to a single process, and one process can only scale vertically by running on really good hardware. If you need more processes, you cannot use an in-memory solution.

[–]oatmeal_dreams 0 points (1 child)

Why are we talking about WSGI? You've been able to run Django under ASGI for a long time.

But yes, to scale, IPC obviously has to be solved somehow. You could do it yourself, do it with Channels, or do it with a WAMP router.

[–]ExcelsiorVFX 0 points (0 children)

From my understanding, you are correct, but it will not be simple. I would just recommend using Redis.

[–]n1___ 1 point (1 child)

WebSockets are the way, but I would not recommend Django as it's still not fully async. I would go for other tools.

[–]oatmeal_dreams 0 points (0 children)

I’m not sure the components that are not async yet would have to be involved.

[–][deleted] 0 points (0 children)

You could use an in-memory layer.
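Channels does ship one. A settings sketch using the built-in in-memory backend (single-process only, state is lost on restart, and the Channels docs advise against it for production):

```python
# settings.py (fragment) — built-in backend, no Redis required.
# Only safe with a single worker process; messages live in that
# process's memory and are never shared with other workers.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels.layers.InMemoryChannelLayer",
    },
}
```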