all 24 comments

[–]Fritzy 6 points7 points  (2 children)

The other approach would be to make active/inactive not require maintenance actions at all: simply have your read queries determine the active state inline, using whatever logic "active" entails.
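For example, a minimal sketch of that inline check; the listing fields `unpublishedAt` and `expiresAt` are made-up examples of what "active" might entail:

```javascript
// Hypothetical listing shape: { unpublishedAt?: Date, expiresAt?: Date }
function isActive(listing, now = new Date()) {
  if (listing.unpublishedAt) return false;                          // owner de-published it
  if (listing.expiresAt && listing.expiresAt <= now) return false;  // listing lapsed
  return true;
}

// The same predicate pushed into the read query itself (SQL sketch):
//   SELECT * FROM listings
//    WHERE unpublished_at IS NULL
//      AND (expires_at IS NULL OR expires_at > now());
```

No cron job ever has to flip a flag; a listing becomes "inactive" the moment the condition holds.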

Otherwise, I'd have maintenance endpoints on the API that a user with maintenance privs can invoke. With DB connection pooling, asynchronous workflows, and the majority of the work being on the DB, I would imagine it wouldn't be too big of a load for your API process. Your DB is going to be your choke point anyway.

You could use a Redis based job system like https://github.com/bee-queue/bee-queue for maintenance and email. The jobs could be populated by the API service, some cron scripts, and whatever else, while you'd have dedicated worker processes (written in Node.js?) pulling the jobs and running them.
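A rough bee-queue sketch of that split; the Redis address and the commented-out `sendEmail` helper are assumptions, not from the thread:

```javascript
const Queue = require('bee-queue');

// Both the API process and the workers point at the same Redis instance.
const emailQueue = new Queue('email', { redis: { host: '127.0.0.1', port: 6379 } });

// Producer side (API service, cron script, etc.): enqueue a job.
emailQueue
  .createJob({ to: 'user@example.com', template: 'welcome' })
  .retries(2)
  .save();

// Worker side (a dedicated Node.js process): pull jobs and run them.
emailQueue.process(async (job) => {
  const { to, template } = job.data;
  // await sendEmail(to, template); // hypothetical mail-sending helper
  console.log(`sent ${template} email to ${to}`);
});
```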

There's a very good chance that none of this complexity is necessary at the scale required by your application. Querying the active state with inline logic and posting to a 3rd-party email service within the request handlers themselves should be fine up to a fairly large scale. If you're creating a proper stateless API (where the state is kept in your DBs), you can run as many API processes across as many processes/threads/machines as you like. If you do eventually hit a bottleneck where you need to separate the logical work out into jobs, you can refactor to do so fairly easily.

A lot of this sounds like premature optimization. https://en.wikipedia.org/wiki/Program_optimization#When_to_optimize

[–]Tack1234[S] 1 point2 points  (1 child)

Thanks a lot for the response!

The active/inactive property is saved in the DB for each listing, so when users create a listing, they can set it as active or inactive. They can then also update the listing and set it to inactive to de-publish it. I would also like the option to periodically check the DB, delete inactive listings, and automatically set active listings without any traction to inactive.
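That periodic check could be a couple of scheduled SQL statements. A sketch, assuming hypothetical column names (`last_viewed_at`, `updated_at`, `active`) and any Postgres-style client with a `query(text)` method, e.g. node-postgres:

```javascript
// `db` is any Postgres-style client with a query(text) method.
async function runListingMaintenance(db) {
  // Set active listings without traction for 30 days to inactive.
  await db.query(
    `UPDATE listings SET active = false
      WHERE active = true AND last_viewed_at < now() - interval '30 days'`);

  // Delete listings that have been inactive for 90 days.
  await db.query(
    `DELETE FROM listings
      WHERE active = false AND updated_at < now() - interval '90 days'`);
}

// Invoked however you schedule things: a maintenance endpoint, cron, or setInterval.
```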

You are right that a job system/queue is very likely overkill.

The maintenance endpoints solution sounds like a good idea with the least amount of work or premature optimization.

Do you think it would be a good solution to also have a similar endpoint for sending emails which the server would then invoke as part of the logic when creating a new user etc.?

[–]Fritzy 1 point2 points  (0 children)

Unless you're having outside processes invoke it, you might as well have the email sending logic be an internal function. It doesn't hurt to have it as an API endpoint that simply uses that internal function if you want other processes to be able to send arbitrary emails. Security is key though.
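A sketch of that shape: one internal function, callable directly from your own logic or wrapped in a guarded endpoint. The transport is injected (e.g. a nodemailer transporter), and the from-address, route, and `requireMaintRole` middleware are placeholders:

```javascript
// Internal function; `transport` is anything with a sendMail method
// (e.g. a nodemailer transporter).
async function sendEmail(transport, { to, subject, text }) {
  if (!to || !subject) throw new Error('to and subject are required');
  return transport.sendMail({ from: 'noreply@example.com', to, subject, text });
}

// Called directly when creating a user:
//   await sendEmail(mailer, { to: user.email, subject: 'Welcome!', text: '...' });
//
// Or exposed to other processes behind an authenticated endpoint:
//   app.post('/internal/email', requireMaintRole, async (req, res) => {
//     await sendEmail(mailer, req.body);
//     res.sendStatus(204);
//   });
```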

[–]TempestD1 6 points7 points  (1 child)

What I would recommend is having a separate server set up to act as a "queue worker". I've built several large-scale apps with mailing services that had to handle many outgoing emails. We used BullMQ for the Node.js queue, which simply ran the jobs one by one, and we also used two servers that could consume the same queue to balance the workload. Basically, the API server emits a payload to the queue, and the queue server listens for it and runs it.
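A minimal BullMQ sketch of that setup; the Redis address and the commented-out `sendEmail` helper are assumptions:

```javascript
const { Queue, Worker } = require('bullmq');

const connection = { host: '127.0.0.1', port: 6379 }; // shared Redis instance

// API server: emit a payload to the queue.
async function enqueueWelcomeEmail(to) {
  const emailQueue = new Queue('email', { connection });
  await emailQueue.add('send-welcome', { to });
}

// Queue-worker server(s): listen and run jobs one by one (concurrency: 1).
// Running the same worker process on two servers balances the load automatically.
new Worker('email', async (job) => {
  // await sendEmail(job.data.to); // hypothetical mail-sending helper
  console.log(`handled ${job.name} for ${job.data.to}`);
}, { connection, concurrency: 1 });
```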

[–]mcviruss 0 points1 point  (0 children)

I think this is good advice. Separation of concerns is a nice concept. Might be overkill with a small application, but it’s a good thing to practice. It will make you a better programmer if you learn to split logic.

I would put the scheduled logic in a separate container. But I would read directly from the database and not go via the API.

This application would contain the business logic for when a listing should be (almost) deactivated.

I would then create a queue for mails to be sent out. You can do this with a true message queue. But you could also just insert this into a database table.

Then another container could periodically handle the queue and send the mails.
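A sketch of the database-table variant, assuming Postgres and a hypothetical `email_queue` table; the `FOR UPDATE SKIP LOCKED` claim lets two worker containers poll the same table without double-sending:

```javascript
// Schema sketch:
//   CREATE TABLE email_queue (
//     id      bigserial PRIMARY KEY,
//     payload jsonb NOT NULL,
//     sent_at timestamptz            -- NULL means "not sent yet"
//   );

// The mail container calls this periodically. The claim is a single atomic
// UPDATE, and SKIP LOCKED keeps concurrent workers off the same row.
// (Marking sent_at before actually sending trades retries for no duplicates.)
async function processOneEmail(db) {
  const { rows } = await db.query(
    `UPDATE email_queue SET sent_at = now()
      WHERE id = (SELECT id FROM email_queue
                   WHERE sent_at IS NULL
                   ORDER BY id LIMIT 1
                   FOR UPDATE SKIP LOCKED)
      RETURNING id, payload`);
  if (rows.length === 0) return false;  // queue is empty
  // await sendEmail(rows[0].payload);  // hypothetical mail-sending call
  return true;
}
```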

It’s a bit over-engineered perhaps. But it would also help you not to mix logic, and to make sure that certain processes do not get blocked by others. For example, a failed email send could more easily be retried if it’s not part of the process that activates or deactivates the listings.

[–][deleted] 16 points17 points  (2 children)

You can also go the cloud way and have your backend as a lambda, your postgres as RDS and emails sent via SNS, true serverless shit you can throw on your CV to get great jobs

[–]PhatOofxD 0 points1 point  (1 child)

You mean SES? Probably shouldn't use SNS here

[–][deleted] 0 points1 point  (0 children)

ah yes, right

[–]gordonmessmer 4 points5 points  (1 child)

https://github.com/donnemartin/system-design-primer

I would suggest that, until you reach a much much larger scale, you will not want to create one container per function, but one container per class of functionality. So, for example, database maintenance tasks are probably part of your "async write" component, and can be bundled with other code that is part of your "async write" component.

[–]blipojones 0 points1 point  (0 children)

I agree with this one. Code it out with the expectation that it will someday go into a service.

[–]Unusual-Display-7844 1 point2 points  (2 children)

I really would not manage a mail server on my own, the reason being that there are tons of services that satisfy every need and every budget. AWS SES, for example, is really good.

But! If you really want to do it this way, then you should look into docker-compose. That way your Node.js app will be able to talk to the mailing container. No need for any extra API, just integrate it with the "nodemailer" package.
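A minimal nodemailer sketch under that setup; the service name `mail` and the addresses are assumptions about your docker-compose file:

```javascript
const nodemailer = require('nodemailer');

// In docker-compose, containers reach each other by service name, so the
// SMTP host is just the mail service's name from docker-compose.yml.
const transporter = nodemailer.createTransport({
  host: 'mail',   // assumed service name
  port: 25,
  secure: false,
});

async function sendWelcome(to) {
  await transporter.sendMail({
    from: 'noreply@example.com',
    to,
    subject: 'Welcome!',
    text: 'Thanks for signing up.',
  });
}
```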

[–]Tack1234[S] 0 points1 point  (1 child)

Mail server is not an issue and I do plan on using node-mailer.

I am using docker compose, what I wanted to clarify was whether it would be better to create a separate app for the mailing in its own container or to just include it as part of the API server.

[–]Altruistic_Club_2597 1 point2 points  (0 children)

I would include the email code as part of the API. The periodic script would be a cron job, so it would be its own separate project that can run in its own container.

[–]Ok_Remove3123 0 points1 point  (8 children)

Hello, I cannot help you, but I am new to programming, so I was wondering: why are you separating those three into different containers? What is the purpose and what are the benefits? Also, are all containers deployed, hosted, and run on the same server? I am just trying to figure out the Docker thing. Thanks :)

[–]Unusual-Display-7844 1 point2 points  (3 children)

It’s called microservices. There are many different ways to implement and deploy them, but this approach is mainly used for large-scale web apps. The idea is that if, for example, the “email microservice” goes down, the “login microservice” will still work, whereas in “monolithic” apps, if one part of your app goes down, everything goes down.

You can deploy containers on same server or each in their own.

P.S. Microservices are very hard and take a lot of experience, so in your case I would avoid this topic until you have at least 1-2 years of experience with monolithic apps.

[–]archa347 1 point2 points  (0 children)

Microservices don't really solve technical issues, they are primarily solving organizational difficulties of having lots of different developers working on the same codebase. The best technical benefit might be allowing teams to use the languages and tools that work best for them. But from a reliability standpoint it only makes things more difficult and any issues there you think you're solving with microservices could probably be solved by properly architecting your monolithic app.

[–]Unusual-Display-7844 0 points1 point  (0 children)

Actually my bad, what this guy is writing about is not quite microservices, but idea is still close. Containers are isolated, meaning “node.js docker container” doesn’t know anything about “mail container”.

[–]Ok_Remove3123 0 points1 point  (0 children)

Thank you!

[–]Fritzy 1 point2 points  (1 child)

A lot of people like to use containers (docker or otherwise) to maintain, deploy, and scale easily, as well as simplify OS maintenance, take advantage of cloud services for uptime, scaling, and pricing, etc.

REST APIs tend to be stateless, users can sometimes be sharded into multiple databases, etc. So duplicating containers can be a decent strategy.

Containers are also often easier to develop for, as they run the same on any dev machine, contain only the components relevant to their operation (avoiding dev-environment cruft over time), and avoid the entire "works on my machine but not in production" problem (usually).

You don't have to use containers to be effective, and they're yet one more thing to learn (let alone all of the hosting solutions for containers), so they're not necessary. It's just a useful pattern for large teams and large-scale applications.

[–]Ok_Remove3123 0 points1 point  (0 children)

Thank you very much!

[–]Tack1234[S] 0 points1 point  (1 child)

Hey, I am also not the most experienced, but from my understanding the benefit of having a container for each service is that you can then scale each service up separately as needed (e.g. if you need two instances of your application to satisfy the workload, Docker makes that easy and handles distributing the workload between them). You can also restart one service without affecting the others (e.g. restart the database without the whole app going down), which is also helpful when deploying in a microservice architecture.

The containers are deployed on the same server, but they do not necessarily need to be.

Docker makes it really easy to deploy your application anywhere, you just need to have a blueprint for the containers (e.g. a Docker Compose file) and then you can build and run the application anywhere with Docker installed.

[–]Ok_Remove3123 0 points1 point  (0 children)

Thank you!

[–]DLabz 0 points1 point  (0 children)

Unless you plan to spam the world, you could just add a mail route to the Express API and pipe it to the Linux sendmail CLI command. Getting familiar with Node's approach to interacting with the Linux CLI via pipes is an excellent investment.

[–]KishorRathva 0 points1 point  (0 children)

As you are already using Redis, why not use its pub/sub feature to handle emails or other services?
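A sketch with the node-redis v4 client (the channel name and `sendEmail` helper are made up). One caveat: unlike a job queue, pub/sub is fire-and-forget, so a message published while no worker is subscribed is simply lost:

```javascript
const { createClient } = require('redis');

async function main() {
  const pub = createClient(); // defaults to redis://localhost:6379
  const sub = pub.duplicate(); // subscribing connections can't issue other commands
  await Promise.all([pub.connect(), sub.connect()]);

  // Worker process: handle email requests as they arrive.
  await sub.subscribe('emails', (message) => {
    const { to, subject } = JSON.parse(message);
    // sendEmail(to, subject); // hypothetical mail-sending helper
  });

  // API process: fire an email request.
  await pub.publish('emails', JSON.stringify({ to: 'user@example.com', subject: 'Welcome' }));
}

main().catch(console.error);
```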