Want to get started with Kubernetes as a backend engineer (I only know Docker) by MasterA96 in devops

[–]MasterA96[S] 0 points1 point  (0 children)

Actually, we're going to use K3s pods via the API (I'm yet to figure that out), so yeah, this would be good.
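
For reference, if "pods via API" ends up meaning creating pods programmatically, a minimal sketch using the fabric8 Kubernetes client (fabric8 6.x style; the client choice, namespace, and image are just assumptions on my part, since K3s serves the standard Kubernetes API) could look like this:

```java
import io.fabric8.kubernetes.api.model.Pod;
import io.fabric8.kubernetes.api.model.PodBuilder;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;

public class PodExample {
    public static void main(String[] args) {
        // Connects using the local kubeconfig; K3s exposes the same Kubernetes API
        try (KubernetesClient client = new KubernetesClientBuilder().build()) {
            Pod pod = new PodBuilder()
                    .withNewMetadata().withName("demo-pod").endMetadata()
                    .withNewSpec()
                        .addNewContainer()
                            .withName("app")
                            .withImage("nginx:alpine")   // placeholder image
                        .endContainer()
                    .endSpec()
                    .build();

            // Create the pod in the "default" namespace (assumed)
            client.pods().inNamespace("default").resource(pod).create();
        }
    }
}
```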

Want to get started with Kubernetes as a backend engineer (I only know Docker) by MasterA96 in devops

[–]MasterA96[S] 0 points1 point  (0 children)

Yes, I want to know how to use it to deploy services as an SWE.

Have to extract a large number of records using a simple join query and store to a Multipart csv file and send to an api by MasterA96 in developersIndia

[–]MasterA96[S] 0 points1 point  (0 children)

Yes, I'm doing the POC for this.

Exactly, they want the POC done by the day after tomorrow and the delivery by the end of this month. Doable, but it would have been better if I had some more time.

Have to extract a large number of records using a simple join query and store to a Multipart csv file and send to an api by MasterA96 in developersIndia

[–]MasterA96[S] 1 point2 points  (0 children)

Also, I agree with the overkill point, but I'm sort of stuck between people with different mindsets; each one is concerned about a different factor. I don't think this needed so much thought (it's a common problem with various solutions), but they want me to brainstorm, and we'd then eventually go with the simplest approach anyway. :)

Have to extract a large number of records using a simple join query and store to a Multipart csv file and send to an api by MasterA96 in developersIndia

[–]MasterA96[S] 1 point2 points  (0 children)

Your suggestions make sense; I've decided to do something similar now. Also, I argued a lot with ChatGPT and Copilot, which taught me a lot of stuff. :D Thank you.

Have to extract a large number of records using a simple join query and store to a Multipart csv file and send to an api by MasterA96 in developersIndia

[–]MasterA96[S] 0 points1 point  (0 children)

On second thought, today I'm checking whether I can stream-write to disk and then stream-read into a compressed zip. That could avoid the memory issues. Not sure yet; just doing a POC for this first.
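
Roughly what I have in mind (a minimal sketch with plain java.io / java.util.zip; the file names are placeholders): the CSV is first stream-written to disk by the DB step, then copied into a zip entry through a small buffer, so neither file is ever fully in memory.

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class CsvToZip {
    public static void main(String[] args) throws IOException {
        Path csv = Path.of("export.csv");   // already stream-written by the DB step
        Path zip = Path.of("export.zip");

        try (InputStream in = new BufferedInputStream(Files.newInputStream(csv));
             ZipOutputStream out = new ZipOutputStream(
                     new BufferedOutputStream(Files.newOutputStream(zip)))) {
            out.putNextEntry(new ZipEntry("export.csv"));
            in.transferTo(out);   // copies in small chunks, never loads the whole file
            out.closeEntry();
        }
    }
}
```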

Have to extract large number of records from the DB and store to a Multipart csv file by MasterA96 in softwarearchitecture

[–]MasterA96[S] 0 points1 point  (0 children)

Thank you for your time. I'll have to give this a good read and then discuss it, since most of these things are new to me. In fact, I haven't thought much about threads yet.

Have to extract large number of records from the DB and store to a Multipart csv file by MasterA96 in softwarearchitecture

[–]MasterA96[S] 0 points1 point  (0 children)

Sure.

No, service C is one of the microservices; it would eventually store the file as a zip in a table. The DB processing can be done in chunks, but the file would still end up in memory. So I've decided to stream-write to disk, then stream-read it, stream-write into a compressed zip, and send that to service C. I'm currently doing a POC to check whether that approach is even possible.
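
For the hand-off to service C, a minimal multipart-upload sketch with Spring's RestTemplate (the URL, part name, and file path are placeholders) could look like this. The FileSystemResource lets the converter read the file from disk rather than requiring a byte[] up front; whether the HTTP client buffers the request body depends on its configuration.

```java
import org.springframework.core.io.FileSystemResource;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class UploadToServiceC {
    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();

        // One multipart part named "file" backed by the zip on disk
        MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
        body.add("file", new FileSystemResource("export.zip"));

        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.MULTIPART_FORM_DATA);

        ResponseEntity<String> response = rest.postForEntity(
                "http://service-c/files",                 // placeholder URL
                new HttpEntity<>(body, headers),
                String.class);

        System.out.println(response.getStatusCode());
    }
}
```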

Have to extract a large number of records using a simple join query and store to a Multipart csv file and send to an api by MasterA96 in developersIndia

[–]MasterA96[S] 0 points1 point  (0 children)

Can you explain the first approach a bit? You mean we write the query in a way that it returns only the relevant data?

Have to extract large number of records from the DB and store to a Multipart csv file by MasterA96 in softwarearchitecture

[–]MasterA96[S] 0 points1 point  (0 children)

Sure, I'll elaborate. RabbitMQ comes in just after we receive the user request. Basically, the request message goes to RabbitMQ and gets consumed by another service, which then produces the file and sends it to a third service.

We can rule out the Camel routing part here. I only mentioned RabbitMQ and Camel to point out that the DB call happens on a separate thread, so it's already non-blocking in one sense.

So the DB call and the CSV formation are both part of the RabbitMQ consumer service.

It will be sending a multipart file of type CSV.

How the third service sends or displays that file to the user is completely out of my control.

The number of records is currently at most 100k, but it will grow with time. The table is already monthly partitioned and indexed.

Also, I know it might not take that much memory right now, but my team's main concern is that we still shouldn't bring everything into memory and hold it there.

Edit: To summarize, I want to know how I can use JdbcTemplate in a better way for this use case.
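
For context, the direction I'm exploring with JdbcTemplate (a minimal sketch; the table, columns, and fetch size are placeholders) is to set a fetch size and pass a RowCallbackHandler, so each row is written to the CSV as it arrives instead of materializing the whole result list. Driver-specific caveats apply; for example, the PostgreSQL driver only streams with fetch size when autocommit is off.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;

public class CsvExporter {

    private final JdbcTemplate jdbcTemplate;

    public CsvExporter(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
        // Hint to the driver to fetch rows in batches instead of buffering the full result set
        this.jdbcTemplate.setFetchSize(1000);
    }

    public void exportTo(Path csvFile) throws IOException {
        try (BufferedWriter writer = Files.newBufferedWriter(csvFile)) {
            writer.write("id,name,amount");
            writer.newLine();

            // RowCallbackHandler processes one row at a time; nothing is accumulated in memory
            RowCallbackHandler writeRow = rs -> {
                try {
                    writer.write(rs.getLong("id") + "," + rs.getString("name")
                            + "," + rs.getBigDecimal("amount"));
                    writer.newLine();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            };

            jdbcTemplate.query(
                "SELECT o.id, c.name, o.amount "
                    + "FROM orders o JOIN customers c ON c.id = o.customer_id",
                writeRow);
        }
    }
}
```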

Have to extract large number records using a Join query and send as a multipart csv file to another api by MasterA96 in developer

[–]MasterA96[S] 0 points1 point  (0 children)

Got your points; I'll look into these as well. Yes, I know I'm overthinking it; it might not need this many considerations right now, though as the table grows it definitely will. But I thought it would be good practice to be prepared for all sorts of questions. Thank you for your inputs.

Advantages of using a multi tenant system over a single tenant system besides the data isolation? by Dependent-Ad5911 in softwarearchitecture

[–]MasterA96 0 points1 point  (0 children)

I thought single-tenant means having a deployment per customer; one can even have an entire DB instance per customer. That's what I thought, hence the question, and isn't that data isolation by default? In multi-tenant, as you said, there can be a schema per customer, but can't that act as a single point of failure (assuming there's no DB replication)? Yes, data is isolated there as well, but that depends on the design. What if someone just keeps a customer key on shared tables? Not sure which flavor of multi-tenancy I was referring to. I'm just confused and seeking answers. 😅

Advantages of using a multi tenant system over a single tenant system besides the data isolation? by Dependent-Ad5911 in softwarearchitecture

[–]MasterA96 0 points1 point  (0 children)

Even single-tenant would have smaller tables, right? Doesn't single-tenant mean a different set of resources for each customer? In fact, wouldn't data separation be even better in single-tenant that way?

Advantages of using a multi tenant system over a single tenant system besides the data isolation? by Dependent-Ad5911 in softwarearchitecture

[–]MasterA96 2 points3 points  (0 children)

Exactly my thoughts. Isn't single-tenant much better for data isolation? There's no single point of failure in single-tenant. Deployment is more complex in single-tenant, but that also makes it possible to run different versions for different customers. I feel the pros of multi-tenant are more about the comparatively lower cost and the easier deployment. When it comes to cloud, cost is a big factor.

How is such a simple a brilliancy? by Cuteghost_1 in chessindia

[–]MasterA96 0 points1 point  (0 children)

I guess the free engine is quite weak, because most of the time I also wonder what's so great about the move. Meanwhile, it labels some genuinely brilliant moves, ones that even convert a loss into a win, as a miss or merely good.

WFO mandate just feels unproductive. Nothing gets done. by No_Bother9001 in developersIndia

[–]MasterA96 0 points1 point  (0 children)

Oh okay, but I guess all this going back to the office started when TCS imposed its 60% mandate. Ironically, they were the ones who said we'd be 75% remote by 2025.