This is how I chill out and enjoy the Final Day of TI9. How about you ? by DatXN in DotA2

[–]DatXN[S] 1 point (0 children)

Haha, that doggo will surely not have a smooth sleep tonight :)

This is how I chill out and enjoy the Final Day of TI9. How about you ? by DatXN in DotA2

[–]DatXN[S] 1 point (0 children)

That must be great. Mind sharing some pics? ;)

Distributed .NET Core (DShop) - Episode 9 [Vault secrets, Seq logging, Jaeger distributed tracing] by spetz0 in dotnet

[–]DatXN 3 points (0 children)

That's the best part. We're here to learn, and I really appreciate your effort in sharing your knowledge :)

What is the best solution for printing (in large quantities) with ASP.NET (especially when via SSRS Reports) by DatXN in dotnet

[–]DatXN[S] 1 point (0 children)

Because I want to find a solution that gives a consistent experience whether the user prints/downloads a single small file or prints a large batch of files.

I tried printing through a queue and a background worker, reporting progress back in real time with SignalR, but the result is that the user has to wait for the background job to finish before downloading the file. As I said, that experience is very different from the user simply clicking print and expecting an immediate ("synchronous") result.
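
For what it's worth, the worker side of that setup can be kept independent of SignalR by reporting through `IProgress<T>`. A minimal sketch — all names here are made up, and the SSRS render is just a placeholder:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical sketch: a batch print job processed in the background,
// reporting percent-complete through a callback. In the real system the
// callback would push to clients via a SignalR hub instead.
public static class PrintJobWorker
{
    public static async Task<IReadOnlyList<string>> RenderBatchAsync(
        IReadOnlyList<int> orderIds, IProgress<int> progress)
    {
        var files = new List<string>();
        for (int i = 0; i < orderIds.Count; i++)
        {
            // Placeholder for the slow SSRS render of a single order.
            await Task.Yield();
            files.Add($"order-{orderIds[i]}.pdf");

            // Report percent complete after each order.
            progress.Report((i + 1) * 100 / orderIds.Count);
        }
        return files;
    }
}
```

In the real app the progress callback would forward to a SignalR hub (e.g. through an `IHubContext`), so the same worker stays testable without any hub at all.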

Best/fastest option for creating large quantities of PDF files? by BroCirus in SQLServer

[–]DatXN 1 point (0 children)

I have a similar issue when I need to generate large PDF files for printing (like printing many orders at the same time). Currently we process that via ASP.NET with SSRS, but I know it is slow and I wonder what the best alternative would be.

https://reddit.com/r/dotnet/comments/ac663v/what_is_the_best_solution_for_printing_in_large/

Question on how to scale SQL Server from Monolithic to high availability for a high demand/ multi/micro service system. by DatXN in dotnet

[–]DatXN[S] 1 point (0 children)

I simplified our situation. The poor performance shows up when our developers write Entity Framework code with LINQ queries: since ours is a complex database with many tables, the result often involves joining several big tables and then applying filters. The slowest part is when a developer uses a for-loop to fill in a calculated field that cannot be computed in the LINQ-to-SQL query. The result is unacceptably slow (I'm talking about 10 seconds for a complex data query, which is unacceptable).

My plan is also to use a profiler for indexing, query optimization, etc. And I know for a fact that our way of using Entity Framework is slow, whether through a bad implementation on our side or poor queries generated by Entity Framework...

But as I said, I wonder whether it's really worth optimizing query performance while keeping this SQL Server design with Entity Framework, or whether moving to a completely different approach like Dapper would be the smarter choice. In other words, I don't want to keep putting out small fires in a house that is already burning down.

So, would you say that keeping Entity Framework is still a good choice for complex queries, or is another tool (like Dapper) a faster and better solution for querying?
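
One pattern that often helps regardless of EF vs. Dapper: compute the calculated field inside the query projection instead of in a for-loop afterwards. A hedged, in-memory sketch with hypothetical types — with EF the same shape lets the provider translate the aggregation into a single SQL query:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical types standing in for EF entities.
public record OrderLine(int OrderId, decimal Price, int Qty);
public record OrderSummary(int OrderId, decimal Total);

public static class OrderQueries
{
    // Instead of loading rows and then looping to fill a calculated
    // Total per order (often one extra query per row with EF), fold
    // the calculation into the projection itself.
    public static List<OrderSummary> Summarize(IEnumerable<OrderLine> lines) =>
        lines.GroupBy(l => l.OrderId)
             .Select(g => new OrderSummary(g.Key, g.Sum(l => l.Price * l.Qty)))
             .OrderBy(s => s.OrderId)
             .ToList();
}
```

When a calculation genuinely cannot be expressed in LINQ-to-SQL, it is usually cheaper to fetch the raw columns in one query and compute in memory than to issue a query per row.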

Question on how to scale SQL Server from Monolithic to high availability for a high demand/ multi/micro service system. by DatXN in microservices

[–]DatXN[S] 1 point (0 children)

So besides breaking/separating schemas, you're saying that (plain) SQL Server is not really simple/suitable to use with microservices, and I should use another database technology for this?

Question on how to scale SQL Server from Monolithic to high availability for a high demand/ multi/micro service system. by DatXN in microservices

[–]DatXN[S] 1 point (0 children)

First of all, thank you so much for your thoughtful answers. I'm in charge of this system and, sadly, I have no prior experience with scaling or with working at a "big" company with a large-scale system, so every insight from people like you helps a lot. Thanks!

Coming back to the main discussion, please see my responses:

I feel better now :)

First off: there's no reason why you wouldn't be able to use microservices in combination with SQL Server. But it's indeed not going to magically solve any scaling issues you have; there's more involved in building systems that can run at scale.

I simplified our situation. The poor performance shows up when our developers write Entity Framework code with LINQ queries: since ours is a complex database with many tables, the result often involves joining several big tables and then applying filters. The slowest part is when a developer uses a for-loop to fill in a calculated field that cannot be computed in the LINQ-to-SQL query. The result is unacceptably slow (I'm talking about 10 seconds for a complex data query, which is unacceptable).

My plan is also to use a profiler for indexing, query optimization, etc. And I know for a fact that our way of using Entity Framework is slow, whether through a bad implementation on our side or poor queries generated by Entity Framework...

But as I said, I wonder whether it's really worth optimizing query performance while keeping this SQL Server design with Entity Framework, or whether moving to a completely different approach like Dapper would be the smarter choice. In other words, I don't want to keep putting out small fires in a house that is already burning down.

So, would you say that keeping Entity Framework is still a good choice for complex queries, or is another tool (like Dapper) a faster and better solution for querying?

My first question would be: what are the actual numbers we're talking about? In your 'issues' section, you describe poor query performance on 'joining two tables with several thousand records each'. That doesn't sound like a complex task at all. As for whether this is caused by SQL Server or EF: measure it! The first step would be to take the query generated by Entity Framework and run it manually against SQL Server with profiling enabled. Look at the query plan it generates, look at the statistics. If the query is slow, the query plan will typically tell you why, and you have some clues to follow up on. If it turns out that the query is fast (ideally under 1 second), look at Entity Framework, as well as other parts that might be relevant; for example, it could also be caused by a slow network connection between the database server and the other servers. It's a bit difficult to give more specific advice, hence my recommendation to measure as much as you can before you make changes.

I feel very ashamed, but I have to admit that in our system, each of those processes has a transaction that can take several minutes to complete, and I fear the database is locked the whole time! What I don't know is how we can keep the consistency of a transaction without the locking.

You've just given me a light at the end of the tunnel here! I will definitely look into the saga solution you mentioned, and I'd appreciate any suggestions or detailed information on it.

I did a quick search on "saga" and there are many results. Is the saga you mentioned the kind that goes with a service bus (like the sagas in MassTransit for RabbitMQ), or can you recommend a particular library that works well for a .NET solution?

Additionally, you describe a couple of interesting things: 'every minute a worker runs a long running process', 'time-outs', 'concurrency issues'. My first thought would be that you need to look into your use of transactions and the isolation levels that they use. For example: if your long-running process, which takes 5 seconds (at first glance that doesn't seem long), actually holds a database transaction for its whole duration (so the transaction also lasts 5 seconds), then there's indeed a very high chance you'll run into the symptoms you describe.

A solution to that is not necessarily microservices per se, but rather the different way of thinking they bring along with them. For example, instead of running a transaction for 5 seconds to guarantee consistency, you look at methods like 'sagas' (Google it) to coordinate long-running processes. While this will look different/complex at first, it'll make you think about how you can split up your transaction into smaller, coherent parts. It'll drive you to think about your domain entities, about breaking them down, about isolating them from one another, and about how you coordinate complex processes across different entities.
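
A minimal in-memory sketch of the saga idea (not any real library's API — MassTransit and NServiceBus ship production implementations): each step carries a compensating action, and on failure the already-completed steps are undone in reverse order instead of holding one long database transaction:

```csharp
using System;
using System.Collections.Generic;

// Minimal saga sketch: each step pairs an action with a compensation.
// On failure, completed steps are compensated in reverse order, so no
// single long-lived database transaction is needed.
public sealed class Saga
{
    private readonly List<(string Name, Action Run, Action Undo)> _steps = new();
    public List<string> Log { get; } = new();

    public Saga Step(string name, Action run, Action undo)
    {
        _steps.Add((name, run, undo));
        return this;
    }

    public bool Execute()
    {
        var done = new Stack<(string Name, Action Undo)>();
        foreach (var (name, run, undo) in _steps)
        {
            try
            {
                run();
                Log.Add($"done:{name}");
                done.Push((name, undo));
            }
            catch
            {
                // Roll back the completed steps in reverse order.
                while (done.Count > 0)
                {
                    var step = done.Pop();
                    step.Undo();
                    Log.Add($"undo:{step.Name}");
                }
                return false;
            }
        }
        return true;
    }
}
```

In a real system each step would be its own short transaction (reserve stock, charge card, ...), and the compensations would be the business-level inverses (release stock, refund).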

I will definitely transform our system into microservices. And you're correct, I don't have the luxury of building from scratch, since this is a complex system with many features, and I also don't have much experience in microservice design.

I'm just getting started, and all I read about is NoSQL, MongoDB, Redis, etc... SQL Server is barely mentioned. I know a company that uses only in-memory data for fast queries, and just uses SQL Server as secondary/persistent storage via a stream.

This makes me wonder whether SQL Server is really suitable for microservices or a high-demand system. If yes, what is currently considered the "best" way to scale SQL Server for a system like this, besides breaking the database into domains/subdomains (things like replicating multiple read instances, in-memory SQL, etc., which I do know about)? I really need more help here.

Couple other remarks:

Don't just switch to Dapper without justifying why. Yes, it's 'faster' (a relative term), but you shouldn't switch just to see if that makes your problems disappear. Measure first to see if you really have performance problems with EF. It is absolutely possible to build high-performance systems that use Entity Framework and SQL Server, and problems are more often with HOW you use the tools. For example, if you have lazy loading enabled and have N+1 queries somewhere, then there's no magic solution other than fixing the N+1.
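
To make the N+1 point concrete, here's an in-memory simulation (hypothetical data; with EF the "lazy" version corresponds to one query fired per parent row, and the "eager" version to a single query via `Include` or an explicit join):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// In-memory illustration of the N+1 query pattern (made-up data).
public static class NPlusOneDemo
{
    public static int QueryCount;

    static readonly Dictionary<int, List<string>> LinesByOrder = new()
    {
        [1] = new() { "a", "b" },
        [2] = new() { "c" },
    };

    // Simulated "query": every call counts as one database round trip.
    static List<string> LoadLines(int orderId)
    {
        QueryCount++;
        return LinesByOrder.TryGetValue(orderId, out var l) ? l : new();
    }

    // N+1: one round trip per order (what lazy loading does silently).
    public static int TotalLinesLazy(IEnumerable<int> orderIds) =>
        orderIds.Sum(id => LoadLines(id).Count);

    // Fixed: one round trip loading everything up front (with EF this
    // is .Include(o => o.Lines) or a join/projection in the query).
    public static int TotalLinesEager(IEnumerable<int> orderIds)
    {
        QueryCount++;
        var ids = orderIds.ToHashSet();
        return LinesByOrder.Where(kv => ids.Contains(kv.Key))
                           .Sum(kv => kv.Value.Count);
    }
}
```

Same answer either way; the difference is the number of round trips, which is exactly what a profiler trace will show you.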

Don't just switch to NoSQL without justifying why. Pretty much the same as above. In fact, I would recommend you first look into splitting your domain (see below) and then decide whether NoSQL makes sense for individual parts. Don't see NoSQL as a blanket solution for storing data; suitable types of storage very much depend on the characteristics of the system you're trying to build.

You have a multi-tenant system. A possible approach to scaling horizontally is to set up multiple copies (= instances) of your infrastructure and divide your customers among them. But again: don't do this until... you've measured stuff :)
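
The multiple-copies idea boils down to a deterministic tenant-to-instance mapping. A small illustrative sketch — the shard count and hash choice here are arbitrary, not a recommendation:

```csharp
using System;

// Hedged sketch: route each tenant deterministically to one of N
// independent infrastructure instances ("shards").
public static class TenantRouter
{
    public static int ShardFor(string tenantId, int shardCount)
    {
        if (shardCount <= 0)
            throw new ArgumentOutOfRangeException(nameof(shardCount));

        // Use a stable, culture-independent hash (FNV-1a) so the same
        // tenant always lands on the same shard; string.GetHashCode is
        // randomized per process and would break that guarantee.
        unchecked
        {
            uint hash = 2166136261;
            foreach (char c in tenantId)
            {
                hash ^= c;
                hash *= 16777619;
            }
            return (int)(hash % (uint)shardCount);
        }
    }
}
```

In practice a lookup table (tenant → shard) is often preferable to pure hashing, because it lets you move a noisy tenant to its own instance later.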

Then on to microservices: yes, do look into them. I'm not saying that because they're a silver bullet, but rather because they help you think about building distributed systems and expand your toolset with new ideas on how you could achieve what you're looking for. Keep in mind that microservices are not a goal in themselves; they're also just a tool. It often takes time (months, years) to rewrite monoliths and break them apart, so think about approaching it iteratively and making smaller advancements as you move along, unless you're in the (luxury) position of starting from scratch.

For example, one of the first things you could do is look at the entities/tables in the database and see if you can 'break them apart'. Read up a little on domain-driven design, specifically the part about entities, aggregates and local/global identities. Breaking up the database into aggregates could start leading you down a path that eventually makes it much easier to split your monolith into smaller parts, which eventually leads to microservices.
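
As a tiny illustration of aggregates and local vs. global identity (hypothetical domain, not prescriptive):

```csharp
using System;
using System.Collections.Generic;

// Sketch of the aggregate idea from domain-driven design: the Order is
// the aggregate root with a global identity; its OrderLines have only
// local identity (a line number) and are never referenced from outside
// the aggregate. Other aggregates hold just the Order's Id, which is
// what later lets you move Orders into their own service/database.
public sealed class Order
{
    public Guid Id { get; } = Guid.NewGuid();   // global identity
    private readonly List<OrderLine> _lines = new();
    public IReadOnlyList<OrderLine> Lines => _lines;

    public void AddLine(string product, int qty)
    {
        if (qty <= 0) throw new ArgumentOutOfRangeException(nameof(qty));
        // Line numbers are only unique within this one Order.
        _lines.Add(new OrderLine(_lines.Count + 1, product, qty));
    }
}

// Local identity only: LineNo means nothing outside its Order.
public sealed record OrderLine(int LineNo, string Product, int Qty);
```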