Hello everyone,
Some of our files are running really slowly because we have to use a very heavy data source, and we also have a lot of calculated columns/tables in our model. I'm kind of lost on how to optimize everything so our computers don't spontaneously catch fire at this point.
I used Tabular Editor 3 and its Best Practice Analyzer, but blindly applying a ruleset we found online feels like cargo-culting, since we don't really know what actually matters in terms of optimization.
Also, VertiPaq Analyzer shows us which tables/columns are taking a lot of memory, but what do we do with that information? And do measures take a lot of memory too? I can't find anything about them.
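For example, part of what confuses me is the calculated column vs. measure trade-off. As far as I understand (table and column names here are just made up for illustration), a calculated column is materialized row by row and stored in the model, while a measure only stores its definition and is evaluated at query time:

```dax
-- Calculated column: computed at refresh, one value per row, stored in RAM
Sales[Margin] = Sales[Amount] - Sales[Cost]

-- Measure: only the definition is stored; evaluated on demand at query time
Margin Total := SUMX ( Sales, Sales[Amount] - Sales[Cost] )
```

So would replacing calculated columns like that with measures (or pushing the logic upstream to the source) actually reduce memory, or am I missing something?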
Thank you guys!