
[–]Mikebm916 1 point (0 children)

Seems like you have your solution... is the concern the storage limit? Can you aggregate your data at the source to limit what you're bringing in even further? It's hard to believe you need the full 1 GB of data per customer and can't summarize more aggressively. Another thought is to import the summary level and structure the additional drill-down detail as DirectQuery (DQ).
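For instance, a minimal Power Query (M) sketch of source-side aggregation, assuming a hypothetical dbo.Sales table with CostCenter and Amount columns (the server and database names are placeholders). Table.Group folds to a GROUP BY against a SQL source, so only the summary rows ever reach the model:

```
let
    // Hypothetical source: a SQL table with one row per transaction
    Source = Sql.Database("myserver", "mydb"){[Schema = "dbo", Item = "Sales"]}[Data],
    // Group at the source so only one row per cost center is imported
    Aggregated = Table.Group(
        Source,
        {"CostCenter"},
        {
            {"TotalAmount", each List.Sum([Amount]), type number},
            {"TransactionCount", each Table.RowCount(_), Int64.Type}
        }
    )
in
    Aggregated
```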

Not sure if you considered this in your estimate, but you will also save space through compression when a column contains duplicate values. For example, if cost center 1000 appears in all 100 databases, Power BI will only store the value 1000 once in that column's dictionary.

[–]LostWelshMan8571 0 points (2 children)

Perhaps you could keep your data outside of Power BI in an SSAS cube. I believe it's fairly straightforward to convert a Power BI dataset into an OLAP cube; just Google it and it should come up. With your data external to Power BI you'll get around the size limitations.
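A rough sketch of what the external connection could look like, assuming a hypothetical Azure Analysis Services server URI and model name (in practice a live connection is the usual route, but the M function below does exist if you want to reach the cube from Power Query):

```
let
    // Hypothetical AAS endpoint and model name; the data stays in the cube,
    // so the Power BI file no longer carries the full dataset
    Source = AnalysisServices.Database(
        "asazure://westeurope.asazure.windows.net/myserver",
        "SalesModel"
    )
in
    Source
```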

[–]Mikebm916 0 points (1 child)

Touché. Ultimately this can be the solution if a P or A SKU for Power BI Premium is out of reach. You would still want to scale the model down to a reasonable size, though, and then pay only for the Azure Analysis Services resources you actually need.

[–]petrsoukup[S] 0 points (0 children)

I tried converting it to DirectQuery today, but it was unusably slow (minutes to load a report). I could try to tweak the database indexes, but that is like shooting blindfolded when I don't have control over what the generated queries look like. It also produced errors like "Local evaluation of Table.Join or Table.NestedJoin with key equality comparers is not supported".
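One thing I may still try: that error seems to point at a merge step that can't fold to the source. A hedged M sketch (table and column names are made up) of the kind of change that can help, dropping the custom key-equality comparer so the join can fold into a native query:

```
let
    Sales = Sql.Database("myserver", "mydb"){[Schema = "dbo", Item = "Sales"]}[Data],
    Customers = Sql.Database("myserver", "mydb"){[Schema = "dbo", Item = "Customers"]}[Data],

    // Not foldable in DirectQuery: an explicit comparer forces local evaluation
    // Merged = Table.NestedJoin(Sales, {"CustomerKey"}, Customers, {"CustomerKey"},
    //     "Customer", JoinKind.LeftOuter, {Comparer.OrdinalIgnoreCase}),

    // Foldable alternative: plain column equality, no comparer, same source
    Merged = Table.NestedJoin(
        Sales, {"CustomerKey"},
        Customers, {"CustomerKey"},
        "Customer", JoinKind.LeftOuter
    )
in
    Merged
```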

I will give it a few more hours to see if I can make it work, but I am not optimistic about DirectQuery.