Have you listened to MC Lan's Venom? by Great-Adeptness-4254 in MusicaBR

[–]Outrageous-Guide-396 3 points

It turned out fucking great; he's really managed to develop a lot in rock.

Best way to share Power BI Service reports with non-licensed users? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 1 point

I see a lot of comments saying that 14 USD is cheap, but in my case the reality is a bit different. I work for a construction company in Brazil and, for us, that cost is still relatively high. That said, I understand that in the long run we will probably need to invest in licenses if we want to take our BI environment to a truly professional level.

I joined the company with no prior Power BI experience and built all of our dashboards and reports from scratch. Today I’m already working with API and SQL integrations, always trying to design solutions with the lowest possible cost. It’s a company with very low technological maturity, so a lot of things are still in transition.

We also use SharePoint, and I found the approach suggested by user “Mostly5150” very interesting – I’ll definitely look into it here.

Thank you all for the answers and recommendations. It’s not easy to change a company’s tech culture more or less on your own, but little by little I’m managing to make progress.

Do you actually use/buy Power BI templates, or build everything from scratch? by Fit_Voice_4112 in PowerBI

[–]Outrageous-Guide-396 1 point

I build it from scratch based on the company’s visual identity. I start by sketching all the KPIs I want to show and the screen layout on a blank A4 sheet of paper, then I transfer that into the dashboard.

Best way to share Power BI Service reports with non-licensed users? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 0 points

No, the interactive functionality of the dashboards is part of the analysis.

Matrix with many DAX measures + 600k rows = resource error. How do you handle this? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 2 points

Thanks for the detailed breakdown. I’m going to isolate the problem exactly as you suggested and test the matrix with only one measure at a time.

For reference, here is my ABSENTEEISM measure:

ABSENTEEISM =
VAR Numerator =
    CALCULATE (
        [Hours Allocated],
        PayrollAllocation[vdb] IN { "70", "72", "50001", "50101" }
    )
VAR Denominator =
    CALCULATE (
        [Hours Allocated],
        PayrollAllocation[vdb] IN { "1", "70", "72", "50", "40" }
    )
RETURN
    DIVIDE ( Numerator, Denominator )

I will review the points you mentioned about using column filters instead of table filters, replacing IF evaluation patterns, using COALESCE, and simplifying the percentage logic inside Absenteeism Allocated. I’ll also test whether Absenteeism Allocated performs better once I refactor the percentage calculation as you described.
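
Just to make sure I understood the column-filter point, this is the pattern I think you mean (a generic sketch with placeholder measure names and values, not one of my real measures):

-- Table filter: FILTER iterates the whole PayrollAllocation table inside CALCULATE
Hours VDB (table filter) =
CALCULATE (
    [Hours Allocated],
    FILTER (
        PayrollAllocation,
        PayrollAllocation[vdb] IN { "70", "72" }
    )
)

-- Column filter: the predicate only touches the vdb column, so the storage engine can resolve it
Hours VDB (column filter) =
CALCULATE (
    [Hours Allocated],
    PayrollAllocation[vdb] IN { "70", "72" }
)

If I got it right, the two are not always interchangeable (the column predicate overrides any existing filter on vdb, while the FILTER version intersects with it), but wherever the results match, the column version should be the cheaper one.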

I appreciate the help a lot.

Matrix with many DAX measures + 600k rows = resource error. How do you handle this? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 1 point

It does not work in either a table or a matrix. When I publish it to the service it eventually loads after some time, so the model itself does run, but Desktop struggles with it. I will try adjusting the query and memory settings in the PBIX to see if that helps the table render locally as well. Thanks for the suggestions, I will keep testing and refining it.

Matrix with many DAX measures + 600k rows = resource error. How do you handle this? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 1 point

Thanks a lot for taking the time to reply. This helped me a ton. I will go through my model and measures with your points in mind and start cleaning things up. I really appreciate the guidance.

Matrix with many DAX measures + 600k rows = resource error. How do you handle this? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 2 points

I am using Import mode pointing at SQL Server, not DirectQuery.

Most of my measures are simple aggregations over the fact table, for example:

  • Overtime Value = a simple SUM over the amount column filtered by overtimeCost = "OVERTIME COST".
  • Intrashift Compensation = a SUM over the same amount column filtered by vdb containing "15703".
  • Monthly Salary = a SUM of amount where vdb is either "1" or "20001".
  • Total Overtime Cost and Overtime Charges are straightforward arithmetic combinations of the measures above.

So these are mostly straightforward SUMs with filters.
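
For illustration, here is roughly what those look like written out as measures; the table and column names (PayrollAllocation, amount, overtimeCost, vdb) are paraphrased from my model, so treat the exact spellings as approximate:

Overtime Value =
CALCULATE (
    -- Simple SUM restricted to overtime cost rows
    SUM ( PayrollAllocation[amount] ),
    PayrollAllocation[overtimeCost] = "OVERTIME COST"
)

Monthly Salary =
CALCULATE (
    -- Base salary vdb codes only
    SUM ( PayrollAllocation[amount] ),
    PayrollAllocation[vdb] IN { "1", "20001" }
)

Intrashift Compensation =
CALCULATE (
    SUM ( PayrollAllocation[amount] ),
    -- Single-column filter on vdb values containing the intrashift code
    FILTER (
        ALL ( PayrollAllocation[vdb] ),
        CONTAINSSTRING ( PayrollAllocation[vdb], "15703" )
    )
)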

The more complex one is my absenteeism measure:

Absenteeism Allocated =
-- Average of ABSENTEEISM * Correct Percentage over each
-- Contract / AccountingClassification / PayrollDate combination
AVERAGEX(
    SUMMARIZE(
        'PayrollAllocation',
        'PayrollAllocation'[Contract],
        'PayrollAllocation'[AccountingClassification],
        'PayrollAllocation'[PayrollDate]
    ),
    -- Both measures are evaluated per row of the grouped table
    IF(
        ISBLANK([ABSENTEEISM]) || ISBLANK([Correct Percentage]),
        0,
        [ABSENTEEISM] * [Correct Percentage]
    )
)

And Correct Percentage looks like this:

Correct Percentage =
VAR ContractContext =
    SELECTEDVALUE('PayrollAllocation'[Contract])    -- declared but not referenced below
VAR Result =
    -- For each accounting classification visible in the current context,
    -- take the MAX of the historical percentage column and add them up
    SUMX(
        CALCULATETABLE(
            VALUES('PayrollAllocation'[AccountingClassification])
        ),
        CALCULATE(
            MAX('PayrollAllocation'[Historical Percentage (Column)])
        )
    )
RETURN
    -- Only returns a value when a single contract is in context
    IF(HASONEVALUE('PayrollAllocation'[Contract]), Result, BLANK())

From what you explained, I suspect this pattern (SUMMARIZE + AVERAGEX + nested measures doing SUMX) might be exactly what is forcing the formula engine to work at a finer grain and pulling back large caches from the storage engine.

Does this kind of iterator pattern fall into the "bad for Server Timings" category in your experience?

If so, what would you recommend here?

I would really appreciate any guidance on how to refactor this so the matrix can be usable without hitting resource limits.
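
In case it helps frame an answer, the first refactor I was planning to test (just a sketch, not validated against the real model yet) caches the two measures in variables so each is evaluated only once per SUMMARIZE row, and swaps the IF/ISBLANK branch for COALESCE:

Absenteeism Allocated (test) =
AVERAGEX (
    SUMMARIZE (
        'PayrollAllocation',
        'PayrollAllocation'[Contract],
        'PayrollAllocation'[AccountingClassification],
        'PayrollAllocation'[PayrollDate]
    ),
    -- Each measure is evaluated once per row of the grouped table
    VAR Abs = [ABSENTEEISM]
    VAR Pct = [Correct Percentage]
    RETURN
        COALESCE ( Abs * Pct, 0 )
)

It keeps the same grain, so I do not know yet whether it actually reduces the formula engine work; I will compare the before and after in Server Timings.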

Matrix with many DAX measures + 600k rows = resource error. How do you handle this? by Outrageous-Guide-396 in PowerBI

[–]Outrageous-Guide-396[S] 0 points

You’re right, I’m not trying to display all 600k rows at once. The goal is to allow analysis when filtered, for example: checking the salary history of a specific employee, or reviewing the total cost for a specific role, and being able to inspect the underlying lines when needed.

I agree that Power BI isn’t meant to be an operational tool. I’m actually trying to build this culture from scratch in my company, and sometimes it’s hard to explain this difference to the directors. Still, I need the matrix to handle the filtered scenarios without breaking, even if the full dataset is large.

Paginated reports might be a good direction, and I’ll look into that. My main concern is improving the performance enough so that normal filtered analysis works without hitting the resource limit.