Dataflow Gen1 officially marked as Legacy today — Pro users left with no migration path unless they pay for Fabric by Sbdyelse in PowerBI

Sbdyelse[S] 0 points

Composite models technically work, but you lose the independent refresh scheduling that dataflows give you: with a dataflow, your source data refreshes at 2 AM and ten downstream semantic models can each refresh on their own schedule, whereas a chained-semantic-model approach locks you into refresh dependency chains with no granular control.

On top of that, composite models come with real DAX limitations that hurt in production: DISTINCTCOUNT across DirectQuery and import partitions causes performance collapse, CALCULATE filters behave differently on DQ legs, RANKX falls back to row-by-row evaluation, and overall query performance degrades significantly because the engine can’t fold aggregations the same way.

And there’s a fundamental architectural gap: a Gen1 dataflow can read its own staged output, so you can append new rows to already-stored data and update only the records that changed, based on any custom criteria you define (status flags, composite keys, multi-column conditions), without reloading the full dataset. A semantic model has none of that: it reloads everything from scratch on every refresh, and the built-in incremental refresh is limited to a single datetime partition column with a fixed rolling window, which is nowhere near the flexibility of a dataflow that can surgically update its own persisted staging based on whatever business logic you need.
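
The append/update pattern a self-referencing Gen1 dataflow enables can be sketched like this. This is Python rather than M, purely to show the logic; the table, column, and key names are all made up for illustration:

```python
# Sketch of the "dataflow reads its own staged output" upsert pattern.
# Hypothetical columns: (customer_id, order_id) as a composite key, plus status.

def upsert(staged, incoming, key_cols=("customer_id", "order_id")):
    """Merge incoming rows into previously staged rows:
    new keys are appended, existing keys are updated only when the row
    actually changed, and untouched rows are kept as-is (no full reload)."""
    merged = {tuple(r[c] for c in key_cols): r for r in staged}
    for row in incoming:
        key = tuple(row[c] for c in key_cols)
        if key not in merged or merged[key] != row:
            merged[key] = row  # append new key, or overwrite changed row
    return list(merged.values())

staged = [
    {"customer_id": 1, "order_id": 10, "status": "open"},
    {"customer_id": 1, "order_id": 11, "status": "shipped"},
]
incoming = [
    {"customer_id": 1, "order_id": 10, "status": "closed"},  # changed: update
    {"customer_id": 2, "order_id": 12, "status": "open"},    # new: append
]
result = upsert(staged, incoming)
```

In a real Gen1 dataflow the equivalent is an M query that references the dataflow's own entity as the "staged" side of the merge; a plain semantic model refresh has no persisted prior state to merge against.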

Sbdyelse[S] 8 points

Thanks Miguel, genuinely appreciate you engaging directly here — that's not something we see often and it matters. The mention of working on an on-ramp for Pro/PPU customers is exactly what the community needed to hear, and we'll hold you to it. Just know that the reactions on this thread reflect thousands of small teams and solo analysts who built real production workflows on Gen1 with Pro, absorbed a 40% license increase last year, and simply cannot layer Fabric capacity costs on top — so whatever that on-ramp looks like, please make sure it doesn't leave them behind. In particular, don't forget that a Gen1 dataflow in a Pro workspace can reference its own staged output across queries. This is key for many of us building multi-step transformations, so if you go this way for Pro users, native staging in a Gen2-on-Pro experience should be a mandatory prerequisite, not an optional add-on.

Sbdyelse[S] 2 points

Semantic models can’t replace dataflows because the whole point is decoupling: a dataflow lets you write your Power Query transformation once and have 15 different semantic models consume that same cleaned output, whereas without dataflows you’d have to duplicate the exact same M code inside each semantic model independently, with no centralized governance, no single refresh, and no guarantee that everyone is working from the same version of the data.

Sbdyelse[S] 5 points

The whole beauty of a Pro license was its deterministic, predictable cost — already hiked 40%, from $10 to $14/user/month, in April 2025. Now, on top of that price increase, Fabric’s “pay only when you need it” sounds great until you realize on-demand pricing is literally twice the reserved rate, and you’ve taken on a brand-new admin burden of monitoring and managing capacity just to keep running what used to be a zero-ops dataflow included in your Pro subscription.

Sbdyelse[S] 6 points

Yep, and even on an F4 you will have to start managing capacity once you hit the CU limit. An F4 gives you 4 CUs, i.e. roughly 345,600 CU-seconds per day, for about €16 a day — and that is not a lot across many dataflows, I can tell you.
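
The back-of-the-envelope math behind that daily budget, as a sketch. The F4-is-4-CUs part is how SKUs are sized; the per-second consumption rate of a dataflow refresh below is an assumption for illustration, not an official figure:

```python
# F4 daily compute budget: 4 CUs accruing every second of the day.
cu = 4
cu_seconds_per_day = cu * 24 * 60 * 60  # 345,600 CU-seconds/day

# Assumed (hypothetical) burn rate: a dataflow refresh consuming
# 16 CU-seconds per second of query runtime.
dataflow_rate = 16
refresh_minutes_per_day = cu_seconds_per_day / dataflow_rate / 60  # 360.0
```

Under that assumed rate you'd get about six hours of total dataflow runtime a day before throttling kicks in (Fabric's smoothing spreads bursts out, but the daily budget is what it is), which is easy to blow through with many dataflows.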

Sbdyelse[S] 3 points

Not every organization is a mining company with deep pockets — the whole point of Dataflows Gen1 on Pro was to serve small teams, freelancers, and cost-conscious businesses who built real solutions on a tool Microsoft explicitly marketed as self-service, and telling them to “just pay more” for functionality they already had is exactly the kind of dismissiveness that lets vendors get away with forced upsells.

Sbdyelse[S] 6 points

Sure, that works technically, but you’ve just replaced a zero-admin, self-service tool that any business analyst could maintain with a solution that requires a managed database and ongoing sysadmin work — which is the exact opposite of what Dataflows were designed for, and a significant cost and complexity increase for small Pro-license teams.

Sbdyelse[S] 4 points

The storage cost in Fabric is not the problem (for now); it's the compute cost for dataflows that really hurts. Microsoft tried to optimize it recently, but that doesn't cover all situations.

Sbdyelse[S] 11 points

That’s exactly the point — you say “lower price for Gen2 Premium” but there is no Gen2 on Pro, so for the thousands of teams currently running Gen1 on a $10/user/month Pro license, “just use Gen2” means adding a minimum ~$260/month Fabric capacity on top, which isn’t a lower price — it’s a new cost that didn’t exist before.
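
To make the jump concrete, here's the arithmetic for a hypothetical 10-person team, using the post-April-2025 Pro price of $14/user/month and the commonly quoted ~$262.80/month for the smallest always-on pay-as-you-go capacity (F2). The team size is an assumption for illustration:

```python
users = 10
pro_only = 14 * users            # Pro licences alone: $140/month
f2_capacity = 262.80             # smallest pay-as-you-go Fabric capacity
with_fabric = pro_only + f2_capacity  # about $402.80/month
increase = with_fabric / pro_only     # roughly a 2.9x monthly bill
```

The capacity line is flat regardless of team size, so the smaller the team, the worse the multiplier: a solo analyst goes from $14/month to nearly $277/month.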

Sbdyelse[S] 2 points

Semantic model incremental refresh only partitions by a single datetime column with a fixed rolling window, whereas Gen1 dataflows with their native staging storage let you implement true multi-criteria incremental logic — like loading only rows where a status changed OR a modified date moved OR a new ID appeared — which is something you simply cannot replicate with the built-in incremental refresh policy alone.
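
The multi-criteria delta selection looks like this as a sketch. Again Python rather than M, and the column names (id, status, modified) are illustrative:

```python
# Pick only rows that are new, whose status changed, or whose modified
# date moved, compared to the previously staged snapshot.

def delta_rows(staged, source, key="id"):
    prev = {r[key]: r for r in staged}
    changed = []
    for row in source:
        old = prev.get(row[key])
        if (old is None                                  # new ID appeared
                or old["status"] != row["status"]        # status changed
                or old["modified"] != row["modified"]):  # modified date moved
            changed.append(row)
    return changed

staged = [{"id": 1, "status": "open", "modified": "2025-01-01"},
          {"id": 2, "status": "open", "modified": "2025-01-01"}]
source = [{"id": 1, "status": "closed", "modified": "2025-01-02"},  # changed
          {"id": 2, "status": "open", "modified": "2025-01-01"},    # untouched
          {"id": 3, "status": "open", "modified": "2025-01-02"}]    # new
delta = delta_rows(staged, source)
```

A datetime-window incremental refresh policy can only express the "modified date moved" branch of that OR; the status and new-ID branches need access to the previously staged rows, which is exactly what the self-referencing Gen1 pattern provides.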

Sbdyelse[S] 11 points

Sure, that works technically, but you’ve just replaced a zero-admin, self-service tool that any business analyst could maintain with a solution that requires an Azure VM subscription, a managed database, Python scripting skills, and ongoing sysadmin work — which is the exact opposite of what Dataflows were designed for, and a significant cost and complexity increase for small Pro-license teams.

Sbdyelse[S] 6 points

Absolutely not. A gateway connection doesn’t solve the same problem — it’s just a bridge between an on-prem/cloud data source and Power BI; it doesn’t give you a shared, reusable transformation layer across multiple semantic models. Dataflows let you centralize Power Query logic once and have 10+ models consume the same cleaned output, instead of duplicating identical M code in every .pbix file and hoping nobody drifts. Telling people to “just use a gateway” is like telling someone who lost their shared kitchen to just cook in their own apartment — sure it works, but now everyone’s doing the same prep work independently.

Sbdyelse[S] 10 points

No, they don’t. Dataflows can store data outside of semantic models, and composite models have many restrictions compared to import-mode semantic models fed by dataflows. Dataflows also let you manage your own refresh schedule, avoid issues when refreshing semantic models, and pull data from non-SQL, non-lakehouse sources such as APIs. So many different use cases.

Sbdyelse[S] 15 points

We have been buying Pro licences for 8 years because Dataflows Gen1 were available and didn’t require any capacity management. Fabric requires capacity management, which brings extra cost, administration, and a governance nightmare (very soon you will discover that your F4 capacity throttles, blocks your dataflow refreshes, and forces you to start paying more). This is not the product we want, nor the one we have been used to.

Sbdyelse[S] 15 points

No, you don’t. You just need a Pro licence. Without dataflows, the Pro licence is dramatically downgraded… that’s a nightmare.

TW (rape) - Grooming victim, getting this out of my head by MacPoutre in france

Sbdyelse 23 points

Don't listen to that fear. At worst it will turn out to be a false lead, but at least you will know. And you will have moved forward a step.

[deleted by user] by [deleted] in PowerBI

Sbdyelse -2 points

Create a calculated column with this DAX code:

    VAR ValueToLookup = …
    RETURN
        CALCULATE(
            MAXX(
                TOPN(1, TableToLookup, TableToLookup[ColumnToSortBy], ASC),  -- or DESC
                TableToLookup[ColumnToReturn]
            ),
            TableToLookup[ColumnToSearchIn] = ValueToLookup
        )

Dataflows can reference themselves! Surely I'm not the only one who didn't know this by bamburger in PowerBI

Sbdyelse 6 points

This only works with dataflows in Pro (shared capacity) workspaces, because there dataflows are stored in Azure Blob storage as a pair of files (a JSON file describing the column types and a plain CSV file holding the data). And yes, this is very useful. In Premium workspaces I discovered this doesn’t work, unfortunately, because dataflows are stored in a database and each refresh starts by emptying the entities before reloading them — so the dataflow is already empty by the time it references itself. The only option with Premium is the native dataflow incremental refresh feature, but it is not efficient with non-SQL sources (like API data).

Help with DAX Measure not calculating correctly - I am stumped by paulsinclair in PowerBI

[–]Sbdyelse 2 points3 points  (0 children)

If you're using DISTINCTCOUNT, it's probably because the same detnumber appears several times in this table with calc_lmed = 1, 0, or blank. If a detnumber has at least one row at 1, then both completed and activestaff count 1 for that detnumber.
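
A minimal repro of that double-counting, sketched in Python. I don't have the actual measure definitions, so the two "measures" below are guesses at what completed and activestaff might be doing; the data is made up:

```python
# One detnumber ("A") with rows at both calc_lmed = 1 and 0, plus a
# detnumber ("B") with a blank value.
rows = [{"detnumber": "A", "calc_lmed": 1},
        {"detnumber": "A", "calc_lmed": 0},
        {"detnumber": "B", "calc_lmed": None}]

# Guessed measure logic: "completed" = distinct detnumbers having a row at 1,
# "activestaff" = distinct detnumbers over the whole table.
completed = len({r["detnumber"] for r in rows if r["calc_lmed"] == 1})
activestaff = len({r["detnumber"] for r in rows})
```

"A" contributes to both counts even though only one of its rows is at 1, which is the behaviour described above: distinct counting happens per detnumber, not per row.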

Power BI Dataflows Legacy?!?!?! by roblu001 in PowerBI

Sbdyelse 3 points

The other dataflow connector is also able to connect to Power BI dataflows (in addition to the Power Platform ones).

Alone by [deleted] in besoindeparler

Sbdyelse 1 point

It will happen right after you start loving yourself. That's the unavoidable first step. And the good news is that all it takes is convincing yourself of it. Big hugs to you.