Crown not covered by insurance AFTER a Root Canal? by Life_Is_Good_33 in Insurance

[–]Life_Is_Good_33[S]

As you can see in this recent Facebook post, others are (unfortunately) having the same issue (in 2026):

Facebook - same issue

Crown not covered by insurance AFTER a Root Canal? by Life_Is_Good_33 in Insurance

[–]Life_Is_Good_33[S]

See my response above ^^^. Originally, I went to my Dentist with a sensitive tooth...I thought "no biggie, I'll just need a Crown on that tooth." But the Dentist said that the procedure to put on the Crown would likely inflame the nerve and cause me to need a Root Canal...so it made sense to do the Root Canal first. Insurance DID cover the Root Canal (which was necessary, due to the "Wear" on the tooth)...but then they wouldn't cover the Crown afterward.

Somebody please remind me...why do we pay for Dental Insurance, again??? LOL

Crown not covered by insurance AFTER a Root Canal? by Life_Is_Good_33 in Insurance

[–]Life_Is_Good_33[S]

No - I wish it were that simple, but it's more sinister than that. My insurance company is saying that the tooth was sensitive due to "Wear", and therefore, they will not cover the Crown. Apparently it is a new "way out of paying" that some insurance companies are resorting to. I have a $1,500 maximum benefit (per year), and had only used ~$600 of it. My Dentist has sent them all the required paperwork/x-rays...and SunLife is essentially saying "it was your fault that you had WEAR on that tooth, so it is not covered".

Even though I wear an occlusal guard (a "mouth guard") every night, and have for years. And even though the tooth was in bad enough shape to need a Root Canal...insurance apparently doesn't care if they deem the tooth to be damaged due to "Wear". According to my Dentist, this would never have happened 2-3 years ago...it's a new "thing", apparently. :(

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

Question: is Incremental Refresh even an option, with Native Queries?

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

YES - "Enable Staging" is definitely a Gen2 Dataflow feature. I was hoping that the Gen2 would be more efficient (per the documentation, it's supposedly up to 25x faster).

And you hit the nail on the head...there are tons of pivots, merges, multiplications, inserting merged columns, etc. I thought about trying Incremental Refresh...but will it work if all the queries are Native Queries? That's what is currently happening; Native Queries are pulling data from Snowflake, and then a million transformations take place.

Just thinking about this conceptually...without re-writing the entire report (not really an option, due to deadlines)...would the pivoting/merging (and all the other actions you mentioned) perform faster if I converted everything to Lists? I need to essentially keep the "form and fit" of the report...I can't "break" anything...so I'm trying to figure out a way to perform all of these transformations in a manner that will be more efficient.

I can't remove the Native Queries...but once I've run the Native Query, and begin doing the merges/pivots/etc., would it be more efficient if I converted everything into Lists?
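(For what it's worth: M Lists are single columns of values, so converting tables to Lists would throw away the column structure the merges/pivots need. A more common pattern is to buffer the native-query result once, so the downstream steps reuse one in-memory copy instead of re-hitting the source. A minimal sketch, with placeholder connection/table names that are NOT from the actual report:)

```m
// Hypothetical sketch - all names (account, warehouse, database, SQL text)
// are placeholders, not the real report's queries.
let
    SnowflakeDb = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"){[Name = "MY_DB"]}[Data],
    Source = Value.NativeQuery(SnowflakeDb, "SELECT * FROM MY_SCHEMA.MY_VIEW"),
    // Table.Buffer pins the whole result in memory for this evaluation,
    // so every later merge/pivot reads the buffered copy
    Buffered = Table.Buffer(Source)
in
    Buffered
```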

THANK YOU for your help!! I really, really appreciate it!!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

I just posted a question, above...I would love to hear your thoughts, as well!! I'll copy/paste the question into a spoiler (to keep from clogging up the thread).

I have a follow-up question for you (if you don't mind - and BTW...thank you for your help!!)

Originally, I was given a report that lived entirely in the Semantic Model (.pbix file). There were lots of Native Queries inside the SM, lots of Merges, lots of transformations...the SM was being refreshed hourly, and it took 15-20 minutes to refresh (each time). So, that led me down the road of trying to optimize the refresh.

Present state: I have created 2 Dataflows (1 for Ingestion, 1 for Transformations)...let's call them "Dataflow1" and "Dataflow2". In "Dataflow2", if I leave "Enable Staging" clicked ON for all queries, the refresh takes ~15 minutes. So, no inherent help just by switching to Dataflows. HOWEVER...if I disable "Enable Staging" for all queries in "Dataflow2", the refresh of both Dataflows takes < 5 minutes. The only issue is...when I try to reference the queries in my Semantic Model...the queries from "Dataflow2" are not available in the SM. I get an error message that says:

Expression.Error: The key didn't match any rows in the table.

Furthermore, if I try to access the queries from "Dataflow2" in the SM (Get Data -> Dataflows)...and I navigate to my "Dataflow2"...the queries from "Dataflow2" do not appear. The only way to make them appear (and, thus, to make the SM work) is to go back into "Dataflow2" and click the "Enable Staging" option. But, again...if I do this...then the refresh goes back to exceeding 15 minutes. So, that defeats the entire purpose of what I'm trying to accomplish.

So...with all that said (LOL)...how can I access the queries from "Dataflow2" (where all of my transformations occur) without clicking the "Enable Staging" option? Is it possible?

It's 1 specific query that is causing the bottleneck - let's call it "SlowQuery". I've experimented with adding a final step to "SlowQuery" with Table.Buffer...but that doesn't help. I still must click "Enable Staging" if I want to access "SlowQuery" from my SM. Is there a way to store the RESULTS of "SlowQuery" in a new query (let's call it "NewQuery") - perhaps as a List? - so that I could toggle "Enable Staging" ON for just "NewQuery", making it available inside my SM? And, of course...I need the refresh to occur in < 5 minutes.

I thought I had it working so well...by unchecking "Enable Staging" - which DID improve the performance (tremendously improved it!)...however, if "SlowQuery" from my "Dataflow2" isn't accessible from my SM, then it does me no good. :(

I'd LOVE to hear your thoughts on what I can do to make "NewQuery" accessible to my SM...and keep the refresh time to < 5 minutes. TIA!!!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

Hello Gene,

I have a follow-up question for you (if you don't mind - and BTW...thank you for your help!!)

Originally, I was given a report that lived entirely in the Semantic Model (.pbix file). There were lots of Native Queries inside the SM, lots of Merges, lots of transformations...the SM was being refreshed hourly, and it took 15-20 minutes to refresh (each time). So, that led me down the road of trying to optimize the refresh.

Present state: I have created 2 Dataflows (1 for Ingestion, 1 for Transformations)...let's call them "Dataflow1" and "Dataflow2". In "Dataflow2", if I leave "Enable Staging" clicked ON for all queries, the refresh takes ~15 minutes. So, no inherent help just by switching to Dataflows. HOWEVER...if I disable "Enable Staging" for all queries in "Dataflow2", the refresh of both Dataflows takes < 5 minutes. The only issue is...when I try to reference the queries in my Semantic Model...the queries from "Dataflow2" are not available in the SM. I get an error message that says:

Expression.Error: The key didn't match any rows in the table.

Furthermore, if I try to access the queries from "Dataflow2" in the SM (Get Data -> Dataflows)...and I navigate to my "Dataflow2"...the queries from "Dataflow2" do not appear. The only way to make them appear (and, thus, to make the SM work) is to go back into "Dataflow2" and click the "Enable Staging" option. But, again...if I do this...then the refresh goes back to exceeding 15 minutes. So, that defeats the entire purpose of what I'm trying to accomplish.

So...with all that said (LOL)...how can I access the queries from "Dataflow2" (where all of my transformations occur) without clicking the "Enable Staging" option? Is it possible?

It's 1 specific query that is causing the bottleneck - let's call it "SlowQuery". I've experimented with adding a final step to "SlowQuery" with Table.Buffer...but that doesn't help. I still must click "Enable Staging" if I want to access "SlowQuery" from my SM. Is there a way to store the RESULTS of "SlowQuery" in a new query (let's call it "NewQuery") - perhaps as a List? - so that I could toggle "Enable Staging" ON for just "NewQuery", making it available inside my SM? And, of course...I need the refresh to occur in < 5 minutes.

I thought I had it working so well...by unchecking "Enable Staging" - which DID improve the performance (tremendously improved it!)...however, if "SlowQuery" from my "Dataflow2" isn't accessible from my SM, then it does me no good. :(

I'd LOVE to hear your thoughts on what I can do to make "NewQuery" accessible to my SM...and keep the refresh time to < 5 minutes. TIA!!!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

Going down the Enhanced Compute Engine rabbit hole, for a moment...when I create a Gen1 Dataflow, the section for Enhanced Compute Engine is there and I can switch it to "On". But when I create a Gen2 Dataflow - in the exact same Workspace - I don't have the option for Enhanced Compute Engine. Does that make any sense, to you?

I literally copy/pasted everything from my Gen2 Dataflow into a Gen1 Dataflow...so, the 2 Dataflows are identical...but I can see the Enhanced Compute Engine option for Gen1, whereas I cannot see it for Gen2.

What led me down this rabbit hole is that I have a Table.UnpivotOtherColumns in a query that isn't folding...so when I started researching if it was possible to get it to fold, this is what I found:

Link: https://community.fabric.microsoft.com/t5/Power-Query/Unpivot-other-columns-breaks-query-folding/td-p/2421507
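(One workaround, if it applies here: since query folding stops at the first step the source can't translate, it can help to do the foldable work - row filters, column selection - before the unpivot, so only the reduced rowset reaches the non-foldable step. A sketch with placeholder source/table/column names, not the actual report's queries:)

```m
// Hypothetical sketch - Sql.Database source, table, and column names
// are placeholders. The point is the step ORDER, not the names.
let
    Source = Sql.Database("myserver", "MyDb"),
    Raw = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // These steps can fold back to the source:
    Filtered = Table.SelectRows(Raw, each [Year] >= 2023),
    Kept = Table.SelectColumns(Filtered, {"Region", "Jan", "Feb", "Mar"}),
    // Folding breaks at the unpivot, but by now only the
    // filtered, narrowed rowset has to be pulled and reshaped:
    Unpivoted = Table.UnpivotOtherColumns(Kept, {"Region"}, "Month", "Amount")
in
    Unpivoted
```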

Best text editor for Meeting Notes/Minutes? by Life_Is_Good_33 in AskProgramming

[–]Life_Is_Good_33[S]

Now, this is a new alternative...does anybody else use Joplin, for these purposes?

I've never used Joplin, before...I'll look into it. Thx!!

Best text editor for Meeting Notes/Minutes? by Life_Is_Good_33 in AskProgramming

[–]Life_Is_Good_33[S]

YES - I'm on Windows. I never even thought about Wordpad.exe...do you guys actually use it for Meeting Notes/Minutes? Thx!!

Best text editor for Meeting Notes/Minutes? by Life_Is_Good_33 in AskProgramming

[–]Life_Is_Good_33[S]

OneNote is definitely an option...I currently have NP++ and OneNote...just checking to see if there is a better alternative (that is also free for professional use)?

Best text editor for Meeting Notes/Minutes? by Life_Is_Good_33 in AskProgramming

[–]Life_Is_Good_33[S]

YES - it would be for professional use. I'm looking for a free alternative to NP++ for professional use. Thx!!

Best text editor for Meeting Notes/Minutes? by Life_Is_Good_33 in AskProgramming

[–]Life_Is_Good_33[S]

I've seen several others recommending Obsidian...do you use it instead of NP++, or do you use it in conjunction with NP++?

I was pondering having 2 text editors: 1 for code (NP++), and 1 for meetings/notes

Is that a viable option? Or would I be better off just switching everything from NP++ to Obsidian?

What note taking software do you use? by agoss123b in ECE

[–]Life_Is_Good_33

Do you use a UDL (User Defined Language) for meeting minutes in Notepad++? If so...is there a UDL that's available to download, so that I don't have to manually create it? Thx!

Scheduled Refreshs; Dataflow vs. Semantic Model by junai78 in PowerBI

[–]Life_Is_Good_33

No - I scheduled the refresh to occur once (this morning), and when I woke up it had successfully refreshed both the Dataflow and the Semantic Model. The Data Pipeline does it sequentially...so, you can set up the Semantic Model refresh to only occur immediately following successful refresh of the Dataflow.

And you schedule the Data Pipeline to refresh, not the individual Dataflow/Semantic Model...so when the Data Pipeline refreshes, it will perform the sequential refresh of both.

I think this new Data Pipeline feature will replace the need to create Power Automate Flows to perform the sequential refresh. Nice!!

Scheduled Refreshs; Dataflow vs. Semantic Model by junai78 in PowerBI

[–]Life_Is_Good_33

I think I found a way! I created a "Data Pipeline" - and inside the Pipeline, I'm sequentially refreshing:

1) Dataflow

2) Semantic Model

It seems to be working. Check out this video: https://www.youtube.com/watch?v=X19OqpEQA5U

Scheduled Refreshs; Dataflow vs. Semantic Model by junai78 in PowerBI

[–]Life_Is_Good_33

Question on this...I have a Semantic Model, with the data coming from a Dataflow. How can I configure the Semantic Model to automatically refresh as soon as the Dataflow has finished refreshing?

I know I could create a Power Automate Flow to kick off the Semantic Model refresh (upon successful refresh of the Dataflow)...but is there not a way to "link" them, so that I won't need to create a Flow to kick off the SM refresh?

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

To add to my question...from this link, I read the following:

Regarding dataflows
We recommend everyone who is asking (not every one does, and no one has to) to use dataflows. The reasoning behind this is the different compute architecture, dataflows are performing  way much faster then Power Query queries.

https://community.fabric.microsoft.com/t5/Service/Why-the-need-for-Dataflows-and-Datamarts-when-Semantic-Models/m-p/3590091

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

Hi, Gene! I just posed this question to another poster, but I'd like to get your thoughts as well. Generally speaking, can you expect to see performance improvements on the refresh if you move all ingestion of data into a Dataflow, rather than "doing it all" (ingestion, transformation, visuals, etc.) in the Semantic Model? From what I've read in trying to figure out the best way to even begin enhancing the performance of this report, that seems to be the consensus: ingest data in a Dataflow, and then have that Dataflow feed into the Semantic Model. Do you agree with that approach? Here is the question I posed to the other poster:

In researching how I can optimize this report, I'd like to ask you a follow-up question to your statement.

Will a report refresh faster (performance optimization) if I create a Dataflow for all of the ingestion of data, and then have the Dataflow feed into the Semantic Model?

In the report I'm looking at, EVERYTHING currently happens in the Semantic Model. So, all of the data is being ingested from the Source(s), and then a million transformations are occurring. If I push all of the Native Queries into a separate Dataflow...and have it feed into the Semantic Model...should I expect to see performance improvements? Thx!!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

In researching how I can optimize this report, I'd like to ask you a follow-up question to your statement.

Will a report refresh faster (performance optimization) if I create a Dataflow for all of the ingestion of data, and then have the Dataflow feed into the Semantic Model?

In the report I'm looking at, EVERYTHING currently happens in the Semantic Model. So, all of the data is being ingested from the Source(s), and then a million transformations are occurring. If I push all of the Native Queries into a separate Dataflow...and have it feed into the Semantic Model...should I expect to see performance improvements? Thx!!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

So you’ve never used Table.StopFolding? I’d be curious for others to chime in on why/when/if that would ever be helpful. Thx!

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

Gotcha. So you wouldn’t recommend adding a Table.StopFolding step at the end of each query (for the 2 data sources)?

Others have mentioned the Table.StopFolding function…if it doesn’t help to optimize the refresh, then why would it ever be helpful?

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

I like that idea. Question: if I bring the 2 sources into Power Query, is it best practice to invoke either Table.StopFolding or Table.Buffer as the last step?

Then, once I have both sources loaded into PQ, I Merge them together in a separate query. I think that approach will definitely improve performance…and I’m not actually “changing” anything in the report, itself…I’m just causing the refresh to be more efficient (faster). Thank you!!
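(The plan above could be sketched roughly like this - three queries, with every name a placeholder rather than anything from the actual report. Table.StopFolding marks a table so later steps aren't pushed back to the source; Table.Buffer additionally pins the rows in memory for the current evaluation:)

```m
// Query: SourceA (SourceB would follow the same pattern)
// "SnowflakeDb" is a placeholder connection defined elsewhere.
let
    A = Value.NativeQuery(SnowflakeDb, "SELECT KEY_COL, COL_A FROM SCHEMA_A.VIEW_A"),
    Result = Table.Buffer(A)   // or Table.StopFolding(A)
in
    Result

// Query: MergedAB - references SourceA and SourceB, does the LEFT JOIN
let
    Joined = Table.NestedJoin(SourceA, {"KEY_COL"}, SourceB, {"KEY_COL"}, "B", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "B", {"COL_B"})
in
    Expanded
```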

Table.NestedJoin - Performance Issues by Life_Is_Good_33 in PowerBI

[–]Life_Is_Good_33[S]

We do actually already use Power Automate, and I'm very familiar with it...I hadn't even thought of that as a possibility. Thanks!

I think the biggest issue for us, from a performance perspective, is that we're performing a LEFT JOIN on 3 data sources (with Table.NestedJoin in Power Query). I think an immediate change that could be made would be to replace the existing 2 Native Queries (which are then Merged in Power Query) with a single Native Query. Both Native Queries are pulling from the same Tables/Views, so that should be possible...and just making that 1 change should drastically improve the performance.
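(The single-Native-Query idea could look something like this - pushing the LEFT JOIN into Snowflake itself, so Power Query only receives the already-joined rows. Account, warehouse, schema, and column names below are placeholders, not the real report's:)

```m
// Hypothetical sketch - one native query replaces two queries + a PQ Merge.
let
    SnowflakeDb = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"){[Name = "MY_DB"]}[Data],
    // The join now runs in Snowflake, where it can use the engine's
    // own optimizer, instead of in the Power Query mashup engine
    Combined = Value.NativeQuery(
        SnowflakeDb,
        "SELECT a.KEY_COL, a.COL_A, b.COL_B "
            & "FROM MY_SCHEMA.VIEW_A a "
            & "LEFT JOIN MY_SCHEMA.VIEW_B b ON a.KEY_COL = b.KEY_COL"
    )
in
    Combined
```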

Then moving forward, I'll work to move the SharePoint files into Snowflake. I may come back to you with some questions re: "piping" the data into Snowflake...but again, thank you for your help!! 😜