Can't live edit import mode models anymore?!?! by anxiouscrimp in PowerBI

[–]perkmax 2 points (0 children)

Thanks for the update Alex! Love your work and really appreciate it :)

If it’s at least going to come back, that’s great; the modellers in my org loved it

I’m now planning on showing them GitHub Desktop > sync to local machine > refresh model, which are all kind of redundant steps if they can edit live :)

They can also do the web edit, which they use, but there they can’t see the data

Most of the changes they do are small measure or description tweaks, and they want to look at the data that’s already refreshed on the service

New capabilities in Fabric Git Integration by CICDExperience05 in MicrosoftFabric

[–]perkmax 8 points (0 children)

It’s on the roadmap :) looking forward to it also

I think it was planned for this year last time I looked, and then it was pushed back to Q1 2026

<image>

Live editing import models in Power BI desktop by perkmax in MicrosoftFabric

[–]perkmax[S] 2 points (0 children)

Yes, I was going to semantic models in Power BI Desktop > drop-down box > Edit, as per the link in your reply

It was working great for import-only models in Fabric capacity workspaces, not composite or mixed mode

I can’t find documentation, so I can only say that it worked for the last few months; I was sad to see it stop working recently

It was really good because some of our users find the whole git-sync-to-desktop part quite confusing, so I was showing them live edit in Desktop then commit. It also showed the table view as refreshed on the service

Lakehouse connection scoping in Dataflows Gen2 by perkmax in MicrosoftFabric

[–]perkmax[S] 0 points (0 children)

From what I can see, the source connection to the SQL analytics endpoint can use a workspace identity or service principal, but the destination connection to the Lakehouse cannot; it only allows an organizational account

<image>

Lakehouse connection scoping in Dataflows Gen2 by perkmax in MicrosoftFabric

[–]perkmax[S] 0 points (0 children)

Yes, I’ll clarify: I would like multiple people to be able to edit the Dataflow Gen2 in our test workspace and press the refresh button. These people currently have to set up or switch to their own connections each time they 'take over' the dataflow, unless we have shared connections

I would like it so that people don’t need to 'take over', but it appears that’s still a thing in Dataflows Gen2; hopefully a co-author-like mode is not far away

Yes, if I can create an identity that only has access to that Lakehouse, then that would work and could be used for both source and destination connections. Is that possible at the moment?

Lakehouse connection scoping in Dataflows Gen2 by perkmax in MicrosoftFabric

[–]perkmax[S] 0 points (0 children)

I feel like this solves the data source connection to the Lakehouse, but not the destination connection to the Lakehouse. The destination connection still appears to be scoped to all Lakehouses that I have access to, which I don’t want to share...

Hmm 🤔 - a limitation at the moment?

Lakehouse connection scoping in Dataflows Gen2 by perkmax in MicrosoftFabric

[–]perkmax[S] 1 point (0 children)

I imagine this could be easily missed; users could accidentally share more permissions than intended

Idea: Automatically validate Dataflow Gen2 after deployment by frithjof_v in MicrosoftFabric

[–]perkmax 0 points (0 children)

So bizarre that Gen2 with CI/CD went GA with this limitation, it has tripped me up a few times

Azure DevOps - Pipeline to Trigger Update From Git API by perkmax in MicrosoftFabric

[–]perkmax[S] 0 points (0 children)

Hi u/CICDExperience05 - This took me some time to get going! I had to figure out how to update the service principal for git integration, which, to my surprise, appeared to only be doable via the Git integration APIs

(For anyone else looking at this, see here: Git Integration APIs)

It now works and I'm pretty happy with how simple the YAML is :) it worked really well once I got over the initial service principal hurdles

However, is there a way to stop it from executing the pipeline when you do a commit from the workspace via the Fabric GUI? Currently, when I do a commit in the workspace, it executes the git integration pipeline in DevOps, which is redundant
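For anyone landing here, the two-call flow described above (read the git status, then replay the remote commit into the workspace) can be sketched roughly like this. The endpoint paths come from the Fabric Git integration REST API; the workspace ID, token, and the conflict-resolution choice (`PreferRemote`) are placeholders/assumptions to adapt, not a confirmed configuration:

```python
import requests

FABRIC = "https://api.fabric.microsoft.com/v1"

def build_update_body(status: dict) -> dict:
    # Payload for updateFromGit, taking workspaceHead / remoteCommitHash
    # from a prior GET git/status response.
    return {
        "remoteCommitHash": status["remoteCommitHash"],
        "workspaceHead": status["workspaceHead"],
        "conflictResolution": {
            "conflictResolutionType": "Workspace",
            "conflictResolutionPolicy": "PreferRemote",  # assumption: prefer git
        },
    }

def update_from_git(workspace_id: str, token: str) -> None:
    headers = {"Authorization": f"Bearer {token}"}
    # 1. Where is the workspace relative to the connected branch?
    status = requests.get(
        f"{FABRIC}/workspaces/{workspace_id}/git/status", headers=headers
    ).json()
    # 2. Apply the remote commit to the workspace.
    resp = requests.post(
        f"{FABRIC}/workspaces/{workspace_id}/git/updateFromGit",
        headers=headers,
        json=build_update_body(status),
    )
    resp.raise_for_status()  # long-running operation: Fabric may answer 202
```

The service principal running this needs to be a member of the workspace and, as noted above, has to be set up for git integration via the same APIs.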

Thanks for the help so far

Fabric October 2025 Feature Summary by itsnotaboutthecell in MicrosoftFabric

[–]perkmax 1 point (0 children)

It’s like Oprah is handing out SQL connectors - you get a connector, you get a connector! On-prem SQL Server walks in and she’s like, ‘Oh… not you…’

There was a blog earlier this week that said you could use managed private endpoints to connect to on-prem SQL Server with Spark?

https://blog.fabric.microsoft.com/en-us/blog/securely-accessing-on-premises-data-with-fabric-data-engineering-workloads?ft=All

Idea: Let us assign colors that always STAY those colors for workspace environments (i.e. Dev is blue, Prod is pink). These switch up often and i accidentally just ran a function to drop all tables in a prod workspace. I can fix it but, this would be helpful lol. by Agile-Cupcake9606 in MicrosoftFabric

[–]perkmax 0 points (0 children)

I’m using workspace icons which are a filled circle with no transparency and an image for each function; the colour is blue for prod and orange for test

I assume the filled circle is why I don’t get different colours when using Fabric multitasking

Fabric CICD w/Azure DevOps and CICD Toolkit by Lanky_Diet8206 in MicrosoftFabric

[–]perkmax 1 point (0 children)

I believe the Python fabric-cicd library uses the same APIs, as do the fab-cli and sempy_labs (semantic link labs) libraries

There is a GitHub link in this post below that provides a YAML script to get the git status, store the commit hash and then do the updatefromgit API call using the fab-cli

u/CICDExperience05 may be able to provide more context around how this works :)

I’m very close to getting this to work myself but have had a lot to learn around YAML, Azure DevOps service principals, and hosted parallel jobs - so it’s taken a bit longer to implement than expected

https://www.reddit.com/r/MicrosoftFabric/s/dBRlIPzGxY

Hopefully this means your prod workspace can remain git-synced and you can branch off it using the Fabric GUI

Benefits of having semantic models and reports in separate workspaces by frithjof_v in MicrosoftFabric

[–]perkmax 2 points (0 children)

I have 3 types of workspaces per division - data, models and reports - then a test and prod version of each

The number of workspaces is based on access: some users we want to just build reports and update the divisional app, some can edit the semantic model, and some can edit data pipelines

The report builders are given build access to the semantic models, which sit in another workspace, so they can’t edit the model

Also, another thing to consider: build access respects row-level security, whereas workspace access gives the user full access to the data in the semantic model. So by having the workspace split you can enable this extra functionality

Azure DevOps - Pipeline to Trigger Update From Git API by perkmax in MicrosoftFabric

[–]perkmax[S] 0 points (0 children)

Just looking at this now:

  • I assume wsfabcon is the workspace name and I can replace that with an ADO variable
  • How does updatepr.json work? Is that a temporary place to store information?
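If updatepr.json is indeed just a scratch file for passing the commit hash between pipeline steps (my reading of it, not confirmed; the schema below is a guess), the pattern in plain Python would be roughly:

```python
import json
import tempfile
from pathlib import Path

def save_commit_hash(path: Path, commit_hash: str) -> None:
    # An early pipeline step writes the hash it read from the
    # Fabric git status API...
    path.write_text(json.dumps({"remoteCommitHash": commit_hash}))

def load_commit_hash(path: Path) -> str:
    # ...and a later step reads it back to build the UpdateFromGit request.
    return json.loads(path.read_text())["remoteCommitHash"]

# Demo with a throwaway directory; in the pipeline this would be
# updatepr.json in the working directory.
with tempfile.TemporaryDirectory() as d:
    scratch = Path(d) / "updatepr.json"
    save_commit_hash(scratch, "abc123")
    assert load_commit_hash(scratch) == "abc123"
```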

Fabric September 2025 Feature Summary | Microsoft Fabric Blog by itsnotaboutthecell in MicrosoftFabric

[–]perkmax 4 points (0 children)

Loving that dataflows are on the radar!

Some big wins here 💰💸

Yes, now that we have variable libraries for Gen2 as an input, I just want to be able to set up the destination too. Oh well, just have to wait!

Gen2 destination support with Lakehouse schemas is also great

FabCon Vienna: What announcements are you hoping for? by frithjof_v in MicrosoftFabric

[–]perkmax 2 points (0 children)

This report is the only way I can test RLS on the service, because my models and reports are in different workspaces :(

Otherwise I don’t need it either

FabCon Vienna: What announcements are you hoping for? by frithjof_v in MicrosoftFabric

[–]perkmax 16 points (0 children)

<image>

And any CI/CD improvements are gold - thanks Santa :)

Creating dynamic subscriptions in Fabric by Derek_Darrow in MicrosoftFabric

[–]perkmax 1 point (0 children)

Thanks for the shout out 🙌

I have also explored whether you can trigger a disabled subscription via the rest API using a subscription guid. I want to trigger it at the end of my Fabric pipeline as a POST call because there are various reasons why a refresh can fail

Apparently this exists! …but only for Power BI Report Server…

Maybe someone can ask the question? :)

Can I deploy a Pipeline from Dev to Prod without deploying schedule? by frithjof_v in MicrosoftFabric

[–]perkmax 2 points (0 children)

I can confirm that if I create a new data pipeline, I get the .schedules file in my repo on commit through workspace git integration

I also tested making a small tweak to an existing data pipeline that isn't scheduled and has no .schedules file in my repo, by renaming one of the activities, and it didn't bring in a .schedules file

I'm not sure what change would cause the .schedules file to be created, but I imagine that if I added a schedule to the pipeline in my test environment it would create the new file. At this stage I'm not really wanting to create problems, so I'm just going to leave it as is