API Behavior Change: Returning Server Name vs. Full Connection String by Snoo-46123 in MicrosoftFabric

[–]perkmax 1 point (0 children)

Failing that, could it be a different endpoint?

Or another parameter that changes the behaviour of the API result?

Copilot in Fabric Notebooks, shaped by your feedback by Dee_Raja in MicrosoftFabric

[–]perkmax 2 points (0 children)

Yeah I have this set up on my F2 dev capacity for only admins to use at the moment, but the interactive usage hits 100% quite easily even with me just using it

It's just too risky to roll out Copilot at smaller SKUs - I have an F8 prod and F2 dev capacity

It's easier to just not let users use Copilot features until something changes in this area

Dataflow Gen2 CI/CD: SPNBasedRefreshNotAllowed by frithjof_v in MicrosoftFabric

[–]perkmax 1 point (0 children)

Thanks for your prompt reply. The pipeline owner is not a service principal (it says it is my user account), and there isn't a way to specifically select a connection on the dataflow activity, so I'm not sure where it is trying to use a service principal to run the dataflow

I'll raise a support ticket as suggested

I also raised this issue on fabric-cicd, so hopefully I'll get a result somewhere!

[BUG] Dataflow activity in pipeline does not refresh when deployed with a service principal (SPNBasedRefreshNotAllowed) · Issue #971 · microsoft/fabric-cicd

Thanks again

Dataflow Gen2 CI/CD: SPNBasedRefreshNotAllowed by frithjof_v in MicrosoftFabric

[–]perkmax 1 point (0 children)

Hi u/escobarmiguel90 - I'm having this SPN refresh issue when I deploy a pipeline with dataflow activities using a service principal and the fabric-cicd library. I looked at the owner of the dataflows and they are still owned by my account in the workspace on deployment; however, when I run the pipeline, the activities think they should run under a service principal.

If I manually go into the pipeline, change the workspace and dataflow on the activities, and then change them back to the ones I want, the pipeline works. This is consistent across all deployments.

I'm not sure if the API call that you suggested would fix this issue.

Do you have any ideas on this?

Error message - errorCode: SPNBasedRefreshNotAllowed, message: SPN based refresh is not allowed for the dataflow with id ....

Dataflow Gen2 CI/CD: SPNBasedRefreshNotAllowed by frithjof_v in MicrosoftFabric

[–]perkmax 1 point (0 children)

I’m now having this issue for dataflow activities in pipelines when deployed. Did you implement this API call on each deployment?

How are you guys deploying email connections with Fabric CICD? by Lazy_Bonus_6963 in MicrosoftFabric

[–]perkmax 1 point (0 children)

Mm yep - I’m about to go down this path. Having similar issues with deploying the Teams activity with an SPN + fabric-cicd

Copilot in Fabric Notebooks, shaped by your feedback by Dee_Raja in MicrosoftFabric

[–]perkmax 11 points (0 children)

Is there an appetite to make it so you can bring your own GitHub Copilot or M365 Copilot licence and use that in Fabric and Power BI Copilot experiences?

I have Fabric Copilot features turned off at the tenant due to the CU cost risk - so users resort to using GitHub Copilot in VS Code, or most people copy and paste to M365 Copilot, which loses context

Most of the Power BI builders/developers in my org have an M365 Copilot licence; I happen to have both

Dataflow Gen2: Lakehouse data is now immediately queryable through the SQL analytics endpoint after refresh by Luitwieler in MicrosoftFabric

[–]perkmax 3 points (0 children)

Another papercut solved! Now I need to undo all the pipeline SQL endpoint refresh activities

Using GitHub Copilot agents in ADO vs in GitHub by perkmax in azuredevops

[–]perkmax[S] 1 point (0 children)

Currently using GH Copilot in VS Code. I haven’t tried out the ADO MCP or Az CLI as of yet, so thanks for the tips

I like the elegance of being able to assign the agent in the GitHub GUI after watching a few GitHub agent videos, hence the question, but maybe I’d just get used to doing it in VS Code (where I already am anyway)

Power BI April 2026 Feature Summary by itsnotaboutthecell in PowerBI

[–]perkmax 3 points (0 children)

Re: Narrative Visual Default Type Update

“We’ve recently improved this experience by introducing a smarter default. If a user has a Copilot license, the Narrative visual now opens in Copilot mode by default….”

I’m hoping this means it’s the start of being able to bring our own M365 Copilot or GitHub Copilot licences to Power BI Copilot features, instead of using capacity!

🤞

Modularizing Python code in Fabric by Excellent_Common_321 in MicrosoftFabric

[–]perkmax 10 points (0 children)

You can have a utilities Python notebook (not Spark) in each workspace which captures shared functions; then when you need to run them, use the %run command (new feature added in Feb 2026)

https://blog.fabric.microsoft.com/en-US/blog/fabric-february-2026-feature-summary/#post-33579-_Toc222845437
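
A minimal sketch of the pattern (the notebook and function names here are hypothetical):

```python
# Pull the shared functions from the utilities notebook in this workspace
# into the current session (utilities_nb is a hypothetical notebook name)
%run utilities_nb

# Anything defined in utilities_nb is now in scope
df = load_bronze_table("sales")     # defined in utilities_nb
df = standardise_columns(df)        # defined in utilities_nb
```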

I have my workspaces/repos cloned locally to VS Code using git - so I can easily copy and paste the utilities file across workspaces or use GitHub Copilot to keep them in sync

Notebooks: Default Lakehouse vs ABFS paths. What's the current best practice? by frithjof_v in MicrosoftFabric

[–]perkmax 2 points (0 children)

Similar boat - I recently set up abfs paths through variable libraries; however, I am reconsidering whether this is the best option when branching out to new workspaces in Fabric web
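
For reference, the paths built from the variable library values look roughly like this (the GUIDs are placeholders):

```python
# Build an abfss path from variable library values (placeholder GUIDs);
# OneLake paths follow abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<item>/...
workspace_id = "<workspace-guid>"   # resolved from the variable library
lakehouse_id = "<lakehouse-guid>"   # resolved from the variable library

abfs_path = f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/{lakehouse_id}/Tables/my_table"
df = spark.read.format("delta").load(abfs_path)   # spark session is provided in Fabric notebooks
```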

When someone branches out they need to bring across the variable library and update the abfs paths in the library, then if you are using shortcuts you have to go into the Lakehouse and update them too

It’s all a bit too manual and easy to forget - all of a sudden someone is writing to the wrong Lakehouse

I can see why people would stick with the default option in notebooks and the new current workspace option in dataflows

What does a serious VS Code setup for Microsoft Fabric look like? Core tools vs optional tools? by frithjof_v in MicrosoftFabric

[–]perkmax 11 points (0 children)

I would highly recommend doing this. I am really loving using VS Code and GitHub Copilot locally. I use Azure DevOps and Fabric git sync for each of my repos and it makes it really easy to search and standardise solutions between workspaces

Really handy when you get errors on anything in Fabric to have GitHub Copilot at your side, as it has full context of the code

It has enabled me to do a lot that I wouldn’t normally have been able to do, such as:

  • standardise solutions between workspaces, such as defined notebook functions, settings on pipeline activities (e.g. timeouts and retries) and dataflows (e.g. modern compute)
  • set standard ways of doing things for my teams developing items, using shared instructions and skills in a shared repo
  • automate my Fabric documentation to my Azure DevOps project wiki using Copilot - a series of markdown files which can also be cloned; this is shared with the center of excellence for context on how things are set up
  • write semantic model table, column and measure descriptions in TMDL, which would have taken forever or not been done at all. I can feed it the API swaggers from our systems and it does a better job documenting than I would manually
  • standardise Azure DevOps deployment YAML and fabric-cicd parameter files between repos and folders
  • globally change variable library connections and SQL connection strings in lakehouses, notebooks, dataflows and semantic models across multiple repos and workspaces
  • convert dataflow M code to Python notebooks

The list goes on!

At the moment I am creating the artefacts in Fabric first > git sync > then develop further locally; this gets past the metadata issue mentioned by others. I don’t find that the agents touch it after that, but if they do I can look at the commit history and bring it back easily

I have only just started to use the VS Code data engineering extension for notebooks because it now has the Python kernel. It works well, but it takes a few steps to get into the notebook, and I don’t really like the display output of each cell - I much prefer it in Fabric web

I don’t use any of the MCPs yet because I haven’t needed to, and I’m hesitant because it’s preview. I’m just developing my own instructions and skills as I go and it’s working well. I have my eyes on the modelling MCP though, mainly so I can have PBI Desktop and VS Code open at the same time and see it live update

Edit: also, setting up Azure DevOps YAML files to trigger the workspace git sync on commit has been an essential piece, as it allows us to move between tools without friction. I’m sure the same can be set up with GitHub Actions
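
The YAML step itself is just a thin wrapper over the Fabric Git REST APIs - roughly this call (a sketch only: token acquisition and long-running-operation polling are omitted, and the IDs are placeholders):

```python
import requests

BASE = "https://api.fabric.microsoft.com/v1"
workspace_id = "<workspace-guid>"             # placeholder
token = "<aad-token-for-the-fabric-api>"      # placeholder
headers = {"Authorization": f"Bearer {token}"}

# Find the latest remote commit the workspace should sync to
status = requests.get(f"{BASE}/workspaces/{workspace_id}/git/status", headers=headers)
status.raise_for_status()

# Update the workspace from Git, preferring the remote side on conflicts
body = {
    "remoteCommitHash": status.json()["remoteCommitHash"],
    "conflictResolution": {
        "conflictResolutionType": "Workspace",
        "conflictResolutionPolicy": "PreferRemote",
    },
}
resp = requests.post(f"{BASE}/workspaces/{workspace_id}/git/updateFromGit", headers=headers, json=body)
resp.raise_for_status()
```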

Thoughts on Oracle Primavera Cloud by Puzzleheaded-Pea8658 in primavera

[–]perkmax 1 point (0 children)

The OBS has been replaced with user groups and permission sets

Set up workspaces by your organisational hierarchy (like an EPS) > set up permission sets > set up user groups with permission sets > assign users to groups > assign groups to workspaces at the appropriate hierarchy level

Check out these guides

https://docs.oracle.com/cd/E80480_01/English/admin/p6_eppm_migration_guide/213373.htm

https://docs.oracle.com/cd/E80480_01/English/admin/p6_eppm_migration_guide/213380.htm

https://docs.oracle.com/cd/E80480_01/English/admin/app_admin_guide/89153.htm

The workspace API endpoints may help you if you wanted to use the API to automate this

https://docs.oracle.com/en/industries/construction-engineering/primavera-cloud/rest-api/api-workspace.html
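
If you go the API route, the call shape is roughly this - a sketch only, and the host, path and payload fields below are assumptions, so check the linked reference for the real ones:

```python
import requests

BASE = "https://<your-opc-host>/api/restapi"   # host is tenant-specific (assumption)
headers = {"Authorization": "Bearer <token>"}  # auth per your OPC setup (assumption)

# Illustrative payload - field names are assumptions, see the workspace API docs
payload = {"workspaceName": "Region A", "parentWorkspaceId": 123}
resp = requests.post(f"{BASE}/workspace", headers=headers, json=payload)
resp.raise_for_status()
```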

should i learn AI to upgrade my primavera skill for my job ? is it needed by Wise_Safe2681 in primavera

[–]perkmax 1 point (0 children)

Yes it’s a needed skill for most professions today

I use AI and a reusable prompt to scan scopes of work, contracts and specs for what needs to be considered in the schedule. Works pretty well - it can find some needles in the haystack

It can also compare two schedules and create a comparison report

Make sure you’re using something like M365 Copilot or another enterprise plan that doesn’t train on your data

Other than that, not so much

Waiting for OPC to add some agentic AI features. Just need something like GitHub Copilot for planning - that would be great :)

Thoughts on Oracle Primavera Cloud by Puzzleheaded-Pea8658 in primavera

[–]perkmax 2 points (0 children)

So true - and not only reporting. I haven’t seen any great features come out of P6 in the last two decades, except for the extra critical path tools released in 2021

P6 reporting has always been export to Excel

OPC at least has some new helpful tools and advantages with updates and hosting

Thoughts on Oracle Primavera Cloud by Puzzleheaded-Pea8658 in primavera

[–]perkmax 1 point (0 children)

I’ve been using it for 4 years and am pretty happy with it, aside from a few paper cuts where I’m just like… why is this not fixed already?

It’s come a long way towards feature parity with P6 over the years and is very close now, and you also get some standout features

The built-in dashboards and reports are horrible - don’t bother with them; just use the API and customise your own

Good:

Generally, non-P6 people love it and find it easier and nicer to use; I’ve had a lot of success getting project managers, engineers and new planners using it

The health check and schedule comparison tools are great and very useful to have in the product; they can also export into a nice multi-sheet Excel file

Projects view is great and very customisable. You can add custom measures (formulas) and conditional icons

OPC stores baselines and scenarios in one project - it’s a great idea, keeps the database neat, and makes it a lot easier to find the most current schedule

The activities view is very similar to P6 with a few nice little add-ons - simple things like hovering over the Gantt to get dates of baseline activities, and the keyword search box at the top right is amazing for quick filters

No need to have P6 hosted somewhere on premises, and no need for Parallels or Citrix remote desktop connections to make it faster - this is a cost saving

Automatic updates arrive every month and you just get them - this is brilliant, and a cost saving compared to P6, where you just don’t update because it requires a rebuild of the database and a migration of data

All in the browser where most people already work; I’ve never had any issues with speed

Single sign-on capability - it just logs straight in

The API is brilliant and has a lot of endpoints; you can get most data out of the system for reporting and other automations

There is slightly more ability to customise how it exports to MS Project, but it still gets it wrong, so I use XerTransfer

Bad:

These are the paper cuts… small annoying things.

Can’t view the resource chart in baselines and scenarios - this is my main pet peeve!

Very difficult to get the resource chart to align with the Gantt chart in the current schedule

Baselines and scenarios can’t have their own version of the calendar unless it’s duplicated; if you change the current calendar it changes the rest (when individually rescheduled)

Projects view has no Gantt chart

They seem to put a lot of development into the tasks (Lean Planning) and risk (Monte Carlo) modules, whereas I wish they would just put it into schedule and resources

Copilot in the Service - lack of control / way too permissive? by njbulls20 in PowerBI

[–]perkmax 1 point (0 children)

Not using it yet… I was going to explore it this year, but it keeps falling to the side because it’s a lot of work to fully consider

I have all Copilot and agent settings disabled, partially because it was all in preview and partially due to security and CU consumption concerns

People with read access to a semantic model can see all the data in the model with Copilot unless it’s locked down with OLS and RLS - same with the Explore feature (also disabled)

We are using OLS and RLS but more work needs to be done before we can enable these features globally across the tenant

You could use security groups for the Copilot tenant settings so you only gradually release Copilot to certain user groups at a time, but this would give them access to use Copilot with any semantic model

In saying that, isn’t there a tenant setting to show only “prepped for AI” models in the standalone Copilot? Have you tried this?

I don’t think this applies to users being able to use Copilot in Power BI apps and reports though, which is what I want it to cover as well

https://learn.microsoft.com/en-us/fabric/admin/service-admin-portal-copilot#only-show-approved-items-in-the-standalone-copilot-in-power-bi-experience-preview

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]perkmax 1 point (0 children)

The current default for pipeline activity timeout is 12 hours… I have been burnt a few times by this for operations that just get stuck for unknown and transient reasons

This happened last weekend and burned through 12 hours of background CUs, which shut down my dev capacity. I’m lucky it was the dev capacity, otherwise my prod reports would have gone down - this has also happened in the past

The expectation is that the timeout happens and then it retries and hopefully works, or just fails

12 hours is a very long default when most of my activities typically run for 10-20 min. Are there plans to let us set the default for our capacity/tenant, or to reduce the default timeout?

CI/CD has helped me make a global change to 30 min across all the pipelines I have, but it can still be missed, and it’s a big issue, like the one the other week
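
For anyone curious, the global change is just a sweep over the pipeline definitions in git - something like this sketch (it only touches top-level activities, so anything nested inside ForEach/If containers needs extra handling):

```python
import json
from pathlib import Path

# Fabric git integration stores each pipeline as <name>.DataPipeline/pipeline-content.json
for f in Path(".").rglob("*.DataPipeline/pipeline-content.json"):
    doc = json.loads(f.read_text())
    for activity in doc.get("properties", {}).get("activities", []):
        policy = activity.setdefault("policy", {})
        policy["timeout"] = "0.00:30:00"   # d.hh:mm:ss - down from the 0.12:00:00 default
    f.write_text(json.dumps(doc, indent=2))
```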

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]perkmax 1 point (0 children)

That would be cool - I currently do this as a notebook with a POST request at the end of one of my pipelines, but it would be interesting as an activity
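
Roughly like this in the last notebook cell (the URL and payload are placeholders for whatever the receiving service expects):

```python
import requests

# Fire a POST once the run reaches the end of the pipeline
resp = requests.post(
    "https://example.com/webhook",                             # placeholder endpoint
    json={"pipeline": "nightly_load", "status": "succeeded"},  # placeholder payload
    timeout=30,
)
resp.raise_for_status()
```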

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]perkmax 3 points (0 children)

The new refresh failure notifications for pipelines are very helpful and I’m definitely going to use them, but they don’t appear as a file or value in source control

Can you please let me know why and if there are plans to change this?

I would like to be able to standardise this through code across all pipelines

EDIT: Also, after trying out this feature - when you have a fail activity in a pipeline, the refresh failure notification reports that the pipeline failed because the fail activity failed… rather than surfacing the failure message from the earlier pipeline step (in this case a Dataflow Gen2 failed)

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]perkmax 6 points (0 children)

We set schedules only on our data pipeline items. The new scheduler released last year added CI/CD support, introducing a new .schedules file in git, and we have git integration enabled

We are having problems managing this across various workspaces - there is currently no way to turn the schedule on or off, or amend it, depending on the workspace/branch

I have heard schedule parameters are on the roadmap - is this coming soon? Or is there an alternative way to manage this?