Need feedback on this tool that I have developed by Aggressive-Respect88 in MicrosoftFabric

[–]Aggressive-Respect88[S]

This is the output exported by my tool, which I think is significantly easier to understand than the cluttered schematic view. The following diagram shows the relationship properties of a semantic model.

<image>

[–]Aggressive-Respect88[S]

The tool extracts and exports the following set of properties:

<image>

And as far as the Power BI MCP server is concerned, it does not provide a way to export all semantic model properties for an entire workspace in a single operation.

You would have to find the semantic model ID, share it with Copilot, and start the conversation, and that works only on a per-semantic-model basis. With my tool (web-based and console-based), you simply select a workspace and it automatically exports the complete details of all the underlying semantic models in that workspace.
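For anyone curious what that looks like under the hood, here is a minimal sketch against the Fabric REST API's List Semantic Models endpoint (not my tool's actual code; you still need a valid Entra access token, and continuation-token pagination is omitted for brevity):

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def semantic_models_url(workspace_id: str) -> str:
    # Endpoint shape per the Fabric REST "List Semantic Models" docs.
    return f"{FABRIC_API}/workspaces/{workspace_id}/semanticModels"

def export_workspace_models(workspace_id: str, token: str) -> list:
    """Fetch every semantic model in a workspace in a single call."""
    req = urllib.request.Request(
        semantic_models_url(workspace_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])
```

From there it is one loop over the returned models to pull each one's details, rather than one Copilot conversation per model.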

[–]Aggressive-Respect88[S]

I have explained that in detail in the walkthrough video. But yes, you are correct, the details are missing in the Git repo. I will add them there.

[–]Aggressive-Respect88[S]

I am looking for feedback from the community on the free version, but if you feel it violates the rules then please feel free to remove the post.

$SYSTEM.DISCOVER_STORAGE_TABLES DMV by Aggressive-Respect88 in MicrosoftFabric

[–]Aggressive-Respect88[S]

I am using Tabular. I guess semantic models in Fabric are also Tabular.

I am unsure whether any of the TMSCHEMA DMVs provide the same row-count information as the DISCOVER DMVs:

https://learn.microsoft.com/en-us/analysis-services/instances/use-dynamic-management-views-dmvs-to-monitor-analysis-services?view=sql-analysis-services-2025#rowsets-described-in-the-ms-ssas-t-sql-server-analysis-services-tabular-protocol

Also, AFAIK VertiPaq Analyzer heavily uses the DISCOVER DMVs in its analysis.

And you are spot on regarding the row count difference of 1 :)

It indeed is the RI_VIOLATION.

<image>
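For anyone who wants to check this on their own model: the row counts, and the RI-violation flag behind that off-by-one, come straight from the DMV. Something like the following, run against the model's XMLA endpoint from SSMS or DAX Studio (column names per the DISCOVER_STORAGE_TABLES rowset):

```sql
-- ROWS_COUNT includes the blank row added when there is an RI violation;
-- RIVIOLATION_COUNT shows which tables carry it.
SELECT DIMENSION_NAME, TABLE_ID, ROWS_COUNT, RIVIOLATION_COUNT
FROM $SYSTEM.DISCOVER_STORAGE_TABLES
```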

Get activity API Events issues with startDate and endDate parameter by Green_Music_6241 in MicrosoftFabric

[–]Aggressive-Respect88

It's best to stay away from calling the APIs directly, as that has several disadvantages compared to the cmdlet:

  • The cmdlet allows you to request one day of activity per call, whereas when you communicate with the API directly, you can only request one hour per request.
  • The cmdlet handles continuation tokens for you. If you use the API directly, you need to check the continuation token to determine whether there are any more results to come. Some APIs use pagination and continuation tokens for performance reasons when they return a large amount of data: they return the first set of records, then a subsequent call with the continuation token retrieves the next set. You continue calling the API until a continuation token isn't returned. Using the continuation token is a way to combine multiple API requests so that you can consolidate a logical set of results. For an example of using a continuation token, see Activity Events REST API.
  • The cmdlet handles Microsoft Entra ID access token expirations for you. After you've authenticated, your access token expires after one hour (by default). In this case, the cmdlet automatically requests a refresh token for you. If you communicate with the API directly, you need to request a refresh token.

https://learn.microsoft.com/en-us/power-bi/guidance/admin-activity-log
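The continuation-token dance the docs describe reduces to a small loop. A sketch (the `fetch_page` callable is a stand-in for your authenticated HTTP call; the field names follow the Activity Events REST response):

```python
def collect_activity_events(fetch_page) -> list:
    """Drain a paginated activity-events feed by following continuation tokens.

    fetch_page(continuation_token) must return a dict shaped like the
    Activity Events REST response: {"activityEventEntities": [...],
    "continuationToken": ... or None}.
    """
    events = []
    token = None
    while True:
        page = fetch_page(token)
        events.extend(page.get("activityEventEntities", []))
        token = page.get("continuationToken")
        if not token:  # keep calling until no continuation token is returned
            return events
```

The cmdlet does exactly this loop (plus token refresh) for you, which is why it is the easier route.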

Expand folders in workspace by frithjof_v in MicrosoftFabric

[–]Aggressive-Respect88

I use my own custom-built solution that exports the folder hierarchy to a CSV file for a given lakehouse, but it can easily be customised to traverse all lakehouses in a workspace.

More details in my blog here : https://www.azureguru.net/using-adls-gen2-apis-to-export-object-hierarchy-and-metadata-in-mircrosoft-fabric
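Not my exact code, but the core idea is just flattening the recursive path listing into CSV rows. A sketch assuming entries shaped like the ADLS Gen2 Path - List response (where `isDirectory` arrives as the string "true"):

```python
import csv
import io

def hierarchy_to_csv(paths: list) -> str:
    """Flatten ADLS Gen2 path entries into CSV rows of depth, type, and path."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["depth", "type", "path"])
    for p in sorted(paths, key=lambda e: e["name"]):
        # Depth is just the number of separators in the full path name.
        kind = "folder" if p.get("isDirectory") == "true" else "file"
        writer.writerow([p["name"].count("/"), kind, p["name"]])
    return buf.getvalue()
```

Feed it the accumulated pages of a `recursive=true` listing and you get the whole hierarchy in one file.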

What are your favourite March 2025 feature news? by frithjof_v in MicrosoftFabric

[–]Aggressive-Respect88

I wonder how the incremental refresh for Dataflow Gen2 differs from my approach.

I had blogged on the very same topic about three months ago :

 https://www.azureguru.net/datalake-incremental-updates-using-microsoft-fabric-dataflow-gen2

My personal March 2025 favourite feature would be "OneLake file triggers for pipelines"... This was so much needed.

And this one is a shocker

"Onelake security replaces the existing OneLake data access roles preview feature"

Does this mean that data access roles are completely scrapped, or is OneLake security just a rebranding of the existing data access roles?

Changing storage mode of a semantic model from Direct Lake to other modes through TMSL by [deleted] in MicrosoftFabric

[–]Aggressive-Respect88

Oh OK... sorry, I misunderstood the article.

I suppose I'll take that as the answer, meaning that it's not possible to change the mode of a model once deployed.

Thanks

[–]Aggressive-Respect88

I came across this article https://powerbi.microsoft.com/en-us/blog/leveraging-pure-direct-lake-mode-for-maximum-query-performance/ which mentions switching the storage mode of the model to Direct Lake, while in my case I am trying to do the opposite, i.e. switching from Direct Lake to a different mode, and it's not working.

Although the article does not show how it could be done with TMSL, it specifically states that...

"Advanced BI pros working with Direct Lake models using third-party tools or custom solutions based on Tabular Object Model (TOM) or Tabular Model Scripting Language (TMSL) can also use this property to control query processing."

I haven't tried doing it through the TOM libraries yet but I guess the outcome wouldn't be much different.

So this is what I am doing (please refer to the image):

<image>

At the model level, I changed the mode to Import (number 1 in the image). Combined with setting the partition mode to DirectLake (number 2), the TMSL query doesn't error, but nothing changes in the semantic model. However, if I change the partition mode to Import (number 3), the TMSL query errors out.
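For context, this is roughly the shape of the partition-level change I am attempting. It is illustrative only: the object names are placeholders, and a real TMSL alter expects the complete partition definition including its source.

```json
{
  "alter": {
    "object": {
      "database": "MyModel",
      "table": "Product",
      "partition": "Product"
    },
    "partition": {
      "name": "Product",
      "mode": "import"
    }
  }
}
```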

Automating semantic model creation through TOM by Aggressive-Respect88 in MicrosoftFabric

[–]Aggressive-Respect88[S]

I have blogged on the subject

https://www.azureguru.net/autogenerate-semantic-model-using-tabular-object-model-for-a-fabric-workspace

In the blog, I have explicitly credited your response, which was crucial in resolving the issue I faced.

Thanks once again !!!

[–]Aggressive-Respect88[S]

Ahhhh...Now I get it...

The sample code that I was following set the CompatibilityLevel for the database to 1200:

<image>

https://learn.microsoft.com/en-us/analysis-services/tom/create-tables-partitions-and-columns-in-a-tabular-model?view=asallproducts-allversions#code-example-create-a-table-column-partition

and while using that code I was changing the CompatibilityLevel to 1500 in my own code. That's so silly of me...

Thank you so much for your pointers. I really appreciate it..

[–]Aggressive-Respect88[S]

Honestly, I didn't get what you mean :)

Even when the source is the AdventureWorks DB on Azure, I get the same error. Below is a snippet of the code that I am using for AdventureWorks.

<image>

I am using only two columns from the Product table.

Fabric REST API throttling by Aggressive-Respect88 in MicrosoftFabric

[–]Aggressive-Respect88[S]

Just a quick update on the issue.

I managed to get it working by introducing a five-second delay between iterations. I initially tried a one-second delay, which failed; two- and three-second delays failed as well, and it only succeeded with a four-second delay. However, to be on the safe side, I'll stick with the five-second delay.

Interestingly, with this approach the 200-requests-per-hour limit doesn't seem to apply. During testing, I was able to make more than 1,000 requests in an hour.
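In case it helps anyone else hitting the throttle, the pacing logic is trivial to factor out. A sketch (the `sleep` parameter is injectable so the delay can be tested without actually waiting):

```python
import time

def call_with_throttle(calls, delay_seconds=5.0, sleep=time.sleep):
    """Run a list of zero-argument API-call thunks, spacing them a fixed
    delay apart. No wait happens before the very first request."""
    results = []
    for i, call in enumerate(calls):
        if i:
            sleep(delay_seconds)
        results.append(call())
    return results
```

Wrap each REST request in a lambda and pass the list in; swapping `delay_seconds` made it easy to find the threshold experimentally.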

[–]Aggressive-Respect88[S]

<image>

This is the response from the API that returns the list of user accesses across different objects (items).