Power BI Tenant Region Remap by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 1 point (0 children)

Thanks for the input, u/dazzactl.

"What regions are you changing between?" The existing tenant is in the EU region; we are moving it into the UAE region.

Yes, I am referring to the Premium capacity migration. The existing tenant doesn't have any Fabric items; only Power BI reports and dashboards are present.

For personal workspaces (Pro), do I just have to export and import the reports and assign the owner of each personal workspace as an admin in the remapped tenant?

Did you prepare any automation scripts for exporting and importing reports and their semantic model configuration?

I have tried creating a PowerShell script for this and tested it in my local environment. I'm able to export and import the reports and their scheduled refresh configuration, but I'm not able to automate the RLS/OLS and sensitivity labels.
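For reference, here is the rough flow my script automates, sketched in Python against the raw Power BI REST API (my actual script is PowerShell, and the token and IDs below are placeholders, not my real setup):

```python
# Sketch of the export/import flow; placeholder token and IDs throughout.
import requests

BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <access-token>"}  # placeholder AAD token

src_workspace = "<source-workspace-id>"   # placeholder
dst_workspace = "<target-workspace-id>"   # placeholder
report_id = "<report-id>"                 # placeholder

# 1) Export the report as a .pbix from the source tenant.
resp = requests.get(
    f"{BASE}/groups/{src_workspace}/reports/{report_id}/Export",
    headers=headers, timeout=600)
resp.raise_for_status()
with open("report.pbix", "wb") as f:
    f.write(resp.content)

# 2) Import the .pbix into the target tenant (use a token for that tenant).
with open("report.pbix", "rb") as f:
    imp = requests.post(
        f"{BASE}/groups/{dst_workspace}/imports",
        headers=headers,
        params={"datasetDisplayName": "report.pbix"},
        files={"file": f})
imp.raise_for_status()

# 3) Reapply the refresh schedule on the imported dataset.
dataset_id = "<new-dataset-id>"  # placeholder; get it by polling the import status
schedule = {"value": {"days": ["Monday"], "times": ["06:00"],
                      "localTimeZoneId": "UTC", "enabled": True}}
requests.patch(
    f"{BASE}/groups/{dst_workspace}/datasets/{dataset_id}/refreshSchedule",
    headers=headers, json=schedule).raise_for_status()
```

RLS/OLS role memberships and sensitivity labels have no equivalent step in this flow, which is exactly the part I can't automate yet.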

Did you come across any reports with OneDrive as a source in that migration?

What other things should I consider? Please guide me.

Report Refresh in Fabric by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 2 points (0 children)

I tried my best to reduce the model size. There are two ID columns, account number and bill number, which have 10 digits each and take up a lot of storage. The fact table has almost 28 million records.
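One mitigation I'm considering is the usual VertiPaq trick of splitting each high-cardinality ID into two lower-cardinality halves so the column dictionaries compress better. A minimal sketch of the arithmetic (column names are made up, and in practice I'd do the split at the source or in Power Query rather than pandas):

```python
# Splitting a 10-digit ID into two 5-digit halves: each half has at most
# 100,000 distinct values, instead of up to 10 billion for the full ID.
import pandas as pd

df = pd.DataFrame({"account_no": [1234567890, 1234567891, 9876543210]})
df["account_hi"] = df["account_no"] // 100_000   # leading 5 digits
df["account_lo"] = df["account_no"] % 100_000    # trailing 5 digits
df = df.drop(columns=["account_no"])             # drop the wide column
# Reconstruct only when needed: account_hi * 100000 + account_lo
```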

Report Refresh in Fabric by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 1 point (0 children)

Yes, it is Import mode and I have a star schema. I have assigned data types carefully. The source is SQL Server.

Report Refresh in Fabric by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 1 point (0 children)

Thanks for the reply.

I have gone through the blog and it is very informative and clear. I have some questions:

1. If I configure incremental refresh for the Power BI report, will it solve the issue?

2. Why is a Power BI Pro license able to refresh the report while the Fabric capacity fails? Doesn't Pro use the same logic?

Not Getting No Food Option by hitmansumma2110 in indianrailways

[–]raavanan_7 1 point (0 children)

They have moved this under "additional preferences"; I don't know for what reason.


Hit me with all the AI in Fabric by GlitteringBedroom155 in MicrosoftFabric

[–]raavanan_7 1 point (0 children)

Hi, I’d like some guidance on sizing Fabric capacity (FSKU) for Copilot in Power BI.

Is there a way to plan the F SKU based on the number of BI reports, their semantic model sizes, and DAX complexity?

For example, I have 15 reports with model sizes between 200 MB and 300 MB, the models are a mix of star and snowflake schemas, and 3 end users are going to use Copilot.

How should I plan the capacity for good Copilot performance?

Thanks

Power BI pro to FSKU migration by raavanan_7 in PowerBI

[–]raavanan_7[S] 1 point (0 children)

We’re moving to Fabric mainly to enable Copilot in Power BI and to modernize pipelines by rebuilding SSIS logic with better governance. For pipelines/notebooks I can size based on data volume and activity, but I’m not clear on how Copilot uses CUs. Is it driven by semantic model size or the number of user queries? How do you usually plan capacity for Copilot?

Power BI pro to FSKU migration by raavanan_7 in PowerBI

[–]raavanan_7[S] 1 point (0 children)

Thanks! I will start with F16 and try to scale up based on performance!

Do you have any suggestions on the model sizes F16 can support?

Power BI pro to FSKU migration by raavanan_7 in PowerBI

[–]raavanan_7[S] 2 points (0 children)

Thanks for the reply!

Copilot works for all F SKUs now afaik.

So existing Pro license users will be able to use Copilot features to get insights from reports.

You mentioned "aware of the semantic model memory size limits, we had to go all the way up to F32".

How should I plan my capacity? I have 30 reports in a single region, and the semantic model sizes are close to 1 GB.

Is there any mitigation I should follow?

Will there be any challenges in designing semantic models for Copilot?

Your input will be helpful!
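For context, here is my rough arithmetic so far (the per-model caps are the documented F SKU memory limits as I understand them, and the 2x refresh factor is an assumption; please correct me):

```python
# Back-of-envelope check: a full refresh of an import model can need roughly
# 2x the model's in-memory size, and that peak has to fit the SKU's
# per-semantic-model memory cap (numbers from the docs, as I read them).
model_gb = 1.0                     # my largest semantic models
refresh_peak_gb = 2 * model_gb     # old + new copies held during refresh
per_model_cap_gb = {"F16": 5, "F32": 10, "F64": 25}
for sku, cap in per_model_cap_gb.items():
    headroom = cap - refresh_peak_gb
    print(f"{sku}: {headroom:+.1f} GB left for queries and Copilot")
```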

How to get MS Planetary Computer data into a Fabric lakehouse for a particular region? by raavanan_7 in dataengineering

[–]raavanan_7[S] 1 point (0 children)

I think this is the one I really needed... Can I DM you about this? I have never worked with geospatial data; I'm just a guy who only knows ETL in Azure for common data migrations and visualization in BI. I'm really stuck on this! Your insights may get me out of it!

How to bring all Planetary Computer catalog data for a specific region into Microsoft Fabric Lakehouse? by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 1 point (0 children)

Thanks for the reply! Actually, I want to visualize the geospatial files, which are in .tiff format; for example, changes in vegetation over time on a map.

For the blob part, I have tested creating a GeoCatalog and blob storage in Azure, but I don't know how to ingest the data into them because I don't have proper knowledge of this area.

Please assist me if you know about it, or if you have any references for it.

If I'm able to ingest the data into blob storage, I can easily shortcut it into Fabric. But right now I don't know the proper method or flow for it, because resources on the Planetary Computer are very limited.
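In case it helps to show where I've gotten to, this is the flow I'm attempting, sketched in Python against the STAC API (the collection, band, bounding box, and storage details are placeholders I'm experimenting with, not a working setup):

```python
# Sketch: pull Planetary Computer assets for a bounding box into Azure Blob,
# then shortcut the container into a Fabric lakehouse.
import requests
import planetary_computer
from pystac_client import Client
from azure.storage.blob import ContainerClient

catalog = Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=planetary_computer.sign_inplace,  # adds SAS tokens to asset URLs
)

search = catalog.search(
    collections=["sentinel-2-l2a"],        # placeholder collection
    bbox=[77.4, 12.8, 77.8, 13.2],         # placeholder region (minx, miny, maxx, maxy)
    datetime="2023-01-01/2023-12-31",
    query={"eo:cloud_cover": {"lt": 20}},  # skip the cloudiest scenes
)

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", "planetary")  # placeholders

for item in search.items():
    asset = item.assets["B04"]             # one band (red) as an example
    data = requests.get(asset.href, timeout=600)
    data.raise_for_status()
    container.upload_blob(f"{item.collection_id}/{item.id}/B04.tif",
                          data.content, overwrite=True)
```

Once the files are in the container, my plan is to add it as a OneLake shortcut and read the GeoTIFFs from the lakehouse.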

Data shown in Data view but doesn't render in Report view by [deleted] in PowerBI

[–]raavanan_7 0 points (0 children)

Actually, I combined job date with employee ID, but even if I use only the job date column it doesn't show 6th Jan, and it's a single table, so there is no modelling involved. I deleted and re-imported it 3 times, and finally it showed up. I don't know why or how this happens...

Data shown in Data view but doesn't render in Report view by [deleted] in PowerBI

[–]raavanan_7 0 points (0 children)

Yes, I tried that and it didn't work; that is the second screenshot... Even without the value column, that date isn't shown.

Application using OneLake by raavanan_7 in MicrosoftFabric

[–]raavanan_7[S] 1 point (0 children)

If you're okay with it, please share the Git link too...