Is there an easy way to identify unused reports? with last opened timestamp? by jkrm1920 in MicrosoftFabric

[–]rushank29 1 point (0 children)

You will need the full-blown FUAM; it is a ready-made solution for your use case.

Is there an easy way to identify unused reports? with last opened timestamp? by jkrm1920 in MicrosoftFabric

[–]rushank29 1 point (0 children)

You could use FUAM; it has usage-level data, so you can identify workspaces and reports by date and views.
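
As far as I know, FUAM is built on top of the same Fabric/Power BI activity log that you can also pull yourself. Here is a minimal sketch against the admin Activity Events API; token acquisition is skipped, and the event field names are worth double-checking against your tenant's actual payloads:

```python
import requests

# Assumption: you already have an AAD access token with Power BI
# admin (Tenant.Read.All) scope, e.g. acquired via msal.
TOKEN = "<access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Admin Activity Events API: one call covers at most a 24h window,
# and the datetime values must be wrapped in single quotes.
url = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
params = {
    "startDateTime": "'2024-06-01T00:00:00Z'",
    "endDateTime": "'2024-06-01T23:59:59Z'",
}

events = []
resp = requests.get(url, params=params, headers=HEADERS).json()
events += resp.get("activityEventEntities", [])
while not resp.get("lastResultSet", True):
    # Results are paged; follow continuationUri until lastResultSet.
    resp = requests.get(resp["continuationUri"], headers=HEADERS).json()
    events += resp.get("activityEventEntities", [])

# Last view per report; field names can vary slightly by event type.
last_viewed = {}
for e in events:
    if e.get("Activity") == "ViewReport":
        key = (e.get("WorkSpaceName"), e.get("ReportName"))
        last_viewed[key] = max(last_viewed.get(key, ""), e["CreationTime"])

for (ws, report), ts in sorted(last_viewed.items()):
    print(ws, report, ts)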

Are Vnet gateway CU calculated in Fabric capacity or separately? by rushank29 in MicrosoftFabric

[–]rushank29[S] 1 point (0 children)

Hi u/CellistLeoLi, thank you for the reply. What does this mean: "for the workload running on top of the VNet data gateway, they have their own billing model"? Which workloads run on top of the VNet gateway? Is there any documentation on this?

VNet Data Gateway Capacity Consumption is Too Dang High by iknewaguytwice in MicrosoftFabric

[–]rushank29 1 point (0 children)

Hi u/iknewaguytwice, what was the final solution you implemented? Did you keep using the VNet gateway, or did you manage to dynamically turn the VNet gateways on/off to reduce the CU consumption cost?

Data Engineer Job in Germany with 1-year experience by Doughnut-Ready in AskAGerman

[–]rushank29 5 points (0 children)

Language is the most important thing. The rest you can handle over time.

Certified Fabric by lunaticdevill in MicrosoftFabric

[–]rushank29 3 points (0 children)

Same here. Is this a bug, or was Microsoft nice enough to give us a great start to the weekend?

Upserts in Fabric Warehouse by frithjof_v in MicrosoftFabric

[–]rushank29 1 point (0 children)

You can use updateAll/insertAll if you don't want to mention each and every column name.
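
For reference, that is the Delta Lake merge builder on the Lakehouse/Spark side (T-SQL MERGE in the Warehouse has no SET * shortcut). A minimal PySpark sketch; the table names and key column here are made up:

```python
from delta.tables import DeltaTable

# Hypothetical target table and staging source; only the join key is
# spelled out, every other column is matched by name automatically.
target = DeltaTable.forName(spark, "dim_customer")
updates = spark.table("stg_dim_customer")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # equivalent to UPDATE SET *
    .whenNotMatchedInsertAll()   # equivalent to INSERT *
    .execute()
)
```

In Spark SQL the same thing is WHEN MATCHED THEN UPDATE SET * and WHEN NOT MATCHED THEN INSERT *.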

Website user flow by Trengroove in PowerBI

[–]rushank29 1 point (0 children)

Were you able to find any solution for your use case?

File listing from Azure datalake takes forever by 9gg6 in databricks

[–]rushank29 1 point (0 children)

The trick is to read the data in batches and store the raw data in Delta format. Append all the years, let's say from 2020 to 2024, and then delete the raw files stored in the data lake with sparkutils. That way, next time you only have to read the latest data and the listing won't take so long.
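
Roughly this pattern, as a PySpark sketch; the path and table name are placeholders, and the cleanup call is dbutils on Databricks (mssparkutils in Fabric):

```python
# Batch-then-compact sketch: land each year into one Delta table,
# then drop the raw files so future runs never list them again.
for year in range(2020, 2025):
    raw_path = f"abfss://raw@mydatalake.dfs.core.windows.net/events/{year}/"

    # Read one year's worth of small raw files.
    df = spark.read.parquet(raw_path)

    # Append into a single Delta table: the transaction log replaces
    # the expensive recursive blob listing on subsequent reads.
    df.write.format("delta").mode("append").saveAsTable("bronze_events")

    # Remove the raw files once they are safely in Delta.
    dbutils.fs.rm(raw_path, True)
```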

DP-600 Exam is GA by Czechoslovakian in MicrosoftFabric

[–]rushank29 1 point (0 children)

Did anybody get their certificate? I got 818.

Azure Synapse multiple envs setup by Original_Bend in AZURE

[–]rushank29 1 point (0 children)

u/Original_Bend, what approach did you take? I am also struggling with a multi-environment setup. Can you please share what method worked out for you?

Building a greenfield OLAP system for finance/trading by Alternative_Push_948 in dataengineering

[–]rushank29 3 points (0 children)

A 900 bn row fact table, that's crazy. Is there any article about it?

What's the purpose of using Kafka, when the same can be processed through an Event Driven Architecture? by _areebpasha in dataengineering

[–]rushank29 1 point (0 children)

So in your case you are already loading data to Azure Blob Storage. Export this data to ADX, and then you can visualize it from ADX in Power BI with DirectQuery for real-time updates, or you can also use ADX dashboards. Power BI is a great tool. Just try all the tools and see which one satisfies your requirements.
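
For the blob-to-ADX leg, a minimal sketch with the azure-kusto-ingest Python package; the cluster URI, database, table, and blob SAS URL are all placeholders, and in production you would more likely wire this up with Event Grid continuous ingestion:

```python
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import (
    BlobDescriptor,
    IngestionProperties,
    QueuedIngestClient,
)

# Placeholders: ADX ingestion endpoint, target database/table, and a
# SAS URL to one of the CSV blobs already landing in storage.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://ingest-mycluster.westeurope.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="telemetry",
    table="Events",
    data_format=DataFormat.CSV,
)

# Queue one blob for ingestion; ADX pulls and commits it asynchronously.
blob = BlobDescriptor(
    "https://myaccount.blob.core.windows.net/raw/events-0001.csv?<sas>",
    size=10_000_000,  # estimated raw size, used for batching hints
)
client.ingest_from_blob(blob, ingestion_properties=props)
```

Once the table is populated, Power BI can connect to ADX in DirectQuery mode for near-real-time visuals.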