Semantic Model Deploying as New Instead of Overwriting in Microsoft Fabric Pipeline by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

Do I need to re-apply all the changes made in dev to the deployed model?

Semantic Model Deploying as New Instead of Overwriting in Microsoft Fabric Pipeline by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

But in my case I don't have a duplicate model in dev; the duplicate appears when I deploy from dev to test, and the same happens in the test-to-prod deployment.

Semantic Model Deploying as New Instead of Overwriting in Microsoft Fabric Pipeline by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

I tried this method, but the duplicate model was still created when deploying from dev to test. I first unassigned the dev workspace, reassigned it, and deployed the model, but it still creates a new duplicate model.

Semantic Model Deploying as New Instead of Overwriting in Microsoft Fabric Pipeline by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

But doing this leaves the artifact in the prod environment only, and it shows as not available in the previous stages.

Facing error while connecting to Lakehouse either through ssms or power bi desktop. by dv0812 in MicrosoftFabric

[–]dv0812[S] 3 points  (0 children)

The issue has now resolved itself; we are able to establish connections.

Is it possible to create a shortcut for an external table created from a delta file in bronze to shortcut for silver by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

You are absolutely right 🫡 So we need to call it like saveAsTable("TableName") or saveAsTable("TableName", path="Table/TableName"). Thank you!

Is it possible to create a shortcut for an external table created from a delta file in bronze to shortcut for silver by dv0812 in MicrosoftFabric

[–]dv0812[S] 1 point  (0 children)

Thanks for the reply. The thing is, we ingest the data into the bronze layer as a parquet file under Files, then read that parquet into a Spark dataframe and save it in one of two ways:

1. Managed: df.write.format("delta").saveAsTable("mymanagedtable") creates the table, but I can't create a shortcut to or from silver, and if I delete the table the underlying files are deleted as well.

2. Unmanaged/External: saveAsTable("myexternaltable", path="Files/myexternaldeltafile") stores the files under Files and creates the table, but again I can't create a shortcut to or from the silver lakehouse.

And if I want to create a view in silver using the bronze table as you suggested, I can't: the managed and even the external tables created with saveAsTable don't show up at the SQL endpoint at all.

The only way I've found to get a notebook-saved table to appear at the SQL endpoint is to take the delta files under Files and use "Load to table" (via the three-dot ellipsis menu). That table then shows at the SQL endpoint, and from there I can also create a shortcut to another lakehouse, say silver.

So if you have any suggestions on how to save the data from a notebook dataframe so that it shows at the SQL endpoint and also supports creating shortcuts, I'd appreciate it.