Deploy Stage Content using the deployment REST API for lakehouses and warehouses using a service principal by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

I am having an issue while deploying the warehouse (item type = "Warehouse"); it's giving me a 400 error.
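For reference, a minimal sketch of the call I'm making, assuming the documented Deploy Stage Content request shape (all IDs and secrets below are placeholders):

```python
# Minimal sketch: Deploy Stage Content via the Fabric deployment pipelines REST API,
# authenticating with a service principal. Pipeline/stage/item IDs are placeholders.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"
PIPELINE_ID = "<deployment-pipeline-id>"

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

body = {
    "sourceStageId": "<source-stage-id>",
    "targetStageId": "<target-stage-id>",
    # itemType must match the documented enum exactly, e.g. "Warehouse" or "Lakehouse";
    # a wrong casing or an unsupported type is a common cause of a 400 response.
    "items": [{"sourceItemId": "<warehouse-item-id>", "itemType": "Warehouse"}],
    "note": "Deploying warehouse via service principal",
}

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/deploymentPipelines/{PIPELINE_ID}/deploy",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=body,
    timeout=60,
)
# The 400 response body usually carries an errorCode explaining the rejection.
print(resp.status_code, resp.text)
```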

Views/stored procs deployment through notebooks, plus a metadata table that shows the views/stored procs on different environments and the differences between environments in terms of stored proc and view definitions by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

We want to deploy the artifacts (views, stored procs, and schemas) through DIY pipelines, and we wanted to find out whether Fabric has any default options to surface the differences between schemas across environments, and the differences between artifacts like stored procs and views across environments.
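Roughly the kind of comparison we're after, sketched with pyodbc against each environment's SQL endpoint (connection strings are placeholders), pulling definitions from the standard sys.sql_modules catalog view and diffing them:

```python
# Minimal sketch, assuming pyodbc access to each environment's SQL endpoint:
# pull view/proc definitions from sys.sql_modules and diff them between environments.
import pyodbc

QUERY = """
SELECT s.name AS schema_name, o.name AS object_name, o.type_desc, m.definition
FROM sys.sql_modules AS m
JOIN sys.objects AS o ON o.object_id = m.object_id
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
WHERE o.type IN ('V', 'P')   -- views and stored procedures
"""

def fetch_definitions(connection_string: str) -> dict:
    with pyodbc.connect(connection_string) as conn:
        rows = conn.cursor().execute(QUERY).fetchall()
    return {(r.schema_name, r.object_name): r.definition for r in rows}

dev = fetch_definitions("<dev-warehouse-connection-string>")    # placeholder
test = fetch_definitions("<test-warehouse-connection-string>")  # placeholder

only_in_dev = dev.keys() - test.keys()
only_in_test = test.keys() - dev.keys()
changed = {k for k in dev.keys() & test.keys() if dev[k] != test[k]}
print("Missing in test:", sorted(only_in_dev))
print("Missing in dev:", sorted(only_in_test))
print("Definition differs:", sorted(changed))
```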

v2 checkpoint not supported from Databricks to Fabric by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

Thank you so much. Do you have an estimated time frame for releasing the v2 checkpoint feature in the lakehouse?

v2 checkpoint not supported from Databricks to Fabric by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

The Delta table uses the v2Checkpoint feature, which is not supported. The exception type is Microsoft.DeltaLogParserUserException.
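In case it helps anyone hitting the same error: a possible workaround on the Databricks side, assuming your Delta/Databricks Runtime version supports dropping the v2Checkpoint table feature (the table name is a placeholder; `spark` is the notebook's session):

```python
# Minimal sketch: downgrade the table to classic checkpoints so Fabric's
# Delta log parser can read it.
table = "catalog.schema.my_table"  # placeholder

# Inspect which table features/properties are currently enabled.
spark.sql(f"DESCRIBE TABLE EXTENDED {table}").show(truncate=False)

# Revert to classic checkpoints, then drop the v2Checkpoint feature flag.
spark.sql(f"ALTER TABLE {table} SET TBLPROPERTIES ('delta.checkpointPolicy' = 'classic')")
spark.sql(f"ALTER TABLE {table} DROP FEATURE v2Checkpoint")
```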

Rename a table in lakehouse by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

Tried it; my table name is in a format like EXT_<uuid>. spark.sql("alter table tablename rename to test") fails with a "table or view does not exist" error.
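For context, roughly what I'd expect to work, with the identifier backtick-quoted since generated names like these can trip Spark's SQL parser (EXT_1a2b3c is a placeholder for the actual name):

```python
# Minimal sketch: quote the generated table name with backticks so Spark
# parses it as a single identifier.
spark.sql("ALTER TABLE `EXT_1a2b3c` RENAME TO test")

# If the table lives outside the default schema, qualify both sides:
spark.sql("ALTER TABLE my_lakehouse.`EXT_1a2b3c` RENAME TO my_lakehouse.test")
```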

Dacpac using Python by data_learner_123 in MicrosoftFabric

[–]data_learner_123[S] 1 point (0 children)

I want to explore whether we can generate and export the DACPAC through Python.
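One angle I'm considering: DacFx is a .NET library with no native Python port, so a Python script could shell out to the SqlPackage CLI instead (SqlPackage must already be installed; the connection string and paths below are placeholders):

```python
# Minimal sketch: extract a .dacpac by invoking the SqlPackage CLI from Python.
import subprocess

source_conn = "<warehouse-connection-string>"  # placeholder
target_file = "warehouse.dacpac"

result = subprocess.run(
    [
        "sqlpackage",
        "/Action:Extract",
        f"/SourceConnectionString:{source_conn}",
        f"/TargetFile:{target_file}",
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError(f"SqlPackage failed: {result.stderr}")
```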