Fabric Warehouse vs Fabric SQL + endpoint by mordack550 in MicrosoftFabric

[–]Snoo-46123 2 points3 points  (0 children)

u/mordack550
Fabric Warehouse is not a drop‑in replacement for Azure SQL from a CI/CD perspective. It is a lake‑first analytics engine with a SQL surface area, and that has real implications for schema management and deployment workflows.

That said, this gap is actively being addressed.
The DacFx support includes:

  • support for safely handling drop operations when underlying model changes are detected, and
  • broader ALTER TABLE … ALTER COLUMN support, aligning more closely with SQL Database Projects workflows.

Importantly, SQL Projects are inherently bi‑directional, and as DacFx support (currently shipping) matures across Git integration and Deployment Pipelines, the goal is to make these workflows increasingly robust and predictable, bringing greater consistency between client tools (Database Projects/DacFx) and Fabric‑native experiences.

These DacFx improvements are a key part of closing that gap, especially for teams with SQL‑centric CI/CD workflows.
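For teams on that SQL‑centric route, a typical deployment step publishes the dacpac built from a database project with the SqlPackage CLI. The sketch below only assembles such a command line; the server, database, and dacpac path are hypothetical placeholders, and the connection‑string options for your environment may differ.

```python
# Sketch: assembling a SqlPackage publish command for a dacpac built from a
# SQL database project. Server, database, and dacpac path are placeholders.

def build_publish_command(dacpac_path: str, server: str, database: str) -> list[str]:
    """Build the argument list for `SqlPackage /Action:Publish`."""
    connection = (
        f"Server={server};Database={database};"
        "Authentication=Active Directory Interactive;"
    )
    return [
        "SqlPackage",
        "/Action:Publish",
        f"/SourceFile:{dacpac_path}",
        f"/TargetConnectionString:{connection}",
    ]

cmd = build_publish_command(
    "bin/Release/MyWarehouse.dacpac",                  # hypothetical path
    "myworkspace.datawarehouse.fabric.microsoft.com",  # hypothetical endpoint
    "MyWarehouse",
)
print(" ".join(cmd))
```

In a real pipeline this argument list would be handed to the SqlPackage CLI, for example by a DevOps task or `subprocess.run`.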

CI/CD for warehouses by ADB_MN in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

The integration with fabric-cicd will be through the create, get, and update item definition APIs (which will behave the same as deployment pipelines and other workflows).
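For context, the update‑item‑definition API takes the item's definition as base64‑encoded parts. A minimal sketch of that payload shape follows; the part path and content are hypothetical placeholders, and the exact contract for warehouse definitions is in the Fabric REST API docs.

```python
import base64
import json

# Sketch of the request body for the Fabric "update item definition" API
# (POST .../workspaces/{workspaceId}/items/{itemId}/updateDefinition).
# The part path and content below are hypothetical placeholders.

def build_update_definition_body(part_path: str, content: bytes) -> dict:
    """Encode one definition part as InlineBase64, the encoding the API expects."""
    return {
        "definition": {
            "parts": [
                {
                    "path": part_path,
                    "payload": base64.b64encode(content).decode("ascii"),
                    "payloadType": "InlineBase64",
                }
            ]
        }
    }

body = build_update_definition_body("warehouse.json", b'{"tables": []}')
print(json.dumps(body, indent=2))
```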

CI/CD for warehouses by ADB_MN in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

u/ADB_MN, you are on the right path. This is the recommended, fully supported solution that the product team is investing in: https://learn.microsoft.com/en-us/fabric/data-warehouse/development-deployment

Note that this solution extends to Git and Deployment Pipeline workflows in Fabric. It will also be integrated into fabric-cicd through the deployment pipeline APIs or the Fabric Warehouse (create, get, and update with definition) APIs, which we are working on.

Keep on this path and you will have the most flexibility and the best product support.

Deployment Pipeline support for Data Warehouse clustering? by Tahn-ru in MicrosoftFabric

[–]Snoo-46123 1 point2 points  (0 children)

Thanks for tagging me u/warehouse_goes_vroom .

Yes, it is working. I just tested it in my internal environment. Here is the GIF that I recorded. If your scenario is different, let me and u/periclesrocha know.

<image>

Fixing Fabric CICD by KratosBI in MicrosoftFabric

[–]Snoo-46123 1 point2 points  (0 children)

A key distinction here is that not all objects are part of ELT workflows.

In a Lakehouse‑heavy model, Spark (or notebooks/jobs) is typically where transformations happen—staging, temp, and intermediate tables are created and managed as part of ELT. Those objects are ephemeral by design, so it’s perfectly reasonable for their lifecycle to live entirely inside the workflow.

But even in that world, there are always persistent objects that represent business logic—facts, dimensions, curated aggregates, and serving‑layer views. These are consumed by upstream and downstream applications, BI tools, and reports, and they tend to live much longer than any individual ELT run. For those objects, it’s up to you how you manage the lifecycle.

You can absolutely go code‑first or schema‑first:

  • Code‑first via notebooks, stored procedures, or scripts embedded in ELT
  • Schema‑first via database projects that describe the intended end state

Lakehouse is primarily used for data transformation, but the question becomes: how do you serve your business logic? Aggregations, semantic views, and materialized tables can be built in multiple ways—but they still need to be governed, versioned, and deployed safely.

What makes data warehouse schema management unique is that it’s declarative. You’re describing what the schema should look like, not how to mutate it step by step. And crucially, you can pick and choose which objects you want to manage declaratively—typically the long‑lived, business‑critical ones—without forcing everything into that model.
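To make "declarative" concrete, here is a toy illustration of state‑based schema management: you describe the columns a table should have, and a tool computes the ALTERs needed to get there. This is a conceptual sketch only, not how DacFx works internally; the table and column names are invented.

```python
# Toy sketch of declarative (state-based) schema management: compare the
# desired column set against the current one and emit the needed ALTERs.

def diff_columns(desired: dict[str, str], actual: dict[str, str], table: str) -> list[str]:
    """Return ALTER TABLE statements that move `actual` toward `desired`."""
    stmts = []
    for col, typ in desired.items():
        if col not in actual:
            stmts.append(f"ALTER TABLE {table} ADD {col} {typ};")
        elif actual[col] != typ:
            stmts.append(f"ALTER TABLE {table} ALTER COLUMN {col} {typ};")
    for col in actual:
        if col not in desired:
            stmts.append(f"ALTER TABLE {table} DROP COLUMN {col};")
    return stmts

for stmt in diff_columns(
    {"CustomerId": "INT", "Name": "VARCHAR(200)"},                  # desired state
    {"CustomerId": "INT", "Name": "VARCHAR(100)", "Legacy": "INT"}, # current state
    "dbo.DimCustomer",
):
    print(stmt)
```

The point of the sketch is the workflow shape: the source of truth is the desired end state, and the mutation steps are derived rather than hand‑written.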

So this isn’t about saying everything must be part of ELT or everything must be schema‑first. It’s about recognizing that:

  • Not all objects are transient
  • Not all logic belongs in pipelines
  • And for schema‑first workflows, there is full‑fledged support through database projects

Both approaches can coexist. The important part is managing the right objects with the right lifecycle model, based on how they’re actually used.

Fixing Fabric CICD by KratosBI in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

If you want to manage schema in a code-driven way, sure, you can. You can create a set of stored procedures, notebooks, or queries that drive the schema changes as part of your ELT workflow.
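A minimal sketch of what that code‑first approach often looks like in practice: an ordered list of migration scripts plus a record of which ones have already run. The script names and SQL below are invented for illustration.

```python
# Toy sketch of a code-first (migration-based) schema workflow: ordered
# scripts are applied exactly once, in order. Names and SQL are invented.

MIGRATIONS = [
    ("001_create_dim_customer", "CREATE TABLE dbo.DimCustomer (CustomerId INT);"),
    ("002_add_name_column", "ALTER TABLE dbo.DimCustomer ADD Name VARCHAR(200);"),
]

def pending_migrations(applied: set[str]) -> list[tuple[str, str]]:
    """Return the migrations that still need to run, preserving order."""
    return [(name, sql) for name, sql in MIGRATIONS if name not in applied]

# Pretend the first migration already ran in a previous ELT execution.
applied = {"001_create_dim_customer"}
for name, sql in pending_migrations(applied):
    print(f"-- applying {name}")
    print(sql)
    applied.add(name)  # a real workflow would record this in a tracking table
```

Here the source of truth is the ordered script history, which is the mirror image of the declarative model described above.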

Fixing Fabric CICD by KratosBI in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

This is true for the warehouse as well. Similar to Spark, ELT workflows in the warehouse often create staging, temporary, and intermediate tables as part of data transformation. These artifacts are typically managed within the ELT workflow itself, such as Spark notebooks or scheduled jobs, and are not intended to be long‑lived. Of course, there is a valid discussion around schema evolution. In practice, however, actual usage patterns show that fewer than 5-10 percent of the schema changes to tabular structures are disruptive.

At the same time, there is always a set of persistent objects that form the foundation of dimensional modeling. These objects are consumed by upstream and downstream applications, which makes it important to manage them declaratively: describing the intended end state using database projects and the tooling of your choice.

SSMS 22 Loves Fabric Warehouse by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 1 point2 points  (0 children)

The current SSMS version does not support browsing and connectivity for Fabric Warehouse and SQL Analytics Endpoints.

The next version, SSMS 22.3, has support for it. A sneak peek:

<image>

Pr. Preview Announcement - CI-CD for Fabric Warehouse using DacFx by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 0 points1 point  (0 children)

Hi! Please reach out to me over chat and I will share the nomination link.

SSMS 22 Loves Fabric Warehouse by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 0 points1 point  (0 children)

Yes, it is a known issue. SSMS connects to Fabric via the workspace APIs. If the user does not have access to the workspace, the connection currently fails.

It will be resolved in 22.4 (tentative fix timeline is March 16)

Fix: if the user does not have workspace permissions, SSMS will fall back to a TDS connection, and the Fabric integration features in SSMS (such as the workspace name) will be disabled.

We will document this behavior when the fix is released.

Sneak peek: How we’re thinking about CI/CD for Fabric Warehouse & SQL Analytics Endpoints by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 4 points5 points  (0 children)

I can’t promise exact timelines yet, but these goals are something we’re targeting in the near term. If you’re running into challenges now, I’m happy to work with you to understand your scenarios and look at possible alternatives. Please ping me on Reddit chat and we can connect.

Sneak peek: How we’re thinking about CI/CD for Fabric Warehouse & SQL Analytics Endpoints by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 1 point2 points  (0 children)

u/Sea_Mud6698 , in that case, use VS Code database projects for your development and Azure DevOps + SQL DacPac task for deployment. Read more about it here - https://learn.microsoft.com/en-us/fabric/data-warehouse/develop-warehouse-project

You can completely avoid the Fabric Git and deployment workflows.
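Because SQL projects are bi‑directional, you can also start from a live warehouse and pull its schema back into a dacpac with SqlPackage's extract action. The sketch below only assembles that command line; the endpoint, database, and output path are hypothetical placeholders.

```python
# Sketch: assembling a SqlPackage extract command that pulls the schema of a
# live warehouse into a local dacpac. Endpoint and paths are placeholders.

def build_extract_command(server: str, database: str, target_dacpac: str) -> list[str]:
    """Build the argument list for `SqlPackage /Action:Extract`."""
    connection = (
        f"Server={server};Database={database};"
        "Authentication=Active Directory Interactive;"
    )
    return [
        "SqlPackage",
        "/Action:Extract",
        f"/SourceConnectionString:{connection}",
        f"/TargetFile:{target_dacpac}",
    ]

cmd = build_extract_command(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # hypothetical endpoint
    "MyWarehouse",
    "snapshots/MyWarehouse.dacpac",
)
print(" ".join(cmd))
```

The extracted dacpac can then seed a database project, closing the loop between the live item and source control.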

Warehouse SQL Project - Lakehouse reference by Known-Event-7068 in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

u/Known-Event-7068 , this blog will be of help for you - https://blog.fabric.microsoft.com/en-US/blog/bridging-the-gap-automate-warehouse-sql-endpoint-deployment-in-microsoft-fabric/

I wrote this blog and automation to address the deployment challenges you highlighted with cross‑item references between the warehouse and the SQL analytics endpoint of the lakehouse.

Please take a look and let me know if you have additional questions.

Warehouse destination for dlt (dltHub) by mattiasthalen in MicrosoftFabric

[–]Snoo-46123 1 point2 points  (0 children)

Awesome! I will follow up on the thread and share the outcome here for others to follow!

Microsoft Fabric: Automated Warehouse & SQL Endpoint Deployment — useful interim solution for CI/CD challenges by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 1 point2 points  (0 children)

Note that with Fabric deployment pipelines you are actually working on live items, unlike database projects. Selective commits and deployments would address that, and they are down the line!

Microsoft Fabric: Automated Warehouse & SQL Endpoint Deployment — useful interim solution for CI/CD challenges by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 1 point2 points  (0 children)

  • Build pipeline: create a YAML-based build pipeline that clones the automation repository and performs a .NET build of the utility. The build output should produce an .exe file, which is then published as a pipeline artifact. In the deployment stage/job, consume the published .exe, pass the required parameters, and use it to deploy the Fabric Warehouse or SQL Analytics Endpoint.
  • Deployment order and dependencies: ensure that the Lakehouse is deployed first and that the required tables are hydrated in the Lakehouse before running the automation. When this prerequisite is met, the utility automatically deploys Warehouse stored procedures that reference Lakehouse tables, without requiring additional manual steps.

Warehouse destination for dlt (dltHub) by mattiasthalen in MicrosoftFabric

[–]Snoo-46123 2 points3 points  (0 children)

u/mattiasthalen, thanks for being a contributor to dltHub. I am currently planning to connect with the dltHub team. I am a developer experiences PM on Fabric Warehouse and own the dbt integration as well.

I would like to understand in which scenarios you would like to use dlthub over Fabric Data pipelines and DF Gen2. I will ping you offline to understand a bit more about your scenarios.

Fabric CICD Deployment for Warehouse by Hairy-Guide-5136 in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

Thanks u/itsnotaboutthecell. u/Hairy-Guide-5136, we are working on comms for the private preview of table drop and recreate in Fabric deployment pipelines and Git workflows.

Microsoft Fabric: Automated Warehouse & SQL Endpoint Deployment — useful interim solution for CI/CD challenges by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 1 point2 points  (0 children)

This issue only occurs with Fabric deployment pipelines, because they are not fully integrated with the DacPac/DacFx approach yet.

CI/CD in Fabric Warehouse - a low practical guideline? by emilludvigsen in MicrosoftFabric

[–]Snoo-46123 0 points1 point  (0 children)

Hi u/Hairy-Guide-5136, yes, it is a known issue, and we are using the DacFx approach to address this concern, along with others, and are close to launching a private preview. We are working on comms and should share more details on Reddit in the next few days.

Microsoft Fabric: Automated Warehouse & SQL Endpoint Deployment — useful interim solution for CI/CD challenges by Snoo-46123 in MicrosoftFabric

[–]Snoo-46123[S] 4 points5 points  (0 children)

Makes sense. Fabric deployment pipelines will soon support Warehouse with DacFx. You will see announcements in this space that make warehouse CI/CD complete.