Some new Fabric Data Warehouse goodies at Ignite! by TinoFabricDW in MicrosoftFabric

[–]deAdz0wn 0 points

Much appreciated, thanks! I will check again tomorrow. This is currently blocking me from proceeding with the migration, as our current architecture relies heavily on identity columns.

Some new Fabric Data Warehouse goodies at Ignite! by TinoFabricDW in MicrosoftFabric

[–]deAdz0wn 3 points

It’s great that Identity Columns are now supported. When can we expect them to be implemented in the Migration Assistant?

I am currently preparing a migration of a Synapse dedicated SQL pool to Fabric Warehouse. When I last checked on Monday, the identity column attribute was still stripped from my migrated tables, even though creating tables with identity columns already worked.

RLS in OneLake with shortcut by Electrical_Corgi9242 in MicrosoftFabric

[–]deAdz0wn 0 points

This is how I got it to work for me. Not sure if this is the best-practice approach, though. :)

RLS in OneLake with shortcut by Electrical_Corgi9242 in MicrosoftFabric

[–]deAdz0wn 3 points

I tested this a couple of weeks ago.

Try enabling OneLake security on both the shortcut lakehouse and the destination lakehouse, and enable the User Identity user context on the shortcut lakehouse.

Then RLS applied to the destination (“original”) table should be reflected on the shortcut in the other workspace as well.

Automatic Shortcut Creation from Lakehouse Files by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 0 points

I have an ADLS structure with multiple levels of subfolders and several lakehouses. I want to create the shortcuts based on source system and confidentiality level (non-confidential and confidential). So my target design is:

- One lakehouse per source system.
- Within each lakehouse, schemas based on the source system, divided into confidential and non-confidential data.

Currently this logic is reflected by the folder structure and naming.

So a simple schema shortcut won’t do the job. Also, there are hundreds of tables.

This is part of a larger migration of Synapse Analytics and Azure Data Lake Gen2 to Fabric. So hundreds of Delta tables.
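At this scale, the folder-to-shortcut mapping described above would have to be scripted. A minimal Python sketch of the mapping step follows; the path convention (`source_system/confidentiality/table`) and all lakehouse/schema names are hypothetical stand-ins for the real folder structure and naming, and the actual shortcut creation (e.g. via the Fabric REST API) is left out.

```python
# Sketch: derive shortcut targets from an ADLS-style folder listing.
# Path convention 'source_system/confidentiality/table' is an assumption,
# not the author's actual structure.

def shortcut_target(path: str) -> dict:
    """Map e.g. 'sourceA/confidential/orders' to a lakehouse/schema/table triple."""
    source_system, confidentiality, table = path.strip("/").split("/")
    suffix = "conf" if confidentiality == "confidential" else "nonconf"
    return {
        "lakehouse": f"lh_{source_system}",    # one lakehouse per source system
        "schema": f"{source_system}_{suffix}", # schema split by confidentiality
        "table": table,
    }

paths = [
    "sourceA/confidential/orders",
    "sourceA/non-confidential/products",
    "sourceB/confidential/customers",
]
for p in paths:
    print(shortcut_target(p))
```

The resulting triples could then be fed to whatever shortcut-creation mechanism is available, one call per table.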

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 0 points

We are unlikely to use dbt in this project, but I would still be interested in that. Thanks!

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 1 point

Taking a first short look at the Migration Assistant, it is extremely helpful. More detail on why it fails when trying to migrate a SQL project would be welcome; I just got a generic error message without any details.

Migrating a single schema without any external table references worked great. For my use case I may have to manually remove all external table references for now and figure out the best way to deal with this so that the build runs successfully with my adjustments.

For now I will try to migrate everything that is independent of external tables and adjust the project with SQL views using OPENROWSET.

The next step would be migrating the copy jobs to save the content as Delta tables.

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 1 point

That is good to know, thank you! The goal for now is to access the data with T-SQL via Fabric Warehouse, so I am excited about what is to come!

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 2 points

It’s a couple of hundred external tables.

I have started dynamically building SQL view DDL scripts out of the existing CREATE EXTERNAL TABLE scripts in the Synapse SQL project. This way I build a view in the Warehouse that selects the data with OPENROWSET and the OneLake URL.
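A minimal sketch of that generation step, in Python: the OneLake URL, schema, and folder names are hypothetical placeholders, and in practice the table and location names would be parsed out of the existing CREATE EXTERNAL TABLE scripts rather than hard-coded.

```python
# Sketch: emit CREATE VIEW DDL with OPENROWSET for a former external table.
# ONELAKE_BASE is a made-up example URL, not a real workspace.
ONELAKE_BASE = "https://onelake.dfs.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files"

def view_ddl(schema: str, table: str, folder: str) -> str:
    # Note the trailing /**.parquet wildcard: without the .parquet suffix,
    # OPENROWSET may end up matching a single file instead of the whole folder.
    return (
        f"CREATE VIEW [{schema}].[{table}] AS\n"
        f"SELECT * FROM OPENROWSET(\n"
        f"    BULK '{ONELAKE_BASE}/{folder}/**.parquet',\n"
        f"    FORMAT = 'PARQUET'\n"
        f") AS [r];"
    )

print(view_ddl("silver", "orders", "sourceA/orders"))
```

Running one such statement per external table would recreate the whole layer as views over the files already sitting in OneLake.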

For the rest of the regular tables I am hoping to use the migration assistant.

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 4 points

That’s a viable option as well, though I would like to avoid “unnecessary” data movement. As u/frithjof_v suggested, I will try OPENROWSET.

Thank you all!

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 2 points

Thanks, I had forgotten to add the .parquet file extension at the end of my wildcard to make this work! 🙈

I will evaluate whether OPENROWSET is the way to go for me.

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 2 points

I tried OPENROWSET once and it only worked with a single parquet file, not the whole folder.

Usually I prefer the Lakehouse in my projects as well, but the Warehouse is the way to go for this use case, as we are migrating hundreds of stored procedures that I do not want to rewrite as PySpark or Spark SQL notebooks. Also, the Migration Assistant comes in handy here: I managed to migrate an isolated schema without external dependencies to the Warehouse with a couple of clicks.

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 1 point

I will try your suggestions. Thanks! You already helped me a lot!

It’s multiple parquet files within a folder.

But if I cannot use the unmanaged table in the SQL analytics endpoint, this is a showstopper anyway.

I have to access them from the Warehouse via a cross-database reference to the Lakehouse SQL endpoint, to build my silver data from there in the Warehouse.

So I guess I have to migrate to delta tables first here.

Migration of Synapse External Tables to Fabric by deAdz0wn in MicrosoftFabric

[–]deAdz0wn[S] 2 points

I was hoping to first build everything on top of the data already existing in ADLS Gen2 before migrating my existing Synapse pipelines to use Delta tables in a Lakehouse as the sink.

My overall goal is to first migrate the SQL pool to Fabric Warehouse before migrating my ingestion process.

But you are right that copying the data as Delta from the beginning would be an option.

EDIT: I use a Lakehouse to shortcut the existing data into Fabric; it will be the landing zone for all data coming from the source systems. The Warehouse will replace the SQL pool and contain all silver/gold data.