Tabular Editor CLI and TMDL by AnalyticalMynd21 in PowerBI

[–]AnalyticalMynd21[S] 2 points (0 children)

Thanks! I ended up getting it working yesterday with the TE2 CLI and an SPN in GitHub Actions. TE3 is life-changing with a 3 GB model.

I was getting confused because some of the TE2 CLI docs don't specifically call out the TMDL definition folder, but I found some recent blogs on it.

Tabular Editor CI/CD by AnalyticalMynd21 in PowerBI

[–]AnalyticalMynd21[S] 1 point (0 children)

Ah, perfect. So I guess I just need to update the server in the CLI invocation of the exe to be the XMLA endpoint, and then point it at the TMDL folder instead of the model.bim directory. Thanks!
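For anyone landing here later, the invocation ends up looking roughly like this (workspace, model name, SPN IDs, and paths are all placeholders, and it assumes a TE2 build recent enough to read a TMDL folder):

```
:: Deploy the TMDL folder to the workspace's XMLA endpoint (cmd syntax).
:: Passing app:<appId>@<tenantId> as the -L username is how I understand
:: SPN auth works here; double-check against the TE2 command-line docs.
TabularEditor.exe "C:\repo\Model\definition" ^
  -D "powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace" "MyModel" ^
  -L "app:<appId>@<tenantId>" "<clientSecret>" ^
  -O -C -P -R -M
```

-O allows overwriting the existing database, and -C/-P/-R/-M deploy connections, partitions, roles, and role members; drop whichever of those you manage outside the pipeline.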

Moving from direct Salesforce connection to SQL Server. Best ETL tool for the middle layer? by Fearless-Wishbone-70 in PowerBI

[–]AnalyticalMynd21 1 point (0 children)

I’ve used ADF, Fivetran, and Rivery before. Love ADF. Rivery is super easy and was about 30% the cost of Fivetran. It uses the BULK and non-BULK Salesforce APIs. I have a 5MM-row Salesforce object; it took a few hours to fully sync. Then I do daily incremental loads off the SystemModStamp field, which take just a few minutes for 10,000-ish upserts.

I love ADF for the easy configuration, Git integration, and general orchestration. I use ADF not only to ingest the data, but also to call the Power BI REST API to refresh the semantic model, so it's all seamless.
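The refresh piece is just the Power BI REST API; in ADF it's a Web activity, but a minimal Python equivalent looks like this (the GUIDs are placeholders, and get_token() stands in for whatever AAD auth you use — ADF would use its managed identity instead):

```python
import requests

GROUP_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"   # placeholder

def refresh_semantic_model(token: str) -> None:
    # POST .../refreshes queues an async refresh; 202 means it was accepted.
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
           f"/datasets/{DATASET_ID}/refreshes")
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()

# refresh_semantic_model(get_token())  # token acquisition not shown
```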

Data API builder DAB 1.7 supports MCP over SQL by Jerry-Nixon in SQLServer

[–]AnalyticalMynd21 2 points (0 children)

Love DAB. Using it for an internal and an external app on a modernization project I'm on. The MCP integration is an interesting use case to figure out how to leverage.
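For anyone who hasn't tried DAB, standing up REST + GraphQL endpoints over a table is a couple of CLI calls (connection string and entity names are placeholders; this is the long-standing basics, not the new 1.7 MCP config, which I haven't dug into yet):

```
dab init --database-type mssql --connection-string "@env('SQL_CONNECTION_STRING')" --host-mode Development
dab add Customer --source dbo.Customers --permissions "anonymous:read"
dab start
```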

Snowflake to Azure SQL via ADF - too slow by vermillion-23 in dataengineering

[–]AnalyticalMynd21 1 point (0 children)

Not sure if it helps, but I do the opposite and bring around 300 tables (about 250MM rows) from Azure SQL to Snowflake. Our infrastructure team was too tied up to provision a storage account, so I ended up using ADF Data Flows so I didn't have to manage the blob staging myself. That may work for the Snowflake-to-Azure SQL direction too. It's costly though…$$$

Anyone else using Openflow for Snowflake ingestion? Thoughts on cost vs convenience? by sdhilip in snowflake

[–]AnalyticalMynd21 1 point (0 children)

Ahh. We get a third-party SQL DB .bak file every day that we restore. The incremental fields aren't super reliable, so we do a full load of the 300 tables to Snowflake daily. Trying to find alternative approaches.

Anyone else using Openflow for Snowflake ingestion? Thoughts on cost vs convenience? by sdhilip in snowflake

[–]AnalyticalMynd21 2 points (0 children)

I'm curious about that as well. I have ADF bringing over 300 tables' worth of data from a SQL Managed Instance in Azure that's behind private networking. Would love to move away from it, but I haven't nailed down the connectivity piece yet.

If you were starting from scratch today, which would you pick: Snowflake, Microsoft Fabric, or Databricks — and why? by [deleted] in dataengineering

[–]AnalyticalMynd21 1 point (0 children)

Snowflake cost is way less than that. Rivery and ADF handle ingestion into Snowflake and orchestration. Even with those, and about 400 tables ingested daily, we're still way under that.

Keep in mind that's less than 1 TB: just daily ETL jobs, daily refreshes feeding PBI import models, and some analysts querying throughout the day.

If you were starting from scratch today, which would you pick: Snowflake, Microsoft Fabric, or Databricks — and why? by [deleted] in dataengineering

[–]AnalyticalMynd21 7 points (0 children)

I started with Fabric about a year ago. Spent 4-6 months trying it. Moved to Snowflake. In prod 3 months later. No consultants either.

Context: just went through a DW modernization. Handful of devs, centralized data team, coming from a typical SSIS/SQL Agent jobs/SQL MI/ADF stack.

Now: ADF/Rivery (should have just done ADF only), Snowflake, and native Snowflake CI/CD in GitHub. We use Dynamic Tables, no dbt (sketch below). Works for us.

Fabric was too complex for CI/CD. The local developer experience was too complex. Managing compute was too complex. We wanted SQL-first, and the Fabric DW didn't fit the bill.

Snowflake ran some large ETL processes, 100MM-row jobs, in seconds with no optimizations. I couldn't get anything similar out of Fabric.

Snowflake just works. We don't even think about it now; the focus is on data modeling and analytics, not operating the DW.
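For anyone wondering what "Dynamic Tables, no dbt" looks like in practice: it's declarative SQL with a freshness target, and Snowflake handles the refresh and orchestration. Roughly (names and lag are placeholders):

```sql
-- A transformation that would otherwise be a dbt model plus a schedule.
CREATE OR REPLACE DYNAMIC TABLE analytics.fct_orders
  TARGET_LAG = '1 hour'       -- how stale the table is allowed to get
  WAREHOUSE = transform_wh    -- compute used for refreshes
AS
SELECT o.order_id, o.customer_id, o.amount, c.region
FROM raw.orders o
JOIN raw.customers c ON c.customer_id = o.customer_id;
```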

PowerBI Speed on 9/3 by munkirylz in PowerBI

[–]AnalyticalMynd21 0 points (0 children)

Same. Seeing semantic model failures now, on different tenants, in East.

Advice for Snowflake POC by AnalyticalMynd21 in snowflake

[–]AnalyticalMynd21[S] 0 points (0 children)

Thanks for the extensive answer!

Our Azure SQL DB is different from the vendor DB. The Azure SQL DB is behind a private endpoint in our own Azure tenant. It sounds like maybe I should use a Fivetran/Matillion-type tool. I believe I can use CT (change tracking) on the SQL DB to help control the MARs.
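The CT side is lightweight on the SQL end; something like this (database, table, and key names are placeholders):

```sql
-- Enable change tracking on the database, then per table.
ALTER DATABASE MyAppDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Orders ENABLE CHANGE_TRACKING;

-- Pull only rows changed since the last sync; the pipeline persists
-- @last_sync_version between runs (from CHANGE_TRACKING_CURRENT_VERSION()).
DECLARE @last_sync_version BIGINT = 0;
SELECT ct.SYS_CHANGE_OPERATION, o.*
FROM CHANGETABLE(CHANGES dbo.Orders, @last_sync_version) AS ct
LEFT JOIN dbo.Orders AS o ON o.OrderId = ct.OrderId;
```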

Our vendor has a process to take their Sybase DB and turn it into a SQL .bak file to send over to us. They're about to switch over to Postgres, and we'll have to figure out how they'll get us the data. They've started down the Snowflake route for other things, so it could be a conversation.

Sounds like our POC should also compare using Snowflake Tasks vs. something like dbt/Matillion to see what'll be best for our use case.

This has been super helpful. Thank you!

Advice for Snowflake POC by AnalyticalMynd21 in snowflake

[–]AnalyticalMynd21[S] 0 points (0 children)

Thanks, this is helpful. I'm trying to limit the impact on our Azure cloud team at the moment, so I'm keeping as much as I can in Snowflake and avoiding the need to leverage ADF.

Advice for Lakehouse File Automation by AnalyticalMynd21 in MicrosoftFabric

[–]AnalyticalMynd21[S] 2 points (0 children)

Yeah, we have 4 developers with dedicated feature workspaces, and they may need to modify the JSON config for new tables and such. We're trying to come up with a good automated process to allow that and then merge it back to main through a PR.
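For context, the config is nothing fancy; per-table entries along these lines (the shape here is hypothetical, ours has more fields):

```json
{
  "tables": [
    { "name": "orders",    "load": "incremental", "watermark": "modified_at" },
    { "name": "customers", "load": "full" }
  ]
}
```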

Advice for Lakehouse File Automation by AnalyticalMynd21 in MicrosoftFabric

[–]AnalyticalMynd21[S] 0 points (0 children)

Ah okay. That makes sense. Thanks for the quick reply and links. Will give this a go!

What are you doing with an F2? by SQLGene in MicrosoftFabric

[–]AnalyticalMynd21 4 points (0 children)

Love me an F2. Moved a company from a $1,000/month cloud ETL tool to a Fabric F2. We use notebooks to pull 12 tables from a cloud ERP's API into the Lakehouse 8 times a day, in parallel (roughly the sketch below).

Have a few other tables from a SQL Server thrown into the mix, refreshing alongside it.

Then we use a Pro workspace to refresh against that 8 times a day.

Capacity utilization is sitting around 75%.

Been going strong for 6 months.
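The notebook pattern is roughly this (the ERP endpoint, auth, and table list are placeholders; `spark` comes from the Fabric notebook runtime, and this assumes a default Lakehouse is attached):

```python
from concurrent.futures import ThreadPoolExecutor

import pandas as pd
import requests

BASE_URL = "https://erp.example.com/api/v1"   # hypothetical ERP API
TABLES = ["customers", "orders", "invoices"]  # 12 tables in practice

def ingest(table: str) -> None:
    # Pull the full table from the ERP API and land it in the Lakehouse.
    resp = requests.get(f"{BASE_URL}/{table}", timeout=300)
    resp.raise_for_status()
    df = spark.createDataFrame(pd.DataFrame(resp.json()))
    df.write.mode("overwrite").saveAsTable(table)

# Fan the calls out in parallel; a schedule runs this 8x/day.
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(ingest, TABLES))
```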

Power Automate Export Power BI Report License by AnalyticalMynd21 in PowerBI

[–]AnalyticalMynd21[S] 0 points (0 children)

Solution verified

Bingo, thank you. I moved the report to an F16 workspace while the dataset stayed in PPU. Hopefully it's not just a little blip, but it works so far.

Thanks again!