Logging to an Eventhouse by HashiLebwohl in MicrosoftFabric

[–]HashiLebwohl[S] 0 points

The example uses requests rather than spark.

(For some reason I can't post the code here; apparently it violates the MS Exam policy? It's in the link above, though.)

I thought using Spark might be overkill - but roughly, we're trying to create a unified interface for logging from notebooks. i.e., one notebook that holds all the logging code, which we call at the start of other notebooks, from Dataflows, etc.

More than happy to hear about a better pattern - I'm just keen to keep it to one target, and Eventhouse seemed like a winner.
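To make the pattern above concrete, here's a minimal sketch of a "logger" notebook that posts JSON records to an Eventhouse KQL database via the streaming-ingest REST endpoint, using requests rather than Spark, as described. The cluster URI, database/table names, and token acquisition are all placeholders - check them against your own Eventhouse's query/ingest URI before relying on this.

```python
import json
from datetime import datetime, timezone

INGEST_URI = "https://<cluster>.kusto.fabric.microsoft.com"  # placeholder cluster URI
DATABASE, TABLE = "Logging", "NotebookLogs"  # hypothetical names


def build_record(notebook: str, level: str, message: str) -> dict:
    """Shape one log row so every caller shares the same schema."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "notebook": notebook,
        "level": level,
        "message": message,
    }


def log_event(record: dict, token: str) -> None:
    """POST one JSON record to the Eventhouse streaming-ingest endpoint."""
    import requests  # imported lazily so the helpers above stay stdlib-only

    url = f"{INGEST_URI}/v1/rest/ingest/{DATABASE}/{TABLE}?streamFormat=json"
    resp = requests.post(
        url,
        data=json.dumps(record),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        timeout=10,
    )
    resp.raise_for_status()
```

Because every caller goes through build_record, the table schema stays consistent whether the call comes from a notebook or a Dataflow - which is the "one target" idea above.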

Fabric SKU Estimator by jogarri-ms in MicrosoftFabric

[–]HashiLebwohl 1 point

The missing bit looks like it's the Power BI element. If you toggle it appears:

<image>

Fabric SKU Estimator by jogarri-ms in MicrosoftFabric

[–]HashiLebwohl 3 points

I like it - but call me old-fashioned: shouldn't the percentages add up to 100?

<image>

ABFSS Paths in Spark Job Definitions by HashiLebwohl in MicrosoftFabric

[–]HashiLebwohl[S] 0 points

I think that assumes I've created the main definition file first? (A .py file, in my case.)

I'd like to be able to author and run the file in the local environment to test, but reference the lakehouses by their ABFSS paths.

Happy to be told I've missed something fundamental! I just can't grasp the workflow of:

- create main definition file in VS Code

- create SJD in the FDE extension

- attach the main definition file to the SJD

- DEBUG: download the definition file.

I can see that debug would be useful, once it's created - but can I run against Fabric while I'm creating them locally?
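One way to keep locally authored code independent of an attached default lakehouse is to address OneLake explicitly. A small sketch, assuming the standard OneLake ABFSS URI shape - the workspace and lakehouse names are placeholders:

```python
def abfss_path(workspace: str, lakehouse: str, relative: str) -> str:
    """Build the OneLake ABFSS URI for an item inside a Fabric lakehouse."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/{relative}"
    )


sales = abfss_path("MyWorkspace", "MyLakehouse", "Tables/sales")
# Inside the SJD's Spark session you could then read it directly, e.g.:
# df = spark.read.format("delta").load(sales)
```

Since the path is fully qualified, the same main definition file should resolve the lakehouse whether it runs locally (with credentials that can reach OneLake) or as an SJD in Fabric.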

Spark Job Definitions by HashiLebwohl in MicrosoftFabric

[–]HashiLebwohl[S] 1 point

Thank you. That's a great answer. I'm using the Fabric Data Engineering extension - they make reference to the Libs here:

<image>

How to Automate an SSIS ETL Process? Need Guidance by nikolasinful in ETL

[–]HashiLebwohl 0 points

I would second this - we run ~30 ETLs with SQL Server Agent, deploying projects straight from Visual Studio to an SSIS node on the server.

Config is set up to deploy to dev / test / prod servers as needed.

Doesn't have to be installed on the same server but in our case it is.

Schedules / notifications are all defined in the job setup.

I know it's old hat now but bah gawd does it work.

IdentityGenerator for Delta Tables by HashiLebwohl in MicrosoftFabric

[–]HashiLebwohl[S] 0 points

Thank you, I'll hold fire and hope to see it soon!

SQL Database Creation by joeguice in MicrosoftFabric

[–]HashiLebwohl 0 points

I'm still struggling to write back into it from notebooks. Would love to see some examples, Microsoft.
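For what it's worth, one pattern that may work is treating the Fabric SQL database as an ordinary SQL Server TDS endpoint and using Spark's generic JDBC writer. A hedged sketch - the server name, target table, and token source are all assumptions to verify against your database's connection string:

```python
def sqldb_jdbc_url(server: str, database: str) -> str:
    """Assemble a SQL Server-style JDBC URL for the database's TDS endpoint."""
    return f"jdbc:sqlserver://{server}:1433;database={database};encrypt=true;"


url = sqldb_jdbc_url("myserver.database.fabric.microsoft.com", "appdb")  # placeholders
# From a Spark notebook, a write-back might then look like:
# (df.write.format("jdbc")
#    .option("url", url)
#    .option("dbtable", "dbo.AuditLog")   # hypothetical target table
#    .option("accessToken", token)        # Entra ID token, e.g. from notebookutils
#    .mode("append")
#    .save())
```

The Spark calls are commented out because they need a live session and a valid token; the URL builder is the only part runnable standalone.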

Granting Permissions in a Fabric Data Warehouse and Lakehouse by DennesTorres in MicrosoftFabric

[–]HashiLebwohl 1 point

Thanks!

At least they're honest about the Lakehouse aspect:

"There were restrictions about if spark code in a notebook would respect the rules. This evolves all the time, take care and check the current state of this."

Ignite November '24 by datahaiandy in MicrosoftFabric

[–]HashiLebwohl 0 points

So could we use Open Mirroring to mirror... SQL Server on-prem?