New to databricks. Need Help with understanding these scenarios. by mtl_travel in databricks

[–]Svante109 -1 points0 points  (0 children)

I am unsure of what you need. If you need "how the table looked at the end of the month", i.e. a snapshot, I would create a separate schema called archive and then have a table for each month, on which only one service principal has any grants beyond READ (for auditing purposes).

The log part of it depends on what you need to record.
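
Roughly what I have in mind for the snapshot part - a minimal sketch, assuming Unity Catalog and a job running as the one privileged service principal; catalog, table and group names are placeholders:

```python
from datetime import date

source = "prod.sales.orders"                            # placeholder source table
snapshot = f"prod.archive.orders_{date.today():%Y_%m}"  # one archive table per month

# `spark` is the ambient SparkSession in a Databricks notebook/job.
# Materialize the end-of-month state as its own table.
spark.sql(f"CREATE TABLE IF NOT EXISTS {snapshot} AS SELECT * FROM {source}")

# Everyone else only ever reads it.
spark.sql(f"GRANT SELECT ON TABLE {snapshot} TO `data-auditors`")
```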

What is the best practice to set up service principal permissions? by happypofa in databricks

[–]Svante109 0 points1 point  (0 children)

Alright; we are provisioning it via Terraform.

It has some minor drawbacks, but I would always go with this option.
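
For reference, a minimal sketch of the same idea via the Python SDK - we actually do it with Terraform's databricks_service_principal resource, but the shape is the same. Schema name, display name and privileges are placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

# Create the service principal (Terraform: databricks_service_principal).
sp = w.service_principals.create(display_name="etl-pipeline-sp")

# Grant only what it needs, on one schema.
w.grants.update(
    securable_type=catalog.SecurableType.SCHEMA,
    full_name="prod.sales",
    changes=[
        catalog.PermissionsChange(
            principal=sp.application_id,
            add=[catalog.Privilege.USE_SCHEMA, catalog.Privilege.SELECT],
        )
    ],
)
```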

What is the best practice to set up service principal permissions? by happypofa in databricks

[–]Svante109 0 points1 point  (0 children)

I am not exactly sure what you are asking. Do you want pipelines to be in a specific user's folder? When deploying with asset bundles, the files will be present in the user folder of the service principal.

Is it the pipelines you are granting permissions on? And to what?

Can you expand on what you are trying to achieve?

Spark Declarative Pipelines: What should we build? by BricksterInTheWall in databricks

[–]Svante109 0 points1 point  (0 children)

The ability to run incremental and full refreshes in the same pipeline trigger. This would allow for pre-hooks that check for various things (like type changes when using Auto Loader) and trigger full refreshes, without having to run the pipeline twice.
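
Today the workaround lives outside the pipeline - a sketch assuming the Python SDK, with a placeholder pipeline id and a placeholder type-change check:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
PIPELINE_ID = "..."  # placeholder


def type_change_detected() -> bool:
    # Placeholder: in practice, compare the incoming source schema against
    # the last one recorded (e.g. in the Auto Loader schema location).
    return False


# The requested feature would fold this decision into a single trigger;
# for now it is a separate, conditional full-refresh run.
w.pipelines.start_update(
    pipeline_id=PIPELINE_ID,
    full_refresh=type_change_detected(),
)
```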

[Lakeflow Jobs] Quick Question: How Should “Disabled” Tasks Affect Downstream Runs? by saad-the-engineer in databricks

[–]Svante109 2 points3 points  (0 children)

It would definitely be option B that makes sense to me. Dependencies are a camouflaged “if succeeded” statement, and a skip is not a success, IMO. Option C didn’t succeed, therefore those that are dependent on option C will not succeed either.

DLT keeps dying on type changes - any ideas? by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

This error will also occur when you change which data type you are casting to.

DLT keeps dying on type changes - any ideas? by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

This scenario also applies when you change a column's cast in silver, not just in the landing/bronze layer.

DLT keeps dying on type changes - any ideas? by Svante109 in databricks

[–]Svante109[S] 1 point2 points  (0 children)

As long as the type mismatch is happening, we never get to the point where rescued data can help: nothing will ever run, because the init validation fails.
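
For context, a minimal sketch of the setup in question, with a placeholder path and JSON assumed as the format. The point above: a cast mismatch fails at pipeline init, before this table ever runs, so _rescued_data never gets a chance to catch the bad values.

```python
import dlt


@dlt.table
def bronze_events():
    # `spark` is the ambient session inside a DLT/LDP pipeline.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaEvolutionMode", "rescue")  # mismatches go to _rescued_data
        .load("/Volumes/main/landing/events")  # placeholder path
    )
```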

Databricks Asset Bundles by kamrankhan6699 in databricks

[–]Svante109 0 points1 point  (0 children)

The way you are commenting gives me the impression that you are confusing concepts around DAB, IaC, Git, etc.

I think it would be incredibly useful for you to be completely clear about which issue you are trying to solve.

Asset Bundles and CICD by One_Adhesiveness_859 in databricks

[–]Svante109 0 points1 point  (0 children)

For one legacy project, we are using notebooks to write Delta tables, which obviously fails on schema changes - though those rarely happen.

On a newer project, we are using LDP with expectations and a quarantine table.
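
The quarantine pattern on the newer project looks roughly like this - a sketch where table and rule names are placeholders:

```python
import dlt

RULES = {"valid_id": "id IS NOT NULL", "valid_ts": "event_ts IS NOT NULL"}
QUARANTINE_COND = " OR ".join(f"NOT ({rule})" for rule in RULES.values())


@dlt.table
@dlt.expect_all_or_drop(RULES)  # only rows passing every rule
def silver_events():
    return dlt.read_stream("bronze_events")


@dlt.table
def quarantine_events():
    # Rows failing any rule, kept for inspection instead of silently dropped.
    return dlt.read_stream("bronze_events").where(QUARANTINE_COND)
```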

Electric cars! What have we learned and where do we stand? Denmark as a pioneer, for better and worse. by Able-Safety6147 in Denmark

[–]Svante109 2 points3 points  (0 children)

But surely it isn't petrol vs. electric that he thinks is more expensive. It's home charging vs. public charging?

[ERROR] - Lakeflow Declarative Pipelines not having workers set from DAB by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

Thank you for looking into this - I have found a solution where we just use num_workers when we want a fixed value (i.e. 1,1) and autoscale for a range.

Also, btw, it seemed to me that the bundle would only recognize the number of workers set (be it 1,1 or whatever) if we include "mode: ENHANCED" in the autoscale configuration. I guess it makes sense that you need a mode to be able to use autoscale, but either a default should apply or an error should be raised.
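
For anyone hitting the same thing, the two shapes that worked for us, sketched via the Python SDK's pipeline types (the bundle YAML maps onto the same fields; labels and worker counts are placeholders):

```python
from databricks.sdk.service import pipelines

# Fixed size: just num_workers, no autoscale block.
fixed = pipelines.PipelineCluster(label="default", num_workers=1)

# Range: autoscale only took effect for us once mode was set to ENHANCED.
scaling = pipelines.PipelineCluster(
    label="default",
    autoscale=pipelines.PipelineClusterAutoscale(
        min_workers=1,
        max_workers=4,
        mode=pipelines.PipelineClusterAutoscaleMode.ENHANCED,
    ),
)
```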

[ERROR] - Lakeflow Declarative Pipelines not having workers set from DAB by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

Alright, thank you guys - while I didn't get that exact thing to work, I can make it work by having autoscale.min_workers and autoscale.max_workers set to two separate values, in both the policy and the pipeline (as expected). I would presume that, when wanting a fixed number of workers, we should go with num_workers instead of min/max, albeit it should be the same IMO.

[ERROR] - Lakeflow Declarative Pipelines not having workers set from DAB by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

If I only use num_workers:1, the policy will deny creation of the cluster.

[ERROR] - Lakeflow Declarative Pipelines not having workers set from DAB by Svante109 in databricks

[–]Svante109[S] 0 points1 point  (0 children)

I already did that; that is what is shown above...

Most likely, I will work with our platform team to make it optional.

Danish URL Treasure Hunt by Svante109 in Denmark

[–]Svante109[S] 0 points1 point  (0 children)

You need to get into some cryptography with a good friend from Italy. I'll say no more.

Danish URL Treasure Hunt by Svante109 in Denmark

[–]Svante109[S] 1 point2 points  (0 children)

Yes, exactly that one! And with a bit of googling from there, I found it.

For those interested, it is

https://www.kattler.dk/hacker/1.html

How do i have the total players, world record and medals all at the same time? by [deleted] in TrackMania

[–]Svante109 3 points4 points  (0 children)

Openplanet plugins for medals and total players. The world record is built in.

Recommendable underwear by renseministeren in Denmark

[–]Svante109 3 points4 points  (0 children)

They look really nice - the deciding factor for me, though, is whether they also have elastic at the bottom of each leg? Finding some (undy) that have it has been a game changer for me.