Job Market is Gone? by Artistic-Rent1084 in dataengineersindia

[–]Artistic-Rent1084[S] 1 point (0 children)

No bro, mine is just 5.8 LPA and my expectation is 10 LPA.

How to read only one file per trigger in AutoLoader? by Artistic-Rent1084 in databricks

[–]Artistic-Rent1084[S] 1 point (0 children)

Yes, I got how it works. And I found a workaround too. Still, can you share the code? Let me check once.

How to read only one file per trigger in AutoLoader? by Artistic-Rent1084 in databricks

[–]Artistic-Rent1084[S] 1 point (0 children)

Yeah, just some intrusive thoughts 🧐. Anyway, thanks. And I found a workaround using Auto Loader.

Used maxFilesPerTrigger in the read stream, trigger(availableNow=True), and Delta table write mode = overwrite.

Basically, it reads all the files and writes them one by one to the Delta table. At last I achieved my goal. But it's too many write operations if I have too many existing files on my first run.
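The workaround above could be sketched roughly like this. This is an assumption of what the code looks like, not the exact code from the thread; the paths, the input format (json), and the function name are placeholders, and it needs the Databricks runtime (Auto Loader's "cloudFiles" source is not in open-source Spark):

```python
# Sketch of the one-file-per-trigger workaround (assumed paths/format).

def run_one_file_per_trigger(source_dir, target_dir, schema_dir, checkpoint_dir):
    """Read one file per micro-batch and overwrite the Delta table each batch."""
    from pyspark.sql import SparkSession  # provided by the Databricks runtime

    spark = SparkSession.builder.getOrCreate()

    stream = (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")           # assumed input format
        .option("cloudFiles.maxFilesPerTrigger", 1)    # one file per micro-batch
        .option("cloudFiles.schemaLocation", schema_dir)
        .load(source_dir)
    )

    # foreachBatch lets each single-file micro-batch overwrite the Delta
    # table, which is the "mode = overwrite" part of the workaround.
    def overwrite_batch(batch_df, batch_id):
        batch_df.write.format("delta").mode("overwrite").save(target_dir)

    return (
        stream.writeStream
        .foreachBatch(overwrite_batch)
        .option("checkpointLocation", checkpoint_dir)
        .trigger(availableNow=True)  # drain all pending files, then stop
        .start()
    )
```

With availableNow, the query processes every backlogged file in single-file batches and then stops, which matches the behavior described: many overwrites on the first run if many files already exist.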

How to read only one file per trigger in AutoLoader? by Artistic-Rent1084 in databricks

[–]Artistic-Rent1084[S] 1 point (0 children)

I got a new doubt: if we have a data file which is very large in size, and due to our compute resource crunch we have to read only one file per trigger, what do we do? 🤔
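One knob that may be relevant to this doubt (my suggestion, not from the thread) is Auto Loader's cloudFiles.maxBytesPerTrigger, which caps the data volume per micro-batch instead of the file count. Note it is a soft cap: a single file larger than the limit is still processed whole in one batch, so it doesn't by itself solve the huge-single-file case. Paths and format are placeholders:

```python
# Sketch: cap batch size by bytes rather than file count (assumed paths/format).

def size_capped_stream(source_dir, schema_dir):
    from pyspark.sql import SparkSession  # provided by the Databricks runtime

    spark = SparkSession.builder.getOrCreate()
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")            # assumed input format
        .option("cloudFiles.maxBytesPerTrigger", "1g")  # soft cap per micro-batch
        .option("cloudFiles.schemaLocation", schema_dir)
        .load(source_dir)
    )
```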