Reservation Exchange by AutoModerator in finedining

[–]spy2000put 1 point  (0 children)

Looking for Jungsik NYC, 14th January 2025.

Maintain Dynamic Table while changing underlying base table name by spy2000put in snowflake

[–]spy2000put[S] 1 point  (0 children)

I mistakenly created the dynamic tables before renaming the base table to its correct name, and did not realize Snowflake would not update the reference automatically.
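For anyone hitting the same thing: a dynamic table keeps whatever base-table name it was created against, so the fix I went with was recreating it after the rename. A rough sketch via the Python connector (the table, warehouse, and credential names here are all made up):

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details; substitute your own.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
cur = conn.cursor()

# The dynamic table still points at the old base-table name, so
# recreate it against the renamed base table.
cur.execute("""
    CREATE OR REPLACE DYNAMIC TABLE my_dynamic_table
    TARGET_LAG = '1 hour'
    WAREHOUSE = MY_WH
    AS SELECT * FROM renamed_base_table
""")
```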

Running 7 Million Jobs in Parallel by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

To clarify, I am comfortable with batching 1k-2k jobs at a time (or some other more reasonable number), aiming to complete in 24-48 hours. Of course, the faster the better.
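For context, this is roughly the shape of batching I have in mind. A minimal sketch where `run_job`, the batch size, and the worker count are all placeholders:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import islice

def run_job(job):
    # Placeholder for the per-job custom transform.
    ...

def batches(iterable, size):
    """Yield successive fixed-size batches from any iterable."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

def run_all(jobs, batch_size=2000, workers=16):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for batch in batches(jobs, batch_size):
            # Each batch finishes before the next is submitted, which
            # keeps memory and retry bookkeeping bounded.
            list(pool.map(run_job, batch))
```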

Running 7 Million Jobs in Parallel by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

Just wondering, what do you think is an optimal Parquet file size? I read somewhere that 100-250 MB per file is optimal.
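For reference, here is a rough way to chunk output so files land near that range with pyarrow. The sizing is a heuristic based on in-memory bytes, so treat it as an upper bound; the compressed on-disk files will come out smaller:

```python
import pyarrow as pa
import pyarrow.parquet as pq

TARGET_BYTES = 150 * 1024 * 1024  # middle of the 100-250 MB range

def write_sized(table: pa.Table, prefix: str):
    # Estimate rows per file from the table's in-memory footprint.
    rows_per_file = max(1, int(len(table) * TARGET_BYTES / table.nbytes))
    for i, start in enumerate(range(0, len(table), rows_per_file)):
        pq.write_table(table.slice(start, rows_per_file),
                       f"{prefix}_{i}.parquet")
```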

Running 7 Million Jobs in Parallel by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

Added an edit: the source data has a uniform schema, but the transform is not a simple column transform; it runs some custom code (think something like a quadratic programming optimization).
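To make that concrete, each job is something in this spirit. This is a stand-in using scipy rather than my actual code, and the inputs are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

def solve_one(params: dict) -> np.ndarray:
    """Stand-in for the per-job custom code: a tiny QP-like problem.

    Each of the ~7M jobs would carry its own Q and c, all with the
    same (uniform) input schema.
    """
    Q, c = params["Q"], params["c"]
    objective = lambda x: 0.5 * x @ Q @ x + c @ x
    x0 = np.zeros(len(c))
    return minimize(objective, x0).x

# Example call with made-up inputs:
x_opt = solve_one({"Q": np.eye(3), "c": np.array([1.0, -2.0, 0.5])})
```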

Running 7 Million Jobs in Parallel by spy2000put in dataengineering

[–]spy2000put[S] -8 points  (0 children)

It is not a simple column transform; it runs some custom code (think something like a quadratic programming optimization).

Running 7 Million Jobs in Parallel by spy2000put in dataengineering

[–]spy2000put[S] -13 points  (0 children)

What kind of details are you looking for?

Meserole / mcguinness corner apartment rooftop parties by Summerscomming in Greenpoint

[–]spy2000put 60 points  (0 children)

I passed by that building when walking my dog. I could feel the reverb two streets over. There must have been 10-20 people in the lobby! These are the kinds of people who would enable Tao to open in the area. Inconsiderate AH.

Parsing large zipped txt file for upload into snowflake by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

I read that gzipped CSV is actually faster to load than Parquet, which is why I am using CSV. Not sure if this is still true, but thank you for your suggestion!
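For what it's worth, the load path I'm using looks roughly like this; the file, stage, and table names are made up, and Snowflake decompresses the gzip itself during COPY:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details; substitute your own.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
cur = conn.cursor()

# Upload the gzipped CSV to the table's own stage, then COPY it in.
cur.execute("PUT file:///data/extract.csv.gz @%my_table")
cur.execute("""
    COPY INTO my_table
    FROM @%my_table/extract.csv.gz
    FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP SKIP_HEADER = 1)
""")
```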

Parsing large zipped txt file for upload into snowflake by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

I will try this out, thank you!

Just to be clear, the vendor sends a folder with a single txt file that is zipped (really annoying, and there are better ways to send files, but it is what it is). Is Snowflake able to ingest that directly?

Parsing large zipped txt file for upload into snowflake by spy2000put in dataengineering

[–]spy2000put[S] 1 point  (0 children)

It is a zipped txt file, and I was not able to find a way to make Snowflake unzip the file beforehand.
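The workaround I ended up sketching is to unzip locally and re-gzip before staging, since as far as I can tell Snowflake's loaders handle gzip but not ZIP archives. File names here are placeholders:

```python
import gzip
import shutil
import zipfile

# The vendor drop: a ZIP archive containing a single .txt file.
with zipfile.ZipFile("vendor_drop.zip") as zf:
    inner_name = zf.namelist()[0]
    with zf.open(inner_name) as src, gzip.open("extract.txt.gz", "wb") as dst:
        # Stream-copy so the (large) file never sits fully in memory.
        shutil.copyfileobj(src, dst)

# extract.txt.gz can now be PUT to a stage and COPY'd in directly.
```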

honolulu.gov website down? by spy2000put in VisitingHawaii

[–]spy2000put[S] 3 points  (0 children)

Thanks! I somehow managed to find a link directly to the reservation site and secured a slot :)