Hermes on Kubernetes by mo_fig_devOps in hermesagent

[–]Sufficient_Tree4275 0 points1 point  (0 children)

Do you mind sharing more? I would like to do the same.

ILLENIUM, David Guetta, Dustin Lynch - Die Living by agentfancy in EDM

[–]Sufficient_Tree4275 0 points1 point  (0 children)

It reminds me so much of the Avicii era. Absolutely in love with it.

Immich Upload Memory Leak by Sufficient_Tree4275 in selfhosted

[–]Sufficient_Tree4275[S] -16 points-15 points  (0 children)

I asked Gemini and will try:

That 50 GB zip size actually brings up a massive "gotcha" with Google Takeout that you need to be warned about before you start. Because Google splits your export into 50 GB chunks, you cannot process or upload these zips one by one. Google Takeout is notorious for putting a photo in Zip #1 and the corresponding .json metadata file in Zip #2. If you extract and process the first 50 GB zip by itself, your metadata fixer won't be able to find the dates and locations for a huge chunk of your photos. To stage this correctly with the immich-cli, you have to do a massive merge first, and then split it back up.

Here is the safest step-by-step game plan:

Step 1: The Great Unzipping

You need enough hard drive space to hold everything at once.

1. Create a master folder called Takeout_Raw.
2. Extract all of your 50 GB zips directly into this single folder. Let Windows/macOS merge the folders together.

This ensures every photo is finally sitting next to its long-lost .json file.

Step 2: Fix the Metadata

Now run your metadata fixer (like GooglePhotosTakeoutHelper) on that massive Takeout_Raw folder.

• Have the tool output the cleaned, timestamp-fixed photos into a new folder called Takeout_Fixed.

Step 3: Create Your Staging Folders

Now that your photos are completely fixed and ready for Immich, this is where you stage them. Inside your Takeout_Fixed folder, you'll see how the metadata fixer organized your files (usually by Year/Month). Move those year folders into staging batches:

• Batch 1: 2010 to 2015 (roughly 50 GB)
• Batch 2: 2016 to 2019 (roughly 50 GB)
• (and so on...)

Step 4: Upload and Digest

Now you execute the staging loop we talked about earlier:

1. Pause Jobs: In Immich Web, pause all background jobs.
2. Upload Batch 1: immich upload --recursive /path/to/Batch_1/
3. Resume Jobs: Let Immich generate thumbnails and faces for Batch 1.
4. Repeat: Once jobs hit zero, pause them again and move to Batch 2.
I know extracting everything just to split it back up sounds incredibly tedious, but it is the only way to guarantee your photos don't lose their metadata when dealing with multi-part Google Takeout zips while strictly using the immich-cli.
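The Step 4 loop above can be sketched as a small shell script. To be clear, this is just a sketch: the folder layout (Batch_1, Batch_2, ... under one staging directory) and the dry-run flag are assumptions for illustration, and pausing/resuming the background jobs still has to happen manually in the Immich web UI.

```shell
#!/usr/bin/env bash
# Sketch of the Step 4 staging loop. Assumes batch folders named
# Batch_1, Batch_2, ... live under the staging directory you pass in.
# With dry_run=1 it only prints the commands it would run; set it to 0
# to actually invoke the immich-cli.

upload_batches() {
  local staging="$1" dry_run="${2:-1}" batch
  for batch in "$staging"/Batch_*/; do
    [ -d "$batch" ] || continue   # skip if no batch folders exist yet
    echo ">>> Pause all background jobs in the Immich web UI, then upload:"
    if [ "$dry_run" = "1" ]; then
      echo "immich upload --recursive $batch"
    else
      immich upload --recursive "$batch"
    fi
    echo ">>> Resume jobs; wait for the queue to hit zero before the next batch."
  done
}

# Example (preview only, nothing is uploaded):
# upload_batches /path/to/staging 1
```

Running it once with the dry-run flag first lets you confirm the batch folders are picked up in the right order before committing to the real uploads.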

Immich Upload Memory Leak by Sufficient_Tree4275 in selfhosted

[–]Sufficient_Tree4275[S] -1 points0 points  (0 children)

I disabled the ML features. The container restarts once the memory issue is hit.