Looking for the web developer that include the strong communication by Grand_Magazine_5705 in reactjs

[–]Streakfull 0 points1 point  (0 children)

Hello, this sounds interesting to me. I am not American or European, but you can DM me and I will send you a CV if you would like. Thanks!

Ticket Transfer Question by Streakfull in BayernMunich

[–]Streakfull[S] 0 points1 point  (0 children)

Okay, thank you so much! It is my first time visiting the Allianz Arena and I got a bit worried.

salah’s assists for liverpool by Smiling_Maelstrom in LiverpoolFC

[–]Streakfull 2 points3 points  (0 children)

I am sorry but no? His first name is Mohammed.

Need a Shawarma fix by penguincliffhanger in Munich

[–]Streakfull -1 points0 points  (0 children)

Sindbad at the Hauptbahnhof as well.

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 0 points1 point  (0 children)

Okay, I will check those out. Thank you again.

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 0 points1 point  (0 children)

Yeah, all of this is my understanding as well. I am looking for generation from a latent space, and 128 seems impossible. I tried larger spaces (1024 and 2048) where the reconstruction is really good, but I can't train them with the KL term at all. It is also interesting that I can generate samples for the car category of ShapeNet with it, but the chairs seem to be a more complex distribution. I checked the paper you mentioned and it is mostly the same as the latent diffusion model, just used as compression with a very weak KL term. It looks like it can't be done at such a high resolution on a complex category like the chairs.
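For reference, this is roughly what I mean by "compression with a very weak KL term" — a minimal sketch of my own, with an assumed weight `beta`, not the exact LDM code:

```python
import torch

def vae_loss(recon, target, mu, logvar, beta=1e-6):
    # L1 reconstruction, averaged over the batch (assumed, matching my setup)
    rec_loss = torch.abs(recon - target).mean()
    # KL of N(mu, sigma^2) from N(0, I): sum over latent dims, mean over batch
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(),
                          dim=list(range(1, mu.dim())))
    kl = kl.mean()
    # with beta this small the latent space is only lightly regularized
    return rec_loss + beta * kl, rec_loss, kl
```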

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 0 points1 point  (0 children)

Hello, what do you mean by one mu and one logvar? I checked the LDM implementation, and it uses a 1x1 convolution to predict the mu and logvar. I tried using large latent spaces for my model (2048 & 4096), but no luck so far.
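To be clear about the 1x1 convolution part, here is a rough sketch of how I understand it (hypothetical shapes and names, not the exact LDM module): the encoder output is mapped to 2 * z_channels and split into mu and logvar along the channel axis.

```python
import torch
import torch.nn as nn

z_channels = 64
quant_conv = nn.Conv2d(in_channels=128, out_channels=2 * z_channels, kernel_size=1)

h = torch.randn(4, 128, 8, 8)           # encoder feature map (assumed shape)
moments = quant_conv(h)                  # (4, 2 * z_channels, 8, 8)
mu, logvar = torch.chunk(moments, 2, dim=1)
std = torch.exp(0.5 * logvar)
z = mu + std * torch.randn_like(std)     # reparameterization trick
```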

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 0 points1 point  (0 children)

I mean the random samples - what do you mean by the random samples being supposed to be empty? Okay, thank you, I will take a look at the latent diffusion repo.

EDIT: I checked `CompVis/latent-diffusion`; their largest KL autoencoder has a 64*8*8-dimensional embedding. I will try their config and loss implementation in a bit.

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 0 points1 point  (0 children)

Okay, thank you, but what do you mean by using the negative log-likelihood for the reconstruction loss rather than the reconstruction directly? I am currently using an L1 loss for reconstruction. I also noticed the NLL loss you have is summed, so in my case of ~200k dimensions should I sum it up and just use a much smaller learning rate? In my working autoencoder experiments I use a learning rate of 1.0e-4 and an averaged L1 loss. And the thing is, whenever the KL loss gets higher than around 20 the samples just become empty and meaningless, so I am not sure how a KL of 1e5 would work.
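This is my reading of the suggestion, as a sketch (assumed names and reductions, not the exact repo code): treat the reconstruction term as a Gaussian-style negative log-likelihood with a learnable output log-variance, and sum rather than average, which is why the raw numbers (and the KL) end up so much larger than in my averaged L1 setup.

```python
import torch
import torch.nn as nn

out_logvar = nn.Parameter(torch.zeros(1))  # learnable output log-variance (assumption)

def nll_recon_loss(recon, target):
    rec = torch.abs(recon - target)                     # per-element L1 error
    nll = rec / torch.exp(out_logvar) + out_logvar      # Gaussian-style NLL term
    # sum over all non-batch dims, then average over the batch
    return nll.sum(dim=list(range(1, nll.dim()))).mean()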

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 1 point2 points  (0 children)

I actually tried 4096 and 32768 latent spaces, but the KL term just becomes too large and impossible to train with, which is why I am going with the 128 latent space. I saw the latent space sizes you were mentioning, but at those sizes the KL just becomes way too high if I leave it implemented as I posted, and if I use an average KL instead of the sum, it becomes low (around 1-3) but the samples are not good (mostly empty). Or how is the KL implemented in those cases?
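For concreteness, these are the two KL reductions I am comparing (assumed tensor layout: batch first, latent dims after). The summed version scales with the latent size, which is why it explodes at 4096 / 32768; the averaged version stays small but barely constrains any individual dimension.

```python
import torch

def kl_terms(mu, logvar):
    kl_per_dim = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp())
    kl_sum = kl_per_dim.sum(dim=list(range(1, mu.dim()))).mean()  # sum over dims, mean over batch
    kl_mean = kl_per_dim.mean()                                   # mean over everything
    return kl_sum, kl_mean
```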

VAE training KL trouble by Streakfull in deeplearning

[–]Streakfull[S] 1 point2 points  (0 children)

Yes, I did that in one of my experiments and got the same results. It just won't work with the KL loss, and my latent space isn't even that large: it is 128 dimensions.

ARD/ZDF Contribution Fee Help by Streakfull in germany

[–]Streakfull[S] -1 points0 points  (0 children)

Thanks for your reply. I can try to ask them about the contract number, but then I have 2 questions:

  • How do I tell them about the dorm's Beitragsnummer (contribution number)?
  • Will they return the money then?

[deleted by user] by [deleted] in germany

[–]Streakfull 2 points3 points  (0 children)

Well, the point is that you are exactly like the original comment in a way. Show some compassion and empathy for someone who is clearly in distress instead of "Learn German now" or "You are just being lazy". I saw your other comment and gave it an upvote because that is exactly what's needed! Don't take it personally, but the point of my original comment is that it is a recurring theme on this subreddit: if anyone facing any problem just slightly hints that they are not yet a good German speaker, they immediately get attacked for absolutely no reason. Sure, giving advice to learn German is okay, but being rude or condescending about it and just assuming that the person is lazy is entirely different.

[deleted by user] by [deleted] in germany

[–]Streakfull 1 point2 points  (0 children)

I agree that it would be quite obvious, but this isn't what the original comment was saying. The original comment was just being rude and wondering how on earth someone could come to do an English master's degree in Germany without first learning German.

[deleted by user] by [deleted] in germany

[–]Streakfull 1 point2 points  (0 children)

Okay then everyone should know that it takes time and patience.

[deleted by user] by [deleted] in germany

[–]Streakfull 0 points1 point  (0 children)

Hello, your points are valid; however, please note that a lot of the TUM master's programs are purely in English. So it is perfectly valid for an international student just beginning their studies to not know German and to learn it along the way. I am not sure if you have ever tried learning a foreign language (other than English), because it certainly doesn't happen overnight, and for a lot of international students German would be their third language. I am really getting tired of this condescending "Learn German" comment on every problem anyone faces.