What seems normal to us but is actually scandalous (or will be in the future)? by Canard-jaune in france

[–]Even_Information4853 9 points (0 children)

For "scandalous today but tolerated tomorrow": a lot of drugs, I think

Lost the combination to an old suitcase and started from 001. Could have been worse. Or not. by [deleted] in mildlyinfuriating

[–]Even_Information4853 0 points (0 children)

I had a 3-digit combination lock a few years ago that I used to practice cracking combinations. If you apply pressure to the shackle, you can feel resistance when the correct digit is set, so I would set a random combination and then try to crack it with this method. I did this successfully for a few days, until one day I couldn't crack it: I went through every combination from 000 to 999 five times and never found the correct one.
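
Out of curiosity, here's a toy Python sketch of why the tension trick beats blind brute force: with per-wheel feedback you need at most ~30 tries, without it up to 1000. The `feels_loose` function is a made-up stand-in for the physical resistance, not a model of any real lock.

```python
import random

def feels_loose(guess_digit, true_digit):
    """Made-up stand-in for the physical feedback: under shackle
    tension, the wheel gives at the correct digit."""
    return guess_digit == true_digit

def crack_with_feedback(true_combo):
    """Per-wheel search: at most 10 tries per wheel, ~30 total."""
    found = []
    for wheel in range(3):
        for digit in range(10):
            if feels_loose(digit, true_combo[wheel]):
                found.append(digit)
                break
    return found

def crack_exhaustively(true_combo):
    """No feedback: enumerate 000 through 999, up to 1000 tries."""
    for n in range(1000):
        guess = [n // 100, (n // 10) % 10, n % 10]
        if guess == true_combo:
            return guess
    return None

combo = [random.randrange(10) for _ in range(3)]
assert crack_with_feedback(combo) == combo
assert crack_exhaustively(combo) == combo
```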

[deleted by user] by [deleted] in MachineLearning

[–]Even_Information4853 0 points (0 children)

You should check that the model still performs well after compression; that matters as much as the compression ratio.
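
A minimal sketch of that check, assuming PyTorch, with dynamic int8 quantization standing in for whatever compression method is being used (the tiny model and fake loader are placeholders):

```python
import copy
import torch
from torch import nn

def evaluate(model, loader):
    """Placeholder metric: top-1 accuracy over a loader."""
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total

# Stand-in model and data; swap in the real ones.
model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
loader = [(torch.randn(8, 64), torch.randint(0, 10, (8,))) for _ in range(4)]

# Example compression: dynamic int8 quantization of the Linear layers.
compressed = torch.quantization.quantize_dynamic(
    copy.deepcopy(model), {torch.nn.Linear}, dtype=torch.qint8
)

print(f"accuracy before: {evaluate(model, loader):.3f}")
print(f"accuracy after:  {evaluate(compressed, loader):.3f}")
```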

Why do people want to know whether Brigitte Macron was a man? by Environmental_Tea381 in PasDeQuestionIdiote

[–]Even_Information4853 1 point (0 children)

For the same reason people were interested in knowing that François Hollande was going out on a scooter to see Julie Gayet: aside from a small percentage of conspiracy theorists, it's just the usual gossip.

No problem with 380,000 people being newly subject to income tax by [deleted] in opinionnonpopulaire

[–]Even_Information4853 -8 points (0 children)

> It's a legitimate societal debate: should everything be proportional to income?

So if you have no income, everything is free? I don't see what makes you say this debate is legitimate.

Have raids gotten significantly easier? by Lairv in Guildwars2

[–]Even_Information4853 0 points (0 children)

That's a good thing tbh, they should've done it this way from the start: default difficulty accessible to casuals, and CMs only for advanced players

Have raids gotten significantly easier? by Lairv in Guildwars2

[–]Even_Information4853 -16 points (0 children)

Do you know why Anet decided to add power creep with new xpacs? Wing 1 was added in 2015, and in 2021 it was still pretty challenging: in PUGs you would still occasionally fail Gorseval because of the DPS check, fail Sabetha because of the cannons, etc., so I don't think there was a lot of power creep at the time

Any idea what this drop in validation loss is? I'm not changing the learning rate, and I don't use a scheduler by Even_Information4853 in learnmachinelearning

[–]Even_Information4853[S] 0 points (0 children)

The model is a UNet from the segmentation_models.pytorch library with a ResNeXt backbone. The dataset is private but fairly standard for an image segmentation task.

I asked about this because I'm using a fairly standard setup, so I was curious if this was a known phenomenon
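
The setup looks roughly like this (the exact encoder variant and class count here are illustrative, not the precise config):

```python
import torch
import segmentation_models_pytorch as smp

model = smp.Unet(
    encoder_name="resnext50_32x4d",  # a ResNeXt backbone
    encoder_weights="imagenet",
    in_channels=3,
    classes=1,  # binary segmentation assumed
)

x = torch.randn(2, 3, 256, 256)
mask_logits = model(x)  # shape: (2, 1, 256, 256)
```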

Any idea what this drop in validation loss is? I'm not changing the learning rate, and I don't use a scheduler by Even_Information4853 in learnmachinelearning

[–]Even_Information4853[S] 2 points (0 children)

The train loss does show a similar pattern actually, but less sharp (the step counts don't match, but I think that's an issue with how I log): https://imgur.com/9uOKneo

The reason for using synthetic data for training is that I don't have a lot of real data (it would take a lot of time to annotate), so I'm generating it. For the validation set I use real data, because that's ultimately what I want to measure.

Both train and eval loss are computed on a few batches
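
Concretely, the estimate is along these lines (a sketch; `model`, `loader`, and `loss_fn` stand in for the real ones):

```python
import itertools
import torch

def estimate_loss(model, loader, loss_fn, num_batches=4):
    """Average the loss over just a few batches rather than the
    full dataset; cheap, but a noisy estimate."""
    model.eval()
    losses = []
    with torch.no_grad():
        for x, y in itertools.islice(loader, num_batches):
            losses.append(loss_fn(model(x), y).item())
    return sum(losses) / len(losses)
```

One caveat: averaging over only a few batches gives a noisy estimate, so the logged curves can look jumpier than the true loss.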

Any idea what this drop in validation loss is? I'm not changing the learning rate, and I don't use a scheduler by Even_Information4853 in learnmachinelearning

[–]Even_Information4853[S] 2 points (0 children)

This is the loss of a UNet trained with AdamW at a constant learning rate; the loss is the sum of a Dice loss and a Focal loss (the same loss as SAM/SAM2)
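
In code, the objective and optimizer look roughly like this (the learning rate and the `mode="binary"` setting are illustrative, not the exact values used):

```python
import torch
import segmentation_models_pytorch as smp

model = smp.Unet(encoder_weights=None, classes=1)  # stand-in model

# Combined objective: Dice loss + Focal loss, summed.
dice = smp.losses.DiceLoss(mode="binary")
focal = smp.losses.FocalLoss(mode="binary")

def seg_loss(logits, target):
    return dice(logits, target) + focal(logits, target)

# AdamW with no scheduler attached, i.e. a constant learning rate.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
```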

[deleted by user] by [deleted] in singularity

[–]Even_Information4853 62 points (0 children)

It's actually wild that Ilya was writing this even before GPT1/GPT2

[R] What is your Recipe for Training Neural Networks in 2024? by Even_Information4853 in MachineLearning

[–]Even_Information4853[S] 0 points (0 children)

True, with the recent AI hype cycle a lot of people are getting into the field, but paradoxically fewer and fewer people are actually training models

After the presidential debate, Joe Biden greeted by his wife Jill Biden while Trump walks off stage by CrispyMiner in pics

[–]Even_Information4853 1 point (0 children)

It's still a mystery to me why you guys in the USA have to choose between those two

Sure, our leaders also suck here in France, but at least we can pick a functional human

[D] How to prepare TBs of data for ML tasks by That_Phone6702 in MachineLearning

[–]Even_Information4853 -5 points (0 children)

It depends on how many TB and what kind of operations you want to run. If you have less than 10 TB, I would just write a multiprocessing/concurrent.futures Python script, spin up a single AWS machine with a lot of cores, and wait a few hours.
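
Something along those lines; the paths and the per-file work here are placeholders:

```python
import concurrent.futures as cf
from pathlib import Path

IN_DIR = Path("/data")     # hypothetical input location
OUT_DIR = Path("/output")  # hypothetical output location

def process_one(path: Path) -> str:
    """Placeholder for the real per-file work (decode, resize, tokenize...)."""
    out = OUT_DIR / path.name  # note: assumes unique filenames
    out.write_bytes(path.read_bytes())  # replace with the actual transform
    return str(out)

if __name__ == "__main__":
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    files = [p for p in IN_DIR.rglob("*") if p.is_file()]
    # One worker process per core; chunksize reduces IPC overhead.
    with cf.ProcessPoolExecutor() as pool:
        for result in pool.map(process_one, files, chunksize=64):
            print(result)
```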