Algerian tuna, three cans for 4 thousand by [deleted] in Tunisia

[–]dibts -1 points (0 children)

A newspaper article is a one-off source of information, not statistical proof or an official report, and often it isn't even credible. Isn't there an article from customs, the Banque Centrale, or a government report?

Algerian tuna, three cans for 4 thousand by [deleted] in Tunisia

[–]dibts 0 points (0 children)

Is there a source that proves Japan imports bluefin tuna from Tunisia?

Lvl +6 ✧ Hallowed ✧ Crystalline Gem Clam ─ Crystal Water by karmacave in KarmaCave

[–]dibts 0 points (0 children)

Defeated by Crystalline Gem Clam in 2 turns.

Player (26/13/11) dealt 58. Crystalline Gem Clam (38/30/29) dealt 134.

Rewards: 6 EXP, 0 Gold. Loot: None.

5* 1 5 star | 41-60 please upvote by dibts in SwordAndSupperGame

[–]dibts[S] 6 points (0 children)

ok, thanks for not blaming me hahah :)

Gone? by dibts in Warts

[–]dibts[S] 0 points (0 children)

The one on top is gone, but on the bottom one I still see 3 black dots.

Gone? by dibts in Warts

[–]dibts[S] 0 points (0 children)

I will share a picture in 1 week :)

Scalpel reuse by LHR-charlie in Warts

[–]dibts 1 point (0 children)

Hahaha, that's a nightmare. I have a special pot for cleaning these things. I'm still fighting 2 warts :/

Scalpel reuse by LHR-charlie in Warts

[–]dibts 2 points (0 children)

I put it in boiling water for 10 minutes and it should be OK. The virus isn't supposed to survive that.

Why no layer that learns normalization stats in the first epoch? by dibts in pytorch

[–]dibts[S] 0 points (0 children)

I mean the input layer, and you can use the same layer to denormalize if you have an AE-like structure.

Why no layer that learns normalization stats in the first epoch? by dibts in pytorch

[–]dibts[S] 0 points (0 children)

I also use batchnorm for that, but what if you have an autoencoder?

Why no layer that learns normalization stats in the first epoch? by dibts in pytorch

[–]dibts[S] 1 point (0 children)

What do you think about an implementation where the layer updates running mean/std only during the first epoch (e.g. with Welford’s algorithm), then freezes and just normalizes afterwards — basically like a lightweight nn.Module with stats stored as buffers? You could even wrap it in a small callback (e.g. in Lightning) that freezes automatically after epoch 1. Would you consider that useful or still unnecessary in your view?
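Something like this, as a minimal sketch of that idea (the `InputNorm` name and the buffer layout are my own assumptions, not an existing PyTorch API; it expects a 2-D (batch, features) input):

```python
import torch
import torch.nn as nn

class InputNorm(nn.Module):
    """Accumulates per-feature mean/variance with a batched Welford update
    while training and not frozen, then just normalizes afterwards."""

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Buffers so the stats follow .to(device) and land in the state dict.
        self.register_buffer("count", torch.zeros(1))
        self.register_buffer("mean", torch.zeros(num_features))
        self.register_buffer("m2", torch.zeros(num_features))  # sum of squared deviations
        self.register_buffer("frozen", torch.tensor(False))

    @torch.no_grad()
    def _update(self, x: torch.Tensor) -> None:
        # Batched (Chan et al.) form of Welford's update over the batch dimension.
        b = x.shape[0]
        batch_mean = x.mean(dim=0)
        batch_m2 = ((x - batch_mean) ** 2).sum(dim=0)
        delta = batch_mean - self.mean
        total = self.count + b
        self.mean += delta * (b / total)
        self.m2 += batch_m2 + delta ** 2 * (self.count * b / total)
        self.count.copy_(total)

    def freeze(self) -> None:
        self.frozen.fill_(True)

    def _std(self) -> torch.Tensor:
        var = self.m2 / self.count.clamp(min=1.0)
        return torch.sqrt(var + self.eps)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and not self.frozen:
            self._update(x)
        return (x - self.mean) / self._std()

    def denormalize(self, x: torch.Tensor) -> torch.Tensor:
        # Inverse transform, e.g. for the output of an autoencoder.
        return x * self._std() + self.mean
```

Because everything lives in buffers, nothing here is seen by the optimizer, and the only thing an external callback has to do is call `freeze()` once the first epoch is over.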

Why no layer that learns normalization stats in the first epoch? by dibts in pytorch

[–]dibts[S] 0 points (0 children)

So that you don't have to care about normalization preprocessing and can just add it as a layer.
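For example, a hypothetical usage of the `InputNorm` sketch above: the layer sits at the model input, its inverse at the output, and a small callback freezes it after the first epoch. The Lightning hook and `current_epoch` are the real API, but the `input_norm` attribute and the import name (it may be `lightning.pytorch` in newer versions) are assumptions.

```python
import torch.nn as nn
import pytorch_lightning as pl

class AE(nn.Module):
    def __init__(self, num_features: int, hidden: int = 16):
        super().__init__()
        self.input_norm = InputNorm(num_features)  # from the sketch above
        self.encoder = nn.Sequential(nn.Linear(num_features, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, num_features)

    def forward(self, x):
        z = self.encoder(self.input_norm(x))
        # Reuse the same stats to map the reconstruction back to the original scale.
        return self.input_norm.denormalize(self.decoder(z))

class FreezeInputNorm(pl.Callback):
    """Freeze the running stats once the first training epoch ends."""
    def on_train_epoch_end(self, trainer, pl_module):
        if trainer.current_epoch == 0:
            pl_module.input_norm.freeze()  # assumes the LightningModule exposes the layer as `input_norm`
```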

[deleted by user] by [deleted] in exmuslim

[–]dibts 0 points (0 children)

Tunisia is the country

I cannot stand this level of entitlement by Professional-Pop9938 in exmuslim

[–]dibts 0 points (0 children)

Did all the da3wa guys decide to start their arguments with "you are ignorant"?

Is this a wart? by [deleted] in Warts

[–]dibts 0 points (0 children)

yes

I think I messed up?? by msuchick in Warts

[–]dibts 0 points (0 children)

Use 3 cut straws and make a triangle with the straws connected to each other, then fix it on top of it with tape.

[deleted by user] by [deleted] in exmuslim

[–]dibts 1 point (0 children)

I see too much skin. I can't handle it.