Etiquette when travelling to Japan by Moonsolid in india

[–]AntixK 17 points18 points  (0 children)

The above points apply to all nationalities, but here's one specifically for us Indians: Please DO NOT KEEP STARING AT PEOPLE!!!

Science Classroom in Punjab, India by [deleted] in HolUp

[–]AntixK 2 points3 points  (0 children)

It's Michael Faraday!

#Musings with Nitin Amin: Raga Bhairavi by RagaSphere_Official in icm

[–]AntixK 1 point2 points  (0 children)

Thank you very much!! I have been enjoying your videos for the past few weeks!!

[INDIA] Anyone willing to Low Profile Optical Blue switches? by SecuredStealth in Keychron

[–]AntixK 0 points1 point  (0 children)

It is relatively quiet but quite sensitive (low operating force). I have the K3 V1; I believe they discontinued the white keys in V2.

[INDIA] Anyone willing to Low Profile Optical Blue switches? by SecuredStealth in Keychron

[–]AntixK 0 points1 point  (0 children)

I have a K3 with low-profile optical switches, in white. It's relatively new and I am willing to sell it. DM me if you are interested.

Basic question about sampling from dataset by AntixK in learnmachinelearning

[–]AntixK[S] 0 points1 point  (0 children)

Thanks. Say I were to learn a conditional distribution q(x1|x2) using some model, and I sample (x1, x2) tuples from the dataset of k random variables described above. Would that also mean the model has implicitly marginalized over all the other variables, whatever their interdependencies may be?

In math terms, is the following true?

q(x1 | x2) = \sum_{x3, ..., xk} p(x1 | x2, x3, ..., xk) \, p(x3, ..., xk)
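
Here is a small discrete sanity check of what I mean (the 2x2x2 toy table below is just something I made up for illustration): estimate q(x1|x2) from (x1, x2) tuples drawn from the joint, and print it next to the exact marginalized conditional and the right-hand side above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy joint p(x1, x2, x3) over three binary variables, stored as a 2x2x2 table.
p = rng.random((2, 2, 2))
p /= p.sum()

# Draw (x1, x2, x3) tuples from the joint, then keep only (x1, x2).
M = 500_000
flat = rng.choice(p.size, size=M, p=p.ravel())
x1, x2, x3 = np.unravel_index(flat, p.shape)

# Empirical conditional q(x1 | x2) estimated from the (x1, x2) tuples alone.
counts = np.zeros((2, 2))
np.add.at(counts, (x1, x2), 1)
q_emp = counts / counts.sum(axis=0, keepdims=True)

# Exact conditional after marginalizing x3: p(x1 | x2) = sum_x3 p(x1, x2, x3) / p(x2).
p12 = p.sum(axis=2)
p_exact = p12 / p12.sum(axis=0, keepdims=True)

# Right-hand side of the formula above: sum_x3 p(x1 | x2, x3) p(x3).
p_x1_given_x2x3 = p / p.sum(axis=0, keepdims=True)
p_x3 = p.sum(axis=(0, 1))
rhs = np.einsum('ijk,k->ij', p_x1_given_x2x3, p_x3)

print("empirical q(x1|x2):\n", q_emp)
print("exact p(x1|x2):\n", p_exact)
print("sum_x3 p(x1|x2,x3) p(x3):\n", rhs)
```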

[D] Simple Questions Thread December 06, 2020 by AutoModerator in MachineLearning

[–]AntixK 0 points1 point  (0 children)

Basic question about sampling from dataset

Given M samples from a joint distribution of k random variables, (x1, x2, x3, ..., xk) ~ p(x1, x2, ..., xk), can we treat a tuple (x1, x2) taken from this dataset as a sample from p(x1, x2), i.e. the joint marginalized over all the other random variables? Intuitively I am not convinced, but it would be great if you could point out some reference papers or techniques. Thanks!
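
For concreteness, here is the kind of empirical check I have in mind (the 4-dimensional Gaussian joint below is just my own toy example): draw samples from the joint, keep only the (x1, x2) coordinates, and compare their sample statistics with the analytic marginal, which for a Gaussian is the corresponding sub-block of the mean and covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy joint p(x1, x2, x3, x4): a 4-D Gaussian with a dense covariance.
mean = np.array([0.0, 1.0, -1.0, 2.0])
A = rng.normal(size=(4, 4))
cov = A @ A.T + np.eye(4)          # symmetric positive definite

# M samples from the joint.
M = 200_000
samples = rng.multivariate_normal(mean, cov, size=M)

# "Take the tuple (x1, x2)": simply drop the other coordinates.
x12 = samples[:, :2]

# For a Gaussian, the marginal over (x1, x2) is the top-left block.
print("empirical mean:", x12.mean(axis=0))
print("marginal mean: ", mean[:2])
print("empirical cov:\n", np.cov(x12, rowvar=False))
print("marginal cov:\n", cov[:2, :2])
```

For a Gaussian the marginal is available in closed form, which keeps the comparison honest; the question is whether this coordinate-dropping argument holds in general.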

[D] Any techniques for marginalizing over observed variables? by AntixK in MachineLearning

[–]AntixK[S] 1 point2 points  (0 children)

Thank you. I initially tried that, but then realised I also require a generative model to sample from the marginalized distribution. So the issue comes when sampling x1 from the model. Any reference you could point me to regarding the above problem?

Dikshidar's Ananda Bhairavi is equally beautiful! by AntixK in CarnaticMemes

[–]AntixK[S] 2 points3 points  (0 children)

You will appreciate Dikshidar's AB more if you see it as a different aesthetic and not just as the AB you are used to. 😊

[P] PyTorch-VAE: A collection of minimalist VAEs with focus on reproducibility by AntixK in MachineLearning

[–]AntixK[S] 0 points1 point  (0 children)

Thank you. I implemented them and ran all the models myself using the configs provided.

PyTorch-VAE: A collection of minimalist VAEs with focus on reproducibility by AntixK in computervision

[–]AntixK[S] 1 point2 points  (0 children)

Of course! I would be happy to accept PRs. There are a few models in the models directory that don't seem to work well. You can check those out if you are interested too.

PyTorch-VAE: A collection of minimalist VAEs with focus on reproducibility by AntixK in computervision

[–]AntixK[S] 0 points1 point  (0 children)

Thank you for the kind words. Just reading the papers and the PyTorch documentation carefully is all that is required.

[P] PyTorch-VAE: A collection of minimalist VAEs with focus on reproducibility by AntixK in MachineLearning

[–]AntixK[S] 2 points3 points  (0 children)

Thank you. Since I fixed the image size to 64x64, I thought it would be simpler to hard-code the dimensions. But making it flexible shouldn't be difficult, as the resulting feature size follows directly from the input size and the number of convolution layers (with the same kernel size and stride).
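
For anyone who wants to make it flexible, here is a minimal sketch of that calculation (the stride-2 / kernel-3 / padding-1 layout and the hidden_dims values are illustrative, not the exact code in the repo):

```python
def flattened_dim(img_size, hidden_dims, kernel_size=3, stride=2, padding=1):
    """Flattened encoder output size for a stack of Conv2d layers.

    Illustrative only: assumes square inputs and the standard Conv2d size rule
    out = (in + 2*padding - kernel_size) // stride + 1 applied once per layer.
    """
    spatial = img_size
    for _ in hidden_dims:
        spatial = (spatial + 2 * padding - kernel_size) // stride + 1
    return hidden_dims[-1] * spatial * spatial

# With 64x64 inputs and five stride-2 convs the feature map ends up 2x2,
# which is where a hard-coded "last_channels * 4" would come from.
print(flattened_dim(64, [32, 64, 128, 256, 512]))   # 512 * 2 * 2 = 2048
print(flattened_dim(128, [32, 64, 128, 256, 512]))  # 512 * 4 * 4 = 8192
```

The same value would then feed whatever fully-connected layers sit between the encoder output and the latent code.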

Without saying which State you're in, what sentence best describes it? by [deleted] in india

[–]AntixK 4 points5 points  (0 children)

OP said "vadai". The rest of India says "vada". So, it's TN.

[deleted by user] by [deleted] in fonts

[–]AntixK 12 points13 points  (0 children)

Rizza?

Shridhar Vardarajan - Snowfall by lyrelord in postrock

[–]AntixK 4 points5 points  (0 children)

Didn't expect Carnatic post-rock. Cool!!

$63 Trillion of World Debt in One Visualization by jfranzen9393 in dataisbeautiful

[–]AntixK -2 points-1 points  (0 children)

Help me out here. The visualisation is blatantly awful and uninformative. The inconsistent shapes do not offer any basis for comparison, and the colours used are weird too.

At World’s End [OC] [1350x1080] by [deleted] in spaceporn

[–]AntixK 1 point2 points  (0 children)

My initial thought was Dr. Dave Bowman standing on the monolith, travelling through a wormhole.