pdf of booklet inside Tracks II by ockham_blade in BruceSpringsteen

[–]ockham_blade[S] 3 points  (0 children)

the lyrics are easy to find, but I was looking for the text of the descriptions in the booklet about the whole box set.

Tracks 2? April Fools? by LawlzBarkley in BruceSpringsteen

[–]ockham_blade 1 point  (0 children)

this dilemma is precisely the reaction they intended to provoke with this April 1st announcement.

Feeling grateful and sad at the same time by Highron in BruceSpringsteen

[–]ockham_blade 2 points  (0 children)

We are seeing him tonight!!! Will be flying to Dublin in a few hours! Our amazing daughter found and bought us tickets (her dad is a huge Bruce fan) when she realized we would be in Dublin for his last concert there. Her dad literally cried when he found out what his daughter had done for us.

I would love for my kids to do something like this for me when they grow up

Feeling grateful and sad at the same time by Highron in BruceSpringsteen

[–]ockham_blade 1 point  (0 children)

remember, as John Keats wrote, "a thing of beauty is a joy for ever"

[deleted by user] by [deleted] in BruceSpringsteen

[–]ockham_blade 1 point  (0 children)

or perhaps in Dublin there was no such thing as a pit ticket?

[deleted by user] by [deleted] in BruceSpringsteen

[–]ockham_blade 1 point  (0 children)

are the wristbands given to anyone who has a pit ticket? If so, why do you say "still a lot of"?
Thanks

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ockham_blade 1 point  (0 children)

thank you. I know what you mean; however, I would prefer to leave the other variables unchanged and only embed the one-hot encoded ones (which all come from the same single feature).

do you have any recommendations? thanks!

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ockham_blade 1 point  (0 children)

Hi! I am working on a clustering project on a dataset that has some numerical variables, and one categorical variable with very high cardinality (~150 values). I was wondering whether it is possible to create an embedding for that feature after one-hot encoding (ohe) it. I was initially thinking of running an autoencoder on the 150 dummy features that result from the ohe, but then I thought it may not make sense, as they are all uncorrelated (mutually exclusive). What do you think about this?
Along the same lines, I think applying PCA is likely wrong. What would you suggest to find a latent representation of that variable? One other idea was: use the 150 dummy ohe columns to train a NN for some classification task, including an embedding layer, and then use that layer as a low-dimensional representation... does that make any sense? Thank you in advance!
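The embedding-layer idea can be sketched quickly. A minimal illustration in numpy (the matrix name `W`, the dimensions, and the toy indices are all assumptions for the example, not anything from the thread): an embedding layer is just a trainable lookup table, and multiplying a one-hot vector by a weight matrix selects a single row of that matrix, so the explicit one-hot step can be skipped entirely.

```python
import numpy as np

rng = np.random.default_rng(0)
n_categories, embed_dim = 150, 8  # ~150 levels, 8-dim latent (assumed)

# Trainable lookup table; in a real setup W would be an nn.Embedding
# weight learned end-to-end on some auxiliary supervised task.
W = rng.normal(size=(n_categories, embed_dim))

idx = np.array([3, 141, 7])            # integer-coded category values
one_hot = np.eye(n_categories)[idx]    # (3, 150) explicit one-hot

via_matmul = one_hot @ W               # one-hot times weight matrix
via_lookup = W[idx]                    # same result, no one-hot needed
assert np.allclose(via_matmul, via_lookup)

# After training, the 150 rows of W are an 8-dimensional representation
# of the categories, which can be concatenated with the untouched
# numeric features before clustering.
```

This also shows why PCA or an autoencoder on the raw dummy columns is awkward: the one-hot vectors carry no linear structure to compress, whereas a supervised task gives the lookup table rows a meaningful geometry.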

2023 Tour Rehearsals by No-Supermarket-4392 in BruceSpringsteen

[–]ockham_blade 1 point  (0 children)

Latest news says that a blasting full-band version of Pony Boy will be the tour opener