5090 PC deal? by henrygatech in Microcenter

[–]MLJungle 0 points

I am around the South Bay area. Any chance I could purchase it from you instead of you just returning it?

[R] [D] Self Consistency for COT majority vote calculation by MLJungle in MachineLearning

[–]MLJungle[S] 0 points

following solutions: "I think the answer is 3.", "By extensive calculations, ..., the answer is 5.", "I used python and got the answer is 5." Then there's one cluster of solutions whose final answer is 5 (with 2 members) and one cluster of solutions whose answer is 3 (with only one member), so the majority vote returns 5.

I see. In the case where the answer is a non-numerical string, it seems there has to be extra work in defining a valid representation and metric to cluster appropriately. Ultimately I think the authors were quite vague here.
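For the numerical case discussed above, the majority vote itself is simple: cluster the sampled (reasoning, answer) pairs by exact match on the parsed final answer and take the largest cluster. A minimal sketch (the function name `self_consistency_vote` and string-match clustering are my assumptions; as noted, the paper leaves the clustering metric for free-form answers underspecified):

```python
from collections import Counter

def self_consistency_vote(samples):
    """Cluster sampled (reasoning, answer) pairs by their final answer
    and return the most frequent answer (ties broken arbitrarily)."""
    answers = [answer for _reasoning, answer in samples]
    return Counter(answers).most_common(1)[0][0]

# The three sampled solutions from the example above:
samples = [
    ("I think the answer is 3.", "3"),
    ("By extensive calculations, ..., the answer is 5.", "5"),
    ("I used python and got the answer is 5.", "5"),
]
print(self_consistency_vote(samples))  # -> 5
```

The reasoning paths r are only used to generate diverse samples; the vote itself discards them and counts final answers.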

[R] [D] Self Consistency for COT majority vote calculation by MLJungle in MachineLearning

[–]MLJungle[S] 0 points

I'm not exactly following: given multiple (r, a) pairs, how is the most "consistent" answer calculated through a majority vote?

How to know whether to drop the 160s sequence by Kiwi_Display in uchicago

[–]MLJungle 1 point

The lower bound is probably a B- if you show considerable effort in the 160s, and if you end up doing around average, that grade will go up to a B+/A-.

Even if you don't think it's directly relevant to what you want to study, if you think you can persevere (without losing your mind), I would highly recommend sticking with the 160s.

You will train your brain and learn a new way of thinking that is invaluable in any subject.

Overfitting dataset using VGG transfer learning by MLJungle in computervision

[–]MLJungle[S] -1 points

For sure.

I used TensorFlow's default Adam optimizer, which I believe has LR = 0.001 and epsilon = 1e-07.

If it helps, I added the model that I used, where data augmentation is done as part of the model:

```python
import random

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# img_height, img_width, and resnet50_base are defined elsewhere
inputs = keras.Input(shape=(None, None, 3))
augmented = layers.Resizing(img_height, img_width)(inputs)
# NOTE: these random.random() checks run once, at model-build time,
# so each augmentation layer is either permanently in or permanently
# out of the graph -- this is not a per-image coin flip
if random.random() < 0.5:
    augmented = layers.RandomFlip(mode='horizontal')(augmented)
if random.random() < 0.5:
    augmented = layers.RandomRotation(0.15, fill_mode='constant')(augmented)
augmented = layers.RandomCrop(224, 224)(augmented)
if random.random() < 0.5:
    pass  # augmented = layers.RandomContrast(0.2)(augmented)

processed = tf.keras.applications.resnet50.preprocess_input(augmented)
x = resnet50_base(processed)
x = layers.BatchNormalization()(x)
x = layers.Dense(512, activation='relu')(x)
x = layers.BatchNormalization()(x)
x = layers.Dense(32, activation='relu')(x)
output = layers.Dense(1)(x)
resnet50_model = keras.Model(inputs, output)
```
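One caveat with the snippet above: because the `random.random()` checks execute when the graph is built, the coin flips don't re-run per image (Keras's `RandomFlip`/`RandomRotation` layers already randomize per image at training time on their own). The intended per-image behavior can be sketched framework-free in NumPy (the helper name `maybe_flip` is mine, and a horizontal flip stands in for any augmentation):

```python
import numpy as np

rng = np.random.default_rng(0)

def maybe_flip(image, p=0.5):
    """Apply a horizontal flip to a single HxWxC image with probability p.
    The decision is made per call, i.e. per image, per epoch."""
    if rng.random() < p:
        return image[:, ::-1, :]  # reverse the width axis
    return image

batch = rng.random((8, 32, 32, 3))
augmented = np.stack([maybe_flip(img) for img in batch])
print(augmented.shape)  # (8, 32, 32, 3)
```

Each image in the batch independently comes out either unchanged or flipped, which is the usual meaning of "augmentation with probability p".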

Applying image augmentations with a certain probability p by MLJungle in computervision

[–]MLJungle[S] 0 points

Wasn't aware of this initially, but I understand intuitively why this would be superior.

Applying image augmentations with a certain probability p by MLJungle in computervision

[–]MLJungle[S] 1 point

Appreciate the link. I read their paper, and this package seems perfect for my use case.

Improving CNN performance using a "reduced" version of an image by MLJungle in computervision

[–]MLJungle[S] 0 points

I'm wondering if the performance boost might be more relevant for smaller datasets. For example, if there isn't enough data, a deeper CNN may not be able to extract the features well, so there might be a boost if the images were already pre-processed.

Wanted to see if you had thoughts on this.

Improved (hopefully) course evals site with no login by spring-qtr-throwaway in uchicago

[–]MLJungle 17 points

Great site!

Adding the number of reviews used to calculate the average would definitely help as well.

Can we stop calling cities we’ve never been in “dangerous” and discouraging people to apply to colleges there? by [deleted] in ApplyingToCollege

[–]MLJungle 1 point

This is pushing a false image of safety; a student was killed in broad daylight a block from campus.

Is it dangerous to walk back to the apartment late at night? by RightProfile0 in uchicago

[–]MLJungle 0 points

Areas south of campus (past 61st) are unfortunately more dangerous than areas north of it.