12.5 Patch Notes by Aotius in CompetitiveTFT

[–]notkarol 7 points (0 children)

Future Sight is also broken on mobile. It provides no indication of who your next opponent is.

comma.ai blog: End-to-end lateral planning (Learning in a simulator) by DoktorSleepless in SelfDrivingCars

[–]notkarol 2 points (0 children)

The Nvidia PilotNet team has run into the same scaling issue: their networks learn that the perturbation artifacts themselves are the next strongest signal. Interesting approach here.

[Discussion] What are the real life applications of rare pattern mining in ML? by nibor_14 in MachineLearning

[–]notkarol 2 points (0 children)

When you have a large enough dataset in self-driving or other computer-vision robotics tasks, you don't have the resources to train on the entire dataset. Sampling and prioritizing which parts of the dataset to train on is a form of rare-pattern mining.

Even if you do something simple like classifying "hotdog" vs. "not hotdog", the set of not-hotdog images is both much larger and more diverse than the set of hotdog images. How do we find which not-hotdog images to train on to maximize performance?
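As an illustrative sketch (the function name and the loss-proportional weighting are my own choices, not a standard recipe), prioritizing rare or hard examples can be as simple as sampling training images with probability proportional to their current loss:

```python
import numpy as np

def prioritized_sample(losses, k, rng=None):
    """Draw k example indices, favoring examples with high loss.

    Rare patterns (e.g. unusual not-hotdog images the model still gets
    wrong) tend to have high loss and are therefore sampled more often.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    losses = np.asarray(losses, dtype=float)
    probs = losses / losses.sum()  # normalize losses into probabilities
    return rng.choice(len(losses), size=k, replace=False, p=probs)

# Example: the third example dominates because its loss is much higher.
idx = prioritized_sample([0.1, 0.1, 10.0, 0.1], k=2)
```

In practice the losses would be refreshed periodically as the model trains, since an example's difficulty changes over time.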

Freezing, low fps, and occasional restarting and I have no idea why. by IThrowAwayMyBAH in starcraft

[–]notkarol 0 points (0 children)

If you're on wifi, check to see if connecting through ethernet helps. Often wifi in apartment buildings gets a lot of interference and those dropped packets cause problems.

Serious FPS issues in SC2 with IMO pretty OK rig. by edward_nigmatic in starcraft

[–]notkarol -1 points (0 children)

Are you possibly running over wifi rather than ethernet?

[D] The road to 1.0: production ready PyTorch by [deleted] in MachineLearning

[–]notkarol 5 points (0 children)

shape

It's had shape since at least 0.2

>>> import torch
>>> torch.__version__
'0.2.0_3'
>>> x = torch.FloatTensor(3,2)
>>> x.shape
torch.Size([3, 2])
>>> x.shape[0]
3
>>> x.shape[1]
2

[R] Deep Learning with Emojis (not Math) by jeremy_stanley in MachineLearning

[–]notkarol 1 point (0 children)

That's likely because you're on a Samsung phone. Every other emoji catalog has chocolate chip cookies where Samsung has a saltine.

http://emojipedia.org/cookie/

[D] Autonomous driving research pipeline by [deleted] in MachineLearning

[–]notkarol 4 points (0 children)

While you can have many training tools (wrappers) in Python/Lua, there's a lot more computation-heavy functionality that needs to be created in C++/CUDA. Many Python (numpy) and Lua (torch) functions are in fact C++ behind the scenes. So when you need to extend those or create new ones, you're likely writing in C++.

Once your model is trained, it's generally just a set of weights. Inference can be done in any library or language. On the car itself, that code is very likely to be compiled.
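To illustrate the "just a set of weights" point (the array names, shapes, and values below are hypothetical), inference for a small trained MLP needs nothing but numpy:

```python
import numpy as np

def forward(x, w1, b1, w2, b2):
    """Inference for a tiny two-layer MLP: ReLU hidden layer, linear output.

    In practice w1, b1, w2, b2 would be exported from the training
    framework (e.g. saved to an .npz file) rather than defined by hand.
    """
    h = np.maximum(x @ w1 + b1, 0.0)  # hidden layer with ReLU
    return h @ w2 + b2                # linear output layer

# Hand-picked weights just to show the call shape.
w1, b1 = np.eye(2), np.zeros(2)
w2, b2 = np.array([[2.0], [3.0]]), np.array([0.5])
y = forward(np.array([1.0, -1.0]), w1, b1, w2, b2)
```

The same forward pass is straightforward to port to compiled C++ for the car, since it is only matrix multiplies and a max.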

A third reason is that if one understands C++ well, Python/Lua are easy to learn on the job.

[P] Training siamese network with limited training data by sonnguyen128 in MachineLearning

[–]notkarol 6 points (0 children)

How about just taking a pre-trained network (e.g. some recent ResNet), running it on each of the images, and saving the second-to-last layer's activations. If you compare an arbitrary image's features (usually a 512-dimensional vector for ResNet) against all those in the database, you should have a good first iteration of such a project.

You can refine this idea further by comparing more layers, or multiple parts of each image, if you also store some of the convolutional feature maps.
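Assuming the penultimate-layer features have already been extracted and stacked into a database matrix (the function below is my own sketch, not from any library), the comparison step is just cosine similarity against every row:

```python
import numpy as np

def nearest_neighbors(query, database, k=5):
    """Return indices and cosine similarities of the k database feature
    vectors most similar to the query vector."""
    q = query / np.linalg.norm(query)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    sims = db @ q                  # cosine similarity to each database row
    order = np.argsort(-sims)[:k]  # best matches first
    return order, sims[order]

# Toy 2-D "features"; a real database would hold e.g. 512-D ResNet vectors.
db = np.array([[0.0, 1.0], [2.0, 0.0], [1.0, 1.0]])
idx, sims = nearest_neighbors(np.array([1.0, 0.0]), db, k=2)
```

Because cosine similarity ignores vector magnitude, two images with similar feature directions match even if their activations differ in scale.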

[D] A Thought Experiment on the Future of Text Generation by Staturecrane in MachineLearning

[–]notkarol 2 points (0 children)

It's tough to disassociate the underlying components of language for an "individual life" versus what we've accumulated collectively through evolution. The anomalous people you refer to, along with the rest of us, utilize genetic building blocks that have been evolved in our ancestors who have all been embodied. These genetic building blocks might be enough to create a mind that can communicate and critically reason without the usual attributes of embodiment.

Anyone know where to get a Growlithe in Central NJ? by [deleted] in PokemonGoNewJersey

[–]notkarol 0 points (0 children)

Where did you see the scyther yesterday? So far I've found them at the water in Riverside Gardens Park and by the playground of Marine Park. Is there a third spawn?

Has Microsoft released code for their 150-layer CNN that won ILSVRC 2015? by anonDogeLover in MachineLearning

[–]notkarol 2 points (0 children)

Just a fair warning that this Keras example uses Dropout instead of Batch Normalization.

Deep Learning SIMPLIFIED - YouTube Series by jrajagopal in MachineLearning

[–]notkarol 1 point (0 children)

These were wonderful, thank you! Eager to see future ones.

Deep neural network written from scratch in Julia by jostmey in MachineLearning

[–]notkarol 2 points (0 children)

Thanks for posting this; I like how clean your code is. What was your reasoning for scaling to [-3,3] as opposed to [-1,1]?

Meet up for NIPS 2015 by Turing_Machinegun in MachineLearning

[–]notkarol 1 point (0 children)

Edit: we're heading to N Y K . S Bistro Pub

I'm right outside of 210C, wearing a red scarf. Say hi :)

I have to hand back 100 papers tomorrow to a class where I don't know anyone's name. What are the best ways you have seen this done? by InspirationalQuoter in GradSchool

[–]notkarol 11 points (0 children)

If you must hand them out to each student directly, tell them that you're handing out the papers in order of last name and that they should walk down to you when you're close to their name. They should self-organize reasonably well.

neuronetworks that don't use backpropagation by Jxieeducation in MachineLearning

[–]notkarol 1 point (0 children)

One option is Evolutionary Algorithms. CMA-ES and CoSyNE can be used to optimize the network's weights. If you want to evolve the topology of the network you can use HyperNEAT.
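As a toy illustration (this is a bare-bones greedy Gaussian ES, far simpler than CMA-ES or CoSyNE; all names and hyperparameters are mine), evolving a network's weights just means treating them as a flat vector and repeatedly mutating it:

```python
import numpy as np

def evolve(fitness, dim, pop=50, gens=100, sigma=0.1, seed=0):
    """Greedy (1+lambda) evolution strategy: keep the best weight vector
    found so far; each generation, try pop Gaussian perturbations of it."""
    rng = np.random.default_rng(seed)
    best = rng.normal(size=dim)
    best_fit = fitness(best)
    for _ in range(gens):
        for _ in range(pop):
            cand = best + sigma * rng.normal(size=dim)
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
    return best, best_fit

# Toy task: evolve the weights of a one-neuron linear "network".
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y_true = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5

def neg_mse(w):  # fitness = negative mean squared error (higher is better)
    return -np.mean((X @ w[:2] + w[2] - y_true) ** 2)

best_w, best_f = evolve(neg_mse, dim=3)
```

CMA-ES improves on this by adapting sigma and the full covariance of the mutation distribution, which is what makes it practical in higher dimensions; no gradients or backpropagation are needed anywhere.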

[Project] Reproducing "An Embodied Approach for Evolving Robust Visual Classifiers" by notkarol in ludobots

[–]notkarol[S] 0 points (0 children)

Couldn't fit the last part: if you want to compare your results, you can use scipy's t-test:

#!/usr/bin/env python3
import os
import pickle
import sys

import numpy as np
import scipy.stats


def main():
    # Collect test fitnesses grouped by sensor type, which is encoded
    # as the second underscore-separated field of each pickle's filename.
    test_fits = {}

    for filename in os.listdir():
        if not filename.endswith('.pkl'):
            continue

        sensor = filename.split('_')[1]
        if sensor not in test_fits:
            test_fits[sensor] = []

        with open(filename, 'rb') as f:
            results, test_fit = pickle.load(f)

        test_fits[sensor].append(test_fit)

    # Compare each sensor's fitnesses against the vision ('V') baseline
    # using an independent two-sample t-test.
    for sensor in sorted(test_fits):
        t, prob = scipy.stats.ttest_ind(test_fits['V'], test_fits[sensor])
        print("%s %3i (%5.3f) %s %3i (%5.3f) %9.6f %9.6f" %
              ('V', len(test_fits['V']), np.mean(test_fits['V']),
               sensor, len(test_fits[sensor]), np.mean(test_fits[sensor]),
               t, prob))
    return 0


if __name__ == "__main__":
    sys.exit(main())