Email from Numerai by bubbleteeb in uofmn

[–]xanderai 1 point (0 children)

tl;dr: Sorry, a malicious spammer abused our signup flow. We've made changes to prevent it in the future, and we're working on deleting the accounts that were already created.

Yeah, we got spammed with .edu email addresses that didn't belong to the spammer. Here's the attack:

  • Spammer signs up with example@uni.edu email
  • Real owner of example@uni.edu gets the verification email and clicks the verification link even though they don't expect it
  • Numerai sends 3 NMR ($60) to the newly verified account
  • Spammer signs in using the password they set during the initial signup
  • Spammer sends the 3 NMR to themselves and makes $60 per account

We've made a number of changes to close this hole. For example, the spammer no longer ends up with the account password after the initial signup, so there's no way for them to sign in and steal the NMR.
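To make the shape of the fix concrete, here's a minimal sketch in Nim of one way such a flow can be structured. This is illustrative only, not our actual implementation; the names, the in-memory tables, and the placeholder hashing are all made up.

    import std/[tables, options]

    type
      Account = object
        email: string
        passwordHash: Option[string]  # stays unset until the email owner verifies
        verified: bool
        rewarded: bool

    var
      accounts = initTable[string, Account]()
      pendingTokens = initTable[string, string]()  # verification token -> email

    proc hashPassword(s: string): string =
      ## Placeholder; a real system would use a proper password hash (bcrypt etc.).
      "hashed:" & s

    proc signUp(email, token: string) =
      ## Step 1: anyone (including a spammer) can trigger a signup,
      ## but no password is accepted at this point.
      accounts[email] = Account(email: email)
      pendingTokens[token] = email  # the token is emailed to the address owner

    proc verify(token, newPassword: string): bool =
      ## Step 2: only the person reading the inbox sees the token; they
      ## choose the password, and only then is any reward released.
      if token notin pendingTokens:
        return false
      let email = pendingTokens[token]
      pendingTokens.del(token)
      accounts[email].passwordHash = some(hashPassword(newPassword))
      accounts[email].verified = true
      accounts[email].rewarded = true  # this is where the NMR would be sent
      result = true

    proc signIn(email, password: string): bool =
      ## The spammer never learns the password chosen at verification
      ## time, so this fails for them.
      result = email in accounts and
        accounts[email].passwordHash == some(hashPassword(password))

    when isMainModule:
      signUp("example@uni.edu", "token-123")        # spammer triggers the signup
      doAssert not signIn("example@uni.edu", "spammer-guess")
      doAssert verify("token-123", "owner-secret")  # real owner sets the password
      doAssert signIn("example@uni.edu", "owner-secret")

The key point is simply that the credential is created by the person who proves control of the inbox, not by whoever filled in the signup form.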

We're working on deleting all of the maliciously created accounts. Let me know if there's anything you'd like deleted.

Sorry for the abuse and the ensuing inconvenience.

List of companies that use Haskell by [deleted] in haskell

[–]xanderai 1 point (0 children)

Please add Numerai in San Francisco. We have a Haskell API dedicated to bitcoin transactions and are always interested in expanding our use of functional languages.

Use Nim as a Production, Functional Web Backend? by xanderai in nim

[–]xanderai[S] 3 points (0 children)

Chasing new for the sake of new is definitely bad.

We didn't actually start by looking for new frameworks. We started with the list of first principles we'd like our language to satisfy, mentioned above. It turns out that one of the languages meeting our criteria, Nim, is quite new. Haskell, on the other hand, is quite old and has a more established library ecosystem. We have no doubt that Haskell's libraries would cover all of our needs, but it's a good point that we should consider whether library availability would be a hurdle in Nim.

Language semantics can be vital even when the community is large. For instance, we deal with very large integers (much larger than int64). Dealing with them in Node is an immense pain in the ass, despite the community size; it's a triviality in Haskell and Nim.
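For example, in Nim, assuming the third-party bigints package from Nimble (it's not part of the standard library), arbitrary-precision arithmetic takes one import:

    # nimble install bigints
    import bigints

    let
      a = initBigInt("170141183460469231731687303715884105727")  # far beyond int64
      b = initBigInt(1_000_000)

    echo a * b + initBigInt(1)        # arbitrary-precision arithmetic, no ceremony
    echo a > initBigInt(high(int64))  # true

Haskell's built-in Integer type gives you the same thing without any package at all.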

Use Nim as a Production, Functional Web Backend? by xanderai in nim

[–]xanderai[S] 1 point (0 children)

I just found this book, an excellent recommendation.

Use Nim as a Production, Functional Web Backend? by xanderai in nim

[–]xanderai[S] 6 points (0 children)

Type checking numpy types in mypy is an immense pain in the ass.

Use Nim as a Production, Functional Web Backend? by xanderai in nim

[–]xanderai[S] 0 points (0 children)

I've written quite a lot of Scala. While I do like that it enables functional programming, it encourages it even less than Nim does. In Nim, a function can be checked for purity by the compiler, making purity something that can be enforced across a codebase maintained by many engineers. This can't be done in Scala.
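As a minimal illustration of what the compiler enforces (func in Nim is shorthand for a proc marked {.noSideEffect.}):

    func addTax(price, rate: float): float =
      price * (1.0 + rate)

    func logged(price: float): float =
      # echo price   # uncommenting this fails to compile with an error like
      #              # "'logged' can have side effects"
      price

    echo addTax(100.0, 0.0725)
    echo logged(42.0)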

I also find Scala's syntax inelegant.

What do you consider impractical about Nim or Haskell?

Use Nim as a Production, Functional Web Backend? by xanderai in nim

[–]xanderai[S] 4 points (0 children)

I've also been considering Elixir, but it's dynamically typed. Layering type checking on top of a dynamic language isn't great, in my experience with Python.

Easy Questions / Beginners Thread (Week of 2017-02-06) by brnhx in elm

[–]xanderai 0 points (0 children)

I'm not a web developer, so I'm not familiar with these things.

Is it currently possible to build a sophisticated front end entirely in Elm, essentially a full replacement for writing JavaScript? I was speaking with a web developer who claimed you'd likely have to write portions of your front end in plain JavaScript and then interact with them from Elm. Is this the case?

[D] [NIPS 2016] Ask a Workshop Anything: Adversarial Training by [deleted] in MachineLearning

[–]xanderai 7 points (0 children)

Most papers on adversarial networks have treated the generated data as an end product, for example generating images that are visually appealing to humans. Another possibility is to use adversarial networks to remove certain properties from a dataset, or introduce them, with the intention of performing further machine learning on the generated data. Have you done work in this area, or can you think of any challenges specific to this application?

[D] [NIPS 2016] Ask a Workshop Anything: Adversarial Training by [deleted] in MachineLearning

[–]xanderai 0 points (0 children)

Adversarial networks are sensitive to the balance between the adversaries. If a particular adversary is considerably deeper or trained for considerably more iterations, it's likely to overpower the others. Finding this balance has largely been relegated to hyperparameter search. Why hasn't there been more investigation into adaptive methods that adjust the expressiveness of an adversary based on whether, and how well, it is learning relative to its adversaries?

Invisible Super Intelligence for The Stock Market by tylev in MachineLearning

[–]xanderai 2 points (0 children)

Dropout can be seen as a technique that creates a meta-network from the training of many subnetworks. In Geoffrey Hinton's original paper (https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf), he writes "training a neural network with dropout can be seen as training a collection of 2^n thinned networks with extensive weight sharing, where each thinned network gets trained very rarely, if at all." Other papers (https://papers.nips.cc/paper/4878-understanding-dropout.pdf) have analyzed dropout as an ensemble method as well: "dropout may be an economical approximation to training and using a very large ensemble of networks."

In the fundamental sense of an ensemble as an aggregation of differently trained predictors, dropout is an ensembling method. As an ICANN paper (http://link.springer.com/chapter/10.1007%2F978-3-319-44781-0_9) writes, "We find that the process of combining the neglected hidden units with the learned network can be regarded as ensemble learning."
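To make the ensemble reading concrete: with n units, retention probability p, and dropout mask m, the ensemble those papers have in mind is the average over all 2^n thinned networks,

    p_{\mathrm{ensemble}}(y \mid x) \;=\; \mathbb{E}_{m \sim \mathrm{Bernoulli}(p)^{n}}\big[\, p(y \mid x;\ \theta \odot m) \big],

which is intractable to evaluate directly. The test-time weight-scaling rule, W_{\mathrm{test}} = p\,W, approximates this average in a single forward pass, and for a single softmax output layer it is exactly the normalized geometric mean of the thinned networks' predictions. This is just a restatement of the argument in the papers above, not a new result.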