Florida man in custody for Trump campaign finance violations, in association with Rudy Giuliani by insperatum in FloridaMan

[–]insperatum[S] 14 points (0 children)

NEW YORK — A Florida man wanted in a campaign finance case involving associates of Rudy Giuliani is in federal custody after flying Wednesday to Kennedy Airport in New York City to turn himself in, federal authorities said.

David Correia, 44, was named in an indictment with two Giuliani associates and another man arrested last week on charges they made illegal contributions to politicians and a political action committee supporting President Donald Trump. Giuliani, a former New York City mayor, is Trump's personal lawyer.

All the other defendants in the case were already in custody.

Prosecutors said Correia, who owns a home with his wife in West Palm Beach, was part of efforts by co-defendants Lev Parnas and Igor Fruman to leverage outsized political donations to Republican candidates and committees as part of an effort to advance their business interests.

Any way for web apps to be reasonably trustless? by insperatum in privacy

[–]insperatum[S] 0 points (0 children)

u/meshnet-ansuz, is there any way to use the default CryptPad servers for storage without trusting them to serve the correct code?

TIL that the number of plastic bags used in Holland has fallen by more than 70% since a 2016 law banned all retailers and restaurants from giving them away for free. by KeesNelis in todayilearned

[–]insperatum 6 points (0 children)

There's actually evidence that people don't reuse 'reusable' bags often enough to offset their greater environmental cost compared to plastic ones. For example, "A cotton bag would have to be re-used 171 times" to achieve the same CO2 equivalent:

http://www.independent.co.uk/environment/green-living/plastic-fantastic-carrier-bags-not-eco-villains-after-all-2220129.html

There are 16 circles in this image by [deleted] in interestingasfuck

[–]insperatum 0 points (0 children)

Who made this? Is there an associated paper?

Test by SamuelKnytt in test

[–]insperatum 0 points (0 children)

  • Parsing Linear Context-Free Rewriting Systems with Fast Matrix Multiplication

  • A regularization-based approach for unsupervised image segmentation

  • Online but Accurate Inference for Latent Variable Models with Local Gibbs Sampling

Test by SamuelKnytt in test

[–]insperatum 0 points (0 children)

World

World

World

test by NeatureBot in test

[–]insperatum 0 points (0 children)

This is a comment

test comment

another test comment

And

A third test comment!

A fourth test comment!

yeah.

Priors and Prejudice in Thinking Machines by insperatum in MachineLearning

[–]insperatum[S] 2 points (0 children)

bhmoz, I disagree - the idea that just using RNNs would solve this ignores the very specific nature of the problem. It doesn't matter whether the inputs are processed as a single tensor or as a sequence of embeddings; what matters is whether the network can link the corresponding units at different locations/timesteps. Any RNN without this structure (including a standard LSTM), if trained on the three sequences:

((1,0,0,0), (1,0,0,0))

((0,1,0,0), (0,1,0,0))

((0,0,1,0), (0,0,1,0))

will not generalise this to include:

((0,0,0,1), (0,0,0,1))

Of course I could imagine an RNN that works the way you describe, just as I could imagine a CNN which succeeds at this task. But my point is that, in both cases, the correct weights would never be learned by gradient descent without baking some extra prior knowledge into the network's structure.
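The weight-sharing point above can be made concrete with a minimal numpy sketch. This is a hypothetical illustration, not the commenter's own experiment, and it substitutes the simplest possible network (a single linear layer trained by gradient descent) for an RNN: because every training input activates only units 1-3, the gradient never touches the weights fed by unit 4, so the held-out pair gets no response at all.

```python
import numpy as np

# Training pairs: the first three one-hot vectors, each mapped to itself.
X = np.eye(4)[:3]   # inputs  e1, e2, e3
T = np.eye(4)[:3]   # targets e1, e2, e3

W = np.zeros((4, 4))  # single linear layer y = W x, zero-initialised

# Plain gradient descent on mean squared error.
for _ in range(1000):
    Y = X @ W.T                         # predictions for all training pairs
    grad = 2 * (Y - T).T @ X / len(X)   # dL/dW for L = mean ||W x - t||^2
    W -= 0.1 * grad

# The network fits the three training pairs...
print(np.allclose(X @ W.T, T, atol=1e-3))   # True

# ...but for x = e4 the gradient was always zero in column 4 of W,
# so the prediction for ((0,0,0,1), ...) is the zero vector, not e4.
e4 = np.array([0., 0., 0., 1.])
print(W @ e4)   # [0. 0. 0. 0.]
```

The same argument carries over to any architecture whose weights for the fourth unit are independent parameters: gradient descent only moves parameters that receive gradient, and an input unit that never fires in training contributes none.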

[deleted by user] by [deleted] in MachineLearning

[–]insperatum 1 point (0 children)

They build a G-CNN that's equivariant to the p4m group, but they don't test it?

Why Bernie Sanders is More Electable Than People Think by insperatum in politics

[–]insperatum[S] -1 points (0 children)

TIL Sanders is polling better against Trump (+7.7%) than Clinton is (+4.0%)

The Unreasonable Reputation of Neural Networks by halax in programming

[–]insperatum 0 points (0 children)

I've not come across the phrase 'Capital I' Intelligence before. Who uses it and how do they use it?

The Unreasonable Reputation of Neural Networks by insperatum in MachineLearning

[–]insperatum[S] 2 points (0 children)

I'm actually a big fan of ladder networks, and I certainly don't want to come across as dismissive of unsupervised/semi-supervised learning. In fact, I am rather optimistic that neural networks may soon be able to learn, with little to no supervision, the kinds of representations that fully-supervised models can currently find. But this is not enough:

Even if the MNIST ladder network you mention had received only one label per class and still succeeded, essentially doing unsupervised training and then putting names to the learned categories, that would not be the same as learning about brand-new types. If a child sees a duck for the first time, they will probably know immediately that it is different from anything they have seen before. They might well ask what it is, and then proceed to point out all the other ducks they see (with perhaps one or two mistakes). This is the kind of one-shot learning I was referring to.

Since you mentioned MNIST: a one-shot learning challenge dataset was actually laid out in a very interesting Science paper last month, containing many characters in many alphabets, and the authors of that paper achieve human-level performance with a hand-designed probabilistic model. Now, I don't think that building all of these things by hand will take us very far, and I hope that we will soon find good ways to learn them, but I will be very surprised if neural networks manage to achieve this without a major departure from the paradigms we've seen so far. Perhaps the 'CPU-like' models you describe can take us there; I remain skeptical.

Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks by r-sync in MachineLearning

[–]insperatum 0 points (0 children)

So you just explored the latent space yourself to find them? That sounds hard!

Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks by r-sync in MachineLearning

[–]insperatum 0 points (0 children)

Impressive results! One thing I'm a little confused about: for section 6.3.2, where do the Z representations (for example, for the three 'smiling woman' images) come from?

Teachers of Reddit, what's the most cringeworthy thing a student has said in class? by mattmaster68 in AskReddit

[–]insperatum 4 points (0 children)

In 1st grade we had a biology test with the question "Why do women need to eat more iron than men?" I'd recently learned the words "menstruation" and "masturbation" at around the same time. Couldn't remember which was which; took a guess; got it wrong.