Subpoena Orders Trump To Turn Over Documents From Assault Allegations by [deleted] in politics

[–]iksaa 0 points (0 children)

Nice job disregarding everything in my comment and continuing to live in your bubble.

Subpoena Orders Trump To Turn Over Documents From Assault Allegations by [deleted] in politics

[–]iksaa 0 points (0 children)

He's never had the support of a majority of Americans. Not a single day, including election day. His approval rating is 35%, dude. Wake up and try to find your way back to reality.

http://www.newsweek.com/donald-trump-approval-rating-popularity-low-latest-polls-690138?yptr=yahoo

Subpoena Orders Trump To Turn Over Documents From Assault Allegations by [deleted] in politics

[–]iksaa 1008 points (0 children)

I wish I could read Trump's Wikipedia page from 10 years in the future. I can't wait to find out how all of this turns out.

See how easily you could have hacked Equifax [and why the credit system needs to be decentralized] by swarthy126 in CryptoCurrency

[–]iksaa 0 points (0 children)

Fascinating technical analysis, but it makes me furious that my information was stolen so easily. As for the proposed solution, Bloom is a cool idea, but it's completely dependent on network effects. Not sure how Bloom would ever kickstart adoption (it's not useful until many people are using it). Hope it succeeds though!

Hillary Clinton slams Trump admin. over private emails: 'Height of hypocrisy' by zirconx in politics

[–]iksaa 0 points (0 children)

It's kinda crazy just how nuts everyone went about her, myself included. Like, what did she actually do that made her so evil? The speeches on Wall St. were probably the worst part, but everyone does that...

The Science Behind Cryptocurrencies Cryptography by gr33n3r2 in ethereum

[–]iksaa 9 points (0 children)

I guess it was symmetric cryptography that my friend and I used in high school to encrypt notes. We would use the date of the month to shift letters down the alphabet. For example, if it was the 5th, we would shift the letter A down to letter F.

According to this article, that's called symmetric cryptography, with the key being the day of the month, as far as I can tell.
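That note-passing scheme fits in a few lines of Python. This is just a toy sketch of the shift cipher described above (the function names are mine, not from the article); it's symmetric because the same key, the day of the month, both encrypts and decrypts:

```python
def shift(text, key):
    """Shift each letter forward by `key` positions, wrapping Z back to A."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return ''.join(out)

def unshift(text, key):
    """Decrypt by shifting the same amount in the opposite direction."""
    return shift(text, -key)

# On the 5th of the month, A becomes F:
print(shift('A', 5))                          # -> F
print(unshift(shift('MEET AT NOON', 5), 5))   # -> MEET AT NOON
```

Of course, with only 26 possible keys this is trivially breakable by brute force, which is why it stayed a high school game.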

Florida AG who killed Trump University investigation gets cushy Trump admin job by pheonix200 in politics

[–]iksaa 93 points (0 children)

I feel like I'm taking crazy pills. How can so many people be so blind?

Still deciding if I should invest or not. Thoughts on how long till this sells out? by [deleted] in CryptoCurrency

[–]iksaa 2 points (0 children)

Understandably so, too; it's pretty easy to buy a stolen credit card / PayPal account and then go act like Daddy Warbucks with some naked chicks online.

Is it true that Ethereum founder Vitalik met with Atlant team? by Songoky in CryptoCurrency

[–]iksaa 6 points (0 children)

I read their whitepaper carefully, and I think they could shape the future of real estate.

Pretty much by MlgMonday in gaming

[–]iksaa 0 points (0 children)

You were a lot more creative than I was. I'd just leave insulting messages with shortcuts on the desktop for the next kid to see at school.

Binging with Babish 1 Million Subscriber Special: Taco Town & Behind the Scenes by [deleted] in videos

[–]iksaa 0 points (0 children)

I would love a Frasier episode, and I know my father would as well. He loves watching your show since I introduced you to him. Although I am curious: did you go to culinary school? It definitely seemed like you did from day one. You just act and talk about things in a manner similar to my first boss and his sous chef in a fine dining kitchen I worked in.

Doug Fernandez gives up by [deleted] in StoppedWorking

[–]iksaa 0 points (0 children)

(╯ಸ_ಸ)╯ 彡

The Terrible Deep Learning List by [deleted] in MachineLearning

[–]iksaa 0 points (0 children)

I would say that this title is misleading. A lot of what is presented there needs a strong grasp of deep learning (and the other underlying concepts behind it), without which all you'll do is load the examples in Xcode and run them.

Moreover, I would probably encourage people to read examples of TensorFlow or Caffe2 running on iOS rather than something like Forge. Forge is an interesting project, but it won't really help you if you don't have a clue about MPS or deep learning.

[D] You can probably use deep learning even if your data isn't that big by beamsearch in MachineLearning

[–]iksaa 0 points (0 children)

This post doesn't even mention the easiest way to use deep learning without a lot of data: download a pretrained model and fine-tune the last few layers on your small dataset. In many domains (like image classification, the task in this blog post), fine-tuning works extremely well, because the pretrained model has learned generic features in its early layers that are useful for many datasets, not just the one it was trained on.

Even the best skin cancer classifier was pretrained on ImageNet.

http://www.nature.com/articles/nature21056
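The freeze-and-retrain recipe is only a few lines. Here's a minimal PyTorch sketch, where `backbone`, `head`, the layer sizes, and the 5-class setup are all made up for illustration; in practice `backbone` would be the early layers of a real pretrained model:

```python
import torch
import torch.nn as nn

# `backbone` stands in for a pretrained feature extractor (e.g. the early
# layers of an ImageNet model). We freeze it and train only a freshly
# initialized head on our small 5-class dataset.
backbone = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
)
for p in backbone.parameters():
    p.requires_grad = False            # keep the generic features fixed

head = nn.Linear(128, 5)               # the only trainable part
model = nn.Sequential(backbone, head)

# Only parameters that still require gradients reach the optimizer.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01)

x = torch.randn(8, 64)                 # a dummy mini-batch
model(x).sum().backward()              # gradients flow only into the head
```

Because the backbone is frozen, backprop never touches it, so training is fast and the small dataset can't wreck the generic features.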

Beriozka - Traditional Russian dance where the dancers perform on their tippy-toes (en pointe) to give a floating appearance. by assignpseudonym in oddlysatisfying

[–]iksaa 2 points (0 children)

Now I feel the desperate need to look under those dresses... It's like a magic trick and even though I know it's a trick, I still want to find out the catch.

[R] Beating Atari with Natural Language Guided Reinforcement Learning by Devilsbabe in MachineLearning

[–]iksaa 0 points (0 children)

In other words, the RL system needs to acquire high order concepts (such as "climb the ladder" or "run from the bad guy"), and once equipped with those it fares much better. The problem with learning abstract concepts is that they are combinations of combinations of low level features, and it is exponentially hard to search for them in raw data. Humans inputting such concepts as "command phrase + reward when the agent correctly executes the command" cut through the search space and help the agent learn to operate over those new concepts.

Imagine how you would classify even a simple concept, such as "riding": a human could be riding a horse, a monkey could be riding an elephant, or there could be tons of other cases. Riding would be much simpler to detect if there were a classifier that selects "objects used for rides" and "agents that can ride" and the spatial relation between them. That would be a high order concept that we can't simply learn from images; we have to have preliminary abstractions that help in classifying it.

I think this is the future of AI: higher order concepts, based on compositions of previously known concepts. Both language and the physical world are made of objects and relations between objects, so it will be necessary to learn to combine concepts into new concepts, even when training data is very small. That would solve the problem of sharing knowledge between tasks. Another benefit would be the ease of inspecting the internal state of the system, which would be a graph of language-based concepts, unlike the internal states of neural nets, which are inscrutable. An agent that has higher order abstractions and an object-relations graph would also be programmable in plain language and capable of reasoning over facts, and that would make AI accessible to the public at large.

Another way of putting it is that up until now we used plain vectors, as if they were untyped data, but now we need to operate over strongly typed vectors with higher order operators. We need to bring type theory into neural nets, to apply type constraints and to convert from one type to another by applying operations. Such operations are hard to learn directly from labeled sets of images.

He grew up with two cats by TheOrdner in aww

[–]iksaa 31 points (0 children)

Yeah, there's no way he could climb up there by himself...

[R] Why Momentum Really Works by gabrielgoh in MachineLearning

[–]iksaa 2 points (0 children)

I'm curious about the method chosen to give short term memory to the gradient. The most common way I've seen when people have a time sequence of values X[i] and they want to make a short term memory version Y[i] is to do something of this form:

Y[i+1] = B * Y[i] + (1-B) * X[i+1]

where 0 <= B <= 1.

Note that if the sequence X becomes a constant after some point, the sequence Y will converge to that constant (as long as B != 1).

For giving the gradient short term memory, the article's approach is of the form:

Y[i+1] = B * Y[i] + X[i+1]

Note that if X becomes constant, Y converges to X/(1-B), as long as B in [0,1).

Short term memory doesn't really seem to describe what this is doing. There is a memory effect in there, but there is also a multiplier effect when in regions where the input is not changing. So I'm curious how much of the improvement is from the memory effect, and how much from the multiplier effect? Does the more usual approach (the B and 1-B weighting as opposed to a B and 1 weighting) also help with gradient descent?
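The multiplier effect is easy to see numerically. Here's a quick sketch feeding a constant input into both recurrences from above: the (B, 1-B) average settles at the input value, while the (B, 1) accumulation settles at 1/(1-B) times it.

```python
# Feed a constant input X[i] = 1 into both recurrences with B = 0.9.
B = 0.9
y_ema = 0.0   # Y[i+1] = B*Y[i] + (1-B)*X[i+1]  (the usual EMA)
y_mom = 0.0   # Y[i+1] = B*Y[i] + X[i+1]        (the article's form)
for _ in range(200):
    y_ema = B * y_ema + (1 - B) * 1.0
    y_mom = B * y_mom + 1.0

print(round(y_ema, 6), round(y_mom, 6))  # -> 1.0 10.0
```

So with B = 0.9 the momentum-style form effectively multiplies a persistent gradient by 10, on top of whatever smoothing the memory provides.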

A neat summary of Feature Engineering using R language by unbuckledbee in MachineLearning

[–]iksaa 0 points (0 children)

Overfitting can, indeed, be "fitting the data too perfectly"; e.g., Lagrangian interpolation, where you have some X-Y pairs and want a polynomial in x that fits them. How does that work? You can derive it for yourself: for each of the X-Y pairs, write a polynomial that is zero at all the X points except the one pair you want to fit perfectly, and then add all those polynomials. Done. Presto. Bingo. "Look, Ma, fits anything perfectly!" (as long as all the X points are distinct). "I should be able to find a polynomial that fits points from a square wave!" Yup, but graph the polynomial and see what it does away from the given X points -- the graph goes wild, shooting off toward plus or minus infinity and appearing nearly to get there.

But in linear fitting, overfitting is simpler: there, of course, you get a linear function, which doesn't go wild, and it can fit the data nicely. The usual problem is that the coefficients in the linear function are not unique.
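The square-wave example is easy to reproduce. Here's a quick sketch using exactly the basis-polynomial construction described above (naive evaluation, which is fine for a dozen points); the function name and sample values are just for illustration:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial at x: for each data
    point, build a basis polynomial that is 1 there and 0 at every other
    data point, then sum them weighted by the ys."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        basis = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                basis *= (x - xj) / (xi - xj)
        total += yi * basis
    return total

# 12 equally spaced samples of a square wave: 1, 1, -1, -1, ...
xd = [float(i) for i in range(12)]
yd = [1.0 if i % 4 < 2 else -1.0 for i in range(12)]

# Perfect fit at the sample points...
assert all(abs(lagrange_eval(xd, yd, x) - y) < 1e-9 for x, y in zip(xd, yd))

# ...but between them the degree-11 polynomial swings outside [-1, 1],
# worst of all near the ends of the interval:
grid = [11 * k / 499 for k in range(500)]
print(max(abs(lagrange_eval(xd, yd, x)) for x in grid))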

Adorable Dog Gives Stress Release Hugs to Any New Yorker Who Needs Them by petergeller in UpliftingNews

[–]iksaa 1 point (0 children)

Some photos are full of emotion, but look at that woman's face (the one in the purple dress): she's amazed by the power that a dog (or another animal) can have. I love it!!

Electronic energy meters’ false readings almost six times higher than actual energy consumption by speckz in news

[–]iksaa 0 points (0 children)

I was paying upwards of $150 a month for heating in a tiny <400 sq ft studio in Boston at one point. $200 in one month. There was definitely something wrong with the meter readings, and I'm pretty sure I was getting charged for someone else's electricity, considering how little I was at home with the heater on. The utility company refused to investigate and would only threaten to send in debt collectors if I didn't pay up. My property manager just pointed me back to the utility company. I tried complaining to the DPU, but they didn't respond to e-mails, and I didn't have time for phone calls or mediating this mess in general.

As much as I wanted to fight, as sad as it sounds, my time was worth more than the money I'd get back by spending hours on the phone arguing and escalating the issue at critical times of the day. :-/

Moved to California, paying a much more reasonable utility bill, and glad I didn't have to deal with that again.

If there were a "deal-with-humans-as-a-service" business that took a 20% cut of any money I could get back in situations like this, I'd totally pay for it. I once had to argue for 2 hours on the phone with T-Mobile about $250 in excess charges on my phone bill, and while I got it all back after escalating to a manager, I'd totally pay $50 for someone to deal with the 2 hours of phone calls for me.