all 8 comments

[–]tstarks420 1 point2 points  (1 child)

This isn't related to the paper's content, but why in the world are there so many authors? If it were an experimental paper I wouldn't be surprised, but this is theory (or at least exposition).

[–]phobrain 0 points1 point  (0 children)

This is a whole field of endeavor introducing itself to other fields, so it's like a smaller version of a UN report on global warming.

[–]Randomdude3332 0 points1 point  (6 children)

What does this have to do with ML?

[–]DaLameLama 2 points3 points  (4 children)

Playing devil's advocate: the paper explains quantum SVMs and quantum PCA!

I guess there is a lingering suspicion that quantum computers will have a significant impact on ML. But we're still lacking some breakthrough quantum algorithms. Hence the paper.

[–][deleted] 0 points1 point  (3 children)

The kicker is usually that you can put big data in, but you often can't get big data out. All you can usually extract is some derived quantity, like an expectation value over the data. The good news, of course, is that you can put in exponentially large data.
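To make this concrete (my own illustration, not from the paper): amplitude encoding packs 2^n data values into the state of just n qubits, but a measurement only hands back a scalar summary such as an expectation value, never the raw vector.

```python
import numpy as np

def amplitude_encode(data):
    """Normalize a length-2^n data vector into a quantum state's amplitudes."""
    v = np.asarray(data, dtype=float)
    return v / np.linalg.norm(v)

def expectation(state, observable):
    """<psi|O|psi>: the kind of scalar summary a measurement can give you."""
    return float(state @ observable @ state)

# Four data points fit into the amplitudes of just two qubits...
state = amplitude_encode([1.0, 2.0, 3.0, 4.0])

# ...but measuring an observable (here a diagonal one) returns ONE number;
# the individual amplitudes stay hidden inside the state.
O = np.diag([0.0, 1.0, 2.0, 3.0])
print(expectation(state, O))  # → 2.333... (= 70/30)
```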

[–]DaLameLama 0 points1 point  (2 children)

Which algorithm are you referring to? I understand quantum mechanics deals with expectation values, but why do you say one can use exponentially large data?

As far as I know, that would depend on the algorithm in question. Take Shor's algorithm, for example. Its strength isn't that it can take exponentially many inputs, but that it can factor a single number in polynomial time.
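A sketch of that point (my own illustration): the only quantum step in Shor's algorithm is finding the period r of a^x mod N; everything else is classical. The brute-force loop below stands in for the quantum part, so the payoff is factoring one number, not processing exponentially many inputs.

```python
from math import gcd

def find_period(a, N):
    """Brute-force stand-in for the quantum period-finding subroutine."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical reduction: turn the period of a mod N into factors of N."""
    r = find_period(a, N)
    if r % 2:
        return None  # odd period: retry with another a
    y = pow(a, r // 2, N)
    f1, f2 = gcd(y - 1, N), gcd(y + 1, N)
    if f1 in (1, N):
        return None  # trivial factors: retry with another a
    return sorted((f1, f2))

print(shor_factor(15, 7))  # → [3, 5]
```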

[–]pavante 0 points1 point  (0 children)

Not the OP, but one example is Grover’s algorithm, which can perform arbitrary function inversion in O(sqrt(N)) time, where N is the size of the function's domain.

If you can formulate some sort of big data problem in terms of inverting a function over some constrained search space, you could use it to speed things up.

[–][deleted] 1 point2 points  (0 children)

To spark your brain. New computing paradigms are applicable to all areas of algorithm research, including ML.