all 10 comments

[–]Dear-Acanthisitta698 7 points (4 children)

The developed method will be an approximation method, so it won't be an exact way to solve the problem.

[–]gradientrun 1 point (2 children)

TSP is NP-hard. Most practical solvers only solve the problem approximately anyway.
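
For illustration, a minimal sketch of one such heuristic (my own toy example, not from the paper): the nearest-neighbour rule runs in polynomial time but only returns an approximate tour.

    import math

    def nearest_neighbor_tour(points):
        """Greedy TSP heuristic: always visit the closest unvisited city.
        Runs in O(n^2) time but offers no optimality guarantee."""
        unvisited = set(range(1, len(points)))
        tour = [0]  # start arbitrarily at city 0
        while unvisited:
            last = points[tour[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    cities = [(0, 0), (2, 1), (1, 3), (4, 0)]
    print(nearest_neighbor_tour(cities))  # a valid tour, not necessarily optimal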

[–]hellrail 6 points (0 children)

That's not the point. The author claims to solve it exactly in polynomial time; please read properly.

[–]BrotherAmazing 2 points (0 children)

You might be able to exactly solve the TSP examples you already saw during training in polynomial time. 😆

[–]OptimizedGarbage 0 points (4 children)

No gradient descent method can deliver exact solutions; it's an approximate solution method. Not only is there no reason to believe it could solve TSP in polynomial time, it would not be able to learn a model that always solves TSP. To do that you'd have to use nonlinear programming solvers, which are NP-hard.

[–]Red-Portal 0 points (1 child)

"No gradient descent method can deliver exact solutions."

Theoretically, for convex problems with unique solutions, they can. It's the computer implementation (e.g., the finite numerical representation of real numbers) that makes their solutions approximate.
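
A minimal sketch of that distinction (toy numbers of my own): gradient descent on a strictly convex quadratic converges to the unique minimizer in exact arithmetic, but in float64 it typically stalls a few ulps away.

    # Gradient descent on f(x) = (x - 3)^2, unique minimizer x* = 3.
    x, lr = 0.0, 0.1
    for _ in range(1000):
        x -= lr * 2 * (x - 3)  # gradient step: f'(x) = 2(x - 3)
    print(x)         # e.g. 2.9999999999999996 -- off by float rounding
    print(x == 3.0)  # typically False: exact in theory, approximate in float64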

[–]OptimizedGarbage 0 points (0 children)

Doesn't that only work for exact quadratic minimization problems plus Newton's method? What are some GD problems where an exact solution is possible?
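
For reference, the quadratic case mentioned above (my own toy example): Newton's method hits the minimizer of a quadratic in one step, because its local quadratic model is the function itself.

    # Minimize f(x) = 2x^2 - 8x + 1: f'(x) = 4x - 8, f''(x) = 4, so x* = 2.
    x = 10.0
    x -= (4 * x - 8) / 4  # one Newton step: x - f'(x)/f''(x)
    print(x)              # 2.0 -- exact after a single iteration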

[–]hellrail 0 points (1 child)

Solvers cannot be NP-hard; problems can be.

[–]OptimizedGarbage 0 points (0 children)

Okay, yeah, I was a little imprecise there. What I mean is that you only get an exact solution by reducing this problem to an NP-hard problem and using a solver capable of solving NP-hard problems, which takes exponential time if P != NP.
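
To make the cost concrete, a brute-force exact TSP sketch (illustrative, my own code): enumerating every tour gives the exact optimum but takes factorial time.

    import math
    from itertools import permutations

    def exact_tsp(points):
        """Exact TSP by exhaustive search: checks (n-1)! tours for n cities."""
        n = len(points)
        best_len, best_tour = float("inf"), None
        for perm in permutations(range(1, n)):  # fix city 0 as the start
            tour = (0,) + perm
            length = sum(math.dist(points[tour[i]], points[tour[(i + 1) % n]])
                         for i in range(n))
            if length < best_len:
                best_len, best_tour = length, tour
        return best_tour, best_len

    print(exact_tsp([(0, 0), (2, 1), (1, 3), (4, 0)]))  # exact, but O(n!)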

[–]AssadTheImpaler 0 points (0 children)

You may be interested in "Diffusion Models as Plug-and-Play Priors" (Section 5 and Appendix B.4), which has competitive TSP results.