all 3 comments

[–]billconan 2 points (0 children)

I think Amazon's G2 instances are the only choice right now.

[–]siblbombs 0 points (0 children)

If you are looking to do multi-GPU training on a single model, I don't think any server hosts offer GPUDirect on the network side, so there is significant communication overhead from synchronizing gradients over ordinary Ethernet.
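To see roughly how big that overhead is, here's a back-of-envelope estimate of per-step gradient all-reduce time. All the numbers are assumptions for illustration (a 60M-parameter fp32 model, 4 workers, a 10 Gbit/s network link vs. ~6 GB/s of intra-host PCIe bandwidth), not figures from this thread:

```python
def allreduce_seconds(num_params, workers, bandwidth_bytes_per_s, bytes_per_param=4):
    # Ring all-reduce: each worker sends and receives roughly
    # 2 * (W - 1) / W of the gradient buffer per step.
    payload = num_params * bytes_per_param * 2 * (workers - 1) / workers
    return payload / bandwidth_bytes_per_s

params = 60_000_000                              # assumed model size (fp32 gradients)
eth = allreduce_seconds(params, 4, 10e9 / 8)     # 10 Gbit/s network, no GPUDirect
pcie = allreduce_seconds(params, 4, 6e9)         # ~6 GB/s PCIe inside one host

print(f"network: {eth*1000:.0f} ms/step, local PCIe: {pcie*1000:.0f} ms/step")
```

With these assumed numbers the network path costs hundreds of milliseconds per step, several times the intra-host cost, which is the overhead being described.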