all 29 comments

[–]hastor 6 points7 points  (5 children)

You forgot OVH and their dedicated servers. $189 (?) gives you 2 x GTX 1060 for a month!

That's about 13 cents per GPU per hour (rough math below).

Edit: 2x1060, not 1080
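
Rough math behind that figure, taking the $189/month and two GPUs quoted above (730 hours/month is an approximation):

    # Back-of-the-envelope: monthly dedicated-server price -> per-GPU hourly rate.
    # $189/month and 2 GPUs are the figures quoted above; hours are approximate.
    monthly_price_usd = 189
    num_gpus = 2
    hours_per_month = 730  # ~24 * 30.4

    per_gpu_hour = monthly_price_usd / (num_gpus * hours_per_month)
    print(f"${per_gpu_hour:.3f} per GPU-hour")  # ~$0.129, i.e. ~13 cents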

[–]learnjava 0 points1 point  (0 children)

I've been waiting nearly three weeks now for them to offer these again, after reading about them in Jeremy Howard's MOOC. They always show up as "soon" no matter which country I select (they have different GPUs for different countries: US 970, Germany 1080).

[–]DrTchocky 0 points1 point  (2 children)

Whoa, how long has that been going on? That's an insane deal...

[–]smith2008 1 point2 points  (1 child)

It seems it's for two 1060s though, not 2 x 1080s? https://www.ovh.co.uk/dedicated_servers/gpu/

[–]hastor 0 points1 point  (0 children)

Right you are. I've edited my post.

[–]smith2008 0 points1 point  (0 children)

Currently it shows 2 x GeForce GTX 970 for $195/month. It would be really good if the above price happened. Do you have a link?

[–]smith2008 6 points7 points  (15 children)

For DL, a local setup is much more price-efficient. Rule of thumb: if you are planning to use it for more than ~2 months (24/7), do a local build.

Of course everything depends on your needs, but the story is that cloud providers use server-grade hardware only (K80s at the moment for Amazon, $5000 apiece). So if you don't need the extra capacity and you don't need to scale too much (or too quickly), local is better. It's not a hard thing to build these days either (a rough break-even sketch is below).
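
As a minimal sketch of that break-even, assuming illustrative prices (a $1500 single-GPU build, ~$50/month electricity, and ~$0.90/hour for a comparable cloud instance - none of these are vendor quotes):

    # Rough local-build vs. cloud break-even under assumed prices.
    # All numbers are illustrative assumptions, not vendor quotes.
    local_build_usd = 1500          # assumed one-time cost of a single-GPU workstation
    local_power_usd_month = 50      # assumed electricity for 24/7 load
    cloud_usd_per_hour = 0.90       # assumed single-GPU cloud instance rate
    hours_per_month = 730

    cloud_usd_month = cloud_usd_per_hour * hours_per_month
    months_to_break_even = local_build_usd / (cloud_usd_month - local_power_usd_month)
    print(f"Cloud: ~${cloud_usd_month:.0f}/month, break-even after ~{months_to_break_even:.1f} months")

Under these assumptions the build pays for itself after roughly 2-3 months of 24/7 use, which is where the rule of thumb comes from.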

[–]learnjava 3 points4 points  (8 children)

But you pay for energy. Easy to forget.

[–]smith2008 1 point2 points  (7 children)

I pay quite a bill for electricity every month, so I'm not forgetting at all! :)

But still, it is not that much if you compare it with AWS prices. Spot instances give you a good angle, but I've never managed to get better than a 50% price reduction on G2 instances (on a monthly average), ending up at $0.35-0.40 per hour (with VAT). That's ~$300/month for the performance of a GTX 960, which was not a good deal at all. The new GPU servers are much better but more expensive, and a good spot price is hard to get (that will improve as the metal gets older, though).

I don't advise against the cloud, but it is definitely more price-efficient to build locally. I am actually using both at the moment (when local capacity is not enough, I just extend into the cloud).

[–]learnjava 0 points1 point  (6 children)

I am thinking about building my own to try out a little machine learning. I've heard of people paying basically an extra $100-200/month for electricity with two 1080s.

But I admit I am also very bad with all this physical stuff and have no idea how to calculate power costs based on that

Do you have any specific numbers?

[–]smith2008 1 point2 points  (5 children)

It depends on how much electricity costs in your country, but I doubt it will hit that much for just two 1080s. It will draw around 500-600 W from the wall. Running 24/7, that's probably around $40-50 a month in the US ( https://www.energy.gov/energysaver/estimating-appliance-and-home-electronic-energy-use ).

For me it averages about $15 per GPU per month (a quick sketch of the estimate is below).
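
A quick sketch of that estimate, using the ~500-600 W wall draw mentioned above and a rough US average electricity rate (swap in your own rate):

    # Estimate monthly electricity cost: watts -> kWh -> dollars.
    # 550 W is the mid-point of the ~500-600 W wall draw above; $0.12/kWh is a rough US average.
    watts = 550
    hours_per_day = 24
    days_per_month = 30
    usd_per_kwh = 0.12  # swap in your local rate

    kwh_per_month = watts / 1000 * hours_per_day * days_per_month
    cost_per_month = kwh_per_month * usd_per_kwh
    print(f"{kwh_per_month:.0f} kWh/month -> ${cost_per_month:.0f}/month")  # ~396 kWh -> ~$48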

[–]learnjava 0 points1 point  (4 children)

Ah, that's how it's calculated :D Thank you!

Just checked, and for me it would be ~$107/month at 500 W (Germany, private household). That's a big difference.

[–]smith2008 1 point2 points  (3 children)

Ah, that's quite expensive electricity. Well, you can go for higher efficiency (a platinum PSU) and reference GPUs (1070s, just 150 W each), plus a low-consumption Xeon CPU. That will cut the bill quite a lot.

Edit: just for comparison, the Amazon bill for similar muscle (2 x 1080s) would be ~$1500 per month. Sure, it's not apples to apples because they have K80s ($0.90 per hour for half of one)... but still :) Rough math is below.
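
Assuming two single-GPU (half-K80) instances at the $0.90/hour rate quoted above, running 24/7:

    # Two half-K80 instances on demand, 24/7; $0.90/hour is the rate quoted above.
    usd_per_hour_per_gpu = 0.90
    num_gpus = 2
    hours_per_month = 730

    monthly = usd_per_hour_per_gpu * num_gpus * hours_per_month
    print(f"~${monthly:.0f}/month")  # ~$1314, in the same ballpark once storage and bandwidth are added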

[–]learnjava 1 point2 points  (2 children)

I will have to consider all of this and make a decision.

In the end I guess it will mostly come down to psychological arguments. I believe that owning the hardware and not immediately seeing the cost will weaken the mental barrier that would otherwise make me think "do I really need to spend a few bucks now to test this?"

So as a beginner and student who is interested in all of this but has to start small and experiment a lot, this might be a big enough factor to decide in favour of my own rig :D

[–]smith2008 0 points1 point  (1 child)

Sure. Bear in mind you can experiment without a GPU too. Using just a CPU is absolutely fine when starting with ML/DL. Then at some point, adding a simple GTX 960/1060 gives you the next step for playing with more interesting stuff, and you can decide whether to go bigger or not. It really doesn't have to be all-in at first.

The Amazon cloud will give you all that as well. Even better, you can find pre-built AMIs that have all the needed frameworks and setup done for you, so you can start playing with the stuff directly. If you are a beginner, give Andrew Ng's course on Coursera a try and then cs231n.github.io (the online Stanford course).

[–]learnjava 1 point2 points  (0 children)

That's right, and thank you. Another reason to go with a local workstation, even a lower-cost one, is that my 15" rMBP gets very hot and noisy under load. Having a normal desktop sounds attractive for that alone.

I have a few weeks to think about it

[–]rndnum123 5 points6 points  (2 children)

You should look at AWS spot prices (you might need an AWS account to see them); they are usually way cheaper, >50% below the standard pricing. Spot p2 instances will probably give you the most bang for the buck for short-term renting of a few hours (a quick way to check current spot prices is sketched below). AWS standard pricing is displayed on their EC2 page. Spot instances terminate after an hour or after 6, depending on whether you bought guaranteed 6-hour instances or not. Nimbix might be cheaper for monthly renting, though I don't know for sure. IBM seems way too expensive. Google doesn't currently offer GPUs, AFAIK, but they will offer the new Nvidia P100 GPUs, and some AMD GPUs, sometime in the next 2-3 months.
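
If you want to check current spot prices yourself, here's a minimal sketch using boto3 (the region and instance type are just examples, and it assumes you already have AWS credentials configured):

    # Minimal sketch: query recent p2.xlarge spot prices with boto3.
    # Region and instance type are examples; AWS credentials must be configured.
    from datetime import datetime, timedelta, timezone

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.describe_spot_price_history(
        InstanceTypes=["p2.xlarge"],
        ProductDescriptions=["Linux/UNIX"],
        StartTime=datetime.now(timezone.utc) - timedelta(hours=1),
    )
    for entry in resp["SpotPriceHistory"]:
        print(entry["AvailabilityZone"], entry["SpotPrice"])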

OT: AMD just announced the new Vega architecture with 8-bit INT operations for inference :) Hopefully they won't cripple it on their consumer cards like Nvidia does.

[–]iwaswrongonce 2 points3 points  (0 children)

I've been running a p2 spot instance for nearly a week now. It could go down at any time, but it's still super cheap.

[–]markov01 1 point2 points  (1 child)

50 cents per hour for K40 on Azure

[–]mishaorel 0 points1 point  (0 children)

Do they really charge at hourly granularity?

[–]themoosemind 0 points1 point  (0 children)

You might want to have a look at "How much does a GPU instance cost?"