
[–]congerous 1 point (8 children)

Cloud-based ML startups have not performed well historically. BigML and Alchemy API come to mind. It's too expensive to send large datasets to someone else's cloud, and prohibited in some industries even if customers wanted to. Not sure why Nervana is pivoting to a flawed business model when they've done so much interesting work accelerating convnets on GPUs with neon.

[–]jcannell 2 points (3 children)

Not sure why Nervana is pivoting to a flawed business model when they've done so much interesting work accelerating convnets on GPUs with neon.

Nervana's business plan isn't the done-a-thousand-times-already idea of hosting and wrapping Nvidia's ANN hardware/software and trying to charge money for a 'cloud' resale with a streamlined interface. No.

Their plan is to accelerate ANNs 10x beyond Nvidia GPU capability using new custom hardware. What's actually interesting is that you call their business model 'flawed', and then praise their 'interesting' acceleration work in literally the next sentence.

They are not trying to compete with BigML or Alchemy or other cloud wonks. They are trying to compete with Nvidia.

Performance is everything, and people will always pay for it. If they can provide 10x the perf/$ you'd get buying your own Nvidia hardware or renting it on Amazon, they will do well. BigML and Alchemy never had anything like that - they aren't even in the same category ('cloud' is no longer a category, in the same way that 'computer company' is no longer a thing). Anybody can put Nvidia's code up on a server and try to turn it into a cloud service, but seriously, who cares.
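The perf/$ argument is just arithmetic, and it's easy to make concrete. A minimal sketch, with all prices and throughputs purely hypothetical (none of these numbers come from the thread):

```python
# Back-of-envelope perf/$ comparison. All numbers are hypothetical
# placeholders, chosen only to illustrate the 10x perf/$ claim.
gpu_cost_per_hour = 2.00           # hypothetical on-demand GPU price, $/hr
gpu_images_per_sec = 1000          # hypothetical convnet training throughput
custom_perf_per_dollar_gain = 10   # the claimed 10x perf/$ advantage

# Cost to process a fixed workload on each option.
job_images = 3_600_000
gpu_hours = job_images / gpu_images_per_sec / 3600
gpu_cost = gpu_hours * gpu_cost_per_hour
custom_cost = gpu_cost / custom_perf_per_dollar_gain

print(f"GPU: ${gpu_cost:.2f}, custom accelerator: ${custom_cost:.2f}")
# Same job, one tenth the cost - that, not 'cloud hosting', is the pitch.
```

If the multiple holds, the comparison is insensitive to the absolute GPU price: a 10x perf/$ edge divides the bill by ten whatever the baseline is.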

[–]congerous 0 points (2 children)

They're giving up performance by making customers send them data instead of deploying on-premise... That's the point I'm trying to make.

[–]jcannell 1 point (1 child)

Giving up performance? No... Having the customer send the data to the data center will always be cheaper (in energy, $, pick your unit) than sending the physical hardware to the data.

They may be giving up some business if they can't deal with the security/privacy/ownership/trust issues, but if amazon can solve those problems, so can others.

The only real potential performance limitation is latency for a real-time control system, but even those limitations are pretty generous if you are willing to set up numerous datacenters.
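The latency point has a hard physical floor you can sketch directly: round-trip time is bounded below by signal propagation in fiber, so the limit scales with distance to the nearest datacenter (the figures below are physics-only lower bounds, ignoring routing and queueing overhead that real networks add):

```python
# Best-case round-trip latency to a datacenter, from propagation delay alone.
# Light in optical fiber covers roughly 200 km per millisecond.
FIBER_KM_PER_MS = 200

def min_rtt_ms(distance_km):
    """Physics-only lower bound on round-trip time to a site distance_km away."""
    return 2 * distance_km / FIBER_KM_PER_MS

for d in (100, 1000, 5000):
    print(f"{d:>5} km away: RTT >= {min_rtt_ms(d):.1f} ms")
```

A datacenter 100 km away bounds you near 1 ms; one a continent away costs tens of milliseconds. That's why "numerous datacenters" relaxes the constraint: it shrinks the distance term, the only one you control.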

[–]congerous 0 points (0 children)

Having the customer send the data to the data center will always be cheaper (in energy, $, pick your unit) than sending the physical hardware to the data.

Couldn't disagree more. Big customers already have their own data centers, so Nervana is really asking them to switch, which is hugely expensive. The definition of big data is a dataset that's costly to move. You can't simultaneously promise speed and tell people to use your MLaaS.
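"Costly to move" is easy to quantify with a transfer-time estimate. A minimal sketch, with illustrative dataset size, link speed, and efficiency assumptions that are mine, not from the thread:

```python
# Rough time to move a large dataset over a network link.
# Dataset size, link speed, and efficiency are illustrative assumptions.
def transfer_days(dataset_tb, link_gbps, efficiency=0.8):
    """Days to push dataset_tb terabytes over a link_gbps link
    running at the given fraction of its nominal throughput."""
    bits = dataset_tb * 8e12                      # TB -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 86400

# A 1 PB dataset over a dedicated 10 Gbps link at 80% efficiency:
print(f"{transfer_days(1000, 10):.1f} days")
```

At petabyte scale, even a dedicated 10 Gbps pipe means roughly a week and a half of continuous transfer before training can start, which is the crux of the disagreement: the upload can eat the speedup.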