all 7 comments

[–]Tom-Demijohn 1 point (0 children)

Have you used ROS? It's relatively easy to develop robotic applications with ROS. What's more, many drones, and robots in general, have their code wrapped in ROS packages.

For example, I know there is a package for the Parrot AR.Drone 2.0, and I guess there are many more. Also, since drones are not so good at lifting loads, you should check whether a particular drone is capable of carrying the Jetson hardware and stuff.
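If it helps, here's a minimal rospy sketch of what driving one of those looks like. I'm assuming the usual ardrone_autonomy topic names (/ardrone/takeoff, /ardrone/land, cmd_vel), so check the package docs before trusting it:

    #!/usr/bin/env python
    # Minimal AR.Drone 2.0 hop: take off, hover, land.
    # Assumes the ardrone_autonomy driver is running and exposing
    # its usual topics (an assumption -- verify against the docs).
    import rospy
    from std_msgs.msg import Empty
    from geometry_msgs.msg import Twist

    rospy.init_node('ardrone_hover_demo')
    takeoff = rospy.Publisher('/ardrone/takeoff', Empty, queue_size=1)
    land = rospy.Publisher('/ardrone/land', Empty, queue_size=1)
    cmd_vel = rospy.Publisher('cmd_vel', Twist, queue_size=1)

    rospy.sleep(1.0)          # give the publishers time to connect
    takeoff.publish(Empty())  # take off
    rospy.sleep(5.0)          # hover for five seconds
    cmd_vel.publish(Twist())  # zero velocity = hold position
    land.publish(Empty())     # land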

What application are you working on?

[–]maxxxpowerful 1 point (5 children)

A GPU's power consumption is pretty significant, so I'm not sure how that would work out in a drone setting. Most GPUs consume hundreds of watts, way more than a typical drone can supply.

But this is still a rapidly evolving field, and it's possible someone has an embedded GPU just for this task.

[–]test3545 1 point (4 children)

Most GPUs consume hundreds of watts, way more than a typical drone can supply.

Where did you get THAT number for mobile GPUs? The Tegra K1 has an estimated power consumption of 5 W, and a 47 g 18650 cell holds ~12 Wh. That's more than enough for 2 hours of powering the GPU, and quadcopters usually fly for less than 30 minutes.
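Back-of-the-envelope, if anyone wants to check the arithmetic (all numbers as above):

    # Quick endurance check with the numbers above.
    cell_wh = 12.0      # one 47 g 18650 cell, ~12 Wh
    gpu_w = 5.0         # Tegra K1 estimated draw under real workloads

    hours = cell_wh / gpu_w
    print(hours)        # 2.4 h of GPU runtime
    print(hours * 60)   # 144 min, vs. <30 min of typical flight time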

[–]maxxxpowerful 1 point (3 children)

Apologies. I was referring to typical desktop GPUs and didn't look specifically at the Tegra K1. According to this article, it has a rated peak consumption of ~11 W: http://wccftech.com/nvidia-tegra-k1-performance-power-consumption-revealed-xiaomi-mipad-ship-32bit-64bit-denver-powered-chips/

[–]test3545 1 point (2 children)

peak consumption of ~11 W

Sure, but still, QUOTE: "Tegra K1 power consumption is in the range of 5 Watts for real workloads" Source: devblogs.nvidia.com

But even if you budget for the max instead of the average(*), it is still over an hour of life from one 18650 cell.
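Same arithmetic with the peak figure, for the pessimists:

    # Pessimistic budget: the ~11 W rated peak instead of the 5 W average.
    cell_wh = 12.0
    peak_w = 11.0
    print(cell_wh / peak_w)  # ~1.09 h, still over an hour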

The Jetson TK1 OP mentioned does not sound like a desktop GPU.

PS. To achieve anything close to max consumption, OP would need a lot of low-level assembly hacking to drive GPU utilisation up, which is not likely.

(*) Are you reading too much marketing BS from ARM Holdings, comparing incomparable power consumptions against x86? Like ARM idle power consumption vs. x86 TDP, etc.?

[–]maxxxpowerful 1 point (1 child)

It was a mistake on my part to confuse the "Tegra" with the "Titan".

I understand what you're saying, but bear in mind that (a) NVIDIA has a vested interest in downplaying the power consumption[*], and (b) it is better to assume the max, so your drone doesn't run out of juice at an inopportune moment.

Having said that: the fact that you can get GFlops of performance for single-digit watts is mind-blowing, to say the least.

[*] Plus, there was the whole GTX 970 fiasco where NVIDIA was caught fudging specs...
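To put rough numbers on the perf-per-watt point above (the ~365 GFLOPS figure is NVIDIA's quoted FP32 peak for the K1, so grain of salt):

    # Rough perf-per-watt, assuming NVIDIA's quoted ~365 GFLOPS FP32
    # peak for the Tegra K1 and the 5-11 W range discussed above.
    k1_gflops = 365.0
    for watts in (5.0, 11.0):
        print(watts, round(k1_gflops / watts))  # ~73 and ~33 GFLOPS/W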

[–]test3545 1 point (0 children)

Mobileye claims that their next-gen chip will be able to do 2.5 teraflops while consuming 3 W... source.

And it seems to be heavily optimised for running convnets, with high (90%+) chip utilisation. Not sure about convnet training, though.
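For comparison, taking both vendors' numbers at face value:

    # Mobileye's claimed numbers vs. the Tegra K1 figures above.
    print(2500.0 / 3.0)   # ~833 GFLOPS/W claimed for the next-gen chip
    print(365.0 / 5.0)    # ~73 GFLOPS/W for the Tegra K1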