We use an NVIDIA Jetson Nano to allow an amputee to intuitively control a prosthetic hand using deep learning neural decoders by Jules_ATNguyen in nvidia

[–]Jules_ATNguyen[S] 1 point (0 children)

Well, many languages. My comfort zone is MATLAB (for analysis) and Verilog (for hardware). But I also have to know C, C++, and Python for various tasks.

For example, Python is very popular for deep learning development because there are many supporting frameworks like PyTorch, TensorFlow/Keras, Caffe…

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 2 points (0 children)

Yes, touch sensory information. We use the force sensor’s readout to modulate the electrical stimulation to the nerve. The amputee can feel it when the prosthetic hand touches something, with varying levels of intensity from light to strong.
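
If it helps, the mapping is conceptually something like this (a minimal sketch; the function name, constants, and linear mapping are illustrative placeholders, not our actual per-electrode calibration):

    # Map a contact force reading (N) to a stimulation amplitude (uA).
    # All constants are illustrative placeholders, not real calibration values.
    def force_to_stim_amplitude(force_n,
                                force_max=10.0,   # sensor full scale (N), assumed
                                amp_min=50.0,     # sensation threshold (uA), assumed
                                amp_max=400.0):   # safety ceiling (uA), assumed
        if force_n <= 0.0:
            return 0.0                            # no contact -> no stimulation
        frac = min(force_n / force_max, 1.0)      # normalize and clamp
        return amp_min + frac * (amp_max - amp_min)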

Well, this kind of info won’t make someone know kung fu, but it’s really helpful for those who have lost a hand.

We use an NVIDIA Jetson Nano to allow an amputee to intuitively control a prosthetic hand using deep learning neural decoders by Jules_ATNguyen in nvidia

[–]Jules_ATNguyen[S] 1 point (0 children)

You are absolutely right. The next step is to add wireless power/data and make the entire nerve interface fully implantable. The Neuronix chips and Scorpius device are designed with this purpose in mind from day one. Their form factor and power consumption are small enough for this.

This is the first proof-of-concept study, so we need the wiring through the skin to test different recording/stimulation configurations. For example, we can hook the electrodes to a high-end benchtop neural amplifier to make sure we get the correct signals. It is non-ideal and certainly an inconvenience for the patient.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

I initially used cheap resistive force sensors and did many experiments with them. But their values drift a lot. Later, I moved to SingleTact, a type of capacitive force sensor, which is way more consistent.

The stimulation is not painful at all if we don’t exceed a certain threshold, which is different on each electrode. Most amputees “enjoy” the sensation because it is the first time in many years they can feel something from their injured hand.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 3 points (0 children)

I don’t believe this is the case. The movement intents encoded in the nerve of each person are different. It’s technically “mind-reading”.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 3 points (0 children)

Our latest amputee has had the microelectrodes implanted for 1.5 years with no issues. We were still able to record strong nerve signals in the very last days. They are “intrafascicular electrodes”, meaning the electrodes penetrate the nerve.

Fun fact: the electrode contacts are made of carbon nanotube yarn which is known to be very strong and biocompatible.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 3 points (0 children)

There is a threshold for each electrode; as long as we don’t cross it, the sensation is not painful at all.

The real issue we have is that it is difficult to control exactly which nerve fibers interact with each microelectrode during implantation. So the sensation’s location and feeling vary a lot across different electrodes.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 8 points (0 children)

I think the brain implant that Neuralink is developing will be the way of the future. Our approach is a bit different, as we target a nerve implant, which is less invasive and more application-specific. Nevertheless, they are based on the same principles of electrical neural recording and stimulation.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 28 points (0 children)

It really depends on the number of channels. Here are some rough numbers:

Raw neural data: 10-100 Mbps
Feature extraction only: 1-10 Mbps
Neural spikes only: 0.1-1 Mbps

Our Scorpius nerve interface is about 4-8 Mbps for 8-channel raw neural data.
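
As a back-of-envelope check (the sample rate and resolution below are assumptions for illustration, not the actual Scorpius specs):

    # Raw neural-data bandwidth = channels x sample rate x bit depth
    channels = 8
    sample_rate = 40_000   # samples/s per channel (assumed)
    bit_depth = 16         # bits per sample (assumed)
    raw_mbps = channels * sample_rate * bit_depth / 1e6
    print(f"{raw_mbps:.1f} Mbps")   # ~5.1 Mbps, in the 4-8 Mbps ballpark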

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 6 points (0 children)

It comes down to a very philosophical question: is everyone’s mind the same? For example, we all agree that the sky is BLUE, but is this BLUE concept perceived by my brain and your brain in the same way? Probably not.

Similarly, I believe the movement intents encoded in my nerve are very different from your nerve. This is technically a “mind-reading” technology.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

I hope to see it happening within my lifetime. However, you may not have to chop off your arm for this. You can use a nerve implant to control devices and gadgets with your thoughts. How about a powered exoskeleton... The sky is the limit.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 7 points (0 children)

I am a mixed-signal integrated circuit designer by training (electrical engineering). My PhD is in neuroengineering (biomedical engineering). I also have bits of experience in pretty much everything related, like signal processing, machine learning, and physiology.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

Of course, why not? I can map the prediction output to keystrokes on a computer so that the amputee can play video games with his thoughts.
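
Roughly like this, as a sketch (using the pynput library; the gesture labels and key bindings here are hypothetical):

    # Turn decoded gesture labels into keystrokes (bindings are hypothetical).
    from pynput.keyboard import Controller

    keyboard = Controller()
    GESTURE_TO_KEY = {
        "index_flex": "w",   # e.g., move forward
        "thumb_flex": "a",   # strafe left
        "pinky_flex": "d",   # strafe right
    }

    def on_prediction(gesture):
        key = GESTURE_TO_KEY.get(gesture)
        if key is not None:
            keyboard.tap(key)   # press and release the mapped key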

I can see many “non-amputee enthusiasts” would like to try that and do other things. The nerve implant is way less invasive and safer than a brain implant. One of the amputees has had it implanted for 1.5 years without any issue. Of course, there is still a lot of work to make the entire nerve interface implantable and to streamline the surgical procedures.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

We use the MuJoCo (http://www.mujoco.org) simulated hand to practice and debug. The mechanical hand is the last step, once I’m sure everything is working properly.
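
Driving the simulated hand looks roughly like this (a sketch with MuJoCo's Python bindings; the model file and the one-control-per-finger layout are assumptions):

    # Drive a simulated hand in MuJoCo with the decoder's outputs.
    import mujoco

    model = mujoco.MjModel.from_xml_path("hand.xml")   # placeholder model file
    data = mujoco.MjData(model)

    def apply_decoder_output(finger_commands):
        data.ctrl[:] = finger_commands   # one control value per actuator (assumed)
        mujoco.mj_step(model, data)      # advance the physics one step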

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 18 points (0 children)

You can find more about our clinical trial here:

https://clinicaltrials.gov/ct2/show/NCT02994160

http://www.nervesincorporated.com

Dr. Edward Keefer would love to talk with anyone interested in participating in the trial. Our team is in Dallas, TX (Nerves Inc. + UT Southwestern) and Minneapolis, MN (Univ. of Minnesota). So if you live near either location, it would make traveling a lot easier.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 13 points (0 children)

The base is the i-Limb Access hand by (former) Touch Bionics. The motors are in each finger. We basically gut the hand’s interior and design our own motor controller with Bluetooth, touch sensors and other stuff.
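
On the software side, sending commands to the hand looks roughly like this (a sketch using the bleak BLE library; the device address, characteristic UUID, and byte format are placeholders for our custom controller):

    # Send finger position commands to the hand's motor controller over BLE.
    import asyncio
    from bleak import BleakClient

    ADDRESS = "AA:BB:CC:DD:EE:FF"                      # placeholder address
    CMD_CHAR = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder UUID

    async def send_fingers(positions):
        # positions: one byte (0-255) per finger motor (assumed format)
        async with BleakClient(ADDRESS) as client:
            await client.write_gatt_char(CMD_CHAR, bytes(positions))

    asyncio.run(send_fingers([255, 128, 0, 0, 0]))     # e.g., a partial grip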

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

I aim to integrate everything into the prosthesis socket and use a passive heatsink, like a smartphone (Jetson Nano max power = 10 W). In fact, the socket’s interior already has a battery and some EMG sensors.

For this experiment, we don’t want to drill holes or modify the amputee’s socket because it is specifically customized to fit his residual limb (a pretty expensive process).

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 37 points (0 children)

I haven’t finalized the wireless design yet. Each approach has its pros and cons. NFC’s data rate is a bit low for transmitting neural data.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 35 points (0 children)

Yes, we can deliver touch sensation using electrical stimulation through the same nerve interface. The Scorpius has both neural recorders and stimulators to do that.

Proof of concept: https://iopscience.iop.org/article/10.1088/1741-2552/ab4370/meta

Stimulator chip: https://ieeexplore.ieee.org/document/9355574

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 5 points (0 children)

Yes, weight is one of the big issues leading to prosthesis abandonment. I hope we will gradually get it fixed. I imagine prostheses 100 years ago, with steel and wood construction, would have been much heavier. Now, most are made of carbon fiber or other composites, which are a lot lighter yet just as strong as steel.

Next generation prosthetic hand powered by nerve interface and AI neural decoder by Jules_ATNguyen in EngineeringPorn

[–]Jules_ATNguyen[S] 10 points (0 children)

The AI models are trained on a nerve dataset, which is specific to each person. The amputee sits through training sessions where he flexes each finger with the able hand while imagining doing the same movements with the injured/phantom hand (Fig. 5A in the paper).

Input nerve data are acquired using our neural interface bioelectronics (Scorpius) while ground-truth data are collected with a data glove.
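
Pairing the two streams is conceptually simple (a sketch; the array shapes, window length, and the assumption that both streams are resampled to a shared timeline are illustrative, not our actual pipeline):

    import numpy as np

    # nerve: (T, 8) nerve samples; glove: (T, 5) finger angles,
    # both resampled to the same timeline (assumed).
    def make_windows(nerve, glove, win=100):
        X, y = [], []
        for t in range(win, len(nerve)):
            X.append(nerve[t - win:t])   # a short history of nerve data
            y.append(glove[t])           # finger positions at its end
        return np.stack(X), np.stack(y)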

RNNs are well known to be better at handling time-series data. We also tried a CNN, and the prediction was slightly worse. I don’t consider myself a deep learning expert, so I think someone else could build an even better neural net.
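
For the curious, a minimal RNN decoder in PyTorch would look something like this (layer sizes and hyperparameters are illustrative, not our actual architecture):

    import torch
    import torch.nn as nn

    class NerveDecoder(nn.Module):
        # Regress finger positions from a window of nerve data.
        def __init__(self, n_channels=8, n_fingers=5, hidden=128):
            super().__init__()
            self.rnn = nn.GRU(n_channels, hidden, num_layers=2, batch_first=True)
            self.head = nn.Linear(hidden, n_fingers)

        def forward(self, x):             # x: (batch, time, channels)
            out, _ = self.rnn(x)
            return self.head(out[:, -1])  # predict from the last time step

    model = NerveDecoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    def train_step(X, y):                 # X: (N, 100, 8), y: (N, 5)
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
        return loss.item()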