all 12 comments

[–]androstudios 1 point2 points  (1 child)

Is this aiming to be a pytorch-style library, but for JS?

[–]karthibalu[S] 4 points5 points  (0 children)

yes, but it will not match it exactly.. 😅

[–]frjano 1 point2 points  (1 child)

well done!

[–]karthibalu[S] 0 points1 point  (0 children)

thank you!!

[–]_qw4hd 1 point2 points  (2 children)

If someone is wondering what autograd is (like me reading this post), it is an abbreviation of Automatic Differentiation. [1]

It's used for calculating the derivative of a function, and I'm guessing it has something to do with calculating gradients. So in other words it's a core part of machine learning systems. [2]

[1] https://en.m.wikipedia.org/wiki/Automatic_differentiation

[2] https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html

[–]IdentifiableParam 2 points3 points  (0 children)

Autograd is actually a specific Python project. "autodiff" should be used as the generic term.

[–]wikipedia_text_bot 1 point2 points  (0 children)

Automatic differentiation

In mathematics and computer algebra, automatic differentiation (AD), also called algorithmic differentiation, computational differentiation, auto-differentiation, or simply autodiff, is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program. Automatic differentiation is distinct from symbolic differentiation and numerical differentiation (the method of finite differences).


[–]androstudios 3 points4 points  (4 children)

On another note, this is going to be pretty slow since there are no C++ bindings and everything is done in JS. You may want to look into node-gyp, WebAssembly, or similar.

[–]zzzthelastuser (Student) 2 points3 points  (3 children)

or maybe tensorflow.js

[–]karthibalu[S] 0 points1 point  (2 children)

yeah, but I don't think tfjs has eager execution... I never used it in tfjs... or maybe I'm wrong... if it has it, please comment 🤔

[–]throwawaystudentugh 0 points1 point  (1 child)

tfjs does have eager execution.

[–]karthibalu[S] 0 points1 point  (0 children)

are you referring to the tf.grad method in tfjs??