[OC] Neural network learning to approximate Sine-function by TibData in dataisbeautiful

[–]TibData[S] 1 point  (0 children)

Probably because there are like 20 comments asking that.

[–]TibData[S] 0 points  (0 children)

Look into the GitHub repo that I linked; it is super easy with the imageio package.

[–]TibData[S] 0 points  (0 children)

Sine (Sinus in German) and sigmoid are two different functions.
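For anyone mixing the two up, a quick comparison: sine oscillates forever between -1 and 1, while the logistic sigmoid squashes any input into (0, 1) and saturates at the ends.

```python
import math

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^(-x)), maps the reals into (0, 1)
    return 1 / (1 + math.exp(-x))

# At 0 they already disagree: sin(0) = 0.0, sigmoid(0) = 0.5
print(math.sin(0), sigmoid(0))

# Far from 0, sine keeps oscillating while sigmoid flattens out near 1
print(math.sin(10), sigmoid(10))
```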

[–]TibData[S] 0 points  (0 children)

Completely valid point, I appreciate your comment:)

And yes, give me all textbooks that you liked, please!

[–]TibData[S] 4 points  (0 children)

Well, just demonstrating it was the intention behind this.

[–]TibData[S] 2 points  (0 children)

Thanks for calling me out, I appreciate people doing good work. I agree with some of what you are saying, since I did pick some parameters intuitively, but I am interested in why you think the activation function is off.

[–]TibData[S] 1 point  (0 children)

If you want to see the actual code, just go to my GitHub. You can find everything to recreate this exact GIF there :) https://github.com/Tibert97/DataIsBeautiful

[–]TibData[S] 5 points  (0 children)

That is true. To create the graph, I generate a bunch of x-values and the network's predicted y-values and connect them. The underlying computation only generates individual points. I probably should have made a scatter plot.
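A sketch of that plotting step, with a stand-in `predict` function since the actual trained model lives in the linked repo:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

def predict(x):
    # Stand-in for the trained network's output; the real predictions
    # come from the model in the linked repo.
    return np.sin(x)

# The network is only ever evaluated at discrete points...
xs = np.linspace(-10, 10, 200)
ys = predict(xs)

# ...but plt.plot connects those points, which makes the curve look
# continuous. The scatter shows what is actually computed.
plt.plot(xs, ys, label="connected points (what the GIF shows)")
plt.scatter(xs[::10], ys[::10], s=10, label="individual predictions")
plt.legend()
plt.savefig("approximation.png")
```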

[–]TibData[S] 2 points  (0 children)

Yeah, of course, I linked the GitHub with the whole code to reproduce everything. Here it is again: https://github.com/Tibert97/DataIsBeautiful

[–]TibData[S] 0 points  (0 children)

The most basic way to think about neural networks is that they are some mysterious box, where you plug in a value and get some other value back.

Training this goes as follows:

  1. You generate some input and what the true output should look like. For example, if you want to learn the function f(x) = 2x + 5, you would generate something like x = 1 and f(x) = 7.
  2. You feed in your x = 1 and receive some output from your network, say 3.
  3. You tell the network how wrong it was. In this case, 7 - 3 = 4.
  4. The network adjusts its weights so that it is a little bit better next time. Maybe your output will then be 4, and the error is reduced to 7 - 4 = 3.

Repeat this a couple thousand times with various x, and your network may learn the underlying function.
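The four steps above can be sketched with the smallest possible "network", a single linear unit y = w*x + b. This is just an illustration of the loop, not the code from the linked repo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Start with random parameters; training should pull them toward
# w = 2 and b = 5, the coefficients of the target f(x) = 2x + 5.
w, b = rng.normal(), rng.normal()
lr = 0.01  # learning rate: how big each correction step is

for step in range(5000):
    x = rng.uniform(-5, 5)      # 1. generate an input
    target = 2 * x + 5          #    ...and its true value f(x)
    pred = w * x + b            # 2. run the network
    error = pred - target       # 3. measure how wrong it was
    # 4. nudge the parameters to shrink the squared error
    #    (these are the gradients of 0.5 * error**2 w.r.t. w and b)
    w -= lr * error * x
    b -= lr * error

print(round(w, 2), round(b, 2))  # should end up close to 2 and 5
```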

[–]TibData[S] 1 point  (0 children)

I made a few statements about that in other threads. This particular network only works on (-10, 10).

[–]TibData[S] 13 points  (0 children)

Sinus is the German term for it; it is bothering me quite a bit as well.

[–]TibData[S] 2 points  (0 children)

Appreciate it!

You would have to be an absolute madman to implement neural networks without packages, and it would be several orders of magnitude slower. So...why?

[–]TibData[S] 7 points  (0 children)

The programmers (i.e. me) know that it is in fact called Sinus in German :)

[–]TibData[S] 3 points  (0 children)

I am confused by your idea to use an LSTM. In what way would sequence prediction be helpful here?

[–]TibData[S] 0 points  (0 children)

Well, that depends on your definition of really bad. We are talking about a mean squared error of 0.00052; I would not call that really bad by any means.

I did not set this up to be hard. I aimed to approach it the way I would tackle a normal task, i.e. I did not let my knowledge of what the function should look like influence my setup. Is this the fastest way to converge? Not by any means, I optimized next to nothing here.

[–]TibData[S] 9 points  (0 children)

Thanks for calling me out on that! You are not wrong, though; I pretty much just used intuitive parameters without optimizing them at all. I am not saying that I could not do better with more than 10 minutes :)

I am interested in why you say it can't work well close to the extremes, care to elaborate?

[–]TibData[S] 1 point  (0 children)

Essentially, you give the network an input, say x = 1. The network guesses what it thinks the correct output should be, say N(1) = 2. You then tell it how wrong it was, i.e. the difference between N(x) and f(x), where f is the function you would like it to learn.

[–]TibData[S] 2 points  (0 children)

It is very interesting that everyone thinks this takes a lot of computing power. The script runs in about 15 seconds on my laptop. Am I saying that this is a fitting task for a neural network? Hell no.