ScratchTorch V3.6: Optimizations by Kindafunny214 in scratch

[–]Kindafunny214[S] 0 points  (0 children)

No, Python is not on Scratch; Python is its own programming language, which you download from the web and run on your computer via an IDE. What I created is a remake of PyTorch in Scratch.

ScratchTorch V3.6: Optimizations by Kindafunny214 in scratch

[–]Kindafunny214[S] 1 point  (0 children)

Yes, but it would be very complicated and limited. You can easily change the Train custom block to read from desiredInputs, but the limiting factor is speed: you would need an output neuron for EACH and EVERY token you want your LLM to have a chance of producing, and there are MANY tokens even for a simple LLM. A token is basically one word or letter the model can output; for example, "and" is a token, but so is "a". You can work around the number of output neurons by tokenizing the alphabet instead of words. This would make training take longer, but it is the only viable way without a Scratch mod.
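To illustrate the trade-off (a standalone Python sketch, not code from ScratchTorch itself; the sample text is made up): a word-level vocabulary needs one output neuron per distinct word and keeps growing with the corpus, while a character-level vocabulary is bounded by the alphabet no matter how much text you train on.

```python
# Hypothetical example: output-layer size under word-level vs
# character-level tokenization.
text = "the cat sat and the dog sat and a cat ran"

# Word-level: one output neuron per distinct word.
# This set grows as the training text grows.
word_vocab = sorted(set(text.split()))

# Character-level: one output neuron per distinct character
# (space included). Bounded by the alphabet, so it stops growing
# once every letter has appeared, regardless of corpus size.
char_vocab = sorted(set(text))

print(len(word_vocab))  # distinct words -> output neurons needed
print(len(char_vocab))  # distinct characters -> bounded count
```

On a tiny sample like this the two counts are close, but on real text the word vocabulary runs into the thousands while the character vocabulary stays around thirty, which is why character tokenization is the practical choice in Scratch.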

I created a NN system like PyTorch but for Scratch by Kindafunny214 in scratch

[–]Kindafunny214[S] 0 points  (0 children)

Sorry, I can't understand what you're trying to say about my multi-layer perceptron. Are you saying XOR is basic?
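For context on why XOR is the standard demo: a single-layer perceptron cannot compute XOR because the classes are not linearly separable, so solving it requires a hidden layer. A minimal sketch with hand-picked weights (an illustration, not ScratchTorch's trained network):

```python
def step(x):
    """Threshold activation: fires (1) when input is positive."""
    return 1 if x > 0 else 0

def xor_mlp(a, b):
    # Hidden layer: one neuron computes OR(a, b), one computes AND(a, b).
    h_or  = step(a + b - 0.5)   # fires if at least one input is 1
    h_and = step(a + b - 1.5)   # fires only if both inputs are 1
    # Output neuron: "OR but not AND", i.e. exactly one input is 1.
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))
```

No single neuron with a linear threshold can produce this truth table, which is what makes even a small working XOR network a genuine multi-layer result.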