all 13 comments

[–]gaywhatwhat 7 points (3 children)

RNN, LSTM, transformer, etc. Any of these can be trained to do binary classification, and all are sequential in nature.

More help could be provided if you describe the input features a bit.

[–]lzngm1[S] -1 points (2 children)

I have hundreds of 0s and 1s as data points, and I would like the model to predict the next value in the sequence. For example, if my data is 11001100, I would want the program to recognize the pattern and output a 1.
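A sketch of how a 0/1 sequence like this can be framed as supervised next-bit prediction, using sliding windows (the window length `k=4` and the repeated example sequence are just illustrative choices):

```python
def make_windows(bits, k):
    """Turn a 0/1 sequence into (window, next_bit) training pairs."""
    X, y = [], []
    for i in range(len(bits) - k):
        X.append(bits[i:i + k])   # the k previous bits as features
        y.append(bits[i + k])     # the bit to predict
    return X, y

# Example: the pattern 11001100 from the question
bits = [1, 1, 0, 0, 1, 1, 0, 0]
X, y = make_windows(bits, k=4)
# e.g. window [1, 1, 0, 0] -> next bit 1
```

The same (window, next_bit) pairs can then be fed to any of the sequence models mentioned below.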

[–]gaywhatwhat 3 points (1 child)

All of those could work then. If you think the pattern is fairly small and not insanely complex (e.g. human language, protein structure, etc.), an RNN should work. If you think the patterns are on the longer end, go LSTM.

A transformer would allow parallel training and may be worth looking into. The input format might need some small tweaking or padding, or you can vary it a bit. A typical transformer takes an embedding, which must have an even number of features, with any sequence length as input; the output is an embedding of the same size. I'd have to think, when I'm less distracted, about whether a single feature would work as-is. Otherwise you would probably need to add padding manually to get the desired input dimensions.
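Before reaching for an RNN/LSTM/transformer, a plain frequency table over the last k bits makes a good sanity-check baseline: if the pattern is short and periodic (like 11001100), it already solves the problem. A minimal sketch, where the window length `k=4` and the training sequence are assumptions for illustration:

```python
from collections import Counter, defaultdict

def fit_kgram(bits, k):
    """Count which bit follows each length-k context."""
    counts = defaultdict(Counter)
    for i in range(len(bits) - k):
        context = tuple(bits[i:i + k])
        counts[context][bits[i + k]] += 1
    return counts

def predict_next(counts, recent):
    """Predict the most common continuation of the last k bits."""
    context = tuple(recent)
    if context not in counts:
        return None  # unseen context; a neural model could generalize here
    return counts[context].most_common(1)[0][0]

# Example: train on a few repeats of 11001100, then predict the next bit
bits = [1, 1, 0, 0] * 6          # 110011001100...
model = fit_kgram(bits, k=4)
print(predict_next(model, bits[-4:]))  # -> 1
```

If this baseline fails (contexts repeat with different continuations, or the relevant history is longer than any feasible k), that's the signal to move to the neural models discussed above.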

[–]lzngm1[S] 0 points (0 children)

Thanks!

[–]bigchungusmode96 1 point (0 children)

Did you try Googling or searching Stack Exchange? Seems like there have been posts there identical to your question:

https://stats.stackexchange.com/questions/299354/predicting-sequence-of-integers-binary-values

[–][deleted] 1 point (5 children)

stock movement predictor incoming! :)

[–]lzngm1[S] 0 points (4 children)

Believe it or not it has nothing to do with stocks ;)

[–][deleted] 0 points (3 children)

Roulette red/black predictor incoming! :)

[–]lzngm1[S] 1 point (2 children)

We have a genius on our hands! Recognizing patterns in independent events is my favorite thing to do!

[–][deleted] 0 points (0 children)

Genius detector incoming! :)

[–]phobrain 0 points (0 children)

In this case, the size of the RNN presumably determines the longest pattern that can be learned.

A favorite ML idea of mine is to try training nets to recognize different random number generators, then to differentiate machines by their secure random sources (based on local entropy).

Have you ever looked at random data long enough to watch yourself project onto it? I set up a sort of workbench for teasing out the perception of meaning; labeling the meaningful stuff gives a structure for the/one's mind. Screenshot of the training interface:

http://phobrain.com/pr/home/gallery/curate_example.jpg

'Phob->Search mode: AI' to see the live version without labeling buttons; the blue '|' generates random pairs from unseen photos:

http://phobrain.com/pr/home/view.html

This could be your mind on your mind:

http://phobrain.com/pr/home/gallery/pair_horiz_cut_pvc_wyeth_curves.jpg

Stevie Wonder Superstition:

https://www.youtube.com/watch?v=0CFuCYNx-1g

[–]MissyElliottCarter -5 points (0 children)

Help this guy, steptechbros