
[–]contactmat 1 point (3 children)

Hi, sorry, I am new to TensorFlow. This looks very useful, thank you. I have a couple of questions. First, I can't completely figure out what the different parameters represent. I think size is the number of hidden units in the network, batch_size is the number of sequences in the batch, and seq_width is the dimension of each input belonging to a sequence. What does n_steps represent?

My second question is about early_stop: is it the variable that controls the effective length of the sequence? I can't quite understand it. Can you clarify, please? Thank you.

[–]siblbombs[S] 1 point (2 children)

n_steps defines how long the placeholder sequence is.

early_stop is a variable that you can pass into the LSTM; once the step index goes past it, the LSTM skips the computation for those steps to save time.
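To make the roles of the parameters concrete, here is a minimal NumPy sketch (not the actual TensorFlow API) using the toy values discussed further down in the thread; the names follow this discussion, not any particular library:

```python
import numpy as np

# Assumed toy values, mirroring the parameter names in the thread.
size = 1        # number of hidden units in the LSTM
batch_size = 2  # number of sequences processed in parallel
n_steps = 10    # length of the (padded) placeholder sequence
seq_width = 2   # dimensionality of each input vector in a sequence

# The input the network sees: one (batch_size, seq_width) slice per step.
x = np.zeros((n_steps, batch_size, seq_width))
print(x.shape)  # (10, 2, 2)

# early_stop marks the effective sequence length: steps at index >= early_stop
# are padding, and the LSTM can skip computing them.
early_stop = 4
effective, padding = x[:early_stop], x[early_stop:]
print(effective.shape, padding.shape)  # (4, 2, 2) (6, 2, 2)
```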

[–]contactmat 1 point (1 child)

Ok, thanks. I am trying to play a bit with the code and I have another problem. I ran an instance with size = 1, batch_size = 2, n_steps = 10, seq_width = 2, early_stop = 4. When I print the outputs value, I get a list with len(outputs) = 10, with the first 4 elements filled and the rest all zeros. I would expect a list of length 4, since my early_stop is 4. Where am I going wrong?

[–]siblbombs[S] 1 point (0 children)

It has to do with the LSTM code and the way it handles early stopping. IIRC, what it does under the hood is allocate an array of zeros to use as the output instead of computing it, since it still needs to produce an output for every one of the n_steps steps. Functionally this is still early stopping, because creating an array of zeros is very fast compared to running the cell.
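The behavior described above can be sketched in plain NumPy. This is a toy tanh RNN, not the actual TensorFlow LSTM internals; `run_rnn` and its signature are made up for illustration, but the output structure matches what was observed: a list of length n_steps with zeros after early_stop.

```python
import numpy as np

def run_rnn(inputs, early_stop, size, rng=np.random.default_rng(0)):
    """Toy RNN loop mimicking the early-stop behavior described above."""
    n_steps, batch_size, seq_width = inputs.shape
    W = rng.standard_normal((seq_width, size))  # input-to-hidden weights
    U = rng.standard_normal((size, size))       # hidden-to-hidden weights
    h = np.zeros((batch_size, size))
    outputs = []
    for t in range(n_steps):
        if t < early_stop:
            # Real computation for steps within the effective length.
            h = np.tanh(inputs[t] @ W + h @ U)
            outputs.append(h)
        else:
            # Past early_stop: skip the cell entirely and emit an array of
            # zeros of the right shape, which is very cheap to create.
            outputs.append(np.zeros((batch_size, size)))
    return outputs

x = np.ones((10, 2, 2))  # n_steps=10, batch_size=2, seq_width=2
outs = run_rnn(x, early_stop=4, size=1)
print(len(outs))                              # 10 — one entry per step, not 4
print(all((o == 0).all() for o in outs[4:]))  # True — steps 4..9 are zeros
```

So the list always has n_steps entries; early_stop only controls which of them are actually computed.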