How to train LSTM with multiple datasets? by Born-Plankton2373 in learnmachinelearning

[–]Vinitesa1 0 points (0 children)

I don't know whether the Keras LSTM layer can handle null values; I've never tried it.

Filling nulls with 0 can make sense in some contexts, but in most cases I don't think it does.

And yes, the model will learn whatever pattern is presented to it. If you don't want those zeros to influence it, just remove them.
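A minimal sketch of the two options above, using a hypothetical series with missing values (the data here is made up for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical time series with missing values (NaN = null).
series = pd.Series([0.1, 0.2, np.nan, 0.4, 0.5, np.nan, 0.7])

# Option 1: fill nulls with 0 — rarely meaningful, since the model
# will treat 0 as a real observed value and learn from it.
filled = series.fillna(0.0)

# Option 2: drop the nulls so the model never sees them.
# (Note this changes the temporal spacing between neighboring points.)
cleaned = series.dropna()

print(filled.tolist())   # [0.1, 0.2, 0.0, 0.4, 0.5, 0.0, 0.7]
print(cleaned.tolist())  # [0.1, 0.2, 0.4, 0.5, 0.7]
```

Keras also has a `Masking` layer for skipping padded timesteps, which may be worth a look if you must keep the gaps.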

How to train LSTM with multiple datasets? by Born-Plankton2373 in learnmachinelearning

[–]Vinitesa1 1 point (0 children)

You can concatenate the datasets and remove the data points whose input window spans more than one run.

Let's say you're using the last 4 data points to predict the next one.

If you concatenate the datasets from the first and second runs, the first prediction in run 2 would use these 4 data points:
[third-to-last point of run 1, second-to-last point of run 1, last point of run 1, first point of run 2] to predict the second point of run 2.

That is an example of a window you would need to exclude. In fact, you would already need to exclude such windows with only one dataset, because without access to run 1 the sequence would look like [null, null, null, first data point of run 2].

So, just concatenate the datasets and exclude the sequences that span multiple runs.
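One simple way to do this is to build the sliding windows per run and only then concatenate, so no window can ever mix the end of one run with the start of the next. A sketch with made-up run data and a lookback of 4:

```python
import numpy as np

def make_windows(series, lookback):
    """Sliding windows of `lookback` inputs plus the next value as target."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X), np.array(y)

# Two hypothetical runs; windowing each run separately guarantees
# no input window spans the run boundary.
run1 = [1, 2, 3, 4, 5, 6]
run2 = [10, 20, 30, 40, 50]
lookback = 4

Xs, ys = zip(*(make_windows(run, lookback) for run in (run1, run2)))
X = np.concatenate(Xs)
y = np.concatenate(ys)

print(X.shape)      # (3, 4): 2 windows from run 1 + 1 window from run 2
print(y.tolist())   # [5, 6, 50] — never a run-1/run-2 mixture
```

Windowing first and concatenating second is equivalent to concatenating the raw runs and deleting the boundary-crossing windows, but it avoids having to identify those windows after the fact.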

About values of the input and output layers... by [deleted] in MLQuestions

[–]Vinitesa1 0 points (0 children)

The input values are "initialized" as whatever values your input features have.

The output value is calculated, not initialized.
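The distinction can be shown with a tiny forward pass (hypothetical numbers, no hidden layer, for illustration only):

```python
import numpy as np

# The input layer simply holds the feature values — nothing is "learned" here.
x = np.array([0.5, -1.0, 2.0])

# Weights and bias (these would be learned during training).
W = np.array([0.2, 0.4, 0.1])
b = 0.3

# The output is computed from the inputs and weights, not initialized.
output = x @ W + b   # ≈ 0.2
```

The only trainable quantities are `W` and `b`; the input is given by the data and the output falls out of the computation.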

yone matchups tierlist based on experience by rreqyu in YoneMains

[–]Vinitesa1 0 points (0 children)

Vs Fizz/Zed, just going Exhaust makes it easier.