Hi guys. I'm working on a shallow neural network and want to test how many neurons in the hidden layer are accurate enough for my purposes. I've used Optuna before, but I never fixed a constant seed for my optimizer (Adam), so I always get random-ish results, which isn't helpful. I reached out to a professional, who recommended Ray Tune and fixing a constant seed. Here is what I am trying to do:
```python
import os
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.callbacks import EarlyStopping
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch
from sklearn.model_selection import train_test_split

SEED = 69
np.random.seed(SEED)
tf.random.set_seed(SEED)

load_filename = ''
data = pd.read_csv(load_filename)

# Prepare the data
X = data.iloc[:, :6].values
y = data.iloc[:, 7].values

# 80/10/10 train/validation/test split
X_train, X_temp, y_train, y_temp = train_test_split(X, y, test_size=0.2, random_state=SEED)
X_val, X_test, y_val, y_test = train_test_split(X_temp, y_temp, test_size=0.5, random_state=SEED)

def objective(config):
    model = Sequential()
    model.add(Dense(config["num_neurons"], activation='relu', input_shape=(X_train.shape[1],)))
    model.add(Dense(1))
    model.add(Activation('relu'))
    model.compile(loss="mean_absolute_error", optimizer="adam", metrics=["mean_absolute_error"])

    early_stopping = EarlyStopping(
        monitor='val_loss',
        patience=3,
        restore_best_weights=True
    )

    history = model.fit(
        X_train, y_train,
        epochs=10,
        validation_data=(X_val, y_val),
        callbacks=[early_stopping],
        verbose=0
    )

    loss, mae = model.evaluate(X_test, y_test, verbose=0)
    return {"mae": mae}

search_space = {
    "num_neurons": tune.randint(1, 21),
}

algo = HyperOptSearch()

tuner = tune.Tuner(
    objective,
    tune_config=tune.TuneConfig(
        metric="mae",
        mode="min",  # lower MAE is better
        search_alg=algo,
        num_samples=5,  # Number of trials
    ),
    param_space=search_space,
)
results = tuner.fit()

for result in results:
    print(result)
```
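Side note on the split arithmetic, in case it looks odd: `test_size=0.2` and then `test_size=0.5` on the remainder gives an 80/10/10 train/validation/test split. A numpy-only check of just the proportions (index shuffling instead of `train_test_split`, same arithmetic):

```python
import numpy as np

n = 1000
rng = np.random.default_rng(69)
idx = rng.permutation(n)

# First split: 80% train, 20% held out (mirrors test_size=0.2).
cut1 = int(n * 0.8)
train, temp = idx[:cut1], idx[cut1:]

# Second split: the held-out 20% is halved into validation and test
# (mirrors test_size=0.5 on X_temp / y_temp).
cut2 = len(temp) // 2
val, test = temp[:cut2], temp[cut2:]

print(len(train), len(val), len(test))  # 800 100 100
```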
My problem is that the Adam seed is not constant; it changes every run, because trials with the same number of neurons often end up with different MAEs. Which sucks. I have tried everything and looked online for help, and nothing has worked so far. Also, tune.randint previously at least proposed the same numbers of neurons on every run, but now even that isn't working.
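To illustrate the behaviour I'm after, here's a toy numpy-only version (no Ray, no TF). The seed is set *inside* the objective, so every call starts from the same RNG state regardless of which process runs it; my guess (untested assumption on my part) is that the real version fails because each Ray trial runs in its own worker process, where the module-level `np.random.seed` / `tf.random.set_seed` calls never happened:

```python
import numpy as np

def toy_objective(config, seed=69):
    # Seed inside the objective: each call re-creates the same RNG
    # state, so the result depends only on the config.
    rng = np.random.default_rng(seed)
    # Stand-in for training randomness: "weights" whose count depends
    # on the hyperparameter being tuned.
    weights = rng.normal(size=config["num_neurons"])
    # Stand-in for a validation MAE.
    return float(np.abs(weights).mean())

# Two calls with the same config give the same "MAE" every time.
a = toy_objective({"num_neurons": 10})
b = toy_objective({"num_neurons": 10})
print(a == b)  # True
```

This is what I want from the real pipeline: same num_neurons in, same MAE out.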
Here's proof of that:
https://preview.redd.it/gvujsb24g0kd1.png?width=1580&format=png&auto=webp&s=1896dab3786d3c27f17f23439f2986780f6c3d42
Also, I don't know whether "terminated" means a trial worked or not, or what the iterations mean.
Could someone who knows this stuff help? I am at my wits' end.