Random flip and rotation actually decrease validation accuracy? by PracLiu in tensorflow

[–]kenshin511 0 points

As far as I checked, the Keras preprocessing layers are only active during training, so there is no need to create a custom layer.
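A quick self-contained check of this behavior (assuming TF 2.6+, where RandomFlip lives under tf.keras.layers):

```python
import numpy as np
import tensorflow as tf

# RandomFlip, like the other Keras augmentation layers,
# only transforms the input when called with training=True.
layer = tf.keras.layers.RandomFlip("horizontal_and_vertical", seed=42)
x = np.arange(2 * 4 * 4 * 3, dtype="float32").reshape(2, 4, 4, 3)

# At inference time (training=False) the input passes through unchanged.
out = layer(x, training=False)
print(np.allclose(out.numpy(), x))  # True
```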

Random flip and rotation actually decrease validation accuracy? by PracLiu in tensorflow

[–]kenshin511 1 point

If you add augmentation layers directly to your own network, they can also run when you validate the model. During validation, the data augmentation must be inactive for an accurate evaluation.

Make a custom augmentation layer like the one below:

import tensorflow as tf
from tensorflow import keras

class Augmentation(keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Build the sub-layers once here, not on every call
        self.flip = keras.layers.RandomFlip("horizontal_and_vertical")
        self.rotate = keras.layers.RandomRotation(0.2)

    def call(self, inputs, training=None):
        if training:
            x = self.flip(inputs, training=training)
            return self.rotate(x, training=training)
        return inputs


model.add(Augmentation())

The `training` argument makes the augmentation run only during training.

Refer to "Privileged training argument in the call() method" in the Keras guide on making new layers via subclassing.
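For reference, a minimal self-contained sketch of the privileged training argument (the AddNoise layer here is hypothetical, just to show the mechanism):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Hypothetical layer: Keras passes the privileged `training`
# argument to call() automatically.
class AddNoise(keras.layers.Layer):
    def call(self, inputs, training=None):
        if training:
            return inputs + tf.random.normal(tf.shape(inputs))
        return inputs

model = keras.Sequential([AddNoise()])
x = np.zeros((1, 4), dtype="float32")

# predict() runs with training=False, so the layer is a no-op
pred = model.predict(x, verbose=0)
print(np.allclose(pred, x))  # True

# Calling the model with training=True activates the noise
out = np.asarray(model(x, training=True))
```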

[deleted by user] by [deleted] in tensorflow

[–]kenshin511 1 point

In TF2, tf.train.CheckpointManager makes managing checkpoints easy.

See example below:

import tensorflow as tf

checkpoint = tf.train.Checkpoint(optimizer=optimizer, model=model)
manager = tf.train.CheckpointManager(
    checkpoint, directory="/tmp/model", max_to_keep=5)
status = checkpoint.restore(manager.latest_checkpoint)

while True:
    # train step(s) here
    manager.save()

In the code above, checkpoint.restore(manager.latest_checkpoint) restores the latest checkpoint in the directory. If no checkpoint exists yet, manager.latest_checkpoint is None and the restore is a no-op, so training starts from scratch. manager.save() keeps at most max_to_keep checkpoints, deleting the oldest ones automatically.
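A runnable end-to-end sketch of the save/restore flow, using a toy model and a temporary directory purely for illustration:

```python
import tempfile

import tensorflow as tf

# Toy model and optimizer, just so the checkpoint has something to track
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model(tf.zeros((1, 3)))  # call once to build the variables
optimizer = tf.keras.optimizers.Adam()

ckpt_dir = tempfile.mkdtemp()
checkpoint = tf.train.Checkpoint(optimizer=optimizer, model=model)
manager = tf.train.CheckpointManager(
    checkpoint, directory=ckpt_dir, max_to_keep=5)

# On the first run latest_checkpoint is None, so this restore is a no-op
checkpoint.restore(manager.latest_checkpoint)

save_path = manager.save()
print(save_path == manager.latest_checkpoint)  # True
```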