Randomly sets a fraction (given by rate) of the input units to 0 at each update during training, which helps prevent overfitting.

dropout(rate = 0.5)

Arguments

rate

The fraction of input units to drop, a number between 0 and 1 (defaults to 0.5)

Value

A construct of class "ruta_network"

See also

Other neural layers: conv(), dense(), input(), layer_keras(), output(), variational_block()
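
Examples

A minimal sketch, assuming that a dropout layer can be interleaved with the other layers listed under See also by means of the + operator used to build ruta networks. The layer sizes, activations, and dropout rates below are illustrative choices, not values taken from this page.

library(ruta)

# Assumed composition: dropout placed between an input layer and dense
# layers; the rate of 0.2 is arbitrary and chosen only for illustration
net <- input() +
  dropout(rate = 0.2) +
  dense(36, activation = "tanh") +
  dropout(rate = 0.2) +
  output(activation = "sigmoid")

class(net)  # expected to include "ruta_network"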