89. Max-Norm Regularization

Another useful regularization technique for neural networks is max-norm regularization: for each neuron, it constrains the weights w of the incoming connections such that ||w||₂ ≤ r, where r is the max-norm hyperparameter.

Implementation

from tensorflow import keras

# Constrain each neuron's incoming weight vector to an L2 norm of at most 1
layer = keras.layers.Dense(100, activation="selu", kernel_initializer="lecun_normal",
                           kernel_constraint=keras.constraints.max_norm(1.))
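
For a convolutional layer, the book notes that the constraint's axis argument should be set appropriately, usually axis=[0, 1, 2], so that each output feature map's kernel is constrained independently. A sketch (the filter count and kernel size are illustrative):

conv = keras.layers.Conv2D(64, 3, activation="selu", kernel_initializer="lecun_normal",
                           kernel_constraint=keras.constraints.max_norm(1., axis=[0, 1, 2]))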

The hyperparameter r controls the strength of the constraint: after each training step, if a neuron's incoming weight vector w has ||w||₂ > r, it is rescaled (w ← w r / ||w||₂). Reducing r increases the amount of regularization and helps reduce overfitting.
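
For intuition, here is a minimal NumPy sketch of that rescaling rule, mirroring what Keras's MaxNorm constraint computes. The helper apply_max_norm is hypothetical, not part of the Keras API:

import numpy as np

def apply_max_norm(w, r=1.0, axis=0):
    # Hypothetical helper: rescale each column of w (one per neuron)
    # so that its L2 norm is at most r.
    norms = np.sqrt(np.sum(np.square(w), axis=axis, keepdims=True))
    desired = np.clip(norms, 0, r)       # norms above r are clipped down to r
    return w * desired / (norms + 1e-7)  # rescale; tiny epsilon avoids division by zero

# Example: a (3 inputs x 2 neurons) weight matrix; the first column's norm is 5
w = np.array([[3.0, 0.1],
              [4.0, 0.2],
              [0.0, 0.1]])
print(np.linalg.norm(apply_max_norm(w, r=1.0), axis=0))
# -> approx. [1.0, 0.245]: the first column is rescaled to norm 1, the second is untouched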

References:
Géron, A. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition (O'Reilly, 2019)