Category Machine Learning

94. TorchInfo

For a TensorFlow model, you can call .summary() to see the structure of your model. PyTorch has no built-in equivalent, so you can use torchinfo instead. This was really useful when I wanted to do transfer-learning but didn’t…

92. DataLoader in PyTorch

When you want to load your data for training, the data preparation pipeline is usually the following:

1. Randomly shuffle your data
2. Turn it into batches
3. Iterate over the batches

Although you can do this manually, when the data size becomes large this can…
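A minimal sketch of that shuffle–batch–iterate pipeline with torch.utils.data.DataLoader (the tensors here are made-up dummy data):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy dataset: 10 samples with 3 features each, plus labels.
X = torch.arange(30, dtype=torch.float32).reshape(10, 3)
y = torch.arange(10)

dataset = TensorDataset(X, y)
# shuffle=True reshuffles every epoch; batch_size controls the chunking.
loader = DataLoader(dataset, batch_size=4, shuffle=True)

for xb, yb in loader:
    print(xb.shape[0])  # 4, 4, 2 — the last batch is smaller
```

Each iteration yields one batch, so the training loop simply wraps `for xb, yb in loader:` without any manual index bookkeeping.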

89. Max-Norm Regularization

Another useful regularization technique is called Max-Norm Regularization.

Implementation:

```python
layer = keras.layers.Dense(100, activation="selu",
                           kernel_initializer="lecun_normal",
                           kernel_constraint=keras.constraints.max_norm(1.))
```

By setting the hyperparameter r you can cap the norm of each unit's incoming weights to prevent over-fitting.

References: Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow,…
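What max_norm(r) does is rescale each unit's incoming-weight vector whenever its L2 norm exceeds r. A NumPy sketch of that projection (an illustration of the math, not Keras's actual implementation):

```python
import numpy as np

def max_norm_constraint(W, r=1.0):
    # W has shape (inputs, units); each column is one unit's
    # incoming-weight vector. Rescale columns whose L2 norm > r.
    norms = np.sqrt((W ** 2).sum(axis=0, keepdims=True))
    scale = np.minimum(r / np.maximum(norms, 1e-12), 1.0)
    return W * scale

W = np.array([[3.0, 0.1],
              [4.0, 0.2]])  # column norms: 5.0 and ~0.224
W_clipped = max_norm_constraint(W, r=1.0)
print(np.linalg.norm(W_clipped, axis=0))  # first column shrunk to norm 1.0, second untouched
```

Keras applies this constraint after each gradient update, so weights can never drift beyond the ball of radius r.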

88. Monte Carlo Dropout

Dropout is one of the most popular regularization techniques for deep neural networks. Monte Carlo Dropout may help boost the dropout model even more.

Full implementation:

```python
ys = np.stack([model(X_test, training=True) for sample in range(100)])
y = ys.mean(axis=0)
```

The predict() method returns…
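The idea behind the snippet above is that keeping dropout active at prediction time (training=True) makes every forward pass stochastic; averaging many passes gives a better point estimate plus an uncertainty estimate. A library-free NumPy sketch of that averaging, using a hypothetical one-layer "model":

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))        # toy weights of a one-layer "model"
X_test = rng.normal(size=(5, 4))   # toy inputs

def stochastic_forward(X, drop_rate=0.5):
    # Dropout stays ON at prediction time: sample a fresh mask
    # each call and rescale the survivors (inverted dropout).
    mask = rng.random(X.shape) > drop_rate
    return (X * mask / (1.0 - drop_rate)) @ W

# 100 Monte Carlo forward passes, then average over the sample axis.
ys = np.stack([stochastic_forward(X_test) for _ in range(100)])
y = ys.mean(axis=0)      # MC-dropout prediction, shape (5, 3)
y_std = ys.std(axis=0)   # per-output uncertainty estimate
```

The standard deviation across samples is the extra payoff: it tells you which predictions the dropout ensemble disagrees on.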