Category AI

94. TorchInfo

For a TensorFlow model, you can call .summary() to see the structure of your model. PyTorch has no built-in equivalent, so you can use torchinfo instead. This was really useful when I wanted to do transfer-learning but didn’t…

93. Dataset ⇒ DataLoader Pipeline in Pytorch

Here is one way to prepare your data using DataLoader in PyTorch. 1. Create a Custom Dataset Class When using a custom dataset, the following three methods have to be overridden. class Dataset: #Initialize the dataset with the input data and…
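A minimal sketch of the three overridden methods, using a made-up in-memory dataset (the class name and toy tensors are illustrative, not from the original post):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    def __init__(self, X, y):
        # Initialize the dataset with the input data and labels
        self.X = X
        self.y = y

    def __len__(self):
        # Number of samples in the dataset
        return len(self.X)

    def __getitem__(self, idx):
        # Return one (feature, label) pair by index
        return self.X[idx], self.y[idx]

X = torch.randn(100, 4)
y = torch.randint(0, 2, (100,))
loader = DataLoader(MyDataset(X, y), batch_size=32, shuffle=True)

xb, yb = next(iter(loader))
print(xb.shape)  # torch.Size([32, 4])
```

With these three methods in place, DataLoader handles the shuffling and batching for you.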

92. DataLoader in Pytorch

When you want to load your data for training, the data preparation pipeline would be the following: randomly shuffle your data, turn it into batches, and iterate. Although you can do this manually, when the data size becomes large this can…
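Those three steps (shuffle, batch, iterate) collapse into one object. A small sketch with made-up toy tensors:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Ten toy samples: one feature column plus integer labels
X = torch.arange(10, dtype=torch.float32).unsqueeze(1)
y = torch.arange(10)

# DataLoader shuffles, batches, and exposes an iterator in one step
ds = TensorDataset(X, y)
loader = DataLoader(ds, batch_size=4, shuffle=True)

batch_sizes = [len(yb) for _, yb in loader]
print(batch_sizes)  # [4, 4, 2] — the last batch holds the remainder
```

Setting `shuffle=True` reshuffles at the start of every epoch, so each pass over the loader sees the batches in a different order.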

89. Max-Norm Regularization

Another useful regularization technique is called Max-Norm Regularization. Implementation layer = keras.layers.Dense(100, activation="selu", kernel_initializer="lecun_normal", kernel_constraint=keras.constraints.max_norm(1.)) By setting the hyper-parameter r you cap the norm of each unit’s incoming weights to prevent over-fitting. References: Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow,…
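Under the hood, the constraint rescales any weight vector whose norm exceeds r after each update: w ← w · r / ‖w‖. A NumPy sketch of that rescaling step (my own illustration of what `keras.constraints.max_norm` applies, not the library code):

```python
import numpy as np

def max_norm_constraint(w, r=1.0, axis=0):
    # Rescale each column of w so its L2 norm is at most r;
    # columns already within the limit are left (almost) untouched
    norms = np.sqrt(np.sum(np.square(w), axis=axis, keepdims=True))
    desired = np.clip(norms, 0, r)
    return w * (desired / (1e-7 + norms))

w = np.array([[3.0, 0.3],
              [4.0, 0.4]])  # column norms: 5.0 and 0.5
w_c = max_norm_constraint(w, r=1.0)
print(np.linalg.norm(w_c, axis=0))  # ≈ [1.0, 0.5]
```

The first column (norm 5.0) is shrunk down to norm 1.0, while the second (norm 0.5) is already within the limit and passes through unchanged.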

88. Monte Carlo Dropout

Dropout is one of the most popular regularization techniques for deep neural networks. Monte Carlo Dropout may boost a dropout model even further. Full Implementation ys = np.stack([model(X_test, training=True) for sample in range(100)]) y = ys.mean(axis=0) The predict() method returns…
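The same idea in PyTorch, as a sketch with a made-up toy model: the trick is simply to keep dropout active at inference time (here by leaving the model in train mode) and average many stochastic forward passes.

```python
import torch

# Hypothetical toy model with a dropout layer
model = torch.nn.Sequential(
    torch.nn.Linear(4, 16), torch.nn.ReLU(),
    torch.nn.Dropout(p=0.5),
    torch.nn.Linear(16, 2),
)

model.train()  # keep dropout ON during inference — this is MC Dropout
X_test = torch.randn(8, 4)

with torch.no_grad():
    # 100 stochastic forward passes, each with a different dropout mask
    ys = torch.stack([model(X_test) for _ in range(100)])

y_mean = ys.mean(dim=0)  # Monte Carlo estimate of the prediction
y_std = ys.std(dim=0)    # spread across passes ≈ model uncertainty
print(y_mean.shape)      # torch.Size([8, 2])
```

The standard deviation across the passes is the bonus: it gives a rough per-prediction uncertainty estimate that a single deterministic forward pass cannot.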

85. Keras Model Transfer-Learning

Load model model_A = keras.models.load_model("model_A.h5") Clone Architecture model_A_clone = keras.models.clone_model(model_A) Clone Weights model_A_clone.set_weights(model_A.get_weights()) Delete Last Layer model_B = keras.models.Sequential(model_A_clone.layers[:-1]) Add Final Layer => Change to Binary Classifier model_B.add(keras.layers.Dense(1, activation="sigmoid")) You can prevent the copied layers from being affected when training for…
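Putting the steps together as a runnable sketch — since I don’t have the original model_A.h5, a small made-up model stands in for the pretrained network, and the reused layers are frozen with `trainable = False` so training only updates the new head:

```python
from tensorflow import keras

# Hypothetical small "model A" standing in for a pretrained network
model_A = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="softmax"),
])

# Clone the architecture, then copy the weights across
model_A_clone = keras.models.clone_model(model_A)
model_A_clone.set_weights(model_A.get_weights())

# Drop the old output layer and attach a binary head
model_B = keras.Sequential(model_A_clone.layers[:-1])
model_B.add(keras.layers.Dense(1, activation="sigmoid"))

# Freeze the reused layers so only the new head trains
for layer in model_B.layers[:-1]:
    layer.trainable = False
model_B.compile(loss="binary_crossentropy", optimizer="sgd")
```

Freezing matters because the randomly initialized head produces large gradients at first, which would otherwise wreck the transferred weights; a common follow-up is to unfreeze and fine-tune with a low learning rate once the head has settled.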

84. Keras Model Hyper-parameter Tuning

One way to tune your Keras model is random search. Here is one example with sklearn. from scipy.stats import reciprocal from sklearn.model_selection import RandomizedSearchCV #Specify range for random initialization param_distribs = { "n_hidden": [0, 1, 2, 3], "n_neurons": np.arange(1,…
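A self-contained sketch of the same random-search pattern — note I’ve swapped the Keras model for sklearn’s `MLPClassifier` as a stand-in (the old `keras.wrappers.scikit_learn` wrapper is deprecated), and the dataset and search space are made up for illustration:

```python
from scipy.stats import reciprocal
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Toy dataset standing in for real training data
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

# Search space: discrete layer widths, log-uniform learning rate
param_distribs = {
    "hidden_layer_sizes": [(16,), (32,), (64,)],
    "learning_rate_init": reciprocal(1e-4, 1e-1),
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=200, random_state=42),
    param_distribs, n_iter=5, cv=3, random_state=42,
)
search.fit(X, y)
print(search.best_params_)
```

The `reciprocal` (log-uniform) distribution is the usual choice for learning rates, since it samples evenly across orders of magnitude rather than clustering near the upper bound.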