Category: TensorFlow

342. Fine-Tune VGG16 with BatchNorm

Implementation: Here is one way you can fine-tune VGG16 while adding a batch normalization layer using Keras. 1. Import: from keras.applications.vgg16 import VGG16; from keras.optimizers import SGD; from keras.layers import Input, Dense, Flatten, BatchNormalization, Activation; from keras.models import Sequential; from…
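The excerpt above is cut off at the imports. A minimal sketch of how the pieces might fit together is shown below; the frozen convolutional base, the 10-class head, and the SGD hyperparameters are illustrative assumptions, not values from the original post.

    from keras.applications.vgg16 import VGG16
    from keras.optimizers import SGD
    from keras.layers import Dense, Flatten, BatchNormalization, Activation
    from keras.models import Sequential

    # Load VGG16 without its classifier head and freeze the convolutional base
    base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    for layer in base.layers:
        layer.trainable = False

    # Stack a new head with batch normalization on top of the frozen base
    model = Sequential([
        base,
        Flatten(),
        Dense(256),
        BatchNormalization(),              # normalize activations before the non-linearity
        Activation("relu"),
        Dense(10, activation="softmax"),   # 10 classes is an assumption for illustration
    ])

    model.compile(optimizer=SGD(learning_rate=1e-3, momentum=0.9),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])

Freezing the base and training only the new head first is a common fine-tuning pattern; you can later unfreeze some of the top convolutional blocks and continue training with a lower learning rate.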

118. Pre-Fetching Data

Pre-fetching your data can help your data pre-processing pipeline run more smoothly. Without pre-fetching, the CPU waits for the current step on the GPU to finish before it starts preparing the next batch of data. You can do this process in parallel…
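A minimal sketch of this idea with the tf.data API, assuming TensorFlow 2.x and placeholder in-memory data; prefetch() lets the CPU prepare upcoming batches while the GPU works on the current one (on older TensorFlow versions, use tf.data.experimental.AUTOTUNE instead of tf.data.AUTOTUNE):

    import numpy as np
    import tensorflow as tf

    # Dummy in-memory data standing in for a real dataset (shapes are illustrative)
    features = np.random.rand(1024, 32).astype("float32")
    labels = np.random.randint(0, 2, size=(1024,))

    dataset = (tf.data.Dataset.from_tensor_slices((features, labels))
               .shuffle(buffer_size=1024)
               .batch(32)
               .prefetch(tf.data.AUTOTUNE))  # overlap CPU preprocessing with GPU training

The resulting dataset can be passed directly to model.fit(); the prefetch buffer is filled in the background while the model consumes earlier batches.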

85. Keras Model Transfer-Learning

Load model: model_A = keras.models.load_model("model_A.h5"). Clone architecture: model_A_clone = keras.models.clone_model(model_A). Clone weights: model_A_clone.set_weights(model_A.get_weights()). Delete last layer: model_B = keras.models.Sequential(model_A_clone.layers[:-1]). Add final layer (change to a binary classifier): model_B.add(keras.layers.Dense(1, activation="sigmoid")). You can prevent the copied layers from being affected when training for…
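Putting the excerpted steps together, a minimal sketch might look like the following. It assumes tf.keras, that "model_A.h5" is an existing trained Sequential model saved to disk, and the freezing loop at the end illustrates one way to keep the copied layers from being updated while the new output layer trains, which is what the truncated sentence appears to describe.

    from tensorflow import keras

    # Load the previously trained model (placeholder file name from the excerpt)
    model_A = keras.models.load_model("model_A.h5")

    # Clone the architecture and copy the weights so model_A itself stays untouched
    model_A_clone = keras.models.clone_model(model_A)
    model_A_clone.set_weights(model_A.get_weights())

    # Reuse every layer except the old output layer, then add a binary classifier head
    model_B = keras.models.Sequential(model_A_clone.layers[:-1])
    model_B.add(keras.layers.Dense(1, activation="sigmoid"))

    # Freeze the copied layers so early training of the new head does not disturb them
    for layer in model_B.layers[:-1]:
        layer.trainable = False

    model_B.compile(loss="binary_crossentropy", optimizer="sgd", metrics=["accuracy"])

After the new output layer has settled, the frozen layers can be unfrozen (and the model recompiled) to fine-tune the whole network with a smaller learning rate.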