363. Splitting Datasets
Training Your Model When training a model, the dataset is often divided into a Training Set, a Validation Set, and a Test Set. The ratio used to split the data into these three sets depends on how large your dataset is,…
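As a rough sketch of how such a split could be produced (assuming scikit-learn and an illustrative 80/10/10 ratio, which is only one of many reasonable choices), the data can be split twice with train_test_split:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Dummy data standing in for a real dataset.
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

# First carve off 20% for validation + test, then split that part in half,
# giving an 80/10/10 ratio overall.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.2, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 800 100 100
```

The larger the dataset, the smaller the validation and test fractions can usually be while still giving reliable estimates.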
Clustering Depending on your data and objective, you may not even have to train a deep-learning model for image segmentation. Here is one way to apply segmentation using K-means clustering. Implementation from sklearn.cluster import KMeans from matplotlib.image import imread import…
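A minimal sketch of that kind of K-means segmentation is shown below, assuming scikit-learn and matplotlib; the file name ladybug.png and the cluster count of 8 are purely illustrative.

```python
from sklearn.cluster import KMeans
from matplotlib.image import imread
import matplotlib.pyplot as plt

# Hypothetical input image; replace with your own RGB image file.
image = imread("ladybug.png")          # shape: (H, W, 3)
X = image.reshape(-1, 3)               # one row per pixel, RGB features

# Cluster pixel colors into k groups and rebuild the image
# from each pixel's cluster center (color quantization / segmentation).
kmeans = KMeans(n_clusters=8, random_state=42).fit(X)
segmented = kmeans.cluster_centers_[kmeans.labels_].reshape(image.shape)

plt.imshow(segmented)
plt.axis("off")
plt.show()
```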
Data Augmentation Data Augmentation is a technique used to “increase” the amount of data to train a model. This can be helpful in cases such as when you don’t have a sufficient amount of data or when you want to…
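As one illustration of the idea (TensorFlow/Keras is assumed here only for the example), a small augmentation pipeline can randomly flip, rotate, and zoom each training image so the model effectively sees slightly different variants of the same data:

```python
import tensorflow as tf

# Illustrative augmentation pipeline applied on the fly during training.
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

# Example usage on a dummy batch of 32 RGB images.
images = tf.random.uniform((32, 224, 224, 3))
augmented = data_augmentation(images, training=True)
print(augmented.shape)  # (32, 224, 224, 3)
```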
Implementation Saving checkpoints of model weights during training can be helpful in cases such as the following: you want to resume training later; you want to avoid losing weight data when the process stops mid-training due to some kind of error; you want to restore…
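A minimal sketch of such checkpointing, assuming Keras; the model, dummy data, and file path are illustrative only:

```python
import os
import numpy as np
import tensorflow as tf

# Dummy data and model, purely for illustration.
X = np.random.rand(256, 20).astype("float32")
y = np.random.randint(0, 2, size=256).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Save the weights after every epoch so training can be resumed later,
# or restored if the process dies partway through.
os.makedirs("checkpoints", exist_ok=True)
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    "checkpoints/ckpt-{epoch:02d}.weights.h5",
    save_weights_only=True,
)
model.fit(X, y, epochs=3, callbacks=[checkpoint_cb], verbose=0)

# To restore: model.load_weights("checkpoints/ckpt-03.weights.h5")
```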
Anomaly Detection Anomaly detection in videos refers to the identification of events that do not align with the expected behavior. This paper is a pioneering work that leverages the difference between a predicted future frame and its ground truth to…
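As a simplified sketch of that idea, the difference between the predicted frame and the ground-truth frame can be turned into a score, for example via PSNR, where a low PSNR (large prediction error) suggests a potential anomaly. The frames below are toy arrays standing in for the output of a real prediction network:

```python
import numpy as np

def psnr(pred, gt, max_val=1.0):
    """Peak signal-to-noise ratio between a predicted frame and its ground truth."""
    mse = np.mean((pred - gt) ** 2)
    return 10 * np.log10(max_val ** 2 / mse)

# Toy frames: the "prediction" is the ground truth plus a little noise.
ground_truth = np.random.rand(256, 256, 3)
predicted = np.clip(ground_truth + np.random.normal(0, 0.05, ground_truth.shape), 0, 1)

# Frames the model predicts poorly (low PSNR) deviate from expected behavior.
print(f"PSNR: {psnr(predicted, ground_truth):.2f} dB")
```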
Different Approaches Here are the different categories of transfer learning, which depend on which types of data are available when training. Reference: A Survey on Transfer Learning
What It Does One-shot learning can be useful when you want to identify someone just by giving the predicting model a single picture of them. Similarity Function A model can perform one-shot learning by learning a “similarity” function. When the…
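A minimal sketch of how such a similarity function is used at prediction time; the embeddings and threshold below are placeholders, and in practice they would come from a trained encoder such as a Siamese network:

```python
import numpy as np

def similarity(f1, f2):
    """Distance between two embedding vectors; smaller means 'more likely the same person'."""
    return np.linalg.norm(f1 - f2)

# Random vectors stand in for the encoder's output, purely to show the decision step.
reference_embedding = np.random.rand(128)   # from the single reference photo
query_embedding = np.random.rand(128)       # from the new photo to verify

threshold = 0.7  # hypothetical value, tuned on a validation set
d = similarity(reference_embedding, query_embedding)
print("same person" if d < threshold else "different person")
```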
The 4 Main Steps There are mainly four steps in data preprocessing: Data Quality Assessment, Data Cleaning, Data Transformation, and Data Reduction. 1. Data Quality Assessment Before jumping into coding, evaluating the overall data quality is essential. Here are several problems…
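As a small illustration of the cleaning and transformation steps (the toy DataFrame and the choices of median imputation and standardization are assumptions, not a prescription):

```python
import numpy as np
import pandas as pd

# Toy dataset with the kinds of problems a quality assessment might surface:
# missing values, a duplicate row, and features on very different scales.
df = pd.DataFrame({
    "age": [25, np.nan, 47, 47, 31],
    "income": [40_000, 52_000, 88_000, 88_000, np.nan],
})

# Data Cleaning: drop duplicate rows and impute missing values with the median.
df = df.drop_duplicates()
df = df.fillna(df.median(numeric_only=True))

# Data Transformation: standardize features to zero mean and unit variance.
df_scaled = (df - df.mean()) / df.std()
print(df_scaled)
```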
Overview Model-Based Learning Creates a function F(x) from the given data to predict the output (e.g., Support Vector Machines). Instance-Based Learning Uses the given data itself as the model. If an input is given, it will look through the current…
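A small sketch of the contrast, assuming scikit-learn: an SVM (model-based) learns a decision function from the data, while k-NN (instance-based) keeps the data and searches through it at prediction time:

```python
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Model-based: the SVM fits a decision function F(x), and only that learned
# function is used when predicting.
svm = SVC().fit(X, y)

# Instance-based: k-NN stores the training data itself and, for a new input,
# looks through the stored examples for the most similar ones.
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

sample = X[:1]
print(svm.predict(sample), knn.predict(sample))
```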
“Modernizing” ConvNets “ConvNeXt” is a traditional ConvNet gradually “modernized” to reexamine the design spaces and test the limits of what a pure ConvNet can achieve. The paper does this by modifying the Micro Design of the ConvNet architecture…
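As a rough, simplified sketch of the kind of block the paper arrives at (PyTorch is assumed here; layer scale and stochastic depth from the paper are omitted):

```python
import torch
import torch.nn as nn

class ConvNeXtBlock(nn.Module):
    """Simplified ConvNeXt-style block: depthwise 7x7 conv, LayerNorm,
    inverted-bottleneck MLP with GELU, and a residual connection."""
    def __init__(self, dim):
        super().__init__()
        self.dwconv = nn.Conv2d(dim, dim, kernel_size=7, padding=3, groups=dim)
        self.norm = nn.LayerNorm(dim)
        self.pwconv1 = nn.Linear(dim, 4 * dim)   # expand channels 4x
        self.act = nn.GELU()
        self.pwconv2 = nn.Linear(4 * dim, dim)   # project back down

    def forward(self, x):                        # x: (N, C, H, W)
        residual = x
        x = self.dwconv(x)
        x = x.permute(0, 2, 3, 1)                # to (N, H, W, C) for LayerNorm/Linear
        x = self.pwconv2(self.act(self.pwconv1(self.norm(x))))
        x = x.permute(0, 3, 1, 2)                # back to (N, C, H, W)
        return x + residual

block = ConvNeXtBlock(96)
print(block(torch.randn(1, 96, 56, 56)).shape)  # torch.Size([1, 96, 56, 56])
```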