Category Machine Learning

341. Deterministic vs Stochastic Models

Deterministic Models Produce consistent outcomes for a given input, no matter how many times you recalculate. Deterministic models have the benefit of simplicity, which can make them easier to explain in some cases. Stochastic Models Possess some inherent randomness, which leads…
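The contrast can be shown in a few lines of plain Python; the models and the noise scale here are made up purely for illustration:

```python
import random

def deterministic_model(x):
    # same input always yields the same output
    return 2 * x + 1

def stochastic_model(x, rng):
    # inherent randomness: repeated calls can differ,
    # unless the random source is seeded for reproducibility
    return 2 * x + 1 + rng.gauss(0, 0.1)

print(deterministic_model(3))                 # always 7
print(stochastic_model(3, random.Random(0)))  # reproducible with a fixed seed
print(stochastic_model(3, random.Random()))   # varies from run to run
```

Seeding the random source is how a stochastic model is made reproducible for debugging while keeping its randomness in production.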

339. TorchScript

Running PyTorch without Python TorchScript enables users to load PyTorch models in processes that have no Python dependency. Instead of running in the Python runtime, it converts the model so that it can run in an independent “TorchScript”…
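A minimal sketch of the conversion step, assuming a toy module (the class and its factor are invented for illustration): the scripted model can be serialized and reloaded without the original Python class being importable.

```python
import io
import torch
import torch.nn as nn

class Scale(nn.Module):
    """Toy module: multiply the input by a fixed factor."""
    def __init__(self, factor: float):
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.factor

# compile the module to TorchScript
scripted = torch.jit.script(Scale(2.0))

# serialize, then reload; in a real deployment the .pt file
# could be loaded from C++ via libtorch with no Python at all
buffer = io.BytesIO()
torch.jit.save(scripted, buffer)
buffer.seek(0)
restored = torch.jit.load(buffer)
```

In practice the buffer would be a file path, and the saved archive is what a C++ process loads via `torch::jit::load`.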

335. DAFormer

Objective DAFormer is an architecture proposed to improve domain adaptation for segmentation models. For the encoder, the “Hierarchical Transformer” is used, as it has been shown to be robust to domain shifts. The decoder applies context-aware fusion, which utilizes domain-robust context…

332. TorchServe

Deploying Your Model TorchServe allows you to expose a web API for your PyTorch model that may be accessed directly or via your application. 3 Steps Choose a default handler or author a custom model handler. You will define a…
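The usual flow after the handler is chosen can be sketched as the commands below; the model name, file names, and directory are placeholders, and `image_classifier` is one of TorchServe's built-in default handlers:

```shell
# 1. package the model into a .mar archive (built-in handler shown)
torch-model-archiver --model-name my_model --version 1.0 \
    --serialized-file model.pt --handler image_classifier \
    --export-path model_store

# 2. start the server pointing at the model store
torchserve --start --model-store model_store --models my_model=my_model.mar

# 3. query the inference API (default port 8080)
curl http://127.0.0.1:8080/predictions/my_model -T sample.jpg
```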

331. Basics Of Training Large Models

Two Frameworks There are mainly two large frameworks for training large-scale deep learning models. Data Parallelism This method is usually chosen when your model CAN fit completely into your GPU memory; different batches of data are sent to each GPU. Model…
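The core idea behind data parallelism can be illustrated without any framework: each "device" computes the gradient on its own shard of the batch, and averaging equal-size shard gradients reproduces the full-batch gradient. The toy linear model and all numbers below are invented for illustration:

```python
# toy model: y_hat = w * x, loss = mean squared error over a shard
def shard_gradient(w, xs, ys):
    """Gradient of MSE w.r.t. w on one shard (one 'GPU's' slice of the batch)."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

data_x = [1.0, 2.0, 3.0, 4.0]
data_y = [2.0, 4.0, 6.0, 8.0]   # true relationship: y = 2x
w = 0.0

# split the batch across two equal "devices"
shards = [(data_x[:2], data_y[:2]), (data_x[2:], data_y[2:])]
grads = [shard_gradient(w, xs, ys) for xs, ys in shards]

# gradient synchronization: average the per-shard gradients
avg_grad = sum(grads) / len(grads)
w -= 0.5 * avg_grad  # one SGD step with the synchronized gradient
```

This averaging step is what all-reduce performs across GPUs in real data-parallel training.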

229. Quantization using Pytorch

Quantization Quantization is a technique that changes the data type used to compute a neural network, for faster inference. After you’ve deployed your model, there is no need to backpropagate (which is sensitive to precision). This means that, if a slight decrease in…
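As a minimal sketch (the model and its layer sizes are arbitrary), dynamic quantization converts the weights of the chosen layer types to int8 while quantizing activations on the fly at inference time:

```python
import torch
import torch.nn as nn

# a small float32 model; layer sizes are illustrative
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
model.eval()  # quantization targets inference, not training

# dynamic quantization: Linear weights stored as int8,
# activations quantized dynamically per forward pass
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

out = quantized(torch.randn(1, 8))
```

The quantized model keeps the same interface, so it can be swapped in wherever the float model was used.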

228. Pruning Using Pytorch

Pruning State-of-the-art deep learning techniques rely on over-parameterized models, which makes deployment hard when the destination has limited resources. Pruning is used to study the differences between over-parameterized and under-parameterized networks and to sparsify your neural networks. In PyTorch, you…
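A small sketch with `torch.nn.utils.prune` (the layer size and pruning amount are illustrative): L1-unstructured pruning zeroes the weights with the smallest magnitudes via a mask, which can then be made permanent.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(10, 10)

# zero out the 50% of weights with the smallest L1 magnitude
prune.l1_unstructured(layer, name="weight", amount=0.5)

# pruning is applied through a mask; half of the entries are now zero
sparsity = float((layer.weight == 0).sum()) / layer.weight.nelement()

# remove the mask re-parametrization, keeping the zeroed weights
prune.remove(layer, "weight")
```

After `prune.remove`, `weight` is an ordinary parameter again, with the sparsity baked in.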

223. Contrastive Learning

Contrastive Learning Contrastive learning is a technique used to optimize computer-vision-related tasks by comparing multiple samples against each other to learn the attributes that distinguish each class.
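One classic formulation of this comparison is the margin-based pairwise contrastive loss; the sketch below uses plain Python on embedding vectors, and the margin value is an assumption:

```python
import math

def contrastive_loss(a, b, same_class, margin=1.0):
    """Margin-based pairwise contrastive loss.

    Similar pairs are pulled together (loss = d^2); dissimilar pairs are
    pushed at least `margin` apart (loss = max(0, margin - d)^2).
    """
    d = math.dist(a, b)
    if same_class:
        return d * d
    return max(0.0, margin - d) ** 2

# embeddings 0.5 apart: penalized whether the pair is similar (should be
# closer) or dissimilar (should be farther than the margin)
print(contrastive_loss([0.0, 0.0], [0.5, 0.0], same_class=True))   # 0.25
print(contrastive_loss([0.0, 0.0], [0.5, 0.0], same_class=False))  # 0.25
```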

218. Self-Organizing Maps

SOM A SOM (self-organizing map) is a dimensionality-reduction method using unsupervised learning that generates a discretized representation of the input data as a “map” (typically 2D) consisting of multiple rows and columns.
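The training loop can be sketched in pure Python; the grid size, learning-rate schedule, and neighborhood radius below are all invented for illustration. Each input pulls its best-matching unit (BMU) and nearby grid nodes toward itself, with the neighborhood shrinking over time:

```python
import math
import random

def train_som(data, rows, cols, dim, epochs=20, lr0=0.5, radius0=1.5, seed=0):
    """Fit a rows x cols map of weight vectors to the input data."""
    rng = random.Random(seed)
    grid = [[[rng.random() for _ in range(dim)] for _ in range(cols)]
            for _ in range(rows)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # learning rate decays
        radius = radius0 * (1 - epoch / epochs) + 1e-9   # neighborhood shrinks
        for x in data:
            # best-matching unit: the node whose weights are closest to x
            bi, bj = min(((i, j) for i in range(rows) for j in range(cols)),
                         key=lambda ij: math.dist(grid[ij[0]][ij[1]], x))
            # pull the BMU and its grid neighbors toward the input
            for i in range(rows):
                for j in range(cols):
                    g = math.dist((i, j), (bi, bj))
                    if g <= radius:
                        h = math.exp(-g * g / (2 * radius * radius))
                        for k in range(dim):
                            grid[i][j][k] += lr * h * (x[k] - grid[i][j][k])
    return grid

# toy usage: a 2x2 map fitted to a single 2-D point
som = train_som([[0.2, 0.8]], rows=2, cols=2, dim=2, epochs=30)
```

The grid coordinates (row, column) give the discretized 2-D "map", while each node's weight vector lives in the original input space.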