Kyosuke

64. Energy-Based Models

When training a model, we usually use a cost function to calculate how far the predictions are from the actual results. By replacing that cost function with a function called an energy function, we get what is called an energy-based model. What…
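A minimal sketch of the idea (a toy example I'm adding here, not code from the post): instead of a loss that measures the gap between a prediction and its target, we learn an energy function E(x, y) and train it so that observed pairs get low energy and incorrect pairs get higher energy.

```python
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """E(x, y): a small network scoring how compatible x and y are."""
    def __init__(self, x_dim=4, y_dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + y_dim, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

energy = EnergyNet()
opt = torch.optim.SGD(energy.parameters(), lr=0.1)

x = torch.randn(8, 4)        # inputs
y_good = torch.randn(8, 2)   # observed (correct) outputs
y_bad = torch.randn(8, 2)    # some other (incorrect) outputs

for _ in range(100):
    # margin loss: correct pairs should end up with lower energy than incorrect ones
    loss = torch.relu(1.0 + energy(x, y_good) - energy(x, y_bad)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```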

62. Everyone in Star Mode

Yeah, this wasn’t supposed to happen. I trained an instance segmentation model using NVIDIA’s Transfer Learning Toolkit to detect cars and deployed it on a Jetson. It seems I messed up with either the threshold settings or with…
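To show why the threshold setting matters, here is a tiny sketch with made-up detections (hypothetical values, not the TLT/Jetson pipeline itself): if the confidence threshold is set too low, almost every box survives filtering.

```python
detections = [
    {"label": "car", "confidence": 0.92},
    {"label": "car", "confidence": 0.31},
    {"label": "car", "confidence": 0.07},
]

def filter_detections(dets, threshold):
    # keep only boxes whose confidence is at or above the threshold
    return [d for d in dets if d["confidence"] >= threshold]

print(len(filter_detections(detections, 0.05)))  # 3 -> everything gets drawn
print(len(filter_detections(detections, 0.5)))   # 1 -> only confident detections
```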

61. Unsupervised Pre-Training

Unsupervised pre-training is a method to initialize the weights of the hidden layers using unsupervised learning. The procedure for doing this is GREEDY LAYER-WISE PRETRAINING. In this procedure, the machine learns the weights in an unsupervised manner 1…
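A minimal sketch of greedy layer-wise pretraining with autoencoders (toy sizes and random data assumed, since the post's exact setup isn't shown): each hidden layer is trained to reconstruct the output of the layer below it, one layer at a time, before any supervised fine-tuning.

```python
import torch
import torch.nn as nn

layer_sizes = [784, 256, 64]             # input dim followed by two hidden layers
data = torch.rand(128, layer_sizes[0])   # unlabeled data (random stand-in)

pretrained_layers = []
inputs = data
for in_dim, out_dim in zip(layer_sizes[:-1], layer_sizes[1:]):
    encoder = nn.Linear(in_dim, out_dim)
    decoder = nn.Linear(out_dim, in_dim)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    for _ in range(200):
        # reconstruct this layer's input from its own encoding
        recon = decoder(torch.relu(encoder(inputs)))
        loss = nn.functional.mse_loss(recon, inputs)
        opt.zero_grad()
        loss.backward()
        opt.step()
    pretrained_layers.append(encoder)
    # the next layer is trained on this layer's (detached) representation
    inputs = torch.relu(encoder(inputs)).detach()

# the pretrained encoders now initialize the hidden layers of the final network
model = nn.Sequential(*[nn.Sequential(layer, nn.ReLU()) for layer in pretrained_layers])
```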

60. Invoking Actions After Detection

Just being able to detect things won’t be useful on its own, so I created a function to invoke some kind of action under a certain condition. Thanks to Edge Electronics’s video, I was able to understand how to structure actions after detection. The…
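A minimal sketch of the pattern (hypothetical names, not the code from the post or the Edge Electronics video): wrap the detection results in a check so that an action only fires when the condition is met.

```python
def send_alert(count):
    # placeholder action; in practice this could toggle a GPIO pin, log an event, etc.
    print(f"Alert: {count} cars detected")

def handle_detections(detections, target_label="car", min_count=3, action=send_alert):
    # count confident detections of the target class and act only if there are enough
    matches = [d for d in detections if d["label"] == target_label and d["confidence"] > 0.5]
    if len(matches) >= min_count:
        action(len(matches))

handle_detections([
    {"label": "car", "confidence": 0.9},
    {"label": "car", "confidence": 0.8},
    {"label": "car", "confidence": 0.7},
])
```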

59. Categorizing AI

I’ve noticed I was slightly misunderstanding the categorization of AI while reading DEEP LEARNING by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. First of all, like the drawing above, AI is making a machine do a task that would be…

58. Word Embedding

Let’s say we have 300 genres; that would mean each column of the table is a 300×1-dimensional vector representing the nuance of that word. Vectorizing this nuance is called word embedding. By making the machine learn this nuance (the table above),…
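A minimal sketch (toy vocabulary and dimensions assumed): an embedding is just a learned lookup table that maps each word to a dense vector, 300-dimensional in the example above.

```python
import torch
import torch.nn as nn

vocab = {"king": 0, "queen": 1, "car": 2}
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=300)

# looking up words returns their 300-dimensional vectors
idx = torch.tensor([vocab["king"], vocab["queen"]])
vecs = embedding(idx)
print(vecs.shape)  # torch.Size([2, 300])

# during training these vectors are adjusted so that words used in similar
# contexts end up close together, which is what captures the "nuance"
sim = torch.cosine_similarity(vecs[0], vecs[1], dim=0)
print(sim.item())
```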

57. The Almanack of NAVAL RAVIKANT

I just finished The Almanack of Naval Ravikant by ERIC JORGENSON. This is my 9th book this year and so far it’s my favorite! I was excited the whole time and actually finished the book in a day! The principles…