58. Word Embedding

Okay, machines can understand human language with natural language processing, but how?
One way to represent words with numbers is the one-hot representation (like the image below). What sucks about this is that it can’t capture the relationships between words. We HUMANS instantly understand that CAT and DOG are both pets, but machines can’t. How do we create that nuance?
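Here is a minimal sketch of one-hot representation over a tiny, made-up vocabulary. It also shows the core weakness: any two different one-hot vectors have a dot product of 0, so "cat" is exactly as unrelated to "dog" as it is to "apple".

```python
# Hypothetical tiny vocabulary for illustration.
vocab = ["cat", "dog", "apple", "king"]

def one_hot(word):
    # A vector of zeros with a single 1 at the word's index.
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1
    return vec

print(one_hot("cat"))   # [1, 0, 0, 0]
print(one_hot("dog"))   # [0, 1, 0, 0]

# Distinct one-hot vectors never share a nonzero position,
# so their dot product is always 0: no similarity is encoded.
dot = sum(a * b for a, b in zip(one_hot("cat"), one_hot("dog")))
print(dot)              # 0
```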
We add more dimensions (like the right side of the image below), each representing a genre: Gender, Royal, Age, etc. This lets us express how strongly each genre relates to each word. If the relationship is strong, the value will be close to 1 or -1. If not, the value will be around 0.

Let’s say we have 300 genres. Each column of the table would then be a 300-dimensional vector representing the nuance of that word. Vectorizing nuance like this is called word embedding.
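To make this concrete, here is a hand-made toy version of such a table. The genre axes and values are purely illustrative (not learned from data), but they show the payoff: with cosine similarity, CAT and DOG come out far more similar to each other than either is to KING.

```python
import math

# Hypothetical embedding table over made-up "genre" axes:
# (gender, royal, age, pet). Values are illustrative, not learned.
embeddings = {
    "king":  [-0.95, 0.93, 0.70, 0.02],
    "queen": [ 0.97, 0.95, 0.69, 0.01],
    "cat":   [ 0.00, 0.01, 0.03, 0.95],
    "dog":   [ 0.00, 0.02, 0.05, 0.93],
}

def cosine(u, v):
    # Cosine similarity: 1 means same direction, 0 means unrelated.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# cat vs dog: both score high on the "pet" axis, so similarity is high.
print(cosine(embeddings["cat"], embeddings["dog"]))
# cat vs king: almost no overlap across genres, so similarity is low.
print(cosine(embeddings["cat"], embeddings["king"]))
```

In a real system the table would have hundreds of dimensions and the values would be learned, but the similarity computation works exactly the same way.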
By making the machine learn this nuance (the table above), we can make it understand human language.