I am trying to understand the mechanism of Word2Vec for word embeddings. I have gone through the following links:
But I still don't have a clear idea of what is behind Word2Vec or how it works. So far I understand the following steps:
- It takes sentences and splits them into words.
- It builds a vocabulary of all those words.
- But when I do `model['word in vocabulary']`, it gives me a numerical vector.
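To show concretely what I mean by the last step, here is a toy sketch of the data shape I am seeing (the sentences, vocabulary, and vector size are made up, and the numbers are random here, whereas Word2Vec learns them from word co-occurrence during training):

```python
import random

# Toy sentences already split into words
sentences = [["the", "cat", "sat"], ["the", "dog", "ran"]]
vector_size = 5  # arbitrary choice for this sketch

# Build the vocabulary from all words
vocab = sorted({w for s in sentences for w in s})

# Assign each word a dense vector. Real Word2Vec *learns* these
# numbers; random values here just illustrate the lookup structure.
random.seed(0)
model = {w: [random.uniform(-1, 1) for _ in range(vector_size)]
         for w in vocab}

print(model["cat"])       # a list of 5 floats
print(len(model["cat"]))  # 5
```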
1) What is this vector representation?
2) What does each numerical value represent?
I am confused about this.
If anyone has a tutorial or a link that would help me understand this better, it would be very helpful.
Thank you so much in advance.