Link to article:
I have doubts about the question below and the answer given for it.
17) Which of the following statements is (are) true for the Word2Vec model?
A) The architecture of word2vec consists of only two layers – continuous bag of words and skip-gram model
B) Continuous bag of word is a shallow neural network model
C) Skip-gram is a deep neural network model
D) Both CBOW and Skip-gram are deep neural network models
E) All of the above
Given answer: Word2vec contains the Continuous Bag of Words and Skip-gram models, which are deep neural nets.
– I think the neural-net architectures used in Word2vec are always shallow ones (a single linear projection layer, with no nonlinear hidden layers), regardless of whether the Skip-gram or CBOW variant is used.
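To make the "shallow" point concrete, here is a minimal sketch of a CBOW forward pass in NumPy (an illustration, not Mikolov's actual training code; the vocabulary size, embedding dimension, and weight initialization are made up for the example). The whole network is just two weight matrices with an averaging projection in between, and no nonlinear hidden layer anywhere:

```python
import numpy as np

# Hypothetical sizes for illustration: vocabulary V = 10, embedding dim N = 4.
V, N = 10, 4
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, N))   # input-side (projection) weights
W_out = rng.normal(scale=0.1, size=(N, V))  # output-side weights

def cbow_forward(context_ids):
    # Projection layer: average the context word embeddings.
    # Note: no activation function here -- this is the entire "hidden" part.
    h = W_in[context_ids].mean(axis=0)   # shape (N,)
    scores = h @ W_out                   # shape (V,)
    e = np.exp(scores - scores.max())    # softmax over the vocabulary
    return e / e.sum()

# Predict the center word's distribution from four context word ids.
probs = cbow_forward([1, 3, 5, 7])
print(probs.shape)   # (10,)
```

The Skip-gram variant just reverses the direction (predicting context words from the center word) and is equally shallow, which is the basis of my doubt about the given answer.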