NLP technique to avoid changing the number of features

bagofwords
one-hot-encoding
nlp

#1

Hello!

Is there an NLP technique that avoids changing the number of features when new training data arrives?

I mean, if I use bag-of-words or one-hot encoding I can build a fully working model, but with new training data the number of columns will probably change, so I have to retrain on all the training data, not just the new data. It's not a big problem, but I'd like to know if there's a way to avoid it. A minimal sketch of what I mean, using scikit-learn's CountVectorizer (the example sentences are made up):
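
```python
# Minimal sketch of the problem with a bag-of-words representation.
# The example documents are made up for illustration.
from sklearn.feature_extraction.text import CountVectorizer

old_docs = ["the cat sat on the mat", "dogs chase cats"]
new_docs = ["a parrot repeats new words"]  # introduces unseen vocabulary

vec = CountVectorizer()
X_old = vec.fit_transform(old_docs)
print(X_old.shape)  # (2, n_features) — n_features = size of the old vocabulary

# Refitting on old + new data grows the vocabulary, so the feature
# matrix gains columns and the model has to be retrained from scratch.
vec_all = CountVectorizer()
X_all = vec_all.fit_transform(old_docs + new_docs)
print(X_all.shape)  # more columns than X_old
```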

Thanks, everyone!