Decision tree vs. Naive Bayes classifier

naive_bayes
decision_trees

#1

In which cases is it better to use a Decision Tree, and in which cases a Naive Bayes classifier?


#2

Hi,

Let’s look at the advantages of using Decision Trees and Naive Bayes:

Decision Trees: They are easy to understand and explain. They have several useful properties that take care of common issues such as missing values, outliers, and identifying the most significant features. They also handle feature interactions naturally and are non-parametric. Their major disadvantage is over-fitting, but that is where ensemble methods like random forests (or boosted trees) come in. Another one: they do not work as well with continuous target variables as they do with categorical ones.
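
As a minimal sketch of the “easy to explain” point (assuming scikit-learn; the bundled iris dataset is used here purely for illustration), you can fit a tree and print its learned rules:

```python
# Minimal sketch: fitting and inspecting a decision tree with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_depth limits tree growth, the simplest guard against over-fitting
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
# The learned rules are human-readable, which is what makes trees easy to explain
print(export_text(clf, feature_names=load_iris().feature_names))
```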

Naive Bayes: It is a supervised learning algorithm that assumes an underlying probabilistic model based on Bayes’ theorem. It is mainly used when there are many classes to predict, for example text classification, spam filtering, and recommendation systems.
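
For reference, the “naive” part is the assumption that the features are conditionally independent given the class, so the prediction rule reduces to:

$$
P(y \mid x_1, \dots, x_n) \;\propto\; P(y) \prod_{i=1}^{n} P(x_i \mid y),
\qquad
\hat{y} = \arg\max_y \, P(y) \prod_{i=1}^{n} P(x_i \mid y)
$$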

Advantages:

  • It is easy to implement
  • It requires only a small amount of data to train the model
  • It gives good results in most cases

Disadvantages:

  • Its assumption of class-conditional independence: if the features are actually correlated, accuracy suffers
  • If a class label and a certain attribute value never occur together in the training data, the frequency-based probability estimate will be zero (the zero-frequency problem; see the smoothing sketch after this list)
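
Here is a minimal sketch of how the zero-frequency problem is usually handled in practice (assuming scikit-learn; the tiny spam/ham corpus below is made up purely for illustration). `alpha=1.0` is Laplace (add-one) smoothing, which keeps unseen word/class combinations from getting a zero probability:

```python
# Minimal sketch: multinomial Naive Bayes text classifier with Laplace smoothing.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["win a free prize now", "meeting at noon tomorrow",
         "free offer, claim your prize", "project update and meeting notes"]
labels = ["spam", "ham", "spam", "ham"]

# alpha=1.0 adds one pseudo-count per word/class pair, so a word never seen
# with a class still gets a small non-zero probability instead of zero.
model = make_pipeline(CountVectorizer(), MultinomialNB(alpha=1.0))
model.fit(texts, labels)

print(model.predict(["free prize meeting"]))
```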

#3

Nice answer!