How does a decision tree classifier split on a continuous variable?

machine_learning

#1
  1. I want to predict a binary categorical outcome (play cricket or not).
  2. I have 3 features: gender, class, and age of students (2 categorical & 1 continuous).
  3. When deciding the root node: if I had only gender and class, I would find the best split using entropy or Gini.
    However, now I also have the age variable, ranging from 10 to 20.
    So, will my classifier build several candidate splits (<11 & >=11, <12 & >=12, … <19 & >=19), calculate the entropy of every split, and then compare the best one against the entropy of the gender and class variables?

#2

Hi @sahil1995chaturvedi

That is correct. The decision tree classifier sorts the values of the continuous feature, tries a candidate threshold between each pair of adjacent values, computes the impurity (entropy or Gini) of each resulting split, and keeps the threshold with the best score. That best split on age is then compared against the splits on gender and class. Have a look at this discussion thread:
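To make that concrete, here is a minimal sketch of the threshold search in plain Python. The `entropy` and `best_threshold` helpers and the toy ages/labels data are all made up for illustration; real libraries (e.g. scikit-learn) do the same search far more efficiently, but the idea is identical: try the midpoint between each pair of adjacent sorted values and keep the one with the highest information gain.

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        ent -= p * math.log2(p)
    return ent

def best_threshold(values, labels):
    """Evaluate a candidate threshold at the midpoint between each pair
    of consecutive sorted unique values; return (threshold, gain) of the
    split with the highest information gain."""
    parent = entropy(labels)
    n = len(labels)
    uniq = sorted(set(values))
    best_t, best_gain = None, -1.0
    for lo, hi in zip(uniq, uniq[1:]):
        t = (lo + hi) / 2  # candidate threshold (midpoint)
        left = [y for x, y in zip(values, labels) if x < t]
        right = [y for x, y in zip(values, labels) if x >= t]
        gain = parent - (len(left) / n * entropy(left)
                         + len(right) / n * entropy(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Toy data (hypothetical): ages 10-20, 1 = plays cricket, 0 = does not
ages  = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]
plays = [ 0,  0,  0,  0,  1,  1,  1,  1,  1,  1,  1]
t, gain = best_threshold(ages, plays)
print(t, round(gain, 3))  # → 13.5 0.946 (a perfect split here)
```

The winning threshold's gain would then be compared against the gain of splitting on gender or class, and the overall best feature becomes the root node.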


#3

Thanks a lot. :slight_smile: