How is the Gini index different from entropy for splitting a node?

decision_trees

#1

I am studying decision trees and the parameters that help in splitting a node. While studying, I found two such parameters: one is entropy and the other is the Gini index.

Gini index - a measure of total variance across the classes.

Entropy - a measure of disorder (impurity).

I want to know the difference between them and which of them is used for splitting a decision tree.


#2

Hi Sid,

This link is the best explanation for your query:

http://www.saedsayad.com/decision_tree.htm

I hope the above link clarifies your doubt.

Thanks
Raghavendra


#3

Hello @sid100158,

Split criterion:

Information gain
- All attributes are assumed to be categorical (ID3).
- May be modified for continuous-valued attributes (C4.5).

Gini index (CART, IBM IntelligentMiner)
- All attributes are assumed to be continuous-valued.
- Assumes there exist several possible split values for each attribute.
- May be modified for categorical attributes.
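
To make the difference concrete, here is a minimal NumPy sketch (my own illustration, not taken from any particular library) that scores a candidate split with both criteria. Both measure node impurity: Gini is 1 - sum(p_i^2) and entropy is -sum(p_i * log2(p_i)) over the class proportions p_i, so they usually lead to very similar splits, with Gini being slightly cheaper to compute because it avoids the logarithm. The function names gini, entropy and impurity_decrease are just illustrative.

```python
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum(p_i^2) over the class proportions p_i.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    # Entropy: -sum(p_i * log2(p_i)) over the class proportions p_i.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def impurity_decrease(parent, left, right, criterion):
    # Weighted drop in impurity achieved by splitting parent into left/right.
    # With entropy as the criterion this is exactly the information gain.
    n, n_l, n_r = len(parent), len(left), len(right)
    return criterion(parent) - (n_l / n) * criterion(left) - (n_r / n) * criterion(right)

# A candidate split of 10 labels into two pure child nodes.
parent = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
left, right = parent[:4], parent[4:]

print("Gini decrease:   ", impurity_decrease(parent, left, right, gini))     # 0.48
print("Information gain:", impurity_decrease(parent, left, right, entropy))  # ~0.971
```

If you use scikit-learn, DecisionTreeClassifier lets you switch between the two with criterion='gini' (the default) or criterion='entropy'.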

Hope this helps!!