Calculate Entropy for Decision Tree



Hi Guys,
I have a small confusion about calculating entropy for a decision tree.
The sample example has Temperature as one attribute (with values Hot, Mild and Cold) and Play Golf (Yes/No) as the target variable.

Now the “Hot” temperature value has two entries for “No” and zero entries for “Yes”.

Then, in the entropy formula, the term (0/2) log(0/2) appears. Is this zero or infinity?
Will the entropy be zero or infinity? Also, will the information gain be maximum or minimum?

Please help resolve my doubt.


Hey @karthik_Van
Yes, the entropy is zero: by convention the term 0 · log(0) is taken as 0 (the limit of p · log p as p → 0 is 0), so a homogeneous sample like the “Hot” branch (all “No”) has entropy 0. On information gain: it is not simply 1 − entropy. Information gain is the parent node's entropy minus the weighted average entropy of the child nodes, so a pure child contributes 0 to the weighted sum and pushes the gain toward its maximum. The gain equals the full parent entropy only when every child is pure. For more information about entropy and information gain you can refer to
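A small sketch may make this concrete. The entropy function below uses the 0 · log(0) = 0 convention, and the gain calculation follows the parent-minus-weighted-children definition. The class counts for the parent node and for the Mild and Cold branches are hypothetical, made up just for illustration; only the Hot branch counts (2 “No”, 0 “Yes”) come from the question.

```python
from math import log2

def entropy(counts):
    """Entropy of a class distribution given as raw counts.
    Uses the convention 0 * log2(0) = 0, so a pure node has entropy 0."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c == 0:          # skip: 0 * log2(0) is taken as 0 by convention
            continue
        p = c / total
        h -= p * log2(p)
    return h

# "Hot" branch from the question: 2 "No", 0 "Yes" -> homogeneous, entropy 0
print(entropy([2, 0]))      # 0.0

# Information gain = entropy(parent) - weighted average entropy(children).
# Parent and Mild/Cold counts below are hypothetical, for illustration only.
parent = [5, 4]                          # overall: 5 "No", 4 "Yes"
children = [[2, 0], [2, 2], [1, 2]]      # Hot, Mild, Cold
n = sum(parent)
gain = entropy(parent) - sum(sum(c) / n * entropy(c) for c in children)
print(round(gain, 4))
```

Note that the pure Hot branch contributes 0 to the weighted child entropy, which is exactly why homogeneous splits increase the information gain.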