Calculate Entropy for Decision Tree

machine_learning
data_science

#1

Hi Guys,
I have a small confusion about calculating entropy for a decision tree.
The sample example has Temperature as one attribute (with values Hot, Mild, and Cold) and the target variable Play Golf (Yes/No).

Now the “Hot” temperature has two entries for “No” and zero entries for “Yes”.

Then in the entropy formula, is the term (0/2) log(0/2) zero or infinity?
Will the entropy be zero or infinity? Also, will the information gain be maximum or minimum?

Please help me clear this doubt.


#2

Hey @karthik_Van
Yes, as the sample is homogeneous (all the “Hot” entries are “No”), the entropy is zero. By convention the 0 log(0) term is treated as 0, since x log(x) → 0 as x → 0, so the entropy never becomes infinite. Information gain is the parent node's entropy minus the weighted average entropy of the child nodes, so a pure (zero-entropy) subset like this one contributes the maximum possible gain for that branch. For more information about entropy and information gain you can refer to https://www.analyticsvidhya.com/blog/2016/04/complete-tutorial-tree-based-modeling-scratch-in-python/
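
If it helps, here is a minimal Python sketch (not taken from the tutorial above; the function names and the toy counts are just for illustration) showing that the zero-count term is simply skipped, which is the same as treating 0 · log2(0) as 0, and how information gain follows from the parent and child entropies:

```python
from math import log2

def entropy(counts):
    """Entropy of a list of class counts, treating 0 * log2(0) as 0 by convention."""
    total = sum(counts)
    ent = 0.0
    for c in counts:
        if c > 0:  # skip zero counts: lim x->0+ of x*log2(x) is 0
            p = c / total
            ent -= p * log2(p)
    return ent

def information_gain(parent_counts, child_counts_list):
    """Parent entropy minus the weighted average entropy of the children."""
    total = sum(parent_counts)
    weighted_child = sum(
        sum(child) / total * entropy(child) for child in child_counts_list
    )
    return entropy(parent_counts) - weighted_child

# The "Hot" branch from the question: 0 "Yes" and 2 "No" -> pure node, entropy 0
print(entropy([0, 2]))  # 0.0, not infinity

# Hypothetical split on Temperature (counts are illustrative, not the real dataset)
parent = [4, 4]                       # e.g. 4 Yes, 4 No overall
children = [[0, 2], [2, 1], [2, 1]]   # Hot, Mild, Cold branches
print(information_gain(parent, children))
```

Running this, entropy([0, 2]) prints 0.0 rather than infinity, and the gain for the illustrative split comes out to roughly 0.31 (1.0 for the parent minus about 0.69 of weighted child entropy).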