In machine learning we will use mathematical formulas to give precise definitions for some methods.
The following is a reminder of the concept of "log".
"Log" is the exponent you put on - It's that simple.
Let's look at some examples (log base 2): log2(8) = 3 because 2^3 = 8, and log2(1) = 0 because 2^0 = 1.
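The examples above can be checked in a couple of lines with Python's standard library (the specific numbers here are just illustrations):

```python
import math

# log2 is "the exponent you put on 2":
print(math.log2(8))     # 3.0, because 2**3 == 8
print(math.log2(1))     # 0.0, because 2**0 == 1
print(math.log2(0.25))  # -2.0, because 2**-2 == 0.25
```

Note the last line: the log of a number between 0 and 1 comes out negative, which matters for the entropy formula below.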
For example, in a decision tree we use the entropy calculation:

Entropy = -sum over i of ( p_i * log2(p_i) )

where p_i is the proportion of the data labeled with class i. Each p_i is a fraction between 0 and 1. For two classes, entropy is a value between 0 and 1: 0 means no entropy, i.e. all the data belongs to the same class.
Note that the log of a fraction is negative: since each p_i is between 0 and 1, log2(p_i) is less than or equal to 0. That is why the entropy formula carries a minus sign - it makes the result come out non-negative.
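The entropy calculation described above can be sketched as a short Python function (the function name and the example class proportions are mine, not from the post):

```python
import math

def entropy(proportions):
    """Entropy in bits: -sum(p * log2(p)) over the class proportions.

    Terms with p == 0 are skipped, since the limit of p*log2(p) as
    p -> 0 is 0.
    """
    return sum(-p * math.log2(p) for p in proportions if p > 0)

# A pure node (all data in one class) has no entropy:
print(entropy([1.0]))       # 0.0
# A 50/50 split of two classes has the maximum entropy of 1 bit:
print(entropy([0.5, 0.5]))  # 1.0
```

With two classes the value stays between 0 and 1, exactly as stated above; with more classes the maximum grows (e.g. four equally likely classes give 2 bits).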
