entropy.matrix                R Documentation

entropy

Description

The entropy() function calculates the Entropy of given probability distributions.

Usage

## S3 method for class 'matrix'
entropy(pk, dim = 0L, base = -1, ...)

## S3 method for class 'matrix'
relative.entropy(pk, qk, dim = 0L, base = -1, ...)

## S3 method for class 'matrix'
cross.entropy(pk, qk, dim = 0L, base = -1, ...)

entropy(...)

relative.entropy(...)

cross.entropy(...)
Arguments

pk	A \(n \times k\) matrix of probabilities.

qk	A \(n \times k\) matrix of probabilities.

dim	An integer specifying the dimension along which the
	measure is calculated: 0 for a single total value, 1 for
	row-wise values, 2 for column-wise values. Defaults to 0L.

base	A numeric specifying the logarithm base; the default
	(-1) uses the natural logarithm.

…	Arguments passed into other methods.
Value

A <numeric> value or vector:

- A single <numeric> value (length 1) if dim == 0.
- A <numeric> vector with length equal to the number of rows if dim == 1.
- A <numeric> vector with length equal to the number of columns if dim == 2.
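The dim behavior described above can be sketched with base-R equivalents. This is a sketch only: it assumes the natural-logarithm default and that each call reduces \(-pk \log(pk)\) over the stated dimension; entropy() itself is provided by the package this page documents.

```r
# A 2 x 2 probability matrix: each row is one distribution
pk <- matrix(c(1/2, 1/2, 1/5, 4/5), nrow = 2, byrow = TRUE)

# dim == 0: a single total value over all entries
h_total <- -sum(pk * log(pk))

# dim == 1: one value per row (length equals nrow(pk))
h_rows <- -rowSums(pk * log(pk))

# dim == 2: one value per column (length equals ncol(pk))
h_cols <- -colSums(pk * log(pk))
```

Note that the row-wise and column-wise values each sum to the total, so dim only controls how the summation is grouped.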
Calculation
Entropy:
\[H(pk) = -\sum_{i} pk_i \log(pk_i)\]
Cross Entropy:
\[H(pk, qk) = -\sum_{i} pk_i \log(qk_i)\]
Relative Entropy:
\[D_{KL}(pk \parallel qk) = \sum_{i} pk_i \log\left(\frac{pk_i}{qk_i}\right)\]
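As a worked check with the two-outcome distributions used in the Examples below, \(pk = (1/2, 1/2)\) and \(qk = (9/10, 1/10)\), using natural logarithms:

\[H(pk) = -\left(\tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{2}\log\tfrac{1}{2}\right) = \log 2 \approx 0.6931\]

\[H(pk, qk) = -\left(\tfrac{1}{2}\log\tfrac{9}{10} + \tfrac{1}{2}\log\tfrac{1}{10}\right) \approx 1.2040\]

\[D_{KL}(pk \parallel qk) = H(pk, qk) - H(pk) \approx 0.5108\]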
Examples
# 1) Define actual
# and observed probabilities

# 1.1) actual probabilities
pk <- matrix(
  cbind(1/2, 1/2),
  ncol = 2
)

# 1.2) observed (estimated) probabilities
qk <- matrix(
  cbind(9/10, 1/10),
  ncol = 2
)

# 2) calculate Entropy
cat(
  "Entropy", entropy(
    pk
  ),
  "Relative Entropy", relative.entropy(
    pk,
    qk
  ),
  "Cross Entropy", cross.entropy(
    pk,
    qk
  ),
  sep = "\n"
)