I am having a tough time understanding the difference between Shannon's source entropy, Kolmogorov-Sinai (KS) entropy, and topological entropy. The document I am following is:
http://www.maths.manchester.ac.uk/~cwalkden/magic/lecture08.pdf
Definitions say that the Kolmogorov-Sinai entropy, also known as metric entropy, is the supremum of Shannon's entropy:
- $h_{KS} = \sup \frac{1}{M}H$, where $M$ denotes the number of unique symbols and $H$ is the Shannon entropy.
Shannon's entropy is given by
\begin{align} H = -\sum_{k=1}^{M}\mathsf{p}_k \ln(\mathsf{p}_k) \end{align}
which is the entropy of a source that emits $M$ distinct symbols, where $\mathsf{p}_k$ is the probability of finding the $k$-th symbol. The source entropy of a dynamical system is given as
\begin{align} H_\text{source} = \lim_{M \rightarrow \infty} \frac{1}{M} H. \end{align}
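My tentative reading of this limit (this is my own guess, not something stated in the notes): treat $M$ as the length of a block of emitted symbols rather than as the alphabet size. For a toy source that emits independent symbols, each 0 or 1 with probability $1/2$, a block of $M$ symbols has $2^M$ equally likely outcomes, so
\begin{align} H = -\sum_{k=1}^{2^M} \frac{1}{2^M}\ln\!\left(\frac{1}{2^M}\right) = M\ln 2, \qquad \frac{1}{M}H = \ln 2, \end{align}
i.e. $1$ bit per symbol regardless of $M$. But I am not sure this is what the $M \rightarrow \infty$ limit is actually meant to capture.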
What is the meaning of the limit term? Are source entropy and Shannon's entropy the same thing? If the sender sends 10 compound symbols at a time and each symbol can be either 0 or 1, then what is the entropy? As an example of how I calculate Shannon's entropy, consider an array A representing a message:

A = [1 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 0 1 0 1];
N = length(A);
p_1 = sum(A==1)/N;    % relative frequency of symbol 1
p_0 = sum(A==0)/N;    % relative frequency of symbol 0
ShannonH = -p_1*log2(p_1) - p_0*log2(p_0)    % Shannon entropy in bits
In this example $M=2$. The code outputs ShannonH = 0.9710 for this example. So, based on the formula above, the KS entropy would be the supremum of ShannonH/2 = 0.9710/2 = 0.4855.
Is this how KS entropy is calculated? I cannot understand what the supremum of the Shannon entropy is and how to calculate the KS entropy.
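To make my question concrete, here is a sketch of what I would try instead, under my own assumption that the quantity inside the supremum/limit is the block entropy per symbol, estimated from overlapping length-$M$ blocks of the sequence (this code is my guess, not something taken from the lecture notes):

A = [1 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 0 1 0 1];
N = length(A);
for M = 1:4
    % collect all overlapping blocks of length M
    blocks = zeros(N-M+1, M);
    for i = 1:N-M+1
        blocks(i,:) = A(i:i+M-1);
    end
    % estimate block probabilities from their relative frequencies
    [~, ~, idx] = unique(blocks, 'rows');
    p = accumarray(idx, 1)/numel(idx);
    H_M = -sum(p.*log2(p));          % block entropy in bits
    fprintf('M = %d, H_M/M = %.4f\n', M, H_M/M);
end

Is something like H_M/M the quantity whose limit/supremum is meant, or am I mixing up the block length with the number of unique symbols?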
- What is Topological entropy?
An explanatory example would be really helpful. Please correct me if anything is wrong or misleading. Thank you.