Consider two discrete, jointly distributed random variables $X$ and $Y$ with values $x_i$ and $y_j$, respectively. The joint probability of $X$ having the value $x_i$ and $Y$ having the value $y_j$ is $p_{ij}$.
Show that if $X$ and $Y$ are statistically independent, then the Shannon entropy of the joint distribution is the sum of the Shannon entropies of the reduced distributions of $X$ and $Y$: $H(X,Y) = H(X) + H(Y)$.
Generalize the derivation to the case of $N$ jointly distributed variables $X_1, \dots, X_N$ with $p_{i_1 \dots i_N} = p^{(1)}_{i_1} \cdots p^{(N)}_{i_N}$.
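As a sanity check (not part of the problem statement), the additivity can be verified numerically for any pair of independent distributions; the marginals in the sketch below are arbitrary example choices.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p log p (natural log), ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# arbitrary example marginals for X and Y
p_x = np.array([0.2, 0.5, 0.3])
p_y = np.array([0.1, 0.4, 0.4, 0.1])

# statistical independence: p_ij = p_i^(X) * p_j^(Y)
p_xy = np.outer(p_x, p_y)

print(shannon_entropy(p_xy))                        # joint entropy H(X, Y)
print(shannon_entropy(p_x) + shannon_entropy(p_y))  # H(X) + H(Y): same value
```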
Your printer malfunctions and prints random words, each consisting of $N$ lower-case (English) letters. Find the Shannon entropy for this distribution of words
if all 26 letters of the alphabet occur with equal probability at each of the $N$ letters (positions) in the word;
if every second letter in the word is one of the vowels (a, e, i, o, u), while every other letter in the word is not a vowel (within each position, all allowed letters again occur with equal probability).
Hint: Use the result of problem 1, generalized to $N$ random variables.
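A short numerical sketch of both cases, assuming an illustrative word length of 8 letters (the problem keeps the length as a parameter $N$); by the independence of the positions, the word entropy is the sum of the per-position entropies.

```python
import numpy as np

N = 8  # illustrative word length; the problem leaves it as a parameter

# (a) every position: any of the 26 letters with equal probability
H_a = N * np.log(26)

# (b) alternating positions: 5 equally likely vowels / 21 equally likely consonants
# (assumes the word starts with a vowel position and N is even; adjust the counts otherwise)
H_b = (N // 2) * np.log(5) + (N - N // 2) * np.log(21)

print(H_a, H_b)  # entropies in nats; divide by np.log(2) for bits
```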
Consider the entropy of a discrete probability distribution given in terms of the probabilities $p_i$ ($i = 1, \dots, M$). Determine which $p_i$ lead to the maximum entropy under the following constraints (Hint: use Lagrange multipliers to enforce the constraints):
Normalization $\sum_{i=1}^{M} p_i = 1$;
Normalization and fixed average $\langle x \rangle = \sum_{i=1}^{M} p_i x_i$ of a quantity $x$ with values $x_i$.
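The Lagrange-multiplier result can be cross-checked numerically: maximizing the entropy under both constraints should yield a distribution of exponential (Boltzmann-like) form $p_i \propto e^{-\lambda x_i}$, and the uniform distribution when only normalization is imposed. The values $x_i$ and the target average in the sketch below are arbitrary illustrations.

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([0.0, 1.0, 2.0, 3.0])   # example values x_i (arbitrary)
avg = 1.2                            # example fixed average <x> (arbitrary)

def neg_entropy(p):
    """Negative Shannon entropy, to be minimized."""
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},     # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, x) - avg},  # fixed average
]
res = minimize(neg_entropy, x0=np.full(len(x), 1 / len(x)),
               bounds=[(0.0, 1.0)] * len(x), constraints=constraints,
               method="SLSQP")

p = res.x
# For a maximum-entropy solution of the form p_i ∝ exp(-lam * x_i),
# log p_i is linear in x_i; the fitted slope equals -lam.
slope = np.polyfit(x, np.log(p), 1)[0]
print(p, slope)
```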
A system consists of $N$ non-interacting, distinguishable three-level atoms. Each atom can be in one of three states with energies $\varepsilon_1 = \varepsilon_2 = 0$ and $\varepsilon_3 = \varepsilon > 0$. Assume $N \gg 1$.
Work out the number of available states $\Omega(M)$ at fixed $M$ (number of atoms in level 3).
Calculate the entropy as a function of the total energy $E$ (use Stirling’s formula).
Calculate the temperature.
Express the energy as a function of temperature.
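A numerical sketch of the full chain of steps, under the level scheme assumed above ($\varepsilon_1 = \varepsilon_2 = 0$, $\varepsilon_3 = \varepsilon$): it builds the state count $\Omega(M) = \binom{N}{M}\,2^{N-M}$, obtains $S(E)$, extracts the temperature from $1/T = \partial S/\partial E$ by finite differences, and compares with the closed form $E(T) = N\varepsilon / \bigl(1 + 2\,e^{\varepsilon/k_B T}\bigr)$ that follows analytically for this level scheme.

```python
import numpy as np
from scipy.special import gammaln

N = 2000      # number of atoms (illustrative)
eps = 1.0     # level spacing epsilon (illustrative units)
k_B = 1.0     # Boltzmann constant set to 1

def ln_omega(M):
    # Omega(M) = C(N, M) * 2**(N - M): choose the M atoms in level 3,
    # each remaining atom occupies one of the two degenerate zero-energy levels
    return (gammaln(N + 1) - gammaln(M + 1) - gammaln(N - M + 1)
            + (N - M) * np.log(2))

M = np.arange(10, N - 10)       # avoid the edges, where finite differences are poor
E = M * eps                     # total energy E = M * eps
S = k_B * ln_omega(M)           # entropy from the exact state count

beta = np.gradient(S, E) / k_B  # 1/(k_B T) = (1/k_B) dS/dE
# closed-form result for this level scheme: E(T) = N*eps / (1 + 2*exp(eps/(k_B*T)))
E_closed = N * eps / (1.0 + 2.0 * np.exp(beta * eps))
print(np.max(np.abs(E - E_closed)) / (N * eps))   # small relative deviation
```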