2) Entropy Coding (20 points + 20 extra credit!)



Post by answerhappygod »

2) Entropy Coding (20 points + 20 extra credit!)

A source S produces equiprobable symbols x, y. We can easily assign them codes, say x = 0 and y = 1, calling this code C_1; it has an average code length L_1 = 1. This makes perfect sense when S is a random source with entropy H_1 = 1. But we are given more information: whenever S generates a symbol, the probability that the next symbol is the same as the previous one is 0.9. Let's see how the average code length per symbol for S changes under such bias.

If we encode strings of two symbols at a time (xx, xy, yx, yy), construct a Huffman code C_2 and find the average code length per symbol L_2. Show that L_2 < L_1. Also compute the entropy per symbol H_2 and show that H_2 < H_1. (3 + 1 points)
Hint: average code length per symbol = (average code length of the n-symbol blocks) / n. Similarly, entropy per symbol = (entropy of the n-symbol distribution) / n.

Repeat the same exercise encoding three symbols at a time to find a code C_3. Find the average code length per symbol L_3 and show that L_3 < L_2 < L_1. Also compute the entropy per symbol H_3 and show that H_3 < H_2 < H_1. (6 + 2 points)

This should show that as you encode more symbols together, L decreases along with H. In class we learned that, given a probability distribution for symbols of a source with entropy H, any Huffman code gives an average code length L such that H ≤ L < H + 1. Using this relationship, prove that for the given source S, any code C_n that generates an average code length per symbol L_n converges to H_n as n becomes infinite. (8 points)

EXTRA CREDIT: With the given probabilities, calculate the numerical value H_n converges to when n is infinitely large. Must show working to get credit! (20 points)
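For anyone wanting to sanity-check their hand calculations, here is a small self-contained Python sketch (not part of the assignment statement) that enumerates the n-symbol blocks, assigns them probabilities under the stated 0.9 repeat bias, and compares the Huffman average code length per symbol with the entropy per symbol, exactly as the hint describes. The helper names (block_probability, huffman_average_length) are my own, used only for illustration; the assignment still requires the Huffman trees and working to be shown by hand.

    # Hedged sketch (not part of the assignment): check L_n and H_n numerically
    # for the biased source where a symbol repeats the previous one w.p. 0.9.
    import heapq
    from itertools import product
    from math import log2

    P_SAME = 0.9  # probability that a symbol repeats the previous one

    def block_probability(block):
        """P(block): first symbol equiprobable (0.5), then each symbol
        repeats the previous one with probability 0.9, flips with 0.1."""
        p = 0.5
        for prev, cur in zip(block, block[1:]):
            p *= P_SAME if cur == prev else (1.0 - P_SAME)
        return p

    def huffman_average_length(probs):
        """Average codeword length of a binary Huffman code for `probs`.
        Each merge of two subtrees adds one bit to every codeword below
        them, i.e. adds (p1 + p2) to the average length."""
        heap = [(p, i) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        next_id = len(probs)
        avg_len = 0.0
        while len(heap) > 1:
            p1, _ = heapq.heappop(heap)
            p2, _ = heapq.heappop(heap)
            avg_len += p1 + p2
            heapq.heappush(heap, (p1 + p2, next_id))
            next_id += 1
        return avg_len

    for n in (1, 2, 3):
        blocks = [''.join(b) for b in product('xy', repeat=n)]
        probs = [block_probability(b) for b in blocks]
        L_n = huffman_average_length(probs) / n               # avg code length per symbol
        H_n = sum(-p * log2(p) for p in probs if p > 0) / n   # entropy per symbol
        print(f"n={n}: L_n = {L_n:.4f}, H_n = {H_n:.4f}")

For example, at n = 2 the block probabilities are P(xx) = P(yy) = 0.45 and P(xy) = P(yx) = 0.05, so both L_2 and H_2 come out below 1 bit per symbol, which is the pattern the question asks you to demonstrate.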