
Posted: Fri May 20, 2022 1:14 pm
by answerhappygod
The Gettysburg Address is one of the most famous speeches in
American history. The entropy (assuming a zero-order model) is 4.16
bits/character. There are 1478 characters in the speech. What is
the information content of the Gettysburg Address?
A. 1478 bits
B. 6148.5 bits
C. 4.16 bits
D. 18,346 bits
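The quantity being asked for is just the per-character entropy multiplied by the message length. A minimal sketch of that arithmetic, using the figures given in the question:

```python
# Total information content = entropy (bits/character) * message length.
entropy_per_char = 4.16   # bits/character, zero-order model (given)
num_chars = 1478          # characters in the speech (given)

total_bits = entropy_per_char * num_chars
print(round(total_bits, 1))  # ~6148.5 bits
```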
Assume the following is a sequence of grayscale pixel values.
Which of the following is a valid run length code (as described in
class) that captures the sequence?
127 127 127 127 127 183 183 183 183 212 212 195 195 195
A. 635 732 424 585
B. 132 187 216 198
C. 127 5 183 4 212 2 195 3
D. 1275 1834 2122 1953
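Run-length coding replaces each run of identical values with the value and the length of its run. A short sketch, assuming the value-then-count convention used in the choices above:

```python
def run_length_encode(pixels):
    """Encode a sequence as value/count pairs: each run of equal
    values becomes (value, run_length)."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1      # extend the current run
        else:
            runs.append([p, 1])   # start a new run
    # Flatten to the "value count value count ..." form.
    return [x for run in runs for x in run]

pixels = [127, 127, 127, 127, 127, 183, 183, 183, 183,
          212, 212, 195, 195, 195]
print(run_length_encode(pixels))  # [127, 5, 183, 4, 212, 2, 195, 3]
```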
QUESTION 4
Given the sequence of grayscale pixel values above, which of the
following is a valid Huffman code for the sequence?
A. 127: 1; 183: 0; 195: 00; 212: 11
B. 127: 111; 183: 110; 195: 10; 212: 0
C. 127: 00; 183: 11; 195: 110; 212: 000
D. 127: 0; 183: 10; 195: 110; 212: 111
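A necessary condition for any valid Huffman code is that it be prefix-free: no codeword may be a prefix of another, or the bitstream cannot be decoded unambiguously. (A true Huffman code must additionally give shorter codewords to more frequent symbols.) A small sketch of the prefix-free check, using two of the candidate codes from the question:

```python
def is_prefix_free(codes):
    """Return True if no codeword is a prefix of another -
    a necessary condition for any valid Huffman code."""
    words = list(codes.values())
    for i, a in enumerate(words):
        for j, b in enumerate(words):
            if i != j and b.startswith(a):
                return False
    return True

# Candidate code D from the question.
code_d = {127: "0", 183: "10", 195: "110", 212: "111"}
print(is_prefix_free(code_d))  # True

# Candidate code A: "0" is a prefix of "00", so it cannot be decoded.
code_a = {127: "1", 183: "0", 195: "00", 212: "11"}
print(is_prefix_free(code_a))  # False
```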
QUESTION 5
Assuming a zero-order model, in which we consider each roll
independently, the entropy of rolling a die is log2(6), or
approximately 2.58 bits. Now consider the roll of a die in a
first-order model - in other words, we know the number that came up
on the previous roll. Which of the following statements is true?
Assume a fair die - the probability of rolling any number is 1/6.
A. The entropy will remain the same - knowing the previous result
does not reduce the uncertainty of the next roll
B. The entropy of the next roll is increased - knowing the result
of the previous roll increases the uncertainty of the next roll
C. The entropy of the next roll is reduced - knowing the previous
result lowers the uncertainty of the next roll
D. It's impossible to know whether the entropy of the next roll is
increased or decreased until after the outcome is observed
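The key observation is that for a fair die the rolls are independent, so conditioning on the previous roll leaves the distribution of the next roll unchanged. A quick numerical check:

```python
import math

# Entropy of one fair-die roll under a zero-order model.
p = 1 / 6
h_zero = -sum(p * math.log2(p) for _ in range(6))
print(round(h_zero, 2))  # 2.58

# First-order model: because rolls are independent, the distribution
# given the previous roll is still uniform over 1..6, so the
# conditional entropy is identical.
h_given_prev = -sum((1 / 6) * math.log2(1 / 6) for _ in range(6))
print(round(h_given_prev, 2))  # 2.58
```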
QUESTION 6
In a Haar transform (as described in class), what are the two
coefficients you can recover from an approximation coefficient of
756 and a detail coefficient of -50?
A. -750, 750
B. 706, 806
C. 800, 700
D. 75.6, 75.3
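Under the usual Haar convention (and presumably the one from class), the approximation is the pairwise average (x + y) / 2 and the detail is the half-difference (x - y) / 2, so the original pair is recovered as approx + detail and approx - detail. A minimal sketch under that assumption:

```python
def haar_inverse_pair(approx, detail):
    """Recover the original pair from one approximation and one
    detail coefficient, assuming approx = (x + y) / 2 and
    detail = (x - y) / 2."""
    return approx + detail, approx - detail

print(haar_inverse_pair(756, -50))  # (706, 806)
```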
QUESTION 7
Assume the following is a row of grayscale pixel values:
[94 112 102 98 114 150 170 160 148 144 168 182 192 196 198 200]
What is the result after carrying out the first step of a 1D
Haar transform (as described in class) on this row of pixel values?
(Use integer arithmetic - all digits to the right of the decimal
point are truncated.)
A. [101 106 116 133 148 157 155 155 -7 6 -14 -35 -34 -7 15 5]
B. [103 100 132 165 146 175 194 199 -9 2 -18 5 2 -7 -2 -1]
C. [101 148 160 196 2 -16 -14 -2 -9 2 -18 5 2 -7 -2 -1]
D. [121 128 135 140 153 173 184 180 -27 -16 -33 -42 -39 -23 -14 -20]
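One level of the 1D Haar transform takes each adjacent pair, stores all the pairwise averages in the first half of the output and all the half-differences (details) in the second half. A sketch with truncating integer arithmetic, assuming the average/half-difference convention:

```python
def haar_step(row):
    """One level of the 1D Haar transform with truncating integer
    arithmetic: pairwise averages first, then pairwise half-differences.
    Assumes average = (a + b) / 2 and detail = (a - b) / 2; int()
    truncates toward zero, dropping digits after the decimal point."""
    averages = [int((row[i] + row[i + 1]) / 2) for i in range(0, len(row), 2)]
    details = [int((row[i] - row[i + 1]) / 2) for i in range(0, len(row), 2)]
    return averages + details

row = [94, 112, 102, 98, 114, 150, 170, 160,
       148, 144, 168, 182, 192, 196, 198, 200]
print(haar_step(row))
# [103, 100, 132, 165, 146, 175, 194, 199, -9, 2, -18, 5, 2, -7, -2, -1]
```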
QUESTION 8
The modified DCT used in MP3
A. does not use a basis function
B. uses both sine and cosine waves as its basis functions
C. returns fewer frequency coefficients than an unmodified DCT
D. uses a window function with 50% overlap to eliminate
discontinuities at block boundaries
QUESTION 9
Which of the following might include text-to-speech segments
and/or animations, as well as natural audio and video?
A. MPEG-21
B. MPEG-4
C. MPEG-2
D. MPEG-7
QUESTION 10
What is the entropy, in bits/character, of the following message?
sally sells seashells
There are 21 characters; the counts are s: 6, l: 6, e: 3, a: 2,
space: 2, h: 1, y: 1
A. 4 bits/character
B. 7 bits/character
C. 2.5 bits/character
D. 3.5 bits/character