MAS160: Signals, Systems & Information for Media Technology
Problem Set 4
DUE: October 20, 2003
Instructors: V. Michael Bove, Jr. and Rosalind Picard    T.A. Jim McBride

Problem 1: Simple Psychoacoustic Masking

The following MATLAB function performs a simple psychoacoustic test. It creates bandlimited noise centered at 1000 Hz, and also creates a sinusoid. It then plays the noise alone and then the noise plus the sinusoid. Try different values of f and A to see whether you can detect the sinusoid. For a particular value of f, we'll call Amin(f) the minimum amplitude at which the frequency-f sinusoid can still be heard. Plot several values on a graph of f vs. Amin to determine a simple masking curve.

function mask(f,A)
% MASK  Performs a simple psychoacoustic masking test by creating
%       bandlimited noise around 1000 Hz and a single sinusoid at
%       frequency f with amplitude A.  It then plays the noise
%       alone, and then the noise plus the sinusoid.
%
%       f - frequency of sinusoid (0 to 11025)
%       A - amplitude of sinusoid (0 to 1)

% Set sampling rate to 22050 Hz
fs = 22050;

% Create a bandpass filter, centered around 1000 Hz.  Since the
% sampling rate is 22050, the Nyquist frequency is 11025.
% 1000/11025 is approximately 0.09, hence the frequency
% values of 0.08 and 0.1 below.  For more info, do 'help butter'.
[b,a] = butter(4,[0.08 0.1]);

% Create a vector of random white noise (equal in all frequencies)
wn = rand(1,22050);

% Filter the white noise with our filter
wf = filter(b,a,wn);

% By filtering, we've reduced the power in the noise, so we normalize:
wf = wf/max(abs(wf));

% Create the sinusoid at frequency f, with amplitude A:
s = A*cos(2*pi*f/fs*[0:fs-1]);

% Play the sounds
sound(wf,22050)
pause(1)  % Pause for one second between sounds
sound(wf+s,22050)
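
For instance, once you have found Amin at several frequencies by trial and error with mask(f,A), the curve can be plotted in a few lines. This is a minimal sketch only; the probe frequencies and Amin values below are made-up placeholders, not measured data:

% Hypothetical probe frequencies (Hz) and, for each, the smallest
% amplitude at which the sinusoid was still audible (placeholders):
f    = [500  800  950  1000 1050 1200 2000];
Amin = [0.01 0.05 0.20 0.30 0.20 0.05 0.01];

semilogy(f, Amin, 'o-')   % log amplitude axis shows the peak near 1000 Hz
xlabel('Frequency (Hz)')
ylabel('A_{min}')
title('Simple masking curve')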
Problem 2: Markoff processes, entropy, and grading

A particularly lazy teaching assistant is faced with the task of assigning student grades. In assigning the first grade, he decides that the student has a 30% chance of getting an A, a 40% chance of getting a B, and a 30% chance of getting a C (he doesn't give grades other than A, B, or C). However, as he continues to grade, he is affected by the grade he has just given. If the grade he just gave was an A, he starts to feel stingy and there is less chance he will give a good grade to the next student. If he gives a C, he starts to feel guilty and will tend to give the next student a better grade. Here is how he is likely to grade, given the previous grade:

If he just gave an A, the next grade will be: A (20% of the time), B (30%), C (50%).
If he just gave a B, the next grade will be: A (30%), B (40%), C (30%).
If he just gave a C, the next grade will be: A (40%), B (50%), C (10%).

(a) Draw a Markoff graph of this unusual grading process.
(b) Calculate the joint probability of all successive pairs of grades (i.e., AA, AB, AC, etc.).
(c) Calculate the entropy, H, of two successive grades given.
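
To build intuition before working parts (a)-(c) by hand, here is a minimal MATLAB sketch (our own illustration, not part of the assignment) that stores the transition probabilities in a matrix and simulates a run of the TA's grading:

% Rows = grade just given (A, B, C); columns = next grade (A, B, C)
P  = [0.2 0.3 0.5;
      0.3 0.4 0.3;
      0.4 0.5 0.1];
p0 = [0.3 0.4 0.3];     % distribution of the very first grade

grades = 'ABC';
N = 20;                                          % number of students
g = zeros(1,N);
g(1) = find(rand < cumsum(p0), 1);               % draw the first grade
for n = 2:N
    g(n) = find(rand < cumsum(P(g(n-1),:)), 1);  % draw from the row of the previous grade
end
disp(grades(g))                                  % display the simulated grade sequence

Each row of P sums to 1, and each draw depends only on the previous grade, which is exactly the Markoff property the problem describes.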


Problem 3: Entropy Coding

Often it is the case that a set of symbols we want to transmit are not equally likely to occur. If we know the probabilities, then it makes sense to represent the most common symbols with shorter bit strings, rather than using an equal number of binary digits for all symbols. This is the principle behind variable-length coders.

An easy-to-understand variable-length coder is the Shannon-Fano code. The way we make a Shannon-Fano code is to arrange all the symbols in decreasing order of probability, then to split them into two groups with approximately equal probability totals (as best we can, given the probabilities we have to work with), assigning 0 as an initial code digit to the entries in the first group and 1 to those in the second. Then, keeping the symbols in the same order, we recursively apply the same algorithm to the two groups till we've run out of places to divide. The pattern of ones and zeros then becomes the code for each symbol. For example, suppose we have an alphabet of six symbols:

Symbol   % probability   Binary code   Shannon-Fano code
A        25              000           00
B        25              001           01
C        25              010           10
D        12.5            011           110
E        6.25            100           1110
F        6.25            101           1111

Let's see how much of a savings this method gives us. If we want to send a hundred of these symbols, ordinary binary code will require us to send 100 times 3 bits, or 300 bits. In the S-F case, 75 percent of the symbols will be transmitted as 2-bit codes, 12.5 percent as 3-bit codes, and 12.5 percent as 4-bit codes, so the total is only 237.5 bits, on average. Thus the binary code requires 3 bits per symbol, while the S-F code takes 2.375.

The entropy, or "information content," expression gives us a lower limit on the number of bits per symbol we might achieve:

H = -\sum_{i=1}^{m} p_i \log_2(p_i)
  = -[0.25 \log_2(0.25) + 0.25 \log_2(0.25) + 0.25 \log_2(0.25) + 0.125 \log_2(0.125)
      + 0.0625 \log_2(0.0625) + 0.0625 \log_2(0.0625)]

If your calculator doesn't do base-two logs (most don't), you'll need the following high-school relation that many people forget:

\log_a(x) = \log_{10}(x) / \log_{10}(a),   so   \log_2(x) = \log_{10}(x) / 0.30103.

And the entropy works out to 2.375 bits/symbol. So we've achieved the theoretical rate this time. The S-F coder doesn't always do this well, and more complex methods like the Huffman coder will work better in those cases (but are too time-consuming to assign on a problem set!).
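
To make the splitting recursion concrete, here is a minimal MATLAB sketch of the procedure just described (our own illustration; the function name sfcode and its interface are made up for this example). It assumes the probabilities are already sorted in decreasing order:

function codes = sfcode(p, prefix)
% SFCODE  Shannon-Fano codewords for a probability vector p that is
%         already sorted in decreasing order.  Returns a cell array
%         of codeword strings, one per symbol.  Call as sfcode(p);
%         the prefix argument accumulates bits during the recursion.
if nargin < 2, prefix = ''; end
if numel(p) == 1
    codes = {prefix};                 % a group of one symbol: code complete
    return
end
c = cumsum(p);
% Choose the split point where the two groups' probability totals
% are as nearly equal as possible.
[~, k] = min(abs(2*c(1:end-1) - c(end)));
codes = [sfcode(p(1:k),     [prefix '0']), ...   % first group gets a 0
         sfcode(p(k+1:end), [prefix '1'])];      % second group gets a 1

Running sfcode([0.25 0.25 0.25 0.125 0.0625 0.0625]) reproduces the codewords in the table above: 00, 01, 10, 110, 1110, 1111.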

Now it's your turn to do some coding. Below is a letter-frequency table for the English language (also available at http://ssi.www.media.mit.edu/courses/ssi/y03/ps4.freq.txt):

E 13.105   T 10.468   A 8.151   O 7.995
N  7.098   R  6.832   I 6.345   S 6.101
H  5.259   D  3.788   L 3.389   F 2.924
C  2.758   M  2.536   U 2.459   G 1.994
Y  1.982   P  1.982   W 1.539   B 1.440
V  0.919   K  0.420   X 0.166   J 0.132
Q  0.121   Z  0.077

(a) Twenty-six letters require five bits of binary. What's the entropy in bits/letter of English text coded as individual letters, ignoring (for simplicity) capitalization, spaces, and punctuation? (A MATLAB check of this computation is sketched after part (d).)
(b) Write a Shannon-Fano code for English letters. How many bits/letter does your code require?
(c) Ignoring (as above) case, spaces, and punctuation, how many total bits does it take to send the following English message as binary? As your code? [You don't need to write out the coded message, just add up the bits.]
"There is too much signals and systems homework"
(d) Repeat (c) for the following Clackamas-Chinook sentence (forgive our lack of the necessary Native American diacritical marks!).
"nugwagimx lga dayaxbt, aga danmax wilxba diqelpxix."
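
As promised in part (a), here is a minimal MATLAB check (our own illustration): enter the percentages from the table as a vector, normalize them to probabilities, and evaluate the entropy sum directly:

% Letter frequencies from the table above, in percent, read in
% table order (E, T, A, O, N, R, ...):
pct = [13.105 10.468 8.151 7.995 7.098 6.832 6.345 6.101 ...
        5.259  3.788 3.389 2.924 2.758 2.536 2.459 1.994 ...
        1.982  1.982 1.539 1.440 0.919 0.420 0.166 0.132 ...
        0.121  0.077];
p = pct / sum(pct);          % normalize so the probabilities sum to 1
H = -sum(p .* log2(p))       % entropy in bits/letter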

Problem 4: Error Correction

A binary communication system contains a pair of error-prone wireless channels, as shown below.

[Figure: system block diagram, starting from "Sender 1" (the text cuts off here).]