Posted: Fri May 20, 2022 11:09 pm
5.21 Assume we have digitized an analog signal at an fs sample rate of 2x10^6 samples/second. Next we pass the samples through a 70-tap linear-phase lowpass FIR filter whose cutoff frequency (end of the passband) is 600 kHz. What would be the time delay, measured in seconds, between the lowpass filter's input and output for a sinusoidal tone whose frequency is 200 kHz?
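For what it's worth, here is a minimal Python/SciPy sketch of how one could check the standard linear-phase result numerically. It assumes the numbers above really are fs = 2x10^6 samples/second, 70 taps, and a 600 kHz cutoff (the exact design method and cutoff don't change the delay): a symmetric N-tap linear-phase FIR delays every passband tone by (N-1)/2 samples, so 34.5 samples here, or 34.5/fs seconds.

import numpy as np
from scipy import signal

fs = 2e6          # sample rate in samples/second (assumed 2x10^6)
numtaps = 70      # 70-tap linear-phase FIR
cutoff = 600e3    # end of passband, Hz

# Design a linear-phase lowpass FIR (windowed-sinc). The particular design
# method doesn't matter for the delay; only the linear phase / symmetry does.
b = signal.firwin(numtaps, cutoff, fs=fs)

# Group delay (in samples) evaluated at the 200 kHz tone frequency.
w, gd = signal.group_delay((b, [1.0]), w=[200e3], fs=fs)

delay_samples = gd[0]                # expect (numtaps - 1) / 2 = 34.5 samples
delay_seconds = delay_samples / fs   # 34.5 / 2e6 = 17.25 microseconds

print(f"group delay: {delay_samples:.2f} samples = {delay_seconds * 1e6:.2f} us")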