
If the output power from an audio amplifier is measured as 100 W when the signal frequency is 1 kHz and 1 W when the signal frequency is 10 kHz, calculate the dB change in power.

(a) -10 dB
(b) -20 dB
(c) -30 dB
(d) 15 dB

This question came up in an online exam; it is from General Frequency Considerations in the Transistor Biasing and Low Frequencies section of Analog Circuits.

Answer:

The correct choice is (b) -20 dB.

Explanation: Express each output power as a level in dB relative to 1 W (dBW).

Initial power level = 10 log(100 W / 1 W) = 20 dBW

Final power level = 10 log(1 W / 1 W) = 0 dBW

Change in power = final level - initial level

= 0 dBW - 20 dBW = -20 dB.

Equivalently, the change can be computed directly as 10 log(P_final / P_initial) = 10 log(1/100) = -20 dB.
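As a quick check, here is a minimal Python sketch of the same calculation (my own illustration, not part of the original answer; the variable names are assumptions):

    import math

    p_initial = 100.0  # output power at 1 kHz, in watts
    p_final = 1.0      # output power at 10 kHz, in watts

    # Change in power in decibels: 10 * log10(P_final / P_initial)
    db_change = 10 * math.log10(p_final / p_initial)

    print(f"dB change in power: {db_change:.1f} dB")  # prints -20.0 dB

Running this prints -20.0 dB, matching choice (b).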


