Re: SDR operated with a noise source


Roger Need
 

On Thu, Jan 14, 2021 at 04:25 AM, Allan Isaacs wrote:
I notice my SDRplay, when fed with a home-brew noise source, gives signal strength readings that do not match the display.
Generally the meter indicates some 6 to 10 dB higher than the display. This is also the case with the noise source switched off and just the baseline background noise.
With the noise source switched off the baseline noise may be at -125 dBm but the meter indicates -119 dBm. Attached is a typical picture of the noise source used with a low-pass filter. The noise signal looks like -116 dBm at 190 kHz and the meter shows -109.7 dBm.
Is this explainable?
Allan G3PIY
The S-meter in SDR Console is not calibrated for SDRplay devices. The reason is that Simon chose not to use the gain calibration tables to get accurate dBm levels at the input of the receiver.

From an earlier discussion in this group I seem to recall that he does not measure the total power in the bandwidth to arrive at the signal level; rather, he looks for the peak signal in the bandwidth (the sketch below illustrates the difference). There was a long discussion about this and he implemented it the way he wanted.
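To make the distinction concrete, here is a minimal Python sketch. It is not SDR Console's actual code, and the sample rate, FFT size, and test-tone level are all assumed values; it simply contrasts a peak-bin reading with a total-power reading on the same block of complex baseband samples:

import numpy as np

fs = 2_000_000                   # sample rate in Hz (assumed)
n = 4096                         # FFT size (assumed)
t = np.arange(n) / fs
iq = 0.01 * np.exp(2j * np.pi * 190e3 * t)                       # test tone at 190 kHz
iq = iq + (np.random.randn(n) + 1j * np.random.randn(n)) * 1e-3  # broadband noise floor

spec = np.fft.fft(iq) / n              # scaled FFT of one block
power = np.abs(spec) ** 2              # linear power per bin

peak_db = 10 * np.log10(power.max())   # "peak bin" style reading
total_db = 10 * np.log10(power.sum())  # "total power in bandwidth" reading

print(f"peak bin   : {peak_db:6.1f} dBFS")
print(f"total power: {total_db:6.1f} dBFS")

For a single clean carrier the two readings nearly agree, but for a wideband source such as a noise generator the total power sits well above any single bin, which is one way a meter and a per-bin spectrum display can legitimately disagree by several dB.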

If you want accurate dBm measurements (+/- 1 dB) at the input of the RSP, use SDRuno. It is the only SDR program for the RSP that does this automatically for all IF and RF gain settings. It measures the total power in the bandwidth, so it will measure the output of your noise generator correctly. The S-meter also conforms to the IARU standard of 6 dB per S unit (most Japanese analog radios use 2 to 4 dB per S unit); a small conversion example follows.
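As a reference point, the IARU HF recommendation puts S9 at -73 dBm, with 6 dB per S-unit below that. Here is a small illustrative helper in Python (the function name is mine, not anything from SDRuno):

def dbm_to_s_units(dbm: float) -> str:
    # Map input power in dBm to an IARU-style HF S-meter reading
    # (S9 = -73 dBm, 6 dB per S-unit).
    s = 9 + (dbm + 73) / 6
    if s >= 9:
        return f"S9+{dbm + 73:.0f}dB"  # above S9, quoted as dB over S9
    return f"S{max(int(s), 0)}"

print(dbm_to_s_units(-73))     # S9
print(dbm_to_s_units(-109.7))  # about S2 on the 6 dB per S-unit scale

On this 6 dB scale, Allan's -109.7 dBm meter reading corresponds to roughly S2 to S3.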

Roger
