That’s more or less what I expected. The SDR meter provides a jolly useful measure of relative change, not an absolute reading.
I began to get interested in this topic when discussing the Tiny 2.8 Spectrum Analyser, which, unlike my Rigol, has no tracking generator.
I built a noise source that seems to work OK and, after some iterations, provides a reliable output from around 40 kHz to a couple of GHz, so it should enable the Tiny SA (or other things, even SDRs) to check devices like filters.
I’ve been using my Rigol as well as my SDR, and both give the same subjective results (in other words, both show the same filter characteristics), but the Rigol also gives accurate power readings and attenuation figures.
One point of interest is the total power from the noise source. Although the power level is not especially high at any given frequency, for example it reads S9 +30 dB on a decent communications receiver from 100 kHz to 30 MHz, the source develops 1 mW (0 dBm) as measured by my HP power meter (10 MHz to 10 GHz). If the power meter spec covered 0-10 GHz, that 0 dBm figure would read higher still, perhaps as high as +10 dBm? That means it would be a bad idea to amplify the noise output any further, for fear of accidentally damaging a spectrum analyser, which typically has a maximum input rating of +20 dBm.
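As a side note, the total-power arithmetic can be sketched numerically. This is a rough illustration only, and it assumes (purely for the sake of the sum) a flat noise density across the band, which a real noise source won't have: a 0 dBm total over the meter's 10 MHz to 10 GHz span implies a density of roughly -100 dBm/Hz, and under that flat assumption the extra 40 kHz to 10 MHz at the bottom adds almost nothing to the total, so a markedly higher figure below 10 MHz would have to come from a much stronger low-frequency density (which the strong S9 +30 dB readings below 30 MHz do suggest).

```python
import math

def band_power_dbm(density_dbm_hz, f_lo_hz, f_hi_hz):
    """Total power (dBm) of a flat noise density integrated over one band.

    Assumes the density is constant across the band -- an idealisation,
    not a model of any particular noise source.
    """
    return density_dbm_hz + 10 * math.log10(f_hi_hz - f_lo_hz)

# Measured: 0 dBm total across the meter's 10 MHz - 10 GHz range.
# Back out the implied flat density in dBm/Hz.
density = 0.0 - 10 * math.log10(10e9 - 10e6)   # ~ -100 dBm/Hz

# Re-integrate over the full 40 kHz - 10 GHz output of the noise source.
total_full = band_power_dbm(density, 40e3, 10e9)  # ~ 0.004 dBm: barely more
```

The takeaway from the flat-spectrum sketch is that bandwidth below 10 MHz is a tiny fraction of the total, so the low band only matters if its density is far above -100 dBm/Hz.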
main@SDR-Radio.groups.io [mailto:main@SDR-Radio.groups.io] On Behalf Of jdow
Until you calibrate it there is no way SDRC can be thought of as being