Posts by PA1EJO

    The QO100 beacon strength might be correlated with relative humidity. All of these measurements take more than two minutes and are done in periods when there is nobody on the NB transponder. The satellite transponder also has an AGC, so strong transmitters will reduce its gain; I keep my fingers crossed that the transponder beacons always come down at the same strength. You can measure the beacon SNR with the matlab script readsnr.m, which needs signal history files from SDRconsole, see also
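    The core of such a script can be sketched in a few lines of Python. This is a minimal sketch, not the actual readsnr.m: the one-line header and the timestamp,dBm column layout are assumptions, and the real SDRconsole export format may differ.

```python
import csv

def read_history_dbm(path):
    """Collect the dBm values from a signal-history CSV export.
    Assumed layout: one header row, then timestamp,dBm per line."""
    values = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            values.append(float(row[1]))
    return values

# tiny demo file standing in for a real SDRconsole export
with open("history.csv", "w") as f:
    f.write("time,dBm\n0,-82.0\n1,-82.4\n")
print(read_history_dbm("history.csv"))  # [-82.0, -82.4]
```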

    One possible clue is to look at the signal-to-noise ratio, the SNR, and not at the signal in dBm, because the latter could very well be affected by SDR software parameter settings. The SNR is much less affected; in SDR console you can get it from the signal history and save it in a CSV file. What I do notice is that the SNRs at Goonhilly are lower than what I receive at home; I have no idea why this is the case.

    The size of the dish will not do that, why?

    The excel sheet calculates the expected signal to noise ratio (SNR) in dB of your transmission as received by the satellite. All parameters in yellow affect that SNR: a bigger dish means a higher gain, so more signal (but not more noise) arrives at the satellite. The analogy is: a better pointer makes a brighter spot.
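    The dish-gain part of that calculation can be illustrated with a short sketch. This is not the spreadsheet itself, just the standard parabolic-antenna gain formula G = η(πD/λ)², with an assumed aperture efficiency of 0.6:

```python
import math

def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
    """Parabolic dish gain: G = efficiency * (pi * D / wavelength)^2, in dBi."""
    wavelength = 3e8 / freq_hz
    gain_linear = efficiency * (math.pi * diameter_m / wavelength) ** 2
    return 10 * math.log10(gain_linear)

# 2.4 GHz uplink: doubling the diameter adds about 6 dB of gain
print(round(dish_gain_dbi(0.9, 2.4e9), 1))  # 24.9
print(round(dish_gain_dbi(1.8, 2.4e9), 1))  # 30.9
```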

    Maybe try this calculator in excel or matlab. In excel, specify your set-up in the yellow fields and don't change the green fields; at the end it returns, in orange, the SNR in dB, which is probably the best you can achieve on the waterfall. My setup is already filled in.

    If you lose that much signal then it is probably water on the feed. If you have a canopy on the feed you won't see such attenuation levels during rain. Contrary to a persistent belief, 2.4 GHz is NOT a water absorption frequency; the lowest water vapor resonance is at 22 GHz.

    With the 2600 Hz SSB filter in SDR console over the end beacon the SNR remains steady at 28 dB. Just export the values to an excel sheet and calculate the dBm difference within excel; it doesn't really matter how you do this. For instance, take the mean dB value at the lowest level and at the highest level and subtract them. I take the noise measurement a few kHz away from the beacon and the signal measurement over the beacon. I find no convincing difference with the earlier described 50 Hz CW-U method.
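    As a sketch of that arithmetic in Python (the dBm values below are made up for illustration, not real measurements):

```python
import statistics

def snr_db(signal_dbm, noise_dbm):
    """SNR in dB: mean dBm over the beacon minus mean dBm of the noise floor."""
    return statistics.mean(signal_dbm) - statistics.mean(noise_dbm)

# made-up exported values: a few readings over the beacon and a few kHz away
beacon = [-82.1, -82.3, -81.9, -82.0]
noise = [-110.2, -110.0, -109.8, -110.4]
print(round(snr_db(beacon, noise), 1))  # 28.0
```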

    Hello Ernst,

    Interesting idea. Would you please give a short description of your equipment and method, to make the results comparable?


    You only need SDRconsole and an SDR. Go to View -> Signal History and you get a situation like the one shown. Best-guess the peak dBm values in both 50 Hz CW decoders and subtract them; this morning I got approximately 29 dB, which also means that nothing is attenuating the satellite signal (like rain water on the canopy of the antenna feed). Signal History in SDRconsole even offers the possibility to download the dBm values in a CSV file that you can import into a spreadsheet. I focus on the peak values of what I see in SDR console.

    You will see that the SNR you find is mostly unaffected by the settings of the SDR; even the bandwidth of the demodulator should not matter too much, in my opinion. What is important is that you capture the maximal signal of the CW beacon at a moment when it is continuous, and the real noise floor somewhat, but not too far, away from the beacon.

    Martin DM4IM also did this test, I learned during a QSO. He got 34 dB, but his antenna is larger than my TDS88 from Triax, which I rate at 90 cm diameter; his is 150 cm. The difference in antenna gain is about 4 dB (see the 31-May article on my blog); the rest of the difference is due to downconverter or LNB differences, or simply the antenna efficiency. I think these are interesting tests to compare. In my opinion it says everything about receiver sensitivity.
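    The roughly 4 dB figure can be checked directly: gain scales with aperture area, so the dB difference between two dish diameters is 20·log10(D2/D1).

```python
import math

# gain scales with aperture area, so in dB: 20 * log10(D2 / D1)
delta_db = 20 * math.log10(1.50 / 0.90)  # 150 cm vs 90 cm dish
print(round(delta_db, 1))  # 4.4
```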

    I know that SDRplay's RSPduo gives realistic dBm values; I used an HP step attenuator, an HP power meter and a signal source to calibrate the dBm scale in SDRconsole, and it is fine in my opinion.

    What you are discussing is related to the link budget. There is an RX and a TX part: the RX part concerns your receive sensitivity at 10 GHz, and the TX part is the other way around.

    Let's do something everyone can do: look at a signal unique to the satellite, the two end beacons, which are assumed to always operate at the same power level.

    What you do is test your signal-to-noise ratio at the begin or end beacon of the NB transponder: measure with a 50 Hz wide CW filter the dBm of the continuous wave transmitted by the CW beacon, then repeat that measurement 5 kHz below the beacon, in a region where you only see the noise floor.

    In the end you get two dBm values and you subtract them; in my case I find 28 dB, a value corresponding to my receive sensitivity. With the former receiver, where I had a semi-rigid cable problem between the feed and the downconverter and a smaller dish, it was 20 dB, which is in my opinion poor receive sensitivity. That problem is gone, so I can hear others better.

    Take the value of 28 dB as an arbitrary reference, assume a factor of 2 (or 3 dB) in the sensitivity test and complete the following poll. This is a QRM-free test everyone can do.
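    The factor-2-equals-3-dB step is just 10·log10(2):

```python
import math

# a factor of 2 in power corresponds to about 3 dB
print(round(10 * math.log10(2), 2))  # 3.01
```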