Posts by PA1EJO

    Did a test last night with the Leila system; what I think is that the warning system is triggered by narrow-band signals in the 2700 Hz SSB window. Narrow-band in this context means that Leila performs the power measurement over a much narrower part of the window than the 2700 Hz (or so) that we use for SSB.


    The consequence is that you can transmit with the same power, but that poor equalization of the transmitted signal will still trigger Leila. Too much bass or treble and Leila goes off; equalize the same power and Leila stays silent. Whistle into the microphone and Leila becomes active. Insert a CW tone at the same transmit power and Leila is triggered again.
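
    To make the point concrete, here is a minimal Python sketch of the difference I have in mind between a total-power check and a power-density check. The 30 Hz sub-band width and the 300-2700 Hz audio window are my own assumptions, not known LEILA parameters.

    # Sketch of my assumption about the mechanism, not documented LEILA behaviour:
    # two signals with the same total power, very different peak power density.
    import numpy as np

    fs = 48000                 # sample rate in Hz
    n = fs                     # one second of samples
    t = np.arange(n) / fs
    rng = np.random.default_rng(0)

    def ssb_limit(x, fs, lo=300.0, hi=2700.0):
        """Crudely band-limit a signal to the SSB audio window with an FFT mask."""
        X = np.fft.rfft(x)
        f = np.fft.rfftfreq(len(x), 1 / fs)
        X[(f < lo) | (f > hi)] = 0
        return np.fft.irfft(X, len(x))

    def normalize_power(x):
        """Scale to unit mean power so both signals are 'transmitted' equally hard."""
        return x / np.sqrt(np.mean(x**2))

    speech_like = normalize_power(ssb_limit(rng.normal(size=n), fs))  # broadband audio
    whistle = normalize_power(np.sin(2 * np.pi * 1000 * t))           # single tone

    def peak_density(x, fs, sub=30.0):
        """Largest fraction of the total power found in any `sub` Hz slice."""
        spec = np.abs(np.fft.rfft(x))**2
        spec /= spec.sum()
        n_sub = max(1, int(round(sub * len(x) / fs)))
        return max(spec[i:i + n_sub].sum() for i in range(0, len(spec), n_sub))

    for name, sig in [("equalized speech-like", speech_like), ("whistle", whistle)]:
        print(f"{name:22s} total power = {np.mean(sig**2):.2f}, "
              f"worst 30 Hz slice holds {100 * peak_density(sig, fs):.1f}% of it")

    Both signals report the same total power, but the whistle concentrates almost all of it in one narrow slice, which is exactly the behaviour that seems to wake Leila up.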


    I fully understand that we should stay at acceptable power levels, but in the current implementation Leila is more of a power-density monitor than a power meter. Power metering should, in my opinion, consider the bandwidth we use for SSB; otherwise you are penalizing users for poor modulation rather than excessive transmit power. What did the makers of the Leila system want to achieve? Where is the pass-band aspect in your power measurement? What does "stay below the level of the beacon" mean: within a typical SSB bandwidth or within some other bandwidth, and what width are we talking about? Or did you also want to warn users that they are splattering?


    Please don't start the full-duplex discussion again; yes, I operate full duplex myself in both my permanent and my portable setup. We see Leila and act, and that is how we formed the picture of how Leila works described above.

    PA1EJO: The satellite LEILA beacons are injected from AMSAT-DL HQ in Bochum.

    This confirms my suspicion, because of the factor 0.814 that I mention in the blog. I get the best match between my own Doppler frequency-offset measurements and the PSK beacon frequency offsets when I apply the 0.814 factor to both the PSK beacon measurement and my own Doppler measurement. In other words, the PSK beacon signal must go through the same 2.4 GHz uplink in the same way that my round-trip measurement does. My last point: why not apply the approx. -70 Hz offset on the ground, so that you only get to see the Doppler from the satellite?
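
    If I read my own 0.814 correctly, it is the downlink share of the total round-trip Doppler. A quick sanity check in Python, where the exact uplink and downlink frequencies are approximate NB-transponder mid-band values that I filled in myself:

    # Rough check of where 0.814 could come from: the downlink fraction of the
    # total round-trip Doppler. Frequencies are my own approximate assumptions.
    f_up = 2400.25e6        # Hz, 2.4 GHz uplink
    f_down = 10489.75e6     # Hz, 10 GHz downlink

    downlink_share = f_down / (f_up + f_down)
    print(f"downlink share of round-trip Doppler: {downlink_share:.3f}")  # ~0.814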

    This is why I build my own ground stations, where I insert a signal splitter and an isolator in the receive line. The second output of the splitter goes into an SDR with SDRconsole on a laptop; the first output of the splitter goes to the transceiver.

    This thread is ancient, but nevertheless: you can monitor the frequency of the satellite oscillator yourself with a round-trip measurement experiment. I found this: https://pa1ejo.wordpress.com/2…00-satellite-transponder/ My verdict is: -63.7 Hz bias for the satellite oscillator and -6 Hz for relativity. The rest of the frequency offsets correspond to the Doppler effect due to the satellite orbit dynamics and some maneuvering, which you get from the TLEs. The satellite beacon is, as far as I can tell, an injected signal from Doha in Qatar, and I think there is a reason to keep the total Doppler-plus-oscillator effect negative, so that there can be no confusion about the sign of the correction.
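
    For what it is worth, the bookkeeping behind that verdict can be written down in a few lines. This is only a sketch of the sum I use; tle_doppler_hz is a placeholder you would fill in from your own TLE-based orbit computation.

    # Round-trip offset bookkeeping: measured offset should match the TLE-derived
    # Doppler plus a fixed oscillator bias and a small relativistic term.
    OSC_BIAS_HZ = -63.7       # satellite local-oscillator bias (my blog value)
    RELATIVITY_HZ = -6.0      # relativistic contribution (my blog value)

    def predicted_offset(tle_doppler_hz: float) -> float:
        """Predicted round-trip frequency offset in Hz."""
        return tle_doppler_hz + OSC_BIAS_HZ + RELATIVITY_HZ

    # Example: with, say, -20 Hz of residual Doppler from the TLEs the total
    # stays negative, so the sign of the correction is never ambiguous.
    print(predicted_offset(-20.0))   # -89.7 Hz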

    When you specify phase noise, at what offset frequency do you want the -90 dBc/Hz?

    The QO100 beacon strength might be correlated with relative humidity. All of these measurements take more than two minutes and are done in periods when there is nobody on the NB transponder. The satellite transponder also has an AGC, so if there are strong transmitters it will reduce its gain; I keep my fingers crossed that the transponder beacons always come down at the same strength. You can measure the beacon SNR with the Matlab script readsnr.m, which needs signal-history files from SDRconsole; see also https://github.com/ejo60/hamradio



    One possible clue is to look at the signal-to-noise ratio, the SNR, and not the signal level in dBm, because the latter could very well be affected by SDR software parameter settings. The SNR is much less affected; in SDR console you can get it from the signal history and save it to a CSV file. What I do notice is that the SNRs at Goonhilly are lower than what I receive at home; I have no idea why this is the case.
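
    As a rough Python counterpart of readsnr.m (not a copy of it), something like the sketch below reads such a signal-history export and averages the SNR column. The column names "UTC" and "SNR" and the file name are assumptions on my part; check them against your own SDRconsole export.

    # Read an SDR Console signal-history CSV export and summarise the SNR column.
    import csv
    from statistics import mean

    def read_snr(path: str):
        """Return a list of (timestamp, snr_db) tuples from a signal-history CSV."""
        rows = []
        with open(path, newline="") as fh:
            for rec in csv.DictReader(fh):
                try:
                    rows.append((rec["UTC"], float(rec["SNR"])))
                except (KeyError, ValueError):
                    continue    # skip malformed or differently named columns
        return rows

    if __name__ == "__main__":
        data = read_snr("signal_history.csv")   # hypothetical file name
        if data:
            print(f"{len(data)} samples, mean SNR = {mean(s for _, s in data):.1f} dB")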

    The size of the dish will not do that. Why?

    The Excel sheet calculates the expected signal-to-noise ratio (SNR), in dB, of your transmission as received by the satellite. All parameters in yellow affect that SNR, since a bigger dish means a higher gain, so more signal (but not noise) arrives at the satellite. The analogy is: a better pointer makes a brighter spot.

    Maybe try this calculator in Excel or Matlab. In Excel, specify your set-up in the yellow fields, don't change the green fields, and at the end it returns, in orange, the SNR in dB, which is probably the best you can achieve on the waterfall. My own setup is already filled in.
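
    For those without Excel or Matlab, a stripped-down uplink budget in the same spirit can be sketched in Python. All numbers below are example values of mine, not figures taken from the sheet or official transponder data.

    # Simplified uplink budget: EIRP - path loss + G/T - k - bandwidth = SNR.
    import math

    def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
        """Gain of a parabolic dish: 10*log10(eff * (pi*D/lambda)^2)."""
        lam = 3e8 / freq_hz
        return 10 * math.log10(efficiency * (math.pi * diameter_m / lam) ** 2)

    def free_space_loss_db(distance_m, freq_hz):
        """Free-space path loss: 20*log10(4*pi*d/lambda)."""
        lam = 3e8 / freq_hz
        return 20 * math.log10(4 * math.pi * distance_m / lam)

    f_up = 2.4e9                         # uplink frequency, Hz
    d_slant = 38_000_000                 # slant range to the satellite, m (example)
    tx_power_dbw = 10 * math.log10(5)    # 5 W transmitter (example)
    feed_loss_db = 1.0                   # cable/feed losses (example)
    eirp_dbw = tx_power_dbw + dish_gain_dbi(1.2, f_up) - feed_loss_db

    path_loss_db = free_space_loss_db(d_slant, f_up)
    sat_g_over_t = -12.0                 # satellite G/T in dB/K, example figure only
    k_boltzmann = -228.6                 # Boltzmann constant in dBW/K/Hz
    bandwidth_db = 10 * math.log10(2700) # SSB bandwidth

    snr_db = eirp_dbw - path_loss_db + sat_g_over_t - k_boltzmann - bandwidth_db
    print(f"EIRP {eirp_dbw:.1f} dBW, path loss {path_loss_db:.1f} dB, "
          f"estimated uplink SNR {snr_db:.1f} dB")

    Doubling the dish diameter adds about 6 dB of gain, which is the "bigger dish means a higher gain" effect from the yellow fields; nothing else in the budget changes.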

    If you lose that much signal then it is probably water on the feed or something similar; if you have a canopy over the feed you won't see such attenuation levels during rain. Contrary to a belief that seems impossible to erase, 2.4 GHz is NOT a water absorption frequency; the lowest water vapor resonance is at 22 GHz.