TX-to-RX "noise burst" display artifact explained!
Posted: Mon Oct 24, 2022 3:10 pm
Most of us have probably noticed a "noise burst" on the spectral display at the tail end of every transmission and wondered if that was really energy going out over the air. Here is a greatly exaggerated example:
Many of us have even looked at other SDR receivers to see if this apparent energy is being transmitted, and of course it isn't, thank goodness! However, it can be very disconcerting, to say the least!
Now, finally, the answer is at hand: it has to do with the FFT processing used to form the spectral display.
Bottom line up front: FFT latency, along with any display averaging in use, causes the spectral display data to lag the actual state of the radio. So it appears that a transmission is still in progress even after the radio has switched to receive, although this is certainly not true. Add to this the large difference in amplitude between the sampled transmit signal and typical received signals, plus the change in vertical (amplitude) scale between TX and RX, and together these can create quite a large artifact on the display that looks an awful lot like a noise burst.
So it's definitely an annoyance, and disconcerting, and VERY confusing to the uninitiated, but it's not indicative of any over-the-air behavior.
A fuller explanation...
This problem is only apparent in DUP mode. The spectral display runs continuously in this mode, processing the data from RX1 during both receive and transmit without pause. The FFT settings during both receive and transmit are those set in Setup > Display > RX1.
The FFT settings in Setup > Display > TX are NOT used when in DUP mode (the display scaling settings are).
The only thing that changes on the spectral display between TX and RX states are the scaling and formatting of the display. The RX1 data stream remains constant from the FFT, again without pause between TX and RX states.
FFT calculations are not instantaneous. To work they need to gather blocks of data. Thus the output of the FFT is delayed by the number of samples, or "points", used in the FFT calculation. The calculation is straightforward:
Number of samples = sample rate / frequency resolution
This delay can get quite long. For instance, at a 768 kHz sample rate and 2.93 Hz resolution (aka bin size), that works out to an FFT size, or "length", of 768000/2.93 = 262116 samples. In practice the FFT size wants to be a power of 2, and the closest length is 262144 (2^18), but you get the idea. And since every sample is 1/768000 seconds long, do the math and you get a delay through the FFT of 262144/768000 = 0.341 seconds, or 341 milliseconds. That's a pretty long time!
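The arithmetic above can be sketched as a small helper. This is just an illustration of the math, not code from the radio software; the function name and the "round to the closest power of 2" rule are assumptions based on the description above.

```python
import math

def fft_latency(sample_rate_hz, bin_size_hz):
    """Return (fft_length, delay_seconds) for the given display settings."""
    raw = sample_rate_hz / bin_size_hz
    # Pick the power of 2 closest to the raw sample count, as described above.
    fft_length = 2 ** round(math.log2(raw))
    # Each sample lasts 1/sample_rate seconds, so the FFT must wait this long
    # for a full block of data before it can produce any output.
    return fft_length, fft_length / sample_rate_hz

length, delay = fft_latency(768_000, 2.93)
# length is 262144 and delay is about 0.341 s, matching the worked example
```

Plugging in other settings shows why this sneaks up on people: halving the bin size doubles the FFT length and therefore doubles the display lag.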
If you add display averaging into this problem space the issue just gets worse.
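As a rough sketch of why averaging makes it worse (assuming a simple N-frame moving average, which is one common scheme and not necessarily exactly what this software does): stale pre-switch frames linger in the average for N more frame intervals on top of the FFT delay.

```python
def total_display_lag(fft_delay_s, frames_averaged, frame_interval_s):
    """Worst-case time until the display reflects only post-switch data:
    the FFT pipeline delay plus however long old frames survive in the
    moving average."""
    return fft_delay_s + frames_averaged * frame_interval_s

# e.g. the 341 ms FFT delay plus 10 averaged frames at 30 frames/second
lag = total_display_lag(0.341, 10, 1 / 30)
# roughly 0.67 s before the artifact fully clears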
This problem also occurs in the RX-to-TX direction. It's usually overlooked because the difference in display scaling that most people set up does a good job of masking the issue. For instance, if the panadapter scale is looking down to -130 during RX, but only -70 during TX, then not much is going to be seen. However, if you go through an exercise of making the TX panadapter scaling the same as RX the problem is plainly evident.
Methods to lessen the impact
Some folks have taken to adjusting their MOX or RF delay settings to ridiculous values of hundreds of milliseconds in order to suppress the artifact. But the only thing being suppressed is the ugly display, not any real RF energy coming out of the radio. And this adds tons of latency to TX-to-RX transitions, which can badly impact the quality of communications, whether it's breaking into pile-ups or just ragchewing.
You can lower your sample rate, but of course this then limits the ability to watch entire sub-bands or bands.
You can increase the FFT bin size, but then the display becomes so lively that you pretty much have to turn on display averaging. The two changes effectively cancel each other out.
So what can be done?
In order to achieve a more intuitive display, the software would need to be updated to simply blank or suppress the FFT data stream from the time transmit RF starts or ends for a duration equal to the FFT length. That way the old data can all go into the bit bucket and be replaced by the data applicable to the current state of the radio. Also, display averaging would need to be reset at the end of the blanking period.
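The proposed blanking could look something like the sketch below. All names here are hypothetical; the real software's data structures will differ, and this only illustrates the two behaviors described above: discard FFT frames for one FFT length after each transition, then reset the display average.

```python
class SpectrumBlanker:
    """Suppress FFT frames for one FFT length after each TX/RX transition,
    then signal an averaging reset so no stale data leaks through."""

    def __init__(self, fft_delay_s):
        self.fft_delay_s = fft_delay_s
        self.blank_until = 0.0
        self.need_avg_reset = False

    def on_ptt_change(self, now):
        # Called whenever transmit RF starts or ends.
        self.blank_until = now + self.fft_delay_s
        self.need_avg_reset = True

    def filter_frame(self, frame, now):
        # Returns (frame_or_None, reset_average_flag).
        if now < self.blank_until:
            return None, False      # old data goes into the bit bucket
        reset = self.need_avg_reset
        self.need_avg_reset = False
        return frame, reset
```

For example, with the 341 ms delay from earlier, a frame arriving 100 ms after the transition is dropped, while the first frame after 341 ms is displayed and triggers one averaging reset.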
This should work well for phone and digital operations. However, the rapid and continuous interruptions in the spectral display may be too much during CW operations, particularly full break-in CW available on Orion and later board variants.
Any ideas regarding this approach would be most welcome.