Ensuring quality of service (QoS) for television programming is challenging for many reasons. Broadcasters use a variety of network architectures to deliver TV programs. Most of these networks use satellite for distribution (also known as ingest), ASI or IP throughout the facility, and often RF to the home (also known as egress). The quality of today's digital video and audio is usually quite good, but when audio or video issues appear at random, isolating the root cause is difficult. The issue might be as simple as an encoder over-compressing a few pictures during a scene with heavy motion, or it might stem from a random weather event such as high winds, rain or snow.
No matter where the problem originates, it is important to quickly identify and log the issue, then identify the equipment (or network link) that needs attention. To identify and isolate problems, it is critical to have access or test points throughout the facility. Any network should have a minimum of three test points: at ingest, at the ASI or IP switch, and at egress. With at least three access points, engineers can determine whether an issue originated at ingest, within the facility, or at the egress point of distribution.
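The isolation logic follows from the order of the test points: an error first seen at ingest is upstream of the facility, an error first seen at the switch is inside the facility, and so on. A minimal sketch, assuming a hypothetical boolean error indicator is available at each of the three test points:

```python
def isolate_fault(ingest_err: bool, facility_err: bool, egress_err: bool) -> str:
    """Name the first network segment where an error appears.

    Hypothetical helper: errors propagate downstream, so the earliest
    test point that shows the error localizes the originating segment.
    """
    if ingest_err:
        return "ingest (satellite distribution)"
    if facility_err:
        return "facility (ASI/IP switching)"
    if egress_err:
        return "egress (RF to the home)"
    return "no fault observed"
```

In practice the "indicators" are the error logs of monitoring probes at each point, but the comparison logic is the same.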
Two methods are commonly used to test RF transmission signals for suspected issues. One is QoS and the other is quality of experience (QoE). Both methods are useful in troubleshooting and analysis, but each method quantifies issues using completely different metrics. Here we’ll focus on how to check, measure and improve RF transmission signals using an MPEG Analyzer by testing QoS.
Measuring the QoS is an easy way to determine if there is an error in the transmission signal. One way to measure QoS is the ratio of errored bits to total bits, the bit error ratio (BER). The optimum method for testing a transmission link is to take it out of service so special test patterns can be used. These patterns are repetitive, making it easy to quantify when a bit or byte is in error. The problem with testing live TV programs is that they contain very little repetitive data. In this case, BER is critical to monitoring satellite signals because the forward error correction carries a small amount of redundancy, which lets the receiver count the bits it corrects. Continuously measuring BER is good, but once the error ratio exceeds 5 × 10⁻³ and one or more errors make it into a transport packet, it is almost impossible to predict what will happen to the video and audio. If the error ratio is high enough (above 5 × 10⁻³), errors will land in many of the audio and video frames, and it is safe to assume viewers will notice. A more difficult monitoring problem is judging the impact of less frequent errors.
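The out-of-service measurement above reduces to a simple calculation: XOR the received buffer against the known test pattern and count set bits. A minimal sketch, with the 5 × 10⁻³ figure from the text used as an illustrative visibility threshold:

```python
def bit_error_ratio(sent: bytes, received: bytes) -> float:
    """Ratio of errored bits to total bits between two equal-length buffers."""
    errors = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
    return errors / (len(sent) * 8)

# Threshold from the text: beyond roughly 5 x 10^-3, errors land in
# enough audio and video frames that viewers are likely to notice.
VISIBLE_THRESHOLD = 5e-3

pattern = bytes(range(256))    # a known, repetitive test pattern
corrupted = bytearray(pattern)
corrupted[10] ^= 0x01          # flip a single bit in transit
ber = bit_error_ratio(pattern, bytes(corrupted))
print(ber > VISIBLE_THRESHOLD)  # False: one bit in 2048 is well below the cliff
```

Real analyzers derive the in-service BER from the FEC decoder's corrected-bit counts rather than from a reference pattern, but the ratio itself is computed the same way.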
Signal power is a basic but extremely important RF test. TV receivers are designed to work with a reasonable amount of signal power and noise. Together, these two measurements give the signal-to-noise ratio (SNR). The higher the SNR, the easier it is for the receiver to recover the transmitted bits or symbols; the lower the SNR, the more probable a bit will be received in error. For digital TV transmission measurements, we refer to the modulation error ratio (MER) rather than SNR. As long as the receiver sees a high MER, the QoS is assumed to be very good. The problem begins when the receiver is near the fringe of the reception area or when low power and high noise corrupt the signal – the point often referred to as the "digital cliff." Approaching this point, some symbols are received incorrectly; once the MER decreases enough to cross the cliff, the TV picture and sound go from great to terrible.
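The relationship between the two power measurements is the standard decibel ratio. A small sketch of the calculation, assuming both powers are measured in the same linear units:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in dB from linear power values (same units)."""
    return 10 * math.log10(signal_power / noise_power)

# Higher SNR means easier symbol recovery; as the ratio falls toward the
# "digital cliff", symbol errors appear and then reception fails outright.
print(snr_db(100.0, 1.0))  # 20.0 dB
```

MER is computed the same way in dB, except the "noise" term is the error-vector power of the received constellation points rather than a separate noise measurement.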
Most video satellite transmissions use QPSK (e.g., DVB-S, DigiCipher II, DSNG, etc.) to send a transport stream point-to-point or point-to-multipoint. More recent technology such as the 8PSK mode of DVB-S2 allows higher data throughput in the same frequency spectrum as a QPSK signal.
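The throughput gain comes from bits per symbol: QPSK carries 2 bits per symbol, 8PSK carries 3, so at the same symbol rate (and hence the same spectrum) 8PSK moves 50% more raw bits. A back-of-envelope sketch, ignoring FEC overhead and roll-off; the 27.5 Msym/s figure is just an illustrative symbol rate:

```python
def raw_bitrate_mbps(symbol_rate_msym: float, bits_per_symbol: int) -> float:
    """Raw channel bit rate before FEC overhead, in Mbit/s."""
    return symbol_rate_msym * bits_per_symbol

SYMBOL_RATE = 27.5                       # Msym/s, illustrative value
qpsk = raw_bitrate_mbps(SYMBOL_RATE, 2)  # QPSK: 2 bits per symbol
psk8 = raw_bitrate_mbps(SYMBOL_RATE, 3)  # 8PSK: 3 bits per symbol
print(qpsk, psk8)                        # 55.0 82.5 -> 50% more raw bits
```

The trade-off is that the 8PSK constellation points sit closer together, so a higher MER is needed to stay away from the cliff.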
Beyond RF and BER testing, the TR 101 290 document includes a recommended set of transport stream measurements for broadcasters (section 5 of the standard):
- Priority 1 errors mean the receiver will not be able to lock to the signal
- Priority 2 errors imply that the quality of the video and audio may be impaired
- Priority 3 errors imply a problem in ancillary data such as the electronic program guide
From a troubleshooting and diagnostics point of view, the most critical of these tests are Sync Byte Error, Continuity Counter Error and Transport Error Indicator. Errors in any of these three categories usually mean that something has gone seriously wrong in the transmission of the stream, or possibly in the building or multiplexing of the stream. The TR 101 290 test parameters are a great way to quickly gauge the health of a transport stream and its audio and video elements, but some of the other tests are often misleading. The most notable are the timing tests for the MPEG-2 Program Association and Program Map Tables (PAT and PMT).
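All three of those critical checks operate on fields in the 4-byte header of each 188-byte transport packet: the sync byte is 0x47, the transport error indicator is the top bit of the second byte, and the continuity counter is the low nibble of the fourth byte. A simplified sketch of the checks, assuming a single PID whose every packet carries payload (real streams track one counter per PID and allow some exceptions):

```python
SYNC_BYTE = 0x47  # every MPEG transport packet must start with 0x47

def check_packet(pkt: bytes, last_cc):
    """Run the three critical header checks on one 188-byte TS packet.

    Simplified: assumes every packet on one PID carries payload, so the
    continuity counter should advance by one (mod 16) each packet.
    Returns (list of error names, this packet's continuity counter).
    """
    errors = []
    if pkt[0] != SYNC_BYTE:
        errors.append("Sync Byte Error")
    if pkt[1] & 0x80:                       # transport_error_indicator bit
        errors.append("Transport Error Indicator")
    cc = pkt[3] & 0x0F                      # continuity_counter, low nibble
    if last_cc is not None and cc != (last_cc + 1) % 16:
        errors.append("Continuity Counter Error")
    return errors, cc
```

An analyzer runs these checks on every packet, so a single dropped or corrupted packet anywhere in the chain shows up immediately in the log.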
It is true that the PAT and PMT are required (Priority 1), and a maximum arrival-interval requirement is sensible, but if the tables arrive just one millisecond late, the TR 101 290 display goes from green to red. This is reported as a critical error, even though the extra millisecond of table latency is never noticed by the TV, the set-top box or the viewer. For this reason, some interpretation of TR 101 290 results is needed; relying on them blindly can get you into trouble. It is common practice for many network operators to increase the arrival-interval thresholds to allow some deviation while still verifying that the tables arrive at a regular rate. Keeping the intervals tight helps end users change channels quickly, but TR 101 290 may falsely flag a problem when a stream or program has a slight delay in parts of its electronic program guide. The same goes for many of the program clock reference (PCR) measurements in Priority 2: a deviation of a few percent in the interval makes no difference to virtually any TV or set-top box, yet it will flip TR 101 290 from green to red.
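The relaxed-threshold practice described above amounts to multiplying the nominal limit by an operator-chosen margin before comparing arrival gaps against it. A minimal sketch, where the 0.5 s PAT repetition limit is the TR 101 290 nominal value and the 20% margin is an assumed, operator-chosen figure:

```python
PAT_MAX_INTERVAL_S = 0.5   # TR 101 290 nominal repetition limit for the PAT
MARGIN = 0.2               # operator-chosen tolerance (assumed value): 20%

def pat_intervals_ok(arrival_times):
    """True if every gap between PAT arrivals stays within the relaxed limit."""
    limit = PAT_MAX_INTERVAL_S * (1 + MARGIN)
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return all(g <= limit for g in gaps)

# A PAT arriving 1 ms late fails the strict 0.5 s test but passes here.
print(pat_intervals_ok([0.0, 0.501, 1.002]))  # True
```

The same pattern applies to the PCR interval tests: keep the check, but size the margin to what receivers actually tolerate rather than treating the nominal limit as a cliff of its own.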
With an MPEG analyzer, broadcasters can check, measure and improve RF transmission signals. They can quickly troubleshoot RF issues using QoS measurements such as signal level and MER. Quickly identifying RF transmission issues allows broadcasters to maintain the QoS they are known for providing and keep customers satisfied.
Richard Duvall, "RF Monitoring for Television with an MPEG Analyzer," @tektronix - March 20, 2013