By Dan Murray, Tektronix
The need to deliver content from multiple sources to any screen, at any time, has made life increasingly complex for broadcasters. Managing the complexity of OTT/multiscreen workflows while still meeting customers’ expectations for an excellent, highly personalized experience is a major challenge.
For many broadcasters, the move to OTT services also means a move into cloud-based video streaming. For broadcasters who have tested the cloud and confirmed it works well enough to migrate workflows and bring up new services, one critical question often remains: “How do I make sure everything’s working fine? And if there is an issue along the workflow, do I have instruments in place to diagnose and isolate problems and proactively resolve them before customers see them?”
With the addition of streaming in the cloud, there’s no question that staying on top of QC challenges is becoming exponentially more difficult, as shown in Figure 1. The video workflow of a few years back was comparatively simple: end-users got a set-top box from the cable operator, plugged it into the TV, and that was it. It was a single experience, all controlled by the cable operator, who could push out new features whenever they wanted. Today’s world, of course, is much more complex, with consumers expecting a flawless experience on every device. There are also many screen sizes and types to consider: on one hand, there’s the small mobile screen, but there’s also 4K content distributed directly to connected TVs and the latest gaming consoles.
Figure 1. The move to multiscreen distribution makes content monitoring much more complex.
Traditionally, broadcasters and content distributors install quality assurance monitors throughout their networks to ensure that encoding is going well, that video and audio content is flowing through the network smoothly, and that sources of problems can be quickly identified. This matters because there’s a lot going on in a network. For instance, when software upgrades are being rolled out, humans are entering data and selecting options, and a wide range of things can go wrong at any point. The need for similar levels of monitoring hasn’t gone away with the move to streaming – in fact, it’s only becoming more critical.
Cloud monitoring considerations
There are multiple considerations when setting up monitoring in the cloud. One of the most fundamental is network assurance: making sure all the assets are there and the network is operating as planned. Next is verifying the quality of the video itself. It’s important that anyone doing streaming, especially live content, can assess the actual quality of the video, the audio, and all the content going across, and isn’t just looking at packets. The final piece is diagnostics and having proactive tools in place. It’s not enough to wait for an end player to say, “This consumer had a bad experience.” Proactive tools can identify issues quickly and help technicians find resolutions, ideally before consumers even know there was a problem.
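As one concrete illustration of what such a proactive check might look like, the sketch below parses an HLS media playlist (the plain-text format defined in RFC 8216) and flags any segment whose declared duration exceeds the playlist’s #EXT-X-TARGETDURATION – a packaging error that can cause player stalls if it reaches consumers. This is a minimal, hypothetical example for illustration only, not a description of any particular vendor’s monitoring toolset.

```python
# Hypothetical sketch of one proactive streaming check: scan an HLS
# media playlist (RFC 8216) for segments whose #EXTINF duration
# exceeds the declared #EXT-X-TARGETDURATION, a packaging error a
# monitor should catch before players start stalling.

def check_hls_playlist(playlist_text):
    """Return (segment_uri, duration) pairs that exceed the
    playlist's target duration."""
    target = None            # value of #EXT-X-TARGETDURATION
    pending_duration = None  # duration from the last #EXTINF tag
    violations = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-TARGETDURATION:"):
            target = float(line.split(":", 1)[1])
        elif line.startswith("#EXTINF:"):
            # Tag form is "#EXTINF:<duration>,[title]"
            pending_duration = float(line.split(":", 1)[1].split(",")[0])
        elif line and not line.startswith("#"):
            # A segment URI; pair it with the preceding #EXTINF value.
            if target is not None and pending_duration is not None:
                if pending_duration > target:
                    violations.append((line, pending_duration))
            pending_duration = None
    return violations

# Example playlist with one over-long segment (seg002.ts).
PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:5.96,
seg001.ts
#EXTINF:7.50,
seg002.ts
#EXTINF:6.00,
seg003.ts
"""

if __name__ == "__main__":
    for uri, dur in check_hls_playlist(PLAYLIST):
        print(f"ALERT: {uri} runs {dur}s, over the target duration")
```

In practice a monitor would fetch the live playlist over HTTP on a schedule and raise an alarm on violations; the parsing and threshold logic would look much like this.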
Another consideration is having a set of quality assurance tools that extend across both linear and OTT networks, whether it’s a physical network, a public cloud, or even a private cloud. Ideally, the tools should provide a common look and common dashboards for monitoring workflows, whether it’s physical or whether it’s through the cloud.
Where on-premises monitoring tools typically involve hardware, everything in the cloud is software based, which provides a lot of flexibility to scale and deploy workflows more quickly, versus having to change an entire physical network. It follows that the tools for monitoring cloud workflows must also be software based, so that as broadcasters scale to support more programs, more channels, and more regions, their monitoring solution can scale along the way.
Moving video and media delivery to the cloud involves working with cloud providers that offer high-speed media workflow services, including video processing and distribution. On top of that core, you can then layer additional functionality such as content or workflow management software as well as video monitoring solutions. For everything to work smoothly, it’s important to work with vendors who have strong relationships with each other and have thoroughly tested all the various components you are likely to require.
As shown in Figure 2 below, once cloud services and video monitoring tools are brought together, monitoring can occur at multiple points within a given workflow. In this model, monitoring starts with the raw content as it flows into the cloud and then continues through encoding and packaging, and finally, as it goes to the CDN and out to consumers.
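The value of monitoring at multiple points is fault isolation: because content flows through the stages in order, the first failing probe usually marks the origin of a problem, while downstream stages fail as a consequence. The sketch below illustrates that idea with a toy aggregator; the stage names and probe interface are illustrative assumptions, not any specific product’s API.

```python
# Illustrative sketch of multi-point workflow monitoring (stage names
# and the probe-result shape are assumptions, not a product API).
# Content flows ingest -> encode -> package -> cdn, so the first
# failing stage in that order is the most likely fault origin even
# when downstream probes also report failures.

WORKFLOW_STAGES = ["ingest", "encode", "package", "cdn"]

def isolate_fault(probe_results):
    """Given {stage: healthy?} probe results, return the first
    failing stage in workflow order, or None if all are healthy."""
    for stage in WORKFLOW_STAGES:
        if not probe_results.get(stage, False):
            return stage
    return None

if __name__ == "__main__":
    # An encoder failure cascades: package and CDN probes also fail,
    # but the aggregator still points at the encode stage.
    results = {"ingest": True, "encode": False,
               "package": False, "cdn": False}
    print("fault isolated to:", isolate_fault(results))
```

A real deployment would attach latency, bitrate, and content-quality metrics to each probe rather than a single boolean, but the ordering logic that localizes a fault is the same.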
Figure 2. Cloud services integrated with monitoring solutions allow monitoring across an entire streaming workflow.
Looking to the future
One of the questions that often comes up around OTT and high-quality streaming for live events is whether OTT is broadcast grade. As broadband and wireless network speeds improve, in some cases we’re reaching a tipping point where OTT quality has the potential to become better than “broadcast quality.” Looking to the future, it’s possible that OTT services could be the first to deliver a full range of 4K content. In this rapidly advancing world, a robust quality assurance monitoring solution is critical to ensuring a high-quality user experience, capturing market share and minimizing churn.
Dan Murray is a product manager in the Video Product Line at Tektronix and has more than 20 years of experience bringing networking, performance, and security products to market. Prior to Tektronix, he was responsible for security visibility solutions at APCON and held senior positions with Kentrox and ABC Telecommunications. He holds an electrical engineering degree from Portland State University.