Streaming the Olympics – The Challenge of Broadcasting Modern-Day Spectacles

Submitted by Arpad Kun, Sr. Director of Network Operations at Ustream

Video streaming the Olympic Games has been a challenge for as long as streaming services have existed. Unfortunately, massively large-scale, long-running events like the Olympics do not happen often enough to give the ecosystem a granular learning curve. While the video industry is rapidly learning how to deliver content over-the-top (OTT) on demand, live events at global scale still challenge many providers, including network operators. Prepositioning content on caches deep inside eyeball networks is the direction in which we see the industry moving to help OTT delivery; with live events at such huge scale, however, pre-warming caches is only one of the challenges. The number of Olympic contests happening at the same time (parallel channels) multiplies the amount of raw video data to be moved from production through encoding and onto the eyeball networks, and this high concurrency, combined with demand for diverse live content, makes streaming the Olympics an immense challenge.
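
To make the pre-warming problem concrete, here is a minimal sketch in Python. The cache class, channel names, and segment counts are all hypothetical illustrations, not any real CDN API; the point is that with N parallel channels and M edge caches, pre-warming is N×M pushes, which is itself a delivery problem at Olympic scale.

```python
from concurrent.futures import ThreadPoolExecutor

class EdgeCache:
    """Toy stand-in for an edge cache deep inside an eyeball network."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def push(self, key, payload):
        # Pre-position content before viewers arrive
        self.store[key] = payload

def prewarm(caches, channels, segments_per_channel=3):
    """Push the opening segments of every parallel channel to every cache.

    With N channels and M caches this is N * M * segments pushes,
    all of which must complete before the event goes live.
    """
    def warm_one(cache):
        for ch in channels:
            for seg in range(segments_per_channel):
                cache.push(f"{ch}/seg{seg}.ts", b"...")
        return cache.name

    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(warm_one, caches))

caches = [EdgeCache(f"isp-edge-{i}") for i in range(4)]
channels = ["swimming", "athletics", "gymnastics"]
done = prewarm(caches, channels)
```

Even in this toy version, adding one more parallel channel grows the pre-warm workload across every cache at once, which is why the number of simultaneous Olympic contests matters so much.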

Bandwidth Constraint

The bandwidth constraint between the cloud and external networks such as ISPs and enterprise networks is becoming more acute as the industry moves toward centralized computing. The cloud delivers great value when it comes to cost reduction by sharing computing resources. However, the trend toward centralization creates new challenges as new and existing live and on-demand streaming providers (such as OTT services) rely on the cloud. It is easy to start in the cloud, but when the business grows and its delivery has to scale, streaming providers become reliant upon third-party Content Delivery Networks (CDNs).

CDN Silos

CDNs are vertical silos in terms of functionality, feature set and reach. If a content provider wants to deliver on a global scale, it has to integrate with a large variety of different CDN providers, as no single CDN has adequate global capacity. The integration also requires working with each individual CDN using commonly supported standards to provide consistent quality of service and scalability across regions and providers. In addition, CDNs rarely reach deep inside ISPs or enterprise networks, where the demand for bandwidth increases steeply as video traffic shifts from legacy broadband cable services toward IP, often causing capacity issues inside ISP, wireless carrier, and even enterprise networks.
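
One way to picture the multi-CDN integration problem is as a selection layer that maps each viewer's region to a provider with footprint there, falling back when the preferred provider has no reach. The provider names and coverage sets below are purely illustrative, not real capacity data or Ustream's actual SDCDN logic:

```python
# Hypothetical multi-CDN coverage map: which illustrative provider
# can reach which region. No single CDN covers everything.
CDN_COVERAGE = {
    "cdn-a": {"us", "eu"},
    "cdn-b": {"eu", "apac"},
    "cdn-c": {"latam", "us"},
}

def pick_cdn(region, preferred_order=("cdn-a", "cdn-b", "cdn-c")):
    """Return the first CDN in priority order that covers the region."""
    for cdn in preferred_order:
        if region in CDN_COVERAGE[cdn]:
            return cdn
    raise LookupError(f"no CDN covers region {region!r}")
```

The hard part in practice is not this lookup but everything it abstracts away: each provider's different APIs, feature sets, and reporting, which is exactly the silo problem described above.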

The Problem of Traffic Delivery at Scale

There is tremendous tension between the ever-accelerating pace of change in how we use the Internet in everyday life, together with our high expectations of Internet services (for example, the quality of live and on-demand video streaming), and the existing ecosystem, with its inherent limitations, its variety of players pursuing their own special interests, and the drive toward stricter governance of the Internet.

While I believe that the issue of content delivery at scale is a larger problem and needs to be addressed at a fundamental level for the long term, the general issues affecting OTT media streaming are clearly understood by all industry players today.

While the FCC is trying to maintain a balance between the needs and influence of content creators and owners, aggregators, ISPs, entrepreneurs, etc., most of the talk in the media revolves around the legal aspects of the issue. There are few discussions about the technological aspects of the accelerating revolution and the state of the ecosystem that has to carry all this content at the end of the day.

We can clearly see the different strategies from several sides of the ecosystem addressing the issue of delivering content at scale: Netflix is shrinking file sizes (re-encoding) to reduce bandwidth consumption. T-Mobile, AT&T, Comcast and other ISPs are trying to get more control over the immense video traffic traversing their networks in various ways. Providers must also manage the economic cost of upgrading their existing networks and the speed with which the upgrade can be physically implemented. While we can debate where the different companies are positioned in addressing these issues, industry experts tend to agree on one key fact: if all TV programming, including prime-time, were moved to IP, networks would crumble because the ecosystem is not yet prepared to handle those kinds of load levels.
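
The scale claim can be sanity-checked with simple arithmetic. The viewer count and bitrate below are illustrative assumptions of my own, not figures from measurement:

```python
# Back-of-envelope: what would a prime-time-scale audience demand
# if every stream were delivered as IP unicast?
concurrent_viewers = 100_000_000   # assumed prime-time-scale audience
bitrate_mbps = 5                   # assumed typical HD adaptive-bitrate rendition

# 1 Tbps = 1,000,000 Mbps
total_tbps = concurrent_viewers * bitrate_mbps / 1_000_000
# 100M viewers * 5 Mbps = 500 Tbps of simultaneous traffic
```

Hundreds of terabits per second of concurrent unicast traffic is far beyond what today's interconnects and last-mile networks are provisioned for, which is the substance of the "networks would crumble" warning.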

Access networks (ISPs, cell/wireless carriers), especially those tied closely to content producer companies, often offer content delivery services as well as media streaming services. This trend makes it difficult to follow who is competing with whom, and further complicates the FCC's determination process to identify whether there is a net neutrality issue or a technical issue (or a combination of the two). While assuring seamless content delivery is obviously in everyone's best interest, the line between the legal and the technical side is blurred, and there is little agreement on what part each of the players (content owners, aggregators, distributors, etc.) should take in solving the delivery problem. There are several existing audio and video streaming technology solutions and companies, creating a healthy competitive environment, but no single service, nor all of them combined, could solve the delivery-at-scale problem, largely because they all rely on the same and/or shared resources when it comes to reaching viewers' eyeballs. Deep-caching initiatives on ISP and enterprise networks will be key for the industry to keep moving at its current pace, buying cable and enterprise network operators the time they need to upgrade their networks to meet demand. They will be essential to sustaining the momentum we see in the industry today.

In my view, while the FCC is in a really tough position, it has played it right so far. Despite intensive lobbying from multiple sides and the challenges of understanding this increasingly complex ecosystem, I believe that the FCC took the right direction, protecting the Internet and the interests of end-users for the long term.

However, regardless of how good or bad the governance of the Internet is or will be in the future, a solution to the problem of content delivery at scale must be developed, and it must come sooner rather than later. This solution will not come from outside the ecosystem; governance will not solve the technical problems or ease the challenges facing Internet end-users. One long-term solution could be to commoditize content delivery, but that is further down the road. To move forward, the industry players have to come together, work as a team to serve their common customer (the subscriber and/or viewer), and provide the convenience and quality we have come to need and expect.


About Arpad Kun

Arpad Kun is Sr. Director of Network Operations at Ustream and responsible for Ustream's Content Delivery, Network and Infrastructure strategy. Arpad is the inventor of SDCDN at Ustream, the Software Defined Content Delivery Network. SDCDN takes legacy CDNs and creates an abstraction layer spanning multiple providers, providing increased performance, redundancy and unparalleled cost optimization. Arpad has extensive experience in leadership, HA systems, IP connectivity, disaster recovery, security audits, risk assessment, cost modeling and contract negotiation.

Broadcast Beat Magazine is an Official NAB Show Media partner and we cover Broadcast Engineering, Radio & TV Technology for the Animation, Broadcasting, Motion Picture and Post Production industries. We cover industry events and conventions like BroadcastAsia, CCW, IBC, SIGGRAPH, Digital Asset Symposium and more!