Live Pipeline
The trend has been to separate encoding from the packaging and encryption functions to provide greater flexibility. Depending on the resilience of the design (N, N+1 or 2N encoders per live stream), the number of single-channel encoders can be significant. It is therefore important to architect a live streaming solution that provides the required level of resiliency while remaining cost-efficient overall. The most common approaches to encoding live channel streams and VOD assets involve either a dedicated appliance or a software solution on commodity hardware, with the appropriate interface cards for ingesting the live stream. Encoding density can be increased through the use of GPU-based hardware encoding appliances; a single device can handle 20 live SD input streams, each encoded to four separate bit rates, or 10 live HD input streams, each encoded to six bit rates.
Encoders and packagers are typically deployed in pairs, with streams spread across multiple devices as appropriate. Failure of a single device can, for a short time, reduce the number of bitrates available, but stream delivery continues with the remaining bitrates. A separate management application monitors availability and automatically handles failover between devices, as sketched below.
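As a rough illustration of how such a management application might behave, the following minimal Python sketch polls a hypothetical health endpoint on each encoder and promotes the standby device when the active one stops responding. The hostnames, endpoint paths and polling interval are assumptions for illustration, not part of any specific product.

```python
import time
import urllib.request

# Hypothetical encoder fleet: each channel has a primary and a standby device.
ENCODERS = {
    "channel-1": {"primary": "http://enc-01.example.net/health",
                  "standby": "http://enc-02.example.net/health"},
}

def is_healthy(url: str) -> bool:
    """Return True if the device answers its health endpoint within 2 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False

def monitor(active: dict) -> None:
    """Poll each channel's active encoder and fail over to the other device if it stops responding."""
    for channel, devices in ENCODERS.items():
        role = active.get(channel, "primary")
        if not is_healthy(devices[role]):
            other = "standby" if role == "primary" else "primary"
            if is_healthy(devices[other]):
                active[channel] = other
                print(f"{channel}: failing over from {role} to {other}")

if __name__ == "__main__":
    active_roles = {}
    while True:
        monitor(active_roles)
        time.sleep(5)
```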
Separating packaging from the encoding process provides a scalable solution that supports all of the major adaptive streaming protocol standards in use today and offers the flexibility to package to different formats from a single H.264 source.
The Origin Server is the publishing point (gateway) for live, offline and nPVR content, and provides an advanced, feature-rich, virtualised and centrally managed environment. The Origin ingests the products created by the live and offline transcoders and makes them available for delivery to multiple CDN providers. In addition to live simulcast and VOD applications, the Origin Server also enables Start Over, Catch Up and nPVR applications. The Origin needs to be highly scalable in both ingest and streaming capacity. Depending on the specific project requirements, the Origin can be a simple HTTP server, or a sophisticated software stack that includes dynamic repackaging on the fly and indexing into recorded live channels, as described below.
For VOD offline content, the Origin provides a single point of interface for asset management functionality. Once offline content is ingested to the Origin, it is available for an external back office system (such as a content management system, or CMS) to manage the asset life cycle. Simple commands such as list and delete are available through a RESTful API, and content information such as size, path and type can be queried through a RESTful asset management interface.
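As a hedged illustration only, the endpoint paths and field names below are hypothetical; the sketch simply shows the kind of list, query and delete calls a CMS might make against such a RESTful asset management interface.

```python
import json
import urllib.request

ORIGIN = "http://origin.example.net/api/v1"  # hypothetical base URL

def list_assets() -> list:
    """List the assets currently held on the Origin (illustrative endpoint)."""
    with urllib.request.urlopen(f"{ORIGIN}/assets") as resp:
        return json.load(resp)

def asset_info(asset_id: str) -> dict:
    """Query size, path and type metadata for a single asset."""
    with urllib.request.urlopen(f"{ORIGIN}/assets/{asset_id}") as resp:
        return json.load(resp)  # e.g. {"size": 1234567, "path": "/vod/...", "type": "mp4"}

def delete_asset(asset_id: str) -> None:
    """Remove an asset at the end of its life cycle."""
    req = urllib.request.Request(f"{ORIGIN}/assets/{asset_id}", method="DELETE")
    urllib.request.urlopen(req)

if __name__ == "__main__":
    for asset in list_assets():
        print(asset["id"], asset_info(asset["id"]))
```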
Catch Up, Start Over TV and nPVR applications are made possible by the Origin's sliding-window ingest of the live channel. Since the Origin captures live TV content, it is able to keep the channel content in its storage for retrieval during or after the show. The captured content can be obtained by clients via a simple HTTP request, with the URL constructed from the channel name and the start and/or end time of the event. Playlist files can also be generated on the fly based on the HTTP request and returned to the client. When the client downloads the segments listed in the playlist, the Origin delivers them from its storage.
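The URL scheme below is purely illustrative: the text describes constructing the request from the channel name and the event's start and end times, so a minimal sketch, assuming a hypothetical Origin hostname and query format, might look like this.

```python
from datetime import datetime, timezone
import urllib.parse

ORIGIN = "http://origin.example.net"  # hypothetical host

def catchup_playlist_url(channel: str, start: datetime, end: datetime) -> str:
    """Build a catch-up playlist request from channel name and event start/end times."""
    query = urllib.parse.urlencode({
        "start": start.strftime("%Y%m%dT%H%M%SZ"),
        "end": end.strftime("%Y%m%dT%H%M%SZ"),
    })
    return f"{ORIGIN}/catchup/{channel}/playlist.m3u8?{query}"

# Example: retrieve an hour-long show that aired earlier in the evening.
url = catchup_playlist_url(
    "channel-1",
    datetime(2013, 5, 1, 20, 0, tzinfo=timezone.utc),
    datetime(2013, 5, 1, 21, 0, tzinfo=timezone.utc),
)
print(url)
```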
For large-scale implementations, Origin storage needs to scale from hundreds of terabytes to petabytes. The storage also needs to sustain the read and write rates required to capture and redistribute multiple streams of video effectively.
By utilising clustered shared storage, the Origin Server is highly scalable. Additional storage servers or Origins can be added independently to scale up storage capacity and ingest/download capacity. High availability is a core feature, with an active-active Origin Server configuration; the Origin servers make use of clustered storage with built-in redundancy. A typical Origin configuration with clustered storage can ingest up to 40 channels (320 Mbps) for live and output 1 Gbps. Storage can be scaled in both capacity and throughput by adding storage or I/O nodes, while delivery throughput is scaled by adding additional Origin servers.
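Treating the figures quoted above (40 channels at roughly 320 Mbps ingest and 1 Gbps output per Origin) as per-server assumptions, a back-of-envelope sizing calculation might look like the following sketch; the numbers are illustrative, not product limits.

```python
import math

# Figures quoted in the text, treated here as per-Origin assumptions.
CHANNELS_PER_ORIGIN = 40        # live ingest capacity per Origin
INGEST_MBPS_PER_ORIGIN = 320    # i.e. roughly 8 Mbps of renditions per channel
OUTPUT_GBPS_PER_ORIGIN = 1.0    # delivery throughput per Origin

def origins_needed(channels: int, peak_output_gbps: float) -> int:
    """Return how many Origin servers are needed to satisfy both ingest and delivery."""
    for_ingest = math.ceil(channels / CHANNELS_PER_ORIGIN)
    for_delivery = math.ceil(peak_output_gbps / OUTPUT_GBPS_PER_ORIGIN)
    return max(for_ingest, for_delivery)

# Example: 120 live channels and a 5 Gbps peak towards the CDNs.
print(origins_needed(120, 5.0))  # -> 5
```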
Content Delivery
Allowing viewers to view content on an anywhere, anytime basis requires a unicast stream to be created from a video delivery node to the viewer's connected device. The content delivery cost incurred is based on the amount of data shipped within the stream and is calculated on a cost-per-GB basis by global CDN suppliers such as Level 3, Akamai and Limelight. Allowing viewers to watch content whenever they wish therefore incurs an additional per-stream cost compared with a broadcast delivery mechanism such as DVB-S, DVB-C or DVB-T, and the overall cost of delivering a channel is dependent on the volume of users. Recent research by IHS Screen Digest demonstrates that unicast delivery of some channels can be more cost-effective than the cost of a DVB broadcast slot, as illustrated in the figure.
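As a rough worked example of the cost-per-GB model (the bitrate and price below are assumptions, not figures from the text or from any particular CDN contract), the per-viewer cost of an hour of unicast delivery can be estimated as follows.

```python
def unicast_cost_per_viewer_hour(bitrate_mbps: float, price_per_gb: float) -> float:
    """Estimate the CDN cost of one viewer watching one hour of a unicast stream."""
    gigabytes = bitrate_mbps * 3600 / 8 / 1000  # Mbps over one hour -> GB
    return gigabytes * price_per_gb

# Example assumptions: a 3 Mbps stream at $0.02 per GB.
cost = unicast_cost_per_viewer_hour(3.0, 0.02)
print(f"${cost:.3f} per viewer-hour")  # -> roughly $0.027
```

Unlike broadcast, this cost scales linearly with the number of concurrent viewers, which is why the comparison against a fixed-cost DVB slot depends so heavily on audience size.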
The increase in popularity of HTTP-based delivery using technologies such as Microsoft Smooth Streaming, Apple HLS, Adobe HTTP Dynamic Streaming or DASH has allowed OTT platforms to commoditise video delivery more than ever before, and pushes the CDN itself down the value chain. Increasingly, operators are choosing either to run their own CDN, through On-Net CDNs, public cloud hosted caching servers or on-premises deployments, or simply to use multiple commercial CDNs, allowing aggressive financial negotiation for total cost reduction.
On-Network (OnNet) CDNs bring benefits to multiple parties:
ISPs
An OnNet CDN allows an ISP to guarantee a better level of service to its customers. Although ISPs have traditionally been cut out of the equation within the OTT space, consumers are naturally drawn towards a network provider who can guarantee a better quality-of-service experience.
Operators
OnNet CDN deployments have lower on-going operational costs, because content is stored closer to users, expensive transit networks are avoided, and deployment can be done without the involvement of third-party vendors. Netflix's recent OpenConnect project is an example of an operator aggressively partnering with ISPs, either through the deployment of appliances into ISP networks or by peering at various packet exchanges for low-cost, low-latency, high-bandwidth delivery.
Consumers
Fundamentally, consumers simply care about content. The closer the content is stored to them, the quicker playback can begin.
HTTP content delivery is not without its critics and drawbacks. The biggest criticism usually centres on analytics and data, which comes down to the lack of a two-way communication channel. With RTMP, information is constantly fed back to the media server, allowing operators to easily construct information about the number of concurrent viewers, historical views and playbacks. This kind of information is commonly required by content providers as part of a content deal, and hence has commercial implications. Additionally, the lack of this feedback loop means that the server cannot determine the bandwidth capabilities of a client, the current progress through a video for resume purposes, and so on.
These problems push more logic into the player itself. A player must be instrumented with a heuristics engine to determine the total bandwidth available, the quality of playback and the current progress. This functionality is, however, common in HTTP streaming players; OSMF, for example, introduced adaptive streaming behaviour and Quality of Service (QoS) monitoring. Many platforms further instrument their players to send constant heartbeat data back to a centralised server for stream concurrency information, view counts, and so on.
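A minimal sketch of such a heartbeat, assuming a hypothetical analytics endpoint and payload fields (nothing here reflects any particular player framework), might post playback state at a fixed interval like this.

```python
import json
import time
import urllib.request

ANALYTICS_URL = "http://analytics.example.net/heartbeat"  # hypothetical endpoint

def send_heartbeat(session_id: str, position_s: float, bitrate_kbps: int, buffering: bool) -> None:
    """Post one heartbeat describing the current playback state."""
    payload = json.dumps({
        "session": session_id,
        "position": position_s,    # resume point
        "bitrate": bitrate_kbps,   # currently selected rendition
        "buffering": buffering,    # simple QoS signal
        "timestamp": time.time(),
    }).encode()
    req = urllib.request.Request(
        ANALYTICS_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=2)

# A player would call this every few seconds; concurrent viewers can then be
# derived server-side from the number of distinct sessions seen per interval.
```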
HTTP delivery is not without its challenges. Because of the focus on adaptive bitrates, multiple transcoded profiles must exist for every asset on a platform, which can lead to a large increase in storage and compute requirements. Several CDNs have attempted to claw their way back up the value chain by allowing operators to provide a single high-quality asset and performing on-demand re-transcoding to various profiles at a cost premium, but this can prove difficult. Re-transcoding an asset must be done while that asset is unencrypted, which ultimately leaves content providers uneasy about storing assets in the clear outside an operator-controlled network.
Managing the experience across multiple devices
In order to provide consumers of OTT platforms with maximum flexibility for viewing content, a compelling service needs to be able to deliver a consistent user experience across different families of connected devices, such as games consoles, connected TVs, hybrid set-top-boxes and a range of mobile devices including smartphones, tablets and phablets.
Form Factor
Each type of device family has different form factors and methods of input (touch screen, remote control, gesture) which present challenges for the underlying software solution to resolve in order to provide multiscreen delivery. From a brand and usability position, it is important that the interface remains consistent across different devices but adapts to the unique features of each one. To achieve this, the implementation must focus on ensuring the information architecture of the service is designed in such a way that the presentation layer works intuitively on each device type, with content sized appropriately for each form factor.
Certification
Each device manufacturer may have an internal certification process that any application developed for the device needs to pass before being launched onto the platform. This certification process can add considerable overhead and cost for MVPDs who wish to deliver their service offerings to multiple connected devices.
Content Protection
Each device family may only support a single digital rights management (DRM) technology. Therefore, to ensure that content can be played out in a secure manner, the OTT platform may need to encrypt content using several different DRM technologies. This is an important consideration when planning which devices will be supported by an OTT platform.
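As an illustrative sketch only (the pairings below are common examples, not an exhaustive or authoritative support matrix, and real coverage varies by device model and OS version), the planning exercise often starts from a simple mapping of target device families to the DRM each one supports.

```python
# Illustrative device-family to DRM mapping used during platform planning.
DEVICE_DRM = {
    "apple_devices": "FairPlay",
    "android": "Widevine",
    "xbox_smooth_streaming": "PlayReady",
    "connected_tv_vendor_x": "vendor-specific",  # hypothetical placeholder
}

def required_drms(target_devices: list[str]) -> set[str]:
    """Return the set of DRM technologies the platform must package content for."""
    return {DEVICE_DRM[d] for d in target_devices if d in DEVICE_DRM}

print(required_drms(["apple_devices", "android", "xbox_smooth_streaming"]))
# -> the set {'FairPlay', 'Widevine', 'PlayReady'}, i.e. three separate encryption workflows
```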
Device SDKs
Each device may require expertise in different programming languages and SDKs. For connected TVs, each manufacturer has implemented a different approach to developing for the target device, in terms of development language and SDK functionality. Although the future may standardise towards HTML5, it is clear that each provider will still want to differentiate their devices, meaning that delivering an OTT solution to multiscreen devices will remain a complex, challenging project.
Mark Christie is Chief Technology Officer (CTO) with KIT digital and is responsible for product strategy, research and development and the global managed service operation that KIT provides to its customers.