What is adaptive streaming?

Adaptive bit rate streaming was introduced by Move Networks and is now being developed and utilized by Adobe Systems, Apple, Microsoft and Octoshape.[18] In October 2010, Move Networks was awarded a patent for their adaptive bit rate streaming (US patent number 7,818,444).[19]

MPEG-DASH

MPEG-DASH is the only adaptive bit-rate HTTP-based streaming solution that is an international standard.[20] MPEG-DASH technology was developed under MPEG. Work on DASH started in 2010; it became a Draft International Standard in January 2011 and an International Standard in November 2011.[20][21][22] The MPEG-DASH standard was published as ISO/IEC 23009-1:2012 in April 2012.

MPEG-DASH is a technology related to Adobe Systems' HTTP Dynamic Streaming, Apple Inc.'s HTTP Live Streaming (HLS) and Microsoft's Smooth Streaming.[23] DASH is based on Adaptive HTTP Streaming (AHS) in 3GPP Release 9 and on HTTP Adaptive Streaming (HAS) in Open IPTV Forum Release 2.[24] As part of its collaboration with MPEG, 3GPP Release 10 has adopted DASH (with specific codecs and operating modes) for use over wireless networks.[24]

Standardizing an adaptive streaming solution is meant to provide confidence to the market that the solution can be adopted for universal deployment, compared to similar but more vendor-centric solutions such as HLS by Apple, Smooth Streaming by Microsoft, or HDS by Adobe.

Available implementations are the HTML5-based bitdash MPEG-DASH player[25] as well as the open source C++-based DASH client access library libdash of bitmovin GmbH,[15] the DASH tools of the Institute of Information Technology (ITEC) at Alpen-Adria University Klagenfurt,[3][26] the multimedia framework of the GPAC group at Telecom ParisTech,[27] and the dash.js[28] player of the DASH-IF.

Adobe HTTP Dynamic Streaming

"HTTP Dynamic streaming is the process of efficiently delivering streaming video to users by dynamically switching among different streams of varying quality and size during playback. This provides users with the best possible viewing experience their bandwidth and local computer hardware (CPU) can support. Another major goal of dynamic streaming is to make this process smooth and seamless to users, so that if up-scaling or down-scaling the quality of the stream is necessary, it is a smooth and nearly unnoticeable switch without disrupting the continuous playback."[29]
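The switching decision described in the quote can be sketched as a simple throughput- and buffer-driven heuristic. This is an illustrative sketch only, not Adobe's actual algorithm; the safety margin and buffer threshold are assumed values.

```python
def choose_bitrate(bitrates, measured_kbps, buffer_s,
                   safety=0.8, low_buffer_s=5.0):
    """Pick the highest bitrate sustainable at the measured throughput.

    bitrates: available stream bitrates in kbps, in any order.
    safety: fraction of the throughput the stream is allowed to consume
            (assumed margin, leaves headroom for variability).
    If the playback buffer is nearly empty, drop to the lowest
    bitrate to avoid a stall.
    """
    levels = sorted(bitrates)
    if buffer_s < low_buffer_s:
        return levels[0]
    budget = measured_kbps * safety
    usable = [b for b in levels if b <= budget]
    return usable[-1] if usable else levels[0]
```

With a 2000 kbps measured throughput and a healthy buffer, the 80% safety margin yields a 1600 kbps budget, so the 1600 kbps rendition is selected; if the buffer runs low, the client falls back to the lowest rendition regardless of throughput.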

The latest versions of Flash Player and Flash Media Server support adaptive bit-rate streaming over the traditional RTMP protocol as well as over HTTP, similar to the HTTP-based solutions from Apple and Microsoft;[30] HTTP dynamic streaming is supported in Flash Player 10.1 and later.[31] HTTP-based streaming has the advantage of not requiring any firewall ports to be opened beyond the normal ports used by web browsers. It also allows video fragments to be cached by browsers, proxies, and CDNs, drastically reducing the load on the source server.

Apple HTTP Live Streaming

HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple.[32] The segmenter is also responsible for producing a set of index files in the M3U8 format, which act as playlists for the video chunks. Each playlist pertains to a given bitrate level and contains the relative or absolute URLs to the chunks of the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
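The playlist mechanism above can be illustrated with a minimal client-side sketch that parses a master M3U8 playlist and picks a variant by its declared bandwidth. The playlist text, URIs, and resolutions here are hypothetical examples for illustration, not Apple's reference output.

```python
import re

# Hypothetical master playlist listing three bitrate variants.
MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=416x234
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=640x360
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
high/index.m3u8
"""

def parse_master(text):
    """Return (bandwidth, uri) pairs from an M3U8 master playlist."""
    variants = []
    lines = text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            m = re.search(r"BANDWIDTH=(\d+)", line)
            if m and i + 1 < len(lines):
                variants.append((int(m.group(1)), lines[i + 1]))
    return variants

def pick_variant(variants, available_bps):
    """Highest declared bandwidth that fits, else the lowest variant."""
    fitting = [v for v in variants if v[0] <= available_bps]
    return max(fitting) if fitting else min(variants)
```

A client measuring roughly 2 Mbit/s of available bandwidth would request the 1.2 Mbit/s variant's playlist and fetch its chunks until conditions change.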

HTTP Live Streaming is a standard feature in iPhone OS 3.0 and newer versions.[33]

Apple has submitted its solution to the IETF for consideration as an Informational Request for Comments.[34] A number of proprietary and open source solutions exist for both the server implementation [segmenter] and the client player.

HLS streams can be identified by the .m3u8 extension of the playlist URL. These adaptive streams can be made available in many different bitrates, and the client device interacts with the server to obtain the best bitrate that can reliably be delivered. Client devices include iPads, iPhones, set-top boxes (STBs) and other suitable devices.

Playback of HLS is natively supported only in Safari on iOS and Mac and in Microsoft Edge on Windows 10. Solutions for playback of HLS on other platforms mostly rely on third-party plug-ins such as Flash or QuickTime.

Microsoft Smooth Streaming

Smooth Streaming is an IIS Media Services extension that enables adaptive streaming of media to clients over HTTP.[35] The format specification is based on the ISO base media file format and standardized by Microsoft as the Protected Interoperable File Format.[36] Microsoft is actively involved with 3GPP, MPEG and DECE organizations' efforts to standardize adaptive bit-rate HTTP streaming. Microsoft provides Smooth Streaming Client software development kits for Silverlight and Windows Phone 7, as well as a Smooth Streaming Porting Kit that can be used for other client operating systems, such as Apple iOS, Android, and Linux.[37] IIS Media Services 4.0, released in November 2010, introduced a feature which enables Live Smooth Streaming H.264/AAC videos to be dynamically repackaged into the Apple HTTP Adaptive Streaming format and delivered to iOS devices without the need for re-encoding. Microsoft has successfully demonstrated delivery of both live and on-demand 1080p HD video with Smooth Streaming to Silverlight clients. In 2010, Microsoft also partnered with NVIDIA to demonstrate live streaming of 1080p stereoscopic 3D video to PCs equipped with NVIDIA 3D Vision technology.[38]

QuavStreams Adaptive Streaming over HTTP

QuavStreams Adaptive Streaming is a multimedia streaming technology developed by Quavlive. The streaming server is an HTTP server that has multiple versions of each video, encoded at different bitrates and resolutions. The server delivers the encoded video/audio frames, switching from one level to another according to the currently available bandwidth. The control is entirely server-based, so the client does not need special additional features. The streaming control employs feedback control theory.[39] Currently, QuavStreams supports H.264/MP3 codecs muxed into the FLV container and VP8/Vorbis codecs muxed into the WebM container.
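The server-side, feedback-controlled switching described above can be illustrated with a toy proportional controller that steers the client's buffer toward a setpoint by throttling the send rate, then picks the highest encoding level that fits. The gain, setpoint, and level-selection rule are assumptions for this sketch, not Quavlive's actual controller.

```python
def select_level(levels_kbps, send_rate_kbps):
    """Highest encoding level not exceeding the allowed send rate."""
    fitting = [l for l in sorted(levels_kbps) if l <= send_rate_kbps]
    return fitting[-1] if fitting else min(levels_kbps)

def control_step(buffer_s, target_s, measured_kbps, kp=0.1):
    """Proportional control: shrink the allowed send rate when the
    client's buffer is below target, allow more when it is above."""
    error = buffer_s - target_s          # seconds of buffer surplus/deficit
    return measured_kbps * (1.0 + kp * error)
```

For example, a buffer 5 s below a 10 s target cuts the allowed send rate in half with the assumed gain, pushing the server down to a lower encoding level until the buffer recovers.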

Uplynk

Uplynk delivers HD adaptive bitrate streaming to multiple platforms, including iOS, Android, Windows, Mac, Linux, and Roku, across various browser combinations, by encoding video in the cloud using a single non-proprietary adaptive streaming format. Rather than streaming and storing multiple formats for different platforms and devices, Uplynk stores and streams only one. The first studio to use this technology for delivery was Disney-ABC Television Group, using it for video encoding for web, mobile and tablet streaming apps on the ABC Player, ABC Family and Watch Disney apps, as well as the live Watch Disney Channel, Watch Disney Junior, and Watch Disney XD channels.[40][41]

Self-learning clients

In recent years, the benefits of self-learning algorithms in adaptive bitrate streaming have been investigated in academia. While most of the initial self-learning approaches are implemented at the server side[42][43][44] (e.g. performing admission control using reinforcement learning or artificial neural networks), more recent research focuses on the development of self-learning HTTP Adaptive Streaming clients. Multiple approaches have been presented in the literature using the SARSA[45] or Q-learning[46] algorithms. In all of these approaches, the client state is modeled using, among other things, information about the current perceived network throughput and the buffer filling level. Based on this information, the self-learning client autonomously decides which quality level to select for the next video segment. The learning process is steered using feedback information representing the Quality of Experience (QoE) (e.g. based on the quality level, the number of switches and the number of video freezes). Furthermore, it was shown that multi-agent Q-learning can be applied to improve QoE fairness among multiple adaptive streaming clients.[47]
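A much-simplified Q-learning client in the spirit of the approaches cited above can be sketched as follows. The state discretization, the QoE reward weights, and the learning parameters are assumptions made for the sketch, not the exact formulation of any of the cited papers.

```python
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # assumed learning parameters
QUALITIES = [0, 1, 2]                    # indices of available quality levels

def state(throughput_kbps, buffer_s):
    """Discretize perceived throughput and buffer level into a state."""
    return (min(int(throughput_kbps // 1000), 4),
            min(int(buffer_s // 5), 4))

def reward(quality, switched, froze):
    """Toy QoE reward: favor quality, penalize switches and freezes."""
    return quality - 0.5 * switched - 4.0 * froze

Q = defaultdict(float)  # Q-values, keyed by (state, action)

def choose(s):
    """Epsilon-greedy selection of the next segment's quality level."""
    if random.random() < EPSILON:
        return random.choice(QUALITIES)
    return max(QUALITIES, key=lambda a: Q[(s, a)])

def update(s, a, r, s_next):
    """Standard one-step Q-learning update after a segment download."""
    best_next = max(Q[(s_next, a2)] for a2 in QUALITIES)
    Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
```

Per downloaded segment, the client observes its new throughput and buffer state, computes the QoE reward for the choice it just made, and applies the update; over many segments the policy converges toward quality decisions that trade off quality against switches and freezes.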
