HTTP Streaming Goal and Scope Discussion
draft-wu-http-streaming-optimization-ps-03
IETF 79 – Beijing, November 2010
Qin Wu, [email protected]
Purpose of this presentation
• Food for thought
– Note that there are lots of open questions, and the slides have many gaps we hope to fill in as a result of the discussion at the ad hoc and later on the mailing list.
Outline
• Goal of Ad-hoc
• Introduction to HTTP streaming
• Why is HTTP Streaming a popular topic?
• Existing HTTP Streaming Work and Model
• Problems for discussion
• Next Step
Goal of Ad-Hoc
• Goal
– Talk about HTTP streaming.
• Discuss possible directions forward:
– Define a protocol or extensions for server, client and/or smart cache capabilities to: 1) satisfy subscriber QoE requirements for real-time applications; 2) support interoperability with existing streaming technology; 3) provide efficient delivery mechanisms and schemes.
What is HTTP streaming?
• Streaming is a method of transmitting data over the network as a steady, continuous stream, allowing playback to proceed while subsequent data is still being received.
• HTTP streaming refers to a streaming service in which the HTTP protocol is used for basic transport of the media data.
– A streaming service enables streaming content to be received and rendered simultaneously.
– To reduce large packet dropout due to TCP, the streaming media may be segmented into many chunks.
– HTTP-based progressive download is a special case of HTTP streaming.
Why is HTTP Streaming a popular topic?
• The main motivations for "why HTTP" are web-based streaming and multi-screen video transport.
• With multi-screen video support, a common user experience can be provided across PCs, TVs, smartphones, tablets and cars.
• Since almost all clients have browser support, HTTP streaming is an obvious choice for multi-screen video delivery.
• There is some existing work in 3GPP, MPEG and elsewhere – more in the gap slides.
A trend seen by everyone
• For the most popular video websites, HTTP streaming + CDN is the way to go:
– YouTube for video sharing
– Hulu for free high-quality online video viewing
• Statistics reports show the same general picture:
– According to the ATLAS Internet Observatory 2009 annual report, streaming, CDN and direct download are growing, replacing P2P as the dominant mechanism for sharing/distributing video.
– According to the Global Mobile Broadband Traffic Report from Allot Communications, HTTP streaming is the fastest-growing application, with a rise of over 50%.
Existing Work or Components
• Media Fragments URI (W3C)
• HTML5 video playback elements (W3C)
• Server-Sent Events (W3C)
• WebSocket API (W3C)
• Media Presentation Description (3GPP)
– Client and Server Manifest in Microsoft
– M3U playlist in Apple
– F4F manifest in Adobe
• Streaming File Format (3GPP)
• WebSocket Protocol (IETF)
• More in the gap slides… save discussion for the next presentation
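As one concrete example of a manifest from the list above, an Apple M3U playlist (the format described in draft-pantos-http-live-streaming) for a short presentation might look like this; the segment names and durations are illustrative:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
fileSequence0.ts
#EXTINF:10,
fileSequence1.ts
#EXTINF:10,
fileSequence2.ts
#EXT-X-ENDLIST
```

Each #EXTINF line gives a segment duration, and the following line gives the URI of the media chunk the client should GET next.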
Existing HTTP Streaming Model (Client-Based Pull)
• Media is split into a series of data chunks
• If several bit rates are available, the client can choose between chunks of different sizes or bit rates.
• The client first acquires a manifest file from the streaming server containing a reference (e.g. a URI) to each media chunk, then requests the media chunks by issuing a sequence of HTTP request messages to the server.
1. Relies on the client to handle buffering and playback during download
2. Best-effort delivery
3. Relies on existing web infrastructure
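A minimal sketch of the client-based pull model above, in Python. The manifest format (one chunk URI per line) and the URLs are illustrative placeholders, not a format defined in this deck:

```python
# Client-based pull: acquire a manifest listing chunk URIs, then request
# each media chunk in sequence over HTTP.
# NOTE: the one-URI-per-line manifest format here is a hypothetical
# placeholder for illustration only.

from urllib.request import urlopen

def parse_manifest(text):
    """Return the chunk URIs listed in the manifest, one per line."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def fetch(url):
    """Issue one HTTP GET and return the response body."""
    with urlopen(url) as resp:
        return resp.read()

def pull_stream(manifest_url, play):
    """Manifest first, then pull chunk by chunk."""
    chunk_urls = parse_manifest(fetch(manifest_url).decode())
    for url in chunk_urls:
        play(fetch(url))  # playback proceeds while later chunks download
```

Note how the model's three properties show up directly: the client drives every request, delivery is plain best-effort HTTP, and only standard web infrastructure is assumed.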
Existing HTTP Streaming Model (Message Flow)

[Figure: audio/video input feeds a Media Encoder and Media Segmenter behind a Web/Media Server; a Web Browser with Media Decoder/Player issues: HTTP GET → Presentation Desc.; HTTP GET URL (frag 1 req) → Fragment 1; …; HTTP GET URL (frag i req) → Fragment i.]

1) The media segmenter splits the input media into a series of fragments or chunks. The presentation description conveys the index of each fragment and its associated metadata.
2) The client polls for each new chunk of data using HTTP requests.
3) The web browser pulls media from the server fragment by fragment in accordance with the presentation description. The URL tells the server which fragment the client is requesting.
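The segmenter's role in the flow above can be sketched as follows; the chunk size and the "fragN" naming scheme are illustrative assumptions, not part of any specification in this deck:

```python
# Sketch of the media segmenter and presentation description: split the
# input media into fixed-size fragments and build one index entry per
# fragment. Chunk size and fragment naming are illustrative only.

def segment(media: bytes, chunk_size: int) -> list:
    """Split input media into a series of fragments/chunks."""
    return [media[i:i + chunk_size] for i in range(0, len(media), chunk_size)]

def presentation_desc(fragments) -> list:
    """Minimal presentation description: (fragment name, length) per chunk."""
    return [(f"frag{i + 1}", len(frag)) for i, frag in enumerate(fragments)]
```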
What problems do we need to look at?
• P1: Inefficient Streaming Content Delivery
– The streaming application is decoupled from the existing web infrastructure
– The web infrastructure may not satisfy real-time streaming media requirements:
• Larger HTTP header size
• Reliance solely on multiple connections for concurrency
• Transport degradation due to the server's slow response to transmission-rate changes
• Slow timing control for driving requests:
– No chunk request is sent until the manifest is received
– Some approaches do not send a request for a new media chunk until the media chunk answering the previous request has been received
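A rough model of the "slow timing control" point above (units are arbitrary time ticks; the numbers are illustrative, not measurements): if the client will not request chunk i+1 until chunk i has fully arrived, every chunk pays a full round trip, whereas overlapping the next request with the current download amortizes the round trip:

```python
# Compare the two request-timing strategies from the slide above.
# All quantities are abstract time units, chosen only for illustration.

def serialized_fetch_time(n_chunks, rtt, download):
    """Each request waits for the previous response: n * (rtt + download)."""
    return n_chunks * (rtt + download)

def overlapped_fetch_time(n_chunks, rtt, download):
    """Next request issued while the previous chunk downloads."""
    return rtt + n_chunks * download
```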
What problems do we need to look at?
• P2: No QoE Improvement Support
– Best-effort Internet
• The quality of Internet media streaming may degrade significantly due to rising usage and concurrent streaming delivery.
– No subscriber QoE feedback support
• Analyzing the system's overall performance is important for providing high-quality service:
– There is no streaming quality control mechanism like RTCP to report the subscriber QoE metrics that matter to the HTTP streaming system for congestion control or diagnostic purposes.
• QoE is difficult to track in the client-based pull case:
– It fails to give the server feedback about the experience the user actually had while watching a particular video.
– The server may have a video that continually fails to start, or content that continually rebuffers, while the content owner receives none of this information.
– Channel switching latency
• Switching between live stream channels, or from a VOD channel to a live stream channel.
• Additional round trips between the client and the server for manifest file updates before the client can request each new chunk.
What problems do we need to look at?
• P2: No QoE Improvement Support (cont'd)
– Relies on the client to handle the playback buffer and choose content quality
• Smoothing jitter caused by network bandwidth fluctuation may further increase the user's waiting time.
• Is it reasonable to convey quality parameters in the URL, given that URLs have many variants?
– Web server overload in the case of live streaming
• The web server bottleneck is how many concurrent streams can be served.
• The server may sacrifice/downgrade quality to keep pace with live content rendering for viewing.
– Sole reliance on multi-bit-rate (MBR) encoding
• Various quality downgrades occur due to switching from a high-bit-rate stream to a low-bit-rate stream, and rebuffering occurs when MBR functionality is poorly utilized.
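With MBR encoding, the client can only pick from a fixed set of encoded rates. A generic throughput-based selection heuristic (a sketch under assumed conventions, not an algorithm from these slides) picks the highest encoded rate that fits within a safety fraction of the measured bandwidth:

```python
# Generic multi-bit-rate selection sketch. The 0.8 safety margin is an
# illustrative assumption, not a value taken from this presentation.

def select_bitrate(available_kbps, measured_kbps, safety=0.8):
    """Highest encoded rate not exceeding safety * measured throughput;
    fall back to the lowest rate when nothing fits."""
    usable = [r for r in sorted(available_kbps) if r <= safety * measured_kbps]
    return usable[-1] if usable else min(available_kbps)
```

When the measured bandwidth sits just above a rate boundary, repeated switching between adjacent rates is exactly the quality-downgrade/rebuffering risk the slide describes.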
What problems do we need to look at?
• P3: No Service Differentiation Support
– No distinction between regular HTTP traffic and HTTP streaming traffic
• Disadvantages:
– Streaming media is transported in the same way as web pages
– Streaming media gets no priority to be delivered/processed first
• Open questions:
– Rely on a DPI mechanism to differentiate traffic by parsing the streaming file header?
– Shall we distinguish the traffic types by HTTP header?
What problems do we need to look at?
• P4: No Streaming Distribution Component Support
– Can chunks be cached or not?
• Can streaming media be cached in the same way as web pages?
• Encryption/authorization may be an issue.
– How to reduce the upstream bandwidth between the web server and the proxy?
• Serve multiple incoming persistent connections with one upstream persistent connection
• Build a smart cache that receives the whole response from upstream before returning anything to the client?
• Can a smart cache retrieve the manifest as all the receivers do, or can a smart cache be signaled to cache the media chunks?
• Suppose chunk hints can be sent from the server to inform the intermediary and help reduce server overload; how can chunk hints be perceived by the intermediary?
– How is HTTP live streaming distributed into a CDN in the case of server overload or live stream serving?
• HTTP redirection to deal with server overload lacks efficiency
• Can an application running over HTTP on the smart cache perceive server load and tackle such overload efficiently?
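The shared-upstream-connection idea above can be sketched as a smart cache that fetches each chunk from the origin at most once and serves every downstream client from the cached copy. Here `fetch_upstream` stands in for one upstream HTTP request; a real cache would also have to handle expiry and the encryption/authorization issues the slide flags as open:

```python
# Smart cache sketch: deduplicate upstream chunk requests so many
# downstream clients share one upstream fetch per chunk.

class SmartCache:
    def __init__(self, fetch_upstream):
        self._fetch = fetch_upstream
        self._store = {}            # chunk URL -> cached bytes
        self.upstream_requests = 0  # how often we actually went upstream

    def get(self, chunk_url):
        """Serve from cache; go upstream only on the first request."""
        if chunk_url not in self._store:
            self._store[chunk_url] = self._fetch(chunk_url)
            self.upstream_requests += 1
        return self._store[chunk_url]
```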
Next Step
• Do we think there are real problems here to be solved (even if the four we list aren’t quite right)?
• Do we think the IETF is the right place to work on this?
• How many people are interested in these issues and would like to contribute?
References
• http://www.adobe.com/products/httpdynamicstreaming/
• http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/First-Look-Flash-Media-Server-4-69867.aspx
• http://tools.ietf.org/html/draft-pantos-http-live-streaming-04
• http://www.iis.net/download/smoothstreaming
• http://tools.ietf.org/html/draft-wu-http-streaming-optimization-ps-00
Additional Slides
HTTP Streaming Use Case (1-1): Live Streaming Media Broadcast

[Figure: audio/video input feeds a Media Encoder and Media Segmenter behind a Web Server (Media Server, live stream serving); Web Browser 1 with Media Decoder/Player 1 issues: 1. HTTP GET → 2. Presentation Desc.; 3. HTTP GET URL (chunk x req) → 4. Fragment x; 5. switching; 6. HTTP GET → 7. Presentation Desc.; 8. HTTP GET URL (chunk y req) → 9. Fragment y.]

a. The presentation description is used to convey the index of each chunk and its associated metadata.
b. The web browser pulls media from the server chunk by chunk in accordance with the presentation description.
c. The URL tells the server which chunk the client is requesting.

Channel switching latency: additional round trips between the client and the server for manifest file updates before the client can request each new chunk, which could put the real-time nature of live streaming at risk. Additional round trips for HTTP connection setup also put the real-time nature of live streaming at risk.

Channel startup latency: during startup of a live channel, the client doesn't know the current time point of the content. Retrieving a new manifest for the live channel results in a poor user experience with a longer waiting time.

Live channel switch: switching between live channels, or from a VOD channel to a live channel.
HTTP Streaming Use Case (1-2): Live Streaming Media Broadcast

[Figure: the same server chain (audio/video input → Media Encoder → Media Segmenter → Web Server / Media Server, live stream serving) serving Web Browser 2 with Media Decoder/Player 2: 1. HTTP GET → 2. Presentation Desc.; 3. HTTP GET URL (chunk x req) → 4. Fragment x; 6. HTTP GET URL (chunk y req) → 7. Fragment y.]

Latency in the middle of an ongoing live session: if the client must predict the URL of a new chunk from the previous chunk's header, extra delay is introduced to deep-parse the previous chunk's file header and calculate the next chunk's URL. If the client can simply calculate the URL of the new chunk as the index of the previous chunk plus one, such latency can be reduced.

Joining a live session in the middle: server-initiated push can be used to deliver the media stream in the middle of a live session.
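The index-plus-one URL calculation mentioned above can be sketched as follows; the URL naming pattern (a trailing integer index before the file extension) is an illustrative assumption:

```python
# Derive the next chunk URL by incrementing the trailing index of the
# previous one, with no need to deep-parse the previous chunk's header.
# Assumes a hypothetical ".../chunk<N>.<ext>" naming pattern.

import re

def next_chunk_url(prev_url):
    """Replace the trailing chunk index in the URL with index + 1."""
    match = re.search(r"(\d+)(\.\w+)?$", prev_url)
    index, suffix = int(match.group(1)), match.group(2) or ""
    return prev_url[:match.start()] + str(index + 1) + suffix
```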
HTTP Streaming Use Case, IETF 79
HTTP Streaming Use Case (2-1): Multi-Screen Service Delivery

Subscriber QoE feedback:
– The model fails to give the server feedback about the experience the user actually had while watching a particular video.
– The server may be paying to stream content that is rarely or never watched.
– The server may have a video that continually fails to start, or content that continually rebuffers, while the content owner receives none of this information.
– The intermediaries in the middle have no capability to report the total bandwidth consumption of the large number of clients behind them.

Clients in the same network segment cannot request the same content at intermediate bitrates: suppose the server provides the same content at three bitrates, e.g., 1M and 10M; the client cannot request the content at a bitrate between 1M and 10M.
HTTP Streaming Use Case (3): Content Publishing into a CDN

[Figure: audio/video input → Media Encoder → Media Segmenter → Streaming Server, publishing to Web Server 1 and Web Server 2 in a Content Delivery Network (CDN); Web Clients 1-3 reach the CDN through Web Caches, Web Client 4 through Smart Caches; a smart cache serves multiple incoming persistent connections over one shared upstream connection.]

Server overload due to stream concurrency: concurrent streams from different end devices will flow from the same web server, which may saturate the web server with too much load.

Upstream bandwidth saving issue: the upstream bandwidth between the server and the proxy may be over-utilized.
Architecture Consideration (1)
• Enhanced HTTP Streaming pull model
– Feedback on quality of data delivery
– Reduced switching latency in the case of live streaming serving
– Allows deployment of smart caches that enable HTTP streaming traffic localization
– Allows the server to send chunk hints to the client
– Allows intermediate entities to parse chunk hints passing through
Architecture Consideration (2)
• Hybrid HTTP Streaming model
– Feedback on quality of data delivery
– Reduced switching latency in the case of live streaming serving
– Allows bidirectional communication between the HTTP client, HTTP server and intermediate entities
– Allows bidirectional communication between servers
– Reduces upstream bandwidth consumption
Suggestions
• Two ways to go:
– Change the existing HTTP protocol to support HTTP streaming, if the challenges can be met by enhancing the HTTP pull model
– Extend the WebSocket protocol to support HTTP streaming, if the challenges cannot be met by enhancing the HTTP pull model
Traditional Streaming Solutions
Conventional streaming solutions use:
– RTP/UDP/IP for media data transport, encapsulated as RTP packets
– RTSP for session control
– SDP for session description
– RTCP for QoS control
Server Overload
The Nginx web server can serve up to 80 concurrent streams without any problem. When serving 90 concurrent streams, 1% of all requests take longer than the video duration, and buffer underruns may occur. When serving 140 concurrent streams, the saturation is too high and the mean request time goes over 10 seconds, causing systematic buffer underruns for all connected users.
End-to-End Delay Comparison
• For live streaming:
End-to-end delay = transmission delay + network delay + playback delay
• For on-demand streaming:
End-to-end delay = network delay + playback delay
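The two delay formulas above, written as executable definitions (all terms in seconds; the sample values are illustrative, not measurements):

```python
# End-to-end delay per the comparison above: live streaming adds a
# transmission-delay term that on-demand streaming does not pay.

def live_e2e_delay(transmission, network, playback):
    """Live streaming: transmission + network + playback delay."""
    return transmission + network + playback

def on_demand_e2e_delay(network, playback):
    """On-demand streaming: network + playback delay."""
    return network + playback
```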
Playback Control in the Existing Model
C: SETUP rtsp://audio.example.com/twister/audio RTSP/1.0
   Transport: rtp/udp; compression; port=3056; mode=PLAY
S: RTSP/1.0 200 1 OK
   Session 4231
C: PLAY rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
   Range: npt=0-
S: RTSP/1.0 200 1 OK
   Session 4231
C: PAUSE rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
   Range: npt=37
S: RTSP/1.0 200 1 OK
   Session 4231
C: TEARDOWN rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
S: RTSP/1.0 200 1 OK
   Session 4231
Playback control with RTSP:
– In Microsoft HTTP streaming, the RTSP headers are embedded in the Pragma headers of HTTP messages.
– In RealNetworks and QuickTime HTTP streaming, the RTSP commands are embedded in HTTP message bodies in base64 encoding.

Playback control without RTSP:
– In other implementations, the client performs playback control by driving HTTP requests or running a script on the client side.