TRANSCRIPT
15-744 Computer Networking
Lecture 18 – Internet Video Delivery
Matt Mukerjee
Slides: Hui Zhang, Peter Steenkiste, Athula Balachandran, Srini Seshan, et al.
The Internet Today…
[Chart: 2014 1H North America fixed-access traffic by application, annotated: Video; Could be Video; Probably Video; Encrypted Video?; DEFINITELY Video!; Purchasing Videos?; Sharing Video!; Video in the name!; Ads! (and Videos!)]
It's all Video!
Demystifying Video Delivery
[Diagram: Video Source → Internet (???) → Video Player → Screen]
What's this ??? in the middle? Today's lecture.
Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery
Terminology
• Frame
• Resolution
  • Number of pixels per frame
  • 160x120 to 1920x1080 (1080p) to 4096x2160 (4K)
• FPS (frames per second)
  • 24, 25, 30, or 60
• Bitrate
  • Information stored/transmitted per unit time
  • Usually measured in kbps to Mbps
  • Ranges from 200 kbps to 30 Mbps
Internet Video Requirements
• Smooth/continuous playback
• Elasticity to startup delay: need to think in terms of RTTs
• Elasticity to throughput
  • Multiple encodings: 200 kbps, 1 Mbps, 2 Mbps, 6 Mbps, 30 Mbps
• Multiple classes of applications with different requirements:

Class                 Delay      Bandwidth                                Examples
2-/N-way conference   < 200 ms   4 kbps audio only; 200 kbps – 5 Mbps video   Skype, Google Hangouts, Polycom, Cisco
Short-form VoD        < 1-5 s    300 kbps – 2 Mbps & higher               YouTube
Long-form VoD         < 5-30 s   500 kbps – 6 Mbps & higher               Netflix, Hulu, Qiyi, HBO GO
Live Broadcast        < 5-10 s   500 kbps – 6 Mbps & higher               WatchESPN, MLB
Linear Channel        < 60 s     500 kbps – 6 Mbps & higher               DirecTV Live
Avoid Buffering like the Plague!
[Screenshot: "Buffering… 51%"]
1990 – 2004: 1st Generation Commercial PC/Packet Video Technologies
• Simple video playback, no support for rich apps
• Not well integrated with the Web browser
• No critical mass of compelling content over the Internet
• Not enough broadband penetration
2005: Beginning of the Internet Video Era
• 100M streams in the first year
• Premium sports webcasts online
2006 – 2015: Internet Video Going Prime Time
[Timeline of milestones across 2006–2011]
First Generation: HTTP Download
• Browser requests the object(s) and, after receiving them, passes them to the player for display
• No pipelining
First Gen: HTTP Progressive Download (2)
• Alternative: set up a connection between server and player; player takes over
  • Web browser requests and receives a meta file (a file describing the object) instead of the media file itself
  • Browser launches the appropriate player and passes it the meta file
  • Player sets up a connection with the Web server and downloads or streams the file
Meta file requests
[Diagram illustrating the meta-file request flow]
Buffering Continuous Media
• Jitter = variation from ideal timing
• Media delivery must have very low jitter
  • Video frames every 30 ms or so
  • Audio: ultimately samples need < 1 ns jitter
• But network packets have much more jitter than that!
• Solution: buffers
  • Fill buffer over the network with best-effort service
  • Drain buffer via low-latency, local access
Streaming, Buffers and Timing
[Figure: file position vs. time. Max buffer duration = allowable jitter; max buffer size caps how far ahead the client can fetch. "Good" region: smooth playback, with the buffer between almost empty and full. "Bad": buffer underflows and playback stops. "Bad": buffer overflows.]
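The buffer dynamics in this figure can be sketched with a toy simulation (the function name and all numbers are illustrative, not from the lecture): jittery network arrivals fill the buffer, the player drains it at a constant rate, and playback stalls whenever the buffer underflows.

```python
# Toy playout-buffer simulation: the network fills the buffer with
# jittery arrivals, the player drains it at a constant rate, and a
# stall is counted whenever the buffer underflows.
def simulate_playback(arrivals_ms_of_video, drain_ms_per_tick=100,
                      startup_ms=500, max_buffer_ms=30_000):
    buffered = 0          # ms of video currently buffered
    started = False
    stalls = 0
    for arrival in arrivals_ms_of_video:       # one network tick per entry
        buffered = min(buffered + arrival, max_buffer_ms)  # overflow: stop filling
        if not started:
            if buffered >= startup_ms:         # wait for startup threshold
                started = True
            continue
        if buffered < drain_ms_per_tick:       # underflow: playback stops
            stalls += 1
        else:
            buffered -= drain_ms_per_tick      # smooth playback this tick

    return stalls

# Steady arrivals give smooth playback; a burst of empty ticks
# after startup produces stalls.
```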
Drawbacks of HTTP Progressive Download
• HTTP connection keeps data flowing as fast as possible into the user's local buffer
  • May download lots of extra data if the user does not watch the entire video
  • TCP file transfer can use more bandwidth than necessary
• Mismatch between whole-file transfer and stop/start/seek playback controls
  • However: player can use file range requests to seek to a video position
• Cannot change video quality (bit rate) to adapt to network congestion
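The range-request workaround mentioned above can be sketched as a byte-offset computation (the helper is hypothetical; the Range header format itself is standard HTTP): with a constant-bitrate file, a seek time maps directly to a byte position.

```python
def range_header_for_seek(seek_seconds, bitrate_bps):
    """Map a seek position in a constant-bitrate file to an HTTP Range
    header requesting everything from that byte offset onward."""
    byte_offset = (seek_seconds * bitrate_bps) // 8  # bits -> bytes
    return {"Range": f"bytes={byte_offset}-"}

# Seeking 60 s into a 1 Mbps file skips the first 7,500,000 bytes.
```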
2nd Generation: Real-Time Streaming
• Build our own protocol! This gets around HTTP; an app-layer protocol can be better tailored to streaming
Example: Real Time Streaming Protocol (RTSP)
Drawbacks of Real-Time Streaming
• Per-client state!
• Streaming protocols often blocked by routers
  • HTTP delivery can use ordinary proxies and caches
• Conclusion: rather than adapt the Internet to streaming, adapt media delivery to the Internet
3rd Generation: HTTP Streaming
• Client-centric architecture with stateful client and stateless server
  • Standard server: Web servers
  • Standard protocol: HTTP
  • Session state and logic maintained at the client
• Video is broken into multiple chunks (like with BitTorrent)
  • Each chunk can be played independent of other chunks
  • Playing chunks in sequence gives seamless video
  • Meta-file (manifest) provides chunk file names
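The manifest-then-chunks flow above can be sketched as a minimal client loop; the one-chunk-per-line manifest format and the fetch callback are invented for illustration (real systems use formats such as HLS or MPEG-DASH playlists).

```python
# Minimal sketch of a stateful HTTP-streaming client: parse a manifest
# listing chunk names, then request each chunk in order with plain HTTP
# GETs. fetch() is a stand-in for an HTTP library call.
def parse_manifest(text):
    # One chunk name per line, e.g. "A1\nA2\nA3" (invented format)
    return [line.strip() for line in text.splitlines() if line.strip()]

def play(manifest_text, fetch):
    played = []
    for chunk_name in parse_manifest(manifest_text):  # client keeps session state
        data = fetch(chunk_name)   # ordinary HTTP GET; cacheable anywhere
        played.append(data)        # each chunk decodes independently
    return played
```

Because every request is a plain GET for a named file, ordinary Web servers and caches need no streaming-specific logic.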
HTTP Chunking Protocol
[Diagram: an HTTP adaptive player inside the Web browser talks HTTP over TCP, possibly through a cache, to a standard Web server that stores chunks A1, A2, … The player first sends HTTP GET Manifest and receives the manifest (A1, A2, …), then sends HTTP GET A1 and HTTP GET A2; returned chunks may be cached along the path.]
Adaptive Bitrate with HTTP Streaming
• Encode video at different quality/bitrate levels
• Client can adapt by requesting chunks at different bitrate levels
• Chunks of different bitrates must be synchronized
  • All encodings have the same chunk boundaries and all chunks can be played independently, so you can make smooth splices to chunks of higher or lower bit rates
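One simple way to realize this adaptation is throughput-based selection, sketched here under assumed parameters (the 0.8 safety factor is an invented heuristic; the ladder reuses the encodings listed earlier): pick the highest encoding at or below a safe fraction of measured throughput.

```python
def pick_bitrate(throughput_bps,
                 ladder=(200_000, 1_000_000, 2_000_000,
                         6_000_000, 30_000_000),
                 safety=0.8):
    """Choose the highest encoding the client can sustain: the largest
    ladder rung at or below a safety fraction of measured throughput,
    falling back to the lowest rung."""
    sustainable = throughput_bps * safety
    candidates = [r for r in ladder if r <= sustainable]
    return max(candidates) if candidates else ladder[0]
```

The safety margin keeps the chosen rate below the raw estimate so short throughput dips drain the buffer instead of stalling playback.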
HTTP Chunking Protocol
[Diagram: the server now stores two encodings, low-quality chunks A1, A2, … and high-quality chunks B1, B2, …; the manifest lists both (LQ: A1, A2, …; HQ: B1, B2, …). The player sends HTTP GET A1, then switches up with HTTP GET B2; caches along the path hold chunks from both encodings.]
Advantages of HTTP Streaming
• Easy to deploy: it's just HTTP!
  • Works with existing caches/proxies/Web servers/firewalls
• Fast startup by downloading the lowest-quality/smallest chunk first
• Bitrate switching is seamless
• Many small files
  • Small with respect to the movie size
  • Large with respect to TCP
  • 5-10 seconds at 1 Mbps – 3 Mbps ≈ 0.5 MB – 4 MB per chunk
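The per-chunk sizes quoted above follow from duration times bitrate; a quick sketch of the arithmetic (the helper name is ours):

```python
def chunk_size_mb(duration_s, bitrate_bps):
    """Chunk size in megabytes for a given duration and bitrate:
    duration * bitrate gives bits, /8 gives bytes, /1e6 gives MB."""
    return duration_s * bitrate_bps / 8 / 1_000_000
```

For example, 5 s at 1 Mbps is about 0.625 MB and 10 s at 3 Mbps is about 3.75 MB, roughly consistent with the slide's 0.5 MB to 4 MB per-chunk range.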
Demystifying Video Delivery
[Diagram: Video Source → Internet (???) → Video Player → Screen]
Demystifying Video Delivery
[Diagram: Video Source → Encoders & Video Servers → CMS and Hosting → ISP & Home Net → Video Player → Screen]
Demystifying Video Delivery
[Diagram: Video Source → Encoders & Video Servers → CMS and Hosting → Content Delivery Networks (CDN) → ISP & Home Net → Video Player → Screen]
Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery
Content Delivery Networks (CDNs)
• Goal: deliver content from optimal locations
  • (Generally: users in Pittsburgh go to a Pittsburgh server, users in Tokyo go to a Tokyo server, …)
• Customers (e.g., CNN, ESPN, …) pay the CDN to deliver their content
• Variety of companies: Akamai, Limelight, Level 3, …
• Akamai is one of the most well known
  • 170,000+ servers in 102 countries
  • 15-30% of Internet traffic
How Akamai Works
[Diagram, numbered steps 1–12: the end-user fetches index.html from cnn.com (the content provider); embedded objects are named under Akamai (Get /cnn.com/foo.jpg). Resolving that name walks the DNS root server, Akamai's high-level DNS server, and Akamai's low-level DNS server, which returns a nearby Akamai server; the end-user then issues Get foo.jpg to that Akamai server.]
Akamai – Subsequent Requests
[Diagram: with DNS entries cached, later requests (Get index.html, Get /cnn.com/foo.jpg) skip the root and high-level lookups and resolve via Akamai's low-level DNS server straight to a nearby Akamai server.]
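The DNS redirection shown in the two figures amounts to mapping the client's location to a nearby edge server; a caricature of the low-level DNS server's decision, with invented regions and addresses:

```python
# Caricature of CDN DNS-based server selection: the low-level DNS
# server maps the client's region to a nearby edge server. Regions
# and addresses are hypothetical (198.51.100.0/24 is a documentation
# prefix).
EDGE_SERVERS = {
    "pittsburgh": "198.51.100.10",
    "tokyo":      "198.51.100.20",
}
DEFAULT_SERVER = "198.51.100.1"  # fallback for unmapped regions

def resolve(client_region):
    return EDGE_SERVERS.get(client_region, DEFAULT_SERVER)
```

Real CDNs fold in load, link cost, and liveness, not just geography, but the control point is the same: the DNS answer.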
Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery
What’s wrong with this picture?
[Diagram: Video Source → Encoders & Video Servers → CMS and Hosting → Content Delivery Networks (CDN) → ISP & Home Net → Video Player → Screen]
No Feedback!
Closing the Loop with Analytics
[Diagram: the same delivery pipeline (Video Source, Encoders & Video Servers, CMS and Hosting, CDN, ISP & Home Net, Video Player, Screen), now with client measurements fed back to a Content Broker]
Why Analytics?
• Analytics help judge users’ Quality of Experience (QoE)
• Users’ perception of quality greatly impacts engagement
  • (and thus ad revenue)
Example: Conviva
How can Conviva help increase QoE?
• Runs code directly in the client’s browser, from the customer’s website (e.g., CNN, ESPN, …)
• Gathers historic and real-time data from each client
• Predicts which CDN will provide the best QoE for a given client
  • (and sends the client there)
• Finds the optimal time to switch bitrates
Conviva Architecture
[Architecture diagram]
Simple, Right?
• Turns out no single metric is a good indicator of QoE!
Commonly used Quality Metrics
• Join time
• Average bitrate
• Buffering ratio
• Rate of buffering
• Rate of switching
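Several of these metrics can be computed from a per-session event trace; the (state, seconds, bitrate) trace format below is made up for illustration and covers join time, buffering ratio, and time-weighted average bitrate.

```python
def session_metrics(events):
    """events: list of (state, seconds, bitrate_bps) tuples, where state
    is 'join', 'play', or 'buffer' (an illustrative trace format).
    Returns (join_time, buffering_ratio, avg_bitrate)."""
    join_time = sum(s for st, s, _ in events if st == "join")
    play = sum(s for st, s, _ in events if st == "play")
    buf = sum(s for st, s, _ in events if st == "buffer")
    # Time-weighted average bitrate over played seconds only
    avg_bitrate = (sum(s * b for st, s, b in events if st == "play") / play
                   if play else 0.0)
    # Fraction of session time (after join) spent rebuffering
    buffering_ratio = buf / (play + buf) if play + buf else 0.0
    return join_time, buffering_ratio, avg_bitrate
```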
Commonly used Quality Metrics
• Need some function that maps metrics to QoE
• Use machine learning? ✓
• Biggest problem… confounding factors!
  • (e.g., live video and VoD behave very differently)
Challenge: Confounding Factors
[Figure: live and VoD sessions experience similar quality]
[Figure: however, user viewing behavior is very different]
Confounding Factors
• Type of video
• Connectivity
• Device
• Location
• Popularity
• Time of day
• Day of week
Need a systematic approach to identify and incorporate confounding factors.
How to Rank Confounding Factors?
• Relative information gain
  • ML for “how key is a particular factor”
• Decision tree structure
  • Does this factor drastically change the decision tree structure?
• Tolerance
  • Do users tolerate changes in this factor more easily?
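The first criterion can be sketched concretely: relative information gain measures how much knowing a factor reduces the entropy of a (binned) quality label, normalized by the base entropy. This toy version is our own sketch, not Conviva's pipeline.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of discrete labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def relative_info_gain(factor_values, quality_labels):
    """(H(quality) - H(quality | factor)) / H(quality), in [0, 1]."""
    base = entropy(quality_labels)
    if base == 0:
        return 0.0
    n = len(quality_labels)
    cond = 0.0
    for v in set(factor_values):   # weighted entropy within each factor value
        subset = [q for f, q in zip(factor_values, quality_labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return (base - cond) / base
```

A factor that perfectly predicts the quality label scores 1.0; an independent factor scores 0.0, matching the intuition of "how key is this factor".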
Identifying Key Confounding Factors

Factor         Relative Information Gain   Decision Tree Structure   Tolerance Level
Type of video  ✓                           ✓                         ✓
Popularity     ✗                           ✗                         ✗
Location       ✗                           ✗                         ✗
Device         ✗                           ✓                         ✓
Connectivity   ✗                           ✗                         ✓
Time of day    ✗                           ✗                         ✓
Day of week    ✗                           ✗                         ✗
Results
• Use the important confounding factors to create separate QoE estimator models (using ML)
• 20% increase in user engagement
Outline
• Background on Video
• History of Internet Video Delivery
• CDNs & the World Today
• Content Brokers & Quality of Experience
• Live Video Delivery
Live Video
• Live video is becoming immensely popular
  • Demand increases 4-5x every three years
  • Expected to reach 50 Tbps this year
  • Amazon buys Twitch for ~$1 billion
  • Users watch 150B minutes of live video per month
  • A single World Cup stream = 40% of Internet traffic
Live Video Challenges
• Amount of demand
• Can’t cache
• Workload variety (e.g., mega-events vs. long tail)
• Users want: good QoE (good bitrate, buffering ratio, join time)
• CDNs want: to meet user demand, minimize delivery cost
• WAN imposes: latency, packet drops, failures, …
CDN Live Video Delivery Background
[Diagram: video sources feed interior clusters, which forward streams along distribution trees to edge clusters; clients send video requests (HTTP GETs for Channel 1, Channel 2) to edge clusters and receive the data responses.]
Questions?