Videostream compression in iOS
Uladzimir Predka
iOS developer, *instinctools
Overview

1. A little bit about codecs and containers
2. A little bit about streaming
3. How to arrange it for iOS
4. Comparison of different approaches for compressing videostream
About codecs and containers
The container is a file or streaming format, which takes care of packaging, transport, and presentation of information which is inside of it.
About codecs and containers

Examples of containers:

● AVI
● MKV
● QuickTime
● MP4
● MPEG-TS
About codecs and containers

Examples of codecs:

Video:

● MPEG-4
● DivX
● h.264/x264

Audio:

● MP3
● AAC
● DTS
What about the mechanism of media streaming?

Streaming media is multimedia that is continuously received by the user while being delivered by a streaming provider.

● live streaming
● streaming on demand
What about the mechanism of media streaming?

Media data -> large amounts -> expensive storage and transmission

Recommended stream bandwidth:

● non-HD video ~2 Mbit/s
● HD video ~5 Mbit/s
● UHD ~9 Mbit/s
What about the mechanism of media streaming?

Example bandwidth calculation:

1 hour of 320 × 240 video at ~300 kbit/s ≈ 128 MB

For 1 thousand users: 300 Mbit/s == 135 GB/h
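The slide's arithmetic can be checked with a few lines of Swift (a pure calculation, using only the numbers from the slide):

```swift
import Foundation

// One viewer: 300 kbit/s, watched for one hour.
let bitrateBitsPerSecond = 300_000.0
let seconds = 3600.0

// Total bytes for one viewer-hour.
let bytesPerViewerHour = bitrateBitsPerSecond * seconds / 8       // 135,000,000 bytes
let mebibytes = bytesPerViewerHour / 1_048_576                    // ≈ 128.7 MiB -> the slide's "~128 MB"

// 1,000 concurrent viewers.
let totalBitrateMbps = bitrateBitsPerSecond * 1_000 / 1_000_000   // 300 Mbit/s
let gigabytesPerHour = bytesPerViewerHour * 1_000 / 1_000_000_000 // 135 GB/h

print(mebibytes, totalBitrateMbps, gigabytesPerHour)
```

The ~128 MB and 135 GB figures differ only because one is counted in binary mebibytes and the other in decimal gigabytes.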
What about the mechanism of media streaming?

Codecs and containers used for streaming:

Audio compression: MP3, AAC, Vorbis ...

Video compression: h.264, VP8 …

Containers: MP4, FLV...
What about the mechanism of media streaming?

Transport protocols:

Used for media delivery from the server to the client.

● RTMP (Real Time Messaging Protocol)
● RTP (Real-time Transport Protocol) + RTCP
● RTSP (Real Time Streaming Protocol)
What about the mechanism of media streaming?

Transport protocols:

Newer: Apple’s HLS, Adobe’s HDS, MPEG-DASH

The process often consists of two stages:

1. Delivering the stream to the server using a streaming transport protocol
2. Delivering it from the server to the end user (HTTP-based protocols)
What about the mechanism of media streaming?

HLS (HTTP Live Streaming)

It is based on the principle of splitting the stream into fragments.

It uses an extended M3U playlist, which is downloaded at the beginning of the session and contains metadata about the nested streams.
What about the mechanism of media streaming?

HLS (HTTP Live Streaming)

It involves the use of an intermediate server that:

1. transforms the media stream into the correct format (h.264 video; MP3/HE-AAC/AC-3 audio) and packs it into an MPEG-TS container
2. splits the MPEG-TS file into fragments of equal length and creates an index file (.m3u8) with links to the fragments
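A minimal index file of the kind step 2 produces might look like this (the segment names and durations are illustrative, not taken from the talk):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.009,
fragment0.ts
#EXTINF:9.009,
fragment1.ts
#EXT-X-ENDLIST
```

For a live stream the #EXT-X-ENDLIST tag is omitted and the client re-fetches the playlist as new fragments appear.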
How to compress video in iOS?

Life before iOS 8:

Hardware:

Using AVAssetWriter we write to a file (for online streaming: write small files, then read them and send them on).

Software:

Using a third-party library (e.g. FFmpeg)
How to compress video in iOS?

Life after iOS 8:

The VideoToolbox framework becomes available.

How to compress video in iOS?

AVFoundation:

● Decoding directly when displaying
● Encoding into a file

VideoToolbox:

● Decoding frames into CVPixelBuffer
● Encoding frames into CMSampleBuffer
How to compress video in iOS?

Briefly about h.264:

● widely used
● gives much better picture quality at lower bit rates (compared to MPEG-2)
● ideal for videostreaming
● ...

How to compress video in iOS?

Briefly about h.264:

It uses two approaches to reduce the size of the video:

● It compresses data within a single frame
● It compresses data using information from a group of pictures (pictures are grouped into GOPs)
How to compress video in iOS?

Briefly about h.264: GOP (Group of Pictures)
I-frame (key frame): self-sufficient; the biggest size; the fastest decoding.

P-frame (predicted frame): uses information from the nearest preceding P- or I-frame.

B-frame (bidirectional frame): uses information from the frames before and after the current one.
How to compress video in iOS?

VideoToolbox: the main points

It provides direct access to the decoder / encoder.

It depends on CoreMedia, CoreVideo and CoreFoundation.

It requires additional work with the buffers obtained from the encoder.
The process of preparing videostream in iOS

1. Capture video from the device -> CMSampleBuffer with uncompressed frame data
2. Compress the frame with the encoder from VideoToolbox -> CMSampleBuffer with compressed frame data
3. Convert the stream of CMSampleBuffers into NALUs for streaming through the network
The process of preparing videostream in iOS

Compression process:

1. Create and configure a VTCompressionSession using VTCompressionSessionCreate, passing a pointer to the encoding callback function as one of the parameters.
2. Call VTCompressionSessionEncodeFrame, passing a CVPixelBufferRef as one of the parameters; repeat for each frame.
3. Process the CMSampleBuffer received in the encoder callback.
The process of preparing videostream in iOS

The creation of the CompressionSession:
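The code from this slide did not survive the transcript; a minimal sketch of creating and configuring the session in current Swift might look like this (the resolution, bitrate, and property values are illustrative, not the speaker's):

```swift
import VideoToolbox

// Stub output callback; the real one is shown on a later slide.
let compressionOutputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    // Hand the compressed CMSampleBuffer on for NALU conversion.
    _ = sampleBuffer
}

var session: VTCompressionSession?

// Create an H.264 compression session.
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280, height: 720,                // illustrative resolution
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: compressionOutputCallback,
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Typical streaming-oriented properties (values are illustrative).
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Baseline_AutoLevel)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_AverageBitRate,
                         value: 2_000_000 as CFNumber)   // ~2 Mbit/s
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MaxKeyFrameInterval,
                         value: 60 as CFNumber)          // one I-frame per 60 frames
    VTCompressionSessionPrepareToEncodeFrames(session)
}
```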
The process of preparing videostream in iOS

Sending the buffer for compression:
The process of preparing videostream in iOS

The signature of the callback function:
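The slide's listing is missing; this is the shape of VTCompressionOutputCallback in current Swift (the body is an illustrative sketch; real code would pass `self` through refcon rather than use a global):

```swift
import VideoToolbox

// Count of frames received from the encoder (global because a C function
// pointer cannot capture context).
var encodedFrameCount = 0

// VTCompressionOutputCallback is a C function pointer with the parameters
// (refcon, sourceFrameRefcon, status, infoFlags, sampleBuffer) -> Void.
let compressionOutputCallback: VTCompressionOutputCallback = { refcon, sourceRefcon, status, infoFlags, sampleBuffer in
    guard status == noErr,
          let sampleBuffer = sampleBuffer,
          CMSampleBufferDataIsReady(sampleBuffer) else { return }
    encodedFrameCount += 1
    // ...convert the sampleBuffer into NALUs here (next slides)...
}
```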
Then it is necessary to convert the stream of CMSampleBuffers into a stream of packets suitable for further transmission over the network.
The process of preparing videostream in iOS

Getting the parameter sets from the sampleBuffer's I-frame:
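The code for this slide is missing from the transcript; a sketch of pulling the SPS and PPS parameter sets out of a keyframe's format description (function name is mine, not the speaker's):

```swift
import VideoToolbox
import Foundation

func extractParameterSets(from sampleBuffer: CMSampleBuffer) -> (sps: Data, pps: Data)? {
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }

    func parameterSet(at index: Int) -> Data? {
        var pointer: UnsafePointer<UInt8>?
        var size = 0
        var count = 0
        var nalHeaderLength: Int32 = 0
        let status = CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format,
            parameterSetIndex: index,
            parameterSetPointerOut: &pointer,
            parameterSetSizeOut: &size,
            parameterSetCountOut: &count,
            nalUnitHeaderLengthOut: &nalHeaderLength)
        guard status == noErr, let pointer = pointer else { return nil }
        return Data(bytes: pointer, count: size)
    }

    // Index 0 is the SPS, index 1 the PPS.
    guard let sps = parameterSet(at: 0), let pps = parameterSet(at: 1) else { return nil }
    return (sps, pps)
}
```

These two NALUs must be sent ahead of (or alongside) every keyframe so the decoder on the other end can configure itself.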
The process of preparing videostream in iOS

Annex B vs AVCC
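The distinction on this slide: VideoToolbox emits AVCC-formatted buffers, where each NALU is prefixed with its big-endian length, while streaming protocols generally expect Annex B, where NALUs are separated by 0x00000001 start codes. The slide's conversion code is missing; a pure-Swift sketch:

```swift
import Foundation

// Convert an AVCC buffer (length-prefixed NALUs, as produced by the encoder)
// into Annex B (start-code-separated NALUs, as expected on the wire).
func annexB(fromAVCC data: Data, nalLengthSize: Int = 4) -> Data {
    let startCode: [UInt8] = [0x00, 0x00, 0x00, 0x01]
    var out = Data()
    var offset = 0
    let bytes = [UInt8](data)
    while offset + nalLengthSize <= bytes.count {
        // Read the big-endian NALU length prefix.
        var nalLength = 0
        for i in 0..<nalLengthSize { nalLength = (nalLength << 8) | Int(bytes[offset + i]) }
        offset += nalLengthSize
        guard offset + nalLength <= bytes.count else { break }
        out.append(contentsOf: startCode)
        out.append(contentsOf: bytes[offset..<(offset + nalLength)])
        offset += nalLength
    }
    return out
}
```

The real length-prefix size comes from the nalUnitHeaderLengthOut parameter of CMVideoFormatDescriptionGetH264ParameterSetAtIndex; in practice it is almost always 4.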
The process of preparing videostream in iOS

Getting the raw compressed data:
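The listing for this slide is also missing; a sketch of reading the raw (AVCC-formatted) bytes out of an encoded CMSampleBuffer via its CMBlockBuffer (function name is mine; production code should handle non-contiguous block buffers, e.g. with CMBlockBufferCopyDataBytes):

```swift
import VideoToolbox
import Foundation

func compressedData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var totalLength = 0
    var pointer: UnsafeMutablePointer<Int8>?
    // Get a pointer to the buffer's bytes (assumes a contiguous block buffer).
    let status = CMBlockBufferGetDataPointer(
        blockBuffer,
        atOffset: 0,
        lengthAtOffsetOut: nil,
        totalLengthOut: &totalLength,
        dataPointerOut: &pointer)
    guard status == kCMBlockBufferNoErr, let pointer = pointer else { return nil }
    return Data(bytes: pointer, count: totalLength)
}
```

The returned bytes are length-prefixed NALUs, ready to be run through the Annex B conversion from the previous slide before going on the network.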
The process of preparing videostream in iOS

Profit
Thanks for your attention!
More info:

1. WWDC 2014, Session 513 (https://developer.apple.com/videos/play/wwdc2014/513/)
2. “Learning AV Foundation: …” by Bob McCune