
Transcript
Page 1: Learning AV Foundation

http://bobmccune.com

Learning AV Foundation

Page 2: Learning AV Foundation

Bob McCune
About...

‣ MN Developer and Instructor
‣ Owner of TapHarmonic, LLC.
‣ Founded Minnesota CocoaHeads in 2008

Page 3: Learning AV Foundation

What will I learn?
Agenda

‣ AV Foundation Overview
‣ Decomposing AV Foundation
‣ Code Examples

Page 4: Learning AV Foundation

Overview
AV Foundation Framework

‣ Apple’s advanced Objective-C framework for working with timed media
‣ High performance, asynchronous processing
‣ Hardware accelerated handling of AV media

‣ Available in its current form since iOS 4
‣ Additions and enhancements in iOS 5 and 6
‣ Part of Mac OS X since 10.7 Lion

‣ Apple’s focus for media apps on both iOS and Mac
‣ Should be yours too!

Page 5: Learning AV Foundation

iOS Media Options
Where does it fit?

MediaPlayer

UIKit

AVFoundation

CoreAudio  CoreMedia  CoreAnimation  CoreVideo

Page 6: Learning AV Foundation

Where do I start?
Challenges and Prerequisites

‣ Large and feature-rich framework
‣ Over 70 classes (as of iOS 6)
‣ Variety of functions, protocols, and constants

‣ Technical Concepts
‣ Blocks
‣ Key-Value Observing
‣ Grand Central Dispatch

‣ Additional frameworks
‣ Core Animation
‣ Quartz & OpenGL ES
‣ Core Media
‣ Core Audio

Page 7: Learning AV Foundation

What can I do with it?
Decomposing AV Foundation

Inspect

Playback

Capture

Compose

Export

Page 8: Learning AV Foundation

Media Inspection

Page 9: Learning AV Foundation

Static Media Modeling
Media Assets

‣ AVAsset models the static aspects of a media resource
‣ Abstraction over underlying format
‣ Models details common to whole media resource

‣ Composed of one or more tracks

‣ AVAssetTrack models the static aspects of the individual media streams within an asset
‣ Tracks are of a uniform type (video, audio, etc.), so they can be fetched by media type (see the sketch below)

AVAssetTrack (Video)

AVAssetTrack (Audio)
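A small sketch of pulling a track out of an asset by media type. The bundled "movie.m4v" resource is hypothetical, and this assumes the tracks property has already been loaded asynchronously (covered on the following slides):

#import <AVFoundation/AVFoundation.h>

// Assumes "movie.m4v" is a hypothetical bundled resource whose "tracks"
// key has already been loaded asynchronously (see the next slides).
NSURL *url = [[NSBundle mainBundle] URLForResource:@"movie" withExtension:@"m4v"];
AVAsset *asset = [AVAsset assetWithURL:url];

// Tracks are uniform in type, so they can be fetched by media type.
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if (videoTracks.count > 0) {
    AVAssetTrack *videoTrack = videoTracks[0];
    NSLog(@"Video size: %.0f x %.0f",
          videoTrack.naturalSize.width, videoTrack.naturalSize.height);
}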

Page 10: Learning AV Foundation

Timed-media Challenges
Media Inspection

‣ Processing media takes time
‣ Media resources can be large and possibly remote
‣ Need to keep the UI responsive
‣ Need to handle interruptions

Need to perform inspection asynchronously!

Page 11: Learning AV Foundation

Inspection
Asynchronous Inspection

‣ Creating an AVAsset does not load the resource
‣ Media not loaded until properties are queried
‣ Standard property access happens synchronously

‣ Properties should be loaded asynchronously using the AVAsynchronousKeyValueLoading protocol
- statusOfValueForKey:error:
- loadValuesAsynchronouslyForKeys:completionHandler:

NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"mp3"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];

Page 12: Learning AV Foundation

Asynchronous Inspection

NSURL *url = ...
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = @[@"tracks"];

[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            [self processTracks];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:asset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancellation.
            break;
        default:
            break;
    }
}];

Example

Page 13: Learning AV Foundation

Media Processing

Page 14: Learning AV Foundation

AVAssetExportSession
Transcoding and Export

‣ Export presets for transcoding to other formats
+ (NSArray *)exportPresetsCompatibleWithAsset:(AVAsset *)asset;

+ (NSArray *)allExportPresets;

‣ Can specify a time range to perform trimming (see the sketch below)
@property (nonatomic) CMTimeRange timeRange;

‣ Can optionally specify metadata to be written
@property (nonatomic, copy) NSArray *metadata;
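A minimal sketch of trimming on export, assuming the session created on the next slide; the times are illustrative:

// Keep only the first ten seconds of the asset in the exported file.
CMTime start = kCMTimeZero;
CMTime duration = CMTimeMake(10, 1); // 10 seconds
session.timeRange = CMTimeRangeMake(start, duration);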

Page 15: Learning AV Foundation

Example
AVAssetExportSession

NSURL *assetURL = ... // bundle URL for 'jam.mp3'
AVAsset *audioAsset = [AVURLAsset assetWithURL:assetURL];

AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:audioAsset presetName:AVAssetExportPresetAppleM4A];

session.outputURL = ... // Documents directory URL for 'jam.m4a'
session.outputFileType = AVFileTypeAppleM4A;

[session exportAsynchronouslyWithCompletionHandler:^{
    switch (session.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export Failed");
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export Cancelled");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Success!");
            break;
        default:
            break;
    }
}];

Page 16: Learning AV Foundation

AVAssetImageGenerator
Image Generation

AVAssetImageGenerator

Generate thumbnail images for specified time periods


Page 17: Learning AV Foundation

NSURL *assetURL = ... // Asset URL
AVAsset *asset = [AVAsset assetWithURL:assetURL];

// Generator must be retained!
self.imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];

// Times to capture, expressed as NSValue-wrapped CMTime values.
NSArray *times = @[[NSValue valueWithCMTime:CMTimeMake(5, 1)],
                   [NSValue valueWithCMTime:CMTimeMake(10, 1)]];

[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
    completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                        AVAssetImageGeneratorResult result, NSError *error) {
    switch (result) {
        case AVAssetImageGeneratorFailed:
            // Handle failure
            break;
        case AVAssetImageGeneratorCancelled:
            // Handle cancellation
            break;
        case AVAssetImageGeneratorSucceeded:
            // Process image
            break;
    }
}];

Example
AVAssetImageGenerator

Page 18: Learning AV Foundation

Reading and Writing
Advanced Media Processing

AVAssetReader

AVAssetWriter

Page 19: Learning AV Foundation

Media Playback

Page 20: Learning AV Foundation

Playback Controller
AVPlayer

‣ AVPlayer is a controller for managing playback
‣ play
‣ pause
‣ seekToTime:

‣ Use KVO to observe playback readiness and state
‣ status

‣ Timed Observations (see the sketch below)
‣ addPeriodicTimeObserverForInterval:queue:usingBlock:
‣ addBoundaryTimeObserverForTimes:queue:usingBlock:
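A minimal sketch of a periodic observation, assuming a videoURL, a hypothetical timeObserver property to retain the returned token, and a hypothetical updateTimeDisplay: UI method; the token must later be passed to removeTimeObserver::

// Observe playback time every half second on the main queue.
AVPlayer *player = [AVPlayer playerWithURL:videoURL]; // videoURL is assumed
CMTime interval = CMTimeMake(1, 2); // 1/2 second

self.timeObserver =
    [player addPeriodicTimeObserverForInterval:interval
                                         queue:dispatch_get_main_queue()
                                    usingBlock:^(CMTime time) {
        // Runs during normal playback; keep the work lightweight.
        [self updateTimeDisplay:time]; // hypothetical UI update method
    }];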

Page 21: Learning AV Foundation

Static vs Dynamic Models
Playing Media

‣ AV Foundation distinguishes between static and dynamic aspects of media

Dynamic: AVPlayerItem, AVPlayerItemTrack
Static: AVAsset, AVAssetTrack

Page 22: Learning AV Foundation

Core Media Essentials
Understanding Time

‣ CMTime
‣ Rational number representing time
‣ 64-bit time value (numerator)
‣ 32-bit time scale (denominator)

‣ CMTimeRange
‣ Struct containing a start time and a duration

CMTime fiveSeconds = CMTimeMake(5, 1);
CMTime halfSecond = CMTimeMake(1, 2);
CMTime thirtyFPS = CMTimeMake(1, 30);
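CMTimeRange values build on the same helpers; a short sketch using the times above:

// A range is a start time plus a duration: here, seconds 5 through 10.
CMTimeRange range = CMTimeRangeMake(fiveSeconds, fiveSeconds);

// The same range built from explicit start and end times.
CMTimeRange sameRange = CMTimeRangeFromTimeToTime(fiveSeconds, CMTimeMake(10, 1));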

Page 23: Learning AV Foundation

AVPlayerLayer
Rendering Video Content

Diagram: an AVPlayer manages an AVPlayerItem, whose AVPlayerItemTracks are backed by the AVAssetTracks of an underlying AVAsset

Page 24: Learning AV Foundation

AVPlayerLayer
Rendering Video Content

Diagram: the AVPlayer's item and its AVPlayerItemTracks are rendered to the screen through an AVPlayerLayer created from the player
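Putting the pieces together, a minimal playback sketch, assuming a videoURL and a view controller's view to host the layer:

// Build the dynamic playback objects over an asset.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:videoURL]; // videoURL is assumed
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];

// Render the player's video output through an AVPlayerLayer.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];

// In practice, observe playerItem.status via KVO and call play
// once it reaches AVPlayerItemStatusReadyToPlay.
[player play];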

Page 25: Learning AV Foundation

Demo

Page 26: Learning AV Foundation

Media Capture

Page 27: Learning AV Foundation

Overview
Media Capture

‣ Image Capture
‣ Independent control of white balance, focus, exposure
‣ Ability to write EXIF metadata
‣ Uncompressed output

‣ Video Capture
‣ Configurable formats and resolution
‣ Ability to write video metadata

‣ Ability to access and process input data (see the sketch below)
‣ Pixel buffers containing still and video frames
‣ Audio sample buffers containing PCM data
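As an illustration of tapping frame data, a minimal sketch of AVCaptureVideoDataOutput wiring; the captureSession is set up as on the slides that follow, and the queue label is arbitrary:

// Deliver video frames to a delegate on a private serial queue.
AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t frameQueue = dispatch_queue_create("com.example.frames", NULL);
[dataOutput setSampleBufferDelegate:self queue:frameQueue];
[captureSession addOutput:dataOutput]; // captureSession set up as on the next slides

// AVCaptureVideoDataOutputSampleBufferDelegate callback, implemented in the class:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Each sample buffer wraps a pixel buffer containing one video frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process the frame here.
}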

Page 28: Learning AV Foundation

AVCaptureSession
Capture Sessions

Diagram: AVCaptureDevice inputs (camera and microphone) feed an AVCaptureSession, which routes media to outputs such as AVCaptureStillImageOutput, AVCaptureMovieFileOutput, AVCaptureAudioDataOutput, and AVCaptureVideoDataOutput, and to an AVCaptureVideoPreviewLayer for display

GPUImage from Brad Larson
https://github.com/BradLarson/GPUImage

Page 29: Learning AV Foundation

Example
Basic Capture

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;

if (device) {
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [captureSession addInput:input];
    }
}

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

if (audioDevice) {
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                        error:&error];
    if (input) {
        [captureSession addInput:input];
    }
}

Page 30: Learning AV Foundation

Example (Continued)Basic Capture

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];

NSURL *movieURL = ... // Write to URL in iOS Documents directory

[captureSession addOutput:movieOutput];

[captureSession startRunning];
[movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self];

// Record for a while...
// Stop the recording before tearing down the session.
[movieOutput stopRecording];
[captureSession stopRunning];

Page 31: Learning AV Foundation

Demo

Page 32: Learning AV Foundation

Composing Media

Page 33: Learning AV Foundation

AVComposition
Composing Assets

‣ Concrete extension of AVAsset
‣ Composes asset segments on a timeline

Page 34: Learning AV Foundation

Tracks and Segments
Composing Assets

AVComposition

AVMutableComposition *composition = [AVMutableComposition composition];

Page 35: Learning AV Foundation

AVComposition

Tracks and Segments
Composing Assets

CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;

AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:trackID];

AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:trackID];

AVCompositionTrack (Video)

AVCompositionTrack (Audio)

Page 36: Learning AV Foundation

AVComposition

Tracks and Segments
Composing Assets

AVCompositionTrack (Video)
‣ AVCompositionTrackSegment: seconds 10-30 of “redpanda.m4v”
‣ AVCompositionTrackSegment: seconds 20-60 of “waves.m4v”

AVCompositionTrack (Audio)
‣ AVCompositionTrackSegment: seconds 0-60 of “soundtrack.mp3”

AVAssetTrack *srcVideoTrack1 = ... // source video track 1
[videoTrack insertTimeRange:timeRange ofTrack:srcVideoTrack1 atTime:startTime error:&error];

AVAssetTrack *srcVideoTrack2 = ... // source video track 2
[videoTrack insertTimeRange:timeRange ofTrack:srcVideoTrack2 atTime:startTime error:&error];

AVAssetTrack *srcAudioTrack = ... // source audio track
[audioTrack insertTimeRange:timeRange ofTrack:srcAudioTrack atTime:startTime error:&error];

Page 37: Learning AV Foundation

Advanced Techniques
Powerful Editing

‣ Video Transitions
‣ AVVideoComposition used to describe transitions between video tracks
‣ Provides adjustment to opacity and transform

‣ Audio Mixing (see the sketch after this list)
‣ AVAudioMix provides dynamic volume control
‣ Used for crossfades, ducking, etc.

‣ Core Animation
‣ AVSynchronizedLayer
‣ AVVideoCompositionCoreAnimationTool
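A minimal sketch of dynamic volume control with AVAudioMix, assuming the audioTrack from the composition built on the previous slides and a playerItem as on the playback slides; the ramp times are illustrative:

// Build mix parameters for the composition's audio track and ramp the
// volume from full down to 20% over seconds 10-12 (e.g., for ducking).
AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

CMTimeRange rampRange = CMTimeRangeMake(CMTimeMake(10, 1), CMTimeMake(2, 1));
[params setVolumeRampFromStartVolume:1.0f
                         toEndVolume:0.2f
                           timeRange:rampRange];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[params];

// Apply during playback via AVPlayerItem's audioMix property
// (or to an export via AVAssetExportSession's audioMix property).
playerItem.audioMix = audioMix;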


Page 38: Learning AV Foundation

Demo

Page 39: Learning AV Foundation

Summary
AV Foundation Rocks!

‣ Extremely impressive and capable
‣ Challenging, but fun and rewarding

‣ Steep learning curve
‣ Large framework with broad set of features
‣ Requires understanding of advanced Objective-C
‣ Inadequate documentation

‣ Apple’s current and future media direction

Page 40: Learning AV Foundation

Resources

Presentation Materials
http://www.slideshare.net/bobmccune/
https://github.com/tapharmonic/AVFoundationDemos

WWDC 2011: Exploring AV Foundation
https://developer.apple.com/videos/wwdc/2011/?id=405

WWDC 2011: Working with Media in AV Foundation
https://developer.apple.com/videos/wwdc/2011/?id=415

WWDC 2011: Capturing from the Camera
https://developer.apple.com/videos/wwdc/2011/?id=419

BobMcCune.com @bobmccune