Audio & Video · 19.11.2010
Audio & Video · iOS App Development · Fall 2010, Lecture 19
Questions?
Announcements
• Assignment #5 — due Monday by 11:59pm
Today’s Topics
• Core Audio Components
• Playing Short Sounds
• Vibrating the Phone
• Playing Long Sounds
• Recording Audio
• Playing Video
Notes
• I’m showing the relevant portions of the view controller interfaces and implementations in these notes
• Remember to release relevant memory in the -dealloc methods — they are not shown here
• You will also need to wire up outlets and actions in IB
• Where delegates or data sources are used, they too require wiring in IB
Notes
• There’s not much (if any) error handling in the examples
• You should probably consult the documentation and look at Apple’s examples for appropriate handling techniques
Core Audio
Core Audio
• High level, easy to use
• System Sound API — short sounds
• AVAudioPlayer class — ObjC, simple API
• Lower level, takes more effort but much more control
• Audio Toolbox — recording and playback, streaming, full control
• Audio Units — processing audio
• OpenAL — 3D positional sound
• Which one you use depends on what you’re trying to do
OS Management
• The OS manages the sound system
• You can ask for behavior but the OS has control
• May preempt your audio if it sees fit
Converting Sounds
• OS X provides a command line utility called afconvert that allows you to convert between sound formats
• Supports wide variety of input and output formats
• Easily convert sounds to System Sounds formats
• Run with “-h” flag for help and usage
/usr/bin/afconvert -f aiff -d BEI16 input.mp3 output.aif
Playing Short Sounds
Playing Short Sounds
• We’re talking less than 30 seconds
• Intended for user-interface sound effects and user alerts
• It is not intended for sound effects in games
• Very simple C-based API — trades features for ease-of-use
• No looping
• No volume control
• Immediate playback
• Limited set of formats
• Linear PCM or IMA4
• In .caf, .aif or .wav file
Playing Short Sounds
• General process...
• Register the sound and get a “Sound ID”
• Play the sound — based on Sound ID
• Optionally register a callback for when sound has completed playing
• Dispose of Sound ID when finished
System Sounds Functions
• The following C function creates a system sound from the specified file URL and sets the sound ID (a handle to the sound) via its second argument (passed by reference)...
• To play the sound, call the play function with that ID...
• Then to release the sound, pass the sound ID into dispose...
OSStatus AudioServicesCreateSystemSoundID(CFURLRef inFileURL, SystemSoundID* outSystemSoundID);
void AudioServicesPlaySystemSound(SystemSoundID inSystemSoundID);
OSStatus AudioServicesDisposeSystemSoundID(SystemSoundID inSystemSoundID);
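The optional completion callback mentioned in the process slide isn't used in the example that follows; a minimal sketch of registering one (assuming `sound` is an already-registered SystemSoundID, as in the example below):

```objc
#import <AudioToolbox/AudioToolbox.h>

// Called on the specified run loop when the sound finishes playing
static void MySoundFinished(SystemSoundID soundID, void *clientData) {
    // e.g. stop an animation, or dispose of a one-shot sound here
}

// NULL run loop means the main run loop in its default mode
AudioServicesAddSystemSoundCompletion(sound, NULL, NULL,
                                      MySoundFinished, NULL);

// Later, when the callback is no longer needed:
AudioServicesRemoveSystemSoundCompletion(sound);
```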
ShortSoundViewController.xib
• In IB, we’ll create a button that allows us to play the sound
• Button wired up to play action
• We’ll also add and wire up an image view for dramatic effect
ShortSoundViewController.h
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
@interface ShortSoundViewController : UIViewController {
    SystemSoundID sound;
}
@property(nonatomic, retain) IBOutlet UIImageView *imageView;
- (IBAction)play;
@end
Need to add AudioToolbox
framework to project
ShortSoundViewController.m
#import "ShortSoundViewController.h"
@implementation ShortSoundViewController
@synthesize imageView;
- (void)loadSounds {
    CFBundleRef mainBundle = CFBundleGetMainBundle();

    // Get the URL to the sound file to play. The file in this case is "bell.wav"
    CFURLRef soundFileURLRef = CFBundleCopyResourceURL(mainBundle, CFSTR("bell"),
                                                       CFSTR("wav"), NULL);

    // Create a system sound object representing the sound file
    AudioServicesCreateSystemSoundID(soundFileURLRef, &sound);

    // The URL came from a Copy function, so we own it and must release it
    CFRelease(soundFileURLRef);
}
- (IBAction)play {
    AudioServicesPlaySystemSound(sound);
}
/* ... */
ShortSoundViewController.m
/* ... */
- (void)loadImages {
    UIImage *bell1 = [UIImage imageNamed:@"bell1"];
    UIImage *bell2 = [UIImage imageNamed:@"bell2"];
    UIImage *bell3 = [UIImage imageNamed:@"bell3"];
    UIImage *bell4 = [UIImage imageNamed:@"bell4"];
    self.imageView.animationDuration = .45;
    self.imageView.animationImages = [NSArray arrayWithObjects:
                                      bell1, bell2, bell3, bell4, nil];
    [self.imageView startAnimating];
}
- (void)viewDidLoad {
    [super viewDidLoad];
    [self loadSounds];
    [self loadImages];
}
- (void)dealloc {
    AudioServicesDisposeSystemSoundID(sound);
    self.imageView = nil;
    [super dealloc];
}
@end
The Resulting App
Vibrating the iPhone
Vibrating the iPhone
• The ability to vibrate an iPhone is provided by the system sound API
• iPod touches and iPads do not have vibration hardware, so playing the “vibration sound” on them has no effect
• If you choose to use vibration in your app — use it sparingly
• A good use might be in a game when your vehicle blows up
• Excessive use of vibration has been known to be grounds for App Store rejection
Vibrating the iPhone
• To vibrate the phone, you actually use a special predefined sound ID for vibration...
• So, to make the phone vibrate you simply call...
• You have no control over duration
kSystemSoundID_Vibrate
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
Simulator Support
• The simulator has no support for vibration
• Again, vibration is only available on iPhones and not iPod touches or iPads
Playing Longer Sounds
AVAudioPlayer
• Play longer sounds of any length
• Locally stored files or in-memory (no network streaming)
• Loop, seek, play, pause
• Provides metering capability
• Play multiple sounds simultaneously
• ObjC API
• Initialize with file URL or data
• AVAudioPlayerDelegate for various life-cycle events
• Supports many additional formats
Playing Longer Sounds
• General process...
• Create file URL
• Allocate an AVAudioPlayer instance
• Optionally set a delegate
• Prepare to play
• Play
• Release the instance when finished
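One capability from the list above that the example below doesn't exercise is looping; a hedged sketch using the numberOfLoops property, assuming a bundled file named loop.caf:

```objc
#import <AVFoundation/AVFoundation.h>

NSString *path = [[NSBundle mainBundle] pathForResource:@"loop" ofType:@"caf"];
AVAudioPlayer *loopPlayer = [[AVAudioPlayer alloc]
    initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
loopPlayer.numberOfLoops = -1;   // -1 loops indefinitely until stopped
[loopPlayer prepareToPlay];
[loopPlayer play];

// ... later, when the loop should end:
[loopPlayer stop];
[loopPlayer release];
```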
PlayerViewController.xib
• To demonstrate the metering capability we’re going to also use a pre-built metering UIView
• This example uses Chris Adamson’s LevelMeterView
• Licensed under an Apache 2.0 License
LevelMeterViews, one for each of the left and right channels
PlayerViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import "LevelMeterView.h"
@interface PlayerViewController : UIViewController <AVAudioPlayerDelegate> {
}
@property(nonatomic, retain) AVAudioPlayer *player;
@property(nonatomic, retain) IBOutlet UIButton *playOrPauseButton;
@property(nonatomic, retain) IBOutlet UILabel *durationLabel;
@property(nonatomic, retain) IBOutlet UILabel *currentPositionLabel;
@property(nonatomic, retain) IBOutlet UIProgressView *currentPositionProgress;
@property(nonatomic, retain) IBOutlet UISlider *volumeSlider;
@property(nonatomic, retain) IBOutlet LevelMeterView *leftMeter;
@property(nonatomic, retain) IBOutlet LevelMeterView *rightMeter;
- (IBAction)playOrPause;
- (IBAction)setVolume;
@end
Need to add AVFoundation framework to project
Implement delegate
PlayerViewController.m
#import "PlayerViewController.h"
@implementation PlayerViewController
@synthesize player, playOrPauseButton, durationLabel, currentPositionLabel;
@synthesize currentPositionProgress, volumeSlider, leftMeter, rightMeter;
- (NSString *)timeToString:(int)time {
    return [NSString stringWithFormat:@"%02d:%02d", time / 60, time % 60];
}
- (IBAction)playOrPause {
    if (self.player.playing) {
        [self.playOrPauseButton setImage:[UIImage imageNamed:@"play"]
                                forState:UIControlStateNormal];
        [self.player pause];
    } else {
        [self.playOrPauseButton setImage:[UIImage imageNamed:@"pause"]
                                forState:UIControlStateNormal];
        [self.player play];
    }
}
/* ... */
PlayerViewController.m
/* ... */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                       successfully:(BOOL)completed {
    if (completed) {
        [self.playOrPauseButton setImage:[UIImage imageNamed:@"play"]
                                forState:UIControlStateNormal];
    }
}
- (IBAction)setVolume {
    self.player.volume = self.volumeSlider.value;
}
- (void)updateDisplay {
    self.currentPositionProgress.progress = self.player.currentTime / self.player.duration;
    self.currentPositionLabel.text = [self timeToString:self.player.currentTime];
    [self.player updateMeters];
    [self.leftMeter setPower:[self.player averagePowerForChannel:0]
                        peak:[self.player peakPowerForChannel:0]];
    [self.rightMeter setPower:[self.player averagePowerForChannel:1]
                         peak:[self.player peakPowerForChannel:1]];
}
/* ... */
PlayerViewController.m
/* ... */
- (void)viewDidLoad {
    [super viewDidLoad];
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"sample"
                                                              ofType:@"mp3"];
    NSURL *fileURL = [[[NSURL alloc] initFileURLWithPath:soundFilePath] autorelease];
    self.player = [[[AVAudioPlayer alloc] initWithContentsOfURL:fileURL
                                                          error:nil] autorelease];
    [self.player prepareToPlay];
    [self.player setDelegate:self];
    self.player.volume = .5;
    self.player.meteringEnabled = YES;
    self.volumeSlider.value = self.player.volume;
    self.currentPositionLabel.text = @"0:00";
    self.durationLabel.text = [self timeToString:self.player.duration];
    // fire updates
    [self updateDisplay];
    [NSTimer scheduledTimerWithTimeInterval:.1
                                     target:self
                                   selector:@selector(updateDisplay)
                                   userInfo:nil
                                    repeats:YES];
}
/* ... */
@end
The Resulting App
Other Sound Playing APIs
Audio Toolbox
• Recording audio
• Audio Queue Services
• Create a queue
• Define a callback function to receive recorded audio data
• Start the queue
• Receive callbacks with recorded data, you have to store it
• Stop the queue
• See the “SpeakHere” example project in iPhone Dev Center for an example
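The steps above, sketched against the Audio Queue C API (an outline only, not production code; the format, buffer count, and buffer size are illustrative, and storing the recorded data is left as a comment):

```objc
#import <AudioToolbox/AudioToolbox.h>

// Callback invoked each time the queue has filled a buffer with recorded data
static void MyInputCallback(void *userData, AudioQueueRef queue,
                            AudioQueueBufferRef buffer,
                            const AudioTimeStamp *startTime,
                            UInt32 numPackets,
                            const AudioStreamPacketDescription *packetDescs) {
    // You must store buffer->mAudioData yourself (e.g. AudioFileWritePackets),
    // then hand the buffer back to the queue to be filled again
    AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

void StartRecording(void) {
    // 16-bit mono linear PCM at 44.1 kHz
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                             | kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 1;
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 2;
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 2;

    // Create the input (recording) queue
    AudioQueueRef queue;
    AudioQueueNewInput(&format, MyInputCallback, NULL, NULL, NULL, 0, &queue);

    // Enqueue a few buffers so the queue always has one ready to fill
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer;
        AudioQueueAllocateBuffer(queue, 8192, &buffer);
        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
    }

    AudioQueueStart(queue, NULL);
    // ... later: AudioQueueStop(queue, true); AudioQueueDispose(queue, true);
}
```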
Audio Units
• For serious audio processing
• Graph-based audio
• Rate conversion
• Audio effects
• Mixing multiple streams
• Incredibly powerful — same as on Mac OS X
OpenAL
• High level, cross-platform API for 3D audio mixing
• Great for games where you already are defining 3D space
• Mimics OpenGL conventions
• Models audio in 3D space
• Buffers — container for Audio
• Sources — 3D point emitting Audio
• Listener — position where Sources are heard
• For more information visit the OpenAL website
• http://www.openal.org/
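The three concepts above map onto API calls roughly as follows (a hedged sketch; real code would load PCM data for `pcmData`, check alGetError after each call, and clean up the source, buffer, context, and device):

```objc
#import <OpenAL/al.h>
#import <OpenAL/alc.h>

static void PlaySoundAt(const void *pcmData, ALsizei pcmSize) {
    // Device and context setup (required before any al* calls)
    ALCdevice  *device  = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // Buffer: container for audio (16-bit mono PCM at 44.1 kHz assumed)
    ALuint buffer;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmSize, 44100);

    // Source: a point in 3D space emitting the buffer's audio
    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f);

    // Listener: the position from which sources are heard
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

    alSourcePlay(source);
}
```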
Recording Audio
Recording Audio
• The AVAudioRecorder class provides basic audio recording capability in an ObjC API
• Class also has an AVAudioRecorderDelegate to provide callbacks for certain events (e.g. finished recording)
AVAudioRecorder Capabilities
• Record until the user stops the recording
• Record for a specified duration
• Pause and resume a recording
• Obtain input audio-level data that you can use to provide level metering
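Two of the capabilities above that the example below doesn't use, timed recording and level metering, look roughly like this (a sketch, assuming `recorder` is a configured AVAudioRecorder):

```objc
// Record for a fixed duration; the delegate's
// -audioRecorder:didFinishRecording... callback fires when time is up
[recorder recordForDuration:10.0];

// Level metering: enable it, then poll (e.g. from a repeating NSTimer)
recorder.meteringEnabled = YES;
[recorder updateMeters];
float avg  = [recorder averagePowerForChannel:0];  // in dB, 0 = full scale
float peak = [recorder peakPowerForChannel:0];
```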
RecordViewController.xib
• A large image view and two buttons
• Each button has a different normal and selected image
• Each button is wired to an outlet and to an action
RecordViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface RecordViewController : UIViewController <AVAudioRecorderDelegate,
                                                    AVAudioSessionDelegate,
                                                    AVAudioPlayerDelegate> {
}
@property(nonatomic, retain) AVAudioRecorder *audioRecorder;
@property(nonatomic, retain) AVAudioPlayer *audioPlayer;
@property(nonatomic, retain) IBOutlet UIButton *recordButton;
@property(nonatomic, retain) IBOutlet UIButton *playButton;
- (IBAction)toggleRecord;
- (IBAction)togglePlay;
@end
AVFoundation again
RecordViewController.m
#import "RecordViewController.h"
#import <CoreAudio/CoreAudioTypes.h>
@implementation RecordViewController
@synthesize audioPlayer, audioRecorder;
@synthesize recordButton, playButton;
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                       successfully:(BOOL)completed {
    if (completed) {
        self.playButton.selected = NO;
    }
}
- (NSURL *)getSoundURL {
    NSArray *segments = [NSArray arrayWithObjects:NSHomeDirectory(),
                         @"Documents", @"recording.caf", nil];
    NSString *soundFilePath = [NSString pathWithComponents:segments];
    return [[[NSURL alloc] initFileURLWithPath:soundFilePath] autorelease];
}
/* ... */
RecordViewController.m
/* ... */
- (void)setupAudioSession {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    audioSession.delegate = self;
    // Set the category before activating the session
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
    [audioSession setActive:YES error:NULL];
}
- (void)tearDownAudioSession {
    [[AVAudioSession sharedInstance] setActive:NO error:NULL];
}
/* ... */
RecordViewController.m
/* ... */
- (void)setupRecorder {
    [self setupAudioSession];
    // allocate a recorder for use
    NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
        [NSNumber numberWithInt:AVAudioQualityMax], AVEncoderAudioQualityKey,
        nil];
    // autorelease balances the retain performed by the property setter
    self.audioRecorder = [[[AVAudioRecorder alloc] initWithURL:[self getSoundURL]
                                                      settings:recordSettings
                                                         error:NULL] autorelease];
    self.audioRecorder.delegate = self;
    [self.audioRecorder prepareToRecord];
}
/* ... */
RecordViewController.m
/* ... */
- (IBAction)toggleRecord {
    if (self.recordButton.selected) {
        // stop the recorder before deactivating the session
        [self.audioRecorder stop];
        [self tearDownAudioSession];
    } else {
        [self setupAudioSession];
        [self.audioRecorder record];
    }
    self.recordButton.selected = !self.recordButton.selected;
}
/* ... */
RecordViewController.m
/* ... */
- (IBAction)togglePlay {
    if (self.playButton.selected) {
        // stop the player before deactivating the session
        [self.audioPlayer stop];
        [self tearDownAudioSession];
    } else {
        [self setupAudioSession];
        self.audioPlayer = [[[AVAudioPlayer alloc]
            initWithContentsOfURL:[self getSoundURL] error:NULL] autorelease];
        self.audioPlayer.delegate = self;
        [self.audioPlayer prepareToPlay];
        [self.audioPlayer play];
    }
    self.playButton.selected = !self.playButton.selected;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupRecorder];
}
/* ... */
@end
The Resulting App
Playing Video
MPMoviePlayerController
• The MPMoviePlayerController class provides several different video playback options
• Playback styles — full screen or render in a view
• Video source
• Can play back movies stored in your application’s bundle, Documents, or temp directories
• Can also play movies and audio files loaded from a network-based URL
• Note: prior to iOS 3.2 only full-screen playback was supported
MPMoviePlayerController
• Has many properties/methods that allow you to change the appearance and behavior of the movie playback
• A sampling of some of the more common ones is shown below...
- (NSURL *)contentURL;
- (void)setContentURL:(NSURL *)contentURL;
// The style of the playback controls.
// Defaults to MPMovieControlStyleDefault.
@property(nonatomic) MPMovieControlStyle controlStyle;
// Determines how the movie player repeats when reaching
// the end of playback. Defaults to MPMovieRepeatModeNone.
@property(nonatomic) MPMovieRepeatMode repeatMode;
// Determines if the movie is presented in the entire screen
// (obscuring all other application content). Default is NO.
// Setting this property to YES before the movie player's view
// is visible will have no effect.
@property(nonatomic, getter=isFullscreen) BOOL fullscreen;
- (void)setFullscreen:(BOOL)fullscreen animated:(BOOL)animated;
MPMovieControlStyle
• Different playback controls...
enum {
    MPMovieControlStyleNone,        // No controls
    MPMovieControlStyleEmbedded,    // Controls for an embedded view
    MPMovieControlStyleFullscreen,  // Controls for fullscreen playback
    MPMovieControlStyleDefault = MPMovieControlStyleEmbedded
};
typedef NSInteger MPMovieControlStyle;
MPMediaPlayback
• MPMoviePlayerController adopts the MPMediaPlayback protocol
@protocol MPMediaPlayback
- (void)prepareToPlay;
@property(nonatomic, readonly) BOOL isPreparedToPlay;
- (void)play;
- (void)pause;
- (void)stop;
@property(nonatomic) NSTimeInterval currentPlaybackTime;
@property(nonatomic) float currentPlaybackRate;
- (void)beginSeekingForward;
- (void)beginSeekingBackward;
- (void)endSeeking;
@end
Examples
• Let’s look at 2 different use cases...
• A short in-app video — applicable for...
• In-app help
• Cut-scene sequences in games
• Loading fullscreen video from a website — applicable for...
• Content changes regularly
• Too large to reasonably bundle in app
• Live streaming
In-App Video
In-App Video
• Let’s simulate a cut-scene sequence in a game...
• Turn down lights (fade to black)
• Load content from a local source
• Play in-line within application
• When done playing, turn up lights
• This in-app video will have no user facing playback controls
VideoViewController.xib
• Fairly simple NIB
• A button wired up to an outlet
• Will call an action method
VideoViewController.h
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
@interface VideoViewController : UIViewController {
    int level;
}
@property(nonatomic, retain) MPMoviePlayerController *player;
@property(nonatomic, retain) IBOutlet UIButton *button;
@property(nonatomic, retain) UIView *overlayView;
- (IBAction)playMovie;
@end
Need to add MediaPlayer Framework
VideoViewController.m
#import "VideoViewController.h"
@implementation VideoViewController
@synthesize player, button, overlayView;
- (IBAction)playMovie {
    // if first time through, set everything up
    if (!self.overlayView) {
        // create and configure view
        self.overlayView = [[[UIView alloc] initWithFrame:self.view.bounds] autorelease];
        self.overlayView.backgroundColor = [UIColor clearColor];

        /* ... */
Programmatically create a clear UIView the same size as the view controller’s view
VideoViewController.m
/* ... */
        // prep video
        NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"big-buck-bunny"
                                                              ofType:@"m4v"];
        NSURL *videoURL = [NSURL fileURLWithPath:videoPath];
        self.player = [[[MPMoviePlayerController alloc] initWithContentURL:videoURL] autorelease];
        self.player.controlStyle = MPMovieControlStyleNone;
        [[self.player view] setFrame:[self.overlayView bounds]];

        // register for notification when movie is done playing
        // (use the framework constant rather than a string literal)
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playbackFinished:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:nil];

        // speed up movie - useful for debugging
        // self.player.currentPlaybackRate = 2.0;
    }
/* ... */
Create the video player
with no controls & add to overlay
Use the notification center to get a notification when playback has completed
VideoViewController.m
/* ... */
    // add overlay view
    [self.view addSubview:self.overlayView];

    // animate in then start playing video
    [UIView animateWithDuration:2.0
                     animations:^{
                         self.overlayView.backgroundColor = [UIColor blackColor];
                     }
                     completion:^(BOOL finished) {
                         [self.overlayView addSubview:[self.player view]];
                         [self.player play];
                     }];
}
/* ... */
Add clear overlay to view, fade to black, then add player to overlay and start playback
VideoViewController.m
/* ... */
- (void)playbackFinished:(NSNotification *)notification {
    // update background scene
    NSString *buttonTitle = [NSString stringWithFormat:@"Go onto level %d", ++level + 1];
    [self.button setTitle:buttonTitle forState:UIControlStateNormal];

    // remove player from overlay
    [[self.player view] removeFromSuperview];

    // animate out then remove overlay
    [UIView animateWithDuration:2.0
                     animations:^{
                         self.overlayView.backgroundColor = [UIColor clearColor];
                     }
                     completion:^(BOOL finished) {
                         [self.overlayView removeFromSuperview];
                     }];
}
/* ... */
@end
When playback has finished, remove player from view, fade up and remove overlay
The Resulting App
Full-Screen Video
Full-Screen Video
• You can also easily do full-screen video from within your app
• You could do this by packing an MPMoviePlayerController view into a view controller that takes up the whole screen and pushing it onto the view controller stack
• It turns out that this is a common enough operation that Apple provides a view controller class to do just that...
MPMoviePlayerViewController
• MPMoviePlayerViewController class wraps up a player into a view controller
• Can access the underlying player via a property...
• Method to init with a given video...
• Also adds the following two methods to UIViewController (via a category)...
@property(nonatomic, readonly) MPMoviePlayerController *moviePlayer;
- (id)initWithContentURL:(NSURL *)contentURL;
- (void)presentMoviePlayerViewControllerAnimated:(MPMoviePlayerViewController *)moviePlayerViewController;
- (void)dismissMoviePlayerViewControllerAnimated;
Full Screen Video App
• Let’s simulate being able to browse a list of videos online and selecting one for viewing
• Table view with selections
• Tap to play
• Video will play fullscreen
• When done playing, return to list
RootViewController.h
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
@interface RootViewController : UITableViewController {
    NSArray *videos;
}
@property(nonatomic, retain) MPMoviePlayerViewController *playerController;
@end
RootViewController.m
#import "RootViewController.h"
@implementation RootViewController
@synthesize playerController;
#pragma mark -
#pragma mark View lifecycle
- (void)viewDidLoad {
    self.title = @"Videos";

    // might load data from some other online resource
    // here we'll just load from the local file system
    NSString *plist = [[NSBundle mainBundle] pathForResource:@"videos"
                                                      ofType:@"plist"];
    videos = [[NSArray alloc] initWithContentsOfFile:plist];

    // the player view controller is created on demand
    // in -tableView:didSelectRowAtIndexPath:
}
/* ... */
RootViewController.m
/* ... */
#pragma mark -
#pragma mark Table view data source
// Customize the number of sections in the table view.
- (NSInteger)numberOfSectionsInTableView:(UITableView *)tableView {
    return 1;
}
// Customize the number of rows in the table view.
- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
    return [videos count];
}
/* ... */
RootViewController.m
/* ... */
// Customize the appearance of table view cells.
- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *CellIdentifier = @"Cell";

    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                       reuseIdentifier:CellIdentifier] autorelease];
        cell.accessoryType = UITableViewCellAccessoryDisclosureIndicator;
    }

    // Configure the cell.
    NSDictionary *video = [videos objectAtIndex:indexPath.row];
    cell.textLabel.text = [video objectForKey:@"title"];
    NSURL *url = [NSURL URLWithString:[video objectForKey:@"image"]];
    // Note: this synchronous fetch blocks the main thread - fine for a demo only
    NSData *data = [NSData dataWithContentsOfURL:url];
    cell.imageView.image = [UIImage imageWithData:data];
    return cell;
}
/* ... */
RootViewController.m
/* ... */
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
    NSString *urlString = [[videos objectAtIndex:indexPath.row] objectForKey:@"video"];
    NSURL *url = [NSURL URLWithString:urlString];
    self.playerController = [[[MPMoviePlayerViewController alloc]
                              initWithContentURL:url] autorelease];
    [self presentMoviePlayerViewControllerAnimated:self.playerController];
}
/* ... */
@end
The Resulting App
Additional Resources
• Audio Session Programming Guide
• http://developer.apple.com/library/ios/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/
• Core Audio Overview
• http://developer.apple.com/library/ios/documentation/MusicAudio/Conceptual/CoreAudioOverview/
• Audio Queue Services Programming Guide
• http://developer.apple.com/library/ios/documentation/MusicAudio/Conceptual/AudioQueueProgrammingGuide/
• Audio & Video Coding How-To's
• http://developer.apple.com/library/ios/codinghowtos/AudioAndVideo/
For Next Class
• UIWebView Class Reference
• http://developer.apple.com/library/ios/documentation/UIKit/Reference/UIWebView_Class/
• MFMailComposeViewController Class Reference
• http://developer.apple.com/library/ios/documentation/MessageUI/Reference/MFMailComposeViewController_class/
• Concurrency Programming Guide
• http://developer.apple.com/library/ios/documentation/General/Conceptual/ConcurrencyProgrammingGuide/
For Next Class
• CFNetwork Programming Guide
• http://developer.apple.com/library/ios/DOCUMENTATION/Networking/Conceptual/CFNetwork/
• cocoahttpserver on Google Code
• http://code.google.com/p/cocoahttpserver/
• iOS Man pages (for BSD sockets)
• http://developer.apple.com/library/ios/documentation/System/Conceptual/ManPages_iPhoneOS/