When developing the WeChat-style short-video effect, the picker that the system framework provides is not enough: it is a navigation controller, while what we need is a plain view that can sit behind a table view.
So we have to build a custom capture interface ourselves.
The article below is excerpted from http://course.gdou.com/blog/Blog.pzs/archive/2011/12/14/10882.html
Reading through it should be a great help.
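To put the live camera image inside an ordinary view in the first place, AVFoundation offers AVCaptureVideoPreviewLayer. Below is a minimal sketch of ours, not part of the original article; captureSession and previewView are assumed to already exist under those (hypothetical) names:

// Attach a live camera preview to an ordinary UIView instead of using the picker.
// captureSession is an already configured AVCaptureSession, previewView any UIView.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = previewView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the view, cropping the edges
[previewView.layer addSublayer:previewLayer];

Because the preview is just a CALayer, the view that hosts it can be placed behind a table view exactly as described above.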
During video capture there are input devices and output devices, and the program uses an instance of AVCaptureSession to coordinate and organize the flow of data between them.
At a minimum, a program needs:
●An instance of AVCaptureDevice to represent the input device, such as a camera or microphone
●An instance of a concrete subclass of AVCaptureInput to configure the ports from the input device
●An instance of a concrete subclass of AVCaptureOutput to manage the output to a movie file or still image
●An instance of AVCaptureSession to coordinate the data flow from the input to the output
As the diagram in the original article shows, one AVCaptureSession can coordinate several input devices and output devices. Inputs and outputs are added to the session with AVCaptureSession's addInput: and addOutput: methods.
The link between a capture input and a capture output is represented by an AVCaptureConnection object. A capture input (AVCaptureInput) has one or more input ports (AVCaptureInputPort instances), and a capture output (an AVCaptureOutput instance) can accept data from one or more input sources.
When an input or an output is added to an AVCaptureSession, the session "greedily" forms connections (AVCaptureConnection) between all compatible input ports and outputs, so normally you do not need to create the connections between inputs and outputs by hand.
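As a small illustration of these automatic connections (a sketch of ours, not from the original article): after addInput:/addOutput: you can simply look the connection up on the output, for example with connectionWithMediaType: (available since iOS 5), rather than building one yourself. movieOutput is a hypothetical AVCaptureMovieFileOutput that has already been added to the session:

// The session created this connection itself when the input and output were added;
// here we only look it up and check that it is live.
AVCaptureConnection *videoConnection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (videoConnection.active) {
    NSLog(@"video connection was established automatically");
}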
The input device:
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
The media types are:
AVMediaTypeVideo
AVMediaTypeAudio
The output classes are:
AVCaptureMovieFileOutput: writes the output to a movie file
AVCaptureVideoDataOutput: can be used to process the captured video frames
AVCaptureAudioDataOutput: can be used to process the captured audio data
AVCaptureStillImageOutput: can be used to capture still images together with their metadata
Creating the output object:
AVCaptureMovieFileOutput *captureOutput = [[AVCaptureMovieFileOutput alloc] init];
1. Capturing to a movie file
In this case the output object is AVCaptureMovieFileOutput. Two methods of its superclass AVCaptureFileOutput start and stop the encoded output:
- (void)startRecordingToOutputFileURL:(NSURL *)outputFileURL recordingDelegate:(id<AVCaptureFileOutputRecordingDelegate>)delegate
- (void)stopRecording
When the program starts the encoded output it should start the AVCaptureSession first and only then call the method above. The whole procedure:
Create the input devices, the output device and the AVCaptureSession object:
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
AVCaptureDeviceInput *microphone = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:nil];
/* We set up the output */
captureOutput = [[AVCaptureMovieFileOutput alloc] init];
self.captureSession = [[AVCaptureSession alloc] init];
Add the input and output devices:
[self.captureSession addInput:captureInput];
[self.captureSession addInput:microphone];
[self.captureSession addOutput:self.captureOutput];
Set the session properties:
/* We use medium quality; on the iPhone 4 this demo would otherwise lag too much, because the conversion to UIImage and CGImage demands too many resources at a 720p resolution. */
[self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
Other presets include AVCaptureSessionPresetHigh, AVCaptureSessionPresetLow, AVCaptureSessionPresetPhoto, AVCaptureSessionPreset640x480 and AVCaptureSessionPreset1280x720; which of them a particular device supports varies by model (the original article lists both in tables that are not reproduced here).
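Because preset support differs between devices, it is safer to ask the session before applying one. A small sketch of ours, not from the original article:

// Fall back to a lower preset when the requested one is not supported on this device.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    [self.captureSession setSessionPreset:AVCaptureSessionPreset1280x720];
} else {
    [self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
}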
Start encoding; the video is encoded as H.264 and the audio as AAC:
[self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
[self.captureSession startRunning];
- (void)startRecording
{
    [captureOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
}
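[self tempFileURL] above is a helper that the original excerpt never shows. A minimal sketch of what such a helper might look like; the file name output.mov and the use of the temporary directory are assumptions of ours:

// Build a URL in the temporary directory for the movie file and remove any stale
// file first, because startRecordingToOutputFileURL: fails if the file already exists.
- (NSURL *)tempFileURL
{
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }
    return [NSURL fileURLWithPath:path];
}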
The class that handles the recording events must conform to the AVCaptureFileOutputRecordingDelegate protocol and does its work in these two methods:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"start record video");
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Copy the movie file from the temporary directory into the photo library so it can be accessed later
    [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                completionBlock:^(NSURL *assetURL, NSError *error) {
                                    if (error) {
                                        _myLabel.text = @"Error";
                                    }
                                    else
                                        _myLabel.text = [assetURL path];
                                }];
    [library release];
}
Recording is stopped with AVCaptureFileOutput's stopRecording method.
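For a WeChat-style short video, recording would typically be stopped after a fixed duration or when the user lifts a finger. A small sketch of ours; the method name and its placement are assumptions:

// Stop writing the movie file, then stop the whole session.
// Stopping the output triggers ...didFinishRecordingToOutputFileAtURL:... above.
- (void)stopRecording
{
    [captureOutput stopRecording];
    [self.captureSession stopRunning];
}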
二篱蝇、捕獲用于處理視頻幀
三贺待、捕獲為靜止圖像
此時(shí)輸出設(shè)備對(duì)象為:AVCaptureStillImageOutput,session 的預(yù)置(preset)信息決定圖像分辨率:
圖像格式:
例:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
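Which formats are actually available can also be queried at run time. A small sketch of ours, not from the original article:

// The codec and pixel formats this output supports on the current device.
NSLog(@"codecs: %@", stillImageOutput.availableImageDataCodecTypes);
NSLog(@"pixel formats: %@", stillImageOutput.availableImageDataCVPixelFormatTypes);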
When you want to capture a still image, send the output object a captureStillImageAsynchronouslyFromConnection:completionHandler: message. The first argument is the connection (AVCaptureConnection) from which to capture; you must find the connection that has a video input port:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}
The second argument is a block with two parameters. The first parameter is a CMSampleBuffer containing the image data, which you can use to process the image (stillImageConnection below is the video connection found above):
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                      completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
        if (error) {
            // error handling
        }
    };
    if (imageDataSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        // Save the image to the Photos library
        [library writeImageToSavedPhotosAlbum:[image CGImage]
                                  orientation:(ALAssetOrientation)[image imageOrientation]
                              completionBlock:completionBlock];
        [image release];
        [library release];
    }
    else
        completionBlock(nil, error);

    if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
        [[self delegate] captureManagerStillImageCaptured:self];
    }
}];