Preface
Starting with this article, we will gradually work through iOS's built-in multimedia frameworks, such as AVFoundation, VideoToolbox, CoreMedia, and CoreVideo, to implement multimedia processing, comparing both the implementation and the results with the ffmpeg approach.

Apps commonly need to record a piece of audio or video, take a photo, and so on. AVFoundation provides the interfaces for these needs: through them we can obtain uncompressed audio/video data in a specified format from the device, then compress it and save it to a local file or transmit it over the network.
Goals of this article:
1. Get familiar with the AVFoundation photo-capture interfaces.
2. Compress the captured audio and video and save them to an MP4 file in a simpler way.
Capture flow
The figure above shows the relationships between the capture-related objects in the AVFoundation framework; each object is explained below.

To enable audio/video capture, you must add the NSMicrophoneUsageDescription (microphone) and NSCameraUsageDescription (camera) usage keys to the project's Info.plist.
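For reference, the two Info.plist entries look like the fragment below (the description strings are placeholders; use your own user-facing text):

```xml
<key>NSCameraUsageDescription</key>
<string>The app uses the camera to record video</string>
<key>NSMicrophoneUsageDescription</key>
<string>The app uses the microphone to record audio</string>
```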
Related objects and functions
- 1. AVCaptureSession
The capture session object, which manages starting and stopping capture and so on. One end connects to input device objects such as the microphone and camera and receives the audio/video data they provide; the other end connects to audio and video output objects, through which it delivers data in the requested format to the outside world.

- 2. AVCaptureDevice
Represents a physical capture device for audio or video, e.g. the microphone or the front/back camera.

- 3. AVCaptureDeviceInput
The capture input object, a concrete subclass of AVCaptureInput. Once connected to an AVCaptureSession it supplies audio or video data to the session. It is added with the following method:
-(void)addInput:(AVCaptureInput *)input;

- 4. AVCaptureVideoDataOutput
The video output object. After being added to an AVCaptureSession it delivers captured video data to the outside world; it is also where the format of that data is configured (including the pixel format, e.g. RGB or YUV).

- 5. AVCaptureAudioDataOutput
The audio output object. After being added to an AVCaptureSession it delivers captured audio data. It offers few audio-format settings; for finer-grained control over the captured audio format, use the AudioUnit framework instead (see my earlier article AudioUnit錄制音頻+耳返(四)).

- 6. AVCaptureStillImageOutput
The raw-photo output object; likewise it must be added to an AVCaptureSession before it can deliver captured photos.

AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, AVCaptureStillImageOutput, and AVCaptureMovieFileOutput are all concrete subclasses of AVCaptureOutput. They are added to the AVCaptureSession with -(void)addOutput:(AVCaptureOutput *)output, after which they deliver data to the outside world.

- 7. CMSampleBufferRef
A structure representing a unit of captured audio or video data together with its parameters: for audio, the encoding, sample rate, sample format, channel layout, channel count, and so on; for video, the encoding, width and height, color standard (BT.601/BT.709), and pixel format (YUV or RGB).

- 8. -(void)startRunning; and -(void)stopRunning;
Start and stop capture, respectively; they are normally called in pairs. Note: startRunning can take around one second and blocks the calling thread, so take care not to call it on the main thread.
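As a sketch of the kind of information a CMSampleBufferRef carries, the format description of a buffer delivered to a data-output delegate callback can be inspected as below (`sampleBuffer` is assumed to come from a captureOutput:didOutputSampleBuffer:fromConnection: callback; error handling omitted):

```objectivec
#import <CoreMedia/CoreMedia.h>

// sketch: inspect the format of a captured sample buffer
CMFormatDescriptionRef desc = CMSampleBufferGetFormatDescription(sampleBuffer);
if (CMFormatDescriptionGetMediaType(desc) == kCMMediaType_Video) {
    // width/height of the captured video frames
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(desc);
    NSLog(@"video %dx%d", dims.width, dims.height);
} else if (CMFormatDescriptionGetMediaType(desc) == kCMMediaType_Audio) {
    // sample rate and channel count of the captured audio
    const AudioStreamBasicDescription *asbd =
        CMAudioFormatDescriptionGetStreamBasicDescription(desc);
    NSLog(@"audio %.0f Hz, %u channels",
          asbd->mSampleRate, (unsigned)asbd->mChannelsPerFrame);
}
```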
Implementation code
The code below captures audio and 640x480 video, encodes the video as H.264 and the audio as AAC, and saves the result to a MOV file.
#import <UIKit/UIKit.h>
@interface AVCapturePriviewer : UIView
/** Live preview of the video captured via AVFoundation, plus high-resolution photo capture.
 * The captured audio/video is saved to a MOV file.
 *
 * Note: AVCaptureMovieFileOutput can only save to the MOV container format.
 */
- (void)startCaptureMovieDst:(NSURL*)moveURL;
@end
#import "AVCapturePriviewer.h"
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>
@interface AVCapturePriviewer()<AVCaptureFileOutputRecordingDelegate,AVCapturePhotoCaptureDelegate>
{
// capture session
AVCaptureSession *captureSession;
// capture work queue; starting capture, switching cameras, etc. take some time,
// so this work is done on a background queue
dispatch_queue_t sessionQueue;
AVCaptureDeviceInput *videoInput;
AVCaptureDeviceInput *audioInput;
AVCaptureMovieFileOutput *fileOutput;
AVCapturePhotoOutput *stillOutput;
UIButton *stillImage;
UIButton *record;
UIButton *change;
NSURL *dstURL;
UIBackgroundTaskIdentifier taskId;
}
@end
@implementation AVCapturePriviewer
- (void)startCaptureMovieDst:(NSURL*)moveURL
{
taskId = UIBackgroundTaskInvalid;
dstURL = moveURL;
stillImage = [UIButton buttonWithType:UIButtonTypeSystem];
stillImage.frame = CGRectMake(self.bounds.size.width - 80, 0, 80, 30);
[stillImage setTitle:@"stillImage" forState:UIControlStateNormal];
[stillImage addTarget:self action:@selector(onTapButton:) forControlEvents:UIControlEventTouchUpInside];
[self addSubview:stillImage];
record = [UIButton buttonWithType:UIButtonTypeSystem];
record.frame = CGRectMake(self.bounds.size.width - 80, 40, 80, 30);
[record setTitle:@"record" forState:UIControlStateNormal];
[record addTarget:self action:@selector(onTapButton:) forControlEvents:UIControlEventTouchUpInside];
[self addSubview:record];
change = [UIButton buttonWithType:UIButtonTypeSystem];
change.frame = CGRectMake(self.bounds.size.width - 80, 80, 80, 30);
[change setTitle:@"change" forState:UIControlStateNormal];
[change addTarget:self action:@selector(onTapButton:) forControlEvents:UIControlEventTouchUpInside];
[self addSubview:change];
/** Summary of camera and microphone permissions:
 * 1. To use the camera and microphone you must first add the NSMicrophoneUsageDescription
 *    and NSCameraUsageDescription keys to the Info.plist file.
 * 2. On first launch, initializing an AVCaptureDeviceInput pops up the camera-permission
 *    dialog; if the user denies permission, or granted it and later revoked it in Settings,
 *    the initializer returns nil.
 * 3. When the user deletes the app, the granted permissions are removed along with it.
 * 4. So the correct approach is the code below: initialize according to the current
 *    authorization status.
 */
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
case AVAuthorizationStatusAuthorized:
{
NSLog(@"camera permission granted");
[self requestMicroAuthorized];
}
break;
case AVAuthorizationStatusNotDetermined:
{
NSLog(@"first launch: requesting camera permission");
// This method is asynchronous; the callback runs after the user makes a choice. If the app
// has never requested the permission, calling it pops up a dialog asking the user to grant
// it. If the user has already granted or denied it, calling this method has no effect.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
NSLog(@"camera permission choice: %d", granted);
if (granted) {
    // the completion handler may run on an arbitrary queue; requestMicroAuthorized can
    // end up calling setupCaptureSession, which touches UIKit, so hop to the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        [self requestMicroAuthorized];
    });
}
}];
}
break;
default:
{
NSLog(@"user denied camera permission; show a custom dialog directing them to Settings");
// calling requestAccessForMediaType: here has no effect
// [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
// if (!granted) {
// NSLog(@"user denied camera permission");
// }
//
// }];
}
break;
}
NSLog(@"done");
}
- (void)requestMicroAuthorized
{
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio]) {
case AVAuthorizationStatusAuthorized:
{
NSLog(@"microphone permission granted");
[self setupCaptureSession];
}
break;
case AVAuthorizationStatusNotDetermined:
{
NSLog(@"first launch: microphone permission not yet requested");
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
    if (granted) {
        // the completion handler may run on an arbitrary queue; setupCaptureSession
        // touches UIKit, so dispatch back to the main queue
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setupCaptureSession];
        });
    }
    NSLog(@"microphone permission %d", granted);
}];
}
break;
default:
{
NSLog(@"no microphone permission");
}
break;
}
}
- (void)setupCaptureSession
{
// create the work queue
sessionQueue = dispatch_queue_create("CaputureQueue", DISPATCH_QUEUE_SERIAL);
// create the session
captureSession = [[AVCaptureSession alloc] init];
// sessionPreset determines the resolution of the captured video
captureSession.sessionPreset = AVCaptureSessionPreset640x480;
AVCaptureVideoPreviewLayer *videoLayer = [[AVCaptureVideoPreviewLayer alloc] init];
videoLayer.frame = self.layer.bounds;
videoLayer.session = captureSession;
[self.layer addSublayer:videoLayer];
dispatch_async(sessionQueue, ^{
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
// 1. add video capture
// On first launch this initializer pops up the camera-permission dialog; if the user denies
// permission, or granted it and later revoked it in Settings, the initializer returns nil
self->videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoDevice error:nil];
[self->captureSession addInput:self->videoInput];
/** AVCaptureConnection
 * AVCaptureSession actually builds the link between AVCaptureInput and AVCaptureOutput
 * through this object; calling addInput: and addOutput: automatically creates an
 * AVCaptureConnection. AVCaptureVideoPreviewLayer also contains an AVCaptureOutput
 * internally, so setting videoLayer.session implicitly calls addOutput: as well.
 *
 * It represents a stream; through it you can set the output video orientation,
 * enable video stabilization, and other properties.
 */
videoLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
// NSLog(@"connections %@",captureSession.connections);
// 2. add audio capture
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInMicrophone mediaType:AVMediaTypeAudio position:AVCaptureDevicePositionUnspecified];
// On first launch this initializer pops up the microphone-permission dialog; if the user
// denies permission, or granted it and later revoked it in Settings, it returns nil
self->audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
[self->captureSession addInput:self->audioInput];
/** AVCaptureMovieFileOutput
 * Automatically compresses the captured audio/video and saves it to the given file.
 * Its container format can only be MOV; the codec, color format, audio parameters, etc.
 * all use default values.
 *
 * AVCaptureAudioFileOutput works the same way.
 */
// 3. save the captured audio/video to a file
self->fileOutput = [[AVCaptureMovieFileOutput alloc] init];
[self->captureSession addOutput:self->fileOutput];
// the AVCaptureConnection is only created after addOutput: has been called
AVCaptureConnection *videoConn = [self->fileOutput connectionWithMediaType:AVMediaTypeVideo];
NSDictionary *videoSettings = @{
AVVideoCodecKey:AVVideoCodecH264
};
// set the video codec; each device has a default, which may differ (e.g. HEVC on iPhone X)
[self->fileOutput setOutputSettings:videoSettings forConnection:videoConn];
// 4. add the photo output object
// note: a capture session can hold several input and output objects at once
AVCapturePhotoOutput *stillOut = [[AVCapturePhotoOutput alloc] init];
[self->captureSession addOutput:stillOut];
self->stillOutput = stillOut;
// start the capture session
[self->captureSession startRunning];
});
}
- (void)onTapButton:(UIButton*)btn
{
if (btn == change) {
[self changeCamera];
} else if(btn == record) {
[self recordStartOrStop:!fileOutput.recording];
} else if(btn == stillImage) {
[self takeStillImage];
}
}
// switch cameras
- (void)changeCamera
{
dispatch_async(sessionQueue, ^{
AVCaptureDevicePosition curPosition = [self->videoInput.device position];
if (curPosition == AVCaptureDevicePositionBack) {
    curPosition = AVCaptureDevicePositionFront;
} else {
    curPosition = AVCaptureDevicePositionBack;
}
// To switch cameras, just remove the old AVCaptureInput and add a new AVCaptureDeviceInput;
// the change only needs to be wrapped in beginConfiguration/commitConfiguration
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:curPosition];
AVCaptureDeviceInput *newInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:nil];
[self->captureSession beginConfiguration];
// remove the old input first, then add the new one; if adding fails, restore the old input
[self->captureSession removeInput:self->videoInput];
if ([self->captureSession canAddInput:newInput]) {
[self->captureSession addInput:newInput];
self->videoInput = newInput;
} else {
[self->captureSession addInput:self->videoInput];
}
[self->captureSession commitConfiguration];
});
}
- (void)recordStartOrStop:(BOOL)start
{
dispatch_async(sessionQueue, ^{
if (start) {
// When the app is about to enter, or already is in, the background,
// didFinishRecordingToOutputFileAtURL is not called, so the file may not be saved
// completely. Requesting extra background execution time (roughly 180 seconds) avoids
// this. beginBackgroundTaskWithExpirationHandler: can be called anywhere after launch,
// but it must be paired with endBackgroundTask: or the app will crash.
if (self->taskId == UIBackgroundTaskInvalid) {
self->taskId = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
}
// automatically save the captured audio/video to the file
[self->fileOutput startRecordingToOutputFileURL:self->dstURL recordingDelegate:self];
} else {
// stop writing to the file
[self->fileOutput stopRecording];
NSLog(@"stopped recording to file");
}
});
}
- (void)takeStillImage
{
dispatch_async(sessionQueue, ^{
/** AVCapturePhotoSettings defines the output photo data format and the generated file
 * format, e.g. the compression (JPEG, PNG, RAW) and the file type (JPG, PNG, DNG), plus
 * other capture parameters. It can also deliver uncompressed raw CVPixelBufferRef data;
 * see AVCapturePhotoSettings for details.
 */
// the default settings produce a JPEG file
AVCapturePhotoSettings *defaultSettings = [AVCapturePhotoSettings photoSettings];
defaultSettings.autoStillImageStabilizationEnabled = YES;
// Whether to output at the full photo resolution. Defaults to NO, meaning the photo
// resolution matches the video resolution (with the 640x480 preset above, the photo
// would be 640x480 as well).
// note: the following two flags must be enabled together, otherwise the app crashes
defaultSettings.highResolutionPhotoEnabled = YES;
self->stillOutput.highResolutionCaptureEnabled = YES;
/** AVCapturePhotoOutput is the upgraded replacement for AVCaptureStillImageOutput; it
 * supports Live Photo capture, preview-sized image delivery, wide color, RAW, RAW+JPG
 * and RAW+DNG formats.
 */
// take the photo with the given settings; results are reported via the delegate callbacks
[self->stillOutput capturePhotoWithSettings:defaultSettings delegate:self];
});
}
- (void)captureOutput:(AVCaptureFileOutput *)output didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections
{
[record setTitle:@"stop" forState:UIControlStateNormal];
NSLog(@"record start url %@ thread %@",fileURL,[NSThread currentThread]);
}
- (void)captureOutput:(AVCaptureFileOutput *)output didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections error:(NSError *)error
{
NSLog(@"record finish url %@ thread %@",outputFileURL,[NSThread currentThread]);
// end the background task requested when recording started, and reset the id so the
// next recording can request a new one
[[UIApplication sharedApplication] endBackgroundTask:self->taskId];
self->taskId = UIBackgroundTaskInvalid;
[self->record setTitle:@"record" forState:UIControlStateNormal];
}
- (void)captureOutput:(AVCapturePhotoOutput *)output willCapturePhotoForResolvedSettings:(nonnull AVCaptureResolvedPhotoSettings *)resolvedSettings
{
NSLog(@"about to capture photo %@",resolvedSettings);
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didCapturePhotoForResolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
{
NSLog(@"photo captured %@",resolvedSettings);
}
// iOS 11+ way of obtaining the photo
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 110000
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error
API_AVAILABLE(ios(11.0)){
NSLog(@"didFinishProcessingPhoto ");
NSString *dstPath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,NSUserDomainMask,YES)[0] stringByAppendingPathComponent:@"1-test_capture.JPG"];
NSURL *dstURL = [NSURL fileURLWithPath:dstPath];
NSData *jpg = [photo fileDataRepresentation];
[jpg writeToURL:dstURL atomically:YES];
}
#else
// iOS 10 way of obtaining the photo
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error
{
NSData *jpegData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
NSString *dstPath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,NSUserDomainMask,YES)[0] stringByAppendingPathComponent:@"1-test_capture.JPG"];
NSURL *dstURL = [NSURL fileURLWithPath:dstPath];
[jpegData writeToURL:dstURL atomically:YES];
}
#endif
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishCaptureForResolvedSettings:(nonnull AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error
{
NSLog(@"capture finished: didFinishCaptureForResolvedSettings");
}
@end
Tips: audio/video captured through AVCaptureMovieFileOutput can only be saved in the MOV container format; the video codec can be chosen, e.g. H.264 or H.265 (HEVC).
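A minimal sketch of picking the codec explicitly (assuming `fileOutput` is an AVCaptureMovieFileOutput already added to a session, and building against an iOS 11+ SDK where AVVideoCodecTypeH264 is available):

```objectivec
// prefer H.264 if the device supports it for file output; otherwise keep the
// device default (which may be HEVC on newer hardware)
AVCaptureConnection *conn = [fileOutput connectionWithMediaType:AVMediaTypeVideo];
if ([fileOutput.availableVideoCodecTypes containsObject:AVVideoCodecTypeH264]) {
    [fileOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecTypeH264}
                    forConnection:conn];
}
```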
Problems encountered
Project repository
https://github.com/nldzsz/ffmpeg-demo
The code is in AVCapturePriviewer.h/AVCapturePriviewer.m under the AVFoundation directory.