This article is a summary of a camera module I built previously, covering video capture with a custom camera, plus video processing and saving. I hope it serves as a useful reference.
Framework overview: AVFoundation
Commonly used for media recording, editing, and playback; audio recording and playback; and audio/video encoding and decoding.
Frequently used classes: AVCaptureDevice, AVCaptureDeviceInput, AVCapturePhotoOutput, AVCaptureVideoPreviewLayer,
AVAsset, AVAssetReader, AVAssetWriter, CMSampleBuffer, AVPlayer, CMTime, AVCaptureMovieFileOutput, AVCaptureMetadataOutput, etc.
AVAsset is an abstract class that defines the interface for a media asset. AVURLAsset is created from a URL, which can point to a local or a remote resource.
AVAssetReader reads media data from an AVAsset and can decode the raw media into usable samples.
AVAssetWriter writes media data (CMSampleBuffer) to a specified file.
CMSampleBuffer is a Core Foundation object holding a sample of compressed or uncompressed audio or video data.
CMTime is a struct that represents time as a rational number (a value over a timescale).
AVCaptureMovieFileOutput writes audio and video data to a file.
AVCaptureMetadataOutput is a metadata capture output; it is quite powerful and can scan barcodes, faces, QR codes, UPC-E product codes, and more.
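The rational (value over timescale) representation used by CMTime can be illustrated with a short sketch; the values here are arbitrary examples:

```objectivec
#import <CoreMedia/CoreMedia.h>

// CMTime stores time as value/timescale: CMTimeMake(3, 30) means
// 3 ticks of a 1/30-second clock, i.e. 0.1 seconds.
CMTime frameTime = CMTimeMake(3, 30);
Float64 seconds = CMTimeGetSeconds(frameTime); // 0.1

// Times with different timescales can still be added and compared:
CMTime sum = CMTimeAdd(frameTime, CMTimeMake(1, 10)); // 0.2 s total
```

Representing time as a fraction rather than a float avoids rounding drift when accumulating many frame durations.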
Preparation
1. Check whether the app has camera permission
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
If permission has not been requested yet, request it:
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    dispatch_sync(dispatch_get_main_queue(), ^{
        if (granted) {
            // permission granted
        } else {
            // permission denied
        }
    });
}];
- If the UI is landscape by default, rotate according to the screen orientation.
Custom camera configuration
The main parts of the capture architecture are sessions, inputs, and outputs. A capture session connects one or more inputs to one or more outputs. Inputs are sources of media, such as the capture devices (camera and microphone); outputs take media data from the inputs, for example writing it to disk to produce a movie file.
- Declare the following properties
@property (nonatomic, strong) AVCaptureSession *session; // session: ties inputs and outputs together and drives the capture device (camera)
@property (nonatomic, strong) AVCaptureDevice *device; // video capture device
@property (nonatomic, strong) AVCaptureDevice *audioDevice; // audio capture device
@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput; // video input
@property (nonatomic, strong) AVCaptureDeviceInput *audioInput; // audio input
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioPutData; // audio data output
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoPutData; // video data output
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer; // preview layer
@property (nonatomic, strong) AVCaptureConnection *connection;
@property (nonatomic, strong) AVAssetWriter *writer; // asset writer
@property (nonatomic, strong) AVAssetWriterInput *writerAudioInput; // audio writer input
@property (nonatomic, strong) AVAssetWriterInput *writerVideoInput; // video writer input
- Initialize the session. AVCaptureSession manages and coordinates the input and output devices.
self.session = [[AVCaptureSession alloc] init];
if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]){
self.session.sessionPreset = AVCaptureSessionPresetHigh;
}else if ([self.session canSetSessionPreset:AVCaptureSessionPresetiFrame960x540]) {
self.session.sessionPreset = AVCaptureSessionPresetiFrame960x540;
}
- Get the video capture device (the camera)
self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[_device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]; // only reports support; to enable it, set device.focusMode inside lockForConfiguration:
- Create the video input and add it to the session
NSError *error = nil;
self.deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.device error:&error];
if (!error) {
if ([self.session canAddInput:self.deviceInput]) {
[self.session addInput:self.deviceInput];
}
}
- Create the video data output and add it to the session
NSDictionary *videoSetting = @{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)};
self.videoPutData = [[AVCaptureVideoDataOutput alloc] init];
self.videoPutData.videoSettings = videoSetting;
self.videoPutData.alwaysDiscardsLateVideoFrames = YES; // discard late frames immediately to save memory (default YES)
dispatch_queue_t videoQueue = dispatch_queue_create("video", DISPATCH_QUEUE_SERIAL); // the sample buffer delegate queue must be serial so frames arrive in order
[self.videoPutData setSampleBufferDelegate:self queue:videoQueue];
if ([self.session canAddOutput:self.videoPutData]) {
[self.session addOutput:self.videoPutData];
}
// Use the connection to control the orientation of the recorded video
AVCaptureConnection *imageConnection = [self.videoPutData connectionWithMediaType:AVMediaTypeVideo];
if (imageConnection.supportsVideoOrientation) {
imageConnection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}
- Get the audio capture device
self.audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
- Create the audio input and add it to the session
NSError *audioError = nil;
self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.audioDevice error:&audioError];
if (!audioError) {
if ([self.session canAddInput:self.audioInput]) {
[self.session addInput:self.audioInput];
}
}
- Create the audio data output and add it to the session
self.audioPutData = [[AVCaptureAudioDataOutput alloc] init];
if ([self.session canAddOutput:self.audioPutData]) {
[self.session addOutput:self.audioPutData];
}
dispatch_queue_t audioQueue = dispatch_queue_create("audio", DISPATCH_QUEUE_SERIAL); // delegate queue must be serial
[self.audioPutData setSampleBufferDelegate:self queue:audioQueue]; // set the sample buffer delegate
- Initialize the preview layer. The session drives the inputs to capture data, and the preview layer renders the captured frames on screen.
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:self.session];
self.previewLayer.frame = CGRectMake(0, 0, width,height);
self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight; // orientation shown by the preview layer
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:self.previewLayer];
- Start capturing
[self.session startRunning];
Recording settings (optional)
- Switching cameras
[self.session stopRunning];
// 1. Get the current camera position
AVCaptureDevicePosition position = self.deviceInput.device.position;
// 2. Toggle to the position we want to show
if (position == AVCaptureDevicePositionBack) {
position = AVCaptureDevicePositionFront;
} else {
position = AVCaptureDevicePositionBack;
}
// 3. Create a new device for that position
AVCaptureDevice *device = [self getCameraDeviceWithPosition:position];
// 4. Create an input from the new device
AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
// 5. Swap the input on the session
[self.session beginConfiguration];
[self.session removeInput:self.deviceInput];
[self.session addInput:newInput];
[self.session commitConfiguration];
self.deviceInput = newInput;
[self.session startRunning];
- Flash
if ([self.device lockForConfiguration:nil]) {
if ([self.device hasFlash]) {
if (self.device.flashMode == AVCaptureFlashModeAuto) {
self.device.flashMode = AVCaptureFlashModeOn;
[self.flashBtn setImage:[UIImage imageNamed:@"shanguangdeng_kai"] forState:UIControlStateNormal];
}else if (self.device.flashMode == AVCaptureFlashModeOn){
self.device.flashMode = AVCaptureFlashModeOff;
[self.flashBtn setImage:[UIImage imageNamed:@"shanguangdeng_guan"] forState:UIControlStateNormal];
}else{
self.device.flashMode = AVCaptureFlashModeAuto;
[self.flashBtn setImage:[UIImage imageNamed:@"shanguangdeng_zidong"] forState:UIControlStateNormal];
}
}
[self.device unlockForConfiguration];
}
- Focus
// Add a tap gesture for tap-to-focus
- (void)addTap {
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(focusGesture:)];
[self.view addGestureRecognizer:tap];
}
- (void)focusGesture:(UITapGestureRecognizer*)gesture{
CGPoint point = [gesture locationInView:gesture.view];
CGSize size = self.view.bounds.size;
// focusPoint coordinates run from (0,0) at the top-left of the viewfinder to (1,1) at the bottom-right; if the resulting position is off, adapt it to your actual layout
CGPoint focusPoint = CGPointMake(point.x / size.width, point.y / size.height);
if ([self.device lockForConfiguration:nil]) {
[self.session beginConfiguration];
/***** The focus point must be set before the focus mode *****/
// focus point
if ([self.device isFocusPointOfInterestSupported]) {
[self.device setFocusPointOfInterest:focusPoint];
}
// focus mode
if ([self.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
[self.device setFocusMode:AVCaptureFocusModeAutoFocus];
}else{
NSLog(@"failed to change focus mode");
}
// exposure point
if ([self.device isExposurePointOfInterestSupported]) {
[self.device setExposurePointOfInterest:focusPoint];
}
// exposure mode
if ([self.device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
[self.device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
} else {
NSLog(@"failed to change exposure mode");
}
[self.device unlockForConfiguration];
[self.session commitConfiguration];
}
}
Recording method 1 --- writing with AVAssetWriter
Recording needs a file path in the sandbox first; media data is written to that file during recording, and once everything has been written the complete video is available at that path.
- Create the file path
- (NSURL *)createVideoFilePathUrl
{
NSString *documentPath = [NSHomeDirectory() stringByAppendingString:@"/Documents/shortVideo"];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyyMMddHHmmss"];
NSString *destDateString = [dateFormatter stringFromDate:[NSDate date]];
NSString *videoName = [destDateString stringByAppendingString:@".mp4"];
NSString *filePath = [documentPath stringByAppendingFormat:@"/%@",videoName];
NSFileManager *manager = [NSFileManager defaultManager];
BOOL isDir;
if (![manager fileExistsAtPath:documentPath isDirectory:&isDir]) {
[manager createDirectoryAtPath:documentPath withIntermediateDirectories:YES attributes:nil error:nil];
}
return [NSURL fileURLWithPath:filePath];
}
- Start recording: complete the writer configuration
2.1 Get the storage path (it lives in the sandbox and must be unique)
self.preVideoURL = [self createVideoFilePathUrl];
2.2 Configure the writer asynchronously on a background queue
dispatch_queue_t writeQueueCreate = dispatch_queue_create("writeQueueCreate", DISPATCH_QUEUE_CONCURRENT);
dispatch_async(writeQueueCreate, ^{
});
2.3 Create the asset writer
NSError *error = nil;
self.writer = [AVAssetWriter assetWriterWithURL:self.preVideoURL fileType:AVFileTypeMPEG4 error:&error];
2.4 Create the video writer input and add it to the asset writer. Both video and audio inputs can be configured: format, dimensions, bit rate, frame rate, channel count, and so on.
NSInteger numPixels = width * height;
// bits per pixel
CGFloat bitsPerPixel = 12.0;
NSInteger bitsPerSecond = numPixels * bitsPerPixel;
// bit rate and frame rate settings
NSDictionary *compressionProperties = @{ AVVideoAverageBitRateKey : @(bitsPerSecond),
AVVideoExpectedSourceFrameRateKey : @(30),
AVVideoMaxKeyFrameIntervalKey : @(30),
AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel };
// video settings
NSDictionary *videoSetting = @{ AVVideoCodecKey : AVVideoCodecTypeH264,
AVVideoWidthKey : @(width),
AVVideoHeightKey : @(height),
AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill,
AVVideoCompressionPropertiesKey : compressionProperties };
self.writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSetting];
self.writerVideoInput.expectsMediaDataInRealTime = YES; // must be YES: data arrives in real time from the capture session
if ([self.writer canAddInput:self.writerVideoInput]) {
[self.writer addInput:self.writerVideoInput];
}
2.5 Create the audio writer input and add it to the asset writer
NSDictionary *audioSetting = @{ AVEncoderBitRatePerChannelKey : @(28000),
AVFormatIDKey : @(kAudioFormatMPEG4AAC),
AVNumberOfChannelsKey : @(1),
AVSampleRateKey : @(22050) };
self.writerAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioSetting];
self.writerAudioInput.expectsMediaDataInRealTime = YES; // must be YES: data arrives in real time from the capture session
if ([self.writer canAddInput:self.writerAudioInput]) {
[self.writer addInput:self.writerAudioInput];
}
The approach below starts the writer only when the first video frame arrives, which avoids writing audio first and producing a clip that opens with sound but no picture (in practice the problem is subtle; add this as needed).
startSessionAtSourceTime: sets the source start time of the writing session.
- File writing. Starting the session at the first sample's timestamp avoids a blank stretch at the beginning of the video.
In the delegate callback captureOutput:didOutputSampleBuffer:fromConnection:, start the writer on the first sample received, then append every sample to the file:
CMFormatDescriptionRef desMedia = CMSampleBufferGetFormatDescription(sampleBuffer);
CMMediaType mediaType = CMFormatDescriptionGetMediaType(desMedia);
if (mediaType == kCMMediaType_Video) {
if (!self.canWriting) { // BOOL property tracking whether writing has started
[self.writer startWriting];
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
self.canWriting = YES;
[self.writer startSessionAtSourceTime:timestamp];
}
}
if (self.canWriting) {
if (mediaType == kCMMediaType_Video) {
if (self.writerVideoInput.readyForMoreMediaData) {
BOOL success = [self.writerVideoInput appendSampleBuffer:sampleBuffer];
if (!success) {
NSLog(@"video write failed");
}
}
}else if (mediaType == kCMMediaType_Audio){
if (self.writerAudioInput.readyForMoreMediaData) {
BOOL success = [self.writerAudioInput appendSampleBuffer:sampleBuffer];
if (!success) {
NSLog(@"audio write failed");
}
}
}
}
- Stop recording
Finish the writing on a background queue:
dispatch_queue_t writeQueue = dispatch_queue_create("writeQueue", DISPATCH_QUEUE_CONCURRENT);
dispatch_async(writeQueue, ^{
if (weakSelf.writer.status == AVAssetWriterStatusWriting) {
[weakSelf.writer finishWritingWithCompletionHandler:^{
/// done; the file at the recording path is now complete
}];
}
});
Recording method 2 --- writing with AVCaptureMovieFileOutput
- 1. Create the movie file output. A single movie output is enough; no separate audio output is needed.
@property (nonatomic, strong) AVCaptureMovieFileOutput *movieFileOutPut; // movie file output
…………
…………
// Create the movie file output and add it to the session
self.movieFileOutPut = [[AVCaptureMovieFileOutput alloc] init];
// Configure the output's connection, e.g. video stabilization
AVCaptureConnection *captureConnection = [self.movieFileOutPut connectionWithMediaType:AVMediaTypeVideo];
// Video stabilization was introduced with iOS 6 and the iPhone 4S. The iPhone 6 added a stronger, smoother mode billed as cinematic video stabilization, with corresponding API changes (not reflected in the documentation at the time, but visible in the headers). Stabilization is configured on the AVCaptureConnection, not on the capture device, and because not all device formats support every mode, check that the mode is supported before enabling it:
if ([captureConnection isVideoStabilizationSupported ]) {
captureConnection.preferredVideoStabilizationMode=AVCaptureVideoStabilizationModeAuto;
}
// Keep the video orientation consistent with the preview layer
captureConnection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
// Add the output to the session
if ([_session canAddOutput:self.movieFileOutPut]) {
[_session addOutput:self.movieFileOutPut];
}
- 2. Create the storage path
- 3. Call the recording method with the path; the file is written automatically, with no writer configuration needed
[self.movieFileOutPut startRecordingToOutputFileURL:self.preVideoURL recordingDelegate:self];
- 4. Stop recording
[self.movieFileOutPut stopRecording];
- 5. Detect recording completion in the delegate method and obtain the file
-(void)captureOutput:(AVCaptureFileOutput *)output didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections error:(NSError *)error {
…………
}
AVCaptureMovieFileOutput provides pause and resume methods, but they are only available on macOS.
AVAssetWriter does not support pausing: in my tests, pausing the file writing produced blank segments and out-of-order audio, and its status enum has no paused case.
Comparing the two recording methods
In common: capture happens in an AVCaptureSession, the video and audio inputs are identical, and the preview is the same.
Differences:
- 1. AVCaptureMovieFileOutput is simpler: a single output is enough. AVAssetWriter needs two separate outputs, AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, and you process each stream yourself.
- 2. AVAssetWriter exposes more parameters and is more flexible.
- 3. File handling differs: AVAssetWriter gives you the live data stream. With AVCaptureMovieFileOutput, the system has already written the data to a file, so to trim a video you must read the complete video back from the file and then process it; with AVAssetWriter you still have the raw stream before it is assembled into a video, and can process it directly.
Video processing
After recording, the video file can be loaded from the saved path to play it, save it, and so on.
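Playback can be done with AVPlayer from the same file URL; a minimal sketch, assuming self.preVideoURL still holds the recorded file's URL as in the recording code:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Wrap the recorded file in an AVPlayer and render it through an
// AVPlayerLayer added to the view hierarchy.
AVPlayer *player = [AVPlayer playerWithURL:self.preVideoURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:playerLayer];
[player play];
```

For a full-screen player with built-in controls, AVPlayerViewController (AVKit) can be used instead of a bare layer.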
Saving to the photo library
PHPhotoLibrary *photoLibrary = [PHPhotoLibrary sharedPhotoLibrary];
[photoLibrary performChanges:^{
[PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:self.preVideoURL];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
if (success) {
NSLog(@"video saved to the photo library");
} else {
NSLog(@"failed to save the video to the photo library");
}
}];
Photo capture settings (optional)
For photo capture settings, see http://www.reibang.com/p/e2de8a85b8aa