A while back, a project called for a WeChat-style short-video feature: record a clip and play it back on a custom layer over the recording UI. That led me into AVFoundation, and since I hit a few problems along the way, I'm writing them down here. The SBVideoCaptureDemo source was a helpful reference.
Recording uses AVCaptureSession, AVCaptureMovieFileOutput, AVCaptureDeviceInput, and AVCaptureVideoPreviewLayer; the result is then compressed and converted to MP4 with AVAssetExportSession.
Playback is custom-built on AVPlayerLayer, AVPlayer, AVPlayerItem, and NSURL.
1翎卓、視頻的錄制
First, check the device's support for video recording (both checks are sketched below):
1. Before recording, make sure a camera is available.
2. Make sure the app is authorized to use the camera.
Custom video recording
A quick rundown of the classes involved:
AVCaptureSession: the media (audio/video) capture session; it routes captured audio and video data to its outputs. One AVCaptureSession can have multiple inputs and outputs.
AVCaptureDevice: an input device such as the microphone or a camera; it also exposes the physical device's settings (focus, exposure, and so on).
AVCaptureDeviceInput: manages a device's input data; you create one from an AVCaptureDevice and add it to the AVCaptureSession.
AVCaptureVideoPreviewLayer: the camera preview layer, a CALayer subclass that shows what is being recorded; it is created from an AVCaptureSession.
AVCaptureMovieFileOutput: the movie file output. Once inputs and outputs are added to an AVCaptureSession, the session establishes connections (AVCaptureConnection) between all compatible inputs and outputs.
//States of the video-production pipeline
typedef NS_ENUM(NSInteger, VideoState)
{
    VideoStateFree = 0,
    VideoStateWillStartRecord,
    VideoStateDidStartRecord,
    VideoStateWillEndRecord,
    VideoStateDidEndRecord,
    VideoStateWillStartMerge,
    VideoStateDidStartMerge,
};
//Unlike VideoState, this tracks the user's action:
//whether the user has started or ended recording
typedef NS_ENUM(NSInteger, RecordOptState)
{
    RecordOptStateFree = 0,
    RecordOptStateBegin,
    RecordOptStateEnd,
};
//Where the user's finger is while recording: used to tell
//the record region from the cancel-record region
typedef NS_ENUM(NSInteger, CurrentRecordRegion)
{
    CurrentRecordRegionFree = 0,
    CurrentRecordRegionRecord,
    CurrentRecordRegionCancelRecord,
};
Initial setup
self.captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *frontCamera = nil;
AVCaptureDevice *backCamera = nil;
NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *camera in cameras) {
    if (AVCaptureDevicePositionFront == camera.position) {//front camera
        frontCamera = camera;
    }
    else if (AVCaptureDevicePositionBack == camera.position) {
        backCamera = camera;
    }
}
//Use the back camera by default
[backCamera lockForConfiguration:nil];//lock the device first
if ([backCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    [backCamera setExposureMode:AVCaptureExposureModeContinuousAutoExposure];//continuous auto exposure
}
if ([backCamera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
    [backCamera setFocusMode:AVCaptureFocusModeContinuousAutoFocus];//continuous auto focus
}
[backCamera unlockForConfiguration];
[self.captureSession beginConfiguration];
//input devices
self.videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:nil];
if ([self.captureSession canAddInput:self.videoDeviceInput]) {
    [self.captureSession addInput:self.videoDeviceInput];
}
if ([self.captureSession canAddInput:audioDeviceInput]) {
    [self.captureSession addInput:audioDeviceInput];
}
//output device
self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([self.captureSession canAddOutput:self.movieFileOutput]) {
    [self.captureSession addOutput:self.movieFileOutput];
}
//preset
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;//or AVCaptureSessionPresetLow
}
//preview layer
self.preViewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.preViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.captureSession commitConfiguration];
[self.captureSession startRunning];//start the session
Note: always call lockForConfiguration before changing a device property, and unlockForConfiguration when you are done. Before applying a setting, confirm the current device actually supports it, e.g. with isExposureModeSupported: or isFocusModeSupported:.
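The same lock, check, configure, unlock pattern works for any device property. As an illustration, here is a sketch that turns the torch on (the torch setting is my example, not something the original recording code does):
NSError *error = nil;
if ([backCamera lockForConfiguration:&error]) {
    //Only change the mode after confirming the device supports it
    if ([backCamera isTorchModeSupported:AVCaptureTorchModeOn]) {
        backCamera.torchMode = AVCaptureTorchModeOn;
    }
    [backCamera unlockForConfiguration];
}
else {
    NSLog(@"lockForConfiguration failed: %@", error.localizedDescription);
}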
//Start recording
- (void)startRecordingToOutputFileURL
{
    _videoState = VideoStateWillStartRecord;
    _recordOptState = RecordOptStateBegin;
    //Get the connection for the movie file output
    AVCaptureConnection *captureConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (![self.movieFileOutput isRecording]) {
        //Keep the recording orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.preViewLayer connection].videoOrientation;
        [self.movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:[self getVideoSaveFilePathString]] recordingDelegate:self];//start recording
    }
    else
    {
        [self stopCurrentVideoRecording];
    }
}
//Stop recording
- (void)stopCurrentVideoRecording
{
    [self stopCountDurTimer];//stop the duration timer
    _videoState = VideoStateWillEndRecord;
    [self.movieFileOutput stopRecording];//stop recording
}
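The timer helpers startCountDurTimer / stopCountDurTimer are referenced but never listed. A minimal sketch, assuming a countDurTimer property, 0.1 s ticks, and a MAX_VIDEO_DUR cap (all three are assumptions, not from the original):
- (void)startCountDurTimer
{
    self.countDurTimer = [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(onCountDurTimer:) userInfo:nil repeats:YES];
}

- (void)onCountDurTimer:(NSTimer *)timer
{
    self.currentVideoDur += 0.1f;//accumulate the current clip's duration
    if (self.totalVideoDur + self.currentVideoDur >= MAX_VIDEO_DUR) {
        [self stopCurrentVideoRecording];//assumed cap, WeChat-style
    }
}

- (void)stopCountDurTimer
{
    [self.countDurTimer invalidate];
    self.countDurTimer = nil;
}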
#pragma mark - AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    _videoState = VideoStateDidStartRecord;
    self.videoSaveFilePath = [fileURL absoluteString];
    self.currentFileURL = fileURL;
    self.currentVideoDur = 0.0f;
    self.totalVideoDur = 0.0f;
    [self startCountDurTimer];//start the duration timer
    //Notify the delegate here that recording has started
    if (RecordOptStateEnd == _recordOptState) {
        //The press was so short that the user released the record button
        //before recording actually began, so stop it right away
        [self stopCurrentVideoRecording];
    }
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    _videoState = VideoStateDidEndRecord;
    self.totalVideoDur += _currentVideoDur;
    //Notify the delegate here that recording has finished
    if (CurrentRecordRegionRecord == [self getCurrentRecordRegion]) {
        if (self.totalVideoDur < MIN_VIDEO_DUR) {//recording too short
            [self removeMovFile];//remove the .mov file
            _videoState = VideoStateFree;
        }
    }
    else
    {
        //Finger ended in the cancel region: discard the recording
        [self removeMovFile];//remove the .mov file
        _videoState = VideoStateFree;
    }
}
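removeMovFile isn't listed in the original either. A plausible sketch: videoSaveFilePath was stored as an absolute URL string in the delegate above, so the file:// prefix has to be stripped before handing the path to NSFileManager:
- (void)removeMovFile
{
    //Delete the intermediate .mov once it is unneeded (too short, cancelled, or merged)
    NSString *path = [self.videoSaveFilePath stringByReplacingOccurrencesOfString:@"file://" withString:@""];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:path]) {
        [fileManager removeItemAtPath:path error:nil];
    }
}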
//Merge the .mov clips and export as MP4
- (void)mergeAndExportVideosAtFileURLs:(NSArray *)fileURLArray
{
    _videoState = VideoStateWillStartMerge;
    NSError *error = nil;
    //render size
    CGSize renderSize = CGSizeMake(0, 0);
    NSMutableArray *layerInstructionArray = [NSMutableArray array];
    //the composition everything is merged into
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    CMTime totalDuration = kCMTimeZero;
    //collect the video tracks first, also to work out renderSize
    NSMutableArray *assetTrackArray = [NSMutableArray array];
    NSMutableArray *assetArray = [NSMutableArray array];
    for (NSURL *fileURL in fileURLArray) {
        //AVAsset: one piece of source media
        AVAsset *asset = [AVAsset assetWithURL:fileURL];
        if (!asset) {
            continue;
        }
        [assetArray addObject:asset];
        //the asset's first video track
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [assetTrackArray addObject:assetTrack];
        renderSize.width = MAX(renderSize.width, assetTrack.naturalSize.height);
        renderSize.height = MAX(renderSize.height, assetTrack.naturalSize.width);
    }
    CGFloat renderW = 320;//MIN(renderSize.width, renderSize.height);
    for (NSInteger i = 0; i < [assetArray count] && i < assetTrackArray.count; i++) {
        AVAsset *asset = [assetArray objectAtIndex:i];
        AVAssetTrack *assetTrack = [assetTrackArray objectAtIndex:i];
        //audio track of the composition; the source audio is inserted into it
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        NSArray *dataSourceArray = [asset tracksWithMediaType:AVMediaTypeAudio];//the recorded microphone track(s)
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:((dataSourceArray.count > 0) ? [dataSourceArray objectAtIndex:0] : nil) atTime:totalDuration error:nil];
        //video track of the composition; the source video is inserted into it
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:totalDuration error:&error];
        //per-clip instruction: lets each video be scaled, rotated, etc.
        AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        totalDuration = CMTimeAdd(totalDuration, asset.duration);
        CGFloat rate = renderW / MIN(assetTrack.naturalSize.width, assetTrack.naturalSize.height);
        CGAffineTransform layerTransform = CGAffineTransformMake(assetTrack.preferredTransform.a, assetTrack.preferredTransform.b, assetTrack.preferredTransform.c, assetTrack.preferredTransform.d, assetTrack.preferredTransform.tx * rate, assetTrack.preferredTransform.ty * rate);
        layerTransform = CGAffineTransformConcat(layerTransform, CGAffineTransformMake(1, 0, 0, 1, 0, -(assetTrack.naturalSize.width - assetTrack.naturalSize.height) / 2.0));//shift up to crop the middle of the frame
        layerTransform = CGAffineTransformScale(layerTransform, rate, rate);//scale, so front and back camera footage ends up the same size
        [layerInstruction setTransform:layerTransform atTime:kCMTimeZero];
        [layerInstruction setOpacity:0.0 atTime:totalDuration];//hide this clip once it has played
        //data
        [layerInstructionArray addObject:layerInstruction];
    }
    //get save path
    NSURL *mergeFileURL = [NSURL fileURLWithPath:[self getVideoMergeFilePathString]];
    //export
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
    mainInstruction.layerInstructions = layerInstructionArray;
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = @[mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 100);
    //mainCompositionInst.renderSize = CGSizeMake(renderW, renderW * (sH/sW));
    mainCompositionInst.renderSize = CGSizeMake(renderW, renderW * 0.75);//4:3 aspect ratio
    //export the composition
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exporter.videoComposition = mainCompositionInst;
    exporter.outputURL = mergeFileURL;
    exporter.outputFileType = AVFileTypeMPEG4;//MP4 container
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            _videoState = VideoStateDidStartMerge;
            //notify the delegate here that conversion succeeded
            [self removeMovFile];//remove the .mov files
        });
    }];
}
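The two path helpers used above, getVideoSaveFilePathString and getVideoMergeFilePathString, aren't listed either. A sketch that writes both files to the tmp directory (the timestamp naming is an assumption):
- (NSString *)getVideoSaveFilePathString
{
    //Raw recording goes to tmp as .mov
    NSString *name = [NSString stringWithFormat:@"%.0f.mov", [[NSDate date] timeIntervalSince1970]];
    return [NSTemporaryDirectory() stringByAppendingPathComponent:name];
}

- (NSString *)getVideoMergeFilePathString
{
    //Merged export goes to tmp as .mp4
    NSString *name = [NSString stringWithFormat:@"%.0f.mp4", [[NSDate date] timeIntervalSince1970]];
    return [NSTemporaryDirectory() stringByAppendingPathComponent:name];
}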
//File size of the video, in KB
- (NSInteger)getFileSize:(NSString *)path
{
    path = [path stringByReplacingOccurrencesOfString:@"file://" withString:@""];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:path]) {
        NSDictionary *attributes = [fileManager attributesOfItemAtPath:path error:nil];
        NSNumber *theFileSize;
        if ((theFileSize = [attributes objectForKey:NSFileSize])) {
            return [theFileSize intValue] / 1024;//bytes -> KB
        }
        else {
            return -1;
        }
    }
    else {
        return -1;
    }
}
//Zoom the camera in and out
- (void)changeDeviceVideoZoomFactor
{
    AVCaptureDevice *backCamera = [self getCameraDevice:NO];
    CGFloat current = 1.0;
    if (1.0 == backCamera.videoZoomFactor) {
        current = 2.0f;
        if (current > backCamera.activeFormat.videoMaxZoomFactor) {
            current = backCamera.activeFormat.videoMaxZoomFactor;
        }
    }
    NSError *error = nil;
    if ([backCamera lockForConfiguration:&error]) {
        [backCamera rampToVideoZoomFactor:current withRate:10];
        [backCamera unlockForConfiguration];
    }
    else {
        NSLog(@"lockForConfiguration failed: %@", error.localizedDescription);
    }
}
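One natural trigger for this is a double tap on the preview view, WeChat-style. A sketch (the gesture wiring and the previewView property are assumptions):
//Somewhere in the recording view's setup:
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(onDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;
[self.previewView addGestureRecognizer:doubleTap];//the view hosting preViewLayer

- (void)onDoubleTap:(UITapGestureRecognizer *)tap
{
    [self changeDeviceVideoZoomFactor];
}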
- (AVCaptureDevice *)getCameraDevice:(BOOL)isFront
{
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *frontCamera;
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *camera in cameras) {
        if (AVCaptureDevicePositionFront == camera.position) {
            frontCamera = camera;
        }
        else if (AVCaptureDevicePositionBack == camera.position) {
            backCamera = camera;
        }
    }
    if (isFront) {
        return frontCamera;
    }
    return backCamera;
}
2海渊、自定義播放視頻
Note: the URL used for playback must be a file URL created with fileURLWithPath: (it looks like "file:///var/..."); a URL built from a bare path with URLWithString: will not load.
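A quick illustration of the difference (getVideoMergeFilePathString is the helper assumed in the recording section):
NSString *path = [self getVideoMergeFilePathString];
NSURL *goodURL = [NSURL fileURLWithPath:path];//file:///var/... loads and plays
NSURL *badURL = [NSURL URLWithString:path];//no file:// scheme, AVPlayer cannot load it
With a proper file URL in hand, the player is set up as follows: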
- (instancetype)initVideoFileURL:(NSURL *)videoFileURL withFrame:(CGRect)frame withView:(UIView *)view
{
    self = [super init];
    if (self) {
        self.videoFileURL = videoFileURL;
        [self registerNotificationMessage];
        [self initPlayLayer:frame withView:view];
    }
    return self;
}

- (void)initPlayLayer:(CGRect)rect withView:(UIView *)view
{
    if (!_videoFileURL) {
        return;
    }
    AVAsset *asset = [AVURLAsset URLAssetWithURL:_videoFileURL options:nil];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    //self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
    self.player = [[AVPlayer alloc] init];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.player setVolume:0.0f];//muted
    [self.player seekToTime:kCMTimeZero];
    [self.player setActionAtItemEnd:AVPlayerActionAtItemEndNone];
    [self.player replaceCurrentItemWithPlayerItem:self.playerItem];
    self.playerLayer.frame = rect;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [view.layer addSublayer:self.playerLayer];
}
- (void)playSight
{
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player play];
}

- (void)pauseSight
{
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player pause];
}

- (void)releaseVideoPlayer
{
    [self removeNotificationMessage];
    if (self.player) {
        [self.player pause];
        [self.player replaceCurrentItemWithPlayerItem:nil];
    }
    if (self.playerLayer) {
        [self.playerLayer removeFromSuperlayer];
    }
    self.player = nil;
    self.playerLayer = nil;
    self.playerItem = nil;
    self.videoFileURL = nil;
}
#pragma mark - notification message
- (void)registerNotificationMessage
{
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(avPlayerItemDidPlayToEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
}

- (void)removeNotificationMessage
{
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
}

- (void)avPlayerItemDidPlayToEnd:(NSNotification *)notification
{
    if (notification.object != self.playerItem) {
        return;
    }
    //Loop: seek back to the start and play again
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player play];
}
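Putting the playback side together, here is how a controller might drive this class. The class name SightPlayer is an assumption (the article never names it), and getVideoMergeFilePathString is the helper sketched earlier:
//Loop the merged MP4, muted, in a small 4:3 region of the controller's view
NSURL *fileURL = [NSURL fileURLWithPath:[self getVideoMergeFilePathString]];
SightPlayer *sightPlayer = [[SightPlayer alloc] initVideoFileURL:fileURL withFrame:CGRectMake(20, 80, 160, 120) withView:self.view];
[sightPlayer playSight];
//...later, when the view disappears:
[sightPlayer releaseVideoPlayer];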
There is a lot more to AVFoundation than this; I'll dig into the rest when another requirement calls for it.