Video processing mainly involves the following classes:
AVMutableComposition, AVMutableVideoComposition, AVMutableAudioMix, AVMutableVideoCompositionInstruction, AVMutableVideoCompositionLayerInstruction, and AVAssetExportSession. AVMutableComposition combines audio and video tracks; AVMutableVideoComposition operates on the video itself; AVMutableAudioMix adds audio to the video; AVMutableVideoCompositionInstruction and AVMutableVideoCompositionLayerInstruction are generally used together, for example to add a watermark or rotate the video's orientation; AVAssetExportSession performs the export. One thing worth noting: once the app enters the background, iOS restricts code that uses the GPU and such calls can crash, and most of these video-processing features do use the GPU, so you need to guard against this.
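As a minimal sketch of such a guard (my own assumption, not code from the demo; it presumes an exportSession property like the one used later in this article), you could cancel a running export when the app is backgrounded:

```objc
// Hypothetical guard: cancel a running export when the app enters the background,
// to avoid GPU work that iOS restricts for backgrounded apps.
// Assumes self.exportSession is the AVAssetExportSession created for the export.
[[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidEnterBackgroundNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    if (self.exportSession.status == AVAssetExportSessionStatusExporting) {
        [self.exportSession cancelExport];
    }
}];
```

You would then restart the export when the app returns to the foreground, or surface the cancellation to the user.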
Here I will use Apple's official demo "AVSimpleEditoriOS" as the teaching example. It organizes its code with the Command design pattern, and the base class AVSECommand holds the properties shared by the concrete Command subclasses. This article gives a brief walkthrough of the video operations, explains the related steps, and highlights the key code. Hopefully it serves as a starting point that gives you a first impression of video editing, after which you can modify Apple's official demo to fit your needs. You can download the demo and run it to see the results.
Section 1: Adding a watermark and a background border to a video
This first section explains how to add a border and animation to a video. To be clear up front, this kind of border and animation does not directly modify individual video frames; the effect is more like placing a CALayer on top of the video and animating that layer. Editing individual frames pixel by pixel is not what this API is for, and the iPhone could not do it in real time anyway.
Let's first look at a diagram to understand the principle behind adding animation to a video.
You can see the videoLayer; this layer is responsible for displaying our video. Its sibling is a layer called the animationLayer, and that is the one we fully control and can play with, because we create the animationLayer ourselves.
It is actually quite simple: the layer at the same level as our videoLayer is the animationLayer (that is, the background), and both share a parent called parentLayer. To add a border, give the animationLayer a border image, place it beneath the videoLayer, and then crop the videoLayer so that the four edges of the animationLayer just show around it; the result looks like a framed video. Put simply: place the cropped videoLayer on the background, then add both to the parentLayer.
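The layer tree described above can be sketched as follows (the sizes and the 20-point border inset are made up for illustration; AVFoundation fills the videoLayer with the rendered frames):

```objc
// Sketch of the parentLayer / animationLayer / videoLayer tree (assumed sizes).
CALayer *parentLayer = [CALayer layer];      // container handed to AVFoundation
CALayer *animationLayer = [CALayer layer];   // the background/border we control
CALayer *videoLayer = [CALayer layer];       // AVFoundation renders video frames here

parentLayer.frame = CGRectMake(0, 0, 640, 640);
animationLayer.frame = parentLayer.bounds;                    // border image fills the parent
videoLayer.frame = CGRectInset(parentLayer.bounds, 20, 20);   // inset so the border shows

[parentLayer addSublayer:animationLayer];    // background first
[parentLayer addSublayer:videoLayer];        // video on top of the background
```

The assembly in the demo itself happens in exportWillBegin, shown later in this section.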
Concrete steps for adding a watermark and background border
1. Obtain the video and audio tracks.
2. Create an AVMutableComposition object.
3. Add the video track to the AVMutableComposition, setting its time range and insertion point.
4. Add the audio track to the AVMutableComposition, setting its time range and insertion point.
5. Create the video composition object, AVMutableVideoComposition, and set its frame duration and render size.
6. Create a video composition instruction and set its time range.
7. Create a video composition layer instruction and set its time range.
8. Put the layer instruction into the composition instruction.
9. Put the composition instruction into the video composition.
10. Create the watermark CALayer, set its frame and position, and add it to the video composition.
Concrete implementation
- (void)performWithAsset:(AVAsset *)asset withImageNamed:(NSString *)imgName withColorName:(NSString *)color withMusicName:(NSString *)musicName with:(CGRect)photoSize {
    CGSize videoSize;
    // Obtain the video and audio tracks
    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    // Check if the asset contains video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }
    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;
    // Create the composition and add the video and audio tracks
    // Step 1
    // Create a composition with the given asset and insert audio and video tracks into it from the asset
    if (!self.mutableComposition) {
        // Check if a composition already exists, else create a composition using the input asset
        self.mutableComposition = [AVMutableComposition composition];
        // Insert the video and audio tracks from the AVAsset
        if (assetVideoTrack != nil) {
            AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
        }
        if (assetAudioTrack != nil) {
            AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
        }
    }
    // Step 2: create a video composition the same size as the video
    if ([[self.mutableComposition tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        if (!self.mutableVideoComposition) {
            self.mutableVideoComposition = [AVMutableVideoComposition videoComposition];
            AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            AVAssetTrack *videoTrack = [self.mutableComposition tracksWithMediaType:AVMediaTypeVideo][0];
            AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            // Crop the video; here it is cropped to a square of equal width and height
            CGFloat rate;
            CGSize renderSize = CGSizeMake(0, 0);
            renderSize.width = MAX(renderSize.width, videoTrack.naturalSize.height);
            renderSize.height = MAX(renderSize.height, videoTrack.naturalSize.width);
            CGFloat renderW = MIN(renderSize.width, renderSize.height);
            rate = renderW / MIN(videoTrack.naturalSize.width, videoTrack.naturalSize.height);
            // Handle the video orientation
            CGAffineTransform translateToCenter;
            CGAffineTransform mixedTransform;
            NSInteger degrees = [self degressFromVideoFileWithURL:asset];
            if (degrees == 0) {
                if (videoTrack.naturalSize.width == videoTrack.naturalSize.height) {
                    translateToCenter = CGAffineTransformMakeTranslation(0.0, 0.0);
                } else {
                    translateToCenter = CGAffineTransformMakeTranslation(-140.0, 0.0);
                }
                mixedTransform = CGAffineTransformRotate(translateToCenter, 0);
            } else {
                if (degrees == 90) {
                    // Rotate 90° clockwise
                    NSLog(@"Video rotated 90 degrees, home button on the left");
                    translateToCenter = CGAffineTransformMakeTranslation(videoTrack.naturalSize.height, -240);
                    mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2);
                } else if (degrees == 180) {
                    // Rotate 180° clockwise
                    NSLog(@"Video rotated 180 degrees, home button on top");
                    translateToCenter = CGAffineTransformMakeTranslation(videoTrack.naturalSize.width, videoTrack.naturalSize.height);
                    mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI);
                } else if (degrees == 270) {
                    // Rotate 270° clockwise
                    NSLog(@"Video rotated 270 degrees, home button on the right");
                    translateToCenter = CGAffineTransformMakeTranslation(0.0, videoTrack.naturalSize.width);
                    mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2 * 3.0);
                }
            }
            // Apply the computed transform to the AVMutableVideoCompositionLayerInstruction
            [passThroughLayer setTransform:mixedTransform atTime:kCMTimeZero];
            [passThroughLayer setOpacity:0.0 atTime:[asset duration]];
            // Then hand the layer instruction to the AVMutableVideoCompositionInstruction
            passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
            passThroughInstruction.layerInstructions = @[passThroughLayer];
            // Configure the video composition
            self.mutableVideoComposition.instructions = @[passThroughInstruction];
            self.mutableVideoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
            self.mutableVideoComposition.renderSize = CGSizeMake(renderW, renderW);
        }
        videoSize = self.mutableVideoComposition.renderSize;
        // Add the background
        self.watermarkLayer = [self watermarkLayerForSize:videoSize withImageNamed:imgName withColorName:color];
    }
    // Step 3: post a notification to whichever view controller should handle completion
    // Notify AVSEViewController about completion of the add-watermark operation
    [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self];
}
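The helper degressFromVideoFileWithURL: used above is not shown in the listing. A plausible implementation (an assumption on my part, not code from the demo) derives the rotation from the video track's preferredTransform:

```objc
// Hypothetical implementation of the rotation helper used above: infer the
// capture orientation from the video track's preferredTransform.
- (NSInteger)degressFromVideoFileWithURL:(AVAsset *)asset {
    NSInteger degrees = 0;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (tracks.count > 0) {
        AVAssetTrack *videoTrack = tracks[0];
        CGAffineTransform t = videoTrack.preferredTransform;
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            degrees = 90;   // portrait
        } else if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            degrees = 270;  // portrait, upside down
        } else if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            degrees = 180;  // landscape, rotated
        }
        // identity transform: 0 degrees, nothing to do
    }
    return degrees;
}
```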
The method below builds the layer used for the background border and the watermark:
- (CALayer *)watermarkLayerForSize:(CGSize)videoSize withImageNamed:(NSString *)imgName withColorName:(NSString *)color {
    // Create a layer for the title
    CALayer *titleLayer = [CALayer layer];
    titleLayer.bounds = CGRectMake(0, 0, videoSize.width, videoSize.height); // the frame here matches the size of the video display view
    titleLayer.masksToBounds = true;
    UIImage *image = [UIImage imageNamed:imgName];
    titleLayer.contents = (id)image.CGImage;
    titleLayer.position = CGPointMake(videoSize.width / 2, videoSize.height / 2);
    // You can also style the background here, e.g. set its color
    // do something...
    return titleLayer;
}
- (void)exportWillBegin {
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    if (self.backGroundLayer) {
        videoLayer.frame = CGRectMake(20, 20, self.videoComposition.renderSize.width - 40, self.videoComposition.renderSize.height - 40);
    } else {
        videoLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
    }
    parentLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
    // The background layer is added first and the videoLayer on top of it, so the video sits on the background.
    // A watermark works the other way around: the overlay layer goes above the videoLayer, so the watermark always shows on top of the video.
    [parentLayer addSublayer:self.backGroundLayer];
    [parentLayer addSublayer:videoLayer];
    // Hand the layer tree to the video composition
    self.videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
After this, you can simply export.
Section 2: Adding animation to a video
The principle behind adding animation is the same as adding a watermark.
- (CALayer *)watermarkLayerForSize:(CGSize)videoSize withImageNamed:(NSString *)imgName withColorName:(NSString *)color {
    // Create a layer for the animated overlay
    CALayer *overlayLayer1 = [CALayer layer];
    UIImage *animationImage = [UIImage imageNamed:imgName];
    [overlayLayer1 setContents:(id)[animationImage CGImage]];
    overlayLayer1.frame = CGRectMake(videoSize.width / 2 - 64, videoSize.height / 2 + 200, 128, 128);
    [overlayLayer1 setMasksToBounds:YES];
    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"transform.rotation"];
    animation.duration = 2.0;
    animation.repeatCount = 5;
    animation.autoreverses = YES;
    // Rotate from 0 to 360 degrees
    animation.fromValue = [NSNumber numberWithFloat:0.0];
    animation.toValue = [NSNumber numberWithFloat:(2.0 * M_PI)];
    animation.beginTime = AVCoreAnimationBeginTimeAtZero; // the begin time must be set, otherwise the animation does not show
    [overlayLayer1 addAnimation:animation forKey:@"rotation"];
    return overlayLayer1;
}
Section 3: Adding audio to a video
1. Obtain the video and audio tracks.
2. Create an AVMutableComposition object.
3. Add the video track to the AVMutableComposition, setting its time range and insertion point.
4. Add the audio track to the AVMutableComposition, setting its time range and insertion point.
5. Add the extra audio track to the AVMutableComposition, setting its time range, insertion point, and mix parameters.
- (void)performWithAsset:(AVAsset *)asset withImageNamed:(NSString *)imgName withMusicName:(NSString *)musicName
{
    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    // Check if the asset contains video and audio tracks
    // Obtain the video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }
    NSError *error = nil;
    // Step 1: load the music file from the bundle
    NSArray *components = [musicName componentsSeparatedByString:@"."];
    NSString *audioURL = [[NSBundle mainBundle] pathForResource:[components firstObject] ofType:[components lastObject]];
    AVAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioURL] options:nil];
    AVAssetTrack *newAudioTrack = [audioAsset tracksWithMediaType:AVMediaTypeAudio][0];
    // Step 2: create the AVMutableComposition
    if (!self.mutableComposition) {
        // Check whether a composition has already been created, i.e. some other tool has already been applied.
        // Create a new composition
        self.mutableComposition = [AVMutableComposition composition];
        // Add tracks to the composition from the input video asset
        if (assetVideoTrack != nil) {
            // Add the video track, setting its time range and insertion point
            AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:kCMTimeZero error:&error];
        }
        if (assetAudioTrack != nil) {
            // Add the original audio track, setting its time range and insertion point
            AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:kCMTimeZero error:&error];
        }
    }
    // Step 3: add the new audio track to the composition
    AVMutableCompositionTrack *customAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [customAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.mutableComposition duration]) ofTrack:newAudioTrack atTime:kCMTimeZero error:&error];
    // Step 4: set the time range of the added audio and mix it with the video's original audio
    AVMutableAudioMixInputParameters *mixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:customAudioTrack];
    [mixParameters setVolumeRampFromStartVolume:1 toEndVolume:0 timeRange:CMTimeRangeMake(kCMTimeZero, self.mutableComposition.duration)];
    self.mutableAudioMix = [AVMutableAudioMix audioMix];
    self.mutableAudioMix.inputParameters = @[mixParameters];
    // Step 5: post a notification to whichever view should handle completion.
    // Once this finishes you can export directly, since the music has already been added to the video.
    [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self];
}
Note: if the video is being displayed on screen, you must refresh the UI after adding the music. I use AVPlayer for playback here.
// Video playback view
- (void)video:(NSString *)videoPath {
    NSURL *url = [NSURL fileURLWithPath:videoPath];
    self.player = [AVPlayer playerWithURL:url];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.bounds = self.photo.bounds;
    playerLayer.position = CGPointMake(self.photo.bounds.size.width / 2, self.photo.bounds.size.height / 2);
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // video fill mode
    [self.photo.layer addSublayer:playerLayer];
    [self.player play];
    [self reloadPlayerViewWithMusic];
}
- (void)playbackFinished:(NSNotification *)n {
    // The registered notification delivers the AVPlayerItem as the notification object; just receive it
    AVPlayerItem *playerItem = [n object];
    // Key code: seek back to the start and replay
    [playerItem seekToTime:kCMTimeZero];
    [self.player play];
    NSLog(@"Replaying");
}
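For playbackFinished: to be called at all, the controller must observe the end-of-playback notification; a minimal registration (assumed here, it is not shown in the listing above) looks like this:

```objc
// Observe end-of-playback so playbackFinished: can loop the video.
// Typically registered right after creating the player item.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playbackFinished:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.player.currentItem];
```

Remember to remove the observer (or re-register against the new item) when you replace the current player item.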
// Refresh the video player after adding music
- (void)reloadPlayerViewWithMusic {
    // Rebuild the player item from the composition that now contains the music
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
    // Check whether there is an audio mix and a video composition to apply
    if (self.videoComposition && self.audioMix) {
        playerItem.videoComposition = self.videoComposition;
        playerItem.audioMix = self.audioMix;
        [self.player replaceCurrentItemWithPlayerItem:playerItem];
    }
}
Section 5: Exporting the video
1. Create the output path.
2. Create an AVAssetExportSession export object from the AVMutableComposition.
3. Assign the session's AVMutableVideoComposition and AVMutableAudioMix objects, the output URL, and the output file type.
4. Export asynchronously and handle the result accordingly.
- (void)performWithAsset:(AVAsset *)asset withImageNamed:(NSString *)imgName withColorName:(NSString *)color withMusicName:(NSString *)musicName with:(CGRect)photoSize {
    // Step 1: create the output path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    // Use a .mov extension so the path matches the QuickTime output file type set below
    NSString *outputURL = [documentsDirectory stringByAppendingPathComponent:
                           [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
    // Step 2: create the export session
    self.exportSession = [[AVAssetExportSession alloc] initWithAsset:self.mutableComposition presetName:AVAssetExportPresetHighestQuality];
    // Video composition
    self.exportSession.videoComposition = self.mutableVideoComposition;
    // Audio mix
    self.exportSession.audioMix = self.mutableAudioMix;
    self.exportSession.shouldOptimizeForNetworkUse = YES;
    self.exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    // Output file type
    self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^(void){
        switch (self.exportSession.status) {
            case AVAssetExportSessionStatusCompleted: {
                dispatch_async(dispatch_get_main_queue(), ^{
                    // Save the finished file once the export completes
                    [self writeVideoToPhotoLibrary:self.exportSession.outputURL];
                });
                // Step 3: post a notification to the interested view controller
                [[NSNotificationCenter defaultCenter] postNotificationName:AVSEExportCommandCompletionNotification
                                                                    object:self];
            }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", self.exportSession.error);
                [[NSNotificationCenter defaultCenter] postNotificationName:@"ExportCommandFaild" object:nil];
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled: %@", self.exportSession.error);
                break;
            default:
                break;
        }
    }];
}
- (void)writeVideoToPhotoLibrary:(NSURL *)url
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Save the video to the named album.
    // Note: saveVideo:toAlbum:completion:failure: appears to be a custom category on
    // ALAssetsLibrary, not a system method.
    [library saveVideo:url toAlbum:PHOTO_ALBUM_NAME completion:^(NSURL *assetURL, NSError *error) {
    } failure:^(NSError *error) {
    }];
}
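ALAssetsLibrary has been deprecated since iOS 9. As an alternative sketch (the method name saveVideoToPhotos: is my own; the framework calls are the standard Photos APIs), the same save can be done with PHPhotoLibrary:

```objc
// Sketch: save the exported video with the Photos framework,
// which replaces the deprecated ALAssetsLibrary.
#import <Photos/Photos.h>

- (void)saveVideoToPhotos:(NSURL *)url {
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        if (!success) {
            NSLog(@"Saving to the photo library failed: %@", error);
        }
    }];
}
```

Saving into a specific named album additionally requires a PHAssetCollectionChangeRequest inside the same change block.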