0x00 Preface
- The requirement: merge two or more short video clips into a single video
- The clips are joined end to end; this is not about overlaying one video on top of another
- The code below is sample code and only shows merging two video files; merging more than two files into one requires some changes (a sketch of a multi-clip loop follows the Swift code at the end)
0x01 Code (Obj-C & Swift)
//Obj-c
- (void)combVideos {
NSBundle *mainBundle = [NSBundle mainBundle];
NSString *firstVideo = [mainBundle pathForResource:@"1" ofType:@"mp4"];
NSString *secondVideo = [mainBundle pathForResource:@"2" ofType:@"mp4"];
NSDictionary *optDict = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVAsset *firstAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:firstVideo] options:optDict];
AVAsset *secondAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:secondVideo] options:optDict];
AVMutableComposition *composition = [AVMutableComposition composition];
//Add a video track to the composition
AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//Both clips are inserted at time zero without computing a start CMTime for each; inserting at zero pushes existing content later, so whichever clip is inserted last ends up at the front of the merged video. You can also compute the time and insert each clip at an explicit position (see the commented-out alternative below)
//CMTimeRangeMake specifies the start time and the duration
CMTimeRange firstTimeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
CMTimeRange secondTimeRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration);
[compositionTrack insertTimeRange:secondTimeRange ofTrack:[secondAsset tracksWithMediaType:AVMediaTypeVideo][0] atTime:kCMTimeZero error:nil];
[compositionTrack insertTimeRange:firstTimeRange ofTrack:[firstAsset tracksWithMediaType:AVMediaTypeVideo][0] atTime:kCMTimeZero error:nil];
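//A sketch of that alternative (same result, but with explicit positions): insert the first clip at zero and the second at the first clip's duration
//[compositionTrack insertTimeRange:firstTimeRange ofTrack:[firstAsset tracksWithMediaType:AVMediaTypeVideo][0] atTime:kCMTimeZero error:nil];
//[compositionTrack insertTimeRange:secondTimeRange ofTrack:[secondAsset tracksWithMediaType:AVMediaTypeVideo][0] atTime:firstAsset.duration error:nil];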
//Merging only the video would leave the exported file silent, so the audio has to be inserted into the composition as well
//Add the audio; another local music file would also work, handled the same way as the video
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:secondTimeRange ofTrack:[secondAsset tracksWithMediaType:AVMediaTypeAudio][0] atTime:kCMTimeZero error:nil];
[audioTrack insertTimeRange:firstTimeRange ofTrack:[firstAsset tracksWithMediaType:AVMediaTypeAudio][0] atTime:kCMTimeZero error:nil];
NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
NSString *filePath = [cachePath stringByAppendingPathComponent:@"comp.mp4"];
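//Optional safeguard (a sketch, not in the original flow): the export fails if a file already exists at the output path, so remove any leftover file from a previous run
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) { [[NSFileManager defaultManager] removeItemAtPath:filePath error:nil]; }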
AVAssetExportSession *exporterSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporterSession.outputFileType = AVFileTypeMPEG4;
exporterSession.outputURL = [NSURL fileURLWithPath:filePath]; //the export will fail if a file already exists at this path
exporterSession.shouldOptimizeForNetworkUse = YES; //optimize the file for network transfer
[exporterSession exportAsynchronouslyWithCompletionHandler:^{
switch (exporterSession.status) {
case AVAssetExportSessionStatusUnknown:
NSLog(@"exporter Unknow");
break;
case AVAssetExportSessionStatusCancelled:
NSLog(@"exporter Canceled");
break;
case AVAssetExportSessionStatusFailed:
NSLog(@"exporter Failed");
break;
case AVAssetExportSessionStatusWaiting:
NSLog(@"exporter Waiting");
break;
case AVAssetExportSessionStatusExporting:
NSLog(@"exporter Exporting");
break;
case AVAssetExportSessionStatusCompleted:
NSLog(@"exporter Completed");
break;
}
}];
}
Swift code:
You first need to add: import AVFoundation
*For comments on the Swift version, see the Obj-C version above*
func combVideos() {
let firstVideo = NSBundle.mainBundle().pathForResource("1", ofType: "mp4")
let secondVideo = NSBundle.mainBundle().pathForResource("2", ofType: "mp4")
let optDict = [AVURLAssetPreferPreciseDurationAndTimingKey : NSNumber(bool: false)]
let firstAsset = AVURLAsset(URL: NSURL(fileURLWithPath: firstVideo!), options: optDict)
let secondAsset = AVURLAsset(URL: NSURL(fileURLWithPath: secondVideo!), options: optDict)
let composition = AVMutableComposition()
do {
let compositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let firstTimeRange = CMTimeRange(start: kCMTimeZero, duration: firstAsset.duration)
let secondTimeRange = CMTimeRange(start: kCMTimeZero, duration: secondAsset.duration)
// Add the video tracks
try compositionTrack.insertTimeRange(secondTimeRange, ofTrack: secondAsset.tracksWithMediaType(AVMediaTypeVideo).first!, atTime: kCMTimeZero)
try compositionTrack.insertTimeRange(firstTimeRange, ofTrack: firstAsset.tracksWithMediaType(AVMediaTypeVideo).first!, atTime: kCMTimeZero)
// Add the audio tracks
let audioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
try audioTrack.insertTimeRange(secondTimeRange, ofTrack: secondAsset.tracksWithMediaType(AVMediaTypeAudio).first!, atTime: kCMTimeZero)
try audioTrack.insertTimeRange(firstTimeRange, ofTrack: firstAsset.tracksWithMediaType(AVMediaTypeAudio).first!, atTime: kCMTimeZero)
let cache = NSSearchPathForDirectoriesInDomains(.CachesDirectory, .UserDomainMask, true).last
let filePath = cache! + "/comp_sw.mp4"
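// Optional safeguard (a sketch, not in the original flow): the export fails if a file already exists at the output path, so clear any leftover output first
if NSFileManager.defaultManager().fileExistsAtPath(filePath) { _ = try? NSFileManager.defaultManager().removeItemAtPath(filePath) }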
let exporterSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
exporterSession?.outputFileType = AVFileTypeMPEG4
exporterSession?.outputURL = NSURL(fileURLWithPath: filePath)
exporterSession?.shouldOptimizeForNetworkUse = true
exporterSession?.exportAsynchronouslyWithCompletionHandler({ () -> Void in
switch exporterSession!.status {
case .Unknown:
print("unknow")
case .Cancelled:
print("cancelled")
case .Failed:
print("failed")
case .Waiting:
print("waiting")
case .Exporting:
print("exporting")
case .Completed:
print("completed")
}
})
} catch {
print("\(error)")
}
}
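To merge more than two clips in their listed order (as mentioned at the top), one option is to keep a running cursor and append each asset's video and audio at the current end of the composition. The following is only a minimal sketch in the same Swift 2 style API as above; the function name combineClips, the output name comp_multi.mp4, and the choice to skip clips without an audio track are placeholder assumptions, not part of the original code.
func combineClips(named names: [String]) {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    var cursor = kCMTimeZero // running end time of the composition
    do {
        for name in names {
            guard let path = NSBundle.mainBundle().pathForResource(name, ofType: "mp4") else { continue }
            let asset = AVURLAsset(URL: NSURL(fileURLWithPath: path), options: nil)
            let range = CMTimeRange(start: kCMTimeZero, duration: asset.duration)
            // Insert each clip at the current cursor so the clips keep their listed order
            if let video = asset.tracksWithMediaType(AVMediaTypeVideo).first {
                try videoTrack.insertTimeRange(range, ofTrack: video, atTime: cursor)
            }
            if let audio = asset.tracksWithMediaType(AVMediaTypeAudio).first {
                try audioTrack.insertTimeRange(range, ofTrack: audio, atTime: cursor)
            }
            cursor = CMTimeAdd(cursor, asset.duration)
        }
        let cache = NSSearchPathForDirectoriesInDomains(.CachesDirectory, .UserDomainMask, true).last!
        let filePath = cache + "/comp_multi.mp4" // placeholder output name
        if NSFileManager.defaultManager().fileExistsAtPath(filePath) { _ = try? NSFileManager.defaultManager().removeItemAtPath(filePath) }
        let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
        exporter?.outputFileType = AVFileTypeMPEG4
        exporter?.outputURL = NSURL(fileURLWithPath: filePath)
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.exportAsynchronouslyWithCompletionHandler {
            print("export status: \(exporter?.status.rawValue)")
        }
    } catch {
        print("\(error)")
    }
}
Advancing the cursor by each asset's duration keeps the clips butted end to end on a single video track and a single audio track, which gives the same end-to-end result as the two-clip version above.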
0x10 Result (note the change in total duration)
Feedback and corrections are welcome.