Most video-app scenarios can be covered with the AVFoundation features introduced earlier, but occasionally special requirements arise that can only be met by reading and processing media samples directly.
1. Reading and Writing
AVAssetReader is used to read media samples from an AVAsset instance. Its counterpart for encoding media and writing it into a container file is the AVAssetWriter class.
- **AVAssetReader:** An AVAssetReader is typically configured with one or more AVAssetReaderOutput objects, and audio/video frames are accessed through their copyNextSampleBuffer method. AVAssetReaderOutput is an abstract class used to read decoded media samples from a specified AVAssetTrack.
AVAssetReader works only with a single asset. If you need to read samples from several file-based assets at once, combine them into an AVComposition first (a short sketch follows this list). Also note that although AVAssetReader continuously prefetches the next available sample on background threads, it is still not suited to real-time work such as playback.
- **AVAssetWriter:** An AVAssetWriter is typically configured with one or more AVAssetWriterInput objects, to which the CMSampleBuffer objects to be written are appended. An AVAssetWriterInput is configured for a specific media type, such as audio or video, and the samples appended to it produce a single AVAssetTrack in the final output.
When using an AVAssetWriterInput configured for video samples, you normally employ a dedicated adapter object, AVAssetWriterInputPixelBufferAdaptor, which offers optimal performance when appending video samples wrapped in CVPixelBuffer objects.
In addition, AVMediaSelectionGroup and AVMediaSelectionOption settings can be used to create assets with specific alternate media presentations.
AVAssetWriter supports automatic interleaving of media samples (audio and video stored interleaved in the file for efficient reading). To preserve this behavior, append new samples to an AVAssetWriterInput only while its readyForMoreMediaData property is YES.
AVAssetWriter can operate in two modes, real-time and offline:
- **Real-time:** When handling a live source, such as writing media samples currently being captured from an AVCaptureVideoDataOutput, the AVAssetWriterInput should have its expectsMediaDataInRealTime property set to YES. This ensures that readyForMoreMediaData is computed correctly, so audio and video interleave naturally and the writer is optimized for live data.
- **Offline:** When reading from an offline source, such as sample buffers obtained from an AVAssetReader, you still need to observe the writer input's readyForMoreMediaData property before appending samples, but you can drive the supply of data with requestMediaDataWhenReadyOnQueue:usingBlock:. The block is invoked repeatedly as the writer input becomes ready to accept more samples; on each invocation, fetch the next sample from the source and append it.
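Below is a minimal sketch, referenced above, of combining two file-based assets into a single composition so that one AVAssetReader can read them; firstURL and secondURL are hypothetical local file URLs, not part of the original example.
// Combine multiple file-based assets so a single AVAssetReader can read them.
AVAsset *firstAsset = [AVAsset assetWithURL:firstURL];
AVAsset *secondAsset = [AVAsset assetWithURL:secondURL];
AVMutableComposition *composition = [AVMutableComposition composition];
NSError *error = nil;
// Append each asset's full time range back to back.
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
ofAsset:firstAsset
atTime:kCMTimeZero
error:&error];
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
ofAsset:secondAsset
atTime:composition.duration
error:&error];
// The composition now backs a single AVAssetReader.
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:composition error:&error];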
1.2 An Offline (Non-Real-Time) Read/Write Example
The following uses AVAssetReader to read samples directly from an asset file and AVAssetWriter to write them into a new QuickTime file.
// 0. Fetch the asset.
AVAsset *asset = [AVAsset assetWithURL:url];
// 1. Configure the AVAssetReader.
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
self.assetReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
NSDictionary *readerOutputSettings = @{
(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
AVAssetReaderTrackOutput *trackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:readerOutputSettings];
[self.assetReader addOutput:trackOutput];
// 2. Start reading.
[self.assetReader startReading];
// Configure the AVAssetWriter, passing the destination URL and file type for the new file.
self.assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTime];
NSDictionary *writerOutputSettings = @{
AVVideoCodecKey : AVVideoCodecH264,
AVVideoWidthKey : @1280,
AVVideoHeightKey : @720,
AVVideoCompressionPropertiesKey : @{
AVVideoMaxKeyFrameIntervalKey : @1,
AVVideoAverageBitRateKey : @10500000,
AVVideoProfileLevelKey : AVVideoProfileLevelH264Main31,
}
};
AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
outputSettings:writerOutputSettings];
[self.assetWriter addInput:writerInput];
// Start writing.
[self.assetWriter startWriting];
As introduced earlier, AVAssetExportSession can also export new asset files. AVAssetWriter's advantage over it is much finer control over the output encoding: you can specify the keyframe interval, video bit rate, pixel aspect ratio, H.264 profile, and so on.
With the AVAssetReader and AVAssetWriter objects set up, create a new writing session to carry out the read/write pass. For the offline case a pull model is generally used, i.e. samples are pulled from the source only when the AVAssetWriterInput is ready to append more.
dispatch_queue_t dispatchQueue = dispatch_queue_create("writerQueue", NULL);
// Create a new writing session, passing the start time.
[self.assetWriter startSessionAtSourceTime:kCMTimeZero];
[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
BOOL complete = NO;
// Pull model: this block is invoked repeatedly while the writer input is ready to accept more samples. Each pass copies the next available sampleBuffer from trackOutput and appends it to the input.
while ([writerInput isReadyForMoreMediaData] && !complete) {
CMSampleBufferRef sampleBuffer = [trackOutput copyNextSampleBuffer];
if (sampleBuffer) {
BOOL result = [writerInput appendSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
complete = !result;
} else {
[writerInput markAsFinished];
complete = YES;
}
}
// Once every sampleBuffer has been appended, close the session.
if (complete) {
[self.assetWriter finishWritingWithCompletionHandler:^{
AVAssetWriterStatus status = self.assetWriter.status;
if (status == AVAssetWriterStatusCompleted) {
// Writing succeeded.
} else {
// Writing failed.
}
}];
}
}];
2. Drawing an Audio Waveform
Some apps need to visualize audio activity as a waveform. This generally involves three steps:
- **Read:** First read, and if necessary decompress, the audio data.
- **Reduce:** The number of samples actually read from an asset is far larger than what can be rendered on screen, so the sample set must be reduced for presentation. The usual approach is to divide the total samples into small blocks and, for each block, take the largest sample, the average of all samples, or min/max values (see the sketch after this list).
- **Render:** Draw the reduced samples.
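As a rough illustration of the reduce step, the sketch below bins decompressed 16-bit PCM samples into one min/max pair per block. THReduceSamples, its inputs, and the NSValue-wrapped CGPoint representation are illustrative assumptions, not code from the original article.
// Hypothetical helper: reduce 16-bit PCM data to one (min, max) pair per bin.
// `sampleData` is assumed to hold samples already read, e.g. via AVAssetReader;
// `binCount` should match the number of horizontal points you intend to draw.
static NSArray *THReduceSamples(NSData *sampleData, NSUInteger binCount) {
const SInt16 *samples = (const SInt16 *)sampleData.bytes;
NSUInteger sampleCount = sampleData.length / sizeof(SInt16);
NSUInteger samplesPerBin = MAX(sampleCount / binCount, (NSUInteger)1);
NSMutableArray *pairs = [NSMutableArray array];
for (NSUInteger i = 0; i + samplesPerBin <= sampleCount; i += samplesPerBin) {
SInt16 minSample = samples[i];
SInt16 maxSample = samples[i];
// Scan the block for its extremes.
for (NSUInteger j = i; j < i + samplesPerBin; j++) {
minSample = MIN(minSample, samples[j]);
maxSample = MAX(maxSample, samples[j]);
}
// Store the pair; x = min, y = max.
[pairs addObject:[NSValue valueWithCGPoint:CGPointMake(minSample, maxSample)]];
}
return pairs;
}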
3. Advanced Capture Recording
Earlier we covered capturing CVPixelBuffer objects via AVCaptureVideoDataOutput and rendering them with OpenGL to create video effects. This has one small problem: using AVCaptureVideoDataOutput means giving up the convenience of AVCaptureMovieFileOutput for recording the output, and without a recording there is nothing to share. Below we use AVAssetWriter to implement recording from the capture output ourselves.
- First, as before, configure the session.
-(BOOL)setupSession:(NSError **)error {
self.captureSession = [[AVCaptureSession alloc] init];
self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
if (![self setupSessionInputs:error]) {
return NO;
}
if (![self setupSessionOutputs:error]) {
return NO;
}
return YES;
}
-(BOOL)setupSessionInputs:(NSError **)error {
// Set up default camera device
AVCaptureDevice *videoDevice =
[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *videoInput =
[AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
if (videoInput) {
if ([self.captureSession canAddInput:videoInput]) {
[self.captureSession addInput:videoInput];
self.activeVideoInput = videoInput;
} else {
NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Failed to add video input."};
*error = [NSError errorWithDomain:THCameraErrorDomain
code:THCameraErrorFailedToAddInput
userInfo:userInfo];
return NO;
}
} else {
return NO;
}
// Setup default microphone
AVCaptureDevice *audioDevice =
[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput =
[AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
if (audioInput) {
if ([self.captureSession canAddInput:audioInput]) {
[self.captureSession addInput:audioInput];
} else {
NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Failed to add audio input."};
*error = [NSError errorWithDomain:THCameraErrorDomain
code:THCameraErrorFailedToAddInput
userInfo:userInfo];
return NO;
}
} else {
return NO;
}
return YES;
}
-(BOOL)setupSessionOutputs:(NSError **)error {
self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
// The kCVPixelFormatType_32BGRA format works well with both OpenGL and Core Image.
NSDictionary *outputSettings =
@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
self.videoDataOutput.videoSettings = outputSettings;
self.videoDataOutput.alwaysDiscardsLateVideoFrames = NO; // Because the output is being recorded, set this to NO to give the capture callback extra time to process sample buffers.
[self.videoDataOutput setSampleBufferDelegate:self
queue:self.dispatchQueue];
if ([self.captureSession canAddOutput:self.videoDataOutput]) {
[self.captureSession addOutput:self.videoDataOutput];
} else {
return NO;
}
self.audioDataOutput = [[AVCaptureAudioDataOutput alloc] init]; // Capture the audio as well.
[self.audioDataOutput setSampleBufferDelegate:self
queue:self.dispatchQueue];
if ([self.captureSession canAddOutput:self.audioDataOutput]) {
[self.captureSession addOutput:self.audioDataOutput];
} else {
return NO;
}
// The following is where the THMovieWriter class from step two is instantiated.
NSString *fileType = AVFileTypeQuickTimeMovie;
NSDictionary *videoSettings =
[self.videoDataOutput
recommendedVideoSettingsForAssetWriterWithOutputFileType:fileType];
NSDictionary *audioSettings =
[self.audioDataOutput
recommendedAudioSettingsForAssetWriterWithOutputFileType:fileType];
self.movieWriter =
[[THMovieWriter alloc] initWithVideoSettings:videoSettings
audioSettings:audioSettings
dispatchQueue:self.dispatchQueue];
self.movieWriter.delegate = self;
return YES;
}
// Callback that handles the captured sample buffers.
-(void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
// This method is explained in step two below.
[self.movieWriter processSampleBuffer:sampleBuffer];
if (captureOutput == self.videoDataOutput) {
CVPixelBufferRef imageBuffer =
CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *sourceImage =
[CIImage imageWithCVPixelBuffer:imageBuffer options:nil];
// Hand the image to the screen for display.
[self.imageTarget setImage:sourceImage];
}
}
The code above uses a single dispatch queue for both AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, which is sufficient for this example. If you want to do more elaborate processing of the data, you may need a separate queue for each output; see Apple's RosyWriter sample code for details.
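For completeness, the self.dispatchQueue referenced above is assumed to be a serial queue created during setup, for example:
// A minimal sketch; the queue label is an arbitrary placeholder.
// For heavier per-output processing, create one queue per output instead.
self.dispatchQueue = dispatch_queue_create("com.example.captureQueue", DISPATCH_QUEUE_SERIAL);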
- Next, implement recording of the output. This requires creating a new object, THMovieWriter, with functionality similar to AVCaptureMovieFileOutput; it uses AVAssetWriter to perform the video encoding and file writing.
// .h
#import <AVFoundation/AVFoundation.h>
@protocol THMovieWriterDelegate <NSObject>
- (void)didWriteMovieAtURL:(NSURL *)outputURL; // Delegate method announcing when the movie file has been written to disk.
@end
@interface THMovieWriter : NSObject
- (id)initWithVideoSettings:(NSDictionary *)videoSettings // Two dictionaries describing the AVAssetWriter configuration, plus the dispatch queue to use.
audioSettings:(NSDictionary *)audioSettings
dispatchQueue:(dispatch_queue_t)dispatchQueue;
// Also define interface methods that start and stop the writing process.
- (void)startWriting;
- (void)stopWriting;
@property (nonatomic) BOOL isWriting;
@property (weak, nonatomic) id<THMovieWriterDelegate> delegate;
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer; // Called whenever a new sample is captured, to process the sampleBuffer.
@end
// .m
static NSString *const THVideoFilename = @"movie.mov";
@interface THMovieWriter ()
@property (strong, nonatomic) AVAssetWriter *assetWriter;
@property (strong, nonatomic) AVAssetWriterInput *assetWriterVideoInput;
@property (strong, nonatomic) AVAssetWriterInput *assetWriterAudioInput;
@property (strong, nonatomic)
AVAssetWriterInputPixelBufferAdaptor *assetWriterInputPixelBufferAdaptor;
@property (strong, nonatomic) dispatch_queue_t dispatchQueue;
@property (weak, nonatomic) CIContext *ciContext;
@property (nonatomic) CGColorSpaceRef colorSpace;
@property (strong, nonatomic) CIFilter *activeFilter;
@property (strong, nonatomic) NSDictionary *videoSettings;
@property (strong, nonatomic) NSDictionary *audioSettings;
@property (nonatomic) BOOL firstSample;
- (id)initWithVideoSettings:(NSDictionary *)videoSettings
audioSettings:(NSDictionary *)audioSettings
dispatchQueue:(dispatch_queue_t)dispatchQueue {
self = [super init];
if (self) {
_videoSettings = videoSettings;
_audioSettings = audioSettings;
_dispatchQueue = dispatchQueue;
_ciContext = [THContextManager sharedInstance].ciContext; // Core Image context used to filter the sample buffers into CVPixelBuffers.
_colorSpace = CGColorSpaceCreateDeviceRGB();
_activeFilter = [THPhotoFilters defaultFilter];
_firstSample = YES;
NSNotificationCenter *nc = [NSNotificationCenter defaultCenter]; // Register for the notification posted when the user switches filters, to keep the activeFilter property up to date.
[nc addObserver:self
selector:@selector(filterChanged:)
name:THFilterSelectionChangedNotification
object:nil];
}
return self;
}
- (void)startWriting {
dispatch_async(self.dispatchQueue, ^{ // Dispatch asynchronously so a button tap doesn't stall the UI.
NSError *error = nil;
NSString *fileType = AVFileTypeQuickTimeMovie;
self.assetWriter = // Create a new AVAssetWriter.
[AVAssetWriter assetWriterWithURL:[self outputURL]
fileType:fileType
error:&error];
if (!self.assetWriter || error) {
NSString *formatString = @"Could not create AVAssetWriter: %@";
NSLog(@"%@", [NSString stringWithFormat:formatString, error]);
return;
}
self.assetWriterVideoInput = // Create an AVAssetWriterInput to append the sample buffers obtained from the capture output.
[[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
outputSettings:self.videoSettings];
self.assetWriterVideoInput.expectsMediaDataInRealTime = YES; // Tell the input to optimize for real-time media data.
UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
self.assetWriterVideoInput.transform = // Adapt for the device's orientation at capture time.
THTransformForDeviceOrientation(orientation);
NSDictionary *attributes = @{ // Attributes for the pixel buffer adaptor. For maximum efficiency these values should match the configuration of the AVCaptureVideoDataOutput.
(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
(id)kCVPixelBufferWidthKey : self.videoSettings[AVVideoWidthKey],
(id)kCVPixelBufferHeightKey : self.videoSettings[AVVideoHeightKey],
(id)kCVPixelBufferOpenGLESCompatibilityKey : (id)kCFBooleanTrue
};
self.assetWriterInputPixelBufferAdaptor = // Create the adaptor, which provides an optimized CVPixelBufferPool; the pool supplies CVPixelBuffer objects into which the filtered video frames are rendered.
[[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:self.assetWriterVideoInput
sourcePixelBufferAttributes:attributes];
if ([self.assetWriter canAddInput:self.assetWriterVideoInput]) { // Standard step: add the video input to the writer.
[self.assetWriter addInput:self.assetWriterVideoInput];
} else {
NSLog(@"Unable to add video input.");
return;
}
self.assetWriterAudioInput = // Same procedure as for the video input above.
[[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio
outputSettings:self.audioSettings];
self.assetWriterAudioInput.expectsMediaDataInRealTime = YES;
if ([self.assetWriter canAddInput:self.assetWriterAudioInput]) {
[self.assetWriter addInput:self.assetWriterAudioInput];
} else {
NSLog(@"Unable to add audio input.");
}
self.isWriting = YES; // Indicates that sample buffers may now be appended.
self.firstSample = YES;
});
}
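// THTransformForDeviceOrientation is a helper from the sample project and is
// not shown in the original listing. A minimal sketch of what it might look
// like (an assumption, not necessarily the project's exact implementation);
// in practice it would be defined before its first use or declared in a header:
static CGAffineTransform THTransformForDeviceOrientation(UIDeviceOrientation orientation) {
switch (orientation) {
case UIDeviceOrientationLandscapeRight:
return CGAffineTransformMakeRotation(M_PI); // rotated 180 degrees from the camera's native landscape-left orientation
case UIDeviceOrientationPortraitUpsideDown:
return CGAffineTransformMakeRotation((M_PI / 2) * 3);
case UIDeviceOrientationPortrait:
case UIDeviceOrientationFaceUp:
case UIDeviceOrientationFaceDown:
return CGAffineTransformMakeRotation(M_PI / 2);
default: // UIDeviceOrientationLandscapeLeft
return CGAffineTransformIdentity;
}
}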
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer {
if (!self.isWriting) {
return;
}
CMFormatDescriptionRef formatDesc = // This method receives both audio and video samples, so determine the media type and branch accordingly.
CMSampleBufferGetFormatDescription(sampleBuffer);
CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDesc);
if (mediaType == kCMMediaType_Video) {
CMTime timestamp =
CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (self.firstSample) { // For the very first sample buffer processed, have the asset writer start a new writing session.
if ([self.assetWriter startWriting]) {
[self.assetWriter startSessionAtSourceTime:timestamp];
} else {
NSLog(@"Failed to start writing.");
}
self.firstSample = NO;
}
CVPixelBufferRef outputRenderBuffer = NULL;
CVPixelBufferPoolRef pixelBufferPool =
self.assetWriterInputPixelBufferAdaptor.pixelBufferPool;
CVReturn err = CVPixelBufferPoolCreatePixelBuffer(NULL, // Create an empty CVPixelBuffer from the adaptor's pool; the filtered video frame will be rendered into this buffer.
pixelBufferPool,
&outputRenderBuffer);
if (err) {
NSLog(@"Unable to obtain a pixel buffer from the pool.");
return;
}
CVPixelBufferRef imageBuffer = // Get the current sample buffer's CVPixelBuffer, create a new CIImage from it, and set it as the active filter's kCIInputImageKey value; the filter then produces the output image.
CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:imageBuffer
options:nil];
[self.activeFilter setValue:sourceImage forKey:kCIInputImageKey];
CIImage *filteredImage = self.activeFilter.outputImage;
if (!filteredImage) {
filteredImage = sourceImage;
}
[self.ciContext render:filteredImage // Render the filtered CIImage output into the CVPixelBuffer created above.
toCVPixelBuffer:outputRenderBuffer
bounds:filteredImage.extent
colorSpace:self.colorSpace];
if (self.assetWriterVideoInput.readyForMoreMediaData) { // If the video input is ready, append the pixel buffer, along with the current sample's presentation time, to the adaptor. This completes the processing of the current video sample.
if (![self.assetWriterInputPixelBufferAdaptor
appendPixelBuffer:outputRenderBuffer
withPresentationTime:timestamp]) {
NSLog(@"Error appending pixel buffer.");
}
}
CVPixelBufferRelease(outputRenderBuffer);
}
else if (!self.firstSample && mediaType == kCMMediaType_Audio) { // Once the first video sample has been handled, audio samples are appended to the audio input.
if (self.assetWriterAudioInput.isReadyForMoreMediaData) {
if (![self.assetWriterAudioInput appendSampleBuffer:sampleBuffer]) {
NSLog(@"Error appending audio sample buffer.");
}
}
}
}
- (void)stopWriting {
self.isWriting = NO; // Stop processSampleBuffer from handling any more samples.
dispatch_async(self.dispatchQueue, ^{
[self.assetWriter finishWritingWithCompletionHandler:^{ // Finish the writing session and close the file on disk.
if (self.assetWriter.status == AVAssetWriterStatusCompleted) {
dispatch_async(dispatch_get_main_queue(), ^{ // Check the writer status; if writing succeeded, return to the main queue and hand the movie off for saving to the user's Photos library.
NSURL *fileURL = [self.assetWriter outputURL];
[self.delegate didWriteMovieAtURL:fileURL];
});
} else {
NSLog(@"Failed to write movie: %@", self.assetWriter.error);
}
}];
});
}
// Configures the AVAssetWriter output location: define a URL in the temporary directory, deleting any existing file with the same name.
- (NSURL *)outputURL {
NSString *filePath =
[NSTemporaryDirectory() stringByAppendingPathComponent:THVideoFilename];
NSURL *url = [NSURL fileURLWithPath:filePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:url.path]) {
[[NSFileManager defaultManager] removeItemAtURL:url error:nil];
}
return url;
}
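On the delegate side, didWriteMovieAtURL: is where the finished movie can be saved to the user's Photos library. A minimal sketch using the ALAssetsLibrary API (the library in common use for this API set; the delegate class is assumed to import AssetsLibrary):
- (void)didWriteMovieAtURL:(NSURL *)outputURL {
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Verify the movie can be written to the Saved Photos album before trying.
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
[library writeVideoAtPathToSavedPhotosAlbum:outputURL
completionBlock:^(NSURL *assetURL, NSError *error) {
if (error) {
NSLog(@"Failed to save movie: %@", error);
}
}];
}
}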