Preface
Starting with this article, we will gradually learn iOS's built-in multimedia frameworks (AVFoundation, VideoToolbox, CoreMedia, CoreVideo) to implement multimedia processing, and compare both the implementation approach and the results with the ffmpeg way of doing things.
The goal of this article is to demux a local multimedia file (MP4, MP3, MOV, and so on).
For the corresponding ffmpeg implementation, see: Demuxing MP4/MP3 with ffmpeg (13)
Demuxing workflow
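In outline, the flow implemented in this article is roughly the following (a sketch only; the full implementation appears further down):
1. Create an AVURLAsset from the file URL;
2. Call loadValuesAsynchronouslyForKeys: with @[@"tracks",@"duration"] to parse the container (the counterpart of ffmpeg's avformat_open_input());
3. Create an AVAssetReader from the asset;
4. For each AVAssetTrack, create an AVAssetReaderTrackOutput (outputSettings nil for compressed samples, or a settings dictionary to have the framework decode) and add it to the reader;
5. Call startReading, then loop copyNextSampleBuffer on each output until it returns NULL;
6. Release each CMSampleBuffer, and tear the reader down once its status reports completed.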
Demuxing-related objects and functions
1. AVURLAsset
The AVAsset object is an abstract class; it is AVFoundation's encapsulation of a container-format resource, such as an MP4 container, an MKV container, an MP3 container, and so on. AVURLAsset is a concrete subclass of it,
and an AVAsset is usually created through this subclass. The container's properties (such as audio/video duration, video frame rate, codec formats, etc.) are stored in the AVAsset object as key-value pairs.
2. loadValuesAsynchronouslyForKeys:
The resource used to initialize an AVURLAsset can be a local MP4 file or a remote HTTP-based MP4 resource. The properties of an AVURLAsset (call it inputAsset here) can be accessed in two ways, synchronously or asynchronously.
a. Accessing a property directly, e.g. inputAsset.tracks or inputAsset.duration, is the synchronous way. Its mechanism: if the property has not been initialized yet, the current thread blocks while the container is demuxed to initialize it.
b. The asynchronous way: pass the property names to loadValuesAsynchronouslyForKeys:, e.g. @[@"tracks",@"duration"] loads those two properties asynchronously. This method plays the same role as ffmpeg's avformat_open_input(). Note: inputAsset must not be released before loading completes, otherwise initialization fails; also, the completion handler is not called on the main thread.
3. AVAssetReader
It acts as the manager for reading audio/video data out of the container object (the AVAsset); audio/video output objects must be attached to it before it can deliver data to the outside.
4. AVAssetTrack
1. A media-stream object: it represents one audio/video stream in the container, playing the same role as ffmpeg's AVStream.
2. A container may contain one video stream, or multiple audio streams, or multiple subtitle streams.
3. Once the AVAsset object has been initialized, its track objects are initialized as well.
5. AVAssetReaderTrackOutput
1. An audio/video output object; it is a concrete subclass of AVAssetReaderOutput, and callers read audio/video data through it.
2. This object is responsible for configuring the format (and other parameters) of the output audio/video data.
3. An output object must be added to the reader before data can be read from it.
6. assetReaderTrackOutputWithTrack:outputSettings:
One thing to note about this method: when the outputSettings: argument is nil, it outputs the compressed data;
if outputSettings is configured, the framework internally performs hardware-accelerated decoding and outputs the decompressed data, in the format configured in outputSettings.
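As a minimal sketch of the two configurations (assuming videoTrack is an AVAssetTrack already obtained from the asset, as in the code further down):
// Pass nil settings to receive the compressed samples as stored in the container
AVAssetReaderTrackOutput *passthroughOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
// Configure a pixel format to have the framework decode internally and hand back raw frames
NSDictionary *settings = @{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)};
AVAssetReaderTrackOutput *decodedOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];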
Implementation
Header file AVDemuxer.h
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface AVDemuxer : NSObject
@property(nonatomic,strong)AVAsset *asset;
@property(nonatomic,strong)AVAssetReader *assetReader;
@property(nonatomic,assign)BOOL autoDecode;
- (id)initWithURL:(NSURL*)localURL;
/** Demux the MP4 file and read the audio/video data out of it
*/
- (void)startProcess;
- (void)stopProcess;
@end
If autoDecode is set to YES, AVFoundation will automatically decode the compressed audio/video data and output uncompressed data; when it is NO, the compressed audio/video data is output as-is (this demo sets it to YES in the initializer).
Implementation file
#import "AVDemuxer.h"
@interface AVDemuxer()
{
dispatch_semaphore_t decodeSemaphore;
}
@property(nonatomic,strong)NSURL *url;
@end
@implementation AVDemuxer
+ (void)testGeneric
{
// The full identifiers of the media types AVFoundation supports
NSLog(@"types %@",[AVURLAsset audiovisualTypes]);
// The MIME types AVFoundation supports
NSLog(@"types %@",[AVURLAsset audiovisualMIMETypes]);
// Check whether the MOV format is supported; AVFoundation supports MOV, MP4 and other container formats by default
NSLog(@"yes %d",[AVURLAsset isPlayableExtendedMIMEType:@"video/quicktime"]);
}
- (id)initWithURL:(NSURL*)localURL
{
if (!(self = [super init])) {
return nil;
}
self.url = localURL;
self.autoDecode = YES;
/** Problem encountered: the program crashed after the AVDemuxer object was released
* Cause: when the object is released, GCD checks the semaphore's value; if the value is lower than the value the semaphore was created with, GCD considers it still in use and crashes when disposing of it
* Fix: replace the previous dispatch_semaphore_create(1); with the following
*/
decodeSemaphore = dispatch_semaphore_create(0);
dispatch_semaphore_signal(decodeSemaphore);
return self;
}
- (void)startProcess
{
if (dispatch_semaphore_wait(decodeSemaphore, DISPATCH_TIME_NOW) != 0) {
return;
}
NSDictionary *inputOptions = @{
AVURLAssetPreferPreciseDurationAndTimingKey:@(YES)
};
/** The AVAsset object
* 1. It is an abstract class, AVFoundation's encapsulation of a container-format resource, such as an MP4 container, an MKV container, an MP3 container, and so on. AVURLAsset is a concrete subclass,
* and an AVAsset is usually created through that subclass. The container's properties (audio/video duration, video frame rate, codec formats, etc.) are stored in the AVAsset as key-value pairs;
* 2. The resource used to initialize an AVURLAsset can be a local MP4 file or a remote HTTP-based MP4 resource. Its properties can be accessed in two ways, synchronously or asynchronously:
* a. Accessing a property directly, e.g. inputAsset.tracks or inputAsset.duration, is synchronous: if the property is not yet initialized, the current thread blocks while the container is demuxed to
* initialize it; for a remote resource this can be quite time-consuming.
* b. Asynchronously, via loadValuesAsynchronouslyForKeys:, passing in the property names, e.g. @[@"tracks",@"duration"] loads those two properties asynchronously
*
* All subsequent reading of data from the container, as well as writing data into a container, depends on this object
*/
NSLog(@"begin");
// 1. Create the AVAsset object
self.asset = [[AVURLAsset alloc] initWithURL:self.url options:inputOptions];
// AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:@"https://images.flypie.net/test_1280x720_3.mp4"] options:inputOptions];
NSLog(@"end");
// Accessing the properties below right here would initialize them synchronously and block the current thread; this is generally acceptable for local resources
// NSLog(@"duration %f",CMTimeGetSeconds(inputAsset.duration));
// NSLog(@"tracks %@",inputAsset.tracks);
__weak typeof(self) weakSelf = self;
// 2. Parse the container format; same role as ffmpeg's avformat_open_input(). The AVAsset's properties are initialized asynchronously this way;
// Note: inputAsset must not be released, otherwise initialization fails; the completion handler is not called on the main thread
CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
[self.asset loadValuesAsynchronouslyForKeys:@[@"tracks",@"duration"] completionHandler:^{
// NSLog(@"thread %@",[NSThread currentThread]);
NSError *error = nil;
AVKeyValueStatus status = [weakSelf.asset statusOfValueForKey:@"tracks" error:&error];
if (status != AVKeyValueStatusLoaded) {
NSLog(@"error %@",error);
return;
}
NSLog(@"async duration %f",CMTimeGetSeconds(weakSelf.asset.duration));
NSLog(@"async tracks %@",weakSelf.asset.tracks);
// 3. Create the reader objects that pull out the audio/video data
[weakSelf processAsset];
NSLog(@"total time %f s",CFAbsoluteTimeGetCurrent() - startTime);
// The task is done
dispatch_semaphore_signal(self->decodeSemaphore);
NSLog(@"end (callback)");
}];
// Block the current thread until the work above finishes
dispatch_semaphore_wait(decodeSemaphore, DISPATCH_TIME_FOREVER);
NSLog(@"end");
}
- (void)processAsset
{
self.assetReader = [self createAssetReader];
AVAssetReaderOutput *videoTrackout = nil;
AVAssetReaderOutput *audioTrackout = nil;
for (AVAssetReaderOutput *output in self.assetReader.outputs) {
if ([output.mediaType isEqualToString:AVMediaTypeVideo]) {
videoTrackout = output;
}
if ([output.mediaType isEqualToString:AVMediaTypeAudio]) {
audioTrackout = output;
}
}
// Start reading; this does not block
if ([self.assetReader startReading] == NO) {
NSLog(@"startReading failed");
return;
}
// Use the reader's status to tell whether there is still unread audio/video data
CMSampleBufferRef videoSamplebuffer = NULL;
CMSampleBufferRef audioSamplebuffer = NULL;
BOOL videoFinish = NO;
BOOL audioFinish = NO;
int sum = 0;
while (self.assetReader.status == AVAssetReaderStatusReading && (!videoFinish || !audioFinish)) {
// Read video data
if (videoTrackout != nil) {
videoSamplebuffer = [videoTrackout copyNextSampleBuffer];
if (videoSamplebuffer != NULL) {
CMTime pts = CMSampleBufferGetOutputPresentationTimeStamp(videoSamplebuffer);
CMTime dts = CMSampleBufferGetOutputDecodeTimeStamp(videoSamplebuffer);
CMTime duration = CMSampleBufferGetOutputDuration(videoSamplebuffer);
size_t size = CMSampleBufferGetSampleSize(videoSamplebuffer,0);
sum++;
// For compressed data, pts, dts and duration are all available; once the framework has decoded internally, dts and duration may be lost
NSLog(@"video pts(%f),dts(%f),duration(%f) size(%ld) sum %d",CMTimeGetSeconds(pts),CMTimeGetSeconds(dts),CMTimeGetSeconds(duration),size,sum);
// Release resources
CMSampleBufferInvalidate(videoSamplebuffer);
CFRelease(videoSamplebuffer);
} else {
videoFinish = YES;
}
} else {
videoFinish = YES;
}
// Read audio data
if (audioTrackout != nil) {
audioSamplebuffer = [audioTrackout copyNextSampleBuffer];
if (audioSamplebuffer != NULL) {
CMTime pts = CMSampleBufferGetOutputPresentationTimeStamp(audioSamplebuffer);
CMTime dts = CMSampleBufferGetOutputDecodeTimeStamp(audioSamplebuffer);
CMTime duration = CMSampleBufferGetOutputDuration(audioSamplebuffer);
// For compressed data, pts, dts and duration are all available; once the framework has decoded internally, dts and duration may be lost
NSLog(@"audio pts(%f),dts(%f),duration(%f)",CMTimeGetSeconds(pts),CMTimeGetSeconds(dts),CMTimeGetSeconds(duration));
CMSampleBufferInvalidate(audioSamplebuffer);
CFRelease(audioSamplebuffer);
} else {
audioFinish = YES;
}
} else {
audioFinish = YES;
}
}
if (self.assetReader.status == AVAssetReaderStatusCompleted) {
[self.assetReader cancelReading];
self.assetReader = nil;
}
}
- (AVAssetReader *)createAssetReader
{
NSError *error = nil;
/** The AVAssetReader object
* Much like AVCaptureSession, it acts as the manager that reads data from the container object (the AVAsset); it needs audio/video output objects attached
*/
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:self.asset error:&error];
NSMutableDictionary *videoOutputSettings = [NSMutableDictionary dictionary];
if ([AVDemuxer supportsFastTextureUpload]) {
[videoOutputSettings setObject:@(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) forKey:(__bridge NSString*)kCVPixelBufferPixelFormatTypeKey];
} else {
[videoOutputSettings setObject:@(kCVPixelFormatType_32BGRA) forKey:(__bridge NSString*)kCVPixelBufferPixelFormatTypeKey];
}
/** The AVAssetTrack object
* 1. A media-stream object: it represents one audio/video stream in the container, playing the same role as ffmpeg's AVStream.
* 2. A container may contain one video stream, or multiple audio streams, or multiple subtitle streams.
* 3. Once the AVAsset object has been initialized, its track objects are initialized too; they are obtained as follows
*/
// Use firstObject rather than objectAtIndex:0 so an audio-only file (e.g. MP3) yields nil instead of crashing
AVAssetTrack *videoTrack = [[self.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
/** The AVAssetReaderTrackOutput object
* 1. An audio/video output object; it is a concrete subclass of AVAssetReaderOutput, and callers read audio/video data through it
* 2. This object is responsible for configuring the format (and other parameters) of the output audio/video data
* 3. An output object must be added to the reader before data can be read from it
*/
// Add the video output object; note: when the last parameter is nil, the compressed data is output;
// if outputSettings is configured, the framework internally performs hardware-accelerated decoding and outputs the decompressed data in the format configured in outputSettings;
if (!self.autoDecode) {
videoOutputSettings = nil;
}
if (videoTrack != nil) {
    AVAssetReaderTrackOutput *videoTrackOut = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOutputSettings];
    videoTrackOut.alwaysCopiesSampleData = NO;
    [assetReader addOutput:videoTrackOut];
}
// Add the audio output object
AVAssetTrack *audioTrack = [[self.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
if (audioTrack != nil) {
    AVAssetReaderTrackOutput *audioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
    audioTrackOutput.alwaysCopiesSampleData = NO;
    [assetReader addOutput:audioTrackOutput];
}
return assetReader;
}
- (void)stopProcess
{
}
+ (BOOL)supportsFastTextureUpload
{
#if TARGET_IPHONE_SIMULATOR
return NO;
#endif
// Supported since iOS 5
return YES;
}
@end
Problems encountered
1. The program crashed after the AVDemuxer object was released
Cause: when the object is released, GCD checks the semaphore's value; if the value is lower than the value the semaphore was created with, GCD considers it still in use, so it crashes when the semaphore is disposed of.
Fix: replace the previous dispatch_semaphore_create(1); with
decodeSemaphore = dispatch_semaphore_create(0);
dispatch_semaphore_signal(decodeSemaphore);
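Why this works, as a sketch: for wait/signal purposes the pattern behaves exactly like create(1), but the semaphore's creation value is 0, so releasing it while the "slot" is taken no longer trips the dispose-time check:
// Unsafe: if released while the value (0) is below the creation value (1), dispose crashes
// decodeSemaphore = dispatch_semaphore_create(1);
// Safe equivalent: creation value is 0, then one signal raises the value to 1
decodeSemaphore = dispatch_semaphore_create(0);
dispatch_semaphore_signal(decodeSemaphore);
dispatch_semaphore_wait(decodeSemaphore, DISPATCH_TIME_FOREVER); // take the slot (value back to 0)
// ... work ...
dispatch_semaphore_signal(decodeSemaphore); // return it (value back to 1)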
Project repository
https://github.com/nldzsz/ffmpeg-demo
See the files AVDemuxer.h/AVDemuxer.m under the AVFoundation directory.