AVPlayer's API is genuinely unfriendly: a single screenshot problem took me six hours to solve completely. Searching online turns up two screenshot approaches: AVPlayerItemVideoOutput and AVAssetImageGenerator.
AVAssetImageGenerator handles screenshots for non-HLS video (i.e. not segmented streams), such as MPEG-4 files. At first I had not constrained the time tolerance, so the captured frame always lagged the requested time by 1-2 seconds:
AVURLAsset *asset = self.playerItem.asset;
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
self.imageGenerator.appliesPreferredTrackTransform = YES;
// Note: request exact-time capture; these tolerances play the same role as the
// toleranceBefore/After parameters of seekToTime:toleranceBefore:toleranceAfter:
self.imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
self.imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
// Perform the capture; `time` is the CMTime built from the float `timestamp`,
// and isFloatEqual is a small epsilon-comparison helper
int attemptNumber = 0;
BOOL success = NO;
// The actual frame time can drift from the requested one, so retry a few times
while (attemptNumber < 3 && !success) {
    CMTime actualTime;
    NSError *error = nil;
    videoImage = [self.imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error || videoImage == NULL) {
        return nil;
    }
    captureImage = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
    float actual = CMTimeGetSeconds(actualTime);
    if (isFloatEqual(timestamp, actual)) {
        success = YES;
    } else {
        attemptNumber++;
    }
}
// You can also use generateCGImagesAsynchronouslyForTimes:completionHandler:
// to fetch frames for multiple time points asynchronously
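For reference, here is a minimal sketch of that asynchronous variant; the requested times and the 600 timescale are illustrative:

NSArray<NSValue *> *times = @[
    [NSValue valueWithCMTime:CMTimeMakeWithSeconds(1.0, 600)],
    [NSValue valueWithCMTime:CMTimeMakeWithSeconds(5.0, 600)]
];
[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                          completionHandler:^(CMTime requestedTime,
                                                              CGImageRef image,
                                                              CMTime actualTime,
                                                              AVAssetImageGeneratorResult result,
                                                              NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
        UIImage *frame = [UIImage imageWithCGImage:image];
        // The handler runs on a background queue; hop to the
        // main queue before touching any UI with `frame`
    }
}];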
AVPlayerItemVideoOutput, on the other hand, handles screenshots for HLS (m3u8) streams. Capturing after pausing the video worked fine, but capturing during playback ran into the problem described in this article: the pixel buffer coming back NULL. After much searching I found an answer that says:
1. Create player & player item, dispatch queue and display link
2. Register observer for AVPlayerItem status key
3. On status AVPlayerStatusReadyToPlay, create AVPlayerItemVideoOutput and start display link
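Step 2 above is ordinary KVO on the item's status key. A minimal sketch of that registration, where observePlayerItemStatus and setupVideoOutputIfNeeded are hypothetical helper names (the latter stands for the setup block shown further below):

// Hypothetical helper: call right after creating the player item
- (void)observePlayerItemStatus {
    [self.playerItem addObserver:self
                      forKeyPath:@"status"
                         options:NSKeyValueObservingOptionNew
                         context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSKeyValueChangeKey, id> *)change
                       context:(void *)context {
    if (object == self.playerItem && [keyPath isEqualToString:@"status"]
        && self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
        // Step 3: create the output and start the display link (see below)
        [self setupVideoOutputIfNeeded];
    }
}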
I then moved the construction and registration of AVPlayerItemVideoOutput to after AVPlayerItemStatusReadyToPlay, and it still didn't work. Only after an early objc.io article demonstrated how to use CADisplayLink did I understand what "create AVPlayerItemVideoOutput and start display link" actually means:
// Create the instances once the item reaches AVPlayerItemStatusReadyToPlay
if (self.config.isVideoHLSFormat
&& self.videoOutput == nil
&& self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
NSDictionary *settings = @{(id)kCVPixelBufferPixelFormatTypeKey:
@(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
};
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
[self.playerItem addOutput:self.videoOutput];
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}
// Cache the videoOutput's latest buffer on each display refresh
- (void)displayLinkCallback:(CADisplayLink *)sender {
    CMTime time = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:time]) {
        // copyPixelBufferForItemTime: returns a +1 retained buffer,
        // so release the previously cached one first
        if (self.lastSnapshotPixelBuffer) {
            CVPixelBufferRelease(self.lastSnapshotPixelBuffer);
        }
        self.lastSnapshotPixelBuffer = [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    }
}
// Screenshot handling; `time` here is the item time to capture, e.g.
// [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()]
CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
// Fetching a fresh buffer first keeps a capture taken while paused accurate;
// fall back to the cached buffer when the fresh copy comes back NULL
BOOL usedCachedBuffer = NO;
if (buffer == NULL && self.lastSnapshotPixelBuffer) {
    buffer = self.lastSnapshotPixelBuffer;
    usedCachedBuffer = YES;
}
if (buffer) {
    // Wrap the pixel buffer in a CIImage
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    // Render it into a CGImage
    CIContext *context = [CIContext contextWithOptions:nil];
    size_t width = CVPixelBufferGetWidth(buffer);
    size_t height = CVPixelBufferGetHeight(buffer);
    videoImage = [context createCGImage:ciImage
                               fromRect:CGRectMake(0, 0, width, height)];
    // Produce the UIImage
    captureImage = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
    // A freshly copied buffer is +1 and must be released here; the cached
    // one is released during cleanup instead
    if (!usedCachedBuffer) {
        CVPixelBufferRelease(buffer);
    }
}
My handling of the captured image is simple: I just compress it with UIImageJPEGRepresentation(image, 0.5), though you could do something more elaborate, as in the sketch below.
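A minimal sketch of that compression step, assuming captureImage holds the frame (snapshotPath is an illustrative file path):

// Compress to JPEG at 0.5 quality and persist it; snapshotPath is illustrative
NSData *jpegData = UIImageJPEGRepresentation(captureImage, 0.5);
[jpegData writeToFile:snapshotPath atomically:YES];

Finally, when rebuilding the playerItem, don't forget to clean up: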
[self.playerItem removeOutput:self.videoOutput];
self.videoOutput = nil;
[self.displayLink invalidate];
self.displayLink = nil;
// Release the cached +1 pixel buffer before dropping the reference
if (self.lastSnapshotPixelBuffer) {
    CVPixelBufferRelease(self.lastSnapshotPixelBuffer);
    self.lastSnapshotPixelBuffer = NULL;
}
One question is still open: I don't know how to build thumbnail previews over an HLS video's buffered progress with AVPlayer's native APIs.