This article is based on this expert's original post; I made some modifications on top of it. Following the original author's approach, I found that playback stutters when using software decoding. The stutter may well be caused by my own settings or something else on my side, though, and is not necessarily a problem in the original author's code.
IJKPlayer is an excellent open-source video player framework covering both iOS and Android. As good as it is, it has some limitations you have to work around yourself, for example:
- By default, frames are rendered at the video's encoded frame rate. Most videos are 25 fps, so they are rendered at 25 fps. That is fine for ordinary 2D video, but when rendering a panoramic (360°) video, moving the viewpoint as the phone rotates looks choppy, so the render rate needs to be raised.
- Building an SDK for use elsewhere, e.g. feeding video frames to Unity3D, or rendering the picture anywhere else you like.
Neither of these can be solved with the stock player, so some changes of our own are needed.
一脂崔、添加接口
1滤淳、在IJKMediaPlayBack.h添加三個方法
- (CVPixelBufferRef)framePixelbuffer;
- (void)framePixelbufferLock;
- (void)framePixelbufferUnlock;
2. In the FFPlayer struct in ff_ffplay_def.h, add the following:
// include the header
#include <CoreVideo/CoreVideo.h>
CVPixelBufferRef szt_pixelbuffer;
pthread_mutex_t szt_pixelbuffer_mutex;
3. In ijkplayer.h, add:
// include the header
#include <CoreVideo/CoreVideo.h>
CVPixelBufferRef ijkmp_get_pixelbuffer(IjkMediaPlayer *mp);
int ijkmp_pixelbuffer_mutex_init(IjkMediaPlayer *mp);
int ijkmp_pixelbuffer_mutex_lock(IjkMediaPlayer *mp);
int ijkmp_pixelbuffer_mutex_unlock(IjkMediaPlayer *mp);
4. In ijkplayer.c, add:
CVPixelBufferRef ijkmp_get_pixelbuffer(IjkMediaPlayer *mp)
{
return mp->ffplayer->szt_pixelbuffer;
}
int ijkmp_pixelbuffer_mutex_init(IjkMediaPlayer *mp)
{
int ret = ffp_pixelbuffer_mutex_init(mp->ffplayer);
return ret;
}
int ijkmp_pixelbuffer_mutex_lock(IjkMediaPlayer *mp)
{
int ret = ffp_pixelbuffer_lock(mp->ffplayer);
return ret;
}
int ijkmp_pixelbuffer_mutex_unlock(IjkMediaPlayer *mp)
{
int ret = ffp_pixelbuffer_unlock(mp->ffplayer);
return ret;
}
5. In ff_ffplay.h, add:
int ffp_pixelbuffer_mutex_init(FFPlayer *ffp);
int ffp_pixelbuffer_lock(FFPlayer *ffp);
int ffp_pixelbuffer_unlock(FFPlayer *ffp);
6. In ff_ffplay.c, add:
int ffp_pixelbuffer_mutex_init(FFPlayer *ffp)
{
int ret = pthread_mutex_init(&ffp->szt_pixelbuffer_mutex, NULL);
return ret;
}
int ffp_pixelbuffer_lock(FFPlayer *ffp)
{
int ret = pthread_mutex_lock(&ffp->szt_pixelbuffer_mutex);
return ret;
}
int ffp_pixelbuffer_unlock(FFPlayer *ffp)
{
int ret = pthread_mutex_unlock(&ffp->szt_pixelbuffer_mutex);
return ret;
}
7. In IJKFFMoviePlayerController.h, add:
- (CVPixelBufferRef)framePixelbuffer;
- (void)framePixelbufferLock;
- (void)framePixelbufferUnlock;
8. In IJKFFMoviePlayerController.m, add:
- (CVPixelBufferRef)framePixelbuffer
{
if (_mediaPlayer)
{
return ijkmp_get_pixelbuffer(_mediaPlayer);
}
return NULL;
}
- (void)framePixelbufferLock
{
if (_mediaPlayer)
{
ijkmp_pixelbuffer_mutex_lock(_mediaPlayer);
}
}
- (void)framePixelbufferUnlock
{
if (_mediaPlayer)
{
ijkmp_pixelbuffer_mutex_unlock(_mediaPlayer);
}
}
II. How to use it:
[self.ijkplayer framePixelbufferLock];
CVPixelBufferRef pixelBuffer = [self.ijkplayer framePixelbuffer];
[self.ijkplayer framePixelbufferUnlock];
III. Software-decode setup
1. Add the libyuv library
The original download link for libyuv is gone, so I uploaded the copy I had kept.
1-1. Add the yuvlib library to the project.
1-2. Add the header search path: IJKMediaPlayer.xcodeproj → IJKMediaFramework (TARGETS) → Build Settings → Header Search Paths:
add "$(SRCROOT)/yuvlib/include"
1-3. In ff_ffplay.c, include the header:
#include "libyuv.h"
2. In ff_ffplay.c, find the following function:
static int decoder_decode_frame(FFPlayer *ffp, Decoder *d, AVFrame *frame, AVSubtitle *sub)
and, at the point where a decoded video frame is available, add:
ffp_pixelbuffer_lock(ffp);
CVPixelBufferRef cvImage = NULL;
if (ffp->szt_pixelbuffer)
{
cvImage = ffp->szt_pixelbuffer;
}
int ret = createCVPixelBuffer(ffp, ffp->is->video_st->codec, frame, &cvImage);
if (!ret)
{
ffp->szt_pixelbuffer = cvImage;
}
ffp_pixelbuffer_unlock(ffp);
Then add the following code (both functions must be placed before decoder_decode_frame):
static int copyAVFrameToPixelBuffer(FFPlayer *ffp, AVCodecContext* avctx, const AVFrame* frame, CVPixelBufferRef cv_img, const size_t* plane_strides, const size_t* plane_rows)
{
    int status = CVPixelBufferLockBaseAddress(cv_img, 0);
    if (status) {
        av_log(avctx, AV_LOG_ERROR,
               "Error: Could not lock base address of CVPixelBuffer: %d.\n", status);
        return AVERROR_EXTERNAL;
    }
    // Source I420 planes and strides from the decoded frame
    uint8_t* src_y = frame->data[0];
    uint8_t* src_u = frame->data[1];
    uint8_t* src_v = frame->data[2];
    int src_stride_y = frame->linesize[0];
    int src_stride_u = frame->linesize[1];
    int src_stride_v = frame->linesize[2];
    // Destination BGRA buffer; bytes-per-row is the destination stride
    uint8_t* dst_addr = CVPixelBufferGetBaseAddress(cv_img);
    int dst_stride = (int)CVPixelBufferGetBytesPerRow(cv_img);
    // libyuv: convert I420 to 32-bit BGRA (libyuv's "ARGB" is BGRA in memory)
    I420ToARGB(src_y, src_stride_y,
               src_u, src_stride_u,
               src_v, src_stride_v,
               dst_addr, dst_stride,
               frame->width, frame->height);
    status = CVPixelBufferUnlockBaseAddress(cv_img, 0);
    if (status) {
        av_log(avctx, AV_LOG_ERROR, "Error: Could not unlock CVPixelBuffer base address: %d.\n", status);
        return AVERROR_EXTERNAL;
    }
    return 0;
}
int createCVPixelBuffer(FFPlayer *ffp, AVCodecContext* avctx, AVFrame* frame, CVPixelBufferRef* cvImage)
{
    // Geometry of the three I420 planes (Y full size, U/V half size)
    size_t widths [AV_NUM_DATA_POINTERS];
    size_t heights[AV_NUM_DATA_POINTERS];
    size_t strides[AV_NUM_DATA_POINTERS];
    int status;
    memset(widths,  0, sizeof(widths));
    memset(heights, 0, sizeof(heights));
    memset(strides, 0, sizeof(strides));
    widths[0]  = avctx->width;
    heights[0] = avctx->height;
    strides[0] = frame ? frame->linesize[0] : avctx->width;
    widths[1]  = (avctx->width  + 1) / 2;
    heights[1] = (avctx->height + 1) / 2;
    strides[1] = frame ? frame->linesize[1] : (avctx->width + 1) / 2;
    widths[2]  = (avctx->width  + 1) / 2;
    heights[2] = (avctx->height + 1) / 2;
    strides[2] = frame ? frame->linesize[2] : (avctx->width + 1) / 2;
    // Create the BGRA buffer once, then reuse it for subsequent frames
    if (!ffp->szt_pixelbuffer)
    {
        status = CVPixelBufferCreate(
            kCFAllocatorDefault,
            frame->width,
            frame->height,
            kCVPixelFormatType_32BGRA,
            NULL,
            cvImage);
        if (status)
        {
            return AVERROR_EXTERNAL;
        }
    }
    status = copyAVFrameToPixelBuffer(ffp, avctx, frame, *cvImage, strides, heights);
    if (status)
    {
        CFRelease(*cvImage);
        *cvImage = NULL;
        return status;
    }
    return 0;
}
IV. Hardware-decode setup
1. Open IJKVideoToolBoxAsync.m and IJKVideoToolBoxSync.m
1-1. Find the function
void VTDecoderCallback(void *decompressionOutputRefCon,
void *sourceFrameRefCon,
OSStatus status,
VTDecodeInfoFlags infoFlags,
CVImageBufferRef imageBuffer,
CMTime presentationTimeStamp,
CMTime presentationDuration)
{
Inside it, find:
OSType format_type = CVPixelBufferGetPixelFormatType(imageBuffer);
if (format_type != kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) {
ALOGI("format_type error \n");
goto failed;
}
and change kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange to kCVPixelFormatType_32BGRA.
1-2. Find the function
static VTDecompressionSessionRef vtbsession_create(Ijk_VideoToolBox_Opaque* context)
In it, find:
CFDictionarySetSInt32(destinationPixelBufferAttributes,
kCVPixelBufferPixelFormatTypeKey, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange);
and change kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange to kCVPixelFormatType_32BGRA.
2. In ff_ffplay.c, find the function
static int queue_picture(FFPlayer *ffp, AVFrame *src_frame, double pts, double duration, int64_t pos, int serial)
// this header is also needed
#include "ijksdl_vout_overlay_videotoolbox.h"
Then add this code (you can search for SDL_VoutUnlockYUVOverlay(vp->bmp); to locate the spot):
if (ffp->videotoolbox) {
// TODO edit
ffp_pixelbuffer_lock(ffp);
ffp->szt_pixelbuffer = SDL_VoutOverlayVideoToolBox_GetCVPixelBufferRef(vp->bmp); // picture->opaque;
ffp_pixelbuffer_unlock(ffp);
if (!ffp->szt_pixelbuffer) {
ALOGE("nil pixelBuffer in overlay\n");
}
}
3. Disable the SDLView texture-rendering path
In ff_ffplay.c, find the function
static void video_image_display2(FFPlayer *ffp)
and comment out the following line:
// SDL_VoutDisplayYUVOverlay(ffp->vout, vp->bmp);
4. Add an option
4-1. In ff_ffplay_def.h, add to the FFPlayer struct:
int vtb_frame_width_default;
Then find ffp->vtb_max_frame_width = 0; and after it add:
ffp->vtb_frame_width_default = 0;
4-2. In ff_ffplay_options.h, find "videotoolbox-max-frame-width" and after it add:
{ "video-max-frame-width-default", "max width of output frame default",
OPTION_OFFSET(vtb_frame_width_default), OPTION_INT(0, 0, INT_MAX) },
4-3. In IJKFFOptions.m, inside + (IJKFFOptions *)optionsByDefault, add:
[options setPlayerOptionIntValue:1 forKey:@"video-max-frame-width-default"];
5. Back in IJKVideoToolBoxAsync.m and IJKVideoToolBoxSync.m, find
static VTDecompressionSessionRef vtbsession_create(Ijk_VideoToolBox_Opaque* context)
and replace the code
if (ffp->vtb_max_frame_width > 0 && width > ffp->vtb_max_frame_width) {
double w_scaler = (float)ffp->vtb_max_frame_width / width;
width = ffp->vtb_max_frame_width;
height = height * w_scaler;
}
with
if (ffp->vtb_frame_width_default > 0 && ffp->vtb_max_frame_width > 0 && width > ffp->vtb_max_frame_width) {
double w_scaler = (float)ffp->vtb_max_frame_width / width;
width = ffp->vtb_max_frame_width;
height = height * w_scaler;
}
V. Optimizations
- Using the approach above, hardware-decoded playback worked fine for me, but software decoding occasionally stuttered, so I dropped the software-decode scheme. Instead, in the hardware-decode hook I removed the `if (ffp->videotoolbox)` check so that both software and hardware decoding reach that code, and I expose the data as `void *` (cast to `SDL_VoutOverlay` at the point of use). Since all I needed was a higher refresh rate, I run a dedicated thread that refreshes the data on its own schedule instead of being tied to the stream's default frame rate (typically 25 fps). I also skip the conversion and assign `ffp->szt_pixelbuffer = vp->bmp;` directly, because `SDL_VoutOverlay` is exactly the data I need, which keeps things simple.
- If you need a `CVPixelBufferRef` instead, you can still grab `szt_pixelbuffer` for both software and hardware decoding inside `static int queue_picture(FFPlayer *ffp, AVFrame *src_frame, double pts, double duration, int64_t pos, int serial)`, then branch on the decoder: for hardware decoding use `SDL_VoutOverlayVideoToolBox_GetCVPixelBufferRef(vp->bmp)`; for software decoding use the original author's conversion, i.e. `copyAVFrameToPixelBuffer(...)` and `createCVPixelBuffer(...)`. Done this way, the software-decode stutter no longer appears.
- If the type you ultimately consume is also `SDL_VoutOverlay`, the pixel format in the hardware-decode path needs no change, i.e. `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` does not have to be replaced with `kCVPixelFormatType_32BGRA`.