I've been using MediaCodec recently to encode and decode H.264, so I'm writing this down to avoid falling into the same pits again.
First, the environment: Windows 10 + Android Studio 3.0 + CMake.
Since we're already on Android Studio 3.0, there's no need for an Android.mk file; we can use CMake directly.
First, the official Google Android link, where you can find MediaCodec's architecture, state diagram, and the full API:
https://developer.android.com/reference/android/media/MediaCodec.html
This is the media part of the Android source, which contains MediaCodec under the jni and java directories:
https://android.googlesource.com/platform/frameworks/base/+/master/media
This is the NDK C++ media part of the Android source, which contains MediaCodec under the libmedia, libstagefright, and ndk directories:
https://android.googlesource.com/platform/frameworks/av/+/master/media/
Note that you may need a VPN to view these links from some regions.
Here is a Chinese translation of the MediaCodec API documentation. It is not the latest version, but it still helps with understanding; thanks to the translator for the contribution:
https://www.cnblogs.com/roger-yu/p/5635494.html
一、Architecture overview
MediaCodec has input and output buffers, each managed as a queue.
During its life cycle, MediaCodec is in one of three states:
Stopped: contains the Uninitialized, Configured, and Error sub-states.
Executing: contains the Flushed, Running, and End-of-Stream sub-states.
Released.
All of this is covered in the official MediaCodec documentation, and many people have translated it well, so I won't repeat it here.
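The three states above can be sketched as a tiny model. This is an illustration only: the real MediaCodec tracks the sub-states too (Uninitialized/Configured/Error, Flushed/Running/End-of-Stream) and has more transitions (e.g. flush, reset); the class and method names here are my own.

```java
// Minimal model of the MediaCodec lifecycle described above (illustrative only).
enum CodecState { STOPPED, EXECUTING, RELEASED }

class CodecLifecycle {
    private CodecState state = CodecState.STOPPED; // a new codec starts in Stopped (Uninitialized)

    void configureAndStart() {
        if (state != CodecState.STOPPED)
            throw new IllegalStateException("configure()/start() only valid from Stopped");
        state = CodecState.EXECUTING;              // configure() then start()
    }

    void stop() {
        if (state != CodecState.EXECUTING)
            throw new IllegalStateException("stop() only valid from Executing");
        state = CodecState.STOPPED;                // back to Stopped (Uninitialized)
    }

    void release() {
        state = CodecState.RELEASED;               // terminal state, reachable from anywhere
    }

    CodecState state() { return state; }
}
```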
二苗膝、調(diào)用流程簡介
Android從API 16開始提供java層的MediaCodec視頻硬解碼接口殃恒;
從API 21,也就是Android 5.0開始提供native層的MediaCodec的接口辱揭。
Android調(diào)用MediaCodec簡單流程的是:
SDK : JavaApi---->JNI---->C++
NDK:NdkApi----->C++
這個博客非常詳細的講了這個流程离唐。我就不贅述了。
http://blog.csdn.net/hejjunlin/article/details/53386117
http://blog.csdn.net/hejjunlin/article/details/53573819
http://blog.csdn.net/hejjunlin/article/details/72859142
The last of those articles ends with an open question:
"While reading, I also found NdkMediaCodec and NdkMediaCodec.cpp. What is the difference between these classes and the ones above? How are they related? Why is it designed this way?
frameworks\av\include\ndk\NdkMediaCodec.h"
The answer: these are simply the NDK API wrapper classes, and they ultimately call the MediaCodec inside stagefright.
三听皿、實現(xiàn)方式
Talk is cheap咕别,show me the code
首先要選擇的是:使用SDK還是NDK?
SDK用Java写穴,API16(即Android 4.1)以上的設(shè)備都可用惰拱。
NDK用C++,API21(集Android 5.0)以上的設(shè)備可支持。
據(jù)谷歌2018年1月的Android版本統(tǒng)計數(shù)據(jù):
Android4.1及以后的版本偿短,占99.1%欣孤,我們可以理解為支持所有在市面上的Android設(shè)備。
Android4.1~4.4的版本昔逗,占18.4%降传,Android5.0及以后的版本,占80.7%勾怒。
特別是Android4.4婆排,占有率達12.8%,不能忽視笔链。
簡單計算一下段只,Android4.4及以上版本,占93.5%鉴扫,可以視為支持了絕大多數(shù)Android設(shè)備赞枕。
如果使用NDK的話,可能需要考慮下Android4.4的兼容問題坪创,幸好有人從Android4.4源碼里抽取了libnative_codec19.so并封裝了和NdkMediaCodec一樣接口炕婶,感謝他的貢獻精神。
具體這兩種方式莱预,我都要講講柠掂。
MediaCodec setup
The /etc/media_codecs.xml file on a device lists the supported formats. Generally speaking, H.264 is the only one with both encoding and decoding support.
So we will use H.264 as the example. From the MediaCodec reference page, H.264 uses the MIME type "video/avc".
### createDecoderByType
added in [API level 16](https://developer.android.com/guide/topics/manifest/uses-sdk-element.html#ApiLevels)
Instantiate the preferred decoder supporting input data of the given mime type.
The following is a partial list of defined mime types and their semantics:
* "video/x-vnd.on2.vp8" - VP8 video (i.e. video in .webm)
* "video/x-vnd.on2.vp9" - VP9 video (i.e. video in .webm)
* "video/avc" - H.264/AVC video
* "video/hevc" - H.265/HEVC video
* "video/mp4v-es" - MPEG4 video
* "video/3gpp" - H.263 video
* "audio/3gpp" - AMR narrowband audio
* "audio/amr-wb" - AMR wideband audio
* "audio/mpeg" - MPEG1/2 audio layer III
* "audio/mp4a-latm" - AAC audio (note, this is raw AAC packets, not packaged in LATM!)
* "audio/vorbis" - vorbis audio
* "audio/g711-alaw" - G.711 alaw audio
* "audio/g711-mlaw" - G.711 ulaw audio
Note: It is preferred to use findDecoderForFormat(MediaFormat) and createByCodecName(String) to ensure that the resulting codec can handle a given format.
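The MIME strings in the list above tend to get scattered as string literals; collecting them in one place keeps them consistent. The class and constant names below are my own, not part of the Android API; only the MIME values come from the documentation quoted above.

```java
// MIME strings from the createDecoderByType documentation quoted above.
// Class/constant names are illustrative, not Android API members.
final class MimeTypes {
    static final String VIDEO_AVC  = "video/avc";           // H.264/AVC
    static final String VIDEO_HEVC = "video/hevc";          // H.265/HEVC
    static final String VIDEO_VP8  = "video/x-vnd.on2.vp8"; // VP8 (.webm)
    static final String VIDEO_VP9  = "video/x-vnd.on2.vp9"; // VP9 (.webm)
    static final String AUDIO_AAC  = "audio/mp4a-latm";     // raw AAC packets, not LATM

    // Simple helper: all video MIME types share the "video/" prefix.
    static boolean isVideo(String mime) {
        return mime.startsWith("video/");
    }
}
```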
① Checking MediaCodec format support
Here is a snippet that checks which color formats the "video/avc" codec supports.
private int getSupportColorFormat() {
    int numCodecs = MediaCodecList.getCodecCount();
    MediaCodecInfo codecInfo = null;
    for (int i = 0; i < numCodecs && codecInfo == null; i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) {
            continue;
        }
        String[] types = info.getSupportedTypes();
        boolean found = false;
        for (int j = 0; j < types.length && !found; j++) {
            if (types[j].equals("video/avc")) {
                System.out.println("found");
                found = true;
            }
        }
        if (!found)
            continue;
        codecInfo = info;
    }
    if (codecInfo == null) {
        // Guard against devices with no AVC encoder at all.
        Log.e("AvcEncoder", "No encoder found for video/avc");
        return -1;
    }
    Log.e("AvcEncoder", "Found " + codecInfo.getName() + " supporting " + "video/avc");
    // Find a color profile that the codec supports
    MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType("video/avc");
    Log.e("AvcEncoder",
            "length-" + capabilities.colorFormats.length + "==" + Arrays.toString(capabilities.colorFormats));
    for (int i = 0; i < capabilities.colorFormats.length; i++) {
        switch (capabilities.colorFormats[i]) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible:
                Log.e("AvcEncoder", "supported color format::" + capabilities.colorFormats[i]);
                break;
            default:
                Log.e("AvcEncoder", "other color format " + capabilities.colorFormats[i]);
                break;
        }
    }
    //return capabilities.colorFormats[i];
    return 0;
}
This tells you how many color formats the device supports and their enum values.
MediaCodecInfo and MediaCodecList are not exposed through the NDK C++ API,
so native code cannot call them and has no way to query the supported formats and their enum values.
(If there is a way to call them from native code, or to learn the supported enum values, please let me know.)
② Codec input color formats
Once you know the enum values a device supports, you will find most devices support these:
public static final class CodecCapabilities {
    /** @deprecated */
    @Deprecated
    public static final int COLOR_FormatYUV420Planar = 19;
    /** @deprecated */
    @Deprecated
    public static final int COLOR_FormatYUV420SemiPlanar = 21;
    public static final int COLOR_FormatYUV422Flexible = 2135042184;
    // ...
}

public class ImageFormat {
    public static final int DEPTH16 = 1144402265;
    public static final int DEPTH_POINT_CLOUD = 257;
    public static final int FLEX_RGBA_8888 = 42;
    public static final int FLEX_RGB_888 = 41;
    public static final int JPEG = 256;
    public static final int NV16 = 16;
    public static final int NV21 = 17;
    public static final int PRIVATE = 34;
    public static final int RAW10 = 37;
    public static final int RAW12 = 38;
    public static final int RAW_PRIVATE = 36;
    public static final int RAW_SENSOR = 32;
    public static final int RGB_565 = 4;
    public static final int UNKNOWN = 0;
    public static final int YUV_420_888 = 35;
    public static final int YUV_422_888 = 39;
    public static final int YUV_444_888 = 40;
    public static final int YUY2 = 20;
    public static final int YV12 = 842094169;
    // ...
}
Where:
CodecCapabilities.COLOR_FormatYUV420Planar corresponds to ImageFormat.YV12, and
CodecCapabilities.COLOR_FormatYUV420SemiPlanar corresponds to ImageFormat.NV21.
(I have a separate article on YUV formats that describes each layout in detail.)
But on some Android devices the U and V order is reversed:
COLOR_FormatYUV420Planar is I420, and
COLOR_FormatYUV420SemiPlanar is NV12.
(I don't yet know why; corrections welcome.)
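Since the Planar/SemiPlanar byte order differs between devices, a pair of pure-Java converters is handy for normalizing frames before feeding them to the codec. This is a minimal sketch of my own (not Android API code); width and height are assumed even.

```java
// YUV420 helpers: swap the chroma order without touching the luma plane.
final class Yuv420 {
    // NV12 (YYYY.. UVUV..) <-> NV21 (YYYY.. VUVU..): swap each interleaved UV byte pair.
    static byte[] swapSemiPlanarUv(byte[] src, int width, int height) {
        byte[] dst = src.clone();
        int ySize = width * height;
        for (int i = ySize; i + 1 < dst.length; i += 2) {
            dst[i] = src[i + 1];
            dst[i + 1] = src[i];
        }
        return dst;
    }

    // I420 (YYYY.. UU.. VV..) <-> YV12 (YYYY.. VV.. UU..): swap the two chroma planes.
    static byte[] swapPlanarUv(byte[] src, int width, int height) {
        byte[] dst = src.clone();
        int ySize = width * height;
        int cSize = ySize / 4;                                   // each chroma plane is 1/4 of luma
        System.arraycopy(src, ySize + cSize, dst, ySize, cSize); // second chroma plane -> first
        System.arraycopy(src, ySize, dst, ySize + cSize, cSize); // first chroma plane -> second
        return dst;
    }
}
```

Both swaps are their own inverse, so the same call converts in either direction.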
MediaCodec SDK usage
① Create the Encoder and configure its parameters.
public AvcEncoder(int width, int height, int framerate, int bitrate) {
    Log.d("Codec", "AvcEncoder IN");
    m_width = width;
    m_height = height;
    yuv420 = new byte[width * height * 3 / 2];
    try {
        mediaCodec = MediaCodec.createEncoderByType("video/avc");
    } catch (IOException e) {
        e.printStackTrace();
    }
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    // Note: KEY_I_FRAME_INTERVAL is in seconds, not frames, so 30 means one key frame every 30 s.
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 30);
    mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
    Log.d("Codec", "AvcEncoder OUT");
}
② Create the Decoder and configure its parameters.
public AvcDecoder(int width, int height, SurfaceHolder surfaceHolder) {
    Log.d("Codec", "AvcDecoder IN");
    try {
        mediaCodec = MediaCodec.createDecoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
        nv12 = new byte[width * height * 3 / 2];
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        // Pass a Surface here instead of null to render directly rather than reading raw buffers:
        //mediaCodec.configure(mediaFormat, surfaceHolder.getSurface(), null, 0);
        mediaCodec.configure(mediaFormat, null, null, 0);
        mediaCodec.start();
        Log.d("Codec", "AvcDecoder OUT");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
③ Get an Encoder InputBuffer and feed it data; then get an Encoder OutputBuffer and read the output.
public int offerEncoder(byte[] input, byte[] output)
{
    Log.d("Codec", "Encoder in");
    int pos = 0;
    yuv420 = input;
    try {
        ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
        ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
        int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
        Log.d("Codec", "inputBufferIndex = " + inputBufferIndex);
        if (inputBufferIndex >= 0)
        {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(yuv420);
            mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
        }
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        Log.d("Codec", "outputBufferIndex = " + outputBufferIndex);
        while (outputBufferIndex >= 0)
        {
            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outputBuffer.get(outData);
            if (m_info != null)
            {
                // Append at pos, not 0, in case several buffers arrive in one call.
                System.arraycopy(outData, 0, output, pos, outData.length);
                pos += outData.length;
                Log.d("Encoder", "m_info: " + pos);
            }
            else
            {
                // The first output buffer must be the codec config (SPS/PPS).
                // Test flags bitwise, since they can be combined.
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                {
                    m_info = new byte[outData.length];
                    System.arraycopy(outData, 0, m_info, 0, outData.length);
                    System.arraycopy(outData, 0, output, pos, outData.length);
                    pos += outData.length;
                }
                else
                {
                    Log.d("Encoder", "error: first output buffer is not codec config");
                    return -1;
                }
                Log.d("Encoder", "m_info: " + Arrays.toString(m_info));
            }
            mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        }
        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0) // key frame
        {
            Log.d("Encoder", "Key frame");
            // Prepend the cached SPS/PPS to every key frame, using yuv420 as scratch space.
            System.arraycopy(output, 0, yuv420, 0, pos);
            System.arraycopy(m_info, 0, output, 0, m_info.length);
            System.arraycopy(yuv420, 0, output, m_info.length, pos);
            pos += m_info.length;
        }
    } catch (Throwable t) {
        t.printStackTrace();
    }
    return pos;
}
④ Get a Decoder InputBuffer and feed it data; then get a Decoder OutputBuffer and read the output.
public void onFrame(byte[] buf, int length) {
    ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    Log.d("Decoder", "inputBufferIndex: " + inputBufferIndex);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(buf, 0, length);
        mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, mCount * 1000000, 0);
        mCount++;
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        outputBuffers[outputBufferIndex].get(nv12, 0, nv12.length);
        CallbackAdapt.UpdateH264Decode(nv12, outputBufferIndex);
        mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}
MediaCodec NDK usage
① Create the Encoder and Decoder.
void FrameListener::InitCodec()
{
#ifndef WIN32
    const char* mime = "video/avc";
    // Encoder
    m_encoder = AMediaCodec_createEncoderByType(mime);
    if (m_encoder == NULL)
    {
        LOGE("MediaCodecH264: could not create Encoder");
    }
    AMediaFormat *m_format = AMediaFormat_new();
    AMediaFormat_setString(m_format, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_WIDTH, m_lastWidth);
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_HEIGHT, m_lastHeight);
    int bitrate = 500000;
    int framerate = 30;
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_BIT_RATE, bitrate);
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_FRAME_RATE, framerate);
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_I_FRAME_INTERVAL, framerate); // note: this key is in seconds
    AMediaFormat_setInt32(m_format, AMEDIAFORMAT_KEY_COLOR_FORMAT, 21); // COLOR_FormatYUV420SemiPlanar
    media_status_t status = AMediaCodec_configure(m_encoder, m_format, NULL, NULL, AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
    if (status != 0)
    {
        LOGE("AMediaCodec_configure() failed with error %i for format %u", (int)status, 21);
    }
    else
    {
        if ((status = AMediaCodec_start(m_encoder)) != AMEDIA_OK)
        {
            LOGE("AMediaCodec_start: Could not start encoder.");
        }
        else
        {
            LOGD("AMediaCodec_start: encoder successfully started");
        }
    }
    AMediaFormat_delete(m_format);
    // Decoder
    m_decoder = AMediaCodec_createDecoderByType(mime);
    if (m_decoder == NULL)
    {
        LOGE("MediaCodecH264: could not create Decoder");
    }
    else
    {
        AMediaFormat *m_format2 = AMediaFormat_new();
        AMediaFormat_setString(m_format2, AMEDIAFORMAT_KEY_MIME, "video/avc");
        AMediaFormat_setInt32(m_format2, AMEDIAFORMAT_KEY_WIDTH, m_lastWidth);
        AMediaFormat_setInt32(m_format2, AMEDIAFORMAT_KEY_HEIGHT, m_lastHeight);
        AMediaFormat_setInt32(m_format2, AMEDIAFORMAT_KEY_COLOR_FORMAT, 21);
        // Optionally feed SPS/PPS out-of-band via csd-0/csd-1 instead of in the bitstream:
        //AMediaFormat_setBuffer(m_format2, "csd-0", sps, sizeof(sps));
        //AMediaFormat_setBuffer(m_format2, "csd-1", pps, sizeof(pps));
        if ((status = AMediaCodec_configure(m_decoder, m_format2, NULL, NULL, 0)) != AMEDIA_OK) {
            LOGD("MediaCodecH264Dec: configuration failure: %i", (int)status);
        }
        if ((status = AMediaCodec_start(m_decoder)) != AMEDIA_OK) {
            LOGD("MediaCodecH264Dec: starting failure: %i", (int)status);
        }
        AMediaFormat_delete(m_format2);
    }
#endif // !WIN32
}
② Get an Encoder InputBuffer and feed it data; then get an Encoder OutputBuffer and read the output.
void FrameListener::Encode()
{
#ifndef WIN32
    ssize_t ibufidx, obufidx;
    AMediaCodecBufferInfo info;
    size_t bufsize;
    /* First queue an input image */
    uint8_t *buf;
    ibufidx = AMediaCodec_dequeueInputBuffer(m_encoder, TIMEOUT_US);
    if (ibufidx >= 0)
    {
        buf = AMediaCodec_getInputBuffer(m_encoder, ibufidx, &bufsize);
        if (buf)
        {
            memcpy(buf, m_inputNV12, m_YUVSize);
            auto curTime = timeGetTime();
            AMediaCodec_queueInputBuffer(m_encoder, ibufidx, 0, bufsize, curTime, 0);
        }
        else
        {
            LOGD("MediaCodecH264Enc: obtained InputBuffer, but no address.");
        }
    }
    else if (ibufidx == AMEDIA_ERROR_UNKNOWN)
    {
        LOGD("MediaCodecH264Enc: AMediaCodec_dequeueInputBuffer() had an exception");
    }
    /* Second, dequeue possibly pending encoded frames */
    while ((obufidx = AMediaCodec_dequeueOutputBuffer(m_encoder, &info, TIMEOUT_US)) >= 0)
    {
        auto oBuf = AMediaCodec_getOutputBuffer(m_encoder, obufidx, &bufsize);
        if (oBuf)
        {
            if (m_info == NULL)
            {
                // The first output buffer must be the codec config (SPS/PPS).
                // Test the flag bitwise, and before allocating, so m_info stays NULL on failure.
                if (info.flags & 2) // AMEDIACODEC_BUFFER_FLAG_CODEC_CONFIG
                {
                    m_infoSize = info.size;
                    m_info = new byte[m_infoSize];
                    memcpy(m_info, oBuf, m_infoSize);
                    LOGD("obBuf %d %d flag:%d offset:%d size:%d", m_infoSize, bufsize, info.flags, info.offset, info.size);
                    char str[256];
                    int len = 0;
                    for (int i = 0; i < m_infoSize && len < (int)sizeof(str) - 8; ++i)
                        len += sprintf(str + len, " %d", m_info[i]); // append in place; never sprintf a buffer into itself
                    LOGD("obBuf %s", str);
                    AMediaCodec_releaseOutputBuffer(m_encoder, obufidx, false); // release before continue, or the buffer leaks
                    continue;
                }
                else
                {
                    LOGD("MediaCodecH264Enc: first output buffer is not codec config");
                    AMediaCodec_releaseOutputBuffer(m_encoder, obufidx, false);
                    return;
                }
            }
            LOGD("m_infoSize %d %d flag:%d offset:%d size:%d", m_infoSize, bufsize, info.flags, info.offset, info.size);
            H264Data *data = new H264Data();
            m_dataList.push_back(data);
            data->flag = info.flags;
            if (info.flags & 1) // key frame: prepend the cached SPS/PPS
            {
                data->dataPtr = new byte[bufsize + m_infoSize];
                memcpy(data->dataPtr, m_info, m_infoSize);
                memcpy(data->dataPtr + m_infoSize, oBuf, bufsize);
                data->size = bufsize + m_infoSize;
            }
            else
            {
                data->dataPtr = new byte[bufsize];
                memcpy(data->dataPtr, oBuf, bufsize);
                data->size = bufsize;
            }
            LOGD("Out finish");
        }
        AMediaCodec_releaseOutputBuffer(m_encoder, obufidx, false);
    }
    if (obufidx == AMEDIA_ERROR_UNKNOWN)
    {
        LOGD("MediaCodecH264Enc: AMediaCodec_dequeueOutputBuffer() had an exception, MediaCodec is lost");
        AMediaCodec_stop(m_encoder);
        AMediaCodec_delete(m_encoder);
    }
#endif // !WIN32
}
③ Get a Decoder InputBuffer and feed it data; then get a Decoder OutputBuffer and read the output.
void FrameListener::Decode()
{
#ifndef WIN32
    if (m_decoder == NULL)
    {
        return;
    }
    ssize_t oBufidx = -1;
    size_t bufsize = 0;
    AMediaCodecBufferInfo info;
    uint8_t *buf = NULL;
    ssize_t iBufidx = -1;
    /* First, put our H264 bitstream into the decoder */
    while (!m_dataList.empty())
    {
        iBufidx = AMediaCodec_dequeueInputBuffer(m_decoder, TIMEOUT_US);
        LOGD("decoder iBufidx %d %d", iBufidx, m_dataList.size());
        if (iBufidx >= 0)
        {
            buf = AMediaCodec_getInputBuffer(m_decoder, iBufidx, &bufsize);
            size_t datasize = 0; // payload size; distinct from bufsize (the buffer's capacity)
            auto iter = m_dataList.begin();
            if (buf)
            {
                datasize = (*iter)->size;
                memcpy(buf, (*iter)->dataPtr, datasize);
            }
            AMediaCodec_queueInputBuffer(m_decoder, iBufidx, 0, datasize, timeGetTime(), 0);
            SAFE_DELETE_ARRAY((*iter)->dataPtr);
            delete *iter; // also free the H264Data node itself, not just its payload
            m_dataList.erase(iter);
        }
        else if (iBufidx == -1)
        {
            /*
             * Problematic case: we can't block waiting for the decoder to be ready,
             * otherwise we'd freeze the entire video thread. The only option is to
             * drop the frame and retry later -- with an I-frame, of course.
             */
            break;
        }
    }
    /* Secondly, try to get decoded frames from the decoder; this runs every tick */
    oBufidx = AMediaCodec_dequeueOutputBuffer(m_decoder, &info, TIMEOUT_US);
    LOGD("Decoder oBufidx %d", oBufidx);
    while (oBufidx >= 0)
    {
        AMediaFormat *format;
        int color = 0;
        uint8_t *oBuf = AMediaCodec_getOutputBuffer(m_decoder, oBufidx, &bufsize);
        if (oBuf == NULL)
        {
            LOGD("MediaCodecH264Dec: AMediaCodec_getOutputBuffer() returned NULL");
        }
        else
        {
            int width = 0, height = 0;
            format = AMediaCodec_getOutputFormat(m_decoder);
            if (format != NULL)
            {
                AMediaFormat_getInt32(format, "width", &width);
                AMediaFormat_getInt32(format, "height", &height);
                AMediaFormat_getInt32(format, "color-format", &color);
                AMediaFormat_delete(format);
            }
            if (width != 0 && height != 0)
            {
                if (color == 21) // COLOR_FormatYUV420SemiPlanar, treated as NV12 here
                {
                    byte* outNV12 = new byte[m_YUVSize];
                    memcpy(outNV12, oBuf, m_YUVSize);
                    m_outputNV12List.push_back(outNV12);
                }
                else
                {
                    LOGD("unknown format");
                }
            }
            else
            {
                LOGD("MediaCodecH264Dec: width and height are not known !");
            }
        }
        AMediaCodec_releaseOutputBuffer(m_decoder, oBufidx, false);
        oBufidx = AMediaCodec_dequeueOutputBuffer(m_decoder, &info, TIMEOUT_US);
        LOGD("Decoder oBufidx %d", oBufidx);
    }
    if (oBufidx == AMEDIA_ERROR_UNKNOWN)
    {
        LOGD("MediaCodecH264Dec: AMediaCodec_dequeueOutputBuffer() had an exception");
    }
#endif // !WIN32
}
MediaCodec NDK CMake
The Android NDK samples include a native-codec demo; its CMakeLists.txt is a good reference.
cmake_minimum_required(VERSION 3.4.1)

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11 -Wall -UNDEBUG")

add_library(native-codec-jni SHARED
            looper.cpp
            native-codec-jni.cpp)

# Include libraries needed for native-codec-jni lib
target_link_libraries(native-codec-jni
                      android
                      log
                      mediandk
                      OpenMAXAL)
You must link mediandk and OpenMAXAL, or MediaCodec will not work from native code.
Common pitfalls (FAQ)
(This assumes the reader knows about H264 frame headers and NAL units.)
1. The first H264 output buffer holds the SPS and PPS. Save it, then prepend it to every H264 key frame.
mediaCodec.dequeueOutputBuffer fills in a MediaCodec.BufferInfo, which matches the AMediaCodecBufferInfo filled in by AMediaCodec_dequeueOutputBuffer:
struct AMediaCodecBufferInfo {
    int32_t offset;
    int32_t size;
    int64_t presentationTimeUs;
    uint32_t flags;
};
See the Android reference:
https://developer.android.com/reference/android/media/MediaCodec.BufferInfo.html
One of the fields is flags, which takes these constant values:
flags = 4: end of stream (BUFFER_FLAG_END_OF_STREAM).
flags = 2: the config frame holding SPS/PPS (BUFFER_FLAG_CODEC_CONFIG).
flags = 1: key frame (BUFFER_FLAG_KEY_FRAME).
flags = 0: ordinary frame.
See the constants section of https://developer.android.com/reference/android/media/MediaCodec.html.
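The SPS/PPS rule and the flag values above combine into a small piece of logic: cache the config buffer, prepend it to key frames, pass other frames through. This sketch is my own (only the flag values come from MediaCodec.BUFFER_FLAG_*, inlined here as local constants); flags are tested bitwise because they can be combined.

```java
// Caches the SPS/PPS config buffer and prepends it to every key frame.
final class SpsPpsPrepender {
    static final int FLAG_KEY_FRAME     = 1; // MediaCodec.BUFFER_FLAG_KEY_FRAME
    static final int FLAG_CODEC_CONFIG  = 2; // MediaCodec.BUFFER_FLAG_CODEC_CONFIG
    static final int FLAG_END_OF_STREAM = 4; // MediaCodec.BUFFER_FLAG_END_OF_STREAM

    private byte[] config; // SPS+PPS from the first (config) output buffer

    /** Returns the bytes to emit for one output buffer, or null for config/EOS buffers. */
    byte[] onOutputBuffer(byte[] data, int flags) {
        if ((flags & FLAG_CODEC_CONFIG) != 0) {
            config = data.clone();              // keep SPS/PPS; don't emit it on its own
            return null;
        }
        if ((flags & FLAG_END_OF_STREAM) != 0) return null;
        if ((flags & FLAG_KEY_FRAME) != 0 && config != null) {
            byte[] out = new byte[config.length + data.length];
            System.arraycopy(config, 0, out, 0, config.length);
            System.arraycopy(data, 0, out, config.length, data.length);
            return out;                         // key frame: SPS/PPS prepended
        }
        return data;                            // ordinary frame: pass through
    }
}
```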
2硼补、dequeueInputBuffer驮肉,dequeueOutputBuffer返回值非0
dequeueInputBuffer返回-1,說明沒有拿到可用緩沖區(qū)已骇,一般來說下一幀再來請求輸入緩沖區(qū)离钝。
dequeueOutputBuffer返回-1,說明沒有可用數(shù)據(jù)褪储。
然而dequeueOutputBuffer可能連續(xù)輸入4卵渴、5幀,都沒有輸出鲤竹,然后一股腦在1幀中浪读,輸出4、5個可用輸出緩沖區(qū)辛藻。
其他異常情況包括INFO_OUTPUT_FORMAT_CHANGED
碘橘,INFO_OUTPUT_BUFFERS_CHANGED
等
見https://developer.android.com/reference/android/media/MediaCodec.html里面常量部分。
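The negative return codes can be handled in one place. A sketch of my own, with the constant values inlined (they mirror MediaCodec.INFO_TRY_AGAIN_LATER, INFO_OUTPUT_FORMAT_CHANGED, and INFO_OUTPUT_BUFFERS_CHANGED, which are -1, -2, and -3 in the SDK):

```java
// Classifies MediaCodec.dequeueOutputBuffer() return values.
// Any index >= 0 is a usable buffer; the negatives are status codes.
final class DequeueResult {
    static final int INFO_TRY_AGAIN_LATER        = -1; // no output yet: retry next tick
    static final int INFO_OUTPUT_FORMAT_CHANGED  = -2; // re-read getOutputFormat()
    static final int INFO_OUTPUT_BUFFERS_CHANGED = -3; // re-fetch getOutputBuffers() (pre-API 21)

    enum Kind { BUFFER, TRY_AGAIN, FORMAT_CHANGED, BUFFERS_CHANGED, ERROR }

    static Kind classify(int index) {
        if (index >= 0) return Kind.BUFFER;
        switch (index) {
            case INFO_TRY_AGAIN_LATER:        return Kind.TRY_AGAIN;
            case INFO_OUTPUT_FORMAT_CHANGED:  return Kind.FORMAT_CHANGED;
            case INFO_OUTPUT_BUFFERS_CHANGED: return Kind.BUFFERS_CHANGED;
            default:                          return Kind.ERROR;
        }
    }
}
```

A drain loop can then keep looping on BUFFER / FORMAT_CHANGED / BUFFERS_CHANGED and break only on TRY_AGAIN or ERROR.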
3吱肌、MediaCodec格式不一致
Android對COLOR_FormatYUV420SemiPlanar和COLOR_FormatYUV420Planar痘拆,這兩種常見YUV格式并沒有做進一步的規(guī)定,
那么問題就來了氮墨,420P和420SP分為I420,YV12,NV12,NV21纺蛆,
I420: YYYYYYYY UU VV =>YUV420P
YV12: YYYYYYYY VV UU =>YUV420P
NV12: YYYYYYYY UVUV =>YUV420SP
NV21: YYYYYYYY VUVU =>YUV420SP
有些設(shè)備可能是I420+NV12,這兩種U都是在前面的规揪,
有些設(shè)備可能是YV12+NV12桥氏,這兩種V都是在前面的。
其他的組合猛铅,我也不確定有沒有字支。
反正這個坑,要注意的奕坟。
Demo
Github:
https://github.com/Denislyl/AndroidMediaCodec
It contains the encoder and decoder classes.