There are already plenty of articles on this topic and no time to write a full tutorial, so this post goes straight to the core source code and the points that need attention.
1: The MediaCodec core encoder class. When continuously pushing data into MediaCodec, be sure to use a BytePool (byte-array pool): once a byte[] has been handed to MediaCodec and encoded, it can be recycled and reused, which avoids memory churn. (A minimal BytePool sketch is given after the encoder class below.)
/**
 * Created by you on 2018-05-10.
 * MediaCodec core encoder.
 */
public final class MediaEncoder implements Runnable {
    // ...pseudocode: fields such as mediaCodec, bufferQueue, isCoding, callback and timeoutUs are omitted
    // byte-array pool
    private final BytePool bytePool;

    @Override
    public void run() {
        bufferInfo = new MediaCodec.BufferInfo();
        callback.onInitStart();
        while (isCoding.get()) {
            try {
                byte[] buffer = bufferQueue.take();
                if (buffer == null || buffer.length == 0) {
                    break; // an empty byte[] is used to end the loop and unblock take()
                }
                codecDatas(buffer);
                // return the buffer to the pool for reuse
                bytePool.put(buffer);
            } catch (InterruptedException e) {
                if (!isCoding.get()) {
                    break;
                }
            }
        }
        release();
    }

    /**
     * Encode one buffer of raw data.
     * @param buffer raw input data (e.g. YUV or PCM)
     */
    private void codecDatas(byte[] buffer) {
        // Ask for an input buffer: with -1 this blocks until a buffer is available, with 0 it returns immediately.
        int index = mediaCodec.dequeueInputBuffer(-1);
        if (index >= 0) {
            // fill the input buffer with the raw data
            ByteBuffer inputBuffer = mediaCodec.getInputBuffer(index);
            inputBuffer.clear();
            inputBuffer.put(buffer, 0, buffer.length);
            callback.onEncodeInputBuffer(mediaCodec, buffer, index);
        }
        int encodeStatus;
        while (true) {
            // Possible non-index return values: INFO_TRY_AGAIN_LATER, INFO_OUTPUT_FORMAT_CHANGED, INFO_OUTPUT_BUFFERS_CHANGED
            encodeStatus = mediaCodec.dequeueOutputBuffer(bufferInfo, timeoutUs);
            if (encodeStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break; // no output available yet, try again later
            } else if (encodeStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // reported exactly once, used for initialization (e.g. adding the MediaMuxer track)
                callback.onFormatChanged(mediaCodec);
            } else if (encodeStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // ignore
            } else {
                // a valid output-buffer index was returned, i.e. a frame was encoded normally
                ByteBuffer encodeData = mediaCodec.getOutputBuffer(encodeStatus);
                // write out the encoded data
                callback.onWriteData(bufferInfo, encodeData);
                // release the output buffer so it can hold new encoded data
                mediaCodec.releaseOutputBuffer(encodeStatus, false);
            }
        }
    }
}
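The BytePool used above is not shown in the original code. A minimal sketch of such a byte-array pool (a hypothetical implementation, not the author's actual class) could look like this:

```java
import java.util.concurrent.ArrayBlockingQueue;

/** A minimal fixed-size byte[] pool; buffers are taken for encoding and returned afterwards. */
public final class BytePool {
    private final ArrayBlockingQueue<byte[]> pool;
    private final int bufferSize;

    public BytePool(int capacity, int bufferSize) {
        this.pool = new ArrayBlockingQueue<>(capacity);
        this.bufferSize = bufferSize;
    }

    /** Reuse a pooled buffer if one is available, otherwise allocate a new one. */
    public byte[] get() {
        byte[] buffer = pool.poll();
        return buffer != null ? buffer : new byte[bufferSize];
    }

    /** Return a buffer to the pool; it is simply dropped if the pool is already full. */
    public void put(byte[] buffer) {
        if (buffer != null && buffer.length == bufferSize) {
            pool.offer(buffer);
        }
    }
}
```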
2: Writing the encoded H.264 with an OutputStream (MediaMuxer is recommended instead)
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
outputStream.write(outData, 0, outData.length);
The approach above is convenient but flawed: the size of each frame MediaCodec produces can differ, and allocating a new byte[] for every frame still causes memory churn. Instead, allocate one reasonably sized byte[] buffer and use it to copy each frame out of the ByteBuffer before writing to the OutputStream. Alternatively, write through a WritableByteChannel directly; the principle is much the same (a sketch of the channel variant follows the code below).
// reusable write buffer
private byte[] writeBuffer;

@Override
public void onWriteData(MediaCodec.BufferInfo bufferInfo, ByteBuffer encodeData) {
    if (bufferInfo.size != 0) {
        // copy the ByteBuffer contents to the file in writeBuffer-sized chunks
        // LogUtils.i("write buffinfosize %d", bufferInfo.size);
        int offset = bufferInfo.offset;
        int bufferSize = bufferInfo.size;
        while (bufferSize > writeBuffer.length) {
            writeByteBuffer(encodeData, offset, writeBuffer.length);
            bufferSize -= writeBuffer.length;
            offset += writeBuffer.length;
        }
        if (bufferSize > 0) {
            writeByteBuffer(encodeData, offset, bufferSize);
        }
        //byte[] buf = new byte[bufferInfo.size];
        //encodeData.get(buf); // do not write it this way, the per-frame allocation causes severe memory churn
    }
}

/**
 * Copy a slice of the ByteBuffer to the file via the reusable byte[].
 * @param encodeData encoded output buffer from MediaCodec
 * @param offset start position within the buffer
 * @param length number of bytes to write
 */
private void writeByteBuffer(ByteBuffer encodeData, int offset, int length) {
    encodeData.position(offset);
    encodeData.limit(offset + length);
    encodeData.get(writeBuffer, 0, length);
    try {
        bos.write(writeBuffer, 0, length);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
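For comparison, the WritableByteChannel variant mentioned above skips the intermediate byte[] entirely, because the channel consumes the ByteBuffer directly. This is a sketch only; wrapping the existing stream `bos` with Channels.newChannel() and the method shape are assumptions, not the author's code:

```java
// created once, e.g. when the output stream is opened:
// WritableByteChannel channel = Channels.newChannel(bos);

private void writeByteBuffer(WritableByteChannel channel, ByteBuffer encodeData, int offset, int length) throws IOException {
    encodeData.position(offset);
    encodeData.limit(offset + length);
    // the channel reads from position to limit directly; no temporary byte[] is needed
    while (encodeData.hasRemaining()) {
        channel.write(encodeData);
    }
}
```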
3:使用OutputStream方式寫入編碼后aac文件(推薦使用MediaMuxer),同h264一樣需要一個(gè)byte緩沖去接ByteBuffere中的數(shù)據(jù)再寫入到OutputStream中
aac壓縮格式可以直接使用播放器播放,采用 ADTS 格式需要給每幀加上 7 個(gè)字節(jié)的頭信息.(MediaMuxer會(huì)自動(dòng)處理)
@Override
public void onWriteData(MediaCodec.BufferInfo bufferInfo, ByteBuffer encodeData) {
    if (bufferInfo.size != 0) {
        encodeData.position(bufferInfo.offset);
        encodeData.limit(bufferInfo.offset + bufferInfo.size);
        // build the 7-byte ADTS header for this frame (the frame length includes the header)
        addADTStoPacket(bufferInfo.size + 7);
        try {
            bos.write(adtsHeader, 0, 7);
        } catch (IOException e) {
            e.printStackTrace();
        }
        // copy the ByteBuffer contents to the file in writeBuffer-sized chunks
        LogUtils.i("write buffinfosize %d", bufferInfo.size);
        int offset = bufferInfo.offset;
        int bufferSize = bufferInfo.size;
        while (bufferSize > writeBuffer.length) {
            writeByteBuffer(encodeData, offset, writeBuffer.length);
            bufferSize -= writeBuffer.length;
            offset += writeBuffer.length;
        }
        if (bufferSize > 0) {
            writeByteBuffer(encodeData, offset, bufferSize);
        }
        //byte[] buf = new byte[bufferInfo.size];
        //encodeData.get(buf); // do not write it this way, the per-frame allocation causes severe memory churn
    }
}
private void addADTStoPacket(int packetLen) {
    // Bytes 0-2 (syncword, profile, sampling-frequency index and the high bit of the channel
    // configuration) are constant for a fixed encoder setup and are assumed to be filled in once elsewhere.
    adtsHeader[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11)); // channel config (low 2 bits) + frame length (high bits)
    adtsHeader[4] = (byte) ((packetLen & 0x7FF) >> 3);                 // frame length (middle 8 bits)
    adtsHeader[5] = (byte) (((packetLen & 7) << 5) + 0x1F);            // frame length (low 3 bits) + buffer fullness (high bits)
    adtsHeader[6] = (byte) 0xFC;                                       // buffer fullness (low bits), one raw data block per frame
}
4: For mixed recording with MediaMuxer, mediaMuxer.addTrack(mediaCodec.getOutputFormat()) must be called for both the AAC and the H.264 codec when each of them reports INFO_OUTPUT_FORMAT_CHANGED before MediaMuxer may be started; the same applies to stopping, otherwise an exception is thrown.
This requires coordination between threads: whichever codec calls mediaMuxer.addTrack() first enters wait(), and the one that adds its track last calls notifyAll() (a sketch of the callback wiring follows the two methods below).
/**
 * The muxer may only be started once both the AVC and the AAC track have been added.
 */
private synchronized void startMuxer() {
    if (!isMuxerStarted && isRecording) {
        if (audioTrackIndex != -1 && h264TrackIndex != -1) {
            mediaMuxer.start();
            isMuxerStarted = true;
            // the last caller to addTrack() starts the muxer and wakes the waiting thread
            notifyAll();
        } else {
            long c = System.currentTimeMillis();
            do {
                try {
                    // the first caller to addTrack() waits here for the other track
                    wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            } while (isRecording && (audioTrackIndex == -1 || h264TrackIndex == -1));
            long n = System.currentTimeMillis() - c;
            LogUtils.i("wait... %d", n);
        }
    }
}
/**
 * The muxer may only be stopped once both the AAC and the H.264 encoder have stopped.
 */
private synchronized void stopMuxer() {
    if (isMuxerStarted && h264Released && audioReleased) {
        mediaMuxer.stop();
        mediaMuxer.release();
        isMuxerStarted = false;
        mediaMuxer = null;
        LogUtils.i("mp4recorder release...");
    }
}
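How these two methods connect to the encoder callbacks is not shown in the original. A sketch of the wiring for the H.264 side (callback names taken from the MediaEncoder above; field names such as h264TrackIndex and isMuxerStarted are assumptions) might be:

```java
// Called from the encoder thread when INFO_OUTPUT_FORMAT_CHANGED is reported (exactly once per codec).
@Override
public void onFormatChanged(MediaCodec mediaCodec) {
    h264TrackIndex = mediaMuxer.addTrack(mediaCodec.getOutputFormat());
    startMuxer(); // the first caller waits; the last caller starts the muxer and calls notifyAll()
}

// Called for every encoded buffer; only write once the muxer has actually started.
@Override
public void onWriteData(MediaCodec.BufferInfo bufferInfo, ByteBuffer encodeData) {
    if (isMuxerStarted && bufferInfo.size != 0
            && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
        // skip the codec-config buffer: the csd data is already inside the MediaFormat added above
        encodeData.position(bufferInfo.offset);
        encodeData.limit(bufferInfo.offset + bufferInfo.size);
        mediaMuxer.writeSampleData(h264TrackIndex, encodeData, bufferInfo);
    }
}
```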
5: Note that when encoding AAC with MediaCodec, the audio presentation time must be computed from the amount of sampled data; otherwise the muxer will easily throw something like
MPEG4Writer: timestampUs 6220411 < lastTimestampUs 6220442 for Audio track. The cause is simple: while the AAC side sits in wait() after mediaMuxer.addTrack(), waiting for the H.264 track to be added, the time spent waiting exceeds the gap between consecutive wall-clock timestamps, so a later sample ends up with a timestampUs smaller than the last one written. Also, when decoding and playing back H.264 video together with PCM audio, synchronization is usually driven by the PCM playback rate, so accurate audio timestamps are extremely important (a usage sketch follows the class below).
public final class AudioPresentationTime {
    private long startTime;
    private final long bufferDurationUs;
    private long currentCount;

    /**
     * @param bufferSize   size in bytes of each PCM buffer fed to the encoder
     * @param sampleRate   sampling rate in Hz
     * @param channelCount number of channels
     * @param audioFormat  AudioFormat.ENCODING_PCM_16BIT or ENCODING_PCM_8BIT
     */
    public AudioPresentationTime(int bufferSize, int sampleRate, int channelCount, int audioFormat) {
        int bitByteSize = audioFormat == AudioFormat.ENCODING_PCM_16BIT ? 2 : 1; // 16 bit = 2 bytes per sample
        bufferDurationUs = 1_000_000L * (bufferSize / (channelCount * bitByteSize)) / sampleRate;
    }

    public void start() {
        startTime = System.nanoTime() / 1000L;
        currentCount = 0;
    }

    /** Presentation time of the current buffer: start time plus N buffer durations. */
    public long getPresentationTimeUs() {
        return currentCount++ * bufferDurationUs + startTime;
    }
}
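A sketch of how this class could sit on the audio capture path; the AudioRecord read loop and the hand-off to the encoder are assumptions, not the author's exact code, and each read is assumed to fill one full bufferSize-byte buffer:

```java
AudioPresentationTime presentationTime =
        new AudioPresentationTime(bufferSize, 44100, 1, AudioFormat.ENCODING_PCM_16BIT);
presentationTime.start();

while (isRecording) {
    byte[] pcm = bytePool.get();                       // reuse pooled buffers, see section 1
    int read = audioRecord.read(pcm, 0, pcm.length);   // pcm.length == bufferSize
    if (read > 0) {
        long ptsUs = presentationTime.getPresentationTimeUs();
        // hand pcm + ptsUs to the encoder, which passes ptsUs to queueInputBuffer()
    }
}
```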
Addendum: besides computing the audio presentation time from the sampled data, the preview size and frame rate of the Camera data fed to the H.264 encoder also need to be controlled. When the preview size is too large or the frame rate too high, MediaCodec cannot encode as fast as frames are captured (the larger the YUV frame from the preview callback, the longer each encode takes), so either the queue of YUV frames waiting to be encoded fills up and overflows memory, or, if the queue size is bounded, intermediate frames get dropped (see the sketch below).
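For reference, a common way to bound the backlog is to feed the encoder through a fixed-capacity queue and use offer() in the preview callback, accepting occasional dropped frames instead of unbounded memory growth. This is a sketch only: it assumes the old Camera API with setPreviewCallbackWithBuffer(), and the bufferQueue/bytePool names follow section 1.

```java
private final BlockingQueue<byte[]> bufferQueue = new ArrayBlockingQueue<>(10);

@Override
public void onPreviewFrame(byte[] yuvData, Camera camera) {
    byte[] buffer = bytePool.get();
    System.arraycopy(yuvData, 0, buffer, 0, yuvData.length);
    // offer() returns false instead of blocking when the queue is full,
    // so the frame is dropped rather than piling up YUV buffers in memory
    if (!bufferQueue.offer(buffer)) {
        bytePool.put(buffer); // return the unused buffer to the pool instead of leaking it
    }
    camera.addCallbackBuffer(yuvData); // hand the preview buffer back to the camera
}
```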