The VideoStream class
VideoStream is the base class for video streams. Like AudioStream, it extends MediaStream and encapsulates the basic operations on a video stream. Compared with audio, working with video is considerably more involved and touches a wider range of topics, so we will cover the basics of the video stream at greater length.
Let's first look at the methods that implement the basic camera operations.
/**
 * Opens the camera in a new Looper thread so that the preview callback is not called from the main thread
 * If an exception is thrown in this Looper thread, we bring it back into the main thread.
 * @throws RuntimeException Might happen if another app is already using the camera.
 */
private void openCamera() throws RuntimeException {
    final Semaphore lock = new Semaphore(0);
    final RuntimeException[] exception = new RuntimeException[1];
    mCameraThread = new Thread(new Runnable() {
        @Override
        public void run() {
            Looper.prepare();
            mCameraLooper = Looper.myLooper();
            try {
                mCamera = Camera.open(mCameraId);
            } catch (RuntimeException e) {
                exception[0] = e;
            } finally {
                lock.release();
                Looper.loop();
            }
        }
    });
    mCameraThread.start();
    lock.acquireUninterruptibly();
    if (exception[0] != null) throw new CameraInUseException(exception[0].getMessage());
}
openCamera() is the operation that opens the camera. Camera.open(mCameraId) is itself a time-consuming call, so it runs on a new thread (even though Session's start() already executes on a worker thread). The Semaphore blocks the caller until Camera.open() has completed (or failed) on that thread, so openCamera() behaves synchronously, and Looper.prepare()/Looper.loop() keep the thread alive as a Looper thread so that preview callbacks are not delivered on the main thread.
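The blocking handshake can be illustrated without any Android classes. The sketch below is a hypothetical, simplified version of the same pattern (a plain Runnable stands in for Camera.open(), and the Looper is omitted since it only matters for Android callbacks): the caller is released only after the worker has finished its slow initialization, and any exception is re-thrown on the caller's thread.

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.atomic.AtomicReference;

public class OpenHandshake {
    // Simplified sketch of the openCamera() pattern: run a slow init on a
    // worker thread, block the caller until it finishes, and re-throw any
    // exception on the caller's thread.
    static String openBlocking() {
        final Semaphore lock = new Semaphore(0);
        final AtomicReference<String> result = new AtomicReference<>();
        final AtomicReference<RuntimeException> error = new AtomicReference<>();
        new Thread(() -> {
            try {
                result.set("camera-handle");   // stand-in for Camera.open()
            } catch (RuntimeException e) {
                error.set(e);
            } finally {
                lock.release();                // wake the caller either way
            }
        }).start();
        lock.acquireUninterruptibly();         // block until the worker is done
        if (error.get() != null) throw error.get();
        return result.get();
    }

    public static void main(String[] args) {
        System.out.println(openBlocking());    // prints "camera-handle"
    }
}
```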
if (mCamera == null) {
    openCamera();
    ...
    try {
        if (mMode == MODE_MEDIACODEC_API_2) {
            mSurfaceView.startGLThread();
            mCamera.setPreviewTexture(mSurfaceView.getSurfaceTexture());
        } else {
            mCamera.setPreviewDisplay(mSurfaceView.getHolder());
        }
    } catch (IOException e) {
        throw new InvalidSurfaceException("Invalid surface !");
    }
    ...
}
The code above is an excerpt from createCamera(). createCamera() starts by calling openCamera() and then configures the camera's preview surface; we will come back to the mMode == MODE_MEDIACODEC_API_2 case later.
protected synchronized void updateCamera() throws RuntimeException {
    if (mPreviewStarted) {
        mPreviewStarted = false;
        mCamera.stopPreview();
    }
    Parameters parameters = mCamera.getParameters();
    mQuality = VideoQuality.determineClosestSupportedResolution(parameters, mQuality);
    int[] max = VideoQuality.determineMaximumSupportedFramerate(parameters);
    parameters.setPreviewFormat(mCameraImageFormat);
    parameters.setPreviewSize(mQuality.resX, mQuality.resY);
    parameters.setPreviewFpsRange(max[0], max[1]);
    try {
        mCamera.setParameters(parameters);
        mCamera.setDisplayOrientation(mOrientation);
        mCamera.startPreview();
        mPreviewStarted = true;
    } catch (RuntimeException e) {
        destroyCamera();
        throw e;
    }
}
updateCamera() configures the camera's parameters, in order: raw preview format, (closest supported) resolution, (maximum supported) frame-rate range, and display orientation.
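VideoQuality.determineClosestSupportedResolution() has to pick, from the sizes the camera actually reports, the one closest to what was requested. The sketch below is a hypothetical, minimal version of that selection, assuming "closest" means smallest difference in pixel count (the real implementation may measure closeness differently):

```java
import java.util.Arrays;
import java.util.List;

public class ClosestResolution {
    static final class Size {
        final int w, h;
        Size(int w, int h) { this.w = w; this.h = h; }
    }

    // Pick the supported size whose pixel count is closest to the request.
    static Size closest(List<Size> supported, int reqW, int reqH) {
        Size best = null;
        long bestDiff = Long.MAX_VALUE;
        for (Size s : supported) {
            long diff = Math.abs((long) s.w * s.h - (long) reqW * reqH);
            if (diff < bestDiff) { bestDiff = diff; best = s; }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Size> sizes = Arrays.asList(new Size(320, 240), new Size(640, 480), new Size(1280, 720));
        Size s = closest(sizes, 600, 400);
        System.out.println(s.w + "x" + s.h);   // 600x400 is closest to 640x480 by area
    }
}
```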
protected synchronized void destroyCamera() {
    if (mCamera != null) {
        if (mStreaming) super.stop();
        lockCamera();
        mCamera.stopPreview();
        try {
            mCamera.release();
        } catch (Exception e) {
            Log.e(TAG, e.getMessage() != null ? e.getMessage() : "unknown error");
        }
        mCamera = null;
        mCameraLooper.quit();
        mUnlocked = false;
        mPreviewStarted = false;
    }
}
destroyCamera() stops and releases the camera. The mCameraLooper.quit() call exits the Looper started in openCamera(), which lets that thread terminate, so a fresh camera thread can be started the next time the camera is opened.
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();
    // Reopens the camera if needed
    destroyCamera();
    createCamera();
    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();
    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
        // We write the output of the camera to a local socket instead of a file !
        // This one little trick makes streaming feasible quite simply: data from the camera
        // can then be manipulated at the other end of the socket
        mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        throw new ConfNotSupportedException(e.getMessage());
    }
    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    InputStream is = mReceiver.getInputStream();
    try {
        byte buffer[] = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip mp4 header :/");
        stop();
        throw e;
    }
    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
    mPacketizer.setInputStream(mReceiver.getInputStream());
    mPacketizer.start();
    mStreaming = true;
}
The encodeWithMediaRecorder() override in VideoStream is broadly similar to the one in AudioStream. In outline: create the local sockets, restart the camera if needed, unlock the camera, configure the MediaRecorder (video source, output format, encoder, preview surface, resolution, frame rate, bit rate, output destination), start the MediaRecorder to record video, run a loop over the stream to skip everything up to the MPEG-4 mdat atom, and finally hand the filtered stream to the packetizer to be packetized and sent.
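The header-skipping loop above can be exercised on plain bytes, with no sockets or recorder involved. The sketch below applies the same scan to a hypothetical byte sequence: consume the stream until the characters 'm', 'd', 'a', 't' have been read, leaving it positioned at the payload that follows the mdat tag (unlike the original, it also checks for end-of-stream and short reads):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class MdatSkipper {
    // Same scan as in encodeWithMediaRecorder(): read until 'm', then check
    // whether the next three bytes spell "dat"; on a match the stream is
    // positioned just past the mdat tag.
    static void skipToMdat(InputStream is) throws IOException {
        byte[] buffer = new byte[3];
        while (true) {
            int b;
            while ((b = is.read()) != 'm') {
                if (b == -1) throw new IOException("mdat not found");
            }
            if (is.read(buffer, 0, 3) != 3) throw new IOException("mdat not found");
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') return;
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical stand-in for an MP4 stream: some leading atoms, then mdat.
        byte[] data = "ftyp....moov....mdatPAYLOAD".getBytes(StandardCharsets.US_ASCII);
        InputStream is = new ByteArrayInputStream(data);
        skipToMdat(is);
        byte[] rest = new byte[7];
        is.read(rest);
        System.out.println(new String(rest, StandardCharsets.US_ASCII));   // PAYLOAD
    }
}
```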
/**
 * Video encoding is done by a MediaCodec.
 */
protected void encodeWithMediaCodec() throws RuntimeException, IOException {
    if (mMode == MODE_MEDIACODEC_API_2) {
        // Uses the method MediaCodec.createInputSurface to feed the encoder
        encodeWithMediaCodecMethod2();
    } else {
        // Uses dequeueInputBuffer to feed the encoder
        encodeWithMediaCodecMethod1();
    }
}
VideoStream's encodeWithMediaCodec() supports two modes. The first works much like AudioStream's encodeWithMediaCodec(): grab the raw frames and feed them into the MediaCodec for processing. The second uses createInputSurface() to make a Surface the data source. The end result is similar, but the first mode gives you explicit control over feeding and retrieving data (you can finish one chunk before processing the next), while the second offers no direct access to the data and therefore no such control.
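This "controllable" property of the buffer-based path can be illustrated with a plain producer/consumer analogy (hypothetical, no MediaCodec involved): just as dequeueInputBuffer() only hands out a buffer when the encoder has one free, a bounded queue makes the producer wait until the consumer has drained a slot.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BackpressureSketch {
    // Feed `frames` items through a bounded queue and return them in the
    // order the consumer drained them. Capacity 2 plays the role of the
    // encoder's input-buffer pool: the producer cannot run ahead of the
    // consumer by more than two frames.
    static List<String> run(int frames) throws InterruptedException {
        BlockingQueue<String> inputBuffers = new ArrayBlockingQueue<>(2);
        List<String> encoded = new ArrayList<>();
        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < frames; i++) {
                    encoded.add(inputBuffers.take());   // encoder drains a buffer
                }
            } catch (InterruptedException ignored) { }
        });
        consumer.start();
        for (int i = 0; i < frames; i++) {
            inputBuffers.put("frame-" + i);             // blocks while the pool is full
        }
        consumer.join();
        return encoded;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(5));   // [frame-0, frame-1, frame-2, frame-3, frame-4]
    }
}
```

In the surface-based mode there is no equivalent of this handshake: frames flow from the camera into the encoder's input Surface without ever passing through application code.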
/**
 * Video encoding is done by a MediaCodec.
 */
@SuppressLint("NewApi")
protected void encodeWithMediaCodecMethod1() throws RuntimeException, IOException {
    Log.d(TAG, "Video encoded using the MediaCodec API with a buffer");
    // Updates the parameters of the camera if needed
    createCamera();
    updateCamera();
    // Estimates the framerate of the camera
    measureFramerate();
    // Starts the preview if needed
    if (!mPreviewStarted) {
        try {
            mCamera.startPreview();
            mPreviewStarted = true;
        } catch (RuntimeException e) {
            destroyCamera();
            throw e;
        }
    }
    // EncoderDebugger detects and works around bugs in some video encoders,
    // helping us arrive at a working configuration
    EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
    final NV21Convertor convertor = debugger.getNV21Convertor();
    // Configure, in order: resolution, bit rate, frame rate, color format, I-frame interval
    mMediaCodec = MediaCodec.createByCodecName(debugger.getEncoderName());
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mQuality.resX, mQuality.resY);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mQuality.bitrate);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mQuality.framerate);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, debugger.getEncoderColorFormat());
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mMediaCodec.start();
    // Callback invoked for every preview frame delivered by the camera
    Camera.PreviewCallback callback = new Camera.PreviewCallback() {
        long now = System.nanoTime()/1000, oldnow = now, i = 0;
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            // data holds the raw bytes of one preview frame
            oldnow = now;
            now = System.nanoTime()/1000;
            if (i++ > 3) {
                i = 0;
                //Log.d(TAG,"Measured: "+1000000L/(now-oldnow)+" fps.");
            }
            try {
                // Dequeue a free input buffer from the encoder (waits up to 500 ms)
                int bufferIndex = mMediaCodec.dequeueInputBuffer(500000);
                if (bufferIndex >= 0) {
                    inputBuffers[bufferIndex].clear();
                    // Convert the raw NV21 frame into the encoder's color format
                    convertor.convert(data, inputBuffers[bufferIndex]);
                    // Queue the converted frame into the encoder
                    mMediaCodec.queueInputBuffer(bufferIndex, 0, inputBuffers[bufferIndex].position(), now, 0);
                } else {
                    Log.e(TAG, "No buffer available !");
                }
            } finally {
                // Hand the buffer back to the camera so the next frame can be
                // delivered; this is the controllability mentioned earlier
                mCamera.addCallbackBuffer(data);
            }
        }
    };
    // Pre-allocate the callback buffers
    for (int i = 0; i < 10; i++) mCamera.addCallbackBuffer(new byte[convertor.getBufferSize()]);
    // Register the per-frame preview callback
    mCamera.setPreviewCallbackWithBuffer(callback);
    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
    mPacketizer.setInputStream(new MediaCodecInputStream(mMediaCodec));
    mPacketizer.start();
    mStreaming = true;
}
The basic flow of encodeWithMediaCodecMethod1(): open the camera (if needed), estimate the frame rate, start the preview (if needed), probe for encoder bugs (the video-format and encoding issues handled inside EncoderDebugger go too deep to cover here), create and configure the MediaCodec (resolution, bit rate, frame rate, color format, I-frame interval), register a per-frame preview callback on the camera, convert each raw frame inside that callback and queue it into the MediaCodec for encoding, and finally let the packetizer packetize and send the data.
/**
 * Video encoding is done by a MediaCodec.
 * But here we will use the buffer-to-surface method
 */
@SuppressLint({ "InlinedApi", "NewApi" })
protected void encodeWithMediaCodecMethod2() throws RuntimeException, IOException {
    Log.d(TAG, "Video encoded using the MediaCodec API with a surface");
    // Updates the parameters of the camera if needed
    createCamera();
    updateCamera();
    // Estimates the framerate of the camera
    measureFramerate();
    EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
    mMediaCodec = MediaCodec.createByCodecName(debugger.getEncoderName());
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mQuality.resX, mQuality.resY);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mQuality.bitrate);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mQuality.framerate);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // Create the encoder's input surface and attach it to mSurfaceView,
    // replacing the input buffers as the data source
    Surface surface = mMediaCodec.createInputSurface();
    ((SurfaceView)mSurfaceView).addMediaCodecSurface(surface);
    mMediaCodec.start();
    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
    mPacketizer.setInputStream(new MediaCodecInputStream(mMediaCodec));
    mPacketizer.start();
    mStreaming = true;
}
The flow of encodeWithMediaCodecMethod2() is much the same, except that a Surface object serves as the data source, so there is no per-frame raw data to handle: simpler, but without the fine-grained control.
By this point we have analyzed the internals of the VideoStream class and the basic camera operations, and sketched the whole pipeline from capture to encoding. In the next article we will look at VideoStream's subclasses and the specific video encoding formats.