2020-07-24

I. The beauty-effect framework and common issues

1. Beauty preview flow

(1) During the camera refactor, the beauty feature had to be merged into the MTK architecture. For better readability and lower coupling, we dropped the old camera's approach and made beauty a standalone mode instead of mixing it into the normal photo mode. Following the MTK code structure, a dedicated preview-container manager class was created for beauty, EffectViewController.java; its methods mirror the MTK structure, with only the preview container replaced by our beauty view.

host/src/com/freeme/camera/ui/CameraAppUI.java

    public void onCreate() {
        ...

        //mPreviewManager = new PreviewManager(mApp);
        //Set gesture listener to receive touch event.
        //mPreviewManager.setOnTouchListener(new OnTouchListenerImpl());
        mNormalPreviewManager = new PreviewManager(mApp, true, false);
        mBeautyFacePreviewManager = new PreviewManager(mApp, false, false);
        // Beauty preview manager
        mEffectPreviewManager = new PreviewManager(mApp, false, true);
        mNormalPreviewManager.setOnTouchListener(new OnTouchListenerImpl());
        mBeautyFacePreviewManager.setOnTouchListener(new OnTouchListenerImpl());
        mEffectPreviewManager.setOnTouchListener(new OnTouchListenerImpl());
        mPreviewManager = mNormalPreviewManager;

        ...
    }

host/src/com/freeme/camera/ui/preview/PreviewManager.java

    public PreviewManager(IApp app, boolean isTextureView, boolean isEffectView) {
        ...

        //if (enabledValue == SURFACEVIEW_ENABLED_VALUE || appVersion == DEFAULT_APP_VERSION) {
        if (isTextureView) {
            mPreviewController = new TextureViewController(app);
        } else if (isEffectView) {
            mPreviewController = new EffectViewController(app);
        } else {
            mPreviewController = new BeautyFaceViewController(app);
        }

        ...
    }

(2) Beauty preview container management flow

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/EffectMode.java

    public void resume(@Nonnull DeviceUsage deviceUsage) {
        ...
        prepareAndOpenCamera(false, mCameraId, false, false);

        ...
    }

    private void prepareAndOpenCamera(boolean needOpenCameraSync, String cameraId,
                                      boolean needFastStartPreview, boolean isFromSelectedCamera) {
        ...
        mIDeviceController.openCamera(info);
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/device/EffectDevice2Controller.java

    private void doOpenCamera(boolean sync) throws CameraOpenException {
        if (sync) {
            mCameraDeviceManager.openCameraSync(mCurrentCameraId, mDeviceCallback, null);
        } else {
            mCameraDeviceManager.openCamera(mCurrentCameraId, mDeviceCallback, null);
        }
    }

    public class DeviceStateCallback extends Camera2Proxy.StateCallback {

        @Override
        public void onOpened(@Nonnull Camera2Proxy camera2proxy) {
            mModeHandler.obtainMessage(MSG_DEVICE_ON_CAMERA_OPENED,
                    camera2proxy).sendToTarget();
        }
       ...
    }

    private class ModeHandler extends Handler {
        ...
        @Override
        public void handleMessage(Message msg) {
            switch (msg.what) {
                case MSG_DEVICE_ON_CAMERA_OPENED:
                    doCameraOpened((Camera2Proxy) msg.obj);
                    break;
                default:
                    break;
            }
        }
    }
    public void doCameraOpened(@Nonnull Camera2Proxy camera2proxy) {
        try {
            if (CameraState.CAMERA_OPENING == getCameraState()
                    && camera2proxy != null && camera2proxy.getId().equals(mCurrentCameraId)) {
                ...
                if (mPreviewSizeCallback != null) {
                        mPreviewSizeCallback.onPreviewSizeReady(new Size(mPreviewWidth,
                                mPreviewHeight));
                }
                ...
            }
        } catch (RuntimeException e) {
            e.printStackTrace();
        }
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/EffectMode.java

    public void onPreviewSizeReady(Size previewSize) {
        updatePictureSizeAndPreviewSize(previewSize);
    }

    private void updatePictureSizeAndPreviewSize(Size previewSize) {
        ...
        if (size != null && mIsResumed) {
            ...
            if (width != mPreviewWidth || height != mPreviewHeight) {
                onPreviewSizeChanged(width, height);
            }
        }
    }

    private void onPreviewSizeChanged(int width, int height) {
        ...
        mIApp.getAppUi().setPreviewSize(mPreviewHeight, mPreviewWidth, mISurfaceStatusListener);
        ...
    }

host/src/com/freeme/camera/ui/CameraAppUI.java

    public void setPreviewSize(final int width, final int height,
                               final ISurfaceStatusListener listener) {
        mApp.getActivity().runOnUiThread(new Runnable() {
            @Override
            public void run() {
                mPreviewManager.updatePreviewSize(width, height, listener);
                ...
            }
        });
    }

host/src/com/freeme/camera/ui/preview/PreviewManager.java

    public void updatePreviewSize(int width, int height, ISurfaceStatusListener listener) {
        ...
        if (mPreviewController != null) {
            mPreviewController.updatePreviewSize(width, height, listener);
        }
    }

host/src/com/freeme/camera/ui/preview/EffectViewController.java

    public void updatePreviewSize(int width, int height, ISurfaceStatusListener listener) {
        if (mPreviewWidth == width && mPreviewHeight == height) {
            ...
            if (mIsSurfaceCreated) {
                if (listener != null) {
                    ...
                    // Hand the preview container (the EffectView's SurfaceTexture) to the listener
                    listener.surfaceAvailable(((CameraActivity) mApp.getActivity()).getEffectView().getSurfaceTexture(),
                            mPreviewHeight, mPreviewWidth);
                }
            }
            return;
        }
        ...
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/EffectMode.java

    private class SurfaceChangeListener implements ISurfaceStatusListener {

        public void surfaceAvailable(Object surfaceObject, int width, int height) {
            if (mModeHandler != null) {
                mModeHandler.post(new Runnable() {
                    @Override
                    public void run() {
                        if (mIDeviceController != null && mIsResumed) {
                            mIDeviceController.updatePreviewSurface(surfaceObject);
                        }
                    }
                });
            }
        }

        ...
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/device/EffectDevice2Controller.java

    public void updatePreviewSurface(Object surfaceObject) {
        
        synchronized (mSurfaceHolderSync) {
            if (surfaceObject instanceof SurfaceHolder) {
                mPreviewSurface = surfaceObject == null ? null :
                        ((SurfaceHolder) surfaceObject).getSurface();
            } else if (surfaceObject instanceof SurfaceTexture) {
                mPreviewSurface = surfaceObject == null ? null :
                        new Surface((SurfaceTexture) surfaceObject);
            }
            boolean isStateReady = CameraState.CAMERA_OPENED == mCameraState;
            if (isStateReady && mCamera2Proxy != null) {
                boolean onlySetSurface = mSurfaceObject == null && surfaceObject != null;
                mSurfaceObject = surfaceObject;
                if (surfaceObject == null) {
                    stopPreview();
                } else if (onlySetSurface && mNeedSubSectionInitSetting) {
                    mOutputConfigs.get(0).addSurface(mPreviewSurface);
                    if (mSession != null) {
                        mSession.finalizeOutputConfigurations(mOutputConfigs);
                        mNeedFinalizeOutput = false;
                        if (CameraState.CAMERA_OPENED == getCameraState()) {
                            repeatingPreview(false);
                            configSettingsByStage2();
                            repeatingPreview(false);
                        }
                    } else {
                        mNeedFinalizeOutput = true;
                    }
                } else {
                    configureSession(false);
                }
            }
        }
    }
// From here on, the beauty preview container's SurfaceTexture is managed entirely according to the MTK code structure.

(3) Beauty effect rendering flow

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/byted/EffectView.java

    public void onDrawFrame(GL10 gl) {
        if (mCameraChanging || mIsPaused) {
            return;
        }
        // Update the texture image to the most recent frame from the image stream.
        mSurfaceTexture.updateTexImage();
        if(mPauseed){
            GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
            return;
        }
        // Clear the color buffer
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        BytedEffectConstants.Rotation rotation = OrientationSensor.getOrientation();
        // Run ByteDance's beauty algorithm on the preview data to get the processed texture dstTexture
        dstTexture = mEffectRenderHelper.processTexture(mSurfaceTextureID, rotation, getSurfaceTimeStamp());

        synchronized (this) {
            if (mVideoEncoder != null) {
                // Beauty-video related, described later
                mVideoEncoder.frameAvailableSoon();
            }
        }

        if (dstTexture != ShaderHelper.NO_TEXTURE) {
            // Draw the texture
            mEffectRenderHelper.drawFrame(dstTexture);
        }
        mFrameRator.addFrameStamp();
    }
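
For context on where mSurfaceTexture and mSurfaceTextureID come from, the snippet below is a minimal, generic sketch of the usual GLSurfaceView camera-preview setup, not the project's actual EffectView code (hypothetical method name createPreviewTexture(); uses android.opengl.GLES20/GLES11Ext and android.graphics.SurfaceTexture, and assumes the view exposes GLSurfaceView's requestRender()): an OES texture is created on the GL thread, wrapped in a SurfaceTexture that the camera outputs into, and each new frame triggers a render request so onDrawFrame() runs.

    private void createPreviewTexture() {
        // Generate an external OES texture that the camera can write into.
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        mSurfaceTextureID = tex[0];

        // Wrap it in a SurfaceTexture; this is what updatePreviewSurface() eventually
        // hands to the camera as the preview output target.
        mSurfaceTexture = new SurfaceTexture(mSurfaceTextureID);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture st) {
                // A new camera frame arrived: request a render so onDrawFrame() processes it.
                requestRender();
            }
        });
    }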

(4) Common issues and typical fixes

  • Issues:
  1. The beauty preview freezes on the last frame of the previous mode: beauty uses a different preview container from the normal modes, and during the mode switch the container was not switched and shown correctly. Use Android Studio's Layout Inspector to check whether the EffectView is actually being displayed.
  2. Switching into beauty mode flashes black: the root cause, again, is that beauty and the normal modes use different preview containers, so an animation was added between mode switches to cover the swap (see the sketch after this list).
  3. In beauty mode, switching cameras flashes black: this requires adjusting where EffectView.setCameraId and EffectView.setPauseed sit in the MTK code flow. Those two methods exist to stop rendering while the camera is switching; without them artifacts such as out-of-order frames appear. The current result is acceptable.
  4. Other minor issues, such as the beauty control panel, are ordinary UI problems and easy to fix, so they are not covered here.
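
For issue 2, here is a minimal sketch of the covering idea, assuming a plain opaque overlay View (coverView is a hypothetical name and android.view.View is assumed; the project's actual switch animation may differ): the preview area is covered while the container swap happens, then faded out.

    private void playModeSwitchCover(final View coverView) {
        // Show the cover fully opaque, then fade it out once the new container is up.
        coverView.setAlpha(1f);
        coverView.setVisibility(View.VISIBLE);
        coverView.animate()
                .alpha(0f)
                .setDuration(200)
                .withEndAction(new Runnable() {
                    @Override
                    public void run() {
                        coverView.setVisibility(View.GONE);
                    }
                })
                .start();
    }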

2. Beauty video, built on the ByteDance SDK

(1) Approach: see http://www.reibang.com/p/9dc03b01bae3, which this follows and which is very detailed. In short, a separate thread draws the texture produced by the ByteDance SDK (the dstTexture mentioned above) onto our recording surface, i.e. MediaCodec.createInputSurface().
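
The crux of that approach is an EGL detail worth spelling out. Below is a minimal sketch with assumed names, not the project's actual code (the project wraps this in its RenderHandler/WindowSurface-style helpers; uses android.opengl.EGL14 and android.view.Surface): the encoder's input Surface is turned into an EGL window surface whose context shares the preview renderer's GL context, so dstTexture can be drawn straight into the encoder.

    // EGL_ANDROID_recordable flag, required for MediaCodec input surfaces.
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;

    // sharedContext is the GL context passed via encoder.setEglContext(...);
    // encoderInputSurface is the Surface returned by MediaCodec.createInputSurface().
    private static EGLSurface createEncoderEglSurface(EGLContext sharedContext,
                                                      Surface encoderInputSurface) {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        int[] attribs = {
                EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        EGL14.eglChooseConfig(display, attribs, 0, configs, 0, 1, numConfigs, 0);

        // Share the preview renderer's context so its textures (dstTexture) are visible here.
        EGLContext context = EGL14.eglCreateContext(display, configs[0], sharedContext,
                new int[]{EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE}, 0);

        // The encoder's input Surface becomes the render target of this EGL surface.
        EGLSurface eglSurface = EGL14.eglCreateWindowSurface(display, configs[0],
                encoderInputSurface, new int[]{EGL14.EGL_NONE}, 0);
        EGL14.eglMakeCurrent(display, eglSurface, eglSurface, context);
        // After drawing a frame onto it: EGL14.eglSwapBuffers(display, eglSurface);
        return eglSurface;
    }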

(2) Taking the video stream as the example, there are two threads: the rendering (drawing) thread described above, and a recording thread (the video-encoding thread).

feature/mode/effectvideo/src/com/freeme/camera/feature/mode/effectvideo/EffectVideoMode.java

    private void startRecording() {
        ...
        mModeHandler.postDelayed(new Runnable() {
            @Override
            public void run() {
                mSurfaceView.startRecording(mCurrentVideoFilename, EffectVideoMode.this,
                        "on".equals(mSettingManager.getSettingController().queryValue("key_microphone")), mOrientationHint);
            }
        }, 300);
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/byted/EffectView.java

    public void startRecording(String currentDescriptorName, MediaMuxerListener mediaMuxerListener, boolean isRecordAudio, int orientation) {
        try {
            // Create a media muxer for the given path; it multiplexes the encoded audio/video into the target file
            mMuxer = new MediaMuxerWrapper(currentDescriptorName, mediaMuxerListener);
            if (true) {
                // Video encoder; only the object is created here, and mMuxer is handed to it so the encoded video can later be written to the file
                new MediaVideoEncoder(mMuxer, mMediaEncoderListener, mImageHeight, mImageWidth);
            }
            if (isRecordAudio) {
                // Audio encoder
                new MediaAudioEncoder(mMuxer, mMediaEncoderListener);
            }
            // The actual preparation of the audio/video encoders happens here; video is used as the example below
            mMuxer.prepare();
            mMuxer.setOrientationHint(orientation);
            // Start recording
            mMuxer.startRecording();
        } catch (final IOException e) {
            Log.e(TAG, "startCapture:", e);
        }
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaVideoEncoder.java    

    public MediaVideoEncoder(final MediaMuxerWrapper muxer, final MediaEncoderListener listener, final int width, final int height) {
        super(muxer, listener);
        if (DEBUG) Log.i(TAG, "MediaVideoEncoder: ");
        mWidth = width;
        mHeight = height;
        // Render thread; RenderHandler implements Runnable, described later
        mRenderHandler = RenderHandler.createHandler("VideoRenderThread");
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/glutils/RenderHandler.java

    public static final RenderHandler createHandler(final String name) {
        final RenderHandler handler = new RenderHandler();
        synchronized (handler.mSync) {
            // Start the render thread; it waits for subsequent commands before it starts rendering
            new Thread(handler, !TextUtils.isEmpty(name) ? name : TAG).start();
        }
        return handler;
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaEncoder.java

    public MediaEncoder(final MediaMuxerWrapper muxer, final MediaEncoderListener listener) {
        ...
        mWeakMuxer = new WeakReference<MediaMuxerWrapper>(muxer);
        muxer.addEncoder(this);
        mListener = listener;
        synchronized (mSync) {
            mBufferInfo = new MediaCodec.BufferInfo();
            // Encoding thread; MediaEncoder implements Runnable, described later
            new Thread(this, getClass().getSimpleName()).start();
            try {
                mSync.wait();
            } catch (final InterruptedException e) {
            }
        }
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaMuxerWrapper.java

    public void prepare() throws IOException {
        if (mVideoEncoder != null)
            mVideoEncoder.prepare();
        if (mAudioEncoder != null)
            mAudioEncoder.prepare();
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaVideoEncoder.java

    private static final String MIME_TYPE = "video/avc";

    protected void prepare() throws IOException {
        // Format, bit rate, frame rate, key-frame interval: standard Android MediaFormat settings, not covered further
        final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);    // API >= 18
        format.setInteger(MediaFormat.KEY_BIT_RATE, calcBitRate());
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        // Create the encoder; a video encoder is requested here
        mMediaCodec = MediaCodec.createEncoderByType(MIME_TYPE);
        mMediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // The key point: as explained above, the ByteDance-processed texture must be drawn onto a recording surface, and this is that surface.
        mSurface = mMediaCodec.createInputSurface();    // API >= 18
        // Start the codec
        mMediaCodec.start();
        if (DEBUG) Log.i(TAG, "prepare finishing");
        if (mListener != null) {
            try {
                // Callback to signal that preparation is done
                mListener.onPrepared(this);
            } catch (final Exception e) {
                Log.e(TAG, "prepare:", e);
            }
        }
    }
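
calcBitRate() is not shown above; in encoders of this style it is usually just resolution × frame rate × a bits-per-pixel factor. A hedged sketch follows (the BPP value is an assumption; the project's actual factor may differ, and FRAME_RATE is the class constant already used for KEY_FRAME_RATE above):

    private static final float BPP = 0.25f;    // assumed bits-per-pixel factor

    private int calcBitRate() {
        // Bit rate scales with pixel throughput: bits-per-pixel * fps * width * height.
        final int bitrate = (int) (BPP * FRAME_RATE * mWidth * mHeight);
        Log.i(TAG, String.format("bitrate=%5.2f [Mbps]", bitrate / 1024f / 1024f));
        return bitrate;
    }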

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/byted/EffectView.java

    private final MediaEncoder.MediaEncoderListener mMediaEncoderListener = new MediaEncoder.MediaEncoderListener() {
        @Override
        public void onPrepared(final MediaEncoder encoder) {
            if (encoder instanceof MediaVideoEncoder) {
                setVideoEncoder((MediaVideoEncoder) encoder);
            } else if (encoder instanceof MediaAudioEncoder) {
                mAudioEncoder = (MediaAudioEncoder) encoder;
            }
        }

        ...
    };

    public void setVideoEncoder(final MediaVideoEncoder encoder) {
        queueEvent(new Runnable() {
            @Override
            public void run() {
                synchronized (this) {
                    if (encoder != null) {
                        // These three arguments are the key:
                        // 1. The GL context associated with the EffectView is handed to the video encoder so it can build its own EGL environment.
                        // 2. dstTexture is familiar by now: the texture processed by the ByteDance beauty SDK.
                        // 3. mEffectRenderHelper is also familiar: in the beauty flow, mEffectRenderHelper.drawFrame(dstTexture) draws the texture onto the preview container; by analogy, this "brush" will later draw the processed texture onto the recording surface.
                        // With that, the picture is clear: we have the environment, the texture and the brush; all that remains is to draw the texture.
                        encoder.setEglContext(EGL14.eglGetCurrentContext(), dstTexture, mEffectRenderHelper);
                    }
                    mVideoEncoder = encoder;
                }
            }
        });
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/byted/EffectView.java

    public void onDrawFrame(GL10 gl) {
        ...
        dstTexture = mEffectRenderHelper.processTexture(mSurfaceTextureID, rotation, getSurfaceTimeStamp());

        synchronized (this) {
            if (mVideoEncoder != null) {
                // Notify the recorder that a frame is ready
                mVideoEncoder.frameAvailableSoon();
            }
        }

        if (dstTexture != ShaderHelper.NO_TEXTURE) {
            mEffectRenderHelper.drawFrame(dstTexture);
        }
        mFrameRator.addFrameStamp();
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaVideoEncoder.java

    public boolean frameAvailableSoon() {
        boolean result;
        if (result = super.frameAvailableSoon())
            mRenderHandler.draw(null);
        return result;
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaEncoder.java

    public boolean frameAvailableSoon() {
        synchronized (mSync) {
            if (!mIsCapturing || mRequestStop || isPause) {
                return false;
            }
            mRequestDrain++;
            mSync.notifyAll();
        }
        return true;
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/glutils/RenderHandler.java
    // Render thread
    public final void run() {
        ...
        for (; ; ) {
            ...

            if (localRequestDraw) {
                if ((mEglCore != null) && mTexId >= 0) {
                    mInputWindowSurface.makeCurrent();
                    
                    GLES20.glClearColor(1.0f, 1.0f, 0.0f, 1.0f);
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                    // The beauty pipeline rendered a frame; render the same frame here for the encoder
                    mEffectRenderHelper.drawFrame(mTexId);
                    mInputWindowSurface.swapBuffers();
                }
            } else {
                synchronized (mSync) {
                    try {
                        mSync.wait();
                    } catch (final InterruptedException e) {
                        break;
                    }
                }
            }
        }
        ...
    }

feature/mode/effect/src/com/freeme/camera/feature/mode/effect/encoder/MediaEncoder.java
    // Encoding thread
    public void run() {
        synchronized (mSync) {
            mRequestStop = false;
            mRequestDrain = 0;
            mSync.notify();
        }
        final boolean isRunning = true;
        boolean localRequestStop;
        boolean localRequestDrain;
        while (isRunning) {
            ...
            if (localRequestDrain) {
                // Drain the encoded data and write it to the muxer, i.e. pull the encoded output and have the muxer write it into the target video file
                drain();
            } else {
                synchronized (mSync) {
                    try {
                        mSync.wait();
                    } catch (final InterruptedException e) {
                        break;
                    }
                }
            }
        } // end of while
        if (DEBUG) Log.d(TAG, "Encoder thread exiting");
        synchronized (mSync) {
            mRequestStop = true;
            mIsCapturing = false;
        }
    }


    protected void drain() {
        // This method is a bit long and follows Google's standard MediaCodec flow; see the source for details. Only the key steps are annotated: fetching the encoded data and writing it to the video file.
        if (mMediaCodec == null) return;
        if (isPause) return;
        ByteBuffer[] encoderOutputBuffers = mMediaCodec.getOutputBuffers();
        int encoderStatus, count = 0;
        final MediaMuxerWrapper muxer = mWeakMuxer.get();
        if (muxer == null) {
            Log.w(TAG, "muxer is unexpectedly null");
            return;
        }
        LOOP:
        while (mIsCapturing) {
            // 1. Get encoded data with a maximum timeout of TIMEOUT_USEC (= 10 msec)
            encoderStatus = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // wait 5 counts(=TIMEOUT_USEC x 5 = 50msec) until data/EOS come
                if (!mIsEOS) {
                    if (++count > 5)
                        break LOOP;        // out of while
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                if (DEBUG) Log.v(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                // this should not come when encoding
                // 2. Retrieve the set of output buffers again
                encoderOutputBuffers = mMediaCodec.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                if (DEBUG) Log.v(TAG, "INFO_OUTPUT_FORMAT_CHANGED");
                // this status indicate the output format of codec is changed
                // this should come only once before actual encoded data
                // but this status never come on Android4.3 or less
                // and in that case, you should treat when MediaCodec.BUFFER_FLAG_CODEC_CONFIG come.
                if (mMuxerStarted) {    // second time request is error
                    throw new RuntimeException("format changed twice");
                }
                // get output format from codec and pass them to muxer
                // getOutputFormat should be called after INFO_OUTPUT_FORMAT_CHANGED otherwise crash.
                final MediaFormat format = mMediaCodec.getOutputFormat(); // API >= 16
                mTrackIndex = muxer.addTrack(format);
                mMuxerStarted = true;
                if (!muxer.start()) {
                    // we should wait until muxer is ready
                    synchronized (muxer) {
                        while (!muxer.isStarted())
                            try {
                                muxer.wait(100);
                            } catch (final InterruptedException e) {
                                break LOOP;
                            }
                    }
                }
            } else if (encoderStatus < 0) {
                // unexpected status
                if (DEBUG)
                    Log.w(TAG, "drain:unexpected result from encoder#dequeueOutputBuffer: " + encoderStatus);
            } else {
                // 3. The encoded data
                final ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    // this never should come...may be a MediaCodec internal error
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                }
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // You should set the output format on the muxer here when targeting Android 4.3 or less,
                    // but MediaCodec#getOutputFormat cannot be called here (INFO_OUTPUT_FORMAT_CHANGED hasn't arrived yet),
                    // therefore the output format would have to be rebuilt from the buffer data.
                    // This sample is for API>=18(>=Android 4.3), just ignore this flag here
                    if (DEBUG) Log.d(TAG, "drain:BUFFER_FLAG_CODEC_CONFIG");
                    mBufferInfo.size = 0;
                }

                if (mBufferInfo.size != 0) {
                    // encoded data is ready, clear waiting counter
                    count = 0;
                    if (!mMuxerStarted) {
                        // muxer is not ready... this is a programming failure.
                        throw new RuntimeException("drain:muxer hasn't started");
                    }
                    // 4. Write the encoded data to the muxer
                    mBufferInfo.presentationTimeUs = getPTSUs();
                    muxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                    prevOutputPTSUs = mBufferInfo.presentationTimeUs;
                }
                // return buffer to encoder
                mMediaCodec.releaseOutputBuffer(encoderStatus, false);
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    // when EOS come.
                    mIsCapturing = false;
                    break;      // out of while
                }
            }
        }
    }

// At this point the flow of rendering the video and writing the encoded data into the target file should be clear; audio works the same way and is simpler.
// A note on why the beauty-video feature was rewritten: first for the visual quality, and second because an asynchronous rendering/resource-release problem inside the tutu SDK made the old implementation crash easily; that partnership has ended and we cannot patch their SDK.
// The new beauty video is now stable. With the flow understood, any future problem can be analyzed case by case.

3. A brief look at the face-cute (臉萌) mode

(1) How face-cute works: using the face coordinate data returned by the tutu beauty SDK, the third-party library libgdx (itself a wrapper around OpenGL) draws the face-cute stickers on top of the texture. A small sketch of the idea follows.
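
A minimal, self-contained sketch of drawing one sticker frame with libgdx's SpriteBatch at a face landmark (assumed names and a normalized landmark coordinate; the project's FunnyFaceView computes its anchors via computeAnchorInfo instead):

    import com.badlogic.gdx.graphics.g2d.SpriteBatch;
    import com.badlogic.gdx.graphics.g2d.TextureRegion;

    public class StickerSketch {
        // landmarkX/landmarkY are assumed to be normalized [0,1] coordinates from the face SDK.
        public static void drawSticker(SpriteBatch batch, TextureRegion sticker,
                                       float landmarkX, float landmarkY,
                                       int screenW, int screenH, float scale, float angle) {
            float x = landmarkX * screenW;
            float y = (1f - landmarkY) * screenH;          // flip Y: image space -> GL space
            float w = sticker.getRegionWidth();
            float h = sticker.getRegionHeight();
            batch.begin();
            // Center the sticker on the landmark and rotate/scale it around its own center.
            batch.draw(sticker, x - w / 2f, y - h / 2f,
                    w / 2f, h / 2f, w, h, scale, scale, angle);
            batch.end();
        }
    }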

(2) A quick look at the flow

feature/mode/beautyface/src/com/freeme/camera/feature/mode/beautyface/BeautyFaceView.java

    public void onDrawFrame(GL10 gl10) {
        mSurfaceTexture.updateTexImage();
        if (mPauseed) {
            return;
        }
        ...

        GLES20.glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        // Filter-engine processing; the returned textureID is a TEXTURE_2D texture
        int textureWidth = mMeasuredHeight;/*mDrawBounds.height();*/
        int textureHeight = mMeasuredWidth;/*mDrawBounds.width();*/

        textureHeight = (int) (textureHeight / SAMPLER_RATIO);
        textureWidth = (int) (textureWidth / SAMPLER_RATIO);

        if (mDrawBounds.width() >= 972) {
            textureHeight = (int) (textureHeight / SAMPLER_RATIO);
            textureWidth = (int) (textureWidth / SAMPLER_RATIO);
        }

        ...
        final int textureId = mFilterEngine.processFrame(mOESTextureId, textureWidth, textureHeight);
        
        textureProgram.draw(textureId);
        
        if (mCameraActivity.getCurrentCameraMode() == FreemeSceneModeData.FREEME_SCENE_MODE_FC_ID) {
            FaceAligment[] faceAligments = mFilterEngine.getFaceFeatures();
            float deviceAngle = mFilterEngine.getDeviceAngle();
            // Draw the face-cute effect
            mFunnyFaceView.render(deviceAngle, faceAligments);
            // Face-cute capture
            mFunnyFaceView.capture();
        }
    }

feature/mode/facecute/src/com/freeme/camera/feature/mode/facecute/gles/FunnyFaceView.java

    public void render(float deviceAngle, FaceAligment[] faceAligments) {
        if (!mIsShowing || mIsSwitching || mIsDispose) {
            return;
        }
        long time = System.nanoTime();
        deltaTime = (time - lastFrameTime) / 1000000000.0f;
        lastFrameTime = time;
        mStateTime += deltaTime;
        if (faceAligments != null && faceAligments.length > 0) {
            ...
            int faceW = (int) face.width();
            int faceH = (int) face.height();
            int abs = Math.abs(faceH - faceW);
            // Common issue: the face-cute effect does not show up.
            // Cause 1: it relies on the tutu SDK face data; if the face is too far away it is not detected and no effect is drawn.
            // Cause 2: the renderer checks the face width/height ratio; removing the check causes flicker and white flashes. The check was improved by factoring in screen density, so most projects now get a correct effect.
            if (faceW < mFaceMinSizePx || faceW > mFaceMaxSizePx || abs > 70 * mDensity) {
                mCamera.showOrNotFFBNoFaceIndicator(true);
                return;
            }
            ...
            drawItem(scale, 0, angle, landmarkInfo);
            mSpriteBatch.end();
            mCamera.showOrNotFFBNoFaceIndicator(false);
        } else {
            mCamera.showOrNotFFBNoFaceIndicator(true);
        }
    }

    private void drawItem(float scale, int orientation, float angle, LandmarkInfo markInfo) {
        if (mCurrItemList != null) {
            for (ItemInfo item : mCurrItemList) {
                TextureRegion currRegion = item.anim.getKeyFrame(mStateTime, true);
                AnchorInfo anchor = computeAnchorInfo(item, markInfo, scale, orientation);
                drawElements(currRegion, anchor, scale, orientation, angle);
            }
        }
    }

    private void drawElements(TextureRegion currRegion, AnchorInfo anchor, float scale,
                              int orientation, float angle) {
        ...
        // Draw
        mSpriteBatch.draw(currRegion, x, y, orignX, orignY, orignW, orignH, scale, scale,
                finalAngle);
    }

    public void capture() {
        if (mIsNeedCapture) {
            mIsNeedCapture = false;
            handleRGB565Data();
        }
    }

    private void handleRGB565Data() {
        long time = System.currentTimeMillis();
        final int data[] = this.getJpegDataFromGpu565(0, 0, mWidth, mHeight);
        ...
    }

    public int[] getJpegDataFromGpu565(int x, int y, int w, int h) {
        int size = w * h;
        ByteBuffer buf = ByteBuffer.allocateDirect(size * 4);
        buf.order(ByteOrder.nativeOrder());
        //glReadPixels
        GLES20.glReadPixels(x, y, w, h, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, buf);
        int data[] = new int[size];
        buf.asIntBuffer().get(data);
        buf = null;
        return data;
    }
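
The int[] that comes back from glReadPixels() above is in RGBA byte order, which a little-endian device reads as ABGR ints, and the GL origin is at the bottom-left. A hedged sketch (hypothetical helper, not the project's conversion code; uses android.graphics.Bitmap) of turning it into an upright ARGB Bitmap:

    private static Bitmap pixelsToBitmap(int[] data, int w, int h) {
        int[] argb = new int[w * h];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int p = data[y * w + x];                    // ABGR as read from the GPU
                int pixel = (p & 0xff00ff00)                // keep A and G in place
                        | ((p & 0x000000ff) << 16)          // move R up
                        | ((p >> 16) & 0x000000ff);         // move B down
                argb[(h - 1 - y) * w + x] = pixel;          // flip rows: GL is bottom-up
            }
        }
        return Bitmap.createBitmap(argb, w, h, Bitmap.Config.ARGB_8888);
    }
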
// The face-cute flow is fairly simple and relies on the tutu beauty SDK; the common issue is the one described above.

That covers the three beauty-class features: beauty, beauty video, and face-cute.

II. Plugins and common issues

1. External plugins: model and kids modes; watermark and 大片 (showcase); QR code scanning

(1) External plugin framework: see documents/FreemeOS/other/training/Camera/pluginmanager/Android插件化開發(fā).md. The previous camera owner (大廚) explained the plugin system end to end before leaving; the document is very detailed, so read it there.

(2) Taking QR code scanning as the example:

feature/mode/qrcodescan/src/com/freeme/camera/feature/mode/qrcodescan/QrCodeScanMode.java

    // Camera API2: preview data is obtained from an ImageReader
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        //image->plane->buffer->byte[]
        // getBytesFromImageAsType: fills the output according to the requested type; QR decoding only needs the luminance (Y) plane, so simply appending the UV data after the Y data is enough (see the sketch after this method)
        mIApp.getmPluginManagerAgent().blendOutput(CameraUtil.getBytesFromImageAsType(image, 1), FreemeSceneModeData.FREEME_SCENE_MODE_QRCODE_ID);
        image.close();
    }
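
getBytesFromImageAsType() belongs to the project's CameraUtil and is not shown here; the essential part for QR scanning is just the Y plane. A minimal sketch of that part (hypothetical helper; it ignores the UV append and assumes a YUV_420_888 Image whose row stride may exceed the width; uses android.media.Image and java.nio.ByteBuffer):

    private static byte[] extractLuminance(Image image) {
        Image.Plane yPlane = image.getPlanes()[0];
        ByteBuffer buffer = yPlane.getBuffer();
        int width = image.getWidth();
        int height = image.getHeight();
        int rowStride = yPlane.getRowStride();
        byte[] y = new byte[width * height];
        byte[] row = new byte[rowStride];
        for (int r = 0; r < height; r++) {
            buffer.position(r * rowStride);
            // The last row may be shorter than rowStride; only width bytes are needed anyway.
            buffer.get(row, 0, Math.min(rowStride, buffer.remaining()));
            System.arraycopy(row, 0, y, r * width, width);
        }
        return y;
    }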

common/src/com/freeme/camera/common/pluginmanager/PluginManagerAgent.java

    public byte[] blendOutput(byte[] jpegData, int mode) {
        if (mModules != null && mModules.size() > 0) {
            IPluginModuleEntry plugin = mModules.get(mode, null);
            if (plugin != null) {
                return plugin.blendOutput(jpegData);
            }
        }
        return null;
    }

FreemeCameraPlugin/CameraQrCodeScan/app/src/main/java/com/freeme/cameraplugin/qrcodescan/QrCodeScan.java

    public byte[] blendOutput(byte[] jpegData) {
        if (QrCodeScanView.sFramingRect == null) {
            return super.blendOutput(jpegData);
        }
        synchronized (mDecodeHandlerObject) {
            if (mDecodeHandler != null && !mIsCoding) {
                mIsCoding = true;
                Point cameraResolution = mCameraConfigManager.getCameraResolution();
                Message message = mDecodeHandler.obtainMessage(MSG_START_DECODE, cameraResolution.x, cameraResolution.y, jpegData);
                message.sendToTarget();
            } else {
                Log.d(TAG, "Got preview callback, but no handler for it");
            }
        }
        return super.blendOutput(jpegData);
    }

    public void handleMessage(Message msg) {
        super.handleMessage(msg);
        if (msg.what == MSG_START_DECODE) {
            decode((byte[]) msg.obj, msg.arg1, msg.arg2);
        } else if (msg.what == MSG_QUIT_DECODE) {
            Looper.myLooper().quit();
        }
    }

    private void decode(byte[] data, int width, int height) {
           
            Result rawResult = null;
            Log.i(TAG, "decode byte length : " + data.length + ",width : " + width + ",height : " + height);
            //modify here
            byte[] rotatedData = new byte[data.length];
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++)
                    rotatedData[x * height + height - y - 1] = data[x + y * width];
            }
            ...
            PlanarYUVLuminanceSource source = buildLuminanceSource(rotatedData, width, height, rect);
            BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
            try {
                // Decode with Google's ZXing library
                rawResult = multiFormatReader.decodeWithState(bitmap);
            } catch (ReaderException re) {
                // continue
            } finally {
                multiFormatReader.reset();
            }

            ...
        }
    }
// Common scan issue: the QR code is not recognized. In every case I have seen, the device's focus was the problem; have the project team check autofocus.
// One more note, on resources failing to load: the screen size did not meet the backend's filtering conditions.

2. Internal plugins (modes): fake-SLR (background blur) mode and portrait mode

(1) Principle: a BvirtualView is overlaid on top of the preview container (TextureView)

feature/mode/slr/src/com/freeme/camera/feature/mode/slr/BvirtualView.java

    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        canvas.setDrawFilter(mPFDF);
        drawTrueBgVirtualWithCanvas(canvas);
        drawDiaphragm(canvas);
    }

    private void drawTrueBgVirtualWithCanvas(Canvas canvas) {
        ...
        // Grab a bitmap of the current preview frame
        Bitmap preview = ((CameraAppUI) mApp.getAppUi()).getmPreviewManager().getPreviewBitmap(sampleFactor);//mScreenShotProvider.getPreviewFrame(sampleFactor);
        ...
        if (preview != null && mBlur != null) {
            ...
            // Blur the whole image with Android's built-in ScriptIntrinsicBlur
            Bitmap bgBlurBitmap = mBlur.blurBitmap(preview, mBlurDegress);
            if (SHOW_PREVIEW_DEBUG_LOG) {
                time1 = System.currentTimeMillis();
                Log.e(TAG, "blur bitmap :" + (time1 - time0) + " ms");
                time0 = System.currentTimeMillis();
            }
            BlurInfo info = new BlurInfo();
            info.x = (int) (mOnSingleX / apectScale);
            info.y = (int) (mOnSingleY / apectScale);
            info.inRadius = (int) (IN_SHARPNESS_RADIUS * scale / apectScale);
            info.outRadius = (int) (OUT_SHARPNESS_RADIUS * scale / apectScale);
            // Blend the sharp region back in using the blur library written by 大廚
            // Source: https://github.com/azmohan/BvArithmetic
            SmoothBlurJni.smoothRender(bgBlurBitmap, preview, info);
            if (SHOW_PREVIEW_DEBUG_LOG) {
                time1 = System.currentTimeMillis();
                Log.e(TAG, "smooth render :" + (time1 - time0) + " ms");
            }
            Matrix matrix = new Matrix();
            matrix.setScale(apectScale, apectScale);
            // Draw the result
            canvas.drawBitmap(bgBlurBitmap, matrix, null);
            preview.recycle();
            bgBlurBitmap.recycle();
        }
    }
// Common issue: jank. The root cause is that this fake-SLR pipeline is resource-hungry and layers yet another view on top of the preview.
// It can be mitigated by lowering the following values in BvirtualView.java:
    private final static int IN_SHARPNESS_RADIUS = 200;
    private final static int OUT_SHARPNESS_RADIUS = 320;
    private static int REFERENCE_ASPECT_SIZE = 720;
    private static int SUPPORT_MAX_ASPECT_SIZE = 720;
// For a fundamental fix, do what the beauty pipeline does: blur the texture with OpenGL on the GPU and then draw it onto the preview container (see the shader sketch below).
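
A rough sketch of that direction (not existing project code): a GLES 2.0 box-blur fragment shader that could be run over the preview texture on the GPU instead of blurring a Bitmap on the CPU. A production version would use a separable two-pass Gaussian blur and keep the in-focus region sharp via a mask, similar to what SmoothBlurJni does on the CPU.

    private static final String BOX_BLUR_FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform sampler2D uTexture;\n" +
            "uniform vec2 uTexelSize;\n" +          // (1/width, 1/height)
            "void main() {\n" +
            "    vec4 sum = vec4(0.0);\n" +
            "    for (int i = -2; i <= 2; i++) {\n" +
            "        for (int j = -2; j <= 2; j++) {\n" +
            "            sum += texture2D(uTexture, vTexCoord + vec2(float(i), float(j)) * uTexelSize);\n" +
            "        }\n" +
            "    }\n" +
            "    gl_FragColor = sum / 25.0;\n" +    // 5x5 box average
            "}\n";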