A custom camera built on Camera2, with a framing border and switchable display effects

Because the capture screen needs to display some extra effects, the camera page has to be implemented by hand. Below Android 5.0 this can be done with Camera + SurfaceView; here only 5.0 and above is covered, using Camera2 together with a SurfaceTexture-backed GLSurfaceView. The GLSurfaceView makes it possible to apply special preview effects such as black-and-white, skin smoothing, or film negative.
Here is what the two effects look like:


(two animated GIF demos of the preview effects, omitted here)

Below is the main Camera2 API wrapper. Note the image handling: if you return the result through an Intent you cannot put the Bitmap object in directly, because the Intent size limit is easy to exceed and the app will crash. Passing a byte[] needs the same care; on phones with very high-resolution sensors it can still be too large, so only a simple compression step is done here. The special photo effect is applied with a ColorMatrix; that only changes the captured photo, and how to process the live preview effect is covered later:
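As a rough illustration of the size issue described above (my own sketch, not code from this project), the JPEG payload can be re-compressed until it fits a budget before being put into the Intent. The 500 KB budget and the helper name compressForIntent are arbitrary example values chosen to stay well under the Binder transaction limit:

// Hypothetical helper (not part of the CameraV2 class below): lower the JPEG quality
// step by step until the payload is small enough to pass through an Intent extra.
private static byte[] compressForIntent(Bitmap bitmap) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 100;
    bitmap.compress(Bitmap.CompressFormat.JPEG, quality, baos);
    while (baos.size() > 500 * 1024 && quality > 10) { // 500 KB is an arbitrary budget
        quality -= 10;
        baos.reset();
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, baos);
    }
    return baos.toByteArray();
}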


import android.Manifest;
import android.app.Activity;
import android.content.Context;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Looper;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.util.Size;
import android.view.Surface;

import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.util.Arrays;

/**
 * A thin wrapper around the Camera2 API calls.
 */
public class CameraV2 {

    private Activity mActivity;
    private CameraDevice mCameraDevice;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mCameraThread;
    private Handler mCameraHandler;
    private SurfaceTexture mSurfaceTexture;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;
    private ImageReader mImageReader;

    private CaptureCallBack mCallBack;
    private boolean mIsPortrait=true;//defaults to portrait

    public CameraV2(Activity activity) {
        mActivity = activity;
        startCameraThread();
    }

    /**
     * Initial configuration, plus the callback that receives the image data after a capture.
     * @param width    preview width
     * @param height   preview height
     * @param fileName file name to save the photo to (null to skip saving to disk)
     */
    public void setupCamera(int width, int height, final String fileName) {
        final CameraManager cameraManager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
        try {
            if(cameraManager!=null) {
                for (String id : cameraManager.getCameraIdList()) {
                    CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(id);
                    if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {//skip the front-facing camera; the back camera is used here
                        continue;
                    }
                    StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                    mPreviewSize = getCloselyPreSize(width, height, map.getOutputSizes(SurfaceTexture.class));
                    mCameraId = id;
                }
            }
            // create an ImageReader to receive the captured image data
            mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(),mPreviewSize.getHeight(),
                    ImageFormat.JPEG, 1);
            mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
                        // invoked when the photo data becomes available
                        @Override
                        public void onImageAvailable(final ImageReader reader) {
                            // grab the captured photo data
                            Image image = reader.acquireNextImage();
                            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                            final byte[] bytes = new byte[buffer.remaining()];
                            buffer.get(bytes);
                            Handler handler=new Handler(Looper.getMainLooper());
                            handler.post(new Runnable() {
                                @Override
                                public void run() {
                                    BitmapFactory.Options options = new BitmapFactory.Options();
                                    //decode the full bitmap here (true would only read the image dimensions)
                                    options.inJustDecodeBounds = false;
                                    //downsample by 2 to reduce memory use
                                    options.inSampleSize = 2;
                                    // build a bitmap from the captured JPEG bytes
                                    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0,
                                            bytes.length, options);
                                    int width = options.outWidth;
                                    Matrix matrix = new Matrix();
                                    matrix.setRotate(90);//the raw capture comes out landscape, rotate it 90 degrees to portrait
                                    float rate = (float)bitmap.getHeight() / 1080;//scale factor for adapting to the 1080px design
                                    float marginRight = (float) 292 * rate;
                                    float marginTop= (float) 142 * rate;
                                    float cropBitmapWidth = (float) 1240 * rate;
                                    float cropBitmapHeight = (float) 796 * rate;
                                    //create the cropped bitmap; the source bitmap cannot be modified in place
                                    bitmap = Bitmap.createBitmap(bitmap, (int)(width-marginRight-cropBitmapWidth)
                                            , (int)(marginTop)
                                            , (int)(cropBitmapWidth)
                                            , (int)(cropBitmapHeight), matrix, true);
                                    //post-processing of the captured photo goes here, e.g. a black-and-white effect:
                                    //Canvas canvas=new Canvas(bitmap);
                                    //Paint paint=new Paint();
                                    //ColorMatrix colorMatrix=new ColorMatrix(new float[]{
                                    //    0.213f, 0.715f,0.072f,0,0,
                                    //    0.213f, 0.715f,0.072f,0,0,
                                    //    0.213f, 0.715f,0.072f,0,0,
                                    //    0,0,0,1,0,
                                    //});
                                    //paint.setColorFilter(new ColorMatrixColorFilter(colorMatrix));
                                    //canvas.drawBitmap(bitmap,0,0,paint);
                                    //746px // 1198px
                                    mCallBack.photoData(bitmap);
                                }
                            });
                            if(fileName==null){
                                image.close();
                                return;
                            }
                            // write the JPEG bytes to the specified file
                            File file = new File(fileName);
                            try (FileOutputStream output = new FileOutputStream(file)) {
                                output.write(bytes);
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                            finally {
                                image.close();
                            }
                        }
                    }, null);//the last parameter picks the thread for this callback; none is given here, so consider passing mCameraHandler to keep the file I/O off the main thread
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Finds the preview size whose aspect ratio is closest to the given dimensions
     * (an exact match is preferred if one exists).
     *
     * @param surfaceWidth
     *            width of the surface to match
     * @param surfaceHeight
     *            height of the surface to match
     * @param preSizeList
     *            candidate preview sizes to compare against
     * @return the size closest to the requested aspect ratio
     */
    private Size getCloselyPreSize(int surfaceWidth, int surfaceHeight,
                                     Size[] preSizeList) {

        int reqTmpWidth;
        int reqTmpHeight;
        // when the screen is portrait, swap width and height so that width is the larger value
        if (mIsPortrait) {
            reqTmpWidth = surfaceHeight;
            reqTmpHeight = surfaceWidth;
        } else {
            reqTmpWidth = surfaceWidth;
            reqTmpHeight = surfaceHeight;
        }
        //first look for a preview size that exactly matches the surface dimensions
        for(Size size : preSizeList){
            if((size.getWidth() == reqTmpWidth) && (size.getHeight() == reqTmpHeight)){
                return size;
            }
        }

        // otherwise pick the size whose aspect ratio is closest to the requested one
        float reqRatio = ((float) reqTmpWidth) / reqTmpHeight;
        float curRatio, deltaRatio;
        float deltaRatioMin = Float.MAX_VALUE;
        Size retSize = null;
        for (Size size : preSizeList) {
            curRatio = ((float) size.getWidth()) / size.getHeight();
            deltaRatio = Math.abs(reqRatio - curRatio);
            if (deltaRatio < deltaRatioMin) {
                deltaRatioMin = deltaRatio;
                retSize = size;
            }
        }

        return retSize;
    }

    /**
     * Start the background thread that handles camera callbacks.
     */
    private void startCameraThread() {
        mCameraThread = new HandlerThread("CameraThread");
        mCameraThread.start();
        mCameraHandler = new Handler(mCameraThread.getLooper());
    }

    public boolean openCamera() {
        CameraManager cameraManager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
        try {
            if (ActivityCompat.checkSelfPermission(mActivity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
            if(cameraManager!=null) {
                cameraManager.openCamera(mCameraId, mStateCallback, mCameraHandler);
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    /**
     * Set the SurfaceTexture that the preview frames will be rendered into.
     */
    public void setPreviewTexture(SurfaceTexture surfaceTexture) {
        mSurfaceTexture = surfaceTexture;
    }

    /**
     * Start the preview.
     */
    public void startPreview() {
        mSurfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        Surface surface = new Surface(mSurfaceTexture);
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Arrays.asList(surface,mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mCameraHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
//                    ToastUtils.showLong("session configuration failed");
                }
            }, mCameraHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    public void capture(CaptureCallBack callBack) {
        mCallBack=callBack;
        try {
            //build a CaptureRequest for a still capture
            final CaptureRequest.Builder mCaptureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            //direct the still-capture output to mImageReader
            mCaptureBuilder.addTarget(mImageReader.getSurface());
            //set the JPEG orientation if needed (portrait by default)
//            mCaptureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(ScreenUtils.getScreenRotation(ActivityUtils.getTopActivity())));
            //this callback is meant to restart the preview once the capture completes, since taking a photo stops the preview
            mCameraCaptureSession.capture(mCaptureRequestBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    super.onCaptureCompleted(session, request, result);
                }
            }, mCameraHandler);
            CameraCaptureSession.CaptureCallback mImageSavedCallback = new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session,@NonNull CaptureRequest request,@NonNull TotalCaptureResult result) {

                }
            };
            //stop the repeating preview request
            mCameraCaptureSession.stopRepeating();
            //fire the still capture; because mCaptureBuilder targets the ImageReader, its onImageAvailable() callback is invoked automatically with the JPEG data (call restartPreview() afterwards to resume the preview)
            mCameraCaptureSession.capture(mCaptureBuilder.build(), mImageSavedCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    /**
     * Restart the preview.
     */
    public void restartPreview() {
        try {
            //just call setRepeatingRequest again; mCaptureRequest is the same request built when the preview was first started
            mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mCameraHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    public interface CaptureCallBack{
        void photoData(Bitmap bitmap);
    }

    public void closeCamera(){
        if (mCameraDevice != null) {
            mCameraDevice.close();
            mCameraDevice = null;
        }
        mCameraThread.quitSafely();//quit the handler thread's looper rather than interrupting the thread
        mCameraHandler=null;
    }
}

The custom GLSurfaceView and its renderer:

import android.content.Context;
import android.opengl.GLSurfaceView;
import android.util.AttributeSet;

/**
 * The GLSurfaceView that displays the camera preview.
 */

public class CameraV2GLSurfaceView extends GLSurfaceView {

    public CameraV2GLSurfaceView(Context context) {
        super(context);
    }

    public CameraV2GLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public void init(CameraV2 camera, boolean isPreviewStarted, Context context) {
        setEGLContextClientVersion(2);

        CameraV2Renderer cameraV2Renderer = new CameraV2Renderer();
        cameraV2Renderer.init(this, camera, isPreviewStarted, context);

        setRenderer(cameraV2Renderer);
    }
}
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLSurfaceView;
import java.nio.FloatBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import static android.opengl.GLES11Ext.GL_TEXTURE_EXTERNAL_OES;
import static android.opengl.GLES20.GL_FLOAT;
import static android.opengl.GLES20.GL_FRAMEBUFFER;
import static android.opengl.GLES20.GL_TRIANGLES;
import static android.opengl.GLES20.glActiveTexture;
import static android.opengl.GLES20.glBindFramebuffer;
import static android.opengl.GLES20.glBindTexture;
import static android.opengl.GLES20.glClearColor;
import static android.opengl.GLES20.glDrawArrays;
import static android.opengl.GLES20.glEnableVertexAttribArray;
import static android.opengl.GLES20.glGenFramebuffers;
import static android.opengl.GLES20.glGetAttribLocation;
import static android.opengl.GLES20.glGetUniformLocation;
import static android.opengl.GLES20.glUniform1i;
import static android.opengl.GLES20.glUniformMatrix4fv;
import static android.opengl.GLES20.glVertexAttribPointer;
import static android.opengl.GLES20.glViewport;

/**
 * The renderer.
 */
public class CameraV2Renderer implements GLSurfaceView.Renderer {

    private Context mContext;
    private CameraV2GLSurfaceView mCameraV2GLSurfaceView;
    private CameraV2 mCamera;
    private boolean bIsPreviewStarted;
    private int mOESTextureId = -1;
    private SurfaceTexture mSurfaceTexture;
    private float[] transformMatrix = new float[16];
    private FloatBuffer mDataBuffer;
    private int mShaderProgram = -1;
    private int[] mFBOIds = new int[1];

    public void init(CameraV2GLSurfaceView surfaceView, CameraV2 camera, boolean isPreviewStarted, Context context) {
        mContext = context;
        mCameraV2GLSurfaceView = surfaceView;
        mCamera = camera;
        bIsPreviewStarted = isPreviewStarted;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mOESTextureId = RawUtils.createOESTextureObject();
        FilterEngine filterEngine = new FilterEngine(mContext);
        mDataBuffer = filterEngine.getBuffer();
        mShaderProgram = filterEngine.getShaderProgram();
        glGenFramebuffers(1, mFBOIds, 0);
        glBindFramebuffer(GL_FRAMEBUFFER, mFBOIds[0]);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        if (mSurfaceTexture != null) {
            mSurfaceTexture.updateTexImage();
            mSurfaceTexture.getTransformMatrix(transformMatrix);
        }

        if (!bIsPreviewStarted) {
            bIsPreviewStarted = initSurfaceTexture();
            return;
        }

        //glClear(GL_COLOR_BUFFER_BIT);
        glClearColor(1.0f, 0.0f, 0.0f, 0.0f);

        int aPositionLocation = glGetAttribLocation(mShaderProgram, FilterEngine.POSITION_ATTRIBUTE);
        int aTextureCoordLocation = glGetAttribLocation(mShaderProgram, FilterEngine.TEXTURE_COORD_ATTRIBUTE);
        int uTextureMatrixLocation = glGetUniformLocation(mShaderProgram, FilterEngine.TEXTURE_MATRIX_UNIFORM);
        int uTextureSamplerLocation = glGetUniformLocation(mShaderProgram, FilterEngine.TEXTURE_SAMPLER_UNIFORM);

        glActiveTexture(GL_TEXTURE_EXTERNAL_OES);
        glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mOESTextureId);
        glUniform1i(uTextureSamplerLocation, 0);
        glUniformMatrix4fv(uTextureMatrixLocation, 1, false, transformMatrix, 0);

        if (mDataBuffer != null) {
            mDataBuffer.position(0);
            glEnableVertexAttribArray(aPositionLocation);
            glVertexAttribPointer(aPositionLocation, 2, GL_FLOAT, false, 16, mDataBuffer);

            mDataBuffer.position(2);
            glEnableVertexAttribArray(aTextureCoordLocation);
            glVertexAttribPointer(aTextureCoordLocation, 2, GL_FLOAT, false, 16, mDataBuffer);
        }

        //glDrawElements(GL_TRIANGLE_FAN, 6,GL_UNSIGNED_INT, 0);
        //glDrawArrays(GL_TRIANGLE_FAN, 0 , 6);
        glDrawArrays(GL_TRIANGLES, 0, 6);
        //glDrawArrays(GL_TRIANGLES, 3, 3);
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

    /**
     * Create the SurfaceTexture bound to the OES texture and attach it to the camera preview.
     */
    private boolean initSurfaceTexture() {
        if (mCamera == null || mCameraV2GLSurfaceView == null) {
            return false;
        }
        mSurfaceTexture = new SurfaceTexture(mOESTextureId);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mCameraV2GLSurfaceView.requestRender();

            }
        });
        mCamera.setPreviewTexture(mSurfaceTexture);
        mCamera.startPreview();
        return true;
    }
}
import android.content.Context;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import static android.opengl.GLES20.GL_FRAGMENT_SHADER;
import static android.opengl.GLES20.GL_VERTEX_SHADER;
import static android.opengl.GLES20.glAttachShader;
import static android.opengl.GLES20.glCompileShader;
import static android.opengl.GLES20.glCreateProgram;
import static android.opengl.GLES20.glCreateShader;
import static android.opengl.GLES20.glGetError;
import static android.opengl.GLES20.glLinkProgram;
import static android.opengl.GLES20.glShaderSource;
import static android.opengl.GLES20.glUseProgram;

public class FilterEngine {

    private FloatBuffer mBuffer;

    private int mShaderProgram;

    public FilterEngine( Context context) {
        mBuffer = createBuffer(vertexData);
        int vertexShader = loadShader(GL_VERTEX_SHADER, RawUtils.readShaderFromResource(context, R.raw.base_vertex_shader));
        int fragmentShader = loadShader(GL_FRAGMENT_SHADER, RawUtils.readShaderFromResource(context, R.raw.base_fragment_shader));
        mShaderProgram = linkProgram(vertexShader, fragmentShader);
    }

    private static final float[] vertexData = {
            1f, 1f, 1f, 1f,
            -1f, 1f, 0f, 1f,
            -1f, -1f, 0f, 0f,
            1f, 1f, 1f, 1f,
            -1f, -1f, 0f, 0f,
            1f, -1f, 1f, 0f
    };

    public static final String POSITION_ATTRIBUTE = "aPosition";
    public static final String TEXTURE_COORD_ATTRIBUTE = "aTextureCoordinate";
    public static final String TEXTURE_MATRIX_UNIFORM = "uTextureMatrix";
    public static final String TEXTURE_SAMPLER_UNIFORM = "uTextureSampler";

    private FloatBuffer createBuffer(float[] vertexData) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        buffer.put(vertexData, 0, vertexData.length).position(0);
        return buffer;
    }

    private int loadShader(int type, String shaderSource) {
        int shader = glCreateShader(type);
        if (shader == 0) {
            throw new RuntimeException("Create Shader Failed!" + glGetError());
        }
        glShaderSource(shader, shaderSource);
        glCompileShader(shader);
        return shader;
    }

    private int linkProgram(int verShader, int fragShader) {
        int program = glCreateProgram();
        if (program == 0) {
            throw new RuntimeException("Create Program Failed!" + glGetError());
        }
        glAttachShader(program, verShader);
        glAttachShader(program, fragShader);
        glLinkProgram(program);

        glUseProgram(program);
        return program;
    }

    public int getShaderProgram() {
        return mShaderProgram;
    }

    public FloatBuffer getBuffer() {
        return mBuffer;
    }
}

Resource files such as base_fragment_shader need to be created under res/raw, as shown in the screenshot below:


(screenshot of the res/raw directory omitted)

Their contents are as follows.
base_fragment_shader (this is where the special preview display effect is processed):

#extension GL_OES_EGL_image_external : require
 precision mediump float;
 uniform samplerExternalOES uTextureSampler;
 varying vec2 vTextureCoord;
 void main()
 {
   vec4 vCameraColor = texture2D(uTextureSampler, vTextureCoord);
   float fGrayColor = (0.3*vCameraColor.r + 0.59*vCameraColor.g + 0.11*vCameraColor.b);//luminance for a black-and-white filter
   gl_FragColor = vec4(vCameraColor.r, vCameraColor.g, vCameraColor.b, 1.0);//unfiltered output; for the black-and-white preview use vec4(fGrayColor, fGrayColor, fGrayColor, 1.0) instead
 }

base_vertex_shader:

attribute vec4 aPosition;
uniform mat4 uTextureMatrix;
attribute vec4 aTextureCoordinate;
varying vec2 vTextureCoord;
void main()
{
  vTextureCoord = (uTextureMatrix * aTextureCoordinate).xy;
  gl_Position = aPosition;
}

Utility class:

import android.content.Context;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import javax.microedition.khronos.opengles.GL10;

public class RawUtils {
    public static int createOESTextureObject() {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        return tex[0];
    }

    public static String readShaderFromResource(Context context, int resourceId) {
        StringBuilder builder = new StringBuilder();
        InputStream is = null;
        InputStreamReader isr = null;
        BufferedReader br = null;
        try {
            is = context.getResources().openRawResource(resourceId);
            isr = new InputStreamReader(is);
            br = new BufferedReader(isr);
            String line;
            while ((line = br.readLine()) != null) {
                builder.append(line + "\n");
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (is != null) {
                    is.close();
                    is = null;
                }
                if (isr != null) {
                    isr.close();
                    isr = null;
                }
                if (br != null) {
                    br.close();
                    br = null;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return builder.toString();
    }
}

A custom view is also needed to draw the framing border in the middle of the screen. Its size and position are hard-coded here and can be adjusted as needed (see the sketch after the class for a proportional alternative):

/**
 * Overlay view that dims the screen and cuts out the transparent viewfinder frame.
 */
public class PreviewBorderView extends SurfaceView implements SurfaceHolder.Callback, Runnable {
    private int mScreenH;
    private int mScreenW;
    private Canvas mCanvas;
    private Paint mPaint;
    private SurfaceHolder mHolder;
    private Thread mThread;

    public PreviewBorderView(Context context) {
        this(context, null);
    }

    public PreviewBorderView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public PreviewBorderView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        init();
    }

    /**
     * Initialize the drawing objects.
     */
    private void init() {
        this.mHolder = getHolder();
        this.mHolder.addCallback(this);
        this.mHolder.setFormat(PixelFormat.TRANSPARENT);
        setZOrderOnTop(true);
        this.mPaint = new Paint();
        this.mPaint.setAntiAlias(true);
        this.mPaint.setColor(Color.WHITE);
        this.mPaint.setStyle(Paint.Style.FILL_AND_STROKE);
        this.mPaint.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.CLEAR));
        setKeepScreenOn(true);
    }

    /**
     * Draw the viewfinder frame.
     */
    private void draw() {
        try {
            this.mCanvas = this.mHolder.lockCanvas();
            this.mCanvas.drawARGB(100, 0, 0, 0);
            float rate = (float)mScreenW / 1080;
            this.mCanvas.drawRect(new RectF(162*rate
                    ,  mScreenH-312*rate-1200*rate, mScreenW-162*rate
                    , mScreenH-312*rate), this.mPaint);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (this.mCanvas != null) {
                this.mHolder.unlockCanvasAndPost(this.mCanvas);
            }
        }
    }


    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        //read the view size and start a worker thread to draw
        this.mScreenW = getWidth();
        this.mScreenH = getHeight();
        this.mThread = new Thread(this);
        this.mThread.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        //stop the drawing thread
        try {
            mThread.interrupt();
            mThread = null;
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        //draw on the worker thread
        draw();
    }
}
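If the hard-coded frame needs to adapt to different screens, a rough alternative (my addition, with placeholder proportions that are not taken from the original layout) is to derive the rectangle from the view size:

// Hypothetical helper for PreviewBorderView: compute the viewfinder rectangle
// proportionally instead of using fixed pixel values. The 0.15f side margin and
// 0.6f height ratio are illustrative placeholders.
private RectF buildFrameRect(int viewWidth, int viewHeight) {
    float sideMargin = viewWidth * 0.15f;        // left/right margin
    float frameHeight = viewHeight * 0.6f;       // height of the transparent window
    float top = (viewHeight - frameHeight) / 2f; // vertically centered
    return new RectF(sideMargin, top, viewWidth - sideMargin, top + frameHeight);
}

draw() could then call mCanvas.drawRect(buildFrameRect(mScreenW, mScreenH), this.mPaint) instead of the fixed RectF above.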

The camera Activity:

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import android.widget.TextView;

import java.io.ByteArrayOutputStream;

import butterknife.BindView;
import butterknife.ButterKnife;

public class CameraActivity extends Activity implements View.OnClickListener {

    @BindView(R.id.camera_take_photo_text_view)
    ImageView tradingCameraTakePhotoTextView;
    @BindView(R.id.camera_retry_text_view)
    TextView tradingCameraRetryTextView;
    @BindView(R.id.camera_sure_text_view)
    ImageView tradingCameraSureTextView;
    @BindView(R.id.camera_bottom_relative_layout)
    RelativeLayout tradingCameraBottomRelativeLayout;
    @BindView(R.id.camera_middle_image_view)
    ImageView tradingCameraMiddleImageView;
    @BindView(R.id.camera_hint_text_view)
    TextView tradingCameraHintTextView;
    @BindView(R.id.camera_close_image_view)
    ImageView tradingCameraCloseImageView;
    @BindView(R.id.trading_record_surface_view)
    CameraV2GLSurfaceView tradingRecordSurfaceView;
    @BindView(R.id.camera_bg_view)
    View tradingCameraBgView;
    private CameraV2 mCamera;
    private Bitmap mPhotoBitmap;
    private String mPath;

    public static Intent newIntent(Context context,String path){
        Intent intent=new Intent(context,CameraActivity.class);
        intent.putExtra("path",path);
        return intent;
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        this.requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,  WindowManager.LayoutParams.FLAG_FULLSCREEN);
        hideBottomUIMenu();
        super.onCreate(savedInstanceState);
        mPath = getIntent().getStringExtra("path");
        setContentView(R.layout.activity_camera);
        ButterKnife.bind(this);
        initView();
    }

    private void initView() {
        tradingCameraTakePhotoTextView.setOnClickListener(this);
        tradingCameraRetryTextView.setOnClickListener(this);
        tradingCameraSureTextView.setOnClickListener(this);
        tradingCameraCloseImageView.setOnClickListener(this);
        mCamera = new CameraV2(this);
        mCamera.setupCamera(ScreenUtils.getScreenWidth(), ScreenUtils.getScreenHeight(),mPath);
        if (!mCamera.openCamera()) {
            return;
        }
        tradingRecordSurfaceView.init(mCamera, false, this);
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.camera_close_image_view:
                finish();
                break;
            case R.id.camera_take_photo_text_view:
                mCamera.capture(new CameraV2.CaptureCallBack() {
                    @Override
                    public void photoData(Bitmap bitmap) {
                        mPhotoBitmap = bitmap;
                        tradingCameraMiddleImageView.setImageBitmap(bitmap);
                        tradingCameraBgView.setVisibility(View.VISIBLE);
                        tradingCameraRetryTextView.setVisibility(View.VISIBLE);
                        tradingCameraSureTextView.setVisibility(View.VISIBLE);
                        tradingCameraTakePhotoTextView.setVisibility(View.GONE);
                    }
                });
                break;
            case R.id.camera_retry_text_view:
                tradingCameraMiddleImageView.setImageBitmap(null);
                tradingCameraRetryTextView.setVisibility(View.GONE);
                tradingCameraSureTextView.setVisibility(View.GONE);
                tradingCameraTakePhotoTextView.setVisibility(View.VISIBLE);
                tradingCameraBgView.setVisibility(View.GONE);
                mCamera.restartPreview();
                break;
            case R.id.camera_sure_text_view://the Bitmap cannot be passed back directly: the image is too large and the page would freeze
                Intent intent=new Intent();
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                mPhotoBitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
                byte[] datas = baos.toByteArray();
                intent.putExtra("photo",datas);
                setResult(RESULT_OK,intent);
                finish();
                break;
        }
    }

    @Override
    protected void onDestroy() {
        mCamera.closeCamera();
        super.onDestroy();
    }

    /**
     * Hide the navigation bar and go full screen.
     */
    protected void hideBottomUIMenu() {
        //for new api versions.
        View decorView = getWindow().getDecorView();
        int uiOptions = View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY | View.SYSTEM_UI_FLAG_FULLSCREEN;
        decorView.setSystemUiVisibility(uiOptions);
    }
}

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.zqb.camera2.CameraV2GLSurfaceView
        android:id="@+id/trading_record_surface_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
    <com.zqb.camera2.PreviewBorderView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
    <View
        android:visibility="gone"
        android:id="@+id/camera_bg_view"
        android:background="#000"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
    <ImageView
        android:tint="#fff"
        android:id="@+id/camera_close_image_view"
        android:src="@mipmap/global_close_btn"
        android:layout_margin="72px"
        android:layout_width="60px"
        android:layout_height="60px" />

    <RelativeLayout
        android:id="@+id/camera_bottom_relative_layout"
        android:layout_marginBottom="48px"
        android:layout_marginTop="74px"
        android:layout_alignParentBottom="true"
        android:layout_width="match_parent"
        android:layout_height="wrap_content">
        <ImageView
            android:id="@+id/camera_take_photo_text_view"
            android:layout_centerHorizontal="true"
            android:scaleType="fitXY"
            android:src="@drawable/selector_btn_take_photo"
            android:layout_width="188px"
            android:layout_height="188px" />
        <TextView
            android:id="@+id/camera_retry_text_view"
            android:visibility="gone"
            android:layout_marginLeft="264px"
            android:textSize="44px"
            android:text="Retake"
            android:layout_centerVertical="true"
            android:textColor="#fff"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" />
        <ImageView
            android:id="@+id/camera_sure_text_view"
            android:visibility="gone"
            android:layout_marginRight="380px"
            android:layout_alignParentRight="true"
            android:layout_centerVertical="true"
            android:scaleType="fitXY"
            android:src="@drawable/selector_btn_take_photo_sure"
            android:layout_width="160px"
            android:layout_height="160px" />
    </RelativeLayout>

    <ImageView
        android:scaleType="fitXY"
        android:layout_centerHorizontal="true"
        android:layout_alignParentBottom="true"
        android:layout_marginBottom="312px"
        android:id="@+id/camera_middle_image_view"
        android:layout_width="756px"
        android:layout_height="1200px"
        android:background="@mipmap/global_camera_bg"/>
    <TextView
        android:id="@+id/camera_hint_text_view"
        android:layout_above="@id/camera_middle_image_view"
        android:layout_marginBottom="58px"
        android:layout_centerHorizontal="true"
        android:textSize="44px"
        android:text="Please align with the frame"
        android:textColor="#4448ed"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"/>
</RelativeLayout>

The entry screen that launches the camera (its layout is omitted; it is just a single ImageView):

import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.ImageView;

import butterknife.BindView;
import butterknife.ButterKnife;

public class MainActivity extends AppCompatActivity {

    @BindView(R.id.image_view)
    ImageView imageView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        ButterKnife.bind(this);
        imageView.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent intent = CameraActivity.newIntent(MainActivity.this,null);
                startActivityForResult(intent,1);
            }
        });
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if(requestCode==1 && resultCode==RESULT_OK){
            Bundle extras = data.getExtras();
            if(extras!=null) {
                byte[] photos = data.getByteArrayExtra("photo");
                Bitmap bitmap = BitmapFactory.decodeByteArray(photos, 0, photos.length);
                imageView.setImageBitmap(bitmap);
            }
        }
    }
}

Note:
In both the ColorMatrix and the GL shader, the black-and-white effect comes from weighting R, G and B with luminance coefficients that sum to 1 and writing the same value to every output channel. The matrix can be changed freely for other effects; keeping each row's RGB weights summing to roughly 1 preserves the overall brightness.
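For example, the same Canvas/Paint approach shown in the commented-out block inside CameraV2 can be reused with a different matrix. The warm-tint weights below are illustrative values I picked for this sketch, with each RGB row summing to roughly 1:

// Sketch: apply a different ColorMatrix to the (mutable) captured bitmap.
// The coefficients are illustrative only, not values from the original post.
ColorMatrix warm = new ColorMatrix(new float[]{
        0.50f, 0.40f, 0.10f, 0, 0,
        0.35f, 0.50f, 0.15f, 0, 0,
        0.25f, 0.40f, 0.35f, 0, 0,
        0,     0,     0,     1, 0,
});
Paint paint = new Paint();
paint.setColorFilter(new ColorMatrixColorFilter(warm));
new Canvas(bitmap).drawBitmap(bitmap, 0, 0, paint);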
